US5867591A - Method of matching stereo images and method of measuring disparity between these image


Info

Publication number
US5867591A
Authority
US
United States
Prior art keywords
image
disparity
images
matching
ternary
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
US08/629,708
Inventor
Katsumasa Onda
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Panasonic Holdings Corp
Original Assignee
Matsushita Electric Industrial Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Matsushita Electric Industrial Co Ltd filed Critical Matsushita Electric Industrial Co Ltd
Assigned to MATSUSHITA ELECTRIC INDUSTRIAL CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: ONDA, KATSUMASA
Priority to US09/196,329 (published as US6125198A)
Application granted
Publication of US5867591A
Anticipated expiration
Current legal status: Expired - Fee Related

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04 Interpretation of pictures
    • G01C11/06 Interpretation of pictures by comparison of two or more pictures of the same area
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/55 Depth or shape recovery from multiple images
    • G06T7/593 Depth or shape recovery from multiple images from stereo images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74 Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/77 Determining position or orientation of objects or cameras using statistical methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/10 Image acquisition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/24 Aligning, centring, orientation detection or correction of the image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G06T2207/10012 Stereo images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V2201/12 Acquisition of 3D measurements of objects
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/239 Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N2013/0074 Stereoscopic image analysis
    • H04N2013/0081 Depth or disparity estimation from stereoscopic image signals

Definitions

  • This invention generally relates to a method of matching stereo images and a method of detecting disparity between these images, which are chiefly used in the industrial field of stereo cameras: detecting positional information in the image-pickup space from stereo images, compressing the data volume of stereo images (i.e. three-dimensional video images), controlling the display of such stereo images, extracting the optical flow of moving images, and so on.
  • FIG. 1 is a view illustrating the principle of a typical stereo image measurement.
  • a three-dimensional coordinate generally defined by variables x, y and z, represents the real space.
  • a two-dimensional coordinate generally defined by variables X and Y, represents the plane of image (i.e. an image-pickup plane of a camera).
  • a position on the image plane of right camera 23R can be expressed by variables XR and YR on one two-dimensional coordinate.
  • a position on the image plane of left camera 23L can be expressed by variables XL and YL on the other two-dimensional coordinate.
  • Axes XL and XR are parallel to the axis x, while axes YL and YR are parallel to the axis y.
  • Axis z is parallel to the optical axes of two cameras 23R and 23L.
  • the origin of the real space coordinate (x, y, z) coincides with a midpoint between the projective centers of right and left cameras 23R and 23L.
  • the distance between the projective centers is generally referred to as a base length denoted by 2a.
  • a distance, denoted by f is a focal distance between each projective center and its image plane.
  • a real-space point p is projected at a point PR(XR,YR) on the right image plane and at the same time a point PL(XL,YL) on the left image plane.
  • PR and PL are determined on respective image planes (by performing the matching of stereo images) and then the real-space coordinate (x, y, z) representing the point p is obtained based on the principle of the trigonometrical survey.
  • YR and YL have identical values in this case, because two optical axes of cameras 23R and 23L exist on the same plane and X axes of cameras 23R and 23L are parallel to axis x.
  • the relationship between the coordinate values XL, YL, XR, YR and the real-space coordinate values x, y, z is expressed by the following equations: XL=(x+a)f/z, XR=(x-a)f/z, YL=YR=yf/z (Eq. 1); x=a(XL+XR)/d, y=2aYL/d, z=2af/d (Eq. 2), where d represents the disparity (between stereo images).
  • a specific point on one image plane has a matching point on the other image plane along the same scanning line serving as an epipolar line within the region defined by XL>XR. Accordingly, the matching point corresponding to a specific point on one image plane can be found on the other image plane by checking the similarity of images in each micro area along the line having the possibility of detecting the matching point.
  • FIG. 2 shows a conventional method of detecting a mutual correlation value between two images, disclosed in "Image Processing Handbook” (Shokodo publishing Co. Ltd.) by Morio ONOUE et al., for example.
  • a pixel matching to this pixel 2403 is next found along the plane of right image 2402.
  • the matching point is determined.
  • a square micro area 2404 (hereinafter referred to as a micro area) is set on the left image 2401 so as to have a size corresponding to n × m pixels, sufficient to involve the designated pixel 2403 at the center thereof.
  • IL(i,j) represents the brightness of each point (pixel) within the micro area 2404.
  • a square micro area 2405 on the right image 2402 is designated as a micro area having its center on a pixel satisfying the condition of equation 4.
  • the micro area 2405 has a size corresponding to n × m pixels. It is assumed that IR(i,j) represents the brightness of each point (pixel) within the micro area 2405.
  • μL, μR, σL² and σR² represent the averages and variances of the brightness in the micro areas 2404 and 2405, respectively.
  • the mutual correlation value of these micro areas can be given by the following equation: c = ΣiΣj (IL(i,j)-μL)(IR(i,j)-μR) / (nm√(σL²σR²)) (Eq. 5)
  • the value "c" defined by the equation 5 is calculated along the straight line (epipolar line) having the possibility of detecting a matching point. Then, the point where the value "c" is maximized is identified as the matching point to be detected. According to this method, it becomes possible to determine the matching point as having the size identical with a pixel. If the matching point is once found, the disparity "d" can be immediately obtained using the equation 3 based on the coordinate values representing thus found matching point.
  • this conventional method is disadvantageous in that a great amount of computations will be required for completely obtaining all the matching points of required pixels since even a single search of finding only one matching point of a certain pixel requires the above-described complicated computations to be repetitively performed with respect to the entire region having the possibility of detecting the matching point.
  • the computations for obtaining the correlation can be sped up by reducing the size of the micro area, although the stability of the matching point detection then worsens owing to the increased influence of image distortion and noise.
  • increasing the size of the micro area, on the other hand, not only increases the computation time but also deteriorates the accuracy of the matching point detection, because the variation of the correlation values becomes undesirably flattened.
  • it will be required to adequately set the size of the micro area by considering the characteristics of the image to be handled.
  • a characteristic of the above-described conventional method is that the determination of the disparity directly reflects the result of the stereo image matching. Hence, any erroneous matching will cause an error in the measurement of disparity "d". In short, an error in the stereo image matching leads to an error in the disparity measurement.
  • one of the proposed technologies is a method of dividing or dissecting the image into several blocks, each having a predetermined size, and determining the matching region based on the dissected blocks.
  • “Driving Aid System based on Three-dimensional Image Recognition Technology” by Jitsuyoshi et al., in Pre-publication 924, pp. 169-172, of the Automotive Vehicle Technical Institute Scientific Lecture Meeting, October 1992, discloses such a method of searching the matching region based on the comparison between the blocks of right and left images.
  • FIG. 3 is a view illustrating the conventional method of performing the matching of stereo images between square micro areas (blocks).
  • the left image 2501, serving as a reference image, is dissected into a plurality of blocks so that each block (2503) has a size equivalent to n × m pixels.
  • each matching region with respect to each block on the left image 2501 is searched along the plane of right image 2502.
  • the following equation gives the similarity evaluation used for determining the matching region: C = Σ|Li - Ri| (Eq. 6)
  • Li represents luminance of i-th pixel in the left block 2503, while Ri represents luminance of i-th pixel in the right block 2504.
  • the preprocessing is additionally required for adjusting the sensitivity difference between right and left cameras and for performing the shading correction before executing the stereo image matching processing.
  • a straight line existing in the image-pickup space may be imaged as straight lines 2603 and 2604 with different gradients in blocks 2605 and 2606 of the left and right images 2601 and 2602, as shown in FIG. 4. In such a case, the matching regions may fail to be determined accurately.
  • two different lines may be image-formed as identical lines in blocks 2703 and 2704 on left and right images 2701 and 2702 as shown in FIG. 5.
  • comparing the pixels between the two blocks 2703 and 2704 only will cause a problem: the stereo image matching may be erroneously performed, and the succeeding measurement of disparity will fail.
  • the minimum unit for measuring each disparity is one pixel, because the image data are digital data sampled at a certain frequency. However, it is possible to perform the disparity measurement with higher accuracy.
  • FIG. 6 is a view illustrating a conventional disparity measuring method capable of detecting a disparity in a sub-pixel level accuracy.
  • FIG. 6 shows a peak position found in the similarity evaluation value C (ordinate) when the equation 6 is calculated along the search region in each block.
  • the sub-pixel level disparity measurement is performed by using the similarity evaluations Ci, Ci-1, Ci+1 corresponding to the disparities di, di-1, di+1 (in increments of one pixel) existing before and after the peak position. More specifically, a first straight line 2801 is obtained as a line crossing both of the two points (di-1, Ci-1) and (di, Ci).
  • a second straight line 2802 is obtained as a line crossing a point (di+1, Ci+1) and having a gradient symmetrical with the line 2801 (i.e. identical in absolute value but opposite in sign). Then, a point 2803 is obtained as an intersecting point of two straight lines 2801 and 2802. A disparity ds, corresponding to thus obtained intersecting point 2803, is finally obtained as a sub-pixel level disparity of the concerned block.
  • the above-described conventional stereo image matching methods and disparity detecting methods generally suffer from increased hardware cost and longer processing time, because the similarity evaluations of equations 5 and 6 require the four arithmetic operations for the stereo image matching.
  • a principal object of the present invention is to provide a method of matching stereo images and of detecting disparity between these images, small in the volume of computations, compact in the hardware construction, quick in processing, highly reliable, and excellent in accuracy.
  • a first aspect of the present invention provides a novel and excellent method of matching stereo images, comprising the steps of: inputting first and second images IL and IR; developing the images IL and IR into a plurality of frequency component images FL1, FL2, FL3, - - - , FLk, FLk+1, - - - , FLn and a plurality of frequency component images FR1, FR2, FR3, - - - , FRk, FRk+1, - - - , FRn, respectively; applying a secondary differential processing to each of the frequency component images; and converting each frequency component image, after the secondary differential processing has been applied, into ternary values pixel by pixel, thereby obtaining ternary-valued frequency component images TL1, TL2, TL3, - - - , TLk, TLk+1, - - - , TLn and ternary-valued frequency component images TR1, TR2, TR3, - - - , TRk, TRk+1, - - - , TRn.
  • a second aspect of the present invention provides a method of matching stereo images, comprising the steps of: inputting first and second images IL and IR; developing the images IL and IR into a plurality of frequency component images FL1, FL2, FL3, - - - , FLk, FLk+1, - - - , FLn and a plurality of frequency component images FR1, FR2, FR3, - - - , FRk, FRk+1, - - - , FRn, respectively; applying a secondary differential processing to each of the frequency component images; and converting each frequency component image, after the secondary differential processing has been applied, into ternary values pixel by pixel by using a positive threshold TH1 (>0) and a negative threshold TH2 (<0) in such a manner that a pixel larger than TH1 is designated to "p", a pixel in a range between TH1 and TH2 is designated to "z", and a pixel smaller than TH2 is designated to "m", thereby obtaining ternary-valued frequency component images TL1, - - - , TLn and TR1, - - - , TRn.
  • a third aspect of the present invention provides a method of matching stereo images, comprising the steps of: inputting first and second images IL and IR; developing the images IL and IR into a plurality of frequency component images FL1, FL2, FL3, - - - , FLk, FLk+1, - - - , FLn and a plurality of frequency component images FR1, FR2, FR3, - - - , FRk, FRk+1, - - - , FRn, respectively; applying a secondary differential processing to each of the frequency component images; and converting each frequency component image, after the secondary differential processing has been applied, into ternary values pixel by pixel in such a manner that a pixel not related to a zero-crossing point is designated to "z", a pixel related to a zero-crossing point and having a positive gradient is designated to "p", and a pixel related to a zero-crossing point and having a negative gradient is designated to "m", thereby obtaining ternary-valued frequency component images TL1, - - - , TLn and TR1, - - - , TRn.
  • a fourth aspect of the present invention provides a method of matching stereo images, comprising the steps of: inputting first and second images IL and IR; developing the images IL and IR into a plurality of frequency component images FL1, FL2, FL3, - - - , FLk, FLk+1, - - - , FLn and a plurality of frequency component images FR1, FR2, FR3, - - - , FRk, FRk+1, - - - , FRn, respectively; applying a secondary differential processing to each of the frequency component images; and converting each low frequency component image of the frequency component images, after the secondary differential processing has been applied, into ternary values pixel by pixel by using a positive threshold TH1 (>0) and a negative threshold TH2 (<0) in such a manner that a pixel larger than TH1 is designated to "p", a pixel in a range between TH1 and TH2 is designated to "z", and a pixel smaller than TH2 is designated to "m", while each high frequency component image is converted into ternary values according to whether each pixel corresponds to a zero-crossing point and to the sign of the gradient there.
  • the first image IL is designated as a reference image for the matching operation
  • a one-dimensional window capable of encompassing N pixels therein is set on the ternary-valued frequency component image of the first image IL
  • a matching region having the same ternary-value pattern as the N pixels in the one-dimensional window is searched from the ternary-valued frequency component image of the second image IR.
  • one of the first and second images IL and IR is designated as a reference image for the matching operation
  • a plurality of one-dimensional windows are set on the entire surface of the ternary-valued frequency component image of the reference image through a scanning operation along an epipolar line, so that the one-dimensional windows are successively overlapped at intervals of N/2, where each one-dimensional window has a size equivalent to N pixels, and the matching operation is carried out with respect to each of the one-dimensional windows.
  • pixels in a one-dimensional window of a ternary-valued frequency component image TLk of the first image IL are compared in a one-to-one manner with pixels in a designated region of a ternary-valued frequency component image TRk of the second image IR, when the ternary-valued frequency component images TLk and TRk are identical in their frequency components, wherein an evaluation result "P" is obtained when the corresponding two pixels are both "p" or both "m", while an evaluation result "Z" is obtained when the corresponding two pixels are both "z", and a similarity between the two ternary-valued frequency component images TLk and TRk is evaluated by using the following equation: Ek = βk × PN + γk × ZN, where
  • PN represents a total number of pixels having the evaluation result "P”
  • ZN represents a total number of pixels having the evaluation result "Z”
  • βk and γk represent weighting factors.
  • a fifth aspect of the present invention provides a novel and excellent method of detecting a disparity between stereo images, comprising the steps of: comparing pixels in a micro region defined by a one-dimensional window set on a reference image with pixels in a designated micro region on a non-reference image; evaluating a similarity between two micro regions using the following equation: E = Σk (βk × PNk + γk × ZNk), where
  • PN represents a total number of pixels having an evaluation result "P" while ZN represents a total number of pixels having an evaluation result "Z", and βk and γk represent weighting factors; searching a first region having the highest similarity and a second region having the second highest similarity; specifying a first candidate disparity as a disparity corresponding to the first region, and a second candidate disparity as a disparity corresponding to the second region; and determining a valid disparity between the stereo images based on the first and second candidate disparities.
  • PN represents a total number of pixels having an evaluation result "P" while ZN represents a total number of pixels having an evaluation result "Z", and βk and γk represent weighting factors; searching a first region having the highest similarity and a second region having the second highest similarity in a concerned block; specifying a first candidate disparity as a disparity corresponding to the first region, and a second candidate disparity as a disparity corresponding to the second region; creating a histogram based on the first and second candidate disparities; and determining a valid disparity of the concerned block as a disparity corresponding to a peak position of the histogram.
  • the first image IL is designated as a reference image
  • a one-dimensional window capable of encompassing N pixels therein is set on the ternary-valued frequency component image of the first image IL
  • a matching region having the same ternary-value pattern as the N pixels in the one-dimensional window is searched from the ternary-valued frequency component image of the second image IR.
  • one of the first and second images IL and IR is designated as a reference image
  • a plurality of one-dimensional windows are set on the entire surface of the ternary-valued frequency component image of the reference image through a scanning operation along an epipolar line, so that the one-dimensional windows are successively overlapped at intervals of N/2, where each one-dimensional window has a size equivalent to N pixels, and a matching operation is carried out with respect to each of the one-dimensional windows.
  • the valid disparity is calculated as a sub-pixel level disparity corresponding to an intersecting point of a first straight line crossing two points (di-1, hi-1), (di, hi) and a second straight line crossing a point (di+1, hi+1) with a gradient symmetrical with the first straight line, where di-1, di, di+1 represent disparities near the peak position of the histogram and hi-1, hi, hi+1 represent the numbers of occurrences of the disparities di-1, di, di+1, respectively.
  • FIG. 1 is a view illustrating the principle of the stereo image measurement
  • FIG. 2 is a view illustrating a conventional method of checking a mutual correlation value between two images
  • FIG. 3 is a view illustrating a conventional method of matching stereo images based on the comparison of square micro regions (blocks) of two images;
  • FIG. 4 is a view illustrating a problem in a conventional method
  • FIG. 5 is a view illustrating another problem in a conventional method
  • FIG. 6 is a view illustrating a conventional disparity measuring method capable of detecting a disparity at sub-pixel accuracy
  • FIG. 7 is a flow diagram showing sequential processes for executing a first embodiment of the present invention, covering the pickup of stereo images through the determination of disparity;
  • FIG. 8 is a view illustrating a monochrome image used in explaining an embodiment of the method of matching stereo images and of detecting disparity between these images in accordance with the present invention
  • FIG. 9 is a block diagram showing an arrangement of a first apparatus which realizes the processing of feature extraction phase (B) of FIG. 7;
  • FIGS. 10A, 10B, 10C and 10D are graphs showing examples of various frequency component images obtained as a result of the feature extraction phase processing shown in FIGS. 9, 23 and 27;
  • FIG. 11 is a block diagram showing an arrangement of a second apparatus which realizes the processing of feature extraction phase (B) of FIG. 7;
  • FIG. 12 is a view illustrating a method of transforming or quantizing the frequency component images into ternary values used in the first and third embodiments of the present invention
  • FIG. 13 is a view illustrating a method of dividing an image into plural blocks, each serving as the unit for determining disparity, in accordance with the present invention
  • FIG. 14 is a view illustrating a scanning method of a one-dimensional window serving as the unit for matching stereo images in the present invention.
  • FIG. 15 is a view illustrating the relationship between the one-dimensional window serving as the unit for matching stereo images and a block serving as the unit for determining a disparity in the present invention
  • FIG. 16 is a view illustrating a method of determining a disparity candidate based on the one-dimensional window search of the present invention
  • FIG. 17 is a view illustrating a method of evaluating a similarity based on the one-dimensional window search of the present invention.
  • FIG. 18 is a view illustrating an example of a storage region used for temporarily storing candidate disparities which are determined in relation to each of one-dimensional windows in accordance with the present invention
  • FIG. 19 is a view illustrating a method of creating a histogram in relation to blocks, based on candidate disparities temporarily stored in the storing region in relation to one-dimensional windows, in accordance with the present invention
  • FIG. 20 is a graph showing an example of the histogram created in each block in accordance with the present invention.
  • FIG. 21 is a graph showing a method of measuring a disparity at sub-pixel accuracy based on the histogram created in relation to blocks of the present invention.
  • FIG. 22 is a flow diagram showing sequential processes for executing a second embodiment of the present invention, covering the pickup of stereo images through the determination of disparity;
  • FIG. 23 is a block diagram showing an arrangement of a third apparatus which realizes the processing of feature extraction phase (B') of FIG. 22 in accordance with the second embodiment;
  • FIG. 24 is a block diagram showing an arrangement of a fourth apparatus which realizes the processing of feature extraction phase (B') of FIG. 22 in accordance with the second embodiment;
  • FIG. 25 is a view illustrating a method of transforming or quantizing the frequency component images into ternary values used in the second and third embodiments of the present invention.
  • FIG. 26 is a flow diagram showing sequential processes for executing a third embodiment of the present invention, covering the pickup of stereo images through the determination of disparity;
  • FIG. 27 is a block diagram showing an arrangement of a fifth apparatus which realizes the processing of feature extraction phase (B") of FIG. 26 in accordance with the third embodiment.
  • FIG. 28 is a block diagram showing an arrangement of a sixth apparatus which realizes the processing of feature extraction phase (B") of FIG. 26 in accordance with the third embodiment.
  • a method of matching stereo images and a method of detecting a disparity between these images will be hereinafter explained in accordance with the present invention.
  • a first embodiment will be explained based on a stereo image measurement using the method of matching stereo images and detecting disparity between the images in accordance with the present invention.
  • FIG. 7 is a flow diagram showing sequential processes for executing the first embodiment of the present invention, covering the stereo image pickup phase through the disparity determination phase.
  • in the image pickup phase (A), right and left images are taken in through right and left image-pickup devices in steps S101 and S102. Then, the right and left images obtained in the image-pickup phase (A) are respectively subjected to feature extraction in the next feature extraction phase (B) in steps S103 and S104. Thereafter, in the succeeding matching phase (C), the extracted features of the right and left images are compared to check how they match with each other in step S105.
  • a one-dimensional window is set; this one-dimensional window is shifted along a referential image plane (one of the right and left image planes) in accordance with a predetermined scanning rule so as to successively set windows each serving as the unit for matching stereo images, and a matching operation is performed by comparing the image features within one window with the corresponding image features on the other image plane.
  • the referential image feature plane is dissected or divided into plural blocks each having a predetermined size, a histogram in each block is created from disparities obtained by the matching operation based on one-dimensional windows involving pixels of a concerned block, and a specific disparity just corresponding to the peak of thus obtained histogram is identified as a valid disparity representing the concerned block in step S106.
  • the processing performed in these phases (A) through (D) will be hereinafter described in greater detail.
  • this embodiment disposes a pair of right and left cameras in a parallel arrangement, where the two cameras are located at predetermined right and left positions in the horizontal direction so that they have parallel optical axes.
  • the right-and-left parallel arrangement explained with reference to FIG. 1 shows an ideal arrangement model to be adopted in this embodiment too.
  • it will be impossible to perfectly build the ideal arrangement of stereo cameras without causing any dislocations.
  • it is important that the method of matching stereo images and the method of detecting a disparity between these images be flexible enough to tolerate such dislocations.
  • the right and left images obtained in the image-pickup phase (A) will be explained as monochrome images having a predetermined size of 768 (H) × 480 (V).
  • the images handled in the present invention are not limited to the disclosed monochrome images.
  • the right and left images, obtained in the image-pickup phase are defined as follows.
  • x represents a horizontal index of the image
  • y represents a vertical index (i.e. line number) of the image.
  • the pixel number is expressed by “x” from left to right, while the line number is expressed by “y” from top to bottom.
  • the left image is divided into 48 blocks in the horizontal direction and 30 blocks in the vertical direction, i.e. 1440 blocks in total (each block measuring 16 × 16 pixels).
  • each block is discriminated by the following identification data BL(X,Y).
  • Block ID BL(X,Y), where 1 ≤ X ≤ 48, 1 ≤ Y ≤ 30
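As a concrete illustration (not part of the patent text), the pixel-to-block mapping can be sketched as follows; the 16 × 16-pixel block size is inferred from 768/48 = 480/30 = 16.

```python
# Hypothetical helper: map a pixel (x, y), 1 <= x <= 768, 1 <= y <= 480,
# to its block ID BL(X, Y). The 16 x 16-pixel block size is inferred from
# the 48 x 30 block grid over a 768 x 480 image.
def block_id(x, y, block=16):
    X = (x - 1) // block + 1   # 1 <= X <= 48
    Y = (y - 1) // block + 1   # 1 <= Y <= 30
    return X, Y

assert block_id(1, 1) == (1, 1)
assert block_id(768, 480) == (48, 30)
```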
  • the two images, right image IR and left image IL, obtained in the image pickup phase (A), are developed into a plurality of frequency component images in the feature extraction phase (B)
  • Each frequency component image is subjected to the secondary differential processing. Thereafter, each image is converted pixel by pixel into ternary values, thus obtaining the following ternary-valued frequency component images.
  • TL1, TL2, TL3, - - - , TLk, TLk+1, - - - , TLn
  • the above-described operation makes it possible to extract edges at various resolutions.
  • the primary object of performing the above-described operation is as follows.
  • each edge position receives no adverse effect derived from sensitivity difference between two cameras or shading.
  • the provision of ternary-value processing makes it possible to perform the similarity evaluation by using a compact hardware arrangement.
  • Low-frequency edges are robust against noises, but are inaccurate in their positions.
  • high-frequency edges are accurate in their positions, although they tend to be adversely affected by noises. By utilizing these natures, it becomes possible to realize robust and accurate stereo image matching.
  • FIG. 12 is a view illustrating a method of transforming or quantizing the frequency component images into ternary values used in the first and third embodiments of the present invention.
  • a positive threshold TH1 (>0)
  • a negative threshold TH2 (<0)
  • ternary values are given to respective pixels as follows, where F(x,y) denotes the pixel value of the secondary-differentiated frequency component image:
    F(x,y) > TH1 → 1 ("p")
    TH2 ≤ F(x,y) ≤ TH1 → 0 ("z")
    F(x,y) < TH2 → -1 ("m")
  • this ternary-value processing makes it possible to quantize the images into 1 or -1 at their edges, especially in the vicinity of the (positive and negative) peak positions; elsewhere the images are expressed by 0.
  • This ternary-value processing is characterized in that its circuit can be simply arranged and relatively robust against noises. However, if any sensitivity difference exists between right and left images IR and IL, there will be the possibility that some pixels near the threshold may cause erroneous edge-position information due to quantization error.
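A minimal sketch of this threshold-based ternary-value processing (F), assuming the secondary-differentiated frequency component image is held in a NumPy array; the function name and threshold values are illustrative, not from the patent:

```python
import numpy as np

def ternarize_threshold(F, th1, th2):
    """Ternary-value processing (F): quantize a secondary-differentiated
    frequency component image with thresholds TH1 > 0 and TH2 < 0."""
    T = np.zeros_like(F, dtype=np.int8)   # default 0: "z"
    T[F > th1] = 1                        # 1:  "p" (near positive peaks)
    T[F < th2] = -1                       # -1: "m" (near negative peaks)
    return T
```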
  • FIG. 9 is a block diagram showing the arrangement of a first apparatus which realizes the processing of feature extraction phase (B) of FIG. 7.
  • the left image IL (or right image IR) received in the feature extraction phase (B) is the image obtained in the image-pickup phase (A), band-limited to fc (Hz).
  • LPFk low-pass filters
  • HPFk high-pass filters
  • the above-described HPFk is a high pass filter having a secondary differential function.
  • each of the plural ternary-valued frequency component images TLk thus obtained reveals the edge positions involved in the corresponding frequency component image.
  • Each edge position is used for the matching of right and left images in the succeeding matching phase (C).
  • the number of frequency component images FLk or the width of each frequency band should be determined by taking the required performance and the allowable cost range into consideration.
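For illustration, the development into frequency component images might be sketched as below; Gaussian low-pass filters and a Laplacian stand in for the unspecified LPFk and the secondary-differential HPFk, so the filter type and widths are assumptions:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, laplace

def develop(image, sigmas=(1.0, 2.0, 4.0)):
    """Develop a band-limited image into frequency component images FLk
    (here: Gaussian low-pass at increasing widths) and apply a secondary
    differential (here: a Laplacian) to each band."""
    img = image.astype(float)
    return [laplace(gaussian_filter(img, s)) for s in sigmas]
```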
  • in the matching phase (C), matching of the right and left images is performed using the plurality of ternary-valued frequency component images obtained through the ternary-value processing in the feature extraction phase (B).
  • one of the two stereo images is designated as a reference image in this matching operation, and a matching region corresponding to a specific region of the reference image is searched for in the other image.
  • this embodiment designates the left image as the reference image.
  • the left image, serving as the reference image, is dissected into numerous blocks each having the same size of M × L pixels, as shown in FIG. 13
  • each of left ternary-valued frequency component images TLk is dissected into numerous blocks as shown in FIG. 14.
  • block identification data BLk(X,Y) is used for discriminating the left ternary-valued frequency component image TLk.
  • the previously-described equation 8 is used to evaluate the similarity between right and left ternary-valued frequency component images TLk and TRk involved in the designated one-dimensional windows.
  • a region having the highest similarity is specified as a primary candidate disparity (disp1) and a region having the second highest similarity is specified as a secondary candidate disparity (disp2).
  • the coding operation for evaluating the similarity (i.e. the evaluation between corresponding pixels) is carried out with respect to all 16 pixels in the given one-dimensional window.
  • the evaluation of similarity is applied to all of the ternary-valued frequency component images TLk and TRk, finally obtaining the overall similarity evaluation result: Eall = Σk (βk × PNk + γk × ZNk), where
  • PN represents a total number of pixels having the evaluation result "P”
  • ZN represents a total number of pixels having the evaluation result "Z”
  • βk and γk represent weighting factors.
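A sketch of this window evaluation, using the weighted-sum form given above; the weight values, data layout, and function name are assumptions. Ternary images are encoded as 1 ("p"), 0 ("z") and -1 ("m"):

```python
import numpy as np

def window_similarity(tl_images, tr_images, y, x, d, n=16,
                      beta=None, gamma=None):
    """Similarity of the N = 16 pixel one-dimensional window starting at
    (x, y) on the left ternary images against the right images shifted by
    candidate disparity d: Eall = sum_k (beta_k*PN_k + gamma_k*ZN_k)."""
    beta = beta if beta is not None else [1.0] * len(tl_images)
    gamma = gamma if gamma is not None else [0.25] * len(tl_images)
    E = 0.0
    for k, (TL, TR) in enumerate(zip(tl_images, tr_images)):
        wl = TL[y, x:x + n]                        # window on the left image
        wr = TR[y, x - d:x - d + n]                # candidate region (x >= d)
        PN = int(np.sum((wl == wr) & (wl != 0)))   # both "p" or both "m"
        ZN = int(np.sum((wl == 0) & (wr == 0)))    # both "z"
        E += beta[k] * PN + gamma[k] * ZN
    return E
```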
  • this candidate disparity should be nullified or voided in order to eliminate any erroneous matching operations.
  • primary candidate disparities (disp1) and secondary candidate disparities (disp2) will be obtained as a result of the scan based on a one-dimensional window successively shifted at strokes of 8 (M/2) pixels in an overlapped manner along the odd number line of the left image.
  • the primary candidate disparities (disp1) and secondary candidate disparities (disp2), thus obtained, are stored in the predetermined regions of a storage memory shown in FIG. 18.
  • although FIG. 18 shows the memory regions in one-to-one relationship to the image data, it is noted that vacant regions in the storage memory can be eliminated.
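The scan pattern of the one-dimensional windows can be sketched as follows, under the embodiment's sizes (N = 16, stride N/2 = 8, odd-numbered lines of a 768 × 480 image):

```python
# Window positions: 16-pixel windows overlapped at 8-pixel strokes along
# every odd-numbered line. Each position yields one disp1/disp2 pair.
N = 16
positions = [(y, x)
             for y in range(1, 480 + 1, 2)            # odd lines 1, 3, ..., 479
             for x in range(1, 768 - N + 2, N // 2)]  # window start columns
print(len(positions))  # 240 lines x 95 windows per line = 22800
```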
  • a method of determining the disparity of each block will be explained with reference to FIG. 19, which shows how the disparity of block BL(X,Y) is determined.
  • a histogram is created based on a total of 24 sets of primary candidate disparities (disp1) and secondary candidate disparities (disp2) existing in the region encircled by a dotted line in FIG. 19, considering the fact that all of these selected primary and secondary candidate disparities are obtained through the matching operation of the specific one-dimensional windows each comprising at least 8 pixels existing in the region of block BL(X,Y).
  • FIG. 20 is a graph showing an example of the histogram of disparities created based on the primary and secondary candidate disparities.
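The disparity determination for one block can then be sketched as follows (pooling the candidate-disparity pairs of the windows overlapping the block and picking the histogram peak; the 64-pixel search range is an assumption):

```python
import numpy as np

def block_disparity(disp1, disp2, max_d=64):
    """Histogram the primary/secondary candidate disparities collected from
    the one-dimensional windows overlapping a block; the peak position is
    taken as the valid disparity of the block."""
    hist = np.zeros(max_d + 1, dtype=int)
    for d in list(disp1) + list(disp2):
        if 0 <= d <= max_d:
            hist[d] += 1
    return int(np.argmax(hist)), hist

d, hist = block_disparity([5, 5, 6, 5], [4, 5, 7, 5])
print(d)  # -> 5
```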
  • the image is obtained as digital data sampled at a predetermined frequency.
  • the measurable minimum unit for the disparity is limited to one pixel. If higher accuracy in the disparity measurement is required, the following sub-pixel level measurement is available.
  • a first straight line 1501 is obtained as a line crossing both of the two points (di-1, hi-1) and (di, hi).
  • a second straight line 1502 is obtained as a line crossing a point (di+1, hi+1) and having a gradient symmetrical with the line 1501 (i.e. identical in absolute value but opposite in sign).
  • a point 1503 is obtained as an intersecting point of two straight lines 1501 and 1502.
  • a disparity ds, corresponding to thus obtained intersecting point 1503, is finally obtained as a sub-pixel level disparity of the concerned block.
  • the sub-pixel level disparity measurement uses a histogram created from the numbers of occurrences; accordingly, this method is essentially different from the prior art method, which basically uses the similarity evaluations C derived from equation 6.
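A sketch of this line-intersection interpolation: with unit spacing between disparities, intersecting the two lines gives ds = (hi+1 - hi) / (2(hi - hi-1)) + (di + di+1)/2, which the code below computes.

```python
def subpixel_disparity(d, h):
    """Sub-pixel disparity from the histogram around the peak.
    d = [di-1, di, di+1] (unit spacing, peak at d[1]),
    h = [hi-1, hi, hi+1] (occurrence counts).
    Intersects the line through (d[0], h[0]), (d[1], h[1]) with the line
    of sign-mirrored gradient through (d[2], h[2])."""
    s = float(h[1] - h[0])                # gradient of the first line
    if s == 0:
        return float(d[1])
    return (h[2] - h[1]) / (2 * s) + (d[1] + d[2]) / 2.0

print(subpixel_disparity([4, 5, 6], [10, 30, 20]))  # -> 5.25
```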
  • a second embodiment will be explained based on a stereo image measurement using the method of matching stereo images and detecting disparity between the images in accordance with the present invention.
  • FIG. 22 is a flow diagram showing sequential processes for executing the second embodiment of the present invention, covering the stereo image pickup phase through the disparity determination phase.
  • in the image pickup phase (A), right and left images are taken in through right and left image-pickup devices in steps S1601 and S1602.
  • the processing performed in the image-pickup phase (A) is identical with that of the first embodiment.
  • the right and left images, obtained in the image-pickup phase (A) are respectively subjected to feature extraction in the next feature extraction phase (B') in steps S1603 and S1604.
  • in the succeeding matching phase (C), the extracted features of the right and left images are compared to check how they match with each other in step S1605.
  • in the disparity determination phase (D), a disparity is determined for each block (step S1606).
  • the processing performed in the matching phase (C) and the disparity determination phase (D) is identical with that of the first embodiment.
  • the two images, right image IR and left image IL, obtained in the image pickup phase (A), are developed into a plurality of frequency component images in the feature extraction phase (B').
  • Each frequency component image is subjected to the secondary differential processing. Thereafter, each image is converted pixel by pixel into ternary values, thus obtaining the following ternary-valued frequency component images.
  • TL1, TL2, TL3, - - - , TLk, TLk+1, - - - , TLn
  • FIG. 25 is a view illustrating a method of transforming or quantizing the frequency component images into ternary values used in the second embodiment of the present invention. As shown in FIG. 25, all of the frequency component images are classified into three values by judging whether each pixel of a concerned image corresponds to a zero-crossing point and, if it does, whether the sign of its gradient is positive or negative. For example, ternary values are given to respective pixels as follows: a pixel not corresponding to a zero-crossing point → 0 ("z"); a zero-crossing point with a positive gradient → 1 ("p"); a zero-crossing point with a negative gradient → -1 ("m").
  • This ternary-value processing (G) is comparable with or superior to the ternary-value processing (F) of the first embodiment in the accurate detection of edge positions and in robustness against sensitivity differences between the right and left images, although it is somewhat weaker against noises.
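A sketch of this zero-crossing ternary-value processing (G); detecting sign changes horizontally (i.e. along the epipolar line) is an assumption:

```python
import numpy as np

def ternarize_zero_crossing(F):
    """Ternary-value processing (G): mark pixels where the secondary-
    differentiated image F changes sign between horizontal neighbours,
    with the sign of the gradient deciding "p" (+1) or "m" (-1)."""
    T = np.zeros(F.shape, dtype=np.int8)          # default 0: "z"
    left, right = F[:, :-1], F[:, 1:]
    crossing = (left * right) < 0                 # sign change: zero-crossing
    T[:, :-1][crossing & (right > left)] = 1      # positive gradient: "p"
    T[:, :-1][crossing & (right < left)] = -1     # negative gradient: "m"
    return T
```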
  • FIG. 23 is a block diagram showing the arrangement of a third apparatus which realizes the processing of feature extraction phase (B') of FIG. 22.
  • Left image IL, received in the feature extraction phase (B'), is the image obtained in the image-pickup phase (A) which is band limited to fc (Hz).
  • LPFk low-pass filters
  • HPFk high-pass filters
  • each of the plural ternary-valued frequency component images TLk thus obtained reveals the edge positions involved in the corresponding frequency component image.
  • Each edge position is used for the matching of right and left images in the succeeding matching phase (C).
  • the number of frequency component images FLk or the width of each frequency band should be determined by taking the required performance and the allowable cost range into consideration, in the same manner as in the first embodiment.
  • FIG. 24 is a block diagram showing the arrangement of a fourth apparatus which realizes the processing of feature extraction phase (B') of FIG. 22.
  • This fourth apparatus is identical with the second apparatus of the first embodiment shown in FIG. 11 except for the ternary-value processing (G).
  • the image is developed into a plurality of frequency component images FLk which are then converted into ternary-valued frequency component images TLk through ternary-value processing.
  • ternary-valued frequency component images TLk are sent to the succeeding matching phase (C) to perform the stereo image matching operation based on one-dimensional windows.
  • a disparity of each block is finally determined in the disparity determination phase (D).
  • a third embodiment will be explained based on a stereo image measurement using the method of matching stereo images and detecting disparity between the images in accordance with the present invention.
  • FIG. 26 is a flow diagram showing sequential processes for executing the third embodiment of the present invention, covering the stereo image pickup phase through the disparity determination phase.
  • in the image pickup phase (A), right and left images are taken in through right and left image-pickup devices in steps S2001 and S2002.
  • the processing performed in the image-pickup phase (A) is identical with those of the first and second embodiments.
  • the right and left images, obtained in the image-pickup phase (A) are respectively subjected to feature extraction in the next feature extraction phase (B") in steps S2003 and S2004.
  • in the succeeding matching phase (C), the extracted features of the right and left images are compared to check how they match with each other in step S2005.
  • in the disparity determination phase (D), a disparity is determined for each block (step S2006).
  • the processing performed in the matching phase (C) and the disparity determination phase (D) is identical with that of the first and second embodiments.
  • the two images, right image IR and left image IL, obtained in the image pickup phase (A), are developed into a plurality of frequency component images in the feature extraction phase (B").
  • TL1, TL2, TL3, - - - , TLk, TLk+1, - - - , TLn
  • the ternary-value processing of the third embodiment is characterized in that the low-frequency component images are processed through the previously-described ternary-value processing (F) of the first embodiment while the high-frequency component images are processed through the above-described ternary-value processing (G) of the second embodiment.
  • the high-frequency component images have accurate information with respect to the edge positions when they are compared with the low-frequency component images.
  • the zero-crossing point classification is used for converting high-frequency component images into ternary values.
  • the edge information, obtained through the ternary-value processing (G) tends to involve erroneous edge information due to noises.
  • the low-frequency component images are converted into ternary values by using the threshold classification, since low-frequency component images do not represent the edge positions so accurately.
  • the edge information, obtained through the ternary-value processing (F) seldom involves erroneous edge information derived from noises.
  • FIG. 27 is a block diagram showing the arrangement of a fifth apparatus which realizes the processing of feature extraction phase (B") of FIG. 26.
  • Left image IL, received in the feature extraction phase (B") is the image obtained in the image-pickup phase (A) which is band limited to fc (Hz).
  • LPFk low-pass filters
  • HPFk high-pass filters
  • the low-frequency component images of the developed frequency component images FLk are converted or quantized into ternary-valued data through the ternary-value processing (F) explained in the first embodiment.
  • the high-frequency component images of the developed frequency component images FLk are converted or quantized into ternary-valued data through the ternary-value processing (G) explained in the second embodiment.
  • each of the plural ternary-valued frequency component images TLk thus obtained reveals the edge positions involved in the corresponding frequency component image.
  • Each edge position is used for the matching of right and left images in the succeeding matching phase (C).
  • the number of frequency component images FLk or the width of each frequency band, as well as selection between the ternary-value processing (F) and the ternary-value processing (G), should be determined by taking the required performance and the allowable cost range into consideration.
  • FIG. 28 is a block diagram showing the arrangement of a sixth apparatus which realizes the processing of feature extraction phase (B") of FIG. 26.
  • This sixth apparatus is identical with the second and fourth apparatuses of the first and second embodiments shown in FIGS. 11 and 24, except for the ternary-value processing portion.
  • the image is developed into a plurality of frequency component images FLk which are then converted into ternary-valued frequency component images TLk through ternary-value processing.
  • ternary-valued frequency component images TLk are sent to the succeeding matching phase (C) to perform the stereo image matching operation based on one-dimensional windows.
  • a disparity of each block is finally determined in the disparity determination phase (D).
  • although the embodiments of the present invention use only the odd-numbered lines for the scanning operation, the same effect will be obtained by using only the even-numbered lines. If all the lines are used for the scanning operation, the reliability of the disparity measurement will be enhanced, although the processing volume is doubled.
  • the present invention provides a novel and excellent method of matching stereo images and of detecting a disparity between these images which is small in the computation amount, compact and inexpensive in the hardware arrangement, fast in the processing, and reliable and accurate in the performance of the stereo image matching and the disparity detection.
  • the present invention can be applied, for example, to various industrial monitoring systems, such as an obstacle monitor at a railroad crossing or an invader monitor in a building, by utilizing its capability of always measuring a disparity based on successively sampled stereo images and detecting the change of the disparity.

Abstract

In the image pickup phase (A), right and left images are taken in through two image-pickup devices (S101, S102). Then, in the next feature extraction phase (B), right and left images are respectively subjected to feature extraction (S103, S104). Thereafter, in the succeeding matching phase (C), the extracted features of right and left images are compared to check how they match with each other (step S105). More specifically, in the matching phase (C), a one-dimensional window is set, this one-dimensional window is shifted along the left image in accordance with a predetermined scanning rule so as to successively set overlapped one-dimensional windows, and a matching operation is performed by comparing the image features within one window and corresponding image features on the right image. Subsequently, in the disparity determination phase (D), the left image is dissected or divided into plural blocks each having a predetermined size, a histogram in each block is created from disparities obtained by the matching operation based on one-dimensional windows involving pixels of a concerned block, and a specific disparity just corresponding to the peak of thus obtained histogram is identified as a valid disparity representing the concerned block (S106).

Description

BACKGROUND OF THE INVENTION
1. Field of the Invention
This invention generally relates to a method of matching stereo images and a method of detecting disparity between these images, which are chiefly used in the industrial field of stereo cameras: detecting positional information in the image-pickup space from stereo images, compressing the data volume of stereo images (i.e. three-dimensional video images), controlling the display of such stereo images, extracting the optical flow of moving images, and so on.
2. Prior Art
Generally known, conventional methods of matching stereo images and of detecting disparity between these images will be hereinafter explained with reference to a so-called stereo image measurement technology where the position or distance information can be obtained in the image-pickup space by performing the matching between two images (stereo images) and detecting a disparity between these images.
FIG. 1 is a view illustrating the principle of a typical stereo image measurement. In FIG. 1, a three-dimensional coordinate, generally defined by variables x, y and z, represents the real space. A two-dimensional coordinate, generally defined by variables X and Y, represents the plane of image (i.e. an image-pickup plane of a camera). There are provided a pair of two-dimensional coordinates for a pair of cameras 23R and 23L. A position on the image plane of right camera 23R can be expressed by variables XR and YR on one two-dimensional coordinate. A position on the image plane of left camera 23L can be expressed by variables XL and YL on the other two-dimensional coordinate.
Axes XL and XR are parallel to the axis x, while axes YL and YR are parallel to the axis y. Axis z is parallel to the optical axes of two cameras 23R and 23L. The origin of the real space coordinate (x, y, z) coincides with a midpoint between the projective centers of right and left cameras 23R and 23L. The distance between the projective centers is generally referred to as a base length denoted by 2a. A distance, denoted by f, is a focal distance between each projective center and its image plane.
It is now assumed that a real-space point p is projected at a point PR(XR,YR) on the right image plane and at the same time a point PL(XL,YL) on the left image plane. According to the stereo image measurement, PR and PL are determined on respective image planes (by performing the matching of stereo images) and then the real-space coordinate (x, y, z) representing the point p is obtained based on the principle of the trigonometrical survey.
YR and YL have identical values in this case, because the two optical axes of cameras 23R and 23L exist on the same plane and the X axes of cameras 23R and 23L are parallel to axis x. The relationship between the coordinate values XL, YL, XR, YR and the real-space coordinate values x, y, z is expressed by the following equations:
XL=(x+a)f/z, XR=(x-a)f/z, YL=YR=yf/z                       (Eq. 1)
x=a(XL+XR)/d, y=2aYL/d, z=2af/d                            (Eq. 2)
where d represents the disparity (between stereo images).
d=XL-XR                                                    (Eq. 3)
As "a" is a positive value (a>0), the following relation is derived from the above equation 2.
XL>XR and YL=YR                                            (Eq. 4)
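To make the reconstructed equations 1-3 concrete, here is a small Python sketch (illustrative only; the camera parameters are arbitrary):

```python
def project(x, y, z, a, f):
    """Project a real-space point p(x, y, z) onto both image planes (Eq. 1);
    2a is the base length and f the focal distance."""
    XL = (x + a) * f / z
    XR = (x - a) * f / z
    YL = YR = y * f / z
    return (XL, YL), (XR, YR)

def triangulate(XL, YL, XR, a, f):
    """Recover p(x, y, z) from matched points PL and PR (Eq. 2)."""
    d = XL - XR                       # disparity (Eq. 3); d > 0 since a > 0
    return a * (XL + XR) / d, 2 * a * YL / d, 2 * a * f / d

# Round trip: a point 5 m ahead of cameras 0.2 m apart (a = 0.1), f = 0.01.
(PL, PR) = project(0.5, 0.3, 5.0, a=0.1, f=0.01)
print(triangulate(PL[0], PL[1], PR[0], a=0.1, f=0.01))  # (0.5, 0.3, 5.0)
```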
Understood from the above-given relationship is that a specific point on one image plane has a matching point on the other image plane along the same scanning line serving as an epipolar line within the region defined by XL>XR. Accordingly, the matching point corresponding to a specific point on one image plane can be found on the other image plane by checking the similarity of images in each micro area along the line having the possibility of detecting the matching point.
Some of the similarity evaluation methods will be explained below. FIG. 2 shows a conventional method of detecting a mutual correlation value between two images, disclosed in "Image Processing Handbook" (Shokodo Publishing Co. Ltd.) by Morio ONOUE et al., for example.
First of all, designation is given to a pixel 2403 existing somewhere on the left image 2401. A pixel matching this pixel 2403 is next sought along the plane of the right image 2402. In other words, the matching point is determined. More specifically, a square micro area 2404 (hereinafter referred to as a micro area) is set on the left image 2401 so as to have a size corresponding to n×m pixels, sufficient to involve the designated pixel 2403 at the center thereof. It is now assumed that IL(i,j) represents the brightness of each point (pixel) within the micro area 2404.
On the other hand, a square micro area 2405 on the right image 2402 is designated as a micro area having its center on a pixel satisfying the condition of equation 4. The micro area 2405 has a size corresponding to n×m pixels. It is assumed that IR(i,j) represents the brightness of each point (pixel) within the micro area 2405.
Furthermore, it is assumed that μL, μR, σL² and σR² represent the averages and variances of the brightness in the micro areas 2404 and 2405. The mutual correlation value of these micro areas can be given by the following equation:
c = ΣiΣj (IL(i,j)-μL)(IR(i,j)-μR) / (nm√(σL²σR²))           (Eq. 5)
The value "c" defined by the equation 5 is calculated along the straight line (epipolar line) having the possibility of detecting a matching point. Then, the point where the value "c" is maximized is identified as the matching point to be detected. According to this method, it becomes possible to determine the matching point as having the size identical with a pixel. If the matching point is once found, the disparity "d" can be immediately obtained using the equation 3 based on the coordinate values representing thus found matching point.
However, this conventional method is disadvantageous in that a great amount of computations will be required for completely obtaining all the matching points of required pixels since even a single search of finding only one matching point of a certain pixel requires the above-described complicated computations to be repetitively performed with respect to the entire region having the possibility of detecting the matching point.
The computation for obtaining the correlation can be sped up by reducing the size of the micro area, although the stability of the matching point detection then worsens owing to the increased influence of image distortion and noise. Conversely, increasing the size of the micro area not only increases the computation time but also deteriorates the accuracy of the matching point detection, because the variation of the correlation values is undesirably flattened. Thus, the size of the micro area must be set adequately in consideration of the characteristics of the images to be handled.
Furthermore, as is apparent from equation 3, a characteristic of the above-described conventional method is that the determination of the disparity directly reflects the result of the stereo image matching. Hence, any erroneous matching causes an error in the measurement of disparity "d". In short, an error in the stereo image matching leads to an error in the disparity measurement.
In this manner, the method of determining a matching point for each pixel is disadvantageous in that the volume of computation becomes huge. To solve this problem, one proposed technique divides or dissects the image into several blocks, each having a predetermined size, and determines the matching region block by block. For example, "Driving Aid System based on Three-dimensional Image Recognition Technology", by Jitsuyoshi et al., Pre-publishing 924, pp. 169-172, of the Automotive Vehicle Technical Institute Scientific Lecture Meeting, October 1992, discloses such a method of searching for the matching region based on the comparison between blocks of the right and left images.
FIG. 3 is a view illustrating this conventional method of matching stereo images between square micro areas (blocks). The left image 2501, serving as a reference image, is dissected into a plurality of blocks so that each block (2503) has a size of n×m pixels. To obtain the disparity, the matching region for each block of the left image 2501 is searched from the plane of the right image 2502. The following equation is the similarity evaluation used for determining the matching region.
C = Σ|Li - Ri|    (Eq. 6)
where Li represents luminance of i-th pixel in the left block 2503, while Ri represents luminance of i-th pixel in the right block 2504.
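For illustration, a minimal sketch of the block-based search using equation 6, under the same assumptions as the earlier correlation sketch (numpy arrays, illustrative names and parameters, boundary checks omitted):

    import numpy as np

    def match_block(left, right, bx, by, n=16, m=16, max_disp=64):
        """Return the disparity minimizing C for the n x m block at (bx, by)."""
        block_l = left[by:by + m, bx:bx + n].astype(np.int32)
        costs = []
        for d in range(0, max_disp + 1):
            block_r = right[by:by + m, bx - d:bx + n - d].astype(np.int32)
            costs.append(np.sum(np.abs(block_l - block_r)))   # C = sum |Li - Ri|
        return int(np.argmin(costs))   # smaller C means higher similarity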
This evaluation is less complicated than the calculation of equation 5, which includes subtraction of the average values. However, the hardware scale is still large because of the line memories used for the evaluation of two-dimensional similarity. Furthermore, the overall processing time is fairly long owing to the many accesses to these memories.
Moreover, using luminance values for the similarity evaluation increases the hardware cost, because preprocessing is additionally required to adjust the sensitivity difference between the right and left cameras and to perform shading correction before executing the stereo image matching.
A straight line existing in the image-pickup space may be imaged as straight lines 2603 and 2604 of different gradients in blocks 2605 and 2606 of the left and right images 2601 and 2602, as shown in FIG. 4. In such a case, the matching regions may not be determined accurately.
Conversely, two different lines may be imaged as identical lines in blocks 2703 and 2704 of the left and right images 2701 and 2702, as shown in FIG. 5. Hence, comparing only the pixels within the two blocks 2703 and 2704 causes a problem in that the stereo image matching may be performed erroneously and the succeeding measurement of disparity will fail.
According to the above-described disparity measuring methods, the minimum unit for measuring each disparity is one pixel, because the image data are digital data sampled at a certain frequency. However, it is possible to perform the disparity measurement more accurately, as follows.
FIG. 6 is a view illustrating a conventional disparity measuring method capable of detecting a disparity with sub-pixel accuracy. FIG. 6 shows a peak position found in the similarity evaluation value C (ordinate) when equation 6 is calculated along the search region in each block. The sub-pixel disparity measurement uses the similarity evaluations Ci, Ci-1, Ci+1 corresponding to the disparities di, di-1, di+1 (in increments of one pixel) existing before and after the peak position. More specifically, a first straight line 2801 is obtained as the line crossing the two points (di-1, Ci-1) and (di, Ci). A second straight line 2802 is obtained as the line crossing the point (di+1, Ci+1) and having a gradient symmetrical with that of line 2801 (i.e. identical in absolute value but opposite in sign). A point 2803 is then obtained as the intersection of the two straight lines 2801 and 2802. The disparity ds corresponding to this intersection 2803 is finally obtained as the sub-pixel level disparity of the concerned block.
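The following minimal sketch implements this line-intersection interpolation; the function name is illustrative. With occurrence counts substituted for the similarity evaluations, the same computation also serves the histogram-based method described later with reference to FIG. 21.

    def subpixel_peak(d_prev, c_prev, d_peak, c_peak, d_next, c_next):
        """Sub-pixel position ds of the extremum around (d_peak, c_peak)."""
        s = (c_peak - c_prev) / (d_peak - d_prev)   # gradient of the first line
        if s == 0.0:
            return d_peak                            # flat neighborhood: no refinement
        # intersection of y = c_peak + s*(x - d_peak) and y = c_next - s*(x - d_next)
        return 0.5 * (d_peak + d_next) + (c_next - c_peak) / (2.0 * s)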
As is apparent from the foregoing description, the above-described conventional stereo image matching and disparity detecting methods generally suffer from increased hardware costs and prolonged processing times, because of the four-rule arithmetic calculations of equations 5 and 6 required for the similarity evaluation in the stereo image matching.
Furthermore, performing the similarity evaluation based on two-dimensional windows necessarily requires the provision of line memories in hardware and involves frequent accesses to these memories, resulting in further increases in hardware cost and processing time.
Still further, utilizing the comparison of luminance differences between the right and left images definitely increases the hardware costs, owing to the additional preprocessing components used for the sensitivity adjustment and shading correction between the right and left cameras, which are performed before executing the stereo image matching.
Yet further, using a single block, identical in size with the two-dimensional window serving as the unit for the matching, as the unit for determining the disparity causes a problem in that any error occurring in the matching phase based on the two-dimensional window directly affects the disparity detection of the corresponding block. In short, there is no means capable of absorbing or correcting an error occurring in the matching phase.
Moreover, determining each matching region using only the pixels existing in a block (=two-dimensional window) may result in failure to detect the true matching region.
SUMMARY OF THE INVENTION
Accordingly, in view of above-described problems encountered in the prior art, a principal object of the present invention is to provide a method of matching stereo images and of detecting disparity between these images, small in the volume of computations, compact in the hardware construction, quick in processing, highly reliable, and excellent in accuracy.
In order to accomplish this and other related objects, a first aspect of the present invention provides a novel and excellent method of matching stereo images, comprising the steps of: inputting first and second images IL and IR; developing the images IL and IR into a plurality of frequency component images FL1, FL2, FL3, - - - , FLk, FLk+1, - - - , FLn and a plurality of frequency component images FR1, FR2, FR3, - - - , FRk, FRk+1, - - - , FRn, respectively; applying a secondary differential processing to each of the frequency component images; converting each frequency component image, after being applied the secondary differential processing, into ternary values pixel by pixel, thereby obtaining ternary-valued frequency component images TL1, TL2, TL3, - - - , TLk, TLk+1, - - - , TLn and ternary-valued frequency component images TR1, TR2, TR3, - - - , TRk, TRk+1, - - - , TRn; and performing a matching operation between the first and second images based on the ternary-valued frequency component images.
A second aspect of the present invention provides a method of matching stereo images, comprising the steps of: inputting first and second images IL and IR; developing the images IL and IR into a plurality of frequency component images FL1, FL2, FL3, - - - , FLk, FLk+1, - - - , FLn and a plurality of frequency component images FR1, FR2, FR3, - - - , FRk, FRk+1, - - - , FRn, respectively; applying a secondary differential processing to each of the frequency component images; converting each frequency component image, after being applied the secondary differential processing, into ternary values pixel by pixel by using a positive threshold TH1(>0) and a negative threshold TH2(<0) in such a manner that a pixel larger than TH1 is designated to "p", a pixel in a range between TH1 and TH2 is designated to "z", and a pixel smaller than TH2 is designated to "m", thereby obtaining ternary-valued frequency component images TL1, TL2, TL3, - - - , TLk, TLk+1, - - - , TLn and ternary-valued frequency component images TR1, TR2, TR3, - - - , TRk, TRk+1, - - - , TRn; and performing a matching operation between the first and second images based on the ternary-valued frequency component images.
A third aspect of the present invention provides a method of matching stereo images, comprising the steps of: inputting first and second images IL and IR; developing the images IL and IR into a plurality of frequency component images FL1, FL2, FL3, - - - , FLk, FLk+1, - - - , FLn and a plurality of frequency component images FR1, FR2, FR3, - - - , FRk, FRk+1, - - - , FRn, respectively; applying a secondary differential processing to each of the frequency component images; converting each frequency component image, after being applied the secondary differential processing, into ternary values pixel by pixel in such a manner that a pixel not related to a zero-crossing point is designated to "z", a pixel related to a zero-crossing point and having a positive gradient is designated to "p", and a pixel related to a zero-crossing point and having a negative gradient is designated to "m", thereby obtaining ternary-valued frequency component images TL1, TL2, TL3, - - - , TLk, TLk+1, - - - , TLn and ternary-valued frequency component images TR1, TR2, TR3, - - - , TRk, TRk+1, - - - , TRn; and performing a matching operation between the first and second images based on the ternary-valued frequency component images.
A fourth aspect of the present invention provides a method of matching stereo images, comprising the steps of: inputting first and second images IL and IR; developing the images IL and IR into a plurality of frequency component images FL1, FL2, FL3, - - - , FLk, FLk+1, - - - , FLn and a plurality of frequency component images FR1, FR2, FR3, - - - , FRk, FRk+1, - - - , FRn, respectively; applying a secondary differential processing to each of the frequency component images; converting each low frequency component image of the frequency component images, after being applied the secondary differential processing, into ternary values pixel by pixel by using a positive threshold TH1(>0) and a negative threshold TH2(<0) in such a manner that a pixel larger than TH1 is designated to "p", a pixel in a range between TH1 and TH2 is designated to "z", and a pixel smaller than TH2 is designated to "m", and converting each high frequency component image of the frequency component images, after being applied the secondary differential processing, into ternary values pixel by pixel in such a manner that a pixel not related to a zero-crossing point is designated to "z", a pixel related to a zero-crossing point and having a positive gradient is designated to "p", and a pixel related to a zero-crossing point and having a negative gradient is designated to "m", thereby obtaining ternary-valued frequency component images TL1, TL2, TL3, - - - , TLk, TLk+1, - - - , TLn and ternary-valued frequency component images TR1, TR2, TR3, - - - , TRk, TRk+1, - - - , TRn; and performing a matching operation between the first and second images based on the ternary-valued frequency component images.
According to the features of preferred embodiments of the present invention, the first image IL is designated as a reference image for the matching operation, a one-dimensional window capable of encompassing N pixels therein is set on the ternary-valued frequency component image of the first image IL, and a matching region having the same ternary-value pattern as the N pixels in the one-dimensional window is searched from the ternary-valued frequency component image of the second image IR.
According to the features of the preferred embodiments of the present invention, one of the first and second images IL and IR is designated as a reference image for the matching operation, a plurality of one-dimensional windows are set on the entire surface of the ternary-valued frequency component image of the reference image through a scanning operation along an epipolar line, so that the one-dimensional windows are successively overlapped at the same intervals of N/2 when each of the one-dimensional windows has a size equivalent to N pixels, and the matching operation is carried out with respect to each of the one-dimensional windows.
According to the features of the preferred embodiments of the present invention, pixels in a one-dimensional window of a ternary-valued frequency component image TLk of the first image IL are compared in a one-to-one manner with pixels in a designated region of a ternary-valued frequency component image TRk of the second image IR, when the ternary-valued frequency component images TLk and TRk are identical in their frequency components, wherein an evaluation result "P" is obtained when corresponding two pixels are both "p" or both "m", while an evaluation result "Z" is obtained when the corresponding two pixels are both "z", and a similarity between two ternary-valued frequency component images TLk and TRk is evaluated by using the following equation:
Εall=Σβk(PN)k+Σγk(ZN)k
where PN represents a total number of pixels having the evaluation result "P", ZN represents a total number of pixels having the evaluation result "Z", and βk and γk represent weighting factors.
According to the features of the preferred embodiments of the present invention, pixels in a one-dimensional window of a ternary-valued frequency component image TLk of the first image IL are compared in a one-to-one manner with pixels in a designated region of a ternary-valued frequency component image TRk of the second image IR, when the ternary-valued frequency component images TLk and TRk are identical in their frequency components, wherein an evaluation result "P" is obtained when corresponding two pixels are both "p" or both "m", while an evaluation result "Z" is obtained when the corresponding two pixels are both "z", a similarity between two ternary-valued frequency component images TLk and TRk is evaluated by using the following equation:
Εall=Σβk(PN)k+Σγk(ZN)k
where PN represents a total number of pixels having the evaluation result "P", ZN represents a total number of pixels having the evaluation result "Z", and βk and γk represent weighting factors, and a matching result in the matching operation is validated only when Σ βk(PN)k is larger than a predetermined threshold TH3(>0).
Furthermore, a fifth aspect of the present invention provides a novel and excellent method of detecting a disparity between stereo images, comprising the steps of: comparing pixels in a micro region defined by a one-dimensional window set on a reference image with pixels in a designated micro region on a non-reference image; evaluating a similarity between two micro regions using the following equation:
Εall=Σβk(PN)k+Σγk(ZN)k
where PN represents a total number of pixels having an evaluation result "P" while ZN represents a total number of pixels having an evaluation result "Z", and βk and γk represent weighting factors; searching a first region having the highest similarity and a second region having the second highest similarity; specifying a first candidate disparity as a disparity corresponding to the first region, and a second candidate disparity as a disparity corresponding to the second region; and determining a valid disparity between the stereo images based on the first and second candidate disparities.
Moreover, a sixth aspect of the present invention provides a method of detecting a disparity between stereo images, comprising the steps of: dividing each of first and second images IL and IR into a plurality of blocks each having a size of M×L pixels; matching ternary-valued frequency component images of the images IL and IR; comparing pixels in a micro region defined by a one-dimensional window set on the first image IL with pixels in a designated micro region on the second image IR; evaluating a similarity between two micro regions using the following equation:
Εall=Σβk(PN)k+Σγk(ZN)k
where PN represents a total number of pixels having an evaluation result "P" while ZN represents a total number of pixels having an evaluation result "Z", and βk and γk represent weighting factors; searching a first region having the highest similarity and a second region having the second highest similarity in a concerned block; specifying a first candidate disparity as a disparity corresponding to the first region, and a second candidate disparity as a disparity corresponding to the second region; creating a histogram based on the first and second candidate disparities; and determining a valid disparity of the concerned block as a disparity corresponding to a peak position of the histogram.
According to the features of the preferred embodiments of the present invention, in the above-described disparity detecting method, the first image IL is designated as a reference image, a one-dimensional window capable of encompassing N pixels therein is set on the ternary-valued frequency component image of the first image IL, and a matching region having the same ternary-value pattern as the N pixels in the one-dimensional window is searched from the ternary-valued frequency component image of the second image IR. Alternatively, one of the first and second images IL and IR is designated as a reference image, a plurality of one-dimensional windows are set on the entire surface of the ternary-valued frequency component image of the reference image through a scanning operation along an epipolar line, so that the one-dimensional windows are successively overlapped at the same intervals of N/2 when each of the one-dimensional windows has a size equivalent to N pixels, and a matching operation is carried out with respect to each of the one-dimensional windows.
According to the features of the preferred embodiments, the valid disparity is calculated as a sub-pixel level disparity corresponding to an intersecting point of a first straight line crossing two points (di-1, hi-1), (di, hi) and a second straight line crossing a point (di+1, hi+1) with a gradient symmetrical with the first straight line, where di-1, di, di+1 represent disparities near the peak position of the histogram and hi-1, hi, hi+1 represent the number of occurrences of the disparities di-1, di, di+1 respectively.
BRIEF DESCRIPTION OF THE DRAWINGS
The above and other objects, features and advantages of the present invention will become more apparent from the following detailed description which is to be read in conjunction with the accompanying drawings, in which:
FIG. 1 is a view illustrating the principle of the stereo image measurement;
FIG. 2 is a view illustrating a conventional method of checking a mutual correlation value between two images;
FIG. 3 is a view illustrating a conventional method of matching stereo images based on the comparison of square micro regions (blocks) of two images;
FIG. 4 is a view illustrating a problem in a conventional method;
FIG. 5 is a view illustrating another problem in a conventional method;
FIG. 6 is a view illustrating a detection of a sub-pixel level disparity in accordance with a conventional disparity detecting method;
FIG. 7 is a flow diagram showing sequential processes for executing a first embodiment of the present invention, covering the pickup of stereo images through the determination of disparity;
FIG. 8 is a view illustrating a monochrome image used in the explanation of one embodiment method of matching stereo images and of detecting disparity between these images in accordance with the present invention;
FIG. 9 is a block diagram showing an arrangement of a first apparatus which realizes the processing of feature extraction phase (B) of FIG. 7;
FIGS. 10A, 10B, 10C and 10D are graphs showing examples of various frequency component images obtained as a result of the feature extraction phase processing shown in FIGS. 9, 23 and 27;
FIG. 11 is a block diagram showing an arrangement of a second apparatus which realizes the processing of feature extraction phase (B) of FIG. 7;
FIG. 12 is a view illustrating a method of transforming or quantizing the frequency component images into ternary values used in the first and third embodiments of the present invention;
FIG. 13 is a view illustrating a method of dividing an image into plural blocks, each serving as the unit for determining disparity, in accordance with the present invention;
FIG. 14 is a view illustrating a scanning method of a one-dimensional window serving as the unit for matching stereo images in the present invention;
FIG. 15 is a view illustrating the relationship between the one-dimensional window serving as the unit for matching stereo images and a block serving as the unit for determining a disparity in the present invention;
FIG. 16 is a view illustrating a method of determining a disparity candidate based on the one-dimensional window search of the present invention;
FIG. 17 is a view illustrating a method of evaluating a similarity based on the one-dimensional window search of the present invention;
FIG. 18 is a view illustrating an example of a storage region used for temporarily storing candidate disparities which are determined in relation to each of one-dimensional windows in accordance with the present invention;
FIG. 19 is a view illustrating a method of creating a histogram in relation to blocks, based on candidate disparities temporarily stored in the storing region in relation to one-dimensional windows, in accordance with the present invention;
FIG. 20 is a graph showing an example of the histogram created in each block in accordance with the present invention;
FIG. 21 is a graph showing a method of measuring a disparity at the accuracy of sub-pixel level based on the histogram created in relation to blocks of the present invention;
FIG. 22 is a flow diagram showing sequential processes for executing a second embodiment of the present invention, covering the pickup of stereo images through the determination of disparity;
FIG. 23 is a block diagram showing an arrangement of a third apparatus which realizes the processing of feature extraction phase (B') of FIG. 22 in accordance with the second embodiment;
FIG. 24 is a block diagram showing an arrangement of a fourth apparatus which realizes the processing of feature extraction phase (B') of FIG. 22 in accordance with the second embodiment;
FIG. 25 is a view illustrating a method of transforming or quantizing the frequency component images into ternary values used in the second and third embodiments of the present invention;
FIG. 26 is a flow diagram showing sequential processes for executing a third embodiment of the present invention, covering the pickup of stereo images through the determination of disparity;
FIG. 27 is a block diagram showing an arrangement of a fifth apparatus which realizes the processing of feature extraction phase (B") of FIG. 26 in accordance with the third embodiment; and
FIG. 28 is a block diagram showing an arrangement of a sixth apparatus which realizes the processing of feature extraction phase (B") of FIG. 26 in accordance with the third embodiment.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
Preferred embodiments of the present invention will be explained in greater detail hereinafter, with reference to the accompanying drawings. Identical parts are denoted by the same reference numerals throughout the views.
A method of matching stereo images and a method of detecting a disparity between these images will be hereinafter explained in accordance with the present invention.
First Embodiment
A first embodiment will be explained based on a stereo image measurement using the method of matching stereo images and detecting disparity between the images in accordance with the present invention.
FIG. 7 is a flow diagram showing sequential processes for executing the first embodiment of the present invention, covering the stereo image pickup phase through the disparity determination phase. In the image pickup phase (A), two, right and left, images are taken in through two, right and left, image-pickup devices in steps S101 and S102. Then, the right and left images, obtained in the image-pickup phase (A), are respectively subjected to feature extraction in the next feature extraction phase (B) in steps S103 and S104. Thereafter, in the succeeding matching phase (C), the extracted features of the right and left images are compared to check how they match with each other in step S105.
More specifically, in the matching phase (C), a one-dimensional window is set and shifted along a referential image plane (one of the right and left image planes) in accordance with a predetermined scanning rule, so as to successively set windows each serving as the unit for matching the stereo images; a matching operation is performed by comparing the image features within each window with the corresponding image features on the other image plane.
Subsequently, in the disparity determination phase (D), the referential image feature plane is dissected or divided into plural blocks, each having a predetermined size; a histogram is created in each block from the disparities obtained by the matching operations based on the one-dimensional windows involving pixels of the concerned block; and the disparity corresponding to the peak of the histogram thus obtained is identified as the valid disparity representing the concerned block in step S106. The processing performed in these phases (A) through (D) will hereinafter be described in greater detail.
A: IMAGE-PICKUP PHASE
Although there are various methods for arranging stereo cameras, this embodiment disposes a pair of right and left cameras in a parallel arrangement, where the two cameras are located at predetermined right and left positions in the horizontal direction so that their optical axes are parallel. The right-and-left parallel arrangement explained with reference to FIG. 1 is the ideal arrangement model adopted in this embodiment too. In practice, however, it is impossible to build the ideal arrangement of stereo cameras perfectly, without any dislocation. In this respect, it is important that the method of matching stereo images and the method of detecting a disparity between these images be flexible enough to allow for such dislocations.
In the following explanation, the right and left images obtained in the image-pickup phase (A) will be explained as monochrome images having a predetermined size of 768 (H)×480 (V). However, it is needless to say that the images handled in the present invention are not limited to the disclosed monochrome images. The right and left images, obtained in the image-pickup phase, are defined as follows.
Left Image : IL (x, y)
Right Image : IR (x, y)
where 1≦x≦768, 1≦y≦480, 0≦IL(x,y)≦255, and 0≦IR(x,y)≦255.
As shown in the monochrome image of FIG. 8, "x" represents a horizontal index of the image, while "y" represents a vertical index (i.e. line number) of the image. The pixel number is expressed by "x" from left to right, while the line number is expressed by "y" from top to bottom.
In performing the stereo image matching, one of the two images is designated as a reference image, and a matching region corresponding to a specific region of this reference image is searched from the other image. The left image, serving as the reference image in this embodiment, is dissected into numerous blocks each having a size of M×L pixels as shown in FIG. 13. As a practical example, each block has a size of 16×16 pixels (M=L=16). In this case, the left image is divided into 48 blocks in the horizontal direction and 30 blocks in the vertical direction, creating 1440 blocks in total. Hereinafter, each block is discriminated by the following identification data BL(X,Y).
Block ID : BL(X,Y), where 1≦X≦48, 1≦Y≦30
B: Feature Extraction Phase
The two images, right image IR and left image IL, obtained in the image pickup phase (A), are developed into a plurality of frequency component images in the feature extraction phase (B):
IL: L1, L2, L3, - - - , Lk, Lk+1, - - - , Ln
IR: R1, R2, R3, - - - , Rk, Rk+1, - - - , Rn
Each frequency component image is subjected to the secondary differential processing. Thereafter, each image is converted pixel by pixel into ternary values, thus obtaining the following ternary-valued frequency component images.
TL1, TL2, TL3, - - - , TLk, TLk+1, - - - , TLn
TR1, TR2, TR3, - - - , TRk, TRk+1, - - - , TRn
The above-described operation makes it possible to extract edges at various resolutions. The primary purpose of performing the above-described operation is as follows.
Basically, each edge position is unaffected by the sensitivity difference between the two cameras or by shading. By utilizing this preferable nature, it becomes possible to perform the stereo image matching accurately without any pre-processing, such as sensitivity difference correction of the cameras or shading correction. Furthermore, the ternary-value processing makes it possible to perform the similarity evaluation using a compact hardware arrangement.
The secondary purpose is as follows.
Low-frequency edges are robust against noise, but are inaccurate in their positions. On the other hand, high-frequency edges are accurate in their positions, although they tend to be adversely affected by noise. By utilizing these natures, it becomes possible to realize robust and accurate stereo image matching.
Next, the ternary-value processing will be explained. FIG. 12 is a view illustrating the method of transforming or quantizing the frequency component images into ternary values used in the first and third embodiments of the present invention. As shown in FIG. 12, a positive threshold TH1 (>0) and a negative threshold TH2 (<0) are provided to classify all of the frequency component images into three values. For example, ternary values are given to respective pixels as follows:

F(x,y) > TH1 - - - 1
TH2 ≦ F(x,y) ≦ TH1 - - - 0
F(x,y) < TH2 - - - -1    (Eq. 7)
The above-described ternary-value processing quantizes the images into 1 or -1 at their edges, especially in the vicinity of the (positive and negative) peak positions; otherwise the images are expressed by 0. This ternary-value processing is characterized in that its circuit can be arranged simply and is relatively robust against noise. However, if any sensitivity difference exists between the right and left images IR and IL, some pixels near the thresholds may yield erroneous edge-position information due to quantization error.
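A minimal sketch of this threshold ternarization (F), assuming a filtered frequency component image stored as a float numpy array (names are illustrative):

    import numpy as np

    def ternarize_threshold(F, TH1, TH2):
        """Map each pixel to 1 (F > TH1), 0 (TH2 <= F <= TH1) or -1 (F < TH2)."""
        assert TH1 > 0 > TH2
        T = np.zeros(F.shape, dtype=np.int8)
        T[F > TH1] = 1
        T[F < TH2] = -1
        return T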
FIG. 9 is a block diagram showing the arrangement of a first apparatus which realizes the processing of feature extraction phase (B) of FIG. 7. Left image IL (or right image IR), received in the feature extraction phase (B), is the left image IL (or right image IR) obtained in the image-pickup phase (A) which is band limited to fc (Hz). The input image IL is developed into a plurality of band signals having different frequency components (i.e. frequency component images FLk, k=1,2,3, - - - ,n) by plural low-pass filters (LPFk, k=1,2,3, - - - ) and high-pass filters (HPFk, k=1,2,3, - - - ,n) combined as shown in the drawing. Then, each band signal is quantized into a ternary value (i.e. ternary-valued frequency component image TLk, k=1,2,3, - - - ,n) through the succeeding ternary-value processing (F). The above-described HPFk is a high pass filter having a secondary differential function. FIGS. 10A, 10B, 10C and 10D are graphs showing examples of various frequency component images FLk (k=1,2,3, - - - ), i.e. band division examples, obtained as a result of the development using the circuit shown in the block diagram of FIG. 9.
Each of these plural ternary-valued frequency component images TLk, thus obtained, reveals the edge positions involved in the corresponding frequency component image. The edge positions are used for the matching of the right and left images in the succeeding matching phase (C). Regarding the settings, it is noted that the number of frequency component images FLk and the width of each frequency band should be determined by taking the required performance and the allowable cost range into consideration.
FIG. 11 is a block diagram showing the arrangement of a second apparatus which realizes the processing of feature extraction phase (B) of FIG. 7. The Laplacian-Gaussian function (∇²G), which forms the basis of the Laplacian-Gaussian filter of scale σ, is obtained by taking the second derivative of the Gaussian function. In a one-dimensional case (up to a constant factor):

∇²G(i) = ((i² - σ²)/σ⁴)·exp(-i²/(2σ²))    (Eq. 8)

In a two-dimensional case (up to a constant factor):

∇²G(i,j) = ((r² - 2σ²)/σ⁴)·exp(-r²/(2σ²))    (Eq. 9)

where r² = i² + j², and σ² represents the variance of the Gaussian function.
Obtaining the convolution of this function with the image (the Laplacian-Gaussian filter) is equivalent to smoothing the image with a Gaussian filter (LPF) and then taking the second derivative (Laplacian, HPF).
Changing the value of σ makes it possible to extract edges at a plurality of resolutions (scales), a property widely applied in image processing.
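For illustration, a minimal sketch of such a multi-resolution development, using the gaussian_laplace filter of scipy.ndimage; the choice of scales is an illustrative assumption (a small σ passes the high-frequency components, a large σ the low-frequency ones):

    import numpy as np
    from scipy.ndimage import gaussian_laplace

    def develop(image, sigmas=(1.0, 2.0, 4.0)):
        """One band-limited, secondarily differentiated image per scale."""
        img = image.astype(np.float64)
        return [gaussian_laplace(img, sigma) for sigma in sigmas]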
With the above-described method, the image is developed into a plurality of frequency component images which are then quantized into ternary-valued frequency component images as follows.
Left ternary-valued frequency component image:
TL1(x,y), TL2(x,y), TL3(x,y)
Right ternary-valued frequency component image:
TR1(x,y), TR2(x,y), TR3(x,y)
where 1≦x≦768, 1≦y≦480,
-1≦TL1(x,y), TL2(x,y), TL3(x,y), - - - ≦1, and -1≦TR1(x,y), TR2(x,y), TR3(x,y), - - - ≦1   (Eq. 10)
Thus obtained right and left ternary-valued frequency component images are sent to the succeeding matching phase (C) and used to check the matching of stereo images.
C: Matching Phase
In the matching phase, the matching of the right and left images is performed using the plurality of ternary-valued frequency component images obtained through the ternary-value processing in the feature extraction phase (B). One of the two stereo images is designated as a reference image in this matching operation, and a matching region corresponding to a specific region of the reference image is searched from the other image.
As explained in the image-pickup phase (A), this embodiment designates the left image as the reference image. Like the left image, serving as the reference image, which is dissected into numerous blocks each having the same size of M×L pixels as shown in FIG. 13, each of the left ternary-valued frequency component images TLk is dissected into numerous blocks as shown in FIG. 14. Hereinafter, block identification data BLk(X,Y) is used for discriminating the blocks of the left ternary-valued frequency component image TLk.
Block ID : BLk(X,Y), where 1≦X≦48, 1≦Y≦30
The matching operation of this embodiment is carried out along the odd-number lines only. Hereinafter, a scanning line is referred to as an objective scanning line when it is an object of the matching operation. The information relating to the even-number lines is not used at all in the matching phase or the succeeding phases.
First, as shown in FIG. 14, a one-dimensional window having a size of 1×16 pixels (i.e. L=1, M=16) is provided for performing a window scan along a concerned odd-number line (i.e. along one of the objective scanning lines) of the left ternary-valued frequency component image TLk(x,y). Each stroke of the one-dimensional window scan is 8 pixels, just half (M/2) of the window size (16 pixels). In other words, the above-described window is shifted in the x direction by half its size, so that the areas occupied by successive windows overlap. This scanning operation provides a total of 95 successively overlapping windows (768/8 - 1 = 95) along one objective scanning line.
A matching candidate region corresponding to each of the one-dimensional windows thus provided is searched from the right ternary-valued frequency component image TRk(x,y). Each of the one-dimensional windows is specified by identification data WNk(I,J).
Window ID : WNk(I,J), where 1≦I≦95 and 1≦J≦240
As shown in FIG. 15, a block BLk(X,Y) completely involves a total of 8 one-dimensional windows 901, which are generally expressed by the following equation using the block indexes X and Y.
WNk(I,J) = WNk(2X-1, 8(Y-1)+u), where 1≦u≦8    (Eq. 11)
Meanwhile, there exist a total of 8 one-dimensional windows 902, each bridging 8 (M/2) pixels of block BLk(X,Y) and 8 (M/2) pixels of block BLk(X-1,Y). These one-dimensional windows 902 are generally expressed by the following equation.
WNk(I,J) = WNk(2X-2, 8(Y-1)+u), where 1≦u≦8    (Eq. 12)
On the other hand, there exist a total of 8 one-dimensional windows 903, each bridging 8 (M/2) pixels of block BLk(X,Y) and 8 (M/2) pixels of block BLk(X+1,Y). These one-dimensional windows 903 are generally expressed by the following equation.
WNk(I,J) = WNk(2X, 8(Y-1)+u), where 1≦u≦8    (Eq. 13)
As apparent from the foregoing description, this embodiment is characterized by the one-dimensional windows each serving as the unit for the matching operation. The purpose of using such one-dimensional windows is to reduce the hardware size compared with the conventional two-dimensional windows, and also to shorten the processing time by reducing the number of accesses to the memories.
Furthermore, this embodiment is characterized in that the one-dimensional windows are successively arranged in an overlapping manner at equal intervals of 8 (M/2) pixels. The purpose of adopting such an overlapped arrangement is to enhance the reliability of each matching operation by allowing the supplementary use of adjacent pixels when, in determining the disparity of a block, the matching region cannot be determined unequivocally based only on the pixels in the block.
Next, a method of determining a matching region of each of the one-dimensional windows thus provided will be explained. As shown in FIG. 16, a matching region of each one-dimensional window being set on the left ternary-valued frequency component image TLk is searched from the right ternary-valued frequency component image TRk.
In the search, equation 14 (described below) is used to evaluate the similarity between the portions of the right and left ternary-valued frequency component images TLk and TRk involved in the designated one-dimensional windows. For each one-dimensional window, the region having the highest similarity is specified as the primary candidate disparity (disp1) and the region having the second highest similarity is specified as the secondary candidate disparity (disp2).
These primary and secondary candidate disparities, obtained in the above-described matching operation based on one-dimensional windows are mere candidates and are not the final disparity. The final disparity of each block is determined in the succeeding disparity determination phase (D) based on these primary and secondary candidate disparities.
Next, the method of evaluating similarity will be explained in more detail with reference to FIG. 17. In the evaluation of similarity, all 16 pixels in a given one-dimensional window on the left ternary-valued frequency component image TLk are compared with 16 consecutive pixels arrayed in the horizontal direction within a predetermined zone on the right ternary-valued frequency component image TRk, this predetermined zone being where a matching region can possibly be detected.
More specifically, the similarity between corresponding two pixels is evaluated using the following codes.
Both pixels valued 0 - - - Z
Both pixels valued 1 - - - P
Both pixels valued -1 - - - P
Other cases - - - 0
The coding operation for evaluating the similarity (i.e. the evaluation between corresponding pixels) is carried out for all 16 pixels in the given one-dimensional window. In this manner, the evaluation of similarity is applied to all the ternary-valued frequency component images TLk and TRk, finally yielding the overall similarity evaluation result as follows.
Εall=Σβk(PN)k+Σγk(ZN)k      (Eq. 14)
where PN represents a total number of pixels having the evaluation result "P", ZN represents a total number of pixels having the evaluation result "Z", and βk and γk represent weighting factors.
A large value of the overall similarity evaluation result Εall indicates that the similarity is high. Although "k" runs over the consecutive integers 1,2, - - - ,n in equation 14, it is possible to use only a subset of them. Furthermore, the first term on the right side of equation 14 expresses the number of pixels coinciding with each other with respect to the edge points serving as matching features. This number is believed to reflect the reliability of the result of the matching operation: the larger this number, the higher the reliability; the smaller this number, the lower the reliability.
Accordingly, if the first term on the right side is smaller than a predetermined threshold TH3 in the similarity evaluation result of the primary candidate disparity, this candidate disparity should be nullified or voided in order to eliminate erroneous matching.
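A minimal sketch of this window evaluation, assuming the ternary-valued component images are numpy arrays valued -1/0/1; for simplicity a single β and γ are shared by all components k, and the weighting factors and TH3 are illustrative parameters:

    import numpy as np

    def evaluate_window(TLs, TRs, xl, xr, y, N=16, beta=1.0, gamma=0.25, TH3=4.0):
        """Compare the left window at (xl, y) with the right candidate region
        at (xr, y) over all component images; return (Eall, valid)."""
        e_all = e_edge = 0.0
        for TL, TR in zip(TLs, TRs):               # one pair per frequency component k
            wl, wr = TL[y, xl:xl + N], TR[y, xr:xr + N]
            PN = int(np.sum((wl == wr) & (wl != 0)))   # both 1 or both -1 -> "P"
            ZN = int(np.sum((wl == 0) & (wr == 0)))    # both 0 -> "Z"
            e_all += beta * PN + gamma * ZN            # Eq. 14
            e_edge += beta * PN                        # first term of Eq. 14
        return e_all, e_edge > TH3     # nullify the candidate when edges barely agree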
Numerous primary candidate disparities (disp1) and secondary candidate disparities (disp2) are obtained as a result of the scan based on a one-dimensional window successively shifted in strokes of 8 (M/2) pixels in an overlapping manner along the odd-number lines of the left image. The primary candidate disparities (disp1) and secondary candidate disparities (disp2) thus obtained are stored in the predetermined regions of a storage memory shown in FIG. 18. Although FIG. 18 shows the memory regions in one-to-one relationship to the image data, it is noted that vacant regions in the storage memory can be eliminated.

D: Disparity Determination Phase
In the disparity determination phase, the disparity of each of the blocks (totaling 1440) is finally determined based on the primary candidate disparities (disp1) and the secondary candidate disparities (disp2) determined for each of the one-dimensional windows.
The method of determining the disparity of each block will be explained with reference to FIG. 19, which shows how the disparity of block BL(X,Y) is determined. To determine the disparity of block BL(X,Y), a histogram is created based on a total of 24 sets of primary candidate disparities (disp1) and secondary candidate disparities (disp2) existing in the region encircled by the dotted line in FIG. 19; all of these selected primary and secondary candidate disparities are obtained through the matching operations of the specific one-dimensional windows each comprising at least 8 pixels existing in the region of block BL(X,Y). FIG. 20 is a graph showing an example of the histogram of disparities created based on the primary and secondary candidate disparities.
Then, a disparity having the largest number of occurrences is finally determined as the disparity of block BL(X,Y).
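For illustration, a minimal sketch of this histogram-based determination; the input format (one (disp1, disp2) pair per window) is an assumption made for the example:

    from collections import Counter

    def block_disparity(candidate_pairs):
        """candidate_pairs: the 24 (disp1, disp2) tuples for one block."""
        histogram = Counter()
        for disp1, disp2 in candidate_pairs:
            histogram[disp1] += 1
            histogram[disp2] += 1
        return histogram.most_common(1)[0][0]   # disparity with most occurrences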
Returning to the second example of the prior art methods: its characteristic was that, after the image was dissected into a plurality of blocks, the similarity evaluation for the matching was performed independently in each block, using only the pixels existing in the concerned block. Hence, there was the possibility of a mismatch due to the accidental presence of similar but different regions, and such a mismatch was a direct cause of failure in the detection of the disparity of a block.
However, according to the disparity detecting method of the present invention, these problems are solved. That is, the present invention is characterized in that a histogram is created in each block using the matching data resulting from the setting of a plurality of successively overlapping one-dimensional windows, and the disparity of the concerned block BL(X,Y) is then determined by detecting the peak position of the histogram. Hence, even if an erroneous matching arises in the matching operation performed for one of the one-dimensional windows (i.e. even if an erroneous candidate disparity is accidentally detected), the present invention is sufficiently flexible to absorb or correct such an error.
Furthermore, as a superior effect of using overlapping one-dimensional windows, it becomes possible to supplementarily use pixels existing outside the concerned block in the determination of its disparity. This surely prevents failure in the detection of the disparity even if similar but different regions are accidentally encountered.
In general, in this kind of disparity detecting method, the image is obtained as digital data sampled at a predetermined frequency; hence, the minimum measurable unit of disparity is one pixel. If higher accuracy in the disparity measurement is required, the following sub-pixel level measurement is available.
The method of sub-pixel level measurement will be explained with reference to FIG. 21. FIG. 21 shows a histogram created in a certain block in accordance with the previously-described method, particularly the distribution of the number of occurrences in the vicinity of the disparity corresponding to the peak position. The sub-pixel level disparity measurement uses the numbers of occurrences hi, hi-1, hi+1 corresponding to the disparities di, di-1, di+1 (in increments of one pixel) around the peak position.
More specifically, a first straight line 1501 is obtained as the line crossing the two points (di-1, hi-1) and (di, hi). A second straight line 1502 is obtained as the line crossing the point (di+1, hi+1) and having a gradient symmetrical with that of line 1501 (i.e. identical in absolute value but opposite in sign). A point 1503 is then obtained as the intersection of the two straight lines 1501 and 1502. The disparity ds corresponding to this intersection 1503 is finally obtained as the sub-pixel level disparity of the concerned block.
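For illustration, the subpixel_peak sketch given earlier applies unchanged, with the occurrence counts taking the place of the similarity evaluations (all values below are illustrative):

    h_prev, h_peak, h_next = 9, 20, 14       # illustrative occurrence counts
    ds = subpixel_peak(11, h_prev, 12, h_peak, 13, h_next)   # sub-pixel disparity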
The sub-pixel level disparity measurement described above uses a histogram of occurrence counts; accordingly, this method is essentially different from the prior art method, which uses the similarity evaluations C derived from equation 6.
Second Embodiment
A second embodiment will be explained based on a stereo image measurement using the method of matching stereo images and detecting disparity between the images in accordance with the present invention.
FIG. 22 is a flow diagram showing sequential processes for executing the second embodiment of the present invention, covering the stereo image pickup phase through the disparity determination phase. In the image pickup phase (A), two, right and left, images are taken in through two, right and left, image-pickup devices in steps S1601 and S1602. The processing performed in the image-pickup phase (A) is identical with that of the first embodiment. Then, the right and left images, obtained in the image-pickup phase (A), are respectively subjected to feature extraction in the next feature extraction phase (B') in steps S1603 and S1604. Thereafter, in the succeeding matching phase (C), the extracted features of the right and left images are compared to check how they match with each other in step S1605. Furthermore, in a disparity determination phase (D), a disparity is determined in each block (Step S1606). The processing performed in the matching phase (C) and the disparity determination phase (D) is identical with that of the first embodiment.
Hereinafter, only the portion different from the first embodiment, i.e. the processing of feature extraction phase (B'), will be explained in greater detail.
B': Feature Extraction Phase
The two images, right image IR and left image IL, obtained in the image pickup phase (A), are developed into a plurality of frequency component images in the feature extraction phase (B').
IL: L1, L2, L3, - - - , Lk, Lk+1, - - - , Ln
IR: R1, R2, R3, - - - , Rk, Rk+1, - - - , Rn
Each frequency component image is subjected to the secondary differential processing. Thereafter, each image is converted pixel by pixel into ternary values, thus obtaining the following ternary-valued frequency component images.
TL1, TL2, TL3, - - - , TLk, TLk+1, - - - , TLn
TR1, TR2, TR3, - - - , TRk, TRk+1, - - - , TRn
The flow of processing and its purposes are identical with those of the feature extraction phase (B) of the first embodiment.
Next, the essential portion different from the first embodiment, i.e. the ternary-value processing, will be explained.
FIG. 25 is a view illustrating the method of transforming or quantizing the frequency component images into ternary values used in the second embodiment of the present invention. As shown in FIG. 25, all the frequency component images are classified into three values by judging whether each pixel corresponds to a zero-crossing point and, if it does, whether the sign of its gradient at that point is positive or negative. For example, ternary values are given to respective pixels as follows.
Other than zero-crossing point - - - 0
Zero-crossing point, and Positive gradient - - - 1
Zero-crossing point, and Negative gradient - - - -1
The above-described ternary-value processing quantizes the images into 1 or -1 at their edges, especially at the inflection points (=zero-crossing points); otherwise the images are expressed by 0. This ternary-value processing (G) is comparable with, or superior to, the ternary-value processing (F) of the first embodiment in the accurate detection of edge positions and in robustness against sensitivity differences between the right and left images, although it is somewhat more sensitive to noise.
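A minimal sketch of this zero-crossing ternarization (G); since the patent specifies only the classification rule, the horizontal one-pixel neighborhood used here to detect sign changes is an assumption:

    import numpy as np

    def ternarize_zero_crossing(F):
        """Mark zero-crossings of a filtered image with the sign of their gradient."""
        T = np.zeros(F.shape, dtype=np.int8)
        flip = F[:, :-1] * F[:, 1:] < 0            # sign changes between x and x+1
        grad = F[:, 1:] - F[:, :-1]
        T[:, :-1][flip & (grad > 0)] = 1           # zero-crossing, positive gradient
        T[:, :-1][flip & (grad < 0)] = -1          # zero-crossing, negative gradient
        return T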
FIG. 23 is a block diagram showing the arrangement of a third apparatus which realizes the processing of feature extraction phase (B') of FIG. 22. Left image IL, received in the feature extraction phase (B'), is the image obtained in the image-pickup phase (A) which is band limited to fc (Hz). The input image IL is developed into a plurality of band signals having different frequency components (i.e. frequency component images FLk, k=1,2,3, - - - ,n) by plural low-pass filters (LPFk, k=1,2,3, - - - ) and high-pass filters (HPFk, k=1,2,3, - - - ,n) combined as shown in the drawing. This processing is identical with that of the first embodiment. The developed frequency component images FLk are converted or quantized into ternary-valued data (i.e. ternary-valued frequency component images TLk, k=1,2,3, - - - ,n) through the above-described ternary-value processing (G).
Each of these plural ternary-valued frequency component images TLk, thus obtained, reveals the edge positions involved in the corresponding frequency component image. The edge positions are used for the matching of the right and left images in the succeeding matching phase (C). Regarding the settings, it is noted that the number of frequency component images FLk and the width of each frequency band should be determined by taking the required performance and the allowable cost range into consideration, in the same manner as in the first embodiment.
FIG. 24 is a block diagram showing the arrangement of a fourth apparatus which realizes the processing of feature extraction phase (B') of FIG. 22. This fourth apparatus is identical with the second apparatus of the first embodiment shown in FIG. 11 except for the ternary-value processing (G).
In this manner, the image is developed into a plurality of frequency component images FLk which are then converted into ternary-valued frequency component images TLk through ternary-value processing. Subsequently, ternary-valued frequency component images TLk are sent to the succeeding matching phase (C) to perform the stereo image matching operation based on one-dimensional windows. And, a disparity of each block is finally determined in the disparity determination phase (D).
Third Embodiment
A third embodiment will be explained based on a stereo image measurement using the method of matching stereo images and detecting disparity between the images in accordance with the present invention.
FIG. 26 is a flow diagram showing sequential processes for executing the third embodiment of the present invention, covering the stereo image pickup phase through the disparity determination phase. In the image pickup phase (A), two, right and left, images are taken in through two, right and left, image-pickup devices in steps S2001 and S2002. The processing performed in the image-pickup phase (A) is identical with that of the first and second embodiments. Then, the right and left images, obtained in the image-pickup phase (A), are respectively subjected to feature extraction in the next feature extraction phase (B") in steps S2003 and S2004. Thereafter, in the succeeding matching phase (C), the extracted features of the right and left images are compared to check how they match with each other in step S2005. Furthermore, in a disparity determination phase (D), a disparity is determined in each block (Step S2006). The processing performed in the matching phase (C) and the disparity determination phase (D) is identical with that of the first and second embodiments.
Hereinafter, only the portion different from the first and second embodiments, i.e. the processing of feature extraction phase (B"), will be explained in greater detail.
B": Feature Extraction Phase
The two images, right image IR and left image IL, obtained in the image pickup phase (A), are developed into a plurality of frequency component images in the feature extraction phase (B").
IL: L1, L2, L3, - - - , Lk, Lk+1, - - - , Ln
IR: R1, R2, R3, - - - , Rk, Rk+1, - - - , Rn
Each frequency component image is subjected to the secondary differential processing. Thereafter, each image is converted pixel by pixel into ternary values, thus obtaining the following ternary-valued frequency component images.
TL1, TL2, TL3, - - - , TLk, TLk+1, - - - , TLn
TR1, TR2, TR3, - - - , TRk, TRk+1, - - - , TRn
The flow of processing and its purposes are identical with those of the feature extraction phases (B), (B') of the first and second embodiments.
Next, the essential portion different from the first and second embodiments, i.e. the ternary-value processing, will be explained. The ternary-value processing of the third embodiment is characterized in that the low-frequency component images are processed through the previously-described ternary-value processing (F) of the first embodiment, while the high-frequency component images are processed through the above-described ternary-value processing (G) of the second embodiment.
The high-frequency component images carry more accurate edge-position information than the low-frequency component images. To utilize this accurate information effectively, the zero-crossing classification (G) is used for converting the high-frequency component images into ternary values, although the edge information obtained through the ternary-value processing (G) tends to include erroneous edges caused by noise. The low-frequency component images, on the contrary, are converted into ternary values by the threshold classification (F), since they represent the edge positions less accurately; the edge information obtained through the ternary-value processing (F) seldom includes erroneous edges derived from noise.
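For illustration, a minimal sketch combining the two earlier ternarization sketches as the third embodiment prescribes; the band split index n_low and the thresholds are illustrative parameters:

    def ternarize_hybrid(bands, n_low, TH1, TH2):
        """bands: component images ordered from low to high frequency."""
        return [ternarize_threshold(F, TH1, TH2) if k < n_low      # robust to noise
                else ternarize_zero_crossing(F)                    # accurate positions
                for k, F in enumerate(bands)]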
FIG. 27 is a block diagram showing the arrangement of a fifth apparatus which realizes the processing of feature extraction phase (B") of FIG. 26. Left image IL, received in the feature extraction phase (B"), is the image obtained in the image-pickup phase (A) which is band limited to fc (Hz). The input image IL is developed into a plurality of band signals having different frequency components (i.e. frequency component images FLk, k=1,2,3, - - - ,n) by plural low-pass filters (LPFk, k=1,2,3, - - - ) and high-pass filters (HPFk, k=1,2,3, - - - ,n) combined as shown in the drawing. This processing is identical with those of the first and second embodiments. The low-frequency component images of the developed frequency component images FLk are converted or quantized into ternary-valued data through the ternary-value processing (F) explained in the first embodiment. On the other hand, the high-frequency component images of the developed frequency component images FLk are converted or quantized into ternary-valued data through the ternary-value processing (G) explained in the second embodiment. Thus, ternary-valued frequency component images TLk (k=1,2,3 - - - ,n) are obtained.
Each of these plural ternary-valued frequency component images TLk, thus obtained, reveals the edge positions contained in the corresponding frequency component image. The edge positions are used for the matching of the right and left images in the succeeding matching phase (C). Regarding the settings, the number of frequency component images FLk (or the width of each frequency band), as well as the selection between the ternary-value processing (F) and the ternary-value processing (G), should be determined by taking the required performance and the allowable cost range into consideration.
FIG. 28 is a block diagram showing the arrangement of a sixth apparatus which realizes the processing of the feature extraction phase (B") of FIG. 26. This sixth apparatus is identical with the second and fourth apparatuses of the first and second embodiments shown in FIGS. 11 and 24, except for the ternary-value processing portion.
In this manner, the image is developed into a plurality of frequency component images FLk, which are then converted into ternary-valued frequency component images TLk through the ternary-value processing. Subsequently, the ternary-valued frequency component images TLk are sent to the succeeding matching phase (C) to perform the stereo image matching operation based on one-dimensional windows, and a disparity of each block is finally determined in the disparity determination phase (D).
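For orientation only, the matching phase (C) and the disparity determination phase (D) fed by these images can be sketched as follows (a simplified Python sketch for a single pair of ternary-valued images tl and tr, with unit weighting factors βk=γk=1 in the similarity Eall; unlike the claims, only the best candidate disparity votes into the histogram here, and every name is an assumption of the sketch):

```python
import numpy as np

def window_similarity(wl, wr):
    """Eall for one window pair with unit weights: count the pixels
    whose ternary values agree, separating agreements on +/-1 (PN)
    from agreements on 0 (ZN)."""
    agree = wl == wr
    pn = np.count_nonzero(agree & (wl != 0))
    zn = np.count_nonzero(agree & (wl == 0))
    return pn + zn                       # Eall = PN + ZN in this sketch

def block_disparity(tl, tr, y0, x0, m=16, l=16, n=16, dmax=32):
    """For every 1 x n window inside the m x l block at (x0, y0) of the
    left ternary image tl, find the best-matching position on the same
    scan line of the right ternary image tr, and vote that disparity
    into a histogram; the histogram peak is taken as the block disparity."""
    hist = np.zeros(dmax + 1, dtype=int)
    for y in range(y0, y0 + l):
        for x in range(x0, x0 + m, n // 2):      # windows overlap by n/2
            wl = tl[y, x:x + n]
            scores = [window_similarity(wl, tr[y, x - d:x - d + n])
                      for d in range(min(dmax, x) + 1)]
            hist[int(np.argmax(scores))] += 1    # best candidate votes
    return int(np.argmax(hist))
```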
Miscellaneous
As is apparent from the foregoing, the method of the present invention for matching stereo images and detecting a disparity between the images has been explained based on the stereo image measurement system embodied in the first, second and third embodiments described above. Although the embodiments of the present invention use stereo cameras disposed in parallel with each other in the right-and-left direction, it is needless to say that the arrangement of the stereo cameras is not limited to the disclosed one.
Furthermore, although the embodiments of the present invention use only the odd-numbered lines for the scanning operation, the same effect will be obtained by using only the even-numbered lines as the objective scanning lines. If all the lines are used for the scanning operation, the reliability of the disparity measurement will be enhanced, although the processing volume is doubled.
Moreover, the embodiments of the present invention adopt a window size of 1×16 (M=16) pixels extending in the horizontal direction and a block size of 16×16 (M=L=16) pixels. Needless to say, practical values for M and L can be varied flexibly.
As explained in the foregoing description, the present invention provides a novel and excellent method of matching stereo images and of detecting a disparity between these images which is small in computation amount, compact and inexpensive in hardware arrangement, fast in processing, and reliable and accurate in the performance of the stereo image matching and the disparity detection.
Accordingly, the present invention can be applied, for example, to various industrial monitoring systems, such as an obstacle monitor at a railroad crossing or an invader monitor in a building, by utilizing its capability of always measuring a disparity based on successively sampled stereo images and detecting the change of the disparity.
As this invention may be embodied in several forms without departing from the spirit of essential characteristics thereof, the present embodiments as described are therefore intended to be only illustrative and not restrictive, since the scope of the invention is defined by the appended claims rather than by the description preceding them, and all changes that fall within metes and bounds of the claims, or equivalents of such metes and bounds, are therefore intended to be embraced by the claims.

Claims (4)

What is claimed is:
1. A method of detecting a disparity between stereo images, comprising the steps of:
dividing each of first and second images IL and IR into a plurality of blocks each having a size of M×L pixels;
matching ternary-valued frequency component images of said images IL and IR;
comparing pixels in a micro region defined by a one-dimensional window set on said first image IL with pixels in a designated micro region on said second image IR;
evaluating a similarity between two micro regions using the following equation:
Eall=Σβk(PN)k+Σγk(ZN)k
where PN represents a total number of pixels having an evaluation result "P" while ZN represents a total number of pixels having an evaluation result "Z", and βk and γk represent weighting factors;
searching a first region having the highest similarity and a second region having the second highest similarity in a concerned block;
specifying a first candidate disparity as a disparity corresponding to said first region, and a second candidate disparity as a disparity corresponding to said second region;
creating a histogram based on said first and second candidate disparities; and
determining a valid disparity of said concerned block as a disparity corresponding to a peak position of said histogram.
2. The method defined by claim 1, wherein said first image IL is designated as a reference image, a one-dimensional window capable of encompassing N pixels therein is set on the ternary-valued frequency component image of said first image IL, and a matching region having the same ternary-value pattern as said N pixels in said one-dimensional window is searched from the ternary-valued frequency component image of said second image IR.
3. The method defined by claim 1, wherein one of said first and second images IL and IR is designated as a reference image, a plurality of one-dimensional windows are set on the entire surface of said ternary-valued frequency component image of said reference image through a scanning operation along an epipolar line, so that said one-dimensional windows are successively overlapped at the same intervals of N/2 when each of said one-dimensional windows has a size equivalent to N pixels, and a matching operation is carried out with respect to each of said one-dimensional windows.
4. The method defined by claim 1, wherein said valid disparity is calculated as a sub-pixel level disparity corresponding to an intersecting point of a first straight line crossing two points (di-1, hi-1), (di, hi) and a second straight line crossing a point (di+1, hi+1) with a gradient symmetrical with that of said first straight line, where di-1, di, di+1 represent disparities near the peak position of said histogram and hi-1, hi, hi+1 represent the numbers of occurrences of said disparities di-1, di, di+1, respectively.
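The sub-pixel refinement of claim 4 amounts to intersecting two straight lines of opposite gradient around the histogram peak. A minimal sketch follows (a non-limiting illustration; the argument names are assumptions), where d holds the disparities and h the numbers of occurrences around the peak index i:

```python
def subpixel_disparity(d, h, i):
    """Refine the histogram peak at index i to sub-pixel precision.

    A first line passes through (d[i-1], h[i-1]) and (d[i], h[i]); a
    second line passes through (d[i+1], h[i+1]) with the symmetrical
    (opposite) gradient.  Their intersection is the valid disparity.

    Example: d = [4, 5, 6], h = [2, 8, 5], i = 1 yields 5.25, i.e. the
    peak is pulled toward the taller neighbor."""
    m = (h[i] - h[i - 1]) / (d[i] - d[i - 1])    # gradient of first line
    # Solve h[i-1] + m*(x - d[i-1]) = h[i+1] - m*(x - d[i+1]) for x:
    return (d[i - 1] + d[i + 1]) / 2 + (h[i + 1] - h[i - 1]) / (2 * m)
```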
US08/629,708 1995-04-21 1996-04-09 Method of matching stereo images and method of measuring disparity between these image Expired - Fee Related US5867591A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US09/196,329 US6125198A (en) 1995-04-21 1998-11-19 Method of matching stereo images and method of measuring disparity between these items

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP09720495A JP3539788B2 (en) 1995-04-21 1995-04-21 Image matching method
JP7-097204 1995-04-21

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US09/196,329 Division US6125198A (en) 1995-04-21 1998-11-19 Method of matching stereo images and method of measuring disparity between these items

Publications (1)

Publication Number Publication Date
US5867591A true US5867591A (en) 1999-02-02

Family

ID=14186093

Family Applications (2)

Application Number Title Priority Date Filing Date
US08/629,708 Expired - Fee Related US5867591A (en) 1995-04-21 1996-04-09 Method of matching stereo images and method of measuring disparity between these image
US09/196,329 Expired - Lifetime US6125198A (en) 1995-04-21 1998-11-19 Method of matching stereo images and method of measuring disparity between these items

Family Applications After (1)

Application Number Title Priority Date Filing Date
US09/196,329 Expired - Lifetime US6125198A (en) 1995-04-21 1998-11-19 Method of matching stereo images and method of measuring disparity between these items

Country Status (5)

Country Link
US (2) US5867591A (en)
EP (1) EP0738872B1 (en)
JP (1) JP3539788B2 (en)
CA (1) CA2174590C (en)
DE (1) DE69624614T2 (en)

Families Citing this family (62)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE19625727C2 (en) * 1996-06-27 2000-11-09 Bernd Porr Device for determining the disparity in a stereo image pair
BE1010929A3 (en) * 1997-02-17 1999-03-02 Krypton Electronic Eng Nv Measuring system.
GB9720864D0 (en) * 1997-10-01 1997-12-03 Univ Nottingham Trent Line-scan imaging in 3-d
KR100307883B1 (en) * 1998-04-13 2001-10-19 박호군 Method for measuring similarity by using a matching pixel count and apparatus for implementing the same
US6269175B1 (en) * 1998-08-28 2001-07-31 Sarnoff Corporation Method and apparatus for enhancing regions of aligned images using flow estimation
EP1418766A3 (en) * 1998-08-28 2010-03-24 Imax Corporation Method and apparatus for processing images
DE29902457U1 (en) * 1999-02-12 2000-07-20 Bosch Gmbh Robert Environment recognition device, in particular for traffic sign recognition
US6721446B1 (en) * 1999-04-26 2004-04-13 Adobe Systems Incorporated Identifying intrinsic pixel colors in a region of uncertain pixels
JP3349121B2 (en) 1999-09-22 2002-11-20 富士重工業株式会社 Stereo camera mounting structure
JP3255360B2 (en) * 1999-09-22 2002-02-12 富士重工業株式会社 Inspection method of distance data and its inspection device
US6671399B1 (en) * 1999-10-27 2003-12-30 Canon Kabushiki Kaisha Fast epipolar line adjustment of stereo pairs
JP4925498B2 (en) * 2000-07-12 2012-04-25 富士重工業株式会社 Outside vehicle monitoring device with fail-safe function
JP4953498B2 (en) 2000-07-12 2012-06-13 富士重工業株式会社 Outside vehicle monitoring device with fail-safe function
US7227526B2 (en) * 2000-07-24 2007-06-05 Gesturetek, Inc. Video-based image control system
GB2372659A (en) * 2001-02-23 2002-08-28 Sharp Kk A method of rectifying a stereoscopic image
US8035612B2 (en) 2002-05-28 2011-10-11 Intellectual Ventures Holding 67 Llc Self-contained interactive video display system
US7259747B2 (en) * 2001-06-05 2007-08-21 Reactrix Systems, Inc. Interactive video display system
US8300042B2 (en) 2001-06-05 2012-10-30 Microsoft Corporation Interactive video display system using strobed light
US6690294B1 (en) 2001-07-10 2004-02-10 William E. Zierden System and method for detecting and identifying traffic law violators and issuing citations
KR100424287B1 (en) * 2001-09-10 2004-03-24 주식회사 제이앤에이치테크놀러지 Non-parallel optical axis real-time three-demensional image processing system and method
JP3758034B2 (en) * 2001-10-30 2006-03-22 株式会社デンソー Vehicle occupant protection device
US7190836B2 (en) * 2002-03-18 2007-03-13 Siemens Corporate Research, Inc. Efficient ordering of data for compression and visualization
US7710391B2 (en) 2002-05-28 2010-05-04 Matthew Bell Processing an image utilizing a spatially varying pattern
KR20040000144A (en) * 2002-06-24 2004-01-03 (학)창성학원 Image Extracting Method By Using Stereo Matching
KR20040002041A (en) * 2002-06-29 2004-01-07 칩스브레인(주) An optimized real time stereo disparity decision system using vertical strip structure
AU2003264580B2 (en) * 2002-11-29 2006-08-10 Canon Kabushiki Kaisha Range Estimation Using Multi-dimensional Segmentation
AU2003301043A1 (en) 2002-12-13 2004-07-09 Reactrix Systems Interactive directed light/sound system
DE10310849B4 (en) * 2003-03-11 2009-01-02 Inb Vision Ag Method for photogrammetric distance and / or position determination
US20040217940A1 (en) * 2003-04-29 2004-11-04 Chi-Pao Huang Method of Displaying Items in an On Screen Display
CN102034197A (en) 2003-10-24 2011-04-27 瑞克楚斯系统公司 Method and system for managing an interactive video display system
WO2005041579A2 (en) 2003-10-24 2005-05-06 Reactrix Systems, Inc. Method and system for processing captured image information in an interactive video display system
JP2005227897A (en) * 2004-02-10 2005-08-25 Fuji Photo Film Co Ltd Method, device, and program for image display
US9128519B1 (en) 2005-04-15 2015-09-08 Intellectual Ventures Holding 67 Llc Method and system for state-based control of objects
US8081822B1 (en) 2005-05-31 2011-12-20 Intellectual Ventures Holding 67 Llc System and method for sensing a feature of an object in an interactive video display
JP4282643B2 (en) * 2005-08-30 2009-06-24 ダイハツ工業株式会社 Strain evaluation apparatus and strain evaluation method
US8098277B1 (en) 2005-12-02 2012-01-17 Intellectual Ventures Holding 67 Llc Systems and methods for communication between a reactive video system and a mobile communication device
JP4821548B2 (en) * 2006-10-02 2011-11-24 コニカミノルタホールディングス株式会社 Image processing apparatus, image processing apparatus control method, and image processing apparatus control program
KR101311896B1 (en) * 2006-11-14 2013-10-14 삼성전자주식회사 Method for shifting disparity of three dimentions and the three dimentions image apparatus thereof
JP5430572B2 (en) 2007-09-14 2014-03-05 インテレクチュアル ベンチャーズ ホールディング 67 エルエルシー Gesture-based user interaction processing
US8159682B2 (en) 2007-11-12 2012-04-17 Intellectual Ventures Holding 67 Llc Lens system
US8259163B2 (en) 2008-03-07 2012-09-04 Intellectual Ventures Holding 67 Llc Display with built in 3D sensing
US8595218B2 (en) 2008-06-12 2013-11-26 Intellectual Ventures Holding 67 Llc Interactive display management systems and methods
JP2011030182A (en) * 2009-06-29 2011-02-10 Sony Corp Three-dimensional image data transmission device, three-dimensional image data transmission method, three-dimensional image data reception device, and three-dimensional image data reception method
US20110050857A1 (en) * 2009-09-03 2011-03-03 Electronics And Telecommunications Research Institute Apparatus and method for displaying 3d image in 3d image system
WO2011070674A1 (en) * 2009-12-11 2011-06-16 株式会社 東芝 Apparatus and method for analyzing depth frequencies of stereoscopic image
US8213708B2 (en) * 2010-03-22 2012-07-03 Eastman Kodak Company Adjusting perspective for objects in stereoscopic images
JP5477128B2 (en) * 2010-04-07 2014-04-23 ソニー株式会社 Signal processing apparatus, signal processing method, display apparatus, and program
WO2011159757A2 (en) * 2010-06-15 2011-12-22 Sensics Inc. Systems and methods for personal viewing devices
GB2483434A (en) * 2010-08-31 2012-03-14 Sony Corp Detecting stereoscopic disparity by comparison with subset of pixel change points
WO2013017306A1 (en) * 2011-08-02 2013-02-07 Qatar Foundation Copy detection
GB2493514B (en) * 2011-08-02 2015-04-08 Qatar Foundation Copy detection
US9402065B2 (en) 2011-09-29 2016-07-26 Qualcomm Incorporated Methods and apparatus for conditional display of a stereoscopic image pair
TWI507032B (en) * 2012-06-21 2015-11-01 Top Victory Invest Ltd And a display method and a device for preventing the display of the abnormality of the screen display
US8922662B1 (en) * 2012-07-25 2014-12-30 Amazon Technologies, Inc. Dynamic image selection
CN104574342B (en) * 2013-10-14 2017-06-23 株式会社理光 The noise recognizing method and Noise Identification device of parallax depth image
KR102130123B1 (en) * 2013-10-31 2020-07-03 삼성전자주식회사 Multi view image display apparatus and control method thereof
CN103852442B (en) * 2014-03-14 2016-06-01 华中科技大学 A kind of method that diffuse reflectance infrared spectroscopy extracts and sample is identified
CN104200561B (en) * 2014-06-16 2017-06-20 华中科技大学 A kind of method that RMB is stained with writing based on textural characteristics identification
WO2016092532A1 (en) * 2014-12-09 2016-06-16 Inuitive Ltd. A method for determining depth for stereoscopic reconstruction of three dimensional images
JP6512938B2 (en) * 2015-05-25 2019-05-15 キヤノン株式会社 Imaging apparatus and image processing method
JP6722084B2 (en) * 2016-10-06 2020-07-15 株式会社Soken Object detection device
WO2019041035A1 (en) 2017-08-30 2019-03-07 Innovations Mindtrick Inc. Viewer-adjusted stereoscopic image display

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS60217472A (en) * 1984-04-13 1985-10-31 Hitachi Ltd Edge extracting method in picture processing
EP0373614A3 (en) * 1988-12-16 1992-08-12 Schlumberger Technologies Inc Method for direct volume measurement of three dimensional features in binocular stereo images
JP3242529B2 (en) * 1994-06-07 2001-12-25 松下通信工業株式会社 Stereo image matching method and stereo image parallax measurement method

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4745562A (en) * 1985-08-16 1988-05-17 Schlumberger, Limited Signal processing disparity resolution
JPH02100589A (en) * 1988-10-07 1990-04-12 Nippon Telegr & Teleph Corp <Ntt> Binocular parallax detecting method
US5577130A (en) * 1991-08-05 1996-11-19 Philips Electronics North America Method and apparatus for determining the distance between an image and an object
US5309522A (en) * 1992-06-30 1994-05-03 Environmental Research Institute Of Michigan Stereoscopic determination of terrain elevation
US5383013A (en) * 1992-09-18 1995-01-17 Nec Research Institute, Inc. Stereoscopic computer vision system
JPH06195445A (en) * 1992-12-24 1994-07-15 Canon Inc Method for extracting point corresponding to plural images
US5684890A (en) * 1994-02-28 1997-11-04 Nec Corporation Three-dimensional reference image segmenting method and apparatus
US5530774A (en) * 1994-03-25 1996-06-25 Eastman Kodak Company Generation of depth image through interpolation and extrapolation of intermediate images derived from stereo image pair using disparity vector fields
US5606627A (en) * 1995-01-24 1997-02-25 Eotek Inc. Automated analytic stereo comparator
US5612735A (en) * 1995-05-26 1997-03-18 Lucent Technologies Inc. Digital 3D/stereoscopic video compression technique utilizing two disparity estimates
US5652616A (en) * 1996-08-06 1997-07-29 General Instrument Corporation Of Delaware Optimal disparity estimation for stereoscopic video coding

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
"Drive Assist System by means of 3-D Image Recognition Technique" by Saneyoshi et al, pp. 169-172, Automotive Vehicle Technical Institute Scientific Lecture Meeting, Oct., 1992.
"Image Processing Handbook" by M. Onoue, Shokodo Publishing Co., Ltd., pp. 396-397, 1987.

Cited By (489)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6396397B1 (en) 1993-02-26 2002-05-28 Donnelly Corporation Vehicle imaging system with stereo imaging
US8917169B2 (en) 1993-02-26 2014-12-23 Magna Electronics Inc. Vehicular vision system
US20070109653A1 (en) * 1993-02-26 2007-05-17 Kenneth Schofield Image sensing system for a vehicle
US20070109406A1 (en) * 1993-02-26 2007-05-17 Donnelly Corporation, A Corporation Of The State Of Michigan Image sensing system for a vehicle
US20050083184A1 (en) * 1993-02-26 2005-04-21 Donnelly Corporation Vehicle imaging system with stereo imaging
US20060028731A1 (en) * 1993-02-26 2006-02-09 Kenneth Schofield Vehicular vision system
US20070109651A1 (en) * 1993-02-26 2007-05-17 Donnelly Corporation Image sensing system for a vehicle
US20080054161A1 (en) * 1993-02-26 2008-03-06 Donnelly Corporation Image sensing system for a vehicle
US8599001B2 (en) 1993-02-26 2013-12-03 Magna Electronics Inc. Vehicular vision system
US8063759B2 (en) 1993-02-26 2011-11-22 Donnelly Corporation Vehicle vision system
US7859565B2 (en) 1993-02-26 2010-12-28 Donnelly Corporation Vision system for a vehicle including image processor
US20070176080A1 (en) * 1993-02-26 2007-08-02 Donnelly Corporation Image sensing system for a vehicle
US20070023613A1 (en) * 1993-02-26 2007-02-01 Donnelly Corporation Vehicle headlight control using imaging sensor
US7227459B2 (en) 1993-02-26 2007-06-05 Donnelly Corporation Vehicle imaging system
US20070109654A1 (en) * 1993-02-26 2007-05-17 Donnelly Corporation, A Corporation Of The State Of Michigan Image sensing system for a vehicle
US6111596A (en) * 1995-12-29 2000-08-29 Lucent Technologies Inc. Gain and offset correction for efficient stereoscopic coding and improved display
US7655894B2 (en) 1996-03-25 2010-02-02 Donnelly Corporation Vehicular image sensing system
US8222588B2 (en) 1996-03-25 2012-07-17 Donnelly Corporation Vehicular image sensing system
US7994462B2 (en) 1996-03-25 2011-08-09 Donnelly Corporation Vehicular image sensing system
US8993951B2 (en) 1996-03-25 2015-03-31 Magna Electronics Inc. Driver assistance system for a vehicle
US8492698B2 (en) 1996-03-25 2013-07-23 Donnelly Corporation Driver assistance system for a vehicle
US8637801B2 (en) 1996-03-25 2014-01-28 Magna Electronics Inc. Driver assistance system for a vehicle
US8324552B2 (en) 1996-03-25 2012-12-04 Donnelly Corporation Vehicular image sensing system
US8481910B2 (en) 1996-03-25 2013-07-09 Donnelly Corporation Vehicular image sensing system
US6154566A (en) * 1996-05-15 2000-11-28 Omron Corporation Method and apparatus for determining image similarity and position
US8643724B2 (en) 1996-05-22 2014-02-04 Magna Electronics Inc. Multi-camera vision system for a vehicle
US9131120B2 (en) 1996-05-22 2015-09-08 Magna Electronics Inc. Multi-camera vision system for a vehicle
US8842176B2 (en) 1996-05-22 2014-09-23 Donnelly Corporation Automatic vehicle exterior light control
US6480620B1 (en) 1996-10-15 2002-11-12 Nec Corporation Method of and an apparatus for 3-dimensional structure estimation
US6049625A (en) * 1996-10-15 2000-04-11 Nec Corporation Method of and an apparatus for 3-dimensional structure estimation
US7016528B2 (en) * 1997-05-22 2006-03-21 Kabushiki Kaisha Topcon Measuring apparatus
US20020181764A1 (en) * 1997-05-22 2002-12-05 Kabushiki Kaisha Topcon Measuring apparatus
US6226396B1 (en) * 1997-07-31 2001-05-01 Nec Corporation Object extraction method and system
US6175648B1 (en) * 1997-08-12 2001-01-16 Matra Systems Et Information Process for producing cartographic data by stereo vision
US6822563B2 (en) 1997-09-22 2004-11-23 Donnelly Corporation Vehicle imaging system with accessory control
US6275253B1 (en) * 1998-07-09 2001-08-14 Canon Kabushiki Kaisha Stereographic image compression with image moment normalization
US8203443B2 (en) 1999-08-12 2012-06-19 Donnelly Corporation Vehicle vision system
US8629768B2 (en) 1999-08-12 2014-01-14 Donnelly Corporation Vehicle vision system
US9436880B2 (en) 1999-08-12 2016-09-06 Magna Electronics Inc. Vehicle vision system
US6658150B2 (en) * 1999-12-02 2003-12-02 Honda Giken Kogyo Kabushiki Kaisha Image recognition system
US20010002936A1 (en) * 1999-12-02 2001-06-07 Takayuki Tsuji Image recognition system
US6865289B1 (en) * 2000-02-07 2005-03-08 Canon Kabushiki Kaisha Detection and removal of image occlusion errors
US9783125B2 (en) 2000-03-31 2017-10-10 Magna Electronics Inc. Accessory system for a vehicle
US8362885B2 (en) 2000-03-31 2013-01-29 Donnelly Corporation Vehicular rearview mirror system
US8686840B2 (en) 2000-03-31 2014-04-01 Magna Electronics Inc. Accessory system for a vehicle
US7272256B2 (en) * 2000-05-04 2007-09-18 Microsoft Corporation System and method for progressive stereo matching of digital images
US7164790B2 (en) * 2000-05-04 2007-01-16 Microsoft Corporation System and method for progressive stereo matching of digital images
US7106899B2 (en) * 2000-05-04 2006-09-12 Microsoft Corporation System and method for progressive stereo matching of digital images
US20050163367A1 (en) * 2000-05-04 2005-07-28 Microsoft Corporation System and method for progressive stereo matching of digital images
US20050163366A1 (en) * 2000-05-04 2005-07-28 Microsoft Corporation System and method for progressive stereo matching of digital images
US20050123190A1 (en) * 2000-05-04 2005-06-09 Microsoft Corporation System and method for progressive stereo matching of digital images
US6909802B2 (en) * 2000-05-17 2005-06-21 Minolta Co., Ltd. Image-correspondence position detection device, distance measuring device and apparatus using the same
US20010055418A1 (en) * 2000-05-17 2001-12-27 Kenji Nakamura Image-correspondence position detection device, distance measuring device and apparatus using the same
US6700934B2 (en) 2001-03-14 2004-03-02 Redrock Semiconductor, Ltd. Error detection using a maximum distance among four block-motion-vectors in a macroblock in a corrupted MPEG-4 bitstream
US9376060B2 (en) 2001-07-31 2016-06-28 Magna Electronics Inc. Driver assist system for vehicle
US9191574B2 (en) 2001-07-31 2015-11-17 Magna Electronics Inc. Vehicular vision system
US9463744B2 (en) 2001-07-31 2016-10-11 Magna Electronics Inc. Driver assistance system for a vehicle
US10046702B2 (en) 2001-07-31 2018-08-14 Magna Electronics Inc. Control system for vehicle
US9834142B2 (en) 2001-07-31 2017-12-05 Magna Electronics Inc. Driving assist system for vehicle
US9245448B2 (en) 2001-07-31 2016-01-26 Magna Electronics Inc. Driver assistance system for a vehicle
US9656608B2 (en) 2001-07-31 2017-05-23 Magna Electronics Inc. Driver assist system for vehicle
US10406980B2 (en) 2001-07-31 2019-09-10 Magna Electronics Inc. Vehicular lane change system
US10611306B2 (en) 2001-07-31 2020-04-07 Magna Electronics Inc. Video processor module for vehicle
US10099610B2 (en) 2001-07-31 2018-10-16 Magna Electronics Inc. Driver assistance system for a vehicle
US10351135B2 (en) 2002-05-03 2019-07-16 Magna Electronics Inc. Vehicular control system using cameras and radar sensor
US9171217B2 (en) 2002-05-03 2015-10-27 Magna Electronics Inc. Vision system for vehicle
US9555803B2 (en) 2002-05-03 2017-01-31 Magna Electronics Inc. Driver assistance system for vehicle
US8665079B2 (en) 2002-05-03 2014-03-04 Magna Electronics Inc. Vision system for vehicle
US11203340B2 (en) 2002-05-03 2021-12-21 Magna Electronics Inc. Vehicular vision system using side-viewing camera
US10683008B2 (en) 2002-05-03 2020-06-16 Magna Electronics Inc. Vehicular driving assist system using forward-viewing camera
US9834216B2 (en) 2002-05-03 2017-12-05 Magna Electronics Inc. Vehicular control system using cameras and radar sensor
US10118618B2 (en) 2002-05-03 2018-11-06 Magna Electronics Inc. Vehicular control system using cameras and radar sensor
US9643605B2 (en) 2002-05-03 2017-05-09 Magna Electronics Inc. Vision system for vehicle
US7545974B2 (en) * 2003-01-30 2009-06-09 Postech Foundation Multi-layered real-time stereo matching method and system
US20040151380A1 (en) * 2003-01-30 2004-08-05 Postech Foundation Multi-layered real-time stereo matching method and system
US8886401B2 (en) 2003-10-14 2014-11-11 Donnelly Corporation Driver assistance system for a vehicle
US20050129325A1 (en) * 2003-11-27 2005-06-16 Sony Corporation Image processing apparatus and method
US7957581B2 (en) * 2003-11-27 2011-06-07 Sony Corporation Image processing apparatus and method
US9948904B2 (en) 2004-04-15 2018-04-17 Magna Electronics Inc. Vision system for vehicle
US9191634B2 (en) 2004-04-15 2015-11-17 Magna Electronics Inc. Vision system for vehicle
US20110093179A1 (en) * 2004-04-15 2011-04-21 Donnelly Corporation Driver assistance system for vehicle
US11847836B2 (en) 2004-04-15 2023-12-19 Magna Electronics Inc. Vehicular control system with road curvature determination
US9428192B2 (en) 2004-04-15 2016-08-30 Magna Electronics Inc. Vision system for vehicle
US8593521B2 (en) 2004-04-15 2013-11-26 Magna Electronics Inc. Imaging system for vehicle
US7873187B2 (en) 2004-04-15 2011-01-18 Donnelly Corporation Driver assistance system for vehicle
US20100045797A1 (en) * 2004-04-15 2010-02-25 Donnelly Corporation Imaging system for vehicle
US20090208058A1 (en) * 2004-04-15 2009-08-20 Donnelly Corporation Imaging system for vehicle
US10462426B2 (en) 2004-04-15 2019-10-29 Magna Electronics Inc. Vehicular control system
US11503253B2 (en) 2004-04-15 2022-11-15 Magna Electronics Inc. Vehicular control system with traffic lane detection
US10015452B1 (en) 2004-04-15 2018-07-03 Magna Electronics Inc. Vehicular control system
US9736435B2 (en) 2004-04-15 2017-08-15 Magna Electronics Inc. Vision system for vehicle
US10187615B1 (en) 2004-04-15 2019-01-22 Magna Electronics Inc. Vehicular control system
US10306190B1 (en) 2004-04-15 2019-05-28 Magna Electronics Inc. Vehicular control system
US9008369B2 (en) 2004-04-15 2015-04-14 Magna Electronics Inc. Vision system for vehicle
US8090153B2 (en) 2004-04-15 2012-01-03 Donnelly Corporation Imaging system for vehicle
US10735695B2 (en) 2004-04-15 2020-08-04 Magna Electronics Inc. Vehicular control system with traffic lane detection
US9609289B2 (en) 2004-04-15 2017-03-28 Magna Electronics Inc. Vision system for vehicle
US8818042B2 (en) 2004-04-15 2014-08-26 Magna Electronics Inc. Driver assistance system for vehicle
US7792329B2 (en) 2004-04-15 2010-09-07 Donnelly Corporation Imaging system for vehicle
US10110860B1 (en) 2004-04-15 2018-10-23 Magna Electronics Inc. Vehicular control system
US7949152B2 (en) 2004-04-15 2011-05-24 Donnelly Corporation Driver assistance system for vehicle
US20100312446A1 (en) * 2004-04-15 2010-12-09 Donnelly Corporation Driver assistance system for vehicle
US20110216198A1 (en) * 2004-04-15 2011-09-08 Donnelly Corporation Imaging system for vehicle
US8325986B2 (en) 2004-04-15 2012-12-04 Donnelly Corporation Imaging system for vehicle
US20110122249A1 (en) * 2004-09-30 2011-05-26 Donnelly Corporation Vision system for vehicle
US8189871B2 (en) 2004-09-30 2012-05-29 Donnelly Corporation Vision system for vehicle
US10623704B2 (en) 2004-09-30 2020-04-14 Donnelly Corporation Driver assistance system for vehicle
US8977008B2 (en) 2004-09-30 2015-03-10 Donnelly Corporation Driver assistance system for vehicle
US8483439B2 (en) 2004-09-30 2013-07-09 Donnelly Corporation Vision system for vehicle
US7330584B2 (en) * 2004-10-14 2008-02-12 Sony Corporation Image processing apparatus and method
US20060083421A1 (en) * 2004-10-14 2006-04-20 Wu Weiguo Image processing apparatus and method
US20100177169A1 (en) * 2004-12-14 2010-07-15 Google Inc. Three-dimensional model construction using unstructured pattern
US7974463B2 (en) 2004-12-14 2011-07-05 Google Inc. Compensating for distortion in digital images
US7660458B1 (en) * 2004-12-14 2010-02-09 Google Inc. Three-dimensional model construction using unstructured pattern
US9193303B2 (en) 2004-12-23 2015-11-24 Magna Electronics Inc. Driver assistance system for vehicle
US9014904B2 (en) 2004-12-23 2015-04-21 Magna Electronics Inc. Driver assistance system for vehicle
US11308720B2 (en) 2004-12-23 2022-04-19 Magna Electronics Inc. Vehicular imaging system
US9940528B2 (en) 2004-12-23 2018-04-10 Magna Electronics Inc. Driver assistance system for vehicle
US10509972B2 (en) 2004-12-23 2019-12-17 Magna Electronics Inc. Vehicular vision system
CN1956555B (en) * 2005-10-12 2011-12-07 三星电子株式会社 Apparatus and method for processing 3d picture
US8116557B2 (en) 2005-10-12 2012-02-14 Samsung Electronics Co., Ltd. 3D image processing apparatus and method
US8434919B2 (en) 2006-08-11 2013-05-07 Donnelly Corporation Adaptive forward lighting system for vehicle
US8162518B2 (en) 2006-08-11 2012-04-24 Donnelly Corporation Adaptive forward lighting system for vehicle
US10787116B2 (en) 2006-08-11 2020-09-29 Magna Electronics Inc. Adaptive forward lighting system for vehicle comprising a control that adjusts the headlamp beam in response to processing of image data captured by a camera
US11148583B2 (en) 2006-08-11 2021-10-19 Magna Electronics Inc. Vehicular forward viewing image capture system
US10071676B2 (en) 2006-08-11 2018-09-11 Magna Electronics Inc. Vision system for vehicle
US11951900B2 (en) 2006-08-11 2024-04-09 Magna Electronics Inc. Vehicular forward viewing image capture system
US7972045B2 (en) 2006-08-11 2011-07-05 Donnelly Corporation Automatic headlamp control system
US8636393B2 (en) 2006-08-11 2014-01-28 Magna Electronics Inc. Driver assistance system for vehicle
US11623559B2 (en) 2006-08-11 2023-04-11 Magna Electronics Inc. Vehicular forward viewing image capture system
US11396257B2 (en) 2006-08-11 2022-07-26 Magna Electronics Inc. Vehicular forward viewing image capture system
US9440535B2 (en) 2006-08-11 2016-09-13 Magna Electronics Inc. Vision system for vehicle
US8107728B2 (en) * 2006-09-19 2012-01-31 Sharp Kabushiki Kaisha Image processing apparatus, image forming apparatus, image processing system, computer program and recording medium
US20080260260A1 (en) * 2006-09-19 2008-10-23 Sharp Kabushiki Kaisha Image processing apparatus, image forming apparatus, image processing system, computer program and recording medium
US20080158359A1 (en) * 2006-12-27 2008-07-03 Matsushita Electric Industrial Co., Ltd. Solid-state imaging device, camera, vehicle and surveillance device
US11506782B2 (en) 2007-01-25 2022-11-22 Magna Electronics Inc. Vehicular forward-sensing system
US9140789B2 (en) 2007-01-25 2015-09-22 Magna Electronics Inc. Forward facing sensing system for vehicle
US10877147B2 (en) 2007-01-25 2020-12-29 Magna Electronics Inc. Forward sensing system for vehicle
US11815594B2 (en) 2007-01-25 2023-11-14 Magna Electronics Inc. Vehicular forward-sensing system
US8614640B2 (en) 2007-01-25 2013-12-24 Magna Electronics Inc. Forward facing sensing system for vehicle
US9335411B1 (en) 2007-01-25 2016-05-10 Magna Electronics Inc. Forward facing sensing system for vehicle
US8217830B2 (en) 2007-01-25 2012-07-10 Magna Electronics Inc. Forward facing sensing system for a vehicle
US9507021B2 (en) 2007-01-25 2016-11-29 Magna Electronics Inc. Forward facing sensing system for vehicle
US10670713B2 (en) 2007-01-25 2020-06-02 Magna Electronics Inc. Forward sensing system for vehicle
US9244165B1 (en) 2007-01-25 2016-01-26 Magna Electronics Inc. Forward facing sensing system for vehicle
US8294608B1 (en) 2007-01-25 2012-10-23 Magna Electronics, Inc. Forward facing sensing system for vehicle
US10107905B2 (en) 2007-01-25 2018-10-23 Magna Electronics Inc. Forward facing sensing system for vehicle
US20080199070A1 (en) * 2007-02-16 2008-08-21 Samsung Electronics Co., Ltd. Three-dimensional image display apparatus and method for enhancing stereoscopic effect of image
KR101345303B1 (en) 2007-03-29 2013-12-27 삼성전자주식회사 Dynamic depth control method or apparatus in stereo-view or multiview sequence images
US10086747B2 (en) 2007-07-12 2018-10-02 Magna Electronics Inc. Driver assistance system for vehicle
US8142059B2 (en) 2007-07-12 2012-03-27 Magna Electronics Inc. Automatic lighting system
US8070332B2 (en) 2007-07-12 2011-12-06 Magna Electronics Inc. Automatic lighting system with adaptive function
US10807515B2 (en) 2007-07-12 2020-10-20 Magna Electronics Inc. Vehicular adaptive headlighting system
US8814401B2 (en) 2007-07-12 2014-08-26 Magna Electronics Inc. Vehicular vision system
US20090045323A1 (en) * 2007-08-17 2009-02-19 Yuesheng Lu Automatic Headlamp Control System
US11328447B2 (en) 2007-08-17 2022-05-10 Magna Electronics Inc. Method of blockage determination and misalignment correction for vehicular vision system
US11908166B2 (en) 2007-08-17 2024-02-20 Magna Electronics Inc. Vehicular imaging system with misalignment correction of camera
US10726578B2 (en) 2007-08-17 2020-07-28 Magna Electronics Inc. Vehicular imaging system with blockage determination and misalignment correction
US9972100B2 (en) 2007-08-17 2018-05-15 Magna Electronics Inc. Vehicular imaging system comprising an imaging device with a single image sensor and image processor for determining a totally blocked state or partially blocked state of the single image sensor as well as an automatic correction for misalignment of the imaging device
US8017898B2 (en) 2007-08-17 2011-09-13 Magna Electronics Inc. Vehicular imaging system in an automatic headlamp control system
US9018577B2 (en) 2007-08-17 2015-04-28 Magna Electronics Inc. Vehicular imaging system with camera misalignment correction and capturing image data at different resolution levels dependent on distance to object in field of view
US8451107B2 (en) 2007-09-11 2013-05-28 Magna Electronics, Inc. Imaging system for vehicle
US11613209B2 (en) 2007-09-11 2023-03-28 Magna Electronics Inc. System and method for guiding reversing of a vehicle toward a trailer hitch
US20100265048A1 (en) * 2007-09-11 2010-10-21 Yuesheng Lu Imaging System for Vehicle
US10766417B2 (en) 2007-09-11 2020-09-08 Magna Electronics Inc. Imaging system for vehicle
US9796332B2 (en) 2007-09-11 2017-10-24 Magna Electronics Inc. Imaging system for vehicle
US10003755B2 (en) 2007-10-04 2018-06-19 Magna Electronics Inc. Imaging system for vehicle
US10616507B2 (en) 2007-10-04 2020-04-07 Magna Electronics Inc. Imaging system for vehicle
US8908040B2 (en) 2007-10-04 2014-12-09 Magna Electronics Inc. Imaging system for vehicle
US8446470B2 (en) 2007-10-04 2013-05-21 Magna Electronics, Inc. Combined RGB and IR imaging sensor
US11165975B2 (en) 2007-10-04 2021-11-02 Magna Electronics Inc. Imaging system for vehicle
US8174563B2 (en) * 2007-10-29 2012-05-08 Fuji Jukogyo Kabushiki Kaisha Object detecting system
US20090237491A1 (en) * 2007-10-29 2009-09-24 Toru Saito Object Detecting System
US8180145B2 (en) 2007-12-28 2012-05-15 Industrial Technology Research Institute Method for producing image with depth by using 2D images
US20090169057A1 (en) * 2007-12-28 2009-07-02 Industrial Technology Research Institute Method for producing image with depth by using 2d images
US8494251B2 (en) * 2008-01-11 2013-07-23 Sri International System and method for measuring image quality
US20090180682A1 (en) * 2008-01-11 2009-07-16 Theodore Armand Camus System and method for measuring image quality
US8090195B2 (en) * 2008-02-12 2012-01-03 Panasonic Corporation Compound eye imaging apparatus, distance measuring apparatus, disparity calculation method, and distance measuring method
US20100150455A1 (en) * 2008-02-12 2010-06-17 Ichiro Oyama Compound eye imaging apparatus, distance measuring apparatus, disparity calculation method, and distance measuring method
US20090244313A1 (en) * 2008-03-26 2009-10-01 Tomonori Masuda Compound eye photographing apparatus, control method therefor, and program
US8213706B2 (en) 2008-04-22 2012-07-03 Honeywell International Inc. Method and system for real-time visual odometry
US20090263009A1 (en) * 2008-04-22 2009-10-22 Honeywell International Inc. Method and system for real-time visual odometry
US8238612B2 (en) 2008-05-06 2012-08-07 Honeywell International Inc. Method and apparatus for vision based motion determination
US20090279741A1 (en) * 2008-05-06 2009-11-12 Honeywell Method and apparatus for vision based motion determination
EP2133726A1 (en) 2008-06-10 2009-12-16 THOMSON Licensing Multi-image capture system with improved depth image resolution
US20100007720A1 (en) * 2008-06-27 2010-01-14 Beddhu Murali Method for front matching stereo vision
US8264526B2 (en) * 2008-06-27 2012-09-11 The University Of Southern Mississippi Method for front matching stereo vision
US11091105B2 (en) 2008-07-24 2021-08-17 Magna Electronics Inc. Vehicle vision system
US9509957B2 (en) 2008-07-24 2016-11-29 Magna Electronics Inc. Vehicle imaging system
US20100020170A1 (en) * 2008-07-24 2010-01-28 Higgins-Luthman Michael J Vehicle Imaging System
US20100066811A1 (en) * 2008-08-11 2010-03-18 Electronics And Telecommunications Research Institute Stereo vision system and control method thereof
US8098276B2 (en) * 2008-08-11 2012-01-17 Electronics And Telecommunications Research Institute Stereo vision system and control method thereof
WO2010083713A1 (en) * 2009-01-22 2010-07-29 华为技术有限公司 Method and device for disparity computation
CN101790103B (en) * 2009-01-22 2012-05-30 华为技术有限公司 Parallax calculation method and device
US9324147B2 (en) 2009-01-22 2016-04-26 Huawei Technologies Co., Ltd. Method and apparatus for computing a parallax
US10839233B2 (en) 2009-02-27 2020-11-17 Magna Electronics Inc. Vehicular control system
US11288888B2 (en) 2009-02-27 2022-03-29 Magna Electronics Inc. Vehicular control system
US9911050B2 (en) 2009-02-27 2018-03-06 Magna Electronics Inc. Driver active safety control system for vehicle
US9126525B2 (en) 2009-02-27 2015-09-08 Magna Electronics Inc. Alert system for vehicle
US11763573B2 (en) 2009-02-27 2023-09-19 Magna Electronics Inc. Vehicular control system
US20100266326A1 (en) * 2009-04-21 2010-10-21 Chuang Cheng-Hua Mark-erasable pen cap
US11511668B2 (en) 2009-05-15 2022-11-29 Magna Electronics Inc. Vehicular driver assistance system with construction zone recognition
US9187028B2 (en) 2009-05-15 2015-11-17 Magna Electronics Inc. Driver assistance system for vehicle
US10005394B2 (en) 2009-05-15 2018-06-26 Magna Electronics Inc. Driver assistance system for vehicle
US10744940B2 (en) 2009-05-15 2020-08-18 Magna Electronics Inc. Vehicular control system with temperature input
US8376595B2 (en) 2009-05-15 2013-02-19 Magna Electronics, Inc. Automatic headlamp control
US11518377B2 (en) 2009-07-27 2022-12-06 Magna Electronics Inc. Vehicular vision system
US9495876B2 (en) 2009-07-27 2016-11-15 Magna Electronics Inc. Vehicular camera with on-board microcontroller
US10875526B2 (en) 2009-07-27 2020-12-29 Magna Electronics Inc. Vehicular vision system
US10106155B2 (en) 2009-07-27 2018-10-23 Magna Electronics Inc. Vehicular camera with on-board microcontroller
US10569804B2 (en) 2009-07-27 2020-02-25 Magna Electronics Inc. Parking assist system
US9868463B2 (en) 2009-07-27 2018-01-16 Magna Electronics Inc. Parking assist system
US8874317B2 (en) 2009-07-27 2014-10-28 Magna Electronics Inc. Parking assist system
US9457717B2 (en) 2009-07-27 2016-10-04 Magna Electronics Inc. Parking assist system
US10875455B2 (en) 2009-09-01 2020-12-29 Magna Electronics Inc. Vehicular vision system
US10300856B2 (en) 2009-09-01 2019-05-28 Magna Electronics Inc. Vehicular display system
US9041806B2 (en) 2009-09-01 2015-05-26 Magna Electronics Inc. Imaging and display system for vehicle
US11794651B2 (en) 2009-09-01 2023-10-24 Magna Electronics Inc. Vehicular vision system
US11285877B2 (en) 2009-09-01 2022-03-29 Magna Electronics Inc. Vehicular vision system
US10053012B2 (en) 2009-09-01 2018-08-21 Magna Electronics Inc. Imaging and display system for vehicle
US9789821B2 (en) 2009-09-01 2017-10-17 Magna Electronics Inc. Imaging and display system for vehicle
US20110074933A1 (en) * 2009-09-28 2011-03-31 Sharp Laboratories Of America, Inc. Reduction of viewer discomfort for stereoscopic images
US8284235B2 (en) 2009-09-28 2012-10-09 Sharp Laboratories Of America, Inc. Reduction of viewer discomfort for stereoscopic images
US8525879B2 (en) 2009-12-15 2013-09-03 Industrial Technology Research Institute Depth detection method and system using thereof
US20110141274A1 (en) * 2009-12-15 2011-06-16 Industrial Technology Research Institute Depth Detection Method and System Using Thereof
US20110193961A1 (en) * 2010-02-10 2011-08-11 Magna Mirrors Of America, Inc. Imaging and display system for vehicle
US8890955B2 (en) 2010-02-10 2014-11-18 Magna Mirrors Of America, Inc. Adaptable wireless vehicle vision system based on wireless communication error
US9188849B2 (en) 2010-03-05 2015-11-17 Panasonic Intellectual Property Management Co., Ltd. 3D imaging device and 3D imaging method
US9128367B2 (en) 2010-03-05 2015-09-08 Panasonic Intellectual Property Management Co., Ltd. 3D imaging device and 3D imaging method
US9049434B2 (en) 2010-03-05 2015-06-02 Panasonic Intellectual Property Management Co., Ltd. 3D imaging device and 3D imaging method
US8320627B2 (en) 2010-06-17 2012-11-27 Caterpillar Inc. Machine control system utilizing stereo disparity density
US9117123B2 (en) 2010-07-05 2015-08-25 Magna Electronics Inc. Vehicular rear view camera display system with lifecheck function
US9180908B2 (en) 2010-11-19 2015-11-10 Magna Electronics Inc. Lane keeping system and lane centering system
US9758163B2 (en) 2010-11-19 2017-09-12 Magna Electronics Inc. Lane keeping system and lane centering system
US11198434B2 (en) 2010-11-19 2021-12-14 Magna Electronics Inc. Vehicular lane centering system
US10427679B2 (en) 2010-11-19 2019-10-01 Magna Electronics Inc. Lane keeping system and lane centering system
US11753007B2 (en) 2010-11-19 2023-09-12 Magna Electronics Inc. Vehicular lane centering system
US20120127163A1 (en) * 2010-11-23 2012-05-24 Samsung Electronics Co., Ltd. Image processing apparatus and image processing method
US9900522B2 (en) 2010-12-01 2018-02-20 Magna Electronics Inc. System and method of establishing a multi-camera image using pixel remapping
US11553140B2 (en) 2010-12-01 2023-01-10 Magna Electronics Inc. Vehicular vision system with multiple cameras
US10868974B2 (en) 2010-12-01 2020-12-15 Magna Electronics Inc. Method for determining alignment of vehicular cameras
US9731653B2 (en) 2010-12-22 2017-08-15 Magna Electronics Inc. Vision display system for vehicle
US11155211B2 (en) 2010-12-22 2021-10-26 Magna Electronics Inc. Vehicular multi-camera surround view system with video display
US9591281B2 (en) * 2010-12-22 2017-03-07 Thomson Licensing Apparatus and method for determining a disparity estimate
US10486597B1 (en) 2010-12-22 2019-11-26 Magna Electronics Inc. Vehicular vision system with rear backup video display
US10144352B2 (en) 2010-12-22 2018-12-04 Magna Electronics Inc. Vision display system for vehicle
US9264672B2 (en) 2010-12-22 2016-02-16 Magna Mirrors Of America, Inc. Vision display system for vehicle
US10589678B1 (en) 2010-12-22 2020-03-17 Magna Electronics Inc. Vehicular rear backup vision system with video display
US11548444B2 (en) 2010-12-22 2023-01-10 Magna Electronics Inc. Vehicular multi-camera surround view system with video display
US11708026B2 (en) 2010-12-22 2023-07-25 Magna Electronics Inc. Vehicular rear backup system with video display
US9598014B2 (en) 2010-12-22 2017-03-21 Magna Electronics Inc. Vision display system for vehicle
US10336255B2 (en) 2010-12-22 2019-07-02 Magna Electronics Inc. Vehicular vision system with rear backup video display
US20130272582A1 (en) * 2010-12-22 2013-10-17 Thomson Licensing Apparatus and method for determining a disparity estimate
US9469250B2 (en) 2010-12-22 2016-10-18 Magna Electronics Inc. Vision display system for vehicle
US10814785B2 (en) 2010-12-22 2020-10-27 Magna Electronics Inc. Vehicular rear backup vision system with video display
US11820424B2 (en) 2011-01-26 2023-11-21 Magna Electronics Inc. Trailering assist system with trailer angle detection
US9950738B2 (en) 2011-01-26 2018-04-24 Magna Electronics Inc. Trailering assist system with trailer angle detection
US10858042B2 (en) 2011-01-26 2020-12-08 Magna Electronics Inc. Trailering assist system with trailer angle detection
US9085261B2 (en) 2011-01-26 2015-07-21 Magna Electronics Inc. Rear vision system with trailer angle detection
EP2692140A2 (en) * 2011-03-30 2014-02-05 Intel Corporation Real-time depth extraction using stereo correspondence
CN103460705B (en) * 2011-03-30 2016-12-07 英特尔公司 Real-time depth extraction using stereo correspondence
EP2692140A4 (en) * 2011-03-30 2014-10-01 Intel Corp Real-time depth extraction using stereo correspondence
CN103460705A (en) * 2011-03-30 2013-12-18 英特尔公司 Real-time depth extraction using stereo correspondence
US20120249747A1 (en) * 2011-03-30 2012-10-04 Ziv Aviv Real-time depth extraction using stereo correspondence
US8823777B2 (en) * 2011-03-30 2014-09-02 Intel Corporation Real-time depth extraction using stereo correspondence
US10288724B2 (en) 2011-04-12 2019-05-14 Magna Electronics Inc. System and method for estimating distance between a mobile unit and a vehicle using a TOF system
US9194943B2 (en) 2011-04-12 2015-11-24 Magna Electronics Inc. Step filter for estimating distance in a time-of-flight ranging system
US10452931B2 (en) 2011-04-25 2019-10-22 Magna Electronics Inc. Processing method for distinguishing a three dimensional object from a two dimensional object using a vehicular system
US9547795B2 (en) 2011-04-25 2017-01-17 Magna Electronics Inc. Image processing method for detecting objects using relative motion
US10654423B2 (en) 2011-04-25 2020-05-19 Magna Electronics Inc. Method and system for dynamically ascertaining alignment of vehicular cameras
US11554717B2 (en) 2011-04-25 2023-01-17 Magna Electronics Inc. Vehicular vision system that dynamically calibrates a vehicular camera
US10043082B2 (en) 2011-04-25 2018-08-07 Magna Electronics Inc. Image processing method for detecting objects using relative motion
US10202077B2 (en) 2011-04-25 2019-02-12 Magna Electronics Inc. Method for dynamically calibrating vehicular cameras
US9834153B2 (en) 2011-04-25 2017-12-05 Magna Electronics Inc. Method and system for dynamically calibrating vehicular cameras
US9357208B2 (en) 2011-04-25 2016-05-31 Magna Electronics Inc. Method and system for dynamically calibrating vehicular cameras
US10640041B2 (en) 2011-04-25 2020-05-05 Magna Electronics Inc. Method for dynamically calibrating vehicular cameras
US10919458B2 (en) 2011-04-25 2021-02-16 Magna Electronics Inc. Method and system for calibrating vehicular cameras
US11007934B2 (en) 2011-04-25 2021-05-18 Magna Electronics Inc. Method for dynamically calibrating a vehicular camera
US10793067B2 (en) 2011-07-26 2020-10-06 Magna Electronics Inc. Imaging system for vehicle
US11285873B2 (en) 2011-07-26 2022-03-29 Magna Electronics Inc. Method for generating surround view images derived from image data captured by cameras of a vehicular surround view vision system
US9491450B2 (en) 2011-08-01 2016-11-08 Magna Electronic Inc. Vehicle camera alignment system
US9900490B2 (en) 2011-09-21 2018-02-20 Magna Electronics Inc. Vehicle vision system using image data transmission and power supply via a coaxial cable
US11877054B2 (en) 2011-09-21 2024-01-16 Magna Electronics Inc. Vehicular vision system using image data transmission and power supply via a coaxial cable
US10827108B2 (en) 2011-09-21 2020-11-03 Magna Electronics Inc. Vehicular vision system using image data transmission and power supply via a coaxial cable
US11201994B2 (en) 2011-09-21 2021-12-14 Magna Electronics Inc. Vehicular multi-camera surround view system using image data transmission and power supply via coaxial cables
US11638070B2 (en) 2011-09-21 2023-04-25 Magna Electronics Inc. Vehicular vision system using image data transmission and power supply via a coaxial cable
US10284764B2 (en) 2011-09-21 2019-05-07 Magna Electronics Inc. Vehicle vision using image data transmission and power supply via a coaxial cable
US10567633B2 (en) 2011-09-21 2020-02-18 Magna Electronics Inc. Vehicle vision system using image data transmission and power supply via a coaxial cable
US9774790B1 (en) 2011-09-26 2017-09-26 Magna Electronics Inc. Method for enhancing vehicle camera image quality
US10257432B2 (en) 2011-09-26 2019-04-09 Magna Electronics Inc. Method for enhancing vehicle camera image quality
US9681062B2 (en) 2011-09-26 2017-06-13 Magna Electronics Inc. Vehicle camera image quality improvement in poor visibility conditions by contrast amplification
US9146898B2 (en) 2011-10-27 2015-09-29 Magna Electronics Inc. Driver assist system with algorithm switching
US11673546B2 (en) 2011-10-27 2023-06-13 Magna Electronics Inc. Vehicular control system with image processing and wireless communication
US11279343B2 (en) 2011-10-27 2022-03-22 Magna Electronics Inc. Vehicular control system with image processing and wireless communication
US9919705B2 (en) 2011-10-27 2018-03-20 Magna Electronics Inc. Driver assist system with image processing and wireless communication
US20130116562A1 (en) * 2011-11-09 2013-05-09 Samsung Electronics Co., Ltd. Method and apparatus for generating diagnostic image and medical image system
US9491451B2 (en) 2011-11-15 2016-11-08 Magna Electronics Inc. Calibration system and method for vehicular surround vision system
US10264249B2 (en) 2011-11-15 2019-04-16 Magna Electronics Inc. Calibration system and method for vehicular surround vision system
US11142123B2 (en) 2011-11-28 2021-10-12 Magna Electronics Inc. Multi-camera vehicular vision system
US10071687B2 (en) 2011-11-28 2018-09-11 Magna Electronics Inc. Vision system for vehicle
US10640040B2 (en) 2011-11-28 2020-05-05 Magna Electronics Inc. Vision system for vehicle
US11634073B2 (en) 2011-11-28 2023-04-25 Magna Electronics Inc. Multi-camera vehicular vision system
US11305691B2 (en) 2011-11-28 2022-04-19 Magna Electronics Inc. Vehicular vision system
US10099614B2 (en) 2011-11-28 2018-10-16 Magna Electronics Inc. Vision system for vehicle
US11787338B2 (en) 2011-11-28 2023-10-17 Magna Electronics Inc. Vehicular vision system
US10129518B2 (en) 2011-12-09 2018-11-13 Magna Electronics Inc. Vehicle vision system with customized display
US11082678B2 (en) 2011-12-09 2021-08-03 Magna Electronics Inc. Vehicular vision system with customized display
US10542244B2 (en) 2011-12-09 2020-01-21 Magna Electronics Inc. Vehicle vision system with customized display
US9762880B2 (en) 2011-12-09 2017-09-12 Magna Electronics Inc. Vehicle vision system with customized display
US11689703B2 (en) 2011-12-09 2023-06-27 Magna Electronics Inc. Vehicular vision system with customized display
US10493916B2 (en) 2012-02-22 2019-12-03 Magna Electronics Inc. Vehicle camera system with image manipulation
US11007937B2 (en) 2012-02-22 2021-05-18 Magna Electronics Inc. Vehicular display system with multi-paned image display
US10926702B2 (en) 2012-02-22 2021-02-23 Magna Electronics Inc. Vehicle camera system with image manipulation
US10457209B2 (en) 2012-02-22 2019-10-29 Magna Electronics Inc. Vehicle vision system with multi-paned view
US11607995B2 (en) 2012-02-22 2023-03-21 Magna Electronics Inc. Vehicular display system with multi-paned image display
US11577645B2 (en) 2012-02-22 2023-02-14 Magna Electronics Inc. Vehicular vision system with image manipulation
US8694224B2 (en) 2012-03-01 2014-04-08 Magna Electronics Inc. Vehicle yaw rate correction
US9715769B2 (en) 2012-03-01 2017-07-25 Magna Electronics Inc. Process for determining state of a vehicle
US9916699B2 (en) 2012-03-01 2018-03-13 Magna Electronics Inc. Process for determining state of a vehicle
US10127738B2 (en) 2012-03-01 2018-11-13 Magna Electronics Inc. Method for vehicular control
US9346468B2 (en) 2012-03-01 2016-05-24 Magna Electronics Inc. Vehicle vision system with yaw rate determination
US8849495B2 (en) 2012-03-01 2014-09-30 Magna Electronics Inc. Vehicle vision system with yaw rate determination
US11627286B2 (en) 2012-03-23 2023-04-11 Magna Electronics Inc. Vehicular vision system with accelerated determination of another vehicle
US11184585B2 (en) 2012-03-23 2021-11-23 Magna Electronics Inc. Vehicular vision system with accelerated determination of an object of interest
US10911721B2 (en) 2012-03-23 2021-02-02 Magna Electronics Inc. Vehicle vision system with accelerated determination of an object of interest
US10609335B2 (en) 2012-03-23 2020-03-31 Magna Electronics Inc. Vehicle vision system with accelerated object confirmation
US9751465B2 (en) 2012-04-16 2017-09-05 Magna Electronics Inc. Vehicle vision system with reduced image color data processing by use of dithering
US10434944B2 (en) 2012-04-16 2019-10-08 Magna Electronics Inc. Vehicle vision system with reduced image color data processing by use of dithering
US10922563B2 (en) 2012-05-18 2021-02-16 Magna Electronics Inc. Vehicular control system
US11308718B2 (en) 2012-05-18 2022-04-19 Magna Electronics Inc. Vehicular vision system
US10089537B2 (en) 2012-05-18 2018-10-02 Magna Electronics Inc. Vehicle vision system with front and rear camera integration
US10515279B2 (en) 2012-05-18 2019-12-24 Magna Electronics Inc. Vehicle vision system with front and rear camera integration
US11508160B2 (en) 2012-05-18 2022-11-22 Magna Electronics Inc. Vehicular vision system
US11769335B2 (en) 2012-05-18 2023-09-26 Magna Electronics Inc. Vehicular rear backup system
US9340227B2 (en) 2012-08-14 2016-05-17 Magna Electronics Inc. Vehicle lane keep assist system
US10733892B2 (en) 2012-09-04 2020-08-04 Magna Electronics Inc. Driver assistant system using influence mapping for conflict avoidance path determination
US11663917B2 (en) 2012-09-04 2023-05-30 Magna Electronics Inc. Vehicular control system using influence mapping for conflict avoidance path determination
US10115310B2 (en) 2012-09-04 2018-10-30 Magna Electronics Inc. Driver assistant system using influence mapping for conflict avoidance path determination
US9761142B2 (en) 2012-09-04 2017-09-12 Magna Electronics Inc. Driver assistant system using influence mapping for conflict avoidance path determination
US11872939B2 (en) 2012-09-26 2024-01-16 Magna Electronics Inc. Vehicular trailer angle detection system
US11285875B2 (en) 2012-09-26 2022-03-29 Magna Electronics Inc. Method for dynamically calibrating a vehicular trailer angle detection system
US9779313B2 (en) 2012-09-26 2017-10-03 Magna Electronics Inc. Vehicle vision system with trailer angle detection
US10586119B2 (en) 2012-09-26 2020-03-10 Magna Electronics Inc. Vehicular control system with trailering assist function
US10089541B2 (en) 2012-09-26 2018-10-02 Magna Electronics Inc. Vehicular control system with trailering assist function
US10800332B2 (en) 2012-09-26 2020-10-13 Magna Electronics Inc. Trailer driving assist system
US9802542B2 (en) 2012-09-26 2017-10-31 Magna Electronics Inc. Trailer angle detection system calibration
US10909393B2 (en) 2012-09-26 2021-02-02 Magna Electronics Inc. Vehicular control system with trailering assist function
US11410431B2 (en) 2012-09-26 2022-08-09 Magna Electronics Inc. Vehicular control system with trailering assist function
US9446713B2 (en) 2012-09-26 2016-09-20 Magna Electronics Inc. Trailer angle detection system
US10300855B2 (en) 2012-09-26 2019-05-28 Magna Electronics Inc. Trailer driving assist system
US9558409B2 (en) 2012-09-26 2017-01-31 Magna Electronics Inc. Vehicle vision system with trailer angle detection
US10009594B2 (en) * 2012-09-27 2018-06-26 Panasonic Intellectual Property Management Co., Ltd. Stereo image processing device and stereo image processing method
CN104685535B (en) * 2012-09-27 2017-11-28 松下知识产权经营株式会社 Stereoscopic image processing device and stereoscopic image processing method
US20150249814A1 (en) * 2012-09-27 2015-09-03 Panasonic Intellectual Property Management Co., Ltd. Stereo image processing device and stereo image processing method
CN104685535A (en) * 2012-09-27 2015-06-03 松下知识产权经营株式会社 Stereo image processing device and stereo image processing method
US9723272B2 (en) 2012-10-05 2017-08-01 Magna Electronics Inc. Multi-camera image stitching calibration system
US10904489B2 (en) 2012-10-05 2021-01-26 Magna Electronics Inc. Multi-camera calibration method for a vehicle moving along a vehicle assembly line
US11265514B2 (en) 2012-10-05 2022-03-01 Magna Electronics Inc. Multi-camera calibration method for a vehicle moving along a vehicle assembly line
US10284818B2 (en) 2012-10-05 2019-05-07 Magna Electronics Inc. Multi-camera image stitching calibration system
US10321064B2 (en) 2012-11-19 2019-06-11 Magna Electronics Inc. Vehicular vision system with enhanced display functions
US9090234B2 (en) 2012-11-19 2015-07-28 Magna Electronics Inc. Braking control system for vehicle
US10104298B2 (en) 2012-11-19 2018-10-16 Magna Electronics Inc. Vehicle vision system with enhanced display functions
US10023161B2 (en) 2012-11-19 2018-07-17 Magna Electronics Inc. Braking control system for vehicle
US9481344B2 (en) 2012-11-19 2016-11-01 Magna Electronics Inc. Braking control system for vehicle
US9743002B2 (en) 2012-11-19 2017-08-22 Magna Electronics Inc. Vehicle vision system with enhanced display functions
US10025994B2 (en) 2012-12-04 2018-07-17 Magna Electronics Inc. Vehicle vision system utilizing corner detection
US10560610B2 (en) 2012-12-05 2020-02-11 Magna Electronics Inc. Method of synchronizing multiple vehicular cameras with an ECU
US10873682B2 (en) 2012-12-05 2020-12-22 Magna Electronics Inc. Method of synchronizing multiple vehicular cameras with an ECU
US10171709B2 (en) 2012-12-05 2019-01-01 Magna Electronics Inc. Vehicle vision system utilizing multiple cameras and ethernet links
US9481301B2 (en) 2012-12-05 2016-11-01 Magna Electronics Inc. Vehicle vision system utilizing camera synchronization
US9912841B2 (en) 2012-12-05 2018-03-06 Magna Electronics Inc. Vehicle vision system utilizing camera synchronization
US20140219549A1 (en) * 2013-02-01 2014-08-07 Electronics And Telecommunications Research Institute Method and apparatus for active stereo matching
US10803744B2 (en) 2013-02-04 2020-10-13 Magna Electronics Inc. Vehicular collision mitigation system
US11798419B2 (en) 2013-02-04 2023-10-24 Magna Electronics Inc. Vehicular collision mitigation system
US9092986B2 (en) 2013-02-04 2015-07-28 Magna Electronics Inc. Vehicular vision system
US9563809B2 (en) 2013-02-04 2017-02-07 Magna Electronics Inc. Vehicular vision system
US9824285B2 (en) 2013-02-04 2017-11-21 Magna Electronics Inc. Vehicular control system
US11012668B2 (en) 2013-02-04 2021-05-18 Magna Electronics Inc. Vehicular security system that limits vehicle access responsive to signal jamming detection
US10497262B2 (en) 2013-02-04 2019-12-03 Magna Electronics Inc. Vehicular collision mitigation system
US9318020B2 (en) 2013-02-04 2016-04-19 Magna Electronics Inc. Vehicular collision mitigation system
US10523904B2 (en) 2013-02-04 2019-12-31 Magna Electronics Inc. Vehicle data recording system
US10179543B2 (en) 2013-02-27 2019-01-15 Magna Electronics Inc. Multi-camera dynamic top view vision system
US11192500B2 (en) 2013-02-27 2021-12-07 Magna Electronics Inc. Method for stitching image data captured by multiple vehicular cameras
US11572015B2 (en) 2013-02-27 2023-02-07 Magna Electronics Inc. Multi-camera vehicular vision system with graphic overlay
US10780827B2 (en) 2013-02-27 2020-09-22 Magna Electronics Inc. Method for stitching images captured by multiple vehicular cameras
US10486596B2 (en) 2013-02-27 2019-11-26 Magna Electronics Inc. Multi-camera dynamic top view vision system
US9688200B2 (en) 2013-03-04 2017-06-27 Magna Electronics Inc. Calibration system and method for multi-camera vision system
US10027930B2 (en) 2013-03-29 2018-07-17 Magna Electronics Inc. Spectral filtering for vehicular driver assistance systems
US9327693B2 (en) 2013-04-10 2016-05-03 Magna Electronics Inc. Rear collision avoidance system for vehicle
US9545921B2 (en) 2013-04-10 2017-01-17 Magna Electronics Inc. Collision avoidance system for vehicle
US11718291B2 (en) 2013-04-10 2023-08-08 Magna Electronics Inc. Vehicular collision avoidance system
US10875527B2 (en) 2013-04-10 2020-12-29 Magna Electronics Inc. Collision avoidance system for vehicle
US10207705B2 (en) 2013-04-10 2019-02-19 Magna Electronics Inc. Collision avoidance system for vehicle
US9802609B2 (en) 2013-04-10 2017-10-31 Magna Electronics Inc. Collision avoidance system for vehicle
US11485358B2 (en) 2013-04-10 2022-11-01 Magna Electronics Inc. Vehicular collision avoidance system
US10232797B2 (en) 2013-04-29 2019-03-19 Magna Electronics Inc. Rear vision system for vehicle with dual purpose signal lines
US9508014B2 (en) 2013-05-06 2016-11-29 Magna Electronics Inc. Vehicular multi-camera vision system
US10057489B2 (en) 2013-05-06 2018-08-21 Magna Electronics Inc. Vehicular multi-camera vision system
US11616910B2 (en) 2013-05-06 2023-03-28 Magna Electronics Inc. Vehicular vision system with video display
US10574885B2 (en) 2013-05-06 2020-02-25 Magna Electronics Inc. Method for displaying video images for a vehicular vision system
US11050934B2 (en) 2013-05-06 2021-06-29 Magna Electronics Inc. Method for displaying video images for a vehicular vision system
US9769381B2 (en) 2013-05-06 2017-09-19 Magna Electronics Inc. Vehicular multi-camera vision system
US10266115B2 (en) 2013-05-21 2019-04-23 Magna Electronics Inc. Vehicle vision system using kinematic model of vehicle motion
US10567748B2 (en) 2013-05-21 2020-02-18 Magna Electronics Inc. Targetless vehicular camera calibration method
US11597319B2 (en) 2013-05-21 2023-03-07 Magna Electronics Inc. Targetless vehicular camera calibration system
US9563951B2 (en) 2013-05-21 2017-02-07 Magna Electronics Inc. Vehicle vision system with targetless camera calibration
US11447070B2 (en) 2013-05-21 2022-09-20 Magna Electronics Inc. Method for determining misalignment of a vehicular camera
US9701246B2 (en) 2013-05-21 2017-07-11 Magna Electronics Inc. Vehicle vision system using kinematic model of vehicle motion
US9979957B2 (en) 2013-05-21 2018-05-22 Magna Electronics Inc. Vehicle vision system with targetless camera calibration
US11109018B2 (en) 2013-05-21 2021-08-31 Magna Electronics Inc. Targetless vehicular camera misalignment correction method
US11919449B2 (en) 2013-05-21 2024-03-05 Magna Electronics Inc. Targetless vehicular camera calibration system
US11794647B2 (en) 2013-05-21 2023-10-24 Magna Electronics Inc. Vehicular vision system having a plurality of cameras
US10780826B2 (en) 2013-05-21 2020-09-22 Magna Electronics Inc. Method for determining misalignment of a vehicular camera
US10567705B2 (en) 2013-06-10 2020-02-18 Magna Electronics Inc. Coaxial cable with bidirectional data transmission
US11025859B2 (en) 2013-06-10 2021-06-01 Magna Electronics Inc. Vehicular multi-camera vision system using coaxial cables with bidirectional data transmission
US11290679B2 (en) 2013-06-10 2022-03-29 Magna Electronics Inc. Vehicular multi-camera vision system using coaxial cables with bidirectional data transmission
US11792360B2 (en) 2013-06-10 2023-10-17 Magna Electronics Inc. Vehicular vision system using cable with bidirectional data transmission
US11533452B2 (en) 2013-06-10 2022-12-20 Magna Electronics Inc. Vehicular multi-camera vision system using coaxial cables with bidirectional data transmission
US9260095B2 (en) 2013-06-19 2016-02-16 Magna Electronics Inc. Vehicle vision system with collision mitigation
US9824587B2 (en) 2013-06-19 2017-11-21 Magna Electronics Inc. Vehicle vision system with collision mitigation
US10692380B2 (en) 2013-06-19 2020-06-23 Magna Electronics Inc. Vehicle vision system with collision mitigation
US10222224B2 (en) 2013-06-24 2019-03-05 Magna Electronics Inc. System for locating a parking space based on a previously parked space
US10718624B2 (en) 2013-06-24 2020-07-21 Magna Electronics Inc. Vehicular parking assist system that determines a parking space based in part on previously parked spaces
US10755110B2 (en) 2013-06-28 2020-08-25 Magna Electronics Inc. Trailering assist system for vehicle
US11657619B2 (en) 2013-06-28 2023-05-23 Magna Electronics Inc. Vehicular trailering assist system
US11205080B2 (en) 2013-06-28 2021-12-21 Magna Electronics Inc. Trailering assist system for vehicle
US9619716B2 (en) 2013-08-12 2017-04-11 Magna Electronics Inc. Vehicle vision system with image classification
US10326969B2 (en) 2013-08-12 2019-06-18 Magna Electronics Inc. Vehicle vision system with reduction of temporal noise in images
US10137892B2 (en) 2013-12-05 2018-11-27 Magna Electronics Inc. Vehicle monitoring system
US11618441B2 (en) 2013-12-05 2023-04-04 Magna Electronics Inc. Vehicular control system with remote processor
US9499139B2 (en) 2013-12-05 2016-11-22 Magna Electronics Inc. Vehicle monitoring system
US10870427B2 (en) 2013-12-05 2020-12-22 Magna Electronics Inc. Vehicular control system with remote processor
US10688993B2 (en) 2013-12-12 2020-06-23 Magna Electronics Inc. Vehicle control system with traffic driving control
US9988047B2 (en) 2013-12-12 2018-06-05 Magna Electronics Inc. Vehicle control system with traffic driving control
US10493917B2 (en) 2014-02-04 2019-12-03 Magna Electronics Inc. Vehicular trailer backup assist system
US10160382B2 (en) 2014-02-04 2018-12-25 Magna Electronics Inc. Trailer backup assist system
US20160350929A1 (en) * 2014-02-05 2016-12-01 Creaform Inc. Structured light matching of a set of curves from two cameras
US10643343B2 (en) * 2014-02-05 2020-05-05 Creaform Inc. Structured light matching of a set of curves from three cameras
CN105960570A (en) * 2014-02-05 2016-09-21 形创有限公司 Structured light matching of a set of curves from two cameras
US10271039B2 (en) * 2014-02-05 2019-04-23 Creaform Inc. Structured light matching of a set of curves from two cameras
CN105960570B (en) * 2014-02-05 2019-03-22 形创有限公司 Structured light matching of a set of curves from two cameras
US11565690B2 (en) 2014-04-02 2023-01-31 Magna Electronics Inc. Vehicular driving assistance system that controls a vehicle in accordance with parameters preferred by an identified driver
US11130487B2 (en) 2014-04-02 2021-09-28 Magna Electronics Inc. Method for controlling a vehicle in accordance with parameters preferred by an identified driver
US9623878B2 (en) 2014-04-02 2017-04-18 Magna Electronics Inc. Personalized driver assistance system for vehicle
US9950707B2 (en) 2014-04-02 2018-04-24 Magna Electronics Inc. Method for controlling a vehicle in accordance with parameters preferred by an identified driver
US10202147B2 (en) 2014-04-10 2019-02-12 Magna Electronics Inc. Vehicle control system with adaptive wheel angle correction
US10994774B2 (en) 2014-04-10 2021-05-04 Magna Electronics Inc. Vehicular control system with steering adjustment
US9487235B2 (en) 2014-04-10 2016-11-08 Magna Electronics Inc. Vehicle control system with adaptive wheel angle correction
US11318928B2 (en) 2014-06-02 2022-05-03 Magna Electronics Inc. Vehicular automated parking system
US10328932B2 (en) 2014-06-02 2019-06-25 Magna Electronics Inc. Parking assist system with annotated map generation
US11572065B2 (en) 2014-09-17 2023-02-07 Magna Electronics Inc. Vehicle collision avoidance system with enhanced pedestrian avoidance
US9925980B2 (en) 2014-09-17 2018-03-27 Magna Electronics Inc. Vehicle collision avoidance system with enhanced pedestrian avoidance
US11787402B2 (en) 2014-09-17 2023-10-17 Magna Electronics Inc. Vehicle collision avoidance system with enhanced pedestrian avoidance
US11198432B2 (en) 2014-09-17 2021-12-14 Magna Electronics Inc. Vehicle collision avoidance system with enhanced pedestrian avoidance
US9916660B2 (en) 2015-01-16 2018-03-13 Magna Electronics Inc. Vehicle vision system with calibration algorithm
US10235775B2 (en) 2015-01-16 2019-03-19 Magna Electronics Inc. Vehicle vision system with calibration algorithm
US9842400B2 (en) 2015-01-27 2017-12-12 Samsung Electronics Co., Ltd. Method and apparatus for determining disparity
US9794543B2 (en) * 2015-03-02 2017-10-17 Ricoh Company, Ltd. Information processing apparatus, image capturing apparatus, control system applicable to moveable apparatus, information processing method, and storage medium of program of method
US20160261848A1 (en) * 2015-03-02 2016-09-08 Hiroyoshi Sekiguchi Information processing apparatus, image capturing apparatus, control system applicable to moveable apparatus, information processing method, and storage medium of program of method
US10853625B2 (en) * 2015-03-21 2020-12-01 Mine One Gmbh Facial signature methods, systems and software
US10551913B2 (en) 2015-03-21 2020-02-04 Mine One Gmbh Virtual 3D methods, systems and software
US10286855B2 (en) 2015-03-23 2019-05-14 Magna Electronics Inc. Vehicle vision system with video compression
US10946799B2 (en) 2015-04-21 2021-03-16 Magna Electronics Inc. Vehicle vision system with overlay calibration
US11535154B2 (en) 2015-04-21 2022-12-27 Magna Electronics Inc. Method for calibrating a vehicular vision system
US10819943B2 (en) 2015-05-07 2020-10-27 Magna Electronics Inc. Vehicle vision system with incident recording function
US11483514B2 (en) 2015-05-07 2022-10-25 Magna Electronics Inc. Vehicular vision system with incident recording function
US11104327B2 (en) 2015-07-13 2021-08-31 Magna Electronics Inc. Method for automated parking of a vehicle
US10214206B2 (en) 2015-07-13 2019-02-26 Magna Electronics Inc. Parking assist system for vehicle
US10078789B2 (en) 2015-07-17 2018-09-18 Magna Electronics Inc. Vehicle parking assist system with vision-based parking space detection
US11673605B2 (en) 2015-08-18 2023-06-13 Magna Electronics Inc. Vehicular driving assist system
US10870449B2 (en) 2015-08-18 2020-12-22 Magna Electronics Inc. Vehicular trailering system
US10086870B2 (en) 2015-08-18 2018-10-02 Magna Electronics Inc. Trailer parking assist system for vehicle
US10875403B2 (en) 2015-10-27 2020-12-29 Magna Electronics Inc. Vehicle vision system with enhanced night vision
US10889293B2 (en) 2015-11-23 2021-01-12 Magna Electronics Inc. Vehicular control system for emergency handling
US11618442B2 (en) 2015-11-23 2023-04-04 Magna Electronics Inc. Vehicle control system for emergency handling
US10144419B2 (en) 2015-11-23 2018-12-04 Magna Electronics Inc. Vehicle dynamic control system for emergency handling
US11277558B2 (en) 2016-02-01 2022-03-15 Magna Electronics Inc. Vehicle vision system with master-slave camera configuration
US11708025B2 (en) 2016-02-02 2023-07-25 Magna Electronics Inc. Vehicle vision system with smart camera video output
US11433809B2 (en) 2016-02-02 2022-09-06 Magna Electronics Inc. Vehicle vision system with smart camera video output
US10160437B2 (en) 2016-02-29 2018-12-25 Magna Electronics Inc. Vehicle control system with reverse assist
US10773707B2 (en) 2016-02-29 2020-09-15 Magna Electronics Inc. Vehicle control system with reverse assist
US11400919B2 (en) 2016-03-02 2022-08-02 Magna Electronics Inc. Vehicle vision system with autonomous parking function
US10132971B2 (en) 2016-03-04 2018-11-20 Magna Electronics Inc. Vehicle camera with multiple spectral filters
US10055651B2 (en) 2016-03-08 2018-08-21 Magna Electronics Inc. Vehicle vision system with enhanced lane tracking
US11288890B2 (en) 2016-03-08 2022-03-29 Magna Electronics Inc. Vehicular driving assist system
US11756316B2 (en) 2016-03-08 2023-09-12 Magna Electronics Inc. Vehicular lane keeping system
US10685243B2 (en) 2016-03-08 2020-06-16 Magna Electronics Inc. Vehicular driver assist system
US10529085B2 (en) * 2018-03-30 2020-01-07 Samsung Electronics Co., Ltd. Hardware disparity evaluation for stereo matching
US11960639B2 (en) 2021-08-29 2024-04-16 Mine One Gmbh Virtual 3D methods, systems and software

Also Published As

Publication number Publication date
US6125198A (en) 2000-09-26
JPH08294143A (en) 1996-11-05
EP0738872A2 (en) 1996-10-23
EP0738872A3 (en) 1999-07-14
CA2174590A1 (en) 1996-10-22
EP0738872B1 (en) 2002-11-06
DE69624614T2 (en) 2003-07-03
CA2174590C (en) 2000-02-08
JP3539788B2 (en) 2004-07-07
DE69624614D1 (en) 2002-12-12

Similar Documents

Publication Publication Date Title
US5867591A (en) Method of matching stereo images and method of measuring disparity between these image
US5719954A (en) Stereo matching method and disparity measuring method
US5825915A (en) Object detecting apparatus in which the position of a planar object is estimated by using hough transform
US7349581B2 (en) System and method for detecting obstacle
Wang et al. Lane detection using Catmull-Rom spline
US20060013438A1 (en) Obstacle detection apparatus and a method therefor
US20050008220A1 (en) Method, apparatus, and program for processing stereo image
KR101030317B1 (en) Apparatus for tracking obstacle using stereo vision and method thereof
JPH09297849A (en) Vehicle detector
JP3384278B2 (en) Distance measuring device
JP2807137B2 (en) 3D shape detection method
JPH11223516A (en) Three dimensional image pickup device
JPS60171410A (en) Stereoscopic processor
JP2006047252A (en) Image processing unit
JPH06101024B2 (en) Obstacle detection device
JPH1073432A (en) Detection method of existing range of object by image
JPH03122510A (en) Parallax detection

Legal Events

Date Code Title Description
AS Assignment

Owner name: MATSUSHITA ELECTRIC INDUSTRIAL CO., LTD., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ONDA, KATSUMASA;REEL/FRAME:008017/0235

Effective date: 19960404

FEPP Fee payment procedure

Free format text: PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

CC Certificate of correction
FPAY Fee payment

Year of fee payment: 4

FPAY Fee payment

Year of fee payment: 8

REMI Maintenance fee reminder mailed
LAPS Lapse for failure to pay maintenance fees
STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20110202