US20020191078A1 - Monitoring system - Google Patents

Monitoring system

Info

Publication number
US20020191078A1
US20020191078A1 US10/173,316 US17331602A US2002191078A1 US 20020191078 A1 US20020191078 A1 US 20020191078A1 US 17331602 A US17331602 A US 17331602A US 2002191078 A1 US2002191078 A1 US 2002191078A1
Authority
US
United States
Prior art keywords
image
vehicle
camera
center
center line
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/173,316
Inventor
Shusaku Okamoto
Masamichi Nakagawa
Takashi Yoshida
Atsushi Iisaka
Atsushi Morimura
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Panasonic Intellectual Property Management Co Ltd
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Family has litigation
First worldwide family litigation filed litigation Critical https://patents.darts-ip.com/?family=19022775&utm_source=google_patent&utm_medium=platform_link&utm_campaign=public_patent_search&patent=US20020191078(A1) "Global patent litigation dataset" by Darts-ip is licensed under a Creative Commons Attribution 4.0 International License.
Application filed by Individual filed Critical Individual
Assigned to MATSUSHITA ELECTRIC INDUSTRIAL CO., LTD reassignment MATSUSHITA ELECTRIC INDUSTRIAL CO., LTD ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: IISAKA, ATSUSHI, MORIMURA, ATSUSHI, NAKAGAWA, MASAMICHI, OKAMOTO, SHUSAKU, YOSHIDA, TAKASHI
Publication of US20020191078A1 publication Critical patent/US20020191078A1/en
Assigned to PANASONIC CORPORATION reassignment PANASONIC CORPORATION CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: MATSUSHITA ELECTRIC INDUSTRIAL CO., LTD.
Priority to US12/369,979 priority Critical patent/US8201199B2/en
Priority to US13/468,661 priority patent/US20120218413A1/en
Assigned to PANASONIC INTELLECTUAL PROPERTY MANAGEMENT CO., LTD. reassignment PANASONIC INTELLECTUAL PROPERTY MANAGEMENT CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: PANASONIC CORPORATION
Assigned to PANASONIC INTELLECTUAL PROPERTY MANAGEMENT CO., LTD. reassignment PANASONIC INTELLECTUAL PROPERTY MANAGEMENT CO., LTD. CORRECTIVE ASSIGNMENT TO CORRECT THE ERRONEOUSLY FILED APPLICATION NUMBERS 13/384239, 13/498734, 14/116681 AND 14/301144 PREVIOUSLY RECORDED ON REEL 034194 FRAME 0143. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT. Assignors: PANASONIC CORPORATION
Abandoned legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/10Geometric effects
    • G06T15/20Perspective computation
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • B60R1/26Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view to the rear of the vehicle
    • G06T5/80
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/20Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of display used
    • B60R2300/207Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of display used using multi-purpose displays, e.g. camera image and navigation or video on same display
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/30Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/60Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/80Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
    • B60R2300/8066Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for monitoring rearward traffic

Definitions

  • the present invention relates to a technique for a monitoring system for providing a safe driving environment by displaying a processed image of the state around a vehicle taken by a camera installed on the vehicle.
  • FIG. 8A is a schematic diagram of a camera installation mode
  • FIG. 8B shows an exemplified camera image obtained in the camera installation mode of FIG. 8A
  • FIG. 8C shows an image obtained when the vehicle is moved straight backward by 25 cm from the position where the image of FIG. 8B is obtained.
  • a dashed line A extending in the center indicates how the center of the vehicle behind approaches on the screen while the vehicle is moving straight backward
  • solid lines B1 and B2 on the left and right hand sides indicate how the left and right corners of the vehicle behind approach on the screen while the vehicle is moving straight backward.
  • the system is very useful for a user. This is because an object moving vertically on the screen can be identified as an object present on an extended center line of the vehicle, and hence, the moving direction of the vehicle can be grasped intuitively on the basis of the displayed image.
  • a license plate, a rear windshield wiper, a locking mechanism for a trunk room, a spare tire and the like are generally placed around the center of the rear portion of a vehicle, and hence it may be difficult to secure a place for installing the camera.
  • the position of the camera may be shifted so that a specified direction or region can be easily taken.
  • the position of the camera may be shifted toward the passenger's seat side so that a larger region on the passenger's seat side, which is difficult for the driver to see from the driver's seat, can be taken.
  • FIG. 9A shows an exemplified camera image obtained in the camera installation mode of FIG. 2
  • FIG. 9B shows an image obtained when the vehicle is moved straight backward by approximately 25 cm from the position where the image of FIG. 9A is taken.
  • the dashed line A and the solid lines B1 and B2 have the same meanings as in FIG. 8B.
  • FIG. 10A shows an exemplified camera image taken in the camera installation mode of FIG. 6, and FIG. 10B shows an image obtained when the vehicle is moved straight backward by approximately 25 cm from the position where the image of FIG. 10A is taken. Also in FIG. 10A, the dashed line A and the solid lines B1 and B2 have the same meanings as in FIGS. 8B and 9A.
  • An object of the invention is to provide a rear image which does not give an odd feeling to a user even when a camera for taking an image of a state behind a vehicle is installed in a position shifted from the rear center of the vehicle.
  • the monitoring system of this invention comprises a camera for taking an image of a state behind a vehicle, installed in a rear portion of the vehicle in a position shifted from a center line along the lengthwise direction of the vehicle; and an image processing unit for receiving a camera image of the camera and generating a rear image of the state behind the vehicle on the basis of the camera image, and the image processing unit performs image processing for allowing a vertical center line of the rear image to substantially accord with the center line along the lengthwise direction of the vehicle.
  • the center line along the lengthwise direction of the vehicle substantially accords with the vertical center line of the displayed image. Therefore, when the vehicle is moved straight backward, an object present on the center line of the vehicle is moved vertically in substantially the center of the screen, and hence, this image does not give an odd feeling to a user.
  • the rear image as if it were taken by a camera installed in the rear center of the vehicle and facing to the straight backward direction can be presented to the user.
  • the restriction in the installation position and the facing direction of the camera of the monitoring system can be reduced, so as to increase the freedom in the camera installation.
  • the image processing unit of the monitoring system of this invention preferably performs parallel shifting processing at least on the camera image. Furthermore, the image processing unit preferably performs lens distortion correcting processing on the camera image.
  • the image processing unit of the monitoring system of this invention preferably performs processing for converting the camera image into an image seen from a virtual viewpoint set above the rear center of the vehicle.
  • FIG. 1 is a block diagram for showing an exemplified structure of a monitoring system according to the invention
  • FIG. 2 is a diagram for showing an example of a camera installation mode
  • FIGS. 3A, 3B and 3C are diagrams of exemplified images for explaining image processing according to Embodiment 1 of the invention.
  • FIGS. 4A, 4B, 4C and 4D are diagrams of exemplified images for explaining image processing according to Embodiment 2 of the invention.
  • FIGS. 5A, 5B and 5C are diagrams for showing virtual viewpoints employed in FIGS. 4B through 4D and 7B through 7D, respectively;
  • FIG. 6 is a diagram for showing another camera installation mode
  • FIGS. 7A, 7B, 7C and 7D are diagrams of exemplified images for explaining another image processing of Embodiment 2;
  • FIG. 8A is a diagram of a conventional camera installation mode and FIGS. 8B and 8C are diagrams of exemplified images obtained in the camera installation mode of FIG. 8A;
  • FIGS. 9A and 9B are diagrams of exemplified conventional images obtained in the camera installation mode of FIG. 2;
  • FIGS. 10A and 10B are diagrams of exemplified conventional images obtained in the camera installation mode of FIG. 6;
  • FIGS. 11A, 11B, 11C and 11D are diagrams of exemplified images for explaining another image processing of Embodiment 1.
  • FIG. 1 is a block diagram for showing the structure of the monitoring system of this invention.
  • an image processing unit 20 receives, as inputs, a plurality of camera images output from camera means 10 including cameras 11 , so as to generate a new image by transforming and synthesizing the input camera images. This synthesized image is displayed by a display device 30 .
  • the image processing unit 20 corresponds to an image processor of this invention.
  • the display device 30 of this invention is typically a liquid crystal display, and may be another display device such as a plasma display. Also, the display device of this invention may also be used as a vehicle-installed GPS terminal display (namely, a display of what is called a car navigation system) or may be separately prepared.
  • the camera means 10 is a color or monochrome digital camera typically including a solid state image sensor such as a CCD or a CMOS device.
  • the camera means 10 may include a combination of a lens and a prism or a mirror, so that light incident on the lens and the prism or the mirror is transferred through a predetermined optical path to a camera device disposed away from the camera means 10.
  • the image processing unit 20 transforms/synthesizes a camera image of at least one camera, so as to generate a synthesized image in which the input image is shifted laterally or a synthesized image as if it were vertically or obliquely looked down from above the vehicle.
  • image transforming processing and synthesizing processing of partial images obtained by cutting out necessary areas of the transformed images (including processing such as smoothing boundaries between the partial images (hereinafter referred to as the boundary processing) if a plurality of camera images are used).
  • the structure of FIG. 1 includes a mapping table referring unit 21 , and a mapping table is used for processing the camera images in one step.
  • An image synthesizing unit 22 receives the camera images from the cameras 1 through N and processes these camera images.
  • the processing performed at this point comprises (1) processing for transforming and cutting the images and (2) processing for synthesizing the cut partial images (including the boundary processing).
  • the processing (1) and (2) may be performed separately, or all or part of it may be performed in one step.
  • the mapping table is included for performing the processing of the camera images in one step.
  • a “mapping table” means a table in which the corresponding relationships between pixels of a synthesized image and pixel data of the respective camera images are described, and is used for rapidly generating a synthesized image. When such a mapping table is previously built through calculation using geometric conversion or manual operations, a desired synthesized image can be rapidly generated.
  • the mapping table is specifically stored in, for example, a ROM (including a writable erasable ROM such as an EEPROM) or a RAM.
  • mapping data obtained through calculation by a processor included in the image processing unit may be written in a ROM or a RAM, or mapping table data provided as firmware may be written in a RAM or a ROM by using data transfer means such as a communication line and a disk drive.
  • In Embodiment 1, an example of processing employed in a camera installation mode in which a camera 2 facing to a straight backward direction of a vehicle 1 is installed in a position laterally shifted from the rear center of the vehicle 1, as shown in FIG. 2, will be described. The contents of this processing will now be described by using a camera image and a display screen mode resulting from the processing.
  • FIG. 3A is a camera image taken in the camera installation mode of FIG. 2 and FIG. 3B is a rear image obtained by shifting, in a leftward direction, merely a rectangular area RC1 surrounded with a broken line in the image of FIG. 3A for the purpose of overcoming difficulty in grasping the image due to the lateral shift of the camera position.
  • the rectangular area RC1 is set so that the vertical center line of the rectangular area can substantially accord with an actual center line CL1 along the lengthwise direction of the vehicle 1 (indicated by a solid dashed line in the drawings).
  • the vertical center line of the rear image on the monitor screen can be made to substantially accord with the center line CL1 of the vehicle merely by laterally shifting the camera image.
  • the center line CL1 of the vehicle 1 can be made to move smoothly downward in substantially the center of the screen. Therefore, when a driver reverses the vehicle toward a target, it can be easily checked whether or not the vehicle is reversing toward the target on the basis of the movement in the center of the screen.
  • a center line of the vehicle is drawn on the road behind the vehicle.
  • the length of the line depends upon the rear visual range of the camera; a length of about 5 m suffices.
  • the state of the center line is taken by the camera, so as to calculate the horizontal distance from the center of the image to the average position of the imaged center line. This distance corresponds to the extent of the image shift.
  • the travel locus of the center of the vehicle is a straight line (as shown in FIG. 8B).
  • the travel locus is a curve due to the lens distortion. Since the lens distortion is larger in an area farther from the center of the image, the curvature is larger as the shift of the installation position is larger. This curvature cannot be corrected merely by the lateral shift of the image.
  • the lens distortion can be corrected by two-dimensional image transformation for moving positions of respective pixels of the image in accordance with the characteristic of the lens of the camera. For example, a square lattice pattern is previously taken by a camera so as to measure how the respective lattice points are transformed by the lens distortion, and this transformation is reversely corrected. Thus, the lens distortion can be corrected.
  • FIG. 3C shows an image obtained from the image of FIG. 3A by correcting the lens distortion and correcting the positional shift of the camera through the lateral shift of the image.
  • an object present on the center line of the vehicle moves straight in the vertical direction in the center of the screen. Therefore, the moving direction of the vehicle and the positional relationship with an object around the vehicle can be more easily grasped.
  • Embodiment 2 corresponds to an aspect in which a user-friendly image can be generated even in these camera installation modes. Now, the contents of processing of Embodiment 2 will be described by using a camera image and a display screen mode resulting from the processing.
  • FIG. 4A shows camera images taken in the camera installation mode of FIG. 2, and FIGS. 4B through 4D show synthesized images as if they were seen from a virtual viewpoint obtained by subjecting the images of FIG. 4A to viewpoint converting processing for the purpose of overcoming difficulty in grasping the image due to the lateral shift of the camera position.
  • FIGS. 5A through 5C show the virtual viewpoints employed in the rear images of FIGS. 4B through 4D, respectively. Specifically, the virtual viewpoint is in a position above the rear center of the vehicle and the camera faces to a straight backward direction of the vehicle at an angle of 30 degrees against the road surface in FIG. 5A;
  • the virtual viewpoint is in a position above the rear center of the vehicle and the camera faces to a straight backward direction of the vehicle at an angle of 60 degrees against the road surface in FIG. 5B; and the virtual viewpoint is in a position above the rear center of the vehicle and the camera faces straight downward in FIG. 5C.
  • Each of the images of FIGS. 4B through 4D is processed through the virtual viewpoint converting processing so that the center line CL1 along the lengthwise direction of the vehicle on the road surface can accord with the vertical center line of the screen.
  • when the vehicle is moved straight backward by 25 cm, an object on the center line CL1 of the vehicle moves straight downward in the vertical direction on the screen.
  • the virtual viewpoint can be given an angle of depression independently of the angle of depression of the actually used camera. Therefore, the positional relationship with a feature on a road surface (such as a white line) can be more easily grasped by generating an image as if it were looked down from further above.
  • FIG. 7A shows camera images taken in the camera installation mode of FIG. 6, and FIGS. 7B through 7D show synthesized images as if they were seen from a virtual viewpoint obtained by subjecting the images of FIG. 7A to the viewpoint converting processing for the purpose of overcoming difficulty in grasping the image due to the lateral shift of the camera position.
  • the virtual viewpoints employed in the images of FIGS. 7B through 7D also correspond to those shown in FIGS. 5A through 5C, respectively.
  • FIGS. 11A through 11D exemplify a case where the image processing according to Embodiment 1 is applied in the camera installation mode shown in FIG. 6.
  • FIG. 11B is an image obtained by shifting, in a leftward direction, the rectangular area in the image of FIG. 11A, which is an original camera image
  • FIG. 11C is an image obtained by correcting the lens distortion of the image of FIG. 11B
  • FIG. 11D is an image obtained by rotating the image of FIG. 11C so as to align the direction of the center line with the vertical direction of the image.
  • the image remains unbalanced on the right and left sides, which might be the limit in the two-dimensional image processing.
  • a more natural image as shown in FIG. 7D can be generated by the virtual viewpoint conversion.
  • a mapping table corresponding to an image to be displayed may be provided, or a mapping table may be automatically built in accordance with the situation.
  • a vehicle of this invention includes an ordinary car, a light car, a truck, a bus and the like.
  • the present invention is very effective for a vehicle in which a camera cannot be installed in the rear center because of a spare tire or the like placed in the rear, or for design reasons.
  • a special vehicle such as a crane truck or an excavator may be a vehicle of this invention as long as the technical idea of the invention is applicable.

Abstract

A camera for taking an image of the state behind a vehicle is installed in a position laterally shifted from the rear center of the vehicle. An image processing unit generates a rear image from a camera image by shifting merely a rectangular area of the camera image so that a vertical center line thereof can substantially accord with the center line along the lengthwise direction of the vehicle. Furthermore, processing for correcting lens distortion may be performed. As a result, when the vehicle is moved straight backward, an object present on the center line of the vehicle moves vertically in substantially the center of the screen, so that the user is prevented from having an odd feeling when seeing the image.

Description

    BACKGROUND OF THE INVENTION
  • The present invention relates to a technique for a monitoring system for providing a safe driving environment by displaying a processed image of the state around a vehicle taken by a camera installed on the vehicle. [0001]
  • As an apparatus for monitoring the state around a vehicle by using a camera, a system in which an image taken by a camera installed on a rear trunk room or the like of a vehicle is presented to a driver is conventionally known. Thus, the driver can be informed of the state in the rear of the vehicle. Furthermore, a system for supporting a parking operation in which not only a camera image but also possible travel loci of tires overlapping the camera image are presented has recently been known. Thus, the driver can grasp the state ahead in a moving direction of the vehicle. [0002]
  • A conventional system will be described with reference to FIGS. 8A through 8C. FIG. 8A is a schematic diagram of a camera installation mode, FIG. 8B shows an exemplified camera image obtained in the camera installation mode of FIG. 8A, and FIG. 8C shows an image obtained when the vehicle is moved straight backward by 25 cm from the position where the image of FIG. 8B is obtained. In the image of FIG. 8B, another vehicle is parked just behind the vehicle, a dashed line A extending in the center indicates how the center of the vehicle behind approaches on the screen while the vehicle is moving straight backward, and solid lines B1 and B2 on the left and right hand sides indicate how the left and right corners of the vehicle behind approach on the screen while the vehicle is moving straight backward. [0003]
  • In the example shown in FIGS. 8A through 8C, when the vehicle is moved straight backward, an object present just behind the vehicle (i.e., another vehicle in this case) gets close to the vehicle in the downward direction vertically on the screen. Therefore, it can be easily grasped whether or not the center of the vehicle corresponds to the center of the object. [0004]
  • If the camera is installed so as to face to substantially the same direction as the straight reversing direction of the vehicle and to be positioned substantially at the center of the rear portion of the vehicle in the aforementioned conventional monitoring system, the system is very useful for a user. This is because an object moving vertically on the screen can be identified as an object present on an extended center line of the vehicle, and hence, the moving direction of the vehicle can be grasped intuitively on the basis of the displayed image. [0005]
  • However, if the camera is not installed at substantially the center of the rear portion of the vehicle, this conventional system has a problem. [0006]
  • For example, a license plate, a rear windshield wiper, a locking mechanism for a trunk room, a spare tire and the like are generally placed around the center of the rear portion of a vehicle, and hence it may be difficult to secure a place for installing the camera. Also, the position of the camera may be shifted so that a specified direction or region can be easily taken. For example, the position of the camera may be shifted toward the passenger's seat side so that a larger region on the passenger's seat side, which is difficult for the driver to see from the driver's seat, can be taken. [0007]
  • In another camera installation mode shown in FIG. 2, the camera faces to the straight backward direction but its position is laterally shifted by approximately 50 cm from the center of the rear portion of the vehicle. FIG. 9A shows an exemplified camera image obtained in the camera installation mode of FIG. 2, and FIG. 9B shows an image obtained when the vehicle is moved straight backward by approximately 25 cm from the position where the image of FIG. 9A is taken. Also in FIG. 9A, the dashed line A and the solid lines B1 and B2 have the same meanings as in FIG. 8B. [0008]
  • In this case, another vehicle present just behind the vehicle is imaged in an area close to the edge of the screen, and hence, it is difficult to grasp, on the screen, the positional relationship between the vehicle and the other vehicle behind. Also, when the vehicle is moved straight backward, the other vehicle behind approaches not in the vertical direction but in an oblique direction on the screen. [0009]
  • In still another camera installation mode shown in FIG. 6, the camera faces to a direction slightly toward the center line rather than the straight backward direction and its position is laterally shifted by approximately 50 cm from the center of the rear portion of the vehicle. FIG. 10A shows an exemplified camera image taken in the camera installation mode of FIG. 6, and FIG. 10B shows an image obtained when the vehicle is moved straight backward by approximately 25 cm from the position where the image of FIG. 10A is taken. Also in FIG. 10A, the dashed line A and the solid lines B1 and B2 have the same meanings as in FIGS. 8B and 9A. [0010]
  • In the case where the position of the camera is shifted from the rear center of the vehicle, if the camera faces to the straight backward direction of the vehicle as in FIG. 2, the visual range in a side rear region of the vehicle opposite to the camera installation position is small as is understood from FIG. 9A. In order to obtain a well-balanced image on the right and left sides of the vehicle, the camera should face to a direction toward the center line of the vehicle. When the images of FIGS. 10A and 9A are compared, it is understood that a left rear region of the vehicle is shown more largely in FIG. 10A. [0011]
  • In this case, when the vehicle is moved straight backward, another vehicle present just behind the vehicle approaches in a more oblique direction on the screen than in FIG. 9A. [0012]
  • In this manner, in the conventional system, an object actually present just behind a vehicle is imaged in an area close to the edge of the screen or moves in an oblique direction on the screen when the vehicle is moved straight backward. Therefore, when such an image is presented to a user, the user may have an odd feeling when seeing the image, which makes it difficult to check whether or not the vehicle is moving straight backward or whether or not the center of the vehicle corresponds to that of a target. As a result, there is a fear of a driving operation error. [0013]
  • SUMMARY OF THE INVENTION
  • An object of the invention is to provide a rear image which does not give an odd feeling to a user even when a camera for taking an image of a state behind a vehicle is installed in a position shifted from the rear center of the vehicle. [0014]
  • Specifically, the monitoring system of this invention comprises a camera for taking an image of a state behind a vehicle, installed in a rear portion of the vehicle in a position shifted from a center line along the lengthwise direction of the vehicle; and an image processing unit for receiving a camera image of the camera and generating a rear image of the state behind the vehicle on the basis of the camera image, and the image processing unit performs image processing for allowing a vertical center line of the rear image to substantially accord with the center line along the lengthwise direction of the vehicle. [0015]
  • According to the invention, in the rear image displayed on a display device, the center line along the lengthwise direction of the vehicle substantially accords with the vertical center line of the displayed image. Therefore, when the vehicle is moved straight backward, an object present on the center line of the vehicle is moved vertically in substantially the center of the screen, and hence, this image does not give an odd feeling to a user. In other words, even when the camera is installed in a position laterally shifted from the rear center of the vehicle, the rear image as if it were taken by a camera installed in the rear center of the vehicle and facing to the straight backward direction can be presented to the user. As a result, the restriction in the installation position and the facing direction of the camera of the monitoring system can be reduced, so as to increase the freedom in the camera installation. [0016]
  • The image processing unit of the monitoring system of this invention preferably performs parallel shifting processing at least on the camera image. Furthermore, the image processing unit preferably performs lens distortion correcting processing on the camera image. [0017]
  • Moreover, the image processing unit of the monitoring system of this invention preferably performs processing for converting the camera image into an image seen from a virtual viewpoint set above the rear center of the vehicle.[0018]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram for showing an exemplified structure of a monitoring system according to the invention; [0019]
  • FIG. 2 is a diagram for showing an example of a camera installation mode; [0020]
  • FIGS. 3A, 3B and 3C are diagrams of exemplified images for explaining image processing according to Embodiment 1 of the invention; [0021]
  • FIGS. 4A, 4B, 4C and 4D are diagrams of exemplified images for explaining image processing according to Embodiment 2 of the invention; [0022]
  • FIGS. 5A, 5B and 5C are diagrams for showing virtual viewpoints employed in FIGS. 4B through 4D and 7B through 7D, respectively; [0023]
  • FIG. 6 is a diagram for showing another camera installation mode; [0024]
  • FIGS. 7A, 7B, 7C and 7D are diagrams of exemplified images for explaining another image processing of Embodiment 2; [0025]
  • FIG. 8A is a diagram of a conventional camera installation mode and FIGS. 8B and 8C are diagrams of exemplified images obtained in the camera installation mode of FIG. 8A; [0026]
  • FIGS. 9A and 9B are diagrams of exemplified conventional images obtained in the camera installation mode of FIG. 2; [0027]
  • FIGS. 10A and 10B are diagrams of exemplified conventional images obtained in the camera installation mode of FIG. 6; and [0028]
  • FIGS. 11A, 11B, 11C and 11D are diagrams of exemplified images for explaining another image processing of Embodiment 1. [0029]
  • DETAILED DESCRIPTION OF THE INVENTION
  • Preferred embodiments of the invention will now be described with reference to the accompanying drawings. First, the entire structure of a monitoring system of this invention necessary for practicing each embodiment will be described, and thereafter, a variety of examples of display screen modes will be described in detail. [0030]
  • FIG. 1 is a block diagram for showing the structure of the monitoring system of this invention. In the monitoring system of FIG. 1, an image processing unit 20 receives, as inputs, a plurality of camera images output from camera means 10 including cameras 11, so as to generate a new image by transforming and synthesizing the input camera images. This synthesized image is displayed by a display device 30. The image processing unit 20 corresponds to an image processor of this invention. [0031]
  • The display device 30 of this invention is typically a liquid crystal display, and may be another display device such as a plasma display. Also, the display device of this invention may also be used as a vehicle-installed GPS terminal display (namely, a display of what is called a car navigation system) or may be separately prepared. [0032]
  • The camera means 10 is a color or monochrome digital camera typically including a solid-state image sensor such as a CCD or a CMOS device. Alternatively, the camera means 10 may include a combination of a lens and a prism or a mirror, so that light incident on the lens and the prism or the mirror is transferred through a predetermined optical path to a camera device disposed away from the camera means 10. [0033]
  • The image processing unit 20 transforms/synthesizes a camera image of at least one camera, so as to generate a synthesized image in which the input image is shifted laterally or a synthesized image as if it were vertically or obliquely looked down from above the vehicle. In order to generate the synthesized image, it is necessary to perform image transforming processing and synthesizing processing of partial images obtained by cutting out necessary areas of the transformed images (including processing such as smoothing boundaries between the partial images (hereinafter referred to as the boundary processing) if a plurality of camera images are used). For this purpose, the structure of FIG. 1 includes a mapping table referring unit 21, and a mapping table is used for processing the camera images in one step. [0034]
  • An image synthesizing unit 22 receives the camera images from the cameras 1 through N and processes these camera images. The processing performed at this point comprises (1) processing for transforming and cutting the images and (2) processing for synthesizing the cut partial images (including the boundary processing). The processing (1) and (2) may be performed separately, or all or part of it may be performed in one step. In the structure of FIG. 1, the mapping table is included for performing the processing of the camera images in one step. [0035]
  • A “mapping table” means a table in which the corresponding relationships between pixels of a synthesized image and pixel data of the respective camera images are described, and is used for rapidly generating a synthesized image. When such a mapping table is previously built through calculation using geometric conversion or manual operations, a desired synthesized image can be rapidly generated. [0036]
  • The mapping table is specifically stored in, for example, a ROM (including a writable erasable ROM such as an EEPROM) or a RAM. For storing the mapping table, mapping data obtained through calculation by a processor included in the image processing unit may be written in a ROM or a RAM, or mapping table data provided as firmware may be written in a RAM or a ROM by using data transfer means such as a communication line and a disk drive. [0037]
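For readers who want a concrete picture of how such a table is consulted, the sketch below shows a per-pixel lookup in Python with NumPy. It assumes the table has already been built offline (by geometric calculation or by hand, as described above); the function name, array names, and table layout are illustrative assumptions, not details taken from the patent.

```python
import numpy as np

def apply_mapping_table(camera_images, map_cam, map_u, map_v):
    """Generate a synthesized image by per-pixel lookup into the camera images.

    camera_images : list of H x W x 3 uint8 arrays, one per camera
    map_cam       : H' x W' int array; which camera each output pixel comes from
    map_u, map_v  : H' x W' int arrays; source column/row in that camera image
    (The three arrays together play the role of the precomputed mapping table.)
    """
    out_h, out_w = map_cam.shape
    synthesized = np.zeros((out_h, out_w, 3), dtype=np.uint8)
    for y in range(out_h):
        for x in range(out_w):
            cam = map_cam[y, x]
            synthesized[y, x] = camera_images[cam][map_v[y, x], map_u[y, x]]
    return synthesized
```

Because every output pixel reduces to a single table lookup, the transformation and synthesis are effectively performed in one step, which is the point of precomputing the table.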
  • Now, examples of a variety of display screen modes according to this invention will be described in detail. [0038]
  • [0039] Embodiment 1
  • In Embodiment 1, an example of processing employed in a camera installation mode in which a camera 2 facing to a straight backward direction of a vehicle 1 is installed in a position laterally shifted from the rear center of the vehicle 1, as shown in FIG. 2, will be described. The contents of this processing will now be described by using a camera image and a display screen mode resulting from the processing. [0040]
  • FIG. 3A is a camera image taken in the camera installation mode of FIG. 2 and FIG. 3B is a rear image obtained by shifting, in a leftward direction, merely a rectangular area RC1 surrounded with a broken line in the image of FIG. 3A for the purpose of overcoming difficulty in grasping the image due to the lateral shift of the camera position. The rectangular area RC1 is set so that the vertical center line of the rectangular area can substantially accord with an actual center line CL1 along the lengthwise direction of the vehicle 1 (indicated by a solid dashed line in the drawings). [0041]
  • In this case, since merely the position of the camera 2 is slightly shifted from the center of the vehicle 1, when the vehicle 1 is moved straight backward, an object present just behind the vehicle on the center line CL1 moves downward in a substantially vertical direction on the screen. However, owing to the shift of the installation position of the camera 2, the center line CL1 of the vehicle on the road surface is rather shifted in a rightward direction on the monitor screen, and this rightward shift is corrected by cutting out the rectangular area from the original image and shifting it in the leftward direction. [0042]
  • As is understood from the image of FIG. 3B, the vertical center line of the rear image on the monitor screen can be made to substantially accord with the center line CL1 of the vehicle merely by laterally shifting the camera image. Thus, when the vehicle is moved straight backward, at least an object present on the center line CL1 of the vehicle 1 can be made to move smoothly downward in substantially the center of the screen. Therefore, when a driver reverses the vehicle toward a target, it can be easily checked whether or not the vehicle is reversing toward the target on the basis of the movement in the center of the screen. [0043]
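As an illustration of the shift just described, the following sketch cuts out a rectangular area and pastes it shifted leftward into an otherwise empty rear image. It is a minimal example under the assumption that the bounds of RC1 and the shift amount are known in pixels; the function and parameter names are hypothetical.

```python
import numpy as np

def shift_rear_image(camera_image, rc_left, rc_right, shift_px):
    """Build a rear image by shifting only the rectangular area RC1 leftward.

    camera_image      : H x W x 3 uint8 array from the laterally offset camera
    rc_left, rc_right : column bounds (pixels) of the rectangular area RC1
    shift_px          : leftward shift (pixels) that brings the vertical center
                        of RC1 onto the vertical center line of the rear image
    """
    h, w = camera_image.shape[:2]
    rear = np.zeros_like(camera_image)     # the margin left uncovered stays black
    region = camera_image[:, rc_left:rc_right]
    dst_left = rc_left - shift_px          # new left edge of RC1 after the shift
    assert 0 <= dst_left and dst_left + (rc_right - rc_left) <= w
    rear[:, dst_left:dst_left + (rc_right - rc_left)] = region
    return rear
```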
  • Next, an example of the simplest method for estimating an appropriate extent of the shift of the image will be described. First, a center line of the vehicle is drawn on the road behind the vehicle. The length of the line depends upon the rear visual range of the camera; a length of about 5 m suffices. Then, the state of the center line is taken by the camera, so as to calculate the horizontal distance from the center of the image to the average position of the imaged center line. This distance corresponds to the extent of the image shift. [0044]
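A sketch of that estimation step is given below: given the pixel columns at which the painted center line was detected in the camera image, the shift is simply the horizontal offset of their mean from the image center. Detection of the painted line itself is assumed to be done separately, and the names used here are illustrative.

```python
import numpy as np

def estimate_shift_px(line_pixel_columns, image_width):
    """Estimate the lateral image shift from an image of a line painted on the
    road along the center line of the vehicle.

    line_pixel_columns : x (column) coordinates of pixels detected on the
                         painted center line in the camera image
    image_width        : width of the camera image in pixels
    """
    image_center_x = image_width / 2.0
    line_mean_x = float(np.mean(line_pixel_columns))
    # A positive value means the imaged center line lies to the right of the
    # image center, so the picture must be shifted leftward by this amount.
    return line_mean_x - image_center_x
```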
  • On the other hand, in the case where the camera is installed in the rear center of the vehicle, the travel locus of the center of the vehicle is a straight line (as shown in FIG. 8B). However, in the case where the installation position of the camera is shifted from the center, the travel locus is a curve due to the lens distortion. Since the lens distortion is larger in an area farther from the center of the image, the curvature is larger as the shift of the installation position is larger. This curvature cannot be corrected merely by the lateral shift of the image. [0045]
  • The lens distortion can be corrected by two-dimensional image transformation for moving positions of respective pixels of the image in accordance with the characteristic of the lens of the camera. For example, a square lattice pattern is previously taken by a camera so as to measure how the respective lattice points are transformed by the lens distortion, and this transformation is reversely corrected. Thus, the lens distortion can be corrected. [0046]
  • FIG. 3C shows an image obtained from the image of FIG. 3A by correcting the lens distortion and correcting the positional shift of the camera through the lateral shift of the image. In the image as shown in FIG. 3C, when the vehicle is moved backward, an object present on the center line of the vehicle moves straight in the vertical direction in the center of the screen. Therefore, the moving direction of the vehicle and the positional relationship with an object around the vehicle can be more easily grasped. [0047]
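The lattice-based correction described above can be approximated with OpenCV's standard camera calibration, using a chessboard as the square lattice pattern. The sketch below is one possible realization, not the patent's implementation; the pattern size and square spacing are assumptions about whatever calibration target is used.

```python
import cv2
import numpy as np

def correct_lens_distortion(camera_image, lattice_images, pattern_size=(9, 6), square=0.05):
    """Correct lens distortion using images of a square lattice (chessboard).

    The lattice is imaged beforehand, the displacement of its points caused by
    the lens is measured, and that transformation is applied in reverse to
    every pixel of the camera image.
    """
    # Ideal (undistorted) lattice point coordinates on the flat target.
    obj = np.zeros((pattern_size[0] * pattern_size[1], 3), np.float32)
    obj[:, :2] = np.mgrid[0:pattern_size[0], 0:pattern_size[1]].T.reshape(-1, 2) * square

    obj_points, img_points = [], []
    for img in lattice_images:
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        found, corners = cv2.findChessboardCorners(gray, pattern_size)
        if found:
            obj_points.append(obj)
            img_points.append(corners)

    h, w = camera_image.shape[:2]
    _, K, dist, _, _ = cv2.calibrateCamera(obj_points, img_points, (w, h), None, None)
    # Undistortion moves each pixel according to the measured lens characteristic.
    return cv2.undistort(camera_image, K, dist)
```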
  • [0048] Embodiment 2
  • In either or both of a camera installation mode in which the installation position of the camera is largely shifted from the center line of the vehicle and a camera installation mode in which the camera does not face to the straight backward direction, it may be difficult to generate a user-friendly image through the image shifting processing and the distortion correcting processing described in Embodiment 1. [0049]
  • Embodiment 2 corresponds to an aspect in which a user-friendly image can be generated even in these camera installation modes. Now, the contents of processing of Embodiment 2 will be described by using a camera image and a display screen mode resulting from the processing. [0050]
  • First, in a first camera installation mode of Embodiment 2, the installation position of the camera alone is largely shifted from the center of the vehicle as shown in FIG. 2. [0051]
  • FIG. 4A shows camera images taken in the camera installation mode of FIG. 2, and FIGS. 4B through 4D show synthesized images as if they were seen from a virtual viewpoint obtained by subjecting the images of FIG. 4A to viewpoint converting processing for the purpose of overcoming difficulty in grasping the image due to the lateral shift of the camera position. FIGS. 5A through 5C show the virtual viewpoints employed in the rear images of FIGS. 4B through 4D, respectively. Specifically, the virtual viewpoint is in a position above the rear center of the vehicle and the camera faces to a straight backward direction of the vehicle at an angle of 30 degrees against the road surface in FIG. 5A; the virtual viewpoint is in a position above the rear center of the vehicle and the camera faces to a straight backward direction of the vehicle at an angle of 60 degrees against the road surface in FIG. 5B; and the virtual viewpoint is in a position above the rear center of the vehicle and the camera faces straight downward in FIG. 5C. [0052]
  • Each of the images of FIGS. 4B through 4D is processed through the virtual viewpoint converting processing so that the center line CL1 along the lengthwise direction of the vehicle on the road surface can accord with the vertical center line of the screen. In addition, when the vehicle is moved straight backward by 25 cm, an object on the center line CL1 of the vehicle moves straight downward in the vertical direction on the screen. [0053]
  • Furthermore, as shown in FIGS. 5A through 5C, the virtual viewpoint can be given an angle of depression independently of the angle of depression of the actually used camera. Therefore, the positional relationship with a feature on a road surface (such as a white line) can be more easily grasped by generating an image as if it were looked down from further above. [0054]
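One common way to realize such a road-plane viewpoint conversion is a homography: for points lying on the road surface, the real camera image and the virtual overhead view are related by a single 3x3 perspective transform. The sketch below assumes four known ground points behind the vehicle and their desired positions in the virtual view; it is an illustrative stand-in rather than the patent's own machinery, though the resulting per-pixel correspondence could equally be baked into a mapping table.

```python
import cv2
import numpy as np

def virtual_viewpoint_image(camera_image, ground_pts_cam, ground_pts_virtual, out_size):
    """Re-render the road surface as if seen from a virtual viewpoint above the
    rear center of the vehicle.

    ground_pts_cam     : 4 x 2 float32, pixel positions of four road-surface
                         points in the real camera image
    ground_pts_virtual : 4 x 2 float32, where those points should appear in the
                         virtual view (chosen so the vehicle center line CL1
                         falls on the vertical center line of the output)
    out_size           : (width, height) of the synthesized rear image
    """
    H = cv2.getPerspectiveTransform(np.float32(ground_pts_cam),
                                    np.float32(ground_pts_virtual))
    # Points off the road plane (three-dimensional objects) will be distorted,
    # as noted in the text; the road surface itself maps correctly.
    return cv2.warpPerspective(camera_image, H, out_size)
```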
  • Next, in a second camera installation mode of Embodiment 2, the installation position of the camera is largely shifted from the center of the vehicle and the camera faces to a direction slightly oblique to the straight backward direction of the vehicle as shown in FIG. 6. [0055]
  • FIG. 7A shows camera images taken in the camera installation mode of FIG. 6, and FIGS. 7B through 7D show synthesized images as if they were seen from a virtual viewpoint obtained by subjecting the images of FIG. 7A to the viewpoint converting processing for the purpose of overcoming difficulty in grasping the image due to the lateral shift of the camera position. The virtual viewpoints employed in the images of FIGS. 7B through 7D also correspond to those shown in FIGS. 5A through 5C, respectively. [0056]
  • When these images are compared with the images obtained in the first camera installation mode, although there are differences in dead angle areas and the distortion of three-dimensional objects in the converted images because of the original difference in the camera range, a road surface area such as a white line is substantially the same as that in the image obtained through the virtual viewpoint converting processing of the camera image taken in the first camera installation mode. [0057]
  • FIGS. 11A through 11D exemplify a case where the image processing according to Embodiment 1 is applied in the camera installation mode shown in FIG. 6. FIG. 11B is an image obtained by shifting, in a leftward direction, the rectangular area in the image of FIG. 11A, which is an original camera image, FIG. 11C is an image obtained by correcting the lens distortion of the image of FIG. 11B, and FIG. 11D is an image obtained by rotating the image of FIG. 11C so as to align the direction of the center line with the vertical direction of the image. Referring to FIG. 11D, the image remains unbalanced on the right and left sides, which might be the limit of the two-dimensional image processing. In contrast, a more natural image as shown in FIG. 7D can be generated by the virtual viewpoint conversion. [0058]
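For completeness, the rotation step that produces FIG. 11D can be sketched as below, again as an illustration under stated assumptions: the angle of the imaged center line is measured first, and the whole image is rotated about its center so that the line becomes vertical.

```python
import cv2

def rotate_center_line_vertical(image, center_line_angle_deg):
    """Rotate the shifted, distortion-corrected image so that the imaged
    vehicle center line becomes vertical.

    center_line_angle_deg : angle of the imaged center line, measured
                            counter-clockwise from the vertical axis of the image
    """
    h, w = image.shape[:2]
    # Rotating by the negative angle cancels the measured tilt of the line.
    M = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), -center_line_angle_deg, 1.0)
    return cv2.warpAffine(image, M, (w, h))
```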
  • In order to realize a rear image described in each embodiment, a mapping table corresponding to an image to be displayed may be provided or a mapping table may be automatically built in accordance with the situation. [0059]
  • Although merely one camera image is processed in each embodiment, it goes without saying that a plurality of camera images may be input and converted into one image through the processing. [0060]
  • A vehicle of this invention includes an ordinary car, a light car, a truck, a bus and the like. In particular, the present invention is very effective for a vehicle in which a camera cannot be installed in the rear center because of a spare tire or the like placed in the rear, or for design reasons. Also, a special vehicle such as a crane truck or an excavator may be a vehicle of this invention as long as the technical idea of the invention is applicable. [0061]
  • As described above, according to the invention, even when a camera is installed in a position laterally shifted from the rear center of a vehicle, a rear image as if it were taken by a camera installed in the rear center and facing to a straight backward direction can be presented to a user. As a result, the restriction in the installation position and the facing direction of the camera of the monitoring system can be reduced, so as to increase the freedom in the camera installation. [0062]

Claims (4)

What is claimed is:
1. A monitoring system comprising:
a camera for taking an image of a state behind a vehicle, installed in a rear portion of said vehicle in a position shifted from a center line along the lengthwise direction of said vehicle; and
an image processing unit for receiving a camera image of said camera and generating a rear image of the state behind said vehicle on the basis of said camera image,
wherein said image processing unit performs image processing for allowing a vertical center line of said rear image to substantially accord with said center line along the lengthwise direction of said vehicle.
2. The monitoring system of claim 1,
wherein said image processing unit performs parallel shifting processing at least on said camera image.
3. The monitoring system of claim 2,
wherein said image processing unit performs lens distortion correcting processing on said camera image.
4. The monitoring system of claim 1,
wherein said image processing unit performs processing for converting said camera image into an image seen from a virtual viewpoint set above the rear center of said vehicle.
US10/173,316 2001-06-18 2002-06-17 Monitoring system Abandoned US20020191078A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US12/369,979 US8201199B2 (en) 2001-06-18 2009-02-12 Monitoring system
US13/468,661 US20120218413A1 (en) 2001-06-18 2012-05-10 Monitoring system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2001-182,741 2001-06-18
JP2001182741A JP4512293B2 (en) 2001-06-18 2001-06-18 Monitoring system and monitoring method

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US12/369,979 Continuation US8201199B2 (en) 2001-06-18 2009-02-12 Monitoring system

Publications (1)

Publication Number Publication Date
US20020191078A1 true US20020191078A1 (en) 2002-12-19

Family

ID=19022775

Family Applications (3)

Application Number Title Priority Date Filing Date
US10/173,316 Abandoned US20020191078A1 (en) 2001-06-18 2002-06-17 Monitoring system
US12/369,979 Expired - Lifetime US8201199B2 (en) 2001-06-18 2009-02-12 Monitoring system
US13/468,661 Abandoned US20120218413A1 (en) 2001-06-18 2012-05-10 Monitoring system

Family Applications After (2)

Application Number Title Priority Date Filing Date
US12/369,979 Expired - Lifetime US8201199B2 (en) 2001-06-18 2009-02-12 Monitoring system
US13/468,661 Abandoned US20120218413A1 (en) 2001-06-18 2012-05-10 Monitoring system

Country Status (5)

Country Link
US (3) US20020191078A1 (en)
EP (1) EP1270329B2 (en)
JP (1) JP4512293B2 (en)
AT (1) ATE442976T1 (en)
DE (1) DE60233705D1 (en)

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030222983A1 (en) * 2002-05-31 2003-12-04 Kunio Nobori Vehicle surroundings monitoring device, and image production method/program
US20040061626A1 (en) * 2002-09-30 2004-04-01 Nissan Motor Co., Ltd. Preceding-vehicle following control system
US20050222753A1 (en) * 2004-03-31 2005-10-06 Denso Corporation Imaging apparatus for vehicles
US20070120660A1 (en) * 2003-12-25 2007-05-31 Shin Caterpillar Mitsubishi Ltd. Indicator control system with camera section
EP1850595A1 (en) * 2005-02-15 2007-10-31 Matsushita Electric Industrial Co., Ltd. Periphery supervising device, and periphery supervising method
US20080089607A1 (en) * 2006-10-11 2008-04-17 Kazuhiro Hirade Semiconductor integrated circuit device and rendering processing display system
US20080158011A1 (en) * 2006-12-28 2008-07-03 Aisin Seiki Kabushiki Kaisha Parking assist apparatus
US20090128630A1 (en) * 2006-07-06 2009-05-21 Nissan Motor Co., Ltd. Vehicle image display system and image display method
US20100220189A1 (en) * 2005-08-02 2010-09-02 Takura Yanagi Device and method for monitoring vehicle surroundings
US9052393B2 (en) 2013-01-18 2015-06-09 Caterpillar Inc. Object recognition system having radar and camera input
US9167214B2 (en) 2013-01-18 2015-10-20 Caterpillar Inc. Image processing system using unified images
US20160037080A1 (en) * 2013-03-15 2016-02-04 Tolo, Inc. Distortion correcting sensors for diagonal collection of oblique imagery
CN105453536A (en) * 2013-08-09 2016-03-30 株式会社电装 Image processing device, and image processing method
EP2256686A4 (en) * 2008-03-19 2017-03-01 Panasonic Intellectual Property Management Co., Ltd. Image processing device and method, driving support system, and vehicle
US10127687B2 (en) 2014-11-13 2018-11-13 Olympus Corporation Calibration device, calibration method, optical device, image-capturing device, projection device, measuring system, and measuring method
US10659677B2 (en) 2017-07-21 2020-05-19 Panasonic Intellectual Property Managment Co., Ltd. Camera parameter set calculation apparatus, camera parameter set calculation method, and recording medium
CN112019816A (en) * 2020-09-03 2020-12-01 中车青岛四方车辆研究所有限公司 Train stream media transmission device and video monitoring system
US11151881B2 (en) * 2017-08-29 2021-10-19 Aisin Seiki Kabushiki Kaisha Parking assist device

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4956880B2 (en) * 2001-09-27 2012-06-20 アイシン精機株式会社 Vehicle monitoring device
JP2006129149A (en) 2004-10-29 2006-05-18 Auto Network Gijutsu Kenkyusho:Kk Visual recognition device for vehicle vicinity
EP1662440A1 (en) * 2004-11-30 2006-05-31 IEE INTERNATIONAL ELECTRONICS & ENGINEERING S.A. Method for determining the position of an object from a digital image
JP4680670B2 (en) * 2005-05-12 2011-05-11 株式会社バンダイナムコゲームス Program, information storage medium, and image generation system
JP2006318388A (en) * 2005-05-16 2006-11-24 Namco Bandai Games Inc Program, information storage medium, and image forming system
WO2008117386A1 (en) * 2007-03-23 2008-10-02 Pioneer Corporation Image processing apparatus for vehicle and image processing program for vehicle
KR100882875B1 (en) 2007-12-18 2009-02-10 주식회사 현대오토넷 Vehicle rear image display apparatus and method for diversion position of image
KR20120053713A (en) * 2010-11-18 2012-05-29 에스엘 주식회사 Apparatus and method for controlling a vehicle camera
JP5447490B2 (en) * 2011-11-18 2014-03-19 アイシン精機株式会社 Vehicle monitoring device
CN103847638A (en) * 2012-11-28 2014-06-11 德尔福电子(苏州)有限公司 Wheel direction display device
JP5999043B2 (en) * 2013-07-26 2016-09-28 株式会社デンソー Vehicle periphery monitoring device and program
US20150175088A1 (en) * 2013-12-23 2015-06-25 Teddy Chang Extendable Surround Image Capture System for Vehicles
EP3462550B1 (en) * 2017-10-02 2021-01-27 Hosiden Corporation Connector module and onboard camera using the same
JP2022168699A (en) * 2021-04-26 2022-11-08 キヤノン株式会社 Movable body, imaging system, and imaging apparatus

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0399952A (en) 1989-09-12 1991-04-25 Nissan Motor Co Ltd Surrounding situation monitor for vehicle
FR2673499B1 (en) * 1991-03-01 1995-08-04 Renault Rear-view camera device for a motor vehicle.
US6498620B2 (en) * 1993-02-26 2002-12-24 Donnelly Corporation Vision system for a vehicle including an image capture device and a display system having a long focal length
US5883739A (en) * 1993-10-04 1999-03-16 Honda Giken Kogyo Kabushiki Kaisha Information display device for vehicle
JPH08147497A (en) * 1994-11-25 1996-06-07 Canon Inc Picture processing method and device therefor
JPH10257482A (en) * 1997-03-13 1998-09-25 Nissan Motor Co Ltd Vehicle surrounding condition display device
US6529640B1 (en) * 1998-06-09 2003-03-04 Nikon Corporation Image processing apparatus
FR2780230B1 (en) * 1998-06-18 2000-08-25 Renault Main rear-view device for a motor vehicle
EP2259220A3 (en) 1998-07-31 2012-09-26 Panasonic Corporation Method and apparatus for displaying image
EP1004916A1 (en) * 1998-11-25 2000-05-31 Donnelly Corporation Wide angle imaging system for vehicle
JP2000161915A (en) * 1998-11-26 2000-06-16 Matsushita Electric Ind Co Ltd On-vehicle single-camera stereoscopic vision system
CA2369648A1 (en) 1999-04-16 2000-10-26 Matsushita Electric Industrial Co., Limited Image processing device and monitoring system
JP3298851B2 (en) * 1999-08-18 2002-07-08 Matsushita Electric Industrial Co., Ltd. Multi-function vehicle camera system and image display method of multi-function vehicle camera
DE60105684T2 (en) * 2000-04-05 2005-02-10 Matsushita Electric Industrial Co., Ltd., Kadoma System and method for driver assistance
US6734896B2 (en) * 2000-04-28 2004-05-11 Matsushita Electric Industrial Co., Ltd. Image processor and monitoring system
JP3599639B2 (en) * 2000-05-26 2004-12-08 Matsushita Electric Industrial Co., Ltd. Image processing device
JP2002359839A (en) * 2001-03-29 2002-12-13 Matsushita Electric Ind Co Ltd Method and device for displaying image of rearview camera
AU2002308651A1 (en) * 2001-05-04 2002-11-18 Leberl, Franz, W. Digital camera for and method of obtaining overlapping images
JP4760831B2 (en) * 2005-08-02 2011-08-31 Nissan Motor Co., Ltd. Vehicle perimeter monitoring apparatus and vehicle perimeter monitoring method
US9826200B2 (en) * 2007-04-30 2017-11-21 Mobileye Vision Technologies Ltd. Rear obstruction detection
JP4595976B2 (en) * 2007-08-28 2010-12-08 Denso Corporation Video processing apparatus and camera

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5220508A (en) * 1989-12-28 1993-06-15 Kabushiki Kaisha Toyota Chuo Kenkyusho Position and heading detecting device for self controlled vehicle
US5329310A (en) * 1992-06-30 1994-07-12 The Walt Disney Company Method and apparatus for controlling distortion of a projected image
US5661472A (en) * 1993-01-31 1997-08-26 Isuzu Motors Limited Off-lane alarm apparatus
US5670935A (en) * 1993-02-26 1997-09-23 Donnelly Corporation Rearview vision system for vehicle including panoramic view
US5646614A (en) * 1993-10-25 1997-07-08 Mercedes-Benz Ag System for monitoring the front or rear parking space of a motor vehicle
US5859626A (en) * 1995-02-28 1999-01-12 Sony Corporation Display circuit which automatically deciphers different video formats and optimizes the horizontal and vertical centering of the image on the display
US6005492A (en) * 1997-08-21 1999-12-21 Honda Giken Kogyo Kabushiki Kaisha Road lane recognizing apparatus
US6476855B1 (en) * 1998-05-25 2002-11-05 Nissan Motor Co., Ltd. Surrounding monitor apparatus for a vehicle
US6184781B1 (en) * 1999-02-02 2001-02-06 Intel Corporation Rear looking vision system
US6985171B1 (en) * 1999-09-30 2006-01-10 Kabushiki Kaisha Toyoda Jidoshokki Seisakusho Image conversion device for vehicle rearward-monitoring device
US6813371B2 (en) * 1999-12-24 2004-11-02 Aisin Seiki Kabushiki Kaisha On-vehicle camera calibration device

Cited By (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7110021B2 (en) * 2002-05-31 2006-09-19 Matsushita Electric Industrial Co., Ltd. Vehicle surroundings monitoring device, and image production method/program
US20030222983A1 (en) * 2002-05-31 2003-12-04 Kunio Nobori Vehicle surroundings monitoring device, and image production method/program
US7561955B2 (en) 2002-09-30 2009-07-14 Nissan Motor Co., Ltd. Preceding-vehicle following control system
US20040061626A1 (en) * 2002-09-30 2004-04-01 Nissan Motor Co., Ltd. Preceding-vehicle following control system
US20070198162A1 (en) * 2002-09-30 2007-08-23 Nissan Motor Co., Ltd. Preceding-vehicle following control system
US7272482B2 (en) * 2002-09-30 2007-09-18 Nissan Motor Co., Ltd. Preceding-vehicle following control system
US20070120660A1 (en) * 2003-12-25 2007-05-31 Shin Caterpillar Mitsubishi Ltd. Indicator control system with camera section
US7605692B2 (en) * 2003-12-25 2009-10-20 Caterpillar Japan Ltd. Indicator control system with camera section
US20050222753A1 (en) * 2004-03-31 2005-10-06 Denso Corporation Imaging apparatus for vehicles
US7532975B2 (en) * 2004-03-31 2009-05-12 Denso Corporation Imaging apparatus for vehicles
EP1850595A1 (en) * 2005-02-15 2007-10-31 Matsushita Electric Industrial Co., Ltd. Periphery supervising device, and periphery supervising method
EP1850595A4 (en) * 2005-02-15 2014-08-06 Panasonic Corp Periphery supervising device, and periphery supervising method
US20100220189A1 (en) * 2005-08-02 2010-09-02 Takura Yanagi Device and method for monitoring vehicle surroundings
US8885045B2 (en) * 2005-08-02 2014-11-11 Nissan Motor Co., Ltd. Device and method for monitoring vehicle surroundings
US20090128630A1 (en) * 2006-07-06 2009-05-21 Nissan Motor Co., Ltd. Vehicle image display system and image display method
US7953292B2 (en) 2006-10-11 2011-05-31 Renesas Electronics Corporation Semiconductor integrated circuit device and rendering processing display system
US20080089607A1 (en) * 2006-10-11 2008-04-17 Kazuhiro Hirade Semiconductor integrated circuit device and rendering processing display system
US20080158011A1 (en) * 2006-12-28 2008-07-03 Aisin Seiki Kabushiki Kaisha Parking assist apparatus
US7940193B2 (en) * 2006-12-28 2011-05-10 Aisin Seiki Kabushiki Kaisha Parking assist apparatus
EP2256686A4 (en) * 2008-03-19 2017-03-01 Panasonic Intellectual Property Management Co., Ltd. Image processing device and method, driving support system, and vehicle
US9167214B2 (en) 2013-01-18 2015-10-20 Caterpillar Inc. Image processing system using unified images
US9052393B2 (en) 2013-01-18 2015-06-09 Caterpillar Inc. Object recognition system having radar and camera input
US20160037080A1 (en) * 2013-03-15 2016-02-04 Tolo, Inc. Distortion correcting sensors for diagonal collection of oblique imagery
US9503639B2 (en) * 2013-03-15 2016-11-22 Tolo, Inc. Distortion correcting sensors for diagonal collection of oblique imagery
CN105453536A (en) * 2013-08-09 2016-03-30 Denso Corporation Image processing device, and image processing method
US20160176344A1 (en) * 2013-08-09 2016-06-23 Denso Corporation Image processing apparatus and image processing method
US10315570B2 (en) * 2013-08-09 2019-06-11 Denso Corporation Image processing apparatus and image processing method
US10127687B2 (en) 2014-11-13 2018-11-13 Olympus Corporation Calibration device, calibration method, optical device, image-capturing device, projection device, measuring system, and measuring method
US10659677B2 (en) 2017-07-21 2020-05-19 Panasonic Intellectual Property Management Co., Ltd. Camera parameter set calculation apparatus, camera parameter set calculation method, and recording medium
US11151881B2 (en) * 2017-08-29 2021-10-19 Aisin Seiki Kabushiki Kaisha Parking assist device
CN112019816A (en) * 2020-09-03 2020-12-01 中车青岛四方车辆研究所有限公司 Train stream media transmission device and video monitoring system

Also Published As

Publication number Publication date
EP1270329A2 (en) 2003-01-02
EP1270329B1 (en) 2009-09-16
US8201199B2 (en) 2012-06-12
US20090174775A1 (en) 2009-07-09
JP2002374523A (en) 2002-12-26
DE60233705D1 (en) 2009-10-29
ATE442976T1 (en) 2009-10-15
JP4512293B2 (en) 2010-07-28
EP1270329B2 (en) 2016-11-23
EP1270329A3 (en) 2003-12-17
US20120218413A1 (en) 2012-08-30

Similar Documents

Publication Publication Date Title
EP1270329B2 (en) Monitoring system
JP4766841B2 (en) Camera device and vehicle periphery monitoring device mounted on vehicle
US7006127B2 (en) Driving aiding system
US7266219B2 (en) Monitoring system
US20020075387A1 (en) Arrangement and process for monitoring the surrounding area of an automobile
JP4639753B2 (en) Driving assistance device
US7365653B2 (en) Driving support system
JP5729158B2 (en) Parking assistance device and parking assistance method
JP4765649B2 (en) Vehicle video processing device, vehicle peripheral monitoring system, and video processing method
JP4248570B2 (en) Image processing apparatus and visibility support apparatus and method
US20050033495A1 (en) Parking assist system
US20090022423A1 (en) Method for combining several images to a full image in the bird's eye view
JP2005311868A (en) Vehicle periphery visually recognizing apparatus
JP2005110202A (en) Camera apparatus and apparatus for monitoring vehicle periphery
CN1378746A (en) Monitor system, method of adjusting camera, and vehicle monitor system
JP5036891B2 (en) Camera device and vehicle periphery monitoring device mounted on vehicle
JP2006050246A (en) Device for visually recognizing around vehicle
EP3772719B1 (en) Image processing apparatus, image processing method, and image processing program
WO2016129552A1 (en) Camera parameter adjustment device
KR20100081964A (en) Around image generating method and apparatus
JP4499319B2 (en) Driving support device, driving support method, and driving guide data creation method
JP2012019552A (en) Driving support device
KR101861523B1 (en) Apparatus and method for supporting driving of vehicle
US20220222947A1 (en) Method for generating an image of vehicle surroundings, and apparatus for generating an image of vehicle surroundings
JP6274936B2 (en) Driving assistance device

Legal Events

Date Code Title Description
AS Assignment

Owner name: MATSUSHITA ELECTRIC INDUSTRIAL CO., LTD, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:OKAMOTO, SHUSAKU;NAKAGAWA, MASAMICHI;YOSHIDA, TAKASHI;AND OTHERS;REEL/FRAME:013023/0472

Effective date: 20020613

AS Assignment

Owner name: PANASONIC CORPORATION, JAPAN

Free format text: CHANGE OF NAME;ASSIGNOR:MATSUSHITA ELECTRIC INDUSTRIAL CO., LTD.;REEL/FRAME:021897/0624

Effective date: 20081001

STCB Information on status: application discontinuation

Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION

AS Assignment

Owner name: PANASONIC INTELLECTUAL PROPERTY MANAGEMENT CO., LTD., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PANASONIC CORPORATION;REEL/FRAME:034194/0143

Effective date: 20141110

AS Assignment

Owner name: PANASONIC INTELLECTUAL PROPERTY MANAGEMENT CO., LTD., JAPAN

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE ERRONEOUSLY FILED APPLICATION NUMBERS 13/384239, 13/498734, 14/116681 AND 14/301144 PREVIOUSLY RECORDED ON REEL 034194 FRAME 0143. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNOR:PANASONIC CORPORATION;REEL/FRAME:056788/0362

Effective date: 20141110