US6005987A - Picture image forming apparatus - Google Patents

Picture image forming apparatus

Info

Publication number
US6005987A
Authority
US
United States
Prior art keywords
images
picture
picture image
parallax
objects
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
US08/951,713
Inventor
Mitsuaki Nakamura
Yoshihiro Kitamura
Hiroshi Akagi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sharp Corp
Original Assignee
Sharp Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sharp Corp filed Critical Sharp Corp
Assigned to SHARP KABUSHIKI KAISHA reassignment SHARP KABUSHIKI KAISHA ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: KITAMURA, YOSHIHIRO, NAKAMURA, MITSUAKI
Assigned to SHARP KABUSHIKI KAISHA reassignment SHARP KABUSHIKI KAISHA RE-RECORD TO ADD ASSIGNOR NAME THAT WAS LEFT OFF ON A PREVIOUSLY RECORD DOCUMENT AT REEL 8940 FRAME 0460. Assignors: AKAGI, HIROSHI, KITAMURA, YOSHIHIRO, NAKAMURA, MITSUAKI
Application granted granted Critical
Publication of US6005987A publication Critical patent/US6005987A/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformation in the plane of the image
    • G06T3/40Scaling the whole image or part thereof
    • G06T3/4038Scaling the whole image or part thereof for image mosaicing, i.e. plane images composed of plane sub-images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/10Geometric effects
    • G06T15/20Perspective computation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/239Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/698Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image
    • G06T2207/10012Stereo images
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/167Synchronising or controlling image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/189Recording image signals; Reproducing recorded image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/296Synchronisation thereof; Control thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N2013/0074Stereoscopic image analysis
    • H04N2013/0081Depth or disparity estimation from stereoscopic image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N2013/0074Stereoscopic image analysis
    • H04N2013/0088Synthesising a monoscopic image signal from stereoscopic images, e.g. synthesising a panoramic or high resolution monoscopic image

Definitions

  • the present invention relates to a picture image forming apparatus for forming a composite panoramic picture image from a plurality of still digital picture images.
  • a camera is supposed to provide a picture image describing a view in an imaging range by one imaging operation.
  • in imaging a digital picture image by using the camera, there are cases where pan-imaging is carried out by hand holding, cases where pan-imaging is carried out by using a tripod, and cases where imaging is performed by installing a plurality of cameras having overlapping imaging ranges.
  • Pan-imaging designates a plurality of imaging operations in which the imaging range of a camera is moved in the horizontal direction such that, at each imaging operation, portions of an object are also included in the imaging range of the preceding or succeeding imaging operation.
  • cameras 301 and 302 capable of imaging an object included in imaging ranges 307 and 308 each having a predetermined size by one imaging operation, are arranged such that the imaging ranges 307 and 308 overlap, to image objects 309 and 310.
  • optical axes 303 and 304 of lenses 301a and 302a of the cameras intersect with each other by making an angle 305.
  • a portion 311 where the imaging ranges 307 and 308 overlap is indicated by attaching hatched lines.
  • the objects 309 and 310 are disposed in the portion 311.
  • Distances (hereinafter referred to as depths) from the camera 301 to the objects 309 and 310 differ from the depths from the camera 302 to the objects 309 and 310, respectively. The overlapped states of the objects 309 and 310 differ in view from the cameras 301 and 302, and therefore a portion 321 of the object 310 is not seen from the camera 301 but is seen from the camera 302.
  • FIGS. 5A and 5B show picture images 313 and 314 obtained by the cameras 301 and 302 through the imaging operation.
  • the picture image 313 includes object images 317 and 318 representing the objects 309 and 310.
  • the picture image 314 includes object images 319 and 320 of the objects 309 and 310.
  • only one camera may be arranged in the same manner as in the camera 301 of FIG. 4.
  • the positional relationship between the camera and the objects 309 and 310 in the first imaging operation is equal to that between the camera 301 and the objects 309 and 310
  • the positional relationship between the camera and the objects 309 and 310 in the second imaging operation is equal to that between the camera 302 and the objects 309 and 310. Consequently, the two picture images obtained in this case are equal to the two picture images obtained by the cameras 301 and 302.
  • positions of the lenses 301a and 302a of the cameras 301 and 302 are shifted as shown by FIG. 4.
  • the rotational center of a camera does not coincide with the lens center of the camera, and accordingly some movement of the lens position is caused even in the case where pan-imaging is carried out by using a tripod.
  • while the portion 321 is imaged in the image 320 of the object 310 in the picture image 314, the portion 321 is not imaged in the image 318 of the object 310 in the picture image 313. Therefore, according to the picture images 313 and 314, the respective shapes and overlapped states of the images 317, 318; 319, 320 of the objects 309 and 310 in the overlapped portion 311 of the imaging ranges 307 and 308 do not coincide with each other.
  • when a panoramic picture image 323 shown by FIG. 5C is formed by jointing the picture images 313 and 314 such that their positions are made to coincide with each other in order to make the images 317 and 319 of the object 309 coincide, the images 318 and 320 of the object 310 do not coincide with each other but are shifted from each other.
  • conversely, when a panoramic picture image shown by FIG. 5D is formed by jointing the picture images 313 and 314 such that their positions are made to coincide with each other in order to make the images 318 and 320 of the object 310 coincide, the images 317 and 319 of the object 309 do not coincide with each other but are shifted from each other. Further, when a moving body is an object and the respective imaging operations are separated in time, the position and the shape of the image of the object change in the respective picture images with the movement of the body; therefore the images of the object likewise do not coincide with each other when a panoramic picture image is formed from these picture images as described above.
  • picture images are jointed by a joint line of a straight line or a straight line having a width.
  • picture images therefore often cannot be composited adequately.
  • even when portions of picture images are naturally jointed, shifts or doubled imaging occur on the joint lines in portions having parallax or including a moving body, thereby causing significant deterioration in the quality of a panoramic picture image.
  • a first aspect of the invention provides a picture image forming apparatus comprising:
  • imaging means for obtaining a pair of divided picture images having a parallax by imaging an object and a periphery of the object so as to include a same part of the object in imaging regions for the pair of divided picture images;
  • parallax sampling means for sampling the parallax from the images of the object in the two overlap regions
  • intermediate picture image forming means for forming intermediate picture images to be provided by imaging the object from points between the observing points from which the respective divided picture images are imaged on the basis of the images of the object in the overlap regions of the pair of divided picture images and the sampled parallax, and
  • picture image composition means for providing a composite picture image by interpolating the parallax of the images of the object in the two overlap regions by the intermediate picture images and compositing the pair of divided picture images such that the overlap regions overlap.
  • the parallax caused by the depth of the object is calculated on the basis of the images of the object in the overlap regions of the divided picture images, and the images of the object in the overlap regions are interpolated by using the intermediate picture images, which interpolate the positions from which the picture images are imaged. This achieves an effect of forming a panoramic picture image that is more natural than in the conventional technology, even in the case where parallax is caused by the depth of the object.
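The interpolation of observing points described above can be sketched as follows (a minimal illustration in Python/NumPy, not the patent's exact method; the function name `intermediate_image`, the linear-shift warping model, and the blending weights are all assumptions):

```python
import numpy as np

def intermediate_image(left, right, disparity, t):
    """Synthesize a view at fraction t (0..1) between two overlap strips.

    Pixels from the left strip are forward-warped by t * disparity, pixels
    from the right strip are backward-warped by (1 - t) * disparity, and
    the two warped strips are blended with weights (1 - t) and t.
    """
    h, w = left.shape
    out = np.zeros((h, w), dtype=np.float64)
    weight = np.zeros((h, w), dtype=np.float64)
    for y in range(h):
        for x in range(w):
            d = disparity[y, x]
            xl = int(round(x + t * d))        # destination of the left pixel
            xr = int(round(x - (1 - t) * d))  # destination of the right pixel
            if 0 <= xl < w:
                out[y, xl] += (1 - t) * left[y, x]
                weight[y, xl] += (1 - t)
            if 0 <= xr < w:
                out[y, xr] += t * right[y, x]
                weight[y, xr] += t
    mask = weight > 0
    out[mask] /= weight[mask]
    return out
```

With zero disparity everywhere, the intermediate image at t = 0.5 degenerates to a plain average of the two strips, which is a useful sanity check.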
  • a second aspect of the invention provides a picture image forming apparatus wherein, when there are a plurality of objects having different distances from the imaging means to the objects, the parallax sampling means corrects parallaxes of images of the respective objects such that the parallax of the image of any one of the objects among the parallaxes sampled from the images of the respective objects, is nullified.
  • the parallax sampling means corrects the sampled parallaxes as described above. Therefore, when the parallaxes of the images of the objects in the two overlap regions are interpolated by using the intermediate picture images formed on the basis of the parallaxes and the pair of divided picture images are composited, in the composite picture image, the images of any one of the objects are jointed naturally. Accordingly, a more natural composite picture image can be obtained.
  • a third aspect of the invention provides a picture image forming apparatus wherein the intermediate picture image forming means of the picture image forming apparatus forms a plurality of the intermediate picture images and the respective intermediate picture images are provided with observing points different from each other to provide the respective intermediate picture images by imaging the object.
  • the intermediate picture image forming means forms a plurality of the intermediate picture images as described above.
  • the picture image composition means interpolates the parallaxes of the images of the object in the two overlap regions by using the plurality of intermediate picture images.
  • a fourth aspect of the invention provides a picture image forming apparatus wherein the calculating means of the picture image forming apparatus calculates differences in angle of the images of the object in the two divided picture images and subjects the divided picture images to a rotational transformation such that the calculated differences of the angles cancel each other.
  • the calculating means corrects to rotate the divided picture images as described above. Therefore, even in the case where not only the differences in the parallaxes but the differences in angle of the images of the object are included in the divided picture images, the differences are corrected and accordingly, the picture images can be composited further finely.
  • a fifth aspect of the invention provides a picture image forming apparatus, wherein the calculating means of the picture image forming apparatus calculates differences in size of the images of the object in the divided picture images and the two divided picture images are magnified or reduced such that the calculated differences in size cancel each other.
  • the calculating means corrects to magnify or reduce the divided picture images as described above. Therefore, even when the divided picture images include not only the differences in the parallaxes but differences in size of the images of the object, the differences are corrected and accordingly, the picture images can be composited further finely.
  • a sixth aspect of the invention provides a picture image forming apparatus wherein the calculating means calculates differences in angle and size of the images of the object in the two divided picture images and subjects the divided picture images to rotational transformation and magnification or reduction such that the calculated differences in angle and size cancel each other, respectively.
  • the calculating means corrects to rotationally transform and magnify or reduce the two divided picture images.
  • the divided picture images include not only the differences in parallax but also differences in angle and size of the images of the object, the differences are corrected, and accordingly the picture images can be composited further finely.
  • a seventh aspect of the invention provides a picture image forming apparatus comprising imaging means 101 for providing a pair of divided picture images having parallaxes in images of objects by imaging the same objects respectively from two different observing points,
  • parallax sampling means for sampling the parallaxes from the images of the objects in the overlap regions
  • joint line setting means for setting joint lines on a profile of the image of one of the objects most proximate to the imaging means in the respective overlap regions based on the sampled parallaxes
  • picture image composition means for providing a composite picture image by compositing the pair of divided picture images such that the set joint lines overlap.
  • the parallax caused by the depth of the object, or a shift in position of the image of a moving object in the picture image, is calculated in the overlap regions of the divided picture images, and lines which do not include portions having large parallaxes caused by the depths of the objects in the two picture images are used as joint lines. Thereby, even in the case of parallaxes caused by the depths of the objects, or of images representing a moving body, an effect of forming a natural composite picture image with small distortion and quality superior to that of the conventional technology is achieved.
  • setting the joint lines makes it possible to hide the image of another object which is behind the object closest to the imaging means and becomes invisible from time to time in a vicinity of the periphery of the image of the object closest to the imaging means, by the image of the object closest to the imaging means, and accordingly the image quality of the composite picture image can be enhanced.
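Setting a joint line on the profile of the image of the object closest to the imaging means can be sketched roughly as below (a hypothetical helper; the threshold-based segmentation of the parallax image and the centre-column fallback are illustrative assumptions):

```python
import numpy as np

def joint_line_on_profile(disparity, threshold):
    """Pick, per row, a seam column lying on the profile (here, the left
    edge) of the region whose disparity exceeds `threshold`, i.e. the
    object most proximate to the imaging means.  Rows containing no near
    object fall back to the centre of the overlap strip."""
    h, w = disparity.shape
    seam = np.full(h, w // 2, dtype=int)
    for y in range(h):
        cols = np.nonzero(disparity[y] > threshold)[0]
        if cols.size:
            seam[y] = cols[0]  # left edge of the near object's image
    return seam
```

Because the seam hugs the near object's profile, background objects that appear and disappear behind it end up on one consistent side of the joint.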
  • An eighth aspect of the invention provides a picture image forming apparatus, wherein the joint line setting means of the picture image forming apparatus sets joint lines on profiles most proximate to rims of the divided picture images among the profiles of the images of the objects in the respective overlap regions.
  • according to the eighth aspect of the invention, even in the case where the joint formed according to the seventh aspect becomes unnatural, such as where images of a plurality of objects having different depths exist in the overlap regions, providing joint lines on the profiles of the images of the objects close to the rims of the divided picture images achieves an effect of forming a natural composite picture image of good quality and inconsiderable distortion.
  • a ninth aspect of the invention provides a picture image forming apparatus where the joint line setting means of the picture image forming apparatus calculates candidates of a plurality of joint lines by using respective different joint line determining methods and one of the candidates of joint lines is selected and determined to be the joint line.
  • the ninth aspect addresses the case where, depending on the positions of the objects in the overlap regions, the invention may not perform adequately when the picture image used as a reference is always fixed to one of the pair of divided picture images according to the seventh and the eighth aspects of the invention.
  • candidates for a plurality of joint lines are calculated by using respective different joint line determining methods, such as selection of the reference picture image or a combination of the seventh and the eighth joint line setting methods, and an optimum joint line mostly minimizing differences in the images of the objects on the joint line is selected from the candidates. Thereby an effect of forming a natural composite picture image having further stably improved quality and inconsiderable distortion is achieved.
  • the parallax sampling means corrects the sampled parallaxes as described above.
  • An object of an image having the smallest parallax is an object farthest from the imaging means.
  • a joint line is set on the profile of the image of the object having the largest parallax, namely an object proximate to the imaging means.
  • the image of a far object on the joint line is less deviated by correcting the parallax of the image of the farthest object to zero. Further, since the joint line is set on the profile of the image of a near object, the image is hardly deviated.
  • the joint line is set on the profile of the image of an object close to the imaging means, so that in a composite picture image the image of another object which appears and disappears in the vicinity of the object close to the imaging means and is behind the object is hidden by the image of the object close to the imaging means, and additionally the deviation of the image of the object far from the imaging means becomes zero.
  • unnatural shapes, positions etc. of the image of an object within the composite picture image can be eliminated, and a further natural composite picture image can be provided.
  • An eleventh aspect of the invention provides a storage medium capable of being read by a computer, storing a program for allowing the computer to execute the processings of:
  • intermediate picture images to be provided by imaging the object from points between the observing points from which the respective divided picture images are imaged on the basis of the images of the object in the overlap regions of the pair of divided picture images and the sampled parallax;
  • the program stored in the above-mentioned storage medium is read by the computer to be executed, whereby the above-mentioned processings can be sequentially executed.
  • the computer calculates a parallax of an image of an object on the basis of an image of an object within an overlap region and forms intermediate picture images produced on the basis of the parallax, to thereby combine a pair of picture images by interpolating the parallax of the image of the object. Accordingly, even when an image of an object in an input picture image has a parallax, it is possible to form a more natural composite picture image in comparison with the prior art.
  • a twelfth aspect of the invention provides a storage medium capable of being read by a computer, storing a program for allowing the computer to execute the processings of:
  • the program stored in the above-mentioned storage medium is read by the computer, which executes sequentially the above-mentioned processings.
  • the computer can set a joint line so that the joint line does not pass a position where an image of an object has a large parallax, to composite a pair of picture images. Accordingly, even when images of the pair of picture images may be deviated due to a parallax caused by a difference in depth or a motion of the object, a natural composite picture image of high quality which becomes less deformed can be formed.
  • FIG. 1 is a block diagram showing the electric constitution of a panoramic picture image forming apparatus according to a first embodiment of the invention
  • FIG. 2 is a block diagram showing the electric constitution of a panoramic picture image forming apparatus according to a second embodiment of the invention
  • FIG. 3 is a view showing an embodiment of RAM 112 in FIG. 1;
  • FIG. 4 is a view showing behavior of imaging picture images having overlap regions by moving a camera or in a plurality of cameras
  • FIGS. 5A and 5B are views showing picture images 313 and 314 imaged in the constitution of FIG. 4,
  • FIGS. 5C and 5D are views showing that, in forming a panoramic view by a conventional technology, the images 317, 319; 318, 320 of objects in overlap regions of the picture images 313 and 314 do not coincide with each other owing to a difference in depths of the objects;
  • FIGS. 6A through 6C are views showing behavior of matching for detecting deviations in images of objects caused by their depths
  • FIGS. 7A through 7E are views showing behavior of forming an intermediate picture image IIk(l);
  • FIGS. 8A through 8D are views showing a flow of processings in forming a panoramic picture image according to a panoramic picture image processing device in the first embodiment of the invention.
  • FIGS. 9A through 9E are views showing behavior of calculating parallel movement amounts of picture images I(k) and I(k+1);
  • FIG. 10 is a graph showing a relationship between a deviation amount of matching between picture images and a depth from a camera
  • FIGS. 11A through 11I are views showing a flow of processings in forming a panoramic picture image according to the panoramic picture image processing device in the first embodiment of the invention.
  • FIGS. 12A and 12B are views showing behavior of determining a joint line by using a second sampling method of a joint line
  • FIGS. 13A through 13D are views showing behavior of determining a joint line by using a third sampling method of a joint line
  • FIG. 14 is a view showing a method of interpolating undefined pixels from their surroundings in forming the intermediate picture image IIk(l).
  • FIG. 1 is a block diagram showing the electric constitution of a panoramic picture image forming apparatus 100 according to a first embodiment of the invention.
  • FIGS. 8A through 8D are views for explaining an outline of panoramic picture image forming processings in the panoramic picture image forming apparatus 100. An explanation will be given thereof in reference to both FIG. 1 and FIGS. 8A through 8D.
  • the panoramic picture image forming apparatus 100 includes a picture image inputting device 101, a frame memory 102, a picture image outputting device 103, a picture image input processing unit 104, a picture image preprocessing unit 105, a picture image position calculating unit 106, a picture image parallax sampling and processing unit 107, an intermediate picture image forming and processing unit 108, a picture image composition and processing unit 109, a picture image output processing unit 110, a timing control unit 111 and RAM 112.
  • the picture image inputting device 101 acquires a plurality of picture images constituting objects of composition.
  • the picture images constituting objects of composition are provided by, for example, pan-imaging as explained in the conventional technology.
  • the plurality of picture images are respectively provided with regions having images of an object that is the same as an object in other picture images, which are referred to as overlap regions.
  • the images of the object include deviation by parallax and distortion.
  • although a picture image is dealt with in the mode of a picture image signal representing the picture image, the picture image signal may also be referred to as a "picture image" in the specification.
  • a plurality of picture images constituting objects of composition are picture images I(k) and I(k+1) shown by FIG. 8A.
  • the picture images I(k) and I(k+1) are the same as the picture images 313 and 314 in FIGS. 5A and 5B and an explanation of the notations will be omitted.
  • a portion of the picture image I(k) right from the central portion and a portion of the picture image I(k+1) left from the central portion constitute overlap regions W1 and W2.
  • the number of picture images constituting objects of composition may naturally be three or more.
  • the picture image input processing unit 104 provides and stores the plurality of picture images constituting objects of composition from the picture image inputting device 101 to the frame memory 102.
  • the picture image preprocessing unit 105 performs preprocessing individually with respect to the respective picture images stored to the frame memory 102.
  • the picture image position calculating unit 106 calculates parallel movement amounts of the pair of picture images I(k) and I(k+1) stored in the frame memory 102.
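The calculation of a parallel movement amount between a pair of picture images can be illustrated with a simple exhaustive search (a sketch only; the mean-absolute-difference error measure and the search range are assumptions, not the patent's actual procedure):

```python
import numpy as np

def translation_amount(a, b, max_shift):
    """Estimate the horizontal translation of picture image b relative to
    picture image a by exhaustive search over candidate shifts, minimizing
    the mean absolute difference over the overlapping columns."""
    best_shift, best_err = 0, np.inf
    w = a.shape[1]
    for s in range(-max_shift, max_shift + 1):
        if s >= 0:
            diff = a[:, s:] - b[:, :w - s]
        else:
            diff = a[:, :w + s] - b[:, -s:]
        err = np.abs(diff).mean()
        if err < best_err:
            best_shift, best_err = s, err
    return best_shift
```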
  • the picture image parallax sampling and processing unit 107 calculates deviations of positions in the picture images I(k) and I(k+1) with respect to images of the same object in the respective picture images of the pair, and forms parallax picture images representing the magnitude of the deviation per pixel for each of the picture images I(k) and I(k+1).
  • the deviation is referred to as "deviation between images of object".
  • a parallax picture image 701 of the picture image I(k) with a reference of the picture image I(k+1) and a parallax picture image 702 of the picture image I(k+1) with a reference of the picture image I(k) are shown in FIG. 8B.
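Forming a parallax picture image of one picture image with the other as reference can be sketched with classic stereo block matching (an illustrative stand-in; the block size, the one-dimensional search range and the sum-of-absolute-differences measure are assumptions):

```python
import numpy as np

def parallax_image(ref, tgt, block=4, search=6):
    """Per-block horizontal disparity of tgt relative to ref, found by
    minimizing the sum of absolute differences (SAD) of candidate blocks
    over a horizontal search range."""
    h, w = ref.shape
    disp = np.zeros((h, w))
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            patch = ref[y:y + block, x:x + block]
            best_d, best_err = 0, np.inf
            for d in range(-search, search + 1):
                xs = x + d
                if xs < 0 or xs + block > w:
                    continue  # candidate block would leave the image
                cand = tgt[y:y + block, xs:xs + block]
                err = np.abs(patch - cand).sum()
                if err < best_err:
                    best_d, best_err = d, err
            disp[y:y + block, x:x + block] = best_d
    return disp
```

Textureless blocks match ambiguously with this naive search, which is one reason practical matchers add confidence checks; only textured regions are asserted below.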
  • the intermediate picture image forming and processing unit 108 divides the overlap regions W1 and W2 respectively into n of a plurality of small regions in the horizontal direction, combines the small regions of the overlap region W1 with the small regions of the overlap region W2 and forms the intermediate picture images IIk(l) through IIk(n) as shown by FIG. 8C with the deviations of the images of the object as parameters.
  • the picture image compositing and processing unit 109 successively composites the plurality of picture images stored in the frame memory 102 by using the parallel movement amounts of the pairs of picture images and the intermediate picture images, by which one sheet of a panoramic picture image is obtained.
  • the panoramic picture image is a picture image where a portion 706 of the picture image I(k) except the overlap region W1, the intermediate picture images IIk(1) through IIk(n) and a portion 707 of the picture image I(k+1) except the overlap region W2 are arranged from the left side in this order.
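The panoramic arrangement described above, with the two non-overlapping portions flanking the intermediate strips, can be sketched as follows (an illustrative helper, assuming equal image heights and grayscale arrays):

```python
import numpy as np

def composite_panorama(left, right, overlap, intermediates):
    """Assemble a panorama in the order described above: the left picture
    image minus its overlap region, the n intermediate strips, then the
    right picture image minus its overlap region."""
    return np.hstack([left[:, :-overlap]] + intermediates + [right[:, overlap:]])
```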
  • the picture image output processing unit 110 provides and displays the panoramic picture image formed by the picture image compositing and processing unit 109 to the picture image outputting device 103.
  • the timing control unit 111 controls timings of operation at the respective units 104 through 110.
  • FIG. 3 is a view showing an example of the memory structure of RAM 112. RAM 112 is used for obtaining information stored in regions designated by predetermined addresses, or for storing information produced by the respective processings explained below.
  • the memory space of RAM 112 is divided into a plurality of regions at equal intervals and an address representing a position of the front of the region in the memory space is added to each region.
  • Address 1, Address 2, Address 3 . . . are assigned to the respective regions arranged from the uppermost region downward.
  • writing, reading and storing information to a single region or a plurality of regions of RAM 112 are referred to as "writing, reading and storing to an address".
  • the number of frames of picture images stored in the frame memory 102 is stored in a front region 201.
  • Addresses of front regions among a single or a plurality of regions storing the first through n-th picture images in the frame memory 102 are stored in respective regions of a first group of regions 202 that is constituted by a plurality of regions from the second region to a region at a predetermined ordinal number.
  • The addresses of front regions among a single or a plurality of regions storing a panoramic picture image in the frame memory 102 are stored in respective regions of a second group of regions 203 that is constituted by a plurality of regions from a region next to the first group of regions 202 to a region at a predetermined ordinal number.
  • Addresses of front regions among a single or a plurality of regions storing an intermediate picture image in the frame memory 102 are stored in the respective regions of a third group of regions 204 that is constituted by a plurality of regions from a region next to the second group of regions 203 to a region at a predetermined ordinal number.
  • Parallel movement amounts among respective picture images are stored in respective regions of a fourth group of regions 205 that is constituted by regions from a region next to the third group of regions 204 to a region at a predetermined ordinal number.
  • Deviations of images of objects among respective picture images are stored, each over a plurality of regions, in a fifth group of regions 206 that is constituted by a plurality of regions from a region next to the fourth group of regions 205 to a region at a predetermined ordinal number.
  • the timing control unit 111 transmits reset signals to the picture image input processing unit 104, the picture image preprocessing unit 105, the picture image position calculating unit 106, the picture image parallax sampling and processing unit 107, the intermediate picture image forming and processing unit 108, the picture image compositing and processing unit 109 and the picture image output processing unit 110.
  • The respective units 104 through 110 enter an awaiting state in response to the reset signals until they receive start signals from the timing control unit 111.
  • The timing control unit 111 outputs the start signal for storing picture images in the frame memory 102 to the picture image input processing unit 104, and enters an awaiting state until it receives a finish signal from the picture image input processing unit 104.
  • Upon receiving the start signal from the timing control unit 111, the picture image input processing unit 104 performs a processing for storing a plurality of picture images in the frame memory 102 by means of the picture image inputting device 101.
  • In one case, the picture image inputting device 101 comprises picture image inputting means, for example a camera, and A/D converting means for converting analogue picture image signals obtained from the picture image inputting means into digital picture image signals. The processing is then one in which the picture image inputting means carries out imaging of a picture image in response to the signal from the picture image input processing unit 104, the analogue picture image signals obtained thereby are converted into digital picture image signals by the A/D converting means, and the digital picture image signals are transmitted to predetermined addresses of the frame memory 102.
  • In another case, the picture image inputting device 101 is a device for storing picture images, and the processing is one in which the picture image input processing unit 104 transmits the picture image signals stored in the picture image inputting device 101 to predetermined addresses of the frame memory 102.
  • Upon finishing the transmission of all the picture images to the frame memory 102, the picture image input processing unit 104 writes the number of frames of picture images transmitted to the frame memory 102 and the addresses at which the picture images are stored in the frame memory 102 successively to predetermined addresses of RAM 112, and thereafter transmits a picture image input finishing signal to the timing control unit 111, whereby the processing is finished.
  • Upon receiving the finish signal for the transmission of the inputted picture images to the frame memory, the timing control unit 111 transmits the start signal to the picture image preprocessing unit 105 and enters an awaiting state until it receives a finish signal from the picture image preprocessing unit 105.
  • Upon receiving the start signal from the timing control unit 111, the picture image preprocessing unit 105 reads the number of frames of picture images to be processed from RAM 112 and stores the number of frames. Further, the picture image preprocessing unit 105 sets a counter k to 1, reads from RAM 112 the address in the frame memory 102 of the picture image I(k) corresponding to the counter k, and carries out a known processing of removing distortion with respect to the read picture image I(k). The processing includes, for example, correction of aberration of the lens of the imaging means or the like. The picture image from which the distortion has been removed is again written to the same address in the frame memory 102.
  • The preprocessing with respect to the picture image I(k) comprises the processings from reading the address of the picture image I(k) to rewriting the picture image.
  • The counter k is incremented by 1 and compared with the number of frames of picture images; when the counter is equal to or less than the number of frames of picture images, the preprocessing is carried out with respect to the next picture image.
  • Otherwise, the picture image preprocessing unit 105 transmits a picture image preprocessing finish signal to the timing control unit 111, whereby the processing is finished.
  • Upon receiving the finish signal from the picture image preprocessing unit 105, the timing control unit 111 transmits the start signal to the picture image position calculating unit 106 and enters an awaiting state until it receives a finish signal from the picture image position calculating unit 106.
  • Upon receiving the start signal from the timing control unit 111, the picture image position calculating unit 106 reads and stores the number of frames of picture images to be processed from RAM 112 and sets the counter k to 1. Next, the picture image position calculating unit 106 reads from RAM 112 the address in the frame memory 102 of the picture image I(k) in correspondence with the counter k and, in accordance with the address, reads a portion (Rx0, Ry0)-(Rx1, Ry1) constituting a reference region R1 for pattern matching, as shown in FIG. 9A, included in the predetermined picture image I(k) within the frame memory 102.
  • the reference region R1 is set to, for example, a predetermined position included in the overlap region of the picture image I(k).
  • The picture image position calculating unit 106 similarly reads a search region BL, (Sx0, Sy0)-(Sx1, Sy1), of the picture image I(k+1) next to that picture image.
  • As for the search region BL, when the positional relationship between the picture images I(k) and I(k+1) is known to some degree, that is, when the imaging is carried out under a constant positional relationship or when an approximate positional relationship is given from outside after the imaging operation, the search region is restricted in accordance with that information so as to be smaller than the total region of the picture image I(k+1); when the positional relationship is not yet determined, the total region of the picture image is read as the search region.
  • the pattern matching is carried out based on the reference regions and the search regions of the respective read picture images whereby reference positions of regions where the reference regions match in the search regions, that is, reference positions of regions (mx0, my0) having the highest correlation values in respect of the reference regions in the search regions, are calculated.
  • The correlation value is calculated, for example, in accordance with the following equation.
  • The region matching the reference region, having the position (mx0, my0) as one of its apexes, is referred to as a matching region BM.
  • R_area designates the reference region,
  • R(x, y) designates the brightness value at coordinates (x, y) of the reference region,
  • S(x, y) designates the brightness value at coordinates (x, y) in the search region, and
  • offset_x and offset_y designate the offset of the searched picture image from the origin (0, 0).
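Equation (1) itself is not reproduced in this excerpt, so the sketch below substitutes a plain sum-of-absolute-differences score for the patent's correlation measure; the function name and the list-of-rows image representation are illustrative assumptions, not the patent's notation.

```python
# Illustrative sketch only: a dissimilarity score stands in for the
# patent's Equation (1). Images are lists of rows of brightness values.

def match_reference(ref, search):
    """Slide the reference region over the search region and return the
    position (mx0, my0) whose window differs least from the reference."""
    rh, rw = len(ref), len(ref[0])
    sh, sw = len(search), len(search[0])
    best = None
    best_pos = (0, 0)
    for oy in range(sh - rh + 1):
        for ox in range(sw - rw + 1):
            # accumulate the dissimilarity of this candidate window
            score = sum(abs(ref[y][x] - search[oy + y][ox + x])
                        for y in range(rh) for x in range(rw))
            if best is None or score < best:
                best, best_pos = score, (ox, oy)
    return best_pos

ref = [[9, 9], [9, 9]]
search = [[0, 0, 0, 0],
          [0, 9, 9, 0],
          [0, 9, 9, 0],
          [0, 0, 0, 0]]
mx0, my0 = match_reference(ref, search)
```

An exhaustive scan like this is what restricting the search region BL, as described above, makes cheaper.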
  • The calculated position (mx0, my0) designates a position within the respective picture images of the reference region and the search region cut out from the original picture images; accordingly, correction is performed by the following Equations (2) and (3), whereby the parallely moved amounts (x_move, y_move) of the picture images I(k) and I(k+1) are calculated.
  • The parallely moved amounts are written to predetermined addresses of RAM 112.
  • The position shifted from the origin (0, 0) of the picture image I(k) by the parallely moved amounts (x_move, y_move) is the position to be overlapped with the origin of the picture image I(k+1) when the picture image I(k) overlaps the picture image I(k+1) such that the images of the same object overlap.
  • Rx0 and Ry0 are x- and y-coordinates of a reference point of the reference region R, respectively, and Sx0 and Sy0 are x- and y-coordinates of a reference point of the search region S, respectively.
  • The upper left vertexes of the regions R and S are taken as the reference points.
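Equations (2) and (3) are likewise not reproduced in this excerpt; assuming that (mx0, my0) is measured from the search region's reference point (Sx0, Sy0), one plausible form of the correction is:

```python
# Hypothetical sketch: the exact Equations (2) and (3) are not shown in
# this excerpt. We assume (mx0, my0) is relative to the search region's
# upper left vertex (Sx0, Sy0), and that (Rx0, Ry0) is the reference
# region's upper left vertex, as defined above.

def parallely_moved_amounts(mx0, my0, Rx0, Ry0, Sx0, Sy0):
    # The matched content sits at (Sx0 + mx0, Sy0 + my0) in I(k+1) and at
    # (Rx0, Ry0) in I(k); their difference locates I(k+1)'s origin in I(k).
    x_move = Rx0 - (Sx0 + mx0)
    y_move = Ry0 - (Sy0 + my0)
    return x_move, y_move

x_move, y_move = parallely_moved_amounts(2, 1, 10, 20, 5, 8)
```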
  • The processing of calculating the position of the picture image comprises the processings from reading the address of the picture image I(k) to writing the parallely moved amount.
  • The pattern matching is carried out for the respective reference regions; by using the result, a change in the scale of the image of the object can be calculated from the changes in the distances among the reference regions and the distances among the matching regions, and a relative rotational angle of the image or the like can be calculated from the difference between the directions formed by the arrangement of the reference regions and the arrangement of the matching regions.
  • The scale S(k) is calculated by Equation (4) and the rotational angle q(k) is calculated by Equation (5), where |Vr| designates the magnitude of the vector Vr.
  • The processing is carried out before and after the processing of calculating the position of the picture image: from the scale S(k) and the rotational angle q(k) of the picture image I(k+1) with respect to the picture image I(k), the picture image I(k+1) is multiplied by 1/S(k) and rotated by -q(k), and the multiplied and rotated picture image I(k+1) may be written to the address of the original picture image I(k+1) in the frame buffer.
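Since Equations (4) and (5) are not reproduced in this excerpt, the following is a hedged sketch of the usual vector-based computation: Vr connects two reference-region centers in I(k), Vm connects the corresponding matching-region centers in I(k+1), the length ratio gives the scale and the direction difference gives the rotational angle.

```python
import math

# Sketch under assumption: vr and vm are 2-D displacement vectors between
# region centers; the exact Equations (4) and (5) are not shown here.

def scale_and_rotation(vr, vm):
    # |Vm| / |Vr| gives the change in scale S(k)
    s = math.hypot(vm[0], vm[1]) / math.hypot(vr[0], vr[1])
    # difference of the vector directions gives the rotational angle q(k)
    q = math.atan2(vm[1], vm[0]) - math.atan2(vr[1], vr[0])
    return s, q

s, q = scale_and_rotation((2.0, 0.0), (0.0, 4.0))
```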
  • The counter k is incremented by 1 and compared with the number of frames of picture images; when the counter k is smaller than the number of frames of picture images, the processing of calculating the next picture image position is carried out.
  • Otherwise, the picture image position calculating unit 106 transmits a picture image position calculation finish signal to the timing control unit 111, whereby the processing is finished.
  • Upon receiving the finish signal from the picture image position calculating unit 106, the timing control unit 111 transmits the start signal to the picture image parallax sampling and processing unit 107 and enters an awaiting state until it receives a finish signal from the picture image parallax sampling and processing unit 107.
  • Upon receiving the start signal from the timing control unit 111, the picture image parallax sampling and processing unit 107 reads and stores the number of frames of picture images to be processed from RAM 112 and sets the counter k to 1. Next, the picture image parallax sampling and processing unit 107 reads from RAM 112 the address in the frame memory 102 of the picture image I(k) in correspondence with the counter k and reads the picture image in the frame memory 102 at that address. Further, the picture image parallax sampling and processing unit 107 reads the next picture image (in correspondence with the counter k+1).
  • The picture image parallax sampling and processing unit 107 reads from RAM 112 the parallely moved amounts calculated by the picture image position calculating unit 106 and calculates the regions W1 and W2 where the picture image I(k) and the picture image I(k+1) overlap, with respect to the picture image I(k) and the picture image I(k+1) respectively.
  • The picture image parallax sampling and processing unit 107 divides the overlap region W1 of the picture image I(k) into small regions, determines the respective small regions as reference regions, and searches for matching regions corresponding to the respective reference regions in the picture image I(k+1) by using the pattern matching method.
  • The search is carried out in order to calculate the deviation between the images of the object caused by parallax; therefore, the region for calculating the correlation value may be moved only in the horizontal direction.
  • The deviation amount is the amount of apparent movement of a body due to parallax; since the amount is caused by the depth, the depth of the body represented by the images of the object in the reference regions can be estimated from the direction and the amount of the deviation.
  • the deviation of the images of the same object is calculated in the overlap region of the picture images I(k) and I(k+1).
  • the image of the object of the picture image I(k+1) is further deviated to the left than the image of the object of the picture image I(k).
  • the image of the object of the picture image I(k+1) is further deviated to the right than the image of the object of the picture image I(k).
  • FIG. 10 is a graph showing a relationship between a deviation amount between images of an object and an actual depth of the object when an amount of apparent movement of a body at an infinitely remote point is 0.
  • the ordinate is the amount of deviation between the images of the object (unit; dot) and the abscissa is the depth (unit; m).
  • the amount of deviation between the images of the object is reduced in inverse proportion to the depth.
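The inverse relationship of FIG. 10 can be illustrated numerically; the constant 32.0 below is an arbitrary, made-up camera constant for illustration only, not a value from the patent.

```python
# Numerical illustration of FIG. 10: with the apparent movement of a body
# at an infinitely remote point taken as 0, the deviation between the
# images of an object (in dots) falls in inverse proportion to its depth.

def deviation_for_depth(depth_m, k=32.0):
    # doubling the depth halves the deviation between the images
    return k / depth_m

near = deviation_for_depth(1.0)   # shallow depth, large parallax
far = deviation_for_depth(4.0)    # deep depth, small parallax
```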
  • The depth is a distance corresponding to the parallely moved amount calculated at the picture image position calculating unit 106; in this case, the distance may be corrected based on the deviation amounts calculated in the overlap region.
  • When the distance is matched to the location of the image of the object having the deepest depth in the picture image, the reference value is determined as the minimum value of the deviation amounts; when the distance is matched to the location of the image of the object having the shallowest depth in the picture image, the reference value is determined as the maximum value of the deviation amounts; alternatively, the reference value may be determined as the average of the deviation amounts.
  • The deviation amounts are corrected by adding the calculated reference value to the parallely moved amount in RAM 112, calculating the differences between the reference value and the respective deviation amounts currently calculated in the overlap region, and writing the differences to RAM 112 as the deviation amounts. Thereby, the deviation amount serving as the reference is corrected to 0.
  • The accurate depth is not calculated; rather, the parallax accompanying the difference in depth, that is, the deviation amount per se between the images of the object, is stored in RAM 112 for the respective reference regions and the respective matching regions, or for the respective pixels.
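The correction steps above can be sketched as follows, assuming the deviation amounts are held in a simple list; the three modes mirror the minimum, maximum and average reference values named above, and the function name is illustrative.

```python
def correct_deviations(deviations, parallel_move, mode="min"):
    """Pick a reference deviation (deepest object = min, shallowest = max,
    or the average), fold it into the parallely moved amount and
    re-express every deviation relative to it, so that the reference
    deviation itself becomes 0."""
    if mode == "min":
        ref = min(deviations)
    elif mode == "max":
        ref = max(deviations)
    else:
        ref = sum(deviations) / len(deviations)
    corrected = [d - ref for d in deviations]
    return parallel_move + ref, corrected

moved, corrected = correct_deviations([4, 7, 10], parallel_move=100, mode="min")
```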
  • FIGS. 6A, 6B and 6C are views showing a flow of specific processings for carrying out the parallax sampling.
  • Each square of the grid drawn on the picture image I(k) shown in FIG. 6A corresponds to a reference region, and the circle in each region designates the center of the region.
  • The frame in the picture image I(k+1) designates the overlap region; each circle corresponds to a circle showing the center of a region of the picture image I(k), and the arrow mark extending therefrom designates the direction and the amount of the deviation between the images of the object in each reference region calculated by the matching operation.
  • FIG. 6C shows a parallax picture image 401 in which the deviation amounts, taken as positive in the left direction, are shown by the brightness values of the pixels. The darker a pixel, the smaller the parallax, that is, the deeper the depth; the brighter a portion, the larger the parallax, that is, the shallower the depth.
  • the depth sampling is carried out with the picture image I(k+1) as the reference side and the picture image I(k) as the search side.
  • the result of sampling the parallax of the picture image I(k+1) is similarly stored to RAM 112.
  • the parallax sampling processing includes processings of from reading the address of the picture image I(k) to storing the result of the parallax sampling.
  • The counter k is incremented by 1.
  • the parallax sampling processing is carried out in respect of the picture image I(k) and the picture image I(k+1).
  • the picture image parallax sampling and processing unit 107 transmits a finish signal to the timing control unit 111 by which the processing is finished.
  • Upon receiving the finish signal from the picture image parallax sampling and processing unit 107, the timing control unit 111 transmits the start signal to the intermediate picture image forming and processing unit 108 and enters an awaiting state until it receives a finish signal from the intermediate picture image forming and processing unit 108.
  • Upon receiving the start signal from the timing control unit 111, the intermediate picture image forming and processing unit 108 reads and stores the number of frames of picture images to be processed from RAM 112 and sets the counter k to 1. Next, the intermediate picture image forming and processing unit 108 reads from RAM 112 the address in the frame memory 102 of the picture image in correspondence with the counter k and reads the picture image I(k) in the frame memory 102 from that address. Further, the intermediate picture image forming and processing unit 108 reads the picture image I(k+1) (in correspondence with the counter k+1) next to that picture image.
  • the intermediate picture image forming and processing unit 108 reads from RAM 112 the parallely moved amount calculated at the picture image position calculating unit 106 and respectively calculates the overlap region of the picture image I(k) and the overlap region of the picture image I(k+1).
  • An intermediate picture image is a picture image formed from two frames of picture images imaged at positions having different observing points, by calculating the picture image that would be imaged at a position between the two observing points.
  • Positions of a camera when the picture images I(k) and I(k+1) are imaged are designated as imaging positions Tk and Tk+1.
  • the overlap regions W1 and W2 of the picture images I(k) and I(k+1) are respectively divided into n-1 of small regions in the horizontal direction.
  • an intermediate picture image IIk(l) formed by using a small region at the left end is the same as a picture image to be provided by the camera installed at a position internally dividing an interval between the imaging positions Tk and Tk+1 by a ratio of 1 to n-1.
  • The intermediate picture images IIk(l) formed by using the second from the left through the (n-1)-th small regions are the same as the picture images to be provided by the camera installed at the positions internally dividing the interval between the imaging positions Tk and Tk+1 by ratios of 2 to n-2 through n-1 to 1.
  • n-1 of the intermediate picture images IIk(l) are provided.
  • A method of forming an intermediate picture image, in forming the intermediate picture image at the position internally dividing the interval between the imaging position Tk and the imaging position Tk+1 by a ratio of 1 to (n-1), is shown below.
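The internal-division positions described above can be sketched for a one-dimensional camera position (an illustrative simplification; the patent does not express the positions this way):

```python
# Sketch: the camera position assumed for the intermediate picture image
# IIk(l) internally divides the interval Tk..Tk+1 by the ratio l : n-l,
# so l runs from 1 to n-1 and yields the n-1 intermediate viewpoints.

def intermediate_positions(t_k, t_k1, n):
    return [t_k + (t_k1 - t_k) * l / n for l in range(1, n)]

positions = intermediate_positions(0.0, 10.0, 5)
```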
  • The deviations between the images of the object of the picture image I(k) and of the picture image I(k+1), calculated at the picture image parallax sampling and processing unit 107, are read from RAM 112 for each pixel or for each region.
  • The coordinates of a point P1 in the overlap region R1, (rx0, ry0)-(rx1, ry1), of the picture image I(k) with respect to the picture image I(k+1) are defined as (x1, y1).
  • The brightness value at the coordinates of the point P1 is defined as D(P1).
  • The deviation amount between the images of the object at the coordinates of the point P1 is defined as M(P1) (arrow mark of FIG.
  • The intermediate picture image IIk(l) (FIG. 7C), with the picture image I(k) as the reference, is formed.
  • The brightness value D(P1) of each point P1 (x, y), (rx0 ≤ x ≤ rx1, ry0 ≤ y ≤ ry1), in the overlap region R1 is equal to the brightness at (x+M(P1)/n-rx0, y-ry0) on the intermediate picture image IIk(l).
  • The brightness value D(P1) is written to RAM 112 as the brightness value of the above-described point of the intermediate picture image IIk(l). This processing is carried out with respect to all the points in the overlap region R1.
  • the undefined pixel is interpolated by the intermediate picture image II k+1 (l) formed from the picture image I(k+1).
  • The intermediate picture image II'k+1(l), with the picture image I(k+1) as the reference, is formed (FIG. 7D). Specifically, the brightness value D(P2) of each point P2 (x, y), (rx2 ≤ x ≤ rx3, ry2 ≤ y ≤ ry3), in the overlap region W2 is written to RAM 112 as the brightness value of the point (x+M(P2)(n-1)/n-rx0, y-ry0) on the intermediate picture image II'k+1(l). However, as mentioned above, when a brightness value has already been written, priority is given to the brightness value of the pixel having the larger deviation amount.
  • The undefined pixels in the intermediate picture image IIk(l), formed from the picture image I(k) and the deviation amount for each pixel, are interpolated by the pixels of the corresponding intermediate picture image II'k+1(l).
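The forward mapping above, including the priority rule for collisions, can be sketched for a single row of pixels; None stands in for an undefined pixel, and the rounding of M/n is an illustrative assumption.

```python
# Sketch: each source pixel's brightness is written at x + M/n (rounded),
# and on collisions the pixel with the larger deviation amount wins,
# mirroring the priority rule stated above (a nearer body occludes a
# farther one). Unwritten slots stay None, i.e. undefined.

def forward_map_row(brightness, deviation, n):
    out = [None] * len(brightness)
    winner = [None] * len(brightness)   # deviation that claimed each slot
    for x, (d, m) in enumerate(zip(brightness, deviation)):
        tx = x + round(m / n)
        if 0 <= tx < len(out):
            if winner[tx] is None or m > winner[tx]:
                out[tx], winner[tx] = d, m
    return out

row = forward_map_row([10, 20, 30], [2, 0, 0], n=2)
```

In this tiny example the first pixel shifts onto the second pixel's target and wins by its larger deviation, leaving its source slot undefined.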
  • For the interpolation, first assume a state where the intermediate picture image IIk(l) and the corresponding intermediate picture image II'k+1(l) overlap such that the images of the same object overlap. In this state, the brightness value of the pixel of the intermediate picture image II'k+1(l) overlapping an undefined pixel Px of the intermediate picture image IIk(l) is written to a predetermined address of RAM 112 as the brightness value of the undefined pixel.
  • FIG. 14 is a view showing behavior of interpolation.
  • Each square corresponds to one pixel, and a pixel having a circle at its center is an undefined pixel.
  • The brightness value of the undefined pixel 501 surrounded by bold lines is taken as the average of the brightness values of the already defined pixels in its 8-neighborhood, that is, according to FIG. 14, the pixels having triangular marks at their centers surrounded by the bold broken line 502.
  • the above-described operation is repeated with respect to all the undefined pixels.
  • If no defined pixel is available, the interpolating operation for that pixel is carried out again after finishing the interpolation of the other undefined pixels, and the operation is repeated until no undefined pixels remain.
  • An interpolated picture image of better quality can be provided by carrying out the interpolating operation starting from the undefined pixels having the largest number of already defined pixels in their 8-neighborhoods.
  • The intermediate picture images IIk(l) are formed; among them, the region 601 that is used for the compositing operation is (dx*(l-1), 0)-(dx*l, (ry1-ry0)).
  • The region 601 is written to the frame memory 102 as the intermediate picture image IIk(l), whereby the formation of the intermediate picture images 1 to n-1 for the overlap region of the picture image I(k) and the picture image I(k+1) is finished.
  • Notation dx designates a width of 1/(n-1) of the width of the overlap region.
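The strip arithmetic above can be sketched as follows; the coordinate values in the example are illustrative, not taken from the patent.

```python
# Sketch: the overlap region is n-1 strips of width dx, and the strip
# taken from the intermediate picture image IIk(l) spans
# (dx*(l-1), 0)-(dx*l, ry1-ry0), as stated above.

def composited_strip(l, dx, ry0, ry1):
    return (dx * (l - 1), 0), (dx * l, ry1 - ry0)

top_left, bottom_right = composited_strip(3, dx=16, ry0=100, ry1=220)
```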
  • The number of frames of the formed intermediate picture images is written to a predetermined address of RAM 112 and the counter k is incremented by 1.
  • the intermediate picture image processing is carried out with respect to picture images corresponding to the updated counter.
  • the intermediate picture image forming and processing unit 108 transmits an intermediate picture image forming and processing finish signal to the timing control unit 111 by which the processing is finished.
  • Upon receiving the finish signal from the intermediate picture image forming and processing unit 108, the timing control unit 111 transmits the start signal to the picture image compositing and processing unit 109 and enters an awaiting state until it receives a finish signal from the picture image compositing and processing unit 109.
  • Upon receiving the start signal from the timing control unit 111, the picture image compositing and processing unit 109 reads and stores the number of frames of picture images to be processed from RAM 112 and sets the counter k to 1. Next, the picture image compositing and processing unit 109 reads from RAM 112 the address in the frame memory 102 of the picture image in correspondence with the counter k and reads the picture image I(k) in the frame memory 102 from that address. When the counter k is 1, the picture image compositing and processing unit 109 writes the picture image I(k) to a predetermined position of the output picture image region in the frame memory 102.
  • The picture image compositing and processing unit 109 reads from RAM 112 the deviation amount M(k) with respect to the picture image I(k-1). By using the deviation amount, the overlap region Rk with respect to the picture image I(k-1), (rx0, ry0)-(rx1, ry1), is calculated.
  • The number of frames n of the intermediate picture images with respect to the picture image I(k-1) and the picture image I(k), formed at the intermediate picture image forming and processing unit 108, is read from RAM 112.
  • The intermediate picture images IIk(i), i = 1, . . . , n, are copied to the positions Pi of the composite picture image memory in the output picture image region.
  • Upon receiving the finish signal from the picture image compositing and processing unit 109, the timing control unit 111 transmits the start signal to the picture image output processing unit 110 and enters an awaiting state until it receives a finish signal from the picture image output processing unit 110.
  • Upon receiving the start signal from the timing control unit 111, the picture image output processing unit 110 reads from RAM 112 the address of the output picture image in the frame memory 102. Next, the picture image output processing unit 110 transmits a picture image output start signal to the picture image outputting device 103, successively reads the output picture image signal from the frame memory 102 and outputs the signal to the picture image outputting device 103.
  • When the picture image output processing unit 110 finishes outputting the output picture image from the picture image outputting device 103, it transmits a finish signal of the processing to the timing control unit 111, whereby the processing is finished.
  • The timing control unit 111 finishes all the operations when it receives the finish signal from the picture image output processing unit 110.
  • A body proximate to the camera is composited with a narrow width and a body remote from the camera is composited with a wide width.
  • Accordingly, when a portion proximate to the camera is intended to look natural, in the picture image parallax sampling and processing unit 107, the deviation amount of a body proximate to the camera is determined as the reference value by which the deviation amounts as a whole are corrected; when a portion remote from the camera is intended to look natural, the deviation amount of a body remote from the camera is determined as the reference value by which the deviation amounts as a whole are corrected.
  • The embodiment shown here is an example of the invention, and the invention is not limited to the content of the embodiment so far as the gist of the invention is not changed.
  • FIG. 2 is a block diagram showing the electric constitution of a panoramic picture image forming apparatus 100a that is a second embodiment of the invention.
  • the panoramic picture image forming apparatus 100a includes the picture image inputting device 101, the frame memory 102, the picture image outputting device 103, the picture image input processing unit 104, the picture image preprocessing unit 105, the picture image position calculating unit 106, a picture image parallax sampling and processing unit 107a, a joint line determining and processing unit 108a, a picture image compositing and processing unit 109a, the picture image output processing unit 110, the timing control unit 111 and RAM 112.
  • Upon receiving the finish signal from the picture image position calculating unit 106, the timing control unit 111 transmits the start signal to the picture image parallax sampling and processing unit 107a and enters an awaiting state until it receives the finish signal from the picture image parallax sampling and processing unit 107a.
  • Upon receiving the start signal from the timing control unit 111, the picture image parallax sampling and processing unit 107a reads and stores the number of frames of picture images to be processed from RAM 112 and sets the counter k to 1. Next, the picture image parallax sampling and processing unit 107a reads from RAM 112 the address in the frame memory 102 of the picture image in correspondence with the counter k and reads the picture image I(k) in the frame memory 102 from that address. Further, the picture image parallax sampling and processing unit 107a reads the picture image I(k+1) (in correspondence with the counter k+1) next to that picture image.
  • The picture image parallax sampling and processing unit 107a reads from RAM 112 the parallely moved amount calculated at the picture image position calculating unit 106 and respectively calculates the overlap region W1 of the picture image I(k) and the overlap region W2 of the picture image I(k+1).
  • The picture image parallax sampling and processing unit 107a divides the overlap region W1 of the picture image I(k) into small regions, determines the respective small regions as reference regions, and searches for corresponding positions in the picture image I(k+1) by using the pattern matching method.
  • What is searched for is the deviation between the images of the object caused by the parallax; accordingly, the searching operation may be carried out only in the horizontal direction.
  • The deviation amount is the amount of apparent movement of a body caused by the parallax; since the amount is caused by the depth, the depth of the body in a reference region can be estimated from the direction and the amount of the deviation.
  • The deviation amount of the region having the least deviation amount is subtracted from the deviation amounts of the respective regions or pixels, and the difference is added to the parallely moved amount calculated at the picture image position calculating unit 106.
  • The deviation of the images of the object at the portion having the deepest depth inside the picture image is determined as the reference deviation, by which the parallely moved amount in RAM 112 and the respective deviation amounts in the overlap regions are corrected.
  • the parallax sampling is carried out with the picture image I(k+1) as the reference side and the picture image I(k) as the search side.
  • the result of the parallax sampling of the picture image I(k+1) is similarly corrected and stored to RAM 112.
  • when the parallax sampling processing of the picture image I(k) and the picture image I(k+1) has been finished, the counter k is incremented by 1. When the counter k is smaller than the number of frames of picture images, the parallax sampling and processing with respect to the picture images I(k) and I(k+1) is carried out again. When the counter k is equal to or larger than the number of frames of picture images, the picture image parallax sampling and processing unit 107a transmits the finish signal to the timing control unit 111, by which the processing is finished.
  • upon receiving the finish signal from the picture image parallax sampling and processing unit 107a, the timing control unit 111 transmits the start signal to the joint line determining and processing unit 108a and enters a waiting state until it receives the finish signal from the joint line determining and processing unit 108a.
  • upon receiving the start signal from the timing control unit 111, the joint line determining and processing unit 108a reads and stores from RAM 112 the number of frames of picture images to be processed and sets the counter k to 1. Next, it reads from RAM 112 the address in the frame memory 102 of the picture image I(k) corresponding to the counter k and reads the picture image I(k) from that address. Further, it reads the picture image I(k+1) next to that picture image.
  • the joint line determining and processing unit 108a reads from RAM 112 the deviation amount of the images of the object (parallax amount) between the picture image I(k) and the picture image I(k+1) calculated by the picture image parallax sampling and processing unit 107a.
  • E(x, y) designates a parallax amount or a concentration of a pixel of a parallax picture image at coordinates (x, y) of the picture image and Threshold designates a pertinent threshold value.
  • the point is represented by a white circle in FIG. 11E.
  • these points are points on the contour lines at the right side of the images 317 and 319 of the object.
  • the images 317 and 319 are arranged horizontally and accordingly, among these points, the points on the contour lines of the images 317 and 319 are arranged horizontally.
  • these point rows designate the contour line on the right side of the images of the object. On the left side of the row of points on the picture image I(k+1) corresponding to the above-described point rows, a portion invisible in the picture image I(k) becomes visible, and the difference between the picture images I(k) and I(k+1) becomes significant especially in the vicinity of the left side of these points. Further, in the vicinity of the right side of these points the same thing is visible in both picture images and accordingly, the difference between the picture images I(k) and I(k+1) becomes small.
  • according to a first method of sampling a joint line in consideration of the above-described situation, points having parallaxes as similar as possible are sampled from the point rows having a large calculated change in parallax, that is, the contour line of the same body in the vicinity of the camera is sampled and is determined as the joint line. Further, when there is no portion having the large change in parallax in the up and down direction as shown in FIG. 11E, a line dropped vertically from the end of the most proximate contour line to the upper or lower end of the picture image is determined as the joint line.
  • according to a second sampling method, a line connecting the rightmost point rows in the overlap region exceeding a certain threshold value "Threshold", among the point rows having the large change in parallax calculated similarly to the first sampling method as shown by FIG. 12A, is determined as a joint line L(k).
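a minimal sketch of the second sampling method, assuming the parallax amounts are available as a 2-D map E(x, y): per row, the rightmost point whose horizontal change in parallax exceeds "Threshold" is taken, and rows without such a point inherit the nearest detected row, mimicking the vertical drop of the first method. The fill-in strategy is an illustrative assumption:

```python
import numpy as np

def joint_line_rightmost(parallax, threshold):
    """Return, per row, the x coordinate of the joint line: the rightmost
    point where the horizontal change in parallax exceeds `threshold`."""
    h, w = parallax.shape
    change = np.abs(np.diff(parallax.astype(int), axis=1))  # |E(x+1,y) - E(x,y)|
    xs = np.full(h, -1, dtype=int)
    for y in range(h):
        cols = np.nonzero(change[y] > threshold)[0]
        if cols.size:
            xs[y] = cols.max()                 # rightmost large-change point
    # rows with no detection inherit the nearest detected row (vertical drop)
    last = None
    for y in range(h):
        if xs[y] >= 0:
            last = xs[y]
        elif last is not None:
            xs[y] = last
    last = None
    for y in range(h - 1, -1, -1):
        if xs[y] >= 0:
            last = xs[y]
        elif last is not None:
            xs[y] = last
    return xs
```

running the method with different threshold values yields different candidate lines, as described below.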
  • a third sampling method of a joint line is as follows.
  • the first and the second sampling methods of the joint line are similarly applicable by calculating the point rows having the large change in parallax from left to right with the picture image I(k+1) as a reference. That is, by calculating joint lines from the picture image I(k) and the picture image I(k+1) in accordance with the two methods, a plurality of candidates of joint lines are calculated. Candidates of a plurality of joint lines can further be obtained by changing the value of the threshold value "Threshold".
  • candidates of joint lines L i (k), i=1, 2, . . . are represented by white bold lines.
  • the most pertinent one among the joint line candidates L i (k), i=1, 2, . . . is a joint line minimizing the following value.
  • the following Equation calculates an integration of a difference in brightness values of points on the joint lines.
  • notation C designates paths on the joint lines L i (k), L i (k)(x, y) designates a brightness value at a point (x, y) on a joint line in the picture image I(k) (or a vector of each value of RGB), and L i '(k)(x', y') designates a brightness value at a point (x', y') on a joint line L i (k) in the picture image I(k+1) in correspondence with the point (x, y) of the above-mentioned picture image I(k) (or a vector of each value of RGB).
  • a candidate L i (k) of a joint line minimizing the above Equation is calculated and is determined as the joint line as shown by FIG. 13C.
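the selection among candidates can be sketched as follows, under the simplifying assumption that the two overlap regions are already aligned so that corresponding points (x, y) and (x', y') share the same coordinates. The candidate minimizing the integrated absolute brightness difference along its path is chosen:

```python
def best_joint_line(img1, img2, candidates):
    """Choose, among candidate joint lines, the one minimizing the summed
    absolute brightness difference between the two images along the line.

    img1, img2: 2-D brightness arrays indexed as img[y][x] (aligned overlaps).
    candidates: list of paths, each a list of (x, y) pixel coordinates.
    Returns (index of best candidate, list of costs).
    """
    costs = []
    for path in candidates:
        # integrate |brightness difference| between both images along the path
        cost = sum(abs(float(img1[y][x]) - float(img2[y][x])) for (x, y) in path)
        costs.append(cost)
    best = min(range(len(candidates)), key=costs.__getitem__)
    return best, costs
```

a low cost means the two picture images agree along that line, so the joint is least likely to show a visible seam there.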
  • the above-calculated joint lines L(k) are successively written to predetermined addresses of RAM 112 by which the processing of determining joint lines is finished.
  • when the processing of determining the joint lines of the picture image I(k) and the picture image I(k+1) has been finished, the counter k is incremented by 1. When the counter k is smaller than the number of frames of picture images, the processing of determining joint lines with respect to the picture images I(k) and I(k+1) is performed again. When the counter k is equal to or larger than the number of frames of picture images, the joint line determining and processing unit 108a transmits the finish signal to the timing control unit 111, by which the processing is finished.
  • upon receiving the finish signal from the joint line determining and processing unit 108a, the timing control unit 111 transmits the start signal to the picture image compositing and processing unit 109a and enters a waiting state until it receives the finish signal from the picture image compositing and processing unit 109a.
  • upon receiving the start signal from the timing control unit 111, the picture image compositing and processing unit 109a reads and stores from RAM 112 the number of frames of picture images to be processed and sets the counter k to 1. Next, it reads from RAM 112 the address in the frame memory 102 of the picture image corresponding to the counter k and reads the picture image I(k) from that address.
  • the picture image compositing and processing unit 109a writes the picture image I(k) to a predetermined position in the output picture image region of the frame memory 102.
  • the picture image compositing and processing unit 109a calls from RAM 112 a parallelly moved amount M(k) and a joint line L(k-1) with respect to the picture image I(k-1).
  • a position P k for writing the picture image I(k) in the output picture image region is provided by the following Equation. ##EQU2## Further, the joint line is corrected to a position deviated from the original position by P k-1 .
  • Pixels are written as they are at a portion successive to the w dots on the right side of the joint line L k-1 .
  • notation w is provided with a pertinent constant value.
  • pixels on the left side of the joint line L k-1 are made to be black pixels.
  • a region 802 comprising components of w dots on the right side of the joint line L k-1 is determined as a region for performing concentration smoothing. Concentrations of pixels in the region 802 are smoothed such that, for example, concentrations of pixels in contact with the joint line L k-1 are made substantially equal to the concentrations of the pixels in contact with the joint line L k-1 of the picture image I(k+1), and concentrations of the pixels of the region 802 most remote from the joint line L k-1 remain substantially equal to the original concentrations.
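the concentration smoothing in the region 802 can be sketched, for a single pixel row, as a ramp over the w dots to the right of the joint line. The linear weighting is an assumption; the embodiment only requires the stated boundary behaviour (equal to the neighbouring image at the joint line, equal to the original w dots away):

```python
def smooth_after_joint(out_row, other_row, x_joint, w):
    """Blend one pixel row over the w dots right of a joint line at x_joint.

    At the joint line the pixel takes the neighbouring image's value
    (other_row); w dots away it keeps its original value (out_row),
    with a linear ramp in between.
    """
    res = [float(v) for v in out_row]
    for i in range(w):
        x = x_joint + i
        if x >= len(res):
            break
        a = i / w                  # 0 at the joint line, rising toward 1
        res[x] = a * float(out_row[x]) + (1 - a) * float(other_row[x])
    return res
```

applying this per row hides the step in concentration that a hard joint would otherwise leave visible.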
  • the counter k is incremented by 1, and when the counter k is equal to or less than the number of frames of picture images, the above-described processings with respect to the updated picture image I(k) are successively performed. Further, when the counter k is larger than the number of frames of picture images, a picture image compositing finish signal is transmitted to the timing control unit 111, by which the processing is finished. Upon receiving the finish signal from the picture image compositing and processing unit 109a, the timing control unit 111 transmits the start signal to the picture image output processing unit 110 and enters a waiting state until it receives the finish signal from the picture image output processing unit 110.
  • upon receiving the start signal from the timing control unit 111, the picture image output processing unit 110 reads from RAM 112 the output picture image address in the frame memory 102. Next, it transmits the picture image output start signal to the picture image outputting device 103, successively reads the output picture image signal from the frame memory 102 and outputs the signal to the picture image outputting device 103.
  • when the picture image output processing unit 110 finishes outputting the output picture image from the picture image outputting device 103, it transmits the finish signal of the processing to the timing control unit 111, by which the processing is finished.
  • the timing control unit 111 finishes all of the operation.
  • the more proximate to the camera an object is disposed, the larger the difference in the parallax in the overlap regions of the two frames of picture images, and when the compositing operation is carried out at such a point, deviation or doubled imaging may be caused. By providing joint lines on the contour of an object proximate to the camera through the above-described flow of a series of processings, few portions having considerably different parallaxes are jointed, and the jointed picture image looks more natural than in the conventional technology, by which the quality is promoted.
  • according to the panoramic picture image compositing apparatus 101a of the second embodiment, even when an image of a moving body is included in the picture image, the movement is shown as a parallax in the operation of sampling the parallax and accordingly, high quality jointing can similarly be performed.
  • composition of the pair of picture images I(k) and I(k+1) is given as an example.
  • when composition of picture images of three frames or more is carried out by using the panoramic picture image forming apparatuses, the above-described processings may be carried out for each pair of picture images including images of the same object, and finally all the picture images may be jointed together successively.
  • although the picture images I(k) and I(k+1) provided by pan-imaging are objects of the compositing operation in the above description, the objects of the compositing operation are not limited to picture images provided by pan-imaging; picture images provided by other methods may also be used.
  • picture images may be provided by tilt-imaging where a camera is moved in the vertical direction.
  • when imaging ranges are arranged in a matrix and the respective imaging ranges overlap portions of other imaging ranges in the horizontal direction and the vertical direction respectively, picture images can be composited in both the horizontal and vertical directions.
  • a program which causes the computer to execute the forming and processing of a panoramic picture image as described in the first and second embodiments may be stored in a storage medium capable of being read by the computer.
  • as the storage medium, floppy disks, hard disks, magnetic tapes, CD-ROMs, optical disks, magneto-optical disks, minidisks, ROMs, RAMs, etc. may be used.
  • a first example of program comprises a series of procedures for causing the central processing unit instead of the units 104 through 111 of the panoramic picture image forming apparatus 100 to execute the processings which are to be executed by the units 104 through 111.
  • the processings are as follows:
  • to instruct the imaging means to capture divided picture images obtained by overlapping parts of picture images, by a plurality of imaging means or by moving an imaging means plural times, and to instruct storage means to store the divided picture images captured by the imaging means at a directed address;
  • the central processing unit executes the operation of correcting deviations of the picture images within the overlap region, and in the processing of compositing the divided picture images, the central processing unit executes the operation of interpolating deviations of the picture images by using the intermediate picture images to form a panoramic picture image.
  • a second example of program comprises a series of procedures for causing the central processing unit instead of the units 104 through 106, 107a through 109a, 110 and 111 of the panoramic picture image forming apparatus 100a to execute the processings which are to be executed by the units.
  • the processings are as follows:
  • to instruct the imaging means to capture divided picture images obtained by overlapping parts of imaged picture images, by a plurality of imaging means or by moving an imaging means plural times, and to instruct storage means to store the divided picture images captured by the imaging means at a directed address;
  • the central processing unit executes the operation of correcting deviations of the picture images within the overlap region, and in the processing of compositing the divided picture images, the central processing unit executes the operation of joining the divided picture images on the basis of the joint line to form a composite panoramic picture image.
  • the program stored in the storage medium is installed in the computer to cause the central processing unit to execute the program.
  • the central processing unit uses a memory as the frame memory 102 and RAM 112, to operate as the units 104 through 110 as well as the timing control unit 111.
  • the central processing unit uses a memory as the frame memory 102 and RAM 112, to operate sequentially as the units 104 through 106, 107a through 109a, and 110 as well as the timing control unit 111.
  • the computer can thus form a panoramic picture image like the first and second embodiments of the panoramic picture image forming apparatuses 100 and 100a. Accordingly, the panoramic picture image forming apparatuses 100 and 100a are realized by the computer.
  • the use of the above-described storage medium makes it possible to execute the programs of the invention not only in personal computers but also in various information apparatuses such as mobile information terminals and camera-incorporated picture image processing apparatuses.

Abstract

In forming a panoramic picture image by compositing dividedly imaged picture images, the invention corrects deviations of images of an object in overlap regions caused by a difference in fields of view of imaging means or by movement of an object to be imaged. A picture image forming apparatus includes a picture image inputting device, a frame memory for storing divided picture images inputted from the picture image inputting device, a picture image position calculating unit for calculating positions for compositing the divided picture images, a picture image parallax sampling and processing unit for sampling parallax information among picture images from picture images including parallaxes in the overlap regions among contiguous divided picture images, an intermediate picture image forming and processing unit for forming a plurality of intermediate picture images from the picture images including the parallaxes, and a picture image composition and processing unit for compositing a panoramic picture image from the divided picture images. The deviations of images in the overlap regions are corrected by the picture image parallax sampling and processing unit, and the panoramic picture image is composited by performing an interpolation with the intermediate picture images formed by the intermediate picture image forming and processing unit.

Description

BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to a picture image forming apparatus for forming a composite panoramic picture image from a plurality of still digital picture images.
2. Description of the Related Art
There are known methods for jointing a pair of digital picture images including images of a same object and having overlap regions where the picture images overlap each other, to form a panoramic picture image or a highly fine (high resolution) picture image: for example, a method disclosed in Japanese Unexamined Patent Publication JP-A 3-182976 (1991), in which two characteristic particles, which are granular portions having high concentration in a picture image, are sampled in each of a pair of picture images having overlap regions and the picture images are jointed together with a line connecting these as a joint line, and a method disclosed in Japanese Unexamined Patent Publication JP-A 5-122606 (1993), in which the pair of picture images are jointed together at a region minimizing a difference in concentration.
There is also a method described in "Picture Image Analysis Handbook", page 466, issued by Tokyo University Publication Association, in which, in jointing a pair of picture images, the picture images are not jointed by a line such as a joint line but the color tone is smoothed such that the change in concentrations of the two picture images does not become discontinuous at the surroundings of the joint.
A camera is supposed to provide a picture image describing a view in an imaging range by one imaging operation. In imaging a digital picture image by using the camera, there are a case where pan-imaging is carried out by hand holding, a case where pan-imaging is carried out by using a tripod, and a case where imaging is performed by installing a plurality of cameras having overlapped imaging ranges.
Pan-imaging indicates plural imaging operations in which the imaging range of a camera is moved in the horizontal direction such that, at each imaging operation, portions of the object imaged in the preceding or succeeding imaging operation are also included in the imaging range.
For example, as shown by FIG. 4, cameras 301 and 302 capable of imaging an object included in imaging ranges 307 and 308 each having a predetermined size by one imaging operation, are arranged such that the imaging ranges 307 and 308 overlap, to image objects 309 and 310. In this case, optical axes 303 and 304 of lenses 301a and 302a of the cameras intersect with each other by making an angle 305. In FIG. 4, a portion 311 where the imaging ranges 307 and 308 overlap is indicated by attaching hatched lines. The objects 309 and 310 are disposed in the portion 311. Distances (hereinafter, referred to as depth) from the camera 301 to the objects 309 and 310 differ from the depths from the camera 302 to the objects 309 and 310, respectively. Overlapped states of the objects 309 and 310 differ in view from the cameras 301 and 302 and therefore, a portion 321 of the object 310 is not seen from the camera 301 and seen from the camera 302.
FIGS. 5A and 5B show picture images 313 and 314 obtained by the cameras 301 and 302 through the imaging operation. The picture image 313 includes object images 317 and 318 representing the objects 309 and 310. The picture image 314 includes object images 319 and 320 of the objects 309 and 310.
Alternatively, only one camera may be arranged in the same manner as the camera 301 of FIG. 4. First the objects 309 and 310 are imaged by the camera, and then the camera is moved from the position where the optical axis of the lens coincides with the optical axis 303 to the position where the optical axis of the lens coincides with the optical axis 304, to image the objects 309 and 310 at the latter position. In this case, the positional relationship between the camera and the objects 309 and 310 in the first imaging operation is equal to that between the camera 301 and the objects 309 and 310, and the positional relationship between the camera and the objects 309 and 310 in the second imaging operation is equal to that between the camera 302 and the objects 309 and 310. Consequently the two picture images obtained in this case are equal to the two picture images obtained by the cameras 301 and 302.
In carrying out pan-imaging by hand holding or imaging by a plurality of cameras, positions of the lenses 301a and 302a of the cameras 301 and 302 are shifted as shown by FIG. 4. Further, in many of combinations of camera and tripod, the rotational center of a camera does not coincide with the lens center of the camera and accordingly, more or less movement of the position of lens is caused even in the case where pan-imaging is carried out by using a tripod. These differences in the position of lens give rise to a parallax caused by a difference in depths of objects in the picture image.
For example, as shown by FIGS. 5A and 5B, although the portion 321 is imaged in the image 320 of the object 310 in the picture image 314, the portion 321 is not imaged in the image 318 of the object 310 in the picture image 313. Therefore, according to the picture images 313 and 314, respective shapes and overlapped states of the images 317, 318; 319, 320 of the objects 309 and 310 in the overlapped portion 311 of the imaging ranges 307 and 308 do not coincide with each other.
Accordingly, when a panoramic picture image 323 shown by FIG. 5C is formed by jointing the picture images 313 and 314 such that the positions of the picture images 313 and 314 are made to coincide with each other in order to make the images 317 and 319 of the object 309 coincide with each other, the images 318 and 320 of the object 310 do not coincide with each other but are shifted from each other. Conversely, when a panoramic picture image 324 shown by FIG. 5D is formed by jointing the picture images 313 and 314 such that the positions of the picture images 313 and 314 are made to coincide with each other in order to make the images 318 and 320 of the object 310 coincide with each other, the images 317 and 319 of the object 309 do not coincide with each other but are shifted from each other. Further, when a moving body is an object, in the case where respective imaging operations are shifted over time, with the movement of the body the position and the shape of the image of the object are changed in the respective picture images and therefore, the images of the object do not coincide with each other also in the case where a panoramic picture image is formed from these picture images as described above.
Further, according to the conventional picture image jointing technology, picture images are jointed by a joint line which is a straight line or a straight line having a width. However, when there are differences among picture images as described above, the picture images often cannot be composited adequately. Particularly, even when portions of the picture images are naturally jointed, a shift or doubled imaging is caused on the joint lines at portions having parallax or including a moving body, thereby causing significant deterioration in the quality of a panoramic picture image.
SUMMARY OF THE INVENTION
In order to resolve the above-mentioned problems of the conventional technology, it is an object of the invention to provide a picture image forming apparatus in which, in imaging a scene by a camera or the like in the case where movement of the lens center of the camera is unavoidable, picture images taken by the camera and including a parallax caused by a difference in depth of an object are composited to obtain a composite picture image having inconsiderable deterioration in picture image quality caused by a shift in the image of the object or doubled imaging of images of the object.
Further, it is an object of the invention to provide a picture image forming apparatus whereby, in imaging a scene by a camera or the like, in the case where movement of the lens center of the camera is unavoidable, or in the case where imaging a moving body as an object is unavoidable, picture images having parallax caused by a difference in depth or including an image of a moving body are composited, thereby obtaining a composite picture image having inconsiderable deterioration of picture image quality caused by a shift in the image of an object or doubled imaging of images of the object.
A first aspect of the invention provides a picture image forming apparatus comprising:
imaging means for obtaining a pair of divided picture images having a parallax by imaging an object and a periphery of the object so as to include a same part of the object in the imaging regions for the pair of divided picture images;
calculating means for calculating overlap regions of the respective divided picture images including the images of the object,
parallax sampling means for sampling the parallax from the images of the object in the two overlap regions,
intermediate picture image forming means for forming intermediate picture images, which would be provided by imaging the object from points between the observing points from which the respective divided picture images are imaged, on the basis of the images of the object in the overlap regions of the pair of divided picture images and the sampled parallax, and
picture image composition means for providing a composite picture image by interpolating the parallax of the images of the object in the two overlap regions by the intermediate picture images and compositing the pair of divided picture images such that the overlap regions overlap.
According to the first aspect of the invention, the parallax caused by the depth of the object is calculated based on the images of the object in the overlap regions of the divided picture images and the images of the object in the overlap regions are interpolated by using the intermediate picture images for interpolating the positions of imaging the picture images based on the calculation thereby achieving an effect capable of forming a panoramic picture image that is more natural than in the conventional technology even in the case where the parallax is caused by the depth of the object.
A second aspect of the invention provides a picture image forming apparatus wherein, when there are a plurality of objects having different distances from the imaging means to the objects, the parallax sampling means corrects parallaxes of images of the respective objects such that the parallax of the image of any one of the objects among the parallaxes sampled from the images of the respective objects, is nullified.
According to the second aspect of the invention, the parallax sampling means corrects the sampled parallaxes as described above. Thereby, when the parallaxes of the images of the objects in the two overlap regions are interpolated by using the intermediate picture images formed on the basis of the parallaxes and the pair of divided picture images are composited, in the composite picture image, the images of any one of the objects are jointed naturally. Accordingly, a more natural composite picture image can be obtained.
A third aspect of the invention provides a picture image forming apparatus wherein the intermediate picture image forming means of the picture image forming apparatus forms a plurality of the intermediate picture images and the respective intermediate picture images are provided with observing points different from each other to provide the respective intermediate picture images by imaging the object.
According to the third aspect of the invention, the intermediate picture image forming means forms a plurality of the intermediate picture images as described above. The picture image composition means interpolates the parallaxes of the images of the object in the two overlap regions by using the plurality of intermediate picture images. Thereby, since differences in parallax of the images of the plurality of objects in the respective overlap regions are further alleviated than in the case of forming one intermediate picture image, a further natural composite picture image can be obtained. The more the number of intermediate picture images, the more a difference in parallax can be lowered.
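The idea of the third aspect, forming several intermediate views between the two observing points, can be sketched by shifting each pixel horizontally by a fraction of its sampled parallax. Nearest-pixel forward mapping and zero-filled holes are simplifications for illustration; the function name and conventions are assumptions, not the patented method itself:

```python
import numpy as np

def intermediate_images(img, parallax, n):
    """Form n intermediate views between two observing points.

    Each pixel of `img` is shifted horizontally by a fraction t of its
    per-pixel parallax, for t = 1/(n+1), ..., n/(n+1). Unfilled positions
    are left as 0.
    """
    h, w = img.shape
    views = []
    for k in range(1, n + 1):
        t = k / (n + 1)            # fractional observing point between 0 and 1
        view = np.zeros_like(img)
        for y in range(h):
            for x in range(w):
                nx = x + int(round(t * parallax[y, x]))
                if 0 <= nx < w:
                    view[y, nx] = img[y, x]
        views.append(view)
    return views
```

the more intermediate views are formed, the smaller the parallax step between neighbouring views, matching the observation that more intermediate picture images lower the difference in parallax.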
A fourth aspect of the invention provides a picture image forming apparatus wherein the calculating means of the picture image forming apparatus calculates differences in angle of the images of the object in the two divided picture images and subjects the divided picture images to a rotational transformation such that the calculated differences of the angles cancel each other.
According to the fourth aspect of the invention, the calculating means corrects to rotate the divided picture images as described above. Thereby, even in the case where not only the differences in the parallaxes but the differences in angle of the images of the object are included in the divided picture images, the differences are corrected and accordingly, the picture images can be composited further finely.
A fifth aspect of the invention provides a picture image forming apparatus, wherein the calculating means of the picture image forming apparatus calculates differences in size of the images of the object in the divided picture images and the two divided picture images are magnified or reduced such that the calculated differences in size cancel each other.
According to the fifth aspect of the invention, the calculating means corrects to magnify or reduce the divided picture images as described above. Thereby, even when the divided picture images include not only the differences in the parallaxes but differences in size of the images of the object, the differences are corrected and accordingly, the picture images can be composited further finely.
A sixth aspect of the invention provides a picture image forming apparatus wherein the calculating means calculates differences in angle and size of the images of the object in the two divided picture images and subjects the divided picture images to rotational transformation and magnification or reduction such that the calculated differences in angle and size cancel each other, respectively.
According to the sixth aspect of the invention, the calculating means, as mentioned above, corrects to rotationally transform and magnify or reduce the two divided picture images. Thereby, even when the divided picture images include not only the differences in parallax but also differences in angle and size of the images of the object, the differences are corrected, and accordingly the picture images can be composited further finely.
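The correction of the fourth through sixth aspects can be sketched as a single linear map that undoes measured differences in angle and size; the rotation sense and the meaning of the scale ratio are assumed conventions for illustration:

```python
import numpy as np

def correction_matrix(d_angle, d_scale):
    """2x2 linear part of the transform cancelling a measured difference in
    angle (radians) and size between two divided picture images: rotate by
    -d_angle and scale by 1/d_scale, so the differences cancel each other."""
    c, s = np.cos(-d_angle), np.sin(-d_angle)
    return (1.0 / d_scale) * np.array([[c, -s],
                                       [s,  c]])
```

applying this matrix to the pixel coordinates of one divided picture image before compositing removes the rotational and magnification mismatch, leaving only the parallax to be handled.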
A seventh aspect of the invention provides a picture image forming apparatus comprising imaging means 101 for providing a pair of divided picture images having parallaxes in images of objects by imaging the same objects respectively from two different observing points,
calculating means for calculating overlap regions including the images of the objects in the respective divided picture images,
parallax sampling means for sampling the parallaxes from the images of the objects in the overlap regions,
joint line setting means for setting joint lines on a profile of the image of one of the objects most proximate to the imaging means in the respective overlap regions based on the sampled parallaxes, and
picture image composition means for providing a composite picture image by compositing the pair of divided picture images such that the set joint lines overlap.
According to the seventh aspect of the invention, the parallaxes caused by the depths of the objects, or shifts in position of the image of a moving object in the picture image, are calculated in the overlap regions of the divided picture images, and lines which do not include portions having large parallaxes caused by the depths of the objects in the two picture images are used as joint lines. Thereby, even in the case of having parallaxes caused by the depths of the objects and the case of including an image representing a moving body, an effect capable of forming a natural composite picture image having quality superior to that in the conventional technology and provided with small distortion is achieved.
Namely, as described above, setting the joint lines makes it possible to hide the image of another object which is behind the object closest to the imaging means and becomes invisible from time to time in a vicinity of the periphery of the image of the object closest to the imaging means, by the image of the object closest to the imaging means, and accordingly the image quality of the composite picture image can be enhanced.
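Setting the joint lines on the profile of the image of the object most proximate to the imaging means can be sketched in Python; the function name, the disparity-map input and the threshold below are assumptions for illustration, not the patent's implementation:

```python
import numpy as np

def joint_line_columns(disparity, near_thresh):
    """For each row of a disparity (parallax) map over the overlap region,
    place the joint line at the first column, seen from the left rim, whose
    disparity reaches `near_thresh`, i.e. on the left edge of the profile of
    the object most proximate to the imaging means. Rows containing no near
    object fall back to the rightmost column."""
    h, w = disparity.shape
    cols = np.full(h, w - 1, dtype=int)
    for y in range(h):
        near = np.nonzero(disparity[y] >= near_thresh)[0]
        if near.size:
            cols[y] = near[0]
    return cols
```

Compositing would then take pixels left of the joint column from one divided picture image and pixels right of it from the other, so that the near object's image hides whatever appears and disappears behind it.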
An eighth aspect of the invention provides a picture image forming apparatus, wherein the joint line setting means of the picture image forming apparatus sets joint lines on profiles most proximate to rims of the divided picture images among the profiles of the images of the objects in the respective overlap regions.
According to the eighth aspect of the invention, even in the case where the joint formed according to the seventh aspect becomes unnatural, such as in the case where images of a plurality of objects having different depth dimensions exist in the overlap regions, by providing the joint lines on the profiles of the image of the object close to the rims of the divided picture images, an effect capable of forming a natural composite picture image having good quality and inconsiderable distortion is achieved.
A ninth aspect of the invention provides a picture image forming apparatus where the joint line setting means of the picture image forming apparatus calculates candidates of a plurality of joint lines by using respective different joint line determining methods and one of the candidates of joint lines is selected and determined to be the joint line.
According to the ninth aspect of the invention, the ninth aspect is provided in view of the case where, depending on the positions of the objects in the overlap regions, the invention may not be performed adequately when the picture image used as the reference is always fixed to either one of the pair of divided picture images according to the seventh and the eighth aspects of the invention. In that case, candidates of a plurality of joint lines are calculated by using different joint line determining methods, such as different selections of the picture image used as the reference, a combination of the seventh and the eighth joint line setting methods or the like, and an optimum joint line which most reduces the differences in the images of the objects on the joint line is selected from the candidates, by which an effect capable of forming a natural composite picture image having further stably improved quality and inconsiderable distortion is achieved.
A tenth aspect of the invention provides a picture image forming apparatus wherein the parallax sampling means corrects the parallaxes of the images of the respective objects such that a least parallax among the parallaxes sampled from the images of the respective objects is nullified.
According to the tenth aspect of the invention, the parallax sampling means corrects the sampled parallaxes as described above. An object of an image having the smallest parallax is an object farthest from the imaging means. A joint line is set on the profile of the image of the object having the largest parallax, namely an object proximate to the imaging means. In the image forming apparatus, the image of a far object on the joint line is less deviated by correcting the parallax of the image of the farthest object to zero. Further, since the joint line is set on the profile of the image of a near object, the image is hardly deviated. Namely, in the invention the joint line is set on the profile of the image of an object close to the imaging means, so that in a composite picture image the image of another object which appears and disappears in the vicinity of the object close to the imaging means and is behind the object is hidden by the image of the object close to the imaging means, and additionally the deviation of the image of the object far from the imaging means becomes zero. Thereby unnatural shapes, positions etc. of the image of an object within the composite picture image can be eliminated, and a further natural composite picture image can be provided.
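The correction the tenth aspect describes, nullifying the least sampled parallax so that the farthest object shows zero deviation, is a uniform shift of all sampled parallaxes; a minimal sketch with a hypothetical function name:

```python
import numpy as np

def normalize_parallaxes(parallaxes):
    """Shift the sampled parallaxes so that the least one, belonging to the
    object farthest from the imaging means, becomes zero."""
    p = np.asarray(parallaxes, dtype=float)
    return p - p.min()
```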
An eleventh aspect of the invention provides a storage medium capable of being read by a computer, storing a program for allowing the computer to execute the processings of:
inputting a pair of divided picture images having a parallax by imaging an object and a periphery of the object so as to include a same part of the object in predetermined imaging regions;
calculating overlap regions of the respective divided picture images including the images of the object;
sampling the parallax from the images of the object in the two overlap regions;
forming intermediate picture images to be provided by imaging the object from points between the observing points from which the respective divided picture images are imaged on the basis of the images of the object in the overlap regions of the pair of divided picture images and the sampled parallax; and
providing a composite picture image by interpolating the parallax of the images of the object in the two overlap regions by the intermediate picture images and compositing the pair of divided picture images such that the overlap regions overlap.
According to the eleventh aspect of the invention, the program stored in the above-mentioned storage medium is read by the computer to be executed, whereby the above-mentioned processings can be sequentially executed. Thereby the computer calculates a parallax of an image of an object on the basis of the image of the object within an overlap region, forms intermediate picture images produced on the basis of the parallax and thereby combines a pair of picture images by interpolating the parallax of the image of the object. Accordingly, even when an image of an object in an input picture image has a parallax, it is possible to form a further natural composite picture image in comparison with the prior art.
A twelfth aspect of the invention provides a storage medium capable of being read by a computer, storing a program for allowing the computer to execute the processings of:
inputting a pair of divided picture images having parallaxes in images of objects by imaging the same objects respectively from two different observing points,
calculating overlap regions including the images of the objects in the respective divided picture images,
sampling the parallaxes from the images of the objects in the overlap regions,
setting joint lines on a profile of the image of one of the objects most proximate to the observing points in the respective overlap regions based on the sampled parallaxes, and
providing a composite picture image by compositing the pair of divided picture images such that the set joint lines overlap.
According to the twelfth aspect of the invention, the program stored in the above-mentioned storage medium is read by the computer, which executes sequentially the above-mentioned processings. Thereby the computer can set a joint line so that the joint line does not pass a position where an image of an object has a large parallax, to composite a pair of picture images. Accordingly, even when images of the pair of picture images may be deviated due to a parallax caused by a difference in depth or a motion of the object, a natural composite picture image of high quality which becomes less deformed can be formed.
BRIEF DESCRIPTION OF THE DRAWINGS
Other and further objects, features, and advantages of the invention will be more explicit from the following detailed description taken with reference to the drawings wherein:
FIG. 1 is a block diagram showing the electric constitution of a panoramic picture image forming apparatus according to a first embodiment of the invention;
FIG. 2 is a block diagram showing the electric constitution of a panoramic picture image forming apparatus according to a second embodiment of the invention;
FIG. 3 is a view showing an embodiment of RAM 112 in FIG. 1;
FIG. 4 is a view showing behavior of imaging picture images having overlap regions by moving a camera or by using a plurality of cameras;
FIGS. 5A and 5B are views showing picture images 313 and 314 imaged in the constitution of FIG. 4, and FIGS. 5C and 5D are views showing that, in forming a panoramic view by a conventional technology, images 317, 319; 318, 320 of objects in the overlap regions of the picture images 313 and 314 do not coincide with each other owing to a difference in the depths of the objects;
FIGS. 6A through 6C are views showing behavior of matching for detecting deviations in images of objects caused by their depths;
FIGS. 7A through 7E are views showing behavior of forming an intermediate picture image IIk (l);
FIGS. 8A through 8D are views showing a flow of processings in forming a panoramic picture image according to a panoramic picture image processing device in the first embodiment of the invention;
FIGS. 9A through 9E are views showing behavior of calculating parallely moved amounts of picture images I(k) and I(k+1);
FIG. 10 is a graph showing a relationship between a deviation amount of matching between picture images and a depth from a camera;
FIGS. 11A through 11I are views showing a flow of processings in forming a panoramic picture image according to the panoramic picture image processing device in the first embodiment of the invention;
FIGS. 12A and 12B are views showing behavior of determining a joint line by using a second sampling method of a joint line;
FIGS. 13A through 13D are views showing behavior of determining a joint line by using a third sampling method of a joint line;
FIG. 14 is a view showing a method of interpolating undefined pixels from their surroundings in forming the intermediate picture image IIk(l).
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
Now referring to the drawings, preferred embodiments of the invention are described below.
FIG. 1 is a block diagram showing the electric constitution of a panoramic picture image forming apparatus 100 according to a first embodiment of the invention. FIGS. 8A through 8D are views for explaining an outline of panoramic picture image forming processings in the panoramic picture image forming apparatus 100. An explanation will be given thereof in reference to both FIG. 1 and FIGS. 8A through 8D.
The panoramic picture image forming apparatus 100 includes a picture image inputting device 101, a frame memory 102, a picture image outputting device 103, a picture image input processing unit 104, a picture image preprocessing unit 105, a picture image position calculating unit 106, a picture image parallax sampling and processing unit 107, an intermediate picture image forming and processing unit 108, a picture image composition and processing unit 109, a picture image output processing unit 110, a timing control unit 111 and RAM 112.
The picture image inputting device 101 acquires a plurality of picture images constituting objects of composition. Assume that the picture images constituting objects of composition are provided by, for example, pan-imaging as explained in the conventional technology. The plurality of picture images are respectively provided with regions having images of an object that is the same as an object in other picture images, which are referred to as overlap regions. The images of the object include deviation by parallax and distortion. Although according to the panoramic picture image forming apparatus 100, a picture image is dealt with in the mode of a picture image signal representing the picture image, the picture image signal may be referred to as "picture image" in the specification.
In the following explanation, it is assumed that the plurality of picture images constituting objects of composition are the picture images I(k) and I(k+1) shown by FIG. 8A. The picture images I(k) and I(k+1) are the same as the picture images 313 and 314 in FIGS. 5A and 5B and an explanation of the notations will be omitted. A portion of the picture image I(k) right from the central portion and a portion of the picture image I(k+1) left from the central portion constitute the overlap regions W1 and W2. The number of picture images constituting objects of composition may naturally be three or more.
The picture image input processing unit 104 provides and stores the plurality of picture images constituting objects of composition from the picture image inputting device 101 to the frame memory 102. The picture image preprocessing unit 105 performs preprocessing individually with respect to the respective picture images stored to the frame memory 102. The picture image position calculating unit 106 calculates parallely moved amounts of the pair of picture images I(k) and I(k+1) stored to the frame memory 102.
The picture image parallax sampling and processing unit 107 calculates deviations of positions in the picture images I(k) and I(k+1) with respect to images of the same object in the respective picture images of the pair I(k) and I(k+1), and forms parallax picture images representing the magnitude of the deviation by the unit of pixel for each of the picture images I(k) and I(k+1). Hereinafter, the deviation is referred to as "deviation between images of the object". A parallax picture image 701 of the picture image I(k) with reference to the picture image I(k+1) and a parallax picture image 702 of the picture image I(k+1) with reference to the picture image I(k) are shown in FIG. 8B.
The intermediate picture image forming and processing unit 108 divides the overlap regions W1 and W2 respectively into a plurality of n small regions in the horizontal direction, combines the small regions of the overlap region W1 with the small regions of the overlap region W2 and forms the intermediate picture images IIk(1) through IIk(n) as shown by FIG. 8C with the deviations of the images of the object as parameters. The intermediate picture image IIk(p) is equal to an image to be obtained by installing the camera at one of the points provided by internally dividing the interval between the positions of the camera in respectively imaging the pair of picture images I(k) and I(k+1) by a ratio of p:n-p (p=1 to n-1) and imaging the object in each of the imaging operations, and is obtained by calculation.
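For a single row of the overlap region, this internal-division construction can be sketched as below; the function name, the rounding and the crude left-neighbour fill for undefined pixels are illustrative assumptions (the patent interpolates undefined pixels from their surroundings, as in FIG. 14):

```python
import numpy as np

def intermediate_row(row_k, disparity, p, n):
    """Displace each pixel of an overlap-region row of I(k) by p/n of its
    per-pixel disparity, approximating the row seen from the point dividing
    the camera baseline by the ratio p : n-p. Assumes nonnegative pixel
    values, so -1 can mark pixels left undefined by the shift."""
    w = row_k.size
    out = np.full(w, -1.0)
    for x in range(w):
        xs = x + int(round(disparity[x] * p / n))
        if 0 <= xs < w:
            out[xs] = row_k[x]
    for x in range(w):           # fill pixels left undefined by the shift
        if out[x] < 0:
            out[x] = out[x - 1] if x > 0 else row_k[0]
    return out
```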
The picture image compositing and processing unit 109 successively composites the plurality of picture images stored to the frame memory 102 by using the parallely moved amounts of the pairs of picture images and the intermediate picture images by which one sheet of a panoramic picture image is obtained. As shown by FIG. 8D, the panoramic picture image is a picture image where a portion 706 of the picture image I(k) except the overlap region W1, the intermediate picture images IIk(1) through IIk(n) and a portion 707 of the picture image I(k+1) except the overlap region W2 are arranged from the left side in this order. The picture image output processing unit 110 provides and displays the panoramic picture image formed by the picture image compositing and processing unit 109 to the picture image outputting device 103. The timing control unit 111 controls timings of operation at the respective units 104 through 110.
FIG. 3 is a view showing an example of the memory structure of RAM 112. RAM 112 is used for obtaining information stored in regions designated by predetermined addresses, or for storing information produced by the respective processings explained below.
For that purpose, the memory space of RAM 112 is divided into a plurality of regions at equal intervals and an address representing the position of the front of the region in the memory space is assigned to each region. In FIG. 3, Address 1, Address 2, Address 3 . . . are assigned to the respective regions arranged from the uppermost region toward the downward direction. In the following explanation, writing, reading and storing information to a single or a plurality of regions of RAM 112 are referred to as "writing, reading and storing to an address".
For example, a number of frames of picture images stored to the frame memory 102 is stored to a front region 201. Addresses of front regions among a single or a plurality of regions storing the 1-th through n-th picture images in the frame memory 102 are stored to respective regions of a first group of regions 202 that are constituted by a plurality of regions from the second region to a region at a predetermined ordinal number. The addresses of front regions among a single or a plurality of regions storing a panoramic picture image in the frame memory 102 are stored to respective regions of a second group of regions 203 that are constituted by a plurality of regions from a region next to the first group of regions 202 to a region at a predetermined ordinal number. Addresses of front regions among a single or a plurality of regions storing intermediate picture images in the frame memory 102 are stored to the respective regions of a third group of regions 204 that are constituted by a plurality of regions from a region next to the second group of regions 203 to a region at a predetermined ordinal number. Parallely moved amounts among respective picture images are stored to respective regions of a fourth group of regions 205 that are constituted by a plurality of regions from a region next to the third group of regions 204 to a region at a predetermined ordinal number. Deviations of images of objects among respective picture images are stored respectively over a plurality of regions of a fifth group of regions 206 that are constituted by a plurality of regions from a region next to the fourth group of regions 205 to a region at a predetermined ordinal number.
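The fixed-interval addressing of FIG. 3 can be mimicked with a small table of front addresses; the region size, the region counts and the field names below are assumed for illustration only:

```python
REGION = 16                       # bytes per region (assumed size)

# Front addresses of the groups of regions, in the order of FIG. 3.
layout = {
    "frame_count":   0 * REGION,  # region 201
    "image_addrs":   1 * REGION,  # group 202: addresses of pictures 1..n
    "panorama_addr": 9 * REGION,  # group 203
    "intermediate": 12 * REGION,  # group 204
    "move_amounts": 20 * REGION,  # group 205
    "deviations":   28 * REGION,  # group 206
}

ram = bytearray(64 * REGION)      # stand-in for RAM 112

def write_int(addr, value):
    ram[addr:addr + 4] = value.to_bytes(4, "little")

def read_int(addr):
    return int.from_bytes(ram[addr:addr + 4], "little")

write_int(layout["frame_count"], 2)   # "a number of frames ... is stored to a front region"
```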
A detailed explanation will be given of processings of forming a panoramic picture image by using the panoramic picture image forming apparatus 100 in reference to FIG. 1 as follows.
When the whole system is started, the timing control unit 111 transmits reset signals to the picture image input processing unit 104, the picture image preprocessing unit 105, the picture image position calculating unit 106, the picture image parallax sampling and processing unit 107, the intermediate picture image forming and processing unit 108, the picture image compositing and processing unit 109 and the picture image output processing unit 110. The processings at the respective units 104 through 110 enter an awaiting state in response to the reset signals until they receive start signals from the timing control unit 111.
First, the timing control unit 111 outputs the start signal to the picture image input processing unit 104 for storing picture images to the frame memory 102 and enters an awaiting state until the timing control unit 111 receives a finish signal from the picture image input processing unit 104.
Upon receiving the start signal from the timing control unit 111, the picture image input processing unit 104 performs a processing for storing a plurality of picture images to the frame memory 102 in respect of the picture image inputting device 101.
When picture images are composited on-line, that is, when imaging of an object and composition of picture images are continuously performed, the picture image inputting device 101 comprises picture image inputting means, for example, a camera, and A/D converting means for converting analogue picture image signals obtained from the picture image inputting means into digital picture image signals, and the processing is a processing where the picture image inputting means carries out imaging of a picture image by the signal from the picture image input processing unit 104, converts the analogue picture image signals obtained thereby into digital picture image signals by the A/D converting means and transmits the digital picture image signals to predetermined addresses of the frame memory 102.
When picture images are composited off-line, that is, when imaging of an object and composition of picture images are separately carried out, the picture image inputting device 101 is a device for storing picture images and the processing is a processing where the picture image input processing unit 104 transmits the picture image signals stored to the picture image inputting device 101 to predetermined addresses of the frame memory 102.
Upon finishing the transmission of all the picture images to the frame memory 102, the picture image input processing unit 104 writes a number of frames of picture images transmitted to the frame memory 102 and addresses to which the picture images are stored in the frame memory 102, successively to predetermined addresses of RAM 112 and thereafter, transmits a picture image input finishing signal to the timing control unit 111 whereby the processing is finished.
In the following explanation, assume that two frames of picture images I(k) and I(k+1) (k=1, 2, . . . , n) having continuous regions stored to the frame memory 102 are a pair of picture images including an image of the same object. The number of frames of picture images stored to the frame memory 102 is read as the number of frames of picture images to be processed in the following processings.
Upon receiving the finish signal of transmission of the inputted picture images to the frame memory, the timing control unit 111 transmits the start signal to the picture image preprocessing unit 105 and enters an awaiting state until the timing control unit 111 receives a finish signal from the picture image preprocessing unit 105.
Upon receiving the start signal from the timing control unit 111, the picture image preprocessing unit 105 reads the number of frames of picture images to be processed from RAM 112 and stores the number of frames. Further, the picture image preprocessing unit 105 sets a counter k to 1. The picture image preprocessing unit 105 reads an address in the frame memory 102 of the picture image I(k) corresponding to the counter k from RAM 112. The picture image preprocessing unit 105 carries out a known processing of removing distortion in respect of the read picture image I(k). The processing includes, for example, correction of aberration of the lens of the imaging means or the like. The picture image from which the distortion has been removed is again written to the same address of the frame memory 102. The preprocessing in respect of the picture image I(k) comprises the processings from reading the address of the picture image I(k) to rewriting the picture image. The counter k is incremented by 1 and is compared with the number of frames of picture images, and when the counter is equal to or less than the number of frames of picture images, the preprocessing is carried out with respect to a next picture image. When the counter k becomes larger than the number of frames of picture images, the picture image preprocessing unit 105 transmits a picture image preprocessing finish signal to the timing control unit 111, whereby the processing is finished.
Upon receiving the finish signal from the picture image preprocessing unit 105, the timing control unit 111 transmits the start signal to the picture image position calculating unit 106 and enters an awaiting state until the timing control unit 111 receives a finish signal from the picture image position calculating unit 106.
Upon receiving the start signal from the timing control unit 111, the picture image position calculating unit 106 reads and stores the number of frames of picture images to be processed from RAM 112 and sets the counter k to 1. Next, the picture image position calculating unit 106 reads from RAM 112 the address in the frame memory 102 of the picture image I(k) in correspondence with the counter k and, in accordance with the address, reads a portion (Rx0, Ry0)-(Rx1, Ry1) constituting a reference region R1 of pattern matching as shown by FIG. 9A included in the predetermined picture image I(k) within the frame memory 102. The reference region R1 is set to, for example, a predetermined position included in the overlap region of the picture image I(k).
Further, the picture image position calculating unit 106 similarly reads a search region BL, (Sx0, Sy0)-(Sx1, Sy1), of the picture image I(k+1) next to the picture image I(k). In respect of the search region BL, when the positional relationship between the picture images I(k) and I(k+1) is known to some degree, that is, when the imaging is carried out under a constant positional relationship or when an outline of the positional relationship is given from outside after the imaging operation, the search region is restrictively set in accordance with the information such that the search region is smaller than the total region of the picture image I(k+1); when the positional relationship is not yet determined, the total region of the picture image is read as the search region.
The pattern matching is carried out based on the reference regions and the search regions of the respective read picture images, whereby the reference position of the region where the reference region matches in the search region, that is, the reference position (mx0, my0) of the region having the highest correlation value with respect to the reference region in the search region, is calculated. The correlation value is calculated, for example, in accordance with the following equation. In FIG. 9B, the region matching the reference region, with the position (mx0, my0) as one of its apexes, is referred to as a matching region BM.
Correlation value=1.0/Σ(x,y)∈Rarea |R(x, y)-S(x+offsetx, y+offsety)|              (1)
Here, notation Rarea designates the reference region, R(x, y) designates a brightness value at coordinates (x, y) of the reference region, S(x, y) designates a brightness value at coordinates (x, y) in the search region and (offsetx, offsety) designates the offset of the searched picture image from the original point (0, 0).
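Equation (1) together with the search for the best-matching position (mx0, my0) can be sketched as an exhaustive block match; the function names and the brute-force search strategy are assumptions:

```python
import numpy as np

def correlation(R, S, ox, oy):
    """Equation (1): reciprocal of the summed absolute brightness difference
    between the reference region R and the patch of the search region S at
    offset (ox, oy)."""
    h, w = R.shape
    diff = np.abs(R.astype(float) - S[oy:oy + h, ox:ox + w]).sum()
    return 1.0 / diff if diff else float("inf")

def best_match(R, S):
    """Return (mx0, my0): the offset in S with the highest correlation."""
    h, w = R.shape
    H, W = S.shape
    scores = {(ox, oy): correlation(R, S, ox, oy)
              for oy in range(H - h + 1) for ox in range(W - w + 1)}
    return max(scores, key=scores.get)
```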
Here, as shown by FIG. 9B, the calculated position (mx0, my0) designates a position in the respective picture images of the reference region and the search region cut out from the original picture images and accordingly, correction is performed by the following Equations (2) and (3), whereby the parallely moved amounts (xmove, ymove) of the picture images I(k) and I(k+1) are calculated. The parallely moved amounts are written to predetermined addresses of RAM 112. A position shifted from the original point (0, 0) of the picture image I(k) by the parallely moved amounts (xmove, ymove) is a position to be overlapped with the original point of the picture image I(k+1) when the picture image I(k) overlaps the picture image I(k+1) by overlapping the images of the same object.
xmove=Rx0-(mx0+Sx0)                                  (2)
ymove=Ry0-(my0+Sy0)                                  (3)
where Rx0 and Ry0 are x- and y-coordinates of a reference point of the reference region R, respectively, and Sx0 and Sy0 are x- and y-coordinates of a reference point of the search region S, respectively. In the embodiment, upper left vertexes of the regions R and S are considered as the reference points.
By the above operation, the parallely moved amounts can be calculated. The processing of calculating the position of the picture image comprises the processings from reading the address of the picture image I(k) to writing the parallely moved amounts.
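Equations (2) and (3) convert the matching position back into the parallely moved amounts of the original picture images; a direct transcription with a hypothetical function name:

```python
def parallel_move_amounts(mx0, my0, Rx0, Ry0, Sx0, Sy0):
    """Equations (2) and (3): correct the matching position (mx0, my0),
    found in the cut-out search region, by the reference points of the
    reference region R and the search region S."""
    xmove = Rx0 - (mx0 + Sx0)
    ymove = Ry0 - (my0 + Sy0)
    return xmove, ymove
```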
Further, if necessary, as shown by FIGS. 9D and 9E, two or more reference regions are provided and the pattern matching is carried out for the respective reference regions. By using the result, a change in the scale of the image of the object can be calculated from changes in the distances among the reference regions and the distances among the matching regions, and a relative rotational angle of the image or the like can be calculated from a difference in the directions made by the arrangement of the reference regions and the arrangement of the matching regions. For example, when notation Vr designates a vector having both ends which are the centers of two reference regions and notation Vm designates a vector having both ends which are the centers of two matching regions, the scale S(k) is calculated by Equation (4) and the rotational angle q(k) is calculated by Equation (5). |Vr| designates the size of the vector Vr and |Vm| designates the size of the vector Vm.
S(k)=|Vm|/|Vr|                                       (4)
q(k)=cos^-1 (Vm·Vr/(|Vm||Vr|))                       (5)
The processing is carried out before and after the processing of calculating the position of the picture image and, from the scale S(k) and the rotational angle q(k) of the picture image I(k+1) in respect of the picture image I(k), the picture image I(k+1) is multiplied by 1/S(k) and rotated by -q(k), and the multiplied and rotated picture image I(k+1) may be written to the address of the original picture image I(k+1) in the frame buffer.
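Equations (4) and (5) reduce to plain vector arithmetic; a sketch with a hypothetical function name (the arccosine argument is clamped against rounding error):

```python
import math

def scale_and_rotation(vr, vm):
    """Equation (4): S(k) = |Vm|/|Vr|.  Equation (5): q(k) is the angle
    between the vectors Vm and Vr, obtained from their dot product."""
    lr = math.hypot(*vr)
    lm = math.hypot(*vm)
    s = lm / lr
    dot = vr[0] * vm[0] + vr[1] * vm[1]
    q = math.acos(max(-1.0, min(1.0, dot / (lm * lr))))
    return s, q
```

I(k+1) would then be multiplied by 1/S(k) and rotated by -q(k) before composition, as the text describes.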
After performing the above-described processing, the counter k is incremented by 1 and compared with the number of frames of picture images, and when the counter k is smaller than the number of frames of picture images, a processing of calculating a next picture image position is carried out. When the counter k is equal to or larger than the number of frames of picture images, the picture image position calculating unit 106 transmits a picture image position calculation finish signal to the timing control unit 111, by which the processing is finished.
Upon receiving the finish signal from the picture image position calculating unit 106, the timing control unit 111 transmits the start signal to the picture image parallax sampling and processing unit 107 and enters an awaiting state until the timing control unit 111 receives a finish signal from the picture image parallax sampling and processing unit 107.
Upon receiving the start signal from the timing control unit 111, the picture image parallax sampling and processing unit 107 reads and stores the number of frames of picture images to be processed from RAM 112 and sets the counter k to 1. Next, the picture image parallax sampling and processing unit 107 reads from RAM 112 the address in the frame memory 102 of the picture image I(k) in correspondence with the counter k and reads the picture image in the frame memory 102 by the address. Further, the picture image parallax sampling and processing unit 107 reads the next picture image I(k+1) (in correspondence with the counter k+1).
Thereafter, the picture image parallax sampling and processing unit 107 reads from RAM 112 the parallely moved amounts calculated by the picture image position calculating unit 106 and respectively calculates the regions W1 and W2 where the picture image I(k) and the picture image I(k+1) overlap, with respect to the picture image I(k) and the picture image I(k+1).
Further, the picture image parallax sampling and processing unit 107 divides the overlap region W1 of the picture image I(k) into small regions, determines the respective small regions as reference regions and searches the matching regions in correspondence with the respective reference regions from the picture image I(k+1) by using the method of pattern matching. At this occasion, the search is carried out in order to calculate the deviation between the images of the object caused by parallax and therefore, the region for calculating the search correlation value may be moved only in the horizontal direction. The deviation amount is an amount of apparent movement of a body due to the parallax and the amount is caused by the depth; accordingly, the depth of the body represented by the images of the object in the reference regions can be estimated from the direction and the amount of deviation.
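The horizontal-only search described above can be sketched as follows. The block-matching criterion (sum of absolute differences) and all names are illustrative assumptions, not the implementation specified in the patent.

```python
import numpy as np

def horizontal_disparity(ref_block, search_strip, max_shift):
    """Search only horizontally (the deviation caused by parallax is
    horizontal) for the shift that best matches ref_block inside a
    strip of width w + 2*max_shift from the other picture image.
    Returns the signed shift minimizing the sum of absolute differences."""
    h, w = ref_block.shape
    best_shift, best_sad = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        x0 = max_shift + s  # candidate matching region, shifted by s
        candidate = search_strip[:, x0:x0 + w]
        sad = np.abs(candidate.astype(int) - ref_block.astype(int)).sum()
        if sad < best_sad:
            best_sad, best_shift = sad, s
    return best_shift
```

The sign of the returned shift corresponds to the direction of deviation from which the depth of the body can be estimated.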
Assume that the picture images I(k) and I(k+1) are made to overlap by coinciding certain points of the depth, and that the deviation of the images of the same object is calculated in the overlap region of the picture images I(k) and I(k+1). In this case, when a body more proximate to the camera than the points is the object, the image of the object in the picture image I(k+1) is deviated further to the left than the image of the object in the picture image I(k). Further, when a body more remote from the camera than the points is the object, the image of the object in the picture image I(k+1) is deviated further to the right than the image of the object in the picture image I(k).
FIG. 10 is a graph showing a relationship between a deviation amount between images of an object and an actual depth of the object when an amount of apparent movement of a body at an infinitely remote point is 0. The ordinate is the amount of deviation between the images of the object (unit; dot) and the abscissa is the depth (unit; m). The amount of deviation between the images of the object is reduced in inverse proportion to the depth.
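The inverse-proportional relationship of FIG. 10 can be expressed as a one-line sketch; `camera_constant` (in dot·m, absorbing baseline and focal length) is a hypothetical parameter, not a value given in the specification.

```python
def deviation_from_depth(depth_m, camera_constant):
    """Deviation between the images of an object (in dots) falls in
    inverse proportion to the depth (in m); it tends to 0 as the body
    recedes to an infinitely remote point."""
    return camera_constant / depth_m
```

For example, doubling the depth halves the deviation amount between the images of the object.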
That is, when the picture images I(k) and I(k+1) are made to overlap by coinciding certain points of depths thereof, even with the overlap region having the same width, owing to the parallaxes, the more proximate to the camera a body is disposed compared with the points, the larger the deviation amount to the left, and the more remote from the camera the body is disposed compared with the points, the larger the deviation amount to the right. Here, although according to the present application the depth is a distance in correspondence with the parallely moved amount calculated at the picture image position calculating unit 106, the distance may be corrected based on the deviation amounts calculated in the overlap region. For example, assuming that a deviation to the left side is positive and a deviation to the right side is negative, when the distance is matched to the location of the image of the object having the deepest depth in the picture image, the reference value is determined as the minimum value of the deviation amounts; when the distance is matched to the location of the image of the object having the shallowest depth in the picture image, the reference value is determined as the maximum value of the deviation amounts; or the reference value may be determined as the average of the deviation amounts. The deviation amounts are corrected by adding the calculated reference value to the parallely moved amount in RAM 112, calculating the differences between the respective deviation amounts calculated currently in the overlap region and the reference value, and writing the differences to RAM 112 as the deviation amounts. Thereby, the deviation amount serving as the reference value is corrected to 0.
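The three choices of reference value (minimum, maximum, or average of the deviation amounts) can be sketched as below; the function name and the list representation of the deviation amounts are assumptions.

```python
def correct_deviations(deviations, mode="min"):
    """Choose a reference value (left deviation taken as positive) and
    shift all deviation amounts so the reference becomes 0.
    mode "min": match the distance to the deepest object,
    mode "max": match it to the shallowest object,
    mode "mean": use the average of the deviation amounts."""
    if mode == "min":
        ref = min(deviations)
    elif mode == "max":
        ref = max(deviations)
    else:
        ref = sum(deviations) / len(deviations)
    corrected = [d - ref for d in deviations]
    return ref, corrected
```

The returned reference value would then be added to the parallely moved amount, as the text describes.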
According to the processing, the accurate depth is not calculated, but the parallax accompanied by the difference in the depth, that is, the deviation amount per se between the images of the object is stored to RAM 112 for the respective reference regions and the respective matching regions or the respective pixels.
FIGS. 6A, 6B and 6C are views showing a flow of specific processings for carrying out the parallax sampling. Each square of the grid over the picture image I(k) shown by FIG. 6A corresponds to a reference region and a circle in the region designates the center of the region. A frame in the picture image I(k+1) designates an overlap region, a circle corresponds to the circle showing the center of the region of the picture image I(k), and an arrow mark extending therefrom designates the direction and the amount of deviation between the images of the object in each reference region calculated by the matching operation. Here, when the right side picture image I(k+1) is imaged by a camera at a position on the right side of the position of imaging the left side picture image I(k) as shown by FIG. 6A and FIG. 6B, in respect of the deviation in the right direction, the larger the deviation amount, the deeper the depth, and in respect of the deviation in the left direction, the larger the deviation amount, the shallower the depth. FIG. 6C shows a parallax picture image 401 in which the deviation amounts, with the left direction taken as positive, are shown as brightness values of pixels. The darker the brightness of a pixel, the smaller the parallax, that is, the deeper the depth; the brighter a portion, the larger the parallax, that is, the shallower the depth. By further dividing the reference regions, finer parallax sampling can be carried out.
In a similar way, the parallax sampling is carried out with the picture image I(k+1) as the reference side and the picture image I(k) as the search side. The result of sampling the parallax of the picture image I(k+1) is similarly stored to RAM 112. The parallax sampling processing includes the processings from reading the address of the picture image I(k) to storing the result of the parallax sampling.
When the parallax sampling processing of the picture image I(k) and the picture image I(k+1) has been finished, the counter k is incremented by 1. When the counter k is smaller than the number of frames of picture images, the parallax sampling processing is carried out in respect of the updated picture image I(k) and the picture image I(k+1). When the counter k is equal to or more than the number of frames of picture images, the picture image parallax sampling and processing unit 107 transmits a finish signal to the timing control unit 111 by which the processing is finished.
Upon receiving the finish signal from the picture image parallax sampling and processing unit 107, the timing control unit 111 transmits the start signal to the intermediate picture image forming unit 108 and enters an awaiting state until the timing control unit 111 receives a finish signal from the intermediate picture image forming unit 108.
Upon receiving the start signal from the timing control unit 111, the intermediate picture image forming and processing unit 108 reads and stores the number of frames of picture images to be processed from RAM 112 and sets the counter k to 1. Next, the intermediate picture image forming and processing unit 108 reads from RAM 112 an address in the frame memory 102 of a picture image in correspondence with the counter k and reads the picture image I(k) in the frame memory 102 from the address. Further, the intermediate picture image forming and processing unit 108 reads the picture image I(k+1) (in correspondence with the counter k+1) next to the picture image.
Thereafter, the intermediate picture image forming and processing unit 108 reads from RAM 112 the parallely moved amount calculated at the picture image position calculating unit 106 and respectively calculates the overlap region of the picture image I(k) and the overlap region of the picture image I(k+1).
An intermediate picture image is a picture image formed from 2 frames of picture images imaged at positions having different observing points, by calculating the picture image that would be imaged at an intermediate position between the two observing points.
Positions of a camera when the picture images I(k) and I(k+1) are imaged are designated as imaging positions Tk and Tk+1. In this case, the overlap regions W1 and W2 of the picture images I(k) and I(k+1) are respectively divided into n-1 small regions in the horizontal direction. By using the respective small regions, intermediate picture images IIk(l) (l=1, 2, 3, . . . , n-1) are formed. For example, the intermediate picture image IIk(1) formed by using the small region at the left end is the same as a picture image to be provided by the camera installed at a position internally dividing the interval between the imaging positions Tk and Tk+1 by a ratio of 1 to n-1. Similarly, the intermediate picture images IIk(l) formed by using the second from left through (n-1)-th small regions are the same as picture images to be provided by the camera installed at the positions internally dividing the interval between the imaging positions Tk and Tk+1 by ratios of 2 to n-2 through n-1 to 1. For each pair of picture images I(k) and I(k+1), n-1 intermediate picture images IIk(l) are provided.
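The internal-division positions of the camera for the n-1 intermediate picture images can be computed as follows; treating the imaging positions as scalars along the baseline is a simplifying assumption.

```python
def intermediate_viewpoints(Tk, Tk1, n):
    """Positions internally dividing the interval between imaging
    positions Tk and Tk+1 by ratios l : n-l, for l = 1 .. n-1
    (one position per intermediate picture image IIk(l))."""
    return [Tk + (Tk1 - Tk) * l / n for l in range(1, n)]
```

For n = 5, the four viewpoints divide the interval by ratios 1:4, 2:3, 3:2 and 4:1.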
A method of forming an intermediate picture image in forming the intermediate picture image at a position internally dividing the interval between the imaging position Tk and the imaging position Tk+1 by a ratio of 1 to (n-1), is shown below. The deviation between the images of the object in respect of the picture image I(k) and the picture image I(k+1) calculated at the parallax sampling and processing unit 107, are read from RAM 112 for each pixel or for each region.
As shown by FIG. 7A and FIG. 7B, the coordinates of a point P1 in the overlap region R1 (rx0, ry0)-(rx1, ry1) of the picture image I(k) in respect of the picture image I(k+1) are defined as (x1, y1), the brightness value at the coordinates of the point P1 is defined as D(P1), the deviation amount between the images of the object at the coordinates of the point P1 is defined as M(P1) (arrow mark of FIG. 7A), the coordinates of a point P2 in the overlap region R2 (rx2, ry2)-(rx3, ry3) of the picture image I(k+1) in respect of the picture image I(k) are defined as (x2, y2), the brightness value at the coordinates of the point P2 is defined as D(P2) and the deviation amount between the images of the object at the point P2 is defined as M(P2) (arrow mark of FIG. 7B).
First, the intermediate picture image IIk (l) (FIG. 7C) with a reference of the picture image I(k) is formed. Assume that the brightness value D(P1) of the respective point P1 (x, y), (rx0≦x≦rx1, ry0≦y≦ry1) in the overlap region R1 is equal to the brightness of (x+M(P1)/n-rx0, y-ry0) on the intermediate picture image IIk (l). The brightness value D(P1) is written to RAM 112 as the brightness value of the above-described point of the intermediate picture image IIk (l). This processing is carried out with respect to all the points in overlap region R1.
When the intermediate picture image IIk (l) is formed only from the picture image I(k), owing to the difference in the deviation amounts caused by a difference in the parallaxes for each pixel, at a portion of the intermediate picture image a plurality of brightness values may be written to one pixel, or an undefined pixel where the brightness value cannot be determined may be produced, as shown by the black pixel portion Px in FIG. 7C.
In forming the intermediate picture image, when a brightness value has already been written, the brightness value of the pixel having the larger deviation amount M(P1) is written with priority. The reason is that the larger the deviation amount M(P1), the more proximate to the camera a body is disposed, and writing the brightness value of a body disposed deeper than that, or of the background, would be unnatural. Further, with respect to the undefined pixel, as mentioned later, the undefined pixel is interpolated by the intermediate picture image II'k+1 (l) formed from the picture image I(k+1).
Similar to the formation of the intermediate picture image IIk (l), the intermediate picture image II'k+1 (l) with a reference of the picture image I(k+1) is formed (FIG. 7D). Specifically, the brightness value D(P2) of the respective points P2 (x, y) (rx2≦x≦rx3, ry2≦y≦ry3) in the overlap region R2 is written to RAM 112 as the brightness value of the point (x+M(P2)(n-1)/n-rx2, y-ry2) on the intermediate picture image II'k+1 (l). However, as mentioned above, when a brightness value has already been written, priority is given to the brightness value of the pixel having the larger deviation amount M(P2).
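The writing of overlap-region pixels shifted by a fraction of their deviation amounts, with priority given to the larger deviation amount when two pixels land on the same target, might be sketched as follows; a 2-D NumPy array stands in for the frame memory and the value -1 marks an undefined pixel (both assumptions).

```python
import numpy as np

def warp_to_intermediate(region, deviation, frac):
    """Forward-warp each pixel of the overlap region horizontally by
    frac of its deviation amount. When two pixels collide, the one
    with the larger deviation amount (nearer the camera) wins;
    unfilled targets stay at -1 (undefined pixels)."""
    h, w = region.shape
    out = np.full((h, w), -1, dtype=int)
    written = np.full((h, w), -1e9)  # deviation already written per target
    for y in range(h):
        for x in range(w):
            tx = int(round(x + deviation[y, x] * frac))
            if 0 <= tx < w and deviation[y, x] > written[y, tx]:
                out[y, tx] = region[y, x]
                written[y, tx] = deviation[y, x]
    return out
```

With frac = 1/n this corresponds to the I(k) side; with frac = (n-1)/n, to the I(k+1) side.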
Thereafter, the undefined pixels in the intermediate picture image IIk (l), formed from the picture image I(k) and the deviation amount for each pixel, are interpolated by the corresponding pixels of the intermediate picture image II'k+1 (l), formed from the picture image I(k+1) and the deviation amount for each pixel.
Specifically, with respect to the interpolation, firstly assume a state where the intermediate picture images IIk (l) and II'k+1 (l) overlap such that the images of the same object overlap. Under this state, the brightness value of the pixel of the intermediate picture image II'k+1 (l) overlapping the undefined pixel Px of the intermediate picture image IIk (l) is written to a predetermined address of RAM 112 as the brightness value of the undefined pixel.
Finally, when there is a portion where the brightness value is still not written to the intermediate picture image, interpolation is carried out by the brightness values written to the surroundings. FIG. 14 is a view showing the behavior of the interpolation. Each square corresponds to one pixel and a pixel having a circle at the center is an undefined pixel. Now, the brightness value of the undefined pixel 501 surrounded by bold lines is assumed to be the average of the brightness values of the already defined pixels among its 8 neighbors surrounded by the bold broken lines 502, that is, the pixels having triangular marks at their centers in FIG. 14. The above-described operation is repeated with respect to all the undefined pixels. When all the 8 neighboring pixels are undefined, the interpolating operation is carried out again after finishing the interpolation of other undefined pixels, and this is repeated until no undefined pixels remain.
An interpolated picture image having a better quality can be provided by carrying out the interpolating operation starting from the undefined pixels having a large number of already defined pixels among their 8 neighbors.
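The interpolation by the average of the already defined pixels among the 8 neighbors, repeated until no undefined pixel remains, can be sketched as below; the sentinel value -1 for an undefined pixel is an assumption.

```python
import numpy as np

def fill_undefined(img, undefined=-1):
    """Repeatedly replace each undefined pixel by the average of its
    already defined 8 neighbors; pixels whose neighbors are all
    undefined are retried on a later pass."""
    img = img.astype(float)
    while (img == undefined).any():
        progress = False
        for y, x in zip(*np.where(img == undefined)):
            ys = slice(max(y - 1, 0), y + 2)
            xs = slice(max(x - 1, 0), x + 2)
            neigh = img[ys, xs]
            defined = neigh[neigh != undefined]
            if defined.size:
                img[y, x] = defined.mean()
                progress = True
        if not progress:  # guard against an all-undefined image
            break
    return img
```

A refinement, as the text notes, would be to order the passes so pixels with many defined neighbors are filled first.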
In this way, the intermediate picture images IIk (l) are formed and, among them, the region 601 that is used for the compositing operation is (dx*(l-1), 0)-(dx*l, (ry1-ry0)). Therefore, only the region 601 is written to the frame memory 102 as the intermediate picture image IIk (l), by which the formation of the intermediate picture images 1 to n-1 at the overlap regions of the picture image I(k) and the picture image I(k+1) is finished. Notation dx designates a width of 1/(n-1) of the width of the overlap region. Thereby, the intermediate picture image IIk (l) shown by FIG. 7E is provided.
When the processing of forming the above-described intermediate picture images at l=1, . . . , n-1 has been finished, the number of frames of the formed intermediate picture images is written to a predetermined address of RAM 112 and the counter k is incremented by 1. When the counter k is smaller than the number of frames of the picture images, the intermediate picture image processing is carried out with respect to the picture images corresponding to the updated counter. When the counter k is equal to the number of frames of picture images, the intermediate picture image forming and processing unit 108 transmits an intermediate picture image forming and processing finish signal to the timing control unit 111 by which the processing is finished.
Upon receiving the finish signal from the intermediate picture image forming and processing unit 108, the timing control unit 111 transmits the start signal to the picture image compositing and processing unit 109 and enters an awaiting state until the timing control unit 111 receives a finish signal from the picture image compositing and processing unit 109.
Upon receiving the start signal from the timing control unit 111, the picture image compositing and processing unit 109 reads and stores the number of frames of picture images to be processed from RAM 112 and sets the counter k to 1. Next, the picture image compositing and processing unit 109 reads from RAM 112 an address in the frame memory 102 of a picture image in correspondence with the counter k and reads the picture image I(k) in the frame memory 102 from the address. When the counter k is 1, the picture image compositing and processing unit 109 writes the picture image I(k) to a predetermined position of an output picture image region in the frame memory 102.
When the counter k is larger than 1, the picture image compositing and processing unit 109 calls from RAM 112 a deviation amount M(k) in respect of the picture image I(k-1). By using the deviation amount, the overlap region Rk (rx0, ry0)-(rx1, ry1) in respect of the picture image I(k-1) is calculated. Further, the number of frames n of intermediate picture images in respect of the picture image I(k-1) and the picture image I(k) that have been formed at the intermediate picture image forming and processing unit 108 is called from RAM 112. The positions Pi of the composite picture image memory for writing the intermediate picture images IIk(i), i=1, . . . , n in the output picture image region can be calculated by the following equation, ##EQU1## Thereby, the intermediate picture images IIk(i), i=1, . . . , n are copied to the positions Pi of the composite picture image memory in the output picture image region. The portion of the picture image I(k) at the right of the x coordinate rx1 is copied to the position (M(1)+ . . . +M(k)+rx1, ry1) of the output picture image region. When the above-described processing is finished, the counter k is incremented by 1, and when the counter k is equal to or smaller than the number of frames of picture images, the above-described processing in respect of the updated picture image I(k) is successively performed. Further, when the counter k is larger than the number of frames of picture images, the picture image compositing finish signal is transmitted to the timing control unit 111 by which the processing is finished.
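While the equation ##EQU1## for the positions Pi is not reproduced in the text, the position stated for the right-hand portion of I(k), namely the cumulative sum of the parallely moved amounts plus rx1, can be sketched as:

```python
def right_part_position(M, k, rx1, ry1):
    """Position (M(1)+...+M(k)+rx1, ry1) in the output picture image
    region for the portion of I(k) to the right of x coordinate rx1.
    M is a list of parallely moved amounts, M[0] holding M(1)
    (the list representation is an assumption)."""
    return (sum(M[:k]) + rx1, ry1)
```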
Upon receiving the finish signal from the picture image compositing and processing unit 109, the timing control unit 111 transmits the start signal to the picture image output processing unit 110 and enters an awaiting state until the timing control unit 111 receives a finish signal from the picture image output processing unit 110.
Upon receiving the start signal from the timing control unit 111, the picture image output processing unit 110 reads from RAM 112 an address of an output picture image in the frame memory 102. Next, the picture image output processing unit 110 transmits a picture image output start signal to the picture image outputting device 103, successively reads the output picture image signal from the frame memory 102 and outputs the signal to the picture image outputting device 103.
When the picture image output processing unit 110 finishes outputting the output picture image from the picture image outputting device 103, the picture image output processing unit 110 transmits a finish signal of the processing to the timing control unit 111 by which the processing is finished.
The timing control unit 111 finishes all the operation when the timing control unit 111 receives the finish signal from the picture image output processing unit 110.
By the above-described flow of a series of processings, the difference in the parallaxes between the images of the object in the overlap regions of the two picture images is alleviated by forming the intermediate picture images, and the finished composite picture image is free of the deviation or doubled compositing of the conventional technology even at portions where the difference in the parallaxes is present. A more natural panoramic picture image can be obtained by increasing the number of formed intermediate picture images, that is, the number of divisions of the overlap region.
Further, according to the present invention, in order to composite the images of the object having different deviation amounts caused by the parallax such that the images are included in the overlap regions W1 and W2 having the same width, and such that the total of the widths of the intermediate picture images II(1) through II(n) is equal to the width of the overlap regions W1 and W2, a body proximate to the camera is composited with a narrow width and a body remote from the camera is composited with a wide width. When the portion proximate to the camera is intended to look natural, the picture image parallax sampling and processing unit 107 determines the deviation amount of a body proximate to the camera as the reference value by which the total of the deviation amounts is corrected, and when a portion remote from the camera is intended to look natural, the picture image parallax sampling and processing unit 107 determines the deviation amount of a body remote from the camera as the reference value by which the total of the deviation amounts is corrected.
Further, the embodiment shown here is an example of the invention and the invention is not limited to the content of the embodiment so far as the gist of the invention is not changed.
An explanation will be given of a second embodiment of the invention in reference to the drawings as follows. FIG. 2 is a block diagram showing the electric constitution of a panoramic picture image forming apparatus 100a that is a second embodiment of the invention.
The panoramic picture image forming apparatus 100a includes the picture image inputting device 101, the frame memory 102, the picture image outputting device 103, the picture image input processing unit 104, the picture image preprocessing unit 105, the picture image position calculating unit 106, a picture image parallax sampling and processing unit 107a, a joint line determining and processing unit 108a, a picture image compositing and processing unit 109a, the picture image output processing unit 110, the timing control unit 111 and RAM 112.
An explanation will be omitted with respect to the processings at the picture image input processing unit 104, the picture image preprocessing unit 105 and the picture image position calculating unit 106 since the same processings are performed as in the first embodiment of the invention.
Upon receiving the finish signal from the picture image position calculating unit 106, the timing control unit 111 transmits the start signal to the picture image parallax sampling and processing unit 107a and enters an awaiting state until the timing control unit 111 receives the finish signal from the picture image parallax sampling and processing unit 107a.
Upon receiving the start signal from the timing control unit 111, the picture image parallax sampling and processing unit 107a reads and stores the number of frames of picture images to be processed from RAM 112 and sets the counter k to 1. Next, the picture image parallax sampling and processing unit 107a reads from RAM 112 an address in the frame memory 102 of a picture image in correspondence with the counter k and reads the picture image I(k) in the frame memory 102 from the address. Further, the picture image parallax sampling and processing unit 107a reads the picture image I(k+1) (in correspondence with counter k+1) next to the picture image.
Thereafter, the picture image parallax sampling and processing unit 107a reads from RAM 112 the parallely moved amount calculated at the picture image position calculating unit 106 and respectively calculates the overlap region W1 of the picture image I(k) and the overlap region W2 of the picture image I(k+1).
Further, the picture image parallax sampling and processing unit 107a divides the overlap region W1 of the picture image I(k) into small regions, determines the respective small regions as reference regions and searches corresponding positions from the picture image I(k+1) by using the method of pattern matching. In this case, what is searched is the deviation between the images of the object caused by the parallax and accordingly, the searching operation may be carried out only in the horizontal direction. The deviation amount is an amount of apparent movement of a body caused by the parallax and the amount is caused by the depth; accordingly, the depth of the body in a reference region can be estimated from the direction and the amount of deviation. Further, when points at a certain depth are made to coincide with each other, even in the overlap region having the same width, due to the parallax, the more proximate to the camera than those points a body is disposed, the larger the deviation amount to the left side, and the more remote from the camera than those points a body is disposed, the larger the deviation amount to the right side.
Here, the calculated region or pixel having the deepest depth is used as a reference. That is, with the left direction taken as positive, the least deviation amount among the respective regions or pixels is subtracted from the respective deviation amounts, and the same amount is added to the parallely moved amount calculated at the picture image position calculating unit 106. Thereby, the deviation of the images of the object at the portion having the deepest depth at the inside of the picture image is determined as a reference deviation, by which the parallely moved amount in RAM 112 and the respective deviation amounts in the overlap regions are corrected.
In a similar way, the parallax sampling is carried out with the picture image I(k+1) as the reference side and the picture image I(k) as the search side. The result of the parallax sampling of the picture image I(k+1) is similarly corrected and stored to RAM 112.
When the parallax sampling processing of the picture image I(k) and the picture image I(k+1) has been finished, the counter k is added with 1. When the counter k is smaller than the number of frames of picture images, the parallax sampling and processing with respect to the picture image I(k) and the picture image I(k+1) is carried out. When the counter k is equal to or more than the number of frames of picture images, the picture image parallax sampling and processing unit 107a transmits the finish signal to the timing control unit 111 by which the processing is finished.
Upon receiving the finish signal from the picture image parallax sampling and processing unit 107a, the timing control unit 111 transmits the start signal to the joint line determining and processing unit 108a and enters an awaiting state until the timing control unit 111 receives the finish signal from the joint line determining and processing unit 108a.
Upon receiving the start signal from the timing control unit 111, the joint line determining and processing unit 108a reads and stores the number of frames of picture images to be processed from RAM 112 and sets the counter k to 1. Next, the joint line determining and processing unit 108a reads from RAM 112 an address in the frame memory 102 of the picture image I(k) in correspondence with the counter k and reads the picture image I(k) in the frame memory 102 from the address. Further, the joint line determining and processing unit 108a reads the picture image I(k+1) next to the picture image.
Thereafter, the joint line determining and processing unit 108a reads from RAM 112 the deviation amount of images of the object (parallax amount) of the picture image I(k) and the picture image I(k+1) calculated at the picture image parallax sampling and processing unit 107a.
An explanation will be given of a method of calculating a joint line from the picture image I(k) in reference to FIGS. 11A through 11I, 12A, 12B and 13A through 13D.
By the picture image parallax sampling and processing unit 107a, from the parallax picture image (FIG. 11C) in respect of the picture image I(k) calculated from the picture image I(k) (FIG. 11A) and the picture image I(k+1) (FIG. 11B), a point where the parallax of the picture image increases, viewed from the right end of the picture image, that is, a point satisfying the following Equation, is extracted.
E(x, y)-E(x+1, y)>Threshold
Here, E(x, y) designates a parallax amount or a concentration of a pixel of a parallax picture image at coordinates (x, y) of the picture image and Threshold designates a pertinent threshold value.
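The extraction of points where the parallax changes by more than the threshold can be sketched as follows; comparing each pixel of the parallax picture image E with its right-hand neighbour is an assumption about the garbled comparison term in the Equation above.

```python
import numpy as np

def parallax_jump_points(E, threshold):
    """Viewed from the right end of the picture image, collect points
    (x, y) where the parallax exceeds that of the right-hand neighbour
    by more than the threshold -- candidates for the right-side
    contour of a body proximate to the camera."""
    pts = []
    h, w = E.shape
    for y in range(h):
        for x in range(w - 1):
            if E[y, x] - E[y, x + 1] > threshold:
                pts.append((x, y))
    return pts
```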
According to the example of FIGS. 11A through 11I, such points are represented by white circles in FIG. 11E. These are points on the contour lines at the right side of the images 317 and 319 of the objects. The images 317 and 319 are arranged horizontally and accordingly, the points on the contour lines of the objects 317 and 319 are arranged horizontally among the extracted points.
These point rows designate the contour line on the right side of the images of the object. On the left side of the row of points corresponding to the above-described point rows on the picture image I(k+1), a portion invisible in the picture image I(k) becomes visible, and the difference between the picture images I(k) and I(k+1) becomes significant especially at the left side in the vicinity of these points. Further, the same thing is visible in both picture images in the vicinity at the right side of these points and accordingly, the difference between the picture images I(k) and I(k+1) becomes small.
According to a first method of sampling a joint line, in consideration of the above-described situation, points having parallaxes as similar as possible are sampled from the point rows having a large calculated change in parallax, that is, the contour line of the same body in the vicinity of the camera is sampled and determined as the joint line. Further, when there is no portion having the large change in parallax in the up and down direction as shown in FIG. 11E, a line extended vertically from the end of the contour line most proximate thereto to the upper or lower end of the picture image is determined as the joint line.
According to a second method of sampling a joint line, a line connecting the point rows on the rightmost side in the overlap region exceeding a certain threshold value "Threshold", among the point rows having the large change in parallax calculated similarly to the first sampling method as shown by FIG. 12A, is determined as the joint line L(k).
A third sampling method of a joint line is as follows. The first and the second sampling methods of the joint line are similarly applicable by calculating the point rows having the large change in parallax from left to right with the picture image I(k+1) as a reference. That is, by calculating joint lines from the picture image I(k) and the picture image I(k+1) in accordance with the two methods, a plurality of candidates of joint lines are calculated. Candidates of a plurality of joint lines can further be obtained by changing the value of the threshold value "Threshold". In FIGS. 13A and 13B, the candidates of joint lines Li(k), i=1, 2, . . . are represented by white bold lines.
The most pertinent one among the joint line candidates Li(k), i=1, 2, . . . is the joint line minimizing the following value a. The following Equation calculates an integration of the difference in brightness values of points on the joint lines.
a = ∫C |Li(k)(x, y)-Li'(k)(x', y')|
Here, notation C designates the paths on the joint lines Li(k), Li(k)(x, y) designates a brightness value at a point (x, y) on a joint line in the picture image I(k) (or a vector of the respective values of RGB), and Li'(k)(x', y') designates a brightness value at a point (x', y') on the joint line Li(k) in the picture image I(k+1) in correspondence with the point (x, y) of the above-mentioned picture image I(k) (or a vector of the respective values of RGB). The candidate Li(k) of a joint line minimizing the above Equation is calculated and determined as the joint line as shown by FIG. 13C.
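The selection of the candidate minimizing the integrated brightness difference along the joint line can be sketched as below; representing a candidate as a list of corresponding point pairs between I(k) and I(k+1) is an assumption.

```python
def best_joint_line(candidates, Ik, Ik1):
    """Pick the candidate joint line minimizing
    a = sum over the path of |brightness in I(k) - brightness in I(k+1)|.
    Each candidate is a list of ((x, y), (x2, y2)) corresponding points;
    Ik and Ik1 are brightness arrays indexed [y][x]."""
    def cost(path):
        return sum(abs(Ik[y][x] - Ik1[y2][x2])
                   for (x, y), (x2, y2) in path)
    return min(candidates, key=cost)
```

A joint line whose two sides agree in brightness produces the least visible seam in the composite picture image.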
The joint lines L(k) calculated as described above are successively written to predetermined addresses of RAM 112, whereby the processing of determining joint lines is finished.
When the processing of determining the joint line between the picture image I(k) and the picture image I(k+1) has been finished, the counter k is incremented by 1. When the counter k is smaller than the number of frames of picture images, the processing of determining joint lines is performed for the next pair of picture images I(k) and I(k+1). When the counter k is equal to or larger than the number of frames of picture images, the joint line determining and processing unit 108a transmits the finish signal to the timing control unit 111, whereby the processing is finished.
Upon receiving the finish signal from the joint line determining and processing unit 108a, the timing control unit 111 transmits the start signal to the picture image compositing and processing unit 109a and enters a waiting state until it receives the finish signal from the picture image compositing and processing unit 109a.
Upon receiving the start signal from the timing control unit 111, the picture image compositing and processing unit 109a reads and stores the number of frames of picture images to be processed from RAM 112 and sets the counter k to 1. Next, the picture image compositing and processing unit 109a reads from RAM 112 an address in the frame memory 102 of the picture image in correspondence with the counter k and reads the picture image I(k) in the frame memory 102 from the address.
When the counter k is 1, the picture image compositing and processing unit 109a writes the picture image I(k) to a predetermined position in the output picture image region of the frame memory 102.
When the counter k is larger than 1, the picture image compositing and processing unit 109a reads from RAM 112 a parallel movement amount M(k) and the joint line L(k-1) for the picture image I(k-1).
The position Pk at which the picture image I(k) is written in the output picture image region is provided by the following Equation. ##EQU2## Further, the joint line is corrected to a position deviated from the original position by Pk-1.
In copying the picture image I(k) to the output picture image region, the operation is carried out under the following three conditions, whereby the composition of the picture images is performed.
(1) Pixels on the left side of a joint line Lk-1 are not written.
(2) The components of w dots on the right side of the joint line Lk-1 are smoothed with respect to the picture image I(k) by concentration smoothing.
(3) Pixels are written as they are at the portion beyond the w dots on the right side of the joint line Lk-1.
Here, notation w is provided with a pertinent constant value. By carrying out the concentration smoothing over the w dots, the joint looks more natural.
According to the example of FIGS. 11A through 11I, in the processing at the picture image compositing and processing unit 109a, a mask 801 is overlaid on the picture image I(k+1) as shown in FIG. 11H, so that pixels on the left side of the joint line Lk-1 become black pixels. Further, a region 802 comprising the w dots on the right side of the joint line Lk-1 is determined as the region for concentration smoothing. Concentrations of pixels in the region 802 are smoothed such that, for example, concentrations of pixels in contact with the joint line Lk-1 are made substantially equal to the concentrations of the pixels of the picture image I(k+1) in contact with the joint line Lk-1, and concentrations of the pixels of the region 802 most remote from the joint line Lk-1 remain substantially equal to their original concentrations. A portion 804 on the right side of the joint line Lk-1 of the picture image I(k+1) after this processing is joined to a portion 803 on the left side of the joint line Lk of the picture image I(k) shown in FIG. 11G. Thereby, the composite picture image shown in FIG. 11I can be provided.
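The concentration smoothing over the w dots can be sketched as a linear ramp; the ramp shape, the single-column joint, and all names are illustrative assumptions rather than the disclosed implementation.

```python
import numpy as np

def smooth_across_joint(img_left, img_right, joint_x, w):
    """Blend a strip of `w` pixels of `img_right` just right of the joint
    so that pixel concentrations ramp from img_left's value at the joint
    back to img_right's own values, approximating the "concentration
    smoothing" described in the text (linear ramp is a design choice)."""
    out = img_right.astype(float).copy()
    for i in range(w):
        x = joint_x + i
        alpha = (i + 1) / (w + 1)  # near 0 at the joint, near 1 after w dots
        out[:, x] = (1 - alpha) * img_left[:, joint_x] + alpha * img_right[:, x]
    return out
```

With a larger w the ramp is gentler, which is why the joint "looks more natural" at the cost of a wider blended strip.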
After performing the above-described processings, the counter k is incremented by 1, and when the counter k is equal to or less than the number of frames of picture images, the above-described processings are successively performed for the updated picture image I(k). Further, when the counter k is larger than the number of frames of picture images, a picture image compositing finish signal is transmitted to the timing control unit 111, whereby the processing is finished. Upon receiving the finish signal from the picture image compositing and processing unit 109a, the timing control unit 111 transmits the start signal to the picture image output processing unit 110 and enters a waiting state until it receives the finish signal from the picture image output processing unit 110.
Upon receiving the start signal from the timing control unit 111, the picture image output processing unit 110 reads from RAM 112 an output picture image address in the frame memory 102. Next, the picture image output processing unit 110 transmits the picture image output start signal to the picture image outputting device 103, successively reads the output picture image signal from the frame memory 102 and outputs the signal to the picture image outputting device 103.
When the picture image output processing unit 110 finishes outputting the output picture image from the picture image outputting device 103, the picture image output processing unit 110 transmits the finish signal of the processing to the timing control unit 111 by which the processing is finished.
Upon receiving the finish signal from the picture image output processing unit 110, the timing control unit 111 finishes all of the operation. The more proximate an object is to the camera, the larger the difference in parallax in the overlap regions of the two frames of picture images, and when the compositing operation is carried out at such a point, deviation or double imaging may be caused. By providing the joint lines on the contour of an object proximate to the camera through the above-described series of processings, few portions having considerably different parallaxes are joined, and the jointed picture image looks more natural than in the conventional technology, whereby the quality is promoted.
Further, according to the panoramic picture image compositing apparatus 101a of the second embodiment, even when an image of a moving body is included in the picture images, the movement appears as a parallax in the operation of sampling parallax, and accordingly high-quality jointing can similarly be performed.
The above-described embodiments are merely examples of the invention, and the invention is not restricted to the content of these embodiments so far as the gist of the invention is not changed.
According to the panoramic picture image forming apparatuses 100 and 100a of the first and second embodiments, composition of the pair of picture images I(k) and I(k+1) is given as an example. When composition of three or more frames of picture images is carried out by using the panoramic picture image forming apparatuses, the above-described processings may be carried out for each pair of picture images including images of the same object, and finally all the picture images may be joined together successively. Further, although the picture images I(k) and I(k+1) provided by pan-imaging are the objects of the compositing operation, as long as images of the same object are included in the picture images to be composited, the objects of the compositing operation are not limited to picture images provided by pan-imaging, and picture images provided by other methods may be used. For example, picture images may be provided by tilt-imaging, in which a camera is moved in the vertical direction. Further, when imaging ranges are arranged in a matrix and the respective imaging ranges overlap portions of other imaging ranges in the horizontal direction and the vertical direction, picture images can be composited in both the horizontal and vertical directions.
Additionally, in order to cause a computer including a central processing unit and a memory to composite a pair of picture images in which images of an object have a parallax, interpolating the parallax, a program which causes the computer to execute the forming and processing of a panoramic picture image as described in the first and second embodiments may be stored in a storage medium capable of being read by the computer. Such a storage medium may be a floppy disk, hard disk, magnetic tape, CD-ROM, optical disk, magneto-optical disk, minidisk, ROM, RAM, etc.
A first example of program comprises a series of procedures for causing the central processing unit, instead of the units 104 through 111 of the panoramic picture image forming apparatus 100, to execute the processings which are to be executed by those units. The processings are as follows:
to instruct the imaging means to capture divided picture images obtained by overlap of parts of picture images by a plurality of imaging means or by moving imaging means plural times, and instruct storage means to store the divided images captured by the imaging means, under a directed address;
to calculate a position of composition of the divided picture images;
to sample parallax information from a picture image including a parallax within an overlap region between adjacent divided picture images;
to form a plurality of intermediate picture images from the picture images including the parallax within the overlap region; and
to form a composite panoramic picture image from the divided picture images.
In the processing of sampling parallax information, the central processing unit executes the operation of correcting deviations of the picture images within the overlap region, and in the processing of compositing the divided picture images, the central processing unit executes the operation of interpolating deviations of the picture images by using the intermediate picture images to form a panoramic picture image.
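The five processings of the first example of program can be sketched as a driver routine that a central processing unit would execute; the helper functions are caller-supplied stand-ins for the units 104 through 110, and every name below is illustrative, not part of the disclosure.

```python
def form_panorama(images, calc_positions, sample_parallax,
                  form_intermediates, composite):
    """Skeleton of the first example of program: for each pair of adjacent
    divided picture images, calculate the composition position, sample
    parallax information from the overlap region, form intermediate
    picture images that interpolate the parallax, then composite all
    divided picture images into one panoramic picture image."""
    positions = [calc_positions(images[k], images[k + 1])
                 for k in range(len(images) - 1)]
    parallaxes = [sample_parallax(images[k], images[k + 1], positions[k])
                  for k in range(len(images) - 1)]
    intermediates = [form_intermediates(images[k], images[k + 1], parallaxes[k])
                     for k in range(len(images) - 1)]
    return composite(images, positions, intermediates)
```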
A second example of program comprises a series of procedures for causing the central processing unit, instead of the units 104 through 106, 107a through 109a, 110 and 111 of the panoramic picture image forming apparatus 100a, to execute the processings which are to be executed by those units. The processings are as follows:
to instruct the imaging means to capture divided picture images obtained by overlap of parts of imaged picture images by a plurality of imaging means or by moving imaging means plural times, and instruct storage means to store the divided images captured by the imaging means, under a directed address;
to calculate a position of composition of the divided picture images;
to sample parallax information from a picture image including a parallax within an overlap region between adjacent divided picture images;
to set the joint line on the profile of the object closest to the imaging means of the picture image within the overlap region, on the basis of the parallax information; and
to form a composite panoramic picture image from the divided picture images.
In the processing of sampling parallax information the central processing unit executes the operation of correcting deviations of the picture images within the overlap region, and in the processing of compositing the divided picture images, the central processing unit executes the operation of joining the divided picture images on the basis of the joint line to form a composite panoramic picture image.
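The second example of program, which joins the divided picture images along joint lines set on the profile of the object closest to the imaging means, can be sketched in the same style; again, the helper functions and names are illustrative stand-ins for the corresponding units.

```python
def form_panorama_by_joint_lines(images, calc_positions, sample_parallax,
                                 set_joint_line, join):
    """Skeleton of the second example of program: grow the composite
    picture image pairwise, setting a joint line from the sampled
    parallax of each overlap region and joining along it."""
    result = images[0]
    for k in range(len(images) - 1):
        pos = calc_positions(result, images[k + 1])
        parallax = sample_parallax(result, images[k + 1], pos)
        line = set_joint_line(parallax)
        result = join(result, images[k + 1], line)
    return result
```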
The program stored in the storage medium is installed in the computer, and the central processing unit executes the program. In the case of the first example of program, the central processing unit uses a memory as the frame memory 102 and the RAM 112 to operate as the units 104 through 110 as well as the timing control unit 111. In the case of the second example of program, the central processing unit uses a memory as the frame memory 102 and the RAM 112 to operate sequentially as the units 104 through 106, 107a through 109a, and 110 as well as the timing control unit 111. The computer can thus form a panoramic picture image like the panoramic picture image forming apparatuses 100 and 100a of the first and second embodiments. Accordingly, the panoramic picture image forming apparatuses 100 and 100a are realized by the computer.
As described above, the use of the above-described storage medium makes it possible to execute the programs of the invention not only in personal computers but also in various information apparatuses such as mobile information terminals and camera-incorporated picture image processing apparatuses.
The invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description and all changes which come within the meaning and the range of equivalency of the claims are therefore intended to be embraced therein.

Claims (12)

What is claimed is:
1. A picture image forming apparatus comprising:
imaging means for obtaining a pair of divided picture images having a parallax by imaging an object and a periphery of the object so as to include a same part of the object in imaging regions for the pair of divided picture images;
calculating means for calculating overlap regions of the respective divided picture images including the images of the object;
parallax sampling means for sampling the parallax from the images of the object in the two overlap regions;
intermediate picture image forming means for forming intermediate picture images to be provided by imaging the object from points between observing points from which the respective divided picture images are imaged, on the basis of the images of the object in the overlap regions of the pair of divided picture images and the sampled parallax; and
picture image composition means for providing a composite picture image by interpolating the parallax of the images of the object in the two overlap regions by the intermediate picture images and compositing the pair of divided picture images such that the overlap regions overlap.
2. The picture image forming apparatus of claim 1, wherein, when there are a plurality of objects having different distances from the imaging means to the objects, the parallax sampling means corrects parallaxes of images of the respective objects such that the parallax of the image of any one of the objects among the parallaxes sampled from the images of the respective objects, is nullified.
3. The picture image forming apparatus of claim 1, wherein the intermediate picture image forming means of the picture image forming apparatus forms a plurality of the intermediate picture images and the respective intermediate picture images are provided with observing points different from each other to provide the respective intermediate picture images by imaging the object.
4. The picture image forming apparatus of claim 1, wherein the calculating means of the picture image forming apparatus calculates differences in angle of the images of the object in the two divided picture images and subjects the divided picture images to a rotational transformation such that the calculated differences in angle cancel each other.
5. The picture image forming apparatus of claim 1, wherein the calculating means of the picture image forming apparatus calculates differences in size of the images of the object in the divided picture images and the two divided picture images are magnified or reduced such that the calculated differences in size cancel each other.
6. The picture image forming apparatus of claim 1, wherein the calculating means calculates differences in angle and size of the images of the object in the two divided picture images and subjects the divided picture images to rotational transformation and magnification or reduction such that the calculated differences in angle and size cancel each other, respectively.
7. A picture image forming apparatus comprising:
imaging means for providing a pair of divided picture images having parallaxes in images of objects by imaging the same objects respectively from two different observing points;
calculating means for calculating overlap regions including the images of the objects in the respective divided picture images;
parallax sampling means for sampling the parallaxes from the images of the objects in the overlap regions;
joint line setting means for setting joint lines on a profile of the image of one of the objects most proximate to the imaging means in the respective overlap regions, on the basis of the sampled parallaxes; and
picture image composition means for providing a composite picture image by compositing the pair of divided picture images such that the set joint lines overlap.
8. The picture image forming apparatus of claim 7, wherein the joint line setting means of the picture image forming apparatus sets joint lines on a profile most proximate to rims of the divided picture images among the profiles of the images of the objects in the respective overlap regions.
9. The picture image forming apparatus of claim 7, wherein the joint line setting means of the picture image forming apparatus calculates candidates of a plurality of joint lines by using respective different joint line determining methods and one of the candidates of joint lines is selected and determined to be the joint line.
10. The picture image forming apparatus of claim 7, wherein the parallax sampling means corrects the parallaxes of the images of the respective objects such that a least parallax among the parallaxes sampled from the images of the respective objects is nullified.
11. A storage medium capable of being read by a computer, storing a program for allowing the computer to execute the processings of:
inputting a pair of divided picture images having a parallax by imaging an object and a periphery of the object so as to include a same part of the object in predetermined imaging regions;
calculating overlap regions of the respective divided picture images including the images of the object;
sampling the parallax from the images of the object in the two overlap regions;
forming intermediate picture images to be provided by imaging the object from points between the observing points from which the respective divided picture images are imaged on the basis of the images of the object in the overlap regions of the pair of divided picture images and the sampled parallax; and
providing a composite picture image by interpolating the parallax of the images of the object in the two overlap regions by the intermediate picture images and compositing the pair of divided picture images such that the overlap regions overlap.
12. A storage medium capable of being read by a computer, storing a program for allowing the computer to execute the processings of:
inputting a pair of divided picture images having parallaxes in images of objects by imaging the same objects respectively from two different observing points,
calculating overlap regions including the images of the objects in the respective divided picture images,
sampling the parallaxes from the images of the objects in the overlap regions,
setting joint lines on a profile of the image of one of the objects most proximate to the observing points in the respective overlap regions based on the sampled parallaxes, and
providing a composite picture image by compositing the pair of divided picture images such that the set joint lines overlap.
US08/951,713 1996-10-17 1997-10-16 Picture image forming apparatus Expired - Fee Related US6005987A (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP27432696 1996-10-17
JP8-274326 1996-10-17
JP9269427A JPH10178564A (en) 1996-10-17 1997-10-02 Panorama image generator and recording medium
JP9-269427 1997-10-02

Publications (1)

Publication Number Publication Date
US6005987A true US6005987A (en) 1999-12-21

Family

ID=26548762

Family Applications (1)

Application Number Title Priority Date Filing Date
US08/951,713 Expired - Fee Related US6005987A (en) 1996-10-17 1997-10-16 Picture image forming apparatus

Country Status (4)

Country Link
US (1) US6005987A (en)
EP (1) EP0837428B1 (en)
JP (1) JPH10178564A (en)
DE (1) DE69713243T2 (en)

Cited By (117)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6215914B1 (en) * 1997-06-24 2001-04-10 Sharp Kabushiki Kaisha Picture processing apparatus
US20010019363A1 (en) * 2000-02-29 2001-09-06 Noboru Katta Image pickup system and vehicle-mounted-type sensor system
US6323934B1 (en) * 1997-12-04 2001-11-27 Fuji Photo Film Co., Ltd. Image processing method and apparatus
US20020012477A1 (en) * 2000-06-30 2002-01-31 Hitoshi Inoue Image processing apparatus image processing method and recording medium
US20020041717A1 (en) * 2000-08-30 2002-04-11 Ricoh Company, Ltd. Image processing method and apparatus and computer-readable storage medium using improved distortion correction
US6380539B1 (en) 1997-01-30 2002-04-30 Applied Science Fiction, Inc. Four color trilinear CCD scanning
US6384936B1 (en) * 1998-10-26 2002-05-07 Hewlett-Packard Company Printer effort reduction through caching and reduction
US6393162B1 (en) * 1998-01-09 2002-05-21 Olympus Optical Co., Ltd. Image synthesizing apparatus
US6393160B1 (en) 1998-03-13 2002-05-21 Applied Science Fiction Image defect correction in transform space
US6404516B1 (en) 1999-02-22 2002-06-11 Applied Science Fiction, Inc. Parametric image stitching
US20020106134A1 (en) * 2000-09-22 2002-08-08 Dundon Thomas A. Multiple-orientation image defect detection and correction
US6437358B1 (en) 1999-02-04 2002-08-20 Applied Science Fiction, Inc. Apparatus and methods for capturing defect data
US6442301B1 (en) 1997-01-06 2002-08-27 Applied Science Fiction, Inc. Apparatus and method for defect channel nulling
US6439784B1 (en) 1999-08-17 2002-08-27 Applied Science Fiction, Inc. Method and system for using calibration patches in electronic film processing
US6443639B1 (en) 1999-06-29 2002-09-03 Applied Science Fiction, Inc. Slot coater device for applying developer to film for electronic film development
US6447178B2 (en) 1999-12-30 2002-09-10 Applied Science Fiction, Inc. System, method, and apparatus for providing multiple extrusion widths
US20020126914A1 (en) * 2001-03-07 2002-09-12 Daisuke Kotake Image reproduction apparatus, image processing apparatus, and method therefor
US6461061B2 (en) 1999-12-30 2002-10-08 Applied Science Fiction, Inc. System and method for digital film development using visible light
US6466262B1 (en) * 1997-06-11 2002-10-15 Hitachi, Ltd. Digital wide camera
US20020159165A1 (en) * 2000-09-22 2002-10-31 Ford Gordon D. Lens focusing device, system and method for use with multiple light wavelengths
US6475711B1 (en) 1999-12-31 2002-11-05 Applied Science Fiction, Inc. Photographic element and digital film processing method using same
US6487321B1 (en) 1999-09-16 2002-11-26 Applied Science Fiction Method and system for altering defects in a digital image
US6498867B1 (en) 1999-10-08 2002-12-24 Applied Science Fiction Inc. Method and apparatus for differential illumination image-capturing and defect handling
US20020196369A1 (en) * 2001-06-01 2002-12-26 Peter Rieder Method and device for displaying at least two images within one combined picture
US20030002730A1 (en) * 2001-07-02 2003-01-02 Petrich David B. System and method for discovering and categorizing attributes of a digital image
US6503002B1 (en) 1996-12-05 2003-01-07 Applied Science Fiction, Inc. Method and apparatus for reducing noise in electronic film development
US6505977B2 (en) 1999-12-30 2003-01-14 Applied Science Fiction, Inc. System and method for digital color dye film processing
US20030011827A1 (en) * 1999-12-29 2003-01-16 Applied Science Fiction Distinguishing positive and negative films system and method
US6512601B1 (en) 1998-02-23 2003-01-28 Applied Science Fiction, Inc. Progressive area scan in electronic film development
US6540416B2 (en) 1999-12-30 2003-04-01 Applied Science Fiction, Inc. System and method for digital film development using visible light
US6554504B2 (en) 1999-12-30 2003-04-29 Applied Science Fiction, Inc. Distributed digital film processing system and method
US20030080282A1 (en) * 2001-10-26 2003-05-01 Walley Thomas M. Apparatus and method for three-dimensional relative movement sensing
US6558052B2 (en) 1997-01-30 2003-05-06 Applied Science Fiction, Inc. System and method for latent film recovery in electronic film development
US20030091228A1 (en) * 2001-11-09 2003-05-15 Honda Giken Kogyo Kabushiki Kaisha Image recognition apparatus
US20030118249A1 (en) * 2001-04-19 2003-06-26 Edgar Albert D. Method, system and software for correcting image defects
US6590679B1 (en) 1998-02-04 2003-07-08 Applied Science Fiction, Inc. Multilinear array sensor with an infrared line
US6594041B1 (en) 1998-11-20 2003-07-15 Applied Science Fiction, Inc. Log time processing and stitching system
US6593558B1 (en) 1996-05-10 2003-07-15 Applied Science Fiction, Inc. Luminance-priority electronic color image sensor
US6599036B2 (en) 2000-02-03 2003-07-29 Applied Science Fiction, Inc. Film processing solution cartridge and method for developing and digitizing film
US6611629B2 (en) * 1997-11-03 2003-08-26 Intel Corporation Correcting correlation errors in a composite image
US6614946B1 (en) 1999-10-08 2003-09-02 Eastman Kodak Company System and method for correcting defects in digital images through selective fill-in from surrounding areas
US6619863B2 (en) 2000-02-03 2003-09-16 Eastman Kodak Company Method and system for capturing film images
US6664034B2 (en) 1999-12-31 2003-12-16 Eastman Kodak Company Digital film processing method
US6674485B2 (en) 1998-08-31 2004-01-06 Hitachi Software Engineering Co., Ltd. Apparatus and method for image compositing
US6683995B2 (en) 1999-12-23 2004-01-27 Eastman Kodak Company Method and apparatus for correcting large defects in digital images
US6704458B2 (en) 1999-12-29 2004-03-09 Eastman Kodak Company Method and apparatus for correcting heavily damaged images
US6707557B2 (en) 1999-12-30 2004-03-16 Eastman Kodak Company Method and system for estimating sensor dark current drift and sensor/illumination non-uniformities
US6711302B1 (en) 1999-10-20 2004-03-23 Eastman Kodak Company Method and system for altering defects in digital image
US6720997B1 (en) * 1997-12-26 2004-04-13 Minolta Co., Ltd. Image generating apparatus
US6720560B1 (en) 1999-12-30 2004-04-13 Eastman Kodak Company Method and apparatus for scanning images
US6733960B2 (en) 2001-02-09 2004-05-11 Eastman Kodak Company Digital film processing solutions and method of digital film processing
US6781620B1 (en) 1999-03-16 2004-08-24 Eastman Kodak Company Mixed-element stitching and noise reduction system
US6788335B2 (en) 1999-12-30 2004-09-07 Eastman Kodak Company Pulsed illumination signal modulation control & adjustment method and system
US6786655B2 (en) 2000-02-03 2004-09-07 Eastman Kodak Company Method and system for self-service film processing
US20040184656A1 (en) * 2003-03-19 2004-09-23 Minolta Co., Ltd Method for measuring object based on image and photographing apparatus
US6805501B2 (en) 2001-07-16 2004-10-19 Eastman Kodak Company System and method for digital film development using visible light
US6813392B2 (en) 1999-12-30 2004-11-02 Eastman Kodak Company Method and apparatus for aligning multiple scans of the same area of a medium using mathematical correlation
US20040246333A1 (en) * 2003-06-03 2004-12-09 Steuart Leonard P. (Skip) Digital 3D/360 degree camera system
US6862117B1 (en) 1999-12-30 2005-03-01 Eastman Kodak Company Method and apparatus for reducing the effect of bleed-through on captured images
US6865289B1 (en) * 2000-02-07 2005-03-08 Canon Kabushiki Kaisha Detection and removal of image occlusion errors
US20050083402A1 (en) * 2002-10-31 2005-04-21 Stefan Klose Auto-calibration of multi-projector systems
US6924911B1 (en) 1999-10-12 2005-08-02 Eastman Kodak Company Method and system for multi-sensor signal detection
US20060038879A1 (en) * 2003-12-21 2006-02-23 Kremen Stanley H System and apparatus for recording, transmitting, and projecting digital three-dimensional images
US7015954B1 (en) 1999-08-09 2006-03-21 Fuji Xerox Co., Ltd. Automatic video system using multiple cameras
US20060078224A1 (en) * 2002-08-09 2006-04-13 Masashi Hirosawa Image combination device, image combination method, image combination program, and recording medium containing the image combination program
US20060170772A1 (en) * 2005-01-28 2006-08-03 Technology Advancement Group Surveillance system and method
US7091963B2 (en) 2001-08-01 2006-08-15 Microsoft Corporation Dynamic rendering of ink strokes with transparency
US7168038B2 (en) * 2001-08-01 2007-01-23 Microsoft Corporation System and method for scaling and repositioning drawings
US20070081081A1 (en) * 2005-10-07 2007-04-12 Cheng Brett A Automated multi-frame image capture for panorama stitching using motion sensor
US20070206878A1 (en) * 2002-06-28 2007-09-06 Microsoft Corporation System and method for head size equalization in 360 degree panoramic images
US20070286526A1 (en) * 2006-03-20 2007-12-13 GENERAL DYNAMICS C4 SYSTEMS and ARIZONA BOARD OF REGENTS FOR AND ON BEHALF OF ARIZONA STATE Methods for Multi-Point Descriptors for Image Registrations
US7317834B2 (en) 2000-06-21 2008-01-08 Microsoft Corporation Serial storage of ink and its properties
US20080056262A1 (en) * 2006-09-01 2008-03-06 Dheeraj Singh Approach for fast ip address lookups
US7343053B2 (en) 2001-06-27 2008-03-11 Microsoft Corporation Transform table for ink sizing and compression
US20080065754A1 (en) * 2006-08-17 2008-03-13 Benhase Linda V System And Method For Dynamic Picture Generation In A Web Or Java Application
US7346230B2 (en) 2000-06-21 2008-03-18 Microsoft Corporation Transform table for ink sizing and compression
US20080074500A1 (en) * 1998-05-27 2008-03-27 Transpacific Ip Ltd. Image-Based Method and System for Building Spherical Panoramas
US20080278518A1 (en) * 2007-05-08 2008-11-13 Arcsoft (Shanghai) Technology Company, Ltd Merging Images
US20090002774A1 (en) * 2007-06-27 2009-01-01 Anthony Michael King Phased Illumination Method for Image Capture System
US20090034795A1 (en) * 2006-02-08 2009-02-05 Thales Method for geolocalization of one or more targets
US20090138233A1 (en) * 2005-09-12 2009-05-28 Torsten Kludas Surveying Instrument and Method of Providing Survey Data of a Target Region Using a Surveying Instrument
US20090147004A1 (en) * 2007-12-06 2009-06-11 Barco Nv Method And System For Combining Images Generated By Separate Sources
US20090262180A1 (en) * 2008-04-18 2009-10-22 Samsung Electronics Co., Ltd. Apparatus for generating panoramic images and method thereof
US20100004020A1 (en) * 2008-07-02 2010-01-07 Samsung Electronics Co. Ltd. Mobile terminal and composite photographing method using multiple mobile terminals
US20100020097A1 (en) * 2002-09-19 2010-01-28 M7 Visual Intelligence, L.P. System and method for mosaicing digital ortho-images
US7710436B2 (en) 2000-02-11 2010-05-04 Sony Corporation Automatic color adjustment of a template design
US20100220209A1 (en) * 1999-08-20 2010-09-02 Yissum Research Development Company Of The Hebrew University System and method for rectified mosaicing of images recorded by a moving camera
US7810037B1 (en) 2000-02-11 2010-10-05 Sony Corporation Online story collaboration
US20100321470A1 (en) * 2009-06-22 2010-12-23 Fujifilm Corporation Imaging apparatus and control method therefor
US20110091065A1 (en) * 2009-10-21 2011-04-21 MindTree Limited Image Alignment Using Translation Invariant Feature Matching
US20110311130A1 (en) * 2010-03-19 2011-12-22 Oki Semiconductor Co., Ltd. Image processing apparatus, method, program, and recording medium
CN101815226B (en) * 2009-02-13 2012-01-11 晨星软件研发(深圳)有限公司 Image adjusting apparatus and associated method
US20120069173A1 (en) * 2010-09-16 2012-03-22 Honda Motor Co., Ltd. Workpiece inspecting apparatus and workpiece inspecting method
US20120081510A1 (en) * 2010-09-30 2012-04-05 Casio Computer Co., Ltd. Image processing apparatus, method, and storage medium capable of generating wide angle image
US20120106791A1 (en) * 2010-10-27 2012-05-03 Samsung Techwin Co., Ltd. Image processing apparatus and method thereof
US20120218390A1 (en) * 2011-02-24 2012-08-30 Au Optronics Corporation Interactive stereo display system and method for calculating three-dimensional coordinate
US20120263397A1 (en) * 2011-04-12 2012-10-18 Sony Corporation Image processing device, image processing method, and program
US8407595B1 (en) 2000-02-11 2013-03-26 Sony Corporation Imaging service for automating the display of images
US20130136340A1 (en) * 2011-11-28 2013-05-30 Panasonic Corporation Arithmetic processing device
US20130156264A1 (en) * 2011-12-15 2013-06-20 Linus Mårtensson Minimizing drift using depth camera images
US8483960B2 (en) 2002-09-20 2013-07-09 Visual Intelligence, LP Self-calibrated, remote imaging and data processing system
US20130258044A1 (en) * 2012-03-30 2013-10-03 Zetta Research And Development Llc - Forc Series Multi-lens camera
US20140015956A1 (en) * 2011-03-15 2014-01-16 Omron Corporation Image processing device and image processing program
US8861890B2 (en) 2010-11-24 2014-10-14 Douglas Alan Lefler System and method for assembling and displaying individual images as a continuous image
US8896695B2 (en) 2002-08-28 2014-11-25 Visual Intelligence Lp Retinal concave array compound camera system
US8994822B2 (en) 2002-08-28 2015-03-31 Visual Intelligence Lp Infrastructure mapping system and method
US20150248584A1 (en) * 2012-09-28 2015-09-03 Omg Plc Determination of position from images and associated camera positions
US20150342139A1 (en) * 2013-01-31 2015-12-03 Lely Patent N.V. Camera system, animal related system therewith, and method to create 3d camera images
US9237294B2 (en) 2010-03-05 2016-01-12 Sony Corporation Apparatus and method for replacing a broadcasted advertisement based on both heuristic information and attempts in altering the playback of the advertisement
WO2016005232A1 (en) * 2014-07-11 2016-01-14 Bayerische Motoren Werke Aktiengesellschaft Merging of partial images to form an image of surroundings of a mode of transport
US20160162246A1 (en) * 2014-12-04 2016-06-09 Canon Kabushiki Kaisha Display control apparatus, control method thereof and storage medium
US20160295108A1 (en) * 2015-04-01 2016-10-06 Cheng Cao System and method for panoramic imaging
US20160301841A1 (en) * 2010-05-03 2016-10-13 Invisage Technologies, Inc. Devices and methods for high-resolution image and video capture
US20170178335A1 (en) * 2015-12-18 2017-06-22 Ricoh Co., Ltd. Candidate List Generation
US9832528B2 (en) 2010-10-21 2017-11-28 Sony Corporation System and method for merging network-based content with broadcasted programming content
US10360436B2 (en) * 2015-06-30 2019-07-23 Hitachi Automotive Systems, Ltd. Object detection device
USRE49105E1 (en) 2002-09-20 2022-06-14 Vi Technologies, Llc Self-calibrated, remote imaging and data processing system

Families Citing this family (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3463612B2 (en) 1999-01-21 2003-11-05 日本電気株式会社 Image input method, image input device, and recording medium
JP3799861B2 (en) * 1999-02-24 2006-07-19 株式会社日立製作所 Image synthesizing apparatus and recording medium on which program for executing image synthesizing method is recorded
JP2002208005A (en) * 2001-01-12 2002-07-26 Minolta Co Ltd Image processor, image processing method, image processing program, and computer readable recording medium with image processing program recorded
JP4046957B2 (en) 2001-07-05 2008-02-13 キヤノン株式会社 Imaging apparatus and control method thereof, information processing apparatus and method, and storage medium
JP3821129B2 (en) 2002-04-17 2006-09-13 セイコーエプソン株式会社 Digital camera
JP2004015286A (en) 2002-06-05 2004-01-15 Seiko Epson Corp Digital camera
CN100369460C (en) * 2002-08-01 2008-02-13 精工爱普生株式会社 Digital camera
JP4048907B2 (en) 2002-10-15 2008-02-20 セイコーエプソン株式会社 Panorama composition of multiple image data
JP4606989B2 (en) * 2005-10-07 2011-01-05 富士フイルム株式会社 Imaging device
JP4888192B2 (en) * 2007-03-30 2012-02-29 株式会社ニコン Imaging device
KR101030435B1 (en) 2009-08-07 2011-04-20 포항공과대학교 산학협력단 Method of generating mosaic image in realtime using images on which parallax and moving object exist
GB2473247B (en) 2009-09-04 2015-02-11 Sony Corp A method and apparatus for image alignment
JP5907660B2 (en) * 2011-02-28 2016-04-26 オリンパス株式会社 Imaging device
JP2012205015A (en) * 2011-03-24 2012-10-22 Casio Comput Co Ltd Image processor and image processing method, and program
EP2726937B1 (en) * 2011-06-30 2019-01-23 Nokia Technologies Oy Method, apparatus and computer program product for generating panorama images
JP2013183353A (en) * 2012-03-02 2013-09-12 Toshiba Corp Image processor
JP6019729B2 (en) * 2012-05-11 2016-11-02 ソニー株式会社 Image processing apparatus, image processing method, and program
JP6089742B2 (en) * 2013-02-04 2017-03-08 カシオ計算機株式会社 Image processing apparatus, imaging apparatus, image processing method, and program
DE102013211271A1 (en) * 2013-06-17 2014-12-18 Robert Bosch Gmbh System and method for joining pictures taken by means of a plurality of optical sensors
JP5846268B1 (en) 2014-08-12 2016-01-20 株式会社リコー Image processing system, image processing apparatus, program, and imaging system
JP6606480B2 (en) * 2016-08-12 2019-11-13 日本電信電話株式会社 Panorama video information generating apparatus, panoramic video information generating method used therefor, and panoramic video information generating program
GB201620037D0 (en) * 2016-11-28 2017-01-11 Nokia Technologies Oy Imaging device and method
US10373362B2 (en) * 2017-07-06 2019-08-06 Humaneyes Technologies Ltd. Systems and methods for adaptive stitching of digital images
JP6664082B2 (en) * 2018-03-27 2020-03-13 パナソニックIpマネジメント株式会社 Image processing system, image processing apparatus, and image processing method

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3646336A (en) * 1964-09-04 1972-02-29 Itek Corp Correlation unit
US4602857A (en) * 1982-12-23 1986-07-29 James H. Carmel Panoramic motion picture camera and method
JPH03182976A (en) * 1989-12-11 1991-08-08 Sumitomo Metal Ind Ltd Joining method for digital picture
JPH05122606A (en) * 1991-10-30 1993-05-18 Matsushita Electric Ind Co Ltd Method and device for synthesizing image
US5227832A (en) * 1990-11-22 1993-07-13 Asahi Kogaku Kogyo Kabushiki Kaisha Camera system
EP0605045A1 (en) * 1992-12-29 1994-07-06 Laboratoires D'electronique Philips S.A.S. Image processing method and apparatus for generating one image from adjacent images
EP0650299A1 (en) * 1993-10-20 1995-04-26 Laboratoires D'electronique Philips S.A.S. Image processing system comprising fixed cameras and a system simulating a moving camera
US5631697A (en) * 1991-11-27 1997-05-20 Hitachi, Ltd. Video camera capable of automatic target tracking
US5644414A (en) * 1992-08-21 1997-07-01 Fujitsu, Ltd. Stereoscopic display method of hologram and its forming method and its stereoscopic display apparatus
US5703961A (en) * 1994-12-29 1997-12-30 Worldscape L.L.C. Image transformation and synthesis methods
US5726703A (en) * 1995-07-14 1998-03-10 Pioneer Electronic Corporation Stereoscopic image display system
US5764871A (en) * 1993-10-21 1998-06-09 Eastman Kodak Company Method and apparatus for constructing intermediate images for a depth image from stereo images using velocity vector fields
WO1998034195A1 (en) * 1997-01-30 1998-08-06 Yissum Research Development Company Of The Hebrew University Of Jerusalem Generalized panoramic mosaic
US5825915A (en) * 1995-09-12 1998-10-20 Matsushita Electric Industrial Co., Ltd. Object detecting apparatus in which the position of a planar object is estimated by using hough transform

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5465163A (en) * 1991-03-18 1995-11-07 Canon Kabushiki Kaisha Image processing method and apparatus for processing oversized original images and for synthesizing multiple images
US5452105A (en) * 1992-11-19 1995-09-19 Sharp Kabushiki Kaisha Joint-portion processing device for image data for use in an image processing apparatus
FR2698699B1 (en) * 1992-11-30 1994-12-30 Peugeot Method for detecting contours in a mobile scene and device for implementing this method.

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3646336A (en) * 1964-09-04 1972-02-29 Itek Corp Correlation unit
US4602857A (en) * 1982-12-23 1986-07-29 James H. Carmel Panoramic motion picture camera and method
JPH03182976A (en) * 1989-12-11 1991-08-08 Sumitomo Metal Ind Ltd Joining method for digital picture
US5227832A (en) * 1990-11-22 1993-07-13 Asahi Kogaku Kogyo Kabushiki Kaisha Camera system
JPH05122606A (en) * 1991-10-30 1993-05-18 Matsushita Electric Ind Co Ltd Method and device for synthesizing image
US5631697A (en) * 1991-11-27 1997-05-20 Hitachi, Ltd. Video camera capable of automatic target tracking
US5644414A (en) * 1992-08-21 1997-07-01 Fujitsu, Ltd. Stereoscopic display method of hologram and its forming method and its stereoscopic display apparatus
US5444478A (en) * 1992-12-29 1995-08-22 U.S. Philips Corporation Image processing method and device for constructing an image from adjacent images
EP0605045A1 (en) * 1992-12-29 1994-07-06 Laboratoires D'electronique Philips S.A.S. Image processing method and apparatus for generating one image from adjacent images
EP0650299A1 (en) * 1993-10-20 1995-04-26 Laboratoires D'electronique Philips S.A.S. Image processing system comprising fixed cameras and a system simulating a moving camera
US5650814A (en) * 1993-10-20 1997-07-22 U.S. Philips Corporation Image processing system comprising fixed cameras and a system simulating a mobile camera
US5764871A (en) * 1993-10-21 1998-06-09 Eastman Kodak Company Method and apparatus for constructing intermediate images for a depth image from stereo images using velocity vector fields
US5703961A (en) * 1994-12-29 1997-12-30 Worldscape L.L.C. Image transformation and synthesis methods
US5726703A (en) * 1995-07-14 1998-03-10 Pioneer Electronic Corporation Stereoscopic image display system
US5825915A (en) * 1995-09-12 1998-10-20 Matsushita Electric Industrial Co., Ltd. Object detecting apparatus in which the position of a planar object is estimated by using hough transform
WO1998034195A1 (en) * 1997-01-30 1998-08-06 Yissum Research Development Company Of The Hebrew University Of Jerusalem Generalized panoramic mosaic

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
J.R. Müller et al., "Adaptive-complexity registration of images", Proc. Comp. Soc. Conf. on Comp. Vision and Pattern Recognition, 953 (1994). *
Shenchang E. Chen, "QuickTime® VR--An Image-Based Approach to Virtual Environment Navigation", Computer Graph. Proc., 29 (1995). *

Cited By (196)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6593558B1 (en) 1996-05-10 2003-07-15 Applied Science Fiction, Inc. Luminance-priority electronic color image sensor
US6503002B1 (en) 1996-12-05 2003-01-07 Applied Science Fiction, Inc. Method and apparatus for reducing noise in electronic film development
US6442301B1 (en) 1997-01-06 2002-08-27 Applied Science Fiction, Inc. Apparatus and method for defect channel nulling
US6380539B1 (en) 1997-01-30 2002-04-30 Applied Science Fiction, Inc. Four color trilinear CCD scanning
US6558052B2 (en) 1997-01-30 2003-05-06 Applied Science Fiction, Inc. System and method for latent film recovery in electronic film development
US6466262B1 (en) * 1997-06-11 2002-10-15 Hitachi, Ltd. Digital wide camera
US6215914B1 (en) * 1997-06-24 2001-04-10 Sharp Kabushiki Kaisha Picture processing apparatus
US6611629B2 (en) * 1997-11-03 2003-08-26 Intel Corporation Correcting correlation errors in a composite image
US6782139B2 (en) 1997-11-03 2004-08-24 Intel Corporation Correcting correlation errors in a compound image
US6323934B1 (en) * 1997-12-04 2001-11-27 Fuji Photo Film Co., Ltd. Image processing method and apparatus
US6720997B1 (en) * 1997-12-26 2004-04-13 Minolta Co., Ltd. Image generating apparatus
US6393162B1 (en) * 1998-01-09 2002-05-21 Olympus Optical Co., Ltd. Image synthesizing apparatus
US6590679B1 (en) 1998-02-04 2003-07-08 Applied Science Fiction, Inc. Multilinear array sensor with an infrared line
US6512601B1 (en) 1998-02-23 2003-01-28 Applied Science Fiction, Inc. Progressive area scan in electronic film development
US6393160B1 (en) 1998-03-13 2002-05-21 Applied Science Fiction Image defect correction in transform space
US20080074500A1 (en) * 1998-05-27 2008-03-27 Transpacific Ip Ltd. Image-Based Method and System for Building Spherical Panoramas
US7852376B2 (en) * 1998-05-27 2010-12-14 Ju-Wei Chen Image-based method and system for building spherical panoramas
US6674485B2 (en) 1998-08-31 2004-01-06 Hitachi Software Engineering Co., Ltd. Apparatus and method for image compositing
US6384936B1 (en) * 1998-10-26 2002-05-07 Hewlett-Packard Company Printer effort reduction through caching and reduction
US6594041B1 (en) 1998-11-20 2003-07-15 Applied Science Fiction, Inc. Log time processing and stitching system
US6437358B1 (en) 1999-02-04 2002-08-20 Applied Science Fiction, Inc. Apparatus and methods for capturing defect data
US6404516B1 (en) 1999-02-22 2002-06-11 Applied Science Fiction, Inc. Parametric image stitching
US6781620B1 (en) 1999-03-16 2004-08-24 Eastman Kodak Company Mixed-element stitching and noise reduction system
US6443639B1 (en) 1999-06-29 2002-09-03 Applied Science Fiction, Inc. Slot coater device for applying developer to film for electronic film development
US7710463B2 (en) * 1999-08-09 2010-05-04 Fuji Xerox Co., Ltd. Method and system for compensating for parallax in multiple camera systems
US7277118B2 (en) 1999-08-09 2007-10-02 Fuji Xerox Co., Ltd. Method and system for compensating for parallax in multiple camera systems
US7015954B1 (en) 1999-08-09 2006-03-21 Fuji Xerox Co., Ltd. Automatic video system using multiple cameras
US6439784B1 (en) 1999-08-17 2002-08-27 Applied Science Fiction, Inc. Method and system for using calibration patches in electronic film processing
US20100220209A1 (en) * 1999-08-20 2010-09-02 Yissum Research Development Company Of The Hebrew University System and method for rectified mosaicing of images recorded by a moving camera
US6650789B2 (en) 1999-09-16 2003-11-18 Eastman Kodak Company Method and system for altering defects in a digital image
US6487321B1 (en) 1999-09-16 2002-11-26 Applied Science Fiction Method and system for altering defects in a digital image
US6498867B1 (en) 1999-10-08 2002-12-24 Applied Science Fiction Inc. Method and apparatus for differential illumination image-capturing and defect handling
US6614946B1 (en) 1999-10-08 2003-09-02 Eastman Kodak Company System and method for correcting defects in digital images through selective fill-in from surrounding areas
US6924911B1 (en) 1999-10-12 2005-08-02 Eastman Kodak Company Method and system for multi-sensor signal detection
US6711302B1 (en) 1999-10-20 2004-03-23 Eastman Kodak Company Method and system for altering defects in digital image
US6683995B2 (en) 1999-12-23 2004-01-27 Eastman Kodak Company Method and apparatus for correcting large defects in digital images
US7164511B2 (en) 1999-12-29 2007-01-16 Eastman Kodak Company Distinguishing positive and negative films system and method
US6704458B2 (en) 1999-12-29 2004-03-09 Eastman Kodak Company Method and apparatus for correcting heavily damaged images
US20030011827A1 (en) * 1999-12-29 2003-01-16 Applied Science Fiction Distinguishing positive and negative films system and method
US6793417B2 (en) 1999-12-30 2004-09-21 Eastman Kodak Company System and method for digital film development using visible light
US6862117B1 (en) 1999-12-30 2005-03-01 Eastman Kodak Company Method and apparatus for reducing the effect of bleed-through on captured images
US6788335B2 (en) 1999-12-30 2004-09-07 Eastman Kodak Company Pulsed illumination signal modulation control & adjustment method and system
US6505977B2 (en) 1999-12-30 2003-01-14 Applied Science Fiction, Inc. System and method for digital color dye film processing
US6720560B1 (en) 1999-12-30 2004-04-13 Eastman Kodak Company Method and apparatus for scanning images
US6447178B2 (en) 1999-12-30 2002-09-10 Applied Science Fiction, Inc. System, method, and apparatus for providing multiple extrusion widths
US6813392B2 (en) 1999-12-30 2004-11-02 Eastman Kodak Company Method and apparatus for aligning multiple scans of the same area of a medium using mathematical correlation
US6461061B2 (en) 1999-12-30 2002-10-08 Applied Science Fiction, Inc. System and method for digital film development using visible light
US6705777B2 (en) 1999-12-30 2004-03-16 Eastman Kodak Company System and method for digital film development using visible light
US6707557B2 (en) 1999-12-30 2004-03-16 Eastman Kodak Company Method and system for estimating sensor dark current drift and sensor/illumination non-uniformities
US6554504B2 (en) 1999-12-30 2003-04-29 Applied Science Fiction, Inc. Distributed digital film processing system and method
US6540416B2 (en) 1999-12-30 2003-04-01 Applied Science Fiction, Inc. System and method for digital film development using visible light
US6824966B2 (en) 1999-12-31 2004-11-30 Eastman Kodak Company Digital film processing method
US6664034B2 (en) 1999-12-31 2003-12-16 Eastman Kodak Company Digital film processing method
US6475711B1 (en) 1999-12-31 2002-11-05 Applied Science Fiction, Inc. Photographic element and digital film processing method using same
US6786655B2 (en) 2000-02-03 2004-09-07 Eastman Kodak Company Method and system for self-service film processing
US6599036B2 (en) 2000-02-03 2003-07-29 Applied Science Fiction, Inc. Film processing solution cartridge and method for developing and digitizing film
US6619863B2 (en) 2000-02-03 2003-09-16 Eastman Kodak Company Method and system for capturing film images
US6865289B1 (en) * 2000-02-07 2005-03-08 Canon Kabushiki Kaisha Detection and removal of image occlusion errors
US8407595B1 (en) 2000-02-11 2013-03-26 Sony Corporation Imaging service for automating the display of images
US7810037B1 (en) 2000-02-11 2010-10-05 Sony Corporation Online story collaboration
US8694896B2 (en) 2000-02-11 2014-04-08 Sony Corporation Online story collaboration
US7843464B2 (en) 2000-02-11 2010-11-30 Sony Corporation Automatic color adjustment of template design
US20100325558A1 (en) * 2000-02-11 2010-12-23 Eric Edwards Online story collaboration
US8049766B2 (en) 2000-02-11 2011-11-01 Sony Corporation Automatic color adjustment of a template design
US7710436B2 (en) 2000-02-11 2010-05-04 Sony Corporation Automatic color adjustment of a template design
US8184124B2 (en) 2000-02-11 2012-05-22 Sony Corporation Automatic color adjustment of a template design
US8345062B2 (en) 2000-02-11 2013-01-01 Sony Corporation Automatic color adjustment of a template design
US6972796B2 (en) * 2000-02-29 2005-12-06 Matsushita Electric Industrial Co., Ltd. Image pickup system and vehicle-mounted-type sensor system
US20010019363A1 (en) * 2000-02-29 2001-09-06 Noboru Katta Image pickup system and vehicle-mounted-type sensor system
US7397949B2 (en) 2000-06-21 2008-07-08 Microsoft Corporation Serial storage of ink and its properties
US7317834B2 (en) 2000-06-21 2008-01-08 Microsoft Corporation Serial storage of ink and its properties
US7346230B2 (en) 2000-06-21 2008-03-18 Microsoft Corporation Transform table for ink sizing and compression
US7319789B2 (en) 2000-06-21 2008-01-15 Microsoft Corporation Serial storage of ink and its properties
US7321689B2 (en) 2000-06-21 2008-01-22 Microsoft Corporation Serial storage of ink and its properties
US6961478B2 (en) * 2000-06-30 2005-11-01 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and recording medium
US20020012477A1 (en) * 2000-06-30 2002-01-31 Hitoshi Inoue Image processing apparatus image processing method and recording medium
US7313289B2 (en) * 2000-08-30 2007-12-25 Ricoh Company, Ltd. Image processing method and apparatus and computer-readable storage medium using improved distortion correction
US20020041717A1 (en) * 2000-08-30 2002-04-11 Ricoh Company, Ltd. Image processing method and apparatus and computer-readable storage medium using improved distortion correction
US6750435B2 (en) 2000-09-22 2004-06-15 Eastman Kodak Company Lens focusing device, system and method for use with multiple light wavelengths
US20020106134A1 (en) * 2000-09-22 2002-08-08 Dundon Thomas A. Multiple-orientation image defect detection and correction
US20020159165A1 (en) * 2000-09-22 2002-10-31 Ford Gordon D. Lens focusing device, system and method for use with multiple light wavelengths
US6733960B2 (en) 2001-02-09 2004-05-11 Eastman Kodak Company Digital film processing solutions and method of digital film processing
US7103232B2 (en) * 2001-03-07 2006-09-05 Canon Kabushiki Kaisha Storing and processing partial images obtained from a panoramic image
US20020126914A1 (en) * 2001-03-07 2002-09-12 Daisuke Kotake Image reproduction apparatus, image processing apparatus, and method therefor
US20030118249A1 (en) * 2001-04-19 2003-06-26 Edgar Albert D. Method, system and software for correcting image defects
US6987892B2 (en) 2001-04-19 2006-01-17 Eastman Kodak Company Method, system and software for correcting image defects
US20020196369A1 (en) * 2001-06-01 2002-12-26 Peter Rieder Method and device for displaying at least two images within one combined picture
US7050112B2 (en) * 2001-06-01 2006-05-23 Micronas Gmbh Method and device for displaying at least two images within one combined picture
US7346229B2 (en) 2001-06-27 2008-03-18 Microsoft Corporation Transform table for ink sizing and compression
US7343053B2 (en) 2001-06-27 2008-03-11 Microsoft Corporation Transform table for ink sizing and compression
US20030002730A1 (en) * 2001-07-02 2003-01-02 Petrich David B. System and method for discovering and categorizing attributes of a digital image
US7764829B2 (en) 2001-07-02 2010-07-27 Petrich David B System and method for discovering and categorizing attributes of a digital image
US20060280368A1 (en) * 2001-07-02 2006-12-14 Photoinaphoto.Com, Inc. System and method for discovering and categorizing attributes of a digital image
US7113633B2 (en) 2001-07-02 2006-09-26 Photoinaphoto.Com, Inc. System and method for discovering and categorizing attributes of a digital image
US6805501B2 (en) 2001-07-16 2004-10-19 Eastman Kodak Company System and method for digital film development using visible light
US7236180B2 (en) 2001-08-01 2007-06-26 Microsoft Corporation Dynamic rendering of ink strokes with transparency
US7168038B2 (en) * 2001-08-01 2007-01-23 Microsoft Corporation System and method for scaling and repositioning drawings
US7091963B2 (en) 2001-08-01 2006-08-15 Microsoft Corporation Dynamic rendering of ink strokes with transparency
US7352366B2 (en) 2001-08-01 2008-04-01 Microsoft Corporation Dynamic rendering of ink strokes with transparency
US6770863B2 (en) * 2001-10-26 2004-08-03 Agilent Technologies, Inc. Apparatus and method for three-dimensional relative movement sensing
US20030080282A1 (en) * 2001-10-26 2003-05-01 Walley Thomas M. Apparatus and method for three-dimensional relative movement sensing
US7474765B2 (en) * 2001-11-09 2009-01-06 Honda Giken Kogyo Kabushiki Kaisha Image recognition apparatus
US20030091228A1 (en) * 2001-11-09 2003-05-15 Honda Giken Kogyo Kabushiki Kaisha Image recognition apparatus
US20070053584A1 (en) * 2001-11-09 2007-03-08 Honda Giken Kogyo Kabushiki Kaisha Image recognition apparatus
US7158664B2 (en) * 2001-11-09 2007-01-02 Honda Giken Kogyo Kabushiki Kaisha Image recognition apparatus
US7327899B2 (en) * 2002-06-28 2008-02-05 Microsoft Corp. System and method for head size equalization in 360 degree panoramic images
US20070206878A1 (en) * 2002-06-28 2007-09-06 Microsoft Corporation System and method for head size equalization in 360 degree panoramic images
US7623733B2 (en) * 2002-08-09 2009-11-24 Sharp Kabushiki Kaisha Image combination device, image combination method, image combination program, and recording medium for combining images having at least partially same background
US20060078224A1 (en) * 2002-08-09 2006-04-13 Masashi Hirosawa Image combination device, image combination method, image combination program, and recording medium containing the image combination program
US8896695B2 (en) 2002-08-28 2014-11-25 Visual Intelligence Lp Retinal concave array compound camera system
US8994822B2 (en) 2002-08-28 2015-03-31 Visual Intelligence Lp Infrastructure mapping system and method
US7925114B2 (en) * 2002-09-19 2011-04-12 Visual Intelligence, LP System and method for mosaicing digital ortho-images
US20100020097A1 (en) * 2002-09-19 2010-01-28 M7 Visual Intelligence, L.P. System and method for mosaicing digital ortho-images
USRE49105E1 (en) 2002-09-20 2022-06-14 Vi Technologies, Llc Self-calibrated, remote imaging and data processing system
US8483960B2 (en) 2002-09-20 2013-07-09 Visual Intelligence, LP Self-calibrated, remote imaging and data processing system
US9797980B2 (en) 2002-09-20 2017-10-24 Visual Intelligence Lp Self-calibrated, remote imaging and data processing system
US9389298B2 (en) 2002-09-20 2016-07-12 Visual Intelligence Lp Self-calibrated, remote imaging and data processing system
US7215362B2 (en) * 2002-10-31 2007-05-08 Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. Auto-calibration of multi-projector systems
US20050083402A1 (en) * 2002-10-31 2005-04-21 Stefan Klose Auto-calibration of multi-projector systems
US20040184656A1 (en) * 2003-03-19 2004-09-23 Minolta Co., Ltd Method for measuring object based on image and photographing apparatus
US7269281B2 (en) * 2003-03-19 2007-09-11 Minolta Co., Ltd. Method for measuring object based on image and photographing apparatus
US9124802B2 (en) 2003-06-03 2015-09-01 Leonard P. Steuart, III Digital 3D/360 degree camera system
US8274550B2 (en) 2003-06-03 2012-09-25 Steuart Iii Leonard P Skip Digital 3D/360 degree camera system
US11012622B2 (en) 2003-06-03 2021-05-18 Leonard P. Steuart, III Digital 3D/360 degree camera system
US20040246333A1 (en) * 2003-06-03 2004-12-09 Steuart Leonard P. (Skip) Digital 3D/360 degree camera system
US10574888B2 (en) 2003-06-03 2020-02-25 Leonard P. Steuart, III Digital 3D/360 degree camera system
US8937640B2 (en) 2003-06-03 2015-01-20 Leonard P. Steuart, III Digital 3D/360 degree camera system
US7463280B2 (en) 2003-06-03 2008-12-09 Steuart Iii Leonard P Digital 3D/360 degree camera system
US9706119B2 (en) 2003-06-03 2017-07-11 Leonard P. Steuart, III Digital 3D/360 degree camera system
US8659640B2 (en) 2003-06-03 2014-02-25 Leonard P. Steuart, III Digital 3D/360° camera system
US10218903B2 (en) 2003-06-03 2019-02-26 Leonard P. Steuart, III Digital 3D/360 degree camera system
US7027081B2 (en) 2003-12-21 2006-04-11 Kremen Stanley H System and apparatus for recording, transmitting, and projecting digital three-dimensional images
US20060038879A1 (en) * 2003-12-21 2006-02-23 Kremen Stanley H System and apparatus for recording, transmitting, and projecting digital three-dimensional images
US20060170772A1 (en) * 2005-01-28 2006-08-03 Technology Advancement Group Surveillance system and method
US7609290B2 (en) * 2005-01-28 2009-10-27 Technology Advancement Group, Inc. Surveillance system and method
US8024144B2 (en) * 2005-09-12 2011-09-20 Trimble Jena Gmbh Surveying instrument and method of providing survey data of a target region using a surveying instrument
US20090138233A1 (en) * 2005-09-12 2009-05-28 Torsten Kludas Surveying Instrument and Method of Providing Survey Data of a Target Region Using a Surveying Instrument
US20070081081A1 (en) * 2005-10-07 2007-04-12 Cheng Brett A Automated multi-frame image capture for panorama stitching using motion sensor
US8107684B2 (en) * 2006-02-08 2012-01-31 Thales Method for geolocalization of one or more targets
US20090034795A1 (en) * 2006-02-08 2009-02-05 Thales Method for geolocalization of one or more targets
US20070286526A1 (en) * 2006-03-20 2007-12-13 GENERAL DYNAMICS C4 SYSTEMS and ARIZONA BOARD OF REGENTS FOR AND ON BEHALF OF ARIZONA STATE Methods for Multi-Point Descriptors for Image Registrations
US8417060B2 (en) * 2006-03-20 2013-04-09 Arizona Board Of Regents For And On Behalf Of Arizona State University Methods for multi-point descriptors for image registrations
US20080065754A1 (en) * 2006-08-17 2008-03-13 Benhase Linda V System And Method For Dynamic Picture Generation In A Web Or Java Application
US20080056262A1 (en) * 2006-09-01 2008-03-06 Dheeraj Singh Approach for fast ip address lookups
US20080278518A1 (en) * 2007-05-08 2008-11-13 Arcsoft (Shanghai) Technology Company, Ltd Merging Images
US8275215B2 (en) * 2007-05-08 2012-09-25 Arcsoft (Shanghai) Technology Company, Ltd Merging images
US20090002774A1 (en) * 2007-06-27 2009-01-01 Anthony Michael King Phased Illumination Method for Image Capture System
US8203764B2 (en) * 2007-06-27 2012-06-19 Lexmark International, Inc. Phased illumination method for image capture system
US20090147004A1 (en) * 2007-12-06 2009-06-11 Barco Nv Method And System For Combining Images Generated By Separate Sources
US8164600B2 (en) * 2007-12-06 2012-04-24 Barco Nv Method and system for combining images generated by separate sources
US20090262180A1 (en) * 2008-04-18 2009-10-22 Samsung Electronics Co., Ltd. Apparatus for generating panoramic images and method thereof
US20100004020A1 (en) * 2008-07-02 2010-01-07 Samsung Electronics Co. Ltd. Mobile terminal and composite photographing method using multiple mobile terminals
CN101815226B (en) * 2009-02-13 2012-01-11 晨星软件研发(深圳)有限公司 Image adjusting apparatus and associated method
US20100321470A1 (en) * 2009-06-22 2010-12-23 Fujifilm Corporation Imaging apparatus and control method therefor
US20110091065A1 (en) * 2009-10-21 2011-04-21 MindTree Limited Image Alignment Using Translation Invariant Feature Matching
US8385689B2 (en) * 2009-10-21 2013-02-26 MindTree Limited Image alignment using translation invariant feature matching
US9237294B2 (en) 2010-03-05 2016-01-12 Sony Corporation Apparatus and method for replacing a broadcasted advertisement based on both heuristic information and attempts in altering the playback of the advertisement
US8917929B2 (en) * 2010-03-19 2014-12-23 Lapis Semiconductor Co., Ltd. Image processing apparatus, method, program, and recording medium
US20110311130A1 (en) * 2010-03-19 2011-12-22 Oki Semiconductor Co., Ltd. Image processing apparatus, method, program, and recording medium
US20160301841A1 (en) * 2010-05-03 2016-10-13 Invisage Technologies, Inc. Devices and methods for high-resolution image and video capture
US10506147B2 (en) * 2010-05-03 2019-12-10 Invisage Technologies, Inc. Devices and methods for high-resolution image and video capture
US8723946B2 (en) * 2010-09-16 2014-05-13 Honda Motor Co., Ltd. Workpiece inspecting apparatus and workpiece inspecting method
US20120069173A1 (en) * 2010-09-16 2012-03-22 Honda Motor Co., Ltd. Workpiece inspecting apparatus and workpiece inspecting method
US9699378B2 (en) * 2010-09-30 2017-07-04 Casio Computer Co., Ltd. Image processing apparatus, method, and storage medium capable of generating wide angle image
US20120081510A1 (en) * 2010-09-30 2012-04-05 Casio Computer Co., Ltd. Image processing apparatus, method, and storage medium capable of generating wide angle image
US9832528B2 (en) 2010-10-21 2017-11-28 Sony Corporation System and method for merging network-based content with broadcasted programming content
US8983121B2 (en) * 2010-10-27 2015-03-17 Samsung Techwin Co., Ltd. Image processing apparatus and method thereof
US20120106791A1 (en) * 2010-10-27 2012-05-03 Samsung Techwin Co., Ltd. Image processing apparatus and method thereof
US8861890B2 (en) 2010-11-24 2014-10-14 Douglas Alan Lefler System and method for assembling and displaying individual images as a continuous image
US8531506B2 (en) * 2011-02-24 2013-09-10 Au Optronics Corporation Interactive stereo display system and method for calculating three-dimensional coordinate
US20120218390A1 (en) * 2011-02-24 2012-08-30 Au Optronics Corporation Interactive stereo display system and method for calculating three-dimensional coordinate
US9998683B2 (en) * 2011-03-15 2018-06-12 Omron Corporation Image processing device and image processing program
US20140015956A1 (en) * 2011-03-15 2014-01-16 Omron Corporation Image processing device and image processing program
US20120263397A1 (en) * 2011-04-12 2012-10-18 Sony Corporation Image processing device, image processing method, and program
US20130136340A1 (en) * 2011-11-28 2013-05-30 Panasonic Corporation Arithmetic processing device
US9076215B2 (en) * 2011-11-28 2015-07-07 Panasonic Intellectual Property Management Co., Ltd. Arithmetic processing device
US20130156264A1 (en) * 2011-12-15 2013-06-20 Linus Mårtensson Minimizing drift using depth camera images
US9111351B2 (en) * 2011-12-15 2015-08-18 Sony Corporation Minimizing drift using depth camera images
US20130258044A1 (en) * 2012-03-30 2013-10-03 Zetta Research And Development Llc - Forc Series Multi-lens camera
US10885328B2 (en) 2012-09-28 2021-01-05 The Boeing Company Determination of position from images and associated camera positions
US9607219B2 (en) * 2012-09-28 2017-03-28 2D3 Limited Determination of position from images and associated camera positions
US20150248584A1 (en) * 2012-09-28 2015-09-03 Omg Plc Determination of position from images and associated camera positions
US10311297B2 (en) 2012-09-28 2019-06-04 The Boeing Company Determination of position from images and associated camera positions
US20150342139A1 (en) * 2013-01-31 2015-12-03 Lely Patent N.V. Camera system, animal related system therewith, and method to create 3d camera images
US10426127B2 (en) * 2013-01-31 2019-10-01 Lely Patent N.V. Camera system, animal related system therewith, and method to create 3D camera images
WO2016005232A1 (en) * 2014-07-11 2016-01-14 Bayerische Motoren Werke Aktiengesellschaft Merging of partial images to form an image of surroundings of a mode of transport
CN107027329A (en) * 2014-07-11 2017-08-08 Bayerische Motoren Werke Aktiengesellschaft Stitching partial images of the surroundings of a vehicle into one image
US20170116710A1 (en) * 2014-07-11 2017-04-27 Bayerische Motoren Werke Aktiengesellschaft Merging of Partial Images to Form an Image of Surroundings of a Mode of Transport
US11087438B2 (en) * 2014-07-11 2021-08-10 Bayerische Motoren Werke Aktiengesellschaft Merging of partial images to form an image of surroundings of a mode of transport
CN107027329B (en) * 2014-07-11 2021-12-14 Bayerische Motoren Werke Aktiengesellschaft Stitching partial images of the surroundings of a vehicle into one image
US10209943B2 (en) * 2014-12-04 2019-02-19 Canon Kabushiki Kaisha Display control apparatus, control method thereof and storage medium
US20160162246A1 (en) * 2014-12-04 2016-06-09 Canon Kabushiki Kaisha Display control apparatus, control method thereof and storage medium
US20160295108A1 (en) * 2015-04-01 2016-10-06 Cheng Cao System and method for panoramic imaging
US10360436B2 (en) * 2015-06-30 2019-07-23 Hitachi Automotive Systems, Ltd. Object detection device
US9875548B2 (en) * 2015-12-18 2018-01-23 Ricoh Co., Ltd. Candidate list generation
US20170178335A1 (en) * 2015-12-18 2017-06-22 Ricoh Co., Ltd. Candidate List Generation

Also Published As

Publication number Publication date
EP0837428B1 (en) 2002-06-12
DE69713243D1 (en) 2002-07-18
JPH10178564A (en) 1998-06-30
EP0837428A2 (en) 1998-04-22
DE69713243T2 (en) 2003-01-23
EP0837428A3 (en) 2000-01-05

Similar Documents

Publication Publication Date Title
US6005987A (en) Picture image forming apparatus
KR101956149B1 (en) Efficient Determination of Optical Flow Between Images
US8189031B2 (en) Method and apparatus for providing panoramic view with high speed image matching and mild mixed color blending
US8175399B2 (en) Multiple-resolution image processing apparatus
EP2437494B1 (en) Device for monitoring area around vehicle
EP2518995B1 (en) Multocular image pickup apparatus and multocular image pickup method
KR100414629B1 (en) 3D display image generation method, image processing method using depth information, depth information generation method
US20100103175A1 (en) Method for generating a high-resolution virtual-focal-plane image
US5602584A (en) Apparatus for producing a panoramic image using a plurality of optical systems
EP1762979B1 (en) Layered image based rendering
US7933332B2 (en) Method and device for determination of motion vectors that are coordinated with regions of an image
JPH1093808A (en) Image synthesis device/method
EP3446283B1 (en) Image stitching method and device
WO2010095460A1 (en) Image processing system, image processing method, and image processing program
JP7150134B2 (en) Head-mounted display and image display method
EP3929900A1 (en) Image generation device, head-mounted display, and image generation method
JP5566199B2 (en) Image processing apparatus, control method therefor, and program
JPH10126665A (en) Image composing device
JP7142762B2 (en) Display device and image display method
WO2019026183A1 (en) Image generation device and image generation method
CN114972025A (en) Image fast splicing method based on YUV color space
JP3786300B2 (en) Motion vector detection apparatus and motion vector detection method
TWI375182B (en) A fast dynamic stitching point search method
JP3915664B2 (en) Image processing apparatus, image processing method used therefor, and program therefor
JPH09231408A (en) Moving image compositing device

Legal Events

Date Code Title Description
AS Assignment. Owner name: SHARP KABUSHIKI KAISHA, JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:NAKAMURA, MITSUAKI;KITAMURA, YOSHIHIRO;REEL/FRAME:008940/0460. Effective date: 19971202
AS Assignment. Owner name: SHARP KABUSHIKI KAISHA, JAPAN. Free format text: RE-RECORD TO ADD ASSIGNOR NAME THAT WAS LEFT OFF ON A PREVIOUSLY RECORD DOCUMENT AT REEL 8940 FRAME 0460.;ASSIGNORS:NAKAMURA, MITSUAKI;KITAMURA, YOSHIHIRO;AKAGI, HIROSHI;REEL/FRAME:009048/0657. Effective date: 19971202
FPAY Fee payment. Year of fee payment: 4
FPAY Fee payment. Year of fee payment: 8
REMI Maintenance fee reminder mailed
LAPS Lapse for failure to pay maintenance fees
STCH Information on status: patent discontinuation. Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362
FP Lapsed due to failure to pay maintenance fee. Effective date: 20111221