WO1998025402A1 - A method and system for assisting in the manual capture of overlapping images for composite image generation in a digital camera - Google Patents

Info

Publication number
WO1998025402A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
zone
view finder
user
camera
Application number
PCT/US1997/022387
Other languages
French (fr)
Inventor
Eric E. Anderson
Original Assignee
Flashpoint Technology, Inc.
Application filed by Flashpoint Technology, Inc.
Priority to AU53760/98A
Publication of WO1998025402A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N5/2624 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects for obtaining an image which is composed of whole input images, e.g. splitscreen
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/698 Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture

Definitions

  • the view finder 402 displays a series of cells 420 that represent the digital images that have been captured in the digital camera.
  • the view finder 402 is shown here as displaying nine image cells 420.
  • Each cell 420 displays a small-sized image corresponding to one of the captured images.
  • the user may navigate through the series of displayed cells 420 in the view finder 402 using the four-way navigation control button 406. As the user navigates through the cells 420, the old image cells 420 are scrolled-off the view finder 402 and replaced by new image cells 420 representing other images stored in the camera.
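The cell navigation just described can be modeled as a sliding window over the list of captured images. The following sketch is purely illustrative: the nine-cell window, the clamping behavior, and the image names are assumptions, not details from the patent.

```python
# Hypothetical sketch of review-mode cell navigation: a fixed window of
# nine cells slides over the captured images as the user presses the
# four-way navigation control. Window size and clamping are assumptions.
CELLS = 9

def visible_cells(images, top_index):
    # Clamp so the window never scrolls past either end of the list.
    top_index = max(0, min(top_index, max(0, len(images) - CELLS)))
    return images[top_index:top_index + CELLS]

images = [f"img{i}" for i in range(12)]
print(visible_cells(images, 0))   # the first nine thumbnails
print(visible_cells(images, 99))  # clamped to the last nine
```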
  • the present invention provides a method and system for assisting a user in manually capturing a series of overlapping images in order to create a single composite image or panorama. Although the present invention will be explained with reference to the digital camera described herein, one with ordinary skill in the art will recognize that the method and system of the present invention will function with a conventional camera equipped with an electric view finder to create a panorama as well.
  • FIGS. 6 A and 6B are diagrams illustrating the capture of a series of overlapping images by a camera for use in composite image generation.
  • FIG. 6A is a top view showing the camera rotated into three positions to capture three corresponding images.
  • FIG. 6B shows the area of overlap 440a between image 1 and image 2, and the area of overlap 440b between image 2 and image 3.
  • the generation of a composite image from overlapping images typically requires an overlap area 440 of approximately twenty-five percent between two images.
  • the present invention provides a method and system for assisting a user to manually align and capture the overlapping images without extra equipment. This is accomplished by dividing the view finder 402 of the camera into two zones, where one zone displays a live image and the other zone displays a still image of the overlapping portion of the last captured image. This enables the user to manually align the live image with the still image without the need for alignment equipment, such as a tripod etc.
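Given the roughly twenty-five percent overlap mentioned above, the number of shots a panorama needs follows from simple geometry: each shot after the first contributes only its non-overlapping portion. A sketch, with all field-of-view numbers as illustrative assumptions rather than figures from the patent:

```python
import math

# Sketch (not from the patent): estimating how many overlapping shots a
# panorama needs, assuming each shot covers a fixed horizontal field of
# view and adjacent shots overlap by roughly twenty-five percent.
def shots_needed(panorama_deg, shot_fov_deg, overlap=0.25):
    if panorama_deg <= shot_fov_deg:
        return 1
    # Each shot after the first adds only its non-overlapping portion.
    step = shot_fov_deg * (1.0 - overlap)
    return 1 + math.ceil((panorama_deg - shot_fov_deg) / step)

# A 125-degree scene with an assumed 50-degree lens and 25% overlap
# takes three shots, matching the three-image example of FIGS. 6A and 6B.
print(shots_needed(125, 50))  # 3
```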
  • FIG. 7 is a flow chart depicting the process of assisting in the manual capture of overlapping images for composite image generation in a preferred embodiment of the present invention.
  • FIG. 8 is a diagram showing an example scene that may be used to create a panorama using three overlapping images, as shown in FIGS. 6A and 6B. Although a three-image example is used here, a composite image may be made with any number of overlapping images.
  • the user captures the first image by placing the camera into position 1 so that one edge of the scene appears in the view finder 402 (the left edge in this example).
  • the view finder 402 of the camera is divided into two separate zones in step 556 in accordance with the present invention.
  • FIGS. 9A and 9B are diagrams illustrating the camera view finder 402 divided into zones A and B.
  • zone A in this example, displays a portion of the previously captured image that overlaps the next image in step 558.
  • zone B displays a live image of the scene as shown through the camera's imaging device in step 560.
  • a still image of the overlap area 440a of the first image is displayed in zone A of the view finder 402, while a live image of the next image in the scene is displayed in zone B.
  • zones A and B are shaped by a dividing line 580 comprising a series of darkened pixels.
  • the dividing line 580 is shown here as interlocking cut-outs, but could also be drawn as a straight line, a diagonal line, or a zig-zag line, for instance.
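A minimal sketch of drawing such a dividing line of darkened pixels into a frame buffer. The straight-line variant is shown; the buffer dimensions and the position of the zone boundary are assumptions for illustration only.

```python
# Sketch: drawing the dividing line 580 as a column of darkened pixels
# separating zone A from zone B. A straight vertical line is used here;
# the interlocking cut-outs of the figures would darken a different set
# of boundary pixels. Dimensions and the zone split are assumptions.
WIDTH, HEIGHT = 64, 48
BOUNDARY = WIDTH // 4          # assume zone A is the left quarter

# Start with an all-white frame (255 = lit pixel, 0 = darkened pixel).
frame = [[255] * WIDTH for _ in range(HEIGHT)]
for row in frame:
    row[BOUNDARY] = 0          # darkened pixel on the zone boundary

print(sum(row.count(0) for row in frame))  # one dark pixel per row: 48
```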
  • after displaying the live image in zone B in step 560, the user establishes horizontal and vertical alignment between the live image in zone B and the still image in zone A in step 562 by altering the position of the camera.
  • the view finder 402 is again divided into two zones in step 556, and the process continues. This is shown in FIG. 9B which shows that after the user captures the second image from position 2, the overlap area 440b of the second image is displayed in zone A of the view finder 402, while zone B displays the live view of the scene in camera position 3.
  • the dividing of the view finder 402 into separate zones is accomplished by manipulating the input buffers 538 and the frame buffers 536 (FIG. 4B).
  • the input buffers 538 and the frame buffers 536 are manipulated by dividing each of the input buffers A and B and each of the frame buffers A and B into two zones (A and B) corresponding to the two zones in the view finder 402. This division of the input and frame buffers 538 and 536 is a multi-stage process.
  • the input and frame buffers 538 and 536 are processed as shown with reference to FIG. 4B.
  • a post-capture process is performed on the input buffers 538 and the frame buffers 536 to display the overlap portion of the previously captured image in zone A of the view finder 402.
  • the input buffers 538 and the frame buffers 536 are processed according to a modified live view process to display a view only in zone B of the view finder 402.
  • the input buffers 538 and the frame buffers 536 are processed by the live view generation program running on CPU 344, but could also be processed using well known hardware operating in accordance with the present invention.
  • FIGS. 10 and 11 are block diagrams illustrating the post-capture processing of the input buffers 538 and the frame buffers 536.
  • each of the input buffers A and B and each of the frame buffers A and B are divided into two zones (A and B) corresponding to the two zones in the view finder 402, where zone A corresponds to the overlap area of the previously captured image.
  • zone A corresponds to the overlap area of the previously captured image.
  • the CPU 344 processes the data from zone A of input buffer A and transfers the data to zone A of frame buffer A for output to the view finder 402.
  • the CPU 344 processes the data from zone A of input buffer B and transfers the data to zone A of frame buffer A.
  • the data transferred to zone A of frame buffer A is then copied to zone A of frame buffer B for output to zone A of the view finder 402.
  • the data for zone B of the view finder 402 must be processed.
  • FIG. 12 is a block diagram illustrating the modified live view processing of the input buffers 538 and the frame buffers 536 during composite image capture. As shown, the positioning of zone A and zone B in the input buffers A and B are switched during live view processing, and the object as seen through the camera's imaging device is input directly into zone B of both input buffers.
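The zone-based buffer handling described in these bullets can be sketched at the level of a single scan line: zone A of the output holds the overlap portion of the still image, while zone B is refreshed from the live feed. The row width and the quarter-width zone split are assumptions for illustration, not values from the patent.

```python
# Sketch of composing one view-finder scan line during composite capture:
# zone A shows the overlap area of the last captured (still) image, and
# zone B shows the live image. Width and zone split are assumptions.
WIDTH = 16
ZONE_A = WIDTH // 4            # assume zone A is the left quarter

def compose_row(captured_row, live_row):
    # Zone A: the rightmost (overlapping) part of the captured image;
    # zone B: the corresponding part of the live view.
    return captured_row[-ZONE_A:] + live_row[ZONE_A:]

captured = ["C"] * WIDTH       # still image from the previous capture
live = ["L"] * WIDTH           # current live view
print(compose_row(captured, live))
```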

Abstract

A method and system for assisting a user in manually capturing overlapping images for composite image generation using a camera (110). The method and system includes dividing the view finder (402) into a first and second zone (556) in response to the user capturing a first image (552). The portion of the first image (440a) that is to overlap with the next image is then displayed (558) on the first zone (A) of the view finder (402), while a live image is displayed (560) in the second zone (B). The two zones of the view finder thereby enable the user to align (562) the live image with the first image before capturing (564) the next image.

Description

A METHOD AND SYSTEM FOR ASSISTING IN THE MANUAL
CAPTURE OF OVERLAPPING IMAGES FOR COMPOSITE
IMAGE GENERATION IN A DIGITAL CAMERA
FIELD OF THE INVENTION
The present invention relates generally to digital cameras, and more particularly to a method and system for assisting a user to manually capture overlapping images for composite image generation.
BACKGROUND OF THE INVENTION
In the field of photography, a single composite image may be generated from a series of overlapping photographs. There are several types of composite images. For example, a "panorama" is an image created from a series of overlapping photographs taken while the camera was rotated through less than 360 degrees, while a "virtual world" is created from a series of photographs taken while the camera was rotated through a full 360 degrees. Specialized equipment has typically been required to generate such composite images. This is because the photographs used to generate the composite image must be overlapped in a manner that sufficiently aligns the images both horizontally and vertically. Such alignment is necessary to allow a software program, called a stitcher, to appropriately "stitch" the images together to form the composite image. The extra equipment necessary to align the overlapping images typically includes a tripod and a mechanical alignment device fitted between the camera and the tripod that mechanically rotates the camera into pre-set positions. This equipment enables a user to take a picture at each pre-set position, which automatically provides properly aligned overlapping photographs. After the photographs are taken, they are developed and then scanned into a computer, where they are stitched together by the stitcher program.
Although the photographs provided through the use of the extra equipment create satisfactory composite images, there are several drawbacks to this approach. One drawback is that typical camera owners do not generally travel with a tripod. Therefore, when a user discovers a scene that is a good candidate for a composite image, the user either does not attempt to take overlapping images, or the images that are taken are not properly overlapped to generate the composite image. And even in instances where the user has a tripod, the user may not have the mechanical alignment device, or may not have the expertise to use the device correctly.
Accordingly, what is needed is a method and system for assisting a user in manually capturing overlapping images for composite image generation without the use of extra equipment. The present invention addresses such a need.
SUMMARY OF THE INVENTION
The present invention provides a method and system for assisting a user in manually capturing overlapping images for composite image generation using a camera. The method and system includes dividing the view finder into a first and second zone in response to the user capturing a first image. The portion of the first image that is to overlap with the next image is then displayed in the first zone of the view finder, while a live image is displayed in the second zone.
According to the system and method disclosed herein, displaying the two zones in the view finder enables the user to align the live image with the first image without the need for extra equipment.
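Because stitching typically needs roughly twenty-five percent overlap between adjacent images, the boundary between the two zones can be placed at that fraction of the display width. A sketch; the 640-pixel width is an assumed example value, not a figure from the patent.

```python
# Sketch: placing the boundary between the first (still) and second
# (live) zones at the overlap fraction of the display width. The 25%
# default follows the overlap stitchers typically require; the 640-pixel
# display width is an assumed example.
def zone_boundary(display_width, overlap=0.25):
    return int(display_width * overlap)

print(zone_boundary(640))  # 160: columns 0-159 show the still overlap
```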
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a block diagram of a digital camera that operates in accordance with the present invention.
FIG. 2 is a block diagram of the preferred embodiment for the imaging device of FIG. 1.
FIG. 3 is a block diagram of the preferred embodiment for the computer of FIG. 1.
FIG. 4 is a memory map showing the preferred embodiment of the Dynamic Random-Access Memory (DRAM).
FIG. 5 is a diagram depicting a user interface for the digital camera.
FIGS. 6A and 6B are diagrams illustrating the capture of a series of overlapping images by a camera for use in composite image generation.
FIG. 7 is a flow chart depicting the process of assisting in the manual capture of overlapping images for composite image generation in accordance with the present invention.
FIG. 8 is a diagram showing an example scene that may be used to create a panorama using three overlapping images.
FIGS. 9A and 9B are diagrams showing a camera view finder divided into two zones in accordance with the present invention.
FIGS. 10-12 are diagrams illustrating the processing of the input buffers and frame buffers to support the display of zones in the camera view finder.
DESCRIPTION OF THE INVENTION
The present invention relates to an improvement in digital cameras. The following description is presented to enable one of ordinary skill in the art to make and use the invention and is provided in the context of a patent application and its requirements.
Various modifications to the preferred embodiment will be readily apparent to those skilled in the art and the generic principles herein may be applied to other embodiments. Thus, the present invention is not intended to be limited to the embodiment shown but is to be accorded the widest scope consistent with the principles and features described herein.
The present invention is a digital camera that includes a method and system for capturing overlapping images for composite image generation.
A digital camera architecture has been disclosed in co-pending U.S. Patent
Application Serial No. , entitled "A System And Method For Using A Unified Memory Architecture To Implement A Digital Camera Device," filed on , 1996, and assigned to the Assignee of the present application. The Applicant hereby incorporates the co-pending application by reference, and reproduces portions of that application herein with reference to FIGS. 1-3 for convenience.
Referring now to FIG. 1, a block diagram of a camera 110 is shown according to the present invention. Camera 110 preferably comprises an imaging device 114, a system bus 116 and a computer 118. Imaging device 114 is optically coupled to an object 112 and electrically coupled via system bus 116 to computer 118. Once a photographer has focused imaging device 114 on object 112 and, using a capture button or some other means, instructed camera 110 to capture an image of object 112, computer 118 commands imaging device 114 via system bus 116 to capture raw image data representing object 112. The captured raw image data is transferred over system bus 116 to computer 118, which performs various image processing functions on the image data before storing it in its internal memory. System bus 116 also passes various status and control signals between imaging device 114 and computer 118.
Referring now to FIG. 2, a block diagram of the preferred embodiment of imaging device 114 is shown. Imaging device 114 preferably comprises a lens 220 having an iris, a filter 222, an image sensor 224, a timing generator 226, an analog signal processor (ASP) 228, an analog-to-digital (A/D) converter 230, an interface 232, and one or more motors 234.
U.S. Patent Application Serial No. 08/355,031, entitled "A System and Method For Generating a Contrast Overlay as a Focus Assist for an Imaging Device," filed on December 13, 1994, is incorporated herein by reference and provides a detailed discussion of the preferred elements of imaging device 114. Briefly, imaging device 114 captures an image of object 112 via reflected light impacting image sensor 224 along optical path 236. Image sensor 224 responsively generates a set of raw image data representing the captured image of object 112. The raw image data is then routed through ASP 228, A/D converter 230 and interface 232. Interface 232 has outputs for controlling ASP 228, motors 234 and timing generator 226. From interface 232, the raw image data passes over system bus 116 to computer 118.
Referring now to FIG. 3, a block diagram of the preferred embodiment for computer 118 is shown. System bus 116 provides connection paths between imaging device 114, power manager 342, central processing unit (CPU) 344, dynamic random-access memory (DRAM) 346, input/output interface (I/O) 348, read-only memory (ROM) 350, and buffers/connector 352. Removable memory 354 connects to system bus 116 via buffers/connector 352. Alternately, camera 110 may be implemented without removable memory 354 or buffers/connector 352.
Power manager 342 communicates via line 366 with power supply 356 and coordinates power management operations for camera 110. CPU 344 typically includes a conventional processor device for controlling the operation of camera 110. In the preferred embodiment, CPU 344 is capable of concurrently running multiple software routines to control the various processes of camera 110 within a multi-threading environment. DRAM 346 is a contiguous block of dynamic memory which may be selectively allocated to various storage functions. LCD controller 390 accesses DRAM 346 and transfers processed image data to LCD view finder 402 for display, as explained further below.
I/O 348 is an interface device allowing communications to and from computer 118. For example, I/O 348 permits an external host computer (not shown) to connect to and communicate with computer 118. I/O 348 also permits a camera 110 user to communicate with camera 110 via an external user interface and via an external display panel, referred to as a view finder.
ROM 350 typically comprises a conventional nonvolatile read-only memory which stores a set of computer-readable program instructions to control the operation of camera 110. Removable memory 354 serves as an additional image data storage area and is preferably a non-volatile device, readily removable and replaceable by a camera 110 user via buffers/connector 352. Thus, a user who possesses several removable memories 354 may replace a full removable memory 354 with an empty removable memory 354 to effectively expand the picture-taking capacity of camera 110. In the preferred embodiment of the present invention, removable memory 354 is typically implemented using a flash disk.
Power supply 356 supplies operating power to the various components of camera 110. In the preferred embodiment, power supply 356 provides operating power to a main power bus 362 and also to a secondary power bus 364. The main power bus 362 provides power to imaging device 114, I/O 348, ROM 350 and removable memory 354. The secondary power bus 364 provides power to power manager 342, CPU 344 and DRAM 346.
Power supply 356 is connected to main batteries 358 and also to backup batteries 360. In the preferred embodiment, a camera 110 user may also connect power supply 356 to an external power source. During normal operation of power supply 356, the main batteries 358 provide operating power to power supply 356, which then provides the operating power to camera 110 via both main power bus 362 and secondary power bus 364. During a power failure mode, in which the main batteries 358 have failed (when their output voltage has fallen below a minimum operational voltage level), the backup batteries 360 provide operating power to power supply 356, which then provides the operating power only to the secondary power bus 364 of camera 110.
Referring now to FIG. 4A, a memory map showing the preferred embodiment of dynamic random-access memory (DRAM) 346 is shown. In the preferred embodiment, DRAM 346 includes RAM disk 532, a system area 534, and working memory 530. RAM disk 532 is a memory area used for storing raw and compressed image data and typically is organized in a "sectored" format similar to that of conventional hard disk drives. In the preferred embodiment, RAM disk 532 uses a well-known and standardized file system to permit external host computer systems, via I/O 348, to readily recognize and access the data stored on RAM disk 532. System area 534 typically stores data regarding system errors (for example, why a system shutdown occurred) for use by CPU 344 upon a restart of computer 118.
Working memory 530 includes various stacks, data structures and variables used by CPU 344 while executing the software routines used within computer 118. Working memory 530 also includes input buffers 538 for initially storing sets of raw image data received from imaging device 114, and frame buffers 536 for temporarily storing image data during the image processing and compression process.
In a preferred embodiment, the conversion process is performed by a live view generation program, which is stored in ROM 350 and executed on CPU 344. However, the conversion process can also be implemented using hardware. Referring again to FIG. 3, during the execution of the live view generation program, the CPU 344 takes the raw image data from the input buffers 538 in CCD format and performs color space conversion on the data. The conversion process performs gamma correction and converts the raw CCD data into either an RGB or YCC format which is compatible with the LCD view finder 402 display. After the conversion, CPU 344 stores the image data in the frame buffers 536. The LCD controller 390 then transfers the processed image data from the frame buffers to the LCD view finder 402 (via an optional analog converter) for display.
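The gamma-correction step of this conversion can be sketched with a lookup table. The gamma value of 2.2 and the 8-bit sample range are assumptions; real live view processing would also include demosaicing and a CCD-to-RGB/YCC color matrix, which are omitted from this minimal example.

```python
# Sketch of the gamma-correction part of the live view conversion. An
# assumed gamma of 2.2 is applied to 8-bit samples via a lookup table,
# as firmware commonly does; demosaicing and the color-space matrix
# are omitted from this minimal example.
GAMMA = 2.2
lut = [round(255 * (v / 255) ** (1.0 / GAMMA)) for v in range(256)]

def gamma_correct(raw_samples):
    return [lut[s] for s in raw_samples]

print(gamma_correct([0, 64, 128, 255]))  # end points map to themselves
```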
Referring now to FIG. 4B, the contents of input buffers 538 and the frame buffers 536 are shown. In a preferred embodiment, both the input buffers 538 and the frame buffers 536 utilize two separate buffers, called ping-pong buffers, to improve the display speed of the digital camera and to prevent the tearing of the image in the view finder 402.
As shown, input buffers 538 include an input buffer A and an input buffer B, and frame buffers 536 include a frame buffer A and a frame buffer B.
The input buffers A and B alternate between an input cycle and a processing cycle. During the input cycle, the input buffers 538 are filled with raw image data from the imaging device 114, and during the processing cycle, CPU 344 processes the raw data and transmits the processed data to the frame buffers 536. More specifically, while input buffer A is filling with image data, the data from input buffer B is processed and transmitted to frame buffer B. At the same time, previously processed data in frame buffer A is output to the LCD view finder 402 for display. While input buffer B is filling with image data, the data from input buffer A is processed and transmitted to frame buffer
A. At the same time, previously processed data in frame buffer B is output to the LCD view finder 402 for display.
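The alternating schedule described above — fill one input buffer while the other is processed, and display the previously processed frame buffer — can be sketched as follows. The callback-based structure is an illustrative assumption, not the camera's actual firmware interface.

```python
# A minimal sketch (assumed structure, not from the patent) of the
# ping-pong schedule: on each cycle one input buffer fills, the other
# is processed into its frame buffer, and the frame buffer paired with
# the filling side (holding the previous frame) is displayed.

def live_view_cycle(fill, process, display, frames=4):
    """Run `frames` iterations of the A/B ping-pong schedule.

    fill(i) and process(i) -> data are supplied callbacks, where
    i is the buffer index (0 = A, 1 = B); display(data) shows a frame.
    Returns the (filling, processing) index pairs for inspection.
    """
    frame_buffers = [None, None]
    schedule = []
    for n in range(frames):
        filling, processing = n % 2, (n + 1) % 2
        fill(filling)                                    # input buffer fills with raw data
        frame_buffers[processing] = process(processing)  # other side is processed
        if frame_buffers[filling] is not None:
            display(frame_buffers[filling])              # show previously processed frame
        schedule.append((filling, processing))
    return schedule
```

Because display always reads the frame buffer that is *not* being written this cycle, the view finder never shows a half-updated frame, which is the tearing the two-buffer scheme prevents.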
FIG. 5 is a diagram depicting a user interface 400 for the digital camera as described in co-pending U.S. Patent Application Serial No. , entitled "A Method and System For Displaying Images In The Interface of a Digital Camera," which is assigned to the Assignee of the present application and incorporated herein by reference. The user interface includes the LCD view finder 402 (hereinafter "view finder"), an image capture button called a photo button 404, a four-way navigation control button 406, a menu button 408, a menu area 410 within the view finder 402, and function keys 412. The user interface 400 may also include an optional sound button 414, and a mode button 416.
The user interface 400 operates in two modes: live view mode and review mode. In a preferred embodiment, the photo button 404 is a two position button. The live view mode begins when a user aims the camera at an object 112 and presses the photo button 404 into the first position. Once this occurs, the view finder 402 displays a live image of the object 112 as shown through the camera's imaging device 114. The user may then press the photo button 404 into the second position to capture the image shown in the view finder 402. Review mode begins by pressing any other button on the interface 400.
Once the digital camera 110 is placed in the review mode, the view finder 402 displays a series of cells 420 that represent the digital images that have been captured in the digital camera. The view finder 402 is shown here as displaying nine image cells 420. Each cell 420 displays a small-sized image corresponding to one of the captured images.
The user may navigate through the series of displayed cells 420 in the view finder 402 using the four-way navigation control button 406. As the user navigates through the cells 420, the old image cells 420 are scrolled off the view finder 402 and replaced by new image cells 420 representing other images stored in the camera.

The present invention provides a method and system for assisting a user in manually capturing a series of overlapping images in order to create a single composite image or panorama. Although the present invention will be explained with reference to the digital camera described herein, one of ordinary skill in the art will recognize that the method and system of the present invention will function with a conventional camera equipped with an electronic view finder to create a panorama as well.
FIGS. 6A and 6B are diagrams illustrating the capture of a series of overlapping images by a camera for use in composite image generation. FIG. 6A is a top view showing the camera rotated into three positions to capture three corresponding images. FIG. 6B shows the area of overlap 440a between image 1 and image 2, and the area of overlap 440b between image 2 and image 3. In a preferred embodiment, the generation of a composite image from overlapping images typically requires an overlap area 440 of approximately twenty-five percent between two images.
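As a quick worked example of the approximately twenty-five percent overlap stated above (the 640-pixel image width below is an assumed figure, not from the patent): three 640-pixel-wide images sharing 160-pixel strips stitch into a panorama of 640 × 3 − 160 × 2 = 1600 pixels.

```python
# Worked example (assumed image width) of how pairwise overlap reduces
# the total width of the stitched composite.

def panorama_width(image_width, num_images, overlap_fraction=0.25):
    """Total width of a composite built from overlapping images."""
    overlap = int(image_width * overlap_fraction)
    return image_width * num_images - overlap * (num_images - 1)
```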
As stated above, past techniques for enabling a user to capture overlapping images have required extra equipment to place the camera into pre-set positions in order to automatically align the overlapping images. The present invention, in contrast, provides a method and system for assisting a user to manually align and capture the overlapping images without extra equipment. This is accomplished by dividing the view finder 402 of the camera into two zones, where one zone displays a live image and the other zone displays a still image of the overlapping portion of the last captured image. This enables the user to manually align the live image with the still image without the need for alignment equipment, such as a tripod.

FIG. 7 is a flow chart depicting the process of assisting in the manual capture of overlapping images for composite image generation in a preferred embodiment of the present invention. The process begins when a user depresses a function button on the camera to start the composite image capture process in step 550. In a preferred embodiment, the digital camera will then display a prompt in the view finder 402 requesting the user to capture the first image of the panorama in step 552. In response, the user aims the camera at a desired scene and captures the first image of the panorama in step 554.
FIG. 8 is a diagram showing an example scene that may be used to create a panorama using three overlapping images, as shown in FIGS. 6A and 6B. Although a three image example is used here, a composite image may be made with any number of overlapping images. As shown, the user captures the first image by placing the camera into position 1 so that one edge of the scene appears in the view finder 402 (the left edge in this example). Referring again to FIG. 7, after the first image is captured, the view finder 402 of the camera is divided into two separate zones in step 556 in accordance with the present invention.
FIGS. 9A and 9B are diagrams illustrating the camera view finder 402 divided into zones A and B. Referring again to FIG. 7, one of the zones in the view finder 402, zone A in this example, displays a portion of the previously captured image that overlaps the next image in step 558. The other zone, zone B, displays a live image of the scene as shown through the camera's imaging device in step 560.
Referring again to FIGS. 8 and 9A for example, after the first image is captured and after the user places the camera into position 2 (steps 558 and 560), a still image of the overlap area 440a of the first image is displayed in zone A of the view finder 402, while a live image of the next image in the scene is displayed in zone B.
In a preferred embodiment, zones A and B are shaped by a dividing line 580 comprising a series of darkened pixels. The dividing line 580 is shown here as interlocking cut-outs, but could also be drawn as a straight line, a diagonal line, or a zig-zag line, for instance.
Referring again to FIG. 7, after displaying the live image in zone B in step 560, the user establishes horizontal and vertical alignment between the live image in zone B and the still image in zone A in step 562 by altering the position of the camera.
After aligning the live image with the still image in step 562, the user captures the second image of the panorama in step 564. If the user depresses a button indicating that the panorama capture is done in step 566, the process ends at step 568. Otherwise, the view finder 402 is again divided into two zones in step 556, and the process continues. This is illustrated in FIG. 9B, which shows that after the user captures the second image from position 2, the overlap area 440b of the second image is displayed in zone A of the view finder 402, while zone B displays the live view of the scene in camera position 3.

According to the present invention, the dividing of the view finder 402 into separate zones is accomplished by manipulating the input buffers 538 and the frame buffers 536 (FIG. 4B). In a preferred embodiment, the input buffers 538 and the frame buffers 536 are manipulated by dividing each of the input buffers A and B and each of the frame buffers A and B into two zones (A and B) corresponding to the two zones in the view finder 402. This division of the input and frame buffers 538 and 536 is a multi-stage process. When the digital camera is in normal live view mode, the input and frame buffers 538 and 536 are processed as shown with reference to FIG. 4B.
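Before turning to the buffer mechanics, the user-visible loop of FIG. 7 (steps 550 through 568) can be sketched at a high level. The camera object and its method names below are hypothetical, used only to make the sequence of steps concrete.

```python
# High-level sketch of the FIG. 7 capture loop, using a hypothetical
# camera API (capture, user_done, viewfinder.show_split are assumed
# names, not the patent's interfaces).

def composite_capture(camera, overlap_fraction=0.25):
    images = [camera.capture()]           # step 554: capture first image
    while not camera.user_done():         # step 566: check for "done" button
        overlap = images[-1].right_strip(overlap_fraction)
        camera.viewfinder.show_split(     # steps 556-560: divide the finder
            zone_a=overlap,               # still overlap of last capture
            zone_b="live")                # live view for manual alignment
        images.append(camera.capture())   # step 564: capture aligned image
    return images                         # frames ready for stitching
```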
After the user has initiated the composite image sequence, and has then captured the first image in the sequence, a post-capture process is performed on the input buffers 538 and the frame buffers 536 to display the overlap portion of the previously captured image in zone A of the view finder 402. Thereafter, the input buffers 538 and the frame buffers 536 are processed according to a modified live view process to display a view only in zone B of the view finder 402. In a preferred embodiment, the input buffers 538 and the frame buffers 536 are processed by the live view generation program running on CPU 344, but could also be processed using well known hardware operating in accordance with the present invention.
FIGS. 10 and 11 are block diagrams illustrating the post-capture processing of the input buffers 538 and the frame buffers 536. As shown, each of the input buffers A and B and each of the frame buffers A and B are divided into two zones (A and B) corresponding to the two zones in the view finder 402, where zone A corresponds to the overlap area of the previously captured image. To compute the display of zone A in the view finder 402, if input buffer A is currently in a processing cycle, the CPU 344 processes the data from zone A of input buffer A and transfers the data to zone A of frame buffer A for output to the view finder 402. Alternatively, if input buffer B is currently in a processing cycle, the CPU 344 processes the data from zone A of input buffer B and transfers the data to zone A of frame buffer A. Referring to FIG. 11, the data transferred to zone A of frame buffer A is then copied to zone A of frame buffer B for output to zone A of the view finder 402. Next, the data for zone B of the view finder 402 must be processed.
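The post-capture step can be sketched with a simplified model, treating each buffer as a dictionary with an 'A' zone and a 'B' zone. This representation is an assumption for illustration; the overlap strip is processed once into frame buffer A and then mirrored into frame buffer B, as in FIGS. 10 and 11.

```python
# Sketch (assumed dict-based buffer model) of the post-capture step:
# the overlap strip from whichever input buffer is in its processing
# cycle is written to zone A of frame buffer A, then copied to zone A
# of frame buffer B, so both buffers show the same still image.

def post_capture(input_bufs, frame_bufs, active, process):
    """Propagate the overlap strip (zone A) to both frame buffers.

    `active` is the index (0 = A, 1 = B) of the input buffer that is
    currently in its processing cycle.
    """
    strip = process(input_bufs[active]['A'])
    frame_bufs[0]['A'] = strip            # processed into frame buffer A
    frame_bufs[1]['A'] = frame_bufs[0]['A']  # copied to frame buffer B
    return frame_bufs
```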
FIG. 12 is a block diagram illustrating the modified live view processing of the input buffers 538 and the frame buffers 536 during composite image capture. As shown, the positioning of zone A and zone B in the input buffers A and B is switched during live view processing, and the object as seen through the camera's imaging device is input directly into zone B of both input buffers.
If input buffer A is currently in a processing cycle, the CPU 344 processes the data from zone B of input buffer A and transfers the data into zone B of frame buffer A for output to zone B of the view finder 402. If input buffer B is currently in a processing cycle, the CPU 344 processes the data from zone B of input buffer B and transfers the data into zone B of frame buffer B. This processing of zone B data from the input buffers 538 to the frame buffers 536 continues until the next image is captured. After the next image is captured, the overlap portion of the newly captured image is copied into zone A of the input buffers 538 for the post-capture process, and the above process is repeated until the user ends the composite image capture process.
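One cycle of this modified live view can be sketched with the same simplified dict model (an assumed representation, not the patent's memory layout): zone A of the frame buffer keeps the frozen overlap strip, while only zone B is refreshed from the imaging device and re-processed.

```python
# Sketch (assumed dict-based buffer model) of one modified live-view
# cycle during composite capture: live sensor data lands only in
# zone B, is processed into zone B of the frame buffer, and zone A
# (the still overlap image) is left untouched.

def modified_live_view_step(input_buf, frame_buf, sensor_data, process):
    """Refresh zone B for display while preserving the still zone A."""
    input_buf['B'] = sensor_data              # live data enters zone B only
    frame_buf['B'] = process(input_buf['B'])  # zone B processed for display
    return frame_buf                          # zone A untouched (still image)
```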
A method and system for assisting a user in the capture of overlapping images for composite image generation has been disclosed which dispenses with the need for extra camera equipment. Although the present invention has been described in accordance with the embodiments shown, one of ordinary skill in the art will readily recognize that there could be variations to the embodiments and those variations would be within the spirit and scope of the present invention. For example, the dividing of the view finder into zones is not limited to two zones, but rather may include more than two. Accordingly, many modifications may be made by one of ordinary skill in the art without departing from the spirit and scope of the appended claims.

Claims

What is claimed is:
1. A method for assisting a user in manually capturing overlapping images for composite image generation using a camera, wherein the overlapping images include a first image and a second image, and the camera includes a view finder, the method comprising the steps of:
(a) dividing the view finder into a first and a second zone in response to the user capturing the first image;
(b) displaying a portion of the first image that is to overlap with the second image in the first zone; and
(c) displaying a live image in the second zone to enable the user to align the live image with the first image.
2. A method as in claim 1 further including the step of:
(d) capturing the live image to provide the second image.
3. A method as in claim 2 wherein step (a) includes the step of:
(al) shaping the first zone and the second zone in the view finder using a contiguously shaped line.
4. A method as in claim 3 wherein step (a) is performed in response to the user initiating a composite image capture process in the camera.
5. A method as in claim 4 wherein step (a) further includes the step of prompting the user to capture the first image.
6. A method as in claim 5 further including the step of:
(e) repeating steps (a) through (d) until the user indicates the composite image process has completed.
7. A digital camera comprising:
an imaging device for capturing image data;
a memory for storing the image data from the imaging device;
a view finder; and
processing means coupled to the imaging device, the memory and to the view finder, the processing means functioning to divide the view finder into a first and a second zone in response to the user initiating a composite image capture process and capturing a first image, the processing means further functioning to display a portion of the first image that is to overlap with a second image in the first zone, and to display a live image in the second zone to enable the user to align the live image with the first image before capturing the live image.
8. The invention of claim 7 wherein the memory includes a first buffer and a second buffer, and wherein the first buffer stores the image data from the imaging device, the processing means functioning to perform color space conversion on the image data to provide processed image data, and to store the processed image data in the second buffer for output to the view finder.
9. The invention of claim 8 wherein the first buffer includes a first and second input buffer, and the second buffer includes a first and second frame buffer.
10. The invention of claim 9 wherein the first and second input buffers and the first and second frame buffers are divided into a first zone and a second zone corresponding to the first and second zones of the view finder.
11. The invention of claim 10 wherein the digital camera further includes an LCD controller coupled to the memory and to the view finder for transferring the image data from the first and second frame buffers to the view finder for display.
12. The invention of claim 11 wherein the color space conversion converts the raw CCD data into one of an RGB and a YCC format.
13. The invention of claim 12 wherein the processing means comprises a software program.
14. The invention of claim 13 wherein the processing means comprises hardware.
15. The invention of claim 14 wherein the view finder includes a plurality of zones.
PCT/US1997/022387 1996-12-06 1997-12-05 A method and system for assisting in the manual capture of overlapping images for composite image generation in a digital camera WO1998025402A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU53760/98A AU5376098A (en) 1996-12-06 1997-12-05 A method and system for assisting in the manual capture of overlapping images for composite image generation in digital camera

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US76130196A 1996-12-06 1996-12-06
US08/761,301 1996-12-06

Publications (1)

Publication Number Publication Date
WO1998025402A1 true WO1998025402A1 (en) 1998-06-11

Family

ID=25061822

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1997/022387 WO1998025402A1 (en) 1996-12-06 1997-12-05 A method and system for assisting in the manual capture of overlapping images for composite image generation in a digital camera

Country Status (2)

Country Link
AU (1) AU5376098A (en)
WO (1) WO1998025402A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5138460A (en) * 1987-08-20 1992-08-11 Canon Kabushiki Kaisha Apparatus for forming composite images
US5194944A (en) * 1990-05-01 1993-03-16 Chinon Kabushiki Kaisha Image signal processing apparatus for successively transferring a series of color signals obtained as image signals
US5200818A (en) * 1991-03-22 1993-04-06 Inbal Neta Video imaging system with interactive windowing capability

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1999066716A1 (en) * 1998-06-18 1999-12-23 Sony Electronics Inc. A method of and apparatus for partitioning, scaling and displaying video and/or graphics across several display devices
US6501441B1 (en) 1998-06-18 2002-12-31 Sony Corporation Method of and apparatus for partitioning, scaling and displaying video and/or graphics across several display devices
US6593937B2 (en) 1998-06-18 2003-07-15 Sony Corporation Method of and apparatus for handling high bandwidth on-screen-display graphics data over a distributed IEEE 1394 network utilizing an isochronous data transmission format
US7075557B2 (en) 1998-06-18 2006-07-11 Sony Corporation Method of and apparatus for handling high bandwidth on-screen-display graphics data over a distributed IEEE 1394 network utilizing an isochronous data transmission format
WO2003105466A1 (en) * 2002-06-07 2003-12-18 Koninklijke Philips Electronics N.V. Method of imaging an object and mobile imaging device
EP1379073A1 (en) * 2002-07-03 2004-01-07 Siemens Aktiengesellschaft Method of capturing images for a panoramic image
WO2004006566A1 (en) 2002-07-08 2004-01-15 Casio Computer Co., Ltd. Camera apparatus, photographing method and a storage medium that records method of photographing
US7321391B2 (en) 2002-07-08 2008-01-22 Casio Computer Co., Ltd. Camera apparatus, photographing method and a storage medium that records method of photographing
US7024054B2 (en) 2002-09-27 2006-04-04 Eastman Kodak Company Method and system for generating a foreground mask for a composite image
WO2005093652A1 (en) * 2004-03-27 2005-10-06 Koninklijke Philips Electronics N.V. Image capture device and related method
US7646400B2 (en) 2005-02-11 2010-01-12 Creative Technology Ltd Method and apparatus for forming a panoramic image
US9185284B2 (en) 2013-09-06 2015-11-10 Qualcomm Incorporated Interactive image composition

Also Published As

Publication number Publication date
AU5376098A (en) 1998-06-29

Similar Documents

Publication Publication Date Title
US6657667B1 (en) Method and apparatus for capturing a multidimensional array of overlapping images for composite image generation
US5973734A (en) Method and apparatus for correcting aspect ratio in a camera graphical user interface
JP3326426B2 (en) Method and system for automatically rotating a display to manage portrait and landscape images
US6215523B1 (en) Method and system for accelerating a user interface of an image capture unit during review mode
EP0927489B1 (en) A method and system for displaying images in the interface of a digital camera
US6978051B2 (en) System and method for capturing adjacent images by utilizing a panorama mode
US6473123B1 (en) Method and system for organizing DMA transfers to support image rotation
US6278447B1 (en) Method and system for accelerating a user interface of an image capture unit during play mode
US6020920A (en) Method and system for speculative decompression of compressed image data in an image capture unit
US6134606A (en) System/method for controlling parameters in hand-held digital camera with selectable parameter scripts, and with command for retrieving camera capabilities and associated permissible parameter values
US5933137A (en) Method and system for acclerating a user interface of an image capture unit during play mode
EP0929970B1 (en) Systematic image group formation
US6512548B1 (en) Method and apparatus for providing live view and instant review in an image capture device
US5861918A (en) Method and system for managing a removable memory in a digital camera
US6680749B1 (en) Method and system for integrating an application user interface with a digital camera user interface
US8922667B2 (en) Image pickup apparatus capable of applying color conversion to captured image and control method thereof
JP5106142B2 (en) Electronic camera
WO1998025402A1 (en) A method and system for assisting in the manual capture of overlapping images for composite image generation in a digital camera
JP2000295508A (en) Electronic still camera, its control method and storage medium storing its operation processing program
JPH11103436A (en) Image processor, image processing method and storage medium
JP4148817B2 (en) Panoramic image photographing apparatus and panoramic image photographing method
JPH09322039A (en) Image pickup device
JP2004208281A (en) Image correction apparatus and imaging unit
JPH11346339A (en) Image processor, its control method and storage medium
JP5289101B2 (en) Image processing apparatus, method, and program

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AU CA CN IL JP MX

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH DE DK ES FI FR GB GR IE IT LU MC NL PT SE

121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase