WO2005057278A1 - Method and device for capturing multiple images - Google Patents

Method and device for capturing multiple images

Info

Publication number
WO2005057278A1
WO2005057278A1 (application PCT/FI2003/000944)
Authority
WO
WIPO (PCT)
Prior art keywords
image
capturing apparatus
image capturing
images
colour
Prior art date
Application number
PCT/FI2003/000944
Other languages
French (fr)
Inventor
Timo Kolehmainen
Markku Rytivaara
Timo Tokkonen
Jakke Mäkelä
Kai Ojala
Original Assignee
Nokia Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nokia Corporation filed Critical Nokia Corporation
Priority to PCT/FI2003/000944 priority Critical patent/WO2005057278A1/en
Priority to AU2003285380A priority patent/AU2003285380A1/en
Publication of WO2005057278A1 publication Critical patent/WO2005057278A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/54Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/741Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/40Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled
    • H04N25/41Extracting pixel data from a plurality of image sensors simultaneously picking up an image, e.g. for increasing the field of view by combining the outputs of a plurality of sensors

Definitions

  • the invention relates to an imaging device and a method of creating an image file. Especially the invention relates to digital imaging devices comprising more than one image capturing apparatus.
  • An object of the invention is to provide an improved solution for creating images. Another object of the invention is to enhance the dynamic range of images.
  • an imaging device comprising at least two image capturing apparatus, each apparatus being arranged to produce an image.
  • the apparatus is configured to utilize at least a portion of the images produced with different image capturing apparatus with each other to produce an image with an enhanced image quality.
  • a method of creating an image file in an imaging device comprising producing images with at least two image capturing apparatus, and utilising at least a portion of the images produced with different image capturing apparatus with each other to produce an image with enhanced image quality.
  • At least one image capturing apparatus has different light capturing properties compared to the other apparatus.
  • the image produced by that apparatus is used for enhancing the dynamic range of the image produced with the other image capturing apparatus.
  • at least one image capturing apparatus has a small aperture.
  • the image produced by the apparatus has fewer aberrations, as a smaller aperture produces a sharper image.
  • the information in the image may be utilised and combined with the images produced by other apparatus.
  • at least one image capturing apparatus has a larger aperture than the other apparatus. Thus, the apparatus gathers more light and is able to capture more details from dark areas of the photographed scene.
  • the imaging device comprises a lenslet array with at least four lenses and a sensor array.
  • the four image capturing apparatus each use one lens from the lenslet array, and a portion of the sensor array.
  • Three image capturing apparatus each comprise a unique colour filter from a group of RGB or CMY filters or another system of colour filters, and thus the three apparatus are required for producing a colour image.
  • the fourth image capturing apparatus may be manufactured with different light capturing properties compared to other apparatus and used for enhancing the image quality produced with the three apparatus.
  • Figure 1 illustrates an example of an imaging device of an embodiment
  • Figure 2A and 2B illustrate an example of an image sensing arrangement
  • Figure 2C illustrates an example of colour image combining
  • Figures 3A and 3B illustrate embodiments of the invention
  • Figure 4 illustrates a method of an embodiment with a flowchart
  • Figure 5 illustrates an embodiment where a polarization filter is used.
  • FIG. 1 illustrates a generalised digital image device which may be utilized in some embodiments of the invention. It should be noted that embodiments of the invention may also be utilised in other kinds of digital cameras than the apparatus of Figure 1 , which is just an example of a possible structure.
  • the apparatus of Figure 1 comprises an image sensing arrangement 100.
  • the image sensing arrangement comprises a lens assembly and an image sensor.
  • the structure of the arrangement 100 will be discussed in more detail later.
  • the image sensing arrangement captures an image and converts the captured image into an electrical form.
  • the electric signal produced by the apparatus 100 is led to an A/D converter 102 which converts the analogue signal into a digital form. From the converter the digitised signal is taken to a signal processor 104.
  • the image data is processed in the signal processor to create an image file.
  • the output signal of the image sensing arrangement 100 contains raw image data which needs post processing, such as white balancing and colour processing.
  • the signal processor is also responsible for giving exposure control commands 106 to image sensing arrangement 100.
  • the apparatus may further comprise an image memory 108 where the signal processor may store finished images, a work memory 110 for data and program storage, a display 112 and a user interface 114, which typically comprises a keyboard or corresponding means for the user to give input to the apparatus.
  • Figure 2A illustrates an example of image sensing arrangement 100.
  • the image sensing arrangement comprises in this example a lens assembly 200 which comprises a lenslet array with four lenses.
  • the arrangement further comprises an image sensor 202, an aperture plate 204, a colour filter arrangement 206 and an infra-red filter 208.
  • Figure 2B illustrates the structure of the image sensing arrangement from another point of view.
  • the lens assembly 200 comprises four separate lenses 210 - 216 in a lenslet array.
  • the aperture plate 204 comprises a fixed aperture 218 - 224 for each lens.
  • the aperture plate controls the amount of light that is passed to the lens. It should be noted that the structure of the aperture plate is not relevant to the embodiments, i.e. the aperture value of each lens need not be the same.
  • the number of lenses is not limited to four, either.
  • the colour filter arrangement 206 of the image sensing arrangement comprises in this example three colour filters, i.e. red 226, green 228 and blue 230 in front of lenses 210 - 214, respectively.
  • the sensor array 202 is in this example divided into four sections 234 to 239.
  • the image sensing arrangement comprises in this example four image capturing apparatus 240 - 246.
  • the image capturing apparatus 240 comprises the colour filter 226, the aperture 218, the lens 210 and the section 234 of the sensor array.
  • the image capturing apparatus 242 comprises the colour filter 228, the aperture 220, the lens 212 and the section 236 of the sensor array and the image capturing apparatus 244 comprises the colour filter 230, the aperture 222, the lens 214 and the section 238 of the sensor array.
  • the fourth image capturing apparatus 246 comprises the aperture 224, the lens 216 and a section 239 of the sensor array.
  • the fourth apparatus 246 does not in this example comprise a colour filter.
  • the image sensing arrangement of Figures 2A and 2B is thus able to form four separate images on the image sensor 202.
  • the image sensor 202 is typically, but not necessarily, a single solid-state sensor, such as a CCD (Charge-Coupled Device) or CMOS (Complementary Metal-Oxide Semiconductor) sensor known to one skilled in the art.
  • the image sensor 202 may be divided between lenses, as described above.
  • the image sensor 202 may also comprise four different sensors, one for each lens.
  • the image sensor 202 converts light into an electric current.
  • the sensor 202 comprises a given number of pixels.
  • the number of pixels in the sensor determines the resolution of the sensor. Each pixel produces an electric signal in response to light.
  • the number of pixels in the sensor of an imaging apparatus is a design parameter. Typically in low cost imaging apparatus the number of pixels may be 640x480 along the long and short sides of the sensor. A sensor of this resolution is often called a VGA sensor. In general, the higher the number of pixels in a sensor, the more detailed image can be produced by the sensor.
  • the image sensor 202 is thus sensitive to light and produces an electric signal when exposed to light.
  • the sensor is not able to differentiate different colours from each other.
  • the sensor as such produces only black and white images.
  • a number of solutions have been proposed to enable a digital imaging apparatus to produce colour images. It is well known to one skilled in the art that a full colour image can be produced using only three basic colours in the image capturing phase.
  • One generally used combination of three suitable colours is red, green and blue RGB.
  • Another widely used combination is cyan, magenta and yellow (CMY).
  • Although all colours can be synthesised using three colours, other solutions are also available, such as RGBE, where emerald is used as the fourth colour.
  • One solution used in single lens digital image capturing apparatus is to provide a colour filter array in front of the image sensor, the filter consisting of a three-colour pattern of RGB or CMY colours.
  • Such a solution is often called a Bayer matrix.
  • In an RGB Bayer matrix filter each pixel is typically covered by a filter of a single colour in such a way that in the horizontal direction every other pixel is covered with a green filter, while the remaining pixels are covered by a red filter on every other line and by a blue filter on the alternate lines.
  • a single colour filter passes through to the sensor pixel beneath it only light whose wavelength corresponds to the wavelength of the single colour.
  • the signal processor interpolates the image signal received from the sensor in such a way that all pixels receive a colour value for all three colours.
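As a rough sketch of that interpolation step (the patent does not specify the demosaicing algorithm), missing colour values can be filled from the nearest same-colour sample in each 2x2 Bayer tile. The `G R / B G` tile layout, the function name and the even image dimensions are illustrative assumptions:

```python
def demosaic_nearest(raw, pattern=(("G", "R"), ("B", "G"))):
    """Fill the two missing colour values of every pixel from the nearest
    same-colour sample in its enclosing 2x2 Bayer tile (a crude stand-in
    for the interpolation described in the text)."""
    h, w = len(raw), len(raw[0])
    rgb = [[dict() for _ in range(w)] for _ in range(h)]
    # First pass: record the colour each pixel actually measured.
    for y in range(h):
        for x in range(w):
            rgb[y][x][pattern[y % 2][x % 2]] = raw[y][x]
    # Second pass: copy missing colours from the enclosing 2x2 tile.
    for y in range(h):
        for x in range(w):
            ty, tx = y - y % 2, x - x % 2  # tile origin
            for c in "RGB":
                if c not in rgb[y][x]:
                    for dy in range(2):
                        for dx in range(2):
                            if pattern[dy][dx] == c:
                                rgb[y][x][c] = raw[ty + dy][tx + dx]
    return rgb
```

A production demosaicer would use bilinear or edge-aware interpolation; this nearest-neighbour fill only illustrates that every pixel ends up with all three colour values.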
  • the image sensing arrangement comprises a colour filter arrangement 206 in front of the lens assembly 200.
  • the filter arrangement may be located also in a different part of the arrangement, for example between the lenses and the sensor.
  • the colour filter arrangement 206 comprises three filters, one of each of the three RGB colours, each filter being in front of a lens. Alternatively also CMY colours or other colour spaces, may be used as well.
  • the lens 210 is associated with a red filter, the lens 212 with a green filter and the lens 214 with a blue filter. Thus one lens 216 has no colour filter.
  • the lens assembly may in an embodiment comprise an infra-red filter 208 associated with the lenses.
  • the infra-red filter does not necessarily cover all lenses, as it may also be situated elsewhere, for example between the lenses and the sensor.
  • Each lens of the lens assembly 200 thus produces a separate image to the sensor 202.
  • the sensor is divided between the lenses in such a way that the images produced by the lenses do not overlap.
  • the area of the sensor divided to the lenses may be equal, or the areas may be of different sizes, depending on the embodiment.
  • in this example the sensor 202 is a VGA imaging sensor and the sections 234 - 239 allocated to each lens are of Quarter VGA (QVGA) resolution (320x240).
  • the electric signal produced by the sensor 202 is digitised and taken to the signal processor 104.
  • the signal processor processes the signals from the sensor in such a way that three separate subimages are produced from the signals of lenses 210 - 214, each filtered with a single colour.
  • the signal processor further processes the subimages and combines a VGA resolution image from the subimages.
  • Figure 2C illustrates one possible embodiment to combine the final image from the subimages. This ex- ample assumes that each lens of the lenslet comprises a colour filter, in such a way that there are two green filters, one blue and one red.
  • Figure 2C shows the top left corner of the combined image 250, and four subimages, a green one 252, a red one 254, a blue one 256 and a green one 258.
  • Each of the subimages thus comprises a 320x240 pixel array.
  • the top left pixels of the subimages correspond to each other and differ only in that the colour filter used in producing the pixel information is different.
  • the subimages are first registered. Registering means that any two image points are identified as corresponding to the same physical point.
  • the top left pixel R1C1 of the combined image is taken from the green1 image 252,
  • the pixel R1C2 is taken from the red image 254, the pixel R2C1 is taken from the blue image 256 and the pixel R2C2 is taken from the green2 image 258.
  • the final image corresponds in total resolution with the image produced with a single lens system with a VGA sensor array and a corresponding Bayer colour matrix.
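The interleaving described for Figure 2C can be sketched as follows; the function name is illustrative and the snippet ignores registration and parallax compensation, which are discussed separately:

```python
def combine_bayer(green1, red, blue, green2):
    """Interleave four HxW subimages into a 2Hx2W Bayer-pattern mosaic:
    pixel R1C1 comes from green1, R1C2 from red, R2C1 from blue and
    R2C2 from green2, as in the Figure 2C example."""
    h, w = len(green1), len(green1[0])
    mosaic = [[0] * (2 * w) for _ in range(2 * h)]
    for y in range(h):
        for x in range(w):
            mosaic[2 * y][2 * x] = green1[y][x]        # G
            mosaic[2 * y][2 * x + 1] = red[y][x]       # R
            mosaic[2 * y + 1][2 * x] = blue[y][x]      # B
            mosaic[2 * y + 1][2 * x + 1] = green2[y][x]  # G
    return mosaic
```

With four QVGA (320x240) subimages the result is a 640x480 mosaic, matching the VGA resolution claimed for the combined image.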
  • the signal processor 104 may take into account the parallax error arising from the distances of the lenses 210 - 214 from each other.
  • Due to the parallax error arising from the distances between the lenses, the same pixels of the subimages do not necessarily correspond to each other. The parallax error is compensated by an algorithm.
  • the final image formation may be described as comprising several steps: first the three subimages are registered (also called matching); registering means that any two image points are identified as corresponding to the same physical point. Then the subimages are interpolated, and the interpolated subimages are fused into an RGB colour image.
  • Interpolation and fusion may also be in another order.
  • the final image corresponds in total resolution with the image produced with a single lens system with a VGA sensor array and a corresponding Bayer colour matrix.
  • the subimages produced by the three image capturing apparatus 240 - 244 are used to produce a colour image.
  • the fourth image capturing apparatus 246 may have different properties compared with the other apparatus.
  • the aperture plate 204 may comprise an aperture 224 of a different size for the fourth image capturing apparatus 246 compared to the three other image capturing apparatus.
  • the signal processor 104 is configured to combine at least a portion of the subimage produced with the fourth image capturing apparatus with the subimages produced with the three image capturing apparatus 240 - 244 to produce a colour image with an enhanced image quality.
  • the signal processor 104 is configured to analyse the images produced with the image capturing apparatus and to determine which portions of the images to combine.
  • the fourth image capturing apparatus has a small aperture 224 compared to the apertures 218 - 222 of the rest of the image capturing apparatus. This is illustrated in Figure 3A. When the aperture is small there are fewer aberrations in the resulting image, because a small aperture draws a sharp image.
  • a subimage taken with a small aperture adds information on the final image on bright areas which would otherwise be overexposed.
  • Apertures are usually denoted with so-called F-numbers. They denote the size of the aperture hole through which light passes to the lens. An F-number expresses the aperture diameter as a fraction of the focal length of a lens; thus, the smaller the F-number, the more light is passed to the lens. For example, if the focal length of a lens is 50 mm, an F-number of 2.8 means that the aperture is 1/2.8 of 50 mm, i.e. approximately 18 mm. A small aperture in this embodiment corresponds to an F-number of 4 or greater.
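The F-number arithmetic above reduces to a one-line calculation; the function name is illustrative:

```python
def aperture_diameter(focal_length_mm, f_number):
    """Aperture diameter is the focal length divided by the F-number,
    e.g. 50 mm at F2.8 gives roughly 18 mm."""
    return focal_length_mm / f_number
```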
  • the fourth image capturing apparatus has a larger aperture 224 than the apertures 218 - 222 of the rest of the apparatus. This is illustrated in Figure 3B.
  • the large aperture enables the apparatus to have better light sensitivity compared to other apparatus.
  • the difference between the apertures is preferably fairly great.
  • the final image has a lower noise level because it is averaged using many images.
  • the dynamic range is wider.
  • the final image will have more details in otherwise dark areas of the image. In this way, the final image contains more details in areas where the light intensity is low. These areas would be dark without the dynamic range enhancement.
  • the image produced with the fourth apparatus may be a black and white image. In such a case the colour filter arrangement 206 does not have a colour filter for the fourth lens 216. In an embodiment the colour filter arrangement 206 may comprise a separate Bayer matrix 232 or a corresponding colour matrix filter structure for the fourth lens. Thus the fourth lens can be used to enhance a colour image.
  • the subimage or portions of the subimage produced with the fourth image capturing apparatus and the subimages produced with the three image capturing apparatus 240 - 244 may be combined by the signal processor 104 using several different methods. In an embodiment the combining is made using an averaging method for each pixel to be combined: PV_final,R = (PV_R + PV_4) / 2, and correspondingly for the green and blue channels.
  • PV_R, PV_G and PV_B are the pixel values of the red, green and blue filtered apparatus (in the example of Figure 2B, the pixel values from the subimages produced by the apparatus 240, 242 and 244).
  • PV_4 is the pixel value of the fourth apparatus 246.
  • Since the fourth apparatus produces black and white images, the colour saturation must also be increased for the combined pixels.
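The per-pixel averaging described above can be sketched as a simple blend of each colour channel with the unfiltered fourth-apparatus value; the function name and dict representation are illustrative assumptions:

```python
def average_combine(pv_rgb, pv4):
    """Average each colour channel's pixel value with the fourth
    (unfiltered) apparatus pixel value: PV_final,c = (PV_c + PV_4) / 2.
    The saturation boost mentioned in the text is not modelled here."""
    return {c: (v + pv4) / 2 for c, v in pv_rgb.items()}
```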
  • The above algorithm is for the situation where the aperture of the fourth apparatus 246 is larger than in the other apparatus. In the weighted mean method, information for the final image is taken mainly from the three RGB apparatus; information produced by the fourth apparatus with the larger aperture can be utilised, for example, in the darkest areas of the image, and the algorithm takes this condition into account automatically. In the embodiment where the aperture of the fourth apparatus is smaller, and the image thus sharper than in the other apparatus, the images may be combined with an averaging or a more advanced method, where the images are compared and the sharpest areas of both images are combined into the final image.
  • the amount of information in each image can be measured by taking standard deviation from the small areas of the images.
  • the amount of information corresponds to sharpness.
  • the flowchart of Figure 4 illustrates the method.
  • In phase 400, the standard deviation of a small area of the image produced with the three RGB apparatus is calculated.
  • In phase 402, the standard deviation of the corresponding area of the image produced with the fourth apparatus is calculated.
  • In phase 404, these deviations are compared with each other.
  • The area with the bigger deviation is assumed to be sharper and is emphasised when producing the final image.
  • Attention then moves to the next area.
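Phases 400 - 404 for a single pair of areas can be sketched as follows; the function names and flat-list area representation are illustrative assumptions:

```python
def stdev(block):
    """Population standard deviation of a flat list of pixel values."""
    n = len(block)
    m = sum(block) / n
    return (sum((v - m) ** 2 for v in block) / n) ** 0.5

def pick_sharper(area_rgb, area_4):
    """Compare local standard deviations (phases 400-404) and return the
    area with the larger deviation, which is assumed to be sharper."""
    return area_rgb if stdev(area_rgb) >= stdev(area_4) else area_4
```

A full implementation would slide this comparison over every small area and blend rather than hard-select, but the decision criterion is the one described in the flowchart.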
  • the fourth apparatus is configured to use different exposure time compared to other apparatus. This enables the apparatus to have different light sensitivity compared to other apparatus.
  • the fourth apparatus produces infra-red images. This is achieved by removing the infra-red filter 208 at least partially in front of the lens 216. Thus near-IR light reaches the sensor. In this case the colour filter arrangement 206 does not have a colour filter for the fourth lens 216.
  • the infra-red filter may be partially leaky, in which case it passes both visible light and infra-red light to the sensor via the lens 216.
  • the fourth apparatus may act as an apparatus to be used for imaging in darkness. Imaging is possible when the scene is lit by an IR-light source.
  • the fourth apparatus may also be used to produce a black/white (B/W) reference image, which is taken without the infra-red filter.
  • B/W image can also be used for document imaging.
  • the lack of a colour filter array enhances the spatial resolution of the image compared to a colour image.
  • the reference B/W image may also be useful when the three colour filtered images are registered. The registration process is enhanced when a common reference image is available.
  • Figure 5 illustrates an embodiment of the invention.
  • Figure 5 shows the lens assembly 200, the image sensor 202, the aperture plate 204 and the colour filter arrangement 206 in a more compact form.
  • the fourth apparatus comprises a polarization filter 500.
  • a polarization filter blocks light waves which are polarized perpendicular to the polarization direction of the filter.
  • a vertically polarized filter does not allow any horizontally polarized waves to pass through.
  • the most common use of polarization filters is to block reflected light.
  • In sunshine, horizontal surfaces such as roads and water reflect horizontally polarized light.
  • the fourth apparatus comprises a vertically polarized filter which allows non-polarized light to pass through but blocks reflected light.
  • the fourth apparatus comprises a polarization filter which can be rotated by the user. The polarization filter may also be used with the other embodiments described above.
  • the lens with the polarization filter is similar in optical and light gathering properties compared to the other subsystem in order to simplify calculations.
  • the default image produced by the non-polarized apparatus is defined to be the "normal image" NI. This is the image that is transmitted to the viewfinder for the user to view and stored in memory as the main image.
  • the polarized image PI is stored separately.
  • the user is able to decide whether or not to use the information contained in PI to manipulate NI to form a "corrected image" CI. For example, when viewing images, he can be presented with a simple menu, which allows him to choose the "glare correction", if desired.
  • the correction is made automatically and the corrected image is shown on the viewfinder and stored.
  • the user does not need to be aware that any correction has even been made. This is simple for the user, but taking the image requires more processing and is more difficult to realize in real time.
  • the image taken by the other apparatus and the polarized image taken by the fourth apparatus are reformatted into the same colour space in which there is only an intensity component (i.e. they are reformatted into greyscale images, for example).
  • These reformatted images may be called NY (for the normal image) and PY (for the polarized image).
  • the pixel values X_ij in the matrix X are equal to k, but where the polarizing filter has blocked a significant amount of light from a given location, the pixel values X_ij are much smaller.
  • the matrix X is thus essentially a "map" of the areas with reflected light: where there is significant reflection, the map is dark (close to zero), while it has a constant non-zero value in other areas.
  • the "glare matrix" GM is defined to be a greyscale image with the same dimensions as PY and NY.
  • GM is not uniquely defined, but is related to X in that it is a measure of the "excess light" which is to be removed from the image.
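The patent gives no formula for X, but one way to realise such a "map" from the greyscale images NY and PY is a per-pixel ratio, which is roughly constant where the polarizer blocked nothing and much smaller where reflected light was removed. The function name, the scale constant `k` and the `eps` guard are illustrative assumptions:

```python
def glare_map(ny, py, k=1.0, eps=1e-6):
    """Sketch of the ratio map X[i][j] = k * PY / NY: approximately k
    where the polarizer passed everything, much smaller where it blocked
    reflected (glare) light.  ny and py are lists of rows of pixel
    intensities; eps avoids division by zero in fully dark pixels."""
    return [[k * p / max(n, eps) for n, p in zip(nrow, prow)]
            for nrow, prow in zip(ny, py)]
```

A glare matrix GM derived from this map could then be subtracted from NI to form the corrected image CI, consistent with GM being "a measure of the excess light which is to be removed".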
  • the image sensor is a temperature sensitive unit and generates a small electric current, which depends on the temperature of the sensor. This current is called a dark current, because it occurs also when the sensor is not exposed to light.
  • one apparatus is shielded from light and thus produces an image based on the dark current only. Information from this image may be used to suppress at least part of the dark current present in the other apparatus used for producing the actual image. For example, the dark current image may be subtracted from the images of other apparatus.
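The dark-current subtraction mentioned above can be sketched as a pixelwise difference of the image frame and the shielded apparatus's dark frame; the function name and the clamp at zero are illustrative assumptions:

```python
def subtract_dark(image, dark):
    """Subtract the dark-current reference frame pixelwise, clamping at
    zero so sensor noise cannot drive pixel values negative."""
    return [[max(p - d, 0) for p, d in zip(irow, drow)]
            for irow, drow in zip(image, dark)]
```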
  • at least one image capturing apparatus is used for measuring white balance or measuring exposure parameters. Usually digital cameras measure white balance and exposure parameters using one or more captured images and calculating parameters for white balance and exposure adjustments by averaging pixel values over the image or over the images.
  • the calculation requires computing resources and increases current consumption in a digital camera. In conventional solutions the same lens that creates the image is also used for these measuring purposes.
  • the imaging apparatus has a dedicated image capturing apparatus with a lens arrangement and image sensor area for these measuring purposes.
  • the required software and required algorithms may be designed better as the image capturing and the measuring functions are separated to different apparatus. Thus measuring can be made faster and more accurately than in conventional solutions.
  • the associated image capturing apparatus detects spectral information by capturing light intensity in many spectrum bands by means of diode detectors with corresponding colour filters (for example, red, green, blue and near-IR bands are used). These parameters are used by the processor of the imaging device for estimating parameters needed for white balance and exposure adjustment.
  • the benefit is a processing time much reduced compared to the case of calculating these parameters by averaging over a full image.
  • the white balance and exposure parameters may also be calculated by taking a normal colour image with the image capturing apparatus and averaging pixels over the image in a fashion suitable for white balance and exposure adjustment.
  • the image may be saved and used for later image post-processing on computer, for example.
  • each image capturing apparatus has a different aperture size.
  • Each image capturing apparatus produces a colour image.
  • Each image capturing apparatus comprises a colour filter. Large aperture variations enable high dynamic range imaging. Images of two or more image capturing apparatus may be used to compose a dynamically enhanced colour image. The images may be registered and averaged pixelwise to achieve a high dynamic range colour image.
  • Weighted averaging may also be used as an advanced method to combine images.
  • the weight coefficient can be taken from the best exposure image or derived from all sub-images.
  • the weight value indicates what subimages to use as the source of information, when calculating pixel value in final image. When the weight value is high the information is taken from small aperture cameras and vice versa. Typically the camera sensor sensitivity is dependent on wavelength.
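A per-pixel weighted blend matching the description above (high weight draws from the small-aperture camera, low weight from the large-aperture one) can be sketched as follows; the function name and scalar-pixel interface are illustrative assumptions:

```python
def weighted_fuse(pix_small_ap, pix_large_ap, weight):
    """Blend one pixel from a small-aperture (highlight-safe) subimage
    with one from a large-aperture (shadow-detail) subimage.  A weight
    near 1 favours the small-aperture camera, near 0 the large-aperture
    camera, as described for the weighted averaging method."""
    return weight * pix_small_ap + (1.0 - weight) * pix_large_ap
```

In practice the weight would be derived per pixel from the best-exposed subimage or from all subimages, as the text notes.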
  • each image capturing apparatus comprises a different aperture size and each image capturing apparatus is dedicated to its own spectral band (for instance: R, G, B, Clear).

Abstract

The invention relates to a method of creating an image file in an imaging device and an imaging device comprising at least two image capturing apparatus, each apparatus being arranged to produce an image. The apparatus is configured to utilize at least a portion of the images produced with different image capturing apparatus with each other to produce an image with an enhanced image quality.

Description

Method and device for capturing multiple images
Field The invention relates to an imaging device and a method of creating an image file. Especially the invention relates to digital imaging devices comprising more than one image capturing apparatus.
Background The popularity of photography is continuously increasing. This applies especially to digital photography as the supply of inexpensive digital cameras has improved. Also the integrated cameras in mobile phones have contributed to the increase in the popularity of photography. The quality of images is naturally important for every photographer. In many situations it is difficult to evaluate correct parameters used in photographing. For example, correct exposure in situations where there are well lit and dark areas nearby may be difficult. The automatic exposure programs in modern cameras usually produce good quality images in many situations, but in some difficult exposure situations the automatic exposure may not be able to produce the best possible result. Also the optical quality of cameras sets limits to the image quality. Especially in low cost cameras, which are used in mobile phones, for example, the optical quality of the lenses is not comparable to high-end cameras.
Brief description of invention An object of the invention is to provide an improved solution for creating images. Another object of the invention is to enhance the dynamic range of images. According to an aspect of the invention, there is provided an imaging device comprising at least two image capturing apparatus, each apparatus being arranged to produce an image. The apparatus is configured to utilize at least a portion of the images produced with different image capturing apparatus with each other to produce an image with an enhanced image quality. According to another aspect of the invention, there is provided a method of creating an image file in an imaging device, comprising producing images with at least two image capturing apparatus, and utilising at least a portion of the images produced with different image capturing apparatus with each other to produce an image with enhanced image quality. The method and system of the invention provide several advantages. In general, at least one image capturing apparatus has different light capturing properties compared to the other apparatus. Thus the image produced by the apparatus is used for enhancing the dynamic range of the image produced with the other of the image capturing apparatus. In an embodiment of the invention, at least one image capturing apparatus has a small aperture. Thus, the image produced by the apparatus has fewer aberrations, as a smaller aperture produces a sharper image. The information in the image may be utilised and combined with the images produced by other apparatus. In an embodiment of the invention, at least one image capturing apparatus has a higher aperture than other apparatus. Thus, the apparatus gathers more light and it is able to get more details from dark areas of the photographed area. In an embodiment of the invention, the imaging device comprises a lenslet array with at least four lenses and a sensor array. 
The four image capturing apparatus each use one lens from the lenslet array and a portion of the sensor array. Three of the image capturing apparatus each comprise a unique colour filter from a group of RGB or CMY filters, or from another system of colour filters, and these three apparatus are thus required for producing a colour image. The fourth image capturing apparatus may be manufactured with different light capturing properties compared to the other apparatus and used for enhancing the quality of the image produced with the three apparatus.
List of drawings In the following, the invention will be described in greater detail with reference to the preferred embodiments and the accompanying drawings, in which Figure 1 illustrates an example of an imaging device of an embodiment; Figures 2A and 2B illustrate an example of an image sensing arrangement; Figure 2C illustrates an example of colour image combining; Figures 3A and 3B illustrate embodiments of the invention; Figure 4 illustrates a method of an embodiment with a flowchart; and Figure 5 illustrates an embodiment where a polarization filter is used.
Description of embodiments Figure 1 illustrates a generalised digital imaging device which may be utilized in some embodiments of the invention. It should be noted that embodiments of the invention may also be utilised in other kinds of digital cameras than the apparatus of Figure 1, which is just an example of a possible structure. The apparatus of Figure 1 comprises an image sensing arrangement 100. The image sensing arrangement comprises a lens assembly and an image sensor. The structure of the arrangement 100 will be discussed in more detail later. The image sensing arrangement captures an image and converts it into an electrical form. The electric signal produced by the arrangement 100 is led to an A/D converter 102, which converts the analogue signal into a digital form. From the converter the digitised signal is taken to a signal processor 104. The image data is processed in the signal processor to create an image file. The output signal of the image sensing arrangement 100 contains raw image data which needs post-processing, such as white balancing and colour processing. The signal processor is also responsible for giving exposure control commands 106 to the image sensing arrangement 100. The apparatus may further comprise an image memory 108, where the signal processor may store finished images, a work memory 110 for data and program storage, a display 112 and a user interface 114, which typically comprises a keyboard or corresponding means for the user to give input to the apparatus. Figure 2A illustrates an example of the image sensing arrangement 100. The image sensing arrangement comprises in this example a lens assembly 200, which comprises a lenslet array with four lenses. The arrangement further comprises an image sensor 202, an aperture plate 204, a colour filter arrangement 206 and an infra-red filter 208. Figure 2B illustrates the structure of the image sensing arrangement from another point of view.
In this example the lens assembly 200 comprises four separate lenses 210 - 216 in a lenslet array. Correspondingly, the aperture plate 204 comprises a fixed aperture 218 - 224 for each lens. The aperture plate controls the amount of light that is passed to each lens. It should be noted that the structure of the aperture plate is not relevant to the embodiments, i.e. the aperture value of each lens need not be the same. The number of lenses is not limited to four, either. The colour filter arrangement 206 of the image sensing arrangement comprises in this example three colour filters, i.e. red 226, green 228 and blue 230, in front of the lenses 210 - 214, respectively. The sensor array 202 is in this example divided into four sections 234 - 239. Thus, the image sensing arrangement comprises in this example four image capturing apparatus 240 - 246. The image capturing apparatus 240 comprises the colour filter 226, the aperture 218, the lens 210 and the section 234 of the sensor array. Respectively, the image capturing apparatus 242 comprises the colour filter 228, the aperture 220, the lens 212 and the section 236 of the sensor array, and the image capturing apparatus 244 comprises the colour filter 230, the aperture 222, the lens 214 and the section 238 of the sensor array. The fourth image capturing apparatus 246 comprises the aperture 224, the lens 216 and the section 239 of the sensor array; the fourth apparatus 246 thus does not in this example comprise a colour filter. The image sensing arrangement of Figures 2A and 2B is thus able to form four separate images on the image sensor 202. The image sensor 202 is typically, but not necessarily, a single solid-state sensor, such as a CCD (Charge-Coupled Device) or CMOS (Complementary Metal-Oxide Semiconductor) sensor known to one skilled in the art. In an embodiment, the image sensor 202 may be divided between the lenses, as described above.
The image sensor 202 may also comprise four different sensors, one for each lens. The image sensor 202 converts light into an electric current. This analogue electric signal is converted in the image capturing apparatus into a digital form by the A/D converter 102, as illustrated in Figure 1. The sensor 202 comprises a given number of pixels; the number of pixels in the sensor determines the resolution of the sensor. Each pixel produces an electric signal in response to light. The number of pixels in the sensor of an imaging apparatus is a design parameter. Typically in a low-cost imaging apparatus the number of pixels may be 640x480 along the long and short sides of the sensor; a sensor of this resolution is often called a VGA sensor. In general, the higher the number of pixels in a sensor, the more detailed an image the sensor can produce. The image sensor 202 is thus sensitive to light and produces an electric signal when exposed to light. However, the sensor is not able to differentiate different colours from each other, and as such it produces only black and white images. A number of solutions have been proposed to enable a digital imaging apparatus to produce colour images. It is well known to one skilled in the art that a full colour image can be produced using only three basic colours in the image capturing phase. One generally used combination of three suitable colours is red, green and blue (RGB). Another widely used combination is cyan, magenta and yellow (CMY). Although all colours can be synthesised using three colours, other solutions are also available, such as RGBE, where emerald is used as the fourth colour. One solution used in single-lens digital image capturing apparatus is to provide a colour filter array in front of the image sensor, the filter consisting of a three-colour pattern of RGB or CMY colours. Such a solution is often called a Bayer matrix.
When using an RGB Bayer matrix filter, each pixel is typically covered by a filter of a single colour in such a way that in the horizontal direction every other pixel is covered with a green filter, while the remaining pixels are covered by a red filter on every other line and by a blue filter on the alternate lines. A single-colour filter passes through to the sensor pixel beneath it only light whose wavelength corresponds to the wavelength of that colour. The signal processor interpolates the image signal received from the sensor in such a way that all pixels receive a colour value for all three colours. Thus a colour image can be produced. In the multiple-lens embodiment of Figure 2A a different approach is used for producing a colour image. The image sensing arrangement comprises a colour filter arrangement 206 in front of the lens assembly 200. In practice the filter arrangement may also be located in a different part of the arrangement, for example between the lenses and the sensor. In an embodiment the colour filter arrangement 206 comprises three filters, one for each of the three RGB colours, each filter being in front of a lens. Alternatively, CMY colours or other colour spaces may be used as well. In the example of Figure 2B the lens 210 is associated with a red filter, the lens 212 with a green filter and the lens 214 with a blue filter. Thus one lens 216 has no colour filter. As illustrated in Figure 2A, the lens assembly may in an embodiment comprise an infra-red filter 208 associated with the lenses. The infra-red filter does not necessarily cover all lenses, as it may also be situated elsewhere, for example between the lenses and the sensor. Each lens of the lens assembly 200 thus produces a separate image on the sensor 202. The sensor is divided between the lenses in such a way that the images produced by the lenses do not overlap.
The area of the sensor allocated to each lens may be equal, or the areas may be of different sizes, depending on the embodiment. Let us assume in this example that the sensor 202 is a VGA imaging sensor and that the sections 234 - 239 allocated for each lens are of Quarter VGA (QVGA) resolution (320x240). As described above, the electric signal produced by the sensor 202 is digitised and taken to the signal processor 104. The signal processor processes the signals from the sensor in such a way that three separate subimages are produced from the signals of the lenses 210 - 214, each filtered with a single colour. The signal processor further processes the subimages and combines a VGA resolution image from the subimages. Figure 2C illustrates one possible embodiment for combining the final image from the subimages. This example assumes that each lens of the lenslet comprises a colour filter, in such a way that there are two green filters, one blue and one red. Figure 2C shows the top left corner of the combined image 250 and four subimages: a green one 252, a red one 254, a blue one 256 and a green one 258. Each of the subimages thus comprises a 320x240 pixel array. The top left pixels of the subimages correspond to each other and differ only in that the colour filter used in producing the pixel information is different. The subimages are first registered. Registering means that any two image points are identified as corresponding to the same physical point. The top left pixel R1C1 of the combined image is taken from the green1 image 252, the pixel R1C2 is taken from the red image 254, the pixel R2C1 is taken from the blue image 256 and the pixel R2C2 is taken from the green2 image 258. This process is repeated for all pixels in the combined image 250. After this the combined image pixels are fused together so that each pixel has all three RGB colours.
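The pixel-interleaving step described above can be sketched as follows. This is a minimal, non-authoritative NumPy illustration: the function name is hypothetical, the subimages are assumed to be already registered, and the QVGA/VGA dimensions follow the example in the text.

```python
import numpy as np

def combine_subimages(green1, red, blue, green2):
    """Interleave four registered QVGA subimages into one VGA
    Bayer-pattern mosaic, as in the Figure 2C example:
    R1C1 <- green1, R1C2 <- red, R2C1 <- blue, R2C2 <- green2."""
    h, w = green1.shape                       # e.g. 240 x 320 (QVGA)
    mosaic = np.zeros((2 * h, 2 * w), dtype=green1.dtype)
    mosaic[0::2, 0::2] = green1               # top-left pixel of each 2x2 cell
    mosaic[0::2, 1::2] = red
    mosaic[1::2, 0::2] = blue
    mosaic[1::2, 1::2] = green2
    return mosaic                             # 480 x 640 (VGA) mosaic
```

The resulting mosaic would then be interpolated (demosaiced) so that every pixel receives all three RGB values, as with a conventional Bayer-filtered sensor.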
The final image corresponds in total resolution to the image produced with a single-lens system with a VGA sensor array and a corresponding Bayer colour matrix. In an embodiment, when composing the final image, the signal processor 104 may take into account the parallax error arising from the distances of the lenses 210 - 214 from each other. Due to the parallax error, the same pixels of the subimages do not necessarily correspond to each other; the parallax error is compensated by an algorithm. The final image formation may thus be described as comprising several steps: first, the three subimages are registered (also called matching); registering means that any two image points are identified as corresponding to the same physical point. Then the subimages are interpolated, and the interpolated subimages are fused into an RGB colour image. Interpolation and fusion may also be performed in the other order. In an embodiment, the subimages produced by the three image capturing apparatus 240 - 244 are used to produce a colour image. The fourth image capturing apparatus 246 may have different properties compared with the other apparatus.
The aperture plate 204 may comprise an aperture 224 of a different size for the fourth image capturing apparatus 246 compared to the three other image capturing apparatus. The signal processor 104 is configured to combine at least a portion of the subimage produced with the fourth image capturing apparatus with the subimages produced with the three image capturing apparatus 240 - 244 to produce a colour image with an enhanced image quality. The signal processor 104 is configured to analyse the images produced with the image capturing apparatus and to determine which portions of the images to combine. In an embodiment the fourth image capturing apparatus has a small aperture 224 compared to the apertures 218 - 222 of the rest of the image capturing apparatus. This is illustrated in Figure 3A. When the aperture is small there are fewer aberrations in the resulting image, because a small aperture draws a sharp image. In addition, a subimage taken with a small aperture adds information to the final image in bright areas which would otherwise be overexposed. Apertures are usually denoted with so-called F-numbers, which express the diameter of the aperture hole, through which the light passes to the lens, as a fraction of the focal length of the lens. Thus, the smaller the F-number, the more light is passed to the lens. For example, if the focal length of a lens is 50 mm, an F-number of 2.8 means that the aperture is 1/2.8 of 50 mm, i.e. approximately 18 mm. A small aperture in this embodiment corresponds to an F-number of 4 or greater. In an embodiment the fourth image capturing apparatus has a larger aperture 224 than the apertures 218 - 222 of the rest of the apparatus. This is illustrated in Figure 3B. The large aperture enables the apparatus to have better light sensitivity compared to the other apparatus. The difference between the apertures is preferably fairly great. With this solution a large dynamic range is achieved.
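The F-number relationship above is simple arithmetic; a small illustrative sketch (the function name is hypothetical):

```python
def aperture_diameter(focal_length_mm: float, f_number: float) -> float:
    """Aperture diameter is the focal length divided by the F-number."""
    return focal_length_mm / f_number

# The example from the text: a 50 mm lens at F/2.8 has an aperture of
# roughly 17.9 mm; stopping down to F/4 shrinks it to 12.5 mm.
print(round(aperture_diameter(50, 2.8), 1))  # 17.9
print(round(aperture_diameter(50, 4.0), 1))  # 12.5
```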
The final image has a lower noise level because it is averaged from several images, and its dynamic range is larger. In this way, the final image contains more detail in areas where the light intensity is low; these areas would be dark without the dynamic range enhancement. The subimage produced by the fourth image capturing apparatus
246 may be a black and white image. In such a case the colour filter arrangement 206 does not have a colour filter for the fourth lens 216. In an embodiment the colour filter arrangement 206 may comprise a separate Bayer matrix 232 or a corresponding colour matrix filter structure. Thus the fourth lens can be used to enhance a colour image. The subimage, or portions of the subimage, produced with the fourth image capturing apparatus and the subimages produced with the three image capturing apparatus 240 - 244 may be combined by the signal processor 104 using several different methods. In an embodiment the combining is made using an averaging method for each pixel to be combined:

PV_final_R = (PV_R + PV_4) / 2
PV_final_G = (PV_G + PV_4) / 2
PV_final_B = (PV_B + PV_4) / 2

where PV_final_R, PV_final_G and PV_final_B are the final pixel values, PV_R, PV_G and PV_B are the pixel values of the red, green and blue filtered apparatus (in the example of Figure 2B, the pixel values from the subimages produced by the apparatus 240, 242 and 244), and PV_4 is the pixel value of the fourth apparatus 246. In an embodiment the combining is made using a weighted mean method for each pixel to be combined:

PV_final_R = (M * PV_R + (255 - M) * PV_4) / 255
PV_final_G = (M * PV_G + (255 - M) * PV_4) / 255
PV_final_B = (M * PV_B + (255 - M) * PV_4) / 255

where M = (PV_R + PV_G + PV_B) / 3, PV_final_R, PV_final_G and PV_final_B are the final pixel values, and PV_R, PV_G and PV_B are the pixel values of the red, green and blue filtered apparatus. Since the fourth apparatus produces black and white images, the colour saturation must also be increased for the combined pixels. The above algorithm applies to the situation where the aperture of the fourth apparatus 246 is larger than that of the other apparatus. In the weighted mean method the information of the final image is taken mainly from the three RGB apparatus; the information produced by the fourth apparatus with the larger aperture can be utilised, for example, in the darkest areas of the image. The above algorithm takes this condition into account automatically. In the embodiment where the aperture of the fourth apparatus is smaller, and the image thus sharper, than in the other apparatus, the images may be combined with the averaging method or with an advanced method in which the images are compared and the sharpest areas of both images are combined into the final image. The amount of information in each image can be measured by taking the standard deviation over small areas of the images; the amount of information corresponds to sharpness. The flowchart of Figure 4 illustrates the method. In phase 400, the standard deviation over a small area of the image produced with the three RGB apparatus is calculated. In phase 402, the standard deviation over a corresponding area of the image produced with the fourth apparatus is calculated. In phase 404 these deviations are compared with each other.
In phase 406, the area which has the bigger deviation is assumed to be sharper, and it is emphasised when producing the final image. In phase 408 the attention is moved to the next area. With the above method a well-balanced contrast is achieved over the whole image area. This applies especially to situations where there are high contrast differences in the image. In addition, the amount of information in the image can be increased and the perceived noise decreased. In an embodiment, the fourth apparatus is configured to use a different exposure time compared to the other apparatus. This enables the apparatus to have a different light sensitivity compared to the other apparatus. In an embodiment, the fourth apparatus produces infra-red images. This is achieved by at least partially removing the infra-red filter 208 in front of the lens 216, so that near-IR light reaches the sensor. In this case the colour filter arrangement 206 does not have a colour filter for the fourth lens 216. The infra-red filter may be a partially leaky infra-red filter, in which case it passes both visible light and infra-red light to the sensor via the lens 216. In this embodiment the fourth apparatus may act as an apparatus to be used for imaging in darkness; imaging is possible when the scene is lit by an IR light source. The fourth apparatus may also be used to produce a black/white (B/W) reference image, which is taken without the infra-red filter. The B/W image can also be used for document imaging. The lack of a colour filter array enhances the spatial resolution of the image compared to a colour image. The reference B/W image may also be useful when the three colour-filtered images are registered; the registration process is enhanced when a common reference image is available. Figure 5 illustrates an embodiment of the invention. Figure 5 shows the lens assembly 200, the image sensor 202, the aperture plate 204 and the colour filter arrangement 206 in a more compact form.
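The weighted-mean combination and the block-wise sharpness comparison of Figure 4, both described above, can be sketched as follows. This is a non-authoritative NumPy illustration: the function names are hypothetical, and the 8x8 block size is an assumption not given in the text.

```python
import numpy as np

def weighted_mean_combine(pv_rgb, pv4):
    """Weighted mean of an RGB subimage stack (H, W, 3) and the fourth
    (B/W) subimage (H, W), per the formulas in the text:
    PV_final_c = (M * PV_c + (255 - M) * PV_4) / 255, where
    M = (PV_R + PV_G + PV_B) / 3."""
    pv_rgb = pv_rgb.astype(np.float64)
    pv4 = pv4.astype(np.float64)
    m = pv_rgb.mean(axis=2)                   # M per pixel
    return (m[..., None] * pv_rgb
            + (255.0 - m)[..., None] * pv4[..., None]) / 255.0

def sharper_block_mask(img_a, img_b, block=8):
    """Figure 4 sketch: compare local standard deviations block by block.
    True marks blocks where img_b (e.g. the small-aperture subimage)
    has the bigger deviation, i.e. is assumed sharper."""
    h, w = img_a.shape
    mask = np.zeros((h, w), dtype=bool)
    for y in range(0, h, block):
        for x in range(0, w, block):
            a = img_a[y:y + block, x:x + block]
            b = img_b[y:y + block, x:x + block]
            mask[y:y + block, x:x + block] = b.std() > a.std()
    return mask
```

As the text notes, for bright pixels (large M) the weighted mean draws mainly on the RGB subimages, while dark pixels draw on the fourth, larger-aperture subimage.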
In this embodiment the fourth apparatus comprises a polarization filter 500. A polarization filter blocks light waves which are polarized perpendicular to the polarization direction of the filter. Thus, a vertically polarized filter does not allow any horizontally polarized waves to pass through. In photography (and also in sunglasses) the most common use of polarizing filters is to block reflected light: in sunshine, horizontal surfaces such as roads and water reflect horizontally polarized light. In an embodiment of the invention the fourth apparatus comprises a vertically polarized filter which allows non-polarized light to pass through but blocks reflected light. In an embodiment of the invention the fourth apparatus comprises a polarization filter which can be rotated by the user. The polarization filter may also be used with the other embodiments described above. However, in the following discussion it is assumed that the lens with the polarization filter is similar in optical and light gathering properties to the other subsystems, in order to simplify the calculations. In an embodiment, the default image produced by the non-polarized apparatus is defined to be the "normal image" NI. This is the image that is transmitted to the viewfinder for the user to view and stored in memory as the main image. The polarized image PI is stored separately. In an embodiment, the user is able to decide whether or not to use the information contained in PI to manipulate NI to form a "corrected image" CI. For example, when viewing images, the user can be presented with a simple menu which allows choosing the "glare correction", if desired. In an embodiment, the correction is made automatically and the corrected image is shown on the viewfinder and stored. Thus, the user does not need to be aware that any correction has even been made. This is simple for the user, but taking the image requires more processing and is more difficult to realize in real time.
Also, it is usually preferable to store PI together with CI, in case the processing to create CI cannot be done correctly. This may happen e.g. if one of the lenses is dirty or the sensors lose their calibration over time, which results in the optical systems of the lenses being non-identical. To make corrections, the image taken by the other apparatus and the polarized image taken by the fourth apparatus are reformatted into the same colour space in which there is only the intensity component (i.e. they are reformatted into greyscale images, for example). In an implementation, this could be the Y component of a YUV-coded image. These reformatted images may be called NY (for the normal image) and PY (for the polarized image). Mathematically, NY and PY are matrices containing the intensity information of NI and PI. If there is no preferred orientation of the polarization, NY and PY are linearly proportional: PY = k*NY, with k < 1 because the polarizing filter blocks out some of the light. However, if the light coming to part of the image is strongly polarized in a specific direction, the NY image will be overexposed compared to the PY image in these locations if the polarizing filter is oriented so that it blocks light in this specific direction of polarization. As described above, such a situation most typically occurs when light is reflected from a large flat surface, e.g. water or a road surface, and is then primarily horizontally polarized. This excess of reflected light (the glare) is what causes the partial overexposure of the image NY. Mathematically, the simple linear relationship between PY and NY is lost in the presence of glare, and the relationship must be defined with a matrix X having the same dimensions as PY and NY. The relation is the pointwise product PY = X*NY. It should be noted that this is a pointwise product and not a matrix product.
Most of the pixel values X_ij in the matrix X are equal to k, but where the polarizing filter has blocked a significant amount of light from a given location, the pixel values X_ij are much smaller. The matrix X is thus essentially a "map" of the areas with reflected light: where there is significant reflection, the map is dark (close to zero), while it has a constant non-zero value in other areas. However, since the above equation is non-linear, simplifications must be made to utilize it practically. In an embodiment, the "glare matrix" GM is defined to be a greyscale image with the same dimensions as PY and NY. GM is not uniquely defined, but is related to X in that it is a measure of the "excess light" which is to be removed from the image. In this embodiment, GM may be defined empirically from the formula GM = (c1*NY - c2*PY) / (c1 + c2). The values of c1 and c2 may be determined empirically or they may be defined by the user. From this, the corrected greyscale image CY is then given by CY = (c3*NY - c4*GM) / (c3 + c4), where the values of c3 and c4 may again be empirically determined or user-defined constants. From this, it is possible to determine the final corrected image by transforming CY back into the original colour space (in the simplest embodiment, by simply using the U and V fields of the original NI and transforming (CY, U, V) -> CI). The specific embodiment shown is only one of many, but it illustrates the main steps needed: transformation into at least one common colour space, evaluation of the glare effect in each of these colour spaces, elimination of the glare effect in each of these colour spaces, and transformation back into the original colour space. Note that these steps could also be done separately for each colour in an RGB space rather than transforming to a YUV space as in the above embodiment. In an embodiment, at least one image capturing apparatus is shielded for producing a dark reference.
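The glare-correction formulas above can be sketched as follows. This is a minimal, non-authoritative NumPy illustration: the function name, the unity default constants and the final clipping are assumptions for the sake of the example, and c1 through c4 would in practice be determined empirically as the text describes.

```python
import numpy as np

def glare_correct(ny, py, c1=1.0, c2=1.0, c3=1.0, c4=1.0):
    """Estimate and remove glare from the normal greyscale image NY
    using the polarized greyscale image PY, per the empirical formulas:
        GM = (c1*NY - c2*PY) / (c1 + c2)
        CY = (c3*NY - c4*GM) / (c3 + c4)
    ny, py: intensity (e.g. Y-channel) images as float arrays."""
    gm = (c1 * ny - c2 * py) / (c1 + c2)   # "map" of excess reflected light
    cy = (c3 * ny - c4 * gm) / (c3 + c4)   # corrected greyscale image
    return np.clip(cy, 0.0, 255.0)
```

The corrected CY would then be recombined with the original U and V chrominance fields and transformed back into the original colour space, as described above.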
The image sensor converts light into an electric current. The image sensor is a temperature-sensitive unit and generates a small electric current which depends on the temperature of the sensor. This current is called a dark current, because it occurs also when the sensor is not exposed to light. In this embodiment one apparatus is shielded from light and thus produces an image based on the dark current only. Information from this image may be used to suppress at least part of the dark current present in the other apparatus used for producing the actual image. For example, the dark current image may be subtracted from the images of the other apparatus. In an embodiment, at least one image capturing apparatus is used for measuring white balance or measuring exposure parameters. Usually digital cameras measure white balance and exposure parameters using one or more captured images and calculate the parameters for white balance and exposure adjustments by averaging pixel values over the image or images. The calculation requires computing resources and increases current consumption in a digital camera. In such a case the same lens that creates the image is also used for these measuring purposes. In this embodiment the imaging device has a dedicated image capturing apparatus with a lens arrangement and an image sensor area for these measuring purposes. The required software and algorithms may be designed better, as the image capturing and the measuring functions are separated into different apparatus. Thus the measuring can be made faster and more accurate than in conventional solutions. When performing white balance or exposure parameter measurement, the associated image capturing apparatus detects spectral information by capturing light intensity in many spectrum bands by means of diode detectors with corresponding colour filters (for example, red, green, blue and near-IR bands are used).
These parameters are used by the processor of the imaging device for estimating the parameters needed for white balance and exposure adjustment. The benefit is a much reduced processing time compared to calculating these parameters by averaging over a full image. The white balance and exposure parameters may also be calculated by taking a normal colour image with the image capturing apparatus and averaging pixels over the image in a fashion suitable for white balance and exposure adjustment. In an embodiment the image may be saved and used for later image post-processing on a computer, for example. In an embodiment, each image capturing apparatus has a different aperture size. Each image capturing apparatus produces a colour image and comprises a colour filter. Large aperture variations enable high dynamic range imaging. Images of two or more image capturing apparatus may be used to compose a dynamically enhanced colour image: the images may be registered and averaged pixelwise to achieve a high dynamic range colour image. Weighted averaging may also be used as an advanced method to combine images. The weight coefficient can be taken from the best exposed image or derived from all subimages. The weight value indicates which subimages to use as the source of information when calculating a pixel value in the final image: when the weight value is high, the information is taken from the small-aperture cameras, and vice versa. Typically the camera sensor sensitivity is dependent on wavelength.
For example, the sensitivity of the blue channel is much lower than that of the red channel in both CCD and CMOS sensors. A bigger aperture increases the light flux, thus allowing more photons to reach the sensor. The lower the sensor sensitivity of a certain channel, the bigger the corresponding aperture size should be. The aperture variations of the image capturing apparatus enable a good signal balance between colour channels, with similar signal-to-noise ratios. In an embodiment each image capturing apparatus comprises a different aperture size and each image capturing apparatus is dedicated to its own spectral band (for instance R, G, B and clear). Even though the invention is described above with reference to an example according to the accompanying drawings, it is clear that the invention is not restricted thereto but can be modified in several ways within the scope of the appended claims.

Claims

1. An imaging device comprising at least two image capturing apparatus, each apparatus being arranged to produce an image, characterized by the apparatus being configured to utilize at least a portion of the images produced with different image capturing apparatus with each other to produce an image with an enhanced image quality.
2. The device of claim 1, characterized by the apparatus being configured to analyse the images produced with the image capturing apparatus and to determine which portions of an image to utilize.
3. The device of claim 1, characterized by the apparatus being configured to combine at least a portion of the images produced with different image capturing apparatus with each other.
4. The device of claim 1, characterized in that at least one image capturing apparatus has a small aperture.
5. The device of claim 1, characterized in that at least one image capturing apparatus has a larger aperture than the other apparatus.
6. The device of claim 1, characterized in that at least one image capturing apparatus has a different light gathering capability and that the image produced by the apparatus is used for enhancing the dynamic range of the image produced with the other image capturing apparatus.
7. The device of claim 1, characterized in that at least one image capturing apparatus comprises a polarisation filter.
8. The device of claim 1, characterized in that the image capturing apparatus comprise a lens system and a sensor array configured to produce an electric signal, and that the device comprises a processor operationally connected to the sensor arrays and configured to produce an image proportional to the electric signal received from the sensor arrays.
9. The device of claim 8, characterized in that the device comprises a sensor array divided between at least two image capturing apparatus.
10. The device of claim 1, characterized by the device comprising a lenslet array with at least four lenses.
11. The device of claim 8, characterized in that the device comprises a sensor array and four image capturing apparatus, each apparatus using one lens from the lenslet array and a portion of the sensor array.
12. The device of claim 9, characterized in that three image capturing apparatus are configured to produce a colour image; that the fourth image capturing apparatus is configured to produce an image; and that the device comprises a processor configured to combine at least a portion of the images with each other to produce an image with an enhanced image quality.
13. The device of claim 10, characterized in that the three image capturing apparatus each comprise a unique colour filter from a group of filters red, green or blue.
14. The device of claim 10, characterized in that each of the three image capturing apparatus comprises a unique colour filter from a group of filters cyan, magenta or yellow.
15. The device of claim 12, characterized in that the fourth image capturing apparatus comprises a Bayer matrix.
16. The device of claim 12, characterized in that the fourth image capturing apparatus produces infra-red images.
17. The device of claim 1, characterized in that at least one image capturing apparatus is shielded for producing a dark reference.
18. The device of claim 1, characterized in that at least one image capturing apparatus is used for measuring white balance.
19. The device of claim 1, characterized in that at least one image capturing apparatus is used for measuring exposure parameters.
20. The device of claim 1, characterized in that the fourth image capturing apparatus comprises a polarization filter.
21. The device of claim 1, characterized in that the fourth image capturing apparatus produces images from which a specific light polarization direction has been removed.
22. The device of claim 1, characterized in that each image capturing apparatus comprises a different aperture and is dedicated to a different spectral band.
23. The device of claim 1, characterized in that each image capturing apparatus comprises a lens arrangement.
24. The device of claim 1, characterized in that at least one image capturing apparatus is configured to use a different exposure time compared to other apparatus.
25. A method of creating an image file in an imaging device, comprising producing images with at least two image capturing apparatus, characterized by utilising at least a portion of the images produced with different image capturing apparatus with each other to produce an image with an enhanced image quality.
26. The method of claim 25, characterized by analysing the images produced with the image capturing apparatus and determining which portions of the images to utilize.
27. The method of claim 25, characterized by producing images with image capturing apparatus of a different light gathering capability.
28. The method of claim 25, characterized by producing images with image capturing apparatus comprising a lens system and a sensor array configured to produce an electric signal and processing the images proportional to the electric signal with a processor operationally connected to the sensor arrays.
29. The method of claim 25, characterized by producing images with a sensor array and four image capturing apparatus, each apparatus using one lens from the lenslet array and a portion of the sensor array.
30. The method of claim 29, characterized by producing a colour image with three image capturing apparatus, producing an image with the fourth image capturing apparatus and combining at least a portion of the images with each other to produce an image with an enhanced image quality.
31. The method of claim 30, characterized by producing a colour image with the fourth capturing apparatus by using a Bayer matrix filter.
32. The method of claim 30, characterized by producing an infra-red image with the fourth capturing apparatus.
33. The method of claim 25, characterized by combining at least a portion of the images produced with different image capturing apparatus with each other.
34. The method of claim 25, characterized by using at least one image capturing apparatus for producing a dark reference.
35. The method of claim 25, characterized by using at least one image capturing apparatus for measuring white balance.
36. The method of claim 25, characterized by using at least one image capturing apparatus for measuring exposure parameters.
37. The method of claim 25, characterized by using at least one image capturing apparatus for producing images from which a specific light polarization direction has been removed.
38. The method of claim 25, characterized in that each image capturing apparatus produces images with a lens arrangement of its own.
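The claims above describe, among other combinations, using a shielded apparatus as a dark reference (claims 17 and 34) and combining captures of differing light gathering capability or exposure time to extend dynamic range (claims 6 and 24). A minimal illustrative sketch of both ideas in NumPy, assuming 8-bit linear sensor data; the function names, the fixed saturation threshold, and the linear-response assumption are illustrative choices, not part of the claims:

```python
import numpy as np

def subtract_dark_reference(raw, dark):
    # Claims 17/34: the output of a shielded apparatus approximates the
    # sensor's dark-current level, which is subtracted from the real capture.
    return np.clip(raw.astype(np.int32) - dark.astype(np.int32), 0, 255).astype(np.uint8)

def fuse_exposures(short_exp, long_exp, t_short, t_long, sat=250):
    # Claims 6/24: where the long exposure saturates, substitute the short
    # exposure scaled by the exposure-time ratio, extending dynamic range.
    fused = long_exp.astype(np.float64)
    saturated = long_exp >= sat
    fused[saturated] = short_exp[saturated].astype(np.float64) * (t_long / t_short)
    return fused
```

On two registered captures of the same scene (for example from adjacent lenslets over a shared sensor array, as in claims 9 to 12), the fused array holds linear values that exceed the 8-bit range in highlights and would then be tone-mapped for display.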
PCT/FI2003/000944 2003-12-11 2003-12-11 Method and device for capturing multiple images WO2005057278A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/FI2003/000944 WO2005057278A1 (en) 2003-12-11 2003-12-11 Method and device for capturing multiple images
AU2003285380A AU2003285380A1 (en) 2003-12-11 2003-12-11 Method and device for capturing multiple images

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/FI2003/000944 WO2005057278A1 (en) 2003-12-11 2003-12-11 Method and device for capturing multiple images

Publications (1)

Publication Number Publication Date
WO2005057278A1 true WO2005057278A1 (en) 2005-06-23

Family

ID=34673805

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/FI2003/000944 WO2005057278A1 (en) 2003-12-11 2003-12-11 Method and device for capturing multiple images

Country Status (2)

Country Link
AU (1) AU2003285380A1 (en)
WO (1) WO2005057278A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3971065A (en) * 1975-03-05 1976-07-20 Eastman Kodak Company Color imaging array
EP0858208A1 (en) * 1997-02-07 1998-08-12 Eastman Kodak Company Method of producing digital images with improved performance characteristic
EP0930770A2 (en) * 1998-01-14 1999-07-21 Mitsubishi Denki Kabushiki Kaisha Portable cellular phone having the function of a camera
US20030117501A1 (en) * 2001-12-21 2003-06-26 Nec Corporation Camera device for portable equipment
US6611289B1 (en) * 1999-01-15 2003-08-26 Yanbin Yu Digital cameras using multiple sensors with multiple lenses

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1874034A2 (en) * 2006-06-26 2008-01-02 Samsung Electro-Mechanics Co., Ltd. Apparatus and method of recovering high pixel image
EP1874034A3 (en) * 2006-06-26 2011-12-21 Samsung Electro-Mechanics Co., Ltd. Apparatus and method of recovering high pixel image
WO2008112053A3 (en) * 2007-03-09 2008-11-13 Eastman Kodak Co Operating dual lens cameras to augment images
US7859588B2 (en) 2007-03-09 2010-12-28 Eastman Kodak Company Method and apparatus for operating a dual lens camera to augment an image
CN102348093A (en) * 2011-08-23 2012-02-08 太原理工大学 Intelligent base of Android mobile phone for video chat
TWI781085B (en) * 2015-11-24 2022-10-21 日商索尼半導體解決方案公司 Fly-eye lens module and fly-eye camera module
JP2018119856A (en) * 2017-01-25 2018-08-02 京セラ株式会社 Imaging member and imaging device

Also Published As

Publication number Publication date
AU2003285380A1 (en) 2005-06-29

Similar Documents

Publication Publication Date Title
US20070177004A1 (en) Image creating method and imaging device
USRE47458E1 (en) Pattern conversion for interpolation
US9077886B2 (en) Image pickup apparatus and image processing apparatus
CN102783135B (en) Utilize the method and apparatus that low-resolution image provides high-definition picture
US7688368B2 (en) Image sensor with improved light sensitivity
TWI488144B (en) Method for using low resolution images and at least one high resolution image of a scene captured by the same image capture device to provide an imoroved high resolution image
TWI428006B (en) Method of processing array of pixels and processing images
TWI462055B (en) Cfa image with synthetic panchromatic image
TWI495336B (en) Producing full-color image using cfa image
JP5825817B2 (en) Solid-state imaging device and imaging apparatus
EP2664153B1 (en) Imaging system using a lens unit with longitudinal chromatic aberrations and method of operating
US6813046B1 (en) Method and apparatus for exposure control for a sparsely sampled extended dynamic range image sensing device
US20050128509A1 (en) Image creating method and imaging device
EP1458183A2 (en) Camera using a beam splitter with micro-lens array for image amplification
US10630920B2 (en) Image processing apparatus
US20090278966A1 (en) Image sensor and imaging apparatus
US20100253833A1 (en) Exposing pixel groups in producing digital images
US20090051984A1 (en) Image sensor having checkerboard pattern
CN101449575A (en) Image sensor with improved light sensitivity
US20210019899A1 (en) Imaging device, distance measurement method, distance measurement program, and recording medium
WO2005057278A1 (en) Method and device for capturing multiple images
JP4649734B2 (en) Video signal processing apparatus and recording medium on which video signal processing program is recorded
Allen et al. Digital cameras and scanners
JP2002185976A (en) Video signal processor and recording medium with video signal processing program recorded thereon

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): BW GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW BW GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP