WO2007101887A2 - Method of manufacturing a picture and image taking apparatus with enhanced imaging capabilities - Google Patents

Method of manufacturing a picture and image taking apparatus with enhanced imaging capabilities

Info

Publication number
WO2007101887A2
WO2007101887A2 (PCT/EP2007/055338)
Authority
WO
WIPO (PCT)
Prior art keywords
image
image taking
imaging parameter
scene
obtaining
Prior art date
Application number
PCT/EP2007/055338
Other languages
French (fr)
Other versions
WO2007101887A3 (en)
Inventor
Alain Wacker
Original Assignee
Sinar Ag
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sinar Ag filed Critical Sinar Ag
Priority to PCT/EP2007/055338 priority Critical patent/WO2007101887A2/en
Publication of WO2007101887A2 publication Critical patent/WO2007101887A2/en
Publication of WO2007101887A3 publication Critical patent/WO2007101887A3/en
Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50 Constructional details
    • H04N23/55 Optical parts specially adapted for electronic image sensors; Mounting thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/67 Focus control based on electronic image sensor signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/743 Bracketing, i.e. taking a series of images with varying exposure conditions
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00 Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28 Systems for automatic generation of focusing signals
    • G02B7/34 Systems for automatic generation of focusing signals using different areas in a pupil plane

Definitions

  • the invention relates to the field of photography, and in particular to an image taking apparatus and a control unit for an image taking apparatus and to a method of manufacturing a photograph and to a use of said method.
  • the usual way to obtain a picture of a scene by means of a camera is to select appropriate imaging parameters like the focus setting, the zoom setting, the aperture opening (aperture number) and the exposure time and then expose an image taking element accordingly.
  • Said image taking element can be a photographic film or, nowadays, rather a photoelectric chip (typically a CCD or a CMOS chip), which has a large number (today typically at least several millions) of pixels (picture elements). Since it is sometimes difficult to precisely determine the appropriate imaging parameters in advance, some cameras provide for a so-called “bracketing” or “exposure bracketing”, which makes it possible to obtain a number of (typically three) images, each taken at a different aperture opening and/or exposure time.
  • Upon pressing the release button once, e.g., firstly one picture at a presumed optimal exposure time is taken, then one picture at twice said presumed optimal exposure time is taken, and finally, one picture at half said presumed optimal exposure time is taken. Afterwards, the best picture can be selected from these three.
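The bracketing series described above can be sketched as follows (a minimal illustration; the function name and the 1x/2x/0.5x factor order follow the example in the text):

```python
def bracket_exposures(optimal_s, factors=(1.0, 2.0, 0.5)):
    """Exposure times used for an exposure-bracketing series.

    One picture is taken at the presumed optimal exposure time, one at
    twice that time, and one at half that time, all from a single press
    of the release button.
    """
    return [optimal_s * f for f in factors]


# A presumed optimal exposure of 8 ms yields the series 8, 16, 4 ms.
series = bracket_exposures(8.0)
```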
  • an optical detection system which makes it possible to minimize undesired effects of aberration in image taking by using a spatial light modulator embodied as a deformable mirror comprised of a multitude of deformable micro-mirrors.
  • an image pickup apparatus specifically designed for the purpose of picking up an entirety of a flat surface of a white board in-focus is disclosed, i.e. an image pickup apparatus for a particular kind of oblique photography.
  • Such an image pickup apparatus is installed with an inclination angle θ between a sensing surface of the image pickup apparatus and the surface of the white board. Said angle θ is adjustable, and dependent on said angle θ, a mirror is rotated, a taking lens is moved and a CCD line image sensor is moved, so as to obtain line-by-line a full image of said white board surface, which is fully in-focus, wherein light from said white board surface is reflected by said mirror, then runs through said taking lens and is then detected by said line sensor.
  • an object of the invention is to create an image taking apparatus, a control module for an image taking apparatus and a method of manufacturing a picture and a use of said method, which are new and provide for useful degrees of freedom in creating a picture of a scene. Another object of the invention is to provide for enhanced imaging capabilities. Another object of the invention is to provide a photographer with enhanced creative possibilities for creating photographs.
  • Another object of the invention is to provide for enhanced ways of creating high-quality images in general-purpose photography.
  • These objects are achieved by an image taking apparatus, a control module for an image taking apparatus, a control unit for an image taking apparatus, a method of manufacturing a picture of a scene and a use of said method according to the patent claims.
  • the use according to the invention is a use of the method according to the invention for intentionally causing said picture to comprise portions which are deliberately out-of-focus.
  • the control module is for an image taking apparatus which comprises an image taking element and an image-forming optical system comprising at least one lens, and is adapted to enable
  • the control unit for an image taking apparatus is comprised in and/or connectable to said image taking apparatus and comprises a control module according to the invention.
  • the image taking apparatus comprises a control module according to the invention.
  • According to the invention it is possible to obtain pictures, in particular still images, throughout which at least one imaging parameter is varied. I.e., different parts of the picture are obtained from exposures with (at least partially) different imaging parameters such as focus settings, soft focus settings and zoom settings. Pictures being partially in-focus and partially deliberately out-of-focus can be obtained in a well-defined way, which provides for great creative possibilities in creating photographs.
  • the method can make it possible to obtain pictures of pre-selectable distributions of sharpness and blur across the picture. It is possible to achieve the effect of a view camera with tilted lenses without tilting lenses, and still further creative effects.
  • An advantage of the invention can be that it can be realized with photographic cameras which are, from an optical point of view, identical to known cameras; however, the way the exposure and said at least one imaging parameter are controlled is different from what is known from conventional photographic cameras.
  • In step a1), usually the image taking element will be used for successively obtaining said image components.
  • Said relative movement put forward in step b1) causes, at least in part, said pre-defined variation of said at least one imaging parameter put forward in step b).
  • said at least one imaging parameter is, at least in part, different for different image components.
  • said pre-defined variation of said at least one imaging parameter is carried out such as to achieve that, at least for a portion of said image components of said set, different image components of said portion are obtained using different settings of said at least one imaging parameter.
  • said variation of said at least one imaging parameter is a pre-defined variation. Therefore, for each image component, the setting of said at least one imaging parameter to be used during obtaining the respective image component is prescribed (typically by the photographer) before the respective image component is obtained, in particular, even before a first image component of said set is obtained.
  • said at least one imaging parameter is set to a pre-defined initial setting while the first image component of said set of image components is obtained, and said at least one imaging parameter is set to a pre-defined final setting while the last image component of said set of image components is obtained.
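The pre-defined initial-to-final variation described above can be sketched as a per-component schedule fixed before the first image component is obtained (linear interpolation between the initial and the final setting is an illustrative assumption; any pre-defined curve would satisfy the text):

```python
def parameter_schedule(initial, final, n_components):
    """Pre-defined settings of one imaging parameter, one per image component.

    The first component uses the pre-defined initial setting, the last one
    the pre-defined final setting; intermediate components are linearly
    interpolated here (an illustrative choice, not mandated by the text).
    """
    if n_components == 1:
        return [initial]
    step = (final - initial) / (n_components - 1)
    return [initial + i * step for i in range(n_components)]


# A focus setting swept over 5 image components, fixed in advance.
schedule = parameter_schedule(1.0, 2.0, 5)
```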
  • said pre-defined variation is clearly distinguished from such variations of said at least one imaging parameter which are carried out irrespective of which image component is currently obtained, as is done, e.g., for camera-shake compensation.
  • a camera-shake compensation is accomplished by detecting a change of the position of the optical detection system and compensating for a shift in focal point caused by the detected position change.
  • said method comprises the step of e) defining said pre-defined variation of said at least one imaging parameter.
  • Step e) is typically carried out before carrying out steps a) and b).
  • step e) is typically carried out by the photographer.
  • Said fractions of said scene to be captured in said image components mentioned in step a2) are usually pre-defined, too:
  • the fraction of said scene to be captured in the respective image component is prescribed before the respective image component is obtained, in particular, before a first image component of said set is obtained.
  • the method comprises the step of adjusting at least one control parameter which influences shape and/or arrangement of all those portions of said image taking element, each of which is exposed for obtaining one respective image component.
  • said fractions of said scene captured in each of said image components substantially do not overlap, or do not overlap at all.
  • said method comprises the step of switching from a normal mode of said image taking apparatus, in which no pre-defined variation of said at least one imaging parameter and/or no movement of said at least one lens can be carried out during exposure, to another mode of said image taking apparatus, in which steps a) and b) can be carried out.
  • Said normal mode can be the mode, in which standard photo cameras work, i.e. in which imaging parameters (and lenses) remain fixed during exposing an image taking element.
  • Said other mode is the mode according to the invention.
  • said picture is composed of a multitude of picture constituents derived from said set of image components and said method comprises the steps of deriving said picture constituents from said set of image components and deriving said picture from said picture constituents.
  • said picture is composed of picture constituents which are substantially identical with said image components. This is usually preferred in case of photochemical converters as image taking elements.
  • both a focus setting and a magnification and/or zoom setting are varied, wherein the variation of said magnification and/or zoom setting is chosen such that it compensates for changes in magnification caused by said focus setting variation.
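The magnification compensation mentioned above can be illustrated with a thin-lens model (an assumption; the text does not prescribe a lens model). From the thin-lens equation 1/f = 1/u + 1/v together with v = m·u it follows that f = m·u/(m + 1), so the focal length needed to keep the magnification m constant while the in-focus object distance u is varied is:

```python
def focal_length_for_constant_magnification(object_distance, magnification):
    """Thin-lens focal length keeping |magnification| fixed as the in-focus
    object distance is varied (thin-lens model is an assumption).

    Derivation: 1/f = 1/u + 1/v with v = m*u  =>  f = m*u / (m + 1).
    """
    m, u = magnification, object_distance
    return m * u / (m + 1.0)


# 1:1 magnification at object distance 2 requires f = 1 (same units).
f = focal_length_for_constant_magnification(2.0, 1.0)
```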
  • Said deriving of said picture constituents and/or of said picture may in full or in part take place in the image taking apparatus and in full or in part in a computer connectable to the image taking apparatus or in a computer separate therefrom.
  • Said image components can also be referred to as "raw image components” in the sense that they are usually subject to some further processing in order to obtain said picture constituents.
  • data representing said image components can be what generally is referred to as “raw data” in digital photography, yet it is also possible to have processed, e.g., compressed, data representing said image components.
  • said method comprises the step of d) deliberately creating out-of-focus portions of said scene in said picture by carrying out step b) .
  • said picture comprises in-focus portions and deliberately out-of-focus portions because of carrying out step b) .
  • There are varying degrees of sharpness (and blur) across said picture because of carrying out step b).
  • step a) comprises the step of a3) exposing said image taking element with light from said scene in full by successively exposing different portions of said image taking element with light from different parts of said scene.
  • said image taking element which in this case typically is a two-dimensional converter such as a sheet of photographic film or a two-dimensional CCD or CMOS chip, is subject to exposure on a portion-by-portion basis, until light from said scene in full is captured, which typically is the case when the whole image taking element has been exposed.
  • different portions of said image taking element are exposed at different times.
  • said method comprises the step of a31) using a sensing area definition element arranged within said image taking apparatus between said scene and said image taking element for defining said different portions of said image taking element to be exposed.
  • said sensing area definition element is arranged close to said image taking element, and it usually comprises an opaque portion for blocking light from travelling to said image taking element. It usually comprises, in addition, a transparent portion, which allows light to travel to said image taking element, so as to define said different portions of said image taking element to be exposed. Typically, on its way to said image taking element, the light travels through said transparent portion substantially without changing its direction and/or substantially unperturbed.
  • said sensing area definition element can comprise a transparent portion, which is confined, fully or in part, by said opaque portion.
  • the sensing area definition element can, e.g., comprise a liquid crystal element such as a liquid crystal material between glass substrates having electrodes to selectively create transparent and light-blocking areas of the liquid crystal element. If many electrodes are provided, a great flexibility for defining said image components can be provided this way. Nevertheless, said transparent portion is preferably an opening, since this provides a great optical quality, in particular compared to providing glass substrates in the light path used for exposing said image taking element. Accordingly, in one embodiment, said sensing area definition element is capable of forming an opening, and said step a31) comprises the step of moving said opening (with respect to said image taking element) of said sensing area definition element for defining said different portions of said image taking element to be exposed.
  • a slit-shaped opening can be moved (continuously or quasi-continuously or step-wise) in the light path; e.g., shutter curtains as usually used for controlling the exposure in photographic cameras can be used as sensing area definition elements. It is also possible to use a pin diaphragm.
  • step a3) can be accomplished by successively bringing different groups of photosensitive members of said multitude of photosensitive members into a suitable photosensitive state and back into a photo-insensitive state.
  • the image taking element may be constantly illuminated, and the portion of the image taking element to be exposed for obtaining a single image component can easily be chosen by a control signal switching the photo- sensitivity of said groups of photosensitive members. And even the times at which and the time during which an exposure for a single image component takes place may readily be defined by said control signal.
  • an electronic shutter may be realized in the image taking element itself. This can render a sensing area definition element superfluous.
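Such an electronic shutter can be modelled as a per-row sensitivity mask in which only the group of photosensitive members currently being exposed is switched into its photosensitive state (the row-group representation is an illustrative assumption):

```python
def sensitivity_mask(n_rows, group, group_size):
    """Boolean photo-sensitivity mask for one exposure step.

    Only the rows belonging to the selected group are photosensitive;
    all other rows remain photo-insensitive, so the constantly illuminated
    image taking element records only one image component at a time.
    """
    start = group * group_size
    return [start <= r < start + group_size for r in range(n_rows)]


# Step 1 of an 8-row sensor with 2-row groups: rows 2 and 3 are sensitive.
mask = sensitivity_mask(8, 1, 2)
```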
  • step a) may comprise the step of a4) successively obtaining the image components of said set by multiply exposing said image taking element, each time with light from a different part of said scene.
  • step a4) comprises the step of a41) moving said image taking element for accomplishing said exposing of said image taking element with light from said different parts of said scene.
  • Said moving said image taking element can be a continuous or quasi-continuous or step-wise moving.
  • a drive functionally connected to said image taking element can be used for accomplishing said movement.
  • said method comprises the step of c1) adjusting at least one control parameter on which said pre-defined variation of said at least one imaging parameter depends and which does not define and/or does not describe and/or is unrelated to an angle between an axis defined by said image taking element and an axis defined by an object to be imaged.
  • said method comprises the step of c2) adjusting at least one control parameter on which said pre-defined variation of said at least one imaging parameter depends and which does not describe and/or is unrelated to an alignment of said image taking apparatus relative to an object to be imaged.
  • said method comprises the step of c3) adjusting at least one control parameter, on which said pre-defined variation of said at least one imaging parameter depends, in a way which is not predetermined by an alignment of an axis defined by said image taking element relative to an axis defined by an object to be imaged.
  • said method comprises the step of c4) adjusting at least one control parameter, on which said pre-defined variation of said at least one imaging parameter depends, in a way which is not predetermined by an alignment of said image taking apparatus relative to said scene.
  • said method comprises the step of c5) adjusting at least one control parameter on which said pre-defined variation of said at least one imaging parameter depends and which does not define and/or does not describe and/or is unrelated to a distance between said image taking element and an object to be imaged.
  • Steps c1), c2), c3), c4) and c5), respectively, are typically carried out by the photographer, and typically before carrying out steps a) and b).
  • The control parameter mentioned in one of steps c1), c2), c3), c4) and c5) may be identical with the control parameter mentioned in another one or more of these steps.
  • a control parameter, on which said predefined variation of said at least one imaging parameter depends, is adjusted for creatively designing said picture of said scene.
  • said at least one imaging parameter, which is varied in a pre-defined way in step b), depends on said relative movement mentioned in step b1).
  • said pre-defined variation of said at least one imaging parameter leaves the signal strength generated by said image taking element substantially unchanged.
  • said at least one imaging parameter is a parameter which leaves the amount of light to which said image taking element is exposed during obtaining said image components substantially unchanged. In one embodiment, said at least one imaging parameter is a parameter which influences light paths relevant for said obtaining said image components.
  • said at least one lens is comprised in a focussing section of said image taking apparatus, and said at least one imaging parameter is a parameter of said focussing section.
  • said at least one imaging parameter comprises a focus setting.
  • Variation of the focus setting makes it possible to place the location of maximum sharpness at different distances for different places in the picture (different parts of the scene).
  • the plane of maximum sharpness may be angled arbitrarily, and in principle, even an arbitrarily shaped surface of maximum sharpness may be realized.
  • said method comprises the step of f) creating a bent focal surface by carrying out step b) .
  • the focal surface is the surface constituted by points which are imaged in-focus.
  • a bent focal surface is a not-flat focal surface.
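The tilted or bent surfaces of maximum sharpness described above can be prescribed as one in-focus distance per strip-shaped image component (the linear-plus-parabolic parameterization below is an illustrative assumption, not taken from the text):

```python
def focus_distance_per_row(n_rows, near, far, bend=0.0):
    """In-focus object distance prescribed for each image-component row.

    bend == 0 gives an arbitrarily angled (planar) surface of maximum
    sharpness; bend != 0 adds a parabolic term, i.e. a bent focal surface
    (n_rows >= 2 assumed).
    """
    out = []
    for r in range(n_rows):
        t = r / (n_rows - 1)
        out.append(near + (far - near) * t + bend * t * (1 - t))
    return out


# A focal plane tilted from 1 m at the top row to 3 m at the bottom row.
tilted = focus_distance_per_row(5, 1.0, 3.0)
# The same plane bowed outwards by up to 0.25 m at its centre.
bent = focus_distance_per_row(5, 1.0, 3.0, bend=1.0)
```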
  • the light used for obtaining the respective image component travels within said image taking apparatus along a light path to said image taking element without being reflected by a mirror.
  • light paths are used, which are, within said image taking apparatus, free from mirrors.
  • mirror-free embodiments have the advantage that they allow for highest-quality imaging, since mirrors are optical elements which may have imperfections, may require adjusting and maintenance and may generate failures.
  • step b1) comprises the step of b11) moving said at least one lens during step a).
  • This moving of said at least one lens typically takes place along the optical axis of the image taking apparatus.
  • said image taking apparatus comprises a drive for moving said at least one lens, and step b11) is accomplished using said drive.
  • This drive may be identical with a drive used for autofocus operations in said image taking apparatus.
  • the method comprises the step of g) moving said image taking element during step a) .
  • This moving can be a moving contributing to or even embodying said relative movement mentioned in step b1), in which case it typically takes place along the optical axis of the image taking apparatus or at least comprises a movement component along the optical axis of the image taking apparatus.
  • the moving of said image taking element mentioned in step g) comprises a tilting movement, wherein said tilting may be a tilting about an arbitrary axis, e.g., a horizontally aligned axis or a vertically aligned axis.
  • the image taking element is aligned such that the line (in case of a line scanner) or the surface (in case of a two-dimensional photosensitive element such as photographic film or an imaging photoelectric detector) defined by it lies in a plane perpendicular to the optical axis of the image taking apparatus, typically in the image plane.
  • said tilting movement typically comprises a movement component along the optical axis of the image taking apparatus, but is not a movement solely along the optical axis of the image taking apparatus. If a movement according to step g) is carried out, in particular a tilting movement, while also a sensing area definition element is used for defining said different portions of said image taking element to be exposed (cf. step a31), it can be advantageous to move said sensing area definition element simultaneously with and in the same manner as said image taking element, e.g., for keeping a distance between said sensing area definition element and said image taking element constant and/or for ensuring that, for all image components, the length of the light path relevant for obtaining the respective image component from said sensing area definition element to said image taking element is of the same magnitude.
  • said image taking apparatus is a general-purpose photographic camera.
  • a general-purpose photographic camera is intended for use for various types of photography, not limited to only one type of photography such as taking pictures of white boards only.
  • said image taking apparatus is a camera for hand-held use and/or for use mounted on a camera stand or tripod.
  • said set of image components is obtained automatically.
  • said settings of said at least one imaging parameter are varied automatically.
  • said pre-defined variation of said at least one imaging parameter mentioned in step b) is carried out by varying settings of said at least one imaging parameter in a continuous or quasi-continuous or step-wise manner.
  • Continuous variations of said at least one imaging parameter, of said relative movement mentioned in step b1), of said moving of said image taking element and/or of said lens and/or of said transparent portion of said sensing area definition element are possible, e.g., by using an analogue control and an analogue motor for continuously varying, e.g., a focus setting, a lens position, an image taking element position, or an aperture position.
  • Quasi-continuous variations are possible with a drive and a digital control (having a reasonable resolution) .
  • Step-wise variations are possible by various means.
  • said image taking element is a photoelectric converter, in particular a CMOS chip or a CCD chip or a line scanner.
  • said image taking element is an imaging photochemical converter, in particular photographic film.
  • said method comprises the step of storing data representative of said set of image components in a storage unit, in particular in at least one of the group comprising
  • a data carrier, in particular a magnetic or optical or electrical data carrier, in particular a removable data carrier.
  • the "data" of the image components are "stored” in the image taking element, typically a photographic film.
  • said method comprises the step of using line-shaped image components with a small width, wherein the line may be curved or straight, continuous or discontinuous.
  • the width of the line would, for high resolution, be only one pixel, or maybe two or three pixels; in case of a color-sensitive chip, one pixel would preferably be considered to comprise a couple of photosensitive members, accounting for the different colors, e.g., one for red, two for green, one for blue.
  • the number of image components in said set will in many cases be several hundreds to several thousands.
  • image components with a larger width can be used for faster image-taking, usually at the expense of resolution; if, however, the variation of said imaging parameter comprises only a relatively small number of steps, there may be no loss of resolution.
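The relationship between line width and the number of image components stated above is simple ceiling division; for a sensor with a few thousand rows and a one-pixel line width this indeed yields several thousand components:

```python
def n_line_components(sensor_rows, line_width_px):
    """Number of line-shaped image components needed to cover the sensor.

    Ceiling division: a partially covered last line still needs its own
    image component (the row count used below is an illustrative assumption).
    """
    return -(-sensor_rows // line_width_px)


# One-pixel lines on a 4000-row sensor: 4000 image components.
fine = n_line_components(4000, 1)
# Three-pixel lines trade resolution for fewer, faster exposures.
coarse = n_line_components(4000, 3)
```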
  • the image taking apparatus comprises at least one of
  • an image-forming optical system, in particular a detachable image-forming optical system
  • an exposure time definition unit
  • an image taking module comprising an image taking element, in particular a detachable image taking module.
  • It may be a camera system, in particular a photographic camera, a general-purpose photographic camera, a still image camera, more particularly a single-lens reflex camera.
  • said image taking apparatus is free from mirrors, which are arranged, for at least one of said image components, along a light path along which light used for obtaining said at least one image component travels within said image taking apparatus to said image taking element during obtaining said at least one image component.
  • said image taking apparatus comprises at least one of the group comprising
  • a first storage unit for storing data representative of said set of image components
  • a second storage unit which may be identical with or different from said first storage unit, for storing data representative of said pre-defined variation of said at least one imaging parameter
  • a third storage unit which may be identical with or different from said first and/or second storage units, for storing data relating each of said image components of said set to said settings of said at least one imaging parameter used during obtaining the respective image component.
  • Said data can be digital data.
  • said image taking apparatus comprises, as image taking element, a line scanner and a drive functionally connected to said line scanner, wherein said control module is adapted to controlling said drive such that said line scanner is moved during obtaining said set of image components, so as to define for each of said image components the corresponding fractions of said scene.
  • said image taking apparatus comprises an exposure time definition unit.
  • By means of an exposure time definition unit it is defined when and for how long an exposure of the image taking element takes place, or, more precisely, when and for how long photons are collected by the image taking element.
  • exposure time definition units can be, e.g., shutters (shutter curtains) and/or apertures. It is possible that an image taking element itself implements an exposure time definition unit, e.g., if the photo-sensitivity of the image taking element (or portions thereof) can be switched on and off. It can be useful to provide an exposure time definition unit in addition to a sensing area definition element.
  • the methods according to the invention may also be considered methods of operating an image taking apparatus, in particular, methods of operating an image taking apparatus for obtaining one final image, which one final image is composed of a multitude of picture constituents derived from a set of image components.
  • Said picture (or final image) of said scene can be considered a mosaic-like composition of the picture constituents.
  • said set of image components may be interpreted as forming a mosaic-like pattern from all its image components, wherein each image component corresponds to one mosaic-piece-like fraction of said scene.
  • the image components may be considered partial images or mosaic image fractions.
  • the mosaic-like arrangement of picture components represented by the final image is usually identical with the mosaic-like arrangement that can meaningfully be formed from the set of image components, or is at least obtainable therefrom by relatively simple geometric transformations, which usually do not alter the neighboring relationships of the mosaic parts.
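The mosaic-like composition of the final image from its picture constituents can be sketched as follows (the strip-list representation of non-overlapping constituents is an illustrative assumption):

```python
def assemble_mosaic(components):
    """Compose the final picture from non-overlapping strip-shaped
    picture constituents, preserving their neighboring relationships.

    `components` is a list of (row_offset, strip) pairs, each strip a list
    of pixel rows; this representation is an illustrative assumption.
    """
    rows = {}
    for offset, strip in components:
        for i, row in enumerate(strip):
            rows[offset + i] = row
    # Reassemble rows in spatial order, independent of capture order.
    return [rows[r] for r in sorted(rows)]


# Two strips captured out of order still assemble into one coherent image.
picture = assemble_mosaic([(2, [[5], [6]]), (0, [[1], [2]])])
```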
  • the scene and its illumination shall remain unchanged during a time span Δt, during which all image components of the set are derived, and also the image taking apparatus should not be moved within that time.
  • the advantages of the methods correspond to the advantages of corresponding apparatuses.
  • Fig. 1 an illustration of a simple embodiment with focus variation
  • Fig. 2 an illustration of a simple embodiment with focus variation
  • Fig. 3 an image taking apparatus with focus variation
  • Fig. 4 an illustration of a simple embodiment with "electronic shutter"
  • Fig. 5 an illustration of a camera system with computer and storage unit
  • Fig. 6 an illustration of a simple embodiment with sensing area definition element
  • Fig. 7 an illustration of a simple embodiment with a line scanner
  • Fig. 8 an illustration of image components and corresponding imaging parameter settings over time
  • Fig. 9 an illustration of image components, picture constituents and final image
  • Fig. 10 an illustration of a mosaic pattern of picture constituents
  • Fig. 11 an illustration of a mosaic pattern of picture constituents
  • Fig. 12 an illustration of exposures and imaging parameter settings over time
  • Fig. 13 an illustration of exposures and imaging parameter settings over time
  • Fig. 14 an illustration of exposures and imaging parameter settings over time
  • Fig. 15 an illustration of exposures and imaging parameter settings over time
  • Fig. 16 an image taking apparatus with focus and aperture opening variation
  • Fig. 17 an illustration of a simple embodiment with focus variation and image taking element tilting, in a side view
  • Fig. 18 an illustration of the simple embodiment with focus variation and image taking element tilting of Fig. 17, in a top view;
  • Fig. 19 an illustration of a simple embodiment with focus variation, image taking element tilting and sensing area definition element tilting, in a side view;
  • Fig. 20 an illustration of the simple embodiment with focus variation, image taking element tilting and sensing area definition element tilting of Fig. 19, in a top view.
  • Fig. 1 is an illustration of a simple embodiment, in which an imaging parameter, namely the focus setting is varied.
  • a scene 99 (or an object 99) is imaged onto an image taking element 60 by means of an image-forming optical system 20, which is drawn as a single lens, which at the same time is a focussing section 29.
  • a control module 4 can control a drive 28 (dash-dotted lines indicate functional connections) such that the focus setting is changed, illustrated by the dotted lenses connected by the dotted arrows.
  • as the focussing section 29 is moved step-wise from an initial position at time t0 to a final position at time tf, via several intermediate steps (one of which is drawn at tn), the location of maximum sharpness of the image will also move, as illustrated by the small image sections labelled with the corresponding points in time (t0, tn, tf).
  • Said control module 4 controls said image taking module 60 such, that at different times ti different portions of the image taking module 60 capture a partial image representing only a fraction of said scene 99.
  • such a partial image is referred to as an image component Ri (indices of image components corresponding to indices of points in time).
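The capture scheme of Fig. 1 can be sketched in code. The following is a minimal simulation, not the patent's implementation: it assumes stripe-shaped sensor portions and a hypothetical stack of full sensor read-outs, one per focus step ti, from which each image component Ri keeps only the stripe that is active at that step:

```python
import numpy as np

def capture_image_components(frames, n_components):
    """Simulate obtaining stripe-shaped image components R0..Rf.

    `frames` is a hypothetical stack of full sensor read-outs, one per
    imaging-parameter setting ti (shape: n_components x H x W).  At each
    step ti only the i-th horizontal stripe of the sensor is captured,
    so each image component Ri holds a different fraction of the scene,
    taken at a different setting of the imaging parameter."""
    n, h, w = frames.shape
    assert n == n_components
    stripe_h = h // n_components
    components = []
    for i in range(n_components):
        top = i * stripe_h
        bottom = h if i == n_components - 1 else top + stripe_h
        components.append((top, bottom, frames[i, top:bottom, :]))
    return components
```

Together, the stripes cover the scene in full, while each stripe was exposed under a different focus setting.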
  • Fig. 2 is an illustration of the same simple embodiment as in Fig. 1. This time, the object 99 is tilted.
  • by varying the focus settings properly and controlling the capturing of the image components Ri accordingly, it is possible to obtain a final image (picture) of the (tilted) surface of the object 99, which shows the surface in maximum sharpness throughout the whole picture.
  • the maximum sharpness of the image of the surface is achieved in the image plane 86 of the image taking element 60.
  • control module 4 will usually be programmed by the photographer before time t0, i.e., before starting to capture said image components R0...Rf, so that the described focus variation is a pre-defined variation of at least one imaging parameter.
  • Fig. 3 shows schematically an image taking apparatus 1 with focus variation, similar to the embodiment of Figs. 1 and 2.
  • the image taking apparatus, e.g., a camera, comprises an image taking module 6, e.g., a digital back, which comprises the image taking element 60, e.g., a CCD chip or a CMOS chip.
  • Fig. 3 further illustrates that it is possible to define that area of image taking element 60, which is to be exposed for obtaining a certain image component, by means of a sensing area definition element 58, which is separate from image taking element 60.
  • a sensing area definition element 58 is usually located close to image taking element 60.
  • Sensing area definition element 58 comprises a transparent portion 580 which lets light pass and an opaque portion which blocks light from travelling further.
  • the sensing area definition element 58 of the embodiment of Fig. 3 can, e.g., be a liquid crystal element with several or better with a multitude of electrode pairs, which pairs are arranged, e.g., in a regular array.
  • the shape of an image component Rn to be captured (and, therewith, of a fraction of scene 99 to be captured in said image component) can then easily be defined by selecting one or more of said electrode pairs and not applying a voltage to the selected electrode pair(s) while applying a voltage to the other electrode pairs.
  • the application of a voltage will provoke an ordered arrangement of liquid crystals between the corresponding electrode pairs, which causes intransparency.
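The voltage-controlled masking described above can be sketched as a transmission mask over the electrode array. This is an illustrative model only; the array granularity, names and API are assumptions, not from the patent:

```python
import numpy as np

def lcd_mask(shape, transparent_cells):
    """Transmission mask for an LCD sensing area definition element 58.

    Applying a voltage to an electrode pair orders the liquid crystals
    and makes that cell opaque (0); the selected, voltage-free cells
    form the transparent portion 580 (1), whose shape defines which
    fraction of the scene is captured in image component Rn."""
    mask = np.zeros(shape, dtype=np.uint8)   # voltage applied everywhere: opaque
    for r, c in transparent_cells:
        mask[r, c] = 1                       # no voltage: transparent
    return mask
```

For stripe-shaped image components, the transparent cells would simply be one row of the array per capture step.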
  • the data generated by image taking element 60 are stored in a storage unit 10, which may be separate from the camera or comprised therein.
  • the control unit 4 ensures that exposure takes place at appropriate times ti and at appropriate portions of image taking element 60.
  • Figs. 4, 6 and 7 illustrate some such ways, wherein a moving lens is drawn at an intermediate time tn and, dashed, at an initial time t0 and at a final time tf, for illustrating the imaging parameter variation (e.g., focus setting, zoom setting, soft focus setting).
  • Fig. 4 illustrates a simple embodiment with an "electronic shutter".
  • a CMOS chip 60c (image taking element) is controlled by control unit 4 such that, as a function of time, different groups of pixels (photosensitive members 65) are in action, i.e., are in a photosensitive state, while the rest of the photosensitive members 65 are photo-insensitive, i.e., they do not register photons impinging on them.
  • each row of pixels is switched active in turn, from top to bottom of the CMOS chip 60c, which altogether takes a time Δt.
  • the imaging parameter is varied.
  • the resulting image components R0, ..., Rn, ..., Rf are stored in a storage unit 10, e.g., after all rows have been active (all data of the image components are collected), or image component by image component, i.e., data are read out into storage unit 10 each time one row has been switched back to the inactive state.
  • Fig. 6 illustrates a simple embodiment with a sensing area definition element 58, e.g., embodied as a mechanical shutter.
  • a photochemical converter 60b like a photographic film 60b, may be used; a two-dimensional photoelectric converter could be used, too.
  • the control unit 4 controls (via drive 59) sensing area definition element 58, e.g., shutter curtain 58, which defines, as a function of time, the locations on the photographic film 60b, which shall be exposed to light.
  • the movement of the shutter opening may be step-wise or continuous.
  • the set of image components may be considered to contain an infinite number of image components, or a finite number of image components, with the boundary between one image component and a neighboring image component being selectable rather freely. It is possible to use the finally fully exposed photographic film 60b as the final image, or to subject it to further processing before obtaining the final image. It is, e.g., possible to scan the finally fully exposed photographic film 60b and continue with the so-obtained data. Comparing the embodiment of Fig. 4 with the embodiment of Fig. 6, it is to be noted that the amount of controlling that has to be carried out for obtaining the image components can be considerably smaller in the embodiment of Fig. 6.
  • Fig. 7 is an illustration of a simple embodiment with a line scanner 60a.
  • the line scanner 60a (as image taking element) is moved, so as to be exposed to different parts (or fractions) of the scene to be imaged.
  • a drive for driving the line scanner 60a has not been drawn in Fig. 7.
  • a line scanner 60a is a (usually) linear arrangement of a multitude of photosensitive members (several hundreds, several thousands or more). It may have more than one row in order to be able to obtain color information.
  • the movement may be continuous or (rather) step-wise, wherein data are read out after a certain exposure time. It is possible to provide a shutter as an exposure time definition element in the light path, which prevents light from impinging on the line scanner 60a when no new data shall be obtained, e.g., when data shall be read out.
  • a sensing area definition element such as a mechanical shutter may be used in conjunction with a CCD chip 60d or a CMOS chip 60c.
  • an exposure time definition unit may be provided for generally allowing or prohibiting that the image taking element 60 is exposed to light.
  • a shutter can be used, which is opened before the first image component RO is taken and is closed after the last image component Rf has been taken.
  • At least some embodiments of the invention can be used for obtaining pictures of scenes with moving objects, which are substantially free from distortions caused by said moving of said objects.
  • the example shows stripe-shaped image components Ri, which are arranged in parallel. This can readily be realized in any of the embodiments of Figs. 4, 6 and 7.
  • the imaging parameter in the example of Fig. 8 is varied continuously, in a nonlinear fashion.
  • Fig. 9 illustrates the relation of image components Ri, picture constituents Ci and final image P. If no processing of the image components Ri is necessary, the image components Ri may be identical with the picture constituents Ci.
  • Fig. 9 shows a case in which the (lateral) magnification varies with the variation of the imaging parameter.
  • the dotted lines within the image components Ri indicate which part of each image component will be used in the corresponding picture constituent Ci; the outer part of the image components Ri can be discarded; only image component Rf is used in full.
  • from these parts, the picture constituents Ci are obtained.
  • the full set of picture constituents Ci corresponds to the picture P, which can be represented as data or as a print on (photo) paper.
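The derivation of picture constituents Ci from image components Ri and their assembly into the picture P, as described above, could look roughly as follows. All names, the cropping amounts and the nearest-neighbour rescale are illustrative assumptions, not the patent's processing chain:

```python
import numpy as np

def rescale_width(img, w):
    """Nearest-neighbour width rescale (a stand-in for proper resampling)."""
    idx = (np.arange(w) * img.shape[1] / w).astype(int)
    return img[:, idx]

def assemble_picture(components, crops, out_width):
    """Derive picture constituents Ci from stripe-shaped image components Ri
    by discarding `crops[i]` pixels at the left/right margins (the outer
    part, e.g. where the magnification differed), rescaling each Ci to a
    common width, and stacking the Ci into the final mosaic picture P."""
    constituents = []
    for ri, c in zip(components, crops):
        ci = ri if c == 0 else ri[:, c:-c]
        constituents.append(rescale_width(ci, out_width))
    return np.vstack(constituents)
```

If the magnification does not vary (no cropping needed), the Ci are identical with the Ri and the assembly reduces to a plain stack.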
  • Figs. 10 and 11 are illustrations of example mosaic patterns of picture constituents Ci. This shall illustrate that, in fact, arbitrary mosaic patterns are possible. Only a small number of picture constituents Ci has been drawn in the Figures, whereas there will typically be on the order of 500 to 5000 or even more picture constituents Ci. For naturally-looking pictures, it is advisable to have a large number of picture constituents Ci, i.e., to have a maximum resolution, and to have smooth or no changes from one image component to a neighboring image component.
  • Fig. 10 illustrates that stripes do not need to be parallel to the image frame (they do not even have to be parallel to each other), and that they do not need to have the same width.
  • Fig. 11 shall illustrate that any other mosaic (or puzzle-like) pattern is possible, e.g., a hillock-like one.
  • neighboring image components Ri (which will lead to neighboring picture constituents Ci) will be obtained in succession.
  • Figs. 12 to 15 illustrate some examples of how exposures and imaging parameter settings pi may change over time.
  • Fig. 12 illustrates discrete exposures with the same exposure time Δti for each image component.
  • the imaging parameter is varied quasi-continuously or step-wise. It is possible to make imaging parameter steps only between two exposures, but it is also possible to make the imaging parameter setting steps independent of the timing of the discrete exposures, as shown in Fig. 12.
  • Fig. 13 illustrates, like Fig. 12, discrete exposures, but two imaging parameters, namely the focus setting (ai) and a zoom setting (si), are varied. This may be done for changing the focus while at the same time compensating for the change in magnification due to said change in focus, which could render a processing of so-obtained image components for deriving picture constituents therefrom superfluous.
  • each imaging parameter can, e.g., be varied continuously or quasi-continuously.
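For illustration only, the magnification compensation mentioned for Fig. 13 can be worked out with a thin-lens model (an assumption; the patent prescribes no formula). With object distance u and desired constant magnification m, the image distance must be v = m·u, and 1/f = 1/u + 1/v yields f = m·u / (m + 1):

```python
def focal_length_for_constant_magnification(u, m):
    """Thin-lens sketch (illustrative assumption, not from the patent):
    focal length f needed so that an object at distance u is imaged with
    (absolute) magnification m.  From 1/f = 1/u + 1/v with v = m*u:
        f = m*u / (m + 1)
    Choosing f (the zoom setting si) this way for each focus distance
    keeps the magnification constant across the image components."""
    return m * u / (m + 1.0)
```

In this model, refocusing to a nearer object distance while keeping m fixed calls for a proportionally shorter focal length.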
  • Fig. 14 illustrates discrete exposures with varying exposure times Δti for different image components.
  • the imaging parameter settings (p0...pf) partially vary continuously, but also show a step and a constant region.
  • Fig. 15 illustrates a continuous exposure, as it may be realized, e.g., in an embodiment with a continuously moving opening of a sensing area definition element such as an opening of a shutter (cf. Fig. 6).
  • an imaging parameter (p0...pf), e.g., a soft focus setting, is varied.
  • Fig. 16 shows schematically an image taking apparatus 1.
  • This exemplary apparatus 1 is a modular single-lens reflex camera 1. It has the following parts, which are all (optionally) detachable: a lens module 2, a focussed-state detection module 3, a control unit 40, an optional adapter plate 6', an image taking module 6 and a focussed-state detection module 7.
  • the lens module 2 corresponds to an image-forming optical system 20 comprising a number of lenses 21 and an aperture 22 and possibly a shutter (not shown).
  • a part of the lenses 21 forms a part of a focussing section 29, which also comprises a drive 28 (for focussing) .
  • the drive 28 does not have to be arranged at or within the lens barrel 2.
  • a drive 23 may be provided, which allows to open and close the aperture 22 and adjust its opening.
  • the lens barrel 2 is attached to a focussed-state detection module 3, which in the camera of Fig. 16 is at the same time a mirror module containing a mirror arrangement comprising a main mirror 35 and an auxiliary mirror 36.
  • the focussed-state detection module 7 is embodied as a view finder module 7 attached to the mirror module 3.
  • the focussed-state detection module 7 may, in general, present images, for example, optically or electro-optically.
  • the thick wavy line represents the image of the object 99 in the image plane 87 of focussing screen 70.
  • the camera 1 comprises an autofocus sensor 30.
  • light from the object 99 reaches the autofocus sensor 30 on a light path 9a' through the main mirror 35 and via reflection at the auxiliary mirror 36.
  • an image is formed in an image plane 83 of the autofocus sensor 30.
  • the optical path length from object 99 to the image plane 83 of the autofocus sensor 30 is the same as the optical path length from object 99 to the image plane 87 of the focussing screen 70.
  • the mirror arrangement (main mirror 35 and auxiliary mirror 36) is moved as indicated by the small arrow.
  • this lets the light pass along a light path 9b through the control unit 40, which contains a sensing area definition element 58, e.g., a shutter 58, and a control module 4 (or control circuit) embodied in a microprocessor μP.
  • the shutter 58 and the control module 4 do not necessarily have to be arranged within the control unit 40.
  • the control module 4 may control the drive 28, the aperture 22 (via the drive 23), a mechanism for moving the mirror arrangement (not shown), the shutter 58 and other functions of the image taking apparatus. It may receive input from the autofocus sensor 30.
  • the functional connections of the control module 4 to the various units and elements are not shown in Fig. 16.
  • the light will pass the adapter plate 6' and impinge on an image taking element 60 of the image taking module 6, which is embodied as a digital back 6 with a CCD or CMOS chip 60 and comprises a storage unit 10 (memory).
  • the image plane of the image taking element 60 is labelled 86.
  • at least the following imaging parameters can be varied while obtaining the image components: parameters of the shutter 58 (movement, slit widths), focus settings (via drive 28) and opening of aperture 22 (via drive 23).
  • other imaging parameters and parts or elements of the camera 1 could be controlled by the control module 4 in order to vary these during obtaining the image components.
  • the photographer could select, manually or automatically (autofocus or the like), a setting of the at least one imaging parameter to be varied, e.g., a focus setting and/or a zoom setting. Then, the user marks, on a screen, a point or an area in the scene at which this setting should be used, e.g., in the middle of the top of the screen. Then the same is done for a second setting and a second point or area in the scene, e.g., in the middle of the bottom of the screen. It is possible to input further settings and points / areas.
  • a suitable fitting mode could be selected, which defines the algorithm (fitting procedure for interpolation / extrapolation) to be used for obtaining the mosaic patterns and the imaging parameter variation from the input.
  • one mode could be: finest-resolution, horizontally-oriented stripe-shaped image components from top to bottom, and polynomial interpolation between the imaging parameter settings.
  • the "exposure program” is defined, and upon a start signal, e.g., pressing the release button of the camera, the image components are obtained according to the "exposure program” and stored in a storage unit. This way of defining the "exposure program” (and providing the input for that) could even be realized with the camera alone.
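Such a fitting mode could be sketched as follows. This is a hypothetical helper (names and signature are assumptions) that turns a few user-marked (position, setting) pairs into per-stripe imaging parameter settings p0...pf via polynomial interpolation, as in the mode mentioned above:

```python
import numpy as np

def exposure_program(control_points, n_stripes):
    """Interpolate imaging-parameter settings p0..pf for horizontally
    oriented, stripe-shaped image components from a few user-marked
    (stripe position in [0, 1], parameter setting) pairs, using a
    polynomial through all control points (one possible fitting mode)."""
    pos, val = zip(*sorted(control_points))
    coeffs = np.polyfit(pos, val, len(pos) - 1)   # exact fit through the points
    stripes = np.linspace(0.0, 1.0, n_stripes)
    return np.polyval(coeffs, stripes)
```

With two control points this degenerates to a linear ramp of the imaging parameter from the top stripe to the bottom stripe; additional marked points bend the curve accordingly.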
  • the data from the image taking element 60 could be output on a screen of the camera, and points could be set, e.g., by means of a tracking ball or cursor keys.
  • a software for defining the instructions can be implemented in the camera itself and/or can be run on a (separate) computer connected (at least temporarily) to the camera.
  • Fig. 5 is an illustration of a camera system with computer 100 and storage unit 10.
  • the storage unit 10 may be part of the computer 100 and contain image components and/or picture constituents and/or final images (pictures).
  • a software for defining exposure programs and/or software for obtaining final images (from image components and/or picture constituents) and even software embodying the functions of the control module 4 (for remote-controlling the camera) may be installed and run.
  • Fig. 17 schematically illustrates, in a side view, a simple embodiment with a variation of an imaging parameter and with image taking element tilting.
  • Said imaging parameter can be a focus setting, a zoom setting, a soft focus setting or another imaging parameter. It is symbolized by a moving lens 21 and will usually also comprise a movement of at least one lens 21.
  • a sensing area definition element 58 having a transparent portion 580, and an image taking element 60 are shown at two points in time t1, t2.
  • the arrangement at time t1 is drawn in solid lines, the arrangement at time t2 in thick dotted lines.
  • sensing area definition element 58 can, e.g., be embodied as a shutter curtain.
  • image taking element 60 can be tilted (inclined) by a certain unchanged angle during obtaining the set of image components; e.g., image taking element 60 is brought into such a tilted position shortly before obtaining the set of image components.
  • image taking element 60 is not kept in a fixed position (tilted or not tilted) during obtaining the set of image components, but the tilting is varied (in angle and/or direction) during obtaining said set of image components.
  • the effect of varying the tilting angle during obtaining said set of image components is readily understood: Assuming that a flat surface aligned perpendicularly to optical axis A is to be imaged, the slice-shaped image component R1 obtained at time t1 has, as indicated in the right part of Fig. 17 by dots having the same size, the same degree of sharpness or blur across itself. But the slice-shaped image component R2 obtained at time t2 (with image taking element 60 tilted) has, as indicated in the right part of Fig. 17 by dots of different sizes, a varying degree of sharpness or blur across itself.
  • Fig. 18 shows the embodiment of Fig. 17 in a top view.
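The tilt effect described for Fig. 17 (uniform blur across R1, a blur gradient across R2) follows from simple geometry. The sketch below (names and the flat-surface assumption are mine; blur size itself is not modelled) samples the axial displacement of the tilted sensing surface across one slice-shaped image component:

```python
import math

def stripe_axial_displacement(width, tilt_deg, n_samples=5):
    """Axial displacement of the sensing surface sampled across a
    slice-shaped image component.  Untilted, the displacement is zero
    everywhere (uniform sharpness, like R1); tilted, it varies linearly
    across the slice, giving a sharpness gradient (like R2)."""
    half = width / 2.0
    xs = [-half + i * width / (n_samples - 1) for i in range(n_samples)]
    return [x * math.tan(math.radians(tilt_deg)) for x in xs]
```

The larger the tilt angle, the steeper the displacement ramp, and hence the stronger the sharpness/blur variation within a single image component.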
  • the axis of rotation of image taking element 60 can, as shown in Figs. 17 and 18, run centrally through image taking element 60, but it could also be a different axis, e.g., one running along an edge of image taking element 60.
  • the movements of the image taking element can be accomplished using a drive, e.g., a rotatory or a linear drive, e.g., piezo driven.
  • Figs. 19 and 20 show, in side view and in top view, respectively, an embodiment which is similar to the one shown in Figs. 17 and 18, but along with the image taking element, also the sensing area definition element 58 is tilted.
  • This can be advantageous in particular if - as often will be the case - the sensing area definition element 58 is arranged close to image taking element 60.
  • the sensing area definition element 58 can be separate from image taking element 60 or can be fixed to image taking element 60. In the latter case, the size of sensing area definition element 58 with respect to the size of image taking element 60 has to be chosen sufficiently large, so as to allow for proper exposure of image taking element 60 when tilted, also in the peripheral parts of image taking element 60.
  • the tilting of image taking element 60 can be accomplished and used in the same fashion as discussed in conjunction with Figs. 17 and 18.
  • a tilting of image taking element 60 can also be accomplished when the image taking element is a line sensor (cf., e.g., Fig. 7) .
  • a sensing area definition element can be dispensed with or can be used, e.g., with a transparent portion being moved along the extension of the line sensor.
  • 2 lens module, objective module, lens barrel
  • 10 storage unit
  • 4 control module, control circuit
  • 58 sensing area definition element, aperture element, aperture, slit-shaped aperture, means for defining a portion of the image taking element to be exposed, shutter
  • 580 transparent portion of sensing area definition element, opening of sensing area definition element
  • 83 image plane of focussed-state detection arrangement, image plane of autofocus sensor
  • 86 image plane of image taking element
  • Ci picture constituent
  • P picture, final image
  • pi imaging parameter setting

Abstract

The method of manufacturing a picture (P) of a scene (99) using an image taking apparatus (1) comprising an image taking element (60) and an image-forming optical system (20) comprising at least one lens (21) comprises the steps of a) obtaining a set of image components (Ri), wherein said set of image components (Ri) comprises information on said scene (99) in full; b) carrying out a pre-defined variation of at least one imaging parameter during step a); wherein step a) comprises the steps of a1) obtaining the image components (Ri) of said set one after the other; and a2) capturing a substantially different fraction of said scene (99) in each of said image components (Ri); wherein step b) comprises the step of b1) carrying out a relative movement of said at least one lens (21) with respect to said image taking element (60) during step a), wherein said at least one imaging parameter is - for each of said image components (Ri) - substantially constant during obtaining the respective image component (Ri). Said at least one imaging parameter can be a focus setting, so that the method allows to obtain images of pre-selectable distributions of sharpness and blur.

Description

Method of Manufacturing a Picture and Image Taking Apparatus with Enhanced Imaging Capabilities
Technical Field
The invention relates to the field of photography, and in particular to an image taking apparatus and a control unit for an image taking apparatus and to a method of manufacturing a photograph and to a use of said method.
Background of the Invention
The usual way to obtain a picture of a scene by means of a camera is to select appropriate imaging parameters like the focus setting, the zoom setting, the aperture opening (aperture number) and the exposure time and then expose an image taking element accordingly. Said image taking element can be a photographic film or, nowadays, rather a photoelectric chip (typically a CCD or a CMOS chip) , which has a large number (today typically at least several millions) of pixels (picture elements). Since it is sometimes difficult to precisely determine the appropriate imaging parameters in advance, some cameras provide for a so-called "bracketing" or "exposure bracketing", which allows to obtain a number of (typically three) images, each taken at a different aperture opening and/or exposure time. Upon pressing the release button once, e.g., firstly one picture at a presumed optimal exposure time is taken, then one picture at twice said presumed optimal exposure time is taken, and finally, one picture at half said presumed optimal exposure time is taken. Afterwards, the best picture can be selected from these three.
From DE 10 2004 007 608 A1, a method for obtaining images with infinite, extended or modified depth of field is disclosed. It is suggested to capture a multitude of images of the same scene, each focused differently. From each of these images, a part having a desired amount of sharpness is extracted, and by putting together all these parts, a picture of the full scene is synthesized, which shows sharpness and/or blur as desired. This method has the disadvantage that a large amount of data is gathered, and that selecting and putting together the desired parts from said multitude of images can be cumbersome.
From US 2005/0224695 A1, an optical detection system is known, which allows to minimize undesired effects of aberration in image taking by using a spatial light modulator embodied as a deformable mirror comprised of a multitude of deformable micro-mirrors.
From US 6 535 250 B1, an image pickup apparatus specifically designed for the purpose of picking up an entirety of a flat surface of a white board in-focus is disclosed, i.e., an image pickup apparatus for a particular kind of oblique photography. Such an image pickup apparatus is installed with an inclination angle θ between a sensing surface of the image pickup apparatus and the surface of the white board. Said angle θ is adjustable, and dependent on said angle θ, a mirror is rotated, a taking lens is moved and a CCD line image sensor is moved, so as to obtain line-by-line a full image of said white board surface, which is fully in-focus, wherein light from said white board surface is reflected by said mirror, then runs through said taking lens and is then detected by said line sensor.
It is desirable to provide for an image taking apparatus with enhanced imaging capabilities and a new way of obtaining pictures, which provide useful degrees of freedom in photography.
Summary of the Invention
Therefore, an object of the invention is to create an image taking apparatus, a control module for an image taking apparatus and a method of manufacturing a picture and a use of said method, which are new and provide for useful degrees of freedom in creating a picture of a scene. Another object of the invention is to provide for enhanced imaging capabilities. Another object of the invention is to provide a photographer with enhanced creative possibilities for creating photographs.
Another object of the invention is to provide for enhanced ways of creating high-quality images in general-purpose photography.
These objects can, at least in part, be achieved by an image taking apparatus, a control module for an image taking apparatus, a control unit for an image taking apparatus, a method of manufacturing a picture of a scene and a use of said method according to the patent claims.
The method of manufacturing a picture of a scene using an image taking apparatus comprising an image taking element and an image-forming optical system comprising at least one lens comprises the steps of a) obtaining a set of image components, wherein said set of image components comprises information on said scene in full; b) carrying out a pre-defined variation of at least one imaging parameter during step a); wherein step a) comprises the steps of a1) obtaining the image components of said set one after the other; and a2) capturing a substantially different fraction of said scene in each of said image components; wherein step b) comprises the step of b1) carrying out a relative movement of said at least one lens with respect to said image taking element during step a), wherein said at least one imaging parameter is - for each of said image components - substantially constant during obtaining the respective image component.
The use according to the invention is a use of the method according to the invention for intentionally causing that said picture comprises portions, which are deliberately out-of-focus.
The control module for an image taking apparatus comprises an image taking element and an image-forming optical system comprising at least one lens and is adapted to enabling
— to automatically obtain a set of image components, which set comprises information on a scene in full, said automatically obtaining comprising successively obtaining the image components of said set and capturing a substantially different fraction of said scene in each of said image components; and
— to automatically vary settings of at least one imaging parameter in a pre-defined way during said automatically obtaining said set of image components, said automatically varying settings comprising carrying out a relative movement of said at least one lens with respect to said image taking element during said obtaining said set of image components, wherein said at least one imaging parameter is - for each of said image components - substantially constant during obtaining the respective image component.

The control unit for an image taking apparatus is comprised in and/or connectable to said image taking apparatus and comprises a control module according to the invention.
The image taking apparatus comprises a control module according to the invention.
By means of the invention, it is possible to obtain pictures, in particular still images, throughout which at least one imaging parameter is varied. I.e., different parts of the picture are obtained from exposures with (at least partially) different imaging parameters such as focus settings, soft focus settings and zoom settings. Pictures being partially in-focus and partially deliberately out-of-focus can be obtained in a well-defined way, which provides for great creative possibilities in creating photographs. The method can allow to obtain pictures of pre-selectable distributions of sharpness and blur across the picture. It is possible to achieve the effect of a view camera with tilted lenses without tilting lenses, and still further creative effects. An advantage of the invention can be that it can be realized with photographic cameras, which are, from an optical point of view, identical with known cameras. But the way the exposure and said at least one imaging parameter are controlled is different from what is known from conventional photographic cameras.
In the following, reference is primarily made to methods according to the invention, but all or most of the thoughts can be readily applied to apparatuses according to the invention by analogy. In step a1), usually the image taking element will be used for successively obtaining said image components.
Said relative movement put forward in step b1) causes, at least in part, said pre-defined variation of said at least one imaging parameter put forward in step b). Despite said at least one imaging parameter being substantially constant during obtaining a single image component, as put forward in step b1), said at least one imaging parameter is, at least in part, different for different image components. Referring to step b), said pre-defined variation of said at least one imaging parameter is carried out such as to achieve that, at least for a portion of said image components of said set, different image components of said portion are obtained using different settings of said at least one imaging parameter.
This makes it possible to achieve the above-mentioned creative possibilities in photography.
As put forward in step b), said variation of said at least one imaging parameter is a pre-defined variation. Therefore, for each image component, the setting of said at least one imaging parameter to be used during obtaining the respective image component is prescribed (typically by the photographer) before the respective image component is obtained, in particular, even before a first image component of said set is obtained.
In one embodiment, said at least one imaging parameter is set to a pre-defined initial setting while the first image component of said set of image components is obtained, and said at least one imaging parameter is set to a pre-defined final setting while the last image component of said set of image components is obtained.
Accordingly, said pre-defined variation is clearly distinguished from such variations of said at least one imaging parameter which are carried out irrespective of which image component is currently obtained, as is done, e.g., for camera-shake compensation. In the optical detection system disclosed in the above-mentioned US 2005/0224695 A1, a camera-shake compensation is accomplished by detecting a change of the position of the optical detection system and compensating for a shift in focal point caused by the detected position change.
Accordingly, in one embodiment, said method comprises the step of e) defining said pre-defined variation of said at least one imaging parameter.
By means of this defining of said pre-defined variation, for each of said image components of said set, a corresponding setting of said at least one imaging parameter is defined. Step e) is typically carried out before carrying out steps a) and b), and it is typically carried out by the photographer.
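The relation defined in step e), namely one setting of said at least one imaging parameter per image component, fixed before capture begins, can be sketched as follows (an illustrative Python sketch; the function name and the choice of linear interpolation between an initial and a final setting are assumptions of this sketch, not part of the disclosure):

```python
# Illustrative sketch: a pre-defined variation assigns, before any image
# component is obtained, one parameter setting to each image component.
# Linear interpolation between an initial and a final setting is only one
# possible (assumed) choice of variation.

def predefine_variation(initial, final, n_components):
    """Return one parameter setting per image component, fixed up front."""
    if n_components == 1:
        return [initial]
    step = (final - initial) / (n_components - 1)
    return [initial + i * step for i in range(n_components)]

# The first component uses the initial setting, the last the final setting.
settings = predefine_variation(initial=1.20, final=2.40, n_components=5)
```

The first and last entries reproduce the pre-defined initial and final settings of the embodiment described above; any other prescription, e.g., a non-monotonic one, could be stored in the same per-component form.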
Said fractions of said scene to be captured in said image components mentioned in step a2) are usually pre-defined, too: For each image component, the fraction of said scene to be captured in the respective image component is prescribed before the respective image component is obtained, in particular, before a first image component of said set is obtained. Accordingly, in one embodiment, the method comprises the step of adjusting at least one control parameter which influences shape and/or arrangement of all those portions of said image taking element, each of which is exposed for obtaining one respective image component.
Usually, said fractions of said scene captured in each of said image components do substantially not or not at all overlap.
In one embodiment, said method comprises the step of switching from a normal mode of said image taking apparatus, in which no pre-defined variation of said at least one imaging parameter and/or no movement of said at least one lens can be carried out during exposure, to another mode of said image taking apparatus, in which steps a) and b) can be carried out. Said normal mode can be the mode in which standard photo cameras work, i.e. in which imaging parameters (and lenses) remain fixed during exposing an image taking element. Said other mode is the mode according to the invention.

In one embodiment, said picture is composed of a multitude of picture constituents derived from said set of image components, and said method comprises the steps of deriving said picture constituents from said set of image components and deriving said picture from said picture constituents. This is the case, e.g., when the (lateral) magnification is changed by varying said at least one imaging parameter, as is usually the case when varying the focus setting. In such a case, some processing (e.g., some calculating) has to be applied to the image components in order to obtain the picture constituents.
In one embodiment, said picture is composed of picture constituents which are substantially identical with said image components. This is usually preferred in case of photochemical converters as image taking elements.
In one embodiment, both, a focus setting and a magnification and/or zoom setting is varied, wherein the variation of said magnification and/or zoom setting is chosen such that it compensates for changes in magnification caused by said focus setting variation.
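The compensation just described can be illustrated numerically under an assumed thin-lens model (a sketch, not the disclosed method): with lateral magnification m = f / (u - f) for focal length f and object distance u, refocusing to a new u changes m, and a compensating zoom restores it by solving for f.

```python
# Hedged sketch (assumed thin-lens model): keep the lateral magnification
# m = f / (u - f) constant while the focus setting moves to a new object
# distance u, by zooming to the focal length f = m * u / (1 + m).

def focal_length_for_magnification(m, u):
    """Focal length (same units as u) giving magnification m at object distance u."""
    return m * u / (1.0 + m)

m_target = 0.05                                              # desired constant magnification
f_near = focal_length_for_magnification(m_target, u=2000.0)  # object part at 2000 mm
f_far = focal_length_for_magnification(m_target, u=2500.0)   # object part at 2500 mm
# Both settings reproduce the target magnification (thin-lens model).
assert abs(f_near / (2000.0 - f_near) - m_target) < 1e-12
```

As expected, focusing on a more distant part of the scene requires a longer focal length (f_far > f_near) to hold the magnification constant.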
Said deriving of said picture constituents and/or of said picture (final image) may take place, in full or in part, in the image taking apparatus, and, in full or in part, in a computer connectable to the image taking apparatus or in a computer separate therefrom.
Said image components can also be referred to as "raw image components" in the sense that they are usually subject to some further processing in order to obtain said picture constituents. In case of digital photography, data representing said image components can be what generally is referred to as "raw data" in digital photography, yet it is also possible to have processed, e.g., compressed, data representing said image components. The fact that in each of said image components, only a fraction of said scene is captured, wherein said fractions are substantially different from each other (cf. step a2), avoids the generation of large amounts of superfluous data, of course at the expense of losing many possibilities of offline processing.
In one embodiment, said method comprises the step of d) deliberately creating out-of-focus portions of said scene in said picture by carrying out step b) .
Typically, said picture comprises in-focus portions and deliberately out-of-focus portions because of carrying out step b) .
Typically, there are varying degrees of sharpness (and blur) across said picture because of carrying out step b) .
This makes a clear and strong difference to the methods described in the above-mentioned US 6 535 250 B1, which clearly aim solely at creating fully in-focus, blur-free images of a flat object. In one embodiment of said method, step a) comprises the step of a3) exposing said image taking element with light from said scene in full by successively exposing different portions of said image taking element with light from different parts of said scene.
Accordingly, said image taking element, which in this case typically is a two-dimensional converter such as a sheet of photographic film or a two-dimensional CCD or CMOS chip, is subject to exposure on a portion-by-portion basis, until light from said scene in full is captured, which typically is the case when the whole image taking element has been exposed. Typically, therefore, for each image component, different portions of said image taking element are exposed at different times.
In one embodiment, said method comprises the step of a31) using a sensing area definition element arranged within said image taking apparatus between said scene and said image taking element for defining said different portions of said image taking element to be exposed.
Typically, said sensing area definition element is arranged close to said image taking element, and it usually comprises an opaque portion for blocking light from travelling to said image taking element. It usually comprises, in addition, a transparent portion, which allows light to travel to said image taking element, so as to define said different portions of said image taking element to be exposed. Typically, on its way to said image taking element, the light travels through said transparent portion substantially without changing its direction and/or substantially unperturbed. For example, said sensing area definition element can comprise a transparent portion, which is confined, fully or in part, by said opaque portion.
The sensing area definition element can, e.g., comprise a liquid crystal element such as a liquid crystal material between glass substrates having electrodes to selectively create transparent and light-blocking areas of the liquid crystal element. If many electrodes are provided, a great flexibility for defining said image components can be achieved this way. Nevertheless, said transparent portion is preferably an opening, since this provides a great optical quality, in particular compared to providing glass substrates in the light path used for exposing said image taking element. Accordingly, in one embodiment, said sensing area definition element is capable of forming an opening, and said step a31) comprises the step of moving said opening (with respect to said image taking element) of said sensing area definition element for defining said different portions of said image taking element to be exposed. E.g., a slit-shaped opening can be moved (continuously or quasi-continuously or step-wise) in the light path; e.g., shutter curtains as usually used for controlling the exposure in photographic cameras can be used as sensing area definition elements. It is also possible to use a pin diaphragm
(possibly one with several grades, such as those with which the f-stop of photographs is commonly chosen) movably arranged in the light path before the image taking element as a sensing area definition element. It is also possible to use two shutter curtains aligned in parallel, but with a non-zero angle, e.g., with an angle of substantially 90°, between the slits formable by the shutter curtains. This allows for a rather flexible design of said image components.
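The use of a moving slit-shaped opening as sensing area definition element can be sketched as follows (an illustrative Python sketch with assumed names; a real element would be a shutter curtain or the like, as described above):

```python
# Illustrative sketch (assumed names): a slit-shaped transparent opening is
# stepped across an otherwise opaque sensing area definition element. At each
# step only the rows behind the slit are exposed, so each exposure yields one
# image component covering a different fraction of the scene.

def slit_mask(n_rows, slit_top, slit_height):
    """Boolean exposure mask: True where the sensing area element is open."""
    return [slit_top <= r < slit_top + slit_height for r in range(n_rows)]

def exposed_rows(n_rows, slit_height):
    """Step the slit over the element; yield the row indices of each component."""
    for top in range(0, n_rows, slit_height):
        mask = slit_mask(n_rows, top, slit_height)
        yield [r for r in range(n_rows) if mask[r]]

components = list(exposed_rows(n_rows=8, slit_height=2))
# The fractions do not overlap and together cover the whole element.
```

In this sketch the fractions captured in the image components are non-overlapping, matching the usual case described above.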
In particular if the image taking element is or comprises an imaging photoelectric converter comprising an array of a multitude of photosensitive members, such as a CMOS chip, step a3) can be accomplished by successively bringing different groups of photosensitive members of said multitude of photosensitive members into a suitable photosensitive state and back into a photo-insensitive state. The image taking element may be constantly illuminated, and the portion of the image taking element to be exposed for obtaining a single image component can easily be chosen by a control signal switching the photosensitivity of said groups of photosensitive members. And even the times at which and the time during which an exposure for a single image component takes place may readily be defined by said control signal. This way, what can be referred to as an "electronic shutter" may be realized in the image taking element itself. This can render a sensing area definition element superfluous.
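Such an "electronic shutter" may be sketched, under assumptions of this illustration (named pixel groups, integer times in milliseconds), as a schedule that switches one group of photosensitive members after the other into its photosensitive state:

```python
# Hedged sketch of an "electronic shutter": groups of photosensitive members
# are switched into a photosensitive state one after the other while the
# image taking element is constantly illuminated. The group names and the
# integer millisecond times are assumptions of this sketch.

def electronic_shutter_schedule(groups, t_start, t_exposure):
    """Assign each group its (switch-on, switch-off) times, in sequence."""
    schedule = {}
    t = t_start
    for group in groups:
        schedule[group] = (t, t + t_exposure)
        t += t_exposure
    return schedule

sched = electronic_shutter_schedule(
    groups=("rows 0-1", "rows 2-3", "rows 4-5"), t_start=0, t_exposure=10)
```

One image component then corresponds to the charge collected by one group during its on-interval; no separate sensing area definition element is required.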
If a photochemical converter, e.g., photographic film, is used as image taking element together with a sensing area definition element such as a shutter with a slit moving in front of the image taking element, said set of image components may be considered to comprise a finite number of image components taken successively at different times, wherein these image components are chosen such that, as put forward in step b1), said at least one imaging parameter can be considered practically unchanged during the time interval within which said image components are obtained, or may be considered to comprise an infinite number of image components taken at infinitely closely spaced points in time. In particular with one-dimensional image taking elements such as line scanners, step a) may comprise the step of a4) successively obtaining the image components of said set by multiply exposing said image taking element, each time with light from a different part of said scene.
In one embodiment, step a4) comprises the step of a41) moving said image taking element for accomplishing said exposing of said image taking element with light from said different parts of said scene.
Said moving said image taking element can be a continuous or quasi-continuous or step-wise moving.
A drive functionally connected to said image taking element can be used for accomplishing said movement.
In one embodiment, said method comprises the step of c1) adjusting at least one control parameter on which said pre-defined variation of said at least one imaging parameter depends and which does not define and/or does not describe and/or is unrelated to an angle between an axis defined by said image taking element and an axis defined by an object to be imaged.
This provides for a great amount of freedom in creating photographs, also and in particular with respect to the disclosure of the above-mentioned US 6 535 250 B1, in which solely the above-mentioned inclination angle θ can be adjusted for influencing a variation of an imaging parameter.
In a similar embodiment, said method comprises the step of c2) adjusting at least one control parameter on which said pre-defined variation of said at least one imaging parameter depends and which does not describe and/or is unrelated to an alignment of said image taking apparatus relative to an object to be imaged.
In another similar embodiment, said method comprises the step of c3) adjusting at least one control parameter, on which said pre-defined variation of said at least one imaging parameter depends, in a way which is not predetermined by an alignment of an axis defined by said image taking element relative to an axis defined by an object to be imaged.
In yet another similar embodiment, said method comprises the step of c4) adjusting at least one control parameter, on which said pre-defined variation of said at least one imaging parameter depends, in a way which is not predetermined by an alignment of said image taking apparatus relative to said scene.
In yet another similar embodiment, said method comprises the step of c5) adjusting at least one control parameter on which said pre-defined variation of said at least one imaging parameter depends and which does not define and/or does not describe and/or is unrelated to a distance between said image taking element and an object to be imaged.
Steps c1), c2), c3), c4) and c5), respectively, are typically carried out by the photographer, and typically before carrying out steps a) and b).

It is possible that the control parameter mentioned in one of steps c1), c2), c3), c4) and c5) is identical with the control parameter mentioned in another one or more of these steps. In one embodiment, a control parameter, on which said pre-defined variation of said at least one imaging parameter depends, is adjusted for creatively designing said picture of said scene.
As has been put forward before, said at least one imaging parameter, which is varied in a pre-defined way in step b), depends on said relative movement mentioned in step b1).
In one embodiment, said pre-defined variation of said at least one imaging parameter leaves the signal strength generated by said image taking element substantially unchanged.
In one embodiment, said at least one imaging parameter is a parameter which leaves the amount of light to which said image taking element is exposed during obtaining said image components substantially unchanged. In one embodiment, said at least one imaging parameter is a parameter which influences light paths relevant for said obtaining said image components.
In one embodiment, said at least one lens is comprised in a focussing section of said image taking apparatus, and said at least one imaging parameter is a parameter of said focussing section.
In one embodiment, said at least one imaging parameter comprises a focus setting. Variation of the focus setting makes it possible to place the location of maximum sharpness at different distances for different places in the picture (different parts of the scene). In particular, the plane of maximum sharpness may be angled arbitrarily, and in principle, even an arbitrarily shaped surface of maximum sharpness may be realized.
In one embodiment, said method comprises the step of f) creating a bent focal surface by carrying out step b) .
This is readily accomplished, e.g., by means of corresponding variations of said relative movement mentioned in step b1). The focal surface is the surface constituted by the points which are imaged in-focus; a bent focal surface is a non-flat focal surface.
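A bent focal surface can be prescribed, for illustration, by assigning each line-shaped image component the object distance to be imaged in focus at its picture position (a Python sketch; the parabolic shape and all names are assumptions of this sketch, not part of the disclosure):

```python
# Illustrative sketch (assumed parabolic surface): a bent focal surface is
# realised by prescribing, for each line-shaped image component, the object
# distance that is to be imaged in focus at that picture position.

def bent_focal_surface(n_components, d_center, d_edge):
    """Focus distances following a parabola: d_center mid-picture, d_edge at the rims."""
    mid = (n_components - 1) / 2.0
    return [d_center + (d_edge - d_center) * ((i - mid) / mid) ** 2
            for i in range(n_components)]

distances = bent_focal_surface(n_components=5, d_center=2.0, d_edge=3.0)
# The centre component focuses at 2.0 (units of length), the outermost at 3.0.
```

Any other surface shape could be prescribed in the same per-component form, in line with the arbitrarily shaped surface of maximum sharpness mentioned above.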
In one embodiment, for each of said image components, the light used for obtaining the respective image component travels within said image taking apparatus along a light path to said image taking element without being reflected by a mirror.
In one embodiment, for said obtaining said image components of said set, light paths are used, which are, within said image taking apparatus, free from mirrors.
These "mirror-free" embodiments have the advantage that they allow for highest-quality imaging, since mirrors are optical elements which may have imperfections, may require adjustment and maintenance, and may cause failures.
In one embodiment, step b1) comprises the step of b11) moving said at least one lens during step a). This moving of said at least one lens typically takes place along the optical axis of the image taking apparatus.
In one embodiment, said image taking apparatus comprises a drive for moving said at least one lens, and step b11) is accomplished using said drive. This drive may be identical with a drive used for autofocus operations in said image taking apparatus.
In one embodiment, the method comprises the step of g) moving said image taking element during step a). This moving can be a moving contributing to or even embodying said relative movement mentioned in step b1), in which case it typically takes place along the optical axis of the image taking apparatus or at least comprises a movement component along the optical axis of the image taking apparatus.
In one embodiment, the moving of said image taking element mentioned in step g) comprises a tilting movement, wherein said tilting may be a tilting about an arbitrary axis, e.g., a horizontally aligned axis or a vertically aligned axis. Usually, the image taking element is aligned such that the line (in case of a line scanner) or the surface (in case of a two-dimensional photosensitive element such as photographic film or an imaging photoelectric detector) defined by it lies in a plane perpendicular to the optical axis of the image taking apparatus, typically in the image plane. But by means of said tilting, said line or surface will be moved out of said plane, so that at least some of said image components will be obtained in such a correspondingly tilted position of the image taking element. Accordingly, said tilting movement typically comprises a movement component along the optical axis of the image taking apparatus, but is not a movement solely along the optical axis of the image taking apparatus. If a movement according to step g) is carried out, in particular a tilting movement, while also a sensing area definition element is used for defining said different portions of said image taking element to be exposed (cf. step a31), it can be advantageous to move said sensing area definition element simultaneously with and in the same manner as said image taking element, e.g., for keeping the distance between said sensing area definition element and said image taking element constant and/or for ensuring that, for all image components, the length of the light path relevant for obtaining the respective image component from said sensing area definition element to said image taking element is of the same magnitude.
In one embodiment, said image taking apparatus is a general-purpose photographic camera. A general-purpose photographic camera is intended for use for various types of photography, not limited to only one type of photography such as taking pictures of white boards only.
Typically, said image taking apparatus is a camera for hand-held use and/or for use mounted on a camera stand or tripod.
In one embodiment, said set of image components is obtained automatically.
In one embodiment, said settings of said at least one imaging parameter are varied automatically. In one embodiment, said pre-defined variation of said at least one imaging parameter mentioned in step b) is carried out by varying settings of said at least one imaging parameter in a continuous or quasi-continuous or step-wise manner.
Continuous variations (of said at least one imaging parameter, of said relative movement mentioned in step b1), of said moving of said image taking element and/or of said lens and/or of said transparent portion of said sensing area definition element) can be carried out with analogue control, e.g., by using an analogue controller and an analogue motor for continuously varying, e.g., a focus setting, a lens position, an image taking element position or an aperture position. Quasi-continuous variations are possible with a drive and a digital control (having a reasonable resolution). Step-wise variations are possible by various means.
In order to improve image quality and decrease the time for obtaining said set of image components, it can be advisable to obtain neighboring image components consecutively, and to make (at least mostly) only small or no changes in the imaging parameter settings from one image component to a neighboring image component.
In one embodiment, said image taking element is a photoelectric converter, in particular a CMOS chip or a CCD chip or a line scanner.
In one embodiment, said image taking element is an imaging photochemical converter, in particular photographic film. In one embodiment, said method comprises the step of storing data representative of said set of image components in a storage unit, in particular in at least one of the group comprising
— a storage unit of said image taking apparatus;
— a storage unit of a computer separate from or separable from said image taking apparatus;
— a data carrier, in particular magnetic or optical or electrical data carrier, in particular a removable data carrier.
In case of a photochemical image taking element, the "data" of the image components are "stored" in the image taking element, typically a photographic film.
In one embodiment, said method comprises the step of
— storing in said or in another storage unit data representative of the different settings of said at least one imaging parameter used for obtaining said image components.
Furthermore, it can be useful to store information relating the image components to their respective future location in the final image or to the portion of the image taking element employed for capturing the respective image component or to the time when the respective image component has been obtained or the like. This helps to correctly assemble the picture constituents in the end. It is possible and usually desirable to obtain the whole set of image components upon one start signal such as pressing a release button.
In many cases, it will be preferred to have line-shaped image components with a small width, wherein the line may be curved or straight, continuous or discontinuous. In case that an image taking element comprising a multitude of photosensitive members is used, e.g., a CMOS or CCD chip, the width of the line would, for high resolution, be only one pixel, or maybe two or three pixels; in case of a color-sensitive chip, one pixel would preferably be considered to comprise a group of photosensitive members accounting for the different colors, e.g., one for red, two for green, one for blue. The number of image components in said set will in many cases be several hundreds to several thousands.
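The slicing into line-shaped image components one pixel wide, with one colour pixel taken to comprise a group of photosensitive members (one for red, two for green, one for blue, as mentioned above), can be sketched as follows (illustrative Python; the row count of 3000 is an assumption of this sketch):

```python
# Illustrative sketch: line-shaped image components of a chosen pixel width.
# On a colour-sensitive chip, one pixel is taken here to comprise a group of
# four photosensitive members (1 red, 2 green, 1 blue), an assumption of
# this sketch.

def line_components(sensor_rows, line_width=1):
    """Split the sensor into line-shaped components of the given pixel width."""
    return [list(range(top, min(top + line_width, sensor_rows)))
            for top in range(0, sensor_rows, line_width)]

MEMBERS_PER_PIXEL = 4  # 1 R + 2 G + 1 B photosensitive members per colour pixel
lines = line_components(sensor_rows=3000)
# Several thousand line components, in the range suggested for high resolution.
```

A larger `line_width` yields fewer, wider components for faster image-taking, matching the trade-off discussed below.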
Of course, image components with a larger width can be used for faster image-taking, usually at the expense of resolution; if, however, the variation of said imaging parameter comprises only a relatively small number of steps, there may be no loss of resolution.
In one embodiment of said image taking apparatus, the image taking apparatus comprises at least one of
— an image-forming optical system, in particular a detachable image-forming optical system;
— a focussing section;
— an exposure time definition unit;
— an image taking module comprising an image taking element, in particular a detachable image taking module.
It may be a camera system, in particular a photographic camera, a general-purpose photographic camera, a still image camera, more particularly a single-lens reflex camera.
In one embodiment, said image taking apparatus is free from mirrors, which are arranged, for at least one of said image components, along a light path along which light used for obtaining said at least one image component travels within said image taking apparatus to said image taking element during obtaining said at least one image component.
In one embodiment, said image taking apparatus comprises at least one of the group comprising
— a first storage unit for storing data representative of said set of image components;
— a second storage unit, which may be identical with or different from said first storage unit, for storing data representative of said pre-defined variation of said at least one imaging parameter;
— a third storage unit, which may be identical with or different from said first and/or second storage units, for storing data relating each of said image components of said set to said settings of said at least one imaging parameter used during obtaining the respective image component.
Said data can be digital data. In one embodiment, said image taking apparatus comprises, as image taking element, a line scanner and a drive functionally connected to said line scanner, wherein said control module is adapted to control said drive such that said line scanner is moved during obtaining said set of image components, so as to define for each of said image components the corresponding fraction of said scene.
In one embodiment, said image taking apparatus comprises an exposure time definition unit. By means of an exposure time definition unit, it is defined when and for how long an exposure of the image taking element takes place, or, more precisely, when and for how long photons are collected by the image taking element. Accordingly, exposure time definition units can be, e.g., shutters (shutter curtains) and/or apertures. It is possible that an image taking element itself implements an exposure time definition unit, e.g., if the photo-sensitivity of the image taking element (or portions thereof) can be switched on and off. It can be useful to provide an exposure time definition unit in addition to a sensing area definition element.
Viewed from a certain point of view, the methods according to the invention may also be considered methods of operating an image taking apparatus, in particular methods of operating an image taking apparatus for obtaining one final image, which final image is composed of a multitude of picture constituents derived from a set of image components.
Said picture (or final image) of said scene can be considered a mosaic-like composition of the picture constituents. And said set of image components may be interpreted as forming a mosaic-like pattern from all its image components, wherein each image component corresponds to one mosaic-piece-like fraction of said scene. The image components may be considered partial images or mosaic image fractions.
Furthermore, typically, there is a one-to-one relation between each image component and each picture constituent. And the mosaic-like arrangement of picture constituents represented by the final image is usually identical with the mosaic-like arrangement that can meaningfully be formed from the set of image components, or is at least obtainable therefrom by relatively simple geometric transformations, which usually do not alter the neighboring-relationships of the mosaic parts.
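The mosaic-like, neighbour-preserving assembly with its one-to-one relation between image components and picture constituents can be sketched as follows (illustrative Python; the row-shaped constituents and simple concatenation are assumptions of this sketch):

```python
# Illustrative sketch (assumed data layout): the final picture as a mosaic of
# picture constituents in one-to-one relation with the image components, each
# placed at its pre-defined location without altering the neighbour
# relationships of the mosaic parts.

def assemble_picture(constituents):
    """Concatenate row-shaped picture constituents into the final picture."""
    picture = []
    for rows in constituents:  # one constituent per image component, in order
        picture.extend(rows)
    return picture

c1 = [[1, 1, 1]]  # constituent derived from the first image component
c2 = [[2, 2, 2]]
c3 = [[3, 3, 3]]
final = assemble_picture([c1, c2, c3])
```

Any magnification correction would be applied to each constituent before this assembly step, as described for the focus-variation case above.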
Usually, the scene and its illumination shall remain unchanged during a time span Δt, during which all image components of the set are derived, and also the image taking apparatus should not be moved within that time. The advantages of the methods correspond to the advantages of corresponding apparatuses.
Further preferred embodiments and advantages emerge from the dependent claims and the figures.
Brief Description of the Drawings
Below, the invention is described in more detail by means of examples and the included drawings. The figures show schematically:
Fig. 1 an illustration of a simple embodiment with focus variation;
Fig. 2 an illustration of a simple embodiment with focus variation;
Fig. 3 an image taking apparatus with focus variation;
Fig. 4 an illustration of a simple embodiment with "electronic shutter";
Fig. 5 an illustration of a camera system with computer and storage unit;
Fig. 6 an illustration of a simple embodiment with sensing area definition element;
Fig. 7 an illustration of a simple embodiment with a line scanner;
Fig. 8 an illustration of image components and corresponding imaging parameter settings over time;
Fig. 9 an illustration of image components, picture constituents and final image;
Fig. 10 an illustration of a mosaic pattern of picture constituents;
Fig. 11 an illustration of a mosaic pattern of picture constituents;
Fig. 12 an illustration of exposures and imaging parameter settings over time;
Fig. 13 an illustration of exposures and imaging parameter settings over time;
Fig. 14 an illustration of exposures and imaging parameter settings over time;
Fig. 15 an illustration of exposures and imaging parameter settings over time;
Fig. 16 an image taking apparatus with focus and aperture opening variation;
Fig. 17 an illustration of a simple embodiment with focus variation and image taking element tilting, in a side view;
Fig. 18 an illustration of the simple embodiment with focus variation and image taking element tilting of Fig. 17, in a top view;
Fig. 19 an illustration of a simple embodiment with focus variation, image taking element tilting and sensing area definition element tilting, in a side view;
Fig. 20 an illustration of the simple embodiment with focus variation, image taking element tilting and sensing area definition element tilting of Fig. 19, in a top view.

The reference symbols used in the figures and their meaning are summarized in the list of reference symbols. The described embodiments are meant as examples and shall not confine the invention.
Detailed Description of the Invention
Fig. 1 is an illustration of a simple embodiment, in which an imaging parameter, namely the focus setting, is varied. A scene 99 (or an object 99) is imaged onto an image taking element 60 by means of an image-forming optical system 20, which is drawn as a single lens, which at the same time is a focussing section 29. A control module 4 can control a drive 28 (dash-dotted lines indicate functional connections) such that the focus setting is changed, as illustrated by the dotted lenses connected by the dotted arrows. If the focussing section 29 is moved step-wise from an initial position at time t0 to a final position at time tf, via several intermediate steps, one of which is drawn at tn, the location of the maximum sharpness of the image will also move, as illustrated by the small image sections labelled with the corresponding point in time (t0, tn, tf).
Said control module 4 controls said image taking element 60 such that, at different times ti, different portions of the image taking element 60 capture a partial image representing only a fraction of said scene 99. Such a partial image is referred to as an image component Ri (indices of image components corresponding to indices of points in time).
After taking all said image components R0...Rf, these might undergo some processing in order to provide for some corrections (changing the focus setting usually results in a change in magnification, which preferably is corrected for), and then a final picture can be obtained from assembling the results.
This makes it possible to achieve the effect of a view camera with tilted lenses without tilting lenses, as will become clear in Fig. 2.
By means of the movement of the at least one lens comprised in focussing section 29, a desired distribution of blur and sharpness in the final picture can be achieved, as can also be seen in Fig. 1.
Fig. 2 is an illustration of the same simple embodiment as in Fig. 1. This time, the object 99 is tilted. By changing the focus settings properly and controlling the capturing of the image components Ri accordingly, it is possible to obtain a final image (picture) of the (tilted) surface of the object 99, which shows the surface in maximum sharpness throughout the whole picture. In every image component Ri, the maximum sharpness of the image of the surface is achieved in the image plane 86 of the image taking element 60.
It is to be noted that this effect is achieved without tilting the lens with respect to the image taking element, as would be the case when achieving this effect with a bellows camera, and without a mirror in the light path that would reflect the light used for obtaining said image components R0...Rf. Furthermore, said control module 4 will usually be programmed by the photographer before time t0, i.e., before starting to capture said image components R0...Rf, so that the described focus variation is a pre-defined variation of at least one imaging parameter.
Fig. 3 shows schematically an image taking apparatus 1 with focus variation, similar to the embodiment of Figs. 1 and 2. The image taking apparatus 1, e.g., a camera, comprises an image taking module 6, e.g., a digital back, which comprises the image taking element 60, e.g., a CCD chip or a CMOS chip.
Fig. 3 further illustrates that it is possible to define that area of image taking element 60 which is to be exposed for obtaining a certain image component by means of a sensing area definition element 58, which is separate from image taking element 60. Such a sensing area definition element 58 is usually located close to image taking element 60. Sensing area definition element 58 comprises a transparent portion 580, which lets light pass, and an opaque portion, which blocks light from travelling further. The sensing area definition element 58 of the embodiment of Fig. 3 can, e.g., be a liquid crystal element with several or, better, a multitude of electrode pairs, which pairs are arranged, e.g., in a regular array. The shape of an image component Rn to be captured (and, therewith, of the fraction of scene 99 to be captured in said image component) can then easily be defined by selecting one or more of said electrode pairs and not applying a voltage to the selected electrode pair(s) while applying a voltage to the other electrode pairs. The application of a voltage provokes an ordered arrangement of the liquid crystals between the corresponding electrodes, which renders that region opaque.
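A minimal sketch of how such an electrode-pair selection could be modelled: cells without an applied voltage stay transparent, all others are driven opaque. The function name and the normally-transparent liquid-crystal behaviour are assumptions for illustration, not details taken from the application:

```python
import numpy as np

def lc_mask(n_rows, n_cols, transparent_cells):
    """Model a liquid-crystal sensing area definition element driven by
    a regular array of electrode pairs: True in the voltage map means a
    voltage is applied (opaque); the selected cells get no voltage and
    remain transparent. Returns the transparency map (True = light passes)."""
    voltage = np.ones((n_rows, n_cols), dtype=bool)   # default: voltage on everywhere
    for r, c in transparent_cells:
        voltage[r, c] = False                          # selected pairs: no voltage
    return ~voltage                                    # transparent where no voltage
```

Selecting, e.g., one row of cells per time step would then expose one stripe of the image taking element per image component.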
The data generated by image taking element 60 are stored in a storage unit 10, which may be separate from the camera or comprised therein. The control module 4 ensures that exposure takes place at appropriate times ti and at appropriate portions of image taking element 60.
Various ways of obtaining image components Ri can be thought of, some of which have been discussed above or will be discussed below. Figs. 4, 6 and 7 illustrate some of such ways, wherein a moving lens is drawn at an intermediate time tn and, dashed, at an initial time t0 and at a final time tf, for illustrating the imaging parameter variation (e.g., focus setting, zoom setting, soft focus setting) .
Fig. 4 illustrates a simple embodiment with an "electronic shutter". A CMOS chip 60c (image taking element) is controlled by control module 4 such that, as a function of time, different groups of pixels (photosensitive members 65) are in action, i.e., are in a photosensitive state, while the rest of the photosensitive members 65 are photo-insensitive, i.e., they do not register photons impinging on them. For example, one after another, each row of pixels is switched active, from top to bottom of the CMOS chip 60c, which altogether takes a time Δt. During the time span Δt, the imaging parameter is varied. The resulting image components R0, ..., Rn, ..., Rf are stored in a storage unit 10, e.g., after all rows were active (all data of the image components are collected), or image component by image component, i.e., data are read out into storage unit 10 each time after one row has been switched back to the inactive state. Such an embodiment has the advantages that the shape and location of the image components can be selected in a very flexible way and that it has the potential to be a very fast way of obtaining the image components.
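The row-by-row "electronic shutter" can be mimicked as follows; the scalar parameter values standing in for registered photons, and the one-row-per-step activation, are purely illustrative assumptions:

```python
import numpy as np

H, W = 8, 10                          # illustrative chip dimensions
sensor = np.zeros((H, W))
p = np.linspace(0.0, 1.0, H)          # imaging parameter varied over the time span Δt

# Row-wise "electronic shutter": at step i only row i of photosensitive
# members 65 is in action, so each row records the scene under a
# different imaging parameter setting p[i].
for i in range(H):
    active = np.zeros((H, W), dtype=bool)
    active[i, :] = True               # only one row is photosensitive
    sensor[active] = p[i]             # stand-in for registering photons at setting p[i]
```

The mask could just as well select arbitrary pixel groups per step, which is the flexibility advantage noted above.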
Fig. 6 illustrates a simple embodiment with a sensing area definition element 58, e.g., embodied as a mechanical shutter. As image taking element, a photochemical converter 60b, like a photographic film 60b, may be used; a two-dimensional photoelectric converter could be used, too. The control module 4 controls (via drive 59) sensing area definition element 58, e.g., shutter curtain 58, which defines, as a function of time, the locations on the photographic film 60b which shall be exposed to light. The movement of the shutter opening may be step-wise or continuous. In the latter case, the set of image components may be considered to contain an infinite number of image components, or a finite number of image components with the delimitation between one image component and a neighboring image component being selectable rather freely. It is possible to use the finally fully exposed photographic film 60b as the final image, or to subject it to further processing before obtaining the final image. It is, e.g., possible to scan the finally fully exposed photographic film 60b and continue with the so-obtained data. Comparing the embodiment of Fig. 4 with the embodiment of Fig. 6, it is to be noted that the amount of controlling that has to be carried out for obtaining the image components can be considerably smaller in the embodiment of Fig. 6.
Fig. 7 is an illustration of a simple embodiment with a line scanner 60a. Controlled by the control module 4, the line scanner 60a (as image taking element) is moved so as to be exposed to different parts (or fractions) of the scene to be imaged. A drive for driving the line scanner 60a has not been drawn in Fig. 7. A line scanner 60a is a (usually) linear arrangement of a multitude of photosensitive members (several hundred or several thousand or more). It may have more than one row in order to be able to obtain color information. The movement may be continuous or (rather) step-wise, wherein data are read out after a certain exposure time. It is possible to provide a shutter as an exposure time definition element in the light path, which prevents light from impinging on the line scanner 60a when no new data shall be obtained, e.g., when data shall be read out.
Of course, the different features of the discussed embodiments may be combined in other ways than shown in the figures. E.g., a sensing area definition element such as a mechanical shutter may be used in conjunction with a CCD chip 60d or a CMOS chip 60c. Also, it can be advantageous to provide, in addition, an exposure time definition unit for generally allowing or prohibiting that an image taking element 60 is exposed to light. E.g., a shutter can be used, which is opened before the first image component R0 is taken and is closed after the last image component Rf has been taken.
Comparing the embodiment of Fig. 7 with the embodiments of Figs. 4 and 6, it is to be noted that - unless a particularly fast drive for moving the line scanner in Fig. 7 is provided - embodiments according to Fig. 7 are suited rather for imaging unmoving objects than for imaging fast-moving objects (unless distortions induced by notably moving objects are desired). The embodiments of Figs. 4 and 6, however, can readily be designed so as to allow for imaging moving objects substantially without distortions (smear effects) caused by the movement of the objects during obtaining the set of image components. Obtaining the full set of image components within 1/50 second or 1/100 second or faster is expected to be readily achievable.
Accordingly, at least some embodiments of the invention can be used for obtaining pictures of scenes with moving objects, which are substantially free from distortions caused by said moving of said objects.
Fig. 8 is an illustration of image components Ri and corresponding imaging parameter settings pi over time (ti). (The index i always runs i = 0, ..., n, ..., f.) The example shows stripe-shaped image components Ri, which are arranged in parallel. This can readily be realized in any of the embodiments of Figs. 4, 6 and 7. The imaging parameter in the example of Fig. 8 is varied continuously, in a nonlinear fashion.

Fig. 9 illustrates the relation of image components Ri, picture constituents Ci and final image P. If no processing of the image components Ri is necessary, the image components Ri may be identical with the picture constituents Ci. Fig. 9 shows a case in which the (lateral) magnification varies with the variation of the imaging parameter. The dotted lines within the image components Ri indicate which part of each image component will be used in the corresponding picture constituent Ci; the outer part of the image components Ri can be discarded, and only image component Rf is used in full. After the magnification factor correction and possibly a slight rearranging (slight shifting) of the image components Ri, the picture constituents Ci are obtained. The full set of picture constituents Ci corresponds to the picture P, which can be represented as data or as a print on (photo) paper.
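A possible (assumed) form of the magnification factor correction: center-crop each image component by its magnification factor and resample it to the picture-constituent size. The linear crop model and the nearest-neighbour resampling are simplifications for illustration, not a method prescribed by the application:

```python
import numpy as np

def correct_magnification(component, mag, out_shape):
    """Undo a focus-induced magnification change: center-crop the image
    component Ri by the factor mag (mag >= 1; mag == 1 keeps Rf in full),
    then resample to the picture-constituent size Ci by nearest-neighbour
    index mapping."""
    h, w = component.shape
    ch, cw = int(round(h / mag)), int(round(w / mag))   # cropped size
    top, left = (h - ch) // 2, (w - cw) // 2            # discard the outer part
    crop = component[top:top + ch, left:left + cw]
    rows = np.arange(out_shape[0]) * ch // out_shape[0]  # nearest-neighbour rows
    cols = np.arange(out_shape[1]) * cw // out_shape[1]  # nearest-neighbour cols
    return crop[rows][:, cols]
```

Applying this to every Ri with its own magnification factor, and then stacking the results, yields the set of picture constituents Ci forming the picture P.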
Figs. 10 and 11 are illustrations of example mosaic patterns of picture constituents Ci. This shall illustrate that, in fact, arbitrary mosaic patterns are possible. Only a small number of picture constituents Ci has been drawn in the figures, whereas there will typically be on the order of 500 to 5000 or even more picture constituents Ci. For naturally-looking pictures, it is advisable to have a large number of picture constituents Ci, i.e., to have a maximum resolution, and to have smooth or no changes from one image component to a neighboring image component. Fig. 10 illustrates that stripes do not need to be parallel to the image frame (they do not even have to be parallel to each other), and that they do not need to have the same width. This can be accomplished using a shutter curtain arranged such that its slit runs neither parallel nor perpendicular to an edge of the image taking element. Fig. 11 illustrates that any other mosaic (or puzzle-like) pattern is possible, e.g., a hillock-like one.
Typically, neighboring image components Ri (which will lead to neighboring picture constituents Ci) will be obtained in succession.
Figs. 12 to 15 illustrate some examples of how exposures and imaging parameter settings pi may change over time.
Mostly, there will be one, two or three orders of magnitude more exposures (and imaging parameter steps) than drawn in the figures. The exposure is illustrated in solid lines, the imaging parameter settings in dashed or dotted lines. Fig. 12 illustrates discrete exposures with the same exposure time τi for each image component. The imaging parameter is varied quasi-continuously or step-wise. It is possible to make imaging parameter steps only between two exposures, but it is also possible to make the imaging parameter setting steps independent of the timing of the discrete exposures, as shown in Fig. 12.
Fig. 13 illustrates, like Fig. 12, discrete exposures, but two imaging parameters, namely the focus setting (ai) and a zoom setting (si), are varied. This may be done for changing the focus while at the same time compensating for the change in magnification due to said change in focus, which could render a processing of so-obtained image components for deriving picture constituents therefrom superfluous. Each imaging parameter can, e.g., be varied continuously or quasi-continuously.
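A toy model of such a compensation, under the assumption (not stated in the application) that magnification grows linearly with the focus setting with some lens constant `k`:

```python
def compensating_zoom(focus_setting, k=0.05):
    """Assumed toy model: if magnification grows linearly with the focus
    setting a, m = 1 + k*a, choose the zoom setting s so that the total
    magnification s * (1 + k*a) stays exactly 1. The constant k is a
    hypothetical lens parameter, not a value from the application."""
    return 1.0 / (1.0 + k * focus_setting)
```

Driving the zoom setting si with this function while the focus setting ai is swept would keep the image components at constant scale, so no magnification correction step is needed afterwards.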
Fig. 14 illustrates discrete exposures with varying exposure times τi for different image components. The imaging parameter settings (p0...pf) partially vary continuously, but also show a step and a constant region.
Fig. 15 illustrates a continuous exposure, as it may be realized, e.g., in an embodiment with a continuously moving opening of a sensing area definition element such as the opening of a shutter (cf. Fig. 6). An imaging parameter (p0...pf), e.g., a soft focus setting, is varied.
Fig. 16 shows schematically an image taking apparatus 1. This exemplary apparatus 1 is a modular single-lens reflex camera 1. It has the following parts, which are all (optionally) detachable: a lens module 2, a focussed-state detection module 3, a control unit 40, an optional adapter plate 6', an image taking module 6 and a focussed-state detection module 7.
The lens module 2 (or lens barrel 2 or objective module 2) corresponds to an image-forming optical system 20 comprising a number of lenses 21 and an aperture 22 and possibly a shutter (not shown). A part of the lenses 21 (optionally all lenses) forms a part of a focussing section 29, which also comprises a drive 28 (for focussing). The drive 28 does not have to be arranged at or within the lens barrel 2. A drive 23 may be provided, which allows the aperture 22 to be opened and closed and its opening to be adjusted. The lens barrel 2 is attached to a focussed-state detection module 3, which in the camera of Fig. 16 is at the same time a mirror module containing a mirror arrangement comprising a main mirror 35 and an auxiliary mirror 36. Light from an object or scene 99 to be imaged runs along a light path 9 (only central beams are drawn) through the image-forming optical system 20 and hits the main mirror 35. The upwardly reflected light follows a light path 9a to a groundglass focussing screen 70 as focussed-state detection arrangement 70. The focussed-state detection module 7 is embodied as a view finder module 7 attached to the mirror module 3. The focussed-state detection module 7 may, in general, present images, for example, optically or electro-optically. The thick wavy line represents the image of the object 99 in the image plane 87 of focussing screen 70.
As a second focussed-state detection arrangement 30, the camera 1 comprises an autofocus sensor 30. Light from the object 99 reaches the autofocus sensor 30 on a light path 9a' through the main mirror 35 and via reflection at the auxiliary mirror 36. Thus, an image is formed in an image plane 83 of the autofocus sensor 30. Usually, the optical path length from object 99 to the image plane 83 of the autofocus sensor 30 is the same as the optical path length from object 99 to the image plane 87 of the focussing screen 70.
When a photograph shall be taken, in particular when a set of image components shall be obtained, the mirror arrangement (main mirror 35 and auxiliary mirror 36) is moved as indicated by the small arrow. This lets the light pass along a light path 9b through the control unit 40, which contains a sensing area definition element 58, e.g., a shutter 58, and a control module 4 (or control circuit) embodied in a microprocessor μP. The shutter 58 and the control module 4 do not necessarily have to be arranged within the control unit 40. The control module 4 may control the drive 28, the aperture 22 (via the drive 23), a mechanism for moving the mirror arrangement (not shown) and the shutter 58 and other functions of the image taking apparatus. It may receive input from the autofocus sensor
30, from light intensity sensors (not shown) and from other sources (including the photographer). For reasons of clarity, the functional connections of the control module 4 to the various units and elements are not shown in Fig. 16. After passing the control unit 40, the light will pass the adapter plate 6' and impinge on an image taking element 60 of the image taking module 6, which is embodied as a digital back 6 with a CCD or CMOS chip 60 and comprises a storage unit 10 (memory) . The image plane of the image taking element 60 is labelled 86.
At least the following imaging parameters can be varied while obtaining the image components: parameters of the shutter 58 (movement, slit widths), focus settings (via drive 28) and the opening of aperture 22 (via drive 23). Of course, also other imaging parameters (and parts or elements of the camera 1) could be controlled by the control module 4 in order to vary these during obtaining the image components. For further imaging parameters that could be varied, reference is made to the "summary of the invention" section closer to the beginning of this patent application.
It is possible to provide within the camera 1, or in a connected computer (of a camera system), means for instructing the camera (or the control module 4) which imaging parameter to vary and in which way, and/or which image components are to be obtained (what kind of mosaic pattern, which exposure times to use, in which order to take the image components). The corresponding instructions shall be referred to as an "exposure program".
For example, after switching the camera 1 into a suitable mode, the photographer could select, manually or automatically (autofocus or the like), a setting of the at least one imaging parameter to be varied, e.g., a focus setting and/or a zoom setting. Then, the user marks, on a screen, a point or an area in the scene at which this setting should be used, e.g., in the middle of the top of the screen. Then the same is done for a second setting and a second point or area in the scene, e.g., in the middle of the bottom of the screen. It is possible to input further settings and points/areas. In addition, a suitable fitting mode could be selected, which defines the algorithm (fitting procedure for interpolation/extrapolation) to be used for obtaining the mosaic patterns and the imaging parameter variation from the input. E.g., one mode could be: finest-resolution, horizontally-oriented stripe-shaped image components from top to bottom, and polynomial interpolation between the imaging parameter settings. Then the "exposure program" is defined, and upon a start signal, e.g., pressing the release button of the camera, the image components are obtained according to the "exposure program" and stored in a storage unit. This way of defining the "exposure program" (and providing the input for it) could even be realized with the camera alone. E.g., the data from the image taking element 60 (or of a view finder with an electro-optic converter) could be output on a screen of the camera, and points could be set, e.g., by means of a tracking ball or cursor keys.
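The "polynomial interpolation" fitting mode sketched above could look like this; the function name, the mapping of marked points to stripe indices and the fit degree are all hypothetical choices for illustration:

```python
import numpy as np

def exposure_program(control_points, n_stripes):
    """Derive imaging parameter settings p0..pf for horizontally-oriented
    stripe-shaped image components from a few user-marked
    (stripe_index, setting) pairs, using a polynomial fit of degree
    len(control_points) - 1 through the marked points."""
    idx = np.array([p[0] for p in control_points], dtype=float)
    val = np.array([p[1] for p in control_points], dtype=float)
    coeffs = np.polyfit(idx, val, deg=len(control_points) - 1)
    return np.polyval(coeffs, np.arange(n_stripes))   # one setting per stripe
```

With two marked points (top and bottom of the screen), this degenerates to a linear interpolation of the setting from the first to the last stripe.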
Many other ways of defining the instructions are possible, but they shall not be discussed here. Software for defining the instructions can be implemented in the camera itself and/or can be run on a (separate) computer connected (at least temporarily) to the camera.
Fig. 5 is an illustration of a camera system with computer 100 and storage unit 10. The storage unit 10 may be part of the computer 100 and contain image components and/or picture constituents and/or final images (pictures). On the computer, software for defining exposure programs and/or software for obtaining final images (from image components and/or picture constituents) and even software embodying the functions of the control module 4 (for remote-controlling the camera) may be installed and run.

Fig. 17 schematically illustrates, in a side view, a simple embodiment with a variation of an imaging parameter and with image taking element tilting. Said imaging parameter can be a focus setting, a zoom setting, a soft focus setting or another imaging parameter. Its variation is symbolized by a moving lens 21 and will usually comprise a movement of at least one lens 21. Through a tilting of said image taking element 60 before or during obtaining image components, an additional degree of freedom in image taking is introduced.
In Fig. 17, said lens 21, a sensing area definition element 58 having a transparent portion 580, and an image taking element 60 are shown at two points in time t1, t2. The arrangement at time t1 is drawn in solid lines, the arrangement at time t2 is drawn in thick dotted lines.
From time t1 to time t2, lens 21 is moved so as to cause, e.g., said focus variation, and the transparent portion 580 of sensing area definition element 58 is moved, too, so that at different points in time t1, t2, different parts of a scene to be photographed are imaged onto different parts of image taking element 60. For better visibility in the figure, the boundaries of transparent portion 580 have been indicated by short horizontal lines. The sensing area definition element 58 can, e.g., be embodied as a shutter curtain.
So far, this corresponds largely with what has been shown above, e.g., in Figs. 3 and 6. But instead of keeping image taking element 60 in a fixed position perpendicular to axis A (or moving it, without tilting, along the optical axis A), it is tilted (rotated about an axis).
In a simpler case, image taking element 60 is tilted (inclined) at a certain unchanged angle during obtaining the set of image components; e.g., image taking element 60 is tilted into such a tilted position shortly
before starting the obtaining of the image components, and possibly it is moved back into its normal position (perpendicular to axis A) shortly after all image components of the set have been obtained. Accordingly, planes of maximum sharpness angled arbitrarily with respect to the optical axis A can be realized solely by means of this image taking element tilting, which approximately corresponds to the effect of the "classical" bellows camera (view camera) with lenses tilted with respect to the photographic plate. Moreover, this effect is combined with the effect achieved by moving lens 21 and the transparent portion 580 of sensing area definition element 58 during obtaining the image components of the set.

In a more advanced case, which is illustrated in Fig. 17, image taking element 60 is not kept in a fixed position (tilted or not tilted) during obtaining the set of image components, but the tilting is varied (in angle and/or direction) during obtaining said set of image components. The effect of varying the tilting angle during obtaining said set of image components is readily understood: Assuming that a flat surface aligned perpendicularly to optical axis A is to be imaged, the slice-shaped image component R1 obtained at time t1 has, as indicated in the right part of Fig. 17 by dots having the same size, the same degree of sharpness or blur across itself. But the slice-shaped image component R2 obtained at time t2 (with image taking element 60 tilted) has, as indicated in the right part of Fig. 17 by dots having different sizes, a varying degree of blur across itself. Accordingly, not only planes of maximum sharpness angled arbitrarily with respect to the optical axis A can be realized, but even curved surfaces of maximum sharpness can readily be realized. And this although an element as simple as a shutter curtain is used as a sensing area definition element 58.
Of course, other sensing area definition elements 58 could be used instead, and also an "electronic shutter" (cf. Fig. 4 above) could be used.
Fig. 18 shows the embodiment of Fig. 17 in a top view. The axis of rotation of image taking element 60 can, as shown in Figs. 17 and 18, run centrally through image taking element 60, but it could also be a different axis, e.g., one running along an edge of image taking element 60.
The movements of the image taking element can be accomplished using a drive, e.g., a rotatory or a linear drive, e.g., a piezo-driven one.
Figs. 19 and 20 show, in side view and in top view, respectively, an embodiment which is similar to the one shown in Figs. 17 and 18, but in which, along with the image taking element, also the sensing area definition element 58 is tilted. This can be advantageous in particular if, as will often be the case, the sensing area definition element 58 is arranged close to image taking element 60. The sensing area definition element 58 can be separate from image taking element 60 or can be fixed to image taking element 60. In the latter case, the size of sensing area definition element 58 with respect to the size of image taking element 60 has to be chosen sufficiently large so as to allow for proper exposure of image taking element 60 when tilted, also in the peripheral parts of image taking element 60.
The tilting of image taking element 60 can be accomplished and used in the same fashion as discussed in conjunction with Figs. 17 and 18.
Of course, such a tilting of image taking element 60 can also be accomplished when the image taking element is a line sensor (cf., e.g., Fig. 7) . In this case, a sensing area definition element can be dispensed with or can be used, e.g., with a transparent portion being moved along the extension of the line sensor.
List of Reference Symbols
1 image taking apparatus, camera system, camera, photographic camera, general-purpose camera, reflex camera
2 lens module, objective module, lens barrel
3 focussed-state detection module, mirror module
4 control module, control circuit, microprocessor
6 image taking module, digital back
6' adapter plate
7 focussed-state detection module, view finder module
9, 9a, 9a', 9b light paths (central beam)
10 storage unit
20 image-forming optical system
21 lens
22 aperture
23 drive
28 drive
29 focussing section
30 focussed-state detection arrangement, autofocus sensor
35 mirror, main mirror
36 mirror, auxiliary mirror
40 control unit
58 sensing area definition element, aperture element, aperture, slit-shaped aperture, means for defining a portion of the image taking element to be exposed, shutter
580 transparent portion of sensing area definition element, opening of sensing area definition element
59 drive, aperture drive, aperture control
60 image taking element
60a line scanner
60b photographic film, photochemical converter
60c CMOS chip
60d CCD chip
65 photosensitive member
70 focussed-state detection arrangement, focussing screen, groundglass
83 image plane of focussed-state detection arrangement, image plane of autofocus sensor
86 image plane of image taking element
87 image plane of focussed-state detection arrangement, image plane of focussing screen
99 object, scene
100 computer
A optical axis
ai focus setting
Ci picture constituent
P picture, final image
pi imaging parameter setting
Ri image component
si zoom setting
ti point in time
Δt time span
μP microprocessor, controller
τi exposure time

Claims

1. Method of manufacturing a picture (P) of a scene (99) using an image taking apparatus (1) comprising an image taking element (60) and an image-forming optical system (20) comprising at least one lens (21), said method comprising the steps of
a) obtaining a set of image components (Ri), wherein said set of image components (Ri) comprises information on said scene (99) in full;
b) carrying out a pre-defined variation of at least one imaging parameter during step a);
wherein step a) comprises the steps of
a1) obtaining the image components (Ri) of said set one after the other; and
a2) capturing a substantially different fraction of said scene (99) in each of said image components (Ri);
wherein step b) comprises the step of
b1) carrying out a relative movement of said at least one lens (21) with respect to said image taking element (60) during step a), wherein said at least one imaging parameter is - for each of said image components (Ri) - substantially constant during obtaining the respective image component (Ri).
2. Method according to claim 1, comprising the step of d) deliberately creating out-of-focus portions of said scene (99) in said picture (P) by carrying out step b) .
3. Method according to claim 1 or claim 2, wherein step a) comprises the step of a3) exposing said image taking element (60) with light from said scene (99) in full by successively exposing different portions of said image taking element (60) with light from different parts of said scene (99).
4. Method according to claim 3, comprising the step of a31) using a sensing area definition element (58) arranged within said image taking apparatus (1) between said scene (99) and said image taking element (60) for defining said different portions of said image taking element (60) to be exposed.
5. Method according to claim 4, wherein said sensing area definition element (58) is capable of forming an opening (580), and wherein said step a31) comprises the step of moving said opening (580) of said sensing area definition element (58) for defining said different portions of said image taking element (60) to be exposed.
6. Method according to claim 1 or claim 2, wherein step a) comprises the step of a4) successively obtaining the image components (Ri) of said set by multiply exposing said image taking element (60), each time with light from a different part of said scene (99).
7. Method according to claim 6, wherein step a4) comprises the step of a41) moving said image taking element (60) for accomplishing said exposing of said image taking element (60) with light from said different parts of said scene (99).
8. Method according to one of the preceding claims, comprising the step of c1) adjusting at least one control parameter on which said pre-defined variation of said at least one imaging parameter depends and which does not define an angle between an axis defined by said image taking element (60) and an axis defined by an object to be imaged.
9. Method according to one of the preceding claims, comprising the step of c2) adjusting at least one control parameter on which said pre-defined variation of said at least one imaging parameter depends and which does not describe an alignment of said image taking apparatus (1) relative to an object to be imaged.
10. Method according to one of the preceding claims, comprising the step of c3) adjusting at least one control parameter, on which said pre-defined variation of said at least one imaging parameter depends, in a way which is not predetermined by an alignment of an axis defined by said image taking element (60) relative to an axis defined by an object to be imaged.
11. Method according to one of the preceding claims, comprising the step of c4) adjusting at least one control parameter, on which said pre-defined variation of said at least one imaging parameter depends, in a way which is not predetermined by an alignment of said image taking apparatus (1) relative to said scene (99).
12. Method according to one of the preceding claims, wherein said at least one imaging parameter comprises a focus setting.
13. Method according to one of the preceding claims, wherein said at least one imaging parameter comprises a soft focus setting.
14. Method according to one of the preceding claims, wherein said at least one imaging parameter comprises a zoom setting.
15. Method according to one of the preceding claims, wherein for each of said image components (Ri) , the light used for obtaining the respective image component (Ri) travels within said image taking apparatus (1) along a light path (9) to said image taking element (60) without being reflected by a mirror.
16. Method according to one of the preceding claims, comprising the step of f) creating a bent focal surface by carrying out step b).
17. Method according to one of the preceding claims, wherein step b1) comprises the step of b11) moving said at least one lens (21) during step a).
18. Method according to one of the preceding claims, comprising the step of g) moving said image taking element (60) during step a).
19. Method according to one of the preceding claims, wherein said image taking apparatus (1) is a general-purpose photographic camera.
20. Use of the method according to one of claims 1 to 19 for intentionally causing said picture (P) to comprise portions which are deliberately out-of-focus.
21. Control module (4) for an image taking apparatus (1) comprising an image taking element (60) and an image- forming optical system (20) comprising at least one lens (21), wherein said control module (4) is adapted to enabling
— to automatically obtain a set of image components (Ri) , which set comprises information on a scene (99) in full, said automatically obtaining comprising successively obtaining the image components (Ri) of said set and capturing a substantially different fraction of said scene (99) in each of said image components (Ri) ; and
— to automatically vary settings (pi) of at least one imaging parameter in a pre-defined way during said automatically obtaining said set of image components (Ri) , said automatically varying settings comprising carrying out a relative movement of said at least one lens (21) with respect to said image taking element (60) during said obtaining said set of image components (Ri), wherein said at least one imaging parameter is - for each of said image components (Ri) - substantially constant during obtaining the respective image component (Ri) .
22. The control module (4) according to claim 21, wherein said control module (4) is adapted to enabling to automatically exposing said image taking element (60) with light from said scene (99) in full by successively exposing different portions of said image taking element (60) with light from different fractions of said scene (99) for said obtaining said set of image components (Ri).
23. The control module (4) according to claim 22, wherein said control module (4) is adapted to enabling to control a sensing area definition element (58) of said image taking apparatus (1) for defining said different portions of said image taking element (60) to be exposed.
24. The control module (4) according to claim 23, wherein said sensing area definition element (58) is capable of forming an opening (580), wherein said control module (4) is adapted to enabling to control said sensing area definition element (58) such that said opening (580) is moved with respect to said image taking element (60) for defining said different portions of said image taking element (60) to be exposed.
25. The control module (4) according to claim 24, wherein said pre-defined variation of said settings (pi) is dependent on at least one control parameter, which does not define an angle between an axis defined by said image taking element (60) and an axis defined by an object to be imaged.
25. The control module (4) according to one of claims 21 to 24, wherein said at least one imaging parameter comprises a focus setting.
26. The control module (4) according to one of claims 21 to 25, wherein said at least one imaging parameter comprises a soft focus setting.
27. The control module (4) according to one of claims 21 to 26, wherein said at least one imaging parameter comprises a zoom setting.
28. Control unit (40) for an image taking apparatus (1), which is comprised in and/or connectable to said image taking apparatus (1), comprising a control module (4) according to one of claims 21 to 27.
29. Image taking apparatus (1) comprising a control module (4) according to one of claims 21 to 27.
30. The apparatus (1) according to claim 29, which is free from mirrors, which are arranged, for at least one of said image components (Ri), along a light path (9) along which light used for obtaining said at least one image component (Ri) travels within said image taking apparatus (1) to said image taking element (60) during obtaining said at least one image component (Ri).
31. The apparatus (1) according to claim 29 or claim 30, wherein said image taking apparatus (1) is a general-purpose photographic camera.
32. The apparatus (1) according to one of claims 29 to 31, comprising at least one of the group comprising
— a first storage unit (10) for storing data representative of said set of image components (Ri);
— a second storage unit (10), which may be identical with or different from said first storage unit (10), for storing data representative of said pre-defined variation of said at least one imaging parameter;
— a third storage unit (10), which may be identical with or different from said first and/or second storage units (10), for storing data relating each of said image components (Ri) of said set to said settings (pi) of said at least one imaging parameter used during obtaining the respective image component (Ri).
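The control scheme of claims 21 and 32 can be sketched as follows. This is a minimal illustrative simulation only, not an implementation disclosed in the application: all function and variable names are hypothetical, a focus setting stands in for the claimed imaging parameter, and the captures are mocked. It shows the pre-defined variation of the settings (pi) across successively obtained image components (Ri), the parameter being held constant during each individual capture, and the storage relating each component to its setting.

```python
# Hypothetical sketch of the claimed control scheme. Names are
# illustrative; the application does not specify any API.

def predefined_focus_settings(n_components, start=0.0, stop=1.0):
    """Pre-defined variation of the imaging parameter: one setting
    p_i per image component R_i, varied in equal steps."""
    if n_components == 1:
        return [start]
    step = (stop - start) / (n_components - 1)
    return [start + i * step for i in range(n_components)]

def capture_component(fraction_index, focus_setting):
    """Stand-in for exposing one portion of the image taking element
    with light from one fraction of the scene. Returns a record of
    the captured component."""
    return {"fraction": fraction_index, "focus": focus_setting}

def obtain_image_components(n_components):
    """Successively obtain the set of image components, varying the
    imaging parameter between captures but not within one capture."""
    settings = predefined_focus_settings(n_components)
    components = []            # cf. first storage unit: components R_i
    component_to_setting = {}  # cf. third storage unit: R_i -> p_i
    for i, p_i in enumerate(settings):
        # The parameter p_i is constant while this component is obtained.
        r_i = capture_component(i, p_i)
        components.append(r_i)
        component_to_setting[i] = p_i
    return components, component_to_setting
```

Run with, e.g., `obtain_image_components(5)`: each of the five components captures a different scene fraction with its own focus setting, and the returned mapping records which setting was used for which component, as the claimed third storage unit would.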
PCT/EP2007/055338 2007-05-31 2007-05-31 Method of manufacturing a picture and image taking apparatus with enhanced imaging capabilities WO2007101887A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2007/055338 WO2007101887A2 (en) 2007-05-31 2007-05-31 Method of manufacturing a picture and image taking apparatus with enhanced imaging capabilities

Publications (2)

Publication Number Publication Date
WO2007101887A2 true WO2007101887A2 (en) 2007-09-13
WO2007101887A3 WO2007101887A3 (en) 2008-04-03

Family

ID=38475219

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2007/055338 WO2007101887A2 (en) 2007-05-31 2007-05-31 Method of manufacturing a picture and image taking apparatus with enhanced imaging capabilities

Country Status (1)

Country Link
WO (1) WO2007101887A2 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5907353A (en) * 1995-03-28 1999-05-25 Canon Kabushiki Kaisha Determining a dividing number of areas into which an object image is to be divided based on information associated with the object
US20020054217A1 (en) * 2000-11-07 2002-05-09 Minolta Co., Ltd. Method for connecting split images and image shooting apparatus
US6535250B1 (en) * 1997-06-12 2003-03-18 Minolta Co., Ltd. Image pickup apparatus
JP2005039680A (en) * 2003-07-18 2005-02-10 Casio Comput Co Ltd Camera device, photographing method and program
EP1553521A1 (en) * 2002-10-15 2005-07-13 Seiko Epson Corporation Panorama synthesis processing of a plurality of image data
US20050178950A1 (en) * 2004-02-18 2005-08-18 Fujinon Corporation Electronic imaging system

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2059024A3 (en) * 2007-10-29 2009-05-20 Ricoh Company, Ltd. Image processing device, image processing method, and computer program product
CN101426093B (en) * 2007-10-29 2011-11-16 株式会社理光 Image processing device, image processing method
CN114793269A (en) * 2022-03-25 2022-07-26 岚图汽车科技有限公司 Control method of camera and related equipment

Also Published As

Publication number Publication date
WO2007101887A3 (en) 2008-04-03

Similar Documents

Publication Publication Date Title
US8432481B2 (en) Image sensing apparatus that controls start timing of charge accumulation and control method thereof
US9591246B2 (en) Image pickup apparatus with blur correcting function
JP5901246B2 (en) Imaging device
JP5645846B2 (en) Focus adjustment device and focus adjustment method
JP5254904B2 (en) Imaging apparatus and method
US8063944B2 (en) Imaging apparatus
KR101280248B1 (en) Camera and camera system
JP5168798B2 (en) Focus adjustment device and imaging device
US8854533B2 (en) Image capture apparatus and control method therefor
JP5676962B2 (en) Focus detection apparatus and imaging apparatus
JP4948266B2 (en) Imaging apparatus and control method thereof
WO2007148169A1 (en) Method and system for image stabilization
JP2011081271A (en) Image capturing apparatus
JP2001042207A (en) Electronic camera
JP6997295B2 (en) Imaging device, imaging method, and program
WO2007101887A2 (en) Method of manufacturing a picture and image taking apparatus with enhanced imaging capabilities
JP2014130231A (en) Imaging apparatus, method for controlling the same, and control program
CN101505370B (en) Imaging apparatus
JPH11223761A (en) Camera with focus detector
JP4135202B2 (en) Focus detection device and camera
JP6477275B2 (en) Imaging device
JP2016057402A (en) Imaging device and method for controlling the same
JP2018042145A (en) Imaging apparatus and control method of the same, program, and storage mediums
JP2011176457A (en) Electronic camera
JP4106725B2 (en) Focus detection device and camera with focus detection device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 07729740

Country of ref document: EP

Kind code of ref document: A2