WO2005076217A2 - Optimized red-eye filter method and apparatus involving subsample representations of selected image regions


Info

Publication number
WO2005076217A2
Authority
WO
WIPO (PCT)
Prior art keywords
image
red
eye
subsample
analysis
Application number
PCT/EP2005/001171
Other languages
French (fr)
Other versions
WO2005076217A3 (en)
WO2005076217A9 (en)
Inventor
Alexandru Drimbarean
Petronel Bigioi
Alexei Pososin
Eran Steinberg
Yury Prilutsky
Peter Corcoran
Original Assignee
Fotonation Vision Limited
Application filed by Fotonation Vision Limited filed Critical Fotonation Vision Limited
Priority to EP05707215A priority Critical patent/EP1714252A2/en
Priority to JP2006551816A priority patent/JP4966021B2/en
Publication of WO2005076217A2 publication Critical patent/WO2005076217A2/en
Publication of WO2005076217A9 publication Critical patent/WO2005076217A9/en
Publication of WO2005076217A3 publication Critical patent/WO2005076217A3/en


Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00Special procedures for taking photographs; Apparatus therefor
    • G03B15/02Illuminating scene
    • G03B15/03Combinations of cameras with lighting apparatus; Flash units
    • G03B15/05Combinations of cameras with electronic flash apparatus; Electronic flash units
    • G06T5/77
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/90Determination of colour characteristics
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • G06V40/193Preprocessing; Feature extraction
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/46Colour picture communication systems
    • H04N1/56Processing of colour picture signals
    • H04N1/60Colour correction or control
    • H04N1/62Retouching, i.e. modification of isolated colours only or in isolated picture areas only
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/46Colour picture communication systems
    • H04N1/56Processing of colour picture signals
    • H04N1/60Colour correction or control
    • H04N1/62Retouching, i.e. modification of isolated colours only or in isolated picture areas only
    • H04N1/624Red-eye correction
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/74Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B2215/00Special procedures for taking photographs; Apparatus therefor
    • G03B2215/05Combinations of cameras with electronic flash units
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30216Redeye defect
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/178Human faces, e.g. facial parts, sketches or expressions estimating age from face image; using age information for improving recognition

Definitions

  • the invention relates generally to the area of flash photography, and more specifically to filtering "red-eye" from a digital camera image.
  • Red-eye is a phenomenon in flash photography where a flash is reflected within a subject's eye and appears in a photograph as a red dot where the black pupil of the subject's eye would normally appear.
  • the unnatural glowing red of an eye is due to internal reflections from the vascular membrane behind the retina, which is rich in blood vessels.
  • This objectionable phenomenon is well understood to be caused in part by a small angle between the flash of the camera and the lens of the camera. This angle has decreased with the miniaturization of cameras with integral flash capabilities.
  • Additional contributors include the relative closeness of the subject to the camera and ambient light levels.
  • the red-eye phenomenon can be minimized by causing the iris to reduce the opening of the pupil. This is typically done with a "pre-flash", a flash or illumination of light shortly before a flash photograph is taken. This causes the iris to close.
  • the pre-flash is an objectionable 0.2 to 0.6 seconds prior to the flash photograph. This delay is readily discernible and easily within the reaction time of a human subject. Consequently the subject may believe the pre-flash is the actual photograph and be in a less than desirable position at the time of the actual photograph. Alternately, the subject must be informed of the pre-flash, typically losing any spontaneity of the subject captured in the photograph.
  • Digital cameras eliminate the need for film as the image is digitally captured and stored in a memory array for display on a display screen on the camera itself. This allows photographs to be viewed and enjoyed virtually instantaneously as opposed to waiting for film processing. Furthermore, the digitally captured image may be downloaded to another display device such as a personal computer or color printer for further enhanced viewing. Digital cameras include microprocessors for image processing and compression and camera systems control. Nevertheless, without a pre-flash, both digital and film cameras can capture the red-eye phenomenon as the flash reflects within a subject's eye. Thus, what is needed is a method of eliminating red-eye phenomenon within a miniature digital camera having a flash without the distraction of a pre-flash.
  • a digital apparatus is provided with a red-eye filter for modifying an area within a digitized image indicative of a red-eye phenomenon based on an analysis of a subsample representation of selected regions of the digitized image.
  • the analysis may be performed at least in part for determining the area, and/or may be performed at least in part for determining the modifying.
  • the selected regions of the digitized image may include the entire image or one or more regions may be excluded.
  • the selected regions may include multi resolution encoding of the image.
  • the analysis may be performed in part on a full resolution image and in part on a subsample resolution of the digital image.
  • the apparatus may include a module for changing the degree of said subsampling.
  • This changing of the degree of the subsampling may be determined empirically, and/or based on a size of the image or selected regions thereof, and/or based on data obtained from the camera relating to the settings of the camera at the time of image capture.
  • the data obtained from the camera may include an aperture setting, focus of the camera, distance of the subject from the camera, or a combination of these.
  • the changing of the degree of the subsampling may also be determined based on digitized image metadata information and/or a complexity of calculation for the red eye filter.
  • the modifying of the area may be performed including the full resolution of the digital image.
  • the red-eye filter may include multiple sub filters.
  • the subsampling for the sub-filters operating on selected regions of the image may be determined by one or more of the image size, the size of a region suspected as red eye, filter computation complexity, the empirical success rate of said sub-filter, the empirical false detection rate of said sub-filter, the falsing probability of said sub-filter, relations between said regions suspected as red eye, or results of previous analysis of other said sub-filters.
  • the apparatus may include a memory for saving the digitized image after applying the filter for modifying pixels as a modified image, and/or a memory for saving the subsample representation of the image.
  • the subsample representation of selected regions of the image may be determined in hardware. The analysis may be performed in part on the full resolution image and in part on a subsample resolution of the image.
  • a digital apparatus includes an image store and a red eye filter.
  • the image store is for holding a temporary copy of an unprocessed image known as a pre-capture image, a permanent copy of a digitally processed, captured image, and a subsample representation of selected regions of at least one of the images, e.g., the pre-capture image.
  • the red-eye filter is for modifying an area within at least one of the images indicative of a red-eye phenomenon based on an analysis of the subsample representation.
  • the at least one of the images includes the digitally processed, captured image.
  • This further aspect may also include one or more features in accordance with the first aspect.
  • the changing the degree of the subsampling may be determined based on data obtained from the camera relating to image processing analysis of said precapture images.
  • the image processing analysis may be based on histogram data or color correlogram data, or both, obtained from the pre-capture image.
  • the image processing analysis may also be based on global luminance or white balance image data, or both, obtained from the pre-capture image.
  • the image processing analysis may also be based on a face detection analysis of the pre-capture image, or on determining pixel regions with a color characteristic indicative of redeye, or both.
  • the image processing analysis may be performed in hardware.
  • the changing of the degree of the subsampling may be determined based on image metadata information.
  • a method of filtering a red eye phenomenon from a digitized image is also provided in accordance with another aspect, wherein the image includes a multiplicity of pixels indicative of color.
  • the method includes determining whether one or more regions within a subsample representation of the digitized image are suspected as including red eye artifact.
  • the method may include varying a degree of the subsample representation for each region of the one or more regions based on the image, and/or generating a subsample representation based on the image.
  • the subsample representation may be generated or the degree varied, or both, utilizing a hardware-implemented subsampling engine.
  • the method may further include associating the one or more regions within the subsample representation of the image with one or more corresponding regions within the digitized image, and modifying the one or more corresponding regions within the digitized image.
  • the determining may include analyzing meta-data information including image acquisition device-specific information.
  • the method may include analyzing the subsample representation of selected regions of the digitized image, and modifying an area determined to include red eye artifact. The analysis may be performed at least in part for determining said area and/or the modifying.
  • the selected regions of the digitized image may include the entire image or may exclude one or more regions.
  • the selected regions of the digitized image may include multi resolution encoding of the image.
  • the analyzing may be performed in part on a full resolution image and in part on a subsample resolution of said image.
  • the method may include changing the degree of the subsampling. This changing of the degree of subsampling may be determined empirically, and/or based on a size of the image or selected regions thereof.
  • the method may include saving the digitized image after applying the filter for modifying pixels as a modified image, and/or saving said subsample representation of the image.
  • the method may include determining the subsample representation of the image in hardware, and/or using a spline or bi-cubic interpolation.
  • the modifying of the area may be performed including the full resolution of the image.
  • the method may include determining the subsample representation utilizing a plurality of sub-filters.
  • the determining of the plurality of sub-filters may be based on one or more of the image size, a suspected red eye region size, filter computation complexity, empirical success rate of said sub-filter, empirical false detection rate of said sub-filter, falsing probability of said sub-filter, relations between said suspected red eye regions, or results of previous analysis of one or more other sub-filters.
  • FIG. 1 shows a block diagram of a camera apparatus operating in accordance with the present invention.
  • FIG. 2 shows a pixel grid upon which an image of an eye is focused.
  • FIG. 3 shows pixel coordinates of the pupil of FIG. 2.
  • FIG. 4 shows pixel coordinates of the iris of FIG. 2.
  • FIG. 5 shows pixel coordinates which contain a combination of iris and pupil colors of FIG. 2.
  • FIG. 6 shows pixel coordinates of the white eye area of FIG. 2.
  • FIG. 7 shows pixel coordinates of the eyebrow area of FIG. 2.
  • FIG. 8 shows a flow chart of a method operating in accordance with the present invention.
  • FIG. 9 shows a flow chart for testing if conditions indicate the possibility of a red-eye phenomenon photograph.
  • FIG. 10 shows a flow chart for testing if conditions indicate a false red-eye grouping.
  • FIG. 11 illustrates in block form an exemplary arrangement in accordance with a precapture image utilization aspect.
  • FIG. 1 shows a block diagram of a camera apparatus operating in accordance with the present invention.
  • the camera 20 includes an exposure control 30 that, in response to a user input, initiates and controls the digital photographic process.
  • Ambient light is determined using light sensor 40 in order to automatically determine if a flash is to be used.
  • the distance to the subject is determined using focusing means 50 which also focuses the image on image capture means 60.
  • the image capture means digitally records the image in color.
  • the image capture means is known to those familiar with the art and may include a CCD (charge coupled device) to facilitate digital recording.
  • exposure control means 30 causes the flash means 70 to generate a photographic flash in substantial coincidence with the recording of the image by image capture means 60.
  • the flash may be selectively generated either in response to the light sensor 40 or a manual input from the user of the camera.
  • the image recorded by image capture means 60 is stored in image store means 80 which may comprise computer memory such as dynamic random access memory or a nonvolatile memory.
  • the red-eye filter 90 then analyzes the stored image for characteristics of red-eye, and if found, modifies the image and removes the red-eye phenomenon from the photograph as will be described in more detail.
  • the red-eye filter includes a pixel locator 92 for locating pixels having a color indicative of red-eye; a shape analyzer 94 for determining if a grouping of at least a portion of the pixels located by the pixel locator comprise a shape indicative of red-eye; a pixel modifier 96 for modifying the color of pixels within the grouping; and a falsing analyzer 98 for further processing the image around the grouping for details indicative of an image of an eye.
  • the modified image may be either displayed on image display 100 or downloaded to another display device, such as a personal computer or printer via image output means 110. It can be appreciated that many of the processes implemented in the digital camera may be implemented in or controlled by software operating in a microcomputer (μC) or digital signal processor (DSP) and/or an application specific integrated circuit (ASIC).
  • the image capture means 60 of FIG. 1 includes an optional image subsampling means, wherein the image is actively down-sampled.
  • the subsampling is done using a bi-cubic spline algorithm, such as those that are known to one familiar in the art of signal and image processing.
  • Those familiar with this art are aware of subsampling algorithms that interpolate and preserve pixel relationships as best they can given the limitation that less data is available.
  • the subsampling stage is performed to maintain significant data while minimizing the image size and thus the amount of pixel-wise calculations involved, which are generally costly operations.
  • a subsample representation may include a multi resolution presentation of the image, as well as a representation in which the sampling rate is not constant for the entire image. For example, areas suspected as indicative of red eye may have different resolution, most likely higher resolution, than areas positively determined not to include red eye.
  • the subsampling means utilizes hardware based subsampling wherein the processing unit of the digital imaging appliance incorporates a dedicated subsampling engine providing the advantage of a very fast execution of a subsampling operation.
  • a dedicated subsampling engine may be based on a state-of-the-art digital imaging appliance incorporating hardware that facilitates the rapid generation of image thumbnails. The decision to subsample the image is, in part, dependent on the size of the original image. If the user has selected a low resolution image format, there may be little gain in performance of redeye detection and false avoidance steps.
  • the inclusion of a subsampling means, or step or operation is optional.
  • the red eye detection filter of the preferred embodiment may comprise a selection of sub filters that may be calculated in succession or in parallel.
  • the sub-filters may operate on only a selected region, or a suspected region. Such regions are substantially smaller than the entire image.
  • the decision to subsample the image is, in part, dependent on one or a combination of a few factors such as the size of the suspected region, the success or failure of previous or parallel filters, the distance between the regions and the complexity of the computation of the sub filter. Many of the parameters involved in deciding whether or not to subsample a region, and to what degree, may also be determined by an empirical process of optimization between success rate, failure rate and computation time.
  • both the original and subsampled images are preferably stored in the image store 80 of FIG. 1.
  • the subsampled image is now available to be used by the redeye detector 90 and the false avoidance analyzer 98 of FIG. 1.
  • the image capture means 60 of Fig. 1 and the image subsampling means incorporated within step 60 may occur on separate imaging appliances as we shall next describe.
  • data relating to the status of system components 30, 40, 50 and 70 of FIG. 1 at the moment of image capture may be stored within the header of a digital image, using standardized metadata tags.
  • the subsampling means and the redeye filtering means may occur at a time after the image capture and, as all the necessary data which would have been available at that time can be stored as metadata within the digital image, the actual subsampling process and redeye filtering process may occur on a physically separate imaging appliance.
  • an image captured by a digital camera may be later subsampled and subject to redeye filtering after being transferred to a printer, and prior to the actual printing of the image.
  • the system and method of the preferred embodiment involves the detection and removal of red eye artifacts.
  • the actual removal of the red eye will eventually be performed on the full resolution image.
  • all or portions of the detection of redeye candidate pixel groupings, the subsequent testing of said pixel groupings for determining false redeye groupings, and the initial step of the removal, where the image is presented to the user for user confirmation of the correction can be performed on the entire image, the subsampled image, or a subset of regions of the entire image or the subsampled image. There is generally a tradeoff between speed and accuracy.
  • the detection, and subsequent false-determining may be performed selectively, e.g., sometimes on full resolution regions that are suspected as red-eye, and sometimes on a subsampled resolution.
  • the search step 200 of FIG. 8 comprises, in a practical embodiment, a number of successively applied color filters based on iterative refinements of an initial pixel by pixel search of the captured image. In addition to searching for a red color, it is preferably determined whether the luminance, or brightness of a redeye region, lies within a suitable range of values. Further, the local spatial distribution of color and luminance are relevant factors in the initial search for redeye pixel groupings.
  • each subsequent filter is preferably only applied locally to pixels in close proximity to a grouping of potential redeye pixels, it can equally well be applied to the corresponding region in the full-sized image.
  • filters which may be employed in the false-determining analyzer 98.
  • non-color based false-determining analysis filters include those which consider the localized contrast, saturation or texture distributions in the vicinity of a potential redeye pixel grouping, those that perform localized edge or shape detection and more sophisticated filters which statistically combine the results of a number of simple local filters to enhance the accuracy of the resulting false-determining analysis.
  • filters that operate on larger portions of the images will utilize a subsampled version, while the more sensitive and delicate filters may be applied to the corresponding region of the full resolution image. It is preferred that in the case of full resolution only small portions of the image will be used for such filters. As a non exhaustive example, filters that look for a distinction between lips and eyes may utilize a full resolution portion, while filters that distinguish between background colors may use a subsample of the image. Furthermore, several different sizes of subsampled images may be generated and employed selectively to suit the sensitivity of the different pixel locating and false determining filters.
  • the decision whether the filter should use a subsampled representation, and the rate of the downsampling, may be determined empirically by a-priori statistically comparing the success rate vs. mis-detection rate of a filter with the subsampling rate and technique of known images. It is further worth noting that the empirical determination will often be specific to a particular camera model. Thus, the decision to use the full sized image or the subsampled image data, for a particular pixel locating or false determining filter, may be empirically determined for each camera.
  • a pre-acquisition or precapture image may be effectively utilized in an embodiment of the invention.
  • Another type of subsampled representation of the image may be one that differs temporally from the captured image, in addition or alternative to the spatial differentiation with other aforementioned algorithms such as spline and bi-cubic.
  • the subsample representation of the image may be an image captured before the final image is captured, and preferably just before.
  • a camera may provide a digital preview of the image, which may be a continuous subsample version of the image.
  • Such pre-capture may be used by the camera and the camera user, for example, to establish correct exposure, focus and/or composition.
  • the precapture image process may involve an additional step of conversion from the sensor domain, also referred to as raw-ccd, to a known color space that the red eye filter is using for calculations.
  • an additional step of alignment may be used in the case that the final image and the pre-capture differ, such as in camera or object movement.
  • the pre-acquisition image may be normally processed directly from an image sensor without loading it into camera memory.
  • a dedicated hardware subsystem is implemented to perform pre-acquisition image processing.
  • the pre-acquisition image processing may satisfy some predetermined criteria, which then triggers the loading of raw image data from the buffer of the imaging sensor into the main system memory, together with report data, possibly stored as metadata, on the predetermined criteria.
  • One such predetermined criterion is the existence of red areas within the pre-acquisition image prior to the activation of the camera flash module. Report data on such red areas can be passed to the redeye filter to eliminate such areas from the redeye detection process (an illustrative sketch of this pre-acquisition loop appears after this list).
  • If the test criteria are not satisfied, the pre-acquisition image processing module can loop to obtain a new pre-acquisition test image from the imaging sensor. This looping may continue until either the test criteria are satisfied or a system time-out occurs. Note further that the pre-acquisition image processing step is significantly faster than the subsequent image processing chain of operations because the image data is taken directly from the sensor buffers and processed by the dedicated hardware subsystem.
  • the raw image data may be then properly loaded into main system memory to allow image processing operations to convert the raw sensor data into a final pixelated image.
  • Typical steps may include converting Bayer or RGGB image data to YCC or RGB pixelated image data, calculation and adjustment of image white balance, calculation and adjustment of image color range, and calculation and adjustment of image luminance, potentially among others.
  • the final, full-size image may be available in system memory, and may then be copied to the image store for further processing by the redeye filter subsystem.
  • a camera may incorporate dedicated hardware to do global luminance and/or color/grayscale histogram calculations on the raw and/or final image data.
  • One or more windows within the image may be selected for doing "local" calculations, for example.
  • valuable data may be obtained using a "first pass" or pre-acquisition image before committing to a main image processing approach which generates a more final picture.
  • a subsampled image, in addition to the precapture and more finalized images, may be generated in parallel with the final image by a main image processing toolchain. Such processing may be preferably performed within the image capture module 60 of FIG. 1.
  • An exemplary process may include the following operations. First, a raw image may be acquired or pre-captured. This raw image may be processed prior to storage. This processing may generate some report data based on some predetermined test criteria. If the criteria are not met, the pre-acquisition image processing operation may obtain a second, and perhaps one or more additional, pre-acquisition images from the imaging sensor buffer until such test criteria are satisfied.
  • a full-sized raw image may be loaded into system memory and the full image processing chain may be applied to the image.
  • a final image and a subsample image may then ultimately preferably be generated.
  • FIG. 11 illustrates in block form a further exemplary arrangement in accordance with a precapture image utilization aspect.
  • the "raw" image is loaded from the sensor into the image capture module.
  • in the image capture module, after converting the image from its raw format (e.g., Bayer RGGB) into a more standardized pixel format such as YCC or RGB, it may then be subject to a post-capture image processing chain which eventually generates a full-sized final image and one or more subsampled copies of the original. These may be preferably passed to the image store, and the red-eye filter is preferably then applied.
  • the image capture and image store functional blocks of FIG. 11 correspond to blocks 60 and 80 illustrated at FIG. 1.
  • FIG. 2 shows a pixel grid upon which an image of an eye is focused.
  • the digital camera records an image comprising a grid of pixels at least 640 by 480.
  • FIG. 2 shows a 24 by 12 pixel portion of the larger grid labeled columns A-X and rows 1-12 respectively.
  • FIG. 3 shows pixel coordinates of the pupil of FIG. 2.
  • the pupil is the darkened circular portion and substantially includes seventeen pixels: K7, K8, L6, L7, L8, L9, M5, M6, M7, M8, M9, N6, N7, N8, N9, O7 and O8, as indicated by shaded squares at the aforementioned coordinates. In a non-flash photograph, these pupil pixels would be substantially black in color.
  • FIG. 4 shows pixel coordinates of the iris of FIG. 2.
  • the iris pixels are substantially adjacent to the pupil pixels of FIG. 2. Iris pixels J5, J6, J7, J8, J9, K5, K10, L10, M10, N10, O5, O10, P5, P6, P7, P8 and P9 are indicated by shaded squares at the aforementioned coordinates.
  • the iris pixels substantially surround the pupil pixels and may be used as further indicia of a pupil.
  • the iris pixels will have a substantially constant color. However, the color will vary as the natural color of the eyes of each individual subject varies.
  • the existence of iris pixels depends upon the size of the iris at the time of the photograph; if the pupil is very large, iris pixels may not be present.
  • FIG. 5 shows pixel coordinates which include a combination of iris and pupil colors of FIG. 2.
  • the pupil/iris pixels are located at K6, K9, L5, N5, O6, and O9, as indicated by shaded squares at the aforementioned coordinates.
  • the pupil/iris pixels are adjacent to the pupil pixels, and also adjacent to any iris pixels which may be present.
  • Pupil/iris pixels may also contain colors of other areas of the subject's eyes including skin tones and white areas of the eye.
  • FIG. 6 shows pixel coordinates of the white eye area of FIG. 2. The seventy one pixels are indicated by the shaded squares of FIG. 6 and are substantially white in color and are in the vicinity of and substantially surround the pupil pixels of FIG. 2.
  • FIG. 7 shows pixel coordinates of the eyebrow area of FIG. 2. The pixels are indicated by the shaded squares of FIG. 7 and are substantially dark in color.
  • the eyebrow pixels substantially form a continuous line in the vicinity of the pupil pixels. The color of the line will vary as the natural color of the eyebrow of each individual subject varies. Furthermore, some subjects may have no visible eyebrow at all. It should be appreciated that the representations of FIG. 2 through FIG. 7 are illustrative.
  • the red-eye filter 90 of FIG. 1 searches the digitally stored image for pixels having a substantially red color, then determines if the grouping has round or oval characteristics, similar to the pixels of FIG. 3. If found, the color of the grouping is modified. In the preferred embodiment, the color is modified to black. Searching for a circular or oval grouping helps eliminate falsely modifying red pixels which are not due to the red-eye phenomenon. In the example of FIG. 2, the red-eye phenomenon is found in a 5×5 grouping of pixels of FIG. 3.
  • the grouping may contain substantially more or less pixels depending upon the actual number of pixels comprising the image of an eye, but the color and shape of the grouping will be similar. Thus for example, a long line of red pixels will not be falsely modified because the shape is not substantially round or oval. Additional tests may be used to avoid falsely modifying a round group of pixels having a color indicative of the red-eye phenomenon by further analysis of the pixels in the vicinity of the grouping.
  • the radius is large enough to analyze enough pixels to avoid falsing, yet small enough to exclude the other eye of the subject, which may also have the red-eye phenomenon.
  • the radius is preferably in a range between two and five times the radius of the grouping. Other indicia of the recording may be used to validate the existence of red-eye including identification of iris pixels of FIG. 4 which surround the pupil pixels.
  • the iris pixels will have a substantially common color, but the size and color of the iris will vary from subject to subject.
  • the white area of the eye may be identified as a grouping of substantially white pixels in the vicinity of and substantially surrounding the pupil pixels as shown in FIG. 6.
  • the location of the pupil within the opening of the eyelids is variable depending upon the orientation of the head of the subject at the time of the photograph. Consequently, identification of a number of substantially white pixels in the vicinity of the iris without a requirement of surrounding the grouping will further validate the identification of the red-eye phenomenon and prevent false modification of other red pixel groupings.
  • the number of substantially white pixels is preferably between two and twenty times the number of pixels in the pupil grouping.
  • the eyebrow pixels of FIG. 7 can be identified.
  • additional criterion can be used to avoid falsely modifying a grouping of red pixels.
  • the criteria include determining if the photographic conditions were indicative of the red-eye phenomenon. These include conditions known in the art including use of a flash, ambient light levels and distance of the subject. If the conditions indicate the red-eye phenomenon is not present, then red-eye filter 90 is not engaged.
  • FIG. 5 shows combination pupil/iris pixels which have color components of the red-eye phenomenon combined with color components of the iris or even the white area of the eye.
  • the invention modifies these pixels by separating the color components associated with red-eye, modifying the color of the separated color components and then adding the modified color back to the pixel. Preferably the modified color is black.
  • FIG. 8 shows a flow chart of a method operating in accordance with the present invention.
  • the red-eye filter process is in addition to other processes known to those skilled in the art which operate within the camera. These other processes include flash control, focus, and image recording, storage and display.
  • the red-eye filter process preferably operates within software within a μC or DSP and processes an image stored in image store 80.
  • the red-eye filter process is entered at step 200.
  • In step 210, conditions are checked for the possibility of the red-eye phenomenon. These conditions are included in signals from exposure control means 30 which are communicated directly to the red-eye filter. Alternatively the exposure control means may store the signals along with the digital image in image store 80. If conditions do not indicate the possibility of red-eye at step 210, then the process exits at step 215. Step 210 is further detailed in FIG. 9, and is an optional step which may be bypassed in an alternate embodiment. Then in step 220 the digital image is searched for pixels having a color indicative of red-eye. The groupings of red-eye pixels are then analyzed at step 230. Red-eye is determined if the shape of a grouping is indicative of the red-eye phenomenon.
  • This step also accounts for multiple redeye groupings in response to a subject having two red-eyes, or multiple subjects having red-eyes. If no groupings indicative of red-eye are found, then the process exits at step 215. Otherwise, false red-eye groupings are checked at optional step 240. Step 240 is further detailed in FIG. 10 and prevents the red-eye filter from falsely modifying red pixel groupings which do not have further indicia of the eye of a subject. After eliminating false groupings, if no groupings remain, the process exits at step 215. Otherwise step 250 modifies the color of the groupings which pass step 240, preferably substituting the color black for the color red within the grouping.
  • the pixels surrounding a red-eye grouping are analyzed for a red component. These are equivalent to the pixels of FIG. 5.
  • the red component is substituted for black by the red-eye filter.
  • the process then exits at step 215.
  • the pixel color modification can be stored directly in the image store by replacing red-eye pixels with pixels modified by the red-eye filter.
  • the modified pixels can be stored as an overlay in the image store, thereby preserving the recorded image and only modifying the image when displayed in image display 100.
  • the filtered image is communicated through image output means 110.
  • the unfiltered image with the overlay may be communicated through image output means 110 to an external device such as a personal computer capable of processing such information.
  • In the flow chart of FIG. 9, step 310 checks if a flash was used in the photograph. If not, step 315 indicates that red-eye is not possible. Otherwise optional step 320 checks if a low level of ambient light was present at the time of the photograph. If not, step 315 indicates that red-eye is not possible. Otherwise optional step 330 checks if the subject is relatively close to the camera at the time of the photograph. If not, step 315 indicates that red-eye is not possible. Otherwise step 340 indicates that red-eye is possible.
  • FIG. 10 shows a flow chart for testing if conditions indicate a false red-eye grouping corresponding to step 240 of FIG. 8.
  • Step 410 checks if other red-eye pixels are found within a radius of a grouping. Preferably the radius is between two and five times the radius of the grouping. If found, step 415 indicates a false red-eye grouping. Otherwise step 420 checks if a substantially white area of pixels is found in the vicinity of the grouping. This area is indicative of the white area of a subject's eye and preferably has between two and twenty times the number of pixels in the grouping. If not found, step 415 indicates a false red-eye grouping. Otherwise step 430 searches the vicinity of the grouping for an iris ring or an eyebrow line. If not found, step 415 indicates a false red-eye grouping. Otherwise step 440 indicates the red-eye grouping is not false (an illustrative sketch of these falsing tests appears after this list).
  • each of the tests 410, 420 and 430 check for a false red-eye grouping.
  • other tests may be used to prevent false modification of the image, or the tests of FIG. 10 may be used either alone or in combination.
  • either the red-eye condition test 210 or the red-eye falsing test 240 of FIG. 8 may be used to achieve satisfactory results.
  • test 240 may be acceptable enough to eliminate test 210, or vice versa.
  • the selectivity of either the color and/or grouping analysis of the red-eye phenomenon may be sufficient to eliminate both tests 210 and 240 of FIG. 8.
  • the color red as used herein means the range of colors and hues and brightnesses indicative of the red-eye phenomenon.
  • the color white as used herein means the range of colors and hues and brightnesses indicative of the white area of the human eye.
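For illustration only, the following sketch shows one way the pre-acquisition test loop and red-area report described in the notes above might be expressed in software. It is a minimal sketch, assuming a Python/NumPy/SciPy environment; grab_preacquisition_frame() stands in for the camera's sensor-buffer read, and the colour thresholds and time-out value are illustrative assumptions rather than details taken from the disclosure.

```python
# Hedged sketch of the pre-acquisition loop: test frames are read from the
# sensor buffer until the predetermined criteria are satisfied or a time-out
# occurs, and a report on pre-flash red areas is kept so that the red-eye
# filter can later exclude those areas.  grab_preacquisition_frame() is an
# assumed stand-in for the camera's sensor-buffer read.
import time
import numpy as np
from scipy import ndimage

def preflash_red_regions(frame: np.ndarray) -> list:
    """Bounding boxes of red areas already present before the flash fires."""
    r, g, b = (frame[..., c].astype(int) for c in range(3))
    labels, _ = ndimage.label((r > 150) & (r > g + 40) & (r > b + 40))
    return [(s[0].start, s[1].start, s[0].stop, s[1].stop)
            for s in ndimage.find_objects(labels)]

def preacquisition_loop(grab_preacquisition_frame, timeout_s: float = 0.5):
    """Loop until the (illustrative) test criterion is met or the time-out expires."""
    deadline = time.monotonic() + timeout_s
    while True:
        frame = grab_preacquisition_frame()
        report = {"preflash_red_regions": preflash_red_regions(frame)}
        # Illustrative criterion: the test frame is not grossly under-exposed.
        if frame.mean() > 16 or time.monotonic() >= deadline:
            return frame, report   # the report may be stored as image metadata
```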
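Similarly, a minimal sketch of the falsing tests of FIG. 10 follows. The two-to-five grouping-radius range and the two-to-twenty white-pixel-count range come from the description above; the colour thresholds and the simple geometry are illustrative assumptions, and the iris-ring and eyebrow-line tests (steps 430 and 440) are omitted for brevity.

```python
# Hedged sketch of the FIG. 10 falsing tests (steps 410 and 420); the colour
# thresholds are illustrative, while the 2x-5x radius range and the 2x-20x
# white-pixel-count range follow the description above.
import numpy as np

def is_false_grouping(image: np.ndarray, mask: np.ndarray) -> bool:
    ys, xs = np.nonzero(mask)
    cy, cx = ys.mean(), xs.mean()
    radius = max(ys.ptp(), xs.ptp()) / 2.0 + 1.0

    yy, xx = np.mgrid[0:image.shape[0], 0:image.shape[1]]
    dist = np.hypot(yy - cy, xx - cx)

    r, g, b = (image[..., c].astype(int) for c in range(3))

    # Step 410: other red-eye pixels between two and five grouping radii away.
    ring = (dist > 2 * radius) & (dist < 5 * radius)
    red = (r > 150) & (r > g + 40) & (r > b + 40)
    if np.any(red & ring & ~mask):
        return True

    # Step 420: a white-of-the-eye area of 2x to 20x the grouping's pixel count.
    near = dist < 5 * radius
    white = (r > 200) & (g > 200) & (b > 200)
    n_white = int(np.count_nonzero(white & near & ~mask))
    n_group = int(np.count_nonzero(mask))
    if not (2 * n_group <= n_white <= 20 * n_group):
        return True

    return False   # steps 430/440 (iris ring, eyebrow line) omitted for brevity
```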

Abstract

A digital camera has an integral flash and stores and displays a digital image. Under certain conditions, a flash photograph taken with the camera may result in a red-eye phenomenon due to a reflection within an eye of a subject of the photograph. A digital apparatus has a red-eye filter which analyzes the stored image for the red-eye phenomenon and modifies the stored image to eliminate the red-eye phenomenon by changing the red area to black. The modification of the image is enabled when a photograph is taken under conditions indicative of the red-eye phenomenon. The modification is subject to anti-falsing analysis which further examines the area around the red-eye for indicia of the eye of the subject. The detection and correction can be optimized for performance and quality by operating on subsample versions of the image when appropriate.

Description

Optimized Performance for Red-eye Filter Method and Apparatus
FIELD OF THE INVENTION The invention relates generally to the area of flash photography, and more specifically to filtering "red-eye" from a digital camera image. BACKGROUND OF THE INVENTION "Red-eye" is a phenomenon in flash photography where a flash is reflected within a subject's eye and appears in a photograph as a red dot where the black pupil of the subject's eye would normally appear. The unnatural glowing red of an eye is due to internal reflections from the vascular membrane behind the retina, which is rich in blood vessels. This objectionable phenomenon is well understood to be caused in part by a small angle between the flash of the camera and the lens of the camera. This angle has decreased with the miniaturization of cameras with integral flash capabilities. Additional contributors include the relative closeness of the subject to the camera and ambient light levels. The red-eye phenomenon can be minimized by causing the iris to reduce the opening of the pupil. This is typically done with a "pre-flash", a flash or illumination of light shortly before a flash photograph is taken. This causes the iris to close. Unfortunately, the pre-flash is an objectionable 0.2 to 0.6 seconds prior to the flash photograph. This delay is readily discernible and easily within the reaction time of a human subject. Consequently the subject may believe the pre-flash is the actual photograph and be in a less than desirable position at the time of the actual photograph. Alternately, the subject must be informed of the pre-flash, typically losing any spontaneity of the subject captured in the photograph. Those familiar with the art have developed complex analysis processes operating within a camera prior to invoking a pre-flash. Various conditions are monitored prior to the photograph before the pre-flash is generated; the conditions include the ambient light level and the distance of the subject from the camera. Such a system is described in U.S. Pat. No. 5,070,355 to Inoue et al. Although that invention minimizes the occurrences where a pre-flash is used, it does not eliminate the need for a pre-flash. What is needed is a method of eliminating the red-eye phenomenon with a miniature camera having an integral flash without the distraction of a pre-flash. Digital cameras are becoming more popular and smaller in size. Digital cameras have several advantages over film cameras. Digital cameras eliminate the need for film as the image is digitally captured and stored in a memory array for display on a display screen on the camera itself. This allows photographs to be viewed and enjoyed virtually instantaneously as opposed to waiting for film processing. Furthermore, the digitally captured image may be downloaded to another display device such as a personal computer or color printer for further enhanced viewing. Digital cameras include microprocessors for image processing and compression and camera systems control. Nevertheless, without a pre-flash, both digital and film cameras can capture the red-eye phenomenon as the flash reflects within a subject's eye. Thus, what is needed is a method of eliminating red-eye phenomenon within a miniature digital camera having a flash without the distraction of a pre-flash.
BRIEF SUMMARY OF THE INVENTION A digital apparatus is provided with a red-eye filter for modifying an area within a digitized image indicative of a red-eye phenomenon based on an analysis of a subsample representation of selected regions of the digitized image. The analysis may be performed at least in part for determining the area, and/or may be performed at least in part for determining the modifying. The selected regions of the digitized image may include the entire image or one or more regions may be excluded. The selected regions may include multi resolution encoding of the image. The analysis may be performed in part on a full resolution image and in part on a subsample resolution of the digital image. The apparatus may include a module for changing the degree of said subsampling. This changing of the degree of the subsampling may be determined empirically, and/or based on a size of the image or selected regions thereof, and/or based on data obtained from the camera relating to the settings of the camera at the time of image capture. In the latter case, the data obtained from the camera may include an aperture setting, focus of the camera, distance of the subject from the camera, or a combination of these. The changing of the degree of the subsampling may also be determined based on digitized image metadata information and/or a complexity of calculation for the red eye filter. The modifying of the area may be performed including the full resolution of the digital image. The red-eye filter may include multiple sub-filters. The subsampling for the sub-filters operating on selected regions of the image may be determined by one or more of the image size, the size of a region suspected as red eye, filter computation complexity, the empirical success rate of said sub-filter, the empirical false detection rate of said sub-filter, the falsing probability of said sub-filter, relations between said regions suspected as red eye, or results of previous analysis of other said sub-filters. The apparatus may include a memory for saving the digitized image after applying the filter for modifying pixels as a modified image, and/or a memory for saving the subsample representation of the image. The subsample representation of selected regions of the image may be determined in hardware. The analysis may be performed in part on the full resolution image and in part on a subsample resolution of the image. The subsample representation may be determined using spline interpolation, and may be determined using bi-cubic interpolation. According to another aspect, a digital apparatus includes an image store and a red eye filter. The image store is for holding a temporary copy of an unprocessed image known as a pre-capture image, a permanent copy of a digitally processed, captured image, and a subsample representation of selected regions of at least one of the images, e.g., the pre-capture image. The red-eye filter is for modifying an area within at least one of the images indicative of a red-eye phenomenon based on an analysis of the subsample representation. Preferably, the at least one of the images includes the digitally processed, captured image. This further aspect may also include one or more features in accordance with the first aspect. In addition, the changing of the degree of the subsampling may be determined based on data obtained from the camera relating to image processing analysis of said precapture images.
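As a rough illustration of how the degree of subsampling might be varied with image size and camera data such as subject distance, consider the sketch below (Python); every numeric threshold is an assumption made for illustration and is not taken from the disclosure.

```python
# Hedged sketch: choose a subsampling factor from the image size and,
# optionally, camera metadata such as subject distance.  All thresholds are
# illustrative assumptions, not values from the disclosure.
from typing import Optional

def choose_subsample_factor(width: int, height: int,
                            subject_distance_m: Optional[float] = None) -> int:
    pixels = width * height
    if pixels <= 1_000_000:              # low-resolution capture: little to gain
        return 1
    factor = 4 if pixels > 4_000_000 else 2
    if subject_distance_m is not None and subject_distance_m < 1.0:
        factor = max(1, factor // 2)     # close subject, larger eyes: keep detail
    return factor
```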
The image processing analysis may be based on histogram data or color correlogram data, or both, obtained from the pre-capture image. The image processing analysis may also be based on global luminance or white balance image data, or both, obtained from the pre-capture image. The image processing analysis may also be based on a face detection analysis of the pre-capture image, or on determining pixel regions with a color characteristic indicative of redeye, or both. The image processing analysis may be performed in hardware. The changing of the degree of the subsampling may be determined based on image metadata information. A method of filtering a red eye phenomenon from a digitized image is also provided in accordance with another aspect, wherein the image includes a multiplicity of pixels indicative of color. The method includes determining whether one or more regions within a subsample representation of the digitized image are suspected as including red eye artifact. The method may include varying a degree of the subsample representation for each region of the one or more regions based on the image, and/or generating a subsample representation based on the image. The subsample representation may be generated or the degree varied, or both, utilizing a hardware-implemented subsampling engine. One or more regions within said subsample representation determined as including red eye artifact may be tested for determining any false redeye groupings. The method may further include associating the one or more regions within the subsample representation of the image with one or more corresponding regions within the digitized image, and modifying the one or more corresponding regions within the digitized image. The determining may include analyzing meta-data information including image acquisition device-specific information. The method may include analyzing the subsample representation of selected regions of the digitized image, and modifying an area determined to include red eye artifact. The analysis may be performed at least in part for determining said area and/or the modifying. The selected regions of the digitized image may include the entire image or may exclude one or more regions. The selected regions of the digitized image may include multi resolution encoding of the image. The analyzing may be performed in part on a full resolution image and in part on a subsample resolution of said image. The method may include changing the degree of the subsampling. This changing of the degree of subsampling may be determined empirically, and/or based on a size of the image or selected regions thereof. The method may include saving the digitized image after applying the filter for modifying pixels as a modified image, and/or saving said subsample representation of the image. The method may include determining the subsample representation of the image in hardware, and/or using a spline or bi-cubic interpolation. The modifying of the area may be performed including the full resolution of the image.
The method may include determining the subsample representation utilizing a plurality of sub-filters. The determining of the plurality of sub-filters may be based on one or more of the image size, a suspected red eye region size, filter computation complexity, empirical success rate of said sub-filter, empirical false detection rate of said sub-filter, falsing probability of said sub-filter, relations between said suspected red eye regions, or results of previous analysis of one or more other sub-filters.
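The pre-capture image analysis mentioned above (histogram, global luminance and related data) might, purely for illustration, be summarised as in the following sketch before being used to tune the red-eye filter; the statistics chosen and the luminance weights are assumptions, not details of the disclosed apparatus.

```python
# Hedged sketch: summarise a pre-capture frame with a luminance histogram and
# a red-chroma fraction, the kind of report data that could tune the red-eye
# filter or its subsampling.  The statistics are illustrative choices.
import numpy as np

def precapture_report(frame: np.ndarray) -> dict:
    r, g, b = (frame[..., c].astype(float) for c in range(3))
    luminance = 0.299 * r + 0.587 * g + 0.114 * b        # ITU-R BT.601 weights
    hist, _ = np.histogram(luminance, bins=32, range=(0, 255))
    red_fraction = float(np.mean((r > g + 30) & (r > b + 30)))
    return {
        "luminance_histogram": hist,
        "mean_luminance": float(luminance.mean()),
        "red_fraction": red_fraction,    # pre-flash red areas to exclude later
    }
```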
BRIEF DESCRIPTION OF THE DRAWINGS FIG. 1 shows a block diagram of a camera apparatus operating in accordance with the present invention.
FIG. 2 shows a pixel grid upon which an image of an eye is focused.
FIG. 3 shows pixel coordinates of the pupil of FIG. 2. FIG. 4 shows pixel coordinates of the iris of FIG. 2.
FIG. 5 shows pixel coordinates which contain a combination of iris and pupil colors of FIG. 2.
FIG. 6 shows pixel coordinates of the white eye area of FIG. 2.
FIG. 7 shows pixel coordinates of the eyebrow area of FIG. 2.
FIG. 8 shows a flow chart of a method operating in accordance with the present invention.
FIG. 9 shows a flow chart for testing if conditions indicate the possibility of a red-eye phenomenon photograph.
FIG. 10 shows a flow chart for testing if conditions indicate a false red-eye grouping.
FIG. 11 illustrates in block form an exemplary arrangement in accordance with a precapture image utilization aspect. DESCRIPTION OF A PREFERRED EMBODIMENT FIG. 1 shows a block diagram of a camera apparatus operating in accordance with the present invention. The camera 20 includes an exposure control 30 that, in response to a user input, initiates and controls the digital photographic process. Ambient light is determined using light sensor 40 in order to automatically determine if a flash is to be used. The distance to the subject is determined using focusing means 50 which also focuses the image on image capture means 60. The image capture means digitally records the image in color. The image capture means is known to those familiar with the art and may include a CCD (charge coupled device) to facilitate digital recording. If a flash is to be used, exposure control means 30 causes the flash means 70 to generate a photographic flash in substantial coincidence with the recording of the image by image capture means 60. The flash may be selectively generated either in response to the light sensor 40 or a manual input from the user of the camera. The image recorded by image capture means 60 is stored in image store means 80 which may comprise computer memory such as dynamic random access memory or a nonvolatile memory. The red-eye filter 90 then analyzes the stored image for characteristics of red-eye, and if found, modifies the image and removes the red-eye phenomenon from the photograph as will be described in more detail. The red-eye filter includes a pixel locator 92 for locating pixels having a color indicative of red-eye; a shape analyzer 94 for determining if a grouping of at least a portion of the pixels located by the pixel locator comprise a shape indicative of red-eye; a pixel modifier 96 for modifying the color of pixels within the grouping; and a falsing analyzer 98 for further processing the image around the grouping for details indicative of an image of an eye. The modified image may be either displayed on image display 100 or downloaded to another display device, such as a personal computer or printer via image output means 110. It can be appreciated that many of the processes implemented in the digital camera may be implemented in or controlled by software operating in a microcomputer (μC) or digital signal processor (DSP) and/or an application specific integrated circuit (ASIC).
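To make the division of labour among the pixel locator 92, shape analyzer 94, pixel modifier 96 and falsing analyzer 98 concrete, a skeletal sketch follows; the colour and shape tests are simple placeholders chosen for illustration and do not reflect the thresholds of any actual implementation.

```python
# Skeletal sketch of the red-eye filter 90 and its sub-blocks; every test is a
# simple placeholder chosen for illustration.
import numpy as np

def locate_red_pixels(image: np.ndarray) -> np.ndarray:            # pixel locator 92
    r, g, b = (image[..., c].astype(int) for c in range(3))
    return (r > 150) & (r > g + 40) & (r > b + 40)

def is_round_grouping(mask: np.ndarray) -> bool:                   # shape analyzer 94
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return False
    height, width = ys.ptp() + 1, xs.ptp() + 1
    return 0.5 <= height / width <= 2.0                            # roughly round or oval

def passes_falsing_checks(image: np.ndarray, mask: np.ndarray) -> bool:   # falsing analyzer 98
    return True        # placeholder for the white-of-eye / iris / eyebrow tests

def darken_grouping(image: np.ndarray, mask: np.ndarray) -> np.ndarray:   # pixel modifier 96
    out = image.copy()
    out[mask] = (0, 0, 0)                                          # preferred correction is black
    return out

def red_eye_filter(image: np.ndarray) -> np.ndarray:               # red-eye filter 90
    mask = locate_red_pixels(image)
    if is_round_grouping(mask) and passes_falsing_checks(image, mask):
        return darken_grouping(image, mask)
    return image
```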
In a first embodiment, the image capture means 60 of FIG. 1 includes an optional image subsampling means, wherein the image is actively down-sampled. In one embodiment, the subsampling is done using a bi-cubic spline algorithm, such as those that are known to one familiar in the art of signal and image processing. Those familiar with this art are aware of subsampling algorithms that interpolate and preserve pixel relationships as best they can given the limitation that less data is available. In other words, the subsampling stage is performed to maintain significant data while minimizing the image size and thus the amount of pixel-wise calculations involved, which are generally costly operations. A subsample representation may include a multi resolution presentation of the image, as well as a representation in which the sampling rate is not constant for the entire image. For example, areas suspected as indicative of red eye may have different resolution, most likely higher resolution, than areas positively determined not to include red eye.
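A minimal sketch of such a down-sampling step follows, assuming a NumPy/SciPy environment in which scipy.ndimage.zoom with order 3 provides cubic-spline interpolation; the factor of 4 is an illustrative choice, not a value specified in the text.

```python
# Minimal sketch of the optional subsampling stage; scipy.ndimage.zoom with
# order=3 performs cubic-spline interpolation.  The factor of 4 is illustrative.
import numpy as np
from scipy import ndimage

def subsample(image: np.ndarray, factor: int = 4) -> np.ndarray:
    """Down-sample an H x W x 3 image, keeping the colour channels intact."""
    return ndimage.zoom(image, (1.0 / factor, 1.0 / factor, 1.0), order=3)
```

A hardware thumbnail engine, as described in the next paragraph, could replace this software step without changing the callers.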
In an alternative embodiment, the subsampling means utilizes hardware-based subsampling, wherein the processing unit of the digital imaging appliance incorporates a dedicated subsampling engine providing the advantage of very fast execution of a subsampling operation. Such a digital imaging appliance with a dedicated subsampling engine may be based on a state-of-the-art digital imaging appliance incorporating hardware that facilitates the rapid generation of image thumbnails. The decision to subsample the image is, in part, dependent on the size of the original image. If the user has selected a low-resolution image format, there may be little gain in performance of the redeye detection and false-avoidance steps. Thus, the inclusion of a subsampling means, or step or operation, is optional.
The red-eye detection filter of the preferred embodiment may comprise a selection of sub-filters that may be calculated in succession or in parallel. In such cases, the sub-filters may operate only on a selected region, or a suspected region; such regions are substantially smaller than the entire image. The decision to subsample a region is, in part, dependent on one or a combination of factors such as the size of the suspected region, the success or failure of previous or parallel filters, the distance between the regions, and the computational complexity of the sub-filter. Many of the parameters involved in deciding whether or not to subsample a region, and to what degree, may also be determined by an empirical process of optimization between success rate, failure rate and computation time. Where the subsampling means, step or operation is implemented, both the original and subsampled images are preferably stored in the image store 80 of FIG. 1. The subsampled image is then available to the redeye detector 90 and the falsing analyzer 98 of FIG. 1. In other embodiments, the image capture means 60 of FIG. 1 and the image subsampling means incorporated within step 60 may occur on separate imaging appliances, as described next. As is well known in the prior art, data relating to the status of system components 30, 40, 50 and 70 of FIG. 1 at the moment of image capture may be stored within the header of a digital image using standardized metadata tags. Thus the subsampling and the redeye filtering may occur at a time after the image capture and, because all the necessary data which would have been available at that time can be stored as metadata within the digital image, the actual subsampling process and redeye filtering process may occur on a physically separate imaging appliance. To give a practical example, an image captured by a digital camera may later be subsampled and subjected to redeye filtering after being transferred to a printer, and prior to the actual printing of the image.
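The kind of per-region subsampling decision described above might be sketched as follows; the thresholds are placeholders for values that the text says would be found by empirical optimization.

```python
def choose_subsample_factor(region_area_px: int,
                            filter_cost_per_px: float,
                            prior_filters_passed: bool) -> int:
    """Return 1 (full resolution) or a downsampling factor for a suspected region."""
    if region_area_px < 32 * 32:
        return 1                       # tiny region: full resolution is cheap anyway
    estimated_cost = region_area_px * filter_cost_per_px
    if not prior_filters_passed:
        return 4                       # weak candidate: spend little computation on it
    if estimated_cost > 1e6:
        return 4                       # expensive filter over a large region
    if estimated_cost > 1e5:
        return 2
    return 1
```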
As discussed above, the system and method of the preferred embodiment involve the detection and removal of red-eye artifacts. The actual removal of the red-eye will eventually be performed on the full-resolution image. However, all or portions of the detection of redeye candidate pixel groupings, the subsequent testing of those pixel groupings for false redeye groupings, and the initial step of the removal, where the image is presented to the user for confirmation of the correction, can be performed on the entire image, the subsampled image, or a subset of regions of the entire image or of the subsampled image. There is generally a tradeoff between speed and accuracy. Therefore, according to yet another embodiment, rather than performing all detection on the subsampled image, the detection and the subsequent false determination may be performed selectively, e.g., sometimes on full-resolution regions that are suspected as red-eye and sometimes at a subsampled resolution. It is noted that the search step 200 of FIG. 8 comprises, in a practical embodiment, a number of successively applied color filters based on iterative refinements of an initial pixel-by-pixel search of the captured image. In addition to searching for a red color, it is preferably determined whether the luminance, or brightness, of a redeye region lies within a suitable range of values. Further, the local spatial distributions of color and luminance are relevant factors in the initial search for redeye pixel groupings. As each subsequent filter is preferably applied only locally to pixels in close proximity to a grouping of potential redeye pixels, it can equally well be applied to the corresponding region in the full-sized image. Thus, where it is advantageous to the accuracy of a particular color-based filter, that filter may be applied to the full-sized image rather than to the subsampled image. This applies equally to filters which may be employed in the false-determining analyzer 98. Examples of non-color-based false-determining analysis filters include those which consider the localized contrast, saturation or texture distributions in the vicinity of a potential redeye pixel grouping, those that perform localized edge or shape detection, and more sophisticated filters which statistically combine the results of a number of simple local filters to enhance the accuracy of the resulting false-determining analysis. It is preferred that more computationally expensive filters that operate on larger portions of the image utilize a subsampled version, while the more sensitive and delicate filters may be applied to the corresponding region of the full-resolution image; in the full-resolution case, preferably only small portions of the image are used for such filters. As a non-exhaustive example, filters that look for a distinction between lips and eyes may utilize a full-resolution portion, while filters that distinguish between background colors may use a subsample of the image. Furthermore, several different sizes of subsampled images may be generated and employed selectively to suit the sensitivity of the different pixel-locating and false-determining filters. The decision whether a filter should use a subsampled representation, and the rate of the downsampling, may be determined empirically by a priori statistically comparing the success rate versus mis-detection rate of the filter against the subsampling rate and technique on known images.
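The routing of individual sub-filters to either the subsampled data or the corresponding full-resolution region could be captured in a small lookup table, as in the following sketch; the table entries are hypothetical and would in practice come from the empirical tuning described above.

```python
# Hypothetical table mapping each sub-filter to the image data it operates on;
# "subsample" entries carry the downsampling rate to use.
FILTER_RESOLUTION_TABLE = {
    "background_color": ("subsample", 4),   # coarse filter over a large area
    "lips_vs_eyes":     ("full", 1),        # delicate filter on a small region
    "local_texture":    ("subsample", 2),
}

def select_input(filter_name, full_region, subsampled_regions):
    # subsampled_regions is assumed to be a dict keyed by downsampling rate.
    mode, rate = FILTER_RESOLUTION_TABLE.get(filter_name, ("subsample", 2))
    return full_region if mode == "full" else subsampled_regions[rate]
```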
It is further worth noting that the empirical determination will often be specific to a particular camera model. Thus, the decision to use the full-sized image or the subsampled image data for a particular pixel-locating or false-determining filter may be empirically determined for each camera.
In another aspect, a pre-acquisition or precapture image may be effectively utilized in an embodiment of the invention. Another type of subsampled representation of the image is one that differs temporally from the captured image, in addition to, or as an alternative to, the spatial subsampling performed by the aforementioned algorithms such as spline or bi-cubic interpolation. The subsample representation of the image may thus be an image captured before the final image is captured, and preferably just before. A camera may provide a digital preview of the image, which may be a continuous subsampled version of the image. Such a pre-capture may be used by the camera and the camera user, for example, to establish correct exposure, focus and/or composition.
The precapture image process may involve an additional step of conversion from the sensor domain, also referred to as raw CCD data, to a known color space that the red-eye filter uses for its calculations. Where the preview or precapture image is used, an additional alignment step may be applied where the final image and the pre-capture image differ, for example due to camera or object movement.
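One plausible way to implement such an alignment step, assuming a simple global translation between the pre-capture and final images, is phase correlation on the luminance channels; this is an editorial example rather than the method of the disclosure.

```python
import numpy as np

def estimate_global_shift(precapture_luma: np.ndarray, final_luma: np.ndarray):
    """Estimate an integer (dy, dx) translation between two equally sized
    luminance images by phase correlation."""
    F1 = np.fft.fft2(precapture_luma)
    F2 = np.fft.fft2(final_luma)
    cross_power = F1 * np.conj(F2)
    cross_power /= np.abs(cross_power) + 1e-12   # keep phase information only
    corr = np.fft.ifft2(cross_power).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = precapture_luma.shape
    if dy > h // 2:                              # map wrapped offsets to signed shifts
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)
```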
The pre-acquisition image may normally be processed directly from the image sensor without loading it into camera memory. To facilitate this processing, a dedicated hardware subsystem is implemented to perform pre-acquisition image processing. Depending on the settings of this hardware subsystem, the pre-acquisition image processing may satisfy certain predetermined criteria, which then triggers the loading of raw image data from the buffer of the imaging sensor into the main system memory together with report data, possibly stored as metadata, on those predetermined criteria. One example of such a test criterion is the existence of red areas within the pre-acquisition image prior to the activation of the camera flash module. Report data on such red areas can be passed to the redeye filter to eliminate such areas from the redeye detection process. Note that where the test criteria applied by the pre-acquisition image processing module are not met, the module can loop to obtain a new pre-acquisition test image from the imaging sensor. This looping may continue until either the test criteria are satisfied or a system time-out occurs. Note further that the pre-acquisition image processing step is significantly faster than the subsequent image processing chain of operations, because the image data is taken directly from the sensor buffers and processed by the dedicated hardware subsystem.
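The pre-acquisition test loop described above might be sketched as follows; the sensor interface and the structure of the report data are assumptions.

```python
import time

def preacquisition_loop(sensor, test_criteria, timeout_s: float = 0.5):
    """Loop over pre-acquisition frames until the test criteria are met or a
    system time-out occurs; returns (raw_frame, report) or (None, None)."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        frame = sensor.read_buffer()      # hypothetical sensor-buffer accessor
        report = test_criteria(frame)     # e.g. red areas found before the flash fires
        if report.get("criteria_met"):
            return frame, report          # report data may later be stored as metadata
    return None, None
```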
Once the test criteria are satisfied, the raw image data may then be loaded into main system memory to allow image processing operations to convert the raw sensor data into a final pixelated image. Typical steps may include converting Bayer or RGGB image data to YCC or RGB pixelated image data, calculation and adjustment of image white balance, calculation and adjustment of image color range, and calculation and adjustment of image luminance, potentially among others.
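As an example of one step in this chain, an RGB-to-YCC conversion using the common BT.601 full-range convention (one plausible choice, not mandated by the text) can be written as:

```python
import numpy as np

def rgb_to_ycc(rgb: np.ndarray) -> np.ndarray:
    """Convert an HxWx3 8-bit RGB array to YCbCr using BT.601 coefficients."""
    r = rgb[..., 0].astype(np.float32)
    g = rgb[..., 1].astype(np.float32)
    b = rgb[..., 2].astype(np.float32)
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return np.clip(np.stack([y, cb, cr], axis=-1), 0, 255).astype(np.uint8)
```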
Following the application of this image processing chain, the final, full-size image may be available in system memory, and may then be copied to the image store for further processing by the redeye filter subsystem. A camera may incorporate dedicated hardware to do global luminance and/or color/grayscale histogram calculations on the raw and/or final image data. One or more windows within the image may be selected for doing "local" calculations, for example. Thus, valuable data may be obtained using a "first pass" or pre-acquisition image before committing to a main image processing approach which generates a more final picture.
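The global and windowed ("local") luminance histogram calculations could be sketched as follows, here shown in software rather than in the dedicated hardware mentioned above.

```python
import numpy as np

def luminance_histograms(luma: np.ndarray, windows):
    """Global 256-bin luminance histogram plus 'local' histograms for a list
    of (top, left, bottom, right) windows within the image."""
    global_hist, _ = np.histogram(luma, bins=256, range=(0, 256))
    local_hists = {}
    for i, (top, left, bottom, right) in enumerate(windows):
        local_hists[i], _ = np.histogram(luma[top:bottom, left:right],
                                         bins=256, range=(0, 256))
    return global_hist, local_hists
```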
A subsampled image, in addition to the precapture and more finalized images, may be generated in parallel with the final image by a main image processing toolchain. Such processing may preferably be performed within the image capture module 60 of FIG. 1. An exemplary process may include the following operations. First, a raw image may be acquired or pre-captured. This raw image may be processed prior to storage. This processing may generate some report data based on predetermined test criteria. If the criteria are not met, the pre-acquisition image processing operation may obtain a second, and perhaps one or more additional, pre-acquisition images from the imaging sensor buffer until such test criteria are satisfied.
Once the test criteria are satisfied, a full-sized raw image may be loaded into system memory and the full image processing chain may be applied to the image. A final image and a subsample image may then preferably be generated.
FIG. 11 illustrates in block form a further exemplary arrangement in accordance with a precapture image utilization aspect. After the pre-acquisition test phase, the "raw" image is loaded from the sensor into the image capture module. After converting the image from its raw format (e.g., Bayer RGGB) into a more standardized pixel format such as YCC or RGB, it may then be subjected to a post-capture image processing chain which eventually generates a full-sized final image and one or more subsampled copies of the original. These are preferably passed to the image store, and the red-eye filter is preferably then applied. Note that the image capture and image store functional blocks of FIG. 11 correspond to blocks 60 and 80 illustrated in FIG. 1.
FIG. 2 shows a pixel grid upon which an image of an eye is focused. Preferably the digital camera records an image comprising a grid of pixels of at least 640 by 480. FIG. 2 shows a 24 by 12 pixel portion of the larger grid, labeled columns A-X and rows 1-12 respectively.

FIG. 3 shows pixel coordinates of the pupil of FIG. 2. The pupil is the darkened circular portion and substantially includes seventeen pixels: K7, K8, L6, L7, L8, L9, M5, M6, M7, M8, M9, N6, N7, N8, N9, O7 and O8, as indicated by shaded squares at the aforementioned coordinates. In a non-flash photograph, these pupil pixels would be substantially black in color. In a red-eye photograph, these pixels would be substantially red in color. It should be noted that the aforementioned pupil pixels have a shape indicative of the pupil of the subject, the shape preferably being a substantially circular, semi-circular or oval grouping of pixels. Locating a group of substantially red pixels forming a substantially circular or oval area is therefore useful to the redeye filter.

FIG. 4 shows pixel coordinates of the iris of FIG. 2. The iris pixels are substantially adjacent to the pupil pixels of FIG. 2. Iris pixels J5, J6, J7, J8, J9, K5, K10, L10, M10, N10, O5, O10, P5, P6, P7, P8 and P9 are indicated by shaded squares at the aforementioned coordinates. The iris pixels substantially surround the pupil pixels and may be used as further indicia of a pupil. In a typical subject, the iris pixels will have a substantially constant color; however, that color will vary as the natural eye color of each individual subject varies. The existence of iris pixels depends upon the size of the iris at the time of the photograph; if the pupil is very large, iris pixels may not be present.

FIG. 5 shows pixel coordinates which include a combination of iris and pupil colors of FIG. 2. The pupil/iris pixels are located at K6, K9, L5, N5, O6 and O9, as indicated by shaded squares at the aforementioned coordinates. The pupil/iris pixels are adjacent to the pupil pixels, and also adjacent to any iris pixels which may be present. Pupil/iris pixels may also contain colors of other areas of the subject's eyes, including skin tones and white areas of the eye.

FIG. 6 shows pixel coordinates of the white eye area of FIG. 2. The seventy-one pixels are indicated by the shaded squares of FIG. 6, are substantially white in color and are in the vicinity of, and substantially surround, the pupil pixels of FIG. 2.

FIG. 7 shows pixel coordinates of the eyebrow area of FIG. 2. The eyebrow pixels are indicated by the shaded squares of FIG. 7 and substantially form a continuous line in the vicinity of the pupil pixels. The color of the line will vary as the natural color of the eyebrow of each individual subject varies. Furthermore, some subjects may have no visible eyebrow at all.

It should be appreciated that the representations of FIG. 2 through FIG. 7 are particular to the example shown. The coordinates of pixels and the actual number of pixels comprising the image of an eye will vary depending upon a number of variables. These variables include the location of the subject within the photograph, the distance between the subject and the camera, and the pixel density of the camera. The red-eye filter 90 of FIG. 1 searches the digitally stored image for pixels having a substantially red color, and then determines if the grouping has round or oval characteristics, similar to the pixels of FIG. 3.
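The shape test just described might be approximated as in the following sketch; the fill-ratio and aspect-ratio thresholds are illustrative assumptions, not values from the disclosure.

```python
def is_round_or_oval(pixels) -> bool:
    """Accept a grouping of (row, col) pixels as pupil-like if it fills most of
    its bounding box and the box is not strongly elongated."""
    rows = [p[0] for p in pixels]
    cols = [p[1] for p in pixels]
    height = max(rows) - min(rows) + 1
    width = max(cols) - min(cols) + 1
    aspect = max(height, width) / min(height, width)
    fill_ratio = len(pixels) / float(height * width)
    return aspect <= 2.0 and fill_ratio >= 0.6    # a long thin line of red pixels fails
```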
If found, the color of the grouping is modified. In the preferred embodiment, the color is modified to black. Searching for a circular or oval grouping helps to avoid falsely modifying red pixels which are not due to the red-eye phenomenon. In the example of FIG. 2, the red-eye phenomenon is found in a 5×5 grouping of pixels as shown in FIG. 3. In other examples, the grouping may contain substantially more or fewer pixels, depending upon the actual number of pixels comprising the image of an eye, but the color and shape of the grouping will be similar. Thus, for example, a long line of red pixels will not be falsely modified because its shape is not substantially round or oval. Additional tests may be used to avoid falsely modifying a round group of pixels having a color indicative of the red-eye phenomenon, by further analysis of the pixels in the vicinity of the grouping. For example, in a red-eye photograph there will typically be no other pixels of a similar red color within a radius originating at the grouping, because the pupil is surrounded by components of the subject's face and the red-eye color is not normally found as a natural color on the face of the subject. Preferably the radius is large enough to analyze enough pixels to avoid falsing, yet small enough to exclude the other eye of the subject, which may also exhibit the red-eye phenomenon. Preferably, the radius lies in a range between two and five times the radius of the grouping. Other indicia of the recording may be used to validate the existence of red-eye, including identification of the iris pixels of FIG. 4 which surround the pupil pixels. The iris pixels will have a substantially common color, but the size and color of the iris will vary from subject to subject. Furthermore, the white area of the eye may be identified as a grouping of substantially white pixels in the vicinity of, and substantially surrounding, the pupil pixels, as shown in FIG. 6. However, the location of the pupil within the opening of the eyelids varies depending upon the orientation of the head of the subject at the time of the photograph. Consequently, identification of a number of substantially white pixels in the vicinity of the iris, without a requirement that they surround the grouping, will further validate the identification of the red-eye phenomenon and prevent false modification of other red pixel groupings. The number of substantially white pixels is preferably between two and twenty times the number of pixels in the pupil grouping. As a further validation, the eyebrow pixels of FIG. 7 can be identified. Further, additional criteria can be used to avoid falsely modifying a grouping of red pixels. These criteria include determining whether the photographic conditions were indicative of the red-eye phenomenon, including conditions known in the art such as use of a flash, ambient light levels and distance of the subject. If the conditions indicate that the red-eye phenomenon is not present, then red-eye filter 90 is not engaged. FIG. 5 shows combination pupil/iris pixels which have color components of the red-eye phenomenon combined with color components of the iris or even of the white area of the eye. The invention modifies these pixels by separating the color components associated with red-eye, modifying the color of the separated components and then adding the modified color back to the pixel. Preferably the modified color is black.
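Two of the falsing tests described above, together with the pixel-component recoloring, might be sketched as follows; is_red() and is_white() stand in for the color classifiers, which are not specified here, and the exhaustive loop is a naive illustration rather than an optimized implementation.

```python
import math

def passes_falsing_tests(image, grouping, is_red, is_white) -> bool:
    """Reject a grouping if similar red pixels lie within 2-5x its radius, or
    if the count of nearby white pixels is outside 2-20x the grouping size."""
    rows = [p[0] for p in grouping]
    cols = [p[1] for p in grouping]
    cy, cx = sum(rows) / len(rows), sum(cols) / len(cols)
    # Approximate the grouping radius from its bounding box.
    radius = max(1.0, ((max(rows) - min(rows)) + (max(cols) - min(cols))) / 4.0)
    grouping_set = set(grouping)

    red_outside, white_nearby = 0, 0
    search_r = int(5 * radius)
    for dy in range(-search_r, search_r + 1):
        for dx in range(-search_r, search_r + 1):
            y, x = int(cy) + dy, int(cx) + dx
            if (y, x) in grouping_set:
                continue
            if not (0 <= y < len(image) and 0 <= x < len(image[0])):
                continue
            d = math.hypot(dy, dx)
            if 2 * radius <= d <= 5 * radius and is_red(image[y][x]):
                red_outside += 1        # similar red nearby suggests a false grouping
            if d <= 5 * radius and is_white(image[y][x]):
                white_nearby += 1
    white_ok = 2 * len(grouping) <= white_nearby <= 20 * len(grouping)
    return red_outside == 0 and white_ok

def recolor(pixel_rgb):
    # Separate out the red component and replace it with black, keeping the
    # remaining components (a red+green pupil/iris pixel becomes dark green).
    r, g, b = pixel_rgb
    return (0, g, b)
```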
Modifying the red component to black produces a more natural-looking result. For example, if the iris is substantially green, a pupil/iris pixel will have components of red and green. The red-eye filter removes the red component and substitutes a black component, effectively resulting in a dark green pixel.

FIG. 8 shows a flow chart of a method operating in accordance with the present invention. The red-eye filter process is in addition to other processes known to those skilled in the art which operate within the camera. These other processes include flash control, focus, and image recording, storage and display. The red-eye filter process preferably operates within software in a μC or DSP and processes an image stored in image store 80. The red-eye filter process is entered at step 200. At step 210, conditions are checked for the possibility of the red-eye phenomenon. These conditions are included in signals from exposure control means 30 which are communicated directly to the red-eye filter. Alternatively, the exposure control means may store the signals along with the digital image in image store 80. If conditions do not indicate the possibility of red-eye at step 210, then the process exits at step 215. Step 210 is further detailed in FIG. 9, and is an optional step which may be bypassed in an alternate embodiment. Then in step 220 the digital image is searched for pixels having a color indicative of red-eye. The groupings of red-eye pixels are then analyzed at step 230. Red-eye is determined if the shape of a grouping is indicative of the red-eye phenomenon. This step also accounts for multiple redeye groupings in response to a subject having two red-eyes, or multiple subjects having red-eyes. If no groupings indicative of red-eye are found, then the process exits at step 215. Otherwise, false red-eye groupings are checked at optional step 240. Step 240 is further detailed in FIG. 10 and prevents the red-eye filter from falsely modifying red pixel groupings which do not have further indicia of the eye of a subject. After eliminating false groupings, if no groupings remain, the process exits at step 215. Otherwise step 250 modifies the color of the groupings which pass step 240, preferably substituting the color black for the color red within the grouping. Then in optional step 260, the pixels surrounding a red-eye grouping are analyzed for a red component. These are equivalent to the pixels of FIG. 5. The red component is replaced with black by the red-eye filter. The process then exits at step 215. It should be appreciated that the pixel color modification can be stored directly in the image store by replacing red-eye pixels with pixels modified by the red-eye filter. Alternatively, the modified pixels can be stored as an overlay in the image store, thereby preserving the recorded image and only modifying the image when displayed on image display 100. Preferably the filtered image is communicated through image output means 110. Alternatively, the unfiltered image with the overlay may be communicated through image output means 110 to an external device, such as a personal computer, capable of processing such information.

FIG. 9 shows a flow chart for testing if conditions indicate the possibility of a red-eye phenomenon, corresponding to step 210 of FIG. 8. Entered at step 300, step 310 checks if a flash was used in the photograph. If not, step 315 indicates that red-eye is not possible.
Otherwise optional step 320 checks if a low level of ambient light was present at the time of the photograph. If not, step 315 indicates that red-eye is not possible. Otherwise optional step 330 checks if the subject was relatively close to the camera at the time of the photograph. If not, step 315 indicates that red-eye is not possible. Otherwise step 340 indicates that red-eye is possible.

FIG. 10 shows a flow chart for testing if conditions indicate a false red-eye grouping, corresponding to step 240 of FIG. 8. Entered at step 400, step 410 checks if other red-eye pixels are found within a radius of a grouping; preferably the radius is between two and five times the radius of the grouping. If such pixels are found, step 415 indicates a false red-eye grouping. Otherwise step 420 checks if a substantially white area of pixels is found in the vicinity of the grouping. This area is indicative of the white area of a subject's eye and preferably has between two and twenty times the number of pixels in the grouping. If not found, step 415 indicates a false red-eye grouping. Otherwise step 430 searches the vicinity of the grouping for an iris ring or an eyebrow line. If neither is found, step 415 indicates a false red-eye grouping. Otherwise step 440 indicates that the red-eye grouping is not false. It should be appreciated that each of the tests 410, 420 and 430 checks for a false red-eye grouping. In alternate embodiments, other tests may be used to prevent false modification of the image, or the tests of FIG. 10 may be used either alone or in combination. It should be further appreciated that either the red-eye condition test 210 or the red-eye falsing test 240 of FIG. 8 may be used to achieve satisfactory results. In an alternate embodiment, test 240 may be sufficient on its own to eliminate test 210, or vice versa. Alternatively, the selectivity of either the color and/or grouping analysis of the red-eye phenomenon may be sufficient to eliminate both tests 210 and 240 of FIG. 8. Furthermore, the color red as used herein means the range of colors, hues and brightnesses indicative of the red-eye phenomenon, and the color white as used herein means the range of colors, hues and brightnesses indicative of the white area of the human eye. Thus, what has been provided is an improved method and apparatus for eliminating the redeye phenomenon within a miniature digital camera having a flash, without the distraction of a pre-flash.
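As a recap of the condition test of FIG. 9 described above, a minimal sketch follows; the ambient-light and subject-distance thresholds are placeholders for camera-specific values and are not taken from the disclosure.

```python
AMBIENT_LIGHT_THRESHOLD = 50.0   # assumed units and value (e.g. lux)
CLOSE_SUBJECT_MAX_M = 3.0        # assumed maximum "close" subject distance in metres

def conditions_indicate_red_eye(signals) -> bool:
    """Steps 300-340 of FIG. 9, using exposure-control signals stored as metadata."""
    if not signals.get("flash_fired", False):                          # step 310
        return False                                                   # step 315
    if signals.get("ambient_light", 0.0) > AMBIENT_LIGHT_THRESHOLD:    # optional step 320
        return False
    if signals.get("subject_distance_m", 0.0) > CLOSE_SUBJECT_MAX_M:   # optional step 330
        return False
    return True                                                        # step 340
```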

Claims

1. A digital apparatus comprising a red-eye filter for modifying an area within a digitized image indicative of a red-eye phenomenon based on an analysis of a subsample representation of selected regions of said digitized image.
2. The apparatus of claim 1, wherein the analysis is performed at least in part for determining said area.
3. The apparatus of claim 1, wherein the analysis is performed at least in part for determining said modifying.
4. The apparatus of claim 1, wherein said selected regions of said digitized image comprise the entire image.
5. The apparatus of claim 1, wherein said selected regions of said digitized image comprise multi resolution encoding of said image.
6. The apparatus of claim 1, wherein at least one region of the entire image is not included among said selected regions of said image.
7. The apparatus of claim 1, wherein said analysis is performed in part on a full resolution image and in part on a subsample resolution of said digital image.
8. The apparatus of claim 1, further comprising a module for changing the degree of said subsampling.
9. The apparatus of claim 8, wherein said changing the degree of said subsampling is determined empirically.
10. The apparatus of claim 8, wherein said changing the degree of said subsampling is determined based on a size of said image.
11. The apparatus of claim 8, wherein said changing the degree of said subsampling is determined based on a size of selected regions of the image.
12. The apparatus of claim 8, wherein said changing the degree of said subsampling is determined based on data obtained from the camera relating to the settings of the camera at the time of image capture.
13. The apparatus of claim 12, wherein the data obtained from the camera includes an aperture setting or focus of the camera, or both.
14. The apparatus of claim 12, wherein the data obtained from the camera includes the distance of the subject from the camera.
15. The apparatus of claim 8, wherein said changing the degree of said subsampling is determined based on digitized image metadata information.
16. The apparatus of claim 8, wherein said changing the degree of said subsampling is determined based on a complexity of calculation for said filter.
17. The apparatus of claim 8, wherein said changing the degree of said subsampling is determined based on data obtained from the camera relating to image processing analysis of said precapture images.
18. The apparatus of claim 17, wherein said image processing analysis is based on histogram data obtained from said pre-capture image.
19. The apparatus of claim 17, wherein said image processing analysis is based on color correlogram data obtained from said pre-capture image.
20. The apparatus of claim 17, wherein said image processing analysis is based on global luminance or white balance image data, or both, obtained from said pre-capture image.
21. The apparatus of claim 17, wherein said image processing analysis is based on face detection analysis of said pre-capture image.
22. The apparatus of claim 17, wherein said image processing analysis is based on determining pixel regions with a color characteristic indicative of redeye.
23. The apparatus of claim 1 or 8, wherein said modifying the area is performed on the full resolution of said digital image.
24. The apparatus of claim 1 or 8, wherein said red-eye filter comprises a plurality of sub filters.
25. The apparatus of claim 24, wherein said subsampling for said sub filters operating on selected regions of said image is determined by one or more of the image size, suspected as red eye region size, filter computation complexity, empirical success rate of said sub filter, empirical false detection rate of said sub filter, falsing probability of said sub filter, relations between said suspected regions as red eye, results of previous analysis of other said sub filters.
26. The apparatus of claim 1, further comprising memory for saving said digitized image after applying said filter for modifying pixels as a modified image.
27. The apparatus of claim 1, further comprising memory for saving said subsample representation of said image.
28. The apparatus of claim 1, wherein said subsample representation of selected regions of said image is determined in hardware.
29. The apparatus of claim 1, wherein said analysis is performed in part on the full resolution image and in part on a subsample resolution of said image.
30. The apparatus of claim 1, wherein said subsample representation is determined using spline interpolation.
31. The apparatus of claim 1, wherein said subsample representation is determined using bicubic interpolation.
32. The apparatus of claim 1, further comprising: an image store for holding: (i) a pre-capture image comprising a temporary copy of an unprocessed image; (ii) a permanent copy of a digitally processed, captured image, and (iii) a subsample representation of selected regions of the pre-capture image; and wherein said red-eye filter is arranged to modify an area indicative of a red-eye phenomenon within at least one of said pre-capture or captured images based on an analysis of the subsample representation.
33. The apparatus of claim 1 comprising one of a portable digital camera or a printer.
34. A method of filtering a red eye phenomenon from a digitized image comprising a multiplicity of pixels indicative of color, the method comprising determining whether one or more regions within a subsample representation of said digitized image are suspected as including red eye artifact.
35. The method of claim 34, further comprising varying a degree of the subsample representation for each region of said one or more regions based on said image.
36. The method of claim 34, further comprising generating the subsample representation based on said image.
37. The method of claim 34, further comprising generating the subsample representation utilizing a hardware-implemented subsampling engine.
38. The method of claim 34, further comprising testing one or more regions within said subsample representation determined as including red eye artifact for determining any false redeye groupings.
39. The method of claim 34, further comprising (c) associating said one or more regions within said subsample representation of said image with one or more corresponding regions within said image; and (d) modifying said one or more corresponding regions within said image.
40. The method of claim 34, wherein the determining comprises analyzing meta-data information including image acquisition device-specific information.
41. The method of claim 34, further comprising analyzing the subsample representation of selected regions of said digitized image, and modifying an area determined to include red eye artifact.
42. The method of claim 41, wherein the analysis is performed at least in part for determining said area.
43. The method of claim 41, wherein the analysis is performed at least in part for determining said modifying.
44. The method of claim 41, wherein said selected regions of said digitized image comprise the entire image.
45. The method of claim 41, wherein said selected regions of said digitized image comprise multi resolution encoding of said image.
46. The method of claim 41, wherein at least one region of the entire image is not included among said selected regions of said image.
47. The method of claim 41, wherein said analyzing is performed in part on a full resolution image and in part on a subsample resolution of said image.
48. The method of claim 41, further comprising saving said digitized image after applying said filter for modifying pixels as a modified image.
49. The method of claim 41, further comprising saving said subsample representation of said image.
PCT/EP2005/001171 2004-02-04 2005-02-03 Optimized red-eye filter method and apparatus involving subsample representations of selected image regions WO2005076217A2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP05707215A EP1714252A2 (en) 2004-02-04 2005-02-03 Optimized red-eye filter method and apparatus involving subsample representations of selected image regions
JP2006551816A JP4966021B2 (en) 2004-02-04 2005-02-03 Method and apparatus for optimizing red eye filter performance

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US10/773,092 US20050140801A1 (en) 2003-08-05 2004-02-04 Optimized performance and performance for red-eye filter method and apparatus
US10/773,092 2004-02-04

Publications (3)

Publication Number Publication Date
WO2005076217A2 true WO2005076217A2 (en) 2005-08-18
WO2005076217A9 WO2005076217A9 (en) 2005-10-13
WO2005076217A3 WO2005076217A3 (en) 2006-04-20

Family

ID=34837874

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2005/001171 WO2005076217A2 (en) 2004-02-04 2005-02-03 Optimized red-eye filter method and apparatus involving subsample representations of selected image regions

Country Status (5)

Country Link
US (2) US20050140801A1 (en)
EP (1) EP1714252A2 (en)
JP (1) JP4966021B2 (en)
IE (1) IES20050052A2 (en)
WO (1) WO2005076217A2 (en)

Cited By (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010025908A1 (en) * 2008-09-03 2010-03-11 Fotonation Ireland Limited Partial face detector red-eye filter method and apparatus
US7680342B2 (en) 2004-08-16 2010-03-16 Fotonation Vision Limited Indoor/outdoor classification in digital images
US7689009B2 (en) 2005-11-18 2010-03-30 Fotonation Vision Ltd. Two stage detection for photographic eye artifacts
US7692696B2 (en) 2005-12-27 2010-04-06 Fotonation Vision Limited Digital image acquisition system with portrait mode
US7746385B2 (en) 1997-10-09 2010-06-29 Fotonation Vision Limited Red-eye filter method and apparatus
US7804531B2 (en) 1997-10-09 2010-09-28 Fotonation Vision Limited Detecting red eye filter and apparatus using meta-data
US7865036B2 (en) 2005-11-18 2011-01-04 Tessera Technologies Ireland Limited Method and apparatus of correcting hybrid flash artifacts in digital images
US7868922B2 (en) 2006-02-14 2011-01-11 Tessera Technologies Ireland Limited Foreground/background segmentation in digital images
US7912285B2 (en) 2004-08-16 2011-03-22 Tessera Technologies Ireland Limited Foreground/background segmentation in digital images with differential exposure calculations
US7920723B2 (en) 2005-11-18 2011-04-05 Tessera Technologies Ireland Limited Two stage detection for photographic eye artifacts
US7953287B2 (en) 2006-02-14 2011-05-31 Tessera Technologies Ireland Limited Image blurring
US7962629B2 (en) 2005-06-17 2011-06-14 Tessera Technologies Ireland Limited Method for establishing a paired connection between media devices
US7965875B2 (en) 2006-06-12 2011-06-21 Tessera Technologies Ireland Limited Advances in extending the AAM techniques from grayscale to color images
US7970182B2 (en) 2005-11-18 2011-06-28 Tessera Technologies Ireland Limited Two stage detection for photographic eye artifacts
US7995804B2 (en) 2007-03-05 2011-08-09 Tessera Technologies Ireland Limited Red eye false positive filtering using face location and orientation
US8000526B2 (en) 2007-11-08 2011-08-16 Tessera Technologies Ireland Limited Detecting redeye defects in digital images
US8055067B2 (en) 2007-01-18 2011-11-08 DigitalOptics Corporation Europe Limited Color segmentation
US8081254B2 (en) 2008-08-14 2011-12-20 DigitalOptics Corporation Europe Limited In-camera based method of detecting defect eye with high accuracy
US8126208B2 (en) 2003-06-26 2012-02-28 DigitalOptics Corporation Europe Limited Digital image processing using face detection information
US8170294B2 (en) 2006-11-10 2012-05-01 DigitalOptics Corporation Europe Limited Method of detecting redeye in a digital image
US8184900B2 (en) 2006-02-14 2012-05-22 DigitalOptics Corporation Europe Limited Automatic detection and correction of non-red eye flash defects
US8212864B2 (en) 2008-01-30 2012-07-03 DigitalOptics Corporation Europe Limited Methods and apparatuses for using image acquisition data to detect and correct image defects
US8320641B2 (en) 2004-10-28 2012-11-27 DigitalOptics Corporation Europe Limited Method and apparatus for red-eye detection using preview or other reference images
US8358841B2 (en) 2006-05-03 2013-01-22 DigitalOptics Corporation Europe Limited Foreground/background separation in digital images
US8503818B2 (en) 2007-09-25 2013-08-06 DigitalOptics Corporation Europe Limited Eye defect detection in international standards organization images
US8520093B2 (en) 2003-08-05 2013-08-27 DigitalOptics Corporation Europe Limited Face tracker and partial face tracker for red-eye filter method and apparatus
US8633999B2 (en) 2009-05-29 2014-01-21 DigitalOptics Corporation Europe Limited Methods and apparatuses for foreground, top-of-the-head separation from background
US8971628B2 (en) 2010-07-26 2015-03-03 Fotonation Limited Face detection using division-generated haar-like features for illumination invariance
US9412007B2 (en) 2003-08-05 2016-08-09 Fotonation Limited Partial face detector red-eye filter method and apparatus

Families Citing this family (72)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7042505B1 (en) 1997-10-09 2006-05-09 Fotonation Ireland Ltd. Red-eye filter method and apparatus
US7352394B1 (en) 1997-10-09 2008-04-01 Fotonation Vision Limited Image modification based on red-eye filter analysis
US8330831B2 (en) 2003-08-05 2012-12-11 DigitalOptics Corporation Europe Limited Method of gathering visual meta data using a reference image
US8155397B2 (en) 2007-09-26 2012-04-10 DigitalOptics Corporation Europe Limited Face tracking in a camera processor
US7471846B2 (en) 2003-06-26 2008-12-30 Fotonation Vision Limited Perfecting the effect of flash within an image acquisition devices using face detection
US8948468B2 (en) 2003-06-26 2015-02-03 Fotonation Limited Modification of viewing parameters for digital images using face detection information
US8494286B2 (en) 2008-02-05 2013-07-23 DigitalOptics Corporation Europe Limited Face detection in mid-shot digital images
US7440593B1 (en) 2003-06-26 2008-10-21 Fotonation Vision Limited Method of improving orientation and color balance of digital images using face detection information
US8698924B2 (en) 2007-03-05 2014-04-15 DigitalOptics Corporation Europe Limited Tone mapping for low-light video frame enhancement
US9160897B2 (en) * 2007-06-14 2015-10-13 Fotonation Limited Fast motion estimation method
US8682097B2 (en) 2006-02-14 2014-03-25 DigitalOptics Corporation Europe Limited Digital image enhancement with reference images
US8180173B2 (en) * 2007-09-21 2012-05-15 DigitalOptics Corporation Europe Limited Flash artifact eye defect correction in blurred images using anisotropic blurring
US8896725B2 (en) 2007-06-21 2014-11-25 Fotonation Limited Image capture device with contemporaneous reference image capture mechanism
US7565030B2 (en) 2003-06-26 2009-07-21 Fotonation Vision Limited Detecting orientation of digital images using face detection information
US8593542B2 (en) 2005-12-27 2013-11-26 DigitalOptics Corporation Europe Limited Foreground/background separation using reference images
US8989516B2 (en) 2007-09-18 2015-03-24 Fotonation Limited Image processing method and apparatus
US7587085B2 (en) * 2004-10-28 2009-09-08 Fotonation Vision Limited Method and apparatus for red-eye detection in an acquired digital image
US8498452B2 (en) 2003-06-26 2013-07-30 DigitalOptics Corporation Europe Limited Digital image processing using face detection information
US7362368B2 (en) * 2003-06-26 2008-04-22 Fotonation Vision Limited Perfecting the optics within a digital image acquisition device using face detection
US8264576B2 (en) * 2007-03-05 2012-09-11 DigitalOptics Corporation Europe Limited RGBW sensor array
US9129381B2 (en) 2003-06-26 2015-09-08 Fotonation Limited Modification of post-viewing parameters for digital images using image region or feature information
US7636486B2 (en) 2004-11-10 2009-12-22 Fotonation Ireland Ltd. Method of determining PSF using multiple instances of a nominally similar scene
US7269292B2 (en) 2003-06-26 2007-09-11 Fotonation Vision Limited Digital image adjustable compression and resolution using face detection information
US8989453B2 (en) 2003-06-26 2015-03-24 Fotonation Limited Digital image processing using face detection information
US7844076B2 (en) 2003-06-26 2010-11-30 Fotonation Vision Limited Digital image processing using face detection and skin tone information
US9692964B2 (en) 2003-06-26 2017-06-27 Fotonation Limited Modification of post-viewing parameters for digital images using image region or feature information
US7536036B2 (en) * 2004-10-28 2009-05-19 Fotonation Vision Limited Method and apparatus for red-eye detection in an acquired digital image
US7620218B2 (en) * 2006-08-11 2009-11-17 Fotonation Ireland Limited Real-time face tracking with reference images
US7639889B2 (en) 2004-11-10 2009-12-29 Fotonation Ireland Ltd. Method of notifying users regarding motion artifacts based on image analysis
US7685341B2 (en) * 2005-05-06 2010-03-23 Fotonation Vision Limited Remote control apparatus for consumer electronic appliances
US8417055B2 (en) * 2007-03-05 2013-04-09 DigitalOptics Corporation Europe Limited Image processing method and apparatus
US8199222B2 (en) * 2007-03-05 2012-06-12 DigitalOptics Corporation Europe Limited Low-light video frame enhancement
US20050031224A1 (en) * 2003-08-05 2005-02-10 Yury Prilutsky Detecting red eye filter and apparatus using meta-data
US20110102643A1 (en) * 2004-02-04 2011-05-05 Tessera Technologies Ireland Limited Partial Face Detector Red-Eye Filter Method and Apparatus
JP2005346806A (en) * 2004-06-02 2005-12-15 Funai Electric Co Ltd Dvd recorder and recording and reproducing apparatus
JP4599110B2 (en) * 2004-07-30 2010-12-15 キヤノン株式会社 Image processing apparatus and method, imaging apparatus, and program
US7551797B2 (en) * 2004-08-05 2009-06-23 Canon Kabushiki Kaisha White balance adjustment
US7639888B2 (en) * 2004-11-10 2009-12-29 Fotonation Ireland Ltd. Method and apparatus for initiating subsequent exposures based on determination of motion blurring artifacts
US7715597B2 (en) 2004-12-29 2010-05-11 Fotonation Ireland Limited Method and component for image recognition
US8503800B2 (en) 2007-03-05 2013-08-06 DigitalOptics Corporation Europe Limited Illumination detection using classifier chains
US7315631B1 (en) 2006-08-11 2008-01-01 Fotonation Vision Limited Real-time face tracking in a digital image acquisition device
US7694048B2 (en) 2005-05-06 2010-04-06 Fotonation Vision Limited Remote control apparatus for printer appliances
US7903870B1 (en) * 2006-02-24 2011-03-08 Texas Instruments Incorporated Digital camera and method
IES20070229A2 (en) * 2006-06-05 2007-10-03 Fotonation Vision Ltd Image acquisition method and apparatus
US7916897B2 (en) 2006-08-11 2011-03-29 Tessera Technologies Ireland Limited Face tracking for controlling imaging parameters
US7403643B2 (en) 2006-08-11 2008-07-22 Fotonation Vision Limited Real-time face tracking in a digital image acquisition device
ATE472140T1 (en) 2007-02-28 2010-07-15 Fotonation Vision Ltd SEPARATION OF DIRECTIONAL ILLUMINATION VARIABILITY IN STATISTICAL FACIAL MODELING BASED ON TEXTURE SPACE DECOMPOSITIONS
EP2188759A1 (en) 2007-03-05 2010-05-26 Fotonation Vision Limited Face searching and detection in a digital image acquisition device
JP4175425B2 (en) * 2007-03-15 2008-11-05 オムロン株式会社 Pupil color correction apparatus and program
US7773118B2 (en) * 2007-03-25 2010-08-10 Fotonation Vision Limited Handheld article with movement discrimination
US7916971B2 (en) 2007-05-24 2011-03-29 Tessera Technologies Ireland Limited Image processing method and apparatus
JP5246159B2 (en) * 2007-06-08 2013-07-24 株式会社ニコン Imaging device, image display device, program
US20080309770A1 (en) * 2007-06-18 2008-12-18 Fotonation Vision Limited Method and apparatus for simulating a camera panning effect
US7855737B2 (en) 2008-03-26 2010-12-21 Fotonation Ireland Limited Method of making a digital camera image of a scene including the camera user
CN106919911A (en) 2008-07-30 2017-07-04 快图有限公司 Modified using the automatic face and skin of face detection
US8379917B2 (en) 2009-10-02 2013-02-19 DigitalOptics Corporation Europe Limited Face recognition performance using additional image features
US8300929B2 (en) * 2009-10-07 2012-10-30 Seiko Epson Corporation Automatic red-eye object classification in digital photographic images
US20110216157A1 (en) 2010-03-05 2011-09-08 Tessera Technologies Ireland Limited Object Detection and Rendering for Wide Field of View (WFOV) Image Acquisition Systems
WO2011135158A1 (en) * 2010-04-30 2011-11-03 Nokia Corporation Method, apparatus and computer program product for compensating eye color defects
KR101454988B1 (en) 2010-06-28 2014-10-27 노키아 코포레이션 Method, apparatus and computer program product for compensating eye color defects
US8970770B2 (en) 2010-09-28 2015-03-03 Fotonation Limited Continuous autofocus based on face detection and tracking
US8508652B2 (en) 2011-02-03 2013-08-13 DigitalOptics Corporation Europe Limited Autofocus method
US8982180B2 (en) 2011-03-31 2015-03-17 Fotonation Limited Face and other object detection and tracking in off-center peripheral regions for nonlinear lens geometries
US8947501B2 (en) 2011-03-31 2015-02-03 Fotonation Limited Scene enhancements in off-center peripheral regions for nonlinear lens geometries
US8896703B2 (en) 2011-03-31 2014-11-25 Fotonation Limited Superresolution enhancment of peripheral regions in nonlinear lens geometries
US8723959B2 (en) 2011-03-31 2014-05-13 DigitalOptics Corporation Europe Limited Face and other object tracking in off-center peripheral regions for nonlinear lens geometries
US9721160B2 (en) 2011-04-18 2017-08-01 Hewlett-Packard Development Company, L.P. Manually-assisted detection of redeye artifacts
US9041954B2 (en) * 2011-06-07 2015-05-26 Hewlett-Packard Development Company, L.P. Implementing consistent behavior across different resolutions of images
US8493460B2 (en) 2011-09-15 2013-07-23 DigitalOptics Corporation Europe Limited Registration of differently scaled images
US8493459B2 (en) 2011-09-15 2013-07-23 DigitalOptics Corporation Europe Limited Registration of distorted images
US8970902B2 (en) 2011-09-19 2015-03-03 Hewlett-Packard Development Company, L.P. Red-eye removal systems and method for variable data printing (VDP) workflows
CN110174063A (en) * 2019-06-11 2019-08-27 上海工程技术大学 A kind of neutral pen ink height detecting system and detection method based on machine vision

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5432863A (en) * 1993-07-19 1995-07-11 Eastman Kodak Company Automated detection and correction of eye color defects due to flash illumination
US5751836A (en) * 1994-09-02 1998-05-12 David Sarnoff Research Center Inc. Automated, non-invasive iris recognition system and method
US6278491B1 (en) * 1998-01-29 2001-08-21 Hewlett-Packard Company Apparatus and a method for automatically detecting and reducing red-eye in a digital image
US6407777B1 (en) * 1997-10-09 2002-06-18 Deluca Michael Joseph Red-eye filter method and apparatus
US20020136450A1 (en) * 2001-02-13 2002-09-26 Tong-Xian Chen Red-eye detection based on red region detection with eye confirmation
US20030044070A1 (en) * 2001-09-03 2003-03-06 Manfred Fuersich Method for the automatic detection of red-eye defects in photographic image data

Family Cites Families (115)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0111026B1 (en) * 1982-12-11 1986-03-05 DR.-ING. RUDOLF HELL GmbH Process and device for the copying retouching in the electronic colour picture reproduction
US4646134A (en) * 1984-03-21 1987-02-24 Sony Corporation Apparatus for encoding image signal
US5400113A (en) * 1988-03-16 1995-03-21 Nikon Corporation Control device for preventing red-eye effect on camera
US5231674A (en) * 1989-06-09 1993-07-27 Lc Technologies, Inc. Eye tracking method and apparatus
JP2859351B2 (en) * 1990-02-07 1999-02-17 三菱電機株式会社 Method for manufacturing semiconductor device
US5335072A (en) * 1990-05-30 1994-08-02 Minolta Camera Kabushiki Kaisha Photographic system capable of storing information on photographed image data
CA2087523C (en) * 1990-07-17 1997-04-15 Mark Andrew Shackleton Method of processing an image
JP2748678B2 (en) * 1990-10-09 1998-05-13 松下電器産業株式会社 Gradation correction method and gradation correction device
KR930007065B1 (en) * 1991-01-30 1993-07-26 삼성전자 주식회사 Device for editing pictures in camcoder
US5249053A (en) * 1991-02-05 1993-09-28 Dycam Inc. Filmless digital camera with selective image compression
JPH04340526A (en) * 1991-05-16 1992-11-26 Olympus Optical Co Ltd Bounce stroboscopic device
JP3528184B2 (en) * 1991-10-31 2004-05-17 ソニー株式会社 Image signal luminance correction apparatus and luminance correction method
JP2962012B2 (en) * 1991-11-08 1999-10-12 日本ビクター株式会社 Video encoding device and decoding device therefor
JP2500726B2 (en) * 1992-06-12 1996-05-29 日本電気株式会社 Method and apparatus for detecting upper eyelid region, inner corner of eye, outer corner of eye, upper eyelid region and eye structure
JPH0678320A (en) * 1992-08-25 1994-03-18 Matsushita Electric Ind Co Ltd Color adjustment device
US5649238A (en) * 1992-09-14 1997-07-15 Nikon Corporation Camera having built-in flash light emitting device for improving picture quality and method thereof
JP3009561B2 (en) * 1993-04-26 2000-02-14 富士写真フイルム株式会社 Still video camera and strobe light emission control data adjusting device
JPH07226911A (en) * 1994-02-15 1995-08-22 Eastman Kodak Japan Kk Electronic still camera
US5781650A (en) * 1994-02-18 1998-07-14 University Of Central Florida Automatic feature detection and age classification of human faces in digital images
US6714665B1 (en) * 1994-09-02 2004-03-30 Sarnoff Corporation Fully automated iris recognition system utilizing wide and narrow fields of view
JP2630923B2 (en) * 1994-12-05 1997-07-16 日本アイ・ビー・エム株式会社 Image recognition method and apparatus
JP3400888B2 (en) * 1995-03-29 2003-04-28 大日本スクリーン製造株式会社 How to change the color of a color image
US5724456A (en) * 1995-03-31 1998-03-03 Polaroid Corporation Brightness adjustment of images using digital scene analysis
JP3426060B2 (en) * 1995-07-28 2003-07-14 三菱電機株式会社 Face image processing device
JP3420405B2 (en) * 1995-09-20 2003-06-23 キヤノン株式会社 Imaging device
US6104839A (en) * 1995-10-16 2000-08-15 Eastman Kodak Company Method and apparatus for correcting pixel values in a digital image
US5708866A (en) * 1996-05-02 1998-01-13 Eastman Kodak Company Camera selects unused flash bulb farthest from taking lens to reduce red-eye effect when camera-to-subject distance within near range
JP3037140B2 (en) * 1996-06-13 2000-04-24 日本電気オフィスシステム株式会社 Digital camera
US6195127B1 (en) * 1996-07-18 2001-02-27 Sanyo Electric Co., Ltd. Digital camera, having a flash unit, which determines proper flash duration through an assessment of image luminance and, where needed, a preliminary flash emission
US6028611A (en) * 1996-08-29 2000-02-22 Apple Computer, Inc. Modular digital image processing via an image processing chain
JP3791635B2 (en) * 1996-10-22 2006-06-28 富士写真フイルム株式会社 Image reproduction method, image reproduction apparatus, image processing method, and image processing apparatus
JP3684017B2 (en) * 1997-02-19 2005-08-17 キヤノン株式会社 Image processing apparatus and method
US6573927B2 (en) * 1997-02-20 2003-06-03 Eastman Kodak Company Electronic still camera for capturing digital image and creating a print order
US6249315B1 (en) * 1997-03-24 2001-06-19 Jack M. Holm Strategy for pictorial digital image processing
JP3222091B2 (en) * 1997-05-27 2001-10-22 シャープ株式会社 Image processing apparatus and medium storing image processing apparatus control program
US6204858B1 (en) * 1997-05-30 2001-03-20 Adobe Systems Incorporated System and method for adjusting color data of pixels in a digital image
US6381345B1 (en) * 1997-06-03 2002-04-30 At&T Corp. Method and apparatus for detecting eye location in an image
US5892837A (en) * 1997-08-29 1999-04-06 Eastman Kodak Company Computer program product for locating objects in an image
US7042505B1 (en) * 1997-10-09 2006-05-09 Fotonation Ireland Ltd. Red-eye filter method and apparatus
US7738015B2 (en) * 1997-10-09 2010-06-15 Fotonation Vision Limited Red-eye filter method and apparatus
US7352394B1 (en) * 1997-10-09 2008-04-01 Fotonation Vision Limited Image modification based on red-eye filter analysis
US6266054B1 (en) * 1997-11-05 2001-07-24 Microsoft Corporation Automated removal of narrow, elongated distortions from a digital image
US6035072A (en) * 1997-12-08 2000-03-07 Read; Robert Lee Mapping defects or dirt dynamically affecting an image acquisition device
JPH11175699A (en) * 1997-12-12 1999-07-02 Fuji Photo Film Co Ltd Picture processor
US6268939B1 (en) * 1998-01-08 2001-07-31 Xerox Corporation Method and apparatus for correcting luminance and chrominance data in digital color images
US6192149B1 (en) * 1998-04-08 2001-02-20 Xerox Corporation Method and apparatus for automatic detection of image target gamma
US6233364B1 (en) * 1998-09-18 2001-05-15 Dainippon Screen Engineering Of America Incorporated Method and system for detecting and tagging dust and scratches in a digital image
JP2000115539A (en) * 1998-09-30 2000-04-21 Fuji Photo Film Co Ltd Method and device for processing image and recording medium
US6036072A (en) * 1998-10-27 2000-03-14 De Poan Pneumatic Corporation Nailer magazine
US6396599B1 (en) * 1998-12-21 2002-05-28 Eastman Kodak Company Method and apparatus for modifying a portion of an image in accordance with colorimetric parameters
US6396963B2 (en) * 1998-12-29 2002-05-28 Eastman Kodak Company Photocollage generation and modification
US6438264B1 (en) * 1998-12-31 2002-08-20 Eastman Kodak Company Method for compensating image color when adjusting the contrast of a digital color image
US6421468B1 (en) * 1999-01-06 2002-07-16 Seiko Epson Corporation Method and apparatus for sharpening an image by scaling elements of a frequency-domain representation
AUPP898499A0 (en) * 1999-03-02 1999-03-25 University Of Queensland, The Method for image texture analysis
JP2000305141A (en) * 1999-04-21 2000-11-02 Olympus Optical Co Ltd Electronic camera
US6393148B1 (en) * 1999-05-13 2002-05-21 Hewlett-Packard Company Contrast enhancement of an image using luminance and RGB statistical metrics
US7019778B1 (en) * 1999-06-02 2006-03-28 Eastman Kodak Company Customizing a digital camera
DE60040933D1 (en) * 1999-06-02 2009-01-08 Eastman Kodak Co Individually adapted digital image transmission
US6707950B1 (en) * 1999-06-22 2004-03-16 Eastman Kodak Company Method for modification of non-image data in an image processing chain
EP1962511A3 (en) * 2000-04-05 2010-10-27 Sony United Kingdom Limited Audio and/or video generation apparatus and method using a list of content items
JP2001339675A (en) * 2000-05-25 2001-12-07 Sony Corp Information processing equipment and method
JP3927353B2 (en) * 2000-06-15 2007-06-06 株式会社日立製作所 Image alignment method, comparison inspection method, and comparison inspection apparatus in comparison inspection
US20020019859A1 (en) * 2000-08-01 2002-02-14 Fuji Photo Film Co., Ltd. Method and system for contents data processing service
US6728401B1 (en) * 2000-08-17 2004-04-27 Viewahead Technology Red-eye removal using color image processing
KR100378351B1 (en) * 2000-11-13 2003-03-29 삼성전자주식회사 Method and apparatus for measuring color-texture distance, and method and apparatus for sectioning image into a plurality of regions using the measured color-texture distance
US6429924B1 (en) * 2000-11-30 2002-08-06 Eastman Kodak Company Photofinishing method
SE0004741D0 (en) * 2000-12-21 2000-12-21 Smart Eye Ab Image capturing device with reflex reduction
JP4167401B2 (en) * 2001-01-12 2008-10-15 富士フイルム株式会社 Digital camera and operation control method thereof
JP4666274B2 (en) * 2001-02-20 2011-04-06 日本電気株式会社 Color image processing apparatus and method
US7216289B2 (en) * 2001-03-16 2007-05-08 Microsoft Corporation Method and apparatus for synchronizing multiple versions of digital data
JP2002287017A (en) * 2001-03-28 2002-10-03 Fuji Photo Optical Co Ltd Focusing state detecting device for photographic lens
US6859565B2 (en) * 2001-04-11 2005-02-22 Hewlett-Packard Development Company, L.P. Method and apparatus for the removal of flash artifacts
US6766067B2 (en) * 2001-04-20 2004-07-20 Mitsubishi Electric Research Laboratories, Inc. One-pass super-resolution images
US6516154B1 (en) * 2001-07-17 2003-02-04 Eastman Kodak Company Image revising camera and method
JP2003066298A (en) * 2001-08-28 2003-03-05 Pentax Corp Controlling device for optical axis of lens
EP1288859A1 (en) * 2001-09-03 2003-03-05 Agfa-Gevaert AG Method for automatic detection of red-eye defects in photographic images
GB2379819B (en) * 2001-09-14 2005-09-07 Pixology Ltd Image processing to remove red-eye features
US7262798B2 (en) * 2001-09-17 2007-08-28 Hewlett-Packard Development Company, L.P. System and method for simulating fill flash in photography
US7133070B2 (en) * 2001-09-20 2006-11-07 Eastman Kodak Company System and method for deciding when to correct image-specific defects based on camera, scene, display and demographic data
JP4076057B2 (en) * 2001-09-26 2008-04-16 富士フイルム株式会社 Image data transmission method, digital camera, and program
US7324246B2 (en) * 2001-09-27 2008-01-29 Fujifilm Corporation Apparatus and method for image processing
US7130446B2 (en) * 2001-12-03 2006-10-31 Microsoft Corporation Automatic detection and tracking of multiple individuals using multiple cues
US7688349B2 (en) * 2001-12-07 2010-03-30 International Business Machines Corporation Method of detecting and tracking groups of people
JP2003179807A (en) * 2001-12-13 2003-06-27 Fuji Photo Film Co Ltd Image pickup device
US7162102B2 (en) * 2001-12-19 2007-01-09 Eastman Kodak Company Method and system for compositing images to produce a cropped image
US7289664B2 (en) * 2002-01-17 2007-10-30 Fujifilm Corporation Method of detecting and correcting the red eye
JP4275344B2 (en) * 2002-01-22 2009-06-10 富士フイルム株式会社 Imaging apparatus, imaging method, and program
US20030161506A1 (en) * 2002-02-25 2003-08-28 Eastman Kodak Company Face detection computer program product for redeye correction
JP3973462B2 (en) * 2002-03-18 2007-09-12 富士フイルム株式会社 Image capture method
JP4285948B2 (en) * 2002-06-18 2009-06-24 オリンパス株式会社 Imaging device
US7035462B2 (en) * 2002-08-29 2006-04-25 Eastman Kodak Company Apparatus and method for processing digital images having eye color defects
JP2004104940A (en) * 2002-09-11 2004-04-02 Nidec Copal Corp Motor drive unit
EP1404113A1 (en) * 2002-09-24 2004-03-31 Agfa-Gevaert AG Method for automatically processing digital image data
US7181082B2 (en) * 2002-12-18 2007-02-20 Sharp Laboratories Of America, Inc. Blur detection system
US7224850B2 (en) * 2003-05-13 2007-05-29 Microsoft Corporation Modification of red-eye-effect in digital image
US8170294B2 (en) * 2006-11-10 2012-05-01 DigitalOptics Corporation Europe Limited Method of detecting redeye in a digital image
US7362368B2 (en) * 2003-06-26 2008-04-22 Fotonation Vision Limited Perfecting the optics within a digital image acquisition device using face detection
US7844076B2 (en) * 2003-06-26 2010-11-30 Fotonation Vision Limited Digital image processing using face detection and skin tone information
US7792335B2 (en) * 2006-02-24 2010-09-07 Fotonation Vision Limited Method and apparatus for selective disqualification of digital images
US7689009B2 (en) * 2005-11-18 2010-03-30 Fotonation Vision Ltd. Two stage detection for photographic eye artifacts
US20050031224A1 (en) * 2003-08-05 2005-02-10 Yury Prilutsky Detecting red eye filter and apparatus using meta-data
US7453506B2 (en) * 2003-08-25 2008-11-18 Fujifilm Corporation Digital camera having a specified portion preview section
US7590305B2 (en) * 2003-09-30 2009-09-15 Fotonation Vision Limited Digital camera with built-in lens calibration table
US7369712B2 (en) * 2003-09-30 2008-05-06 Fotonation Vision Limited Automated statistical self-calibrating detection and removal of blemishes in digital images based on multiple occurrences of dust in images
US7412105B2 (en) * 2003-10-03 2008-08-12 Adobe Systems Incorporated Tone selective adjustment of images
US20050134719A1 (en) * 2003-12-23 2005-06-23 Eastman Kodak Company Display device with automatic area of importance display
US7627146B2 (en) * 2004-06-30 2009-12-01 Lexmark International, Inc. Method and apparatus for effecting automatic red eye reduction
EP1628494A1 (en) * 2004-08-17 2006-02-22 Dialog Semiconductor GmbH Intelligent light source with synchronization with a digital camera
US7315631B1 (en) * 2006-08-11 2008-01-01 Fotonation Vision Limited Real-time face tracking in a digital image acquisition device
US7715597B2 (en) * 2004-12-29 2010-05-11 Fotonation Ireland Limited Method and component for image recognition
US7599577B2 (en) * 2005-11-18 2009-10-06 Fotonation Vision Limited Method and apparatus of correcting hybrid flash artifacts in digital images
EP1987475A4 (en) * 2006-02-14 2009-04-22 Fotonation Vision Ltd Automatic detection and correction of non-red eye flash defects
EP2033142B1 (en) * 2006-06-12 2011-01-26 Tessera Technologies Ireland Limited Advances in extending the aam techniques from grayscale to color images
US7515740B2 (en) * 2006-08-02 2009-04-07 Fotonation Vision Limited Face recognition with combined PCA-based datasets
US7403643B2 (en) * 2006-08-11 2008-07-22 Fotonation Vision Limited Real-time face tracking in a digital image acquisition device

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5432863A (en) * 1993-07-19 1995-07-11 Eastman Kodak Company Automated detection and correction of eye color defects due to flash illumination
US5751836A (en) * 1994-09-02 1998-05-12 David Sarnoff Research Center Inc. Automated, non-invasive iris recognition system and method
US6407777B1 (en) * 1997-10-09 2002-06-18 Deluca Michael Joseph Red-eye filter method and apparatus
US6278491B1 (en) * 1998-01-29 2001-08-21 Hewlett-Packard Company Apparatus and a method for automatically detecting and reducing red-eye in a digital image
US20020136450A1 (en) * 2001-02-13 2002-09-26 Tong-Xian Chen Red-eye detection based on red region detection with eye confirmation
US20030044070A1 (en) * 2001-09-03 2003-03-06 Manfred Fuersich Method for the automatic detection of red-eye defects in photographic image data

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
JOFFE S.: "Red eye detection with machine learning", Proceedings 2003 International Conference on Image Processing (ICIP 2003), Barcelona, Spain, 14-17 September 2003, IEEE, New York, NY, US, vol. 2 of 3, pages 871-874, XP010670596, ISBN: 0-7803-7750-8 *
See also references of EP1714252A2 *

Cited By (62)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8537251B2 (en) 1997-10-09 2013-09-17 DigitalOptics Corporation Europe Limited Detecting red eye filter and apparatus using meta-data
US8203621B2 (en) 1997-10-09 2012-06-19 DigitalOptics Corporation Europe Limited Red-eye filter method and apparatus
US7746385B2 (en) 1997-10-09 2010-06-29 Fotonation Vision Limited Red-eye filter method and apparatus
US7787022B2 (en) 1997-10-09 2010-08-31 Fotonation Vision Limited Red-eye filter method and apparatus
US7804531B2 (en) 1997-10-09 2010-09-28 Fotonation Vision Limited Detecting red eye filter and apparatus using meta-data
US7847840B2 (en) 1997-10-09 2010-12-07 Fotonation Vision Limited Detecting red eye filter and apparatus using meta-data
US7847839B2 (en) 1997-10-09 2010-12-07 Fotonation Vision Limited Detecting red eye filter and apparatus using meta-data
US7852384B2 (en) 1997-10-09 2010-12-14 Fotonation Vision Limited Detecting red eye filter and apparatus using meta-data
US8131016B2 (en) 2003-06-26 2012-03-06 DigitalOptics Corporation Europe Limited Digital image processing using face detection information
US8224108B2 (en) 2003-06-26 2012-07-17 DigitalOptics Corporation Europe Limited Digital image processing using face detection information
US8126208B2 (en) 2003-06-26 2012-02-28 DigitalOptics Corporation Europe Limited Digital image processing using face detection information
US8957993B2 (en) 2003-08-05 2015-02-17 FotoNation Detecting red eye filter and apparatus using meta-data
US8520093B2 (en) 2003-08-05 2013-08-27 DigitalOptics Corporation Europe Limited Face tracker and partial face tracker for red-eye filter method and apparatus
US9025054B2 (en) 2003-08-05 2015-05-05 Fotonation Limited Detecting red eye filter and apparatus using meta-data
US9412007B2 (en) 2003-08-05 2016-08-09 Fotonation Limited Partial face detector red-eye filter method and apparatus
US7912285B2 (en) 2004-08-16 2011-03-22 Tessera Technologies Ireland Limited Foreground/background segmentation in digital images with differential exposure calculations
US7957597B2 (en) 2004-08-16 2011-06-07 Tessera Technologies Ireland Limited Foreground/background segmentation in digital images
US8175385B2 (en) 2004-08-16 2012-05-08 DigitalOptics Corporation Europe Limited Foreground/background segmentation in digital images with differential exposure calculations
US7680342B2 (en) 2004-08-16 2010-03-16 Fotonation Vision Limited Indoor/outdoor classification in digital images
US8320641B2 (en) 2004-10-28 2012-11-27 DigitalOptics Corporation Europe Limited Method and apparatus for red-eye detection using preview or other reference images
US8265388B2 (en) 2004-10-28 2012-09-11 DigitalOptics Corporation Europe Limited Analyzing partial face regions for red-eye detection in acquired digital images
US7962629B2 (en) 2005-06-17 2011-06-14 Tessera Technologies Ireland Limited Method for establishing a paired connection between media devices
US7953252B2 (en) 2005-11-18 2011-05-31 Tessera Technologies Ireland Limited Two stage detection for photographic eye artifacts
US8422780B2 (en) 2005-11-18 2013-04-16 DigitalOptics Corporation Europe Limited Method and apparatus of correcting hybrid flash artifacts in digital images
US7689009B2 (en) 2005-11-18 2010-03-30 Fotonation Vision Ltd. Two stage detection for photographic eye artifacts
US7865036B2 (en) 2005-11-18 2011-01-04 Tessera Technologies Ireland Limited Method and apparatus of correcting hybrid flash artifacts in digital images
US8823830B2 (en) 2005-11-18 2014-09-02 DigitalOptics Corporation Europe Limited Method and apparatus of correcting hybrid flash artifacts in digital images
US7869628B2 (en) 2005-11-18 2011-01-11 Tessera Technologies Ireland Limited Two stage detection for photographic eye artifacts
US7970183B2 (en) 2005-11-18 2011-06-28 Tessera Technologies Ireland Limited Two stage detection for photographic eye artifacts
US8126217B2 (en) 2005-11-18 2012-02-28 DigitalOptics Corporation Europe Limited Two stage detection for photographic eye artifacts
US8126265B2 (en) 2005-11-18 2012-02-28 DigitalOptics Corporation Europe Limited Method and apparatus of correcting hybrid flash artifacts in digital images
US8126218B2 (en) 2005-11-18 2012-02-28 DigitalOptics Corporation Europe Limited Two stage detection for photographic eye artifacts
US8131021B2 (en) 2005-11-18 2012-03-06 DigitalOptics Corporation Europe Limited Two stage detection for photographic eye artifacts
US7970182B2 (en) 2005-11-18 2011-06-28 Tessera Technologies Ireland Limited Two stage detection for photographic eye artifacts
US8160308B2 (en) 2005-11-18 2012-04-17 DigitalOptics Corporation Europe Limited Two stage detection for photographic eye artifacts
US7920723B2 (en) 2005-11-18 2011-04-05 Tessera Technologies Ireland Limited Two stage detection for photographic eye artifacts
US8175342B2 (en) 2005-11-18 2012-05-08 DigitalOptics Corporation Europe Limited Two stage detection for photographic eye artifacts
US7970184B2 (en) 2005-11-18 2011-06-28 Tessera Technologies Ireland Limited Two stage detection for photographic eye artifacts
US8180115B2 (en) 2005-11-18 2012-05-15 DigitalOptics Corporation Europe Limited Two stage detection for photographic eye artifacts
US8184868B2 (en) 2005-11-18 2012-05-22 DigitalOptics Corporation Europe Limited Two stage detection for photographic eye artifacts
US8212897B2 (en) 2005-12-27 2012-07-03 DigitalOptics Corporation Europe Limited Digital image acquisition system with portrait mode
US7692696B2 (en) 2005-12-27 2010-04-06 Fotonation Vision Limited Digital image acquisition system with portrait mode
US8184900B2 (en) 2006-02-14 2012-05-22 DigitalOptics Corporation Europe Limited Automatic detection and correction of non-red eye flash defects
US7868922B2 (en) 2006-02-14 2011-01-11 Tessera Technologies Ireland Limited Foreground/background segmentation in digital images
US7953287B2 (en) 2006-02-14 2011-05-31 Tessera Technologies Ireland Limited Image blurring
US8358841B2 (en) 2006-05-03 2013-01-22 DigitalOptics Corporation Europe Limited Foreground/background separation in digital images
US8363908B2 (en) 2006-05-03 2013-01-29 DigitalOptics Corporation Europe Limited Foreground / background separation in digital images
US7965875B2 (en) 2006-06-12 2011-06-21 Tessera Technologies Ireland Limited Advances in extending the AAM techniques from grayscale to color images
US8170294B2 (en) 2006-11-10 2012-05-01 DigitalOptics Corporation Europe Limited Method of detecting redeye in a digital image
US8055067B2 (en) 2007-01-18 2011-11-08 DigitalOptics Corporation Europe Limited Color segmentation
US7995804B2 (en) 2007-03-05 2011-08-09 Tessera Technologies Ireland Limited Red eye false positive filtering using face location and orientation
US8233674B2 (en) 2007-03-05 2012-07-31 DigitalOptics Corporation Europe Limited Red eye false positive filtering using face location and orientation
US8503818B2 (en) 2007-09-25 2013-08-06 DigitalOptics Corporation Europe Limited Eye defect detection in international standards organization images
US8036458B2 (en) 2007-11-08 2011-10-11 DigitalOptics Corporation Europe Limited Detecting redeye defects in digital images
US8000526B2 (en) 2007-11-08 2011-08-16 Tessera Technologies Ireland Limited Detecting redeye defects in digital images
US8525898B2 (en) 2008-01-30 2013-09-03 DigitalOptics Corporation Europe Limited Methods and apparatuses for using image acquisition data to detect and correct image defects
US8212864B2 (en) 2008-01-30 2012-07-03 DigitalOptics Corporation Europe Limited Methods and apparatuses for using image acquisition data to detect and correct image defects
US8081254B2 (en) 2008-08-14 2011-12-20 DigitalOptics Corporation Europe Limited In-camera based method of detecting defect eye with high accuracy
WO2010025908A1 (en) * 2008-09-03 2010-03-11 Fotonation Ireland Limited Partial face detector red-eye filter method and apparatus
US8633999B2 (en) 2009-05-29 2014-01-21 DigitalOptics Corporation Europe Limited Methods and apparatuses for foreground, top-of-the-head separation from background
US8971628B2 (en) 2010-07-26 2015-03-03 Fotonation Limited Face detection using division-generated haar-like features for illumination invariance
US8977056B2 (en) 2010-07-26 2015-03-10 Fotonation Limited Face detection using division-generated Haar-like features for illumination invariance

Also Published As

Publication number Publication date
WO2005076217A3 (en) 2006-04-20
IES20050052A2 (en) 2005-09-21
EP1714252A2 (en) 2006-10-25
JP2007525121A (en) 2007-08-30
JP4966021B2 (en) 2012-07-04
WO2005076217A9 (en) 2005-10-13
US20050140801A1 (en) 2005-06-30
US20080043121A1 (en) 2008-02-21

Similar Documents

Publication Publication Date Title
US7352394B1 (en) Image modification based on red-eye filter analysis
US20050140801A1 (en) Optimized performance for red-eye filter method and apparatus
US7042505B1 (en) Red-eye filter method and apparatus
US8279301B2 (en) Red-eye filter method and apparatus
US9412007B2 (en) Partial face detector red-eye filter method and apparatus
US6407777B1 (en) Red-eye filter method and apparatus
US8520093B2 (en) Face tracker and partial face tracker for red-eye filter method and apparatus
US8170294B2 (en) Method of detecting redeye in a digital image
US20100053367A1 (en) Partial face tracker for red-eye filter method and apparatus
US20110102643A1 (en) Partial Face Detector Red-Eye Filter Method and Apparatus
WO2010025908A1 (en) Partial face detector red-eye filter method and apparatus
IE20050052U1 (en) Optimized performance for red-eye filter method and apparatus
IES84151Y1 (en) Optimized performance for red-eye filter method and apparatus

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 EP: The EPO has been informed by WIPO that EP was designated in this application
COP Corrected version of pamphlet

Free format text: PAGE 1, DESCRIPTION, REPLACED BY A NEW PAGE 1 (WITH AN UPDATED VERSION OF PAMPHLET FRONT PAGE)

WWE Wipo information: entry into national phase

Ref document number: 2006551816

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

WWW Wipo information: withdrawn in national office

Country of ref document: DE

WWE Wipo information: entry into national phase

Ref document number: 2005707215

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 2005707215

Country of ref document: EP