US20150363916A1 - Low power demosaic with integrated chromatic aliasing repair - Google Patents

Low power demosaic with integrated chromatic aliasing repair

Info

Publication number
US20150363916A1
US20150363916A1 (application US14/304,253)
Authority
US
United States
Prior art keywords
color
subpixels
subpixel
subset
color values
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/304,253
Inventor
Anthony Botzas
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Priority to US14/304,253
Assigned to SAMSUNG ELECTRONICS CO., LTD. (Assignor: BOTZAS, ANTHONY)
Priority to KR1020150075379A (published as KR20150142601A)
Publication of US20150363916A1
Status: Abandoned

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/001Image restoration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformation in the plane of the image
    • G06T3/40Scaling the whole image or part thereof
    • G06T3/4015Demosaicing, e.g. colour filter array [CFA], Bayer pattern
    • G06K9/4652
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • G06T5/70
    • G06T5/73
    • G06T7/408
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/02Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the way in which colour is displayed
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/10Intensity circuits
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H04N23/12Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths with one sensor only
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/50Control of the SSIS exposure
    • H04N25/57Control of the dynamic range
    • H04N25/58Control of the dynamic range involving two or more exposures
    • H04N25/581Control of the dynamic range involving two or more exposures acquired simultaneously
    • H04N25/585Control of the dynamic range involving two or more exposures acquired simultaneously with pixels having different sensitivities within the sensor, e.g. fast or slow pixels or pixels having different sizes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20172Image enhancement details
    • G06T2207/20182Noise reduction or smoothing in the temporal domain; Spatio-temporal filtering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20172Image enhancement details
    • G06T2207/20192Edge enhancement; Edge preservation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20172Image enhancement details
    • G06T2207/20201Motion blur correction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20172Image enhancement details
    • G06T2207/20208High dynamic range [HDR] image processing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20212Image combination
    • G06T2207/20221Image fusion; Image merging

Definitions

  • the present disclosure relates to an apparatus and method for processing an image. More particularly, the present disclosure relates to an apparatus and method for demosaicing an image.
  • Mobile terminals are developed to provide wireless communication between users. As technology has advanced, mobile terminals now provide many additional features beyond simple telephone conversation. For example, mobile terminals are now able to provide image and video capture. As a result of the ubiquity of mobile terminals, image capture and/or video capture have become increasingly popular. Consequently, various image processing techniques are used to provide a user with an accurate representation of the image intended to be captured.
  • demosaicing refers to the digital imaging process used to reconstruct a full color image from color samples output from an image detector.
  • an aspect of the present disclosure is to provide an apparatus and method for demosaicing sampled color values.
  • a method for demosaicing sampled color values includes sampling color information respectively at a plurality of subpixels using a plurality of photo detectors, generating a low frequency color reconstruction solution for a first subset of the plurality of subpixels corresponding to subpixels for which a first color is sampled, generating a high frequency color reconstruction solution for the first subset of the plurality of subpixels corresponding to subpixels for which the first color is sampled, determining color values for the first subset of the plurality of subpixels using a combination of the low frequency color reconstruction solution and the high frequency color reconstruction solution, and determining, using the color values for the first subset of the plurality of subpixels, color values for a second subset of the plurality of subpixels corresponding to subpixels for which the first color is not sampled.
  • an apparatus for demosaicing sampled color values includes a storage unit, and at least one processor configured to sample color information respectively at a plurality of subpixels using a plurality of photo detectors, to generate a low frequency color reconstruction solution for a first subset of the plurality of subpixels corresponding to subpixels for which a first color is sampled, to generate a high frequency color reconstruction solution for the first subset of the plurality of subpixels corresponding to subpixels for which the first color is sampled, to determine color values for the first subset of the plurality of subpixels using a combination of the low frequency color reconstruction solution and the high frequency color reconstruction solution, and to determine, using the color values for the first subset of the plurality of subpixels, color values for a second subset of the plurality of subpixels corresponding to subpixels for which the first color is not sampled.
  • FIG. 1 illustrates an array of photosites of an image detector according to the related art
  • FIG. 2 illustrates a plurality of subpixels according to an embodiment of the present disclosure
  • FIG. 3 illustrates green subpixels according to an embodiment of the present disclosure
  • FIG. 4 illustrates a method of determining color values at corresponding subpixels according to an embodiment of the present disclosure
  • FIG. 5 illustrates a method of determining color values at corresponding subpixels according to an embodiment of the present disclosure
  • FIG. 6 is a block diagram schematically illustrating a configuration of an electronic device according to an embodiment of the present disclosure.
  • an electronic device may include communication functionality.
  • an electronic device may be a smart phone, a tablet Personal Computer (PC), a mobile phone, a video phone, an e-book reader, a desktop PC, a laptop PC, a netbook PC, a Personal Digital Assistant (PDA), a Portable Multimedia Player (PMP), an mp3 player, a mobile medical device, a camera, a wearable device (e.g., a Head-Mounted Device (HMD), electronic clothes, electronic braces, an electronic necklace, an electronic appcessory, an electronic tattoo, or a smart watch), and/or the like.
  • an electronic device may be a smart home appliance with communication functionality.
  • a smart home appliance may be, for example, a television, a Digital Video Disk (DVD) player, an audio, a refrigerator, an air conditioner, a vacuum cleaner, an oven, a microwave oven, a washer, a dryer, an air purifier, a set-top box, a TV box (e.g., Samsung HomeSyncTM, Apple TVTM, or Google TVTM), a gaming console, an electronic dictionary, an electronic key, a camcorder, an electronic picture frame, and/or the like.
  • an electronic device may be a medical device (e.g., Magnetic Resonance Angiography (MRA) device, a Magnetic Resonance Imaging (MRI) device, Computed Tomography (CT) device, an imaging device, or an ultrasonic device), a navigation device, a Global Positioning System (GPS) receiver, an Event Data Recorder (EDR), a Flight Data Recorder (FDR), an automotive infotainment device, a naval electronic device (e.g., naval navigation device, gyroscope, or compass), an avionic electronic device, a security device, an industrial or consumer robot, and/or the like.
  • an electronic device may be furniture, part of a building/structure, an electronic board, electronic signature receiving device, a projector, various measuring devices (e.g., water, electricity, gas or electro-magnetic wave measuring devices), and/or the like that include communication functionality.
  • an electronic device may be any combination of the foregoing devices.
  • an electronic device according to various embodiments of the present disclosure is not limited to the foregoing devices.
  • Various embodiments of the present disclosure include an apparatus and method for demosaicing sampled color values.
  • various embodiments of the present disclosure include an apparatus and method for efficiently demosaicing sampled color values.
  • the apparatus and method for efficiently demosaicing sampled color values emphasize the sampled green color values.
  • the apparatus and method for efficiently demosaicing sampled color values may be ideal for demosaicing sampled color values to render a preview image so as to minimize (or reduce) the resources required to perform image processing of the image.
  • An image is represented by a number of areas called pixels.
  • Each pixel is associated with a color that should be substantially reproduced by a set of subpixels in a display.
  • each subpixel displays a primary color.
  • each subpixel according to the related art is associated with some hue and saturation. Other colors may be obtained by mixing primary colors.
  • Each pixel is mapped into a set of one or more subpixels which are to display the color of the pixel.
  • each repeating set of subpixels includes a subpixel for each primary color.
  • the subpixels are small, and are spaced closely together, to provide a desired resolution.
  • An image detector has a plurality of photo detectors used to sample an image.
  • Each of the plurality of photo detectors may sample (e.g., capture) a value for a single color.
  • each of the plurality of photo detectors may be configured with a color filter.
  • a Color Filter Array (CFA) or a Color Filter Mosaic (CFM) is an array or mosaic of color filters disposed above the plurality of photo detectors.
  • Each of the plurality of photo detectors may be located at a photosite of the image detector.
  • the photosite refers to the spatial location at which a color may be sampled by a photo detector.
  • the array or mosaic of color filters may be disposed above the plurality of photo detectors such that each photo detector has a single corresponding color filter. Accordingly, each photosite may have a corresponding sampled value for a single color.
  • Each of the photosites may be mapped or otherwise correspond to a subpixel of the image. Accordingly, each subpixel of the image may have a corresponding sampled value for a single color. Because each subpixel of the image does not have sampled values for all colors, an image represented by the sampled color values at each subpixel may appear pixelated with a disjointed color representation of the intended image. In other words, the image represented by the sampled color values at each subpixel may be an inaccurate representation of the image intended to be captured.
  • demosaicing refers to the digital imaging process used to reconstruct a full color image from sampled color values output from an image detector. Because each photo detector has a spatial footprint in the image detector, the image detector is unable to capture a color value for every color at each respective photosite thereof.
  • the reconstruction of the full color image using color samples output from the respective photo detectors constituting the image detector may use interpolation or other numerical methods to determine values for each color (e.g., red, green, and blue) at each subpixel.
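The interpolation mentioned above can be sketched minimally: estimating the missing green value at a non-green photosite as the average of its sampled green neighbors. The function name, the boolean-mask representation, and the 4-neighbor choice are illustrative assumptions, not details taken from the disclosure.

```python
import numpy as np

def interpolate_green(raw, is_green, y, x):
    """Bilinear sketch: estimate green at a non-green photosite as the
    mean of the sampled green values of its 4-connected neighbours."""
    h, w = raw.shape
    vals = [raw[ny, nx]
            for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
            if 0 <= ny < h and 0 <= nx < w and is_green[ny, nx]]
    return sum(vals) / len(vals)

# A tiny mosaic: green sampled at (0,0), (0,2), (1,1); red/blue elsewhere.
raw = np.array([[2., 0., 4.],
                [0., 6., 0.]])
is_green = np.array([[True, False, True],
                     [False, True, False]])
print(interpolate_green(raw, is_green, 0, 1))  # -> 4.0  (mean of 2, 6, 4)
```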
  • FIG. 1 illustrates an array of photosites of an image detector according to the related art.
  • an array of photosites 100 comprising red, green and blue photosites.
  • the red, green, and blue photosites may be created by placing a Bayer filter (e.g., an RGB filter) over the photo detectors respectively corresponding to the photosites.
  • the array of photosites 100 may correspond to a plurality of pixels. As illustrated in FIG. 1 , each of the plurality of pixels is distinguished from one another by a solid line. Each of the plurality of pixels may include a plurality of subpixels. For example, each of the plurality of pixels may include four subpixels. Each of the plurality of subpixels in each of the plurality of pixels is distinguished from one another by a dotted line in FIG. 1 .
  • an image detector comprises a Bayer filter
  • the plurality of subpixels constituting a pixel corresponding to the respective groups of photosites of the image detector may include a red subpixel, a blue subpixel, and two green subpixels.
  • a pixel may include two green subpixels even though the RGB pixel includes only three colors, because there are certain benefits associated with designing an array of photo detectors, and thus subpixels, whose dimensions are powers of two.
  • the color green is selected as a color to be repeated in the pixel because the human eye has been determined to perceive luminance best through the color green.
  • a red subpixel is denoted by ‘R,’
  • a blue subpixel is denoted by ‘B,’ and
  • a green subpixel is denoted by ‘G.’
  • the green subpixels may respectively sample a green color value in response to a request to capture an image.
  • the green subpixels may correspond to a subpixel 105 a , a subpixel 105 d , a subpixel 110 a , a subpixel 110 d , a subpixel 115 a , a subpixel 115 d , a subpixel 120 a , a subpixel 120 d , a subpixel 125 a , a subpixel 125 d , a subpixel 130 a , a subpixel 130 d , a subpixel 135 a , a subpixel 135 d , a subpixel 140 a , a subpixel 140 d , a subpixel 145 a , and a subpixel 145 d.
  • the red subpixels may respectively sample a red color value in response to a request to capture an image.
  • the red subpixels may correspond to a subpixel 105 b , a subpixel 110 b , a subpixel 115 b , a subpixel 120 b , a subpixel 125 b , a subpixel 130 b , a subpixel 135 b , a subpixel 140 b , and a subpixel 145 b.
  • the blue subpixels may respectively sample a blue color value in response to a request to capture an image.
  • the blue subpixels may correspond to a subpixel 105 c , a subpixel 110 c , a subpixel 115 c , a subpixel 120 c , a subpixel 125 c , a subpixel 130 c , a subpixel 135 c , a subpixel 140 c , and a subpixel 145 c.
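The two-green/one-red/one-blue arrangement described above can be sketched as a label array. The `bayer_pattern` helper and the corner placement (green top-left, red top-right, blue bottom-left, green bottom-right, matching the G/R/B/G ordering of FIG. 2) are illustrative assumptions; the figures in the disclosure fix the actual placement.

```python
import numpy as np

def bayer_pattern(height, width):
    """Build a label array for a Bayer mosaic in which each 2x2 pixel
    holds two green, one red, and one blue subpixel."""
    cfa = np.empty((height, width), dtype="<U1")
    cfa[0::2, 0::2] = "G"   # top-left subpixel of each pixel: green
    cfa[0::2, 1::2] = "R"   # top-right subpixel: red
    cfa[1::2, 0::2] = "B"   # bottom-left subpixel: blue
    cfa[1::2, 1::2] = "G"   # bottom-right subpixel: green
    return cfa

print(bayer_pattern(4, 4))
```

Note that exactly half of the subpixels are green, which is the density relied on later for frequency detection.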
  • FIG. 2 illustrates a plurality of subpixels according to an embodiment of the present disclosure.
  • an array of pixels 200 respectively comprising four subpixels is illustrated.
  • the four subpixels correspond to areas at which a corresponding color value is sampled.
  • each pixel is distinguished from one another by a solid line.
  • the plurality of subpixels in each of the plurality of pixels is distinguished from one another by a dotted line.
  • a color value is respectively sampled at a subpixel corresponding to a photosite. For example, as illustrated in FIG. 2 , a green color value is sampled at the subpixel denoted by G 0 , a red color value is sampled at the subpixel denoted by R 0 , a blue color value is sampled at the subpixel denoted by B 0 , and another green color value is sampled at the subpixel denoted by G 3 .
  • a green color value is sampled at the subpixel denoted by G 1
  • a red color value is sampled at the subpixel denoted by R 1
  • a blue color value is sampled at the subpixel denoted by B 1
  • another green color value is sampled at the subpixel denoted by G 4 .
  • a green color value is sampled at the subpixel denoted by G 2
  • a red color value is sampled at the subpixel denoted by R 2
  • a blue color value is sampled at the subpixel denoted by B 2
  • another green color value is sampled at the subpixel denoted by G 5 .
  • demosaicing methods use significant memory and processing resources to demosaic the sampled color values.
  • the related art uses a large memory buffer to store sampled color values with which the color representation is reconstructed.
  • the related art references several sampled color values in order to interpolate or otherwise calculate color values at corresponding photosites.
  • an efficient method of reconstructing a color representation of an image is provided.
  • a slim buffer (e.g., a small buffer relative to the buffer used in related art demosaicing methods) may be used when reconstructing the color representation of the image (e.g., when demosaicing the sampled color values).
  • an apparatus and method for demosaicing the sampled color values may use only sampled color values in close proximity to a particular subpixel in order to calculate an estimated color value corresponding to the particular subpixel.
  • an apparatus and method for demosaicing the sampled color values may use only sampled color values immediately adjacent to a particular subpixel in order to calculate an estimated color value corresponding to the particular subpixel.
  • color values may be calculated at subpixels for which a green color value is sampled of a particular pixel using the respective red color value and the blue color value sampled at the corresponding subpixels of the particular pixel.
  • various embodiments of the present disclosure may calculate a red color value and a blue color value at the subpixel denoted by G 4 using the red color value sampled at the subpixel denoted by R 1 and using the blue color value sampled at the subpixel denoted by B 1 .
  • the subpixel denoted by G 4 , the subpixel denoted by R 1 , and the subpixel denoted by B 1 are within the same pixel.
  • an apparatus and method for calculating color values at subpixels for which a green color value was sampled may calculate the color values according to a low frequency demosaic. According to various embodiments of the present disclosure, an apparatus and method for calculating color values at subpixels for which a green color value was sampled may calculate the color values according to a high frequency demosaic. According to various embodiments of the present disclosure, an apparatus and method for calculating color values at subpixels for which a green color value was sampled may calculate the color values according to a blend of a low frequency demosaic and a high frequency demosaic.
  • FIG. 3 illustrates green subpixels according to an embodiment of the present disclosure.
  • a color value for a particular color may be calculated (e.g., estimated) at corresponding subpixels using an average of the color values for the particular color.
  • an apparatus and method for demosaicing sampled color values may blend solutions between a low frequency demosaic solution and a high frequency demosaic solution.
  • the reconstruction of the color representation of an image may be an average or blend between a low frequency demosaic solution and a high frequency demosaic solution.
  • Another assumption is that the human eye generally disregards chrominance values sampled at high spatial frequencies; in other words, a human may not be able to detect chrominance values at high spatial frequencies. As a result, the luminance values sampled at high frequencies are more important (e.g., to human perception, and thus to image processing) than the chrominance values sampled at high frequencies.
  • an apparatus and method for demosaicing sampled images attempts to capture the elements of the sampled values (e.g., color values, color information, and/or the like) at the high spatial frequencies.
  • an apparatus and method for demosaicing sampled images attempts to capture the energy (e.g., the luminance) at the high spatial frequencies.
  • the typical Bayer filter (e.g., an RGB filter) has two green photo detectors for each red photo detector or each blue photo detector.
  • each pixel in a typical Bayer filter has two green photosites, one red photosite, and one blue photosite.
  • an image detector has a greater density of green photosites relative to a density of red photosites and a density of blue photosites. Because of the relatively greater density of green photosites, the color characteristics sampled at the green subpixels of a pixel may be more accurate than the color characteristics sampled at the red subpixel or the color characteristics sampled at the blue subpixel.
  • the greater sample set of measurements obtained through the green subpixels may result in the measurements obtained through the green subpixels of a pixel being more representative of a true image than the measurements obtained through the red subpixel or the measurements obtained through the blue subpixel.
  • the apparatus and method for demosaicing sampled color values uses color values sampled at green subpixels to analyze characteristics of the image at high frequencies in relation to the characteristics of the image at low frequencies.
  • the sampled green values are used to determine (e.g., detect) the high frequencies.
  • the high frequencies may be detected using a 3×3 frequency detection filter.
  • the high frequencies may be detected using a 2×3 frequency detection filter as provided in Equation (1) below.
  • a modified green color channel may be generated.
  • an alpha channel may be generated using the green values.
  • the alpha channel may be determined according to Equation (2) below.
  • GSGAIN may correspond to a coefficient for sharpening the alpha channel.
  • the GSGAIN may refer to a Green Sharpening Gain.
  • the GSGAIN may be used to amplify the frequency response or to deemphasize the frequency response.
  • the GREEN DATA may be 3×2 data that is filtered using the frequency detection filter to determine a frequency response.
  • a set of sampled green color values 310 from the array of sampled green color values 305 corresponds to the GREEN DATA used to determine an alpha (α) value for a corresponding pixel by filtering the sampled green color values 305 through the frequency detection filter 315 .
  • if the value of α is 0 at a particular point, the color field may be determined to be flat.
  • if the value of α is 0 at a particular point, then the green values of the set of sampled green color values 310 are the same. Accordingly, a flat field of green values creates no response by the frequency detection filter 315 .
  • if the value of α is non-zero at a particular point, the color field may be determined to be non-flat (e.g., to have an edge).
  • the frequency detection filter 315 corresponds to an edge detector at which the frequency detection filter 315 detects an edge in the green color field.
  • the frequency may be sampled by using two rows of sampled color values. Because the frequency is detected using two rows of sampled color values, the memory (e.g., the buffer) required to store the corresponding sampled color values may be reduced (e.g., relative to the memory requirements of the related art).
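The frequency-detection step above can be sketched as follows. The patent's Equation (1) kernel and Equation (2) are not reproduced in this text, so the 2×3 high-pass kernel below is a hypothetical stand-in chosen only to satisfy the stated property that a flat field of green values produces no response; `gsgain` models the Green Sharpening Gain that amplifies or deemphasizes the frequency response.

```python
import numpy as np

# Hypothetical 2x3 high-pass kernel standing in for Equation (1).
# Its coefficients sum to zero, so a flat green field yields no response.
FREQ_DETECT = np.array([[-1.0, 2.0, -1.0],
                        [-1.0, 2.0, -1.0]]) / 4.0

def alpha_channel(green_2x3, gsgain=1.0):
    """Sketch of Equation (2): derive alpha from a 2x3 window of sampled
    green values; gsgain sharpens or deemphasizes the response, and the
    result is clamped to [0, 1] for use as a blending weight."""
    response = float(np.sum(FREQ_DETECT * green_2x3))
    return min(1.0, max(0.0, gsgain * abs(response)))

flat = np.full((2, 3), 0.5)
edge = np.array([[0.0, 1.0, 0.0],
                 [0.0, 1.0, 0.0]])
print(alpha_channel(flat))  # flat field -> no filter response -> 0.0
print(alpha_channel(edge))  # vertical stripe -> strong response -> 1.0
```

Because only two rows of green samples are needed per window, the buffer holding sampled values can stay small, which is the memory saving the text describes.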
  • color values may be calculated (e.g., estimated) using the sampled color values from a single neighboring row of color values (e.g., rather than two neighboring rows of color values as used according to the related art).
  • the generated alpha (α) channel may be used as a basis for blending low frequency demosaic solutions with high frequency demosaic solutions.
  • the alpha channel may be used to determine a ratio between the low frequency demosaic solution component and the high frequency demosaic solution component.
  • the alpha channel may be used to determine an extent of the high frequency demosaic solution used to determine (e.g., calculated) the color values at the respective subpixels.
  • the alpha channel may be used to determine an extent of the low frequency demosaic solution used to determine (e.g., calculated) the color values at the respective subpixels.
  • the blend of the low frequency demosaic solution with the high frequency demosaic solution may be determined according to Equation (3) below.
  • α may correspond to an extent to which the high frequency demosaic solution is used to determine color values at a respective subpixel.
  • 1−α may correspond to an extent to which the low frequency demosaic solution is used to determine color values at a respective subpixel.
  • the alpha parameter (α) may be determined by the edge detector (e.g., the frequency detection filter 315 ).
  • FIG. 4 illustrates a method of determining color values at corresponding subpixels according to an embodiment of the present disclosure.
  • a low frequency demosaic solution 310 is combined with a high frequency demosaic solution 350 to determine a color value for a particular color at a particular subpixel.
  • the portion of the high frequency demosaic solution 350 used to determine the color value for a particular color at a particular subpixel may be determined according to the alpha channel.
  • the portion of the high frequency demosaic solution 350 used to determine the color value may be defined according to α.
  • the portion of the low frequency demosaic solution 310 used to determine the color value for a particular color at a particular subpixel may be determined according to the alpha channel.
  • the portion of the low frequency demosaic solution 310 used to determine the color value may be defined according to 1−α.
  • color values at a subpixel may be calculated according to Equation (4) below.
  • R′G′B′ = α*(high frequency RGB value) + (1−α)*(low frequency RGB value)   Equation (4)
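Equation (4) is a per-channel linear blend, which can be sketched directly; the function name and tuple representation of an RGB triple are illustrative assumptions.

```python
def blend_demosaic(alpha, high_rgb, low_rgb):
    """Equation (4): R'G'B' = alpha*(high frequency RGB value)
    + (1 - alpha)*(low frequency RGB value), applied channel-wise;
    alpha comes from the edge detector (frequency detection filter)."""
    return tuple(alpha * h + (1.0 - alpha) * l
                 for h, l in zip(high_rgb, low_rgb))

# In a flat region (alpha = 0) only the low frequency solution survives;
# on a strong edge (alpha = 1) only the high frequency solution is used.
print(blend_demosaic(0.0, (200, 200, 200), (120, 80, 60)))  # -> (120.0, 80.0, 60.0)
print(blend_demosaic(1.0, (200, 200, 200), (120, 80, 60)))  # -> (200.0, 200.0, 200.0)
```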
  • a human eye generally disregards chrominance values sampled at high spatial frequencies, and as a result, at high spatial frequencies, the luminance is more important than the chromatic values.
  • at high spatial frequencies, the color values are changing very rapidly at the corresponding subpixels.
  • with respect to the high frequency demosaic solution, the green subpixel is substituted with a grey color value. For example, as illustrated in FIG. 4 , at the green subpixel 471 and at the green subpixel 475 , the corresponding red color value and blue color value are substituted with the corresponding sampled green color value.
  • the red color value and the blue color value are set to the sampled green color value that was captured at the green subpixel 471 .
  • the red color value and the blue color value are set to the sampled green color value that was captured at the green subpixel 475 .
  • the sampled color values are assumed to be accurate representations of the true image. Accordingly, with respect to the low frequency demosaic solution for the green subpixels, the red color value and the blue color value are reconstructed by substituting a neighboring sampled red color value and a neighboring sampled blue color value. For example, at a green subpixel 431 , a red color value is set to be equal to the sampled red color value sampled at a red subpixel 423 , and a blue color value is set to be equal to the sampled blue color value sampled at a blue subpixel 429 .
  • the green color value at the green subpixel 431 corresponds to the sampled green color value at the green subpixel 431 .
  • a red color value is set to be equal to the sampled red color value sampled at a red subpixel 427
  • a blue color value is set to be equal to the sampled blue color value sampled at a blue subpixel 433
  • green color value corresponds to the sampled green color value at the green subpixel 435 .
  • a red color value and a blue color value at a green subpixel 421 may be set to be equal to the red color value and the blue color value substituted at the green subpixel 431 .
  • the red color value is set to be equal to the sampled red color value sampled at the red subpixel 423
  • the blue color value is set to be equal to the sampled blue color value sampled at the blue subpixel 429
  • the green color value corresponds to the sampled green color value at the green subpixel 421 .
  • a red color value and a blue color value at a green subpixel 425 may be set to be equal to the red color value and the blue color value substituted at the green subpixel 435.
  • the red color value is set to be equal to the sampled red color value sampled at the red subpixel 427
  • the blue color value is set to be equal to the sampled blue color value sampled at the blue subpixel 433
  • the green color value corresponds to the sampled green color value at the green subpixel 425 .
  • a sampled color value is set as the estimated (e.g., calculated) color value at a neighboring subpixel for which the color value is not sampled because the likelihood that the true color values are materially different between neighboring subpixels is very remote.
  • substitution of color values is an efficient color reconstruction method that avoids more complex interpolation or other numerical methods that require intensive memory and processing power.
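The substitution scheme described above can be sketched as a small helper that produces both solutions for one green subpixel; the function name and tuple layout are illustrative assumptions:

```python
def reconstruct_green_site(g, r_neighbor, b_neighbor):
    """Return (high_freq_rgb, low_freq_rgb) for one green subpixel.

    High frequency solution: substitute grey, i.e., set the red and blue
    values to the sampled green value (as at green subpixels 471 and 475).
    Low frequency solution: substitute the neighboring sampled red and
    blue values (as at green subpixel 431, using red subpixel 423 and
    blue subpixel 429)."""
    high = (g, g, g)                     # grey substitution
    low = (r_neighbor, g, b_neighbor)    # neighbor substitution
    return high, low
```

Both solutions avoid interpolation entirely, which is what keeps the memory and processing cost low.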
  • color values for a particular subpixel are calculated to be a blend between the color values of the subpixel according to the low frequency solution and the color values of the subpixel according to the high frequency solution.
  • reconstructed color values at a green subpixel may be determined according to Equations (5) and (6).
  • G corresponds to the sampled green value at the particular subpixel
  • R corresponds to the sampled red value at a red subpixel neighboring the particular green subpixel
  • B corresponds to the sampled blue value at a blue subpixel neighboring the particular green subpixel.
  • α may be calculated using, for example, Equation (2). According to various embodiments of the present disclosure, α may be calculated using another method to derive a blending ratio for blending a high frequency demosaic solution with a low frequency demosaic solution.
  • R′ corresponds to the reconstructed (e.g., estimated) red value at the particular green pixel
  • B′ corresponds to the reconstructed (e.g., estimated) blue value at the particular green pixel.
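The bodies of Equations (5) and (6) are not reproduced in this excerpt; a plausible reading, consistent with blending the grey (high frequency) substitute and the neighbor (low frequency) substitute per Equation (4), is sketched below. The formulas inside are assumptions, not the patent's verbatim equations:

```python
def reconstruct_rb_at_green(g, r_neighbor, b_neighbor, alpha):
    """Assumed forms of Equations (5) and (6) at a green subpixel:
        R' = alpha * G + (1 - alpha) * R_neighbor   (Equation (5), assumed)
        B' = alpha * G + (1 - alpha) * B_neighbor   (Equation (6), assumed)
    At alpha = 1 this reduces to the grey substitution of the high
    frequency solution; at alpha = 0, to the neighbor substitution of
    the low frequency solution."""
    r_prime = alpha * g + (1.0 - alpha) * r_neighbor
    b_prime = alpha * g + (1.0 - alpha) * b_neighbor
    return r_prime, b_prime
```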
  • once the color values (e.g., RGB color) have been reconstructed at the green subpixels, the corresponding color values are respectively reconstructed at the red subpixels and the blue subpixels. Because of the respective lower density of the red subpixels and the blue subpixels in relation to the green subpixels, additional frequency detection on the red or blue color planes would require more line buffers. As a result, according to the related art, color reconstruction would require significant memory to reconstruct at least the red subpixels and the blue subpixels.
  • the color values (e.g., color information) at the red subpixels and the blue subpixels may be derived from available reconstructed and “repaired” color from neighboring green subpixels.
  • the color values at the green subpixels are used to reconstruct the color values at the neighboring red subpixel and the neighboring blue subpixel.
  • the color values at the red subpixels and the blue subpixels may be determined for a low frequency demosaic solution and for a high frequency demosaic solution. Thereafter, the color values at the red subpixels for the low frequency demosaic solution may be blended with the color values at the red subpixels for the high frequency demosaic solution. Similarly, color values at the blue subpixels for the low frequency demosaic solution may be blended with the color values at the blue subpixels for the high frequency demosaic solution.
  • the color values at a red subpixel may be equal to an average of the color values at the neighboring green pixels.
  • the color values at a red subpixel may be the average of the color values at the horizontally neighboring green subpixels.
  • the average of the color values at the horizontally neighboring green subpixel may be a weighted average of such color values.
  • the color values at a red subpixel may be equal to an average of the blended color values at the neighboring green pixels.
  • the color values at the horizontally neighboring green subpixels may be alpha blended between the low frequency demosaic solution and the high frequency demosaic solution before the color values at the horizontally neighboring green subpixels are averaged.
  • the color values at the red subpixel corresponding to red subpixel 423 of the low frequency demosaic solution may be derived using the color values at the green subpixel 421 , the green subpixel 425 , the green subpixel 461 , and the green subpixel 465 .
  • the red color value at the red subpixel 423 may correspond to the sampled red color value at the red subpixel 423 .
  • the green color value at the red subpixel 423 may be an average between (i) a blend of the green color value sampled at the green subpixel 421 and the green color value sampled at the green subpixel 461 , and (ii) a blend of the green color value sampled at the green subpixel 425 and the green color value sampled at the green subpixel 465 .
  • the blue color value may be an average between (i) a blend of the blue color value at the green subpixel 421 and the blue color value at the subpixel 461 , and (ii) a blend of the blue color value at the green subpixel 425 and the blue color value at the subpixel 465 .
  • the color information resulting from the above color values at the red subpixel corresponding to red subpixel 423 of the low frequency demosaic solution may represent an estimated chrominance information for the particular subpixel.
  • color information may not necessarily be considered the brightness (e.g., luminance) information for the particular subpixel. Therefore, according to various embodiments of the present disclosure, the calculated color information (e.g., color values) may be normalized for brightness. For example, the average of the alpha blends of the color information may be normalized at step 487 .
  • the calculated information may be normalized according to an additive normalization. According to various embodiments of the present disclosure, the calculated information may be normalized according to a multiplicative normalization. The multiplicative normalization does not result in desaturation side-effects.
  • the color values may be determined using an additive normalization according to Equation (7).
  • Ro corresponds to the red color value calculated for the particular red subpixel
  • Go corresponds to the green color value calculated for the particular red subpixel
  • Bo corresponds to the blue color value calculated for the particular red subpixel.
  • R corresponds to the sampled red color value at the particular red subpixel.
  • R′avg corresponds to an average between calculated red color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular red subpixel.
  • the calculated red color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular red subpixel may be determined according to Equation (5) or (6).
  • ⁇ R corresponds to a difference between (i) the sampled red color value at the particular red subpixel, and (ii) an average between calculated red color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular red subpixel.
  • G′avg corresponds to an average between calculated green color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular red subpixel.
  • the calculated green color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular red subpixel may be determined according to Equation (5) or (6).
  • B′avg corresponds to an average between calculated blue color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular red subpixel.
  • the calculated blue color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular red subpixel may be determined according to Equation (5) or (6).
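Assembled from the variable definitions above, a plausible form of the additive normalization of Equation (7) at a red subpixel can be sketched as follows (the exact equation body is an assumption):

```python
def additive_normalize_red_site(r_sampled, r_avg, g_avg, b_avg):
    """Assumed form of Equation (7), additive normalization at a red
    subpixel, built from the definitions of R, R'avg, G'avg, B'avg, dR:
        dR = R - R'avg
        Ro = R'avg + dR  (= the sampled R)
        Go = G'avg + dR
        Bo = B'avg + dR
    The offset dR propagates the sampled brightness into the estimated
    chrominance; as noted in the text, this additive form can have
    desaturation side-effects."""
    d_r = r_sampled - r_avg
    return r_sampled, g_avg + d_r, b_avg + d_r
```

Equation (8) would apply the same form at a blue subpixel, with ΔB = B − B′avg in place of ΔR.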
  • the color values may be determined using an additive normalization according to Equation (8).
  • Bo corresponds to the blue color value calculated for the particular blue subpixel
  • Go corresponds to the green color value calculated for the particular blue subpixel
  • Ro corresponds to the red color value calculated for the particular blue subpixel.
  • B corresponds to the sampled blue color value at the particular blue subpixel.
  • B′avg corresponds to an average between calculated blue color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular blue subpixel.
  • the calculated blue color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular blue subpixel may be determined according to Equation (5) or (6).
  • ⁇ B corresponds to a difference between (i) the sampled blue color value at the particular blue subpixel, and (ii) an average between calculated blue color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular blue subpixel.
  • G′avg corresponds to an average between calculated green color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular blue subpixel.
  • the calculated green color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular blue subpixel may be determined according to Equation (5) or (6).
  • R′avg corresponds to an average between calculated red color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular blue subpixel.
  • the calculated red color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular blue subpixel may be determined according to Equation (5) or (6).
  • the color values may be determined using a multiplicative normalization according to Equation (9).
  • Ro corresponds to the red color value calculated for the particular red subpixel
  • Go corresponds to the green color value calculated for the particular red subpixel
  • Bo corresponds to the blue color value calculated for the particular red subpixel.
  • R corresponds to the sampled red color value at the particular red subpixel.
  • R′avg corresponds to an average between calculated red color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular red subpixel.
  • the calculated red color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular red subpixel may be determined according to Equation (5) or (6).
  • G′avg corresponds to an average between calculated green color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular red subpixel.
  • the calculated green color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular red subpixel may be determined according to Equation (5) or (6).
  • B′avg corresponds to an average between calculated blue color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular red subpixel.
  • the calculated blue color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular red subpixel may be determined according to Equation (5) or (6).
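Similarly, a plausible form of the multiplicative normalization of Equation (9) at a red subpixel can be sketched as follows (the exact equation body and the zero-guard are assumptions):

```python
def multiplicative_normalize_red_site(r_sampled, r_avg, g_avg, b_avg, eps=1e-6):
    """Assumed form of Equation (9), multiplicative normalization at a
    red subpixel: scale the averaged neighbor estimates by the ratio of
    the sampled red value to the averaged red estimate:
        k  = R / R'avg
        Ro = k * R'avg (= the sampled R), Go = k * G'avg, Bo = k * B'avg
    Unlike the additive form, this preserves the ratios between the
    channels, avoiding the desaturation side-effect."""
    k = r_sampled / max(r_avg, eps)  # guard against division by zero
    return r_sampled, k * g_avg, k * b_avg
```

Equation (10) would apply the same scaling at a blue subpixel, with k = B / B′avg.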
  • the color values may be determined using a multiplicative normalization according to Equation (10).
  • Bo corresponds to the blue color value calculated for the particular blue subpixel
  • Go corresponds to the green color value calculated for the particular blue subpixel
  • Ro corresponds to the red color value calculated for the particular blue subpixel.
  • B corresponds to the sampled blue color value at the particular blue subpixel.
  • B′avg corresponds to an average between calculated blue color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular blue subpixel.
  • the calculated blue color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular blue subpixel may be determined according to Equation (5) or (6).
  • G′avg corresponds to an average between calculated green color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular blue subpixel.
  • the calculated green color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular blue subpixel may be determined according to Equation (5) or (6).
  • R′avg corresponds to an average between calculated red color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular blue subpixel.
  • the calculated red color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular blue subpixel may be determined according to Equation (5) or (6).
  • operations 481 - 487 illustrate a method for reconstructing color information (e.g., color values) at the blue subpixel corresponding to the blue subpixel 433 of the low frequency demosaic solution.
  • the color values determined at the green subpixel 431 of the low frequency demosaic solution are combined with the color values determined at the green subpixel 471 of the high frequency demosaic solution.
  • the color values determined at the green subpixel 431 of the low frequency demosaic solution are combined with the color values determined at the green subpixel 471 of the high frequency demosaic solution according to an alpha blend.
  • the color values determined at the green subpixel 435 of the low frequency demosaic solution are combined with the color values determined at the green subpixel 475 of the high frequency demosaic solution.
  • the color values determined at the green subpixel 435 of the low frequency demosaic solution are combined with the color values determined at the green subpixel 475 of the high frequency demosaic solution according to an alpha blend.
  • the color values determined at operation 481 and the color values determined at operation 483 are averaged.
  • the average color value for the subpixel corresponding to the blue subpixel 433 of the low frequency solution determined at operation 485 is normalized.
  • the average color value for the subpixel corresponding to the blue subpixel 433 of the low frequency solution may be normalized according to an additive normalization (e.g., using Equations (7) and (8)).
  • the average color value for the subpixel corresponding to the blue subpixel 433 of the low frequency solution may be normalized according to a multiplicative normalization (e.g., using Equations (9) and (10)).
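Operations 481-487 can be sketched end to end; multiplicative normalization is shown for operation 487, and the helper names, argument layout, and the exact normalization step are illustrative assumptions:

```python
def reconstruct_blue_site(low_g_left, high_g_left, alpha_left,
                          low_g_right, high_g_right, alpha_right,
                          b_sampled):
    """Sketch of operations 481-487 for the subpixel corresponding to
    blue subpixel 433: alpha-blend the low and high frequency RGB of
    each horizontally neighboring green subpixel (operations 481, 483),
    average the two blends (operation 485), then normalize for
    brightness (operation 487)."""
    def blend(low, high, a):  # Equation (4) per channel
        return tuple(a * h + (1.0 - a) * l for l, h in zip(low, high))

    left = blend(low_g_left, high_g_left, alpha_left)      # operation 481
    right = blend(low_g_right, high_g_right, alpha_right)  # operation 483
    r_avg, g_avg, b_avg = (0.5 * (x + y) for x, y in zip(left, right))  # op 485
    k = b_sampled / b_avg if b_avg else 1.0                # operation 487
    return k * r_avg, k * g_avg, b_sampled
```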
  • FIG. 5 illustrates a method of determining color values at corresponding subpixels according to an embodiment of the present disclosure.
  • an electronic device captures an image.
  • the electronic device may include an image detector that has a plurality of photo detectors that may be used to capture the image.
  • Each of the photo detectors may sample a single color.
  • the color information from the plurality of photo detectors may be stored.
  • the color information constituting an intended image that may require reconstruction of the color representation thereof may be received/stored.
  • the electronic device may calculate an alpha (α) parameter at the subpixels for which a green color value is sampled (e.g., the green subpixels).
  • the electronic device may generate an alpha channel using the green color values sampled at the green subpixels.
  • the electronic device may calculate the alpha parameter using Equation (2), or the like.
  • the electronic device may determine color values at the green subpixels.
  • the electronic device may determine the color values at the green subpixels using Equations (5), (6), or the like. According to various embodiments of the present disclosure, the electronic device may determine the color values at the green subpixels across the entire green color plane (e.g., across all green subpixels). According to various embodiments of the present disclosure, the electronic device may determine the color values at the green subpixels across two rows of subpixels.
  • the electronic device may proceed to the next subpixel (e.g., for which color values are required to be determined).
  • the electronic device determines whether a current subpixel is a red subpixel or a blue subpixel.
  • the electronic device may proceed to operation 530 at which the electronic device may determine color values at the current subpixel (e.g., at the red subpixel). Thereafter, the electronic device may proceed to operation 540 .
  • the electronic device may proceed to operation 535 at which the electronic device may determine color values at the current subpixel (e.g., at the blue subpixel). Thereafter, the electronic device may proceed to operation 540 .
  • the electronic device may determine whether color values for all subpixels have been calculated.
  • the electronic device may proceed to operation 520 .
  • the electronic device may proceed to operation 545 at which the electronic device may store the color information (e.g., the color values) for the respective subpixels.
  • the electronic device may store the image.
  • the electronic device may render the image.
  • the electronic device may render the image in a preview frame (e.g., an instance in which a fast or lightweight demosaicing processing is preferred).
  • the electronic device may perform further image processing before rendering the image.
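The FIG. 5 flow can be sketched as a skeleton in which the per-step computations are injected as callables; every name here is illustrative rather than the patent's API:

```python
def demosaic(raw, alpha_fn, green_fn, red_fn, blue_fn):
    """Skeleton of the FIG. 5 flow: compute alpha at the green subpixels,
    reconstruct color values at the green subpixels, then visit each
    remaining subpixel and branch on red vs. blue.

    raw maps subpixel position -> (color letter, sampled value)."""
    alpha = {p: alpha_fn(raw, p)                         # e.g., Equation (2)
             for p in raw if raw[p][0] == 'G'}
    out = {p: green_fn(raw, alpha, p) for p in alpha}    # e.g., Eqs. (5)/(6)
    for p in raw:                                        # next subpixel
        color = raw[p][0]
        if color == 'R':
            out[p] = red_fn(raw, out, p)                 # red-subpixel step
        elif color == 'B':
            out[p] = blue_fn(raw, out, p)                # blue-subpixel step
    return out                                           # store color values
```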
  • FIG. 6 is a block diagram schematically illustrating a configuration of an electronic device according to an embodiment of the present disclosure.
  • an electronic device may include a control unit 610 , a storage unit 620 , a camera unit 630 , an image processing unit 640 , a display unit 650 , an input unit 660 , and a communication unit 670 .
  • the electronic device comprises at least one control unit 610 .
  • the at least one control unit 610 may be configured to operatively control the electronic device.
  • the at least one control unit 610 may control operation of the various components or units included in the electronic device.
  • the at least one control unit 610 may transmit a signal to the various components included in the electronic device and control a signal flow between internal blocks of the electronic device.
  • the at least one control unit 610 may be or otherwise include at least one processor.
  • the at least one control unit 610 may include an Application Processor (AP), and/or the like.
  • the storage unit 620 may be configured to store user data, and the like, as well as a program which performs operating functions according to various embodiments of the present disclosure.
  • the storage unit 620 may include a non-transitory computer-readable storage medium.
  • the storage unit 620 may store a program for controlling general operation of an electronic device, an Operating System (OS) which boots the electronic device, and an application program for performing other optional functions such as a camera function, a sound replay function, an image or video replay function, a signal strength measurement function, a route generation function, image processing, and the like.
  • the storage unit 620 may store user data generated according to a user of the electronic device, such as, for example, a text message, a game file, a music file, a movie file, and the like.
  • the storage unit 620 may store an application or a plurality of applications that individually or in combination operate a camera unit (not shown) to capture (e.g., contemporaneously) one or more images, and/or the like.
  • the storage unit 620 may store an application or a plurality of applications that individually or in combination operate the image processing unit 640 or the control unit 610 to determine an alpha parameter at green subpixels, to determine (e.g., reconstruct) color values at green subpixels, to determine (e.g., reconstruct) color values at red subpixels, to determine (e.g., reconstruct) color values at blue subpixels, to store color values for one or more subpixels, to render an image (e.g., using reconstructed color values), and/or the like.
  • the storage unit 620 may store an application or a plurality of applications that individually or in combination operate the control unit 610 and the communication unit 670 to communicate with a counterpart electronic device to receive color information (e.g., color values) for an image from the counterpart electronic device, and/or the like.
  • the storage unit 620 may store an application or a plurality of applications that individually or in combination operate the display unit 650 to display a graphical user interface, an image, a video, and/or the like.
  • the storage unit 620 may store an application or a plurality of applications that individually or in combination operate the display unit 650 to display a preview image using reconstructed color information using reconstruction (e.g., demosaicing) methods according to various embodiments of the present disclosure.
  • the preview image may be displayed in a viewfinder while the camera unit 630 is operated.
  • the preview image may display a view of the scene that may be captured upon a command to capture an image.
  • the camera unit 630 may capture an image.
  • the camera unit 630 may include an image detector (not shown) that includes a plurality of photo detectors. Each of the plurality of photo detectors may correspond to a photosite.
  • the camera unit may capture a preview image as the camera unit 630 is operated. The preview image may be displayed on the display unit 650 while the camera unit 630 is operated (e.g., before the camera unit 630 is instructed to capture the image).
  • the image processing unit 640 may be configured to process image data, images, and/or the like.
  • the image processing unit 640 may include a Sub Pixel Rendering (SPR) unit (not shown), a demosaicing unit (not shown), and/or the like.
  • the image processing unit 640 may be configured to perform demosaicing of image data and/or images, SPR, and/or the like.
  • the image processing unit 640 may be configured to determine an alpha parameter at green subpixels, to determine (e.g., reconstruct) color values at green subpixels, to determine (e.g., reconstruct) color values at red subpixels, to determine (e.g., reconstruct) color values at blue subpixels, to store color values for one or more subpixels, to render an image (e.g., using reconstructed color values), and/or the like.
  • the display unit 650 displays information input by a user or information to be provided to a user, as well as various menus of the electronic device.
  • the display unit 650 may provide various screens according to a user of the electronic device, such as an idle screen, a message writing screen, a calling screen, a route planning screen, and the like.
  • the display unit 650 may display an interface which the user may manipulate or otherwise enter inputs via a touch screen to enter selection of the function relating to the signal strength of the electronic device.
  • the display unit 650 can be formed as a Liquid Crystal Display (LCD), an Organic Light Emitting Diode (OLED), an Active Matrix Organic Light Emitting Diode (AMOLED), and the like.
  • the display unit 650 can perform the function of the input unit 660 if the display unit 650 is formed as a touch screen.
  • the input unit 660 may include input keys and function keys for receiving user input.
  • the input unit 660 may include input keys and function keys for receiving an input of numbers or various sets of letter information, setting various functions, and controlling functions of the electronic device.
  • the input unit 660 may include a calling key for requesting a voice call, a video call request key for requesting a video call, a termination key for requesting termination of a voice call or a video call, a volume key for adjusting output volume of an audio signal, a direction key, and the like.
  • the input unit 660 may transmit to the at least one control unit 610 signals related to the operation of a camera unit (not shown), to selection of an image, to selection of a viewpoint, and/or the like.
  • Such an input unit 660 may be formed by one or a combination of input means such as a touch pad, a touchscreen, a button-type key pad, a joystick, a wheel key, and the like.
  • the communication unit 670 may be configured for communicating with other electronic devices and/or networks. According to various embodiments of the present disclosure, the communication unit 670 may be configured to communicate using various communication protocols and various communication transceivers. For example, the communication unit 670 may be configured to communicate via Bluetooth technology, NFC technology, WiFi technology, 2G technology, 3G technology, LTE technology, or another wireless technology, and/or the like.
  • Any such software may be stored in a non-transitory computer readable storage medium.
  • the non-transitory computer readable storage medium stores one or more programs (software modules), the one or more programs comprising instructions, which when executed by one or more processors in an electronic device, cause the electronic device to perform a method of the present disclosure.
  • Any such software may be stored in the form of volatile or non-volatile storage such as, for example, a storage device like a Read Only Memory (ROM), whether erasable or rewritable or not, or in the form of memory such as, for example, Random Access Memory (RAM), memory chips, device or integrated circuits or on an optically or magnetically readable medium such as, for example, a Compact Disk (CD), Digital Versatile Disc (DVD), magnetic disk or magnetic tape or the like.
  • the storage devices and storage media are various embodiments of non-transitory machine-readable storage that are suitable for storing a program or programs comprising instructions that, when executed, implement various embodiments of the present disclosure. Accordingly, various embodiments provide a program comprising code for implementing apparatus or a method as claimed in any one of the claims of this specification and a non-transitory machine-readable storage storing such a program.

Abstract

An apparatus and method for demosaicing sampled color values are provided. The method includes sampling color information respectively at a plurality of subpixels using a plurality of photo detectors, generating a low frequency color reconstruction solution for a first subset of the plurality of subpixels corresponding to subpixels for which a first color is sampled, generating a high frequency color reconstruction solution for the first subset of the plurality of subpixels corresponding to subpixels for which the first color is sampled, determining color values for the first subset of the plurality of pixels using a combination of the low frequency color reconstruction solution and the high frequency color reconstruction solution, and determining, using the first color values for the first subset of the plurality of pixels, color values for a second subset of the plurality of subpixels corresponding to subpixels for which the first color is not sampled.

Description

    CROSS-REFERENCE TO RELATED APPLICATION(S)
  • This application claims the benefit under 35 U.S.C. §119(e) of a U.S. Provisional application filed on Jun. 12, 2014 in the U.S. Patent and Trademark Office and assigned Ser. No. 62/011,311, the entire disclosure of which is hereby incorporated by reference.
  • TECHNICAL FIELD
  • The present disclosure relates to an apparatus and method for processing an image. More particularly, the present disclosure relates to an apparatus and method for demosaicing an image.
  • BACKGROUND
  • Mobile terminals are developed to provide wireless communication between users. As technology has advanced, mobile terminals now provide many additional features beyond simple telephone conversation. For example, mobile terminals are now able to provide image and video capture. As a result of the ubiquity of mobile terminals, image capture and/or video capture have become increasingly popular. Consequently, various image processing techniques are used to provide a user with an accurate representation of the image intended to be captured.
  • In order to provide a more accurate representation of the color of the image intended to be captured, a technique commonly referred to as demosaicing may be performed. Demosaicing refers to the digital imaging process used to reconstruct a full color image from color samples output from an image detector.
  • Accordingly, there is a need for an apparatus and method for providing an improved representation of the image intended to be captured. Further, there is a need for an apparatus and method for demosaicing color values sampled during image capture.
  • The above information is presented as background information only to assist with an understanding of the present disclosure. No determination has been made, and no assertion is made, as to whether any of the above might be applicable as prior art with regard to the present disclosure.
  • SUMMARY
  • Aspects of the present disclosure are to address at least the above-mentioned problems and/or disadvantages and to provide at least the advantages described below. Accordingly, an aspect of the present disclosure is to provide an apparatus and method for demosaicing sampled color values.
  • In accordance with an aspect of the present disclosure, a method for demosaicing sampled color values is provided. The method includes sampling color information respectively at a plurality of subpixels using a plurality of photo detectors, generating a low frequency color reconstruction solution for a first subset of the plurality of subpixels corresponding to subpixels for which a first color is sampled, generating a high frequency color reconstruction solution for the first subset of the plurality of subpixels corresponding to subpixels for which the first color is sampled, determining color values for the first subset of the plurality of pixels using a combination of the low frequency color reconstruction solution and the high frequency color reconstruction solution, and determining, using the first color values for the first subset of the plurality of pixels, color values for a second subset of the plurality of subpixels corresponding to subpixels for which the first color is not sampled.
  • In accordance with another aspect of the present disclosure, an apparatus for demosaicing sampled color values is provided. The apparatus includes a storage unit, and at least one processor configured to sample color information respectively at a plurality of subpixels using a plurality of photo detectors, to generate a low frequency color reconstruction solution for a first subset of the plurality of subpixels corresponding to subpixels for which a first color is sampled, to generate a high frequency color reconstruction solution for the first subset of the plurality of subpixels corresponding to subpixels for which the first color is sampled, to determine color values for the first subset of the plurality of subpixels using a combination of the low frequency color reconstruction solution and the high frequency color reconstruction solution, and to determine, using the color values for the first subset of the plurality of subpixels, color values for a second subset of the plurality of subpixels corresponding to subpixels for which the first color is not sampled.
  • Other aspects, advantages, and salient features of the disclosure will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, discloses various embodiments of the disclosure.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The above and other aspects, features, and advantages of various embodiments of the present disclosure will be more apparent from the following description taken in conjunction with the accompanying drawings, in which:
  • FIG. 1 illustrates an array of photosites of an image detector according to the related art;
  • FIG. 2 illustrates a plurality of subpixels according to an embodiment of the present disclosure;
  • FIG. 3 illustrates green subpixels according to an embodiment of the present disclosure;
  • FIG. 4 illustrates a method of determining color values at corresponding subpixels according to an embodiment of the present disclosure;
  • FIG. 5 illustrates a method of determining color values at corresponding subpixels according to an embodiment of the present disclosure; and
  • FIG. 6 illustrates a block diagram schematically illustrating a configuration of an electronic device according to an embodiment of the present disclosure.
  • Throughout the drawings, it should be noted that like reference numbers are used to depict the same or similar elements, features, and structures.
  • DETAILED DESCRIPTION
  • The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of various embodiments of the disclosure as defined by the claims and their equivalents. It includes various specific details to assist in that understanding but these are to be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the disclosure. In addition, descriptions of well-known functions and constructions are omitted for clarity and conciseness.
  • The terms and words used in the following description and claims are not limited to the bibliographical meanings, but, are merely used by the inventor to enable a clear and consistent understanding of the disclosure. Accordingly, it should be apparent to those skilled in the art that the following description of various embodiments of the present disclosure are provided for illustration purpose only and not for the purpose of limiting the disclosure as defined by the appended claims and their equivalents.
  • It is to be understood that the singular forms “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise. Thus, for example, reference to “a component surface” includes reference to one or more of such surfaces.
  • By the term “substantially” it is meant that the recited characteristic, parameter, or value need not be achieved exactly, but that deviations or variations, including for example, tolerances, measurement error, measurement accuracy limitations and other factors known to those of skill in the art, may occur in amounts that do not preclude the effect the characteristic was intended to provide.
  • According to various embodiments of the present disclosure, an electronic device may include communication functionality. For example, an electronic device may be a smart phone, a tablet Personal Computer (PC), a mobile phone, a video phone, an e-book reader, a desktop PC, a laptop PC, a netbook PC, a Personal Digital Assistant (PDA), a Portable Multimedia Player (PMP), an mp3 player, a mobile medical device, a camera, a wearable device (e.g., a Head-Mounted Device (HMD), electronic clothes, electronic braces, an electronic necklace, an electronic appcessory, an electronic tattoo, or a smart watch), and/or the like.
  • According to various embodiments of the present disclosure, an electronic device may be a smart home appliance with communication functionality. A smart home appliance may be, for example, a television, a Digital Video Disk (DVD) player, an audio, a refrigerator, an air conditioner, a vacuum cleaner, an oven, a microwave oven, a washer, a dryer, an air purifier, a set-top box, a TV box (e.g., Samsung HomeSync™, Apple TV™, or Google TV™), a gaming console, an electronic dictionary, an electronic key, a camcorder, an electronic picture frame, and/or the like.
  • According to various embodiments of the present disclosure, an electronic device may be a medical device (e.g., Magnetic Resonance Angiography (MRA) device, a Magnetic Resonance Imaging (MRI) device, Computed Tomography (CT) device, an imaging device, or an ultrasonic device), a navigation device, a Global Positioning System (GPS) receiver, an Event Data Recorder (EDR), a Flight Data Recorder (FDR), an automotive infotainment device, a naval electronic device (e.g., naval navigation device, gyroscope, or compass), an avionic electronic device, a security device, an industrial or consumer robot, and/or the like.
  • According to various embodiments of the present disclosure, an electronic device may be furniture, part of a building/structure, an electronic board, electronic signature receiving device, a projector, various measuring devices (e.g., water, electricity, gas or electro-magnetic wave measuring devices), and/or the like that include communication functionality.
  • According to various embodiments of the present disclosure, an electronic device may be any combination of the foregoing devices. In addition, it will be apparent to one having ordinary skill in the art that an electronic device according to various embodiments of the present disclosure is not limited to the foregoing devices.
  • Various embodiments of the present disclosure include an apparatus and method for demosaicing sampled color values. In addition, various embodiments of the present disclosure include an apparatus and method for efficiently demosaicing sampled color values. According to various embodiments of the present disclosure, the apparatus and method for efficiently demosaicing sampled color values emphasizes the sampled green color values. According to various embodiments of the present disclosure, the apparatus and method for efficiently demosaicing sampled color values may be ideal for demosaicing sampled color values to render a preview image so as to minimize (or reduce) the resources required to perform image processing of the image.
  • An image is represented by a number of areas called pixels. Each pixel is associated with a color that should be substantially reproduced by a set of subpixels in a display. According to the related art, each subpixel displays a primary color. For example, each subpixel according to the related art is associated with some hue and saturation. Other colors may be obtained by mixing primary colors. Each pixel is mapped into a set of one or more subpixels which are to display the color of the pixel.
  • In some displays and/or cameras, each repeating set of subpixels includes a subpixel for each primary color. The subpixels are small, and are spaced closely together, to provide a desired resolution.
  • An image detector has a plurality of photo detectors used to sample an image. Each of the plurality of photo detectors may sample (e.g., capture) a value for a single color. For example, each of the plurality of photo detectors may be configured with a color filter. According to the related art, a Color Filter Array (CFA) or a Color Filter Mosaic (CFM) is an array or mosaic of color filters disposed above the plurality of photo detectors.
  • Each of the plurality of photo detectors may be located at a photosite of the image detector. The photosite refers to the spatial location at which a color may be sampled by a photo detector. The array or mosaic of color filters may be disposed above the plurality of photo detectors such that each photo detector has a single corresponding color filter. Accordingly, each photosite may have a corresponding sampled value for a single color.
  • Each of the photosites may be mapped or otherwise correspond to a subpixel of the image. Accordingly, each subpixel of the image may have a corresponding sampled value for a single color. Because each subpixel of the image does not have sampled values for all colors, an image represented by the sampled color values at each subpixel may appear pixelated with a disjointed color representation of the intended image. In other words, the image represented by the sampled color values at each subpixel may be an inaccurate representation of the image intended to be captured.
  • In order to provide a more accurate representation of the color of the image intended to be captured, a technique commonly referred to as demosaicing may be performed. Demosaicing refers to the digital imaging process used to reconstruct a full color image from sampled color values output from an image detector. Because each photo detector has a spatial footprint in the image detector, the image detector is unable to capture a color value for every color at each respective photosite thereof. The reconstruction of the full color image using color samples output from the respective photo detectors constituting the image detector may use interpolation or other numerical methods to determine values for each color (e.g., red, green, and blue) at each subpixel.
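  • As an illustration of the interpolation-based reconstruction described above, the following sketch shows one conventional approach (simple bilinear averaging, not the method of the present disclosure): the missing green value at a red photosite is estimated by averaging its four immediately adjacent green samples. The function name and the example mosaic values are hypothetical.

```python
import numpy as np

def bilinear_green_at(raw, y, x):
    """Estimate the missing green value at a non-green photosite (y, x)
    by averaging the four immediately adjacent green samples."""
    h, w = raw.shape
    neighbors = [(y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)]
    vals = [raw[ny, nx] for ny, nx in neighbors if 0 <= ny < h and 0 <= nx < w]
    return sum(vals) / len(vals)

# A tiny mosaic in which green samples surround the red site at (1, 1).
raw = np.array([[10.0, 20.0, 10.0],
                [30.0, 99.0, 30.0],
                [10.0, 20.0, 10.0]])
print(bilinear_green_at(raw, 1, 1))  # averages 20, 20, 30, 30 -> 25.0
```

This kind of interpolation requires buffering several neighboring rows, which motivates the lower-memory approach described in the embodiments below.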
  • FIG. 1 illustrates an array of photosites of an image detector according to the related art.
  • Referring to FIG. 1, an array of photosites 100 comprising red, green, and blue photosites is provided. As an example, the red, green, and blue photosites may be created by placing a Bayer filter (e.g., an RGB filter) over the photo detectors respectively corresponding to the photosites.
  • The array of photosites 100 may correspond to a plurality of pixels. As illustrated in FIG. 1, each of the plurality of pixels is distinguished from one another by a solid line. Each of the plurality of pixels may include a plurality of subpixels. For example, each of the plurality of pixels may include four subpixels. Each of the plurality of subpixels in each of the plurality of pixels is distinguished from one another by a dotted line in FIG. 1.
  • If an image detector comprises a Bayer filter, then the plurality of subpixels constituting a pixel corresponding to the respective groups of photosites of the image detector may include a red subpixel, a blue subpixel, and two green subpixels. A pixel may include two green subpixels because, although the RGB pixel includes only three colors, there are certain benefits associated with designing an array of photo detectors, and thus subpixels, that is arranged in powers of two. The color green is selected as the color to be repeated in the pixel because the human eye has been determined to perceive luminance best through the color green.
  • As illustrated in FIG. 1, a red subpixel is denoted by ‘R,’ a blue subpixel is denoted by ‘B,’ and a green subpixel is denoted by ‘G.’
  • The green subpixels may respectively sample a green color value in response to a request to capture an image. The green subpixels may correspond to a subpixel 105 a, a subpixel 105 d, a subpixel 110 a, a subpixel 110 d, a subpixel 115 a, a subpixel 115 d, a subpixel 120 a, a subpixel 120 d, a subpixel 125 a, a subpixel 125 d, a subpixel 130 a, a subpixel 130 d, a subpixel 135 a, a subpixel 135 d, a subpixel 140 a, a subpixel 140 d, a subpixel 145 a, and a subpixel 145 d.
  • The red subpixels may respectively sample a red color value in response to a request to capture an image. The red subpixels may correspond to a subpixel 105 b, a subpixel 110 b, a subpixel 115 b, a subpixel 120 b, a subpixel 125 b, a subpixel 130 b, a subpixel 135 b, a subpixel 140 b, and a subpixel 145 b.
  • The blue subpixels may respectively sample a blue color value in response to a request to capture an image. The blue subpixels may correspond to a subpixel 105 c, a subpixel 110 c, a subpixel 115 c, a subpixel 120 c, a subpixel 125 c, a subpixel 130 c, a subpixel 135 c, a subpixel 140 c, and a subpixel 145 c.
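  • The 2×2 tiling described above (green at top-left and bottom-right, red at top-right, blue at bottom-left of each pixel, per FIG. 1 and FIG. 2) can be sketched as follows. The function name `bayer_labels` is hypothetical.

```python
import numpy as np

def bayer_labels(rows, cols):
    """Label each subpixel of a Bayer mosaic with the color it samples,
    using the G R / B G tiling of FIG. 1: 'G' top-left, 'R' top-right,
    'B' bottom-left, and 'G' bottom-right of each 2x2 pixel."""
    labels = np.empty((rows, cols), dtype='<U1')
    labels[0::2, 0::2] = 'G'   # G0-type green subpixels
    labels[0::2, 1::2] = 'R'   # red subpixels
    labels[1::2, 0::2] = 'B'   # blue subpixels
    labels[1::2, 1::2] = 'G'   # G3-type green subpixels
    return labels

print(bayer_labels(4, 4))
```

Note that half of the subpixels sample green, which is the density property the embodiments below rely on.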
  • FIG. 2 illustrates a plurality of subpixels according to an embodiment of the present disclosure.
  • Referring to FIG. 2, an array of pixels 200 respectively comprising four subpixels is illustrated. The four subpixels correspond to areas at which a corresponding color value is sampled. For example, as illustrated in the array of pixels 200, each pixel is distinguished from one another by a solid line. The plurality of subpixels in each of the plurality of pixels is distinguished from one another by a dotted line.
  • According to various embodiments of the present disclosure, a color value is respectively sampled at a subpixel corresponding to a photosite. For example, as illustrated in FIG. 2, a green color value is sampled at the subpixel denoted by G0, a red color value is sampled at the subpixel denoted by R0, a blue color value is sampled at the subpixel denoted by B0, and another green color value is sampled at the subpixel denoted by G3. Similarly, a green color value is sampled at the subpixel denoted by G1, a red color value is sampled at the subpixel denoted by R1, a blue color value is sampled at the subpixel denoted by B1, and another green color value is sampled at the subpixel denoted by G4. Further, a green color value is sampled at the subpixel denoted by G2, a red color value is sampled at the subpixel denoted by R2, a blue color value is sampled at the subpixel denoted by B2, and another green color value is sampled at the subpixel denoted by G5.
  • According to the related art, significant resources are used to demosaic the sampled color values in order to reconstruct a color representation of an image. For example, demosaicing methods according to the related art use significant memory and processing resources to demosaic the sampled color values. The related art uses a large memory buffer to store sampled color values with which the color representation is reconstructed. The related art references several sampled color values in order to interpolate or otherwise calculate color values at corresponding photosites.
  • In contrast, according to various embodiments of the present disclosure, an efficient method of reconstructing a color representation of an image is provided. According to various embodiments of the present disclosure, a slim buffer (e.g., a small buffer relative to the buffer used in related art demosaicing methods) is used for reconstructing the color representation of the image (e.g., demosaicing the sampled color values). For example, according to various embodiments of the present disclosure, an apparatus and method for demosaicing the sampled color values may use only sampled color values in close proximity to a particular subpixel in order to calculate an estimated color value corresponding to the particular subpixel. According to various embodiments of the present disclosure, an apparatus and method for demosaicing the sampled color values may use only sampled color values immediately adjacent to a particular subpixel in order to calculate an estimated color value corresponding to the particular subpixel.
  • According to various embodiments of the present disclosure, color values may be calculated at subpixels for which a green color value is sampled of a particular pixel using the respective red color value and the blue color value sampled at the corresponding subpixels of the particular pixel. With reference to FIG. 2, various embodiments of the present disclosure may calculate a red color value and a blue color value at the subpixel denoted by G4 using the red color value sampled at the subpixel denoted by R1 and using the blue color value sampled at the subpixel denoted by B1. As illustrated in FIG. 2, the subpixel denoted by G4, the subpixel denoted by R1, and the subpixel denoted by B1 are within the same pixel.
  • According to various embodiments of the present disclosure, an apparatus and method for calculating color values at subpixels for which a green color value was sampled may calculate the color values according to a low frequency demosaic. According to various embodiments of the present disclosure, an apparatus and method for calculating color values at subpixels for which a green color value was sampled may calculate the color values according to a high frequency demosaic. According to various embodiments of the present disclosure, an apparatus and method for calculating color values at subpixels for which a green color value was sampled may calculate the color values according to a blend of a low frequency demosaic and a high frequency demosaic.
  • FIG. 3 illustrates green subpixels according to an embodiment of the present disclosure.
  • Referring to FIG. 3, a color value for a particular color may be calculated (e.g., estimated) at corresponding subpixels using an average of the color values for the particular color. According to various embodiments of the present disclosure, an apparatus and method for demosaicing sampled color values may blend solutions between a low frequency demosaic solution and a high frequency demosaic solution. For example, the reconstruction of the color representation of an image may be an average or blend between a low frequency demosaic solution and a high frequency demosaic solution.
  • Various embodiments of the present disclosure may operate under a few basic assumptions.
  • One assumption is that in natural images (e.g., images captured by a camera), the chrominance varies slowly. In other words, the chrominance of a natural scene tends to have low spatial frequency: there tends to be greater energy at lower spatial frequencies in the chrominance, so the chrominance of a natural image is best captured at lower frequencies.
  • Another assumption is that a human eye generally disregards chrominance values sampled at high spatial frequencies. In other words, a human may not be able to detect chrominance values at high spatial frequencies. As a result, the luminance values at high frequencies are more important to image processing than the chrominance values. In other words, the luminance values sampled at high frequencies are more important (e.g., to the human perception) than the chrominance values sampled at high frequencies.
  • According to various embodiments of the present disclosure, an apparatus and method for demosaicing sampled images attempts to capture the elements of the sampled values (e.g., color values, color information, and/or the like) at the high spatial frequencies. For example, an apparatus and method for demosaicing sampled images attempts to capture the energy (e.g., the luminance) at the high spatial frequencies.
  • The typical Bayer filter (e.g., an RGB filter) has two green photo detectors for each red photo detector or each blue photo detector. In other words, each pixel in a typical Bayer filter has two green photosites, one red photosite, and one blue photosite. As a result, an image detector has a greater density of green photosites relative to a density of red photosites and a density of blue photosites. Because of the relatively greater density of green photosites, the color characteristics sampled at the green subpixels of a pixel may be more accurate than the color characteristics sampled at the red subpixel or the color characteristics sampled at the blue subpixel. In other words, on an individual pixel basis, the greater sample set of measurements obtained through the green subpixels may result in the measurements obtained through the green subpixels of a pixel being more representative of the true image than the measurements obtained through the red subpixel or the measurements obtained through the blue subpixel.
  • According to various embodiments of the present disclosure, the apparatus and method for demosaicing sampled color values uses color values sampled at green subpixels to analyze characteristics of the image at high frequencies in relation to the characteristics of the image at low frequencies. According to various embodiments of the present disclosure, the sampled green values are used to determine (e.g., detect) the high frequencies.
  • According to the related art, the high frequencies may be detected using a 3×3 frequency detection filter. In contrast, according to various embodiments of the present disclosure, the high frequencies may be detected using a 2×3 frequency detection filter as provided in Equation (1) below.
  • [ −1 0 −1 ; 0 2 0 ]  Equation (1)
  • According to various embodiments of the present disclosure, a modified green color channel may be generated. For example, an alpha channel may be generated using the green values. According to various embodiments of the present disclosure, the alpha channel may be determined according to Equation (2) below.
  • α = (GREEN DATA) * [ −1 0 −1 ; 0 2 0 ] * GSGAIN  Equation (2)
  • Referring to Equation (2), GSGAIN may correspond to a coefficient for sharpening the alpha channel. The GSGAIN may refer to a Green Sharpening Gain. The GSGAIN may be used to amplify the frequency response or to deemphasize the frequency response. The GREEN DATA may be a 2×3 window of green data that is filtered using the frequency detection filter to determine a frequency response.
  • Referring to FIG. 3, a set of sampled green color values 310 from the array of sampled green color values 305 corresponds to the GREEN DATA used to determine an alpha (α) value for the corresponding pixel by filtering the set of sampled green color values 310 through the frequency detection filter 315.
  • If the value of α at a particular pixel is 0, then the color field may be determined to be flat. In other words, if the value of α is 0 at a particular point, then the green values of the set of sampled green color values 310 are the same. Accordingly, a flat field of green values creates no response by the frequency detection filter 315.
  • If the value of α at a particular pixel is 1, then the color field may be determined to be non-flat (e.g., have an edge). In other words, if the value of α is 1 at a particular point, then the frequency detection filter 315 corresponds to an edge detector at which the frequency detection filter 315 detects an edge in the green color field.
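  • Equations (1) and (2), together with the flat-field and edge behavior just described, can be sketched as follows. The clamping of the response to [0, 1] and the example GSGAIN value of 0.5 are assumptions for illustration; the disclosure does not specify a normalization.

```python
import numpy as np

# 2x3 frequency detection filter of Equation (1): responds to a difference
# between the center green sample and the two green samples diagonally
# above it; the zero taps fall on non-green positions of the mosaic.
FREQ_FILTER = np.array([[-1.0, 0.0, -1.0],
                        [ 0.0, 2.0,  0.0]])

def alpha(green_window, gs_gain=0.5):
    """Equation (2): filter a 2x3 window of green data and scale by a
    Green Sharpening Gain (gs_gain is an assumed, tunable coefficient).
    The magnitude is clamped to [0, 1] for use as a blending weight."""
    response = float(np.sum(green_window * FREQ_FILTER)) * gs_gain
    return min(abs(response), 1.0)

flat = np.full((2, 3), 40.0)           # flat green field: no filter response
edge = np.array([[40.0,  0.0, 40.0],
                 [ 0.0, 90.0,  0.0]])  # abrupt change in the green field
print(alpha(flat))  # 0.0 -> flat field, low frequency solution dominates
print(alpha(edge))  # 1.0 -> edge detected, high frequency solution dominates
```

Because only two rows of green data are filtered, only two line buffers of samples are needed, consistent with the slim-buffer goal above.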
  • According to various embodiments of the present disclosure, the frequency may be detected by using two rows of sampled color values. Because the frequency is detected using two rows of sampled color values, the memory (e.g., the buffer) required to store the corresponding sampled color values may be reduced (e.g., relative to the memory requirements of the related art). According to various embodiments of the present disclosure, color values may be calculated (e.g., estimated) using information from a single neighboring row of color values (e.g., rather than two neighboring rows of color values as used according to the related art).
  • According to various embodiments of the present disclosure, the generated alpha (α) channel may be used as a basis for blending low frequency demosaic solutions with high frequency demosaic solutions. The alpha channel may be used to determine a ratio between the low frequency demosaic solution component and the high frequency demosaic solution component. According to various embodiments of the present disclosure, the alpha channel may be used to determine an extent of the high frequency demosaic solution used to determine (e.g., calculate) the color values at the respective subpixels. According to various embodiments of the present disclosure, the alpha channel may be used to determine an extent of the low frequency demosaic solution used to determine (e.g., calculate) the color values at the respective subpixels.
  • The blend of the low frequency demosaic solution with the high frequency demosaic solution may be determined according to Equation (3) below.

  • 1 = α + (1 − α), where 0 ≤ α ≤ 1  Equation (3)
  • According to various embodiments of the present disclosure, α may correspond to an extent to which the high frequency demosaic solution is used to determine color values at a respective subpixel. Conversely, 1−α may correspond to an extent to which the low frequency demosaic solution is used to determine color values at a respective subpixel. As discussed above, the alpha parameter (α) may be determined by the edge detector (e.g., the frequency detection filter 315).
  • FIG. 4 illustrates a method of determining color values at corresponding subpixels according to an embodiment of the present disclosure.
  • Referring to FIG. 4, a low frequency demosaic solution 310 is combined with a high frequency demosaic solution 350 to determine a color value for a particular color at a particular subpixel. As discussed above, the portion of the high frequency demosaic solution 350 used to determine the color value for a particular color at a particular subpixel may be determined according to the alpha channel. The portion of the high frequency demosaic solution 350 used to determine the color value may be defined according to α. The portion of the low frequency demosaic solution 310 used to determine the color value for a particular color at a particular subpixel may be determined according to the alpha channel. The portion of the low frequency demosaic solution 310 used to determine the color value may be defined according to 1−α.
  • According to various embodiments of the present disclosure, color values at a subpixel may be calculated according to Equation (4) below.

  • R′G′B′ = α*(high frequency RGB value) + (1−α)*(low frequency RGB value)  Equation (4)
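  • Equation (4) can be sketched as a per-subpixel linear blend; the helper name `blend_rgb` is hypothetical.

```python
def blend_rgb(high_rgb, low_rgb, a):
    """Equation (4): R'G'B' = a*(high frequency RGB) + (1-a)*(low frequency RGB),
    applied channel-wise, with 0 <= a <= 1 taken from the alpha channel."""
    return tuple(a * h + (1.0 - a) * l for h, l in zip(high_rgb, low_rgb))

# a = 0 (flat field) keeps the low frequency solution;
# a = 1 (edge) keeps the high frequency solution.
print(blend_rgb((50, 50, 50), (80, 40, 20), 0.0))   # (80.0, 40.0, 20.0)
print(blend_rgb((50, 50, 50), (80, 40, 20), 1.0))   # (50.0, 50.0, 50.0)
print(blend_rgb((50, 50, 50), (80, 40, 20), 0.25))  # (72.5, 42.5, 27.5)
```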
  • As discussed above, a human eye generally disregards chrominance values sampled at high spatial frequencies, and as a result, at high spatial frequencies, the luminance is more important than the chrominance values. At high frequencies, the color values are changing very rapidly at the corresponding subpixels. In order to reduce the possibility of calculating an erroneous color value at a green subpixel for the high frequency demosaic solution, the color value at the green subpixel is substituted with a grey color value (e.g., the red and blue color values are set equal to the sampled green value). For example, as illustrated in FIG. 4, at the green subpixel 471 and at the green subpixel 475, the corresponding red color value and blue color value are substituted with the corresponding sampled green color value. At the green subpixel 471, the red color value and the blue color value are set to the sampled green color value that was captured at the green subpixel 471. Similarly, at the green subpixel 475, the red color value and the blue color value are set to the sampled green color value that was captured at the green subpixel 475.
  • At low frequencies, the sampled color values are assumed to be accurate representations of the true image. Accordingly, with respect to the low frequency demosaic solution for the green subpixels, the red color value and the blue color value are reconstructed by substituting the corresponding red color value and blue color value with a neighboring sampled red color value and a neighboring sampled blue color value. For example, at a green subpixel 431, a red color value is set to be equal to the sampled red color value sampled at a red subpixel 423, and a blue color value is set to be equal to the sampled blue color value sampled at a blue subpixel 429. The green color value at the green subpixel 431 corresponds to the sampled green color value at the green subpixel 431. Similarly, at a green subpixel 435, a red color value is set to be equal to the sampled red color value sampled at a red subpixel 427, a blue color value is set to be equal to the sampled blue color value sampled at a blue subpixel 433, and the green color value corresponds to the sampled green color value at the green subpixel 435.
  • According to various embodiments of the present disclosure, a red color value and a blue color value at a green subpixel 421 may be set to be equal to the red color value and the blue color value substituted at the green subpixel 431. For example, the red color value is set to be equal to the sampled red color value sampled at the red subpixel 423, the blue color value is set to be equal to the sampled blue color value sampled at the blue subpixel 429, and the green color value corresponds to the sampled green color value at the green subpixel 421.
  • Similarly, a red color value and a blue color value at a green subpixel 425 may be set to be equal to the red color value and the blue color value substituted at the green subpixel 435. For example, the red color value is set to be equal to the sampled red color value sampled at the red subpixel 427, the blue color value is set to be equal to the sampled blue color value sampled at the blue subpixel 433, and the green color value corresponds to the sampled green color value at the green subpixel 425.
  • According to various embodiments of the present disclosure, a sampled color value is set as the estimated (e.g., calculated) color value at a neighboring subpixel for which the color value is not sampled because the likelihood that the true color values are materially different between neighboring subpixels is very remote. Such substitution of color values is an efficient color reconstruction method that avoids more complex interpolation or other numerical methods that require intensive memory and processing power.
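  • The two substitution rules described above, grey substitution for the high frequency solution and neighbor-copy for the low frequency solution, can be sketched as follows (the function names are hypothetical):

```python
def high_freq_solution(g):
    """High frequency solution at a green subpixel: substitute grey,
    i.e. set the red and blue values to the sampled green value
    (see the green subpixels 471 and 475 of FIG. 4)."""
    return (g, g, g)

def low_freq_solution(g, r_neighbor, b_neighbor):
    """Low frequency solution at a green subpixel: copy the red and blue
    values sampled at the immediately neighboring red and blue subpixels
    (see the green subpixels 431 and 435 of FIG. 4)."""
    return (r_neighbor, g, b_neighbor)

print(high_freq_solution(120))          # (120, 120, 120)
print(low_freq_solution(120, 200, 64))  # (200, 120, 64)
```

Neither rule interpolates, so each subpixel is reconstructed from at most one neighboring row of sampled values.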
  • According to various embodiments of the present disclosure, color values for a particular subpixel are calculated to be a blend between the color values of the subpixel according to the low frequency solution and the color values of the subpixel according to the high frequency solution.
  • According to various embodiments of the present disclosure, reconstructed color values at a green subpixel may be determined according to Equations (5) and (6).
  • R′ = α*G + (1−α)*R; G′ = G; B′ = α*G + (1−α)*B  Equation (5)
  • R′ = R + α*(G−R); G′ = G; B′ = B + α*(G−B)  Equation (6)
  • Referring to Equations (5) and (6), G corresponds to the sampled green value at the particular subpixel, R corresponds to the sampled red value at a red subpixel neighboring the particular green subpixel, and B corresponds to the sampled blue value at a blue subpixel neighboring the particular green subpixel. α may be calculated using, for example, Equation (2). According to various embodiments of the present disclosure, α may also be calculated using another method to derive a blending ratio for blending a high frequency demosaic solution with a low frequency demosaic solution. R′ corresponds to the reconstructed (e.g., estimated) red value at the particular green subpixel, and B′ corresponds to the reconstructed (e.g., estimated) blue value at the particular green subpixel.
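  • Equations (5) and (6) are algebraically equivalent; a minimal sketch using the Equation (6) form (the function name is hypothetical):

```python
def reconstruct_green_subpixel(g, r, b, a):
    """Equations (5)/(6): blend the grey (high frequency) substitution with
    the neighbor-copy (low frequency) substitution at a green subpixel.
    g: green sampled at this subpixel; r, b: red and blue sampled at the
    neighboring red and blue subpixels; a: alpha from the edge detector."""
    r_out = r + a * (g - r)   # Equation (6) form of a*g + (1-a)*r
    b_out = b + a * (g - b)   # Equation (6) form of a*g + (1-a)*b
    return (r_out, g, b_out)

print(reconstruct_green_subpixel(100, 60, 20, 0.0))  # flat field: (60.0, 100, 20.0)
print(reconstruct_green_subpixel(100, 60, 20, 1.0))  # edge: grey (100.0, 100, 100.0)
print(reconstruct_green_subpixel(100, 60, 20, 0.5))  # blend: (80.0, 100, 60.0)
```

The Equation (6) form needs one multiply per channel rather than two, which fits the low power goal of the disclosure.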
  • After the color values (i.e., RGB color) have been reconstructed at the green subpixels, the corresponding color values are respectively reconstructed at the red subpixels and the blue subpixels. Because of the lower density of the red subpixels and the blue subpixels in relation to the green subpixels, additional frequency detection on the red or blue color planes would require more line buffers. As a result, according to the related art, color reconstruction would require significant memory to reconstruct at least the red color pixels and the blue color pixels. However, according to various embodiments of the present disclosure, the color values (e.g., color information) at the red subpixels and the blue subpixels may be derived from the available reconstructed and "repaired" color values from neighboring green subpixels.
  • According to various embodiments of the present disclosure, after the color values have been calculated at the green subpixels, the color values at the green subpixels are used to reconstruct the color values at the neighboring red subpixel and the neighboring blue subpixel.
  • According to various embodiments of the present disclosure, the color values at the red subpixels and the blue subpixels may be determined for a low frequency demosaic solution and for a high frequency demosaic solution. Thereafter, the color values at the red subpixels for the low frequency demosaic solution may be blended with the color values at the red subpixels for the high frequency demosaic solution. Similarly, color values at the blue subpixels for the low frequency demosaic solution may be blended with the color values at the blue subpixels for the high frequency demosaic solution.
  • The color values at a red subpixel may be equal to an average of the color values at the neighboring green subpixels. For example, the color values at a red subpixel may be the average of the color values at the horizontally neighboring green subpixels. The average of the color values at the horizontally neighboring green subpixels may be a weighted average of such color values.
  • According to various embodiments of the present disclosure, the color values at a red subpixel may be equal to an average of the blended color values at the neighboring green subpixels. For example, the color values at the horizontally neighboring green subpixels may be alpha blended between the low frequency demosaic solution and the high frequency demosaic solution before the color values at the horizontally neighboring green subpixels are averaged.
  • With reference to FIG. 4, the color values at the red subpixel corresponding to red subpixel 423 of the low frequency demosaic solution may be derived using the color values at the green subpixel 421, the green subpixel 425, the green subpixel 461, and the green subpixel 465. The red color value at the red subpixel 423 may correspond to the sampled red color value at the red subpixel 423.
  • The green color value at the red subpixel 423 may be an average between (i) a blend of the green color value sampled at the green subpixel 421 and the green color value sampled at the green subpixel 461, and (ii) a blend of the green color value sampled at the green subpixel 425 and the green color value sampled at the green subpixel 465. Similarly, the blue color value may be an average between (i) a blend of the blue color value at the green subpixel 421 and the blue color value at the green subpixel 461, and (ii) a blend of the blue color value at the green subpixel 425 and the blue color value at the green subpixel 465.
  • According to various embodiments of the present disclosure, the color information resulting from the above color values at the red subpixel corresponding to red subpixel 423 of the low frequency demosaic solution may represent estimated chrominance information for the particular subpixel. However, such color information may not necessarily be considered the brightness (e.g., luminance) information for the particular subpixel. Therefore, according to various embodiments of the present disclosure, the calculated color information (e.g., color values) may be normalized for brightness. For example, the average of the alpha blends of the color information may be normalized at operation 487.
  • According to various embodiments of the present disclosure, the calculated information may be normalized according to an additive normalization. According to various embodiments of the present disclosure, the calculated information may be normalized according to a multiplicative normalization. The multiplicative normalization does not result in desaturation side-effects.
  • According to various embodiments of the present disclosure, for a particular red subpixel, the color values may be determined using an additive normalization according to Equation (7).
  • Equation (7): Ro = R; ΔR = R − R′avg; Go = G′avg + ΔR; Bo = B′avg + ΔR.
  • Referring to Equation (7), Ro corresponds to the red color value calculated for the particular red subpixel, Go corresponds to the green color value calculated for the particular red subpixel, and Bo corresponds to the blue color value calculated for the particular red subpixel. R corresponds to the sampled red color value at the particular red subpixel. R′avg corresponds to an average between calculated red color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular red subpixel. As an example, the calculated red color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular red subpixel may be determined according to Equation (5) or (6). ΔR corresponds to a difference between (i) the sampled red color value at the particular red subpixel, and (ii) an average between calculated red color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular red subpixel. G′avg corresponds to an average between calculated green color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular red subpixel. As an example, the calculated green color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular red subpixel may be determined according to Equation (5) or (6). B′avg corresponds to an average between calculated blue color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular red subpixel. As an example, the calculated blue color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular red subpixel may be determined according to Equation (5) or (6).
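As a concrete illustration, the additive normalization of Equation (7) can be written as follows. The sketch assumes the neighbor averages (R′avg, G′avg, B′avg) have already been computed from the values given by Equation (5) or (6); the function name is illustrative.

```python
def additive_normalize_red(R, R_avg, G_avg, B_avg):
    """Equation (7): Ro = R; dR = R - R'avg; Go = G'avg + dR; Bo = B'avg + dR."""
    dR = R - R_avg  # difference between the sampled red value and the averaged estimate
    return (R, G_avg + dR, B_avg + dR)
```

The blue-subpixel case of Equation (8) has the same form with the roles of red and blue exchanged.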
  • According to various embodiments of the present disclosure, for a particular blue subpixel, the color values may be determined using an additive normalization according to Equation (8).
  • Equation (8): Bo = B; ΔB = B − B′avg; Go = G′avg + ΔB; Ro = R′avg + ΔB.
  • Referring to Equation (8), Bo corresponds to the blue color value calculated for the particular blue subpixel, Go corresponds to the green color value calculated for the particular blue subpixel, and Ro corresponds to the red color value calculated for the particular blue subpixel. B corresponds to the sampled blue color value at the particular blue subpixel. B′avg corresponds to an average between calculated blue color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular blue subpixel. As an example, the calculated blue color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular blue subpixel may be determined according to Equation (5) or (6). ΔB corresponds to a difference between (i) the sampled blue color value at the particular blue subpixel, and (ii) an average between calculated blue color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular blue subpixel. G′avg corresponds to an average between calculated green color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular blue subpixel. As an example, the calculated green color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular blue subpixel may be determined according to Equation (5) or (6). R′avg corresponds to an average between calculated red color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular blue subpixel. As an example, the calculated red color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular blue subpixel may be determined according to Equation (5) or (6).
  • According to various embodiments of the present disclosure, for a particular red subpixel, the color values may be determined using a multiplicative normalization according to Equation (9).
  • Equation (9): Ro = R; Go = G′avg * (R / R′avg); Bo = B′avg * (R / R′avg).
  • Referring to Equation (9), Ro corresponds to the red color value calculated for the particular red subpixel, Go corresponds to the green color value calculated for the particular red subpixel, and Bo corresponds to the blue color value calculated for the particular red subpixel. R corresponds to the sampled red color value at the particular red subpixel. R′avg corresponds to an average between calculated red color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular red subpixel. As an example, the calculated red color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular red subpixel may be determined according to Equation (5) or (6). G′avg corresponds to an average between calculated green color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular red subpixel. As an example, the calculated green color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular red subpixel may be determined according to Equation (5) or (6). B′avg corresponds to an average between calculated blue color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular red subpixel. As an example, the calculated blue color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular red subpixel may be determined according to Equation (5) or (6).
  • According to various embodiments of the present disclosure, for a particular blue subpixel, the color values may be determined using a multiplicative normalization according to Equation (10).
  • Equation (10): Bo = B; Go = G′avg * (B / B′avg); Ro = R′avg * (B / B′avg).
  • Referring to Equation (10), Bo corresponds to the blue color value calculated for the particular blue subpixel, Go corresponds to the green color value calculated for the particular blue subpixel, and Ro corresponds to the red color value calculated for the particular blue subpixel. B corresponds to the sampled blue color value at the particular blue subpixel. B′avg corresponds to an average between calculated blue color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular blue subpixel. As an example, the calculated blue color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular blue subpixel may be determined according to Equation (5) or (6). G′avg corresponds to an average between calculated green color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular blue subpixel. As an example, the calculated green color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular blue subpixel may be determined according to Equation (5) or (6). R′avg corresponds to an average between calculated red color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular blue subpixel. As an example, the calculated red color values at the green subpixels that neighbor (e.g., horizontally neighbor) the particular blue subpixel may be determined according to Equation (5) or (6).
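As reconstructed in Equations (9) and (10) above, the multiplicative normalization scales the neighbor averages by the ratio of the sampled value to its averaged estimate, rather than shifting them by a difference. A hedged sketch follows (function names are illustrative; a nonzero averaged estimate is assumed):

```python
def multiplicative_normalize_red(R, R_avg, G_avg, B_avg):
    """Equation (9): Ro = R; Go = G'avg * (R / R'avg); Bo = B'avg * (R / R'avg)."""
    scale = R / R_avg  # assumes R_avg != 0
    return (R, G_avg * scale, B_avg * scale)

def multiplicative_normalize_blue(B, B_avg, G_avg, R_avg):
    """Equation (10): the same scaling with red and blue roles exchanged."""
    scale = B / B_avg  # assumes B_avg != 0
    return (B, G_avg * scale, R_avg * scale)
```

Because all three channels are scaled by the same ratio, the channel ratios (chrominance) of the neighbor estimate are preserved, consistent with the observation above that the multiplicative normalization does not result in desaturation side-effects.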
  • Referring back to FIG. 4, operations 481-487 illustrate a method for reconstructing color information (e.g., color values) at the blue subpixel corresponding to the blue subpixel 433 of the low frequency demosaic solution.
  • At operation 481, the color values determined at the green subpixel 431 of the low frequency demosaic solution are combined with the color values determined at the green subpixel 471 of the high frequency demosaic solution. According to various embodiments of the present disclosure, the color values determined at the green subpixel 431 of the low frequency demosaic solution are combined with the color values determined at the green subpixel 471 of the high frequency demosaic solution according to an alpha blend.
  • At operation 483, the color values determined at the green subpixel 435 of the low frequency demosaic solution are combined with the color values determined at the green subpixel 475 of the high frequency demosaic solution. According to various embodiments of the present disclosure, the color values determined at the green subpixel 435 of the low frequency demosaic solution are combined with the color values determined at the green subpixel 475 of the high frequency demosaic solution according to an alpha blend.
  • At operation 485, the color values determined at operation 481 and the color values determined at operation 483 are averaged.
  • At operation 487, the average color value for the subpixel corresponding to the blue subpixel 433 of the low frequency solution determined at operation 485 is normalized. According to various embodiments of the present disclosure, the average color value for the subpixel corresponding to the blue subpixel 433 of the low frequency solution may be normalized according to an additive normalization (e.g., using Equations (7) and (8)). According to various embodiments of the present disclosure, the average color value for the subpixel corresponding to the blue subpixel 433 of the low frequency solution may be normalized according to a multiplicative normalization (e.g., using Equations (9) and (10)).
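Operations 481 through 487 can be combined into a single routine. The following sketch is illustrative only (the helper names and the choice of the additive normalization of Equation (8) at operation 487 are assumptions); each color value is an (R, G, B) triple.

```python
def alpha_blend(low, high, alpha):
    """Operations 481 and 483: alpha-blend low and high frequency RGB triples."""
    return tuple(alpha * l + (1 - alpha) * h for l, h in zip(low, high))

def reconstruct_blue_433(low_431, high_471, low_435, high_475, B, alpha):
    left = alpha_blend(low_431, high_471, alpha)   # operation 481
    right = alpha_blend(low_435, high_475, alpha)  # operation 483
    avg = tuple((a + b) / 2 for a, b in zip(left, right))  # operation 485
    # Operation 487: additive normalization per Equation (8).
    dB = B - avg[2]
    return (avg[0] + dB, avg[1] + dB, B)
```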
  • FIG. 5 illustrates a method of determining color values at corresponding subpixels according to an embodiment of the present disclosure.
  • Referring to FIG. 5, at operation 505, an electronic device captures an image. The electronic device may include an image detector that has a plurality of photo detectors used to capture the image. Each of the photo detectors may sample a single color. According to various embodiments of the present disclosure, the color information from the plurality of photo detectors may be stored. According to various embodiments of the present disclosure, the color information constituting an intended image that may require reconstruction of the color representation thereof may be received and/or stored.
  • At operation 510, the electronic device may calculate an alpha (α) parameter at the subpixels for which a green color value is sampled (e.g., the green subpixels). According to various embodiments of the present disclosure, the electronic device may generate an alpha channel using the green color values sampled at the green subpixels. The electronic device may calculate the alpha parameter using Equation (2), or the like.
  • At operation 515, the electronic device may determine color values at the green subpixels. The electronic device may determine the color values at the green subpixels using Equations (5), (6), or the like. According to various embodiments of the present disclosure, the electronic device may determine the color values at the green subpixels across the entire green color plane (e.g., across all green subpixels). According to various embodiments of the present disclosure, the electronic device may determine the color values at the green subpixels across two rows of subpixels.
  • At operation 520, the electronic device may proceed to the next subpixel (e.g., for which color values are required to be determined).
  • At operation 525, the electronic device determines whether a current subpixel is a red subpixel or a blue subpixel.
  • If the electronic device determines that the current subpixel is a red subpixel at operation 525, then the electronic device may proceed to operation 530 at which the electronic device may determine color values at the current subpixel (e.g., at the red subpixel). Thereafter, the electronic device may proceed to operation 540.
  • In contrast, if the electronic device determines that the current subpixel is a blue subpixel at operation 525, then the electronic device may proceed to operation 535 at which the electronic device may determine color values at the current subpixel (e.g., at the blue subpixel). Thereafter, the electronic device may proceed to operation 540.
  • At operation 540, the electronic device may determine whether color values for all subpixels have been calculated.
  • If the electronic device determines that the color values for all the subpixels have not been calculated at operation 540, then the electronic device may proceed to operation 520.
  • In contrast, if the electronic device determines that the color values for all the subpixels have been calculated at operation 540, then the electronic device may proceed to operation 545 at which the electronic device may store the color information (e.g., the color values) for the respective subpixels. The electronic device may store the image.
  • Thereafter, at operation 550, the electronic device may render the image. The electronic device may render the image in a preview frame (e.g., in an instance in which fast or lightweight demosaicing processing is preferred). The electronic device may perform further image processing before rendering the image.
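The control flow of FIG. 5 (operations 505 through 550) can be summarized in a short Python sketch. This is a structural sketch only: the per-subpixel reconstruction calls are placeholders for the equations described above, and the data layout (dictionaries keyed by position) is an assumption for illustration.

```python
def demosaic(samples, colors):
    """samples: {(row, col): sampled value}; colors: {(row, col): 'R'|'G'|'B'}.
    Returns {(row, col): (R, G, B)} for every subpixel."""
    out = {}
    # Operations 510-515: alpha parameter and color values at green subpixels first.
    for pos, c in colors.items():
        if c == 'G':
            g = samples[pos]
            out[pos] = (g, g, g)  # placeholder for Equations (5)/(6)
    # Operations 520-540: visit each remaining subpixel; branch on red vs. blue.
    for pos, c in colors.items():
        if c == 'R':
            v = samples[pos]
            out[pos] = (v, v, v)  # placeholder for Equations (7)/(9)
        elif c == 'B':
            v = samples[pos]
            out[pos] = (v, v, v)  # placeholder for Equations (8)/(10)
    return out  # operation 545: store; operation 550: render
```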
  • FIG. 6 is a block diagram schematically illustrating a configuration of an electronic device according to an embodiment of the present disclosure.
  • Referring to FIG. 6, an electronic device may include a control unit 610, a storage unit 620, a camera unit 630, an image processing unit 640, a display unit 650, an input unit 660, and a communication unit 670.
  • According to various embodiments of the present disclosure, the electronic device comprises at least one control unit 610. The at least one control unit 610 may be configured to operatively control the electronic device. For example, the at least one control unit 610 may control operation of the various components or units included in the electronic device. The at least one control unit 610 may transmit a signal to the various components included in the electronic device and control a signal flow between internal blocks of the electronic device. The at least one control unit 610 may be or otherwise include at least one processor. The at least one control unit 610 may include an Application Processor (AP), and/or the like.
  • The storage unit 620 may be configured to store user data, and the like, as well as a program which performs operating functions according to various embodiments of the present disclosure. The storage unit 620 may include a non-transitory computer-readable storage medium. As an example, the storage unit 620 may store a program for controlling general operation of an electronic device, an Operating System (OS) which boots the electronic device, and an application program for performing other optional functions such as a camera function, a sound replay function, an image or video replay function, a signal strength measurement function, a route generation function, image processing, and the like. Further, the storage unit 620 may store user data generated by a user of the electronic device, such as, for example, a text message, a game file, a music file, a movie file, and the like. According to various embodiments of the present disclosure, the storage unit 620 may store an application or a plurality of applications that individually or in combination operate the camera unit 630 to capture (e.g., contemporaneously) one or more images, and/or the like. According to various embodiments of the present disclosure, the storage unit 620 may store an application or a plurality of applications that individually or in combination operate the image processing unit 640 or the control unit 610 to determine an alpha parameter at green subpixels, to determine (e.g., reconstruct) color values at green subpixels, to determine (e.g., reconstruct) color values at red subpixels, to determine (e.g., reconstruct) color values at blue subpixels, to store color values for one or more subpixels, to render an image (e.g., using reconstructed color values), and/or the like.
The storage unit 620 may store an application or a plurality of applications that individually or in combination operate the control unit 610 and the communication unit 670 to communicate with a counterpart electronic device to receive color information (e.g., color values) for an image from the counterpart electronic device, and/or the like. The storage unit 620 may store an application or a plurality of applications that individually or in combination operate the display unit 650 to display a graphical user interface, an image, a video, and/or the like. The storage unit 620 may store an application or a plurality of applications that individually or in combination operate the display unit 650 to display a preview image using color information reconstructed according to the reconstruction (e.g., demosaicing) methods of various embodiments of the present disclosure. The preview image may be displayed in a viewfinder while the camera unit 630 is operated. The preview image may display the view that may be captured upon a command to capture an image.
  • The camera unit 630 may capture an image. The camera unit 630 may include an image detector (not shown) that includes a plurality of photo detectors. Each of the plurality of photo detectors may correspond to a photosite. The camera unit may capture a preview image as the camera unit 630 is operated. The preview image may be displayed on the display unit 650 while the camera unit 630 is operated (e.g., before the camera unit 630 is instructed to capture the image).
  • The image processing unit 640 may be configured to process image data, images, and/or the like. The image processing unit 640 may include a Sub Pixel Rendering (SPR) unit (not shown), a demosaicing unit (not shown), and/or the like. In the alternative or in addition, the image processing unit 640 may be configured to perform demosaicing of image data and/or images, SPR, and/or the like. The image processing unit 640 may be configured to determine an alpha parameter at green subpixels, to determine (e.g., reconstruct) color values at green subpixels, to determine (e.g., reconstruct) color values at red subpixels, to determine (e.g., reconstruct) color values at blue subpixels, to store color values for one or more subpixels, to render an image (e.g., using reconstructed color values), and/or the like.
  • The display unit 650 displays information input by the user or information to be provided to the user, as well as various menus of the electronic device. For example, the display unit 650 may provide various screens according to use of the electronic device, such as an idle screen, a message writing screen, a calling screen, a route planning screen, and the like. According to various embodiments of the present disclosure, the display unit 650 may display an interface via which the user may manipulate or otherwise enter inputs through a touch screen to enter a selection of a function of the electronic device. The display unit 650 can be formed as a Liquid Crystal Display (LCD), an Organic Light Emitting Diode (OLED) display, an Active Matrix Organic Light Emitting Diode (AMOLED) display, and the like. However, various embodiments of the present disclosure are not limited to these examples. Further, the display unit 650 can perform the function of the input unit 660 if the display unit 650 is formed as a touch screen.
  • The input unit 660 may include input keys and function keys for receiving user input. For example, the input unit 660 may include input keys and function keys for receiving an input of numbers or various sets of letter information, setting various functions, and controlling functions of the electronic device. For example, the input unit 660 may include a calling key for requesting a voice call, a video call request key for requesting a video call, a termination key for requesting termination of a voice call or a video call, a volume key for adjusting output volume of an audio signal, a direction key, and the like. In particular, according to various embodiments of the present disclosure, the input unit 660 may transmit to the at least one control unit 610 signals related to the operation of a camera unit (not shown), to selection of an image, to selection of a viewpoint, and/or the like. Such an input unit 660 may be formed by one or a combination of input means such as a touch pad, a touchscreen, a button-type key pad, a joystick, a wheel key, and the like.
  • The communication unit 670 may be configured for communicating with other electronic devices and/or networks. According to various embodiments of the present disclosure, the communication unit 670 may be configured to communicate using various communication protocols and various communication transceivers. For example, the communication unit 670 may be configured to communicate via Bluetooth technology, NFC technology, WiFi technology, 2G technology, 3G technology, LTE technology, or another wireless technology, and/or the like.
  • It will be appreciated that various embodiments of the present disclosure according to the claims and description in the specification can be realized in the form of hardware, software or a combination of hardware and software.
  • Any such software may be stored in a non-transitory computer readable storage medium. The non-transitory computer readable storage medium stores one or more programs (software modules), the one or more programs comprising instructions, which when executed by one or more processors in an electronic device, cause the electronic device to perform a method of the present disclosure.
  • Any such software may be stored in the form of volatile or non-volatile storage such as, for example, a storage device like a Read Only Memory (ROM), whether erasable or rewritable or not, or in the form of memory such as, for example, Random Access Memory (RAM), memory chips, device or integrated circuits or on an optically or magnetically readable medium such as, for example, a Compact Disk (CD), Digital Versatile Disc (DVD), magnetic disk or magnetic tape or the like. It will be appreciated that the storage devices and storage media are various embodiments of non-transitory machine-readable storage that are suitable for storing a program or programs comprising instructions that, when executed, implement various embodiments of the present disclosure. Accordingly, various embodiments provide a program comprising code for implementing apparatus or a method as claimed in any one of the claims of this specification and a non-transitory machine-readable storage storing such a program.
  • While the disclosure has been shown and described with reference to various embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the disclosure as defined by the appended claims and their equivalents.

Claims (25)

What is claimed is:
1. A method for demosaicing sampled color values, the method comprising:
sampling color information respectively at a plurality of subpixels using a plurality of photo detectors;
generating a low frequency color reconstruction solution for a first subset of the plurality of subpixels corresponding to subpixels for which a first color is sampled;
generating a high frequency color reconstruction solution for the first subset of the plurality of subpixels corresponding to subpixels for which the first color is sampled;
determining color values for the first subset of the plurality of pixels using a combination of the low frequency color reconstruction solution and the high frequency color reconstruction solution; and
determining, using the first color values for the first subset of the plurality of pixels, color values for a second subset of the plurality of subpixels corresponding to subpixels for which the first color is not sampled.
2. The method of claim 1, further comprising:
determining, using the first color values for the first subset of the plurality of pixels, color values for a third subset of the plurality of subpixels corresponding to subpixels for which the first color is not sampled.
3. The method of claim 1, wherein the first color is green.
4. The method of claim 1, further comprising:
determining an alpha parameter using a frequency detection filter.
5. The method of claim 4, wherein the determining of the color values for the first subset of the plurality of pixels using a combination of the low frequency color reconstruction solution and the high frequency color reconstruction solution comprises:
blending the color values according to the low frequency color reconstruction solution with the color values according to the high frequency color reconstruction solution according to a ratio based at least in part on the alpha parameter.
6. The method of claim 4, wherein the determining of the color values for a second subset of the plurality of subpixels corresponding to subpixels for which the first color is not sampled, using the first color values for the first subset of the plurality of pixels comprises:
determining the color values for a particular subpixel of the second subset of the plurality of subpixels using respective color values of subpixels of the first subset of the plurality of subpixels that neighbor the particular subpixel.
7. The method of claim 6, wherein the determining of the color values for a particular subpixel of the second subset of the plurality of subpixels using respective color values of subpixels of the first subset of the plurality of subpixels that neighbor the particular subpixel comprises:
calculating an average of color values of a second color for subpixels of the first subset of the plurality of subpixels that horizontally neighbor the particular subpixel.
8. The method of claim 7, wherein the determining of the color values for a particular subpixel of the second subset of the plurality of subpixels using respective color values of subpixels of the first subset of the plurality of subpixels that neighbor the particular subpixel further comprises:
calculating an average of color values of a third color for subpixels of the first subset of the plurality of subpixels that horizontally neighbor the particular subpixel.
9. The method of claim 8, further comprising:
normalizing a color value of one or more of the first color, the second color, and the third color, for the particular subpixel.
10. The method of claim 9, wherein the normalizing of the color value comprises:
additively normalizing the color value.
11. The method of claim 9, wherein the normalizing of the color value comprises:
multiplicatively normalizing the color value.
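Claims 10 and 11 distinguish additive from multiplicative normalization of a color value but do not define either operation. One plausible reading, normalizing a value toward a reference level by offset or by ratio, is sketched below; the local-mean/reference-mean formulation is an assumption for illustration only:

```python
def normalize_additive(value, local_mean, reference_mean):
    # Shift the value by the offset between the reference and local means.
    return value + (reference_mean - local_mean)

def normalize_multiplicative(value, local_mean, reference_mean):
    # Scale the value by the ratio of the reference to the local mean;
    # pass the value through unchanged if the local mean is zero.
    if local_mean == 0:
        return value
    return value * (reference_mean / local_mean)
```

Multiplicative normalization preserves ratios between color channels (useful for hue), while additive normalization preserves differences; which is preferable depends on the color model in use.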
12. The method of claim 9, further comprising:
rendering an image on a preview screen of an electronic device, using the normalized color value.
13. A non-transitory computer-readable storage medium storing instructions that, when executed, cause at least one processor to perform the method of claim 1.
14. An apparatus for demosaicing sampled color values, the apparatus comprising:
a storage unit; and
at least one processor configured to sample color information respectively at a plurality of subpixels using a plurality of photo detectors, to generate a low frequency color reconstruction solution for a first subset of the plurality of subpixels corresponding to subpixels for which a first color is sampled, to generate a high frequency color reconstruction solution for the first subset of the plurality of subpixels corresponding to subpixels for which the first color is sampled, to determine color values for the first subset of the plurality of subpixels using a combination of the low frequency color reconstruction solution and the high frequency color reconstruction solution, and to determine, using the first color values for the first subset of the plurality of subpixels, color values for a second subset of the plurality of subpixels corresponding to subpixels for which the first color is not sampled.
15. The apparatus of claim 14, wherein the at least one processor is further configured to determine, using the first color values for the first subset of the plurality of subpixels, color values for a third subset of the plurality of subpixels corresponding to subpixels for which the first color is not sampled.
16. The apparatus of claim 14, wherein the first color is green.
17. The apparatus of claim 14, wherein the at least one processor is further configured to determine an alpha parameter using a frequency detection filter.
18. The apparatus of claim 17, wherein the at least one processor is further configured to blend the color values according to the low frequency color reconstruction solution with the color values according to the high frequency color reconstruction solution according to a ratio based at least in part on the alpha parameter.
19. The apparatus of claim 17, wherein the at least one processor is further configured to determine the color values for a particular subpixel of the second subset of the plurality of subpixels using respective color values of subpixels of the first subset of the plurality of subpixels that neighbor the particular subpixel.
20. The apparatus of claim 19, wherein the at least one processor is further configured to calculate an average of color values of a second color for subpixels of the first subset of the plurality of subpixels that horizontally neighbor the particular subpixel.
21. The apparatus of claim 20, wherein the at least one processor is further configured to calculate an average of color values of a third color for subpixels of the first subset of the plurality of subpixels that horizontally neighbor the particular subpixel.
22. The apparatus of claim 21, wherein the at least one processor is further configured to normalize a color value of one or more of the first color, the second color, and the third color, for the particular subpixel.
23. The apparatus of claim 21, wherein the at least one processor is further configured to normalize a color value of one or more of the first color, the second color, and the third color, for the particular subpixel using an additive normalization.
24. The apparatus of claim 21, wherein the at least one processor is further configured to normalize a color value of one or more of the first color, the second color, and the third color, for the particular subpixel using a multiplicative normalization.
25. The apparatus of claim 21, further comprising:
a display unit,
wherein the at least one processor is further configured to render an image on a preview screen of the display unit, using the normalized color value.
US14/304,253 2014-06-12 2014-06-13 Low power demosaic with intergrated chromatic aliasing repair Abandoned US20150363916A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US14/304,253 US20150363916A1 (en) 2014-06-12 2014-06-13 Low power demosaic with intergrated chromatic aliasing repair
KR1020150075379A KR20150142601A (en) 2014-06-12 2015-05-28 Low power demosaic with intergrated chromatic aliasing repair

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201462011311P 2014-06-12 2014-06-12
US14/304,253 US20150363916A1 (en) 2014-06-12 2014-06-13 Low power demosaic with intergrated chromatic aliasing repair

Publications (1)

Publication Number Publication Date
US20150363916A1 true US20150363916A1 (en) 2015-12-17

Family

ID=54836573

Family Applications (5)

Application Number Title Priority Date Filing Date
US14/304,227 Abandoned US20150363912A1 (en) 2014-06-12 2014-06-13 Rgbw demosaic method by combining rgb chrominance with w luminance
US14/304,253 Abandoned US20150363916A1 (en) 2014-06-12 2014-06-13 Low power demosaic with intergrated chromatic aliasing repair
US14/304,625 Abandoned US20150363922A1 (en) 2014-06-12 2014-06-13 Super-resolution from handheld camera
US14/304,214 Expired - Fee Related US9418398B2 (en) 2014-06-12 2014-06-13 Low power subpixel rendering on RGBW display
US14/304,315 Abandoned US20150363913A1 (en) 2014-06-12 2014-06-13 Adaptive filter demosaicizing for super resolution

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US14/304,227 Abandoned US20150363912A1 (en) 2014-06-12 2014-06-13 Rgbw demosaic method by combining rgb chrominance with w luminance

Family Applications After (3)

Application Number Title Priority Date Filing Date
US14/304,625 Abandoned US20150363922A1 (en) 2014-06-12 2014-06-13 Super-resolution from handheld camera
US14/304,214 Expired - Fee Related US9418398B2 (en) 2014-06-12 2014-06-13 Low power subpixel rendering on RGBW display
US14/304,315 Abandoned US20150363913A1 (en) 2014-06-12 2014-06-13 Adaptive filter demosaicizing for super resolution

Country Status (2)

Country Link
US (5) US20150363912A1 (en)
KR (3) KR20150142599A (en)

Families Citing this family (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9693427B2 (en) * 2014-01-06 2017-06-27 Fibar Group S.A. RGBW controller
JP6300658B2 (en) * 2014-06-19 2018-03-28 オリンパス株式会社 Sample observation equipment
JP5980294B2 (en) * 2014-10-27 2016-08-31 キヤノン株式会社 Data processing apparatus, imaging apparatus, and data processing method
WO2016110984A1 (en) * 2015-01-08 2016-07-14 オリンパス株式会社 Image processing device, method for operating image processing device, program for operating image processing device, and endoscope device
US9697436B2 (en) * 2015-01-16 2017-07-04 Haier Us Appliance Solutions, Inc. Method for selecting an image of an appliance with a suitable position or orientation of a door of the appliance
US9779691B2 (en) * 2015-01-23 2017-10-03 Dell Products, Lp Display front of screen performance architecture
CN104656263B (en) * 2015-03-17 2017-07-04 京东方科技集团股份有限公司 3 D displaying method and device
CN105430358B (en) * 2015-11-26 2018-05-11 努比亚技术有限公司 A kind of image processing method and device, terminal
CN107275359B (en) * 2016-04-08 2021-08-13 乐金显示有限公司 Organic light emitting display device
CN107633795B (en) * 2016-08-19 2019-11-08 京东方科技集团股份有限公司 The driving method of display device and display panel
CN108090887B (en) * 2016-11-23 2020-09-04 杭州海康威视数字技术股份有限公司 Video image processing method and device
CN106791755B (en) 2016-12-27 2018-11-23 武汉华星光电技术有限公司 A kind of RGBW pixel rendering device and method
US10210826B2 (en) * 2017-02-22 2019-02-19 Himax Technologies Limited Sub-pixel rendering method for delta RGBW panel and delta RGBW panel with sub-pixel rendering function
KR102401648B1 (en) 2017-06-07 2022-05-26 삼성디스플레이 주식회사 Display device
TWI634543B (en) * 2017-06-26 2018-09-01 友達光電股份有限公司 Driving device and driving method
CN107656717B (en) * 2017-09-25 2021-03-26 京东方科技集团股份有限公司 Display method, image processing module and display device
KR102484150B1 (en) * 2017-10-20 2023-01-04 삼성전자 주식회사 Electronic device and method for upsampling image thereof
JP6840860B2 (en) * 2017-10-23 2021-03-10 株式会社ソニー・インタラクティブエンタテインメント Image processing equipment, image processing methods and programs
CN110458789B (en) * 2018-05-02 2022-04-05 杭州海康威视数字技术股份有限公司 Image definition evaluating method and device and electronic equipment
US10692177B2 (en) 2018-08-10 2020-06-23 Apple Inc. Image pipeline with dual demosaicing circuit for efficient image processing
GB201908517D0 (en) * 2019-06-13 2019-07-31 Spectral Edge Ltd 3D digital imagenoise reduction system and method
KR20210002966A (en) * 2019-07-01 2021-01-11 삼성전자주식회사 Image sensor and driving method thereof
US10877540B2 (en) 2019-10-04 2020-12-29 Intel Corporation Content adaptive display power savings systems and methods
CN110740272B (en) * 2019-10-31 2021-05-14 Oppo广东移动通信有限公司 Image acquisition method, camera assembly and mobile terminal
CN110971799B (en) * 2019-12-09 2021-05-07 Oppo广东移动通信有限公司 Control method, camera assembly and mobile terminal

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW583603B (en) * 2003-02-21 2004-04-11 Inventec Appliances Corp Method for producing enhanced-resolution image by use of a plurality of low-resolution images
JP4461937B2 (en) * 2003-09-30 2010-05-12 セイコーエプソン株式会社 Generation of high-resolution images based on multiple low-resolution images
US20060291750A1 (en) * 2004-12-16 2006-12-28 Peyman Milanfar Dynamic reconstruction of high resolution video from low-resolution color-filtered video (video-to-video super-resolution)
US20060291751A1 (en) * 2004-12-16 2006-12-28 Peyman Milanfar Robust reconstruction of high resolution grayscale images from a sequence of low-resolution frames (robust gray super-resolution)
JP4951440B2 (en) * 2007-08-10 2012-06-13 富士フイルム株式会社 Imaging apparatus and solid-state imaging device driving method
JP5076755B2 (en) * 2007-09-07 2012-11-21 ソニー株式会社 Image processing apparatus, image processing method, and computer program
KR101367199B1 (en) * 2007-09-07 2014-02-27 삼성전자주식회사 Image display device and method for revising display character thereof
WO2009087641A2 (en) * 2008-01-10 2009-07-16 Ramot At Tel-Aviv University Ltd. System and method for real-time super-resolution
JP2010183357A (en) * 2009-02-05 2010-08-19 Panasonic Corp Solid state imaging element, camera system, and method of driving solid state imaging element
US8995793B1 (en) * 2009-10-09 2015-03-31 Lockheed Martin Corporation Moving object super-resolution systems and methods
JP5764740B2 (en) * 2010-10-13 2015-08-19 パナソニックIpマネジメント株式会社 Imaging device
US9137503B2 (en) * 2010-11-03 2015-09-15 Sony Corporation Lens and color filter arrangement, super-resolution camera system and method
US8750647B2 (en) * 2011-02-03 2014-06-10 Massachusetts Institute Of Technology Kinetic super-resolution imaging
US20130235234A1 (en) * 2012-03-12 2013-09-12 Megan Lyn Cucci Digital camera having multiple image capture systems
CN107346061B (en) * 2012-08-21 2020-04-24 快图有限公司 System and method for parallax detection and correction in images captured using an array camera

Patent Citations (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7474337B1 (en) * 2000-10-24 2009-01-06 Sony Corporation Method and apparatus to provide edge enhancements as part of a demosaicing process
US7006686B2 (en) * 2001-07-18 2006-02-28 Hewlett-Packard Development Company, L.P. Image mosaic data reconstruction
US6978050B2 (en) * 2001-07-18 2005-12-20 Hewlett-Packard Development Company, L.P. Electronic image color plane reconstruction
US7072509B2 (en) * 2001-07-27 2006-07-04 Hewlett-Packard Development Company, L.P. Electronic image color plane reconstruction
US7082218B2 (en) * 2001-07-27 2006-07-25 Hewlett-Packard Development Company, L.P. Color correction of images
US7643676B2 (en) * 2004-03-15 2010-01-05 Microsoft Corp. System and method for adaptive interpolation of images from patterned sensors
US7525583B2 (en) * 2005-02-11 2009-04-28 Hewlett-Packard Development Company, L.P. Decreasing aliasing in electronic images
US8452090B1 (en) * 2005-04-25 2013-05-28 Apple Inc. Bayer reconstruction of images using a GPU
US8005297B2 (en) * 2006-01-18 2011-08-23 Qualcomm Incorporated Method and apparatus for adaptive and self-calibrated sensor green channel gain balancing
US20070257943A1 (en) * 2006-05-08 2007-11-08 Eastman Kodak Company Method for rendering color EL display and display device with improved resolution
US8422771B2 (en) * 2008-10-24 2013-04-16 Sharp Laboratories Of America, Inc. Methods and systems for demosaicing
US8571312B2 (en) * 2009-01-16 2013-10-29 Samsung Electronics Co., Ltd. Image interpolation method and apparatus using pattern characteristics of color filter array
US20100182466A1 (en) * 2009-01-16 2010-07-22 Samsung Digital Imaging Co., Ltd. Image interpolation method and apparatus using pattern characteristics of color filter array
US8605164B2 (en) * 2009-04-20 2013-12-10 Canon Kabushiki Kaisha Image processing apparatus, control method therefor, and storage medium
US8482636B2 (en) * 2010-05-05 2013-07-09 DigitalOptics Corporation Europe Limited Digital zoom on bayer
US20130077862A1 (en) * 2010-06-16 2013-03-28 Yoshikuni Nomura Image processing apparatus, image processing method and program
US8873847B2 (en) * 2010-09-06 2014-10-28 Commissariat A L'energie Atomique Et Aux Energies Alternatives Method of demosaicing a digital raw image, corresponding computer program and graphics or imager circuit
US8531563B2 (en) * 2011-02-28 2013-09-10 Fujifilm Corporation Color imaging apparatus
US8891866B2 (en) * 2011-08-31 2014-11-18 Sony Corporation Image processing apparatus, image processing method, and program
US20140049645A1 (en) * 2012-08-16 2014-02-20 Gentex Corporation Method and system for imaging an external scene by employing a custom image sensor

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10744585B2 (en) 2012-06-06 2020-08-18 Sodyo Ltd. Anchors for location-based navigation and augmented reality applications
US11600029B2 (en) 2012-06-06 2023-03-07 Sodyo Ltd. Display synchronization using colored anchors
US10635958B2 (en) 2015-01-28 2020-04-28 Sodyo Ltd. Hybrid visual tagging using customized colored tiles
US10356407B2 (en) * 2015-11-20 2019-07-16 Facebook Technologies, Llc Display-side video decompression using quantization tables
CN108665419A (en) * 2017-03-30 2018-10-16 Spreadtrum Communications (Shanghai) Co., Ltd. Method and device for image denoising
WO2020012316A1 (en) * 2018-07-11 2020-01-16 Sodyo Ltd. Detection of machine-readable tags with high resolution using mosaic image sensors
US11423273B2 (en) 2018-07-11 2022-08-23 Sodyo Ltd. Detection of machine-readable tags with high resolution using mosaic image sensors
US11645734B2 (en) * 2019-05-15 2023-05-09 Realtek Semiconductor Corp. Circuitry for image demosaicing and contrast enhancement and image-processing method
CN113781303A (en) * 2021-09-01 2021-12-10 Rockchip Electronics Co., Ltd. Image processing method, medium, processor and electronic device

Also Published As

Publication number Publication date
US9418398B2 (en) 2016-08-16
US20150363913A1 (en) 2015-12-17
US20150363944A1 (en) 2015-12-17
US20150363912A1 (en) 2015-12-17
KR20150142600A (en) 2015-12-22
KR20150142601A (en) 2015-12-22
US20150363922A1 (en) 2015-12-17
KR20150142599A (en) 2015-12-22

Similar Documents

Publication Publication Date Title
US20150363916A1 (en) Low power demosaic with intergrated chromatic aliasing repair
US9582853B1 (en) Method and system of demosaicing bayer-type image data for image processing
US9794529B2 (en) Method for acquiring image and electronic device thereof
JP5867629B2 (en) System and method for image processing
US9232199B2 (en) Method, apparatus and computer program product for capturing video content
US9363493B2 (en) Image processing apparatus, method, recording medium and image pickup apparatus
CN105408936A (en) System and method of correcting image artifacts
CN108391060B (en) Image processing method, image processing device and terminal
US11194536B2 (en) Image processing method and apparatus for displaying an image between two display screens
US20180159971A1 (en) Method and apparatus for generating unlocking interface, and electronic device
WO2022111730A1 (en) Image processing method and apparatus, and electronic device
US9361537B2 (en) Techniques to reduce color artifacts in a digital image
US11010879B2 (en) Video image processing method and apparatus thereof, display device, computer readable storage medium and computer program product
US20160284053A1 (en) Edge sensing measure for raw image processing
WO2022121893A1 (en) Image processing method and apparatus, and computer device and storage medium
TWI717317B (en) Method and device for taking screenshot of display image of display device
US20140240534A1 (en) Method for image processing and an electronic device thereof
US20210342989A1 (en) Hue preservation post processing with early exit for highlight recovery
CN112437237B (en) Shooting method and device
US20140307116A1 (en) Method and system for managing video recording and/or picture taking in a restricted environment
JP5865517B2 (en) Image display method and apparatus
US10321321B2 (en) Method and device for displaying locked interface and mobile terminal
US8929652B2 (en) Method and apparatus for processing image
WO2017107605A1 (en) Image detail processing method, device, terminal and storage medium
Liu et al. Research and implementation of color image processing pipeline based on FPGA

Legal Events

Date Code Title Description
AS Assignment

Owner name: SAMSUNG ELECTRONICS CO., LTD., KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BOTZAS, ANTHONY;REEL/FRAME:033100/0118

Effective date: 20140612

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION