US20110317048A1 - Image sensor with dual layer photodiode structure - Google Patents

Image sensor with dual layer photodiode structure

Info

Publication number
US20110317048A1
US20110317048A1 (application US12/826,313)
Authority
US
United States
Prior art keywords
photodiodes
pixel
pixels
image sensor
layer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/826,313
Inventor
Yingjun Bai
Qun Sun
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Semiconductor Components Industries LLC
Original Assignee
Aptina Imaging Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Aptina Imaging Corp
Priority to US12/826,313
Assigned to APTINA IMAGING CORPORATION. Assignors: BAI, YINGJUN; SUN, QUN
Publication of US20110317048A1
Assigned to SEMICONDUCTOR COMPONENTS INDUSTRIES, LLC. Assignor: APTINA IMAGING CORPORATION
Priority to US14/793,480 (US10032810B2)

Classifications

    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144 Devices controlled by radiation
    • H01L27/146 Imager structures
    • H01L27/14601 Structural or functional details thereof
    • H01L27/14609 Pixel-elements with integrated switching, control, storage or amplification elements
    • H01L27/1461 Pixel-elements with integrated switching, control, storage or amplification elements characterised by the photosensitive area
    • H01L27/14603 Special geometry or disposition of pixel-elements, address-lines or gate-electrodes
    • H01L27/14605 Structural or functional details relating to the position of the pixel elements, e.g. smaller pixel elements in the center of the imager compared to pixel elements at the periphery
    • H01L27/14607 Geometry of the photosensitive area
    • H01L27/1462 Coatings
    • H01L27/14621 Colour filter arrangements
    • H01L27/14643 Photodiode arrays; MOS imagers
    • H01L27/14645 Colour imagers
    • H01L27/14647 Multicolour imagers having a stacked pixel-element structure, e.g. npn, npnpn or MQW elements
    • H01L27/14649 Infrared imagers
    • H01L27/14652 Multispectral infrared imagers, having a stacked pixel-element structure, e.g. npn, npnpn or MQW structures
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H04N23/11 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from visible and infrared light wavelengths
    • H04N23/80 Camera processing pipelines; Components thereof
    • H04N23/84 Camera processing pipelines; Components thereof for processing colour signals
    • H04N23/843 Demosaicing, e.g. interpolating colour pixel values
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10 Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11 Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N25/134 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on three different wavelength filter elements
    • H04N5/00 Details of television systems
    • H04N5/30 Transforming light or analogous information into electric information
    • H04N5/33 Transforming infrared radiation

Definitions

  • Image sensors are used in many different types of electronic devices to capture an image.
  • For example, modern cameras (e.g., video cameras and digital cameras) and other image capturing devices use image sensors to capture an image.
  • Image sensors typically have color processing capabilities.
  • an image sensor can include a color filter array (“CFA”) that can separate various colors from a color image. The resulting output from the image sensor can then be interpolated to form a full color image.
  • FIG. 1 is a schematic view of an illustrative electronic device configured in accordance with embodiments of the invention.
  • FIG. 2A is a cross-sectional view of a first illustrative image sensor in accordance with embodiments of the invention.
  • FIG. 2B is a cross-sectional view of a second illustrative image sensor in accordance with embodiments of the invention.
  • FIG. 3 is a cross-sectional view of a third illustrative image sensor in accordance with embodiments of the invention.
  • FIG. 4 is a cross-sectional view of a fourth illustrative image sensor in accordance with embodiments of the invention.
  • FIG. 5 is a representation of a typical color filter array (“CFA”) in a Bayer pattern.
  • FIGS. 6A-6L are representations of pixel arrangements in accordance with embodiments of the invention.
  • FIG. 7 is a flowchart of an illustrative process for binning photodiodes in accordance with embodiments of the invention.
  • FIG. 1 is a schematic view of an illustrative electronic device configured in accordance with embodiments of the invention.
  • Electronic device 100 can be any type of user device that utilizes an image sensor (embodied here as image sensor 110 ) and is controlled generally by control circuitry 120 .
  • electronic device 100 can include a camera, such as a computer camera, still camera, or portable video camera.
  • Electronic device 100 can also include any other components in a typical camera (or otherwise), which are not depicted in FIG. 1 to avoid any distractions from embodiments of the invention.
  • Image sensor 110 can capture image data corresponding to a streaming image.
  • image sensor 110 can include any combination of lenses and arrays of cells (e.g., charge-coupled devices (“CCDs”) or complementary metal oxide semiconductor (“CMOS”) sensor cells) for capturing light.
  • Control circuitry 120 may process data generated by image sensor 110 , and may perform any suitable operations based on this data. For example, control circuitry 120 can obtain multiple color pixels (e.g., red, green, and/or blue pixels) generated by image sensor 110 . Upon obtaining the color pixels, control circuitry 120 can optionally bin the color pixels in one or more dimensions to form one or more pixel groups (e.g., one or more red, green, and/or blue pixel groups). After binning the multiple color pixels, control circuitry 120 can interpolate the one or more pixel groups to output a pixel image.
  • a “pixel” can refer to a photodiode and/or a filter that is capable of responding to one or more wavelengths of light (e.g., responding to one or more colors).
  • a “color pixel” can correspond to a photodiode or a photodiode and filter combination capable of absorbing green, blue, or red wavelengths of light.
  • a “clear pixel” can correspond to a photodiode or a photodiode and filter combination capable of absorbing all visible light or both visible light and near infrared signals.
  • an “infrared pixel” can correspond to a photodiode or a photodiode and filter combination capable of absorbing near infrared and infrared signals.
  • Image sensor 110 and control circuitry 120 may be implemented using any suitable combination of hardware and software.
  • image sensor 110 can be implemented substantially all in hardware (e.g., as a system-on-a-chip (“SoC”)). This way, image sensor 110 can have a small design that minimizes the area occupied on electronic device 100 .
  • image sensor 110 may have circuit components designed to maximize the speed of operation.
  • Control circuitry 120 may include, for example, one or more processors, microprocessors, ASICS, FPGAs, or any suitable combination of hardware and software.
  • FIGS. 2-4 show illustrative image sensors in accordance with embodiments of the invention. It will be understood that, for the sake of simplicity, not all of the layers of an image sensor are shown in FIGS. 2-4. For example, there may be metal interconnect layers formed between the layers shown and additional dielectric layers for insulation purposes. It will also be understood that image sensors of FIGS. 2-4 can include additional photodiodes not shown in FIGS. 2-4. It will further be understood that FIGS. 2-4 are not drawn to scale.
  • Image sensor 200 can be the same as or substantially similar to image sensor 110 of FIG. 1 .
  • Image sensor 200 can include substrate 202 that can incorporate multiple photodiodes 204 (e.g., photodiodes 205 - 210 ).
  • image sensor 200 can include a color filter array (“CFA”) (not shown in FIG. 2A ).
  • Photodiodes 204 can convert light into an electrical light signal that can then be processed by control circuitry (e.g., control circuitry 120 of FIG. 1 ) of an electronic device (e.g., electronic device 100 of FIG. 1 ).
  • the level of intensity of light striking photodiodes 204 can affect the magnitude of the electronic signal that can be read by the control circuitry. For example, a higher intensity of light striking photodiodes 204 can generate an increase in the amount of charge collected by photodiodes 204 .
  • photodiodes 204 can have a dual layer photodiode pixel structure.
  • photodiodes 204 can include a first layer (e.g., a bottom layer) of photodiodes such as, for example, photodiodes 205 , 207 , and 209 .
  • photodiodes 204 can include a second layer (e.g., a top layer) of photodiodes such as, for example, photodiodes 206 , 208 , and 210 .
  • the second layer of photodiodes may not be disposed uniformly on top of all photodiodes in a first layer of photodiodes.
  • FIG. 2B shows a cross-sectional view of image sensor 230 , which can include photodiodes 240 .
  • Image sensor 230 can be the same as or substantially similar to image sensor 110 of FIG. 1 and image sensor 200 of FIG. 2A .
  • photodiodes 240 may include only one thick integrated layer of photodiodes. For instance, as shown in FIG. 2B, photodiodes 242 and 244 can have a single layer photodiode pixel structure. However, at other photodiode position(s) in image sensor 230, a second layer of photodiodes may be present, which may be positioned adjacent to photodiodes 242 and 244.
  • For instance, as shown in FIG. 2B, a second layer of photodiodes (e.g., photodiode 246) can be disposed on top of a first layer of photodiodes (e.g., photodiode 248).
  • the thickness of each of photodiodes 242 and 244 may be the same as or substantially similar to the combined thickness of photodiodes 246 and 248 .
  • an image sensor (e.g., image sensor 200 of FIG. 2A and/or image sensor 230 of FIG. 2B ) can include insulation layer 212 , which can be disposed between the first layer of the photodiodes and the second layer of the photodiodes.
  • Insulation layer 212, which can provide insulation between the two layers, can be formed using any suitable dielectric material such as, for example, paper, plastic, glass, rubber-like polymers, and/or any other suitable material.
  • One or more photodiodes 204 (e.g., a dual layer pair of photodiodes such as photodiodes 205 and 206, photodiodes 207 and 208, or photodiodes 209 and 210) or photodiodes 240 (e.g., a dual layer pair of photodiodes such as photodiodes 246 and 248) can correspond to color pixels, which can respond to different wavelengths of light based on the depths that each wavelength of light can reach inside the silicon.
  • each photodiode of photodiodes 204 and/or photodiodes 240 may have a different spectral sensitivity curve.
  • the wavelengths of light from longest to shortest can be ranked as follows: red light, green light, and blue light. Physically, longer wavelengths of light can reach greater depth within the silicon.
  • the usual arrangement of photodiodes 204 can include photodiodes in one or more higher layers (e.g., a top layer) that can respond to shorter wavelengths of light (e.g., blue light).
  • Photodiodes 204 can also include photodiodes in one or more lower layers (e.g., a bottom layer) that can respond to longer wavelengths of light (e.g., red light).
  • the first layer of photodiodes (e.g., photodiodes 205 , 207 , and 209 of FIG. 2A and photodiode 246 of FIG. 2B ) can absorb red and/or green wavelengths of light
  • the second layer of photodiodes (e.g., photodiodes 206 , 208 , and 210 of FIG. 2A and photodiode 248 of FIG. 2B ) can absorb blue and/or green wavelengths of light.
  • photodiodes 205 , 207 , and 209 of FIG. 2A and photodiode 246 of FIG. 2B can correspond to red and/or green pixels
  • photodiodes 206 , 208 , and 210 of FIG. 2A and photodiode 248 of FIG. 2B can correspond to blue and/or green pixels.
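  • The layer assignment above follows from depth-dependent absorption, which can be illustrated with a simple Beer-Lambert style sketch. The absorption lengths below are rough, order-of-magnitude values for silicon, and the 2 µm / 3 µm layer split is a hypothetical geometry chosen for the example, not dimensions taken from this disclosure.

```python
import math

# Rough 1/e absorption lengths in silicon (micrometers); illustrative values only.
ABSORPTION_LENGTH_UM = {"blue (450 nm)": 0.6, "green (550 nm)": 1.5, "red (650 nm)": 3.5}

def layer_absorption(absorption_length_um, top_um=2.0, bottom_um=3.0):
    """Fraction of incident photons absorbed in a shallow (top) layer and in a
    deeper (bottom) layer, assuming simple exponential attenuation with depth."""
    reach_boundary = math.exp(-top_um / absorption_length_um)
    reach_bottom = math.exp(-(top_um + bottom_um) / absorption_length_um)
    absorbed_top = 1.0 - reach_boundary              # absorbed before the layer boundary
    absorbed_bottom = reach_boundary - reach_bottom  # absorbed between boundary and stack bottom
    return absorbed_top, absorbed_bottom

for name, length in ABSORPTION_LENGTH_UM.items():
    top, bottom = layer_absorption(length)
    print(f"{name}: top {top:.2f}, bottom {bottom:.2f}")
```

  • With these illustrative numbers the top layer captures nearly all of the blue light, while an appreciable share of the green and red light survives to the bottom layer, matching the red/green (bottom) and blue/green (top) split described above.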
  • one or more photodiodes 204 and/or photodiodes 240 can include one or more clear pixels.
  • the one or more clear pixels can absorb a combination of various wavelengths of light (e.g., red, green, and blue wavelengths of light).
  • In some cases, a single layer of photodiodes (e.g., a top or bottom layer of photodiodes) can include a combination of clear pixels and color pixels.
  • Because clear pixels can be more sensitive to light as compared to color pixels, clear pixels can improve the imaging performance of image sensor 200 under low light conditions. In addition, clear pixels can enable high speed imaging under normal lighting conditions.
  • one or more photodiodes 204 and/or photodiodes 240 can include one or more luminance pixels (e.g., pixels with brightness information corresponding to a color image).
  • Luminance pixels can have a spectral response that is wider than the spectral response of green pixels, but the spectral response of the luminance pixels can be narrower than a full response.
  • one or more photodiodes 204 and/or photodiodes 240 can include one or more infrared pixels (e.g., pixels capable of absorbing near infrared and infrared signals).
  • image sensor 200 can include an infrared (“IR”) cutoff filter 220 , which can be positioned over a CFA (not shown in FIG. 2A or FIG. 2B ) or photodiodes (e.g., photodiodes 204 of FIG. 2A and/or photodiodes 240 of FIG. 2B ) to ensure correct color imaging.
  • IR cutoff filter 220 can limit the effects of IR light on the responses of photodiodes 204 . For example, IR cutoff filter 220 can block undesirable IR light from reaching the photodiodes.
  • lens 222 which can be positioned over IR cutoff filter 220 , can focus light on the photodiodes. For example, light passing through lens 222 can pass through IR cutoff filter 220 and fall on the photodiodes.
  • Lens 222 can include any suitable lens such as, for example, a single lens or multiple micro-lenses. Thus, in one configuration, each micro-lens can be formed over one or more corresponding photodiodes.
  • IR cutoff filter 220 can cover the one or more infrared pixels, and can thereby behave as a dual band IR cutoff filter.
  • IR cutoff filter 220 can pass both visible light and certain portions of the IR light by creating an extra window in the IR band.
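  • As a rough illustration of that dual band behavior, the sketch below models an idealized transmission curve that passes the visible band plus one extra window in the IR band. The specific band edges are assumptions made for the example, not values specified here.

```python
def dual_band_ir_cutoff(wavelength_nm,
                        visible_band=(400.0, 650.0),
                        ir_window=(800.0, 900.0)):
    """Idealized dual band IR cutoff filter: transmit visible light plus one extra
    window in the IR band, and block everything else. Band edges are hypothetical."""
    in_visible = visible_band[0] <= wavelength_nm <= visible_band[1]
    in_ir_window = ir_window[0] <= wavelength_nm <= ir_window[1]
    return 1.0 if (in_visible or in_ir_window) else 0.0

# Visible green passes, 750 nm is blocked, 850 nm passes through the extra IR window.
print([dual_band_ir_cutoff(w) for w in (550, 750, 850, 1000)])  # [1.0, 0.0, 1.0, 0.0]
```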
  • Image sensor 300 can be the same as or substantially similar to image sensor 110 of FIG. 1 .
  • Image sensor 300 can include substrate 302 , photodiodes 304 (e.g., photodiodes 305 - 310 ), insulation layer 312 , color filter array (“CFA”) 314 , IR cutoff filter 316 , and lens 318 .
  • CFA 314 can be positioned over photodiodes 304
  • IR cutoff filter 316 can be positioned over CFA 314 .
  • photodiodes 304 can include one or more single layer photodiode pixel structures, one or more dual layer photodiode pixel structures, and/or any combination thereof. Persons skilled in the art will also appreciate that photodiodes 304 can include any suitable number of layers in a particular photodiode pixel structure.
  • CFA 314 can include one or more filter elements that may correspond to the desired color responses required for a color system.
  • CFA 314 can include magenta filter element 320 , which can be positioned over one or more photodiodes 304 (e.g., a dual layer pair of photodiodes such as photodiodes 307 and 308 ).
  • photodiodes 307 and 308 can correspond to red and blue pixels, respectively.
  • In order to produce a desired color reproduction quality in photodiodes 304, CFA 314 can include magenta filter element 320 if photodiodes 304 are unable to achieve a desired quantum efficiency response for red and blue pixels.
  • CFA 314 can include green filter element 322 , which can be positioned over one or more photodiodes 304 (e.g., a dual layer pair of photodiodes such as photodiodes 305 and 306 ). In some cases, photodiodes 305 and 306 can correspond to green pixels.
  • CFA 314 can include one or more blue, red, yellow, and/or cyan filter elements.
  • one or more filter elements (e.g., filter elements 320 and 322) of CFA 314 can separate out a particular spectral response of light (e.g., a combination of red and blue spectral responses or green spectral responses) by, for example, blocking passage of other spectra of light.
  • light passing through lens 318 can pass through IR cutoff filter 316 and filter elements 320 and 322 .
  • the filtered light can then fall on a dual layer pair of photodiodes such as photodiodes 307 and 308 or photodiodes 305 and 306 .
  • clear pixels of photodiodes 304 can be those pixels that are associated with no CFA coating on CFA 314 .
  • Image sensor 400 can be the same as or substantially similar to image sensor 110 of FIG. 1 .
  • Image sensor 400 can include substrate 402 , photodiodes 404 (e.g., photodiodes 405 - 410 ), insulation layer 412 , CFA 414 , IR cutoff filter 416 , and lens 418 .
  • photodiodes 404 can include one or more single layer photodiode pixel structures, one or more dual layer photodiode pixel structures, and/or any combination thereof.
  • the imaging capabilities of image sensor 400 can vary depending on the types of pixels that correspond to one or more of photodiodes 404 .
  • photodiodes 405 - 408 can correspond to color pixels (e.g., green, red, and/or blue pixels), and photodiodes 409 and 410 can correspond to clear pixels.
  • IR cutoff filter 416 can cover only a subset of photodiodes 404 corresponding to color pixels (e.g., photodiodes 405 - 408 ). The remaining photodiodes 404 that are uncovered can correspond to clear pixels (e.g., photodiodes 409 and 410 ) and can bring additional IR sensing capabilities and resolution to image sensor 400 .
  • photodiodes 409 and 410 can sense an infrared signal that is beyond the visible light wavelength, which can thus be used to enable night vision application for image sensor 400 . Furthermore, because the color pixels can still be filtered by IR cutoff filter 416 , image sensor 400 can continue to render high-quality color images.
  • photodiodes 405 - 408 can correspond to color pixels
  • photodiodes 409 and 410 can correspond to luminance pixels.
  • If IR cutoff filter 416 only covers photodiodes 405-408, photodiodes 409 and 410 can remain uncovered.
  • the luminance pixels can also provide additional IR sensing capabilities for image sensor 400 .
  • the resulting image sensor configuration may be less sensitive to light because luminance pixels may have a lower sensitivity to light as compared to clear pixels.
  • a typical image sensor can use a CFA (e.g., CFA 314 of FIG. 3 or CFA 414 of FIG. 4 ) to generate a Bayer pattern from a color image.
  • FIG. 5 shows a representation of CFA 500 in a Bayer pattern in accordance with embodiments of the invention.
  • CFA 500 can have a repeating pattern of alternating rows of green and red filters (e.g., row 502) with rows of blue and green filters (e.g., row 504).
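  • A minimal sketch of that repeating pattern (green/red rows alternating with blue/green rows, as in rows 502 and 504) is shown below for reference.

```python
import numpy as np

def bayer_cfa(rows, cols):
    """Build a Bayer pattern like CFA 500: green/red rows alternating with blue/green rows."""
    cfa = np.empty((rows, cols), dtype="<U1")
    cfa[0::2, 0::2] = "G"   # even rows: G R G R ...
    cfa[0::2, 1::2] = "R"
    cfa[1::2, 0::2] = "B"   # odd rows:  B G B G ...
    cfa[1::2, 1::2] = "G"
    return cfa

print(bayer_cfa(4, 4))
```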
  • FIGS. 6A-6L show illustrative representations of pixel arrangements in accordance with embodiments of the invention.
  • These pixel arrangements can represent arrangements of photodiodes with a dual layer photodiode pixel structure.
  • These photodiodes can be the same as or similar to photodiodes 204 of FIG. 2A , photodiodes 240 of FIG. 2B , photodiodes 304 of FIG. 3 , and/or photodiodes 404 of FIG. 4 .
  • At least one pixel in a second layer (e.g., top layer) of the photodiodes can have a different color than a pixel at the same position in a first layer (e.g., bottom layer) of the photodiodes.
  • “R/B” pixels can correspond to dual layer photodiode pixels with red and blue pixels stacked together at the same positions (e.g., a red pixel at a particular position in a bottom layer of the photodiodes and a blue pixel stacked on top of the red pixel at the same position in a top layer of the photodiodes).
  • “C” pixels can correspond to dual layer photodiode pixels with two clear pixels stacked together at the same position (e.g., clear pixels at the same positions in both a top layer and a bottom layer of the photodiodes).
  • “IR” pixels can correspond to dual layer photodiode pixels with two infrared pixels stacked together at the same position (e.g., infrared pixels at the same positions in both a top layer and a bottom layer of the photodiodes).
  • “G” pixels can correspond to dual layer photodiode pixels with two green pixels stacked together at the same position (e.g., green pixels at the same positions in both a top layer and a bottom layer of the photodiodes).
  • “G” pixels can correspond to only one integrated layer of green pixels. For instance, a top layer of photodiodes can be combined with a bottom layer of photodiodes without any insulation in between the two layers (e.g., photodiodes 242 and 244 of FIG. 2B ).
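  • For concreteness, the sketch below encodes this notation as a small lookup table mapping each symbol to its (bottom layer, top layer) pair; the color names and the 2x2 example grid are illustrative choices made for this sketch, not part of the disclosure.

```python
# Each symbol maps to a (bottom layer, top layer) pair following the notation above.
# A pixel realized as one integrated layer is modeled as the same value in both slots.
DUAL_LAYER_PIXELS = {
    "R/B": ("red", "blue"),     # red below, blue stacked on top
    "C":   ("clear", "clear"),  # clear pixels in both layers
    "IR":  ("ir", "ir"),        # infrared pixels in both layers
    "G":   ("green", "green"),  # green in both layers (or one integrated layer)
}

def split_layers(arrangement):
    """Split a 2-D grid of symbols into its bottom layer and top layer colors."""
    bottom = [[DUAL_LAYER_PIXELS[s][0] for s in row] for row in arrangement]
    top = [[DUAL_LAYER_PIXELS[s][1] for s in row] for row in arrangement]
    return bottom, top

# A small checkerboard ("adamantine") grid of G and R/B pixels in the spirit of FIG. 6A.
bottom, top = split_layers([["G", "R/B"], ["R/B", "G"]])
print(bottom)  # [['green', 'red'], ['red', 'green']]
print(top)     # [['green', 'blue'], ['blue', 'green']]
```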
  • an image sensor (e.g., image sensor 110 of FIG. 1, image sensor 200 of FIG. 2A, image sensor 230 of FIG. 2B, image sensor 300 of FIG. 3, and/or image sensor 400 of FIG. 4) can include any suitable pixel arrangement.
  • one or more clear pixels in a pixel arrangement can be replaced with one or more green pixels, one or more luminance pixels (e.g., luminance pixels at the same positions in both a top layer and a bottom layer of the photodiodes), one or more infrared pixels (e.g., infrared pixels at the same positions in both a top layer and a bottom layer of the photodiodes), one or more cyan pixels (e.g., a green pixel at a particular position in a bottom layer of the photodiodes and a blue pixel stacked on top of the green pixel at the same position in a top layer of the photodiodes), one or more yellow pixels (e.g., a red pixel at a particular position in a bottom layer of the photodiodes and a green pixel stacked on top of the red pixel at the same position in a top layer of the photodiodes), any other suitable pixels, and/or any combination thereof.
  • one or more green pixels in a pixel arrangement can be replaced with one or more clear pixels, one or more luminance pixels, one or more infrared pixels, one or more cyan pixels, one or more yellow pixels, any other suitable pixels, and/or any combination thereof.
  • one or more pixels in these pixel arrangements can be replaced with stack IR pixels.
  • Stack IR pixels can correspond to dual layer photodiode pixels with an infrared pixel and either a color, clear, or luminance pixel stacked together at the same position (e.g., an infrared pixel at a particular position in a bottom layer of the photodiodes and either a color, clear, or luminance pixel stacked on top of the infrared pixel at the same position in a top layer of the photodiodes).
  • pixel arrangements 600 and 602 show two illustrative pixel arrangements for G and R/B pixels.
  • pixel arrangement 600 of FIG. 6A can provide an adamantine (e.g., a checkerboard) pattern of pixels.
  • pixel arrangement 602 of FIG. 6B can provide a rectilinear pattern of pixels.
  • at least one line (e.g., every row and column) of the top and bottom layers of pixel arrangement 602 can have pixels of the same color.
  • Pixel arrangement 600 can provide for efficient binning (e.g., downsampling) of the one or more pixels.
  • the binning can be executed by control circuitry of an image system (e.g., control circuitry 120 of FIG. 1 ).
  • control circuitry can separately bin one or more pixel clusters of pixel arrangement 600 .
  • control circuitry can bin a set of pixels (e.g., G pixels 605 and 606 ) along a first direction in order to form a first pixel group.
  • the first direction can, for example, be along a diagonal direction of −45°.
  • the control circuitry can bin another set of pixels (e.g., R/B pixels 607 and 608 ) along a second direction (e.g., a diagonal direction of 45°) to form a second pixel group.
  • the control circuitry can bin pixel arrangement 600 without generating as many artifacts as would be generated from binning pixels arranged in a Bayer pattern (e.g., a Bayer pattern as shown in FIG. 5 ).
  • the control circuitry can bin a set of pixels (e.g., G pixels 611 and 612 ) along a first direction (e.g., a horizontal direction) to form a first pixel group.
  • the control circuitry can bin another set of pixels (e.g., R/B pixels 613 and 614 ) along a second direction (e.g., a vertical direction) to form a second pixel group.
  • the control circuitry can combine the first pixel group and the second pixel group to form a binned pixel cluster. Due to the dual layer photodiode pixel structure of pixel arrangement 600 of FIG. 6A and pixel arrangement 602 of FIG. 6B, the center pixel of both binned pixel clusters will have a red-green-blue (“RGB”) triplet without requiring any demosaic processing.
  • the control circuitry can interpolate the binned pixel cluster to output a pixel image after binning. For example, the control circuitry can output a 2× pixel image with a rectilinear pattern.
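  • The following sketch walks one 2x2 cluster of a checkerboard G / R/B arrangement such as pixel arrangement 600 through that binning: the G pixels are averaged along one diagonal, the dual layer R/B pixels along the other, and the binned result already carries a full RGB triplet. The sample readout values and the assignment of positions to the diagonals are assumptions of the example.

```python
# One 2x2 cluster: G pixels on one diagonal, dual layer R/B pixels on the other.
# Each dict holds the colors that position actually measures; values are made up.
cluster = [
    [{"g": 10.0},            {"r": 30.0, "b": 40.0}],
    [{"r": 32.0, "b": 38.0}, {"g": 12.0}],
]

def bin_cluster(cluster):
    """Average the G pixels along one diagonal and the dual layer R/B pixels along
    the other; the binned cluster centre then carries a full RGB triplet with no
    demosaic step."""
    g = (cluster[0][0]["g"] + cluster[1][1]["g"]) / 2.0
    r = (cluster[0][1]["r"] + cluster[1][0]["r"]) / 2.0
    b = (cluster[0][1]["b"] + cluster[1][0]["b"]) / 2.0
    return {"r": r, "g": g, "b": b}

print(bin_cluster(cluster))  # {'r': 31.0, 'g': 11.0, 'b': 39.0}
```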
  • pixel arrangements 620 and 622 show two illustrative pixel arrangements for C, G, and R/B pixels.
  • pixel arrangement 620 of FIG. 6C can provide an adamantine pattern of pixels.
  • pixel arrangement 622 of FIG. 6D can provide a rectilinear pattern of pixels.
  • at least one line (e.g., rows 626 - 630 ) of the top and bottom layers of pixel arrangement 622 can have pixels of the same color.
  • an image sensor can obtain at least as much luminance resolution as with a regular Bayer pattern of a CFA (e.g., CFA 500 of FIG. 5 ).
  • pixel arrangement 620 can have the same number of green pixels as a pixel cluster (e.g., pixel cluster 510 of FIG. 5) of a Bayer pattern. For instance, for both of the configurations shown in FIGS. 5 and 6C, there are 2 green pixels for each 2×2 pixel cluster.
  • an image sensor employing pixel arrangement 620 has the additional advantage of higher sensitivity to light as compared to a regular Bayer pattern. Furthermore, because pixel arrangement 620 can provide the same amount of red and blue pixels as compared to a regular Bayer pattern, the additional sensitivity is gained without sacrificing red and blue color resolution. In other embodiments, by using pixel arrangement 622 of FIG. 6D , an image sensor can completely avoid green imbalance issues because the G pixels can be arranged along one or more lines of pixel arrangement 622 (e.g., along rows 626 - 630 ).
  • pixel arrangements 631 and 632 show two additional illustrative pixel arrangements for C, G, and R/B pixels.
  • pixel arrangement 631 of FIG. 6E can provide an adamantine pattern of pixels.
  • pixel arrangement 632 of FIG. 6F can provide a rectilinear pattern of pixels.
  • at least one line (e.g., alternate rows and columns) of the top and bottom layers of pixel arrangement 632 can have pixels of the same color.
  • pixel arrangements 631 and 632 have a greater number of C pixels, thereby providing additional sensitivity to light. Furthermore, because pixel arrangement 632 provides C pixels along one or more lines (e.g., rows 633-637), pixel arrangement 632 can allow for separate exposures of C pixels and R/B and G pixels if per line control can be achieved. By separately exposing the C pixels and the R/B and G pixels, pixel arrangement 632 can allow an image sensor to simultaneously avoid saturation of the C pixels and improve the dynamic range of image capture.
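  • One plausible way to use such separate exposures is sketched below: the C lines get a short exposure so they do not saturate, the color lines get a longer exposure, and a clipped long-exposure value falls back to the rescaled clear value. The exposure ratio, full-well level, and merge rule are assumptions of this sketch, not details from the disclosure.

```python
def merge_dual_exposure(clear_short, color_long, exposure_ratio=4.0, full_well=1023.0):
    """Merge a short-exposure clear value with a long-exposure color value from the
    same neighborhood into one estimate. exposure_ratio and full_well are illustrative."""
    if color_long < full_well:           # long exposure not clipped: use it directly
        return color_long
    # Long exposure clipped: fall back to the clear pixel, rescaled to the long exposure.
    return clear_short * exposure_ratio

print(merge_dual_exposure(clear_short=200.0, color_long=512.0))   # 512.0
print(merge_dual_exposure(clear_short=300.0, color_long=1023.0))  # 1200.0
```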
  • pixel arrangements 640 and 642 show two illustrative pixel arrangements for C and R/B pixels.
  • pixel arrangement 640 of FIG. 6G can provide an adamantine pattern of pixels.
  • pixel arrangement 642 of FIG. 6H can provide a rectilinear pattern of pixels.
  • at least one line (e.g., alternate rows and columns) of the top and bottom layers of pixel arrangement 642 can have pixels of the same color.
  • pixel arrangements 640 and 642 can provide even more sensitivity to light as compared to pixel arrangements 631 of FIGS. 6E and 632 of FIG. 6F .
  • pixel arrangements 640 and 642 can be used for low light conditions because of lower color resolution requirements.
  • pixel arrangements 640 and 642 can provide C pixels along one or more lines (e.g., one or more rows 643 - 651 and one or more columns 652 - 655 ), which can allow for separate exposures of C pixels and R/B pixels if per line control can be achieved.
  • pixel arrangements 640 and 642 can allow an image sensor to simultaneously avoid saturation of the C pixels and improve the dynamic range of image capture. This way, the image sensor can capture additional information from images than would otherwise be possible.
  • G pixel values can be derived for pixel arrangements 640 and 642 using any suitable approach. For example, because clear pixels are a combination of green, red, and blue pixels, control circuitry (e.g., control circuitry 120 ) can obtain G pixel values by interpolating neighboring R/B pixel values and C pixel values jointly.
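  • A minimal sketch of that joint interpolation is shown below: it treats the clear response as approximately R + G + B and subtracts red and blue values interpolated from neighboring R/B pixels. The equal weighting of the three components is an assumption of the example, not something specified by the disclosure.

```python
def estimate_green(clear_value, neighbor_reds, neighbor_blues):
    """Estimate G at a clear pixel site: treat the clear response as roughly R + G + B
    and subtract red and blue interpolated from neighboring R/B pixels."""
    r = sum(neighbor_reds) / len(neighbor_reds)    # interpolated red at the clear site
    b = sum(neighbor_blues) / len(neighbor_blues)  # interpolated blue at the clear site
    return max(clear_value - r - b, 0.0)

# A clear pixel reading 90 surrounded by R/B pixels reading about 30 (red) and 35 (blue).
print(estimate_green(90.0, neighbor_reds=[28.0, 32.0], neighbor_blues=[34.0, 36.0]))  # 25.0
```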
  • pixel arrangements 660 and 662 show two illustrative pixel arrangements for G and R/B pixels.
  • pixel arrangement 660 of FIG. 6I can provide a non-rotated pattern of pixels.
  • pixel arrangement 662 of FIG. 6J can provide a rotated pattern of pixels.
  • Pixel arrangements 660 and 662 can include one or more G pixels that may have replaced one or more C pixels of pixel arrangements 620 ( FIG. 6C ), 622 ( FIG. 6D ), 631 ( FIG. 6E ), 632 ( FIG. 6F ), 640 ( FIG. 6G ), and 642 ( FIG. 6H ).
  • the number of G pixels in pixel arrangements 660 and 662 can exceed the number of R/B pixels.
  • Pixel arrangements can have any suitable pre-determined orientations (e.g., 10°, 20°, 30°, etc.).
  • pixel arrangement 660 can have a pre-determined orientation of 0°.
  • pixel arrangement 662 can have a pre-determined orientation of 45°.
  • the sampling frequency of pixel arrangement 662 along the vertical and horizontal directions may be different from the sampling frequency of pixel arrangement 660 along the vertical, horizontal, and 45° directions.
  • Pixel arrangements 670 and 672 show two illustrative pixel arrangements for IR, G, and R/B pixels.
  • Pixel arrangements 670 ( FIG. 6K) and 672 ( FIG. 6L ) can be substantially similar to pixel arrangements 620 ( FIG. 6C) and 622 ( FIG. 6D ), respectively.
  • the C pixels of pixel arrangements 620 and 622 can be replaced with IR pixels in pixel arrangements 670 and 672 .
  • Pixel arrangement 670 of FIG. 6K can provide an adamantine pattern of pixels.
  • pixel arrangement 672 of FIG. 6L can provide a rectilinear pattern of pixels.
  • at least one line (e.g., rows 673 - 677 ) of the top and bottom layers of pixel arrangement 672 can have pixels of the same color.
  • an image sensor can obtain at least as much luminance resolution as with a regular Bayer pattern of a CFA (e.g., CFA 500 of FIG. 5 ).
  • pixel arrangement 670 can have the same number of green pixels as a pixel cluster (e.g., pixel cluster 510 of FIG. 5) of a Bayer pattern. For instance, for both of the configurations shown in FIGS. 5 and 6K, there are 2 green pixels for each 2×2 pixel cluster.
  • an image sensor employing pixel arrangement 670 can be capable of absorbing near infrared signals and infrared signals. Furthermore, because pixel arrangement 670 can provide the same amount of red and blue pixels as compared to a regular Bayer pattern, this additional capability is gained without sacrificing red and blue color resolution. In other embodiments, by using pixel arrangement 672 of FIG. 6L , an image sensor can completely avoid green imbalance issues because the G pixels can be arranged along one or more lines of pixel arrangement 672 (e.g., along rows 673 - 677 ).
  • FIG. 7 is a flowchart of process 700 for binning photodiodes.
  • process 700 can be executed by control circuitry (e.g., control circuitry 120 of FIG. 1 ) of an image system configured in accordance with embodiments of the invention. It should be understood that process 700 is merely illustrative, and that any steps can be removed, modified, combined, or any steps may be added, without departing from the scope of the invention.
  • Process 700 begins at step 702 .
  • the control circuitry can bin a set of first color pixels (e.g., green pixels) of multiple photodiodes along a first direction to form a first pixel group, where the multiple photodiodes can include a dual layer photodiode pixel structure (e.g., photodiodes 204 of FIG. 2A , photodiodes 246 and 248 of FIG. 2B , photodiodes 304 of FIG. 3 , or photodiodes 404 of FIG. 4 ).
  • For a particular pixel arrangement (e.g., pixel arrangements 600 and 602 of FIGS. 6A and 6B), the control circuitry can bin G pixels (e.g., G pixels 605 and 606 of FIG. 6A or G pixels 611 and 612 of FIG. 6B) along a first direction (e.g., a diagonal direction of −45° or a horizontal direction).
  • the control circuitry can bin a set of second color pixels and third color pixels (e.g., red and blue pixels) of the multiple photodiodes along a second direction to form a second pixel group.
  • the control circuitry can bin R/B pixels (e.g., R/B pixels 607 and 608 of FIG. 6A or R/B pixels 613 and 614 of FIG. 6B) along a second direction (e.g., a diagonal direction of 45° or a vertical direction).
  • the first direction can be orthogonal to the second direction.
  • the control circuitry can combine the first pixel group and the second pixel group to form a binned pixel cluster.
  • the control circuitry can interpolate the binned pixel cluster to output a pixel image (e.g., a 2× pixel image with a rectilinear pattern).
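  • At the array level, the steps of this binning process can be sketched as below. Horizontal/vertical binning directions, pairwise averaging, and the simple resampling used as the interpolation step are all assumptions of this illustration rather than requirements of the disclosure.

```python
import numpy as np

def bin_and_interpolate(green, red, blue):
    """Sketch of the four steps: (1) bin the first color along a first direction,
    (2) bin the second and third colors along an orthogonal direction, (3) combine
    the groups into binned clusters, (4) resample to a half-resolution RGB image."""
    g_binned = (green[:, 0::2] + green[:, 1::2]) / 2.0   # step 1: horizontal pairs
    r_binned = (red[0::2, :] + red[1::2, :]) / 2.0       # step 2: vertical pairs
    b_binned = (blue[0::2, :] + blue[1::2, :]) / 2.0
    # steps 3 and 4: one RGB triplet per 2x2 neighborhood of the original array
    return np.stack([r_binned[:, 0::2], g_binned[0::2, :], b_binned[:, 0::2]], axis=-1)

g = np.arange(16, dtype=float).reshape(4, 4)
print(bin_and_interpolate(g, g + 100.0, g + 200.0).shape)  # (2, 2, 3)
```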
  • Accordingly, an image system with a dual layer photodiode structure is provided for processing color images.
  • the image system can include an image sensor that can include photodiodes with a dual layer photodiode structure.
  • the dual layer photodiode can include a first layer of photodiodes (e.g., a bottom layer), a second layer of photodiodes (e.g., a top layer), and an insulation layer disposed between the bottom and top layers.
  • the first layer of photodiodes can include one or more suitable pixels (e.g., green, red, clear, luminance, and/or infrared pixels).
  • the second layer of photodiodes can include one or more suitable pixels (e.g., green, blue, clear, luminance, and/or infrared pixels).
  • dual layer photodiodes can include additional clear pixels and still maintain the same or a greater number of green pixels.
  • an image sensor incorporating dual layer photodiodes can gain light sensitivity with the additional clear pixels and maintain luminance information with the green pixels.
  • the additional resolution and light sensitivity for such an image sensor can be advantageous for many situations (e.g., normal lighting conditions, low light conditions, and/or high speed imaging under normal lighting conditions).
  • the image sensor can include a CFA that can include a magenta filter element.
  • the magenta filter element, which can be positioned over one or more photodiodes, can allow the image sensor to achieve a desired quantum efficiency response.
  • the image sensor can include an IR cutoff filter, which can cover only a subset of the photodiodes of the image sensor. For example, if the IR cutoff filter only covers photodiodes corresponding to color pixels, uncovered photodiodes corresponding to clear pixels can provide greater sensitivity and resolution to the image sensor. Furthermore, because the color pixels can still be filtered by the IR cutoff filter, the image sensor can continue to render high-quality color images.
  • the image system can include control circuitry for processing data generated by an image sensor.
  • the control circuitry can bin pixels of multiple photodiodes of the image sensor.
  • the control circuitry can bin a set of first color pixels of the multiple photodiodes to form a first pixel group.
  • control circuitry can bin a set of second color pixels and third color pixels of the multiple photodiodes to form a second pixel group.
  • the control circuitry can then combine the first pixel group and the second pixel group to form a binned pixel cluster.
  • the control circuitry can interpolate the binned pixel cluster to output a pixel image (e.g., a 2× pixel image with a rectilinear pattern).

Abstract

An image system with a dual layer photodiode structure is provided for processing color images. In particular, the image system can include an image sensor that can include photodiodes with a dual layer photodiode structure. In some embodiments, the dual layer photodiode can include a first layer of photodiodes (e.g., a bottom layer), an insulation layer disposed on the first layer of photodiodes, and a second layer of photodiodes (e.g., a top layer) disposed on the insulation layer. The first layer of photodiodes can include one or more suitable pixels (e.g., green, red, clear, luminance, and/or infrared pixels). Likewise, the second layer of photodiodes can include one or more suitable pixels (e.g., green, blue, clear, luminance, and/or infrared pixels). An image sensor incorporating dual layer photodiodes can gain light sensitivity with additional clear pixels and maintain luminance information with green pixels.

Description

    FIELD OF THE INVENTION
  • This is directed to an image sensor with a dual layer photodiode structure.
  • BACKGROUND OF THE DISCLOSURE
  • Image sensors are used in many different types of electronic devices to capture an image. For example, modern cameras (e.g., video cameras and digital cameras) and other image capturing devices use image sensors to capture an image.
  • Image sensors typically have color processing capabilities. For example, an image sensor can include a color filter array (“CFA”) that can separate various colors from a color image. The resulting output from the image sensor can then be interpolated to form a full color image.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a schematic view of an illustrative electronic device configured in accordance with embodiments of the invention.
  • FIG. 2A is a cross-sectional view of a first illustrative image sensor in accordance with embodiments of the invention.
  • FIG. 2B is a cross-sectional view of a second illustrative image sensor in accordance with embodiments of the invention.
  • FIG. 3 is a cross-sectional view of a third illustrative image sensor in accordance with embodiments of the invention.
  • FIG. 4 is a cross-sectional view of a fourth illustrative image sensor in accordance with embodiments of the invention.
  • FIG. 5 is a representation of a typical color filter array (“CFA”) in a Bayer pattern.
  • FIGS. 6A-6L are representations of pixel arrangements in accordance with embodiments of the invention.
  • FIG. 7 is a flowchart of an illustrative process for binning photodiodes in accordance with embodiments of the invention.
  • DETAILED DESCRIPTION OF THE DISCLOSURE
  • FIG. 1 is a schematic view of an illustrative electronic device configured in accordance with embodiments of the invention. Electronic device 100 can be any type of user device that utilizes an image sensor (embodied here as image sensor 110) and is controlled generally by control circuitry 120. For example, electronic device 100 can include a camera, such as a computer camera, still camera, or portable video camera. Electronic device 100 can also include any other components in a typical camera (or otherwise), which are not depicted in FIG. 1 to avoid any distractions from embodiments of the invention.
  • Image sensor 110 can capture image data corresponding to a streaming image. For example, image sensor 110 can include any combination of lenses and arrays of cells (e.g., charge-coupled devices (“CCDs”) or complementary metal oxide semiconductor (“CMOS”) sensor cells) for capturing light.
  • Control circuitry 120 may process data generated by image sensor 110, and may perform any suitable operations based on this data. For example, control circuitry 120 can obtain multiple color pixels (e.g., red, green, and/or blue pixels) generated by image sensor 110. Upon obtaining the color pixels, control circuitry 120 can optionally bin the color pixels in one or more dimensions to form one or more pixel groups (e.g., one or more red, green, and/or blue pixel groups). After binning the multiple color pixels, control circuitry 120 can interpolate the one or more pixel groups to output a pixel image. As used herein, a “pixel” can refer to a photodiode and/or a filter that is capable of responding to one or more wavelengths of light (e.g., responding to one or more colors). For example, a “color pixel” can correspond to a photodiode or a photodiode and filter combination capable of absorbing green, blue, or red wavelengths of light. As another example, a “clear pixel” can correspond to a photodiode or a photodiode and filter combination capable of absorbing all visible light or both visible light and near infrared signals. As yet another example, an “infrared pixel” can correspond to a photodiode or a photodiode and filter combination capable of absorbing near infrared and infrared signals.
  • Image sensor 110 and control circuitry 120 may be implemented using any suitable combination of hardware and software. In some embodiments, image sensor 110 can be implemented substantially all in hardware (e.g., as a system-on-a-chip (“SoC”)). This way, image sensor 110 can have a small design that minimizes the area occupied on electronic device 100. In addition, image sensor 110 may have circuit components designed to maximize the speed of operation. Control circuitry 120 may include, for example, one or more processors, microprocessors, ASICS, FPGAs, or any suitable combination of hardware and software.
  • Turning now to FIGS. 2-4, these figures show illustrative image sensors in accordance with embodiments of the invention. It will be understood that, for the sake of simplicity, not all of the layers of an image sensor are shown in FIGS. 2-4. For example, there may be metal interconnect layers formed between the layers shown and additional dielectric layers for insulation purposes. It will also be understood that image sensors of FIGS. 2-4 can include additional photodiodes not shown in FIGS. 2-4. It will further be understood that FIGS. 2-4 are not drawn to scale.
  • Turning first to FIG. 2A, a cross-sectional view of image sensor 200 is shown. Image sensor 200 can be the same as or substantially similar to image sensor 110 of FIG. 1. Image sensor 200 can include substrate 202 that can incorporate multiple photodiodes 204 (e.g., photodiodes 205-210). In some embodiments, image sensor 200 can include a color filter array (“CFA”) (not shown in FIG. 2A). Photodiodes 204 can convert light into an electrical light signal that can then be processed by control circuitry (e.g., control circuitry 120 of FIG. 1) of an electronic device (e.g., electronic device 100 of FIG. 1). In particular, the level of intensity of light striking photodiodes 204 can affect the magnitude of the electronic signal that can be read by the control circuitry. For example, a higher intensity of light striking photodiodes 204 can generate an increase in the amount of charge collected by photodiodes 204.
  • In some embodiments, photodiodes 204 can have a dual layer photodiode pixel structure. Thus, photodiodes 204 can include a first layer (e.g., a bottom layer) of photodiodes such as, for example, photodiodes 205, 207, and 209. In addition, photodiodes 204 can include a second layer (e.g., a top layer) of photodiodes such as, for example, photodiodes 206, 208, and 210.
  • In some embodiments, the second layer of photodiodes may not be disposed uniformly on top of all photodiodes in a first layer of photodiodes. For example, FIG. 2B shows a cross-sectional view of image sensor 230, which can include photodiodes 240. Image sensor 230 can be the same as or substantially similar to image sensor 110 of FIG. 1 and image sensor 200 of FIG. 2A.
  • At some photodiode position(s) in image sensor 230, photodiodes 240 may include only one thick integrated layer of photodiodes. For instance, as shown in FIG. 2B, photodiodes 242 and 244 can have a single layer photodiode pixel structure. However, at other photodiode position(s) in image sensor 230, a second layer of photodiodes may be present, which may be positioned adjacent to photodiodes 242 and 244. For instance, as shown in FIG. 2B, a second layer of photodiodes (e.g., photodiode 246) can be disposed on top of a first layer of photodiodes (e.g., photodiode 248). In some cases, the thickness of each of photodiodes 242 and 244 may be the same as or substantially similar to the combined thickness of photodiodes 246 and 248.
  • In some embodiments, an image sensor (e.g., image sensor 200 of FIG. 2A and/or image sensor 230 of FIG. 2B) can include insulation layer 212, which can be disposed between the first layer of the photodiodes and the second layer of the photodiodes. Insulation layer 212, which can provide insulation between the two layers, can be formed using any suitable dielectric material such as, for example, paper, plastic, glass, rubber-like polymers, and/or any other suitable material.
  • One or more photodiodes 204 (e.g., a dual layer pair of photodiodes such as photodiodes 205 and 206, photodiodes 207 and 208, or photodiodes 209 and 210) or photodiodes 240 (e.g., a dual layer pair of photodiodes such as photodiodes 246 and 248) can correspond to color pixels, which can respond to different wavelengths of light based on the depths that each wavelength of light can reach inside the silicon. Accordingly, each photodiode of photodiodes 204 and/or photodiodes 240 may have a different spectral sensitivity curve.
  • The wavelengths of light from longest to shortest can be ranked as follows: red light, green light, and blue light. Physically, longer wavelengths of light can reach greater depth within the silicon. As such, the usual arrangement of photodiodes 204 can include photodiodes in one or more higher layers (e.g., a top layer) that can respond to shorter wavelengths of light (e.g., blue light). Photodiodes 204 can also include photodiodes in one or more lower layers (e.g., a bottom layer) that can respond to longer wavelengths of light (e.g., red light). Because longer wavelengths can be absorbed at greater depth by photodiodes 204 and 240, the first layer of photodiodes (e.g., photodiodes 205, 207, and 209 of FIG. 2A and photodiode 246 of FIG. 2B) can absorb red and/or green wavelengths of light, and the second layer of photodiodes (e.g., photodiodes 206, 208, and 210 of FIG. 2A and photodiode 248 of FIG. 2B) can absorb blue and/or green wavelengths of light. Thus, photodiodes 205, 207, and 209 of FIG. 2A and photodiode 246 of FIG. 2B can correspond to red and/or green pixels and photodiodes 206, 208, and 210 of FIG. 2A and photodiode 248 of FIG. 2B can correspond to blue and/or green pixels.
  • In some embodiments, in addition to or instead of color pixels, one or more photodiodes 204 and/or photodiodes 240 (e.g., a dual layer pair of photodiodes such as photodiodes 205 and 206, photodiodes 207 and 208, or photodiodes 209 and 210 of FIG. 2A and/or photodiodes 246 and 248 of FIG. 2B) can include one or more clear pixels. The one or more clear pixels can absorb a combination of various wavelengths of light (e.g., red, green, and blue wavelengths of light). In some cases, a single layer of photodiodes (e.g., a top or bottom layer of photodiodes) can include a combination of clear pixels and color pixels.
  • Because clear pixels can be more sensitive to light as compared to color pixels, clear pixels can improve the imaging performance of image sensor 200 under low light conditions. In addition, clear pixels can enable high speed imaging under normal lighting conditions.
  • In other embodiments, in addition to or instead of color pixels, one or more photodiodes 204 and/or photodiodes 240 (e.g., a dual layer pair of photodiodes such as photodiodes 205 and 206, photodiodes 207 and 208, or photodiodes 209 and 210 of FIG. 2A and/or photodiodes 246 and 248 of FIG. 2B) can include one or more luminance pixels (e.g., pixels with brightness information corresponding to a color image). Luminance pixels can have a spectral response that is wider than the spectral response of green pixels, but the spectral response of the luminance pixels can be narrower than a full response.
  • In further embodiments, in addition to or instead of color pixels, one or more photodiodes 204 and/or photodiodes 240 (e.g., a dual layer pair of photodiodes such as photodiodes 205 and 206, photodiodes 207 and 208, or photodiodes 209 and 210 of FIG. 2A and/or photodiodes 246 and 248 of FIG. 2B) can include one or more infrared pixels (e.g., pixels capable of absorbing near infrared and infrared signals).
  • In some embodiments, for image devices that use visible light responses, image sensor 200 can include an infrared (“IR”) cutoff filter 220, which can be positioned over a CFA (not shown in FIG. 2A or FIG. 2B) or photodiodes (e.g., photodiodes 204 of FIG. 2A and/or photodiodes 240 of FIG. 2B) to ensure correct color imaging. IR cutoff filter 220 can limit the effects of IR light on the responses of photodiodes 204. For example, IR cutoff filter 220 can block undesirable IR light from reaching the photodiodes. In some cases, lens 222, which can be positioned over IR cutoff filter 220, can focus light on the photodiodes. For example, light passing through lens 222 can pass through IR cutoff filter 220 and fall on the photodiodes. Lens 222 can include any suitable lens such as, for example, a single lens or multiple micro-lenses. Thus, in one configuration, each micro-lens can be formed over one or more corresponding photodiodes.
  • In some embodiments, if one or more photodiodes are infrared pixels, IR cutoff filter 220 can cover the one or more infrared pixels, and can thereby behave as a dual band IR cutoff filter. For example, IR cutoff filter 220 can pass both visible light and certain portions of the IR light by creating an extra window in the IR band.
  • Turning now to FIG. 3, a cross-sectional view of image sensor 300 is shown. Image sensor 300 can be the same as or substantially similar to image sensor 110 of FIG. 1. Image sensor 300 can include substrate 302, photodiodes 304 (e.g., photodiodes 305-310), insulation layer 312, color filter array (“CFA”) 314, IR cutoff filter 316, and lens 318. In one configuration, CFA 314 can be positioned over photodiodes 304, and IR cutoff filter 316 can be positioned over CFA 314. Persons skilled in the art will appreciate that photodiodes 304 can include one or more single layer photodiode pixel structures, one or more dual layer photodiode pixel structures, and/or any combination thereof. Persons skilled in the art will also appreciate that photodiodes 304 can include any suitable number of layers in a particular photodiode pixel structure.
  • CFA 314 can include one or more filter elements that may correspond to the desired color responses required for a color system. For example, as shown in FIG. 3, CFA 314 can include magenta filter element 320, which can be positioned over one or more photodiodes 304 (e.g., a dual layer pair of photodiodes such as photodiodes 307 and 308). In some cases, photodiodes 307 and 308 can correspond to red and blue pixels, respectively. In some embodiments, in order to produce a desired color reproduction quality in photodiodes 304, CFA 314 can include magenta filter element 320 if photodiodes 304 are unable to achieve a desired quantum efficiency response for red and blue pixels.
  • As another example, CFA 314 can include green filter element 322, which can be positioned over one or more photodiodes 304 (e.g., a dual layer pair of photodiodes such as photodiodes 305 and 306). In some cases, photodiodes 305 and 306 can correspond to green pixels. Persons skilled in the art will appreciate that, in addition to or instead of magenta filter element 320 and/or green filter element 322, CFA 314 can include one or more blue, red, yellow, and/or cyan filter elements.
  • For the one or more photodiodes, one or more filter elements (e.g., filter elements 320 and 322) of CFA 314 can separate out a particular spectral response of light (e.g., a combination of red and blue spectral responses or green spectral responses) by, for example, blocking passage of other spectra of light. For instance, light passing through lens 318 can pass through IR cutoff filter 316 and filter elements 320 and 322. The filtered light can then fall on a dual layer pair of photodiodes such as photodiodes 307 and 308 or photodiodes 305 and 306.
  • In some embodiments, clear pixels of photodiodes 304 (e.g., a dual layer pair of photodiodes such as photodiodes 309 and 310) can be those pixels that have no filter coating in CFA 314.
  • Turning next to FIG. 4, a cross-sectional view of image sensor 400 is shown. Image sensor 400 can be the same as or substantially similar to image sensor 110 of FIG. 1. Image sensor 400 can include substrate 402, photodiodes 404 (e.g., photodiodes 405-410), insulation layer 412, CFA 414, IR cutoff filter 416, and lens 418. Persons skilled in the art will appreciate that photodiodes 404 can include one or more single layer photodiode pixel structures, one or more dual layer photodiode pixel structures, and/or any combination thereof.
  • The imaging capabilities of image sensor 400 can vary depending on the types of pixels that correspond to one or more of photodiodes 404. For example, photodiodes 405-408 can correspond to color pixels (e.g., green, red, and/or blue pixels), and photodiodes 409 and 410 can correspond to clear pixels. As shown for image sensor 400, IR cutoff filter 416 can cover only a subset of photodiodes 404 corresponding to color pixels (e.g., photodiodes 405-408). The remaining uncovered photodiodes 404 can correspond to clear pixels (e.g., photodiodes 409 and 410) and can bring additional IR sensing capabilities and resolution to image sensor 400. For example, photodiodes 409 and 410 can sense an infrared signal that is beyond the visible light wavelengths, which can thus enable night vision applications for image sensor 400. Furthermore, because the color pixels can still be filtered by IR cutoff filter 416, image sensor 400 can continue to render high-quality color images.
  • As another example, photodiodes 405-408 can correspond to color pixels, and photodiodes 409 and 410 can correspond to luminance pixels. Thus, if IR cutoff filter 416 only covers photodiodes 405-408, photodiodes 409 and 410 can remain uncovered. Similar to an image sensor incorporating uncovered clear pixels, the luminance pixels can also provide additional IR sensing capabilities for image sensor 400. However, the resulting image sensor configuration may be less sensitive to light because luminance pixels may have a lower sensitivity to light as compared to clear pixels.
  • As discussed in connection with FIG. 3, a typical image sensor (e.g., image sensor 110 of FIG. 1) can use a CFA (e.g., CFA 314 of FIG. 3 or CFA 414 of FIG. 4) to sample a color image into a Bayer pattern. For example, FIG. 5 shows a representation of CFA 500 in a Bayer pattern in accordance with embodiments of the invention. As shown in FIG. 5, CFA 500 can have a repeating pattern of rows of green and red filters (e.g., row 502) alternating with rows of blue and green filters (e.g., row 504).
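For illustration, the following minimal sketch builds a filter-label grid in the row arrangement described for CFA 500. The function name, use of NumPy, and grid size are editorial assumptions and not part of the specification.

```python
import numpy as np

def bayer_cfa(rows, cols):
    """Build a Bayer-pattern label grid: rows of alternating green/red
    filters interleaved with rows of alternating blue/green filters."""
    cfa = np.empty((rows, cols), dtype="<U1")
    cfa[0::2, 0::2] = "G"   # green filters on even rows, even columns
    cfa[0::2, 1::2] = "R"   # red filters on even rows, odd columns
    cfa[1::2, 0::2] = "B"   # blue filters on odd rows, even columns
    cfa[1::2, 1::2] = "G"   # green filters on odd rows, odd columns
    return cfa

print(bayer_cfa(4, 4))  # a 4x4 tile of the repeating pattern
```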
  • Turning now to FIGS. 6A-6L, these figures show illustrative representations of pixel arrangements in accordance with embodiments of the invention. These pixel arrangements can represent arrangements of photodiodes with a dual layer photodiode pixel structure. These photodiodes can be the same as or similar to photodiodes 204 of FIG. 2A, photodiodes 240 of FIG. 2B, photodiodes 304 of FIG. 3, and/or photodiodes 404 of FIG. 4. In some embodiments, at least one pixel in a second layer (e.g., top layer) of the photodiodes can have a different color than a pixel at the same position in a first layer (e.g., bottom layer) of the photodiodes.
  • As shown in FIGS. 6A-6L, “R/B” pixels can correspond to dual layer photodiode pixels with red and blue pixels stacked together at the same positions (e.g., a red pixel at a particular position in a bottom layer of the photodiodes and a blue pixel stacked on top of the red pixel at the same position in a top layer of the photodiodes). As another example, “C” pixels can correspond to dual layer photodiode pixels with two clear pixels stacked together at the same position (e.g., clear pixels at the same positions in both a top layer and a bottom layer of the photodiodes). As yet another example, “IR” pixels can correspond to dual layer photodiode pixels with two infrared pixels stacked together at the same position (e.g., infrared pixels at the same positions in both a top layer and a bottom layer of the photodiodes). As a further example, “G” pixels can correspond to dual layer photodiode pixels with two green pixels stacked together at the same position (e.g., green pixels at the same positions in both a top layer and a bottom layer of the photodiodes). As yet a further example, “G” pixels can correspond to only one integrated layer of green pixels. For instance, a top layer of photodiodes can be combined with a bottom layer of photodiodes without any insulation in between the two layers (e.g., photodiodes 242 and 244 of FIG. 2B).
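As a reading aid, the labels above can be summarized as a simple lookup from label to the (bottom layer, top layer) pixel types stacked at one position. The dictionary below is an editorial sketch; the names are illustrative and not drawn from the figures themselves.

```python
# Hedged summary of the pixel labels used in FIGS. 6A-6L.
# Each label maps to the (bottom layer, top layer) pixel types at one position.
DUAL_LAYER_STACKS = {
    "R/B": ("red", "blue"),           # red pixel below, blue pixel above
    "G":   ("green", "green"),        # green in both layers (or one merged green layer)
    "C":   ("clear", "clear"),        # clear pixels in both layers
    "IR":  ("infrared", "infrared"),  # infrared pixels in both layers
}

def stack_for_label(label):
    """Return the (bottom, top) pixel pair represented by a dual layer label."""
    return DUAL_LAYER_STACKS[label]

print(stack_for_label("R/B"))  # ('red', 'blue')
```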
  • Persons skilled in the art will appreciate that the representations in FIGS. 6A-6L are merely exemplary, and that an image sensor (e.g., image sensor 110 of FIG. 1, image sensor 200 of FIG. 2A, image sensor 230 of FIG. 2B, image sensor 300 of FIG. 3, and/or image sensor 400 of FIG. 4) can include any suitable pixel arrangement. For example, one or more clear pixels in a pixel arrangement can be replaced with one or more green pixels, one or more luminance pixels (e.g., luminance pixels at the same positions in both a top layer and a bottom layer of the photodiodes), one or more infrared pixels (e.g., infrared pixels at the same positions in both a top layer and a bottom layer of the photodiodes), one or more cyan pixels (e.g., a green pixel at a particular position in a bottom layer of the photodiodes and a blue pixel stacked on top of the green pixel at the same position in a top layer of the photodiodes), one or more yellow pixels (e.g., a red pixel at a particular position in a bottom layer of the photodiodes and a green pixel stacked on top of the red pixel at the same position in a top layer of the photodiodes), any other suitable pixels, and/or any combination thereof. As another example, one or more green pixels in a pixel arrangement can be replaced with one or more clear pixels, one or more luminance pixels, one or more infrared pixels, one or more cyan pixels, one or more yellow pixels, any other suitable pixels, and/or any combination thereof. As yet another example, one or more pixels in these pixel arrangements can be replaced with stack IR pixels. Stack IR pixels can correspond to dual layer photodiode pixels with an infrared pixel and either a color, clear, or luminance pixel stacked together at the same position (e.g., an infrared pixel at a particular position in a bottom layer of the photodiodes and either a color, clear, or luminance pixel stacked on top of the infrared pixel at the same position in a top layer of the photodiodes).
  • Turning first to FIGS. 6A and 6B, pixel arrangements 600 and 602 show two illustrative pixel arrangements for G and R/B pixels. For example, pixel arrangement 600 of FIG. 6A can provide an adamantine (e.g., a checkerboard) pattern of pixels. As another example, pixel arrangement 602 of FIG. 6B can provide a rectilinear pattern of pixels. For the rectilinear pattern of pixel arrangement 602, at least one line (e.g., every row and column) of the top and bottom layers of pixel arrangement 602 has pixels of the same color.
  • Pixel arrangement 600 can provide for efficient binning (e.g., downsampling) of the one or more pixels. The binning can be executed by control circuitry of an image system (e.g., control circuitry 120 of FIG. 1). For example, the control circuitry can separately bin one or more pixel clusters of pixel arrangement 600.
  • For instance, for pixel cluster 604 of pixel arrangement 600, control circuitry can bin a set of pixels (e.g., G pixels 605 and 606) along a first direction in order to form a first pixel group. The first direction can, for example, be along a diagonal direction of −45°. In addition, for the same pixel cluster 604, the control circuitry can bin another set of pixels (e.g., R/B pixels 607 and 608) along a second direction (e.g., a diagonal direction of 45°) to form a second pixel group. Because pixels in the first and second pixel groups (e.g., G pixels 605 and 606 and R/B pixels 607 and 608) are closer to one another than pixels of the same type that are positioned vertically or horizontally, the control circuitry can bin pixel arrangement 600 without generating as many artifacts as would be generated from binning pixels arranged in a Bayer pattern (e.g., a Bayer pattern as shown in FIG. 5).
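The distance argument can be made concrete with a small geometric check, assuming a unit pixel pitch p and diagonally adjacent pixels within a 2×2 cluster (an editorial assumption about the figure geometry):

```latex
% Unit pixel pitch p assumed; diagonal neighbors within a 2x2 cluster.
\[
d_{\text{diagonal}} = \sqrt{1^{2} + 1^{2}}\,p = \sqrt{2}\,p \approx 1.41\,p
\qquad \text{versus} \qquad
d_{\text{Bayer R/B, same color}} = 2\,p
\]
```

Under this assumption, the pixels combined in each group stay closer to the cluster center than same-color red or blue pixels binned from a Bayer pattern, which is consistent with the reduced-artifact behavior described above.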
  • Binning can similarly be performed by the control circuitry for pixel arrangement 602 of FIG. 6B. For example, for pixel cluster 610 of pixel arrangement 602, the control circuitry can bin a set of pixels (e.g., G pixels 611 and 612) along a first direction (e.g., a horizontal direction) to form a first pixel group. In addition, for the same pixel cluster 610, the control circuitry can bin another set of pixels (e.g., R/B pixels 613 and 614) along a second direction (e.g., a vertical direction) to form a second pixel group.
  • After binning, the control circuitry can combine the first pixel group and the second pixel group to form a binned pixel cluster. Due to the dual layer photodiode pixel structure of pixel arrangement 600 of FIG. 6A and pixel arrangement 602 of FIG. 6B, the center pixel of each binned pixel cluster will have a red-green-blue ("RGB") triplet without requiring any demosaic processing. In some embodiments, for pixel arrangement 600 of FIG. 6A, the control circuitry can interpolate the binned pixel cluster to output a pixel image after binning. For example, the control circuitry can output a 2× pixel image with a rectilinear pattern.
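A minimal sketch of the binning just described is shown below, assuming a 2×2 adamantine cluster in which each G pixel contributes a green value and each R/B pixel contributes a red value from its bottom layer and a blue value from its top layer. The coordinates, sample values, and simple averaging are editorial assumptions used only to show how the two diagonal groups combine into an RGB triplet without demosaic processing.

```python
# Hedged sketch, not the circuitry's actual implementation.
# A 2x2 adamantine cluster: G pixels on one diagonal, R/B pixels on the other.
cluster = {
    (0, 0): {"g": 100},           # G pixel (e.g., pixel 605)
    (1, 1): {"g": 104},           # G pixel (e.g., pixel 606), -45 degree diagonal
    (0, 1): {"r": 60, "b": 40},   # R/B pixel (e.g., pixel 607)
    (1, 0): {"r": 64, "b": 44},   # R/B pixel (e.g., pixel 608), +45 degree diagonal
}

def bin_cluster(cluster):
    """Bin the G pixels along one diagonal and the R/B pixels along the other,
    then combine both groups into a single binned RGB triplet."""
    g_vals = [p["g"] for p in cluster.values() if "g" in p]
    r_vals = [p["r"] for p in cluster.values() if "r" in p]
    b_vals = [p["b"] for p in cluster.values() if "b" in p]
    return {
        "r": sum(r_vals) / len(r_vals),
        "g": sum(g_vals) / len(g_vals),
        "b": sum(b_vals) / len(b_vals),
    }

print(bin_cluster(cluster))  # {'r': 62.0, 'g': 102.0, 'b': 42.0}
```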
  • Turning next to FIGS. 6C and 6D, pixel arrangements 620 and 622 show two illustrative pixel arrangements for C, G, and R/B pixels. For example, pixel arrangement 620 of FIG. 6C can provide an adamantine pattern of pixels. As another example, pixel arrangement 622 of FIG. 6D can provide a rectilinear pattern of pixels. For the rectilinear pattern of pixel arrangement 622, at least one line (e.g., rows 626-630) of the top and bottom layers of pixel arrangement 622 can have pixels of the same color.
  • In some embodiments, by using pixel arrangement 620 of FIG. 6C, an image sensor can obtain at least as much luminance resolution as with a regular Bayer pattern of a CFA (e.g., CFA 500 of FIG. 5). For example, for any particular pixel cluster (e.g., pixel cluster 624), pixel arrangement 620 can have the same number of green pixels as a pixel cluster (e.g., pixel cluster 510 of FIG. 5) of a Bayer pattern. For instance, for both of the configurations shown in FIGS. 5 and 6C, there are two green pixels for each 2×2 pixel cluster.
  • Because of the addition of C pixels (e.g., C pixels 625) in pixel arrangement 620, an image sensor employing pixel arrangement 620 has the additional advantage of higher sensitivity to light as compared to a regular Bayer pattern. Furthermore, because pixel arrangement 620 can provide the same number of red and blue pixels as a regular Bayer pattern, the additional sensitivity is gained without sacrificing red and blue color resolution. In other embodiments, by using pixel arrangement 622 of FIG. 6D, an image sensor can completely avoid green imbalance issues because the G pixels can be arranged along one or more lines of pixel arrangement 622 (e.g., along rows 626-630).
  • Turning to FIGS. 6E and 6F, pixel arrangements 631 and 632 show two additional illustrative pixel arrangements for C, G, and R/B pixels. For example, pixel arrangement 631 of FIG. 6E can provide an adamantine pattern of pixels. As another example, pixel arrangement 632 of FIG. 6F can provide a rectilinear pattern of pixels. For the rectilinear pattern of pixel arrangement 632, at least one line (e.g., alternate rows and columns) of the top and bottom layers of pixel arrangement 632 has pixels of the same color.
  • In comparison to pixel arrangement 620 of FIG. 6C and pixel arrangement 622 of FIG. 6D, pixel arrangements 631 and 632 have a greater number of C pixels, thereby providing additional sensitivity to light. Furthermore, because pixel arrangement 632 provides C pixels along one or more lines (e.g., rows 633-637), pixel arrangement 632 can allow for separate exposures of the C pixels and the R/B and G pixels if per line control can be achieved. By separately exposing the C pixels and the R/B and G pixels, pixel arrangement 632 can allow an image sensor to simultaneously avoid saturation of the C pixels and improve the dynamic range of image capture.
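One possible way to exploit such per line control is sketched below. The exposure times, row labels, and rescaling are purely illustrative assumptions; the specification only states that separate exposures can avoid C pixel saturation and improve dynamic range.

```python
# Speculative sketch of per-line exposure control (assumed, not specified):
# rows of C pixels receive a shorter exposure so they do not saturate, and
# their values are rescaled to the color-row exposure before further processing.
ROW_EXPOSURE_MS = {"clear_row": 4.0, "color_row": 16.0}

def normalize_to_color_exposure(value, row_type):
    """Rescale a pixel value captured with a row-specific exposure time so it
    is comparable with values captured during the longer color-row exposure."""
    gain = ROW_EXPOSURE_MS["color_row"] / ROW_EXPOSURE_MS[row_type]
    return value * gain

print(normalize_to_color_exposure(200.0, "clear_row"))  # 800.0
```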
  • Turning next to FIGS. 6G and 6H, pixel arrangements 640 and 642 show two illustrative pixel arrangements for C and R/B pixels. For example, pixel arrangement 640 of FIG. 6G can provide an adamantine pattern of pixels. As another example, pixel arrangement 642 of FIG. 6H can provide a rectilinear pattern of pixels. For the rectilinear pattern of pixel arrangement 642, at least one line (e.g., alternate rows and columns) of the top and bottom layers of pixel arrangement 642 can have pixels of the same color.
  • Because of the additional C pixels in pixel arrangements 640 and 642, pixel arrangements 640 and 642 can provide even more sensitivity to light as compared to pixel arrangement 631 of FIG. 6E and pixel arrangement 632 of FIG. 6F. In some embodiments, pixel arrangements 640 and 642 can be used for low light conditions because of lower color resolution requirements.
  • As shown in FIGS. 6G and 6H, pixel arrangements 640 and 642 can provide C pixels along one or more lines (e.g., one or more of rows 643-651 and one or more of columns 652-655), which can allow for separate exposures of C pixels and R/B pixels if per line control can be achieved. By separately exposing the C pixels and the R/B pixels, pixel arrangements 640 and 642 can allow an image sensor to simultaneously avoid saturation of the C pixels and improve the dynamic range of image capture. In this way, the image sensor can capture more information from a scene than would otherwise be possible.
  • In some embodiments, G pixel values can be derived for pixel arrangements 640 and 642 using any suitable approach. For example, because clear pixels are a combination of green, red, and blue pixels, control circuitry (e.g., control circuitry 120) can obtain G pixel values by interpolating neighboring R/B pixel values and C pixel values jointly.
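A minimal sketch of such a joint interpolation is shown below, under the assumption that a clear value approximates the sum of co-located red, green, and blue responses. The function, neighbor averaging, and clamping are editorial choices, not the circuitry's prescribed algorithm.

```python
# Hedged sketch: estimate a green value at a clear pixel by subtracting
# interpolated red and blue contributions, assuming C ~ R + G + B.
def estimate_green(clear_value, neighbor_rb_pixels):
    """neighbor_rb_pixels is a list of (red, blue) dual layer readouts from
    neighboring R/B pixels; their averages are subtracted from the clear value."""
    avg_r = sum(r for r, _ in neighbor_rb_pixels) / len(neighbor_rb_pixels)
    avg_b = sum(b for _, b in neighbor_rb_pixels) / len(neighbor_rb_pixels)
    return max(clear_value - avg_r - avg_b, 0.0)  # clamp at zero

print(estimate_green(250.0, [(80.0, 60.0), (84.0, 56.0)]))  # 110.0
```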
  • Turning now to FIGS. 6I and 6J, pixel arrangements 660 and 662 show two illustrative pixel arrangements for G and R/B pixels. For example, pixel arrangement 660 of FIG. 6I can provide a non-rotated pattern of pixels. As another example, pixel arrangement 662 of FIG. 6J can provide a rotated pattern of pixels.
  • Pixel arrangements 660 and 662 can include one or more G pixels that may have replaced one or more C pixels of pixel arrangements 620 (FIG. 6C), 622 (FIG. 6D), 631 (FIG. 6E), 632 (FIG. 6F), 640 (FIG. 6G), and 642 (FIG. 6H). In some embodiments, the number of G pixels in pixel arrangements 660 and 662 can exceed the number of R/B pixels.
  • Pixel arrangements can have any suitable pre-determined orientations (e.g., 10°, 20°, 30°, etc.). For example, pixel arrangement 660 can have a pre-determined orientation of 0°. As another example, pixel arrangement 662 can have a pre-determined orientation of 45°. As a result of its pre-determined orientation, the sampling frequency of pixel arrangement 662 along the vertical and horizontal directions may be different from the sampling frequency of pixel arrangement 660 along the vertical, horizontal, and 45° directions.
  • Turning next to FIGS. 6K and 6L, pixel arrangements 670 and 672 show two illustrative pixel arrangements for IR, G, and R/B pixels. Pixel arrangements 670 (FIG. 6K) and 672 (FIG. 6L) can be substantially similar to pixel arrangements 620 (FIG. 6C) and 622 (FIG. 6D), respectively. In particular, the C pixels of pixel arrangements 620 and 622 can be replaced with IR pixels in pixel arrangements 670 and 672.
  • Pixel arrangement 670 of FIG. 6K can provide an adamantine pattern of pixels. Moreover, pixel arrangement 672 of FIG. 6L can provide a rectilinear pattern of pixels. For the rectilinear pattern of pixel arrangement 672, at least one line (e.g., rows 673-677) of the top and bottom layers of pixel arrangement 672 can have pixels of the same color.
  • In some embodiments, by using pixel arrangement 670 of FIG. 6K, an image sensor can obtain at least as much luminance resolution as with a regular Bayer pattern of a CFA (e.g., CFA 500 of FIG. 5). For example, for any particular pixel cluster (e.g., pixel cluster 680), pixel arrangement 670 can have the same number of green pixels as a pixel cluster (e.g., pixel cluster 510 of FIG. 5) of a Bayer pattern. For instance, for both of the configurations shown in FIGS. 5 and 6K, there are two green pixels for each 2×2 pixel cluster.
  • Because of the addition of IR pixels (e.g., IR pixels 682) in pixel arrangement 670, an image sensor employing pixel arrangement 670 can be capable of absorbing near infrared signals and infrared signals. Furthermore, because pixel arrangement 670 can provide the same number of red and blue pixels as a regular Bayer pattern, this additional capability is gained without sacrificing red and blue color resolution. In other embodiments, by using pixel arrangement 672 of FIG. 6L, an image sensor can completely avoid green imbalance issues because the G pixels can be arranged along one or more lines of pixel arrangement 672 (e.g., along rows 673-677).
  • As discussed previously, an image system (e.g., image system 100 of FIG. 1) can perform binning on one or more photodiodes. FIG. 7 is a flowchart of process 700 for binning photodiodes. In particular, process 700 can be executed by control circuitry (e.g., control circuitry 120 of FIG. 1) of an image system configured in accordance with embodiments of the invention. It should be understood that process 700 is merely illustrative, and that any steps can be removed, modified, or combined, and any steps may be added, without departing from the scope of the invention.
  • Process 700 begins at step 702. At step 704, the control circuitry can bin a set of first color pixels (e.g., green pixels) of multiple photodiodes along a first direction to form a first pixel group, where the multiple photodiodes can include a dual layer photodiode pixel structure (e.g., photodiodes 204 of FIG. 2A, photodiodes 246 and 248 of FIG. 2B, photodiodes 304 of FIG. 3, or photodiodes 404 of FIG. 4). For example, for a particular pixel arrangement (e.g., pixel arrangements 600 and 602 of FIGS. 6A and 6B, respectively) of the multiple photodiodes, the control circuitry can bin G pixels (e.g., G pixels 605 and 606 of FIG. 6A or G pixels 611 and 612 of FIG. 6B) along a first direction (e.g., a diagonal direction of −45° or a horizontal direction).
  • Then, at step 706, the control circuitry can bin a set of second color pixels and third color pixels (e.g., red and blue pixels) of the multiple photodiodes along a second direction to form a second pixel group. For example, the control circuitry can bin R/B pixels (e.g., R/B pixels 607 and 608 of FIG. 6A or R/B pixels 613 and 614 of FIG. 6B) along a second direction (e.g., a diagonal direction of 45° or a vertical direction). In some embodiments, the first direction can be orthogonal to the second direction.
  • Continuing to step 708, the control circuitry can combine the first pixel group and the second pixel group to form a binned pixel cluster. After forming the binned pixel cluster, at step 710, the control circuitry can interpolate the binned pixel cluster to output a pixel image (e.g., a 2× pixel image with a rectilinear pattern). Process 700 then ends at step 712.
  • In conclusion, an image system with a dual layer photodiode structure is provided for processing color images. In particular, the image system can include an image sensor that can include photodiodes with a dual layer photodiode structure. In some embodiments, the dual layer photodiode can include a first layer of photodiodes (e.g., a bottom layer), a second layer of photodiodes (e.g., a top layer), and an insulation layer disposed between the bottom and top layers. The first layer of photodiodes can include one or more suitable pixels (e.g., green, red, clear, luminance, and/or infrared pixels). Likewise, the second layer of photodiodes can include one or more suitable pixels (e.g., green, blue, clear, luminance, and/or infrared pixels).
  • In contrast to conventional photodiodes, dual layer photodiodes can include additional clear pixels and still maintain the same or a greater number of green pixels. Thus, an image sensor incorporating dual layer photodiodes can gain light sensitivity with the additional clear pixels and maintain luminance information with the green pixels. The additional resolution and light sensitivity for such an image sensor can be advantageous for many situations (e.g., normal lighting conditions, low light conditions, and/or high speed imaging under normal lighting conditions).
  • In some embodiments, the image sensor can include a CFA that can include a magenta filter element. The magenta filter element, which can be positioned over one or more photodiodes, can allow the image sensor to achieve a desired quantum efficiency response.
  • In some embodiments, the image sensor can include an IR cutoff filter, which can cover only a subset of the photodiodes of the image sensor. For example, if the IR cutoff filter only covers photodiodes corresponding to color pixels, uncovered photodiodes corresponding to clear pixels can provide greater sensitivity and resolution to the image sensor. Furthermore, because the color pixels can still be filtered by the IR cutoff filter, the image sensor can continue to render high-quality color images.
  • In some embodiments, the image system can include control circuitry for processing data generated by an image sensor. For example, the control circuitry can bin pixels of multiple photodiodes of the image sensor. In particular, the control circuitry can bin a set of first color pixels of the multiple photodiodes to form a first pixel group. In addition, the control circuitry can bin a set of second color pixels and third color pixels of the multiple photodiodes to form a second pixel group. The control circuitry can then combine the first pixel group and the second pixel group to form a binned pixel cluster. After forming the binned pixel cluster, the control circuitry can interpolate the binned pixel cluster to output a pixel image (e.g., a 2× pixel image with a rectilinear pattern).
  • The described embodiments of the invention are presented for the purpose of illustration and not of limitation.

Claims (33)

1. A plurality of photodiodes for receiving light, the plurality of photodiodes comprising:
a bottom layer of photodiodes;
a top layer of photodiodes, wherein at least one pixel in the top layer of photodiodes has a different spectral response than a pixel at the same position in the bottom layer of photodiodes; and
an insulation layer disposed between the bottom layer of the photodiodes and the top layer of the photodiodes.
2. The plurality of photodiodes of claim 1, wherein the bottom layer of the photodiodes comprises a bottom pixel at a first position, and wherein the top layer of the photodiodes comprises a top pixel stacked on top of the bottom pixel at the first position.
3. The plurality of photodiodes of claim 2, wherein the bottom pixel is a red pixel and the top pixel is a blue pixel.
4. The plurality of photodiodes of claim 2, wherein the bottom pixel is a first green pixel and the top pixel is a second green pixel.
5. The plurality of photodiodes of claim 2, wherein the bottom pixel is a first clear pixel and the top pixel is a second clear pixel.
6. The plurality of photodiodes of claim 2, wherein the bottom pixel is a first luminance pixel and the top pixel is a second luminance pixel.
7. The plurality of photodiodes of claim 2, wherein the bottom pixel is a first infrared pixel and the top pixel is a second infrared pixel.
8. The plurality of photodiodes of claim 2, wherein the bottom pixel is an infrared pixel and the top pixel is one of a color pixel, a clear pixel, or a luminance pixel.
9. The plurality of photodiodes of claim 1, wherein the bottom layer of the photodiodes and the top layer of the photodiodes are arranged in an adamantine pattern.
10. The plurality of photodiodes of claim 1, wherein the bottom layer of the photodiodes and the top layer of the photodiodes are arranged in a rectilinear pattern, wherein at least one line of the bottom layer and the top layer comprises a plurality of pixels of the same color.
11. The plurality of photodiodes of claim 10, wherein the at least one line is at least one row or at least one column.
12. The plurality of photodiodes of claim 1, wherein the bottom layer, the insulation layer, and the top layer have a pre-determined orientation.
13. The plurality of photodiodes of claim 12, wherein the pre-determined orientation is 45°.
14. An image sensor, the image sensor comprising:
a plurality of photodiodes with a dual layer photodiode pixel structure; and
an infrared (“IR”) cutoff filter, wherein the IR cutoff filter is positioned over at least a portion of the plurality of photodiodes.
15. The image sensor of claim 14, further comprising a color filter array (“CFA”), and wherein the CFA comprises one or more of a magenta filter element and a green filter element.
16. The image sensor of claim 14, wherein the plurality of photodiodes comprises one or more infrared pixels, and wherein the IR cutoff filter covers the one or more infrared pixels.
17. The image sensor of claim 14, wherein the plurality of photodiodes comprises one or more color pixels and one or more clear pixels.
18. The image sensor of claim 17, wherein the IR cutoff filter covers only the one or more color pixels.
19. The image sensor of claim 14, wherein the dual layer photodiode pixel structure comprises a bottom layer of photodiodes and a top layer of photodiodes.
20. The image sensor of claim 19, wherein the bottom layer comprises a set of red pixels at one or more positions and the top layer comprises a set of blue pixels stacked on top of the set of red pixels at the one or more positions.
21. The image sensor of claim 19, wherein the bottom layer comprises a set of green pixels at one or more positions and the top layer comprises a set of blue pixels stacked on top of the set of green pixels at the one or more positions.
22. The image sensor of claim 19, wherein the bottom layer comprises a set of red pixels at one or more positions and the top layer comprises a set of green pixels stacked on top of the set of red pixels at the one or more positions.
23. The image sensor of claim 19, wherein the bottom layer comprises a set of red pixels and a first set of clear pixels, and the top layer comprises a set of blue pixels and a second set of clear pixels.
24. The image sensor of claim 19, wherein the bottom layer comprises a set of red pixels and a first set of green pixels, and the top layer comprises a set of blue pixels and a second set of green pixels.
25. The image sensor of claim 19, wherein the bottom layer comprises a set of red pixels, a first set of green pixels, and a first set of clear pixels, and the top layer comprises a set of blue pixels, a second set of green pixels, and a second set of clear pixels.
26. The image sensor of claim 14 further comprising at least one photodiode with a single layer photodiode pixel structure, wherein the plurality of photodiodes is positioned adjacent to the at least one photodiode.
27. A method for binning a plurality of photodiodes, the method comprising:
binning a set of first color pixels of the plurality of photodiodes along a first direction to form a first pixel group, wherein the plurality of photodiodes comprises a dual layer photodiode pixel structure;
binning a set of second color pixels and third color pixels of the plurality of photodiodes along a second direction to form a second pixel group;
combining the first pixel group and the second pixel group to form a binned pixel cluster; and
interpolating the binned pixel cluster to output a pixel image.
28. The method of claim 27, wherein the dual layer photodiode pixel structure is arranged in an adamantine pixel arrangement.
29. The method of claim 27, wherein the first color pixels are green pixels, the second color pixels are blue pixels, and the third color pixels are red pixels.
30. The method of claim 27, wherein the pixel image comprises a rectilinear grid.
31. The method of claim 27, wherein the first direction is orthogonal to the second direction.
32. The method of claim 27, wherein the first direction is along a horizontal direction and the second direction is along a vertical direction.
33. The method of claim 27, wherein the first direction is along a −45° direction and the second direction is along a 45° direction.
US12/826,313 2010-06-29 2010-06-29 Image sensor with dual layer photodiode structure Abandoned US20110317048A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US12/826,313 US20110317048A1 (en) 2010-06-29 2010-06-29 Image sensor with dual layer photodiode structure
US14/793,480 US10032810B2 (en) 2010-06-29 2015-07-07 Image sensor with dual layer photodiode structure

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US12/826,313 US20110317048A1 (en) 2010-06-29 2010-06-29 Image sensor with dual layer photodiode structure

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/793,480 Continuation US10032810B2 (en) 2010-06-29 2015-07-07 Image sensor with dual layer photodiode structure

Publications (1)

Publication Number Publication Date
US20110317048A1 true US20110317048A1 (en) 2011-12-29

Family

ID=45352206

Family Applications (2)

Application Number Title Priority Date Filing Date
US12/826,313 Abandoned US20110317048A1 (en) 2010-06-29 2010-06-29 Image sensor with dual layer photodiode structure
US14/793,480 Expired - Fee Related US10032810B2 (en) 2010-06-29 2015-07-07 Image sensor with dual layer photodiode structure

Family Applications After (1)

Application Number Title Priority Date Filing Date
US14/793,480 Expired - Fee Related US10032810B2 (en) 2010-06-29 2015-07-07 Image sensor with dual layer photodiode structure

Country Status (1)

Country Link
US (2) US20110317048A1 (en)

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120025080A1 (en) * 2010-07-30 2012-02-02 Changmeng Liu Color correction circuitry and methods for dual-band imaging systems
US8582006B2 (en) * 2011-01-21 2013-11-12 Aptina Imaging Corporation Pixel arrangement for extended dynamic range imaging
US20140146181A1 (en) * 2012-11-29 2014-05-29 Hyundai Motor Company Apparatus and method for acquiring differential image
EP2784820A1 (en) * 2013-03-25 2014-10-01 Kabushiki Kaisha Toshiba Solid state imaging device
CN104205808A (en) * 2012-03-30 2014-12-10 株式会社尼康 Imaging device and image sensor
US20150138366A1 (en) * 2013-11-21 2015-05-21 Aptina Imaging Corporation Imaging systems with visible light sensitive pixels and infrared light sensitive pixels
JP2015179731A (en) * 2014-03-19 2015-10-08 株式会社東芝 Solid state imaging apparatus
US20150381957A1 (en) * 2014-06-26 2015-12-31 Pixart Imaging (Penang) Sdn.Bhd. Color image sensor and operating method thereof
JP2016028457A (en) * 2012-04-02 2016-02-25 ソニー株式会社 Solid-state image pickup device and electronic device
CN105789227A (en) * 2014-10-06 2016-07-20 采钰科技股份有限公司 Stacked filter and image sensor containing the same
US9679933B2 (en) 2014-10-06 2017-06-13 Visera Technologies Company Limited Image sensors and methods of forming the same
US9793306B2 (en) 2015-04-08 2017-10-17 Semiconductor Components Industries, Llc Imaging systems with stacked photodiodes and chroma-luma de-noising
CN108089774A (en) * 2018-02-09 2018-05-29 京东方科技集团股份有限公司 A kind of touch control device
US20180301484A1 (en) * 2017-04-17 2018-10-18 Semiconductor Components Industries, Llc Image sensors with high dynamic range and autofocusing hexagonal pixels
US20180315788A1 (en) * 2017-05-01 2018-11-01 Visera Technologies Company Limited Image sensor
US20180337216A1 (en) * 2014-11-27 2018-11-22 Sony Semiconductor Solutions Corporation Solid-state imaging device and electronic apparatus
US20180358393A1 (en) * 2017-06-13 2018-12-13 Renesas Electronics Corporation Solid-state imaging element and manufacturing method thereof
CN109141632A (en) * 2018-11-06 2019-01-04 德淮半导体有限公司 Pixel unit, imaging sensor and its manufacturing method and imaging device
US20190199438A1 (en) * 2017-08-24 2019-06-27 Molex, Llc Free space optical data transmission using photodetector array
US10367992B2 (en) * 2013-03-25 2019-07-30 Sony Corporation Image sensor and electronic apparatus
US10424568B1 (en) 2018-06-19 2019-09-24 Globalfoundries Singapore Pte. Ltd. Image sensor with stacked SPAD and method for producing the same
CN110797366A (en) * 2019-11-14 2020-02-14 Oppo广东移动通信有限公司 Pixel structure, complementary metal oxide semiconductor image sensor and terminal
US10644073B2 (en) * 2016-12-19 2020-05-05 Samsung Electronics Co., Ltd. Image sensors and electronic devices including the same
US20200236312A1 (en) * 2012-01-13 2020-07-23 Nikon Corporation Solid-state imaging device and electronic camera
US20220020795A1 (en) * 2020-07-14 2022-01-20 SK Hynix Inc. Image sensing device
US11252345B2 (en) * 2018-02-11 2022-02-15 Zhejiang Uniview Technologies Co., Ltd Dual-spectrum camera system based on a single sensor and image processing method

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102016217282A1 (en) 2016-09-12 2018-03-15 Conti Temic Microelectronic Gmbh IMAGE SENSOR, IMAGING DEVICE, DRIVER ASSISTANCE SYSTEM, VEHICLE AND METHOD FOR EVALUATING ELECTROMAGNETIC RADIATION
US10770505B2 (en) * 2017-04-05 2020-09-08 Intel Corporation Per-pixel performance improvement for combined visible and ultraviolet image sensor arrays
US10931902B2 (en) * 2018-05-08 2021-02-23 Semiconductor Components Industries, Llc Image sensors with non-rectilinear image pixel arrays
KR20200109551A (en) * 2019-03-13 2020-09-23 삼성전자주식회사 Sensor and electronic device

Citations (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5485204A (en) * 1993-08-02 1996-01-16 Nec Corporation Solid state image sensor and imaging method using spatial pixel offset
US5745171A (en) * 1994-04-14 1998-04-28 Asahi Kogaku Kogyo Kabushiki Kaisha Device for generating a luminance signal from a single field
US6281561B1 (en) * 1997-08-28 2001-08-28 Forschungszentrum Julich Gmbh Multicolor-color sensor
US20040017497A1 (en) * 2002-07-19 2004-01-29 Nobuo Suzuki Solid-state image pick-up device
US20050017154A1 (en) * 2003-07-15 2005-01-27 Takehiko Ozumi Solid-state imaging device and method for driving the same
US20050178948A1 (en) * 2004-02-13 2005-08-18 Young Optics Inc. Projection method of display device
US20060181623A1 (en) * 2004-12-09 2006-08-17 Hiroki Endo Solid-state image device
US7164444B1 (en) * 2002-05-17 2007-01-16 Foveon, Inc. Vertical color filter detector group with highlight detector
US20070131987A1 (en) * 2005-12-09 2007-06-14 Dongbu Electronics Co., Ltd. Vertical image sensor and method for manufacturing the same
US20070218578A1 (en) * 2006-03-17 2007-09-20 Sharp Laboratories Of America, Inc. Real-time CMOS imager having stacked photodiodes fabricated on SOI wafer
US20080149976A1 (en) * 2006-12-22 2008-06-26 Su Lim Vertical type cmos iamge sensor and method of manufacturing the same
US20080160723A1 (en) * 2005-09-13 2008-07-03 Lumiense Photonics Inc. Method of fabricating silicon/dielectric multi-layer semiconductor structures using layer transfer technology and also a three-dimensional multi-layer semiconductor device and stacked layer type image sensor using the same method, and a method of manufacturing a three-dimensional multi-layer semiconductor device and the stack type image sensor
US20080280388A1 (en) * 2006-04-28 2008-11-13 Kenji Ishida CCD type solid-state imaging device and method for manufacturing the same
US20090009637A1 (en) * 2007-06-12 2009-01-08 Tetsu Wada Imaging apparatus
US20090015693A1 (en) * 2007-06-28 2009-01-15 Fujifilm Corporation Signal processing apparatus, image pickup apparatus and computer readable medium
US20090040353A1 (en) * 2007-08-10 2009-02-12 Takeshi Yamamoto Imaging apparatus and method of driving solid-state imaging device
US20090079855A1 (en) * 2007-09-20 2009-03-26 Victor Company Of Japan, Ltd. Imaging apparatus and method of processing video signal
US20090085135A1 (en) * 2007-09-27 2009-04-02 Bang Sun Kyung Image Sensor and Manufacturing Method Thereof
US20090135167A1 (en) * 2007-11-26 2009-05-28 Sony Corporation Display device and electronic apparatus
US20090152604A1 (en) * 2007-12-13 2009-06-18 Semiconductor Manufacturing International (Shanghai) Corporation System and method for sensing image on CMOS
US20090160981A1 (en) * 2007-12-20 2009-06-25 Micron Technology, Inc. Apparatus including green and magenta pixels and method thereof
US20090200626A1 (en) * 2008-02-08 2009-08-13 Omnivision Technologies Inc. Backside illuminated imaging sensor with vertical pixel sensor
US20090200584A1 (en) * 2008-02-04 2009-08-13 Tweet Douglas J Full Color CMOS Imager Filter
US20090213256A1 (en) * 2008-02-26 2009-08-27 Sony Corporation Solid-state imaging device and camera
US20090219410A1 (en) * 2008-02-28 2009-09-03 Sheng Teng Hsu CMOS Imager Flush Reset
US7652701B2 (en) * 2000-03-14 2010-01-26 Fujifilm Corporation Solid-state honeycomb type image pickup apparatus using a complementary color filter and signal processing method therefor
WO2010070869A1 (en) * 2008-12-19 2010-06-24 パナソニック株式会社 Image pickup device
US20100157116A1 (en) * 2008-12-22 2010-06-24 Sony Corporation Solid-state image pickup device and electronic apparatus using the same
US7773137B2 (en) * 2004-12-16 2010-08-10 Fujitsu Semiconductor Limited Imaging apparatus, imaging element, and image processing method
US7786426B2 (en) * 2007-06-06 2010-08-31 Sony Corporation Imaging device with a color filter that contains a layer only covering the surrounding areas
US7839437B2 (en) * 2006-05-15 2010-11-23 Sony Corporation Image pickup apparatus, image processing method, and computer program capable of obtaining high-quality image data by controlling imbalance among sensitivities of light-receiving devices
US7952636B2 (en) * 2007-09-06 2011-05-31 Fujifilm Corporation Method for driving solid-state imaging device and imaging apparatus
US20110249158A1 (en) * 2010-04-13 2011-10-13 Jaroslav Hynecek Image sensor pixels with vertical charge transfer
US8125547B2 (en) * 2007-11-22 2012-02-28 Fujifilm Corporation Driving method of solid-state imaging device, solid-state imaging device, and imaging apparatus including photoelectric conversion elements for luminance detection

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003298038A (en) * 2002-04-05 2003-10-17 Canon Inc Photoelectric conversion element and solid-state imaging device using the same
JP4154165B2 (en) * 2002-04-05 2008-09-24 キヤノン株式会社 PHOTOELECTRIC CONVERSION ELEMENT, SOLID-STATE IMAGING DEVICE, CAMERA, AND IMAGE READING DEVICE USING THE SAME
CN102017147B (en) * 2007-04-18 2014-01-29 因维萨热技术公司 Materials, systems and methods for optoelectronic devices
JP5277565B2 (en) * 2007-05-31 2013-08-28 富士通セミコンダクター株式会社 Solid-state image sensor
US20090078316A1 (en) * 2007-09-24 2009-03-26 Qualcomm Incorporated Interferometric photovoltaic cell
US20100102229A1 (en) * 2008-10-28 2010-04-29 Sony Ericsson Mobile Communications Ab Combined sensor for portable communication devices
JP2011159757A (en) * 2010-01-29 2011-08-18 Sony Corp Solid-state imaging device and manufacturing method thereof, driving method of solid-state imaging device, and electronic device

Patent Citations (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5485204A (en) * 1993-08-02 1996-01-16 Nec Corporation Solid state image sensor and imaging method using spatial pixel offset
US5745171A (en) * 1994-04-14 1998-04-28 Asahi Kogaku Kogyo Kabushiki Kaisha Device for generating a luminance signal from a single field
US6281561B1 (en) * 1997-08-28 2001-08-28 Forschungszentrum Julich Gmbh Multicolor-color sensor
US7652701B2 (en) * 2000-03-14 2010-01-26 Fujifilm Corporation Solid-state honeycomb type image pickup apparatus using a complementary color filter and signal processing method therefor
US7164444B1 (en) * 2002-05-17 2007-01-16 Foveon, Inc. Vertical color filter detector group with highlight detector
US20040017497A1 (en) * 2002-07-19 2004-01-29 Nobuo Suzuki Solid-state image pick-up device
US20050017154A1 (en) * 2003-07-15 2005-01-27 Takehiko Ozumi Solid-state imaging device and method for driving the same
US20050178948A1 (en) * 2004-02-13 2005-08-18 Young Optics Inc. Projection method of display device
US20060181623A1 (en) * 2004-12-09 2006-08-17 Hiroki Endo Solid-state image device
US7773137B2 (en) * 2004-12-16 2010-08-10 Fujitsu Semiconductor Limited Imaging apparatus, imaging element, and image processing method
US20080160723A1 (en) * 2005-09-13 2008-07-03 Lumiense Photonics Inc. Method of fabricating silicon/dielectric multi-layer semiconductor structures using layer transfer technology and also a three-dimensional multi-layer semiconductor device and stacked layer type image sensor using the same method, and a method of manufacturing a three-dimensional multi-layer semiconductor device and the stack type image sensor
US20070131987A1 (en) * 2005-12-09 2007-06-14 Dongbu Electronics Co., Ltd. Vertical image sensor and method for manufacturing the same
US20070218578A1 (en) * 2006-03-17 2007-09-20 Sharp Laboratories Of America, Inc. Real-time CMOS imager having stacked photodiodes fabricated on SOI wafer
US7419844B2 (en) * 2006-03-17 2008-09-02 Sharp Laboratories Of America, Inc. Real-time CMOS imager having stacked photodiodes fabricated on SOI wafer
US20080303072A1 (en) * 2006-03-17 2008-12-11 Sharp Laboratories Of America, Inc. CMOS Active Pixel Sensor
US20080280388A1 (en) * 2006-04-28 2008-11-13 Kenji Ishida CCD type solid-state imaging device and method for manufacturing the same
US7839437B2 (en) * 2006-05-15 2010-11-23 Sony Corporation Image pickup apparatus, image processing method, and computer program capable of obtaining high-quality image data by controlling imbalance among sensitivities of light-receiving devices
US20080149976A1 (en) * 2006-12-22 2008-06-26 Su Lim Vertical type cmos iamge sensor and method of manufacturing the same
US7786426B2 (en) * 2007-06-06 2010-08-31 Sony Corporation Imaging device with a color filter that contains a layer only covering the surrounding areas
US20090009637A1 (en) * 2007-06-12 2009-01-08 Tetsu Wada Imaging apparatus
US20090015693A1 (en) * 2007-06-28 2009-01-15 Fujifilm Corporation Signal processing apparatus, image pickup apparatus and computer readable medium
US20090040353A1 (en) * 2007-08-10 2009-02-12 Takeshi Yamamoto Imaging apparatus and method of driving solid-state imaging device
US7952636B2 (en) * 2007-09-06 2011-05-31 Fujifilm Corporation Method for driving solid-state imaging device and imaging apparatus
US20090079855A1 (en) * 2007-09-20 2009-03-26 Victor Company Of Japan, Ltd. Imaging apparatus and method of processing video signal
US20090085135A1 (en) * 2007-09-27 2009-04-02 Bang Sun Kyung Image Sensor and Manufacturing Method Thereof
US8125547B2 (en) * 2007-11-22 2012-02-28 Fujifilm Corporation Driving method of solid-state imaging device, solid-state imaging device, and imaging apparatus including photoelectric conversion elements for luminance detection
US20090135167A1 (en) * 2007-11-26 2009-05-28 Sony Corporation Display device and electronic apparatus
US20090152604A1 (en) * 2007-12-13 2009-06-18 Semiconductor Manufacturing International (Shanghai) Corporation System and method for sensing image on CMOS
US20090160981A1 (en) * 2007-12-20 2009-06-25 Micron Technology, Inc. Apparatus including green and magenta pixels and method thereof
US20090200584A1 (en) * 2008-02-04 2009-08-13 Tweet Douglas J Full Color CMOS Imager Filter
US20090200626A1 (en) * 2008-02-08 2009-08-13 Omnivision Technologies Inc. Backside illuminated imaging sensor with vertical pixel sensor
US20090213256A1 (en) * 2008-02-26 2009-08-27 Sony Corporation Solid-state imaging device and camera
US20090219410A1 (en) * 2008-02-28 2009-09-03 Sheng Teng Hsu CMOS Imager Flush Reset
WO2010070869A1 (en) * 2008-12-19 2010-06-24 パナソニック株式会社 Image pickup device
US20110037869A1 (en) * 2008-12-19 2011-02-17 Masao Hiramoto Image capture device
US20100157116A1 (en) * 2008-12-22 2010-06-24 Sony Corporation Solid-state image pickup device and electronic apparatus using the same
US20110249158A1 (en) * 2010-04-13 2011-10-13 Jaroslav Hynecek Image sensor pixels with vertical charge transfer

Cited By (44)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120025080A1 (en) * 2010-07-30 2012-02-02 Changmeng Liu Color correction circuitry and methods for dual-band imaging systems
US8357899B2 (en) * 2010-07-30 2013-01-22 Aptina Imaging Corporation Color correction circuitry and methods for dual-band imaging systems
US8582006B2 (en) * 2011-01-21 2013-11-12 Aptina Imaging Corporation Pixel arrangement for extended dynamic range imaging
US20200236312A1 (en) * 2012-01-13 2020-07-23 Nikon Corporation Solid-state imaging device and electronic camera
US11588991B2 (en) * 2012-01-13 2023-02-21 Nikon Corporation Solid-state imaging device and electronic camera
US20230171517A1 (en) * 2012-01-13 2023-06-01 Nikon Corporation Solid-state imaging device and electronic camera
CN104205808A (en) * 2012-03-30 2014-12-10 株式会社尼康 Imaging device and image sensor
US20150222833A1 (en) * 2012-03-30 2015-08-06 Nikon Corporation Image-capturing device and image sensor
US9826183B2 (en) * 2012-03-30 2017-11-21 Nikon Corporation Image-capturing device and image sensor
US10389959B2 (en) * 2012-03-30 2019-08-20 Nikon Corporation Image-capturing device and image sensor
JP2016028457A (en) * 2012-04-02 2016-02-25 ソニー株式会社 Solid-state image pickup device and electronic device
US20140146181A1 (en) * 2012-11-29 2014-05-29 Hyundai Motor Company Apparatus and method for acquiring differential image
US10051195B2 (en) * 2012-11-29 2018-08-14 Hyundai Motor Company Apparatus and method for acquiring differential image
US11641521B2 (en) 2013-03-25 2023-05-02 Sony Group Corporation Image sensor and electronic apparatus
EP2784820A1 (en) * 2013-03-25 2014-10-01 Kabushiki Kaisha Toshiba Solid state imaging device
US11962902B2 (en) 2013-03-25 2024-04-16 Sony Group Corporation Image sensor and electronic apparatus
US10367992B2 (en) * 2013-03-25 2019-07-30 Sony Corporation Image sensor and electronic apparatus
US20150138366A1 (en) * 2013-11-21 2015-05-21 Aptina Imaging Corporation Imaging systems with visible light sensitive pixels and infrared light sensitive pixels
US10136107B2 (en) * 2013-11-21 2018-11-20 Semiconductor Components Industries, Llc Imaging systems with visible light sensitive pixels and infrared light sensitive pixels
JP2015179731A (en) * 2014-03-19 2015-10-08 株式会社東芝 Solid state imaging apparatus
US9300937B2 (en) * 2014-06-26 2016-03-29 Pixart Imaging (Penang) Sdn, Bhd. Color image sensor and operating method thereof
US20150381957A1 (en) * 2014-06-26 2015-12-31 Pixart Imaging (Penang) Sdn.Bhd. Color image sensor and operating method thereof
US9666620B2 (en) * 2014-10-06 2017-05-30 Visera Technologies Company Limited Stacked filter and image sensor containing the same
CN105789227A (en) * 2014-10-06 2016-07-20 采钰科技股份有限公司 Stacked filter and image sensor containing the same
TWI563647B (en) * 2014-10-06 2016-12-21 Visera Technologies Co Ltd Stacked filters and image sensors containing the same
US9679933B2 (en) 2014-10-06 2017-06-13 Visera Technologies Company Limited Image sensors and methods of forming the same
US20180337216A1 (en) * 2014-11-27 2018-11-22 Sony Semiconductor Solutions Corporation Solid-state imaging device and electronic apparatus
US10700132B2 (en) * 2014-11-27 2020-06-30 Sony Semiconductor Solutions Corporation Solid-state imaging device and electronic apparatus
US9793306B2 (en) 2015-04-08 2017-10-17 Semiconductor Components Industries, Llc Imaging systems with stacked photodiodes and chroma-luma de-noising
US10644073B2 (en) * 2016-12-19 2020-05-05 Samsung Electronics Co., Ltd. Image sensors and electronic devices including the same
US20180301484A1 (en) * 2017-04-17 2018-10-18 Semiconductor Components Industries, Llc Image sensors with high dynamic range and autofocusing hexagonal pixels
US20180315788A1 (en) * 2017-05-01 2018-11-01 Visera Technologies Company Limited Image sensor
US20180358393A1 (en) * 2017-06-13 2018-12-13 Renesas Electronics Corporation Solid-state imaging element and manufacturing method thereof
US20190199438A1 (en) * 2017-08-24 2019-06-27 Molex, Llc Free space optical data transmission using photodetector array
US10756815B2 (en) * 2017-08-24 2020-08-25 Molex, Llc Free space optical data transmission using photodetector array
US10795505B2 (en) * 2018-02-09 2020-10-06 Boe Technology Group Co., Ltd. Touch device
US20190250729A1 (en) * 2018-02-09 2019-08-15 Boe Technology Group Co., Ltd. Touch device
CN108089774A (en) * 2018-02-09 2018-05-29 京东方科技集团股份有限公司 A kind of touch control device
US11252345B2 (en) * 2018-02-11 2022-02-15 Zhejiang Uniview Technologies Co., Ltd Dual-spectrum camera system based on a single sensor and image processing method
US10424568B1 (en) 2018-06-19 2019-09-24 Globalfoundries Singapore Pte. Ltd. Image sensor with stacked SPAD and method for producing the same
CN109141632A (en) * 2018-11-06 2019-01-04 德淮半导体有限公司 Pixel unit, imaging sensor and its manufacturing method and imaging device
CN110797366A (en) * 2019-11-14 2020-02-14 Oppo广东移动通信有限公司 Pixel structure, complementary metal oxide semiconductor image sensor and terminal
US20220020795A1 (en) * 2020-07-14 2022-01-20 SK Hynix Inc. Image sensing device
US11817468B2 (en) * 2020-07-14 2023-11-14 SK Hynix Inc. Image sensing device

Also Published As

Publication number Publication date
US20150311242A1 (en) 2015-10-29
US10032810B2 (en) 2018-07-24

Similar Documents

Publication Publication Date Title
US10032810B2 (en) Image sensor with dual layer photodiode structure
US10015416B2 (en) Imaging systems with high dynamic range and phase detection pixels
US8035708B2 (en) Solid-state imaging device with an organic photoelectric conversion film and imaging apparatus
JP6584451B2 (en) RGBC color filter array pattern to minimize color aliasing
US8339489B2 (en) Image photographing apparatus, method and medium with stack-type image sensor, complementary color filter, and white filter
TWI519161B (en) Image sensor and imaging sensing process
US7348539B2 (en) Image sensor for semiconductor light-sensing device and image processing apparatus using the same
US9143760B2 (en) Solid-state imaging device
US20120189293A1 (en) Imaging devices having arrays of image sensors and lenses with multiple aperture sizes
KR101714978B1 (en) Color filter array patterns for reduction of color aliasing
JP2008005488A (en) Camera module
KR20160065464A (en) Color filter array, image sensor having the same and infrared data acquisition method using the same
JP2009088255A (en) Color solid-state imaging device and electronic information equipment
US20090295962A1 (en) Image sensor having differing wavelength filters
JP2006165362A (en) Solid-state imaging element
WO2010100896A1 (en) Image pickup device and solid-state image pickup element of the type illuminated from both faces
US20220028918A1 (en) Imaging device
JP2004172278A (en) Color solid-state imaging device
US9674493B2 (en) Color image sensor with metal mesh to detect infrared light
KR102219784B1 (en) Color filter array and image sensor having the same
US8885079B2 (en) Back-illuminated solid-state image sensing element, method of manufacturing the same, and imaging device
JP2004186311A (en) Mos-type image sensor and digital camera
JP2009009971A (en) Solid-state imaging apparatus

Legal Events

Date Code Title Description
AS Assignment

Owner name: APTINA IMAGING CORPORATION, CAYMAN ISLANDS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BAI, YINGJUN;SUN, QUN;REEL/FRAME:024612/0585

Effective date: 20100629

AS Assignment

Owner name: SEMICONDUCTOR COMPONENTS INDUSTRIES, LLC, ARIZONA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:APTINA IMAGING CORPORATION;REEL/FRAME:034037/0711

Effective date: 20141023

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION