US20030193594A1 - Image sensor with processor controlled integration time - Google Patents

Image sensor with processor controlled integration time

Info

Publication number
US20030193594A1
Authority
US
United States
Legal status
Abandoned
Application number
US10/383,450
Inventor
Hiok Nam Tay
Current Assignee
Candela Microsystems Inc
Original Assignee
Individual
Application filed by Individual
Priority to US10/383,450
Assigned to CANDELA MICROSYSTEMS, INC. Assignor: TAY, HIOK NAM
Publication of US20030193594A1
Priority to US12/893,032 (US9584739B2)
Status: Abandoned

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 - Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/42 - Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled, by switching between different modes of operation using different resolutions or aspect ratios, e.g. switching between interlaced and non-interlaced mode
    • H04N25/443 - Extracting pixel data from image sensors by controlling scanning circuits, by partially reading an SSIS array, by reading pixels from selected 2D regions of the array, e.g. for windowing or digital zooming
    • H04N25/533 - Control of the SSIS exposure; Control of the integration time by using differing integration times for different sensor regions
    • H04N25/589 - Control of the dynamic range involving two or more exposures acquired sequentially, with different integration times, e.g. short and long exposures
    • H04N25/616 - Noise processing, e.g. detecting, correcting, reducing or removing noise, involving a correlated sampling function, e.g. correlated double sampling [CDS] or triple sampling

Definitions

  • the subject matter disclosed generally relates to the field of semiconductor image sensors.
  • Photographic equipment such as digital cameras and digital camcorders contain electronic image sensors that capture light for processing into a still or video image, respectively.
  • There are two primary types of electronic image sensors: charge coupled devices (CCDs) and complementary metal oxide semiconductor (CMOS) sensors.
  • CCD image sensors have relatively high signal to noise ratios (SNR) that provide quality images.
  • CCDs can be fabricated to have pixel arrays that are relatively small while conforming with most camera and video resolution requirements. A pixel is the smallest discrete element of an image. For these reasons, CCDs are used in most commercially available cameras and camcorders.
  • CMOS sensors are faster and consume less power than CCD devices. Additionally, CMOS fabrication processes are used to make many types of integrated circuits. Consequently, there is a greater abundance of manufacturing capacity for CMOS sensors than CCD sensors.
  • To date there has not been developed a CMOS sensor that has the same SNR and pixel pitch requirements as commercially available CCD sensors.
  • Pixel pitch is the space between the centers of adjacent pixels. It would be desirable to provide a CMOS sensor that has relatively high SNR while providing a commercially acceptable pixel pitch.
  • the image sensor is typically connected to an external processor and external memory.
  • the external memory stores data from the image sensor.
  • the processor processes the stored data.
  • the data includes one or more images generated by exposing the pixels for a predetermined time interval.
  • the exposure time of the pixels is typically controlled by an internal clock(s) of the image sensor.
  • the exposure time of a picture frame is established by a word written into an exposure time register. Changing the exposure time requires writing new data into the register and then reading the data. In video and fast successive still photo shots this technique may create confusion regarding the exposure time of incoming pixel data, thereby creating instability in the system. It would be desirable to provide processor control of the exposure time of the pixels that improves stability and does not require an undesirable number of pins and signals.
  • Camera or camcorder products typically have an auto-focus function.
  • the camera may be designed to process only a “window” of the pixel array.
  • the auto-focus routine may require the window to move around the pixel array of the image sensor. It would be desirable to provide processor control of the window data in a manner that minimizes the pin count and number of signals required for the image sensor.
  • An image sensor coupled to a processor that generates a plurality of control signals.
  • the image sensor includes a pixel array that is arranged into a number of rows.
  • the sensor may also contain a logic circuit that selects a row of the pixel array to generate and retrieve pixel data in response to a first edge and a second edge of the control signals.
  • a time interval between a resetting and a reading of the selected row is proportional to an interval between the first and second edges of the control signals.
  • FIG. 1 is a schematic of an embodiment of an image sensor
  • FIG. 2 is a schematic of an embodiment of a pixel of the image sensor
  • FIG. 3 is a schematic of an embodiment of a light reader circuit of the image sensor
  • FIG. 4 is a flowchart for a first mode of operation of the image sensor
  • FIG. 5 is a timing diagram for the first mode of operation of the image sensor
  • FIG. 6 is a diagram showing the levels of a signal across a photodiode of a pixel
  • FIG. 7 is a schematic for a logic circuit for generating the timing diagrams of FIG. 5;
  • FIG. 8 is a schematic of a logic circuit for generating a RST signal for a row of pixels
  • FIG. 9 is a timing diagram for the logic circuit shown in FIG. 8;
  • FIG. 10 is a flowchart showing a second mode of operation of the image sensor
  • FIG. 11 is a timing diagram for the second mode of operation of the image sensor
  • FIG. 12 is a schematic of an embodiment of a row decoder of the image sensor
  • FIG. 13 is a timing diagram for the row decoder shown in FIG. 12;
  • FIG. 14 is a timing diagram showing the transfer of pixel data when the image sensor is in a low noise mode
  • FIG. 15 is a timing diagram showing the transfer of pixel data when the image sensor is in an extended dynamic range mode
  • FIG. 16 is an illustration of a window of the pixel array
  • FIG. 17 is a timing diagram showing an embedded narrow pulse used to determine a start location of the window.
  • Disclosed is an image sensor that has one or more pixels within a pixel array.
  • the pixels are arranged within a plurality of rows within the array.
  • Each row of the pixel array can be selected by a row decoder in response to an edge of a control signal.
  • the control signal may be one of a plurality of signals generated by a processor coupled to the image sensor.
  • the processor can control the exposure time of the pixels by varying the control signals.
  • the control signals may also have an embedded narrow pulse that is used to determine the location of a “window” in the pixel array.
  • the pixel may be a three transistor structure that minimizes the pixel pitch of the image sensor.
  • the entire image sensor is preferably constructed with CMOS fabrication processes and circuits.
  • the CMOS image sensor has the characteristics of being high speed, low power consumption, small pixel pitch and a high SNR.
  • FIG. 1 shows an image sensor 10 .
  • the image sensor 10 includes a pixel array 12 that contains a plurality of individual photodetecting pixels 14 .
  • the pixels 14 are arranged in a two-dimensional array of rows and columns.
  • the pixel array 12 is coupled to a light reader circuit 16 by a bus 18 and to a row decoder 20 by control lines 22 .
  • the row decoder 20 can select an individual row of the pixel array 12 .
  • the light reader 16 can then read specific-discrete columns within the selected row. Together, the row decoder 20 and light reader 16 allow for the reading of an individual pixel 14 in the array 12 .
  • the light reader 16 may be coupled to an analog to digital converter 24 (ADC) by output line(s) 26 .
  • the ADC 24 generates a digital bit string that corresponds to the amplitude of the signal provided by the light reader 16 and the selected pixels 14 .
  • the ADC 24 may be coupled to line buffers 28 by data lines 30 .
  • the line buffers 28 may include separate pairs of buffers for first image data and second image data.
  • the line buffers 28 are coupled to a data interface 32 that transfers data to a processor 34 over bus 36 .
  • the processor 34 may be coupled to memory 38 by bus 40 .
  • Although the memory 38 is shown coupled to the processor 34, it is to be understood that the system may have other configurations.
  • the processor 34 and memory 38 may be coupled to the interface 32 by separate busses.
  • the data interface 32 may be connected to a control line INTG 42 which provides a control signal from the processor 34 .
  • the control signal may contain a series of pulses that control the transfer of data to the processor 34 .
  • the pixel data may be transferred to the processor 34 in an interleaving manner.
  • the buffers 28 may store pixel data of a first image and a second image.
  • the data interface 32 may interleave the data by sending a first line of the first image and then a first line of the second image and so forth and so on.
  • the image sensor 10 may have registers 44 that store mode and gain values. The values can be provided to the data interface 32 , buffers 28 , light reader 16 and row decoder 20 over lines 46 , 48 , 50 and 52 , respectively. The values can be loaded into the registers 44 through lines 54 , 56 and 58 .
  • the image sensor 10 may also have clock circuits 60 that provide CLK timing signals over line 62 .
  • the light reader circuit 16 may be coupled to a column decoder 64 by control lines 66 .
  • the decoder 64 selects a column within the pixel array 12 to generate and retrieve pixel data from the pixels 14 .
  • the decoder 64 is coupled to a counter 68 by a bus 70 .
  • the counter 68 provides a count value that causes the decoder 64 to switch the selection of a column in the pixel array 12 .
  • Counter 68 is also connected to an input line HD 72 and an output line HDF 74 .
  • the row decoder 20 may include a plurality of row drivers 76 that are coupled to the pixel array 12 .
  • the row drivers 76 may be coupled to decoders 78 and counters 80 .
  • the counters 80 may be coupled to a counter/latch circuit 82 .
  • the row decoder 20 may also include a phase sequence decoder 84 .
  • the phase sequence decoder 84 may be coupled to the light reader 16 , row drivers 76 and decoders 78 by control signals 86 .
  • the row decoder 20 may further include a wide pulse detector 88 and a narrow pulse detector 90 .
  • the wide pulse detector 88 may be connected to the counters 80 by LEAD 92 and LAG 94 control signals, respectively.
  • the narrow pulse detector 90 may be connected to the counter/latch 82 by control signal NP 96 .
  • the pulse detectors 88 and 90 may be connected to the INTG control line 42 that is coupled to the processor 34 .
  • the counter/latch 82 , narrow pulse detector 90 and phase sequence decoder 84 may be connected to the mode line 52 of register 44 .
  • FIG. 2 shows an embodiment of a cell structure for a pixel 14 of the pixel array 12 .
  • the pixel 14 may contain a photodetector 100 .
  • the photodetector 100 may be a photodiode.
  • the photodetector 100 may be connected to a reset transistor 112 .
  • the photodetector 100 may also be coupled to a select transistor 114 through a level shifting transistor 116 .
  • the transistors 112 , 114 and 116 may be field effect transistors (FETs).
  • the gate of reset transistor 112 may be connected to a RST line 118 .
  • the drain node of the transistor 112 may be connected to IN line 120 .
  • the gate of select transistor 114 may be connected to a SEL line 122 .
  • the source node of transistor 114 may be connected to an OUT line 124 .
  • the RST 118 and SEL lines 122 may be common for an entire row of pixels in the pixel array 12 .
  • the IN 120 and OUT 124 lines may be common for an entire column of pixels in the pixel array 12 .
  • the RST line 118 and SEL line 122 are connected to the row decoder 20 and are part of the control lines 22 .
  • FIG. 3 shows an embodiment of a light reader circuit 16 .
  • the light reader 16 may include a plurality of double sampling capacitor circuits 150 each connected to an OUT line 124 of the pixel array 12 .
  • Each double sampling circuit 150 may include a first capacitor 152 and a second capacitor 154 .
  • the first capacitor 152 is coupled to the OUT line 124 and ground GND 1 156 by switches 158 and 160 , respectively.
  • the second capacitor 154 is coupled to the OUT line 124 and ground GND 1 by switches 162 and 164 , respectively.
  • Switches 158 and 160 are controlled by a control line SAM 1 166 .
  • Switches 162 and 164 are controlled by a control line SAM 2 168 .
  • the capacitors 152 and 154 can be connected together to perform a voltage subtraction by closing switch 170 .
  • the switch 170 is controlled by a control line SUB 172 .
  • the double sampling circuits 150 are connected to an operational amplifier 180 by a plurality of first switches 182 and a plurality of second switches 184 .
  • the amplifier 180 has a negative terminal (−) coupled to the first capacitors 152 by the first switches 182 and a positive terminal (+) coupled to the second capacitors 154 by the second switches 184.
  • the operational amplifier 180 has a positive output (+) connected to an output line OP 188 and a negative output (−) connected to an output line OM 186.
  • the output lines 186 and 188 are connected to the ADC 24 (see FIG. 1).
  • the operational amplifier 180 provides an amplified signal that is the difference between the voltage stored in the first capacitor 152 and the voltage stored in the second capacitor 154 of a sampling circuit 150 connected to the amplifier 180 .
  • the gain of the amplifier 180 can be varied by adjusting the variable capacitors 190 .
  • the variable capacitors 190 may be discharged by closing a pair of switches 192 .
  • the switches 192 may be connected to a corresponding control line (not shown). Although a single amplifier is shown and described, it is to be understood that more than one amplifier can be used in the light reader circuit 16 .
  • FIGS. 4 and 5 show an operation of the image sensor 10 in a first mode also referred to as a low noise mode.
  • a reference signal is written into each pixel 14 of the pixel array and then a first reference output signal is stored in the light reader 16 .
  • this can be accomplished by switching the RST 118 and IN 120 lines from a low voltage to a high voltage to turn on transistor 112 .
  • the RST line 118 is driven high for an entire row.
  • IN line 120 is driven high for an entire column.
  • RST line 118 is first driven high while the IN line 120 is initially low.
  • the RST line 118 may be connected to a tri-state buffer (not shown) that is switched to a tri-state when the IN line 120 is switched to a high state. This allows the gate voltage to float to a value that is higher than the voltage on the IN line 120 . This causes the transistor 112 to enter the triode region. In the triode region the voltage across the photodiode 100 is approximately the same as the voltage on the IN line 120 . Generating a higher gate voltage allows the photodetector to be reset at a level close to Vdd. CMOS sensors of the prior art reset the photodetector to a level of Vdd-Vgs, where Vgs can be up to 1 V.
  • the SEL line 122 is also switched to a high voltage level which turns on transistor 114 .
  • the voltage of the photodiode 100 is provided to the OUT line 124 through level shifter transistor 116 and select transistor 114 .
  • the SAM 1 control line 166 of the light reader 16 (see FIG. 3) is selected so that the voltage on the OUT line 124 is stored in the first capacitor 152 .
  • in process block 302 the pixels of the pixel array are then reset and reset output signals are then stored in the light reader 16.
  • this can be accomplished by driving the RST line 118 low to turn off the transistor 112 and reset the pixel 14 . Turning off the transistor 112 will create reset noise, charge injection and clock feedthrough voltage that resides across the photodiode 100 . As shown in FIG. 6 the noise reduces the voltage at the photodetector 100 when the transistor 112 is reset.
  • the SAM 2 line 168 is driven high, the SEL line 122 is driven low and then high again, so that a level shifted voltage of the photodiode 100 is stored as a reset output signal in the second capacitor 154 of the light reader circuit 16 .
  • Process blocks 300 and 302 are repeated for each pixel 14 in the array 12 .
  • the reset output signals are then subtracted from the first reference output signals to create noise output signals that are then converted to digital bit strings by ADC 24 .
  • the digital output data can be stored within the line buffers 28 and eventually transferred and stored within the external memory 38 .
  • the noise signals may be referred to as a first image.
  • the subtraction process can be accomplished by closing switches 182 , 184 and 170 of the light reader circuit 16 (FIG. 3) to subtract the voltage across the second capacitor 154 from the voltage across the first capacitor 152 .
  • light response output signals are sampled from the pixels 14 of the pixel array 12 and stored in the light reader circuit 16 .
  • the light response output signals correspond to the optical image that is being detected by the image sensor 10 . Referring to FIGS. 2, 3 and 5 this can be accomplished by having the IN 120 , SEL 122 and SAM 2 lines 168 in a high state and RST 118 in a low state.
  • the second capacitor 152 of the light reader circuit 16 stores a level shifted voltage of the photodiode 100 as the light response output signal.
  • a second reference output signal is then generated in the pixels 14 and stored in the light reader circuit 16 .
  • this can be accomplished similar to generating and storing the first reference output signal.
  • the RST line 118 is first driven high and then into a tri-state.
  • the IN line 120 is then driven high to cause the transistor 112 to enter the triode region so that the voltage across the photodiode 100 is the voltage on IN line 120 .
  • the SEL 122 and SAM 2 168 lines are then driven high to store the second reference output voltage in the first capacitor 154 of the light reader circuit 16 .
  • Process blocks 306 and 308 are repeated for each pixel 14 in the array 12 .
  • the light response output signal is subtracted from the second reference output signal to create a normalized light response output signal.
  • the normalized light response output signal is converted into a digital bit string to create normalized light output data that is transferred to the processor 34 .
  • the normalized light response output signals may be referred to as a second image.
  • the subtraction process can be accomplished by closing switches 170 , 182 and 184 of the light reader 16 to subtract the voltage across the first capacitor 152 from the voltage across the second capacitor 154 .
  • the difference is then amplified by amplifier 180 and converted into a digital bit string by ADC 24 as light response data.
  • the noise data is retrieved from memory 38 .
  • the noise data, first image is combined (subtracted) with the normalized light output data, second image, by the processor 34 .
  • the noise data corresponds to the first image and the normalized light output data corresponds to the second image.
  • the second reference output signal is the same or approximately the same as the first reference output signal such that the present technique subtracts the noise data, due to reset noise, charge injection and clock feedthrough, from the normalized light response signal. This improves the signal to noise ratio of the final image data.
  • the process described is performed in a sequence across the various rows of the pixels in the pixel array 12 .
  • the n-th row in the pixel array may be generating noise signals while the (n-l)-th row generates normalized light response signals, where l is the exposure duration in multiples of a line period.
  • the various control signals RST, SEL, IN, SAM 1 , SAM 2 and SUB can be generated in the circuit generally referred to as the phase sequence decoder 84 .
  • FIG. 7 shows an embodiment of logic to generate the IN, SEL, SAM 1 , SAM 2 and RST signals in accordance with the timing diagram of FIG. 5.
  • the logic may include a plurality of comparators 350 with one input connected to a counter 68 and another input connected to hardwired signals that contain a lower count value and an upper count value.
  • the counter 68 sequentially generates a count.
  • the comparators 350 compare the present count with the lower and upper count values. If the present count is between the lower and upper count values the comparators 350 output a logical 1.
  • the comparators 350 are connected to a plurality of AND gates 356 and OR gates 358.
  • the OR gates 358 are connected to latches 360 .
  • the latches 360 provide the corresponding IN, SEL, SAM 1 , SAM 2 and RST signals.
  • the AND gates 356 are also connected to a mode line 364 . To operate in accordance with the timing diagram shown in FIG. 5, the mode line 364 is set at a logic 1.
  • the latches 360 switch between a logic 0 and a logic 1 in accordance with the logic established by the AND gates 356, OR gates 358, comparators 350 and the present count of the counter 352.
  • the hardwired signals for the comparator coupled to the IN latch may contain a lower count value of 6 and an upper count value of 24. If the count from the counter is greater than or equal to 6 but less than 24 the comparator 350 will provide a logic 1 that will cause the IN latch 360 to output a logic 1.
  • the lower and upper count values establish the sequence and duration of the pulses shown in FIG. 5.
  • the mode line 364 can be switched to a logic 0 which causes the image sensor to function in a second mode.
  • the sensor 10 may have a plurality of reset RST(n) drivers 370 , each driver 370 being connected to a row of pixels.
  • FIGS. 8 and 9 show an exemplary driver circuit 370 and the operation of the circuit 370 .
  • Each driver 370 may have a pair of NOR gates 372 that are connected to the RST and SAM 1 latches shown in FIG. 7.
  • the NOR gates control the state of a tri-state buffer 374 .
  • the tri-state buffer 374 is connected to the reset transistors in a row of pixels.
  • the input of the tri-state buffer is connected to an AND gate 376 that is connected to the RST latch and a row enable ROWEN(n) line.
  • FIGS. 10 and 11 show operation of the image sensor in a second mode also referred to as an extended dynamic range mode.
  • in this mode the image provides a sufficient amount of optical energy so that the SNR is adequate even without the noise cancellation technique described in FIGS. 4 and 5.
  • the noise cancellation technique shown in FIGS. 4 and 5 can be utilized while the image sensor 10 is in the extended dynamic range mode.
  • the extended dynamic range mode has both a short exposure period and a long exposure period. Referring to FIG. 10, in block 400 each pixel 14 is reset to start a short exposure period.
  • the mode of the image sensor can be set by the processor 34 through register 44 to determine whether the sensor should be in the low noise mode, or the extended dynamic range mode.
  • a short exposure output signal is generated in the selected pixel and stored in the second capacitor 154 of the light reader circuit 16 .
  • the selected pixel is then reset.
  • the level shifted reset voltage of the photodiode 100 is stored in the first capacitor 152 of the light reader circuit 16 as a reset output signal.
  • the short exposure output signal is subtracted from the reset output signal in the light reader circuit 16 .
  • the difference between the short exposure signal and the reset signal is converted into a binary bit string by ADC 24 and stored into the external memory 38 .
  • the short exposure data corresponds to the first image pixel data. Then each pixel is again reset to start a long exposure period.
  • the light reader circuit 16 stores a long exposure output signal from the pixel in the second capacitor 154 .
  • the pixel is reset and the light reader circuit 16 stores the reset output signal in the first capacitor 152 .
  • the long exposure output signal is subtracted from the reset output signal, amplified and converted into a binary bit string by ADC 24 as long exposure data.
  • the short exposure data is retrieved from memory 38 .
  • the short exposure data is combined with the long exposure data by the processor 34 .
  • the data may be combined in a number of different manners.
  • the external processor 34 may first analyze the image with the long exposure data. The photodiodes may be saturated if the image is too bright. This would normally result in a “washed out” image.
  • the processor 34 can process the long exposure data to determine whether the image is washed out, if so, the processor 34 can then use the short exposure image data.
  • the processor 34 can also use both the long and short exposure data to compensate for saturated portions of the detected image.
  • the image may be initially set to all zeros.
  • the processor 34 analyzes the long exposure data. If the long exposure data does not exceed a threshold then the N least significant bits (LSB) of the image are replaced with all N bits of the long exposure data. If the long exposure data does exceed the threshold then the N most significant bits (MSB) of the image are replaced by all N bits of the short exposure data.
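  • As a concrete illustration of the combination rule above, the following sketch packs the two exposures into one extended-range value. The sample width N, the threshold and all names are assumptions for illustration; the patent does not give numeric values.

```python
N = 10                      # bits per ADC sample (assumed)
MASK = (1 << N) - 1

def combine(long_sample: int, short_sample: int, threshold: int = 1000) -> int:
    """Start from an all-zero image word, then fill either the N LSBs with the
    long exposure data or the N MSBs with the short exposure data, depending on
    whether the long exposure exceeded the threshold."""
    image = 0
    if long_sample <= threshold:              # long exposure did not exceed the threshold
        image |= long_sample & MASK           # N least significant bits <- long exposure
    else:                                     # long exposure saturated / too bright
        image |= (short_sample & MASK) << N   # N most significant bits <- short exposure
    return image

print(combine(512, 300))    # 512 (dark pixel: long exposure kept)
print(combine(1023, 300))   # 307200 = 300 << 10 (bright pixel: short exposure kept)
```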
  • FIG. 11 shows the timing of data generation and retrieval for the long and short exposure data.
  • FIG. 11 shows timing of data generation and retrieval wherein an n-th row of pixels starts a short exposure, the (n-k)-th row ends the short exposure period and starts the long exposure period, and the (n-k-l)-th row of pixels ends the long exposure period, where k is the short exposure duration in multiples of the line period and l is the long exposure duration in multiples of the line period.
  • the processor 34 begins to retrieve short exposure data for the pixels in row (n-k) at the same time as the (n-k-l)-th row in the pixel array is completing the long exposure period.
  • the light reader circuit 16 retrieves the short exposure output signals from the (n-k)-th row of the pixel array 12 as shown by the enablement of signals SAM 1 , SAM 2 , SEL(n-k) and RST(n-k). The light reader circuit 16 then retrieves the long exposure data of the (n-k-l)-th row.
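  • A small helper, under the stated definitions of k and l, that maps a line period n to the rows at each stage of this rolling schedule; the function and its name are illustrative only.

```python
def active_rows(n: int, k: int, l: int) -> dict:
    """Rows at each stage during line period n, where k and l are the short and
    long exposure durations in multiples of the line period."""
    return {
        "starting_short_exposure": n,
        "ending_short_starting_long": n - k,
        "ending_long_exposure": n - k - l,
    }

# With a 2-line short exposure and a 6-line long exposure, while row 10 starts
# its short exposure, row 8 starts its long exposure and row 2 is read out.
print(active_rows(10, k=2, l=6))
```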
  • the dual modes of the image sensor 10 can compensate for varying brightness in the image.
  • the output signals from the pixels are relatively low. This would normally reduce the SNR of the resultant data provided by the sensor, assuming the average noise is relatively constant.
  • the noise compensation scheme shown in FIGS. 4 and 5 improve the SNR of the output data so that the image sensor provides a quality picture even when the subject image is relatively dark.
  • the extended dynamic range mode depicted in FIGS. 10 and 11 compensates for such brightness to provide a quality picture.
  • FIG. 12 shows an embodiment of a row driver 76 and a decoder 78 of the row decoder 20.
  • the decoder 78 may contain an address decoder 500 and a latch 502.
  • the input of the latch 502 is connected to input lines CLR 504 , D 0 , D 1 506 from the phase decoder circuit 84 (see FIG. 1) and the output line LE 508 of the address decoder 500 .
  • although a phase decoder circuit 84 is shown and described, it is to be understood that any state value generator may be utilized.
  • the input of the driver 76 is connected to output lines Q 0 , Q 1 510 of the latch 502 and input lines RST 512 and SEL 514 from the phase sequence decoder 84 .
  • the latches 502 for each row of pixels are all connected to the phase decoder circuit 84 by the same common control lines 504 and 506 .
  • the common control lines 504 and 506 minimize the lines, transistors and space required by the row decoder while providing a means for loading the state values with a time division multiplexing process.
  • the address decoder 500 is coupled to a multiplexor 520 by an address bus 522 .
  • the address decoder 500 is also connected to control lines PRE# 524 and EVA# 526 from the phase sequence decoder 84 .
  • the multiplexor 520 may have three input address busses 528 , 530 and 532 .
  • the address busses 528 , 530 and 532 are connected to a first counter 534 , a second counter 536 and a third counter 538 , respectively.
  • although counters 534, 536 and 538 are shown and described, it is to be understood that any address generator may be implemented.
  • the output of the multiplexor 520 is switched between the busses 528 , 530 and 532 by a control line PA 540 from the phase sequence decoder 84 .
  • the multiplexor 520 provides a time division multiplexing means for selecting a row of the pixel array with a reduced number of lines and transistors which minimizes the size of the image sensor.
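  • The behavioral sketch below models this time division multiplexing: a shared address is switched between the three counters by the PA phase, and each per-row address decoder enables its latch only on a match. The phase encoding, class names and counter values are assumptions, not taken from the patent.

```python
class RowDecoderModel:
    """Toy model of one decoder 78: an address decoder plus a latch."""

    def __init__(self, row_address: int):
        self.row_address = row_address    # address hardwired in address decoder 500
        self.latched_state = None         # state values held by latch 502

    def clock(self, pa_phase: int, counters: list, state_values: tuple):
        muxed_address = counters[pa_phase]        # multiplexor 520 selects one counter
        if muxed_address == self.row_address:     # match -> LE enables the latch
            self.latched_state = state_values     # latch loads D0/D1 from the phase decoder

rows = [RowDecoderModel(addr) for addr in range(4)]
counter_values = [2, 0, 3]        # first, second and third counters (illustrative values)
for row in rows:
    row.clock(pa_phase=0, counters=counter_values, state_values=(1, 0))
print([row.latched_state for row in rows])   # only the row with address 2 latches (1, 0)
```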
  • FIGS. 13 and 14 show an operation of the row decoder 20 and transfer of pixel data. As shown in FIG. 14, the integration time and transfer of data are dependent on the control signal INTG from the processor 34. Making the integration time and data transfer dependent on the control signal INTG allows the processor 34 to control and vary these parameters.
  • the INTG control signal contains a plurality of pulses each with a falling edge and a rising edge. Referring to FIGS. 1, 12, 13 and 14 , a falling edge is detected by the wide pulse detector 88 , which generates an output on the LEAD control line 92 .
  • the LEAD control signal starts the first counter 534 .
  • the first counter 534 outputs an address that is provided to the multiplexor 520 .
  • the PA control signal switches some of the multiplexors 520 to provide the address from the first counter 534 to the corresponding address decoders 500 . If the address from the first counter 534 matches a stored address within the address decoder 500 the decoder 500 will enable the latch 502 through line LE 508 .
  • the latch 502 loads state values Q 0 and Q 1 into the row driver 76 .
  • the output state values correspond to state values D 0 and D 1 that were previously loaded into the latch 502 from the phase sequence decoder 84 . When in low noise mode the state values allow for the RST and SEL signals to pass through the driver 76 to the selected row to generate and retrieve reference and reset signals, the first image.
  • the first counter 534 continues to output new address values which in turn sequentially select rows of the pixel array 12 to allow for the generation and retrieval of reference and noise signals for each row.
  • the falling edge of the INTG control signal also enables the transfer of the first image to the processor 34 from the data interface 32 . The process continues until all of the first image data is transferred to the processor 34 , and stored in memory 38 .
  • a rising edge of a pulse is detected by the wide pulse detector 88 which generates an output on the LAG control line 94 .
  • the LAG signal initiates the second counter 536 .
  • the second counter 536 provides addresses that are provided to the multiplexors 520 of each row.
  • the multiplexors 520 mux the addresses to the decoders 500 . If the addresses match, the latch 502 is enabled to load state values into the row drivers 76 . When in the low noise mode the state values allow for the generation and retrieval of light response and reference signals for the second image.
  • the rising edge also enables the data interface 32 to transfer the second image data to the processor 34 . As shown in FIG. 14, the transfer of first and second image data may overlap.
  • the interface 32 can transfer the overlapping data to the processor 34 in an interleaving manner.
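  • The sketch below is a simplified, assumed model of this edge-driven sequencing: a falling edge of INTG (LEAD) starts sequential row addressing for the first image and a rising edge (LAG) starts it for the second image, so the two read-outs can overlap and be interleaved. It abstracts away the wide pulse detector 88 and counters 534 and 536.

```python
def sequence_rows(intg_levels, num_rows):
    """intg_levels: INTG sampled once per line period (1 = high, 0 = low)."""
    events, first_row, second_row = [], None, None
    previous = intg_levels[0]
    for t, level in enumerate(intg_levels[1:], start=1):
        if previous == 1 and level == 0:          # falling edge -> LEAD -> first counter
            first_row = 0
        if previous == 0 and level == 1:          # rising edge -> LAG -> second counter
            second_row = 0
        if first_row is not None and first_row < num_rows:
            events.append((t, "first image, row", first_row))
            first_row += 1
        if second_row is not None and second_row < num_rows:
            events.append((t, "second image, row", second_row))
            second_row += 1
        previous = level
    return events

# One wide pulse: INTG drops for three line periods and then returns high;
# the first- and second-image read-outs overlap at line period 4.
for event in sequence_rows([1, 0, 0, 0, 1, 1, 1, 1], num_rows=4):
    print(event)
```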
  • FIG. 15 shows the transfer of data when the image sensor 10 is in the extended dynamic range mode.
  • the INTG control signal includes a narrow pulse between wide pulses. Short exposure is initiated by the falling edge of a wide pulse. The narrow pulse is detected by the narrow pulse detector 90 which initiates the third counter 538 .
  • the third counter 538 provides addresses which are decoded by matching decoders 500 to enable corresponding latches 502 .
  • the enabled latches 502 load state values into the row drivers 76 that allow for the generation and retrieval of long exposure and reference signals of the second image.
  • the narrow pulse also enables the data interface 32 to transfer the short exposure and reference signals of the first image to the processor 34 .
  • the processor 34 can change the exposure time by varying the width of the pulses in the control signal.
  • the variation in pulse width is an integer multiple of the line period so that the change in pulse width is in synchronization with the signals generated by the phase sequence decoder 84.
  • the exposure time can be varied by changing the location of the narrow pulse.
  • the image sensor may generate data within a window 550 of the pixel array 12 .
  • the window 550 is an area typically offset from the first row of the pixel array 12 .
  • the window information may be provided to the processor 34 to auto-focus the camera. In auto-focus mode the window offset may vary to capture different parts of the image.
  • FIG. 17 shows an INTG control signal with an embedded narrow pulse that is used to determine the offset location of the window 550.
  • the narrow pulse detector 90 detects the embedded narrow pulse and provides a START control signal to the counter/latch 82 on the NP control line 96 .
  • the wide pulse detector 88 detects the rising edge of the next pulse and provides a STOP control signal to the counter/latch 82 on the LAG control line 94 .
  • the counter/latch 82 uses the START and STOP control signals to determine the offset for the window.
  • An offset value is loaded into the counters 534 , 536 and 538 to provide an initial count value.
  • the processor 34 can control the window offset by varying the location of the embedded narrow pulse within the control signal.
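  • A hedged sketch of the offset recovery: the counter/latch counts from the embedded narrow pulse (START on NP 96) to the rising edge of the next wide pulse (STOP on LAG 94), and that count seeds the row address counters. Counting in line periods is an assumption for illustration.

```python
def window_offset(narrow_pulse_time: float, next_rising_edge_time: float,
                  line_period: float = 1.0) -> int:
    """Number of line periods between the START and STOP events; this value is
    loaded into counters 534, 536 and 538 as their initial count so that row
    addressing begins at the top of the window rather than at row 0."""
    return int(round((next_rising_edge_time - narrow_pulse_time) / line_period))

# Moving the narrow pulse within the control signal changes the count and
# therefore the window offset.
print(window_offset(narrow_pulse_time=10.0, next_rising_edge_time=74.0))   # offset of 64 rows
```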

Abstract

An image sensor that has one or more pixels within a pixel array. The pixels are arranged within a plurality of rows within the array. Each row of the pixel array can be selected by a row decoder in response to an edge of a control signal. The control signal may be one of a plurality of signals generated by a processor coupled to the image sensor. The processor can control the exposure time of the pixels by varying the control signals. The control signals may also have an embedded narrow pulse that is used to determine the location of a “window” in the pixel array.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This application claims priority under 35 U.S.C. § 119(e) to provisional application No. 60/372,902 filed on Apr. 16, 2002. [0001]
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention [0002]
  • The subject matter disclosed generally relates to the field of semiconductor image sensors. [0003]
  • 2. Background Information [0004]
  • Photographic equipment such as digital cameras and digital camcorders contain electronic image sensors that capture light for processing into a still or video image, respectively. There are two primary types of electronic image sensors, charge coupled devices (CCDs) and complementary metal oxide semiconductor (CMOS) sensors. CCD image sensors have relatively high signal to noise ratios (SNR) that provide quality images. Additionally, CCDs can be fabricated to have pixel arrays that are relatively small while conforming with most camera and video resolution requirements. A pixel is the smallest discrete element of an image. For these reasons, CCDs are used in most commercially available cameras and camcorders. [0005]
  • CMOS sensors are faster and consume less power than CCD devices. Additionally, CMOS fabrication processes are used to make many types of integrated circuits. Consequently, there is a greater abundance of manufacturing capacity for CMOS sensors than CCD sensors. [0006]
  • To date there has not been developed a CMOS sensor that has the same SNR and pixel pitch requirements as commercially available CCD sensors. Pixel pitch is the space between the centers of adjacent pixels. It would be desirable to provide a CMOS sensor that has relatively high SNR while providing a commercially acceptable pixel pitch. [0007]
  • The image sensor is typically connected to an external processor and external memory. The external memory stores data from the image sensor. The processor processes the stored data. The data includes one or more images generated by exposing the pixels for a predetermined time interval. The exposure time of the pixels is typically controlled by an internal clock(s) of the image sensor. [0008]
  • The exposure time of a picture frame is established by a word written into an exposure time register. Changing the exposure time requires writing new data into the register and then reading the data. In video and fast successive still photo shots this technique may create confusion regarding the exposure time of incoming pixel data, thereby creating instability in the system. It would be desirable to provide processor control of the exposure time of the pixels that improves stability and does not require an undesirable number of pins and signals. [0009]
  • Camera or camcorder products typically have an auto-focus function. To increase the speed of an auto-focus cycle the camera may be designed to process only a “window” of the pixel array. The auto-focus routine may require the window to move around the pixel array of the image sensor. It would be desirable to provide processor control of the window data in a manner that minimizes the pin count and number of signals required for the image sensor. [0010]
  • BRIEF SUMMARY OF THE INVENTION
  • An image sensor coupled to a processor that generates a plurality of control signals. The image sensor includes a pixel array that is arranged into a number of rows. The sensor may also contain a logic circuit that selects a row of the pixel array to generate and retrieve pixel data in response to a first edge and a second edge of the control signals. A time interval between a resetting and a reading of the selected row is proportional to an interval between the first and second edges of the control signals. [0011]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a schematic of an embodiment of an image sensor; [0012]
  • FIG. 2 is a schematic of an embodiment of a pixel of the image sensor; [0013]
  • FIG. 3 is a schematic of an embodiment of a light reader circuit of the image sensor; [0014]
  • FIG. 4 is a flowchart for a first mode of operation of the image sensor; [0015]
  • FIG. 5 is a timing diagram for the first mode of operation of the image sensor; [0016]
  • FIG. 6 is a diagram showing the levels of a signal across a photodiode of a pixel; [0017]
  • FIG. 7 is a schematic for a logic circuit for generating the timing diagrams of FIG. 5; [0018]
  • FIG. 8 is a schematic of a logic circuit for generating a RST signal for a row of pixels; [0019]
  • FIG. 9 is a timing diagram for the logic circuit shown in FIG. 8; [0020]
  • FIG. 10 is a flowchart showing a second mode of operation of the image sensor; [0021]
  • FIG. 11 is a timing diagram for the second mode of operation of the image sensor; [0022]
  • FIG. 12 is a schematic of an embodiment of a row decoder of the image sensor; [0023]
  • FIG. 13 is a timing diagram for the row decoder shown in FIG. 12; [0024]
  • FIG. 14 is a timing diagram showing the transfer of pixel data when the image sensor is in a low noise mode; [0025]
  • FIG. 15 is a timing diagram showing the transfer of pixel data when the image sensor is in an extended dynamic range mode; [0026]
  • FIG. 16 is an illustration of a window of the pixel array; [0027]
  • FIG. 17 is a timing diagram showing an embedded narrow pulse used to determine a start location of the window. [0028]
  • DETAILED DESCRIPTION
  • Disclosed is an image sensor that has one or more pixels within a pixel array. The pixels are arranged within a plurality of rows within the array. Each row of the pixel array can be selected by a row decoder in response to an edge of a control signal. The control signal may be one of a plurality of signals generated by a processor coupled to the image sensor. The processor can control the exposure time of the pixels by varying the control signals. The control signals may also have an embedded narrow pulse that is used to determine the location of a “window” in the pixel array. [0029]
  • The pixel may be a three transistor structure that minimizes the pixel pitch of the image sensor. The entire image sensor is preferably constructed with CMOS fabrication processes and circuits. The CMOS image sensor has the characteristics of being high speed, low power consumption, small pixel pitch and a high SNR. [0030]
  • Referring to the drawings more particularly by reference numbers, FIG. 1 shows an image sensor 10. The image sensor 10 includes a pixel array 12 that contains a plurality of individual photodetecting pixels 14. The pixels 14 are arranged in a two-dimensional array of rows and columns. [0031]
  • The pixel array 12 is coupled to a light reader circuit 16 by a bus 18 and to a row decoder 20 by control lines 22. The row decoder 20 can select an individual row of the pixel array 12. The light reader 16 can then read specific-discrete columns within the selected row. Together, the row decoder 20 and light reader 16 allow for the reading of an individual pixel 14 in the array 12. [0032]
  • The light reader 16 may be coupled to an analog to digital converter 24 (ADC) by output line(s) 26. The ADC 24 generates a digital bit string that corresponds to the amplitude of the signal provided by the light reader 16 and the selected pixels 14. [0033]
  • The ADC 24 may be coupled to line buffers 28 by data lines 30. The line buffers 28 may include separate pairs of buffers for first image data and second image data. The line buffers 28 are coupled to a data interface 32 that transfers data to a processor 34 over bus 36. The processor 34 may be coupled to memory 38 by bus 40. Although the memory 38 is shown coupled to the processor 34, it is to be understood that the system may have other configurations. For example, the processor 34 and memory 38 may be coupled to the interface 32 by separate busses. [0034]
  • The data interface 32 may be connected to a control line INTG 42 which provides a control signal from the processor 34. The control signal may contain a series of pulses that control the transfer of data to the processor 34. The pixel data may be transferred to the processor 34 in an interleaving manner. For example, the buffers 28 may store pixel data of a first image and a second image. The data interface 32 may interleave the data by sending a first line of the first image and then a first line of the second image and so forth and so on. [0035]
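  • A minimal sketch of the interleaved transfer described above, with the two line buffers modeled as plain lists; the function and variable names are illustrative, not from the patent.

```python
from itertools import chain

def interleave_lines(first_image_lines, second_image_lines):
    """Send line 0 of the first image, line 0 of the second image, line 1 of
    the first image, and so on, as the data interface 32 is described doing."""
    return list(chain.from_iterable(zip(first_image_lines, second_image_lines)))

print(interleave_lines(["F0", "F1", "F2"], ["S0", "S1", "S2"]))
# ['F0', 'S0', 'F1', 'S1', 'F2', 'S2']
```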
  • The image sensor 10 may have registers 44 that store mode and gain values. The values can be provided to the data interface 32, buffers 28, light reader 16 and row decoder 20 over lines 46, 48, 50 and 52, respectively. The values can be loaded into the registers 44 through lines 54, 56 and 58. The image sensor 10 may also have clock circuits 60 that provide CLK timing signals over line 62. [0036]
  • The light reader circuit 16 may be coupled to a column decoder 64 by control lines 66. The decoder 64 selects a column within the pixel array 12 to generate and retrieve pixel data from the pixels 14. The decoder 64 is coupled to a counter 68 by a bus 70. The counter 68 provides a count value that causes the decoder 64 to switch the selection of a column in the pixel array 12. Counter 68 is also connected to an input line HD 72 and an output line HDF 74. [0037]
  • The row decoder 20 may include a plurality of row drivers 76 that are coupled to the pixel array 12. The row drivers 76 may be coupled to decoders 78 and counters 80. The counters 80 may be coupled to a counter/latch circuit 82. [0038]
  • The row decoder 20 may also include a phase sequence decoder 84. The phase sequence decoder 84 may be coupled to the light reader 16, row drivers 76 and decoders 78 by control signals 86. The row decoder 20 may further include a wide pulse detector 88 and a narrow pulse detector 90. The wide pulse detector 88 may be connected to the counters 80 by LEAD 92 and LAG 94 control signals, respectively. The narrow pulse detector 90 may be connected to the counter/latch 82 by control signal NP 96. The pulse detectors 88 and 90 may be connected to the INTG control line 42 that is coupled to the processor 34. The counter/latch 82, narrow pulse detector 90 and phase sequence decoder 84 may be connected to the mode line 52 of register 44. [0039]
  • FIG. 2 shows an embodiment of a cell structure for a pixel 14 of the pixel array 12. The pixel 14 may contain a photodetector 100. By way of example, the photodetector 100 may be a photodiode. The photodetector 100 may be connected to a reset transistor 112. The photodetector 100 may also be coupled to a select transistor 114 through a level shifting transistor 116. The transistors 112, 114 and 116 may be field effect transistors (FETs). [0040]
  • The gate of reset transistor 112 may be connected to a RST line 118. The drain node of the transistor 112 may be connected to IN line 120. The gate of select transistor 114 may be connected to a SEL line 122. The source node of transistor 114 may be connected to an OUT line 124. The RST 118 and SEL lines 122 may be common for an entire row of pixels in the pixel array 12. Likewise, the IN 120 and OUT 124 lines may be common for an entire column of pixels in the pixel array 12. The RST line 118 and SEL line 122 are connected to the row decoder 20 and are part of the control lines 22. [0041]
  • FIG. 3 shows an embodiment of a light reader circuit 16. The light reader 16 may include a plurality of double sampling capacitor circuits 150 each connected to an OUT line 124 of the pixel array 12. Each double sampling circuit 150 may include a first capacitor 152 and a second capacitor 154. The first capacitor 152 is coupled to the OUT line 124 and ground GND1 156 by switches 158 and 160, respectively. The second capacitor 154 is coupled to the OUT line 124 and ground GND1 by switches 162 and 164, respectively. Switches 158 and 160 are controlled by a control line SAM1 166. Switches 162 and 164 are controlled by a control line SAM2 168. The capacitors 152 and 154 can be connected together to perform a voltage subtraction by closing switch 170. The switch 170 is controlled by a control line SUB 172. [0042]
  • The double sampling circuits 150 are connected to an operational amplifier 180 by a plurality of first switches 182 and a plurality of second switches 184. The amplifier 180 has a negative terminal (−) coupled to the first capacitors 152 by the first switches 182 and a positive terminal (+) coupled to the second capacitors 154 by the second switches 184. The operational amplifier 180 has a positive output (+) connected to an output line OP 188 and a negative output (−) connected to an output line OM 186. The output lines 186 and 188 are connected to the ADC 24 (see FIG. 1). [0043]
  • The operational amplifier 180 provides an amplified signal that is the difference between the voltage stored in the first capacitor 152 and the voltage stored in the second capacitor 154 of a sampling circuit 150 connected to the amplifier 180. The gain of the amplifier 180 can be varied by adjusting the variable capacitors 190. The variable capacitors 190 may be discharged by closing a pair of switches 192. The switches 192 may be connected to a corresponding control line (not shown). Although a single amplifier is shown and described, it is to be understood that more than one amplifier can be used in the light reader circuit 16. [0044]
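  • A behavioral sketch of one double sampling circuit feeding the amplifier, assuming ideal sample-and-hold capacitors and treating the capacitor-set gain as a plain multiplier; it illustrates the signal flow only, not the circuit itself.

```python
class DoubleSampler:
    def __init__(self):
        self.first_capacitor = 0.0    # sampled while SAM1 is high
        self.second_capacitor = 0.0   # sampled while SAM2 is high

    def sample(self, out_line_voltage: float, sam1: bool = False, sam2: bool = False):
        if sam1:
            self.first_capacitor = out_line_voltage
        if sam2:
            self.second_capacitor = out_line_voltage

    def amplified_difference(self, gain: float = 1.0) -> float:
        # switches 170, 182 and 184 closed: the amplifier sees the difference
        return gain * (self.first_capacitor - self.second_capacitor)

sampler = DoubleSampler()
sampler.sample(2.90, sam1=True)    # e.g. a reference level
sampler.sample(2.70, sam2=True)    # e.g. a reset level
print(round(sampler.amplified_difference(gain=2.0), 6))   # 0.4, then digitized by the ADC
```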
  • FIGS. 4 and 5 show an operation of the image sensor 10 in a first mode also referred to as a low noise mode. In process block 300 a reference signal is written into each pixel 14 of the pixel array and then a first reference output signal is stored in the light reader 16. Referring to FIGS. 2 and 5, this can be accomplished by switching the RST 118 and IN 120 lines from a low voltage to a high voltage to turn on transistor 112. The RST line 118 is driven high for an entire row. IN line 120 is driven high for an entire column. In the preferred embodiment, RST line 118 is first driven high while the IN line 120 is initially low. [0045]
  • The RST line 118 may be connected to a tri-state buffer (not shown) that is switched to a tri-state when the IN line 120 is switched to a high state. This allows the gate voltage to float to a value that is higher than the voltage on the IN line 120. This causes the transistor 112 to enter the triode region. In the triode region the voltage across the photodiode 100 is approximately the same as the voltage on the IN line 120. Generating a higher gate voltage allows the photodetector to be reset at a level close to Vdd. CMOS sensors of the prior art reset the photodetector to a level of Vdd-Vgs, where Vgs can be up to 1 V. [0046]
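  • A quick numeric contrast of the two reset levels just described; the supply and Vgs figures are assumptions used only to show the roughly 1 V of recovered reset level.

```python
VDD = 3.3   # assumed supply voltage
VGS = 1.0   # gate-to-source drop, "up to 1 V" in a conventional reset

conventional_reset_level = VDD - VGS   # prior-art reset level of Vdd - Vgs
boosted_reset_level = VDD              # approximate level when the floating gate drives
                                       # transistor 112 into the triode region
print(round(conventional_reset_level, 2), boosted_reset_level)   # 2.3 vs 3.3 volts
```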
  • The SEL line 122 is also switched to a high voltage level which turns on transistor 114. The voltage of the photodiode 100 is provided to the OUT line 124 through level shifter transistor 116 and select transistor 114. The SAM1 control line 166 of the light reader 16 (see FIG. 3) is selected so that the voltage on the OUT line 124 is stored in the first capacitor 152. [0047]
  • Referring to FIG. 4, in process block 302 the pixels of the pixel array are then reset and reset output signals are then stored in the light reader 16. Referring to FIGS. 2 and 5 this can be accomplished by driving the RST line 118 low to turn off the transistor 112 and reset the pixel 14. Turning off the transistor 112 will create reset noise, charge injection and clock feedthrough voltage that resides across the photodiode 100. As shown in FIG. 6 the noise reduces the voltage at the photodetector 100 when the transistor 112 is reset. [0048]
  • The SAM2 line 168 is driven high, the SEL line 122 is driven low and then high again, so that a level shifted voltage of the photodiode 100 is stored as a reset output signal in the second capacitor 154 of the light reader circuit 16. Process blocks 300 and 302 are repeated for each pixel 14 in the array 12. [0049]
  • Referring to FIG. 4, in process block 304 the reset output signals are then subtracted from the first reference output signals to create noise output signals that are then converted to digital bit strings by ADC 24. The digital output data can be stored within the line buffers 28 and eventually transferred and stored within the external memory 38. The noise signals may be referred to as a first image. Referring to FIG. 3, the subtraction process can be accomplished by closing switches 182, 184 and 170 of the light reader circuit 16 to subtract the voltage across the second capacitor 154 from the voltage across the first capacitor 152. [0050]
  • Referring to FIG. 4, in block 306 light response output signals are sampled from the pixels 14 of the pixel array 12 and stored in the light reader circuit 16. The light response output signals correspond to the optical image that is being detected by the image sensor 10. Referring to FIGS. 2, 3 and 5 this can be accomplished by having the IN 120, SEL 122 and SAM2 lines 168 in a high state and RST 118 in a low state. The second capacitor 152 of the light reader circuit 16 stores a level shifted voltage of the photodiode 100 as the light response output signal. [0051]
  • Referring to FIG. 4, in block 308 a second reference output signal is then generated in the pixels 14 and stored in the light reader circuit 16. Referring to FIGS. 2, 3 and 5, this can be accomplished similar to generating and storing the first reference output signal. The RST line 118 is first driven high and then into a tri-state. The IN line 120 is then driven high to cause the transistor 112 to enter the triode region so that the voltage across the photodiode 100 is the voltage on IN line 120. The SEL 122 and SAM2 168 lines are then driven high to store the second reference output voltage in the first capacitor 154 of the light reader circuit 16. Process blocks 306 and 308 are repeated for each pixel 14 in the array 12. [0052]
  • Referring to FIG. 4, in block 310 the light response output signal is subtracted from the second reference output signal to create a normalized light response output signal. The normalized light response output signal is converted into a digital bit string to create normalized light output data that is transferred to the processor 34. The normalized light response output signals may be referred to as a second image. Referring to FIGS. 2, 3 and 5 the subtraction process can be accomplished by closing switches 170, 182 and 184 of the light reader 16 to subtract the voltage across the first capacitor 152 from the voltage across the second capacitor 154. The difference is then amplified by amplifier 180 and converted into a digital bit string by ADC 24 as light response data. [0053]
[0054] Referring to FIG. 4, in block 312 the noise data is retrieved from memory 38. In block 314 the noise data, which corresponds to the first image, is subtracted from the normalized light output data, which corresponds to the second image, by the processor 34. Because the second reference output signal is the same or approximately the same as the first reference output signal, this subtraction removes the noise due to reset noise, charge injection and clock feedthrough from the normalized light response signal. This improves the signal-to-noise ratio of the final image data.
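A software view of the block 314 combination might look like the sketch below. It assumes the two images arrive as unsigned integer arrays of equal size; the array shapes, sample values and clipping are illustrative choices, not taken from the patent.

```python
import numpy as np

def combine_low_noise(first_image, second_image):
    """Subtract the noise data (first image) from the normalized light
    data (second image). Because both images share essentially the same
    reference level, the reset noise, charge injection and clock
    feedthrough terms cancel. Negative results are clipped to zero here
    purely for illustration."""
    diff = second_image.astype(np.int32) - first_image.astype(np.int32)
    return np.clip(diff, 0, None).astype(np.uint16)

# Hypothetical 10-bit data for a 2x3 block of pixels.
first = np.array([[12, 15, 11], [14, 13, 12]], dtype=np.uint16)        # noise image
second = np.array([[220, 310, 180], [260, 240, 205]], dtype=np.uint16) # normalized light image
print(combine_low_noise(first, second))
```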
[0055] The process described is performed in a sequence across the various rows of the pixels in the pixel array 12. As shown in FIG. 5, the n-th row in the pixel array may be generating noise signals while the (n-l)-th row generates normalized light response signals, where l is the exposure duration in multiples of a line period.
[0056] The various control signals RST, SEL, IN, SAM1, SAM2 and SUB can be generated in the circuit generally referred to as the phase sequence decoder 84. FIG. 7 shows an embodiment of logic to generate the IN, SEL, SAM1, SAM2 and RST signals in accordance with the timing diagram of FIG. 5. The logic may include a plurality of comparators 350, each with one input connected to a counter 68 and another input connected to hardwired signals that contain a lower count value and an upper count value. The counter 68 sequentially generates a count. The comparators 350 compare the present count with the lower and upper count values. If the present count is between the lower and upper count values, the comparators 350 output a logical 1.
[0057] The comparators 350 are connected to a plurality of AND gates 356 and OR gates 358. The OR gates 358 are connected to latches 360. The latches 360 provide the corresponding IN, SEL, SAM1, SAM2 and RST signals. The AND gates 356 are also connected to a mode line 364. To operate in accordance with the timing diagram shown in FIG. 5, the mode line 364 is set at a logic 1.
[0058] The latches 360 switch between a logic 0 and a logic 1 in accordance with the logic established by the AND gates 356, OR gates 358, comparators 350 and the present count of the counter 352. For example, the hardwired signals for the comparator coupled to the IN latch may contain a lower count value of 6 and an upper count value of 24. If the count from the counter is greater than or equal to 6 but less than 24, the comparator 350 will provide a logic 1 that causes the IN latch 360 to output a logic 1. The lower and upper count values establish the sequence and duration of the pulses shown in FIG. 5. The mode line 364 can be switched to a logic 0, which causes the image sensor to function in a second mode.
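The comparator-and-latch behaviour can be sketched as a simple count-window test. Only the IN bounds (6 and 24) come from the example above; the remaining window values and the simplified mode gating are placeholders chosen for illustration, not values from the patent.

```python
# Behavioural sketch of the FIG. 7 style pulse generation: a signal is high
# while the counter's present count lies inside its hardwired window.
PULSE_WINDOWS = {
    "IN":   (6, 24),   # lower/upper count values from the example above
    "RST":  (0, 4),    # the remaining windows are hypothetical placeholders
    "SAM1": (4, 8),
    "SAM2": (20, 26),
    "SEL":  (2, 30),
}

def comparator(count, lower, upper):
    """One comparator 350: outputs logic 1 while lower <= count < upper."""
    return int(lower <= count < upper)

def control_signals(count, mode=1):
    """Latch outputs for one counter value; mode 0 selects the sensor's
    second mode (the alternate gating is not modelled here)."""
    return {name: comparator(count, lower, upper) & mode
            for name, (lower, upper) in PULSE_WINDOWS.items()}

if __name__ == "__main__":
    for count in range(0, 32, 4):   # the counter sequentially generates a count
        print(count, control_signals(count))
```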
[0059] The sensor 10 may have a plurality of reset RST(n) drivers 370, each driver 370 being connected to a row of pixels. FIGS. 8 and 9 show an exemplary driver circuit 370 and the operation of the circuit 370. Each driver 370 may have a pair of NOR gates 372 that are connected to the RST and SAM1 latches shown in FIG. 7. The NOR gates control the state of a tri-state buffer 374. The tri-state buffer 374 is connected to the reset transistors in a row of pixels. The input of the tri-state buffer is connected to an AND gate 376 that is connected to the RST latch and a row enable ROWEN(n) line.
[0060] FIGS. 10 and 11 show operation of the image sensor in a second mode, also referred to as an extended dynamic range mode. In this mode the image provides a sufficient amount of optical energy so that the SNR is adequate even without the noise cancellation technique described in FIGS. 4 and 5, although it is to be understood that the noise cancellation technique shown in FIGS. 4 and 5 can be utilized while the image sensor 10 is in the extended dynamic range mode. The extended dynamic range mode has both a short exposure period and a long exposure period. Referring to FIG. 10, in block 400 each pixel 14 is reset to start a short exposure period. The mode of the image sensor can be set by the processor 34 through the register 44 to determine whether the sensor should operate in the low noise mode or the extended dynamic range mode.
[0061] In block 402 a short exposure output signal is generated in the selected pixel and stored in the second capacitor 154 of the light reader circuit 16.
[0062] In block 404 the selected pixel is then reset. The level shifted reset voltage of the photodiode 100 is stored in the first capacitor 152 of the light reader circuit 16 as a reset output signal. The short exposure output signal is subtracted from the reset output signal in the light reader circuit 16. The difference between the short exposure signal and the reset signal is converted into a binary bit string by ADC 24 and stored in the external memory 38. The short exposure data corresponds to the first image pixel data. Then each pixel is again reset to start a long exposure period.
[0063] In block 406 the light reader circuit 16 stores a long exposure output signal from the pixel in the second capacitor 154. In block 408 the pixel is reset and the light reader circuit 16 stores the reset output signal in the first capacitor 152. The long exposure output signal is subtracted from the reset output signal, amplified and converted into a binary bit string by ADC 24 as long exposure data.
[0064] Referring to FIG. 10, in block 410 the short exposure data is retrieved from memory 38. In block 412 the short exposure data is combined with the long exposure data by the processor 34. The data may be combined in a number of different manners. The external processor 34 may first analyze the image with the long exposure data. The photodiodes may be saturated if the image is too bright, which would normally result in a “washed out” image. The processor 34 can process the long exposure data to determine whether the image is washed out; if so, the processor 34 can then use the short exposure image data. The processor 34 can also use both the long and short exposure data to compensate for saturated portions of the detected image.
[0065] By way of example, the image may be initially set to all zeros. The processor 34 then analyzes the long exposure data. If the long exposure data does not exceed a threshold, then the N least significant bits (LSB) of the image are replaced with all N bits of the long exposure data. If the long exposure data does exceed the threshold, then the N most significant bits (MSB) of the image are replaced by all N bits of the short exposure data. The image data is N+M bits per pixel. This technique increases the dynamic range by M bits, where M is the exponent in the exposure duration ratio of the long and short exposures, defined by the equation l = 2^M. The replaced image may undergo a logarithmic mapping to a final picture of N bits in accordance with the mapping equation Y = 2^N log2(X)/(N+M).
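As a concrete reading of this example, the sketch below uses hypothetical values N = 10, M = 4 and a near-full-scale threshold; only the bit-placement rule and the mapping Y = 2^N log2(X)/(N+M) come from the description, and the sample pixel values are invented.

```python
import math

N = 10                       # bits per exposure sample (hypothetical)
M = 4                        # exposure ratio exponent, l = 2**M (hypothetical)
THRESHOLD = (1 << N) - 64    # saturation threshold for the long exposure (hypothetical)

def combine_pixel(long_data, short_data):
    """Build an (N+M)-bit value: long exposure data fills the N LSBs when it
    does not exceed the threshold, otherwise the short exposure data is
    shifted into the N MSBs."""
    if long_data <= THRESHOLD:
        return long_data
    return short_data << M

def log_map(x):
    """Logarithmic mapping to a final N-bit picture: Y = 2**N * log2(X) / (N + M)."""
    if x <= 0:
        return 0
    return min((1 << N) - 1, round((1 << N) * math.log2(x) / (N + M)))

# One unsaturated pixel and one saturated pixel.
for long_data, short_data in [(600, 40), (1023, 200)]:
    x = combine_pixel(long_data, short_data)
    print(long_data, short_data, "->", x, "->", log_map(x))
```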
[0066] FIG. 11 shows the timing of data generation and retrieval for the long and short exposure data. The reading of output signals from the pixel array 12 overlaps with the retrieval of signals from memory 38. In FIG. 11 an n-th row of pixels starts a short exposure, the (n-k)-th row ends the short exposure period and starts the long exposure period, and the (n-k-l)-th row of pixels ends the long exposure period, where k is the short exposure duration in multiples of the line period and l is the long exposure duration in multiples of the line period.
[0067] The processor 34 begins to retrieve short exposure data for the pixels in row (n-k) at the same time as the (n-k-l)-th row in the pixel array is completing the long exposure period. At the beginning of a line period, the light reader circuit 16 retrieves the short exposure output signals from the (n-k)-th row of the pixel array 12, as shown by the enablement of signals SAM1, SAM2, SEL(n-k) and RST(n-k). The light reader circuit 16 then retrieves the long exposure data of the (n-k-l)-th row.
[0068] The dual modes of the image sensor 10 can compensate for varying brightness in the image. When the image brightness is low, the output signals from the pixels are relatively low. This would normally reduce the SNR of the resultant data provided by the sensor, assuming the average noise is relatively constant. The noise compensation scheme shown in FIGS. 4 and 5 improves the SNR of the output data so that the image sensor provides a quality picture even when the subject image is relatively dark. Conversely, when the subject image is too bright, the extended dynamic range mode depicted in FIGS. 10 and 11 compensates for such brightness to provide a quality picture. Although a process having a short exposure followed by a long exposure is shown and described, it is to be understood that the short exposure may follow the long exposure.
[0069] FIG. 12 shows an embodiment of a row driver 76 and a decoder 78 of the row decoder 20. The decoder 78 may contain an address decoder 500 and a latch 502. The input of the latch 502 is connected to input lines CLR 504 and D0, D1 506 from the phase decoder circuit 84 (see FIG. 1) and to the output line LE 508 of the address decoder 500. Although a phase decoder circuit 84 is shown and described, it is to be understood that any state value generator may be utilized. The input of the driver 76 is connected to output lines Q0, Q1 510 of the latch 502 and to input lines RST 512 and SEL 514 from the phase sequence decoder 84. The latches 502 for each row of pixels are all connected to the phase decoder circuit 84 by the same common control lines 504 and 506. The common control lines 504 and 506 minimize the lines, transistors and space required by the row decoder while providing a means for loading the state values with a time division multiplexing process.
[0070] The address decoder 500 is coupled to a multiplexor 520 by an address bus 522. The address decoder 500 is also connected to control lines PRE# 524 and EVA# 526 from the phase sequence decoder 84. The multiplexor 520 may have three input address busses 528, 530 and 532. The address busses 528, 530 and 532 are connected to a first counter 534, a second counter 536 and a third counter 538, respectively. Although counters 534, 536 and 538 are shown and described, it is to be understood that any address generator may be implemented.
[0071] The output of the multiplexor 520 is switched between the busses 528, 530 and 532 by a control line PA 540 from the phase sequence decoder 84. There is a corresponding address decoder 500 and latch 502 for each row of the pixel array 12. The multiplexor 520 provides a time division multiplexing means for selecting a row of the pixel array with a reduced number of lines and transistors, which minimizes the size of the image sensor.
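The shared-bus, time-division-multiplexed row selection can be modelled roughly as below. The per-slot PA schedule, the four-row array size and the D0/D1 values are invented for the example; only the structure (three counters, one multiplexor, and one address decoder and latch per row sharing common control lines) follows the description.

```python
from dataclasses import dataclass
from itertools import count

@dataclass
class RowLatch:
    """Per-row latch 502: loads the shared D0/D1 state values only when its
    address decoder 500 sees a matching address on the common bus (LE)."""
    row_address: int
    q0: int = 0
    q1: int = 0

    def maybe_load(self, bus_address, d0, d1):
        if bus_address == self.row_address:
            self.q0, self.q1 = d0, d1

# Three free-running address generators standing in for counters 534, 536, 538.
counters = {"first": count(0), "second": count(0), "third": count(0)}

def multiplexor(pa_select):
    """Multiplexor 520: the PA control selects whose address reaches the
    shared address bus 522 during this time slot."""
    return next(counters[pa_select])

latches = [RowLatch(row) for row in range(4)]   # one decoder/latch per row

# A made-up PA schedule: each slot muxes one counter onto the common bus.
for slot, pa in enumerate(["first", "first", "second", "third"]):
    address = multiplexor(pa)
    for latch in latches:
        latch.maybe_load(address, d0=1, d1=slot % 2)

print([(latch.q0, latch.q1) for latch in latches])
```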
[0072] FIGS. 13 and 14 show an operation of the row decoder 20 and the transfer of pixel data. As shown in FIG. 14, the integration time and the transfer of data are dependent on the control signal INTG from the processor 34. Making the integration time and data transfer dependent on the control signal INTG allows the processor 34 to control and vary these parameters.
[0073] The INTG control signal contains a plurality of pulses, each with a falling edge and a rising edge. Referring to FIGS. 1, 12, 13 and 14, a falling edge is detected by the wide pulse detector 88, which generates an output on the LEAD control line 92. The LEAD control signal starts the first counter 534. The first counter 534 outputs an address that is provided to the multiplexor 520.
[0074] The PA control signal switches some of the multiplexors 520 to provide the address from the first counter 534 to the corresponding address decoders 500. If the address from the first counter 534 matches a stored address within the address decoder 500, the decoder 500 will enable the latch 502 through the line LE 508. The latch 502 loads state values Q0 and Q1 into the row driver 76. The output state values correspond to the state values D0 and D1 that were previously loaded into the latch 502 from the phase sequence decoder 84. When in the low noise mode the state values allow the RST and SEL signals to pass through the driver 76 to the selected row to generate and retrieve the reference and reset signals (the first image).
[0075] The first counter 534 continues to output new address values which in turn sequentially select rows of the pixel array 12 to allow for the generation and retrieval of reference and noise signals for each row. The falling edge of the INTG control signal also enables the transfer of the first image to the processor 34 from the data interface 32. The process continues until all of the first image data is transferred to the processor 34 and stored in memory 38.
[0076] A rising edge of a pulse is detected by the wide pulse detector 88, which generates an output on the LAG control line 94. The LAG signal initiates the second counter 536. The second counter 536 provides addresses to the multiplexors 520 of each row. The multiplexors 520 mux the addresses to the decoders 500. If the addresses match, the latch 502 is enabled to load state values into the row drivers 76. When in the low noise mode the state values allow for the generation and retrieval of the light response and reference signals for the second image. The rising edge also enables the data interface 32 to transfer the second image data to the processor 34. As shown in FIG. 14, the transfer of first and second image data may overlap. The interface 32 can transfer the overlapping data to the processor 34 in an interleaving manner.
[0077] FIG. 15 shows the transfer of data when the image sensor 10 is in the extended dynamic range mode. In this mode the INTG control signal includes a narrow pulse between wide pulses. A short exposure is initiated by the falling edge of a wide pulse. The narrow pulse is detected by the narrow pulse detector 90, which initiates the third counter 538. The third counter 538 provides addresses which are decoded by matching decoders 500 to enable the corresponding latches 502. The enabled latches 502 load state values into the row drivers 76 that allow for the generation and retrieval of the long exposure and reference signals of the second image. The narrow pulse also enables the data interface 32 to transfer the short exposure and reference signals of the first image to the processor 34.
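The way INTG edges and pulse widths select a counter can be summarised in a small dispatch function. The string labels are descriptive only, and the classification of a pulse as wide or narrow is assumed to come from the pulse detectors 88 and 90; this is a sketch of the described control flow, not the detector circuitry itself.

```python
def dispatch_edge(edge, pulse_is_wide):
    """Return which counter an INTG event starts, per the description:
    falling edge of a wide pulse -> LEAD -> first counter 534 (first image),
    rising edge of a wide pulse  -> LAG  -> second counter 536 (second image),
    narrow pulse                 -> NP   -> third counter 538 (long exposure rows)."""
    if not pulse_is_wide:
        return "NP -> third counter 538"
    if edge == "falling":
        return "LEAD -> first counter 534"
    return "LAG -> second counter 536"

# Extended dynamic range framing: wide pulse edges plus an embedded narrow pulse.
for edge, wide in [("falling", True), ("rising", False), ("rising", True)]:
    kind = "wide" if wide else "narrow"
    print(f"{edge:7s} {kind:6s} -> {dispatch_edge(edge, wide)}")
```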
[0078] The processor 34 can change the exposure time by varying the width of the pulses in the control signal. The variation in pulse width is an integer multiple of the line period so that the change in pulse width is in synchronization with the signals generated by the phase sequence decoder 84. When in the extended dynamic range mode, the exposure time can be varied by changing the location of the narrow pulse.
[0079] As shown in FIG. 16, the image sensor may generate data within a window 550 of the pixel array 12. The window 550 is an area typically offset from the first row of the pixel array 12. The window information may be provided to the processor 34 to auto-focus the camera. In auto-focus mode the window offset may vary to capture different parts of the image.
[0080] FIG. 17 shows an INTG control signal with an embedded narrow pulse that is used to determine the offset location of the window 550. When the register 44 sets the image sensor in a window mode, the narrow pulse detector 90 detects the embedded narrow pulse and provides a START control signal to the counter/latch 82 on the NP control line 96. The wide pulse detector 88 detects the rising edge of the next pulse and provides a STOP control signal to the counter/latch 82 on the LAG control line 94. The counter/latch 82 uses the START and STOP control signals to determine the offset for the window. An offset value is loaded into the counters 534, 536 and 538 to provide an initial count value. The processor 34 can control the window offset by varying the location of the embedded narrow pulse within the control signal.
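A rough software model of the counter/latch 82 computation is shown below: count whole line periods between the START and STOP events and seed the three counters with the result. The 10 µs line period and the event times are invented for the example.

```python
def window_offset(start_time_us, stop_time_us, line_period_us):
    """Counter/latch 82 sketch: the number of whole line periods between the
    START event (embedded narrow pulse) and the STOP event (rising edge of
    the next wide pulse) becomes the window's row offset."""
    return int((stop_time_us - start_time_us) // line_period_us)

offset = window_offset(start_time_us=120.0, stop_time_us=440.0, line_period_us=10.0)

# The offset seeds counters 534, 536 and 538 as their initial count value.
initial_counts = {"first counter 534": offset,
                  "second counter 536": offset,
                  "third counter 538": offset}
print(offset, initial_counts)
```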
[0081] It is the intention of the inventor that only claims which contain the term “means” shall be construed under 35 U.S.C. §112, sixth paragraph.
[0082] While certain exemplary embodiments have been described and shown in the accompanying drawings, it is to be understood that such embodiments are merely illustrative of and not restrictive on the broad invention, and that this invention not be limited to the specific constructions and arrangements shown and described, since various other modifications may occur to those ordinarily skilled in the art.

Claims (20)

What is claimed is:
1. An image sensor that is connected to a processor which generates a plurality of control signals, the control signals including a first edge separated from a second edge by a control interval, comprising:
a pixel array that contains a plurality of rows of pixels; and,
a selection circuit that selects a row of said pixel array to generate and retrieve pixel data from said pixel array by resetting and reading said selected row of said pixel array, a time interval between the resetting and reading of said selected row being proportional to the control interval between the first and second edges.
2. The image sensor of claim 1, wherein said selection circuit includes a decoder circuit coupled to said pixel array, an address generator coupled to said decoder circuit and a pulse detector coupled to said address generator and the processor.
3. The image sensor of claim 2, wherein said address generator includes a first counter that is started in response to the first edge and a second counter that is started in response to the second edge.
4. The image sensor of claim 3, wherein said selection circuit includes a narrow pulse detector that is coupled to a third counter of said address generator, said third counter being coupled to said decoder circuit.
5. The image sensor of claim 3, wherein said decoder circuit includes a multiplexor coupled to an address decoder, said multiplexor being coupled to said first and second counters.
6. The image sensor of claim 5, wherein said selection circuit includes a row driver coupled to a latch of said decoder circuit, said latch being coupled to said address decoder.
7. The image sensor of claim 1, further comprising a light reader circuit coupled to said pixel array.
8. The image sensor of claim 4, wherein said selection circuit includes a counter/latch that is coupled to said narrow pulse detector and said address generator.
9. The image sensor of claim 6, wherein said selection circuit includes a phase sequence decoder that is coupled to said light reader circuit and said row driver.
10. An image sensor that is connected to a processor which generates a plurality of control signals including a first pulse that has a first width and a second pulse that has a different second width, comprising:
a pixel array that contains a plurality of rows of pixels; and,
a selection circuit that selects a group of rows of said pixel array, the group being a function of a location of the second pulse relative to the first pulse.
11. The image sensor of claim 10, wherein said selection circuit includes a decoder circuit coupled to said pixel array, an address generator coupled to said decoder circuit and a pulse detector coupled to said address generator and the processor.
12. The image sensor of claim 11, wherein said address generator includes a first counter that is started in response to a first edge in the plurality of control signals and a second counter that is started in response to a second edge in the plurality of control signals.
13. The image sensor of claim 12, wherein said selection circuit includes a pulse detector that is coupled to a third counter of said address generator, said third counter being coupled to said decoder circuit.
14. The image sensor of claim 12, wherein said decoder circuit includes a multiplexor coupled to an address decoder, said multiplexor being coupled to said first and second counters.
15. The image sensor of claim 14, wherein said selection circuit includes a row driver coupled to a latch of said decoder circuit, said latch being coupled to said address decoder.
16. The image sensor of claim 10, further comprising a light reader circuit coupled to said pixel array.
17. The image sensor of claim 13, wherein said selection circuit includes a counter/latch that is coupled to said pulse detector and said address generator.
18. The image sensor of claim 16, wherein said selection circuit includes a phase sequence decoder that is coupled to said light reader circuit and said row driver.
19. An image sensor, comprising:
a pixel array that contains a plurality of rows of pixels;
an address decoder coupled to a row of said pixel array;
a multiplexor coupled to said address decoder;
a first address generator coupled to said multiplexor; and,
a second address generator coupled to said multiplexor.
20. The system of claim 19, further comprising a pulse detector coupled to said first and second address generators.
US10/383,450 2002-04-16 2003-03-06 Image sensor with processor controlled integration time Abandoned US20030193594A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US10/383,450 US20030193594A1 (en) 2002-04-16 2003-03-06 Image sensor with processor controlled integration time
US12/893,032 US9584739B2 (en) 2002-04-16 2010-09-29 CMOS image sensor with processor controlled integration time

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US37290202P 2002-04-16 2002-04-16
US10/383,450 US20030193594A1 (en) 2002-04-16 2003-03-06 Image sensor with processor controlled integration time

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US12/893,032 Continuation US9584739B2 (en) 2002-04-16 2010-09-29 CMOS image sensor with processor controlled integration time

Publications (1)

Publication Number Publication Date
US20030193594A1 true US20030193594A1 (en) 2003-10-16

Family

ID=28794503

Family Applications (2)

Application Number Title Priority Date Filing Date
US10/383,450 Abandoned US20030193594A1 (en) 2002-04-16 2003-03-06 Image sensor with processor controlled integration time
US12/893,032 Expired - Fee Related US9584739B2 (en) 2002-04-16 2010-09-29 CMOS image sensor with processor controlled integration time

Family Applications After (1)

Application Number Title Priority Date Filing Date
US12/893,032 Expired - Fee Related US9584739B2 (en) 2002-04-16 2010-09-29 CMOS image sensor with processor controlled integration time

Country Status (1)

Country Link
US (2) US20030193594A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
MX2011006316A (en) * 2008-12-16 2011-09-01 Hiok Nam Tay Noise-cancelling image sensors.
JP5661260B2 (en) * 2009-07-16 2015-01-28 キヤノン株式会社 Solid-state imaging device and driving method thereof
JP5631058B2 (en) * 2010-05-20 2014-11-26 キヤノン株式会社 Imaging device, imaging system, and driving method of imaging device
US9325923B2 (en) 2013-03-14 2016-04-26 Taiwan Semiconductor Manufacturing Co., Ltd. Systems and methods to mitigate transient current for sensors

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4441125A (en) * 1981-11-03 1984-04-03 Micron Technology, Inc. Image sensor using dynamic random access memory
US6456326B2 (en) * 1994-01-28 2002-09-24 California Institute Of Technology Single chip camera device having double sampling operation
US5877715A (en) * 1997-06-12 1999-03-02 International Business Machines Corporation Correlated double sampling with up/down counter
US6522357B2 (en) * 1997-09-30 2003-02-18 Intel Corporation Method and apparatus for increasing retention time in image sensors having an electronic shutter

Patent Citations (67)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3931674A (en) * 1974-02-08 1976-01-13 Fairchild Camera And Instrument Corporation Self aligned CCD element including two levels of electrodes and method of manufacture therefor
US4425501A (en) * 1981-03-30 1984-01-10 Honeywell Inc. Light aperture for a lenslet-photodetector array
US4473836A (en) * 1982-05-03 1984-09-25 Dalsa Inc. Integrable large dynamic range photodetector element for linear and area integrated circuit imaging arrays
US4614996A (en) * 1984-07-19 1986-09-30 Shimizu Construction Co., Ltd. Ceiling illumination apparatus
US4704633A (en) * 1985-04-01 1987-11-03 Fuji Photo Film Co., Ltd. Method for reading out image information on an image having a wide dynamic range
US4647975A (en) * 1985-10-30 1987-03-03 Polaroid Corporation Exposure control system for an electronic imaging camera having increased dynamic range
US5737016A (en) * 1985-11-15 1998-04-07 Canon Kabushiki Kaisha Solid state image pickup apparatus for reducing noise
US4858013A (en) * 1987-03-19 1989-08-15 Mitsubishi Denki Kabushiki Kaisha Solid state imaging device with adaptive pixel correction
US5638118A (en) * 1987-06-09 1997-06-10 Canon Kabushiki Kaisha Image sensing device with diverse storage times used in picture composition
US4974093A (en) * 1987-12-22 1990-11-27 Fuji Photo Film Co., Ltd. Solid state image-pickup device with expanded dynamic range
US5043821A (en) * 1988-08-31 1991-08-27 Canon Kabushiki Kaisha Image pickup device having a frame-size memory
US5138458A (en) * 1989-12-22 1992-08-11 Olympus Optical Co., Ltd. Electronic camera apparatus capable of providing wide dynamic range image signal
US5278658A (en) * 1990-05-15 1994-01-11 Ricoh Company, Ltd. Image reading apparatus having a function for correcting dark signals generated in a photoelectric conversion element
US5675381A (en) * 1990-10-31 1997-10-07 Canon Kabushiki Kaisha Control of solid-state image sensor
US5434620A (en) * 1991-02-22 1995-07-18 Nippondenso Co., Ltd. Image sensor
US5235197A (en) * 1991-06-25 1993-08-10 Dalsa, Inc. High photosensitivity and high speed wide dynamic range ccd image sensor
US5420635A (en) * 1991-08-30 1995-05-30 Fuji Photo Film Co., Ltd. Video camera, imaging method using video camera, method of operating video camera, image processing apparatus and method, and solid-state electronic imaging device
US5163914A (en) * 1991-10-24 1992-11-17 Abel Elaine R Support for a respirator hose
US5436662A (en) * 1992-05-01 1995-07-25 Olympus Optical Co., Ltd. Imaging apparatus having a solid state matrix-type imaging element and pulse generator for the expanding the dynamic range
US5309243A (en) * 1992-06-10 1994-05-03 Eastman Kodak Company Method and apparatus for extending the dynamic range of an electronic imaging system
US6049357A (en) * 1992-06-30 2000-04-11 Canon Kabushiki Kaisha Image pickup apparatus including signal accumulating cells
US5455621A (en) * 1992-10-27 1995-10-03 Matsushita Electric Industrial Co., Ltd. Imaging method for a wide dynamic range and an imaging device for a wide dynamic range
US5452004A (en) * 1993-06-17 1995-09-19 Litton Systems, Inc. Focal plane array imaging device with random access architecture
US5801773A (en) * 1993-10-29 1998-09-01 Canon Kabushiki Kaisha Image data processing apparatus for processing combined image signals in order to extend dynamic range
US5587738A (en) * 1993-11-17 1996-12-24 Canon Kabushiki Kaisha Solid-state image pickup device having plural switches for subtracting a stored signal from a pixel output
US5471515A (en) * 1994-01-28 1995-11-28 California Institute Of Technology Active pixel sensor with intra-pixel charge transfer
US6021172A (en) * 1994-01-28 2000-02-01 California Institute Of Technology Active pixel sensor having intra-pixel charge transfer with analog-to-digital converter
US5841126A (en) * 1994-01-28 1998-11-24 California Institute Of Technology CMOS active pixel sensor type imaging system on a chip
US5461425A (en) * 1994-02-15 1995-10-24 Stanford University CMOS image sensor with pixel level A/D conversion
US6040858A (en) * 1994-11-18 2000-03-21 Canon Kabushiki Kaisha Method and apparatus for expanding the dynamic range of sensed color images
US5953061A (en) * 1995-01-03 1999-09-14 Xerox Corporation Pixel cells having integrated analog memories and arrays thereof
US5665959A (en) * 1995-01-13 1997-09-09 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Adminstration Solid-state image sensor with focal-plane digital photon-counting pixel array
US5929908A (en) * 1995-02-03 1999-07-27 Canon Kabushiki Kaisha Image sensing apparatus which performs dynamic range expansion and image sensing method for dynamic range expansion
US6144408A (en) * 1995-02-24 2000-11-07 Eastman Kodak Company Black pattern correction for charge transfer sensor
US6300978B1 (en) * 1995-08-11 2001-10-09 Kabushiki Kaisha Toshiba MOS-type solid-state imaging apparatus
US6115065A (en) * 1995-11-07 2000-09-05 California Institute Of Technology Image sensor producing at least two integration times from each sensing pixel
US5883830A (en) * 1995-12-29 1999-03-16 Intel Corporation CMOS imaging device with integrated flash memory image correction circuitry
US5861620A (en) * 1996-01-19 1999-01-19 Canon Kabushiki Kaisha Photoelectric converting apparatus
US5990506A (en) * 1996-03-20 1999-11-23 California Institute Of Technology Active pixel sensors with substantially planarized color filtering elements
US6418245B1 (en) * 1996-07-26 2002-07-09 Canon Kabushiki Kaisha Dynamic range expansion method for image sensed by solid-state image sensing device
US5886659A (en) * 1996-08-21 1999-03-23 California Institute Of Technology On-focal-plane analog-to-digital conversion for current-mode imaging devices
US5892541A (en) * 1996-09-10 1999-04-06 Foveonics, Inc. Imaging system and method for increasing the dynamic range of an array of active pixel sensor cells
US5880460A (en) * 1996-09-10 1999-03-09 Foveonics, Inc. Active pixel sensor cell that reduces noise in the photo information extracted from the cell
US5926214A (en) * 1996-09-12 1999-07-20 Vlsi Vision Limited Camera system and associated method for removing reset noise and fixed offset noise from the output of an active pixel array
US6856349B1 (en) * 1996-09-30 2005-02-15 Intel Corporation Method and apparatus for controlling exposure of a CMOS sensor array
US5909026A (en) * 1996-11-12 1999-06-01 California Institute Of Technology Integrated sensor with frame memory and programmable resolution for light adaptive imaging
US6115066A (en) * 1997-06-12 2000-09-05 International Business Machines Corporation Image sensor with direct digital correlated sampling
US5962844A (en) * 1997-09-03 1999-10-05 Foveon, Inc. Active pixel image cell with embedded memory and pixel level signal processing capability
US6005619A (en) * 1997-10-06 1999-12-21 Photobit Corporation Quantum efficiency improvements in active pixel sensors
US6538593B2 (en) * 1997-10-30 2003-03-25 The Board Of Trustees Of The Leland Stanford Junior University Method and apparatus for converting a low dynamic range analog signal to a large dynamic range floating-point digital representation
US6369737B1 (en) * 1997-10-30 2002-04-09 The Board Of Trustees Of The Leland Stanford Junior University Method and apparatus for converting a low dynamic range analog signal to a large dynamic range floating-point digital representation
US6246436B1 (en) * 1997-11-03 2001-06-12 Agilent Technologies, Inc Adjustable gain active pixel sensor
US6369853B1 (en) * 1997-11-13 2002-04-09 Foveon, Inc. Intra-pixel frame storage element, array, and electronic shutter method suitable for electronic still camera applications
US6008486A (en) * 1997-12-31 1999-12-28 Gentex Corporation Wide dynamic range optical sensor
US6317154B2 (en) * 1998-02-27 2001-11-13 Intel Corporation Method to reduce reset noise in photodiode based CMOS image sensors
US6847398B1 (en) * 1998-03-31 2005-01-25 Micron Technology, Inc. Latched row logic for a rolling exposure snap
US6493030B1 (en) * 1998-04-08 2002-12-10 Pictos Technologies, Inc. Low-noise active pixel sensor for imaging arrays with global reset
US6101287A (en) * 1998-05-27 2000-08-08 Intel Corporation Dark frame subtraction
US6365950B1 (en) * 1998-06-02 2002-04-02 Samsung Electronics Co., Ltd. CMOS active pixel sensor
US6198686B1 (en) * 1998-06-29 2001-03-06 Fujitsu Limited Memory device having row decoder
US6024881A (en) * 1998-08-11 2000-02-15 Just; Gerard A. Magnetic absorption treatment of fluid phases
US6532040B1 (en) * 1998-09-09 2003-03-11 Pictos Technologies, Inc. Low-noise active-pixel sensor for imaging arrays with high speed row reset
US6473122B1 (en) * 1999-12-06 2002-10-29 Hemanth G. Kanekal Method and apparatus to capture high resolution images using low resolution sensors and optical spatial image sampling
US6603702B2 (en) * 2000-03-01 2003-08-05 Matsushita Electric Industrial Co., Ltd. Semiconductor integrated circuit
US20010040631A1 (en) * 2000-05-09 2001-11-15 Ewedemi Odutola Oluseye CMOS sensor array with a memory interface
US6992715B2 (en) * 2001-01-11 2006-01-31 Psion Teklogix Systems Inc. Row decoding scheme for double sampling in 3T pixel arrays
US6794627B2 (en) * 2001-10-24 2004-09-21 Foveon, Inc. Aggregation of active pixel sensor signals

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1711007A4 (en) * 2004-01-26 2010-11-24 Hamamatsu Photonics Kk Solid state image pickup device
US20070242151A1 (en) * 2004-01-26 2007-10-18 Yasuhiro Suzuki Solid State Image Pickup Device
EP1711007A1 (en) * 2004-01-26 2006-10-11 Hamamatsu Photonics K.K. Solid state image pickup device
US7973844B2 (en) * 2004-01-26 2011-07-05 Hamamatsu Photonics K.K. Solid state image pickup device
US8446507B2 (en) * 2004-08-20 2013-05-21 Micron Technology, Inc. Redundancy in column parallel or row architectures
US8072523B2 (en) * 2004-08-20 2011-12-06 Micron Technology, Inc. Redundancy in column parallel or row architectures
US20120044219A1 (en) * 2004-08-20 2012-02-23 Christian Boemler Redundancy in column parallel or row architectures
US8634010B2 (en) 2004-08-20 2014-01-21 Micron Technology, Inc. Redundancy in column parallel or row architectures
US20090244039A1 (en) * 2004-08-20 2009-10-01 Micron Technology, Inc. Redundancy in column parallel or row architectures
US8035718B2 (en) 2008-03-26 2011-10-11 Aptina Imaging Corporation Systems, methods, and devices for preventing shoot-through current within and between signal line drivers of semiconductor devices
US20090244344A1 (en) * 2008-03-26 2009-10-01 Micron Technology, Inc. Systems, methods, and devices for preventing shoot-through current within and between signal line drivers of semiconductor devices
US20130020465A1 (en) * 2011-07-20 2013-01-24 Lg Innotek Co., Ltd. Pixel, pixel array, image sensor including the same, and method for driving image sensor
US20150215556A1 (en) * 2012-04-12 2015-07-30 Taiwan Semiconductor Manufacturing Company, Ltd. Readout device with readout circuit
US9369652B2 (en) * 2012-04-12 2016-06-14 Taiwan Semiconductor Manufacturing Company, Ltd. Readout device with readout circuit
US20150189197A1 (en) * 2013-12-29 2015-07-02 Cista System Corp. Compact row decoder with multiple voltage support
US9609254B2 (en) * 2013-12-29 2017-03-28 Cista System Corp. Compact row decoder with multiple voltage support
CN104486557A (en) * 2014-12-29 2015-04-01 上海集成电路研发中心有限公司 System and method for inserting short frames in image capture process
WO2019106643A1 (en) * 2017-12-01 2019-06-06 Uti Limited Partnership Apparatus and method of imaging
CN109639993A (en) * 2018-12-28 2019-04-16 北京思比科微电子技术股份有限公司 Multiwindow exposal control method
US11423851B2 (en) * 2019-12-26 2022-08-23 Samsung Electronics Co., Ltd. Image sensor driving circuit including power switch and image sensor including the same
CN113660425A (en) * 2021-08-19 2021-11-16 维沃移动通信(杭州)有限公司 Image processing method and device, electronic equipment and readable storage medium

Also Published As

Publication number Publication date
US20110013045A1 (en) 2011-01-20
US9584739B2 (en) 2017-02-28

Similar Documents

Publication Publication Date Title
US9584739B2 (en) CMOS image sensor with processor controlled integration time
US7612817B2 (en) CMOS image sensor with noise cancellation
US7880775B2 (en) Image sensor with interleaved image output
US8054357B2 (en) Image sensor with time overlapping image output
US8174593B2 (en) Method and apparatus for detecting image darkening due to reset droop
US6377303B2 (en) Strobe compatible digital image sensor with low device count per pixel analog-to-digital conversion
US6330030B1 (en) Digital image sensor with low device count per pixel analog-to-digital conversion
US7002628B1 (en) Analog to digital converter with internal data storage
US20110058082A1 (en) CMOS Image Sensor with Noise Cancellation
US11503229B2 (en) Image sensor and imaging device including the same
US7250592B2 (en) Image sensor with improved sensitivity and method for driving the same
US11509843B2 (en) Low power shared image pixel architecture

Legal Events

Date Code Title Description
AS Assignment

Owner name: CANDELA MICROSYSTEMS, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TAY, HIOK NAM;REEL/FRAME:013866/0418

Effective date: 20030306

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION