US20090091554A1 - Correcting for ambient light in an optical touch-sensitive device - Google Patents

Correcting for ambient light in an optical touch-sensitive device

Info

Publication number
US20090091554A1
Authority
US
United States
Prior art keywords
pixels
field
ambient light
frame
state
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US11/868,466
Other versions
US8004502B2
Inventor
Nigel Keam
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microsoft Technology Licensing LLC
Original Assignee
Microsoft Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Microsoft Corp filed Critical Microsoft Corp
Assigned to MICROSOFT CORPORATION reassignment MICROSOFT CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: KEAM, NIGEL
Priority to US11/868,466 (granted as US8004502B2)
Priority to CA2698623A (granted as CA2698623C)
Priority to EP08836179.5A (granted as EP2206033B1)
Priority to CN2008801104246A (granted as CN101815979B)
Priority to JP2010528117A (granted as JP5134688B2)
Priority to KR1020107007059A (granted as KR101465835B1)
Priority to PCT/US2008/078515 (published as WO2009046154A2)
Publication of US20090091554A1
Publication of US8004502B2
Application granted
Assigned to MICROSOFT TECHNOLOGY LICENSING, LLC reassignment MICROSOFT TECHNOLOGY LICENSING, LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MICROSOFT CORPORATION
Legal status: Expired - Fee Related
Adjusted expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F 3/0421 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416 Control or interface arrangements specially adapted for digitisers
    • G06F 3/0418 Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment

Definitions

  • FIG. 13 shows a process flow of a method 1300 for performing an ambient light correction that takes into account the various factors described herein.
  • The method of FIG. 13 may be performed on a pixel-by-pixel basis, or in any other suitable manner.
  • Method 1300 first comprises, at 1302, acquiring image data frames, and then, at 1304, determining for an image data frame whether the global ambient is below a threshold value. This can be determined, for example, by subtracting the sum of the intensities of all pixels in a first field from the sum of the intensities of all pixels in a second field, and determining if the result of the calculation is below a threshold value.
  • If the global ambient is below the threshold value, method 1300 ends without performing any correction.
  • Otherwise, method 1300 comprises, at 1306, determining whether any motion is perceived in the intensity data. This may be performed, for example, by subtracting the intensity value for the pixel in the current frame (frame n) from the intensity value for the same pixel in frame n−2 (as the same pixel in frame n−1 has a different ambient exposure time). If the difference between these intensity values is sufficiently small, then it can be determined that the intensity data contains no motion information. In this case, a temporal-local correction that utilizes no spatial information may be performed, as indicated at 1308.
  • Otherwise, it can be determined that the pixel contains motion data (provided the frame rate has been corrected for any periodically fluctuating ambient light), and either a spatial or a temporal-spatial correction may be used, as indicated at 1310.
  • A spatial correction may be used where all spatial variation in a frame can be corrected with other information in the frame.
  • One example of a method for making this determination is as follows. First, if any pixels in row (i−1) of the sample matrix differ significantly from the pixels in the same column in row (i+1), there is spatial information that may be corrected via a temporal-spatial correction. Likewise, if any of the pixels in row (i) of the sample matrix minus the mean for row (i) differs significantly from the corresponding pixels in row (i−1) minus the mean for the pixels in row (i−1), then there is spatial information that may be corrected via a temporal-spatial correction. In other cases where there is perceived motion but these conditions are not met, a spatial correction may be used. Alternatively, either a spatial or a temporal-spatial correction may be used exclusive of the other where motion information is contained in a frame.
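  • A rough sketch of this selection logic is shown below; the thresholds and the 3×3 sample matrix in the example are illustrative choices rather than values from the disclosure:

```python
import numpy as np

def select_correction(pixel_n, pixel_n_minus_2, matrix, i,
                      motion_threshold=8, spatial_threshold=8):
    """Return which ambient correction to apply for row i of a sample matrix.

    pixel_n and pixel_n_minus_2 are the intensities of the pixel of interest
    in frames n and n-2 (read in the same field order); matrix is the sample
    matrix around that pixel and i is its row index within the matrix.
    """
    # 1306: no perceived motion, so a temporal-local correction (1308) is safe.
    if abs(float(pixel_n) - float(pixel_n_minus_2)) <= motion_threshold:
        return "temporal-local"
    m = matrix.astype(np.float64)
    # Spatial-information tests: rows i-1 and i+1 (same field) differ by column,
    row_diff = np.abs(m[i - 1] - m[i + 1]).max()
    # or row i and row i-1 differ once each row's mean is removed.
    mean_diff = np.abs((m[i] - m[i].mean()) - (m[i - 1] - m[i - 1].mean())).max()
    if row_diff > spatial_threshold or mean_diff > spatial_threshold:
        return "temporal-spatial"
    return "spatial"       # motion present, but no spatial information to exploit

# Example: motion against frame n-2, flat surroundings -> "spatial".
sample = np.array([[10, 10, 10], [14, 18, 14], [10, 10, 10]])
print(select_correction(pixel_n=18, pixel_n_minus_2=40, matrix=sample, i=1))
```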
  • The correction calculations and calculation selection routine described above may be performed in any suitable manner.
  • For example, an FPGA (as shown at 122 in FIG. 1) may be programmed to perform a plurality of different correction calculations simultaneously for each frame. Then, the best ambient value for each pixel in a frame may be selected based upon the specific temporal and local characteristics of that pixel, as described for method 1300. Alternatively, the best ambient calculation for a pixel may be determined before performing the correction, such that only one correction is performed for each pixel. It will be appreciated that these specific examples of how to perform an ambient correction from the intensity data integrated and collected are described only for the purpose of illustration, and are not intended to be limiting in any manner.
  • As noted above, the lengths of the integration periods t1 and/or t2 may be adjusted to prevent saturation in future frames. Further, a frame in which saturation is detected may also be processed in a manner to correct for the saturation. As an example, if saturated pixels are observed, it can be assumed that the saturated pixels are directly exposed to ambient light (as reflected light from the illuminant is generally not sufficiently intense to cause saturation). Therefore, in this situation, all light in the saturated region can be deemed to be ambient. Where saturation exists, a noise margin may exist around the saturated pixels.
  • In this case, a minimum ambient level may be determined for this region by setting one possible ambient using the near-saturation test, and another using a computed ambient as described above. The higher of these two values may then be used as the value to be subtracted from the image when correcting pixels in this region for ambient.
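  • A minimal sketch of this saturation handling, assuming an 8-bit sensor and an illustrative noise margin, might look like the following:

```python
import numpy as np

def ambient_near_saturation(frame, computed_ambient, saturation_level=255, margin=8):
    """Combine the near-saturation test with a computed ambient map.

    Pixels at (or within a noise margin of) the saturation level are treated
    as directly exposed to ambient, so their entire intensity is taken as
    ambient. For every pixel the higher of this estimate and the ambient
    computed by the field-difference methods is kept, so the subtraction does
    not under-correct in a saturated region.
    """
    f = frame.astype(np.float64)
    near_saturation = np.where(f >= saturation_level - margin, f, 0.0)
    return np.maximum(near_saturation, computed_ambient.astype(np.float64))
```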

Abstract

The correction of an image for ambient light in an optical touch-sensitive device is disclosed. For example, one disclosed embodiment comprises integrating a first field of pixels in an image data frame for a different duration of ambient light exposure than a second field of pixels in the image data frame. Intensity data is read from the first field of pixels and the second field of pixels, and an ambient light value is determined for one or more pixels in the image data frame from the intensity data. The ambient light value is then used to adjust one or more pixels of the data frame for ambient light.

Description

    BACKGROUND
  • Touch-sensitive devices may detect touch via several different mechanisms, including but not limited to optical, resistive, and capacitive mechanisms. Some optical touch-sensitive devices detect touch by capturing an image of a backside of a touch screen via an image sensor, and then processing the image to detect objects located on the screen. Such devices may include a light source within the device to illuminate the backside of the display screen such that objects on the screen reflect the incident light toward the image sensor, thereby allowing the object to be detected.
  • One difficulty that may be encountered with optical touch screen devices involves differentiating between external (ambient) light and light reflected from the light source within the device. Ambient light of sufficient brightness may be mistaken for an object touching the device, and therefore may degrade the performance of the device.
  • SUMMARY
  • Accordingly, various methods for correcting for ambient light in an optical touch-sensitive device are disclosed below in the Detailed Description. For example, one disclosed embodiment comprises integrating a first field of pixels in an image data frame for a different duration of ambient light exposure than a second field of pixels in the image data frame. Intensity data is read from the first field of pixels and the second field of pixels, and an ambient light value is determined for one or more pixels in the image data frame from the intensity data. The ambient light value is then used to adjust one or more pixels of the data frame for ambient light.
  • This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 shows an embodiment of an optical touch-sensitive device.
  • FIG. 2 shows a process flow depicting an embodiment of a method of correcting for ambient light in an optical touch-sensitive device.
  • FIG. 3 shows a timing diagram depicting an embodiment of a method for integrating and reading an image sensor in an optical touch-sensitive device.
  • FIG. 4 shows a schematic depiction of intensity data of two fields of pixels in adjacent image frames captured according to the method of FIG. 3.
  • FIG. 5 shows a schematic depiction of one embodiment of a method of determining an ambient light value from the intensity data of FIG. 4.
  • FIG. 6 shows a schematic depiction of another embodiment of a method of determining an ambient light value from the intensity data of FIG. 4.
  • FIG. 7 shows a schematic depiction of another embodiment of a method of determining an ambient light value from the intensity data of FIG. 4.
  • FIG. 8 shows a schematic depiction of another embodiment of a method of determining an ambient light value from the intensity data of FIG. 4.
  • FIG. 9 shows a schematic depiction of another embodiment of a method of determining an ambient light value from the intensity data of FIG. 4.
  • FIGS. 10A-D show a schematic depiction of another embodiment of a method of determining an ambient light value from the intensity data of FIG. 4.
  • FIG. 11 shows a timing diagram of an ambient light frequency compared to a frame rate prior to correction of the frame rate.
  • FIG. 12 shows a timing diagram of an ambient light frequency compared to a frame rate after correction of the frame rate.
  • FIG. 13 shows a process flow depicting an embodiment of a method of correcting for ambient light in an optical touch-sensitive device.
  • DETAILED DESCRIPTION
  • Prior to discussing the correction of an image in an optical touch-sensitive device for ambient light, an embodiment of one suitable use environment is described. FIG. 1 shows a schematic depiction of an embodiment of an interactive display device 100 utilizing an optical touch-sensing mechanism. Interactive display device 100 comprises a projection display system having an image source 102, and a display screen 106 onto which images are projected. While shown in the context of a projection display system, it will be appreciated that the embodiments described herein may also be implemented with other suitable display systems, including but not limited to LCD panel systems.
  • Image source 102 includes an optical or light source 108 such as a lamp (depicted), an LED array, or other suitable light source. Image source 102 also includes an image-producing element 110 such as the depicted LCD (liquid crystal display), an LCOS (liquid crystal on silicon) display, a DLP (digital light processing) display, or any other suitable image-producing element.
  • Display screen 106 includes a clear, transparent portion 112, such as a sheet of glass, and a diffuser screen layer 114 disposed on top of the clear, transparent portion 112. In some embodiments, an additional transparent layer (not shown) may be disposed over diffuser screen layer 114 to provide a smooth look and feel to the display surface. Further, in embodiments that utilize an LCD panel rather than a projection image source to display images on display screen 106, the diffuser screen layer 114 may be omitted.
  • Continuing with FIG. 1, interactive display device 100 further includes an electronic controller 116 comprising memory 118 and a microprocessor 120. Controller 116 may further include a field programmable gate array (FPGA) 122, and/or any other suitable electronic components, including application-specific integrated circuits (ASICs) (not shown), digital signal processors (DSPs) (not shown), etc. configured to conduct one or more ambient light correction calculations, as described below. While shown as part of controller 116, it will be appreciated that FPGA 122 and/or other electronic components may also be provided as one or more separate devices in electrical communication with controller 116. It will further be understood that memory 118 may comprise instructions stored thereon that are executable by the processor 120 to control the various parts of device 100 to effect the methods and processes described herein. Likewise, the FPGA 122 also may be configured to perform one or more of the correction methods described in detail below.
  • To sense objects placed on display screen 106, interactive display device 100 includes an image sensor 124 configured to capture an image of the entire backside of display screen 106, and to provide the image to electronic controller 116 for the detection of objects appearing in the image. Diffuser screen layer 114 helps to avoid the imaging of objects that are not in contact with or positioned within a few millimeters of display screen 106, and therefore helps to ensure that only objects that are touching or in close proximity to display screen 106 are detected by image sensor 124.
  • Image sensor 124 may include any suitable image sensing mechanism. Examples of suitable image sensing mechanisms include but are not limited to CCD and CMOS image sensors. Further, the image sensing mechanisms may capture images of display screen 106 at a sufficient frequency to detect motion of an object across display screen 106. While the embodiment of FIG. 1 shows one image sensor, it will be appreciated that more than one image sensor may be used to capture images of display screen 106.
  • Image sensor 124 may be configured to detect reflected or emitted energy of any suitable wavelength, including but not limited to infrared and visible wavelengths. To assist in detecting objects placed on display screen 106, image sensor 124 may further include an illuminant 126 such as one or more light emitting diodes (LEDs) 126 configured to produce infrared or visible light. Light from illuminant 126 may be reflected by objects placed on display screen 106 and then detected by image sensor 124. The use of infrared LEDs as opposed to visible LEDs may help to avoid washing out the appearance of images projected on display screen 106. Further, an infrared bandpass filter 127 may be utilized to pass light of the frequency emitted by the illuminant 126 but prevent light at frequencies outside of the bandpass frequencies from reaching image sensor 124.
  • FIG. 1 also depicts an object 130 placed on display screen 106. Light from the illuminant 126 reflected by object 130 may be detected by image sensor 124, thereby allowing the object 130 to be detected on the screen. Object 130 represents any object that may be in contact with display screen 106, including but not limited to fingers, brushes, optically readable tags, etc.
  • In some use environments, ambient light sources may emit light in the band passed by bandpass filter 127. The term “ambient light” is used herein to describe light other than light from the illuminant 126. Examples of such ambient light sources include but are not limited to broad-spectrum light sources such as sunlight, incandescent lamp light, etc. Such light may have a sufficient intensity at the bandpass frequencies that the ambient light is difficult to distinguish from reflected light from the illuminant 126. Therefore, such ambient may cause the interactive display device 100 to mistakenly identify ambient light as an object on the display screen 106.
  • Various techniques may be used to cancel or otherwise correct for ambient light in an image captured by image sensor 124. For example, the illuminant 126, which may be referred to as “local” light, could be strobed such that alternate frames are exposed to “ambient-only” and “ambient+local” light. This allows the ambient light intensity to be determined by subtracting the “ambient-only” frame from the “ambient+local” frame to correct for ambient. However, because the local light is turned on only every other frame, this effectively cuts the frame rate of the device in half.
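  • For illustration, a minimal sketch of the alternate-frame strobing correction described above (not the device's actual implementation) is shown below; the entire technique is a frame subtraction:

```python
import numpy as np

def correct_with_strobed_frames(ambient_plus_local, ambient_only):
    """Subtract an 'ambient-only' frame from an 'ambient+local' frame.

    Both arguments are 2-D intensity arrays captured with the illuminant on
    and off, respectively. The result is the reflected (local) light image,
    clipped so that sensor noise cannot produce negative intensities. Note
    that only every other frame carries local light, which is why this
    scheme halves the effective frame rate.
    """
    diff = ambient_plus_local.astype(np.int32) - ambient_only.astype(np.int32)
    return np.clip(diff, 0, None).astype(ambient_plus_local.dtype)
```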
  • Another potential technique is to utilize a separate sensor (possibly with an optical filter) configured to integrate only ambient light. However, the use of an additional sensor may be expensive, and may be prone to errors due to the different positioning of the sensors in the device. Yet another potential technique may be to utilize an extremely bright local light source in combination with a band-pass filter to boost the intensity of reflected light relative to ambient light. However, this approach may be susceptible to failure where the ambient light exceeds some percentage of local light.
  • FIG. 2 shows a flow diagram of a method for correcting an image for ambient light that may help to avoid the problems found in the methods described above. Method 200 first comprises, at 202, integrating first and second fields of pixels in an image sensor for different durations of ambient light exposure. Generally, this also involves integrating the first and second fields of pixels for an equal (or approximately equal) duration of ambient+local exposure such that both fields have been exposed to a similar duration of local light but different durations of ambient light. The two fields may be interlaced fields (for example, odd/even rows or odd/even columns of pixels), or may have any other suitable spatial relationship. Furthermore, in some embodiments, three or more fields of pixels may be exposed to different intervals of ambient light.
  • Next, method 200 comprises, at 204, reading intensity data from the first and second fields of pixels in the image sensor, and determining, at 206, a measure of ambient light from the difference between the intensity data from the first and second fields of pixels. Finally, method 200 comprises, at 208, adjusting the image data to correct for the ambient light based upon the determined measure of ambient light. In some embodiments, the image data may be adjusted only if it is first determined, at 210, if the ambient light measure is over a threshold value. Further, in some embodiments, it is first determined whether an object detected on the screen is in motion, as indicated at 212, and then an adjustment to make to the image data is selected at 214 based upon whether the object is determined to be in motion. Each of these decisions is discussed in more detail below.
  • Compared to other methods of correcting for ambient light, method 200 allows a correction for ambient light to be made to image data without the use of an additional image sensor or other additional parts, and also without any loss of frame rate. Each of the processes shown in method 200 is described in more detail below.
  • The integration of the first and second fields of pixels to different durations of ambient light but similar durations of local light in a single frame may be performed in any suitable manner. FIG. 3 illustrates a timing diagram 300 showing one manner of operating the illuminant 126 (shown in FIG. 3 as an LED) and the image sensor 124 to accomplish this. First, the illuminant 126 is operated in an off/on strobe pattern shown at 302 such that it is in an “on” state for a portion of an image frame and an “off” state for a portion of the image frame. According to this pattern, the illuminant 126 is in an “on” state for somewhat less than one-half the period of one frame. The sensor is then globally exposed (i.e. all fields are exposed) such that it integrates each frame for a period t1 during which the illuminant 126 is on (i.e. an “on” state interval) and a period t2 during which the illuminant 126 is off (i.e. an “off” state interval). The exposure pattern is shown in FIG. 3 at 304.
  • Using the LED strobe pattern and image sensor integration pattern shown in FIG. 3, the readout of the pixels of image data from the image sensor can be controlled such that different fields of pixels have different levels of ambient exposure in each frame. FIG. 3 shows this readout pattern in the context of odd/even rows of pixels, but it will be understood that the fields of pixels may have any other suitable spatial relationship to one another. Further, while disclosed herein in the context of utilizing two fields of pixels, it will be appreciated that the concepts also may be utilized with three or more fields of pixels.
  • Referring to the readout pattern shown at 306 and the frame identifier indicator shown at 308, the data for a first image data frame n is read out first from the odd field of pixels, and then from the even field of pixels. At the time of read-out, the image sensor pixels are reset to an unexposed state. Therefore, at the time the odd field of pixels of frame n is read out, the odd field of pixels has integrated light for a period of t1 (i.e. since the beginning of the last odd pixel readout shown in pattern 306). On the other hand, at the time the even field of pixels of frame n is read out, the even field of pixels has integrated light for a t2 period, then a t1 period, and then another t2 period, for a total of t1+2(t2). The cumulative integration times for each frame are shown in FIG. 3 at 310.
  • Continuing with FIG. 3, the data for a next image data frame n+1 is read out first from the even field of pixels, and then from the odd field of pixels. Therefore, in frame n+1, the even field of pixels has integrated light for only a t1 period at the time of readout, while the odd field of pixels has integrated light for a period of t1+2(t2). As can be seen in FIG. 3, the odd field has a greater ambient exposure than the even field in frame n+1, while the even field has a greater ambient exposure than the odd field in frame n. It will further be noted that the image sensor integrates local light for a period of t1 in each frame. Thus, each frame has image data that can be used to identify objects on display screen 106, thereby allowing the frame rate to be preserved.
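  • The timing arithmetic described above can be summarized in a short sketch; mapping even frame indices to "odd field read first" and the example values of t1 and t2 are arbitrary illustrations:

```python
def ambient_exposures(frame_index, t1, t2):
    """Return (odd_field, even_field) ambient integration times for one frame.

    The field read out first in a frame has integrated light for t1 only; the
    field read out second has integrated for t1 + 2*t2. The readout order
    alternates every frame, so the long/short roles swap as well, while local
    (illuminant) light is integrated for t1 by both fields in every frame.
    """
    short, long_ = t1, t1 + 2 * t2
    if frame_index % 2 == 0:      # e.g. frame n: odd field read out first
        return short, long_
    return long_, short           # e.g. frame n+1: even field read out first

# With t1 = 4 ms and t2 = 2 ms: frame n -> (4, 8) ms, frame n+1 -> (8, 4) ms.
for i in range(2):
    print(i, ambient_exposures(i, 4.0, 2.0))
```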
  • The periods t1 and t2 may have any suitable lengths, and may be the same or different from one another. In some use environments, t2 may be shorter than t1 to reduce the chance that the image sensor will saturate before reading, as saturation of the image sensor may lead to inaccurate calculations of ambient intensities. Further, where saturated pixels are detected, the length of t1 and/or t2 may be modified to reduce the total integration time of a frame to avoid saturation in future frames. Likewise, where intensities are low, t1 and/or t2 may be increased to increase the amount of light integrated in future frames. Alternately or additionally, the gain on the image sensor may be adjusted dynamically to avoid saturation and/or to increase the response of the sensor to an amount of light exposure. Correcting an image for saturation is discussed in more detail below.
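  • A simple exposure-control sketch along these lines is shown below; the thresholds, fractions, and scaling step are illustrative values rather than parameters from the disclosure:

```python
import numpy as np

def adjust_integration(frame, t1, t2, saturation_level=255, low_level=32,
                       saturated_fraction=0.01, dim_fraction=0.95, step=0.8):
    """Shorten or lengthen the integration periods for future frames.

    If too many pixels are saturated, t1 and t2 are scaled down; if nearly
    all pixels are dim, they are scaled up. Image sensor gain could be
    adjusted in the same way. Returns the new (t1, t2) pair.
    """
    f = np.asarray(frame)
    if (f >= saturation_level).mean() > saturated_fraction:
        return t1 * step, t2 * step      # reduce total integration time
    if (f <= low_level).mean() > dim_fraction:
        return t1 / step, t2 / step      # integrate more light next frame
    return t1, t2
```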
  • In other situations, it may be advantageous for t1 and t2 to have similar lengths. For example, where ambient light is fluctuating in intensity (i.e. incandescent light fluctuating at twice a line frequency of 50 or 60 Hz), the average incident ambient light strength will be different during the t2 phases compared to the t1 phases for at least some image frames (depending upon the frame rate compared to the fluctuation frequency). Thus, t2 may be adjusted to have an approximately equal length to t1 when fluctuating ambient light is detected.
  • By following the timing diagram in FIG. 3, the two fields of pixels in each image frame are integrated for different periods of ambient light exposure. Further, each single field of pixels has different periods of ambient light exposure in adjacent image frames. These differences in intra-frame and inter-frame ambient light exposure may be utilized in various ways to correct an image frame for ambient light.
  • In order to illustrate various ambient correction methods, a representative group of intensity data from two image frames, labeled frames n−1 and n, are described with reference to FIG. 4. Specifically, FIG. 4 illustrates how the readout from the sensor for the two image frames, which show a stationary scene, may appear when integrated and read according to the process shown in FIG. 3. First, a simple stationary scene with no ambient light is shown at 402, and a 3×3 matrix of pixels from scene 402 is shown at 404. For the purpose of simplicity, the images in FIG. 4 have only three intensity levels, wherein the lightest pixels signify the most integrated light and the darkest pixels signify the least integrated light.
  • In frame n−1, the odd rows have a greater interval of ambient exposure than the even rows. The addition of this ambient pattern to the 3×3 scene yields the intensity data shown at 406. Likewise, in frame n, the even rows have a greater interval of ambient exposure than the odd rows. The addition of this ambient pattern to the 3×3 scene yields the intensity data shown at 408. Referring next to FIG. 5, the ambient light can be calculated for the odd rows by subtracting frame n from frame n−1 (as shown at 502), and for the even rows by subtracting frame n−1 from frame n (as shown at 504). Combining the ambient determined for the odd rows with the ambient determined for the even rows yields an overall ambient 506 for the 3×3 matrix.
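  • The row-wise differencing of FIG. 5 can be written out directly for a stationary scene; mapping the figure's "odd rows" to 0-based row indices 0, 2, 4 is an assumption of this sketch:

```python
import numpy as np

def ambient_from_frame_pair(frame_prev, frame_cur):
    """Combine two adjacent frames of a stationary scene into an ambient map.

    Rows with the longer ambient exposure in frame n-1 are differenced as
    frame(n-1) - frame(n); the remaining rows as frame(n) - frame(n-1),
    following FIG. 5. "Odd rows" are taken to be 0-based indices 0, 2, 4, ...
    """
    prev = frame_prev.astype(np.int32)
    cur = frame_cur.astype(np.int32)
    ambient = np.empty_like(prev)
    odd = np.arange(prev.shape[0]) % 2 == 0   # the "odd rows" of the figure
    ambient[odd] = prev[odd] - cur[odd]       # exposed longer in frame n-1
    ambient[~odd] = cur[~odd] - prev[~odd]    # exposed longer in frame n
    return np.clip(ambient, 0, None)

# 3x3 example on the schematic scale of FIGS. 4-5 (scene plus an ambient of 2):
frame_prev = np.array([[3, 3, 3], [1, 2, 1], [3, 3, 3]])
frame_cur  = np.array([[1, 1, 1], [3, 4, 3], [1, 1, 1]])
print(ambient_from_frame_pair(frame_prev, frame_cur))   # all entries equal 2
```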
  • FIGS. 6-8 show examples of various methods that may be used to correct an image frame for ambient light with the image data shown in FIG. 4. These figures are shown in the context of determining ambient for a single pixel at a time. This may allow different ambient calculation methods to be used for different pixels depending upon pixel-specific factors.
  • First referring to FIG. 6, an ambient light value at a pixel (for example, the center pixel of the 3×3 matrix shown in FIGS. 4-5) may be calculated as described above for FIG. 5 by simply subtracting frame n−1 from frame n. Likewise, ambient for pixels in the top and bottom rows of the 3×3 matrix may be determined simply by subtracting frame n from frame n−1. This method utilizes information from temporally adjacent frames but does not utilize information from spatially adjacent pixels. Therefore, the method illustrated in FIG. 6 may be referred to herein as a “temporal-local” correction. However, due to the sensor readout pattern shown in FIG. 3, after subtraction of ambient, the intensity at that pixel is the same as in an adjacent frame. Thus, the temporal-local correction may effectively halve the frame rate of the device. For this reason, this correction may be used for stationary objects.
  • FIG. 7 shows another example of a method for correcting an image frame for ambient light. As opposed to that shown in FIG. 6, the method shown in FIG. 7 takes into account both temporal information (i.e. temporally adjacent image frames) and spatial information (i.e. spatially adjacent pixels) when calculating the ambient for a pixel. Therefore, the method shown in FIG. 7 may be referred to as a “temporal-spatial” correction. While shown in the context of a 3×3 matrix, it will be appreciated that the concepts shown in FIG. 7, as well as FIG. 8, may be applied to a matrix of any size of pixels and any shape/pattern around the pixel of interest, including but not limited to 5×5 and 7×7 matrices, as well as other shapes (such as a cross-shaped matrix formed by omitting each corner pixel from a 5×5 matrix).
  • The temporal-spatial correction shown in FIG. 7 utilizes a weighted average intensity of the pixels in the sample matrix to determine an ambient value, wherein the center pixel is weighted more strongly (¼) than the side pixels (⅛ each), which are in turn weighted more strongly than the corner pixels. To perform the correction, the intensities of the pixels are multiplied by the shown weighting factors, the two frames are added, and then the value at each pixel in the matrix after the addition of the two frames is summed to yield the ambient intensity at the center pixel. Because spatial data is taken into account in addition to temporal data, the temporal-spatial correction allows a frame rate to be maintained.
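  • One plausible reading of the temporal-spatial correction is sketched below. The corner weight of 1/16 and the sign convention (each neighbor contributes its own two-frame ambient difference, which is then combined with the stated weights) are assumptions, since FIG. 7 itself is not reproduced here:

```python
import numpy as np

# Assumed 3x3 weights: center 1/4, sides 1/8 each, corners 1/16 each (sum = 1).
WEIGHTS = np.array([[1, 2, 1],
                    [2, 4, 2],
                    [1, 2, 1]], dtype=float) / 16.0

def temporal_spatial_ambient(frame_prev, frame_cur, row, col):
    """Estimate ambient at (row, col) from a 3x3 neighborhood of two frames.

    Each neighbor's two-frame difference is oriented so that it isolates the
    extra ambient exposure of that pixel's field (as in FIG. 5), and the
    differences are combined using the weighting factors above.
    """
    prev = frame_prev.astype(np.float64)
    cur = frame_cur.astype(np.float64)
    total = 0.0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            r, c = row + dr, col + dc
            # Even 0-based rows stand in for the "odd rows" exposed longer in
            # frame n-1; the other rows were exposed longer in frame n.
            diff = prev[r, c] - cur[r, c] if r % 2 == 0 else cur[r, c] - prev[r, c]
            total += WEIGHTS[dr + 1, dc + 1] * diff
    return total

# Reusing the FIG. 4-style example: the estimate at the center pixel is 2.0.
frame_prev = np.array([[3, 3, 3], [1, 2, 1], [3, 3, 3]])
frame_cur  = np.array([[1, 1, 1], [3, 4, 3], [1, 1, 1]])
print(temporal_spatial_ambient(frame_prev, frame_cur, 1, 1))
```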
  • FIG. 8 shows another example of a method for correcting an image frame for ambient light. As opposed to the methods shown in FIGS. 6 and 7, the method of FIG. 8 utilizes only spatial information, and not temporal information, in making the ambient correction. In other words, the correction is made entirely from a weighted average of intra-frame data, utilizing no inter-frame data. As depicted, this calculation may lead to slightly high values of ambient light, but can avoid calculation problems due to motion that may occur in methods that utilize temporal information.
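  • A minimal intra-frame sketch in the same spirit is shown below; it uses a plain average of the two vertically adjacent pixels in place of the weighting of FIG. 8, which is not reproduced here:

```python
import numpy as np

def spatial_ambient_at(frame, row, col):
    """Estimate ambient at (row, col) using only data from the same frame.

    The pixel is compared with the mean of its vertical neighbors, which
    belong to the other field and so carry a different ambient exposure; the
    absolute difference approximates the extra integrated ambient. Scene
    structure that changes from row to row leaks into the estimate, which is
    why a purely spatial correction can read slightly high.
    """
    f = frame.astype(np.float64)
    neighbors = []
    if row > 0:
        neighbors.append(f[row - 1, col])
    if row + 1 < f.shape[0]:
        neighbors.append(f[row + 1, col])
    return abs(f[row, col] - float(np.mean(neighbors)))

frame_cur = np.array([[1, 1, 1], [3, 4, 3], [1, 1, 1]])
print(spatial_ambient_at(frame_cur, 1, 1))   # 3.0: true ambient 2 plus scene leakage
```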
  • In some embodiments, it may be determined whether the global ambient light exceeds a predetermined threshold level before performing any of the above ambient correction methods. Where ambient light is of sufficiently low intensity or is absent, the touch-sensitive device may be able to detect objects without any problems caused by ambient. Therefore, before performing any of the above-described corrections (or any others), it may be determined whether there is any potentially problematic ambient by comparing the sum of the intensities in the first field in a frame to the sum of the intensities in the second field in the frame. Because the intensities in the two fields differ by the amount of ambient light integrated, if the sums are relatively close together, it can be determined that the ambient light levels are sufficiently low not to interfere with device operation, and correction for ambient may be omitted, as shown in FIG. 9.
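  • This global check reduces to comparing two field sums against a device-specific threshold, for example:

```python
import numpy as np

def ambient_correction_needed(frame, threshold, first_field_starts_at_row0=True):
    """Compare the summed intensities of a frame's two fields (FIG. 9 check).

    The fields integrate the same local light but different amounts of
    ambient, so a small difference between the field sums means ambient is
    low enough that correction can be skipped. The threshold is
    device-specific and illustrative here.
    """
    f = frame.astype(np.int64)
    start = 0 if first_field_starts_at_row0 else 1
    first_field_sum = f[start::2].sum()
    second_field_sum = f[1 - start::2].sum()
    return abs(first_field_sum - second_field_sum) > threshold
```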
  • FIGS. 10A-D illustrate another embodiment of a method for correcting for ambient. Referring first to FIG. 10A, a 5×5 region of pixels in a current frame (frame n) and a single pixel in each of two prior frames (frames n−1, n−2) are considered for the ambient correction. However, it will be appreciated that a 3×3 region of pixels, or any other suitable region of pixels, in a current frame may instead be considered. The center pixel of the region in the current frame is compared to the corresponding pixel from frame n−2, which was read in the same field order. If the difference between these pixels exceeds a threshold amount, this indicates that motion may have occurred, and a “motion flag” for that pixel is set. The value of the motion flag is compared to the motion flags for nearby pixels (for example, via a Boolean “OR” operation), and if the result is zero (i.e. frame n−2 and frame n look the same in a local region), then a temporal ambient correction is performed by determining the difference between the current center pixel in frame n and the same pixel in frame n−1, as indicated in FIG. 10C.
  • On the other hand, if the OR operation with adjacent motion flags results in a value of 1, this indicates that there has been some nearby motion in this frame. In this case, prior frames may be ignored for the ambient correction, and a spatial correction utilizing adjacent pixels in frame n is performed. Any suitable weighting factor scheme may be used to perform this spatial correction. FIG. 10D shows one non-limiting example of a suitable weighting factor scheme for a 5×5 pixel spatial correction.
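A sketch of this per-pixel selection between temporal and spatial estimates is given below. The motion-flag neighborhood, the simple two-row spatial fallback (standing in for the FIG. 10D weighting), the sign conventions, and all names are assumptions for illustration.

```python
import numpy as np

def correct_pixel(frame_n: np.ndarray, frame_n1: np.ndarray, frame_n2: np.ndarray,
                  row: int, col: int, motion_threshold: float) -> float:
    """Ambient-correct one pixel, choosing a temporal or spatial estimate."""
    # Motion flags over a 3x3 neighborhood: compare frame n to frame n-2,
    # which was read with the same field order, and OR the per-pixel flags.
    patch_n = frame_n[row - 1:row + 2, col - 1:col + 2]
    patch_n2 = frame_n2[row - 1:row + 2, col - 1:col + 2]
    motion_nearby = bool(np.any(np.abs(patch_n - patch_n2) > motion_threshold))

    if not motion_nearby:
        # No local motion: temporal estimate from the immediately prior frame
        # (assumes this pixel integrated the longer ambient interval in frame n).
        ambient = float(frame_n[row, col] - frame_n1[row, col])
    else:
        # Nearby motion: ignore prior frames and estimate ambient spatially
        # within frame n (a two-row average stands in for the FIG. 10D weights).
        ambient = 0.5 * (frame_n[row - 1, col] + frame_n[row + 1, col]) - frame_n[row, col]
    return float(frame_n[row, col]) - max(ambient, 0.0)
```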
  • The determination of whether to utilize a 5×5 or a 3×3 pixel region for ambient correction may depend upon factors such as the resolution and stability of the image sensor. For example, a 3×3 region may yield a slightly noisier result, while a 5×5 region may blur the result slightly. Other region sizes may be used, including but not limited to a 1×3 region (which may be noisier than a 3×3 region).
  • Some ambient sources may fluctuate in a periodic manner. For example, electric lighting generally fluctuates at twice the line frequency, which may be either 50 or 60 Hz depending upon location. This is illustrated graphically in FIG. 11. Where the frame rate of the device has a frequency other than the line frequency or 2× the line frequency, the ambient light detected by the image sensor will have a beat frequency detectable as a periodic variation in the ambient light level. Such variations in ambient light levels may cause problems when temporal information is used in correcting for ambient. These problems may be pronounced where the ambient light fluctuations are large between adjacent frames, which may occur if a frame rate of 100 Hz is used in the presence of 120 Hz ambient, or vice versa. For the purpose of simplicity, FIG. 11 shows only a single integration period (t1) per frame, but it will be understood that similar problems would be encountered with the use of multiple integration periods per frame, as shown in FIG. 3.
  • To prevent such problems caused by fluctuating ambient light levels, the frame rate of an optical touch-sensitive device may be set to equal the line frequency or 2× the line frequency. The line frequency may be stored in the system, derived from a local power source, or detected optically. The fluctuation may be detected optically by observing a beat frequency in the overall levels of detected light, or by monitoring frame-to-frame variation in the total amount of ambient light measured. If the detected ambient light fluctuation frequency is not the same as the frame rate of the device, the frame rate can then be adjusted so that it matches the ambient light frequency, as shown in FIG. 12.
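The following sketch illustrates one way the frame rate could be matched to the ambient fluctuation: estimate the beat frequency from the total ambient measured in each of a run of frames, then snap to the nearest supported rate. The FFT-based detection and the candidate rates are assumptions for illustration.

```python
import numpy as np

def choose_frame_rate(ambient_totals: np.ndarray, current_fps: float,
                      candidate_fps=(50.0, 60.0, 100.0, 120.0)) -> float:
    """Pick a frame rate matching the ambient fluctuation, given per-frame ambient totals."""
    samples = ambient_totals - ambient_totals.mean()
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / current_fps)
    beat = float(freqs[int(np.argmax(spectrum))])  # dominant beat frequency, in Hz
    if beat < 1.0:
        return current_fps  # no significant beat: the rate already matches
    # The ambient frequency differs from the current frame rate by the beat,
    # so consider both possibilities and choose the nearest supported rate.
    estimates = (current_fps + beat, current_fps - beat)
    return min(candidate_fps, key=lambda f: min(abs(f - e) for e in estimates))
```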
  • FIG. 13 shows a process flow of a method 1300 for performing an ambient light correction that takes into account the various factors described above. The method of FIG. 13 may be performed on a pixel-by-pixel basis, or in any other suitable manner. Method 1300 first comprises, at 1302, acquiring image data frames, and then, at 1304, determining for an image data frame whether the global ambient is below a threshold value. This can be determined, for example, by subtracting the sum of the intensities of all pixels in a first field from the sum of the intensities of all pixels in a second field, and determining whether the result of the calculation is below a threshold value.
  • If the global ambient is below a threshold value, then method 1300 ends without performing any correction. On the other hand, if the global ambient is not below a threshold value, then method 1300 comprises, at 1306, determining whether any motion is perceived in the intensity data. This may be performed, for example, by subtracting the intensity value for the pixel in the current frame (frame n) from the intensity value for the same pixel in frame n−2 (as the same pixel in frame n−1 has a different ambient exposure time). If the difference between these intensity values is sufficiently small, then it can be determined that the intensity data contains no motion information. In this case, a temporal-local correction that utilizes no spatial information may be performed, as indicated at 1308. On the other hand, if the difference between the intensity values is sufficiently large, it can be assumed that the pixel contains motion data (as long as the frame rate has been corrected for any periodically fluctuating ambient light), and either a spatial or a temporal-spatial correction may be used, as indicated at 1310.
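Putting the above steps together, a compact per-pixel sketch of the decision flow of method 1300 might look as follows; the thresholds, the even/odd-row field assumption, and the simple spatial fallback are illustrative and not taken from the disclosure.

```python
import numpy as np

def ambient_for_pixel(frame_n: np.ndarray, frame_n1: np.ndarray, frame_n2: np.ndarray,
                      row: int, col: int,
                      global_threshold: float, motion_threshold: float) -> float:
    """Ambient estimate at (row, col) following the general flow of method 1300."""
    # 1304: compare field sums; if they are close, global ambient is low and
    # no correction is applied.
    if abs(frame_n[0::2, :].sum() - frame_n[1::2, :].sum()) < global_threshold:
        return 0.0
    # 1306: compare against frame n-2 (same ambient exposure) to detect motion.
    if abs(float(frame_n[row, col]) - float(frame_n2[row, col])) < motion_threshold:
        # 1308: temporal-local correction using no spatial information.
        return float(frame_n[row, col] - frame_n1[row, col])
    # 1310: motion present, so use a spatial (or temporal-spatial) correction;
    # a two-row average stands in here for the weighting schemes described above.
    return 0.5 * (frame_n[row - 1, col] + frame_n[row + 1, col]) - float(frame_n[row, col])
```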
  • The decision whether to utilize a spatial or a temporal-spatial correction may be made in any suitable manner. Generally, a spatial correction may be used where all spatial variation in a frame can be corrected with other information in the frame. One example of a method for making this determination is as follows. First, if any pixels in row (i−1) of the sample matrix differ significantly from the pixels in the same columns in row (i+1), there is spatial information that may be corrected via a temporal-spatial correction. Likewise, if any of the pixels in row (i) of the sample matrix minus the mean for row (i) differ significantly from the corresponding pixels in row (i−1) minus the mean for row (i−1), then there is spatial information that may be corrected via a temporal-spatial correction. In other cases where there is perceived motion but these conditions are not met, a spatial correction may be used. Alternatively, either a spatial or a temporal-spatial correction may be used to the exclusion of the other wherever motion information is contained in a frame.
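The two tests just described can be sketched directly; `matrix` is the sample matrix around the pixel of interest, `i` is the index of its center row, and the significance threshold is an assumption.

```python
import numpy as np

def has_spatial_information(matrix: np.ndarray, i: int, threshold: float) -> bool:
    """True when a temporal-spatial correction is indicated for this sample matrix."""
    # Test 1: pixels in row (i-1) differ significantly from the pixels in the
    # same columns of row (i+1).
    if np.any(np.abs(matrix[i - 1, :] - matrix[i + 1, :]) > threshold):
        return True
    # Test 2: row (i) with its mean removed differs significantly from
    # row (i-1) with its mean removed.
    centered_i = matrix[i, :] - matrix[i, :].mean()
    centered_above = matrix[i - 1, :] - matrix[i - 1, :].mean()
    return bool(np.any(np.abs(centered_i - centered_above) > threshold))
```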
  • The correction calculations and calculation selection routine described above may be performed in any suitable manner. For example, in one embodiment, an FPGA (as shown at 122 in FIG. 1) may be programmed to perform a plurality of different correction calculations simultaneously for each frame. Then, the best ambient value for each pixel in a frame may be selected based upon the specific temporal and local characteristics of that pixel, as described above for method 1300. Alternatively, the best ambient calculation for a pixel may be determined before performing the correction, such that only one correction is performed for each pixel. It will be appreciated that these specific examples of how to perform an ambient correction from the intensity data integrated and collected are described only for the purpose of illustration, and are not intended to be limiting in any manner.
  • As described above, where saturation of the image sensor is detected, the lengths of the integration periods t1 and/or t2 may be adjusted to prevent saturation in future frames. Further, a frame in which saturation is detected may also be processed in a manner to correct for the saturation. As an example, if saturated pixels are observed, it can be assumed that the saturated pixels are directly exposed to ambient light (as reflected light from the illuminant is generally not sufficiently intense to cause saturation). Therefore, in this situation, all light in the saturated region can be deemed to be ambient. Where saturation exists, a noise margin may exist around the saturated pixels. To avoid discontinuities in the corrected image in the noise margin region, a minimum ambient level may be determined for this region by setting one possible ambient using the near-saturation test, and another using a computed ambient as described above. The higher of these two values may then be used as the value to be subtracted from the image when correcting pixels in this region for ambient.
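A sketch of this saturation handling is shown below. The saturation level, the width of the noise margin, and the use of the measured value minus the margin as the near-saturation estimate are assumptions for illustration.

```python
import numpy as np

def ambient_with_saturation(frame: np.ndarray, computed_ambient: np.ndarray,
                            saturation_level: float, margin: float) -> np.ndarray:
    """Combine a computed ambient estimate with saturation-based estimates."""
    ambient = computed_ambient.copy()
    saturated = frame >= saturation_level
    near_saturated = (frame >= saturation_level - margin) & ~saturated
    # Saturated pixels are assumed to be directly exposed to ambient, so all
    # light detected there is treated as ambient.
    ambient[saturated] = frame[saturated]
    # In the noise margin around saturation, use the higher of the
    # near-saturation estimate and the normally computed ambient to avoid
    # discontinuities in the corrected image.
    near_estimate = frame[near_saturated] - margin
    ambient[near_saturated] = np.maximum(computed_ambient[near_saturated], near_estimate)
    return ambient
```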
  • While disclosed herein in the context of an interactive display device, it will be appreciated that the disclosed embodiments may also be used in any other suitable optical touch-sensitive device, as well as in any other touch-sensitive device in which a background signal correction may be performed to improve device performance.
  • It will further be understood that the configurations and/or approaches described herein are exemplary in nature, and that these specific embodiments or examples are not to be considered in a limiting sense, because numerous variations are possible. The specific routines or methods described herein may represent one or more of any number of processing strategies. As such, various acts illustrated may be performed in the sequence illustrated, in other sequences, in parallel, or in some cases omitted. Likewise, the order of any of the above-described processes is not necessarily required to achieve the features and/or results of the embodiments described herein, but is provided for ease of illustration and description.
  • The subject matter of the present disclosure includes all novel and nonobvious combinations and subcombinations of the various processes, systems and configurations, and other features, functions, acts, and/or properties disclosed herein, as well as any and all equivalents thereof.

Claims (20)

1. A method of correcting an image for ambient light in an optical touch-sensitive device, the optical touch-sensitive device comprising a screen, a light source and an image sensor with two or more fields of pixels, the method comprising:
integrating a first field of pixels in an image data frame for a different duration of ambient light exposure than a second field of pixels in the image data frame;
reading intensity data from the first field of pixels and the second field of pixels;
determining an ambient light value for one or more pixels in the image data frame from the intensity data; and
adjusting one or more pixels of the data frame for ambient light based upon the ambient light value.
2. The method of claim 1, wherein the first field of pixels comprises even rows or even columns of pixels, and wherein the second field of pixels comprises odd rows or odd columns of pixels.
3. The method of claim 1, further comprising reading intensity data first from the first field of pixels and next from the second field of pixels in one image data frame, and then reading intensity data first from the second field of pixels and next from the first field of pixels in a next image data frame.
4. The method of claim 1, wherein adjusting the data frame for ambient light comprises first determining whether ambient light exceeds a threshold level, and then correcting for ambient light only if the ambient light exceeds the threshold level.
5. The method of claim 4, further comprising determining if any objects are in motion on the screen, and selecting an ambient correction method for use depending upon whether or not any objects are detected in motion on the screen.
6. The method of claim 1, further comprising, during integration of each data frame:
exposing the image sensor for an “on” state interval during which the light source is in an “on” state and for an “off” state interval during which the light source is in an “off” state; and
for alternating frames n and n+1
for frame n, reading intensity levels of the first field of pixels after integration of one “on” state interval, and then reading levels of the second field of pixels after integration of one “on” state interval and two “off” state intervals; and
for frame n+1, reading intensity levels of the second field of pixels after integration of one “on” state interval, and then reading levels of the first field of pixels after integration of one “on” state interval and two “off” state intervals.
7. The method of claim 6, further comprising adjusting one or more of a length of the interval during which the image sensor is exposed while the light source is in the “on” state and a length of the interval during which the image sensor is exposed while the light source is in the “off” state, to reduce saturation of the image sensor or to increase an amount of light integrated.
8. The method of claim 1, further comprising detecting a saturation condition in one or more pixels, and reducing a duration that the image sensor is exposed to light during one or more of the “on” state and the “off” state of the light source.
9. The method of claim 1, further comprising determining ambient light values for a plurality of frames, measuring an ambient light beat frequency using the ambient light values, and adjusting a frame rate of the device based upon the ambient light beat frequency.
10. A method of correcting an image for ambient light in an optical touch-sensitive device, the optical touch-sensitive device comprising a screen, a light source, and an image sensor with two or more interlaced fields of pixels, the method comprising:
during integration of each image data frame, operating the light source in an “on” state for part of the frame and an “off” state for part of the frame;
during integration of each image data frame,
integrating light for a longer period of time with a first field of pixels than with a second field of pixels while the light source is in an “off” state;
integrating light for an equal period of time with the first field of pixels and the second field of pixels while the light source is in an “on” state;
reading intensity data from the first and second fields of pixels; and
adjusting the image data frame for ambient light based upon differences in intensity data from the first field of pixels and intensity data from the second field of pixels.
11. The method of claim 10, further comprising reading intensity data first from the first field of pixels and next from the second field of pixels in one image data frame, and then reading intensity data first from the second field of pixels and next from the first field of pixels in a next image data frame.
12. The method of claim 10, wherein adjusting the data frame for ambient light comprises first determining whether ambient light exceeds a threshold level, and then correcting for ambient light only if the ambient light exceeds the threshold level.
13. The method of claim 12, further comprising determining if any objects are in motion on the screen, and selecting an ambient correction calculation for use depending upon whether or not any objects are detected in motion on the screen.
14. The method of claim 10, further comprising:
for alternating frames n and n+1
for frame n, reading intensity levels of the first field of pixels after integration of one “on” state interval, and then reading levels of the second field of pixels after integration of one “on” state interval and two “off” state intervals; and
for frame n+1, reading intensity levels of the second field of pixels after integration of one “on” state interval, and then reading levels of the first field of pixels after integration of one “on” state interval and two “off” state intervals.
15. The method of claim 10, further comprising adjusting one or more of a length of the “on” state interval and a length of the “off” state interval based upon one or more characteristics of the intensity data.
16. An optical touch-sensitive device, comprising:
a screen having a touch surface and a backside;
an image sensor configured to capture an image of a backside of the screen and comprising two or more interlaced fields of pixels;
a light source configured to illuminate the backside of the screen; and
a controller configured to:
modulate the light source during capture of an image data frame,
modulate exposure of the image sensor to light during capture of an image data frame such that the image sensor integrates each image data frame for a portion of time when the light source is on and a portion of time when the light source is off,
read a first image data frame from the image sensor such that a first field of pixels is read before a second field of pixels, and
read a next image data frame from the image sensor such that the second field of pixels is read before the first field of pixels.
17. The device of claim 16, wherein the controller is configured to determine an ambient light value for one or more pixels in the image data frame from differences in the intensity data between the first field of pixels and the second field of pixels, and to adjust one or more pixels of the data frame for ambient light based upon the ambient light value.
18. The device of claim 17, wherein the controller is configured to determine whether the ambient light value exceeds a threshold level, and then to adjust one or more pixels of the data frame for ambient light only if the ambient light value exceeds the threshold level.
19. The device of claim 17, wherein the controller is configured to determine if any objects are in motion on the screen, and to select an ambient correction calculation for use depending upon whether or not any objects are detected in motion on the screen.
20. The device of claim 19, further comprising a field programmable gate array configured to perform multiple ambient correction calculations from which an ambient light value can be selected depending upon whether or not any objects are detected in motion on the screen.
US11/868,466 2007-10-05 2007-10-05 Correcting for ambient light in an optical touch-sensitive device Expired - Fee Related US8004502B2 (en)

Priority Applications (7)

Application Number Priority Date Filing Date Title
US11/868,466 US8004502B2 (en) 2007-10-05 2007-10-05 Correcting for ambient light in an optical touch-sensitive device
JP2010528117A JP5134688B2 (en) 2007-10-05 2008-10-01 Ambient light correction in optical touch-sensitive devices
EP08836179.5A EP2206033B1 (en) 2007-10-05 2008-10-01 Correcting for ambient light in an optical touch-sensitive device
CN2008801104246A CN101815979B (en) 2007-10-05 2008-10-01 Correcting for ambient light in an optical touch-sensitive device
CA2698623A CA2698623C (en) 2007-10-05 2008-10-01 Correcting for ambient light in an optical touch-sensitive device
KR1020107007059A KR101465835B1 (en) 2007-10-05 2008-10-01 Correcting for ambient light in an optical touch-sensitive device
PCT/US2008/078515 WO2009046154A2 (en) 2007-10-05 2008-10-01 Correcting for ambient light in an optical touch-sensitive device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US11/868,466 US8004502B2 (en) 2007-10-05 2007-10-05 Correcting for ambient light in an optical touch-sensitive device

Publications (2)

Publication Number Publication Date
US20090091554A1 true US20090091554A1 (en) 2009-04-09
US8004502B2 US8004502B2 (en) 2011-08-23

Family

ID=40522867

Family Applications (1)

Application Number Title Priority Date Filing Date
US11/868,466 Expired - Fee Related US8004502B2 (en) 2007-10-05 2007-10-05 Correcting for ambient light in an optical touch-sensitive device

Country Status (7)

Country Link
US (1) US8004502B2 (en)
EP (1) EP2206033B1 (en)
JP (1) JP5134688B2 (en)
KR (1) KR101465835B1 (en)
CN (1) CN101815979B (en)
CA (1) CA2698623C (en)
WO (1) WO2009046154A2 (en)

Cited By (46)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100220077A1 (en) * 2009-02-27 2010-09-02 Sony Corporation Image input device, image input-output device and electronic unit
US20100320919A1 (en) * 2009-06-22 2010-12-23 Nokia Corporation Method and apparatus for modifying pixels based at least in part on ambient light level
US20110012866A1 (en) * 2009-07-17 2011-01-20 Microsoft Corporation Ambient correction in rolling image capture system
US20110234540A1 (en) * 2010-03-26 2011-09-29 Quanta Computer Inc. Background image updating method and touch screen
CN102262486A (en) * 2010-05-28 2011-11-30 株式会社半导体能源研究所 Photodetector
US20120050226A1 (en) * 2010-09-01 2012-03-01 Toshiba Tec Kabushiki Kaisha Display input apparatus and display input method
US20120154535A1 (en) * 2010-12-15 2012-06-21 Microsoft Corporation Capturing gated and ungated light in the same frame on the same photosurface
CN102622138A (en) * 2012-02-29 2012-08-01 广东威创视讯科技股份有限公司 Optical touch control positioning method and optical touch control positioning system
US20120218390A1 (en) * 2011-02-24 2012-08-30 Au Optronics Corporation Interactive stereo display system and method for calculating three-dimensional coordinate
US20120308130A1 (en) * 2011-06-01 2012-12-06 Nokia Corporation Method and Apparatus for Image Signal Processing
US20120327035A1 (en) * 2011-06-21 2012-12-27 Pixart Imaging Inc. Optical touch system and image processing method thereof
US20130141393A1 (en) * 2011-12-06 2013-06-06 Yu-Yen Chen Frameless optical touch device and image processing method for frameless optical touch device
US20140232695A1 (en) * 2011-06-16 2014-08-21 Light Blue Optics Ltd. Touch-Sensitive Display Devices
EP2810146A4 (en) * 2012-01-31 2015-09-16 Flatfrog Lab Ab Performance monitoring and correction in a touch-sensitive apparatus
US20150338997A1 (en) * 2010-01-20 2015-11-26 Nexys Control device and electronic device comprising same
US20160044225A1 (en) * 2012-11-02 2016-02-11 Microsoft Technology Licensing, Llc Rapid Synchronized Lighting and Shuttering
US20170243358A1 (en) * 2012-10-31 2017-08-24 Pixart Imaging Inc. Detection system
US20170244890A1 (en) * 2016-02-19 2017-08-24 Samsung Electronics Co., Ltd. Electronic device and method for controlling operation thereof
US9874978B2 (en) 2013-07-12 2018-01-23 Flatfrog Laboratories Ab Partial detect mode
US9946405B2 (en) 2014-05-29 2018-04-17 Semiconductor Energy Laboratory Co., Ltd. Information processing device
US9968285B2 (en) 2014-07-25 2018-05-15 Christie Digital Systems Usa, Inc. Multispectral medical imaging devices and methods thereof
WO2018096430A1 (en) * 2016-11-24 2018-05-31 Flatfrog Laboratories Ab Automatic optimisation of touch signal
US10019113B2 (en) 2013-04-11 2018-07-10 Flatfrog Laboratories Ab Tomographic processing for touch detection
US20180198982A1 (en) * 2017-01-06 2018-07-12 Samsung Electronics Co., Ltd. Image capturing method and electronic device
US10126882B2 (en) 2014-01-16 2018-11-13 Flatfrog Laboratories Ab TIR-based optical touch systems of projection-type
US10146376B2 (en) 2014-01-16 2018-12-04 Flatfrog Laboratories Ab Light coupling in TIR-based optical touch systems
US10161886B2 (en) 2014-06-27 2018-12-25 Flatfrog Laboratories Ab Detection of surface contamination
US10168835B2 (en) 2012-05-23 2019-01-01 Flatfrog Laboratories Ab Spatial resolution in touch displays
US10282035B2 (en) 2016-12-07 2019-05-07 Flatfrog Laboratories Ab Touch device
US10318074B2 (en) 2015-01-30 2019-06-11 Flatfrog Laboratories Ab Touch-sensing OLED display with tilted emitters
US10401546B2 (en) 2015-03-02 2019-09-03 Flatfrog Laboratories Ab Optical component for light coupling
US10437389B2 (en) 2017-03-28 2019-10-08 Flatfrog Laboratories Ab Touch sensing apparatus and method for assembly
US10474249B2 (en) 2008-12-05 2019-11-12 Flatfrog Laboratories Ab Touch sensing apparatus and method of operating the same
US10481737B2 (en) 2017-03-22 2019-11-19 Flatfrog Laboratories Ab Pen differentiation for touch display
US10496227B2 (en) 2015-02-09 2019-12-03 Flatfrog Laboratories Ab Optical touch system comprising means for projecting and detecting light beams above and inside a transmissive panel
US10559072B2 (en) * 2015-06-23 2020-02-11 Hoya Corporation Image detection device and image detection system
US20200137336A1 (en) * 2018-10-30 2020-04-30 Bae Systems Information And Electronic Systems Integration Inc. Interlace image sensor for low-light-level imaging
CN112164003A (en) * 2020-09-11 2021-01-01 珠海市一微半导体有限公司 Method for acquiring laser image by mobile robot, chip and robot
US11182023B2 (en) 2015-01-28 2021-11-23 Flatfrog Laboratories Ab Dynamic touch quarantine frames
US11256371B2 (en) 2017-09-01 2022-02-22 Flatfrog Laboratories Ab Optical component
US11301089B2 (en) 2015-12-09 2022-04-12 Flatfrog Laboratories Ab Stylus identification
CN114641982A (en) * 2019-11-06 2022-06-17 皇家飞利浦有限公司 System for performing ambient light image correction
US11474644B2 (en) 2017-02-06 2022-10-18 Flatfrog Laboratories Ab Optical coupling in touch-sensing systems
US11567610B2 (en) 2018-03-05 2023-01-31 Flatfrog Laboratories Ab Detection line broadening
US11893189B2 (en) 2020-02-10 2024-02-06 Flatfrog Laboratories Ab Touch-sensing apparatus
US11943563B2 (en) 2019-01-25 2024-03-26 FlatFrog Laboratories, AB Videoconferencing terminal and method of operating the same

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5043204B2 (en) * 2009-01-20 2012-10-10 シャープ株式会社 Liquid crystal display with light intensity sensor
WO2013081672A1 (en) * 2011-11-30 2013-06-06 TransluSense, LLC Multi-touch input device
JP5924577B2 (en) * 2012-03-19 2016-05-25 日本精機株式会社 Position detection device
US9411928B2 (en) 2012-07-17 2016-08-09 Parade Technologies, Ltd. Discontinuous integration using half periods
WO2014015032A2 (en) * 2012-07-19 2014-01-23 Cypress Semiconductor Corporation Touchscreen data processing
US10008016B2 (en) * 2012-09-05 2018-06-26 Facebook, Inc. Proximity-based image rendering
US9143696B2 (en) * 2012-10-13 2015-09-22 Hewlett-Packard Development Company, L.P. Imaging using offsetting accumulations
US10705404B2 (en) 2013-07-08 2020-07-07 Concord (Hk) International Education Limited TIR-modulated wide viewing angle display
TWI504931B (en) * 2014-02-19 2015-10-21 Coretronic Corp Projection system and projection method thereof
TWI524772B (en) * 2014-02-19 2016-03-01 中強光電股份有限公司 Projection system and projection method thereof
TWI522722B (en) 2014-06-06 2016-02-21 中強光電股份有限公司 Light source device and adjusting method thereof
US9842551B2 (en) 2014-06-10 2017-12-12 Apple Inc. Display driver circuitry with balanced stress
US20160111062A1 (en) * 2014-10-15 2016-04-21 Intel Corporation Ambient light-based image adjustment
WO2019147649A1 (en) * 2018-01-23 2019-08-01 Clearink Displays, Inc. Method, system and apparatus for color saturation in reflective image displays
CN113424550A (en) 2019-01-09 2021-09-21 杜比实验室特许公司 Display management with ambient light compensation
KR20210061572A (en) 2019-11-20 2021-05-28 삼성전자주식회사 Method of obtaining reference image for optical object recognition, method of performing optical object recognition using the same and electronic device performing the same

Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3674008A (en) * 1970-07-13 1972-07-04 Battelle Development Corp Quantitative pulsed transilluminator and method of operation
US5742279A (en) * 1993-11-08 1998-04-21 Matsushita Electrical Co., Ltd. Input/display integrated information processing device
US5801684A (en) * 1996-02-29 1998-09-01 Motorola, Inc. Electronic device with display and display driver and method of operation of a display driver
US5874731A (en) * 1995-03-20 1999-02-23 Schlumberger Industries, Inc. Ambient light filter
US6088470A (en) * 1998-01-27 2000-07-11 Sensar, Inc. Method and apparatus for removal of bright or dark spots by the fusion of multiple images
US6229913B1 (en) * 1995-06-07 2001-05-08 The Trustees Of Columbia University In The City Of New York Apparatus and methods for determining the three-dimensional shape of an object using active illumination and relative blurring in two-images due to defocus
US6339748B1 (en) * 1997-11-11 2002-01-15 Seiko Epson Corporation Coordinate input system and display apparatus
US6593929B2 (en) * 1995-11-22 2003-07-15 Nintendo Co., Ltd. High performance low cost video game system with coprocessor providing high speed efficient 3D graphics and digital audio signal processing
US6600168B1 (en) * 2000-02-03 2003-07-29 Genex Technologies, Inc. High speed laser three-dimensional imager
US6714665B1 (en) * 1994-09-02 2004-03-30 Sarnoff Corporation Fully automated iris recognition system utilizing wide and narrow fields of view
US20050083293A1 (en) * 2003-10-21 2005-04-21 Dixon Brian S. Adjustment of color in displayed images based on identification of ambient light sources
US20050134751A1 (en) * 2003-12-17 2005-06-23 Adiel Abileah Light sensitive display
US20050249390A1 (en) * 2004-04-29 2005-11-10 Mcclurg George W Method and apparatus for discriminating ambient light in a fingerprint scanner
US7027353B2 (en) * 2003-04-03 2006-04-11 Sri International Method and apparatus for real-time vibration imaging
US20060255152A1 (en) * 2005-05-06 2006-11-16 Tong Xie Light source control in optical pointing device
US7145657B2 (en) * 2003-09-23 2006-12-05 X-Rite, Incorporated Color measurement instrument
US20070176916A1 (en) * 2006-01-27 2007-08-02 Samsung Electronics Co., Ltd Image display apparatus and method

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3234736B2 (en) * 1994-04-12 2001-12-04 松下電器産業株式会社 I / O integrated information operation device
JP4422851B2 (en) * 1999-03-17 2010-02-24 キヤノン株式会社 Coordinate input apparatus and method
US6525702B1 (en) * 1999-09-17 2003-02-25 Koninklijke Philips Electronics N.V. Method of and unit for displaying an image in sub-fields
US6674446B2 (en) * 1999-12-17 2004-01-06 Koninilijke Philips Electronics N.V. Method of and unit for displaying an image in sub-fields
AU6026501A (en) 2000-05-01 2001-11-12 Delsy Electronic Components Ag Device for inputting relative coordinates
GB0220614D0 (en) * 2002-09-05 2002-10-16 Koninkl Philips Electronics Nv Electroluminescent display devices
JP3805316B2 (en) * 2003-03-14 2006-08-02 富士通株式会社 Optical scanning touch panel
US7362320B2 (en) * 2003-06-05 2008-04-22 Hewlett-Packard Development Company, L.P. Electronic device having a light emitting/detecting display screen
US7146082B2 (en) 2003-12-23 2006-12-05 Intel Corporation Steering isolator for an opto-electronic assembly focusing apparatus
JP2007011233A (en) * 2005-07-04 2007-01-18 Toshiba Matsushita Display Technology Co Ltd Flat display device and imaging method using same
JP2007140260A (en) 2005-11-21 2007-06-07 Alpine Electronics Inc Display apparatus
US20070153117A1 (en) * 2005-12-30 2007-07-05 Yen-Yu Lin Apparatus and method for adjusting display-related setting of an electronic device

Cited By (83)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10474249B2 (en) 2008-12-05 2019-11-12 Flatfrog Laboratories Ab Touch sensing apparatus and method of operating the same
US20100220077A1 (en) * 2009-02-27 2010-09-02 Sony Corporation Image input device, image input-output device and electronic unit
US20100320919A1 (en) * 2009-06-22 2010-12-23 Nokia Corporation Method and apparatus for modifying pixels based at least in part on ambient light level
WO2011009005A3 (en) * 2009-07-17 2011-05-05 Microsoft Corporation Ambient correction in rolling image capture system
US8289300B2 (en) * 2009-07-17 2012-10-16 Microsoft Corporation Ambient correction in rolling image capture system
EP2454651A4 (en) * 2009-07-17 2014-05-28 Microsoft Corp Ambient correction in rolling image capture system
JP2012533801A (en) * 2009-07-17 2012-12-27 マイクロソフト コーポレーション Ambient correction in rotating image capture system
US8681126B2 (en) 2009-07-17 2014-03-25 Microsoft Corporation Ambient correction in rolling image capture system
US20110012866A1 (en) * 2009-07-17 2011-01-20 Microsoft Corporation Ambient correction in rolling image capture system
EP2454651A2 (en) * 2009-07-17 2012-05-23 Microsoft Corporation Ambient correction in rolling image capture system
WO2011009005A2 (en) 2009-07-17 2011-01-20 Microsoft Corporation Ambient correction in rolling image capture system
US10216336B2 (en) * 2010-01-20 2019-02-26 Nexys Control device and electronic device comprising same
US20150338997A1 (en) * 2010-01-20 2015-11-26 Nexys Control device and electronic device comprising same
US20110234540A1 (en) * 2010-03-26 2011-09-29 Quanta Computer Inc. Background image updating method and touch screen
US20110291013A1 (en) * 2010-05-28 2011-12-01 Semiconductor Energy Laboratory Co., Ltd. Photodetector
US9846515B2 (en) 2010-05-28 2017-12-19 Semiconductor Energy Laboratory Co., Ltd. Photodetector and display device with light guide configured to face photodetector circuit and reflect light from a source
CN102262486A (en) * 2010-05-28 2011-11-30 株式会社半导体能源研究所 Photodetector
US8772701B2 (en) * 2010-05-28 2014-07-08 Semiconductor Energy Laboratory Co., Ltd. Photodetector and display device with light guide configured to face photodetector circuit and reflect light from a source
US20120050226A1 (en) * 2010-09-01 2012-03-01 Toshiba Tec Kabushiki Kaisha Display input apparatus and display input method
US20120154535A1 (en) * 2010-12-15 2012-06-21 Microsoft Corporation Capturing gated and ungated light in the same frame on the same photosurface
US8531506B2 (en) * 2011-02-24 2013-09-10 Au Optronics Corporation Interactive stereo display system and method for calculating three-dimensional coordinate
US20120218390A1 (en) * 2011-02-24 2012-08-30 Au Optronics Corporation Interactive stereo display system and method for calculating three-dimensional coordinate
US20120308130A1 (en) * 2011-06-01 2012-12-06 Nokia Corporation Method and Apparatus for Image Signal Processing
US8588523B2 (en) * 2011-06-01 2013-11-19 Nokia Corporation Method and apparatus for image signal processing
US20140232695A1 (en) * 2011-06-16 2014-08-21 Light Blue Optics Ltd. Touch-Sensitive Display Devices
US9524061B2 (en) * 2011-06-16 2016-12-20 Promethean Limited Touch-sensitive display devices
US10282036B2 (en) 2011-06-21 2019-05-07 Pixart Imaging Inc. Optical touch system and image processing method thereof
US20120327035A1 (en) * 2011-06-21 2012-12-27 Pixart Imaging Inc. Optical touch system and image processing method thereof
US20130141393A1 (en) * 2011-12-06 2013-06-06 Yu-Yen Chen Frameless optical touch device and image processing method for frameless optical touch device
US10372265B2 (en) 2012-01-31 2019-08-06 Flatfrog Laboratories Ab Performance monitoring and correction in a touch-sensitive apparatus
US9588619B2 (en) 2012-01-31 2017-03-07 Flatfrog Laboratories Ab Performance monitoring and correction in a touch-sensitive apparatus
EP2810146A4 (en) * 2012-01-31 2015-09-16 Flatfrog Lab Ab Performance monitoring and correction in a touch-sensitive apparatus
CN102622138A (en) * 2012-02-29 2012-08-01 广东威创视讯科技股份有限公司 Optical touch control positioning method and optical touch control positioning system
US10168835B2 (en) 2012-05-23 2019-01-01 Flatfrog Laboratories Ab Spatial resolution in touch displays
US20170243358A1 (en) * 2012-10-31 2017-08-24 Pixart Imaging Inc. Detection system
US10755417B2 (en) 2012-10-31 2020-08-25 Pixart Imaging Inc. Detection system
US10255682B2 (en) * 2012-10-31 2019-04-09 Pixart Imaging Inc. Image detection system using differences in illumination conditions
US9544504B2 (en) * 2012-11-02 2017-01-10 Microsoft Technology Licensing, Llc Rapid synchronized lighting and shuttering
US20160044225A1 (en) * 2012-11-02 2016-02-11 Microsoft Technology Licensing, Llc Rapid Synchronized Lighting and Shuttering
US10019113B2 (en) 2013-04-11 2018-07-10 Flatfrog Laboratories Ab Tomographic processing for touch detection
US9874978B2 (en) 2013-07-12 2018-01-23 Flatfrog Laboratories Ab Partial detect mode
US10146376B2 (en) 2014-01-16 2018-12-04 Flatfrog Laboratories Ab Light coupling in TIR-based optical touch systems
US10126882B2 (en) 2014-01-16 2018-11-13 Flatfrog Laboratories Ab TIR-based optical touch systems of projection-type
US9946405B2 (en) 2014-05-29 2018-04-17 Semiconductor Energy Laboratory Co., Ltd. Information processing device
US10161886B2 (en) 2014-06-27 2018-12-25 Flatfrog Laboratories Ab Detection of surface contamination
US9968285B2 (en) 2014-07-25 2018-05-15 Christie Digital Systems Usa, Inc. Multispectral medical imaging devices and methods thereof
US11182023B2 (en) 2015-01-28 2021-11-23 Flatfrog Laboratories Ab Dynamic touch quarantine frames
US10318074B2 (en) 2015-01-30 2019-06-11 Flatfrog Laboratories Ab Touch-sensing OLED display with tilted emitters
US11029783B2 (en) 2015-02-09 2021-06-08 Flatfrog Laboratories Ab Optical touch system comprising means for projecting and detecting light beams above and inside a transmissive panel
US10496227B2 (en) 2015-02-09 2019-12-03 Flatfrog Laboratories Ab Optical touch system comprising means for projecting and detecting light beams above and inside a transmissive panel
US10401546B2 (en) 2015-03-02 2019-09-03 Flatfrog Laboratories Ab Optical component for light coupling
US10559072B2 (en) * 2015-06-23 2020-02-11 Hoya Corporation Image detection device and image detection system
US11301089B2 (en) 2015-12-09 2022-04-12 Flatfrog Laboratories Ab Stylus identification
US20170244890A1 (en) * 2016-02-19 2017-08-24 Samsung Electronics Co., Ltd. Electronic device and method for controlling operation thereof
US10609276B2 (en) * 2016-02-19 2020-03-31 Samsung Electronics Co., Ltd. Electronic device and method for controlling operation of camera-related application based on memory status of the electronic device thereof
WO2018096430A1 (en) * 2016-11-24 2018-05-31 Flatfrog Laboratories Ab Automatic optimisation of touch signal
US10761657B2 (en) 2016-11-24 2020-09-01 Flatfrog Laboratories Ab Automatic optimisation of touch signal
US11281335B2 (en) 2016-12-07 2022-03-22 Flatfrog Laboratories Ab Touch device
US10282035B2 (en) 2016-12-07 2019-05-07 Flatfrog Laboratories Ab Touch device
US11579731B2 (en) 2016-12-07 2023-02-14 Flatfrog Laboratories Ab Touch device
US10775935B2 (en) 2016-12-07 2020-09-15 Flatfrog Laboratories Ab Touch device
US10574895B2 (en) * 2017-01-06 2020-02-25 Samsung Electronics Co., Ltd. Image capturing method and camera equipped electronic device
US20180198982A1 (en) * 2017-01-06 2018-07-12 Samsung Electronics Co., Ltd. Image capturing method and electronic device
US11740741B2 (en) 2017-02-06 2023-08-29 Flatfrog Laboratories Ab Optical coupling in touch-sensing systems
US11474644B2 (en) 2017-02-06 2022-10-18 Flatfrog Laboratories Ab Optical coupling in touch-sensing systems
US10606414B2 (en) 2017-03-22 2020-03-31 Flatfrog Laboratories Ab Eraser for touch displays
US11016605B2 (en) 2017-03-22 2021-05-25 Flatfrog Laboratories Ab Pen differentiation for touch displays
US10481737B2 (en) 2017-03-22 2019-11-19 Flatfrog Laboratories Ab Pen differentiation for touch display
US11099688B2 (en) 2017-03-22 2021-08-24 Flatfrog Laboratories Ab Eraser for touch displays
US10845923B2 (en) 2017-03-28 2020-11-24 Flatfrog Laboratories Ab Touch sensing apparatus and method for assembly
US10606416B2 (en) 2017-03-28 2020-03-31 Flatfrog Laboratories Ab Touch sensing apparatus and method for assembly
US11269460B2 (en) 2017-03-28 2022-03-08 Flatfrog Laboratories Ab Touch sensing apparatus and method for assembly
US10437389B2 (en) 2017-03-28 2019-10-08 Flatfrog Laboratories Ab Touch sensing apparatus and method for assembly
US11281338B2 (en) 2017-03-28 2022-03-22 Flatfrog Laboratories Ab Touch sensing apparatus and method for assembly
US10739916B2 (en) 2017-03-28 2020-08-11 Flatfrog Laboratories Ab Touch sensing apparatus and method for assembly
US11256371B2 (en) 2017-09-01 2022-02-22 Flatfrog Laboratories Ab Optical component
US11650699B2 (en) 2017-09-01 2023-05-16 Flatfrog Laboratories Ab Optical component
US11567610B2 (en) 2018-03-05 2023-01-31 Flatfrog Laboratories Ab Detection line broadening
US20200137336A1 (en) * 2018-10-30 2020-04-30 Bae Systems Information And Electronic Systems Integration Inc. Interlace image sensor for low-light-level imaging
US11943563B2 (en) 2019-01-25 2024-03-26 FlatFrog Laboratories, AB Videoconferencing terminal and method of operating the same
CN114641982A (en) * 2019-11-06 2022-06-17 皇家飞利浦有限公司 System for performing ambient light image correction
US11893189B2 (en) 2020-02-10 2024-02-06 Flatfrog Laboratories Ab Touch-sensing apparatus
CN112164003A (en) * 2020-09-11 2021-01-01 珠海市一微半导体有限公司 Method for acquiring laser image by mobile robot, chip and robot

Also Published As

Publication number Publication date
CN101815979B (en) 2013-05-29
CN101815979A (en) 2010-08-25
EP2206033A4 (en) 2012-06-13
WO2009046154A2 (en) 2009-04-09
KR101465835B1 (en) 2014-11-26
CA2698623C (en) 2014-03-11
KR20100063765A (en) 2010-06-11
EP2206033A2 (en) 2010-07-14
EP2206033B1 (en) 2018-04-04
WO2009046154A3 (en) 2009-07-02
JP2011501841A (en) 2011-01-13
US8004502B2 (en) 2011-08-23
CA2698623A1 (en) 2009-04-09
JP5134688B2 (en) 2013-01-30

Similar Documents

Publication Publication Date Title
US8004502B2 (en) Correcting for ambient light in an optical touch-sensitive device
US8681126B2 (en) Ambient correction in rolling image capture system
US7973779B2 (en) Detecting ambient light levels in a vision system
JP2011501841A5 (en)
JP2019029948A (en) Image monitoring device, image monitoring method and image monitoring program
JP2007281556A (en) Imaging element, imaging apparatus, and imaging system
JP2007226439A (en) Display device and electronic apparatus
US10348983B2 (en) Non-transitory storage medium encoded with computer readable image processing program, information processing system, information processing apparatus, and image processing method for determining a position of a subject in an obtained infrared image
US20140098062A1 (en) Optical touch panel system and positioning method thereof
JP6127558B2 (en) Imaging device
US9841846B2 (en) Exposure mechanism of optical touch system and optical touch system using the same
US9906705B2 (en) Image pickup apparatus
CN101571768A (en) Method for preventing screen cursor from abnormal movement
JP2022053871A (en) Distance measuring sensor and distance measuring method
JP2013085191A (en) Imaging device and imaging element driving method
JP2010112990A (en) Projector
JP2013196348A (en) Position detection device

Legal Events

Date Code Title Description
AS Assignment

Owner name: MICROSOFT CORPORATION, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KEAM, NIGEL;REEL/FRAME:019927/0618

Effective date: 20071003

ZAAA Notice of allowance and fees due

Free format text: ORIGINAL CODE: NOA

ZAAB Notice of allowance mailed

Free format text: ORIGINAL CODE: MN/=.

STCF Information on status: patent grant

Free format text: PATENTED CASE

AS Assignment

Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034542/0001

Effective date: 20141014

FPAY Fee payment

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20230823