US6731330B2 - Method for robust determination of visible points of a controllable display within a camera view - Google Patents

Method for robust determination of visible points of a controllable display within a camera view

Info

Publication number: US6731330B2
Authority: United States
Prior art keywords: image, images, display area, captured, image data
Legal status: Expired - Fee Related
Application number: US09/774,452
Other versions: US20020126136A1
Inventor: I-Jong Lin
Current assignee: Hewlett Packard Development Co LP
Original assignee: Hewlett Packard Development Co LP
Priority date: 2001-01-30
Filing date: 2001-01-30
Publication date: 2004-05-04

Application filed by Hewlett Packard Development Co LP
2001-01-30: Priority to US09/774,452
Assigned to HEWLETT-PACKARD COMPANY (assignor: LIN, I-JONG)
2002-01-29: Priority to PCT/US2002/002716
2002-01-29: Priority to EP02702108A
2002-01-29: Priority to JP2002565239A
2002-09-12: Publication of US20020126136A1
Assigned to HEWLETT-PACKARD DEVELOPMENT COMPANY L.P. (assignor: HEWLETT-PACKARD COMPANY)
Application granted
2004-05-04: Publication of US6731330B2
Status: Expired - Fee Related

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected

Abstract

A system and method of locating the display area of a user interactive, computer controlled display, including a computing device for generating image data corresponding to a plurality of selected images, a display area for displaying the plurality of selected images, an image capture device for capturing a plurality of images within its capture area, each captured image including one of the selected images, and a display area locator for locating the display area within the capture area by deriving constructive and destructive feedback data from image data corresponding to the plurality of captured images.

Description

FIELD OF THE INVENTION
The present invention relates to a computer controllable display system and in particular to the interaction of a user with a computer controlled image display or projected image.
BACKGROUND OF THE INVENTION
Computer controlled projection systems generally include a computer system for generating image data and a projector for projecting the image data onto a projection screen. Typically, the computer controlled projection system is used to allow a presenter to project presentations that were created with the computer system onto a larger screen so that more than one viewer can easily see the presentation. Often, the presenter interacts with the projected image by pointing to notable areas on the projected image with a finger, laser pointer, or some other pointing device.
The problem with this type of system is that if a user wants to cause any change to the projected image, they must interact with the computer system using an input device such as a mouse, keyboard, or remote device. For instance, a remote device is often employed by a presenter to control the computer system via infrared signals to display the next slide in a presentation. However, this can be distracting to the viewers of the presentation since the presenter is no longer interacting with them and the projected presentation and, instead, is interacting with the computer system. Often, this interaction can lead to significant interruptions in the presentation.
Hence, a variation of the above system, developed to overcome the computer-only interaction problem, allows the presenter to directly interact with the projected image. In this system, the computer system generates image data (e.g., presentation slides) to be projected onto a screen (e.g., a projection screen) with an image projector device. The system also includes a digital image capture device, such as a digital camera, for capturing the projected image. The captured projected image data is transmitted back to the computing system and is used to determine the location of an object in front of the screen, such as a pointing device. The computer system may then be controlled dependent on the determined location of the pointing device. For instance, in U.S. Pat. No. 5,138,304, assigned to the assignee of the subject application, a light beam is projected onto the screen and is detected by a camera. To determine the position of the light beam, the captured image data of the projected image and the original image data are compared to determine the pointer position. The computer is then caused to position a cursor in the video image at the pointer position or is caused to modify the projected image data in response to the pointer position.
In order to implement a user interactive, computer controlled display or projection system, an initial calibration step is necessary in which the computer system determines the location of the screen or, more specifically, the area in which the image is displayed within the capture area of the camera. By determining the location of the display area within the capture area, the system can then identify any object that is not part of the displayed image (e.g., objects in front of the displayed image). In one prior art system, the location of the display area within the capture area is determined by using complex, and sometimes fragile, pattern matching algorithms requiring a significant amount of computing time and system memory.
The present invention is an efficient, robust, and simple technique for locating an image display area within the capture area of an image capture device in a user interactive, computer controlled display or projection system.
SUMMARY OF THE INVENTION
A system for locating a user interactive, computer controlled image display area includes a display area for displaying a plurality of selected digital images, a computing system for providing digital image data for causing the plurality of selected digital images to be displayed, an image capture device having an image capture area for capturing a plurality of images within the capture area, each including one of the selected digital images, and a display area locator including a means for deriving constructive or destructive feedback data from digital image data corresponding to the plurality of captured images to determine the location of the display area within the image capture area.
The method for determining the location of the user interactive, computer controlled display area within an image capture area of an image capture device is performed by displaying a plurality of selected images in the display area, capturing a plurality of images within the capture area, each including one of the plurality of selected images, and deriving constructive or destructive image feedback data from the image data corresponding to the plurality of captured images in order to locate the display area in the image capture area of the image capture device.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 illustrates a block diagram of a first embodiment of a system for locating a display area in a user interactive, computer controlled display system;
FIG. 2 illustrates a first embodiment of the method of locating a display area within a capture area in a user interactive, computer controlled display system in accordance with the present invention;
FIG. 3 illustrates a second embodiment of the method of locating a display area within a capture area in a user interactive, computer controlled display system in accordance with the present invention;
FIGS. 4A and 4B illustrate a third embodiment of the method of locating a display area within a capture area in a user interactive, computer controlled display system in accordance with the present invention;
FIG. 5A shows an exemplary image of a display area in a capture area;
FIG. 5B shows an image corresponding to image data obtained by performing the method of the present invention; and
FIGS. 6A and 6B illustrate selected images for displaying in a fourth embodiment of the method of locating a display area within a capture area in a user interactive, computer controlled display system in accordance with the present invention.
DETAILED DESCRIPTION OF THE INVENTION
FIG. 1 shows a block diagram of an interactive computer controlled image display system which includes a computing system 10 for generating image data 10A and a graphical interface 11 for causing selected images 10B corresponding to the image data 10A to be displayed in display area 12. It should be understood that the graphical interface may be a portion of the computing system or may be a distinct element external to the computing system. The system further includes an image capture device 13 having an associated image capture area 13A for capturing a plurality of images, each including one of the plurality of selected displayed images 10B. The captured images also include images 10C that are not within the display area. Non-display area images include anything other than what is being displayed within the display area. The plurality of captured images is converted into digital image data 13B which is transmitted to a display area locator 14. The display area locator 14 includes a means for deriving constructive and destructive image feedback data 15 from the captured image data 13B to determine the location of the display area 12 within the image capture area 13A. The display area locator can then optionally transmit the screen location information 14A to the computing system 10 for use in the user interactive, computer controlled display system.
In this embodiment, the computing system includes at least a central processing unit (CPU) and a memory for storing digital data (e.g., image data) and has the capability of generating at least three levels of grayscale images. The display area can be a computer monitor driven by the graphical interface or can be an area on a projection screen or projection area (e.g., a wall). In the case in which images are displayed using projection, the system includes an image projector (not shown in FIG. 1) that is responsive to image data provided from the graphical interface.
In one embodiment, the image capture device is a digital camera arranged so as to capture at least all of the images 10B displayed in the display area 12 within a known time delay. It is well known in the field of digital image capture that an image is captured by a digital camera using an array of sensors that detect the intensity of the light impinging on the sensors within the capture area of the camera. The light intensity signals are then converted into digital image data corresponding to the captured image. Hence, the captured image data 13B is digital image data corresponding to the captured image. In another embodiment the image capture device can be a digital video camera capable of capturing still images of the display area.
In one embodiment, the selected images 10B are a plurality of single intensity grayscale images that are successively displayed in the display area and captured by the image capture device.
It should be understood that all or a portion of the functions of the display locator 14 can be performed by the computing system. Consequently, although it is shown external to the computing system, all or portions of the display locator 14 may be implemented within the computing system.
It should be further understood that the display locator can be implemented in software, hardware, or any combination of software and hardware.
FIG. 2 shows a first embodiment of a method of determining the location of the display area within the capture area of an image capture device in a user interactive, computer controlled display system. According to this embodiment, a plurality of selected images is displayed within a display area (block 20). A plurality of images is captured within the capture area of an image capture device, each captured image including one of the selected images (block 21). Constructive and destructive feedback data is derived from image data corresponding to the captured images to locate the display area within the capture area (block 22).
FIG. 3 shows a second embodiment of a method of determining the location of the display area within the capture area of an image capture device in a user interactive, computer controlled display system. In this method, at least three single intensity grayscale images are displayed within the display area (block 30). A plurality of images is captured within the capture area of the image capture device, each including one of the at least three single intensity grayscale images (block 31). Constructive or destructive feedback data is derived by storing image data corresponding to a first captured image, which includes the first of the at least three images, in a pixel array (block 32) and by incrementing or decrementing the pixel values dependent on the image data corresponding to the remainder of the captured images, which include at least the second and third single intensity grayscale images (block 33). As shown, block 33 generates image data showing the location of the display area within the capture area. It should be understood that constructive feedback implies that a given pixel value is incremented and destructive feedback implies that a given pixel value is decremented. In one embodiment, pixel values within the array that correspond to the display area are incremented by a first predetermined constant value and pixel values within the array that correspond to the non-display area are decremented by a second predetermined constant value. In one variation of this embodiment, the feedback can be applied iteratively: at least the second and third images are redisplayed and the pixel values in the array are again incremented or decremented.
FIG. 4A shows a third embodiment of a method of the present invention and FIG. 4B illustrates corresponding operations performed by the method. Initially, an array Ac(x,y) of digital storage is initialized within a digital memory storage area, wherein each element within the array stores a value corresponding to a pixel location within the image capture area of the capture device. The initialization sets all values within the array to zero such that Ac(x,y)=0 (blocks 40A and 40B).
A first single intensity grayscale image having an intensity of Imid (where IHIGH > Imid > ILOW) is displayed in the display area 12 and a first image is captured within image capture area 13A having a corresponding array of image data Ac1(x,y) (blocks 41A and 41B). The first captured image includes the first single grayscale image displayed in display area 12. Specifically, as shown by 41B, the first captured image includes an area of pixels s that corresponds to the display area 12, all having an intensity of Imid, and an area of pixels s̄ that corresponds to the non-display area. Array Ac1(x,y) is also stored.
Next, a second single intensity grayscale image having a maximum intensity level of IHIGH is displayed in display area 12 and a second image is captured within the image capture area having a corresponding image data array Ac2(x,y) (block 42A). The array Ac2(x,y) includes pixels s having intensities of IHIGH and non-display area pixels s̄ (block 42B). The pixel values of array Ac1(x,y) are compared with corresponding pixel values of Ac2(x,y) (block 43) and the values of Ac(x,y) are incremented or decremented (block 44A) according to the following conditions:

if Ac2(x,y) > Ac1(x,y), then Ac(x,y) = Ac(x,y) + cpos;  (condition 1)

otherwise Ac(x,y) = Ac(x,y) + cneg;  (condition 2)

where cneg is a negative integer and cpos is a positive integer. In one embodiment, cpos = 20 and cneg = −16. Restated, Ac(x,y) = Ac(x,y) + cfeedback1.
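The comparison in block 43 and the update in block 44A amount to a per-pixel thresholded accumulation. A minimal sketch in Python with NumPy follows; the function name, the array representation, and the vectorized formulation are illustrative assumptions rather than anything specified by the patent, while cpos = 20 and cneg = −16 are the example values given above. The same helper also covers the inverse comparison (condition 3, introduced below) via the brighter flag.

```python
import numpy as np

C_POS, C_NEG = 20, -16  # example feedback constants given in the description

def apply_feedback(a_c, a_c1, a_later, brighter=True):
    """Add constructive/destructive feedback to the accumulator a_c.

    a_c1 is the stored first captured image (intensity I_mid); a_later is
    the second (I_HIGH) or third (I_LOW) captured image. Pixels whose
    intensity moved in the expected direction receive c_pos (constructive
    feedback); all other pixels receive c_neg (destructive feedback).
    """
    if brighter:
        moved = a_later > a_c1   # condition 1: pixel brightened (display area)
    else:
        moved = a_later < a_c1   # condition 3: pixel darkened (display area)
    return a_c + np.where(moved, C_POS, C_NEG)
```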
It should be noted that the value of Ac(x,y) is dependent on the dynamic behavior of corresponding pixel values from the first captured image to the third captured image in the display and non-display areas. Consequently, since the intensity level of the display area pixels s in the second captured image (42B) is IHIGH and since the intensity level of the display area pixels s in the first captured image (41B) is Imid, then Ac2(x,y) > Ac1(x,y) in the display area s. As a result, condition 1 applies and cpos is added to all values corresponding to the display area in array Ac(x,y). Since Ac(x,y) is initialized such that all values are set to 0 prior to block 44A, the values corresponding to s in array Ac(x,y) are incremented to a value of cpos after the first feedback data is applied.
Alternatively, assuming the background around the display area does not change, the intensity of the non-display area pixels s̄ will be the same and Ac2(x,y) = Ac1(x,y). Consequently, condition 2 applies and cneg is added to the values corresponding to non-display pixels in array Ac(x,y). However, in some cases, condition 1 may apply due to image noise or if the background changes. Consequently, values of Ac(x,y) corresponding to the non-display area s̄ will range in value from cpos to cneg.

Hence, constructive feedback is applied to all values corresponding to the display area in array Ac(x,y) and, for the most part, destructive feedback is applied to values corresponding to the non-display area in array Ac(x,y). As a result, display area values are driven positively and non-display area values are driven negatively.
Next, a third single intensity grayscale image having an intensity level of ILOW is displayed in display area 12 and a third image is captured in capture area 13A having a corresponding array of image data Ac3(x,y) (blocks 45A and 45B). As shown, the pixel array Ac3(x,y) includes pixels s having intensities of ILOW and pixels s̄ (45B). The pixel values of array Ac1(x,y) are then compared with corresponding pixel values of Ac3(x,y) (block 46) and the values of Ac(x,y) are incremented or decremented (block 47A) according to the following conditions:

if Ac3(x,y) < Ac1(x,y), then Ac(x,y) = Ac(x,y) + cpos;  (condition 3)

otherwise Ac(x,y) = Ac(x,y) + cneg.  (condition 4)

Restated, Ac(x,y) = Ac(x,y) + cfeedback2. Note that conditions 3 and 4 are essentially the inverse of conditions 1 and 2. As a result, since the intensity of the display area pixels s in the third captured image data (45B) is ILOW and since the intensity of the display area pixels s in Ac1(x,y) is Imid, then Ac3(x,y) < Ac1(x,y) in display area s. As a result, condition 3 applies and cpos is once again added to all values corresponding to the display area s in array Ac(x,y). Hence, constructive feedback is applied to the display area values in array Ac(x,y) when applying both the first and second feedback data.
In contrast, assuming again that the non-display pixel values in the third captured image data 45B still have not changed, then in most cases condition 4 applies and cneg is once again added to non-display values in Ac(x,y). In the case in which the non-display area image has changed, or due to noise, condition 3 may apply. However, destructive feedback, for the most part, is applied to the non-display area values in array Ac(x,y) when applying both the first and second feedback data.
The net effect of feedback blocks 44A and 47A is that the values in Ac(x,y) corresponding to the display area s are driven positively while the values of the non-display area s̄ are driven negatively. FIG. 4A also shows that an iterative loop 48 can be performed such that for each of the second and third captured images additional feedback is added to Ac(x,y), such that if i is the number of iterations of loop 48, then:

Acfinal(x,y) = lim(i→∞) Ac(x,y) = +∞ if (x,y) ∈ s; −∞ if (x,y) ∈ s̄
The resulting array Acfinal(x,y) will contain strongly positive values at pixel locations corresponding to the display area and strongly negative values at pixel locations corresponding to non-display areas. Hence, a threshold comparison can be used to separate the extremely positive values from the extremely negative values and thereby identify the location of the display area within the capture area of the image capture device.
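For illustration, the complete procedure of FIGS. 4A and 4B, including loop 48 and the final threshold comparison, can be sketched as below. This is an assumed rendering, not code from the patent: show() and capture() are hypothetical placeholders for the actual display and camera interfaces, apply_feedback() is the helper from the earlier sketch, and the iteration count and zero threshold are arbitrary choices.

```python
import numpy as np

def locate_display_area(show, capture, n_iterations=3, threshold=0):
    """Sketch of the iterative feedback method of FIGS. 4A and 4B.

    show(level) is assumed to display a single-intensity grayscale image
    (0.0 = I_LOW, 0.5 = I_mid, 1.0 = I_HIGH); capture() is assumed to
    return the camera frame as a 2-D integer NumPy array.
    """
    show(0.5)                                   # first image at I_mid
    a_c1 = capture()                            # stored reference Ac1(x,y)
    a_c = np.zeros(a_c1.shape, dtype=np.int32)  # Ac(x,y) = 0 (blocks 40A/40B)

    for _ in range(n_iterations):               # iterative loop 48
        show(1.0)                               # second image at I_HIGH
        a_c = apply_feedback(a_c, a_c1, capture(), brighter=True)
        show(0.0)                               # third image at I_LOW
        a_c = apply_feedback(a_c, a_c1, capture(), brighter=False)

    # Threshold comparison on the strongly positive/negative final values:
    return a_c > threshold                      # True inside the display area
```

The returned boolean mask plays the role of Acfinal(x,y) after thresholding: True where the display area was found, False elsewhere.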
FIG. 5A illustrates an exemplary image of a display area 12 captured within a capture area 13A. In particular, the display area 12 is a computer monitor. FIG. 5B represents the data (i.e., Acfinal(x,y)) providing the location information of the display area 12. As can be seen, the display area 12 is white, corresponding to pixel values having a maximum intensity level of IHIGH, and the non-display area surrounding the display area within the capture area 13A is black, corresponding to pixel values having a minimum intensity level of ILOW.
In a fourth embodiment of the method of the present invention, the second and third images are not displayed with a single grayscale intensity. Instead, the second image is displayed such that a first region of the image has a first intensity value and a second region has a second intensity value, while the third image is displayed with the converse intensities in the corresponding regions. For instance, FIGS. 6A and 6B show display area 12 within the capture area 13A and the manner in which the second and third images are displayed for the fourth embodiment of the method of the present invention. As shown in FIG. 6A, the second image 60 is displayed within display area 12 such that a region P1 of the display area 12 is set to a grayscale intensity value corresponding to IHIGH (i.e., white) and the other region P2 is set to a grayscale value corresponding to ILOW (i.e., black). It should be noted that region P2 essentially frames the display area 12. Region P3 corresponds to the non-display area. Conversely, the third image 61 shown in FIG. 6B is displayed such that region P1 is set to a grayscale intensity value corresponding to ILOW and region P2 is set to a grayscale intensity value corresponding to IHIGH.
The effect of displaying the second and third images as shown in FIGS. 6A and 6B is to obtain final array data Acfinal(x,y) that identifies the location of a display area in the capture area as the display area region P1 with a non-fuzzy or “clean” border around it. In particular, by using the method as shown in FIGS. 4A and 4B and displaying the second and third images as shown in FIGS. 6A and 6B, the display area P1 is driven positively, the non-display screen area P3 is driven negatively, and the display area border P2 is driven even more negatively than the non-display screen area such that a sharper border is established about display area P1.
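For illustration, the two framed calibration images of FIGS. 6A and 6B could be generated as in the sketch below; the image dimensions and the frame width are assumed parameters, as the patent does not specify them.

```python
import numpy as np

def framed_calibration_images(height, width, frame=32):
    """Build the second and third images of FIGS. 6A and 6B.

    In the second image, inner region P1 is white (I_HIGH) and the border
    frame P2 is black (I_LOW); the third image is the exact inverse. The
    frame width of 32 pixels is an arbitrary illustrative choice.
    """
    second = np.zeros((height, width), dtype=np.uint8)  # P2: black frame
    second[frame:-frame, frame:-frame] = 255            # P1: white interior
    third = 255 - second                                # inverse intensities
    return second, third
```

Because P2 receives destructive feedback under both comparisons, it ends up more negative than the surrounding non-display area P3, which is what sharpens the border around P1.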
The second, third and fourth embodiments can be implemented such that array Ac(x,y) is normalized prior to iteratively applying the first and second feedback (blocks 43A-48A). Normalization is performed by subtracting a static constant value from all pixel values in the array as follows:
Ac(x,y) = Ac(x,y) − cstatic  (eq. 1)
This normalization is performed so as to better discriminate between regions that have constructive feedback and those that do not. In other words, normalization is performed to make the iterative feedback more effective in providing optimized final array data for identifying display and non-display areas.
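In the sketches above, eq. 1 would be a single subtraction applied to the accumulator at the start of each pass through loop 48; the value of cstatic below is an assumption, since the description does not give one.

```python
C_STATIC = 2  # assumed small bias constant; the patent does not give a value

def normalize(a_c):
    """Eq. 1: subtract a static constant from every accumulator value so
    that only consistently reinforced (display area) pixels stay positive."""
    return a_c - C_STATIC
```

In locate_display_area above, this would be invoked as a_c = normalize(a_c) at the top of each iteration, before the two feedback steps.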
In another embodiment, the system and method can be implemented such that instead of applying constructive feedback to display area values and destructive feedback to non-display area values in array Ac(x,y), the inverse feedback conditions apply. Specifically, destructive feedback is applied to the display area values while constructive feedback is applied to non-display area values in order to identify the non-display area within the capture area of the image capture device.
In still another embodiment, the system and method can be implemented such that instead of comparing the pixel values of the stored first image to corresponding pixel values of the second and third captured image data (FIGS. 4A and 4B, blocks 43 and 46), the first captured image pixel values are compared to noise adjusted second and third captured image pixel values, where a noise adjusted pixel value is the original pixel value plus a noise factor value, cnf. For instance, conditions 1 and 3 become the following:
if Ac2(x,y) + cnf > Ac1(x,y), then Ac(x,y) = Ac(x,y) + cpos;  (condition 1′)

if Ac3(x,y) + cnf < Ac1(x,y), then Ac(x,y) = Ac(x,y) + cpos;  (condition 3′)
where cnf is a constant value that takes into account minor fluctuations in pixel values within the user interactive, computer controlled display system due to lighting conditions, camera quality, and the quality and type of displayed image. The addition of the noise factor results in a more robust implementation of the system and method of the present invention.
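Under the same assumptions as the earlier sketches, the noise-adjusted variant changes only the two comparisons; the value of cnf below is assumed, since the patent leaves it dependent on lighting, camera quality, and the displayed image.

```python
import numpy as np

C_NF = 4                 # assumed noise factor value; tune for the setup
C_POS, C_NEG = 20, -16   # feedback constants as before

def apply_feedback_noise_adjusted(a_c, a_c1, a_later, brighter=True):
    """Variant of apply_feedback implementing conditions 1' and 3': the
    later captured image is noise adjusted by c_nf before the comparison."""
    adjusted = a_later.astype(np.int32) + C_NF  # widen to avoid uint8 wrap
    if brighter:
        moved = adjusted > a_c1   # condition 1': Ac2 + c_nf > Ac1
    else:
        moved = adjusted < a_c1   # condition 3': Ac3 + c_nf < Ac1
    return a_c + np.where(moved, C_POS, C_NEG)
```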
Hence, a system and method have been described that provide an arithmetically non-complex solution to finding the display area within the capture area of an image capture device in a user interactive, computer controlled display system.
In the preceding description, numerous specific details are set forth, such as feedback constants, intensity values, and image types, in order to provide a thorough understanding of the present invention. It will be apparent, however, to one skilled in the art that these specific details need not be employed to practice the present invention. In other instances, well-known image processing techniques have not been described in detail in order to avoid unnecessarily obscuring the present invention.
In addition, although elements of the present invention have been described in conjunction with certain embodiments, it is appreciated that the invention can be implemented in a variety of other ways. Consequently, it is to be understood that the particular embodiments shown and described by way of illustration are in no way intended to be considered limiting. Reference to the details of these embodiments is not intended to limit the scope of the claims, which themselves recite only those features regarded as essential to the invention.

Claims (18)

I claim:
1. A method for determining the location of a computer controlled display area within an image capture area of an image capture device, the method comprising:
displaying a plurality of selected images within the display area;
capturing a plurality of images within the capture area of the image capture device each including one of the displayed selected images; and
deriving constructive and destructive feedback data from image data corresponding to the plurality of captured images;
wherein displaying the plurality of selected images comprises displaying at least three images, including displaying a first selected image having a first single intensity grayscale value I1, displaying a second selected image having a second single intensity grayscale value I2, and displaying a third selected image having a third single intensity grayscale value I3, wherein I1>I2>I3.
2. The method as described in claim 1 wherein the second single intensity grayscale value corresponds to one of a maximum intensity level and a minimum intensity level, the third single intensity grayscale value corresponds to the other of the maximum intensity level and the minimum intensity level, and the first single intensity grayscale value is a value between the maximum and minimum intensity values.
3. The method as described in claim 1 wherein displaying a plurality of selected images comprises displaying each of the second selected and the third selected images such that first regions have a corresponding intensity value I2 and second regions have a corresponding intensity value I3.
4. The method as described in claim 1 wherein deriving constructive and destructive feedback data from the image data comprises:
storing the image data corresponding to a first captured image of the plurality of captured images which includes the first selected image in a pixel array associated with the capture area; and
one of incrementing and decrementing pixel values in the pixel array dependent on the image data corresponding to a second captured image of the plurality of captured images including the second selected image and dependent on the image data corresponding to a third captured image of the plurality of captured images including the third selected image.
5. The method as described in claim 4 wherein the pixel values are incremented by a first predetermined constant value and the pixel values are decremented by a second predetermined constant value.
6. The method as described in claim 4 wherein pixel values corresponding to the display area within the capture area are incremented so as to create the constructive feedback data and pixel values not corresponding to the display area are decremented so as to create the destructive feedback data.
7. The method as described in claim 4 wherein incrementing and decrementing pixel values further comprises comparing captured image data of the first captured image to image data corresponding to each of the second captured image and the third captured image and incrementing and decrementing dependent on the comparison results.
8. The method as described in claim 7 wherein comparing further comprises comparing pixel values of the first captured image data to corresponding noise adjusted pixel values of each of the second captured image and the third captured image data, wherein the noise adjusted pixel value comprises the original pixel value plus a noise factor value.
9. The method as described in claim 4 wherein the display area is a projection screen and the image capture device is a digital camera.
10. A system comprising:
a computing system;
a display area controlled by the computing system to display a plurality of selected images within the display area;
an image capture device having an image capture area for capturing a plurality of images within the capture area each including one of the selected displayed images;
a display area locator for deriving constructive and destructive feedback data from image data corresponding to the plurality of captured images;
wherein displaying the plurality of selected images comprises displaying at least three images, including displaying a first selected image having a first single intensity grayscale value I1, displaying a second selected image having a second single intensity grayscale value I2, and displaying a third selected image having a third single intensity grayscale value I3, wherein I1>I2>I3.
11. The system as described in claim 10 wherein the second single intensity grayscale value corresponds to one of a maximum intensity level and a minimum intensity level, the third single intensity grayscale value corresponds to the other of the maximum intensity level and the minimum intensity level, and the first single intensity grayscale value is a value between the maximum and minimum intensity values.
12. The system as described in claim 10 wherein the second selected image includes a first region having a corresponding intensity value I2 and a second region having a corresponding intensity value I3 and the third selected image includes the first region having a corresponding intensity value I3 and the second region having a corresponding intensity value I2.
13. The system as described in claim 10 further comprising:
a means for storing an array of pixel values associated with the image data corresponding to a first captured image of the plurality of captured images, the first captured image including a first selected image of the plurality of selected images; and
a means for one of incrementing and decrementing the pixel values dependent on the image data corresponding to the remainder of the plurality of captured images.
14. The system as described in claim 13 wherein the pixel values are incremented by a first predetermined constant value and the pixel values are decremented by a second predetermined constant value.
15. The system as described in claim 13 wherein pixel values corresponding to the display area within the capture area are incremented so as to create the constructive feedback data and pixel values not corresponding to the display area are decremented so as to create the destructive feedback data.
16. The system as described in claim 13 wherein the display area is a projection screen and the image capture device is a digital camera.
17. An apparatus for determining the location of a computer controlled display area within an image capture area of an image capture device comprising:
a means for storing image data corresponding to at least one of a plurality of captured images, each captured image including one of a plurality of selected images displayed within the display area;
a means for deriving constructive and destructive feedback data from the image data corresponding to the plurality of captured images;
wherein displaying the plurality of selected images comprises displaying at least three images, including displaying a first selected image having a first single intensity grayscale value I1, displaying a second selected image having a second single intensity grayscale value I2, and displaying a third selected image having a third single intensity grayscale value I3, wherein I1>I2>I3.
18. The apparatus as described in claim 17 wherein the derivation means comprises:
a means for comparing stored image data corresponding to a first captured image of the plurality of captured images to second and third image data corresponding to each of second and third captured images of the plurality of captured images; and
a means for one of incrementing and decrementing the stored image data dependent on the comparison of the image data.
Application US09/774,452, filed 2001-01-30 (priority date 2001-01-30): Method for robust determination of visible points of a controllable display within a camera view. Granted as US6731330B2; status: Expired - Fee Related.

Priority Applications (4)

US09/774,452 (priority date 2001-01-30, filed 2001-01-30): Method for robust determination of visible points of a controllable display within a camera view
PCT/US2002/002716 (filed 2002-01-29): Method for robust determination of visible points of a controllable display within a camera view
EP02702108A (filed 2002-01-29): Method for robust determination of visible points of a controllable display within a camera view
JP2002565239A (filed 2002-01-29): Robust determination method of controllable display visible point in camera view

Applications Claiming Priority (1)

US09/774,452 (priority date 2001-01-30, filed 2001-01-30): Method for robust determination of visible points of a controllable display within a camera view

Publications (2)

US20020126136A1, published 2002-09-12
US6731330B2, granted 2004-05-04

Family

Family ID: 25101272

Family Applications (1)

US09/774,452 (priority date 2001-01-30, filed 2001-01-30): Method for robust determination of visible points of a controllable display within a camera view. Status: Expired - Fee Related.

Country Status (4)

US: US6731330B2
EP: EP1356422A2
JP: JP2005500590A
WO: WO2002065388A2

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7031034B2 (en) * 2000-10-02 2006-04-18 Eastman Kodak Company Articulating camera for digital image acquisition
FI117488B (en) 2001-05-16 2006-10-31 Myorigo Sarl Browsing information on screen
JP2004199299A (en) * 2002-12-18 2004-07-15 Casio Comput Co Ltd Handwritten information recording method and projection recording device
US20090297062A1 (en) * 2005-03-04 2009-12-03 Molne Anders L Mobile device with wide-angle optics and a radiation sensor
US20090305727A1 (en) * 2005-03-04 2009-12-10 Heikki Pylkko Mobile device with wide range-angle optics and a radiation sensor
CN101213550B (en) 2005-06-30 2011-09-28 诺基亚公司 Camera control means to allow operating of a destined location of the information surface of a presentation and information system
US7835544B2 (en) 2006-08-31 2010-11-16 Avago Technologies General Ip (Singapore) Pte. Ltd. Method and system for far field image absolute navigation sensing
JP4872610B2 (en) * 2006-11-09 2012-02-08 カシオ計算機株式会社 Camera pointer device, labeling method and program
WO2008094458A1 (en) 2007-01-26 2008-08-07 F-Origin, Inc. Viewing images with tilt control on a hand-held device
KR100936816B1 (en) * 2007-05-26 2010-01-14 이문기 Pointing device using camera and outputting mark
US20130187854A1 (en) 2007-05-26 2013-07-25 Moon Key Lee Pointing Device Using Camera and Outputting Mark
ES2323399B2 (en) * 2007-06-19 2010-02-26 Gat Microencapsulation Ag SUSPENSIONS IN AGRICULTURAL SULFONILE AND COMBINATIONS OILS.
FR2952396A1 (en) * 2009-11-12 2011-05-13 Conception Mecanique Soc D MOTORIZED LADDER DEVICE FOR VERY HEAVY SWING DOOR AND VERY HEAVY SWING DOOR EQUIPPED WITH SUCH A DEVICE
US8947415B1 (en) * 2010-04-19 2015-02-03 Green Display LLC Display system and method
WO2011133553A1 (en) * 2010-04-19 2011-10-27 Mowry Craig P Display system and device with sustainable power generation
WO2016036370A1 (en) * 2014-09-04 2016-03-10 Hewlett-Packard Development Company, L.P. Projection alignment

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE19612949C1 (en) * 1996-04-01 1997-08-14 Siemens Ag Computer-aided classification of finger-shaped object on hand-shaped object using virtual touch screen

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5528263A (en) * 1994-06-15 1996-06-18 Daniel M. Platzker Interactive projected video image display system
EP0718748A2 (en) 1994-12-22 1996-06-26 Canon Kabushiki Kaisha Pointed-position detecting apparatus and method
US5729252A (en) * 1994-12-27 1998-03-17 Lucent Technologies, Inc. Multimedia program editing system and method
US5973672A (en) * 1996-10-15 1999-10-26 Raytheon Company Multiple participant interactive interface
US6075895A (en) 1997-06-20 2000-06-13 Holoplex Methods and apparatus for gesture recognition based on templates
US6512507B1 (en) * 1998-03-31 2003-01-28 Seiko Epson Corporation Pointing position detection device, presentation system, and method, and computer-readable medium
US6292171B1 (en) * 1999-03-31 2001-09-18 Seiko Epson Corporation Method and apparatus for calibrating a computer-generated projected image
US6346933B1 (en) * 1999-09-21 2002-02-12 Seiko Epson Corporation Interactive display presentation system
US6486894B1 (en) * 1999-11-18 2002-11-26 International Business Machines Corporation Contrasting graphical user interface pointer

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040062004A1 (en) * 2002-09-27 2004-04-01 Pfeifer David W. Bus bar assembly for use with a compact power conversion assembly
US20040239653A1 (en) * 2003-05-27 2004-12-02 Wolfgang Stuerzlinger Collaborative pointing devices
US7193608B2 (en) * 2003-05-27 2007-03-20 York University Collaborative pointing devices
US20040246276A1 (en) * 2003-05-30 2004-12-09 Seiko Epson Corporation Image display device and image display system
US20060139362A1 (en) * 2004-12-23 2006-06-29 I-Jong Lin Robust estimation of pixel values
US7555164B2 (en) 2004-12-23 2009-06-30 Hewlett-Packard Development Company, L.P. Robust estimation of pixel values
US20110285624A1 (en) * 2010-05-24 2011-11-24 Hon Hai Precision Industry Co., Ltd. Screen positioning system and method based on light source type
US20120176304A1 (en) * 2011-01-07 2012-07-12 Sanyo Electric Co., Ltd. Projection display apparatus
US20150205345A1 (en) * 2014-01-21 2015-07-23 Seiko Epson Corporation Position detection system and control method of position detection system
US9639165B2 (en) * 2014-01-21 2017-05-02 Seiko Epson Corporation Position detection system and control method of position detection system
US10114475B2 (en) 2014-01-21 2018-10-30 Seiko Epson Corporation Position detection system and control method of position detection system

Also Published As

WO2002065388A2, published 2002-08-22
EP1356422A2, published 2003-10-29
JP2005500590A, published 2005-01-06
US20020126136A1, published 2002-09-12
WO2002065388A3, published 2002-10-31

Similar Documents

Publication Publication Date Title
US6731330B2 (en) Method for robust determination of visible points of a controllable display within a camera view
US7068841B2 (en) Automatic digital image enhancement
US6766066B2 (en) Detection of pointed position using image processing
JP3885458B2 (en) Projected image calibration method and apparatus, and machine-readable medium
JP3909554B2 (en) Presentation control system and control method thereof
US7342572B2 (en) System and method for transforming an ordinary computer monitor into a touch screen
US20060165311A1 (en) Spatial standard observer
JP5560721B2 (en) Image processing apparatus, image display system, and image processing method
US20020136455A1 (en) System and method for robust foreground and background image data separation for location of objects in front of a controllable display within a camera view
EP1356423B1 (en) System and method for extracting a point of interest of an object in front of a computer controllable display captured by an imaging device
US20110025929A1 (en) Light Transport Matrix from Homography
US20190005323A1 (en) Information processing apparatus for tracking processing
US20190053607A1 (en) Electronic apparatus and method for providing makeup trial information thereof
JP4229702B2 (en) Local improvement of display information
CN109886864B (en) Privacy mask processing method and device
US7440595B2 (en) Method and apparatus for processing images
US7555164B2 (en) Robust estimation of pixel values
US11481507B2 (en) Augmented reality document redaction
US11393214B2 (en) Apparatus and method for congestion visualization
JP2011090606A (en) Image processor, image display system and image processing method
KR20220025552A (en) Image processing device and image enhancing method
JP6944828B2 (en) Image processing system, image processing method, and image processing program
US11330236B2 (en) Projector controlling method and projector
US11763509B2 (en) Frame calibration for robust video synthesis
JP2008225555A (en) Handwritten character input device

Legal Events

Date Code Title Description
AS Assignment

Owner name: HEWLETT-PACKARD COMPANY, COLORADO

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:LIN, I-JONG;REEL/FRAME:011696/0302

Effective date: 20010129

AS Assignment

Owner name: HEWLETT-PACKARD DEVELOPMENT COMPANY L.P., TEXAS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HEWLETT-PACKARD COMPANY;REEL/FRAME:014061/0492

Effective date: 20030926

FPAY Fee payment

Year of fee payment: 4

REMI Maintenance fee reminder mailed
FPAY Fee payment

Year of fee payment: 8

REMI Maintenance fee reminder mailed
LAPS Lapse for failure to pay maintenance fees
STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20160504