WO2006090386A2 - A virtual keyboard device - Google Patents

A virtual keyboard device

Info

Publication number
WO2006090386A2
Authority
WO
WIPO (PCT)
Prior art keywords
detector
location
light
optical element
light modulator
Prior art date
Application number
PCT/IL2006/000246
Other languages
French (fr)
Other versions
WO2006090386A3 (en)
Inventor
Klony Lieberman
Original Assignee
Vkb Inc.
Priority date
Filing date
Publication date
Application filed by Vkb Inc. filed Critical Vkb Inc.
Publication of WO2006090386A2 publication Critical patent/WO2006090386A2/en
Publication of WO2006090386A3 publication Critical patent/WO2006090386A3/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/042: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F 3/0421: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
    • G06F 3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0346: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00: Details of television systems
    • H04N 5/74: Projection arrangements for image reproduction, e.g. using eidophor
    • H04N 5/7416: Projection arrangements for image reproduction involving the use of a spatial light modulator, e.g. a light valve, controlled by a video signal
    • H04N 5/7441: Projection arrangements for image reproduction involving the use of a spatial light modulator, the modulator being an array of liquid crystal cells
    • H04N 9/00: Details of colour television systems
    • H04N 9/12: Picture reproducers
    • H04N 9/31: Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N 9/3179: Video signal processing therefor
    • H04N 9/3185: Geometric adjustment, e.g. keystone or convergence

Definitions

  • the present invention relates to data entry devices generally.
  • the present invention seeks to provide an improved data entry device.
  • a virtual data entry device including an illuminator generating a generally planar beam of light and an impingement sensor assembly operative to sense at least one location of impingement of the planar beam of light by at least one object, the impingement sensor assembly including at least one optical element arranged to receive light from the planar beam reflected by the at least one object, the at least one optical element having optical power in a first direction such that it focuses light at at least one focus line location, wherein the at least one focus line location is a function of the location of the at least one object relative to the at least one optical element, and a multi-element detector arranged to receive light passing through the at least one optical element, wherein the distribution of the light detected by the multi-element detector among multiple elements thereof indicates the location of the at least one object.
  • the at least one optical element has optical power in a second direction which is at least one of zero or substantially different from the optical power in the first direction.
  • the at least one optical element is at least one of a cylindrical lens, a conical lens and a toroidal lens.
  • the multi-element detector has multiple elements arranged side-by-side along a detector line which intersects the at least one focus line, wherein the location of intersection of the at least one focus line and the detector line indicates distance of the at least one object from the at least one optical element. Additionally or alternatively, the multi-element detector has multiple elements arranged side-by-side along a detector line which intersects the at least one focus line, wherein the location of intersection of the at least one focus line and the detector line indicates azimuthal location of the at least one object from the at least one optical element.
  • the multi-element detector has multiple elements arranged side-by-side in two dimensions in a plane which intersects the at least one focus line, whereby the location of intersection of the at least one focus line on the multi-element detector indicates the location of the at least one object in two-dimensions.
  • the at least one multi-element detector comprises a first multi-element detector and a second multi-element detector and the at least one optical element comprises a first optical element and a second optical element, the first multi-element detector having multiple elements arranged side-by-side along a first detector line which intersects the at least one focus line, wherein the location of intersection of the at least one focus line and the first detector line indicates azimuthal location of the at least one object from the first optical element and the second multi-element detector having multiple elements arranged side-by-side along a second detector line which intersects the at least one focus line, wherein the location of intersection of the at least one focus line and the second detector line indicates distance of the at least one object from the second optical element.
  • the multi-element detector includes a generally one-dimensional slit lying in the second plane along a detector line which intersects the at least one focus line, wherein the location of intersection of the at least one focus line and the detector line indicates distance of the at least one object from the at least one optical element.
  • the generally planar beam of light is spaced from and generally parallel to a generally planar display surface. More preferably, the generally planar display surface includes an LCD screen.
  • the virtual data entry device also includes processing circuitry operative to receive at least one output signal of the multi-element detector and to determine the location of the at least one object.
  • a virtual data entry device including an illuminator generating a generally planar beam of light generally parallel to a generally flat surface which is at least partially light reflecting and an impingement sensor assembly operative to sense at least one location of impingement of the planar beam of light by at least one object, the impingement sensor assembly including at least one optical element arranged to receive light from the planar beam reflected by the at least one object directly and indirectly via the surface and a two-dimensional multi-element detector arranged to receive light passing through the at least one optical element, wherein the spatial separation on the detector of the light detected by the multi-element detector which was reflected directly from the at least one object and the light which was reflected via the light reflecting surface thereof indicates distance of the at least one object from the at least one optical element and location on the detector of the light reflected directly and indirectly from the at least one object indicates the azimuthal location of the at least one object relative to the at least one optical element.
  • the generally flat light-reflecting surface includes an LCD screen.
  • the virtual data entry device also includes processing circuitry operative to receive at least one output signal of the multi-element detector and to determine at least one of the distance of the at least one object from the at least one optical element and the azimuthal location of the at least one object relative to the at least one optical element. Additionally or alternatively, the processing circuitry is operative to output the location of the at least one object in Cartesian coordinates.
  • a pattern projector including a source of light to be projected, a spatial light modulator arranged in a spatial light modulator plane, the spatial light modulator receiving the light from the source of light and being configured to pass the light therethrough in a first pattern and projection optics receiving the light from the spatial light modulator and being operative to project a desired second pattern onto a projection surface lying in a projection surface plane which is angled with respect to the spatial light modulator plane, the first pattern being a distortion of the desired second pattern configured such that keystone distortions resulting from the difference in angular orientations of the spatial light modulator plane and the projection surface plane are compensated.
  • the pattern projector also includes a collimator interposed between the source of light and the spatial light modulator, the collimator being operative to distribute the light from the source of light across the spatial light modulator in a non-uniform distribution such that light distribution in the desired second pattern has uniformity greater than the nonuniform distribution.
  • the spatial light modulator includes an LCD. More preferably, the LCD is a matrix LCD. Alternatively, the LCD is a segmented LCD.
  • the spatial light modulator plane is angled with respect to the projection optics such that the desired second pattern is focused to a generally uniform extent.
  • the spatial light modulator includes a plurality of pixels, different ones of the plurality of pixels of the spatial light modulator having different sizes. Alternatively, different ones of the plurality of pixels of the spatial light modulator have different shapes.
  • the spatial light modulator plane is angled with respect to the projection optics such that the desired second pattern is focused to a generally uniform extent.
  • the spatial light modulator includes a plurality of segments, different ones of the plurality of segments of the spatial light modulator having different sizes. Alternatively, different ones of the plurality of segments of the spatial light modulator have different shapes.
  • a method for data entry including utilizing an illuminator to generate a generally planar beam of light and sensing at least one location of impingement of the planar beam of light by at least one object, using an impingement sensor assembly including at least one optical element arranged to receive light from the planar beam reflected by the at least one object, the at least one optical element having optical power in a first direction such that it focuses light at at least one focus line location, wherein the at least one focus line location is a function of the location of the at least one object relative to the at least one optical element, and a multi-element detector arranged to receive light passing through the at least one optical element, wherein the distribution of the light detected by the multi-element detector among multiple elements thereof indicates the location of the at least one object.
  • the method for data entry also includes determining the distance of the at least one object from the at least one optical element by determining a location of intersection between the at least one focus line and a detector line along which are arranged, in a side-by-side orientation, multiple elements of the multi-element detector. Additionally or alternatively, the method for data entry also includes determining the azimuthal location of the at least one object with respect to the at least one optical element by determining a location of intersection between the at least one focus line and a detector line along which are arranged, in a side-by-side orientation, multiple elements of the multi-element detector.
  • the method for data entry also includes determining the location of the at least one object in two-dimensions by determining a location of intersection between the at least one focus line and a detector plane along which are arranged, in a side-by-side two-dimensional orientation, multiple elements of the multi-element detector.
  • the at least one multi-element detector comprises a first multi-element detector and a second multi-element detector and the at least one optical element comprises a first optical element and a second optical element, the method also including determining the azimuthal location of the at least one object with respect to the first optical element by determining a location of intersection between the at least one focus line and a first detector line along which are arranged, in a side-by-side orientation, multiple elements of the first multi-element detector and determining the distance of the at least one object from the second optical element by determining a location of intersection between the at least one focus line and a second detector line along which are arranged, in a side-by-side orientation, multiple elements of the second multi-element detector.
  • the method of data entry also includes providing processing circuitry for receiving at least one output signal of the multi-element detector and determining the location of the at least one object.
  • a method for data entry including utilizing an illuminator for generating a generally planar beam of light generally parallel to a generally flat surface which is at least partially light reflecting and sensing at least one location of impingement of the planar beam of light by at least one object, using an impingement sensor assembly including at least one optical element arranged to receive light from the planar beam reflected by the at least one object directly and indirectly via the surface and a two-dimensional multi-element detector arranged to receive light passing through the at least one optical element, wherein the spatial separation on the detector of the light detected by the multi-element detector which was reflected directly from the at least one object and the light which was reflected via the light reflecting surface thereof indicates distance of the at least one object from the at least one optical element and location on the detector of the light reflected directly and indirectly from the at least one object indicates the azimuthal location of the at least one object relative to the at least one optical element.
  • the method for data entry also includes providing processing circuitry for receiving at least one output signal of the multi-element detector and for determining at least one of the distance of the at least one object from the at least one optical element and the azimuthal location of the at least one object relative to the at least one optical element. Additionally or alternatively, the providing processing circuitry also includes providing processing circuitry for outputting the location of the at least one object in Cartesian coordinates.
  • a method for projecting a pattern including providing a source of light to be projected, arranging a spatial light modulator, in a spatial light modulator plane, to receive the light from the source of light, the spatial light modulator being configured to pass the light therethrough in a first pattern and using projection optics for receiving the light from the spatial light modulator and for projecting a desired second pattern onto a projection surface lying in a projection surface plane which is angled with respect to the spatial light modulator plane, the first pattern being a distortion of the desired second pattern configured such that keystone distortions resulting from the difference in angular orientations of the spatial light modulator plane and the projection surface plane are compensated.
  • the method for projecting a pattern also includes providing a collimator interposed between the source of light and the spatial light modulator, and utilizing the collimator to distribute the light from the source of light across the spatial light modulator in a nonuniform distribution such that light distribution in the desired second pattern has uniformity greater than the non-uniform distribution.
  • the arranging the spatial light modulator includes angling the spatial light modulator plane with respect to the projection optics such that the desired second pattern is focused to a generally uniform extent.
  • the arranging the spatial light modulator includes providing a spatial light modulator including a plurality of pixels, different ones of the plurality of pixels having different sizes. Alternatively, different ones of the plurality of pixels have different shapes.
  • the arranging the spatial light modulator includes providing a spatial light modulator including a plurality of segments, different ones of the plurality of segments having different sizes. Alternatively, different ones of the plurality of segments have different shapes.
  • Fig. 1 is a simplified partially pictorial, partially diagrammatic illustration of a portion of a data entry device constructed and operative in accordance with a preferred embodiment of the present invention;
  • Fig. 2 is a simplified partially pictorial, partially diagrammatic illustration of a portion of a data entry device constructed and operative in accordance with another preferred embodiment of the present invention;
  • Fig. 3 is a simplified partially pictorial, partially diagrammatic illustration of a portion of a data entry device constructed and operative in accordance with yet another preferred embodiment of the present invention;
  • Fig. 4 is a simplified partially pictorial, partially diagrammatic illustration of a portion of a data entry device constructed and operative in accordance with still another preferred embodiment of the present invention;
  • Fig. 5 is a simplified partially pictorial, partially diagrammatic illustration of a portion of a projection device constructed and operative in accordance with yet another preferred embodiment of the present invention;
  • Fig. 6 is a sectional illustration of the projection device of Fig. 5, taken along lines VI - VI in Fig. 5;
  • Fig. 7 is a simplified partially pictorial, partially diagrammatic illustration of a portion of a projection device constructed and operative in accordance with a further preferred embodiment of the present invention;
  • Fig. 8 is a sectional illustration of the projection device of Fig. 7, taken along lines VIII - VIII in Fig. 7;
  • Fig. 9 is a sectional illustration of the projection device of Fig. 7, taken along lines IX- IX in Fig. 7;
  • Fig. 10 is a simplified partially pictorial, partially diagrammatic illustration of a portion of a projection device constructed and operative in accordance with yet a further preferred embodiment of the present invention; and
  • Fig. 11 is a sectional illustration of the projection device of Fig. 10, taken along lines XI - XI in Fig. 10.
  • Fig. 1 is a simplified partially pictorial, partially diagrammatic illustration of a portion of a data entry device constructed and operative in accordance with a preferred embodiment of the present invention.
  • a virtual data entry device generally designated by reference numeral 100 and including an illuminator 102, generating a generally planar beam of light, generally designated by reference numeral 104.
  • the generally planar beam of light 104 lies in spaced, generally parallel relationship to a generally planar surface 106 of a display 108, such as a LCD display.
  • An impingement sensor assembly 110 is operative to sense the distance of at least one location of impingement of the planar beam of light 104 by at least one object, such as a finger or a stylus.
  • the planar beam of light 104 impinges on two fingers 112 and 114 which typically are touching display surface 106.
  • the impingement sensor assembly 110 preferably includes at least one optical element, such as a cylindrical lens 116, having an optical axis 118 and arranged to receive light from planar beam 104 scattered or otherwise reflected by the fingers 112 and 114.
  • a conical lens or a toroidal lens may be used.
  • the optical element has optical power in a first direction 120, here perpendicular to display surface 106, thereby to focus light from finger 112 at a given focus line location 124, and to focus light from finger 114 at a given focus line location 128.
  • the focus line locations 124 and 128 are each a function of the distance of respective fingers 112 and 114, which scatter light from planar beam 104, from the optical element 116.
  • a multi-element detector, preferably a one-dimensional detector 130, such as a linear CMOS or CCD array commercially available from Panavision LTD. of 6219 De Soto Avenue, Woodland Hills, California, typically including 2048 pixels arranged along a straight line wherein each pixel is generally square, is preferably inclined with respect to the optical element 116 and is arranged to receive light passing through the at least one optical element 116, wherein the distribution of light, detected by the multi-element detector 130, indicates the distance of each of fingers 112 and 114 from optical element 116.
  • the multi-element detector 130 has multiple elements 134 arranged side-by-side along a detector line 136 which intersects the focus lines 124 and 128, wherein the location of intersection of each of focus lines 124 and 128 with the detector line 136 indicates the respective distances of fingers 112 and 114 from the optical element 116.
  • Detector 130 provides an output signal 138 which indicates the position therealong of the intersections of focus lines 124 and 128 therewith.
  • Output 138 is used by computerized processing circuitry 140, preferably forming part of the data entry device 100, to calculate the distances of fingers 112 and 114 from optical element 116.
  • Circuitry 140 preferably operates by locating the peaks 142 and 144 of the output signal 138, which correspond to detector element locations along detector 130. The detector element locations of the peaks 142 and 144 are directly mapped onto distances of the fingers 112 and 114 from optical element 116.
  • non-focused light scattered by each of the fingers 112 and 114 is also received by detector 130 and creates a background signal.
  • Conventional detection techniques which isolate signal peaks from background are preferably employed for eliminating inaccuracies which could otherwise result from the background signals.
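  • As a concrete, non-authoritative illustration of the processing performed by circuitry 140, the Python sketch below isolates signal peaks from the diffuse background of output signal 138 and maps each peak's element index to a distance through a monotonic calibration table; the monotonic relationship is what the inclined detector 130 and the distance-dependent focus line locations provide. The thresholding scheme, the calibration table and the helper names (find_peaks_1d, pixel_to_distance) are assumptions made for illustration, not part of the patent.

```python
import numpy as np

def find_peaks_1d(signal, background_win=51, k_sigma=4.0):
    """Isolate peak locations in a one-dimensional detector readout such as
    output signal 138. A sliding-median estimate of the diffuse background is
    subtracted, and contiguous runs of samples exceeding k_sigma times the
    residual noise are reduced to sub-pixel centroids."""
    sig = np.asarray(signal, dtype=float)
    pad = background_win // 2
    padded = np.pad(sig, pad, mode="edge")
    background = np.array([np.median(padded[i:i + background_win])
                           for i in range(sig.size)])
    residual = sig - background
    noise = np.median(np.abs(residual)) * 1.4826 + 1e-9  # robust sigma estimate
    above = residual > k_sigma * noise

    peaks, i = [], 0
    while i < sig.size:
        if above[i]:
            j = i
            while j < sig.size and above[j]:
                j += 1
            idx = np.arange(i, j)  # one contiguous peak
            peaks.append(float(np.sum(idx * residual[i:j]) / np.sum(residual[i:j])))
            i = j
        else:
            i += 1
    return peaks

def pixel_to_distance(peak_index, calib_pixels, calib_distances_mm):
    """Map a detector-element index to object distance using a calibration
    table (calib_pixels must be increasing), built once by recording the peak
    produced by a reference object placed at a series of known distances."""
    return float(np.interp(peak_index, calib_pixels, calib_distances_mm))
```

  • In such a scheme the calibration table absorbs the focal length of optical element 116 and the inclination of detector 130, so no explicit optical model is needed at run time.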
  • FIG. 2 is a simplified partially pictorial, partially diagrammatic illustration of a portion of a data entry device constructed and operative in accordance with another preferred embodiment of the present invention.
  • a virtual data entry device generally designated by reference numeral 200, and including an illuminator 202, generating a generally planar beam of light, generally designated by reference numeral 204.
  • the generally planar beam of light 204 lies in spaced, generally parallel relationship to a generally planar surface 206 of a display 208, such as a LCD display.
  • An impingement sensor assembly 210 is operative to sense in two dimensions at least one location of impingement of the planar beam of light 204 by at least one object, such as a finger or a stylus.
  • the impingement sensor assembly 210 preferably includes at least one optical element, such as a cylindrical lens 216, having an optical axis 218 and arranged to receive light from planar beam 204 scattered or otherwise reflected by the fingers 212 and 214.
  • a conical lens or a toroidal lens may be used.
  • the optical element has optical power in a first direction 220, here parallel to display surface 206, thereby to focus light from finger 212 at a given focus line location 224, and to focus light from finger 214 at a given focus line location 228.
  • the focus line locations 224 and 228 are each a function of the location of respective fingers 212 and 214, which scatter light from planar beam 204, relative to the optical element 216.
  • a multi-element detector preferably a two-dimensional detector 230, such as a two-dimensional CMOS or CCD array commercially available from OmniVision Technologies Inc. of 1341 La Drive, Sunnyvale, California, typically including 640x480 pixels, is arranged to receive light passing through the at least one optical element 216, wherein the distribution of light detected by the multi-element detector 230 indicates the location of each of fingers 212 and 214 relative to optical element 216.
  • the multi-element detector 230 has multiple elements 234 arranged to lie in a detector plane 236 which is inclined with respect to focus lines 224 and 228 so as to intersect the focus lines 224 and 228, wherein the location of intersection of each of focus lines 224 and 228 with the detector plane 236 indicates the respective locations of fingers 212 and 214 relative to optical element 216.
  • the Y location of each of signal peaks 238 and 240, which correspond to fingers 212 and 214 respectively, indicates the distance of the respective finger from optical element 216, while the X location of each of the signal peaks 238 and 240 indicates the azimuthal location of the respective finger relative to optical element 216.
  • Detector 230 provides an output image signal 242 which indicates the positions of fingers 212 and 214.
  • Output image signal 242 is used by computerized processing circuitry 244, preferably forming part of the data entry device 200, to calculate the positions of fingers 212 and 214 with respect to optical element 216.
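  • A hedged sketch of the corresponding computation in processing circuitry 244 follows: the Y (row) coordinate of a peak in output image signal 242 is interpolated into a distance and the X (column) coordinate into an azimuth, using calibration tables measured beforehand. The linear interpolation, the table values in the example and the function name are assumptions for illustration only.

```python
import numpy as np

def peak_to_polar(row, col, row_calib, dist_calib_mm, col_calib, azim_calib_rad):
    """Convert a peak location on two-dimensional detector 230 into polar
    coordinates of the finger relative to optical element 216. The calibration
    tables (assumed here) would be built by imaging a stylus at known
    positions; both row_calib and col_calib must be increasing sequences."""
    distance_mm = float(np.interp(row, row_calib, dist_calib_mm))
    azimuth_rad = float(np.interp(col, col_calib, azim_calib_rad))
    return distance_mm, azimuth_rad

# Example with made-up calibration numbers: a peak at row 212.4, column 390.0,
# where rows 0..479 span 50..300 mm and columns 0..639 span -0.5..+0.5 rad.
d_mm, a_rad = peak_to_polar(212.4, 390.0,
                            row_calib=[0, 479], dist_calib_mm=[50.0, 300.0],
                            col_calib=[0, 639], azim_calib_rad=[-0.5, 0.5])
```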
  • Fig. 3 is a simplified partially pictorial, partially diagrammatic illustration of a portion of a data entry device constructed and operative in accordance with yet another preferred embodiment of the present invention.
  • a virtual data entry device generally designated by reference numeral 300 and including an illuminator 302, generating a generally planar beam of light, generally designated by reference numeral 304.
  • the generally planar beam of light 304 lies in spaced, generally parallel relationship to a generally planar surface 306 of a display 308, such as a LCD display.
  • An impingement sensor assembly 310 is operative to sense at least one location of impingement of the planar beam of light 304 by at least one object, such as a finger or a stylus.
  • the planar beam of light 304 impinges on two fingers 312 and 314 which typically are touching display surface 306.
  • the impingement sensor assembly 310 preferably includes at least two optical elements, such as cylindrical lenses 315 and 316, having respective optical axes 317 and 318 and arranged to receive light from planar beam 304 scattered or otherwise reflected by the fingers 312 and 314.
  • cylindrical lenses 315 and 316 may be replaced by a conical lens or a toroidal lens.
  • the optical element 315 has optical power in a first direction 320, here perpendicular to display surface 306, thereby to focus light from finger 312 at a given focus line location 324, and to focus light from finger 314 at a given focus line location 328.
  • the focus line locations 324 and 328 are each a function of the distance of respective fingers 312 and 314, which scatter light from planar beam 304, from the optical element 315.
  • the optical element 316 has optical power in a second direction 330, here parallel to display surface 306, thereby to focus light from finger 312 at a given focus line location 334, and to focus light from finger 314 at a given focus line location 338.
  • the focus line locations 334 and 338 are each a function of the azimuthal location of respective fingers 312 and 314, which scatter light from planar beam 304, relative to the optical element 316.
  • a multi-element detector preferably a one-dimensional detector 340, such as a linear CMOS or CCD array commercially available from Panavision LTD. of 6219 De Soto Avenue, Woodland Hills, California, typically including 2048 pixels arranged along a straight line wherein each pixel is generally square, is preferably inclined with respect to optical element 315 and is arranged to receive light passing through the at least one optical element 315, wherein the distribution of light, detected by the multi-element detector 340, indicates the distance of each of fingers 312 and 314 from optical element 315.
  • the multi-element detector 340 has multiple elements 344 arranged side-by-side along a detector line 346 which intersects the focus lines 324 and 328, wherein the location of intersection of each of focus lines 324 and 328 with the detector line 346 indicates the respective distances of fingers 312 and 314 from the optical element 315.
  • non-focused light scattered by each of the fingers 312 and 314 is also received by detector 340 and creates a background signal.
  • Conventional detection techniques which isolate signal peaks from background are preferably employed for eliminating inaccuracies which could otherwise result from the background signals.
  • Impingement sensor assembly 310 includes, in addition to multi-element detector 340, an additional multi-element detector, which is preferably a one-dimensional detector 350, such as a linear CMOS or CCD array commercially available from Panavision LTD. of 6219 De Soto Avenue, Woodland Hills, California, typically including 2048 pixels arranged along a straight line.
  • Detector 350 is arranged to receive light passing through optical element 316, wherein the distribution of light, detected by the multi-element detector 350, indicates the azimuthal location of each of fingers 312 and 314 relative to optical element 316.
  • the multi-element detector 350 has multiple elements 354 arranged side-by-side along a detector line 356 which intersects the focus lines 334 and 338, wherein the location of intersection of each of focus lines 334 and 338 with the detector line 356 indicates the respective azimuthal locations of fingers 312 and 314 relative to optical element 316.
  • Detector 340 provides an output signal 360 which indicates the position therealong of the intersections of focus lines 324 and 328 therewith.
  • Output signal 360 is used by computerized processing circuitry 362, preferably forming part of the data entry device 300, to calculate the distances of fingers 312 and 314 from optical element 315.
  • Circuitry 362 preferably operates by locating peaks 364 and 366 of the output signal 360, which correspond to detector element locations along detector 340. The detector element locations of the peaks 364 and 366 are directly mapped by processing circuitry 362 onto distances of the fingers 312 and 314 from optical element 315.
  • Detector 350 provides an output signal 370 which indicates the position therealong of the intersections of focus lines 334 and 338 therewith.
  • Output signal 370 is used by computerized processing circuitry 362 to calculate the azimuthal locations of fingers 312 and 314 relative to optical element 316.
  • Circuitry 362 preferably operates by locating peaks 374 and 376 of the output signal 370, which correspond to detector element locations along detector 350. The detector element locations of the peaks 374 and 376 are directly mapped by processing circuitry 362 onto azimuthal locations of the fingers 312 and 314 relative to optical element 316.
  • Circuitry 362 preferably maps the distances and azimuthal locations of the fingers 312 and 314 onto Cartesian coordinate expressions of the two-dimensional positions of the fingers.
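  • The final step performed by circuitry 362, converting a (distance, azimuth) pair into Cartesian coordinates on display surface 306, is ordinary polar-to-Cartesian conversion once the peaks from the two detectors have been paired. The sketch below pairs peaks by sorted order, which is purely an illustrative assumption; the patent does not specify how simultaneous touches are matched across detectors 340 and 350.

```python
import math

def polar_to_cartesian(distance_mm, azimuth_rad):
    """Express a finger position, given as distance from optical element 315
    and azimuth relative to optical element 316, in Cartesian coordinates with
    the origin at the sensor assembly and the y-axis along azimuth zero."""
    return (distance_mm * math.sin(azimuth_rad),   # x: across the display
            distance_mm * math.cos(azimuth_rad))   # y: away from the sensor

def pair_and_convert(distances_mm, azimuths_rad):
    """Naive pairing of peaks from the two one-dimensional detectors: both
    lists are sorted and matched in order (an assumption, not the patent's
    method), then each pair is converted to Cartesian coordinates."""
    return [polar_to_cartesian(d, a)
            for d, a in zip(sorted(distances_mm), sorted(azimuths_rad))]
```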
  • Fig. 4 is a simplified partially pictorial, partially diagrammatic illustration of a portion of a data entry device constructed and operative in accordance with still another preferred embodiment of the present invention.
  • a virtual data entry device generally designated by reference numeral 400, and including an illuminator 402, generating a generally planar beam of light, generally designated by reference numeral 404.
  • the generally planar beam of light 404 lies in spaced, generally parallel relationship to a generally planar, at least partially reflective surface 406, such as a surface of a display 408, such as a LCD display.
  • An impingement sensor assembly 410 is operative to sense in two dimensions at least one location of impingement of the planar beam of light 404 by at least one object, such as a finger or a stylus.
  • the planar beam of light 404 impinges on two fingers 412 and 414 which typically are touching surface 406.
  • the impingement sensor assembly 410 preferably includes at least one optical element, such as a lens 416, having an optical axis 418 and arranged to receive light from planar beam 404 scattered or otherwise reflected by the fingers.
  • Lens 416 is preferably arranged to receive light from the planar beam 404 which is reflected by fingers 412 and 414 directly as well as indirectly via at least partially reflecting surface 406.
  • a multi-element detector preferably a two-dimensional detector 430, such as a two-dimensional CMOS or CCD array commercially available from OmniVision Technologies Inc. of 1341 La Drive, Sunnyvale, California, typically including 640x480 pixels, is arranged to receive light passing through the at least one optical element 416, wherein the distribution of light, detected by the multi-element detector 430 indicates the location of each of fingers 412 and 414 relative to optical element 416.
  • the multi-element detector 430 which has multiple elements 434 arranged to lie in an image plane of optical element 416, receives light via optical element 416 from the planar beam 404 which is reflected by fingers 412 and 414 directly as well as indirectly via at least partially reflecting surface 406.
  • the respective spatial separations D1 and D2 on detector 430 of impingements 442 and 444 of light which is reflected directly from fingers 412 and 414 and impingements 452 and 454 of light which is reflected via at least partially reflecting surface 406 indicate distances of the fingers 412 and 414 from optical element 416.
  • the respective X locations of impingements 442 and 444 on detector 430 indicate the azimuthal locations of fingers 412 and 414 relative to optical element 416.
  • Detector 430 provides an output image signal 462 which indicates the positions of fingers 412 and 414.
  • Output image signal 462 is used by computerized processing circuitry 464, preferably forming part of the data entry device 400, to calculate the two-dimensional positions of fingers 412 and 414 in Cartesian coordinates.
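  • The depth cue of this embodiment can be modelled, purely as an illustration, with a pinhole camera whose optical axis is roughly parallel to surface 406: the direct impingement (such as 442) and its reflection via the surface (such as 452) subtend an angle of about 2h/Z at lens 416, where h is the height of planar beam 404 above the surface and Z the finger's distance, so their separation D1 on detector 430 shrinks in inverse proportion to distance. The formula, the small-angle approximation and the parameter names below are assumptions of this sketch, not expressions given in the patent.

```python
import math

def distance_from_mirror_separation(sep_pixels, beam_height_mm, focal_len_pixels):
    """Pinhole-model sketch for the direct/mirror-image geometry of Fig. 4:
    the separation on detector 430 is approximately D = 2*h*f / Z, so the
    object distance is recovered as Z = 2*h*f / D."""
    return 2.0 * beam_height_mm * focal_len_pixels / float(sep_pixels)

def azimuth_from_column(col, principal_col, focal_len_pixels):
    """The horizontal (X) position of the direct impingement gives the azimuth
    of the finger relative to the optical axis of element 416."""
    return math.atan2(col - principal_col, focal_len_pixels)
```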
  • Fig. 5 is a simplified partially pictorial, partially diagrammatic illustration of a portion of a projection device constructed and operative in accordance with yet another preferred embodiment of the present invention and to Fig. 6, which is a sectional illustration of the projection device of Fig. 5, taken along section lines VI - VI in Fig. 5.
  • a projection device which optionally has the functionality of a virtual data entry device and is generally designated by reference numeral 500.
  • the projector preferably comprises a housing 502, in which there is provided a light source 504, such as an LED. Downstream of the light source 504 there is preferably provided collimating optics, such as a condensing lens 506. Downstream of condensing lens 506 there is preferably provided a spatial light modulator 508, such as a matrix LCD. Downstream of spatial light modulator 508 there is preferably provided projection optics, such as a projection lens 510.
  • the projection lens 510 projects light from the light source 504, via a projection window 512 formed in housing 502 onto a projection surface 514.
  • impingement-sensing functionality may be combined with the projector described hereinabove.
  • an impingement sensor 520 is mounted on or located within housing 502.
  • Impingement sensor 520 is preferably of the type described hereinabove with reference to any of Figs. 1 - 4.
  • impingement sensor 520 includes an illuminator 522, generating a generally planar beam of light which lies in spaced, generally parallel relationship to projection surface 514, and a sensor assembly 524, which is operative to sense the distance of at least one location of impingement of the planar beam of light by at least one object, such as a finger or a stylus.
  • the angular and positional relationships of the various components of the projector are as follows: generally light is being projected in a direction generally indicated by a line 530 in Fig. 6, at an acute angle alpha ( ⁇ ) with respect to the projection surface 514.
  • Condensing lens 506 is aligned generally along an optical axis of light source 504.
  • the spatial light modulator 508 and the projecting lens 510 are arranged with respect to the plane of the projection surface 514 and with respect to the optical axis of the light source and condensing lens in accordance with the Scheimpflug principle, i.e. such that the plane of the spatial light modulator 508, the plane of the projection lens 510 and the plane of the projection surface 514 intersect along a common line, whereby the obliquely projected image is focused to a generally uniform extent over projection surface 514.
  • the spatial light modulator 508 is preferably offset in a direction indicated by an arrow 532 in order to optimize uniformity of projection intensity.
  • the spatial light modulator 508 is configured to compensate for keystone distortions which result from projection therethrough onto projection surface 514.
  • the size and shape of each pixel of the spatial light modulator 508 is distorted appropriately to provide a projected image 540 on projection surface 514, in which each pixel is of the same size and shape.
  • each pixel in the projected image is generally rectangular as shown, while each pixel in the spatial light modulator has generally parallel top and bottom edges with non-mutually parallel side edges as shown in the enlarged portion of Fig. 5.
  • the pixels in the spatial light modulator are of differing sizes and the pixels towards the bottom of the spatial light modulator, which are projected a longer distance and at a more oblique angle are smaller than those at the top of the spatial light modulator, which are projected a shorter distance and at a less oblique angle.
  • the spatial light modulator 508 is offset with respect to the condensing lens 506, such that more light per unit area is directed through the smaller pixels of the spatial light modulator 508 so as to enhance uniformity of light intensity over the pixels in the projected image 540.
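  • To make the single-axis keystone compensation concrete, the toy model below treats projection lens 510 as a pinhole: each modulator row is magnified roughly in proportion to the throw distance from the lens to the strip of surface 514 it lands on, so the compensating pattern on modulator 508 narrows the rows that are projected farther, which yields exactly the trapezoidal pixel shapes described above. The linear throw interpolation, the l_near and l_far parameters and the quadratic illumination weighting are assumptions of this sketch, not design data from the patent.

```python
def row_width_scale(row_frac, l_near, l_far):
    """Single-axis keystone pre-distortion sketch for Figs. 5 and 6. A
    modulator row at fractional position row_frac (0.0 = row projected nearest
    lens 510, 1.0 = farthest) is magnified roughly in proportion to its throw
    distance, approximated here as varying linearly from l_near to l_far.
    Drawing that row narrower by the returned factor makes all rows of
    projected image 540 come out the same width."""
    l_row = l_near + row_frac * (l_far - l_near)
    return l_near / l_row  # < 1 for rows that are magnified more

def relative_irradiance(row_frac, l_near, l_far):
    """Rough first-order guess at the non-uniform illumination implied by the
    condensing-lens offset (arrow 532): if both dimensions of a modulator
    pixel shrink by about the same factor, the smaller far-row pixels need
    roughly 1/scale**2 more light per unit modulator area for the projected
    pixels of image 540 to appear equally bright."""
    s = row_width_scale(row_frac, l_near, l_far)
    return 1.0 / (s * s)
```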
  • Fig. 7 is a simplified partially pictorial, partially diagrammatic illustration of a portion of a projection device constructed and operative in accordance with a further preferred embodiment of the present invention, to Fig. 8, which is a sectional illustration of the projection device of Fig. 7, taken along lines VIII - VIII in Fig. 7 and to Fig. 9, which is a sectional illustration of the projection device of Fig. 7, taken along section lines IX- IX in Fig. 7.
  • a projection device which optionally has the functionality of a virtual data entry device and is generally designated by reference numeral 700.
  • the projector preferably comprises a housing 702, in which there is provided a light source 704, such as an LED. Downstream of the light source 704 there is preferably provided collimating optics such as a condensing lens 706.
  • Downstream of condensing lens 706 there is preferably provided a spatial light modulator 708, such as a matrix LCD. Downstream of spatial light modulator 708 there is preferably provided projection optics, such as a projection lens 710. Preferably, the projection lens 710 projects light from the light source 704, via a projection window 712 formed in housing 702, onto a projection surface 714.
  • impingement-sensing functionality may be combined with the projector described hereinabove.
  • an impingement sensor 720 is mounted on or located within housing 702.
  • Impingement sensor 720 is preferably of the type described hereinabove with reference to any of Figs. 1 - 4.
  • impingement sensor 720 includes an illuminator 722, generating a generally planar beam of light which lies in spaced, generally parallel relationship to projection surface 714, and a sensor assembly 724, which is operative to sense the distance of at least one location of impingement of the planar beam of light by at least one object, such as a finger or a stylus.
  • the embodiment of Figs. 7 - 9 is distinguished from that of Figs. 5 and 6 in that whereas the embodiment of Figs. 5 and 6 provides oblique projection along one axis of the projection surface 514, the embodiment of Figs. 7 - 9 provides oblique projection along two axes of the projection surface 714.
  • the angular and positional relationships of the various components of the projector preferably are as follows: generally light is being projected in a direction generally indicated by a line 730 in Figs. 8 and 9, at respective acute angles alpha ( ⁇ ) and beta ( ⁇ ) with respect to the projection surface 714.
  • Condensing lens 706 is aligned generally along an optical axis of light source 704.
  • the spatial light modulator 708 and the projecting lens 710 are arranged with respect to the plane of the projection surface 714 and with respect to the optical axis of the light source and condensing lens in accordance with the Scheimpflug principle, i.e. such that the plane of the spatial light modulator 708, the plane of the projection lens 710 and the plane of the projection surface 714 intersect along a common line, whereby the obliquely projected image is focused to a generally uniform extent over projection surface 714.
  • the spatial light modulator 708 is preferably offset in directions indicated by an arrow 732 (Fig. 8) and by an arrow 734 (Fig. 9) in order to optimize uniformity of projection intensity.
  • the spatial light modulator 708 is configured to compensate for geometrical distortions which result from projection therethrough onto projection surface 714. These distortions include keystone distortions along two mutually perpendicular axes as well as possible other optical distortions.
  • the size and shape of each pixel of the spatial light modulator 708 is distorted appropriately to provide a projected image 740 on projection surface 714, in which each pixel is of the same size and shape.
  • each pixel in the projected image is generally rectangular as shown, while each pixel in the spatial light modulator has non-mutually parallel side edges.
  • the pixels in the spatial light modulator are of differing sizes and the pixels towards the bottom right corner of the spatial light modulator (as seen in Fig. 7), which are projected a longer distance and at a more oblique angle are smaller than those at the top left corner of the spatial light modulator, which are projected a shorter distance and at a less oblique angle.
  • the spatial light modulator 708 is offset with respect to the condensing lens 706, such that more light per unit area is directed through the smaller pixels of the spatial light modulator 708 so as to enhance uniformity of light intensity over the pixels in the projected image 740.
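  • For oblique projection along two axes, the modulator-to-surface mapping is, in a pinhole model, a planar homography, and the required pre-distortion is the inverse of that homography applied to the desired pattern. The sketch below estimates the homography from four corner correspondences, a one-time calibration step that the patent does not describe; the function names, the use of NumPy's SVD and the calibration procedure are all assumptions of this illustration.

```python
import numpy as np

def homography_from_corners(src_pts, dst_pts):
    """Solve for the 3x3 projective transform H with dst ~ H @ src from four
    point correspondences (standard direct linear transform). Here src_pts
    would be the corners of spatial light modulator 708 and dst_pts the
    measured corners of its projection on surface 714."""
    A = []
    for (x, y), (u, v) in zip(src_pts, dst_pts):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    return vt[-1].reshape(3, 3)  # null-space vector reshaped to 3x3

def predistort(points_on_surface, H):
    """Map desired pattern coordinates on projection surface 714 back through
    H^-1 to find where they must be drawn on modulator 708, so that oblique
    projection along two axes reproduces undistorted pattern 740."""
    Hinv = np.linalg.inv(H)
    pts = np.asarray(points_on_surface, dtype=float)
    homog = np.c_[pts, np.ones(len(pts))] @ Hinv.T
    return homog[:, :2] / homog[:, 2:3]
```

  • Everything between the four measured corners then follows from the single 3x3 matrix, which also covers the single-axis trapezoid of Figs. 5 and 6 as a special case.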
  • Fig. 10 is a simplified partially pictorial, partially diagrammatic illustration of a portion of a projection device constructed and operative in accordance with yet a further preferred embodiment of the present invention and to Fig. 11, which is a sectional illustration of the projection device of Fig. 10, taken along section lines XI - XI in Fig. 10.
  • a projection device which optionally has the functionality of a virtual data entry device and is generally designated by reference numeral 900.
  • the projector preferably comprises a housing 902, in which there is provided a light source 904, such as an LED. Downstream of the light source 904 there is preferably provided collimating optics, such as a condensing lens 906. Downstream of condensing lens 906 there is preferably provided a spatial light modulator 908, such as a segmented LCD. Downstream of spatial light modulator 908 there is preferably provided projection optics, such as a projection lens 910.
  • the projection lens 910 projects light from the light source 904, via a projection window 912 formed in housing 902 onto a projection surface 914.
  • impingement-sensing functionality may be combined with the projector described hereinabove.
  • an impingement sensor 920 is mounted on or located within housing 902.
  • Impingement sensor 920 is preferably of the type described hereinabove with reference to any of Figs. 1 - 4.
  • impingement sensor 920 includes an illuminator 922, generating a generally planar beam of light which lies in spaced, generally parallel relationship to projection surface 914, and a sensor assembly 924, which is operative to sense the distance of at least one location of impingement of the planar beam of light by at least one object, such as a finger or a stylus.
  • the angular and positional relationships of the various components of the projector are as follows: generally light is being projected in a direction generally indicated by a line 930 in Fig. 11, at an acute angle alpha ( ⁇ ) with respect to the projection surface 914.
  • Condensing lens 906 is aligned generally along an optical axis of light source 904.
  • the spatial light modulator 908 and the projecting lens 910 are arranged with respect to the plane of the projection surface 914 and with respect to the optical axis of the light source and condensing lens in accordance with the Scheimpflug principle, i.e. such that the plane of the spatial light modulator 908, the plane of the projection lens 910 and the plane of the projection surface 914 intersect along a common line, whereby the obliquely projected image is focused to a generally uniform extent over projection surface 914.
  • the spatial light modulator 908 is preferably offset in a direction indicated by an arrow 932 in order to optimize uniformity of projection intensity. It is a particular feature of the present invention that the spatial light modulator 908 is configured to compensate for keystone distortions which result from projection therethrough onto projection surface 914. In accordance with a preferred embodiment of the present invention, the size and shape of each segment of the spatial light modulator 908 is distorted appropriately to provide a projected image 940 on projection surface 914, in which each pixel is of the same size and shape.
  • the segments in the spatial light modulator are of differing sizes and the segments towards the bottom of the spatial light modulator, which are projected a longer distance and at a more oblique angle, are smaller than those at the top of the spatial light modulator, which are projected a shorter distance and at a less oblique angle.
  • the spatial light modulator 908 is offset with respect to the condensing lens 906, such that more light per unit area is directed through the smaller segments of the spatial light modulator 908 so as to enhance uniformity of light intensity over the segments in the projected image 940.
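  • For the segmented LCD the same compensation applies segment by segment: each segment outline is drawn on modulator 908 with its vertices pulled horizontally toward the optical centre by the row-dependent scale, so the projected key outlines in image 940 keep their intended shapes. The sketch below reuses the linear-throw approximation introduced for Figs. 5 and 6 and is an assumption, not the patent's design data.

```python
def predistort_segment(vertices, l_near, l_far, center_x=0.5):
    """Pre-distort one segment outline of a segmented LCD (Figs. 10 and 11
    sketch). vertices is a list of (x, y) pairs in normalized modulator
    coordinates, with y running from 0.0 (row projected nearest lens 910) to
    1.0 (farthest). Each vertex is pulled toward center_x by the same
    linear-throw scale used in the matrix-LCD sketch, so the obliquely
    projected segment keeps its intended shape."""
    out = []
    for x, y in vertices:
        l_row = l_near + y * (l_far - l_near)  # approximate throw for this row
        s = l_near / l_row                     # lateral shrink factor
        out.append((center_x + (x - center_x) * s, y))
    return out
```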

Abstract

A virtual data entry device (100) includes an illuminator (102) generating a planar beam of light (104) and an impingement sensor assembly (110) operative to sense a location of impingement of the planar beam of light by an object. The impingement sensor assembly includes an optical element (116) arranged to receive light from the planar beam reflected by the object. The optical element has optical power in a first direction (120) such that it focuses light at a focus line location (124) which is a function of the location of the object relative to the optical element. A multi-element detector (130) is arranged to receive light passing through the optical element, and the distribution of the light detected by the multi-element detector among its multiple elements indicates the location of the object.

Description

INPUT DEVICE
REFERENCE TO RELATED APPLICATIONS
Reference is made to U.S. Provisional Patent Application No. 60/655,409, entitled DISTANCE MEASUREMENT TECHNIQUE FOR VIRTUAL INTERFACES, filed February 24, 2005, and to U.S. Provisional Patent Application No. 60/709,042, entitled APPARATUS FOR LOCATING AN INTERACTION IN TWO DIMENSIONS, filed August 18, 2005, the disclosures of which are hereby incorporated by reference and priority of which is hereby claimed pursuant to 37 CFR 1.78(a)(4) and (5)(i).
FIELD OF THE INVENTION
The present invention relates to data entry devices generally.
BACKGROUND OF THE INVENTION
The following patent publications are believed to represent the current state of the art:
U.S. Patent Numbers: 6,351,260; 6,761,457; 4,782,328; 6,690,363 and 6,281,878;
U.S. Patent Application Publication Numbers: 2005/271319 and 2005/190162; and
PCT Patent Publication Numbers: WO04/023208; WO05/043231; WO04/003656 and WO02/054169.
SUMMARY OF THE INVENTION
The present invention seeks to provide an improved data entry device.
There is thus provided in accordance with a preferred embodiment of the present invention a virtual data entry device including an illuminator generating a generally planar beam of light and an impingement sensor assembly operative to sense at least one location of impingement of the planar beam of light by at least one object, the impingement sensor assembly including at least one optical element arranged to receive light from the planar beam reflected by the at least one object, the at least one optical element having optical power in a first direction such that it focuses light at at least one focus line location, wherein the at least one focus line location is a function of the location of the at least one object relative to the at least one optical element, and a multi-element detector arranged to receive light passing through the at least one optical element, wherein the distribution of the light detected by the multi-element detector among multiple elements thereof indicates the location of the at least one object.
In accordance with a preferred embodiment of the present invention the at least one optical element has optical power in a second direction which is at least one of zero or substantially different from the optical power in the first direction. Preferably, the at least one optical element is at least one of a cylindrical lens, a conical lens and a toroidal lens.
In accordance with another preferred embodiment of the present invention the multi-element detector has multiple elements arranged side-by-side along a detector line which intersects the at least one focus line, wherein the location of intersection of the at least one focus line and the detector line indicates distance of the at least one object from the at least one optical element. Additionally or alternatively, the multi-element detector has multiple elements arranged side-by-side along a detector line which intersects the at least one focus line, wherein the location of intersection of the at least one focus line and the detector line indicates azimuthal location of the at least one object from the at least one optical element. As a further alternative, the multi-element detector has multiple elements arranged side-by-side in two dimensions in a plane which intersects the at least one focus line, whereby the location of intersection of the at least one focus line on the multi-element detector indicates the location of the at least one object in two-dimensions.
In accordance with still another preferred embodiment of the present invention the at least one multi-element detector comprises a first multi-element detector and a second multi-element detector and the at least one optical element comprises a first optical element and a second optical element, the first multi-element detector having multiple elements arranged side-by-side along a first detector line which intersects the at least one focus line, wherein the location of intersection of the at least one focus line and the first detector line indicates azimuthal location of the at least one object from the first optical element and the second multi-element detector having multiple elements arranged side-by-side along a second detector line which intersects the at least one focus line, wherein the location of intersection of the at least one focus line and the second detector line indicates distance of the at least one object from the second optical element. In accordance with yet another preferred embodiment of the present invention the multi-element detector includes a generally one-dimensional slit lying in the second plane along a detector line which intersects the at least one focus line, wherein the location of intersection of the at least one focus line and the detector line indicates distance of the at least one object from the at least one optical element. Preferably, the generally planar beam of light is spaced from and generally parallel to a generally planar display surface. More preferably, the generally planar display surface includes an LCD screen.
In accordance with still another preferred embodiment of the present invention the virtual data entry device also includes processing circuitry operative to receive at least one output signal of the multi-element detector and to determine the location of the at least one object.
There is further provided in accordance with a further preferred embodiment of the present invention a virtual data entry device including an illuminator generating a generally planar beam of light generally parallel to a generally flat surface which is at least partially light reflecting and an impingement sensor assembly operative to sense at least one location of impingement of the planar beam of light by at least one object, the impingement sensor assembly including at least one optical element arranged to receive light from the planar beam reflected by the at least one object directly and indirectly via the surface and a two-dimensional multi-element detector arranged to receive light passing through the at least one optical element, wherein the spatial separation on the detector of the light detected by the multi-element detector which was reflected directly from the at least one object and the light which was reflected via the light reflecting surface thereof indicates distance of the at least one object from the at least one optical element and location on the detector of the light reflected directly and indirectly from the at least one object indicates the azimuthal location of the at least one object relative to the at least one optical element. In accordance with a preferred embodiment of the present invention the generally flat light-reflecting surface includes an LCD screen. Preferably, the virtual data entry device also includes processing circuitry operative to receive at least one output signal of the multi-element detector and to determine at least one of the distance of the at least one object from the at least one optical element and the azimuthal location of the at least one object relative to the at least one optical element. Additionally or alternatively, the processing circuitry is operative to output the location of the at least one object in Cartesian coordinates.
There is additionally provided in accordance with an additional preferred embodiment of the present invention a pattern projector including a source of light to be projected, a spatial light modulator arranged in a spatial light modulator plane, the spatial light modulator receiving the light from the source of light and being configured to pass the light therethrough in a first pattern and projection optics receiving the light from the spatial light modulator and being operative to project a desired second pattern onto a projection surface lying in a projection surface plane which is angled with respect to the spatial light modulator plane, the first pattern being a distortion of the desired second pattern configured such that keystone distortions resulting from the difference in angular orientations of the spatial light modulator plane and the projection surface plane are compensated.
In accordance with a preferred embodiment of the present invention the pattern projector also includes a collimator interposed between the source of light and the spatial light modulator, the collimator being operative to distribute the light from the source of light across the spatial light modulator in a non-uniform distribution such that light distribution in the desired second pattern has uniformity greater than the non-uniform distribution. Preferably, the spatial light modulator includes an LCD. More preferably, the LCD is a matrix LCD. Alternatively, the LCD is a segmented LCD.
In accordance with another preferred embodiment of the present invention the spatial light modulator plane is angled with respect to the projection optics such that the desired second pattern is focused to a generally uniform extent. Preferably, the spatial light modulator includes a plurality of pixels, different ones of the plurality of pixels of the spatial light modulator having different sizes. Alternatively, different ones of the plurality of pixels of the spatial light modulator have different shapes. In accordance with yet another preferred embodiment of the present invention the spatial light modulator plane is angled with respect to the projection optics such that the desired second pattern is focused to a generally uniform extent. Preferably, the spatial light modulator includes a plurality of segments, different ones of the plurality of segments of the spatial light modulator having different sizes. Alternatively, different ones of the plurality of segments of the spatial light modulator have different shapes.
There is also provided in accordance with yet another preferred embodiment of the present invention a method for data entry including utilizing an illuminator to generate a generally planar beam of light and sensing at least one location of impingement of the planar beam of light by at least one object, using an impingement sensor assembly including at least one optical element arranged to receive light from the planar beam reflected by the at least one object, the at least one optical element having optical power in a first direction such that it focuses light at at least one focus line location, wherein the at least one focus line location is a function of the location of the at least one object relative to the at least one optical element, and a multi-element detector arranged to receive light passing through the at least one optical element, wherein the distribution of the light detected by the multi-element detector among multiple elements thereof indicates the location of the at least one object. In accordance with another preferred embodiment of the present invention the method for data entry also includes determining the distance of the at least one object from the at least one optical element by determining a location of intersection between the at least one focus line and a detector line along which are arranged, in a side-by-side orientation, multiple elements of the multi-element detector. Additionally or alternatively, the method for data entry also includes determining the azimuthal location of the at least one object with respect to the at least one optical element by determining a location of intersection between the at least one focus line and a detector line along which are arranged, in a side-by-side orientation, multiple elements of the multi-element detector. As a further alternative, the method for data entry also includes determining the location of the at least one object in two-dimensions by determining a location of intersection between the at least one focus line and a detector plane along which are arranged, in a side-by-side two-dimensional orientation, multiple elements of the multi-element detector.
In accordance with still another preferred embodiment of the present invention the at least one multi-element detector comprises a first multi-element detector and a second multi-element detector and the at least one optical element comprises a first optical element and a second optical element, the method also including determining the azimuthal location of the at least one object with respect to the first optical element by determining a location of intersection between the at least one focus line and a first detector line along which are arranged, in a side-by-side orientation, multiple elements of the first multi-element detector and determining the distance of the at least one object from the second optical element by determining a location of intersection between the at least one focus line and a second detector line along which are arranged, in a side-by-side orientation, multiple elements of the second multi-element detector.
In accordance with another preferred embodiment of the present invention the method of data entry also includes providing processing circuitry for receiving at least one output signal of the multi-element detector and determining the location of the at least one object.
There is yet further provided in accordance with yet another preferred embodiment of the present invention a method for data entry including utilizing an illuminator for generating a generally planar beam of light generally parallel to a generally flat surface which is at least partially light reflecting and sensing at least one location of impingement of the planar beam of light by at least one object, using an impingement sensor assembly including at least one optical element arranged to receive light from the planar beam reflected by the at least one object directly and indirectly via the surface and a two-dimensional multi-element detector arranged to receive light passing through the at least one optical element, wherein the spatial separation on the detector of the light detected by the multi-element detector which was reflected directly from the at least one object and the light which was reflected via the light reflecting surface thereof indicates distance of the at least one object from the at least one optical element and location on the detector of the light reflected directly and indirectly from the at least one object indicates the azimuthal location of the at least one object relative to the at least one optical element.
In accordance with a preferred embodiment of the present invention the method for data entry also includes providing processing circuitry for receiving at least one output signal of the multi-element detector and for determining at least one of the distance of the at least one object from the at least one optical element and the azimuthal location of the at least one object relative to the at least one optical element. Additionally or alternatively, the providing processing circuitry also includes providing processing circuitry for outputting the location of the at least one object in Cartesian coordinates.
There is additionally provided in accordance with another preferred embodiment of the present invention a method for projecting a pattern including providing a source of light to be projected, arranging a spatial light modulator, in a spatial light modulator plane, to receive the light from the source of light, the spatial light modulator being configured to pass the light therethrough in a first pattern and using projection optics for receiving the light from the spatial light modulator and for projecting a desired second pattern onto a projection surface lying in a projection surface plane which is angled with respect to the spatial light modulator plane, the first pattern being a distortion of the desired second pattern configured such that keystone distortions resulting from the difference in angular orientations of the spatial light modulator plane and the projection surface plane are compensated. In accordance with a preferred embodiment of the present invention the method for projecting a pattern also includes providing a collimator interposed between the source of light and the spatial light modulator, and utilizing the collimator to distribute the light from the source of light across the spatial light modulator in a non-uniform distribution such that light distribution in the desired second pattern has uniformity greater than the non-uniform distribution. Preferably, the arranging the spatial light modulator includes angling the spatial light modulator plane with respect to the projection optics such that the desired second pattern is focused to a generally uniform extent.
In accordance with another preferred embodiment of the present invention the arranging the spatial light modulator includes providing a spatial light modulator including a plurality of pixels, different ones of the plurality of pixels having different sizes. Alternatively, different ones of the plurality of pixels have different shapes.
In accordance with yet another preferred embodiment of the present invention the arranging the spatial light modulator includes providing a spatial light modulator including a plurality of segments, different ones of the plurality of segments having different sizes. Alternatively, different ones of the plurality of segments have different shapes.
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention will be more fully understood and appreciated from the following detailed description, taken in conjunction with the drawings in which:
Fig. 1 is a simplified partially pictorial, partially diagrammatic illustration of a portion of a data entry device constructed and operative in accordance with a preferred embodiment of the present invention;
Fig. 2 is a simplified partially pictorial, partially diagrammatic illustration of a portion of a data entry device constructed and operative in accordance with another preferred embodiment of the present invention;
Fig. 3 is a simplified partially pictorial, partially diagrammatic illustration of a portion of a data entry device constructed and operative in accordance with yet another preferred embodiment of the present invention;

Fig. 4 is a simplified partially pictorial, partially diagrammatic illustration of a portion of a data entry device constructed and operative in accordance with still another preferred embodiment of the present invention;
Fig. 5 is a simplified partially pictorial, partially diagrammatic illustration of a portion of a projection device constructed and operative in accordance with yet another preferred embodiment of the present invention;
Fig. 6 is a sectional illustration of the projection device of Fig. 5, taken along lines VI - VI in Fig. 5;
Fig. 7 is a simplified partially pictorial, partially diagrammatic illustration of a portion of a projection device constructed and operative in accordance with a further preferred embodiment of the present invention;
Fig. 8 is a sectional illustration of the projection device of Fig. 7, taken along lines VIII - VIII in Fig. 7;
Fig. 9 is a sectional illustration of the projection device of Fig. 7, taken along lines IX - IX in Fig. 7;

Fig. 10 is a simplified partially pictorial, partially diagrammatic illustration of a portion of a projection device constructed and operative in accordance with yet a further preferred embodiment of the present invention; and

Fig. 11 is a sectional illustration of the projection device of Fig. 10, taken along lines XI - XI in Fig. 10.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
Reference is now made to Fig. 1, which is a simplified partially pictorial, partially diagrammatic illustration of a portion of a data entry device constructed and operative in accordance with a preferred embodiment of the present invention.
As seen in Fig. 1, there is provided a virtual data entry device, generally designated by reference numeral 100 and including an illuminator 102, generating a generally planar beam of light, generally designated by reference numeral 104. In the illustrated preferred embodiment of the invention, the generally planar beam of light 104 lies in spaced, generally parallel relationship to a generally planar surface 106 of a display 108, such as an LCD display. An impingement sensor assembly 110 is operative to sense the distance of at least one location of impingement of the planar beam of light 104 by at least one object, such as a finger or a stylus. In the illustrated embodiment of Fig. 1, the planar beam of light 104 impinges on two fingers 112 and 114 which typically are touching display surface 106. The impingement sensor assembly 110 preferably includes at least one optical element, such as a cylindrical lens 116, having an optical axis 118 and arranged to receive light from planar beam 104 scattered or otherwise reflected by the fingers 112 and 114. Alternatively, a conical lens or a toroidal lens may be used.
Preferably, the optical element has optical power in a first direction 120, here perpendicular to display surface 106, thereby to focus light from finger 112 at a given focus line location 124, and to focus light from finger 114 at a given focus line location 128. The focus line locations 124 and 128 are each a function of the distance of respective fingers 112 and 114, which scatter light from planar beam 104, from the optical element 116.
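The following Python sketch is purely illustrative of this relationship: it assumes that cylindrical lens 116 behaves as an ideal thin lens of focal length f in its powered direction, so that the focus line for a finger at distance u forms at the conjugate image distance given by the thin-lens equation; the focal length and finger distances used below are invented for the example and are not taken from the specification.

def focus_line_distance(u_mm, f_mm=10.0):
    """Image-side location of the focus line for a finger at distance u_mm.

    f_mm is a hypothetical focal length chosen only for illustration.
    Thin-lens equation: 1/f = 1/u + 1/v, so v = 1 / (1/f - 1/u).
    """
    if u_mm <= f_mm:
        raise ValueError("a real image requires the finger to lie beyond the focal length")
    return 1.0 / (1.0 / f_mm - 1.0 / u_mm)


# A nearer finger focuses farther behind the lens than a distant one, which is
# why the focus line walks along the inclined detector line 136.
for u in (100.0, 200.0, 400.0):  # hypothetical finger distances in millimetres
    print(u, round(focus_line_distance(u), 2))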
A multi-element detector, preferably a one-dimensional detector 130, such as a linear CMOS or CCD array commercially available from Panavision LTD. of
6219 De Soto Avenue, Woodland Hills, California, typically including 2048 pixels arranged along a straight line wherein each pixel is generally square, is preferably inclined with respect to the optical element 116 and is arranged to receive light passing through the at least one optical element 116, wherein the distribution of light, detected by the multi-element detector 130, indicates the distance of each of fingers 112 and 114 from optical element 116. It is a particular feature of the present invention that the multi-element detector 130 has multiple elements 134 arranged side-by-side along a detector line 136 which intersects the focus lines 124 and 128, wherein the location of intersection of each of focus lines 124 and 128 with the detector line 136 indicates the respective distances of fingers 112 and 114 from the optical element 116. Detector 130 provides an output signal 138 which indicates the position therealong of the intersections of focus lines 124 and 128 therewith. Output 138 is used by computerized processing circuitry 140, preferably forming part of the data entry device 100, to calculate the distances of fingers 112 and 114 from optical element 116. Circuitry 140 preferably operates by locating the peaks 142 and 144 of the output signal 138, which correspond to detector element locations along detector 130. The detector element locations of the peaks 142 and 144 are directly mapped onto distances of the fingers 112 and 114 from optical element 116.
It is appreciated that non-focused light scattered by each of the fingers 112 and 114 is also received by detector 130 and creates a background signal. Conventional detection techniques which isolate signal peaks from background are preferably employed for eliminating inaccuracies which could otherwise result from the background signals.
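One possible way for processing circuitry 140 to carry out the peak location and background rejection described above is sketched below in Python; the per-element calibration table mapping detector elements to finger distances is hypothetical, since the specification states only that peak locations are directly mapped onto distances, without prescribing how that mapping is obtained.

import numpy as np


def finger_distances(signal, calibration, threshold_factor=3.0):
    """Estimate finger distances from a one-dimensional detector read-out.

    signal       -- raw intensity per detector element (e.g. 2048 values)
    calibration  -- hypothetical per-element look-up table giving the finger
                    distance corresponding to a peak at that element
    """
    signal = np.asarray(signal, dtype=float)
    background = np.median(signal)                  # crude background estimate
    noise = np.median(np.abs(signal - background))  # robust noise scale
    cleaned = signal - background

    distances = []
    for i in range(1, len(cleaned) - 1):
        is_peak = cleaned[i] >= cleaned[i - 1] and cleaned[i] > cleaned[i + 1]
        if is_peak and cleaned[i] > threshold_factor * noise:
            distances.append(float(calibration[i]))  # map element index to distance
    return distances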
Reference is now made to Fig. 2, which is a simplified partially pictorial, partially diagrammatic illustration of a portion of a data entry device constructed and operative in accordance with another preferred embodiment of the present invention.
As seen in Fig. 2, there is provided a virtual data entry device, generally designated by reference numeral 200, and including an illuminator 202, generating a generally planar beam of light, generally designated by reference numeral 204. In the illustrated preferred embodiment of the invention, the generally planar beam of light 204 lies in spaced, generally parallel relationship to a generally planar surface 206 of a display 208, such as an LCD display. An impingement sensor assembly 210 is operative to sense in two dimensions at least one location of impingement of the planar beam of light 204 by at least one object, such as a finger or a stylus. In the illustrated embodiment of Fig. 2, the planar beam of light 204 impinges on two fingers 212 and 214 which typically are touching display surface 206. The impingement sensor assembly 210 preferably includes at least one optical element, such as a cylindrical lens 216, having an optical axis 218 and arranged to receive light from planar beam 204 scattered or otherwise reflected by the fingers 212 and 214. Alternatively, a conical lens or a toroidal lens may be used. Preferably, the optical element has optical power in a first direction 220, here parallel to display surface 206, thereby to focus light from finger 212 at a given focus line location 224, and to focus light from finger 214 at a given focus line location 228. The focus line locations 224 and 228 are each a function of the location of respective fingers 212 and 214, which scatter light from planar beam 204, relative to the optical element 216.
A multi-element detector, preferably a two-dimensional detector 230, such as a two-dimensional CMOS or CCD array commercially available from OmniVision Technologies Inc. of 1341 Orleans Drive, Sunnyvale, California, typically including 640x480 pixels, is arranged to receive light passing through the at least one optical element 216, wherein the distribution of light detected by the multi-element detector 230 indicates the location of each of fingers 212 and 214 relative to optical element 216.
It is a particular feature of the present invention that the multi-element detector 230 has multiple elements 234 arranged to lie in a detector plane 236 which is inclined with respect to focus lines 224 and 228 so as to intersect the focus lines 224 and 228, wherein the location of intersection of each of focus lines 224 and 228 with the detector plane 236 indicates the respective locations of fingers 212 and 214 relative to optical element 216. Specifically, as seen in Fig. 2, the Y location of each of signal peaks 238 and 240, which correspond to fingers 212 and 214 respectively, indicates the distance of the respective finger from optical element 216, while the X location of each of the signal peaks 238 and 240 indicates the azimuthal location of the respective finger relative to optical element 216. Detector 230 provides an output image signal 242 which indicates the positions of fingers 212 and 214. Output image signal 242 is used by computerized processing circuitry 244, preferably forming part of the data entry device 200, to calculate the positions of fingers 212 and 214 with respect to optical element 216.
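A minimal Python sketch of how processing circuitry 244 might read such a frame follows; the calibration functions row_to_distance and col_to_azimuth are hypothetical, since the specification states only that the Y and X peak positions indicate distance and azimuthal location respectively.

import numpy as np


def locate_fingers(frame, row_to_distance, col_to_azimuth, max_fingers=2, exclusion=20):
    """Read finger positions out of a two-dimensional detector frame.

    row_to_distance and col_to_azimuth are hypothetical calibration functions
    (for example interpolation tables built at assembly time).
    """
    work = np.asarray(frame, dtype=float).copy()
    work -= np.median(work)                       # remove the background level
    positions = []
    for _ in range(max_fingers):
        y, x = np.unravel_index(np.argmax(work), work.shape)
        if work[y, x] <= 0:
            break                                 # no further significant peak
        positions.append((row_to_distance(y), col_to_azimuth(x)))
        # Blank a window around the peak so the next finger can be found.
        work[max(0, y - exclusion):y + exclusion, max(0, x - exclusion):x + exclusion] = 0
    return positions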
Reference is now made to Fig. 3, which is a simplified partially pictorial, partially diagrammatic illustration of a portion of a data entry device constructed and operative in accordance with yet another preferred embodiment of the present invention.

As seen in Fig. 3, there is provided a virtual data entry device, generally designated by reference numeral 300 and including an illuminator 302, generating a generally planar beam of light, generally designated by reference numeral 304. In the illustrated preferred embodiment of the invention, the generally planar beam of light 304 lies in spaced, generally parallel relationship to a generally planar surface 306 of a display 308, such as an LCD display.
An impingement sensor assembly 310 is operative to sense at least one location of impingement of the planar beam of light 304 by at least one object, such as a finger or a stylus. In the illustrated embodiment of Fig. 3, the planar beam of light 304 impinges on two fingers 312 and 314 which typically are touching display surface 306. The impingement sensor assembly 310 preferably includes at least two optical elements, such as cylindrical lenses 315 and 316, having respective optical axes 317 and 318 and arranged to receive light from planar beam 304 scattered or otherwise reflected by the fingers 312 and 314. Alternatively, one or both of cylindrical lenses 315 and 316 may be replaced by a conical lens or a toroidal lens.
Preferably, the optical element 315 has optical power in a first direction 320, here perpendicular to display surface 306, thereby to focus light from finger 312 at a given focus line location 324, and to focus light from finger 314 at a given focus line location 328. The focus line locations 324 and 328 are each a function of the distance of respective fingers 312 and 314, which scatter light from planar beam 304, from the optical element 315.
Preferably, the optical element 316 has optical power in a second direction 330, here parallel to display surface 306, thereby to focus light from finger 312 at a given focus line location 334, and to focus light from finger 314 at a given focus line location 338. The focus line locations 334 and 338 are each a function of the azimuthal location of respective fingers 312 and 314, which scatter light from planar beam 304, relative to the optical element 316.
A multi-element detector, preferably a one-dimensional detector 340, such as a linear CMOS or CCD array commercially available from Panavision LTD. of 6219 De Soto Avenue, Woodland Hills, California, typically including 2048 pixels arranged along a straight line wherein each pixel is generally square, is preferably inclined with respect to optical element 315 and is arranged to receive light passing through the at least one optical element 315, wherein the distribution of light, detected by the multi-element detector 340, indicates the distance of each of fingers 312 and 314 from optical element 315.
It is a particular feature of the present invention that the multi-element detector 340 has multiple elements 344 arranged side-by-side along a detector line 346 which intersects the focus lines 324 and 328, wherein the location of intersection of each of focus lines 324 and 328 with the detector line 346 indicates the respective distances of fingers 312 and 314 from the optical element 315.
It is appreciated that non-focused light scattered by each of the fingers 312 and 314 is also received by detector 340 and creates a background signal. Conventional detection techniques which isolate signal peaks from background are preferably employed for eliminating inaccuracies which could otherwise result from the background signals.
Impingement sensor assembly 310 includes, in addition to multi-element detector 340, an additional multi-element detector, which is preferably a one-dimensional detector 350, such as a linear CMOS or CCD array commercially available from Panavision LTD. of 6219 De Soto Avenue, Woodland Hills, California, typically including 2048 pixels arranged along a straight line. Detector 350 is arranged to receive light passing through optical element 316, wherein the distribution of light, detected by the multi-element detector 350, indicates the azimuthal location of each of fingers 312 and 314 relative to optical element 316.
It is a particular feature of the present invention that the multi-element detector 350 has multiple elements 354 arranged side-by-side along a detector line 356 which intersects the focus lines 334 and 338, wherein the location of intersection of each of focus lines 334 and 338 with the detector line 356 indicates the respective azimuthal locations of fingers 312 and 314 relative to optical element 316.
Detector 340 provides an output signal 360 which indicates the position therealong of the intersections of focus lines 324 and 328 therewith. Output signal 360 is used by computerized processing circuitry 362, preferably forming part of the data entry device 300, to calculate the distances of fingers 312 and 314 from optical element 315. Circuitry 362 preferably operates by locating peaks 364 and 366 of the output signal 360, which correspond to detector element locations along detector 340. The detector element locations of the peaks 364 and 366 are directly mapped by processing circuitry 362 onto distances of the fingers 312 and 314 from optical element 315.
Detector 350 provides an output signal 370 which indicates the position therealong of the intersections of focus lines 334 and 338 therewith. Output signal 370 is used by computerized processing circuitry 362 to calculate the azimuthal locations of fingers 312 and 314 relative to optical element 316. Circuitry 362 preferably operates by locating peaks 374 and 376 of the output signal 370, which correspond to detector element locations along detector 350. The detector element locations of the peaks 374 and 376 are directly mapped by processing circuitry 362 onto azimuthal locations of the fingers 312 and 314 from optical element 316. Circuitry 362 preferably maps the distances and azimuthal locations of the fingers 312 and 314 onto Cartesian coordinate expressions of the two-dimensional positions of the fingers.
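A minimal sketch of this final mapping step follows, written in Python and treating optical elements 315 and 316 as effectively co-located, which is a simplifying assumption made here for the example rather than a statement of the specification; the axis convention is likewise invented.

import math


def polar_to_cartesian(distance_mm, azimuth_deg):
    """Map a (distance, azimuth) pair onto Cartesian coordinates on the display.

    Hypothetical convention: x runs parallel to the display edge through the
    sensor and y runs away from it, with azimuth measured from the y axis.
    """
    theta = math.radians(azimuth_deg)
    return distance_mm * math.sin(theta), distance_mm * math.cos(theta)


# Example: a finger 250 mm away, 15 degrees off the optical axis.
print(polar_to_cartesian(250.0, 15.0))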
Reference is now made to Fig. 4, which is a simplified partially pictorial, partially diagrammatic illustration of a portion of a data entry device constructed and operative in accordance with still another preferred embodiment of the present invention.

As seen in Fig. 4, there is provided a virtual data entry device, generally designated by reference numeral 400, and including an illuminator 402, generating a generally planar beam of light, generally designated by reference numeral 404. In the illustrated preferred embodiment of the invention, the generally planar beam of light 404 lies in spaced, generally parallel relationship to a generally planar, at least partially reflective surface 406, such as a surface of a display 408, such as an LCD display.
An impingement sensor assembly 410 is operative to sense in two dimensions at least one location of impingement of the planar beam of light 404 by at least one object, such as a finger or a stylus. In the illustrated embodiment of Fig. 4, the planar beam of light 404 impinges on two fingers 412 and 414 which typically are touching surface 406. The impingement sensor assembly 410 preferably includes at least one optical element, such as a lens 416, having an optical axis 418 and arranged to receive light from planar beam 404 scattered or otherwise reflected by the fingers. Lens 416 is preferably arranged to receive light from the planar beam 404 which is reflected by fingers 412 and 414 directly as well as indirectly via at least partially reflecting surface 406.
A multi-element detector, preferably a two-dimensional detector 430, such as a two-dimensional CMOS or CCD array commercially available from OmniVision Technologies Inc. of 1341 Orleans Drive, Sunnyvale, California, typically including 640x480 pixels, is arranged to receive light passing through the at least one optical element 416, wherein the distribution of light detected by the multi-element detector 430 indicates the location of each of fingers 412 and 414 relative to optical element 416.
It is a particular feature of the present invention that the multi-element detector 430, which has multiple elements 434 arranged to lie in an image plane of optical element 416, receives light via optical element 416 from the planar beam 404 which is reflected by fingers 412 and 414 directly as well as indirectly via at least partially reflecting surface 406. The respective spatial separations D1 and D2 on detector 430 of impingements 442 and 444 of light which is reflected directly from fingers 412 and 414 and impingements 452 and 454 of light which is reflected via at least partially reflecting surface 406 indicate distances of the fingers 412 and 414 from optical element 416. The respective X locations of impingements 442 and 444 on detector 430 indicate the azimuthal locations of fingers 412 and 414 relative to optical element 416.
Detector 430 provides an output image signal 462 which indicates the positions of fingers 412 and 414. Output image signal 462 is used by computerized processing circuitry 464, preferably forming part of the data entry device 400, to calculate the two-dimensional positions of fingers 412 and 414 in Cartesian coordinates.
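The geometric relationship between the separations D1 and D2 and the finger distances can be sketched as follows, under assumptions the specification does not spell out: a thin-lens camera viewing roughly parallel to surface 406, a planar beam a fixed height above that surface, and small angles, so that the separation on detector 430 falls off roughly as the inverse of the finger distance. The numbers used are invented for the example.

def distance_from_separation(separation_px, pixel_pitch_mm, focal_length_mm, beam_height_mm):
    """Rough finger distance from the direct/mirror-image separation on the detector.

    Assumptions (not stated in the specification): a thin-lens camera viewing
    roughly parallel to the reflecting surface, a planar beam a height
    beam_height_mm above that surface, and small angles.  The illuminated spot
    and its mirror image are then 2 * beam_height_mm apart, so the separation
    on the detector shrinks roughly as f * 2h / r.
    """
    separation_mm = separation_px * pixel_pitch_mm
    return 2.0 * focal_length_mm * beam_height_mm / separation_mm


# Hypothetical numbers: 6 mm focal length, beam 2 mm above the surface,
# a 40-pixel separation on a sensor with 0.006 mm pixels -> about 100 mm.
print(distance_from_separation(40, 0.006, 6.0, 2.0))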
Reference is now made to Fig. 5, which is a simplified partially pictorial, partially diagrammatic illustration of a portion of a projection device constructed and operative in accordance with yet another preferred embodiment of the present invention and to Fig. 6, which is a sectional illustration of the projection device of Fig. 5, taken along section lines VI - VI in Fig. 5.
As seen in Figs. 5 and 6, there is provided a projection device, which optionally has the functionality of a virtual data entry device and is generally designated by reference numeral 500. The projector preferably comprises a housing 502, in which there is provided a light source 504, such as an LED. Downstream of the light source 504 there is preferably provided collimating optics such as a condensing lens 506.

Downstream of condensing lens 506 there is preferably provided a spatial light modulator 508, such as a matrix LCD. Downstream of spatial light modulator 508 there is preferably provided projection optics, such as a projection lens 510. Preferably, the projection lens 510 projects light from the light source 504, via a projection window 512 formed in housing 502 onto a projection surface 514.
Optionally, impingement-sensing functionality may be combined with the projector described hereinabove. As seen in Figs. 5 and 6, an impingement sensor 520 is mounted on or located within housing 502. Impingement sensor 520 is preferably of the type described hereinabove with reference to any of Figs. 1 - 4. Preferably, impingement sensor 520 includes an illuminator 522, generating a generally planar beam of light which lies in spaced, generally parallel relationship to projection surface 514, and a sensor assembly 524, which is operative to sense the distance of at least one location of impingement of the planar beam of light by at least one object, such as a finger or a stylus.
Preferably, the angular and positional relationships of the various components of the projector are as follows: generally light is being projected in a direction generally indicated by a line 530 in Fig. 6, at an acute angle alpha (α) with respect to the projection surface 514. Condensing lens 506 is aligned generally along an optical axis of light source 504. The spatial light modulator 508 and the projecting lens 510 are arranged with respect to the plane of the projection surface 514 and with respect to the optical axis of the light source and condensing lens in accordance with the Scheimpflug principle, i.e. that the plane of the spatial light modulator is focused onto the projection surface, which is obliquely angled with respect to the optical axis of the light source and condensing lens. The spatial light modulator 508 is preferably offset in a direction indicated by an arrow 532 in order to optimize uniformity of projection intensity.
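The Scheimpflug condition referred to above can be stated as a purely geometric requirement: the spatial light modulator plane, the lens plane and the projection surface plane must all meet in a single line. The following Python sketch checks that condition numerically for planes given as point-and-normal pairs; the planes themselves would be supplied by the designer, as the specification gives no numerical layout.

import numpy as np


def satisfies_scheimpflug(plane_a, plane_b, plane_c, tol=1e-6):
    """Check numerically that three planes meet in a single line.

    Each plane is a (point, normal) pair.  The condition holds when the line
    in which the first two planes intersect also lies in the third plane.
    """
    (pa, na), (pb, nb), (pc, nc) = [
        (np.asarray(p, dtype=float), np.asarray(n, dtype=float) / np.linalg.norm(n))
        for p, n in (plane_a, plane_b, plane_c)
    ]
    direction = np.cross(na, nb)            # direction of the a/b intersection line
    if np.linalg.norm(direction) < tol:
        return False                        # planes a and b are parallel
    # One point on that line: solve na.x = na.pa, nb.x = nb.pb, direction.x = 0.
    point = np.linalg.solve(np.vstack([na, nb, direction]),
                            np.array([na @ pa, nb @ pb, 0.0]))
    # Plane c contains the line iff it contains both its direction and a point on it.
    return abs(nc @ direction) < tol and abs(nc @ (point - pc)) < tol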
It is a particular feature of the present invention that the spatial light modulator 508 is configured to compensate for keystone distortions which result from projection therethrough onto projection surface 514. In accordance with a preferred embodiment of the present invention, the size and shape of each pixel of the spatial light modulator 508 is distorted appropriately to provide a projected image 540 on projection surface 514, in which each pixel is of the same size and shape.
Preferably, each pixel in the projected image is generally rectangular as shown, while each pixel in the spatial light modulator has generally parallel top and bottom edges with non-mutually parallel side edges as shown in the enlarged portion of Fig. 5. Normally, the pixels in the spatial light modulator are of differing sizes and the pixels towards the bottom of the spatial light modulator, which are projected a longer distance and at a more oblique angle, are smaller than those at the top of the spatial light modulator, which are projected a shorter distance and at a less oblique angle.
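The pixel pre-distortion described above amounts to a fixed projective (homography) warp between the spatial light modulator plane and the projection surface. A minimal Python sketch follows, assuming the warp has been characterized by where the four corners of the spatial light modulator land on the projection surface; the corner coordinates and the desired rectangle below are invented for the example.

import numpy as np


def homography(src_pts, dst_pts):
    """Projective map taking four source points to four destination points."""
    rows = []
    for (x, y), (u, v) in zip(src_pts, dst_pts):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return vt[-1].reshape(3, 3)             # null-space vector, reshaped to 3x3


def map_points(h, pts):
    """Apply a homography to a list of 2-D points."""
    pts = np.hstack([np.asarray(pts, dtype=float), np.ones((len(pts), 1))])
    mapped = pts @ h.T
    return mapped[:, :2] / mapped[:, 2:3]


# Hypothetical calibration: where the four corners of the spatial light
# modulator land on the obliquely angled projection surface (units arbitrary).
slm_corners = [(0, 0), (10, 0), (10, 10), (0, 10)]
surface_corners = [(2, 0), (12, 0), (14, 22), (0, 22)]   # keystoned quadrilateral
H = homography(slm_corners, surface_corners)

# The pre-distorted outline that must be written to the modulator so that the
# projected pattern is the undistorted rectangle below.
desired_rectangle = [(3, 2), (11, 2), (11, 20), (3, 20)]
print(map_points(np.linalg.inv(H), desired_rectangle))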
Preferably, the spatial light modulator 508 is offset with respect to the condensing lens 506, such that more light per unit area is directed through the smaller pixels of the spatial light modulator 508 so as to enhance uniformity of light intensity over the pixels in the projected image 540.

Reference is now made to Fig. 7, which is a simplified partially pictorial, partially diagrammatic illustration of a portion of a projection device constructed and operative in accordance with a further preferred embodiment of the present invention, to Fig. 8, which is a sectional illustration of the projection device of Fig. 7, taken along lines VIII - VIII in Fig. 7 and to Fig. 9, which is a sectional illustration of the projection device of Fig. 7, taken along section lines IX - IX in Fig. 7.
As seen in Figs. 7 - 9, there is provided a projection device, which optionally has the functionality of a virtual data entry device and is generally designated by reference numeral 700. The projector preferably comprises a housing 702, in which there is provided a light source 704, such as an LED. Downstream of the light source 704 there is preferably provided collimating optics such as a condensing lens 706.
Downstream of condensing lens 706 there is preferably provided a spatial light modulator 708, such as a matrix LCD. Downstream of spatial light modulator 708 there is preferably provided projection optics, such as a projection lens 710. Preferably, the projection lens 710 projects light from the light source 704, via a projection window 712 formed in housing 702 onto a projection surface 714.
Optionally, impingement-sensing functionality may be combined with the projector described hereinabove. As seen in Figs. 7 - 9, an impingement sensor 720 is mounted on or located within housing 702. Impingement sensor 720 is preferably of the type described hereinabove with reference to any of Figs. 1 - 4. Preferably, impingement sensor 720 includes an illuminator 722, generating a generally planar beam of light which lies in spaced, generally parallel relationship to projection surface 714, and a sensor assembly 724, which is operative to sense the distance of at least one location of impingement of the planar beam of light by at least one object, such as a finger or a stylus.
The embodiment of Figs. 7 - 9 is distinguished from that of Figs. 5 and 6 in that whereas the embodiment of Figs. 5 and 6 provides oblique projection along one axis of the projection surface 514, the embodiment of Figs. 7 - 9 provides oblique projection along two axes of the projection surface 714.
Accordingly, the angular and positional relationships of the various components of the projector preferably are as follows: generally light is being projected in a direction generally indicated by a line 730 in Figs. 8 and 9, at respective acute angles alpha (α) and beta (β) with respect to the projection surface 714. Condensing lens 706 is aligned generally along an optical axis of light source 704. The spatial light modulator 708 and the projecting lens 710 are arranged with respect to the plane of the projection surface 714 and with respect to the optical axis of the light source and condensing lens in accordance with the Scheimpflug principle, i.e. that the plane of the spatial light modulator is focused onto the projection surface, which is obliquely angled with respect to the optical axis of the light source and condensing lens. The spatial light modulator 708 is preferably offset in directions indicated by an arrow 732 (Fig. 8) and by an arrow 734 (Fig. 9) in order to optimize uniformity of projection intensity.
It is a particular feature of the present invention that the spatial light modulator 708 is configured to compensate for geometrical distortions which result from projection therethrough onto projection surface 714. These distortions include keystone distortions along two mutually perpendicular axes as well as possible other optical distortions. In accordance with a preferred embodiment of the present invention, the size and shape of each pixel of the spatial light modulator 708 is distorted appropriately to provide a projected image 740 on projection surface 714, in which each pixel is of the same size and shape. Preferably, each pixel in the projected image is generally rectangular as shown, while each pixel in the spatial light modulator has non-mutually parallel side edges. Normally, the pixels in the spatial light modulator are of differing sizes and the pixels towards the bottom right corner of the spatial light modulator (as seen in Fig. 7), which are projected a longer distance and at a more oblique angle, are smaller than those at the top left corner of the spatial light modulator, which are projected a shorter distance and at a less oblique angle.
Preferably, the spatial light modulator 708 is offset with respect to the condensing lens 706, such that more light per unit area is directed through the smaller pixels of the spatial light modulator 708 so as to enhance uniformity of light intensity over the pixels in the projected image 740.
Reference is now made to Fig. 10, which is a simplified partially pictorial, partially diagrammatic illustration of a portion of a projection device constructed and operative in accordance with yet a further preferred embodiment of the present invention and to Fig. 11, which is a sectional illustration of the projection device of Fig. 10, taken along section lines XI - XI in Fig. 10.
As seen in Figs. 10 and 11, there is provided a projection device, which optionally has the functionality of a virtual data entry device and is generally designated by reference numeral 900. The projector preferably comprises a housing 902, in which there is provided a light source 904, such as an LED. Downstream of the light source 904 there is preferably provided collimating optics such as a condensing lens 906.
Downstream of condensing lens 906 there is preferably provided a spatial light modulator 908, such as a segmented LCD. Downstream of spatial light modulator 908 there is preferably provided projection optics, such as a projection lens 910. Preferably, the projection lens 910 projects light from the light source 904, via a projection window 912 formed in housing 902 onto a projection surface 914.
Optionally, impingement-sensing functionality may be combined with the projector described hereinabove. As seen in Figs. 10 and 11, an impingement sensor 920 is mounted on or located within housing 902. Impingement sensor 920 is preferably of the type described hereinabove with reference to any of Figs. 1 - 4. Preferably, impingement sensor 920 includes an illuminator 922, generating a generally planar beam of light which lies in spaced, generally parallel relationship to projection surface 914, and a sensor assembly 924, which is operative to sense the distance of at least one location of impingement of the planar beam of light by at least one object, such as a finger or a stylus.
Preferably, the angular and positional relationships of the various components of the projector are as follows: generally light is being projected in a direction generally indicated by a line 930 in Fig. 11, at an acute angle alpha (α) with respect to the projection surface 914. Condensing lens 906 is aligned generally along an optical axis of light source 904. The spatial light modulator 908 and the projecting lens 910 are arranged with respect to the plane of the projection surface 914 and with respect to the optical axis of the light source and condensing lens in accordance with the Scheimpflug principle, i.e. that the plane of the spatial light modulator is focused onto the projection surface, which is obliquely angled with respect to the optical axis of the light source and condensing lens. The spatial light modulator 908 is preferably offset in a direction indicated by an arrow 932 in order to optimize uniformity of projection intensity. It is a particular feature of the present invention that the spatial light modulator 908 is configured to compensate for keystone distortions which result from projection therethrough onto projection surface 914. In accordance with a preferred embodiment of the present invention, the size and shape of each segment of the spatial light modulator 908 is distorted appropriately to provide a projected image 940 on projection surface 914, in which each pixel is of the same size and shape.
Preferably, the segments in the spatial light modulator are of differing sizes and the segments towards the bottom of the spatial light modulator, which are projected a longer distance and at a more oblique angle, are smaller than those at the top of the spatial light modulator, which are projected a shorter distance and at a less oblique angle.
Preferably, the spatial light modulator 908 is offset with respect to the condensing lens 906, such that more light per unit area is directed through the smaller segments of the spatial light modulator 908 so as to enhance uniformity of light intensity over the segments in the projected image 940.
It is appreciated that a modification of the segmented spatial light modulator of Figs. 10 and 11 for oblique projection in two directions can be carried out in accordance with the above description of Figs. 7 - 9.
It will be appreciated by persons skilled in the art that the present invention is not limited by what has been particularly shown and described hereinabove.
Rather the scope of the present invention includes combinations and subcombinations of various features of the present invention as well as modifications which would occur to persons reading the foregoing description and which are not in the prior art.

CLAIMS
1. A virtual data entry device comprising: an illuminator generating a generally planar beam of light; and an impingement sensor assembly operative to sense at least one location of impingement of said planar beam of light by at least one object, said impingement sensor assembly comprising: at least one optical element arranged to receive light from said planar beam reflected by said at least one object, said at least one optical element having optical power in a first direction such that it focuses light at at least one focus line location, wherein said at least one focus line location is a function of the location of said at least one object relative to said at least one optical element; and at least one multi-element detector arranged to receive light passing through said at least one optical element, wherein the distribution of said light detected by said multi-element detector among multiple elements thereof indicates said location of said at least one object.
2. A virtual data entry device according to claim 1 and wherein said at least one optical element has optical power in a second direction which is at least one of zero or substantially different from said optical power in said first direction.
3. A virtual data entry device according to claim 1 or claim 2 and wherein said at least one optical element is at least one of a cylindrical lens, a conical lens and a toroidal lens.
4. A virtual data entry device according to any of claims 1 to 3 and wherein said at least one multi-element detector has multiple elements arranged side-by-side along a detector line which intersects said at least one focus line, wherein the location of intersection of said at least one focus line and said detector line indicates distance of said at least one object from said at least one optical element.
5. A virtual data entry device according to any of claims 1 to 3 and wherein said at least one multi-element detector has multiple elements arranged side-by-side along a detector line which intersects said at least one focus line, wherein the location of intersection of said at least one focus line and said detector line indicates azimuthal location of said at least one object from said at least one optical element.
6. A virtual data entry device according to any of claims 1 to 3 wherein said at least one multi-element detector comprises a first multi-element detector and a second multi-element detector and said at least one optical element comprises a first optical element and a second optical element, said first multi-element detector having multiple elements arranged side-by-side along a first detector line which intersects said at least one focus line, wherein the location of intersection of said at least one focus line and said first detector line indicates azimuthal location of said at least one object from said first optical element and said second multi-element detector having multiple elements arranged side-by-side along a second detector line which intersects said at least one focus line, wherein the location of intersection of said at least one focus line and said second detector line indicates distance of said at least one object from said second optical element.
7. A virtual data entry device according to any of claims 1 to 3 and wherein said at least one multi-element detector has multiple elements arranged side-by-side in two dimensions in a plane which intersects said at least one focus line, whereby the location of intersection of said at least one focus line on said multi-element detector indicates the location of said at least one object in two-dimensions.
8. A virtual data entry device according to claim 4 and wherein said at least one multi-element detector comprises a generally one-dimensional slit lying in said second plane along a detector line which intersects said at least one focus line, wherein the location of intersection of said at least one focus line and said detector line indicates distance of said at least one object from said at least one optical element.
9. A virtual data entry device according to any of the preceding claims and wherein said generally planar beam of light is spaced from and generally parallel to a generally planar display surface.
10. A virtual data entry device according to claim 9 and wherein said generally planar display surface comprises an LCD screen.
11. A virtual data entry device according to any of the preceding claims and also comprising processing circuitry operative to receive at least one output signal of said multi-element detector and to determine said location of said at least one object.
12. A virtual data entry device comprising: an illuminator generating a generally planar beam of light generally parallel to a generally flat surface which is at least partially light reflecting; and an impingement sensor assembly operative to sense at least one location of impingement of said planar beam of light by at least one object, said impingement sensor assembly comprising: at least one optical element arranged to receive light from said planar beam reflected by said at least one object directly and indirectly via said surface; and a two-dimensional multi-element detector arranged to receive light passing through said at least one optical element, wherein the spatial separation on said detector of said light detected by said multi-element detector which was reflected directly from said at least one object and said light which was reflected via said surface thereof indicates distance of said at least one object and location on said detector of said light reflected directly and indirectly from said at least one object indicates the azimuthal location of said at least one object.
13. A virtual data entry device according to claim 12 and wherein said generally flat light reflecting surface comprises an LCD screen.
14. A virtual data entry device according to either of claims 12 and 13 and also comprising processing circuitry operative to receive at least one output signal of said multi-element detector and to determine at least one of said distance of said at least one object and said azimuthal location of said at least one object.
15. A virtual data entry device according to claim 14 and wherein said processing circuitry is operative to output said location of said at least one object in Cartesian coordinates.
16. A pattern projector comprising: a source of light to be projected; a spatial light modulator arranged in a spatial light modulator plane, said spatial light modulator receiving said light from said source of light and being configured to pass said light therethrough in a first pattern; and projection optics receiving said light from said spatial light modulator and being operative to project a desired second pattern onto a projection surface lying in a projection surface plane which is angled with respect to said spatial light modulator plane, said first pattern being a distortion of said desired second pattern configured such that keystone distortions resulting from the difference in angular orientations of said spatial light modulator plane and said projection surface plane are compensated.
17. A pattern projector according to claim 16 and also comprising a collimator interposed between said source of light and said spatial light modulator, said collimator being operative to distribute said light from said source of light across said spatial light modulator in a non-uniform distribution such that light distribution in said desired second pattern has uniformity greater than said non-uniform distribution.
18. A pattern projector according to either of claims 16 and 17 and wherein said spatial light modulator comprises an LCD.
19. A pattern projector according to claim 18, and wherein said LCD comprises a matrix LCD.
20. A pattern projector according to claim 19, and wherein said spatial light modulator includes a plurality of pixels, different ones of said plurality of pixels of said spatial light modulator having different sizes.
21. A pattern projector according to either of claims 19 and 20, and wherein said spatial light modulator includes a plurality of pixels, different ones of said plurality of pixels of said spatial light modulator having different shapes.
22. A pattern projector according to claim 18, and wherein said LCD comprises a segmented LCD.
23. A pattern projector according to claim 22, and wherein said spatial light modulator includes a plurality of segments, different ones of said plurality of segments of said spatial light modulator having different sizes.
24. A pattern projector according to either of claims 22 and 23, and wherein said spatial light modulator includes a plurality of segments, different ones of said plurality of segments of said spatial light modulator having different shapes.
25. A pattern projector according to any of claims 16 to 24 and wherein said spatial light modulator plane is angled with respect to said projection optics such that said desired second pattern is focused to a generally uniform extent.
26. A method for data entry comprising: utilizing an illuminator to generate a generally planar beam of light; and sensing at least one location of impingement of said planar beam of light by at least one object, using an impingement sensor assembly comprising: at least one optical element arranged to receive light from said planar beam reflected by said at least one object, said at least one optical element having optical power in a first direction such that it focuses light at at least one focus line location, wherein said at least one focus line location is a function of the location of said at least one object relative to said at least one optical element; and a multi-element detector arranged to receive light passing through said at least one optical element, wherein the distribution of said light detected by said multi-element detector among multiple elements thereof indicates said location of said at least one object.
27. A method for data entry according to claim 26 and also comprising determining the distance of said at least one object from said at least one optical element by determining a location of intersection between said at least one focus line and a detector line along which are arranged in a side-by-side orientation multiple elements of said multi-element detector.
28. A method for data entry according to claim 26 and also comprising determining the azimuthal location of said at least one object with respect to said at least one optical element by determining a location of intersection between said at least one focus line and a detector line along which are arranged in a side-by-side orientation multiple elements of said multi-element detector.
29. A method for data entry according to claim 26, wherein said at least one multi-element detector comprises a first multi-element detector and a second multi-element detector and said at least one optical element comprises a first optical element and a second optical element and also comprising: determining the azimuthal location of said at least one object with respect to said first optical element by determining a location of intersection between said at least one focus line and a first detector line along which are arranged in a side-by-side orientation multiple elements of said first multi-element detector; and determining the distance of said at least one object from said second optical element by determining a location of intersection between said at least one focus line and a second detector line along which are arranged in a side-by-side orientation multiple elements of said second multi-element detector.
30. A method for data entry according to claim 26 and also comprising determining the location of said at least one object in two-dimensions by determining a location of intersection between said at least one focus line and a detector plane along which are arranged in a side-by-side two-dimensional orientation multiple elements of said multi-element detector.
31. A method of data entry according to any of claims 26 to 30 and also comprising providing processing circuitry for receiving at least one output signal of said multi-element detector and determining said location of said at least one object.
32. A method for data entry comprising: utilizing an illuminator for generating a generally planar beam of light generally parallel to a generally flat surface which is at least partially light reflecting; and sensing at least one location of impingement of said planar beam of light by at least one object, using an impingement sensor assembly comprising: at least one optical element arranged to receive light from said planar beam reflected by said at least one object directly and indirectly via said surface; and a two-dimensional multi-element detector arranged to receive light passing through said at least one optical element, wherein the spatial separation on said detector of said light detected by said multi-element detector which was reflected directly from said at least one object and said light which was reflected via said surface indicates the distance of said at least one object, and the location on said detector of said light reflected directly and indirectly from said at least one object indicates the azimuthal location of said at least one object.
33. A method for data entry according to claim 32 and also comprising providing processing circuitry for receiving at least one output signal of said multi-element detector and for determining at least one of said distance of said at least one object and said azimuthal location of said at least one object.
34. A method for data entry according to claim 33 and wherein said providing processing circuitry also comprises providing processing circuitry for outputting said location of said at least one object in Cartesian coordinates.
35. A method for projecting a pattern comprising: providing a source of light to be projected; arranging a spatial light modulator, in a spatial light modulator plane, to receive said light from said source of light, said spatial light modulator being configured to pass said light therethrough in a first pattern; and using projection optics for receiving said light from said spatial light modulator and for projecting a desired second pattern onto a projection surface lying in a projection surface plane which is angled with respect to said spatial light modulator plane, said first pattern being a distortion of said desired second pattern configured such that keystone distortions resulting from the difference in angular orientations of said spatial light modulator plane and said projection surface plane are compensated.
36. A method for projecting a pattern according to claim 35 and also comprising providing a collimator interposed between said source of light and said spatial light modulator, and utilizing said collimator to distribute said light from said source of light across said spatial light modulator in a non-uniform distribution such that light distribution in said desired second pattern has uniformity greater than said non-uniform distribution.
37. A method for projecting a pattern according to either of claims 35 and 36 and wherein said arranging said spatial light modulator includes angling said spatial light modulator plane with respect to said projection optics such that said desired second pattern is focused to a generally uniform extent.
38. A method for projecting a pattern according to any of claims 35 to 37, and wherein said arranging said spatial light modulator includes providing a spatial light modulator including a plurality of pixels, different ones of said plurality of pixels having different sizes.
39. A method for projecting a pattern according to any of claims 35 to 38, and wherein said arranging said spatial light modulator includes providing a spatial light modulator including a plurality of pixels, different ones of said plurality of pixels having different shapes.
40. A method for projecting a pattern according to any of claims 35 to 37, and wherein said arranging said spatial light modulator includes providing a spatial light modulator including a plurality of segments, different ones of said plurality of segments having different sizes.
41. A method for projecting a pattern according to any of claims 35 to 37 and 40, and wherein said arranging said spatial light modulator includes providing a spatial light modulator including a plurality of segments, different ones of said plurality of segments having different shapes.
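
The sensing recited in claims 26 to 31 can be illustrated with a minimal sketch, assuming a linear multi-element detector read out as a one-dimensional intensity profile and a linear calibration from detector element to position; the function names, the centroid estimator and the calibration constants are illustrative assumptions and are not taken from the specification.

```python
# Illustrative sketch for claims 26-31: reduce the intensity distribution on a
# multi-element detector to the detector-line coordinate where the focus line
# crosses it, then map that coordinate to distance or azimuth with an assumed
# linear calibration.  All names and constants here are assumptions.

import numpy as np


def centroid_index(profile: np.ndarray) -> float:
    """Intensity-weighted centroid (in element units) of a 1-D detector
    read-out, i.e. where the focus line intersects the detector line."""
    total = profile.sum()
    if total == 0:
        raise ValueError("no light detected")
    return float((np.arange(profile.size) * profile).sum() / total)


def element_to_distance(element: float, d0: float, d_per_element: float) -> float:
    """Assumed linear element-to-distance calibration (cf. claim 27)."""
    return d0 + d_per_element * element


def element_to_azimuth(element: float, a0: float, a_per_element: float) -> float:
    """Assumed linear element-to-azimuth calibration, in radians (cf. claim 28)."""
    return a0 + a_per_element * element


if __name__ == "__main__":
    # Simulated read-outs of two linear detectors oriented as in claim 29:
    # one resolves distance, the other azimuth.
    distance_profile = np.array([0, 1, 4, 9, 4, 1, 0, 0], dtype=float)
    azimuth_profile = np.array([0, 0, 2, 7, 12, 7, 2, 0], dtype=float)

    r = element_to_distance(centroid_index(distance_profile),
                            d0=0.05, d_per_element=0.01)
    theta = element_to_azimuth(centroid_index(azimuth_profile),
                               a0=-0.35, a_per_element=0.1)
    print(f"distance ~ {r:.3f} m, azimuth ~ {theta:.3f} rad")
```

With a single two-dimensional detector as in claim 30, the same centroid would instead be taken along both detector axes to locate the focus line in the detector plane.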
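
For claims 32 to 34 a comparable sketch, assuming the two-dimensional detector image contains one spot reflected directly from the object and one spot reflected via the partially reflecting surface: the row separation between the spots is treated as an assumed inverse proxy for distance, their common column as a proxy for azimuth, and the result is reported in Cartesian coordinates as in claim 34. The peak-finding step and both calibrations are placeholders, not the specification's processing circuitry.

```python
# Illustrative sketch for claims 32-34: locate the directly reflected spot and
# the spot reflected via the surface on a 2-D detector, derive distance from
# their separation and azimuth from their column, and output Cartesian
# coordinates.  The calibrations (k_dist, rad_per_col, col0) are assumptions.

import math
import numpy as np


def spot_positions(image: np.ndarray) -> tuple[tuple[int, int], tuple[int, int]]:
    """Return the two brightest, vertically separated pixels (direct spot and
    surface-reflected spot).  A real system would use sub-pixel peak fitting."""
    order = image.ravel().argsort()[::-1]
    first = np.unravel_index(order[0], image.shape)
    for idx in order[1:]:
        cand = np.unravel_index(idx, image.shape)
        if abs(int(cand[0]) - int(first[0])) > 2:   # skip pixels of the same spot
            return (int(first[0]), int(first[1])), (int(cand[0]), int(cand[1]))
    raise ValueError("second spot not found")


def to_cartesian(image: np.ndarray, k_dist: float,
                 rad_per_col: float, col0: float) -> tuple[float, float]:
    (r1, c1), (r2, c2) = spot_positions(image)
    separation = abs(r1 - r2)              # rows between direct and mirrored spot
    distance = k_dist / separation         # assumed inverse calibration
    azimuth = (0.5 * (c1 + c2) - col0) * rad_per_col
    return distance * math.cos(azimuth), distance * math.sin(azimuth)


if __name__ == "__main__":
    img = np.zeros((16, 16))
    img[4, 9] = 10.0     # direct reflection from the object
    img[11, 9] = 8.0     # reflection reaching the detector via the surface
    x, y = to_cartesian(img, k_dist=1.4, rad_per_col=0.02, col0=8.0)
    print(f"x ~ {x:.3f} m, y ~ {y:.3f} m")
```

An inverse relation of this general form is what one would expect for a beam held at a fixed height above a reflecting surface, since the spot and its mirror image converge as the object recedes, but a real device would calibrate the mapping rather than assume it.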
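
The pre-distortion recited in claim 35 and its dependent claims can be modelled, under simplifying assumptions, as warping the desired second pattern by the inverse of the projective (keystone) transform between the spatial light modulator plane and the projection surface plane, so that projection through that transform reproduces the undistorted pattern. The toy homography and its parameters below stand in for a calibrated model of the actual projection geometry.

```python
# Illustrative sketch for claim 35: compute the pre-distorted "first pattern"
# by back-projecting the desired "second pattern" through the inverse of a toy
# keystone homography.  The homography model, tilt and throw values are
# assumptions, not the patent's projection geometry.

import numpy as np


def keystone_homography(tilt: float, throw: float) -> np.ndarray:
    """Toy projective transform for a projection surface tilted by `tilt`
    radians at normalised throw distance `throw`."""
    return np.array([
        [1.0, 0.0, 0.0],
        [0.0, np.cos(tilt), 0.0],
        [0.0, np.sin(tilt) / throw, 1.0],
    ])


def predistort(points_xy: np.ndarray, h: np.ndarray) -> np.ndarray:
    """Map desired on-surface coordinates back to modulator-plane coordinates,
    so that projecting them through `h` reproduces the desired pattern."""
    homogeneous = np.hstack([points_xy, np.ones((points_xy.shape[0], 1))])
    mapped = homogeneous @ np.linalg.inv(h).T     # apply h^-1 to each point
    return mapped[:, :2] / mapped[:, 2:3]         # de-homogenise


if __name__ == "__main__":
    # Corners of the desired rectangular keyboard outline, centred in x.
    desired = np.array([[-0.5, 0.0], [0.5, 0.0], [0.5, 1.0], [-0.5, 1.0]])
    h = keystone_homography(tilt=np.radians(35.0), throw=2.0)
    first_pattern = predistort(desired, h)
    print(np.round(first_pattern, 3))   # a symmetric trapezoid on the modulator
```

Projecting the resulting trapezoid through the same homography straightens it back into the rectangle, which is the compensation the claim describes; the separate uniformity measures of claims 36 to 41 (non-uniform collimation, modulator tilt, and varied pixel or segment geometry) are not modelled here.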
PCT/IL2006/000246 2005-02-24 2006-02-22 A virtual keyboard device WO2006090386A2 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US65540905P 2005-02-24 2005-02-24
US60/655,409 2005-02-24
US70904205P 2005-08-18 2005-08-18
US60/709,042 2005-08-18

Publications (2)

Publication Number Publication Date
WO2006090386A2 true WO2006090386A2 (en) 2006-08-31
WO2006090386A3 WO2006090386A3 (en) 2007-12-13

Family

ID=36927826

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2006/000246 WO2006090386A2 (en) 2005-02-24 2006-02-22 A virtual keyboard device

Country Status (2)

Country Link
US (2) US7670006B2 (en)
WO (1) WO2006090386A2 (en)

Families Citing this family (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8339379B2 (en) * 2004-04-29 2012-12-25 Neonode Inc. Light-based touch screen
WO2006090386A2 (en) * 2005-02-24 2006-08-31 Vkb Inc. A virtual keyboard device
WO2009099280A2 (en) * 2008-02-05 2009-08-13 Lg Electronics Inc. Input unit and control method thereof
US8698753B2 (en) * 2008-02-28 2014-04-15 Lg Electronics Inc. Virtual optical input device with feedback and method of controlling the same
TW201007254A (en) * 2008-08-04 2010-02-16 Pixart Imaging Inc Image-sensing module and image-sensing system
WO2010019802A1 (en) * 2008-08-15 2010-02-18 Gesturetek, Inc. Enhanced multi-touch detection
US20110212774A1 (en) * 2008-11-14 2011-09-01 Karl Wudtke Terminal including a button and button having projected images and method
GB2466497B (en) * 2008-12-24 2011-09-14 Light Blue Optics Ltd Touch sensitive holographic displays
CN101901082A (en) * 2009-06-01 2010-12-01 北京汇冠新技术股份有限公司 Touch detection device
TWI490751B (en) * 2009-08-04 2015-07-01 瑞鼎科技股份有限公司 Optical touch apparatus
TWI433008B (en) * 2010-04-21 2014-04-01 Pixart Imaging Inc Optical touch apparatus and light sensing module thereof
EP2441635B1 (en) * 2010-10-06 2015-01-21 Harman Becker Automotive Systems GmbH Vehicle User Interface System
GB201110156D0 (en) 2011-06-16 2011-07-27 Light Blue Optics Ltd Touch-sensitive display devices
GB201110159D0 (en) 2011-06-16 2011-07-27 Light Blue Optics Ltd Touch sensitive display devices
GB201110157D0 (en) 2011-06-16 2011-07-27 Light Blue Optics Ltd Touch sensitive display devices
GB201117542D0 (en) 2011-10-11 2011-11-23 Light Blue Optics Ltd Touch-sensitive display devices
US20140362052A1 (en) 2012-01-20 2014-12-11 Light Blue Optics Ltd Touch Sensitive Image Display Devices
GB2513498A (en) 2012-01-20 2014-10-29 Light Blue Optics Ltd Touch sensitive image display devices
TWI479391B (en) * 2012-03-22 2015-04-01 Wistron Corp Optical touch control device and method for determining coordinate thereof
GB201205303D0 (en) 2012-03-26 2012-05-09 Light Blue Optics Ltd Touch sensing systems
US9170474B2 (en) 2012-06-21 2015-10-27 Qualcomm Mems Technologies, Inc. Efficient spatially modulated illumination system
US9291806B2 (en) 2012-06-21 2016-03-22 Qualcomm Mems Technologies, Inc. Beam pattern projector with modulating array of light sources
US8994495B2 (en) 2012-07-11 2015-03-31 Ford Global Technologies Virtual vehicle entry keypad and method of use thereof
US11885738B1 (en) 2013-01-22 2024-01-30 J.A. Woollam Co., Inc. Reflectometer, spectrophotometer, ellipsometer or polarimeter system including sample imaging system that simultaneously meet the scheimpflug condition and overcomes keystone error
US11188154B2 (en) * 2018-05-30 2021-11-30 International Business Machines Corporation Context dependent projection of holographic objects
US10720939B2 (en) * 2018-06-12 2020-07-21 Asahi Kasei Microdevices Corporation Delta-sigma ad converter and delta-sigma ad converting method


Family Cites Families (103)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US581484A (en) * 1897-04-27 Boiler-cleaner
US4553842A (en) * 1983-05-09 1985-11-19 Illinois Tool Works Inc. Two dimensional optical position indicating apparatus
US4561017A (en) * 1983-08-19 1985-12-24 Richard Greene Graphic input apparatus
US4688933A (en) * 1985-05-10 1987-08-25 The Laitram Corporation Electro-optical position determining system
US4843568A (en) * 1986-04-11 1989-06-27 Krueger Myron W Real time perception of and response to the actions of an unencumbered participant/user
US6002799A (en) * 1986-07-25 1999-12-14 Ast Research, Inc. Handwritten keyboardless entry computer system
US5914481A (en) * 1986-08-08 1999-06-22 Norand Corporation Portable data collection terminal with handwritten input area
US6149062A (en) * 1988-01-14 2000-11-21 Intermec Ip Corp. Interface with hand-held data capture terminal, proximity and label sensing, and enhanced sensitivity and power efficiency
US4782328A (en) * 1986-10-02 1988-11-01 Product Development Services, Incorporated Ambient-light-responsive touch screen data input method and system
US5933132A (en) * 1989-11-07 1999-08-03 Proxima Corporation Method and apparatus for calibrating geometrically an optical computer input system
US5181181A (en) * 1990-09-27 1993-01-19 Triton Technologies, Inc. Computer apparatus input device for three-dimensional information
US5182659A (en) * 1991-02-20 1993-01-26 Holographix, Inc. Holographic recording and scanning system and method
US5457550A (en) * 1991-02-27 1995-10-10 Ricoh Company, Ltd. Optical scanner unit having recursive optical system
US5181108A (en) * 1991-10-07 1993-01-19 Greene Richard M Graphic input device with uniform sensitivity and no keystone distortion
EP0554492B1 (en) * 1992-02-07 1995-08-09 International Business Machines Corporation Method and device for optical input of commands or data
US5577848A (en) * 1992-08-24 1996-11-26 Bowen; James H. Light controlled touch pad for cursor and selection control on a computer display
US5605406A (en) * 1992-08-24 1997-02-25 Bowen; James H. Computer input devices with light activated switches and light emitter protection
JP3025121B2 (en) * 1992-12-24 2000-03-27 キヤノン株式会社 Information processing method and apparatus
DE4492865T1 (en) * 1993-04-28 1996-04-25 Mcpheters Holographic user interface
US5863113A (en) * 1993-06-22 1999-01-26 Mitsubishi Rayon Co., Ltd. Plane light source unit
US5677978A (en) * 1993-08-08 1997-10-14 Lewis; Aaron Bent probe microscopy
US5969698A (en) * 1993-11-29 1999-10-19 Motorola, Inc. Manually controllable cursor and control panel in a virtual image
US5557353A (en) * 1994-04-22 1996-09-17 Stahl; Thomas D. Pixel compensated electro-optical display system
US5581484A (en) 1994-06-27 1996-12-03 Prince; Kevin R. Finger mounted computer input device
JP3453428B2 (en) * 1994-07-07 2003-10-06 キヤノン株式会社 Information processing apparatus and control method therefor
US6281878B1 (en) * 1994-11-01 2001-08-28 Stephen V. R. Montellese Apparatus and method for inputing data
US5736976A (en) * 1995-02-13 1998-04-07 Cheung; Nina T. Computer data entry apparatus with hand motion sensing and monitoring
US5748512A (en) * 1995-02-28 1998-05-05 Microsoft Corporation Adjusting keyboard
US5734375A (en) * 1995-06-07 1998-03-31 Compaq Computer Corporation Keyboard-compatible optical determination of object's position
US5633691A (en) * 1995-06-07 1997-05-27 Nview Corporation Stylus position sensing and digital camera with a digital micromirror device
US5786810A (en) * 1995-06-07 1998-07-28 Compaq Computer Corporation Method of determining an object's position and associated apparatus
US5831601A (en) * 1995-06-07 1998-11-03 Nview Corporation Stylus position sensing and digital camera with a digital micromirror device
US5502514A (en) * 1995-06-07 1996-03-26 Nview Corporation Stylus position sensing and digital camera with a digital micromirror device
DE19539955A1 (en) * 1995-10-26 1997-04-30 Sick Ag Optical detection device
US5705878A (en) * 1995-11-29 1998-01-06 Lewis; Aaron Flat scanning stage for scanned probe microscopy
US5595449A (en) * 1995-12-21 1997-01-21 Delco Electronics Corporation Inflatable keyboard
US5880712A (en) * 1995-12-21 1999-03-09 Goldman; Alfred Data input device
US5867146A (en) * 1996-01-17 1999-02-02 Lg Electronics Inc. Three dimensional wireless pointing device
FI961459A0 (en) 1996-04-01 1996-04-01 Kyoesti Veijo Olavi Maula Arrangements for optical fiber production are specified
US5781252A (en) * 1996-04-02 1998-07-14 Kopin Corporation Dual light valve color projector system
US5986261A (en) * 1996-04-29 1999-11-16 Nanoptics, Inc. Tapered structure suitable for microthermocouples microelectrodes, field emission tips and micromagnetic sensors with force sensing capabilities
US5680205A (en) * 1996-08-16 1997-10-21 Dew Engineering And Development Ltd. Fingerprint imaging apparatus with auxiliary lens
US5936615A (en) * 1996-09-12 1999-08-10 Digital Equipment Corporation Image-based touchscreen
WO1998013725A1 (en) * 1996-09-24 1998-04-02 Seiko Epson Corporation Projection display having light source
JP3159085B2 (en) * 1996-10-07 2001-04-23 スタンレー電気株式会社 Optical pointing device
US5818361A (en) * 1996-11-07 1998-10-06 Acevedo; Elkin Display keyboard
FR2756077B1 (en) 1996-11-19 1999-01-29 Opto System TOUCH SCREEN AND VISUALIZATION DEVICE USING THE SAME
GB2358779B (en) * 1996-12-13 2001-10-10 Ibm System, method, and pointing device for remote operation of data processing apparatus
US5835094A (en) * 1996-12-31 1998-11-10 Compaq Computer Corporation Three-dimensional computer environment
US5793358A (en) * 1997-01-14 1998-08-11 International Business Machines Corporation Method and means for managing a luminescent laptop keyboard
US5914709A (en) * 1997-03-14 1999-06-22 Poa Sana, Llc User input device for a computer system
US5821922A (en) * 1997-05-27 1998-10-13 Compaq Computer Corporation Computer having video controlled cursor system
JP3876942B2 (en) * 1997-06-13 2007-02-07 株式会社ワコム Optical digitizer
US5864334A (en) * 1997-06-27 1999-01-26 Compaq Computer Corporation Computer keyboard with switchable typing/cursor control modes
US6094196A (en) * 1997-07-03 2000-07-25 International Business Machines Corporation Interaction spheres of three-dimensional objects in three-dimensional workspace displays
US6104384A (en) * 1997-09-12 2000-08-15 Ericsson, Inc. Image based keyboard for a small computing device
US6037882A (en) * 1997-09-30 2000-03-14 Levy; David H. Method and apparatus for inputting data to an electronic system
US6031519A (en) * 1997-12-30 2000-02-29 O'brien; Wayne P. Holographic direct manipulation interface
US5952731A (en) 1998-02-02 1999-09-14 Lear Automotive Dearborn, Inc. Membrane keyless entry switch for vehicles
DE29803435U1 (en) 1998-02-27 1998-05-28 Josta Technik Fahrradhalter Un Double-level parking system for bicycles
US6043805A (en) * 1998-03-24 2000-03-28 Hsieh; Kuan-Hong Controlling method for inputting messages to a computer
US5977867A (en) * 1998-05-29 1999-11-02 Nortel Networks Corporation Touch pad panel with tactile feedback
JP4033582B2 (en) * 1998-06-09 2008-01-16 株式会社リコー Coordinate input / detection device and electronic blackboard system
US6266048B1 (en) * 1998-08-27 2001-07-24 Hewlett-Packard Company Method and apparatus for a virtual display/keyboard for a PDA
CA2246258A1 (en) * 1998-08-31 2000-02-29 Photonics Research Ontario Novel optical scheme for holographic imaging of complex defractive elements in materials
US6367933B1 (en) * 1998-10-02 2002-04-09 Macronix International Co., Ltd. Method and apparatus for preventing keystone distortion
US6690357B1 (en) * 1998-10-07 2004-02-10 Intel Corporation Input device using scanning sensors
TW464800B (en) 1998-10-07 2001-11-21 Intel Corp A method for inputting data to an electronic device, an article comprising a medium for storing instructions, and an image processing system
FI990676A (en) 1999-03-26 2000-09-27 Nokia Mobile Phones Ltd Hand-held entry system for data entry and mobile phone
US6614422B1 (en) 1999-11-04 2003-09-02 Canesta, Inc. Method and apparatus for entering data using a virtual input device
US7030863B2 (en) * 2000-05-26 2006-04-18 America Online, Incorporated Virtual keyboard system with automatic correction
KR100636090B1 (en) * 1999-06-11 2006-10-19 삼성전자주식회사 LCD Projection System
JP4057200B2 (en) * 1999-09-10 2008-03-05 株式会社リコー Coordinate input device and recording medium for coordinate input device
JP4094794B2 (en) * 1999-09-10 2008-06-04 株式会社リコー Coordinate detection apparatus, information storage medium, and coordinate detection method
US6424338B1 (en) * 1999-09-30 2002-07-23 Gateway, Inc. Speed zone touchpad
JP4052498B2 (en) * 1999-10-29 2008-02-27 株式会社リコー Coordinate input apparatus and method
US20030132921A1 (en) * 1999-11-04 2003-07-17 Torunoglu Ilhami Hasan Portable sensory input device
US6710770B2 (en) * 2000-02-11 2004-03-23 Canesta, Inc. Quasi-three-dimensional method and apparatus to detect and localize interaction of user-object and virtual transfer device
JP3934846B2 (en) * 2000-03-06 2007-06-20 株式会社リコー Coordinate input / detection device, electronic blackboard system, light receiving element positional deviation correction method, and storage medium
US6611252B1 (en) * 2000-05-17 2003-08-26 Dufaux Douglas P. Virtual data input device
US6650318B1 (en) * 2000-10-13 2003-11-18 Vkb Inc. Data input device
JP2003535405A (en) 2000-05-29 2003-11-25 ブイケービー インコーポレイティド Virtual data input device and method for inputting characters, numbers and other data
US6690363B2 (en) * 2000-06-19 2004-02-10 Next Holdings Limited Touch panel display system
US6520647B2 (en) * 2000-08-17 2003-02-18 Mitsubishi Electric Research Laboratories Inc. Automatic keystone correction for projectors with arbitrary orientation
CN1232882C (en) * 2000-10-06 2005-12-21 松下电器产业株式会社 Illumination optical unit and proejction display comprising it
US6690354B2 (en) 2000-11-19 2004-02-10 Canesta, Inc. Method for enhancing performance in a system utilizing an array of sensors that sense at least two-dimensions
FI113094B (en) * 2000-12-15 2004-02-27 Nokia Corp An improved method and arrangement for providing a function in an electronic device and an electronic device
KR20030072591A (en) * 2001-01-08 2003-09-15 브이케이비 인코포레이티드 A data input device
AU2002951208A0 (en) 2002-09-05 2002-09-19 Digislide International Pty Ltd A portable image projection device
DE20109394U1 (en) * 2001-06-06 2001-08-16 Zeiss Carl Jena Gmbh Projection arrangement
WO2002101443A2 (en) * 2001-06-12 2002-12-19 Silicon Optix Inc. System and method for correcting keystone distortion
US6854870B2 (en) * 2001-06-30 2005-02-15 Donnelly Corporation Vehicle handle assembly
JP2003173237A (en) * 2001-09-28 2003-06-20 Ricoh Co Ltd Information input-output system, program and storage medium
US7307661B2 2002-06-26 2007-12-11 Vkb Inc. Multifunctional integrated image sensor and application to virtual interface technology
DE10260305A1 (en) * 2002-12-20 2004-07-15 Siemens Ag HMI setup with an optical touch screen
US7629967B2 (en) * 2003-02-14 2009-12-08 Next Holdings Limited Touch screen signal processing
AU2003304127A1 (en) * 2003-05-19 2004-12-03 Itzhak Baruch Optical coordinate input device comprising few elements
US20080297614A1 (en) 2003-10-31 2008-12-04 Klony Lieberman Optical Apparatus for Virtual Interface Projection and Sensing
JP4522113B2 (en) * 2004-03-11 2010-08-11 キヤノン株式会社 Coordinate input device
US7471865B2 (en) * 2004-06-04 2008-12-30 Poa Sana Liquidating Trust Apparatus and method for a molded waveguide for use with touch screen displays
US7248151B2 (en) 2005-01-05 2007-07-24 General Motors Corporation Virtual keypad for vehicle entry control
GB2440683B (en) * 2005-02-23 2010-12-08 Zienon L L C Method and apparatus for data entry input
WO2006090386A2 (en) * 2005-02-24 2006-08-31 Vkb Inc. A virtual keyboard device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7042442B1 (en) * 2000-06-27 2006-05-09 International Business Machines Corporation Virtual invisible keyboard
US6911972B2 (en) * 2001-04-04 2005-06-28 Matsushita Electric Industrial Co., Ltd. User interface device
US7071924B2 (en) * 2002-01-10 2006-07-04 International Business Machines Corporation User input method and apparatus for handheld computers
US7151530B2 (en) * 2002-08-20 2006-12-19 Canesta, Inc. System and method for determining an input selected by a user through a virtual interface
US7215327B2 (en) * 2002-12-31 2007-05-08 Industrial Technology Research Institute Device and method for generating a virtual keyboard/display

Also Published As

Publication number Publication date
US8243015B2 (en) 2012-08-14
US20060187198A1 (en) 2006-08-24
WO2006090386A3 (en) 2007-12-13
US7670006B2 (en) 2010-03-02
US20060187199A1 (en) 2006-08-24

Similar Documents

Publication Publication Date Title
US7670006B2 (en) System and method for projection
US6362468B1 (en) Optical unit for detecting object and coordinate input apparatus using same
US10289250B2 (en) Touchscreen for detecting multiple touches
EP1577745B1 (en) Coordinate input apparatus, its control method, and program
US7755026B2 (en) Generating signals representative of sensed light that is associated with writing being done by a user
US5159322A (en) Apparatus to digitize graphic and scenic information and to determine the position of a stylus for input into a computer or the like
JP2010257089A (en) Optical position detection apparatus
US20080074755A1 (en) Lens array imaging with cross-talk inhibiting optical stop structure
US20070132742A1 (en) Method and apparatus employing optical angle detectors adjacent an optical input area
JP4054847B2 (en) Optical digitizer
JP2010277122A (en) Optical position detection apparatus
JP6431468B2 (en) Non-contact input device and method
CN108762585A Method and device for contactlessly detecting a pointing position on a reproduced image
JP2005107607A (en) Optical position detecting apparatus
US20150015545A1 (en) Pointing input system having sheet-like light beam layer
US8089466B2 (en) System and method for performing optical navigation using a compact optical element
JP6740042B2 (en) Position detection system
JP2012133487A (en) Coordinate input device and coordinate input method
US8259069B1 (en) Speckle-based optical navigation on curved tracking surface
JP2000267798A (en) Coordinate inputting/detecting device
JPH06250780A (en) Data input device
JP2004086775A (en) Light source part mounting state detection device and light source part mounting state detection method
JP2013250814A (en) Coordinate input device, and coordinate input system
KR20180106276A (en) motion recognition apparatus
JP2013182343A (en) Coordinate input device and coordinate input system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as the address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC

122 Ep: pct application non-entry in european phase

Ref document number: 06711228

Country of ref document: EP

Kind code of ref document: A2