US20050231479A1 - Illumination spot alignment - Google Patents

Illumination spot alignment

Info

Publication number
US20050231479A1
US20050231479A1 US10/827,864 US82786404A
Authority
US
United States
Prior art keywords
pointing device
array
image
illumination spot
image array
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US10/827,864
Other versions
US8325140B2 (en)
Inventor
Tong Xie
Michael Brosnan
Tiong Heng Siah
Tong Sen Liew
Lye Hock Bernard Chan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Pixart Imaging Inc
Original Assignee
Agilent Technologies Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Agilent Technologies Inc filed Critical Agilent Technologies Inc
Priority to US10/827,864 priority Critical patent/US8325140B2/en
Assigned to AGILENT TECHNOLOGIES, INC. reassignment AGILENT TECHNOLOGIES, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BROSNAN, MICHAEL, CHAN, LYE HOCK BERNARD, LIEW, TONG SEN, SIAH, TIONG HENG, XIE, TONG
Priority to TW093139147A priority patent/TWI357008B/en
Priority to CNA2004101040036A priority patent/CN1690938A/en
Priority to DE102005006161A priority patent/DE102005006161A1/en
Priority to JP2005118179A priority patent/JP2005310142A/en
Publication of US20050231479A1 publication Critical patent/US20050231479A1/en
Assigned to AVAGO TECHNOLOGIES GENERAL IP PTE. LTD. reassignment AVAGO TECHNOLOGIES GENERAL IP PTE. LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: AGILENT TECHNOLOGIES, INC.
Assigned to CITICORP NORTH AMERICA, INC. reassignment CITICORP NORTH AMERICA, INC. SECURITY AGREEMENT Assignors: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.
Assigned to AVAGO TECHNOLOGIES ECBU IP (SINGAPORE) PTE. LTD. reassignment AVAGO TECHNOLOGIES ECBU IP (SINGAPORE) PTE. LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.
Publication of US8325140B2 publication Critical patent/US8325140B2/en
Application granted granted Critical
Assigned to AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD. reassignment AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD. MERGER (SEE DOCUMENT FOR DETAILS). Assignors: AVAGO TECHNOLOGIES ECBU IP (SINGAPORE) PTE. LTD.
Assigned to AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD. reassignment AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD. RELEASE BY SECURED PARTY (SEE DOCUMENT FOR DETAILS). Assignors: CITICORP NORTH AMERICA, INC.
Assigned to DEUTSCHE BANK AG NEW YORK BRANCH, AS COLLATERAL AGENT reassignment DEUTSCHE BANK AG NEW YORK BRANCH, AS COLLATERAL AGENT PATENT SECURITY AGREEMENT Assignors: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.
Assigned to AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD. reassignment AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD. TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENT RIGHTS (RELEASES RF 032851-0001) Assignors: DEUTSCHE BANK AG NEW YORK BRANCH, AS COLLATERAL AGENT
Assigned to BANK OF AMERICA, N.A., AS COLLATERAL AGENT reassignment BANK OF AMERICA, N.A., AS COLLATERAL AGENT PATENT SECURITY AGREEMENT Assignors: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.
Assigned to AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD. reassignment AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD. CORRECTIVE ASSIGNMENT TO CORRECT THE ASSIGNEE NAME PREVIOUSLY RECORDED AT REEL: 017206 FRAME: 0666. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT. Assignors: AGILENT TECHNOLOGIES, INC.
Assigned to PIXART IMAGING INC. reassignment PIXART IMAGING INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.
Assigned to AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD. reassignment AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD. TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS Assignors: BANK OF AMERICA, N.A., AS COLLATERAL AGENT
Assigned to AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD. reassignment AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD. TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS Assignors: BANK OF AMERICA, N.A., AS COLLATERAL AGENT
Active legal-status Critical Current
Adjusted expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304 Detection arrangements using opto-electronic means
    • G06F3/0317 Detection arrangements using opto-electronic means in co-operation with a patterned surface, e.g. absolute position or relative movement detection for an optical mouse or pen positioned with respect to a coded surface

Abstract

Illumination spot alignment is performed by capturing an image by an image array. The image is evaluated to determine an illumination spot size and location. Coordinates identifying the illumination spot size and location are stored.

Description

    BACKGROUND
  • In one type of optical mouse, the optical mouse uses photodetectors arranged as an image array of pixels to image the spatial features of generally any micro textured or micro detailed work surface located below the optical mouse. Photodetector responses are digitized and stored as a frame into memory. Motion produces successive frames of translated patterns of pixel information. The successive frames are compared by cross-correlation to ascertain the direction and amount of movement. For more information on this type of optical mouse, see, for example, U.S. Pat. No. 6,281,882 B1.
  • The imaging and tracking performance required by optical mice, and by similar pointing devices that rely on optical navigation sensing, depends heavily on uniform illumination across the array of pixels. Uniform illumination is obtained by careful alignment of the array of pixels, an illuminator and accompanying optics. Such careful alignment can be expensive from a manufacturing point of view.
  • SUMMARY OF THE INVENTION
  • In accordance with an embodiment of the present invention, illumination spot alignment is performed by capturing an image by an image array. The image is evaluated to determine an illumination spot size and location. Coordinates identifying the illumination spot size and location are stored.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a simplified view of the underside of a pointing device.
  • FIG. 2 is a simplified block diagram of an optical navigation sensing system in accordance with an embodiment of the present invention.
  • FIG. 3 illustrates illumination spot alignment calibration in accordance with an embodiment of the present invention.
  • FIG. 4 is a simplified flowchart illustrating illumination spot alignment in accordance with an embodiment of the present invention.
  • DESCRIPTION OF THE EMBODIMENT
  • FIG. 1 is a simplified view of the underside of a pointing device 10. For example, pointing device 10 is an optical mouse. A low friction guide 14, a low friction guide 15 and a low friction guide 16 are used by pointing device 10 to make contact with an underlying surface. A connecting cable 12 and strain relief 11 are also shown. Alternatively, pointing device 10 can be a wireless mouse and connecting cable 12 can be omitted.
  • Within an orifice 13 is shown an illuminator 17 and an image array 18. For example, various optics, as necessary or desirable, are included within illuminator 17 and/or image array 18. For example, illuminator 17 is implemented using a light emitting diode (LED), an infrared (IR) LED, or a laser.
  • FIG. 2 is a simplified block diagram of an optical navigation sensing system. Image array 18 is implemented, for example, using a 32 by 32 array of photodetectors. Alternatively, other array sizes can be used.
  • An analog-to-digital converter (ADC) 21 receives analog signals from image array 18 and converts the signals to digital data. For example, the interface between image array 18 and ADC 21 is a serial interface. Alternatively, the interface between image array 18 and ADC 21 is a parallel interface.
  • An automatic gain control (AGC) 22 evaluates digital data received from ADC 21 and controls shutter speed and gain adjust within image array 18. This is done, for example, to prevent saturation or underexposure of images captured by image array 18.
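  • As an illustration only (not part of the patent text), the exposure-control behavior described above can be sketched in a few lines of Python. The target level, the step sizes, and the names `shutter` and `gain` are assumptions made for this example.

```python
import numpy as np

def agc_step(frame, shutter, gain, target_mean=128.0,
             low=0.8, high=1.2, shutter_step=1, gain_step=0.05):
    """One iteration of a simple automatic-gain-control loop.

    `frame` is the digitized pixel data from the ADC; `shutter` and `gain`
    are the exposure settings fed back to the image array (names assumed).
    """
    mean_level = frame.mean()
    if mean_level > high * target_mean:
        # Image too bright: back off exposure to avoid saturation.
        if shutter > 1:
            shutter -= shutter_step
        else:
            gain = max(0.1, gain - gain_step)
    elif mean_level < low * target_mean:
        # Image too dark: raise exposure to avoid underexposure.
        shutter += shutter_step
        gain += gain_step
    return shutter, gain

# A dim 32 x 32 frame nudges shutter and gain upward.
dim_frame = np.full((32, 32), 40, dtype=np.uint8)
print(agc_step(dim_frame, shutter=10, gain=1.0))  # -> (11, 1.05)
```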
  • A cross-correlator 26 evaluates the digital data from ADC 21 and performs a convolution to calculate overlap of images and to determine peak shift between images in order to detect motion. A navigator 27 takes results from cross-correlator 26 to determine a delta x value placed on an output 28 and to determine a delta y value placed on an output 29.
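  • A minimal sketch of how a cross-correlator and navigator can recover a displacement from two successive frames is shown below; it brute-forces the shift with the highest overlap score. It models the general technique only, and the search range and scoring are assumptions of the example, not details taken from the patent.

```python
import numpy as np

def find_shift(prev, curr, max_shift=4):
    """Return the (dx, dy) displacement of `curr` relative to `prev`
    that maximizes the overlap (cross-correlation) score."""
    best_score, best = -np.inf, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            # Shift the previous frame by the candidate displacement and score the overlap.
            shifted = np.roll(np.roll(prev, dy, axis=0), dx, axis=1)
            score = float(np.sum(shifted * curr))
            if score > best_score:
                best_score, best = score, (dx, dy)
    return best

# A random 32 x 32 texture moved by (+2, -1) pixels is recovered.
rng = np.random.default_rng(0)
prev = rng.integers(0, 255, (32, 32))
curr = np.roll(np.roll(prev, -1, axis=0), 2, axis=1)
print(find_shift(prev, curr))  # -> (2, -1)
```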
  • Existing optical mice include functionality identical or similar to image array 18, ADC 21, AGC 22, cross-correlator 26 and navigator 27. For further information on how this standard or similar functionality of optical mice is implemented, see, for example, U.S. Pat. No. 5,644,139, U.S. Pat. No. 5,578,813, U.S. Pat. No. 5,786,804 and/or U.S. Pat. No. 6,281,882 B1.
  • An initialization block 24 checks to see whether pointing device 10 is in the illumination spot identification mode. For example, pointing device 10 is placed in the illumination spot identification mode by toggling a switch within pointing device 10. For example, the switch is not accessible to an end user of pointing device 10. When pointing device 10 is not in the illumination spot identification mode, digital data from ADC 21 is forwarded directly to cross-correlator 26 and pointing device 10 proceeds in a normal mode to track movement. During normal operation (e.g., not in an illumination spot identification mode) of pointing device 10, only pixels of image array 18 that are within a bounding box determined by data within a memory 23 are forwarded to ADC 21 and used for movement/position tracking.
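  • The data-path selection just described can be pictured with the short sketch below. The function and argument names (`select_pixels`, `spot_id_mode`, `bounding_box`) are placeholders for this illustration, not identifiers from the patent.

```python
import numpy as np

def select_pixels(frame, spot_id_mode, bounding_box):
    """Choose which pixel data the rest of the pipeline sees.

    In spot identification mode every pixel is used; in normal mode only
    the pixels inside the bounding box stored in memory 23 feed the
    cross-correlator for movement tracking.
    """
    if spot_id_mode:
        return frame
    x0, y0, x1, y1 = bounding_box          # inclusive corners read from memory
    return frame[y0:y1 + 1, x0:x1 + 1]

# Normal mode: only the 11 x 11 boxed region is passed on.
frame = np.arange(32 * 32).reshape(32, 32)
print(select_pixels(frame, spot_id_mode=False, bounding_box=(15, 7, 25, 17)).shape)  # (11, 11)
```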
  • When pointing device 10 is in the illumination spot identification mode, digital data from ADC 21 is intercepted and sent to an illumination spot identification block 25. Data from all pixels within image array 18 are used when pointing device 10 is in the illumination spot identification mode. Illumination spot identification block 25 performs illumination spot alignment using the received data.
  • For example, illumination spot identification block 25 finds a location of the peak illumination intensity of light detected by image array 18. Illumination spot identification block 25 then determines a boundary where a predetermined percentage of peak illumination intensity occurs. For example the predetermined percentage is 50%. Alternatively, the predetermined percentage varies depending upon the components used to implement optical pointing device 10 and/or upon the desired resolution.
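  • A compact sketch of this peak-and-threshold search, assuming an axis-aligned box around every pixel at or above the chosen fraction of the peak, is given below; the actual block may derive the boundary differently.

```python
import numpy as np

def identify_spot(frame, fraction=0.5):
    """Locate the illumination spot: find the peak intensity, then the
    region at or above `fraction` of that peak, returned as an inclusive
    bounding box (x0, y0, x1, y1)."""
    peak = frame.max()
    ys, xs = np.nonzero(frame >= fraction * peak)   # pixels inside the boundary
    return int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max())

# A synthetic spot centered near (20, 12) on a 32 x 32 array.
y, x = np.mgrid[0:32, 0:32]
spot = (255 * np.exp(-((x - 20) ** 2 + (y - 12) ** 2) / 50.0)).astype(np.uint8)
print(identify_spot(spot))  # bounding box of the half-maximum region
```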
  • On the basis of the location of the detected peak illumination intensity and boundary, coordinates for a bounding box are determined. The coordinates for the bounding box are stored within memory 23. For example, stored coordinates can be in the form of x, y coordinates for two corners of the bounding box (e.g., the lower left hand corner of the bounding box, and the upper right hand corner of the bounding box). Alternatively, the stored coordinates can be in the form of a single x, y coordinate (e.g., for the lower left hand corner of the box, the upper right corner of the box, or the center of the box, etc.), and an x length and a y length of the bounding box. Alternatively, the coordinates can be any type of coordinates that define the bounding box.
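  • The equivalence of the two storage formats mentioned above (two corners versus one corner plus x and y lengths) is easy to see in code; the conversion helpers below are illustrative only.

```python
def corners_to_corner_and_size(x0, y0, x1, y1):
    """Two-corner form -> (corner, x length, y length) form, inclusive corners."""
    return (x0, y0), x1 - x0 + 1, y1 - y0 + 1

def corner_and_size_to_corners(corner, x_len, y_len):
    """(corner, lengths) form -> two-corner form."""
    x0, y0 = corner
    return x0, y0, x0 + x_len - 1, y0 + y_len - 1

# Both encodings describe the same bounding box.
print(corners_to_corner_and_size(15, 7, 25, 17))    # ((15, 7), 11, 11)
print(corner_and_size_to_corners((15, 7), 11, 11))  # (15, 7, 25, 17)
```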
  • FIG. 3 variously illustrates the illumination spot alignment performed by illumination spot identification block 25. Array area 41 represents the full area covered by all the pixels of image array 18. Illumination spot identification block 25 locates a peak illumination intensity 42 and a boundary 43 where a predetermined percentage of peak illumination intensity occurs. From detected peak illumination intensity 42 and boundary 43, illumination spot identification block 25 generates coordinates for a bounding box 44. Bounding box 44 includes only a subset of pixels of image array 18.
  • Likewise, array area 51 represents the full area covered by all the pixels of image array 18. Illumination spot identification block 25 locates a peak illumination intensity 52 and a boundary 53 where a predetermined percentage of peak illumination intensity occurs. From detected peak illumination intensity 52 and boundary 53, illumination spot identification block 25 generates coordinates for a bounding box 54. Bounding box 54 includes only a subset of pixels of image array 18.
  • Likewise, array area 61 represents the full area covered by all the pixels of image array 18. Illumination spot identification block 25 locates a peak illumination intensity 62 and a boundary 63 where a predetermined percentage of peak illumination intensity occurs. From detected peak illumination intensity 62 and boundary 63, illumination spot identification block 25 generates coordinates for a bounding box 64. Bounding box 64 includes only a subset of pixels of image array 18.
  • FIG. 4 is a simplified flowchart illustrating illumination spot alignment within a manufacturing process. In a block 31, final assembly of a pointing device, such as an optical mouse, is performed. In a block 32, the pointing device is placed on a reflective surface specifically used for the performance of illumination spot alignment. The use of such a specific reflective surface ensures uniformity in initialization results.
  • In a block 33, the pointing device, while in the illumination spot identification mode, captures an image using all the pixels within the image array.
  • In a block 34, the pointing device evaluates the image to determine an illumination spot size and location. For example, the pointing device locates a peak illumination intensity and a boundary where a predetermined percentage of peak illumination intensity occurs. This allows calculation of illumination spot size and center. In a block 35, coordinates identifying the spot size and location are stored in memory for use by the image array.
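  • Taken together, blocks 32 through 35 amount to a single end-of-line calibration routine. The sketch below is one way such a routine might look; `capture_full_frame` and `write_memory` are stand-ins for the device and memory interfaces and are assumptions of the example.

```python
import numpy as np

def calibrate_illumination_spot(capture_full_frame, write_memory, fraction=0.5):
    """End-of-line illumination spot alignment (blocks 32-35 of FIG. 4)."""
    frame = capture_full_frame()                    # block 33: image from all pixels
    peak = frame.max()                              # block 34: peak illumination intensity
    ys, xs = np.nonzero(frame >= fraction * peak)   # boundary at `fraction` of the peak
    box = (int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max()))
    write_memory("bounding_box", box)               # block 35: store the coordinates
    return box

# Example with stand-in interfaces: a synthetic frame and a dictionary as memory.
memory = {}
y, x = np.mgrid[0:32, 0:32]
fake_frame = (255 * np.exp(-((x - 16) ** 2 + (y - 16) ** 2) / 40.0)).astype(np.uint8)
print(calibrate_illumination_spot(lambda: fake_frame, memory.__setitem__))
print(memory["bounding_box"])
```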
  • The foregoing discussion discloses and describes merely exemplary methods and embodiments of the present invention. As will be understood by those familiar with the art, the invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. Accordingly, the disclosure of the present invention is intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the following claims.

Claims (20)

1. A method for performing illumination spot alignment comprising:
capturing an image by an image array;
evaluating the image to determine an illumination spot size and location; and,
storing coordinates identifying the illumination spot size and location.
2. A method as in claim 1 wherein evaluating the image includes:
locating a peak illumination intensity and a boundary where a predetermined percentage of peak illumination intensity occurs.
3. A method as in claim 1 wherein the coordinates identifying the illumination spot size and location define a subsection of the image array.
4. A method as in claim 1 wherein capturing the image is performed by an optical mouse placed on a reflective surface.
5. A method as in claim 1 wherein the coordinates are stored in a memory used by the image array to determine an illumination spot to be used for motion detection.
6. A method as in claim 1 wherein capturing the image is performed by an optical mouse, the optical mouse being placed in an initialization mode.
7. A pointing device, comprising:
an image array;
an initialization selector that selects a data path for data from the image array; and,
an illumination spot identifier that receives the data from the image array when the pointing device is in a special mode, the illumination spot identifier identifying a subset of the image array as being within an illumination spot.
8. A pointing device as in claim 7 wherein the image array comprises a matrix of photodetectors.
9. A pointing device as in claim 7 additionally comprising:
an illuminator that emits light which when reflected from a surface is detected by the image array.
10. A pointing device as in claim 7 wherein the subset of the image array is a subsection of the image array.
11. A pointing device as in claim 7, additionally comprising a motion detector that receives the data from the image array when the pointing device is not in a special mode.
12. A pointing device as in claim 7 additionally comprising:
a memory in which is stored coordinates that identify the subset of the image array within the illumination spot.
13. A pointing device as in claim 7:
wherein when the pointing device is in the special mode, the image array outputs data for all pixels in the image array; and,
wherein when the pointing device is not in the special mode, the image array outputs data only for pixels within the subset of the image array being within the illumination spot.
14. A pointing device, comprising:
array means for producing an array of output signals that represent detected illuminance;
selection means for selecting a data path for data from the array means; and,
identification means for receiving the data from the array means when the pointing device is in a special mode, and for identifying a subset of the output signals as being within an illumination spot.
15. A pointing device as in claim 14 wherein the array means comprises a matrix of photodetectors.
16. A pointing device as in claim 14 additionally comprising:
illumination means for emitting light which when reflected from a surface is detected by the array means.
17. A pointing device as in claim 14 wherein the subset of the output signals are generated from a subsection of the array means.
18. A pointing device as in claim 14, additionally comprising a detector means for receiving the data from the array means when the pointing device is not in a special mode.
19. A pointing device as in claim 14 additionally comprising:
memory means for storing coordinates that identify the subset of the output signals.
20. A pointing device as in claim 14:
wherein when the pointing device is in the special mode, the array means outputs data for all pixels in the array means; and,
wherein when the pointing device is not in the special mode, the array means outputs data only for pixels within the subset of the output signals being within the illumination spot.
US10/827,864 2004-04-20 2004-04-20 Illumination spot alignment Active 2031-10-02 US8325140B2 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
US10/827,864 US8325140B2 (en) 2004-04-20 2004-04-20 Illumination spot alignment
TW093139147A TWI357008B (en) 2004-04-20 2004-12-16 Illumination spot alignment
CNA2004101040036A CN1690938A (en) 2004-04-20 2004-12-30 Illumination spot alignment
DE102005006161A DE102005006161A1 (en) 2004-04-20 2005-02-10 Lighting spot alignment
JP2005118179A JP2005310142A (en) 2004-04-20 2005-04-15 Illumination spot alignment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US10/827,864 US8325140B2 (en) 2004-04-20 2004-04-20 Illumination spot alignment

Publications (2)

Publication Number Publication Date
US20050231479A1 true US20050231479A1 (en) 2005-10-20
US8325140B2 US8325140B2 (en) 2012-12-04

Family

ID=35095807

Family Applications (1)

Application Number Title Priority Date Filing Date
US10/827,864 Active 2031-10-02 US8325140B2 (en) 2004-04-20 2004-04-20 Illumination spot alignment

Country Status (5)

Country Link
US (1) US8325140B2 (en)
JP (1) JP2005310142A (en)
CN (1) CN1690938A (en)
DE (1) DE102005006161A1 (en)
TW (1) TWI357008B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080158158A1 (en) * 2006-12-29 2008-07-03 Chiang Sun Cheah Optical navigation device adapted for navigation on a transparent plate
US20080252602A1 (en) * 2007-04-11 2008-10-16 Ramakrishna Kakarala Dynamically reconfigurable pixel array for optical navigation
US20100007167A1 (en) * 2006-09-06 2010-01-14 Toyota Boshoku Kabushiki Kaisha Vehicle seat

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5574480A (en) * 1994-06-07 1996-11-12 Kensington Microware Limited Computer pointing device
US5589880A (en) * 1994-01-25 1996-12-31 Hitachi Denshi Kabushiki Kaisha Television camera using two image pickup devices with different sensitivity
US6256016B1 (en) * 1997-06-05 2001-07-03 Logitech, Inc. Optical detection system, device, and method utilizing optical matching
US6433780B1 (en) * 1995-10-06 2002-08-13 Agilent Technologies, Inc. Seeing eye mouse for a computer system
US7086768B2 (en) * 2003-06-20 2006-08-08 Matsushita Electric Industrial Co., Ltd. Illumination device and illuminated input device
US7122781B2 (en) * 2001-12-05 2006-10-17 Em Microelectronic-Marin Sa Method and sensing device for motion detection in an optical pointing device, such as an optical mouse

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5578813A (en) 1995-03-02 1996-11-26 Allen; Ross R. Freehand image scanning device which compensates for non-linear movement
US6462860B1 (en) 2000-06-05 2002-10-08 Hrl Laboratories, Llc Method and apparatus of detection of pulse position modulated optical signals
TW509867B (en) 2001-08-03 2002-11-11 Pixart Imaging Inc Optical mouse image formation system using pinhole imaging
TW573765U (en) 2003-06-20 2004-01-21 Kye Systems Corp Optical pointer device capable of adjusting illumination

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5589880A (en) * 1994-01-25 1996-12-31 Hitachi Denshi Kabushiki Kaisha Television camera using two image pickup devices with different sensitivity
US5574480A (en) * 1994-06-07 1996-11-12 Kensington Microware Limited Computer pointing device
US6433780B1 (en) * 1995-10-06 2002-08-13 Agilent Technologies, Inc. Seeing eye mouse for a computer system
US6256016B1 (en) * 1997-06-05 2001-07-03 Logitech, Inc. Optical detection system, device, and method utilizing optical matching
US20050168445A1 (en) * 1997-06-05 2005-08-04 Julien Piot Optical detection system, device, and method utilizing optical matching
US7122781B2 (en) * 2001-12-05 2006-10-17 Em Microelectronic-Marin Sa Method and sensing device for motion detection in an optical pointing device, such as an optical mouse
US7086768B2 (en) * 2003-06-20 2006-08-08 Matsushita Electric Industrial Co., Ltd. Illumination device and illuminated input device

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100007167A1 (en) * 2006-09-06 2010-01-14 Toyota Boshoku Kabushiki Kaisha Vehicle seat
US20080158158A1 (en) * 2006-12-29 2008-07-03 Chiang Sun Cheah Optical navigation device adapted for navigation on a transparent plate
US7965278B2 (en) * 2006-12-29 2011-06-21 Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. Optical navigation device adapted for navigation on a transparent plate
US20080252602A1 (en) * 2007-04-11 2008-10-16 Ramakrishna Kakarala Dynamically reconfigurable pixel array for optical navigation
US9052759B2 (en) * 2007-04-11 2015-06-09 Avago Technologies General Ip (Singapore) Pte. Ltd. Dynamically reconfigurable pixel array for optical navigation

Also Published As

Publication number Publication date
US8325140B2 (en) 2012-12-04
TWI357008B (en) 2012-01-21
TW200535679A (en) 2005-11-01
DE102005006161A1 (en) 2005-11-17
JP2005310142A (en) 2005-11-04
CN1690938A (en) 2005-11-02

Similar Documents

Publication Publication Date Title
JP4392377B2 (en) Optical device that measures the distance between the device and the surface
US7313255B2 (en) System and method for optically detecting a click event
US8212794B2 (en) Optical finger navigation utilizing quantized movement information
US8525777B2 (en) Tracking motion of mouse on smooth surfaces
JP6302414B2 (en) Motion sensor device having a plurality of light sources
US20220206151A1 (en) Tracking device with improved work surface adaptability
US8638988B2 (en) Movement analysis and/or tracking system
US20070182725A1 (en) Capturing Hand Motion
WO2014162675A1 (en) Motion-sensor device having multiple light sources
US10228772B2 (en) Remote controller
JPH08506193A (en) Device and method for diffusion-assisted localization for visual detection of pens
US20060213280A1 (en) Displacement sensor equipped with automatic setting device for measurement region
TW201633077A (en) Image processing method capable of detecting noise and related navigation device
CN103376897A (en) Method and device for ascertaining a gesture performed in the light cone of a projected image
US8325140B2 (en) Illumination spot alignment
CN103777741B (en) The gesture identification and system followed the trail of based on object
US20080158540A1 (en) Optical navigation device adapted for navigation on a transparent structure
US7193203B1 (en) Method and device for tracking movement between a surface and an imager
JPH10222646A (en) Device for inputting picture and method therefor
US20180130230A1 (en) Recognition apparatus, determination method, and article manufacturing method
JP2006127539A (en) Device for extracting image
JP2004272310A (en) Ultrasonic light coordinate input device
KR20180118584A (en) Apparatus for Infrared sensing footing device, Method for TWO-DIMENSIONAL image detecting and program using the same
JP5195041B2 (en) Pointing device, object recognition device, and program
JP2521156Y2 (en) Position measurement target

Legal Events

Date Code Title Description
AS Assignment

Owner name: AGILENT TECHNOLOGIES, INC., COLORADO

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:XIE, TONG;BROSNAN, MICHAEL;SIAH, TIONG HENG;AND OTHERS;REEL/FRAME:014864/0393;SIGNING DATES FROM 20040330 TO 20040415

Owner name: AGILENT TECHNOLOGIES, INC., COLORADO

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:XIE, TONG;BROSNAN, MICHAEL;SIAH, TIONG HENG;AND OTHERS;SIGNING DATES FROM 20040330 TO 20040415;REEL/FRAME:014864/0393

AS Assignment

Owner name: AVAGO TECHNOLOGIES GENERAL IP PTE. LTD.,SINGAPORE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:AGILENT TECHNOLOGIES, INC.;REEL/FRAME:017206/0666

Effective date: 20051201

Owner name: AVAGO TECHNOLOGIES GENERAL IP PTE. LTD., SINGAPORE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:AGILENT TECHNOLOGIES, INC.;REEL/FRAME:017206/0666

Effective date: 20051201

AS Assignment

Owner name: CITICORP NORTH AMERICA, INC.,DELAWARE

Free format text: SECURITY AGREEMENT;ASSIGNOR:AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.;REEL/FRAME:017207/0882

Effective date: 20051201

Owner name: CITICORP NORTH AMERICA, INC., DELAWARE

Free format text: SECURITY AGREEMENT;ASSIGNOR:AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.;REEL/FRAME:017207/0882

Effective date: 20051201

AS Assignment

Owner name: AVAGO TECHNOLOGIES ECBU IP (SINGAPORE) PTE. LTD.,S

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.;REEL/FRAME:017675/0518

Effective date: 20060127

Owner name: AVAGO TECHNOLOGIES ECBU IP (SINGAPORE) PTE. LTD.,

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.;REEL/FRAME:017675/0518

Effective date: 20060127

STCF Information on status: patent grant

Free format text: PATENTED CASE

AS Assignment

Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD

Free format text: MERGER;ASSIGNOR:AVAGO TECHNOLOGIES ECBU IP (SINGAPORE) PTE. LTD.;REEL/FRAME:030369/0528

Effective date: 20121030

AS Assignment

Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD

Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:CITICORP NORTH AMERICA, INC.;REEL/FRAME:030422/0021

Effective date: 20110331

AS Assignment

Owner name: DEUTSCHE BANK AG NEW YORK BRANCH, AS COLLATERAL AGENT, NEW YORK

Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.;REEL/FRAME:032851/0001

Effective date: 20140506

Owner name: DEUTSCHE BANK AG NEW YORK BRANCH, AS COLLATERAL AG

Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.;REEL/FRAME:032851/0001

Effective date: 20140506

AS Assignment

Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD., SINGAPORE

Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENT RIGHTS (RELEASES RF 032851-0001);ASSIGNOR:DEUTSCHE BANK AG NEW YORK BRANCH, AS COLLATERAL AGENT;REEL/FRAME:037689/0001

Effective date: 20160201

Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD

Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENT RIGHTS (RELEASES RF 032851-0001);ASSIGNOR:DEUTSCHE BANK AG NEW YORK BRANCH, AS COLLATERAL AGENT;REEL/FRAME:037689/0001

Effective date: 20160201

AS Assignment

Owner name: BANK OF AMERICA, N.A., AS COLLATERAL AGENT, NORTH CAROLINA

Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.;REEL/FRAME:037808/0001

Effective date: 20160201

Owner name: BANK OF AMERICA, N.A., AS COLLATERAL AGENT, NORTH

Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.;REEL/FRAME:037808/0001

Effective date: 20160201

AS Assignment

Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE ASSIGNEE NAME PREVIOUSLY RECORDED AT REEL: 017206 FRAME: 0666. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNOR:AGILENT TECHNOLOGIES, INC.;REEL/FRAME:038632/0662

Effective date: 20051201

FPAY Fee payment

Year of fee payment: 4

AS Assignment

Owner name: PIXART IMAGING INC., TAIWAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.;REEL/FRAME:039788/0572

Effective date: 20160805

AS Assignment

Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD., SINGAPORE

Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:BANK OF AMERICA, N.A., AS COLLATERAL AGENT;REEL/FRAME:039862/0129

Effective date: 20160826

Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD

Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:BANK OF AMERICA, N.A., AS COLLATERAL AGENT;REEL/FRAME:039862/0129

Effective date: 20160826

AS Assignment

Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD., SINGAPORE

Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:BANK OF AMERICA, N.A., AS COLLATERAL AGENT;REEL/FRAME:041710/0001

Effective date: 20170119

Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD

Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:BANK OF AMERICA, N.A., AS COLLATERAL AGENT;REEL/FRAME:041710/0001

Effective date: 20170119

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8