|Publication number||US7965278 B2|
|Application number||US 11/618,427|
|Publication date||21 Jun 2011|
|Priority date||29 Dec 2006|
|Also published as||US20080158158|
|Inventors||Chiang Sun Cheah, Chin Heong Yeoh, Chiang Mei Teo|
|Original Assignee||Avago Technologies Ecbu Ip (Singapore) Pte. Ltd.|
An optical navigation device, such as an optical mouse, typically includes a light source to illuminate a navigation surface and an optical navigation sensor integrated circuit (IC). The sensor IC functions as a miniature digital camera that continually collects images of the illuminated navigation surface and determines the speed and direction of the device's movement across the surface by comparing sequentially recorded frames of image information. Image frames are collected at a very high rate, and the resolution of the optical navigation sensor IC is high enough to detect very small movements of the device relative to the navigation surface.
When an optical mouse is used on most opaque surfaces, such as desktops, the collected image frames have enough features for the optical navigation sensor IC to determine relative movement between image frames. However, using an optical mouse on a transparent surface, such as a plate of glass that sits on a desktop, presents unique challenges. In particular, the top surface of the glass is usually too smooth to provide distinguishable features in the collected image frames. Additionally, the thickness of the glass changes the geometry between the light source, the optical navigation sensor IC, and the underlying desktop, such that an insufficient amount of light is reflected from the desktop to the optical navigation sensor IC.
An optical mouse could be designed for dedicated use on a desktop that is covered by a glass plate of known thickness. Although such an application-specific design is possible, it is not practical because an optical mouse may be used on both transparent and opaque surfaces over its lifetime and because the thickness of glass plates that cover desktops is difficult to predict and may change from plate to plate.
An optical navigation device, such as an optical mouse, includes a housing, an illumination system, a tracking engine, and multiple height-specific imaging systems located within the housing. The height-specific imaging systems generate image information in response to reflected light from a navigation surface, and each one of the height-specific imaging systems is positioned to detect the largest portion of reflected light at a different separation distance between the housing and the navigation surface. That is, the optical navigation device includes multiple different imaging systems that are optimally positioned to collect image information at different distances from the navigation surface. This enables the optical navigation system to accurately track relative movement whether the optical mouse sits directly on a navigation surface such as a desktop or on a transparent surface such as a glass plate that lies between the optical mouse and the navigation surface. Further, the multiple different imaging systems enable the optical navigation system to automatically adapt to transparent plates having different thicknesses.
In an embodiment, an integrated circuit (IC) device for optical navigation includes an aperture plate having height-specific apertures and height-specific navigation sensor arrays aligned in one-to-one correspondence with the height-specific apertures and configured to generate image information in response to light that passes through the corresponding height-specific apertures. The IC device also includes a tracking engine configured to output relative movement information in response to the image information. In an embodiment, the tracking engine is configured to select image information from one of the height-specific navigation sensor arrays in response to a comparison of the image information from the plurality of height-specific sensor arrays and to use the selected image information to generate the relative movement information.
In an embodiment, an optical element for use in an optical navigation system that includes an illumination source and that is configured to detect relative movement between the optical navigation system and a navigation surface includes an illumination source lens configured to focus light, which is output from the illumination source, onto the navigation surface and multiple height-specific focal lenses configured to focus light that reflects off the navigation surface, wherein each height-specific focal lens is positioned to receive the largest portion of reflected light at a different separation distance between the optical navigation system and the navigation surface. In an embodiment, the illumination source and the plurality of height-specific focal lenses are linearly aligned.
Other aspects and advantages of the present invention will become apparent from the following detailed description, taken in conjunction with the accompanying drawings, illustrating by way of example the principles of the invention.
Throughout the description similar reference numbers may be used to identify similar elements.
Each height-specific sensor array 134-1-134-6 includes an array of distinct photodetectors (not shown), for example, a 16×16 or 32×32 array of distinct photodetectors configured to detect light that is reflected from the illuminated spot on the navigation surface. Each of the photodetectors in the array generates light intensity information that is output as a digital value (e.g., an 8-bit digital value). Image information is captured by the sensor arrays in sensor-specific frames, where a frame of image information includes a set of simultaneously captured values for each distinct photodetector in the respective sensor array. Image frames captured by the height-specific sensor arrays include data that represents features on the navigation surface 102. The rate of image frame capture and tracking resolution can be programmable. In an embodiment, the image frame capture rate ranges up to 2,300 frames per second with a resolution of 800 counts per inch (cpi). Although some examples of frame capture rates and resolutions are provided, different frame capture rates and resolutions are contemplated.
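The frame format described above can be made concrete with a short sketch. The grid size (16×16) and the 8-bit value range are taken from the examples in the text; the `capture_frame` function itself is a hypothetical stand-in for the hardware capture path, filled with random values rather than real photodetector readings.

```python
import random

def capture_frame(size=16):
    """Hypothetical stand-in for a sensor-array capture: returns a
    size x size grid of 8-bit intensity values (0-255), one value per
    distinct photodetector, as if captured simultaneously in one frame."""
    return [[random.randint(0, 255) for _ in range(size)]
            for _ in range(size)]
```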
The tracking engine 120 compares successive image frames from the same height-specific sensor array to determine the movement of image features between frames. In particular, the tracking engine determines movement by correlating common features that exist in successive image frames from the same sensor array. The movement between image frames is expressed in terms of movement vectors in, for example, X and Y directions (e.g., ΔX and ΔY). The movement vectors are then used to determine the movement of the optical mouse relative to the navigation surface. More detailed descriptions of exemplary navigation sensor movement tracking techniques are provided in U.S. Pat. No. 5,644,139, entitled NAVIGATION TECHNIQUE FOR DETECTING MOVEMENT OF NAVIGATION SENSORS RELATIVE TO AN OBJECT, and U.S. Pat. No. 6,222,174, entitled METHOD OF CORRELATING IMMEDIATELY ACQUIRED AND PREVIOUSLY STORED FEATURE INFORMATION FOR MOTION SENSING, both of which are incorporated by reference herein.
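The correlation step can be illustrated with a minimal sum-of-squared-differences search. This is a simplified stand-in for the patented correlation techniques cited above, not the actual implementation; the `frame_shift` name and the ±2-pixel search window are assumptions for illustration.

```python
def frame_shift(prev, curr, max_shift=2):
    """Estimate the (dx, dy) movement vector between two equal-size
    grayscale frames by testing candidate shifts and keeping the one
    that minimizes the mean squared difference over the overlap region."""
    h, w = len(prev), len(prev[0])
    best = None
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            ssd = n = 0
            for y in range(h):
                for x in range(w):
                    sy, sx = y + dy, x + dx
                    if 0 <= sy < h and 0 <= sx < w:
                        d = prev[y][x] - curr[sy][sx]
                        ssd += d * d
                        n += 1
            score = ssd / n
            if best is None or score < best[0]:
                best = (score, dx, dy)
    return best[1], best[2]  # (dx, dy) movement vector
```

A feature that moves one pixel in X between successive frames yields a movement vector of (1, 0); the tracking engine accumulates such vectors to report relative motion.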
As described above, an optical mouse may be used on a transparent structure such as a glass plate that sits on top of an opaque structure such as a desktop. The optical navigation device 100 depicted in
As depicted in
The light 112 that illuminates the navigation surface 122 is typically scattered in random directions by the navigation surface, and all of the height-specific imaging systems 122-1-122-6 tend to collect image information of varying degrees of quality. Although light is scattered in random directions, a major portion of the collimated light is reflected from the navigation surface at the angle of incidence. Given that a major portion of the light is reflected at the angle of incidence, the height-specific imaging systems can be positioned such that each one of the imaging systems detects the largest portion of the reflected light at a different separation distance. Referring again to
Because light is scattered amongst the height-specific imaging systems 122-1-122-6, the image information generated from the height-specific imaging systems has varying degrees of quality. In general, the more light that is detected by a height-specific imaging system, the more distinguishable features that appear in the corresponding image frames. The more distinguishable features that appear in the image frames, the better the correlation results.
Because all of the height-specific imaging systems 122-1-122-6 tend to generate image information, the tracking engine 120 is configured to select image information from one of the height-specific imaging systems to use for navigation tracking. In an embodiment, the tracking engine compares the image information from the different height-specific imaging systems and selects the image information from the height-specific imaging system that is generating the highest quality image information, for example, the height-specific imaging system that generates image frames with the most distinguishable features. In an embodiment, the quality of the image information is identified based on a comparison of surface quality (SQUAL) values, where a SQUAL value is a measure of the number of valid features on a navigation surface that are visible to a sensor array in the current image frame. Although SQUAL is provided as one example, other techniques for comparing the quality of the image information from the different height-specific imaging systems can be used to select the optimal image information for use in navigation tracking.
In operation, the tracking engine 120 generates a different SQUAL value for each height-specific imaging system 122-1-122-6 using image information from each height-specific imaging system. The tracking engine then compares the SQUAL values and selects the source of the highest SQUAL value as the source of the image information that is used for navigation tracking. In an embodiment, the tracking engine periodically evaluates the SQUAL values of all of the height-specific imaging systems to determine if the optimal image information is still being used to generate the relative movement information. If the image information corresponding to the highest SQUAL value is not being used to generate the relative movement information, then an appropriate change can be made. The relative differences between SQUAL values should stay fairly constant during normal use such that the source that corresponds to the actual separation distance (Z distance) maintains the highest SQUAL value relative to the other SQUAL values.
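The selection and periodic re-evaluation behavior described above can be sketched as follows. `SqualSelector` is a hypothetical helper, not named in the text; it assumes a strict comparison when deciding whether to switch sources, so equal SQUAL values from adjacent systems leave the current selection in place, consistent with the tie-handling described in the next paragraph.

```python
class SqualSelector:
    """Illustrative sketch of the tracking engine's source selection:
    keep the current image-information source until a periodic
    re-evaluation shows another height-specific imaging system
    reporting a strictly higher SQUAL value."""

    def __init__(self):
        self.current = 0  # index of the imaging system currently in use

    def reevaluate(self, squal_values):
        # Find the system reporting the highest SQUAL value ...
        best = max(range(len(squal_values)), key=squal_values.__getitem__)
        # ... and switch only if it strictly beats the current source,
        # so ties (separation distance between two systems) keep the
        # existing selection.
        if squal_values[best] > squal_values[self.current]:
            self.current = best
        return self.current
```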
In an embodiment, if the SQUAL values of two adjacent height-specific imaging systems 122-1-122-6 are the same, or nearly the same, (e.g., when the separation distance Z is between two height-specific imaging systems), either one of the height-specific imaging systems can be selected.
Given the position of the height-specific imaging systems 122-1-122-6 and the process for selecting the image information for use in navigation tracking, the optical mouse is able to adapt to different separation distances between the optical mouse and the navigation surface. For example, if the optical mouse is used directly on a navigation surface, such that the separation distance is Z=1, then image information from the height-specific imaging system 122-1 is used by the tracking engine to generate relative movement information. If on the other hand the optical mouse is used on a transparent plate that sits between the optical mouse and the navigation surface, then image information from one of the other height-specific imaging systems is used by the tracking engine to generate relative movement information. The source of the selected image information is a function of the separation distance between the optical mouse and the navigation surface. When the optical mouse is used on a transparent plate, the selected image information is a function of the thickness of the transparent plate between the optical mouse and the navigation surface. For example, if the thickness of the transparent plate sets the separation distance at Z=3, then the image information from height-specific imaging system 122-3 is used to generate relative movement information. Likewise, if the thickness of the transparent plate sets the separation distance at Z=5, then the image information from the height-specific imaging system 122-5 is used to generate relative movement information. The same is true for the other identified separation distances. Further, if the separation distance changes during use, the change in SQUAL value will be recognized and image information from a more appropriate (e.g., higher SQUAL value) height-specific imaging system will be selected.
The total depth of field of the navigation sensor system is a function of the number and positioning of the height-specific imaging systems 122-1-122-6. Although six height-specific imaging systems are described in the examples, other numbers of height-specific imaging systems are possible. Even a navigation sensor system with only two height-specific imaging systems would provide the ability to adapt to use both directly on a navigation surface and on a transparent plate that sits over the navigation surface. In an embodiment, smooth transitions between height-specific imaging systems are achieved by positioning the height-specific imaging systems directly adjacent to each other. For example, the height-specific imaging systems are configured such that the respective height-specific sensor arrays are side-by-side.
In an embodiment, each height-specific imaging system enables navigation tracking within a range of separation distances. For example, each height-specific imaging system enables adequate navigation tracking within ±0.2 mm of its nominal separation distance Z. The range of separation distances over which navigation tracking is adequate is referred to as the “depth of field.” In an embodiment, adjacent height-specific imaging systems are positioned such that the depths of field of each pair of adjacent imaging systems slightly overlap. This slight overlap results in a continuous depth of field that includes the ranges of all of the height-specific imaging systems. For example, the combined and continuous depth of field in the example of
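The overlap requirement can be checked numerically. The sketch below uses the ±0.2 mm half-range from the example above, while the nominal heights passed in are assumed values for illustration; it merges the per-system intervals and reports the combined range, or None if a gap breaks continuity.

```python
def combined_depth_of_field(centers, half_range=0.2):
    """Merge per-system depths of field (center +/- half_range, in mm)
    and return the combined (low, high) range, or None if adjacent
    depths of field fail to overlap and leave a gap."""
    intervals = sorted((c - half_range, c + half_range) for c in centers)
    lo, hi = intervals[0]
    for a, b in intervals[1:]:
        if a > hi:
            return None  # gap: total depth of field is not continuous
        hi = max(hi, b)
    return lo, hi
```

With nominal heights spaced 0.35 mm apart, each pair of adjacent ±0.2 mm ranges overlaps by 0.05 mm, yielding one continuous combined range.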
In an embodiment, the height-specific sensor arrays 134-1-134-6 and the tracking engine are fabricated on the same substrate.
In an embodiment, the height-specific focal lenses 130-1-130-6 and the illumination system optics 110 are integrated into a single optical element.
As described above, the purpose of the optical element is to collimate and guide light from a light source to illuminate a spot on the navigation surface and to focus reflected light onto the height-specific sensor arrays.
Some users may rapidly change the separation distance between the optical mouse and the navigation surface with lifting and/or diving actions. Lifting and/or diving actions result in changes to the quality of the image information generated at each height-specific imaging system. For example, the highest SQUAL value will move between the different height-specific imaging systems as the optical mouse moves between the different separation distances (e.g., from Z=6 to Z=1).
An advantage of utilizing height-specific imaging systems to adapt an optical mouse to different separation distances is that this technique requires no moving parts and no adjustments to the geometry of the optical mouse, including the navigation sensor system 106 and the illumination system 102. That is, an optical mouse that utilizes height-specific imaging systems as described above requires no physical changes to be made by a user to adapt the optical mouse for use on a glass surface or for use on glass surfaces of different thicknesses.
Although specific embodiments of the invention have been described and illustrated, the invention is not to be limited to the specific forms or arrangements of parts as described and illustrated herein. The invention is limited only by the claims.
|Cited Patent||Filing date||Publication date||Applicant||Title|
|US5644139||14 Aug 1996||1 Jul 1997||Allen; Ross R.||Navigation technique for detecting movement of navigation sensors relative to an object|
|US6078312 *||9 Jul 1997||20 Jun 2000||Gateway 2000, Inc.||Pointing device with absolute and relative positioning capability|
|US6222174||5 Mar 1999||24 Apr 2001||Hewlett-Packard Company||Method of correlating immediately acquired and previously stored feature information for motion sensing|
|US6281882||30 Mar 1998||28 Aug 2001||Agilent Technologies, Inc.||Proximity detector for a seeing eye mouse|
|US6433780||2 Jan 2001||13 Aug 2002||Agilent Technologies, Inc.||Seeing eye mouse for a computer system|
|US6462330 *||24 Mar 2000||8 Oct 2002||Microsoft Corporation||Cover with integrated lens for integrated chip optical sensor|
|US7339575||25 May 2004||4 Mar 2008||Avago Technologies Ecbu Ip Pte Ltd||Optical pointing device with variable focus|
|US20020080121 *||7 May 2001||27 Jun 2002||Samsung Electro-Mechanics Co.,Ltd||Optical mouse|
|US20030193529||11 Apr 2003||16 Oct 2003||Lee Bang Won||Navigation system and navigation method|
|US20040189593||31 Mar 2003||30 Sep 2004||Koay Ban Kuan||Optical mouse adapted for use on glass surfaces|
|US20050060668||11 Apr 2003||17 Mar 2005||Lee Bang Won||Navigation system and navigation method|
|US20050078087 *||8 Oct 2003||14 Apr 2005||Universal Electronics Inc.||Control device having integrated mouse and remote control capabilities|
|US20050231479 *||20 Apr 2004||20 Oct 2005||Tong Xie||Illumination spot alignment|
|US20050264531 *||25 May 2004||1 Dec 2005||Tai Li C||Optical pointing device with variable focus|
|US20060086712 *||25 Oct 2004||27 Apr 2006||Feldmeier David C||Safety device for flat irons based on optical motion detection|
|US20070296699||23 Jun 2006||27 Dec 2007||Microsoft Corporation||Multi-mode optical navigation|
|US20080061219||25 Aug 2006||13 Mar 2008||Wui Pin Lee||Lift detection adapted for navigation on a transparent structure|
|US20080252602 *||11 Apr 2007||16 Oct 2008||Ramakrishna Kakarala||Dynamically reconfigurable pixel array for optical navigation|
|DE69808522T2||25 Aug 1998||7 Aug 2003||Holding B E V S A||Device for image processing (Gerät zur Bildverarbeitung)|
|20 Feb 2007||AS||Assignment|
Owner name: AVAGO TECHNOLOGIES ECBU IP (SINGAPORE) PTE. LTD.,
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CHEAH, CHIANG SUN;YEOH, CHIN HEONG;TEO, CHIANG MEI;REEL/FRAME:018905/0716
Effective date: 20061219
|7 May 2013||AS||Assignment|
Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.
Free format text: MERGER;ASSIGNOR:AVAGO TECHNOLOGIES ECBU IP (SINGAPORE) PTE. LTD.;REEL/FRAME:030369/0528
Effective date: 20121030
|8 May 2014||AS||Assignment|
Owner name: DEUTSCHE BANK AG NEW YORK BRANCH, AS COLLATERAL AGENT
Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.;REEL/FRAME:032851/0001
Effective date: 20140506
|19 Nov 2014||FPAY||Fee payment|
Year of fee payment: 4
|2 Feb 2016||AS||Assignment|
Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.
Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENT RIGHTS (RELEASES RF 032851-0001);ASSIGNOR:DEUTSCHE BANK AG NEW YORK BRANCH, AS COLLATERAL AGENT;REEL/FRAME:037689/0001
Effective date: 20160201
|11 Feb 2016||AS||Assignment|
Owner name: BANK OF AMERICA, N.A., AS COLLATERAL AGENT, NORTH
Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.;REEL/FRAME:037808/0001
Effective date: 20160201