US20060152482A1 - Virtual interface and control device - Google Patents
- Publication number
- US20060152482A1 (application Ser. No. 11/327,785)
- Authority
- US
- United States
- Prior art keywords
- antennae
- frequency
- input device
- computer
- programmable device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/046—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by electromagnetic means
Abstract
An input device for a computer or other programmable device translates the proximity of an object to one or more antennae into an electronic signal. The antennae generate a first frequency and a second frequency. When an object, such as a hand, is placed in proximity to the antenna, the object causes the first and second frequencies to heterodyne, which creates a third frequency, also referred to as a beat frequency or pulse frequency. A receiver interprets the pulse frequency and translates it into an electronic signal that can be used to command a computer or other programmable device.
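As a rough numerical illustration of the heterodyne principle the abstract describes (the frequencies and sample rate below are invented for the demo; the patent specifies no concrete values), mixing two tones produces spectral components at their sum and difference frequencies, and the difference component is the beat (pulse) frequency a receiver could track:

```python
import numpy as np

# Illustrative values only; the patent does not specify frequencies.
f1, f2 = 1000.0, 1040.0            # Hz: reference and disturbed frequency
fs = 20000.0                       # sample rate, Hz
t = np.arange(0, 1.0, 1.0 / fs)    # one second of samples

# Mixing (multiplying) the two tones yields components at f1+f2 and |f1-f2|.
mixed = np.cos(2 * np.pi * f1 * t) * np.cos(2 * np.pi * f2 * t)

# Locate the two strongest spectral peaks of the mixed signal.
spectrum = np.abs(np.fft.rfft(mixed))
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
peaks = np.sort(freqs[np.argsort(spectrum)[-2:]])

beat_frequency = peaks[0]          # near |f1 - f2| = 40 Hz
sum_frequency = peaks[1]           # near f1 + f2 = 2040 Hz
```

In practice a mixer stage would low-pass-filter away the sum component and keep only the beat term, which is what makes small frequency shifts easy to measure.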
Description
- This patent application is related to, and claims the benefit of, U.S. Provisional Patent Application Ser. No. 60/641,809 filed Jan. 7, 2005, which application is incorporated herein by this reference thereto.
- A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent disclosure as it appears in the U.S. Patent and Trademark Office patent files or records, but otherwise reserves all copyright rights whatsoever.
- 1. Field of the Invention
- The present invention is related to methods and devices for interfacing with electronic devices that receive commands from an operator, such as computer systems.
- 2. Description of the Related Art
- The following description and discussion of the prior art is undertaken in order to provide background information so that the present invention may be completely understood and appreciated in its proper context.
- Input devices for use with a computer have transformed significantly over the last three decades. Generally speaking, punch cards gave way to terminals with keyboards; keyboards gave way to the mouse. The mouse has evolved from a unit housing a ball interacting with motion detectors, to a number of variants, some of which are as follows:
- U.S. Pat. No. 6,313,825 to Gilbert discloses an input device for a computer that detects movement of an object, such as a finger, within a selected field of space. The input device is used to control movement of a cursor over a display device. The device includes directional transducers that receive reflections of EMF from an object in the field, and provides signals to an “interpreter.” The interpreter detects movements by employing a clock which determines the time difference between the reflections received by the transducers, which it then reduces to a signal that controls the cursor.
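The clocked time-difference idea in the Gilbert summary above can be sketched as a toy calculation (the propagation speed, geometry, and scaling here are invented for illustration, not taken from the patent):

```python
# Toy version of the interpreter's clock comparison: the difference in
# arrival times of a reflection at two transducers maps to a signed
# lateral offset of the object. Speed and geometry are invented values.

SPEED = 3.0e8  # assumed propagation speed of the reflected EMF, m/s

def offset_from_time_difference(t_left, t_right, speed=SPEED):
    """Positive result: the reflection reached the right transducer first,
    so the object sits nearer the right side; negative means the left."""
    return 0.5 * speed * (t_left - t_right)

# Reflection arrives 1 ns earlier at the right transducer:
offset = offset_from_time_difference(2.0e-9, 1.0e-9)
```

Tracking this offset from clock tick to clock tick is what would let such an interpreter reduce the raw time differences to a cursor-motion signal.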
- U.S. Pat. No. 6,690,357 to Dunton discloses an input device that uses images of input devices, and scanning sensors that detect user interaction with those images. The scanning sensors include digital video cameras that capture the movement of a user's hands and convert the movement into input command signals. The scanning sensors may alternatively sense the projected light reflected from the user's hands, or may detect the combination of the reflected projected light and the user's hands.
- U.S. Pat. No. 6,614,422 to Rafii, et al., discloses an input device that employs three-dimensional sensor imaging to capture three-dimensional data as to the placement of a user's fingers on a substrate that either bears or displays a template similar to a keyboard or a keypad. The three-dimensional sensor transmits optically acquired data to a companion computer system that computes the velocity and location of the user's fingers, and converts that information into a command.
- U.S. Pat. No. 6,498,628 to Iwamura discloses an electronic appliance remote controller that employs a camera as a motion-sensing interface. The camera captures video images of a users' hand, evaluates the moving speed and direction of the hand, and correspondingly moves a cursor appearing on a screen.
- In U.S. Patent Application Publication 2003/0048312, Zimmerman discloses an apparatus for generating control signals for the manipulation of virtual objects in a computer system. The apparatus includes a glove worn on a hand that includes sensors for detecting the gestures of the hand, and hand position. The computer system receives data from the sensors, and generates corresponding control signals in response.
- U.S. Patent Application Publication 2002/0075240, Lieberman, et al., describes a device for inputting alphanumeric information into a computer that employs sensors that may be optical, acoustic or position sensors to sense the “pressing” or “striking” of virtual keys. The sensor then forwards data to a processor, which converts the “pressing” or “striking” data into characters, instructions, information or data.
- U.S. Patent Application Publication 2002/0006807, Mantyjarvi, et al., teaches a device for entering data that creates a virtual keyboard by using an infrared transceiver arrangement. The infrared transceivers record reflection data obtained from an object placed within a field of infrared light, and processes the data to correspond to a key position or function.
- The invention comprises an input device for a computer or other programmable circuit that translates the proximity of an object to one or more antennae into an electronic signal. The antennae generate a reference first frequency and a second frequency.
- FIG. 1 is a perspective view of a preferred embodiment of the invention.
- FIG. 2 is a side view of a preferred embodiment of the invention.
- FIG. 3 is a perspective view of a preferred embodiment of the invention as used with a personal computer.
- In the following detailed description of the preferred embodiments, reference is made to the accompanying drawings, in which are shown by way of illustration specific embodiments in which the invention may be practiced. It is to be understood that other embodiments may be utilized and structural changes may be made without departing from the scope of the present invention.
- FIG. 1 shows a perspective view of a preferred embodiment of the invention that may be used to operate a programmable device, such as a computer. A first antenna 1 and a second antenna 2 are rotatably and pivotally attached to control pad 3. Control pad 3 may include a third antenna 4, which may be embedded in, or externally attached to, control pad 3. Control pad 3 may also include buttons 5 and 6, which can correspond to the left and right buttons found on a conventional mouse.
- Antennas 1 and 2 may be constructed from conventional materials known to those of ordinary skill in the art. Antennas 1 and 2 may be rotatably and pivotally attached to control pad 3 by a combination of actuators that position the antennae in optimal relationships based upon feedback from the system driver.
- Control pad 3 may resemble a conventional mouse pad known to those in the art. Control pad 3 may be constructed from any non-conductive material that is electromagnetically invisible to signals emitted or received by the antennae. Antenna 4 may be formed from conventional materials, and can be embedded within control pad 3, or may be attached externally in a manner similar to antennae 1 and 2.
- FIG. 2 depicts a side view where antennae 1 and 2 are in vertical positions relative to control pad 3, which is merely an example of how the antennae may be positioned. In practice, antennae 1 and 2 may be positioned in any positions relative to each other and control pad 3 to achieve optimal transmission and reception.
- FIG. 3 presents a perspective view of the invention as applied to a conventional computer 7. While FIG. 3 depicts the invention being used with a personal computer (“PC”), it is important to note that the invention can be used with any size or type of computer or device that depends on input from a human operator. Examples include, but are not limited to, notebook computers, laptop computers, workstations, video game consoles, cash registers, automatic tellers, vehicle electronics or surgical/medical devices.
- This invention may be used with one or more antennae. In operation, the antenna or antennae 1, 2 and 4 act as both emitters and receivers of electromagnetic fields. The antennae are operated in an electromagnetic spectrum range of 3 Hz to 1.24 eV. The antenna or antennae are arrayed in various arrangements depending upon the particular application and the current frequency range being used. Each of the antennae initially emits a reference frequency. When an object, such as a hand, is placed in proximity to control pad 3, the object creates a disturbance to the field. This disturbance is registered as a change in value. The value change is translated into a coordinate by a device driver or other software conventionally installed in the device to be controlled. As the object moves within the field, the change in coordinates may be expressed as a command to the device to be controlled, for example, the movement of a cursor on a computer screen. If desired, the invention can register more than one disturbance to the field at a given time, giving the ability to convey more complex commands to a device to be controlled than can be achieved through conventional means.
- In an alternative embodiment, two or more interfaces will be linked with an imaging device that projects three-dimensional images. An example of such a three-dimensional imaging device is a holographic projector. The field emitted by the interfaces can overlay the projection. Through device drivers or other software programmed into a programmable circuit, attempts to interact with the images in the three-dimensional projection will be captured by the interface and will enable the user to move the virtual objects.
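The disturbance-to-coordinate translation described above can be sketched as minimal driver-side logic (the per-antenna scalar readings, the gain, and the axis mapping are all illustrative assumptions, not details from the patent):

```python
# Hypothetical driver logic: each antenna reports a scalar field value;
# the change between successive readings is scaled into a coordinate
# delta and accumulated into a cursor position. All numbers are invented.

def disturbance_to_delta(prev, curr, gain=5.0):
    """Translate per-antenna value changes into a (dx, dy) cursor step."""
    dx = gain * (curr[0] - prev[0])   # antenna 1 assumed to drive the x axis
    dy = gain * (curr[1] - prev[1])   # antenna 2 assumed to drive the y axis
    return dx, dy

def track(readings, start=(0.0, 0.0)):
    """Fold a stream of (antenna1, antenna2) readings into cursor positions."""
    x, y = start
    path = [(x, y)]
    for prev, curr in zip(readings, readings[1:]):
        dx, dy = disturbance_to_delta(prev, curr)
        x, y = x + dx, y + dy
        path.append((x, y))
    return path

# A hand drifting toward antenna 1 while receding from antenna 2:
path = track([(0.0, 1.0), (0.2, 0.8), (0.4, 0.6)])
```

Registering several simultaneous disturbances, as the text suggests, would amount to running one such accumulator per tracked object.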
- The invention will provide software specifically designed to relate two- and three-dimensional motion in two- and three-dimensional images as represented by a programmable circuit. This software can also interpret disturbances to the field for programmable circuits designed to control the motion of mechanical devices. The software may have a specific user interface that is modifiable for the user.
- The invention may be linked electronically (wirelessly) or mechanically to the device to be controlled or object device, and can be powered by battery, a separate AC connection, by the object device, or any other conventional means known to those in the art. Two or more of these inventions may be connected to the device to be controlled so that a single user may use both hands simultaneously, or that multiple users can control the device. If used in conjunction with a computer as depicted in FIG. 3, buttons 5 and 6 may be used as on a conventional mouse, or alternatively, the optional third antenna 4 can be employed to interpret movement in three dimensions so that the invention can replicate the conventional functions of buttons 5 and 6 electronically, in a manner familiar to users of conventional mouse devices.
- Although specific embodiments have been illustrated and described herein, it is appreciated by those of ordinary skill in the art that any arrangement which is calculated to achieve the same purpose may be substituted for the specific embodiments shown. This application is intended to cover any adaptations or variations of the present invention. Therefore, it is intended that the invention be limited only by the claims and their equivalents.
Claims (7)
1. An input device, comprising:
at least one antennae for generating a first frequency and a second frequency; and
at least one receiver, for sensing a pulse frequency created by an object placed in proximity to said receiver, interpreting said pulse frequency, and translating said pulse frequency into an electronic signal that can be used to command a programmable device.
2. The input device of claim 1, wherein the first frequency and second frequency are emitted over a holographic projection.
3. The input device of claim 2, wherein the at least one antennae is physically connected to a programmable device.
4. The input device of claim 2, wherein the at least one antennae is wirelessly connected to a programmable device.
5. The input device of claim 1, wherein the at least one antennae generates a third frequency.
6. The input device of claim 5, wherein the at least one antennae is physically connected to a programmable device.
7. The input device of claim 5, wherein the at least one antennae is electronically connected to a programmable device.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/327,785 US20060152482A1 (en) | 2005-01-07 | 2006-01-06 | Virtual interface and control device |
US13/174,357 US8358283B2 (en) | 2005-01-07 | 2011-06-30 | Virtual interface and control device |
US13/746,244 US8823648B2 (en) | 2005-01-07 | 2013-01-21 | Virtual interface and control device |
US15/252,066 USRE48054E1 (en) | 2005-01-07 | 2016-08-30 | Virtual interface and control device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US64180905P | 2005-01-07 | 2005-01-07 | |
US11/327,785 US20060152482A1 (en) | 2005-01-07 | 2006-01-06 | Virtual interface and control device |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/174,357 Continuation US8358283B2 (en) | 2005-01-07 | 2011-06-30 | Virtual interface and control device |
Publications (1)
Publication Number | Publication Date |
---|---|
US20060152482A1 true US20060152482A1 (en) | 2006-07-13 |
Family
ID=36652769
Family Applications (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/327,785 Abandoned US20060152482A1 (en) | 2005-01-07 | 2006-01-06 | Virtual interface and control device |
US13/174,357 Active US8358283B2 (en) | 2005-01-07 | 2011-06-30 | Virtual interface and control device |
US13/746,244 Ceased US8823648B2 (en) | 2005-01-07 | 2013-01-21 | Virtual interface and control device |
US15/252,066 Expired - Fee Related USRE48054E1 (en) | 2005-01-07 | 2016-08-30 | Virtual interface and control device |
Family Applications After (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/174,357 Active US8358283B2 (en) | 2005-01-07 | 2011-06-30 | Virtual interface and control device |
US13/746,244 Ceased US8823648B2 (en) | 2005-01-07 | 2013-01-21 | Virtual interface and control device |
US15/252,066 Expired - Fee Related USRE48054E1 (en) | 2005-01-07 | 2016-08-30 | Virtual interface and control device |
Country Status (1)
Country | Link |
---|---|
US (4) | US20060152482A1 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013095985A1 (en) * | 2011-12-22 | 2013-06-27 | Smsc, S.A.R.L. | Gesturing architecture using proximity sensing |
GB2515830A (en) * | 2013-07-05 | 2015-01-07 | Broadcom Corp | Method and apparatus for use in a radio communication device |
KR101873749B1 (en) * | 2012-01-26 | 2018-07-03 | 엘지전자 주식회사 | Mobile Terminal |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060152482A1 (en) * | 2005-01-07 | 2006-07-13 | Chauncy Godwin | Virtual interface and control device |
KR101789683B1 (en) * | 2011-06-13 | 2017-11-20 | 삼성전자주식회사 | Display apparatus and Method for controlling display apparatus and remote controller |
US10289771B2 (en) * | 2015-12-16 | 2019-05-14 | Dassault Systemes | Modification of a constrained asymmetrical subdivision mesh |
Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5973677A (en) * | 1997-01-07 | 1999-10-26 | Telxon Corporation | Rechargeable, untethered electronic stylus for computer with interactive display screen |
US6313825B1 (en) * | 1998-12-28 | 2001-11-06 | Gateway, Inc. | Virtual input device |
US20020006807A1 (en) * | 2000-06-28 | 2002-01-17 | Jani Mantyjarvi | Method and arrangement for entering data in an electronic apparatus and an electronic apparatus |
US20020021287A1 (en) * | 2000-02-11 | 2002-02-21 | Canesta, Inc. | Quasi-three-dimensional method and apparatus to detect and localize interaction of user-object and virtual transfer device |
US20020075334A1 (en) * | 2000-10-06 | 2002-06-20 | Yfantis Evangelos A. | Hand gestures and hand motion for replacing computer mouse events |
US20020075240A1 (en) * | 2000-05-29 | 2002-06-20 | Vkb Inc | Virtual data entry device and method for input of alphanumeric and other data |
US6498628B2 (en) * | 1998-10-13 | 2002-12-24 | Sony Corporation | Motion sensing interface |
US20030165048A1 (en) * | 2001-12-07 | 2003-09-04 | Cyrus Bamji | Enhanced light-generated interface for use with electronic devices |
US20030193479A1 (en) * | 2000-05-17 | 2003-10-16 | Dufaux Douglas Paul | Optical system for inputting pointer and character data into electronic equipment |
US6650318B1 (en) * | 2000-10-13 | 2003-11-18 | Vkb Inc. | Data input device |
US6654001B1 (en) * | 2002-09-05 | 2003-11-25 | Kye Systems Corp. | Hand-movement-sensing input device |
US20030218761A1 (en) * | 2002-05-22 | 2003-11-27 | Carlo Tomasi | Method and apparatus for approximating depth of an object's placement onto a monitored region with applications to virtual interface devices |
US6690357B1 (en) * | 1998-10-07 | 2004-02-10 | Intel Corporation | Input device using scanning sensors |
US20050276448A1 (en) * | 2000-07-07 | 2005-12-15 | Pryor Timothy R | Multi-functional control and entertainment systems |
Family Cites Families (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4524348A (en) * | 1983-09-26 | 1985-06-18 | Lefkowitz Leonard R | Control interface |
US4988981B1 (en) | 1987-03-17 | 1999-05-18 | Vpl Newco Inc | Computer data entry and manipulation apparatus and method |
US5392054A (en) * | 1993-01-29 | 1995-02-21 | Ericsson Ge Mobile Communications Inc. | Diversity antenna assembly for portable radiotelephones |
WO1995022097A2 (en) * | 1994-02-15 | 1995-08-17 | Monamed Medizintechnik Gmbh | A computer pointing device |
DE9411602U1 (en) * | 1994-07-18 | 1995-08-17 | Siemens Ag | Arrangement for the detection of objects in an area to be monitored |
GB9521072D0 (en) * | 1995-10-14 | 1995-12-20 | Rank Xerox Ltd | Calibration of an interactive desktop system |
US5990865A (en) * | 1997-01-06 | 1999-11-23 | Gard; Matthew Davis | Computer interface device |
DE19723331B4 (en) * | 1997-06-04 | 2010-11-11 | Ipcom Gmbh & Co. Kg | radio set |
US6064354A (en) * | 1998-07-01 | 2000-05-16 | Deluca; Michael Joseph | Stereoscopic user interface method and apparatus |
US6614422B1 (en) | 1999-11-04 | 2003-09-02 | Canesta, Inc. | Method and apparatus for entering data using a virtual input device |
US6738044B2 (en) * | 2000-08-07 | 2004-05-18 | The Regents Of The University Of California | Wireless, relative-motion computer input device |
TWI313835B (en) * | 2002-06-04 | 2009-08-21 | Koninkl Philips Electronics Nv | Method of measuring the movement of an object relative to a user's input device and related input device,mobile phone apparatus, cordless phone apparatus, laptor computer, mouse and remote control |
US20040041828A1 (en) * | 2002-08-30 | 2004-03-04 | Zellhoefer Jon William | Adaptive non-contact computer user-interface system and method |
JP4217043B2 (en) * | 2002-09-20 | 2009-01-28 | 京セラ株式会社 | Adaptive array radio communication apparatus, reception level display method, reception level adjustment method, reception level display program, and reception level adjustment program |
EP1413974A1 (en) * | 2002-10-24 | 2004-04-28 | Hewlett-Packard Company | Hybrid sensing techniques for position determination |
US7492351B2 (en) * | 2003-12-18 | 2009-02-17 | Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. | Optical navigation based on laser feedback or laser interferometry |
US7492357B2 (en) * | 2004-05-05 | 2009-02-17 | Smart Technologies Ulc | Apparatus and method for detecting a pointer relative to a touch surface |
US20060152482A1 (en) * | 2005-01-07 | 2006-07-13 | Chauncy Godwin | Virtual interface and control device |
US7534988B2 (en) * | 2005-11-08 | 2009-05-19 | Microsoft Corporation | Method and system for optical tracking of a pointing object |
US7414705B2 (en) * | 2005-11-29 | 2008-08-19 | Navisense | Method and system for range measurement |
US8139029B2 (en) * | 2006-03-08 | 2012-03-20 | Navisense | Method and device for three-dimensional sensing |
US8614669B2 (en) * | 2006-03-13 | 2013-12-24 | Navisense | Touchless tablet method and system thereof |
US8578282B2 (en) * | 2006-03-15 | 2013-11-05 | Navisense | Visual toolkit for a virtual user interface |
KR100851977B1 (en) * | 2006-11-20 | 2008-08-12 | 삼성전자주식회사 | Controlling Method and apparatus for User Interface of electronic machine using Virtual plane. |
US8166421B2 (en) * | 2008-01-14 | 2012-04-24 | Primesense Ltd. | Three-dimensional user interface |
CN102334086A (en) * | 2009-01-26 | 2012-01-25 | 泽罗技术(2009)有限公司 | Device and method for monitoring an object's behavior |
US9335825B2 (en) * | 2010-01-26 | 2016-05-10 | Nokia Technologies Oy | Gesture control |
US20110298708A1 (en) * | 2010-06-07 | 2011-12-08 | Microsoft Corporation | Virtual Touch Interface |
US8730162B1 (en) * | 2011-04-07 | 2014-05-20 | Google Inc. | Methods and apparatus related to cursor device calibration |
US8933913B2 (en) * | 2011-06-28 | 2015-01-13 | Microsoft Corporation | Electromagnetic 3D stylus |
- 2006-01-06: US application 11/327,785, published as US20060152482A1 (Abandoned)
- 2011-06-30: US application 13/174,357, issued as US8358283B2 (Active)
- 2013-01-21: US application 13/746,244, issued as US8823648B2 (Ceased)
- 2016-08-30: US application 15/252,066, issued as USRE48054E1 (Expired - Fee Related)
Patent Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5973677A (en) * | 1997-01-07 | 1999-10-26 | Telxon Corporation | Rechargeable, untethered electronic stylus for computer with interactive display screen |
US6690357B1 (en) * | 1998-10-07 | 2004-02-10 | Intel Corporation | Input device using scanning sensors |
US6498628B2 (en) * | 1998-10-13 | 2002-12-24 | Sony Corporation | Motion sensing interface |
US6313825B1 (en) * | 1998-12-28 | 2001-11-06 | Gateway, Inc. | Virtual input device |
US20020021287A1 (en) * | 2000-02-11 | 2002-02-21 | Canesta, Inc. | Quasi-three-dimensional method and apparatus to detect and localize interaction of user-object and virtual transfer device |
US20030193479A1 (en) * | 2000-05-17 | 2003-10-16 | Dufaux Douglas Paul | Optical system for inputting pointer and character data into electronic equipment |
US20020075240A1 (en) * | 2000-05-29 | 2002-06-20 | Vkb Inc | Virtual data entry device and method for input of alphanumeric and other data |
US20020006807A1 (en) * | 2000-06-28 | 2002-01-17 | Jani Mantyjarvi | Method and arrangement for entering data in an electronic apparatus and an electronic apparatus |
US20050276448A1 (en) * | 2000-07-07 | 2005-12-15 | Pryor Timothy R | Multi-functional control and entertainment systems |
US20020075334A1 (en) * | 2000-10-06 | 2002-06-20 | Yfantis Evangelos A. | Hand gestures and hand motion for replacing computer mouse events |
US6650318B1 (en) * | 2000-10-13 | 2003-11-18 | Vkb Inc. | Data input device |
US20030165048A1 (en) * | 2001-12-07 | 2003-09-04 | Cyrus Bamji | Enhanced light-generated interface for use with electronic devices |
US20030218761A1 (en) * | 2002-05-22 | 2003-11-27 | Carlo Tomasi | Method and apparatus for approximating depth of an object's placement onto a monitored region with applications to virtual interface devices |
US6654001B1 (en) * | 2002-09-05 | 2003-11-25 | Kye Systems Corp. | Hand-movement-sensing input device |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013095985A1 (en) * | 2011-12-22 | 2013-06-27 | Smsc, S.A.R.L. | Gesturing architecture using proximity sensing |
CN104272233A (en) * | 2011-12-22 | 2015-01-07 | Smsc控股有限责任公司 | Gesturing architecture using proximity sensing |
JP2015503783A (en) * | 2011-12-22 | 2015-02-02 | エスエムエスツェー, エス.アー.エール.エル. | Gesture motion architecture using proximity sensing |
KR20150028762A (en) * | 2011-12-22 | 2015-03-16 | 에스엠에스씨 홀딩스 에스에이알엘 | Gesturing Arcitecture Using Proximity Sensing |
US9298333B2 (en) | 2011-12-22 | 2016-03-29 | Smsc Holdings S.A.R.L. | Gesturing architecture using proximity sensing |
KR102050508B1 (en) | 2011-12-22 | 2019-11-29 | 에스엠에스씨 홀딩스 에스에이알엘 | Gesturing Arcitecture Using Proximity Sensing |
KR101873749B1 (en) * | 2012-01-26 | 2018-07-03 | 엘지전자 주식회사 | Mobile Terminal |
GB2515830A (en) * | 2013-07-05 | 2015-01-07 | Broadcom Corp | Method and apparatus for use in a radio communication device |
Also Published As
Publication number | Publication date |
---|---|
US8358283B2 (en) | 2013-01-22 |
US20130222242A1 (en) | 2013-08-29 |
US20120025959A1 (en) | 2012-02-02 |
US8823648B2 (en) | 2014-09-02 |
USRE48054E1 (en) | 2020-06-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
USRE48054E1 (en) | Virtual interface and control device | |
US11099655B2 (en) | System and method for gesture based data and command input via a wearable device | |
US11221730B2 (en) | Input device for VR/AR applications | |
US9268400B2 (en) | Controlling a graphical user interface | |
US5598187A (en) | Spatial motion pattern input system and input method | |
EP2717120B1 (en) | Apparatus, methods and computer program products providing finger-based and hand-based gesture commands for portable electronic device applications | |
US7358963B2 (en) | Mouse having an optically-based scrolling feature | |
US5132672A (en) | Three degree of freedom graphic object controller | |
US20030174125A1 (en) | Multiple input modes in overlapping physical space | |
US20070222746A1 (en) | Gestural input for navigation and manipulation in virtual space | |
US20110037695A1 (en) | Ergonomic control unit for providing a pointing function | |
US20150193000A1 (en) | Image-based interactive device and implementing method thereof | |
WO2006068357A1 (en) | System for wearable general-purpose 3-dimensional input | |
US20050264522A1 (en) | Data input device | |
US20080036739A1 (en) | Integrated Wireless Pointing Device, Terminal Equipment with the Same, and Pointing Method Using Wireless Pointing Device | |
JP2008227605A (en) | Led pointing remote controller, video display device using the same and video system | |
EP1160651A1 (en) | Wireless cursor control | |
WO2003003185A1 (en) | System for establishing a user interface | |
EP1785817A2 (en) | External operation signal recognition system of a mobile communication terminal | |
WO2010020986A2 (en) | An ergonomic control unit for providing a pointing function | |
KR100812998B1 (en) | System for controlling mouse cursor with bluetooth | |
JP2000187551A (en) | Input device | |
KR20080017194A (en) | Wireless mouse and driving method thereof | |
US20230031200A1 (en) | Touchless, Gesture-Based Human Interface Device | |
KR100364738B1 (en) | wireless mouse combined ball mouse and trackball mouse |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |