US20100125815A1 - Gesture-based control method for interactive screen control - Google Patents
Gesture-based control method for interactive screen control
- Publication number
- US20100125815A1 (application US12/424,380)
- Authority
- US
- United States
- Prior art keywords
- gesture
- control method
- based control
- predefined
- function
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
Abstract
A gesture-based control method for interactive screen control includes configuring an image-capturing module to capture a sequence of images of a gesture, configuring an analyzing module to determine whether the images captured by the image-capturing module match a predefined gesture corresponding to a function of an input device, and when it is determined that the images captured by the image-capturing module match the predefined gesture, configuring a processing module to perform an operation associated with the corresponding function of the input device and to control an interactive screen to show a result of the operation performed thereby.
Description
- This application claims priority of Taiwanese Application No. 097144686, filed on Nov. 19, 2008.
- 1. Field of the Invention
- This invention relates to a control method for interactive screen control, more particularly to a gesture-based control method for interactive screen control.
- 2. Description of the Related Art
- A conventional input device, such as a computer mouse or a computer keyboard, may be used to control an interactive screen. The conventional input device, however, is bulky and is thus inconvenient to carry. Therefore, various conventional control methods for interactive screen control that eliminate the use of the conventional input device have been proposed.
- In one conventional control method, voice commands are issued to control an interactive screen. This conventional voice-based control method, however, is not applicable to those who have a speech impairment.
- In another conventional control method, which employs augmented reality (AR) technology, gestures are detected, such as by a camera, to control an electrical appliance. For example, when a first predefined gesture, such as extending a finger, is detected, and when an operation associated with the first predefined gesture is turning on the electrical appliance, the electrical appliance is turned on; and when a second predefined gesture, such as extending a pair of fingers, is detected, and when an operation associated with the second predefined gesture is turning off the electrical appliance, the electrical appliance is turned off.
- The aforementioned conventional gesture-based control method is disadvantageous in that different gestures are used for performing different operations to control the electrical appliance. That is, when turning on the electrical appliance, a gesture, i.e., the first predefined gesture, is made, and when turning off the electrical appliance, a different gesture, i.e., the second predefined gesture, is made. As a consequence, the user needs to memorize an indefinite number of predefined gestures in order to fully control the electrical appliance.
- Therefore, the object of the present invention is to provide a gesture-based control method for interactive screen control that can fully control an interactive screen using only a limited number of predefined gestures.
- According to the present invention, a gesture-based control method for interactive screen control comprises configuring an image-capturing module to capture a sequence of images of a gesture, configuring an analyzing module to determine whether the images captured by the image-capturing module match a predefined gesture corresponding to a function of an input device, and when it is determined that the images captured by the image-capturing module match the predefined gesture, configuring a processing module to perform an operation associated with the corresponding function of the input device and to control an interactive screen to show a result of the operation performed thereby.
- Other features and advantages of the present invention will become apparent in the following detailed description of the preferred embodiment with reference to the accompanying drawings, of which:
- FIG. 1 is a schematic perspective view of the preferred embodiment of a system according to this invention;
- FIGS. 2 to 6 are schematic diagrams illustrating predefined gestures stored in the system in FIG. 1; and
- FIG. 7 is a flow chart of the preferred embodiment of a gesture-based control method for interactive screen control according to this invention.
- Referring to FIG. 1, the preferred embodiment of a system according to this invention includes a database 51, an image-capturing module 52, an analyzing module 53, and a processing module 54.
- The database 51 stores therein a set of predefined gestures, namely, first, second, third, fourth, and fifth predefined gestures, and corresponding functions of a computer mouse.
- In an alternative embodiment, the database 51 stores therein a set of predefined gestures and corresponding functions of one of a computer keyboard, a computer steering wheel, and a computer joystick.
- In yet another embodiment, the predefined gestures and the corresponding functions of the computer mouse are stored in a storage device, such as a disk drive.
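As an illustration only (not part of the disclosure), the mapping held in the database 51 can be pictured as a small lookup table from gesture identifiers to assigned mouse functions. The names and enum values in the sketch below are assumptions introduced for clarity.

```python
from enum import Enum, auto

class MouseFunction(Enum):
    SINGLE_CLICK = auto()    # first predefined gesture (pinch)
    DOUBLE_CLICK = auto()    # second predefined gesture (double pinch)
    SELECT = auto()          # third predefined gesture (spread)
    CLICK_AND_DRAG = auto()  # fourth predefined gesture (pinch-and-point)
    MOVE_CURSOR = auto()     # fifth predefined gesture (point)

# Hypothetical stand-in for database 51: gesture name -> assigned mouse function.
PREDEFINED_GESTURES = {
    "pinch": MouseFunction.SINGLE_CLICK,
    "double_pinch": MouseFunction.DOUBLE_CLICK,
    "spread": MouseFunction.SELECT,
    "pinch_and_point": MouseFunction.CLICK_AND_DRAG,
    "point": MouseFunction.MOVE_CURSOR,
}
```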
- The first predefined gesture is assigned with a single-click function of a computer mouse. In this embodiment, the first predefined gesture is a pinch gesture made by a combination of an index finger and a thumb. That is, as illustrated in FIG. 2, the index finger 6 and the thumb 7 are first disposed at a spread apart position, where the index finger 6 and the thumb 7 are spaced apart from each other. Then, the index finger 6 and the thumb 7 are disposed at a pinch position, where the tips of the index finger 6 and the thumb 7 are in contact. Thereafter, the index finger 6 and the thumb 7 are disposed back to the spread apart position.
- The second predefined gesture is assigned with a double-click function of a computer mouse. In this embodiment, the second predefined gesture is a double pinch gesture made by a combination of an index finger and a thumb. That is, as illustrated in FIG. 3, the index finger 6 and the thumb 7 make the pinch gesture twice.
- The third predefined gesture is assigned with a select function of a computer mouse. In this embodiment, the third predefined gesture is a spread gesture made by a combination of an index finger and a thumb. That is, as illustrated in FIG. 4, the index finger 6 and the thumb 7 are first disposed at the pinch position, and then at the spread apart position.
- The fourth predefined gesture is assigned with a click-and-drag function of a computer mouse. In this embodiment, the fourth predefined gesture is a pinch-and-point gesture made by a combination of an index finger and a thumb. That is, as illustrated in FIG. 5, the index finger 6 and the thumb 7 are first disposed at the pinch position, and are then moved in a direction indicated by the arrow (A).
- The fifth predefined gesture is assigned with a function of a computer mouse for moving a cursor. In this embodiment, the fifth predefined gesture is a point gesture made by an index finger. That is, as illustrated in FIG. 6, the index finger 6 is moved in a direction indicated by the arrow (B).
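The patent does not specify how a gesture is recognized from the captured images. Purely as a sketch, the spread-pinch-spread sequence of FIG. 2 could be checked with a small state machine over the distance between the index-finger tip and the thumb tip; the thresholds and the source of the per-frame fingertip coordinates (any hand tracker) are assumptions.

```python
# Hypothetical sketch: recognizing the first predefined gesture (pinch) as the
# spread -> pinch -> spread sequence of FIG. 2, from per-frame fingertip positions.
from math import dist

SPREAD_MIN = 60.0  # assumed pixel gap between fingertips meaning "spread apart"
PINCH_MAX = 15.0   # assumed pixel gap between fingertips meaning "pinch"

def detect_pinch(fingertip_frames):
    """fingertip_frames: iterable of (index_tip_xy, thumb_tip_xy) tuples for one sequence.
    Returns True if the sequence goes spread -> pinch -> spread."""
    state = "waiting_spread"
    for index_tip, thumb_tip in fingertip_frames:
        gap = dist(index_tip, thumb_tip)
        if state == "waiting_spread" and gap > SPREAD_MIN:
            state = "waiting_pinch"
        elif state == "waiting_pinch" and gap < PINCH_MAX:
            state = "waiting_release"
        elif state == "waiting_release" and gap > SPREAD_MIN:
            return True
    return False
```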
- The image-capturing module 52 includes a lens unit 521 that captures a sequence of images, such as of a gesture made by a hand 3, and a converter 522 that converts the images captured by the lens unit 521 into a digital format.
- The system, such as a notebook computer, further includes a display module 2 on which computer-generated graphics, i.e., an interactive screen, and a real-world element, i.e., the images captured by the lens unit 521 of the image-capturing module 52, are displayed.
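For illustration, the lens unit 521 and the converter 522 could be realized on a notebook computer roughly as follows. The choice of OpenCV, the frame count, and the grayscale conversion are assumptions, not anything stated in the patent.

```python
# Illustrative only: one possible realization of the image-capturing module 52.
import cv2

def capture_gesture_sequence(num_frames=30, camera_index=0):
    """Capture a short sequence of frames and return them in digital (grayscale) form."""
    cap = cv2.VideoCapture(camera_index)  # stands in for the lens unit 521
    frames = []
    try:
        for _ in range(num_frames):
            ok, frame = cap.read()
            if not ok:
                break
            # Stands in for the converter 522: convert the image into a digital format.
            frames.append(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    finally:
        cap.release()
    return frames
```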
- The analyzing module 53 is connected to the database 51 and the image-capturing module 52, and determines whether the images captured by the lens unit 521 of the image-capturing module 52 match one of the predefined gestures stored in the database 51.
- For example, the analyzing module 53 first analyzes the images captured by the lens unit 521 of the image-capturing module 52, and then compares the images captured by the lens unit 521 of the image-capturing module 52 with the predefined gestures stored in the database 51. When the analyzing module 53 determines that the images captured by the lens unit 521 of the image-capturing module 52 match the second predefined gesture stored in the database 51, the analyzing module 53 generates a result that corresponds to the double-click function of a computer mouse.
- The processing module 54 is connected to the analyzing module 53, and performs an operation based on the result generated by the analyzing module 53. In particular, as in the example above, the processing module 54 performs an operation associated with the double-click function of a computer mouse and controls the interactive screen to show a result of the operation performed thereby.
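A minimal sketch of the analyze-then-compare behaviour of the analyzing module 53 follows. The class name and the detector interface are illustrative assumptions: each predefined gesture is assumed to have its own detector callable, and the gesture-to-function mapping plays the role of the database 51.

```python
# Minimal sketch of the analyzing module 53; the interface is an illustration only.
from typing import Callable, Dict, Optional

class GestureAnalyzer:
    def __init__(self, detectors: Dict[str, Callable], functions: Dict[str, object]):
        self.detectors = detectors  # gesture name -> callable(frames) -> bool
        self.functions = functions  # gesture name -> assigned mouse function (database 51)

    def analyze(self, frames) -> Optional[object]:
        """Compare the captured images with the predefined gestures and generate a
        result corresponding to the function assigned to the matching gesture."""
        for name, matches in self.detectors.items():
            if matches(frames):
                return self.functions.get(name)
        return None
```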
- The preferred embodiment of a gesture-based control method for interactive screen control, to be implemented using the aforementioned system according to this invention, will now be described with further reference to FIG. 7.
- In step 71, the lens unit 521 of the image-capturing module 52 is configured to capture a sequence of images of a gesture, and the converter 522 of the image-capturing module 52 is configured to convert the images captured by the lens unit 521 of the image-capturing module 52 into a digital format.
- In step 72, the analyzing module 53 is configured to determine whether the images captured in step 71 match one of the predefined gestures.
- In step 73, when it is determined in step 72 that the images captured in step 71 match one of the predefined gestures, the flow proceeds to step 74. Otherwise, the flow goes back to step 71.
- In step 74, the processing module 54 performs an operation associated with a function of a computer mouse that corresponds to the matching one of the predefined gestures, and controls an interactive screen to show a result of the operation performed thereby.
- It has thus been shown that the gesture-based control method of this invention uses predefined gestures, each of which corresponds to a function of an input device, i.e., a computer mouse. As such, the same gesture may be made to perform different operations to control an interactive screen. For example, when performing different operations, such as minimizing, maximizing, or closing, to control an interactive screen such as a window, the cursor is first moved to the corresponding button of the window using the fifth predefined gesture, and the corresponding button of the window is then single-clicked using the first predefined gesture. That is, for the three different operations, the same succession of the fifth and first predefined gestures is made. The gesture-based control method of this invention, therefore, only requires the user to memorize a limited number of predefined gestures in order to fully control an interactive screen.
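Putting the pieces together, steps 71 to 74 amount to a capture-match-perform loop. The sketch below reuses the hypothetical helpers from the earlier sketches (capture_gesture_sequence, GestureAnalyzer, PREDEFINED_GESTURES) and uses a placeholder for the processing module 54 and the interactive screen update; it is an illustration under those assumptions, not the patented implementation.

```python
# Hedged end-to-end sketch of steps 71 to 74.

def perform_mouse_operation(function):
    # Placeholder: here the processing module 54 would perform the mouse operation
    # and control the interactive screen to show its result.
    print(f"performing {function} and updating the interactive screen")

def run_control_loop(analyzer):
    while True:
        frames = capture_gesture_sequence()  # step 71: capture and digitize images
        result = analyzer.analyze(frames)    # step 72: match against predefined gestures
        if result is None:                   # step 73: no match, capture again
            continue
        perform_mouse_operation(result)      # step 74: perform operation, show result

# Example wiring (assuming the detectors operate on the captured frame sequence):
# run_control_loop(GestureAnalyzer(my_detectors, PREDEFINED_GESTURES))
```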
- While the present invention has been described in connection with what is considered the most practical and preferred embodiment, it is understood that this invention is not limited to the disclosed embodiment but is intended to cover various arrangements included within the spirit and scope of the broadest interpretation so as to encompass all such modifications and equivalent arrangements.
Claims (20)
1. A gesture-based control method for interactive screen control, comprising:
A) configuring an image-capturing module to capture a sequence of images of a gesture;
B) configuring an analyzing module to determine whether the images captured in step A) match a predefined gesture corresponding to a function of an input device; and
C) when it is determined in step B) that the images captured in step A) match the predefined gesture, configuring a processing module to perform an operation associated with the corresponding function of the input device and to control an interactive screen to show a result of the operation performed thereby.
2. The gesture-based control method as claimed in claim 1, wherein the input device is one of a computer mouse, a computer keyboard, a computer steering wheel, and a computer joystick.
3. The gesture-based control method as claimed in claim 1, wherein the predefined gesture is assigned with a function of a computer mouse for moving a cursor.
4. The gesture-based control method as claimed in claim 3, wherein the predefined gesture is a point gesture.
5. The gesture-based control method as claimed in claim 4, wherein the predefined gesture is a gesture made by an index finger.
6. The gesture-based control method as claimed in claim 1, wherein the predefined gesture is assigned with a single-click function of a computer mouse.
7. The gesture-based control method as claimed in claim 6, wherein the predefined gesture is a pinch gesture.
8. The gesture-based control method as claimed in claim 7, wherein the predefined gesture is a gesture made by a combination of an index finger and a thumb.
9. The gesture-based control method as claimed in claim 1, wherein the predefined gesture is assigned with a double-click function of a computer mouse.
10. The gesture-based control method as claimed in claim 9, wherein the predefined gesture is a double-pinch gesture.
11. The gesture-based control method as claimed in claim 10, wherein the predefined gesture is a gesture made by a combination of an index finger and a thumb.
12. The gesture-based control method as claimed in claim 1, wherein the predefined gesture is assigned with a select function of a computer mouse.
13. The gesture-based control method as claimed in claim 12, wherein the predefined gesture is a spread gesture.
14. The gesture-based control method as claimed in claim 13, wherein the predefined gesture is a gesture made by a combination of an index finger and a thumb.
15. The gesture-based control method as claimed in claim 1, wherein the predefined gesture is assigned with a click-and-drag function of a computer mouse.
16. The gesture-based control method as claimed in claim 15, wherein the predefined gesture is a pinch-and-point gesture.
17. The gesture-based control method as claimed in claim 16, wherein the predefined gesture is a gesture made by a combination of an index finger and a thumb.
18. The gesture-based control method as claimed in claim 1, wherein the predefined gesture and the corresponding function of the input device are stored in one of a database and a storage device.
19. The gesture-based control method as claimed in claim 1, wherein step A) includes the sub-step of configuring the image-capturing module to convert the images captured thereby into a digital format.
20. The gesture-based control method as claimed in claim 1, wherein, in step B), the analyzing module generates a result that corresponds to the function of the input device, and in step C), the operation performed by the processing module is based on the result generated by the analyzing module.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
TW097144686 | 2008-11-19 | ||
TW097144686A TW201020896A (en) | 2008-11-19 | 2008-11-19 | Method of gesture control |
Publications (1)
Publication Number | Publication Date |
---|---|
US20100125815A1 (en) | 2010-05-20
Family
ID=42172959
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/424,380 Abandoned US20100125815A1 (en) | 2008-11-19 | 2009-04-15 | Gesture-based control method for interactive screen control |
Country Status (2)
Country | Link |
---|---|
US (1) | US20100125815A1 (en) |
TW (1) | TW201020896A (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI475474B (en) * | 2012-07-30 | 2015-03-01 | Mitac Int Corp | Gesture combined with the implementation of the icon control method |
- 2008-11-19: TW application TW097144686A filed; published as TW201020896A (status unknown)
- 2009-04-15: US application US12/424,380 filed; published as US20100125815A1 (abandoned)
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080068195A1 (en) * | 2004-06-01 | 2008-03-20 | Rudolf Ritter | Method, System And Device For The Haptically Controlled Transfer Of Selectable Data Elements To A Terminal |
US20080244468A1 (en) * | 2006-07-13 | 2008-10-02 | Nishihara H Keith | Gesture Recognition Interface System with Vertical Display |
US20080028325A1 (en) * | 2006-07-25 | 2008-01-31 | Northrop Grumman Corporation | Networked gesture collaboration system |
US20080309632A1 (en) * | 2007-06-13 | 2008-12-18 | Apple Inc. | Pinch-throw and translation gestures |
Cited By (113)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8830292B2 (en) * | 2007-09-24 | 2014-09-09 | Qualcomm Incorporated | Enhanced interface for voice and video communications |
US20130027503A1 (en) * | 2007-09-24 | 2013-01-31 | Qualcomm Incorporated | Enhanced interface for voice and video communications |
US20110037731A1 (en) * | 2009-08-12 | 2011-02-17 | Inventec Appliances Corp. | Electronic device and operating method thereof |
US9141193B2 (en) * | 2009-08-31 | 2015-09-22 | Microsoft Technology Licensing, Llc | Techniques for using human gestures to control gesture unaware programs |
US20110055846A1 (en) * | 2009-08-31 | 2011-03-03 | Microsoft Corporation | Techniques for using human gestures to control gesture unaware programs |
US20150363005A1 (en) * | 2009-08-31 | 2015-12-17 | Microsoft Corporation | Techniques for using human gestures to control gesture unaware programs |
US10261651B2 (en) | 2010-10-01 | 2019-04-16 | Z124 | Multiple child windows in dual display communication devices |
US20120174028A1 (en) * | 2010-10-01 | 2012-07-05 | Imerj LLC | Opening child windows in dual display communication devices |
US10705674B2 (en) | 2010-10-01 | 2020-07-07 | Z124 | Multi-display control |
US10552007B2 (en) | 2010-10-01 | 2020-02-04 | Z124 | Managing expose views in dual display communication devices |
US10949051B2 (en) | 2010-10-01 | 2021-03-16 | Z124 | Managing presentation of windows on a mobile device |
US10048827B2 (en) | 2010-10-01 | 2018-08-14 | Z124 | Multi-display control |
US9213431B2 (en) * | 2010-10-01 | 2015-12-15 | Z124 | Opening child windows in dual display communication devices |
US8984440B2 (en) | 2010-10-01 | 2015-03-17 | Z124 | Managing expose views in dual display communication devices |
US10871871B2 (en) | 2010-10-01 | 2020-12-22 | Z124 | Methods and systems for controlling window minimization and maximization on a mobile device |
US9134756B2 (en) | 2010-10-01 | 2015-09-15 | Z124 | Dual screen application visual indicator |
US9047047B2 (en) | 2010-10-01 | 2015-06-02 | Z124 | Allowing multiple orientations in dual screen view |
US20130229348A1 (en) * | 2010-11-04 | 2013-09-05 | Macron Co., Ltd. | Driving method of virtual mouse |
US20120235904A1 (en) * | 2011-03-19 | 2012-09-20 | The Board of Trustees of the Leland Stanford, Junior, University | Method and System for Ergonomic Touch-free Interface |
US9857868B2 (en) * | 2011-03-19 | 2018-01-02 | The Board Of Trustees Of The Leland Stanford Junior University | Method and system for ergonomic touch-free interface |
US9351237B2 (en) | 2011-09-27 | 2016-05-24 | Z124 | Displaying of charging status on dual screen device |
US9524027B2 (en) | 2011-09-27 | 2016-12-20 | Z124 | Messaging application views |
US8994671B2 (en) | 2011-09-27 | 2015-03-31 | Z124 | Display notifications on a dual screen device |
US9679215B2 (en) | 2012-01-17 | 2017-06-13 | Leap Motion, Inc. | Systems and methods for machine control |
US9652668B2 (en) | 2012-01-17 | 2017-05-16 | Leap Motion, Inc. | Enhanced contrast for object detection and characterization by optical imaging based on differences between images |
US9945660B2 (en) | 2012-01-17 | 2018-04-17 | Leap Motion, Inc. | Systems and methods of locating a control object appendage in three dimensional (3D) space |
US9934580B2 (en) | 2012-01-17 | 2018-04-03 | Leap Motion, Inc. | Enhanced contrast for object detection and characterization by optical imaging based on differences between images |
US11720180B2 (en) | 2012-01-17 | 2023-08-08 | Ultrahaptics IP Two Limited | Systems and methods for machine control |
US10366308B2 (en) | 2012-01-17 | 2019-07-30 | Leap Motion, Inc. | Enhanced contrast for object detection and characterization by optical imaging based on differences between images |
US10410411B2 (en) | 2012-01-17 | 2019-09-10 | Leap Motion, Inc. | Systems and methods of object shape and position determination in three-dimensional (3D) space |
US8638989B2 (en) | 2012-01-17 | 2014-01-28 | Leap Motion, Inc. | Systems and methods for capturing motion in three-dimensional space |
US9436998B2 (en) | 2012-01-17 | 2016-09-06 | Leap Motion, Inc. | Systems and methods of constructing three-dimensional (3D) model of an object using image cross-sections |
US10767982B2 (en) | 2012-01-17 | 2020-09-08 | Ultrahaptics IP Two Limited | Systems and methods of locating a control object appendage in three dimensional (3D) space |
US9495613B2 (en) | 2012-01-17 | 2016-11-15 | Leap Motion, Inc. | Enhanced contrast for object detection and characterization by optical imaging using formed difference images |
US9778752B2 (en) | 2012-01-17 | 2017-10-03 | Leap Motion, Inc. | Systems and methods for machine control |
US9070019B2 (en) | 2012-01-17 | 2015-06-30 | Leap Motion, Inc. | Systems and methods for capturing motion in three-dimensional space |
US10699155B2 (en) | 2012-01-17 | 2020-06-30 | Ultrahaptics IP Two Limited | Enhanced contrast for object detection and characterization by optical imaging based on differences between images |
US9626591B2 (en) | 2012-01-17 | 2017-04-18 | Leap Motion, Inc. | Enhanced contrast for object detection and characterization by optical imaging |
US9153028B2 (en) | 2012-01-17 | 2015-10-06 | Leap Motion, Inc. | Systems and methods for capturing motion in three-dimensional space |
US10691219B2 (en) | 2012-01-17 | 2020-06-23 | Ultrahaptics IP Two Limited | Systems and methods for machine control |
US11308711B2 (en) | 2012-01-17 | 2022-04-19 | Ultrahaptics IP Two Limited | Enhanced contrast for object detection and characterization by optical imaging based on differences between images |
US9672441B2 (en) | 2012-01-17 | 2017-06-06 | Leap Motion, Inc. | Enhanced contrast for object detection and characterization by optical imaging based on differences between images |
US11782516B2 (en) | 2012-01-17 | 2023-10-10 | Ultrahaptics IP Two Limited | Differentiating a detected object from a background using a gaussian brightness falloff pattern |
US9767345B2 (en) | 2012-01-17 | 2017-09-19 | Leap Motion, Inc. | Systems and methods of constructing three-dimensional (3D) model of an object using image cross-sections |
US9697643B2 (en) | 2012-01-17 | 2017-07-04 | Leap Motion, Inc. | Systems and methods of object shape and position determination in three-dimensional (3D) space |
US10565784B2 (en) | 2012-01-17 | 2020-02-18 | Ultrahaptics IP Two Limited | Systems and methods for authenticating a user according to a hand of the user moving in a three-dimensional (3D) space |
US9741136B2 (en) | 2012-01-17 | 2017-08-22 | Leap Motion, Inc. | Systems and methods of object shape and position determination in three-dimensional (3D) space |
US20140068526A1 (en) * | 2012-02-04 | 2014-03-06 | Three Bots Ltd | Method and apparatus for user interaction |
US20130229346A1 (en) * | 2012-03-05 | 2013-09-05 | E.G.O. Elektro-Geraetebau Gmbh | Method and apparatus for a camera module for operating gesture recognition and home appliance |
US9329714B2 (en) * | 2012-04-26 | 2016-05-03 | Panasonic Intellectual Property Corporation Of America | Input device, input assistance method, and program |
US20150062033A1 (en) * | 2012-04-26 | 2015-03-05 | Panasonic Intellectual Property Corporation Of America | Input device, input assistance method, and program |
US20140033140A1 (en) * | 2012-07-11 | 2014-01-30 | Guang Dong Oppo Mobile Telecommunications Corp., Ltd. | Quick access function setting method for a touch control device |
US9823834B2 (en) * | 2012-07-11 | 2017-11-21 | Guang Dong Oppo Mobile Telecommunications., Ltd. | Quick access gesture setting and accessing method for a touch control device |
US9201585B1 (en) * | 2012-09-17 | 2015-12-01 | Amazon Technologies, Inc. | User interface navigation gestures |
US9285893B2 (en) | 2012-11-08 | 2016-03-15 | Leap Motion, Inc. | Object detection and tracking with variable-field illumination devices |
US10609285B2 (en) | 2013-01-07 | 2020-03-31 | Ultrahaptics IP Two Limited | Power consumption in motion-capture systems |
US9626015B2 (en) | 2013-01-08 | 2017-04-18 | Leap Motion, Inc. | Power consumption in motion-capture systems with audio and optical signals |
US9465461B2 (en) | 2013-01-08 | 2016-10-11 | Leap Motion, Inc. | Object detection and tracking with audio and optical signals |
US10097754B2 (en) | 2013-01-08 | 2018-10-09 | Leap Motion, Inc. | Power consumption in motion-capture systems with audio and optical signals |
US11874970B2 (en) | 2013-01-15 | 2024-01-16 | Ultrahaptics IP Two Limited | Free-space user interface and control using virtual constructs |
US11243612B2 (en) | 2013-01-15 | 2022-02-08 | Ultrahaptics IP Two Limited | Dynamic, free-space user interactions for machine control |
US10042510B2 (en) | 2013-01-15 | 2018-08-07 | Leap Motion, Inc. | Dynamic user interactions for display control and measuring degree of completeness of user gestures |
US10139918B2 (en) | 2013-01-15 | 2018-11-27 | Leap Motion, Inc. | Dynamic, free-space user interactions for machine control |
US10042430B2 (en) | 2013-01-15 | 2018-08-07 | Leap Motion, Inc. | Free-space user interface and control using virtual constructs |
US11269481B2 (en) | 2013-01-15 | 2022-03-08 | Ultrahaptics IP Two Limited | Dynamic user interactions for display control and measuring degree of completeness of user gestures |
US10817130B2 (en) | 2013-01-15 | 2020-10-27 | Ultrahaptics IP Two Limited | Dynamic user interactions for display control and measuring degree of completeness of user gestures |
US11740705B2 (en) | 2013-01-15 | 2023-08-29 | Ultrahaptics IP Two Limited | Method and system for controlling a machine according to a characteristic of a control object |
US10241639B2 (en) | 2013-01-15 | 2019-03-26 | Leap Motion, Inc. | Dynamic user interactions for display control and manipulation of display objects |
US10782847B2 (en) | 2013-01-15 | 2020-09-22 | Ultrahaptics IP Two Limited | Dynamic user interactions for display control and scaling responsiveness of display objects |
US9632658B2 (en) | 2013-01-15 | 2017-04-25 | Leap Motion, Inc. | Dynamic user interactions for display control and scaling responsiveness of display objects |
US10564799B2 (en) | 2013-01-15 | 2020-02-18 | Ultrahaptics IP Two Limited | Dynamic user interactions for display control and identifying dominant gestures |
US10739862B2 (en) | 2013-01-15 | 2020-08-11 | Ultrahaptics IP Two Limited | Free-space user interface and control using virtual constructs |
US9501152B2 (en) | 2013-01-15 | 2016-11-22 | Leap Motion, Inc. | Free-space user interface and control using virtual constructs |
US11353962B2 (en) | 2013-01-15 | 2022-06-07 | Ultrahaptics IP Two Limited | Free-space user interface and control using virtual constructs |
US9696867B2 (en) | 2013-01-15 | 2017-07-04 | Leap Motion, Inc. | Dynamic user interactions for display control and identifying dominant gestures |
US20140214185A1 (en) * | 2013-01-25 | 2014-07-31 | Kuo-Chung Huang | Somatosensory Household Electricity Control Equipment and System Thereof |
US10585193B2 (en) | 2013-03-15 | 2020-03-10 | Ultrahaptics IP Two Limited | Determining positional information of an object in space |
US9702977B2 (en) | 2013-03-15 | 2017-07-11 | Leap Motion, Inc. | Determining positional information of an object in space |
US11693115B2 (en) | 2013-03-15 | 2023-07-04 | Ultrahaptics IP Two Limited | Determining positional information of an object in space |
US10620709B2 (en) | 2013-04-05 | 2020-04-14 | Ultrahaptics IP Two Limited | Customized gesture interpretation |
US11347317B2 (en) | 2013-04-05 | 2022-05-31 | Ultrahaptics IP Two Limited | Customized gesture interpretation |
US11099653B2 (en) | 2013-04-26 | 2021-08-24 | Ultrahaptics IP Two Limited | Machine responsiveness to dynamic user movements and gestures |
US10452151B2 (en) | 2013-04-26 | 2019-10-22 | Ultrahaptics IP Two Limited | Non-tactile interface systems and methods |
US9916009B2 (en) | 2013-04-26 | 2018-03-13 | Leap Motion, Inc. | Non-tactile interface systems and methods |
US9747696B2 (en) | 2013-05-17 | 2017-08-29 | Leap Motion, Inc. | Systems and methods for providing normalized parameters of motions of objects in three-dimensional space |
US10831281B2 (en) | 2013-08-09 | 2020-11-10 | Ultrahaptics IP Two Limited | Systems and methods of free-space gestural interaction |
US11567578B2 (en) | 2013-08-09 | 2023-01-31 | Ultrahaptics IP Two Limited | Systems and methods of free-space gestural interaction |
US10281987B1 (en) | 2013-08-09 | 2019-05-07 | Leap Motion, Inc. | Systems and methods of free-space gestural interaction |
US10846942B1 (en) | 2013-08-29 | 2020-11-24 | Ultrahaptics IP Two Limited | Predictive information for free space gesture control and communication |
US11461966B1 (en) | 2013-08-29 | 2022-10-04 | Ultrahaptics IP Two Limited | Determining spans and span lengths of a control object in a free space gesture control environment |
US11776208B2 (en) | 2013-08-29 | 2023-10-03 | Ultrahaptics IP Two Limited | Predictive information for free space gesture control and communication |
US11282273B2 (en) | 2013-08-29 | 2022-03-22 | Ultrahaptics IP Two Limited | Predictive information for free space gesture control and communication |
EP2846231A3 (en) * | 2013-09-10 | 2015-07-15 | Samsung Electronics Co., Ltd | Apparatus and method for controlling a user interface using an input image |
KR102165818B1 (en) * | 2013-09-10 | 2020-10-14 | 삼성전자주식회사 | Method, apparatus and recovering medium for controlling user interface using a input image |
US9898090B2 (en) * | 2013-09-10 | 2018-02-20 | Samsung Electronics Co., Ltd. | Apparatus, method and recording medium for controlling user interface using input image |
US10579152B2 (en) | 2013-09-10 | 2020-03-03 | Samsung Electronics Co., Ltd. | Apparatus, method and recording medium for controlling user interface using input image |
US20150070272A1 (en) * | 2013-09-10 | 2015-03-12 | Samsung Electronics Co., Ltd. | Apparatus, method and recording medium for controlling user interface using input image |
US11061480B2 (en) | 2013-09-10 | 2021-07-13 | Samsung Electronics Co., Ltd. | Apparatus, method and recording medium for controlling user interface using input image |
US11513608B2 (en) | 2013-09-10 | 2022-11-29 | Samsung Electronics Co., Ltd. | Apparatus, method and recording medium for controlling user interface using input image |
KR20150029463A (en) * | 2013-09-10 | 2015-03-18 | 삼성전자주식회사 | Method, apparatus and recovering medium for controlling user interface using a input image |
US11775033B2 (en) | 2013-10-03 | 2023-10-03 | Ultrahaptics IP Two Limited | Enhanced field of view to augment three-dimensional (3D) sensory space for free-space gesture interpretation |
US11568105B2 (en) | 2013-10-31 | 2023-01-31 | Ultrahaptics IP Two Limited | Predictive information for free space gesture control and communication |
US11868687B2 (en) | 2013-10-31 | 2024-01-09 | Ultrahaptics IP Two Limited | Predictive information for free space gesture control and communication |
US11010512B2 (en) | 2013-10-31 | 2021-05-18 | Ultrahaptics IP Two Limited | Improving predictive information for free space gesture control and communication |
US9996638B1 (en) | 2013-10-31 | 2018-06-12 | Leap Motion, Inc. | Predictive information for free space gesture control and communication |
US9613262B2 (en) | 2014-01-15 | 2017-04-04 | Leap Motion, Inc. | Object detection and tracking for providing a virtual device experience |
US9996160B2 (en) | 2014-02-18 | 2018-06-12 | Sony Corporation | Method and apparatus for gesture detection and display control |
EP2908215A1 (en) * | 2014-02-18 | 2015-08-19 | Sony Corporation | Method and apparatus for gesture detection and display control |
US20150261406A1 (en) * | 2014-03-17 | 2015-09-17 | Shenzhen Futaihong Precision Industry Co.,Ltd. | Device and method for unlocking electronic device |
US11778159B2 (en) | 2014-08-08 | 2023-10-03 | Ultrahaptics IP Two Limited | Augmented reality with motion sensing |
US11416078B2 (en) | 2018-05-21 | 2022-08-16 | Vestel Elektronik Sanayi Ve Ticaret A.S. | Method, system and computer program for remotely controlling a display device via head gestures |
US11875012B2 (en) | 2018-05-25 | 2024-01-16 | Ultrahaptics IP Two Limited | Throwable interface for augmented reality and virtual reality environments |
US11726578B1 (en) * | 2022-02-11 | 2023-08-15 | Meta Platforms Technologies, Llc | Scrolling and navigation in virtual reality |
Also Published As
Publication number | Publication date |
---|---|
TW201020896A (en) | 2010-06-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20100125815A1 (en) | Gesture-based control method for interactive screen control | |
US7849421B2 (en) | Virtual mouse driving apparatus and method using two-handed gestures | |
EP3258423B1 (en) | Handwriting recognition method and apparatus | |
US20180024643A1 (en) | Gesture Based Interface System and Method | |
US9448620B2 (en) | Input method and apparatus of portable device for mapping segments of a hand to a plurality of keys | |
US8791900B2 (en) | Computing device notes | |
US9367202B2 (en) | Information processing method and electronic device | |
US20070274591A1 (en) | Input apparatus and input method thereof | |
WO2012011263A1 (en) | Gesture input device and gesture input method | |
US9880697B2 (en) | Remote multi-touch control | |
CN108616712B (en) | Camera-based interface operation method, device, equipment and storage medium | |
JP2014186361A (en) | Information processing device, operation control method, and program | |
US8830192B2 (en) | Computing device for performing functions of multi-touch finger gesture and method of the same | |
Jeong et al. | Single-camera dedicated television control system using gesture drawing | |
JP6991486B2 (en) | Methods and systems for inserting characters into strings | |
US20110221918A1 (en) | Information Processing Apparatus, Information Processing Method, and Program | |
US20120218307A1 (en) | Electronic device with touch control screen and display control method thereof | |
US20110304649A1 (en) | Character selection | |
KR101488662B1 (en) | Device and method for providing interface interacting with a user using natural user interface device | |
US20110037731A1 (en) | Electronic device and operating method thereof | |
JP2011243157A (en) | Electronic apparatus, button size control method, and program | |
TWI603226B (en) | Gesture recongnition method for motion sensing detector | |
Garg | Comparative Studies of Gesture-Based and Sensor-Based Input Methods for Mobile User Interfaces | |
JP2009151631A (en) | Information processor, information processing method, and program | |
TW201602898A (en) | Operating method for fingers on touch screen |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: NATIONAL APPLIED RESEARCH LABORATORIES, TAIWAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WANG, MING-JEN;CHEN, KUEN-MEAU;REEL/FRAME:022687/0360 Effective date: 20090401 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |