WO2010032268A2 - System and method for controlling graphical objects - Google Patents

System and method for controlling graphical objects

Info

Publication number
WO2010032268A2
WO2010032268A2 (Application PCT/IN2009/000511)
Authority
WO
WIPO (PCT)
Prior art keywords
gesture
gestures
user interface
teleporting
graphical user
Prior art date
Application number
PCT/IN2009/000511
Other languages
French (fr)
Other versions
WO2010032268A3 (en)
Inventor
Avinash Saxena
Original Assignee
Avinash Saxena
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Avinash Saxena filed Critical Avinash Saxena
Publication of WO2010032268A2 publication Critical patent/WO2010032268A2/en
Publication of WO2010032268A3 publication Critical patent/WO2010032268A3/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures

Definitions

  • the invention provides a novel system and method that uses teleporting gestures sensed by a sensing unit to move a pointer over long distances on a display device.
  • the method reduces the time and movement required for moving the pointer over long distances on the display device. Further, the invention can lead to a reduction of the required size of a sensing device without altering the accuracy of the sensing device.
  • Computer readable media may be available media that may be accessed by a computer.
  • Computer readable media may comprise, but is not limited to, "computer storage media" and "communications media".

Abstract

The invention describes a system, a method, and a computer program for changing the position of an object of a graphical user interface. The system comprises a sensing unit generating sense data. The sense data comprises spatial data corresponding to user interaction with the sensing unit. The system further comprises a processing unit for recognizing a set of gestures from the sense data and calculating the position of the object according to the sense data. The set of gestures comprises a teleporting gesture. When the teleporting gesture is recognized, the position of the graphical object is calculated based on the position of the teleporting gesture. The position of the graphical object is calculated based on the change in spatial data when a gesture from the set of gestures is not recognized.

Description

SYSTEM AND METHOD FOR CONTROLLING GRAPHICAL OBJECTS
TECHNICAL FIELD
The present invention describes a system and a method for controlling a graphical user interface, and more particularly for controlling the position of an object of the graphical user interface.
BACKGROUND
Pointing systems have revolutionized human computer interface. A pointing system is an input interface that allows a user to input spatial data into a computer.
Examples of pointing systems are mice, touchpads, trackpads, touch screens, joysticks, eye tracking devices and laser pointers. The spatial data provided by the user is used to control graphical user interface objects displayed on a screen.
A direct input pointing system, such as a touch screen, uses a direct mapping mode, where the objects of the graphical user interface are controlled according to the absolute spatial input sensed by the pointing system. The same can be observed while moving the pointer on the screen using a touch screen, where the pointer is moved according to the absolute position of a stylus or fingers as sensed by the touch screen. Using a direct input pointing system is natural to human behavior, as the user can interact with the objects of the graphical user interface the way they interact with objects in the real world. However, with direct input devices, as the size of the screen increases, the user is required to make large movements to control the objects of the displayed graphical user interface. Further, the cost of such a system also increases, as a large sensing area is required. Furthermore, a large amount of data representing the absolute spatial input has to be transferred by the pointing system.
An indirect input pointing system, such as a mouse, uses a relative mapping mode, where the objects of the graphical user interface are controlled according to the change in spatial input sensed by the pointing system. The same can be observed while moving the pointer on the screen using a mouse: the change in spatial data sensed by the mouse is mapped to a change in the position of the pointer or cursor on the display device. Further, the position of the pointer can also be changed according to the rate of change of the spatial data sensed by the pointing system. Indirect input pointing systems are not natural to human behavior. An indirect input pointing system also imposes a compromise between speed and accuracy of usage. Also, as the change in the position of the pointer is mapped to the change in spatial data sensed by the mouse, the user has to be aware of the present position of the pointer.
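Purely as an illustration of the two conventional mapping modes described above, the following Python sketch contrasts them; the function names, parameters and fixed gain value are assumptions made for the example, not part of the disclosure.

```python
# Illustrative sketch only; names and the gain value are assumptions.

def absolute_mapping(touch_x, touch_y, sensor_w, sensor_h, screen_w, screen_h):
    """Direct mapping mode (e.g. a touch screen): the pointer position follows
    the absolute location sensed on the input surface."""
    return (touch_x / sensor_w * screen_w,
            touch_y / sensor_h * screen_h)


def relative_mapping(pointer_x, pointer_y, dx, dy, gain=2.0):
    """Relative mapping mode (e.g. a mouse): only the change (dx, dy) in the
    sensed position is used, so the result depends on the pointer's
    current position."""
    return (pointer_x + gain * dx,
            pointer_y + gain * dy)
```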
Current input devices suffer from one or more of the following limitations: large movements are required to control objects of the graphical user interface on a large display; the user has to be aware of the position of the pointer; there is a compromise between speed and accuracy; and a large amount of data representing the absolute spatial input has to be transferred. To overcome the limitations mentioned above, an input system is needed that is accurate, natural to human behavior and quick to use.
SUMMARY
The invention describes a system and a method for changing the position of an object of a graphical user interface. The system comprises a sensing unit generating sense data. The sense data comprises spatial data corresponding to user interaction with the sensing unit. The system further comprises a processing unit for recognizing a set of gestures from the sense data and calculating the position of the object according to the sense data. The set of gestures may comprise a teleporting gesture. When the teleporting gesture is recognized, the position of the graphical object is calculated based on the position of the teleporting gesture. According to an embodiment of the invention, the position of the object is calculated based on a change in the spatial data when a gesture from the set of gestures is not recognized.
Various embodiments of the present invention allow a user to control a graphical user interface without being aware of the location of graphical objects such as a pointer or a cursor; the user can simply use a teleporting gesture to move the pointer to the desired location on the display device. Further, the user can accurately control objects of the graphical user interface with minimal movements. Also, the invention reduces the time taken to reach an area of interest on the display.
DESCRIPTION OF DRAWINGS
The system and methods for changing position of an object of a graphical user interface on a display device are further described with reference to the accompanying drawings in which:
FIG. 1 illustrates a block diagram of a system 100 for changing position of a graphical object on a display device according to an embodiment of the invention;
FIG. 2 illustrates a block diagram of a system 100 for changing position of a graphical object on a display device according to another embodiment of the invention;
FIG. 3 illustrates a block diagram of a system 300 for changing position of a graphical object on a display device according to another embodiment of the invention; and
FIG. 4 illustrates a flowchart of a method for controlling a graphical user interface according to an embodiment of the invention.
DETAILED DESCRIPTION
The present invention describes a system and a method for controlling a graphical user interface using a sensing unit.
FIG. 1 describes a block diagram of a system 100 for changing position of an object of a graphical user interface based on spatial input provided by a user, according to an embodiment of the invention. Examples of the system 100 include but are not limited to mobile phones, laptops, desktops, televisions, and handheld devices. The system 100 includes a display device 102. The display device 102 comprises a display area for displaying one or more objects of a graphical user interface such as windows, pointer, icons, images, menus and files. Examples of the display device 102 include but are not limited to monitors, 3D monitors, holographic displays, LCD, multi screen systems, projector screens, etc. The display device 102 displays an object of the graphical user interface called pointer 110. The position of the pointer 110 on the display device 102 can be changed by a user by providing spatial input to a sensing unit 104. The sensing unit 104 generates sense data based on spatial input provided by the user to change the position of the pointer 110 on the display. Examples of the sensing unit 104 include but are not limited to touchpads, touch screens, multi touch screens, laser pointers, hand movement recognition systems, gesture recognition systems and eye tracking systems. Further, the sensing unit 104 can sense spatial data corresponding to a spatial input provided by the user using a pointing device. Examples of the pointing device include but are not limited to single finger, multiple fingers, other body parts, pen, stylus, laser pointer and so forth. According to an embodiment of the invention, a pointing device can be any selection or control mechanism. The spatial data sensed by the sensing unit 104 can be three dimensional or two dimensional data. The spatial data can be Cartesian coordinate data, polar coordinate data or spherical coordinate data.
According to another embodiment of the invention the sense data is generated at regular intervals of time. According to yet another embodiment of the invention the sense data further comprises sensitivity data, pressure data, image data, area of contact data, number of contact points data, and button data. According to another embodiment of the invention the sense data can be data from a sensor. Examples of the sensor include but are not limited to IR sensor, touch sensor, capacitive touch screen, resistive touch screen, optical sensor, image sensor.
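As an illustration of what such sense data might look like when gathered into one record, the sketch below groups the fields listed above; the field names and types are assumptions made for this example only.

```python
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class SenseData:
    """Hypothetical container for the sense data described above."""
    spatial: Tuple[float, ...]            # 2D or 3D coordinates (Cartesian, polar, spherical)
    timestamp: float                      # sense data generated at regular intervals of time
    pressure: Optional[float] = None      # pressure data, if the sensor provides it
    contact_area: Optional[float] = None  # area of contact data
    contact_points: int = 1               # number of contact points data
    buttons: Tuple[bool, ...] = ()        # button data
    image: Optional[bytes] = None         # image data from an image sensor
```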
The sensing unit 104 transmits the sense data to a processing unit 106. The processing unit 106 calculates a position of the pointer 110 based on the sense data.
Further, the processing unit 106 monitors the sense data to recognize a gesture from a set of predefined gestures. A gesture can be any predefined pattern in the change of sense data as recognized by the sensing unit 104. Further, a gesture can be any predefined pattern in the change of sense data with respect to time as recognized by the sensing unit 104. Also, a gesture may be predefined as keeping a finger still on a touch sensing device for a predefined period of time. Also, a gesture can be defined as pressing a physical button while a finger touches the touch sensitive device. Moreover, a gesture can be defined as a change in the area of contact as sensed by the sensing unit. Examples of gestures include but are not limited to single tap, double tap, pinching, multi finger tap, flick, single stroke and multi strokes made by the user on a touch sensing device, eye movement in an eye tracking device and hand movement in a hand tracking device. Further, a gesture may be a predefined symbol made by the user on a touch sensing device. At least one gesture in the set of predefined gestures is designated as a teleporting gesture. Further, one or more gestures in the set of predefined gestures may be designated as command gestures.
The processing unit 106 calculates the position of the pointer 110 depending on the spatial position of the teleporting gesture, if the teleporting gesture is recognized. The spatial position of the teleporting gesture can be the spatial data sensed by the sensing unit 104 at the start of the gesture, at the end of the gesture or during the gesture. Further, the spatial position of the teleporting gesture can be the average spatial data sensed during the gesture or any function of the spatial data sensed during the gesture. According to an embodiment of the invention, the processing unit 106 calculates the next position of the pointer 110 by applying a mapping function to the spatial position of the teleporting gesture, if the teleporting gesture is recognized.
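One way the teleport target could be derived from the gesture is sketched below, under the assumption that the gesture is available as a list of (x, y) samples; the reduction modes mirror the start/end/average options mentioned above, and all names are hypothetical.

```python
def gesture_position(samples, mode="average"):
    """Reduce the spatial samples recorded during a teleporting gesture to a
    single gesture position.  'samples' is a list of (x, y) points."""
    if mode == "start":
        return samples[0]
    if mode == "end":
        return samples[-1]
    # default: average of all spatial data sensed during the gesture
    n = len(samples)
    return (sum(x for x, _ in samples) / n,
            sum(y for _, y in samples) / n)


def teleport_target(samples, mapping_function):
    """Next pointer position = mapping function applied to the gesture position."""
    gx, gy = gesture_position(samples)
    return mapping_function(gx, gy)
```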
The processing unit 106 executes one or more commands associated with the command gesture, if the command gesture is recognized. Examples of commands include but are not limited to opening a file, selecting an icon, zooming into a picture and so forth.
The processing unit 106 calculates the position of the pointer 110 based on the change in the spatial data as sensed by the sensing unit 104, if no gesture from the predefined set of gestures is recognized. The processing unit 106 may also calculate the position of the pointer 110 based on the rate of change of the spatial data as sensed by the sensing unit 104, if no gesture from the predefined set of gestures is recognized.
According to another embodiment of the invention the position of the pointer 110 is adjusted depending on the positions of other objects of the user interface. For example the position of the pointer 110 may be adjusted to the position of the object nearest to the next position of the pointer 110, if the next position of the pointer 110 is calculated based on the spatial position of the teleporting gesture.
According to an embodiment of the invention, the sensing unit 104 includes an interactive area for sensing spatial input provided by the user. The interactive area of the sensing unit 104 is mapped onto the display area of the display device using a mapping function. The position of the pointer 110 is the result of applying the mapping function to the position of the teleporting gesture. According to an embodiment of the invention, the interactive area of the sensing unit 104 is proportionally mapped to the area of the display device 102.
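A proportional mapping of the interactive area onto the display area could, for example, take the form sketched below; the dimensions and names are assumed for illustration only.

```python
def proportional_map(gx, gy, pad_w, pad_h, disp_w, disp_h):
    """Map a gesture position (gx, gy) on an interactive area of size
    pad_w x pad_h onto a display area of size disp_w x disp_h."""
    return (gx / pad_w * disp_w,
            gy / pad_h * disp_h)


# For instance, a teleporting gesture at (50, 30) on a 100 x 60 touchpad would
# place the pointer at the centre of a 1920 x 1080 display:
# proportional_map(50, 30, 100, 60, 1920, 1080) -> (960.0, 540.0)
```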
According to an embodiment of the invention, a control signal can be provided to the processing unit 106 to designate at least one command gesture as a teleporting gesture. Examples of control signals include but are not limited to pressing a button, performing a command gesture and providing a predefined spatial input. Further, control signals can also be provided to the processing unit 106 from an auxiliary input device such as a keyboard, a joystick or button pads. For example, a tap on a touchpad can be designated as a teleporting gesture if a physical button is pressed, and as a command gesture to select a graphical object if the physical button is not pressed. Further, a tap on the touchpad can be designated as a teleporting gesture if the user is touching the right corner of the touchpad, and as a command gesture if the user is not touching the right corner of the touchpad.
According to an embodiment of the invention, the interactive area of the sensing unit 104 can be mapped to only a first part of the display area on the display device 102 using the mapping function. According to another embodiment of the invention, the interactive area of the sensing unit 104 can be mapped to a second part of the display area on the display device 102 using the mapping function, based on a control signal provided to the processing unit 106. Examples of control signals include but are not limited to pressing a button and performing a command gesture. Further, control signals can also be provided to the processing unit 106 from an auxiliary input device such as a keyboard, a joystick or button pads. For example, the interactive area of the sensing unit can be mapped to the left half of the display area if a physical button is pressed on an auxiliary keyboard, and to the right half of the display area if the physical button is not pressed.
According to another embodiment of the invention, in a multi screen display device the interactive area of the sensing unit can be mapped to the area of display on a first screen when a control signal is present and to the area of display on a second screen when the control signal is not present. According to yet another embodiment of the invention, in a multi screen display device the interactive area of the sensing unit can be mapped to the complete display area when a control signal is present and to the area of display on a first screen when the control signal is not present. According to another embodiment of the invention, the set of gestures comprises a first teleporting gesture and a second teleporting gesture. The interactive area of the sensing unit 104 can be mapped to a first part of the display area on the display device 102 using the mapping function when the first teleporting gesture is recognized. The interactive area of the sensing unit 104 can be mapped to a second part of the display area on the display device 102 using the mapping function when the second teleporting gesture is recognized. For example, the interactive area of the sensing unit can be mapped to the left half of the display area when the first teleporting gesture is recognized and to the right half of the display area when the second teleporting gesture is recognized.
According to another embodiment of the invention, in a multi screen display device the interactive area of the sensing unit can be mapped to the area of display on a first screen when the first teleporting gesture is recognized and to the area of display on a second screen when the second teleporting gesture is recognized. According to yet another embodiment of the invention, in a multi screen display device the interactive area of the sensing unit can be mapped to the complete display area when the first teleporting gesture is recognized and to the area of display on a first screen when the second teleporting gesture is recognized.
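The partial-display mappings described in the preceding paragraphs might be selected as in the following sketch; the gesture labels, the button-based control signal and the half-screen split are assumptions used only to make the idea concrete.

```python
def make_mappings(pad_w, pad_h, disp_w, disp_h):
    """Return mapping functions for the full display area and for its two halves."""
    def full(gx, gy):
        return (gx / pad_w * disp_w, gy / pad_h * disp_h)

    def left_half(gx, gy):
        return (gx / pad_w * (disp_w / 2), gy / pad_h * disp_h)

    def right_half(gx, gy):
        x, y = left_half(gx, gy)
        return (x + disp_w / 2, y)

    return full, left_half, right_half


def select_mapping(gesture, button_pressed, left_half, right_half):
    """Pick the active mapping from the recognized teleporting gesture or,
    failing that, from a button-based control signal."""
    if gesture == "teleport_1":      # first teleporting gesture -> left half
        return left_half
    if gesture == "teleport_2":      # second teleporting gesture -> right half
        return right_half
    return left_half if button_pressed else right_half
```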
According to another embodiment of the invention, the processing unit 106 can be combined with the sensing unit 104 or the display device 102.
FIG. 2 describes another exemplary embodiment illustrating the working of the system 100. In the system 100, the pointer 110 is at the initial position 112 on the display device 102. A user intends to change the position of a graphical object such as the pointer 110 from the initial position 112 to the next position 114.
The user can change the position of the pointer 110 from the initial position 112 to the next position 114 by changing the position of the pointing device on the interactive area of the sensing unit 104 in a direction corresponding to the direction from the initial position 112 to the next position 114. The processing unit 106 will calculate the movement of the pointer 110 according to the change in the spatial data on the interactive area. Further, according to the movement calculated by the processing unit 106, the pointer 110 will move from the initial position 112 to the next position 114. The user can also change the position of the pointer 110 from the initial position 112 to the next position 114 by making a teleporting gesture at position 204 on the interactive area of the sensing unit 104. The processing unit 106 will recognize the teleporting gesture and accordingly the pointer 110 will move directly to a position 202 on the display device 102. The position 202 corresponds to the position 204 on the sensing unit 104. The user can then move the pointer 110 to the position 114 by moving the pointing device with respect to the sensing unit 104 in a direction corresponding to the direction from the position 202 to the position 114.
FIG. 3 describes a block diagram of a system 300 for changing position of an object of a graphical user interface based on spatial input provided by a user, according to an embodiment of the invention. Examples of the system 300 include but are not limited to mobile phones, laptops, desktops, televisions, and handheld devices. The system 300 includes a display device 302. The display device 302 comprises a display area for displaying one or more objects of a graphical user interface such as windows, pointer, icons, images, menus and files. Examples of the display device 302 include but are not limited to monitors, 3D monitors, holographic displays, LCD, multi screen systems, projector screens, etc. The display device 302 displays an object of the graphical user interface called pointer 312. The position of the pointer 312 on the display device 302 can be changed by a user by providing spatial input to a sensing unit 304. Examples of the sensing unit 304 include but are not limited to touchpads, touch screens, multi touch screens, laser pointers, hand movement recognition systems, gesture recognition systems and eye tracking systems.
The sensing unit 304 comprises a spatial sensor system 306 and a sensor control unit 308. The spatial sensor system 306 senses spatial input provided by the user. Examples of sensors include but are not limited to capacitive sensors, resistive sensors, image sensors, thermal sensors, optical sensors, Electroencephalography (EEG) sensors. Further, the spatial sensor system 306 can sense spatial data corresponding to a spatial input provided by the user using a pointing device. Examples of the pointing device include but are not limited to single finger, multiple fingers, other body parts, pen, stylus, laser pointer and so forth. According to an embodiment of the invention, a pointing device can be any selection or control mechanism. The spatial data sensed by the spatial sensor system 306 can be three dimensional or two dimensional data. The sensor control unit 308 generates sense data based on spatial input provided by the user as sensed by the spatial sensor system 306. The spatial data can be Cartesian coordinate data, polar coordinate data or spherical coordinate data. According to one embodiment, the sensor control unit 308 generates sense data based on the change in spatial input provided by the user as sensed by the spatial sensor system 306. According to another embodiment, the sensor control unit 308 generates sense data based on the absolute spatial input provided by the user as sensed by the spatial sensor system 306.
According to another embodiment of the invention, the sensor control unit 308 generates sense data at regular intervals of time. According to yet another embodiment of the invention the sense data further comprises sensitivity data, pressure data, image data, area of contact data, number of contact points data, and button data.
Further, the sensor control unit 308 monitors the sense data to recognize a gesture from a set of predefined gestures. A gesture can be any predefined pattern in the change of sense data. Further, a gesture can be any predefined pattern in the change of sense data with respect to time. Examples of gestures include but are not limited to single tap, double tap, pinching, multi finger tap, flick, single stroke and multi strokes made by the user on a touch sensing device, eye movement in an eye tracking device and hand movement in a hand tracking device. Further, a gesture may be a predefined symbol made by the user on a touch sensing device. Further, a gesture may be predefined as keeping a finger still on a touch sensing device for a predefined period of time. Also, a gesture can be defined as pressing a physical button while a finger touches the touch sensitive device. Also, a gesture can be defined as a change in the area of contact as sensed by the sensing unit.
At least one gesture in the set of predefined gestures is designated as a teleporting gesture. Further, one or more gestures in the set of predefined gestures may be designated as command gestures.
The sensor control unit 308 generates gesture data depending on the gesture recognized. The sensor control unit 308 provides the sense data and the gesture data to the processing unit 310. The processing unit 310 calculates a position of the pointer 312 based on the sense data.
The processing unit 310 calculates the position of the pointer 312 depending on the spatial position of the teleporting gesture, if gesture data corresponding to the teleporting gesture is provided by the sensor control unit 308. The spatial position of the teleporting gesture can be the spatial data sensed by the spatial sensor system 306 at the start of the gesture, at the end of the gesture or during the gesture. Further, the spatial position of the teleporting gesture can be the average spatial data sensed during the gesture or any function of the spatial data sensed during the gesture. According to an embodiment of the invention, the processing unit 310 calculates the next position of the pointer 312 by applying a mapping function to the spatial position of the teleporting gesture, if the teleporting gesture is recognized.
The processing unit 310 executes one or more commands associated with a command gesture, if the gesture data corresponding to the command gesture is provided by the sensor control unit 308. Examples of commands include but are not limited to opening a file, selecting an icon, zooming into a picture and so forth.
The processing unit 310 calculates the position of the pointer 312 based on the change in the spatial data as sensed by the spatial sensor system 306, if no gesture from the predefined set of gestures is recognized. The processing unit 310 may also calculate the position of the pointer 312 based on the rate of change of the spatial data as sensed by the spatial sensor system 306, if no gesture from the predefined set of gestures is recognized.
According to an embodiment of the invention, the sensor control unit 308 generates sense data comprising spatial data corresponding to absolute spatial input provided by the user, if the teleporting gesture is detected. According to another embodiment of the invention, the sensor control unit 308 generates sense data comprising spatial data corresponding to relative change in spatial input provided by the user, if no gesture from the predefined set of gestures is recognized.
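A minimal sketch of the packet such a sensor control unit might emit, under the assumption of a simple dictionary format, is shown below; only a teleporting gesture carries absolute coordinates, while ordinary motion is reported as a relative change.

```python
def build_sense_packet(gesture, current_xy, previous_xy):
    """Hypothetical sense-data packet produced by the sensor control unit."""
    if gesture == "teleport":
        # absolute spatial input: needed to place the pointer directly
        return {"gesture": "teleport", "spatial": current_xy}
    dx = current_xy[0] - previous_xy[0]
    dy = current_xy[1] - previous_xy[1]
    # relative change: enough for ordinary pointer movement, and far less data
    return {"gesture": gesture, "spatial": (dx, dy)}
```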
According to an embodiment of the invention, the sense data from the sensor control unit 308 is transferred to the processing unit 310 using a wireless link. Examples of the wireless link include but are not limited to Bluetooth, Wi-Fi, Radio Frequency link, Infrared (IR), microwave link and so forth.
FIG. 4 illustrates a flowchart of a method for controlling a graphical user interface according to an embodiment of the invention. According to an embodiment of the present invention, the method for controlling a graphical user interface is implemented by a system including a user, a sensing unit, and a processing unit. The user provides spatial input to the system for controlling the graphical user interface. The sensing unit provides functionalities similar to sensing unit 104 as described in FIG. 1. The processing unit provides functionalities similar to processing unit 106 as described in FIG. 1.
The method for controlling the graphical object is initiated at step 402. At step 404, spatial input provided by the user is sensed by the sensing unit. Thereafter step 406 is performed.
At step 406, sense data is generated based on the spatial input. The sense data is generated by the sensing unit. After step 406, step 408 is performed.
At step 408, it is determined whether gestures of a set of predefined gestures are recognized from the sense data. Step 408 is performed by the processing unit by analyzing the sense data. If the gestures are not recognized at step 408, step 410 is performed. At step 410, a position of an object of the graphical user interface is calculated based on a change in the sense data. The position of the object is calculated at step 410 by the processing unit.
If the gestures are recognized at step 408, step 412 is performed. At step 412, it is determined if the gestures identified from the sense data include teleporting gestures. Step 412 is performed by the processing unit. If the gestures identified from the sense data do not include the teleporting gestures, step 414 is performed. At step 414, commands associated with the gestures identified from the sense data are executed by the processing unit.
If the gestures identified from the sense data include the teleporting gestures, step 416 is performed. At step 416, the position of the object of the graphical user interface is calculated based on the sense data corresponding to the teleporting gestures. The position of the object is calculated at step 416 by the processing unit.
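The decision structure of steps 408 to 416 can be summarized by the following non-limiting sketch. All helper callables (recognize_gestures, is_teleporting, and so on) are assumptions supplied by the caller; the function only mirrors the branching of the flowchart.

```python
def process_sense_data(sense_data, pointer, recognize_gestures,
                       is_teleporting, execute_command,
                       position_from_teleport, position_from_change):
    """One pass through steps 408-416 of FIG. 4.

    All helpers are supplied by the caller; this function only mirrors
    the decision structure of the method."""
    gestures = recognize_gestures(sense_data)             # step 408
    if not gestures:
        return position_from_change(pointer, sense_data)  # step 410
    if any(is_teleporting(g) for g in gestures):           # step 412
        return position_from_teleport(sense_data)          # step 416
    for gesture in gestures:                                # step 414
        execute_command(gesture)
    return pointer


# Minimal usage with stubbed helpers: no gesture is recognized, so the
# pointer simply moves by the relative change carried in the sense data.
new_position = process_sense_data(
    sense_data={"delta": (5, -3)},
    pointer=(100, 100),
    recognize_gestures=lambda sd: [],
    is_teleporting=lambda g: g == "teleport",
    execute_command=lambda g: None,
    position_from_teleport=lambda sd: sd["absolute"],
    position_from_change=lambda p, sd: (p[0] + sd["delta"][0],
                                        p[1] + sd["delta"][1]),
)
print(new_position)  # (105, 97)
```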
According to another embodiment of the invention, dragging of a graphical object can be performed by identifying the position of the teleporting gesture and moving the graphical object on the screen according to the position of the teleporting gesture.
According to yet another embodiment of the invention, the position of the gesture is calculated as a distance in X direction and Y direction from a point of origin of the sensing unit. Other references and methods can also be used to calculate the position of the teleporting gesture.
According to yet another embodiment of the invention, a gesture is recognized if the subsequent spatial positions of a pointing device with respect to a sensing unit, as provided in the sense data, satisfy predetermined recognition criteria. According to yet another embodiment of the invention, the predetermined recognition criteria for recognition of a gesture can be automatically adjusted according to the usage of a user.
According to yet another embodiment of the invention, the teleporting gesture is recognized as one or more subsequent arcs made by the user on the sensing unit, based on at least one of the length of the arcs, the time for each arc, the no-touch time between the arcs, and the number of contact points.
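A recognizer of this kind might compare each arc against simple thresholds, as in the illustrative sketch below. The Arc structure and every threshold value are assumptions chosen for the example, not values taken from the specification.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class Arc:
    """One stroke sensed on the touch pad (illustrative structure)."""
    length: float        # arc length, in pad units
    duration: float      # time taken to draw the arc, in seconds
    gap_after: float     # no-touch time before the next arc, in seconds
    contact_points: int  # simultaneous contact points during the arc


def is_teleporting_gesture(arcs: List[Arc],
                           min_length: float = 10.0,
                           max_duration: float = 0.5,
                           max_gap: float = 0.3,
                           contacts: int = 1) -> bool:
    """Recognize a teleporting gesture from one or more subsequent arcs
    using the criteria named above: arc length, time for each arc,
    no-touch time between arcs and number of contact points.
    The threshold values are arbitrary examples."""
    if not arcs:
        return False
    return all(arc.length >= min_length
               and arc.duration <= max_duration
               and arc.gap_after <= max_gap
               and arc.contact_points == contacts
               for arc in arcs)


# Two quick arcs of sufficient length drawn with a single finger.
print(is_teleporting_gesture([Arc(15.0, 0.2, 0.1, 1), Arc(12.0, 0.25, 0.0, 1)]))
```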
According to yet another embodiment of the invention, the parameters for finding the next position of the pointer from the position of the gesture can be automatically adjusted according to the usage of the user over a period of time.
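One simple, assumed scheme for such automatic adjustment is to observe the correction a user makes immediately after each teleport and nudge the mapping scale accordingly, as sketched below; the update rule and the learning rate are illustrative choices only.

```python
def adapt_scale(scale, residual, teleport_distance, alpha=0.05):
    """Nudge the mapping scale according to observed usage.

    `residual` is the extra pointer movement the user makes right after
    a teleport (positive when the teleport fell short of where the user
    finally stopped); its ratio to the teleport distance indicates how
    far the current scale undershoots or overshoots.  The update rule
    and the learning rate `alpha` are assumptions of this sketch."""
    if teleport_distance == 0:
        return scale
    error = residual / teleport_distance
    return scale * (1.0 + alpha * error)


scale = 1.0
for residual in (20, 18, 15):   # three teleports that each fell a bit short
    scale = adapt_scale(scale, residual, teleport_distance=200)
print(round(scale, 4))          # the scale drifts slightly upward
```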
According to yet another embodiment of the invention, a feedback is provided upon recognition of a gesture. The feedback can be visual or in the form of audio.
According to yet another embodiment of the invention, recognition of the teleporting gesture at specific spatial positions can be used to initiate a process. For example, a teleporting gesture on the right edge of the sensing unit may initiate a scroll process. Further, a change in sound volume can also be initiated by sensing a teleporting gesture in a specified area of the touch pad.
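The following non-limiting sketch shows how specific regions of the sensing unit could be tied to processes such as scrolling or volume control. The region boundaries and the returned action labels are assumptions made for the example.

```python
def action_for_teleport(x, y, pad_width, pad_height, edge=0.1):
    """Decide whether a teleporting gesture at pad position (x, y)
    should start a special process instead of repositioning the pointer.

    A strip along the right edge starts a scroll process and a square
    region in the top-left corner adjusts the sound volume; the regions
    and the 10% edge width are illustrative choices only."""
    if x >= pad_width * (1.0 - edge):
        return "scroll"
    if x <= pad_width * edge and y <= pad_height * edge:
        return "volume"
    return "move_pointer"


print(action_for_teleport(95, 30, 100, 60))  # "scroll"
print(action_for_teleport(50, 30, 100, 60))  # "move_pointer"
```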
According to yet another embodiment of the invention, the sensing unit can be a camera or infrared sensor for sensing movements and gestures made by a user to change the position of the graphical object. Examples of the movements and gestures include but are not limited to movements and gestures made by fingers, eyes and head of the user.
According to yet another embodiment of the invention, the sensing unit can be a brain-computer interface (BCI), a direct neural interface, or a brain-machine interface. Further, input to the sensing unit can be provided directly with the help of signals or messages from the brain or nervous system of the user. According to yet another embodiment of the invention, the sensing unit can be a combination of more than one type of sensing unit. For example, a teleporting gesture can be recognized using a brain-computer interface (BCI) and the pointer can also be moved with the help of a touchpad in relative mode.
According to yet another embodiment of the invention, only a part of the display device can be mapped to the sensing unit.
According to yet another embodiment of the invention, the display device can be mapped to only a part of the sensing unit.
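Both of these partial mappings reduce to transforming a point from one rectangle into another, as in the assumed sketch below; the rectangle representation and the example dimensions are illustrative only.

```python
def map_rect_to_rect(x, y, src, dst):
    """Map a point inside the source rectangle onto the destination
    rectangle.  Rectangles are given as (left, top, width, height).

    With `src` covering only part of the sensing unit, or `dst` covering
    only part of the display device, this realizes the partial mappings
    described above."""
    sx, sy, sw, sh = src
    dx, dy, dw, dh = dst
    return (dx + (x - sx) * dw / sw,
            dy + (y - sy) * dh / sh)


# Map the left half of a 100 x 60 pad onto a full 1920 x 1080 display.
print(map_rect_to_rect(25, 30, (0, 0, 50, 60), (0, 0, 1920, 1080)))
# -> (960.0, 540.0)
```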
The invention provides a novel system and method that uses teleporting gestures sensed by a sensing unit to move a pointer over long distances on a display device. The method reduces the time and movement required to move the pointer over long distances on the display device. Further, the invention can reduce the required size of a sensing device without compromising the accuracy of the sensing device.
While embodiments of the invention have been illustrated and described, it is not intended that these embodiments illustrate and describe all possible forms of the invention. Rather, the words used in the specification are words of description rather than limitations, and it is understood that various changes may be made without departing from the spirit and scope of the invention.
An implementation of the invention may be stored on or transmitted across some form of computer readable media. Computer readable media may be available media that may be accessed by a computer. By way of example, computer readable media may comprise, but is not limited to, "computer storage media" and "communications media".

Claims

What is claimed is:
1. A system for controlling a graphical user interface, the system comprising: a sensing unit, the sensing unit sensing spatial input and providing a sense data based on the spatial input;
a processing unit, the processing unit calculating a position of at least one object of the graphical user interface based on the sense data and recognizing one or more gestures of a set of gestures, the one or more gestures recognized from the sense data, the set of gestures comprising at least one teleporting gesture; and
a display area, the display area displaying the at least one object of the graphical user interface at the position calculated by the processing unit;
wherein the position of the at least one object of the graphical user interface is calculated based on the sense data corresponding to the at least one teleporting gesture when the at least one teleporting gesture is recognized,
wherein the position of the at least one object of the graphical user interface is calculated based on a change in the sense data when no gesture from the set of gestures is recognized.
2. The system according to claim 1, wherein the processing unit further executes a command when the one or more gestures from the set of gestures is recognized and the one or more gestures does not include the at least one teleporting gesture.
3. The system according to claim 1, wherein the processing unit further designates a gesture from the set of gestures as a teleporting gesture based on a control signal.
4. The system according to claim 1, wherein the processing unit calculates the position of the at least one object of the graphical user interface based on positions of objects of the graphical user interface.
5. The system according to claim 1, wherein the sensing unit further comprises an interactive area.
6. The system according to claim 5, wherein the position of the at least one object of the graphical user interface is calculated based on the sense data corresponding to the at least one teleporting gesture by proportionally mapping the interactive area to the display area.
7. The system according to claim 5, wherein the position of the at least one object of the graphical user interface is calculated based on the sense data corresponding to the at least one teleporting gesture by mapping the sense area to a first part of the display area.
8. The system according to claim 7, wherein the position of the at least one object is calculated by mapping a second part of the display area to the interactive area of the sensing unit based on a control signal provided to the processing device.
9. A system for controlling a graphical user interface, the system comprising: a sensing unit, the sensing unit sensing spatial input and providing a sense data based on the spatial input, the sensing unit comprising:
a spatial sensor system, the spatial sensor system sensing the spatial input; and
a sensor control unit, the sensor control unit providing sense data based on the spatial input and recognizing one or more gestures of a set of gestures, the set of gestures comprising at least one teleporting gesture;
a processing unit, the processing unit calculating a position of at least one object of the graphical user interface based on the sense data; and a display area, the display area displaying the at least one object of the graphical user interface at the position calculated by the processing unit;
wherein the position of the object of the graphical user interface is calculated based on the sense data corresponding to the teleporting gesture when the at least one teleporting gesture is recognized,
wherein the position of the at least one object of the graphical user interface is calculated based on a change in the sense data when no gesture from the set of gestures is recognized.
10. A system for controlling a graphical user interface, the system comprising: a sensing unit, the sensing unit sensing spatial input and providing a sense data based on the spatial input, the sensing unit comprising:
a spatial sensor system, the spatial sensor system sensing the spatial input; and
a sensor control unit, the sensor control unit providing sense data based on the spatial input and recognizing one or more gestures of a set of gestures, the set of gestures comprising at least one teleporting gesture, the sensor control unit providing the sense data based on the spatial input corresponding to the at least one teleporting gesture when the at least one teleporting gesture is recognized, the sensor control unit providing the sense data based on change in the spatial input when no gesture from the set of gestures is recognized;
a processing unit, the processing unit calculating a position of at least one object of the graphical user interface based on the sense data; and
a display area, the display area displaying the at least one object of the graphical user interface at the position calculated by the processing unit;
11. The system according to claim 10, wherein the sensor control unit provides the sense data using a wireless link.
12. A method for controlling a graphical user interface, the method comprising: a. sensing spatial input; b. recognizing one or more gestures of a set of gestures, the one or more gestures recognized from a sense data, the sense data being based on the spatial input, the set of gestures comprising at least one teleporting gesture; and c. calculating a position of at least one object of the graphical user interface based on the sense data,
wherein the position of the at least one object of the graphical user interface is calculated based on the sense data corresponding to the at least one teleporting gesture when the at least one teleporting gesture is recognized,
wherein the position of the at least one object of the graphical user interface is calculated based on a change in the sense data when no gesture from the set of gestures is recognized.
13. The method according to claim 12, wherein a gesture from the set of gestures is designated as a teleporting gesture based on a control signal.
14. The method according to claim 13, wherein the position of the at least one object of the graphical user interface is calculated based on positions of objects of the graphical user interface.
15. The method according to claim 13, wherein the spatial data is sensed on an interactive area.
16. The method according to claim 15, wherein calculating the position of the at least one object of the graphical user interface comprises proportionally mapping the interactive area to a display area.
17. The method according to claim 16, wherein calculating the position of the at least one object of the graphical user interface comprises mapping the sense area to a first part of the display area.
18. The method according to claim 17, wherein calculating the position of the at least one object further comprises mapping a second part of the display area to the interactive area of the sensing unit based on a control signal provided to the processing device.
19. The method according to claim 13, further comprising executing a command when the one or more gestures from the set of gestures is recognized and the one or more gestures does not include the at least one teleporting gesture.
PCT/IN2009/000511 2008-09-19 2009-09-18 System and method for controlling graphical objects WO2010032268A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN2197/DEL/2008 2008-09-19
IN2197DE2008 2008-09-19

Publications (2)

Publication Number Publication Date
WO2010032268A2 true WO2010032268A2 (en) 2010-03-25
WO2010032268A3 WO2010032268A3 (en) 2010-12-02

Family

ID=42039975

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IN2009/000511 WO2010032268A2 (en) 2008-09-19 2009-09-18 System and method for controlling graphical objects

Country Status (1)

Country Link
WO (1) WO2010032268A2 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060036944A1 (en) * 2004-08-10 2006-02-16 Microsoft Corporation Surface UI for gesture-based interaction
US20070291009A1 (en) * 2006-06-19 2007-12-20 Cypress Semiconductor Corporation Apparatus and method for detecting a touch-sensor pad gesture
WO2008000435A1 (en) * 2006-06-26 2008-01-03 Uiq Technology Ab Browsing responsive to speed of gestures on contact sensitive display
US20080165140A1 (en) * 2007-01-05 2008-07-10 Apple Inc. Detecting gestures on multi-event sensitive devices
US20080168403A1 (en) * 2007-01-06 2008-07-10 Apple Inc. Detecting and interpreting real-world and security gestures on touch and hover sensitive devices

Cited By (50)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10410411B2 (en) 2012-01-17 2019-09-10 Leap Motion, Inc. Systems and methods of object shape and position determination in three-dimensional (3D) space
US9697643B2 (en) 2012-01-17 2017-07-04 Leap Motion, Inc. Systems and methods of object shape and position determination in three-dimensional (3D) space
US10366308B2 (en) 2012-01-17 2019-07-30 Leap Motion, Inc. Enhanced contrast for object detection and characterization by optical imaging based on differences between images
US9436998B2 (en) 2012-01-17 2016-09-06 Leap Motion, Inc. Systems and methods of constructing three-dimensional (3D) model of an object using image cross-sections
US9070019B2 (en) 2012-01-17 2015-06-30 Leap Motion, Inc. Systems and methods for capturing motion in three-dimensional space
US9495613B2 (en) 2012-01-17 2016-11-15 Leap Motion, Inc. Enhanced contrast for object detection and characterization by optical imaging using formed difference images
US11782516B2 (en) 2012-01-17 2023-10-10 Ultrahaptics IP Two Limited Differentiating a detected object from a background using a gaussian brightness falloff pattern
US11720180B2 (en) 2012-01-17 2023-08-08 Ultrahaptics IP Two Limited Systems and methods for machine control
US9626591B2 (en) 2012-01-17 2017-04-18 Leap Motion, Inc. Enhanced contrast for object detection and characterization by optical imaging
US11308711B2 (en) 2012-01-17 2022-04-19 Ultrahaptics IP Two Limited Enhanced contrast for object detection and characterization by optical imaging based on differences between images
US9652668B2 (en) 2012-01-17 2017-05-16 Leap Motion, Inc. Enhanced contrast for object detection and characterization by optical imaging based on differences between images
US9672441B2 (en) 2012-01-17 2017-06-06 Leap Motion, Inc. Enhanced contrast for object detection and characterization by optical imaging based on differences between images
US9679215B2 (en) 2012-01-17 2017-06-13 Leap Motion, Inc. Systems and methods for machine control
US9153028B2 (en) 2012-01-17 2015-10-06 Leap Motion, Inc. Systems and methods for capturing motion in three-dimensional space
US10767982B2 (en) 2012-01-17 2020-09-08 Ultrahaptics IP Two Limited Systems and methods of locating a control object appendage in three dimensional (3D) space
US10699155B2 (en) 2012-01-17 2020-06-30 Ultrahaptics IP Two Limited Enhanced contrast for object detection and characterization by optical imaging based on differences between images
US9767345B2 (en) 2012-01-17 2017-09-19 Leap Motion, Inc. Systems and methods of constructing three-dimensional (3D) model of an object using image cross-sections
US9778752B2 (en) 2012-01-17 2017-10-03 Leap Motion, Inc. Systems and methods for machine control
US10691219B2 (en) 2012-01-17 2020-06-23 Ultrahaptics IP Two Limited Systems and methods for machine control
US9934580B2 (en) 2012-01-17 2018-04-03 Leap Motion, Inc. Enhanced contrast for object detection and characterization by optical imaging based on differences between images
US9945660B2 (en) 2012-01-17 2018-04-17 Leap Motion, Inc. Systems and methods of locating a control object appendage in three dimensional (3D) space
US9741136B2 (en) 2012-01-17 2017-08-22 Leap Motion, Inc. Systems and methods of object shape and position determination in three-dimensional (3D) space
US10565784B2 (en) 2012-01-17 2020-02-18 Ultrahaptics IP Two Limited Systems and methods for authenticating a user according to a hand of the user moving in a three-dimensional (3D) space
US9285893B2 (en) 2012-11-08 2016-03-15 Leap Motion, Inc. Object detection and tracking with variable-field illumination devices
US10609285B2 (en) 2013-01-07 2020-03-31 Ultrahaptics IP Two Limited Power consumption in motion-capture systems
US9465461B2 (en) 2013-01-08 2016-10-11 Leap Motion, Inc. Object detection and tracking with audio and optical signals
US9626015B2 (en) 2013-01-08 2017-04-18 Leap Motion, Inc. Power consumption in motion-capture systems with audio and optical signals
US10097754B2 (en) 2013-01-08 2018-10-09 Leap Motion, Inc. Power consumption in motion-capture systems with audio and optical signals
US10739862B2 (en) 2013-01-15 2020-08-11 Ultrahaptics IP Two Limited Free-space user interface and control using virtual constructs
US11874970B2 (en) 2013-01-15 2024-01-16 Ultrahaptics IP Two Limited Free-space user interface and control using virtual constructs
US11353962B2 (en) 2013-01-15 2022-06-07 Ultrahaptics IP Two Limited Free-space user interface and control using virtual constructs
US11243612B2 (en) 2013-01-15 2022-02-08 Ultrahaptics IP Two Limited Dynamic, free-space user interactions for machine control
US10139918B2 (en) 2013-01-15 2018-11-27 Leap Motion, Inc. Dynamic, free-space user interactions for machine control
US11740705B2 (en) 2013-01-15 2023-08-29 Ultrahaptics IP Two Limited Method and system for controlling a machine according to a characteristic of a control object
US10042430B2 (en) 2013-01-15 2018-08-07 Leap Motion, Inc. Free-space user interface and control using virtual constructs
US9501152B2 (en) 2013-01-15 2016-11-22 Leap Motion, Inc. Free-space user interface and control using virtual constructs
US10585193B2 (en) 2013-03-15 2020-03-10 Ultrahaptics IP Two Limited Determining positional information of an object in space
US9702977B2 (en) 2013-03-15 2017-07-11 Leap Motion, Inc. Determining positional information of an object in space
US11693115B2 (en) 2013-03-15 2023-07-04 Ultrahaptics IP Two Limited Determining positional information of an object in space
US11099653B2 (en) 2013-04-26 2021-08-24 Ultrahaptics IP Two Limited Machine responsiveness to dynamic user movements and gestures
US9916009B2 (en) 2013-04-26 2018-03-13 Leap Motion, Inc. Non-tactile interface systems and methods
US10452151B2 (en) 2013-04-26 2019-10-22 Ultrahaptics IP Two Limited Non-tactile interface systems and methods
US10846942B1 (en) 2013-08-29 2020-11-24 Ultrahaptics IP Two Limited Predictive information for free space gesture control and communication
US11776208B2 (en) 2013-08-29 2023-10-03 Ultrahaptics IP Two Limited Predictive information for free space gesture control and communication
US11282273B2 (en) 2013-08-29 2022-03-22 Ultrahaptics IP Two Limited Predictive information for free space gesture control and communication
US11461966B1 (en) 2013-08-29 2022-10-04 Ultrahaptics IP Two Limited Determining spans and span lengths of a control object in a free space gesture control environment
US11775033B2 (en) 2013-10-03 2023-10-03 Ultrahaptics IP Two Limited Enhanced field of view to augment three-dimensional (3D) sensory space for free-space gesture interpretation
US11868687B2 (en) 2013-10-31 2024-01-09 Ultrahaptics IP Two Limited Predictive information for free space gesture control and communication
US9613262B2 (en) 2014-01-15 2017-04-04 Leap Motion, Inc. Object detection and tracking for providing a virtual device experience
US11778159B2 (en) 2014-08-08 2023-10-03 Ultrahaptics IP Two Limited Augmented reality with motion sensing

Also Published As

Publication number Publication date
WO2010032268A3 (en) 2010-12-02

Similar Documents

Publication Publication Date Title
WO2010032268A2 (en) System and method for controlling graphical objects
US10402042B2 (en) Force vector cursor control
EP2972669B1 (en) Depth-based user interface gesture control
EP2575006B1 (en) Touch and non touch based interaction of a user with a device
US9841827B2 (en) Command of a device by gesture emulation of touch gestures
US20170293351A1 (en) Head mounted display linked to a touch sensitive input device
US20120127070A1 (en) Control signal input device and method using posture recognition
US20110060986A1 (en) Method for Controlling the Display of a Touch Screen, User Interface of the Touch Screen, and an Electronic Device using The Same
CN102693035A (en) Modal touch input
EP2575007A1 (en) Scaling of gesture based input
WO2017222397A1 (en) Computer mouse
JP2013539580A (en) Method and apparatus for motion control on device
US20190187887A1 (en) Information processing apparatus
US10956030B2 (en) Multi-touch based drawing input method and apparatus
KR102297473B1 (en) Apparatus and method for providing touch inputs by using human body
US20120249417A1 (en) Input apparatus
KR20160097410A (en) Method of providing touchless input interface based on gesture recognition and the apparatus applied thereto
US20140298275A1 (en) Method for recognizing input gestures
WO2009119716A1 (en) Information processing system, information processing device, method, and program
EP3433713B1 (en) Selecting first digital input behavior based on presence of a second, concurrent, input
US9940900B2 (en) Peripheral electronic device and method for using same
US10133346B2 (en) Gaze based prediction device and method
KR20140110262A (en) Portable device and operating method using cursor
US9454248B2 (en) Touch input method and electronic apparatus thereof
KR20140083303A (en) Method for providing user interface using one point touch, and apparatus therefor

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09814187

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09814187

Country of ref document: EP

Kind code of ref document: A2