WO2013175341A2 - Method and apparatus for controlling multiple devices - Google Patents

Method and apparatus for controlling multiple devices Download PDF

Info

Publication number
WO2013175341A2
WO2013175341A2 (PCT/IB2013/053884)
Authority
WO
WIPO (PCT)
Prior art keywords
zone
user
gesture
controlling
control signal
Prior art date
Application number
PCT/IB2013/053884
Other languages
French (fr)
Other versions
WO2013175341A3 (en)
Inventor
Jimmy ZHONG
Jia Wei
Original Assignee
Koninklijke Philips N.V.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips N.V. filed Critical Koninklijke Philips N.V.
Publication of WO2013175341A2 publication Critical patent/WO2013175341A2/en
Publication of WO2013175341A3 publication Critical patent/WO2013175341A3/en

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G08SIGNALLING
    • G08CTRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C17/00Arrangements for transmitting signals characterised by the use of a wireless electrical link
    • GPHYSICS
    • G08SIGNALLING
    • G08CTRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C2201/00Transmission systems of control signals via wireless link
    • G08C2201/30User interface
    • G08C2201/32Remote control based on movements, attitude of remote control device

Definitions

  • the invention relates to gesture control and, more particularly, to controlling at least two devices via gesture control.
  • a laptop is used as a primary device for web browsing and a portable media player connected with the laptop is used as a secondary device for viewing videos online.
  • US2011/0119640A1 discloses techniques for scaling and translating gestures such that the applicable gestures for control may vary depending on the user's distance from a gesture-based system.
  • the physical space is divided into virtual zones of interaction, and the system may scale or translate a gesture based on the zones.
  • it would be advantageous to provide a gesture control solution which can control multiple devices. It would also be desirable that the gesture control solution not require the users to remember too many gestures or to move in a very large area.
  • an apparatus for controlling a first device and/or a second device comprises:
  • a second unit for identifying the user's gesture and a position of the user's gesture among the first zone, the second zone and the third zone on the basis of the obtained data
  • a third unit for generating, on the basis of the user's gesture, the position of the user's gesture and a control signal mapping table, a control signal for controlling the first device and/or the second device.
  • the basic idea is to have a zone corresponding to all of the devices in addition to having multiple zones corresponding to each device respectively.
  • the user can interact with both the first and second devices with the same or different sets of gestures so that the user need not move in a large scale area to interact with the devices.
  • the user can interact with the first and second devices respectively with the same set of gestures so that the user need not remember lots of gestures for interaction with the devices. In this way, by properly defining the interaction gestures in these three zones, a balance between fewer gestures and a small movement area can be achieved.
  • the first device comprising the above described apparatus for controlling the first device and/or a second device is provided.
  • a system comprising a first device, a second device and the above described apparatus for controlling the first device and/or the second device is provided.
  • a method of controlling a first device and/or a second device comprises the steps of: obtaining data representing a physical place where there is a user and a user's gesture in the physical space, the physical place comprising a first zone associated with the first device, a second zone associated with the second device and a third zone associated with the first device and the second device;
  • a set of computer-executable instructions configured to perform the above described method is provided.
  • Fig. 1 depicts a schematic diagram of an apparatus in accordance with an embodiment of the present invention.
  • Fig. 2a and Fig. 2b depict schematic diagrams of the zones associated with the devices in accordance with an embodiment of the present invention.
  • Fig. 3 depicts a flow chart of a method in accordance with an embodiment of the present invention.
  • Fig. 1 depicts a schematic diagram of an apparatus in accordance with an embodiment of the present invention.
  • an apparatus for controlling a first device and/or a second device is provided.
  • the apparatus 100 comprises a first unit 110, a second unit 120 and a third unit 130.
  • the first and second devices can be any kind of device, such as desktop computers, laptop computers, portable media players, mobile phones, etc.
  • the first device is a computer and the second device is a digital photo frame or a portable media player which can play 3D content.
  • the first device can be connected with the second device via wire or wireless interface so that they can share the same information.
  • the first device can also not be connected with the second device.
  • the first unit 110 is for obtaining data representing a physical place where there is a user and the user's gesture in the physical space.
  • the physical place comprises a first zone associated with the first device, a second zone associated with the second device and a third zone associated with the first device and the second device.
  • Each of the first, second and third zones represents a three-dimensional physical space.
  • the data representing the physical place comprises both the data of the physical space and the data of the user in the physical place. That is to say, the obtained data represents the physical space, the user in the physical space and the user's gesture in the physical space.
  • the gesture may include any kinds of user motion, dynamic or static, such as running, jumping, moving a finger, waving a hand or a static pose.
  • the first unit can be implemented in many ways.
  • the first unit is a receiver for receiving the data from other units which can capture the data.
  • the first unit comprises a camera for capturing image data of the physical place and the user's motion. The image data of the user's motion represents the user's gesture.
  • the first unit comprises an accelerometer and a GPS module.
  • the accelerometer is attached to the body of the user for measuring the acceleration of a part of the user and the measured acceleration represents the user's gesture.
  • the GPS module is attached to the body of the user to measure the position of a part of the user, and the measured position represents the user in the physical space. It is known to the person skilled in the art how to obtain the data representing the physical space, the user in the physical space and the user's gesture in the physical space, and it will not be elaborated here.
  • the second unit 120 is for identifying the user's gesture and a position of the user's gesture among the first zone, the second zone and the third zone on the basis of the obtained data.
  • the identified position can be in one of the first zone, the second zone and the third zone.
  • the second unit 120 can be implemented in many ways.
  • the data representing the user's gesture is captured image data.
  • the second unit identifies the user's gesture by comparing the captured image data with pre-stored gesture information, and identifies the position of the user's gesture by comparing the captured image data with pre-stored physical space information.
  • the data representing the user's gesture is the measured acceleration
  • the second unit identifies the user's gesture based on the measured acceleration and pre-stored gesture information.
  • the data representing the user in the physical space is the measured position
  • the second unit identifies the position of the user's gesture based on the measured position
  • the third unit 130 is for generating, on the basis of the user's gesture, the position of the user's gesture and a control signal mapping table, a control signal for controlling the first device and/or the second device.
  • the control signal mapping table is about the mapping relationship among multiple control signals, multiple gestures and multiple positions of the user's gestures.
  • the control signal mapping table can be pre-stored or generated according to the user's input.
  • the third unit 130 can be implemented in many ways.
  • if the identified position is in the first zone associated with the first device, the control signal is for controlling the first device; if the identified position is in the second zone associated with the second device, the control signal is for controlling the second device; and if the identified position is in the third zone associated with the first device and the second device, the control signal is for controlling both the first device and the second device. For example, waving a hand in the first zone is to close the first device; waving a hand in the second zone is to close the second device; and waving a hand in the third zone is to close both the first and second devices.
  • if the identified position is in the first zone associated with the first device, the control signal is for controlling the first device; if the identified position is in the second zone associated with the second device, the control signal is for controlling the second device; and if the identified position is in the third zone associated with the first device and the second device, the control signal is for controlling one of the first device and the second device. For example, waving a hand up and down in the first zone is to close the first device; waving a hand up and down in the second zone is to close the second device; waving a hand up and down in the third zone is to close the first device; waving a hand left and right in the third zone is to close the second device.
  • the user can interact with the first and the second device separately with the same set of gestures in the first zone and the second zone so that the user need not remember lots of gestures for interaction with the devices, and the user can interact with both the first and second devices with the same or different sets of gestures in the third zone so that the user need not move in a large scale area to interact with the devices.
  • if the identified position is in the third zone, the control signal is for data transmission between the first device and the second device. For example, waving a hand from left to right is to send data from the first device to the second device, and waving a hand from right to left is to send data from the second device to the first device. For another example, waving a hand from left to right is to make the second device play the content stored in the first device.
  • if the identified position is in the third zone, the control signal is for controlling both the first and second devices.
  • the control signal may serve many kinds of applications, such as adjusting the audio volume or adjusting the displayed fonts. For example, waving a hand downward is to close both the first device and the second device, and waving a hand upward is to open both the first device and the second device.
  • if the identified position is in the third zone, the control signal is for controlling one of the first device and the second device. For example, waving up and down in the third zone is to adjust the brightness of the first device, and waving left and right in the third zone is to adjust the brightness of the second device.
  • the zones can be associated with the first and second devices in many ways.
  • Fig. 2a and Fig. 2b depict schematic diagrams of the zones associated with the devices in accordance with an embodiment of the present invention.
  • the third zone 250 associated with the first device 210 and the second device 220 is between the first zone 230 associated with the first device 210 and the second zone 240 associated with the second device 220. In this way, it would be easy for the user to remember the position of the zones.
  • the first zone 231 associated with the first device 210, the second zone 241 associated with the second device 220 and the third zone 251 associated with the first device 210 and the second device 220 are laid out as shown in Fig. 2b.
  • the first device comprising the above described apparatus for controlling the first device and/or a second device. That is to say, the apparatus for controlling the first device and/or the second device is embedded in the first device.
  • the first device can be the primary device for controlling the second device or transmitting data to the second device, or can be the auxiliary device controlled by the second device or receiving data from the second device.
  • the first device and the second device can also work independently.
  • the first and second devices can be many kinds of devices described above.
  • a system comprising a first device, a second device and the above described apparatus for controlling the first device and/or the second device.
  • the first and second devices can be many kinds of devices described above.
  • the apparatus for controlling the first device and/or the second device can be a separate device, and can also be embedded in the first device or the second device.
  • the system comprises a desktop, a glasses-free 3D portable media player and a camera.
  • the camera captures the data representing a physical place where there is a user and a user's gesture in the physical space, identifies the user's gesture and a position of the user's gesture among the first zone, the second zone and the third zone, and then generates, on the basis of the user's gesture and the position of the user's gesture, a control signal for controlling the desktop and/or the 3D portable media player. If the camera captures a gesture of waving a hand from left to right in the third zone associated with both the desktop and the portable media player, the desktop is controlled to send 3D content to the 3D portable media player, which can then play the 3D content.
  • Fig. 3 depicts a flow chart of a method in accordance with an embodiment of the present invention. According to an embodiment of a fourth aspect of the present invention, a method of controlling a first device and/or a second device is provided.
  • the method comprises step 310 of obtaining data representing a physical place where there is a user and a user's gesture in the physical space.
  • the physical place comprises a first zone associated with the first device, a second zone associated with the second device and a third zone associated with the first device and the second device.
  • the method further comprises step 320 of identifying the user's gesture and a position of the user's gesture among the first zone, the second zone and the third zone on the basis of the obtained data.
  • the method further comprises step 330 of generating, on the basis of the user's gesture, the position of the user's gesture and a control signal mapping table, a control signal for controlling the first device and/or the second device.
  • the control signal is for data transmission between the first device and the second device if the identified position is in the third zone.
  • the control signal is for controlling both the first device and the second device if the identified position is in the third zone.
  • the control signal is for controlling one of the first device and the second device if the identified position is in the third zone.
  • a set of computer-executable instructions is further proposed to perform the methods described above.
  • the instructions can reside in the first, second and third units separately to perform any step of the above disclosed methods.
  • modules may be implemented in hardware circuitry, computer program code, or any combination of hardware circuitry and computer program code.

Abstract

This invention provides an apparatus and a method of controlling a first device and/or a second device. The apparatus comprises a first unit, a second unit and a third unit. The first unit is for obtaining data representing a physical place where there is a user and a user's gesture in the physical space. The physical place comprises a first zone associated with the first device, a second zone associated with the second device and a third zone associated with the first device and the second device. The second unit is for identifying the user's gesture and a position of the user's gesture among the first zone, the second zone and the third zone on the basis of the obtained data. The third unit is for generating, on the basis of the user's gesture and the position of the user's gesture, a control signal for controlling the first device and/or the second device. In this way, by properly defining the interaction gestures in these three zones, a balance between fewer gestures and a small movement area can be achieved.

Description

METHOD AND APPARATUS FOR CONTROLLING MULTIPLE DEVICES
FIELD OF THE INVENTION
The invention relates to gesture control and, more particularly, to controlling at least two devices via gesture control.
BACKGROUND OF THE INVENTION
People have more and more digital devices: desktop computers, laptop computers, portable media players, mobile phones and so on. It is common that people use multiple devices at the same time to improve the user experience. For example, a laptop is used as a primary device for web browsing and a portable media player connected with the laptop is used as a secondary device for viewing videos online.
US2011/0119640A1 discloses techniques for scaling and translating gestures such that the applicable gestures for control may vary depending on the user's distance from a gesture-based system. In an example embodiment, the physical space is divided into virtual zones of interaction, and the system may scale or translate a gesture based on the zones.
However, US2011/0119640A1 does not discuss how to handle multiple devices via gesture control when the multiple devices are connected with each other.
SUMMARY OF THE INVENTION
It would be advantageous to provide a gesture control solution which can control multiple devices. It would also be desirable that the gesture control solution not require the users to remember too many gestures or to move in a very large area.
According to an embodiment of a first aspect of the present invention, an apparatus for controlling a first device and/or a second device is provided. The apparatus comprises:
a first unit for obtaining data representing a physical place where there is a user and a user's gesture in the physical space, the physical place comprising a first zone associated with the first device, a second zone associated with the second device and a third zone associated with the first device and the second device;
a second unit for identifying the user's gesture and a position of the user's gesture among the first zone, the second zone and the third zone on the basis of the obtained data; and
a third unit for generating, on the basis of the user's gesture, the position of the user's gesture and a control signal mapping table, a control signal for controlling the first device and/or the second device.
The basic idea is to have a zone corresponding to all of the devices in addition to having multiple zones corresponding to each device respectively. In the third zone associated with both the first and second devices, the user can interact with both the first and second devices with the same or different sets of gestures so that the user need not move in a large area to interact with the devices. In the first and second zones associated with the first and second devices respectively, the user can interact with the first and second devices respectively with the same set of gestures so that the user need not remember lots of gestures for interaction with the devices. In this way, by properly defining the interaction gestures in these three zones, a balance between fewer gestures and a small movement area can be achieved.
According to an embodiment of a second aspect of the present invention, the first device comprising the above described apparatus for controlling the first device and/or a second device is provided.
According to an embodiment of a third aspect of the present invention, a system comprising a first device, a second device and the above described apparatus for controlling the first device and/or the second device is provided.
According to an embodiment of a fourth aspect of the present invention, a method of controlling a first device and/or a second device is provided. The method comprises the steps of: obtaining data representing a physical place where there is a user and a user's gesture in the physical space, the physical place comprising a first zone associated with the first device, a second zone associated with the second device and a third zone associated with the first device and the second device;
identifying the user's gesture and a position of the user's gesture among the first zone, the second zone and the third zone on the basis of the obtained data; and generating, on the basis of the user's gesture, the position of the user's gesture and a control signal mapping table, a control signal for controlling the first device and/or the second device.
According to an embodiment of a fifth aspect of the present invention, a set of computer-executable instructions configured to perform the above described method is provided.
These and other aspects of the invention will be apparent from and elucidated with reference to the embodiments described hereinafter.
DESCRIPTION OF THE DRAWINGS
The above and other objects and features of the present invention will become more apparent from the following detailed description considered in connection with the accompanying drawings, in which:
Fig. 1 depicts a schematic diagram of an apparatus in accordance with an embodiment of the present invention;
Fig. 2a and Fig. 2b depict schematic diagrams of the zones associated with the devices in accordance with an embodiment of the present invention; and
Fig. 3 depicts a flow chart of a method in accordance with an embodiment of the present invention.
The same reference numerals are used to denote similar parts throughout the figures.
DETAILED DESCRIPTION
A detailed description of the present invention is given below in connection with the accompanying drawings.
Fig. 1 depicts a schematic diagram of an apparatus in accordance with an embodiment of the present invention. According to an embodiment of a first aspect of the present invention, an apparatus for controlling a first device and/or a second device is provided. As shown in Fig. 1, the apparatus 100 comprises a first unit 110, a second unit 120 and a third unit 130.
The first and second devices can be any kind of device, such as desktop computers, laptop computers, portable media players, mobile phones, etc. In an embodiment, the first device is a computer and the second device is a digital photo frame or a portable media player which can play 3D content. The first device can be connected with the second device via a wired or wireless interface so that they can share the same information. Alternatively, the first device need not be connected with the second device.
The first unit 110 is for obtaining data representing a physical place where there is a user and the user's gesture in the physical space. The physical place comprises a first zone associated with the first device, a second zone associated with the second device and a third zone associated with the first device and the second device. Each of the first, second and third zones represents a three-dimensional physical space.
Because the user is in the physical space, the data representing the physical place comprises both the data of the physical space and the data of the user in the physical place. That is to say, the obtained data represents the physical space, the user in the physical space and the user's gesture in the physical space. The gesture may include any kind of user motion, dynamic or static, such as running, jumping, moving a finger, waving a hand or holding a static pose.
The first unit can be implemented in many ways.
In an embodiment, the first unit is a receiver for receiving the data from other units which can capture the data. In another embodiment, the first unit comprises a camera for capturing image data of the physical place and the user's motion. The image data of the user's motion represents the user's gesture.
In a further embodiment, the first unit comprises an accelerometer and a GPS module. The accelerometer is attached to the body of the user for measuring the acceleration of a part of the user and the measured acceleration represents the user's gesture. The GPS module is attached to the body of the user to measure the position of a part of the user, and the measured position represents the user in the physical space. It is known to the person skilled in the art how to obtain the data representing the physical space, the user in the physical space and the user's gesture in the physical space, and it will not be elaborated here.
The second unit 120 is for identifying the user's gesture and a position of the user's gesture among the first zone, the second zone and the third zone on the basis of the obtained data.
The identified position can be in one of the first zone, the second zone and the third zone.
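As an illustration only (the patent does not prescribe any implementation), the position-to-zone step can be sketched by modeling each zone as an axis-aligned box, laid out side by side as in Fig. 2a. All coordinates and names below are assumptions, not part of the disclosure:

```python
# Toy sketch: map a measured 3D gesture position to one of the three
# zones. Zone boundaries are illustrative assumptions; a real system
# would calibrate them to the actual physical place.

ZONES = {
    "first":  ((0.0, 0.0, 0.0), (1.0, 2.0, 2.0)),  # associated with the first device
    "third":  ((1.0, 0.0, 0.0), (2.0, 2.0, 2.0)),  # associated with both devices
    "second": ((2.0, 0.0, 0.0), (3.0, 2.0, 2.0)),  # associated with the second device
}

def identify_zone(position):
    """Return the name of the zone containing an (x, y, z) position, or None."""
    x, y, z = position
    for name, ((x0, y0, z0), (x1, y1, z1)) in ZONES.items():
        if x0 <= x < x1 and y0 <= y < y1 and z0 <= z < z1:
            return name
    return None

print(identify_zone((0.5, 1.0, 1.0)))  # first
print(identify_zone((1.5, 1.0, 1.0)))  # third
```

A position falling outside every zone simply yields no zone here; the patent leaves such behaviour open.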
The second unit 120 can be implemented in many ways. In an embodiment, the data representing the user's gesture is captured image data. The second unit identifies the user's gesture by comparing the captured image data with pre-stored gesture information, and identifies the position of the user's gesture by comparing the captured image data with pre-stored physical space information.
In another embodiment, the data representing the user's gesture is the measured acceleration, and the second unit identifies the user's gesture based on the measured acceleration and pre-stored gesture information.
In a further embodiment, the data representing the user in the physical space is the measured position, and the second unit identifies the position of the user's gesture based on the measured position.
It is known to the person skilled in the art how to identify the user's gesture and the position of the user's gesture, and it will not be elaborated here.
The third unit 130 is for generating, on the basis of the user's gesture, the position of the user's gesture and a control signal mapping table, a control signal for controlling the first device and/or the second device.
The control signal mapping table is about the mapping relationship among multiple control signals, multiple gestures and multiple positions of the user's gestures. The control signal mapping table can be pre-stored or generated according to the user's input. The third unit 130 can be implemented in many ways.
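The structure of the control signal mapping table is not specified in the text; one minimal way to picture it (gesture, signal and device names below are invented for illustration) is a lookup keyed by the identified gesture and the identified zone, as the third unit 130 is described to use:

```python
# Toy sketch of a control signal mapping table:
# (gesture, zone) -> (control signal, target devices).
# All names are illustrative assumptions.

MAPPING_TABLE = {
    ("wave", "first"):  ("close", ["first_device"]),
    ("wave", "second"): ("close", ["second_device"]),
    ("wave", "third"):  ("close", ["first_device", "second_device"]),
}

def generate_control_signal(gesture, zone):
    """Return (signal, targets) for an identified gesture/zone pair, or None."""
    return MAPPING_TABLE.get((gesture, zone))

print(generate_control_signal("wave", "third"))
# ('close', ['first_device', 'second_device'])
```

Such a table could be pre-stored or, as the text notes, generated according to the user's input.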
In an embodiment, if the identified position is in the first zone associated with the first device, the control signal is for controlling the first device; if the identified position is in the second zone associated with the second device, the control signal is for controlling the second device; and if the identified position is in the third zone associated with the first device and the second device, the control signal is for controlling both the first device and the second device. For example, waving a hand in the first zone is to close the first device; waving a hand in the second zone is to close the second device; and waving a hand in the third zone is to close both the first and second devices.
In another embodiment, if the identified position is in the first zone associated with the first device, the control signal is for controlling the first device; if the identified position is in the second zone associated with the second device, the control signal is for controlling the second device; and if the identified position is in the third zone associated with the first device and the second device, the control signal is for controlling one of the first device and the second device. For example, waving a hand up and down in the first zone is to close the first device; waving a hand up and down in the second zone is to close the second device; waving a hand up and down in the third zone is to close the first device; waving a hand left and right in the third zone is to close the second device.
In this way, the user can interact with the first and the second device separately with the same set of gestures in the first zone and the second zone so that the user need not remember lots of gestures for interaction with the devices, and the user can interact with both the first and second devices with the same or different sets of gestures in the third zone so that the user need not move in a large scale area to interact with the devices.
In a further embodiment, if the identified position is in the third zone, the control signal is for data transmission between the first device and the second device. For example, waving a hand from left to right is to send data from the first device to the second device, and waving a hand from right to left is to send data from the second device to the first device. For another example, waving a hand from left to right is to make the second device play the content stored in the first device.
In another embodiment, if the identified position is in the third zone, the control signal is for controlling both the first and second devices. The control signal may serve many kinds of applications, such as adjusting the audio volume or adjusting the displayed fonts. For example, waving a hand downward is to close both the first device and the second device, and waving a hand upward is to open both the first device and the second device. In yet another embodiment, if the identified position is in the third zone, the control signal is for controlling one of the first device and the second device. For example, waving up and down in the third zone is to adjust the brightness of the first device, and waving left and right in the third zone is to adjust the brightness of the second device.
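The third-zone variants described in the embodiments above can all be expressed with the same table mechanism. The sketch below (gesture, signal and device names are invented, not from the patent) encodes each variant as a different set of third-zone entries:

```python
# Toy sketch: three alternative third-zone behaviours as mapping table
# entries. All names are illustrative assumptions.

THIRD_ZONE_VARIANTS = {
    # Variant 1: one gesture controls both devices at once.
    "both": {
        ("wave_down", "third"): ("close", ["first_device", "second_device"]),
    },
    # Variant 2: the gesture direction selects which single device is controlled.
    "one": {
        ("wave_down",  "third"): ("close", ["first_device"]),
        ("wave_right", "third"): ("close", ["second_device"]),
    },
    # Variant 3: the gesture direction selects the direction of a data transfer.
    "transfer": {
        ("wave_right", "third"): ("send_data", ["first_device", "second_device"]),
        ("wave_left",  "third"): ("send_data", ["second_device", "first_device"]),
    },
}

def lookup(variant, gesture, zone):
    """Look up the control signal for a gesture/zone pair under one variant."""
    return THIRD_ZONE_VARIANTS[variant].get((gesture, zone))

print(lookup("transfer", "wave_right", "third"))
# ('send_data', ['first_device', 'second_device'])
```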
The zones can be associated with the first and second devices in many ways. Fig. 2a and Fig. 2b depict schematic diagrams of the zones associated with the devices in accordance with an embodiment of the present invention.
In an embodiment as shown in Fig. 2a, the third zone 250 associated with the first device 210 and the second device 220 is between the first zone 230 associated with the first device 210 and the second zone 240 associated with the second device 220. In this way, it is easy for the user to remember the positions of the zones.
In another embodiment, the first zone 231 associated with the first device 210, the second zone 241 associated with the second device 220 and the third zone 251 associated with the first device 210 and the second device 220 are laid out as shown in Fig. 2b.
According to an embodiment of a second aspect of the present invention, a first device comprising the above-described apparatus for controlling the first device and/or a second device is provided. That is to say, the apparatus for controlling the first device and/or the second device is embedded in the first device. The first device can be the primary device that controls the second device or transmits data to it, or the auxiliary device that is controlled by the second device or receives data from it. The first device and the second device can also work independently. The first and second devices can be any of the kinds of devices described above.
According to an embodiment of a third aspect of the present invention, a system comprising a first device, a second device and the above-described apparatus for controlling the first device and/or the second device is provided. The first and second devices can be any of the kinds of devices described above. The apparatus for controlling the first device and/or the second device can be a separate device, or can be embedded in the first device or the second device.
In an embodiment of the system, the system comprises a desktop computer, a glasses-free 3D portable media player and a camera. The camera captures data representing a physical place where there is a user and a user's gesture in the physical space, identifies the user's gesture and the position of the user's gesture among the first zone, the second zone and the third zone, and then generates, on the basis of the user's gesture and the position of the user's gesture, a control signal for controlling the desktop computer and/or the 3D portable media player. If the camera captures a gesture of waving a hand from left to right in the third zone associated with both the desktop computer and the portable media player, the desktop computer is controlled to send 3D content to the 3D portable media player, which can then play the 3D content.
Fig.3 depicts a flow chart of a method in accordance with an embodiment of the present invention. According to an embodiment of a fourth aspect of the present invention, a method of controlling a first device and/or a second device is provided.
Referring to Fig.3, the method comprises step 310 of obtaining data representing a physical place where there is a user and a user's gesture in the physical space. The physical place comprises a first zone associated with the first device, a second zone associated with the second device and a third zone associated with the first device and the second device.
The method further comprises step 320 of identifying the user's gesture and a position of the user's gesture among the first zone, the second zone and the third zone on the basis of the obtained data.
The method further comprises step 330 of generating, on the basis of the user's gesture, the position of the user's gesture and a control signal mapping table, a control signal for controlling the first device and/or the second device.
The step 330 of generating can be implemented in many ways. In an embodiment, the control signal is for data transmission between the first device and the second device if the identified position is in the third zone. In another embodiment, the control signal is for controlling both the first device and the second device if the identified position is in the third zone. In a further embodiment, the control signal is for controlling one of the first device and the second device if the identified position is in the third zone.
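Steps 310 to 330 can be sketched end to end as a zone identification followed by a lookup in the control signal mapping table. The function names, boundaries and table entries below are assumptions for illustration only, not terms defined in the application:

```python
# End-to-end sketch of steps 320-330 of Fig.3, assuming the obtained
# data reduces to a recognised gesture name plus a 1-D position, and
# using illustrative zone boundaries and mapping-table entries.

CONTROL_SIGNAL_TABLE = {
    ("first", "wave"): "control_first_device",
    ("second", "wave"): "control_second_device",
    ("third", "wave_left_to_right"): "send_data_first_to_second",
    ("third", "wave_right_to_left"): "send_data_second_to_first",
}

def identify_zone(x):
    """Step 320 (position part): place the gesture position in one of
    the three zones, the third zone between the other two as in Fig.2a."""
    if x < 1.0:
        return "first"
    if x < 2.0:
        return "third"
    return "second"

def generate_signal(x, gesture):
    """Step 330: combine zone and gesture through the mapping table;
    None means no control signal is generated."""
    return CONTROL_SIGNAL_TABLE.get((identify_zone(x), gesture))
```

For instance, a left-to-right wave at an assumed position of 1.5 would fall in the shared third zone and yield the data-transmission signal from the first device to the second.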
A set of computer-executable instructions is further proposed to perform the methods described above. The instructions can reside in the first, second and third units separately to perform any step of the above-disclosed methods. Although the present invention has been described with reference to the embodiments shown in the drawings, it should be understood that the present invention may be embodied in many alternate forms, including any combination of hardware and software. In addition, any suitable size, shape or type of materials, elements, computer program elements, computer program code, or computer program modules could be used.
While discussed in the context of computer program code, it should be understood that the modules may be implemented in hardware circuitry, computer program code, or any combination of hardware circuitry and computer program code.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art would be able to design alternative embodiments without departing from the scope of the appended claims. The embodiments are illustrative rather than restrictive. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim or in the description. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. In the device claims enumerating several units, several of these units can be embodied by one and the same item of hardware or software. The usage of the words first, second and third, et cetera, does not indicate any ordering. These words are to be interpreted as names.

Claims

CLAIMS:
1. An apparatus for controlling a first device (210) and/or a second device (220), comprising:
a first unit (110) for obtaining data representing a physical place where there is a user and a user's gesture in the physical space, the physical place comprising a first zone (230, 231) associated with the first device (210), a second zone (240, 241) associated with the second device (220) and a third zone (250, 251) associated with the first device (210) and the second device (220);
a second unit (120) for identifying the user's gesture and a position of the user's gesture among the first zone (230, 231), the second zone (240, 241) and the third zone (250, 251) on the basis of the obtained data; and
a third unit (130) for generating, on the basis of the user's gesture, the position of the user's gesture and a control signal mapping table, a control signal for controlling the first device (210) and/or the second device (220).
2. An apparatus for controlling a first device (210) and/or a second device (220) as claimed in claim 1, wherein the control signal is for data transmission between the first device (210) and the second device (220) if the identified position is in the third zone (250, 251).
3. An apparatus for controlling a first device (210) and/or a second device (220) as claimed in claim 1, wherein the control signal is for controlling both the first device (210) and the second device (220) if the identified position is in the third zone (250, 251).
4. An apparatus for controlling a first device (210) and/or a second device (220) as claimed in claim 1, wherein the control signal is for controlling one of the first device (210) and the second device (220) if the identified position is in the third zone (250, 251).
5. An apparatus for controlling a first device and/or a second device as claimed in claim 1, wherein the third zone (250) is between the first zone (230) and the second zone (240).
6. A first device comprising an apparatus for controlling the first device and/or a second device as claimed in any one of claims 1 to 5.
7. A system comprising a first device, a second device and an apparatus for controlling the first device and/or the second device as claimed in any one of claims 1 to 5.
8. A method of controlling a first device (210) and/or a second device (220), comprising the steps of:
obtaining (310) data representing a physical place where there is a user and a user's gesture in the physical space, the physical place comprising a first zone (230, 231) associated with the first device (210), a second zone (240, 241) associated with the second device (220) and a third zone (250, 251) associated with the first device (210) and the second device (220);
identifying (320) the user's gesture and a position of the user's gesture among the first zone (230, 231), the second zone (240, 241) and the third zone (250, 251) on the basis of the obtained data; and
generating (330), on the basis of the user's gesture, the position of the user's gesture and a control signal mapping table, a control signal for controlling the first device (210) and/or the second device (220).
9. A method of controlling a first device (210) and/or a second device (220) as claimed in claim 8, wherein the control signal is for data transmission between the first device (210) and the second device (220) if the identified position is in the third zone (250, 251).
10. A method of controlling a first device (210) and/or a second device (220) as claimed in claim 8, wherein the control signal is for controlling both the first device (210) and the second device (220) if the identified position is in the third zone (250, 251).
11. A method of controlling a first device (210) and/or a second device (220) as claimed in claim 8, wherein the third zone (250) is between the first zone (230) and the second zone (240).
12. A set of computer-executable instructions configured, when executed, to perform the method of any one of claims 8 to 11.
PCT/IB2013/053884 2012-05-24 2013-05-13 Method and apparatus for controlling multiple devices WO2013175341A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2012075974 2012-05-24
CNPCT/CN2012/075974 2012-05-24

Publications (2)

Publication Number Publication Date
WO2013175341A2 true WO2013175341A2 (en) 2013-11-28
WO2013175341A3 WO2013175341A3 (en) 2014-11-13

Family

ID=48747628

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2013/053884 WO2013175341A2 (en) 2012-05-24 2013-05-13 Method and apparatus for controlling multiple devices

Country Status (1)

Country Link
WO (1) WO2013175341A2 (en)

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011096976A1 (en) * 2010-02-05 2011-08-11 Sony Computer Entertainment Inc. Controller for interfacing with a computing program using position, orientation, or motion

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110119640A1 (en) 2009-11-19 2011-05-19 Microsoft Corporation Distance scalable no touch computing

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015113833A1 (en) * 2014-01-30 2015-08-06 Koninklijke Philips N.V. Gesture control
US20160345407A1 (en) * 2014-01-30 2016-11-24 Philips Lighting Holding B.V. Gesture control
JP2017506794A (en) * 2014-01-30 2017-03-09 フィリップス ライティング ホールディング ビー ヴィ Gesture control
US9872362B2 (en) 2014-01-30 2018-01-16 Philips Lighting Holding B.V. Gesture based control of a utility
FR3072235A1 (en) * 2017-10-09 2019-04-12 Otodo PORTABLE REMOTE CONTROL OBJECT AND METHOD FOR IMPLEMENTING THE SAME

Also Published As

Publication number Publication date
WO2013175341A3 (en) 2014-11-13


Legal Events

Date Code Title Description
122 Ep: pct application non-entry in european phase

Ref document number: 13734493

Country of ref document: EP

Kind code of ref document: A2