WO2017108418A1 - Toy interaction based lighting control.

Toy interaction based lighting control

Info

Publication number
WO2017108418A1
Authority
WO
WIPO (PCT)
Prior art keywords
toy
data
control data
sensor
lighting device
Prior art date
2015-12-22
Application number
PCT/EP2016/080181
Other languages
French (fr)
Inventor
Reinier Imre Anton DEN BOER
Hicham SABIR
Marijn GEELS
Original Assignee
Philips Lighting Holding B.V.
Priority date
2015-12-22
Filing date
2016-12-08
Publication date
2017-06-29
Application filed by Philips Lighting Holding B.V.
Publication of WO2017108418A1

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63H TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H3/00 Dolls
    • A63H3/003 Dolls specially adapted for a particular function not connected with dolls
    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00 Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10 Controlling the light source
    • H05B47/175 Controlling the light source by remote control
    • H05B47/19 Controlling the light source by remote control via wireless transmission
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63H TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H2200/00 Computerized interactive toys, e.g. dolls

Abstract

A toy figure, such as a teddy bear, can be used by children to control a lighting device. The toy figure comprises at least one sensor, a processor and a wireless interface for controlling a lighting device. By determining the bodily position (e.g. lying down or sitting) that the toy figure is in, the appropriate control data can be sent to the lighting device (e.g. lights off when the teddy bear is lying down, or lights to a reading mode when the teddy bear is sitting).

Description

Toy interaction based lighting control
FIELD OF THE INVENTION
The invention generally relates to toy figures, and more specifically to toy figures for controlling a remote lighting device. The invention further relates to a method for controlling a lighting device and to a computer program product for performing the method.
BACKGROUND OF THE INVENTION
Modern lighting devices offer advanced control options, such as remote on/off control, dim level control and color control. User interfaces offering these control functions are typically offered as an application on a smart device (e.g. phone, tablet, wall panel) or via remote control units. Proper use of such control options is not always self-evident to users, especially to young children. Additionally, some users might not have access to a smart device. Automated control options, such as geo-fencing based control or gesture based control, avoid the need for having access to a smart device, yet can again be difficult to use for children.
SUMMARY OF THE INVENTION
The inventors have realized that young children tend to make toy figures mimic their activities. For example, a child reading a book might place a teddy bear, doll or action figure next to him/her to read the book, as it were, together with the toy figure. When the child goes to sleep, the toy figure is placed in the bed alongside the child. As such, the bodily position of the toy figure (e.g. sitting, lying down) can be used to determine how a lighting device should be controlled (e.g. reading light when sitting, lights fade to off when lying down).
In a first aspect, a toy figure is provided. The toy figure comprises at least one sensor, a wireless interface and a processor. The at least one sensor is arranged for providing as first data a detected bodily position of the toy figure. The sensor can be embedded or integrated in the toy figure. Examples of such at least one sensor are: a resistive sensor placed in a limb of the toy figure or a magnetic or tilt switch placed in a joint of the toy figure. Examples of such a bodily position are: standing, sitting, lying, kneeling, crouching or all-fours. The wireless interface is arranged for transmitting control data for controlling a lighting device remote to the toy figure. Examples of such a wireless interface are: a ZigBee (Light Link) interface, a WiFi interface, a visual light communications interface or another type of RF interface. The lighting device can be controlled directly or indirectly over the wireless interface. An example of direct control is the wireless interface sending control data using the ZigBee (Light Link) standard to a lighting device. An example of indirect control is the wireless interface sending control data over a WiFi interface to a lighting controller (e.g. a bridge device) and this lighting controller in turn controlling the lighting device (e.g. using DALI, DMX, ZigBee (Light Link) or by controlling the power provided to the lighting device like a dimmer switch). The processor is coupled (wired or wirelessly) to the sensor and the wireless interface. The processor is arranged for generating the control data based on the first data. As such, the control data is based on the detected bodily position, thus the lighting device remote to the toy figure is controlled in dependence on the detected bodily position.
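By way of illustration only (this sketch is not part of the disclosure; the type names, scene names and payload fields are assumptions), the sensor-processor-interface pipeline of the first aspect could be modeled as follows, with the processor mapping the detected bodily position to control data:

```python
from dataclasses import dataclass
from enum import Enum, auto

class BodilyPosition(Enum):
    STANDING = auto()
    SITTING = auto()
    LYING = auto()
    KNEELING = auto()
    CROUCHING = auto()
    ALL_FOURS = auto()

@dataclass
class ControlData:
    """Illustrative payload; a real toy would use the wire format of its
    protocol (e.g. ZigBee Light Link)."""
    scene: str
    intensity: float  # 0.0 (off) .. 1.0 (full output)

# Assumed mapping from detected bodily position to a light scene.
POSITION_TO_CONTROL = {
    BodilyPosition.SITTING: ControlData("reading", 1.0),
    BodilyPosition.LYING: ControlData("fade-to-off", 0.0),
}

def generate_control_data(first_data: BodilyPosition) -> ControlData | None:
    """Processor step: turn first data (the bodily position) into control
    data; positions without a configured scene generate nothing."""
    return POSITION_TO_CONTROL.get(first_data)
```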
In various embodiments of the toy figure, the at least one sensor further provides: a detected orientation of the toy figure with respect to its environment, a detected movement of the toy figure, a detected attachment or a proximity of an object to the toy figure, a measured temperature of at least a part of the toy figure or its environment, a sensed pressure exerted on at least a part of the toy figure, a captured audio and/or video feed of at least a part of the environment of the toy figure, or a sensed ambient light level of the environment of the toy figure. The control data is then generated based on one or more of these further data provided by the at least one sensor. Each type of data can be provided by one or more (types of) sensors. For example, a single sensor (e.g. a gyroscopic sensor) can provide data on a detected orientation as well as a detected movement, or a first sensor can be used for detecting orientation and a second sensor can be used for detecting movement.
The detected orientation of the toy figure with respect to its environment can, for example, be used to (better) distinguish between the toy figure lying down or standing up; or between the toy figure lying down face down or lying down face up. The detected movement of the toy figure can be used, for example, to determine if a child is carrying the toy figure while walking or running. Detecting proximity or attachment of an object can be used to determine, for example, that a hat has been placed on the head of the toy figure or the toy figure has been placed in a doll house. Temperature and pressure measurements can, for example, be used, separately or in combination, to determine whether a child is hugging the toy figure, taking the toy figure outside the house, etc. Capturing an audio and/or video feed can be used to determine, for example, whether a child is talking to the toy figure or looking at the toy figure. Further examples of use of such an audio and/or video feed are: determining what the amount of background noise is in the audio feed (e.g. to determine whether a child is in a sleeping environment or not), determining the pitch of a voice speaking in the audio feed (e.g. to determine whether it is a child or an adult speaking), determining a light level in the video feed (e.g. to detect whether it is day or night) or detecting an object that the toy figure is positioned towards in the video feed (e.g. to determine an orientation of the toy figure). The sensed ambient light level can be used to determine what the light level is in the environment of the toy figure (e.g. to determine a time of day or to calibrate the required reading light levels).
In a further embodiment, the control data is further generated based on a determined position of the toy figure (e.g. based on GPS or indoor positioning). The toy figure could then be configured such that it controls the lighting device when used in a child's bedroom where the lighting device in this example is located, yet does not control a lighting device when used in the kitchen from which the lighting device in this example is not visible. Other ways of limiting the area in which the toy figure can be used for controlling the lighting device are: having the lighting device emit coded light such that a sensor in the toy figure detects this coded light and the lighting device, once turned on, is only controlled when coded light is detected; or limiting the range of the signal transmitted by the wireless interface of the toy figure, such that only a lighting device sufficiently nearby (i.e. within range) will receive the control data and amend its light output.
In another embodiment of the toy figure, the control data is further based on current time. In yet another embodiment, the toy figure further comprises an interface, arranged for receiving a user selection of a time period, and the control data is further based on the received time period. Based on the current time, the lighting control can be adapted such that the same data received from the one or more sensors during a first time period (e.g. daytime) has a different effect on the lighting control compared to the same data being received during a second time period (e.g. nighttime). For example, movement of the toy figure during the day can trigger a dynamic lighting effect to enhance playtime, whereas movement of the toy figure at night does not trigger any lighting effects or triggers lighting effects that are soothing to assist a child in getting back to sleep. Day and night time can be pre-determined and/or estimated based on data received from the one or more sensors (e.g. nighttime is defined as the time period between 10pm and 6am as well as two hours before and after said period as long as the level of ambient light measured is low). It is beneficial if a user, e.g. a parent of a child, is able to select time periods (such as daytime sleep hours, playtime, nighttime sleep hours, a timeslot dedicated to reading, etc.). The toy figure can receive such a selection of a time period via a (wired or wireless) interface, for example, the toy figure can have a USB interface, an SD card reader or a WiFi interface. This can allow a user to select the time period via, for example, an application on a smart device and upload the selection to the toy figure. Certain of such interfaces could further be used for, for example, charging the toy figure, configuring the toy figure, providing a firmware update or reading an error code if the toy figure malfunctions.
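A minimal sketch of such time-dependent behavior, assuming a user-selected night period that wraps past midnight (the times and scene names are illustrative):

```python
from datetime import datetime, time

# Assumed user-selected night period, e.g. uploaded via a USB or WiFi interface.
NIGHT_START, NIGHT_END = time(22, 0), time(6, 0)

def is_night(now: datetime) -> bool:
    t = now.time()
    return t >= NIGHT_START or t < NIGHT_END  # the period wraps past midnight

def scene_for_movement(now: datetime) -> str | None:
    """The same movement event has a different effect per time period."""
    if is_night(now):
        return "soothing"  # or None to suppress lighting effects at night
    return "dynamic-play"

print(scene_for_movement(datetime(2016, 12, 8, 23, 30)))  # soothing
print(scene_for_movement(datetime(2016, 12, 8, 14, 0)))   # dynamic-play
```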
In further embodiments, (select) sensor data can be stored such that earlier interactions with the toy figure can provide further data to be used for generating control data. As an example, if sensor measurements show that the toy figure, after being actively played with, is frequently left lying down for an hour (e.g. as a child has lunch in a different room), such data can be used to distinguish between the toy figure lying down as part of playing activities (e.g. making the toy figure 'go to sleep') and the toy figure being left behind when play-time is over.
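One way this could be realized is sketched below, under the assumption that a short history of position samples is kept in memory (the window sizes are illustrative):

```python
import time
from collections import deque

# Sliding window of (timestamp, position) samples; a real toy might persist
# select sensor data rather than keep it only in memory.
history: deque[tuple[float, str]] = deque(maxlen=600)

def record(position: str) -> None:
    history.append((time.time(), position))

def lying_down_is_play(active_window_s: float = 120.0) -> bool:
    """Heuristic: lying down counts as play if the figure changed position
    recently; otherwise it was likely left behind after play-time."""
    now = time.time()
    recent = {p for (t, p) in history if now - t <= active_window_s}
    return len(recent) > 1  # position changes observed within the window
```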
In a second aspect, a system is provided. The system comprises the toy figure according to the first aspect and further comprises a lighting system controller, arranged for receiving the control data. The lighting system controller is arranged for controlling a lighting device. Optionally, the system also comprises the lighting device. This is advantageous in a system where the toy figure controls the lighting device via the lighting system controller (e.g. the Philips Hue bridge or a smart home controller). As an example, a home can be equipped with a home control system for controlling one or more lighting devices in the home. The toy figure can then be added as a controller device to the home control system. An adult user can then, continuing the example, use a remote control for controlling the lighting devices and a child can use the toy figure for controlling (a subset of) the lighting devices. Alternatively, the system comprises the toy figure and the lighting device, yet no lighting system controller. This is beneficial in a system where the toy figure directly controls the lighting device.
In a third aspect, a method is provided for controlling a lighting device remote to a toy figure. The method comprises: providing, to a processor by at least one sensor, as first data a detected bodily position of the toy figure; generating, by the processor coupled to the at least one sensor, control data based on the first data, and transmitting, by a wireless interface coupled to the processor, the control data for controlling the lighting device. In an embodiment of the method, the control data is further based on current time. Based on the current time, the lighting control can be adapted such that the same data received from the one or more sensors during a first time period (e.g. daytime) has a different effect on the lighting control compared to the same data being received during a second time period (e.g. nighttime). For example, movement of the toy figure during the day can trigger a dynamic lighting effect to enhance playtime, whereas movement of the toy figure at night does not trigger any lighting effects.
In a fourth aspect, a computer program product is provided for performing the method according to the third aspect. The computer program product performs the method when run on a computer device. Such a computer program product can, for example, be downloaded to a toy figure comprising a computer (e.g. a microchip capable of performing, in conjunction with at least one sensor for detecting a bodily position of the toy figure and a wireless interface for controlling a lighting device, the method) or can be provided embedded on a chip for integration in such a toy figure.
These and other aspects of the invention will be apparent from and elucidated with reference to the embodiments described hereinafter.
BRIEF DESCRIPTION OF THE DRAWINGS
In the drawings:
Fig. 1 shows schematically and exemplarily a toy figure and a lighting device controlled based on a bodily position of the toy figure,
Fig. 2 shows schematically and exemplarily certain bodily positions a toy figure can take,
Fig. 3 shows schematically and exemplarily a toy figure holding an object, Fig. 4 shows schematically and exemplarily a decision tree for generating control data for controlling a lighting device, and
Fig. 5 shows schematically and exemplarily a method for controlling a lighting device remote to a toy figure, based on a bodily position of the toy figure.
DETAILED DESCRIPTION OF EMBODIMENTS
In Fig. 1 a toy figure 100 is shown. The toy figure comprises a head 110 coupled via a head joint 115 to a body 120. The toy figure further comprises a left arm 130 coupled via a left arm joint 135 to the body 120, a right arm 140 coupled via a right arm joint 145 to the body 120, a left leg 150 coupled via a left leg joint 155 to the body 120, a right leg 160 coupled via a right leg joint 165 to the body 120. Each of the joints comprises a sensor and each of the sensors is coupled via, in this example, a bus 170 to a processor 180.
Alternatively one or more sensors can be individually coupled to the processor 180. The processor 180 is coupled to a wireless interface 185 which is arranged for transmitting a wireless signal 190 to a lighting device 195, causing the lighting device 195 to be controlled according to the control data.
A child playing with the toy figure 100 can move the various limbs (i.e. the left arm 130, right arm 140, left leg 150, right leg 160) and the head 110 relative to the body 120. The child can make the toy figure take on a variety of bodily positions, such as sitting and standing, which are detected by one or more of the sensors in the joints 115, 135, 145, 155, 165. The sensors can, for example, detect the angle between the body 120 and the limb 130, 140, 150, 160 or head 110 that they are coupled to. Certain bodily positions (e.g. holding up a hand for giving a high five) can be detected by a single sensor, other bodily positions (e.g. sitting down) are jointly detected by multiple sensors. The processing logic can be provided fully by the processor 180 or some intelligence can be built into the sensors. The processor 180 receiving first data from one or more sensors can determine what control data to generate based on the received first data. For example, if the first data comprises an indication that the toy figure is sitting, the control data generated can be for setting the lighting device to a reading light setting. As a further example, if the first data comprises the angle at which, for example, the limbs and the head are positioned, the processor can determine based on these angles that the toy figure is sitting. The processor 180 generates control data based on the received first data, such that the wireless interface 185 controls the lighting device 195 according to the control data.
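A sketch of this detection logic, assuming each joint sensor reports an angle in degrees relative to the body (the joint names and thresholds are illustrative, not taken from the description):

```python
def detect_position(joint_angles: dict[str, float]) -> str:
    """Combine joint-angle readings into a bodily position. A single sensor
    suffices for some positions (e.g. a raised arm for a high five), whereas
    sitting is jointly detected from both leg sensors."""
    if joint_angles["left_arm"] > 150:  # single-sensor case: high five
        return "high-five"
    legs = (joint_angles["left_leg"] + joint_angles["right_leg"]) / 2
    if legs < 120:  # legs bent forward relative to the body
        return "sitting"
    return "standing"

print(detect_position({"left_arm": 20.0, "left_leg": 95.0, "right_leg": 95.0}))
# -> sitting
```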
Not shown in the figure are a power supply (e.g. a battery, solar panel or kinetic energy harvester) or other elements for building a toy figure (e.g. stuffing, foam, a support skeleton) that a person skilled in the art is familiar with. Although a single lighting device is shown, a toy figure may be used for controlling multiple lighting devices. The lighting device(s) can be controlled via, for example, an RF signal transmitted by the wireless interface of the toy figure. Such a signal can be sent directly to the lighting device or to a lighting controller device (e.g. a bridge) which in turn controls the lighting device(s). The toy figure can further be arranged to control more than lighting devices alone, such as an audiovisual system, a window blinds system and/or an HVAC system (e.g. when the toy figure is placed in a lying-down position, the lights are turned off, the TV and radio are turned off, the window blinds are closed and the temperature is lowered). Associating a lighting device with the toy figure as controller of the lighting device (e.g. commissioning, linking or pairing) and configuring the light scenes to be set for certain bodily positions can be done in a variety of manners, known to a person skilled in the art.
Although the toy figure described here comprises a body, four limbs and a head, other (types of) toy figures can be used as well. For example, a toy figure might have no limbs (e.g. a snake or a tree), fewer limbs (e.g. an alien figure or personified vehicles such as a plane, boat, car, train, etc.) or more limbs (e.g. a caterpillar). The limbs, if any, can comprise a single part or multiple parts (e.g. a leg coupled to the body by a joint, the leg comprising an upper leg coupled to a lower leg by a further joint). The toy figure might comprise no head (e.g. a turtle wherein the head is in the shell) or multiple heads (e.g. a mythological figure). The toy figure is preferably flexible (e.g. through the use of joints or flexible materials) and can take on a plurality of bodily positions, yet a simple embodiment comprises a toy figure that is rigid and where, for example, only two bodily positions (e.g. lying down or standing up) can be detected.
In Fig. 2 different bodily positions of a toy figure are illustrated. These are mere examples, as the characteristics of the toy figure (e.g. size, number of limbs, number of joints per limb, number of directions a limb can turn, angle to which a limb can turn, etc.) can determine which bodily positions can be taken. A standing position for a teddy bear can appear different from a standing position for a penguin. The type and number of sensors comprised in the toy figure can determine which bodily positions can be detected. Shown in the figure are: a standing toy figure 210, a sitting toy figure 220, a toy figure lying down 230, a kneeling toy figure 240, a crouching toy figure 250 and a toy figure on all-fours 260. Toy figures may take on none, some or all of the aforementioned positions and additionally or alternatively take on further bodily positions not illustrated here (e.g. jumping, stretching). Not every position that a toy figure takes on is necessarily a bodily position that a sensor detects or that causes the lighting device to be controlled. To avoid rapid changes of light settings, the at least one sensor or the processor can limit how often or under what conditions a bodily position change leads to control data being generated for controlling the lighting device. For example, when a child with limited motor skills is attempting to make a toy figure with legs 'walk', then potentially the legs are moved between what would be a sitting position and a standing position. It can then be undesirable to have the lighting device change output (repeatedly) between a light output associated with sitting and a light output associated with standing. The sensors or the processor can be configured such that interactions with the toy figure are filtered such that certain bodily positions do not lead to control data being generated. As a further option, the lighting device or a lighting device controller to which control data is sent by the toy figure filters out undesirable (e.g. rapid) light setting changes.
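Such filtering could, for instance, take the form of a hold-time debounce in the processor. The sketch below (the hold time is an illustrative assumption) only reports a position once it has been held stably:

```python
import time

class PositionFilter:
    """Suppress control data for short-lived positions, so 'walking' the toy
    between sitting and standing does not make the light setting flicker."""

    def __init__(self, hold_s: float = 3.0):
        self.hold_s = hold_s  # how long a position must be held to count
        self.candidate: str | None = None
        self.since = 0.0
        self.reported: str | None = None

    def update(self, position: str) -> str | None:
        """Feed a raw position sample; returns the position only once it has
        been stable for hold_s seconds and differs from the last report."""
        now = time.time()
        if position != self.candidate:
            self.candidate, self.since = position, now  # restart the clock
            return None
        if position != self.reported and now - self.since >= self.hold_s:
            self.reported = position
            return position  # stable change: worth generating control data
        return None
```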
In Fig. 3 a toy figure 300 is shown comprising a sensor 310 for detecting the attachment of an object. Aspects already discussed in Fig. 1 are not shown here to enhance the readability of the figure. In this example, a wand 320 is attached to the toy figure and as such the sensor 310 transmits a wireless signal 330 to the wireless interface 385. The processor 380 can then generate control data based on said wireless signal 330. When the toy figure 300 is standing and this bodily position is detected by at least one sensor (not shown in this figure), detection of the attachment of the wand 320 to the toy figure 300 can provide further data based on which control data can be generated. As an example, the toy figure standing without holding the wand 320 would, in this example, cause a lighting device to be controlled such that a default light scene is provided, whereas when the wand 320 is attached to the toy figure 300 and the toy figure is standing, a light scene is activated that provides 'magical' light effects enhancing a child's playtime.
Instead of or in addition to detecting attachment of an (undetermined) object, a sensor can be used that detects specific objects being attached. Further a sensor can be used that detects proximity of an object (e.g. a child holding a book or a toy object near the toy figure) instead of or in addition to attachment of an object. The sensor can provide data to the processor over a wire (not shown) or wirelessly, in which case the processor can have a separate wireless interface for receiving said data (not shown) or can receive the data over the same wireless interface that is used to control the lighting device.
In Fig. 4 a decision tree 400 is shown to illustrate how various data supplied by the at least one sensor is used to generate the control data for controlling the lighting device. From the start 405 the first decision fork 410 encountered relates to, in this example, a bodily position being detected based on sensing an angle of the legs relative to the body. In this example the legs of the toy figure can only move in a single plane (front to back) and only together (i.e. if the left leg moves so does the right leg) between an angle of 90 degrees (legs sticking out in front of the toy figure) and 180 degrees (legs straight down). If the angle of the legs to the body is between 90 degrees and 100 degrees, then the bodily position is determined to be 'sitting'. When the toy figure is in a sitting bodily position, path 415 is selected which leads to control data 420 being sent to the lighting device for setting a reading light scene (i.e. controlling one or more lighting devices to emit, for example, cool white light at a high intensity). When the angle of the legs relative to the body is between 170 degrees and 180 degrees, then path 425 is selected as the bodily position could be either one of: the toy figure standing up or the toy figure lying down. Using as further data the orientation of the toy figure relative to its environment, the decision fork 430 leads to path 435 when the toy figure is placed in a horizontal position and to path 445 when the toy figure is placed in a vertical position. Upon detecting the horizontal position in combination with the 170 degrees to 180 degrees angle of the legs relative to the body, control data 440 are sent to the lighting device to set a sleeping light scene. When instead the vertical position in combination with the 170 degrees to 180 degrees angle of the legs relative to the body is detected, control data 450 for setting a play-time light scene is sent to the lighting device.
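Expressed as code, the decision tree of Fig. 4 could look like the sketch below; the angle ranges and scene names follow the example in the text, while the function itself is merely one assumed rendering of it:

```python
def decide_scene(leg_angle_deg: float, horizontal: bool) -> str | None:
    """Decision tree of Fig. 4: fork 410 on the leg-to-body angle, then
    fork 430 on the orientation relative to the environment."""
    if 90 <= leg_angle_deg <= 100:   # path 415: sitting
        return "reading"             # control data 420
    if 170 <= leg_angle_deg <= 180:  # path 425: standing or lying down
        return "sleeping" if horizontal else "play-time"  # 440 / 450
    return None                      # no control data generated

assert decide_scene(95, horizontal=False) == "reading"
assert decide_scene(175, horizontal=True) == "sleeping"
assert decide_scene(175, horizontal=False) == "play-time"
```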
Further examples of data that can be used in conjunction to generate appropriate control data are: when the legs are straight, use a measured temperature to determine if a child is holding the toy figure (e.g. causing a play-time light scene to be set); determine whether a child is merely holding the toy figure or hugging the toy figure based on a measured pressure exerted on the toy figure (e.g. to, within the play-time light scene, provide additional light effects to show affection); when a toy figure is placed in a sitting position, detect whether there are other toy figures in the vicinity (e.g. causing a tea-party light scene to be set) using an object proximity sensor or image processing based on visual data captured using a camera; etc.
The sensor data, based on which the lighting device is controlled, can be used for further purposes as well. As an example, a detected bodily position or a sensed movement can be fed to a computer device that renders an animation (e.g. in a computer game) or generates sound effects (e.g. snoring when it is determined that the toy figure is sleeping). Such further effects can be rendered by a display integrated in the toy figure or by a remote display, such as a TV or computer monitor. As an example, when a dinosaur figure is placed by the child in a sleeping position, the TV is turned off and the lights are dimmed; when the child plays with the toy by making it walk, an animation of a walking dinosaur is shown on the TV.
In Fig. 5 a method 500 is shown, comprising: providing 510, to a processor by at least one sensor, as first data a detected bodily position of the toy figure; generating 520, by the processor coupled to the at least one sensor, control data based on the first data, and transmitting 530, by a wireless interface coupled to the processor, the control data for controlling the lighting device. The method can advantageously use aspects of the various embodiments of the toy figure discussed above. Control data can relate to, for example, setting a scene (e.g. setting color and intensity of one or more lighting devices), turning a lighting device on or off, amending a single aspect of light output of a lighting device (e.g. setting the dim level to 50%), dynamic lighting patterns (e.g. a color cycle or a dim down to off), etc.
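As a final illustration (the dictionary encoding is an assumption; an actual product would use the wire format of its control protocol), the kinds of control data listed above could be represented as:

```python
# Illustrative control-data payloads for the examples given above.
set_scene   = {"type": "scene", "color": "cool-white", "intensity": 0.9}
turn_off    = {"type": "power", "on": False}
dim_to_half = {"type": "dim", "level": 0.5}
fade_to_off = {"type": "dynamic", "pattern": "dim-down-to-off", "duration_s": 60}
```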
While the invention has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative or exemplary and not restrictive; the invention is not limited to the disclosed embodiments. Other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed invention, from a study of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the indefinite article "a" or "an" does not exclude a plurality. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. The reference to first data, second data, third data, etc. does not indicate any order or relationship between such data. Any reference signs in the claims should not be construed as limiting the scope.

Claims

CLAIMS:
1. A toy figure comprising:
at least one sensor, arranged for providing as first data a detected bodily position of the toy figure,
a wireless interface, arranged for transmitting control data for controlling a lighting device remote to the toy figure, and
a processor, coupled to the sensor and the wireless interface, arranged for generating the control data based on the first data,
wherein the detected bodily position is one of: standing, sitting, lying, kneeling, crouching or all-fours.
2. The toy figure according to claim 1, wherein the at least one sensor is further arranged for providing as second data a detected orientation of the toy figure with respect to its environment, and
wherein the control data is further based on the second data.
3. The toy figure according to any one of the preceding claims, wherein the at least one sensor is further arranged for providing as third data a detected movement of the toy figure, and
wherein the control data is further based on the third data.
4. The toy figure according to any one of the preceding claims, wherein the at least one sensor is further arranged for providing as fourth data a detected attachment or a detected proximity of an object to the toy figure, and
wherein the control data is further based on the fourth data.
5. The toy figure according to any one of the preceding claims, wherein the at least one sensor is further arranged for providing as fifth data a measured temperature of at least a part of the toy figure or its environment, and
wherein the control data is further based on the fifth data.
6. The toy figure according to any one of the preceding claims, wherein the at least one sensor is further arranged for providing as sixth data a sensed pressure exerted on at least a part of the toy figure, and
wherein the control data is further based on the sixth data.
7. The toy figure according to any one of the preceding claims, wherein the at least one sensor is further arranged for providing as seventh data a captured audio and/or video feed of at least a part of the environment of the toy figure, and
wherein the control data is further based on the seventh data.
8. The toy figure according to any one of the preceding claims, wherein the at least one sensor is further arranged for providing as eighth data a sensed ambient light level of the environment of the toy figure, and
wherein the control data is further based on the eighth data.
9. The toy figure according to any one of the preceding claims, wherein the control data is further based on current time.
10. The toy figure according to claim 9, the toy figure further comprising:
an interface, arranged for receiving a user selection of a time period, wherein the control data is further based on the received time period.
11. A system comprising:
the toy figure according to any one of the preceding claims,
a lighting system controller, arranged for receiving the control data, wherein the lighting system controller is arranged for controlling a lighting device.
12. The system according to claim 11, the system further comprising the lighting device.
13. A method for controlling a lighting device remote to a toy figure, the method comprising: providing, to a processor by at least one sensor, as first data a detected bodily position of the toy figure,
generating, by the processor coupled to the at least one sensor, control data based on the first data, and
- transmitting, by a wireless interface coupled to the processor, the control data for controlling the lighting device,
wherein the detected bodily position is one of: standing, sitting, lying, kneeling, crouching or all-fours.
14. The method according to claim 13, wherein the control data is further based on current time.
15. A computer program product arranged for performing the method according to claim 13 or claim 14 when run on a computer device.
PCT/EP2016/080181 2015-12-22 2016-12-08 Toy interaction based lighting control. WO2017108418A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP15202084 2015-12-22
EP15202084.8 2015-12-22

Publications (1)

Publication Number Publication Date
WO2017108418A1 (en) 2017-06-29

Family

ID=55022360

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2016/080181 WO2017108418A1 (en) 2015-12-22 2016-12-08 Toy interaction based lighting control.

Country Status (1)

Country Link
WO (1) WO2017108418A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060154726A1 (en) * 2000-02-22 2006-07-13 Weston Denise C Multi-layered interactive play experience
JP2010279817A (en) * 2010-09-27 2010-12-16 Namco Bandai Games Inc Toy
US20120139727A1 (en) * 2011-03-28 2012-06-07 Physical Apps, Llc Physical interaction device for personal electronics and method for use
WO2015092631A1 (en) * 2013-12-19 2015-06-25 Koninklijke Philips N.V. Lighting control based on interaction with toys in play area
WO2015113824A1 (en) * 2014-01-31 2015-08-06 Koninklijke Philips N.V. A method of controlling lighting devices

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16810313

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16810313

Country of ref document: EP

Kind code of ref document: A1