WO2006090546A2 - Input device for a computer and environmental control system using the same - Google Patents


Info

Publication number
WO2006090546A2
Authority
WO
WIPO (PCT)
Prior art keywords
command
input device
face
resting
motion
Prior art date
Application number
PCT/JP2006/301080
Other languages
French (fr)
Other versions
WO2006090546A3 (en)
Inventor
Fumiaki Oobayashi
Masaaki Terano
Yoshifumi Murakami
Kazufumi Oogi
Masahiro Yamamoto
Original Assignee
Matsushita Electric Works, Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Matsushita Electric Works, Ltd. filed Critical Matsushita Electric Works, Ltd.
Publication of WO2006090546A2 publication Critical patent/WO2006090546A2/en
Publication of WO2006090546A3 publication Critical patent/WO2006090546A3/en


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • FMECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F24HEATING; RANGES; VENTILATING
    • F24FAIR-CONDITIONING; AIR-HUMIDIFICATION; VENTILATION; USE OF AIR CURRENTS FOR SCREENING
    • F24F11/00Control or safety arrangements
    • F24F11/30Control or safety arrangements for purposes related to the operation of the system, e.g. for safety or monitoring
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00Data switching networks
    • H04L12/28Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
    • H04L12/2854Wide area networks, e.g. public data networks
    • FMECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F24HEATING; RANGES; VENTILATING
    • F24FAIR-CONDITIONING; AIR-HUMIDIFICATION; VENTILATION; USE OF AIR CURRENTS FOR SCREENING
    • F24F2120/00Control inputs relating to users or occupants
    • F24F2120/10Occupancy

Definitions

  • The present invention relates to an input device for a computer and to an environmental control system using the same.
  • Japanese Patent Publication No. 9-84145 discloses an input device for a computer.
  • The input device has a polyhedral body with a plurality of flat faces, by each of which the body can rest on a supporting surface, i.e., a desktop.
  • The body incorporates a plurality of switches, each corresponding to one of the faces and closed when the body rests on the supporting surface with the corresponding face up.
  • Each of the faces bears a mark identifying the specific action assigned to the closure of the corresponding switch. The user selects a specific action by resting the body on the supporting surface with the corresponding face up.
  • The number of different actions available from the input device is therefore determined by the number of faces.
  • Since the number of actions is restricted by the number of faces, the polyhedral body must have more faces in order to offer more actions. As a result, each face of the polyhedral body becomes smaller and is therefore less stable on the supporting surface. Moreover, as the number of faces increases, it becomes harder for the user to identify a face or the mark on it, requiring careful attention to confirm the face of the polyhedral body, which is inconvenient for selecting the intended action. Accordingly, it is desirable to reduce the number of faces while increasing the number of actions.
  • In view of the above problem, the present invention provides an input device for the computer which is easy to manipulate yet offers an increased number of actions. The input device in accordance with the present invention includes a polyhedral body which is configured to be manipulated by a user and which has a plurality of flat resting faces by which the body can be rested upon a supporting surface.
  • The polyhedral body incorporates an acceleration sensor, a face judge, an action-command relation memory, and a command composer.
  • The acceleration sensor is configured to provide a sensor output indicating the acceleration being applied to the body.
  • The face judge is configured to discriminate which one of the resting faces is on the supporting surface based upon the sensor output, and to provide a face output indicative of the determined resting face.
  • The action-command relation memory is configured to associate the resting faces respectively with different ones of predefined commands in a first set.
  • The command composer is configured to select the one of the predefined commands in the first set that is associated with the face output from the face judge, and to transmit the selected command to the computer.
  • Also incorporated in the polyhedral body is a motion judge which is configured to analyze the sensor output to identify a specific dynamic motion that the body undergoes, and to provide a motion output indicative of the identified motion.
  • In this connection, the action-command relation memory is configured to associate a plurality of predefined dynamic motions with different ones of predefined commands in a second set, such that the command composer selects the one of the predefined commands in the second set that is associated with the motion output and transmits the selected command to the computer.
  • With the motion judge identifying the specific dynamic motions that the polyhedral body undergoes, and with each such motion allocated to one of the predefined commands, the input device of the present invention can afford more actions and corresponding commands than the number of faces of the polyhedral body, while retaining a simple geometry, for example a hexahedron with a relatively small number of faces. Accordingly, the user can manipulate the polyhedral body easily, owing both to the relatively high stability of each face and to easy visual confirmation of the faces. Moreover, since the dynamic actions relate naturally and directly to the user's intentions, the user can conveniently give commands to the computer by taking advantage of the dynamic motions.
  • Preferably, the action-command relation memory is configured to associate one of the predefined commands in the second set with the one of the predefined commands in the first set most recently discriminated by the face judge, whereby one of the specific dynamic motions brings about the first-set command discriminated immediately before.
  • For example, a specific dynamic action such as "tapping" can be utilized to repeat the command assigned to the static action of any one of the resting faces.
  • Thus, when reentry of a command previously given by resting the body on the supporting surface is required, it can be done simply by the dynamic action of tapping, without forcing the user to reposition the body.
  • In this connection, the motion judge is configured to accumulate time-series data of the sensor output until the body has been kept stable for a predetermined period, and then to analyze the time-series data to identify the specific dynamic motion.
  • Thus, the command composer transmits the selected command only after the elapse of a predetermined short period of time, assuring reliable identification of the dynamic motion.
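The accumulate-then-analyze behaviour of the motion judge described above can be sketched as follows. This is an illustrative reading only; the class name, the sample format, and the thresholds are assumptions, not values taken from the patent.

```python
class MotionJudge:
    """Buffers accelerometer samples and classifies a dynamic motion
    only after the body has been stable for a preset number of samples."""

    def __init__(self, stable_samples=10, shake_threshold=1.5):
        self.buffer = []                          # time series of (ax, ay, az)
        self.stable_samples = stable_samples      # stability window length
        self.shake_threshold = shake_threshold    # g-units counted as a shake

    def feed(self, sample):
        """Accumulate one (ax, ay, az) sample; return a motion label once
        the trailing samples have stayed quiet for the stability window."""
        self.buffer.append(sample)
        tail = self.buffer[-self.stable_samples:]
        stable = len(tail) == self.stable_samples and all(
            max(abs(a) for a in s) < self.shake_threshold for s in tail)
        if not stable:
            return None                           # keep accumulating
        motion = self._classify(self.buffer[:-self.stable_samples])
        self.buffer.clear()
        return motion

    def _classify(self, samples):
        # Count acceleration peaks above the threshold as "shakes";
        # a single isolated peak is treated as a tap.
        peaks = sum(1 for s in samples
                    if max(abs(a) for a in s) >= self.shake_threshold)
        if peaks == 0:
            return None
        if peaks == 1:
            return "tapping"
        return "shaking %d times" % peaks
```

Deferring classification until the stability window elapses mirrors the patent's point that the command is transmitted only after a short quiet period, so the full motion is seen before it is named.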
  • The present invention also relates to an environmental apparatus control system in which the input device is best utilized for easy operation.
  • The control system includes an apparatus configured to control a residential environment, and a plurality of the input devices, each belonging to an individual resident in the residential environment, for entering demands to change the residential environment.
  • The control system also includes a server configured to be connected to the input devices and to analyze the demands collected from the input devices in order to determine a control project of controlling the apparatus based upon the analysis; and an apparatus control means configured to control the apparatus in accordance with the control project.
  • The server is configured to analyze the collected demands in view of the total number of residents in the residential environment.
  • The input device is configured to allocate one of the resting faces to the command indicating the presence of a resident in the residential environment, so that the server can collect the total number of residents.
  • FIG. 1 is a schematic view of an input device in accordance with a preferred embodiment of the present invention
  • FIG. 2 is a block diagram of an electronic circuit module incorporated in the input device
  • FIG. 3 is a schematic view illustrating an environmental control system using the above input device
  • FIG. 4 is a schematic view illustrating an air-conditioning control of the above system
  • FIG. 5 is a table illustrating a relation between the resting faces of the input device and accelerations detected along X, Y, Z-axes;
  • FIG. 6 is a table illustrating a relation between actions at the input device and commands created by the actions
  • FIG. 7 is a flowchart illustrating how the actions are interpreted
  • FIG. 8 is a flowchart illustrating the air-conditioning control of the above system
  • FIG. 9 is a schematic view illustrating a light control of the above system
  • FIG. 10 is a flowchart illustrating the light control of the above system
  • FIG. 11 is a table illustrating a relation between the resting faces of the input device and actions allocated to the resting faces employed in an alternative light control of the above system.
  • FIG. 12 is a flowchart illustrating the alternative light control of the above system.
  • The input device 10 has a polyhedral body 20 of a size to be easily manipulated by one hand of a user on or above a supporting surface such as a desktop.
  • The polyhedral body 20 is in the form of a cube having six congruent resting faces, by any one of which the body is rested on the supporting surface.
  • The body 20 incorporates therein an electronic circuit module 30 and three LEDs 41, 42, and 43, respectively emitting red, green, and blue rays.
  • The module 30 includes an acceleration sensor 50, an action processor 60, and a communicator 70 for data transmission with the computer 100 through a cable 90.
  • The acceleration sensor 50, in response to the acceleration that the body 20 undergoes, provides a sensor output representing the accelerations along the X, Y, and Z axes of the body 20.
  • The action processor 60 includes an action analyzer 61, a face judge 62, a motion judge 63, a command composer 64, an action-command relation memory 66, and an LED controller 67.
  • The action analyzer 61 accumulates time-series data of the sensor output from the acceleration sensor 50, analyzes the data in terms of the acceleration directions and the number of accelerations acknowledged within each predetermined short time period, and gives an action signal to the face judge 62 and the motion judge 63.
  • The face judge 62 is configured to identify which one of the resting faces is on the supporting surface in response to the action signal, and provides a face output indicative of the identified resting face to the command composer 64.
  • The face judge 62 identifies the resting face of the body stably held on the supporting surface by reference to a table which, as shown in FIG. 5, relates each resting face to the accelerations detected along the X, Y, and Z axes.
  • The motion judge 63 is configured to identify a moving pattern of the body 20 by analyzing the changes of the accelerations along the X, Y, and Z axes within the predetermined period, and provides a motion output indicative of the identified moving pattern. The motion output is therefore given when the body 20 is moved within the short time period as a result of being shaken or tapped.
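For a cube at rest, the face judge's table lookup amounts to asking which axis carries the roughly 1 g gravity vector while the other two axes read near zero. The sketch below illustrates that idea; the face numbering and the tolerance are assumed examples, not the actual values of the patent's FIG. 5.

```python
def identify_resting_face(ax, ay, az, tol=0.2):
    """Return which of the six faces of a cube is down, judged by which
    axis carries approximately 1 g while the others read near 0 g.
    The face-number assignment here is illustrative only."""
    FACES = {('x', +1): 1, ('x', -1): 2,
             ('y', +1): 3, ('y', -1): 4,
             ('z', +1): 5, ('z', -1): 6}
    readings = (('x', ax), ('y', ay), ('z', az))
    for axis, value in readings:
        if abs(abs(value) - 1.0) <= tol:               # ~1 g on this axis
            others = [v for a, v in readings if a != axis]
            if all(abs(v) <= tol for v in others):     # other axes near 0 g
                sign = 1 if value > 0 else -1
                return FACES[(axis, sign)]
    return None  # body is tilted or in motion: no stable resting face
```

Returning `None` for a tilted or moving body matches the division of labour above: only a stably rested body yields a face output, while transient accelerations are left to the motion judge.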
  • Upon receiving the face output or the motion output, the command composer 64 refers to the action-command relation memory 66 to select one of the predefined commands, and sends the selected command to the computer 100 for execution there.
  • The memory 66 is configured to give a table which allocates the static actions and the dynamic actions respectively to the different commands, for example as shown in FIG. 6.
  • The static actions are meant in this description to denote the results of placing the body 20 on the supporting surface with any one of the resting faces down, and therefore denote the corresponding resting faces of the body 20.
  • The dynamic actions are meant to denote actions such as shaking and tapping applied to the body 20.
  • The input device 10 is utilized for an air-conditioning control and a lighting control for an enclosed residential space or environment, as schematically shown in FIG. 3, reflecting the demands of the residents or the presence of residents in the space. Details of these controls will be discussed later in the description.
  • FIG. 6 lists the predefined commands to be entered at the input device for making the above controls. The commands are classified into a first set and a second set. The commands in the first set include "keep temperature", "raise temperature", "lower temperature", "is presence", "is absence", and "emergency call", which are allocated to the static actions, i.e., faces 1 to 6.
  • The commands in the second set include "cancel previous command", "emphasize previous command", "inform system failure", and "repeat previous command", which are respectively allocated to the dynamic actions, i.e., "shaking 2 or 3 times", "shaking 5 to 8 times", "rapidly shaking 2 or 3 times", and "tapping".
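The two command sets of FIG. 6 can be pictured as a pair of lookup tables consulted by the command composer. A minimal sketch, with the table encoded as Python dictionaries (the function name is an illustration, not the patent's terminology):

```python
# First set: static actions (resting faces 1-6) -> commands, per FIG. 6.
FIRST_SET = {
    1: "keep temperature",
    2: "raise temperature",
    3: "lower temperature",
    4: "is presence",
    5: "is absence",
    6: "emergency call",
}

# Second set: dynamic actions (motion patterns) -> commands, per FIG. 6.
SECOND_SET = {
    "shaking 2 or 3 times":         "cancel previous command",
    "shaking 5 to 8 times":         "emphasize previous command",
    "rapidly shaking 2 or 3 times": "inform system failure",
    "tapping":                      "repeat previous command",
}

def compose_command(face=None, motion=None):
    """Select a command from the first set (static action) or the
    second set (dynamic action), mirroring the command composer."""
    if motion is not None:
        return SECOND_SET.get(motion)
    return FIRST_SET.get(face)
```

With six faces and four motions the device offers ten commands rather than six, which is the patent's central point: dynamic actions multiply the command count without adding faces.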
  • The air-conditioning control is realized by a system equipped with a server 200 connected to the computers of the residents and to an air-conditioning apparatus 210.
  • The server 200 is configured to have an air-conditioning controller 204, a data memory 205, and a control logic memory 206 holding information as to parameters such as the operating conditions or specifications of the apparatus 210, and the temperature, humidity, and thermal characteristics of the residential space.
  • The data memory 205 is configured to store time-series data of the inputs from the input devices through the related computers 100 and a communicator 201.
  • The inputs, in this instance, indicate the residents' demands as well as the total number of residents.
  • The controller 204 is configured to read out the inputs at regular intervals and to refer to the control logic memory 206 in order to create a control project of controlling the air-conditioning apparatus in a direction of raising, lowering, or keeping the temperature based upon a predominant rule, i.e., according to which one of the demands of "raising temperature", "lowering temperature", and "keeping temperature" is predominant.
  • The control project is then sent through a communicator 202 to the apparatus 210 in order to vary or keep the temperature setting in compliance with the predominant demand.
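The predominant rule described above is, in essence, a majority vote over the residents' most recent temperature demands. A simplified sketch (the fallback to "keep temperature" when no demand is present is an assumption; the patent defers such criteria to the control logic memory):

```python
from collections import Counter

TEMPERATURE_DEMANDS = ("raise temperature", "lower temperature",
                       "keep temperature")

def create_control_project(demands):
    """Pick the predominant temperature demand from the residents'
    collected inputs, ignoring any non-temperature commands."""
    votes = Counter(d for d in demands if d in TEMPERATURE_DEMANDS)
    if not votes:
        return "keep temperature"   # assumed default: leave the setting alone
    return votes.most_common(1)[0][0]
```

In the full system the vote would also consult the control logic memory's criteria (room thermal characteristics, apparatus limits) and any per-command weights before the project is issued.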
  • The controller 204 sends back a feedback signal to the individual input devices 10, which activates the LED controller 67 to produce different colors from the combination of the LEDs 41, 42, and 43, depending upon the control project created at the controller 204.
  • The LED controller 67 is also configured to respond to another external signal from the corresponding computer by flashing the LEDs at a selected frequency or in a predetermined frequency pattern, creating an optical signal for controlling a lighting fixture 300, details of which will be discussed later.
  • The operation of the input device 10 is now discussed in accordance with the flowchart of FIG. 7.
  • Upon connection to the computer 100, the LED controller 67 is activated to turn on the LEDs 41, 42, and 43 for indication of the connection (S2).
  • Then, the command composer 64 checks whether or not it receives the external signal (S3). In the presence of the external signal, the command composer 64 passes it to the LED controller 67, which responds by activating the LEDs in the manner designated by the external signal (S4).
  • A static action analyzing sequence is initiated at step (S5) for locating the resting face of the body, provided that the action analyzer 61 gives the static action signal.
  • Otherwise, a dynamic action analyzing sequence is initiated at step (S8) for checking whether or not the dynamic action signal is detected.
  • When the face judge 62 locates the resting face of the body 20 on the supporting surface (S5), it is checked whether the located resting face has changed from the one located previously (S6).
  • If not, the sequence goes back to step (S3).
  • If the resting face has changed, the command composer 64 identifies the specific command corresponding to the newly located resting face (S7).
  • Then, the LED controller 67 activates the LEDs in a combination specific to the command (S12), and the command composer 64 transmits the command to the computer 100 (S13), such that the computer 100 executes the command and transmits the corresponding instruction to the server 200 for the air-conditioning control.
  • The LEDs as a whole give a color of red, green, or blue for the control project of raising, lowering, or maintaining the temperature, respectively.
  • When the dynamic action signal is detected, the sequence proceeds to step (S9) for checking whether or not the dynamic action is the tapping action, i.e., the command "Repeat Previous Command". Otherwise, the sequence goes back to step (S3).
  • For the tapping action, the command composer 64 fetches or recalls the command identified as corresponding to the resting face located most recently (S10).
  • For any other dynamic action, the command composer 64 refers to the action-command relation memory 66 to select the command associated with that specific dynamic action (S11), which is followed by steps (S12) and (S13) to execute the command at the computer 100.
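The S5-S13 decision flow above can be condensed into a single dispatch function. This is a hypothetical rendering for illustration; the tuple encoding of actions and the `state`/`memory` structures are assumptions, not the patent's data formats.

```python
def process_action(action, state, memory):
    """One pass of the FIG. 7 decision flow.  'action' is either
    ('face', number) or ('motion', pattern); 'state' remembers the last
    resting face (S6/S10); 'memory' maps faces and motions to commands.
    Returns the command to transmit, or None when the face is unchanged."""
    kind, value = action
    if kind == 'face':
        if value == state.get('last_face'):      # S6: face unchanged
            return None                          # back to S3, nothing sent
        state['last_face'] = value
        return memory['faces'][value]            # S7: new static command
    if kind == 'motion':
        if value == 'tapping':                   # S9: "Repeat Previous Command"
            last = state.get('last_face')        # S10: recall last face
            return memory['faces'][last] if last is not None else None
        return memory['motions'][value]          # S11: other dynamic command
```

Note how the tapping branch never consults `memory['motions']`: it re-issues the command of the most recent face, which is exactly the repeat-without-repositioning behaviour the patent emphasizes.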
  • In this way, simple manipulation of the input device creates the commands which are executed at the computer 100 for controlling the air-conditioning system by the server 200.
  • The server 200 is kept running to collect the inputs from each of the computers 100 at regular intervals, for example, every 30 minutes. As shown in the flowchart of FIG. 8, while 30 minutes have not elapsed (S21), the server 200 checks whether inputs are available from the computers 100 (S22). If available, the server 200 accumulates the inputs in the data memory 205 to give time-series data of the inputs (S23). If no input is acknowledged, the server 200 is held on standby.
  • Upon the elapse of 30 minutes, the controller 204 reads out the time-series data from the data memory 205 and fetches the most recent entry for each kind of input given from each of the input devices 10 (S24). Next, the controller 204 calculates the total number of residents from the inputs indicating the presence and absence of residents (face 4 and face 5), and analyzes the inputs indicating the demands of keeping the temperature (face 1), raising the temperature (face 2), and lowering the temperature (face 3), so as to create the control project in accordance with the predominant rule and with reference to criteria fetched from the control logic memory 206 (S25).
  • Then, the controller 204 transmits a control signal to the air-conditioning apparatus 210 in order to raise, lower, or keep the temperature in accordance with the control project (S26). Concurrently, the controller 204 generates an informative signal indicative of the control project thus created and transmits the signal to the computers 100 and the input devices 10.
  • In response, the LED controller 67 is activated to turn on the LEDs 41, 42, and 43 for generating the color expressing the control project (S27).
  • Thereafter, the controller 204 cancels the inputs in the data memory 205 (S28), and sends a signal indicative of the cancellation of the inputs to the computers 100 and the input devices 10 (S29).
  • Upon receiving the signal, the computer 100 provides information that the previous inputs or demands are cancelled and that the system awaits the next inputs, while the LEDs of the input device 10 give a color of green indicating that the system awaits the inputs.
  • When the command from the input device is identified as "Cancel Previous Command", the controller 204 removes the most recent command from the data memory 205 and designates the previous command as the recent one. When the command "Emphasize Previous Command" is acknowledged, the controller 204 gives a weight to the recent command, which is one of "Keep Temperature", "Raise Temperature", and "Lower Temperature", determined respectively by the resting faces "FACE 1", "FACE 2", and "FACE 3" of the body 20. The weight is considered in creating the control project. When the command "Emergency Call" or "Inform System Failure" is acknowledged, the controller 204 provides a corresponding alarm signal which is sent to a computer in a supervisory room to report the occurrence of an abnormal condition and to ask for a suitable remedy.
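The cancel and emphasize modifiers can be sketched as a replay over a resident's command history. The patent says only that a weight is "given" to the emphasized command; the doubling factor below is an assumed value for illustration, as is the function name.

```python
def apply_modifiers(history):
    """Replay a resident's command history, applying the second-set
    modifiers: 'cancel previous command' drops the latest static command,
    and 'emphasize previous command' increases its weight (doubling is
    an assumed weighting; the patent does not specify the factor).
    Returns a list of (command, weight) pairs for the vote."""
    result = []
    for cmd in history:
        if cmd == "cancel previous command":
            if result:
                result.pop()                 # drop the most recent command
        elif cmd == "emphasize previous command":
            if result:
                prev, weight = result[-1]
                result[-1] = (prev, weight * 2)
        else:
            result.append((cmd, 1))          # ordinary static command
    return result
```

The weighted pairs would then feed the predominant-rule tally in place of simple one-resident-one-vote counting.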
  • FIG. 9 illustrates the lighting control system in which the input devices 10 are utilized to transmit the optical signal to a lighting administrator 310 for control of the lighting fixture 300.
  • The optical signal, which in this instance represents the presence or absence of a resident in the residential space, is received at a light receiver 301 and is stored in a signal memory 302.
  • The lighting administrator 310 includes a controller 312 which fetches the data from the signal memory 302 on a regular basis to determine a lighting project for control of the lighting fixture, and includes a control logic memory 316 storing parameters with regard to the operating conditions and specifications of the lighting fixture 300.
  • The controller 312 reads the data every 1 minute, for example, to obtain the total number of residents present in the residential space, and stores the total number in the resident's number memory 314.
  • The controller 312 then determines the lighting project and transmits a control signal to the lighting fixture 300 for controlling it in accordance with the lighting project.
  • The determined control project is accumulated in a lighting project memory 318 to provide a control history.
  • The control project designates the dimming and the extinction of the lighting fixture 300.
  • The computer 100 connected to the input device 10 is programmed to activate the LED controller 67 to generate the optical signal at regular intervals by flashing the LEDs at one of the predetermined different flashing frequencies, each designating one of the commands, in this instance "is presence" and "is absence".
  • The controller 312 constantly monitors whether or not the light or the optical signal is received at the receiver 301 (S30), and checks the flashing frequency (S31). When the flashing frequency matches one of the predetermined frequencies, the controller 312 identifies the command by the flashing frequency (S32).
  • When the command is identified as denoting the presence of a resident, the controller 312 increments the number of residents (S34). When the command is identified as denoting the absence of a resident (S35), the controller 312 decrements the number of residents (S36). If the command indicating the presence of a resident is not acknowledged from any of the input devices (S37), the controller 312 generates the control signal for turning off the lighting fixture 300 (S38). Otherwise, the controller 312 obtains the total number of residents (S39) and compares the total number with a reference value X (S40). If the total number is less than the reference value X, the controller 312 generates the control signal for dimming the light down to, for example, 400 lux (S41). Otherwise, the sequence returns to step (S31). In this manner, the lighting administrator 310 controls the illumination level of the lighting fixture on a real-time basis in accordance with the number of residents known from the commands carried on the optical signals from the input devices.
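The occupancy-to-lighting decision of FIG. 10 reduces to three cases: empty room, sparsely occupied room, and otherwise. A sketch under stated assumptions; the patent gives the 400 lux example but does not fix the reference value X, so `reference=3` is an invented placeholder.

```python
def lighting_project(num_residents, reference=3, dim_lux=400):
    """Decide the lighting project from the occupant count (S37-S41).
    'reference' stands in for the patent's unspecified reference value X;
    the 400 lux dim level follows the example in the text."""
    if num_residents == 0:
        return "off"                          # S38: nobody present
    if num_residents < reference:
        return "dim to %d lux" % dim_lux      # S41: few residents
    return "full"                             # enough residents: no change
```

A real controller would call this each time the resident count changes, then forward the resulting project to the lighting fixture as the control signal.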
  • FIGS. 11 and 12 illustrate an alternative lighting control system in which the input device 10 is utilized to transmit the commands in the form of the optical signal to the light receiver 301.
  • Here, the static actions, i.e., the resting faces of the input device, are allocated to commands "1" to "6", which are transmitted by flashing the light from the LEDs at different frequency patterns, as shown in the table of FIG. 11.
  • Concurrently, the LEDs are controlled to give different colors in match with the commands, as shown in the table of FIG. 11.
  • The optical signals from the LEDs are constantly received at the light receiver 301 (S50), as shown in the flowchart of FIG. 12.
  • When the flashing frequency matches one of the predetermined frequency patterns (S51), the light receiver 301 accumulates the optical signal in the signal memory 302 (S52).
  • The controller 312 reads the optical signals every 1 minute (S53) and identifies the commands by the flashing frequency pattern with reference to the predefined relation held in the control logic memory 316 (S54). Then, the controller 312 creates a control project of controlling the lighting fixture 300 according to the predominant rule (S55), and sends a corresponding control signal for controlling the lighting fixture 300 in accordance with the control project (S56). It is noted in this connection that the optical signal designating the commands by different frequency patterns is equally applicable to the air-conditioning control system explained above.
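Matching a measured flashing frequency against the predefined relation (S54) is a nearest-match lookup with a tolerance. The frequencies and tolerance below are invented examples, since FIG. 11's actual patterns are not reproduced in the text.

```python
def decode_optical_command(measured_hz, patterns, tolerance=0.5):
    """Match a measured flashing frequency against the predefined
    frequency-to-command relation; return None if nothing matches
    within the tolerance (all numbers here are illustrative)."""
    for hz, command in patterns.items():
        if abs(measured_hz - hz) <= tolerance:
            return command
    return None

# Hypothetical stand-in for the FIG. 11 relation held in the
# control logic memory 316.
PATTERNS = {2.0: "command 1", 4.0: "command 2", 6.0: "command 3"}
```

The tolerance absorbs timing jitter in the receiver; distinct command frequencies must therefore be spaced more than twice the tolerance apart to stay unambiguous.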

Abstract

An input device for a computer has a polyhedral body which is sized to be manipulated by a user and has a plurality of resting faces by which the body can be rested upon a supporting surface. The body incorporates an acceleration sensor providing a sensor output indicating an acceleration being applied to the body. The body includes a circuit which identifies which one of the resting faces is on the supporting surface, as well as which of the dynamic actions the user applies to the body, by analyzing the sensor output from the acceleration sensor. The resting faces and the dynamic actions are allocated to different ones of the commands to be input to the computer, so that the command can be discriminated from the identified resting face or dynamic motion. Thus, the dynamic actions are added to increase the number of commands, improving maneuverability while retaining a rather simple body geometry with a limited number of resting faces.

Description

DESCRIPTION
INPUT DEVICE FOR A COMPUTER AND ENVIRONMENTAL CONTROL SYSTEM USING THE SAME
TECHNICAL FIELD
The present invention is related to an input device for a computer and an environmental control system using the same.
BACKGROUND ART
Japanese Patent Publication No. 9-84145 discloses an input device for a computer. The input device has a polyhedral body having a plurality of flat faces by each of which the body rests on a supporting surface, i.e., a desktop. The body incorporates a plurality of switches each of which corresponds to each face and is closed in response to the body being rest on the supporting surface with the corresponding face up. Each of the faces is designed with a mark identifying a specific action assigned to the closure of the corresponding switch. The user can select the specific action by resting the body on the supporting surface with the corresponding face up. The input device is therefore given the number of different actions which is determined by the number of faces. Since the number of actions is restricted by the number of faces, the polyhedral body is required to have a more number of faces in order to give a more number of actions. With this result, each face of the polyhedral body becomes smaller and is therefore less stable on the supporting surface. Moreover, as the number of faces increase, the user becomes difficult to identify the face or the mark on the face and is therefore required to make careful attention for confirming the face of the polyhedral body, which is inconvenient for selecting the intended action. Accordingly, it is desirable to reduce the number of faces, yet increasing the number of actions.
DISCLOSURE OF THE INVENTION
In view of the above problem, the present invention has been achieved to provide an input device for the computer which is capable of being easily manipulated, yet increasing the number of the actions. The input device in accordance with the present invention includes a polyhedral body which is configured to be manipulated by a user, and to have a plurality of flat resting faces by which the body can be rested upon a supporting surface. The polyhedral body incorporates an acceleration sensor, a face judge, an action-command relation memory, a command composer. The acceleration sensor is configured to provide a sensor output indicating an acceleration being applied to the body. The face judge is configured to discriminate which one of the resting faces is on the supporting surface based upon the sensor output and to provide a face output indicative of the determined resting face. The action-command relation memory is configured to associate the resting faces respectively with different ones of predefined commands in a first set. The command composer is configured to select one of the predefined commands in the first set which is associated with the fact output from the face judge, and to transmit the selected command to the computer. Also incorporated in the polyhedral body is a motion judge which is configured to analyze the sensor output for identifying a specific dynamic motion which the body undergoes, and to provide a motion output indicative of thus identified specific dynamic motion. In this connection, the action-command relation memory is configured to associate a plurality of predefined dynamic motions with different ones of predefined commands in a second set such that the command composer selects one of the predefined commands in the second set which is associated with the motion output, and to transmit the selected command to the computer. 
With the incorporation of the motion judge for identification of the specific dynamic motion that the polyhedral body sees and allocate the specific dynamic motion to one of the predefined commands to be sent to the computer, the input device of the present invention can afford a large number of actions and the corresponding commands more than the number of faces of the polyhedral body, which contributes to increase the number of the actions and commands, while retaining the simply geometry of the polyhedral body, for example, hexahedron with a relatively small number of the faces, for example . Accordingly, the user is easy to manipulate the polyhedral body not only by relatively high face stability but also by easy eye confirmation of the face. Moreover, since the dynamic actions can be easily related to direct and natural user's intention or commands, the user is easy to give the commands to the computer by taking the advantage of the dynamic motions.
Preferably, the action-command relation memory is configured to associate one of the predefined commands in the second set with the one of the predefined commands in the first set which was most recently discriminated by the face judge, whereby one of the specific dynamic motions brings about the command in the first set discriminated immediately previously. For example, one specific dynamic action such as "tapping" can be utilized to repeat the command assigned to the static action, i.e., to any one of the resting faces. Thus, when reentry is required of a command that was entered by statically resting the body on the supporting surface, the reentry can be done simply by the dynamic action of tapping, without forcing the user to reposition the body.
In this connection, the motion judge is configured to accumulate time-series data of the sensor output until the body has been kept stable for a predetermined period, and to analyze the time-series data for identifying the specific dynamic motion. Thus, the command composer transmits the selected command only after the elapse of a predetermined short period of time, thereby assuring reliable identification of the dynamic motion.
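As a rough illustration of this buffering behavior, the following Python sketch accumulates accelerometer samples until the last few readings agree within a tolerance, i.e., until the body has been stable for the predetermined period. All names and numeric thresholds here are assumptions for illustration only; the patent does not give concrete values.

```python
# Hypothetical sketch of the motion judge's buffering strategy: samples are
# accumulated until the body has been stable (near-constant acceleration)
# for a predetermined number of consecutive samples; the buffered time
# series is then available for motion classification.

STABLE_SAMPLES = 5      # assumed length of the "predetermined period"
STABLE_TOLERANCE = 0.1  # assumed jitter (in g) still counted as "stable"

def buffer_until_stable(samples):
    """Return the time-series data collected up to (and including) the
    point where the last STABLE_SAMPLES readings are mutually close."""
    buffer = []
    for xyz in samples:
        buffer.append(xyz)
        tail = buffer[-STABLE_SAMPLES:]
        if len(tail) == STABLE_SAMPLES and all(
            max(abs(a[i] - tail[0][i]) for a in tail) < STABLE_TOLERANCE
            for i in range(3)
        ):
            return buffer
    return buffer  # ran out of samples before stability was reached
```

A shake followed by the body coming to rest would thus yield one buffer containing both the motion and the settling samples, ready for pattern analysis.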
The present invention also relates to an environmental apparatus control system in which the input device is best utilized for easy operation. The control system includes an apparatus configured to control a residential environment, and a plurality of the input devices which belong to individual residents in the residential environment for entry of demands to change the residential environment. The control system also includes a server configured to be connected to the input devices and to analyze the demands collected from the input devices for determination of a control project for controlling the apparatus based upon the analysis; and an apparatus control means configured to control the apparatus in accordance with the control project.
In a preferred embodiment, the server is configured to analyze the collected demands in view of a total number of the residents in the residential environment. In this instance, the input device is configured to allocate one of the resting faces to the command indicating the presence of the resident in the residential environment, so that the server can count the total number of the residents. Thus, so long as the user is in the residential environment, the system requires no separate positive action of expressing presence when calculating the total number of the residents, thereby improving system maneuverability.
These and still other advantageous features of the present invention will become more apparent from the following detailed explanation of the embodiments when taken in conjunction with the attached drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a schematic view of an input device in accordance with a preferred embodiment of the present invention;
FIG. 2 is a block diagram of an electronic circuit module incorporated in the input device;
FIG. 3 is a schematic view illustrating an environmental control system using the above input device;
FIG. 4 is a schematic view illustrating an air-conditioning control of the above system;
FIG. 5 is a table illustrating a relation between the resting faces of the input device and accelerations detected along X, Y, Z-axes;
FIG. 6 is a table illustrating a relation between actions at the input device and commands created by the actions;
FIG. 7 is a flowchart illustrating how the actions are interpreted;
FIG. 8 is a flowchart illustrating the air-conditioning control of the above system;
FIG. 9 is a schematic view illustrating a light control of the above system;
FIG. 10 is a flowchart illustrating the light control of the above system;
FIG. 11 is a table illustrating a relation between the resting faces of the input device and actions allocated to the resting faces employed in an alternative light control of the above system; and
FIG. 12 is a flowchart illustrating the alternative light control of the above system.
BEST MODE FOR CARRYING OUT THE INVENTION
Referring now to FIGS. 1 and 2, there is shown an input device 10 for a computer 100 in accordance with a preferred embodiment of the present invention. The input device 10 has a polyhedral body 20 of a size to be easily manipulated by one hand of a user on or above a supporting surface such as a desktop. The polyhedral body 20 is in the form of a cube having six congruent resting faces, by any one of which the body is rested on the supporting surface. The body 20 incorporates therein an electronic circuit module 30 and three LEDs 41, 42, and 43 respectively emitting red, green, and blue rays. The module 30 includes an acceleration sensor 50, an action processor 60, and a communicator 70 for data transmission with the computer 100 through a cable 90. The acceleration sensor 50, in response to the acceleration that the body 20 undergoes, provides a sensor output representing the accelerations along the X, Y, and Z axes of the body 20. The action processor 60 includes an action analyzer 61, a face judge 62, a motion judge 63, a command composer 64, an action-command relation memory 66, and an LED controller 67.
The action analyzer 61 accumulates time-series data of the sensor output from the acceleration sensor 50, analyzes the data in terms of the acceleration directions and the number of accelerations acknowledged within each predetermined short time period, and gives an action signal to the face judge 62 and to the motion judge 63. The face judge 62 is configured to identify which one of the resting faces is on the supporting surface in response to the action signal, and provides a face output indicative of the thus identified resting face to the command composer 64. In this connection, the face judge 62 identifies the resting face of the body stably held on the supporting surface by reference to a table which, as shown in FIG. 5, relates each of the resting faces, i.e., "Face 1" through "Face 6", to specific combinations of the accelerations along the X-axis, Y-axis, and Z-axis. That is, the face output is generated when the acceleration in the same direction continues over the short time period. Also in response to the action signal, the motion judge 63 is configured to identify a moving pattern of the body 20 by analyzing the changes of the accelerations along the X-axis, Y-axis, and Z-axis within the predetermined period, and provides a motion output indicative of the thus identified moving pattern. The motion output is therefore given when the body 20 is moved within the short time period as a result of being shaken or tapped.
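Such a face lookup can be sketched in Python as follows. The text does not reproduce the actual FIG. 5 table, so the axis-sign-to-face assignment and the 0.8 g threshold below are assumptions: at rest, gravity yields roughly ±1 g on exactly one body axis, and that (axis, sign) pair identifies the resting face.

```python
# A minimal, assumed version of the face judge's table lookup. The mapping
# of axis/sign pairs to faces is illustrative, not the patent's FIG. 5.

FACE_TABLE = {
    ('z', +1): 'Face 1', ('z', -1): 'Face 6',
    ('x', +1): 'Face 2', ('x', -1): 'Face 5',
    ('y', +1): 'Face 3', ('y', -1): 'Face 4',
}

def judge_face(ax, ay, az, threshold=0.8):
    """Return the resting face for one static acceleration sample (in g),
    or None if no single axis dominates (the body is not at rest)."""
    for axis, value in (('x', ax), ('y', ay), ('z', az)):
        if abs(value) >= threshold:
            return FACE_TABLE[(axis, 1 if value > 0 else -1)]
    return None
```

In the device, the face output would be emitted only after this classification stays constant over the short time period mentioned above.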
Upon receiving the face output or the motion output, the command composer 64 refers to the action-command relation memory 66 to select one of the predefined commands, and sends the selected command to the computer 100 for execution there. The memory 66 holds a table which allocates the static actions and the dynamic actions respectively to the different commands, for example, as shown in FIG. 6. The static actions are meant, in this description, to denote the results of placing the body 20 on the supporting surface with any one of the resting faces down, and therefore denote the corresponding resting faces of the body 20, while the dynamic actions denote actions such as shaking and tapping applied to the body 20.
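The action-command relation memory can be pictured as two lookup tables, one per command set, populated here with the FIG. 6 assignments that are quoted later in the description; the function name is illustrative. The command composer then simply indexes whichever table matches the judge output.

```python
# First set: static actions (resting faces), per the commands listed in
# the description of FIG. 6.
FIRST_SET = {
    'Face 1': 'keep temperature',
    'Face 2': 'raise temperature',
    'Face 3': 'lower temperature',
    'Face 4': 'is presence',
    'Face 5': 'is absence',
    'Face 6': 'emergency call',
}
# Second set: dynamic actions.
SECOND_SET = {
    'shaking 2 or 3 times': 'cancel previous command',
    'shaking 5 to 8 times': 'emphasize previous command',
    'rapidly shaking 2 or 3 times': 'inform system failure',
    'tapping': 'repeat previous command',
}

def compose_command(action):
    """Select the predefined command associated with a face or motion output."""
    return FIRST_SET.get(action) or SECOND_SET.get(action)
```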
In the present embodiment, the input device 10 is utilized for an air-conditioning control and a lighting control of an enclosed residential space or environment, as schematically shown in FIG. 3, in reflection of demands from the residents or of the presence of residents in the space. Details of these controls will be discussed later in the description. FIG. 6 lists the predefined commands to be entered at the input device for making the above controls. The commands are classified into a first set and a second set. The commands in the first set include "keep temperature", "raise temperature", "lower temperature", "is presence", "is absence", and "emergency call", which are allocated to the static actions, i.e., faces 1 to 6. The commands in the second set include "cancel previous command", "emphasize previous command", "inform system failure", and "repeat previous command", which are respectively allocated to the dynamic actions, i.e., "shaking 2 or 3 times", "shaking 5 to 8 times", "rapidly shaking 2 or 3 times", and "tapping".
Referring to FIG. 4, the air-conditioning control is explained herein. The control is realized by a system equipped with a server 200 connected to the computers of the residents and to an air-conditioning apparatus 210. The server 200 is configured to have an air-conditioning controller 204, a data memory 205, and a control logic memory 206 holding information as to parameters such as operating conditions or specifications of the apparatus 210, and the temperature, humidity, and thermal characteristics of the residential space. The data memory 205 is configured to store time-series data of the inputs from the input devices through the related computers 100 and a communicator 201. The inputs, in this instance, indicate the residents' demands as well as the total number of the residents. The controller 204 is configured to read out the inputs from the data memory 205 at regular intervals, e.g., every 30 minutes, and refers to the control logic memory 206 in order to create a control project of controlling the air-conditioning apparatus in a direction of raising, lowering, or keeping the temperature based upon a predominant rule, i.e., whichever of the demands "raising temperature", "lowering temperature", and "keeping temperature" is predominant. The control project is then sent through a communicator 202 to the apparatus 210 in order to vary or keep the temperature setting in compliance with the predominant demand. The controller 204 sends back a feedback signal to the individual input devices 10, which activates the LED controller 67 to produce different colors from the combination of the LEDs 41, 42, and 43 depending upon the control project created at the controller 204, i.e., temperature raising, lowering, or keeping, whereby the individual residents can confirm the currently scheduled control project by the color of the LEDs.
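The predominant rule can be sketched as a simple plurality vote over the most recent demand from each resident. The function name, the default, and the tie-breaking behavior below are assumptions; the patent only says the predominant demand wins.

```python
# Sketch of the described "predominant rule": the server tallies the most
# recent demand per input device and steers the temperature in whichever
# direction the plurality asks for.
from collections import Counter

def control_project(recent_demands):
    """recent_demands: the latest demand from each input device, each one
    of 'keep temperature', 'raise temperature', 'lower temperature'.
    Returns the predominant demand; with no demands, keeps the setting."""
    if not recent_demands:
        return 'keep temperature'
    counts = Counter(recent_demands)
    # most_common keeps first-seen order among equal counts; the patent
    # does not specify how ties between demands are resolved.
    return counts.most_common(1)[0][0]
```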
It is noted in this connection that the LED controller 67 is also configured to respond to another external signal from the corresponding computer by flashing the LEDs at a selected frequency or in a predetermined frequency pattern to create an optical signal for controlling a lighting fixture 300, details of which will be discussed later.
The operation of the input device 10 is now discussed in accordance with the flowchart of FIG. 7. Upon connecting the input device 10 to the computer 100 at step S1, the LED controller 67 is activated to turn on the LEDs 41, 42, and 43 for indication of the connection (S2). Then, the command composer 64 checks whether or not it receives the external signal (S3). In the presence of the external signal, the command composer 64 passes the external signal to the LED controller 67, which responds by activating the LEDs in the manner designated by the external signal (S4). In the absence or extinction of the external signal, a static action analyzing sequence is initiated at step (S5) for locating the resting face of the body, provided that the action analyzer 61 gives the static action signal, and a dynamic action analyzing sequence is initiated at step (S8) for checking whether or not the dynamic action signal is detected. After the face judge 62 locates the resting face of the body 20 on the supporting surface (S5), it is checked whether the thus located resting face has changed from that located previously (S6). When no change is acknowledged, the sequence goes back to step (S3). When, on the other hand, a change of the resting face is acknowledged, the command composer 64 identifies the specific command corresponding to the newly located resting face (S7). After the command is identified, the LED controller 67 activates the LEDs in a combination specific to the command (S12), and the command composer 64 transmits the command to the computer 100 (S13), such that the computer 100 executes the command and transmits the corresponding instruction to the server 200 for the air-conditioning control. For example, the LEDs as a whole give a color of red, green, or blue respectively with the control project of raising, lowering, or maintaining the temperature.
When the dynamic action is detected at step (S8), the sequence proceeds to step (S9) for checking whether or not the dynamic action is the tapping action, i.e., the command "Repeat Previous Command" (S9); if no recognizable dynamic action is found, the sequence goes back to step (S3). When the tapping action is acknowledged at step (S9), the command composer 64 fetches or recalls the command identified as corresponding to the most recently located resting face (S10). When the dynamic action is found to be a specific action other than the tapping action, the command composer 64 refers to the action-command relation memory 66 to select the command associated with that specific dynamic action (S11), which is followed by steps (S12) and (S13) to execute the command at the computer 100.
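The "Repeat Previous Command" path can be pictured as a small stateful composer that remembers the command of the most recently located resting face and re-emits it on a tap. The class and method names below are illustrative, not the patent's terminology.

```python
# Sketch of the repeat-previous-command behavior of FIG. 7: a tap re-emits
# the command of the face located most recently, so the user need not
# reposition the body to re-enter the same demand.

class RepeatingComposer:
    def __init__(self, face_commands):
        self.face_commands = face_commands  # first-set table: face -> command
        self.last_command = None            # command of the last located face

    def on_face(self, face):
        """A new resting face was located: emit and remember its command."""
        self.last_command = self.face_commands[face]
        return self.last_command

    def on_tap(self):
        """Tapping: re-emit the previously identified command (or None)."""
        return self.last_command
```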
In this manner, simple manipulation of the input device creates the commands which are executed at the computer 100 for controlling the air-conditioning system through the server 200. The server 200 is kept running to collect the inputs from each of the computers 100 at regular intervals, for example, every 30 minutes. As shown in the flowchart of FIG. 8, while the 30 minutes have not yet elapsed (S21), the server 200 checks whether inputs are available or coming from the computers 100 (S22). If available, the server 200 accumulates the inputs in the data memory 205 to give time-series data of the inputs (S23). If not available, i.e., no input is acknowledged, the server 200 is held standby. Upon the elapse of the 30 minutes, the controller 204 reads out the time-series data from the data memory 205 and fetches the most recent entry for each kind of input given from each of the input devices 10 (S24). Next, the controller 204 calculates the total number of the residents from the inputs indicating the presence and absence of the residents (face 4 and face 5), and analyzes the inputs indicating the demands of keeping the temperature (face 1), raising the temperature (face 2), and lowering the temperature (face 3) so as to create the control project in accordance with the predominant rule and with reference to criteria fetched from the control logic memory 206 (S25). Subsequently, the controller 204 transmits a control signal to the air-conditioning apparatus 210 in order to raise, lower, or keep the temperature in accordance with the control project (S26). Concurrently, the controller 204 generates an informative signal indicative of the control project thus created and transmits the signal to the computers 100 and the input devices 10. Upon receiving the signal on the side of the input device 10, the LED controller 67 is activated to turn on the LEDs 41, 42, and 43 for generating the color expressing the control project (S27).
Thereafter, the controller 204 cancels the inputs in the data memory 205 (S28), and sends a signal indicative of the cancellation of the inputs to the computers 100 and the input devices 10 (S29). Upon receiving the signal, the computer 100 provides information that the previous inputs or demands have been cancelled and that the system awaits the next inputs, while the LEDs of the input device 10 give a color of green indicating that the system awaits the inputs.
When the command from the input device is identified as "Cancel Previous Command", the controller 204 removes the most recent command from the data memory 205 and designates the previous command as the recent one. When the command "Emphasize Previous Command" is acknowledged, the controller 204 gives a weight to the recent command, which is one of "Keep Temperature", "Raise Temperature", and "Lower Temperature" as determined respectively by the resting faces "FACE 1", "FACE 2", and "FACE 3" of the body 20. The weight is considered in creating the control project. When the command "Emergency Call" or "Inform System Failure" is acknowledged, the controller 204 provides a corresponding alarm signal which is sent to a computer in a supervisory room to report the occurrence of the abnormal condition and ask for a suitable remedy.
Now referring to FIG. 9, there is shown the lighting control system in which the input devices 10 are utilized to transmit the optical signal to a lighting administrator 310 for control of the lighting fixture 300. The optical signal, which represents in this instance the presence or absence of the resident in the residential space, is received at a light receiver 301 and is stored in a signal memory 302. The lighting administrator 310 includes a controller 312, which fetches the data from the signal memory 302 on a regular basis to determine a lighting project for control of the lighting fixture, and a control logic memory 316 storing parameters with regard to the operating conditions and specifications of the lighting fixture 300. The controller 312 reads the data every 1 minute, for example, to obtain the total number of the residents present in the residential space, and stores the total number in a resident's number memory 314. Based upon the total number of the residents and the parameters from the control logic memory 316, the controller 312 determines the lighting project and transmits a control signal to the lighting fixture 300 for controlling it in accordance with the lighting project. The thus determined control project is accumulated in a lighting project memory 318 to provide a control history. In this instance, the control project designates the dimming and the extinction of the lighting fixture 300.
The operation of the system is explained with reference to the flowchart of FIG. 10. Prior to following the sequence of the operation, it is noted that the computer 100 connected to the input device 10 is programmed to activate the LED controller 67 to generate the optical signal at regular intervals by flashing the LEDs at one of the predetermined flashing frequencies, each designating one of the commands, in this instance "is presence" and "is absence". The controller 312 constantly monitors whether or not the light or the optical signal is received at the receiver 301 (S30), and checks the flashing frequency (S31). When the flashing frequency matches one of the predetermined frequencies, the controller 312 identifies the command by the flashing frequency (S32). When the command is identified to denote the presence of the resident (S33), the controller 312 increments the number of residents (S34). When the command is identified to denote the absence of the resident (S35), the controller 312 decrements the number of residents (S36). If the command indicating the presence of the resident is not acknowledged from any one of the input devices (S37), the controller 312 generates the control signal to turn off the lighting fixture 300 (S38). Otherwise, the controller 312 obtains the total number of the residents (S39) and compares the total number with a reference value X (S40). If the total number is less than the reference value X (S40), the controller 312 generates the control signal to dim the light down to, for example, 400 LUX (S41). Otherwise, the sequence returns to step (S31). In this manner, the lighting administrator 310 controls the illumination level of the lighting fixture on a real-time basis in accordance with the number of residents known from the commands carried on the optical signals from each of the input devices.
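The occupancy-driven part of this flowchart can be sketched as a single step function: each decoded command adjusts a counter, after which the controller turns the fixture off at zero occupants or dims it below the reference count X. The 400 LUX dim level comes from the text; the function name, the action strings, and clamping the counter at zero are assumptions.

```python
# Sketch of the FIG. 10 occupancy logic: one decoded optical-signal
# command updates the resident count and yields the resulting fixture
# action ('turn off', 'dim to 400 LUX', or 'no change').

def lighting_action(count, command, reference_x):
    """Apply one decoded command and return (new_count, action)."""
    if command == 'is presence':
        count += 1
    elif command == 'is absence':
        count = max(0, count - 1)  # assumed: never count below zero
    if count == 0:
        return count, 'turn off'
    if count < reference_x:
        return count, 'dim to 400 LUX'
    return count, 'no change'
```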
FIGS. 11 and 12 illustrate an alternative lighting control system in which the input device 10 is utilized to transmit the commands in the form of the optical signal to the light receiver 301. In this system, the static actions, i.e., the resting faces of the input device, are allocated to commands "1" to "6", which are transmitted by flashing the light from the LEDs in different frequency patterns, as shown in the table of FIG. 11. For easy confirmation of the intended commands by the user or resident, the LEDs are controlled to give different colors matching the commands, as shown in the table of FIG. 11. The optical signal from the LEDs is constantly received at the light receiver 301 (S50), as shown in the flowchart of FIG. 12. When the flashing frequency matches one of the predetermined frequency patterns (S51), the light receiver 301 accumulates the optical signal in the signal memory 302 (S52). The controller 312 reads the optical signals every 1 minute (S53) and identifies the commands by the flashing frequency pattern with reference to the predefined relation held in the control logic memory 316 (S54). Then, the controller 312 creates a control project of controlling the lighting fixture 300 according to the predominant rule (S55), and sends a corresponding control signal for controlling the lighting fixture 300 in accordance with the control project (S56). It is noted in this connection that the optical signal designating the commands by different frequency patterns is equally applicable to the air-conditioning control system as explained above.

Claims

1. An input device for a computer comprising: a polyhedral body configured to be manipulated by a user, said body having a plurality of flat resting faces by which said body can be rested upon a supporting surface; an acceleration sensor incorporated within said polyhedral body to provide a sensor output indicating an acceleration being applied to said body; a face judge configured to discriminate which one of the resting faces is on said supporting surface based upon the sensor output, and provide a face output indicative of the determined resting face; an action-command relation memory configured to associate said resting faces respectively with different ones of predefined commands in a first set; and a command composer configured to select one of said predefined commands in said first set which is associated with said face output from said face judge, and transmit the selected command to said computer; wherein said device includes a motion judge which is configured to analyze said sensor output for identifying a specific dynamic motion which said body undergoes, and to provide a motion output indicative of the thus identified specific dynamic motion; and said action-command relation memory is configured to associate a plurality of predefined dynamic motions with different ones of predefined commands in a second set such that said command composer selects one of said predefined commands in said second set which is associated with said motion output, and transmits the selected command to the computer.
2. The input device as set forth in claim 1, wherein said action-command relation memory is configured to associate one of said predefined commands in said second set with one of said predefined commands in said first set which is recently discriminated by said face judge, whereby one of said specific dynamic motions brings about one of said predefined commands in said first set discriminated immediately previously.
3. The input device as set forth in claim 1 or 2, wherein said motion judge is configured to accumulate time series data of said sensor output until said body is kept stable for a predetermined period, and to analyze said time series data for identifying said specific dynamic motion such that said command composer transmits the selected command only after an elapse of a predetermined period of time.
4. An environmental apparatus control system using said input device of claim 1 , said system comprising: an apparatus configured to control a residential environment; a plurality of said input devices belonging to individual residents in said residential environment, said input device being configured to input demands of changing said residential environment, a server configured to be connected to said input devices and to give an analysis of said demands collected from said input devices for determination of a control project of controlling said apparatus based upon the analysis; and an apparatus control means configured to control said apparatus in accordance with said control project.
5. The system as set forth in claim 4, wherein said server is configured to analyze said collected demands in view of a total number of the residents in said residential environment, said input device is configured to allocate one of said resting faces with the command indicating the presence of the resident in said residential environment so that said server can collect the total number of the residents in said residential environment.
PCT/JP2006/301080 2005-02-23 2006-01-18 Input device for a computer and environmental control system using the same WO2006090546A2 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2005-048039 2005-02-23
JP2005048039 2005-02-23
JP2005-243090 2005-08-24
JP2005243090A JP2006270913A (en) 2005-02-23 2005-08-24 Input unit and environment control system using the same

Publications (2)

Publication Number Publication Date
WO2006090546A2 true WO2006090546A2 (en) 2006-08-31
WO2006090546A3 WO2006090546A3 (en) 2007-07-12

Family

ID=36263984

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2006/301080 WO2006090546A2 (en) 2005-02-23 2006-01-18 Input device for a computer and environmental control system using the same

Country Status (2)

Country Link
JP (1) JP2006270913A (en)
WO (1) WO2006090546A2 (en)


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4690376B2 (en) * 2007-10-18 2011-06-01 株式会社ナナオ Remote control device, remote control system and electrical equipment
JP5258816B2 (en) * 2010-02-27 2013-08-07 三菱電機株式会社 Air conditioner

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5390206A (en) * 1991-10-01 1995-02-14 American Standard Inc. Wireless communication system for air distribution system
JPH0984145A (en) * 1995-09-18 1997-03-28 Omron Corp Polygon-type remote control unit and various kinds of systems using the unit
US5745599A (en) * 1994-01-19 1998-04-28 Nippon Telegraph And Telephone Corporation Character recognition method
US20020072356A1 (en) * 2000-12-13 2002-06-13 Atsushi Yamashita Mobile terminal, and automatic remote control system and automatic remote control method
US20020167699A1 (en) * 2000-05-17 2002-11-14 Christopher Verplaetse Motion-based input system for handheld devices
WO2003001340A2 (en) * 2001-06-22 2003-01-03 Motion Sense Corporation Gesture recognition system and method
US20030076343A1 (en) * 1997-08-29 2003-04-24 Xerox Corporation Handedness detection for a physical manipulatory grammar
US20030103091A1 (en) * 2001-11-30 2003-06-05 Wong Yoon Kean Orientation dependent functionality of an electronic device
US20030179246A1 (en) * 2002-03-22 2003-09-25 Koninklijke Philips Electronics N.V. Low cost interactive program control system and method
US20040227741A1 (en) * 2003-05-16 2004-11-18 Fuji Xerox Co., Ltd. Instruction inputting device and instruction inputting method


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013035022A1 (en) * 2011-09-06 2013-03-14 Koninklijke Philips Electronics N.V. Activity monitoring for demand-controlled ventilation
WO2014167465A1 (en) * 2013-04-11 2014-10-16 Koninklijke Philips N.V. User interface with adaptive extent of user control based on user presence information
US10024566B2 (en) 2013-04-11 2018-07-17 Philips Lighting Holding B.V. User interface with adaptive extent of user control based on user presence information
CN107957120A (en) * 2017-11-24 2018-04-24 广东美的制冷设备有限公司 Air supply method, air conditioner and the computer-readable recording medium of air conditioner
CN107957120B (en) * 2017-11-24 2021-06-25 广东美的制冷设备有限公司 Air supply method of air conditioner, air conditioner and computer readable storage medium
CN111854116A (en) * 2019-04-26 2020-10-30 珠海格力电器股份有限公司 Control method and device of controller, storage medium and controller

Also Published As

Publication number Publication date
JP2006270913A (en) 2006-10-05
WO2006090546A3 (en) 2007-07-12


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 06701418

Country of ref document: EP

Kind code of ref document: A2

WWW Wipo information: withdrawn in national office

Ref document number: 6701418

Country of ref document: EP