US20070072511A1 - USB desktop toy - Google Patents

USB desktop toy

Info

Publication number
US20070072511A1
US20070072511A1 US11/349,991 US34999106A
Authority
US
United States
Prior art keywords
model
computer
gesture
movable parts
mechanical
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/349,991
Inventor
Shuka Zernovizky
Itzhak Pomerantz
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Western Digital Israel Ltd
Original Assignee
M-Systems Flash Disk Pioneers Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by M-Systems Flash Disk Pioneers Ltd
Priority to US11/349,991
Assigned to M-SYSTEMS FLASH DISK PIONEERS LTD. Assignment of assignors interest (see document for details). Assignors: POMERANTZ, ITZHAK; ZERNOVIZKY, SHUKA
Publication of US20070072511A1
Assigned to MSYSTEMS LTD. Change of name (see document for details). Assignor: M-SYSTEMS FLASH DISK PIONEERS LTD.
Assigned to SANDISK IL LTD. Change of name (see document for details). Assignor: MSYSTEMS LTD
Legal status: Abandoned

Links

Images

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63H: TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H30/00: Remote-control arrangements specially adapted for toys, e.g. for toy vehicles
    • A63H13/00: Toy figures with self-moving parts, with or without movement of the toy as a whole
    • A63H13/02: Toy figures with self-moving parts, with or without movement of the toy as a whole imitating natural actions, e.g. catching a mouse by a cat, the kicking of an animal

Abstract

An amusing desktop toy, and methods of its use. The toy, in the shape of a realistic or imaginary creature, is connected to a local computer via a communication port, and is actuated by a local or remote user to move so as to directly express gestures visually in three dimensions. The remote user sends gesture instructions to actuate the toy via a telecommunication mechanism from a remote computer. The remote user can program the gesture instructions via a graphical user interface. Optionally, the toy includes a speaker and/or a UFD.

Description

  • This patent application claims the benefit of U.S. Provisional Patent Application No. 60/720,056 filed Sep. 26, 2005.
  • FIELD AND BACKGROUND OF THE INVENTION
  • The present invention relates to a system and method for connecting an amusing mechanical toy to a computer that can be controlled by a computer user (either locally or remotely to the computer) in order to convey various forms of expression.
  • Software programs used for textual communication between computer users often provide users with graphical symbols known as icons, emoticons, and winks. Users often embed these symbols in their messages for additional modes of expression. In the prior art, these symbols have been limited to two-dimensional drawings on the screen.
  • In order to further enhance textual communication between computer users, a device that could convey an intended emotion or gesture by movement in physical space (as opposed to just a screen depiction or animation) would be desirable. This type of three-dimensional figurine (or toy) would extend the forms of expression available to these types of users. This would expand the variety of multimedia available to textual message software, and its users, beyond just screen images and sounds. In addition, it would provide an amusing experience like most other toys generally do.
  • For the purpose of this disclosure and claims, the term “amusing” is used in this application to describe the property of stimulating at least one of an interest, an excitement, or an arousal in a viewer. An amusing object or event is, under this definition, distinguished from a functional object or event. A robot is an automated mechanical device that is functional, in the sense that it can work unobtrusively and may not create any sentiment or interest in the viewer. A toy drummer is an automated mechanical device that may not be functional or useful at all, but its motion is funny, threatening, intriguing or arousing to a viewer. Clearly, a device can be both amusing and functional, or be neither amusing nor functional. This application is about amusing devices, in the sense of the definition provided above.
  • The term “gesture” is used in this application as a name of a predetermined collection of movements of the model (specifically, a minimum of three movements), carried out through a set of software instructions designed to represent a known motion, such as nodding, kneeling, or waving hands.
  • SUMMARY OF THE INVENTION
  • It is the purpose of the present invention to provide an amusing mechanical toy that is connected to the computer via a communication port, such as a universal serial bus (USB) port. This toy responds with motion to instructions provided to it by either its local user or a remote user who is in communication with the local user. The toy is a mechanical model of a real or imaginary creature, such as a person or an animal, preferably with an amusing appearance, which is connected to a computer via a serial port or a USB port and is placed on or near a desk. The toy is equipped with mechanical actuators, such as motors or electromagnets, that cause perceptible motion in the model in response to commands sent from the computer in order to convey a gesture.
  • The term “perceptible motion” is used in this application to refer to motion that is appreciable enough to be observed by a user, as opposed to slight variations in shape created, e.g., by tightening, loosening, vibrating, or heating an object.
  • A gesture is expressed directly by motion itself and not by a consequence of the motion. Therefore, we are excluding a three-dimensional motion that indirectly expresses a two-dimensional gesture, such as the motion of a printer printing a smiley face on paper.
  • This model can serve as a toy or as a mechanism to support communication between people by adding mechanized body language to the verbal or textual discourse. In these latter applications, the remote user has a vocabulary of gestures from which to choose. If the local user has such a model connected to the local computer, and if the remote user is made aware of this fact, then the remote user can include, in his text, commands that will cause the local toy to perform some of the motions to physically emphasize the text.
  • Preferably, a variety of models of the toy are made available, each having a different set of gestures. Each toy is supported by a file listing its set of gestures, and the host computer is able to read the toy's capabilities and configure the toy software accordingly. This enables a remote computer to activate a local toy by reading its possible gestures and presenting these to the remote user, who can then send proper instructions to the local computer to activate the local toy.
  • The system includes a telecommunication mechanism for triggering a command from a computer, and a communication mechanism for sending a command from a second computer to the device. The telecommunication mechanism includes an “internet” which is a set of interconnected computer networks. The most well-known internet is the Internet.
  • Therefore, according to the present invention, there is provided for the first time a system for expressing a gesture in three dimensions, the system including: (a) a mechanical device operative to perceptively move at least three parts thereof in response to gesture instructions; (b) a gesture instruction, for controlling said mechanical device, embedded in a correspondence, the gesture instruction originating from a remote computer; (c) a local computer for supplying the gesture instruction to the mechanical device during the course of the correspondence between the remote computer and the local computer; and (d) a gesture instruction interpreter operative to extract the gesture instruction from the correspondence, and convey the gesture instruction to the mechanical device, thereby directly expressing the gesture visually.
  • Preferably, the system also includes: (e) a telecommunication mechanism for sending the gesture instruction from a remote computer.
  • Most preferably, the telecommunication mechanism also includes an internet.
  • Most preferably, the remote computer also includes a graphical user interface (GUI) for programming the gesture instruction via the telecommunication mechanism.
  • Preferably, the system also includes: (e) a communication mechanism, such as a generic, wired serial port or a wireless port, for sending at least one gesture instruction from a local computer to the mechanical device.
  • Preferably, the system also includes: (e) a UFD housed in the mechanical device for storing and retrieving data.
  • Most preferably, the UFD is detachable.
  • According to the present invention, there is provided for the first time an amusing physical model controlled by a computer, the model including: (a) at least three movable parts; (b) at least one mechanical actuator for moving the at least three movable parts, the number of mechanical actuators being less than the number of movable parts; and (c) a communication port for receiving a gesture instruction from the computer for controlling the mechanical actuator to directly express a gesture visually.
  • Preferably, the model is shaped as a realistic creature or as a fictitious creature.
  • Preferably, the movable parts represent limbs.
  • Preferably, the model also includes: (d) a UFD housed in the model for storing and retrieving data to and from the computer.
  • Most preferably, the UFD is detachable.
  • Preferably, the model also includes: (d) a speaker housed in the model for receiving signals from the computer via the communication port for conveying sound.
  • Preferably, the model includes only one mechanical actuator that is operative to move at least three movable parts using at least one flexible mechanical linkage for controlling the model to express the gesture.
  • Preferably, the mechanical actuator is pneumatically-controlled.
  • Preferably, the mechanical actuator is hydraulically-controlled.
  • Preferably, the mechanical actuator comprises at least one gear for moving the movable parts.
  • Preferably, the communication port is a generic, wired serial port or a wireless port.
  • Preferably, the model also includes: (d) a computer-readable storage medium that includes a data file of the gesture instruction.
  • According to the present invention, there is provided for the first time an amusing physical model controlled by a computer, the model including: (a) at least three movable parts; (b) a mechanical actuator for moving at least three movable parts; (c) a communication port for receiving gesture instructions from the computer for controlling the mechanical actuator to directly express a gesture visually; and (d) a UFD housed in the model for storing and retrieving data to and from the computer via the communication port.
  • Preferably, the model is shaped as a realistic creature or as a fictitious creature.
  • Preferably, the movable parts represent limbs.
  • Preferably, the UFD is detachable.
  • Preferably, the model also includes: (e) a speaker housed in the model for receiving signals from the computer via the communication port for conveying sound.
  • Preferably, the model includes only one mechanical actuator that is operative to move at least three movable parts using at least one flexible mechanical linkage for controlling the model to express the gesture.
  • Preferably, the mechanical actuator is pneumatically-controlled.
  • Preferably, the mechanical actuator is hydraulically-controlled.
  • Preferably, the mechanical actuator comprises at least one gear for moving the movable parts.
  • Preferably, the communication port is a generic, wired serial port or a wireless port.
  • Preferably, the model also includes: (e) a computer-readable storage medium that includes a data file of the gesture instructions.
  • According to the present invention, there is provided for the first time a method of directly expressing a gesture of a physical model with at least three movable parts visually in three dimensions by a remote user of a remote computer to a local user of a local computer, the method including the steps of: (a) providing the local user with a computer-controlled mechanical device operationally connected to the local computer; and (b) remotely activating the mechanical device to move in a manner that expresses the gesture visually by the remote user to the local user.
  • Preferably, the step of activating the mechanical device is effected by an application running on the remote computer.
  • According to the present invention, there is provided for the first time a method of self-expression through a three-dimensional visual gesture of a physical model with at least three movable parts, the method including the steps of: (a) programming a computer with at least one software module that activates the model to perform the gesture by moving parts of the model; (b) operationally connecting the model to the computer; and (c) activating at least one software module from the computer, thereby directly expressing the gesture.
  • These and further embodiments will be apparent from the detailed description and examples that follow.
  • Devices that resemble the present invention are known in the art. One such device is the Doc Johnson High Joy Enabled® iVibe Rabbit from High Joy Products, LLC, which allows a user to be stimulated physically by the device through a computer control that can be operated by the user or a remote operator. In contrast to the present invention, the user is required to be in physical contact with the device in order to be stimulated. In addition, the act of wearing the device creates a sense of anticipation of the forthcoming stimulation in the user. In the case of the present invention, by contrast, the user can be spontaneously surprised by the device, since its operation does not require active involvement by the user (other than being in viewing range). Furthermore, the stimulation, in and of itself, does not constitute a gesture as defined above.
  • Another device that resembles the present invention is the Nabaztag (“Wi-Fi Rabbit”) from Violet, The Smart Object Company. This device allows a user to be notified of various information through a wireless link to the Internet. The information is conveyed by the device's speaker via simulated voices. The information is obtained from the services that the Nabaztag provides (such as weather forecasts). In addition to talking, the Nabaztag can flash lights, play music, and move its ears. In contrast to the present invention, the expression of gestures as defined in this application is not possible with the Nabaztag; its limited rotational movement is inadequate to express a gesture. In the case of the present invention, by contrast, the device makes at least three movements to express a gesture. This is considered, in this application, to be the minimum number of motions necessary to express a gesture realistically, where the moving parts represent limbs of a figure or creature (e.g. head, arms, legs, tail, etc.).
  • Another example of a prior art device, developed by a research team at Nanyang Technological University, Singapore, is a USB jacket that can simulate, for the wearer, the act of hugging as transmitted by a remote operator. In contrast to the present invention, the expression of the gesture is created by physical contact and creates no visual effect. Furthermore, the prior art device does not move or change its shape perceptibly, as defined above, in its operation. In the case of the present invention, the device changes its shape perceptibly to convey the visual expression of a gesture.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The invention is herein described, by way of example only, with reference to the accompanying drawings, wherein:
  • FIG. 1A shows a simplified block diagram of a system for controlling a mechanical model via a local computer;
  • FIG. 1B shows a simplified block diagram of a system for controlling a mechanical model via a UFD;
  • FIG. 2 shows an illustration of a toy robot with motors;
  • FIG. 3 shows illustrations of collapsible toy characters;
  • FIG. 4A shows a simplified block diagram of a collapsible model with its control string spooled;
  • FIG. 4B shows a simplified block diagram of a collapsible model with its control string unspooled.
  • DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • The present invention is of a system for a USB desktop toy and a method for expressing gestures via such a toy. Specifically, the present invention can be used for amusement and expressing emotion. The principles and operation of a USB desktop toy according to the present invention may be better understood with reference to the drawings and the accompanying description.
  • Referring now to the drawings, FIG. 1A shows a simplified block diagram of a system for controlling a mechanical model (or toy) 20 of a human figure, made of a head 22, arms 24, 26, a torso 28 and legs 30, 32 via a local computer 34. A mechanical actuator 36, such as a high-torque motor used in robotic applications (available, for example, from Hobby Engineering, 180 El Camino Real, Millbrae, Calif. 94030, known as Boe-Bot Complete Robot Kit v2), is mechanically engaged to some of the movable body parts of the model. Local computer 34, connected to mechanical actuator 36 through a USB connection, sends gesture instructions to mechanical actuator 36. In this embodiment, the gesture instructions are motion commands. To clarify the distinction, we mean that any motion is a part of a gesture, and any gesture is made of motion.
  • The instructions to mechanical actuator 36 in FIG. 1A can be relayed in a number of ways. As shown in FIG. 1A, the instructions can be: (a) played from a file on local computer 34, (b) programmed by the user on an I/O interface 38 (that represents user I/O devices such as a keyboard, a mouse, or a display screen), or (c) communicated via an internet 40 from a remote computer 42 in the course of a correspondence session.
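  • A minimal sketch of option (a) above is given below: gesture instructions are played from a file and relayed to mechanical actuator 36 over a USB-to-serial link. The single-character wire commands and the port name are illustrative assumptions, not a protocol defined by the patent; the sketch uses the pyserial package on the host side.

```python
# Hedged sketch: play a stored gesture by sending motion commands over a serial
# (e.g. USB-to-serial) link. The command bytes and port name are assumptions
# made for this example; a real toy would define its own protocol.
import time
import serial  # pip install pyserial

MOTION_COMMANDS = [b"H+", b"H-", b"A+", b"A-"]  # e.g. head up/down, arm up/down

def play_gesture(port: str = "/dev/ttyUSB0", delay_s: float = 0.5) -> None:
    with serial.Serial(port, baudrate=9600, timeout=1) as link:
        for command in MOTION_COMMANDS:
            link.write(command)   # one motion command per actuator step
            time.sleep(delay_s)   # pace the motions so they are perceptible

# play_gesture()  # uncomment when a toy is attached to /dev/ttyUSB0
```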
  • Clearly, the model can represent either real creatures, such as humans or animals, or imaginary creatures, such as Donald Duck or Smurfs. As toy 20 is located near local computer 34, to which toy 20 is connected via a communication port (such as a USB port), toy 20 can also easily serve as a portable storage device and contain a USB flash memory drive (UFD). As seen in FIG. 1A, local computer 34 is connected via a communication port to a UFD 44 located within toy 20. Only one communication port is necessary to control mechanical actuator 36 and interface to UFD 44. Optionally, UFD 44 may be detached from toy 20 in order to allow greater mobility to UFD 44 without disconnecting toy 20.
  • As toy 20 is computer-controlled, a batch of gesture instructions contained in a data file can either be created on local computer 34, or can be received from remote computer 42. This batch of instructions can make the toy move according to predefined choreography in order to perform predefined gestures. Remote computer 42 contains a memory 46 and a database 48. Database 48 contains the data files representing the feasible gestures that toy 20 can make. Database 48 is either loaded into memory 46 or transmitted to remote computer 42 via internet 40 at the start of a correspondence session.
  • In one embodiment, remote computer 42 embeds the gesture instruction in the correspondence with local computer 34. A gesture instruction interpreter 49, residing on local computer 34, extracts the gesture instruction from the correspondence, and conveys the gesture instruction to toy 20.
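  • As a minimal sketch of this embodiment, gesture instruction interpreter 49 could scan the text of the correspondence for an embedded command tag and strip it from the visible message before forwarding it to toy 20. The "[gesture:NAME]" tag syntax below is an illustrative assumption, not a format defined by the patent.

```python
# Hedged sketch of a gesture instruction interpreter that extracts commands
# embedded in chat text using a made-up "[gesture:NAME]" tag.
import re

GESTURE_TAG = re.compile(r"\[gesture:(\w+)\]")

def extract_gestures(message: str) -> tuple[str, list[str]]:
    """Return the visible text and the list of embedded gesture names."""
    gestures = GESTURE_TAG.findall(message)
    visible_text = GESTURE_TAG.sub("", message).strip()
    return visible_text, gestures

text, gestures = extract_gestures("Great news! [gesture:wave] See you soon. [gesture:bow]")
# text -> "Great news!  See you soon." ; gestures -> ["wave", "bow"]
```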
  • FIG. 1B shows a simplified block diagram of a system for controlling a mechanical model (or toy) 20 of a human figure, made of a head 22, arms 24, 26, a torso 28 and legs 30, 32 via a UFD 44. In this system, toy 20 is configured similarly to FIG. 1A except for the control of mechanical actuator 36. In this embodiment, local computer 34 sends a gesture instruction to UFD 44. The internal processor of UFD 44 then decodes the gesture instruction, and sends the appropriate sequence of motion commands to mechanical actuator 36. This enables a user to choose, in a single step, gestures that are composed of many motion commands (e.g. Hello, Goodbye, Deep Bow gesture instructions).
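  • The decoding step performed by the internal processor of UFD 44 might look like the sketch below, in which each high-level gesture name expands into a fixed sequence of low-level motion commands. The gesture names and command strings are illustrative assumptions, not values defined by the patent.

```python
# Hedged sketch: expand a single gesture instruction into its motion-command
# sequence, as the UFD's processor is described as doing above.
GESTURE_TABLE = {
    "HELLO":    ["raise_right_arm", "wave_right_arm", "lower_right_arm"],
    "GOODBYE":  ["raise_both_arms", "wave_both_arms", "lower_both_arms"],
    "DEEP_BOW": ["tilt_head_down", "bend_torso", "straighten_torso"],
}

def decode(gesture_instruction: str) -> list[str]:
    """Return the motion-command sequence for one gesture instruction."""
    return GESTURE_TABLE.get(gesture_instruction.upper(), [])

assert decode("Deep_Bow") == ["tilt_head_down", "bend_torso", "straighten_torso"]
```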
  • FIG. 2 shows an illustration of a toy robot with motors. In another preferred embodiment of this invention, the model is a robot with motors moving the limbs and head as shown in FIG. 2 (available under the name “Elenco Robomech Mechanical Motorized Wooden Kit” from Tower Hobbies, Champaign, Ill.).
  • FIG. 3 shows illustrations of collapsible toy characters. In one embodiment of the present invention, the model includes linkages and strings, like the plastic sports characters with movable body parts as shown in FIG. 3. The linkages (which represent the limbs) are moved by a control string.
  • FIG. 4A shows a simplified block diagram of a collapsible model with its control string spooled. A model base 50 houses a mechanical actuator 52 which controls the rotation of a spool 54. A control string 56 is attached to spool 54 and can be wound onto spool 54 by actuator 52. Actuator 52 includes a communication port 58 such as a generic, wired serial port or a wireless port (such as a Bluetooth port). A generic, wired serial port is a general-purpose interface used for serial communication. A USB port is one type of generic, wired serial port, and a wireless port may be a serial or parallel port. Base 50 also houses an optional UFD 60 that communicates with the computer via communication port 58. This extends the functional use of the toy. Limbs, as well as a torso 68 and a head 74, can be constructed of multiple links for greater definition of motion (e.g. a lower leg 62, an upper leg 64, a forearm 70, and an upper arm 72). Control string 56 can connect to, and control, the model independently or through an additional string 66 attached to the model. The model also houses a speaker 76 for producing sounds via communication port 58.
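  • As a rough aid to the spool mechanics of FIG. 4A, the length of control string taken up or released equals the spool radius times the rotation angle, so actuator 52 can translate a desired change in string slack into a motor rotation. The dimensions in the sketch below are illustrative assumptions, not values from the patent.

```python
# Hedged back-of-the-envelope sketch of the spool geometry (arc length = radius * angle).
import math

def string_takeup_mm(spool_radius_mm: float, rotation_deg: float) -> float:
    """Length of string wound onto the spool for a given rotation."""
    return spool_radius_mm * math.radians(rotation_deg)

# An assumed 5 mm spool turned half a revolution takes up about 15.7 mm of string:
slack_change = string_takeup_mm(spool_radius_mm=5.0, rotation_deg=180.0)
```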
  • FIG. 4B shows a simplified block diagram of a collapsible model with its control string unspooled. In this depiction, spool 54 is unwound, allowing the string to loosen and thereby moving the linkages in the model. The resulting slack in the string creates flexing at a joint 78 between linkages and tipping of a limb at a fastening point 80.
  • As there may be many models, with a variety of possible movements, the set of possible movements can be saved in a file which can be made available to remote computers. This enables a remote user to program instructions for the local toy and use them to cause the local toy to move in response to remote instructions. When this is done in the course of real-time correspondence, the toy adds gestures and body language to emphasize important ideas and feelings/moods within the conversation.
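  • By way of illustration only, the sketch below shows one possible layout for such a file of possible movements, together with the host-side step of reading it. The file format, field names, and gesture vocabulary are assumptions made for this example; the patent does not define a file format.

```python
# Hedged sketch: a per-model "capabilities" file listing movable parts and
# feasible gestures, and the step of loading it to configure the toy software.
import json

EXAMPLE_CAPABILITIES = """
{
  "model": "desktop-figure-v1",
  "movable_parts": ["head", "torso", "left_arm", "right_arm", "left_leg", "right_leg"],
  "gestures": {
    "wave": [["right_arm", 60], ["right_arm", 0], ["right_arm", 60]],
    "bow":  [["head", 30], ["torso", 45], ["torso", 0]],
    "nod":  [["head", 20], ["head", 0], ["head", 20]]
  }
}
"""

def load_gesture_vocabulary(text: str) -> dict:
    """Parse the capabilities file and return the mapping of gesture names to moves."""
    return json.loads(text)["gestures"]

if __name__ == "__main__":
    gestures = load_gesture_vocabulary(EXAMPLE_CAPABILITIES)
    # A remote computer can present these names to the remote user for selection.
    print("Available gestures:", ", ".join(sorted(gestures)))
```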
  • As many of the potential users of this toy may not be professional programmers, the toy is preferably represented schematically on the screen of the computer, so that the user can program movements by using a mouse to click and drag control points on the screen that represent real movements of the corresponding points of the toy in space. Preferably, the system has a “record” mode and a “play” mode: the record mode stores the gestures as marked by the mouse on the screen, and the play mode sends these gestures to the toy for execution.
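  • The record and play modes could be sketched as follows, with record mode storing timestamped control-point positions (as produced by the click-and-drag interface) and play mode replaying them in order. The data layout and the send_to_toy callback are assumptions made for this example.

```python
# Hedged sketch of "record" and "play" modes for user-programmed gestures.
import time

class GestureRecorder:
    def __init__(self) -> None:
        self.frames: list[tuple[float, str, float]] = []  # (time, part, position)
        self._start = time.monotonic()

    def record(self, part: str, position: float) -> None:
        """Record mode: called for each mouse drag of an on-screen control point."""
        self.frames.append((time.monotonic() - self._start, part, position))

    def play(self, send_to_toy) -> None:
        """Play mode: replay the recorded motions with their original timing."""
        start = time.monotonic()
        for timestamp, part, position in self.frames:
            time.sleep(max(0.0, timestamp - (time.monotonic() - start)))
            send_to_toy(part, position)

recorder = GestureRecorder()
recorder.record("right_arm", 60.0)   # e.g. user drags the arm control upward
recorder.record("right_arm", 0.0)
recorder.play(lambda part, pos: print(f"move {part} to {pos}"))
```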
  • In a preferred embodiment of the present invention, the model also includes a speaker, and the instructions to the model can include sound files to be played through that speaker.
  • It is noted that the toy does not need to be connected to the computer through a USB port. It can equally be connected via a serial port, a parallel port, or a wireless port such as Bluetooth.
  • In a preferred embodiment of this invention, the toy can have a USB socket, and serve as a UFD cradle on the desk on which the toy rests.
  • In another preferred embodiment of this invention, the instructions to the toy can be triggered by software instructions in running applications.
  • While the invention has been described with respect to a limited number of embodiments, it will be appreciated that many variations, modifications, and other applications of the invention may be made.

Claims (36)

1. A system for expressing a gesture in three dimensions, the system comprising:
(a) a mechanical device operative to perceptively move at least three parts thereof in response to gesture instructions;
(b) a gesture instruction, for controlling said mechanical device, embedded in a correspondence, said gesture instruction originating from a remote computer;
(c) a local computer for supplying said gesture instruction to said mechanical device during the course of said correspondence between said remote computer and said local computer; and
(d) a gesture instruction interpreter operative to extract said gesture instruction from said correspondence, and convey said gesture instruction to said mechanical device, thereby directly expressing the gesture visually.
2. The system of claim 1, further comprising:
(e) a telecommunication mechanism for sending said gesture instruction from said remote computer.
3. The system of claim 2, wherein said telecommunication mechanism includes an internet.
4. The system of claim 2, wherein said remote computer includes a graphical user interface (GUI) for programming said gesture instruction via said telecommunication mechanism.
5. The system of claim 1, the system further comprising:
(e) a communication mechanism for sending at least one said gesture instruction from said local computer to said mechanical device.
6. The system of claim 5, wherein said communication mechanism is a generic, wired serial port or a wireless port.
7. The system of claim 1, the system further comprising:
(e) a UFD housed in said mechanical device for storing and retrieving data to and from said local computer.
8. The system of claim 7, wherein said UFD is detachable.
9. An amusing physical model controlled by a computer, the model comprising:
(a) at least three movable parts;
(b) at least one mechanical actuator for moving said at least three movable parts, the number of said at least one mechanical actuator being less than the number of said at least three movable parts; and
(c) a communication port for receiving a gesture instruction from the computer for controlling said mechanical actuator to directly express a gesture visually.
10. The model of claim 9, wherein the model is shaped as a realistic creature.
11. The model of claim 9, wherein the model is shaped as a fictitious creature.
12. The model of claim 9, wherein said movable parts represent limbs.
13. The model of claim 9, the model further comprising:
(d) a UFD housed in the model for storing and retrieving data to and from the computer.
14. The model of claim 13, wherein said UFD is detachable.
15. The model of claim 9, the model further comprising:
(d) a speaker housed in the model for receiving signals from the computer via said communication port for conveying sound.
16. The model of claim 9, wherein a single said mechanical actuator is operative to move said at least three movable parts using at least one flexible mechanical linkage for controlling the model to express said gesture.
17. The model of claim 9, wherein said mechanical actuator is pneumatically-controlled.
18. The model of claim 9, wherein said mechanical actuator is hydraulically-controlled.
19. The model of claim 9, wherein said mechanical actuator comprises at least one gear for moving said at least three movable parts.
20. The model of claim 9, wherein said communication port is a generic, wired serial port or a wireless port.
21. The model of claim 9, the model further comprising:
(d) a computer-readable storage medium that includes a data file of said gesture instruction.
22. An amusing physical model controlled by a computer, the model comprising:
(a) at least three movable parts;
(b) a mechanical actuator for moving said at least three movable parts;
(c) a communication port for receiving a gesture instruction from the computer for controlling said mechanical actuator to directly express a gesture visually; and
(d) a UFD housed in the model for storing and retrieving data to and from the computer via said communication port.
23. The model of claim 22, wherein the model is shaped as a realistic creature.
24. The model of claim 22, wherein the model is shaped as a fictitious creature.
25. The model of claim 22, wherein said movable parts represent limbs.
26. The model of claim 22, wherein said UFD is detachable.
27. The model of claim 22, the model further comprising:
(e) a speaker housed in the model for receiving signals from the computer via said communication port for conveying sound.
28. The model of claim 22, wherein a single said mechanical actuator is operative to move said at least three movable parts using at least one flexible mechanical linkage for controlling the model to express said gesture.
29. The model of claim 22, wherein said mechanical actuator is pneumatically-controlled.
30. The model of claim 22, wherein said mechanical actuator is hydraulically-controlled.
31. The model of claim 22, wherein said mechanical actuator comprises at least one gear for moving said at least three movable parts.
32. The model of claim 22, wherein said communication port is a generic, wired serial port or a wireless port.
33. The model of claim 22, the model further comprising:
(e) a computer-readable storage medium that includes a data file of said gesture instruction.
34. A method of directly expressing a gesture of a physical model with at least three movable parts visually in three dimensions by a remote user of a remote computer to a local user of a local computer, the method comprising the steps of:
(a) providing the local user with a computer-controlled mechanical device operationally connected to the local computer; and
(b) remotely activating said mechanical device to move in a manner that expresses the gesture visually by the remote user to the local user.
35. The method of claim 34, wherein said step of activating said mechanical device is effected by an application running on the remote computer.
36. A method of self-expression through a three-dimensional visual gesture of a physical model with at least three movable parts, the method comprising the steps of:
(a) programming a computer with at least one software module that activates the model to perform the gesture by moving parts of the model;
(b) operationally connecting the model to said computer; and
(c) activating at least one said software module from said computer, thereby directly expressing the gesture.

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/349,991 US20070072511A1 (en) 2005-09-26 2006-02-09 USB desktop toy

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US72005605P 2005-09-26 2005-09-26
US11/349,991 US20070072511A1 (en) 2005-09-26 2006-02-09 USB desktop toy

Publications (1)

Publication Number Publication Date
US20070072511A1 2007-03-29

Family

ID=37894710

Family Applications (1)

Application Number Title Priority Date Filing Date
US11/349,991 Abandoned US20070072511A1 (en) 2005-09-26 2006-02-09 USB desktop toy

Country Status (1)

Country Link
US (1) US20070072511A1 (en)

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4393728A (en) * 1979-03-16 1983-07-19 Robotgruppen Hb Flexible arm, particularly a robot arm
US5746602A (en) * 1996-02-27 1998-05-05 Kikinis; Dan PC peripheral interactive doll
US6319010B1 (en) * 1996-04-10 2001-11-20 Dan Kikinis PC peripheral interactive doll
US6012961A (en) * 1997-05-14 2000-01-11 Design Lab, Llc Electronic toy including a reprogrammable data storage device
US6290565B1 (en) * 1999-07-21 2001-09-18 Nearlife, Inc. Interactive game apparatus with game play controlled by user-modifiable toy
US6620024B2 (en) * 2000-02-02 2003-09-16 Silverlit Toys Manufactory, Ltd. Computerized toy
US6493606B2 (en) * 2000-03-21 2002-12-10 Sony Corporation Articulated robot and method of controlling the motion of the same
US6292714B1 (en) * 2000-05-12 2001-09-18 Fujitsu Limited Robot cooperation device, and robot cooperation program storage medium
US20020081937A1 (en) * 2000-11-07 2002-06-27 Satoshi Yamada Electronic toy
US20040015265A1 (en) * 2002-03-18 2004-01-22 Yasuharu Asano Robot apparatus and method for controlling the operation thereof
US7515992B2 (en) * 2004-01-06 2009-04-07 Sony Corporation Robot apparatus and emotion representing method therefor

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080313356A1 (en) * 2007-06-15 2008-12-18 Microsoft Corporation Remote control of devices through instant messenger
EP2031481A1 * 2007-08-29 2009-03-04 Industrial Technology Research Institute Information communication and interaction device and method for the same
US20090137323A1 (en) * 2007-09-14 2009-05-28 John D. Fiegener Toy with memory and USB Ports
US8545335B2 (en) 2007-09-14 2013-10-01 Tool, Inc. Toy with memory and USB ports

Legal Events

Date Code Title Description
AS Assignment

Owner name: M-SYSTEMS FLASH DISK PIONEERS LTD., ISRAEL

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ZERNOVIZKY, SHUKA;POMERANTZ, ITZHAK;REEL/FRAME:017916/0220

Effective date: 20060206

AS Assignment

Owner name: MSYSTEMS LTD, ISRAEL

Free format text: CHANGE OF NAME;ASSIGNOR:M-SYSTEMS FLASH DISK PIONEERS LTD.;REEL/FRAME:021799/0726

Effective date: 20060504

AS Assignment

Owner name: SANDISK IL LTD., ISRAEL

Free format text: CHANGE OF NAME;ASSIGNOR:MSYSTEMS LTD;REEL/FRAME:021823/0987

Effective date: 20070101

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION