WO2009026289A2 - Wearable user interface device, system, and method of use - Google Patents

Wearable user interface device, system, and method of use

Info

Publication number
WO2009026289A2
Authority
WO
WIPO (PCT)
Prior art keywords
patch
sensor
adaptable
feedback
data
Prior art date
Application number
PCT/US2008/073591
Other languages
French (fr)
Other versions
WO2009026289A3 (en)
Original Assignee
Hmicro, Inc.
Niknejad, Ali
Priority date
Filing date
Publication date
Application filed by Hmicro, Inc., Niknejad, Ali
Publication of WO2009026289A2
Publication of WO2009026289A3

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Abstract

Provided herein is a wearable interface for providing information from a user to a control unit comprising at least one wearable patch in communication with the control unit, wherein the patch is adaptable to detect object data and transmit the object data to the control unit. Further provided herein is a parameter determining patch for detecting object data from an object, the parameter determining sensor comprising at least one wearable data obtaining patch adaptable to obtain data from an object; and at least one transmitter for transmitting object data. Also included herein are a system and method of use of the invention.

Description

WEARABLE USER INTERFACE DEVICE, SYSTEM, AND METHOD OF USE
CROSS-REFERENCE
[0001] This application claims the benefit of U.S. Provisional Application No. 60/956,806, filed August 20, 2007, which application is incorporated herein by reference.
BACKGROUND OF THE INVENTION
[0002] Computer gaming has become increasingly realistic, with high-resolution graphics, three-dimensional rendering, sophisticated interface devices, and internet multi-player games. Additionally, systems have been developed that allow the user to hold a structure that interfaces with the system, such as a joystick or remote, where the interface detects a player's motion. The system then incorporates the player's motion into a wireless application. Current systems require the user to hold the interface. In some situations the interface can be broken, or it can break objects when the user inadvertently releases it. Additionally, current interface systems are only capable of detecting arm movements. Finally, current systems are not capable of detecting other parameters from the user in order to incorporate the user information into the gaming system. Systems capable of doing this would enhance the user experience.
SUMMARY OF THE INVENTION
[0003] Provided herein is an interface for providing information from a user to a control unit or data processing system comprising at least one wearable patch in communication with the control unit, wherein the patch is adaptable to detect object data and transmit the object data to the control unit. The patch can be adaptable to communicate with the control unit through at least one of narrowband and ultrawideband frequencies. The patch can comprise at least one sensor. Furthermore, the sensor can comprise at least one multipower radio. A multipower radio can function as both a narrowband radio and an ultrawideband radio. The patch can be used to detect object data, wherein the object data comprises at least one detectable parameter. The detectable parameter can comprise one or more of temperature, motion, heart rate, ECG, EEG, blood pressure, and hydration. In some embodiments, the patch can be further adaptable to provide feedback to the user. The feedback can be selected from one or more of on-screen instruction, shock, heat, or vibration. In some embodiments, the patch can be a disposable patch. The patch can be a flexible patch. The patch can be further adaptable to be positioned on an inanimate object. The patch can be further adaptable to be positioned on an animate object. In some embodiments of the patch, the object data detected is motion.
[0004] Another embodiment of the invention described herein comprises a parameter determining patch for detecting object data from an object, the motion determining sensor comprising at least one wearable data obtaining sensor adaptable to obtain data from an object; and at least one transmitter for transmitting object data. The transmitter can be adaptable to transmit data using at least one of narrowband and ultrawideband frequency. The transmitter can comprise at least one multimode radio. The transmitter can also work to receive information from an external source. In some embodiments, the patch can be adaptable to be in communication with a control unit. Additionally, the patch can comprise at least one receiver adaptable to receive data from an external unit. In some embodiments, the patch can be adaptable to stimulate the object with a stimulus. The stimulus can be selected from at least one of on-screen instruction, shock, heat, or vibration. The object data can be selected from at least one of motion, hydration, heart rate, ECG, EEG, blood pressure, and temperature.
[0005] Further provided herein are systems for incorporating information from an object comprising a control unit providing an output associated with an object, and at least one wearable patch in communication with the control unit. The wearable patch can be adaptable to be positioned on the object and further adaptable to detect at least one parameter from the object. Further, the control unit can be adaptable to adjust the output associated with the object in response to the parameter. The object can be an animate object. Alternatively, the object can be an inanimate object. In some embodiments, the parameter detected is movement. The movement can comprise at least one of displacement, velocity, acceleration, or any combination thereof. In some embodiments, the parameter can be a physiological parameter. The physiological parameter can be selected from at least one of temperature, hydration, heart rate, ECG, EEG, blood pressure, or any combination thereof. In some embodiments, the wearable patch can be adaptable to provide feedback to the object.
The feedback can be physical feedback including, but not limited to, at least one of vibration, electric shock, or change in temperature. Furthermore, the data processing system can be adaptable to provide feedback in response to the detected parameter. The feedback can be selected from at least one of audio feedback or visual feedback. Additionally, the system can further comprise at least one currently available data processing interface device. Currently available data processing interface devices include, but are not limited to, joysticks or remotes. The patch can comprise at least one sensor. The sensor can comprise at least one multimode radio.
[0006] Additionally provided herein are methods for interacting with a virtual environment of a control unit comprising positioning at least one wearable patch comprising at least one sensor on an object from which information is desired; acquiring information from the object using the at least one patch; incorporating the object information acquired into the virtual environment; and adjusting the virtual environment in response to the information from the object. The sensor can comprise at least one multipower radio. In some embodiments, the sensor is disposable. The sensor can be flexible. In some embodiments of the method, the positioning step can comprise positioning more than one sensor on the user. The object can be an animate object. Alternatively, the object can be an inanimate object. The sensor can be adaptable to acquire physiological data from an animate object. The physiological data can be selected from at least one of heart rate, ECG, EEG, blood pressure, hydration, speed, temperature, or any combination thereof. In some embodiments, the method can further comprise the step of providing feedback to the object through the patch. The feedback can be a stimulus applied to the user. Additionally, the method can further comprise the step of providing feedback to the object through the virtual environment. The feedback can be audio feedback or visual feedback. In some embodiments, the method further provides for the step of recording the object information. The object information can be recorded and stored and then used later to evaluate the progress of the user. Additionally, the method can comprise the step of recording the object information and then manipulating the recorded information virtually. In some embodiments of the method, the system is a gaming system. The method can further provide for the use of a system that is adaptable to be adjusted in real time. Additionally, the method can further comprise the step of communicating the object information incorporated into the virtual environment with a second computer system accessing the same virtual environment.
INCORPORATION BY REFERENCE
[0007] All publications, patents, and patent applications mentioned in this specification are herein incorporated by reference to the same extent as if each individual publication, patent, or patent application was specifically and individually indicated to be incorporated by reference. [0008] The novel features of the invention are set forth with particularity in the appended claims. A better understanding of the features and advantages of the present invention will be obtained by reference to the following detailed description that sets forth illustrative embodiments, in which the principles of the invention are utilized, and the accompanying drawings of which:
[0009] FIG. 1 illustrates a block diagram of the system;
[0010] FIG. 2 illustrates a body model showing locations on the body where a sensor patch can be located; and
[0011] FIG. 3 illustrates the relationship between patches on the body and the ten-piece model of the body.
DETAILED DESCRIPTION OF THE INVENTION
[0012] Provided herein is an invention comprising at least one lightweight, flexible, and wearable patch that is capable of detecting at least one parameter from an object. The patch can comprise at least one sensor. The patch can further be capable of transmitting the parameter or parameters detected to a control unit, or data processor, which can then incorporate the parameter into a program. The program can be visually represented on a display, where the changes to the program are also visually represented on the display. The patch can further allow the data processing system to faithfully reproduce a computer representation of the user, which adds a new dimension of realism to the program. The users of such systems will have to deal with their own physical constraints. For example purposes only, in a fighting game, the strength of the punches or the ability of a user to run fast can be determined by detecting a user's heart rate and other physical factors. As another example, electrocardiogram (ECG) sensors can be included with the patch and can be used to provide feedback to other players in a team situation. The ability to see other players' heart rates can make it more difficult for players to bluff while playing cards.
[0013] The invention provides that a sensor patch can be placed on the surface of a user or object. In some embodiments, the sensor can be placed on an animate object, such as a human, or even an animal. Alternatively, the patch can be placed on an inanimate object such as, for example purposes only, a piece of sporting equipment. The sensor can measure a parameter from the user or object. In the case of an animate object, the sensor can detect a physiological parameter including, but not limited to, heart rate, hydration, blood pressure, ECG, electroencephalogram (EEG), and temperature. In some embodiments, the sensor can detect the movement of the user's body or movement of at least part of the user's body. The sensor can also be used to detect the spatial relationship between the user and an object or between multiple users. In some embodiments, the sensor can detect a single parameter. In some embodiments, the sensor can be used to detect multiple parameters. The sensor can also be placed on an inanimate object. For example, if the user is playing a tennis video game, the sensor can be placed on the user's own tennis racquet. The movement of the racquet can be detected by the patch. The patch can then send information regarding the movement to the system.
[0014] Once information is detected by the sensor, the sensor can transmit this data to a data processor or control unit. FIG. 1 illustrates a block diagram of information transfer between various possible system components. The body network can comprise sensors and transducers in communication with the user or object. The body network is in communication with a control unit or data processor. The body network can be in electrical communication with the control unit. The control unit processes and combines the data from the patch sensors. The information from the patches can be sent to the control unit through a wireless link. Alternatively, the information can be sent to the control unit through a hard-wire connection. The control unit can be a gaming system.
Alternatively, the control unit can be a medical device, for example purposes only, a heart rate monitor. The control unit can then adjust the program based on the data obtained by the patches. The program can adjust depending on the data from the user.
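The flow in paragraph [0014], patches reporting readings over a wireless or wired link to a control unit that combines them and adjusts the running program, can be sketched briefly. This is a minimal illustration only; the SensorReading and ControlUnit names, the field layout, and the heart-rate threshold are assumptions, not details from the application.

```python
from dataclasses import dataclass
from typing import Dict

@dataclass
class SensorReading:
    """One measurement reported by a wearable patch (hypothetical format)."""
    patch_id: int          # which patch on the body network sent the reading
    kind: str              # e.g. "accel", "heart_rate", "temperature"
    value: tuple           # raw payload, e.g. (ax, ay, az) for an accelerometer
    timestamp: float       # seconds since the session started

class ControlUnit:
    """Combines readings from all patches and adjusts the running program."""
    def __init__(self) -> None:
        self.latest: Dict[int, SensorReading] = {}

    def receive(self, reading: SensorReading) -> None:
        # Keep only the most recent reading per patch; a real unit would also
        # validate, time-align, and filter the incoming data.
        self.latest[reading.patch_id] = reading

    def adjust_program(self) -> str:
        # Placeholder decision: slow the program down if heart rate is high.
        hr = [r.value[0] for r in self.latest.values() if r.kind == "heart_rate"]
        if hr and max(hr) > 160:
            return "reduce_intensity"
        return "continue"

# Example: two patches report, then the program is adjusted.
unit = ControlUnit()
unit.receive(SensorReading(1, "accel", (0.1, 9.8, 0.2), 0.016))
unit.receive(SensorReading(2, "heart_rate", (172,), 0.016))
print(unit.adjust_program())   # -> "reduce_intensity"
```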
[0015] In some embodiments, the data processor or control unit can be connected to an output display, as illustrated in FIG. 1. The display can be used to illustrate a dynamic representation of the output, which can change in response to the input data from the sensor. In some embodiments, the output is displayed graphically. In some embodiments, the output is displayed pictorially. The output of the control unit can also relay information to a display device depicting a motion or position body model. The output from the control unit can be used to show the user what the correct position or motion of a movement is. The output can further be used to show the user what their current position or movement looks like and how they can change their movement or position to correct it.
[0016] The patch can also serve as a user interface. The patch can be used as an interface or input to a control unit, thereby allowing the user to manipulate the control unit. The control unit can then be used to manipulate an external object in response to the user's input.
[0017] Additionally, the invention described herein can be used to provide feedback to the user. As shown in FIG. 1, the feedback can comprise instructional feedback to the user. The feedback can be visual feedback from the display device. Additionally, the feedback can be audio feedback. In some embodiments, where the patches are used with a gaming system, the feedback can be temperature feedback, shock, vibration, or any combination thereof. The feedback can be delivered to the user through the same patches that are used to detect object data. Alternatively, the data can be obtained using a first set of patches, and the feedback can be delivered through a second set of patches.
[0018] In some embodiments, the user can obtain feedback instantaneously. In some embodiments, the data from the user can be detected by the patches while the user is in motion and then the patches can store the information collected. The information can then be downloaded to a mobile device for real-time use. Alternatively, the information can be stored by the patch and then downloaded at some point in time for delayed feedback.
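Paragraph [0018] distinguishes immediate feedback from data that a patch stores and downloads later. A minimal sketch of those two paths, with an assumed PatchBuffer class and sample format not taken from the application:

```python
from collections import deque
from typing import Deque, List, Tuple

class PatchBuffer:
    """Sketch of a patch that either streams samples for real-time use or
    stores them for later download (names and format are illustrative)."""

    def __init__(self, stream: bool, capacity: int = 1000) -> None:
        self.stream = stream
        self.samples: Deque[Tuple[float, float]] = deque(maxlen=capacity)

    def record(self, timestamp: float, value: float) -> None:
        if self.stream:
            self.transmit(timestamp, value)          # immediate feedback path
        else:
            self.samples.append((timestamp, value))  # delayed feedback path

    def transmit(self, timestamp: float, value: float) -> None:
        print(f"streamed t={timestamp:.3f} value={value:.2f}")

    def download(self) -> List[Tuple[float, float]]:
        """Hand the stored samples to a mobile device or control unit."""
        stored = list(self.samples)
        self.samples.clear()
        return stored

# Delayed mode: samples stay on the patch until downloaded.
patch = PatchBuffer(stream=False)
for t in range(3):
    patch.record(t * 0.01, 9.8)
print(patch.download())   # three stored (timestamp, value) pairs
```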
I. DEVICES
[0019] Provided herein is a wearable patch that can be interfaced with a data processing system. At least one patch can be used with a system. In some embodiments, the invention can provide for the use of multiple patches with the system. The patch can comprise at least one multipower radio. The multipower radio can be capable of transmitting data from the patch to a data processing system using either narrowband or ultrawideband frequencies. FIG. 2 illustrates a body model showing locations along the body where patches can be positioned. In some embodiments, patches at positions 4 and 5 of FIG. 2 can comprise patches placed on each finger. In some embodiments, patches at positions 8 and 9 of FIG. 2 can comprise patches placed on each toe. Patches can be positioned on the main part of each location shown in FIG. 2. Additionally, patches can be positioned at the joints. Patches can be positioned at any suitable location for positioning a patch. In some embodiments, five patches will be used: one on each arm, one on each leg, and one on the torso. In some embodiments, more than five patches can be used with the system. The more patches that are used, the greater the sensitivity of the system.
[0020] The patches can be used to detect various parameters that can be used with a data processing system or any other suitable control system. The patches can comprise sensors including, but not limited to, accelerometers, temperature sensors, ECG sensors, EEG sensors, impedance sensors, moisture sensors, or any combination thereof. The sensors can detect parameters from the user or from an object. The detectable parameters include, but are not limited to, temperature, motion, heart rate, ECG data, EEG data, blood pressure, hydration, or any combination thereof. In some embodiments, the patches can be used as part of a feedback system. The ability of the patches to detect user limitations allows the limitations of the players to be included in the processing system. This can enhance the user's interaction with the system, thereby providing a more realistic experience. In some embodiments, the patches can further comprise transducers including, but not limited to, vibrational transducers, electrical transducers, thermal transducers, or any combination thereof.
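Paragraph [0019] states only that a multipower radio can use narrowband or ultrawideband frequencies; it does not say how a band would be chosen. The sketch below assumes one plausible policy (large payloads and a healthy battery favor ultrawideband) purely for illustration:

```python
def choose_band(payload_bytes: int, battery_fraction: float) -> str:
    """Pick a transmission band for a multipower radio.

    The application only says the radio can use narrowband or ultrawideband;
    the policy below is an illustrative assumption, not the patent's rule.
    """
    if payload_bytes > 256 and battery_fraction > 0.2:
        return "ultrawideband"   # high data rate, e.g. raw accelerometer bursts
    return "narrowband"          # low rate / low power, e.g. heart-rate summaries

print(choose_band(payload_bytes=1024, battery_fraction=0.8))  # ultrawideband
print(choose_band(payload_bytes=32, battery_fraction=0.8))    # narrowband
```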
[0021] The relationship between the patch positions and the ten-piece model is shown in FIG. 3. The patches 300 on the body are in communication, preferably electrical communication, with the control unit. The control unit can be in communication with a display device, which can provide visual feedback to the user. The control unit can also be in communication with a body model 350. The model can be updated by the control unit by detecting the motion and position of the patches, in order to track the movement of the body. The ten-piece model can be used to map the data from the accelerometers and translate this data into potential positions or locations of the various parts of the human body. For example, a patch exists at time t1 at position 1 (x1, y1, z1) and at time t2 at position 2 (x2, y2, z2). Using a simple model of human motion, which can include an accurate representation of the range of motion of the joints, the algorithm can search for the possible set of matching motions that take the user's patches from measured position 1 to position 2. Positions that are unlikely due to physical constraints of the user are discarded. The set of motions most consistent with the velocity vectors measured at the previous location can be used to render a three-dimensional version of the user. In this manner, the movements of the player are accurately captured and included in the system. The update time can vary depending on the program. In some embodiments, the update time is on the order of milliseconds.
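The matching step in paragraph [0021] can be illustrated with a simplified sketch: candidate positions the body could not reach within the update interval are discarded, and the remaining candidate most consistent with the previously measured velocity vector is kept. The single max_reach limit stands in for a full joint range-of-motion model and is an assumption, as is the rest of the code:

```python
import math
from typing import List, Optional, Tuple

Vec3 = Tuple[float, float, float]

def dist(a: Vec3, b: Vec3) -> float:
    return math.dist(a, b)

def pick_motion(p1: Vec3, v1: Vec3, dt: float,
                candidates: List[Vec3], max_reach: float) -> Optional[Vec3]:
    """Discard candidate positions that violate the physical constraint,
    then keep the candidate closest to the position predicted by the
    velocity measured at the previous location (illustrative only)."""
    predicted = (p1[0] + v1[0] * dt, p1[1] + v1[1] * dt, p1[2] + v1[2] * dt)
    feasible = [c for c in candidates if dist(p1, c) <= max_reach]
    if not feasible:
        return None
    return min(feasible, key=lambda c: dist(c, predicted))

# A wrist patch moving roughly along +x; the 5 m jump is physically
# impossible within 10 ms and is discarded.
p2 = pick_motion(p1=(0.0, 1.0, 0.0), v1=(1.0, 0.0, 0.0), dt=0.01,
                 candidates=[(0.011, 1.0, 0.0), (5.0, 1.0, 0.0)],
                 max_reach=0.05)
print(p2)   # (0.011, 1.0, 0.0)
```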
II. SYSTEMS
[0022] Further provided herein are systems for incorporating information from an object comprising a control unit providing an output associated with an object, and at least one wearable patch in communication with the control unit. The wearable patch can be adaptable to be positioned on the object and further adaptable to detect at least one parameter from the object. Further, the control unit can be adaptable to adjust the output associated with the object in response to the parameter. The object can be an animate object. Alternatively, the object can be an inanimate object. In some embodiments, the parameter detected is movement. The movement can comprise at least one of displacement, speed, or velocity, or any combination thereof. In some embodiments, the parameter can be a physiological parameter. The physiological parameter can be selected from at least one of temperature, hydration, heart rate, ECG, EEG, blood pressure, or any combination thereof. In some embodiments, the wearable patch can be adaptable to provide feedback to the object. The feedback can be physical feedback including, but not limited to, at least one of vibration, electric shock, or change in temperature. Furthermore, the data processing system can be adaptable to provide feedback in response to the detected parameter. The feedback can be selected from at least one of audio feedback or visual feedback. Additionally, the system can further comprise at least one currently available data processing interface device. Currently available data processing interface devices include, but are not limited to, joysticks or remotes. The patch can comprise at least one sensor. The sensor can comprise at least one multipower radio.
III. METHODS
[0023] Additionally provided herein are methods for interacting with a virtual environment of a control unit comprising positioning at least one wearable patch comprising at least one sensor on an object from which information is desired; acquiring information from the object using the at least one patch; incorporating the object information acquired into the virtual environment; and adjusting the virtual environment in response to the information from the object. The sensor can comprise at least one multipower radio. In some embodiments, the sensor is disposable. The sensor can be flexible. In some embodiments of the method, the positioning step can comprise positioning more than one sensor on the user. The object can be an animate object. Alternatively, the object can be an inanimate object. The sensor can be adaptable to acquire physiological data from an animate object. The physiological data can be selected from at least one of heart rate, ECG, EEG, blood pressure, hydration, speed, temperature, or any combination thereof. In some embodiments, the method can further comprise the step of providing feedback to the object through the patch. The feedback can be a stimulus applied to the user. Additionally, the method can further comprise the step of providing feedback to the object through the virtual environment. The feedback can be audio feedback or visual feedback. In some embodiments, the method further provides for the step of recording the object information. The object information can be recorded and stored and then used later to evaluate the progress of the user. Additionally, the method can comprise the step of recording the object information and then manipulating the recorded information virtually. In some embodiments of the method, the system is a gaming system. The method can further provide for the use of a system that is adaptable to be adjusted in real time. Additionally, the method can further comprise the step of communicating the object information incorporated into the virtual environment with a second computer system accessing the same virtual environment.
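The four method steps above (position, acquire, incorporate, adjust) amount to a simple loop. A hedged sketch, where the callables stand in for patch I/O and the virtual environment and every name is illustrative:

```python
from typing import Callable, Dict

def interaction_loop(acquire: Callable[[], Dict[str, float]],
                     incorporate: Callable[[Dict[str, float]], None],
                     adjust: Callable[[Dict[str, float]], None],
                     steps: int) -> None:
    """Sketch of the acquire -> incorporate -> adjust cycle described above."""
    for _ in range(steps):
        data = acquire()        # read the patches on the object
        incorporate(data)       # fold the object data into the virtual scene
        adjust(data)            # adapt the scene in response to the data

world = {}
interaction_loop(
    acquire=lambda: {"heart_rate": 150.0, "speed": 2.5},
    incorporate=lambda d: world.update(avatar_speed=d["speed"]),
    adjust=lambda d: world.update(difficulty="easier" if d["heart_rate"] > 140 else "normal"),
    steps=1,
)
print(world)   # {'avatar_speed': 2.5, 'difficulty': 'easier'}
```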
IV. EXAMPLES
Example 1. Interactive Exercise Videos
[0024] The invention described herein can be used with an interactive exercise video. The user will attach patches to their body at various locations, such as on the arms, legs, and torso. The user can then follow an on-screen coach while performing various exercises such as yoga, Pilates, or other stretching moves. Alternatively, the user can perform various aerobic exercises or weight lifting routines. The feedback from the patches can then be used to assess how well the user is performing the requested tasks. A visual display can show the user what corrective measures need to be taken, if any. For example, if a user is not performing a stretch properly, the system can indicate this to the user. A computer coach can provide real-time feedback to the user through visual or audio cues. Additionally, physiological parameters, such as heart rate, can be detected to determine whether the user is overexerting himself or herself, and the system can adjust dynamically to compensate for the user's ability. The system can also adjust to maximize the desired result for the user, such as weight loss or aerobic strength.
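A rough sketch of the comparison this example describes, checking measured joint angles against the on-screen coach's reference pose and choosing an intensity from the heart rate. Joint names, the tolerance, and the heart-rate thresholds are assumptions for illustration only:

```python
from typing import Dict

def coach_feedback(measured: Dict[str, float], reference: Dict[str, float],
                   heart_rate: float, tolerance_deg: float = 15.0) -> Dict[str, object]:
    """Compare joint angles reported by the patches against the coach's
    reference pose and pick an intensity from the heart rate."""
    corrections = {joint: reference[joint] - measured.get(joint, 0.0)
                   for joint in reference
                   if abs(reference[joint] - measured.get(joint, 0.0)) > tolerance_deg}
    if heart_rate > 170:
        intensity = "rest"
    elif heart_rate > 150:
        intensity = "reduce"
    else:
        intensity = "maintain"
    return {"corrections": corrections, "intensity": intensity}

print(coach_feedback(measured={"left_knee": 95.0, "right_knee": 172.0},
                     reference={"left_knee": 90.0, "right_knee": 90.0},
                     heart_rate=158.0))
# {'corrections': {'right_knee': -82.0}, 'intensity': 'reduce'}
```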
Example 2. Virtual Reality Gaming Systems
[0025] The invention described herein can be used in conjunction with a completely tetherless player who wears virtual reality goggles to immerse himself or herself in a game. The video game player can now move around, and the player's position and movement are closely tracked. The tracked information can be used to update the scenery in the virtual environment. The transducers can be used to reinforce the visual feedback. For instance, invisible objects can be illuminated in virtual reality, and if the player touches one of these invisible objects with his body, a transducer signal can be used to provide feedback (such as vibration or shock). The user can run, jump, kick, or do anything possible within the confines of his or her environment, and these movements and motions can be accurately tracked and recreated for the game. The game system can be set up so that other users of the game can see a virtual version of the player and the player's movements, even though the users are located at different locations.
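The invisible-object feedback in this example reduces to a proximity test that triggers a transducer. A minimal sketch, with an assumed sphere-shaped object and a vibration command chosen purely for illustration (the text also mentions shock):

```python
import math
from typing import Iterable, Tuple

Vec3 = Tuple[float, float, float]

def touch_feedback(patch_positions: Iterable[Vec3], object_center: Vec3,
                   object_radius: float) -> str:
    """Return the transducer command to fire if any body patch enters the
    volume of an object that is invisible to the player."""
    for p in patch_positions:
        if math.dist(p, object_center) <= object_radius:
            return "vibrate"
    return "none"

# The hand patch at (1.0, 1.2, 0.3) brushes an invisible sphere of radius 0.2.
print(touch_feedback([(0.0, 1.0, 0.0), (1.0, 1.2, 0.3)],
                     object_center=(1.1, 1.2, 0.3), object_radius=0.2))  # vibrate
```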
Example 3. Creating Animated Sequences
[0026] The invention described herein can be used to create animated sequences for motion pictures or for the gaming industry. Typically, simple light sensors (white dots) are used to pick up signals in a video sequence. The locations of these light markers are used to move a computer-animated character in sequence. The current system requires wearing a special uniform with marker tags. Using the invention described herein, the patches can be used to record their location and position automatically. The software can be used to track the motion of the actor for use in movies or games. More complex scenes can be rendered in three dimensions as normal action sequences involving multiple characters are rendered in their natural setting.
Example 4. Medical Applications
[0027] The invention described herein can be used with medical devices. The patches of the invention can be used to assist remote diagnosis of motor impairments. The patches can be placed on the area surrounding a joint that has limited movement. The motion can then be tracked and the range of motion used to determine the extent of an injury. Furthermore, the patches can be used for training and recovery exercises to evaluate the progression of recovery.
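One way to quantify the range-of-motion assessment in Example 4 is sketched below. The metrics are illustrative assumptions; the application only states that the tracked range of motion can be used to evaluate injury and recovery:

```python
from typing import List

def range_of_motion(angles_deg: List[float]) -> float:
    """Range of motion for a joint, taken as the spread of the joint angles
    recorded by patches placed around the joint during the exercise."""
    return max(angles_deg) - min(angles_deg)

def recovery_progress(baseline_rom: float, current_rom: float,
                      healthy_rom: float) -> float:
    """Fraction of the lost range of motion recovered so far (assumed metric)."""
    lost = healthy_rom - baseline_rom
    return 1.0 if lost <= 0 else (current_rom - baseline_rom) / lost

knee_angles = [12.0, 35.0, 61.0, 78.0, 90.0]        # one tracked session
print(range_of_motion(knee_angles))                  # 78.0 degrees
print(recovery_progress(baseline_rom=60.0, current_rom=78.0, healthy_rom=135.0))
# 0.24, about a quarter of the lost motion regained
```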
Example 5. Virtual Surgery
[0028] The patches can also be used to track the fine movement of the arms, wrist joints, and fingers to aid in surgical operations. A surgeon can wear patches instead of wearing gloves, and can then grip medical equipment directly and manipulate objects. The motion of the patches can be recorded and used to manipulate a robotic surgical arm. In cases where internal surgery needs to be performed, a model of the surgical area can be created in virtual reality. As the surgeon manipulates objects and performs surgery in a virtual reality environment, a robotic surgeon can perform surgery on an actual patient by recreating the movements of the surgeon. Additionally, the patches can be used to train a surgeon in a virtual reality environment. The surgeon can practice operating on a virtual reality patient. The system can provide feedback to the surgeon as they perform the surgical procedure. This can provide the surgeon with training and an impartial method for evaluating a surgeon's skills.
Example 6. Sport Training Systems
[0029] The patches can be used with systems for training athletes. For example, a golfer can use a system to improve his or her golfing technique. A patch can be placed at various positions on the golfer. A patch can also be placed on the club that the golfer actually plays with. The golfer can then use the system to evaluate his or her swing in a way that takes into consideration the actual club the golfer plays with. The system can then provide instruction to the user on how to improve his or her swing based on the virtual performance of the golfer as measured by the system.
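As an illustration of how data from a patch on the club might be processed, the sketch below integrates accelerometer samples into a club-head speed. The application does not specify any such computation; this is an assumed example:

```python
from typing import List

def club_head_speed(accel_mps2: List[float], dt: float) -> float:
    """Peak club-head speed estimated by integrating acceleration samples
    from a patch on the club (a simplified, drift-free illustration)."""
    v, peak = 0.0, 0.0
    for a in accel_mps2:
        v += a * dt            # simple forward integration
        peak = max(peak, v)
    return peak

# 100 samples at 1 kHz with acceleration ramping up through the downswing.
samples = [i * 4.0 for i in range(100)]   # 0 .. 396 m/s^2
print(round(club_head_speed(samples, dt=0.001), 2), "m/s")  # 19.8 m/s
```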
Example 7. Video Game Systems with Wearable Patches
[0030] The invention described herein can be used with gaming systems. The patches can be used with games where the user plays the game from a stationary position. The user's arm movements or leg movements need to be tracked in order to provide feedback in games involving fighting, dancing, or playing sports. The patches, worn at unobtrusive positions on the body, can be used to track movement. This is unlike current systems, where motion detectors, such as accelerometers, are incorporated into joysticks or game interface devices. The wearable patches are lightweight, thin, and low-powered. The patches can be worn at multiple locations. The game can detect the number of patches being used. The user will then undergo a calibration sequence with the system in order for the system to learn the location of the patches and the player's mobility. The calibration sequence can consist of a series of motions, such as moving the limbs around, jumping, or performing any other suitable task for calibrating the system.
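The calibration sequence in this example can be sketched as collecting a short recording per patch and deriving a simple mobility profile. The derived fields (rest_height, reach) are assumptions; the application says only that the system learns patch locations and the player's mobility:

```python
from typing import Dict, List, Tuple

Vec3 = Tuple[float, float, float]

def calibrate(recordings: Dict[int, List[Vec3]]) -> Dict[int, Dict[str, float]]:
    """Learn each patch's resting height and reach from a short calibration
    sequence (e.g. moving the limbs around and jumping)."""
    profile = {}
    for patch_id, samples in recordings.items():
        xs, ys, zs = zip(*samples)
        profile[patch_id] = {
            "rest_height": ys[0],                       # first sample: standing still
            "reach": max(max(xs) - min(xs),
                         max(ys) - min(ys),
                         max(zs) - min(zs)),            # largest excursion seen
        }
    return profile

wrist = [(0.0, 1.0, 0.0), (0.5, 1.4, 0.1), (-0.5, 0.8, 0.0)]
print(calibrate({1: wrist}))
# {1: {'rest_height': 1.0, 'reach': 1.0}}
```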
Example 8. User Interface Embodiments
[0031] The patch can function to serve as a user interface between a user and an object. The patch allows the user to manipulate a data processor or control unit so that the control unit can produce the effects of the user's manipulation. Such a user interface system can be used to control an object. For example purposes only, a patch can be placed on a motor-impaired person's head. The patch can comprise at least one EEG sensor. The EEG sensor can detect electrical activity from the user's brain, and this information can be sent to a control unit in communication with a wheelchair. The user can think of directions in which he or she wishes to travel. The patch can pick up these directions, and the wheelchair can then be controlled by the control unit to move in the desired directions. [0032] While preferred embodiments of the present invention have been shown and described herein, it will be obvious to those skilled in the art that such embodiments are provided by way of example only. Numerous variations, changes, and substitutions will now occur to those skilled in the art without departing from the invention. It should be understood that various alternatives to the embodiments of the invention described herein may be employed in practicing the invention. It is intended that the following claims define the scope of the invention and that methods and structures within the scope of these claims and their equivalents be covered thereby.

Claims

WHAT IS CLAIMED IS:
1. An interface for providing information from a user to a control unit comprising at least one wearable patch in communication with the control unit, wherein the patch is adaptable to detect object data and transmit the object data to the control unit.
2. The interface of claim 1 wherein the patch is adaptable to communicate with the control unit through both narrowband and ultrawideband frequencies.
3. The interface of claim 1 wherein the patch comprises at least one sensor.
4. The interface of claim 3 wherein the at least one sensor comprises at least one multipower radio.
5. The interface of claim 1 wherein the object data comprises at least one detectable parameter.
6. The interface of claim 5 wherein the detectable parameter includes one or more of temperature, motion, heart rate, ECG, EEG, blood pressure, and hydration.
7. The interface of claim 1 wherein the patch is further adaptable to provide feedback to the user.
8. The interface of claim 7 wherein the feedback is selected from one or more of on-screen instruction, shock, heat, or vibration.
9. The interface of claim 1 wherein the patch is a disposable patch.
10. The interface of claim 1 wherein the patch is a flexible patch.
11. The interface of claim 1 wherein the patch is further adaptable to be positioned on an inanimate object.
12. The interface of claim 1 wherein the patch is further adaptable to be positioned on an animate object.
13. The interface of claim 1 wherein the object data is motion.
14. A parameter determining patch for detecting object data from an object, the parameter determining patch comprising
(a) at least one wearable data obtaining patch adaptable to obtain data from an object; and
(b) at least one transmitter for transmitting object data.
15. The patch of claim 14 wherein the transmitter is adaptable to transmit data using narrowband and ultrawideband frequencies.
16. The patch of claim 14 wherein the transmitter is at least one multipower radio.
17. The patch of claim 14 wherein the patch is adaptable to be in communication with a control unit.
18. The patch of claim 14 further comprising at least one receiver adaptable to receive data from a control unit.
19. The patch of claim 18 wherein the receiver is adaptable to stimulate the object with a stimulus.
20. The patch of claim 19 wherein the stimulus is selected from at least one of on-screen instruction, shock, heat, or vibration.
21. The patch of claim 14 wherein the object data is selected from at least one of motion, hydration, heart rate, ECG, EEG, blood pressure, and temperature.
22. A system for incorporating information from an object comprising a control unit adaptable to provide an output associated with an object; and at least one wearable sensor in communication with the control unit, the wearable sensor adaptable to be positioned on the object and further adaptable to detect at least one parameter from the object, wherein the control unit is adaptable to adjust the output associated with the object in response to the parameter detected.
23. The system of claim 22 wherein the object is an animate object.
24. The system of claim 22 wherein the object is an inanimate object.
25. The system of claim 22 wherein the parameter is movement.
26. The system of claim 25 wherein the movement comprises at least one of displacement, velocity, or acceleration.
27. The system of claim 22 wherein the parameter is a physiological parameter.
28. The system of claim 27 wherein the physiological parameter is selected from at least one of temperature, hydration, heart rate, ECG, EEG, blood pressure, or any combination thereof.
29. The system of claim 22 wherein the at least one wearable sensor is adaptable to provide feedback to the object.
30. The system of claim 29 wherein the feedback is physical feedback.
31. The system of claim 30 wherein the physical feedback is selected from at least one of vibration, electric shock, or change in temperature.
32. The system of claim 22 wherein the control unit is adaptable to provide feedback in response to the detected parameter.
33. The system of claim 32 wherein the feedback is selected from at least one of audio feedback or visual feedback.
34. The system of claim 22 further comprising at least one currently available data processing interface.
35. The system of claim 22 wherein the sensor comprises at least one multipower radio.
36. A method for interacting with a virtual environment of a control unit comprising
(a) positioning at least one wearable patch on an object from which information is desired, the wearable patch comprising at least one sensor;
(b) acquiring information from the object using the at least one wearable patch;
(c) incorporating the object information acquired from the wearable patch into the virtual environment; and
(d) adjusting the virtual environment in response to the information from the object.
37. The method of claim 36 wherein the sensor comprises at least one multipower radio.
38. The method of claim 36 wherein the patch is disposable.
39. The method of claim 36 wherein the patch is flexible.
40. The method of claim 36 wherein the positioning step comprises positioning more than one patch on the user.
41. The method of claim 36 wherein the object is an animate object.
42. The method of claim 36 wherein the object is an inanimate object.
43. The method of claim 36 wherein the patch is adaptable to acquire physiological data from an animate object.
44. The method of claim 43 wherein the physiological data is selected from at least one of heart rate, hydration, speed, temperature, ECG, EEG, blood pressure, or any combination thereof.
45. The method of claim 36 further comprising the step of providing feedback to the object through the sensor.
46. The method of claim 45 wherein the feedback is a stimulus.
47. The method of claim 36 further comprising the step of providing feedback to the object through the virtual environment.
48. The method of claim 47 wherein the feedback is audio feedback.
49. The method of claim 47 wherein the feedback is visual feedback.
50. The method of claim 36 further comprising the step of recording the object information.
51. The method of claim 50 wherein the recording step further comprises manipulating the object information.
52. The method of claim 36 wherein the system is a gaming system.
53. The method of claim 36 wherein the system is adaptable to be adjusted in real time.
54. The method of claim 36 further comprising the step of communicating the object information incorporated into the virtual environment with a second computer system accessing the same virtual environment.
PCT/US2008/073591 2007-08-20 2008-08-19 Wearable user interface device, system, and method of use WO2009026289A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US95680607P 2007-08-20 2007-08-20
US60/956,806 2007-08-20

Publications (2)

Publication Number Publication Date
WO2009026289A2 true WO2009026289A2 (en) 2009-02-26
WO2009026289A3 WO2009026289A3 (en) 2009-04-16

Family

ID=40378955

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2008/073591 WO2009026289A2 (en) 2007-08-20 2008-08-19 Wearable user interface device, system, and method of use

Country Status (2)

Country Link
US (1) US9046919B2 (en)
WO (1) WO2009026289A2 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8332544B1 (en) 2010-03-17 2012-12-11 Mattel, Inc. Systems, methods, and devices for assisting play
WO2018098927A1 (en) * 2016-12-02 2018-06-07 深圳市前海康启源科技有限公司 Wearable multifunction critical patient monitoring device

Families Citing this family (68)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100201512A1 (en) 2006-01-09 2010-08-12 Harold Dan Stirling Apparatus, systems, and methods for evaluating body movements
WO2007101141A2 (en) * 2006-02-24 2007-09-07 Hmicro, Inc. A medical signal processing system with distributed wireless sensors
US20070279217A1 (en) * 2006-06-01 2007-12-06 H-Micro, Inc. Integrated mobile healthcare system for cardiac care
US8926509B2 (en) * 2007-08-24 2015-01-06 Hmicro, Inc. Wireless physiological sensor patches and systems
WO2009055608A2 (en) * 2007-10-24 2009-04-30 Hmicro, Inc. Method and apparatus to retrofit wired healthcare and fitness systems for wireless operation
US20110019824A1 (en) 2007-10-24 2011-01-27 Hmicro, Inc. Low power radiofrequency (rf) communication systems for secure wireless patch initialization and methods of use
WO2009100401A2 (en) * 2008-02-06 2009-08-13 Hmicro, Inc. Wireless communications systems using multiple radios
US9173582B2 (en) * 2009-04-24 2015-11-03 Advanced Brain Monitoring, Inc. Adaptive performance trainer
US20110045736A1 (en) * 2009-08-20 2011-02-24 Charles Randy Wooten Effect Generating Device in Response to User Actions
US9000914B2 (en) * 2010-03-15 2015-04-07 Welch Allyn, Inc. Personal area network pairing
US8588284B2 (en) 2010-06-01 2013-11-19 Adeptence, Llc Systems and methods for networked wearable medical sensors
US9026074B2 (en) * 2010-06-04 2015-05-05 Qualcomm Incorporated Method and apparatus for wireless distributed computing
US11854427B2 (en) 2010-06-30 2023-12-26 Strategic Operations, Inc. Wearable medical trainer
US11688303B2 (en) 2010-06-30 2023-06-27 Strategic Operations, Inc. Simulated torso for an open surgery simulator
US11495143B2 (en) 2010-06-30 2022-11-08 Strategic Operations, Inc. Emergency casualty care trainer
CA2854001C (en) 2012-05-23 2019-04-16 Microsoft Corporation Dynamic exercise content
US10528135B2 (en) 2013-01-14 2020-01-07 Ctrl-Labs Corporation Wearable muscle interface systems, devices and methods that interact with content displayed on an electronic display
JP2016507851A (en) 2013-02-22 2016-03-10 サルミック ラブス インコーポレイテッド Method and apparatus for combining muscle activity sensor signals and inertial sensor signals for control based on gestures
JP6389831B2 (en) 2013-03-06 2018-09-12 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. System and method for determining vital sign information
US10152082B2 (en) 2013-05-13 2018-12-11 North Inc. Systems, articles and methods for wearable electronic devices that accommodate different user forms
US10042422B2 (en) 2013-11-12 2018-08-07 Thalmic Labs Inc. Systems, articles, and methods for capacitive electromyography sensors
US20150124566A1 (en) 2013-10-04 2015-05-07 Thalmic Labs Inc. Systems, articles and methods for wearable electronic devices employing contact sensors
US11426123B2 (en) 2013-08-16 2022-08-30 Meta Platforms Technologies, Llc Systems, articles and methods for signal routing in wearable electronic devices that detect muscle activity of a user using a set of discrete and separately enclosed pod structures
US11921471B2 (en) 2013-08-16 2024-03-05 Meta Platforms Technologies, Llc Systems, articles, and methods for wearable devices having secondary power sources in links of a band for providing secondary power in addition to a primary power source
US10188309B2 (en) 2013-11-27 2019-01-29 North Inc. Systems, articles, and methods for electromyography sensors
US9788789B2 (en) 2013-08-30 2017-10-17 Thalmic Labs Inc. Systems, articles, and methods for stretchable printed circuit boards
US9372535B2 (en) * 2013-09-06 2016-06-21 Thalmic Labs Inc. Systems, articles, and methods for electromyography-based human-electronics interfaces
US9483123B2 (en) 2013-09-23 2016-11-01 Thalmic Labs Inc. Systems, articles, and methods for gesture identification in wearable electromyography devices
WO2015053473A1 (en) * 2013-10-08 2015-04-16 삼성전자 주식회사 Signature registration method, signature authentication method and device therefor
CN106102504A (en) 2014-02-14 2016-11-09 赛尔米克实验室公司 For the elastic system of power cable, goods and method and the wearable electronic installation using elastic power cable
US10199008B2 (en) 2014-03-27 2019-02-05 North Inc. Systems, devices, and methods for wearable electronic devices as state machines
US9880632B2 (en) 2014-06-19 2018-01-30 Thalmic Labs Inc. Systems, devices, and methods for gesture identification
US9807221B2 (en) 2014-11-28 2017-10-31 Thalmic Labs Inc. Systems, devices, and methods effected in response to establishing and/or terminating a physical communications link
KR102324735B1 (en) * 2015-01-19 2021-11-10 삼성전자주식회사 Wearable devcie for adaptive control based on bio information, system including the same, and method thereof
CN107427230A (en) * 2015-02-05 2017-12-01 Mc10股份有限公司 For the method and system with environmental interaction
US9769564B2 (en) 2015-02-11 2017-09-19 Google Inc. Methods, systems, and media for ambient background noise modification based on mood and/or behavior information
US10284537B2 (en) 2015-02-11 2019-05-07 Google Llc Methods, systems, and media for presenting information related to an event based on metadata
US11392580B2 (en) * 2015-02-11 2022-07-19 Google Llc Methods, systems, and media for recommending computerized services based on an animate object in the user's environment
US11048855B2 (en) 2015-02-11 2021-06-29 Google Llc Methods, systems, and media for modifying the presentation of contextually relevant documents in browser windows of a browsing application
US10223459B2 (en) 2015-02-11 2019-03-05 Google Llc Methods, systems, and media for personalizing computerized services based on mood and/or behavior information from multiple data sources
US10078435B2 (en) 2015-04-24 2018-09-18 Thalmic Labs Inc. Systems, methods, and computer program products for interacting with electronically displayed presentation materials
WO2017095951A1 (en) * 2015-11-30 2017-06-08 Nike Innovate C.V. Apparel with ultrasonic position sensing and haptic feedback for activities
US20170177833A1 (en) * 2015-12-22 2017-06-22 Intel Corporation Smart placement of devices for implicit triggering of feedbacks relating to users' physical activities
US20170224224A1 (en) * 2016-02-04 2017-08-10 Chu-Yih Yu Method of obtaining symmetric temperature change
US10470683B1 (en) * 2016-05-03 2019-11-12 Jeff Thramann Systems and methods to disassociate events and memory induced rewards
US10179598B1 (en) * 2016-07-22 2019-01-15 Harrison J. Goodbinder Mobile cart
US20200073483A1 (en) 2018-08-31 2020-03-05 Ctrl-Labs Corporation Camera-guided interpretation of neuromuscular signals
US11216069B2 (en) 2018-05-08 2022-01-04 Facebook Technologies, Llc Systems and methods for improved speech recognition using neuromuscular information
EP3487395A4 (en) 2016-07-25 2020-03-04 CTRL-Labs Corporation Methods and apparatus for predicting musculo-skeletal position information using wearable autonomous sensors
US10973439B2 (en) 2016-12-29 2021-04-13 BioMech Sensor LLC Systems and methods for real-time data quantification, acquisition, analysis, and feedback
US11318350B2 (en) 2016-12-29 2022-05-03 BioMech Sensor LLC Systems and methods for real-time data quantification, acquisition, analysis, and feedback
US10352962B2 (en) * 2016-12-29 2019-07-16 BioMech Sensor LLC Systems and methods for real-time data quantification, acquisition, analysis and feedback
US9773330B1 (en) * 2016-12-29 2017-09-26 BioMech Sensor LLC Systems and methods for real-time data quantification, acquisition, analysis, and feedback
US10300333B2 (en) * 2017-05-30 2019-05-28 Under Armour, Inc. Techniques for evaluating swing metrics
WO2019079757A1 (en) 2017-10-19 2019-04-25 Ctrl-Labs Corporation Systems and methods for identifying biological structures associated with neuromuscular source signals
WO2019083863A1 (en) 2017-10-23 2019-05-02 Patent Holding Company 001, Llc Communication devices, methods, and systems
CN112040837A (en) * 2017-10-31 2020-12-04 生命信号公司 Customizable patch
US11907423B2 (en) 2019-11-25 2024-02-20 Meta Platforms Technologies, Llc Systems and methods for contextualized interactions with an environment
US11493993B2 (en) 2019-09-04 2022-11-08 Meta Platforms Technologies, Llc Systems, methods, and interfaces for performing inputs based on neuromuscular control
US10937414B2 (en) 2018-05-08 2021-03-02 Facebook Technologies, Llc Systems and methods for text input using neuromuscular information
US11150730B1 (en) 2019-04-30 2021-10-19 Facebook Technologies, Llc Devices, systems, and methods for controlling computing devices via neuromuscular signals of users
US11481030B2 (en) 2019-03-29 2022-10-25 Meta Platforms Technologies, Llc Methods and apparatus for gesture detection and classification
US10592001B2 (en) 2018-05-08 2020-03-17 Facebook Technologies, Llc Systems and methods for improved speech recognition using neuromuscular information
EP3853698A4 (en) 2018-09-20 2021-11-17 Facebook Technologies, LLC Neuromuscular text entry, writing and drawing in augmented reality systems
EP3886693A4 (en) 2018-11-27 2022-06-08 Facebook Technologies, LLC. Methods and apparatus for autocalibration of a wearable electrode sensor system
CA3031479A1 (en) * 2019-01-25 2020-07-25 Jonathan Gagne Computer animation methods and systems
US11914762B2 (en) * 2020-12-28 2024-02-27 Meta Platforms Technologies, Llc Controller position tracking using inertial measurement units and machine learning
US11868531B1 (en) 2021-04-08 2024-01-09 Meta Platforms Technologies, Llc Wearable device providing for thumb-to-finger-based input gestures detected based on neuromuscular signals, and systems and methods of use thereof

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040077975A1 (en) * 2002-10-22 2004-04-22 Zimmerman Jeffrey C. Systems and methods for motion analysis and feedback
US6909420B1 (en) * 1998-12-03 2005-06-21 Nicolas Frederic Device indicating movements for software
KR20050072558A (en) * 2004-01-07 2005-07-12 엘지전자 주식회사 Wearable computer system
US20050282633A1 (en) * 2001-11-13 2005-12-22 Frederic Nicolas Movement-sensing apparatus for software

Family Cites Families (111)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4313443A (en) 1980-09-11 1982-02-02 Nasa Pocket ECG electrode
US4965825A (en) 1981-11-03 1990-10-23 The Personalized Mass Media Corporation Signal processing apparatus and methods
US4784162A (en) 1986-09-23 1988-11-15 Advanced Medical Technologies Portable, multi-channel, physiological data monitoring system
HU212136B (en) 1987-10-27 1996-03-28 Cedcom Network Systems Pty Ltd Communication system
US4918690A (en) 1987-11-10 1990-04-17 Echelon Systems Corp. Network and intelligent cell for providing sensing, bidirectional communications and control
DE3809523A1 (en) 1988-03-22 1989-10-12 Miles Inc METHOD FOR THE PRODUCTION OF POROESEN MEMBRANES, THE MEMBRANES THEREOF PRODUCED AND THEIR USE AS TRAEGERMATRICES IN TEST STRIPS
US5511553A (en) 1989-02-15 1996-04-30 Segalowitz; Jacob Device-system and method for monitoring multiple physiological parameters (MMPP) continuously and simultaneously
DE69130549T2 (en) * 1990-06-11 1999-05-27 Hitachi Ltd Device for generating an object movement path
US5231990A (en) 1992-07-09 1993-08-03 Spacelabs, Medical, Inc. Application specific integrated circuit for physiological monitoring
US6771617B1 (en) 1993-06-17 2004-08-03 Gilat Satellite Networks, Ltd. Frame relay protocol-based multiplex switching scheme for satellite mesh network
DE4329898A1 (en) 1993-09-04 1995-04-06 Marcus Dr Besson Wireless medical diagnostic and monitoring device
US6230970B1 (en) 1995-06-07 2001-05-15 E-Comm, Incorporated Low-power hand-held transaction device
US5913727A (en) * 1995-06-02 1999-06-22 Ahdoot; Ned Interactive movement and contact simulation game
US6238338B1 (en) 1999-07-19 2001-05-29 Altec, Inc. Biosignal monitoring system and method
US5720770A (en) 1995-10-06 1998-02-24 Pacesetter, Inc. Cardiac stimulation system with enhanced communication and control capability
US6119003A (en) 1996-09-09 2000-09-12 Nokia Mobile Phones Limited Methods and apparatus for performing automatic mode selection in a multimode mobile terminal
CN1118746C (en) 1997-03-24 2003-08-20 发展产品有限公司 Two-way remote control with advertising display
US6275143B1 (en) 1997-05-09 2001-08-14 Anatoli Stobbe Security device having wireless energy transmission
US6295461B1 (en) 1997-11-03 2001-09-25 Intermec Ip Corp. Multi-mode radio frequency network system
US7542878B2 (en) 1998-03-03 2009-06-02 Card Guard Scientific Survival Ltd. Personal health monitor and a method for health monitoring
US6463039B1 (en) 1998-04-24 2002-10-08 Intelligent Ideation, Inc. Method and apparatus for full duplex sideband communication
DE19826513A1 (en) * 1998-06-15 1999-12-23 Siemens Ag Automation system with radio sensor
US6336900B1 (en) 1999-04-12 2002-01-08 Agilent Technologies, Inc. Home hub for reporting patient health parameters
US6494829B1 (en) 1999-04-15 2002-12-17 Nexan Limited Physiological sensor array
US6454708B1 (en) 1999-04-15 2002-09-24 Nexan Limited Portable remote patient telemonitoring system using a memory card or smart card
US7256708B2 (en) 1999-06-23 2007-08-14 Visicu, Inc. Telecommunications network for remote patient monitoring
US6677852B1 (en) 1999-09-22 2004-01-13 Intermec Ip Corp. System and method for automatically controlling or configuring a device, such as an RFID reader
US6694180B1 (en) 1999-10-11 2004-02-17 Peter V. Boesen Wireless biopotential sensing device and method with capability of short-range radio frequency transmission and reception
US6527711B1 (en) 1999-10-18 2003-03-04 Bodymedia, Inc. Wearable human physiological data sensors and reporting system therefor
US20050090718A1 (en) 1999-11-02 2005-04-28 Dodds W J. Animal healthcare well-being and nutrition
US6629057B2 (en) 1999-11-05 2003-09-30 Beckman Coulter, Inc. Comprehensive verification systems and methods for analyzer-read clinical assays
US6893396B2 (en) 2000-03-01 2005-05-17 I-Medik, Inc. Wireless internet bio-telemetry monitoring system and interface
US6436058B1 (en) * 2000-06-15 2002-08-20 Dj Orthopedics, Llc System and method for implementing rehabilitation protocols for an orthopedic restraining device
US7689437B1 (en) 2000-06-16 2010-03-30 Bodymedia, Inc. System for monitoring health, wellness and fitness
US6605038B1 (en) 2000-06-16 2003-08-12 Bodymedia, Inc. System for monitoring health, wellness and fitness
US7261690B2 (en) 2000-06-16 2007-08-28 Bodymedia, Inc. Apparatus for monitoring health, wellness and fitness
BRPI0414359A (en) 2000-06-16 2006-11-14 Bodymedia Inc body weight monitoring and management system and other psychological conditions that include interactive and personalized planning, intervention and reporting
US20060122474A1 (en) 2000-06-16 2006-06-08 Bodymedia, Inc. Apparatus for monitoring health, wellness and fitness
ES2260245T3 (en) 2000-06-23 2006-11-01 Bodymedia, Inc. SYSTEM TO CONTROL HEALTH, WELFARE AND EXERCISE.
US20020065828A1 (en) 2000-07-14 2002-05-30 Goodspeed John D. Network communication using telephone number URI/URL identification handle
USD439981S1 (en) 2000-08-09 2001-04-03 Bodymedia, Inc. Armband with physiological monitoring system
AUPQ970300A0 (en) 2000-08-29 2000-09-21 Massa Nominees Pty Ltd Advanced wireless network
SE0003333D0 (en) 2000-09-19 2000-09-19 Medipeda Ab Medical System
USD451604S1 (en) 2000-09-25 2001-12-04 Bodymedia, Inc. Vest having physiological monitoring system
US6885191B1 (en) 2001-02-13 2005-04-26 Stuart M. Gleman Radio-frequency imaging system for medical and other applications
JP3967680B2 (en) 2001-02-14 2007-08-29 ドレーガー メディカル システムズ インコーポレイテッド Patient monitoring area network
US6595929B2 (en) 2001-03-30 2003-07-22 Bodymedia, Inc. System for monitoring health, wellness and fitness having a method and apparatus for improved measurement of heat flow
US7376234B1 (en) 2001-05-14 2008-05-20 Hand Held Products, Inc. Portable keying device and method
US7103578B2 (en) 2001-05-25 2006-09-05 Roche Diagnostics Operations, Inc. Remote medical device access
USD460971S1 (en) 2001-06-21 2002-07-30 Bodymedia, Inc. Docking cradle for an electronic device
US7044911B2 (en) 2001-06-29 2006-05-16 Philometron, Inc. Gateway platform for biological monitoring and delivery of therapeutic compounds
WO2004002301A2 (en) 2001-07-17 2004-01-08 Gmp Wireless Medicine, Inc. Wireless ecg system
EP1423046B2 (en) 2001-08-13 2015-03-25 Novo Nordisk A/S Portable device of communicating medical data information
EP1442439A4 (en) 2001-11-08 2006-04-19 Behavioral Informatics Inc Monitoring a daily living activity and analyzing data related thereto
US20050101841A9 (en) 2001-12-04 2005-05-12 Kimberly-Clark Worldwide, Inc. Healthcare networks with biosensors
US7260424B2 (en) * 2002-05-24 2007-08-21 Schmidt Dominik J Dynamically configured antenna for multiple frequencies and bandwidths
JP4013665B2 (en) 2002-06-21 2007-11-28 株式会社日立製作所 Wireless communication system and wireless device
US7020508B2 (en) 2002-08-22 2006-03-28 Bodymedia, Inc. Apparatus for detecting human physiological and contextual information
US7294105B1 (en) 2002-09-03 2007-11-13 Cheetah Omni, Llc System and method for a wireless medical communication system
KR20040032451A (en) 2002-10-09 2004-04-17 삼성전자주식회사 Mobile device having health care function and method of health care using the same
US6731962B1 (en) 2002-10-31 2004-05-04 Smiths Medical Pm Inc Finger oximeter with remote telecommunications capabilities and system therefor
WO2004075782A2 (en) 2003-02-26 2004-09-10 Alfred, E. Mann Institute For Biomedical Engineering At The University Of Southern California An implantable device with sensors for differential monitoring of internal condition
JP2006520657A (en) 2003-03-21 2006-09-14 ウェルチ・アリン・インコーポレーテッド Personal condition physiological monitoring system and structure, and monitoring method
US20040199056A1 (en) 2003-04-03 2004-10-07 International Business Machines Corporation Body monitoring using local area wireless interfaces
US7042346B2 (en) 2003-08-12 2006-05-09 Gaige Bradley Paulsen Radio frequency identification parts verification system and method for using same
EP1533678A1 (en) * 2003-11-24 2005-05-25 Sony International (Europe) GmbH Physical feedback channel for entertaining or gaming environments
NZ529871A (en) 2003-11-28 2004-09-24 Senscio Ltd Radiofrequency adapter for medical monitoring equipment
HUE037253T2 (en) 2004-01-27 2018-08-28 Altivera L L C Diagnostic radio frequency identification sensors and applications thereof
JPWO2005073737A1 (en) 2004-01-29 2007-09-13 株式会社アドバンテスト Measuring apparatus, method, program, and recording medium
US7406105B2 (en) 2004-03-03 2008-07-29 Alfred E. Mann Foundation For Scientific Research System and method for sharing a common communication channel between multiple systems of implantable medical devices
US7125382B2 (en) 2004-05-20 2006-10-24 Digital Angel Corporation Embedded bio-sensor system
KR100587715B1 (en) * 2004-06-07 2006-06-09 주식회사 에스앤에스텍 Method for Resist Coating of Blank Mask
US7206630B1 (en) 2004-06-29 2007-04-17 Cleveland Medical Devices, Inc Electrode patch and wireless physiological measurement system and method
US8343074B2 (en) 2004-06-30 2013-01-01 Lifescan Scotland Limited Fluid handling devices
US7344500B2 (en) 2004-07-27 2008-03-18 Medtronic Minimed, Inc. Sensing system with auxiliary display
JP2006055530A (en) 2004-08-23 2006-03-02 Toshiba Corp Medical support system, communication adapter, and biometer
US8560041B2 (en) 2004-10-04 2013-10-15 Braingate Co., Llc Biological interface system
WO2006044700A2 (en) 2004-10-13 2006-04-27 Ysi Incorporated Wireless patch temperature sensor system
US7639135B2 (en) 2004-10-28 2009-12-29 Microstrain, Inc. Identifying substantially related objects in a wireless sensor network
US7254516B2 (en) * 2004-12-17 2007-08-07 Nike, Inc. Multi-sensor monitoring of athletic performance
US9259175B2 (en) * 2006-10-23 2016-02-16 Abbott Diabetes Care, Inc. Flexible patch for fluid delivery and monitoring body analytes
US7571369B2 (en) 2005-02-17 2009-08-04 Samsung Electronics Co., Ltd. Turbo decoder architecture for use in software-defined radio systems
EP1871218B1 (en) 2005-03-09 2012-05-16 Coloplast A/S A three-dimensional adhesive device having a microelectronic system embedded therein
US7270633B1 (en) 2005-04-22 2007-09-18 Cardiac Pacemakers, Inc. Ambulatory repeater for use in automated patient care and method thereof
US8688189B2 (en) * 2005-05-17 2014-04-01 Adnan Shennib Programmable ECG sensor patch
CN100471445C (en) 2005-08-01 2009-03-25 周常安 Paster style physiological monitoring device, system and network
GB2420628B (en) 2005-09-27 2006-11-01 Toumaz Technology Ltd Monitoring method and apparatus
US20070081505A1 (en) 2005-10-12 2007-04-12 Harris Corporation Hybrid RF network with high precision ranging
US20070087780A1 (en) 2005-10-14 2007-04-19 Shary Nassimi An Adaptive Wireless Headset System
US7733224B2 (en) * 2006-06-30 2010-06-08 Bao Tran Mesh network personal emergency response appliance
US7499739B2 (en) 2005-10-27 2009-03-03 Smiths Medical Pm, Inc. Single use pulse oximeter
EP1968691A4 (en) 2005-12-14 2012-01-25 Welch Allyn Inc Medical device wireless adapter
US20100201512A1 (en) * 2006-01-09 2010-08-12 Harold Dan Stirling Apparatus, systems, and methods for evaluating body movements
WO2007101141A2 (en) 2006-02-24 2007-09-07 Hmicro, Inc. A medical signal processing system with distributed wireless sensors
US8200320B2 (en) 2006-03-03 2012-06-12 PhysioWave, Inc. Integrated physiologic monitoring systems and methods
US7668588B2 (en) 2006-03-03 2010-02-23 PhysioWave, Inc. Dual-mode physiologic monitoring systems and methods
US8552597B2 (en) 2006-03-31 2013-10-08 Siemens Corporation Passive RF energy harvesting scheme for wireless sensor
US20070232234A1 (en) 2006-03-31 2007-10-04 Frank Joseph Inzerillo Method of wireless conversion by emulation of a non-wireless device
GB0608829D0 (en) 2006-05-04 2006-06-14 Husheer Shamus L G In-situ measurement of physical parameters
US7597668B2 (en) * 2006-05-31 2009-10-06 Medisim Ltd. Non-invasive temperature measurement
US20070279217A1 (en) 2006-06-01 2007-12-06 H-Micro, Inc. Integrated mobile healthcare system for cardiac care
KR100770914B1 (en) 2006-09-11 2007-10-26 삼성전자주식회사 Method for peer to peer communication in near field communication
US8979755B2 (en) 2006-12-08 2015-03-17 The Boeing Company Devices and systems for remote physiological monitoring
JP2010524050A (en) 2007-02-05 2010-07-15 メドトロニック ミニメド インコーポレイテッド Wireless data communication protocol and wireless data communication technology for wireless medical device network
JP5241254B2 (en) 2007-02-06 2013-07-17 パナソニック株式会社 Wireless communication method and wireless communication apparatus
WO2008105795A1 (en) 2007-02-26 2008-09-04 Thomson Licensing Method and apparatus for providing a communication link
US20080252596A1 (en) * 2007-04-10 2008-10-16 Matthew Bell Display Using a Three-Dimensional vision System
US20090037670A1 (en) 2007-07-30 2009-02-05 Broadcom Corporation Disk controller with millimeter wave host interface and method for use therewith
US8926509B2 (en) 2007-08-24 2015-01-06 Hmicro, Inc. Wireless physiological sensor patches and systems
WO2009055608A2 (en) 2007-10-24 2009-04-30 Hmicro, Inc. Method and apparatus to retrofit wired healthcare and fitness systems for wireless operation
US20110019824A1 (en) 2007-10-24 2011-01-27 Hmicro, Inc. Low power radiofrequency (rf) communication systems for secure wireless patch initialization and methods of use


Also Published As

Publication number Publication date
US9046919B2 (en) 2015-06-02
US20090051544A1 (en) 2009-02-26
WO2009026289A3 (en) 2009-04-16

Similar Documents

Publication Publication Date Title
US9046919B2 (en) Wearable user interface device, system, and method of use
US20210316200A1 (en) Generating an animation depicting a user using motion and physiological data captured using sensors
US11037369B2 (en) Virtual or augmented reality rehabilitation
AU2017386412B2 (en) Systems and methods for real-time data quantification, acquisition, analysis, and feedback
JP6938542B2 (en) Methods and program products for articulated tracking that combine embedded and external sensors
US10446051B2 (en) Interactive cognitive-multisensory interface apparatus and methods for assessing, profiling, training, and improving performance of athletes and other populations
US9868012B2 (en) Rehabilitation systems and methods
US20090098519A1 (en) Device and method for employment of video games to provide physical and occupational therapy and measuring and monitoring motor movements and cognitive stimulation and rehabilitation
US20110270135A1 (en) Augmented reality for testing and training of human performance
US10286254B2 (en) Assessment and enhancement of reaction based joint stabilization capabilities
US20100280418A1 (en) Method and system for evaluating a movement of a patient
US20140243710A1 (en) Posture training system and control method thereof
JP2014527874A (en) System and method for supporting exercise practice
CN206497423U (en) A kind of virtual reality integrated system with inertia action trap setting
RU2107328C1 (en) Method for tracing and displaying of position and orientation of user in three-dimensional space and device which implements said method
US20160098934A1 (en) Concussion rehabilitation device and method
CN108140421A (en) Training
WO2020259858A1 (en) Framework for recording and analysis of movement skills
CN115738188A (en) Balance function training device and method based on virtual reality technology
Alahakone et al. A real-time interactive biofeedback system for sports training and rehabilitation
JP2008048972A (en) Competition ability improvement support method and the support tool
Batista et al. Surface electromyography for game-based hand motor rehabilitation
US11130063B2 (en) Gaming system for sports-based biomechanical feedback
US20220160299A1 (en) Motion capture system
CN117122871A (en) Limb exercise monitoring system based on virtual reality technology

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 08798184

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 08798184

Country of ref document: EP

Kind code of ref document: A2