US20080221487A1 - Method for real time interactive visualization of muscle forces and joint torques in the human body - Google Patents

Method for real time interactive visualization of muscle forces and joint torques in the human body

Info

Publication number
US20080221487A1
US20080221487A1 (application US11/832,726)
Authority
US
United States
Prior art keywords
forces
muscle
real time
joint
motion capture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/832,726
Inventor
Oshri Even Zohar
Antonie J. van den Bogert
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Motek BV
Original Assignee
Motek BV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Motek BV filed Critical Motek BV
Priority to US11/832,726
Assigned to MOTEK BV. Assignors: VAN DEN BOGERT, ANTONIE J.; ZOHAR, OSHRI EVEN
Priority to ES08730108.1T (ES2679125T3)
Priority to EP08730108.1A (EP2120710B1)
Priority to JP2009552792A (JP5016687B2)
Priority to PCT/US2008/054239 (WO2008109248A2)
Priority to CA2680462A (CA2680462C)
Publication of US20080221487A1
Priority to US12/251,688 (US7931604B2)

Classifications

    • A61B 5/224: Measuring muscular strength
    • A61B 5/103: Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/1114: Tracking parts of the body
    • A61B 5/1116: Determining posture transitions
    • A61B 5/1127: Measuring movement of the entire body or parts thereof using markers
    • A61B 5/4519: Evaluating or diagnosing the musculoskeletal system: muscles
    • A61B 5/4528: Evaluating or diagnosing the musculoskeletal system: joints
    • A61B 5/6804: Sensors attached to or worn on the body surface: garments; clothes
    • G09B 23/28: Models for scientific, medical, or mathematical purposes, for medicine

Definitions

  • This invention most generally relates to a system that combines motion capture technology and a 3D computational musculoskeletal model to create a real time display environment where muscle forces and joint torques are illustrated. More specifically, various embodiments of the present invention create real time visualizations of the physical muscle forces and joint torques in the body during movement.
  • Motion Capture is a term for a variety of techniques, and the technology has existed for many years in a variety of applications.
  • the aim of motion capture is to create three-dimensional (3D) animation and natural simulations in a performance oriented manner.
  • Motion capture allows an operator to use computer-generated characters.
  • Motion capture can be used to create complex motion, using the full range of human movements, and also allows inanimate objects to move realistically.
  • Some motion capture systems provide real-time feedback of the data and allow the operator to immediately determine whether the motion works sufficiently.
  • Motion capture can be applied to full body motion as well as to hand animation, facial animation and real time lip sync.
  • Motion capture is also used in medical, simulation, engineering and ergonomic applications, and in feature films, advertising, TV and 3D computer games.
  • Kinematics is the process of calculating the position in space of the end of a linked structure, given the angles of all the joints. Inverse kinematics does the reverse: given the end point of the structure, it calculates the angles the joints need to be at to achieve that end point. This process is used in robotics, 3D computer animation and some engineering applications.
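  • As a minimal, self-contained illustration of the inverse kinematics concept described above (not the patent's full-body solver), the following sketch computes the two joint angles of a planar two-link limb from a desired end point using the law of cosines. The link lengths, function name and target point are illustrative assumptions.

```python
import math

def two_link_ik(x, y, l1=0.4, l2=0.4):
    """Planar two-link inverse kinematics: given the end point (x, y),
    return the shoulder and elbow angles (radians) that reach it.
    Illustrative only; a full-body solver handles many more degrees of freedom."""
    d2 = x * x + y * y
    # Law of cosines gives the elbow angle.
    cos_elbow = (d2 - l1 ** 2 - l2 ** 2) / (2 * l1 * l2)
    if abs(cos_elbow) > 1.0:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow)                      # elbow-down solution
    # Shoulder angle: direction to the target minus the offset caused by the bent elbow.
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow

# Forward check: the recovered angles should reproduce the target point.
s, e = two_link_ik(0.5, 0.3)
px = 0.4 * math.cos(s) + 0.4 * math.cos(s + e)
py = 0.4 * math.sin(s) + 0.4 * math.sin(s + e)
```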
  • One embodiment of the present invention provides a method for real time display of the array of muscle forces and joint torques in a human body using color space animation of a 3D human body muscle model.
  • A data stream coming from a motion capture system is parsed through a pipeline of specially written algorithms that derive joint orientations, accelerations and velocities, and forward and inverse dynamics, resulting in real time measurements of muscle forces and joint torques. Those are passed in real time to a 3D human muscle model, making the forces and torques visible to the user as they happen.
  • Another embodiment of the present invention provides runtime interaction by a user or operator.
  • a further embodiment of the present invention provides a combination of motion capture technologies, simulation technology and custom real time data processing algorithms, using a combination of hardware and software elements combined with the authoring and control software to customize the visualization in real time of forces and torques exerted by the human body.
  • Still another embodiment of the invention creates a new measurement and visualization tool, bearing applications in various industries.
  • the invention creates the possibility of looking at muscle force transference in the body for determining, registering and evaluating human functional performance to a range of given situations.
  • Yet another embodiment of the present invention provides a new measurement and visualization tool, bearing applications in various industries.
  • the invention creates the possibility of looking at muscle forces and joint forces transference in the body for determining, registering and evaluating human functional performance to a range of given situations.
  • Other applications include orthopedic and ergonomic studies and designs.
  • a yet further embodiment of the present invention provides a process that incorporates real time 3D marker data streams coming from a motion capture system through real-time sets of algorithms that derive from the 3D markers cloud the joints centers of rotation, positions and orientations, then derives accelerations and velocities and converts those into an array of muscle forces that are passed to the 3D human body muscle model as a data stream used in the 3D color space visualization of the muscle forces and joint torques.
  • FIG. 1A is a computer generated image illustrating motion capture points disposed on a user (not shown) configured in accordance with, but not limited to, one embodiment of the present invention.
  • FIG. 1B is a computer generated image illustrating a kinematics skeleton configured in accordance with one embodiment of the present invention.
  • FIG. 1C is a computer generated image illustrating an anatomically correct skeleton configured in accordance with one embodiment of the present invention.
  • FIG. 1D is a computer generated image illustrating a three dimensional anatomically correct muscle layer configured in accordance with one embodiment of the present invention.
  • FIG. 1E is a computer generated image illustrating a three dimensional anatomically correct muscle layer disposed on an anatomically correct three dimensional skeleton configured in accordance with one embodiment of the present invention.
  • FIG. 2 is a computer generated image illustrating pipeline layer connections configured in accordance with one embodiment of the present invention.
  • FIG. 3 is a computer generated image illustrating a V-Gait configured in accordance with one embodiment of the present invention.
  • FIG. 4 is a block diagram illustrating a motion capture system configured in accordance with one embodiment of the present invention.
  • FIG. 5 is a flow chart illustrating a method of motion capture configured in accordance with one embodiment of the present invention.
  • Muscle forces are typically invisible by nature and one can normally only see the results of applied muscle forces on the individual's surroundings.
  • One embodiment of the present invention makes it possible to view simulated muscle forces in the human body in real-time, in a way that makes clear the force transference in the human musculoskeletal system.
  • the process of achieving this functionality relies on fast and accurate real time motion capture data processing into an IK (inverse kinematics) skeletal layer containing joint positions and orientations, a further process deriving accelerations and velocities, a further process deriving inverse dynamics in real time, a further process deriving muscle forces from joint torques, and a final process converting the result streams into 3D visualizations of color and form changes in a 3D accurate human body muscle model.
  • One embodiment of the invention is a method for real time display of the array of muscle forces and joint torques in a human body using color space animation of a 3D human body muscle model.
  • Data streams coming from a motion capture system are parsed through a pipeline of specially written algorithms that derive joint orientations, accelerations and velocities, and forward and inverse dynamics, resulting in real time measurements of muscle forces and joint torques. Those are passed in real time to a 3D human muscle model, making the forces and torques visible to the eye as they happen.
  • One embodiment of the present invention allows runtime interaction by a user or operator.
  • Such an embodiment of the invention can be seen as a combination of motion capture technologies, simulation technology and custom real time data processing algorithms, using a combination of hardware and software elements combined with the authoring and control software to visualize in real time the forces and torques exerted by the human body.
  • One embodiment of the invention provides a new measurement and visualization tool, bearing applications in various industries.
  • One embodiment of the invention creates the possibility of looking at muscle force transference in the body for determining, registering and evaluating human functional performance to a range of given situations.
  • Although at least one embodiment of the present invention is intended for medical applications, embodiments of the present invention are adaptable for other market segments including ergonomics and sports.
  • This system allows the visualizations of muscle forces for any given exercise in real-time.
  • Such a system, illustrated in FIG. 3, can be used to enhance, optimize and improve muscle forces, by providing a realistic real time visualization of the given forces and torques.
  • the system allows the user 30 to see the force transference to various muscles in the body and achieve the desired effect.
  • a motion capture system 32 instantly records the user's motion and provides immediate muscle force visualizations 34 .
  • One embodiment of the present invention may be utilized by the medical community by making it possible to view muscle forces and torques in real-time. It can assist and improve the quality of life of many patients and allow the perception of physical movement and muscle behaviors for those not otherwise capable of such motion.
  • the system may be useful for victims of traumatic brain injury, cerebral damage, and spinal damage.
  • The study of motion recognition supports the notion that the body remembers certain movements and can even regenerate synaptic paths. By visualizing the desired muscle force, the body can be re-trained to make that movement.
  • embodiments of the present invention can assist patients in understanding their present situation, where they lack muscle force and where they are exerting too much force for compensation reasons. With orthopedics, prosthetics, and amputees, the system can visualize and track muscle deficiencies while training and improving movements.
  • Yet another embodiment of the present invention combines muscle forces and resultant joint force into a calculation and visualization of the forces acting within joints. This is useful as a training tool to prevent and treat overuse injuries in the workplace, in ergonomics and in sports.
  • Each joint calculated is drawn as a sphere in the drawing of FIG. 2.
  • Inverse kinematics is used to calculate the joint orientation from the motion capture data before deriving the accelerations and velocities of every body part.
  • The next step in the pipeline is to take the calculated joint angles and derive values of accelerations and velocities for every joint (representing every body part). The acceleration and velocity values are the basis for calculating, through the use of inverse dynamics, the muscle forces and joint torques, which are then passed to the 3D muscle model display as color information.
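  • One straightforward way to realize this step is to low-pass filter the joint-angle stream and differentiate it numerically; the sketch below (a hedged example, not the patent's algorithm) uses a second-order Butterworth filter and central differences, with an assumed 100 Hz sample rate and 6 Hz cutoff. A true real-time system would use a causal filter rather than the offline zero-phase filter shown.

```python
import numpy as np
from scipy.signal import butter, filtfilt  # filtfilt is offline; real time would need a causal filter

def derive_vel_acc(joint_angles, fs=100.0, cutoff=6.0):
    """joint_angles: (n_frames, n_dof) array of joint angles in radians sampled at fs Hz.
    Returns low-pass filtered angles, velocities and accelerations."""
    b, a = butter(2, cutoff / (fs / 2.0))          # 2nd-order Butterworth low-pass
    q = filtfilt(b, a, joint_angles, axis=0)       # zero-phase filtering (offline illustration)
    qdot = np.gradient(q, 1.0 / fs, axis=0)        # central-difference velocity
    qddot = np.gradient(qdot, 1.0 / fs, axis=0)    # central-difference acceleration
    return q, qdot, qddot
```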
  • a development project called “Virtual Gait Lab” is one embodiment of the system operating in the real-time domain. Such an embodiment pertains to the development of a virtual reality system in which the muscle forces and joint torques of the human body can be seen and evaluated in real time in a variety of reproducible conditions.
  • the enhancements are defined by allowing a medical expert team for the first time the opportunity to view and analyze muscle forces and joint torques patterns as they happen in a controlled real-time environment.
  • Such a system consists of a combination of an instrumented treadmill that is capable of measuring ground reaction forces, a large screen or projection system for the display of the forces, real time motion capture system and the custom computational pipeline translating the capture data to muscle forces and joint torques display.
  • An embodiment of the present invention seeks to develop an interactive virtual real-time muscle model, which can provide patients with means of almost unlimited exploratory behaviors and at the same time provide medical experts accurate measurement tools for monitoring the complex array of forces present in the human body.
  • the patterns of muscle activation determine whether a subject falls or not. These simulations are aimed at an understanding of normal or pathological response patterns in certain balance tasks.
  • Such an embodiment offers not only a test and learning environment for patients and doctors, but is also a valuable research environment for motor control. Such an embodiment opens the door to a new type of experiments in which real time muscle force visualization can be offered.
  • The muscle force tremors observed in Parkinson patients are considered an enigma by many clinicians and human movement scientists. In these patients some visual cues are sufficient to trigger rather normal looking muscle force patterns (for instance used in walking), while in the absence of such stimuli a pattern cannot even be started.
  • The continuous control of muscle force transference during walking is possible by having multi-channel sensory input onto a vast library of learned motor patterns. Once the possibility exists to view in real time the emergence of muscle force patterns, it will lead to fundamental improvement in the understanding and possible treatment of the disease. Such an embodiment will allow a new glimpse into the complexity of the natural processes associated with human motion.
  • Another example of an application for one embodiment of the present invention is the prevention and treatment of low back pain through teaching of proper lifting techniques. Real-time calculation and visualization of the forces acting on the intervertebral discs will provide immediate feedback to the patient concerning the quality of their movement.
  • muscle forces will be visualized, but certain training applications may provide audio signals driven by muscle force values from the computational pipeline. Other training applications may use muscle force values as input for a virtual environment, which causes changes in position of virtual objects, or changes in position of the motion platform on which the subject is standing.
  • the computational pipeline that results in real time muscle force display is flexible and allows forward dynamics simulations to be run at any time during runtime of the system.
  • The flow of measured movements as an input to the inverse dynamics simulation is stopped during such a sequence; the calculated joint moments are now used as input, while the movements become the output.
  • forward simulations calculate movements and reaction forces from moments of force produced around the joints of the subjects. These forward simulations can be visualized as part of the virtual environment, and will show what might happen to the patient in hypothetical situations.
  • the forward and inverse dynamic calculations typically consist of a large set of equations. Depending on the methods used to form these equations, they are expressed in various ways, such as Newton-Euler equations, Lagrange's equations, or Kane's equations. These are called the equations of motion, which contain the relation between the generalized forces applied at the body and the generalized movements. “Generalized” in this respect means that they are formulated along the movement possibilities (or degrees of freedom) of the human body, rather than in terms of three dimensional coordinates in the external world. This implies that most of the generalized forces are actually moments of force (or torque). Equations can be added describing the kind of interaction with the environment, such as contacts with the floor. The equations can be solved simultaneously in a forward simulation, solved algebraically in an inverse simulation or rearranged and solved to do a mixed inverse and forward simulation. In one embodiment of the present invention these computations are all happening in real time.
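  • To make the forward/inverse distinction concrete, the sketch below writes the equation of motion of a single rigid segment rotating about a fixed joint, a deliberately tiny stand-in for one degree of freedom of the full body model, and evaluates it in both directions. The inertia, mass and geometry values are illustrative assumptions.

```python
import numpy as np

# Single-segment model: I * qddot + m * g * r * sin(q) = tau   (assumed parameter values)
I, m, r, g = 0.25, 5.0, 0.2, 9.81   # inertia about the joint, mass, joint-to-COM distance, gravity

def inverse_dynamics(q, qddot):
    """Given the measured motion (angle and angular acceleration),
    return the generalized force (joint moment) that must have produced it."""
    return I * qddot + m * g * r * np.sin(q)

def forward_dynamics(q, tau):
    """Given an applied joint moment, return the angular acceleration it produces."""
    return (tau - m * g * r * np.sin(q)) / I

# Consistency check: running the two directions back to back recovers the motion.
q, qddot = 0.3, 2.0
tau = inverse_dynamics(q, qddot)
assert abs(forward_dynamics(q, tau) - qddot) < 1e-12
```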
  • the main tasks of the real time computational pipeline are processing the input data coming from the motion capture sensors, mapping the collected data into the above mentioned human body model, processing the various input and/or computed data depending on different cases.
  • Other tasks concern the display of real-time 3D graphic representations of muscle forces and joint torques 28 , as well as driving the output devices such as a treadmill 38 and a display system 34 as illustrated in FIG. 3 .
  • the user interface for the operator is implemented as the means to communicate with the real time 3D muscle model 26 of FIG. 1D through a custom written software program.
  • The real time 3D muscle model is projected on the screen in front of the subject.
  • the user stands on a platform or treadmill, which can be controlled as part of the system or as a reaction to movements of the subject.
  • the user wears motion capture markers 20 , as illustrated in FIGS. 1A and 2 of which the positions are recorded. These are fed into an algorithm that turns them into the degrees of freedom of the human body model, which is filled with the segment masses 22 and inertia of the subject and displayed as color space real time animations of the 3D muscle model of FIG. 1E .
  • the human body model 26 produces the joint moments of force of the subject, if necessary; this information can be offered in the projected image to be used by the subject. Forward dynamics simulation can also be computed to indicate where weak parts in the motor pattern are located.
  • FIGS. 1A-1E illustrate an overview of one embodiment of the present invention's computational real time pipeline wherein as illustrated in FIG. 1A a user is equipped with a number of motion capture sensors or markers 20 attached at various strategic locations of the body. The data from the sensors is received by a motion capture system 32 .
  • the motion capture data set contains the X axis, Y axis, and Z axis positions of the user for the full body, and is transmitted at >100 FPS (Frames per second) to the computer 36 .
  • the computer 36 interactively operates with operator's interface 34 and executes the first step in the computational pipeline converting the positional data in real time to an inverse kinematics skeleton 22 illustrated in FIG. 1B .
  • This data is typically applied to the inverse kinematics skeleton 22 to drive a 3D anatomically correct skeleton 24 in approximately real time (FIG. 1C). Then a 3D anatomically correct muscle layer 26 of FIG. 1D is connected to the human skeleton 24, and the muscle forces and joint torques resulting from the real time computational pipeline are applied to real time animations of colors 28 of the respective muscles in the 3D muscle model of FIG. 1E.
  • a person is outfitted with markers 20 and a template 22 is processed for an initial or balance position.
  • the markers 20 are typically used to record the motion. They are substantially instantaneously captured, and used to process a complete template.
  • the template 22 utilizes a template matching algorithm to interpolate for missing or bad marker data.
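  • One common way such gap filling can be done, offered here only as an assumed illustration rather than the patent's specific template matching routine, is to fit the rigid marker template of a body segment to the markers that are still visible (a Kabsch/Procrustes fit) and read the missing marker's position off the fitted template.

```python
import numpy as np

def rigid_fit(template, observed):
    """Best-fit rotation R and translation t mapping template points onto observed
    points (Kabsch algorithm). Both arrays are (k, 3) with k >= 3 visible markers."""
    ct, co = template.mean(axis=0), observed.mean(axis=0)
    H = (template - ct).T @ (observed - co)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflection
    R = Vt.T @ D @ U.T
    t = co - R @ ct
    return R, t

def fill_missing(template_all, observed, visible):
    """template_all: (n, 3) segment marker template; observed: (n, 3) frame with dropouts;
    visible: boolean mask of markers seen this frame. Returns a completed frame."""
    R, t = rigid_fit(template_all[visible], observed[visible])
    filled = observed.copy()
    filled[~visible] = (template_all[~visible] @ R.T) + t
    return filled
```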
  • the Template matching result is passed to the computational inverse kinematics skeleton 24 .
  • position data of the markers is plotted in real time to joint orientations in the computational skeleton 24 .
  • Through constraint-based rigging, the data in turn drives a geometric (anatomically correct) skeleton. This skeleton is the base for the muscle force visualization layer.
  • FIG. 3 illustrates an embodiment of the present invention, wherein the patient 30 on an instrumented treadmill 38 is looking at the 3D real time interactive muscle model 34 of himself seeing the muscles in action as muscle force is exerted.
  • This interactive muscle force model 34 is calculated by a processor 36 using the method described above using data obtained from optical motion capture sensors 32 disposed on the patient's body 30 , in combination with sensors disposed in the instrumented treadmill 38 .
  • weight sensors may be disposed in the instrumented treadmill 38 while other sensors such as accelerometers, speedometers, rotation and position sensors may also be included.
  • FIG. 4 is a block diagrammatic view illustrating the possible hardware and software interconnections of one embodiment of the present invention.
  • the hardware platform is based on high end multi-core Multi-processor workstations.
  • the multi-CPU hardware platform 36 is used as the computer means for processing, memory, and interface.
  • the various peripherals and communications are accomplished by using standard high-speed connections using Ethernet, serial, and SCSI connections to dedicated hosts.
  • the dedicated host can be a separate personal computer (PC) or an integrated on-board computer that interfaces with the peripheral equipment.
  • the optical motion capture system of one embodiment includes six cameras, and the data acquisition unit of the optical motion capture system translates the camera input into the desired data set.
  • the data set of one embodiment is 3D position information of the sensors 20 obtained from a person 30 in real time, and is accessible to a dedicated host that allows for the fast exchange of data to the CPU 36 .
  • Data is, in one embodiment, delivered in a custom made file format.
  • The chosen main optical capture system of one embodiment is a real-time passive marker system 32, which is readily configurable for many setups. This technology is capable of converting and displaying 3D data coordinates of up to 300 optical markers at >100 Hz.
  • The instrumented treadmill 38 is interconnected to a dedicated host that connects to the CPU for transferring data and control information.
  • The treadmill 38 of one embodiment has the capacity to measure real time ground reaction forces by the use of force sensors under the treadmill belt.
  • a projection device 34 such as a plasma screen or a video projector and screen is used to display the real time 3D muscle model to the user.
  • FIG. 5 illustrates a flow chart illustrating the operation of a system configured according to one embodiment of the present invention.
  • Input from the motion capture system 1 in the form of 3D marker coordinates is used as input for the Kinematic Solver 6 .
  • The Kinematic Solver 6 also uses resource files of a skeleton definition and marker set templates 3 .
  • The Kinematic Solver 6 outputs the current skeleton pose in real time. Real-time low-pass filtering and differentiation process the changes in skeleton pose into velocities and accelerations that are used as input to the Motion Equations 7 .
  • the Kinematic Solver output also drives the generation of Muscle paths for all respective muscles 5 , and outputs the schematic skeleton used for the visualization 9 .
  • The Motion Equations 7 also use input from ground reaction forces and other external forces coming from an array of Force sensors 2 .
  • the Motion Equations 7 also use an input from resource files that contain the respective body mass properties 4 .
  • The Equations of Motion 7 output joint moments to the Optimization process 8 .
  • the Optimization process 8 also uses input of muscle lengths and moment arms coming from the respective muscle paths 5 .
  • The Optimization process 8 outputs muscle forces used in the Real Time muscle force Visualization 9 .
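  • The per-frame data flow of FIG. 5 can be summarized as a loop that feeds each module's output into the next. The sketch below shows only the wiring; every solver is a caller-supplied callable, and all names, the backward-difference differentiation and the 100 Hz time step are illustrative assumptions.

```python
def run_pipeline(marker_stream, force_stream, skeleton_template, mass_properties,
                 kinematic_solver, muscle_paths, motion_equations,
                 optimize_muscle_forces, visualize, dt=0.01):
    """Per-frame wiring of the FIG. 5 modules. Every callable argument
    (kinematic_solver, muscle_paths, motion_equations, optimize_muscle_forces,
    visualize) is supplied by the caller; only the data flow is sketched here."""
    q_prev = qdot_prev = None
    for markers, external_forces in zip(marker_stream, force_stream):    # inputs 1 and 2
        q = kinematic_solver(markers, skeleton_template)                 # block 6: skeleton pose
        qdot = (q - q_prev) / dt if q_prev is not None else 0.0 * q      # backward-difference velocity
        qddot = (qdot - qdot_prev) / dt if qdot_prev is not None else 0.0 * qdot
        lengths, moment_arms = muscle_paths(q)                           # block 5: muscle geometry
        joint_moments = motion_equations(q, qdot, qddot,                 # block 7: inverse dynamics
                                         external_forces, mass_properties)
        muscle_forces = optimize_muscle_forces(joint_moments,            # block 8: static optimization
                                               lengths, moment_arms)
        visualize(q, muscle_forces, joint_moments)                       # block 9: real time display
        q_prev, qdot_prev = q, qdot
```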
  • the skeleton pose (i.e. the set of generalized coordinates) is calculated in real-time by using the Levenberg-Marquardt nonlinear least-squares algorithm to solve the global optimization problem.
  • the use of the analytical Jacobian matrix makes the computations very fast.
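  • A minimal version of this global pose fit can be written with an off-the-shelf Levenberg-Marquardt least-squares solver: the residual is the difference between measured marker positions and the marker positions predicted by forward kinematics for a candidate set of generalized coordinates. The forward kinematics function and analytical Jacobian are assumed to be supplied (for instance generated as described in the next item); all names are illustrative.

```python
import numpy as np
from scipy.optimize import least_squares

def solve_pose(measured_markers, forward_kinematics, jacobian, q0):
    """measured_markers: (n_markers, 3) positions from the capture system.
    forward_kinematics(q) -> (n_markers, 3) model-predicted marker positions.
    jacobian(q) -> (3*n_markers, n_dof) analytical Jacobian of the flattened residual.
    q0: initial guess, typically the previous frame's pose (warm start)."""
    def residual(q):
        return (forward_kinematics(q) - measured_markers).ravel()

    result = least_squares(residual, q0, jac=jacobian, method="lm")
    return result.x     # generalized coordinates (skeleton pose) for this frame
```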
  • The equations of motion are produced via software that creates C code for the forward kinematics equations. Those equations generate coordinates of markers on the body from the generalized coordinates of the skeleton. The derivatives of the forward kinematics equations, forming a Jacobian matrix, are generated via symbolic differentiation. Finally, one embodiment of the present invention translates these equations into computer code which is incorporated into the computational pipeline, which executes the calculations at run time.
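  • The symbolic step can be illustrated with a toy one-marker, one-degree-of-freedom model: build the marker coordinates as symbolic expressions in the generalized coordinate, differentiate them to obtain the Jacobian, and emit C source. This SymPy sketch is an assumed stand-in for whatever code generation tool the patent's pipeline actually uses.

```python
import sympy as sp
from sympy.utilities.codegen import codegen

# Generalized coordinate of a toy one-segment model: a single planar joint angle.
q = sp.symbols('q')
L = sp.Symbol('L')                      # distance from the joint to the marker (symbolic)

# Forward kinematics: marker coordinates as functions of the generalized coordinate.
marker = sp.Matrix([L * sp.cos(q), L * sp.sin(q)])

# Analytical Jacobian via symbolic differentiation.
jacobian = marker.jacobian(sp.Matrix([q]))

# Emit C source for both routines, ready to be compiled into the real-time pipeline.
[(c_name, c_code), (h_name, h_code)] = codegen(
    [('marker_position', marker), ('marker_jacobian', jacobian)],
    language='C', prefix='fk_model')
print(c_code)
```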
  • the muscle forces are the solution of a static optimization problem, with the general form: minimize the sum of normalized muscle forces raised to the Nth power, while requiring that all muscle forces are non-negative, and that the set of muscle forces multiplied by their respective moment arms, are identical to the joint torques solved by the inverse dynamics equations.
  • Normalized muscle force is defined as the muscle force relative to the maximal force capacity of the muscle.
  • Moment arm is the distance from the muscle force vector to the instantaneous center of rotation of a particular joint and is mathematically calculated as the derivative of muscle length with respect to the joint's generalized coordinate.
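  • Written out, the static optimization is a small constrained problem solved once per frame. The sketch below solves it with a general-purpose constrained solver for assumed illustrative numbers (two joints, three muscles, N = 2); the moment-arm matrix, maximal forces and choice of solver are examples only, not values from the patent.

```python
import numpy as np
from scipy.optimize import minimize

def distribute_muscle_forces(joint_torques, moment_arms, f_max, power=2):
    """Solve: minimize sum_i (F_i / F_max_i)**power
       subject to moment_arms @ F = joint_torques and F >= 0.
    moment_arms: (n_joints, n_muscles); joint_torques: (n_joints,)."""
    n_muscles = moment_arms.shape[1]

    def cost(f):
        return np.sum((f / f_max) ** power)            # sum of normalized forces to the Nth power

    constraints = {'type': 'eq',                        # torques must be reproduced exactly
                   'fun': lambda f: moment_arms @ f - joint_torques}
    bounds = [(0.0, None)] * n_muscles                  # muscle forces are non-negative
    result = minimize(cost, x0=np.full(n_muscles, 10.0), bounds=bounds,
                      constraints=constraints, method='SLSQP')
    return result.x

# Illustrative numbers only: two joint torques shared among three muscles.
R = np.array([[0.05, 0.03, 0.0],
              [0.0,  0.02, 0.04]])                      # moment arms in metres
tau = np.array([30.0, 20.0])                            # joint torques in N*m
forces = distribute_muscle_forces(tau, R, f_max=np.array([2000.0, 1500.0, 1800.0]))
```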
  • Motion Capture is a phrase used to describe a variety of techniques for capturing the movement of a body or object, and the technology has existed for many years in a variety of applications.
  • the aim of motion capture is to create three-dimensional (3D) animation and natural simulations in a performance oriented manner.
  • Motion capture allows an operator to use computer-generated characters.
  • Motion capture is used to create complex natural motion, using the full range of human movements, and also allows inanimate objects to move realistically.
  • Some motion capture systems provide real-time feedback of the data and allow the operator to immediately determine whether the motion works sufficiently.
  • Motion capture can be applied to full body motion as well as to hand animation, facial animation and real time lip sync.
  • Motion capture is also used in medical, simulation, engineering and ergonomic applications, and in feature films, advertising, TV and 3D computer games. In the context of the present invention, motion capture is used to output 3D XYZ marker positions.
  • Force sensors are used in many industries such as automotive, robotics and various engineering applications. Typically a force sensor will measure the total forces applied on it; these can be vertical forces or horizontal and shear force components.
  • force sensors are used to measure ground reaction forces from the treadmill a person is standing, walking or running on.
  • The treadmill of one embodiment has the capacity to measure real time ground reaction forces by the use of force sensors under the treadmill belt. Its speed is interconnected to the computational pipeline, establishing a feedback loop between the motion capture system and the treadmill so that the person remains at the center of the treadmill regardless of changes in walking/running speed.
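  • One simple realization of that feedback loop, offered as an assumed example rather than the patent's controller, is proportional-derivative feedback on the subject's fore-aft position: if the pelvis drifts toward the front of the belt, the belt speeds up, and vice versa. The gains, speed limits and update rate below are illustrative.

```python
def update_belt_speed(current_speed, pelvis_x, pelvis_vx,
                      target_x=0.0, kp=0.8, kd=0.4, dt=0.01,
                      min_speed=0.0, max_speed=5.0):
    """PD feedback keeping the subject centered on the belt.
    pelvis_x: fore-aft pelvis position (m) from motion capture, 0 = belt centre,
    positive = toward the front; pelvis_vx: its velocity relative to the lab (m/s).
    Returns the new belt speed command (m/s)."""
    error = pelvis_x - target_x
    # Drifting forward means the subject is walking faster than the belt: speed it up.
    new_speed = current_speed + (kp * error + kd * pelvis_vx) * dt
    return min(max(new_speed, min_speed), max_speed)
```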
  • Skeleton definition and marker set templates 3 are resource files used in the computational pipeline of the current invention. People differ in size and weight, and a skeleton template is selected from a group of skeleton templates to get the best match for every person.
  • Marker templates are used to define where the markers are placed on the human body. Typically, such markers are disposed at every joint of the body.
  • Body mass properties 4 pertains to the weight of different body parts of different people. People vary in weight and this has ramifications on the muscle force they exert to generate specific motions. The mass properties are used as a resource for the correct real time force computations.
  • Muscle paths 5 are utilized to compensate for differences in build between users. Variations in length and width between subjects have ramifications for the force computations, as a longer muscle will exert a different force to generate the same motion than a shorter muscle; the placement of the ligaments will also differ between people. In the context of one embodiment of the present invention, muscle paths are used to assist the computations of muscle forces and joint torques.
  • Kinematic solver 6 provides for the calculation of joint orientation using inverse kinematics.
  • Kinematics is the process of calculating the position in space of the end of a linked structure, given the angles of all the joints. Inverse Kinematics does the reverse. Given the end point of the structure, what angles do the joints need to be in to achieve that end point? This process is used in robotics, 3D computer animation and some engineering applications.
  • it is a single step in the data analysis pipeline, taking the data stream from the motion capture system and calculating the joint angles for every body part.
  • Inverse kinematics is used to calculate the joint orientation from the motion capture data, and to thereby convert XYZ positional values to rotation angles of the joints in degrees or radians.
  • Motion Equations 7 are sets of mathematical equations designed to combine incoming streams of kinematics data with marker and skeleton templates and convert those to forward and inverse dynamics data. These can be Lagrangian equation sets, Kane's equation sets, or Newton-Euler equation sets.
  • the motion equations 7 provide the relationship between generalized forces applied at the body and generalized movements. “Generalized” in this respect means that they are formulated along the movement possibilities (or degrees of freedom) of the human body, rather than in terms of forces in the external world.
  • Equations 7 can be added describing the kind of interaction with the environment, such as contacts with the floor.
  • the equations 7 can be solved simultaneously in a forward simulation, solved algebraically in an inverse simulation or rearranged and solved to do a mixed inverse and forward simulation. In one embodiment of the present invention these computations are all happening in real time.
  • Effective delay is eliminated using efficient algorithms, achieving a minimum real time sampling rate greater than 30 Hz, a standard familiar to those in the television and broadcast industries.
  • Faster rates would likewise be acceptable or desirable in some applications.
  • An optimization process 8 uses the input of muscle lengths and moment arms coming from the respective muscle paths to output muscle forces and joint torques.
  • The optimization 8 of the data contains routines for data normalization and several real time software filters.
  • Real Time muscle force visualization 9 takes the muscle forces and joint torques as inputs and uses them to drive color animation on the respective muscles, displayed as a 3D human body model on screen.
  • The color brightness and hue correlate with the muscle force amplitude, gain and activation patterns.
  • The user and operator can see a real time animation of the muscle forces active in the human body at any given time.
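  • A minimal color mapping that behaves this way converts each muscle's normalized force to an HSV color whose hue sweeps from blue (idle) toward red (maximal) while its brightness rises with activation. The particular hue range and brightness floor below are illustrative assumptions, not the patent's calibration.

```python
import colorsys

def force_to_rgb(force, f_max):
    """Map a muscle force to an RGB triple for the 3D muscle model.
    Hue sweeps from blue (low force) to red (high force); brightness rises with force."""
    level = max(0.0, min(force / f_max, 1.0))          # normalized force in [0, 1]
    hue = (1.0 - level) * 2.0 / 3.0                    # 2/3 = blue, 0 = red
    value = 0.3 + 0.7 * level                          # keep idle muscles visible but dim
    return colorsys.hsv_to_rgb(hue, 1.0, value)
```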
  • Various embodiments of the present invention provide applications adaptable for other market segments. Sports and fitness is one such market.
  • One embodiment of the present invention provides a tool that is useful in numerous applications, including the fitness industry.
  • This system allows the visualizations of muscle forces for any given exercise in real-time.
  • the system can be used to enhance and improve muscle forces, by providing a realistic visualization of the given forces and torques.
  • the present system allows the user to see the force transference to various muscles in the body and achieve a desired effect.
  • the motion capture system instantly records the user's motion and provides immediate muscle force visualizations.
  • One embodiment of the present invention may have an enormous impact in the medical community by making it possible to view muscle forces and torques in real-time. It can assist and improve the quality of life of many patients and allow the perception of physical movement and muscle behaviors for those not otherwise capable of such motion.
  • the system is useful for victims of traumatic brain injury, cerebral damage, and spinal damage.
  • The study of motion recognition supports the notion that the body remembers certain movements and can even regenerate synaptic paths. By visualizing the desired muscle force, the body can be re-trained to make that movement.
  • the present invention can assist patients in understanding their present situation, where they lack muscle force and where they are exerting too much force for compensation reasons.
  • the system can visualize and track muscle deficiencies while training and improving movements.
  • One embodiment of the present invention in relation to medical applications can serve as an example.
  • One development project called “Virtual Gait Lab” is one embodiment of the system operating in the real-time domain. This project pertains to the development of a virtual reality system in which the muscle forces and joint torques of the human body can be seen and evaluated in real time in a variety of reproducible conditions.
  • One of the major objectives of such a project is to enhance diagnostic and therapeutic activities in a range of medical fields.
  • the enhancements are defined by allowing a medical expert team for the first time the opportunity to view and analyze muscle forces and joint torques patterns as they happen in a controlled real-time environment.
  • the system consists of a combination of an instrumented treadmill 38 that is capable of measuring ground reaction forces, a large screen or projection system for the display of the forces 34 , real time motion capture system 32 and the custom computational pipeline 36 translating the capture data to muscle forces and joint torques display.
  • Various embodiments of the present invention seek to develop an interactive virtual real-time muscle model, which can provide patients with means of almost unlimited exploratory behaviors and at the same time provide medical experts accurate measurement tools for monitoring the complex array of forces present in the human body.
  • the patterns of muscle activation determine whether a subject falls or not.
  • These simulations are aimed at an understanding of normal or pathological response patterns in certain balance tasks.
  • Such an embodiment offers not only a test-and learning environment for patients and doctors, but is also a valuable research environment for motor control.
  • Such an embodiment opens the door to a new type of experiments in which real time muscle force visualization can be offered.
  • the muscle force tremors as observed in Parkinson patients are considered to be an enigma by many clinicians and human movement scientists.
  • One embodiment of the invention is a new principle in real time visualization, where muscle force is seen and evaluated in a totally new way. This principle establishes a mechanism to achieve a visualization state whereby the persons involved can see immediately which muscles they are using and to what extent.
  • One embodiment of the present invention is a muscle force processing system comprising a processing means and a motion capture system connected to the processing means.
  • the motion capture data is taken from a plurality of motion sensors and is processed in real-time.
  • A further embodiment is an instrumented treadmill capable of measurements of ground reaction forces, wherein the measurements of said ground reaction forces are integrated in the computational pipeline, resulting in a real time view of muscle forces and joint torques.
  • a further embodiment is a 3D interactive muscle model further comprising an inverse kinematics skeleton layer, a 3D geometry anatomically correct skeleton layer and an anatomically correct muscle model layer.
  • An additional embodiment is a real time computational pipeline, further comprising a memory means for recording the motion capture data and processing the data in real time through the layers of the real time processing pipeline.
  • Another embodiment is a method and system for real time visualization registration, evaluation, and correction of muscle forces and joint torques in the human body, wherein the full process is happening in real time.

Abstract

A method and system are provided for the visual display of anatomical forces, that system having: a motion capture system; a computer, receiving data from said motion capture system; and a computational pipeline disposed on said computer; that computational pipeline being configured to calculate muscle forces and joint torques in real time and visually display those forces and torques.

Description

    RELATED APPLICATIONS
  • This application claims the benefit of U.S. Provisional Application No. 60/893,394, filed Mar. 7, 2007, which is herein incorporated in its entirety by reference.
  • FIELD OF THE INVENTION
  • This invention most generally relates to a system that combines motion capture technology and a 3D computational musculoskeletal model to create a real time display environment where muscle forces and joint torques are illustrated. More specifically, various embodiments of the present invention create real time visualizations of the physical muscle forces and joint torques in the body during movement.
  • BACKGROUND OF THE INVENTION
  • Currently there is no known system or method available for visualizing in 3D the muscle forces exerted by the human body in real time. Most rehabilitation clinics and medical research institutes use specialized therapeutic programs, based on cause related classifications of movement disorders, but there is no known way for them to view the body force arrays in real time, as it usually takes many hours or days of calculation to derive those parameters, and the results are numerical or graphical and not intuitive to the viewer.
  • Motion Capture is a term for a variety of techniques, and the technology has existed for many years in a variety of applications. The aim of motion capture is to create three-dimensional (3D) animation and natural simulations in a performance oriented manner. In the entertainment industry, motion capture allows an operator to use computer-generated characters. Motion capture can be used to create complex motion, using the full range of human movements and allow also inanimate objects to move realistically. Some motion capture systems provide real-time feedback of the data and allow the operator to immediately determine whether the motion works sufficiently. Motion capture can be applied to full body motion as well as to hand animation, facial animation and real time lip sync. Motion capture is also used in medical, simulation, engineering and ergonomic applications, and in feature films, advertising, TV and 3D computer games.
  • Kinematics is the process of calculating the position in space of the end of a linked structure, given the angles of all the joints. Inverse kinematics does the reverse: given the end point of the structure, it calculates the angles the joints need to be at to achieve that end point. This process is used in robotics, 3D computer animation and some engineering applications.
  • Dynamics is the process of calculating the accelerations of a linked structure in space, given the set of internal and external forces acting on the structure. Inverse dynamics does the opposite. Given the accelerations of the structure, and a set of measured forces, it calculates the unknown internal forces needed to produce those accelerations. The result is typically provided as a set of joint torques and resultant joint forces.
  • What is needed, therefore, are techniques for creating a single computational pipeline of all the described steps in real time, creating for the first time the capability to view muscle forces as they occur.
  • SUMMARY OF THE INVENTION
  • One embodiment of the present invention provides a method for real time display of the array of muscle forces and joint torques in a human body using color space animation of a 3D human body muscle model. A data stream coming from a motion capture system is parsed through a pipeline of specially written algorithms that derive joint orientations, accelerations and velocities, and forward and inverse dynamics, resulting in real time measurements of muscle forces and joint torques. Those are passed in real time to a 3D human muscle model, making the forces and torques visible to the user as they happen.
  • Another embodiment of the present invention provides runtime interaction by a user or operator.
  • A further embodiment of the present invention provides a combination of motion capture technologies, simulation technology and custom real time data processing algorithms, using a combination of hardware and software elements combined with the authoring and control software to customize the visualization in real time of forces and torques exerted by the human body.
  • Still another embodiment of the invention creates a new measurement and visualization tool, bearing applications in various industries. The invention creates the possibility of looking at muscle force transference in the body for determining, registering and evaluating human functional performance to a range of given situations.
  • Yet another embodiment of the present invention provides a new measurement and visualization tool, bearing applications in various industries. The invention creates the possibility of looking at muscle forces and joint forces transference in the body for determining, registering and evaluating human functional performance to a range of given situations. Other applications include orthopedic and ergonomic studies and designs.
  • A yet further embodiment of the present invention provides a process that incorporates real time 3D marker data streams coming from a motion capture system through real-time sets of algorithms that derive from the 3D markers cloud the joints centers of rotation, positions and orientations, then derives accelerations and velocities and converts those into an array of muscle forces that are passed to the 3D human body muscle model as a data stream used in the 3D color space visualization of the muscle forces and joint torques.
  • The features and advantages described herein are not all-inclusive and, in particular, many additional features and advantages will be apparent to one of ordinary skill in the art in view of the drawings, specification, and claims. Moreover, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and not to limit the scope of the inventive subject matter.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1A is a computer generated image illustrating motion capture points disposed on a user (not shown) configured in accordance with, but not limited to, one embodiment of the present invention.
  • FIG. 1B is a computer generated image illustrating a kinematics skeleton configured in accordance with one embodiment of the present invention.
  • FIG. 1C is a computer generated image illustrating an anatomically correct skeleton configured in accordance with one embodiment of the present invention.
  • FIG. 1D is a computer generated image illustrating a three dimensional anatomically correct muscle layer configured in accordance with one embodiment of the present invention.
  • FIG. 1E is a computer generated image illustrating a three dimensional anatomically correct muscle layer disposed on an anatomically correct three dimensional skeleton configured in accordance with one embodiment of the present invention.
  • FIG. 2 is a computer generated image illustrating pipeline layer connections configured in accordance with one embodiment of the present invention.
  • FIG. 3 is a computer generated image illustrating a V-Gait configured in accordance with one embodiment of the present invention.
  • FIG. 4 is a block diagram illustrating a motion capture system configured in accordance with one embodiment of the present invention.
  • FIG. 5 is a flow chart illustrating a method of motion capture configured in accordance with one embodiment of the present invention.
  • DETAILED DESCRIPTION
  • Muscle forces are typically invisible by nature and one can normally only see the results of applied muscle forces on the individual's surroundings. One embodiment of the present invention makes it possible to view simulated muscle forces in the human body in real-time, in a way that makes clear the force transference in the human musculoskeletal system. The process of achieving this functionality relies on fast and accurate real time motion capture data processing into an IK (inverse kinematics) skeletal layer containing joint positions and orientations, a further process deriving accelerations and velocities, a further process deriving inverse dynamics in real time, a further process deriving muscle forces from joint torques, and a final process converting the result streams into 3D visualizations of color and form changes in a 3D accurate human body muscle model.
  • The applicant herein incorporates by reference U.S. Pat. No. 6,774,885 for all purposes.
  • One embodiment of the invention is a method for real time display of the array of muscle forces and joint torques in a human body using color space animation of a 3D human body muscle model. Data streams coming from a motion capture system are parsed through a pipeline of specially written algorithms that derives joint orientations, accelerations and velocities and forward and inverse dynamics resulting in real time measurements of muscle forces and joint torques. Those are passed in real time to a 3D human muscle model making the forces and torques visible to the eye as they happen.
  • One embodiment of the present invention allows runtime interaction by a user or operator. Such an embodiment of the invention can be seen as a combination of motion capture technologies, simulation technology and custom real time data processing algorithms, using a combination of hardware and software elements combined with the authoring and control software to visualize in real time the forces and torques exerted by the human body.
  • One embodiment of the invention provides a new measurement and visualization tool, bearing applications in various industries. One embodiment of the invention creates the possibility of looking at muscle force transference in the body for determining, registering and evaluating human functional performance to a range of given situations. Although at least one embodiment of the present invention is intended for medical applications, embodiments of the present invention are adaptable for other market segments including ergonomics and sports.
  • Various embodiments of the present invention provide tools that are useful in numerous applications, including the sports and fitness industries. This system allows the visualizations of muscle forces for any given exercise in real-time. Such a system, illustrated in FIG. 3, can be used to enhance, optimize and improve muscle forces, by providing a realistic real time visualization of the given forces and torques. The system allows the user 30 to see the force transference to various muscles in the body and achieve the desired effect. A motion capture system 32 instantly records the user's motion and provides immediate muscle force visualizations 34.
  • One embodiment of the present invention may be utilized by the medical community by making it possible to view muscle forces and torques in real-time. It can assist and improve the quality of life of many patients and allow the perception of physical movement and muscle behaviors for those not otherwise capable of such motion. The system may be useful for victims of traumatic brain injury, cerebral damage, and spinal damage. The study of motion recognition supports the notion that the body remembers certain movements and can even regenerate synaptic paths. By visualizing the desired muscle force, the body can be re-trained to make that movement. In the field of orthopedics and prosthetics, embodiments of the present invention can assist patients in understanding their present situation, where they lack muscle force and where they are exerting too much force for compensation reasons. With orthopedics, prosthetics, and amputees, the system can visualize and track muscle deficiencies while training and improving movements.
  • Yet another embodiment of the present invention combines muscle forces and resultant joint force into a calculation and visualization of the forces acting within joints. This is useful as a training tool to prevent and treat overuse injuries in the workplace, in ergonomics and in sports.
  • In the context of one embodiment of the present invention it is a first step in the data analysis pipeline illustrated in FIG. 2, taking the data stream from the motion capture system and calculating the joint angles for every body part; each joint calculated is drawn as a sphere in this drawing. In one embodiment of the present invention, inverse kinematics is used to calculate the joint orientation from the motion capture data before deriving the accelerations and velocities of every body part. The next step in the pipeline is to take the calculated joint angles and derive values of accelerations and velocities for every joint (representing every body part). The acceleration and velocity values are the basis for calculating, through the use of inverse dynamics, the muscle forces and joint torques, which are then passed to the 3D muscle model display as color information.
  • One embodiment of the present invention in relation to medical applications can serve as an example. A development project called “Virtual Gait Lab” is one embodiment of the system operating in the real-time domain. Such an embodiment pertains to the development of a virtual reality system in which the muscle forces and joint torques of the human body can be seen and evaluated in real time in a variety of reproducible conditions.
  • Among the features of such an embodiment is the ability to enhance diagnostic and therapeutic activities in a range of medical fields. The enhancements are defined by allowing a medical expert team for the first time the opportunity to view and analyze muscle forces and joint torques patterns as they happen in a controlled real-time environment.
  • Such a system consists of a combination of an instrumented treadmill that is capable of measuring ground reaction forces, a large screen or projection system for the display of the forces, real time motion capture system and the custom computational pipeline translating the capture data to muscle forces and joint torques display.
  • An embodiment of the present invention seeks to develop an interactive virtual real-time muscle model, which can provide patients with means of almost unlimited exploratory behaviors and at the same time provide medical experts with accurate measurement tools for monitoring the complex array of forces present in the human body.
  • Especially in complex balance tasks, the patterns of muscle activation determine whether a subject falls or not. These simulations are aimed at an understanding of normal or pathological response patterns in certain balance tasks.
  • Such an embodiment offers not only a test and learning environment for patients and doctors, but is also a valuable research environment for motor control. Such an embodiment opens the door to a new type of experiment in which real time muscle force visualization can be offered.
  • For example, the muscle force tremors observed in Parkinson's patients are considered an enigma by many clinicians and human movement scientists. In these patients some visual cues are sufficient to trigger rather normal-looking muscle force patterns (for instance those used in walking), while in the absence of such stimuli a pattern cannot even be started. In healthy subjects, the continuous control of muscle force transference during walking is possible because multi-channel sensory input acts on a vast library of learned motor patterns. Once it becomes possible to view the emergence of muscle force patterns in real time, this will lead to fundamental improvement in the understanding and possible treatment of the disease. Such an embodiment will allow a new glimpse into the complexity of the natural processes associated with human motion.
  • Other examples can be found among patients with peripheral disorders, such as partial paralysis or paresis of a limb. In these situations, gait and balance are compromised both by a partial lack of sensory input and by a lack of muscle coordination. The usual result is that, in order to obtain functional gait and balance, the patients find compensations, resulting in deviant movement patterns in healthy parts of the body. Making use of the real time muscle force and joint torque visualization can help to distinguish between compensations and primary disorders.
  • Another example of an application for one embodiment of the present invention is the prevention and treatment of low back pain through teaching of proper lifting techniques. Real-time calculation and visualization of the forces acting on the intervertebral discs will provide immediate feedback to the patient concerning the quality of their movement.
  • In many embodiments the muscle forces will be visualized, but certain training applications may provide audio signals driven by muscle force values from the computational pipeline. Other training applications may use muscle force values as input for a virtual environment, which causes changes in position of virtual objects, or changes in position of the motion platform on which the subject is standing.
  • The computational pipeline that results in real time muscle force display is flexible and allows forward dynamics simulations to be run at any time during runtime of the system. The flow of movements as an input to the inverse dynamics simulation is stopped during such a sequence, and the calculated joint moments are now used as input, while the movements become the output. Thus forward simulations calculate movements and reaction forces from the moments of force produced around the joints of the subject. These forward simulations can be visualized as part of the virtual environment, and will show what might happen to the patient in hypothetical situations.
  • The forward and inverse dynamic calculations typically consist of a large set of equations. Depending on the methods used to form these equations, they are expressed in various ways, such as Newton-Euler equations, Lagrange's equations, or Kane's equations. These are called the equations of motion, which contain the relation between the generalized forces applied at the body and the generalized movements. “Generalized” in this respect means that they are formulated along the movement possibilities (or degrees of freedom) of the human body, rather than in terms of three dimensional coordinates in the external world. This implies that most of the generalized forces are actually moments of force (or torque). Equations can be added describing the kind of interaction with the environment, such as contacts with the floor. The equations can be solved simultaneously in a forward simulation, solved algebraically in an inverse simulation or rearranged and solved to do a mixed inverse and forward simulation. In one embodiment of the present invention these computations are all happening in real time.
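  • As a purely illustrative reference, and not a formulation prescribed by the disclosure, such equations of motion are commonly written in the generalized form

        M(q)\,\ddot{q} + c(q,\dot{q}) + g(q) = \tau + J_c(q)^{T} F_{ext}

    where q denotes the generalized coordinates (degrees of freedom), M(q) the mass matrix, c(q, q̇) the velocity-dependent Coriolis and centrifugal terms, g(q) the gravitational terms, τ the generalized joint moments, and J_c(q)^T F_ext the contribution of external contact forces, such as ground reaction forces, mapped through a contact Jacobian. In an inverse simulation the left-hand side is evaluated from the measured motion to solve for τ; in a forward simulation τ is prescribed and the accelerations are integrated to obtain the motion.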
  • From the dynamic simulation the location of the center of mass is calculated, which, together with the position of the feet, can be used to drive the motion of the platform, if this is required by the virtual environment. The human body model produces the joint moments of force of the subject. Forward dynamics simulation can be started to indicate where weak parts in the motor pattern are located.
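  • A minimal sketch of this center of mass computation, assuming per-segment centers of mass and segment masses are available from the body model (the argument names are illustrative only), is:

        import numpy as np

        def center_of_mass(segment_com_positions, segment_masses):
            """Whole-body center of mass from an (N, 3) array of segment
            center-of-mass positions and an (N,) array of segment masses."""
            m = np.asarray(segment_masses, dtype=float)
            r = np.asarray(segment_com_positions, dtype=float)
            return (m[:, None] * r).sum(axis=0) / m.sum()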
  • The main tasks of the real time computational pipeline are processing the input data coming from the motion capture sensors, mapping the collected data onto the above-mentioned human body model, and processing the various input and/or computed data depending on the particular case. Other tasks concern the display of real-time 3D graphic representations of muscle forces and joint torques 28, as well as driving output devices such as a treadmill 38 and a display system 34, as illustrated in FIG. 3.
  • The user interface for the operator is implemented as the means to communicate with the real time 3D muscle model 26 of FIG. 1D through a custom written software program. As an example of operation, after the type of motions to execute has been decided, the real time 3D muscle model is projected on the screen in front of the subject. The user stands on a platform or treadmill, which can be controlled as part of the system or as a reaction to movements of the subject. The user wears motion capture markers 20, as illustrated in FIGS. 1A and 2, of which the positions are recorded. These are fed into an algorithm that turns them into the degrees of freedom of the human body model, which is filled with the segment masses 22 and inertia of the subject and displayed as color space real time animations of the 3D muscle model of FIG. 1E.
  • From the skeleton motion and mass properties, the location of the center of mass is also calculated, which, together with the position of the feet, can be used to drive the motion of the treadmill or platform as required by the environment. The human body model 26 produces the joint moments of force of the subject; if necessary, this information can be offered in the projected image to be used by the subject. Forward dynamics simulation can also be computed to indicate where weak parts in the motor pattern are located.
  • FIGS. 1A-1E illustrate an overview of the computational real time pipeline of one embodiment of the present invention, wherein, as illustrated in FIG. 1A, a user is equipped with a number of motion capture sensors or markers 20 attached at various strategic locations of the body. The data from the sensors is received by a motion capture system 32. In a preferred embodiment, the motion capture data set contains the X axis, Y axis, and Z axis positions of the user for the full body, and is transmitted at >100 FPS (frames per second) to the computer 36. The computer 36 interactively operates with the operator's interface 34 and executes the first step in the computational pipeline, converting the positional data in real time to an inverse kinematics skeleton 22 illustrated in FIG. 1B. This data is typically applied to the inverse kinematics skeleton 22 to drive a 3D anatomically correct skeleton 24 in approximately real time (FIG. 1C). Then a 3D anatomically correct muscle layer 26 of FIG. 1D is connected to the human skeleton 24, and the muscle forces and joint torques resulting from the real time computational pipeline are applied as real time color animations 28 of the respective muscles in the 3D muscle model of FIG. 1E.
  • Referring to FIG. 2, a person is outfitted with markers 20 and a template 22 is processed for an initial or balance position. The markers 20 are typically used to record the motion. They are substantially instantaneously captured and used to process a complete template. The template 22 utilizes a template matching algorithm to interpolate for missing or bad marker data. The template matching result is passed to the computational inverse kinematics skeleton 24. Here, position data of the markers is mapped in real time to joint orientations in the computational skeleton 24. Using constraint-based rigging, the data in turn drives a geometry (anatomically correct) skeleton. This skeleton is the base for the muscle force visualization layer.
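  • The template matching algorithm itself is not detailed here. As a hedged illustration only, a much simpler stand-in for handling missing or bad marker samples is linear interpolation over short gaps, sketched below in Python; it is not the method of the disclosure.

        import numpy as np

        def fill_marker_gaps(trajectory):
            """Fill NaN samples in a single marker trajectory (T x 3) by linear
            interpolation - a simple stand-in for the template matching step."""
            traj = np.asarray(trajectory, dtype=float).copy()
            t = np.arange(traj.shape[0])
            for axis in range(traj.shape[1]):
                col = traj[:, axis]               # view into traj, edited in place
                bad = np.isnan(col)
                if bad.any() and (~bad).sum() >= 2:
                    col[bad] = np.interp(t[bad], t[~bad], col[~bad])
            return traj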
  • FIG. 3 illustrates an embodiment of the present invention, wherein the patient 30 on an instrumented treadmill 38 is looking at the 3D real time interactive muscle model 34 of himself seeing the muscles in action as muscle force is exerted. This interactive muscle force model 34 is calculated by a processor 36 using the method described above using data obtained from optical motion capture sensors 32 disposed on the patient's body 30, in combination with sensors disposed in the instrumented treadmill 38. In one such embodiment, weight sensors may be disposed in the instrumented treadmill 38 while other sensors such as accelerometers, speedometers, rotation and position sensors may also be included.
  • FIG. 4 is a block diagrammatic view illustrating the possible hardware and software interconnections of one embodiment of the present invention. The hardware platform is based on high-end multi-core, multi-processor workstations.
  • In one embodiment the multi-CPU hardware platform 36 is used as the computer means for processing, memory, and interface. Communication with the various peripherals is accomplished using standard high-speed Ethernet, serial, and SCSI connections to dedicated hosts. The dedicated host can be a separate personal computer (PC) or an integrated on-board computer that interfaces with the peripheral equipment. The optical motion capture system of one embodiment includes six cameras, and the data acquisition unit of the optical motion capture system translates the camera input into the desired data set.
  • The data set of one embodiment is 3D position information of the sensors 20 obtained from a person 30 in real time, and is accessible to a dedicated host that allows for the fast exchange of data with the CPU 36. Data is, in one embodiment, delivered in a custom made file format. Though not limited to this type of system, the chosen main optical capture system of one embodiment is a real-time passive marker system 32, which is readily configurable for many setups. This technology is capable of converting and displaying 3D data coordinates of up to 300 optical markers at >100 Hz. The instrumented treadmill 38 is interconnected to a dedicated host that connects to the CPU for transferring data and control information. The treadmill 38 of one embodiment has the capacity to measure real time ground reaction forces by the use of force sensors under the treadmill belt. Its speed is interconnected to the computational pipeline, establishing a feedback loop between the motion capture system 32 and the treadmill 38 so that the person remains at the center of the treadmill regardless of changes in walking/running speed. A projection device 34 such as a plasma screen or a video projector and screen is used to display the real time 3D muscle model to the user.
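  • A hedged sketch of such a feedback loop, using a simple proportional correction of belt speed so that the subject drifts back toward the treadmill center, is given below. The gains, limits, and the use of a tracked pelvis position are illustrative assumptions for this sketch, not values from the disclosure.

        def update_belt_speed(current_speed, pelvis_forward_position,
                              target_position=0.0, gain=0.5, dt=0.01,
                              max_speed=5.0):
            """Nudge the belt speed so the tracked pelvis position returns
            toward the center of the treadmill."""
            error = pelvis_forward_position - target_position  # positive: subject ahead of center
            new_speed = current_speed + gain * error * dt
            return min(max(new_speed, 0.0), max_speed)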
  • FIG. 5 is a flow chart illustrating the operation of a system configured according to one embodiment of the present invention. Input from the motion capture system 1, in the form of 3D marker coordinates, is used as input for the Kinematic Solver 6. The Kinematic Solver 6 also uses resource files containing a skeleton definition and marker set templates 3, and outputs the current skeleton pose in real-time. Real-time low-pass filtering and differentiation process the changes in skeleton pose into velocities and accelerations that are used as input to the Motion Equations 7. The Kinematic Solver output also drives the generation of muscle paths for all respective muscles 5, and outputs the schematic skeleton used for the visualization 9. The Motion Equations 7 also use input from ground reaction forces and other external forces coming from an array of force sensors 2, as well as input from resource files that contain the respective body mass properties 4. The Equations of Motion 7 output joint moments to the Optimization process 8. The Optimization process 8 also uses input of muscle lengths and moment arms coming from the respective muscle paths 5, and outputs the muscle forces used in the Real Time muscle force Visualization 9.
  • In one embodiment of the invention, the skeleton pose (i.e. the set of generalized coordinates) is calculated in real-time by using the Levenberg-Marquardt nonlinear least-squares algorithm to solve the global optimization problem. The use of the analytical Jacobian matrix makes the computations very fast.
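  • A minimal sketch of such a pose solve, assuming a forward kinematics function and its analytical Jacobian are supplied, is shown below. SciPy's Levenberg-Marquardt implementation is used here only as a stand-in for the real time solver; the function names are assumptions for this sketch.

        from scipy.optimize import least_squares

        def solve_pose(q0, measured_markers, forward_kinematics, jacobian):
            """Estimate the skeleton pose (generalized coordinates) from measured
            marker coordinates by nonlinear least squares (Levenberg-Marquardt).
            forward_kinematics(q) returns a flat vector of predicted marker
            coordinates; jacobian(q) returns its derivative matrix."""
            def residual(q):
                return forward_kinematics(q) - measured_markers
            result = least_squares(residual, q0, jac=jacobian, method="lm")
            return result.x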
  • In one embodiment of the invention, equations of motion are produced via software that creates C code for the forward kinematics equations. Those equations generate the coordinates of markers on the body from the generalized coordinates of the skeleton. The derivatives of the forward kinematics equations, forming a Jacobian matrix, are generated via symbolic differentiation. Finally, one embodiment of the present invention translates these equations into computer code which is incorporated into the computational pipeline and executed at run time.
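  • As a hedged illustration of such symbolic differentiation and code emission, the following SymPy sketch builds the forward kinematics of a toy planar two-link chain with one end-point marker, derives the analytical Jacobian, and prints C statements for its entries. The toy geometry and symbol names are assumptions for this sketch only, not the generator used in the disclosure.

        import sympy as sp

        # Toy planar two-link chain with one marker at its end point.
        q1, q2, l1, l2 = sp.symbols("q1 q2 l1 l2")
        marker = sp.Matrix([l1 * sp.cos(q1) + l2 * sp.cos(q1 + q2),
                            l1 * sp.sin(q1) + l2 * sp.sin(q1 + q2)])

        # Analytical Jacobian of marker coordinates w.r.t. generalized coordinates.
        J = marker.jacobian(sp.Matrix([q1, q2]))

        # Emit C statements for each Jacobian entry (illustrative of the
        # code-generation step that feeds the run-time pipeline).
        for i in range(J.rows):
            for j in range(J.cols):
                print(f"J[{i}][{j}] = {sp.ccode(J[i, j])};")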
  • In one embodiment, the muscle forces are the solution of a static optimization problem with the general form: minimize the sum of normalized muscle forces raised to the Nth power, while requiring that all muscle forces are non-negative and that the set of muscle forces, multiplied by their respective moment arms, is identical to the joint torques solved by the inverse dynamics equations. Normalized muscle force is defined as the muscle force relative to the maximal force capacity of the muscle. The moment arm is the distance from the muscle force vector to the instantaneous center of rotation of a particular joint and is mathematically calculated as the derivative of muscle length with respect to the joint's generalized coordinate. Traditional optimization methods are too slow for real-time applications. For N=2, which is commonly used in muscle force estimation, a solution is obtained in real time using a neural network algorithm for quadratic programming.
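  • A hedged sketch of this static optimization is given below, solved with a general-purpose constrained optimizer rather than the fast neural network quadratic programming solver referred to above. The argument moment_arms is assumed to be a joints-by-muscles moment arm matrix; all names are illustrative.

        import numpy as np
        from scipy.optimize import minimize

        def solve_muscle_forces(moment_arms, joint_torques, f_max, n=2):
            """Minimize sum((F_i / F_max_i) ** n) subject to F >= 0 and
            moment_arms @ F = joint_torques (the inverse dynamics torques)."""
            R = np.asarray(moment_arms, dtype=float)
            tau = np.asarray(joint_torques, dtype=float)
            f_max = np.asarray(f_max, dtype=float)
            objective = lambda f: np.sum((f / f_max) ** n)
            constraints = [{"type": "eq", "fun": lambda f: R @ f - tau}]
            bounds = [(0.0, None)] * f_max.size
            f0 = 0.1 * f_max                        # small positive initial guess
            res = minimize(objective, f0, method="SLSQP",
                           bounds=bounds, constraints=constraints)
            return res.x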
  • Motion capture is a phrase used to describe a variety of techniques for capturing the movement of a body or object, and the technology has existed for many years in a variety of applications. The aim of motion capture is to create three-dimensional (3D) animation and natural simulations in a performance oriented manner. In the entertainment industry, motion capture allows an operator to use computer-generated characters. Motion capture is used to create complex natural motion, using the full range of human movements, and also allows inanimate objects to move realistically. Some motion capture systems provide real-time feedback of the data and allow the operator to immediately determine whether the motion works sufficiently. Motion capture can be applied to full body motion as well as to hand animation, facial animation and real time lip sync. Motion capture is also used in medical, simulation, engineering and ergonomic applications, and in feature films, advertising, TV and 3D computer games. In the context of the present invention, motion capture is used to output 3D XYZ marker positions.
  • Force sensors are used in many industries, such as automotive, robotics, and various engineering applications. Typically a force sensor will measure the total forces applied to it; these can be vertical force components or horizontal and shear force components. In the context of the present invention, force sensors are used to measure ground reaction forces from the treadmill a person is standing, walking or running on. For example, the treadmill of one embodiment has the capacity to measure real time ground reaction forces by the use of force sensors under the treadmill belt. Its speed is interconnected to the computational pipeline, establishing a feedback loop between the motion capture system and the treadmill so that the person remains at the center of the treadmill regardless of changes in walking/running speed.
  • Skeleton definition and marker set templates 3 are resource files used in the computational pipeline of the current invention. People differ in size and weight, and a skeleton template is selected from a group of skeleton templates to get the best match for every person. Marker templates are used to define where the markers are placed on the human body. Typically, such markers are disposed at every joint of the body.
  • Body mass properties 4 pertain to the weight of different body parts of different people. People vary in weight, and this has ramifications for the muscle force they exert to generate specific motions. The mass properties are used as a resource for the correct real time force computations.
  • Muscle paths 5 are utilized to compensate for differences in build between users. Variations in length and width between subjects have ramifications for the force computations, as a longer muscle will exert a different force to generate the same motion than a shorter muscle; the placement of the ligaments will also differ between people. In the context of one embodiment of the present invention, muscle paths are used to assist the computations of muscle forces and joint torques.
  • Kinematic solver 6 provides for the calculation of joint orientation using inverse kinematics. Forward kinematics is the process of calculating the position in space of the end of a linked structure, given the angles of all the joints. Inverse kinematics does the reverse: given the end point of the structure, what angles do the joints need to be in to achieve that end point? This process is used in robotics, 3D computer animation and some engineering applications. In the context of one embodiment of the present invention it is a single step in the data analysis pipeline, taking the data stream from the motion capture system and calculating the joint angles for every body part. In the context of one embodiment of the present invention, inverse kinematics is used to calculate the joint orientation from the motion capture data, and to thereby convert XYZ positional values to rotation angles of the joints in degrees or radians.
  • Equations used in the calculation of motion and force are known to those skilled in the physical sciences, or are readily derived from equations well known in the field of physics. Motion Equations 7 are sets of mathematical equations designed to combine incoming streams of kinematics data with marker and skeleton templates and convert those to forward and inverse dynamics data. Those can be Lagrangian equation sets, Kane's equation sets, or Euler-Newton equation sets. In the context of one embodiment of the present invention, the motion equations 7 provide the relationship between generalized forces applied at the body and generalized movements. "Generalized" in this respect means that they are formulated along the movement possibilities (or degrees of freedom) of the human body, rather than in terms of forces in the external world. This implies that most of the generalized forces are actually moments of force (or torque). Equations 7 can be added describing the kind of interaction with the environment, such as contacts with the floor. The equations 7 can be solved simultaneously in a forward simulation, solved algebraically in an inverse simulation, or rearranged and solved to do a mixed inverse and forward simulation. In one embodiment of the present invention these computations all happen in real time. In one embodiment, effective delay is eliminated using efficient algorithms, achieving a minimal real time sampling speed greater than 30 Hz, a standard familiar to those in the television and broadcast industries. One skilled in the art will readily appreciate that faster rates would likewise be acceptable or desirable in some applications.
  • An optimization process 8 uses the input of muscle lengths and moment arms coming from the respective muscle paths to output muscle forces and joint torques. The optimization 8 of the data contains routines for data normalization and several real time software filters.
  • Real time muscle force visualization 9 is driven by inputs of muscle forces and joint torques, which are used to drive color animation on the respective muscles displayed as a 3D human body model on screen. The color brightness and hue correlate with the muscle force amplitude, gain and activation patterns. The user and operator can see a real time animation of the muscle forces active in the human body at any given time.
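  • One simple way to realize such a mapping is to blend linearly from a resting color to a fully activated color as a function of normalized muscle force, as in the sketch below. The particular color ramp is an illustrative choice, not part of the disclosure.

        def force_to_color(force, f_max, rest=(0.6, 0.6, 0.6), active=(1.0, 0.1, 0.1)):
            """Return an RGB triple that fades from `rest` at zero force to
            `active` at maximal force."""
            a = min(max(force / f_max, 0.0), 1.0)  # normalized force clamped to [0, 1]
            return tuple(r + a * (c - r) for r, c in zip(rest, active))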
  • Various embodiments of the present invention provide applications adaptable for other market segments. Sports and fitness is one such market. One embodiment of the present invention provides a tool that is useful in numerous applications, including the fitness industry. This system allows the visualization of muscle forces for any given exercise in real-time. The system can be used to enhance and improve muscle forces by providing a realistic visualization of the given forces and torques. The present system allows the user to see the force transference to various muscles in the body and achieve a desired effect. The motion capture system instantly records the user's motion and provides immediate muscle force visualizations.
  • One embodiment of the present invention may have an enormous impact in the medical community by making it possible to view muscle forces and torques in real-time. It can assist and improve the quality of life of many patients and allow the perception of physical movement and muscle behaviors for those not otherwise capable of such motion. The system is useful for victims of traumatic brain injury, cerebral damage, and spinal damage. The study of motion recognition supports the notion that the body remembers certain movements and can even regenerate synaptic paths. By visualizing the desired muscle force, the body can be re-trained to make that movement. In the field of orthopedics and prosthetics, the present invention can assist patients in understanding their present situation: where they lack muscle force and where they are exerting too much force to compensate. With orthopedics, prosthetics, and amputees, the system can visualize and track muscle deficiencies while training and improving movements. One embodiment of the present invention in relation to medical applications can serve as an example. One development project called "Virtual Gait Lab" is one embodiment of the system operating in the real-time domain. This project pertains to the development of a virtual reality system in which the muscle forces and joint torques of the human body can be seen and evaluated in real time in a variety of reproducible conditions. One of the major objectives of such a project is to enhance diagnostic and therapeutic activities in a range of medical fields. The enhancements are defined by allowing a medical expert team, for the first time, the opportunity to view and analyze muscle force and joint torque patterns as they happen in a controlled real-time environment.
  • In one embodiment such as that illustrated in FIG. 3, the system consists of a combination of an instrumented treadmill 38 that is capable of measuring ground reaction forces, a large screen or projection system for the display of the forces 34, a real time motion capture system 32, and the custom computational pipeline 36 translating the captured data into a muscle force and joint torque display.
  • Various embodiments of the present invention seek to develop an interactive virtual real-time muscle model, which can provide patients with means of almost unlimited exploratory behaviors and at the same time provide medical experts with accurate measurement tools for monitoring the complex array of forces present in the human body. Especially in complex balance tasks, the patterns of muscle activation determine whether a subject falls or not. These simulations are aimed at an understanding of normal or pathological response patterns in certain balance tasks. Such an embodiment offers not only a test and learning environment for patients and doctors, but is also a valuable research environment for motor control. Such an embodiment opens the door to a new type of experiment in which real time muscle force visualization can be offered. For example, the muscle force tremors observed in Parkinson's patients are considered an enigma by many clinicians and human movement scientists. In these patients some visual cues are sufficient to trigger rather normal-looking muscle force patterns (for instance those used in walking), while in the absence of such stimuli a pattern cannot even be started. In healthy subjects, the continuous control of muscle force transference during walking is possible because multi-channel sensory input acts on a vast library of learned motor patterns. Once it becomes possible to view the emergence of muscle force patterns in real time, this will lead to fundamental improvement in the understanding and possible treatment of the disease. Such an embodiment will allow a new glimpse into the complexity of the natural processes associated with human motion. Other examples can be found among patients with peripheral disorders, such as partial paralysis or paresis of a limb. In these situations, gait and balance are compromised both by a partial lack of sensory input and by a lack of muscle coordination. The usual result is that, in order to obtain functional gait and balance, the patients find compensations, resulting in deviant movement patterns in healthy parts of the body. Making use of the real time muscle force and joint torque visualization can help to distinguish between compensations and primary disorders.
  • One embodiment of the invention is a new principle in real time visualization, where muscle force is seen and evaluated in a totally new way. This principle establishes a mechanism to achieve a visualization state whereby the persons involved can see immediately which muscles they are using and to what extent.
  • One embodiment of the present invention is a muscle force processing system comprising a processing means and a motion capture system connected to the processing means. The motion capture data is taken from a plurality of motion sensors and is processed in real-time. There is a computational pipeline connected to the processing means, wherein the resulting data is also processed in real-time and is visualized in real time through color space changes in a 3D muscle model showing the muscle forces and joint torques in real time. There is also a means of interfacing with the muscle model through a runtime control input. A further embodiment is an instrumented treadmill capable of measuring ground reaction forces, wherein the measurements of said ground reaction forces are integrated in the computational pipeline, resulting in a real time view of muscle forces and joint torques. A further embodiment is a 3D interactive muscle model further comprising an inverse kinematics skeleton layer, a 3D geometry anatomically correct skeleton layer, and an anatomically correct muscle model layer. An additional embodiment is a real time computational pipeline further comprising a memory means for recording the motion capture data and processing the data in real time through the said layers of the real time processing pipeline. Another embodiment is a method and system for real time visualization, registration, evaluation, and correction of muscle forces and joint torques in the human body, wherein the full process happens in real time.
  • The foregoing description of the embodiments of the invention has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed. Many modifications and variations are possible in light of this disclosure. It is intended that the scope of the invention be limited not by this detailed description, but rather by the claims appended hereto.

Claims (20)

1. A system for the visual display and output of anatomical forces, the system comprising:
A motion capture system;
A computer, receiving data from said motion capture system;
A computational pipeline disposed on said computer;
Said computational pipeline being configured to calculate joint torques, muscle forces, and joint forces in real time and visually display said forces and torques.
2. The system of claim 1 further comprising a runtime control unit.
3. The system of claim 1 wherein said motion capture system comprises sensors selected from the group of sensors consisting of optical, magnetic, inertial, and video based sensors.
4. The system of claim 1 further comprising a display device.
5. The system according to claim 1 further comprising a memory storage device.
6. The system according to claim 1 further comprising an instrumented treadmill.
7. The system according to claim 1 further comprising a ground reaction force sensor.
8. The system according to claim 1 wherein said muscle forces and said joint torques are displayed on a representation of a human body.
9. A method for the visual display of anatomical forces, said method comprising:
Placing at least one motion capture marker on a user;
Collecting and recording real time positional data from said at least one marker;
Deriving rotational data from a plurality of said at least one marker;
Computing acceleration and velocity from said rotational data in real time;
Utilizing said acceleration and velocity to compute forward and inverse dynamics data;
Calculating muscle forces and joint torques in real time from said forward and inverse dynamics data; and
Displaying said muscle forces and joint torques.
10. The method according to claim 9 wherein said muscle forces and joint torques are displayed on a 3 dimensional muscle model.
11. The method according to claim 9 further comprising selecting a body template approximating the build of the user.
12. The method according to claim 9 further comprising displaying said muscle forces and joint torques with colored animation.
13. A system for diagnosis and treatment of musculoskeletal disorders; said system comprising:
A movement platform;
Sensors disposed within said platform;
Motion capture sensors whereby motion of a patient is captured;
A visual display whereby the location and intensity of joint torques, muscle forces, and joint forces are displayed in real time on an animated human model.
14. The system according to claim 13 wherein said visual display is provided to said patient thereby training said patient to improve movement in an affected limb.
15. The system according to claim 13 wherein said visual display is configured to provide a clinician with force and torque data necessary for diagnosis.
16. The system according to claim 15 further comprising a recording function whereby said visual display may be replayed for study by said clinician.
17. The system according to claim 15 further comprising a database of said visual displays.
18. The system according to claim 17 further comprising a comparator whereby said database contains visual displays from a plurality of users.
19. The method according to claim 9 further comprising the use of a Jacobian matrix to facilitate said step of computing.
20. The method according to claim 9 wherein said step of calculating further comprises use of a recurrent neural network.
US11/832,726 2007-03-07 2007-08-02 Method for real time interactive visualization of muscle forces and joint torques in the human body Abandoned US20080221487A1 (en)

Priority Applications (7)

Application Number Priority Date Filing Date Title
US11/832,726 US20080221487A1 (en) 2007-03-07 2007-08-02 Method for real time interactive visualization of muscle forces and joint torques in the human body
ES08730108.1T ES2679125T3 (en) 2007-03-07 2008-02-19 System for interactive visualization in real time of muscular forces and joint pairs in the human body
EP08730108.1A EP2120710B1 (en) 2007-03-07 2008-02-19 System for real time interactive visualization of muscle forces and joint torques in the human body
JP2009552792A JP5016687B2 (en) 2007-03-07 2008-02-19 A method to interactively visualize muscle strength and joint moments in the human body in real time
PCT/US2008/054239 WO2008109248A2 (en) 2007-03-07 2008-02-19 Method for real time interactive visualization of muscle forces and joint torques in the human body
CA2680462A CA2680462C (en) 2007-03-07 2008-02-19 Method for real time interactive visualization of muscle forces and joint torques in the human body
US12/251,688 US7931604B2 (en) 2007-03-07 2008-10-15 Method for real time interactive visualization of muscle forces and joint torques in the human body

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US89339407P 2007-03-07 2007-03-07
US11/832,726 US20080221487A1 (en) 2007-03-07 2007-08-02 Method for real time interactive visualization of muscle forces and joint torques in the human body

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US12/251,688 Continuation US7931604B2 (en) 2007-03-07 2008-10-15 Method for real time interactive visualization of muscle forces and joint torques in the human body

Publications (1)

Publication Number Publication Date
US20080221487A1 true US20080221487A1 (en) 2008-09-11

Family

ID=39739017

Family Applications (2)

Application Number Title Priority Date Filing Date
US11/832,726 Abandoned US20080221487A1 (en) 2007-03-07 2007-08-02 Method for real time interactive visualization of muscle forces and joint torques in the human body
US12/251,688 Active 2027-10-01 US7931604B2 (en) 2007-03-07 2008-10-15 Method for real time interactive visualization of muscle forces and joint torques in the human body

Family Applications After (1)

Application Number Title Priority Date Filing Date
US12/251,688 Active 2027-10-01 US7931604B2 (en) 2007-03-07 2008-10-15 Method for real time interactive visualization of muscle forces and joint torques in the human body

Country Status (6)

Country Link
US (2) US20080221487A1 (en)
EP (1) EP2120710B1 (en)
JP (1) JP5016687B2 (en)
CA (1) CA2680462C (en)
ES (1) ES2679125T3 (en)
WO (1) WO2008109248A2 (en)

Cited By (132)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090124938A1 (en) * 2007-11-14 2009-05-14 Wolfgang Brunner Gait Analysis System
US20100105525A1 (en) * 2008-10-23 2010-04-29 University Of Southern California System for encouraging a user to perform substantial physical activity
US20100222711A1 (en) * 2009-02-25 2010-09-02 Sherlock NMD, LLC, a Nevada Corporation Devices, systems and methods for capturing biomechanical motion
US20110013004A1 (en) * 2007-06-08 2011-01-20 Nokia Corporation Measuring human movements - method and apparatus
US20110045952A1 (en) * 2009-08-21 2011-02-24 Lifemodeler, Inc. Systems and Methods For Determining Muscle Force Through Dynamic Gain Optimization of a Muscle PID Controller
US20110060537A1 (en) * 2009-09-08 2011-03-10 Patrick Moodie Apparatus and method for physical evaluation
WO2011032541A1 (en) * 2009-09-16 2011-03-24 Otto-Von-Guericke-Universität Magdeburg Device for determining vertebral distances of the spinal column of a patient
WO2011034963A2 (en) * 2009-09-15 2011-03-24 Sony Corporation Combining multi-sensory inputs for digital animation
US20110137138A1 (en) * 2008-05-29 2011-06-09 Per Johansson Patient Management Device, System And Method
US20110285626A1 (en) * 2009-01-30 2011-11-24 Microsoft Corporation Gesture recognizer system architecture
CN102426709A (en) * 2011-08-19 2012-04-25 北京航空航天大学 Real-time motion synthesis method based on fast inverse kinematics
US20120108909A1 (en) * 2010-11-03 2012-05-03 HeadRehab, LLC Assessment and Rehabilitation of Cognitive and Motor Functions Using Virtual Reality
US20120123805A1 (en) * 2007-12-04 2012-05-17 Terence Vardy Collection of medical data
US20120165703A1 (en) * 2010-12-22 2012-06-28 Paul William Bottum Preempt Muscle Map Screen
US20120183939A1 (en) * 2010-11-05 2012-07-19 Nike, Inc. Method and system for automated personal training
US20120277635A1 (en) * 2011-04-29 2012-11-01 Tsai Ming-June Body motion staff, producing module, image processing module and motion replication module
US8315823B2 (en) 2011-04-20 2012-11-20 Bertec Corporation Force and/or motion measurement system having inertial compensation and method thereof
US8314793B2 (en) 2008-12-24 2012-11-20 Microsoft Corporation Implied analytical reasoning and computation
US8315822B2 (en) 2011-04-20 2012-11-20 Bertec Corporation Force measurement system having inertial compensation
US8352397B2 (en) 2009-09-10 2013-01-08 Microsoft Corporation Dependency graph in data-driven model
US20130046149A1 (en) * 2011-08-19 2013-02-21 Accenture Global Services Limited Interactive virtual care
US8411085B2 (en) 2008-06-27 2013-04-02 Microsoft Corporation Constructing view compositions for domain-specific environments
US8493406B2 (en) 2009-06-19 2013-07-23 Microsoft Corporation Creating new charts and data visualizations
US8531451B2 (en) 2009-06-19 2013-09-10 Microsoft Corporation Data-driven visualization transformation
US8578302B2 (en) 2009-01-30 2013-11-05 Microsoft Corporation Predictive determination
US8620635B2 (en) 2008-06-27 2013-12-31 Microsoft Corporation Composition of analytics models
US20140028539A1 (en) * 2012-07-29 2014-01-30 Adam E. Newham Anatomical gestures detection system using radio signals
US20140039861A1 (en) * 2012-08-06 2014-02-06 CELSYS, Inc. Object correcting apparatus and method and computer-readable recording medium
US8692826B2 (en) 2009-06-19 2014-04-08 Brian C. Beckman Solver-based visualization framework
US8704855B1 (en) 2013-01-19 2014-04-22 Bertec Corporation Force measurement system having a displaceable force measurement assembly
US8764532B1 (en) 2012-11-07 2014-07-01 Bertec Corporation System and method for fall and/or concussion prediction
US8788574B2 (en) 2009-06-19 2014-07-22 Microsoft Corporation Data-driven visualization of pseudo-infinite scenes
US20140267611A1 (en) * 2013-03-14 2014-09-18 Microsoft Corporation Runtime engine for analyzing user motion in 3d images
US8847989B1 (en) 2013-01-19 2014-09-30 Bertec Corporation Force and/or motion measurement system and a method for training a subject using the same
US20140289964A1 (en) * 2013-02-13 2014-10-02 Jon Dodd Configurable bed
US8866818B2 (en) 2009-06-19 2014-10-21 Microsoft Corporation Composing shapes and data series in geometries
US8892495B2 (en) 1991-12-23 2014-11-18 Blanding Hovenweep, Llc Adaptive pattern recognition based controller apparatus and method and human-interface therefore
WO2015073368A1 (en) 2013-11-12 2015-05-21 Highland Instruments, Inc. Analysis suite
US9081436B1 (en) 2013-01-19 2015-07-14 Bertec Corporation Force and/or motion measurement system and a method of testing a subject using the same
US9168420B1 (en) 2012-01-11 2015-10-27 Bertec Corporation Force measurement system
US9289674B2 (en) 2012-06-04 2016-03-22 Nike, Inc. Combinatory score having a fitness sub-score and an athleticism sub-score
US9330503B2 (en) 2009-06-19 2016-05-03 Microsoft Technology Licensing, Llc Presaging and surfacing interactivity within data visualizations
US9358426B2 (en) 2010-11-05 2016-06-07 Nike, Inc. Method and system for automated personal training
US20160203630A1 (en) * 2015-01-09 2016-07-14 Vital Mechanics Research Inc. Methods and systems for computer-based animation of musculoskeletal systems
US9457256B2 (en) 2010-11-05 2016-10-04 Nike, Inc. Method and system for automated personal training that includes training programs
US9504909B2 (en) 2011-05-05 2016-11-29 Qualcomm Incorporated Method and apparatus of proximity and stunt recording for outdoor gaming
US9526451B1 (en) 2012-01-11 2016-12-27 Bertec Corporation Force measurement system
US9526443B1 (en) 2013-01-19 2016-12-27 Bertec Corporation Force and/or motion measurement system and a method of testing a subject
US9535563B2 (en) 1999-02-01 2017-01-03 Blanding Hovenweep, Llc Internet appliance system and method
US20170020437A1 (en) * 2014-03-12 2017-01-26 Movotec A/S System, apparatus and method for measurement of muscle stiffness
US20170177833A1 (en) * 2015-12-22 2017-06-22 Intel Corporation Smart placement of devices for implicit triggering of feedbacks relating to users' physical activities
US20170266533A1 (en) * 2016-03-18 2017-09-21 Icon Health & Fitness, Inc. Coordinated Displays in an Exercise Device
US9770203B1 (en) 2013-01-19 2017-09-26 Bertec Corporation Force measurement system and a method of testing a subject
US20170273864A1 (en) * 2016-03-23 2017-09-28 Zoll Medical Corporation Real-Time Kinematic Analysis During Cardio-Pulmonary Resuscitation
US20170296115A1 (en) * 2009-02-02 2017-10-19 Joint Vue, LLC Motion Tracking System with Inertial-Based Sensing Units
US9811639B2 (en) 2011-11-07 2017-11-07 Nike, Inc. User interface and fitness meters for remote joint workout session
US9836118B2 (en) * 2015-06-16 2017-12-05 Wilson Steele Method and system for analyzing a movement of a person
WO2018022602A1 (en) * 2016-07-25 2018-02-01 Ctrl-Labs Corporation Methods and apparatus for predicting musculo-skeletal position information using wearable autonomous sensors
US9916011B1 (en) * 2015-08-22 2018-03-13 Bertec Corporation Force measurement system that includes a force measurement assembly, a visual display device, and one or more data processing devices
US10010286B1 (en) 2013-01-19 2018-07-03 Bertec Corporation Force measurement system
WO2018146630A3 (en) * 2017-02-10 2018-10-04 Apex Occupational Health Solutions Inc. Method and system for ergonomic augmentation of workspaces
US10216262B1 (en) * 2015-08-22 2019-02-26 Bertec Corporation Force management system that includes a force measurement assembly, a visual display device, and one or more data processing devices
US10231662B1 (en) * 2013-01-19 2019-03-19 Bertec Corporation Force measurement system
US20190133693A1 (en) * 2017-06-19 2019-05-09 Techmah Medical Llc Surgical navigation of the hip using fluoroscopy and tracking sensors
CN110109532A (en) * 2018-06-11 2019-08-09 成都思悟革科技有限公司 A kind of human action Compare System obtaining system based on human body attitude
US10390736B1 (en) 2015-08-22 2019-08-27 Bertec Corporation Force measurement system that includes a force measurement assembly, at least one visual display device, and one or more data processing devices
US10409371B2 (en) 2016-07-25 2019-09-10 Ctrl-Labs Corporation Methods and apparatus for inferring user intent based on neuromuscular signals
US10413230B1 (en) * 2013-01-19 2019-09-17 Bertec Corporation Force measurement system
US10420982B2 (en) 2010-12-13 2019-09-24 Nike, Inc. Fitness training system with energy expenditure calculation that uses a form factor
US10460455B2 (en) 2018-01-25 2019-10-29 Ctrl-Labs Corporation Real-time processing of handstate representation model estimates
US10489986B2 (en) * 2018-01-25 2019-11-26 Ctrl-Labs Corporation User-controlled tuning of handstate representation model parameters
US10496168B2 (en) 2018-01-25 2019-12-03 Ctrl-Labs Corporation Calibration techniques for handstate representation modeling using neuromuscular signals
US10504286B2 (en) 2018-01-25 2019-12-10 Ctrl-Labs Corporation Techniques for anonymizing neuromuscular signal data
US10535015B2 (en) 2015-11-05 2020-01-14 Samsung Electronics Co., Ltd. Walking assistance apparatus and method of controlling same
US10555688B1 (en) 2015-08-22 2020-02-11 Bertec Corporation Measurement system that includes at least one measurement assembly, a head-mounted visual display device, and a data processing device
US10592001B2 (en) 2018-05-08 2020-03-17 Facebook Technologies, Llc Systems and methods for improved speech recognition using neuromuscular information
US10628504B2 (en) 2010-07-30 2020-04-21 Microsoft Technology Licensing, Llc System of providing suggestions based on accessible and contextual information
US10646153B1 (en) 2013-01-19 2020-05-12 Bertec Corporation Force measurement system
US10684692B2 (en) 2014-06-19 2020-06-16 Facebook Technologies, Llc Systems, devices, and methods for gesture identification
US10687759B2 (en) 2018-05-29 2020-06-23 Facebook Technologies, Llc Shielding techniques for noise reduction in surface electromyography signal measurement and related systems and methods
CN111492367A (en) * 2017-12-13 2020-08-04 谷歌有限责任公司 Gesture learning, lifting and noise cancellation from 2D images
US10772519B2 (en) 2018-05-25 2020-09-15 Facebook Technologies, Llc Methods and apparatus for providing sub-muscular control
US10817795B2 (en) 2018-01-25 2020-10-27 Facebook Technologies, Llc Handstate reconstruction based on multiple inputs
US10825561B2 (en) 2011-11-07 2020-11-03 Nike, Inc. User interface for remote joint workout session
US10842407B2 (en) 2018-08-31 2020-11-24 Facebook Technologies, Llc Camera-guided interpretation of neuromuscular signals
US10856796B1 (en) 2013-01-19 2020-12-08 Bertec Corporation Force measurement system
US10860843B1 (en) * 2015-08-22 2020-12-08 Bertec Corporation Measurement system that includes at least one measurement assembly, a head-mounted visual display device, and a data processing device
US10905383B2 (en) 2019-02-28 2021-02-02 Facebook Technologies, Llc Methods and apparatus for unsupervised one-shot machine learning for classification of human gestures and estimation of applied forces
US10921764B2 (en) 2018-09-26 2021-02-16 Facebook Technologies, Llc Neuromuscular control of physical objects in an environment
US10937414B2 (en) 2018-05-08 2021-03-02 Facebook Technologies, Llc Systems and methods for text input using neuromuscular information
US10970936B2 (en) 2018-10-05 2021-04-06 Facebook Technologies, Llc Use of neuromuscular signals to provide enhanced interactions with physical objects in an augmented reality environment
US10970374B2 (en) 2018-06-14 2021-04-06 Facebook Technologies, Llc User identification and authentication with neuromuscular signatures
CN112733301A (en) * 2021-01-21 2021-04-30 佛山科学技术学院 Six-dimensional torque sensor gravity compensation method and system based on neural network
US11000211B2 (en) 2016-07-25 2021-05-11 Facebook Technologies, Llc Adaptive system for deriving control signals from measurements of neuromuscular activity
US11045137B2 (en) 2018-07-19 2021-06-29 Facebook Technologies, Llc Methods and apparatus for improved signal robustness for a wearable neuromuscular recording device
US11052288B1 (en) 2013-01-19 2021-07-06 Bertec Corporation Force measurement system
US11069148B2 (en) 2018-01-25 2021-07-20 Facebook Technologies, Llc Visualization of reconstructed handstate information
US11079846B2 (en) 2013-11-12 2021-08-03 Facebook Technologies, Llc Systems, articles, and methods for capacitive electromyography sensors
US11097154B1 (en) * 2019-10-11 2021-08-24 Bertec Corporation Swing analysis system
US11103748B1 (en) 2019-03-05 2021-08-31 Physmodo, Inc. System and method for human motion detection and tracking
CN113456060A (en) * 2021-05-27 2021-10-01 中国科学院软件研究所 Method and device for extracting characteristic parameters of motion function
US11179066B2 (en) 2018-08-13 2021-11-23 Facebook Technologies, Llc Real-time spike detection and identification
US11216069B2 (en) 2018-05-08 2022-01-04 Facebook Technologies, Llc Systems and methods for improved speech recognition using neuromuscular information
US11253173B1 (en) * 2017-05-30 2022-02-22 Verily Life Sciences Llc Digital characterization of movement to detect and monitor disorders
US11301045B1 (en) * 2015-08-22 2022-04-12 Bertec Corporation Measurement system that includes at least one measurement assembly, a visual display device, and at least one data processing device
US11311209B1 (en) * 2013-01-19 2022-04-26 Bertec Corporation Force measurement system and a motion base used therein
US11331045B1 (en) 2018-01-25 2022-05-17 Facebook Technologies, Llc Systems and methods for mitigating neuromuscular signal artifacts
US11331006B2 (en) 2019-03-05 2022-05-17 Physmodo, Inc. System and method for human motion detection and tracking
US11337652B2 (en) 2016-07-25 2022-05-24 Facebook Technologies, Llc System and method for measuring the movements of articulated rigid bodies
US11458362B1 (en) * 2019-10-11 2022-10-04 Bertec Corporation Swing analysis system
US11481030B2 (en) 2019-03-29 2022-10-25 Meta Platforms Technologies, Llc Methods and apparatus for gesture detection and classification
US11482337B2 (en) * 2014-04-17 2022-10-25 The Boeing Company Method and system for tuning a musculoskeletal model
US11481031B1 (en) 2019-04-30 2022-10-25 Meta Platforms Technologies, Llc Devices, systems, and methods for controlling computing devices via neuromuscular signals of users
US11493993B2 (en) 2019-09-04 2022-11-08 Meta Platforms Technologies, Llc Systems, methods, and interfaces for performing inputs based on neuromuscular control
US11497962B2 (en) 2019-03-05 2022-11-15 Physmodo, Inc. System and method for human motion detection and tracking
US11511156B2 (en) 2016-03-12 2022-11-29 Arie Shavit Training system and methods for designing, monitoring and providing feedback of training
US20220386942A1 (en) * 2021-06-04 2022-12-08 University Of Iowa Research Foundation Methods And Apparatus For Machine Learning To Analyze Musculo-Skeletal Rehabilitation From Images
US11540744B1 (en) 2013-01-19 2023-01-03 Bertec Corporation Force measurement system
US11567573B2 (en) 2018-09-20 2023-01-31 Meta Platforms Technologies, Llc Neuromuscular text entry, writing and drawing in augmented reality systems
US11635736B2 (en) 2017-10-19 2023-04-25 Meta Platforms Technologies, Llc Systems and methods for identifying biological structures associated with neuromuscular source signals
US11644799B2 (en) 2013-10-04 2023-05-09 Meta Platforms Technologies, Llc Systems, articles and methods for wearable electronic devices employing contact sensors
US11666264B1 (en) 2013-11-27 2023-06-06 Meta Platforms Technologies, Llc Systems, articles, and methods for electromyography sensors
US11686743B2 (en) * 2019-06-11 2023-06-27 Honda Motor Co., Ltd. Information processing device, information processing method, and storage medium
US11790536B1 (en) 2019-10-11 2023-10-17 Bertec Corporation Swing analysis system
US11797087B2 (en) 2018-11-27 2023-10-24 Meta Platforms Technologies, Llc Methods and apparatus for autocalibration of a wearable electrode sensor system
US11857331B1 (en) 2013-01-19 2024-01-02 Bertec Corporation Force measurement system
US20240005600A1 (en) * 2019-12-27 2024-01-04 Sony Group Corporation Information processing apparatus, information processing method, and information processing program
US11868531B1 (en) 2021-04-08 2024-01-09 Meta Platforms Technologies, Llc Wearable device providing for thumb-to-finger-based input gestures detected based on neuromuscular signals, and systems and methods of use thereof
US11907423B2 (en) 2019-11-25 2024-02-20 Meta Platforms Technologies, Llc Systems and methods for contextualized interactions with an environment
US11921471B2 (en) 2013-08-16 2024-03-05 Meta Platforms Technologies, Llc Systems, articles, and methods for wearable devices having secondary power sources in links of a band for providing secondary power in addition to a primary power source
US11951357B1 (en) * 2022-11-30 2024-04-09 Roku, Inc. Platform for visual tracking of user fitness
US11957478B2 (en) * 2022-04-06 2024-04-16 University Of Iowa Research Foundation Methods and apparatus for machine learning to analyze musculo-skeletal rehabilitation from images

Families Citing this family (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI396110B (en) * 2009-06-18 2013-05-11 Nat Univ Chin Yi Technology From the plane image of the establishment of three human space center of gravity and computer program products
US8139822B2 (en) * 2009-08-28 2012-03-20 Allen Joseph Selner Designation of a characteristic of a physical capability by motion analysis, systems and methods
KR101616926B1 (en) * 2009-09-22 2016-05-02 삼성전자주식회사 Image processing apparatus and method
US8633890B2 (en) * 2010-02-16 2014-01-21 Microsoft Corporation Gesture detection based on joint skipping
US8963927B2 (en) 2010-12-15 2015-02-24 Microsoft Technology Licensing, Llc Vertex-baked three-dimensional animation augmentation
US8840527B2 (en) * 2011-04-26 2014-09-23 Rehabtek Llc Apparatus and method of controlling lower-limb joint moments through real-time feedback training
US8696450B2 (en) 2011-07-27 2014-04-15 The Board Of Trustees Of The Leland Stanford Junior University Methods for analyzing and providing feedback for improved power generation in a golf swing
WO2013036517A1 (en) * 2011-09-06 2013-03-14 Fenil Shah System and method for providing real-time guidance to a user
ITFI20110232A1 (en) * 2011-10-21 2013-04-22 Zionamento Sant Anna METHOD FOR CALCULATING THE MASS CENTER FOR A UMANOID PLATFORM
US8363891B1 (en) 2012-03-26 2013-01-29 Southern Methodist University System and method for predicting a force applied to a surface by a body during a movement
US9588582B2 (en) 2013-09-17 2017-03-07 Medibotics Llc Motion recognition clothing (TM) with two different sets of tubes spanning a body joint
US9275488B2 (en) 2012-10-11 2016-03-01 Sony Corporation System and method for animating a body
US11083344B2 (en) 2012-10-11 2021-08-10 Roman Tsibulevskiy Partition technologies
ES2432228B1 (en) * 2013-02-15 2014-11-18 Asociación Instituto De Biomecánica De Valencia Procedure and installation to characterize the support pattern of a subject
WO2014210344A1 (en) 2013-06-26 2014-12-31 The Cleveland Clinic Foundation Systems and methods to assess balance
US10402517B2 (en) * 2013-06-26 2019-09-03 Dassault Systémes Simulia Corp. Musculo-skeletal modeling using finite element analysis, process integration, and design optimization
WO2015054426A1 (en) * 2013-10-08 2015-04-16 Ali Kord Single-camera motion capture system
EP3063496A4 (en) * 2013-10-24 2017-06-07 Ali Kord Motion capture system
CN108211308B (en) * 2017-05-25 2019-08-16 深圳市前海未来无限投资管理有限公司 A kind of movement effects methods of exhibiting and device
MX2020012885A (en) 2018-05-29 2021-09-02 Curiouser Products Inc A reflective video display apparatus for interactive training and demonstration and methods of using same.
KR102598426B1 (en) * 2018-12-13 2023-11-06 현대자동차주식회사 A joint force predicting method in use with convolutional neural networks
US20220262075A1 (en) * 2019-06-14 2022-08-18 The Regents Of The University Of California Deep Learning of Biomimetic Sensorimotor Control for Biomechanical Model Animation
EP3986266A4 (en) * 2019-06-21 2023-10-04 Rehabilitation Institute of Chicago D/b/a Shirley Ryan Abilitylab Wearable joint tracking device with muscle activity and methods thereof
WO2021222497A1 (en) 2020-04-30 2021-11-04 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11167172B1 (en) 2020-09-04 2021-11-09 Curiouser Products Inc. Video rebroadcasting with multiplexed communications and display via smart mirrors
CA3192004A1 (en) * 2020-09-11 2022-03-17 Stephen BAEK Methods and apapratus for machine learning to analyze musculo-skeletal rehabilitation from images
US11941824B2 (en) * 2021-04-12 2024-03-26 VelocityEHS Holdings, Inc. Video-based hand and ground reaction force determination

Family Cites Families (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5623428A (en) * 1990-12-25 1997-04-22 Shukyohoji, Kongo Zen Sohozan Shoriji Method for developing computer animation
JP2873338B2 (en) * 1991-09-17 1999-03-24 富士通株式会社 Moving object recognition device
WO1993006779A1 (en) * 1991-10-10 1993-04-15 Neurocom International, Inc. Apparatus and method for characterizing gait
US5826578A (en) * 1994-05-26 1998-10-27 Curchod; Donald B. Motion measurement apparatus
US5930741A (en) * 1995-02-28 1999-07-27 Virtual Technologies, Inc. Accurate, rapid, reliable position sensing using multiple sensing technologies
US5937081A (en) * 1996-04-10 1999-08-10 O'brill; Michael R. Image composition system and method of using same
JP3170638B2 (en) * 1996-08-08 2001-05-28 谷 白糸 Virtual reality experience device
US5904484A (en) * 1996-12-23 1999-05-18 Burns; Dave Interactive motion training device and method
US6119516A (en) * 1997-05-23 2000-09-19 Advantedge Systems, Inc. Biofeedback system for monitoring the motion of body joint
JP3355113B2 (en) * 1997-09-02 2002-12-09 株式会社モノリス Method for simulating human body movement and method for generating animation using the method
AU6249899A (en) * 1998-09-22 2000-04-10 Motek Motion Technology, Inc. System for dynamic registration, evaluation, and correction of functional human behavior
US7774177B2 (en) * 2001-06-29 2010-08-10 Honda Motor Co., Ltd. Exoskeleton controller for a human-exoskeleton system
JP3860076B2 (en) * 2002-06-06 2006-12-20 独立行政法人科学技術振興機構 BODY MODEL GENERATION METHOD, BODY MODEL GENERATION PROGRAM, RECORDING MEDIUM RECORDING THE SAME, AND RECORDING MEDIUM RECORDING BODY MODEL DATA
JP3960536B2 (en) * 2002-08-12 2007-08-15 株式会社国際電気通信基礎技術研究所 Computer-implemented method and computer-executable program for automatically adapting a parametric dynamic model to human actor size for motion capture
JP4358653B2 (en) * 2003-02-26 2009-11-04 富士フイルム株式会社 3D image forming method and apparatus
JP4590640B2 (en) * 2004-06-16 2010-12-01 国立大学法人 東京大学 Method and apparatus for acquiring muscle strength based on musculoskeletal model
CA2578653A1 (en) * 2004-07-29 2006-02-09 Kevin Ferguson A human movement measurement system
US7554549B2 (en) * 2004-10-01 2009-06-30 Sony Corporation System and method for tracking facial muscle and eye motion for computer graphics animation
EP1848380B1 (en) * 2004-12-22 2015-04-15 Össur hf Systems and methods for processing limb motion
CN101155557B (en) * 2005-02-02 2012-11-28 奥瑟Hf公司 Sensing systems and methods for monitoring gait dynamics
WO2006094817A2 (en) * 2005-03-11 2006-09-14 Rsscan International Method and apparatus for displaying 3D images of a part of the skeleton
US7573477B2 (en) * 2005-06-17 2009-08-11 Honda Motor Co., Ltd. System and method for activation-driven muscle deformations for existing character motion
JP2007004732A (en) * 2005-06-27 2007-01-11 Matsushita Electric Ind Co Ltd Image generation device and method
JP4826459B2 (en) * 2006-01-12 2011-11-30 株式会社豊田中央研究所 Musculoskeletal model creation method, human stress / strain estimation method, program, and recording medium
US20080009771A1 (en) * 2006-03-29 2008-01-10 Joel Perry Exoskeleton
US8139067B2 (en) * 2006-07-25 2012-03-20 The Board Of Trustees Of The Leland Stanford Junior University Shape completion, animation and marker-less motion capture of people, animals or characters
KR100901274B1 (en) * 2007-11-22 2009-06-09 한국전자통신연구원 A character animation system and its method

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5625577A (en) * 1990-12-25 1997-04-29 Shukyohojin, Kongo Zen Sohonzan Shorinji Computer-implemented motion analysis method using dynamics
US6774885B1 (en) * 1999-01-20 2004-08-10 Motek B.V. System for dynamic registration, evaluation, and correction of functional human behavior
US6738065B1 (en) * 1999-08-10 2004-05-18 Oshri Even-Zohar Customizable animation system
US20020045517A1 (en) * 2000-08-30 2002-04-18 Oglesby Gary E. Treadmill control system
US7136722B2 (en) * 2002-02-12 2006-11-14 The University Of Tokyo Method for generating a motion of a human type link system
US20040256754A1 (en) * 2003-02-26 2004-12-23 Fuji Photo Film Co., Ltd. Three-dimensional image forming method and apparatus

Cited By (188)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8892495B2 (en) 1991-12-23 2014-11-18 Blanding Hovenweep, Llc Adaptive pattern recognition based controller apparatus and method and human-interface therefore
US9535563B2 (en) 1999-02-01 2017-01-03 Blanding Hovenweep, Llc Internet appliance system and method
US20110013004A1 (en) * 2007-06-08 2011-01-20 Nokia Corporation Measuring human movements - method and apparatus
US8269826B2 (en) * 2007-06-08 2012-09-18 Nokia Corporation Measuring human movements—method and apparatus
US20090124938A1 (en) * 2007-11-14 2009-05-14 Wolfgang Brunner Gait Analysis System
US8790279B2 (en) * 2007-11-14 2014-07-29 Zebris Medical Gmbh Gait analysis system
US20120123805A1 (en) * 2007-12-04 2012-05-17 Terence Vardy Collection of medical data
US9307941B2 (en) 2008-05-29 2016-04-12 Bläckbild Patient management device, system and method
US20110137138A1 (en) * 2008-05-29 2011-06-09 Per Johansson Patient Management Device, System And Method
US8821416B2 (en) * 2008-05-29 2014-09-02 Cunctus Ab Patient management device, system and method
US8620635B2 (en) 2008-06-27 2013-12-31 Microsoft Corporation Composition of analytics models
US8411085B2 (en) 2008-06-27 2013-04-02 Microsoft Corporation Constructing view compositions for domain-specific environments
US7980997B2 (en) * 2008-10-23 2011-07-19 University Of Southern California System for encouraging a user to perform substantial physical activity
US8317657B2 (en) 2008-10-23 2012-11-27 University Of Southern California System for encouraging a user to perform substantial physical activity
US20100105525A1 (en) * 2008-10-23 2010-04-29 University Of Southern California System for encouraging a user to perform substantial physical activity
US8314793B2 (en) 2008-12-24 2012-11-20 Microsoft Corporation Implied analytical reasoning and computation
US20110285626A1 (en) * 2009-01-30 2011-11-24 Microsoft Corporation Gesture recognizer system architecture
US8782567B2 (en) 2009-01-30 2014-07-15 Microsoft Corporation Gesture recognizer system architecture
US8869072B2 (en) 2009-01-30 2014-10-21 Microsoft Corporation Gesture recognizer system architecture
US8578302B2 (en) 2009-01-30 2013-11-05 Microsoft Corporation Predictive determination
US9280203B2 (en) * 2009-01-30 2016-03-08 Microsoft Technology Licensing, Llc Gesture recognizer system architecture
US20170296115A1 (en) * 2009-02-02 2017-10-19 Joint Vue, LLC Motion Tracking System with Inertial-Based Sensing Units
US11004561B2 (en) * 2009-02-02 2021-05-11 Jointvue Llc Motion tracking system with inertial-based sensing units
WO2010099361A1 (en) * 2009-02-25 2010-09-02 Sherlock Nmd, Llc Devices, systems and methods for capturing biomechanical motion
US20100222711A1 (en) * 2009-02-25 2010-09-02 Sherlock NMD, LLC, a Nevada Corporation Devices, systems and methods for capturing biomechanical motion
US8493406B2 (en) 2009-06-19 2013-07-23 Microsoft Corporation Creating new charts and data visualizations
US8866818B2 (en) 2009-06-19 2014-10-21 Microsoft Corporation Composing shapes and data series in geometries
US9330503B2 (en) 2009-06-19 2016-05-03 Microsoft Technology Licensing, Llc Presaging and surfacing interactivity within data visualizations
US9342904B2 (en) 2009-06-19 2016-05-17 Microsoft Technology Licensing, Llc Composing shapes and data series in geometries
US8788574B2 (en) 2009-06-19 2014-07-22 Microsoft Corporation Data-driven visualization of pseudo-infinite scenes
US8531451B2 (en) 2009-06-19 2013-09-10 Microsoft Corporation Data-driven visualization transformation
US8692826B2 (en) 2009-06-19 2014-04-08 Brian C. Beckman Solver-based visualization framework
US20110045952A1 (en) * 2009-08-21 2011-02-24 Lifemodeler, Inc. Systems and Methods For Determining Muscle Force Through Dynamic Gain Optimization of a Muscle PID Controller
US8412669B2 (en) * 2009-08-21 2013-04-02 Life Modeler, Inc. Systems and methods for determining muscle force through dynamic gain optimization of a muscle PID controller
US20110060537A1 (en) * 2009-09-08 2011-03-10 Patrick Moodie Apparatus and method for physical evaluation
US8527217B2 (en) * 2009-09-08 2013-09-03 Dynamic Athletic Research Institute, Llc Apparatus and method for physical evaluation
US8352397B2 (en) 2009-09-10 2013-01-08 Microsoft Corporation Dependency graph in data-driven model
WO2011034963A2 (en) * 2009-09-15 2011-03-24 Sony Corporation Combining multi-sensory inputs for digital animation
WO2011034963A3 (en) * 2009-09-15 2011-07-28 Sony Corporation Combining multi-sensory inputs for digital animation
WO2011032541A1 (en) * 2009-09-16 2011-03-24 Otto-Von-Guericke-Universität Magdeburg Device for determining vertebral distances of the spinal column of a patient
US10628504B2 (en) 2010-07-30 2020-04-21 Microsoft Technology Licensing, Llc System of providing suggestions based on accessible and contextual information
US20120108909A1 (en) * 2010-11-03 2012-05-03 HeadRehab, LLC Assessment and Rehabilitation of Cognitive and Motor Functions Using Virtual Reality
US11094410B2 (en) 2010-11-05 2021-08-17 Nike, Inc. Method and system for automated personal training
US9358426B2 (en) 2010-11-05 2016-06-07 Nike, Inc. Method and system for automated personal training
US9283429B2 (en) * 2010-11-05 2016-03-15 Nike, Inc. Method and system for automated personal training
US10583328B2 (en) 2010-11-05 2020-03-10 Nike, Inc. Method and system for automated personal training
US11710549B2 (en) 2010-11-05 2023-07-25 Nike, Inc. User interface for remote joint workout session
US20120183939A1 (en) * 2010-11-05 2012-07-19 Nike, Inc. Method and system for automated personal training
US9919186B2 (en) 2010-11-05 2018-03-20 Nike, Inc. Method and system for automated personal training
US9457256B2 (en) 2010-11-05 2016-10-04 Nike, Inc. Method and system for automated personal training that includes training programs
US11915814B2 (en) 2010-11-05 2024-02-27 Nike, Inc. Method and system for automated personal training
US10420982B2 (en) 2010-12-13 2019-09-24 Nike, Inc. Fitness training system with energy expenditure calculation that uses a form factor
US20120165703A1 (en) * 2010-12-22 2012-06-28 Paul William Bottum Preempt Muscle Map Screen
US8315822B2 (en) 2011-04-20 2012-11-20 Bertec Corporation Force measurement system having inertial compensation
US8315823B2 (en) 2011-04-20 2012-11-20 Bertec Corporation Force and/or motion measurement system having inertial compensation and method thereof
US20120277635A1 (en) * 2011-04-29 2012-11-01 Tsai Ming-June Body motion staff, producing module, image processing module and motion replication module
US9504909B2 (en) 2011-05-05 2016-11-29 Qualcomm Incorporated Method and apparatus of proximity and stunt recording for outdoor gaming
US9370319B2 (en) * 2011-08-19 2016-06-21 Accenture Global Services Limited Interactive virtual care
US8771206B2 (en) * 2011-08-19 2014-07-08 Accenture Global Services Limited Interactive virtual care
US9861300B2 (en) 2011-08-19 2018-01-09 Accenture Global Services Limited Interactive virtual care
US20140276106A1 (en) * 2011-08-19 2014-09-18 Accenture Global Services Limited Interactive virtual care
US9149209B2 (en) * 2011-08-19 2015-10-06 Accenture Global Services Limited Interactive virtual care
CN102426709A (en) * 2011-08-19 2012-04-25 北京航空航天大学 Real-time motion synthesis method based on fast inverse kinematics
US9629573B2 (en) * 2011-08-19 2017-04-25 Accenture Global Services Limited Interactive virtual care
US20150045646A1 (en) * 2011-08-19 2015-02-12 Accenture Global Services Limited Interactive virtual care
US20130046149A1 (en) * 2011-08-19 2013-02-21 Accenture Global Services Limited Interactive virtual care
US8888721B2 (en) * 2011-08-19 2014-11-18 Accenture Global Services Limited Interactive virtual care
US9811639B2 (en) 2011-11-07 2017-11-07 Nike, Inc. User interface and fitness meters for remote joint workout session
US10825561B2 (en) 2011-11-07 2020-11-03 Nike, Inc. User interface for remote joint workout session
US9526451B1 (en) 2012-01-11 2016-12-27 Bertec Corporation Force measurement system
US9168420B1 (en) 2012-01-11 2015-10-27 Bertec Corporation Force measurement system
US10188930B2 (en) 2012-06-04 2019-01-29 Nike, Inc. Combinatory score having a fitness sub-score and an athleticism sub-score
US9289674B2 (en) 2012-06-04 2016-03-22 Nike, Inc. Combinatory score having a fitness sub-score and an athleticism sub-score
US9235241B2 (en) * 2012-07-29 2016-01-12 Qualcomm Incorporated Anatomical gestures detection system using radio signals
US20140028539A1 (en) * 2012-07-29 2014-01-30 Adam E. Newham Anatomical gestures detection system using radio signals
US9478058B2 (en) * 2012-08-06 2016-10-25 CELSYS, Inc. Object correcting apparatus and method and computer-readable recording medium
US20140039861A1 (en) * 2012-08-06 2014-02-06 CELSYS, Inc. Object correcting apparatus and method and computer-readable recording medium
US8764532B1 (en) 2012-11-07 2014-07-01 Bertec Corporation System and method for fall and/or concussion prediction
US10646153B1 (en) 2013-01-19 2020-05-12 Bertec Corporation Force measurement system
US10010286B1 (en) 2013-01-19 2018-07-03 Bertec Corporation Force measurement system
US11540744B1 (en) 2013-01-19 2023-01-03 Bertec Corporation Force measurement system
US9770203B1 (en) 2013-01-19 2017-09-26 Bertec Corporation Force measurement system and a method of testing a subject
US8847989B1 (en) 2013-01-19 2014-09-30 Bertec Corporation Force and/or motion measurement system and a method for training a subject using the same
US11052288B1 (en) 2013-01-19 2021-07-06 Bertec Corporation Force measurement system
US8704855B1 (en) 2013-01-19 2014-04-22 Bertec Corporation Force measurement system having a displaceable force measurement assembly
US9081436B1 (en) 2013-01-19 2015-07-14 Bertec Corporation Force and/or motion measurement system and a method of testing a subject using the same
US10231662B1 (en) * 2013-01-19 2019-03-19 Bertec Corporation Force measurement system
US10413230B1 (en) * 2013-01-19 2019-09-17 Bertec Corporation Force measurement system
US10856796B1 (en) 2013-01-19 2020-12-08 Bertec Corporation Force measurement system
US9526443B1 (en) 2013-01-19 2016-12-27 Bertec Corporation Force and/or motion measurement system and a method of testing a subject
US11311209B1 (en) * 2013-01-19 2022-04-26 Bertec Corporation Force measurement system and a motion base used therein
US11857331B1 (en) 2013-01-19 2024-01-02 Bertec Corporation Force measurement system
US9504336B2 (en) * 2013-02-13 2016-11-29 Jon Dodd Configurable bed
US20140289964A1 (en) * 2013-02-13 2014-10-02 Jon Dodd Configurable bed
US20140267611A1 (en) * 2013-03-14 2014-09-18 Microsoft Corporation Runtime engine for analyzing user motion in 3d images
US11921471B2 (en) 2013-08-16 2024-03-05 Meta Platforms Technologies, Llc Systems, articles, and methods for wearable devices having secondary power sources in links of a band for providing secondary power in addition to a primary power source
US11644799B2 (en) 2013-10-04 2023-05-09 Meta Platforms Technologies, Llc Systems, articles and methods for wearable electronic devices employing contact sensors
EP3068301A4 (en) * 2013-11-12 2017-07-12 Highland Instruments, Inc. Analysis suite
WO2015073368A1 (en) 2013-11-12 2015-05-21 Highland Instruments, Inc. Analysis suite
US11079846B2 (en) 2013-11-12 2021-08-03 Facebook Technologies, Llc Systems, articles, and methods for capacitive electromyography sensors
US11666264B1 (en) 2013-11-27 2023-06-06 Meta Platforms Technologies, Llc Systems, articles, and methods for electromyography sensors
US20170020437A1 (en) * 2014-03-12 2017-01-26 Movotec A/S System, apparatus and method for measurement of muscle stiffness
US10575772B2 (en) * 2014-03-12 2020-03-03 Movotec A/S System, apparatus and method for measurement of muscle stiffness
US11482337B2 (en) * 2014-04-17 2022-10-25 The Boeing Company Method and system for tuning a musculoskeletal model
US10684692B2 (en) 2014-06-19 2020-06-16 Facebook Technologies, Llc Systems, devices, and methods for gesture identification
US10140745B2 (en) * 2015-01-09 2018-11-27 Vital Mechanics Research Inc. Methods and systems for computer-based animation of musculoskeletal systems
US20160203630A1 (en) * 2015-01-09 2016-07-14 Vital Mechanics Research Inc. Methods and systems for computer-based animation of musculoskeletal systems
US9836118B2 (en) * 2015-06-16 2017-12-05 Wilson Steele Method and system for analyzing a movement of a person
US10555688B1 (en) 2015-08-22 2020-02-11 Bertec Corporation Measurement system that includes at least one measurement assembly, a head-mounted visual display device, and a data processing device
US10216262B1 (en) * 2015-08-22 2019-02-26 Bertec Corporation Force management system that includes a force measurement assembly, a visual display device, and one or more data processing devices
US9916011B1 (en) * 2015-08-22 2018-03-13 Bertec Corporation Force measurement system that includes a force measurement assembly, a visual display device, and one or more data processing devices
US11301045B1 (en) * 2015-08-22 2022-04-12 Bertec Corporation Measurement system that includes at least one measurement assembly, a visual display device, and at least one data processing device
US10860843B1 (en) * 2015-08-22 2020-12-08 Bertec Corporation Measurement system that includes at least one measurement assembly, a head-mounted visual display device, and a data processing device
US10390736B1 (en) 2015-08-22 2019-08-27 Bertec Corporation Force measurement system that includes a force measurement assembly, at least one visual display device, and one or more data processing devices
US10535015B2 (en) 2015-11-05 2020-01-14 Samsung Electronics Co., Ltd. Walking assistance apparatus and method of controlling same
US20170177833A1 (en) * 2015-12-22 2017-06-22 Intel Corporation Smart placement of devices for implicit triggering of feedbacks relating to users' physical activities
US11511156B2 (en) 2016-03-12 2022-11-29 Arie Shavit Training system and methods for designing, monitoring and providing feedback of training
US10625137B2 (en) * 2016-03-18 2020-04-21 Icon Health & Fitness, Inc. Coordinated displays in an exercise device
US20170266533A1 (en) * 2016-03-18 2017-09-21 Icon Health & Fitness, Inc. Coordinated Displays in an Exercise Device
US11219575B2 (en) * 2016-03-23 2022-01-11 Zoll Medical Corporation Real-time kinematic analysis during cardio-pulmonary resuscitation
US20170273864A1 (en) * 2016-03-23 2017-09-28 Zoll Medical Corporation Real-Time Kinematic Analysis During Cardio-Pulmonary Resuscitation
US11337652B2 (en) 2016-07-25 2022-05-24 Facebook Technologies, Llc System and method for measuring the movements of articulated rigid bodies
US10409371B2 (en) 2016-07-25 2019-09-10 Ctrl-Labs Corporation Methods and apparatus for inferring user intent based on neuromuscular signals
US10656711B2 (en) 2016-07-25 2020-05-19 Facebook Technologies, Llc Methods and apparatus for inferring user intent based on neuromuscular signals
WO2018022602A1 (en) * 2016-07-25 2018-02-01 Ctrl-Labs Corporation Methods and apparatus for predicting musculo-skeletal position information using wearable autonomous sensors
US11000211B2 (en) 2016-07-25 2021-05-11 Facebook Technologies, Llc Adaptive system for deriving control signals from measurements of neuromuscular activity
US10990174B2 (en) 2016-07-25 2021-04-27 Facebook Technologies, Llc Methods and apparatus for predicting musculo-skeletal position information using wearable autonomous sensors
WO2018146630A3 (en) * 2017-02-10 2018-10-04 Apex Occupational Health Solutions Inc. Method and system for ergonomic augmentation of workspaces
US11253173B1 (en) * 2017-05-30 2022-02-22 Verily Life Sciences Llc Digital characterization of movement to detect and monitor disorders
US20190133693A1 (en) * 2017-06-19 2019-05-09 Techmah Medical Llc Surgical navigation of the hip using fluoroscopy and tracking sensors
US11826111B2 (en) 2017-06-19 2023-11-28 Techmah Medical Llc Surgical navigation of the hip using fluoroscopy and tracking sensors
US11331151B2 (en) * 2017-06-19 2022-05-17 Techmah Medical Llc Surgical navigation of the hip using fluoroscopy and tracking sensors
US11635736B2 (en) 2017-10-19 2023-04-25 Meta Platforms Technologies, Llc Systems and methods for identifying biological structures associated with neuromuscular source signals
CN111492367A (en) * 2017-12-13 2020-08-04 谷歌有限责任公司 Gesture learning, lifting and noise cancellation from 2D images
US11587242B1 (en) 2018-01-25 2023-02-21 Meta Platforms Technologies, Llc Real-time processing of handstate representation model estimates
US11069148B2 (en) 2018-01-25 2021-07-20 Facebook Technologies, Llc Visualization of reconstructed handstate information
US10950047B2 (en) 2018-01-25 2021-03-16 Facebook Technologies, Llc Techniques for anonymizing neuromuscular signal data
US11361522B2 (en) * 2018-01-25 2022-06-14 Facebook Technologies, Llc User-controlled tuning of handstate representation model parameters
US10496168B2 (en) 2018-01-25 2019-12-03 Ctrl-Labs Corporation Calibration techniques for handstate representation modeling using neuromuscular signals
US10817795B2 (en) 2018-01-25 2020-10-27 Facebook Technologies, Llc Handstate reconstruction based on multiple inputs
US11127143B2 (en) 2018-01-25 2021-09-21 Facebook Technologies, Llc Real-time processing of handstate representation model estimates
US10489986B2 (en) * 2018-01-25 2019-11-26 Ctrl-Labs Corporation User-controlled tuning of handstate representation model parameters
US10460455B2 (en) 2018-01-25 2019-10-29 Ctrl-Labs Corporation Real-time processing of handstate representation model estimates
US11163361B2 (en) 2018-01-25 2021-11-02 Facebook Technologies, Llc Calibration techniques for handstate representation modeling using neuromuscular signals
US11331045B1 (en) 2018-01-25 2022-05-17 Facebook Technologies, Llc Systems and methods for mitigating neuromuscular signal artifacts
US10504286B2 (en) 2018-01-25 2019-12-10 Ctrl-Labs Corporation Techniques for anonymizing neuromuscular signal data
US10937414B2 (en) 2018-05-08 2021-03-02 Facebook Technologies, Llc Systems and methods for text input using neuromuscular information
US10592001B2 (en) 2018-05-08 2020-03-17 Facebook Technologies, Llc Systems and methods for improved speech recognition using neuromuscular information
US11036302B1 (en) 2018-05-08 2021-06-15 Facebook Technologies, Llc Wearable devices and methods for improved speech recognition
US11216069B2 (en) 2018-05-08 2022-01-04 Facebook Technologies, Llc Systems and methods for improved speech recognition using neuromuscular information
US10772519B2 (en) 2018-05-25 2020-09-15 Facebook Technologies, Llc Methods and apparatus for providing sub-muscular control
US10687759B2 (en) 2018-05-29 2020-06-23 Facebook Technologies, Llc Shielding techniques for noise reduction in surface electromyography signal measurement and related systems and methods
US11129569B1 (en) 2018-05-29 2021-09-28 Facebook Technologies, Llc Shielding techniques for noise reduction in surface electromyography signal measurement and related systems and methods
CN110109532A (en) * 2018-06-11 2019-08-09 成都思悟革科技有限公司 Human motion comparison system based on body posture acquisition
US10970374B2 (en) 2018-06-14 2021-04-06 Facebook Technologies, Llc User identification and authentication with neuromuscular signatures
US11045137B2 (en) 2018-07-19 2021-06-29 Facebook Technologies, Llc Methods and apparatus for improved signal robustness for a wearable neuromuscular recording device
US11179066B2 (en) 2018-08-13 2021-11-23 Facebook Technologies, Llc Real-time spike detection and identification
US10842407B2 (en) 2018-08-31 2020-11-24 Facebook Technologies, Llc Camera-guided interpretation of neuromuscular signals
US10905350B2 (en) 2018-08-31 2021-02-02 Facebook Technologies, Llc Camera-guided interpretation of neuromuscular signals
US11567573B2 (en) 2018-09-20 2023-01-31 Meta Platforms Technologies, Llc Neuromuscular text entry, writing and drawing in augmented reality systems
US10921764B2 (en) 2018-09-26 2021-02-16 Facebook Technologies, Llc Neuromuscular control of physical objects in an environment
US10970936B2 (en) 2018-10-05 2021-04-06 Facebook Technologies, Llc Use of neuromuscular signals to provide enhanced interactions with physical objects in an augmented reality environment
US11797087B2 (en) 2018-11-27 2023-10-24 Meta Platforms Technologies, Llc Methods and apparatus for autocalibration of a wearable electrode sensor system
US11941176B1 (en) 2018-11-27 2024-03-26 Meta Platforms Technologies, Llc Methods and apparatus for autocalibration of a wearable electrode sensor system
US10905383B2 (en) 2019-02-28 2021-02-02 Facebook Technologies, Llc Methods and apparatus for unsupervised one-shot machine learning for classification of human gestures and estimation of applied forces
US11771327B2 (en) 2019-03-05 2023-10-03 Physmodo, Inc. System and method for human motion detection and tracking
US11547324B2 (en) 2019-03-05 2023-01-10 Physmodo, Inc. System and method for human motion detection and tracking
US11103748B1 (en) 2019-03-05 2021-08-31 Physmodo, Inc. System and method for human motion detection and tracking
US11497961B2 (en) 2019-03-05 2022-11-15 Physmodo, Inc. System and method for human motion detection and tracking
US11497962B2 (en) 2019-03-05 2022-11-15 Physmodo, Inc. System and method for human motion detection and tracking
US11331006B2 (en) 2019-03-05 2022-05-17 Physmodo, Inc. System and method for human motion detection and tracking
US11826140B2 (en) 2019-03-05 2023-11-28 Physmodo, Inc. System and method for human motion detection and tracking
US11481030B2 (en) 2019-03-29 2022-10-25 Meta Platforms Technologies, Llc Methods and apparatus for gesture detection and classification
US11481031B1 (en) 2019-04-30 2022-10-25 Meta Platforms Technologies, Llc Devices, systems, and methods for controlling computing devices via neuromuscular signals of users
US11686743B2 (en) * 2019-06-11 2023-06-27 Honda Motor Co., Ltd. Information processing device, information processing method, and storage medium
US11493993B2 (en) 2019-09-04 2022-11-08 Meta Platforms Technologies, Llc Systems, methods, and interfaces for performing inputs based on neuromuscular control
US11458362B1 (en) * 2019-10-11 2022-10-04 Bertec Corporation Swing analysis system
US11790536B1 (en) 2019-10-11 2023-10-17 Bertec Corporation Swing analysis system
US11097154B1 (en) * 2019-10-11 2021-08-24 Bertec Corporation Swing analysis system
US11907423B2 (en) 2019-11-25 2024-02-20 Meta Platforms Technologies, Llc Systems and methods for contextualized interactions with an environment
US20240005600A1 (en) * 2019-12-27 2024-01-04 Sony Group Corporation Information processing apparatus, information processing method, and information processing program
US11961494B1 (en) 2020-03-27 2024-04-16 Meta Platforms Technologies, Llc Electromagnetic interference reduction in extended reality environments
CN112733301A (en) * 2021-01-21 2021-04-30 佛山科学技术学院 Six-dimensional torque sensor gravity compensation method and system based on neural network
US11868531B1 (en) 2021-04-08 2024-01-09 Meta Platforms Technologies, Llc Wearable device providing for thumb-to-finger-based input gestures detected based on neuromuscular signals, and systems and methods of use thereof
CN113456060A (en) * 2021-05-27 2021-10-01 中国科学院软件研究所 Method and device for extracting characteristic parameters of motion function
US20220386942A1 (en) * 2021-06-04 2022-12-08 University Of Iowa Research Foundation Methods And Apparatus For Machine Learning To Analyze Musculo-Skeletal Rehabilitation From Images
US11957478B2 (en) * 2022-04-06 2024-04-16 University Of Iowa Research Foundation Methods and apparatus for machine learning to analyze musculo-skeletal rehabilitation from images
US11951357B1 (en) * 2022-11-30 2024-04-09 Roku, Inc. Platform for visual tracking of user fitness

Also Published As

Publication number Publication date
WO2008109248A2 (en) 2008-09-12
ES2679125T3 (en) 2018-08-22
WO2008109248A3 (en) 2008-11-20
EP2120710A2 (en) 2009-11-25
US20090082701A1 (en) 2009-03-26
EP2120710B1 (en) 2018-04-18
WO2008109248A4 (en) 2008-12-31
CA2680462C (en) 2016-08-16
CA2680462A1 (en) 2008-09-12
JP2010520561A (en) 2010-06-10
JP5016687B2 (en) 2012-09-05
US7931604B2 (en) 2011-04-26
EP2120710A4 (en) 2013-04-10

Similar Documents

Publication Publication Date Title
CA2680462C (en) Method for real time interactive visualization of muscle forces and joint torques in the human body
US11052288B1 (en) Force measurement system
US10646153B1 (en) Force measurement system
US11301045B1 (en) Measurement system that includes at least one measurement assembly, a visual display device, and at least one data processing device
US10413230B1 (en) Force measurement system
US11311209B1 (en) Force measurement system and a motion base used therein
US10231662B1 (en) Force measurement system
US9770203B1 (en) Force measurement system and a method of testing a subject
US10010286B1 (en) Force measurement system
US9763604B1 (en) Gait perturbation system and a method for testing and/or training a subject using the same
US6774885B1 (en) System for dynamic registration, evaluation, and correction of functional human behavior
Zhao et al. A Kinect-based rehabilitation exercise monitoring and guidance system
US9526443B1 (en) Force and/or motion measurement system and a method of testing a subject
Caserman et al. A survey of full-body motion reconstruction in immersive virtual reality applications
US9081436B1 (en) Force and/or motion measurement system and a method of testing a subject using the same
US11540744B1 (en) Force measurement system
EP1131734B1 (en) System for dynamic registration, evaluation, and correction of functional human behavior
Melero et al. Upbeat: augmented reality-guided dancing for prosthetic rehabilitation of upper limb amputees
US20020009222A1 (en) Method and system for viewing kinematic and kinetic information
Nunes et al. Human motion analysis and simulation tools: a survey
Wirth et al. The impact of avatar appearance, perspective and context on gait variability and user experience in virtual reality
Yang et al. Physical education motion correction system based on virtual reality technology
Ye et al. Sensation transfer for immersive exoskeleton motor training: Implications of haptics and viewpoints
Bauer et al. Interactive visualization of muscle activity during limb movements: Towards enhanced anatomy learning
White et al. A virtual reality application for stroke patient rehabilitation

Legal Events

Date Code Title Description
AS Assignment

Owner name: MOTEK BV, NETHERLANDS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ZOHAR, OSHRI EVEN;VAN DEN BOGERT, ANTONIE J.;REEL/FRAME:019645/0478;SIGNING DATES FROM 20070728 TO 20070730

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION