WO2009138929A1 - Marker tracking system and method - Google Patents

Marker tracking system and method Download PDF

Info

Publication number
WO2009138929A1
WO2009138929A1 PCT/IB2009/051900
Authority
WO
WIPO (PCT)
Prior art keywords
target
markers
series
locations
movement
Prior art date
Application number
PCT/IB2009/051900
Other languages
French (fr)
Inventor
Rong Song
Declan Patrick Kelly
Hui Liu
Jin Wang
Yunqiang Liu
Juan Du
Zhongtao Mei
Ron Kroon
Original Assignee
Koninklijke Philips Electronics N.V.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips Electronics N.V.
Publication of WO2009138929A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014 - Hand-worn input/output arrangements, e.g. data gloves
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 - Input arrangements for video game devices
    • A63F13/21 - Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/213 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1012 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals involving biosensors worn by the player, e.g. for measuring heart beat, limb activity
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1087 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/302 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device specially adapted for receiving control signals not targeted to a display device or game input means, e.g. vibrating driver's seat, scent dispenser

Abstract

The present invention introduces a target tracking system and method comprising at least two markers attached to the target, any one of said at least two markers having at least one shape feature that is different from the shape features of other markers; at least one camera for generating a series of images of the markers; and a first module for differentiating said at least two markers in the generated series of images on the basis of said at least one shape feature. The method and system of the present invention provide an effective tracking of the target whereby the movement of the target can be identified.

Description

MARKER TRACKING SYSTEM AND METHOD
TECHNICAL FIELD
The present invention relates to a target tracking technique, in particular to a technique of acquiring a target location.
BACKGROUND ART
Target movement analysis has been widely used in such fields as movies, video games and sports, etc. These techniques of analysis usually use different sensors to record the movement of an object. The most commonly used recording technique is optical recording. In optical recording, a marker is attached to an object, and a camera is used to record the locations of the marker. An infrared camera and a reflecting marker are usually used in order to reduce interference from the background light and to increase the contrast of the marker. The infrared camera is a camera in which an infrared light filter for removing the visible light and an infrared light source are mounted in front of the lens of the camera.
Some game machines, such as the Xavix system, use only one reflecting marker to control the TV game. The system can only identify some very simple movements in a two- dimensional space. One marker is not enough, however, for the movements of complicated objects in a three-dimensional space, such as limb movements. Therefore, a plurality of cameras and a plurality of markers are used to track certain complicated objects.
Some existing systems, such as the Vicon motion system, can detect precise movements in a three-dimensional space, but such systems require multiple (6 or 8) cameras. Generally speaking, the precision of tracking of the location of an object is associated with the number of cameras used, and more cameras result in higher precision.
However, there are some deficiencies in such systems. For example, when any two of the markers overlap in the images captured by the cameras and then separate again, one of them is apt to be mistaken for the other.
Furthermore, the cost of using multiple cameras is relatively high, so it remains a problem how to identify relatively complicated three-dimensional movements with a single camera.
SUMMARY OF THE INVENTION
The present invention provides a target tracking system which comprises: at least two markers attached to the target, any one of said at least two markers having at least one shape feature that is different from the shape features of other markers; at least one camera for generating a series of images of the markers; a first module for differentiating said at least two markers in the generated series of images on the basis of said at least one shape feature.
The shape feature may be the regularity of a shape, the surface area of the shape, or the pattern of a surface, for example the presence of at least one hole in the surface of a marker. The tracking system of the present invention further comprises a second module for obtaining a series of locations of the target from the relative movements of said markers in said series of images.
In the tracking system of the present invention, the target consists of two parts connected to each other, three markers are attached to the ends of the target and the connection point of the two parts respectively, and the second module obtains the location of the target by calibrating the distance between every two markers at an initial location, then measuring the distance between every two markers in the series of images, and finally calculating the spatial angular coordinates of the two parts of the target by means of corresponding mathematical formulae. The second module is further adapted to identify the movement of the target according to the series of locations, said identification comprising the step of filtering repeated locations, shaking locations and incorrect locations out from said series of locations.
In the tracking system of the present invention, the target is a limb of a person or animal, and the connection point is an arthrosis of the person or animal. The tracking system of the present invention further comprises a third module for comparing the target movement identified by the second module with a movement template, a fourth module for displaying the comparison result obtained by the third module, and a fifth module for displaying predefined information in accordance with the comparison result of the third module. The present invention also provides a target tracking method which comprises the steps of: using at least two markers attached to the target, any one of said at least two markers having at least one shape feature that is different from the shape features of other markers; using at least one camera for generating a series of images of the markers; differentiating said at least two markers in the generated series of images on the basis of said at least one shape feature. The tracking method further comprises a step of obtaining a series of locations of the target from the relative movements of said markers in said series of images.
In the tracking method, the target consists of two parts connected to each other, three markers are attached to the ends of the target and the connection point of its two parts respectively, and the step of obtaining a series of locations of the target includes: -calibrating the distance between every two markers at an initial location;
-measuring the distance between every two markers in the series of images;
-calculating the spatial angular coordinates of the two parts of the target from corresponding mathematical formulae so as to obtain the location of the target.
The tracking method of the present invention further comprises a step of identifying the movement of the target from the series of locations, said identification including a step of filtering repeated locations, shaking locations and incorrect locations out from said series of locations.
The tracking method of the present invention further comprises a step of comparing the identified target movement with a movement template. The tracking method of the present invention further comprises a step of displaying the comparison result and a step of displaying predefined information in accordance with the comparison result.
The accuracy of the tracking of the target location is improved by the tracking system and method of the invention, and the identification of the target movement becomes easier. In addition, the tracking system of the present invention uses simple hardware and has a low cost, so it is suitable for use in various applications, such as games, athletics, and monitoring the movements of a dyskinesia patient during recovery.
DESCRIPTION OF THE DRAWINGS
Fig. 1 is a schematic diagram of an embodiment of the tracking system of the present invention;
Fig. 2 shows an example of the shape feature of the markers of the present invention;
Fig. 3 is a schematic diagram of how the mathematical coordinates of the spatial location of the target are obtained from the markers in the images according to the present invention;
Fig. 4 is a schematic diagram of the principle of identifying the target movement from the target locations according to the present invention;
Fig. 5 is a schematic diagram of another embodiment of the tracking system of the present invention.
The principle of the present invention will be described below with reference to the specific embodiments and the drawings.
PREFERRED EMBODIMENTS
Fig. 1 is a schematic diagram of the target tracking system, which includes at least two
(three in Fig. 1 by way of example) markers 121, 122, 123 attached to the target (the upper limbs of a person in this embodiment), any one of said at least two markers having at least one shape feature that is different from the shape features of other markers. The markers in this embodiment are reflecting markers having a retroreflective material on their surfaces.
The system further comprises at least one camera 11 (one camera in Fig. 1 by way of example) for generating a series of images of the markers. The camera is an infrared camera having an infrared light source.
The system further comprises a processor 14 for processing the generated series of images. The processor 14 comprises a first module 141 for differentiating said at least two markers in the generated series of images on the basis of said at least one shape feature.
The shape feature of the markers may be any feature capable of distinguishing said markers from other markers, such as the regularity of a shape, the area of the shape, or the pattern of a surface, e.g. if there is at least one hole at the surface of the marker.
For example, if one of the markers is a triangle or a polygon or is of a very irregular shape while another one is circular, then these two markers can be differentiated on the basis of the regularity of their shapes. As another example, different markers may be differentiated by their surface areas.
Taking the three markers in Fig. 1 as an example, the marker 123 is designed to be connected to the end of a handle, and a person can hold this handle in his/her hand. Such a design has the advantage of preventing the marker from being covered by the arm. Alternatively, the marker may be directly connected to the wrist, of course. The marker 123 has the largest area of the three markers. The central coordinates (Ic, Jc) of the three markers in the image(s) can be calculated.
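By way of a non-limiting illustration, the segmentation implied by this step may be sketched as follows in Python; NumPy/SciPy are merely one possible toolset, and the intensity threshold is an assumed value that is not specified by this description:

```python
import numpy as np
from scipy import ndimage

def find_markers(ir_frame, threshold=200):
    """Locate the bright retroreflective markers in one infrared frame and return,
    per marker, the central coordinates (Ic, Jc) and the surface area in pixels.
    The fixed intensity threshold is an assumption for illustration only."""
    binary = ir_frame > threshold                                        # reflective markers appear as bright blobs
    labels, n = ndimage.label(binary)                                    # one connected region per marker
    centers = ndimage.center_of_mass(binary, labels, range(1, n + 1))    # (Ic, Jc) = (row, column) centroid
    areas = ndimage.sum(binary, labels, range(1, n + 1))                 # surface area = number of pixels
    return list(zip(centers, areas))
```

The marker 123 on the handle can then be recognized as the region with the largest area, as described above.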
The average distance from the center to the edge of the marker can be calculated with the following formula:
Distance_mean = (1/N) · Σ_{i=1..N} √((X_i - I_c)² + (Y_i - J_c)²)    (1)
wherein X_i and Y_i correspond to the row and column of the i-th edge pixel, respectively, and N is the number of edge pixels.
The standard deviation of the distances from the center to the edge of the marker can be calculated with the following formula:
Distance_std = √((1/N) · Σ_{i=1..N} (√((X_i - I_c)² + (Y_i - J_c)²) - Distance_mean)²)    (2)
And the shape regularity factor can be obtained from the following formula:
λ = Distance_std / Distance_mean    (3)
The marker 121 may have a low shape regularity factor, whereas the marker 122 has a high shape regularity factor, so that the two markers can be distinguished from each other by means of a shape feature such as this shape regularity factor. For example, the shapes of marker 121 and marker 122 may be circular and triangular, respectively. Thus, the three markers 121, 122 and 123 can be easily identified by the two shape features of surface area and regularity.
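A compact sketch of the regularity computation of formulas (1)-(3), assuming a boolean mask per marker obtained from a segmentation such as the one above (the population standard deviation is used, matching the 1/N form of formula (2)):

```python
import numpy as np
from scipy import ndimage

def shape_regularity(blob):
    """Shape regularity factor of one marker region, following formulas (1)-(3):
    lambda = std / mean of the centre-to-edge distances.
    'blob' is a boolean mask containing a single marker; a circular marker yields
    a small lambda, a triangular or irregular marker a larger one."""
    ic, jc = ndimage.center_of_mass(blob)              # central coordinates (Ic, Jc)
    edge = blob & ~ndimage.binary_erosion(blob)        # edge pixels of the region
    xi, yi = np.nonzero(edge)
    dist = np.hypot(xi - ic, yi - jc)                  # centre-to-edge distance of each edge pixel
    return dist.std() / dist.mean()                    # formula (3): Distance_std / Distance_mean
```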
Fig. 2 is another example of the shape feature, wherein the surfaces of the markers have at least one hole. If the markers are reflecting markers, the surface of one marker 21 is processed into a fully reflecting surface, and the surface of another marker 22 has at least one area of a closed pattern (preferably a circle) that is not reflecting. This area of a closed pattern will be black in the captured image, because it does not reflect light, and is called a hole in image technology. To distinguish the two markers it is calculated whether there are holes in the markers. If both markers have holes, they can be distinguished by comparing the numbers of holes. The number of holes in the area of the marker can be determined from the Euler number.
In a binary image in eight connected regions, the Euler number can be calculated by the following formula:
Num = (Σv - Σt - 2 * Σd) / 4    (4)
where Num represents the Euler number, and v, t and d represent 2×2 pixel patterns in the binary image, counted together with their rotational equivalents: v denotes the patterns containing exactly one foreground pixel, t the patterns containing exactly three foreground pixels, and d the two diagonal patterns. Σv, Σt and Σd represent the numbers of the corresponding patterns found in the marker's area.
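A minimal sketch of this hole count using the bit-quad form of formula (4); the 2×2 pattern encoding is an implementation choice and is not prescribed by this description:

```python
import numpy as np

def euler_number_8(binary):
    """Euler number of a binary image under 8-connectivity via 2x2 bit-quad counting,
    i.e. formula (4): Num = (sum_v - sum_t - 2*sum_d) / 4."""
    b = np.pad(binary.astype(np.uint8), 1)                # pad so border quads are counted
    # encode each 2x2 neighbourhood as a 4-bit code: top-left, top-right, bottom-left, bottom-right
    code = (b[:-1, :-1] << 3) | (b[:-1, 1:] << 2) | (b[1:, :-1] << 1) | b[1:, 1:]
    counts = np.bincount(code.ravel(), minlength=16)
    q1 = counts[0b1000] + counts[0b0100] + counts[0b0010] + counts[0b0001]   # one foreground pixel (v)
    q3 = counts[0b0111] + counts[0b1011] + counts[0b1101] + counts[0b1110]   # three foreground pixels (t)
    qd = counts[0b1001] + counts[0b0110]                                     # the two diagonal patterns (d)
    return (q1 - q3 - 2 * qd) / 4

def count_holes(marker_blob):
    """Number of holes in a single 8-connected marker region: 1 - Euler number."""
    return int(round(1 - euler_number_8(marker_blob)))
```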
It shall be understood that the shape feature as mentioned herein is not limited to the above-mentioned types, and any shape feature that can be used to differentiate between the shapes of two markers can be applied to the present invention. The camera used in this embodiment is an infrared camera, but the invention is not limited to the use of an infrared camera, and any type of camera can be used; besides, the markers are not limited to reflecting markers, and any type of markers can be applied to this invention.
The tracking system of the present invention further comprises a second module for obtaining a series of locations of the target from the relative movements of said markers in said series of images.
In the tracking system of the present invention, the second module obtains the location of the target by calibrating the distance between every two markers at an initial location, then measuring the distance between every two markers in the series of images, and finally calculating the spatial angular coordinates of the two parts of the target from the corresponding mathematical formulae.
Fig. 3 is a diagram of the spatial location of the straight line formed by the centers E and S of two markers. The XOY plane is parallel to the camera's imaging plane; EE' is parallel to the Z-axis, i.e. it is perpendicular to the XOY plane, and the intersection point E' is the projection of the marker E in the XOY plane; E'D is parallel to the Y axis. α represents the angle between the projection SE' of SE in the XOY plane and the Y axis, and β represents the angle between SE and the XOY plane; α and β reflect the spatial angles of the straight line formed by the two markers. The values of α and β can be calculated from the central coordinates of the markers in the XOY plane and the distance between them:
tanα = E'D / SE'    (5)
cosβ = SE' / SE    (6)
where the range of α is 0° to 360°, and the range of β is -90° to 90°. In such a way, the spatial location of line SE can be identified. As shown in Fig. 3, W represents another marker; similarly, the spatial location of line EW can be identified.
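The following sketch illustrates one way to evaluate these relations from a single view. The calibrated segment length, taken here as the pixel length of the segment when it lies in the XOY plane, is an assumed calibration parameter, and the sign of β cannot be resolved from one camera alone:

```python
import math

def line_spatial_angles(s_img, e_img, calibrated_len_px):
    """Spatial angles (alpha, beta) of the marker line S-E from a single camera view,
    in the spirit of formulas (5) and (6).
    s_img, e_img: image coordinates (x, y) of the two marker centres.
    calibrated_len_px: assumed length of S-E in pixels when the segment lies in the
    XOY plane, recorded during the initial calibration."""
    dx = e_img[0] - s_img[0]
    dy = e_img[1] - s_img[1]
    # alpha: angle between the projection SE' and the Y axis, full 0..360 degree range
    alpha = math.degrees(math.atan2(dx, dy)) % 360.0
    # beta: tilt out of the XOY plane, from the foreshortening of the projected length
    proj = math.hypot(dx, dy)                              # |SE'| in pixels
    ratio = min(1.0, proj / calibrated_len_px)
    beta = math.degrees(math.acos(ratio))                  # cos(beta) = SE' / SE; sign is ambiguous
    return alpha, beta
```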
Referring to Fig. 1 by way of example, the target is an arm of a person, and the connection point is formed by the arthroses of the arm. The three markers 121, 122 and 123 are attached to the three arthroses, respectively, and the processor 14 further comprises a second module 142 for obtaining the location of the arm 10 in the image.
Before the arm begins to move, it needs to stop for a moment at the starting location so that the second module 142 can perform a calibration, i.e. record the spatial angle between the upper arm and the forearm at the starting position and calculate the distances between the markers in the image from the coordinates of the three markers. After this initial calibration, the arm begins to move. The distance between every two markers can be measured in the series of images. On the basis of the calibrated parameters (i.e. the recorded spatial angle and the distances between the markers in the image at the starting location) and the distance between every two markers at the new location, the spatial angular coordinates of the two parts (the upper arm and the forearm) of the target at the new location can be obtained by means of the above mathematical formulae, thereby obtaining the location of the target (i.e. the gesture of the arm).
After the target has made a movement, a series of locations is obtained through successive image frames, which locations represent the movement of the target. Still taking the movement of the arm as an example, Fig. 4A is a schematic diagram of the arm movement, wherein the arm moves from location A to location E, and this space of movement is divided into five sub-spaces A, B, C, D, and E, each of which represents a location range. Any location in sub-space A is simply considered to be location A. Thus, the movement from A to E in Fig. 4 can be represented by the location sequence ABCDE, which means that said movement goes from location A to location E via locations B, C, and D. The density of this division into sub-spaces depends on the required precision; if a higher precision is required, the movement space can be more densely divided, for example the space from A to E may be divided into 10 sub-spaces (ABCDEFGHIJ).
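How a measured location is mapped onto the letter-coded sub-spaces of this example may be sketched as a simple quantization; the quantity being quantized and the 0° to 180° range are illustrative assumptions:

```python
import string

def to_subspace(value, lo, hi, n_subspaces=5):
    """Map a measured arm location (e.g. one of the spatial angles) onto a sub-space
    label A, B, C, ...; the five-way split follows the Fig. 4 example and is an
    assumption, not a requirement of this description."""
    idx = int((value - lo) / (hi - lo) * n_subspaces)
    idx = min(max(idx, 0), n_subspaces - 1)        # clamp values at or just outside the range
    return string.ascii_uppercase[idx]

# raw location sequence, one label per captured frame:
# raw = "".join(to_subspace(angle, 0.0, 180.0) for angle in measured_angles)
```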
Since the camera captures images very fast, for example 24 frames per second, there may be many repeated locations if the target moves slowly; during the movement from A to E in Fig. 4, for example, the series of locations obtained from the images captured by the camera may be AAAAABBBBBCCCCDDDDDEEEE. In view of this, the present invention further includes a step of filtering any repeated locations out of the series of locations while the second module is identifying the movement of the target on the basis of the series of locations. After filtering, the movement is finally identified as ABCDE, i.e. a movement from A to E.
A further object is to improve the error tolerance of the system when the tracked movement shakes at the border of two adjacent sub-spaces. For example, a person with dyskinesia may stay and shake at the border between C and D owing to limited muscle strength, or the shaking movement may be caused by an inaccurate location calculated owing to a system error. The location sequence representing said movement in such a case may become AAAAABBBBBCCCCDCDDDEEEE, and after filtering out the repeated locations the movement is represented as ABCDCDE. If the system requires a high error tolerance, the user generally does not want such shakes to be represented as part of the normal movement.
If the system requires a high error tolerance, therefore, the present invention optionally provides a step of filtering out the shakes after the filtering of repeated locations, which filters out the XYX type shake and replaces the XYX location directly with X. After performing this step, ABCDCDE in the above example will be changed to ABCDE.
In addition, an incorrect identification of the markers, e.g. because a marker is blocked for a while, will result in incorrect locations. For example, the above location sequence AAAAABBBBBCCCCDDDDDEEEE may become AAAAABBBBBCECCDDDDDEEEE, wherein an E abruptly appears among the C locations, which is obviously unreasonable. In order to avoid such incorrect locations, the present invention may also optionally adopt a step of filtering out incorrect locations after filtering the repeated locations. Said step filters out the incorrect locations of non-adjacent sub-spaces that appear abruptly; adjacency here refers to the physical adjacency of the sub-spaces, not the adjacency of the selected letter codes.
After obtaining the movement track of the target, the tracking system of the present invention may further comprise a third module for comparing the target movement identified by the second module with a movement template.
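Before turning to the template comparison, the three filters described above can be sketched directly on the letter sequences; the adjacency map of physically neighbouring sub-spaces is an assumed input:

```python
def collapse_repeats(seq):
    """Filter repeated locations: AAAB -> AB."""
    return [c for i, c in enumerate(seq) if i == 0 or c != seq[i - 1]]

def remove_shakes(seq):
    """Optional high-error-tolerance filter: replace an X-Y-X shake with a single X."""
    out = []
    for c in seq:
        if len(out) >= 2 and out[-2] == c:
            out.pop()                              # drop the Y of the X-Y-X pattern
        else:
            out.append(c)
    return collapse_repeats(out)

def remove_incorrect(seq, adjacency):
    """Drop abrupt locations that are not physically adjacent to either neighbour
    (e.g. the stray E in ...C E C...). 'adjacency' maps each sub-space to the set
    of physically neighbouring sub-spaces and is an assumed input."""
    keep = []
    for i, c in enumerate(seq):
        prev_ok = i == 0 or c in adjacency[seq[i - 1]]
        next_ok = i == len(seq) - 1 or c in adjacency[seq[i + 1]]
        if prev_ok or next_ok:
            keep.append(c)
    return collapse_repeats(keep)

# adjacency = {"A": {"B"}, "B": {"A", "C"}, "C": {"B", "D"}, "D": {"C", "E"}, "E": {"D"}}
# collapse_repeats("AAAAABBBBBCCCCDDDDDEEEE")                           -> list("ABCDE")
# remove_shakes(collapse_repeats("AAAAABBBBBCCCCDCDDDEEEE"))            -> list("ABCDE")
# remove_incorrect(collapse_repeats("AAAAABBBBBCECCDDDDDEEEE"), adjacency) -> list("ABCDE")
```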
Still referring to the tracking system shown in Fig. 1 as an example, the movement template is pre-stored in the processor 14 in order to be compared with the identified movement that is represented by a location sequence so as to determine the coincidence between the movement made and the template.
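Since the identified movement is only a short location sequence, the comparison itself can be kept very light-weight. The similarity metric and threshold below are assumptions, as this description only requires determining the coincidence:

```python
from difflib import SequenceMatcher

def matches_template(movement, template, threshold=0.8):
    """Decide whether the identified location sequence coincides with a stored template.
    A similarity ratio with an assumed threshold is one simple way to do this,
    exact equality being the strictest case."""
    ratio = SequenceMatcher(None, movement, template).ratio()
    return ratio >= threshold, ratio

# matches_template("ABCDE", "ABCDE")   -> (True, 1.0)
# matches_template("ABCDCDE", "ABCDE") -> (True, 0.83...)
```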
The tracking system of the present invention may also display the corresponding templates in the form of a picture or voice for the user's reference, according to the user's selection, when the user starts the tracking system. The tracking system of the present invention may further comprise a fourth module for displaying the comparison result obtained by the third module. In Fig. 1, reference numeral 13 denotes a user terminal, which may be a TV or a display. The fourth module 131 serves to display the comparison result from the third module 143. The manner of displaying may vary, for example informing the user acoustically whether the movement coincides with the template, or displaying the comparison result visually, e.g. in pictures or text or a combination thereof. If the movement coincides with the template, for example, the word "correct" may be displayed, and if it does not, the word "incorrect" may be displayed, or the result may simply be represented by a right (√) or wrong (×) sign.
The tracking system of the present invention may further comprise a fifth module for displaying predefined information in accordance with the comparison result of the third module. As shown in Fig. 1, the fifth module 132 is designed to provide predefined information according to the comparison result of the third module 143. The predefined information may be audio and/or visual data or games liked by the user, selected according to the user's preference so as to inspire the user to continue his or her movement. For example, if the comparison result is "correct", the system plays a favorite song to reward the user, or displays some photos that a patient likes, or provides a simple game for the patient to play. This function is very helpful in recovery training for dyskinesia patients, because some simple movements are usually very difficult for such patients, and repeated training will make them feel bored and finally give up the training.
The tracking system of the present invention may further comprise a functional electrical stimulation device (FES) for stimulating said target via electrical pulse based on the result of comparing said identified target movement with a movement template.
An FES uses electrical pulses to activate muscles directly, and it can bypass a brain injury and initiate movement in muscles that are partially or completely paralyzed. In consequence, FES has been used to help users with an injured nervous system to perform movements and to restore lost functions through rehabilitation training. As shown in Fig. 5, module 51 is an infrared camera for capturing the movement of a user 50 through the retroreflective markers 52 attached to the human body. A processor 54 is incorporated in or connected with the camera 51 for receiving location information of the actual movement performed by the user 50 and showing the computed actual movement on the screen 53. The actual movement is compared with the movement template by the processor 54, and the comparison result is used for generating a control signal to control the FES 55. The screen 53 may provide both the movement template and the actual movement to the user. The difference between the movement template and the actual user movement shown on the screen will help the user to correct the movement.
When a user follows a movement template displayed via the TV screen 53, and the user cannot follow the movement template or the deviation between the movement template and the user's actual movement is beyond the predefined threshold, a control signal is generated by the processor 54 and sent to the FES device 55. Electrical pulses are then generated by the pulse generator 551 to stimulate the corresponding muscles via its electrodes 553, which are attached to the skin of the user 50, in order to assist the user in finishing the movement.
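One decision step of this control loop might be sketched as follows; the deviation measure (one minus a sequence similarity) and the fes_device object with a stimulate() method are illustrative assumptions, since this description does not prescribe how the deviation is computed:

```python
from difflib import SequenceMatcher

def fes_step(template_seq, actual_seq, deviation_threshold, fes_device):
    """When the deviation between the movement template and the user's actual movement
    exceeds the predefined threshold, a control signal triggers the FES device.
    The deviation measure and fes_device are assumed for illustration."""
    deviation = 1.0 - SequenceMatcher(None, actual_seq, template_seq).ratio()
    if deviation > deviation_threshold:
        fes_device.stimulate()        # pulse generator 551 drives the electrodes 553 on the skin
    return deviation
```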
The electrodes 553 may also be used to capture a biofeedback signal (an electromyography or EMG signal) when they are not working in stimulating mode, i.e. when no electrical pulses are sent to the electrodes 553. The EMG signal is sent to an EMG signal processor 552 for amplification and processing. The processed EMG signal is then sent to the processor 54 for analysis and/or display.
The present invention also provides a target tracking method which comprises the following steps:
-using at least two markers attached to the target, any one of said at least two markers having at least one shape feature that is different from the shape features of other markers;
-using at least one camera for generating a series of images of the markers;
-differentiating said at least two markers in the generated series of images on the basis of said at least one shape feature.
The steps of said method have been described in detail in the above illustrations of the tracking system, so they will not be elaborated herein any more.
The tracking method of the present invention may further comprise a step of obtaining a series of locations of the target from the relative movements of said markers in said series of images.
For example, if the target consists of two parts connected to each other, with three markers attached to the ends and the connection point of the two parts of the target, respectively, the step of obtaining a series of locations of the target includes:
-calibrating the distance between every two markers at an initial location; -measuring the distance between every two markers in the series of images;
-calculating the spatial angular coordinates of the two parts of the target from the corresponding mathematical formulae so as to obtain the locations of the target.
The steps of obtaining a series of locations of the target have been described in detail in the previous texts, so descriptions of them are omitted here. The tracking method of the present invention may further comprise identifying the movement of the target from the series of locations, said identification including a step of filtering out any repeated locations, shaking locations, and incorrect locations from said series of locations. Said step has been described previously, so it will not be elaborated here.
The tracking method of the present invention may further comprise a step of comparing the identified target movement with a movement template. Said step has been described previously, so it will not be elaborated here.
The tracking method of the present invention may further comprise a step of displaying the comparison result and a step of displaying predefined information on the target in accordance with the comparison result. Said two steps have been described in detail previously, so they will not be elaborated here.
It will be understood that the tracking system described in the above embodiments only serves to illustrate the principle of the invention, and it shall not be construed as limiting the invention. Said target tracking system may use one or more cameras to capture images and two or more markers to mark the target, and the target is not limited to the arm of a person; it may for example be the limb of an animal or another object such as a robot.

Claims

CLAIMS:
1. A target tracking system, comprising:
-at least two markers attached to a target, any one of said at least two markers having at least one shape feature that is different from the shape features of other markers;
-at least one camera for generating a series of images of the markers; and -a first module for differentiating said at least two markers in the generated series of images on the basis of said at least one shape feature.
2. The tracking system according to claim 1, wherein the shape feature is the regularity of a shape, the surface area of the shape, or the pattern of a surface.
3. The tracking system according to claim 1, further comprising a second module for obtaining a series of locations of the target from the relative movements of said markers in said series of images.
4. The tracking system according to claim 3, wherein the target consists of two parts connected to each other, with three of the markers attached to the ends of the target and the connection point of its two parts respectively, and the second module obtains the location of the target by calibrating the distance between every two markers at an initial location, then measuring the distance between every two markers in the series of images, and finally calculating the spatial angular coordinates of the two parts of the target by means of corresponding mathematical formulae.
5. The tracking system according to claim 4, wherein the target is a limb of a person or an animal, and the connection point is an arthrosis of the person or the animal.
6. The tracking system according to any one of claims 4 and 5, wherein the second module is further adapted to identify the movement of the target in dependence on the series of locations, said identification including the step of filtering repeated locations, shaking locations, and incorrect locations out from said series of locations.
7. The tracking system according to claim 6, further comprising a third module for comparing the target movement identified by the second module with a movement template.
8. The tracking system according to claim 7, further comprising an electrical stimulator for stimulating said target via electrical pulse based on the result of comparing said identified target movement with a movement template.
9. The tracking system according to claim 7, further comprising a fourth module for displaying the comparison result of the third module.
10. The tracking system according to claim 9, further comprising a fifth module for displaying predefined information for encouraging the target in accordance with the comparison result of the third module.
11. A target tracking method, which comprises the steps of:
-using at least two markers attached to a target, any one of said at least two markers having at least one shape feature that is different from the shape features of other markers;
-using at least one camera for generating a series of images of the markers; -differentiating said at least two markers in the generated series of images on the basis of said at least one shape feature.
12. The tracking method according to claim 11, further comprising a step of obtaining a series of locations of the target from the relative movements of said markers in said series of images.
13. The tracking method according to claim 12, wherein the target consists of two parts connected to each other, with three of the markers attached to the ends of the target and the connection point of its two parts respectively, and the step of obtaining a series of locations of the target includes:
-calibrating the distance between every two markers at an initial location;
-measuring the distance between every two markers in the series of images;
-calculating the spatial angular coordinates of the two parts of the target from corresponding mathematical formulae so as to obtain the location of the target.
14. The tracking method according to any one of claims 12 and 13, further comprising a step of identifying the movement of the target from the series of locations, said identification including a step of filtering out repeated locations, shaking locations, and incorrect locations from said series of locations.
15. The tracking method according to claim 14, further comprising steps of:
- comparing the identified target movement with a movement template;
- displaying predefined information in accordance with the comparison result.
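As a rough illustration of the comparison and feedback steps of claims 7 to 10 and 15, the sketch below scores an identified movement against a movement template using a mean absolute angular deviation and selects a predefined message; both the metric and the message texts are assumptions rather than anything prescribed by the application.

```python
# Illustrative only: compare an identified movement with a movement template
# and pick a predefined feedback message.
def compare_with_template(movement, template, good_thresh_deg=10.0):
    """Score a movement (series of joint angles in degrees) against a template
    of the same length using the mean absolute angular deviation, and return
    a predefined message based on the score."""
    if not template or len(movement) != len(template):
        raise ValueError("movement and template must be non-empty and of equal length")
    deviation = sum(abs(m - t) for m, t in zip(movement, template)) / len(template)
    if deviation <= good_thresh_deg:
        message = "Well done - the movement matches the template."
    else:
        message = "Keep going - try to follow the demonstrated movement more closely."
    return {"mean_deviation_deg": deviation, "message": message}
```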
PCT/IB2009/051900 2008-05-12 2009-05-08 Marker tracking system and method WO2009138929A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN200810099134.8 2008-05-12
CNA2008100991348A CN101582166A (en) 2008-05-12 2008-05-12 System and method for tracking target

Publications (1)

Publication Number Publication Date
WO2009138929A1 (en) 2009-11-19

Family

ID=41059536

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2009/051900 WO2009138929A1 (en) 2008-05-12 2009-05-08 Marker tracking system and method

Country Status (2)

Country Link
CN (1) CN101582166A (en)
WO (1) WO2009138929A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11207582B2 (en) 2019-11-15 2021-12-28 Toca Football, Inc. System and method for a user adaptive training and gaming platform
US11514590B2 (en) 2020-08-13 2022-11-29 Toca Football, Inc. System and method for object tracking
US11657906B2 (en) 2011-11-02 2023-05-23 Toca Football, Inc. System and method for object tracking in coordination with a ball-throwing machine
US11710316B2 (en) 2020-08-13 2023-07-25 Toca Football, Inc. System and method for object tracking and metric generation

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9235754B2 (en) 2011-03-28 2016-01-12 Nec Corporation Person tracking device, person tracking method, and non-transitory computer readable medium storing person tracking program
CN103735270B (en) * 2013-09-22 2016-01-13 国家康复辅具研究中心 Following through leading network pilot feedback rehabilitation training system of upper limb development of potential training
CN105279354B (en) * 2014-06-27 2018-03-27 冠捷投资有限公司 User can incorporate the situation construct system of the story of a play or opera
CN107240115B (en) * 2016-03-28 2019-07-09 浙江中正智能科技有限公司 A kind of recognition methods based on marker
CN107204005B (en) * 2017-06-12 2020-01-14 北京理工大学 Hand marker tracking method and system
CN107806837B (en) * 2017-10-29 2020-03-13 北京工业大学 Non-invasive wrist joint axis motion model measuring method
CN114177588B (en) * 2021-12-13 2022-11-11 南京伟思医疗科技股份有限公司 Vibration feedback system, method and device of rehabilitation robot

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6244987B1 (en) * 1996-11-25 2001-06-12 Mitsubishi Denki Kabushiki Kaisha Physical exercise system having a virtual reality environment controlled by a user's movement
US20040046736A1 (en) * 1997-08-22 2004-03-11 Pryor Timothy R. Novel man machine interfaces and applications
WO2005113086A1 (en) * 2004-05-10 2005-12-01 Sony Computer Entertainment Inc. Pattern codes used for interactive control of computer applications and video game applications
US20060055706A1 (en) * 2004-09-15 2006-03-16 Perlman Stephen G Apparatus and method for capturing the motion of a performer

Also Published As

Publication number Publication date
CN101582166A (en) 2009-11-18

Similar Documents

Publication Publication Date Title
WO2009138929A1 (en) Marker tracking system and method
Johansson Spatio-Temporal Differentiation and Integration in Visual Motion Perception†: An Experimental and Theoretical Analysis of Calculus-Like Functions in Visual Data Processing
Chang et al. Towards pervasive physical rehabilitation using Microsoft Kinect
Poizner et al. Perception of American sign language in dynamic point-light displays.
CN101453941B (en) Image output apparatus, image output method, and image output system
RU2417810C2 (en) Device for controlling health
WO2020132415A1 (en) Method and system for motion measurement and rehabilitation
Dikovski et al. Evaluation of different feature sets for gait recognition using skeletal data from Kinect
KR101118654B1 (en) rehabilitation device using motion analysis based on motion capture and method thereof
US20150378433A1 (en) Detecting a primary user of a device
CN1079296A (en) The method and apparatus that is used for eye tracking measurement under complete poly-and stravismus situation
WO2013059227A1 (en) Interactive physical therapy
KR102320960B1 (en) Personalized home training behavior guidance and correction system
Malawski Depth versus inertial sensors in real-time sports analysis: A case study on fencing
CN114758415A (en) Model control method, device, equipment and storage medium
CN101965580B (en) The mankind based on biostatistics's behavior setting identify system and method
Chalkley et al. Development and Validation of a Sensor-Based Algorithm for Detecting the Visual Exploratory Actions
WO2019137186A1 (en) Food identification method and apparatus, storage medium and computer device
KR20210102622A (en) Intelligent Home Training System Based on Self-moving Motion Recognition Camera
Arsenault A quaternion-based motion tracking and gesture recognition system using wireless inertial sensors
CN115129162A (en) Picture event driving method and system based on human body image change
Ehab et al. ISwimCoach: a smart coach guiding system for assisting swimmers free style strokes
Galińska et al. A database of elementary human movements collected with rgb-d type camera
EP3653120A1 (en) A rehabilitation device and a method of monitoring hand positions
Krupicka et al. Motion camera system for measuring finger tapping in parkinson’s disease

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09746224

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09746224

Country of ref document: EP

Kind code of ref document: A1