WO2000016121A1 - System relating to positioning in a virtual studio - Google Patents


Info

Publication number
WO2000016121A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera
tracking
positioning
unit
inertial
Prior art date
Application number
PCT/SE1999/001589
Other languages
French (fr)
Inventor
Karl-Erik Morander
Anders JÖNEBRATT
Original Assignee
Qualisys Ab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qualisys Ab filed Critical Qualisys Ab
Priority to AU62365/99A priority Critical patent/AU6236599A/en
Publication of WO2000016121A1 publication Critical patent/WO2000016121A1/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/16 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves
    • G01S5/163 Determination of attitude

Landscapes

  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Studio Devices (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The present invention relates to a method and a system in a virtual studio for optical non-contact positioning of at least one camera (11, 31) in a volume by means of one or more tracking cameras (18, 38). The camera (11, 31) is equipped with a marker unit (19, 39), which is visible from at least one tracking camera (18, 38), which generates a signal containing information about the position and/or orientation of the camera (11, 31). Said information is transferred to a unit for further processing with respect to a second set of information for generating essentially coherent main information.

Description

TITLE
SYSTEM RELATING TO POSITIONING IN A VIRTUAL STUDIO
TECHNICAL FIELD
The present invention refers to a system and a method in a so-called virtual studio for determination of the position of at least one camera in a volume.
BACKGROUND OF THE INVENTION
A new technique widely used in television is the so-called "virtual studio", in which all or parts of a scene and its props are generated by computer graphics. The scene created in the computer is then overlaid with "real" images (video), for example the picture of an actor, so that in the final picture the actor appears to be in the computer-generated scene.
It is extremely important that the real video and the computer graphics fit together. The data system that is used to assemble the images must have information about the position and orientation of the video camera, i.e., the positions of the real and the virtual camera must be identical.
A second problem is that present tracking systems are based on CCD-equipped cameras in combination with image-processing units. Usually, the information is read from the CCD chip serially by means of one to four serial shift registers. The maximum readout speed is limited, e.g. to 20 MHz. With two shift registers and a CCD of 500x600 pixels, the readout of one image takes 15 ms. A faster readout normally degrades the signal-to-noise ratio (SNR) of the (analog) output. Consequently, the number of pixels for a given maximal computing period is limited, and so is the best attainable accuracy. It becomes difficult or impossible to detect small movements of a measured body.
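The readout figures quoted above can be checked with a short sketch (purely illustrative; the assumption that the 20 MHz limit is the aggregate rate over the shift registers is ours):

```python
# Illustration of the CCD readout bottleneck described above.
# Assumption (ours): the 20 MHz maximum is the aggregate readout rate.

pixels = 500 * 600            # 300,000 picture elements per image
readout_rate_hz = 20e6        # maximum serial readout speed

readout_time_s = pixels / readout_rate_hz
print(f"readout time per image: {readout_time_s * 1e3:.1f} ms")      # 15.0 ms
print(f"upper bound on frame rate: {1 / readout_time_s:.0f} frames/s")
```

At 15 ms per frame the tracking rate is capped near 67 frames per second before any image processing has even started, which is why pixel count, measurement speed and accuracy trade off against each other.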
DESCRIPTION OF PRIOR ART
The technique for determining the position and orientation of a camera is called "motion tracking" or "camera tracking". Mechanical methods have been used in the past, for example angle indicators on a stationary support, to obtain information about the orientation of the camera.
Lately, non-contact methods have been developed. One system uses "bar-codes" arranged on the studio walls for position and angle determination. The bar-code comprises a pattern of two different shades and enables real-time extraction of the camera parameters from the film (the video signal), which is then processed in a video processor.
Another system uses an extra camera attached to the TV camera and directed towards the roof of the studio, where a coded disc is provided; the disc is detected by the camera and, after image processing, the position of the TV camera is indicated.
WO 97/11386 discloses a positioning and orientation system for an object, in which markers, for example formed as frames, are attached to a surface such as a roof, and the position and orientation of a camera is determined, e.g., with respect to the markers.
Determining the position of an object is known, e.g. through US 5,227,985, in which the position of an object provided with a marker is determined by means of a camera.
BRIEF DESCRIPTION OF INVENTION
The object of the present invention is to provide a system for a non-contact positioning of at least one camera in a volume in real time and in a very simple and accurate way.
The system according to the invention enables free relocation of a TV camera above a surface without influencing the generation of position and orientation parameters.
The system according to the invention allows both high positioning speed and accuracy.
Additionally, the system according to the invention provides the following advantages:
- measurement and positioning in large volumes,
- good precision, i.e. a small change in position or angle can be detected, e.g. 0.1 mm for lateral offset and 0.01 degrees for angular offset,
- low noise (fluctuation) in the position information obtained from the system,
- low drift (i.e. slow displacement of the measured position without the measured object being displaced),
- high measuring speed, i.e. many measurements per second, e.g. 60/s,
- quick response, i.e. each measurement result is available within a certain period,
- good detection performance, i.e. detection of small markers at large distance, and
- sufficient redundancy and consequently high reliability during the entire operating time.
These objects have been attained by equipping the initially mentioned camera with a marker unit, which is visible from at least one tracking camera, which generates a signal containing information about the position and/or orientation of the camera, which information is supplied to a unit for further processing with respect to a second set of information for generating essentially coherent main information.
In one embodiment, the tracking camera is a CCD-equipped camera and operates within the IR region. Preferably, the tracking camera communicates with a computer unit, which is also connected to the camera for positioning the camera.
The tracking cameras are arranged to measure the marker unit, whereby the information about the degrees of freedom of the marker unit is determined. The camera's position is determined relative to a global coordinate system of the volume.
According to a method of the invention for a non-contact positioning of at least one camera in a volume by means of one or more tracking cameras, the method includes the steps of providing the camera with a marker unit visible at least from some of the tracking cameras, and generating a signal containing information about the degrees of freedom of the camera by means of the tracking camera.
The method further includes measuring in an image recording unit of the camera and relating the generated degrees of freedom to said measurement.
Advantageously, the measurement of the tracking cameras can be conducted according to an automatic procedure whereby a special marker unit is attached to a lens mount of the TV camera, after which the tracking camera system measures with respect to this marker unit relative to the marker provided on the camera during the entire measurement procedure.
In an embodiment, said camera is provided with at least one motion sensor in the form of an inertial detector and/or accelerometer and/or gyroscope for inertial-based positioning. Several sets of motion sensors can also be arranged in different positions on a body to be measured to compute angle data (roll, pan and tilt). The optical positioning provides an absolute measurement while the inertial-based positioning provides a relative measurement. Upon interference in one of the optical or inertial-based positionings, data from the other positioning is used.
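As a hedged sketch of how several spatially separated accelerometer sets can yield angle data, the differential tangential acceleration between two sensors on the same rigid body can be integrated twice to an angle. The patent does not give the computation; the lever-arm model, the neglect of the centripetal term (valid for slow rotation) and all numbers below are our assumptions:

```python
import numpy as np

def angle_from_accel_pair(a1, a2, lever_arm_m, dt):
    """Integrate the differential tangential acceleration twice to an angle.

    a1, a2       : arrays of tangential acceleration samples [m/s^2]
    lever_arm_m  : distance between the two sensor sets on the body [m]
    dt           : sample period [s]
    """
    # Angular acceleration from the lever-arm model (centripetal term ignored)
    alpha = (np.asarray(a2) - np.asarray(a1)) / lever_arm_m   # [rad/s^2]
    omega = np.cumsum(alpha) * dt                             # first integration
    theta = np.cumsum(omega) * dt                             # second integration
    return theta

# Constant angular acceleration of 0.2 rad/s^2, sensors 0.5 m apart, 100 Hz:
dt, n = 0.01, 200
a1 = np.zeros(n)                 # sensor set at the rotation axis
a2 = 0.2 * 0.5 * np.ones(n)      # differential tangential accel = alpha * r
theta = angle_from_accel_pair(a1, a2, 0.5, dt)
# After 2 s, theta is close to 0.5 * alpha * t^2 = 0.4 rad
```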
BRIEF DESCRIPTION OF THE DRAWINGS
The invention is described with reference to the attached drawings showing embodiments, in which:
Fig. 1 is a very schematic view of the system according to the invention,
Fig. 2 is a schematic marker unit, which can be used in a system according to the invention, and
Fig. 3 is a second embodiment of a system according to the invention.
DESCRIPTION OF THE EMBODIMENTS
According to the invention a camera system is used to measure the position and orientation of one or several TV/video cameras in a studio (or any place of filming). Each TV camera to be tracked/positioned is provided with at least one marker and is measured, in real time, by means of an arrangement specific to the system. The information about the position and orientation of the TV camera is then transferred, in real time, to a data system that controls the virtual camera, i.e., the camera which in the virtual studio corresponds to the real camera. The term "TV camera" is used for simplicity and concerns studio cameras which produce real images; it can comprise any arbitrary type of camera. The camera may also be stationary or portable and can be operated manually or by remote control (a robot).
Fig. 1 shows a film studio where an object 10, in this case a car, is filmed by means of a camera 11 (a TV or video camera). The system additionally includes at least one computer unit 12 for generating images 13 that can constitute the background or set, a so-called virtual studio. These images can also be stored in any memory and fetched when needed. Furthermore, the system includes a second computer unit 14 to provide a montage 15, i.e. to put together the pictures 16 and 17 obtained from the camera 11 and the computer unit 12, respectively. Obviously, the computer units 12 and 14 can be integrated into one computer unit.
It is also possible to transfer the information from several TV cameras, which generate "real" images, and combine them in the computer 14.
The system also includes several "tracking cameras" 18, and at least one marker unit 19 arranged on the TV camera 11, which is visible from at least some of the cameras, preferably from at least three cameras 18. Preferably, the tracking cameras 18 consist of IR-based CCD cameras. These cameras can be equipped with IR (Infra Red) generating units or communicate with, for example, IR flashes. The marker unit consists of one or more markers provided with a reflective surface. Moreover, active markers (coded or non-coded) which generate IR (or another type of light) can be used.
For detecting all degrees of freedom of the TV cameras, three (3) or more CCD cameras 18 are used. Preferably, the CCD cameras operate at 50 or 60 Hz, synchronized with the synchronization of the studio. This is, however, not a limitation. The function of a CCD camera is assumed to be known to a person skilled in the art and is not described in detail herein.
In an embodiment, the system can include only one tracking camera and one marker arranged on the TV camera, whereby the positioning is carried out according to the method disclosed in the Swedish patent No. 9700066-5.
Fig. 2 shows a schematic marker unit 19 consisting of six spherical or semispherical markers 20 provided on the ends of carrying rods 21 constituting a rack. The markers are arranged at different levels to enable a more accurate determination of the unit's, and consequently the camera's, position and orientation. It is also possible to use a marker unit including one or more markers. Also, the size of the markers in the marker unit can vary. Clearly, the invention is not limited to this embodiment of the marker unit and different embodiments may occur.
In one embodiment, the positioning is carried out by recording IR reflections (or light) from the markers with each CCD camera. The camera includes an optical element, which projects the image of the marker onto a CCD (Charge Coupled Device) plate in the camera. The surface of the CCD unit is then scanned and the image signal, including pixel information, is converted to a suitable video signal by means of a converter unit, which can be integrated in the CCD unit. The video signal representing the image lines is then sent serially or in parallel to a processing unit. The processing unit digitizes the received video signal, for example using an A/D converter. This signal can also be digitized in the camera.
The picture elements are arranged in lines by means of a low-pass filter to supply a partly continuous signal, which can be processed as an analog signal. However, in the preferred embodiment each picture element is measured individually and from the measured values a value is interpolated, which decides when a threshold value is passed.
The digitized signal is transferred to a comparator unit, which interpolates the individually sampled values about a predetermined threshold value, also called the video level, which is obtained from a memory unit. As described above, the aim is to detect when the amplitude of the signal passes the threshold value. Each passage represents a start or stop coordinate of a segment with a high resolution, which can be about 30 x the number of picture elements in a row. In a computation unit the center point is computed.
For a circular marker the area of the image reproduction is computed. Results can later be transferred to an interface unit for further transmission to the computer unit, in which the computed values (a center point, i.e. x, y, z positions measured with two cameras, area, diameter) can be used with respect to the disclosed positions of the markers.
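A minimal sketch of the sub-pixel segment detection described above (our reconstruction, not the patent's exact circuit): the individually sampled pixel values are linearly interpolated about the threshold, and the segment's centre is taken as the midpoint of the start and stop crossings, giving a resolution much finer than the pixel grid.

```python
def segment_center(row, threshold):
    """Return the sub-pixel start, stop and centre of the first bright segment."""
    start = stop = None
    for i in range(1, len(row)):
        lo, hi = row[i - 1], row[i]
        if start is None and lo < threshold <= hi:
            # rising crossing: interpolate between pixels i-1 and i
            start = (i - 1) + (threshold - lo) / (hi - lo)
        elif start is not None and lo >= threshold > hi:
            # falling crossing ends the segment
            stop = (i - 1) + (lo - threshold) / (lo - hi)
            break
    return start, stop, (start + stop) / 2

# One image row containing a (hypothetical) marker reflection:
row = [0, 0, 10, 80, 120, 120, 80, 10, 0, 0]
start, stop, center = segment_center(row, threshold=60)
print(start, stop, center)   # centre lands between pixels, at 4.5
```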
By continuously scanning the markers, their motion can be detected. Camera rotations and movements are detected through the different levels of the markers and their distances to each other and to the camera. The CCD cameras 18 are arranged to measure the marker units and compute, e.g., six degrees of freedom for each marker unit (each marker unit's x, y, z coordinates, pan, tilt and roll). Before the filming, the image recording unit of the TV camera, e.g. its CCD, is measured in and the six generated degrees of freedom are associated with it. Each camera can include a computer unit for calculation of the positions of the markers; alternatively, a central unit or the computer unit 14 is used to perform the image analysis and/or computation. The signals from the CCD cameras (processed or raw) are then sent to the computer 14 to be used for the montage. The marker unit 19 is preferably arranged on an upper side of the TV camera, and the CCD cameras hang from mounts or the roof, directed towards the camera.
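One standard way to turn measured 3-D marker coordinates into the six degrees of freedom of the marker unit is rigid alignment against the known rack geometry. The Kabsch/SVD method below is our illustrative choice, since the patent does not name an algorithm, and the marker coordinates are invented for the example:

```python
import numpy as np

def marker_unit_pose(model_pts, measured_pts):
    """Return rotation R and translation t so that measured ~= model @ R.T + t."""
    model_pts = np.asarray(model_pts, float)
    measured_pts = np.asarray(measured_pts, float)
    cm, cs = model_pts.mean(0), measured_pts.mean(0)
    H = (model_pts - cm).T @ (measured_pts - cs)    # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cs - R @ cm
    return R, t

# Six markers at different levels, as in Fig. 2 (coordinates illustrative):
model = [[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 0.5], [0.5, 0.5, 1]]
# Simulate a measurement: rotate 30 degrees about z and translate.
ang = np.radians(30)
R_true = np.array([[np.cos(ang), -np.sin(ang), 0],
                   [np.sin(ang),  np.cos(ang), 0],
                   [0, 0, 1]])
t_true = np.array([2.0, -1.0, 0.5])
measured = np.asarray(model) @ R_true.T + t_true
R, t = marker_unit_pose(model, measured)
# R recovers R_true and t recovers t_true; pan, tilt and roll follow from R.
```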
Preferably, the calibrated CCD cameras are situated in a coordinate system specific to the volume (studio). Thereby the computed degrees of freedom for the picture element of the TV camera are given in a global coordinate system specific to the volume. These degrees of freedom are transmitted to the computer unit 14 for positioning the camera.
Through the calibration, the measurement of the CCD cameras can be carried out according to an automatic procedure whereby a special marker unit is fastened on the lens mount of the TV camera, after which the CCD camera system (or the tracking camera system) measures with respect to this marker unit relative to the marker on the TV camera during the entire measurement procedure.
In a second embodiment, shown in Fig. 3, a motion sensor 33 in the form of an inertial detector or accelerometer and/or gyroscope is connected to the camera 31 to be positioned in the system. The data from the motion sensor, together with film data from the camera 31 and data from the positioning cameras 38 via the computer 32, is fed to a computer 34 where x, y, z, pan, tilt and roll are computed (see above), which provides the position of the camera 31.
Inertial tracking functions both with accelerometers (e.g. for x, y) and with gyroscopes (especially for roll, pan and tilt). It is also possible to use more than one set of x, y, z accelerometers provided in different positions on a body to be measured to compute angle data (roll, pan and tilt). Owing to the measurement principle, the accelerometers tend to drift, as the position is computed through double integration of the measured acceleration over the measurement period.
Therefore, small errors accumulate. However, a high measurement speed, a good SNR and good sensitivity also for very small movements of the measured object are obtained.
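The accumulation of error described here can be made concrete: a small constant bias in the measured acceleration grows quadratically in the double-integrated position. The bias value and sample rate below are hypothetical:

```python
import numpy as np

def integrate_position(accel, dt):
    """Double integration of acceleration samples to position."""
    velocity = np.cumsum(accel) * dt
    return np.cumsum(velocity) * dt

dt = 0.01                             # 100 samples/s
t = np.arange(0, 10, dt)              # 10 s of measurement
bias = 0.01                           # 0.01 m/s^2 accelerometer offset
accel = np.zeros_like(t) + bias       # the object is actually stationary

position_error = integrate_position(accel, dt)
# error ~ 0.5 * bias * t^2: about 0.5 m after 10 s, from a tiny bias
print(f"drift after 10 s: {position_error[-1]:.2f} m")
```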
The optical system provides a slow but offset-free position, while the output of the inertial detector has low noise and a high measuring speed, preferably higher than that of the optical system.
Moreover, the spatial resolution of the inertial system is higher than that of the optical system. The optical system provides an absolute measurement (with respect to the spatial coordinates) while the inertial-based system provides a relative measurement (offset) and decides whether the object is moving or stationary. Additionally, upon interference in one system it is possible to use data from the other system.
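One simple, illustrative way to exploit the complementary properties described here is a complementary filter that propagates with the fast inertial data and is slowly pulled towards the offset-free optical measurement. The patent prescribes no particular filter; the blending constant and the signal models below are our assumptions:

```python
import numpy as np

def complementary_fuse(optical, inertial_delta, k=0.02):
    """Fuse absolute optical positions (noisy, offset-free) with per-sample
    inertial displacements (low noise, drifting). k blends toward optical."""
    fused = [optical[0]]
    for opt, d in zip(optical[1:], inertial_delta[1:]):
        predicted = fused[-1] + d                    # fast, relative inertial update
        fused.append((1 - k) * predicted + k * opt)  # slow pull to absolute position
    return np.array(fused)

# Stationary camera: optical reads 0 with noise, inertial drifts slightly.
rng = np.random.default_rng(0)
n = 2000
optical = rng.normal(0, 0.5e-3, n)        # 0.5 mm optical noise, no offset
inertial_delta = np.full(n, 1e-6)         # small per-sample inertial drift
fused = complementary_fuse(optical, inertial_delta)
# The fused estimate stays near zero: the drift is bounded by the optical
# pull, and the optical noise is smoothed by the inertial prediction.
```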
While we have illustrated and described only preferred embodiments of the invention, it is obvious that several variations and modifications within the scope of the attached claims can occur.
DESIGNATION SIGNS
10 object
11,31 camera
12, 32 computer unit
13 graphics
14, 34 computer unit
15 montage
16 image
17 image
18,38 tracking camera
19,39 marker unit
20 marker
21 rod
33 motion sensor

Claims

1. A system in a virtual studio for optical non-contact positioning of at least one camera (11, 31) in a volume by means of one or more tracking cameras (18, 38), characterised in, that the camera (11, 31) is equipped with a marker unit (19, 39), which is visible from at least one tracking camera (18, 38), which tracking camera generates a signal containing information about a position and/or orientation of the camera (11, 31), said information being supplied to a unit for further processing with respect to a second set of information for generation of an essentially coherent main information set.
2. System according to claim 1, characterised in, that the tracking camera is a CCD equipped camera.
3. System according to claim 2, characterised in, that the tracking camera operates within IR range.
4. System according to claim 1, characterised in, that the tracking camera communicates with a computer unit (14), which is further connected to the camera (11) for positioning the camera (11).
5. System according to claim 1, characterised in, that the tracking camera (18) is provided for measuring the marker unit (19) whereby information about degrees of freedom of the marker unit is determined.
6. System according to claim 1, characterised in, that the position of the camera is determined relative to a global coordinate system associated with the volume.
7. System according to any of the preceding claims, characterised in, that said camera (31) is equipped with at least one motion sensor (33) in the form of an inertial detector and/or accelerometer and/or gyroscope for inertial-based positioning.
8. System according to claim 7, characterised in, that more than one set of motion sensors (33) is provided in different positions on a body to be measured to compute angle data (roll, pan and tilt).
9. System according to any of the claims 7 or 8, characterised in, that the optical positioning provides an absolute measurement while the inertial based positioning provides a relative measurement.
10. System according to any of the claims 1 - 9, characterised in, that upon interferences in one of optical or inertial-based positionings, data from the other positioning is used.
11. A method for non-contact positioning of at least one camera (11) in a volume in a virtual studio application by means of one or more tracking cameras (18), characterised in that the method includes the steps of providing the camera (11) with a marker unit (19) visible at least from some of the tracking cameras (18), and generating a signal containing information about degrees of freedom of the camera (11) by means of the tracking camera.
12. Method according to claim 11, characterised by measuring in an image recording unit of the camera (11) and relating the generated degrees of freedom to said measurement.
13. Method according to claim 11, characterised in that the measurement of the tracking camera is carried out according to an automatic procedure whereby a special marker unit is attached on a lens mount of the camera, after which the CCD camera system in the tracking camera system measures with respect to this marker unit relative to the marker provided on the camera during the entire measurement procedure.
14. Method according to any of claims 11 - 13, characterised in that said camera (31) is provided with at least one motion sensor (33) in the form of an inertial detector and/or accelerometer and/or gyroscope for inertial-based positioning.
PCT/SE1999/001589 1998-09-11 1999-09-10 System relating to positioning in a virtual studio WO2000016121A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU62365/99A AU6236599A (en) 1998-09-11 1999-09-10 System relating to positioning in a virtual studio

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SE9803102-4 1998-09-11
SE9803102A SE515134C2 (en) 1998-09-11 1998-09-11 System and method of camera positioning in a virtual studio

Publications (1)

Publication Number Publication Date
WO2000016121A1 true WO2000016121A1 (en) 2000-03-23

Family

ID=20412581

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SE1999/001589 WO2000016121A1 (en) 1998-09-11 1999-09-10 System relating to positioning in a virtual studio

Country Status (3)

Country Link
AU (1) AU6236599A (en)
SE (1) SE515134C2 (en)
WO (1) WO2000016121A1 (en)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001071280A2 (en) * 2000-03-23 2001-09-27 Snap-On Technologies, Inc. Self-calibrating, multi-camera machine vision measuring system
WO2002103286A1 (en) * 2001-06-15 2002-12-27 Snap-On Technologies, Inc. Self-calibrating position determination system
WO2003002937A1 (en) * 2001-06-28 2003-01-09 Snap-On Technologies, Inc. Self-calibrating position determination system and user interface
EP1335181A1 (en) * 2002-02-04 2003-08-13 CORGHI S.p.A. Device for measuring the characteristic attitude parameters of a vehicle
EP1455253A2 (en) * 2003-01-29 2004-09-08 Centurfax Limited Absolute positioning system
US6931340B2 (en) 2000-05-22 2005-08-16 Snap-On Incorporated Self-calibrating, multi-camera machine vision measuring system
US7062861B2 (en) 2001-06-28 2006-06-20 Snap-On Incorporated Self-calibrating position determination system and user interface
US7444178B2 (en) 2004-10-05 2008-10-28 Brainlab Ag Positional marker system with point light sources
EP2112470A1 (en) * 2007-02-12 2009-10-28 Qifeng Yu A photogrammetric method using folding optic path transfer for an invisible target of three-dimensional position and posture
DE102010008985A1 (en) * 2010-02-24 2011-06-30 Deutsches Zentrum für Luft- und Raumfahrt e.V., 51147 Tracing device i.e. marker, for movement-tracing of finger limb in human hand, has reflection elements arranged at distance to each other to determine position and orientation of human limb to be traced in simultaneous manner
CN102243064A (en) * 2011-04-22 2011-11-16 深圳迪乐普数码科技有限公司 Determination system and method for position and posture of video camera in virtual studio
US8446473B2 (en) 2004-10-05 2013-05-21 Brainlab Ag Tracking system with scattering effect utilization, in particular with star effect and/or cross effect utilization
DE102013206569A1 (en) * 2013-04-12 2014-10-16 Siemens Aktiengesellschaft Gesture control with automated calibration
DE102005011432B4 (en) * 2005-03-12 2019-03-21 Volkswagen Ag Data glove

Families Citing this family (1)

Publication number Priority date Publication date Assignee Title
CN100346133C (en) * 2000-03-23 2007-10-31 捷装技术公司 Self-calibrating, multi-camera machine vision measuring system

Citations (5)

Publication number Priority date Publication date Assignee Title
US4675820A (en) * 1984-06-14 1987-06-23 Sundstrand Data Control, Inc. Inertial reference system
DE4041723A1 (en) * 1990-12-24 1992-06-25 Thiedig Ullrich METHOD AND DEVICE FOR DETERMINING THE POSITION OF A MEASURING POINT RELATIVE TO A REFERENCE POINT
US5227985A (en) * 1991-08-19 1993-07-13 University Of Maryland Computer vision system for position monitoring in three dimensions using non-coplanar light sources attached to a monitored object
WO1997011386A1 (en) * 1995-09-21 1997-03-27 Omniplanar, Inc. Method and apparatus for determining position and orientation
WO1998054593A1 (en) * 1997-05-30 1998-12-03 British Broadcasting Corporation Position determination


Cited By (24)

Publication number Priority date Publication date Assignee Title
WO2001071280A3 (en) * 2000-03-23 2002-02-21 Snap On Tech Inc Self-calibrating, multi-camera machine vision measuring system
WO2001071280A2 (en) * 2000-03-23 2001-09-27 Snap-On Technologies, Inc. Self-calibrating, multi-camera machine vision measuring system
US6959253B2 (en) 2000-03-23 2005-10-25 Snap-On Incorporated Self-calibrating, multi-camera machine vision measuring system
US6931340B2 (en) 2000-05-22 2005-08-16 Snap-On Incorporated Self-calibrating, multi-camera machine vision measuring system
US6968282B1 (en) 2000-05-22 2005-11-22 Snap-On Incorporated Self-calibrating, multi-camera machine vision measuring system
WO2002103286A1 (en) * 2001-06-15 2002-12-27 Snap-On Technologies, Inc. Self-calibrating position determination system
US6839972B2 (en) 2001-06-15 2005-01-11 Snap-On Incorporated Self-calibrating position determination system
CN100408974C (en) * 2001-06-28 2008-08-06 斯耐普昂技术有限公司 Self calibrating position determination system and user interface
US7062861B2 (en) 2001-06-28 2006-06-20 Snap-On Incorporated Self-calibrating position determination system and user interface
WO2003002937A1 (en) * 2001-06-28 2003-01-09 Snap-On Technologies, Inc. Self-calibrating position determination system and user interface
EP1335181A1 (en) * 2002-02-04 2003-08-13 CORGHI S.p.A. Device for measuring the characteristic attitude parameters of a vehicle
US6842238B2 (en) 2002-02-04 2005-01-11 Corghi S.P.A. Device for measuring the parameters of a vehicle characteristic attitude
EP1455253A2 (en) * 2003-01-29 2004-09-08 Centurfax Limited Absolute positioning system
EP1455253A3 (en) * 2003-01-29 2009-07-29 Centurfax Limited Absolute positioning system
US7444178B2 (en) 2004-10-05 2008-10-28 Brainlab Ag Positional marker system with point light sources
US8446473B2 (en) 2004-10-05 2013-05-21 Brainlab Ag Tracking system with scattering effect utilization, in particular with star effect and/or cross effect utilization
DE102005011432B4 (en) * 2005-03-12 2019-03-21 Volkswagen Ag Data glove
EP2112470A4 (en) * 2007-02-12 2014-05-21 Qifeng Yu A photogrammetric method using folding optic path transfer for an invisible target of three-dimensional position and posture
EP2112470A1 (en) * 2007-02-12 2009-10-28 Qifeng Yu A photogrammetric method using folding optic path transfer for an invisible target of three-dimensional position and posture
DE102010008985A1 (en) * 2010-02-24 2011-06-30 Deutsches Zentrum für Luft- und Raumfahrt e.V., 51147 Tracing device i.e. marker, for movement-tracing of finger limb in human hand, has reflection elements arranged at distance to each other to determine position and orientation of human limb to be traced in simultaneous manner
CN102243064A (en) * 2011-04-22 2011-11-16 深圳迪乐普数码科技有限公司 Determination system and method for position and posture of video camera in virtual studio
DE102013206569A1 (en) * 2013-04-12 2014-10-16 Siemens Aktiengesellschaft Gesture control with automated calibration
US9880670B2 (en) 2013-04-12 2018-01-30 Siemens Aktiengesellschaft Gesture control having automated calibration
DE102013206569B4 (en) 2013-04-12 2020-08-06 Siemens Healthcare Gmbh Gesture control with automated calibration

Also Published As

Publication number Publication date
SE9803102L (en) 2000-03-12
SE515134C2 (en) 2001-06-11
AU6236599A (en) 2000-04-03
SE9803102D0 (en) 1998-09-11

Similar Documents

Publication Publication Date Title
CN109115126B (en) Method for calibrating a triangulation sensor, control and processing unit and storage medium
US7310154B2 (en) Shape measurement system
KR100871595B1 (en) A system for measuring flying information of globe-shaped object using the high speed camera
WO2000016121A1 (en) System relating to positioning in a virtual studio
CN106643699B (en) Space positioning device and positioning method in virtual reality system
CN101715055B (en) Imaging apparatus, and imaging method
US6549288B1 (en) Structured-light, triangulation-based three-dimensional digitizer
US7136170B2 (en) Method and device for determining the spatial co-ordinates of an object
CN110230983B (en) Vibration-resisting optical three-dimensional positioning method and device
US6304284B1 (en) Method of and apparatus for creating panoramic or surround images using a motion sensor equipped camera
US8222602B2 (en) System for producing enhanced thermal images
US20060146142A1 (en) Multi-view-point video capturing system
CN104067111B (en) For following the tracks of the automated systems and methods with the difference on monitoring objective object
EP1411371A1 (en) Surveying and position measuring instrument with a fan-shapped light beam
CN101715053A (en) Imaging apparatus, imaging method, and program
KR101308744B1 (en) System for drawing digital map
JP2003148919A (en) Device and method for detecting three-dimensional relative movement
US9062958B2 (en) Image sensor, attitude detector, contact probe, and multi-sensing probe
KR20010052283A (en) Control device and method of controlling an object
El-Sheimy et al. Kinematic positioning in three dimensions using CCD technology
JP4779491B2 (en) Multiple image composition method and imaging apparatus
JP5230354B2 (en) POSITIONING DEVICE AND CHANGED BUILDING DETECTION DEVICE
JP2007033315A (en) Three-dimensional object surveying system and surveying photography analysis system
JPH0337513A (en) Three-dimensional position/speed measuring apparatus
US20080252746A1 (en) Method and apparatus for a hybrid wide area tracking system

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AL AM AT AU AZ BA BB BG BR BY CA CH CN CR CU CZ DE DK DM EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW SD SL SZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase