US20120133584A1 - Apparatus and method for calibrating 3D position in 3D position and orientation tracking system - Google Patents

Apparatus and method for calibrating 3D position in 3D position and orientation tracking system Download PDF

Info

Publication number
US20120133584A1
US20120133584A1 US13/137,633 US201113137633A
Authority
US
United States
Prior art keywords
pointing
orientation
pointed
event
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/137,633
Inventor
Hyong Euk Lee
Won Chul BANG
Sang Hyun Kim
Jung Bae Kim
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Assigned to SAMSUNG ELECTRONICS CO., LTD. reassignment SAMSUNG ELECTRONICS CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BANG, WON CHUL, KIM, JUNG BAE, KIM, SANG HYUN, LEE, HYONG EUK
Publication of US20120133584A1 publication Critical patent/US20120133584A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry

Definitions

  • One or more example embodiments of the present disclosure relate to an apparatus and method for calibrating a three-dimensional (3D) position in a 3D position and orientation tracking system.
  • a technology for tracking a three-dimensional (3D) position and orientation of a moving object or target has typically been used for sensing the motion of an object, a body, an animal, and the like in a 3D space, using large, high-cost motion capture equipment in the movie, graphics, and animation industries, and the like.
  • a 3D position may be calibrated to track an accurate 3D position.
  • in the case of a touch panel and the like, where a target interface device operates by way of a touch on a two-dimensional (2D) plane, a position may be easily and accurately calibrated by touching a reference point.
  • a 3D position and orientation tracking system in a 3D space, however, may have difficulty in calibrating a position without a high-precision system providing separate reference information.
  • an apparatus for calibrating a three dimensional (3D) position of a remote device in a 3D position and orientation tracking system including a beam generating unit to generate at least two laser beams outputted in different orientations at predetermined angles, a position tracking unit to track a 3D position of the remote device in response to a detection of a pointing event, an orientation tracking unit to track a 3D orientation of the remote device in response to the detection of the pointing event, a pointing position acquiring unit to acquire positions pointed to by the at least two laser beams, in response to the detection of the pointing event, a reference position generating unit to generate a 3D reference position, based on information about the pointed to positions and the tracked 3D orientation, an error calculating unit to calculate an error using the reference position and the tracked 3D position, and a position calibrating unit to calibrate the 3D position tracked by the position tracking unit, using the error.
  • a beam generating unit to generate at least two laser beams outputted in different orientations at predetermined angles
  • a position tracking unit to track a 3D position of the remote device in response to a detection of a pointing event
  • the orientation tracking unit may track a trajectory of the 3D orientation of the remote device in response to the detection of the pointing event
  • the pointing position acquiring unit may acquire a trajectory pointed to by the at least two laser beams, in response to the detection of the pointing event
  • the reference position generating unit may generate the 3D reference position, using the pointed trajectory and the tracked trajectory of the 3D orientation.
  • the pointing event may correspond to an event simultaneously pointing to points corresponding to a number of the laser beams using the at least two laser beams.
  • the pointing event may correspond to an event that points to, using the at least two laser beams, straight lines corresponding to a number of the laser beams along the straight lines.
  • the pointing position acquiring unit may receive a position of reference points from a display device displaying the reference points, and acquire the position of the reference points as the pointed to positions.
  • the pointing position acquiring unit may acquire the pointed to positions from a sensor measuring a position pointed to by a laser beam.
  • an apparatus for calibrating a 3D position of a remote device in a 3D position and orientation tracking system including a beam generating unit to generate at least one laser beam, an event detecting unit to detect at least three instances of pointing events, a position tracking unit to track a 3D position of the remote device, in response to the detection of the pointing events, an orientation tracking unit to track a 3D orientation of the remote device, in response to the detection of the pointing events, a pointing position acquiring unit to acquire a position pointed to by the at least one laser beam, in response to the detection of the pointing events, a reference position generating unit to generate a 3D reference position, based on information about the pointed to positions and the tracked 3D orientation, an error calculating unit to calculate an error using the reference position and the tracked 3D position, and a position calibrating unit to calibrate the 3D position tracked by the position tracking unit, using the error.
  • the orientation tracking unit may track a trajectory of the 3D orientation of the remote device in response to the detection of the pointing events
  • the pointing position acquiring unit may acquire a pointed to trajectory, in response to the detection of the pointing events
  • the reference position generating unit may generate the 3D reference position, using the pointed to trajectory and the tracked trajectory of the 3D orientation.
  • the pointing events may correspond to events that point to points displayed on a display device using the at least one laser beam.
  • the pointing events may correspond to events that point to, using the at least one laser beam, a straight line outputted from a display device along the straight line.
  • the pointing position acquiring unit may receive a position of a reference point from a display device displaying the reference point, and acquire the position of the reference point as the pointed to position.
  • the pointing position acquiring unit may acquire the pointed to position from a sensor measuring a position pointed to by a laser beam.
  • a method of calibrating a 3D position of a remote device in a 3D position and orientation tracking system including tracking a 3D position of the remote device in response to a detection of a pointing event, tracking a 3D orientation of the remote device in response to the detection of the pointing event, acquiring positions pointed to by the at least two laser beams outputted in different orientations at predetermined angles, in response to the detection of the pointing event, generating a 3D reference position, based on information about the pointed to positions and the tracked 3D orientation, calculating an error using the reference position and the tracked 3D position, and calibrating the 3D position to be tracked, using the error.
  • the pointing event may correspond to an event simultaneously pointing to points corresponding to a number of the laser beams using the at least two laser beams.
  • the pointing event may correspond to an event that points to, using the at least two laser beams, straight lines corresponding to a number of the laser beams along the straight lines.
  • a method of calibrating a 3D position of a remote device in a 3D position and orientation tracking system including tracking a 3D position of the remote device in response to a detection of a pointing event, tracking a 3D orientation of the remote device in response to the detection of the pointing event, acquiring a position pointed to by a laser beam, in response to the detection of the pointing event, repeating the acquiring of the pointed to position at least three times in the tracking of the 3D position of the remote device, generating a 3D reference position, based on information about the pointed to position and the tracked 3D orientation, calculating an error using the 3D reference position and the tracked 3D position, and calibrating the 3D position to be tracked, using the error.
  • the pointing event may correspond to an event pointing to a point displayed on a display device using the laser beam.
  • the pointing event may correspond to an event that points to, using the laser beam, a straight line outputted from a display device along the straight line.
  • FIG. 1 illustrates a configuration of an apparatus for calibrating a three-dimensional (3D) position in a 3D position and orientation tracking system according to example embodiments
  • FIG. 2 illustrates an example of performing a pointing using a single laser beam to calibrate a 3D position
  • FIG. 3 illustrates an example of performing a pointing using two laser beams to calibrate a 3D position
  • FIGS. 4A, 4B, and 4C illustrate an example of a point or a straight line pointed to by two laser beams during a pointing event for calibrating a position
  • FIG. 5 illustrates an operational flowchart for calibrating a 3D position in a 3D position and orientation tracking system according to example embodiments.
  • FIG. 1 illustrates a configuration of an apparatus 100 for calibrating a three-dimensional (3D) position in a 3D position and orientation tracking system according to example embodiments.
  • the apparatus 100 for calibrating a 3D position may include, for example, a control unit 110, a beam generating unit 120, an event detecting unit 130, a position tracking unit 140, an orientation tracking unit 150, an initial calibrating unit 160, and a position calibrating unit 170.
  • the beam generating unit 120 may generate a laser beam.
  • the beam generating unit 120 may generate a single laser beam, or at least two laser beams outputted in different orientations at predetermined angles.
  • the laser beam may correspond to a light source having a characteristic of straightness.
  • the beam generating unit 120 may include at least one laser such as a laser pointer or a laser diode to generate the laser beam.
  • the event detecting unit 130 may detect a pointing event.
  • the event detecting unit 130 may correspond to an input device and may detect an input by a user. The user may point the laser beam at the intended position and may then generate, through the input, an event reporting the pointing.
  • the position tracking unit 140 may track a 3D position of a remote device.
  • the position tracking unit 140 may track the 3D position of the remote device using a camera, using an infrared light, using an inertial sensor, or using an attenuation characteristic of a light-emitting and receiving signal according to a directivity of an infrared signal.
  • the apparatus 100 for calibrating a 3D position may be disposed inside the remote device or may be disposed outside the remote device.
  • the orientation tracking unit 150 may track a 3D orientation of the remote device.
  • the orientation tracking unit 150 may track the 3D orientation through the inertial sensor.
  • the inertial sensor may be configured as a combination including at least one of an acceleration sensor, a gyro sensor, and a geomagnetic sensor.
  • the tracking of the orientation through the inertial sensor may be appropriate for the orientation tracking used in calibrating an initial position.
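the sensor combination above is commonly fused into a single orientation estimate. The sketch below shows one conventional fusion approach (a complementary filter over roll and pitch) as an illustrative assumption only; the patent does not specify this particular filter:

```python
import math

def complementary_filter(roll, pitch, gyro_xy, accel, dt, alpha=0.98):
    """One update step fusing gyro integration (short-term accuracy) with
    accelerometer gravity tilt (long-term drift correction).
    gyro_xy: (wx, wy) angular rates in rad/s; accel: (ax, ay, az) in m/s^2.
    """
    wx, wy = gyro_xy
    ax, ay, az = accel
    # Tilt angles implied by the measured gravity vector
    roll_acc = math.atan2(ay, az)
    pitch_acc = math.atan2(-ax, math.hypot(ay, az))
    # Blend the integrated gyro estimate with the accelerometer estimate
    roll = alpha * (roll + wx * dt) + (1 - alpha) * roll_acc
    pitch = alpha * (pitch + wy * dt) + (1 - alpha) * pitch_acc
    return roll, pitch
```

a geomagnetic sensor would correct yaw drift in the same blended fashion; it is omitted here to keep the sketch minimal.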
  • the initial calibrating unit 160 may calculate an error of a current 3D position based on information about a pointed to position, the 3D position, and the 3D orientation.
  • the initial calibrating unit 160 may include a pointing position acquiring unit 162, a reference position generating unit 164, and an error calculating unit 166.
  • the initial calibrating unit 160 may operate differently depending on a number of laser beams outputted from the beam generating unit 120 .
  • FIG. 2 illustrates an example of performing a pointing event using a single laser beam to calibrate a 3D position.
  • the apparatus 100 for calibrating a 3D position may be disposed inside the remote device.
  • the remote device may perform a pointing event by outputting a laser beam while changing an orientation at the same position to calibrate a 3D position.
  • the pointing position acquiring unit 162 may acquire positions of reference points 210, 220, 230, and 240 from a display device 200 each time the reference points 210, 220, 230, and 240 are sequentially pointed to through the display device 200.
  • the pointing position acquiring unit 162 may acquire the positions of the reference points 210, 220, 230, and 240, corresponding to the pointed to positions, from the display device 200, or from a sensor for measuring a position pointed to by the laser beam.
  • the sensor measuring the laser beam may be configured to be included in a display panel of the display device 200 , although the sensor may alternatively be located elsewhere.
  • the reference position generating unit 164 may generate a 3D reference position based on information about the positions of the reference points 210, 220, 230, and 240, as well as a 3D orientation tracked during a pointing event.
  • the error calculating unit 166 may calculate an error using a difference between a reference position generated by the reference position generating unit 164 and a 3D position tracked by the position tracking unit 140.
  • FIG. 3 illustrates an example of performing a pointing event using two laser beams to calibrate a 3D position.
  • the apparatus 100 for calibrating a 3D position may be disposed inside the remote device, although the apparatus 100 may alternatively be located elsewhere.
  • the pointing position acquiring unit 162 may acquire positions P1 and P2 pointed to by laser beams, in response to a detection of a pointing event.
  • the pointing position acquiring unit 162 may acquire the pointed to positions P1 and P2 from a display device 300, or from a sensor for measuring a position pointed to by the laser beams.
  • the reference position generating unit 164 may generate a 3D reference position based on information about locations of the pointed to positions P1 and P2 and a 3D orientation tracked during the pointing event.
  • a relationship between the two laser beams outputted in different orientations at predetermined angles and the 3D orientation may be expressed by the following Equation 1:

        \vec{a} = R\,\vec{a}_0, \qquad \vec{b} = R\,\vec{b}_0   [Equation 1]

  • here, \vec{a} corresponds to a unit vector in the direction of the first laser beam transformed by a rotation matrix, and \vec{b} corresponds to a unit vector in the direction of the second laser beam transformed by the rotation matrix.
  • \vec{a}_0 and \vec{b}_0 correspond to reference values, namely, the unit vectors in the directions of the first and second laser beams before the transformation by the rotation matrix, and R corresponds to the rotation transformation matrix.
  • R may be expressed by the following Equation 2, a composition of rotations about the three axes:

        R = R_z(\psi)\,R_y(\theta)\,R_x(\phi)   [Equation 2]

  • here, \phi corresponds to a rotation angle of a roll, \theta corresponds to a rotation angle of a pitch, and \psi corresponds to a rotation angle of a yaw.
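as a worked sketch of Equations 1 and 2, the snippet below (illustrative NumPy code, not part of the patent; the beam directions and rotation angles are assumed values) builds the ZYX roll/pitch/yaw rotation matrix and transforms two reference beam directions:

```python
import numpy as np

def rotation_matrix(roll: float, pitch: float, yaw: float) -> np.ndarray:
    """R = Rz(yaw) @ Ry(pitch) @ Rx(roll), the ZYX Euler convention (Equation 2)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

# Reference beam directions before rotation (Equation 1): two unit vectors
# separated by a predetermined angle (values here are illustrative).
a0 = np.array([0.0, 0.0, 1.0])                  # first beam along +z
b0 = np.array([0.0, np.sin(0.1), np.cos(0.1)])  # second beam tilted by 0.1 rad

R = rotation_matrix(roll=0.0, pitch=0.2, yaw=0.1)
a = R @ a0  # transformed first-beam direction
b = R @ b0  # transformed second-beam direction
```

since R is orthonormal, the transformed directions remain unit vectors, as Equation 1 requires.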
  • a relationship among the 3D position of the remote device, the pointed to positions P1 and P2, and the transformed beam directions may be expressed by the following Equation 3. It should be noted that even though values on the y-axis of the pointed to positions P1 and P2 are set to "0" for convenience, the values are not limited thereto:

        P_1 = p + s\,\vec{a}, \qquad P_2 = p + t\,\vec{b}   [Equation 3]

  • here, p corresponds to the 3D position of the remote device, and s and t correspond to scalar distances from the remote device to the pointed to positions along the respective beams. Solving Equation 3 for p using the known P1, P2, \vec{a}, and \vec{b} yields the reference position, as shown in the following Equation 4:

        p_{ref} = P_1 - s\,\vec{a} = P_2 - t\,\vec{b}   [Equation 4]
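generating the reference position can be sketched as a small least-squares problem: with pointed positions P1 and P2 and rotated unit beam directions a and b, solve p + s·a = P1 and p + t·b = P2 for the device position p. The code below is an illustrative NumPy sketch under that assumed geometry, not the patent's implementation:

```python
import numpy as np

def reference_position(p1, p2, a, b):
    """Least-squares estimate of the device position p from two pointed
    positions and the (rotated) unit beam directions a and b.

    Unknowns: p (3 components) plus the beam distances s and t.
    Equations:  p + s*a = p1  and  p + t*b = p2  (six scalar equations).
    """
    p1, p2, a, b = map(np.asarray, (p1, p2, a, b))
    # Build the 6x5 linear system M @ [px, py, pz, s, t] = [p1, p2]
    M = np.zeros((6, 5))
    M[:3, :3] = np.eye(3); M[:3, 3] = a
    M[3:, :3] = np.eye(3); M[3:, 4] = b
    rhs = np.concatenate([p1, p2])
    sol, *_ = np.linalg.lstsq(M, rhs, rcond=None)
    return sol[:3]  # estimated 3D reference position
```

with two non-parallel beams the system has full column rank, so the least-squares solution is unique; noisy measurements are absorbed in the residual.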
  • the error calculating unit 166 may calculate an error using a difference between the reference position generated by the reference position generating unit 164 and the 3D position tracked through the position tracking unit 140.
  • the pointing event may occur as illustrated in FIGS. 4A, 4B, and 4C.
  • FIGS. 4A, 4B, and 4C illustrate an example of a point or a straight line pointed to by two laser beams during a pointing event for calibrating a position.
  • the pointing event may occur in a straight line as illustrated in FIG. 4B or FIG. 4C.
  • the orientation tracking unit 150 may track a trajectory of a 3D orientation of the remote device.
  • the pointing position acquiring unit 162 may acquire a trajectory pointed to by way of the laser beams.
  • the reference position generating unit 164 may generate a 3D reference position using pointed to trajectories and tracked trajectories of a 3D orientation.
  • the position calibrating unit 170 may calibrate a 3D position tracked through the position tracking unit 140 in real time, using an error calculated through the error calculating unit 166.
  • the position calibrating unit 170 may perform an offset calibration offsetting an error or may perform a scale factor calibration using an error.
  • the offset calibration may be expressed by the following Equation 5:

        \hat{x} = x - \Delta x, \quad \hat{y} = y - \Delta y, \quad \hat{z} = z - \Delta z   [Equation 5]

  • here, (\hat{x}, \hat{y}, \hat{z}) corresponds to a calibrated 3D position value, (x, y, z) corresponds to a 3D position value before a calibration, and (\Delta x, \Delta y, \Delta z) corresponds to an error value.
  • the scale factor calibration may be expressed by the following Equation 6:

        \hat{x} = \frac{x_0 - \Delta x_0}{x_0}\,x, \quad \hat{y} = \frac{y_0 - \Delta y_0}{y_0}\,y, \quad \hat{z} = \frac{z_0 - \Delta z_0}{z_0}\,z   [Equation 6]

  • here, (\hat{x}, \hat{y}, \hat{z}) corresponds to a calibrated 3D position value, (x, y, z) corresponds to a 3D position value before a calibration, (x_0, y_0, z_0) corresponds to the 3D position value tracked during the pointing event to generate the reference position, and (\Delta x_0, \Delta y_0, \Delta z_0) corresponds to an error value calculated by a difference between (x_0, y_0, z_0) and the reference position.
  • the control unit 110 may control an overall operation of the apparatus 100.
  • the control unit 110 may perform a function of the pointing position acquiring unit 162, the reference position generating unit 164, the error calculating unit 166, and the position calibrating unit 170.
  • the control unit 110, the pointing position acquiring unit 162, the reference position generating unit 164, the error calculating unit 166, and the position calibrating unit 170 are separately illustrated so as to separately describe each function.
  • the control unit 110 may include at least one processor configured to perform each function of the pointing position acquiring unit 162, the reference position generating unit 164, the error calculating unit 166, and the position calibrating unit 170.
  • alternatively, the control unit 110 may include at least one processor configured to perform at least a portion of each function of the pointing position acquiring unit 162, the reference position generating unit 164, the error calculating unit 166, and the position calibrating unit 170.
  • FIG. 5 illustrates an operational flowchart for calibrating a 3D position in a 3D position and orientation tracking system according to example embodiments.
  • the apparatus 100 may acquire positions pointed to by the at least two laser beams outputted in different orientations at predetermined angles in operation 512 .
  • the pointing event may correspond to an event simultaneously pointing to points corresponding to a number of the laser beams using the at least two laser beams.
  • the pointing event may correspond to an event that points to, using the at least two laser beams, straight lines corresponding to the number of the laser beams along the straight lines.
  • the apparatus 100 may track a 3D position of a remote device in response to a detection of the pointing event.
  • the apparatus 100 may track a 3D orientation of the remote device in response to a detection of the pointing event.
  • the apparatus 100 may verify whether the pointing event is terminated.
  • the pointing event may be terminated when sufficient information for calibrating the 3D position is acquired. Whether the pointing event is terminated may be verified by determining whether the pointing event is performed a predetermined number of times.
  • the apparatus 100 may generate a 3D reference position, based on information about pointed to positions and the tracked 3D orientation.
  • the apparatus 100 may calculate an error using the reference position and the tracked 3D position.
  • the apparatus 100 may calibrate the 3D position to be tracked, using the error.
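the flow of FIG. 5 can be sketched as a simple collect-then-correct loop. The structure below is hypothetical: the event tuples and the `reference_position_fn` callback are placeholders, and only operation 512 is named in the source. It returns an offset calibration in the sense of Equation 5:

```python
import numpy as np

def run_calibration(events, reference_position_fn):
    """Hypothetical sketch of the FIG. 5 flow: accumulate samples while the
    pointing event repeats, then generate a reference position, compute the
    error, and return an offset-correcting function.

    `events` yields (pointed_positions, tracked_position, tracked_orientation)
    tuples; `reference_position_fn` stands in for the reference position
    generating step. Both are placeholders for illustration.
    """
    pointed, positions, orientations = [], [], []
    for pointed_positions, tracked_pos, tracked_ori in events:
        pointed.append(pointed_positions)          # acquire pointed positions (operation 512)
        positions.append(np.asarray(tracked_pos))  # track the 3D position
        orientations.append(tracked_ori)           # track the 3D orientation
    # Pointing event terminated: sufficient information has been gathered.
    p_ref = reference_position_fn(pointed, orientations)  # generate 3D reference position
    p_tracked = np.mean(positions, axis=0)
    error = p_tracked - p_ref                             # error = tracked - reference
    return lambda p: np.asarray(p) - error                # offset calibration of later positions
```

averaging the tracked positions assumes the device is held still during the pointing event; a real implementation would reject outlier samples.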
  • the aforementioned example embodiments relate to an apparatus and method for calibrating a 3D position in a 3D position and orientation tracking system, and may enable a stable sensing function by calibrating an error of the 3D position occurring due to a function deviation between a sensor and a device in mass production or in a process of manufacturing a product.
  • the methods according to the above-described example embodiments may be recorded in non-transitory computer-readable media or processor-readable media including program instructions to implement various operations embodied by a computer or processor.
  • the media may also include, alone or in combination with the program instructions, data files, data structures, and the like.
  • the program instructions recorded on the media may be those specially designed and constructed for the purposes of the example embodiments, or they may be of the kind well-known and available to those having skill in the computer software arts.
  • non-transitory computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD ROM disks and DVDs; magneto-optical media such as optical disks; and hardware devices that are specially configured to store and perform program instructions, such as read-only memory (ROM), random access memory (RAM), flash memory, and the like.
  • Examples of program instructions include both machine code, such as produced by a compiler, and files containing higher level code that may be executed by the computer or processor using an interpreter.
  • the described hardware devices may be configured to act as one or more software modules in order to perform the operations of the above-described example embodiments, or vice versa. Any one or more of the software modules described herein may be executed by a dedicated processor unique to that unit or by a processor common to one or more of the modules.
  • the described methods may be executed on a general purpose computer or processor or may be executed on a particular machine such as the apparatus for calibrating a 3D position of a remote device in a 3D position and orientation tracking system described herein.

Abstract

An apparatus and method for calibrating a 3D position in a 3D position and orientation tracking system are provided. The apparatus according to an embodiment may track the 3D position and the 3D orientation of a remote device in response to a detection of a pointing event, may acquire positions pointed to by one or more laser beams, in response to the detection of the pointing event, may generate a 3D reference position, based on information about the pointed to positions and the tracked 3D orientation, may calculate an error using the reference position and the tracked 3D position, and may calibrate the 3D position to be tracked, using the error.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application claims the benefit of Korean Patent Application No. 10-2010-0120270, filed on Nov. 30, 2010, in the Korean Intellectual Property Office, the disclosure of which is incorporated herein by reference.
  • BACKGROUND
  • 1. Field
  • One or more example embodiments of the present disclosure relate to an apparatus and method for calibrating a three-dimensional (3D) position in a 3D position and orientation tracking system.
  • 2. Description of the Related Art
  • A technology for tracking a three-dimensional (3D) position and orientation of a moving object or target has typically been used for sensing the motion of an object, a body, an animal, and the like in a 3D space, using large, high-cost motion capture equipment in the movie, graphics, and animation industries, and the like.
  • However, a motion sensing technology for consumer electronics related to the gaming industry has started to command attention, leading to a development of a number of 3D position and orientation tracking schemes using small-sized motion capture at a low cost.
  • A 3D position may be calibrated to track an accurate 3D position.
  • In the case of a touch panel and the like where a target interface device operates by way of a touch on a two-dimensional (2D) plane, a position may be easily and accurately calibrated by touching a reference point. However, a 3D position and orientation tracking system in a 3D space may have difficulty in calibrating a position without a high precision system providing separate reference information.
  • SUMMARY
  • The foregoing and/or other aspects are achieved by providing an apparatus for calibrating a three dimensional (3D) position of a remote device in a 3D position and orientation tracking system, the apparatus including a beam generating unit to generate at least two laser beams outputted in different orientations at predetermined angles, a position tracking unit to track a 3D position of the remote device in response to a detection of a pointing event, an orientation tracking unit to track a 3D orientation of the remote device in response to the detection of the pointing event, a pointing position acquiring unit to acquire positions pointed to by the at least two laser beams, in response to the detection of the pointing event, a reference position generating unit to generate a 3D reference position, based on information about the pointed to positions and the tracked 3D orientation, an error calculating unit to calculate an error using the reference position and the tracked 3D position, and a position calibrating unit to calibrate the 3D position tracked by the position tracking unit, using the error.
  • The orientation tracking unit may track a trajectory of the 3D orientation of the remote device in response to the detection of the pointing event, the pointing position acquiring unit may acquire a trajectory pointed to by the at least two laser beams, in response to the detection of the pointing event, and the reference position generating unit may generate the 3D reference position, using the pointed trajectory and the tracked trajectory of the 3D orientation.
  • The pointing event may correspond to an event simultaneously pointing to points corresponding to a number of the laser beams using the at least two laser beams.
  • The pointing event may correspond to an event that points to, using the at least two laser beams, straight lines corresponding to a number of the laser beams along the straight lines.
  • The pointing position acquiring unit may receive a position of reference points from a display device displaying the reference points, and acquire the position of the reference points as the pointed to positions.
  • The pointing position acquiring unit may acquire the pointed to positions from a sensor measuring a position pointed to by a laser beam.
  • The foregoing and/or other aspects are achieved by providing an apparatus for calibrating a 3D position of a remote device in a 3D position and orientation tracking system, the apparatus including a beam generating unit to generate at least one laser beam, an event detecting unit to detect at least three instances of pointing events, a position tracking unit to track a 3D position of the remote device, in response to the detection of the pointing events, an orientation tracking unit to track a 3D orientation of the remote device, in response to the detection of the pointing events, a pointing position acquiring unit to acquire a position pointed to by the at least one laser beam, in response to the detection of the pointing events, a reference position generating unit to generate a 3D reference position, based on information about the pointed to positions and the tracked 3D orientation, an error calculating unit to calculate an error using the reference position and the tracked 3D position, and a position calibrating unit to calibrate the 3D position tracked by the position tracking unit, using the error.
  • The orientation tracking unit may track a trajectory of the 3D orientation of the remote device in response to the detection of the pointing events, the pointing position acquiring unit may acquire a pointed to trajectory, in response to the detection of the pointing events, and the reference position generating unit may generate the 3D reference position, using the pointed to trajectory and the tracked trajectory of the 3D orientation.
  • The pointing events may correspond to events that point to points displayed on a display device using the at least one laser beam.
  • The pointing events may correspond to events that point to, using the at least one laser beam, a straight line outputted from a display device along the straight line.
  • The pointing position acquiring unit may receive a position of a reference point from a display device displaying the reference point, and acquire the position of the reference point as the pointed to position.
  • The pointing position acquiring unit may acquire the pointed to position from a sensor measuring a position pointed to by a laser beam.
  • The foregoing and/or other aspects are achieved by providing a method of calibrating a 3D position of a remote device in a 3D position and orientation tracking system, the method including tracking a 3D position of the remote device in response to a detection of a pointing event, tracking a 3D orientation of the remote device in response to the detection of the pointing event, acquiring positions pointed to by the at least two laser beams outputted in different orientations at predetermined angles, in response to the detection of the pointing event, generating a 3D reference position, based on information about the pointed to positions and the tracked 3D orientation, calculating an error using the reference position and the tracked 3D position, and calibrating the 3D position to be tracked, using the error.
  • The pointing event may correspond to an event simultaneously pointing to points corresponding to a number of the laser beams using the at least two laser beams.
  • The pointing event may correspond to an event that points to, using the at least two laser beams, straight lines corresponding to a number of the laser beams along the straight lines.
  • The foregoing and/or other aspects are achieved by providing a method of calibrating a 3D position of a remote device in a 3D position and orientation tracking system, the method including tracking a 3D position of the remote device in response to a detection of a pointing event, tracking a 3D orientation of the remote device in response to the detection of the pointing event, acquiring a position pointed to by a laser beam, in response to the detection of the pointing event, repeating the acquiring of the pointed to position at least three times in the tracking of the 3D position of the remote device, generating a 3D reference position, based on information about the pointed to position and the tracked 3D orientation, calculating an error using the 3D reference position and the tracked 3D position, and calibrating the 3D position to be tracked, using the error.
  • The pointing event may correspond to an event pointing to a point displayed on a display device using the laser beam.
  • The pointing event may correspond to an event that points to, using the laser beam, a straight line outputted from a display device along the straight line.
  • Additional aspects of embodiments will be set forth in part in the description which follows and, in part, will be apparent from the description, or may be learned by practice of the disclosure.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • These and/or other aspects will become apparent and more readily appreciated from the following description of example embodiments, taken in conjunction with the accompanying drawings of which:
  • FIG. 1 illustrates a configuration of an apparatus for calibrating a three-dimensional (3D) position in a 3D position and orientation tracking system according to example embodiments;
  • FIG. 2 illustrates an example of performing a pointing using a single laser beam to calibrate a 3D position;
  • FIG. 3 illustrates an example of performing a pointing using two laser beams to calibrate a 3D position;
  • FIGS. 4A, 4B, and 4C illustrate an example of a point or a straight line pointed to by two laser beams during a pointing event for calibrating a position; and
  • FIG. 5 illustrates an operational flowchart for calibrating a 3D position in a 3D position and orientation tracking system according to example embodiments.
  • DETAILED DESCRIPTION
  • Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the like elements throughout. Embodiments are described below to explain the present disclosure by referring to the figures.
  • FIG. 1 illustrates a configuration of an apparatus 100 for calibrating a three-dimensional (3D) position in a 3D position and orientation tracking system according to example embodiments.
  • Referring to FIG. 1, the apparatus 100 for calibrating a 3D position may include, for example, a control unit 110, a beam generating unit 120, an event detecting unit 130, a position tracking unit 140, an orientation tracking unit 150, an initial calibrating unit 160, and a position calibrating unit 170.
  • The beam generating unit 120 may generate a laser beam. The beam generating unit 120 may generate a single laser beam, or at least two laser beams outputted in different orientations at predetermined angles. Note that the laser beam may correspond to a light source having a characteristic of straightness. The beam generating unit 120 may include at least one laser, such as a laser pointer or a laser diode, to generate the laser beam.
  • The event detecting unit 130 may detect a pointing event. The event detecting unit 130 may correspond to an input device and may detect an input by a user. The user may point the laser beam at an accurate position and may thereby generate, through the input, an event reporting the pointing.
  • The position tracking unit 140 may track a 3D position of a remote device. The position tracking unit 140 may track the 3D position of the remote device using a camera, using an infrared light, using an inertial sensor, or using an attenuation characteristic of a light-emitting and receiving signal according to a directivity of an infrared signal. Here, the apparatus 100 for calibrating a 3D position may be disposed inside the remote device or may be disposed outside the remote device.
  • The orientation tracking unit 150 may track a 3D orientation of the remote device.
  • The orientation tracking unit 150 may track the 3D orientation through the inertial sensor. The inertial sensor may be configured as a combination including at least one of an acceleration sensor, a gyro sensor, and a geomagnetic sensor. Thus, the tracking of the orientation through the inertial sensor may be appropriate for an orientation tracking used to calibrate an initial position.
  • The initial calibrating unit 160 may calculate an error of a current 3D position based on information about a pointed to position, the 3D position, and the 3D orientation. The initial calibrating unit 160 may include a pointing position acquiring unit 162, a reference position generating unit 164, and an error calculating unit 166. The initial calibrating unit 160 may operate differently depending on a number of laser beams outputted from the beam generating unit 120.
  • A case where the number of laser beams outputted from the beam generating unit 120 corresponds to one will be described with reference to FIG. 2. FIG. 2 illustrates an example of performing a pointing event using a single laser beam to calibrate a 3D position. In FIG. 2, the apparatus 100 for calibrating a 3D position may be disposed inside the remote device.
  • Where the number of laser beams outputted from the beam generating unit 120 corresponds to one, the remote device may perform a pointing event by outputting a laser beam while changing an orientation at the same position to calibrate a 3D position.
  • Referring to FIG. 2, the pointing position acquiring unit 162 (not shown) may acquire positions of reference points 210, 220, 230, and 240 from a display device 200 each time the reference points 210, 220, 230, and 240 are sequentially pointed to through the display device 200.
  • The pointing position acquiring unit 162 may acquire the positions of the reference points 210, 220, 230, and 240 corresponding to positions pointed to from the display device 200, or from a sensor for measuring a position pointed to by the laser beam. Here, the sensor measuring the laser beam may be configured to be included in a display panel of the display device 200, although the sensor may alternatively be located elsewhere.
  • The reference position generating unit 164 may generate a 3D reference position based on information about the positions of the reference points 210, 220, 230, and 240, as well as a 3D orientation tracked during a pointing event.
  • The error calculating unit 166 may calculate an error using a difference between a reference position generated by the reference position generating unit 164 and a 3D position tracked by the position tracking unit 140.
  • Next, a case in which the number of laser beams outputted from the beam generating unit 120 corresponds to at least two will be described with reference to FIG. 3. FIG. 3 illustrates an example of performing a pointing event using two laser beams to calibrate a 3D position. In FIG. 3, the apparatus 100 for calibrating a 3D position may be disposed inside the remote device, although the apparatus 100 may alternatively be located elsewhere.
  • The pointing position acquiring unit 162 may acquire positions P1 and P2 pointed to by laser beams, in response to a detection of a pointing event. The pointing position acquiring unit 162 may acquire the pointed to positions P1 and P2 from a display device 300, or from a sensor for measuring a position pointed to by the laser beams.
  • The reference position generating unit 164 may generate a 3D reference position based on information about locations of the pointed to positions P1 and P2 and a 3D orientation tracked during the pointing event.
  • A relationship between two laser beams outputted in different orientations at predetermined angles and the 3D orientation may be expressed by the following Equation 1.

  • $\vec{a} = R \cdot \vec{a}_0$
  • $\vec{b} = R \cdot \vec{b}_0$  [Equation 1]
  • Here, $\vec{a}$ corresponds to a unit vector in the direction of the first laser beam transformed by the rotation matrix, $\vec{b}$ corresponds to a unit vector in the direction of the second laser beam transformed by the rotation matrix, $\vec{a}_0$ corresponds to a reference value, namely a unit vector in the direction of the first laser beam before the transformation by the rotation matrix, $\vec{b}_0$ corresponds to a reference value, namely a unit vector in the direction of the second laser beam before the transformation by the rotation matrix, and $R$ corresponds to a rotation transformation matrix. $R$ may be expressed by the following Equation 2.
  • $R = \begin{bmatrix} C_y C_z & -C_y S_z & S_y \\ S_x S_y C_z + C_x S_z & -S_x S_y S_z + C_x C_z & -S_x C_y \\ -C_x S_y C_z + S_x S_z & C_x S_y S_z + S_x C_z & C_x C_y \end{bmatrix}$, where $C_x = \cos(\phi)$, $S_x = \sin(\phi)$, $C_y = \cos(\theta)$, $S_y = \sin(\theta)$, $C_z = \cos(\psi)$, $S_z = \sin(\psi)$.  [Equation 2]
  • Here, φ corresponds to a rotation angle of a roll, θ corresponds to a rotation angle of a pitch, and ψ corresponds to a rotation angle of a yaw.
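  • Equations 1 and 2 can be illustrated with a short numerical sketch. The sketch below is not part of the original disclosure; the function names are illustrative only, and $R$ is built as $R_x(\phi)\,R_y(\theta)\,R_z(\psi)$, which reproduces the matrix of Equation 2 term by term.

```python
import math

def rotation_matrix(roll, pitch, yaw):
    """Rotation transformation matrix R of Equation 2,
    i.e. R = Rx(roll) * Ry(pitch) * Rz(yaw), angles in radians."""
    cx, sx = math.cos(roll), math.sin(roll)
    cy, sy = math.cos(pitch), math.sin(pitch)
    cz, sz = math.cos(yaw), math.sin(yaw)
    return [
        [cy * cz,                  -cy * sz,                  sy],
        [sx * sy * cz + cx * sz,   -sx * sy * sz + cx * cz,   -sx * cy],
        [-cx * sy * cz + sx * sz,  cx * sy * sz + sx * cz,    cx * cy],
    ]

def rotate(R, v):
    """Equation 1: transform a reference beam direction v by R."""
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]
```

  • With roll, pitch, and yaw all zero, $R$ reduces to the identity, so the rotated beam directions equal the reference directions $\vec{a}_0$ and $\vec{b}_0$, as expected.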
  • When it is assumed that the pointed to position P1 corresponds to $(x_1, 0, z_1)$, the pointed to position P2 corresponds to $(x_2, 0, z_2)$, a reference position Pt to be calculated corresponds to $(x_t, y_t, z_t)$, a distance between the apparatus 100 and the position P1 corresponds to $d_1$, and a distance between the apparatus 100 and the position P2 corresponds to $d_2$, Equation 3 may be expressed as shown below. It should be noted that even though the y-axis values of the pointed to positions P1 and P2 are set to “0” for convenience, the values are not limited thereto.

  • $d_1 \cdot \vec{a} = (x_1 - x_t,\; -y_t,\; z_1 - z_t)$
  • $d_2 \cdot \vec{b} = (x_2 - x_t,\; -y_t,\; z_2 - z_t)$  [Equation 3]
  • When simultaneous equations are solved using Equation 3, the reference position may be generated as shown in the following Equation 4.

  • $x_t = x_1 - d_1 a_x$
  • $y_t = -d_1 a_y$
  • $z_t = z_1 - d_1 a_z$  [Equation 4]
  • The error calculating unit 166 may calculate an error using a difference between the reference position generated by the reference position generating unit 164 and the 3D position tracked through the position tracking unit 140.
  • Where the number of laser beams outputted from the beam generating unit 120 corresponds to at least two, the pointing event may occur as illustrated in FIGS. 4A, 4B, and 4C.
  • FIGS. 4A, 4B, and 4C illustrate an example of a point or a straight line pointed to by two laser beams during a pointing event for calibrating a position.
  • The pointing event may occur in a straight line as illustrated in FIG. 4B or FIG. 4C.
  • In a case where the pointing event occurs in a straight line, the orientation tracking unit 150 may track a trajectory of a 3D orientation of the remote device.
  • In a case where the pointing event occurs in a straight line, the pointing position acquiring unit 162 may acquire a trajectory pointed to by the laser beams.
  • In a case where the pointing event occurs in a straight line, the reference position generating unit 164 may generate a 3D reference position using pointed to trajectories and tracked trajectories of a 3D orientation.
  • Referring to FIG. 2, the position calibrating unit 170 may calibrate a 3D position tracked through the position tracking unit 140 in real time, using an error calculated through the error calculating unit 166. The position calibrating unit 170 may perform an offset calibration offsetting an error or may perform a scale factor calibration using an error.
  • The offset calibration may be expressed by Equation 5, and the scale factor calibration may be expressed by Equation 6.

  • $\hat{x} = x - \Delta x$
  • $\hat{y} = y - \Delta y$
  • $\hat{z} = z - \Delta z$  [Equation 5]
  • Here, $(\hat{x}, \hat{y}, \hat{z})$ corresponds to a calibrated 3D position value, $(x, y, z)$ corresponds to a 3D position value before a calibration, and $(\Delta x, \Delta y, \Delta z)$ corresponds to an error value.

  • $\hat{x} = \{(x_0 - \Delta x_0)/x_0\} \cdot x$
  • $\hat{y} = \{(y_0 - \Delta y_0)/y_0\} \cdot y$
  • $\hat{z} = \{(z_0 - \Delta z_0)/z_0\} \cdot z$  [Equation 6]
  • Here, $(\hat{x}, \hat{y}, \hat{z})$ corresponds to a calibrated 3D position value, $(x, y, z)$ corresponds to a 3D position value before a calibration, $(x_0, y_0, z_0)$ corresponds to a 3D position value tracked during the pointing event to generate a reference position, and $(\Delta x_0, \Delta y_0, \Delta z_0)$ corresponds to an error value calculated from a difference between $(x_0, y_0, z_0)$ and the reference position.
  • The control unit 110 may control an overall operation of the apparatus 100, and may perform the functions of the pointing position acquiring unit 162, the reference position generating unit 164, the error calculating unit 166, and the position calibrating unit 170. These units are illustrated separately so as to separately describe each function. Thus, the control unit 110 may include at least one processor configured to perform all, or at least a portion, of each function of the pointing position acquiring unit 162, the reference position generating unit 164, the error calculating unit 166, and the position calibrating unit 170.
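  • The two calibration modes can be sketched directly from Equations 5 and 6 (the function names below are illustrative, not part of the disclosure):

```python
def offset_calibration(p, err):
    """Equation 5: subtract the fixed error (dx, dy, dz) from the
    tracked 3D position (x, y, z)."""
    return tuple(p[i] - err[i] for i in range(3))

def scale_factor_calibration(p, p0, err0):
    """Equation 6: scale each axis of the tracked position p by
    (p0 - err0) / p0, where p0 is the position tracked during the
    pointing event and err0 the error computed against the reference."""
    return tuple((p0[i] - err0[i]) / p0[i] * p[i] for i in range(3))
```

  • The offset calibration removes a constant bias, while the scale factor calibration corrects a proportional error per axis; which mode is appropriate depends on how the tracking error varies with position.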
  • Hereinafter, a method of calibrating a 3D position in a 3D position and orientation tracking system according to an embodiment configured as above will be described with reference to FIG. 5.
  • Since a more accurate 3D position may be tracked when at least two laser beams are simultaneously outputted than when a single laser beam is outputted, the method of calibrating a 3D position will be described for a case where the number of laser beams corresponds to at least two.
  • FIG. 5 illustrates an operational flowchart for calibrating a 3D position in a 3D position and orientation tracking system according to example embodiments.
  • Referring to FIG. 5, when a pointing event is detected in operation 510, the apparatus 100 may acquire positions pointed to by the at least two laser beams outputted in different orientations at predetermined angles in operation 512. Note that the pointing event may correspond to an event simultaneously pointing to points corresponding to a number of the laser beams using the at least two laser beams. Also, the pointing event may correspond to an event that points to, using the at least two laser beams, straight lines corresponding to the number of the laser beams along the straight lines.
  • In operation 514, the apparatus 100 may track a 3D position of a remote device in response to a detection of the pointing event.
  • In operation 516, the apparatus 100 may track a 3D orientation of the remote device in response to a detection of the pointing event.
  • In operation 517, the apparatus 100 may verify whether the pointing event is terminated. The pointing event may be terminated when sufficient information for calibrating the 3D position is acquired. Whether the pointing event is terminated may be verified by determining whether the pointing event is performed a predetermined number of times.
  • In operation 518, the apparatus 100 may generate a 3D reference position, based on information about pointed to positions and the tracked 3D orientation.
  • In operation 520, the apparatus 100 may calculate an error using the reference position and the tracked 3D position.
  • In operation 522, the apparatus 100 may calibrate the 3D position to be tracked, using the error.
  • The aforementioned example embodiments relate to an apparatus and method for calibrating a 3D position in a 3D position and orientation tracking system, and may enable a stable sensing function by calibrating an error of the 3D position occurring due to a function deviation between a sensor and a device in mass production or in a process of manufacturing a product.
  • The methods according to the above-described example embodiments may be recorded in non-transitory computer-readable media or processor-readable media including program instructions to implement various operations embodied by a computer or processor. The media may also include, alone or in combination with the program instructions, data files, data structures, and the like. The program instructions recorded on the media may be those specially designed and constructed for the purposes of the example embodiments, or they may be of the kind well-known and available to those having skill in the computer software arts. Examples of non-transitory computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD ROM disks and DVDs; magneto-optical media such as optical disks; and hardware devices that are specially configured to store and perform program instructions, such as read-only memory (ROM), random access memory (RAM), flash memory, and the like.
  • Examples of program instructions include both machine code, such as produced by a compiler, and files containing higher level code that may be executed by the computer or processor using an interpreter. The described hardware devices may be configured to act as one or more software modules in order to perform the operations of the above-described example embodiments, or vice versa. Any one or more of the software modules described herein may be executed by a dedicated processor unique to that unit or by a processor common to one or more of the modules. The described methods may be executed on a general purpose computer or processor or may be executed on a particular machine such as the apparatus for calibrating a 3D position of a remote device in a 3D position and orientation tracking system described herein.
  • Although embodiments have been shown and described, it would be appreciated by those skilled in the art that changes may be made in these embodiments without departing from the principles and spirit of the disclosure, the scope of which is defined by the claims and their equivalents.

Claims (24)

1. An apparatus for calibrating a three dimensional (3D) position of a remote device in a 3D position and orientation tracking system, the apparatus comprising:
a beam generating unit to generate at least two laser beams;
a position tracking unit to track a 3D position of the remote device in response to a detection of a pointing event;
an orientation tracking unit to track a 3D orientation of the remote device in response to the detection of the pointing event;
a pointing position acquiring unit to acquire positions pointed to by the at least two laser beams in response to the detection of the pointing event;
a reference position generating unit to generate a 3D reference position based on information about the pointed to positions and the tracked 3D orientation;
an error calculating unit to calculate an error using the reference position and the tracked 3D position; and
a position calibrating unit to calibrate the 3D position tracked by the position tracking unit, using the calculated error.
2. The apparatus of claim 1, wherein:
the orientation tracking unit tracks a trajectory of the 3D orientation of the remote device in response to the detection of the pointing event,
the pointing position acquiring unit acquires a trajectory pointed to by the at least two laser beams, in response to the detection of the pointing event, and
the reference position generating unit generates the 3D reference position, using the pointed to trajectory and the tracked trajectory of the 3D orientation.
3. The apparatus of claim 1, wherein the pointing event corresponds to an event simultaneously pointing to points corresponding to a number of the laser beams using the at least two laser beams.
4. The apparatus of claim 1, wherein the pointing event corresponds to an event that points to, using the at least two laser beams, straight lines corresponding to a number of the laser beams along the straight lines.
5. The apparatus of claim 1, wherein the pointing position acquiring unit receives a position of reference points from a display device displaying the reference points, and acquires the position of the reference points as the pointed to positions.
6. The apparatus of claim 1, wherein the pointing position acquiring unit acquires the pointed to positions from a sensor measuring a position pointed to by a laser beam.
7. The apparatus of claim 1, wherein the orientation tracking unit tracks the 3D orientation of the remote device using at least one of an acceleration sensor, a gyro sensor, and a geomagnetic sensor.
8. The apparatus of claim 1, wherein the beam generating unit outputs the at least two laser beams in different orientations at predetermined angles.
9. An apparatus for calibrating a three dimensional (3D) position of a remote device in a 3D position and orientation tracking system, the apparatus comprising:
a beam generating unit to generate at least one laser beam;
an event detecting unit to detect at least three instances of pointing events;
a position tracking unit to track a 3D position of the remote device, in response to the detection of the pointing events;
an orientation tracking unit to track a 3D orientation of the remote device, in response to the detection of the pointing events;
a pointing position acquiring unit to acquire a position pointed to by the at least one laser beam, in response to the detection of the pointing events;
a reference position generating unit to generate a 3D reference position based on information about the pointed to positions and the tracked 3D orientation;
an error calculating unit to calculate an error using the reference position and the tracked 3D position; and
a position calibrating unit to calibrate the 3D position tracked by the position tracking unit using the calculated error.
10. The apparatus of claim 9, wherein:
the orientation tracking unit tracks a trajectory of the 3D orientation of the remote device in response to the detection of the pointing events,
the pointing position acquiring unit acquires a pointed to trajectory, in response to the detection of the pointing events, and
the reference position generating unit generates the 3D reference position, using the pointed to trajectory and the tracked trajectory of the 3D orientation.
11. The apparatus of claim 9, wherein the pointing events correspond to events that point to points displayed on a display device using the at least one laser beam.
12. The apparatus of claim 9, wherein the pointing events correspond to events that point to, using the at least one laser beam, a straight line outputted from a display device along the straight line.
13. The apparatus of claim 9, wherein the pointing position acquiring unit receives a position of a reference point from a display device displaying the reference point and acquires the position of the reference point as the pointed to position.
14. The apparatus of claim 9, wherein the pointing position acquiring unit acquires the pointed to position from a sensor measuring a position pointed to by a laser beam.
15. The apparatus of claim 9, wherein the orientation tracking unit tracks the 3D orientation of the remote device using at least one of an acceleration sensor, a gyro sensor, and a geomagnetic sensor.
16. A method of calibrating a three dimensional (3D) position of a remote device in a 3D position and orientation tracking system, the method comprising:
tracking a 3D position of the remote device in response to a detection of a pointing event;
tracking a 3D orientation of the remote device in response to the detection of the pointing event;
acquiring positions pointed to by at least two laser beams in response to the detection of the pointing event;
generating a 3D reference position based on information about the pointed to positions and the tracked 3D orientation;
calculating, by way of a processor, an error using the reference position and the tracked 3D position; and
calibrating the 3D position to be tracked using the calculated error.
17. The method of claim 16, wherein the pointing event corresponds to an event simultaneously pointing to points corresponding to a number of the laser beams using the at least two laser beams.
18. The method of claim 16, wherein the pointing event corresponds to an event that points to, using the at least two laser beams, straight lines corresponding to a number of the laser beams along the straight lines.
19. The method of claim 16, wherein the at least two laser beams are output in different orientations at predetermined angles.
20. A non-transitory computer-readable storage medium encoded with computer readable code for implementing the method of claim 16.
21. A method of calibrating a three dimensional (3D) position of a remote device in a 3D position and orientation tracking system, the method comprising:
tracking a 3D position of the remote device in response to a detection of a pointing event;
tracking a 3D orientation of the remote device in response to the detection of the pointing event;
acquiring a position pointed to by a laser beam in response to the detection of the pointing event;
repeating the acquiring of the pointed to position at least three times in the tracking of the 3D position of the remote device;
generating a 3D reference position, based on information about the pointed to position and the tracked 3D orientation;
calculating, by way of a processor, an error using the 3D reference position and the tracked 3D position; and
calibrating the 3D position to be tracked using the calculated error.
22. The method of claim 21, wherein the pointing event corresponds to an event pointing to a point displayed on a display device using the laser beam.
23. The method of claim 21, wherein the pointing event corresponds to an event that points to, using the laser beam, a straight line outputted from a display device along the straight line.
24. A non-transitory computer-readable storage medium encoded with computer readable code for implementing the method of claim 21.
US13/137,633 2010-11-30 2011-08-30 Apparatus and method for calibrating 3D position in 3D position and orientation tracking system Abandoned US20120133584A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2010-0120270 2010-11-30
KR1020100120270A KR20120058802A (en) 2010-11-30 2010-11-30 Apparatus and method for calibrating 3D Position in 3D position/orientation tracking system

Publications (1)

Publication Number Publication Date
US20120133584A1 2012-05-31

Family

ID=46126274

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/137,633 Abandoned US20120133584A1 (en) 2010-11-30 2011-08-30 Apparatus and method for calibrating 3D position in 3D position and orientation tracking system

Country Status (2)

Country Link
US (1) US20120133584A1 (en)
KR (1) KR20120058802A (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140375794A1 (en) * 2013-06-25 2014-12-25 The Boeing Company Apparatuses and methods for accurate structure marking and marking-assisted structure locating
US8923562B2 (en) 2012-12-24 2014-12-30 Industrial Technology Research Institute Three-dimensional interactive device and operation method thereof
CN107111419A (en) * 2016-09-27 2017-08-29 深圳市大疆创新科技有限公司 Calibration method, calibrating installation, calibration system and calibration memory
US9839855B2 (en) 2014-05-21 2017-12-12 Universal City Studios Llc Amusement park element tracking system
US20190012002A1 (en) * 2015-07-29 2019-01-10 Zte Corporation Projection Cursor Control Method and Device and Remote Controller
WO2019011001A1 (en) * 2017-07-13 2019-01-17 深圳市道通智能航空技术有限公司 Remote control device and calibration method for potentiometer rocker thereof

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5926168A (en) * 1994-09-30 1999-07-20 Fan; Nong-Qiang Remote pointers for interactive televisions
US20020085097A1 (en) * 2000-12-22 2002-07-04 Colmenarez Antonio J. Computer vision-based wireless pointing system
US20060152489A1 (en) * 2005-01-12 2006-07-13 John Sweetser Handheld vision based absolute pointing system
US7142198B2 (en) * 2001-12-10 2006-11-28 Samsung Electronics Co., Ltd. Method and apparatus for remote pointing
US20080012824A1 (en) * 2006-07-17 2008-01-17 Anders Grunnet-Jepsen Free-Space Multi-Dimensional Absolute Pointer Using a Projection Marker System
US20090033807A1 (en) * 2007-06-28 2009-02-05 Hua Sheng Real-Time Dynamic Tracking of Bias
US20090085867A1 (en) * 2007-10-02 2009-04-02 Samsung Electronics Co., Ltd. Error-correction apparatus and method and 3D pointing device using the error-correction apparatus
US20090284635A1 (en) * 2008-05-13 2009-11-19 Fu-Hsiang Sung Display device having optical zoom camera module


Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8923562B2 (en) 2012-12-24 2014-12-30 Industrial Technology Research Institute Three-dimensional interactive device and operation method thereof
US20140375794A1 (en) * 2013-06-25 2014-12-25 The Boeing Company Apparatuses and methods for accurate structure marking and marking-assisted structure locating
CN104249217A (en) * 2013-06-25 2014-12-31 The Boeing Company Apparatus and method for accurate structure marking and marking-assisted structure locating
US9789462B2 (en) * 2013-06-25 2017-10-17 The Boeing Company Apparatuses and methods for accurate structure marking and marking-assisted structure locating
US10328411B2 (en) 2013-06-25 2019-06-25 The Boeing Company Apparatuses and methods for accurate structure marking and marking-assisted structure locating
US9839855B2 (en) 2014-05-21 2017-12-12 Universal City Studios Llc Amusement park element tracking system
US10661184B2 (en) 2014-05-21 2020-05-26 Universal City Studios Llc Amusement park element tracking system
US20190012002A1 (en) * 2015-07-29 2019-01-10 Zte Corporation Projection Cursor Control Method and Device and Remote Controller
CN107111419A (en) * 2016-09-27 2017-08-29 SZ DJI Technology Co., Ltd. Calibration method, calibrating installation, calibration system and calibration memory
WO2018058326A1 (en) * 2016-09-27 2018-04-05 SZ DJI Technology Co., Ltd. Calibration method, calibration device, calibration system, and calibration memory
WO2019011001A1 (en) * 2017-07-13 2019-01-17 Shenzhen Autel Intelligent Aviation Technology Co., Ltd. Remote control device and calibration method for potentiometer rocker thereof

Also Published As

Publication number Publication date
KR20120058802A (en) 2012-06-08

Similar Documents

Publication Publication Date Title
CN106767852B (en) Method, apparatus and device for generating detection target information
US20120133584A1 (en) Apparatus and method for calibrating 3D position in 3D position and orientation tracking system
US8320616B2 (en) Image-based system and methods for vehicle guidance and navigation
US10378921B2 (en) Method and apparatus for correcting magnetic tracking error with inertial measurement
KR101107538B1 (en) Sensor-based orientation system
US7768527B2 (en) Hardware-in-the-loop simulation system and method for computer vision
US9978147B2 (en) System and method for calibration of a depth camera system
US20110260968A1 (en) 3D pointing device and method for compensating rotations of the 3D pointing device thereof
US11698687B2 (en) Electronic device for use in motion detection and method for obtaining resultant deviation thereof
CN111145139A (en) Method, device and computer program for detecting 3D objects from 2D images
CN114236564A (en) Method for positioning robot in dynamic environment, robot, device and storage medium
US10830917B2 (en) Method for detecting an anomaly in the context of using a magnetic locating device
CN102778965A (en) 3D indicating device and method for compensating rotation of 3D indicating device
US20220332327A1 (en) Method and Apparatus for Fusing Sensor Information and Recording Medium Storing Program to Execute the Method
US9557190B2 (en) Calibration apparatus and method for 3D position/direction estimation system
JP2010066595A (en) Environment map generating device and environment map generating method
US10885663B2 (en) Method for setting a viewing direction in a representation of a virtual environment
US11620846B2 (en) Data processing method for multi-sensor fusion, positioning apparatus and virtual reality device
CN108762527A (en) Recognition and positioning method and device
KR102097247B1 (en) Distance measuring device using laser and method thereof
CN110161490B (en) Distance measuring system with layout generating function
US11893167B2 (en) Information processing device, information processing method, non-transitory computer readable medium
KR102139667B1 (en) Method and device for acquiring information
Stănescu et al. Mapping the environment at range: implications for camera calibration
KR102212268B1 (en) Localization system and means of transportation with the same and computing device for executing the same

Legal Events

Date Code Title Description
AS Assignment

Owner name: SAMSUNG ELECTRONICS CO., LTD., KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LEE, HYONG EUK;BANG, WON CHUL;KIM, SANG HYUN;AND OTHERS;REEL/FRAME:026901/0138

Effective date: 20110715

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE