APPARATUS AND METHOD FOR AUTOMATICALLY ARRANGING THREE-DIMENSIONAL SCAN DATA USING OPTICAL MARKER
FIELD OF THE INVENTION
The present invention relates to an apparatus and method for automatically arranging three-dimensional (3D) scan data by using optical markers and, more particularly, to an apparatus and method for automatically arranging relative positions of a plurality of 3D scan data scanned in various positions and angles in reference to one coordinate system.
BACKGROUND OF THE INVENTION
In general, an optical 3D scanner can extract 3D data only from the portion of an object's surface that lies in the scanner's field of view. In order to scan other regions of the object, which are obstructed from the view of the 3D scanner, the object to be scanned should be rotated or moved, or the scanner itself should be moved and positioned to a place where the hidden part or parts of the object can be seen. A complete 3D scan may then be captured by surveying the object in various orientations and angles, and the 3D data thus obtained is arranged and integrated into a single uniform coordinate system.
In this case, the reason the 3D data thus obtained should be integrated into a single uniform coordinate system is that, if the scanner is moved to different positions and angles in order to capture the object, each set of 3D scan data is defined by a different coordinate system according to the position of the scanner.
In order to match these various coordinate systems, the distance in which the scanner has been moved must be known. There are two methods of calculating this distance. One is to obtain an absolute distance by using a numerically-controlled device for moving the scanner, and the other is to calculate the distance only by referring to the scanned data.
In the latter case, the scanning must be carried out so that the plurality of scanned
data overlap each other and corresponding points are inputted at the overlapping positions of the scanned data. The scanned data is then arranged in reference to the corresponding points such that the respective coordinate systems defining the plurality of scanned data are integrated into one uniform coordinate system.
In this case, error is likely to occur when corresponding points are inputted manually, depending on how accurately an operator inputs them. Particularly, if there are no distinctive features on the surface of an object, more error is likely to occur. Furthermore, a large or complicated object requires numerous scanning operations with changes in the position and angle of the scanner, thus prolonging the input of corresponding points and subjecting the scanning operation to more error. There is another drawback in that corresponding points may be erroneously inputted or omitted through an operator's carelessness, thus resulting in frequent occurrences of inaccurate arrangements.
In order to overcome these drawbacks, a method that employs small detectable objects called markers or targets attached onto a surface of an object for an operator to identify them and input corresponding points accurately has recently been introduced. Furthermore, a technique has been developed by which corresponding points are automatically recognized by image processing algorithm.
As shown in Fig. 1, conventional markers 4 are attached arbitrarily onto the surface of an object 2, and the surface of the object 2 is scanned part by part in an overlapping manner.
When a plurality of scanned data is obtained by the above scanning method, an operator manually marks corresponding markers, as shown in Fig. 2.
After first and second scanned data I1 and I2 are obtained from scanning the surface of the object 2, as shown in Fig. 2, the operator searches for the markers M1 and M2 commonly positioned in the two scanned data I1 and I2 and arranges the two scanned data I1 and I2 by matching the markers as corresponding points.
Meanwhile, in a technique where corresponding points are automatically recognized, markers having patterns different from each other for identification are searched via image processing, and if markers having identical patterns are positioned in two different scanned data, the two scanned data are arranged automatically in reference to the markers.
However, there is a drawback in the conventional technique of recognizing corresponding points of scanned data by using the markers described above. That is, the data of parts of the surface of an object covered by markers may be lost due to the physical dimension of the markers attached on the surface of the object.
Although the lost information can be recovered by interpolation or by rescanning after removing the markers, there are still drawbacks in that these methods are inaccurate and call for lots of working hours.
SUMMARY OF THE INVENTION
The present invention provides an apparatus and method for automatically arranging 3D scan data by using optical markers which are of non-contact type so that scanned parts of an object are preserved and not deformed.
In accordance with one object of the present invention there is provided an apparatus using optical markers for automatically arranging 3D scan data which is obtained by scanning an object at various angles, comprising: marker generating means for projecting a plurality of optical markers on the surface of an object; pattern projecting means for projecting patterns on the surface of the object in order to obtain 3D scan data of the object; image obtaining means for obtaining 2D image data of the object including the optical markers projected on the surface of the object and for obtaining 3D scan data of the object through the patterns projected on the surface of the object; and control means for extracting 3D positions of the optical markers from the relation between the 2D image data and the 3D scan data and calculating relative positions of the 3D scan data in reference to the 3D positions of the optical
markers.
In accordance with another object of the present invention, there is provided a method for automatically arranging 3D scan data using optical markers, the method comprising the steps of: moving the image obtaining means to a position appropriate for obtaining an image of parts of the object; projecting the optical markers on the surface of the object by marker generating means and obtaining 2D image data of parts of the object including the optical markers projected on the surface of the object by image obtaining means; projecting patterns on the surface of the object by pattern projecting means and obtaining 3D scan data of parts of the object on which the patterns are projected by the image obtaining means; and extracting 3D positions of the optical markers from the relationship between the 2D image data and the 3D scan data and arranging 3D scan data obtained from different parts of the object in reference to the 3D positions of the optical markers.
BRIEF DESCRIPTION OF THE DRAWINGS
For a better understanding of the nature and objects of the present invention, reference should be made to the following detailed description taken in conjunction with the accompanying drawings in which:
Fig. 1 is a drawing for illustrating an example of a 3D scan of an object by attaching conventional sticker type markers on the object;
Fig. 2 is a drawing for illustrating an example of an arrangement of different scan data by using sticker type markers as reference markers;
Fig. 3 is a schematic drawing for illustrating the construction of an apparatus for automatically arranging 3D scan data using optical markers according to a first embodiment of the present invention;
Figs. 4a to 4c are schematic drawings for illustrating examples of a state for obtaining a 2D image data using optical markers according to the first embodiment of the present invention and a state for obtaining a 3D scan data by using patterns;
Fig. 5 is a schematic drawing for illustrating an example of a state for deriving 2D positions of markers from the 2D image data obtained by activating and deactivating the optical markers according to the first embodiment of the present invention;
Fig. 6 is a schematic drawing for illustrating an example of a state for deriving a 3D position of a marker from a 2D position of a marker and a center position of a camera lens;
Figs. 7a to 7b are schematic drawings for exemplarily illustrating an operation of searching corresponding markers by way of a triangular comparison at mutually different image data according to the first embodiment of the present invention;
Figs. 8a to 8d are schematic drawings for exemplarily illustrating a conversion operation of matching two triangular structures at mutually different positions in triangular comparison within two different image data according to the first embodiment of the present invention;
Fig. 9 is a schematic drawing for exemplarily illustrating an operation of searching corresponding markers by way of obtaining an imaginary marker at mutually different image data according to the first embodiment of the present invention;
Figs. 10a to 10b are flow charts for illustrating an operation of a method for automatically arranging 3D scan data using optical markers according to the first embodiment of the present invention;
Fig. 11 is a schematic drawing for illustrating a construction of an apparatus for automatically arranging 3D scan data using optical markers according to a second embodiment of the present invention;
Fig. 12 is a flow chart for illustrating an operation of a method for automatically arranging 3D scan data using optical markers according to the second embodiment of the present invention;
Fig. 13 is a flow chart for illustrating an operation of a method for automatically arranging 3D scan data using optical markers according to a third embodiment of the present invention;
Fig. 14 is a schematic drawing for illustrating a construction of an apparatus for
automatically arranging 3D scan data using optical markers according to a fourth embodiment of the present invention;
Fig. 15 is a schematic drawing for illustrating a construction of an apparatus for automatically arranging 3D scan data using optical markers according to a fifth embodiment of the present invention;
Fig. 16 is a schematic drawing for illustrating a construction of an apparatus for automatically arranging 3D scan data using optical markers according to a sixth embodiment of the present invention;
Figs. 17a and 17b are flow charts for illustrating an operation of a method for automatically arranging 3D scan data using optical markers according to the sixth embodiment of the present invention;
Fig. 18 is a schematic drawing for illustrating an error occurring in the process of arranging a scanned data by way of a reference coordinate system;
Fig. 19 is a schematic drawing for illustrating an error occurring in the process of arranging a scanned data by way of an absolute coordinate system;
Fig. 20 is a schematic drawing for illustrating a construction of an apparatus for automatically arranging 3D scan data using optical markers according to a seventh embodiment of the present invention;
Figs. 21a and 21b are flow charts for illustrating an operation of a method for automatically arranging 3D scan data using optical markers according to a seventh embodiment of the present invention;
Fig. 22 is a drawing of an example of an image obtained by using a large domain image obtaining part depicted in Fig. 20;
Fig. 23 is a drawing of an example of an image obtained by using the large domain image obtaining part depicted in Fig. 20 and an image obtaining part;
Fig. 24 is a drawing for illustrating a construction of an apparatus for automatically arranging 3D scan data using optical markers according to an eighth embodiment of the present invention;
Figs. 25a and 25b are flow charts for illustrating an operation of a method for automatically arranging 3D scan data using optical markers according to the eighth embodiment of the present invention;
Fig. 26a is a drawing of an example of an image obtained by using a pair of large domain image obtaining parts illustrated in Fig. 24;
Fig. 26b is a drawing of an example of an image obtained by using the pair of large domain image obtaining parts illustrated in Fig. 24 and an image obtaining part;
Fig. 27 is a schematic drawing for illustrating a principle of the eighth embodiment of the present invention;
Fig. 28 is a drawing for illustrating a construction of an apparatus for automatically arranging 3D scan data using optical markers according to a ninth embodiment of the present invention;
Fig. 29 is a flow chart for illustrating an operation of a method for automatically arranging 3D scan data using optical markers according to the ninth embodiment of the present invention;
Fig. 30 is a flow chart for illustrating an operation of a method for automatically arranging 3D scan data using optical markers according to a tenth embodiment of the present invention; and
Fig. 31 is a schematic drawing for illustrating a construction of a marker generator according to an eleventh embodiment of the present invention.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
The first embodiment of the present invention will now be described in detail with reference to the accompanying drawings.
Fig. 3 is a drawing for illustrating a structure of an apparatus for automatically arranging 3D scan data using optical markers according to the first embodiment of the present invention, wherein the apparatus includes a marker generator 12, a pattern projector 16, an
image obtaining part 18, a movement driving part 20, a movement mechanism 22, an image input part 24, a marker blinking controller 26, a pattern projector controller 28, a microprocessor 30 and a buffer 32.
The marker generator 12 which is designed to project markers recognizable by the image obtaining part 18 on the surface of an object 10 includes a plurality of marker output parts 14 for projecting plural optical markers simultaneously on the surface of the object 10 in irregular directions.
Preferably, the plurality of marker output parts 14 adopts laser pointers capable of projecting a plurality of red spots on the surface of the object 10, such that the positions of spots projected on the surface of the object 10 may be easily distinguished from images obtained by the image obtaining part 18, such as a camera or the like.
The marker generator 12 is by no means limited to a laser pointer, and any optical marker may be adopted as long as it can focus properly on the surface of the object and can be easily controlled to blink repeatedly.
A plurality of marker generators 12 may be disposed along a circumference of an object in order to project the optical markers on the surface of the entire object 10, and the number of the optical markers may be changed in accordance with the size and shape of the object 10. Furthermore, the marker generators 12 should be fixed on the object during the scanning operation so that the positions of the markers do not vary on the surface of the object.
The pattern projector 16 shown in the drawing projects predetermined patterns so that
3D scan data of the object 10 can be obtained. Namely, space-encoded beams are projected on the surface of the object 10 by using projectors such as LCD projectors and the like, or a laser beam is projected on the surface of the object 10 so that 3D scan data of the object 10 can be obtained via the image obtaining part 18.
It is also preferable that the pattern projector 16 adopts a slide projector comprising a
light source, a pattern film and a lens for projecting a predetermined pattern, or an electronic LCD projector, or a laser diode for projecting laser striped patterns. A pattern film equipped with striped patterns is fed between a light source and a lens by a predetermined feeding means, which allows a series of striped patterns to be projected on the object 10.
The pattern film may have striped patterns of varying gaps, as disclosed in Korean
Patent Application No. 2002-10839 filed on February 28, 2002 by the present applicant, entitled "3D Scan Apparatus and Method Using Multiple Striped Patterns." The same applies to a scanning device using laser striped patterns.
Furthermore, it is preferable that markers should not be projected on the object 10 while obtaining 3D scan data because scanned data might be deformed by the markers on the object 10.
The image obtaining part 18 comprises image sensors capable of receiving images such as a Charge Coupled Device (CCD) camera or a CMOS camera. The image obtaining part 18 obtains images by photographing the object when markers are optically projected on the surface of the object 10.
The image obtaining part 18 may be configured separately from the pattern projector 16 but it is preferable that the image obtaining part 18 be integrally built in with the pattern projector 16 because the image obtaining part 18 integrated with the projector 16 is simple in structure and it is easy to match the 2D image data with the 3D scan data without calibration.
The image obtaining part 18 obtains 2D image data and 3D scan data while synchronizing the blinking period of the optical markers with the blinking period of the pattern, details of which are illustrated in Figs. 4a, 4b and 4c.
As shown in Fig. 4a, the image obtaining part 18 obtains a first image data 40 by photographing a certain part of the object 10 on which a plurality of optical markers (RM) are arbitrarily projected.
Next, as shown in Fig. 4b, the image obtaining part 18 obtains a second image data 42 by photographing the same part of the object 10 as in Fig 4a while the marker generator 12 is turned off in order to prevent the surface of the object 10 from being projected with the laser markers.
Next, as illustrated in Fig. 4c, the image obtaining part 18 obtains 3D scan data by photographing the object 10 on which striped patterns are projected from the pattern projector 16 while the marker generator 12 is turned off. Specifically, the 3D scan data is obtained in the form of first to fifth scanned data 44a - 44e that correspond to the same part of the object 10 with different striped patterns PT1 - PT5 on the surface thereof, respectively. Although in the present embodiment the pattern film has 5 different striped patterns, it is not limited thereto and it may have more patterns.
The movement driving part 20 moves the pattern projector 16 and the image obtaining part 18 relatively to the object 10 according to the driving control of the microprocessor 30 so that images of the whole object 10 can be obtained.
The movement mechanism 22 receives a signal from the movement driving part 20 to move the pattern projector 16 and the image obtaining part 18 in a prescribed direction relative to the object 10. Although the movement driving part 20 is adopted in the present embodiment for electrically moving the pattern projector 16 and the image obtaining part 18, it should be apparent that the movement mechanism 22 may be manually manipulated.
The image input part 24 shown in the drawing receives the image data obtained from the image obtaining part 18, and the marker blinking controller 26 serves to blink the optical markers of the marker generator 12 according to the control of the microprocessor.
The pattern projector controller 28 controls the feeding speed and direction of the pattern film of the pattern projector 16, and also controls the blinking of the light source for the projection.
The microprocessor 30 receives and analyzes the 2D image data and the 3D scan data
photographed at various angles through the image input part 24 and arranges the 3D scan data automatically on a single uniform coordinate system.
As illustrated in Fig. 5, for seeking the 2D positions of the laser markers, the microprocessor 30 carries out an image process on the first image data 40 with the laser markers (RM) and the second image data 42 without the laser markers (RM), and as a result the third image data 46 comprising only the laser markers (RM) is obtained.
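The subtraction of the second image data from the first image data, and the extraction of 2D marker positions from the resulting marker-only image, can be sketched as follows. This is an illustrative Python sketch only (the patent discloses no code); the function names and the synthetic 8x8 frames are hypothetical.

```python
import numpy as np

def extract_marker_pixels(img_with, img_without, thresh=50):
    """Subtract the marker-off frame from the marker-on frame; pixels that
    remain bright belong to the projected laser spots."""
    diff = img_with.astype(int) - img_without.astype(int)
    return diff > thresh  # boolean mask of marker pixels

def marker_centroids(mask):
    """Group marker pixels into blobs by 4-connected flood fill and return
    the centroid of each blob as a sub-pixel 2D marker position (x, y)."""
    mask = mask.copy()
    h, w = mask.shape
    centroids = []
    for y in range(h):
        for x in range(w):
            if mask[y, x]:
                stack, blob = [(y, x)], []
                mask[y, x] = False
                while stack:
                    cy, cx = stack.pop()
                    blob.append((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny, nx]:
                            mask[ny, nx] = False
                            stack.append((ny, nx))
                ys, xs = zip(*blob)
                centroids.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return centroids

# Two synthetic 2D image frames: markers projected on vs. markers off.
off = np.full((8, 8), 10, dtype=np.uint8)
on = off.copy()
on[2, 2] = on[2, 3] = 200      # first laser spot (two pixels wide)
on[6, 6] = 200                 # second laser spot
mask = extract_marker_pixels(on, off)
print(sorted(marker_centroids(mask)))  # [(2.5, 2.0), (6.0, 6.0)]
```

The flood fill stands in for whatever blob-labeling step a real implementation would use; the point is that blinking the markers isolates the spots by simple differencing.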
As illustrated in Fig. 6, the microprocessor 30 calculates 3D positions of the markers from the relation between a camera lens center 50 of the image obtaining part 18 and a 2D image data 52 extracted with the marker position. The 3D positions of the markers (a', b', c') corresponding to relevant markers can be obtained by estimating intersection points where straight lines connecting the camera lens center 50 of the image obtaining part 18 and the positions (a, b, c) of arbitrary markers in the 2D image data intersect the 3D scan data 54.
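Estimating where a line through the camera lens center meets the 3D scan data can be done per mesh triangle, for example with a standard ray-triangle test (Moller-Trumbore). The following Python sketch is illustrative only; the camera position and mesh triangle below are hypothetical.

```python
import numpy as np

def ray_triangle_intersect(origin, direction, v0, v1, v2, eps=1e-9):
    """Moller-Trumbore test: intersect the ray from the camera lens center
    through a 2D marker position with one triangle of the 3D scan mesh.
    Returns the 3D intersection point, or None if the ray misses."""
    e1, e2 = v1 - v0, v2 - v0
    p = np.cross(direction, e2)
    det = np.dot(e1, p)
    if abs(det) < eps:          # ray parallel to the triangle plane
        return None
    inv = 1.0 / det
    s = origin - v0
    u = np.dot(s, p) * inv
    if u < 0 or u > 1:
        return None
    q = np.cross(s, e1)
    v = np.dot(direction, q) * inv
    if v < 0 or u + v > 1:
        return None
    t = np.dot(e2, q) * inv
    return origin + t * direction if t > eps else None

# Camera lens center at the origin, marker ray along +z, one mesh triangle at z = 5.
cam = np.zeros(3)
ray = np.array([0.0, 0.0, 1.0])
tri = (np.array([-1.0, -1.0, 5.0]), np.array([2.0, -1.0, 5.0]), np.array([-1.0, 2.0, 5.0]))
print(ray_triangle_intersect(cam, ray, *tri))  # [0. 0. 5.]
```

In practice the test is run against every triangle of the 3D scan data and the nearest hit is kept as the 3D marker position.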
It is possible to quickly obtain 3D positions of markers in case the pattern projector 16 and the image obtaining part 18 are integrally configured. However, if the pattern projector 16 and the image obtaining part 18 are configured separately, calibration on the coordinates of the pattern projector 16 and image obtaining part 18 should be executed for obtaining 3D positions of markers. Through the aforementioned steps, the 3D positions of the 3D scan data photographed at various angles can be obtained.
Meanwhile, a scan data should preferably include at least four to five markers, and two neighboring scan data should include three or more common markers. This is because three or more points are necessary for defining a unique position in 3D space and are also a requirement for obtaining corresponding markers (described later).
Furthermore, it may be possible in the present invention that each marker is used with different patterns to be distinguished from each other. However, the configuration of the equipment and manufacture thereof may be complicated because scores or hundreds of marker
output parts should project different shapes of markers.
In the present invention, in automatically arranging 3D scan data of neighboring regions obtained by photographing the object in an overlapping manner, the markers are distinguished from each other by using information on the relative positions of the markers calculated from the respective 3D scan data by the microprocessor 30. For example, three points formed by markers can construct a triangle, and triangles constructed by three different sets of points differ from each other, such that each triangle can be distinguished by comparing its angles and side lengths. Therefore, one marker corresponding to a vertex of a triangle can be distinguished from another marker. A detailed explanation of this process will be described below.
As illustrated in Figs. 7a and 7b, in case one scan data 60 includes M marker points and another scan data 62 neighboring the one scan data 60 includes N marker points, the scan data 60 contains MC3 different triangles and the other scan data contains NC3 different triangles. Then, by comparing the triangles between the two scan data a total of MC3 × NC3 times, corresponding pairs of triangles can be obtained.
First, as illustrated in Fig. 7a, the microprocessor 30 constructs plural triangles T1 and T2 according to points obtained by markers included in one scan data 60 and plural triangles T3 and T4 included in another scan data 62.
Next, as shown in Fig. 7b, the microprocessor 30 seeks a pair of mutually corresponding triangles, e.g., T1 and T3, which are contained in the two scan data 60 and 62, respectively.
Various methods can be used to compare the triangles. One of them is to seek a pair of corresponding triangles by comparing the lengths of each side. In other words, the three sides of each triangle, (a1, a2, a3) and (b1, b2, b3), are compared, and if the length of each side is identical to its counterpart and if the order of the sides is the same, it can be determined that the two triangles correspond.
In seeking triangles with three identical sides, the lengths of the sides are arranged in descending order, for example, and if two or more triangles having identical sides are detected, the order of the sides is checked. That is, two triangles are judged corresponding if each of the sides compared in a counterclockwise or clockwise order from the longest side is identical.
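The exhaustive triangle comparison can be sketched in Python as follows. This illustrative sketch (not part of the disclosed apparatus; names are for exposition only) compares sorted side lengths with a rounding tolerance and omits the side-order orientation check described above.

```python
import itertools
import numpy as np

def side_signature(tri, ndigits=4):
    """Sides of a marker triangle, sorted in descending order and rounded
    so that measurement noise does not defeat the comparison."""
    a, b, c = tri
    sides = (np.linalg.norm(a - b), np.linalg.norm(b - c), np.linalg.norm(c - a))
    return tuple(round(s, ndigits) for s in sorted(sides, reverse=True))

def find_corresponding_triangles(markers1, markers2):
    """Compare every triangle of one scan (MC3 of them) against every
    triangle of the other (NC3 of them); return pairs with identical sides."""
    pairs = []
    for t1 in itertools.combinations(markers1, 3):
        for t2 in itertools.combinations(markers2, 3):
            if side_signature(np.array(t1)) == side_signature(np.array(t2)):
                pairs.append((t1, t2))
    return pairs

# Two scans expressed in different coordinate systems that share one
# congruent marker triangle (sides 3, 4, 5); the fourth marker in each
# scan lies outside the overlap.
m1 = [np.array(p, float) for p in [(0, 0, 0), (3, 0, 0), (0, 4, 0), (9, 9, 9)]]
m2 = [np.array(p, float) for p in [(10, 0, 0), (13, 0, 0), (10, 4, 0), (-5, 7, 1)]]
print(len(find_corresponding_triangles(m1, m2)))  # 1
```

Side lengths are invariant under the unknown rigid motion between the two scans, which is why they can be compared directly across coordinate systems.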
After corresponding triangles or markers are distinguished in the respective scan data as described in the above-noted explanation, the scan data are moved so that these markers can be positioned on the same point in a single uniform coordinate system. Namely, one of the triangles serves as a reference and the corresponding triangle moves to the reference, and as a result, the two different coordinate systems are matched.
The matching process for the two triangles located in two different scan data is illustrated in Figs. 8a - 8d. As shown in Fig. 8a, two triangles are given that are identical in size and shape but located in different scan data, and information on the vertexes and sides is known once the two corresponding triangles are determined.
As illustrated in Fig. 8b, one of the selected vertexes in the two corresponding triangles is matched by using a translation matrix (T), where a reference coordinate system relating to one triangle is set up as A and the other coordinate system is set up as B. The translation matrix (T) is defined in the following Formula 1:
FORMULA 1
T = T(A1 − B1)
Next, as depicted in Fig. 8c, a rotational translation is carried out in order to match one of the corresponding sides by using a rotational matrix (R1), where the rotational matrix (R1) is defined by Formula 2:
FORMULA 2
R1 = R(Θ1)
Fig. 8d depicts a rotational translation carried out in order to match the remaining corresponding vertex by using a rotational matrix (R2), where the rotational matrix (R2) is defined by Formula 3:
FORMULA 3
R2 = R(Θ2)
As a result, the total matching process is carried out by a translation matrix M, which is defined by Formula 4:
FORMULA 4
M = T·R1·R2
Therefore, a point (P) included in one scan data moves to a new position in another scan data by the following Formula 5:
FORMULA 5
P' = M × P
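The combined effect of Formulas 1 through 4 is a single rigid transform. The sketch below (illustrative Python only; names are hypothetical) builds an orthonormal frame on each triangle and composes the rotation and translation directly, which is equivalent to applying T, R1 and R2 in sequence.

```python
import numpy as np

def triangle_frame(p1, p2, p3):
    """Orthonormal frame anchored at vertex p1, with axes along the first
    side and the triangle normal; columns of the returned 3x3 are the axes."""
    x = p2 - p1
    x = x / np.linalg.norm(x)
    n = np.cross(p2 - p1, p3 - p1)
    n = n / np.linalg.norm(n)
    y = np.cross(n, x)
    return np.column_stack((x, y, n))

def matching_matrix(tri_b, tri_a):
    """4x4 homogeneous matrix M that carries triangle B onto the corresponding
    congruent triangle A, i.e. the combined translation and rotations of
    Formulas 1 to 4."""
    Fa, Fb = triangle_frame(*tri_a), triangle_frame(*tri_b)
    R = Fa @ Fb.T                   # rotation aligning B's frame to A's
    t = tri_a[0] - R @ tri_b[0]     # then match the selected vertices
    M = np.eye(4)
    M[:3, :3], M[:3, 3] = R, t
    return M

# Triangle in scan B is scan A's triangle rotated 90 degrees about z and shifted.
tri_a = [np.array([0.0, 0.0, 0.0]), np.array([3.0, 0.0, 0.0]), np.array([0.0, 4.0, 0.0])]
Rz = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
tri_b = [Rz @ v + np.array([1.0, 2.0, 3.0]) for v in tri_a]
M = matching_matrix(tri_b, tri_a)
for vb, va in zip(tri_b, tri_a):    # Formula 5: P' = M x P, in homogeneous form
    print(np.allclose(M @ np.append(vb, 1.0), np.append(va, 1.0)))  # True (three times)
```

The frame-based construction avoids computing the two rotation angles explicitly while producing the same matrix M of Formula 4.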
Meanwhile, calculation error is likely to occur because a marker has actual physical dimensions and cannot be defined mathematically as a point. Therefore, after the microprocessor 30 matches various scan data as explained above, the microprocessor 30 performs a more intricate process to obtain a more accurate match. Namely, the positions of the markers are further adjusted based on meshed data, which is called "registering." Through these processes, each scan data can be integrated into a single uniform coordinate system more accurately. A detailed description follows.
In case a point cloud data A comprising plural points and another point cloud data B comprising plural points are to be integrated into a single uniform coordinate system, A is moved and rotated to be integrated into the coordinate system of B. Under this circumstance, if the n points of A are P = {pi} and the corresponding points of B are X = {xi}, the translation of moving and rotating is obtained by using a least square method for minimizing the distance between P and X, and this translation is applied to A. As a result, the average distance between the point cloud data A and B is minimized, and these processes are repeated until the average distance between P and X lies within an allowance.
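The iterative registering loop can be sketched as follows. This illustrative Python sketch (not part of the disclosed apparatus) pairs each point with its nearest neighbor and, for brevity, solves each least-squares step with an SVD rather than the quaternion form of Formula 6; the two solutions are mathematically equivalent.

```python
import numpy as np

def best_rigid(P, Q):
    """Least-squares rotation R and translation t taking the points P onto
    their correspondences Q (SVD form of the least square method)."""
    mp, mq = P.mean(axis=0), Q.mean(axis=0)
    U, _, Vt = np.linalg.svd((P - mp).T @ (Q - mq))
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:        # guard against an improper reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, mq - R @ mp

def register(A, B, allowance=1e-6, max_iter=50):
    """Repeat: pair each point of A with its nearest point of B, solve for
    the rigid motion, apply it to A, until the average distance between
    corresponding points lies within the allowance."""
    A = np.asarray(A, float).copy()
    B = np.asarray(B, float)
    for _ in range(max_iter):
        # nearest neighbour in B for every point of A
        idx = np.argmin(((A[:, None] - B[None]) ** 2).sum(-1), axis=1)
        R, t = best_rigid(A, B[idx])
        A = A @ R.T + t
        if np.mean(np.linalg.norm(A - B[idx], axis=1)) < allowance:
            break
    return A

# A is B shifted by a small offset; registering pulls it back onto B.
B_cloud = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
A_cloud = B_cloud + 0.05
print(np.allclose(register(A_cloud, B_cloud), B_cloud, atol=1e-6))  # True
```

With a small initial offset the nearest-neighbor pairing is already correct, so the loop converges in a single step; larger offsets rely on the triangle-matching stage above for a coarse alignment first.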
The translation aligning corresponding points of a plurality of point cloud data can also be obtained by using the least square method, and the equation thereof is given in the following Formula 6:
FORMULA 6
f(Q) = (1/N) Σi || xi − R(QR)pi − QT ||^2

Where Q is the state vector of registering, and Q is defined as [QR | QT]t, where QR = [q0 q1 q2 q3]t is a unit quaternion vector and QT = [q4 q5 q6]t is a translation vector.

In the aforementioned Formula 6, f(Q) is the average of the squared distances between xi and (R(QR)pi + QT), and the R(QR) and QT minimizing f(Q) can be calculated by the least square method. In Formula 6, R(QR) can be defined by a 3×3 rotation matrix as in the following Formula 7:
FORMULA 7
R(QR) =
| q0^2+q1^2−q2^2−q3^2    2(q1q2−q0q3)            2(q1q3+q0q2)          |
| 2(q1q2+q0q3)           q0^2+q2^2−q1^2−q3^2     2(q2q3−q0q1)          |
| 2(q1q3−q0q2)           2(q2q3+q0q1)            q0^2+q3^2−q1^2−q2^2   |
Furthermore, in case a set of scan data is defined by P={pi} and a set of reference data (X) is defined by X={xi}, the center of mass of P and X is given by the following Formula 8:
FORMULA 8
μp = (1/Np) Σi pi ,  μx = (1/Nx) Σi xi
Furthermore, the cross-covariance matrix for P and X is given by the following
Formula 9:
FORMULA 9
Σpx = (1/N) Σi (pi − μp)(xi − μx)t
The cyclic components of the anti-symmetric matrix Aij = (Σpx − Σpx t)ij are used to form the column vector Δ = [A23 A31 A12]t. This vector Δ is then used to form a symmetric 4×4 matrix Q(Σpx), which can be given by the following Formula 10:

FORMULA 10
Q(Σpx) =
| tr(Σpx)    Δt                           |
| Δ          Σpx + Σpx t − tr(Σpx)·I3     |

Where I3 is the 3×3 identity matrix. In the above Formula 10, the unit quaternion QR = [q0 q1 q2 q3]t is the eigenvector corresponding to the maximum eigenvalue of Q(Σpx), and the rotation matrix is obtained by substituting the quaternion into Formula 7. Meanwhile, QT = (q4, q5, q6) can be obtained from the following Formula 11 by utilizing R(QR) given from the above Formula 7.
FORMULA 11
QT = μx − R(QR)·μp
As a result, a final matrix is given by the following Formula 12:
FORMULA 12
F =
| q0^2+q1^2−q2^2−q3^2    2(q1q2−q0q3)            2(q1q3+q0q2)           q4 |
| 2(q1q2+q0q3)           q0^2+q2^2−q1^2−q3^2     2(q2q3−q0q1)           q5 |
| 2(q1q3−q0q2)           2(q2q3+q0q1)            q0^2+q3^2−q1^2−q2^2    q6 |
| 0                      0                       0                      1  |
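Formulas 8 through 12 amount to a closed-form solution for the rotation and translation. The following Python sketch is illustrative only (the function name is hypothetical, not part of the disclosure) and follows the formulas step by step.

```python
import numpy as np

def quaternion_registration(P, X):
    """Closed-form registering step of Formulas 8-12: find R and t that
    minimize the mean squared distance between R pi + t and xi, via the
    unit quaternion drawn from the cross-covariance matrix."""
    P, X = np.asarray(P, float), np.asarray(X, float)
    mu_p, mu_x = P.mean(axis=0), X.mean(axis=0)            # Formula 8
    S = (P - mu_p).T @ (X - mu_x) / len(P)                 # Formula 9
    A = S - S.T
    delta = np.array([A[1, 2], A[2, 0], A[0, 1]])          # [A23 A31 A12]
    Q = np.empty((4, 4))
    Q[0, 0] = np.trace(S)
    Q[0, 1:] = Q[1:, 0] = delta
    Q[1:, 1:] = S + S.T - np.trace(S) * np.eye(3)          # Formula 10
    w, v = np.linalg.eigh(Q)
    q0, q1, q2, q3 = v[:, np.argmax(w)]                    # max-eigenvalue eigenvector
    R = np.array([                                         # Formula 7
        [q0*q0+q1*q1-q2*q2-q3*q3, 2*(q1*q2-q0*q3),         2*(q1*q3+q0*q2)],
        [2*(q1*q2+q0*q3),         q0*q0+q2*q2-q1*q1-q3*q3, 2*(q2*q3-q0*q1)],
        [2*(q1*q3-q0*q2),         2*(q2*q3+q0*q1),         q0*q0+q3*q3-q1*q1-q2*q2]])
    t = mu_x - R @ mu_p                                    # Formula 11
    return R, t

# Recover a known 90-degree rotation about z plus a translation.
P_pts = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 1]])
R_true = np.array([[0.0, -1, 0], [1, 0, 0], [0, 0, 1]])
t_true = np.array([1.0, 2, 3])
X_pts = P_pts @ R_true.T + t_true
R, t = quaternion_registration(P_pts, X_pts)
print(np.allclose(R, R_true), np.allclose(t, t_true))  # True True
```

The sign ambiguity of the eigenvector is harmless because the rotation matrix of Formula 7 is quadratic in the quaternion components; R and t then assemble into the 4×4 matrix F of Formula 12.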
When corresponding markers are obtained for each of the 3D scan data, the microprocessor 30 computes a matrix for total translation based on said one scan data 60 as a reference coordinate, thereby arranging all of the 3D scan data automatically to the reference coordinate system.
Meanwhile, apart from the methods illustrated in Figs. 8a to 8d, the coordinate systems themselves may be mapped to the reference coordinate system by utilizing the least square method, which represents aligning the scan data after finding the corresponding triangles.
Information on the corresponding markers positioned at the vertexes is obtained during the process of seeking corresponding triangles, which are, for example, P and X in Formula 6, and an optimal translation matrix can be given by the following Formula 13:
FORMULA 13
T =
| R    QT |
| 0    1  |
A formula applied to the point cloud data (P), which are to be arranged by the method of coordinate mapping, can be defined by the below-mentioned Formula 14:
FORMULA 14
P' = T·P
Meanwhile, in the method of seeking a pair of corresponding triangles as explained above, three or more corresponding markers should be included in the region where the scan data overlap. Therefore, in case only two corresponding markers are included in the region, other methods should be adopted for seeking corresponding markers.
Since each scan data with markers has 3D information, it is possible to seek corresponding markers even if only two corresponding markers are included in the overlapped region. As shown in Fig. 9, there are only two corresponding markers on the overlapped region of the scan data 64 and 66, namely the marker pairs (RM1, RM2) and (RM3, RM4), respectively. Corresponding markers are sought by comparing the surface normal vectors at the points where the markers are positioned and the distance between the two markers.
Meanwhile, if there are too many markers projected on each scan data or the markers are projected uniformly, there is an increasing possibility of obtaining inaccurate corresponding markers. In this case, additional markers and 3D scan data are used for creating additional references. For example, in case there are three corresponding markers, a triangle is formed by the three markers, and then a line is drawn perpendicularly from the center of mass of the triangle. Then, the intersecting point of the perpendicular line and the 3D scan data is obtained as a fourth reference point. Next, corresponding markers can be found by utilizing information on the average perpendicular (normal) vector of the object surface at or around the markers.
Furthermore, in case only two corresponding markers are available, a straight line is drawn by connecting the two markers, and a circle is drawn on a plane perpendicular to the straight line from the center of the straight line. Then, the intersecting points of the circle and the 3D scan data are obtained as additional reference points.
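The geometric constructions for these additional reference points can be sketched as follows. This is illustrative Python only (function names hypothetical); intersecting the returned ray or circle with the 3D scan data is done separately, e.g. by the ray-mesh test shown earlier in principle.

```python
import numpy as np

def centroid_normal_ray(m1, m2, m3):
    """For three corresponding markers, return the centroid of their triangle
    and the unit perpendicular (normal) direction; intersecting this ray with
    the 3D scan data yields a fourth reference point."""
    c = (m1 + m2 + m3) / 3.0
    n = np.cross(m2 - m1, m3 - m1)
    return c, n / np.linalg.norm(n)

def circle_on_perpendicular_plane(m1, m2, radius, samples=4):
    """For the two-marker case: sample points on a circle lying in the plane
    perpendicular to the segment m1-m2, centered at its midpoint."""
    mid, axis = (m1 + m2) / 2.0, m2 - m1
    axis = axis / np.linalg.norm(axis)
    # any vector not parallel to the axis seeds an orthonormal basis of the plane
    seed = np.array([1.0, 0.0, 0.0]) if abs(axis[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    u = np.cross(axis, seed)
    u = u / np.linalg.norm(u)
    v = np.cross(axis, u)
    angles = np.linspace(0.0, 2.0 * np.pi, samples, endpoint=False)
    return [mid + radius * (np.cos(a) * u + np.sin(a) * v) for a in angles]

c, n = centroid_normal_ray(np.array([0.0, 0, 0]), np.array([3.0, 0, 0]), np.array([0.0, 3, 0]))
print(c, n)  # [1. 1. 0.] [0. 0. 1.]
pts = circle_on_perpendicular_plane(np.array([0.0, 0, 0]), np.array([0.0, 0, 2]), 1.0)
```

Every sampled circle point is equidistant from both markers, so any intersection with the scan surface is a stable, repeatable reference.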
According to the preferred embodiment of the present invention, it should be apparent that any other method of creating additional reference points around the markers may be employed for carrying out the automatic alignment of 3D scan data, besides the above-mentioned methods.
As shown in Fig. 3, information on the markers newly obtained by the automatic arrangement process on the 3D scan data is registered in the buffer 32.
Successively, operations according to the first embodiment of the present invention mentioned above will be explained in detail with reference to Figs. 10a and 10b, where S denotes a step.
First, the microprocessor 30 controls the movement driving part 20 to activate the movement mechanism 22, which moves the image obtaining part 18 integrated with the pattern projector 16 to a position appropriate for scanning an object 10 (S10).
Then, the microprocessor 30 controls the marker blinking controller 26 to allow the plurality of marker output parts 14 disposed at the marker generator 12 to project the markers arbitrarily on the surface of the object 10 (S11).
Next, the image obtaining part 18 photographs a prescribed domain of the object 10 to obtain a 2D image including the optical markers projected on the surface of the object 10, and then the microprocessor 30 receives the 2D image data via the image input part 24 (S12).
Successively, the microprocessor 30 controls the marker blinking controller 26 to turn off the marker generator 12 so that the markers may not be projected on the object 10 (S13). At this state, the image obtaining part 18 photographs the same domain as the above to obtain
a 2D image without the markers, and then the microprocessor 30 receives the 2D image data via the image input part 24 (S14).
The microprocessor 30 controls the projector controller 28 to activate the pattern projector 16 while the marker generator 12 is turned off. Then, prescribed patterns (for example, patterns having stripes with different gaps therebetween, or a multiple striped pattern) are projected on the surface of the object 10 from the pattern projector 16. Next, the image obtaining part 18 photographs the object 10 with the striped patterns projected on it to obtain 3D scan data, and then the microprocessor 30 receives the 3D scan data via the image input part 24 (S15).
At this state, the microprocessor 30 computes 2D positions of the markers by image processing the 2D image data with the markers and the 2D image data without the markers (S16).
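The image processing at S16 can be sketched as follows (an illustrative sketch with assumed names and an assumed brightness threshold, not the patent's implementation; a practical system would also group the pixels into one blob per marker): subtracting the marker-free image from the marker-bearing image leaves bright spots only where markers were projected.

```python
def marker_pixels(img_with, img_without, threshold=50):
    """Return pixel coordinates that brightened past `threshold` when
    the markers were switched on; these belong to projected markers."""
    return [(x, y)
            for y, (row_on, row_off) in enumerate(zip(img_with, img_without))
            for x, (on, off) in enumerate(zip(row_on, row_off))
            if on - off > threshold]

def marker_center(pixels):
    """2D marker position as the centroid of its bright pixels."""
    n = len(pixels)
    return (sum(x for x, _ in pixels) / n, sum(y for _, y in pixels) / n)

# Tiny example: a single marker lights up the centre pixel
off = [[10, 10, 10], [10, 10, 10], [10, 10, 10]]
on  = [[10, 10, 10], [10, 200, 10], [10, 10, 10]]
print(marker_center(marker_pixels(on, off)))  # (1.0, 1.0)
```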
Successively, the microprocessor 30 computes the 3D positions of the markers by using the 2D positions of the markers and the 3D scan data. That is, the 3D positions of the markers can be obtained by estimating the intersection points where straight lines connecting the camera lens center of the image obtaining part 18 and the positions of arbitrary markers in the 2D image data intersect the 3D scan data (S17).
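The intersection estimate at S17 can be sketched as follows (illustrative only; names are assumptions). In practice the scan data is a surface and a proper ray-surface intersection would be computed; here the scan is approximated by its points, and the scan point nearest the viewing ray is taken:

```python
import math

def _sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def point_ray_distance(p, origin, direction):
    """Distance from point p to the ray origin + t*direction (t >= 0)."""
    t = max(0.0, _dot(_sub(p, origin), direction) / _dot(direction, direction))
    foot = tuple(o + t * d for o, d in zip(origin, direction))
    return math.sqrt(_dot(_sub(p, foot), _sub(p, foot)))

def marker_3d_position(lens_center, ray_dir, scan_points):
    """Approximate the marker's 3D position as the scan point closest
    to the straight line joining the camera lens centre and the
    marker's position in the 2D image (given here as a ray direction)."""
    return min(scan_points,
               key=lambda p: point_ray_distance(p, lens_center, ray_dir))
```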
Meanwhile, the microprocessor 30 discriminates whether the register of the buffer 32 is vacant or not (S18). If the register of the buffer 32 is not vacant, the 3D positions of the markers obtained at S17 (the current 3D scan data) and the 3D positions of the markers stored in the register of the buffer 32 (in other words, 3D data overlapping partly with the current 3D scan data) are compared for searching corresponding markers (S19).
When corresponding markers are found by comparing the markers included in the current 3D scan data with the markers registered at the register of the buffer 32 according to the aforementioned searching process, the microprocessor 30 computes a translation matrix for matching the two 3D scan data (S20). The positions of the 3D scan data registered at the register of the buffer 32 are given as a reference coordinate system, to which the current scan data are translated (S21).
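The computation at S20 can be sketched in simplified form (an illustrative sketch with hypothetical names, not the patent's implementation). The sketch below handles only the translational part, as the difference of the centroids of corresponding markers; a full rigid alignment would additionally estimate rotation, for example by a least-squares fit over the corresponding markers.

```python
def centroid(points):
    n = float(len(points))
    return tuple(sum(c) / n for c in zip(*points))

def estimate_translation(markers_ref, markers_cur):
    """markers_ref[i] corresponds to markers_cur[i].  Returns the
    translation carrying the current scan onto the reference
    coordinate system (translation-only sketch)."""
    gr, gc = centroid(markers_ref), centroid(markers_cur)
    return tuple(r - c for r, c in zip(gr, gc))

def translate(points, t):
    return [tuple(p + d for p, d in zip(pt, t)) for pt in points]

# Current scan is offset by (1, 0, 2) from the reference
ref = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
cur = [(-1.0, 0.0, -2.0), (0.0, 0.0, -2.0), (-1.0, 1.0, -2.0)]
t = estimate_translation(ref, cur)
print(t)                         # (1.0, 0.0, 2.0)
print(translate(cur, t) == ref)  # True
```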
Next, the microprocessor 30 registers the markers newly obtained from the current scan data in the register of the buffer 32 (S22). Then, the microprocessor 30 discriminates whether the automatic arrangement of the 3D scan data has been completed or not (S23). If the arrangement is not completed, the process returns to S10, and steps S10 to S23 are repeated.
The second embodiment of the present invention will now be described in detail with reference to the accompanying drawings.
Fig. 11 illustrates a structure of an apparatus for automatically arranging 3D scan data using the optical markers according to the second embodiment of the present invention.
Throughout the drawings, the same reference numerals and symbols are designated to the parts equivalent to those in the first embodiment in functions or operations, and explanation on the parts will be omitted for simplicity.
An apparatus for arranging 3D scan data automatically according to the second embodiment of the present invention includes a marker generator 70, a pattern projector 16, an image obtaining part 18, a movement driving part 20, a moving mechanism 22, an image input part 24, an individual marker blinking controller 74, a projector controller 28, a microprocessor 76, and a buffer 32.
The marker generator 70 comprises a plurality of marker output parts 72 and projects markers recognizable by the image obtaining part 18 randomly on the surface of the object 10.
The marker generator 70 turns on the first to N-th marker output parts 72 one by one sequentially in response to the control of the individual marker blinking controller 74, which enables a different marker to be included in each image obtained from the image obtaining part 18.
The individual marker blinking controller 74 sequentially and individually blinks the plurality of marker output parts 72 mounted at the marker generator 70 in a predetermined order according to the control of the microprocessor 76. The microprocessor 76 carries out the arrangement of the coordinate systems corresponding to the plurality of 3D scan data photographed at various angles by analyzing the 2D image data and 3D scan data inputted through the image input part 24. That is, the microprocessor 76 sets up an image photographed by the image obtaining part 18 while all the markers are turned off as a reference image, and then compares the reference image with a plurality of images photographed while the markers are turned on one by one sequentially. Through the above process, the 2D position of each marker can be obtained.
Successively, the microprocessor 76 carries out the same processes as performed in the first embodiment. In other words, the microprocessor analyzes the 2D image data and the
3D scan data to compute the 3D positions of the markers, and searches corresponding markers to obtain a translation matrix, and translates the plurality of 3D scan data to the reference coordinate system.
The operation according to the second embodiment of the present invention thus described will now be explained in detail with reference to the flow chart shown in Fig. 12.
First, the microprocessor 76 controls the movement driving part 20 to activate the movement mechanism 22, which moves the image obtaining part 18 integrated with the pattern projector 16 to a position appropriate for scanning the object 10 (S30).
Under such circumstances, the microprocessor 76 obtains the image data photographed by the image obtaining part 18 as a reference image while all of the optical markers are turned off. Then, the microprocessor 76 controls the individual marker blinking controller 74 to turn on a firstly-designated marker output part 72 out of the plurality of marker output parts 72 equipped in the marker generator 70, which allows the first marker to be projected on the surface of the object 10 (S31). Then, a first image data is obtained by photographing with the image obtaining part 18 (S32).
Next, the microprocessor 76 controls the individual marker blinking controller 74 to turn on the secondly-designated marker output part according to a predetermined order and to allow the second optical marker to be projected on the object 10 (S33). Then, the second image data is obtained (S34). Next, the microprocessor 76 discriminates whether the marker contained in the image is the last (N-th) marker out of the predetermined plural markers (S35). If the marker is not the last marker, steps S33 and S34 are repeatedly carried out until N-th image data is obtained.
Meanwhile, if it is discriminated that the marker is the last one, the microprocessor 76 controls the projector controller 28 to activate the pattern projector 16 while the marker generator 70 is turned off to prevent the optical marker from being projected and to allow a prescribed pattern (for example, patterns having stripes with different gaps therebetween or multiple striped pattern) for 3D scanning to be projected on the object 10 by the pattern projector 16.
At this time, if the image obtaining part 18 photographs the object 10 with the pattern projected on it to obtain the 3D scan data, the microprocessor 76 receives the 3D scan data from the image input part 24 (S36).
The microprocessor 76 compares each of the first to N-th image data with the reference image and, in each comparison, searches for the bright spot formed by the corresponding optical marker, whereby the 2D position of each marker can be found easily (S37).
Successively, the microprocessor 76 computes the 3D positions of the markers by
analyzing the 2D positions of the markers and the 3D scan data, and searches corresponding markers included in overlapped regions in reference to the 3D positions of the markers, and calculates translation matrices, and translates the plurality of 3D scan data to the reference coordinate system (S38), which is the same as described in the first embodiment.
Next, the microprocessor 76 discriminates whether the automatic arrangement for the
3D scan data has been completed (S39). If the arrangement has not been completed, the flow returns to S30 to move the pattern projector 16 and the image obtaining part 18 to appropriate positions by activating the movement mechanism 22 under the control of the movement driving part 20. Steps S30 to S38 are repeated.
Next, the third embodiment of the present invention will be described in detail with reference to the accompanying drawings.
The construction of the apparatus for automatically arranging the 3D scan data according to the third embodiment of the present invention is identical to the one shown in Fig. 11. However, the method is different between the second embodiment and the third embodiment. That is, in the second embodiment, N number of images, each of which includes one marker different from others, have to be photographed respectively. However, in the third embodiment, log2(N+l) number of images are photographed, each of which includes a group of markers for binarization.
Under the control of the microprocessor 76, the individual marker blinking controller 74 divides the marker output parts 72 disposed at the marker generator 70 into several groups for binarization, and turns on the markers group by group.
For example, if the number of marker output parts 72 is 16, the individual marker blinking controller 74 divides the 16 marker output parts 72 into 4 groups in an overlapping way.
In other words, for example, a first group comprises the 9th to 16th markers, a second group comprises the 5th to 8th and the 13th to 16th markers, a third group comprises the 3rd, 4th, 7th, 8th, 11th, 12th, 15th and 16th markers, and a fourth group comprises the even-numbered markers (2nd, 4th, 6th, 8th, 10th, 12th, 14th, 16th), which are all represented in Table 1, where "0" represents that the marker is turned off while "1" represents that the marker is turned on.

TABLE 1

Marker   1st image  2nd image  3rd image  4th image
   1         0          0          0          0
   2         0          0          0          1
   3         0          0          1          0
   4         0          0          1          1
   5         0          1          0          0
   6         0          1          0          1
   7         0          1          1          0
   8         0          1          1          1
   9         1          0          0          0
  10         1          0          0          1
  11         1          0          1          0
  12         1          0          1          1
  13         1          1          0          0
  14         1          1          0          1
  15         1          1          1          0
  16         1          1          1          1
As defined in Table 1, the first marker maintains a turned-off state at all times, while the 16th marker always maintains a turned-on state, and each marker has an intrinsic binary value.
The microprocessor 76 controls the individual marker blinking controller 74 such that the N markers are projected group by group, compares the log2(N) image data obtained by the image obtaining part 18, and computes the 2D positions of the markers.
In case the 16 markers are projected group by group to obtain the first to fourth image data as shown in Table 1, the 16 markers are differentiated by their binary codes, which represent their turned-on or turned-off states, that is, their identifications (IDs). Therefore, the 2D positions of the 16 markers can be obtained. For example, the 10th marker is recognized as a binary "1001" and the 13th marker is recognized as a binary "1100". Meanwhile, the first marker, always maintaining a turned-off state, is not used, such that a total of 15 markers can be utilized in practice.
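The decoding described above can be sketched as follows (illustrative; the function name is an assumption). Reading a marker's on/off state across the group images, with the first group as the most significant bit, yields the binary code of (marker number minus one):

```python
def marker_id(bits):
    """bits: the marker's observed on/off states across the group
    images, first group first (most significant bit).  Under the
    Table 1 scheme, marker k carries the binary code of k - 1."""
    code = 0
    for b in bits:
        code = (code << 1) | b
    return code + 1          # markers are numbered from 1

# The examples from the text:
print(marker_id((1, 0, 0, 1)))  # 10 -- the 10th marker reads "1001"
print(marker_id((1, 1, 0, 0)))  # 13 -- the 13th marker reads "1100"
```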
As a result, even when 1,024 (2^10) markers are used, 10 image data are sufficient to distinguish the markers by the above binarization. Furthermore, the microprocessor 76 computes the 3D positions of the markers from the 2D positions of the markers and the 3D scan data, searches corresponding markers, calculates translation matrices, and moves the plurality of 3D scan data by the translation matrices. The above process is the same as that described in the first embodiment.
The operation of the third embodiment of the present invention thus described will now be explained in detail with reference to the flow chart illustrated in Fig. 13.
As shown in Table 1, an embodiment will be explained in which 16 marker output parts disposed at the marker generator 70 are available to project a total of 16 markers, and four sets of 2D image data are obtained from the image obtaining part 18.
First, the microprocessor 76 controls the movement driving part 20 to drive the movement mechanism 22, which moves the image obtaining part 18 integrated with the pattern projector 16 to a position appropriate for scanning an object 10 (S40).
Next, the microprocessor 76 controls the individual marker blinking controller 74 to turn on the marker output part 72 so that the markers (9th - 16th markers) belonging to the first group can be projected (S41). A first image data photographed by the image obtaining part 18 is obtained via the image input part 24 (S42).
Successively, the microprocessor 76 controls the individual marker blinking controller 74 to turn on the marker output parts 72 so that the N-th group of markers, for example, the 5th to 8th and 13th to 16th markers, is projected (S43), thereby obtaining the N-th image data photographed by the image obtaining part 18 (S44).
Then, the microprocessor 76 discriminates whether the marker group contained in the image data is the last one or not (S45), and if it is not the last one, the flow is returned to S43 to repeat the process.
Meanwhile, as a result of the discrimination at S45, if it is discriminated that the marker group is the last one, the projector controller 28 drives the pattern projector 16 to project patterns on the surface of the object 10 while the marker generator 70 is turned off to prevent the optical markers from being projected.
At this time, when a 3D scan data is obtained in the image obtaining part 18 by photographing the object 10 projected with patterns, the microprocessor 76 receives the 3D scan data through the image input part 24 (S46).
Successively, the microprocessor 76 compares the first to N-th images obtained from the image obtaining part 18, which results in obtaining the binary information of the markers relative to the first to fourth image data. Therefore, the ID of each marker, that is, the 2D position of each marker, is obtained (S47).
Meanwhile, the microprocessor 76 computes the 3D positions of the markers by analyzing the 2D positions of the markers and the 3D scan data, searches the corresponding markers included in the overlapped region of two different 3D scan data in reference to the 3D positions of the markers, calculates the translation matrix, and translates one of the 3D scan data to the reference coordinate system by the translation matrix (S48), which is the same as described in the first embodiment.
The microprocessor 76 discriminates whether the automatic arrangement of the 3D scan data has been completed or not (S49). If the automatic arrangement of the 3D scan data has not been completed, the flow is returned to S40. Therefore, steps S40 to S48 are repeated.
Next, a fourth embodiment of the present invention will be described in detail with reference to the accompanying drawings.
As shown in Fig. 14, an apparatus for automatically arranging 3D scan data according to the fourth embodiment of the present invention includes a pattern projector 16, an image obtaining part 18, a movement driving part 20, a movement mechanism 22, an image input part 24, a projector controller 28, a buffer 32, a marker generator 80, an individual marker blinking controller 84, and a microprocessor 86.
Throughout the drawings, the same reference numerals and symbols are designated to the parts equivalent to those in the first embodiment in functions or operations, and explanation on the parts will be omitted for simplicity.
The marker generator 80 projects markers recognizable by the image obtaining part 18 on the surface of the object 10. The marker generator 80 is disposed with a plurality of marker output parts 82 for projecting a plurality of optical markers at irregular angles on the entire surface of the object 10.
The marker generator 80 selectively blinks the plural marker output parts 82 according to the control of the individual marker blinking controller 84. The individual marker blinking controller 84 individually controls the plural marker output parts 82 according to the control of the microprocessor 86.
The microprocessor 86 analyzes the scanned data obtained from the object 10 for automatically arranging the 3D scan data in a single uniform coordinate system. The microprocessor 86 receives 2D image data and 3D scan data photographed by the image obtaining part 18 at various angles via the image input part 24 to analyze them for automatic arrangement on one coordinate system, the detailed operation procedures of which are the same as those of the microprocessor in the first embodiment.
However, there is a difference between the first embodiment and the fourth embodiment in that after the process of obtaining a 2D image data and 3D scan data for one region of the object 10 is carried out, the markers projected on the region blink at a
predetermined cycle (for example, approximately 0.5 second), while the markers projected on the other region maintain an "on" state by the control of the individual marker blinking controller 84.
Conversely, it may be possible that markers for a region for which the obtaining process is already terminated maintain an "on" state, while the markers for the other region blink at a predetermined cycle.
In other words, the states of the markers are set up differently between a region where image data and scan data have already been obtained and the remaining region. Therefore, an operator can easily differentiate the regions.
Next, a fifth embodiment of the present invention will be described in detail with reference to the accompanying drawings. As illustrated in Fig.15, an apparatus for automatically arranging 3D scan data according to the fifth embodiment of the present invention includes a pattern projector 16, an image obtaining part 18, a movement driving part 20, a movement mechanism 22, an image input part 24, a projector controller 28, a buffer 32, a marker generator 90, a marker individual blinking/color controller 94 and a microprocessor 96.
Throughout the drawings, the same reference numerals and symbols are designated to the parts equivalent to those in the first embodiment in functions or operations, and explanation on the parts will be omitted for simplicity.
The marker generator 90 projects markers recognizable by the image obtaining part 18 on a surface of an object. The marker generator 90 is disposed with a plurality of marker output parts 92 for projecting a plurality of optical markers at arbitrary angles on the surface of the object 10.
The marker generator 90 is constructed such that at least two different colors can be selectively projected from each marker output part 92 according to the control of the marker individual blinking/color controller 94. For example, each marker output part 92 is equipped with two or more light sources of different colors such that these light sources can be selectively lighted.
The marker individual blinking/color controller 94 controls the blinking and individual coloring of the plurality of marker output parts 92 disposed at the marker generator 90 according to the control of the microprocessor 96.
The microprocessor 96 analyzes the 2D image data and 3D scan data photographed from various angles by the image obtaining part 18 for automatically arranging the 3D scan data on one coordinate system. The detailed operation procedures thereto are the same as those in the first embodiment of the present invention.
However, there are some differences between the fifth embodiment and the first embodiment. The microprocessor 96 according to the fifth embodiment of the present invention controls the marker individual blinking/color controller 94 so that the markers projected on a region where image data and scan data have already been obtained have colors different from those of the markers projected on the other region.
By differentiating the colors according to the regions, an operator can easily check with the naked eye whether the 2D image data and 3D scan data have been obtained from each region, which adds convenience to the scanning operation.
Next, a sixth embodiment of the present invention will be described in detail with reference to the accompanying Fig. 16.
As shown in Fig. 16, an apparatus for automatically arranging 3D scan data includes marker generators 12, a pattern projector 16, an image obtaining part 18, an image input part 24, a marker blinking controller 26, a projector controller 28, a buffer 32, a rotating table 100, a rotating drive part 102, a rotating mechanism 104 and a microprocessor 106.
Throughout the drawings, the same reference numerals and symbols are designated to
the parts equivalent to those in the first embodiment in functions or operations, and explanation on the parts will be omitted for simplicity.
The rotating table 100 rotates with an object 10 placed on the upper plate of the rotating table 100, and also rotates with a plurality of marker generators 12 disposed at a circumference of the upper plate.
The rotating drive part 102 drives the rotating mechanism 104 to rotate the rotating table 100 according to the control of the microprocessor 106 such that the object can be set at an angle appropriate for scanning.
Here, although the rotating drive part 102 is utilized to electrically rotate the rotating table 100 in the sixth embodiment of the present invention, it should be apparent that the rotating mechanism 104 may be manually rotated to allow an operator to control the rotating table 100 arbitrarily.
Furthermore, as long as the marker generators and the object can be rotated together in a fixed state, devices other than the rotating table 100 may also be applied herein.
The microprocessor 106 receives 2D image data and 3D scan data photographed by the image obtaining part 18 at various angles and analyzes the data for arranging the 3D scan data automatically on one coordinate system, the detailed operation procedures of which are the same as those of the microprocessor in the first embodiment.
However, there is a difference in the sixth embodiment of the present invention in that the object 10 and the marker generator 12 are rotated during the scanning process instead of the image obtaining part 18 and the pattern projector 16.
The operational procedures for the apparatus for automatically arranging 3D scan data according to the sixth embodiment of the present invention thus described will now be
explained in detail with reference to Figs. 17a and 17b.
First, the microprocessor 106 controls the rotating drive part 102 to drive the rotating mechanism 104, thus rotating the rotating table 100 at a predetermined angle so that the object 10 can be rotated to a position appropriate for scanning (S50).
Under such condition, the microprocessor 106 controls the marker blinking controller
26 to turn on the plurality of marker output parts 14 equipped at the marker generators 12, thus allowing the plural markers to be projected on the surface of the object 10 (S51).
When the image obtaining part 18 photographs the object 10 to thereby obtain a 2D image including the optical markers while the optical markers are being projected, the microprocessor 106 receives the 2D image data obtained by the image obtaining part 18 via the image input part 24 (S52).
Successively, the microprocessor 106 controls the marker blinking controller 26 to turn off the marker generators 12, thereby preventing the optical markers from being projected on the object 10 (S53). Next, the same region of the object 10 without the markers is photographed and the 2D image data thereof is received via the image input part 24 (S54).
Furthermore, the microprocessor 106 controls the projector controller 28 to activate the pattern projector 16 while the marker generators 12 are turned off to prevent the optical markers from being projected. Therefore, prescribed patterns (for example, patterns having stripes with different gaps therebetween or multiple striped pattern) are projected on the surface of the object 10 for 3D scanning.
When the image obtaining part 18 photographs the object 10 projected with the prescribed patterns to obtain the 3D scan data, the microprocessor 106 receives the 3D scan data via the image input part 24 (S55).
The microprocessor 106 calculates the 2D positions of the markers by image-processing the 2D image data with the markers and the 2D image data without the markers (S56).
Successively, the microprocessor 106 computes the 3D positions of the markers from the 2D positions of the markers and the 3D scan data (S57). That is, the 3D positions of the markers can be obtained by estimating the intersection points where straight lines connecting the camera lens center of the image obtaining part 18 and the positions of arbitrary markers in the 2D image data intersect the 3D scan data.
Meanwhile, the microprocessor 106 discriminates whether the register of buffer 32 is vacant or not (S58).
As a result of the discrimination at S58, if the register of the buffer 32 is not vacant, the microprocessor 106 compares the 3D positions of the markers obtained at S57 with those of the markers included in the 3D scan data stored in the registers of the buffer 32, thereby searching markers that correspond to each other (S59).
After the corresponding markers are obtained by comparing the markers included in the current 3D scan data with the markers stored in the register of the buffer 32 in the searching process of S59, the microprocessor 106 computes translation matrices by analyzing the relation of the corresponding markers (S60), and the current scan data are translated to the reference coordinate system in which the 3D scan data registered in the buffer 32 are defined (S61), which is the same as described in the first embodiment.
Then, the microprocessor 106 registers the markers at the register of the buffer 32, which serves as a reference in the next calculation (S62). Successively, the microprocessor 106 checks whether the automatic arrangement of the 3D scan data obtained from the object 10 is completed (S63).
As a result of the check, if it is discriminated that the automatic arrangement has not been completed for the 3D scan data obtained from the object 10, the flow returns to S50. Therefore, 2D image data and 3D scan data for other regions of the object 10 are obtained by activating the rotating mechanism 104 under the control of the rotating drive part 102 and rotating the rotating table by a prescribed angle. Steps S50 to S62 are repeatedly carried out.
As apparent from the above description, the sixth embodiment of the present invention is constructed such that the object 10 is moved. Thus, it is easier to obtain and arrange 3D scan data from a relatively small object than in the first embodiment of the present invention, where the projector and the image obtaining part are structured to move.
Here, the marker generators are fixed on the rotating table for preventing relative movement between them, until the scanning process is completed.
Meanwhile, the arrangement method using a reference coordinate system in the aforementioned embodiments has a drawback in that errors can accumulate if the number of regions to be scanned is great. This is because, in the above methods, the arrangement is carried out by integrating one 3D scan data into the reference coordinate system in which its already-obtained neighboring 3D scan data is defined, and this arrangement process is repeated over all regions of the object. Therefore, an error introduced in one step can be amplified by the end of the arrangement.
For example, Figs. 18a and 18b illustrate two scan data obtained by scanning two adjacent regions that overlap each other. The dotted lines indicate real data of an object and the solid lines indicate scan data which are not identical to the real data.
Under such circumstances, if any one of the scan data in Figs. 18a and 18b serves as a reference and the other data is attached (arranged) to the reference, errors may add up in the attachment, which results in Fig. 19c. In other words, an increase in the number of regions to be scanned is likely to increase the accumulated error.
In order to cope with the aforementioned problems, a method of arranging 3D scan data on an absolute coordinate system, instead of a reference coordinate system, is presented in the seventh and eighth embodiments of the present invention.
The absolute coordinate system in these embodiments is different from the reference coordinate system in that every 3D scan data of the regions of an object is mapped to absolute coordinates. Therefore, errors occurring in obtaining one 3D scan data are not transmitted to the adjacent 3D scan data.
For example, Figs. 19a and 19b illustrate two scan data obtained from scanning two adjacent regions, where a part of the scan data overlaps. If the two scan data of Figs. 19a and 19b are translated to the absolute coordinate system respectively and are attached to each other as shown in Fig. 19d, the errors occurring in the two scan data are not added up as they are in Fig. 19c, such that the error amplification problem caused by the inaccuracy of the image obtaining part thus described can be prevented.
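The difference between chained (reference-coordinate) arrangement and absolute-coordinate arrangement can be illustrated with a deliberately simplified one-dimensional sketch (the error values are assumptions for illustration, not measurements from the patent): with chaining, scan k inherits the registration errors of all previous scans, while absolute mapping leaves each scan with only its own error.

```python
def chained_errors(per_step_error, n_scans):
    """Each scan is registered to its already-arranged neighbour, so
    its error is the sum of all registration errors before it."""
    errors, total = [], 0.0
    for _ in range(n_scans):
        total += per_step_error
        errors.append(total)
    return errors

def absolute_errors(per_step_error, n_scans):
    """Each scan is registered directly to the absolute coordinate
    system, so its error never exceeds one registration error."""
    return [per_step_error] * n_scans

print(chained_errors(0.1, 5)[-1])   # grows to ~0.5 after five scans
print(absolute_errors(0.1, 5)[-1])  # stays at 0.1
```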
First, the seventh embodiment of the present invention will be described in detail with reference to the accompanying drawings.
An apparatus for automatically arranging 3D scan data according to the seventh embodiment illustrated in Fig. 20 includes a marker generator 12, a pattern projector 16, an image obtaining part 18, a first movement driving part 20, a first movement mechanism 22, a marker blinking controller 26, a projector controller 28, a buffer 32, a large domain image obtaining part 110, an image input part 112, a second movement driving part 114, a second movement mechanism 116, a microprocessor 118, and a reference object 120. Throughout the drawings, the same reference numerals and symbols are designated to the parts equivalent to those in the first embodiment in functions or operations, and explanation of those parts will be omitted for simplicity.
The large domain image obtaining part 110 comprises an image sensor for receiving images, such as a CCD camera or a Complementary Metal Oxide Semiconductor (CMOS) camera. When markers are projected on the surface of an object 10 from the marker generator 12, images thereof are photographed and obtained by the large domain image obtaining part 110. The large domain image obtaining part 110 is positioned separately from the image obtaining part 18 to photograph and obtain an image of a large domain of the object 10.
The large domain image obtaining part 110 preferably adopts an image sensor having a relatively higher accuracy than that of the image obtaining part 18, which obtains images of only a part of the scan domain.
The image input part 112 receives image data from the image obtaining part 18 and the large domain image obtaining part 110.
The second movement driving part 114 drives the second movement mechanism 116 to move the large domain image obtaining part 110 to a position suitable for obtaining the image of large part of the object 10 according to the driving control of the microprocessor 118.
It should be noted that although the large domain image obtaining part 110 is moved electrically by the second movement driving part 114 in the seventh embodiment of the present invention, it may be moved by manipulating the second movement mechanism.
The microprocessor 118 computes 3D positions of each marker for the large scan domain by analyzing image data of the object 10 and a reference object 120 photographed by the large domain image obtaining part 110 in two or more different directions, while a plurality of optical markers are projected on the surface of the object 10 by the marker generator 12.
The 3D positions of the markers thus obtained serve as an absolute coordinate system.
Furthermore, the microprocessor 118 receives via the image input part 112 a plurality of 2D image data and 3D scan data photographed at various angles by the image obtaining part 18, and analyzes them, and translates every 3D scan data to the absolute coordinate system, which results in an arrangement of a complete 3D scan data of the object 10.
The reference object 120, an object of a prescribed shape whose size (dimension) information is pre-inputted into the microprocessor 118, is arranged close to the object 10. An image of the reference object 120 is obtained along with that of the object 10 via the large
domain image obtaining part 110.
The operating process of an apparatus for automatically arranging 3D scan data according to the seventh embodiment of the present invention thus explained will now be described in detail with reference to the flow charts shown in Figs. 21a and 21b.
First, an object 10 is placed beside the marker generator 12, and a reference object
120 is arranged at a prescribed position near the object 10. Then, the microprocessor 118 controls a second movement driving part 114 to drive the second movement mechanism 116 so that the large domain image obtaining part 110 moves to a position suitable for scanning the object 10.
Next, the microprocessor 118 controls the marker blinking controller 26 to turn on a plurality of marker output parts 14 equipped at the marker generator 12, whereby a plurality of markers can be arbitrarily projected on the surface of the object 10 (S70).
The large domain of the object 10 and the reference object 120 are photographed by the large domain image obtaining part 110 to obtain 2D image data including the optical markers. Then the microprocessor 118 receives the 2D image data obtained from the large domain image obtaining part 110 via the image input part 112 (S71).
In Fig. 23, an example of an image including the entire domain of the object 10 and the reference object 120 obtained by the large domain image obtaining part 110 is shown.
The reference symbol "RM" indicates an optical marker projected on the surface of the object 10, while the reference symbol "BI" refers to the image obtained by the large domain image obtaining part 110.
Next, the microprocessor 118 controls the second movement driving part 114 to drive the second movement mechanism 116 for moving the large domain image obtaining part 110 to a position suitable for scanning another part of the object 10 (S72).
Next, the microprocessor 118 controls the large domain image obtaining part 110 to photograph the large domain of the object 10 including the reference object 120, whereby a 2D image including the optical markers is obtained in a direction different from that of S71. The 2D image is received by the microprocessor 118 via the image input part 112 (S73).
The microprocessor 118 then controls the marker blinking controller 26 to turn off the marker generator 12, thereby preventing the optical markers from being projected on the surface of the object 10 (S74).
The microprocessor 118 combines the 2D images of the large domain of the object 10 obtained in different directions by the large domain image obtaining part 110 and computes the 3D positions of the markers included in the combined 2D images in reference to the already-known dimensions of the reference object 120 (S75). Next, the microprocessor 118 registers the 3D positions of each marker thus calculated in the register of the buffer 32 (S76).
Successively, the microprocessor 118 controls the first movement driving part 20 to drive the first movement mechanism 22, whereby the image obtaining part 18 integrated with the pattern projector 16 is moved to a position suitable for scanning the object 10 (S77).
Under such circumstances, the microprocessor 118 controls the marker blinking controller 26 to turn on the plurality of marker output parts 14 equipped at the marker generator 12 to allow the plural markers to be arbitrarily projected on the surface of the object 10 (S78).
After the image obtaining part 18 photographs a part (see "NI" in Fig. 23) of the large domain of the object 10 while the optical markers are projected on the surface of the object 10, the microprocessor 118 receives the 2D image data obtained by the image obtaining part 18 via the image input part 112 (S79).
Next, the microprocessor 118 controls the marker blinking controller 26 to turn off the marker generator 12, thereby preventing the optical markers from being projected on the
object 10 (S80). Under this condition, the same part of the large domain mentioned above is photographed by the image obtaining part 18 to obtain a 2D image not including the optical markers. The 2D image data thus obtained is inputted to the microprocessor 118 via the image input part 112 (S81).
Furthermore, the microprocessor 118 controls the project controller 28 to activate the pattern projector 16 while the marker generator 12 is turned off to avoid the optical markers from being projected, whereby predetermined patterns (for example, patterns having stripes with different gaps therebetween or multiple striped pattern) are projected on the surface of the object 10.
When the object 10 projected with patterns is photographed by the image obtaining part 18 to obtain 3D scan data, the microprocessor 118 receives the 3D scan data via the image input part 112 (S82). Under such circumstances, the microprocessor 118 computes the 2D positions of the markers by image-processing the 2D image data including the markers and the 2D image data not including the markers (S83).
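The image processing of step S83 can be sketched as follows: subtracting the frame taken without the markers from the frame taken with them leaves only the projected marker spots, whose centroids give the 2D marker positions. The synthetic images, function name, and threshold below are hypothetical, and a real system would label each bright blob separately.

```python
import numpy as np

# Minimal sketch of step S83: recover 2D marker positions by differencing
# the image taken with the markers projected and the image taken without
# them (hypothetical synthetic data stands in for the camera frames).
def marker_positions_2d(img_on, img_off, thresh=50):
    diff = img_on.astype(np.int32) - img_off.astype(np.int32)
    ys, xs = np.nonzero(diff > thresh)        # pixels brightened by a marker
    if len(xs) == 0:
        return []
    # Centroid of the bright blob gives a sub-pixel marker position.
    return [(float(xs.mean()), float(ys.mean()))]

img_off = np.full((32, 32), 20, dtype=np.uint8)   # scene without markers
img_on = img_off.copy()
img_on[10:13, 14:17] = 220                        # one projected marker
print(marker_positions_2d(img_on, img_off))       # → [(15.0, 11.0)]
```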
The microprocessor 118 computes the 3D positions of the markers from the 2D positions of the markers and the 3D scan data. That is, the 3D positions of the markers can be obtained by estimating intersection points where straight lines connecting the camera lens center of the image obtaining part 18 and the positions of three arbitrary markers in the 2D image data intersect the 3D scan data (S84).
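The ray-intersection idea of step S84 can be sketched in code: cast a ray from the camera lens center through the marker's 2D image position and intersect it with the 3D scan data. The sketch below approximates the scan as a point cloud and the intersection as the scan point nearest to the ray; all geometry and names are hypothetical illustrations, not the specification's exact procedure.

```python
import numpy as np

# Sketch of step S84: a marker's 3D position is the point where the ray
# from the camera lens center through the marker's 2D image position
# meets the 3D scan data (here: the nearest point of a point cloud).
def marker_position_3d(cam_center, marker_2d, focal, scan_points):
    # Ray direction through the image-plane position (pinhole model).
    d = np.array([marker_2d[0], marker_2d[1], focal], dtype=float)
    d /= np.linalg.norm(d)
    v = scan_points - cam_center                       # vectors to scan points
    t = v @ d                                          # projection onto the ray
    dist = np.linalg.norm(v - np.outer(t, d), axis=1)  # distance to the ray
    return scan_points[np.argmin(dist)]

cam = np.zeros(3)
scan = np.array([[0.0, 0.0, 5.0], [1.0, 0.0, 5.0], [0.0, 1.0, 5.0]])
# A marker seen at the image center lies straight ahead of the camera.
print(marker_position_3d(cam, (0.0, 0.0), 1.0, scan))  # → [0. 0. 5.]
```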
Successively, the microprocessor 118 compares the 3D positions of markers found at
S84 with the 3D positions of the markers stored in the register of the buffer 32 at S76 to search for corresponding markers, in other words, markers whose 3D positions are identical (S85).
When corresponding markers are found by comparing the optical markers included in the current 3D scan data with the markers stored in the register of the buffer 32, the microprocessor 118 calculates the translation matrices for translating the markers in the current
3D scan data to the absolute coordinate system (S86). Then, the current scan data are moved by the translation matrices to be arranged on the absolute coordinate system by which the 3D positions of the markers stored in the register of the buffer 32 are defined (S87).
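Steps S86 and S87 can be sketched as follows: from three or more corresponding markers, estimate the rigid transform (rotation and translation) that maps the current scan's markers onto the absolute-coordinate markers, then apply it to the whole scan. The SVD-based Kabsch method shown here is one standard way to do this; the specification does not name a particular algorithm, and all values are hypothetical.

```python
import numpy as np

# Sketch of S86-S87: rigid transform from corresponding markers, then
# moving the current scan into the absolute coordinate system.
def rigid_transform(src, dst):
    sc, dc = src.mean(axis=0), dst.mean(axis=0)
    H = (src - sc).T @ (dst - dc)             # cross-covariance of the markers
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                  # avoid a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, dc - R @ sc

# Markers seen in the scanner's frame vs. the same markers in absolute
# coordinates (here the scan frame is shifted by [1, 2, 3]).
markers_abs = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.]])
markers_scan = markers_abs - np.array([1., 2., 3.])
R, t = rigid_transform(markers_scan, markers_abs)

scan_data = np.array([[5., 5., 5.]])          # a point of the current scan
print(np.round(scan_data @ R.T + t, 6))       # → [[6. 7. 8.]]
```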
Next, the microprocessor 118 discriminates whether an automatic arrangement in relation to the 3D data obtained from the object 10 is completed or not, in other words, whether 3D scan data obtained from parts of the object 10 are all arranged or not (S88).
If it is not completed, the flow returns to S77. Therefore, the microprocessor 118 controls the first movement driving part 20 for driving the first movement mechanism 22, whereby the pattern projector 16 and the image obtaining part 18 are moved to positions suitable for scanning the parts of the object 10 not yet scanned. Steps S77 to S88 are repeatedly carried out.
Although in the seventh embodiment the large domain image obtaining part and the image obtaining part are implemented as two separate elements, a single image obtaining part may preferably be utilized to obtain both the images of the large domain of the object and the images of parts of that large domain.
Next, an eighth embodiment of the present invention will be described in detail with reference to the attached drawing of Fig. 24.
The apparatus for automatically arranging 3D scan data using optical markers according to the eighth embodiment of the present invention as shown in Fig. 24 includes a marker generator 12, a pattern projector 16, an image obtaining part 18, a movement driving part 20, a movement mechanism 22, a marker blinking controller 26, a project controller 28, a buffer 32, a pair of or plural large domain image obtaining parts 130 and 132, an image input part 134 and a microprocessor 136. Throughout the drawings, the same reference numerals and symbols are designated to the parts equivalent to those in the first embodiment in functions or operations, and explanations on the parts will be omitted for simplicity.
The pair of large domain image obtaining parts 130 and 132 comprise image sensors
for receiving images such as CCD cameras or CMOS cameras. The cameras are fixed to each other, and they capture images of the same object from different angles, the method of which is called Stereo Vision.
The large domain image obtaining parts 130 and 132 preferably adopt image sensors of relatively higher resolution than that of the image obtaining part 18, which obtains images of parts of the domain. The image input part 134 receives the image data obtained by the image obtaining part 18 and the large domain image obtaining parts 130 and 132.
The microprocessor 136 computes 3D positions of each marker for large scan domains by analyzing image data of the object 10 photographed by the large domain image obtaining parts 130 and 132 in two different directions, while a plurality of optical markers are projected on the surface of the object 10 by the marker generator 12. The 3D positions of markers thus obtained serve as an absolute coordinate system.
Furthermore, the microprocessor 136 receives via the image input part 134 a plurality of 2D image data and 3D scan data photographed at various angles by the image obtaining part 18, and analyzes them, and translates every 3D scan data to the absolute coordinates, which results in an arrangement of the whole 3D scan data of the object 10.
Next, the operating process of the apparatus for automatically arranging 3D scan data according to the eighth embodiment of the present invention will be described in detail with reference to the flow charts shown in Figs. 25a and 25b.
First, a predetermined object 10 is placed beside the marker generator 12, and the microprocessor 136 controls the marker blinking controller 26 to turn on the plurality of marker output parts 14 equipped at the marker generator 12, allowing plural markers to be arbitrarily projected on the surface of the object 10 (S90).
The microprocessor 136 receives the two 2D image data obtained from the large domain image obtaining parts 130 and 132 via the image input part 134 when the large domain of the
object 10 is photographed in an overlapping manner in different directions by the large domain image obtaining parts 130 and 132, respectively, while the optical markers from the marker generator 12 are projected on the object 10 (S91).
Fig. 26 illustrates an example of images of the large scan domain of the object 10 obtained by the large domain image obtaining parts 130 and 132. The reference symbol
"RM" indicates an optical marker projected on the surface of the object 10, "BI-1 " is an image obtained by the large domain image obtaining part 132, and "BI-2" is an image obtained by the large domain image obtaining part 130.
Successively, the microprocessor 136 controls the marker blinking controller 26 to turn off the marker generator 12, thereby preventing the optical markers from being projected on the surface of the object 10 (S92).
The microprocessor 136 computes the 3D positions of the markers included in the large scan domain of the object in reference to the two 2D image data photographed in two different directions by the large domain image obtaining parts 130 and 132 (S93). In other words, from the relation between the positions of the pair of large domain image obtaining parts 130 and 132 and the 2D positions of each marker projected on the object 10, the 3D positions of each marker are calculated by triangulation, the details of which will be explained later. Next, the microprocessor 136 registers the 3D positions of each marker thus calculated in the register of the buffer 32 (S94).
The microprocessor 136 then controls the movement driving part 20 to drive the movement mechanism 22, whereby the image obtaining part 18 integrated with the pattern projector 16 moves to a position suitable for scanning the object 10 (S95).
Under such condition, the microprocessor 136 controls the marker blinking controller 26 to turn on the plurality of marker output parts 14 equipped at the marker generator 12, thereby projecting the plural markers arbitrarily on the surface of the object 10 (S96).
When a part ("NI" in Fig.26b) out of the large scan domain for the object 10 is photographed by the image obtaining part 18 to obtain 2D image data including the optical markers, the microprocessor 136 receives the 2D image data via the image input part 134 (S97).
Successively, the microprocessor 136 controls the marker blinking controller 26 to turn off the marker generator 12, preventing the optical markers from being projected on the object 10 (S98). Under such condition, when the same part as mentioned above is photographed by the image obtaining part 18 to obtain 2D image data not including the optical markers, the microprocessor 136 receives the 2D image data via the image input part 134 (S99).
Furthermore, the microprocessor 136 controls the project controller 28 to activate the pattern projector 16 while the marker generator 12 is turned off to avoid the optical markers from being projected. Therefore, predetermined patterns (for example, patterns having stripes with different gaps therebetween or multiple striped pattern) are projected on the surface of the object 10.
When the object 10 projected with patterns is photographed by the image obtaining part 18 to obtain 3D scan data, the microprocessor 136 receives the 3D scan data via the image input part 134 (S100).
Under such circumstances, the microprocessor 136 analyzes the 2D image data including the markers and the 2D image data not including the markers to calculate the 2D positions of the markers (S101).
Furthermore, the microprocessor 136 computes 3D positions of the markers in reference to the 2D positions of the markers and the 3D scan data (S102). That is, the 3D positions of the marker can be obtained by estimating intersection points where straight lines connecting the camera lens center of the image obtaining part 18 and the positions of arbitrary markers in the 2D image data intersect the 3D scan data.
Successively, the microprocessor 136 compares the 3D positions of the markers found at S102 with the 3D positions of the markers stored in the register of the buffer 32 at S94 to search for corresponding markers (S103).
When corresponding markers are found by the search process of the markers described above, the microprocessor 136 calculates the translation matrices for translating the markers in the current 3D scan data (S104). The current scan data are moved by the translation matrices to be arranged on the absolute coordinates by which the 3D positions of the markers stored in the register of the buffer 32 are defined (S105).
Next, the microprocessor 136 discriminates whether the automatic arrangement of the 3D data obtained from the object 10 is completed or not, in other words, whether the 3D scan data obtained from the parts of the entire scan domain of the object 10 are all arranged (S106).
If it is not completed, the flow returns to S95. Therefore, the microprocessor 136 controls the movement driving part 20 for driving the movement mechanism 22, whereby the pattern projector 16 and the image obtaining part 18 are moved to positions suitable for scanning the parts of the object 10 not yet scanned. Steps S95 to S106 are repeatedly carried out.
Although in the eighth embodiment, a pair of large domain image obtaining parts, an image obtaining part and a marker generator are configured separately, the pair of large domain image obtaining parts and the marker generator may be integrally configured as a modification thereto. In this case, this may be more convenient because there is no need to set positions of the pair of large domain image obtaining parts according to a domain where the optical markers are projected.
As another modification to the eighth embodiment of the present invention, a pair of large domain image obtaining parts and an image obtaining part may be integrally constructed. In this case, in the process of obtaining the absolute coordinates, the scan domain may become a bit smaller and accuracy may also decrease; however, in the process of obtaining images of parts of the scan domain, it is not necessary to photograph in an overlapping manner. Therefore, the number of scanning processes can be reduced.
The principle of the eighth embodiment of the present invention described above will be explained below in detail.
The large domain image obtaining parts 130 and 132 disclosed in the eighth embodiment of the present invention can be modeled by two cameras facing one object, which can be modified according to the field of applications. In the eighth embodiment, two cameras are arranged in parallel as shown in Fig. 27. The variables in Fig. 27 are defined below.
X: position of a point to be obtained
b: base line distance between camera lens centers
f: camera's focal length
A, B: image plane obtained by each camera
Xl, Xr: respective positions of the image of the point on the image planes
P, Q: lens centers of each camera
A method for obtaining the position of a point by using stereo images can be defined by Formulas (Equations) 15 and 16, where Yl and Yr denote the vertical image positions corresponding to Xl and Xr.
FORMULA 15
Xl / f = (x + b/2) / z,   Xr / f = (x - b/2) / z
FORMULA 16
x = b(Xl + Xr) / 2(Xl - Xr),   y = b(Yl + Yr) / 2(Xl - Xr),   z = bf / (Xl - Xr)
Next, the ninth embodiment of the present invention will be described with reference to the attached drawing
In the ninth embodiment of the present invention, a plurality of projectors, image obtaining parts and marker generators are arranged around an object, such that the projectors and image obtaining parts need not be moved to obtain 2D images and 3D scan data for the entire scan domain of the object. One scanning operation thus makes it possible to obtain the 2D images and 3D scan data, thereby simplifying the job and shortening the time consumed therein.
Fig. 28 is a schematic drawing illustrating the construction of an apparatus for automatically arranging 3D scan data using optical markers according to the ninth embodiment of the present invention, wherein the apparatus comprises N number of marker generators 142, M number of pattern projectors 146, L number of image obtaining parts 148, an image input part 150, a project controller 152, a marker blinking controller 154, a microprocessor 156 and a buffer 158.
The N number of marker generators 142, intended to project markers recognizable by the image obtaining part 148 on the surface of an object, are disposed with a plurality of marker output parts 144 for projecting a plurality of optical markers on the entire surface of the object 10 in arbitrary scanning directions.
The N number of marker generators 142 are directed to the object 10, and are apart from each other at a predetermined interval, and the markers are so arranged as to cover the entire object.
The M number of pattern projectors 146 project predetermined patterns or laser striped patterns on the surface of the object 10 for obtaining 3D scan data. LCD projectors may be utilized to project space-coded beams or laser beams on the surface of the object 10, thereby obtaining 3D scan data via the image obtaining part 148.
The M number of pattern projectors 146 are directed to the object 10, and are apart from each other at a predetermined interval, and the space-coded beams projected from each pattern projector 146 are made to cover the entire domain of the object 10.
The L number of image obtaining parts 148 which comprise image sensors capable of receiving images, such as CCD cameras or CMOS cameras, photograph and obtain images of the object 10. It is preferable that each of the L number of image obtaining parts 148 is integrated with an individual pattern projector 146 instead of being separated.
Furthermore, the L number of image obtaining parts 148, as shown in Fig. 28, are directed to the object 10, and are apart from each other at a predetermined interval, and the scanning domains of the image obtaining parts 148 cover the entire domain of the object 10.
The image input part 150 receives each image data obtained from the L number of image obtaining parts 148, and the project controller 152 controls the transfer speed and
transfer directions of pattern films and the blinking cycle of light sources for projecting the pattern films.
The marker blinking controller 154 periodically blinks each optical marker from the N number of marker generators 142 according to the control of the microprocessor 156.
The microprocessor 156 computes the 3D positions of the markers on each domain in reference to the 2D image data and 3D scan data obtained from the L number of image obtaining parts 148, respectively, and searches corresponding markers on every overlapped domains in reference to the 3D positions of the markers, and calculates translation matrices by using the corresponding markers. As a result, the microprocessor 156 arranges each of the 3D scan data by the translation matrices. The buffer 158 stores data necessary for computing and the resultant data thereof.
Next, the operating process of the apparatus for automatically arranging 3D scan data using optical markers according to the ninth embodiment of the present invention thus described will be explained in detail with reference to the flow chart shown in Fig. 29.
First, the object 10 is placed in a position suitable for scanning, and the N number of marker generators 142, M number of pattern projectors 146 and L number of image obtaining parts 148 are arranged around the object 10. Then, the microprocessor 156 controls the marker blinking controller 154 to turn on the plural marker output parts 144 equipped at each of the N number of marker generators 142, thereby allowing the plural markers to be arbitrarily projected on the surface of the object 10 (S110).
When the scan domains of the object 10 are photographed by each of the L number of image obtaining parts 148 to obtain 2D images while the optical markers are projected on the object 10, the microprocessor 156 receives the L number of 2D image data obtained from the L number of image obtaining parts 148 via the image input part 150 (S111).
Successively, the microprocessor 156 controls the marker blinking controller 154 to
turn off the N number of marker generators 142, thereby preventing the optical markers from being projected on the surface of the object 10 (S112). Under such condition, when the same domain of the object 10 as mentioned above is photographed by each of the L number of image obtaining parts 148 to obtain L number of 2D images not including the optical markers, the microprocessor 156 receives the L number of 2D image data via the image input part 150 (S113).
Furthermore, the microprocessor 156 controls the project controller 152 to operate the
M number of pattern projectors 146 while the N number of marker generators 142 are turned off to prevent the optical markers from being projected. Therefore, predetermined patterns (for example, patterns having stripes with different gaps therebetween or a multiple striped pattern) are projected on the surface of the object 10 from the M number of pattern projectors 146.
When the object 10 projected with patterns is photographed by the L number of image obtaining parts 148 to obtain L number of 3D scan data, the microprocessor 156 receives the L number of 3D scan data via the image input part 150 (S114). Under such circumstances, the microprocessor 156 calculates the 2D positions of the markers by image-processing the 2D image data including the optical markers and the 2D image data not including the markers (S115).
Furthermore, the microprocessor 156 computes the 3D positions of the markers from the 2D positions of the markers and the 3D scan data (S116). That is, the 3D positions of the markers can be obtained by estimating intersection points where straight lines connecting the camera lens centers of the L number of image obtaining parts 148 and the positions of arbitrary markers in the 2D image data intersect the 3D scan data.
Successively, the microprocessor 156 compares the 3D positions of the markers among the L number of 3D scan data to search for corresponding markers (S117).
When corresponding markers are found by the searching process, the microprocessor 156 calculates the translation matrices for translating the markers in the current 3D scan data (S118). One of the L number of 3D scan data is set up as a reference coordinate system, and the current 3D scan data are moved according to the obtained translation matrices for arrangement (S119).
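The arrangement of step S119 can be sketched as follows: one of the L scans is chosen as the reference coordinate system and each remaining scan is moved into it by its recovered transform, represented here as a 4x4 homogeneous matrix. The function name and the pure-translation transform below are hypothetical illustrations.

```python
import numpy as np

# Sketch of step S119: arrange L scans in the coordinate system of a
# chosen reference scan by applying each scan's 4x4 homogeneous transform.
def arrange(scans, transforms, ref=0):
    out = []
    for i, pts in enumerate(scans):
        T = np.eye(4) if i == ref else transforms[i]
        hom = np.hstack([pts, np.ones((len(pts), 1))])  # homogeneous coords
        out.append((hom @ T.T)[:, :3])
    return out

T1 = np.eye(4)
T1[:3, 3] = [1.0, 0.0, 0.0]                   # scan 1 sits 1 unit along x
scans = [np.array([[0., 0., 0.]]), np.array([[0., 0., 0.]])]
arranged = arrange(scans, {1: T1})
print(arranged[1])                            # → [[1. 0. 0.]]
```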
Next, a tenth embodiment of the present invention will be described in detail with reference to the attached drawing. The elements of the tenth embodiment of the present invention are nearly identical to those of the ninth embodiment; however, the operating processes are different from each other. Therefore, the tenth embodiment will be described based on the configuration of the ninth embodiment illustrated in Fig. 28 and the flow chart shown in Fig. 30.
First, a reference object whose dimensions are already known is placed in a position suitable for scanning, and the N number of marker generators 142, M number of pattern projectors 146 and L number of image obtaining parts 148 are respectively arranged around the reference object. The reference object may be specially manufactured for calibration, or may be an actual object if its dimensions are already known.
Under such condition, the microprocessor 156 controls the marker blinking controller
154 to turn on the respective marker output parts 144 equipped at the N number of marker generators 142 to allow the plural markers to be arbitrarily projected on the surface of the reference object (S120).
The microprocessor 156 carries out calibration for seeking a correlation between the reference object and the L number of image obtaining parts 148 (S121). The detailed operating processes thereof are described below.
At step S121, optical markers from the N number of marker generators 142 are projected on the surface of the reference object, and when the scan domains of the object 10 are photographed by the L number of image obtaining parts 148 to obtain 2D images containing the optical markers, the microprocessor 156 receives the L number of 2D image
data via the image input part 150.
Successively, the microprocessor 156 controls the project controller 152 to operate the M number of pattern projectors 146. Therefore, predetermined patterns (for example, patterns having stripes with different gaps therebetween or a multiple striped pattern) are projected on the surface of the reference object from the M number of pattern projectors 146. When the reference object projected with patterns is photographed by the L number of image obtaining parts 148 to obtain L number of 3D scan data, the microprocessor 156 receives the L number of 3D scan data via the image input part 150.
Under such condition, the microprocessor 156 estimates 3D positions of the markers in the L number of 3D scan data by calculating intersection points where straight lines connecting each center of the cameras equipped in the L number of image obtaining parts 148 with the markers included in the each 2D image data intersect the 3D scan data, respectively.
Successively, the microprocessor 156 compares the 3D positions of the markers among the L number of 3D scan data to search for corresponding markers, and calculates translation matrices from the relations of the corresponding markers. Then, the microprocessor 156 registers the obtained translation matrices in the register of the buffer 158, whereby the calibration is completed at S121.
When the calibration thus described is completed at S121, the reference object is removed and the object 10 is placed at the position from which the reference object was removed, and the microprocessor 156 controls the marker blinking controller 154 to turn off the N number of marker generators 142, thereby preventing the optical markers from being projected on the surface of the object 10 (S122).
Under such circumstances, the microprocessor 156 receives the L number of 2D image data via the image input part 150 when the scan domains of the object 10 are photographed by the L number of image obtaining parts 148 to obtain the L number of 2D
images not including the optical markers (S123).
Furthermore, the microprocessor 156 controls the project controller 152 to activate the
M number of pattern projectors 146 while the N number of marker generators 142 are turned off to keep the optical markers from being projected. Therefore, predetermined patterns (for example, patterns having stripes with different gaps therebetween or a multiple striped pattern) are projected on the surface of the object 10 from the M number of pattern projectors 146.
When the object 10 projected with patterns is photographed by the L number of image obtaining parts 148 to obtain L number of 3D scan data, the microprocessor 156 receives the L number of 3D scan data via the image input part 150 (S124).
The microprocessor 156 reads out the translation matrices stored in the register of the buffer 158, sets one of the L number of 3D scan data as a reference, and moves the other L-1 3D scan data by the translation matrices (S125).
When scanning is necessary for another object, or for the same object again, the calibration from S121 to S123 can be omitted. Since the 3D scan data can be arranged by the translation matrices stored in the register of the buffer 158, the scanning time can be reduced. However, if needed, the calibration from S121 to S123 may be executed for every scan, and this can be easily changed, modified or altered according to the operator's intention or the system configuration.
Next, an eleventh embodiment of the present invention will be described. The eleventh embodiment provides marker generators and peripherals different from those used in the first to tenth embodiments of the present invention.
A marker generator of the eleventh embodiment, as shown in Fig. 31, includes a plurality of light sources of the X-axis 160, a blinking controller 162, a polygon mirror 164 rotating about the X-axis, a rotary driving part 166, a rotary mechanism 168, a plurality of light sources of the Y-axis 170, a blinking controller 172, a polygon mirror 174 rotating about the Y-axis, a rotary driving part 176 and a rotary mechanism 178.
The plurality of light sources of the X-axis 160 generate beams having excellent straightness, such as laser beams, which are emitted to the reflecting surfaces of the polygon mirror 164. The light sources of the X-axis may be, for example, laser pointers.
The blinking controller 162 blinks each light source 160 according to the control of a microprocessor (not shown).
The polygon mirror 164, equipped with a plurality of reflecting surfaces, is rotated by the rotary mechanism 168 in order to reflect the plural beams, thereby projecting them on the surface of an object (OB). The rotary driving part 166 drives the rotary mechanism 168 to rotate the polygon mirror 164 in one direction in response to the control of the microprocessor.
The plurality of light sources of the Y-axis 170 generate beams having excellent straightness, such as laser beams, which are emitted to the reflecting surfaces of the polygon mirror 174. The light sources may be, for example, laser pointers. The blinking controller 172 blinks each light source according to the control of a microprocessor (not shown).
The polygon mirror 174, equipped with a plurality of reflecting surfaces, is rotated by the rotary mechanism 178 in order to reflect the plurality of beams, thereby projecting them on the surface of the object (OB). The rotary driving part 176 drives the rotary mechanism 178 to rotate the polygon mirror 174 in one direction in response to the control of the microprocessor.
Next, the operating processes of the marker generator according to the eleventh embodiment of the present invention thus described will be explained in detail.
First, a driving power generated by the rotary driving part 166 and the rotary driving part 176 is applied to the rotary mechanism 168 and the rotary mechanism 178 according to a control signal of a microprocessor. The rotary mechanisms 168 and 178,
respectively driven by the driving power, rotate the polygon mirrors 164 and 174.
When the light sources 160 and 170 are turned on by the blinking controllers 162 and 172 in response to the control signal of the microprocessor, the beams created by the plurality of light sources 160 and 170 are emitted to the reflecting surfaces of the polygon mirrors 164 and 174. The beams are then projected on the surface of the object (OB).
The polygon mirrors 164 and 174 are rotated so that the angles of their reflecting surfaces continuously change. Therefore, lines of a plurality of beams are formed on the surface of the object (OB) in the directions of the X-axis and the Y-axis, and the intersecting points where the X-axis lines and the Y-axis lines cross become optical markers (RM), respectively.
For example, in case the number of light sources of the X-axis is m and the number of light sources of the Y-axis is n, m*n intersecting points can be formed on the surface of the object (OB), and these m*n intersecting points become the respective optical markers (RM). Therefore, it is possible to employ a small number of light sources to generate a relatively large number of optical markers.
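The m*n relationship above can be illustrated in a few lines of code. This is a hypothetical sketch; the coordinates are arbitrary line positions, not the patent's optical geometry.

```python
def marker_grid(x_positions, y_positions):
    """Intersections of m beam lines (one per X-axis light source) with
    n beam lines (one per Y-axis light source).

    Each of the m lines crosses each of the n lines exactly once, so
    m * n intersection points -- the optical markers (RM) -- result.
    """
    return [(x, y) for x in x_positions for y in y_positions]
```

For instance, 3 X-axis sources and 4 Y-axis sources (7 light sources in total) already yield 3 * 4 = 12 distinct optical markers.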
Although the present invention has been fully described in connection with the preferred embodiments thereof with reference to the accompanying drawings, it is to be noted that various changes and modifications are apparent to those skilled in the art. Such changes and modifications are to be understood as included within the scope of the present invention as defined by the appended claims unless they depart therefrom.
As apparent from the foregoing, the present invention provides an apparatus and method for automatically arranging 3D scan data obtained from different angles and positions by using optical markers. One advantage of the present invention thus described is that optical markers having no physical volume are used to find the relative positions of mutually different scan data, such that scan data are not lost or damaged even in portions where the markers exist. Another advantage is that there is no need for placing
markers on, or removing markers from, an object in order to scan it, thereby providing convenience and safety in scanning and preventing damage to the object that may result from attaching and removing markers. Still another advantage is that the optical markers of the present invention can be used repeatedly without limitation, since they are neither consumed nor worn.