WO2004011876A1 - Apparatus and method for automatically arranging three dimensional scan data using optical marker - Google Patents

Apparatus and method for automatically arranging three dimensional scan data using optical marker

Info

Publication number
WO2004011876A1
Authority
WO
WIPO (PCT)
Prior art keywords
markers
scan data
image
positions
marker
Prior art date
Application number
PCT/KR2003/001087
Other languages
French (fr)
Inventor
Min-Ho Chang
Original Assignee
Solutionix Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR10-2003-0022624A
Application filed by Solutionix Corporation
Priority to AU2003241194A
Priority to JP2004524349A
Publication of WO2004011876A1

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object

Definitions

  • the present invention relates to an apparatus and method for automatically arranging three-dimensional (3D) scan data by using optical markers and, more particularly, to an apparatus and method for automatically arranging relative positions of a plurality of 3D scan data scanned in various positions and angles in reference to one coordinate system.
  • an optical 3D scanner can extract 3D data only from the portion of an object's surface that lies within the scanner's field of view.
  • the object to be scanned should be rotated or moved, or the scanner itself should be moved and positioned to a place where a part or parts of the object can be seen.
  • a complete 3D scan then may be captured by surveying the object in various orientations and angles, and the 3D data thus obtained is arranged and integrated into a single uniform coordinate system.
  • each set of 3D scan data is defined by a different coordinate system according to the position of the scanner.
  • the distance in which the scanner has been moved must be known. There are two methods of calculating this distance. One is to obtain an absolute distance by using a numerically-controlled device for moving the scanner, and the other is to calculate the distance only by referring to the scanned data.
  • the scanning must be carried out so that the plurality of scanned data overlap each other and corresponding points are inputted at the overlapping positions of the scanned data.
  • the scanned data is then arranged in reference to the corresponding points such that the respective coordinate systems defining the plurality of scanned data are integrated into one uniform coordinate system.
  • conventional markers 4 are attached arbitrarily onto the surface of an object 2, and the surface of the object 2 is scanned part by part in an overlapping manner.
  • first and second scanned data I1 and I2 are obtained from scanning the surface of the object 2, as shown in Fig. 2; the operator searches for the markers M1 and M2 positioned commonly in the two scanned data I1 and I2 and arranges the two scanned data I1 and I2 by matching the markers as corresponding points. Meanwhile, in a technique where corresponding points are automatically recognized, markers having mutually different patterns for identification are searched via image processing, and if markers having identical patterns are positioned in two different scanned data, the two scanned data are arranged automatically in reference to the markers.
  • the present invention provides an apparatus and method for automatically arranging 3D scan data by using optical markers which are of non-contact type so that scanned parts of an object are preserved and not deformed.
  • an apparatus using optical markers for automatically arranging 3D scan data which is obtained by scanning an object at various angles, comprising: marker generating means for projecting a plurality of optical markers on the surface of an object; pattern projecting means for projecting patterns on the surface of the object in order to obtain 3D scan data of the object; image obtaining means for obtaining 2D image data of the object including the optical markers projected on the surface of the object and for obtaining 3D scan data of the object through the patterns projected on the surface of the object; and control means for extracting 3D positions of the optical markers from the relation between the 2D image data and the 3D scan data and calculating relative positions of the 3D scan data in reference to the 3D positions of the optical markers.
  • a method for automatically arranging 3D scan data using optical markers comprising the steps of: moving the image obtaining means to a position appropriate for obtaining an image of parts of the object; projecting the optical markers on the surface of the object by marker generating means and obtaining 2D image data of parts of the object including the optical markers projected on the surface of the object by image obtaining means; projecting patterns on the surface of the object by pattern projecting means and obtaining 3D scan data of parts of the object on which the patterns are projected by the image obtaining means; and extracting 3D positions of the optical markers from the relationship between the 2D image data and the 3D scan data and arranging 3D scan data obtained from different parts of the object in reference to the 3D positions of the optical markers.
  • Fig. 1 is a drawing for illustrating an example of a 3D scan of an object by attaching conventional sticker-type markers on the object;
  • Fig. 2 is a drawing for illustrating an example of an arrangement of different scan data by using sticker-type markers as reference markers;
  • Fig. 3 is a schematic drawing for illustrating the construction of an apparatus for automatically arranging 3D scan data using optical markers according to a first embodiment of the present invention;
  • Figs. 4a to 4c are schematic drawings for illustrating examples of a state for obtaining 2D image data using optical markers according to the first embodiment of the present invention and a state for obtaining 3D scan data by using patterns;
  • Fig. 5 is a schematic drawing for illustrating an example of a state for deriving 2D positions of markers from the 2D image data obtained by activating and deactivating the optical markers according to the first embodiment of the present invention;
  • Fig. 6 is a schematic drawing for illustrating an example of a state for deriving a 3D position of a marker from a 2D position of a marker and a center position of a camera lens;
  • Figs. 7a and 7b are schematic drawings for exemplarily illustrating an operation of searching corresponding markers by way of a triangular comparison in mutually different image data according to the first embodiment of the present invention;
  • Figs. 8a to 8d are schematic drawings for exemplarily illustrating a conversion operation of matching two triangular structures at mutually different positions in triangular comparison within two different image data according to the first embodiment of the present invention;
  • Fig. 9 is a schematic drawing for exemplarily illustrating an operation of searching corresponding markers by way of obtaining an imaginary marker in mutually different image data according to the first embodiment of the present invention;
  • Figs. 10a and 10b are flow charts for illustrating an operation of a method for automatically arranging 3D scan data using optical markers according to the first embodiment of the present invention;
  • Fig. 11 is a schematic drawing for illustrating a construction of an apparatus for automatically arranging 3D scan data using optical markers according to a second embodiment of the present invention;
  • Fig. 12 is a flow chart for illustrating an operation of a method for automatically arranging 3D scan data using optical markers according to the second embodiment of the present invention;
  • Fig. 13 is a flow chart for illustrating an operation of a method for automatically arranging 3D scan data using optical markers according to a third embodiment of the present invention;
  • Fig. 14 is a schematic drawing for illustrating a construction of an apparatus for automatically arranging 3D scan data using optical markers according to a fourth embodiment of the present invention;
  • Fig. 15 is a schematic drawing for illustrating a construction of an apparatus for automatically arranging 3D scan data using optical markers according to a fifth embodiment of the present invention;
  • Fig. 16 is a schematic drawing for illustrating a construction of an apparatus for automatically arranging 3D scan data using optical markers according to a sixth embodiment of the present invention;
  • Figs. 17a and 17b are flow charts for illustrating an operation of a method for automatically arranging 3D scan data using optical markers according to the sixth embodiment of the present invention;
  • Fig. 18 is a schematic drawing for illustrating an error occurring in the process of arranging scanned data by way of a reference coordinate system;
  • Fig. 19 is a schematic drawing for illustrating an error occurring in the process of arranging scanned data by way of an absolute coordinate system;
  • Fig. 20 is a schematic drawing for illustrating a construction of an apparatus for automatically arranging 3D scan data using optical markers according to a seventh embodiment of the present invention;
  • Figs. 21a and 21b are flow charts for illustrating an operation of a method for automatically arranging 3D scan data using optical markers according to the seventh embodiment of the present invention;
  • Fig. 22 is a drawing of an example of an image obtained by using a large domain image obtaining part depicted in Fig. 20;
  • Fig. 23 is a drawing of an example of an image obtained by using the large domain image obtaining part depicted in Fig. 20 and an image obtaining part;
  • Fig. 24 is a drawing for illustrating a construction of an apparatus for automatically arranging 3D scan data using optical markers according to an eighth embodiment of the present invention;
  • Figs. 25a and 25b are flow charts for illustrating an operation of a method for automatically arranging 3D scan data using optical markers according to the eighth embodiment of the present invention;
  • Fig. 26a is a drawing of an example of an image obtained by using a pair of large domain image obtaining parts illustrated in Fig. 24;
  • Fig. 26b is a drawing of an example of an image obtained by using the pair of large domain image obtaining parts illustrated in Fig. 24 and an image obtaining part;
  • Fig. 27 is a schematic drawing for illustrating a principle of the eighth embodiment of the present invention;
  • Fig. 28 is a drawing for illustrating a construction of an apparatus for automatically arranging 3D scan data using optical markers according to a ninth embodiment of the present invention;
  • Fig. 29 is a flow chart for illustrating an operation of a method for automatically arranging 3D scan data using optical markers according to the ninth embodiment of the present invention;
  • Fig. 30 is a flow chart for illustrating an operation of a method for automatically arranging 3D scan data using optical markers according to a tenth embodiment of the present invention;
  • Fig. 31 is a schematic drawing for illustrating a construction of a marker generator according to an eleventh embodiment of the present invention.
  • Fig. 3 is a drawing for illustrating a structure of an apparatus for automatically arranging 3D scan data using optical markers according to the first embodiment of the present invention, wherein the apparatus includes a marker generator 12, a pattern projector 16, an image obtaining part 18, a movement driving part 20, a movement mechanism 22, an image input part 24, a marker blinking controller 26, a pattern projector controller 28, a microprocessor 30 and a buffer 32.
  • the marker generator 12 which is designed to project markers recognizable by the image obtaining part 18 on the surface of an object 10 includes a plurality of marker output parts 14 for projecting plural optical markers simultaneously on the surface of the object 10 in irregular directions.
  • the plurality of marker output parts 14 adopts laser pointers capable of projecting a plurality of red spots on the surface of the object 10, such that the positions of spots projected on the surface of the object 10 may be easily distinguished from images obtained by the image obtaining part 18, such as a camera or the like.
  • the marker generator 12 is by no means limited to a laser pointer, and any optical marker may be adopted as long as it can be focused properly on the surface of the object and can easily be controlled to blink repeatedly.
  • a plurality of marker generators 12 may be disposed along a circumference of an object in order to project the optical markers on the entire surface of the object 10, and the number of the optical markers may be changed in accordance with the size and shape of the object 10. Furthermore, the marker generators 12 should be fixed relative to the object 10 during the scanning operation so that the positions of the markers do not vary on the surface of the object.
  • the pattern projector 16 shown in the drawing projects predetermined patterns so that 3D scan data of the object 10 can be obtained. Namely, space-encoded beams are projected on the surface of the object 10 by using projectors such as LCD projectors and the like, or a laser beam is projected on the surface of the object 10, so that 3D scan data of the object 10 can be obtained via the image obtaining part 18.
  • the pattern projector 16 adopts a slide projector comprising a light source, a pattern film and a lens for projecting a predetermined pattern, or an electronic LCD projector, or a laser diode for projecting laser striped patterns.
  • a pattern film equipped with striped patterns is fed between a light source and a lens by a predetermined feeding means, which allows a series of striped patterns to be projected on the object 10.
  • the pattern film may have striped patterns of varying gaps, as disclosed in Korean Patent Application No. 2002-10839, filed on February 28, 2002 by the present applicant and entitled “3D Scan Apparatus and Method Using Multiple Striped Patterns.” The same applies to a scanning device using laser striped patterns.
  • markers should not be projected on the object 10 while obtaining 3D scan data because scanned data might be deformed by the markers on the object 10.
  • the image obtaining part 18 comprises image sensors capable of receiving images such as a Charge Coupled Device (CCD) camera or a CMOS camera.
  • the image obtaining part 18 may be configured separately from the pattern projector 16, but it is preferable that the image obtaining part 18 be built integrally with the pattern projector 16, because the integrated structure is simpler and makes it easy to match the 2D image data with the 3D scan data without calibration.
  • the image obtaining part 18 obtains 2D image data and 3D scan data while synchronizing the blinking period of the optical markers with the blinking period of the pattern, details of which are illustrated in Figs. 4a, 4b and 4c.
  • the image obtaining part 18 obtains a first image data 40 by photographing a certain part of the object 10 on which a plurality of optical markers (RM) are arbitrarily projected.
  • the image obtaining part 18 obtains a second image data 42 by photographing the same part of the object 10 as in Fig. 4a while the marker generator 12 is turned off, so that the laser markers are not projected on the surface of the object 10.
  • the image obtaining part 18 obtains 3D scan data by photographing the object 10 on which striped patterns are projected from the pattern projector 16 while the marker generator 12 is turned off.
  • the 3D scan data is obtained in the form of first to fifth scanned data 44a - 44e that correspond to the same part of the object 10 with different striped patterns PT1 - PT5 on the surface thereof, respectively.
  • although the pattern film has five different striped patterns in this example, it is not limited thereto and may have more patterns.
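  • By way of illustration only, one common space-encoding scheme assigns each projected image one bit of a per-pixel stripe code; a minimal Python sketch of such decoding (an assumed binary-coded variant, since the referenced application's exact patterns are not reproduced here) is:

      import numpy as np

      def decode_stripe_codes(pattern_images, threshold=128):
          # pattern_images: grayscale arrays (H, W), one per projected stripe pattern.
          # Each image contributes one bit; a pixel's bits across the sequence form
          # its stripe code, identifying which projector stripe illuminated it.
          codes = np.zeros(pattern_images[0].shape, dtype=np.uint16)
          for img in pattern_images:
              bit = (np.asarray(img) >= threshold).astype(np.uint16)
              codes = (codes << 1) | bit
          return codes  # e.g. five images yield per-pixel codes 0..31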
  • the movement driving part 20 moves the pattern projector 16 and the image obtaining part 18 relatively to the object 10 according to the driving control of the microprocessor 30 so that images of the whole object 10 can be obtained.
  • the moving mechanism 22 receives a signal from the movement driving part 20 to move the pattern projector 16 and the image obtaining part 18 in a prescribed direction relative to the object 10.
  • although the movement driving part 20 is adopted in the present embodiment for electrically moving the pattern projector 16 and the image obtaining part 18, it should be apparent that the moving mechanism 22 may be manually manipulated.
  • the image input part 24 shown in the drawing receives the image data obtained from the image obtaining part 18, and the marker blinking controller 26 serves to blink the optical markers of the marker generator 12 according to the control of the microprocessor.
  • the projector controller 28 controls the feeding speed and direction of the pattern film of the pattern projector 16, and also controls the blinking of the light source for the projection.
  • the microprocessor 30 receives and analyzes the 2D image data and the 3D scan data photographed at various angles through the image input part 24 and arranges the 3D scan data automatically on a single uniform coordinate system.
  • the microprocessor 30 performs image processing on the first image data 40, which contains the laser markers (RM), and the second image data 42, which does not, and as a result third image data 46 comprising only the laser markers (RM) is obtained.
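  • A minimal sketch of this differencing step, assuming 8-bit grayscale images and OpenCV's connected-component analysis (the function name and threshold are illustrative, not from the disclosure):

      import cv2

      def isolate_markers(image_with_markers, image_without_markers, threshold=40):
          # Subtracting the marker-free image leaves only the projected laser spots
          # (the "third image data"); thresholding and blob analysis then give the
          # 2D position of each marker in pixel coordinates.
          diff = cv2.absdiff(image_with_markers, image_without_markers)
          _, mask = cv2.threshold(diff, threshold, 255, cv2.THRESH_BINARY)
          count, _, _, centroids = cv2.connectedComponentsWithStats(mask)
          return [tuple(c) for c in centroids[1:count]]  # skip background label 0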
  • the microprocessor 30 calculates 3D positions of the markers from the relation between a camera lens center 50 of the image obtaining part 18 and the marker positions extracted from the 2D image data 52.
  • the 3D positions of the markers (a', b', c') corresponding to relevant markers can be obtained by estimating intersection points where straight lines connecting the camera lens center 50 of the image obtaining part 18 and the positions (a, b, c) of arbitrary markers in the 2D image data intersect the 3D scan data 54.
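  • Treating the scan data as a point cloud, the intersection estimate can be sketched as follows (a nearest-point-to-ray approximation; the tolerance and names are assumptions):

      import numpy as np

      def marker_3d_position(lens_center, ray_direction, scan_points, tolerance=1.0):
          # The marker's 3D position lies where the ray from the camera lens center
          # through the marker's 2D image position meets the scanned surface; here it
          # is approximated by the scan point closest to that ray.
          d = ray_direction / np.linalg.norm(ray_direction)
          v = scan_points - lens_center              # vectors from lens center to points
          t = v @ d                                  # projection lengths along the ray
          off_ray = np.linalg.norm(v - np.outer(t, d), axis=1)
          nearest = np.argmin(off_ray)
          return scan_points[nearest] if off_ray[nearest] < tolerance else None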
  • each scan data should preferably include four to five or more markers, and two neighboring scan data should include three or more common markers. This is because three or more points are necessary for defining a unique position in 3D space, and this is also a requirement for obtaining corresponding markers (described later).
  • in one approach, each marker is given a different pattern so that the markers can be distinguished from each other.
  • the configuration and manufacture of the equipment may then be complicated because scores or hundreds of marker output parts would have to project differently shaped markers.
  • the markers are distinguished from each other by using information on the relative positions of the markers calculated based on respective 3D scan data at the microprocessor 30.
  • three points formed by markers can construct a triangle, and triangles constructed by three different points are different from each other, such that each triangle can be distinguished by comparing the angles and lengths thereof. Therefore, one marker corresponding to a vertex of a triangle can be distinguished from another marker.
  • if the scan data 60 contains M markers, it contains C(M,3) different triangles, and another scan data containing N markers contains C(N,3) different triangles. Then, by comparing the triangles between the two scan data a total of C(M,3) x C(N,3) times, corresponding pairs of triangles can be obtained.
  • the microprocessor 30 constructs plural triangles T1 and T2 from points obtained by markers included in one scan data 60 and plural triangles T3 and T4 included in another scan data 62.
  • the microprocessor 30 seeks a pair of mutually corresponding triangles, e.g., T1 and T3, which are contained in the two scan data 60 and 62, respectively.
  • Various methods can be used to compare the triangles. One of them is to seek a pair of corresponding triangles by comparing the lengths of each side. In other words, the three sides of each triangle, (a1, a2, a3) and (b1, b2, b3), are compared, and if the length of each side is identical to its counterpart and the order of the sides is the same, it can be determined that the two triangles correspond.
  • in seeking triangles with three identical sides, the lengths of the sides are arranged in descending order, for example, and if two or more triangles having identical sides are detected, the order of the sides is checked. That is, two triangles are judged to correspond if the sides, compared in a counterclockwise or clockwise order from the longest side, are all identical.
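  • A brute-force sketch of this triangle comparison in Python (the tolerance and names are illustrative; the side-order check described above is reduced here to a sorted-length comparison):

      import numpy as np
      from itertools import combinations

      def side_lengths(p, q, r):
          # side lengths of the triangle (p, q, r), sorted in descending order
          return sorted([np.linalg.norm(q - r), np.linalg.norm(r - p),
                         np.linalg.norm(p - q)], reverse=True)

      def corresponding_triangles(markers_a, markers_b, tolerance=0.5):
          # markers_a, markers_b: (M, 3) and (N, 3) arrays of marker positions from
          # two scans; compares all C(M,3) x C(N,3) triangle pairs by side lengths.
          matches = []
          for ia in combinations(range(len(markers_a)), 3):
              sa = side_lengths(*markers_a[list(ia)])
              for ib in combinations(range(len(markers_b)), 3):
                  sb = side_lengths(*markers_b[list(ib)])
                  if all(abs(x - y) < tolerance for x, y in zip(sa, sb)):
                      matches.append((ia, ib))
          return matches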
  • the scan data are moved so that these markers can be positioned on the same point in a single uniform coordinate system. Namely, one of the triangles serves as a reference and the corresponding triangle moves to the reference, and as a result, the two different coordinate systems are matched.
  • the matching process for the two triangles located in two different scan data is illustrated in Figs. 8a - 8d.
  • as shown in Fig. 8a, two triangles are given which are identical in size and shape but located in different scan data; information on their vertexes and sides is known once the two corresponding triangles are determined.
  • Fig. 8b depicts a translation carried out by using a translation matrix (T), where the reference coordinate system relating to one triangle is set up as A and the other coordinate system is set up as B.
  • the translation matrix (T) is defined in the following Formula 1:
  • Fig. 8d depicts a rotational translation carried out in order to match the remaining one of the corresponding vertexes by using a rotational matrix (R2), where the rotational matrix (R2) is defined by Formula 3:
  • a point (P) included in one scan data moves to a new position in another scan data by the following Formula 5:
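  • Formulas 1 to 5 are not reproduced in this text; an equivalent computation of the combined rotation and translation from one matched triangle pair, sketched via the standard SVD-based method rather than the patent's explicit matrices:

      import numpy as np

      def rigid_transform_from_triangles(src_vertices, dst_vertices):
          # src_vertices, dst_vertices: (3, 3) arrays of corresponding triangle
          # vertices; returns rotation R and translation t with dst = R @ src + t.
          cs, cd = src_vertices.mean(axis=0), dst_vertices.mean(axis=0)
          H = (src_vertices - cs).T @ (dst_vertices - cd)   # cross-covariance
          U, _, Vt = np.linalg.svd(H)
          sign = np.sign(np.linalg.det(Vt.T @ U.T))         # guard against reflection
          R = Vt.T @ np.diag([1.0, 1.0, sign]) @ U.T
          t = cd - R @ cs
          return R, t

      # Per the role of Formula 5, a point P of one scan data maps to R @ P + t.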
  • Q is a state vector for registration.
  • Q is defined as [QR | QT]t.
  • QR is a quaternion vector.
  • QT is a translation vector and is defined as [q4 q5 q6]t.
  • f(Q) is defined as the average of the squared distances between xi and (R(QR)pi + QT), and R(QR) and QT minimizing f(Q) can be calculated by the least squares method.
  • R(QR) can be defined as a 3x3 rotation matrix as in the following Formula 7:
  • the cyclic components of the anti-symmetric matrix (Aij) are used to form the column vector (Δ).
  • this vector (Δ) is then used to form a symmetric 4x4 matrix Q(Σpx), which is given by the following Formula 10:
  • the eigenvector corresponding to the maximum eigenvalue of Q(Σpx) yields the quaternion (q0, q1, q2, q3), and the rotation matrix is obtained by substituting the quaternion into Formula 7. Meanwhile, QT = (q4, q5, q6) can be obtained from the following Formula 11 by utilizing R(QR) given from Formula 7.
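  • The registration just described follows the well-known quaternion-based least-squares method; a compact Python sketch with the roles of Formulas 7, 10 and 11 expressed in code (names are illustrative):

      import numpy as np

      def quaternion_registration(P, X):
          # P, X: (N, 3) arrays of corresponding points; finds R(QR) and QT
          # minimizing the mean of ||x_i - (R @ p_i + QT)||^2.
          mp, mx = P.mean(axis=0), X.mean(axis=0)
          S = (P - mp).T @ (X - mx) / len(P)                 # cross-covariance Sigma_px
          A = S - S.T                                        # anti-symmetric part
          delta = np.array([A[1, 2], A[2, 0], A[0, 1]])      # column vector (Delta)
          Q = np.zeros((4, 4))                               # symmetric 4x4 matrix (Formula 10)
          Q[0, 0] = np.trace(S)
          Q[0, 1:] = Q[1:, 0] = delta
          Q[1:, 1:] = S + S.T - np.trace(S) * np.eye(3)
          eigvals, eigvecs = np.linalg.eigh(Q)
          q0, q1, q2, q3 = eigvecs[:, np.argmax(eigvals)]    # unit quaternion QR
          R = np.array([                                     # rotation matrix (Formula 7)
              [q0*q0 + q1*q1 - q2*q2 - q3*q3, 2*(q1*q2 - q0*q3), 2*(q1*q3 + q0*q2)],
              [2*(q1*q2 + q0*q3), q0*q0 + q2*q2 - q1*q1 - q3*q3, 2*(q2*q3 - q0*q1)],
              [2*(q1*q3 - q0*q2), 2*(q2*q3 + q0*q1), q0*q0 + q3*q3 - q1*q1 - q2*q2]])
          t = mx - R @ mp                                    # translation QT (Formula 11)
          return R, t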
  • the microprocessor 30 computes a matrix for total translation based on said one scan data 60 as a reference coordinate, thereby arranging all of the 3D scan data automatically to the reference coordinate system.
  • a formula applied to the point cloud data (P), which are to be arranged by the method of coordinate mapping, can be defined by the below-mentioned Formula 14:
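  • In homogeneous coordinates, the chained mapping of Formula 14 amounts to multiplying the pairwise 4x4 transforms; a minimal sketch (the accumulation order assumes each pairwise transform leads one step back toward the reference scan):

      import numpy as np

      def homogeneous(R, t):
          # pack a rotation R and translation t into one 4x4 transform
          M = np.eye(4)
          M[:3, :3], M[:3, 3] = R, t
          return M

      def map_to_reference(point_cloud, pairwise_transforms):
          # point_cloud: (N, 3) array; pairwise_transforms: list of (R, t) leading,
          # step by step, back to the reference scan's coordinate system.
          M = np.eye(4)
          for R, t in pairwise_transforms:
              M = homogeneous(R, t) @ M           # accumulate the total matrix
          return point_cloud @ M[:3, :3].T + M[:3, 3]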
  • since each scan data with markers has 3D information, it is possible to seek corresponding markers even if only two corresponding markers are included in the overlapped region.
  • corresponding markers are sought by comparing the two perpendicular vectors at the points where the markers are positioned and the distance between the two markers.
  • additional markers and 3D scan data are used for creating additional references. For example, in case there are three corresponding markers, a triangle is formed by the three markers, and a line is drawn perpendicularly from the centroid of the triangle. Then, the intersecting point of the perpendicular line and the 3D scan data is obtained as a fourth reference point. Next, corresponding markers can be found by utilizing information on an average perpendicular vector of the object surface at or around the markers.
  • in case there are only two corresponding markers, a straight line is drawn by connecting the two markers, and a circle is drawn on a plane perpendicular to the straight line from the center of the straight line. Then, intersecting points of the circle and the 3D scan data are obtained as fourth and fifth reference points.
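  • One plausible reading of the two-marker check, sketched in Python (the tolerances and the use of surface normals as the "perpendicular vectors" are assumptions, not from the disclosure):

      import numpy as np

      def two_marker_pair_matches(a1, a2, n1, n2, b1, b2, m1, m2,
                                  dist_tol=0.5, angle_tol=0.05):
          # (a1, a2) with surface normals (n1, n2) come from one scan, (b1, b2)
          # with normals (m1, m2) from the other; the pairs correspond if the
          # inter-marker distances and the normal/baseline angles agree.
          def angle(u, v):
              c = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
              return np.arccos(np.clip(c, -1.0, 1.0))
          da, db = a2 - a1, b2 - b1
          if abs(np.linalg.norm(da) - np.linalg.norm(db)) > dist_tol:
              return False
          return (abs(angle(n1, da) - angle(m1, db)) < angle_tol and
                  abs(angle(n2, da) - angle(m2, db)) < angle_tol)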
  • the microprocessor 30 controls the movement driving part 20 to activate the movement mechanism 22, which moves the image obtaining part 18 integrated with the pattern projector 16 to a position appropriate for scanning an object 10 (S10).
  • the microprocessor 30 controls the marker blinking controller 26 to allow the plurality of marker output parts 14 disposed at the marker generator 12 to project the markers arbitrarily on the surface of the object 10 (S11).
  • the image obtaining part 18 photographs a prescribed domain of the object 10 to obtain a 2D image including optical markers projected on the surface of the object 10, and then the microprocessor 30 receives the 2D image data via the image input part 24 (S12).
  • the microprocessor 30 controls the marker blinking controller 26 to turn off the marker generator 12 so that the markers may not be projected on the object 10 (S13).
  • the image obtaining part 18 photographs the same domain as the above to obtain a 2D image without the markers, and then the microprocessor 30 receives the 2D image data via the image input part 24 (S14).
  • the microprocessor 30 controls the projector controller 28 to activate the pattern projector 16 while the marker generator 12 is turned off. Then, prescribed patterns (for example, patterns having stripes with different gaps therebetween or multiple striped patterns) are projected on the surface of the object 10 from the pattern projector 16. Next, the image obtaining part 18 photographs the object 10 with the striped patterns projected on it to get 3D scan data, and then the microprocessor 30 receives the 3D scan data via the image input part 24 (S15).
  • the microprocessor 30 computes 2D positions of the markers by image processing the 2D image data with the markers and the 2D image data without the markers (S16).
  • the microprocessor 30 computes 3D positions of the markers by using the 2D positions of the markers and the 3D scan data. That is, the 3D positions of the markers can be obtained by estimating intersection points where straight lines connecting the camera lens center of the image obtaining part 18 and the positions of arbitrary markers in the 2D image data intersect the 3D scan data (S17).
  • the microprocessor 30 discriminates whether the register of the buffer 32 is vacant or not (S18). If the register of the buffer 32 is not vacant, the 3D positions of the markers obtained at S17 (current 3D scan data) and the 3D positions of the markers stored in the register of the buffer 32 (in other words, 3D data overlapping partly with the current 3D scan data) are compared for searching corresponding markers (S19).
  • the microprocessor 30 computes a translation matrix for matching the two 3D scan data (S20).
  • the positions of 3D scan data registered at the register of the buffer 32 are given as a reference coordinate system, to which the current scan data are translated (S21).
  • the microprocessor 30 registers the markers newly obtained from the current scan data in the register of the buffer 32 (S22). Then, the microprocessor 30 discriminates whether the automatic arrangement of the 3D scan data has been completed or not.
  • Fig. 11 illustrates a structure of an apparatus for automatically arranging 3D scan data using the optical markers according to the second embodiment of the present invention.
  • An apparatus for arranging 3D scan data automatically according to the second embodiment of the present invention includes a marker generator 70, a pattern projector 16, an image obtaining part 18, a movement driving part 20, a moving mechanism 22, an image input part 24, an individual marker blinking controller 74, a projector controller 28, a microprocessor 76, and a buffer 32.
  • the marker generator 70 comprises a plurality of marker output parts 72 and projects markers recognizable by the image obtaining part 18 on the surface of object 10 randomly.
  • the marker generator 70 turns on the 1st to Nth marker output parts 72 one by one sequentially in response to the control of the individual marker blinking controller 74, which enables a different marker to be included in each image obtained from the image obtaining part 18.
  • the individual marker blinking controller 74 sequentially and individually blinks the plurality of marker output parts 72 mounted at the marker generator 70 in a predetermined order according to the control of the microprocessor 76.
  • the microprocessor 76 carries out the arrangement of the coordinate systems corresponding to the plurality of 3D scan data photographed at various angles by analyzing the 2D image data and 3D scan data inputted through the image input part 24. That is, the microprocessor 76 sets up an image photographed by the image obtaining part 18 while all the markers turn off as a reference image, and then compares the reference image with a plurality of images photographed while the markers turn on one by one sequentially. Through the above process, the 2D position of each marker can be obtained.
  • the microprocessor 76 carries out the same processes as performed in the first embodiment.
  • that is, the microprocessor analyzes the 2D image data and the 3D scan data to compute the 3D positions of the markers, searches corresponding markers to obtain a translation matrix, and translates the plurality of 3D scan data to the reference coordinate system.
  • the microprocessor 76 controls the movement driving part 20 to activate the movement mechanism 22, which moves the image obtaining part 18 integrated with the pattern projector 16 to a position appropriate for scanning the object 10 (S30).
  • the microprocessor 76 obtains the image data photographed by the image obtaining part 18 as a reference image while all of the optical markers are turned off. Then, the microprocessor 76 controls the individual marker blinking controller 74 to turn on the firstly-designated marker output part 72 out of the plurality of marker output parts 72 equipped in the marker generator 70, which allows the first marker to be projected on the surface of the object 10 (S31). Then, an image is obtained as first image data by the image obtaining part 18 (S32).
  • the microprocessor 76 controls the individual marker blinking controller 74 to turn on the secondly-designated marker output part according to a predetermined order and to allow the second optical marker to be projected on the object 10 (S33). Then, the second image data is obtained (S34). Next, the microprocessor 76 discriminates whether the marker contained in the image is the last (N-th) marker out of the predetermined plural markers (S35). If the marker is not the last marker, steps S33 and S34 are repeatedly carried out until N-th image data is obtained.
  • the microprocessor 76 controls the projector controller 28 to activate the pattern projector 16 while the marker generator 70 is turned off to prevent the optical marker from being projected and to allow a prescribed pattern (for example, patterns having stripes with different gaps therebetween or multiple striped pattern) for 3D scanning to be projected on the object 10 by the pattern projector 16.
  • the microprocessor 76 receives the 3D scan data from the image input part 24 (S36).
  • the microprocessor 76 compares each of the first to N-th image data with the reference image and searches a bright spot formed by the optical markers in each comparison, which helps the 2D positions of each marker to be found easily (S37).
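  • A sketch of this per-marker search (assuming 8-bit grayscale frames; the centroid of the brightened pixels is taken as the marker's 2D position):

      import numpy as np

      def locate_markers_sequentially(reference_image, single_marker_frames, threshold=40):
          # reference_image: frame with all markers off; single_marker_frames[i]:
          # frame with only the i-th marker on. Differencing against the reference
          # exposes exactly one bright spot per frame.
          positions = []
          for frame in single_marker_frames:
              diff = np.abs(frame.astype(np.int16) - reference_image.astype(np.int16))
              ys, xs = np.nonzero(diff > threshold)
              positions.append((xs.mean(), ys.mean()) if xs.size else None)
          return positions  # 2D position of the 1st..N-th marker, or None if unseen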
  • the microprocessor 76 computes the 3D positions of the markers by analyzing the 2D positions of the markers and the 3D scan data, and searches corresponding markers included in overlapped regions in reference to the 3D positions of the markers, and calculates translation matrices, and translates the plurality of 3D scan data to the reference coordinate system (S38), which is the same as described in the first embodiment.
  • the microprocessor 76 discriminates whether the automatic arrangement of the 3D scan data has been completed or not; if it has not, the above steps are repeated.
  • the construction of the apparatus for automatically arranging the 3D scan data according to the third embodiment of the present invention is identical to the one shown in Fig. 11. However, the method differs between the second embodiment and the third embodiment. That is, in the second embodiment, N images, each of which includes one marker different from the others, have to be photographed respectively. In the third embodiment, however, log2(N+1) images are photographed, each of which includes a group of markers for binarization.
  • the individual marker blinking controller 74 divides the marker output parts 72 disposed at the marker generator 70 into several groups for binarization, and turns on the markers group by group.
  • the individual marker blinking controller 74 divides the 16 marker output parts 72 into 4 groups in an overlapping way.
  • a first group comprises 9th to 16th markers
  • a second group comprises 5th to 8th markers and 13th to 16th markers
  • a third group comprises 3rd, 4th, 7th, 8th, 11th, 12th, 15th and 16th markers
  • a fourth group comprises even numbers of markers (2nd, 4th, 6th, 8th, 10th, 12th, 14th, 16th), which are all represented in Table 1.
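  • Table 1 itself is not reproduced in this text; from the grouping described above it can be reconstructed as follows (each marker's on/off code across the four images is the binary representation of its index minus one):

      Marker:  1  2  3  4  5  6  7  8  9 10 11 12 13 14 15 16
      Image 1: 0  0  0  0  0  0  0  0  1  1  1  1  1  1  1  1
      Image 2: 0  0  0  0  1  1  1  1  0  0  0  0  1  1  1  1
      Image 3: 0  0  1  1  0  0  1  1  0  0  1  1  0  0  1  1
      Image 4: 0  1  0  1  0  1  0  1  0  1  0  1  0  1  0  1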
  • in Table 1, "0" represents that the marker is turned off while "1" represents that the marker is turned on.
  • the first marker maintains a turned-off state at all times while the 16th marker always maintains a turned-on state and all markers have intrinsic values respectively.
  • the microprocessor 76 controls the individual marker blinking controller 74 such that the N markers are projected group by group, and compares the log2(N) image data obtained by the image obtaining part 18, and computes the 2D positions of the markers.
  • when the 16 markers are projected group by group to obtain the first to fourth image data as shown in Table 1, the 16 markers are differentiated by their binary codes, which represent their turned-on or turned-off states, that is, their identification (ID). Therefore, the 2D positions of the 16 markers can be obtained. For example, the 10th marker is recognized as binary "1001" and the 13th marker is recognized as binary "1100". Meanwhile, the first marker, always maintaining a turned-off state, is not used, such that a total of 15 markers can be utilized in practice.
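  • A sketch of decoding the marker IDs from the group images (blob positions and a marker-free reference frame are assumed available, e.g. from the differencing step of the first embodiment):

      import numpy as np

      def decode_marker_ids(blob_positions, reference_image, group_frames, threshold=40):
          # blob_positions: (x, y) pixel centroids of candidate markers;
          # group_frames[g] is the image in which group g of markers is turned on.
          # Reading each blob's on/off state across the frames yields its binary ID.
          ids = []
          for x, y in blob_positions:
              code = 0
              for frame in group_frames:
                  lit = (int(frame[int(y), int(x)]) -
                         int(reference_image[int(y), int(x)])) > threshold
                  code = (code << 1) | int(lit)
              ids.append(code + 1)   # the k-th marker carries code k-1 (10th -> 1001)
          return ids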
  • the microprocessor 76 computes the 3D positions of the markers from the 2D positions of the markers and the 3D scan data, searches corresponding markers, calculates translation matrices, and moves the plurality of 3D scan data by the translation matrices. The above process is the same as that described in the first embodiment.
  • the microprocessor 76 controls the movement driving part 20 to drive the movement mechanism 22, which moves the image obtaining part 18 integrated with the pattern projector 16 to a position appropriate for scanning an object 10 (S40).
  • the microprocessor 76 controls the individual marker blinking controller 74 to turn on the marker output part 72 so that the markers (9th - 16th markers) belonging to the first group can be projected (S41).
  • a first image data photographed by the image obtaining part 18 is obtained via the image input part 24 (S42).
  • the microprocessor 76 controls the individual marker blinking controller 74 to turn on the marker output parts 72 so that the next group of markers (for example, the 5th - 8th and 13th - 16th markers) is projected (S43), to thereby obtain the Nth image data photographed by the image obtaining part 18 (S44).
  • the microprocessor 76 discriminates whether the marker group contained in the image data is the last one or not (S45), and if it is not the last one, the flow is returned to S43 to repeat the process.
  • the projector controller 28 drives the pattern projector 16 to project patterns on the surface of the object 10 while the marker generator 70 is turned off to prevent the optical markers from being projected.
  • the microprocessor 76 receives the 3D scan data through the image input part 24 (S46).
  • the microprocessor 76 compares the first to Nth images obtained from the image obtaining part 18, which results in obtaining binary information of the markers relative to the first to fourth image data. Therefore, the ID of each marker, and thus the 2D positions of the markers, are obtained (S47). Meanwhile, the microprocessor 76 computes the 3D positions of the markers by analyzing the 2D positions of the markers and the 3D scan data, searches the corresponding markers included in the overlapped region of two different 3D scan data in reference to the 3D positions of the markers, calculates the translation matrix, and translates one of the 3D scan data to the reference coordinate system by the translation matrix (S48), which is the same as described in the first embodiment.
  • the microprocessor 76 discriminates whether the automatic arrangement of the 3D scan data has been completed or not (S49). If the automatic arrangement of the 3D scan data has not been completed, the flow is returned to S40. Therefore, steps S40 to S48 are repeated.
  • an apparatus for automatically arranging 3D scan data includes a pattern projector 16, an image obtaining part 18, a movement driving part 20, a movement mechanism 22, an image input part 24, a projector controller 28, a buffer 32, a marker generator 80, an individual marker blinking controller 84, and a microprocessor 86.
  • the marker generator 80 projects markers recognizable by the image obtaining part 18 on the surface of the object 10.
  • the marker generator 80 is disposed with a plurality of marker output parts 82 for projecting a plurality of optical markers at irregular angles on the entire surface of the object 10.
  • the marker generator 80 selectively blinks the plural marker output parts 82 according to the control of the individual marker blinking controller 84.
  • the individual marker blinking controller 84 individually controls the plural marker output parts 82 according to the control of the microprocessor 86.
  • the microprocessor 86 analyzes the scanned data obtained from the object 10 for automatically arranging the 3D scan data in a single uniform coordinate system.
  • the microprocessor 86 receives 2D image data and 3D scan data photographed by the image obtaining part 18 at various angles via the image input part 24 to analyze them for automatic arrangement on one coordinate system, the detailed operation procedures of which are the same as those of the microprocessor in the first embodiment.
  • in the fourth embodiment, the markers projected on a region where image data and scan data have already been obtained blink at a predetermined cycle (for example, approximately 0.5 second), while the markers projected on the other region maintain an "on" state under the control of the individual marker blinking controller 84.
  • alternatively, the markers for a region for which the obtaining process has already been terminated maintain an "on" state, while the markers for the other region blink at a predetermined cycle.
  • in either case, the conditions of the markers are set up differently between the region where image data and scan data have already been obtained and the other region. Therefore, the regions are easily differentiated by operators.
  • an apparatus for automatically arranging 3D scan data includes a pattern projector 16, an image obtaining part 18, a movement driving part 20, a movement mechanism 22, an image input part 24, a projector controller 28, a buffer 32, a marker generator 90, a marker individual blinking/color controller 94 and a microprocessor 96.
  • the marker generator 90 projects patterns recognizable by the image obtaining part 18 on a surface of an object.
  • the marker generator 90 is disposed with a plurality of marker output parts 92 for projecting a plurality of optical markers at arbitrary angles on the surface of the object 10.
  • the marker generator 90 is constructed such that at least more than two different colors can be selectively projected from each marker output part 92 according to the control of the marker individual blinking/color controller 94.
  • each marker output part 92 is equipped with more than two light sources each having different colors such that these light sources can be selectively lighted.
  • the marker individual blinking/color controller 94 controls the blinking and individual coloring of the plurality of marker output parts 92 disposed at the marker generator 90 according to the control of the microprocessor 96.
  • the microprocessor 96 analyzes the 2D image data and 3D scan data photographed from various angles by the image obtaining part 18 for automatically arranging the 3D scan data on one coordinate system.
  • the detailed operation procedures thereto are the same as those in the first embodiment of the present invention.
  • the microprocessor 96 controls the marker individual blinking/color controller 94 so that the markers projected on a region where image data and scan data have already been obtained have colors different from those of the markers projected on the other region.
  • an apparatus for automatically arranging 3D scan data includes marker generators 12, a pattern projector 16, an image obtaining part 18, an image input part 24, a marker blinking controller 26, a projector controller 28, a buffer 32, a rotating table 100, a rotating drive part 102, a rotating mechanism 104 and a microprocessor 106.
  • the rotating table 100 rotates with an object 10 placed on the upper plate of the rotating table 100, and also rotates with a plurality of marker generators 12 disposed at a circumference of the upper plate.
  • the rotating drive part 102 drives the rotating mechanism 104 to rotate the rotating table 100 according to the control of the microprocessor 106 such that the object can be set at an angle appropriate for scanning.
  • although the rotating drive part 102 is utilized to electrically rotate the rotating table 100 in the sixth embodiment of the present invention, it should be apparent that the rotating mechanism 104 may be manually rotated to allow an operator to control the rotating table 100 arbitrarily.
  • as long as the marker generators and the object can be rotated together in a mutually fixed state, not only the rotating table 100 but also other devices may be applied herein.
  • the microprocessor 106 receives 2D image data and 3D scan data photographed by the image obtaining part 18 at various angles and analyzes the data for arranging the 3D scan data automatically on one coordinate system, the detailed operation procedures of which are the same as those of the microprocessor in the first embodiment.
  • the object 10 and the marker generator 12 are rotated during the scanning process instead of the image obtaining part 18 and the pattern projector 16.
  • the microprocessor 106 controls the rotating drive part 102 to drive the rotating mechanism 104, thus rotating the rotating table 100 at a predetermined angle so that the object 10 can be rotated to a position appropriate for scanning (S50).
  • the microprocessor 106 controls the marker blinking controller 26 to turn on the marker generators 12 so that the optical markers are projected on the surface of the object 10 (S51).
  • the microprocessor 106 receives the 2D image data obtained by the image obtaining part 18 via the image input part 24 (S52).
  • the microprocessor 106 controls the marker blinking controller 26 to turn off the marker generators 12, thereby preventing the optical markers from being projected on the object 10 (S53).
  • the same region of the object 10 without the markers is photographed and the 2D image data thereof is received via the image input part 24 (S54).
  • the microprocessor 106 controls the projector controller 28 to activate the pattern projector 16 while the marker generators 12 are turned off to prevent the optical markers from being projected. Therefore, prescribed patterns (for example, patterns having stripes with different gaps therebetween or multiple striped patterns) are projected on the surface of the object 10 for 3D scanning.
  • the microprocessor 106 receives the 3D scan data via the image input part 24 (S55).
  • the microprocessor 106 calculates 2D positions of the markers by image-processing the 2D image data obtained with and without the optical markers (S56).
  • the microprocessor 106 computes the 3D positions of the markers from the 2D positions of the markers and the 3D scan data (S57). That is, the 3D positions of the markers can be obtained by estimating intersection points where straight lines connecting the camera lens center of the image obtaining part 18 and the positions of arbitrary markers in the 2D image data intersect the 3D scan data.
  • the microprocessor 106 discriminates whether the register of buffer 32 is vacant or not (S58).
  • the microprocessor 106 compares the 3D positions of the markers obtained at S57 with those of the markers included in the 3D scan data stored in the registers of the buffer 32, thereby searching markers that correspond to each other (S59).
  • the microprocessor 106 computes translation matrices (S60) by analyzing the relation of the corresponding markers, and the current scan data are translated to the reference coordinate system by which the 3D scan data listed in the register of the buffer 32 is defined (S61), which is the same as described in the first embodiment.
  • the microprocessor 106 registers the markers at the register of the buffer 32, which serves as a reference in the next calculation (S62). Successively, the microprocessor 106 checks whether the automatic arrangement of the 3D scan data obtained from the object 10 is completed (S63).
  • the sixth embodiment of the present invention is constructed so as to allow the object 10 to be moved, and thus it is easy to obtain and arrange 3D scan data from an object relatively smaller than in the first embodiment of the present invention, where the projector and the image obtaining part are structured to move.
  • the marker generators are fixed on the rotating table to prevent relative movement between them and the object until the scanning process is completed.
  • the arrangement method using reference coordinates in the aforementioned embodiments has a drawback in that errors can accumulate if the number of regions to be scanned is large, because in the above methods arrangement is carried out by integrating one 3D scan data into the reference coordinate system in which its already-obtained neighboring 3D scan data is defined, and the arrangement process is repeated over all regions of the object. Therefore, an error introduced at one step can be amplified by the end of the arrangement.
  • Figs. 18a and 18b illustrate two scan data obtained by scanning two adjacent regions that overlap each other.
  • the dotted lines indicate real data of an object and the solid lines indicate scan data which are not identical to the real data.
  • a method of arranging 3D scan data on an absolute coordinate system instead of a reference coordinate system, is presented in the seventh and eighth embodiments of the present invention.
  • the absolute coordinate system in these embodiments differs from the reference coordinate system in that every 3D scan data of the regions of an object is mapped to an absolute coordinate. Therefore, errors occurring in obtaining one 3D scan data are not propagated to the obtaining of adjacent 3D scan data.
  • Figs. 19a and 19b illustrate two scan data obtained from scanning two adjacent regions, where a part of the scan data overlaps. If the two scan data of Figs. 19a and 19b are translated to an absolute coordinate system respectively and are attached to each other, as shown in Fig. 19d, the errors occurring in the two scan data are not added together as shown in Fig. 19c, such that the error amplification problem caused by inaccuracy of the image obtaining part thus described can be prevented.
  • An apparatus for automatically arranging 3D scan data includes a marker generator 12, a pattern projector 16, an image obtaining part 18, a first movement driving part 20, a first movement mechanism 22, a marker blinking controller 26, a projector controller 28, a buffer 32, a large domain image obtaining part 110, an image input part 112, a second movement driving part 114, a second movement mechanism 116, a microprocessor 118, and a reference object 120.
  • the large domain image obtaining part 110 comprises an image sensor for receiving images such as CCD camera or Complementary Metal Oxide Semiconductor (CMOS) camera.
  • the large domain image obtaining part 110 is positioned separately from the image obtaining part 18 to photograph and obtain an image of a large domain of the object 10.
  • the large domain image obtaining part 110 preferably adopts an image sensor having a relatively higher accuracy than that of the image obtaining part 18, which obtains images of only part of the scan domain.
  • the image input part 112 receives image data from the image obtaining part 18 and the large domain image obtaining part 110.
  • the second movement driving part 114 drives the second movement mechanism 116 to move the large domain image obtaining part 110 to a position suitable for obtaining the image of large part of the object 10 according to the driving control of the microprocessor 118.
  • although the large domain image obtaining part 110 is moved electrically by the second movement driving part 114 in the seventh embodiment of the present invention, it may also be moved by manually manipulating the second movement mechanism 116.
  • the microprocessor 118 computes 3D positions of each marker for the large scan domain by analyzing image data of the object 10 and a reference object 120 photographed by the large domain image obtaining part 110 in two or more different directions, while a plurality of optical markers are projected on the surface of the object 10 by the marker generator 12.
  • the 3D positions of the markers thus obtained serve as an absolute coordinate system.
  • the microprocessor 118 receives via the image input part 112 a plurality of 2D image data and 3D scan data photographed at various angles by the image obtaining part 18, and analyzes them, and translates every 3D scan data to the absolute coordinate system, which results in an arrangement of a complete 3D scan data of the object 10.
  • the reference object 120, an object of a prescribed shape whose size (dimension) information is pre-inputted into the microprocessor 118, is arranged close to the object 10. An image of the reference object 120 is obtained along with that of the object 10 via the large domain image obtaining part 110.
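  • Once the two viewing rays toward a marker are known (the camera poses being recovered with the help of the reference object 120 of known dimensions), its 3D position can be triangulated; a midpoint-method sketch, with illustrative names:

      import numpy as np

      def triangulate_midpoint(c1, d1, c2, d2):
          # c1, c2: camera centers of the two large-domain views; d1, d2: ray
          # directions toward the same marker. Returns the midpoint of the
          # shortest segment between the two (generally skew) rays.
          d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
          w0 = c1 - c2
          a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
          d, e = d1 @ w0, d2 @ w0
          denom = a * c - b * b                    # zero only for parallel rays
          s = (b * e - c * d) / denom
          t = (a * e - b * d) / denom
          return ((c1 + s * d1) + (c2 + t * d2)) / 2.0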
  • an object 10 is placed beside the marker generator 12, and a reference object 120 is arranged close to the object 10.
  • the microprocessor 118 controls a second movement driving part 114 to drive the second movement mechanism 116 so that the large domain image obtaining part 110 moves to a position suitable for scanning the object 10.
  • the microprocessor 118 controls the marker blinking controller 26 to turn on a plurality of marker output parts 14 equipped at the marker generator 12, whereby a plurality of markers can be arbitrarily projected on the surface of the object 10 (S70).
  • the large domain of the object 10 and the reference object 120 are photographed by the large domain image obtaining part 110 to obtain 2D image data including the optical markers. Then the microprocessor 118 receives the 2D image data obtained from the large domain image obtaining part 110 via the image input part 112 (S71).
  • in Fig. 23, an example of an image including the entire domain of the object 10 and the reference object 120 obtained by the large domain image obtaining part 110 is shown.
  • the reference symbol "RM" indicates an optical marker projected on the surface of the object 10, while the reference symbol "B1" refers to the image obtained by the large domain image obtaining part 110.
  • the microprocessor 118 controls the second movement driving part 114 to drive the second movement mechanism for moving the large domain image obtaining part 110 to a position suitable for scanning another part of the object 10 (S72).
  • the microprocessor 118 controls the large domain image obtaining part 110 to photograph the large domain of the object 10 including the reference object 120, whereby a 2D image including the optical markers is obtained in a different direction from that of S71.
  • the 2D image is received by the microprocessor 118 via the image input part 112 (S73).
  • the microprocessor 118 then controls the marker blinking controller 26 to turn off the marker generator 12, thereby preventing the optical markers from being projected on the surface of the object 10 (S74).
  • the microprocessor 118 combines the 2D images of the large domain of the object 10 obtained in different directions by the large domain image obtaining part 110 and computes the 3D positions of the markers included in the combined 2D images in reference to the already-known dimensions of the reference object 120 (S75). Next, the microprocessor 118 registers the 3D positions of the markers, which serve as the absolute coordinate system, in the register of the buffer 32 (S76).
  • the microprocessor 118 controls the first movement driving part 20 to drive the first movement mechanism 22, whereby the image obtaining part 18 integrated with the pattern projector 16 is moved to a position suitable for scanning the object 10 (S77).
  • the microprocessor 118 controls the marker blinking controller 26 to turn on the plurality of marker output parts 14 equipped at the marker generator 12 to allow the plural markers to be arbitrarily projected on the surface of the object 10 (S78).
  • the microprocessor 118 receives the 2D image data obtained by the image obtaining part 18 via the image input part 112 (S79).
  • the microprocessor 118 controls the marker blinking controller 26 to turn off the marker generator 12, thereby preventing the optical markers from being projected on the object 10 (S80). Under this condition, the same part of the large domain mentioned above is photographed by the image obtaining part 18 to obtain a 2D image without the optical markers. The 2D image data thus obtained is inputted to the microprocessor via the image input part 112 (S81).
  • the microprocessor 118 controls the projector controller 28 to activate the pattern projector 16 while the marker generator 12 is turned off to prevent the optical markers from being projected, whereby predetermined patterns (for example, patterns having stripes with different gaps therebetween or multiple striped patterns) are projected on the surface of the object 10.
  • the microprocessor 118 receives the 3D scan data via the image input part 112 (S82). Under such circumstances, the microprocessor 118 computes the 2D positions of the markers by image-processing the 2D image data including the markers and the 2D image data not including the markers (S83).
  • the microprocessor 118 computes the 3D positions of the markers from the 2D positions of the markers and the 3D scan data. That is, the 3D positions of the markers can be obtained by estimating intersection points where straight lines connecting the camera lens center of the image obtaining part 18 and the positions of three arbitrary markers in the 2D image data intersect the 3D scan data (S84).
  • the microprocessor 118 compares the 3D positions of the markers found at S84 with the 3D positions of the markers stored in the register of the buffer 32 at S76 to search corresponding markers (S85).
  • the microprocessor 118 calculates the translation matrices for translating the markers in the current 3D scan data to the absolute coordinate system (S86). Then, the current scan data are moved by the translation matrices to be arranged on the absolute coordinate system by which the 3D positions of the markers stored in the register of the buffer 32 are defined (S87).
  • the microprocessor 118 discriminates whether an automatic arrangement in relation to the 3D data obtained from the object 10 is completed or not, in other words, whether 3D scan data obtained from parts of the object 10 are all arranged or not (S88).
  • if not, steps S77 to S88 are repeatedly carried out (a minimal sketch of this control loop is given below).
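The sequence S77 to S88 amounts to an acquire-and-align loop. The following Python sketch illustrates only that control flow; the `scanner` object and the `helpers` functions are hypothetical stand-ins for the hardware operations and image-processing steps described above, not the patented implementation.

```python
def arrange_all_views(scanner, view_poses, absolute_markers, helpers):
    """Acquire-and-align loop sketched from steps S77-S88 (assumed interface)."""
    arranged = []
    for pose in view_poses:
        scanner.move_to(pose)                    # S77: reposition the scanner
        scanner.set_markers(True)                # S78: project optical markers
        img_on = scanner.grab_image()            # S79: 2D image, markers on
        scanner.set_markers(False)               # S80: markers off
        img_off = scanner.grab_image()           # S81: 2D image, markers off
        scan = scanner.grab_scan()               # S82: 3D scan via patterns
        pts2d = helpers.markers_2d(img_on, img_off)       # S83: 2D positions
        pts3d = helpers.markers_3d(pts2d, scan)           # S84: 3D positions
        pairs = helpers.match(pts3d, absolute_markers)    # S85: correspondences
        M = helpers.rigid_transform(pairs)                # S86: translation matrix
        arranged.append(helpers.apply(M, scan))           # S87: move to absolute frame
    return arranged                              # S88: all parts arranged
```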
  • although a large domain image obtaining part and an image obtaining part are introduced as two different elements, one image obtaining part may preferably be utilized for obtaining both the images of the large domain of the object and the images of parts of that domain.
  • in the eighth embodiment shown in Fig. 24, the apparatus for automatically arranging 3D scan data using optical markers includes a marker generator 12, a pattern projector 16, an image obtaining part 18, a movement driving part 20, a movement mechanism 22, a marker blinking controller 26, a project controller 28, a buffer 32, a pair of (or plural) large domain image obtaining parts 130 and 132, an image input part 134 and a microprocessor 136.
  • the pair of large domain image obtaining parts 130 and 132 comprise image sensors for receiving images such as CCD cameras or CMOS cameras.
  • the cameras are fixed relative to each other and capture images of the same object from different angles, a method known as stereo vision.
  • the large domain image obtaining parts 130 and 132 preferably adopt image sensors of relatively higher resolution than the image obtaining part 18, which obtains images of parts of the domain.
  • the image input part 134 is intended to receive image data obtained by the image obtaining part 18 and the large domain image obtaining parts 130 and 132.
  • the microprocessor 136 computes 3D positions of each marker for large scan domains by analyzing image data of the object 10 photographed by the large domain image obtaining parts 130 and 132 in two different directions, while a plurality of optical markers are projected on the surface of the object 10 by the marker generator 12.
  • the 3D positions of markers thus obtained serve as an absolute coordinate system.
  • the microprocessor 136 receives via the image input part 134 a plurality of 2D image data and 3D scan data photographed at various angles by the image obtaining part 18, and analyzes them, and translates every 3D scan data to the absolute coordinates, which results in an arrangement of the whole 3D scan data of the object 10.
  • a predetermined object 10 is placed beside the marker generator 12, and the microprocessor 136 controls the marker blinking controller 26 to turn on the plurality of marker output parts 14 equipped at the marker generator 12, allowing plural markers to be arbitrarily projected on the surface of the object 10 (S90).
  • the microprocessor 136 receives, via the image input part 134, two 2D image data obtained from the large domain image obtaining parts 130 and 132 when the large domain of the object 10 is photographed in an overlapping manner in different directions by the large domain image obtaining parts 130 and 132, respectively, while the optical markers from the marker generator 12 are projected on the object 10 (S91).
  • Fig. 26a illustrates an example of images of the large scan domain of the object 10 obtained by the large domain image obtaining parts 130 and 132.
  • RM indicates an optical marker projected on the surface of the object 10
  • BI-1 is an image obtained by the large domain image obtaining part 132
  • BI-2 is an image obtained by the large domain image obtaining part 130.
  • the microprocessor 136 controls the marker blinking controller 26 to turn off the marker generator 12, thereby preventing the optical markers from being projected on the surface of the object 10 (S92).
  • the microprocessor 136 computes the 3D positions of the markers included in the large scan domain of the object in reference to the two 2D image data photographed in two different directions by the large domain image obtaining parts 130 and 132 (S93). In other words, from the relation between the positions of the pair of large domain image obtaining parts 130 and 132 and the 2D positions of each marker projected on the object 10, the 3D position of each marker is calculated by triangulation, the details of which will be explained later. Next, the microprocessor 136 registers the 3D positions of each marker thus calculated in the register of the buffer 32 (S94).
  • the microprocessor 136 then controls the movement driving part 20 to drive the movement mechanism 22, whereby the image obtaining part 18 integrated with the pattern projector 16 moves to a position suitable for scanning the object 10 (S95).
  • the microprocessor 136 controls the marker blinking controller 26 to turn on the plurality of marker output parts 14 equipped at the marker generator 12, thereby projecting the plural markers arbitrarily on the surface of the object 10 (S96).
  • the microprocessor 136 receives the 2D image data via the image input part 134 (S97).
  • the microprocessor 136 controls the marker blinking controller 26 to turn off the marker generator 12, preventing the optical markers from being projected on the object 10 (S98). Under such condition, when the same part as mentioned above is photographed by the image obtaining part 18 to obtain 2D image data not including the optical markers, the microprocessor 136 receives the 2D image data via the image input part 134 (S99).
  • the microprocessor 136 controls the project controller 28 to activate the pattern projector 16 while the marker generator 12 is turned off to prevent the optical markers from being projected. Therefore, predetermined patterns (for example, patterns having stripes with different gaps therebetween or multiple striped patterns) are projected on the surface of the object 10.
  • the microprocessor 136 receives the 3D scan data via the image input part 134 (S100).
  • the microprocessor 136 analyzes the 2D image data including the markers and the 2D image data not including the markers for calculating 2D positions of the markers (S101).
  • the microprocessor 136 computes 3D positions of the markers in reference to the 2D positions of the markers and the 3D scan data (S102). That is, the 3D positions of the markers can be obtained by estimating intersection points where straight lines connecting the camera lens center of the image obtaining part 18 and the positions of arbitrary markers in the 2D image data intersect the 3D scan data. Successively, the microprocessor 136 compares the 3D positions of markers found at S102 with the 3D positions of markers stored in the register of the buffer 32 at S94 to search corresponding markers (S103).
  • the microprocessor 136 calculates the translation matrices for translating the markers in the current 3D scan data (S104).
  • the current scan data are moved by the translation matrices to be arranged on the absolute coordinates by which the 3D positions of the markers stored in the register of the buffer 32 are defined (S105).
  • the microprocessor 136 discriminates whether an automatic arrangement in relation to the 3D data obtained from the object 10 is completed or not, in other words, whether 3D scan data obtained from parts of the entire scan data domain for the object 10 are all arranged (S106).
  • if not, steps S95 to S106 are repeatedly carried out.
  • although the pair of large domain image obtaining parts, the image obtaining part and the marker generator are configured separately here, the pair of large domain image obtaining parts and the marker generator may be integrally configured as a modification thereto. In this case, this may be more convenient because there is no need to set the positions of the pair of large domain image obtaining parts according to a domain where the optical markers are projected.
  • alternatively, a pair of large domain image obtaining parts and an image obtaining part may be integrally constructed.
  • in this case, the scan domain in the process of obtaining the absolute coordinate system may become somewhat smaller and accuracy may also be decreased; however, in the process of obtaining images of parts of the scan domain, it is not necessary to photograph in an overlapping manner, so the number of scanning operations can be reduced.
  • the large domain image obtaining parts 130 and 132 disclosed in the eighth embodiment of the present invention can be modeled by two cameras facing one object, which can be modified according to the field of application.
  • two cameras are arranged in parallel as shown in Fig. 27.
  • the variables in Fig. 27 are defined below.
  • A, B: image planes obtained by each camera
  • a method for obtaining the position of a point by using the stereo images can be defined by Formulas (Equations) 15 and 16; a reconstruction is sketched below.
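For the parallel camera arrangement of Fig. 27, the standard stereo-triangulation relations take the following form; this is a hedged reconstruction assuming a common focal length $f$, a baseline $b$ between the two lens centers, and image coordinates $(x_A, y_A)$ and $(x_B, y_B)$ of the same scene point on planes A and B:

$$Z = \frac{b\,f}{x_A - x_B} \qquad \text{(cf. Formula 15)}$$

$$X = \frac{x_A\,Z}{f}, \qquad Y = \frac{y_A\,Z}{f} \qquad \text{(cf. Formula 16)}$$

The disparity $x_A - x_B$ shrinks as the point recedes from the cameras, which is why a longer baseline or higher-resolution sensors improve the accuracy of the computed marker positions.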
  • in the ninth embodiment, a plurality of projectors, image obtaining parts and marker generators are arranged around an object, such that the projectors and image obtaining parts need not be moved to obtain 2D images and 3D scan data covering the entire scan domain of the object; one scanning operation makes it possible to obtain the 2D images and 3D scan data, thereby simplifying the job and shortening the time involved therein.
  • Fig. 28 is a schematic drawing for illustrating a construction of an apparatus for automatically arranging 3D scan data using optical markers according to the ninth embodiment of the present invention, wherein the apparatus comprises N number of marker generators 142, M number of pattern projectors 146, L number of image obtaining parts 148, an image input part 150, a project controller 152, a marker blinking controller 154, a microprocessor 156 and a buffer 158.
  • the N number of marker generators 142, intended to project markers recognizable by the image obtaining parts 148 on the surface of an object, are provided with a plurality of marker output parts 144 for projecting a plurality of optical markers on the entire surface of the object 10 in arbitrary scanning directions.
  • the N number of marker generators 142 are directed to the object 10, and are apart from each other at a predetermined interval, and the markers are so arranged as to cover the entire object.
  • the M number of pattern projectors 146 project predetermined patterns or laser striped patterns on the surface of the object 10 for obtaining 3D scan data.
  • LCD projectors may be utilized to project space-coded beams or laser beams on the surface of the object 10, thereby obtaining 3D scan data via the image obtaining part 148.
  • the M number of pattern projectors 146 are directed to the object 10, and are apart from each other at a predetermined interval, and the space-coded beams projected from each pattern projector 146 are made to cover the entire domain of the object 10.
  • the L number of image obtaining parts 148 which comprise image sensors capable of receiving images, such as CCD cameras or CMOS cameras, photograph and obtain images of the object 10. It is preferable that each of the L number of image obtaining parts 148 is integrated with an individual pattern projector 146 instead of being separated.
  • the L number of image obtaining parts 148 are directed to the object 10, and are apart from each other at a predetermined interval, and the scanning domains of the image obtaining parts 148 cover the entire domain of the object 10.
  • the image input part 150 receives each image data obtained from the L number of image obtaining parts 148, and the project controller 152 controls the transfer speed and transfer directions of pattern films and the blinking cycle of light sources for projecting the pattern films.
  • the marker blinking controller 154 periodically blinks the optical markers from the N number of marker generators 142 according to the control of the microprocessor 156.
  • the microprocessor 156 computes the 3D positions of the markers on each domain in reference to the 2D image data and 3D scan data obtained from the L number of image obtaining parts 148, respectively, and searches corresponding markers on every overlapped domains in reference to the 3D positions of the markers, and calculates translation matrices by using the corresponding markers. As a result, the microprocessor 156 arranges each of the 3D scan data by the translation matrices.
  • the buffer 158 stores data necessary for computing and the resultant data thereof.
  • the object 10 is placed in a position suitable for scanning, and the N number of marker generators 142, M number of pattern projectors 146 and L number of image obtaining parts 148 are arranged around the object 10. Then, the microprocessor 156 controls the marker blinking controller 154 to turn on the plural marker output parts 144 each equipped at the N number of marker generators 142, thereby allowing the plural markers to be arbitrarily projected on the surface of the object 10 (S110).
  • the microprocessor 156 receives the L number of 2D image data obtained from the L number of image obtaining parts 148 via the image input part 150 (S111).
  • the microprocessor 156 controls the marker blinking controller 154 to turn off the N number of marker generators 142, thereby preventing optical markers from being projected on the surface of the object (S112). Under such condition, when the same domain of the object 10 as mentioned above is photographed by the L number of image obtaining parts 148 to obtain L number of 2D images not including the optical markers, the microprocessor 156 receives the L number of 2D image data via the image input part 150 (S113).
  • the microprocessor 156 controls the project controller 152 to operate the M number of pattern projectors 146, whereby predetermined patterns (for example, patterns having stripes with different gaps therebetween or multiple striped patterns) are projected on the surface of the object 10 from the M number of pattern projectors 146.
  • the microprocessor 156 receives the L number of 3D scan data via the image input part 150 (S114). Under such circumstances, the microprocessor 156 calculates 2D positions of the markers by image-processing the 2D image data including the optical markers and the 2D image data not including the markers (S115).
  • the microprocessor 156 computes the 3D positions of the markers from the 2D positions of the markers and the 3D scan data (S116). That is, the 3D positions of the markers can be obtained by estimating intersection points where straight lines connecting the camera lens centers of the L number of image obtaining parts 148 and the positions of arbitrary markers in the 2D image data intersect the 3D scan data.
  • the microprocessor 156 compares the 3D positions of the markers for each of the L number of 3D scan data for searching corresponding markers (S117).
  • the microprocessor 156 calculates the translation matrices for translating the markers in the current 3D scan data (S118).
  • one of the L number of 3D scan data is set up as a reference coordinate system, and the current 3D scan data are moved according to the obtained translation matrices for arrangement (S119).
  • a reference object whose dimensions are already known is placed in a position suitable for scanning, and the N number of marker generators 142, M number of pattern projectors 146 and L number of image obtaining parts 148 are respectively arranged around the reference object.
  • the reference object may be specially manufactured for calibration, or may be an actual object if dimensions thereof are already known.
  • the microprocessor 156 controls the marker blinking controller 154 to turn on the plural marker output parts 144 equipped at the N number of marker generators 142, thereby projecting the optical markers on the surface of the reference object (S120).
  • the microprocessor 156 carries out calibration for seeking a correlation between the reference object and the L number of image obtaining parts 148 (S121). The detailed operating processes thereof are described below.
  • at step S121, optical markers from the N number of marker generators 142 are projected on the surface of the reference object, and when the scan domains of the reference object are photographed by the L number of image obtaining parts 148 to obtain 2D images containing the optical markers, the microprocessor 156 receives the L number of 2D image data via the image input part 150.
  • the microprocessor 156 controls the project controller 152 to operate the M number of pattern projectors 146. Therefore, predetermined patterns (for example, patterns having stripes with different gaps therebetween or multiple striped patterns) are projected on the surface of the reference object from the M number of pattern projectors 146.
  • the microprocessor 156 receives the L number of 3D scan data via the image input part 150.
  • the microprocessor 156 estimates the 3D positions of the markers in the L number of 3D scan data by calculating intersection points where straight lines connecting each center of the cameras equipped in the L number of image obtaining parts 148 with the markers included in each 2D image data intersect the 3D scan data, respectively.
  • the microprocessor 156 compares the 3D positions of the markers for each of the L number of 3D scan data for searching corresponding markers, and calculates translation matrices from the relations of the corresponding markers. Then, the microprocessor 156 registers the obtained translation matrices in the register of the buffer 158, whereby the calibration at S121 is completed.
  • the reference object is removed and the object 10 is placed in a place where the reference object has been removed, and the microprocessor 156 controls the marker blinking controller 154 to turn off the N number of marker generators 142, thereby preventing the optical markers from being projected on the surface of the object 10 (S122).
  • the microprocessor 156 receives the L number of 2D image data via the image input part 150 when the scan domains of the object 10 are photographed by the L number of image obtaining parts 148 to obtain the L number of 2D images not including the optical markers (S123).
  • the microprocessor 156 controls the project controller 152 to activate the M number of pattern projectors 146, whereby predetermined patterns (for example, patterns having stripes with different gaps therebetween or multiple striped patterns) are projected on the surface of the object 10 from the M number of pattern projectors 146.
  • the microprocessor 156 receives the L number of 3D scan data via the image input part 150 (S124).
  • the microprocessor 156 reads out the translation matrices stored in the register of the buffer 158, sets one of the L number of 3D scan data as a reference, and moves the L−1 number of 3D scan data by the translation matrices (S125).
  • in subsequent scans, steps S121 to S123 can be omitted. Since the 3D scan data can be arranged by the translation matrices stored in the register of the buffer 158, the scanning time can be reduced. However, if needed, the calibration from S121 to S123 may be executed for every scan, and this can be easily changed, modified or altered according to the operator's intention or the system configuration (a sketch of this reuse follows).
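A minimal sketch of reusing the stored calibration result, in Python/NumPy; the buffer layout, the 4×4 matrix representation and the N×3 point-array scan format are assumptions for illustration:

```python
import numpy as np

class TransformBuffer:
    """Register of per-sensor translation matrices computed once at calibration."""

    def __init__(self):
        self.matrices = {}                      # sensor index -> 4x4 matrix

    def register(self, sensor, matrix):
        self.matrices[sensor] = np.asarray(matrix, dtype=float)

    def arrange(self, scans):
        """Move each scan (dict: sensor -> Nx3 points) into the coordinate
        system of the reference sensor (identity matrix by default)."""
        arranged = {}
        for sensor, pts in scans.items():
            pts = np.asarray(pts, dtype=float)
            T = self.matrices.get(sensor, np.eye(4))
            hom = np.hstack([pts, np.ones((len(pts), 1))])   # homogeneous coords
            arranged[sensor] = (hom @ T.T)[:, :3]
        return arranged
```

Once the matrices are registered, every subsequent set of L scans can be arranged with a single call to `arrange`, which is where the saving in scanning time comes from.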
  • the eleventh embodiment provides marker generators and peripherals different from those used in the first to tenth embodiments of the present invention.
  • a marker generator of the eleventh embodiment, as shown in Fig. 31, includes a plurality of light sources of the X-axis 160, a blinking controller 162, a polygon mirror 164 rotating about the X-axis, a rotary driving part 166, a rotary mechanism 168, a plurality of light sources of the Y-axis 170, a blinking controller 172, a polygon mirror 174, a rotary driving part 176 and a rotary mechanism 178.
  • the plurality of light sources of the X-axis 160 generate beams having excellent straight traveling properties such as laser beams to be emitted to the reflecting surface of the polygon mirror 164.
  • the light sources of the X-axis may be, for example, laser pointers.
  • the blinking controller 162 blinks each light source 160 according to the control of a microprocessor (not shown).
  • the polygon mirror 164 equipped with a plurality of reflecting surfaces is rotated by the rotary mechanism 168 in order to reflect the plural beams, thereby projecting the plural beams on the surface of an object (OB).
  • the rotary driving part 166 drives the rotary mechanism 168 to rotate the polygon mirror 164 in one direction in response to the control of a microprocessor.
  • the plurality of light sources of Y-axis 170 generate beams of excellent straight traveling properties such as laser beams to be emitted to the reflecting surface of the polygon mirror 174.
  • the light sources may be, for example, laser pointers.
  • the blinking controller 172 blinks each light source according to the control of a microprocessor (not shown).
  • the polygon mirror 174 equipped with a plurality of reflecting surfaces is rotated by the rotary mechanism 178 in order to reflect the plurality of beams, thereby projecting the plural beams on the surface of the object (OB).
  • the rotary driving part 176 drives the rotary mechanism 178 to rotate the polygon mirror 174 in one direction in response to the control of a microprocessor.
  • a driving power generated by the rotary driving part 166 and the rotary driving part 176 is applied to the rotary mechanism 168 and the rotary mechanism 178 according to a control signal of a microprocessor.
  • the rotary mechanism 168 and the rotary mechanism 178 respectively driven by the driving power rotate the polygon mirrors 164 and 174.
  • when the light sources 160 and 170 are lighted by the blinking controllers 162 and 172 in response to the control signal of the microprocessor, the beams created by the plurality of light sources 160 and 170 are emitted to the reflecting surfaces of the polygon mirrors 164 and 174. Then, the beams are projected on the surface of the object (OB).
  • the polygon mirrors 164 and 174 are rotated so that the angles of the reflecting surfaces vary. Therefore, plural lines of beams are formed on the surface of the object (OB) in the directions of the X-axis and the Y-axis, and the intersecting points where the X-axis lines and the Y-axis lines intersect become the optical markers (RM), respectively.
  • OB: object; RM: optical markers
  • when m beams are projected along the X-axis and n beams along the Y-axis, m×n intersecting points can be formed on the surface of the object (OB), and these m×n intersecting points become the respective optical markers (RM). Therefore, a small number of light sources can generate a relatively large number of optical markers; for example, 8 X-axis beams and 6 Y-axis beams, i.e., 14 light sources in total, yield 48 markers.
  • as apparent from the foregoing, the present invention provides an apparatus and method for automatically arranging 3D scan data obtained from different angles and positions by using optical markers.
  • optical markers having no physical volume are used to find the relative positions of mutually different scan data, such that scan data are neither lost nor damaged even in the portions where the markers exist.
  • there is no need for placing markers on or removing markers from an object in order to scan it, thereby providing convenience and safety in scanning and preventing damage to the object that would result from attaching and removing markers.
  • furthermore, since the optical markers are merely projected light and are not consumed, they can be used repeatedly without limit.

Abstract

An apparatus and method for automatically arranging 3D scan data using optical markers are disclosed where non-contact type markers are adopted to automatically arrange 3D data while scanned parts of an object are neither lost nor damaged. The apparatus using optical markers for automatically arranging 3D scan data which is obtained by photographing an object at various different angles comprises: marker generating means for projecting a plurality of optical markers on a surface of the object; pattern projecting means for projecting patterns on the surface of the object in order to obtain 3D scan data of the object; image obtaining means for obtaining 2D image data of the object including the markers projected on the surface of the object and for obtaining 3D scan data of the object through the patterns projected on the surface of the object; and control means for calculating 3D positions of the markers from the relation between the 2D image data and the 3D scan data and calculating relative positions of the 3D scan data based on the 3D positions of the markers.

Description

APPARATUS AND METHOD FOR AUTOMATICALLY ARRANGING THREE-DIMENSIONAL SCAN DATA USING OPTICAL MARKER
FIELD OF THE INVENTION
The present invention relates to an apparatus and method for automatically arranging three-dimensional (3D) scan data by using optical markers and, more particularly, to an apparatus and method for automatically arranging relative positions of a plurality of 3D scan data scanned in various positions and angles in reference to one coordinate system.
BACKGROUND OF THE INVENTION
In general, an optical 3D scanner can extract 3D data by scanning the surface of an object that is only in the scanner's field of view. In order to scan other regions of the object, which are obstructed from the view of the 3D scanner, the object to be scanned should be rotated or moved, or the scanner itself should be moved and positioned to a place where a part or parts of the object can be seen. A complete 3D scan then may be captured by surveying the object in various orientations and angles, and the 3D data thus obtained is arranged and integrated into a single uniform coordinate system.
In this case, the reason the 3D data thus obtained should be integrated into a single uniform coordinate system is that if the scanner is moved to different positions and angles in order to capture the object, each 3D scan data is defined by different coordinate system according to the position of a scanner.
In order to match these various coordinate systems, the distance in which the scanner has been moved must be known. There are two methods of calculating this distance. One is to obtain an absolute distance by using a numerically-controlled device for moving the scanner, and the other is to calculate the distance only by referring to the scanned data.
In the latter case, the scanning must be carried out so that the plurality of scanned data overlap each other and corresponding points are inputted at the overlapping positions of the scanned data. The scanned data is then arranged in reference to the corresponding points such that the respective coordinate systems defining the plurality of scanned data are integrated into one uniform coordinate system.
In this case, error is likely to occur when corresponding points are inputted manually depending on how accurately an operator inputs them. Particularly, if there are no abnormal features on the surface of an object, more error is likely to occur. Furthermore, a large or complicated object requires numerous scanning operations by changing the position and angle of the scanner, thus prolonging the input of corresponding points and subjecting the scanning operation to more error. There is another drawback in that corresponding points may be erroneously inputted or omitted by an operator's careless error, thus resulting in frequent occurrences of inaccurate arrangements.
In order to overcome these drawbacks, a method that employs small detectable objects called markers or targets attached onto a surface of an object for an operator to identify them and input corresponding points accurately has recently been introduced. Furthermore, a technique has been developed by which corresponding points are automatically recognized by image processing algorithm.
As shown in Fig. 1, conventional markers 4 are attached arbitrarily onto the surface of an object 2, and the surface of the object 2 is scanned part by part in an overlapping manner.
When a plurality of scanned data is obtained by the above scanning method, an operator manually marks corresponding markers, as shown in Fig. 2.
After first and second scanned data I1 and I2 are obtained from scanning the surface of the object 2, as shown in Fig. 2, the operator searches for the markers M1 and M2 positioned commonly in the two scanned data I1 and I2 and arranges the two scanned data I1 and I2 by matching the markers as corresponding points. Meanwhile, in a technique where corresponding points are automatically recognized, markers having patterns different from each other for identification are searched via image processing, and if markers having identical patterns are positioned in two different scanned data, the two scanned data are arranged automatically in reference to the markers.
However, there is a drawback in the conventional technique of recognizing corresponding points of scanned data by using the markers described above. That is, the data of parts of the surface of an object covered by markers may be lost due to the physical dimension of the markers attached on the surface of the object.
Although the lost information can be recovered by interpolation or by rescanning after removing the markers, there are still drawbacks in that these methods are inaccurate and call for lots of working hours.
SUMMARY OF THE INVENTION
The present invention provides an apparatus and method for automatically arranging 3D scan data by using optical markers which are of non-contact type so that scanned parts of an object are preserved and not deformed.
In accordance with one object of the present invention there is provided an apparatus using optical markers for automatically arranging 3D scan data which is obtained by scanning an object at various angles, comprising: marker generating means for projecting a plurality of optical markers on the surface of an object; pattern projecting means for projecting patterns on the surface of the object in order to obtain 3D scan data of the object; image obtaining means for obtaining 2D image data of the object including the optical markers projected on the surface of the object and for obtaining 3D scan data of the object through the patterns projected on the surface of the object; and control means for extracting 3D positions of the optical markers from the relation between the 2D image data and the 3D scan data and calculating relative positions of the 3D scan data in reference to the 3D positions of the optical markers.
In accordance with another object of the present invention, there is provided a method for automatically arranging 3D scan data using optical markers, the method comprising the steps of: moving the image obtaining means to a position appropriate for obtaining an image of parts of the object; projecting the optical markers on the surface of the object by marker generating means and obtaining 2D image data of parts of the object including the optical markers projected on the surface of the object by image obtaining means; projecting patterns on the surface of the object by pattern projecting means and obtaining 3D scan data of parts of the object on which the patterns are projected by the image obtaining means; and extracting 3D positions of the optical markers from the relationship between the 2D image data and the 3D scan data and arranging 3D scan data obtained from different parts of the object in reference to the 3D positions of the optical markers.
BRIEF DESCRIPTION OF THE DRAWINGS
For a better understanding of the nature and objects of the present invention, reference should be made to the following detailed description taken in conjunction with the accompanying drawings in which:
Fig. 1 is a drawing for illustrating an example of a 3D scan of an object by attaching conventional sticker type markers on the object;
Fig. 2 is a drawing for illustrating an example of an arrangement of different scan data by using sticker type markers as reference markers;
Fig. 3 is a schematic drawing for illustrating the construction of an apparatus for automatically arranging 3D scan data using optical markers according to a first embodiment of the present invention;
Figs. 4a to 4c are schematic drawings for illustrating examples of a state for obtaining a 2D image data using optical markers according to the first embodiment of the present invention and a state for obtaining a 3D scan data by using patterns;
Fig. 5 is a schematic drawing for illustrating an example of a state for deriving 2D positions of markers from the 2D image data obtained by activating and deactivating the optical markers according to the first embodiment of the present invention;
Fig. 6 is a schematic drawing for illustrating an example of a state for deriving a 3D position of a marker from a 2D position of a marker and a center position of a camera lens;
Figs. 7a to 7b are schematic drawings for exemplarily illustrating an operation of searching corresponding markers by way of a triangular comparison at mutually different image data according to the first embodiment of the present invention;
Figs. 8a to 8d are schematic drawings for exemplarily illustrating a conversion operation of matching two triangular structures at mutually different positions in triangular comparison within two different image data according to the first embodiment of the present invention;
Fig. 9 is a schematic drawing for exemplarily illustrating an operation of searching corresponding markers by way of obtaining an imaginary marker at mutually different image data according to the first embodiment of the present invention;
Figs. 10a to 10b are flow charts for illustrating an operation of a method for automatically arranging 3D scan data using optical markers according to the first embodiment of the present invention;
Fig. 11 is a schematic drawing for illustrating a construction of an apparatus for automatically arranging 3D scan data using optical markers according to a second embodiment of the present invention;
Fig. 12 is a flow chart for illustrating an operation of a method for automatically arranging 3D scan data using optical markers according to the second embodiment of the present invention;
Fig. 13 is a flow chart for illustrating an operation of a method for automatically arranging 3D scan data using optical markers according to a third embodiment of the present invention;
Fig. 14 is a schematic drawing for illustrating a construction of an apparatus for automatically arranging 3D scan data using optical markers according to a fourth embodiment of the present invention;
Fig. 15 is a schematic drawing for illustrating a construction of an apparatus for automatically arranging 3D scan data using optical markers according to a fifth embodiment of the present invention;
Fig. 16 is a schematic drawing for illustrating a construction of an apparatus for automatically arranging 3D scan data using optical markers according to a sixth embodiment of the present invention;
Figs. 17a and 17b are flow charts for illustrating an operation of a method for automatically arranging 3D scan data using optical markers according to the sixth embodiment of the present invention;
Fig. 18 is a schematic drawing for illustrating an error occurring in the process of arranging a scanned data by way of a reference coordinate system;
Fig. 19 is a schematic drawing for illustrating an error occurring in the process of arranging a scanned data by way of an absolute coordinate system;
Fig. 20 is a schematic drawing for illustrating a construction of an apparatus for automatically arranging 3D scan data using optical markers according to a seventh embodiment of the present invention;
Figs. 21a and 21b are flow charts for illustrating an operation of a method for automatically arranging 3D scan data using optical markers according to a seventh embodiment of the present invention;
Fig. 22 is a drawing of an example of an image obtained by using a large domain image obtaining part depicted in Fig. 20;
Fig. 23 is a drawing of an example of an image obtained by using the large domain image obtaining part depicted in Fig. 20 and an image obtaining part;
Fig. 24 is a drawing for illustrating a construction of an apparatus for automatically arranging 3D scan data using optical markers according to an eighth embodiment of the present invention;
Figs. 25a and 25b are flow charts for illustrating an operation of a method for automatically arranging 3D scan data using optical markers according to the eighth embodiment of the present invention;
Fig. 26a is a drawing of an example of an image obtained by using a pair of large domain image obtaining parts illustrated in Fig. 24;
Fig. 26b is a drawing of an example of an image obtained by using the pair of large domain image obtaining parts illustrated in Fig. 24 and an image obtaining part;
Fig. 27 is a schematic drawing for illustrating a principle of the eighth embodiment of the present invention;
Fig. 28 is a drawing for illustrating a construction of an apparatus for automatically arranging 3D scan data using optical markers according to a ninth embodiment of the present invention;
Fig. 29 is a flow chart for illustrating an operation of a method for automatically arranging 3D scan data using optical markers according to the ninth embodiment of the present invention;
Fig. 30 is a flow chart for illustrating an operation of a method for automatically arranging 3D scan data using optical markers according to a tenth embodiment of the present invention; and
Fig. 31 is a schematic drawing for illustrating a construction of a marker generator according to an eleventh embodiment of the present invention.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
The first embodiment of the present invention will now be described in detail with reference to the accompanying drawings.
Fig. 3 is a drawing for illustrating a structure of an apparatus for automatically arranging 3D scan data using optical markers according to the first embodiment of the present invention, wherein the apparatus includes a marker generator 12, a pattern projector 16, an image obtaining part 18, a movement driving part 20, a movement mechanism 22, an image input part 24, a marker blinking controller 26, a pattern projector controller 28, a microprocessor 30 and a buffer 32.
The marker generator 12 which is designed to project markers recognizable by the image obtaining part 18 on the surface of an object 10 includes a plurality of marker output parts 14 for projecting plural optical markers simultaneously on the surface of the object 10 in irregular directions.
Preferably, the plurality of marker output parts 14 adopt laser pointers capable of projecting a plurality of red spots on the surface of the object 10, such that the positions of the spots projected on the surface of the object 10 may be easily identified in the images obtained by the image obtaining part 18, such as a camera or the like.
The marker generator 12 is by no means limited to a laser pointer, and any optical marker may be adopted as long as it can focalize properly on the surface of the object and can be easily controlled to repeat blinking.
A plurality of marker generators 12 may be disposed along a circumference of the object in order to project the optical markers on the entire surface of the object 10, and the number of the optical markers may be changed in accordance with the size and shape of the object 10. Furthermore, the marker generators 12 should be fixed relative to the object during the scanning operation so that the positions of the markers on the surface of the object do not vary.
The pattern projector 16 shown in the drawing projects predetermined patterns so that 3D scan data of the object 10 can be obtained. Namely, space-encoded beams are projected on the surface of the object 10 by using projectors such as LCD projectors and the like, or a laser beam is projected on the surface of the object 10 so that 3D scan data of the object 10 can be obtained via the image obtaining part 18.
It is also preferable that the pattern projector 16 adopts a slide projector comprising a light source, a pattern film and a lens for projecting a predetermined pattern, or an electronic LCD projector, or a laser diode for projecting laser striped patterns. A pattern film equipped with striped patterns is fed between a light source and a lens by a predetermined feeding means, which allow a series of striped patterns to be projected on the object 10.
The pattern film may have striped patterns of varying gaps, as disclosed in Korean Patent Application No. 2002-10839, filed on February 28, 2002 by the present applicant and entitled "3D Scan Apparatus and Method Using Multiple Striped Patterns." The same applies to a scanning device using laser striped patterns.
Furthermore, it is preferable that markers should not be projected on the object 10 while obtaining 3D scan data because scanned data might be deformed by the markers on the object 10.
The image obtaining part 18 comprises image sensors capable of receiving images such as a Charge Coupled Device (CCD) camera or a CMOS camera. The image obtaining part 18 obtains images by photographing the object when markers are optically projected on the surface of the object 10.
The image obtaining part 18 may be configured separately from the pattern projector 16 but it is preferable that the image obtaining part 18 be integrally built in with the pattern projector 16 because the image obtaining part 18 integrated with the projector 16 is simple in structure and it is easy to match the 2D image data with the 3D scan data without calibration.
The image obtaining part 18 obtains 2D image data and 3D scan data while synchronizing the blinking period of the optical markers with the blinking period of the pattern, details of which are illustrated in Figs. 4a, 4b and 4c.
As shown in Fig. 4a, the image obtaining part 18 obtains a first image data 40 by photographing a certain part of the object 10 on which a plurality of optical markers (RM) are arbitrarily projected. Next, as shown in Fig. 4b, the image obtaining part 18 obtains a second image data 42 by photographing the same part of the object 10 as in Fig. 4a while the marker generator 12 is turned off in order to prevent the surface of the object 10 from being projected with the laser markers.
Next, as illustrated in Fig. 4c, the image obtaining part 18 obtains 3D scan data by photographing the object 10 on which striped patterns are projected from the pattern projector 16 while the marker generator 12 is turned off. Specifically, the 3D scan data is obtained in the form of first to fifth scanned data 44a - 44e that correspond to the same part of the object 10 with different striped patterns PT1 - PT5 on the surface thereof, respectively. Although in the present embodiment the pattern film has 5 different striped patterns, it is not limited thereto and it may have more patterns.
The movement driving part 20 moves the pattern projector 16 and the image obtaining part 18 relatively to the object 10 according to the driving control of the microprocessor 30 so that images of the whole object 10 can be obtained.
The moving mechanism 22 receives a signal from the movement driving part 20 to move the pattern projector 16 and the image obtaining part 18 in a prescribed direction relative to the object 10. Although the movement driving part 20 is adopted in the present embodiment for electrically moving the pattern projector 16 and the image obtaining part 18, it should be apparent that the moving mechanism 22 may be manually manipulated.
The image input part 24 shown in the drawing receives the image data obtained from the image obtaining part 18, and the marker blinking controller 26 serves to blink the optical markers of the marker generator 12 according to the control of the microprocessor.
The project controller 28 controls the feeding speed and direction of the pattern film of the pattern projector 16, and also controls the blinking of the light source for the projection.
The microprocessor 30 receives and analyzes the 2D image data and the 3D scan data photographed at various angles through the image input part 24 and arranges the 3D scan data automatically on a single uniform coordinate system.
As illustrated in Fig. 5, for seeking the 2D positions of the laser markers, the microprocessor 30 carries out an image process on the first image data 40 with the laser markers (RM) and the second image data 42 without the laser markers (RM), and as a result the third image data 46 comprising only the laser markers (RM) is obtained.
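A minimal NumPy sketch of this differencing step follows; the brightness threshold and the simple flood-fill blob detector are assumptions for illustration, not the patent's image-processing algorithm:

```python
import numpy as np

def marker_positions_2d(img_with, img_without, threshold=40):
    """Subtract the markers-off image from the markers-on image and
    return the centroid (x, y) of each remaining bright spot."""
    diff = img_with.astype(int) - img_without.astype(int)
    mask = diff > threshold                    # only the projected spots remain
    visited = np.zeros(mask.shape, dtype=bool)
    positions = []
    for y, x in zip(*np.nonzero(mask)):
        if visited[y, x]:
            continue
        stack, pixels = [(y, x)], []
        while stack:                           # 4-connected flood fill of one spot
            cy, cx = stack.pop()
            if (0 <= cy < mask.shape[0] and 0 <= cx < mask.shape[1]
                    and mask[cy, cx] and not visited[cy, cx]):
                visited[cy, cx] = True
                pixels.append((cy, cx))
                stack += [(cy + 1, cx), (cy - 1, cx),
                          (cy, cx + 1), (cy, cx - 1)]
        ys, xs = zip(*pixels)
        positions.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return positions
```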
As illustrated in Fig. 6, the microprocessor 30 calculates 3D positions of the markers from the relation between a camera lens center 50 of the image obtaining part 18 and a 2D image data 52 extracted with the marker position. The 3D positions of the markers (a', b', c') corresponding to relevant markers can be obtained by estimating intersection points where straight lines connecting the camera lens center 50 of the image obtaining part 18 and the positions (a, b, c) of arbitrary markers in the 2D image data intersect the 3D scan data 54.
It is possible to quickly obtain 3D positions of markers in case the pattern projector 16 and the image obtaining part 18 are integrally configured. However, if the pattern projector 16 and the image obtaining part 18 are configured separately, calibration on the coordinates of the pattern projector 16 and image obtaining part 18 should be executed for obtaining 3D positions of markers. Through the aforementioned steps, the 3D positions of the 3D scan data photographed at various angles can be obtained.
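The intersection step can be sketched as ray casting against a triangulated scan. The Möller–Trumbore ray/triangle test below is an assumed stand-in for whatever intersection routine is actually used; `mesh_triangles` is a list of vertex triples and `pixel_point` is the marker position expressed in the camera coordinate system:

```python
import numpy as np

def ray_triangle(origin, direction, v0, v1, v2, eps=1e-9):
    """Return the distance t along the ray to the triangle, or None."""
    e1, e2 = v1 - v0, v2 - v0
    p = np.cross(direction, e2)
    det = e1.dot(p)
    if abs(det) < eps:                         # ray parallel to triangle plane
        return None
    inv = 1.0 / det
    s = origin - v0
    u = s.dot(p) * inv
    if not 0.0 <= u <= 1.0:
        return None
    q = np.cross(s, e1)
    v = direction.dot(q) * inv
    if v < 0.0 or u + v > 1.0:
        return None
    t = e2.dot(q) * inv
    return t if t > eps else None

def marker_position_3d(lens_center, pixel_point, mesh_triangles):
    """First intersection of the lens-center ray with the 3D scan data."""
    direction = pixel_point - lens_center
    direction = direction / np.linalg.norm(direction)
    hits = [t for tri in mesh_triangles
            if (t := ray_triangle(lens_center, direction, *tri)) is not None]
    return lens_center + min(hits) * direction if hits else None
```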
Meanwhile, a scan data should preferably include four to five or more markers, and two neighboring scan data should include three or more common markers. This is because three or more points are necessary for defining a unique position in 3D space and are also a requirement for obtaining corresponding markers (described later).
Furthermore, it may be possible in the present invention that each marker is used with different patterns to be distinguished from each other. However, the configuration of the equipment and manufacture thereof may be complicated because scores or hundreds of marker output parts should project different shapes of markers.
In the present invention, in automatically arranging 3D scan data of neighboring regions obtained by photographing the objects in an overlapping manner, the markers are distinguished from each other by using information on the relative positions of the markers calculated based on respective 3D scan data at the microprocessor 30. For example, three points formed by markers can construct a triangle, and triangles constructed by three different points are different from each other, such that each triangle can be distinguished by comparing the angles and lengths thereof. Therefore, one marker corresponding to a vertex of a triangle can be distinguished from another marker. A detailed explanation of this process will be described below.
As illustrated in Figs. 7a and 7b, in case one scan data 60 includes M marker points and another scan data 62 neighboring the one scan data 60 includes N marker points, the scan data 60 contains MC3 (M choose 3) different triangles and the other scan data contains NC3 different triangles. Then, by comparing the triangles between the two scan data a total of MC3 × NC3 times, corresponding pairs of triangles can be obtained.
First, as illustrated in Fig. 7a, the microprocessor 30 constructs plural triangles TI and T2 according to points obtained by markers included in one scan data 60 and plural triangles T3 and T4 included in another scan data 62.
Next, as shown in Fig. 7b, the microprocessor 30 seeks a pair of mutually corresponding triangles, e.g., TI and T3, which are contained in the two scan data 60 and 62, respectively.
Various methods can be used to compare the triangles. One of them is to seek a pair of corresponding triangles by comparing the lengths of each side. In other words, the three sides of each triangle, (a1, a2, a3) and (b1, b2, b3), are compared, and if the length of each side is identical to its counterpart and if the order of each side is all the same, it can be determined that the two triangles are corresponding.
In seeking triangles with three identical sides, the length of each side is arranged in a descending order, for example, and if at least more than two triangles having identical sides are detected, the order of each side is checked. That is, two triangles are judged corresponding if each of the sides compared in a counterclockwise or clockwise order from the longest side are identical.
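A minimal sketch of this triangle comparison in Python; the tolerance on the side lengths is an assumption, and for brevity the sketch matches triangles on sorted side lengths only, leaving out the clockwise/counterclockwise ordering check described above:

```python
import itertools
import numpy as np

def triangles(markers):
    """Yield (index triple, sorted side lengths) for all C(M, 3) triangles."""
    for i, j, k in itertools.combinations(range(len(markers)), 3):
        a, b, c = markers[i], markers[j], markers[k]
        sides = sorted([np.linalg.norm(a - b),
                        np.linalg.norm(b - c),
                        np.linalg.norm(c - a)])
        yield (i, j, k), np.array(sides)

def corresponding_triangles(markers_1, markers_2, tol=0.5):
    """Pairs of marker index triples whose triangles have equal sides."""
    matches = []
    for idx1, s1 in triangles(markers_1):
        for idx2, s2 in triangles(markers_2):   # MC3 x NC3 comparisons
            if np.all(np.abs(s1 - s2) < tol):
                matches.append((idx1, idx2))
    return matches
```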
After corresponding triangles or markers are distinguished in the respective scan data as described in the above-noted explanation, the scan data are moved so that these markers can be positioned on the same point in a single uniform coordinate system. Namely, one of the triangles serves as a reference and the corresponding triangle moves to the reference, and as a result, the two different coordinate systems are matched.
The matching process for the two triangles located in two different scan data is illustrated in Figs. 8a - 8d. As shown in Fig. 8a, two corresponding triangles are given which are identical in size and shape but located in different scan data, and information on their vertexes and sides is known once the corresponding triangles are determined.
As illustrated in Fig. 8b, one of the selected vertexes in the two corresponding triangles is matched by using a translation matrix (T), where a reference coordinate system relating to one triangle is set up as A and the other coordinate system is set up as B. The translation matrix (T) thereto is defined in the following Formula 1:
FORMULA 1

T = T(A1 − B1)
Next, as depicted in Fig. 8c, a rotational translation is carried out in order to match one of the corresponding sides by using a rotational matrix (R1), where the rotational matrix (R1) is defined by Formula 2:

FORMULA 2

R1 = R(θ1)
Fig. 8d depicts a rotational translation carried out in order to match the remainder of one of the corresponding vertexes by using a rotational matrix (R2), where the rotational matrix (R2) is defined by Formula 3:
FORMULA 3

R2 = R(θ2)
As a result, the total matching process is carried out by a translation matrix M, which is defined by Formula 4:
FORMULA 4

M = T · R1 · R2
Therefore, a point (P) included in one scan data moves to a new position in another scan data by the following Formula 5:

FORMULA 5

P' = M × P
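In homogeneous coordinates, Formulas 1 to 5 can be sketched as follows; the axis-angle rotation helper (Rodrigues' formula) is an assumed parameterization of R(θ1) and R(θ2), introduced only for illustration:

```python
import numpy as np

def translation(v):
    """4x4 homogeneous translation by vector v (cf. Formula 1)."""
    T = np.eye(4)
    T[:3, 3] = v
    return T

def rotation_about(axis, theta, center):
    """4x4 rotation by angle theta about a unit axis through 'center'
    (a Rodrigues-formula stand-in for R1 and R2 of Formulas 2 and 3)."""
    axis = np.asarray(axis, float)
    axis = axis / np.linalg.norm(axis)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    R3 = np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)
    R = np.eye(4)
    R[:3, :3] = R3
    R[:3, 3] = center - R3 @ center            # keep the point 'center' fixed
    return R

def move_point(M, p):
    """Formula 5: P' = M x P in homogeneous coordinates."""
    return (M @ np.append(p, 1.0))[:3]

# Per Formula 4, the whole match is one composed matrix,
# e.g. M = translation(A1 - B1) @ R1 @ R2, applied to every point of the scan.
```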
Meanwhile, calculation error is likely to occur because the physical size of markers cannot be defined as points mathematically and has actual dimensions. Therefore, after the microprocessor 30 matches various scan data as explained above, the microprocessor 30 processes further to obtain a more accurate match. Namely, the positions of the markers are further adjusted based on the meshed data, a process called "registering." Through these processes, each scan data can be integrated into a single uniform coordinate system more accurately. A detailed description follows.
In case a point cloud data A comprising plural points and another point cloud data B comprising plural points are to be integrated into a single uniform coordinate system, A is moved and rotated so as to be integrated into the coordinate system of B. Under this circumstance, if the n points of A are P={pi} and the corresponding points of B are X={xi}, the translation of moving and rotating is obtained by using a least square method for minimizing the distance between P and X, and this translation is applied to A. As a result, the average distance between the point cloud data A and B is minimized, and these processes are repeated until the average distance between P and X lies within an allowance.
The corresponding points out of a plurality of point cloud data can also be obtained by using the least square method, and an equation thereto is given in the following formula 6:
FORMULA 6

$$f(\vec{q}) = \frac{1}{N}\sum_{i=1}^{N} \left\| \vec{x}_i - R(\vec{q}_R)\,\vec{p}_i - \vec{q}_T \right\|^2$$

where $\vec{q}$ is the state vector of registering, defined as $\vec{q} = [\vec{q}_R \mid \vec{q}_T]^t$; $\vec{q}_R$ is a quaternion vector, $\vec{q}_R = [q_0\ q_1\ q_2\ q_3]^t$; and $\vec{q}_T$ is a translation vector, $\vec{q}_T = [q_4\ q_5\ q_6]^t$.
In the aforementioned Formula 6, f(q) is the average of the squared distances between the points xi and the transformed points R(qR)pi + qT, and the R(qR) and qT minimizing f(q) can be calculated by the least square method. In Formula 6, R(qR) is the 3×3 rotation matrix given by the following Formula 7:
FORMULA 7

$$R(\vec{q}_R) = \begin{bmatrix} q_0^2+q_1^2-q_2^2-q_3^2 & 2(q_1q_2-q_0q_3) & 2(q_1q_3+q_0q_2) \\ 2(q_1q_2+q_0q_3) & q_0^2+q_2^2-q_1^2-q_3^2 & 2(q_2q_3-q_0q_1) \\ 2(q_1q_3-q_0q_2) & 2(q_2q_3+q_0q_1) & q_0^2+q_3^2-q_1^2-q_2^2 \end{bmatrix}$$
Furthermore, in case a set of scan data is defined by P={pi} and a set of reference data is defined by X={xi}, the centers of mass of P and X are given by the following Formula 8:

FORMULA 8

$$\mu_p = \frac{1}{N_p}\sum_{i=1}^{N_p}\vec{p}_i, \qquad \mu_x = \frac{1}{N_x}\sum_{i=1}^{N_x}\vec{x}_i$$
Furthermore, the cross-covariance matrix for P and X is given by the following Formula 9:

FORMULA 9

$$\Sigma_{px} = \frac{1}{N}\sum_{i=1}^{N}(\vec{p}_i - \mu_p)(\vec{x}_i - \mu_x)^t$$
The cyclic components of the anti-symmetric matrix $A_{ij} = (\Sigma_{px} - \Sigma_{px}^t)_{ij}$ are used to form the column vector $\Delta = [A_{23}\ A_{31}\ A_{12}]^t$. This vector $\Delta$ is then used to form a symmetric 4×4 matrix $Q(\Sigma_{px})$, which can be given by the following Formula 10:

FORMULA 10

$$Q(\Sigma_{px}) = \begin{bmatrix} \operatorname{tr}(\Sigma_{px}) & \Delta^t \\ \Delta & \Sigma_{px} + \Sigma_{px}^t - \operatorname{tr}(\Sigma_{px})\, I_3 \end{bmatrix}$$
where I3 is the 3×3 identity matrix. In the above Formula 10, the unit eigenvector corresponding to the maximum eigenvalue of Q(Σpx) is the optimal rotation quaternion (q0, q1, q2, q3), and the rotation matrix is obtained by substituting this quaternion into Formula 7. Meanwhile, qT = (q4, q5, q6) can be obtained from the following Formula 11 by utilizing R(qR) given from the above Formula 7.
FORMULA 11

$$\vec{q}_T = \mu_x - R(\vec{q}_R)\,\mu_p$$
As a result, a final matrix is given by the following Formula 12:

FORMULA 12

$$M = \begin{bmatrix} q_0^2+q_1^2-q_2^2-q_3^2 & 2(q_1q_2-q_0q_3) & 2(q_1q_3+q_0q_2) & q_4 \\ 2(q_1q_2+q_0q_3) & q_0^2+q_2^2-q_1^2-q_3^2 & 2(q_2q_3-q_0q_1) & q_5 \\ 2(q_1q_3-q_0q_2) & 2(q_2q_3+q_0q_1) & q_0^2+q_3^2-q_1^2-q_2^2 & q_6 \\ 0 & 0 & 0 & 1 \end{bmatrix}$$
When corresponding markers are obtained for each of the 3D scan data, the microprocessor 30 computes a matrix for total translation based on said one scan data 60 as a reference coordinate, thereby arranging all of the 3D scan data automatically to the reference coordinate system.
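A compact NumPy sketch of Formulas 8 to 12 (the quaternion-based least-squares step) follows; it assumes the point sets P and X have already been put into one-to-one correspondence:

```python
import numpy as np

def register(P, X):
    """Return the 4x4 matrix of Formula 12 that maps P onto X."""
    P, X = np.asarray(P, float), np.asarray(X, float)
    mu_p, mu_x = P.mean(axis=0), X.mean(axis=0)            # Formula 8
    S = (P - mu_p).T @ (X - mu_x) / len(P)                 # Formula 9
    A = S - S.T                                            # anti-symmetric part
    delta = np.array([A[1, 2], A[2, 0], A[0, 1]])          # [A23 A31 A12]^t
    Q = np.empty((4, 4))                                   # Formula 10
    Q[0, 0] = np.trace(S)
    Q[0, 1:] = Q[1:, 0] = delta
    Q[1:, 1:] = S + S.T - np.trace(S) * np.eye(3)
    w, V = np.linalg.eigh(Q)
    q0, q1, q2, q3 = V[:, np.argmax(w)]                    # max-eigenvalue vector
    R = np.array([                                         # Formula 7
        [q0*q0+q1*q1-q2*q2-q3*q3, 2*(q1*q2-q0*q3), 2*(q1*q3+q0*q2)],
        [2*(q1*q2+q0*q3), q0*q0+q2*q2-q1*q1-q3*q3, 2*(q2*q3-q0*q1)],
        [2*(q1*q3-q0*q2), 2*(q2*q3+q0*q1), q0*q0+q3*q3-q1*q1-q2*q2]])
    q_T = mu_x - R @ mu_p                                  # Formula 11
    M = np.eye(4)
    M[:3, :3], M[:3, 3] = R, q_T                           # Formula 12
    return M
```

Iterating this step while re-pairing each point with its current nearest neighbour, until the average distance falls within the allowance, corresponds to the repetition described above.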
Meanwhile, apart from the methods illustrated in Figs. 8a to 8d, coordinate systems themselves may be mapped to the reference coordinate system by utilizing the least square method, which amounts to aligning the scan data after finding corresponding triangles.
Information on the corresponding markers positioned at the vertexes is obtained during the process of seeking corresponding triangles (these correspond to P and X in Formula 6, for example), and an optimal translation matrix can be given by the following Formula 13:

FORMULA 13

$$T = \begin{bmatrix} R & \vec{q}_T \\ 0 & 1 \end{bmatrix}$$
A formula applied to the point cloud data (P), which are to be arranged by the method of coordinate mapping, can be defined by the below-mentioned Formula 14:

FORMULA 14

$$P' = T\,P$$
Meanwhile, in the method of seeking a pair of corresponding triangles as explained above, at least three corresponding markers should be included in the region where the scan data overlap. Therefore, in case only two corresponding markers are included in the region, other methods should be adopted for seeking corresponding markers.
Since each scan data with markers has 3D information, it is possible to seek corresponding markers even if only two corresponding markers are included in the overlapped region. As shown in Fig. 9, there are only two corresponding markers on the overlapped region of the scan data 64 and 66, namely the marker pairs (RM1, RM2) and (RM3, RM4). Corresponding markers are sought by comparing the two surface normal vectors at the points where the markers are positioned and the distance between the two markers.
Meanwhile, if there are too many markers projected on each scan data or the markers are projected uniformly, there is an increasing possibility of obtaining inaccurate corresponding markers. In this case, additional markers and the 3D scan data are used for creating additional references. For example, in case there are three corresponding markers, a triangle is formed by the three markers, and then a line is drawn perpendicularly from the weight center of the triangle. Then, the intersecting point of the perpendicular line and the 3D scan data is obtained as a fourth reference point. Next, corresponding markers can be found by utilizing information on an average perpendicular vector of the object surface at the markers or around the markers. Furthermore, in case only two corresponding markers are available, a straight line is drawn by connecting the two markers, and a circle is drawn on a plane perpendicular to the straight line from the center of the straight line. Then, the intersecting points of the circle and the 3D scan data are obtained as fourth and fifth reference points.
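A minimal sketch of constructing such auxiliary reference points; the function names are illustrative, and the final intersection of the perpendicular line (or circle) with the 3D scan data is omitted:

```python
import numpy as np

def fourth_reference_ray(m1, m2, m3):
    """Line perpendicular to the marker triangle at its weight center;
    its intersection with the 3D scan data gives the fourth reference point."""
    centroid = (m1 + m2 + m3) / 3.0
    normal = np.cross(m2 - m1, m3 - m1)
    return centroid, normal / np.linalg.norm(normal)

def perpendicular_circle(m1, m2, radius, n=36):
    """Points on a circle perpendicular to the line joining two markers,
    centered at its midpoint; intersections with the 3D scan data give
    the additional reference points."""
    center = (m1 + m2) / 2.0
    axis = (m2 - m1) / np.linalg.norm(m2 - m1)
    helper = np.array([1.0, 0.0, 0.0])         # any vector not parallel to axis
    if abs(axis @ helper) > 0.9:
        helper = np.array([0.0, 1.0, 0.0])
    u = np.cross(axis, helper)
    u /= np.linalg.norm(u)
    v = np.cross(axis, u)
    t = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    return center + radius * (np.outer(np.cos(t), u) + np.outer(np.sin(t), v))
```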
According to the preferred embodiment of the present invention, it should be apparent that, besides the above-mentioned methods, any method that creates additional reference points around the markers may be used to carry out the automatic alignment of the 3D scan data.
As shown in Fig. 3, information on the markers newly obtained by the automatic arrangement process on the 3D scan data is registered in the buffer 32.
Successively, operations according to the first embodiment of the present invention mentioned above will be explained in detail with reference to Figs. 10a and 10b, where S denotes a step.
First, the microprocessor 30 controls the movement driving part 20 to activate the movement mechanism 22, which moves the image obtaining part 18 integrated with the pattern projector 16 to a position appropriate for scanning an object 10 (S10).
Then, the microprocessor 30 controls the marker blinking controller 26 to allow the plurality of marker output parts 14 disposed at the marker generator 12 to project the markers arbitrarily on the surface of the object 10 (Sll).
Next, the image obtaining part 18 photographs a prescribed domain of the object 10 to obtain a 2D image including the optical markers projected on the surface of the object 10, and then the microprocessor 30 receives the 2D image data via the image input part 24 (S12).
Successively, the microprocessor 30 controls the marker blinking controller 26 to turn off the marker generator 12 so that the markers may not be projected on the object 10 (S13). At this state, the image obtaining part 18 photographs the same domain as the above to obtain a 2D image without the markers, and then the microprocessor 30 receives the 2D image data via the image input part 24 (S14).
The microprocessor 30 controls the project controller 28 to activate the pattern projector 16 while the marker generator 12 is turned off. Then, prescribed patterns (for example, patterns having stripes with different gaps therebetween or a multiple striped pattern) are projected on the surface of the object 10 from the pattern projector 16. Next, the image obtaining part 18 photographs the object 10 with the striped patterns projected on it to get 3D scan data, and then the microprocessor 30 receives the 3D scan data via the image input part 24 (S15).
At this state, the microprocessor 30 computes 2D positions of the markers by image processing the 2D image data with the markers and the 2D image data without the markers (S16).
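A minimal sketch of step S16 (Python with NumPy and SciPy; the threshold is an illustrative assumption): the marker-free image is subtracted from the marker-lit image, the bright residue is labeled, and the centroid of each bright spot is taken as a marker's 2D position.

    import numpy as np
    from scipy.ndimage import label, center_of_mass

    def marker_positions_2d(img_on, img_off, threshold=40):
        # img_on / img_off: grayscale frames with and without the markers.
        diff = img_on.astype(np.int16) - img_off.astype(np.int16)
        mask = diff > threshold              # bright spots left by the markers
        labels, count = label(mask)          # one connected region per marker
        return [center_of_mass(mask, labels, i + 1) for i in range(count)]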
Successively, the microprocessor 30 computes the 3D positions of the markers by using the 2D positions of the markers and the 3D scan data. That is, the 3D positions of the markers can be obtained by estimating intersection points where straight lines connecting the camera lens center of the image obtaining part 18 and the positions of arbitrary markers in the 2D image data intersect the 3D scan data (S17).
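As a sketch of step S17, consider the common case in which the 3D scan data is an organized range image taken through the same lens as the 2D image; the ray through a marker pixel then meets the scanned surface at the sample stored under that pixel, and the intersection reduces to a sub-pixel lookup. The array layout and names are illustrative assumptions.

    import numpy as np

    def marker_positions_3d(markers_2d, range_image):
        # markers_2d: marker centroids as (row, col); range_image: (h, w, 3)
        # array of scanned 3D points (NaN where the scan has no data, which
        # then propagates into the result for the caller to reject).
        out = []
        for v, u in markers_2d:
            v0, u0 = int(np.floor(v)), int(np.floor(u))
            dv, du = v - v0, u - u0
            patch = range_image[v0:v0 + 2, u0:u0 + 2, :]   # 2x2 neighborhood
            w = np.array([[(1 - dv) * (1 - du), (1 - dv) * du],
                          [dv * (1 - du), dv * du]])       # bilinear weights
            out.append((w[..., None] * patch).sum(axis=(0, 1)))
        return np.array(out)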
Meanwhile, the microprocessor 30 discriminates whether the register of the buffer 32 is vacant or not (S18). If the register of the buffer 32 is not vacant, the 3D positions of the markers obtained at S17 (current 3D scan data) and the 3D positions of the markers stored in the register of the buffer 32 (in other words, 3D data overlapping partly with the current 3D scan data) are compared for searching corresponding markers (S19).
When corresponding markers are found by comparing the markers included in the current 3D scan data with the markers registered at the register of the buffer 32 according to the aforementioned searching process, the microprocessor 30 computes a translation matrix for matching the two 3D scan data (S20). The positions of the 3D scan data registered at the register of the buffer 32 are given as a reference coordinate system, to which the current scan data are translated (S21).
Next, the microprocessor 30 registers the markers newly obtained from the current scan data in the register of the buffer 32 (S22). Then, the microprocessor 30 discriminates whether the automatic arrangement of the 3D scan data has been completed or not (S23). If the arrangement is not completed, the process returns to S10, and steps S10 to S23 are repeated.
The second embodiment of the present invention will now be described in detail with reference to the accompanying drawings.
Fig. 11 illustrates a structure of an apparatus for automatically arranging 3D scan data using the optical markers according to the second embodiment of the present invention.
Throughout the drawings, the same reference numerals and symbols are designated to the parts equivalent to those in the first embodiment in functions or operations, and explanation on the parts will be omitted for simplicity.
An apparatus for arranging 3D scan data automatically according to the second embodiment of the present invention includes a marker generator 70, a pattern projector 16, an image obtaining part 18, a movement driving part 20, a moving mechanism 22, an image input part 24, an individual marker blinking controller 74, a projector controller 28, a microprocessor 76, and a buffer 32.
The marker generator 70 comprises a plurality of marker output parts 72 and projects markers recognizable by the image obtaining part 18 randomly on the surface of the object 10.
The marker generator 70 turns on the first to N-th marker output parts 72 one by one in sequence in response to the control of the individual marker blinking controller 74, which enables a different marker to be included in each image obtained from the image obtaining part 18.
The individual marker blinking controller 74 sequentially and individually blinks the plurality of marker output parts 72 mounted at the marker generator 70 in a predetermined order according to the control of the microprocessor 76. The microprocessor 76 carries out the arrangement of the coordinate systems corresponding to the plurality of 3D scan data photographed at various angles by analyzing the 2D image data and 3D scan data inputted through the image input part 24. That is, the microprocessor 76 sets up an image photographed by the image obtaining part 18 while all the markers are turned off as a reference image, and then compares the reference image with a plurality of images photographed while the markers are turned on one by one in sequence. Through this process, the 2D position of each marker can be obtained.
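Since each of the N images differs from the reference image by exactly one lit marker, the 2D position of the k-th marker is simply the brightest spot of the k-th difference image; a minimal sketch with illustrative names:

    import numpy as np

    def locate_markers_sequentially(per_marker_images, reference):
        # per_marker_images: N grayscale frames, one lit marker per frame,
        # in blink order; reference: frame with all markers turned off.
        positions = []
        for img in per_marker_images:
            diff = img.astype(np.int16) - reference.astype(np.int16)
            positions.append(np.unravel_index(np.argmax(diff), diff.shape))
        return positions                      # (row, col) of markers 1..N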
Successively, the microprocessor 76 carries out the same processes as performed in the first embodiment. In other words, the microprocessor analyzes the 2D image data and the 3D scan data to compute the 3D positions of the markers, searches corresponding markers to obtain a translation matrix, and translates the plurality of 3D scan data to the reference coordinate system.
The operation according to the second embodiment of the present invention thus described will now be explained in detail with reference to the flow chart shown in Fig. 12.
First, the microprocessor 76 controls the movement driving part 20 to activate the movement mechanism 22, which moves the image obtaining part 18 integrated with the pattern projector 16 to a position appropriate for scanning the object 10 (S30).
Under such circumstances, the microprocessor 76 obtains the image data photographed by the image obtaining part 18 as a reference image while all of the optical markers are turned off. Then, the microprocessor 76 controls the individual marker blinking controller 74 to turn on a firstly-designated marker output part 72 out of the plurality of marker output parts 72 equipped in the marker generator 70, which allows the first marker to be projected on the surface of the object 10 (S31). Then, a first image data is obtained by photographing with the image obtaining part 18 (S32).
Next, the microprocessor 76 controls the individual marker blinking controller 74 to turn on the secondly-designated marker output part according to a predetermined order and to allow the second optical marker to be projected on the object 10 (S33). Then, the second image data is obtained (S34). Next, the microprocessor 76 discriminates whether the marker contained in the image is the last (N-th) marker out of the predetermined plural markers (S35). If the marker is not the last marker, steps S33 and S34 are repeatedly carried out until N-th image data is obtained.
Meanwhile, if it is discriminated that the marker is the last one, the microprocessor 76 controls the projector controller 28 to activate the pattern projector 16 while the marker generator 70 is turned off to prevent the optical marker from being projected and to allow a prescribed pattern (for example, patterns having stripes with different gaps therebetween or multiple striped pattern) for 3D scanning to be projected on the object 10 by the pattern projector 16.
At this time, if the image obtaining part 18 photographs the object 10 with the pattern projected on it to obtain the 3D scan data, the microprocessor 76 receives the 3D scan data from the image input part 24 (S36).
The microprocessor 76 compares each of the first to N-th image data with the reference image and searches a bright spot formed by the optical markers in each comparison, which helps the 2D positions of each marker to be found easily (S37).
Successively, the microprocessor 76 computes the 3D positions of the markers by analyzing the 2D positions of the markers and the 3D scan data, and searches corresponding markers included in overlapped regions in reference to the 3D positions of the markers, and calculates translation matrices, and translates the plurality of 3D scan data to the reference coordinate system (S38), which is the same as described in the first embodiment.
Next, the microprocessor 76 discriminates whether the automatic arrangement for the 3D scan data has been completed (S39). If the arrangement has not been completed, the flow returns to S30 to move the pattern projector 16 and the image obtaining part 18 to appropriate positions by activating the movement mechanism 22 under the control of the movement driving part 20. Steps S30 to S38 are repeated.
Next, the third embodiment of the present invention will be described in detail with reference to the accompanying drawings.
The construction of the apparatus for automatically arranging the 3D scan data according to the third embodiment of the present invention is identical to the one shown in Fig. 11. However, the method differs between the second embodiment and the third embodiment. That is, in the second embodiment, N images, each of which includes one marker different from the others, have to be photographed respectively. However, in the third embodiment, log2(N+1) images are photographed, each of which includes a group of markers for binarization.
Under the control of the microprocessor 76, the individual marker blinking controller 74 divides the marker output parts 72 disposed at the marker generator 70 into several groups for binarization, and turns on the markers group by group.
For example, if the number of marker output parts 72 is 16, the individual marker blinking controller 74 divides the 16 marker output parts 72 into 4 groups in an overlapping way.
In other words, for example, a first group comprises the 9th to 16th markers, a second group comprises the 5th to 8th and 13th to 16th markers, a third group comprises the 3rd, 4th, 7th, 8th, 11th, 12th, 15th and 16th markers, and a fourth group comprises the even-numbered markers (2nd, 4th, 6th, 8th, 10th, 12th, 14th, 16th), all of which are represented in Table 1.
TABLE 1

Marker No.:  1  2  3  4  5  6  7  8  9 10 11 12 13 14 15 16
Group 1:     0  0  0  0  0  0  0  0  1  1  1  1  1  1  1  1
Group 2:     0  0  0  0  1  1  1  1  0  0  0  0  1  1  1  1
Group 3:     0  0  1  1  0  0  1  1  0  0  1  1  0  0  1  1
Group 4:     0  1  0  1  0  1  0  1  0  1  0  1  0  1  0  1
:'0" represents that the marker is turned off while "1" represents that the marker is turned on.
As defined in Table 1, the first marker maintains a turned-off state at all times while the 16th marker always maintains a turned-on state and all markers have intrinsic values respectively.
The microprocessor 76 controls the individual marker blinking controller 74 such that the N markers are projected group by group, compares the log2(N) image data obtained by the image obtaining part 18, and computes the 2D positions of the markers.
In case 16 markers are projected group by group to obtain the first to fourth image data as shown in Table 1, the 16 markers are differentiated by their binary codes, which represent their turned-on or turned-off states, that is, their identifications (IDs). Therefore, the 2D positions of the 16 markers can be obtained. For example, the 10th marker is recognized as the binary code "1001" and the 13th marker as the binary code "1100". Meanwhile, the first marker, always maintaining a turned-off state, is not used, so that a total of 15 markers can be utilized in practice.
As a result, even though 1,024 (2^10) markers are used, 10 image data are sufficient to distinguish the markers by the above binarization of the markers. Furthermore, the microprocessor 76 computes the 3D positions of the markers from the 2D positions of the markers and the 3D scan data, searches corresponding markers, calculates translation matrices, and moves the plurality of 3D scan data by the translation matrices. The above process is the same as that described in the first embodiment.
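A hedged sketch of this decoding (Python with NumPy/SciPy; the threshold and names are illustrative): each group image is thresholded against the all-off reference, every bright blob is labeled, and the blob's on/off pattern across the group images is read as the binary code of (marker number − 1), so that, as in Table 1, the 10th marker reads "1001".

    import numpy as np
    from scipy.ndimage import label, center_of_mass

    def decode_marker_ids(group_images, reference, threshold=40):
        # group_images: grayscale frames, one per group, first group as the
        # most significant bit; reference: frame with every marker off.
        bits = np.stack([(g.astype(np.int16) - reference.astype(np.int16)) > threshold
                         for g in group_images])
        lit = bits.any(axis=0)                # pixels lit in at least one group
        spots, count = label(lit)             # one blob per usable marker
        ids = {}
        for i in range(1, count + 1):
            mask = spots == i
            code = ''.join('1' if bits[b][mask].mean() > 0.5 else '0'
                           for b in range(len(group_images)))
            ids[int(code, 2) + 1] = center_of_mass(mask)   # Table 1 numbering
        return ids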
The operation of the third embodiment of the present invention thus described will now be explained in detail with reference to the flow chart illustrated in Fig. 13.
As shown in Table 1, an embodiment will be explained where 16 marker output parts disposed at the marker generator 70 are available to project a total of 16 markers, and four 2D image data are obtained from the image obtaining part 18.
First, the microprocessor 76 controls the movement driving part 20 to drive the movement mechanism 22, which moves the image obtaining part 18 integrated with the pattern projector 16 to a position appropriate for scanning an object 10 (S40).
Next, the microprocessor 76 controls the individual marker blinking controller 74 to turn on the marker output part 72 so that the markers (9th - 16th markers) belonging to the first group can be projected (S41). A first image data photographed by the image obtaining part 18 is obtained via the image input part 24 (S42).
Successively, the microprocessor 76 controls the individual marker blinking controller 74 to turn on the marker output parts 72 so that the N-th group of markers (for example, the 5th to 8th and 13th to 16th markers of the second group) is projected (S43), to thereby obtain the N-th image data photographed by the image obtaining part 18 (S44).
Then, the microprocessor 76 discriminates whether the marker group contained in the image data is the last one or not (S45), and if it is not the last one, the flow is returned to S43 to repeat the process.
Meanwhile, as a result of the discrimination at S45, if it is discriminated that the marker group is the last one, the projector controller 28 drives the pattern projector 16 to project patterns on the surface of the object 10 while the marker generator 70 is turned off to prevent the optical markers from being projected.
At this time, when a 3D scan data is obtained in the image obtaining part 18 by photographing the object 10 projected with patterns, the microprocessor 76 receives the 3D scan data through the image input part 24 (S46).
Successively, the microprocessor 76 compares the first to N-th images obtained from the image obtaining part 18, which yields the binary information of the markers over the first to fourth image data. Therefore, the ID of each marker, that is, the 2D position of each marker, is obtained (S47). Meanwhile, the microprocessor 76 computes the 3D positions of the markers by analyzing the 2D positions of the markers and the 3D scan data, searches the corresponding markers included in the overlapped region of two different 3D scan data in reference to the 3D positions of the markers, calculates the translation matrix, and translates one of the 3D scan data to the reference coordinate system by the translation matrix (S48), which is the same as described in the first embodiment.
The microprocessor 76 discriminates whether the automatic arrangement of the 3D scan data has been completed or not (S49). If the automatic arrangement of the 3D scan data has not been completed, the flow is returned to S40. Therefore, steps S40 to S48 are repeated.
Next, a fourth embodiment of the present invention will be described in detail with reference to the accompanying drawings. As shown in Fig. 14, an apparatus for automatically arranging 3D scan data according to the fourth embodiment of the present invention includes a pattern projector 16, an image obtaining part 18, a movement driving part 20, a movement mechanism 22, an image input part 24, a projector controller 28, a buffer 32, a marker generator 80, an individual marker blinking controller 84, and a microprocessor 86.
Throughout the drawings, the same reference numerals and symbols are designated to the parts equivalent to those in the first embodiment in functions or operations, and explanation on the parts will be omitted for simplicity.
The marker generator 80 projects markers recognizable by the image obtaining part 18 on the surface of the object 10. The marker generator 80 is disposed with a plurality of marker output parts 82 for projecting a plurality of optical markers at irregular angles on the entire surface of the object 10.
The marker generator 80 selectively blinks the plural marker output parts 82 according to the control of the individual marker blinking controller 84. The individual marker blinking controller 84 individually controls the plural marker output parts 82 according to the control of the microprocessor 86.
The microprocessor 86 analyzes the scanned data obtained from the object 10 for automatically arranging the 3D scan data in a single uniform coordinate system. The microprocessor 86 receives 2D image data and 3D scan data photographed by the image obtaining part 18 at various angles via the image input part 24 to analyze them for automatic arrangement on one coordinate system, the detailed operation procedures of which are the same as those of the microprocessor in the first embodiment.
However, there is a difference between the first embodiment and the fourth embodiment in that after the process of obtaining a 2D image data and 3D scan data for one region of the object 10 is carried out, the markers projected on the region blink at a predetermined cycle (for example, approximately 0.5 second), while the markers projected on the other region maintain an "on" state by the control of the individual marker blinking controller 84.
Conversely, it may be possible that markers for a region for which the obtaining process is already terminated maintain an "on" state, while the markers for the other region blink at a predetermined cycle.
In other words, the conditions of the markers are differently set up between one region where image data and scan data have been already obtained and the other region. Therefore, regions are easily differentiated by operators.
Next, a fifth embodiment of the present invention will be described in detail with reference to the accompanying drawings. As illustrated in Fig. 15, an apparatus for automatically arranging 3D scan data according to the fifth embodiment of the present invention includes a pattern projector 16, an image obtaining part 18, a movement driving part 20, a movement mechanism 22, an image input part 24, a projector controller 28, a buffer 32, a marker generator 90, a marker individual blinking/color controller 94 and a microprocessor 96.
Throughout the drawings, the same reference numerals and symbols are designated to the parts equivalent to those in the first embodiment in functions or operations, and explanation on the parts will be omitted for simplicity.
The marker generator 90 projects markers recognizable by the image obtaining part 18 on the surface of the object 10. The marker generator 90 is disposed with a plurality of marker output parts 92 for projecting a plurality of optical markers at arbitrary angles on the surface of the object 10.
The marker generator 90 is constructed such that at least two different colors can be selectively projected from each marker output part 92 according to the control of the marker individual blinking/color controller 94. For example, each marker output part 92 is equipped with two or more light sources of different colors such that these light sources can be selectively lighted.
The marker individual blinking/color controller 94 controls the blinking and individual coloring of the plurality of marker output parts 92 disposed at the marker generator 90 according to the control of the microprocessor 96.
The microprocessor 96 analyzes the 2D image data and 3D scan data photographed from various angles by the image obtaining part 18 for automatically arranging the 3D scan data on one coordinate system. The detailed operation procedures thereto are the same as those in the first embodiment of the present invention.
However, there are some differences between the fifth embodiment and the first embodiment. The microprocessor 96 according to the fifth embodiment of the present invention controls the marker individual blinking/color controller 94 so that markers projected on a region where image data and scan data have already been obtained have different colors from that of the markers projected on the other region.
By differentiating colors according to the regions, an operator can easily check with the naked eye whether 2D image data and 3D scan data have been obtained from each region, which adds convenience to the scanning operation.
Next, a sixth embodiment of the present invention will be described in detail with reference to the accompanying Fig. 16.
As shown in Fig. 16, an apparatus for automatically arranging 3D scan data includes marker generators 12, a pattern projector 16, an image obtaining part 18, an image input part 24, a marker blinking controller 26, a projector controller 28, a buffer 32, a rotating table 100, a rotating drive part 102, a rotating mechanism 104 and a microprocessor 106.
Throughout the drawings, the same reference numerals and symbols are designated to the parts equivalent to those in the first embodiment in functions or operations, and explanation on the parts will be omitted for simplicity.
The rotating table 100 rotates with an object 10 placed on the upper plate of the rotating table 100, and also rotates with a plurality of marker generators 12 disposed at a circumference of the upper plate.
The rotating drive part 102 drives the rotating mechanism 104 to rotate the rotating table 100 according to the control of the microprocessor 106 such that the object can be set at an angle appropriate for scanning.
Here, although the rotating drive part 102 is utilized to electrically rotate the rotating table 100 in the sixth embodiment of the present invention, it should be apparent that the rotating mechanism 104 may be manually rotated to allow an operator to control the rotating table 100 arbitrarily.
Furthermore, as long as the marker generators and the object can be rotated together in a fixed relation, not only the rotating table 100 but also other devices may be applied herein.
The microprocessor 106 receives 2D image data and 3D scan data photographed by the image obtaining part 18 at various angles and analyzes the data for arranging the 3D scan data automatically on one coordinate system, the detailed operation procedures of which are the same as those of the microprocessor in the first embodiment.
However, there is a difference in the sixth embodiment of the present invention in that the object 10 and the marker generator 12 are rotated during the scanning process instead of the image obtaining part 18 and the pattern projector 16.
The operational procedures for the apparatus for automatically arranging 3D scan data according to the sixth embodiment of the present invention thus described will now be explained in detail with reference to Figs. 17a and 17b.
First, the microprocessor 106 controls the rotating drive part 102 to drive the rotating mechanism 104, thus rotating the rotating table 100 at a predetermined angle so that the object 10 can be rotated to a position appropriate for scanning (S50).
Under such condition, the microprocessor 106 controls the marker blinking controller 26 to turn on the plurality of marker output parts 14 equipped at the marker generators 12, thus allowing the plural markers to be projected on the surface of the object 10 (S51).
When the image obtaining part 18 photographs the object 10 to obtain a 2D image including the optical markers while the optical markers are being projected, the microprocessor 106 receives the 2D image data obtained by the image obtaining part 18 via the image input part 24 (S52).
Successively, the microprocessor 106 controls the marker blinking controller 26 to turn off the marker generators 12, thereby preventing the optical markers from being projected on the object 10 (S53). Next, the same region of the object 10 without the markers is photographed and the 2D image data thereof is received via the image input part 24 (S54).
Furthermore, the microprocessor 106 controls the projector controller 28 to activate the pattern projector 16 while the marker generators 12 are turned off to prevent the optical markers from being projected. Therefore, prescribed patterns (for example, patterns having stripes with different gaps therebetween or multiple striped pattern) are projected on the surface of the object 10 for 3D scanning.
When the image obtaining part 18 photographs the object 10 projected with the prescribed patterns to obtain the 3D scan data, the microprocessor 106 receives the 3D scan data via the image input part 24 (S55).
The microprocessor 106 calculates the 2D positions of the markers by image-processing the 2D image data with the optical markers and the 2D image data without them (S56).
Successively, the microprocessor 106 computes the 3D positions of the markers from the 2D positions of the markers and the 3D scan data (S57). That is, the 3D positions of the markers can be obtained by estimating intersection points where straight lines connecting the camera lens center of the image obtaining part 18 and the positions of arbitrary markers in the 2D image data intersect the 3D scan data.
Meanwhile, the microprocessor 106 discriminates whether the register of buffer 32 is vacant or not (S58).
As a result of the discrimination at S58, if the register of the buffer 32 is not vacant, the microprocessor 106 compares the 3D positions of the markers obtained at S57 with those of the markers included in the 3D scan data stored in the registers of the buffer 32, thereby searching markers that correspond to each other (S59).
After the corresponding markers are obtained by comparing the markers included in the current 3D scan data with the markers stored in the register of the buffer 32 in the searching process of S59, the microprocessor 106 computes translation matrices by analyzing the relation of the corresponding markers (S60), and the current scan data are translated to the reference coordinate system by which the 3D scan data registered in the register of the buffer 32 are defined (S61), which is the same as described in the first embodiment.
Then, the microprocessor 106 registers the markers at the register of the buffer 32, which serves as a reference in the next calculation (S62). Successively, the microprocessor 106 checks whether the automatic arrangement of the 3D scan data obtained from the object 10 is completed (S63).
As a result of the check, if it is discriminated that the automatic arrangement has not been completed for the 3D scan data obtained from the object 10, the flow returns to S50. Therefore, 2D image data and 3D scan data for other regions of the object 10 are obtained by activating the rotating mechanism 104 via the rotating drive part 102 and rotating the rotating table by a prescribed angle. Steps S50 to S62 are repeatedly carried out.
As apparent from the above description, the sixth embodiment of the present invention is so constructed as to allow the object 10 to be moved; thus, it is easy to obtain and arrange 3D scan data from a relatively small object, compared with the first embodiment of the present invention in which the projector and the image obtaining part are moved.
Here, the marker generators are fixed on the rotating table for preventing relative movement between them, until the scanning process is completed.
Meanwhile, the arrangement method using reference coordinates in the aforementioned embodiments has a drawback in that errors can increase if the number of regions to be scanned is great. This is because, in the above methods, the arrangement is carried out by integrating one 3D scan data into the reference coordinate system in which its already-obtained neighboring 3D scan data is defined, and the arrangement process is repeated over all regions of the object. Therefore, a careless error in one process can be amplified by the end of the arrangement.
For example, Figs. 18a and 18b illustrate two scan data obtained by scanning two adjacent regions that overlap each other. The dotted lines indicate real data of an object and the solid lines indicate scan data which are not identical to the real data.
Under such circumstances, if any one of the scan data in Fig. 18a and Fig. 18b serves as a reference and the other data is attached (arranged) to the reference, errors may add up in the attachment, which results in Fig. 19c. In other words, an increase in the number of regions to be scanned is likely to increase the chance of errors.
In order to cope with the aforementioned problem, a method of arranging 3D scan data on an absolute coordinate system, instead of a reference coordinate system, is presented in the seventh and eighth embodiments of the present invention. The absolute coordinate system in these embodiments differs from the reference coordinate system in that every 3D scan data of the regions of an object is mapped to the absolute coordinates. Therefore, errors occurring in obtaining one 3D scan data are not transmitted to the adjacent 3D scan data.
For example, Figs. 19a and 19b illustrate two scan data obtained from scanning two adjacent regions, where a part of the scan data overlaps. If the two scan data of Figs. 19a and 19b are each translated to an absolute coordinate system and then attached to each other, as shown in Fig. 19d, the errors occurring in the two scan data are not added up as in Fig. 19c, such that the error amplification problem caused by inaccuracy of the image obtaining part can be prevented.
First, the seventh embodiment of the present invention will be described in detail with reference to the accompanying drawings.
An apparatus for automatically arranging 3D scan data according to the seventh embodiment illustrated in Fig. 20 includes a marker generator 12, a projector 16, an image obtaining part 18, a first movement driving part 20, a first movement mechanism 22, a marker blinking controller 26, a project controller 28, a buffer 32, a large domain image obtaining part 110, an image input part 112, a second movement driving part 114, a second movement mechanism 116, a microprocessor 118, and a reference object 120. Throughout the drawings, the same reference numerals and symbols are designated to the parts equivalent to those in the first embodiment in functions or operations, and explanation on the parts will be omitted for simplicity.
The large domain image obtaining part 110 comprises an image sensor for receiving images, such as a CCD camera or a Complementary Metal Oxide Semiconductor (CMOS) camera.
When markers are projected on the surface of an object 10 from the marker generator 12, images thereof are photographed and obtained by the large domain image obtaining part 110. The large domain image obtaining part 110 is positioned separately from the image obtaining part 18 to photograph and obtain an image of a large domain of the object 10.
The large domain image obtaining part 110 preferably adopts an image sensor having a relatively higher accuracy than that of the image obtaining part 18, which obtains images of parts of the scan domain.
The image input part 112 receives image data from the image obtaining part 18 and the large domain image obtaining part 110.
The second movement driving part 114 drives the second movement mechanism 116 to move the large domain image obtaining part 110 to a position suitable for obtaining the image of a large part of the object 10 according to the driving control of the microprocessor 118.
It should be noted that although the large domain image obtaining part 110 is moved electrically by the second movement driving part 114 in the seventh embodiment of the present invention, it may also be moved by manually manipulating the second movement mechanism 116.
The microprocessor 118 computes 3D positions of each marker for the large scan domain by analyzing image data of the object 10 and a reference object 120 photographed by the large domain image obtaining part 110 in two or more different directions, while a plurality of optical markers are projected on the surface of the object 10 by the marker generator 12.
The 3D positions of the markers thus obtained serve as an absolute coordinate system.
Furthermore, the microprocessor 118 receives via the image input part 112 a plurality of 2D image data and 3D scan data photographed at various angles by the image obtaining part 18, and analyzes them, and translates every 3D scan data to the absolute coordinate system, which results in an arrangement of a complete 3D scan data of the object 10.
The reference object 120, an object of a prescribed shape whose size (dimension) information is pre-inputted in the microprocessor 118, is arranged close to the object 10. An image of the reference object 120 is obtained along with that of the object 10 via the large domain image obtaining part 110.
The operating process of an apparatus for automatically arranging 3D scan data according to the seventh embodiment of the present invention thus explained will now be described in detail with reference to the flow charts shown in Figs. 21a and 21b.
First, an object 10 is placed beside the marker generator 12, and a reference object 120 is arranged at a prescribed position near the object 10. Then, the microprocessor 118 controls the second movement driving part 114 to drive the second movement mechanism 116 so that the large domain image obtaining part 110 moves to a position suitable for scanning the object 10.
Next, the microprocessor 118 controls the marker blinking controller 26 to turn on a plurality of marker output parts 14 equipped at the marker generator 12, whereby a plurality of markers can be arbitrarily projected on the surface of the object 10 (S70).
The large domain of the object 10 and the reference object 120 are photographed by the large domain image obtaining part 110 to obtain 2D image data including the optical markers. Then the microprocessor 118 receives the 2D image data obtained from the large domain image obtaining part 110 via the image input part 112 (S71).
In Fig. 23, an example of an image including the entire domain of the object 10 and the reference object 120 obtained by the large domain image obtaining part 110 is shown.
The symbol "RM" indicates an optical marker projected on the surface of the object 10 while another reference symbol "Bl " refers to an image obtained by the large domain image obtaining part 110.
Next, the microprocessor 118 controls the second movement driving part 114 to drive the second movement mechanism 116, moving the large domain image obtaining part 110 to a position suitable for scanning another part of the object 10 (S72). Next, the microprocessor 118 controls the large domain image obtaining part 110 to photograph the large domain of the object 10 including the reference object 120, whereby a 2D image including the optical markers is obtained from a direction different from that of S71. The 2D image is received by the microprocessor 118 via the image input part 112 (S73).
The microprocessor 118 then controls the marker blinking controller 26 to turn off the marker generator 12, thereby preventing the optical markers from being projected on the surface of the object 10 (S74).
The microprocessor 118 combines the 2D images of the large domain of the object 10 obtained in different directions by the large domain image obtaining part 110 and computes the 3D positions of the markers included in the combined 2D images in reference to the already-known dimensions of the reference object 120 (S75). Next, the microprocessor 118 registers the 3D positions of each marker thus calculated in the buffer 32 (S76).
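One way to read S75: marker positions reconstructed from two views whose relative geometry is not exactly known are determined only up to a global scale, and the pre-stored dimensions of the reference object 120 fix that scale. A minimal sketch under that assumption, with illustrative names:

    import numpy as np

    def fix_scale(points, ref_idx, ref_length):
        # points: (n, 3) reconstructed marker positions; ref_idx: indices of
        # two points identified on the reference object; ref_length: their
        # true separation taken from the pre-stored dimensions.
        measured = np.linalg.norm(points[ref_idx[0]] - points[ref_idx[1]])
        return points * (ref_length / measured)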
Successively, the microprocessor 118 controls the first movement driving part 20 to drive the first movement mechanism 22, whereby the image obtaining part 18 integrated with the pattern projector 16 is moved to a position suitable for scanning the object 10 (S77).
Under such circumstances, the microprocessor 118 controls the marker blinking controller 26 to turn on the plurality of marker output parts 14 equipped at the marker generator 12 to allow the plural markers to be arbitrarily projected on the surface of the object 10 (S78).
After the image obtaining part 18 photographs a part (see "NI" in Fig. 23) of the large domain of the object 10 while the optical markers are projected on the surface of the object 10, the microprocessor 118 receives the 2D image data obtained by the image obtaining part 18 via the image input part 112 (S79).
Next, the microprocessor 118 controls the marker blinking controller 26 to turn off the marker generator 12, thereby preventing the optical markers from being projected on the object 10 (S80). Under this condition, the same part of the large domain mentioned above is photographed by the image obtaining part 18 to obtain the 2D image without the optical markers. The 2D image data thus obtained is inputted to the microprocessor 118 via the image input part 112 (S81).
Furthermore, the microprocessor 118 controls the project controller 28 to activate the pattern projector 16 while the marker generator 12 is turned off to avoid the optical markers from being projected, whereby predetermined patterns (for example, patterns having stripes with different gaps therebetween or multiple striped pattern) are projected on the surface of the object 10.
When the object 10 projected with patterns is photographed by the image obtaining part 18 to obtain 3D scan data, the microprocessor 118 receives the 3D scan data via the image input part 112 (S82). Under such circumstances, the microprocessor 118 computes the 2D positions of the markers by image-processing the 2D image data including the markers and the 2D image data not including the markers (S83).
The microprocessor 118 computes the 3D positions of the markers from the 2D positions of the markers and the 3D scan data. That is, the 3D positions of the markers can be obtained by estimating intersection points where straight lines connecting the camera lens center of the image obtaining part 18 and the positions of three arbitrary markers in the 2D image data intersect the 3D scan data (S84).
Successively, the microprocessor 118 compares the 3D positions of the markers found at S84 with the 3D positions of the markers stored in the register of the buffer 32 at S76 to search corresponding markers, in other words, markers whose 3D positions are identical (S85).
When corresponding markers are found by comparing the optical markers included in the current 3D scan data with the markers stored in the register of the buffer 32, the microprocessor 118 calculates the translation matrices for translating the markers in the current 3D scan data to the absolute coordinate system (S86). Then, the current scan data are moved by the translation matrices to be arranged on the absolute coordinate system by which the 3D positions of the markers stored in the register of the buffer 32 are defined (S87).
Next, the microprocessor 118 discriminates whether an automatic arrangement in relation to the 3D data obtained from the object 10 is completed or not, in other words, whether 3D scan data obtained from parts of the object 10 are all arranged or not (S88).
If it is not completed, the flow returns to S77. Therefore, the microprocessor controls the first movement driving part 20 for driving the first movement mechanism 22, whereby the projector 16 and the image obtaining part 18 are moved to positions suitable for scanning the parts not yet scanned. Steps S77 to S88 are repeatedly carried out.
Although in the seventh embodiment a large domain image obtaining part and an image obtaining part are introduced as two different elements, it may also be possible that one image obtaining part is utilized for obtaining both the images of the large domain of the object and the images of parts of the large domain.
Next, an eighth embodiment of the present invention will be described in detail with reference to the attached drawing of Fig. 24.
The apparatus for automatically arranging 3D scan data using optical markers according to the eighth embodiment of the present invention as shown in Fig. 24 includes a marker generator 12, a pattern projector 16, an image obtaining part 18, a movement driving part 20, a movement mechanism 22, a marker blinking controller 26, a project controller 28, a buffer 32, a pair of or plural large domain image obtaining parts 130 and 132, an image input part 134 and a microprocessor 136. Throughout the drawings, the same reference numerals and symbols are designated to the parts equivalent to those in the first embodiment in functions or operations, and explanations on the parts will be omitted for simplicity.
The pair of large domain image obtaining parts 130 and 132 comprise image sensors for receiving images, such as CCD cameras or CMOS cameras. The cameras are fixed relative to each other, and they capture images of the same object from different angles, a method called stereo vision.
The large domain image obtaining parts 130 and 132 preferably adopt image sensors of relatively higher resolution than the image obtaining part 18, which obtains images of parts of the domain. The image input part 134 is intended to receive image data obtained by the image obtaining part 18 and the large domain image obtaining parts 130 and 132.
The microprocessor 136 computes 3D positions of each marker for large scan domains by analyzing image data of the object 10 photographed by the large domain image obtaining parts 130 and 132 in two different directions, while a plurality of optical markers are projected on the surface of the object 10 by the marker generator 12. The 3D positions of markers thus obtained serve as an absolute coordinate system.
Furthermore, the microprocessor 136 receives via the image input part 134 a plurality of 2D image data and 3D scan data photographed at various angles by the image obtaining part 18, and analyzes them, and translates every 3D scan data to the absolute coordinates, which results in an arrangement of the whole 3D scan data of the object 10.
Next, the operating process of the apparatus for automatically arranging 3D scan data according to the eighth embodiment of the present invention will be described in detail with reference to the flow charts shown in Figs. 25a and 25b.
First, a predetermined object 10 is placed beside the marker generator 12, and the microprocessor 136 controls the marker blinking controller 26 to turn on the plurality of marker output parts 14 equipped at the marker generator 12, allowing plural markers to be arbitrarily projected on the surface of the object 10 (S90).
The microprocessor 136 receives two 2D image data obtained from the large domain image obtaining parts 130 and 132 via the image input part 134 when the large domain of the object 10 is photographed in an overlapping manner in different directions by the large domain image obtaining parts 130 and 132, respectively, while the optical markers from the marker generator 12 are projected on the object 10 (S91).
Fig. 26 illustrates an example of images of the large scan domain of the object 10 obtained by the large domain image obtaining parts 130 and 132. The reference symbol "RM" indicates an optical marker projected on the surface of the object 10, "BI-1" is an image obtained by the large domain image obtaining part 132, and "BI-2" is an image obtained by the large domain image obtaining part 130.
Successively, the microprocessor 136 controls the marker blinking controller 26 to turn off the marker generator 12, thereby preventing the optical markers from being projected on the surface of the object 10 (S92).
The microprocessor 136 computes the 3D positions of the markers included in the large scan domain of the object in reference to the two 2D image data photographed in two different directions by the large domain image obtaining parts 130 and 132 (S93). In other words, from the relation between the positions of the pair of large domain image obtaining parts 130 and 132 and the 2D positions of each marker projected on the object 10, the 3D positions of each marker are calculated by triangulation, the details of which will be explained later. Next, the microprocessor 136 registers the 3D positions of each marker thus calculated in the register of the buffer 32 (S94).
The microprocessor 136 then controls the movement driving part 20 to drive the movement mechanism 22, whereby the image obtaining part 18 integrated with the pattern projector 16 moves to a position suitable for scanning the object 10 (S95).
Under such condition, the microprocessor 136 controls the marker blinking controller 26 to turn on the plurality of marker output parts 14 equipped at the marker generator 12, thereby projecting the plural markers arbitrarily on the surface of the object 10 (S96). When a part ("NI" in Fig. 26b) of the large scan domain of the object 10 is photographed by the image obtaining part 18 to obtain 2D image data including the optical markers, the microprocessor 136 receives the 2D image data via the image input part 134 (S97).
Successively, the microprocessor 136 controls the marker blinking controller 26 to turn off the marker generator 12, preventing the optical markers from being projected on the object 10 (S98). Under such condition, when the same part as mentioned above is photographed by the image obtaining part 18 to obtain 2D image data not including the optical markers, the microprocessor 136 receives the 2D image data via the image input part 134 (S99).
Furthermore, the microprocessor 136 controls the project controller 28 to activate the pattern projector 16 while the marker generator 12 is turned off to avoid the optical markers from being projected. Therefore, predetermined patterns (for example, patterns having stripes with different gaps therebetween or multiple striped pattern) are projected on the surface of the object 10.
When the object 10 projected with patterns is photographed by the image obtaining part 18 to obtain 3D scan data, the microprocessor 136 receives the 3D scan data via the image input part 134 (S100).
Under such circumstances, the microprocessor 136 analyzes the 2D image data including the markers and the 2D image data not including the markers to calculate the 2D positions of the markers (S101).
Furthermore, the microprocessor 136 computes the 3D positions of the markers in reference to the 2D positions of the markers and the 3D scan data (S102). That is, the 3D positions of the markers can be obtained by estimating intersection points where straight lines connecting the camera lens center of the image obtaining part 18 and the positions of arbitrary markers in the 2D image data intersect the 3D scan data. Successively, the microprocessor 136 compares the 3D positions of the markers found at S102 with the 3D positions of the markers stored in the register of the buffer 32 at S94 to search corresponding markers (S103).
When corresponding markers are found by the search process of the markers described above, the microprocessor 136 calculates the translation matrices for translating the markers in the current 3D scan data (S104). The current scan data are moved by the translation matrices to be arranged on the absolute coordinates by which the 3D positions of the markers stored in the register of the buffer 32 are defined (S105).
Next, the microprocessor 136 discriminates whether the automatic arrangement in relation to the 3D data obtained from the object 10 is completed or not, in other words, whether the 3D scan data obtained from parts of the entire scan domain of the object 10 are all arranged (S106).
If it is not completed, the flow returns to S95. Therefore, the microprocessor controls the movement driving part 20 for driving the movement mechanism 22, whereby the pattern projector 16 and the image obtaining part 18 are moved to positions suitable for scanning the parts not yet scanned. Steps S95 to S106 are repeatedly carried out.
Although in the eighth embodiment, a pair of large domain image obtaining parts, an image obtaining part and a marker generator are configured separately, the pair of large domain image obtaining parts and the marker generator may be integrally configured as a modification thereto. In this case, this may be more convenient because there is no need to set positions of the pair of large domain image obtaining parts according to a domain where the optical markers are projected.
As another modification to the eighth embodiment of the present invention, a pair of large domain image obtaining parts and an image obtaining part may be integrally constructed. In this case, in the process of obtaining an absolute coordinate, the scan domain may become a bit smaller and accuracy may also be decreased, however, in the process of obtaining parts of images of the scan domain it is not necessary to photograph in an overlapping manner. Therefore, the number of scanning process can be reduced.
The principle of the eighth embodiment of the present invention described above will be explained below in detail.
The large domain image obtaining parts 130 and 132 disclosed in the eighth embodiment of the present invention can be modeled by two cameras facing one object, which can be modified according to the field of applications. In the eighth embodiment, two cameras are arranged in parallel as shown in Fig. 27. The variables in Fig. 27 are defined below.
X: position of a point to be obtained
b: base line distance between camera lens centers
f: camera's focal length
A, B: image plane obtained by each camera
Xl, Xr: respective positions of the image of the point on the image planes
P, Q: lens centers of each camera
A method for obtaining the position of a point by using a stereo image pair can be defined by the following Formulas 15 and 16.
FORMULA 15

$$\frac{x'_l}{f} = \frac{x + b/2}{z}, \qquad \frac{x'_r}{f} = \frac{x - b/2}{z}, \qquad \frac{y'_l}{f} = \frac{y'_r}{f} = \frac{y}{z}$$

FORMULA 16

$$x = \frac{b\,(x'_l + x'_r)}{2\,(x'_l - x'_r)}, \qquad y = \frac{b\,(y'_l + y'_r)}{2\,(x'_l - x'_r)}, \qquad z = \frac{b\,f}{x'_l - x'_r}$$

where (x, y, z) are the coordinates of the point X, and (x'_l, y'_l) and (x'_r, y'_r) are the image coordinates of Xl and Xr on the image planes A and B.
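Formulas 15 and 16 translate directly into a small triangulation routine; the function name and argument order are illustrative:

    import numpy as np

    def triangulate(xl, yl, xr, yr, b, f):
        # (xl, yl), (xr, yr): image coordinates of the point in the left and
        # right image planes; b: baseline between the lens centers P and Q;
        # f: focal length of the cameras.
        disparity = xl - xr
        x = b * (xl + xr) / (2.0 * disparity)   # Formula 16
        y = b * (yl + yr) / (2.0 * disparity)
        z = b * f / disparity
        return np.array([x, y, z])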
Next, the ninth embodiment of the present invention will be described with reference to the attached drawing. In the ninth embodiment of the present invention, a plurality of pattern projectors, image obtaining parts and marker generators are arranged around an object. Therefore, the projectors and image obtaining parts need not be moved to obtain 2D images and 3D scan data covering the entire scan domain of the object, and a single scanning operation suffices, thereby simplifying the job and shortening the time consumed.
Fig. 28 is a schematic drawing illustrating the construction of an apparatus for automatically arranging 3D scan data using optical markers according to the ninth embodiment of the present invention, wherein the apparatus comprises N number of marker generators 142, M number of pattern projectors 146, L number of image obtaining parts 148, an image input part 150, a project controller 152, a marker blinking controller 154, a microprocessor 156 and a buffer 158.
The N number of marker generators 142, intended to project markers recognizable by the image obtaining part 148 on the surface of an object, are disposed with a plurality of marker output parts 144 for projecting a plurality of optical markers on the entire surface of the object 10 in arbitrary scanning directions.
The N number of marker generators 142 are directed to the object 10, and are apart from each other at a predetermined interval, and the markers are so arranged as to cover the entire object.
The M number of pattern projectors 146 project predetermined patterns or laser striped patterns on the surface of the object 10 for obtaining 3D scan data. LCD projectors may be utilized to project space-coded beams or laser beams on the surface of the object 10, thereby obtaining 3D scan data via the image obtaining part 148.
The M number of pattern projectors 146 are directed to the object 10 and are spaced apart from each other at a predetermined interval, and the space-coded beams projected from each pattern projector 146 are made to cover the entire domain of the object 10.
The L number of image obtaining parts 148 which comprise image sensors capable of receiving images, such as CCD cameras or CMOS cameras, photograph and obtain images of the object 10. It is preferable that each of the L number of image obtaining parts 148 is integrated with an individual pattern projector 146 instead of being separated.
Furthermore, the L number of image obtaining parts 148, as shown in Fig. 28, are directed to the object 10 and are spaced apart from each other at a predetermined interval, and the scanning domains of the image obtaining parts 148 cover the entire domain of the object 10.
The image input part 150 receives each image data obtained from the L number of image obtaining parts 148, and the project controller 152 controls the transfer speed and transfer directions of pattern films and the blinking cycle of light sources for projecting the pattern films.
The marker blinking controller 154 periodically blinks each optical marker from the N number of marker generators 142 according to the control of the microprocessor 156.
The microprocessor 156 computes the 3D positions of the markers in each domain in reference to the 2D image data and 3D scan data obtained from the L number of image obtaining parts 148, respectively, searches corresponding markers in every overlapped domain in reference to the 3D positions of the markers, and calculates translation matrices by using the corresponding markers. As a result, the microprocessor 156 arranges each of the 3D scan data by the translation matrices. The buffer 158 stores the data necessary for the computation and the resultant data thereof.
Next, the operating process of the apparatus for automatically arranging 3D scan data using optical markers according to the ninth embodiment of the present invention thus described will be explained in detail with reference to the flow chart shown in Fig. 29.
First, the object 10 is placed in a position suitable for scanning, and the N number of marker generators 142, M number of pattern projectors 146 and L number of image obtaining parts 148 are arranged around the object 10. Then, the microprocessor 156 controls the marker blinking controller 154 to turn on the plural marker output parts 144 equipped at each of the N number of marker generators 142, thereby allowing the plural markers to be arbitrarily projected on the surface of the object 10 (S110).
When the scan domains of the object 10 are photographed by each of the L number of image obtaining parts 148 to obtain 2D images while the optical markers are projected on the object 10, the microprocessor 156 receives the L number of 2D image data obtained from the L number of image obtaining parts 148 via the image input part 150 (S111).
Successively, the microprocessor 156 controls the marker blinking controller 154 to turn off the N number of marker generators 142, thereby preventing the optical markers from being projected on the surface of the object (S112). Under such condition, when the same domain of the object 10 as mentioned above is photographed by each of the L number of image obtaining parts 148 to obtain L number of 2D images not including the optical markers, the microprocessor 156 receives the L number of 2D image data via the image input part 150 (S113).
Furthermore, the microprocessor 156 controls the project controller 152 to operate the M number of pattern projectors 146 while the N number of marker generators 142 are turned off to prevent the optical markers from being projected. Therefore, predetermined patterns (for example, patterns having stripes with different gaps therebetween or a multiple striped pattern) are projected on the surface of the object 10 from the M number of pattern projectors 146.
When the object 10 projected with the patterns is photographed by the L number of image obtaining parts 148 to obtain L number of 3D scan data, the microprocessor 156 receives the L number of 3D scan data via the image input part 150 (S114). Under such circumstances, the microprocessor 156 calculates the 2D positions of the markers by image-processing the 2D image data including the optical markers and the 2D image data not including the markers (S115).
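One plausible realization of this image-processing step S115 is to subtract the marker-off image from the marker-on image of the same view and take blob centroids; a minimal sketch follows (the function name, threshold value and SciPy-based blob labeling are assumptions, not the patent's prescribed method).

```python
import numpy as np
from scipy import ndimage

def marker_centroids_2d(img_with_markers, img_without_markers, thresh=40):
    """Estimate 2D marker positions from two photographs of the same view,
    one taken with the optical markers on and one with them off."""
    diff = img_with_markers.astype(np.int16) - img_without_markers.astype(np.int16)
    mask = diff > thresh                       # pixels lit only by the markers
    labels, n = ndimage.label(mask)            # connected bright blobs
    centers = ndimage.center_of_mass(mask, labels, range(1, n + 1))
    return np.array(centers)                   # one (row, col) pair per marker
```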
Furthermore, the microprocessor 156 computes the 3D positions of the markers from the 2D positions of the markers and the 3D scan data (S116). That is, the 3D positions of the markers can be obtained by estimating the intersection points where straight lines connecting the camera lens centers of the L number of image obtaining parts 148 and the positions of arbitrary markers in the 2D image data intersect the 3D scan data.
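Against scan data given as a point cloud, one simple way to estimate such an intersection is to take the scan point nearest to the ray; the sketch below uses this nearest-to-ray approximation (an assumption — the patent does not fix a particular intersection test).

```python
import numpy as np

def marker_3d_position(cam_center, ray_dir, scan_points):
    """Approximate where the ray from the camera lens center through a 2D
    marker position meets the 3D scan data (point cloud assumed)."""
    d = ray_dir / np.linalg.norm(ray_dir)      # unit ray direction
    v = scan_points - cam_center               # vectors to every scan point
    t = v @ d                                  # signed distance along the ray
    foot = cam_center + np.outer(t, d)         # closest point on the ray
    dist = np.linalg.norm(scan_points - foot, axis=1)
    return scan_points[np.argmin(dist)]        # scan point nearest the ray
```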
Successively, the microprocessor 156 compares the 3D positions of the markers across the L number of 3D scan data to search for corresponding markers (S117).
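Because pairwise distances between markers are invariant under rigid motion, corresponding markers can be sought by comparing each marker's sorted distances to the other markers in the same scan, as sketched below (the tolerance value and the equal-marker-count assumption are illustrative; claims 29 and 31 describe distance- and triangle-based variants of this search).

```python
import numpy as np

def match_markers(pts_a, pts_b, tol=0.5):
    """Match 3D markers between two scans via rigid-motion-invariant
    distance signatures; assumes both scans see the same marker set."""
    def signatures(pts):
        d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
        return np.sort(d, axis=1)[:, 1:]       # drop the zero self-distance
    sig_a, sig_b = signatures(pts_a), signatures(pts_b)
    pairs = []
    for i, sig in enumerate(sig_a):
        err = np.abs(sig_b - sig).mean(axis=1) # compare distance profiles
        j = int(np.argmin(err))
        if err[j] < tol:
            pairs.append((i, j))               # marker i in A <-> marker j in B
    return pairs
```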
When corresponding markers are found by the searching process, the microprocessor 156 calculates the translation matrices for translating the markers in the current 3D scan data (S118). One of the L number of 3D scan data is set up as a reference coordinate, and the current 3D scan data is moved according to the obtained translation matrices for arrangement (S119).
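A standard way to compute such a transformation from three or more corresponding markers is the SVD-based (Kabsch) least-squares fit; the sketch below returns it as a 4x4 homogeneous matrix, one common convention for the "translation matrices" the description refers to (the exact formulation used by the patent is not specified).

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares R, t with dst ~ R @ src + t from corresponding
    3D marker positions (Kabsch/SVD method), as a 4x4 matrix."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)              # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                   # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cd - R @ cs
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T
```

With at least three non-collinear corresponding markers the fit is unique, and applying T to the current scan moves it onto the reference coordinate system.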
Next, a tenth embodiment of the present invention will be described in detail with reference to the attached drawings. The elements of the tenth embodiment of the present invention are nearly identical to those of the ninth embodiment; however, the operating processes are different from each other. Therefore, the tenth embodiment will be described based on the configuration of the ninth embodiment illustrated in Fig. 28 and the flow chart shown in Fig. 30.
First, a reference object whose dimensions are already known is placed in a position suitable for scanning, and the N number of marker generators 142, M number of pattern projectors 146 and L number of image obtaining parts 148 are respectively arranged around the reference object. The reference object may be specially manufactured for calibration, or may be an actual object if its dimensions are already known.
Under such condition, the microprocessor 156 controls the marker blinking controller 154 to turn on the respective marker output parts 144 equipped at the N number of marker generators 142, allowing the plural markers to be arbitrarily projected on the surface of the reference object (S120).
The microprocessor 156 carries out calibration for seeking a correlation between the reference object and the L number of image obtaining parts 148 (S121). The detailed operating processes thereof are described below.
At step S121, the optical markers from the N number of marker generators 142 are projected on the surface of the reference object, and when the scan domains of the reference object are photographed by the L number of image obtaining parts 148 to obtain 2D images containing the optical markers, the microprocessor 156 receives the L number of 2D image data via the image input part 150.
Successively, the microprocessor 156 controls the project controller 152 to operate the M number of pattern projectors 146. Therefore, predetermined patterns (for example, patterns having stripes with different gaps therebetween, or a multiple striped pattern) are projected on the surface of the reference object from the M number of pattern projectors 146. When the reference object projected with the patterns is photographed by the L number of image obtaining parts 148 to obtain L number of 3D scan data, the microprocessor 156 receives the L number of 3D scan data via the image input part 150.
Under such condition, the microprocessor 156 estimates the 3D positions of the markers in the L number of 3D scan data by calculating the intersection points where straight lines connecting each camera center of the L number of image obtaining parts 148 with the markers included in each 2D image data intersect the 3D scan data, respectively.
Successively, the microprocessor 156 compares the 3D positions of the markers across the L number of 3D scan data to search for corresponding markers, and calculates translation matrices from the relations of the corresponding markers. Then, the microprocessor 156 registers the obtained translation matrices in the register of the buffer 158, whereby the calibration of step S121 is completed.
When the calibration thus described is completed at S121, the reference object is removed and the object 10 is placed where the reference object had been, and the microprocessor 156 controls the marker blinking controller 154 to turn off the N number of marker generators 142, thereby preventing the optical markers from being projected on the surface of the object 10 (S122).
Under such circumstances, the microprocessor 156 receives the L number of 2D image data via the image input part 150 when the scan domains of the object 10 are photographed by the L number of image obtaining parts 148 to obtain the L number of 2D images not including the optical markers (S123).
Furthermore, the microprocessor 156 controls the project controller 152 to activate the M number of pattern projectors 146 while the N number of marker generators 142 remain turned off to keep the optical markers from being projected. Therefore, predetermined patterns (for example, patterns having stripes with different gaps therebetween, or a multiple striped pattern) are projected on the surface of the object 10 from the M number of pattern projectors 146.
When the object 10 projected with the patterns is photographed by the L number of image obtaining parts 148 to obtain L number of 3D scan data, the microprocessor 156 receives the L number of 3D scan data via the image input part 150 (S124).
The microprocessor 156 reads out the translation matrices stored in the register of the buffer 158, sets one of the L number of 3D scan data as a reference, and moves the L-1 number of 3D scan data by the translation matrices (S125).
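Applying a stored matrix then reduces to one matrix product per scan; a minimal sketch follows (the 4x4 homogeneous convention continues the assumption made above).

```python
import numpy as np

def apply_transform(T, points):
    """Move one 3D scan into the reference coordinate system using a
    stored 4x4 homogeneous transform, e.g. as registered at calibration."""
    return points @ T[:3, :3].T + T[:3, 3]

# One scan serves as the reference; the other L-1 scans are moved:
# aligned = [apply_transform(T_i, scan_i) for T_i, scan_i in zip(Ts, scans)]
```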
When scanning is necessary for another object, or for the same object again, the calibration from S121 to S123 can be omitted. Since the 3D scan data can be arranged by the translation matrices stored in the register of the buffer 158, the scanning time can be reduced. However, if needed, the calibration from S121 to S123 may be executed for every scan, and this can be easily changed, modified or altered according to the operator's intention or the system configuration.
Next, an eleventh embodiment of the present invention will be described. The eleventh embodiment provides marker generators and peripherals different from those used in the first to tenth embodiments of the present invention.
A marker generator of the eleventh embodiment, as shown in Fig. 31, includes a plurality of light sources of the X-axis 160, a blinking controller 162, a polygon mirror 164 rotating about the X-axis, a rotary driving part 166, a rotary mechanism 168, a plurality of light sources of the Y-axis 170, a blinking controller 172, a polygon mirror 174 rotating about the Y-axis, a rotary driving part 176 and a rotary mechanism 178. The plurality of light sources of the X-axis 160 generate beams having excellent straight traveling properties, such as laser beams, to be emitted to the reflecting surface of the polygon mirror 164. The light sources of the X-axis may be, for example, laser pointers.
The blinking controller 162 blinks each light source 160 according to the control of a microprocessor (not shown).
The polygon mirror 164, equipped with a plurality of reflecting surfaces, is rotated by the rotary mechanism 168 in order to reflect the plural beams, thereby projecting the plural beams on the surface of an object (OB). The rotary driving part 166 drives the rotary mechanism 168 to rotate the polygon mirror 164 in one direction in response to the control of a microprocessor.
The plurality of light sources of the Y-axis 170 generate beams of excellent straight traveling properties, such as laser beams, to be emitted to the reflecting surface of the polygon mirror 174. The light sources may be, for example, laser pointers. The blinking controller 172 blinks each light source according to the control of a microprocessor (not shown).
The polygon mirror 174, equipped with a plurality of reflecting surfaces, is rotated by the rotary mechanism 178 in order to reflect the plurality of beams, thereby projecting the plural beams on the surface of the object (OB). The rotary driving part 176 drives the rotary mechanism 178 to rotate the polygon mirror 174 in one direction in response to the control of a microprocessor.
Next, the operating processes of the marker generator according to the eleventh embodiment of the present invention thus described will be explained in detail.
First, driving power generated by the rotary driving part 166 and the rotary driving part 176 is applied to the rotary mechanism 168 and the rotary mechanism 178 according to a control signal of a microprocessor. The rotary mechanism 168 and the rotary mechanism 178, respectively driven by the driving power, rotate the polygon mirrors 164 and 174.
When the light sources 160 and 170 are lit by the blinking controllers 162 and 172 in response to the control signal of the microprocessor, the beams created by the plurality of light sources 160 and 170 are emitted to the reflecting surfaces of the polygon mirrors 164 and 174. Then, the beams are projected on the surface of the object (OB).
The polygon mirrors 164 and 174 are rotated so that the angles of the reflecting surfaces differ. Therefore, lines of the plural beams are formed on the surface of the object (OB) in the directions of the X-axis and Y-axis, and the intersecting points where the lines of the X-axis and Y-axis intersect become optical markers (RM), respectively.
For example, in case the number of light sources of the X-axis is m and the number of light sources of the Y-axis is n, m*n intersecting points can be formed on the surface of the object (OB), and the m*n intersecting points become respective optical markers (RM). Therefore, it is possible to employ a small number of light sources to generate a relatively large number of optical markers.
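A small numerical check of this m*n count, on an idealized planar surface (the line positions below are arbitrary assumptions made for illustration):

```python
import numpy as np

m, n = 4, 5                            # beams swept about the X- and Y-axes
ys = np.linspace(-0.2, 0.2, m)         # heights of the m X-direction lines
xs = np.linspace(-0.3, 0.3, n)         # positions of the n Y-direction lines
markers = np.array([(x, y) for y in ys for x in xs])  # every line crossing
assert len(markers) == m * n           # 20 optical markers from only 9 sources
```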
Although the present invention has been fully described in connection with the preferred embodiments thereof with reference to the accompanying drawings, it is to be noted that various changes and modifications are apparent to those skilled in the art. Such changes and modifications are to be understood as included within the scope of the present invention as defined by the appended claims unless they depart therefrom.
As apparent from the foregoing, the present invention provides an apparatus and method for automatically arranging 3D scan data obtained from different angles and positions using optical markers. There is an advantage in the present invention thus described in that optical markers having no physical volume are used to find the relative positions of mutually different scan data, such that the scan data are not lost or damaged even in portions where the markers exist. There is another advantage in that there is no need to place markers on or remove markers from an object in order to scan the object, thereby providing convenience and safety in scanning, and preventing damage to the object that could result from attaching and removing markers. There is still another advantage in that the optical markers of the present invention can be reused indefinitely.

Claims

WHAT IS CLAIMED IS:
1. An apparatus using optical markers for automatically arranging 3D scan data which is obtained by photographing an object at various different angles, comprising:
marker generating means for projecting a plurality of optical markers on a surface of the object;
pattern projecting means for projecting patterns on the surface of the object in order to obtain 3D scan data of the object;
image obtaining means for obtaining 2D image data of said object including the markers projected on the surface of the object and for obtaining 3D scan data of the object through the patterns projected on the surface of the object; and
control means for calculating the 3D positions of the markers from the relation between the 2D image data and the 3D scan data and calculating relative positions of the 3D scan data based on the 3D positions of the markers.
2. The apparatus as defined in claim 1, wherein said pattern projecting means and said image obtaining means are integrated fixedly.
3. The apparatus as defined in claim 2, further comprising:
movement driving part controlled by said control means; and
movement mechanism driven by said movement driving part in order to move said pattern projecting means and said image obtaining means relative to the object.
4. The apparatus as defined in claim 1, further comprising:
marker blinking control means for periodically blinking the markers generated by said marker generating means according to the control of said control means, wherein said control means turns on the markers by way of said marker blinking control means for obtaining 2D image data of a part of the object including the markers by said image obtaining means, and turns off the markers for obtaining 2D images of the same part of the object without the markers.
5. The apparatus as defined in claim 4, wherein the 2D positions of said markers are obtained by comparing the 2D images including the markers and the 2D images without the markers obtained from the same part of the object by said control means.
6. The apparatus as defined in claim 1, further comprising:
individual marker blinking control means for blinking the markers individually and sequentially in a predetermined order according to the control of said control means, wherein said control means sets up the image data without the markers as a basic image data, and compares the basic image data with a plurality of image data each of which includes a marker different from each other, thereby extracting 2D positions of the markers.
7. The apparatus as defined in claim 1, further comprising:
individual marker blinking control means for blinking the markers individually according to the control of said control means, wherein said control means controls said individual marker blinking control means so that markers projected on a domain which is already photographed are differentiated from markers projected on a next domain which will be photographed next.
8. The apparatus as defined in claim 1, further comprising individual marker blinking/color control means, wherein said marker generating means includes a plurality of multi-color light-emitting elements whose colors can be changed selectively, wherein the individual marker blinking/color control means controls the blinking and color of the plurality of multi-color light-emitting elements, and wherein said control means controls said individual marker blinking/color control means so that markers projected on the photographed domain are differentiated from markers projected on the domain not yet photographed.
9. The apparatus as defined in claim 1, further comprising individual marker blinking control means, wherein the markers are divided into a plurality of groups in a predetermined overlapping way for binarization of the markers, wherein said individual marker blinking control means turns on and off the markers sequentially group by group under the control of said control means, wherein said control means extracts 2D positions of the markers included in plural 2D image data by searching binary information of the markers which represents turned-on or turned-off state of the markers, individually.
10. The apparatus as defined in claim 1, further comprising a project controller for controlling the state of patterns projected by said pattern projecting means under the control of said control means, wherein said control means prevents the markers from being projected while the patterns are projected.
11. The apparatus as defined in claim 1, wherein said marker generating means is configured fixedly to the object in order not to move relatively to the object.
12. The apparatus as defined in claim 11, wherein said marker generating means are plural and configured fixedly to a rotating table where the object is placed.
13. The apparatus as defined in claim 1, wherein said marker generating means project the markers on the surface of the object arbitrarily.
14. The apparatus as defined in claim 1, wherein said image obtaining means obtains 2D image data and 3D scan data from overlapped domain of the object, and said control means calculates 3D positions of markers by using at least two or more characteristic points contained in the 2D image data and 3D scan data.
15. The apparatus as defined in claim 14, wherein said image obtaining means obtains a plurality of 2D image data and 3D scan data from a plurality of overlapped domains of the object, and said control means searches corresponding markers per the overlapped domain in reference to the 3D positions of the markers, and said control means calculates translation matrices based on the relation between the corresponding markers, and said control means moves each 3D scan data to a reference coordinate by applying the translation matrices.
16. The apparatus as defined in claim 1, wherein said image obtaining means, said pattern projecting means and said marker generating means are respectively arranged in plural around the object, and wherein said control means calculates 3D positions of markers from the 2D images and 3D scan data of each region allotted to each image obtaining means, searches corresponding markers included in adjacent regions in reference to the 3D positions of the markers, and calculates translation matrices from the relation of the corresponding markers, thereby arranging plural 3D scan data by applying the translation matrices.
17. The apparatus as defined in claim 16, wherein said control means memorizes the translation matrices for arranging 3D scan data in the next scanning operation.
18. The apparatus as defined in claim 1, wherein said image obtaining means obtains a plurality of 2D image data from the entire scan domain by photographing from plural scanning positions, where the distance between the scanning positions is already known, and obtains a plurality of 2D image data and 3D scan data from a plurality of parts of the entire scan domain which overlap partly, and wherein said control means computes 3D positions of the markers by analyzing the plurality of 2D image data relative to the entire scan domain of the object and the information on the distance between the scanning positions, and sets an absolute coordinate system based on the 3D positions of the markers thus obtained, and then computes 3D positions of the markers by analyzing 2D image data relative to the parts of entire scan domain of the object and 3D scan data part by part, searches corresponding markers commonly included in the 2D image data from the entire scan domain and 2D image data from a plurality of parts of the entire scan domain by using the information on the 3D positions of the markers, and calculates translation matrices in reference to the corresponding markers, and moves each of the 3D scan data to be arranged on an absolute coordinate system.
19. The apparatus as defined in claim 18, wherein said image obtaining means is equipped with plural large domain scan apparatus for obtaining images of the large domain of the object and a scan apparatus for obtaining images of parts of the large scan domain, wherein the plurality of large domain scan apparatus are separated from each other and the distances therebetween are fixed.
20. The apparatus as defined in any one of claims 1 to 19, wherein said marker generating means comprises:
a plurality of light sources for generating straight beams in the direction of an X-axis;
a plurality of light sources for generating straight beams in the direction of a Y-axis;
a polygon mirror of X-axis for reflecting the straight beams on the surface of the object;
a polygon mirror of Y-axis for reflecting the straight beams on the surface of the object;
a rotary mechanism for rotating the polygon mirrors of X-axis and Y-axis; and
a light source blinking controller for controlling the blinking of the light sources of the
X-axis and Y-axis.
21. A method for automatically arranging 3D scan data using optical markers, the method comprising the steps of:
moving image obtaining means to a position suitable for obtaining an image of a part of an object;
projecting optical markers on the surface of the object by said marker generating means and obtaining 2D image data of parts of the object including the optical markers projected on the surface of the object by said image obtaining means;
projecting patterns on the surface of the object by said pattern projecting means and obtaining 3D scan data of the object on which the patterns are projected by said image obtaining means; and
extracting 3D positions of the markers from the relation between the 2D image data and the 3D scan data and arranging a plurality of 3D scan data obtained from different parts of the object by calculating relative positions among the 3D scan data in reference to the 3D positions of the markers.
22. The method as defined in claim 21, wherein the step of obtaining 2D image data further comprises the steps of:
first, obtaining 2D image data including markers from the part of the object; and
second, obtaining 2D image data not including markers from the same part of the object by turning off the markers.
23. The method as defined in claim 22, wherein the step of obtaining 2D image data further comprises a step of extracting 2D positions of the markers by image-processing the 2D image data including markers and 2D image data not including markers of the same part of the object.
24. The method as defined in claim 21, wherein the step of obtaining 2D image data further comprises the steps of:
obtaining basic image data by photographing each part of the object while all the optical markers are turned off;
projecting the plurality of optical markers onto each part of the object individually and sequentially according to the predetermined order by said marker generating means, and obtaining image data of each part of the object by photographing the part of the object at every state in which the markers are individually and sequentially projected; and
extracting 2D positions of the markers by comparing each of the image data, photographed in a number corresponding to the number of the optical markers, with the basic image data.
25. The method as defined in claim 21, wherein the step of obtaining the 2D image data further comprises the steps of:
dividing the plurality of optical markers into predetermined groups in an overlapping manner and sequentially turning on the optical markers group by group; and
searching binary information of the grouped markers, that is, searching the turned-on state of each of the grouped markers, to thereby extract the 2D positions of the markers as an intrinsic identification of each marker.
26. The method as defined in claim 21, wherein the step of obtaining 3D scan data further comprises a step of preventing the optical markers from being generated by said marker generating means when the patterns are projected by the pattern projecting means.
27. The method as defined in claim 21, wherein the step of extracting 3D positions of the markers from the relation between the 2D image data and 3D scan data obtained by said image obtaining means further comprises a step of estimating intersection points where straight lines connecting the camera lens center of the image obtaining part and the positions of arbitrary markers in the 2D image data intersect the 3D scan data, thereby enabling the 3D positions of the markers to be found.
28. The method as defined in claim 21, wherein the step of arranging the 3D scan data further comprises the steps of:
searching corresponding markers included commonly in adjacent 3D scan data which overlap each other in reference to the 3D positions of the markers;
calculating translation matrices for arranging each 3D scan data based on the corresponding markers; and
setting up one of the 3D scan data as a reference and moving and arranging each scan data by applying the translation matrices.
29. The method as defined in claim 28, wherein the step of searching corresponding markers further comprises a step of utilizing information on the distance between every two markers and information on the average vertical (normal) vectors at or around the every two markers for searching corresponding markers.
30. The method as defined in claim 28, wherein the step of searching corresponding markers further comprises a step of selecting additional reference points around a marker out of the 3D scan data and using the points along with the information on relative 3D positions of the markers for obtaining corresponding markers.
31. The method as defined in claim 28, wherein the step of searching corresponding markers further comprises a step of forming triangles by utilizing 3D positions of every three markers and obtaining lengths of sides of the triangle, and arranging each side in descending order and comparing each length and order of the sides, for obtaining corresponding triangles, thereby searching corresponding markers.
32. The method as defined in claim 28, wherein the step of setting up one of the 3D scan data as a reference and moving and arranging each 3D scan data comprises the steps of:
matching a vertex of a triangle formed by three markers relative to one 3D scan data to that of the triangle in a reference coordinate system based on information on the vertexes and sides of the triangles;
performing rotational translation for matching one side holding the matching vertex; and matching the two triangles by rotating the vertex not included in the side, which serves as an axis, up to the corresponding vertex in the reference coordinate system.
33. The method as defined in claim 21 further comprising the steps of obtaining a plurality of 2D image data by photographing an entire scan domain from a plurality of scanning positions apart from each other by a predetermined interval, computing 3D position of each marker by analyzing the 2D image data and the information on the interval, and establishing an absolute coordinate system based on the 3D positions of the markers thus calculated, wherein the step of arranging a plurality of 3D scan data further comprises the steps of:
searching corresponding markers included commonly in every adjacent 3D scan data, so obtained as to mutually overlap, in reference to the 3D positions of the markers;
calculating translation matrices for arranging each 3D scan data in reference to the corresponding markers; and
moving each 3D scan data by way of translation matrices to arrange the 3D scan data on the absolute coordinate system.
34. The method as defined in claim 21 further comprising a step of periodically blinking markers projected on a domain already photographed when said marker generating means is controlled for photographing other domains.
35. The method as defined in claim 21 further comprising a step of generating different colors between markers projected on a domain already photographed and markers projected on not-yet-photographed domains when said marker generating means is controlled for photographing other domains.
36. A method for automatically arranging 3D scan data using optical markers, comprising the steps of: arranging a plurality of image obtaining means, pattern projecting means and marker generating means about an object;
blinking said plurality of marker generating means to allow the optical markers to be projected on the surface of the object and obtaining a plurality of 2D image data relative to the object where the optical markers are projected by the plurality of said image obtaining means;
activating the plurality of said pattern projecting means to project patterns on the surface of the object and obtaining each 3D scan data relative to the object projected with the patterns by the plurality of said image obtaining means; and
extracting 3D position of each marker from the relation between respective 2D image data and 3D scan data obtained by the plurality of said image obtaining means and calculating relative positions of respective 3D scan data from the 3D positions of markers and arranging each 3D scan data.
37. A method for arranging 3D scan data using optical markers, the method comprising the steps of:
arranging a plurality of image obtaining means, pattern projecting means and marker generating means about a reference object;
performing calibration by projecting markers on the reference object under the control of the plurality of said marker generating means, obtaining 2D image data and 3D scan data relative to the reference object by the plurality of said image obtaining means and said pattern projecting means, extracting 3D positions of the markers for each domain from the respective 2D image data and 3D scan data, searching corresponding markers included commonly in adjacent domains in reference to the 3D positions of the markers, and calculating translation matrices by way of the corresponding markers;
obtaining respective 2D image data relative to a particular domain of an object by the plurality of said image obtaining means;
obtaining respective 3D scan data relative to the particular domain of the object while projecting the patterns on the object; and
arranging the 3D scan data by the translation matrices obtained in the step of performing calibration.
38. The method as defined in claim 37, wherein the step of performing calibration is always carried out before 2D and 3D scan data are obtained on the object.
39. The method as defined in claim 37, wherein the step of performing calibration is carried out only one time in the beginning, and the translation matrices obtained in the step of performing calibration are utilized for the subsequent arrangement of 3D scan data of the object.
PCT/KR2003/001087 2002-07-25 2003-06-03 Apparatus and method for automatically arranging three dimensional scan data using optical marker WO2004011876A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
AU2003241194A AU2003241194A1 (en) 2002-07-25 2003-06-03 Apparatus and method for automatically arranging three dimensional scan data using optical marker
JP2004524349A JP4226550B2 (en) 2002-07-25 2003-06-03 Three-dimensional measurement data automatic alignment apparatus and method using optical markers

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR20020043830 2002-07-25
KR10-2002-0043830 2002-07-25
KR10-2003-0022624 2003-04-10
KR10-2003-0022624A KR100502560B1 (en) 2002-07-25 2003-04-10 Apparatus and Method for Registering Multiple Three Dimensional Scan Data by using Optical Marker

Publications (1)

Publication Number Publication Date
WO2004011876A1 true WO2004011876A1 (en) 2004-02-05

Family

ID=31190416

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2003/001087 WO2004011876A1 (en) 2002-07-25 2003-06-03 Apparatus and method for automatically arranging three dimensional scan data using optical marker

Country Status (4)

Country Link
JP (1) JP4226550B2 (en)
CN (1) CN1300551C (en)
AU (1) AU2003241194A1 (en)
WO (1) WO2004011876A1 (en)

Families Citing this family (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE602004015799D1 (en) * 2003-07-24 2008-09-25 Cognitens Ltd METHOD AND SYSTEM FOR THE THREE-DIMENSIONAL SURFACE RECONSTRUCTION OF AN OBJECT
US8111904B2 (en) 2005-10-07 2012-02-07 Cognex Technology And Investment Corp. Methods and apparatus for practical 3D vision system
WO2008056427A1 (en) * 2006-11-08 2008-05-15 Techno Dream 21 Co., Ltd. Three-dimensional shape measuring method, and device therefor
KR20080043047A (en) * 2006-11-13 2008-05-16 주식회사 고영테크놀러지 Three-dimensional image measuring apparatus using shadow moire
US8126260B2 (en) 2007-05-29 2012-02-28 Cognex Corporation System and method for locating a three-dimensional object using machine vision
JP5322206B2 (en) * 2008-05-07 2013-10-23 国立大学法人 香川大学 Three-dimensional shape measuring method and apparatus
US9734419B1 (en) 2008-12-30 2017-08-15 Cognex Corporation System and method for validating camera calibration in a vision system
JP5435994B2 (en) * 2009-03-18 2014-03-05 本田技研工業株式会社 Non-contact shape measuring device
US9533418B2 (en) 2009-05-29 2017-01-03 Cognex Corporation Methods and apparatus for practical 3D vision system
JP5375479B2 (en) * 2009-09-17 2013-12-25 コニカミノルタ株式会社 Three-dimensional measurement system and three-dimensional measurement method
CN101813461B (en) * 2010-04-07 2011-06-22 河北工业大学 Absolute phase measurement method based on composite color fringe projection
CN102346011A (en) * 2010-07-29 2012-02-08 上海通用汽车有限公司 Measuring tools and measuring methods
US9124873B2 (en) 2010-12-08 2015-09-01 Cognex Corporation System and method for finding correspondence between cameras in a three-dimensional vision system
DE102013203399A1 (en) * 2013-02-28 2014-08-28 Siemens Aktiengesellschaft Method and projection device for marking a surface
CN104315975A (en) * 2014-10-22 2015-01-28 合肥斯科尔智能科技有限公司 Linear three dimension and high precision scan method
CN104359405B (en) * 2014-11-27 2017-11-07 上海集成电路研发中心有限公司 Three-dimensional scanner
KR101788131B1 (en) * 2016-01-18 2017-10-19 한화첨단소재 주식회사 Apparatus for displaying a die mounting location of the thermosetting resin composition sheet for an electric vehicle
US20180108178A1 (en) * 2016-10-13 2018-04-19 General Electric Company System and method for measurement based quality inspection
CN107169964B (en) * 2017-06-08 2020-11-03 广东嘉铭智能科技有限公司 Method and device for detecting surface defects of cambered surface reflecting lens
CN112461138B (en) * 2020-11-18 2022-06-28 苏州迈之升电子科技有限公司 Cross scanning measurement method, measurement grating and application thereof

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5276613A (en) * 1988-12-14 1994-01-04 Etienne Schlumberger Process and device for coordinating several images of the same object
US5227985A (en) * 1991-08-19 1993-07-13 University Of Maryland Computer vision system for position monitoring in three dimensions using non-coplanar light sources attached to a monitored object
RU2123718C1 (en) * 1996-09-27 1998-12-20 Кузин Виктор Алексеевич Method for information input to computer

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6239868B1 (en) * 1996-01-02 2001-05-29 Lj Laboratories, L.L.C. Apparatus and method for measuring optical characteristics of an object
US6417917B1 (en) * 1996-01-02 2002-07-09 Lj Laboratories, Llc Apparatus and method for measuring optical characteristics of an object

Cited By (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008500524A (en) * 2004-05-25 2008-01-10 アンシディス Surface strain measuring device
JP4739330B2 (en) * 2004-05-25 2011-08-03 アンシディス Surface strain measuring device
US8434874B2 (en) 2006-05-30 2013-05-07 Panasonic Corporation Pattern projection light source and compound-eye distance measurement apparatus
WO2008017878A2 (en) * 2006-08-11 2008-02-14 The University Of Leeds Optical imaging of physical objects
WO2008017878A3 (en) * 2006-08-11 2008-04-03 Univ Heriot Watt Optical imaging of physical objects
WO2009046519A1 (en) * 2007-10-11 2009-04-16 Hydro-Quebec System and method for the three-dimensional mapping of a structural surface
US8462208B2 (en) 2007-10-11 2013-06-11 Hydro-Quebec System and method for tridimensional cartography of a structural surface
US8170329B2 (en) 2008-07-18 2012-05-01 Fuji Xerox Co., Ltd. Position measuring system, position measuring method and computer readable medium
DE102009032771B4 (en) * 2009-07-10 2017-06-29 Gom Gmbh Measuring device and method for the three-dimensional optical measurement of objects
EP2568253A1 (en) * 2010-05-07 2013-03-13 Shenzhen Taishan Online Technology Co., Ltd. Structured-light measuring method and system
JP2013525821A (en) * 2010-05-07 2013-06-20 深▲せん▼泰山在線科技有限公司 Structured light measurement method and system
EP2568253A4 (en) * 2010-05-07 2013-10-02 Shenzhen Taishan Online Tech Structured-light measuring method and system
US11077557B2 (en) 2010-05-14 2021-08-03 Cognex Corporation System and method for robust calibration between a machine vision system and a robot
DE102011114674B4 (en) * 2011-09-30 2015-01-29 Steinbichler Optotechnik Gmbh Method and device for determining the 3D coordinates of an object
US20130271573A1 (en) * 2011-09-30 2013-10-17 Steinbichler Optotechnik Gmbh Method and apparatus for determining the 3d coordinates of an object
EP2574876A1 (en) 2011-09-30 2013-04-03 Steinbichler Optotechnik GmbH Method and device for determining the 3D coordinates of an object
DE102011114674C5 (en) 2011-09-30 2020-05-28 Steinbichler Optotechnik Gmbh Method and device for determining the 3D coordinates of an object
DE102011114674A1 (en) * 2011-09-30 2013-04-04 Steinbichler Optotechnik Gmbh Method and device for determining the 3D coordinates of an object
EP2574876B1 (en) 2011-09-30 2017-11-15 Carl Zeiss Optotechnik GmbH Method and device for determining the 3D coordinates of an object
DE202012013561U1 (en) 2011-09-30 2017-12-21 Carl Zeiss Optotechnik GmbH Device for determining the 3D coordinates of an object
US10200670B2 (en) 2011-09-30 2019-02-05 Carl Zeiss Optotechnik GmbH Method and apparatus for determining the 3D coordinates of an object
US9163938B2 (en) 2012-07-20 2015-10-20 Google Inc. Systems and methods for image acquisition
WO2014014635A1 (en) * 2012-07-20 2014-01-23 Google Inc. Systems and methods for image acquisition
US20140375794A1 (en) * 2013-06-25 2014-12-25 The Boeing Company Apparatuses and methods for accurate structure marking and marking-assisted structure locating
US9789462B2 (en) * 2013-06-25 2017-10-17 The Boeing Company Apparatuses and methods for accurate structure marking and marking-assisted structure locating
DE102013110667B4 (en) 2013-09-26 2018-08-16 Deutsches Zentrum für Luft- und Raumfahrt e.V. Method for the non-destructive inspection of three-dimensional workpieces and apparatus for carrying out such a method
WO2016068598A1 (en) * 2014-10-30 2016-05-06 한국생산기술연구원 Multichannel head assembly for three-dimensional modeling apparatus, having polygon mirror rotating in single direction, and three-dimensional modeling apparatus using same
US10810788B2 (en) 2014-10-30 2020-10-20 Korea Institute Of Industrial Technology Multichannel head assembly for three-dimensional modeling apparatus, having polygon mirror rotating in single direction, and three-dimensional modeling apparatus using same
CN105232161A (en) * 2015-10-16 2016-01-13 北京天智航医疗科技股份有限公司 Surgical robot mark point recognition and location method
DE102016120026B4 (en) * 2015-10-22 2019-01-03 Canon Kabushiki Kaisha Measuring device and method, program, product manufacturing method, calibration marking element, processing device and processing system
DE102016120026A1 (en) * 2015-10-22 2017-04-27 Canon Kabushiki Kaisha Measuring device and method, program, product manufacturing method, calibration marking element, processing device and processing system
DE102017109854A1 (en) * 2017-05-08 2018-11-08 Wobben Properties Gmbh Method for referencing a plurality of sensor units and associated measuring device
WO2018206527A1 (en) 2017-05-08 2018-11-15 Wobben Properties Gmbh Method for referencing a plurality of sensor units and associated measuring device
CN108225218A (en) * 2018-02-07 2018-06-29 苏州镭图光电科技有限公司 3-D scanning imaging method and imaging device based on optical micro electro-mechanical systems
US20220122282A1 (en) * 2020-10-19 2022-04-21 Robert Bosch Gmbh Method for generating an optical marker, method for recognizing an optical marker, and marker device that includes the optical marker

Also Published As

Publication number Publication date
JP4226550B2 (en) 2009-02-18
CN1300551C (en) 2007-02-14
AU2003241194A1 (en) 2004-02-16
JP2005534026A (en) 2005-11-10
CN1672013A (en) 2005-09-21

Similar Documents

Publication Publication Date Title
WO2004011876A1 (en) Apparatus and method for automatically arranging three dimensional scan data using optical marker
US9672630B2 (en) Contour line measurement apparatus and robot system
US7456842B2 (en) Color edge based system and method for determination of 3D surface topology
CN100518488C (en) Pick and place equipment with component placement inspection
US7423666B2 (en) Image pickup system employing a three-dimensional reference object
JP2919284B2 (en) Object recognition method
US7589825B2 (en) Ranging apparatus
US20070091174A1 (en) Projection device for three-dimensional measurement, and three-dimensional measurement system
CN112161619B (en) Pose detection method, three-dimensional scanning path planning method and detection system
CN101013028A (en) Image processing method and image processor
KR100502560B1 (en) Apparatus and Method for Registering Multiple Three Dimensional Scan Data by using Optical Marker
JPWO2008026722A1 (en) 3D model data generation method and 3D model data generation apparatus
WO2006135040A1 (en) Image processing device and image processing method performing 3d measurement
CN108965690A (en) Image processing system, image processing apparatus and computer readable storage medium
CN113954085A (en) Intelligent positioning and control method of welding robot based on binocular vision and linear laser sensing data fusion
KR900002509B1 (en) Apparatus for recognizing three demensional object
JPH11166818A (en) Calibrating method and device for three-dimensional shape measuring device
US6505148B1 (en) Method for combining the computer models of two surfaces in 3-D space
US11763473B2 (en) Multi-line laser three-dimensional imaging method and system based on random lattice
JPH09212643A (en) Method for recognition of three-dimensional object and device therefor
CN100518487C (en) Method for acquiring multiple patterns in pick and place equipment
JPH09329440A (en) Coordinating method for measuring points on plural images
El-Hakim A hierarchical approach to stereo vision
CN114494316A (en) Corner marking method, parameter calibration method, medium, and electronic device
JPH0545117A (en) Optical method for measuring three-dimensional position

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NI NO NZ OM PH PL PT RO RU SC SD SE SG SK SL TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 2004524349

Country of ref document: JP

Ref document number: 20038178915

Country of ref document: CN

122 Ep: pct application non-entry in european phase