US20130128280A1 - Method for measuring three-dimension shape of target object - Google Patents


Info

Publication number
US20130128280A1
US 20130128280 A1 (application US13/729,862, filed as US201213729862A)
Authority
US
United States
Prior art keywords
solder
board
control unit
illumination
height
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/729,862
Inventor
Min Young Kim
Hee Tae Kim
Byung Min Yoo
Se Hyun Han
Seung Jun Lee
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Koh Young Technology Inc
Original Assignee
Koh Young Technology Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020060008480A external-priority patent/KR100734431B1/en
Priority claimed from KR1020060008479A external-priority patent/KR100672818B1/en
Application filed by Koh Young Technology Inc filed Critical Koh Young Technology Inc
Priority to US13/729,862 priority Critical patent/US20130128280A1/en
Assigned to KOH YOUNG TECHNOLOGY INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: KIM, MIN YOUNG; HAN, SE HYUN; LEE, SEUNG JUN; YOO, BYUNG MIN; KIM, HEE TAE
Publication of US20130128280A1 publication Critical patent/US20130128280A1/en
Abandoned legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B 11/28 Measuring arrangements characterised by the use of optical techniques for measuring areas
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B 11/25 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B 11/2531 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object using several gratings, projected with variable angle of incidence on the object, and one detection device
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/50 Depth or shape recovery
    • G06T 7/521 Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light

Definitions

  • a method of measuring a 3D shape which can measure the 3D shape of target objects on a board by searching a database for bare board information when the inspection option is set to the teaching-based inspection mode, or by performing bare board teaching when a board from a supplier for which no bare board information is available is inspected.
  • FIG. 1 is a flowchart illustrating a method of measuring a 3D shape according to the conventional art.
  • a grating pattern illumination is emitted towards a reference surface by emitting a light generated from an illumination source (not shown) towards a grating device (not shown) to acquire a reference phase corresponding to the reference surface.
  • a grating is moved by a fine pitch using a piezoelectric actuator (not shown) and emitted towards the reference surface, and a grating pattern image is acquired using a charge-coupled device (CCD) camera and a grabber (not shown).
  • a bucket algorithm is applied to the grating pattern image.
  • the reference phase with respect to the reference surface is acquired.
  • a measuring object is placed on a moving table and a light generated from the illumination source is emitted towards a measuring surface of the measuring object to acquire a phase of the measuring object.
  • the grating is moved by a fine pitch using the piezoelectric actuator to apply the bucket algorithm, and the grating pattern image reflected from the measuring surface is acquired via the CCD camera and the grabber.
  • the bucket algorithm is applied to the grating pattern image.
  • an object phase of the measuring object is acquired.
  • when the object phase is acquired, the object phase is subtracted from the reference phase in operation S 21 , and a moire phase is acquired in operation S 21 .
  • the moire phase is acquired, the moire phase is unwrapped in operation S 22 , and actual height information of the measuring object is acquired by using a result of unwrapping.
  • the conventional 3D shape measuring method has a problem in that the operator becomes fatigued and productivity may be reduced, since each measuring condition must be calculated manually before the measurement is performed whenever a totally new measuring object, not an ongoing measuring object, is measured.
  • An objective of the present invention is to provide a method of measuring a 3D shape which can measure the 3D shape of target objects according to the normal inspection mode when a measuring object is set to the normal inspection mode, and which can also measure the 3D shape of target objects on the board by searching a database for bare board information when the inspection option is set to the teaching-based inspection mode, or by performing bare board teaching when a board from a supplier for which no bare board information is available is inspected, thereby improving the productivity of an electronic board manufacturing line.
  • Another objective of the present invention is to improve the measurement quality of the 3D shape of target objects by measuring their 3D shape with a consistent illumination source brightness at each illumination level, the levels being predefined prior to machine operation.
  • a method of measuring a 3 dimensional (3D) shape including: measuring a brightness of a first illumination source by controlling, via a central control unit, a module control unit and an image acquisition unit; measuring a phase-to-height conversion factor by controlling, via the central control unit, the module control unit and the image acquisition unit after the brightness measurement of the first illumination source is completed; determining whether the measurement is performed in a normal inspection mode after the brightness of the first illumination source and the phase-to-height conversion factor are measured and calculated; measuring a 3D shape of a board according to the normal inspection mode by controlling, via the central control unit, the module control unit and the image acquisition unit when it is the normal inspection mode as a result of the determination; searching a database and determining, via the central control unit, whether bare board information about the board is in the database when it is not the normal inspection mode as a result of the determination; performing bare board teaching by controlling, via the central control unit, the module control unit and the image acquisition unit when the bare board information is not in the database; measuring the 3D shape of the board according to a teaching-based inspection mode by controlling, via the central control unit, the module control unit and the image acquisition unit when the bare board information is in the database or when bare board teaching information is generated by performing the bare board teaching; and analyzing, via the central control unit, whether the 3D shape of target objects on the board is normal or abnormal by using the measured 3D shape information of the objects according to the normal inspection mode and the teaching-based inspection mode.
  • FIG. 1 is a flowchart illustrating a method of measuring a 3D shape according to a conventional art
  • FIG. 2 is a diagram illustrating a 3D shape measuring system for a 3D shape measuring method according to the present invention
  • FIGS. 3A through 3C illustrate a configuration of a board, a bare board, and a calibration target
  • FIG. 4 is a flowchart illustrating a method of measuring a 3D shape according to the present invention
  • FIG. 5 is a flowchart illustrating an operation of measuring a brightness of a first illumination source shown in FIG. 4 ;
  • FIG. 6 is a flowchart illustrating an operation of measuring a phase-to-height conversion factor shown in FIG. 4 ;
  • FIGS. 7A and 7B are flowcharts illustrating an operation of measuring a 3D shape of a board according to a normal inspection mode shown in FIG. 4 ;
  • FIGS. 8A and 8B are flowcharts illustrating a bare board teaching operation shown in FIG. 4 ;
  • FIGS. 9A and 9B are flowcharts illustrating an operation of measuring a 3D shape of a board according to a teaching-based inspection mode shown in FIG. 4 .
  • FIG. 2 is a diagram illustrating a 3D shape measuring system for a 3D shape measuring method according to the present invention.
  • the 3D shape measuring apparatus includes a central control unit 10 , a module control unit 20 , an image acquisition unit 30 , at least one pattern projector 40 , a second illumination source 50 , an X-Y table 61 , a table moving device 60 , and a camera 70 .
  • a configuration of each element will be described.
  • a charge coupled device (CCD) camera or a complementary metal oxide semiconductor (CMOS) camera is utilized for the camera 70 .
  • a second illumination source 50 , an optical filter 71 , and a lens 72 are provided below the camera 70 .
  • a plurality of light emitting diodes (LEDs) formed in a shape of a circle, or a circular lamp, is utilized for the second illumination source 50 , and the second illumination source 50 is utilized as an illuminator for measuring particular shapes at specific locations on a board 62 or a bare board 63 , which corresponds to a measuring object.
  • the table moving device 60 drives the X-Y table 61 which is positioned below the camera 70 and thereby moves the board 62 , the bare board 63 , or a calibration target 64 to predefined measurement locations so that the camera 70 may take images of the board 62 , the bare board 63 or the calibration target 64 .
  • At least one pattern projector 40 of the 3D shape measuring system, which is indicated by a solid line and a dotted line as shown in FIG. 2 , is provided.
  • Each of the at least one pattern projector 40 is inclined to one side or the other side of the camera 70 , which takes images of the board 62 , the bare board 63 or the calibration target 64 .
  • the pattern projector 40 includes an illumination part 41 , a grating moving device 42 , a grating device 43 , and a lens 44 .
  • the illumination part 41 includes a first illumination source 41 a and a plurality of lenses 41 b and 41 c . An illumination generated from the first illumination source 41 a passes through the plurality of lenses 41 b and 41 c , and is emitted toward the grating device 43 and then toward the board 62 , the bare board 63 or the calibration target 64 .
  • the image acquisition unit 30 receives the image taken by the camera 70 and transmits the received image to the central control unit 10 .
  • the module control unit 20 includes a table controller 21 , a grating controller 22 , and an illumination controller 23 .
  • the illumination controller 23 controls the first illumination source 41 a of the illumination part 41 or the second illumination source 50
  • the grating controller 22 controls the grating moving device 42
  • the table controller 21 controls the table moving device 60 .
  • the central control unit 10 includes a control board 11 , an image processing board 12 , and an interface board 13 .
  • the central control unit 10 transmits/receives a control signal or control information to the module control unit 20 and the image acquisition unit 30 via the interface board 13 , the image processing board 12 processes an image received from the image acquisition unit 30 , and the control board 11 generally controls the 3D shape measuring apparatus of the present invention.
  • the central control unit 10 searches a database 80 for bare board information of a new board supplier or stores the bare board information which is acquired in a teaching-based inspection mode.
  • FIG. 2 is a diagram illustrating a 3D shape measuring system for a 3D shape measuring method according to the present invention
  • FIGS. 3A through 3C illustrate a configuration of a board, a bare board, and a calibration target
  • FIG. 4 is a flowchart illustrating a method of measuring a 3D shape according to the present invention.
  • an initial setup operation of the 3D shape measuring system is performed before measuring the 3D shape.
  • the central control unit 10 controls the module control unit 20 and the image acquisition unit 30 to measure a brightness of the first illumination source 41 a .
  • the central control unit 10 controls the module control unit 20 and the image acquisition unit 30 to measure a phase-to-height conversion factor.
  • the central control unit 10 determines whether the measurement is performed in a normal inspection mode.
  • when an operator selects a normal inspection mode or a teaching-based inspection mode, by using information input via an input device, such as a keyboard (not shown), or by using a job program pre-installed in the 3D shape measuring system, the central control unit 10 recognizes and determines the selected mode.
  • operation S 400 when the central control unit 10 determines it is the normal inspection mode in operation S 300 , the central control unit 10 controls the module control unit 20 and the image acquisition unit 30 to measure the 3D shape of target objects on the board 62 according to the normal inspection mode. Conversely, when the central control unit 10 determines it is not the normal inspection mode in operation S 300 , the central control unit performs operation S 500 of searching the database 80 for bare board information of the board 62 .
  • the table controller 21 drives the table moving device 60 , which moves the X-Y table 61 , so that the board 62 is moved to a measurement location.
  • the central control unit 10 controls the illumination controller 23 to switch on the second illumination source 50 .
  • the central control unit 10 calculates location information about a particular part of the board 62 .
  • the particular part of the board 62 or the bare board 63 indicates a mark (not shown) that is distinguishable for each manufacturer or each product adopting the board 62 or the bare board 63 .
  • the central control unit 10 searches the database 80 for bare board information which is identical to the image information of the particular part.
  • the central control unit 10 controls the module control unit 20 and the image acquisition unit 30 to perform a bare board teaching.
  • the central control unit 10 controls the module control unit 20 and the image acquisition unit 30 to measure the 3D shape of target objects on the board 62 .
  • in operation S 800 , the central control unit 10 analyzes whether the board 62 is normal or abnormal by using information about the measured 3D shape, and thereby, after the 3D shape of the board 62 is measured, determines whether the solder 62 e formed on the board 62 is defective.
  • since the 3D shape of target objects on the board 62 is measured according to either the normal inspection mode or the teaching-based inspection mode, the 3D shape of target objects on the board may be acquired more readily and efficiently; the overall flow is sketched below.
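  • The following is a minimal sketch, in Python, of the overall inspection flow described above (operations S100 through S800). The machine and database interfaces and all of their method names are assumptions introduced for illustration only; they are not taken from the patent.

```python
# Hypothetical high-level driver for the inspection flow (operations S100-S800).
# `machine` stands in for the central control unit / module control unit /
# image acquisition unit, and `database` for the database 80; every method
# name here is an assumed placeholder.
def inspect_board(machine, database, board_id):
    machine.measure_first_illumination_brightness()       # S100: build the illumination index table
    machine.measure_phase_to_height_factor()              # S200: calibrate with the target
    if machine.normal_inspection_mode_selected():         # S300: mode chosen by operator or job program
        shape = machine.measure_normal_mode(board_id)     # S400: normal inspection mode
    else:
        info = database.find_bare_board_info(board_id)    # S500: look up bare board information
        if info is None:
            info = machine.perform_bare_board_teaching(board_id)   # S600: teach and store
        shape = machine.measure_teaching_mode(board_id, info)      # S700: teaching-based inspection
    return machine.analyze_good_or_bad(shape)             # S800: pass/fail from the solder metrics
```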
  • the central control unit 10 sets a range of an illumination adjustment command value and then a brightness of the first illumination source 41 a is adjusted by the illumination controller 23 of the module control unit 20 according to the set adjustment command value.
  • the calibration target 64 is utilized to adjust the brightness of the first illumination source 41 a .
  • the calibration target 64 includes a plane surface 64 a and a stepped difference 64 b , and is formed in a gray color.
  • the calibration target 64 is applied when measuring the brightness of the first illumination source 41 a or calculating the phase-to-height conversion factor in the initial setup operation for measuring the 3D shape.
  • the table controller 21 of the module control unit 20 drives the table moving device 60 , which moves the X-Y table 61 , and the calibration target 64 is moved to a measurement location.
  • the measurement location indicates a location where the camera 70 may take an image of the calibration target 64 .
  • the central control unit 10 sets the ranges of the illumination adjustment command value. In this instance, when a user inputs information about the adjustment command value using an input device, such as a keyboard (not shown), and the like, the central control unit 10 recognizes the input information and sets the range of the adjustment command value. In operation S 114 , when the range of the illumination adjustment command value is set, the central control unit 10 controls the illumination controller 23 to adjust the brightness of the first illumination source 41 a according to the set adjustment command value.
  • in operation S 120 , when the brightness of the first illumination source 41 a is adjusted, the grating controller 22 of the module control unit 20 drives the grating moving device 42 to move the grating device 43 an N number of times; the camera 70 takes an image of the calibration target 64 for each movement, and the image acquisition unit 30 acquires the images of the calibration target 64 .
  • in operation S 130 , when the image acquisition unit 30 acquires the N images of the calibration target 64 , the acquired images are received via the image processing board 12 and the interface board 13 of the central control unit 10 , the received images are averaged, and an average image is calculated.
  • the averaging produces an average image in which the grating pattern is eliminated from the N images taken by the camera 70 while the grating device 43 is moved the N number of times to acquire the image of the calibration target 64 .
  • the image of the calibration target 64 is acquired by moving the grating device 43 at least four times and the average image process is performed.
  • the central control unit 10 sets a representative brightness value of the calculated average image to an illumination brightness of a corresponding illumination adjustment command value.
  • the central control unit 10 determines whether the adjustment command value is maximum. When the illumination adjustment command value is maximum within the set range of the adjustment command value, the central control unit 10 determines the brightness adjustment of the first illumination source 41 a is completed.
  • the central control unit 10 defines the illumination brightness corresponding to each adjustment command value.
  • when the illumination brightness corresponding to each adjustment command value is defined, the central control unit 10 compiles the illumination brightness corresponding to each adjustment command value into an illumination index table.
  • the illumination index table defines the illumination brightness according to each adjustment command value. Therefore, when measuring the 3D shape of the board 62 or the bare board 63 by using the first illumination source 41 a , the illumination brightness may be linearly adjusted by using the illumination index table, and thus the measurement quality of a 3D shape may be improved, as sketched below.
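  • The sketch below illustrates, under assumed helper callbacks (set_brightness, grab_target_image), how such an illumination index table could be built: for each adjustment command value the grating is shifted N times, the N images of the gray calibration target are averaged so the grating pattern cancels out, and a representative brightness of the average image is recorded.

```python
import numpy as np

def build_illumination_index_table(command_values, n_steps, set_brightness, grab_target_image):
    """Map each illumination adjustment command value to a representative brightness.

    set_brightness(cmd) and grab_target_image(step) are assumed callbacks that
    drive the illumination controller and return one camera frame per grating step.
    """
    table = {}
    for cmd in command_values:
        set_brightness(cmd)                                             # adjust the first illumination source
        frames = [grab_target_image(step) for step in range(n_steps)]   # move the grating N times
        average = np.mean(np.stack(frames), axis=0)                     # grating pattern averages out
        table[cmd] = float(np.mean(average))                            # representative brightness value
    return table
```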
  • the central control unit 10 determines whether the plurality of first illumination sources 41 a is provided.
  • operation S 162 when the plurality of first illumination sources 41 a is provided, any one of the plurality of first illumination sources 41 a is switched off and any one of remaining first illumination sources 41 a is switched on.
  • the remaining first illumination source 41 a is indicated by the dotted line in FIG. 2 .
  • the central control unit 10 determines whether the adjustment command value of the remaining first illumination source 41 a is maximum.
  • the central control unit 10 compares, for each adjustment command value, the illumination brightness of any one of the plurality of first illumination sources 41 a with the illumination brightness of the remaining first illumination source 41 a , and selects the smaller illumination brightness value as the brightness of the total illumination system for that adjustment command value.
  • the central control unit 10 calculates a new adjustment command value for each first illumination source corresponding to the selected illumination brightness of the total illumination system, and then redefines the illumination adjustment command values of the plurality of first illumination sources 41 a corresponding to the selected illumination brightness of the total illumination system.
  • the illumination brightness defined for each adjustment command value is compiled into the illumination index table.
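  • As one possible reading of the two-projector case above, the sketch below takes the smaller of the two measured brightnesses at each command value as the brightness of the total illumination system and then, per projector, picks the command value whose own brightness is closest to that target. The matching rule is an assumption; the text only states that the smaller value is selected and that the command values are redefined.

```python
def match_dual_source_commands(table_a, table_b):
    """table_a, table_b: dicts mapping command value -> measured brightness per projector."""
    matched = {}
    for cmd in table_a:
        target = min(table_a[cmd], table_b[cmd])                      # system brightness at this level
        cmd_a = min(table_a, key=lambda c: abs(table_a[c] - target))  # closest command for projector A
        cmd_b = min(table_b, key=lambda c: abs(table_b[c] - target))  # closest command for projector B
        matched[cmd] = (target, cmd_a, cmd_b)                         # redefined command values per source
    return matched
```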
  • operation S 200 when the brightness measurement of the first illumination source 41 a is completed, the central control unit 10 controls the module control unit 20 and the image acquisition unit 30 to measure the phase-to-height conversion factor. As shown in FIGS. 2 through 4 and FIG. 6 , operation S 210 of controlling, by the central control unit 10 , the illumination controller 23 of the module control unit 20 to adjust the brightness of the first illumination source 41 a according to the selected adjustment command value of the total illumination system is performed.
  • the table controller 21 drives the table moving device 60 , which moves the X-Y table 61 , and thus the calibration target 64 is moved to the measurement location.
  • the first illumination source 41 a is switched on by the illumination controller 23 .
  • the central control unit 10 selects an adjustment command value.
  • the central control unit 10 controls the illumination controller 23 to adjust the brightness of the first illumination source 41 a to the illumination brightness corresponding to the selected adjustment command value.
  • the central control unit 10 determines whether a measurement portion of the calibration target 64 corresponds to a plane surface 64 a . Whether the measurement portion of the calibration target 64 corresponds to the plane surface 64 a is determined by the central control unit 10 by using the image, taken by the camera 70 , in a state where the second illumination source 50 is switched on. When the measurement portion of the calibration target 64 does not correspond to the plane surface 64 a , the central control unit 10 drives the table moving device 60 to move the plane surface 64 a of the calibration target 64 to a focus location of the camera 70 or the plane surface 64 a is manually moved by the operator.
  • the central control unit 10 sets the plane surface 64 a of the calibration target 64 as an inspection area.
  • the grating controller 22 drives the grating moving device 42 to move the grating device 43 the N number of times; for each movement, a grating pattern illumination is emitted towards the plane surface 64 a , and an image of the calibration target reflected from the plane surface 64 a is taken via the camera 70 .
  • the image acquisition unit 30 acquires the taken images of the plane surface 64 a .
  • the central control unit 10 acquires a phase map of the plane surface 64 a by using an N-bucket algorithm and the acquired images, and stores it as the phase map of a first reference surface m.
  • Information such as the phase map of the first reference surface m, is stored in a storage device (not shown), such as a hard disk that connects with the control board 11 of the central control unit 10 , and the like.
  • operation S 220 is re-performed.
  • operation S 260 when the measurement portion does not correspond to the plane surface 64 a , the central control unit 10 sets a stepped difference 64 b of the calibration target 64 as the inspection area.
  • in operation S 270 , when the stepped difference 64 b is set as the inspection area, the grating controller 22 drives the grating moving device 42 to move the grating device 43 the N number of times; for each movement, a grating pattern illumination is emitted towards the stepped difference 64 b , and an image of the calibration target 64 reflected from the stepped difference 64 b is taken via the camera 70 .
  • the image acquisition unit 30 acquires the taken images of the stepped difference 64 b .
  • the central control unit 10 acquires a phase map of the stepped difference 64 b by using an N-bucket algorithm and the acquired images.
  • phase-to-height conversion factor of each pixel is calculated and stored by using the acquired phase maps.
  • the phase-to-height conversion factor is required to convert a phase into a height value when calculating a phase of each point by using the N-bucket algorithm and then calculating the height value of a corresponding point by using the calculated phase.
  • the central control unit 10 calculates a relative height phase of the stepped difference 64 b with respect to the first reference surface m by using phase information about the first reference surface m and the phase map of the stepped difference 64 b .
  • the central control unit 10 calculates the phase-to-height conversion factor by using the relative height phase of the stepped difference 64 b , pattern period information of the stepped difference 64 b , and a known height of the stepped difference 64 b of the calibration target 64 .
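  • A simplified per-pixel sketch of this calculation is given below: with a known physical step height, the factor that converts a relative phase into a height is approximately known_height / relative_phase at each pixel. The pattern-period term mentioned in the text is folded into this ratio here, and the guard against near-zero phases is an added assumption.

```python
import numpy as np

def phase_to_height_factor(step_phase, plane_phase, known_step_height):
    """Per-pixel conversion factor from a calibration target of known step height."""
    relative_phase = step_phase - plane_phase            # phase of the step relative to the reference plane
    with np.errstate(divide="ignore", invalid="ignore"):
        factor = np.where(np.abs(relative_phase) > 1e-6,
                          known_step_height / relative_phase,
                          0.0)                           # avoid dividing by (near) zero on flat pixels
    return factor
```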
  • Operation S 290 is performed with respect to each of the plurality of first illumination sources 41 a when the plurality of first illumination sources 41 a , as indicated by the solid line and the dotted line in FIG. 2 , is provided. More specifically, when the plurality of first illumination sources 41 a is provided, in operation S 291 , the phase-to-height conversion factor of each pixel is calculated by using the phase map, which is acquired according to the grating pattern illumination generated from any one of the first illumination sources 41 a.
  • the central control unit 10 determines whether the plurality of first illumination sources 41 a is provided. In operation S 293 , when the phase-to-height conversion factor of each pixel according to any one of the plurality of first illumination sources 41 a is calculated, the central control unit 10 controls the illumination controller 23 of the module control unit 20 to switch off any one of the plurality of first illumination sources 41 a where the phase-to-height conversion factor is calculated, and switch on a remaining first illumination source 41 a.
  • the central control unit 10 determines whether the phase-to-height conversion factor of each pixel according to the grating pattern illumination from the remaining first illumination source 41 a is calculated. In operation S 295 , when the phase-to-height conversion factor of each pixel is calculated, the central control unit 10 stores the phase-to-height conversion factor of each pixel according to each of the plurality of first illumination sources 41 a.
  • the central control unit 10 returns to operation S 210 of controlling the illumination controller 23 of the module control unit 20 to adjust the brightness of the first illumination source 41 a according to the selected adjustment command value.
  • operation S 400 of measuring the 3D shape of target objects on the board 62 according to the normal inspection mode is performed. More specifically, in operation S 410 of operation S 400 , as shown in FIGS. 2 through 4 and FIGS. 7A and 7B , the table controller 21 drives the table moving device 60 , which moves the X-Y table 61 , and the board 62 is moved to the measurement location.
  • the central control unit 10 controls the illumination controller 23 to adjust the brightness of the first illumination source 41 a according to the selected adjustment command value.
  • the central control unit 10 controls the module control unit 20 and the image acquisition unit 30 to acquire the phase map of the board 62 , and calculates the relative height phase with respect to a first reference surface m.
  • the central control unit 10 calculates a phase histogram by using the relative height phase with respect to the first reference surface m, and calculates the 3D shape of target objects on the board 62 by using the calculated phase histogram.
  • the method of calculating the centroid of the second reference surface n and the solder 62 e initially separates the second reference surface n and the solder 62 e by using pre-stored dimensional information of the board 62 .
  • the central control unit 10 separates the solder 62 e by using the calculated second reference surface n.
  • the central control unit 10 calculates the centroid of the second reference surface n and the solder 62 e by using a centroid method.
  • the central control unit 10 calculates a representative height of the solder 62 e by using the centroid of the second reference surface n and the centroid of the solder 62 e .
  • the central control unit 10 calculates a volume, a height distribution, and a positional offset of the solder 62 e by using the calculated representative height.
  • the central control unit 10 calculates a phase histogram by using the stored combined height phase, which is the same as when only the single first illumination source 41 a is provided.
  • the central control unit 10 separates the second reference surface n and the solder 62 e from the calculated phase histogram, and calculates the centroid of the second reference surface n and the solder 62 e .
  • the central control unit 10 calculates the representative height of the solder 62 e by using the centroid of the second reference surface n and the centroid of the solder 62 e .
  • the central control unit 10 calculates a volume, a height distribution, and a positional offset of the solder 62 e by using the calculated representative height.
  • the height distribution of the solder 62 e is calculated based on the second reference surface n, and the volume of the solder 62 e is calculated by multiplying the phase-to-height conversion factor of each pixel and phase information of the solder 62 e , and summing up the results of the multiplications.
  • the positional offset of the solder 62 e is calculated depending upon how far the solder 62 e is located from the center of a conductive pad 62 d , by using location information of the solder 62 e which is calculated by using the volume of the solder 62 e . Based on the calculated volume, height distribution, and positional offset information of the solder, whether the board is good or bad is determined automatically, as sketched below.
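  • The sketch below assembles these solder metrics under assumed inputs (a solder mask, a mask for the second reference surface n, and the pad center); the use of simple means as centroids and the omission of any pixel-area scaling of the volume are simplifying assumptions.

```python
import numpy as np

def solder_metrics(height_phase, factor, solder_mask, ref_mask, pad_center):
    """Representative height, volume, and positional offset of a solder region."""
    # Representative heights via the centroid (mean) of each region's height values.
    ref_height = float(np.mean(height_phase[ref_mask] * factor[ref_mask]))          # second reference surface n
    solder_height = float(np.mean(height_phase[solder_mask] * factor[solder_mask]))
    representative_height = solder_height - ref_height                              # solder height above n

    # Volume: per-pixel phase times per-pixel conversion factor, summed over the solder region.
    volume = float(np.sum(height_phase[solder_mask] * factor[solder_mask]))

    # Positional offset: distance from the solder's centroid to the pad center.
    ys, xs = np.nonzero(solder_mask)
    centroid = np.array([xs.mean(), ys.mean()])
    offset = float(np.linalg.norm(centroid - np.asarray(pad_center, dtype=float)))
    return representative_height, volume, offset
```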
  • the bare board 63 is moved to the measurement location in operation S 600 .
  • the table controller 21 drives the table moving device 60 , which moves the X-Y table 61 , whereby the bare board 63 is moved to the measurement location.
  • the bare board 63 includes a base plate 62 a , a conductive pattern 62 b , a solder mask 62 c , and the conductive pad 62 d.
  • the central control unit 10 controls the illumination controller 23 to adjust the brightness of the first illumination source 41 a according to the selected adjustment command value.
  • the central control unit 10 controls the module control unit 20 and the image acquisition unit 30 to acquire the phase map of the bare board 63 , and calculates the relative height phase with respect to the first reference surface m.
  • the first reference surface m indicates the base plate 62 a of the bare board 63 , and is calculated by using pre-given bare board information.
  • the central control unit 10 stores location information and image information about a particular part of the bare board 63 , as bare board information, in the database 80 .
  • operation S 640 may be performed in a different way with respect to when only a single first illumination source 41 a is provided and when a plurality of first illumination sources 41 a is provided. Initially, performing operation S 640 when the single first illumination source 41 a is provided will be described.
  • operation S 642 when the relative height phase with respect to the first reference surface m is calculated, the central control unit 10 stores the relative height phase with respect to the first reference surface m as height phase information of the bare board 63 .
  • operation S 643 when the height phase information is stored, the central control unit 10 determines whether teaching with respect to all areas of the bare board 63 is completed.
  • the central control unit 10 controls the illumination controller 23 of the module control unit 20 to switch off the first illumination source 41 a and switch on the second illumination source 50 .
  • the central control unit 10 controls the image acquisition unit 30 to acquire the image with respect to the particular part of the bare board 63 using the camera 70 , and stores the acquired image, and also calculates location information with respect to the particular part of the bare board 63 , and stores the calculated location information and the image information in the database 80 .
  • in operation S 641 , when the relative height phase with respect to the first reference surface m of the bare board 63 according to any one of the plurality of first illumination sources 41 a and the relative height phase with respect to the first reference surface m of the bare board 63 according to a remaining first illumination source 41 a are calculated, the central control unit 10 calculates the combined height phase in which noise is removed from the relative height phases with respect to the first reference surface m. Operations after calculating the combined height phase are identical to those performed when only the single first illumination source 41 a is provided. In operation S 642 , when the combined height phase is calculated, the central control unit 10 stores the calculated combined height phase as height phase information, as sketched below.
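  • The text does not specify how the two projectors' relative height phases are merged into the noise-reduced combined height phase; the sketch below shows one simple possibility (a per-pixel average with optional validity masks, for example to skip shadowed pixels), purely as an assumption.

```python
import numpy as np

def combine_height_phases(phase_a, phase_b, valid_a=None, valid_b=None):
    """Combine two relative height phase maps into one, averaging where both are usable."""
    valid_a = np.ones_like(phase_a, dtype=bool) if valid_a is None else valid_a
    valid_b = np.ones_like(phase_b, dtype=bool) if valid_b is None else valid_b
    both = valid_a & valid_b
    combined = np.where(both, 0.5 * (phase_a + phase_b),      # average where both projectors are valid
                        np.where(valid_a, phase_a, phase_b))  # otherwise keep whichever remains
    return combined
```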
  • the central control unit 10 determines whether teaching with respect to all areas of the bare board 63 is completed. In operation S 644 , when the teaching with respect to all areas of the bare board 63 is completed, the central control unit 10 controls the illumination controller 23 of the module control unit 20 to switch off the plurality of first illumination sources 41 a and switch on the second illumination source 50 . In this instance, all of the plurality of first illumination sources 41 a are switched off.
  • the central control unit 10 controls the image acquisition unit 30 to acquire the image with respect to the particular part of the bare board 63 using the camera 70 , and stores the acquired image, and also calculates location information with respect to the particular part of the bare board 63 , and stores the calculated location information in the database 80 .
  • operation S 700 of measuring the 3D shape of target objects on the board 62 according to a teaching-based inspection mode, which is performed when the bare board information is included in operation S 500 , will be described with reference to FIGS. 2 through 4 , and FIGS. 9A and 9B .
  • bare board information corresponding to the board 62 is read from the database 80 .
  • the central control unit 10 controls the illumination controller 23 of the module control unit to switch off the second illumination source 50 when the second illumination source 50 is switched on.
  • the table controller 21 drives the table moving device 60 , which moves the X-Y table 61 , whereby the board 62 is moved to the measurement location.
  • the central control unit 10 controls the illumination controller 23 to adjust the brightness of the first illumination source 41 a according to the selected adjustment command value.
  • the central control unit 10 controls the module control unit 20 and the image acquisition unit 30 to acquire the phase map of the board 62 and calculates the relative height phase with respect to the first reference surface m.
  • the central control unit 10 calculates the phase histogram by using the relative height phase with respect to the first reference surface m, and calculates the 3D shape of the board 62 .
  • operation S 760 may be performed in a different way with respect to when only a single first illumination source 41 a is provided and when a plurality of first illumination sources 41 a is provided.
  • operation S 760 when only the single first illumination source 41 a is provided will be described.
  • operation S 762 when the relative height phase with respect to the first reference surface m of the board 62 according to the grating pattern illumination generated from the first illumination source 41 a is calculated, the central control unit 10 stores the relative height phase with respect to the first reference surface m as height phase information of the board 62 .
  • operation S 763 when the height phase information of the board 62 is stored, the central control unit 10 separates height phase information of the solder 62 e in a corresponding inspection location by using the height phase information of the bare board 63 in the database 80 and the height phase information of the board 62 stored in operation S 762 .
  • the central control unit 10 calculates actual height information from the relative height phase information of the solder 62 e , and calculates a volume, a height distribution, and a positional offset of the solder 62 e by using the calculated actual height information. Specifically, in operation S 764 , when the height phase information of the solder 62 e is separated, the central control unit 10 calculates the actual height information from the separated height phase information of the solder 62 e , and calculates the volume, the height distribution, and the positional offset of the solder 62 e.
  • operation S 761 when the relative height phase with respect to the first reference surface m of the board 62 according to any one of the plurality of first illumination sources 41 a and the relative height phase with respect to the first reference surface m of the board 62 according to a remaining first illumination source 41 a are calculated, the central control unit 10 calculates a combined height phase where noise is removed from the relative height phase with respect to each first reference surface m.
  • the central control unit 10 stores the calculated combined height phase as height phase information.
  • the central control unit 10 separates height phase information of the solder 62 e by using the height phase information of the bare board 63 stored in the database 80 and the height phase information of the board 62 stored in operation S 762 .
  • the central control unit 10 calculates actual height information from the separated height phase information of the solder 62 e and calculates the volume, the height distribution, and the positional offset of the solder 62 e.
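  • A minimal sketch of this teaching-based separation, assuming the bare-board height phase retrieved from the database is already aligned with the populated board's height phase at the same inspection location:

```python
import numpy as np

def separate_solder_height(board_height_phase, bare_board_height_phase, factor):
    """Isolate the solder's contribution and convert it to an actual height map."""
    solder_phase = board_height_phase - bare_board_height_phase   # solder-only height phase
    solder_height = solder_phase * factor                         # per-pixel actual height
    return solder_phase, solder_height
```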
  • Operations S 420 , S 620 , and S 740 may be performed in a different way with respect to when a single first illumination source 41 a is provided and when a plurality of first illumination sources 41 a is provided.
  • performing operations S 420 , S 620 , and S 740 when the single first illumination source 41 a is provided will be described with reference to FIGS. 7A , 8 A, and 9 A.
  • the central control unit 10 controls the illumination controller 23 to switch on the first illumination source 41 a .
  • the central control unit 10 selects a pre-input adjustment command value.
  • the central control unit 10 controls the illumination controller 23 to adjust the brightness of the first illumination source 41 a according to the selected adjustment command value.
  • the central control unit 10 determines whether any one of the plurality of first illumination sources 41 a is selected. In operations S 422 , S 622 , and S 742 , when the selected first illumination source 41 a is determined, the central control unit 10 controls the illumination controller 23 to switch on the selected first illumination source 41 a . In operations S 423 , S 623 , and S 743 , when the selected first illumination source 41 a is switched on, the central control unit 10 selects the pre-input adjustment command value. In operations S 424 , S 624 , and S 744 , when the adjustment command value is selected, the central control unit 10 controls the illumination controller 23 to adjust the brightness of the first illumination source 41 a according to the selected adjustment command value.
  • the central control unit 10 controls the illumination controller 23 to switch on a remaining first illumination source 41 a .
  • the central control unit 10 selects the pre-input adjustment command value.
  • the central control unit 10 controls the illumination controller 23 to adjust the brightness of the remaining first illumination source 41 a according to the selected adjustment command value.
  • operations S 430 , S 630 , and S 750 of calculating the relative height phase with respect to the first reference surface m are performed respectively.
  • operations S 430 , S 630 , and S 750 may be performed in a different way with respect to when a single first illumination source 41 a is provided and when a plurality of first illumination sources 41 a is provided.
  • performing operations S 430 , S 630 , and S 750 when the single first illumination source 41 a is provided will be described with reference to FIGS. 7A , 8 A, and 9 A.
  • a concept of expanding the inspection area is applied to calculate the height of the solder 62 e based on the second reference surface n of the board 62 that includes the conductive pattern 62 b , the solder mask 62 c , the conductive pad 62 d , and the solder 62 e , which are formed on the base plate 62 a , as shown in FIG. 3A .
  • the second reference surface n indicates the height from a bottom surface of the board 62 to a top surface of the solder mask 62 c and the conductive pad 62 d , and the height corresponding to a centroid from the first reference surface m to the solder mask 62 c and the conductive pad 62 d .
  • as shown in FIGS. 3A and 3B , when an area A is set as the inspection area of the board 62 or the bare board 63 and the inspection is started, the inspection area is expanded to an area B so as to calculate a height value of the second reference surface n.
  • in operations S 433 , S 633 , and S 753 , when the image acquisition unit 30 acquires the image in operations S 431 , S 631 , and S 751 , the central control unit 10 calculates a phase map by using an N-bucket algorithm, and stores the calculated phase map.
  • in operations S 434 , S 634 , and S 754 , when the phase map is calculated and stored, the central control unit 10 calculates the relative height phase with respect to the first reference surface m in a corresponding inspection location by using a difference between a pre-stored phase map of the first reference surface m and the phase map stored in the central control unit 10 .
  • likewise, when the plurality of first illumination sources 41 a is provided, in operations S 433 , S 633 , and S 753 , when the image acquisition unit 30 acquires the image in operations S 431 , S 631 , and S 751 , the central control unit 10 calculates a phase map by using an N-bucket algorithm and stores the calculated phase map.
  • in operations S 434 , S 634 , and S 754 , when the phase map is calculated and stored, the central control unit 10 calculates the relative height phase with respect to the first reference surface m in a corresponding inspection location by using a difference between a pre-stored phase map of the first reference surface m and the phase map stored in the central control unit 10 .
  • the relative height phase according to the remaining first illumination source 41 a is calculated by performing operations S 436 , S 636 , and S 756 of expanding the inspection area, operations S 437 , S 637 , and S 757 of calculating and storing the phase map using N-bucket algorithm, and operations S 438 , S 638 , and S 758 of calculating the relative height phase with respect to the first reference surface m.
  • the 3D shape of target objects on the board 62 can be measured by using the calculated relative height phase. Also, goodness and badness of the solder 62 e of the board 62 can be determined by using the result of the measurement.
  • a method of measuring a 3D shape is provided which can measure the 3D shape of target objects on a board according to a normal inspection mode when a measuring object is set to the normal inspection mode, and which can also measure the 3D shape of target objects on the board by searching a database for bare board information when the measuring object is not set to the normal inspection mode, or by performing bare board teaching when the board is supplied from a supplier for which no bare board information is available, thereby improving the productivity of an electric circuit board manufacturing line.
  • according to the present invention, the measurement quality of a 3D shape is improved by measuring the 3D shape while maintaining the brightness of the illumination source, which is applied to measure the 3D shape, at a consistent level for each operation.

Abstract

A method of measuring a 3D shape, which can measure a 3D shape of target objects on a board by searching a database for bare board information when a measuring object is not set to a normal inspection mode, or by performing bare board teaching when the board is supplied from a supplier for which no bare board information is available, is provided. The method of measuring a 3D shape includes operation S100 of measuring a brightness of a first illumination source 41 a, operation S200 of measuring a phase-to-height conversion factor, operation S300 of determining whether the measurement is performed in a normal inspection mode, operation S400 of measuring a 3D shape of a board 62 according to the normal inspection mode, operation S500 of determining whether bare board information about the board 62 is included, operation S600 of performing bare board teaching when the bare board information is excluded, operation S700 of measuring the 3D shape of target objects on the board 62 when the bare board information is included or bare board teaching information is generated, and operation S800 of analyzing whether the board 62 is normal or abnormal by using 3D shape information. Therefore, the 3D shape of target objects on the board may be more readily measured.

Description

    RELATED APPLICATIONS
  • This application is a Continuation patent application of co-pending application Ser. No. 11/656,458, filed on 23 Jan. 2007. The entire disclosure of the prior application, Ser. No. 11/656,458, from which an oath or declaration is supplied, is considered a part of the disclosure of the accompanying Continuation application and is hereby incorporated by reference.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • A method of measuring a 3D shape is provided, which can measure the 3D shape of target objects on a board by searching a database for bare board information when the inspection option is set to the teaching-based inspection mode, or by performing bare board teaching when a board from a supplier for which no bare board information is available is inspected.
  • 2. Description of the Related Art
  • A method of measuring a 3D shape according to a conventional art will be described with reference to FIG. 1.
  • FIG. 1 is a flowchart illustrating a method of measuring a 3D shape according to the conventional art. Referring to FIG. 1, to measure a 3D shape of a measuring object, in operation S10, a grating pattern illumination is emitted towards a reference surface by emitting light generated from an illumination source (not shown) towards a grating device (not shown) to acquire a reference phase corresponding to the reference surface. In operation S11, a grating is moved by a fine pitch using a piezoelectric actuator (not shown), the illumination is emitted towards the reference surface, and a grating pattern image is acquired using a charge-coupled device (CCD) camera and a grabber (not shown). In operation S12, when the grating pattern image is acquired by the grabber, a bucket algorithm is applied to the grating pattern image. In operation S13, the reference phase with respect to the reference surface is acquired.
  • In operation S15, when the reference phase corresponding to the reference surface is acquired, a measuring object is placed on a moving table and light generated from the illumination source is emitted towards a measuring surface of the measuring object to acquire a phase of the measuring object. In operation S16, the grating is moved by a fine pitch using the piezoelectric actuator to apply the bucket algorithm, and the grating pattern image reflected from the measuring surface is acquired via the CCD camera and the grabber. In operation S17, the bucket algorithm is applied to the grating pattern image. In operation S18, an object phase of the measuring object is acquired.
  • When the object phase is acquired, the object phase is subtracted from the reference phase in operation S21, and a moire phase is acquired in operation S21. When the moire phase is acquired, the moire phase is unwrapped in operation S22, and actual height information of the measuring object is acquired by using a result of the unwrapping. Through the above-described operations, the 3D shape of the measuring object is acquired.
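  • The following is a hedged sketch of the conventional flow just described (reference phase, object phase, moire phase, unwrapping); NumPy and the one-dimensional row-wise unwrapping step stand in for whatever phase-unwrapping the conventional system actually uses.

```python
import numpy as np

def n_bucket_phase(images):
    """Wrapped phase from N phase-shifted grating images of shape (N, H, W)."""
    n = len(images)
    deltas = 2.0 * np.pi * np.arange(n) / n                       # equal phase shifts over one period
    num = sum(img * np.sin(d) for img, d in zip(images, deltas))
    den = sum(img * np.cos(d) for img, d in zip(images, deltas))
    return np.arctan2(-num, den)                                  # standard N-bucket estimate, wrapped to (-pi, pi]

def moire_height(ref_images, obj_images, phase_to_height):
    ref_phase = n_bucket_phase(np.asarray(ref_images, dtype=float))   # reference surface phase
    obj_phase = n_bucket_phase(np.asarray(obj_images, dtype=float))   # measuring surface phase
    moire = np.angle(np.exp(1j * (obj_phase - ref_phase)))            # wrapped moire phase
    unwrapped = np.unwrap(moire, axis=1)                              # row-wise unwrap as a stand-in
    return unwrapped * phase_to_height                                # actual height information
```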
  • However, the conventional 3D shape measuring method has a problem in that the operator becomes fatigued and productivity may be reduced, since each measuring condition must be calculated manually before the measurement is performed whenever a totally new measuring object, not an ongoing measuring object, is measured.
  • SUMMARY OF THE INVENTION
  • An objective of the present invention is to provide a method of measuring a 3D shape which can measure the 3D shape of target objects according to the normal inspection mode when a measuring object is set to the normal inspection mode, and which can also measure the 3D shape of target objects on the board by searching a database for bare board information when the inspection option is set to the teaching-based inspection mode, or by performing bare board teaching when a board from a supplier for which no bare board information is available is inspected, thereby improving the productivity of an electronic board manufacturing line.
  • Another objective of the present invention is to improve the measurement quality of the 3D shape of target objects by measuring their 3D shape with a consistent illumination source brightness at each illumination level, the levels being predefined prior to machine operation.
  • To accomplish the above objects of the present invention, there is provided a method of measuring a 3 dimensional (3D) shape, the method including: measuring a brightness of a first illumination source by controlling via a central control unit a module control unit and an image acquisition unit; measuring a phase-to-height conversion factor by controlling via the central control unit the module control unit and the image acquisition unit after the brightness measuring process of the first illumination source is completed; determining whether the measurement is performed in a normal inspection mode after the brightness of the first illumination source and the phase-to-height conversion factor are measured and calculated; measuring a 3D shape of a board according to the normal inspection mode by controlling, via the central control unit, the module control unit and the image acquisition unit when it is the normal inspection mode as a result of the determination; searching a database and determining, via the central control unit, whether bare board information about the board is in the database when it is not the normal inspection mode as a result of the determination; performing bare board teaching by controlling, via the central control unit, the module control unit and the image acquisition unit when the bare board information is not in the database; measuring the 3D shape of the board according to a teaching-based inspection mode by controlling, via the central control unit, the module control unit and the image acquisition unit when the bare board information is in the database or when bare board teaching information is generated by performing the bare board teaching; and analyzing via the central control unit whether 3D shape of target objects on the board is normal or abnormal by using the measured 3D shape information of the objects according to the normal inspection mode and the teaching-based inspection mode.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The above and other objects, features and advantages of the present invention will become apparent from the following description of a preferred embodiment given in conjunction with the accompanying drawings, in which:
  • FIG. 1 is a flowchart illustrating a method of measuring a 3D shape according to a conventional art;
  • FIG. 2 is a diagram illustrating a 3D shape measuring system for a 3D shape measuring method according to the present invention;
  • FIGS. 3A through 3C illustrate a configuration of a board, a bare board, and a calibration target;
  • FIG. 4 is a flowchart illustrating a method of measuring a 3D shape according to the present invention;
  • FIG. 5 is a flowchart illustrating an operation of measuring a brightness of a first illumination source shown in FIG. 4;
  • FIG. 6 is a flowchart illustrating an operation of measuring a phase-to-height conversion factor shown in FIG. 4;
  • FIGS. 7A and 7B are flowcharts illustrating an operation of measuring a 3D shape of a board according to a normal inspection mode shown in FIG. 4;
  • FIGS. 8A and 8B are flowcharts illustrating a bare board teaching operation shown in FIG. 4; and
  • FIGS. 9A and 9B are flowcharts illustrating an operation of measuring a 3D shape of a board according to a teaching-based inspection mode shown in FIG. 4.
  • DETAILED DESCRIPTION OF THE INVENTION
  • Preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings.
  • FIG. 2 is a diagram illustrating a 3D shape measuring system for a 3D shape measuring method according to the present invention. As shown in FIG. 2, the 3D shape measuring apparatus includes a central control unit 10, a module control unit 20, an image acquisition unit 30, at least one pattern projector 40, a second illumination source 50, an X-Y table 61, a table moving device 60, and a camera 70. Hereinafter, a configuration of each element will be described.
  • A charge-coupled device (CCD) camera or a complementary metal oxide semiconductor (CMOS) camera is utilized for the camera 70. A second illumination source 50, an optical filter 71, and a lens 72 are provided below the camera 70. A plurality of light emitting diodes (LEDs) formed in a shape of a circle, or a circular lamp, is utilized for the second illumination source 50, and the second illumination source 50 is utilized as an illuminator for measuring particular shapes at specific locations on a board 62 or a bare board 63, which corresponds to a measuring object.
  • The table moving device 60 drives the X-Y table 61 which is positioned below the camera 70 and thereby moves the board 62, the bare board 63, or a calibration target 64 to predefined measurement locations so that the camera 70 may take images of the board 62, the bare board 63 or the calibration target 64.
  • At least one pattern projector 40 of the 3D shape measuring system, which is indicated by a solid line and a dotted line as shown in FIG. 2, is provided. Each of the at least one pattern projector 40 is inclined to one side or the other side of the camera 70, which takes images of the board 62, the bare board 63 or the calibration target 64. In this instance, the pattern projector 40 includes an illumination part 41, a grating moving device 42, a grating device 43, and a lens 44. The illumination part 41 includes a first illumination source 41 a and a plurality of lenses 41 b and 41 c. An illumination generated from the first illumination source 41 a passes through the plurality of lenses 41 b and 41 c, and is emitted toward the grating device 43 and then toward the board 62, the bare board 63 or the calibration target 64.
  • The image acquisition unit 30 receives the image taken by the camera 70 and transmits the received image to the central control unit 10. The module control unit 20 includes a table controller 21, a grating controller 22, and an illumination controller 23. The illumination controller 23 controls the first illumination source 41 a of the illumination part 41 or the second illumination source 50, the grating controller 22 controls the grating moving device 42, and the table controller 21 controls the table moving device 60.
  • The central control unit 10 includes a control board 11, an image processing board 12, and an interface board 13. The central control unit 10 transmits and receives control signals or control information to and from the module control unit 20 and the image acquisition unit 30 via the interface board 13, the image processing board 12 processes an image received from the image acquisition unit 30, and the control board 11 provides overall control of the 3D shape measuring apparatus of the present invention. Also, the central control unit 10 searches a database 80 for bare board information of a new board supplier, or stores the bare board information which is acquired in a teaching-based inspection mode.
  • Hereinafter, a method of measuring 3D shape of target objects on the board 62 by using the 3D shape measuring system constructed as above will be described with reference to FIGS. 2 through 4.
  • FIG. 2 is a diagram illustrating a 3D shape measuring system for a 3D shape measuring method according to the present invention, FIGS. 3A through 3C illustrate a configuration of a board, a bare board, and a calibration target, and FIG. 4 is a flowchart illustrating a method of measuring a 3D shape according to the present invention.
  • As shown in FIGS. 2 through 4, an initial setup operation of the 3D shape measuring system according to the present invention is performed before measuring the 3D shape. For the initial setup operation, in operation S100, the central control unit 10 controls the module control unit 20 and the image acquisition unit 30 to measure a brightness of the first illumination source 41 a. In operation S200, when the brightness of the first illumination source 41 a is completely measured, the central control unit 10 controls the module control unit 20 and the image acquisition unit 30 to measure a phase-to-height conversion factor. Through operations S100 and S200, the initial setup operation of the 3D shape measuring system is completed.
  • In operation S300, after the brightness of the first illumination source 41 a and the phase-to-height conversion factor are measured for the initial setup operation of the 3D shape measuring system, the central control unit 10 determines whether the measurement is to be performed in a normal inspection mode. When an operator selects either the normal inspection mode or a teaching-based inspection mode, either through information entered via an input device such as a keyboard (not shown) or through a job program pre-installed in the 3D shape measuring system, the central control unit 10 recognizes and determines the selected mode.
  • In operation S400, when the central control unit 10 determines in operation S300 that the normal inspection mode is selected, the central control unit 10 controls the module control unit 20 and the image acquisition unit 30 to measure the 3D shape of target objects on the board 62 according to the normal inspection mode. Conversely, when the central control unit 10 determines in operation S300 that the normal inspection mode is not selected, the central control unit 10 performs operation S500 of searching the database 80 for bare board information of the board 62.
  • In operation S510 of operation S500, the table controller 21 drives the table moving device 60 so that the X-Y table 61 moves the board 62 to a measurement location. In operation S520, when the board 62 is moved to the measurement location, the central control unit 10 controls the illumination controller 23 to switch on the second illumination source 50. In operation S530, when the second illumination source 50 is switched on, the camera 70 takes a picture of the board 62 and the image acquisition unit 30 acquires an image, and the central control unit 10 calculates location information about a particular part of the board 62. In this instance, the particular part of the board 62 or the bare board 63 indicates a mark (not shown) which is distinguishable for each manufacturer or each product adopting the board 62 or the bare board 63. In operation S540, when the location information about the particular part of the board 62 is calculated, the central control unit 10 searches the database 80 for bare board information which is identical to the image information of the particular part. In operation S550, it is determined whether the database 80 contains the bare board information about the board 62 to be currently measured.
  • In operation S600, when the database 80 is determined in operation S500 not to contain the bare board information, the central control unit 10 controls the module control unit 20 and the image acquisition unit 30 to perform a bare board teaching.
  • In operation S700, when the database 80 is determined in operation S500 to contain the bare board information, or when bare board teaching information is generated in operation S600, the central control unit 10 controls the module control unit 20 and the image acquisition unit 30 to measure the 3D shape of target objects on the board 62. When the 3D shape of the board 62 is measured in operation S400 or S700, the central control unit 10 analyzes in operation S800 whether the board 62 is normal or abnormal by using information about the measured 3D shape, and thereby, after measuring the 3D shape of the board 62, determines whether a solder 62 e formed on the board 62 is defective. As described above, since the 3D shape of target objects on the board 62 is measured according to either the normal inspection mode or the teaching-based inspection mode, the 3D shape of target objects on the board may be more readily and efficiently acquired.
  • Hereinafter, operations S100, S200, S400, S600, and S700 in the method of measuring a 3D shape according to the present invention will be sequentially described in detail.
  • As shown in FIGS. 2 through 5, in operation S110 of operation S100, the central control unit 10 sets a range of an illumination adjustment command value and then a brightness of the first illumination source 41 a is adjusted by the illumination controller 23 of the module control unit 20 according to the set adjustment command value.
  • The calibration target 64 is utilized to adjust the brightness of the first illumination source 41 a. As shown in FIG. 3C, the calibration target 64 includes a plane surface 64 a and a stepped difference 64 b, and is formed in a gray color. Also, the calibration target 64 is applied when measuring the brightness of the first illumination source 41 a or calculating the phase-to-height conversion factor in the initial setup operation for measuring the 3D shape. In operation S111, to measure the brightness of the first illumination source 41 a using the calibration target 64, the table controller 21 of the module control unit 20 drives the table moving device 60 so that the X-Y table 61 moves the calibration target 64 to a measurement location. In this instance, the measurement location indicates a location where the camera 70 may take an image of the calibration target 64.
  • In operation S112, when the calibration target 64 is moved to the measurement location, the first illumination source 41 a is switched on by the illumination controller 23 of the module control unit 20. In operation S113, when the first illumination source 41 a is switched on, the central control unit 10 sets the range of the illumination adjustment command value. In this instance, when a user inputs information about the adjustment command value using an input device, such as a keyboard (not shown), the central control unit 10 recognizes the input information and sets the range of the adjustment command value. In operation S114, when the range of the illumination adjustment command value is set, the central control unit 10 controls the illumination controller 23 to adjust the brightness of the first illumination source 41 a according to the set adjustment command value.
  • In operation S120, when the brightness of the first illumination source 41 a is adjusted, the grating controller 22 of the module control unit 20 drives the grating moving device 42 to move the grating device 43 N times, the camera 70 takes an image of the calibration target 64 for each movement, and the image acquisition unit 30 acquires the images of the calibration target 64. In operation S130, when the image acquisition unit 30 acquires the N images of the calibration target 64, the acquired images are received via the image processing board 12 and the interface board 13 of the central control unit 10, the received images are average-processed, and an average image is calculated. In this instance, the average image process produces an average image in which the grating pattern is eliminated from the N images taken by the camera 70 each time the grating device 43 is moved. In the present invention, the image of the calibration target 64 is acquired by moving the grating device 43 at least four times, and the average image process is then performed, as sketched below.
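  • The averaging step can be illustrated numerically: when the grating is shifted in equal fractions of its period, the sinusoidal fringe term cancels in the per-pixel mean and only the underlying reflectance of the calibration target 64 remains. The following is a minimal sketch of that idea with synthetic images and a four-step shift; the function and variable names are illustrative and not taken from the patent.

```python
import numpy as np

def average_image(images):
    """Per-pixel mean of N equally phase-shifted fringe images; the grating pattern cancels."""
    return np.mean(np.stack(images, axis=0), axis=0)

# Synthetic example: a reflectance map modulated by a fringe shifted in steps of 2*pi/N.
h, w, N = 64, 64, 4
reflectance = np.random.rand(h, w)                 # stand-in for the fringe-free calibration target image
x = np.arange(w)
images = [reflectance * (1.0 + 0.5 * np.cos(2 * np.pi * x / 16 + 2 * np.pi * k / N))
          for k in range(N)]

avg = average_image(images)
print(np.allclose(avg, reflectance))               # True: the shifted cosine terms sum to zero
# A representative brightness of the average image (the patent does not specify how it is
# chosen; the mean is used here as an assumption) would then be used in operation S140.
representative_brightness = float(avg.mean())
```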
  • In operation S140, when the average image of the calibration target 64 is calculated, the central control unit 10 sets a representative brightness value of the calculated average image as the illumination brightness for the corresponding illumination adjustment command value. In operation S150, when the illumination brightness is set, the central control unit 10 determines whether the adjustment command value is at its maximum. When the illumination adjustment command value is the maximum within the set range of the adjustment command value, the central control unit 10 determines that the brightness adjustment of the first illumination source 41 a is completed.
  • In operation S160, when the illumination adjustment command value is the maximum, the central control unit 10 defines the illumination brightness corresponding to each adjustment command value. In operation S170, when the illumination brightness corresponding to each adjustment command value is defined, the central control unit 10 compiles the illumination brightness corresponding to each adjustment command value into an illumination index table. In this instance, the illumination index table defines the illumination brightness according to each adjustment command value. Therefore, when measuring the 3D shape of the board 62 or the bare board 63 by using the first illumination source 41 a, the illumination brightness may be linearly adjusted by using the illumination index table, and thus the measurement quality of the 3D shape may be improved.
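  • As a rough sketch of how such a table can be used to linearize the brightness, the measured brightness per adjustment command value can be stored as a lookup and inverted to find the command value that yields a desired brightness. The command range and brightness values below are invented for illustration.

```python
import numpy as np

# Hypothetical calibration result: one representative brightness per adjustment command value.
command_values = np.arange(0, 256, 32)
measured_brightness = np.array([5., 30., 52., 71., 88., 102., 113., 121.])

# The illumination index table maps each adjustment command value to its measured brightness.
index_table = dict(zip(command_values.tolist(), measured_brightness.tolist()))

def command_for_brightness(target):
    """Return the adjustment command value whose tabulated brightness is closest to the target."""
    return min(index_table, key=lambda c: abs(index_table[c] - target))

print(command_for_brightness(90.0))   # -> 128 for this made-up table
```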
  • When a plurality of first illumination sources 41 a is provided, as indicated by a solid line and a dotted line in FIG. 2, the central control unit 10 determines, in operation S161 of operation S160, whether the plurality of first illumination sources 41 a is provided. In operation S162, when the plurality of first illumination sources 41 a is provided, any one of the plurality of first illumination sources 41 a is switched off and any one of the remaining first illumination sources 41 a is switched on. Specifically, when any one of the plurality of first illumination sources 41 a indicates the first illumination source 41 a indicated by the solid line in FIG. 2, the remaining first illumination source 41 a is the one indicated by the dotted line in FIG. 2.
  • In operation S163, when the remaining first illumination source 41 a is switched on, the central control unit 10 determines whether the adjustment command value of the remaining first illumination source 41 a is at its maximum. In operation S164, when the adjustment command value of the remaining first illumination source 41 a is the maximum, the central control unit 10 compares, for each adjustment command value, the illumination brightness of any one of the plurality of first illumination sources 41 a with the illumination brightness of the remaining first illumination source 41 a, and selects the smaller illumination brightness value as the brightness of the total illumination system for that adjustment command value. In operation S165, when the illumination brightness of the total illumination system corresponding to each adjustment command value is determined, the central control unit 10 calculates a new adjustment command value of each first illumination source corresponding to the selected illumination brightness of the total illumination system, and then redefines the illumination adjustment command values of the plurality of first illumination sources 41 a corresponding to the selected illumination brightness of the total illumination system. Through the above-described operations, the illumination brightness defined for each adjustment command value is compiled into the illumination index table.
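  • Assuming each source's brightness-versus-command curve is monotonic, this matching step can be sketched as taking the per-command minimum of the two curves and then inverting each curve to find the command value that reproduces that common brightness; the numbers below are invented for illustration.

```python
import numpy as np

commands = np.arange(0, 256, 32)
brightness_a = np.array([4., 28., 55., 80., 100., 118., 130., 140.])   # source shown by the solid line (assumed data)
brightness_b = np.array([6., 25., 48., 70.,  92., 110., 125., 136.])   # source shown by the dotted line (assumed data)

# For each command value, the smaller brightness is taken as the brightness of the
# total illumination system, since the dimmer source limits what both can match.
system_brightness = np.minimum(brightness_a, brightness_b)

def command_for(curve, target):
    """Invert a monotonic brightness curve by linear interpolation to get a command value."""
    return float(np.interp(target, curve, commands))

# Redefine per-source command values so both sources produce the selected system brightness.
new_commands_a = [command_for(brightness_a, b) for b in system_brightness]
new_commands_b = [command_for(brightness_b, b) for b in system_brightness]
print(list(zip(system_brightness.tolist(), new_commands_a, new_commands_b)))
```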
  • In operation S200, when the brightness measurement of the first illumination source 41 a is completed, the central control unit 10 controls the module control unit 20 and the image acquisition unit 30 to measure the phase-to-height conversion factor. As shown in FIGS. 2 through 4 and FIG. 6, the central control unit 10 performs operation S210 of controlling the illumination controller 23 of the module control unit 20 to adjust the brightness of the first illumination source 41 a according to the selected adjustment command value of the total illumination system.
  • More specifically, to adjust the brightness of the first illumination source, in operation S211, the table controller 21 drives the table moving device 60 so that the X-Y table 61 moves the calibration target 64 to the measurement location. In operation S212, when the calibration target 64 is moved to the measurement location, the first illumination source 41 a is switched on by the illumination controller 23. In operation S213, when the first illumination source 41 a is switched on, the central control unit 10 selects an adjustment command value. In operation S214, when the adjustment command value is selected, the central control unit 10 controls the illumination controller 23 to adjust the brightness of the first illumination source 41 a to the illumination brightness corresponding to the selected adjustment command value.
  • In operation S220, when the first illumination source is adjusted, the central control unit 10 determines whether a measurement portion of the calibration target 64 corresponds to a plane surface 64 a. Whether the measurement portion of the calibration target 64 corresponds to the plane surface 64 a is determined by the central control unit 10 by using the image, taken by the camera 70, in a state where the second illumination source 50 is switched on. When the measurement portion of the calibration target 64 does not correspond to the plane surface 64 a, the central control unit 10 drives the table moving device 60 to move the plane surface 64 a of the calibration target 64 to a focus location of the camera 70 or the plane surface 64 a is manually moved by the operator.
  • In operation S230, when the measurement portion of the calibration target 64 corresponds to the plane surface 64 a, the central control unit 10 sets the plane surface 64 a of the calibration target 64 as an inspection area. In operation S240, when the plane surface 64 a of the calibration target 64 is set as the inspection area, the grating controller 22 drives the grating moving device 42 to move the grating device 43 N times, a grating pattern illumination is emitted toward the plane surface 64 a for each movement, and the camera 70 takes an image of the pattern reflected from the plane surface 64 a. The image acquisition unit 30 acquires the taken images of the plane surface 64 a. In operation S250, when the image acquisition unit 30 acquires the images of the plane surface 64 a, the central control unit 10 acquires a phase map of the plane surface 64 a by using an N-bucket algorithm and the acquired images, and stores it as the phase map of a first reference surface m. Information, such as the phase map of the first reference surface m, is stored in a storage device (not shown), such as a hard disk connected to the control board 11 of the central control unit 10.
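  • For reference, the classical four-bucket (four-step) form of the N-bucket algorithm recovers the wrapped phase of every pixel from four images taken at grating shifts of 0, π/2, π, and 3π/2. The patent only names the N-bucket algorithm, so the choice N = 4 below is an assumption made for illustration.

```python
import numpy as np

def four_bucket_phase(i0, i1, i2, i3):
    """Wrapped phase (radians) per pixel from four images shifted by 0, pi/2, pi, and 3*pi/2."""
    return np.arctan2(i3 - i1, i0 - i2)

# Synthetic check: build four shifted images of a known phase field and recover it.
phi = np.linspace(-3.0, 3.0, 100).reshape(10, 10)
images = [100 + 50 * np.cos(phi + k * np.pi / 2) for k in range(4)]
phase_map = four_bucket_phase(*images)
print(np.allclose(phase_map, phi, atol=1e-6))      # True for phases inside (-pi, pi)
```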
  • When the phase map of the first reference surface m of the plane surface 64 a is stored, operation S220 is re-performed. In operation S260, when the measurement portion does not correspond to the plane surface 64 a, the central control unit 10 sets the stepped difference 64 b of the calibration target 64 as the inspection area. In operation S270, when the stepped difference 64 b is set as the inspection area, the grating controller 22 drives the grating moving device 42 to move the grating device 43 N times, a grating pattern illumination is emitted toward the stepped difference 64 b for each movement, and the camera 70 takes an image of the pattern reflected from the stepped difference 64 b. The image acquisition unit 30 acquires the taken images of the stepped difference 64 b. In operation S280, when the image acquisition unit 30 acquires the images of the stepped difference 64 b, the central control unit 10 acquires a phase map of the stepped difference 64 b by using the N-bucket algorithm and the acquired images.
  • In operation S290, when the phase maps of the plane surface 64 a and the stepped difference 64 b are acquired in operations S250 and S280, respectively, the phase-to-height conversion factor of each pixel is calculated and stored by using the acquired phase maps. In this instance, the phase-to-height conversion factor is required to convert a phase into a height value, that is, when the phase of each point is calculated by using the N-bucket algorithm, the height value of the corresponding point is obtained from the calculated phase. To calculate the phase-to-height conversion factor of each pixel, the central control unit 10 calculates a relative height phase of the stepped difference 64 b with respect to the first reference surface m by using phase information about the first reference surface m and the phase map of the stepped difference 64 b. When the relative height phase of the stepped difference 64 b is calculated, the central control unit 10 calculates the phase-to-height conversion factor by using the relative height phase of the stepped difference 64 b, pattern period information of the stepped difference 64 b, and the known height of the stepped difference 64 b of the calibration target 64.
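  • As a numerical illustration, one simple way to obtain such a factor is to divide the known step height of the calibration target 64 by the measured relative height phase of the step; the patent also uses pattern period information, which is omitted from this deliberately simplified sketch, so the form below is an assumption.

```python
import numpy as np

def phase_to_height_factor(step_phase_map, plane_phase_map, known_step_height_um):
    """Per-pixel conversion factor: known step height divided by the relative phase of the step."""
    relative_phase = step_phase_map - plane_phase_map      # phase of the step above the first reference surface m
    return known_step_height_um / relative_phase

# Hypothetical 4x4 phase maps (radians) and a 100 um calibration step.
plane = np.zeros((4, 4))
step = np.full((4, 4), 1.25)
factor = phase_to_height_factor(step, plane, known_step_height_um=100.0)
print(float(factor[0, 0] * 1.25))                          # converting 1.25 rad back gives 100.0 um
```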
  • Operation S290 is performed with respect to each of the plurality of first illumination sources 41 a when the plurality of first illumination sources 41 a indicated by the solid line and the dotted line in FIG. 2 is provided. More specifically, when the plurality of first illumination sources 41 a is provided, in operation S291, the phase-to-height conversion factor of each pixel is calculated by using the phase map which is acquired according to the grating pattern illumination generated from any one of the first illumination sources 41 a.
  • In operation S292, when the phase-to-height conversion factor of each pixel is calculated, the central control unit 10 determines whether the plurality of first illumination sources 41 a is provided. In operation S293, when the phase-to-height conversion factor of each pixel according to any one of the plurality of first illumination sources 41 a is calculated, the central control unit 10 controls the illumination controller 23 of the module control unit 20 to switch off the one of the plurality of first illumination sources 41 a for which the phase-to-height conversion factor has been calculated, and to switch on a remaining first illumination source 41 a.
  • In operation S294, when the remaining first illumination source 41 a is switched on, the central control unit 10 determines whether the phase-to-height conversion factor of each pixel according to the grating pattern illumination from the remaining first illumination source 41 a is calculated. In operation S295, when the phase-to-height conversion factor of each pixel is calculated, the central control unit 10 stores the phase-to-height conversion factor of each pixel according to each of the plurality of first illumination sources 41 a.
  • When the phase-to-height conversion factor of each pixel has not been calculated, the central control unit 10 returns to operation S210 of controlling the illumination controller 23 of the module control unit 20 to adjust the brightness of the first illumination source 41 a according to the selected adjustment command value.
  • When the illumination index table is generated and the phase-to-height conversion factor is calculated, operation S400 of measuring the 3D shape of target objects on the board 62 according to the normal inspection mode is performed. More specifically, in operation S410 of operation S400, as shown in FIGS. 2 through 4 and FIGS. 7A and 7B, the table controller 21 drives the table moving device 60 so that the X-Y table 61 moves the board 62 to the measurement location.
  • In operation S420, when the board 62 is moved to the measurement location, the central control unit 10 controls the illumination controller 23 to adjust the brightness of the first illumination source 41 a according to the selected adjustment command value. In operation S430, when the brightness of the first illumination source 41 a is adjusted, the central control unit 10 controls the module control unit 20 and the image acquisition unit 30 to acquire the phase map of the board 62, and calculates the relative height phase with respect to a first reference surface m. In operation S440, when the relative height phase with respect to the first reference surface m is calculated, the central control unit 10 calculates a phase histogram by using the relative height phase with respect to the first reference surface m, and calculates the 3D shape of target objects on the board 62 by using the calculated phase histogram.
  • In operation S442 of operation S440 in FIG. 7A, when a single first illumination source 41 a is provided and the relative height phase of the board 62 with respect to the first reference surface m according to the grating pattern illumination generated from the first illumination source 41 a is calculated, the central control unit 10 calculates a phase histogram by using the relative height phase with respect to the first reference surface m. In operation S443, when the phase histogram is calculated, the central control unit 10 separates a second reference surface n and a solder 62 e from the calculated phase histogram, and calculates a centroid of each of the second reference surface n and the solder 62 e.
  • In this instance, the method of calculating the centroid of the second reference surface n and the solder 62 e initially separates the second reference surface n and the solder 62 e by using pre-stored dimensional information of the board 62. When the second reference surface n is calculated, the central control unit 10 separates the solder 62 e by using the calculated second reference surface n. When the second reference surface n and the solder 62 e are separated, the central control unit 10 calculates the centroid of the second reference surface n and the solder 62 e by using a centroid method.
  • In operation S444, when the centroids of the second reference surface n and the solder 62 e are calculated, the central control unit 10 calculates a representative height of the solder 62 e by using the centroid of the second reference surface n and the centroid of the solder 62 e, as sketched below. In operation S445, when the representative height of the solder 62 e is calculated, the central control unit 10 calculates a volume, a height distribution, and a positional offset of the solder 62 e by using the calculated representative height.
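  • The histogram and centroid steps can be sketched as follows: within an inspection area, the relative height phases form two clusters, one around the second reference surface n and one around the solder 62 e; splitting the histogram and taking the centroid (weighted mean) of each cluster gives a representative phase for each, and their difference is the representative height phase of the solder. The fixed threshold used below to split the clusters is an assumption for illustration only.

```python
import numpy as np

def representative_solder_phase(height_phase, threshold):
    """Split pixels into reference surface and solder by a phase threshold, then take the
    centroid (mean phase) of each group; their difference is the representative solder phase."""
    reference = height_phase[height_phase < threshold]
    solder = height_phase[height_phase >= threshold]
    return solder.mean() - reference.mean()

# Synthetic inspection area: a flat reference surface with a solder deposit in the middle.
rng = np.random.default_rng(0)
phase = 0.2 + 0.02 * rng.standard_normal((40, 40))
phase[15:25, 15:25] += 1.0                     # solder sits roughly 1.0 rad above the reference
print(round(float(representative_solder_phase(phase, threshold=0.7)), 2))   # ~1.0
```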
  • Hereinafter, performing operation S440 in FIG. 7B when the plurality of first illumination sources 41 a is provided will be described in detail.
  • In operation S441, when the plurality of first illumination sources 41 a is provided, and the relative height phase of the board 62 with respect to the first reference surface m according to the grating pattern illumination generated from any one of the plurality of first illumination sources 41 a and the relative height phase of the board 62 with respect to the first reference surface m according to the grating pattern illumination generated from the remaining first illumination source 41 a are both calculated, the central control unit 10 calculates a combined height phase in which noise is removed from the relative height phases with respect to the first reference surface m according to each of the plurality of first illumination sources 41 a, and stores the calculated combined height phase.
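  • The patent does not spell out how the two relative height phases are combined; one common approach in dual-projection systems, shown here purely as an assumed sketch, is a per-pixel weighted combination in which unreliable pixels (for example, shadowed or saturated ones) receive zero weight.

```python
import numpy as np

def combine_height_phases(phase_a, phase_b, weight_a, weight_b):
    """Per-pixel weighted combination of two relative height phases; the weights stand in
    for a reliability measure such as fringe modulation (an assumption, not from the patent)."""
    total = weight_a + weight_b
    return (weight_a * phase_a + weight_b * phase_b) / np.where(total == 0, 1, total)

# Toy data: the second projector is shadowed (unreliable) on the right half of the area.
phase_a = np.full((8, 8), 1.0)
phase_b = np.full((8, 8), 1.0)
phase_b[:, 4:] = 5.0                            # corrupted values in the shadowed region
weight_b = np.ones((8, 8))
weight_b[:, 4:] = 0.0                           # zero weight where the second source is unreliable
combined = combine_height_phases(phase_a, phase_b, np.ones((8, 8)), weight_b)
print(np.allclose(combined, 1.0))               # True: the corrupted values are suppressed
```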
  • In operation S442, when the combined height phase is stored, the central control unit 10 calculates a phase histogram by using the stored combined height phase, in the same manner as when only the single first illumination source 41 a is provided. In operation S443, the central control unit 10 separates the second reference surface n and the solder 62 e from the calculated phase histogram, and calculates the centroids of the second reference surface n and the solder 62 e. Next, as described above, in operation S444, the central control unit 10 calculates the representative height of the solder 62 e by using the centroid of the second reference surface n and the centroid of the solder 62 e. In operation S445, when the representative height of the solder 62 e is calculated, the central control unit 10 calculates a volume, a height distribution, and a positional offset of the solder 62 e by using the calculated representative height.
  • In this instance, the height distribution of the solder 62 e is calculated based on the second reference surface n, and the volume of the solder 62 e is calculated by multiplying the phase-to-height conversion factor of each pixel by the phase information of the solder 62 e and summing up the results of the multiplications. Also, the positional offset of the solder 62 e is calculated as the distance between the center of a conductive pad 62 d and the location of the solder 62 e, which is obtained from the volume calculation of the solder 62 e. Based on the calculated volume, height distribution, and positional offset information of the solder, whether the board is good or defective is determined automatically.
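  • Put concretely, the description above amounts to: per-pixel height = conversion factor × solder phase, volume = sum of per-pixel heights × pixel area, and positional offset = distance between the solder centroid and the pad center. A minimal sketch follows; the pixel area, pad center, and conversion factor are assumed values for illustration.

```python
import numpy as np

def solder_metrics(solder_phase, conversion_factor, pixel_area_mm2, pad_center_xy):
    """Volume, a simple height-distribution summary, and positional offset of a solder deposit."""
    heights = conversion_factor * solder_phase                   # height above the second reference surface n
    volume = float(heights.sum() * pixel_area_mm2)               # sum of (height x pixel area)
    height_summary = (float(heights.max()), float(heights.mean()))

    # Height-weighted centroid of the solder, then its distance from the pad center.
    ys, xs = np.nonzero(heights > 0)
    w = heights[ys, xs]
    centroid = np.array([(xs * w).sum() / w.sum(), (ys * w).sum() / w.sum()])
    offset = float(np.linalg.norm(centroid - np.asarray(pad_center_xy, dtype=float)))
    return volume, height_summary, offset

phase = np.zeros((20, 20))
phase[8:14, 9:15] = 1.2                                          # hypothetical solder region
print(solder_metrics(phase, conversion_factor=0.08, pixel_area_mm2=0.0004, pad_center_xy=(11.5, 10.5)))
```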
  • In operation S600, when the bare board information is not found in operation S500, the central control unit 10 controls the module control unit 20 and the image acquisition unit 30 to perform a bare board teaching.
  • As shown in FIGS. 2 through 4 and FIGS. 8A and 8B, the bare board 63 is moved to the measurement location in operation S600. Specifically, in operation S610, the table controller 21 drives the table moving device 60 so that the X-Y table 61 moves the bare board 63 to the measurement location. Unlike the board 62, the bare board 63 includes a base plate 62 a, a conductive pattern 62 b, a solder mask 62 c, and the conductive pad 62 d, without the solder 62 e.
  • In operation S620, when the bare board 63 is moved to the measurement location, the central control unit 10 controls the illumination controller 23 to adjust the brightness of the first illumination source 41 a according to the selected adjustment command value. In operation S630, when the brightness of the first illumination source 41 a is adjusted, the central control unit 10 controls the module control unit 20 and the image acquisition unit 30 to acquire the phase map of the bare board 63, and calculates the relative height phase with respect to the first reference surface m. In this instance, the first reference surface m indicates the base plate 62 a of the bare board 63, and is calculated by using pre-given bare board information.
  • In operation S640, when the relative height phase with respect to the first reference surface m is calculated, the central control unit 10 stores location information and image information about a particular part of the bare board 63, as bare board information, in the database 80.
  • In this instance, operation S640 may be performed differently depending on whether only a single first illumination source 41 a is provided or a plurality of first illumination sources 41 a is provided. Initially, performing operation S640 when the single first illumination source 41 a is provided will be described. In operation S642, when the relative height phase with respect to the first reference surface m is calculated, the central control unit 10 stores the relative height phase with respect to the first reference surface m as height phase information of the bare board 63. In operation S643, when the height phase information is stored, the central control unit 10 determines whether teaching with respect to all areas of the bare board 63 is completed. In operation S644, when the teaching with respect to all areas of the bare board 63 is completed, the central control unit 10 controls the illumination controller 23 of the module control unit 20 to switch off the first illumination source 41 a and switch on the second illumination source 50. In operation S645, when the second illumination source 50 is switched on, the central control unit 10 controls the image acquisition unit 30 to acquire an image of the particular part of the bare board 63 using the camera 70, stores the acquired image, calculates location information of the particular part of the bare board 63, and stores the calculated location information and the image information in the database 80.
  • Hereinafter, performing operation S640 when the plurality of first illumination sources 41 a is provided will be described. In this instance, in operation S641, when the relative height phase of the bare board 63 with respect to the first reference surface m according to any one of the plurality of first illumination sources 41 a and the relative height phase of the bare board 63 with respect to the first reference surface m according to a remaining first illumination source 41 a are calculated, the central control unit 10 calculates the combined height phase in which noise is removed from the relative height phases with respect to the first reference surface m. The operations after calculating the combined height phase are identical to those performed when only the single first illumination source 41 a is provided. In operation S642, when the combined height phase is calculated, the central control unit 10 stores the calculated combined height phase as height phase information.
  • In operation S643, when the height phase information is stored, the central control unit 10 determines whether teaching with respect to all areas of the bare board 63 is completed. In operation S644, when the teaching with respect to all areas of the bare board 63 is completed, the central control unit 10 controls the illumination controller 23 of the module control unit 20 to switch off the plurality of first illumination sources 41 a and switch on the second illumination source 50. In this instance, all of the plurality of first illumination sources 41 a are switched off. Also, in operation S645, when the second illumination source 50 is switched on, the central control unit 10 controls the image acquisition unit 30 to acquire an image of the particular part of the bare board 63 using the camera 70, stores the acquired image, calculates location information of the particular part of the bare board 63, and stores the calculated location information in the database 80.
  • Hereinafter, operation S700 of measuring the 3D shape of target objects on the board 62 according to the teaching-based inspection mode, performed when the bare board information is found in operation S500, will be described with reference to FIGS. 2 through 4 and FIGS. 9A and 9B.
  • In operation S710 of operation S700, bare board information corresponding to the board 62 is read from the database 80. In operation S720, when the bare board information of the board 62 is read, the central control unit 10 controls the illumination controller 23 of the module control unit 20 to switch off the second illumination source 50 if the second illumination source 50 is switched on. In operation S730, when the second illumination source 50 is switched off, the table controller 21 drives the table moving device 60 so that the X-Y table 61 moves the board 62 to the measurement location.
  • In operation S740, when the board 62 is moved to the measurement location, the central control unit 10 controls the illumination controller 23 to adjust the brightness of the first illumination source 41 a according to the selected adjustment command value. In operation S750, when the brightness of the first illumination source 41 a is adjusted, the central control unit 10 controls the module control unit 20 and the image acquisition unit 30 to acquire the phase map of the board 62 and calculates the relative height phase with respect to the first reference surface m. In operation S760, when the relative height phase with respect to the first reference surface m is calculated, the central control unit 10 calculates the phase histogram by using the relative height phase with respect to the first reference surface m, and calculates the 3D shape of the board 62.
  • In this instance, operation S760 may be performed differently depending on whether only a single first illumination source 41 a is provided or a plurality of first illumination sources 41 a is provided.
  • Initially, performing operation S760 when only the single first illumination source 41 a is provided will be described. In operation S762, when the relative height phase of the board 62 with respect to the first reference surface m according to the grating pattern illumination generated from the first illumination source 41 a is calculated, the central control unit 10 stores the relative height phase with respect to the first reference surface m as height phase information of the board 62. In operation S763, when the height phase information of the board 62 is stored, the central control unit 10 separates the height phase information of the solder 62 e in the corresponding inspection location by using the height phase information of the bare board 63 in the database 80 and the height phase information of the board 62 stored in operation S762. When the height phase information of the solder 62 e is separated in the corresponding inspection location, the central control unit 10 calculates actual height information from the relative height phase information of the solder 62 e, and calculates a volume, a height distribution, and a positional offset of the solder 62 e by using the calculated actual height information. Specifically, in operation S764, when the height phase information of the solder 62 e is separated, the central control unit 10 calculates the actual height information from the separated height phase information of the solder 62 e, and calculates the volume, the height distribution, and the positional offset of the solder 62 e.
  • Hereinafter, performing operation S760 when the plurality of first illumination sources 41 a is provided will be described. In operation S761, when the relative height phase of the board 62 with respect to the first reference surface m according to any one of the plurality of first illumination sources 41 a and the relative height phase of the board 62 with respect to the first reference surface m according to a remaining first illumination source 41 a are calculated, the central control unit 10 calculates a combined height phase in which noise is removed from the relative height phases with respect to the first reference surface m. The following operations are identical to those performed when only the single first illumination source 41 a is provided, and thus only a brief description is given below.
  • In operation S762, when the combined height phase is calculated, the central control unit 10 stores the calculated combined height phase as height phase information. In operation S763, when the height phase information is stored, the central control unit 10 separates the height phase information of the solder 62 e by using the height phase information of the bare board 63 stored in the database 80 and the height phase information of the board 62 stored in operation S762. In operation S764, when the height phase information of the solder 62 e is separated, the central control unit 10 calculates actual height information from the separated height phase information of the solder 62 e and calculates the volume, the height distribution, and the positional offset of the solder 62 e.
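  • In other words, the teaching-based mode isolates the solder by subtracting the taught bare board height phase from the measured board height phase at the same inspection location, and then converts the remainder into an actual height. A minimal sketch, assuming the two phase maps are already aligned:

```python
import numpy as np

def separate_solder_phase(board_phase, bare_board_phase):
    """Height phase of the solder alone: measured board minus taught bare board."""
    return board_phase - bare_board_phase

# Hypothetical taught and measured maps for one inspection location.
bare = np.full((10, 10), 0.3)                 # pad / solder mask topography from bare board teaching
board = bare.copy()
board[3:7, 3:7] += 0.9                        # solder deposited on the pad
solder_phase = separate_solder_phase(board, bare)
solder_height = 0.08 * solder_phase           # assumed phase-to-height conversion factor
print(round(float(solder_height[5, 5]), 3))   # 0.072 in arbitrary height units
```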
  • In the method of measuring a 3D shape, operations S420, S620, and S740 of adjusting the brightness of the first illumination source 41 a, which are performed to determine whether the board 62 is good or defective, will be further described in detail with reference to FIGS. 7A through 9B.
  • Operations S420, S620, and S740 may be performed differently depending on whether a single first illumination source 41 a is provided or a plurality of first illumination sources 41 a is provided. Hereinafter, performing operations S420, S620, and S740 when the single first illumination source 41 a is provided will be described with reference to FIGS. 7A, 8A, and 9A.
  • In operations S422, S622, and S742 of operations S420, S620, and S740, the central control unit 10 controls the illumination controller 23 to switch on the first illumination source 41 a. In operations S423, S623, and S743, when the first illumination source 41 a is switched on, the central control unit 10 selects a pre-input adjustment command value. In operations S424, S624, and S744, when the adjustment command value is selected, the central control unit 10 controls the illumination controller 23 to adjust the brightness of the first illumination source 41 a according to the selected adjustment command value.
  • Hereinafter, operations S420, S620, and S740 when the plurality of first illumination sources 41 a is provided will be described with reference to FIGS. 7B, 8B, and 9B.
  • In operations S421, S621, and S741 of operations S420, S620, and S740, the central control unit 10 determines whether any one of the plurality of first illumination sources 41 a is selected. In operations S422, S622, and S742, when the selected first illumination source 41 a is determined, the central control unit 10 controls the illumination controller 23 to switch on the selected first illumination source 41 a. In operations S423, S623, and S743, when the selected first illumination source 41 a is switched on, the central control unit 10 selects the pre-input adjustment command value. In operations S424, S624, and S744, when the adjustment command value is selected, the central control unit 10 controls the illumination controller 23 to adjust the brightness of the first illumination source 41 a according to the selected adjustment command value.
  • In operations S425, S625, and S745, when any one of the plurality of first illumination sources 41 a is not selected, the central control unit 10 controls the illumination controller 23 to switch on a remaining first illumination source 41 a. In operations S426, S626, and S746, when the remaining first illumination source 41 a is switched on, the central control unit 10 selects the pre-input adjustment command value. In operations S427, S627, and S747, when the adjustment command value is selected, the central control unit 10 controls the illumination controller 23 to adjust the brightness of the remaining first illumination source 41 a according to the selected adjustment command value.
  • When the brightness of the first illumination source 41 a is adjusted, operations S430, S630, and S750 of calculating the relative height phase with respect to the first reference surface m are performed, respectively. In this instance, operations S430, S630, and S750 may be performed differently depending on whether a single first illumination source 41 a is provided or a plurality of first illumination sources 41 a is provided. Hereinafter, performing operations S430, S630, and S750 when the single first illumination source 41 a is provided will be described with reference to FIGS. 7A, 8A, and 9A.
  • In operations S431, S631, and S751, when the brightness of the first illumination source 41 a is adjusted, the grating controller 22 drives the grating moving device 42 to move the grating device 43 N times, the camera 70 takes, for every movement, an image of the grating pattern illumination generated from the first illumination source 41 a and reflected from the measuring object, and the image acquisition unit 30 acquires the taken image. In operations S432, S632, and S752, when the image acquisition unit 30 acquires the image, an inspection area is expanded. The concept of expanding the inspection area is applied to calculate the height of the solder 62 e based on the second reference surface n of the board 62, which includes the conductive pattern 62 b, the solder mask 62 c, the conductive pad 62 d, and the solder 62 e formed on the base plate 62 a, as shown in FIG. 3A. In this instance, the second reference surface n indicates the height from a bottom surface of the board 62 to a top surface of the solder mask 62 c and the conductive pad 62 d, that is, the height corresponding to a centroid from the first reference surface m to the solder mask 62 c and the conductive pad 62 d. Also, as shown in FIGS. 3A and 3B, when an area A is set as the inspection area of the board 62 or the bare board 63 and the inspection is started, the inspection area is expanded to an area B so as to calculate a height value of the second reference surface n.
  • In operations S433, S633, and S753, when the image acquisition unit 30 acquires the image in operations S431, S631, and S751, the central control unit 10 calculates a phase map by using an N-bucket algorithm, and stores the calculated phase map. In operations S434, S634, and S754, when the phase map is calculated and stored, the central control unit 10 calculates the relative height phase with respect to the first reference surface m in a corresponding inspection location by using a difference between a pre-stored phase map of the first reference surface m and the phase map stored in the central control unit 10.
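  • A compact way to express that per-pixel difference while keeping the result wrapped to the interval (-π, π] is to subtract the two phase maps on the unit circle; the explicit wrapping step is an assumption added here for robustness, since the patent simply refers to the difference.

```python
import numpy as np

def relative_height_phase(measured_phase, reference_phase):
    """Wrapped difference between a measured phase map and the stored phase map of the
    first reference surface m."""
    return np.angle(np.exp(1j * (measured_phase - reference_phase)))

measured = np.array([[3.0, -3.0], [0.5, 2.0]])
reference = np.array([[-3.0, 3.0], [0.1, 0.5]])
print(relative_height_phase(measured, reference))   # differences wrapped into (-pi, pi]
```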
  • Hereinafter, performing operations S430, S630, and S750 when the plurality of first illumination sources 41 a is provided will be described with reference to FIGS. 7B, 8B, and 9B.
  • In operations S431, S631, and S751, when the brightness of any one of the plurality of first illumination sources 41 a is adjusted, the grating controller 22 drives the grating moving device 42 to move the grating device 43 N times, the camera 70 takes, for every movement, an image of the grating pattern illumination generated from that first illumination source 41 a and reflected from the measuring object, and the image acquisition unit 30 acquires the taken image. In operations S432, S632, and S752, when the image acquisition unit 30 acquires the image, an inspection area is expanded. In operations S433, S633, and S753, when the image acquisition unit 30 acquires the image in operations S431, S631, and S751, the central control unit 10 calculates a phase map by using the N-bucket algorithm and stores the calculated phase map. In operations S434, S634, and S754, when the phase map is calculated and stored, the central control unit 10 calculates the relative height phase with respect to the first reference surface m in the corresponding inspection location by using the difference between the pre-stored phase map of the first reference surface m and the phase map stored in the central control unit 10.
  • In operations S435, S635, and S755, when the brightness of a remaining first illumination source 41 a is adjusted, the grating controller 22 drives the grating moving device 42 to move the grating device 43 N times, the camera 70 takes, for every movement, an image of the grating pattern illumination generated from the remaining first illumination source 41 a and reflected from the measuring object, and the image acquisition unit 30 acquires the taken image. When the image according to the remaining first illumination source 41 a is acquired by the image acquisition unit 30, the relative height phase according to the remaining first illumination source 41 a is calculated by performing operations S436, S636, and S756 of expanding the inspection area, operations S437, S637, and S757 of calculating and storing the phase map using the N-bucket algorithm, and operations S438, S638, and S758 of calculating the relative height phase with respect to the first reference surface m.
  • When the relative height phase is calculated, whether the single first illumination source 41 a is provided to the 3D shape measuring system or the plurality of first illumination sources 41 a is provided thereto, the 3D shape of target objects on the board 62 can be measured by using the calculated relative height phase. Also, whether the solder 62 e of the board 62 is good or defective can be determined by using the result of the measurement.
  • According to the present invention, there is provided a method of measuring a 3D shape which can measure the 3D shape of target objects on a board according to a normal inspection mode when a measuring object is set to the normal inspection mode, and which, when the measuring object is not set to the normal inspection mode, can measure the 3D shape of target objects on the board either by searching a database for bare board information or by performing bare board teaching when the board is supplied from a supplier for which no bare board information exists, and which thereby can improve the productivity of electric circuit boards.
  • Also, according to the present invention, the measurement quality of a 3D shape is improved by measuring the 3D shape while keeping the brightness of an illumination source, which is applied to measure the 3D shape, constant for each operation.
  • Although a few exemplary embodiments of the present invention have been shown and described, the present invention is not limited to the described exemplary embodiments. Instead, it would be appreciated by those skilled in the art that changes may be made to these exemplary embodiments without departing from the principles and spirit of the invention, the scope of which is defined by the claims and their equivalents.

Claims (8)

What is claimed is:
1. A method of measuring a three-dimensional shape, comprising:
transferring a target object having a conductive pad on which a solder is not formed by controlling a table transfer device;
illuminating a pattern light onto the target object having the conductive pad on which a solder is not formed;
acquiring a pattern image of the pattern light illuminated onto the target object having the conductive pad on which a solder is not formed;
calculating height of the conductive pad by using the acquired pattern image;
transferring a target object having a conductive pad on which a solder is formed by controlling the table transfer device;
illuminating a pattern light onto the target object having the conductive pad on which the solder is formed;
acquiring a pattern image of the pattern light illuminated onto the target object having the conductive pad on which the solder is formed;
calculating a height of the solder by using the acquired pattern image;
calculating a height of the solder in which the height of the pad is excluded; and
calculating a volume, a height distribution and a center of gravity of the solder.
2. The method of claim 1, wherein calculating a height of the solder by using the acquired pattern image comprises:
calculating a histogram by using the acquired pattern image;
calculating a reference plane by using the calculated histogram; and
calculating the height of the solder that is relative to the reference plane.
3. The method of claim 1, wherein illuminating a pattern light onto the target object having the conductive pad on which a solder is not formed and illuminating a pattern light onto the target object having the conductive pad on which the solder is formed comprises switching off any one of a plurality of first illumination sources and switching on a remaining first illumination source.
4. The method of claim 3, wherein calculating a height of the solder by using the acquired pattern image comprises unifying heights respectively measured by the first illumination sources.
5. The method of claim 1, further comprising:
illuminating a second illumination source onto the target object and capturing an image thereof;
acquiring a particular shape from the captured image; and
searching a database for a target object information having a shape that is substantially identical to the acquired particular shape.
6. The method of claim 1, wherein calculating a volume, a height distribution and a center of gravity of the solder comprises calculating an eccentricity for judging whether the solder is separated from the conductive pad by using the center of gravity.
7. A method of measuring a three-dimensional shape, comprising:
illuminating a pattern light generated from an illumination source onto a target object including a conductive pad and a solder;
acquiring a pattern image of the pattern light by a camera; and
calculating a volume, a height distribution and a center of gravity of the solder by using the acquired pattern image.
8. The method of claim 7, wherein calculating a volume, a height distribution and a center of gravity of the solder by using the acquired pattern image comprises calculating an eccentricity for judging whether the solder is separated from a center of the conductive pad by using the center of gravity.
US13/729,862 2006-01-26 2012-12-28 Method for measuring three-dimension shape of target object Abandoned US20130128280A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/729,862 US20130128280A1 (en) 2006-01-26 2012-12-28 Method for measuring three-dimension shape of target object

Applications Claiming Priority (7)

Application Number Priority Date Filing Date Title
KR1020060008480A KR100734431B1 (en) 2006-01-26 2006-01-26 Method for measuring three-dimension shape
KR1020060008479A KR100672818B1 (en) 2006-01-26 2006-01-26 Method for measuring three-dimension shape
KR10-2006-0008480 2006-01-26
KR10-2006-0008479 2006-01-26
US11/656,458 US7545512B2 (en) 2006-01-26 2007-01-23 Method for automated measurement of three-dimensional shape of circuit boards
US12/453,321 US20090216486A1 (en) 2006-01-26 2009-05-07 Method for measuring three-dimension shape
US13/729,862 US20130128280A1 (en) 2006-01-26 2012-12-28 Method for measuring three-dimension shape of target object

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US12/453,321 Continuation US20090216486A1 (en) 2006-01-26 2009-05-07 Method for measuring three-dimension shape

Publications (1)

Publication Number Publication Date
US20130128280A1 true US20130128280A1 (en) 2013-05-23

Family

ID=38282411

Family Applications (3)

Application Number Title Priority Date Filing Date
US11/656,458 Active US7545512B2 (en) 2006-01-26 2007-01-23 Method for automated measurement of three-dimensional shape of circuit boards
US12/453,321 Abandoned US20090216486A1 (en) 2006-01-26 2009-05-07 Method for measuring three-dimension shape
US13/729,862 Abandoned US20130128280A1 (en) 2006-01-26 2012-12-28 Method for measuring three-dimension shape of target object

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US11/656,458 Active US7545512B2 (en) 2006-01-26 2007-01-23 Method for automated measurement of three-dimensional shape of circuit boards
US12/453,321 Abandoned US20090216486A1 (en) 2006-01-26 2009-05-07 Method for measuring three-dimension shape

Country Status (3)

Country Link
US (3) US7545512B2 (en)
JP (2) JP2007199070A (en)
DE (2) DE102007063932B4 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150373319A1 (en) * 2014-06-20 2015-12-24 Akira Kinoshita Shape measurement system, image capture apparatus, and shape measurement method

Families Citing this family (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100870930B1 (en) * 2007-05-08 2008-11-28 주식회사 고영테크놀러지 Multi-directional projection type moire interferometer and inspection method using it
DE102008003775A1 (en) * 2007-09-18 2009-03-26 Siemens Aktiengesellschaft Method and device for transporting and processing multiple objects
CN101435693B (en) * 2007-11-14 2010-12-08 鸿富锦精密工业(深圳)有限公司 Light source correcting system and method of image measuring platform
CN101960253B (en) * 2008-02-26 2013-05-01 株式会社高永科技 Apparatus and method for measuring a three-dimensional shape
KR101190122B1 (en) * 2008-10-13 2012-10-11 주식회사 고영테크놀러지 Apparatus and method for measuring three dimension shape using multi-wavelength
KR101251372B1 (en) 2008-10-13 2013-04-05 주식회사 고영테크놀러지 Three dimension shape measuring method
DE102009017464B4 (en) * 2009-04-03 2011-02-17 Carl Zeiss Oim Gmbh Device for optically inspecting a surface on an object
DE102010029091B4 (en) * 2009-05-21 2015-08-20 Koh Young Technology Inc. Form measuring device and method
KR101158324B1 (en) * 2010-01-26 2012-06-26 주식회사 고영테크놀러지 Image Sensing System
JP2010276607A (en) * 2009-05-27 2010-12-09 Koh Young Technology Inc Apparatus and method for measuring three-dimensional shape
DE102010030883B4 (en) * 2009-07-03 2018-11-08 Koh Young Technology Inc. Apparatus for testing a plate and method therefor
US8855403B2 (en) * 2010-04-16 2014-10-07 Koh Young Technology Inc. Method of discriminating between an object region and a ground region and method of measuring three dimensional shape by using the same
FR2960059B1 (en) * 2010-05-11 2012-12-28 Visuol Technologies INSTALLATION FOR MONITORING THE QUALITY OF A SURFACE OF AN OBJECT
JP5515039B2 (en) 2010-10-22 2014-06-11 株式会社ミツトヨ Image measuring device
TWI510756B (en) * 2010-10-27 2015-12-01 尼康股份有限公司 A shape measuring device, a shape measuring method, a manufacturing method and a program for a structure
KR101311251B1 (en) * 2010-11-12 2013-09-25 주식회사 고영테크놀러지 Inspection apparatus
KR101547218B1 (en) * 2010-11-19 2015-08-25 주식회사 고영테크놀러지 Method for inspecting substrate
KR101174676B1 (en) * 2010-11-19 2012-08-17 주식회사 고영테크놀러지 Method and apparatus of profiling a surface
US8755043B2 (en) 2010-11-19 2014-06-17 Koh Young Technology Inc. Method of inspecting a substrate
KR101311215B1 (en) * 2010-11-19 2013-09-25 경북대학교 산학협력단 Method for inspecting substrate
JP5772062B2 (en) 2011-02-25 2015-09-02 オムロン株式会社 Three-dimensional shape measuring apparatus and three-dimensional shape measuring method
EP2527784A1 (en) * 2011-05-19 2012-11-28 Hexagon Technology Center GmbH Optical measurement method and system for determining 3D coordinates of a measured object surface
KR101307944B1 (en) * 2011-10-26 2013-09-12 주식회사 고영테크놀러지 Registration method of images for surgery
US9349182B2 (en) * 2011-11-10 2016-05-24 Carestream Health, Inc. 3D intraoral measurements using optical multiline method
JP5709009B2 (en) * 2011-11-17 2015-04-30 Ckd株式会社 3D measuring device
KR101215083B1 (en) * 2011-12-27 2012-12-24 경북대학교 산학협력단 Method for generating height information of board inspection apparatus
US20130250092A1 (en) * 2012-03-22 2013-09-26 Jeremy Georges Bertin Digital light masking systems
KR101262242B1 (en) * 2012-04-30 2013-05-15 주식회사 고영테크놀러지 Image Sensing Method and System
CN105823438B (en) * 2012-05-22 2018-10-19 株式会社高永科技 The height measurement method of 3 d shape measuring apparatus
JP6029394B2 (en) * 2012-09-11 2016-11-24 株式会社キーエンス Shape measuring device
FR3001564B1 (en) * 2013-01-31 2016-05-27 Vit SYSTEM FOR DETERMINING A THREE-DIMENSIONAL IMAGE OF AN ELECTRONIC CIRCUIT
KR20170045232A (en) 2014-08-28 2017-04-26 케어스트림 헬스 인코포레이티드 3-d intraoral measurements using optical multiline method
CN104568963A (en) * 2014-12-17 2015-04-29 South China University of Technology Online three-dimensional detection device based on RGB structured light
TWI526671B (en) * 2015-01-20 2016-03-21 Test Research, Inc. Board-warping measuring apparatus and board-warping measuring method thereof
KR101639227B1 (en) * 2015-06-08 2016-07-13 Koh Young Technology Inc. Three dimensional shape measurement apparatus
US20170148101A1 (en) * 2015-11-23 2017-05-25 CSI Holdings I LLC Damage assessment and repair based on objective surface data
JP7000380B2 (en) * 2019-05-29 2022-01-19 CKD Corporation 3D measuring device and 3D measuring method
US20220252893A1 (en) * 2021-02-09 2022-08-11 Himax Technologies Limited Light projection apparatus

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6075605A (en) * 1997-09-09 2000-06-13 Ckd Corporation Shape measuring device
US6385335B1 (en) * 1996-02-27 2002-05-07 Cyberoptics Corp. Apparatus and method for estimating background tilt and offset
US20020180962A1 (en) * 1999-09-13 2002-12-05 Siemens Ag Device and method for inspecting a three-dimensional surface structure
US20040076323A1 (en) * 2002-08-06 2004-04-22 Omron Corporation Inspection data producing method and board inspection apparatus using the method
US7079666B2 (en) * 2000-03-24 2006-07-18 Solvision Inc. System for simultaneous projections of multiple phase-shifted patterns for the three-dimensional inspection of an object

Family Cites Families (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0615968B2 (en) 1986-08-11 1994-03-02 伍良 松本 Three-dimensional shape measuring device
US4978224A (en) 1987-07-14 1990-12-18 Sharp Kabushiki Kaisha Method of and apparatus for inspecting mounting of chip components
DE4007502A1 (en) 1990-03-09 1991-09-12 Zeiss Carl Fa METHOD AND DEVICE FOR CONTACTLESS MEASUREMENT OF OBJECT SURFACES
US5495337A (en) 1991-11-06 1996-02-27 Machine Vision Products, Inc. Method of visualizing minute particles
JP3148000B2 (en) * 1992-06-29 2001-03-19 Matsushita Electric Industrial Co., Ltd. Solder printing inspection method
CN1050423C (en) * 1993-04-21 2000-03-15 Omron Corporation Visual inspection support apparatus, substrate inspection apparatus, and soldering inspection and correction methods using the same apparatuses
US5496337A (en) * 1995-04-06 1996-03-05 Brown; Randall L. Device for gauging suture depth
US6246787B1 (en) * 1996-05-31 2001-06-12 Texas Instruments Incorporated System and method for knowledgebase generation and management
JP4137212B2 (en) * 1998-02-25 2008-08-20 Juki Corp. Height measuring method and height measuring apparatus
US6104493A (en) * 1998-05-14 2000-08-15 Fujitsu Limited Method and apparatus for visual inspection of bump array
WO2000003357A1 (en) * 1998-07-08 2000-01-20 Ppt Vision, Inc. Identifying and handling device tilt in a three-dimensional machine-vision image
WO2000070332A1 (en) * 1999-05-18 2000-11-23 Applied Materials, Inc. Method of and apparatus for inspection of articles by comparison with a master
US6496270B1 (en) 2000-02-17 2002-12-17 Gsi Lumonics, Inc. Method and system for automatically generating reference height data for use in a three-dimensional inspection system
US6501554B1 (en) * 2000-06-20 2002-12-31 Ppt Vision, Inc. 3D scanner and method for measuring heights and angles of manufactured parts
JP4030726B2 (en) * 2001-03-23 2008-01-09 CKD Corporation Solder printing inspection device
US6750974B2 (en) 2002-04-02 2004-06-15 Gsi Lumonics Corporation Method and system for 3D imaging of target regions
TWI291040B (en) * 2002-11-21 2007-12-11 Solvision Inc Fast 3D height measurement method and system
US7324214B2 (en) * 2003-03-06 2008-01-29 Zygo Corporation Interferometer and method for measuring characteristics of optically unresolved surface features
JP4077754B2 (en) * 2003-04-04 2008-04-23 Olympus Corp 3D shape measuring device
DE10335312A1 (en) * 2003-08-01 2005-02-24 Asys Automatisierungssysteme Gmbh Generation of a reference pattern for testing a substrate on which a reference pattern has been applied, e.g. for testing a circuit board with applied solder paste, whereby a reference data set is generated from control data
JP4746841B2 (en) * 2004-01-23 2011-08-10 Renesas Electronics Corp Manufacturing method of semiconductor integrated circuit device
JP2005233780A (en) * 2004-02-19 2005-09-02 Olympus Corp Height measuring method and apparatus therefor
JP4011561B2 (en) * 2004-05-28 2007-11-21 CKD Corporation 3D measuring device
US20070115484A1 (en) * 2005-10-24 2007-05-24 Peisen Huang 3d shape measurement system and method including fast three-step phase shifting, error compensation and calibration

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150373319A1 (en) * 2014-06-20 2015-12-24 Akira Kinoshita Shape measurement system, image capture apparatus, and shape measurement method
US9792690B2 (en) * 2014-06-20 2017-10-17 Ricoh Company, Ltd. Shape measurement system, image capture apparatus, and shape measurement method

Also Published As

Publication number Publication date
US7545512B2 (en) 2009-06-09
DE102007063932B4 (en) 2023-11-02
DE102007004122A1 (en) 2007-08-09
US20090216486A1 (en) 2009-08-27
JP2010243508A (en) 2010-10-28
DE102007004122B4 (en) 2014-10-30
US20070177159A1 (en) 2007-08-02
JP2007199070A (en) 2007-08-09
JP5733923B2 (en) 2015-06-10

Similar Documents

Publication Publication Date Title
US20130128280A1 (en) Method for measuring three-dimension shape of target object
US10262431B2 (en) Three-dimensional measurement device
US9329024B2 (en) Dimension measuring apparatus, dimension measuring method, and program for dimension measuring apparatus
JP5679560B2 (en) Dimension measuring apparatus, dimension measuring method and program for dimension measuring apparatus
KR100672818B1 (en) Method for measuring three-dimension shape
US8503758B2 (en) Image measurement device, method for image measurement, and computer readable medium storing a program for image measurement
EP1946376B1 (en) Apparatus for and method of measuring image
CN101013028A (en) Image processing method and image processor
US9772480B2 (en) Image measurement device
JP2020115127A (en) Method for measuring z height value of surface of workpiece using machine vision inspection system
JP6462823B2 (en) Image inspection device
KR100740249B1 (en) Apparatus and method for measuring image
JP6230434B2 (en) Image inspection apparatus, image inspection method, image inspection program, and computer-readable recording medium
JP6680552B2 (en) Shape measuring device and method of manufacturing object to be coated
JP2006208084A (en) Inspection device for irregularities in cyclic pattern
JP2023043178A (en) Workpiece inspection and defect detection system utilizing color channels
JP2009053485A (en) Automatic focusing device, automatic focusing method, and measuring device
JP6665028B2 (en) Shape measuring device and coating device equipped with the same
US20180130230A1 (en) Recognition apparatus, determination method, and article manufacturing method
KR100810722B1 (en) Apparatus for measurement of surface profile and control method thereof
JPH05296745A (en) Shape measuring device
JP2023167395A (en) Image measurement device
JP2023167392A (en) Image measurement device
JP2020148894A (en) Method for recognizing magnification of lens and measuring apparatus
CN114390269A (en) Three-dimensional image generation system, three-dimensional image generation method, three-dimensional image generation program, and storage medium

Legal Events

Date Code Title Description
AS Assignment

Owner name: KOH YOUNG TECHNOLOGY INC., KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KIM, MIN YOUNG;KIM, HEE TAE;YOO, BYUNG MIN;AND OTHERS;SIGNING DATES FROM 20130219 TO 20130221;REEL/FRAME:030365/0515

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION