WO1999016010A1 - Automated visual inspection system and process for detecting and classifying defects - Google Patents


Info

Publication number
WO1999016010A1
WO1999016010A1 (PCT/US1998/019544)
Authority
WO
WIPO (PCT)
Prior art keywords
processing
defect
image data
classification
operable
Prior art date
Application number
PCT/US1998/019544
Other languages
French (fr)
Inventor
Mark R. Deyong
Thomas C. Eskridge
John W. Grace
Jeffrey E. Newberry
Original Assignee
Intelligent Reasoning Systems, Inc.
Priority date
Filing date
Publication date
Application filed by Intelligent Reasoning Systems, Inc. filed Critical Intelligent Reasoning Systems, Inc.
Priority to AU93993/98A
Publication of WO1999016010A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0004 Industrial image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0004 Industrial image inspection
    • G06T7/0006 Industrial image inspection using a design-rule based approach
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30108 Industrial image inspection
    • G06T2207/30152 Solder
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V2201/06 Recognition of objects for industrial automation

Definitions

  • The manufacture of populated printed wiring boards for computing devices is an industry in which automated inspection could be used to reduce the number of defective board escapes (i.e., boards that make it out of the factory and to the customer) and improve the yield of the production line.
  • Some defects are at the component level, such as missing components, components off-pad, reverse polarity, wrong components, and foreign material.
  • Other defects are at the lead level and include bent and missing leads, individual lead skew, solder bridges and solder joints with dewets or insufficient solder.
  • Defect classification and defect-source information can be used to improve the production process, thereby improving production yields, reducing the number of defective boards produced, and reducing the time defective boards spend in rework.
  • A template is an image of a component or section of a populated printed wiring board that is assumed to be good. As boards are inspected, the template is subtracted from the image of the same component or board section taken from the board under inspection. The template is divided into a number of different regions, each with one or more threshold values. The difference between the image under inspection and the template is compared to the threshold to determine whether there is a potential defect in that region.
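As an illustration only, the template-subtraction scheme described above can be sketched in a few lines of Python. The region layout, dictionary structure, and mean-difference comparison are assumptions made for the example, not details taken from the patent:

```python
import numpy as np

def template_inspect(template, image, regions):
    """Flag potential defects by template subtraction.

    `regions` maps a region name to ((row_slice, col_slice), threshold);
    this representation is an assumption for illustration.
    """
    diff = np.abs(image.astype(int) - template.astype(int))
    flagged = []
    for name, ((rs, cs), threshold) in regions.items():
        # Compare the region's mean absolute difference against its threshold.
        if diff[rs, cs].mean() > threshold:
            flagged.append(name)
    return flagged
```

A region whose mean difference exceeds its threshold is reported as a potential defect, matching the per-region comparison the text describes.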
  • Templating as a technology for automated visual inspection of populated printed wiring boards has some benefits that, in the past, made it a viable approach. The approach is conceptually very simple and is straightforward to implement in hardware.
  • the ease of implementation has led to the development of specialized hardware designed to process a single image frame very quickly.
  • The template-based method is also capable of detecting a number of possible errors. Further, when defect sizes are relatively large, there can be enough flexibility in the specification of the "important" regions around a component that limited classification of detected defects is possible.
  • Templating technology is very difficult to extend. As new defects arise, a set of thresholds must be adjusted, which is often too difficult for the typical production line operator (or impossible in some cases), and the effects of a change can only be determined by experimentation. Further, if classification is required over a range of defects, functional additions must be made to the templating decision method (e.g., the magnitude of the differences triggers computational functions to be run, and the defect classification is made on the output of those functions). In addition, specialized hardware implementations, made on the basis of an imaging system field of view (FOV) and resolution capable of handling larger defects, require substantial design modification when higher resolutions are needed to distinguish defects.
  • the automated visual inspection system and process include an imaging system operable to provide image data and a precision positioning system operable to move the imaging system to scan an object.
  • a processing and control engine, including a scalable processing accelerator, is coupled to receive the image data and to provide control signals to the precision positioning system.
  • the processing and control engine pre-processes the image data, represents the image using descriptors, compares the descriptors to information in a knowledge base and identifies and classifies a defect based upon results of the comparison.
  • the processing and control engine also can associate a confidence level with the classification of the defect.
  • the engine alerts an operator and provides a best option as to the classification of the defect, accepts a confirmation or alternate classification from the operator, and adds information to the knowledge base responsive to the confirmation or alternate classification.
  • the imaging system provides structured, reflected light intensity as part of the image data, and the engine recovers three-dimensional spatial information based upon the reflected light intensity when the image data is pre-processed.
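The recovery of three-dimensional information from reflected intensity can be sketched as follows. This is a heavily simplified illustration, not the patented method: it assumes that, under structured off-axis lighting, the deviation of reflected intensity from a flat-surface baseline is proportional to local surface slope, and integrates that slope along the scan direction:

```python
import numpy as np

def recover_relative_height(intensity, baseline, gain=1.0):
    """Sketch of height recovery from structured-light intensity.

    The linear intensity-to-slope model and the `gain` constant are
    assumptions for illustration only.
    """
    # Deviation from the flat-surface baseline is treated as local slope.
    slope = gain * (intensity.astype(float) - baseline)
    # Integrating slope along the scan direction yields relative height.
    return np.cumsum(slope, axis=1)
```

A flat surface (intensity equal to the baseline everywhere) yields zero relative height, while a uniformly brighter stripe integrates to a rising ramp.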
  • a technical advantage of the present invention is the automatic classification of defects based upon comparisons with information held in a knowledge base.
  • Another technical advantage of the present invention is the use of a line scan camera, a telecentric optic, and structured lighting to allow the inspection system to collect three-dimensional spatial information about a scanned object in addition to a two-dimensional spatial image of the scanned object.
  • An additional technical advantage of the present invention is the modular architecture, which allows for scaling of data collection and processing to desired levels of resolution, accuracy, and throughput (processing speed).
  • A further technical advantage of the present invention is the capability to mechanically adjust the physical platform to compensate for skew. Also, the system can periodically check for and remove fixed-pattern errors caused by the system's mechanical characteristics.
  • Another technical advantage of the present invention is the maintenance of information between a number of inspection systems about classifications of prior inspections to place each decision in historical context. Additional technical advantages should be readily apparent to one skilled in the art from the following figures, descriptions, and claims.
  • FIGURE 2 is a flow diagram of one embodiment of a process for detecting and classifying defects implemented by the system of FIGURE 1;
  • FIGURE 3 is a diagram of an example of a detected defect mapped onto a decision space of a knowledge base according to the present invention;
  • FIGURE 6 is a diagram of one embodiment of the imaging system of the station of FIGURE 5;
  • FIGURE 1 is a block diagram of one embodiment of an automated visual inspection system, indicated generally at 10, which is based on an adaptive knowledge-based reasoning technology according to the present invention.
  • System 10 includes an imaging system 12 that is coupled to a precision positioning system 14.
  • Precision positioning system 14 operates to position imaging system 12 to scan an object 16 that is being inspected.
  • Object 16 is coupled to a material handling system 18 which operates to position object 16 within the inspection range of imaging system 12.
  • An optic 24 views the illuminated object 16 and provides an image to imager 26.
  • Imager 26 provides image data to computer system 20.
  • imager 26 can be a time delay integration (TDI) line scan camera which provides relatively high resolution over a large field of view and increased light sensitivity.
  • TDI line scan camera operates with a continuous scan and provides high speed imagery without the need for excessive light.
  • Optic 24 maps a given object field of view (FOV) onto the sensor array of imager 26 based upon the magnification or demagnification of optic 24.
  • optic 24 is a telecentric optic with constant perspective and magnification across the entire FOV to achieve a top-down view of object 16 with relatively little parallax distortion or occlusions.
  • Illumination system 22 can implement various types of lighting. Lighting approaches can include, for example, diffuse lighting, back lighting, polarized front illumination, and directional structured front illumination. Different light sources provide different spectrums of light, each being useful in different situations. In general, illumination system 22 needs to be tailored to match the desired task. In one embodiment, a white light source with a bifurcated fiber-optic light array provides structured, focused, off-axis light illuminating object 16. This illumination allows three-dimensional spatial information to be acquired based upon the intensity of the reflected light received through optic 24.
  • Computer system 20 includes control hub 28 which interfaces between computer system 20 and various connected components.
  • Host computer system 20 can be an IBM compatible personal computer, with typical components, which is rack mounted.
  • Computer system 20 executes software engine 30 which provides a user interface and performs the classification analysis of image descriptions.
  • Control hub 28 receives status signals from the connected components and provides the status signals to software engine 30.
  • Software engine 30 processes the status signals, in addition to the image descriptors, and generates control signals to manage operation of the components. The control signals are then provided to the various components through control hub 28.
  • precision positioning system 14, material handling system 18, illumination system 22, and optic 24 can all receive control signals and provide status signals through control hub 28.
  • control hub 28 can include optical interfaces (e.g., OPTO 22) for digital signals and digital/analog (D/A) interfaces for analog signals.
  • Computer system 20 further includes a data input/output (I/O) card 32 that controls and receives the image data from imager 26.
  • Data I/O card 32 can function essentially as a frame grabber which reconstructs the scanned image and converts the data for processing accelerator card 34.
  • Processing accelerator card 34 processes the image data to provide image descriptors to software engine 30.
  • Processing accelerator card 34 performs image processing algorithms on the image data to produce the descriptors of the image.
  • software engine 30 references populations of information about defects stored in its knowledge base. When software engine 30 makes a decision, it associates a confidence level with the decision. Software engine 30 typically has more confidence when a defect is similar to prior defects and a number of similar defects have been seen.
  • the analysis is essentially an adaptive, experience/knowledge-based correlation with an underlying statistical basis or behavior.
  • The practical issues for confidence in a decision can include how many examples of the defect have been seen, how many times a classification has been made in a certain way, how similar the defect is to other defect classes in the knowledge base, its relationship to other current defect classifications (defects found in other board locations), and which operator, if any, has entered the classification. Once determined with sufficient confidence, classification of defects allows the manufacturing entity not only to correct the defects but also to identify and correct the source of the problem.
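The factors listed above could be combined into a confidence score along the following lines. This is a hypothetical heuristic written for illustration; the patent does not disclose a specific formula, and every weight and function name here is an assumption:

```python
def classification_confidence(n_examples, similarity,
                              nearest_other_similarity, operator_verified):
    """Heuristic confidence score in [0, 1] (illustrative assumption).

    More stored examples, higher similarity to the chosen class, a larger
    margin over the nearest competing class, and operator verification
    all raise confidence, as the factors in the text suggest.
    """
    # Evidence saturates as the number of prior examples grows.
    evidence = n_examples / (n_examples + 5.0)
    # Margin over the nearest other defect class.
    margin = max(0.0, similarity - nearest_other_similarity)
    score = evidence * similarity * (0.5 + 0.5 * margin)
    # A small bonus when an operator has confirmed this kind of decision.
    return min(1.0, score + (0.1 if operator_verified else 0.0))
```

Under this sketch a well-attested, operator-verified classification scores much higher than a first-time, unverified one, mirroring the behavior described for software engine 30.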
  • the computation and aggregation of the descriptor values can be viewed as a re-description of the image, previously represented as a set of pixels, now represented as a set of descriptor-value pairs or other appropriate descriptors.
  • the descriptors need not be a complete description of the image if the set of descriptors are chosen so that they provide the information necessary to determine if a defect exists, and, if it does, to what class of defects it belongs. In other words, the descriptors reflect the semantics of the image and inspection process.
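The re-description of a pixel region as descriptor-value pairs might look like the sketch below. The specific descriptors chosen here (mean intensity, spread, edge energy, bright fraction) are illustrative assumptions, not the patented descriptor set:

```python
import numpy as np

def describe_region(region):
    """Re-describe a pixel region as descriptor-value pairs.

    The descriptor names and formulas are assumptions for illustration;
    the point is that pixels become a small set of semantic values.
    """
    region = np.asarray(region, dtype=float)
    gy, gx = np.gradient(region)
    return {
        "mean_intensity": float(region.mean()),
        "intensity_spread": float(region.std()),
        "edge_energy": float(np.hypot(gx, gy).mean()),
        "bright_fraction": float((region > 128).mean()),
    }
```

Note that the output need not reconstruct the image; it only needs to carry enough information to separate defect classes, as the text states.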
  • system 10 performs historical similarity based classification using an adaptive knowledge base of historical information.
  • The knowledge base provides a general, wide breadth of information about the inspection and classification of assembly components. This knowledge is stored in such a way as to permit a historical similarity-based classification decision-making methodology to be employed during decision making.
  • This method of classification uses the knowledge base of general inspection knowledge which is augmented with specific knowledge of how defects appear on different components of a given object 16.
  • FIGURE 3 is a diagram of an example of a detected defect mapped onto a decision space of the knowledge base according to the present invention.
  • One way to represent the tree-like structure depicted at region 46 of FIGURE 2 is as a multi-dimensional space.
  • This multi-dimensional space reflects the partitioning of knowledge into defect regions that correspond to either a defect class or to a prototypical defect.
  • There are two known defects having defined defect regions 50 and 52, as shown. Suppose that a new defect point 54 is presented after inspection which does not fall within either region 50 or region 52. The inspection system thus needs to determine how to classify point 54.
  • the inspection system may prefer defect class 1 (region 50) over defect class 2 (region 52), defect class 2 over defect class 1, or neither.
  • FIGURES 4A, 4B and 4C are diagrams of three possibilities for classification of the detected defect of FIGURE 3 according to the present invention.
  • one alternative is that point 54 should be classified as a member of defect 1.
  • the decision space is modified to expand defect region 50, as shown in FIGURE 4A.
  • a second alternative is shown in FIGURE 4B, in which point 54 is classified as a member of defect 2 and defect region 52 is expanded.
  • a third alternative is shown in FIGURE 4C in which point 54 is classified as a member of a new defect class 3 defined by a new region 56.
  • This determination of how to classify point 54 into a defect class can be made by taking into account the relative similarity between the new point 54 and defect classes 1 and 2.
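The similarity-based choice among the three alternatives can be sketched as a nearest-region decision in descriptor space. The representation (each class as a list of example points), the Euclidean distance measure, and the acceptance threshold are all assumptions for illustration:

```python
import numpy as np

def classify_point(point, class_points, accept_threshold=1.0):
    """Classify a descriptor-space point by similarity to known classes.

    `class_points` maps a class label to its example points; the distance
    measure and threshold are illustrative assumptions. Returns (label,
    distances), with label None when no class is similar enough -- the
    case where the operator would be solicited or a new class created.
    """
    point = np.asarray(point, dtype=float)
    distances = {
        label: min(np.linalg.norm(point - np.asarray(p, dtype=float))
                   for p in pts)
        for label, pts in class_points.items()
    }
    label = min(distances, key=distances.get)
    if distances[label] > accept_threshold:
        return None, distances  # too dissimilar to every known region
    return label, distances
```

A point near region 50 would be absorbed into defect class 1 (FIGURE 4A); a point far from both regions returns None, corresponding to the new-class alternative of FIGURE 4C.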
  • the operator can be solicited for an opinion. The information gained from the operator can then be used to determine which alternative is selected and to provide important feedback to the system for future classifications.
  • One advantage to the process used in the present invention is that images are represented in semantic space by the use of descriptors. Rather than work directly with image pixels (syntactic space) or with simple geometric descriptions of images (symbolic space) , the inspection system represents information contained in the image with semantically meaningful descriptors (semantic space) .
  • the descriptors embody selected descriptive components of the image which allow the inspection system to distinguish a defect class or set of classes from others.
  • a second advantage is that defects can be arbitrarily close to one another in the defect space.
  • one of the difficult things to resolve is the case where two defect classes are visually similar.
  • the inspection system of the present invention allows defect classes to be arbitrarily close to one another as long as there is at least one feature by which they can be discriminated.
  • Another advantage is that new information on defect classification can be introduced at any time during system operation. For example, as shown in FIGURES 4A, 4B and 4C, a new data point or defect can be added that modifies the boundary between two defect classes. The addition of this new knowledge can be used to refine the decision boundary between classes to an arbitrarily fine degree.
  • A fourth advantage is that the introduction of knowledge into the knowledge base is additive, meaning that members of other existing defect classes are not reclassified simply because one defect class is being modified by the introduction of new information. This is unlike templating methods, where the modification of a threshold value is likely to upset the classification accuracy of a number of defect classes and the total effects can only be determined by experimentation (across multiple board types and lots).
  • a further advantage of the present knowledge base is that decision making is explainable, justifiable, and correctable.
  • the decision making component of the inspection system enables any classification made by the system to be presented to the operator with the basis for the decision. Often, the explanation is based on previous interactions with the operator, enabling the growth of trust in the operation of the inspection system.
  • the inspection system is designed in an object-oriented, modular fashion, so that the speed at which the system runs can be regulated by the type of computational support given to it. For example, inspection times in the range of two minutes to below twenty seconds can be realized for a typical twenty by eighteen inch printed wiring board.
  • Knowledge-based decision making for automated visual inspection of products, such as populated printed wiring boards, has the potential to fill many information-gathering holes in current assembly production processes.
  • the present technology can detect and classify relevant defects by making informed, reasoned judgments on the components in images. The system can do this because it works with semantic descriptors of images that are designed to maximize discrimination between defect classes, and because the system has the knowledge to make those decisions with regard to other possible defects.
  • The present technology also can provide accurate process control information. Accurate defect classification is an important prerequisite for accurate process control information; detection of anomalies alone is generally not enough. Further, conventional defect detection systems often rely completely on human operators to make classification decisions, which results in data that is not timely or consistently accurate.
  • the present invention reduces false alarms.
  • the ability to accept new information from the operator in real-time allows the system to reduce the number of false alarms that it produces, and eliminates the reoccurrence of any particular false alarm.
  • the knowledge structuring and decision making features of the system allow it to learn the correct classification of a defect that was initially classified incorrectly and not repeat that error.
  • The corrected classification becomes part of the knowledge base, which is retained from run to run, applying the learned knowledge to a wide variety of new component instances.
  • The system also simplifies programming of new boards for inspection. Populated printed wiring board inspection is made significantly easier by this adaptive learning technology. Bringing a never-before-seen board through inspection programming can be done by presentation of CAD information and a sample board (populated and unpopulated) and through selection of inspection criteria preferences.
  • the present system also gives the manufacturer an ability to perform full 100% inspection of boards assembled on an equipped production line. This can be viewed in two ways: 1) it provides a gate facility that prevents defective boards from escaping the manufacturing facility; and 2) it provides continuous monitoring feedback that allows new insights into the production process to be found and exploited.
  • FIGURE 5 is a diagram of one embodiment of an automated visual inspection station, indicated generally at 60, for detecting and classifying defects in printed wiring boards according to the present invention.
  • Station 60 includes a platform 62 and vibration isolated subplatform cradle 63 constructed from metal components to provide physical support for station 60.
  • a pair of guides 64 are mounted on subplatform 63 and provide a path for populated printed wiring boards 66.
  • a stop 68 can be used to stop each board 66 at the appropriate position for inspection, and a thumper 69 can be used to position board 66 against one of guides 64.
  • Guides 64, stop 68 and thumper 69 form a material handling system for boards 66. This material handling system can be 18 inches wide by one meter long with a two-speed, reversible conveyor to handle boards 66, and can have a SMEMA interface built into platform 62 for interfacing with other components of a manufacturing line.
  • Station 60 further includes a light source 70 which provides light for illuminating board 66.
  • Light source 70 can be a FOSTEC DC-regulated 150 Watt source with analog control and long-life quartz halogen bulbs (DDL type).
  • Station 60 also includes a camera 72 that receives an image through an optic 74.
  • Camera 72 can be a DALSA CL-E2 2048A TDI line scan camera with a VISION 1 PS1 linear power supply for CL-E2.
  • Optic 74 can be an INVARITAR telecentric optic with 0.37x magnification over a 72 millimeter field of view (FOV).
  • the light generated by light source 70 is split to two heads 76 through a bifurcated optical fiber 78.
  • the heads 76 can be FOSTEC 6 inch line heads with cylindrical lenses.
  • Light source 70, camera 72, optic 74, heads 76 and optical fiber 78 together form an imaging system for scanning and acquiring images of board 66. Because of the structured directed light provided by heads 76, three-dimensional spatial information can be extracted/computed from the intensity of the reflected light during scans.
  • FIGURE 6 is a diagram showing more detail of one embodiment of this imaging system.
  • the precision positioning system includes a threaded precision guide 80 having a motor 82 and a threaded precision guide 84 having a motor 86.
  • As alternatives to threaded guides, guides 80 and 84 could be, for example, pneumatic, hydraulic, or magnetic.
  • Motors 82 and 86 include motor drive units which can be simultaneously controlled by a two-axis controller in interface devices 88. These components allow the imaging system to be moved across board 66 to scan in an image of board 66.
  • the threaded precision guides 80 and 84 can be DAEDAL 406 series linear stages (e.g., 16 and 20 inch) with precision linear encoders and two pitch lead screws.
  • Motors 82 and 86 can be two ZETA motor drives (one per axis) controlled by COMPUMOTOR 6200 Series two-axis indexers.
  • Interface devices 88, which include optical relays for digital control signals and digital/analog converters for analog control signals, interface between computer system 90 and the various controlled components.
  • Computer system 90 provides a processing and control engine for operating inspection station 60 and operates generally in the same manner as described above with respect to FIGURE 1.
  • Computer system 90 receives status information and provides control signals as well as communicating with peripherals 92.
  • Computer system 90 can be an Industrial Computer Source rack mounted chassis having a single board computer with an INTEL PENTIUM PRO 200 MHz processor and other typical computer components.
  • Interface devices 88 can include a 6200 indexer interfacing through a standard RS232 serial port, and computer boards with digital-to-analog and OPTO 22 interface cards.
  • Camera 72 can be controlled by a data I/O card in computer 90, and light source 70 can be controlled by multi-functional I/O PWB with a dual D/A upgrade.
  • The data I/O card can be a MATROX INTERNATIONAL camera interface card (RS-422).
  • An associated processing accelerator card (described above) can be a MATROX INTERNATIONAL GENESIS 64/8 and MATROX INTERNATIONAL GPRO 16/8 processing card set that communicates via a high-speed backplane. With a MATROX INTERNATIONAL GENESIS card, additional MATROX processing cards can be added to scale processing capacity.
  • Peripherals 92 can include a 17 inch color monitor, standard keyboard and mouse/trackball, and a 4 Gbyte DAT tape drive.
  • the network interface (not shown in FIGURE 5) can be a 100 Mbit (100 BASE T) Ethernet card.
  • FIGURE 6 is a diagram of one embodiment of the imaging system of station 60.
  • camera 72 is connected to optic 74 which includes an adapter optic 94 and a telecentric optic 96.
  • Telecentric optic 96 provides a support for mount 98 on which bifurcated line heads 76 are mounted. Heads 76 direct structured focused light on object (printed wiring board) 66 which is reflected and captured by optic 74.
  • station 60 of FIGURE 5 can provide automatic detection and classification of component, lead, and board level defects on populated printed wiring board 66, including discriminating between defects and "don't care" anomalies.
  • Station 60 provides rapid throughput with real-time optical inspection and, for example, can inspect an 18 by 20 inch board 66 with seven passes in approximately two minutes.
  • Station 60 does not require on-the-floor programming to modify or add defects or to change board types, the latter only requiring standard component placement and data CAD files and board samples (populated and unpopulated) .
  • Station 60 includes automatic background self-diagnostics and correction and can be networked with other stations or a central management unit. Further, station 60 can be set up to electronically transfer messages (e.g., e-mail, TCP/IP, paging, and other messages) as well as appropriate images for engineering analysis of low-confidence classifications. Station 60 can be delivered fully functional to automatically detect and classify defects and can be trained to classify new defects encountered in production. For example, the table provided in APPENDIX A, below, shows defects that station 60 is trained to detect and classify. Further, APPENDIX B provides example specifications for one implementation of station 60.
  • The process implemented by inspection station 60 can include a number of operational modes. At power-up/log-in, station 60 can automatically perform diagnostics and calibration tests and adjustments on critical inspection station parameters such as illumination, control, and overall system health. An initialization window can then prompt the operator either to select a board type from a directory or to enter the name of a new board type. If a new board type is specified, station 60 enters a changeover mode. In the new board changeover mode, the station interface prompts the operator through the stages to create, develop, and load a new board knowledge base. If a known board type is specified, station 60 verifies that the complete knowledge base is available, and station 60 is then ready to inspect. Station 60 generally can be ready for board inspection in minutes and returns the inspection and characterization window to the operator.
  • station 60 runs system diagnostics continuously in the background during operation. Automatic correction is made or the operator is prompted with the corrective action required.
  • station 60 typically allows two modes to be selected: characterization and inspection. In the characterization mode, station 60 classifies board defects in greater detail and outputs complete information required for process control (PC) feedback. In inspection mode, station 60 acts in a pass/fail mode as a production line gate (generally, well understood and controlled processes) . Station 60 can toggle between characterization and inspection modes. When station 60 encounters a defect in which it has low confidence for classification, it queries the operator for input and enters an incremental training mode. Upon completion of the incremental training, station 60 returns to the inspection/characterization mode and continues operation.
  • the addition (or removal) of defect classes to (or from) the knowledge base can be performed, assuming proper authorization.
  • When station 60 classifies a defect in which it has low confidence, the station interface can show both the region in question and the defect in question on the display.
  • Station 60 can then provide its best guess regarding the classification and offer the operator three choices: (1) verify that its best guess is indeed correct; (2) identify its best guess as wrong and classify the defect as another type previously known to the station; or (3) identify its best guess as wrong and classify the defect as a new type previously unknown to the station.
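The three operator choices just listed can be sketched as a small update routine. The knowledge-base representation (a dict of example points per class) and every name below are assumptions for illustration, not the station's actual interface:

```python
def handle_low_confidence(point, best_guess, choice, knowledge_base,
                          new_label=None):
    """Sketch of the three operator responses to a low-confidence guess.

    `choice` is "confirm" (guess is right), "known" (wrong guess, but a
    class the station already knows), or "new" (a previously unknown
    class). The dict-of-example-points knowledge base is an assumed
    representation.
    """
    if choice == "confirm":
        label = best_guess
    elif choice == "known":
        label = new_label            # must already exist in knowledge_base
    else:                            # "new": create a new defect class
        label = new_label
        knowledge_base.setdefault(label, [])
    # In every case the confirmed example is retained, so future
    # classifications of similar defects gain confidence.
    knowledge_base[label].append(point)
    return label
```

This mirrors the incremental-training behavior described above: whichever choice the operator makes, the knowledge base grows rather than being overwritten.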
  • Station 60 can ask the operator to verify or correct the classification and several example images and information files of each defect class or prototype relevant to a classification decision can be presented to the operator to aid in the decision.
  • a line manager, engineer, or technician can be the only one authorized to make these decisions.
  • the results of incremental training can be broadcast to a central control and/or directly to all other networked stations .
  • Station 60 has a report mode in which summaries of the results of inspection/characterization and station status can be generated. For example, the total number of a certain type of defect over the last N boards can be graphically presented, or a report of the station's calibration history can be generated. Reports can be automatically produced and electronically distributed on a regular schedule. Further, an analysis mode allows trends in inspection data to be used for process understanding and control. The analysis (or data-mining) mode of station 60 allows the operator to determine the relationships between process parameters and product quality (defects) over a range of time scales. Station 60 can thus play a crucial role in process control. Early detection and classification of defects alerts the operator to problems and allows early correction at the malfunctioning manufacturing stage.
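The "defects over the last N boards" report mentioned above reduces to a simple aggregation. The log format (a list of per-board defect-label lists) is an assumed representation for the example:

```python
from collections import Counter

def defect_trend(inspection_log, defect_type, last_n):
    """Count occurrences of one defect type over the last N boards.

    `inspection_log` is assumed to be a list of per-board lists of
    defect labels, newest last; the format is illustrative only.
    """
    recent = inspection_log[-last_n:]
    return sum(Counter(board)[defect_type] for board in recent)
```

A plot of such counts over sliding windows is the kind of trend the analysis mode would expose for process control.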
  • the inspection station of the present invention could be implemented using numerous alternative components other than those shown in FIGURE 5. However, certain components have been selected as being well suited to the task of inspecting populated printed wiring boards.
  • a line scan camera is particularly useful because it gives higher resolution imagery over a wider field of view than can be obtained from available, low-cost area cameras.
• a line scan camera continuously scans an image, so it works well with moving parts; this naturally complements the inspection process because objects are typically presented in motion.
  • the line scan camera can provide a 100% resolution, or 100% fill factor, whereas an area camera will not.
  • the line scan camera can accomplish this over a larger field of view and can collect data faster on moving targets.
• TDI: time delay integration
  • the reason for choosing a TDI camera is that it overcomes the inherent light insensitivity of conventional line scan cameras.
  • the TDI line scan camera provides an ability to get high speed imagery without needing excessive amounts of light. Without the TDI aspect, an excessively bright light source might be required.
  • the TDI line scan camera allows low light levels to be used while the camera is run at a relatively fast rate (e.g., 8,000 lines a second at low illumination).
  • a telecentric optic component is used with the camera to provide a "straight on" view of the board under inspection.
  • a MELLES GRIOT INVARITAR telecentric optic is used because it provides advantages over conventional optics and an advantage over other telecentric optics.
  • conventional optics view part features at different angles, up to 30 degrees or more, across the FOV. These changes in viewing perspectives can make the tops and bottoms of a given object appear at different x,y locations in the image, and can cause small components at the edges of the FOV to be occluded by larger components that are closer to the center of the FOV.
  • INVARITAR optics are designed to view objects "straight on" across the entire FOV, removing both limitations.
• with conventional optics, the magnification depends on the object-to-optic distance.
  • INVARITAR telecentric optics provide large gauging depths of field (DOF) because their magnification is independent of object- to-optic distance. Compared to conventional, and most telecentric optics, INVARITAR telecentric optics can reduce magnification errors by a factor of 10 or more.
• the optic physically mounts directly to the camera and needs to match the camera so that the optic focuses the required FOV onto the camera sensor. So, if the sensor has sides X and Y, and the optic field of view is A and B, the optic needs to map the field of view down to the sensor (A,B:X,Y).
  • the calculation of FOV involves the resolution required and the number of pixels in the line scan array. For example, if a 1 mil. (.001") image pixel resolution is required, a 2,048 pixel line scan camera can image a 2.048" FOV.
  • the calculation of the required optic magnification involves the physical size of the line scan imager pixels and the image pixel resolution.
  • One implementation uses the following:
• Image pixel resolution: 1.383 mils (35.14 µm); imager size: 2,048 pixels.
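The FOV and magnification arithmetic above can be sketched as follows. The sensor pixel pitch used in the magnification example is an assumed value for illustration only; the document does not state the imager's physical pixel size.

```python
MIL_TO_UM = 25.4  # 1 mil = 0.001 inch = 25.4 micrometers

def field_of_view_inches(pixel_resolution_mils, num_pixels):
    """Object-side field of view covered by the line scan array, in inches."""
    return pixel_resolution_mils * 1e-3 * num_pixels

def required_magnification(sensor_pixel_um, image_pixel_resolution_um):
    """Optic magnification that maps one object-side pixel onto one sensor pixel."""
    return sensor_pixel_um / image_pixel_resolution_um

# Example from the text: 1 mil resolution on a 2,048 pixel line scan camera
assert abs(field_of_view_inches(1.0, 2048) - 2.048) < 1e-9

# Implementation figures from the text: 1.383 mil (35.14 um) pixels, 2,048 pixel imager
fov = field_of_view_inches(1.383, 2048)  # ~2.832 inches across the scan line

# ASSUMED sensor pixel pitch of 13 um (not given in the document)
mag = required_magnification(13.0, 35.14)  # a demagnifying optic, ~0.37x
```

A larger image pixel resolution (coarser object-side sampling) widens the FOV but lowers the magnification the optic must provide.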
• Without an optic such as the INVARITAR telecentric optic, the inspection of many small occluded parts and leads would be virtually impossible because the required image data could not be obtained.
  • the benefits of an optic such as the INVARITAR telecentric optic improve the overall accuracy and repeatability of all inspections.
• Regarding illumination, there are numerous ways to accomplish it depending on what needs to be imaged.
  • a halogen light source is used along with a structured light system with heads on each side that are driven by a bifurcated fiber optic cable from the source.
• the structured lighting provided by the heads strikes points on the board, reflects up, and is collected on the sensor.
• the structured illumination provides significant advantages by allowing the generation of three-dimensional spatial information from a two-dimensional captured image.
• the third dimension (height) can be captured based upon the relative intensity of the reflected light and a priori information about the topology and reflectivity of the object being imaged. Quantities such as solder paste volume can then be estimated from the three-dimensional spatial data.
• When imaging a three-dimensional surface (i.e., a board), a particular reflection back into the camera is obtained depending on the actual topology of that surface. So, for example, if the system is looking at a solder joint, the reflection will be noticeably different depending on what portion of the joint is lighted and the physical structure of the joint (defective or not).
  • the intensity or the change in intensity provides information from which three-dimensional spatial features can be recovered.
  • the rate of change can be recorded as a gradient for the inspected surface. This allows equal height surfaces to be reconstructed for the object, somewhat like a topology map.
  • a volume estimate can be generated from this information by initially calibrating based upon a known volume.
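The gradient-to-topology-to-volume chain described above might be sketched as follows. This assumes the intensity changes have already been converted to surface slopes, and the calibration factor is a hypothetical value obtained, as the text describes, by imaging a deposit of known volume.

```python
def heights_from_gradient(gradients, dx=1.0, base_height=0.0):
    """Integrate a 1-D slope profile (recovered from reflected-light
    intensity changes) into a relative height profile."""
    heights = [base_height]
    for g in gradients:
        heights.append(heights[-1] + g * dx)
    return heights

def estimate_volume(height_map, pixel_area, calibration_factor=1.0):
    """Sum per-pixel heights into a volume estimate; calibration_factor
    would come from an initial calibration against a known volume."""
    raw = sum(h * pixel_area for row in height_map for h in row)
    return raw * calibration_factor

# Toy solder-paste bump: slope rises for two steps, then falls for two
profile = heights_from_gradient([1.0, 1.0, -1.0, -1.0], dx=0.5)
# profile == [0.0, 0.5, 1.0, 0.5, 0.0] -- equal-height contours could be
# drawn through this, like the topology map mentioned above
volume = estimate_volume([profile], pixel_area=0.25)
```

A real implementation would integrate a 2-D gradient field and correct for reflectivity, but the calibrated-sum structure is the same.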
• the imaging system design enables this type of analysis and provides information that most systems cannot capture.
  • On the control and processing side there are generally a number of printed circuit boards that can accomplish the various functions. For example, the imager feeds data to a data I/O board that then directly feeds a processing accelerator card. The imager has control signals that tell it to collect and send image data.
  • the data is then collected and reconstructed into a two-dimensional image.
  • the processing accelerator cards have a direct high-speed connection to the data I/O card. Additionally, data transfer between accelerator cards can occur across a backplane and not across the host system bus. Because the host system is involved with executing the final classification algorithms and graphical user interface (GUI) and managing the control signals, it would be undesirable to over task it with either pre-processing or with data transfer. It is for this reason that intensive data transfer and processing is accomplished by the processing accelerators.
  • the resulting descriptors are then passed to the host computer over the system bus. As for the physical platform, it is recognized that the platform may not be completely perfect. Thus, built-in mechanisms are included to allow adjustment.
  • One particular feature supported by the host system is the ability to electronically transfer a message to a production manager so that the production manager does not have to come over to the manufacturing area each time something happens.
  • the production manager can receive the electronic message in a remote office, look at any attached images and do what needs to be done for the classification.
  • FIGURE 7 is a flow chart of one embodiment of an automated inspection process implemented using station 60 of FIGURE 5.
• In step 100, the station acquires image data from scanning the board being inspected.
  • the image data is then pre-processed in step 102.
  • the image is described by selected descriptors in step 104.
  • the descriptors are then analyzed, in step 106, such that the image is analyzed in context with the system knowledge base.
  • defects are identified and classified.
• In step 110, the station determines whether the confidence in the classification is above a preset threshold. If so, the station checks, in step 112, whether analysis of the board is complete. If not, the process continues at step 100. If the analysis is complete, the process continues at step 114 with the ejection of the board and obtaining a new board to inspect.
• If, in step 110, the confidence was not above the threshold, the process continues at step 116.
• In step 116, the operator is alerted and provided with the station's best option as to the proper classification.
• In step 118, the station accepts confirmation of the best option by the operator or accepts an alternate classification by the operator.
• In step 120, the station adds information to the knowledge base as necessary to accomplish incremental learning based upon the operator's input.
• The process then continues at step 112 as discussed above. Classification is accomplished according to the present invention in accordance with a set of pre-specified policies. Descriptors are extracted and stored to the knowledge base.
  • the new descriptors are compared by a variety of metrics to those stored in the knowledge base to determine whether or not a defect exists and what type it is. So, if there is a good match, the system can have confidence that the object being inspected contains the specified defect. If there is not a good match, the system can select the closest match as a best option. The operator can then provide feedback as to whether the option is accurate, a different class should be selected, or a new class should be defined. If it is the latter, then in one step, an entirely new class of defects is established. In this manner, the knowledge base is built incrementally from a base level of information to include more and more knowledge.
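The matching-and-feedback cycle described above might be sketched as follows. The Euclidean metric, the confidence formula, and the class names are illustrative assumptions; the document says only that "a variety of metrics" are used, and its real confidence measure also weighs population sizes and decision history.

```python
import math

def distance(a, b):
    """One possible metric over descriptor vectors (Euclidean)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(descriptors, knowledge_base):
    """Return (best_class, confidence): the closest stored example wins,
    and confidence falls off with distance to it."""
    best_class, best_dist = None, float("inf")
    for defect_class, examples in knowledge_base.items():
        for example in examples:
            d = distance(descriptors, example)
            if d < best_dist:
                best_class, best_dist = defect_class, d
    return best_class, 1.0 / (1.0 + best_dist)

def incremental_learn(knowledge_base, descriptors, operator_class):
    """Store operator-confirmed descriptors; naming a brand-new class
    creates it in one step, as described above."""
    knowledge_base.setdefault(operator_class, []).append(descriptors)

# Hypothetical knowledge base with two defect classes
kb = {"missing_component": [[0.9, 0.1]], "solder_bridge": [[0.2, 0.8]]}
label, conf = classify([0.85, 0.15], kb)
# label == "missing_component"; if conf fell below a threshold, the
# operator dialog would be invoked before learning the example
incremental_learn(kb, [0.85, 0.15], label)
```

Each confirmed decision enlarges the class populations, which is how the knowledge base is "built incrementally from a base level of information."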
  • the system generates more and more trust with the operator because the operator sees his decisions appear in the decision making of the system.
• part of the incremental learning cycle can be not only to store the descriptors but also to store who made the decisions on low confidence classifications and when. This allows decisions to be backtracked out of the system if they prove to be poor judgments.
  • the confidence level of the system can also be based upon how many of the particular type of defect the system has seen. Thus, the confidence level represents not only how confident the system is in the single classification but also populations of classifications. The statistics therefore use populations of information with known quality.
  • Process control can then be based upon the classifications.
  • the overall process can be improved by analyzing defect histories and matching defects to a cause. Placing decisions as to how to classify a defect in context with other defects and decisions produces higher quality input from operators, and over time, the knowledge base gets more and more refined.
  • the present invention is applicable to analysis of bare printed wiring boards, semiconductor packages (e.g., PPGAs, PBGAs, C4 packages, or TAB tape) and other types of objects.
• the imaging system can be fixed while the object to be analyzed is moved. This approach would allow the imaging system to remain more stable during scanning, which would allow more precise motion for ultra high resolution images (e.g., 3 µm pixels vs. 30 µm pixels), but would likely require a larger platform with a wider range of movement.
  • an integrated piece of equipment can include both object processing components and automated visual inspection components.
  • the object processing components receive the object and process it to prepare it for subsequent steps in the manufacturing process .
  • the automated visual inspection components perform substantially as described above to identify any defects after the object has been processed.
  • control components can be used to instruct the object processing components to correct the defect prior to passing the object on to the subsequent step in the manufacturing process.
• The object, for example, can be a printed wiring board where the object processing components perform solder paste deposition, and the automated inspection components allow solder paste verification.
  • the object processing components can perform pick and place of components, and the automated inspection components allow component placement verification.
  • Material handling system is a SMEMA compatible conveyor segment and is compatible with other SMEMA production line equipment.
• The station is not integrated into the production line.
• The station is loaded/unloaded manually with conveyor assist.
• Real-time 100% inspection at production line rates.
  • Scalable processing engine allows increased line rates to be met.
  • Processor architecture is transparent to the operator.
  • Embedded classification analysis mechanisms flag questionable classifications for operator analysis and immediate correction which pushes escape rates down.
• GUI: Graphical User Interface
• DDC: defect detection & classification
• Throughput is primarily a function of component count and inspection routines being applied. Board area does not typically affect the station throughput; only for very sparse boards does board area set throughput levels.
  • the throughput numbers provided assume a complete set of relevant inspection routines is applied to each part. Minimum part size: 0402. Minimum lead size: 10 mil pitch.
• Footprint: 1 m (L) x 1 m (W) x 2 m (H); occupies the production line floor space required by a standard 1 m conveyor section. Allows for On-Line and Off-Line application.
  • GUI Requirements GUI includes routines and screens for system start up, known board preparation, unknown board preparation, inspection, low confidence/defect verification and validation, new defect entry, knowledge base verification/editing, report generation, data base set up.
  • Station includes network connection, disk drive and resident 2G hard drive for data storage and archiving. High-capacity tape drive systems are optional.
  • Air 100 PSI, 1.0 CFM, Filtered &
• Safety Mechanisms: Emergency power-off button (entire station is shut down) and emergency interrupt button (only moving components are shut down). Emergency interrupts are also generated when any user-openable access panel is opened. Once triggered, a reset mechanism must be enabled to re-establish power.
• the Calibration Toolkit contains routines for automatic station calibration and calibration verification, including illumination verification and adjustment, new-bulb calibration, station alignment and motion analysis, and image quality assurance.
  • the only regular station maintenance required is the periodic replacement of illumination system light bulbs.
  • the station will automatically notify the operator when a given bulb set requires replacement.
  • a step by step bulb replacement procedure will be provided to the operator through the GUI (no tools will be required) .
  • the station automatically performs calibration and returns to the inspection mode of operation. It is expected that the light bulbs will require replacement no more than once every two months. It is recommended that two sets of spare bulbs be kept available for each station.

Abstract

An automated visual inspection system and process are disclosed. The system (10) includes an imaging system (12) operable to provide image data and a precision positioning system (14) operable to move the imaging system (12) to scan an object (16). A processing and control engine is coupled to receive the image data and to provide control signals to the precision positioning system (14) and other components. The processing and control engine pre-processes the image data, represents the image using descriptors, compares the descriptors to information in a knowledge base and detects and classifies defects based upon results of the comparison. The processing and control engine also associates a confidence level with the classification of the defect. In one embodiment, if the confidence level is not above a specified tolerance, the processing and control engine alerts an operator and provides a best option as to the classification of the defect, accepts a confirmation or alternate classification from the operator, and adds information to the knowledge base after a decision is made. In another embodiment, the imaging system (12) provides structured, reflected light intensity as part of the image data, and the processing and control engine recovers three-dimensional spatial information based upon the reflected light intensity.

Description

AUTOMATED VISUAL INSPECTION SYSTEM AND PROCESS FOR DETECTING AND CLASSIFYING DEFECTS
TECHNICAL FIELD OF THE INVENTION

This invention relates in general to automated inspection of manufactured products, and more particularly to an automated visual inspection system and process for detecting and classifying defects.

BACKGROUND OF THE INVENTION

The need for automated inspection of manufactured products arises from the inherent inefficiency of human inspectors caused, in part, by an inability to maintain a continuous focus and to apply a consistent analysis from day to day. Conventional automated inspection technology typically sacrifices classification accuracy -- and hence the ability to provide meaningful process control information -- for detection ability and ease of implementation. Thus, while conventional inspection technologies may be able to detect a wide range of anomalies, they do not provide adequate classification of defects. This is problematic because detection without accurate classification does not provide the information needed to improve the manufacturing process and increase production yields. For example, the manufacture of populated printed wiring boards for computing devices is an industry in which automated inspection could be used to reduce the number of defective board escapes (i.e., boards that make it out of the factory and to the customer) and improve the yield of the production line. This automated inspection should have two main goals: (1) to detect and classify visible defects and (2) to provide process control information. With respect to defects, some are at the component level, such as missing components, components off-pad, reverse-polarity, wrong components, and foreign material. Other defects are at the lead level and include bent and missing leads, individual lead skew, solder bridges and solder joints with dewets or insufficient solder.
With respect to process control, defect classification and source of the defect information can be used to improve the production process, thereby improving production yields, reducing the number of defective boards produced, and reducing the time defective boards are in rework.
Conventional automated inspection technology does not accomplish these goals. For populated printed wiring boards, the conventional technology is generally based on a technology called templating. A template is an image of a component or section of populated printed wiring board that is assumed to be good. As boards are being inspected, the template is subtracted from the image of the same component or board section taken from the board under inspection. The template is divided into a number of different regions, each with one or more threshold values . The difference between the image under inspection and the template is compared to the threshold to determine if there is a potential defect in that region. Templating as a technology for automated visual inspection of populated printed wiring boards has some benefits that, in the past, made it a viable approach. The approach is conceptually very simple and is straightforward to implement in hardware. The ease of implementation has led to the development of specialized hardware designed to process a single image frame very quickly. The template based method is also capable of detecting a number of possible errors. Further, when defect sizes are relatively large, there can be enough flexibility in the specification of the "important" regions around a component that limited classification of detected defects is possible.
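The template subtraction and per-region thresholding described above can be illustrated with a minimal sketch; the region layout and threshold values here are invented for illustration.

```python
def template_defect_regions(image, template, regions):
    """Flag regions whose summed absolute difference from the template
    exceeds that region's threshold. `regions` maps a region name to
    ((row0, row1, col0, col1), threshold)."""
    flagged = []
    for name, ((r0, r1, c0, c1), threshold) in regions.items():
        # Subtract the template from the image over this region only
        diff = sum(
            abs(image[r][c] - template[r][c])
            for r in range(r0, r1)
            for c in range(c0, c1)
        )
        if diff > threshold:
            flagged.append(name)
    return flagged

template = [[0, 0], [0, 0]]        # "known good" reference image
image = [[0, 0], [0, 9]]           # bright anomaly in the lower-right
regions = {"pad": ((1, 2, 1, 2), 5)}
assert template_defect_regions(image, template, regions) == ["pad"]
```

The sketch also makes the drawbacks concrete: every legitimate variation (solder shape, replacement part) raises `diff`, so thresholds must be hand-tuned per region, and the method yields a detection but no classification.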
However, there are significant drawbacks as well. Templating technology is very difficult to extend. As new defects arise, a set of thresholds must be adjusted which often is too difficult for the typical production line operator or impossible in some cases, and effects of a change can only be determined by experimentation. Further, if classification is required over a range of defects, functional additions must be made to the templating decision method (e.g., the magnitude of the differences triggers computational functions to be run, and the defect classification is made on the output of the functions.) In addition, specialized hardware implementations, made on the basis of an imaging system field of view (FOV) and resolution capable of handling larger defects, require substantial design modification when higher resolutions are needed to distinguish defects. Detecting discrepancies between the board under inspection and a template is relatively inapplicable to current inspection problems. Templating inspection methods work well where there is very little variation in the images to be analyzed (and where variation alone is a direct indication of a defect) . Historically, component sizes have been large enough and production rates have been slow enough that templating methods could apply threshold values "creatively" to produce a functional solution. However, for current populated printed wiring board manufacturing, there are numerous variations that can produce satisfactory boards -- e.g., variations in solder shape, replacement generic parts, and minor placement differences -- where the variations can represent a majority of the component itself. These variations make it unpractical, if not impossible, to successfully apply templating to current automated inspection problems and modern production lines .
SUMMARY OF THE INVENTION
In accordance with the present invention, an automated visual inspection system and process for detecting and classifying defects are disclosed that provide advantages over conventional automated inspection systems and methods.
According to one aspect of the present invention, the automated visual inspection system and process include an imaging system operable to provide image data and a precision positioning system operable to move the imaging system to scan an object. A processing and control engine, including a scalable processing accelerator, is coupled to receive the image data and to provide control signals to the precision positioning system. The processing and control engine pre-processes the image data, represents the image using descriptors, compares the descriptors to information in a knowledge base and identifies and classifies a defect based upon results of the comparison. The processing and control engine also can associate a confidence level with the classification of the defect. In one embodiment, if the confidence level is not above a specified tolerance, the engine alerts an operator and provides a best option as to the classification of the defect, accepts a confirmation or alternate classification from the operator, and adds information to the knowledge base responsive to the confirmation or alternate classification. In another embodiment, the imaging system provides structured, reflected light intensity as part of the image data, and the engine recovers three-dimensional spatial information based upon the reflected light intensity when the image data is pre-processed.
A technical advantage of the present invention is the automatic classification of defects based upon comparisons with information held in a knowledge base.
Another technical advantage of the present invention is the use of a line scan camera, a telecentric optic, and structured lighting to allow the inspection system to collect three-dimensional spatial information about a scanned object in addition to a two-dimensional spatial image of the scanned object.
An additional technical advantage of the present invention is the modular architecture which allows for scaling of data collection and processing to desired levels of resolution, accuracy, and throughput (processing speed) .
A further technical advantage of the present invention is the capability to adjust the physical platform to mechanically adjust for skew. Also, the system can periodically check for and remove fixed pattern errors caused by the system's mechanical characteristics .
Another technical advantage of the present invention is the maintenance of information between a number of inspection systems about classifications of prior inspections to place each decision in historical context. Additional technical advantages should be readily apparent to one skilled in the art from the following figures, descriptions, and claims.
BRIEF DESCRIPTION OF THE DRAWINGS
A more complete understanding of the present invention and the advantages thereof may be acquired by referring to the following description taken in conjunction with the accompanying drawings, in which like reference numbers indicate like features, and wherein:
FIGURE 1 is a block diagram of one embodiment of an automated visual inspection system according to the present invention;
FIGURE 2 is a flow diagram of one embodiment of a process for detecting and classifying defects implemented by the system of FIGURE 1;
FIGURE 3 is a diagram of an example of a detected defect mapped onto a decision space of a knowledge base according to the present invention;
FIGURES 4A, 4B and 4C are diagrams of three possibilities for classification of the detected defect of FIGURE 3 according to the present invention;

FIGURE 5 is a diagram of one embodiment of an automated visual inspection station for detecting and classifying defects in printed wiring boards according to the present invention;
FIGURE 6 is a diagram of one embodiment of the imaging system of the station of FIGURE 5; and
FIGURE 7 is a flow chart of one embodiment of an automated inspection process implemented using the station of FIGURE 5.

DETAILED DESCRIPTION OF THE INVENTION
In general, it is not adequate for automated visual inspection systems to simply detect anomalies. Often, there are too many anomalies that are viable component constructions and should not be identified as defects. To be effective in supplying a manufacturing entity the information needed to improve yields on the production line, inspection stations need to provide accurate classification (both defective and non-defective) of anomalies and need to report defects and variations to the user in a way that allows predictions of production line performance. The weaknesses of templating and other conventional techniques motivated the development of a fundamentally different technology for visual inspection. According to the present invention, adaptive knowledge-based reasoning is used which stores inspection knowledge in a database and uses that knowledge to generate high-quality defect detection and classification. The benefits of this technology go beyond accurate classification by making setup, operation, and maintenance a simpler, end-user task.
Automated Visual Inspection System

FIGURE 1 is a block diagram of one embodiment of an automated visual inspection system, indicated generally at 10, which is based on an adaptive knowledge-based reasoning technology according to the present invention. System 10 includes an imaging system 12 that is coupled to a precision positioning system 14. Precision positioning system 14 operates to position imaging system 12 to scan an object 16 that is being inspected. Object 16 is coupled to a material handling system 18 which operates to position object 16 within the inspection range of imaging system 12.
Imaging system 12, precision positioning system 14 and material handling system 18 are coupled to and communicate with a host computer system 20 which provides a processing and control engine for system 10. Material handling system 18 provides gross manipulation of object 16 and delivers object 16 for viewing by imaging system 12. Material handling system 18 can include robotic, mechanical, or manual movement to position object 16. Precision positioning system 14, which is coupled to imaging system 12, operates to position imaging system 12 over object 16 and to move imaging system 12 to scan object 16. In one embodiment, precision positioning system 14 sweeps the length of object 16, orthogonally steps over, and continues to sweep until the width of object 16 is completed by successive parallel sweeps. It should be understood that alternatives exist for obtaining the scanned image, including having a fixed camera with precision motion of object. In the embodiment of FIGURE 1, imaging system 12 includes an illumination system 22 which operates to illuminate object 16 during inspection. An optic 24 views the illuminated object 16 and provides an image to imager 26. Imager 26, in turn, provides image data to computer system 20. In one embodiment, imager 26 can be a time delay integration (TDI) line scan camera which provides relatively high resolution over a large field of view and increased light sensitivity. The TDI line scan camera operates with a continuous scan and provides high speed imagery without the need for excessive light. Optic 24 maps a given object field of view (FOV) onto the sensor array of imager 26 based upon optic 24 magnification/demagnification. In one embodiment, optic 24 is a telecentric optic with constant perspective and magnification across the entire FOV to achieve a top-down view of object 16 with relatively little parallax distortion or occlusions. 
All other fixed pattern optical distortions can then be mapped out by calibration algorithms running on host computer 20. Illumination system 22 can implement various types of lighting. Lighting approaches can include, for example, diffuse, back lighting, polarized front illumination, and directional structured front illumination. Different light sources provide different spectrums of light, each being useful in different situations. In general, the illumination system 22 needs to be tailored to match the desired task. In one embodiment, a white light source with a bifurcated fiber optic light array provides a structured, focused, off-axis light illuminating object 16. This illumination allows three-dimensional spatial information to be acquired based upon the intensity of the reflected light received through the optic 24.
Computer system 20 includes control hub 28 which interfaces between computer system 20 and various connected components. Host computer system 20 can be an IBM compatible personal computer, with typical components, which is rack mounted. Computer system 20 executes software engine 30 which provides a user interface and performs the classification analysis of image descriptions. Control hub 28 receives status signals from the connected components and provides the status signals to software engine 30. Software engine processes the status signals, in addition to the image descriptors, and generates control signals to manage operation of the components. The control signals are then provided to the various components through control hub 28. As shown, precision positioning system 14, material handling system 18, illumination system 22, and optic 24 can all receive control signals and provide status signals through control hub 28. In one embodiment, control hub 28 can include optical interfaces (e.g., OPTO 22) for digital signals and digital/analog (D/A) interfaces for analog signals. Computer system 20 further includes a data input/output (I/O) card 32 that controls and receives the image data from imager 26. Data I/O card 32 can function essentially as a frame grabber which reconstructs the scanned image and converts the data for processing accelerator card 34. Processing accelerator card 34 processes the image data to provide image descriptors to software engine 30. Processing accelerator card 34 performs image processing algorithms on the image data to produce the descriptors of the image. Processing accelerator card 34 then provides the image descriptors to software engine 30 which performs defect detection and classification by comparing the descriptors to information in a knowledge base of historical information. 
Computer system 20 further includes support interface hardware 36 which interfaces between computer system 20 and network devices and/or peripherals 38, such as a printer, back-up drives, networking devices, an electronic message (e.g., e-mail, TCP/IP, paging, or other messages) interface, a keyboard and a monitor.
Control hub 28 can also provide control signals to a system platform 39 which provides a physical support for system 10. In addition, platform 39 can be manually adjusted to fine tune system 10 and compensate for error introduced by the physical characteristics of platform 39 or of the other components.
Detection and Classification
In general, software engine 30 references populations of information about defects stored in its knowledge base. When software engine 30 makes a decision, it associates a confidence level with the decision. Software engine 30 typically has more confidence when a defect is similar to prior defects and a number of similar defects have been seen. The analysis is essentially an adaptive, experience/knowledge-based correlation with an underlying statistical basis or behavior. The practical issues for confidence in a decision can include how many examples of the defect have been seen, how many times a classification has been made in a certain way, how similar the defect is to other defect classes in the knowledge base, its relationship to other current defect classifications (defects found at other board locations), and which operator, if any, has entered the classification. Once determined with sufficient confidence, classification of defects allows the manufacturing entity not only to correct the defects but also to identify and correct the source of the problem.
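The experience-based confidence behavior described above can be sketched as follows. This is a hypothetical illustration only, not the patented method: the scoring function, the class structure, and the scale parameter are all invented for the example, and the actual engine also weighs factors such as operator identity and similarity to other defect classes.

```python
from dataclasses import dataclass
from math import dist


@dataclass
class DefectClass:
    name: str
    examples: list  # historical descriptor vectors for this class


def classification_confidence(point, cls, scale=1.0):
    """Confidence grows with the number of similar historical examples.

    Hypothetical scoring: each stored example contributes according to
    its closeness to the new point, so many near-identical prior
    defects yield high confidence while a lone distant match does not.
    """
    support = sum(1.0 / (1.0 + dist(point, ex) / scale) for ex in cls.examples)
    return support / (1.0 + support)  # squash into [0, 1)


# Three similar historical examples give solid, but not absolute, confidence.
solder_bridge = DefectClass("solder_bridge", [[1.0, 2.0], [1.1, 2.1], [0.9, 1.9]])
print(round(classification_confidence([1.0, 2.0], solder_bridge), 3))  # → 0.733
```

As the population of similar examples grows, `support` grows and the confidence approaches 1, mirroring the "statistical basis" of the text.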
FIGURE 2 is a flow diagram of one embodiment of the process for detecting and classifying defects implemented by system 10 of FIGURE 1. In region 40, an original image of object 16 is obtained from imager 26. The image data is transformed, in region 42, into a set of symbolic and real-valued descriptors by data I/O card 32 and processing accelerator card 34. This transformation can be accomplished through the use of computer aided design (CAD) component location and characteristic information, rules for determining what set of preprocessing routines need to be run on the image, and image processing routines for generating descriptor values. In region 44, the computation and aggregation of the descriptor values can be viewed as a re-description of the image, previously represented as a set of pixels, now represented as a set of descriptor-value pairs or other appropriate descriptors. The descriptors need not be a complete description of the image if the set of descriptors is chosen so that it provides the information necessary to determine if a defect exists, and, if it does, to what class of defects it belongs. In other words, the descriptors reflect the semantics of the image and inspection process. In region 46, system 10 performs historical similarity based classification using an adaptive knowledge base of historical information. The knowledge base provides a general, wide breadth of information about the inspection and classification of assembly components. This knowledge is stored in such a way as to permit a historical similarity-based classification decision making methodology to be employed during decision making. This method of classification uses the knowledge base of general inspection knowledge which is augmented with specific knowledge of how defects appear on different components of a given object 16.
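The re-description of a pixel region as descriptor-value pairs (region 44) can be illustrated with a minimal sketch. The descriptor names and the brightness threshold here are hypothetical; in the described system the descriptor set is selected per component from CAD data and computed on the processing accelerator card.

```python
def describe_region(pixels):
    """Re-describe a pixel region as a set of descriptor-value pairs.

    `pixels` is a 2-D list of grayscale intensities (0-255). The
    descriptors below are invented examples of semantically meaningful
    values; they need not fully reconstruct the image, only support
    defect detection and classification.
    """
    flat = [p for row in pixels for p in row]
    n = len(flat)
    return {
        "mean_intensity": sum(flat) / n,
        "max_intensity": max(flat),
        # Fraction of bright pixels; 200 is a hypothetical threshold
        # that might, e.g., indicate specular solder reflection.
        "bright_fraction": sum(p > 200 for p in flat) / n,
    }


region = [[10, 250, 30], [240, 20, 245]]
print(describe_region(region))
# → {'mean_intensity': 132.5, 'max_intensity': 250, 'bright_fraction': 0.5}
```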
Decision Space
FIGURE 3 is a diagram of an example of a detected defect mapped onto a decision space of the knowledge base according to the present invention. One way to represent the tree-like structure depicted at region 46 of FIGURE 2 is as a multi-dimensional space. This multi-dimensional space reflects the partitioning of knowledge into defect regions that correspond to either a defect class or to a prototypical defect. In the 3D example of FIGURE 3, there are two known defects having defined defect regions 50 and 52, as shown. Suppose that a new defect point 54 is presented after inspection which does not fall within either region 50 or region 52. The inspection system thus needs to determine how to classify point 54. Because point 54 is outside of the decision boundaries of regions 50 and 52, it may not be possible to say with confidence that point 54 is a member of either class. Looking at class information, such as the "spread" or spatial coverage of the class, the inspection system may prefer defect class 1 (region 50) over defect class 2 (region 52), defect class 2 over defect class 1, or neither.
FIGURES 4A, 4B and 4C are diagrams of three possibilities for classification of the detected defect of FIGURE 3 according to the present invention. As shown in FIGURE 4A, one alternative is that point 54 should be classified as a member of defect 1. In this case, the decision space is modified to expand defect region 50, as shown in FIGURE 4A. A second alternative is shown in FIGURE 4B in which point 54 is classified as a member of defect 2, and defect boundary 52 is expanded. A third alternative is shown in FIGURE 4C in which point 54 is classified as a member of a new defect class 3 defined by a new region 56. This determination of how to classify point 54 into a defect class can be made by taking into account the relative similarity between the new point 54 and defect classes 1 and 2. Also, in cases where a high confidence classification cannot be made, the operator can be solicited for an opinion. The information gained from the operator can then be used to determine which alternative is selected and to provide important feedback to the system for future classifications.
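The three alternatives of FIGURES 4A, 4B and 4C amount to choosing between membership in an existing defect region and creation of a new class. A simplified sketch follows, assuming spherical defect regions and a single distance threshold for novelty — both assumptions made for illustration; the actual determination also weighs class spread and, for low-confidence cases, operator input.

```python
from math import dist


def classify_point(point, regions, new_class_threshold):
    """Return (class_name, confidence_hint) for a new defect point.

    regions: {name: (center, radius)} -- simplified spherical defect
    regions standing in for the decision boundaries of FIGURE 3.
    """
    distances = {name: dist(point, c) for name, (c, r) in regions.items()}
    # Inside an existing boundary: confident membership.
    for name, (c, r) in regions.items():
        if distances[name] <= r:
            return name, "high"
    nearest = min(distances, key=distances.get)
    if distances[nearest] > new_class_threshold:
        return "new_class", "ask_operator"  # FIGURE 4C alternative
    return nearest, "ask_operator"          # FIGURE 4A/4B: expand that boundary


regions = {"defect_1": ((0.0, 0.0), 1.0), "defect_2": ((5.0, 0.0), 1.5)}
print(classify_point((2.2, 0.0), regions, new_class_threshold=4.0))
# → ('defect_1', 'ask_operator')
```

When the operator confirms a low-confidence choice, the corresponding region would be expanded (or a new region created), which is how the decision space of FIGURE 3 evolves into the alternatives of FIGURES 4A-4C.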
Adaptive Knowledge Based Reasoning
The use of adaptive knowledge based reasoning for detection and classification of product defects provides a number of advantages over conventional approaches to automated vision and inspection applications. One advantage to the process used in the present invention is that images are represented in semantic space by the use of descriptors. Rather than work directly with image pixels (syntactic space) or with simple geometric descriptions of images (symbolic space) , the inspection system represents information contained in the image with semantically meaningful descriptors (semantic space) . The descriptors embody selected descriptive components of the image which allow the inspection system to distinguish a defect class or set of classes from others.
A second advantage is that defects can be arbitrarily close to one another in the defect space. In templating systems, one of the difficult cases to resolve is where two defect classes are visually similar. The inspection system of the present invention, on the other hand, allows defect classes to be arbitrarily close to one another as long as there is at least one feature by which they can be discriminated. Another advantage is that new information on defect classification can be introduced at any time during system operation. For example, as shown in FIGURES 4A, 4B and 4C, a new data point or defect can be added that modifies the boundary between two defect classes. The addition of this new knowledge can be used to refine the decision boundary between classes to an arbitrarily fine degree.
A fourth advantage is that the introduction of knowledge into the knowledge base is additive, meaning that members of other existing defect classes are not reclassified simply because one defect class is being modified by the introduction of new information. This is unlike templating methods, where the modification of a threshold value is likely to upset the classification accuracy of a number of defect classes and the total effects can only be determined by experimentation (across multiple board types and lots).
A further advantage of the present knowledge base is that decision making is explainable, justifiable, and correctable. The decision making component of the inspection system enables any classification made by the system to be presented to the operator with the basis for the decision. Often, the explanation is based on previous interactions with the operator, enabling the growth of trust in the operation of the inspection system.
Another advantage is that the inspection system is designed in an object-oriented, modular fashion, so that the speed at which the system runs can be regulated by the type of computational support given to it. For example, inspection times in the range of two minutes to below twenty seconds can be realized for a typical twenty by eighteen inch printed wiring board. The knowledge based decision making for automated visual inspection of products, such as populated printed wiring boards, has the potential to fill in many information gathering holes in current assembly production processes. The present technology can detect and classify relevant defects by making informed, reasoned judgments on the components in images. The system can do this because it works with semantic descriptors of images that are designed to maximize discrimination between defect classes, and because the system has the knowledge to make those decisions with regard to other possible defects.
The present technology also can provide accurate process control information. Accurate defect classification information is an important requirement for providing accurate process control information. Detection of anomalies is generally not enough to generate useful process control information. Further, conventional defect detection systems often rely completely on human operators to make classification decisions which results in data that is not timely or consistently accurate.
The present invention reduces false alarms. The ability to accept new information from the operator in real-time allows the system to reduce the number of false alarms that it produces, and eliminates the reoccurrence of any particular false alarm. The knowledge structuring and decision making features of the system allow it to learn the correct classification of a defect that was initially classified incorrectly and not repeat that error. The corrected classification becomes part of the knowledge base, which is retained from run to run, applying the learned knowledge to a wide variety of new component instances. The system also simplifies programming of new boards for inspection. Populated printed wiring board inspection is made significantly easier by this adaptive learning technology. Bringing a never-before-seen board through inspection programming can be done by presentation of CAD information and a sample board (populated and unpopulated) and through selection of inspection criteria preferences. The present system also gives the manufacturer an ability to perform full 100% inspection of boards assembled on an equipped production line. This can be viewed in two ways: 1) it provides a gate facility that prevents defective boards from escaping the manufacturing facility; and 2) it provides continuous monitoring feedback that allows new insights into the production process to be found and exploited.
Inspection Station
FIGURE 5 is a diagram of one embodiment of an automated visual inspection station, indicated generally at 60, for detecting and classifying defects in printed wiring boards according to the present invention. Station 60 includes a platform 62 and vibration isolated subplatform cradle 63 constructed from metal components to provide physical support for station 60. A pair of guides 64 are mounted on subplatform 63 and provide a path for populated printed wiring boards 66. A stop 68 can be used to stop each board 66 at the appropriate position for inspection, and a thumper 69 can be used to position board 66 against one of guides 64. Guides 64, stop 68 and thumper 69 form a material handling system for boards 66. This material handling system can be 18 inches wide by one meter long with a two speed, reversible conveyor to handle boards 66 and can have a SMEMA interface built into platform 62 for interfacing with other components of a manufacturing line.
Station 60 further includes a light source 70 which provides light for illuminating board 66. Light source 70 can be a FOSTEC DC regulated 150 Watt source with analog control and long life quartz Halogen bulbs (DDL type) . Station 60 also includes a camera 72 that receives an image through an optic 74. Camera 72 can be a DALSA CL-E2 2048A TDI line scan camera with a VISION 1 PS1 linear power supply for CL-E2. Optic 74 can be an INVARITAR telecentric optic with .37x magnification over a 72 millimeter field of view (FOV) . As shown, the light generated by light source 70 is split to two heads 76 through a bifurcated optical fiber 78. The heads 76 can be FOSTEC 6 inch line heads with cylindrical lenses. Light source 70, camera 72, optic 74, heads 76 and optical fiber 78 together form an imaging system for scanning and acquiring images of board 66. Because of the structured directed light provided by heads 76, three-dimensional spatial information can be extracted/computed from the intensity of the reflected light during scans. FIGURE 6 is a diagram showing more detail of one embodiment of this imaging system.
Station 60 has a precision positioning system associated with the imaging system. The precision positioning system includes a threaded precision guide 80 having a motor 82 and a threaded precision guide 84 having a motor 86. (In other implementations, guides 80 and 84 could be, for example, pneumatic, hydraulic or magnetic.) Motors 82 and 86 include motor drive units which can be simultaneously controlled by a two-axis controller in interface devices 88. These components allow the imaging system to be moved across board 66 to scan in an image of board 66. The threaded precision guides 80 and 84 can be DAEDAL 406 series linear stages (e.g., 16 and 20 inch) with precision linear encoders and two pitch lead screws. Motors 82 and 86 can be two ZETA motor drives (one per axis) controlled by COMPUMOTOR 6200 Series two-axis indexers. Interface devices 88, which include optical relays for digital control signals and digital/analog converters for analog control signals, interface between a computer system 90 and the various controlled components. Computer system 90 provides a processing and control engine for operating inspection station 60 and operates generally in the same manner as described above with respect to FIGURE 1. Computer system 90 receives status information and provides control signals as well as communicating with peripherals 92. Computer system 90 can be an Industrial Computer Source rack mounted chassis having a single board computer with an INTEL PENTIUM PRO 200 MHz processor and other typical computer components. Interface devices 88 can include a 6200 indexer interfacing through a standard RS232 serial port and computer boards and digital-to-analog and OPTO 22 interface cards.
Camera 72 can be controlled by a data I/O card in computer 90, and light source 70 can be controlled by a multi-functional I/O PWB with a dual D/A upgrade. Specifically, the data I/O card can be a MATROX INTERNATIONAL camera interface card (RS 422). An associated processing accelerator card (described above) can be a MATROX INTERNATIONAL GENESIS 64/8 and MATROX INTERNATIONAL GPRO 16/8 processing card set that communicates via a high-speed backplane. With a MATROX INTERNATIONAL GENESIS card, additional MATROX INTERNATIONAL GPRO cards can be added (up to six total) to increase the processing capacity. Peripherals 92 can include a 17 inch color monitor, standard keyboard and mouse/trackball, and a 4 Gbyte DAT tape drive. The network interface (not shown in FIGURE 5) can be a 100 Mbit (100 BASE T) Ethernet card.
FIGURE 6 is a diagram of one embodiment of the imaging system of station 60. As shown, camera 72 is connected to optic 74 which includes an adapter optic 94 and a telecentric optic 96. Telecentric optic 96 provides a support for mount 98 on which bifurcated line heads 76 are mounted. Heads 76 direct structured focused light on object (printed wiring board) 66 which is reflected and captured by optic 74. In operation, station 60 of FIGURE 5 can provide automatic detection and classification of component, lead, and board level defects on populated printed wiring board 66, including discriminating between defects and "don't care" anomalies. Station 60 provides rapid throughput with real-time optical inspection and, for example, can inspect an 18 by 20 inch board 66 with seven passes in approximately two minutes. Station 60 does not require on-the-floor programming to modify or add defects or to change board types, the latter only requiring standard component placement and CAD data files and board samples (populated and unpopulated).
Station 60 includes automatic background self-diagnostics and correction and can be networked with other stations or a central management unit. Further, station 60 can be set up to electronically transfer messages (e.g., e-mail, TCP/IP, paging, and other messages) as well as appropriate images for engineering analysis of low confidence classifications. Station 60 can be delivered fully functional to automatically detect and classify defects and can be trained to classify new defects encountered in production. For example, the table provided in APPENDIX A, below, shows defects that station 60 is trained to detect and classify. Further, APPENDIX B provides example specifications for one implementation of station 60.
The process implemented by inspection station 60 can include a number of operational modes. At power-up/log-in, station 60 can automatically perform diagnostics and calibration tests and adjustments on critical inspection station parameters such as illumination, control and overall system health. An initialization window can then prompt the operator either to select a board type from a directory or to enter the name of a new board type. If a new board type is specified, station 60 enters a changeover mode. In the new board changeover mode, the station interface prompts the operator through the stages to create, develop, and load a new board knowledge base. If a known board type is specified, station 60 verifies that the complete knowledge base is available, and station 60 is then ready to inspect. Station 60 generally can be ready for board inspection in minutes and returns the inspection and characterization window to the operator. Also, station 60 runs system diagnostics continuously in the background during operation. Automatic correction is made or the operator is prompted with the corrective action required. During inspection operation, station 60 typically allows two modes to be selected: characterization and inspection. In the characterization mode, station 60 classifies board defects in greater detail and outputs complete information required for process control (PC) feedback. In inspection mode, station 60 acts in a pass/fail mode as a production line gate (generally, for well understood and controlled processes). Station 60 can toggle between characterization and inspection modes. When station 60 encounters a defect in which it has low confidence for classification, it queries the operator for input and enters an incremental training mode. Upon completion of the incremental training, station 60 returns to the inspection/characterization mode and continues operation.
In the incremental training mode, the addition (or removal) of defect classes to (or from) the knowledge base can be performed, assuming proper authorization. When station 60 classifies a defect in which it has low confidence, the station interface can show both the region in question and the defect in question on the display. Station 60 can then provide its best guess regarding the classification and offer the operator three choices: (1) verify that its best guess is indeed correct; (2) identify its best guess as wrong and classify the defect as another type previously known to the station; or (3) identify its best guess as wrong and classify the defect as a new type previously unknown to the station. Station 60 can ask the operator to verify or correct the classification, and several example images and information files of each defect class or prototype relevant to a classification decision can be presented to the operator to aid in the decision. Optionally, a line manager, engineer, or technician can be the only one authorized to make these decisions. If station 60 is networked, the results of incremental training can be broadcast to a central control and/or directly to all other networked stations.
Station 60 has a report mode in which summaries of the results of inspection/characterization and station status can be generated. For example, the total number of a certain type of defect over the last N boards can be graphically presented or a report of the station's calibration history can be generated. Reports can be automatically produced and electronically distributed on a regular schedule. Further, an analysis mode allows trends in inspection data to be used for process understanding and control. The analysis (or data-mining) mode of station 60 allows the operator to determine the relationships between process parameters and product quality (defects) over a range of time scales. Station 60 can thus be a crucial element in process control. Early detection and classification of defects alerts the operator to problems and allows early correction at the malfunctioning manufacturing stage.
The inspection station of the present invention could be implemented using numerous alternative components other than those shown in FIGURE 5. However, certain components have been selected as being well suited to the task of inspecting populated printed wiring boards. For the imager, a line scan camera is particularly useful because it gives higher resolution imagery over a wider field of view than can be obtained from available, low-cost area cameras. A line scan camera is made to continuously scan an image, so it can work with moving parts which naturally complements the inspection process because that is how objects are typically presented. Ideally, the line scan camera can provide a 100% resolution, or 100% fill factor, whereas an area camera will not. The line scan camera can accomplish this over a larger field of view and can collect data faster on moving targets. An additional aspect of the selected camera is that the camera is a time delay integration (TDI) line scan camera. The reason for choosing a TDI camera is that it overcomes the inherent light insensitivity of conventional line scan cameras. The TDI line scan camera provides an ability to get high speed imagery without needing excessive amounts of light. Without the TDI aspect, an excessively bright light source might be required. The TDI line scan camera allows low light levels to be used while the camera is run at a relatively fast rate (e.g., 8,000 lines a second at low illumination). A telecentric optic component is used with the camera to provide a "straight on" view of the board under inspection. In particular, a MELLES GRIOT INVARITAR telecentric optic is used because it provides advantages over conventional optics and an advantage over other telecentric optics. First, conventional optics view part features at different angles, up to 30 degrees or more, across the FOV. 
These changes in viewing perspectives can make the tops and bottoms of a given object appear at different x,y locations in the image, and can cause small components at the edges of the FOV to be occluded by larger components that are closer to the center of the FOV. INVARITAR optics are designed to view objects "straight on" across the entire FOV, removing both limitations. Second, with conventional, and many telecentric optics, the magnification depends on the object-to-optic distance. Objects that are close to the camera are imaged larger than objects that are farther away. INVARITAR telecentric optics provide large gauging depths of field (DOF) because their magnification is independent of object-to-optic distance. Compared to conventional, and most telecentric optics, INVARITAR telecentric optics can reduce magnification errors by a factor of 10 or more. The optic physically mounts directly to the camera and needs to match the camera so that the optic focuses the required FOV onto the camera sensor. So, if the sensor has sides X and Y, and the optic field of view is A and B, the optic needs to map the field of view down to the sensor (A,B:X,Y). The calculation of FOV involves the resolution required and the number of pixels in the line scan array. For example, if a 1 mil (.001") image pixel resolution is required, a 2,048 pixel line scan camera can image a 2.048" FOV. The calculation of the required optic magnification involves the physical size of the line scan imager pixels and the image pixel resolution.
M = physical pixel size / image pixel resolution

For the CL-E2 camera, the physical pixel size is 13 μm. At the nominal image pixel resolution of .001" (25.4 μm):

M = 13 μm / 25.4 μm ≈ .512
FOV = image pixel resolution x imager size = (.001") x 2,048 pixels = 2.048"

One implementation uses the following:

Image pixel resolution: 1.383 mils (35.14 μm)
Imager size: 2,048 pixels
FOV = (.001383") x 2,048 pixels ≈ 2.832"
M = 13 μm / 35.14 μm ≈ .37
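The field-of-view and magnification arithmetic can be restated as a short check, using the CL-E2's 13 μm physical pixel size and the two resolutions discussed (the nominal 1 mil case and the described 1.383 mil implementation):

```python
# Optic/camera matching arithmetic; 1 mil = .001" = 25.4 μm.
physical_pixel_um = 13.0          # CL-E2 sensor pixel size

# Nominal case: .001" (25.4 μm) image pixel resolution.
fov_nominal = 0.001 * 2048        # field of view in inches -> 2.048"
m_nominal = physical_pixel_um / 25.4   # required magnification ≈ .512

# Described implementation: 1.383 mil (35.14 μm) resolution.
fov_impl = 0.001383 * 2048        # ≈ 2.832" (≈ 72 mm, matching the optic's FOV)
m_impl = physical_pixel_um / 35.14     # ≈ .37, the stated INVARITAR magnification

print(round(fov_nominal, 3), round(m_nominal, 3), round(fov_impl, 3), round(m_impl, 2))
# → 2.048 0.512 2.832 0.37
```

Note that the implementation's numbers are self-consistent: a 2.832" FOV mapped onto 2,048 pixels gives exactly the 1.383 mil image pixel resolution, and 13 μm / 35.14 μm reproduces the .37x magnification of the optic described earlier.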
If not for the capabilities provided by an optic such as the INVARITAR telecentric optic, the inspection of many small occluded parts and leads (which are almost always occluded by the part itself) would be virtually impossible because the required image data could not be obtained. Obtaining a wide field of view optic with high resolution is typically difficult; however, this is possible with, and is yet another advantage of, the described INVARITAR telecentric optic. Additionally, the benefits of an optic such as the INVARITAR telecentric optic improve the overall accuracy and repeatability of all inspections. With respect to illumination, there are numerous ways to accomplish illumination depending on what needs to be imaged. On a printed wiring board, since there are many visible colors present, white light of the entire visible spectrum is used, and the light source needs to be as uniformly distributed as possible. Specifically, a halogen light source is used along with a structured light system with heads on each side that are driven by a bifurcated fiber optic cable from the source. The structured lighting provided by the heads is directed at points on the board, reflects up, and is collected on the sensor. Altogether, the imager, optic and illumination provide a top down view with minimized distortions and with a structured light focused at a certain point.
The structured illumination provides significant advantages by allowing the generation of three-dimensional spatial information from a two-dimensional captured image. The third dimension (height) can be captured based upon the relative intensity of the reflected light and a priori information about the topology and reflectivity of the object being imaged. Things like solder paste volume can then be estimated from the three-dimensional spatial data. With the structured light, as a three-dimensional surface (i.e., a board) is moved under the imaging system, a particular reflection back into the camera is obtained depending on the actual topology of that surface. So, for example, if the system is looking at a solder joint, the reflection will be noticeably different depending on what portion of the joint is lighted and the physical structure of the joint (defective or not).
The intensity or the change in intensity provides information from which three-dimensional spatial features can be recovered. For example, the rate of change can be recorded as a gradient for the inspected surface. This allows equal height surfaces to be reconstructed for the object, somewhat like a topology map. Depending on the resolution, angle and structure of the light, a volume estimate can be generated from this information by initially calibrating based upon a known volume. The imaging system design enables this type of analysis and provides information that most systems cannot capture. On the control and processing side, there are generally a number of printed circuit boards that accomplish the various functions. For example, the imager feeds data to a data I/O board that then directly feeds a processing accelerator card. The imager has control signals that tell it to collect and send image data. The data is then collected and reconstructed into a two-dimensional image. The processing accelerator cards have a direct high-speed connection to the data I/O card. Additionally, data transfer between accelerator cards can occur across a backplane and not across the host system bus. Because the host system is involved with executing the final classification algorithms and graphical user interface (GUI) and managing the control signals, it would be undesirable to overtask it with either pre-processing or with data transfer. It is for this reason that intensive data transfer and processing is accomplished by the processing accelerators. The resulting descriptors are then passed to the host computer over the system bus. As for the physical platform, it is recognized that the platform may not be completely perfect. Thus, built-in mechanisms are included to allow adjustment.
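The calibrate-then-estimate volume computation described above can be sketched as follows. This is a deliberately simplified model: it assumes a linear intensity-to-height relationship obtained from a single known-volume calibration target, whereas an actual recovery would also account for the angle and structure of the light and the surface reflectivity.

```python
def estimate_volume(intensity, calib_height_per_unit, pixel_area):
    """Estimate deposit volume from a 2-D reflected-intensity image.

    Assumes a prior calibration against a target of known volume has
    yielded a linear intensity-to-height factor (a simplification of
    the gradient/topology analysis described in the text).
    """
    heights = [[v * calib_height_per_unit for v in row] for row in intensity]
    return sum(sum(row) for row in heights) * pixel_area


# Calibrate on a target of known volume (values are illustrative).
target_intensity = [[5, 5], [5, 5]]
known_volume = 10.0
pixel_area = 1.0
calib = known_volume / (sum(sum(r) for r in target_intensity) * pixel_area)

# Apply the calibration to estimate an unknown solder paste deposit.
print(estimate_volume([[2, 4], [6, 8]], calib, pixel_area))  # → 10.0
```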
One particular feature supported by the host system is the ability to electronically transfer a message to a production manager so that the production manager does not have to come over to the manufacturing area each time something happens. The production manager can receive the electronic message in a remote office, look at any attached images and do what needs to be done for the classification.
Inspection Process
FIGURE 7 is a flow chart of one embodiment of an automated inspection process implemented using station 60 of FIGURE 5. As shown, in step 100, the station acquires image data from scanning the board being inspected. The image data is then pre-processed in step 102. After pre-processing the image data, the image is described by selected descriptors in step 104. The descriptors are then analyzed, in step 106, such that the image is analyzed in context with the system knowledge base. In step 108, defects are identified and classified. In step 110, the station determines whether the confidence in the classification is above a preset threshold. If so, the station checks, in step 112, whether analysis of the board is complete. If not, the process continues at step 100. If the analysis is complete, the process continues at step 114 with the ejection of the board and obtaining a new board to inspect .
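The confidence-gated flow of FIGURE 7 can be sketched as a simple control loop. The classifier, operator, and learning hooks below are toy stand-ins (the threshold value and descriptor contents are invented for the example); the point is the structure: only low-confidence classifications trigger operator interaction and incremental learning.

```python
CONFIDENCE_THRESHOLD = 0.9  # hypothetical preset threshold (step 110)


def inspect(region_descriptors, classify, ask_operator, learn):
    """Sketch of the FIGURE 7 flow: classify each described region of a
    board, querying the operator (steps 116-120) only on low confidence."""
    results = []
    for descriptors in region_descriptors:          # steps 100-104 per region
        label, confidence = classify(descriptors)   # steps 106-108
        if confidence < CONFIDENCE_THRESHOLD:       # step 110
            label = ask_operator(label, descriptors)  # steps 116-118
            learn(label, descriptors)                 # step 120
        results.append(label)
    return results  # steps 112/114: board complete, eject and continue


# Toy stand-ins for the station's actual components:
classify = lambda d: ("ok", 0.95) if d["bright"] < 0.5 else ("solder_bridge", 0.6)
ask_operator = lambda guess, d: guess  # operator confirms the best option
learned = []
def learn(label, descriptors):
    learned.append((label, descriptors))

print(inspect([{"bright": 0.1}, {"bright": 0.9}], classify, ask_operator, learn))
# → ['ok', 'solder_bridge']
```

Only the second region falls below the threshold, so only that classification is confirmed by the operator and added to the knowledge base.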
If, in step 110, confidence was not above the threshold, then the process continues at step 116. In step 116, the operator is alerted and provided with the station's best option as to the proper classification. In step 118, the station accepts confirmation of the best option by the operator or accepts an alternate classification by the operator. Then, in step 120, the station adds information to the knowledge base as necessary to accomplish incremental learning based upon the operator's input. The process then continues at step 112 as discussed above. Classification is accomplished according to the present invention in accordance with a set of pre-specified policies. Descriptors are extracted and stored to the knowledge base. Then, when a new image is acquired and a set of descriptors is generated from it, the new descriptors are compared by a variety of metrics to those stored in the knowledge base to determine whether or not a defect exists and what type it is. So, if there is a good match, the system can have confidence that the object being inspected contains the specified defect. If there is not a good match, the system can select the closest match as a best option. The operator can then provide feedback as to whether the option is accurate, a different class should be selected, or a new class should be defined. If it is the latter, then in one step, an entirely new class of defects is established. In this manner, the knowledge base is built incrementally from a base level of information to include more and more knowledge. In addition, the system generates more and more trust with the operator because the operator sees his decisions appear in the decision making of the system. In fact, part of the incremental learning cycle can be not only to store the descriptors but also to store who made the decisions on low confidence classifications and when. This allows decisions to be backtracked out of the system if they prove to be poor judgments.
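The incremental, auditable knowledge base growth described above can be sketched as follows. The class names and structure are hypothetical; the sketch shows the two properties the text emphasizes: additions are additive (other classes are untouched) and operator decisions are recorded with who and when, so they can later be backed out.

```python
from datetime import datetime


class KnowledgeBase:
    """Minimal sketch of incremental, auditable classification storage."""

    def __init__(self):
        self.classes = {}    # class name -> list of descriptor dicts
        self.audit_log = []  # (class, operator, timestamp) for operator entries

    def add_example(self, class_name, descriptors, operator=None):
        # Additive: only the named class gains knowledge; others are untouched.
        self.classes.setdefault(class_name, []).append(descriptors)
        if operator is not None:
            # Low-confidence decisions record who entered them and when.
            self.audit_log.append((class_name, operator, datetime.now()))

    def retract_operator(self, operator):
        """Drop the audit entries for an operator's decisions (a fuller
        version would also retract the associated stored examples)."""
        self.audit_log = [e for e in self.audit_log if e[1] != operator]


kb = KnowledgeBase()
kb.add_example("tombstone", {"tilt": 0.8}, operator="op_17")  # operator-confirmed
kb.add_example("tombstone", {"tilt": 0.9})                    # high-confidence
print(len(kb.classes["tombstone"]), len(kb.audit_log))        # → 2 1
```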
The confidence level of the system can also be based upon how many instances of the particular defect type the system has seen. Thus, the confidence level represents not only how confident the system is in the single classification but also in populations of classifications. The statistics therefore use populations of information with known quality.
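One way to combine the two notions of confidence described above is to scale the single-match confidence by a population factor. The blending formula below is an assumption for illustration only; the saturation count of 30 loosely echoes Appendix B's request for 30 samples per defect type, but is not a disclosed parameter.

```python
# Illustrative sketch: overall confidence reflecting both individual match
# quality and how many examples of the defect class the system has seen.
# The formula and the saturation constant are assumptions, not the
# patent's disclosed method.

def overall_confidence(match_confidence, population_size, saturation=30):
    """Scale single-match confidence by a population factor that
    approaches 1.0 as more examples of the class accumulate."""
    population_factor = min(population_size / saturation, 1.0)
    return match_confidence * population_factor
```

Under this sketch a strong match against a class with only a handful of stored examples still yields low overall confidence, which would route the classification to the operator for review and thereby grow the population.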
Process control can then be based upon the classifications. By classifying defects and maintaining a history, the overall process can be improved by analyzing defect histories and matching defects to a cause. Placing decisions as to how to classify a defect in context with other defects and decisions produces higher-quality input from operators, and over time, the knowledge base becomes more and more refined.
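A defect history of the kind described above can be summarized very simply; the sample records and field names below are invented for illustration.

```python
# Illustrative sketch: tallying a classified-defect history so recurring
# defect types can be matched to a process cause (e.g. a Pareto-style
# report for SPC).  The records are hypothetical sample data.
from collections import Counter

defect_history = [
    ("board-001", "solder bridge"),
    ("board-002", "solder bridge"),
    ("board-002", "missing component"),
    ("board-003", "solder bridge"),
]

counts = Counter(defect_class for _, defect_class in defect_history)
report = counts.most_common()  # most frequent defect types first
assert report[0] == ("solder bridge", 3)
```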
The implementations described herein are not intended to limit the scope of the present invention. For example, in addition to populated printed wiring boards, the present invention is applicable to analysis of bare printed wiring boards, semiconductor packages (e.g., PPGAs, PBGAs, C4 packages, or TAB tape) and other types of objects. Also, in contrast with the described implementation, the imaging system can be fixed while the object to be analyzed is moved. This approach would allow the imaging system to remain more stable during scanning, which would allow more precise motion for ultra-high-resolution images (e.g., ~3 μm pixels vs. 30 μm pixels), but would likely require a larger platform with a wider range of movement.
Further, the inspection system can be integrated with other pieces of equipment rather than being separate. For example, an integrated piece of equipment can include both object processing components and automated visual inspection components. The object processing components receive the object and process it to prepare it for subsequent steps in the manufacturing process. The automated visual inspection components perform substantially as described above to identify any defects after the object has been processed. Control components can then be used to instruct the object processing components to correct the defect prior to passing the object on to the subsequent step in the manufacturing process. The object, for example, can be a printed wiring board where the object processing components perform solder paste deposition, and the automated inspection components allow solder paste verification. As another example, the object processing components can perform pick and place of components, and the automated inspection components allow component placement verification. In this manner, integration with processing equipment allows the equipment to ensure objects are complete before being passed downstream in the manufacturing process.

Although the present invention has been described in detail, it should be understood that various changes, substitutions, and alterations can be made hereto without departing from the spirit and scope of the invention as defined by the appended claims.
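The process-inspect-correct loop of such an integrated piece of equipment can be sketched as follows. The component interfaces and the rework limit are hypothetical, introduced only to make the control flow concrete.

```python
# Illustrative sketch of an integrated equipment station: process the
# object, inspect it, and have the control components instruct the
# processing components to correct any defect before the object passes
# downstream.  All interfaces and the rework limit are assumptions.

def run_station(processing, inspection, control, obj, max_rework=3):
    processing.process(obj)                      # e.g. solder paste deposition
    for _ in range(max_rework):
        defects = inspection.inspect(obj)        # automated visual inspection
        if not defects:
            return obj                           # complete; pass downstream
        control.correct(processing, obj, defects)  # instruct rework
    raise RuntimeError("object could not be brought into specification")
```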
APPENDIX A TABLE OF DEFECTS
[Table of defects provided as figures imgf000040_0001 and imgf000041_0001 in the original publication.]
APPENDIX B SAMPLE STATION FEATURES AND SPECIFICATION
Standard Features
• Can be configured for On-Line or Off-Line application.
- Material handling system is a SMEMA-compatible conveyor segment and is compatible with other SMEMA production line equipment.
- On-Line Configuration: Station is integrated into the production line under SMEMA control.
- Off-Line Configuration: Station is not integrated into the production line. The station is loaded/unloaded manually with conveyor assist.
• Real-time 100% inspection at production line rates.
- Scalable processing engine allows increased line rates to be met.
- Processor architecture is transparent to the operator.
• Accommodates complex, high-density board types.
- High-resolution, high-quality, line-scan imagery.
- 100% board coverage without part occlusions.
• Real-time, on-line, point-and-click learning.
- Addition of new defect classes (seconds).
- Correction of incorrect classifications (seconds).
- Re-enforcement of correct low-confidence decisions (seconds).
- Board changeover (new board - minutes; pretrained board - seconds).
- Inspection routine modification (seconds to minutes).
• Low escape and false-alarm rates.
- Embedded classification analysis mechanisms (confidence and relative confidence) flag questionable classifications for operator analysis and immediate correction, which pushes escape rates down.
- On-line learning allows rapid tuning of marginal defect limits, which pushes false-alarm rates down.
• Low operator skill requirements.
- Point-and-Click Graphical User Interface (GUI).
- Delivered with a functional knowledge base installed (turn-key operation).
- Automated new-board changeover routines (including inspection routine development). New board changeover takes only minutes.
- Automated calibration and calibration verification routines.
• A broad range of defect detection & classification (DDC) software packages, including:

DDC  Description
1    Component Package
2    Code Reader & Data Logging Package
3    Lead Quality Package
4    Solder Joint Quality Package
5    Connector & Add-On Package
6    Label & Marking Package
7    Component ID Package
8    Board Verification Package
9    PWB Package
10   Solder Paste Package
[Refer to APPENDIX A for an example of a detailed listing of the defects detected and classified by the different station DDC packages.]
• Real-time generation of process control information and reports.
- Assessment of production-process and product quality.
- Aids in defect source identification.
- Real-time input to SPC packages.
Specifications
Board Size: 20" x 18" x ±2" (L x W x H - max.); 3" x 3" (L x W - min.).
Throughput: Throughput is primarily a function of component count and the inspection routines being applied. Board area does not typically affect station throughput; only for very sparse boards does board area set throughput levels. The throughput numbers provided assume a complete set of relevant inspection routines is applied to each part. Minimum part size: 0402. Minimum lead size: 10 mil pitch.
100 components (average mix) per 5 seconds nominal throughput.
[Processing engine upgrades provide higher throughput.]
(e.g., a 2000-component 18" x 20" board --> 100 seconds + 20 seconds material handling and data collection overhead --> 120 seconds, or 2 minutes)
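The arithmetic of the throughput example above can be checked with a short sketch; the batch rate and overhead figures come from the specification, while the function name is illustrative.

```python
# Worked form of the throughput example: at the nominal rate of
# 100 components per 5 seconds, a 2000-component board takes 100 seconds
# of inspection plus 20 seconds of material handling and data collection
# overhead, i.e. 120 seconds (2 minutes) total.

COMPONENTS_PER_BATCH = 100   # components inspected per batch (from spec)
SECONDS_PER_BATCH = 5        # nominal seconds per batch (from spec)
OVERHEAD_SECONDS = 20        # material handling and data collection (from example)

def board_time(component_count):
    inspection = component_count / COMPONENTS_PER_BATCH * SECONDS_PER_BATCH
    return inspection + OVERHEAD_SECONDS

assert board_time(2000) == 120  # seconds, i.e. 2 minutes
```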
Footprint: 1 m (L) x 1 m (W) x 2 m (H). Compact design occupies the production line floor space required by a standard 1 m conveyor section. Allows for On-Line and Off-Line application.
Automation: Manual assist (off-line) or conveyor hand-off (on-line). For off-line use, the conveyor positions a board, fed by the operator, for inspection and then returns the board to the operator after inspection is complete. For on-line use, the conveyor positions a board, fed by the previous stage in the line, for inspection and then hands it off to the subsequent stage in the line after inspection is complete.
GUI Requirements: GUI includes routines and screens for system start-up, known board preparation, unknown board preparation, inspection, low-confidence/defect verification and validation, new defect entry, knowledge base verification/editing, report generation, and database set-up.
Data Archiving: Station includes network connection, disk drive, and resident 2 GB hard drive for data storage and archiving. High-capacity tape drive systems are optional.
Facility Requirements:
Temperature Range: 70°-85°F.
Power: 110 V, 60 Hz, 20 Amps.
Air: 100 PSI, 1.0 CFM, Filtered & Dry.
Network Access at the Station.
Safety Mechanisms: Emergency power-off button (entire station is shut down) and emergency interrupt button (only moving components are shut down). Emergency interrupts are also generated when any user-openable access panel is opened. Once triggered, a reset mechanism must be enabled to re-establish power.
Maintenance/Calibration: The Calibration Toolkit contains routines for automatic station calibration and calibration verification, including illumination verification and adjustment, new-bulb calibration, station alignment and motion analysis, and image quality assurance.
The only regular station maintenance required is the periodic replacement of illumination system light bulbs. The station will automatically notify the operator when a given bulb set requires replacement. A step-by-step bulb replacement procedure will be provided to the operator through the GUI (no tools will be required). Once the bulbs have been replaced, the station automatically performs calibration and returns to the inspection mode of operation. It is expected that the light bulbs will require replacement no more than once every two months. It is recommended that two sets of spare bulbs be kept available for each station.
Customer Data: To ensure a "knowledge-full" knowledge base at delivery, it is requested that 30 samples of each defect type be provided prior to station delivery.
Options:
Flat Panel Display.
Touch Screen Display.
Networking.
Tape Drive.
Printer.
Throughput Accelerator Packages.

Claims

WHAT IS CLAIMED IS:
1. An automated visual inspection system, comprising: an imaging system operable to view an object and to provide image data representing an image of the object; a precision positioning system coupled to the imaging system, the precision positioning system operable to move the imaging system to scan the object, and the precision positioning system operating responsive to control signals; and a processing and control engine coupled to receive the image data from the imaging system and to provide control signals to the precision positioning system, the processing and control engine operating: to acquire and pre-process the image data; to represent the image of the object using descriptors; to compare the descriptors to information in a knowledge base and to identify and classify a defect based upon results of the comparison; and to associate a confidence level with the classification of the defect.
2. The automated visual inspection system of Claim 1, wherein the processing and control engine further operates, if the confidence level is not above a specified threshold: to alert an operator and provide a best option as to the classification of the defect; to accept a confirmation or alternate classification from the operator; and to add information to the knowledge base responsive to the confirmation or alternate classification.
3. The automated visual inspection system of Claim 1, wherein: the imaging system comprises a structured, focused light; the imaging system operates to provide reflected light intensity as part of the image data; and the processing and control engine operates to recover three-dimensional spatial information based upon the reflected light intensity when the image data is pre-processed.
4. The automated visual inspection system of Claim 1, wherein the imaging system comprises: an illumination system providing light to illuminate the object; an optic to gather an image of the object; and an imager to sense the image and generate the image data.
5. The automated visual inspection system of Claim 1, wherein the object is coupled to a material handling system that operates to move the object to a scanning range of the imaging system.
6. The automated visual inspection system of Claim 1, wherein the processing and control engine comprises: a control hub that receives status information from and sends control signals to the imaging system and the precision positioning system; a data input/output device that receives the image data; processing accelerators that pre-process the image data and generate the descriptors; and a software engine executed by a host computer that manages processing and control, accomplishes the identification and classification, and drives a graphical user interface.
7. The automated visual inspection system of Claim 1, wherein the processing and control engine further operates to share knowledge base information with other inspection systems.
8. The automated visual inspection system of Claim 1, wherein the processing and control engine further operates to send an electronic message to a production manager, the electronic message including the best option as to the classification of the defect and the descriptors.
9. The automated visual inspection system of Claim 1, wherein the processing and control engine further operates to send an electronic message with status and inspection reports to specified groups at specified intervals.
10. The automated visual inspection system of Claim 1, wherein the object is a populated printed wiring board.
11. An automated visual inspection station for inspecting populated wiring boards, comprising: a platform providing physical support for the inspection station; a pair of guides coupled to the platform, the guides providing a path for a board to be inspected; a stop coupled to the platform, the stop operable to position the board for inspection; a thumper coupled to one of the guides, the thumper operable to position the board against the other guide; an optic operable to obtain an image of the board; a camera coupled to the optic, the camera operable to acquire the image of the board and provide image data; a light source operable to provide light for illuminating the board; a bifurcated optical fiber coupled to the light source; a pair of linear heads coupled to receive light from the bifurcated end of the optical fiber and coupled to the optic, the heads operable to illuminate the board with structured, directed light; a first precision linear guide and first motor coupled to the camera and operable to control movement, along one axis, of the camera, optic and heads to scan the board; a second precision linear guide and second motor coupled to the platform and to the first precision linear guide and operable to control movement, along a second axis, of the camera, optic and heads to scan the board; interface devices coupled to provide an interface to the stop, the thumper, the camera, the light source, the first motor and the second motor; and a processing and control engine coupled to the interface devices, the processing and control engine operable to receive the image data and to provide control signals to the stop, the thumper, the camera, the light source, the first motor and the second motor, and the processing and control engine operable to identify and classify a defect based upon analysis of the image data.
12. The automated visual inspection station of Claim 11, further comprising peripherals coupled to the processing and control engine.
13. The automated visual inspection station of Claim 11, wherein the processing and control engine is further operable to use descriptors to represent the image data and to compare the descriptors to information in a knowledge base to identify and classify the defect.
14. An integrated piece of equipment for completing a step of a manufacturing process, the equipment comprising: object processing components operable to receive an object and to process the object to prepare the object for a subsequent step in the manufacturing process; and automated visual inspection components operable: to acquire image data representing an image of the object; to pre-process the image data; to represent the image of the object using descriptors; to compare the descriptors to information in a knowledge base; and to identify a defect based upon results of the comparison.
15. The integrated piece of equipment of Claim 14, further comprising control components operable to instruct the object processing components to correct the defect identified by the automated visual inspection components prior to passing the object on to the subsequent step in the manufacturing process.
16. The integrated piece of equipment of Claim 14, wherein the object is a printed wiring board, the object processing components perform solder paste deposition, and the automated inspection components allow solder paste verification.
17. The integrated piece of equipment of Claim 14, wherein the object is a printed wiring board, the object processing components perform pick and place of components, and the automated inspection components allow component placement verification.
18. An automated visual inspection process, comprising: acquiring image data representing an image of an object; pre-processing the image data; representing the image of the object using descriptors; comparing the descriptors to information in a knowledge base; identifying and classifying a defect based upon results of the comparison; and associating a confidence level with the classification of the defect.
19. The process of Claim 18, further comprising, if the confidence level is not above a specified threshold: alerting an operator and providing a best option as to the classification of the defect; accepting a confirmation or alternate classification from the operator; and adding information to the knowledge base responsive to the confirmation or alternate classification.
20. The process of Claim 18, further comprising: providing reflected light intensity as part of the image data; and recovering three-dimensional spatial information based upon the reflected light intensity when the image data is pre-processed.
21. The process of Claim 18, further comprising sharing knowledge base information with other inspection processes.
22. The process of Claim 18, further comprising sending an electronic message to a production manager, the electronic message including the best option as to the classification of the defect and the descriptors.
23. The process of Claim 18, wherein the object is a populated printed wiring board.
24. The process of Claim 18, further comprising performing automatic calibration and correction algorithms .
25. The process of Claim 24, wherein performing comprises: assisting a user to adjust adjustable alignments; automatically computing relevant system misalignments and distortions and generating correction coefficients that are applied during inspection; and checking lights before each inspection cycle and automatically performing any required adjustments and calculations.
PCT/US1998/019544 1997-09-22 1998-09-22 Automated visual inspection system and process for detecting and classifying defects WO1999016010A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU93993/98A AU9399398A (en) 1997-09-22 1998-09-22 Automated visual inspection system and process for detecting and classifying defects

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US93535197A 1997-09-22 1997-09-22
US08/935,351 1997-09-22

Publications (1)

Publication Number Publication Date
WO1999016010A1 true WO1999016010A1 (en) 1999-04-01

Family

ID=25466962

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1998/019544 WO1999016010A1 (en) 1997-09-22 1998-09-22 Automated visual inspection system and process for detecting and classifying defects

Country Status (2)

Country Link
AU (1) AU9399398A (en)
WO (1) WO1999016010A1 (en)


Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4202092A (en) * 1976-09-17 1980-05-13 Matsushita Electric Industrial Co., Ltd. Automatic part insertion machine
US4586148A (en) * 1982-06-03 1986-04-29 M.A.N.-Roland Druckmaschinen Aktiengesellschaft Arrangement for scanning printing plates
US5101442A (en) * 1989-11-24 1992-03-31 At&T Bell Laboratories Three-dimensional imaging technique using sharp gradient of illumination
US5274714A (en) * 1990-06-04 1993-12-28 Neuristics, Inc. Method and apparatus for determining and organizing feature vectors for neural network recognition
US5455870A (en) * 1991-07-10 1995-10-03 Raytheon Company Apparatus and method for inspection of high component density printed circuit board
US5660519A (en) * 1992-07-01 1997-08-26 Yamaha Hatsudoki Kabushiki Kaisha Method for mounting components and an apparatus therefor
US5751910A (en) * 1995-05-22 1998-05-12 Eastman Kodak Company Neural network solder paste inspection system
US5801965A (en) * 1993-12-28 1998-09-01 Hitachi, Ltd. Method and system for manufacturing semiconductor devices, and method and system for inspecting semiconductor devices
US5812693A (en) * 1994-10-17 1998-09-22 Chrysler Corporation Integrated machine vision inspection and rework system -- CIP


Cited By (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6515962B1 (en) 1999-07-16 2003-02-04 Alcatel Hit-less switching pointer aligner apparatus and method
WO2001008099A1 (en) * 1999-07-24 2001-02-01 Intelligent Reasoning Systems, Inc. User interface for automated optical inspection systems
US6597381B1 (en) 1999-07-24 2003-07-22 Intelligent Reasoning Systems, Inc. User interface for automated optical inspection systems
US6603873B1 (en) 1999-11-12 2003-08-05 Applied Materials, Inc. Defect detection using gray level signatures
EP1104915A3 (en) * 1999-11-12 2003-01-02 Applied Materials, Inc. Defect detection using gray level signatures
EP1104915A2 (en) * 1999-11-12 2001-06-06 Applied Materials, Inc. Defect detection using gray level signatures
CN100419410C (en) * 1999-11-25 2008-09-17 奥林巴斯光学工业株式会社 Defect inspection data processing system
WO2002001208A1 (en) * 2000-06-27 2002-01-03 Matsushita Electric Works, Ltd. A programming apparatus of a visual inspection program
US6922481B2 (en) 2000-06-27 2005-07-26 Matsushita Electric Works, Ltd. Programming apparatus of a visual inspection program
US6963076B1 (en) * 2000-07-31 2005-11-08 Xerox Corporation System and method for optically sensing defects in OPC devices
GB2417073A (en) * 2004-08-13 2006-02-15 Mv Res Ltd A machine vision analysis system and method
DE102008001174A1 (en) * 2008-04-14 2009-11-05 Nanophotonics Ag Inspection system and method for the optical examination of object surfaces, in particular wafer surfaces
DE102008001174B4 (en) * 2008-04-14 2013-03-14 Rudolph Technologies Germany Gmbh Inspection system and method for the optical examination of object surfaces, in particular wafer surfaces
DE102008001174B9 (en) * 2008-04-14 2013-05-29 Rudolph Technologies Germany Gmbh Inspection system and method for the optical examination of object surfaces, in particular wafer surfaces
CN103398660A (en) * 2013-08-05 2013-11-20 河北工业大学 Structured light visual sensor parameter calibration method for acquiring height information of welded joint
US9833962B2 (en) 2014-02-26 2017-12-05 Toyota Motor Engineering & Manufacturing Norh America, Inc. Systems and methods for controlling manufacturing processes
EP3482192A4 (en) * 2016-07-08 2020-08-05 ATS Automation Tooling Systems Inc. System and method for combined automatic and manual inspection
US11449980B2 (en) 2016-07-08 2022-09-20 Ats Automation Tooling Systems Inc. System and method for combined automatic and manual inspection
CN109564166A (en) * 2016-07-08 2019-04-02 Ats自动化加工系统公司 The system and method checked for automatic and artificial combination
WO2018006180A1 (en) 2016-07-08 2018-01-11 Ats Automation Tooling Systems Inc. System and method for combined automatic and manual inspection
CN107748396A (en) * 2016-08-29 2018-03-02 张家港孚冈汽车部件有限公司 Motor assembles position detecting system
WO2019177539A1 (en) * 2018-03-14 2019-09-19 Agency For Science, Technology And Research Method for visual inspection and apparatus thereof
WO2019209397A1 (en) * 2018-04-26 2019-10-31 Liberty Reach Inc. Non-contact method and system for controlling an industrial automation machine
US11314220B2 (en) 2018-04-26 2022-04-26 Liberty Reach Inc. Non-contact method and system for controlling an industrial automation machine
EP3828821A1 (en) * 2019-11-27 2021-06-02 AT&S (Chongqing) Company Limited User interface for judgment concerning quality classification of displayed arrays of component carriers
US11935221B2 (en) 2019-11-27 2024-03-19 AT&S (Chongqing) Company Limited User interface for judgment concerning quality classification of displayed arrays of component carriers
CN112927170A (en) * 2021-04-08 2021-06-08 上海哥瑞利软件股份有限公司 Automatic defect removal method in semiconductor manufacturing process
CN112927170B (en) * 2021-04-08 2024-03-15 上海哥瑞利软件股份有限公司 Automatic defect removing method in semiconductor manufacturing process
CN113808094A (en) * 2021-09-10 2021-12-17 武汉联开检测科技有限公司 Ray detection welding defect image rating system and method
DE102022103844B3 (en) 2022-02-17 2023-06-22 Synsor.ai GmbH Method for optimizing a production process based on visual information and device for carrying out the method

Also Published As

Publication number Publication date
AU9399398A (en) 1999-04-12

Similar Documents

Publication Publication Date Title
WO1999016010A1 (en) Automated visual inspection system and process for detecting and classifying defects
US5751910A (en) Neural network solder paste inspection system
US5455870A (en) Apparatus and method for inspection of high component density printed circuit board
US5801965A (en) Method and system for manufacturing semiconductor devices, and method and system for inspecting semiconductor devices
JP4477356B2 (en) Ophthalmic lens inspection system and method
US6546308B2 (en) Method and system for manufacturing semiconductor devices, and method and system for inspecting semiconductor devices
US20070058854A1 (en) Gear pattern inspection system
US20060199287A1 (en) Method and system for defect detection
CN109840900A (en) A kind of line detection system for failure and detection method applied to intelligence manufacture workshop
CN109564173B (en) Image inspection apparatus, production system, image inspection method, and storage medium
Labudzki et al. The essence and applications of machine vision
KR100338194B1 (en) Inspection system of Semiconductor device package with 3 dimension type and the inspection method thereof
EP4202424A1 (en) Method and system for inspection of welds
JPH07198354A (en) Test for mounted connector pin using image processing
JP3310898B2 (en) Image processing device
US20220284699A1 (en) System and method of object detection using ai deep learning models
Kovalev et al. Development of a module for analyzing milling defects using computer vision defects using computer vision
Mundy et al. Automatic visual inspection
CN113465505B (en) Visual detection positioning system and method
Zeuch Understanding and applying machine vision, revised and expanded
KR20210058329A (en) Multi-sided Vision Inspection Algorithm and Using the same
KR100357764B1 (en) Apparatus for Leather Quality Inspection using Artificial Intelligence
Moru et al. Machine Vision and Metrology Systems: An Overview
Moru Improving the pipeline of an optical metrology system.
Radkowski Machine Vision and Robotic Inspection Systems

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AU BR CN JP MX SG

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
122 Ep: pct application non-entry in european phase