EP0145725A4 - Method of and apparatus for real-time high-speed inspection of objects for identifying or recognizing known and unknown portions thereof, including defects and the like. - Google Patents
- Publication number
- EP0145725A4 (application EP84901532A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- pattern
- objects
- information
- image
- pixels
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
- G06T7/001—Industrial image inspection using an image reference approach
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01R—MEASURING ELECTRIC VARIABLES; MEASURING MAGNETIC VARIABLES
- G01R31/00—Arrangements for testing electric properties; Arrangements for locating electric faults; Arrangements for electrical testing characterised by what is being tested not provided for elsewhere
- G01R31/28—Testing of electronic circuits, e.g. by signal tracer
- G01R31/302—Contactless testing
- G01R31/308—Contactless testing using non-ionising electromagnetic radiation, e.g. optical radiation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30108—Industrial image inspection
- G06T2207/30141—Printed circuit board [PCB]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/06—Recognition of objects for industrial automation
Definitions
- the present invention relates to methods of and apparatus for real-time high-speed inspection of objects for such purposes as identifying known and unknown portions thereof, pattern recognition, defect identification, and the like.
- An object of the present invention, accordingly, is to provide a new and improved method of and apparatus for real-time high-speed inspection of objects that are not subject to the above or other limitations and provide a universal approach to pattern recognition, defect identification and related problems.
- a further object is to provide a novel high-speed automatic recognition and identification method and apparatus of more general utility, as well.
- the invention embraces a method of real-time high-speed inspection of objects with the aid of image sensors and the like, that comprises generating signal images in the form of pixels of an object with such sensors at one or more magnifications;
- the invention involves a method of real-time inspection of objects at a predetermined region, comprising, optically scanning an area of interest of the object at said region; digitizing the scanned signals in real time to produce successive trains of digital signals corresponding to the optical scanned lines; delaying the successive trains of digital signals to produce a vertical raster of successive trains of digital signals corresponding to previously optically scanned lines; reducing the vertical raster to a sub-raster containing a predetermined number of substantially horizontal (H) and vertical (V) pixels; extracting from the sub-raster groups of pixels h by v in spatial distribution such that v i ≤ V and h i ≤ H, where the subscript i represents the specific elements ranging from an element one pixel in size to an element V by H pixels in size; selecting an element threshold value for each element size by predetermining a minimum and maximum number of pixels such that if the number of pixels contained in an element lies within said minimum and maximum values the element value is set to the value 0
- the invention also embraces a method of real-time highspeed inspection of objects with the aid of image-sensing scanning cameras and the like that comprises, storing digital signal mask information corresponding to video images of regions of predetermined fields of view of a known object at effectively different magnifications; successively scanning contiguous regions of an object with such a camera, for each of said predetermined fields of view, and digitizing the image scans to generate digital signal mask information; repeating such scanning at effectively greater magnification with the same field of view to generate magnified digital signal mask information; comparing the generated digital mask information at different magnifications with the stored digital mask information to identify known or unknown portions of the object; and indicating the identification of such known or unknown portions.
- Other features of the invention and preferred and best mode embodiments, including preferred apparatus and constructional details, are hereinafter presented.
- FIG. 1 is an explanatory diagram illustrating orientation problems in inspection systems
- Figs. 2, 3, 7 and 8 are plan views of illustrative defects in one of the important areas of application of the invention to printed circuit board inspection and the like;
- Figs. 4 and 5 are similar diagrammatic views of systems for imaging at different magnifications;
- Fig. 6 is a block diagram illustrating memory apparatus useful in inspection apparatus constructed in accordance with the underlying philosophy of the invention.
- Fig. 9 contains views similar to Fig. 8 illustrating divided fields of view for inspection of defects at low and high magnification;
- Fig. 10 is a similar view illustrating shifting positions of memory mask;
- Fig. 11 is a combined block and functional diagram similar to Fig. 6 of the system functions of an inspection apparatus operating in accordance with the invention.
- Figs. 12 and 13 illustrate pixel groups being scanned and the analysis of the same at successive instants of time, more particularly outlined in the timing chart of Fig. 14;
- Figs. 15 and 16 are schematic block diagrams of scanned element quantization and delay registration in accordance with the invention;
- Figs. 17 and 18 schematically illustrate successive scanned bands of pixel groups and the quantizing of the same;
- Fig. 19 is an alternate arrangement to Figs. 17 and 18;
- Fig. 20 is a more detailed diagram of the scanning window memory used in the system of Figs. 11, 13 and 16;
- Fig. 21 is a block diagram of the threshold comparators useful in the embodiment of Fig. 16;
- Fig. 22 is a similar diagram of a preferred prototype implementation of the invention.
- Figs. 23(a)-(e) are block diagrams of parts of the inspection and correlating pattern information processor of Fig. 22.
- an observer attempting visual inspection may first scan the complete area, looking over small areas thereof to see if relatively large defects can be found, but without viewing the entire card in one glance; rather, concentrating on these smaller areas, section by section, and inspecting contiguous regions.
- the observer may resort to the use of a magnifying glass.
- the field size as viewed by the eye will be the same as that viewed without the assistance of the magnifying glass; but now the observer will actually be looking at much smaller regions that have been magnified to some degree. With this greater magnification, inspection of contiguous regions will enable viewing any small defects, breaks, etc.
- The next step may be to inspect even smaller areas to identify even smaller defects, perhaps very fine pinholes, requiring an even more powerful magnifying glass for inspecting contiguous regions of interest, as before.
- a small field of view is made up of little picture elements (pixels). These pixels are grouped to create elements of binary value 1 or 0 and the elements are then grouped to create a mask. In this application, a mask of 4 x 4 elements is created which can easily be mapped into a 65K dynamic or static memory.
- the technique for effectively varying the magnification to be provided to this mask resides in the use of picture point (or pixel) averaging, somewhat analogously to human viewing techniques. If one holds something very close, a lot of the details are apparent; but as one looks at an object farther away, one tends to average the fine picture points into one.
- elements are formed by producing a kind of averaging or weighting function on a small group of picture elements or pixels with threshold averaged values selected such that the element value, 1 or 0, represents the averaged pixel brightness within the element area.
- an element is to be represented by a group of four pixels, 2 horizontal (h) by 2 vertical (v); it may be decided that if three of the four pixels are white (with common binary digits in the mask), the whole block will be considered as of white brightness; or if three are black, this may be the threshold for treating the block as black.
- the threshold is set on a summing operation, and in real time, the final mask, composed in this example of 4 by 4 elements, is finalized with an averaging or weighted function of the pixels, such architecture lending itself to extremely high speed implementation.
- the entire function composed of first forming all of the 16 elements and then deciding whether it is a “good” mask or a “defect” mask can be achieved. That determination can be made by storing “good” masks and comparing a real-time mask scan with the “good” masks to identify a defect. Alternatively, design masks for all defects may be stored, whereby mask matching would identify the defect.
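The element-forming and mask-matching idea described above can be sketched in a few lines. This is an illustrative model only, assuming an 8 x 8 binary pixel window, 2 x 2-pixel elements with the 3-of-4 white threshold from the example, and a Python set standing in for the mask memory; all function names are hypothetical.

```python
# Illustrative sketch of thresholded pixel averaging into a 4x4-element mask.
# Assumptions (not from the patent text): 8x8 binary window, 2x2 elements,
# threshold of 3 white pixels per element.

def element_value(pixels):
    """Threshold a 2x2 pixel group: 3 or more white pixels -> 1, else 0."""
    return 1 if sum(pixels) >= 3 else 0

def make_mask(window):
    """Reduce an 8x8 binary pixel window to a 4x4-element mask (16 bits)."""
    mask = []
    for er in range(4):
        for ec in range(4):
            group = [window[2 * er + dr][2 * ec + dc]
                     for dr in range(2) for dc in range(2)]
            mask.append(element_value(group))
    return tuple(mask)

def is_defect(window, good_masks):
    """A mask never seen on a good object is classified as a defect."""
    return make_mask(window) not in good_masks
```

A pinhole that darkens two pixels of an all-white element flips that element to 0, producing a mask absent from the learned set.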
- since the invention embraces looking at local features (perhaps a break or a pinhole), it is possible to use many inspection units simultaneously looking at the field in parallel. This may be somewhat analogous, from the human standpoint, to having several observers, each having a magnifying glass, simultaneously moving their magnifying glasses over different sections of the objects to be inspected. Because the field of view is adjusted such that a defect will fit inside the same, parallel processing units of this type could be employed, contiguous with one another, or overlapping and enabling inspections of very large fields in real time at very high speeds.
- a most desirable feature of this recognition procedure is to enable operation of the same with decreased sensitivity to angular rotation of the card or other object under inspection.
- a printed circuit card PC that is, say, twelve inches long and six inches wide and that has a pinhole defect A with diameter 1/1000 inch (1 mil) located in the upper left corner, as identified in Fig. 1.
- Fig. 1 Examine the contents of each field of view independently, searching for a pattern of interest that lies within the field.
- the pinhole type defect A illustrated in Fig. 1 is again shown at A in the lower right in Fig. 2, which illustrates also another type of frequent printed circuit card defect, namely a short-circuit metal connection B between a conductor 1 on the printed circuit card PC and a bonding pad, so-labelled, connected with a further conductor 2.
- Fig. 2 indicates that the pinhole and short defects A and B are to be observed both at relatively high and at relatively low magnification, as more particularly shown in Fig. 3.
- Portions A and B of Fig. 3 represent the appearance of the respective pinhole and short defects at high magnification (and as compared with a normal conductor edge at C), while portions D and E represent the same respective defects at low magnification.
- 3E illustrates, on the other hand, that this low magnification, so suitable for large-defect identification, is not suitable to search for smaller defects such as the pinhole A.
- the automated inspection system of the invention will also: 2. Examine the entire object at a variety of different magnifications, analyzing each FOV as described in operation 1, above.
- the invention provides for viewing contiguous or overlapping fields at this low magnification and constructing a complete picture equivalent to what would have been seen had a sufficiently low magnification objective been provided. For the machine of the invention to operate with this feature, it will, therefore:
- One field is the area of the object viewed by, for example, the microscope objective.
- the other is the area of the microscope eye piece viewed by the eye, in the case of a human observer.
- the field of the objective is a variable which is a function of magnification; whereas, the field of the human eye is a constant. It is from this constant field that the brain obtains all its information, with the microscope serving to adjust the desired object image to fill the visual field.
- the human recognition system the brain, then processes the data provided by these receptors and attempts to make a decision as to object or pattern identity.
- the image acquisition function underlying the invention consists of the following three operations : a. Obtaining images of the object at one or more magnifications; b. Constructing these images using the minimum number of required light levels or colors ; and c. Dividing each image at each magnification into N 2 discrete spatial elements.
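The three acquisition operations above can be modeled directly: quantize a grayscale image to the minimum number of light levels, then divide it into N x N spatial elements by block averaging. The 8-bit input range, the integer-averaging rule, and all names here are illustrative assumptions, not the patent's circuitry.

```python
# Sketch of acquisition steps (b) and (c): quantize to L light levels,
# then divide the image into N x N elements by block averaging.

def quantize(image, levels):
    """Map 0-255 grayscale values onto `levels` discrete light levels."""
    return [[min(p * levels // 256, levels - 1) for p in row] for row in image]

def divide_into_elements(image, n):
    """Average (size/n x size/n) pixel blocks into an n x n element grid."""
    size = len(image)
    step = size // n
    grid = []
    for er in range(n):
        row = []
        for ec in range(n):
            block = [image[er * step + r][ec * step + c]
                     for r in range(step) for c in range(step)]
            row.append(sum(block) // len(block))
        grid.append(row)
    return grid
```

With levels = 2 this reproduces the binary (black/white) imaging used for the printed circuit card example.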
- the image recognition function of the invention following such image acquisition, consists of performing any number or combination of the following four operations to locate patterns of interest : a. Examining the N 2 elements from each field at each magnification independently, looking for a pattern of interest within each field; b. Correlating patterns of the same object area generated at different magnifications ; c.
- the first system function that must be implemented is the acquisition of object images at different magnifications.
- the simplest and perhaps most conceptual means to achieve this goal consists of placing a two-dimensional television, CCD, thermal-sensing, or other suitable sensing camera at the eye-piece of a microscope.
- the object will be repeatedly scanned at each of the required magnifications.
- Such technique has the inherent disadvantage of being relatively slow, since objects are viewed at each magnification in a sequential manner.
- the image acquisition time is proportional to the number of different magnifications incorporated.
- the image will be blurred and distorted. This is frequently observed on home television sets, for example, when viewing a fast moving object such as a baseball in flight. This blurring phenomenon is an additional factor which limits the maximum inspection rate.
- An alternative method is to obtain all images at each of the different magnifications in parallel.
- This can be achieved by having a separate camera for each different magnification as shown in Fig. 4, wherein camera 1 effectively operates at 5 x magnification, camera 2 at 10 x, and camera 3 at 20 x, for example.
- the object, so-labelled, would then be moved past each of the cameras, as shown by the arrow; or beam splitters and mirrors may be used, as in Fig. 5, to image the object onto each camera.
- This technique, however, has numerous disadvantages. It is costly and technically complex, requiring as many cameras as there are different magnification types; and critical optical alignment between cameras is required to maintain spatial relationships between patterns obtained at different magnifications.
- a third method utilizes a high resolution large field lens L 1 , Fig. 6 (such as, for example, the type Photor produced by Leitz, or APO-EL-Nikor produced by Nikon), to image the object onto a single high resolution linear charge coupled device imager (CCD) (such as, for example, the type 1728 pixel CCD currently produced by Reticon, or the type 2000 pixel CCD produced by Fairchild), shown at CCD in Fig. 6.
- CCD charge coupled device imager
- the output of the CCD is then transferred into a two-dimensional buffer memory BM and then into a sliding window memory SWM to create large field high resolution images.
- This technique enables the resolution to be equal to that obtained from high magnification objectives, while the FOV is equal to or greater than that obtained from low magnification objectives.
- object blurring can be totally eliminated by moving the object perpendicularly to the axis of the CCD and reading out the CCD into the buffer memory BM each time the object image has moved a distance equal to one CCD element; which may be, for example, approximately 1/2000 of an inch.
- the analog signal output of the CCD after conversion to digital form by a conventional sampling pixel quantizer, so labelled, is transferred in Fig. 6 into the buffer memory BM which can be modelled as a series of shift registers SR each equal in length to the number of CCD elements.
- the output (OUT) of each shift register SR is connected to the input (IN) of the next register and also to one row of the sliding window memory SWM, as shown in Fig. 6.
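The buffer-memory arrangement of Fig. 6 can be sketched as a chain of line-length shift registers whose taps form the rows of the window memory. The sizes below are illustrative, not the 1728- or 2000-pixel devices named in the text, and the class is a hypothetical software model of the hardware.

```python
from collections import deque

# Sketch of Fig. 6: each CCD readout (one scanned line) shifts through
# a chain of registers; one tap per register feeds a row of the window.

CCD_LENGTH = 8   # pixels per CCD line (illustrative)
WINDOW = 3       # window memory is WINDOW x WINDOW pixels

class SlidingWindowImager:
    def __init__(self):
        # Each deque entry models one shift register holding a full line.
        self.lines = deque([[0] * CCD_LENGTH for _ in range(WINDOW)],
                           maxlen=WINDOW)

    def clock_line(self, line):
        """Read out the CCD after the object has moved one element width."""
        assert len(line) == CCD_LENGTH
        self.lines.append(line)  # oldest line falls off the chain

    def window(self, col):
        """WINDOW x WINDOW pixel window at horizontal position col."""
        return [row[col:col + WINDOW] for row in self.lines]
```

Sliding `col` across the line mimics moving the magnifying glass over the object, as Fig. 7 illustrates.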
- the image appearing in the window memory SWM is similar to what would be seen by moving a magnifying glass over the entire object, as schematically illustrated in Fig. 7, wherein (a) illustrates the resolution and FOV seen in the window memory SWM as compared to what is seen at (b) and (c), by a conventional camera viewing the eyepiece of a microscope at high and low magnification, respectively.
- the optical configuration incorporated in the system of Fig. 6 is extremely flexible.
- the image size of the object produced on the CCD can be magnified or demagnified (from 1/3 x to 3 x using said Photor lens L 1 ; 2 x to 10 x or 1/2 x to 1/10 x using said APO-EL-Nikor lens) by simply moving the relative positions of the CCD, lens L 1 and object. Since the CCD is relatively very long (1728 pixels for said Reticon apparatus and 2000 pixels for said Fairchild equipment) and each pixel is extremely small (0.64 mils for Reticon, 0.5 mils for Fairchild), a large variety of high resolution large field images are obtainable.
- LFHR large field high resolution
- E ij brightness value of the element in horizontal position i and vertical position j
- P xy brightness value of the pixel in horizontal position x and vertical position y and
- N total number of pixels contained within each element, then E ij = (1/N) Σ P xy , the sum taken over the N pixels lying within element ij; that is, each element value is the averaged brightness of its constituent pixels.
- This group of N 2 elements forms the actual patterns which are to be recognized and therefore will be referred to as a pattern of elements (POE).
- POE pattern of elements
- Each POE is applied to a recognition memory for positive identification.
- Each different magnification (or FOV) has its own recognition memory.
- If each element is quantized into L discrete light levels and each POE contains N 2 elements, then there must exist L^(N²) memory locations, at each magnification, to store pattern information about each of the L^(N²) total possible patterns. Since the mapping must be unique between each memory location and POE, the POE bit pattern itself is used in the invention as the memory address, and information about each particular POE is stored at that address.
- the specific information is a function of the application and desired task. If that task is to locate defects on the printed circuit card as in the above illustration, and a defect is defined as any pattern not present on a good card, the memories are programmed as follows. A good card is scanned; and at each magnification, a valid bit is entered at each memory location accessed by a POE. This mode of operation is referred to as the "learn mode" because patterns are "learned" by the memories. To check cards, the system is now placed in "run mode." Each card is scanned and all POE are applied to the appropriate recognition memories for positive confirmation. Any foreign POE not previously learned is classified as a defect.
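The addressing trick above, using the POE bit pattern itself as the memory address, can be sketched as follows. A Python set of addresses stands in for the valid bits of the recognition memory; the POE width and the helper names are illustrative assumptions.

```python
# Sketch of learn/run mode: the N^2-bit POE is packed into an integer
# that directly addresses the recognition memory.

def poe_address(elements):
    """Pack binary element values into a recognition-memory address."""
    addr = 0
    for bit in elements:
        addr = (addr << 1) | bit
    return addr

def learn(good_poes):
    """Learn mode: set a valid bit at every address a good card accesses."""
    return {poe_address(p) for p in good_poes}

def is_unlearned(poe, memory):
    """Run mode: any POE never learned is classified as a defect."""
    return poe_address(poe) not in memory
```

Because the address is the pattern, lookup is a single memory access per POE, which is what makes the scheme real-time.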
- An example useful in robotic and many other applications is to teach the machine (learn mode) to differentiate among different tools, such as a plier, screwdriver, and hammer.
- the hammer is now assigned #3 and scanned into the system. All accessed blank locations are programmed with the number 3. All accessed non-blank locations are programmed with the number 4.
- In run mode, a tool is scanned, and each memory address accessed by a POE is read. The frequency at which the numbers 1, 2 and 3 occur is computed. The object is identified by the number having the highest frequency of occurrence that also exceeds some predetermined minimum value.
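The tool-recognition procedure of the two preceding passages can be sketched as below. A dict stands in for the recognition memory, the identification codes follow the text (with 4 marking locations accessed by more than one tool), and the addresses and minimum count are illustrative.

```python
from collections import Counter

# Sketch of multi-object learn/run: blank locations get the tool's code,
# non-blank locations get the collision code 4; run mode votes by frequency.

PLIER, SCREWDRIVER, HAMMER, SHARED = 1, 2, 3, 4

def learn_tool(memory, code, addresses):
    """Program each POE-accessed address; already-programmed ones become 4."""
    for a in addresses:
        memory[a] = SHARED if a in memory else code

def identify(memory, addresses, minimum=2):
    """Return the most frequent tool code read back, if above the minimum."""
    counts = Counter(memory.get(a) for a in addresses
                     if memory.get(a) in (PLIER, SCREWDRIVER, HAMMER))
    if not counts:
        return None
    code, freq = counts.most_common(1)[0]
    return code if freq >= minimum else None
```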
- Another application may involve teaching the machine to recognize a certain pattern for which there is no physical example.
- the correct recognition memories are first accessed directly via a memory input/output port. Identification numbers are programmed into the memory addresses that would be accessed by the POE's representing this specific pattern.
- In run mode, the contents of all accessed memory locations are examined. When the contents equal the programmed identification numbers, the specific pattern has been located.
- Fig. 11 the methodology and machine functions necessary to implement the inspection apparatus are presented in the block diagram of Fig. 11, involving the following flow of functions: a. Moving the object in a direction perpendicular to the axis of the CCD or other imager; b. Quantizing pixels from the CCD output into the minimum number of light levels or colors required to characterize the object (A/D in Fig. 11); c. Storing quantized pixels in a buffer to create a two-dimensional image equal to the largest desired field of view (BM in Fig. 11); d. Choosing desired fields of view (FOV) and magnifications from the two-dimensional image (in SWM, Fig. 11); e. Dividing each FOV into N 2 elements (block
- Fig. 11 is a block diagram that summarizes the system functions described above; to implement the final machine as a practical high-speed, real-time, cost-effective system, these functions may be performed in a somewhat different order than indicated, as hereinafter explained. Now turning to practical and preferred system implementation of the technique of the invention, it should be recognized that using the exact sequence of operations presented in Fig. 11 is extremely complex and costly, though, if desired, it could be implemented.
- If the FOV extracted from the window memory SWM contains 32 x 32 (1024) pixels and the CCD is operated at a 5-megahertz data rate (200-nanosecond period), all 1024 pixels required to compute the POE for this field must be processed such that a new POE is generated each 200 nanoseconds (ns). Processing involves computing the average and quantized brightness values for each of the N 2 elements. In addition, this entire computation must be performed for each different FOV simultaneously extracted from the window memory SWM.
- This complex operation can be greatly simplified by realizing that all the pixels required to compute the first of the N 2 elements (shown at E 11 at (e) in Fig. 11 and more fully delineated in Fig. 12(a)) are transferred from the CCD buffer BM into the window memory SWM prior to those required to compute the following element (E 12 , Fig. 12(a), etc. up to element E NN ).
- the window memory SWM need only be of sufficient size to store the number of pixels contained in the largest element, reducing the memory requirement of the window by a factor of N 2 .
- Another advantage of this technique is that only one pixel averaging and quantizing unit is required to compute all element values, rather than N 2 units. The last and perhaps most important feature of this technique is that all N 2 element values required for each consecutive POE can be obtained in one clock cycle (200 ns) if the N 2 element values are stored in a tapped delay line memory.
- the rectangle I of pixels (circles) is also shown in enlarged form in Fig. 13, showing details of the brightness averaging and quantizing functions (e) and (g) of Fig. 11, and which is to be now discussed in connection with successive times, as indicated on the timing chart of Fig. 14. If it be assumed that the window is in this position I at time t 0 , Fig. 14, then, prior to the next clock cycle, time t 1 , all (C/N) 2 pixels are transferred into a pixel averager e 1 , as indicated in Fig. 13.
- the window memory contains those pixels required to calculate element value E 12 (Fig. 12(A)), as indicated by the dashed square I" in Fig. 12(c). If one places the quantized element values into a shift register delay line memory SR 1 (Fig. 13), with N-1 taps separated by C/N locations, one can simultaneously obtain element values E 11 through E 1N . To obtain elements in the second through the N th row (E 21 through E NN in Fig. 12(A)), it is also necessary to delay values in the vertical direction (I v , I v1 , etc.).
- the window memory will contain those pixels required to calculate element E 21 , as shown at I v1 in Fig. 12(E). If one takes the output after C/N vertical CCD line delays and places it into a tapped delay line of the type described previously, shown at SR" in Fig. 15, with taps for each C/N pixel location, one can obtain values for E 21 through E 2N .
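The tapped delay line described above can be sketched in software: quantized element values shift through a register with N-1 taps spaced C/N locations apart, so one read yields a whole row of element values, E 11 through E 1N, in a single clock. The values of C and N below, and the class itself, are illustrative assumptions.

```python
from collections import deque

# Sketch of the tapped delay line of Figs. 13 and 15.

C, N = 8, 4          # FOV is C x C pixels, divided into N x N elements
SPACING = C // N     # locations between adjacent taps

class TappedDelayLine:
    def __init__(self):
        length = SPACING * (N - 1) + 1
        self.line = deque([0] * length, maxlen=length)

    def clock(self, value):
        """Shift a newly quantized element value into the delay line."""
        self.line.append(value)

    def taps(self):
        """Read N element values in one cycle: newest plus N-1 spaced taps."""
        vals = list(self.line)
        return [vals[-1 - SPACING * k] for k in range(N)]
```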
- the computational time of the pixel averaging network is shown as a function of the total number of stages M, the number of bits per pixel, and the number of bits per quantized element, all of which can be optimized for a given application.
- a single pixel in band #1 is quantized to yield the 1 x 1 element value; all three pixels in band #2, divided by 2 2 (DIV) and quantized (Q), yield the 2 x 2 element value; and so on, with all (2M-1) pixels in band #M, divided (DIV) by M 2 and quantized (Q), yielding the M x M element value
- Fig. 20 Look-up tables are formed (Fig. 20) in the form of programmable read-only memories (PROMs), now more fully discussed.
- PROM programmable-read-only-memories
- each pixel brightness is represented by a logical "1" representing the presence of metal or a logic "0" representing the absence of metal, such that the sum of pixel brightnesses in a band equals the number of pixels in the band equal to a "1".
- This sum is computed by using the pixels in each band as the address to a PROM, Fig. 20, and storing as the contents of each address the number of bits equal to a "1" in that address.
- the group of four pixels 11, 12, 21, 22 (corresponding to I, in Fig. 17 and band #2 in Fig. 18), is applied to the uppermost PROM (2), with the number of pixels in the 2 x 2 group S 2 equal to a logic "1" being ≤ 4.
- Band #3 (3 x 3 pixels) is shown associated with the PROM (3) and band #4 (4 x 4 pixels), with PROM (4), and so on; with advantage taken of the computation of the partial sum of pixel brightness contained within band #3 by adding the same (11, 12, 13, 21, 22, 23, 31, 32, 33) to the pixel brightness for 14, 24, 34, 44, 43, 42, 41 at the high speed summing network ⁇ 3 .
- B 4 (the number of pixels equal to a logic "1" in band #4) ≤ 7.
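The PROM-based band summing of Figs. 18 and 20 can be sketched as follows: the pixels of each band form an address whose stored contents are the number of "1" bits at that address (a popcount table), and the band counts accumulate into a partial sum for each element size. The band decomposition follows the text; the helper names and the 4 x 4 limit are illustrative.

```python
# Sketch of PROM summing: contents at each address = number of "1" bits.
PROM = [bin(addr).count("1") for addr in range(256)]

def band_pixels(window, m):
    """Band #m: the 2m-1 pixels added when an (m-1)x(m-1) element grows
    to m x m (the new right column plus the new bottom row)."""
    return ([window[r][m - 1] for r in range(m)] +
            [window[m - 1][c] for c in range(m - 1)])

def element_sums(window, max_m):
    """Partial sums S_m of "1" pixels for each element size m x m."""
    total, sums = 0, {}
    for m in range(1, max_m + 1):
        addr = 0
        for p in band_pixels(window, m):
            addr = (addr << 1) | p   # band pixels form the PROM address
        total += PROM[addr]          # accumulate partial sums, as at sigma-3
        sums[m] = total
    return sums
```

Note how S_3 is reused when computing S_4, mirroring the "advantage taken of the computation of the partial sum" described for band #3.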
- Fig. 20 demonstrates how the pixel summing is implemented for all eight element sizes EL1 through EL8
- Fig. 21 demonstrates how the thresholding is accomplished for the successive masks.
- the outputs S 2 -S 8 are shown applied to respective comparators COMP 2-8, to which are respectively applied the appropriate threshold signals T 2 -T 8 .
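The comparator bank of Fig. 21 reduces to one comparison per element size. The threshold values below are an assumption (a majority rule, echoing the earlier 3-of-4 example); the text leaves T 2 -T 8 application-dependent.

```python
# Sketch of the Fig. 21 comparators: each partial sum S_m is compared
# against its threshold T_m to yield the quantized element value EL_m.

SIZES = range(2, 9)                                # element sizes 2x2..8x8
THRESHOLDS = {m: (m * m + 1) // 2 for m in SIZES}  # assumed majority rule

def comparator_bank(sums):
    """EL_m = 1 when S_m meets or exceeds T_m, else 0."""
    return {m: int(sums[m] >= THRESHOLDS[m]) for m in SIZES}
```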
- Each of the eight quantized element values EL1 through EL8 is applied to an element delay unit of the type shown, for example, in Fig. 16. A separate delay unit is required for each of the eight element sizes. Two parameters are required to implement the delay unit;
- the values for C/N are determined by the element size and are listed in Table 1 below.
- N², which is the number of elements in the POE, also equals the number of bits applied to the address lines of the recognition memories ("k", Fig. 11), since each element value is represented by one bit in this particular application.
- The value chosen for N² is preferably selected to minimize cost and hardware complexity. Specifically, if there exist N² elements, each represented by two possible values ("0" or "1"), then there exists a total of 2^(N²) possible patterns. This implies that the recognition memory (Fig. 11) must contain 2^(N²) addressable locations.
- Table 2 lists the number of ICs required as a function of the number of elements (N²) in the POE.
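The memory-sizing rule above is simple arithmetic, sketched here for the 4 x 4 POE of this application (the helper name is illustrative):

```python
def recognition_memory_words(n_elements):
    """One address bit per one-bit element -> 2**(N^2) addressable patterns."""
    return 2 ** n_elements

# A 4 x 4 POE has 16 one-bit elements, so each recognition memory needs
# 2**16 = 65,536 addressable locations.
words_needed = recognition_memory_words(4 * 4)
```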
- this learning process may be vastly improved in speed by proceeding in a way that prior communication or similar processing theory, predicated on reducing noise, has contra-indicated.
- noise modeled to correspond to actual jitter or vibration of the inspection apparatus over time, or actual illumination variations over time
- the learning time can be significantly reduced.
- "Noise" perturbs the relationship between the true signal and the particular quantization levels between which the signal lies.
- the system uses one quantization level to create binary images (> threshold are white; ≤ threshold are black), and the threshold is perturbed relative to the signal.
- subsonic or low frequency noise modeled in spectrum and amplitude to be equivalent to vertical vibration noise of the inspection platform in actual operation over time is introduced by adding the same to the analog image signals from the CCD, causing the same to be learned and thus recognized when detected, rather than eliminated.
- Learning large sample sets by standard practice is time consuming.
- By using the mask samples and modeling noise to correspond to vibration or variation in illumination (causing thicker or thinner lines in the imaging), and superimposing such noise at a rapid rate on the analog image signals before they are quantized to digital form, the machine may be taught large quantities of information rapidly.
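A simplified software analogue of this noise-assisted learning is sketched below. It models only the one-level quantizer with zero-mean noise superimposed on the signal, so that borderline samples are seen (and can be learned) under both binary outcomes; the amplitude model and function names are assumptions:

```python
import random

def quantize_with_noise(analog_samples, threshold, noise_amp, seed=0):
    """One-level quantizer with zero-mean 'vibration' noise superimposed on
    the analog signal, so samples near the threshold toggle between both
    binary outcomes instead of being filtered to a single value."""
    rng = random.Random(seed)
    return [int(s + rng.uniform(-noise_amp, noise_amp) > threshold)
            for s in analog_samples]
```

Repeated scans of the same mask under such noise rapidly expose the learner to the family of patterns that real jitter would eventually produce.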
- the following frequency bands of noise were selected to affect the graphic TV display, simulating the shifting of the object being examined (such as the printed circuit cards), whether due to spatial offsets, vibration, or possibly variations in illumination.
- the horizontal bands of noise frequencies were selected to affect one horizontal element at each of the eight different mask sizes P1-P8; that is, a single horizontal element may change value without affecting its neighboring elements.
- for P1 (one element), 1 megahertz; for P2 (2 x 2), 500 kilohertz; for P3, 333 kilohertz; for P4, 250 kilohertz; for P5, 200 kilohertz; for P6, 166 kilohertz; for P7, 140 kilohertz; and for P8, 125 kilohertz.
- P1, 1 kilohertz; P2, 500 hertz; P3, 333 hertz; P4, 250 hertz; P5, 200 hertz; P6, 166 hertz; P7, 140 hertz; P8, 125 hertz.
- noise frequencies may be selected as follows: P1, 250 hertz; P2, 125 hertz; P3, 83 hertz; P4, 62 hertz; P5, 50 hertz; P6, 41 hertz; P7, 35 hertz; P8, 31 hertz. Effects such as slow daily variations in illumination, which would cause corresponding variations in camera signals, may be modeled by varying the threshold at a low noise frequency band in the range of 0.003 hertz to 15 hertz.
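The three frequency lists above all appear to follow the same scaling: the band for mask size Pn is approximately the P1 band divided by n, with the listed values rounded. A sketch of that observation (the scaling rule is an inference from the lists, not stated by the patent):

```python
def band_noise_frequencies(f_p1_hz):
    """Noise band for mask size Pn scales as f(P1)/n; the patent's listed
    values match these after rounding (e.g. 1 MHz/3 -> '333 kilohertz')."""
    return {n: f_p1_hz / n for n in range(1, 9)}
```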
- This noise also simulates slow frequency wobble or possibly shaking of the entire machine and table thereof
- a further refinement that avoids the possibility of the apparatus mistaking an indicated abnormality for a defect or other such pattern of interest, as may be caused by variations about the thresholds, is the use of adjacent-location mask outputs to enable the conclusion of a real defect, such as by monitoring horizontally and vertically contiguous masks. More generally, the avoidance of mask edge position uncertainty or error in the image of the successive samples being monitored during the line scanning of the object or pattern thereof, with resulting pattern uncertainty (including whether the defect or other sub-pattern of interest is actually there), is effected by monitoring adjacent or contiguous patterns in two dimensions (horizontally and vertically, for example), to verify that the same sub-pattern defect actually occurs in all adjacent masks.
- this may be considered also as effectively "vibrating" the mask to avoid uncertainty of image sampling.
- This may be effected by storing in succession the sets of bits representing horizontally adjacent pattern information in a scanned row or line of the image, as before explained, Fig. 23C, and comparing the same with similar pattern information of a vertically displaced adjacent scanned row or line, Fig. 23E; such that only if all pairs of pattern information sets of bits show the presence of the sub-pattern is the detection of the defect or other sub-pattern indicated.
- masks are composed of elements, and elements are composed of groups of pixels ranging from 1 pixel in size to many pixels.
- a pixel threshold is set, and if the number of white pixels contained within an element group exceeds the minimum threshold, the element is said to be white; otherwise it is black.
- thresholds may be set, for example, such that if more than 50% of the pixels are white, the element is white. Elements lying on edge boundaries, however, may oscillate in value because there exists a certain amount of quantization error or noise in the system.
- the defect must be present at all "vibrational" positions of the mask; otherwise, it is concluded that the defect signal resulted from such quantization and should be ignored as such as not constituting a true defect.
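The two-dimensional verification described above can be sketched as a software check. This is a minimal model under assumed conventions (a 2-D grid of per-position defect flags, with the window of contiguous positions ending at the current one); the function name and parameters are illustrative:

```python
def verified_defect(flags, row, col, m=2, n=2):
    """Confirm a defect at (row, col) only if the defect flag is set at all
    n contiguous horizontal and m contiguous vertical mask positions ending
    there; otherwise treat the flag as quantization noise and ignore it."""
    return all(flags[row - dy][col - dx]
               for dy in range(m) for dx in range(n))
```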
- the invention thus provides a significant departure from prior pre-programming concepts in image inspection and recognition systems, enabling self- or auto-learning, wherein the objects to be “learned” are placed underneath the system which proceeds to scan them, learning all important features thereof as the complete learning process. No programming whatsoever is needed.
- the invention further eliminates any requirement of critical alignment, and operates independently of slight rotation or displacement of the object.
- the invention enables orienting the object by rotating it while "learning", thus to teach the system all orientations and render recognition rotation independent.
- the mask does not have to be of square configuration as illustrated. It could be triangular or round or of any other two-dimensional shape whatsoever, so long as the masks can overlap such that all areas of the object are seen by the masks, independently of the mask shape.
- An additional feature of the invention is that since the system processes all data as it is being scanned, with complete information extraction and decision-making performed as the data is coming in, there is no necessity for storing the picture whatsoever.
- the implication of this feature is that the system can look at extremely large fields--in fact, at truly infinite fields, without limitation. For example, the system can look at a roll of printed paper or newspaper coming out in thousands and thousands of feet, analyzing it as it passes. It can inspect rolls of metal, as for pin holes or other artifacts, with no limitation on size. This is decidedly contrasted with prior systems that require that the image be stored, storage size increasing as the square of the image dimension, to astronomical values in such applications as the newspaper or the like, before mentioned.
- the system, having stored scan information on a known object, scans similar objects to be inspected and determines if any shape not previously seen is present, and, if desired, stops or otherwise indicates the location, for example, of defects or shape variations not present in "good" object images.
- the system is "taught", as before explained, to differentiate between different types of objects by storing scan information as to these different objects with information for uniquely defining the different objects, and then, when inspecting objects, as in a bin or on a conveyor belt, identifying the same and, if desired, stopping and/or tagging the identified objects.
- the concept underlying the invention is to provide a set of magnifications, the smallest being chosen, as previously explained, to see the smallest object of interest such as a defect or other shape, and the largest being chosen to show perhaps the largest object while still enabling identification of that object.
- by using pattern information at the lower magnification as well as the higher magnification, patterns can be recognized at very fast rates, say 200-nanosecond speeds or less. This will reduce, and perhaps in some applications eliminate, the need to reconstruct the larger shapes or perimeters from the smallest shapes shown by the highest magnifications.
- the invention thus distinguishes also from some other approaches that store specific "learned" images and cannot work with new images not seen before.
- the system learns and stores in recognition memory graphical characteristics which it can then identify when such appear even in totally new graphics and images that have not been shown to the machine before.
- graphical characteristics are not the only information that may be classified in accordance with the underlying concepts of the invention; the same being applicable to other information including other electromagnetic or acoustic signal information, if desired.
- an object is in motion (as upon a moving table or platform or the like) past the inspection station in the upper left, illuminated so that its image is focused by a lens L' upon a linear CCD or other sensing camera or the like, with the object motion perpendicular to the CCD.
- the output of the CCD is an analog signal which is quantized into the digital levels.
- the quantized information is stored in a two-dimensional buffer memory BM, each line of the buffer memory being equal to the length of the CCD scan, such that a large two-dimensional image is created.
- a small memory, referred to as a sliding window memory SWM, is filled, in effect sliding along the large two-dimensional buffer memory.
- the size of the sliding window memory is equal to the largest element size required to compute the largest mask.
- the information from this sliding window memory is placed into an element computational unit.
- the sliding window memory contains C² pixels, where N² is the total number of elements looked at in each field of view, and C² is the total number of pixels in the largest field of view. In this specific application, the largest element size is 8 pixels by 8 pixels, or a total of 64 pixels. From the output of the element computational unit, Fig. 18, an actual mask, 4 elements by 4 elements, is created. The specifics of the element delay unit which computes these masks were earlier described in connection with Fig. 16.
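The sliding window operation can be sketched as follows. This is a software model under an assumed layout (one buffer row per CCD scan line), not the hardware SWM itself:

```python
def sliding_windows(buffer, size=8):
    """Yield size x size pixel windows sliding across the two-dimensional
    buffer memory; here size=8 matches the largest 8 x 8 element."""
    rows, cols = len(buffer), len(buffer[0])
    for r in range(rows - size + 1):
        for c in range(cols - size + 1):
            yield [row[c:c + size] for row in buffer[r:r + size]]
```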
- each of the element delay units is fed to a recognition memory.
- for each different element size, there is a unique element delay unit that creates the mask composed of such elements, referred to as a POE (pattern of elements), and such is applied to the address lines of the recognition memory, representing a unique address or unique pattern.
- an object is shown to the machine and scanned, developing these patterns of elements or addresses applied to the recognition memory, such that information about these patterns is stored as associated with the specific object being scanned.
- in the run mode of the apparatus, when inspecting an object under test and determining whether that object has been seen before and "learned", the patterns of elements or addresses developed for the object under inspection are compared with the contents of the recognition memories to determine whether these patterns have been seen before.
- the analysis for actual interpretation and correlation of these patterns is performed in a unit described as a computer or processor for interpreting and correlating pattern information, performing the following operations.
- if any pattern is developed at any of the different recognition memories that has not been previously seen or learned and addressed therein, this constitutes an error or defect or pattern outside the norm, and is flagged.
- before this may be definitely identified as a true defect, it may be required, as before explained, that multiple recognition memories all indicate that a foreign pattern is present at more than one of the so-called view magnifications; or, if only certain recognition memories are important for the given application, that a certain limited number of foreign patterns would constitute a defect.
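The learn/run distinction and the multi-memory rule can be sketched in software, with each recognition memory modeled as a set of learned POE addresses (class and function names are illustrative, not the patent's):

```python
class RecognitionMemory:
    """A recognition memory modeled as the set of POE addresses it has learned."""
    def __init__(self):
        self.learned = set()

    def learn(self, poe_address):
        """Learn mode: mark this pattern address as 'seen before'."""
        self.learned.add(poe_address)

    def is_foreign(self, poe_address):
        """Run mode: a pattern never learned is foreign."""
        return poe_address not in self.learned

def defect_present(memories, addresses, required):
    """Flag a defect only if at least `required` of the view-magnification
    memories report a never-seen pattern, per the multi-memory rule above."""
    foreign = sum(m.is_foreign(a) for m, a in zip(memories, addresses))
    return foreign >= required
```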
- the optical scanning of an area of a region of interest of the object is performed by the CCD.
- Digitizing the scanned signals in real time to produce successive trains of digital signals corresponding to the optically scanned lines is performed in the pixel quantizer.
- Delaying the successive trains of digital signals to produce a vertical raster of successive trains of digital signals corresponding to previously optically scanned lines is effected in the two-dimensional buffer memory, more fully previously described in connection with Fig. 6.
- the sliding window memory SWM thereupon reduces the vertical raster to a sub-raster containing a predetermined number of horizontal H and vertical V pixels, earlier described in detail in Figs. 6, 12 and 13.
- Groups of pixels are now extracted from the sub-raster in spatial distribution such that Vi is equal to or less than V and Hi is equal to or less than H, where the subscript i represents specific elements ranging from an element one pixel in size to an element V by H pixels in size, reference being made to the element computational unit described in Figs. 18, 20 and 21. That unit enables selecting an element threshold value for each element size by predetermining a minimum and maximum number of pixels, such that if the number of pixels contained in an element is within said minimum and maximum values, the element value is set to the value zero or one, with such threshold varying with the size of the element, as detailed in Fig. 21.
- the element delay unit (more fully described in connection with Fig. 16).
- the code number is read to determine if the mask generated during the optical scanning of the object to be inspected corresponds thereto, indicating the recognition of the desired object pattern or the lack of correspondence thereto in the computer or processor for interpreting and correlating pattern information (Fig. 22). If the object or pattern has not been previously seen, such is indicated or flagged. It is now in order to describe the operational units in the computer or processor above-referenced and suitable indicating or display apparatus. Referring now to Figs. 23A, B, C, D and E, five units or operations of the computer or processor are illustrated with suitable circuit designs for performing the desired functions. Fig. 23A shows a circuit for indicating a defect or unknown and unlearned pattern present in an object under inspection.
- the circuit is designed such that an unknown pattern in any memory constitutes a defect in the object; such result being attained by providing that the data from each of the N recognition memories is logically ANDed with an enable gate, to determine if that memory is to be looked at, and the outputs of the N gates are all ORed together.
- the defect will be indicated.
- Such indication can be presented to the operator by way of a number of different types of display devices, including the TV monitor M (Fig. 22), wherein the specific image which the system of the invention is actually viewing at the time that it detects a defect can be frozen and displayed.
- In Fig. 23A, data from recognition memories 1-N are applied to respective gates receiving enabling signals from masks 1-N and provide an output A, such as a logic 1, representing a defect, indicative of the fact that an unknown pattern has appeared in at least one of the memories.
- the circuitry of Fig. 23B insures that if defects exist simultaneously at chosen masks, then and only then is it considered that a defect actually exists in the object, as before explained.
- Fig. 23C shows the application of output A of Fig. 23A or output B of Fig. 23B to an AND gate also inputted from successive unit horizontal delay elements 1-N.
- a logic 1 in the output C would thus indicate that a defect was flagged by one or more recognition memories at each of the previous N horizontal positions.
- Similar circuitry for achieving the contiguous vertical mask defect verification before described is shown in Fig. 23D, wherein successive vertical line delays input the final AND gate to produce output D. If and only if a defect is detected at M contiguous vertical locations at the same horizontal location does the circuit indicate the presence of a defect in the object. Thus a logic 1 in the output D indicates that a defect was flagged by one or more recognition memories at each of the previous M vertical positions.
- Fig. 23E achieves the "vibration" of the mask to insure indication of a genuine defect as previously explained.
- This combines the horizontal and vertical contiguous mask checking for defects with the outputs a-j from successive vertical line delays, the outputs b-k from unit horizontal delays, and the outputs c-L from the Nth horizontal delays, operating the final AND gate such that a logic 1 will indicate that a defect is present if and only if all previous N horizontal locations at each of the previous M vertical lines indicate a defect.
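The gating of Figs. 23A and 23C reduces to simple Boolean expressions, sketched here in software (the vertical and combined cases of Figs. 23D and 23E follow the same pattern with line delays in place of unit delays; function names are illustrative):

```python
def output_a(memory_bits, enables):
    """Fig. 23A: each memory's data bit is ANDed with its enable signal,
    and the N results are ORed into output A (1 = unknown pattern seen)."""
    return int(any(d and e for d, e in zip(memory_bits, enables)))

def output_c(a_history, n):
    """Fig. 23C: logic 1 only if A was 1 at each of the previous n
    horizontal positions (modeled as the tail of a history list)."""
    return int(len(a_history) >= n and all(a_history[-n:]))
```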
- An operating system of the type shown in Fig. 22 had the following technical specifications and specific parts and equipment.
- the motor drive for an XY table (of Ralmike Corporation) was controlled by stepping motors (from Sigma Corporation).
- the imaging lens was a Photar enlarging lens by Leitz (130mm), and the CCD camera was constructed by Beltronics of Brookline, Massachusetts, using a Reticon 1728 element array.
- the clock drivers for the CCD were DS0026 National Company chips and the linear amplifiers and sampling-hold circuitry utilized LH0023 chips.
- the noise generators were: an H. H.
- the pixel quantizer in this application was a single LM361 comparator.
- the two-dimensional buffer memory was constructed with a width of 2,000 elements and a depth or vertical height of 64 lines. Referring to Fig. 22 and the two-dimensional buffer memory, L was equal to 2,000 and the depth equaled 64.
- the TV memory buffer was constructed using Hitachi HM6167's, with count and control chips 74S163's and a variety of 7408's, gates and multiplexers.
- the TV monitor was produced by Panasonic.
- the sliding window memory was constructed using TMM2016's memory chips in conjunction with 74LS374's, 74S299's, and Data Delay Devices No. DDU5100, for providing accurate timing.
- the element computational unit was constructed using 74S374's to construct the various element sizes ranging from 1 to 64 pixels in size.
- the pixel sums were calculated using both PROMs 82S131 by Signetics and actual summing chips 74S283's. The final sums were then latched into 74S174's. Having computed a sum of pixels, such is compared to its pixel threshold, with the comparison being performed using 74S85's.
- the element values were then sent into the individual element delay units constructed using 74S299's, 74LS374's, 74164's, and Hitachi TMM2016's. These element delay units were used to compute the mask addresses, which are applied to the recognition memories.
- the memories were implemented using Hitachi HM6167's memory chips.
- the computer or processor for interpreting and correlating pattern information was implemented using 74S86's to perform the exclusive OR function described in Fig.
- the delay lines were of the type TMM2016 to implement the various vertical delays necessary for some of the defect detection shown in Fig. 23; and again 74164's were used to implement some of the horizontal delays.
- Additional control logic contained 74S163's and 74S299's for enabling and disabling the appropriate masks at the correct time.
- the overall system specifications were as follows. For demonstration purposes, the XY table was capable of scanning an area 8 inches by 5 inches. Larger tables, however, that can easily scan two-foot-square areas and larger, can readily be hooked into the system.
- the resolution of the CCD system was of the order of 1 mil.
- the processing or data rate was from 5 megahertz (200 nanoseconds) to as low as 2 megahertz (500 nanoseconds). Operating the CCD at a 5 megahertz rate, the time required to clock out the complete array was half a millisecond so that the time required to analyze a strip 8" long by 1" was 5 seconds.
- the time required to scan a PC card 8" by 10" was 50 seconds at 1X optical magnification.
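A rough arithmetic check of the quoted figures (approximate: scan overhead, which brings the 4-second product up to the quoted 5 seconds per strip, is ignored; the 1000-lines-per-inch pitch is an assumption from the ~1 mil resolution):

```python
pixels_per_scan = 1728                  # Reticon array length
pixel_rate_hz = 5_000_000               # 5 MHz clock
readout_s = pixels_per_scan / pixel_rate_hz   # ~0.35 ms, quoted as "half a millisecond"

line_time_s = 0.5e-3                    # the patent's rounded per-line figure
strip_time_s = 8 * 1000 * line_time_s   # 8" strip at ~1 mil pitch -> 4 s
card_time_s = 10 * strip_time_s         # ten 1"-wide strips for an 8" x 10" card
```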
- the TV or other monitor M of Fig. 22 may have its image of the pattern of interest of the object, such as a defect, frozen on the screen with the aid of the TV memory buffer, so labelled, and with x-y or other coordinates or position of the same also recorded, as indicated.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US06/477,264 US4589140A (en) | 1983-03-21 | 1983-03-21 | Method of and apparatus for real-time high-speed inspection of objects for identifying or recognizing known and unknown portions thereof, including defects and the like |
US477264 | 1990-02-08 |
Publications (3)
Publication Number | Publication Date |
---|---|
EP0145725A1 EP0145725A1 (en) | 1985-06-26 |
EP0145725A4 true EP0145725A4 (en) | 1988-06-27 |
EP0145725B1 EP0145725B1 (en) | 1993-10-06 |
Family
ID=23895212
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP84901532A Expired - Lifetime EP0145725B1 (en) | 1983-03-21 | 1984-03-19 | Method of and apparatus for real-time high-speed inspection of objects for identifying or recognizing known and unknown portions thereof, including defects and the like |
Country Status (6)
Country | Link |
---|---|
US (1) | US4589140A (en) |
EP (1) | EP0145725B1 (en) |
JP (1) | JPS60501429A (en) |
CA (1) | CA1234908A (en) |
DE (1) | DE3486226D1 (en) |
WO (1) | WO1984003784A1 (en) |
Families Citing this family (115)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5297222A (en) * | 1982-05-04 | 1994-03-22 | Hitachi, Ltd. | Image processing apparatus |
DE3347645C1 (en) * | 1983-12-30 | 1985-10-10 | Dr.-Ing. Ludwig Pietzsch Gmbh & Co, 7505 Ettlingen | Method and device for opto-electronic testing of a surface pattern on an object |
GB2152658A (en) * | 1984-01-09 | 1985-08-07 | Philips Electronic Associated | Object sorting system |
US4650333A (en) * | 1984-04-12 | 1987-03-17 | International Business Machines Corporation | System for measuring and detecting printed circuit wiring defects |
US4953224A (en) * | 1984-09-27 | 1990-08-28 | Hitachi, Ltd. | Pattern defects detection method and apparatus |
US5374830A (en) * | 1984-10-12 | 1994-12-20 | Sensor Adaptive Machines, Inc. | Target based determination of robot and sensor alignment |
USRE38559E1 (en) * | 1984-12-20 | 2004-07-27 | Orbotech Ltd | Automatic visual inspection system |
EP0195161B1 (en) * | 1985-03-14 | 1993-09-15 | Nikon Corporation | Apparatus for automatically inspecting objects and identifying or recognizing known and unknown portions thereof, including defects and the like and method |
US4646355A (en) * | 1985-03-15 | 1987-02-24 | Tektronix, Inc. | Method and apparatus for input picture enhancement by removal of undersired dots and voids |
US4794647A (en) * | 1985-04-08 | 1988-12-27 | Northern Telecom Limited | Automatic optical inspection system |
JPS61290311A (en) * | 1985-06-19 | 1986-12-20 | Hitachi Ltd | Apparatus and method for inspecting soldered zone |
US5222159A (en) * | 1985-07-19 | 1993-06-22 | Canon Kabushiki Kaisha | Image processing method and apparatus for extracting a portion of image data |
JPS6261390A (en) * | 1985-09-11 | 1987-03-18 | 興和株式会社 | Method and apparatus for inspecting printed board |
SE452386B (en) * | 1985-10-07 | 1987-11-23 | Hasselblad Ab Victor | DEVICE FOR APPLIANCES FOR ASTAD COMMANDING OF AN ELECTRIC SIGNAL REPRESENTING A PICTURE |
US4928313A (en) * | 1985-10-25 | 1990-05-22 | Synthetic Vision Systems, Inc. | Method and system for automatically visually inspecting an article |
US4777525A (en) * | 1985-12-23 | 1988-10-11 | Preston Jr Kendall | Apparatus and method for a multi-resolution electro-optical imaging, display and storage/retrieval system |
JPS62173731A (en) * | 1986-01-28 | 1987-07-30 | Toshiba Corp | Inspection device for surface of article to be inspected |
US4827412A (en) * | 1986-01-29 | 1989-05-02 | Computer Sports Systems, Inc. | Pinfall detector using video camera |
US5046109A (en) * | 1986-03-12 | 1991-09-03 | Nikon Corporation | Pattern inspection apparatus |
JPS62247478A (en) * | 1986-04-21 | 1987-10-28 | Hitachi Ltd | Pattern inspection instrument |
IL78943A (en) * | 1986-05-27 | 1990-09-17 | Ibm Israel | Method and apparatus for automated optical inspection of printed circuits |
US4809341A (en) * | 1986-07-18 | 1989-02-28 | Fujitsu Limited | Test method and apparatus for a reticle or mask pattern used in semiconductor device fabrication |
US5151822A (en) * | 1986-10-17 | 1992-09-29 | E. I. Du Pont De Nemours And Company | Transform digital/optical processing system including wedge/ring accumulator |
US4872052A (en) * | 1986-12-03 | 1989-10-03 | View Engineering, Inc. | Semiconductor device inspection system |
US4760444A (en) * | 1987-07-22 | 1988-07-26 | Csd International, Inc. | Machine visual inspection device and method |
US4866629A (en) * | 1987-11-13 | 1989-09-12 | Industrial Technology Research Institute | Machine vision process and apparatus for reading a plurality of separated figures |
JPH0737892B2 (en) * | 1988-01-12 | 1995-04-26 | 大日本スクリーン製造株式会社 | Pattern defect inspection method |
US4899219A (en) * | 1988-10-31 | 1990-02-06 | Amoco Corporation | Macroview and microview video record of core |
US5046111A (en) * | 1989-02-09 | 1991-09-03 | Philip Morris Incorporated | Methods and apparatus for optically determining the acceptability of products |
US5150422A (en) * | 1989-03-31 | 1992-09-22 | Dainippon Screen Mfg. Co., Ltd. | Method of and apparatus for inspecting conductive pattern on printed board |
US5046120A (en) * | 1989-04-10 | 1991-09-03 | Beltronics, Inc. | Method of and apparatus for increasing the processing speed in the scanning inspection of circuit boards and other objects |
US5361309A (en) * | 1989-09-07 | 1994-11-01 | Canon Kabushiki Kaisha | Character recognition apparatus and method with low-resolution storage for character extraction |
EP0435660B1 (en) * | 1989-12-29 | 1997-06-04 | Canon Kabushiki Kaisha | Method of evaluating objects based upon image processing, and inspection apparatus using said method |
JPH03210679A (en) * | 1990-01-12 | 1991-09-13 | Hiyuutec:Kk | Method and device for pattern matching |
EP0455898A1 (en) * | 1990-05-09 | 1991-11-13 | Robert Bishop | Image scanning inspection system |
JP2747105B2 (en) * | 1990-11-05 | 1998-05-06 | 富士通株式会社 | Image data verification method and apparatus |
IL99823A0 (en) * | 1990-11-16 | 1992-08-18 | Orbot Instr Ltd | Optical inspection method and apparatus |
US5598345A (en) * | 1990-11-29 | 1997-01-28 | Matsushita Electric Industrial Co., Ltd. | Method and apparatus for inspecting solder portions |
US5119434A (en) * | 1990-12-31 | 1992-06-02 | Beltronics, Inc. | Method of and apparatus for geometric pattern inspection employing intelligent imaged-pattern shrinking, expanding and processing to identify predetermined features and tolerances |
JPH04290186A (en) * | 1991-03-19 | 1992-10-14 | Eastman Kodak Japan Kk | Image processing method |
DE4133590A1 (en) * | 1991-07-03 | 1993-01-14 | Bosch Gmbh Robert | METHOD FOR CLASSIFYING SIGNALS |
US5283418A (en) * | 1992-02-27 | 1994-02-01 | Westinghouse Electric Corp. | Automated rotor welding processes using neural networks |
US5361307A (en) * | 1993-03-25 | 1994-11-01 | General Electric Company | Correlation methods of identifying defects in imaging devices |
US5452368A (en) * | 1993-08-02 | 1995-09-19 | Motorola, Inc. | Method of detecting defects in semiconductor package leads |
JP3392573B2 (en) * | 1994-03-31 | 2003-03-31 | 株式会社東芝 | Sample inspection apparatus and method |
US7843497B2 (en) | 1994-05-31 | 2010-11-30 | Conley Gregory J | Array-camera motion picture device, and methods to produce new visual and aural effects |
US5515301A (en) * | 1994-06-29 | 1996-05-07 | General Electric Company | Real-time visualization system for multiple time-sampled signals |
US6714665B1 (en) | 1994-09-02 | 2004-03-30 | Sarnoff Corporation | Fully automated iris recognition system utilizing wide and narrow fields of view |
US5703729A (en) * | 1994-12-12 | 1997-12-30 | Dainippon Screen Mfg. Co., Ltd. | Image inputting apparatus |
US5850466A (en) * | 1995-02-22 | 1998-12-15 | Cognex Corporation | Golden template comparison for rotated and/or scaled images |
AU1127197A (en) * | 1995-12-04 | 1997-06-27 | David Sarnoff Research Center, Inc. | Wide field of view/narrow field of view recognition system and method |
FR2751109B1 (en) * | 1996-07-09 | 1998-10-09 | Ge Medical Syst Sa | PROCEDURE FOR LOCATING AN ELEMENT OF INTEREST CONTAINED IN A THREE-DIMENSIONAL OBJECT, IN PARTICULAR DURING AN EXAMINATION OF STEREOTAXIS IN MAMMOGRAPHY |
US6330354B1 (en) | 1997-05-01 | 2001-12-11 | International Business Machines Corporation | Method of analyzing visual inspection image data to find defects on a device |
GB2330198A (en) * | 1997-10-10 | 1999-04-14 | Leonard Frederick Goldstone | Article identification system |
TW419634B (en) | 1999-02-02 | 2001-01-21 | Ind Tech Res Inst | Automatic detection system and method using bar code positioning |
US6678681B1 (en) * | 1999-03-10 | 2004-01-13 | Google Inc. | Information extraction from a database |
JP3350477B2 (en) * | 1999-04-02 | 2002-11-25 | セイコーインスツルメンツ株式会社 | Wafer inspection equipment |
US7106895B1 (en) | 1999-05-05 | 2006-09-12 | Kla-Tencor | Method and apparatus for inspecting reticles implementing parallel processing |
CA2296143A1 (en) | 2000-01-18 | 2001-07-18 | 9071 9410 Quebec Inc. | Optical inspection system |
GB2362459A (en) * | 2000-05-16 | 2001-11-21 | Lloyd Doyle Ltd | Method and apparatus for inspection of printed wiring boards |
JP2002100660A (en) * | 2000-07-18 | 2002-04-05 | Hitachi Ltd | Defect detecting method, defect observing method and defect detecting apparatus |
US7200246B2 (en) | 2000-11-17 | 2007-04-03 | Honeywell International Inc. | Object detection |
US6711279B1 (en) | 2000-11-17 | 2004-03-23 | Honeywell International Inc. | Object detection |
US6841780B2 (en) | 2001-01-19 | 2005-01-11 | Honeywell International Inc. | Method and apparatus for detecting objects |
WO2002067039A1 (en) * | 2001-02-19 | 2002-08-29 | Olympus Optical Co., Ltd. | Image comparing device, image comparing method and progrom having computer run image comparison |
US6700658B2 (en) | 2001-10-05 | 2004-03-02 | Electro Scientific Industries, Inc. | Method and apparatus for circuit pattern inspection |
US7151850B2 (en) * | 2001-10-30 | 2006-12-19 | Matsushita Electric Industrial Co., Ltd. | Apparatus and method for setting teaching data, teaching data providing system over network |
US7813559B2 (en) * | 2001-11-13 | 2010-10-12 | Cyberoptics Corporation | Image analysis for pick and place machines with in situ component placement inspection |
US7239399B2 (en) * | 2001-11-13 | 2007-07-03 | Cyberoptics Corporation | Pick and place machine with component placement inspection |
US6879389B2 (en) * | 2002-06-03 | 2005-04-12 | Innoventor Engineering, Inc. | Methods and systems for small parts inspection |
US6996264B2 (en) * | 2002-10-18 | 2006-02-07 | Leco Corporation | Indentation hardness test system |
US20040086166A1 (en) * | 2002-11-01 | 2004-05-06 | Photon Dynamics, Inc. | Method and apparatus for flat patterned media inspection |
US7289658B2 (en) * | 2003-06-24 | 2007-10-30 | International Business Machines Corporation | Removal of relatively unimportant shapes from a set of shapes |
JP4041042B2 (en) * | 2003-09-17 | 2008-01-30 | Dainippon Screen Mfg. Co., Ltd. | Defect confirmation device and defect confirmation method |
US7706595B2 (en) * | 2003-11-07 | 2010-04-27 | Cyberoptics Corporation | Pick and place machine with workpiece motion inspection |
US8010579B2 (en) * | 2003-11-17 | 2011-08-30 | Nokia Corporation | Bookmarking and annotating in a media diary application |
US20050105374A1 (en) * | 2003-11-17 | 2005-05-19 | Nokia Corporation | Media diary application for use with digital device |
US8990255B2 (en) * | 2003-11-17 | 2015-03-24 | Nokia Corporation | Time bar navigation in a media diary application |
US20050108643A1 (en) * | 2003-11-17 | 2005-05-19 | Nokia Corporation | Topographic presentation of media files in a media diary application |
GB2409027A (en) * | 2003-12-11 | 2005-06-15 | Sony Uk Ltd | Face detection |
US7215808B2 (en) * | 2004-05-04 | 2007-05-08 | Kla-Tencor Technologies Corporation | High throughput image processing for inspection images |
JP4357355B2 (en) * | 2004-05-07 | 2009-11-04 | Hitachi High-Technologies Corporation | Pattern inspection method and apparatus |
US7084970B2 (en) * | 2004-05-14 | 2006-08-01 | Photon Dynamics, Inc. | Inspection of TFT LCD panels using on-demand automated optical inspection sub-system |
US20050286428A1 (en) * | 2004-06-28 | 2005-12-29 | Nokia Corporation | Timeline management of network communicated information |
US20060075631A1 (en) * | 2004-10-05 | 2006-04-13 | Case Steven K | Pick and place machine with improved component pick up inspection |
JP4647974B2 (en) * | 2004-11-17 | 2011-03-09 | Hitachi High-Technologies Corporation | Defect review apparatus, data management apparatus, defect observation system, and defect review method |
FR2880453A1 (en) * | 2005-01-06 | 2006-07-07 | Thomson Licensing Sa | METHOD AND DEVICE FOR PROCESSING IMAGE MOSAIC |
US20070003126A1 (en) * | 2005-05-19 | 2007-01-04 | Case Steven K | Method and apparatus for evaluating a component pick action in an electronics assembly machine |
JP4558607B2 (en) * | 2005-08-24 | 2010-10-06 | Laurel Precision Machines Co., Ltd. | Coin detector |
JP4657869B2 (en) * | 2005-09-27 | 2011-03-23 | Sharp Corporation | Defect detection apparatus, image sensor device, image sensor module, image processing apparatus, digital image quality tester, defect detection method, defect detection program, and computer-readable recording medium |
JP2009514234A (en) * | 2005-10-31 | 2009-04-02 | CyberOptics Corporation | Electronic assembly machine with built-in solder paste inspection |
US7636478B2 (en) * | 2006-07-31 | 2009-12-22 | Mitutoyo Corporation | Fast multiple template matching using a shared correlation map |
US7593560B2 (en) * | 2006-11-30 | 2009-09-22 | Canon U.S. Life Sciences, Inc. | Systems and methods for monitoring the amplification and dissociation behavior of DNA molecules |
US7869621B1 (en) | 2007-06-07 | 2011-01-11 | Aydin Arpa | Method and apparatus for interpreting images in temporal or spatial domains |
US8824731B2 (en) * | 2007-10-31 | 2014-09-02 | The Boeing Company | Image processing of apparatus condition |
US8581848B2 (en) | 2008-05-13 | 2013-11-12 | Pixart Imaging Inc. | Hybrid pointing device |
US8730169B2 (en) * | 2009-10-29 | 2014-05-20 | Pixart Imaging Inc. | Hybrid pointing device |
SG164298A1 (en) * | 2009-02-24 | 2010-09-29 | Visionxtreme Pte Ltd | Object inspection system |
US20110090489A1 (en) * | 2009-10-21 | 2011-04-21 | Robert Bishop | Method and Apparatus for Detecting Small Reflectivity Variations in Electronic Parts at High Speed |
US8581847B2 (en) | 2009-10-29 | 2013-11-12 | Pixart Imaging Inc. | Hybrid pointing device |
US8648836B2 (en) | 2010-04-30 | 2014-02-11 | Pixart Imaging Inc. | Hybrid pointing device |
US8760403B2 (en) | 2010-04-30 | 2014-06-24 | Pixart Imaging Inc. | Hybrid human-interface device |
US8175452B1 (en) * | 2010-10-26 | 2012-05-08 | Complete Genomics, Inc. | Method and system for imaging high density biochemical arrays with sub-pixel alignment |
US8891872B2 (en) | 2011-12-16 | 2014-11-18 | General Electric Company | System and method for identifying physical markings on objects |
US9235499B2 (en) | 2011-12-16 | 2016-01-12 | General Electric Company | System and method for identifying a character-of-interest |
US9378546B2 (en) * | 2012-01-12 | 2016-06-28 | Hewlett-Packard Indigo B.V. | Image defect visibility predictor |
TWI435092B (en) * | 2012-02-17 | 2014-04-21 | Wistron Corp | Method and system for detecting circuit |
US9488823B2 (en) | 2012-06-07 | 2016-11-08 | Complete Genomics, Inc. | Techniques for scanned illumination |
US9628676B2 (en) | 2012-06-07 | 2017-04-18 | Complete Genomics, Inc. | Imaging systems with movable scan mirrors |
JP6187811B2 (en) * | 2013-09-09 | 2017-08-30 | Sony Corporation | Image processing apparatus, image processing method, and program |
US9465995B2 (en) | 2013-10-23 | 2016-10-11 | Gracenote, Inc. | Identifying video content via color-based fingerprint matching |
US9558547B2 (en) | 2014-01-09 | 2017-01-31 | The Boeing Company | System and method for determining whether an apparatus or an assembly process is acceptable |
CN106415192B (en) * | 2014-06-06 | 2019-05-03 | Fuji Corporation | Lead image-recognizing method and identification device and image processing component data generation method and generating means |
EP3385884A1 (en) | 2017-04-04 | 2018-10-10 | Siemens Aktiengesellschaft | Method for recognising an object of a mobile unit |
JP7187377B2 (en) * | 2019-04-23 | 2022-12-12 | Hitachi, Ltd. | Object information registration apparatus and object information registration method |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2001823A (en) * | 1977-07-14 | 1979-02-07 | Nippon Jidoseigyo Ltd | Apparatus for detecting defects in patterns |
US4175860A (en) * | 1977-05-31 | 1979-11-27 | Rush-Presbyterian-St. Luke's Medical Center | Dual resolution method and apparatus for use in automated classification of pap smear and other samples |
EP0023574A1 (en) * | 1979-07-23 | 1981-02-11 | Siemens Aktiengesellschaft | Opto-electronic system for automatically testing the quality of printed-circuit boards, their intermediate products and printing tools |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS4934385A (en) * | 1972-07-28 | 1974-03-29 | ||
US3969577A (en) * | 1974-10-15 | 1976-07-13 | Westinghouse Electric Corporation | System for evaluating similar objects |
JPS51112236A (en) * | 1975-03-28 | 1976-10-04 | Hitachi Ltd | Shape position recognizer unit |
US3999129A (en) * | 1975-04-16 | 1976-12-21 | Rolm Corporation | Method and apparatus for error reduction in digital information transmission systems |
JPS5839357B2 (en) * | 1976-01-26 | 1983-08-29 | Hitachi, Ltd. | Pattern position detection method |
US4107648A (en) * | 1976-04-12 | 1978-08-15 | Bell Telephone Laboratories, Incorporated | Scan encoding of two dimensional pictorial entities |
US4345312A (en) * | 1979-04-13 | 1982-08-17 | Hitachi, Ltd. | Method and device for inspecting the defect of a pattern represented on an article |
JPS5738941A (en) * | 1980-08-18 | 1982-03-03 | Iseki Agricult Mach | Change-over device for finishing rice extracting flow path in rice huller |
JPS57147780A (en) * | 1981-03-09 | 1982-09-11 | Nippon Telegr & Teleph Corp <Ntt> | Extracting system for feature of binary picture |
US4490847A (en) * | 1981-11-27 | 1984-12-25 | National Research Development Corporation | Recognition apparatus |
1983
- 1983-03-21 US US06/477,264 patent/US4589140A/en not_active Expired - Fee Related

1984
- 1984-03-19 JP JP59501385A patent/JPS60501429A/en active Granted
- 1984-03-19 EP EP84901532A patent/EP0145725B1/en not_active Expired - Lifetime
- 1984-03-19 WO PCT/US1984/000427 patent/WO1984003784A1/en active IP Right Grant
- 1984-03-19 DE DE84901532T patent/DE3486226D1/en not_active Expired - Lifetime
- 1984-03-20 CA CA000450013A patent/CA1234908A/en not_active Expired
Also Published As
Publication number | Publication date |
---|---|
EP0145725A1 (en) | 1985-06-26 |
DE3486226D1 (en) | 1993-11-11 |
CA1234908A (en) | 1988-04-05 |
JPS60501429A (en) | 1985-08-29 |
JPH0452992B2 (en) | 1992-08-25 |
EP0145725B1 (en) | 1993-10-06 |
WO1984003784A1 (en) | 1984-09-27 |
US4589140A (en) | 1986-05-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US4589140A (en) | Method of and apparatus for real-time high-speed inspection of objects for identifying or recognizing known and unknown portions thereof, including defects and the like | |
US7796807B2 (en) | Optical inspection apparatus for substrate defect detection | |
US4589141A (en) | Apparatus for automatically inspecting printed labels | |
Ejiri et al. | A process for detecting defects in complicated patterns | |
US4969198A (en) | System for automatic inspection of periodic patterns | |
US6246472B1 (en) | Pattern inspecting system and pattern inspecting method | |
US5027417A (en) | Method of and apparatus for inspecting conductive pattern on printed board | |
EP0493657B1 (en) | Method and apparatus for identifying manufacturing defects in solid state devices | |
US4771468A (en) | System for automatic inspection of periodic patterns | |
EP0195161A2 (en) | Apparatus for automatically inspecting objects and identifying or recognizing known and unknown portions thereof, including defects and the like and method | |
CN101105459A (en) | Empty bottle mouth defect inspection method and device | |
JPS63503332A (en) | Inspection equipment | |
EP0183565A1 (en) | Method and apparatus for checking articles against a standard | |
JP2006113073A (en) | System and method for pattern defect inspection | |
JP3752849B2 (en) | Pattern defect inspection apparatus and pattern defect inspection method | |
JP2710527B2 (en) | Inspection equipment for periodic patterns | |
JPH10208066A (en) | Method for extracting edge line of check object and appearance check method using this method | |
KR910007348B1 (en) | Machine vision process and apparatus for reading a plurality of separated figures | |
Chin et al. | Automatic visual inspection of printed circuit boards | |
Chin | Automated visual inspection algorithms | |
Sischka et al. | Detection of defects on the surface of microelectronic structures | |
Besl et al. | Automatic visual inspection of solder joints | |
JPH0783614A (en) | Distance image processing method | |
Pan | A Simulated shape recognition system using feature extraction | |
JPS59206990A (en) | Pattern extracting device |
Legal Events
Code | Title | Description |
---|---|---|
PUAI | Public reference made under Article 153(3) EPC to a published international application that has entered the European phase | Free format text: ORIGINAL CODE: 0009012 |
17P | Request for examination filed | Effective date: 19850322 |
AK | Designated contracting states | Designated state(s): BE CH DE FR GB LI NL SE |
A4 | Supplementary search report drawn up and despatched | Effective date: 19880627 |
17Q | First examination report despatched | Effective date: 19890911 |
RAP1 | Party data changed (applicant data changed or rights of an application transferred) | Owner name: BELTRONICS, INC. |
GRAA | (expected) grant | Free format text: ORIGINAL CODE: 0009210 |
RAP1 | Party data changed (applicant data changed or rights of an application transferred) | Owner name: NIKON CORPORATION |
AK | Designated contracting states | Kind code of ref document: B1; Designated state(s): BE CH DE FR GB LI NL SE |
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to EPO] | Ref country codes: SE, NL, LI, FR, DE, CH, BE; Effective date: 19931006 |
REF | Corresponds to: | Ref document number: 3486226; Country of ref document: DE; Date of ref document: 19931111 |
REG | Reference to a national code | Ref country code: CH; Ref legal event code: PL |
EN | Fr: translation not filed | |
NLV1 | Nl: lapsed or annulled due to failure to fulfill the requirements of art. 29p and 29m of the patents act | |
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to EPO] | Ref country code: GB; Effective date: 19940319 |
PLBE | No opposition filed within time limit | Free format text: ORIGINAL CODE: 0009261 |
STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
26N | No opposition filed | |
GBPC | Gb: european patent ceased through non-payment of renewal fee | Effective date: 19940319 |