CA2456163C - Method and arrangement in a measuring system - Google Patents

Method and arrangement in a measuring system

Info

Publication number
CA2456163C
CA2456163C CA002456163A CA2456163A
Authority
CA
Canada
Prior art keywords
light
image
designed
rows
digital representation
Prior art date
Legal status
Expired - Lifetime
Application number
CA002456163A
Other languages
French (fr)
Other versions
CA2456163A1 (en)
Inventor
Anders Astroem
Erik Astrand
Current Assignee
Sick IVP AB
Original Assignee
Sick IVP AB
Priority date
Filing date
Publication date
Family has litigation
First worldwide family litigation filed: https://patents.darts-ip.com/?family=20285521&utm_source=google_patent&utm_medium=platform_link&utm_campaign=public_patent_search&patent=CA2456163(C) "Global patent litigation dataset" by Darts-ip is licensed under a Creative Commons Attribution 4.0 International License.
Application filed by Sick IVP AB
Publication of CA2456163A1
Application granted
Publication of CA2456163C
Anticipated expiration
Legal status: Expired - Lifetime (current)

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N33/00 Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00
    • G01N33/46 Wood
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/30 Measuring arrangements characterised by the use of optical techniques for measuring roughness or irregularity of surfaces
    • G01B11/303 Measuring arrangements characterised by the use of optical techniques for measuring roughness or irregularity of surfaces using photoelectric detection means
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/89 Investigating the presence of flaws or contamination in moving material, e.g. running paper or textiles
    • G01N21/892 Investigating the presence of flaws or contamination in moving material, e.g. running paper or textiles characterised by the flaw, defect or object feature examined
    • G01N21/898 Irregularities in textured or patterned surfaces, e.g. textiles, wood
    • G01N21/8986 Wood

Abstract

The present invention relates to a method and an arrangement for representing the characteristics of an object (3) by means of a measuring system, in which either the measuring system or the object (3) is designed to move in relation to one another in a predefined direction of movement, the object (3) preferably being designed to move in relation to the measuring system. At least one light source (2) is designed to illuminate the object (3) with a light which is incident upon the object (3) and has a limited dispersion in the direction of movement. An imaging sensor (1), which is arranged on the same side of the object (3) as the light source (2), is designed to pick up light reflected from the object (3) and to convert this into electrical charges. An image-processing unit is furthermore designed to create a digital representation (5) of the object (3) from said electrical charges. The light source (2) is arranged at a predetermined distance from the imaging sensor (1) viewed in the direction of movement, and the image-processing unit is designed to simultaneously read out information on the geometric profile of the object and information on the light dispersion in a predetermined area around said profile.

Description

METHOD AND ARRANGEMENT IN A MEASURING SYSTEM

TECHNICAL FIELD

The present invention relates generally to a method and an arrangement for imaging the characteristics of an object and relates in particular to a method and an arrangement for imaging the characteristics of an object by means of a measuring system, in which the measuring system and/or the object are moved in relation to one another in a predefined direction of movement, the object preferably being moved in relation to the measuring system. The object is illuminated by means of incident light, which has limited dispersion in the direction of movement, and light reflected from the object is detected by means of an imaging sensor arranged on the same side of the object as the incident light, the imaging sensor converting the detected light into electrical charges, according to which a digital representation of the characteristics of the object is created.

DESCRIPTION OF THE PRIOR ART
D1: US 3 976 384
D2: SE 501 650
D3: Astrand Erik, Automatic Inspection of Sawn Wood, doctoral thesis, University of Linköping, 1996
D4: Wendt P, Coyle E, Gallagher N, Stack Filters, IEEE Trans. ASSP-34, 1986

An advantageous method of detecting defects in wood is already known in the art, in which the surface of the wood is illuminated by a light source, for example a laser, and the dispersion of the light in the surface layer of the wood is measured.
That is to say, the light penetrating the material is registered and, after dispersion, re-emerges from the material at a different location from that at which it entered. How this occurs depends on the internal characteristics of the material, which can in this way be measured. The greater part of the incident light, however, is reflected at the surface and is termed "scattered light". A point light source [D1] or alternatively a linear light source [D2] may be used for this purpose. The detector may comprise discrete light-sensitive elements, but in an advantageous embodiment a linear light source is used together with a two-dimensional image-processing sensor [D2].
It is particularly advantageous if the image-processing sensor has the facility for defining various windows, that is to say limiting the part of the image-processing sensor that is read out for further processing.

Also known is the possibility of measuring the shape of an object, that is to say the cross-sectional geometric profile thereof, by illuminating it with a light source and then detecting the position of the representation of the reflected light on a sensor, which observes the object from a given angle, so-called triangulation. This will be referred to hereinafter as profile measurement. It is likewise known to combine light dispersion measurement and profile measurement in one image by illuminating the wood surface with more than one light source [D2], one for light dispersion and one for profile measurement.

In the known methods of measuring light dispersion, the direction of illumination from the light source and the direction of observation of the image-processing sensor lie substantially in the same plane. This means that the representation of both the reflected and the dispersed light always ends up in the same position on the image-processing sensor regardless of the geometric profile of the piece of timber. This means that only a small part of the image surface needs to be read out and the measurement can thereby be performed at high frequency.
In measuring the profile, on the other hand, the representation of the reflected light and of the dispersed light will quite naturally end up in different positions depending on dimensions. It is necessary here to compromise on the size of the image window and the angle of the light source in order to obtain different measuring ranges and accuracies. The greatest limitations here are the fact that large image windows give large quantities of data to be read out from the image-processing sensor for further processing, and that a large data processing capacity is required in order to perform calculations on this large quantity of image data.

When inspecting wood it is desirable to combine detection of light dispersion and geometric profile. Owing to the limitations outlined above, however, it has in practice not been possible, using known methods, to obtain a measuring frequency adequate for the simultaneous measurement of light dispersion and profile.
Different light sources have therefore been used for these two measurements and one problem which then occurs is that these characteristics are measured at different locations at any given instant. Data from one measurement must therefore be corrected in order to spatially match the measurement from the other, and this correction can never be made one hundred percent. Furthermore, one obvious disadvantage is that a plurality of different light sources entails a higher system cost.

SUMMARY OF THE INVENTION

An object of the present invention is to provide an improved method for simultaneously acquiring information on the geometric profile of an object and on the light dispersion in a predetermined area around said profile by means of a measuring system.

Another object is to provide an improved arrangement for simultaneously reading out the geometric profile information of an object and the light dispersion information in a predetermined area around the said profile by means of a measuring system.

According to one embodiment of the present invention said objects have been achieved by a method and an arrangement according to the characterising parts of claim 1 and claim 9 respectively.

BRIEF DESCRIPTION OF THE DRAWINGS

The invention will now be described in more detail with examples of embodiments and with reference to the drawings attached, in which:

Fig. 1 shows a perspective view of an inventive measuring system;

Fig. 2 shows the image of the light source reflection on the object registered in the imaging sensor;

Fig. 3 illustrates how the sensor image is compressed;

Fig. 4 illustrates an embodiment of a decoding vector used in order to reconstruct the original image;

Fig. 5 shows the intensity distribution in a column of the summation image;

Fig. 6 shows how the intensity distribution is used in order to obtain the light dispersion information;

Fig. 7 shows an embodiment for generating a summation image and a decoding vector;

Fig. 8 shows an alternative embodiment for generating a summation image and a decoding vector.

DETAILED DESCRIPTION OF EMBODIMENTS

The invention in question relates to a method for rapidly measuring light dispersion and/or geometric profile by means of one and the same light source. In practical terms this is a method for reducing the quantity of data on or in proximity to the actual image-processing sensor in order to thereby obtain a high measuring frequency, given a limited bandwidth to a subsequent computer unit. All essential information regarding the light dispersion and/or geometric profile can then be reconstructed from the reduced set.
The invention will now be explained with reference to the figures below.
Figure 1 shows a typical set-up with a camera 1 containing an imaging sensor, a linear light source 2, for example a laser, and an object 3, the characteristics of which are to be represented. In Figure 1 the line on the object 3 where the light is incident is denoted by 4. Light sources other than linear ones are also feasible. Figure 2 shows the image 5 registered by the camera, in which the representation of the laser line 4 is illustrated by the line 6. Suppose now that we form a total image by adding up, in columns, a number of rows of the image, for example every tenth row, as illustrated in Figure 3. In the resulting total image 7, row 1 will thereby represent the sum of rows {1, 11, 21} etc., and row 2 the sum of rows {2, 12, 22} etc.

In the following, representation of the light source relates to the representation on the imaging sensor of the light reflected on the object and dispersed in the object.
While the summation is being formed, a check is kept, for each column, on the row in which the representation of the light source first became visible.
This can be done, for example, by continuously comparing the total with a threshold value. If the total after adding a further row has passed the threshold value for a certain column, a note is made of the position in which this occurred. This can be done, for example, by saving the result of the threshold operation in a bit field 8. The bit field 8 contains as many bits as the number of rows added up for each row in the total image. If, for example, the first total reaches the threshold after row 31, that is to say after the third summation, bit 3 is entered in the register. If the next total reaches the threshold in row 22, that is to say after the second addition, bit 2 is entered, and so on. The result when all summations are completed is not only the total image but also a vector 9 with one bit field for each column, which can be used in order to calculate where in the original sensor image the representation of the light source was first generated. This is shown in more detail in Figure 4. It should be noted, however, that this is only one of several possible ways of registering the position when the sum reached a certain level. The invention in no way depends on precisely how this is done.
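By way of illustration, the summation image and the per-column bit fields might be formed in software roughly as follows. This is a minimal sketch in Python under assumed conventions: the array layout, zero-based indexing, parameter names and the exact bookkeeping of the threshold test are illustrative choices, not anything prescribed by the description above.

```python
import numpy as np

def compress_sensor_image(image, stride=10, threshold=1000):
    """Form a column-wise summation image and a per-column bit field.

    image     : 2-D sensor frame of shape (rows, columns); the number of rows
                is assumed to be a multiple of `stride` for simplicity.
    stride    : row spacing of the summation, e.g. 10, so that row i of the
                summation image is the sum of original rows i, i+stride, ...
    threshold : level the running total must pass for the representation of
                the light source to count as visible in a column.

    Returns (summation, bitfield): `summation` has `stride` rows, and
    bitfield[k, c] is True if the running total of column c first passed the
    threshold while a row from partial window k (original rows k*stride to
    (k+1)*stride - 1) was being added.
    """
    rows, cols = image.shape
    n_windows = rows // stride
    summation = np.zeros((stride, cols), dtype=np.int64)
    bitfield = np.zeros((n_windows, cols), dtype=bool)

    # Rows are taken in the order a sensor with random row-order readout
    # would deliver them (i, i + stride, i + 2*stride, ...), so the threshold
    # test can be run while the summation is being formed.
    for i in range(stride):
        totals = np.zeros(cols, dtype=np.int64)
        passed = np.zeros(cols, dtype=bool)
        for k in range(n_windows):
            totals += image[i + k * stride]
            newly = ~passed & (totals > threshold)
            bitfield[k, newly] = True      # note which partial window triggered
            passed |= newly
        summation[i] = totals
    return summation, bitfield
```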

It should be mentioned that, as an alternative to summation, it is also possible to use a max operation in which the greatest value in each column is retained. This actually gives a less noise-sensitive result but can, on the other hand, be more expensive to implement. It depends, therefore, on the embodiment. As further alternatives, other so-called stack filter operations [D4] are also conceivable.
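For illustration, the max-operation alternative could, under the same assumed conventions as the previous sketch, replace the running sums with a per-column maximum over each partial window:

```python
import numpy as np

def compress_sensor_image_max(image, stride=10, threshold=100):
    """Variant of the compression using a per-column max instead of a sum."""
    rows, cols = image.shape
    # folded[k, i, c] = original pixel at row k*stride + i, column c
    folded = image.reshape(rows // stride, stride, cols)
    max_image = folded.max(axis=0)               # stride x cols "max image"
    bitfield = folded.max(axis=1) > threshold    # one flag per partial window and column
    return max_image, bitfield
```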
The method when recreating data in the computer unit proceeds from the vector with bit field 9 according to the above, which gives a rough estimate of the position of the line. The bit field can be seen as giving the position of the partial window 10 in the original image which is represented by the summation image. Only those parts of the original image that contain the laser line 4 make a significant contribution. If the line lies at the boundary between two partial windows, both corresponding bits in the bit field will be set to one, as illustrated in Figure 4.
From the summation image it is then possible to detect precisely which sensor row in the total image the representation of the light source was located in. If the representation of the light source is assigned a magnitude and shape that extends over a plurality of sensor rows, it is also possible, by analysing the intensity distribution 13 in a given column 12, to detect the position of the line with sub-pixel accuracy 13. Since the imaging sensor in practice comprises discrete image points, this analysis is undertaken on the basis of a series of discrete values, as illustrated in Figure 5. Determining the position of the line with great accuracy in this way is well known, see [D3], for example, even if in the known methods this calculation is performed directly on the original image. In our case we perform the calculation on the summation image, but by combining this with information from the bit field 9 we can reconstruct precisely where in the original image the line was located.
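A sketch of this reconstruction for a single column is given below, using the conventions of the earlier compression sketch. The intensity-weighted centroid is only one possible sub-pixel estimator (the description points to [D3] but does not prescribe one), and the case where the line straddles a window boundary is ignored for brevity.

```python
import numpy as np

def line_row_in_original(summation_col, bitfield_col, stride=10):
    """Recover, for one column, the line's row position in the original image.

    summation_col : one column of the summation image (length = stride).
    bitfield_col  : boolean flags, one per partial window, for the same column.
    """
    total = summation_col.sum()
    hits = np.flatnonzero(bitfield_col)
    if total == 0 or hits.size == 0:
        return None                                   # no line found in this column
    offsets = np.arange(len(summation_col))
    # Sub-pixel position within the partial window (intensity-weighted centroid).
    subpixel = float((offsets * summation_col).sum()) / float(total)
    window = int(hits[0])                             # partial window containing the line
    return window * stride + subpixel                 # row in the original sensor image
```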

In the same way it is also possible to measure the light dispersion by studying the shape of the representation of the light source over a number of sensor rows.
In a material which disperses light in the surface layer, the representation of the light source will appear wider than in a material with no light dispersion.
Let us assume that the detected intensity distribution has a shape like that illustrated in Figure 5. A measure of the light dispersion can thereby be obtained, for example, by directly studying the intensity in the edge areas (A in Figure 6), or alternatively by comparing the outer areas with the middle area (B in Figure 6), or the total intensity (A+B). One possible way of measuring the edge intensity is to proceed from the position 13 previously worked out, which may therefore lie between two sensor rows. Then, moving a predetermined distance in both directions, the edge intensities at the positions 14 are calculated, for example by interpolation.
Other measured values, which vary in different ways as a function of the form of the intensity distribution, are also possible, however, and the invention in no way depends on precisely how this is done.
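One such measure might be computed per column roughly as follows; this is a sketch under assumptions, where the offset, the linear interpolation and the edge-to-centre ratio are illustrative choices rather than values fixed by the description.

```python
import numpy as np

def dispersion_measure(summation_col, subpixel_pos, offset=3.0):
    """One possible light-dispersion measure for a single column.

    summation_col : one column of the summation image.
    subpixel_pos  : within-window line position (the position 13 above),
                    in units of summation-image rows.
    offset        : distance, in rows, at which the edge intensities are
                    sampled on either side of the line (the positions 14).
    """
    rows = np.arange(len(summation_col))
    centre = np.interp(subpixel_pos, rows, summation_col)
    edges = np.interp([subpixel_pos - offset, subpixel_pos + offset],
                      rows, summation_col)
    edge_intensity = float(edges.sum())          # corresponds to "A" in Figure 6
    if centre == 0:
        return 0.0
    return edge_intensity / float(centre)        # a wider profile gives a larger value
```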

The formation of the summation image and the detection of the position of the line can be performed in a number of different ways. One alternative is to use a conventional image-processing sensor in combination with a computer unit, for example a digital signal processor. If the image-processing sensor has the facility for reading out the sensor rows in random order, the total image and the bit field vector can advantageously be formed by electronic circuits according to Figure 7, in which a summator 15 adds the content of the various lines, which are buffered in a line register 16 whilst a threshold circuit 17 is used for detecting the approximate position of the line. Figure 7 is here somewhat simplified in the sense that the threshold circuit 17 ensures that only when the total exceeds the threshold for the first time is a one obtained in the result vector 9. In an advantageous embodiment an image-processing sensor having a plurality of parallel outputs is used, for example a Photobit PB1024* in which the circuits 18 in Figure 7 are repeated with a set-up for each output as illustrated in Figure 8. As an alternative to summation it is also possible here to use a max-operation.
In a further advantageous embodiment an image-processing sensor is used which has integrated circuits for parallel processing of image data in columns, for example MAPP2200* and MAPP2500*. These circuits also afford the facility for forming the column sums by analog summation of data from different sensor rows. The method can thereby be performed at very high speed.

Only single-sided measurement using one light source or camera has been demonstrated above. In practice the timber will often be measured from more than one side using a measuring set-up for each side. These can either be displaced in relation to one another, so that they measure in various positions in the timber feed direction, or they can be located in the same position. In the latter case it will suitably be ensured that the planes from the light sources coincide. Otherwise, if the timber has an irregular shape, it is possible to get interference from the light sources of the adjacent measuring units. If the light planes on either side coincide, the light sources may advantageously be placed so that a single surface is illuminated by more than one light source. For example, it is possible to turn the light sources in the plane so that they illuminate the timber from an angle of 45 degrees. This not only gives more even illumination but also greater security, since illumination is still available if one light source should fail. Neither is there anything, in the case of unilateral illumination, to prevent the use of multiple light sources from different directions within the plane in order to achieve more even illumination and increased reliability.

* Trade-marks

In the description above it is specified that the light source is linear. An alternative embodiment involves replacing the line with a series of points in one or more rows.
It is likewise stated in the description that measurement is performed on a piece of timber. The invention obviously works just as well in measuring the geometric profile of and/or the light dispersion in an object of some other shape or of a material other than wood. Examples of such materials are fibrous materials such as cellulose and paper. The invention must thereby be regarded as being limited only by the scope of the patent claims below.

Claims (16)

1. A method for imaging characteristics of an object by means of a measuring system comprising the steps of:
moving the measuring system and/or the object in relation to one another in a predefined direction of movement;
illuminating the object by means of incident light, which has limited extension in the direction of movement;
detecting light reflected from the object by means of an imaging sensor arranged on the same side of the object as the incident light;
converting by an image-processing sensor the detected light into electrical charges creating a digital representation of the object, wherein the incident light is arranged to strike the object at a predetermined distance from the imaging sensor viewed in the direction of movement, and wherein information on a geometric profile of the object and information on light dispersion in a predetermined area around the said profile is simultaneously read out from the digital representation.
2. A method according to Claim 1, wherein the digital representation is divided up into rows and columns and a compressed image is created from the digital representation by reducing the number of rows.
3. A method according to Claim 2, wherein the number of rows is reduced by summation of the rows of the digital representation in columns in a predetermined order.
4. A method according to Claim 3, wherein the summation is performed by analog means.
5. A method according to Claim 3, wherein the summation is performed by digital means.
6. A method according to Claim 3, wherein, in the summation by columns, information on the row at which the electrical charge exceeds a predetermined threshold value, indicating that reflected light is detected just in that row, is saved for each column.
7. A method according to Claim 2, wherein the compressed image is created by saving for each column the maximum value for the pre-selected rows.
8. A method according to Claim 1, wherein in addition to information on the geometric profile of the object and the light dispersion, information on the intensity distribution is also read out from the digital representation.
9. A device for representing characteristics of an object by means of a measuring system, in which either the measuring system or the object is designed to move in relation to one another in a predefined direction of movement, which arrangement comprises at least one light source designed to illuminate the object with a light which is incident upon the object and has a limited extension in the direction of movement, the arrangement further comprising an imaging sensor, which is arranged on the same side of the object as the light source and is designed to pick up light reflected from the object and to convert this into electrical charges, an image-processing unit being designed to create a digital representation of the object from said electrical charges, wherein the light source is arranged at a predetermined distance from the imaging sensor viewed in the direction of movement, and that the image-processing unit is designed to simultaneously read out information on a geometric profile of the object and information on light dispersion in a predetermined area around said profile.
10. A device according to Claim 9, wherein the digital representation is divided into rows and columns and that the image-processing unit is designed to create a compressed image from the digital representation by reducing the number of rows.
11. A device according to Claim 10, wherein the image-processing unit is designed to reduce the number of rows by summation of the rows of the digital representation in columns in a predetermined order.
12. A device according to Claim 11, wherein the image-processing unit is designed, in the summation by columns, to save for each column information on the row at which the electrical charge exceeds a predetermined threshold value, indicating that reflected light is detected in that row.
13. A device according to Claim 9, wherein the incident light is linear.
14. A device according to Claim 9, wherein the incident light consists of a plurality of points or linear segments.
15. A device according to Claim 10, wherein the image-processing unit is designed to create the compressed image by saving for each column the maximum value for the pre-selected rows.
16. A device according to Claim 9, wherein in addition to information on the geometric profile of the object and the light dispersion, the image-processing unit is also designed to read out information on the intensity distribution from the digital representation.
CA002456163A 2001-10-02 2002-10-01 Method and arrangement in a measuring system Expired - Lifetime CA2456163C (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
SE0103279-6 2001-10-02
SE0103279A SE0103279L (en) 2001-10-02 2001-10-02 Procedure for measuring light scattering and geometric profile
PCT/SE2002/001791 WO2003042631A1 (en) 2001-10-02 2002-10-01 Method and arrangement in a measuring system

Publications (2)

Publication Number Publication Date
CA2456163A1 CA2456163A1 (en) 2003-05-22
CA2456163C true CA2456163C (en) 2009-08-04

Family

ID=20285521

Family Applications (1)

Application Number Title Priority Date Filing Date
CA002456163A Expired - Lifetime CA2456163C (en) 2001-10-02 2002-10-01 Method and arrangement in a measuring system

Country Status (11)

Country Link
US (1) US8923599B2 (en)
EP (1) EP1432961B2 (en)
JP (1) JP2005524828A (en)
CN (1) CN100397036C (en)
AT (1) ATE347683T1 (en)
CA (1) CA2456163C (en)
DE (1) DE60216623T3 (en)
DK (1) DK1432961T4 (en)
ES (1) ES2274125T5 (en)
SE (1) SE0103279L (en)
WO (1) WO2003042631A1 (en)

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
SE526617C2 (en) * 2003-10-01 2005-10-18 Sick Ivp Ab System and method for mapping the properties of an object
US8111904B2 (en) 2005-10-07 2012-02-07 Cognex Technology And Investment Corp. Methods and apparatus for practical 3D vision system
US8131008B2 (en) * 2007-01-31 2012-03-06 Building Component Verification Systems, Inc. Methods, apparatuses, and systems for image-based measurement and inspection of pre-engineered structural components
DK1985969T3 (en) * 2007-04-26 2017-12-04 Sick Ivp Ab Method and apparatus for determining the amount of scattered light in a machine vision system
US8126260B2 (en) * 2007-05-29 2012-02-28 Cognex Corporation System and method for locating a three-dimensional object using machine vision
US9734419B1 (en) 2008-12-30 2017-08-15 Cognex Corporation System and method for validating camera calibration in a vision system
US9533418B2 (en) 2009-05-29 2017-01-03 Cognex Corporation Methods and apparatus for practical 3D vision system
US9393694B2 (en) 2010-05-14 2016-07-19 Cognex Corporation System and method for robust calibration between a machine vision system and a robot
US8670029B2 (en) * 2010-06-16 2014-03-11 Microsoft Corporation Depth camera illuminator with superluminescent light-emitting diode
US9124873B2 (en) 2010-12-08 2015-09-01 Cognex Corporation System and method for finding correspondence between cameras in a three-dimensional vision system
EE05696B1 (en) 2011-01-20 2013-12-16 Visiometric Oü A technical solution and method for improving the performance of line-based scanners
EP2985992A1 (en) 2014-08-13 2016-02-17 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Apparatus and method for providing an image
DE102016106994A1 (en) 2016-04-15 2017-10-19 ATB Blank GmbH Laser scanning device for optical quality assessment and measurement of objects in transverse transport
US10274311B2 (en) 2016-10-19 2019-04-30 Columbia Insurance Company Three dimensional laser measurement device for quality control measurements
US10445893B2 (en) * 2017-03-10 2019-10-15 Microsoft Technology Licensing, Llc Dot-based time of flight
CN110220479A (en) * 2019-07-16 2019-09-10 深圳数马电子技术有限公司 The method and system of contactless shaped as key bit study
DE102021116495A1 (en) 2021-06-25 2022-12-29 Ford Global Technologies, Llc Method and device for checking a connection during a laser-based connection process

Family Cites Families (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5034431B1 (en) * 1969-03-01 1975-11-08
US4209852A (en) * 1974-11-11 1980-06-24 Hyatt Gilbert P Signal processing and memory arrangement
GB1488841A (en) * 1974-01-18 1977-10-12 Plessey Co Ltd Optical detection apparatus
US4188544A (en) * 1977-08-22 1980-02-12 Weyerhaeuser Company Method and apparatus for automatically processing a workpiece employing calibrated scanning
US4168489A (en) * 1978-02-13 1979-09-18 Lexitron Corp. Full page mode system for certain word processing devices
IT1204492B (en) * 1986-03-21 1989-03-01 Cremona Lorenzo SYSTEM FOR THE DETECTION AND ELIMINATION OF DEFECTS PRESENT IN MANUFACTURING WORK, IN PARTICULAR WOOD PANELS WITH CRACKS AND FOLDS THAT MUST BE STUCKED
US4826299A (en) * 1987-01-30 1989-05-02 Canadian Patents And Development Limited Linear deiverging lens
SE466420B (en) 1989-11-14 1992-02-10 Svenska Traeforskningsinst PROCEDURE AND DEVICE FOR THE DETECTION OF BARK AND DETERMINATION OF BARKING RATE BY WOOD OR TIP
JP2523222B2 (en) 1989-12-08 1996-08-07 ゼロックス コーポレーション Image reduction / enlargement method and apparatus
US5233191A (en) * 1990-04-02 1993-08-03 Hitachi, Ltd. Method and apparatus of inspecting foreign matters during mass production start-up and mass production line in semiconductor production process
US5327254A (en) * 1992-02-19 1994-07-05 Daher Mohammad A Method and apparatus for compressing and decompressing image data
JP3231383B2 (en) 1992-03-18 2001-11-19 キヤノン株式会社 Rod-shaped ultrasonic motor and motor holding and fixing device
GB2274181B (en) * 1993-01-09 1997-04-02 Digital Equipment Int Summation unit
US5347311A (en) * 1993-05-28 1994-09-13 Intel Corporation Method and apparatus for unevenly encoding error images
SE9400849L (en) 1994-03-08 1995-04-03 Soliton Elektronik Ab Device and method for detecting defects in wood
NZ270892A (en) * 1994-08-24 1997-01-29 Us Natural Resources Detecting lumber defects utilizing optical pattern recognition algorithm
US5831748A (en) * 1994-12-19 1998-11-03 Minolta Co., Ltd. Image processor
US5644392A (en) * 1995-09-12 1997-07-01 U.S. Natural Resources, Inc. Scanning system for lumber
JPH09190372A (en) 1995-09-29 1997-07-22 Sony Corp Information management device and method
US20020014533A1 (en) * 1995-12-18 2002-02-07 Xiaxun Zhu Automated object dimensioning system employing contour tracing, vertice detection, and forner point detection and reduction methods on 2-d range data maps
US6382515B1 (en) * 1995-12-18 2002-05-07 Metrologic Instruments, Inc. Automated system and method for identifying and measuring packages transported through a laser scanning tunnel
US6064747A (en) * 1997-05-13 2000-05-16 Axxess Technologies, Inc. Method and apparatus for using light to identify a key
US6037579A (en) * 1997-11-13 2000-03-14 Biophotonics Information Laboratories, Ltd. Optical interferometer employing multiple detectors to detect spatially distorted wavefront in imaging of scattering media
US6094269A (en) * 1997-12-31 2000-07-25 Metroptic Technologies, Ltd. Apparatus and method for optically measuring an object surface contour
US6097849A (en) * 1998-08-10 2000-08-01 The United States Of America As Represented By The Secretary Of The Navy Automated image enhancement for laser line scan data
US6934420B1 (en) * 1999-12-22 2005-08-23 Trident Systems Incorporated Wave image compression
CA2335784A1 (en) 2000-02-14 2001-08-14 Marcel Lizotte Wood differentiating system
US7344082B2 (en) * 2002-01-02 2008-03-18 Metrologic Instruments, Inc. Automated method of and system for dimensioning objects over a conveyor belt structure by applying contouring tracing, vertice detection, corner point detection, and corner point reduction methods to two-dimensional range data maps of the space above the conveyor belt captured by an amplitude modulated laser scanning beam
US7508961B2 (en) * 2003-03-12 2009-03-24 Eastman Kodak Company Method and system for face detection in digital images
US7282180B2 (en) * 2003-07-02 2007-10-16 Immunivest Corporation Devices and methods to image objects
US7587064B2 (en) * 2004-02-03 2009-09-08 Hrl Laboratories, Llc Active learning system for object fingerprinting
DK1985969T3 (en) 2007-04-26 2017-12-04 Sick Ivp Ab Method and apparatus for determining the amount of scattered light in a machine vision system
JP5503990B2 (en) 2010-02-02 2014-05-28 ローム株式会社 Phase-locked loop circuit and electronic device using the same

Also Published As

Publication number Publication date
US8923599B2 (en) 2014-12-30
EP1432961A1 (en) 2004-06-30
JP2005524828A (en) 2005-08-18
CA2456163A1 (en) 2003-05-22
CN100397036C (en) 2008-06-25
ES2274125T5 (en) 2016-05-23
DE60216623T3 (en) 2016-07-21
SE0103279D0 (en) 2001-10-02
EP1432961B1 (en) 2006-12-06
DK1432961T4 (en) 2016-05-30
CN1555480A (en) 2004-12-15
DK1432961T3 (en) 2007-03-05
DE60216623T2 (en) 2007-09-27
ATE347683T1 (en) 2006-12-15
EP1432961B2 (en) 2016-03-02
SE0103279L (en) 2003-04-03
DE60216623D1 (en) 2007-01-18
ES2274125T3 (en) 2007-05-16
US20040234118A1 (en) 2004-11-25
WO2003042631A1 (en) 2003-05-22

Similar Documents

Publication Publication Date Title
CA2456163C (en) Method and arrangement in a measuring system
EP1985969B1 (en) Method and apparatus for determining the amount of scattered light in a maschine vision system
US5986745A (en) Co-planar electromagnetic profile scanner
US4162126A (en) Surface detect test apparatus
CA2131919C (en) Improved log scanning
US6111601A (en) Non-contacting laser gauge for qualifying screw fasteners and the like
EP0724773B1 (en) Grid array inspection system
CN108801164B (en) Method and system for testing gap value of workpiece based on laser
JP2009139248A (en) Defect detecting optical system and surface defect inspecting device for mounting defect detecting image processing
SE501650C2 (en) Device and method for detecting defects in wood
EP0871008A2 (en) Device for measuring the dimensions of an object that is very extensive longitudinally and whose cross section has a curved contour
EP1194734B1 (en) Method and device for log measurement
US7257248B2 (en) Non-contact measurement system and method
CN112712554B (en) Method for extracting central line of laser stripe on surface of semitransparent Lambert surface
Astrand et al. A single chip multi-function sensor system for wood inspection
Zhao et al. Preliminary study on measurement of coarse surface roughness by computer vision
Zhao Inspecting wood surface roughness using computer vision
JPH06138165A (en) Pattern inspection method and device

Legal Events

Date Code Title Description
EEER Examination request
MKEX Expiry

Effective date: 20221003