CA1265869A - Method and system for high-speed, 3-d imaging of an object at a vision station

Method and system for high-speed, 3-d imaging of an object at a vision station

Info

Publication number
CA1265869A
CA1265869A (application CA000537961A)
Authority
CA
Canada
Prior art keywords
electrical signals
signal
light
split
imaging
Prior art date
Legal status
Expired - Fee Related
Application number
CA000537961A
Other languages
French (fr)
Inventor
Donald J. Svetkoff
David N. Smith
Brian Doss
Current Assignee
View Engineering Inc
Original Assignee
SYNTHETIC VISION SYSTEMS Inc
Priority date
Filing date
Publication date
Application filed by SYNTHETIC VISION SYSTEMS Inc
Application granted
Publication of CA1265869A

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures

Abstract

METHOD AND SYSTEM FOR HIGH-SPEED, 3-D
IMAGING OF AN OBJECT AT A VISION STATION
ABSTRACT
A method and system for high-speed, 3-D imaging of an object at a vision station, including a flying spot laser scanner, a dynamically configurable spatial filter and a diffuser for spatial averaging, used in conjunction with a variable transmission filter in an optical depth sensing system. The reflected laser signal received by the sensing system is first spatially filtered and averaged and then split into first and second beams which are imaged onto a pair of highly sensitive photodetectors, introducing capability for high-speed, 3-D sensing under low light level conditions. The first beam passes through a variable transmission filter which is used to encode position which, in turn, is proportional to the height of the object. The second, or reference, split beam is provided to compensate for changes in the reflectance of the object and the power of the laser scanner. A high-speed signal processing unit which incorporates special circuitry to greatly extend dynamic range computes the ratio of the transmitted signal to the sum of the reference signal and the transmitted signal to determine height information. The signal processing unit also contains noise rejection circuitry which is activated during "off" and "on" periods of laser diode TTL modulation and includes feedback control for pulse amplitude modulation of the laser diode source if necessary to increase the dynamic range of the system.

Description

METHOD AND SYSTEM FOR HIGH-SPEED, 3-D
IMAGING OF AN OBJECT AT A VISION STATION
TECHNICAL FIELD
This invention relates to a method and apparatus for imaging an object at a vision station to develop dimensional information associated with the object and, in particular, to a method and apparatus for imaging an object at a vision station to develop such dimensional information by projecting a beam of controlled light at the object and detecting the position of the laser spot in the image plane.
BACKGROUND ART
In many machine vision applications, changes in depth can provide much of the information for inspection, especially in cases where the grey scale or intensity contrast is poor, difficult to predict or irrelevant. In fact, it has been suggested that most industrial vision applications are inherently three-dimensional and that two-dimensional problems rarely exist. SMD (surface mounted device) inspection is a good example of an application where depth detection could be very useful for determining the presence and orientation of components. Otherwise, special case engineering is usually involved to handle varying component color, texture and background.
Even with standard circuit inspection techniques, some capability for depth perception is desirable.
For example, when operators inspect with a stereo microscope, both color and depth perception capabilities are utilized.
Depth detection techniques are categorized as passive when a controlled source of radiation is not required, or active if a beam of radiant energy is involved. Passive ranging techniques avoid putting constraints on the observed objects or their environment and, consequently, have been the subject of much research in both computer vision and psychophysics. Methods based on stereo disparity, camera motion, surface reflectance, texture gradients, shadows and occlusions have been explored. These techniques often have psychophysical correlates. For example, depth perception in the human visual system is believed to be based upon these types of cues.
One disadvantage of the passive approach is the extensive computation required for construction of a depth map. Methods based on stereo disparity and camera motion are potentially very powerful but require matching of corresponding features in a sequence of images. A method for consistently establishing the correspondence has not been developed at this time for real time computer applications. Nevertheless, several ideas have emerged from studies in depth perception, including techniques for representing the properties of surfaces.
Active depth detection techniques eliminate the correspondence problem and measure the depth directly by using a beam of energy and recording the time of flight (sonar and radar applications such as shown in U.S.P.N. 4,212,534).
Depth may also be measured through displacement (triangulation and grid coding), phase shift of a laser beam compared to a reference beam (laser radar), or shadow length (directional illumination). Extensive computation for a depth map is avoided and the information processing task is reduced to extraction of three-dimensional features, representation of the surfaces and scene analysis operations. In applications where the use of intensity or color improves classification, both range and intensity data may be used.
The triangulation or structured light concept offers great potential for acquiring a dense, high-resolution (approximately 1 mil and finer) 3-D image at high data rates (10 MHz) at a relatively low cost. The triangulation concept is one of the oldest depth detection techniques in existence, but it continues to undergo new developments. On the other hand, the laser radar approach is relatively new to machine vision. While the laser radar approach has some advantages, its relatively low data rate and high cost make it somewhat unwieldy for high resolution applications; as the modulation frequency is increased to the GHz range, high resolution imaging becomes relatively difficult to implement in a cost-effective way. By contrast, the triangulation method is relatively simple and has an inherently high resolution capability.

Many of the refinements of the basic triangulation concept involve projection of single and multiple stripes (grid patterns), scanning stripe systems and flying spot scanners. One 3-D vision system utilizing structured light is described in United States Patent No. 4,105,925. The vision system described therein includes a linear array sensor which is positioned so that a line of light is visible only if a reference plane is illuminated. If an object is present, then the light beam is broken. A second line source is used to minimize shadows. As the object is scanned with a linear sensor, a binary image is produced and the presence and orientation of the object is then determined.
One commercially available 3-D vision system which produces height measurements includes a microprocessor-based, laser line sectioning system. The system is capable of producing 60 fields of 480 x,y,z coordinates each second, which corresponds to an approximately 30 KHz data rate.
Each line of data requires acquisition of an entire video field. If a one-half inch by one-half inch object is to be imaged at one mil x,y,z resolution, then the maximum speed of the object conveyed on a conveyor belt for 100% inspection of the part is approximately 30 seconds per inch.
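The throughput figures quoted above can be checked with simple arithmetic. The sketch below assumes one profile line per interlaced video frame (30 Hz); that frame-rate assumption is ours, chosen because it reproduces the quoted inspection speed, while the field and resolution numbers come from the passage.

```python
# Sanity-check the quoted throughput of the laser line sectioning system.
fields_per_second = 60
points_per_field = 480
data_rate_hz = fields_per_second * points_per_field       # 28,800 ~ "30 KHz"

# A 0.5" x 0.5" object sampled at 1 mil (0.001") pitch needs 500 profile
# lines per half inch of travel; at one line per 30 Hz frame the belt
# advances roughly an inch every 33 seconds -- the quoted ~30 s/inch.
profile_lines_per_half_inch = int(0.5 / 0.001)            # 500 lines
frames_per_second = 30                                    # assumed: 1 line/frame
seconds_per_inch = 2 * profile_lines_per_half_inch / frames_per_second
print(data_rate_hz, round(seconds_per_inch, 1))
```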
Such a single stripe system is most useful in gauging and spot checking, and the use of multiple stripes in such a system is best for highly constrained scenes which are not likely to change very often. These systems are mostly used for gauging rather than for image processing and 100% inspection. Proper sampling and acquisition of dense three-dimensional information requires that the stripes be scanned across the object and imaged with an array camera or line sensor, either of which can limit the data rate. Tradeoffs between speed, resolution and dynamic range are a necessary consequence of the use of a multiple stripe system.
One method for acquiring data in a 3-D vision system is to replace the line scan or array sensor utilized in most 3-D vision systems with a lateral effect photodiode as illustrated in United States Patent No. 4,375,921. The ambiguities which might exist in multiple stripe systems are not a problem with this technique and the measurable range variation is relatively large. This is true because the entire detector surface is available and there is no requirement to share the detector area as in a multiple stripe system.
Unfortunately, the bandwidth of most of these devices with amplification circuitry is well below one MHz. Dual detector devices (i.e. bi-cells) which have a 30 MHz bandwidth are available, but standard devices are not useful in the basic triangulation concept for imaging under low light conditions at high speed, particularly when large fields of view are examined. These devices are also very sensitive to spot shape and geometric distortions.


United States Patent Nos. 4,068,955 and 4,192,612 disclose a thickness measuring device utilizing well-known trigonometric principles to generate data giving the distance to, or the thickness of, a remote object. In such thickness gauging systems, beams of light are directed through beam-splitting mirrors to opposite surfaces of the object to be measured. By ascertaining the relative angles of incidence and reflection with respect to the object surface, suitable trigonometric rules can be applied to generate the approximate thickness of the object in question.
United States Patent No. 4,472,056 discloses a shape-detecting apparatus for detecting three-dimensional products or parts, such as soldered areas of a printed circuit board, the parts attached to the printed board and bumps in an LSI bonding process. The apparatus comprises a slit projector for projecting a bright slit line on a number of objects arranged in a line and an image forming lens for forming the bright line image. The apparatus also comprises an image scanning mechanism for scanning the bright line image formed through the image forming lens in a height direction of the object and a one-dimensional image sensing device for self-scanning the bright line image formed therein with an array of image sensing elements orthogonal to the scanning direction of the image scanning mechanism. This system is severely limited by readout time; each 3-D point requires examination at many photodetectors.
U.S. Patent No. 4,355,904 discloses a device for measuring depth using a pair of photodetectors, such as photodiodes, and a partially reflective and partially transmissive filter.
The computation of the centroid is done by an analog divider.
U.S. Patent No. 4,553,844 discloses a method and system in which a spot beam scans an object in one direction and the resulting spot image is detected through observation in a direction transverse to the one direction.
U.S. Patent No. 4,645,917 discloses a swept aperture flying spot profiler. The sensor used in the system is either a photomultiplier or an avalanche diode.
U.S. Patent No. 4,349,277 discloses a parallax method of wavelength labeling based on optical triangulation. A signal processor calculates a normalized signal that is independent of surface reflectivity and roughness variations.
U.S. Patent No. 4,634,879 discloses the use of optical triangulation for determining the profile of a surface utilizing two photomultiplier tubes in a flying spot camera system. These are arranged in a "bi-cell" configuration. As an anti-noise feature, amplitude modulation is impressed upon the laser beam and a filter network is used to filter the photomultiplier response so as to exclude response to background optical noise.

Other United States patents of more general interest include United States Patent Nos. 4,053,234; 4,06~,201; 4,160,599; 4,201,475; 4,249,244; 4,269,51~; 4,411,528; 4,525,858; 4,567,347; and 4,564,078.
DISCLOSURE OF THE INVENTION
An object of the present invention is to provide an improved method and system for high-speed, 3-D imaging of an object at a vision station wherein extremely high speed and sensitivity can be obtained by using photodetectors and relatively simple and low cost signal processing circuitry having a large dynamic range to develop dimensional information associated with the object.
Another object of the present invention is to provide a method and system for imaging an object at a vision station which overcomes many of the limitations of the prior art methods and systems by optically preprocessing the reflected light signal with a set of optical components which improve the quality of the data collected.
Yet still another object of the present invention is to provide a method and system for imaging an object at a vision station to develop high resolution, dimensional information associated with the object in a relatively inexpensive and compact fashion, and which system can be interfaced with standard, video-rate apparatus.


In carrying out the above objects and other objects of the present invention, a method is provided for the high-speed, 3-D imaging of an object at a vision station to develop dimensional information associated with the object. The method comprises the steps of scanning a beam of controlled light at the surface of the object at a first predetermined angle to generate a corresponding reflected light signal, receiving the reflected light signal at a second angle with a set of optical components, spatially filtering and smoothing the received signal and optically splitting the received light signal into first and second split beams, the second split beam being a reference beam. The method further comprises the steps of measuring the amount of radiant energy in the first split beam and the reference beam and producing corresponding first and second electrical signals proportional to the measurements, normalizing the first and second electrical signals to lie within a predetermined range, and computing a centroid value for the first split beam from the normalized signals.
Further in carrying out the above objects and other objects of the present invention, an imaging system for the high-speed, 3-D imaging of an object at a vision station to develop dimensional information associated with the object is provided. The system comprises a source for scanning a beam of controlled light at the surface of the object at a first predetermined angle to generate a corresponding reflected light signal and a set of optical components for receiving the reflected light signal at a second angle, for spatially filtering and smoothing the received signal and for optically splitting the received light signal into first and second split beams, the second split beam being a reference beam. The system further comprises first and second measuring means for measuring the amount of radiant energy in the transmitted portion of the first split beam and in the reference beam, respectively, and for producing first and second electrical signals proportional to the measurements, respectively. Signal processing means normalizes the first and second electrical signals to lie within a predetermined range and computes a centroid value for the first split beam from the normalized signals.
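The final steps of the claimed processing chain (measure the two beam energies, normalize, compute a centroid value) reduce to a few lines. The function name, the guard threshold and the use of plain floats below are illustrative assumptions, not details from the patent:

```python
def centroid_value(v1, v2, v_min=1e-6):
    """Illustrative sketch of the claimed computation: v1 is the
    electrical signal from the first (filtered) split beam, v2 the
    reference-beam signal.  The normalized centroid value
    Z = v1/(v1 + v2) lies in [0, 1] and tracks spot position, hence
    object height.  v_min is an assumed guard against photon-limited
    (near-zero) returns."""
    total = v1 + v2
    if total < v_min:
        return None          # no reliable height for this sample
    return v1 / total
```

Equal energies in the two channels give Z = 0.5, the mid-range position; the reference channel cancels overall brightness.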
Preferably, the method described above includes the step of imaging the first and second light beams to first and second adjacent positions, respectively, prior to the step of measuring.
Also, in one construction of the imaging system, the source is preferably a laser scanner, the set of optical components preferably includes components for optical filtering and position detection, and the first and second measuring means each includes a highly sensitive photodetector for converting the radiant energy in its respective split beam into an electrical current.


The advantages accruing to the method and system described above are numerous. For example, such an imaging system can be incorporated into an inspection/gauging product wherein both range and intensity data are acquired. Inspection of stationary large objects at high resolution can be performed utilizing a line scan configuration.
Also, such a method and system provide high resolution, video rate, full 3-D imaging at a relatively low cost. Such a method and system also provide imaging at low light levels with high-speed circuitry to accommodate a large dynamic range.
The present invention overcomes many of the problems of the prior art by (1) spatial smoothing to reduce erroneous readings within a position-sensitive filter, including a cylindrical element for spot/line conversion; (2) including a programmable mask for rejecting multiply scattered light; (3) a high-speed signal processor with wide dynamic range, with feedback to modulate the laser source if necessary; (4) use of detectors which allow shot noise performance to be achieved; and (5) modulation of the laser source for reduction of noise bandwidth.
Also, a feedback arrangement is incorporated herein which could be used to extend dynamic range by first acquiring a line of intensity data which is buffered and used to modulate laser amplitude on the subsequent line.
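A minimal sketch of that line-to-line feedback, with an assumed nominal power, target intensity and clamp limits (none of these values appear in the patent):

```python
def next_line_power(prev_intensity, p_nominal=1.0, i_target=0.5,
                    p_min=0.1, p_max=2.0):
    """Illustrative feedback: the buffered intensity samples from the
    previous scan line set the laser amplitude for the next line --
    dark regions get more power, bright regions less, extending the
    effective dynamic range.  All names and parameters are assumptions."""
    powers = []
    for i in prev_intensity:
        p = p_nominal * i_target / i if i > 0 else p_max
        powers.append(min(p_max, max(p_min, p)))  # clamp to laser limits
    return powers
```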


Spatial registration is maintained through use of two laser diodes which are precisely offset.
Preferably, TTL modulation is used herein and allows for separate signal processing operations to be performed during the "on" and "off" intervals.
The objects, features and advantages of the present invention are readily apparent from the following detailed description of the best mode for practicing the invention when taken in connection with the accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
FIGURE 1 is a schematic view illustrating the basic triangulation or structured light concept;
FIGURE 2 is an illustration of two mathematical formulas interrelating various variables illustrated in FIGURE 1;
FIGURE 3 is a schematic view of a set of optical components, detector assembly and signal processing circuitry for use in the method and system of the present invention;
FIGURE 4 is a signal processing block diagram illustrating the method and system of the present invention;
FIGURE 5 is a detailed schematic diagram of the signal processing circuitry of FIGURE 3;
and FIGURE 6 is a detailed block diagram of the noise suppression circuit of FIGURE 5.

BEST MODE FOR CARRYING OUT THE INVENTION
Referring now to FIGURE 1, there are illustrated the major components of a basic 3-D imaging system, collectively indicated at 10. The system 10 is positioned at a vision station and includes a controlled source of light, such as a laser and collimating lens assembly 12, and a scanner and beam shaping and focusing optics 22 for projecting a series of laser beams 14 at the reflective surface 16 of an object, generally indicated at 18. The object 18 is supported on a reference, planar surface 20 at the vision station.
Referring now to FIGURE 2 in combination with FIGURE 1, there is also illustrated the basic triangulation or structured light concept, including formulas for interrelating the various variables depicted in FIGURE 2. Basically, the height Z of the object 18 is computed from the projection angle, θp, and the deflection angle, θd, at which the reflected beam is incident upon detector element 28 of a detector 24.
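The FIGURE 2 formulas themselves are not reproduced in this text. For orientation only, one common form of the triangulation relation is sketched below; the baseline parameter and the convention of measuring both angles from the surface normal are assumptions, not the patent's own geometry:

```python
import math

def triangulation_height(baseline, theta_p, theta_d):
    """One common form of the triangulation relation, shown only for
    orientation -- the patent's FIGURE 2 formulas are not reproduced
    in this text.  theta_p is the projection angle and theta_d the
    deflection angle, both measured from the surface normal; baseline
    separates the projection and detection axes."""
    return baseline / (math.tan(theta_p) + math.tan(theta_d))

# Symmetric 45-degree geometry: the spot sits half a baseline away.
z = triangulation_height(2.0, math.pi / 4, math.pi / 4)
```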
The prior art typically shows a plurality of detector elements which may comprise a linear array or array sensor or, if a single detector element is provided, may comprise a lateral effect photodiode or bi-cell (dual photodiodes). Dual photodiodes have been used as position sensing devices and are capable of measuring very small depth changes over a limited height range.

Although these systems can obtain quite accurate z measurements with lateral effect photodiodes (LEPs), the internal resistance of the diode (which provides the depth sensing and centroid computation capability of the LEP through attenuation of signal currents) also introduces a noise current which is high enough to become the dominant source of noise in the imaging system.
This includes the noise in the dark current, which is quite high, together with the Johnson noise created by the resistance. For example, a good quality, commercially available LEP will produce a noise current which is about an order of magnitude higher than a silicon photodiode of the same area. LEPs are also very slow when compared to photomultipliers or avalanche detectors; PM tubes and APDs are orders of magnitude more sensitive to light. The resistance and the capacitance of the LEP device further introduce bandwidth limitations. The preferred depth sensing technique described herein derives the position of the light spot centroid through attenuation of light intensity and can be considered the electro-optic counterpart of a lateral effect photodiode, but high-speed detectors optimized for sensitivity may also be used.
The sensitivity of the basic triangulation concept depends upon the baseline-to-height ratio and decreases for steep illumination and viewing angles. The sensitivity also depends upon the spacing between the detector element 28 and the effective focal length of the focusing lens 26 of the detector 24. Increasing the distance of the detector element 28 from the lens 26 increases the sensitivity.
Scanning methodologies described as "synchronous scanning", "telecentric scanning" or "descanning" allow for considerable relaxation of this tradeoff. Using such geometries allows high resolution depth measurements at steep angles.
These scanning geometries can also be used with the position sensing and signal processing system described herein without other modifications. The performance of the system of the invention can be further enhanced with such geometries.
In the configuration illustrated in FIGURE 1, the effects of shadows and occlusions (i.e. areas invisible to the detector) are a concern and can limit the achievable range resolution. For high precision gauging of relatively flat surfaces, such as solder paste volume, ink thickness, surface flatness, etc., an illumination angle of 30 degrees will typically give the most precise results and occlusion effects will usually be insignificant. For applications where the size and shape of the objects to be imaged have a relatively large variation, such as found on many types of circuit boards, illumination angles of 10 to 15 degrees will be desirable. Missing point problems can be reduced with multiple detectors or with autosynchronized geometries to view within 10 degrees.


The laser 12 and scanner 22 of the present invention preferably define a flying spot laser scanner. The laser 12 is coupled to a modulator 13 to shift the information to a higher frequency where system noise characteristics are better. The modulator 13 may perform one of many types of modulation, including sine wave, pulse amplitude, pulse position, etc. Preferably, the laser 12 is a solid state laser diode and is "shuttered" with a TTL signal (i.e. TTL modulation). In this way, the laser signal is encoded so as to allow separate signal processing functions to be performed during "on" and "off" intervals, as described in detail hereinbelow.
Typically, power levels are 20-30 mW (Class III-B), which are well suited for machine vision applications.
Three types of scanners 22 may be utilized: spinning polygon (x-scan) and galvanometer (y-scan) mirrors; linearized resonant scanner (x-scan) and galvanometer (y-scan); or acousto-optic cell (x-scan) and galvanometer (y-scan). Preferably, the scanner 22 comprises the latter acousto-optical system because no moving parts are required and the retrace time is very fast. Any of the scanners can be used in conventional triangulation-based sensing systems having a relatively large baseline-to-height ratio or in narrow angle triangulation systems which utilize telecentric or auto-synchronized scanning methods.



Another convenient method for inspecting large areas is linear scanning. In this method the object or projection system is translated and the galvanometers (y-scan) are not needed.
Referring now to FIGURE 3, there is generally indicated at 30 an optical system for use in the imaging system of the present invention. The optical system 30 includes a set of optical components, including an objective lens 31 to collect scattered light from the object and a second, diffraction limited lens 32 to focus the collected light onto an intermediate image plane.
The lenses 31 and 32 are conventional. However, each of the lenses 31 and 32 operates at a preferred conjugate. The second lens 32 can be interchanged to accommodate various reduction and magnification ratios.
The system 30 also includes a mask 33 which, in one embodiment, forms a rectangular aperture 34 (i.e. spatial filter) positioned at the intermediate image plane to reject background noise (i.e. stray light) which arises from secondary reflections from objects outside of the desired instantaneous field of view of the system. The mask 33 may be a fixed aperture or electromechanical shutter or, preferably, is a liquid crystal, binary, spatial light modulator which is dynamically reconfigured under software control. Such a configuration is useful for inspection of very shiny objects (reflowed solder, wire bonds, loops, pin grids, etc.) which are in close proximity and from which multiple reflections will be created. When used with auto-synchronized scanners or in a telecentric scanner (rotating mirrors have moving mechanical parts), the mask 33 is a narrow strip which allows for collection of only the light which is useful for z measurement.
If desired, the spatial filter or strip can be programmed in a chosen pattern of opaque and transmissive regions correlated with the height profile of the object to be detected. For example, a height measurement of shiny pins placed on a shiny background will be more reliable if only a narrow strip corresponding to the height range of properly positioned pins is viewed. Multiple reflections may produce a signal return which is significantly larger than the return produced by useful light. If properly placed, the position of the pin will be reported. If defective, no pin will be found.
When a conventional triangulation based scanner is used (i.e. a solid state device having no moving parts but an area detector), the aperture 34 of the mask 33 is no larger than necessary for detection of a specified height range, but is still programmable.
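The programmable strip described above amounts to a one-dimensional binary pattern: transmissive over the image rows that correspond to the expected height range, opaque elsewhere. The row-index parameterization and names in this sketch are illustrative assumptions:

```python
def strip_mask(n_rows, expected_row, half_width):
    """Illustrative binary pattern for the programmable mask 33:
    1 (transmissive) only over the rows covering the expected height
    range of a properly positioned feature, 0 (opaque) elsewhere,
    so returns from multiple reflections outside that band are
    rejected before detection."""
    lo = max(0, expected_row - half_width)
    hi = min(n_rows, expected_row + half_width + 1)
    return [1 if lo <= r < hi else 0 for r in range(n_rows)]
```

A spatial light modulator under software control could be rewritten with such a pattern per inspection site; a defective (missing or mispositioned) pin then simply produces no return inside the open band.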
A fine grained ground glass diffuser 35 of the system 30 is located adjacent the intermediate image plane to create a relatively uniform and relatively broad spot of light.
Fluctuations in the measured position of the spot as a result of directional scattering, or from local variations in a variable filter 36 of the system 30, are spatially averaged and therefore minimized. This is analogous to low pass filtering in electronic systems.
The variable filter 36, which is utilized as a position dependent transmission device, produces a measurement of spot centroid location and is relatively insensitive to focus and the shape of the intensity distribution.
The variable filter 36 can be fabricated with a transmission function which is linear with position, or as a filter which has a linear density characteristic (i.e. logarithmic transmission with position). The nonlinear computation approach has a property which allows for compression/expansion of the depth sensitivity throughout the range. In particular, the filter 36 can be utilized in such a way as to allow small height changes to be sensed near the baseline (the most distant z coordinate) without compromising the depth sensitivity on taller objects. Since the primary use of the filter 36 is as a variable density device, this nonlinear computation is accomplished with a relatively standard type of filter. On the other hand, if linearity of the z measurement is of importance, the linear transmission function should be used.
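The two filter characteristics can be sketched as transmission-versus-position functions. The unit normalization and the maximum optical density value below are illustrative assumptions, not values from the patent:

```python
def transmission_linear(x, length):
    """Linear transmission with position: T rises from 0 to 1 across
    the filter, giving a z measurement linear in spot position."""
    return x / length

def transmission_log_density(x, length, d_max=2.0):
    """Linear *density* with position (logarithmic transmission):
    optical density D = d_max * x / length, so T = 10**(-D).  This
    compresses/expands depth sensitivity across the range, as the
    text describes.  d_max is an assumed maximum density."""
    return 10.0 ** (-d_max * x / length)
```

With d_max = 2, transmission spans 100% down to 1% across the filter, so small position changes near the dense end produce proportionally larger fractional signal changes.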
The system 30 further includes a second lens system, generally indicated at 37, which is used to reduce or magnify the intermediate image and transmit (relay) the image through the system 30. A cylindrical lens 38 allows for additional spatial averaging over the surface of the variable filter by converting the spot of light into a line (i.e., changing the aspect ratio).
A beamsplitter 39 is used to produce a reference split beam or signal and another split beam which is transmitted to the filter 36. The first split beam is transmitted to the filter 36 in a first channel 40 and the second, or reference, split beam is transmitted in a second channel 41.
The second channel 41 provides an intensity reference to normalize the data and eliminate the dependence of the height measurement on brightness.
If desired, the system 30 can also be fabricated in such a way as to allow both the reference beam and the transmitted beam to be produced by a linear variable filter with a metallic coating on its front surface to produce spatial reflection, thereby allowing for splitting and position dependent attenuation with a single optical component.
If the beamsplitter transmission/reflection ratio is precisely known and is of a constant value, position is determined by simply dividing (ratio detection) the voltages obtained on the two channels; that is to say, Z = V1/V2. Otherwise, the position is found by Z = V1/(V1 + V2) (this latter computation is more robust).
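The two formulas behave quite differently at the extremes, which is why the second is called more robust. A quick numeric sketch (names are ours):

```python
def z_simple(v1, v2):
    """Z = V1/V2: usable when the beamsplitter ratio is precisely
    known and constant, but unbounded as the reference V2 shrinks."""
    return v1 / v2

def z_robust(v1, v2):
    """Z = V1/(V1 + V2): always in [0, 1] for non-negative signals,
    whatever the overall return level."""
    return v1 / (v1 + v2)

# Both forms cancel overall brightness of the return ...
assert z_robust(2.0, 6.0) == z_robust(0.25, 0.75) == 0.25
# ... but only the normalized form stays bounded when V2 is weak.
assert z_robust(1.0, 1e-9) < 1.0 and z_simple(1.0, 1e-9) > 1e8
```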


A folding mirror system, including mirrors 42 and 43, is used to deliver the light beams to a localized area of a detector assembly, generally indicated at 44. The connection between the photodetectors 45 and 46 and their associated pre-amplifiers 47 and 48 should be as short as possible to minimize stray capacitance for high-speed applications and to avoid mismatches between the signal channels. Constant deviation prisms can also be used in place of the mirrors 42 and 43 to simplify system alignment. The short wire lengths are necessary so that the low level signals are not corrupted by noise.
The laser light signal transmitted by the filter 36 and the light signal reflected by the mirror 43 are imaged by conventional field lenses 49 and 50, respectively, onto consistent, predetermined areas on a pair of photodetectors 45 and 46 of the assembly 44. The assembly 44 also includes pre-amplifiers 47 and 48, respectively, for the photodetectors. Each of the photodetectors is preferably a small area photodiode (i.e. no larger than 3 mm x 3 mm) having low capacitance and very high shunt resistance, a photomultiplier, an avalanche photodiode or an intensified detector in a detector element and pre-amplifier combination. Such a photodiode preferably has at least a 300 MHz cutoff frequency, corresponding to rise times of 1 nanosecond or less. The high speed, low noise pre-amplifier part of the combination operates at video rates.


Since a single sensitive detector is used for each channel 40 or 41, sensitivity of the assembly 44 is extremely high and the noise is very low when compared to LEP's. Also, since the photodetector/amplifier gain bandwidth product is very high, large signals may be obtained for relatively small changes in signal levels.
An "optical bi-cell" can be formed with a slight modification of the arrangement in FIGURE 3. This is done by eliminating the variable filter 36 and introducing a slight position offset of the photodetectors 45 and 46.
The bi-cell detector is useful for obtaining good height resolution when only a narrow range must be sensed (e.g. traces on a circuit board, flatness detection, etc.) and is a complementary approach to the variable density filter 36 which provides high resolution over a relatively large depth of field and a direct measurement of the light spot centroid position.
The spatial averaging obtained with the diffuser 35 is required, in general, to make the bi-cell approach robust because measurement errors will be introduced for non-uniform intensity distributions which result from scattering or from geometric (angular) distortions.
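The bi-cell read-out can be sketched as follows. The patent does not spell out the formula, but the conventional bi-cell estimate is the difference-over-sum of the two offset detectors; this assumption, and the function name, are illustrative.

```python
def bicell_position(v1, v2):
    """Conventional bi-cell estimate: difference over sum.

    The result is dimensionless in [-1, 1].  It is very sensitive near
    the split line but saturates once the spot leaves the overlap
    region, which is why the text limits the bi-cell to narrow height
    ranges such as circuit-board traces or flatness checks."""
    s = v1 + v2
    if s == 0.0:
        raise ValueError("no detected light")
    return (v1 - v2) / s


assert bicell_position(2.0, 2.0) == 0.0   # spot centered on the split
assert bicell_position(4.0, 0.0) == 1.0   # spot entirely on detector 1
```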
As previously mentioned, the signal representing the height of the object to be inspected is determined from the ratio Z = V1/(V1 + V2) whereas the intensity information is given by I = V1 + V2. For determination of Z, a

commercially available analog divider can be used to determine the ratio at speeds approaching video frame rates. Such dividers, however, require that the sum V1 + V2 (the denominator) vary over only a small range (typically 3:1) if high accuracy is to be maintained. A typical scene to be imaged will contain reflectivity variations which are much greater than this range. For instance, printed circuit board components and backgrounds will produce V1 + V2 signals which vary by a few orders of magnitude, representing the approximate useful diffuse reflectivity variations from 0.5% to 100%.
In addition, specular returns will produce a much larger variation and must be identified since the resulting Z value is incorrect, whereas very low photon-limited signal returns almost result in division by zero.
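The two failure modes just described (saturating specular returns and photon-limited returns near division by zero) suggest guarding the divider with a trusted window on V1 + V2. A minimal sketch, with sentinel codes and thresholds that are illustrative rather than taken from the patent:

```python
# Hypothetical sentinel codes standing in for the "special values
# indicating incorrect height information" mentioned later in the text.
Z_SPECULAR = -1.0
Z_NO_SIGNAL = -2.0


def safe_z(v1, v2, sum_min=0.005, sum_max=1.0):
    """Divide only when V1 + V2 lies in a trusted range.

    sum_max models saturation from specular returns (Z invalid);
    sum_min models photon-limited returns (near division by zero).
    Both thresholds are illustrative assumptions."""
    s = v1 + v2
    if s >= sum_max:
        return Z_SPECULAR
    if s <= sum_min:
        return Z_NO_SIGNAL
    return v1 / s


assert safe_z(0.1, 0.3) == 0.25            # valid diffuse return
assert safe_z(0.9, 0.6) == Z_SPECULAR      # saturated specular return
assert safe_z(0.001, 0.001) == Z_NO_SIGNAL # photon-limited return
```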
Other long standoff applications may require a much larger dynamic range because of variations in standoff. In such cases, real time laser intensity modulation is necessary.
As discussed above, the dynamic range of the measuring system is sufficient to accommodate diffuse reflectivity variations from a fraction of a percent to 100% (e.g., substantially greater than the full range of black to white given on standard greyscale test charts). In some applications, primarily where a large depth of range must be
sensed, it may be necessary to implement further extensions allowing the system to measure very weak returns from dark distant objects and strong

returns from bright objects in close proximity.
The vision system in such an application (i.e. robot navigation) must have a much larger dynamic range than what is required for inspection of small, quasi-flat objects like circuit boards.
The system described can be modified to accommodate extended dynamic range requirements through feedback to increase or decrease the laser power dependent upon the intensity measured from some point on the object. However, with such an arrangement a necessary decrease in the data rate (at least a factor of two) must result and, more importantly, caution must be exercised so that the reflectance measurement corresponds to the exact same physical point on the object as the Z measurement. Nevertheless, now that high power laser diodes are commercially available with source powers of up to 1 W with very fast rise times, it is feasible to implement a feedback circuit in a practical configuration. A source power of 1 W would enable the system to measure objects which return signal levels approximately 50X lower than can be measured with the low cost, 20 mW laser diode as previously described. Also, dual laser diodes of low and medium power which are precisely offset from each other are now commercially available. The primary application of such dual diodes has been in inspection of optical disks using a "read after write" technique. In many applications a medium power laser produces sufficient signal to noise and these devices can be used. The advantage of not requiring additional opto-mechanical hardware to maintain near-perfect registration between the beams is also significant.
Such a feedback arrangement can be implemented with "look ahead" capability in the sense that data from the low power laser diode is buffered and subsequently used by a feedback circuit 76 to control the modulation of the high power laser diode by means of pulse amplitude modulation. An acousto-optic light modulator instead of the modulator 13 is preferred over direct time varying amplitude laser modulation for maintaining stability of the laser source, and this device can be regarded as a high-speed, electro-optic shutter.
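The "look ahead" buffering can be sketched as a delay line: intensity readings from the low power probe laser are queued, and each reading emerges a fixed number of samples later as a pulse-amplitude command for the high power laser. The delay length, target level, and clamp limits below are illustrative assumptions, not values from the patent.

```python
from collections import deque


def lookahead_power(intensities, delay=4, target=0.5, p_min=0.1, p_max=1.0):
    """Buffer low-power-laser intensity readings and emit, `delay`
    samples later, a clamped pulse-amplitude command for the high
    power laser so its return lands near `target`."""
    buf = deque()
    commands = []
    for reading in intensities:
        buf.append(reading)
        if len(buf) > delay:
            probe = buf.popleft()
            # More light is commanded where the probe return was weak.
            p = target / max(probe, 1e-6)
            commands.append(min(p_max, max(p_min, p)))
    return commands


# A dark patch (0.05) in the probe data produces a full-power command
# four samples later; a bright patch (1.0) produces a half-power one.
assert lookahead_power([0.5, 0.05, 1.0, 0.5, 0.5, 0.5, 0.5]) == [1.0, 1.0, 0.5]
```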
Referring now to FIGURE 5, there is generally indicated at 51 signal processing circuitry or unit which expands/compresses the variable data in order to obtain the proper Z
value and also generates special values indicating incorrect height information.
The preamplifiers 47 and 48 convert the signal currents of the photodetectors 45 and 46, respectively, to corresponding voltages.
The sum V1 + V2 is then formed as an analog signal by a summing circuit 54 and then converted into digital form by a non-linear data converter 56 which operates at very high speeds. The purpose of converting the data to digital form is to provide an easy method of selecting the gain values (inverse function) required to scale V1 + V2 into the approximate 3:1 range required by an analog divider of circuitry 58. The idea is similar to that used in AGC (automatic gain control) circuits except that AGC circuits often average the signals for prohibitively long periods and feed back the gain control value, which reduces the system bandwidth. The output of the converter 56 is fed into a gain select logic circuit 57 to provide output signals without feedback. The gain values selected with the logic circuit 57 are used to "program" a series of high precision amplifier stages 60 and 62 for selecting the gain values to scale the signal V1 + V2 into the 3:1 range.
As previously mentioned, modulation of the laser source is used to shift the information to a higher frequency where system noise characteristics are better. The circuitry 51 includes noise suppression circuits, generally indicated at 64 and 66, for the first and second channels 40 and 41, respectively.
During the "on" time of the laser source 12, a first anti-aliasing filter 68 of each noise suppression circuit 64 or 66 (as shown in FIGURE 6) is applied to smooth out signal variations (high frequency noise), thereby rejecting out-of-band noise. This high frequency noise is rapidly varying compared to the known (modulated) signal. During the "off" time these rapid variations are also removed by a second anti-aliasing filter 68. A demodulation step is

performed to reject low frequency noise with sample and hold circuits 70 and 72. Because the low frequency noise is slowly varying compared to the clock rate of the system, an average value is obtained by the circuit 70 during the "off" period to provide a reference voltage which is subtracted by a circuit 74 from the "on" voltage value obtained by the circuit 72 to cancel low frequency components. Suppression of the low frequency (1/f, 60 cycle, etc.) noise is important to maintain a large dynamic range.
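A toy model of the sample-and-hold demodulation (circuits 70 and 72 plus subtractor 74): samples alternate off/on, and each "off" level is held as a reference and subtracted from the following "on" value, so any noise component that is roughly constant over one on/off cycle cancels. The alternating-sample layout is an assumption for illustration.

```python
def demodulate(samples):
    """Correlated on/off sampling.  `samples` alternates
    (off, on, off, on, ...); each 'off' reference is subtracted from
    the following 'on' value, cancelling slowly varying (1/f, mains)
    noise that is common to both phases."""
    return [on - off for off, on in zip(samples[0::2], samples[1::2])]


# A constant offset of 2 units on both phases cancels in the difference.
assert demodulate([2, 12, 2, 7]) == [10, 5]
```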
This noise suppression also has the major advantage of providing a "black level reference" for each picture element which is not possible in conventional video systems. Black reference avoids DC offset drift in the signal processing chain. As a result, the fidelity of the signal is dramatically improved. A by-product of the method is an automatic calibration feature.
The laser power is controlled with the circuit 13 in such a way that slow term drift in the light level is negligible, thereby establishing a "white reference". Since the black reference is established for each pixel, the entire light range is calibrated on a pixel by pixel basis.
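The per-pixel calibration can be summarized in one line: each pixel's own "off" sample supplies its black reference, while the stabilized laser power supplies a global white reference, mapping every raw reading onto a known 0..1 light range. The function name and interface are illustrative, not from the patent.

```python
def calibrate(on_value, off_value, white_level):
    """Map a raw 'on' reading onto a 0..1 light scale using the
    pixel's 'off' sample as black reference and the stabilized
    laser power `white_level` as white reference."""
    return (on_value - off_value) / white_level


# With the source held at white_level = 2.0, a raw 'on' reading of
# 1.5 over a black level of 0.5 corresponds to half scale.
assert calibrate(1.5, 0.5, 2.0) == 0.5
```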
Thereafter, the signal from the circuitry 51 is preferably amplified and coupled to an analog-to-digital converter 78 which, in turn, may be interfaced to a conventional video frame grabber of a larger inspection/gauging product.

With further reference to the feedback circuit 76 and the signal processing circuit 51, the intensity signal V1 + V2 associated with an object point is quantized as before by the non-linear data converter 56 of the circuitry 51 which includes additional outputs 80 for data values outside of the previous range. These digital values are buffered and, through a delay line arrangement within the feedback circuit 76, provide the data necessary to control the modulator 13 so that the appropriate power level is provided to the surface so that the modified V1 + V2 signal is in the range of the pre-scalers 60 and 62, noise suppression circuits 64 and 66, and the divider 58.
The range of the non-linear data converter 56 described herein can be extended to accommodate an arbitrarily large range within practical requirements of cost, circuit board space, speed requirements, and laser power range.
The result of this implementation is identical to the previous (3-D + greyscale system), except that the 3-D data is delayed with respect to the intensity data by a constant known offset. A
practical choice is to acquire alternating lines of intensity and depth information.
The above-described imaging method and system present numerous advantages. For example, imaging can be performed at high resolution and at video rates to obtain full 3-D information. Also, such a method and system offer the potential of

accurate video frame rate depth sensing at low cost.
Finally, the detection method can be applied to several 3-D imaging geometries in addition to the standard triangulation techniques illustrated in FIGURES 1 and 2. For example, it has been suggested in the research literature that shadows and occlusion can be completely avoided by using a quite simple but clever method utilizing a nearly coaxial illumination beam, light collection system and a CCD detector array. The optical system is a good one, but the detector again severely limits speed, and is incapable of responding to low light levels since the required optical system is inherently inefficient due to a mask which is largely opaque. By incorporating, with minor modifications, the system disclosed in this invention, high speed 3-D sensing at low light levels can be achieved with only a slight increase in the physical size of the optical package.
While the best mode for carrying out the invention has been described in detail, those familiar with the art to which this invention relates will recognize various alternative ways of practicing the invention as defined by the follow-ing claims.

Claims (47)

THE EMBODIMENTS OF THE INVENTION IN WHICH AN EXCLUSIVE
PROPERTY OR PRIVILEGE IS CLAIMED ARE DEFINED AS FOLLOWS:
1. A method for the high-speed, 3-D imaging of an object at a vision station to develop dimensional information associated with the object, the method comprising the steps of: scanning a beam of controlled light at the surface of the object at a first predetermined angle to generate a corresponding reflected light signal; receiving said reflected light signal at a second angle with a set of optical components; spatially filtering the received signal; spatially averaging the received light signal with the set of optical components to compensate for non-uniform intensity distributions in the received signal, said step of averaging including the step of creating a uniform spot of light; optically splitting the received light signal into first and second split beams, the second split beam being a reference beam; imaging the first and second split beams to first and second predetermined measuring areas of first and second photodetector means, respectively, to produce corresponding first and second electrical signals proportional to the measurements; normalizing the first and second electrical signals to lie within a predetermined range;
and computing a centroid value for the first split beam from the normalized signals.
2. A method for the high-speed, 3-D imaging of an object at a vision station to develop dimensional information associated with the object, the method comprising the steps of: scanning a beam of controlled light at the surface of the object at a first predetermined angle to generate a corresponding reflected light signal; receiving said reflected light signal at a second angle with a set of optical components; spatially filtering the received signal; spatially averaging the received light signal with the set of optical components to compensate for non-uniform intensity distributions in the received signal, said step of averaging including the step of creating a uniform spot of light; optically splitting the received light signal into first and second split beams; transmitting a portion of the first split beam dependent on the second angle; imaging the first and second split beams to first and second predetermined measuring areas of first and second photodetector means, respectively, to produce corresponding first and second electrical signals proportional to the measurements;
normalizing the first and second electrical signals to lie within a predetermined range; and computing a centroid value for the first split beam from the normalized signals.
3. The method as claimed in claim 1 or claim 2 wherein said step of normalizing includes the step of summing the first and second electrical signals to obtain a resultant signal wherein the resultant signal is normalized to lie within the predetermined range, and wherein the centroid is computed from the normalized sum.
4. The method as claimed in claim 1 wherein the first and second electrical signals have an analog representation and wherein said step of normalizing further includes the steps of generating a digital representation of the first and second electrical signals and utilizing the digital representation to scale the first and second electrical signals to lie within the predetermined range.
5. The method as claimed in claim 4 further comprising the steps of modulating the beam of controlled light to shift the frequency of the controlled light to a predetermined band of frequencies and demodulating the first and second electrical signals.
6. The method as claimed in claim 5 wherein the step of demodulating includes the step of removing noise from the first and second electrical signals with a filter having a transmission band including the predetermined band of frequencies.
7. The method as claimed in claim 1 or claim 2 wherein said step of spatially filtering utilizes a programmable mask correlated with a height profile of the object at the vision station.
8. The method as claimed in claim 1 wherein said step of spatially averaging includes the step of converting the spot of light into a line of light.
9. The method as claimed in claim 8 wherein the step of spatially filtering the received light utilizes a spatial filter of the set of optical components.
10. The method as claimed in claim 1 or claim 2 wherein at least one of said photodetectors is a photodiode having a measuring area of less than 20 mm2.
11. A method for the high-speed, 3-D imaging of an object at a vision station to develop dimensional information associated with the object, the method comprising the steps of: scanning a beam of controlled, modulated light at the surface of the object at a first predetermined angle to generate a corresponding reflected light signal; receiving said reflected light signal at a second angle with a set of optical components; spatially filtering the received light signal with the set of optical components;
spatially averaging the received light signal with the set of optical components to compensate for non-uniform intensity distributions in the received signal, said step of averaging including the step of creating a uniform spot of light; optically splitting the received light signal into first and second split beams, the second split beam being a reference beam; imaging the first and second split beams to first and second predetermined measuring areas of first and second photodetector means, respectively, to produce corresponding first and second electrical signals proportional to the measurements; normalizing the first and second electrical signals wherein the step of normalizing includes the step of scaling the first and second electrical signals to lie within a predetermined range; demodulating the scaled first and second electrical signals; and computing a centroid value for the first split beam from the demodulated signals.
12. An imaging system for the high-speed, 3-D imaging of an object at a vision station to develop dimensional information associated with the object, the system comprising: a source for scanning a beam of controlled light at the surface of the object at a first predetermined angle to generate a corresponding reflected light signal; a set of optical components for receiving the reflected light signal at a second angle, for spatially filtering the received signal, for spatially averaging the received light signal to compensate for non-uniform intensity distributions in the received signal, and for optically splitting the received light signal into first and second split beams, the second split beam being a reference beam said set of optical components including means for creating a uniform spot of light;
first and second photodetector means for measuring the amount of radiant energy in the first split beam and the reference beam, respectively, and producing first and second electrical signals proportional to the measurements, respectively; means for imaging the first and second split beams to first and second predetermined measuring areas of first and second photodetector means, respectively; and signal processing means for normalizing the first and second electrical signals to lie within a predetermined range and for computing a centroid value for the first split beam from the normalized signals.
13. An imaging system for high-speed, 3-D imaging of an object at a vision station to develop dimensional information associated with the object, the system comprising: a source for scanning a beam of controlled light at the surface of the object at a first predetermined angle to generate a corresponding light signal; a set of optical components for receiving the reflected light signal at a second angle, for spatially filtering the received signal, for spatially averaging the received light signal to compensate for non-uniform intensity distributions in the received signal, and for optically splitting the received light signal into first and second split beams, the set of optical components including transmitting means for transmitting a portion of the first split beam dependent on the second angle, said set of optical components including means for creating a uniform spot of light; first and second photodetector means for measuring the amount of radiant energy in the transmitted portion of the first split beam and the second split beam, respectively, and producing first and second electrical signals proportional to the measurements, respectively; means for imaging the first and second split beams to first and second predetermined measuring areas of first and second photodetector means, respectively; and signal processing means for normalizing the first and second electrical signals to lie within a predetermined range and for computing a centroid value for the first split beam from the normalized signals.
14. The system as claimed in claim 12 or claim 13 wherein the signal processing means includes a summing circuit for summing the first and second electrical signals to obtain a resultant signal wherein the resultant signal is normalized to lie within the predetermined range and wherein the centroid is computed from the normalized sum.
15. The system as claimed in claim 12 wherein the first and second electrical signals have an analog representation and wherein the signal processing means includes generating means for generating a digital representation of the first and second electrical signals.
16. The system as claimed in claim 15 wherein said signal processing means includes scaler means coupled to the generating means and utilizing the digital representation to scale the first and second electrical signals to lie within the predetermined range.
17. The system as claimed in claim 16 wherein said scaler means includes a set of programmable amplifiers for selectively amplifying the first and second electrical signals in response to the digital representation.
18. The system as claimed in claim 17 further comprising modulating means for modulating the beam of controlled light to shift the frequency of the controlled light to a predetermined band of frequencies and wherein said signal processing means includes a demodulator for demodulating the first and second digital signals.
19. The system as claimed in claim 18 wherein said demodulator includes a filter having a transmission band including the predetermined band of frequencies for removing noise from the first and second electrical signals.
20. The system as claimed in claim 12 or claim 13 wherein said set of optical components includes means for spatially filtering the received light signal including a programmable mask correlated with a height profile of the object at the vision station.
21. The system as claimed in claim 12 wherein said set of optical components includes a spot-to-line converter for converting the spot of light into a line of light.
22. The system as claimed in claim 21 wherein said means for spatially filtering includes a spatial filter for spatially filtering the received light.
23. The system as claimed in claim 12 or claim 13 wherein areas of the first and second photodiodes, respectively.
24. The system as claimed in claim 12 wherein each of said first and second photodetector means includes a single photodiode having a measuring area of less than 20 mm2 for converting the radiant energy into an electrical current.
25. The system as claimed in claim 24 wherein each of said photodiodes has a measuring area of less than 10 mm2.
26. The system as claimed in claim 12 or claim 13 wherein said set of optical components includes splitting means for optically splitting the received light signal into the first and second split beams.
27. The system as claimed in claim 12 wherein said source is a laser scanner.
28. The system as claimed in claim 27 wherein said laser scanner is a flying spot laser scanner.
29. An imaging system for the high-speed, 3-D imaging of an object at a vision station to develop dimensional information associated with the object, the system comprising: a source for scanning a beam of controlled modulated light at the surface of the object at a first predetermined angle to generate a corresponding reflected light signal; a set of optical components for receiving the reflected light signal at a second angle, the set of optical components including means for spatially filtering the received light signal, averaging means for averaging the received light signal to compensate for non-uniform intensity distributions in the received signal said averaging means including means for creating a uniform spot of light, and splitting means for optically splitting the received light signal into first and second split beams, the second split beam being a reference beam, first and second photodetector means for measuring the amount of radiant energy in the first split beam and the reference beam, respectively, and producing first and second electrical signals proportional to the measurements, respectively;
means for imaging the first and second split beams to first and second predetermined measuring areas of first and second photodetector means, respectively; and signal processing means for normalizing the first and second electrical signals, said signal processing means including scaler means for scaling the first and second electrical signals to lie within a predetermined range and a demodulator for demodulating the first and second electrical signals to reduce noise in the signals, said signal processing means computing a centroid value for the first split beam from the demodulated signals.
30. A method for the high-speed, 3-D imaging of an object at a vision station to develop dimensional information associated with the object, the method comprising the steps of: (a) scanning a beam of controlled light at the surface of the object at a first predetermined angle to generate a corresponding reflected light signal; (b) receiving said reflected light signal at a second angle with a set of optical components; (c) spatially filtering the received light signal with the set of optical components; (d) spatially averaging the received light signal with the set of optical components to compensate for non-uniform intensity distributions in the received signal, said step of averaging including the step of creating a uniform spot of light; (e) optically splitting the received light signal into first and second split beams, the second split beam being a reference beam;
(f) imaging the first and second split beams to first and second predetermined measuring areas of first and second photodetector means, respectively, to produce a corresponding pair of electrical signals proportional to the measurements; generating a feedback signal dependent on the pair of electrical signals; utilizing the feedback signal to control the modulation of a source of controlled light; utilizing a beam of the controlled light in steps (a) through (f); normalizing the second pair of electrical signals wherein the step of normalizing includes the step of scaling the second pair of electrical signals to lie within a predetermined range; demodulating the scaled second pair of electrical signals; and computing a centroid value for the first split beam of the second pair of electrical signals from the demodulated signals.
31. An imaging system for the high-speed, 3-D imaging of an object at a vision station to develop dimensional information associated with the object, the system comprising: a source including first and second sources for sequentially scanning first and second beams of controlled modulated light respectively, at the surface of the object at a first predetermined angle to generate corresponding reflected light signals; a set of optical components for receiving the reflected light signals at a second angle, the set of optical components including means for spatially filtering the received light signal; means for spatially averaging the received light signal to compensate for non-uniform intensity distributions in the received signal, said means for averaging including means for creating a uniform spot of light and splitting means for optically splitting each of the received light signals into first and second pairs of split beams, the second split beam being a reference beam; first and second photodetector means for measuring the amount of radiant energy in the first split beams and the reference beams, respectively, and producing first and second pairs of signals proportional to the measurements, respectively, the first pair of electrical signals corresponding to the first pair of split beams and the second pair of electrical signals corresponding to the second pair of split beams; means for imaging the first and second split beams to first and second predetermined measuring areas of the first and second photodetector means, respectively, to produce corresponding first and second electrical signals proportional to the measurements;
feedback means for generating a feedback signal dependent on the first pair of electrical signals, the feedback means controlling the modulation of the second source to improve noise suppression and extension of the dynamic range of the system; and signal processing means for normalizing the second pair of electrical signals, said signal processing means including scaler means for scaling the second pair of electrical signals to lie within a predetermined range and a demodulator for demodulating the second pair of electrical signals to reduce noise in the signals, said signal processing means computing a centroid value for the first split beam of the second pair of split beams from the demodulated signal.
32. The method as claimed in claim 2 wherein the first and second electrical signals have an analog representation and wherein said step of normalizing further includes the steps of generating a digital representation of the first and second electrical signals and utilizing the digital representation to scale the first and second electrical signals to lie within the predetermined range.
33. The method as claimed in claim 32 further comprising the steps of modulating the beam of controlled light to shift the frequency of the controlled light to a predetermined band of frequencies and demodulating the first and second electrical signals.
34. The method as claimed in claim 33 wherein the step of demodulating includes the step of removing noise from the first and second electrical signals with a filter having a transmission band including the predetermined band of frequencies.
35. The method as claimed in claim 2 wherein said step of spatially averaging includes the step of converting the spot of light into a line of light.
36. The method as claimed in claim 35 wherein the step of spatially filtering the received light utilizes a spatial filter of the set of optical components.
37. The system as claimed in claim 13 wherein the first and second electrical signals have an analog representation and wherein the signal processing means includes generating means for generating a digital representation of the first and second electrical signals.
38. The system as claimed in claim 37 wherein said signal processing means includes scaler means coupled to the generating means and utilizing the digital representation to scale the first and second electrical signals to lie within the predetermined range.
39. The system as claimed in claim 38 wherein said scaler means includes a set of programmable amplifiers for selectively amplifying the first and second electrical signals in response to the digital representation.
40. The system as claimed in claim 39 further comprising modulating means for modulating the beam of controlled light to shift the frequency of the controlled light to a predetermined band of frequencies and wherein said signal processing means includes a demodulator for demodulating the first and second digital signals.
41. The system as claimed in claim 40 wherein said demodulator includes a filter having a transmission band including the predetermined band of frequencies for removing noise from the first and second electrical signals.
42. The system as claimed in claim 13 wherein said set of optical components includes a spot-to-line converter for converting the spot of light into a line of light.
43. The system as claimed in claim 42 wherein said means for spatially filtering includes a spatial filter for spatially filtering the received light.
44. The system as claimed in claim 13 wherein each of said first and second photodetector means includes a single photodiode having a measuring area of less than 20 mm2 for converting the radiant energy into an electrical current.
45. The system as claimed in claim 44 wherein each of said photodiodes has a measuring area of less than 10 mm2.
46. The system as claimed in claim 13 wherein said source is a laser scanner.
47. The system as claimed in claim 46 wherein said laser scanner is a flying spot laser scanner.
CA000537961A 1986-05-27 1987-05-26 Method and system for high-speed, 3-d imaging of an object at a vision station Expired - Fee Related CA1265869A (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US86673586A 1986-05-27 1986-05-27
US866,735 1986-05-27
US07/052,841 US4796997A (en) 1986-05-27 1987-05-21 Method and system for high-speed, 3-D imaging of an object at a vision station
US052,841 1987-05-21

Publications (1)

Publication Number Publication Date
CA1265869A true CA1265869A (en) 1990-02-13

Family

ID=26731151

Family Applications (1)

Application Number Title Priority Date Filing Date
CA000537961A Expired - Fee Related CA1265869A (en) 1986-05-27 1987-05-26 Method and system for high-speed, 3-d imaging of an object at a vision station

Country Status (4)

Country Link
US (1) US4796997A (en)
EP (1) EP0247833B1 (en)
CA (1) CA1265869A (en)
DE (1) DE3769368D1 (en)

Families Citing this family (294)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5024529A (en) * 1988-01-29 1991-06-18 Synthetic Vision Systems, Inc. Method and system for high-speed, high-resolution, 3-D imaging of an object at a vision station
US5011960A (en) * 1988-05-20 1991-04-30 Fujitsu Limited Wiring pattern detection method and apparatus
US5200838A (en) * 1988-05-27 1993-04-06 The University Of Connecticut Lateral effect imaging system
AU598418B2 (en) * 1988-06-04 1990-06-21 Fujitsu Limited Optical system for detecting three-dimensional shape
GB2222047A (en) * 1988-07-25 1990-02-21 Unisearch Ltd Optical mapping of field of view and information storage
US5090811A (en) * 1989-05-31 1992-02-25 General Electric Company Optical radius gauge
EP0417736B1 (en) * 1989-09-12 1994-11-30 Matsushita Electric Industrial Co., Ltd. System for optically inspecting conditions of parts packaged on substrate
US5589942A (en) * 1990-04-05 1996-12-31 Intelligent Automation Systems Real time three dimensional sensing system
US8352400B2 (en) 1991-12-23 2013-01-08 Hoffberg Steven M Adaptive pattern recognition based controller apparatus and method and human-factored interface therefore
JP3263724B2 (en) * 1993-04-16 2002-03-11 日本電信電話株式会社 Shape feature extraction device using two-dimensional laser pattern
US5621807A (en) * 1993-06-21 1997-04-15 Dornier Gmbh Intelligent range image camera for object measurement
US5500737A (en) * 1993-07-21 1996-03-19 General Electric Company Method for measuring the contour of a surface
US5638301A (en) * 1994-06-02 1997-06-10 Ford Motor Company Method and system for inspecting die sets using free-form inspection techniques
US5555090A (en) * 1994-10-24 1996-09-10 Adaptive Optics Associates System for dimensioning objects
US5911035A (en) * 1995-04-12 1999-06-08 Tsao; Thomas Method and apparatus for determining binocular affine disparity and affine invariant distance between two image patterns
FR2735859B1 (en) * 1995-06-23 1997-09-05 Kreon Ind Method for acquiring and digitizing objects through a transparent wall and system for implementing such a method
US5644141A (en) * 1995-10-12 1997-07-01 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Apparatus and method for high-speed characterization of surfaces
US6028671A (en) * 1996-01-31 2000-02-22 General Scanning, Inc. Method and system for suppressing unwanted reflections in an optical system
DE19605218C1 (en) * 1996-02-13 1997-04-17 Dornier Gmbh Obstacle warning system for low-flying aircraft
GB2310557B (en) 1996-02-21 2000-05-10 Rank Taylor Hobson Ltd Image processing apparatus
US5859924A (en) * 1996-07-12 1999-01-12 Robotic Vision Systems, Inc. Method and system for measuring object features
JPH1117286A (en) * 1997-06-27 1999-01-22 Ando Electric Co Ltd Tunable laser device
US6120190A (en) * 1997-11-26 2000-09-19 Lasertron, Inc. Spatially variable bandpass filter monitoring and feedback control of laser wavelength especially in wavelength division multiplexing communication systems
US5946148A (en) * 1998-10-26 1999-08-31 The United States Of America As Represented By The Secretary Of The Air Force Variable transmission beamsplitter
US7966078B2 (en) 1999-02-01 2011-06-21 Steven Hoffberg Network media appliance system and method
DE60012672T2 (en) * 1999-12-03 2005-08-11 Matsushita Electric Industrial Co., Ltd., Kadoma Tester with laser beam
CA2301822A1 (en) * 2000-03-24 2001-09-24 9071 9410 Quebec Inc. Simultaneous projection of several patterns with simultaneous acquisition for inspection of objects in three-dimensions
EP1152261A1 (en) * 2000-04-28 2001-11-07 CSEM Centre Suisse d'Electronique et de Microtechnique SA Device and method for spatially resolved photodetection and demodulation of modulated electromagnetic waves
JP4356958B2 (en) * 2000-05-31 2009-11-04 キヤノン株式会社 Image forming apparatus and laser drive control method in the apparatus
US6502053B1 (en) * 2000-06-12 2002-12-31 Larry Hardin Combination passive and active speed detection system
US6975747B2 (en) 2001-08-14 2005-12-13 Acuity Cimatrix, Inc. Method and system for monitoring and controlling workpieces
US6836349B2 (en) * 2001-12-07 2004-12-28 Jds Uniphase Corporation Optical performance monitoring device
US6990639B2 (en) * 2002-02-07 2006-01-24 Microsoft Corporation System and process for controlling electronic components in a ubiquitous computing environment using multimodal integration
US7202965B2 (en) * 2002-07-16 2007-04-10 Stanley Korn Method of using printed forms to transmit the information necessary to create electronic forms
US7665041B2 (en) 2003-03-25 2010-02-16 Microsoft Corporation Architecture for controlling a computer using hand gestures
US8745541B2 (en) * 2003-03-25 2014-06-03 Microsoft Corporation Architecture for controlling a computer using hand gestures
CA2424441C (en) * 2003-03-31 2008-07-15 Institut National D'optique Position-sensing device for 3-d profilometers
US7472831B2 (en) * 2003-11-13 2009-01-06 Metrologic Instruments, Inc. System for detecting image light intensity reflected off an object in a digital imaging-based bar code symbol reading device
DE10354078B4 (en) * 2003-11-19 2008-09-04 Daimler Ag Clamping device for workpieces for three-dimensional optical surface measurement
US7697827B2 (en) 2005-10-17 2010-04-13 Konicek Jeffrey C User-friendlier interfaces for a camera
US7483151B2 (en) * 2006-03-17 2009-01-27 Alpineon D.O.O. Active 3D triangulation-based imaging method and device
WO2008002278A1 (en) * 2006-06-29 2008-01-03 Agency For Science, Technology And Research Shg quantification of matrix-related tissue dynamic and disease
DE102006034914A1 (en) * 2006-07-28 2008-01-31 Carl Zeiss Microimaging Gmbh Microscope i.e. laser scanning microscope, controlling method for e.g. fluorescence resonance energy application, involves carrying out adjustment of illumination lights upto maximum valve, which is determined by default for light
US7474415B2 (en) * 2006-09-13 2009-01-06 Chung Shan Institute Of Science And Technology, Armaments Bureau, M.N.D. Measurement method of three-dimensional profiles and reconstruction system thereof using subpixel localization with color gratings and picture-in-picture switching on single display
US8005238B2 (en) 2007-03-22 2011-08-23 Microsoft Corporation Robust adaptive beamforming with enhanced noise suppression
WO2008124397A1 (en) 2007-04-03 2008-10-16 David Fishbaine Inspection system and method
US8005237B2 (en) 2007-05-17 2011-08-23 Microsoft Corp. Sensor array beamformer post-processor
DE102007029274B4 (en) * 2007-06-22 2013-06-13 Automation W + R Gmbh Method and arrangement for the optical inspection of unilaterally open tunnel-like cavities in workpieces, in particular cooling channels in brake discs
US8629976B2 (en) * 2007-10-02 2014-01-14 Microsoft Corporation Methods and systems for hierarchical de-aliasing time-of-flight (TOF) systems
US20090166684A1 (en) * 2007-12-26 2009-07-02 3Dv Systems Ltd. Photogate cmos pixel for 3d cameras having reduced intra-pixel cross talk
US8194233B2 (en) 2008-04-11 2012-06-05 Microsoft Corporation Method and system to reduce stray light reflection error in time-of-flight sensor arrays
US8385557B2 (en) 2008-06-19 2013-02-26 Microsoft Corporation Multichannel acoustic echo reduction
US8325909B2 (en) 2008-06-25 2012-12-04 Microsoft Corporation Acoustic echo suppression
US8203699B2 (en) 2008-06-30 2012-06-19 Microsoft Corporation System architecture design for time-of-flight system having reduced differential pixel size, and time-of-flight systems so designed
US8681321B2 (en) * 2009-01-04 2014-03-25 Microsoft International Holdings B.V. Gated 3D camera
US8295546B2 (en) * 2009-01-30 2012-10-23 Microsoft Corporation Pose tracking pipeline
US8682028B2 (en) * 2009-01-30 2014-03-25 Microsoft Corporation Visual target tracking
US8267781B2 (en) 2009-01-30 2012-09-18 Microsoft Corporation Visual target tracking
US8487938B2 (en) * 2009-01-30 2013-07-16 Microsoft Corporation Standard Gestures
US8577084B2 (en) * 2009-01-30 2013-11-05 Microsoft Corporation Visual target tracking
US20100199228A1 (en) * 2009-01-30 2010-08-05 Microsoft Corporation Gesture Keyboarding
US8294767B2 (en) * 2009-01-30 2012-10-23 Microsoft Corporation Body scan
US8565476B2 (en) * 2009-01-30 2013-10-22 Microsoft Corporation Visual target tracking
US20100199231A1 (en) * 2009-01-30 2010-08-05 Microsoft Corporation Predictive determination
US8588465B2 (en) * 2009-01-30 2013-11-19 Microsoft Corporation Visual target tracking
US7996793B2 (en) 2009-01-30 2011-08-09 Microsoft Corporation Gesture recognizer system architecture
US8448094B2 (en) * 2009-01-30 2013-05-21 Microsoft Corporation Mapping a natural input device to a legacy system
US8565477B2 (en) * 2009-01-30 2013-10-22 Microsoft Corporation Visual target tracking
US8577085B2 (en) * 2009-01-30 2013-11-05 Microsoft Corporation Visual target tracking
US20100226114A1 (en) * 2009-03-03 2010-09-09 David Fishbaine Illumination and imaging system
US8773355B2 (en) * 2009-03-16 2014-07-08 Microsoft Corporation Adaptive cursor sizing
US9256282B2 (en) * 2009-03-20 2016-02-09 Microsoft Technology Licensing, Llc Virtual object manipulation
US8988437B2 (en) 2009-03-20 2015-03-24 Microsoft Technology Licensing, Llc Chaining animations
US9313376B1 (en) 2009-04-01 2016-04-12 Microsoft Technology Licensing, Llc Dynamic depth power equalization
US20100277470A1 (en) * 2009-05-01 2010-11-04 Microsoft Corporation Systems And Methods For Applying Model Tracking To Motion Capture
US8660303B2 (en) * 2009-05-01 2014-02-25 Microsoft Corporation Detection of body and props
US8253746B2 (en) * 2009-05-01 2012-08-28 Microsoft Corporation Determine intended motions
US9898675B2 (en) * 2009-05-01 2018-02-20 Microsoft Technology Licensing, Llc User movement tracking feedback to improve tracking
US8649554B2 (en) 2009-05-01 2014-02-11 Microsoft Corporation Method to control perspective for a camera-controlled computer
US8638985B2 (en) * 2009-05-01 2014-01-28 Microsoft Corporation Human body pose estimation
US9498718B2 (en) * 2009-05-01 2016-11-22 Microsoft Technology Licensing, Llc Altering a view perspective within a display environment
US8181123B2 (en) 2009-05-01 2012-05-15 Microsoft Corporation Managing virtual port associations to users in a gesture-based computing environment
US8503720B2 (en) 2009-05-01 2013-08-06 Microsoft Corporation Human body pose estimation
US9015638B2 (en) * 2009-05-01 2015-04-21 Microsoft Technology Licensing, Llc Binding users to a gesture based system and providing feedback to the users
US8340432B2 (en) 2009-05-01 2012-12-25 Microsoft Corporation Systems and methods for detecting a tilt angle from a depth image
US9377857B2 (en) 2009-05-01 2016-06-28 Microsoft Technology Licensing, Llc Show body position
US8942428B2 (en) * 2009-05-01 2015-01-27 Microsoft Corporation Isolate extraneous motions
US20100277711A1 (en) * 2009-05-04 2010-11-04 Capella Microsystems, Corp. Optical quantized distance measuring apparatus and method thereof
US20100295771A1 (en) * 2009-05-20 2010-11-25 Microsoft Corporation Control of display objects
US20100302138A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Methods and systems for defining or modifying a visual representation
US9182814B2 (en) 2009-05-29 2015-11-10 Microsoft Technology Licensing, Llc Systems and methods for estimating a non-visible or occluded body part
US8379101B2 (en) * 2009-05-29 2013-02-19 Microsoft Corporation Environment and/or target segmentation
US8418085B2 (en) * 2009-05-29 2013-04-09 Microsoft Corporation Gesture coach
US20100302365A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Depth Image Noise Reduction
US8509479B2 (en) * 2009-05-29 2013-08-13 Microsoft Corporation Virtual object
US9383823B2 (en) * 2009-05-29 2016-07-05 Microsoft Technology Licensing, Llc Combining gestures beyond skeletal
US8625837B2 (en) 2009-05-29 2014-01-07 Microsoft Corporation Protocol and format for communicating an image from a camera to a computing environment
US8856691B2 (en) * 2009-05-29 2014-10-07 Microsoft Corporation Gesture tool
US20100306716A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Extending standard gestures
US8744121B2 (en) 2009-05-29 2014-06-03 Microsoft Corporation Device for identifying and tracking multiple humans over time
US8542252B2 (en) * 2009-05-29 2013-09-24 Microsoft Corporation Target digitization, extraction, and tracking
US8320619B2 (en) 2009-05-29 2012-11-27 Microsoft Corporation Systems and methods for tracking a model
US8693724B2 (en) 2009-05-29 2014-04-08 Microsoft Corporation Method and system implementing user-centric gesture control
US9400559B2 (en) * 2009-05-29 2016-07-26 Microsoft Technology Licensing, Llc Gesture shortcuts
US8487871B2 (en) 2009-06-01 2013-07-16 Microsoft Corporation Virtual desktop coordinate transformation
US8390680B2 (en) * 2009-07-09 2013-03-05 Microsoft Corporation Visual representation expression based on player expression
US9159151B2 (en) * 2009-07-13 2015-10-13 Microsoft Technology Licensing, Llc Bringing a visual representation to life via learned input from the user
US20110025689A1 (en) * 2009-07-29 2011-02-03 Microsoft Corporation Auto-Generating A Visual Representation
US8264536B2 (en) * 2009-08-25 2012-09-11 Microsoft Corporation Depth-sensitive imaging via polarization-state mapping
US9141193B2 (en) * 2009-08-31 2015-09-22 Microsoft Technology Licensing, Llc Techniques for using human gestures to control gesture unaware programs
US8330134B2 (en) 2009-09-14 2012-12-11 Microsoft Corporation Optical fault monitoring
US8508919B2 (en) * 2009-09-14 2013-08-13 Microsoft Corporation Separation of electrical and optical components
US8976986B2 (en) * 2009-09-21 2015-03-10 Microsoft Technology Licensing, Llc Volume adjustment based on listener position
US8428340B2 (en) * 2009-09-21 2013-04-23 Microsoft Corporation Screen space plane identification
US8760571B2 (en) * 2009-09-21 2014-06-24 Microsoft Corporation Alignment of lens and image sensor
US9014546B2 (en) 2009-09-23 2015-04-21 Rovi Guides, Inc. Systems and methods for automatically detecting users within detection regions of media devices
US8723118B2 (en) * 2009-10-01 2014-05-13 Microsoft Corporation Imager for constructing color and depth images
US20110083108A1 (en) * 2009-10-05 2011-04-07 Microsoft Corporation Providing user interface feedback regarding cursor position on a display screen
US7961910B2 (en) 2009-10-07 2011-06-14 Microsoft Corporation Systems and methods for tracking a model
US8564534B2 (en) 2009-10-07 2013-10-22 Microsoft Corporation Human tracking system
US8867820B2 (en) 2009-10-07 2014-10-21 Microsoft Corporation Systems and methods for removing a background of an image
US8963829B2 (en) 2009-10-07 2015-02-24 Microsoft Corporation Methods and systems for determining and tracking extremities of a target
US9400548B2 (en) * 2009-10-19 2016-07-26 Microsoft Technology Licensing, Llc Gesture personalization and profile roaming
US20110099476A1 (en) * 2009-10-23 2011-04-28 Microsoft Corporation Decorating a display environment
US8988432B2 (en) * 2009-11-05 2015-03-24 Microsoft Technology Licensing, Llc Systems and methods for processing an image for target tracking
US20110109617A1 (en) * 2009-11-12 2011-05-12 Microsoft Corporation Visualizing Depth
US8843857B2 (en) 2009-11-19 2014-09-23 Microsoft Corporation Distance scalable no touch computing
US9244533B2 (en) * 2009-12-17 2016-01-26 Microsoft Technology Licensing, Llc Camera navigation for presentations
US20110150271A1 (en) 2009-12-18 2011-06-23 Microsoft Corporation Motion detection using depth images
US20110151974A1 (en) * 2009-12-18 2011-06-23 Microsoft Corporation Gesture style recognition and reward
US8320621B2 (en) 2009-12-21 2012-11-27 Microsoft Corporation Depth projector system with integrated VCSEL array
US8631355B2 (en) * 2010-01-08 2014-01-14 Microsoft Corporation Assigning gesture dictionaries
US9019201B2 (en) * 2010-01-08 2015-04-28 Microsoft Technology Licensing, Llc Evolving universal gesture sets
US9268404B2 (en) * 2010-01-08 2016-02-23 Microsoft Technology Licensing, Llc Application gesture interpretation
US8334842B2 (en) 2010-01-15 2012-12-18 Microsoft Corporation Recognizing user intent in motion capture system
US8933884B2 (en) * 2010-01-15 2015-01-13 Microsoft Corporation Tracking groups of users in motion capture system
US8676581B2 (en) * 2010-01-22 2014-03-18 Microsoft Corporation Speech recognition analysis via identification information
US8265341B2 (en) 2010-01-25 2012-09-11 Microsoft Corporation Voice-body identity correlation
US8864581B2 (en) * 2010-01-29 2014-10-21 Microsoft Corporation Visual based identitiy tracking
US8891067B2 (en) 2010-02-01 2014-11-18 Microsoft Corporation Multiple synchronized optical sources for time-of-flight range finding systems
US8687044B2 (en) * 2010-02-02 2014-04-01 Microsoft Corporation Depth camera compatibility
US8619122B2 (en) * 2010-02-02 2013-12-31 Microsoft Corporation Depth camera compatibility
US8717469B2 (en) * 2010-02-03 2014-05-06 Microsoft Corporation Fast gating photosurface
US8659658B2 (en) * 2010-02-09 2014-02-25 Microsoft Corporation Physical interaction zone for gesture-based user interfaces
US8499257B2 (en) * 2010-02-09 2013-07-30 Microsoft Corporation Handles interactions for human-computer interface
US20110199302A1 (en) * 2010-02-16 2011-08-18 Microsoft Corporation Capturing screen objects using a collision volume
US8633890B2 (en) * 2010-02-16 2014-01-21 Microsoft Corporation Gesture detection based on joint skipping
US8928579B2 (en) * 2010-02-22 2015-01-06 Andrew David Wilson Interacting with an omni-directionally projected display
US8411948B2 (en) 2010-03-05 2013-04-02 Microsoft Corporation Up-sampling binary images for segmentation
US8655069B2 (en) * 2010-03-05 2014-02-18 Microsoft Corporation Updating image segmentation following user input
US8422769B2 (en) 2010-03-05 2013-04-16 Microsoft Corporation Image segmentation using reduced foreground training data
US20110221755A1 (en) * 2010-03-12 2011-09-15 Kevin Geisner Bionic motion
US20110223995A1 (en) 2010-03-12 2011-09-15 Kevin Geisner Interacting with a computer based application
US8279418B2 (en) 2010-03-17 2012-10-02 Microsoft Corporation Raster scanning for depth detection
US8213680B2 (en) * 2010-03-19 2012-07-03 Microsoft Corporation Proxy training data for human body tracking
US8514269B2 (en) * 2010-03-26 2013-08-20 Microsoft Corporation De-aliasing depth images
US20110234481A1 (en) * 2010-03-26 2011-09-29 Sagi Katz Enhancing presentations using depth sensing cameras
US8523667B2 (en) * 2010-03-29 2013-09-03 Microsoft Corporation Parental control settings based on body dimensions
US8605763B2 (en) 2010-03-31 2013-12-10 Microsoft Corporation Temperature measurement and control for laser and light-emitting diodes
US9646340B2 (en) 2010-04-01 2017-05-09 Microsoft Technology Licensing, Llc Avatar-based virtual dressing room
US9098873B2 (en) 2010-04-01 2015-08-04 Microsoft Technology Licensing, Llc Motion-based interactive shopping environment
US8351651B2 (en) 2010-04-26 2013-01-08 Microsoft Corporation Hand-location post-process refinement in a tracking system
US8379919B2 (en) 2010-04-29 2013-02-19 Microsoft Corporation Multiple centroid condensation of probability distribution clouds
US8284847B2 (en) 2010-05-03 2012-10-09 Microsoft Corporation Detecting motion for a multifunction sensor device
US8885890B2 (en) 2010-05-07 2014-11-11 Microsoft Corporation Depth map confidence filtering
US8498481B2 (en) 2010-05-07 2013-07-30 Microsoft Corporation Image segmentation using star-convexity constraints
US8457353B2 (en) 2010-05-18 2013-06-04 Microsoft Corporation Gestures and gesture modifiers for manipulating a user-interface
US8803888B2 (en) 2010-06-02 2014-08-12 Microsoft Corporation Recognition system for sharing information
US9008355B2 (en) 2010-06-04 2015-04-14 Microsoft Technology Licensing, Llc Automatic depth camera aiming
US8751215B2 (en) 2010-06-04 2014-06-10 Microsoft Corporation Machine based sign language interpreter
US9557574B2 (en) 2010-06-08 2017-01-31 Microsoft Technology Licensing, Llc Depth illumination and detection optics
US8330822B2 (en) 2010-06-09 2012-12-11 Microsoft Corporation Thermally-tuned depth camera light source
US8749557B2 (en) 2010-06-11 2014-06-10 Microsoft Corporation Interacting with user interface via avatar
US9384329B2 (en) 2010-06-11 2016-07-05 Microsoft Technology Licensing, Llc Caloric burn determination from body movement
US8675981B2 (en) 2010-06-11 2014-03-18 Microsoft Corporation Multi-modal gender recognition including depth data
US8982151B2 (en) 2010-06-14 2015-03-17 Microsoft Technology Licensing, Llc Independently processing planes of display data
US8558873B2 (en) 2010-06-16 2013-10-15 Microsoft Corporation Use of wavefront coding to create a depth image
US8670029B2 (en) 2010-06-16 2014-03-11 Microsoft Corporation Depth camera illuminator with superluminescent light-emitting diode
US8296151B2 (en) 2010-06-18 2012-10-23 Microsoft Corporation Compound gesture-speech commands
US8381108B2 (en) 2010-06-21 2013-02-19 Microsoft Corporation Natural user input for driving interactive stories
US8416187B2 (en) 2010-06-22 2013-04-09 Microsoft Corporation Item navigation using motion-capture data
US9075434B2 (en) 2010-08-20 2015-07-07 Microsoft Technology Licensing, Llc Translating user motion into multiple object responses
US8613666B2 (en) 2010-08-31 2013-12-24 Microsoft Corporation User selection and navigation based on looped motions
US20120058824A1 (en) 2010-09-07 2012-03-08 Microsoft Corporation Scalable real-time motion recognition
US8437506B2 (en) 2010-09-07 2013-05-07 Microsoft Corporation System for fast, probabilistic skeletal tracking
US8988508B2 (en) 2010-09-24 2015-03-24 Microsoft Technology Licensing, Llc. Wide angle field of view active illumination imaging system
US8681255B2 (en) 2010-09-28 2014-03-25 Microsoft Corporation Integrated low power depth camera and projection device
US8548270B2 (en) 2010-10-04 2013-10-01 Microsoft Corporation Time-of-flight depth imaging
US9484065B2 (en) 2010-10-15 2016-11-01 Microsoft Technology Licensing, Llc Intelligent determination of replays based on event identification
US8592739B2 (en) 2010-11-02 2013-11-26 Microsoft Corporation Detection of configuration changes of an optical element in an illumination system
US8866889B2 (en) 2010-11-03 2014-10-21 Microsoft Corporation In-home depth camera calibration
US8667519B2 (en) 2010-11-12 2014-03-04 Microsoft Corporation Automatic passive and anonymous feedback system
US10726861B2 (en) 2010-11-15 2020-07-28 Microsoft Technology Licensing, Llc Semi-private communication in open environments
US9349040B2 (en) 2010-11-19 2016-05-24 Microsoft Technology Licensing, Llc Bi-modal depth-image analysis
US10234545B2 (en) 2010-12-01 2019-03-19 Microsoft Technology Licensing, Llc Light source module
US8553934B2 (en) 2010-12-08 2013-10-08 Microsoft Corporation Orienting the position of a sensor
US8618405B2 (en) 2010-12-09 2013-12-31 Microsoft Corp. Free-space gesture musical instrument digital interface (MIDI) controller
US8408706B2 (en) 2010-12-13 2013-04-02 Microsoft Corporation 3D gaze tracker
US9171264B2 (en) 2010-12-15 2015-10-27 Microsoft Technology Licensing, Llc Parallel processing machine learning decision tree training
US8920241B2 (en) 2010-12-15 2014-12-30 Microsoft Corporation Gesture controlled persistent handles for interface guides
US8884968B2 (en) 2010-12-15 2014-11-11 Microsoft Corporation Modeling an object from image data
US8448056B2 (en) 2010-12-17 2013-05-21 Microsoft Corporation Validation analysis of human target
US8803952B2 (en) 2010-12-20 2014-08-12 Microsoft Corporation Plural detector time-of-flight depth mapping
US8994718B2 (en) 2010-12-21 2015-03-31 Microsoft Technology Licensing, Llc Skeletal control of three-dimensional virtual world
US9823339B2 (en) 2010-12-21 2017-11-21 Microsoft Technology Licensing, Llc Plural anode time-of-flight sensor
US9848106B2 (en) 2010-12-21 2017-12-19 Microsoft Technology Licensing, Llc Intelligent gameplay photo capture
US8385596B2 (en) 2010-12-21 2013-02-26 Microsoft Corporation First person shooter control with virtual skeleton
US9821224B2 (en) 2010-12-21 2017-11-21 Microsoft Technology Licensing, Llc Driving simulator control with virtual skeleton
US9123316B2 (en) 2010-12-27 2015-09-01 Microsoft Technology Licensing, Llc Interactive content creation
US8488888B2 (en) 2010-12-28 2013-07-16 Microsoft Corporation Classification of posture states
US8587583B2 (en) 2011-01-31 2013-11-19 Microsoft Corporation Three-dimensional environment reconstruction
US8401242B2 (en) 2011-01-31 2013-03-19 Microsoft Corporation Real-time camera tracking using depth maps
US9247238B2 (en) 2011-01-31 2016-01-26 Microsoft Technology Licensing, Llc Reducing interference between multiple infra-red depth cameras
US8401225B2 (en) 2011-01-31 2013-03-19 Microsoft Corporation Moving object segmentation using depth images
US8724887B2 (en) 2011-02-03 2014-05-13 Microsoft Corporation Environmental modifications to mitigate environmental factors
US8942917B2 (en) 2011-02-14 2015-01-27 Microsoft Corporation Change invariant scene recognition by an agent
US8497838B2 (en) 2011-02-16 2013-07-30 Microsoft Corporation Push actuation of interface controls
US9551914B2 (en) 2011-03-07 2017-01-24 Microsoft Technology Licensing, Llc Illuminator with refractive optical element
US9067136B2 (en) 2011-03-10 2015-06-30 Microsoft Technology Licensing, Llc Push personalization of interface controls
US8571263B2 (en) 2011-03-17 2013-10-29 Microsoft Corporation Predicting joint positions
US9470778B2 (en) * 2011-03-29 2016-10-18 Microsoft Technology Licensing, Llc Learning from high quality depth measurements
US9298287B2 (en) 2011-03-31 2016-03-29 Microsoft Technology Licensing, Llc Combined activation for natural user interface systems
US9842168B2 (en) 2011-03-31 2017-12-12 Microsoft Technology Licensing, Llc Task driven user intents
US10642934B2 (en) 2011-03-31 2020-05-05 Microsoft Technology Licensing, Llc Augmented conversational understanding architecture
US9760566B2 (en) 2011-03-31 2017-09-12 Microsoft Technology Licensing, Llc Augmented conversational understanding agent to identify conversation context between two humans and taking an agent action thereof
US8503494B2 (en) 2011-04-05 2013-08-06 Microsoft Corporation Thermal management system
US8824749B2 (en) 2011-04-05 2014-09-02 Microsoft Corporation Biometric recognition
US8620113B2 (en) 2011-04-25 2013-12-31 Microsoft Corporation Laser diode modes
US8702507B2 (en) 2011-04-28 2014-04-22 Microsoft Corporation Manual and camera-based avatar control
US9259643B2 (en) 2011-04-28 2016-02-16 Microsoft Technology Licensing, Llc Control of separate computer game elements
US10671841B2 (en) 2011-05-02 2020-06-02 Microsoft Technology Licensing, Llc Attribute state classification
US8888331B2 (en) 2011-05-09 2014-11-18 Microsoft Corporation Low inductance light source module
US9064006B2 (en) 2012-08-23 2015-06-23 Microsoft Technology Licensing, Llc Translating natural language utterances to keyword search queries
US9137463B2 (en) 2011-05-12 2015-09-15 Microsoft Technology Licensing, Llc Adaptive high dynamic range camera
US8788973B2 (en) 2011-05-23 2014-07-22 Microsoft Corporation Three-dimensional gesture controlled avatar configuration interface
US8760395B2 (en) 2011-05-31 2014-06-24 Microsoft Corporation Gesture recognition techniques
US9594430B2 (en) 2011-06-01 2017-03-14 Microsoft Technology Licensing, Llc Three-dimensional foreground selection for vision system
US8526734B2 (en) 2011-06-01 2013-09-03 Microsoft Corporation Three-dimensional background removal for vision system
US8597142B2 (en) 2011-06-06 2013-12-03 Microsoft Corporation Dynamic camera based practice mode
US8897491B2 (en) 2011-06-06 2014-11-25 Microsoft Corporation System for finger recognition and tracking
US9013489B2 (en) 2011-06-06 2015-04-21 Microsoft Technology Licensing, Llc Generation of avatar reflecting player appearance
US9208571B2 (en) 2011-06-06 2015-12-08 Microsoft Technology Licensing, Llc Object digitization
US9724600B2 (en) 2011-06-06 2017-08-08 Microsoft Technology Licensing, Llc Controlling objects in a virtual environment
US9098110B2 (en) 2011-06-06 2015-08-04 Microsoft Technology Licensing, Llc Head rotation tracking from depth-based center of mass
US8929612B2 (en) 2011-06-06 2015-01-06 Microsoft Corporation System for recognizing an open or closed hand
US10796494B2 (en) 2011-06-06 2020-10-06 Microsoft Technology Licensing, Llc Adding attributes to virtual representations of real-world objects
US9597587B2 (en) 2011-06-08 2017-03-21 Microsoft Technology Licensing, Llc Locational node device
US9791466B2 (en) 2011-07-21 2017-10-17 Brooks Automation, Inc. Method and device for compensation for dimensional variations in low temperature sample group holders
US8786730B2 (en) 2011-08-18 2014-07-22 Microsoft Corporation Image exposure using exclusion regions
US9759995B2 (en) * 2011-08-18 2017-09-12 Massachusetts Institute Of Technology System and method for diffuse imaging with time-varying illumination intensity
US9557836B2 (en) 2011-11-01 2017-01-31 Microsoft Technology Licensing, Llc Depth image compression
US9117281B2 (en) 2011-11-02 2015-08-25 Microsoft Corporation Surface segmentation from RGB and depth images
US8854426B2 (en) 2011-11-07 2014-10-07 Microsoft Corporation Time-of-flight camera with guided light
US8724906B2 (en) 2011-11-18 2014-05-13 Microsoft Corporation Computing pose and/or shape of modifiable entities
US8509545B2 (en) 2011-11-29 2013-08-13 Microsoft Corporation Foreground subject detection
US8635637B2 (en) 2011-12-02 2014-01-21 Microsoft Corporation User interface presenting an animated avatar performing a media reaction
US8803800B2 (en) 2011-12-02 2014-08-12 Microsoft Corporation User interface control based on head orientation
US9100685B2 (en) 2011-12-09 2015-08-04 Microsoft Technology Licensing, Llc Determining audience state or interest using passive sensor data
US8879831B2 (en) 2011-12-15 2014-11-04 Microsoft Corporation Using high-level attributes to guide image processing
US8630457B2 (en) 2011-12-15 2014-01-14 Microsoft Corporation Problem states for pose tracking pipeline
US8971612B2 (en) 2011-12-15 2015-03-03 Microsoft Corporation Learning image processing tasks from scene reconstructions
US8811938B2 (en) 2011-12-16 2014-08-19 Microsoft Corporation Providing a user interface experience based on inferred vehicle state
US9342139B2 (en) 2011-12-19 2016-05-17 Microsoft Technology Licensing, Llc Pairing a computing device to a user
US9720089B2 (en) 2012-01-23 2017-08-01 Microsoft Technology Licensing, Llc 3D zoom imager
US8898687B2 (en) 2012-04-04 2014-11-25 Microsoft Corporation Controlling a media program based on a media reaction
US9210401B2 (en) 2012-05-03 2015-12-08 Microsoft Technology Licensing, Llc Projected visual cues for guiding physical movement
CA2775700C (en) 2012-05-04 2013-07-23 Microsoft Corporation Determining a future portion of a currently presented media program
CN104395929B (en) 2012-06-21 2017-10-03 Microsoft Technology Licensing, Llc Avatar construction using a depth camera
US9836590B2 (en) 2012-06-22 2017-12-05 Microsoft Technology Licensing, Llc Enhanced accuracy of user presence status determination
US9696427B2 (en) 2012-08-14 2017-07-04 Microsoft Technology Licensing, Llc Wide angle depth detection
US8882310B2 (en) 2012-12-10 2014-11-11 Microsoft Corporation Laser die light source module with low inductance
US9857470B2 (en) 2012-12-28 2018-01-02 Microsoft Technology Licensing, Llc Using photometric stereo for 3D environment modeling
US9251590B2 (en) 2013-01-24 2016-02-02 Microsoft Technology Licensing, Llc Camera pose estimation for 3D reconstruction
US9277206B1 (en) 2013-01-28 2016-03-01 Cognex Corporation Dual-view laser-based three-dimensional capture system and method for employing the same
US9052746B2 (en) 2013-02-15 2015-06-09 Microsoft Technology Licensing, Llc User center-of-mass and mass distribution extraction using depth images
US9940553B2 (en) 2013-02-22 2018-04-10 Microsoft Technology Licensing, Llc Camera/object pose from predicted coordinates
US9135516B2 (en) 2013-03-08 2015-09-15 Microsoft Technology Licensing, Llc User body angle, curvature and average extremity positions extraction using depth images
US9092657B2 (en) 2013-03-13 2015-07-28 Microsoft Technology Licensing, Llc Depth image processing
US9274606B2 (en) 2013-03-14 2016-03-01 Microsoft Technology Licensing, Llc NUI video conference controls
US9953213B2 (en) 2013-03-27 2018-04-24 Microsoft Technology Licensing, Llc Self discovery of autonomous NUI devices
US9442186B2 (en) 2013-05-13 2016-09-13 Microsoft Technology Licensing, Llc Interference reduction for TOF systems
US9462253B2 (en) 2013-09-23 2016-10-04 Microsoft Technology Licensing, Llc Optical modules that reduce speckle contrast and diffraction artifacts
US9443310B2 (en) 2013-10-09 2016-09-13 Microsoft Technology Licensing, Llc Illumination modules that emit structured light
US9674563B2 (en) 2013-11-04 2017-06-06 Rovi Guides, Inc. Systems and methods for recommending content
US9769459B2 (en) 2013-11-12 2017-09-19 Microsoft Technology Licensing, Llc Power efficient laser diode driver circuit and method
US9508385B2 (en) 2013-11-21 2016-11-29 Microsoft Technology Licensing, Llc Audio-visual project generator
US9971491B2 (en) 2014-01-09 2018-05-15 Microsoft Technology Licensing, Llc Gesture library for natural user input
ITUB20154051A1 (en) * 2015-09-30 2017-03-30 Keyline S P A Apparatus for reading and recognizing the profile of a key
US10412280B2 (en) 2016-02-10 2019-09-10 Microsoft Technology Licensing, Llc Camera with light valve over sensor array
US10257932B2 (en) 2016-02-16 2019-04-09 Microsoft Technology Licensing, Llc Laser diode chip on printed circuit board
US10462452B2 (en) 2016-03-16 2019-10-29 Microsoft Technology Licensing, Llc Synchronizing active illumination cameras
EP3441712A1 (en) * 2017-08-08 2019-02-13 Klingelnberg AG Coordinate measuring device comprising an optical sensor and corresponding method
DE102018222777A1 (en) * 2018-12-21 2020-06-25 Robert Bosch Gmbh Optoelectronic sensor and method for operating an optoelectronic sensor
US11663804B2 (en) 2021-06-04 2023-05-30 Micron Technology, Inc. Determining image sensor settings using LiDAR

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4040738A (en) * 1975-03-20 1977-08-09 Gulton Industries, Inc. Railroad track profile spacing and alignment apparatus
US4355904A (en) * 1978-09-25 1982-10-26 Balasubramanian N Optical inspection device for measuring depthwise variations from a focal plane
US4375921A (en) * 1980-03-13 1983-03-08 Selective Electronic Co. Ab Dimension measuring apparatus
JPS57128810A (en) * 1981-02-03 1982-08-10 Olympus Optical Co Ltd Distance measuring device
JPS57165704A (en) * 1981-04-03 1982-10-12 Hitachi Ltd Detecting system for light spot position
JPS5834313A (en) * 1981-08-26 1983-02-28 Canon Inc Active type distance measuring device
JPS58113706A (en) * 1981-12-26 1983-07-06 Nippon Kogaku Kk <Nikon> Detector for horizontal position
DE3319320A1 (en) * 1983-05-27 1984-11-29 Siemens AG, 1000 Berlin und 8000 München Device for detecting a spatial coordinate of a light point
US4643578A (en) * 1985-03-04 1987-02-17 Robotic Vision Systems, Inc. Arrangement for scanned 3-D measurement
US4634879A (en) * 1985-03-21 1987-01-06 General Electric Company Method and system for determining surface profile information
US4677302A (en) * 1985-03-29 1987-06-30 Siemens Corporate Research & Support, Inc. Optical system for inspecting printed circuit boards wherein a ramp filter is disposed between reflected beam and photodetector
US4645917A (en) * 1985-05-31 1987-02-24 General Electric Company Swept aperture flying spot profiler

Also Published As

Publication number Publication date
EP0247833A3 (en) 1988-04-06
US4796997A (en) 1989-01-10
DE3769368D1 (en) 1991-05-23
EP0247833A2 (en) 1987-12-02
EP0247833B1 (en) 1991-04-17

Similar Documents

Publication Publication Date Title
CA1265869A (en) Method and system for high-speed, 3-d imaging of an object at a vision station
US4743771A (en) Z-axis height measurement system
EP1571414B1 (en) Apparatus and method for surface contour measurement
US5546189A (en) Triangulation-based 3D imaging and processing method and system
US6268923B1 (en) Optical method and system for measuring three-dimensional surface topography of an object having a surface contour
US6525827B2 (en) Method and system for imaging an object with a plurality of optical beams
US5812269A (en) Triangulation-based 3-D imaging and processing method and system
CN110986756B (en) Measuring device for three-dimensional geometrical acquisition of the surroundings
EP1183499B1 (en) Three dimensional optical scanning
US6765606B1 (en) Three dimension imaging by dual wavelength triangulation
Sensiper et al. Modulation transfer function testing of detector arrays using narrow-band laser speckle
Beraldin et al. Practical range camera calibration
US4453827A (en) Optical distortion analyzer system
CN211876977U (en) Line focusing differential color confocal three-dimensional surface topography measuring system
JPH05332733A (en) Detection optical system and method for detecting three-dimensional form
GB2306825A (en) Laser ranging using time correlated single photon counting
GB2204947A (en) Method and system for high speed, 3-D imaging of an object at a vision station
CN112432766A (en) Method for detecting performance of laser scanning galvanometer
US5940566A (en) 3D array optical displacement sensor and method using same
Svetkoff Towards a high resolution, video rate, 3D sensor for machine vision
JPH09196624A (en) Method and apparatus for measurement of very small size
US5521694A (en) Laser beam path profile sensor system
CN112857752A (en) Absolute measurement system and method for angle-resolved scattering of optical element
CN112432765A (en) Laser scanning galvanometer performance detection device
JPS6361110A (en) High-speed three-dimensional measuring method and device

Legal Events

Date Code Title Description
MKLA Lapsed