WO1998002764A1 - Portable 3-d scanning system and method for rapid shape digitizing and adaptive mesh generation - Google Patents

Portable 3-d scanning system and method for rapid shape digitizing and adaptive mesh generation

Info

Publication number
WO1998002764A1
Authority
WO
WIPO (PCT)
Prior art keywords
motor
stripe
known angular
audio
approach
Prior art date
Application number
PCT/US1997/011764
Other languages
French (fr)
Inventor
Alexander A. Migdal
Alexei Lebedev
Michael Petrov
Original Assignee
Real-Time Geometry Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to AU36523/97A priority Critical patent/AU3652397A/en
Application filed by Real-Time Geometry Corporation filed Critical Real-Time Geometry Corporation
Publication of WO1998002764A1 publication Critical patent/WO1998002764A1/en

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B26/00Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G02B26/08Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B26/00Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G02B26/08Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
    • G02B26/10Scanning systems
    • G02B26/12Scanning systems using multifaceted mirrors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T9/00Image coding
    • G06T9/001Model-based coding, e.g. wire frame

Definitions

  • This invention relates generally to three-dimensional ("3D") scanning and measuring systems, and relates particularly to a portable 3D scanning system and method which facilitate acquisition and storage of data relating to 3D profiles of objects for subsequent computer-aided data processing and reproduction of the 3D profiles of the objects by shape digitizing and adaptive mesh generation.
  • a mechanical system acquires data about an object through the use of a probe that has a sensitive tip.
  • the mechanical system scans an object by manually moving its probe tip across the object's surface and taking readings.
  • the probe connects to a mechanical arm, and the system tracks the probe's position in space using angle measuring devices as the arm moves.
  • the system calculates the position of the probe with coordinates known from the angle measuring devices.
  • a triangulation system projects beams of light on an object and then determines three-dimensional spatial locations for points where the light reflects from the object. Ordinarily, the reflected light bounces off the object at an angle relative to the light source. The system collects the reflection information from a location relative to the light source and then determines the coordinates of the point or points of reflection by triangulation.
  • a single dot system projects a single beam of light which, when reflected, produces a single dot of reflection.
  • a scan line system sends a plane of light against the object which projects on the object on a line and reflects as a curvilinear-shaped set of points describing one contour line of the object.
  • the location of each point in that curvilinear set of points can be determined by triangulation.
  • Some single dot optical scanning systems use a linear reflected light position detector to read information about the object.
  • a laser projects a dot of light upon the object.
  • the linear reflected light position detector occupies a position relative to the laser which allows the determination of a three dimensional location for the point of reflection.
  • a single dot optical scanner with a linear reflected light position detector can digitize only a single point at a time.
  • a single dot optical scanning system, like the mechanical system described above, is relatively slow in collecting a full set of points to describe an object.
  • Single dot optical scanners are typically used for applications such as industrial engineering. The digitizing speed is usually slow and is limited by the mechanics of the scanning system, i.e., the moving and positioning of the light beam.
  • a scanning head can be mounted on a high-precision, but costly, positioning system to take a digitized image of the object's shape with generally good accuracy.
  • single dot optical scanners find generally only limited application.
  • Those systems typically employ a 2D imager, such as a charge-coupled device (CCD) camera, for signal detection.
  • the systems project a light plane (i.e., a laser stripe) instead of just one dot and then read the reflection of multiple points depicting the contour of an object at a location that is a distance from the CCD camera and from which the position can be triangulated.
  • Some embodiments of the scan line-type system attach the CCD camera to a rotating arm or a moving platform.
  • Some laser stripe triangulation systems currently available are further limited because the laser stripe stays at a fixed angle relative to the camera and the system makes its calculations based on the cylindrical coordinates of its rotating platform.
  • the mathematical simplicity in such a projection system complicates the hardware portion of these devices as they typically depend on the rotational platform mentioned.
  • the simplified geometry does not generally allow for extremely refined reproduction of topologically nontrivial objects, such as objects with holes in them (e.g., a teapot with a handle).
  • Full realization of triangulation scanning with a non-restrictive geometry has not been achieved in the available devices. Additionally, for those optical triangulation systems employing a computer, there is the further problem of processing the incoming data.
  • the CCD camera typically outputs frames of picture information at a rate of 30 or more frames per second.
  • Each frame is composed of a two dimensional frame matrix of pixels and contains, for example, 640 x 480 pixel values of light intensity information.
  • laser stripe triangulation systems must sort through many megabytes of information. These systems typically require very powerful computers and have sizeable memory requirements. In addition, they take a relatively long time to process the incoming CCD information into a viable set of points concerning the object. The points created can depict the object, but the systems that create them are also limited in that they typically do not achieve a sophisticated model of the object.
  • Range meters and multi-camera systems are among those categorized as "alternative" systems.
  • Range meter systems typically use an infrared pulsed laser and mechanical scanning techniques to project a dot laser across an object and then measure the phase delay of the reflected signal.
  • range meter systems typically incorporate a single dot method of data collection, they generally have the speed limitations that are intrinsic to single-point scanners. Additional accuracy problems occur because depth coordinates are not sufficiently accurate, such that in some systems, when an object is large, ghosts can appear on the scan.
  • Another type of alternative scanning system is a stereoscopic system which uses several CCD cameras located at known distances from each other. The captured images are processed with a pattern recognition system which maps the various points of an object captured by the cameras, thereby obtaining the shape/contour information.
  • One advanced stereoscopic system uses 16 CCD cameras. Although each camera in such a system has a small exposure time, it takes several minutes to analyze the data for each scan. This can cause the system to delay, sometimes up to six minutes per scan.
  • the device must also project a special grid on an object to obtain reference points for gluing a complete 3D picture.
  • accuracy is sometimes a problem because stereoscopic scanning relies on light reflecting properties. The systems make assumptions based on Lambertian reflecting properties to determine resolution surface features of the scanned objects. Different surfaces can dictate different results for the same object.
  • an ideal, portable, inexpensive 3D scanning system should minimize the components, e.g., include just a video camera and a mechanism for variably projecting a light stripe onto a target object, and preferably not require a computer for scanning.
  • the present invention provides a high speed, accurate and portable system and method for rapidly measuring objects and processing the shape, contour, color and other data it collects for display, graphic manipulation, model building -- such as through adaptive mesh generation -- and other uses. Because the basic information about the object is obtained in rapid fashion, the invention is particularly suited to scan and measure objects which cannot easily stay motionless, such as people or animals.
  • the mechanical and data processing features of the present invention permit the collected data to be processed with high accuracy.
  • the present invention provides a system for rapidly scanning an object with a geometric light shape (such as a laser stripe), recording the shape of the reflected points of light by means of an image collector (such as a camera), and, by a triangulation technique that does not depend on the fixed direction of the light source relative to the camera, reconstructing the 3D shape of the object through a computer using the data points collected from the reflection of the laser stripes.
  • a user can, inter alia, create, display and manipulate an image of the 3D object on a computer, physically reproduce the object (through computer controlled milling machines and stereolithography), compress the data for easy transmission (such as over the Internet), or use the data in graphic manipulation systems (such as in 3D video animators).
  • the invention includes a model building algorithm which provides multi-resolution analysis and adaptive mesh generation, leading to significant compression of the digital shape information, which for most objects will be an order-of-magnitude compression.
  • the present invention also provides embodiments which are portable (and not limited to any specific background) and can also be implemented using components readily available.
  • a representative embodiment of the portable scanning system according to the present invention does not require a computer as part of the system because data processing contemporaneous with the data collection is obviated in this embodiment. Instead, the portable system collects 3D-profile data of objects and records the data on a storage medium. The data stored on the storage medium is subsequently processed, at any desired time, by a computer system which applies the above-mentioned data-processing routines, e.g., the model building algorithm which provides multi-resolution analysis and adaptive mesh generation. It should be noted, however, that processing of the data collected using the portable scanning system according to the present invention need not be limited to the specific data-processing routines described herein. In the portable scanning system according to the present invention, synchronization of the image-capturing device, e.g., a video camera, and the projection mechanism for variably projecting a stripe of light is not achieved at the time of actual scanning. By avoiding the synchronization issue at the time of actual scanning, the need to utilize a computer to simultaneously control the projection mechanism and the video camera is also obviated.
  • the portable scanning system records on one or more data storage media both the video data and data relating to the angular positions of the light stripe during scanning.
  • when the recorded data is eventually processed by a computer, the angular-position data is synchronized with the video data.
  • Several implementations of the synchronization scheme are utilized in the portable scanning system according to the present invention.
  • the portable scanning system utilizes the principle of optical triangulation.
  • the scanning system includes a laser-stripe generator, a video camera, a scanning mirror attached to a continuously rotating motor, and associated electronics.
  • the scanning system may also include an encoder or a photodiode coupled to the motor and a light source for providing ambient light.
  • the rotating, scanning mirror reflects the laser stripe and variably positions the laser stripe across a target object.
  • the angular positions of the scanning mirror are encoded, and the encoded data reflecting the angular positions is recorded.
  • the camera, which is positioned at a triangulation distance from the target object, detects the laser-stripe on the object and records the video data.
  • the computer can calculate the depth component of the 3D profile of the target object by combining the video data and the encoded data reflecting the angular positions of the scanning mirror.
  • the present invention provides a further aspect related to rapid processing by providing a system to compress the video camera information as it is input to the computer.
  • a large amount of video signal information (bytes converted from analog signals) comes to the computer memory for processing.
  • the present invention provides a system which rapidly processes the information by compressing the image information and keeping from each image only the information about the reflected data points collected from the positioned laser stripes.
  • the data compression system of the present invention reduces the raw data down to only a few kilobytes depending upon the application.
  • as data arrives, the computer applies an "on the fly" test as it first comes into a memory buffer, with the computer rejecting pixel values that fail the test.
  • the compacted data allows for easier, speedier processing and alleviates the need for massive computing power to complete the scanning process.
  • a currently available personal computer is all that is necessary to implement the data gathering system of the present invention.
  • the present invention also provides features that allow for more accurate measurement of the object and better display of its image.
  • One feature provided is a system to eliminate "noise" from the image information and determine the most accurate 3D spatial location of any reflection data point on the object to the degree of "subpixel accuracy".
  • the image of the data point created by a laser stripe reflection can be three or more pixels wide, but the most intense portion of the point will be located in only a portion of a pixel.
  • the invention refines the collected data points to subpixel accuracy by locating the most likely center of the data point according to calculation heuristics.
  • Another feature of the present invention in performing shape digitization is to obtain only the essential information when collecting shape data and color information about an object. In the present invention, it is not necessary in obtaining shape data to process full color information during scanning.
  • the present invention features the ability in an exemplary embodiment to intake laser stripe scanning information about an object with an electronic camera, and to capture that information in black and white or through a single color port, such as the red.
  • the present invention provides that color information about the object can be separately gathered apart from the collection of data points by laser stripe scanning and later matched and used with the three-dimensional data points.
  • Fig. 1a illustrates a first exemplary embodiment of the portable scanning system according to the present invention.
  • Fig. 1b illustrates some of the geometric relationships among various components of the first exemplary embodiment of the portable scanning system according to the present invention shown in Fig. 1a.
  • Fig. 2a illustrates a second exemplary embodiment of the portable scanning system according to the present invention which includes a mechanical switch or an encoder.
  • Fig. 2b illustrates a modification of the second exemplary embodiment of the portable scanning system according to the present invention shown in Fig. 2a, which modification adds a logic electronics and a lamp.
  • Fig. 2c illustrates a modification of the second exemplary embodiment of the portable scanning system according to the present invention shown in Fig. 2a, which modification adds a logic electronics and a telephone tone generator or a modem encoder chip.
  • Fig. 3 illustrates a third exemplary embodiment of the portable scanning system according to the present invention which includes an optical encoder.
  • Fig. 4a illustrates a fourth exemplary embodiment of the portable scanning system according to the present invention which includes photodiodes.
  • Fig. 4b illustrates a modification of the fourth exemplary embodiment of the portable scanning system according to the present invention shown in Fig. 4a, which modification adds a logic electronics and a lamp.
  • Fig. 4c illustrates a modification of the fourth exemplary embodiment of the portable scanning system according to the present invention shown in Fig. 4a, which modification adds a logic electronics and a telephone tone generator or a modem encoder chip.
  • Fig. 5 illustrates a fifth exemplary embodiment of the portable scanning system according to the present invention which includes logic electronics and a lamp.
  • Fig. 6 illustrates a sixth exemplary embodiment of the portable scanning system according to the present invention which implements angular data encoding by utilizing a telephone tone generator or a modem encoder.
  • Fig. 7 illustrates the sixth exemplary embodiment of the portable scanning system shown in Fig. 6 connected to audio and video signal interfaces of a computer.
  • Fig. 8 illustrates the use of an intermediate storage medium to store 3D-profile data captured using the sixth exemplary embodiment of the portable scanning system according to the present invention shown in Fig. 6.
  • a basic embodiment of a portable scanning system includes a laser-stripe generator 1602, a video camera 1604, a continuously spinning motor 1603 and a scanning mirror 1601 attached to the continuously spinning motor 1603.
  • the rotating, scanning mirror reflects the laser stripe 1606 and variably positions the laser stripe across a target object 101.
  • the video camera, which is positioned relative to the target object at a distance which allows triangulation calculation, detects the laser stripe 1608 reflected from the object and records a sequence of image frames each containing an image of the reflected laser-stripe.
  • the combination of the scanning mirror 1601 attached to the continuously spinning motor 1603, the laser-stripe generator 1602, and the video camera 1604 is preferably mounted on a mechanical holder 1605.
  • the mechanical holder 1605, which allows the triangulation distance between the scanning mirror 1601 and the video camera 1604 to be varied by manipulation of the adjustment mechanism 1607, may be in turn mounted on a standard camera tripod, which is not shown.
  • the motor 1603 may be a simple DC/AC motor with a gear head.
  • the motor 1603 rotates at a constant speed in the range of 2-4 revolutions per minute (rpm), for example.
  • the angular velocity of the motor 1603 may be adjusted for varying levels of scanning quality and scanning time. High stability of rotation speed of the motor is not required for the portable scanning system according to the present invention.
  • the motor may be a miniature motor such as model 2230V012S22B sold by MicroMo Corp. of Clearwater, Florida.
  • the model 2230V012S22B is approximately 2" deep, including the gear head, and 0.7" in diameter.
  • the mirror 1601 shown in Fig. 1a is a polygonal mirror which has, for example, 6 symmetrical facets.
  • the multi-faceted, polygonal mirror 1601 allows the portable embodiments of the scanning system according to the present invention to achieve shorter "dead periods," i.e., periods when no laser stripe is positioned on the target object, during scanning in comparison to an embodiment utilizing a single, flat mirror mounted on a constant-speed motor.
  • the video camera 1604 may be any one of the commercially available video cameras as long as the chosen video camera outputs sufficiently high-resolution, real-time video signals.
  • Typical, commercially available video cameras have a resolution ranging from 320x240 pixels per image frame up to 640x480 pixels per image frame.
  • typical video cameras have an operating speed of 30 image frames per second, which is equivalent to 60 image fields per second.
  • a typical, commercially available video camcorder, which combines a video camera with a recording mechanism, is adapted to record the video signals on a storage medium such as a video cassette tape.
  • the scanning time for obtaining 50-200 laser-stripe images would be approximately 1 to 3 seconds.
  • the computer can calculate the depth component Z_0 of the 3D profile of the scanned portion of the target object by performing the triangulation calculation with the two-dimensional coordinates of the scanned portion of the target object stored in the video image frames.
  • the three-dimensional coordinate system for the scanning system according to the present invention is centered at the focal point of the focusing lens 1604a of the camera 1604, with the Y-coordinate extending perpendicular to the plane defined by the drawing sheet.
  • the distance d shown in Fig. 1b, which may be described as the focal point location value representing the relationship between the focal point and the image plane (light collection plate) 1604b, is ascertained by a calibration procedure which is described in detail in a commonly-owned U.S. Patent Application entitled "High Accuracy Calibration for 3D Scanning and Measuring Systems" by A. Migdal, A. Lebedev, M. Petrov and A.
  • the distance R is determined by measurement, and the distances X_L and Z_L are determined by the calibration procedure described in the "High Accuracy Calibration for 3D Scanning and Measuring Systems" application.
  • although Z_L illustrated in Fig. 1b need not be zero, Z_L may be assumed to be zero for the sake of simplicity of explanation in the following discussion.
  • the rotational axis of the motor 1603, which need not be coincident with the Y-axis, is assumed to be coincident with the Y-axis for the sake of simplicity of explanation in the following discussion.
  • the angle α' can be ascertained based on the image coordinate (x, y, d) on the image plane 1604b corresponding to the scanned, actual coordinate (X_0, Y_0, Z_0). Since the distances X_L, Z_L and R, as well as the angle α', are known, the Z-coordinate distance Z_0 of the scanned point on the target object may be ascertained if the angle θ, which indicates the angle of approach of the laser stripe relative to the target object 101 for the scanned, actual coordinate (X_0, Y_0, Z_0), is known.
  • the angle θ is the angular separation between the normal vector to the given mirror facet and an axis parallel to the Z-axis of the coordinate system. Accordingly, the remaining issue in calculating the depth component of the 3D profile of the target object is how to ascertain the angle θ of the scanning mirror, i.e., the angle of approach of the laser stripe relative to the target object for a given light-stripe position as recorded in the stored video image frames.
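  • The following minimal Python sketch illustrates one way such a triangulation could be carried out once θ is known. It assumes a simplified geometry that the text does not spell out: the focal point of lens 1604a at the origin, the image plane at distance d along the Z-axis, the mirror at (X_L, 0, Z_L), and the laser plane tilting back toward the camera axis by the angle θ; the function name and sign conventions are illustrative only, not the patent's own formulation.

```python
import math

def triangulate_depth(x, y, d, X_L, Z_L, theta):
    """Intersect the camera viewing ray with the laser-stripe plane.

    Assumptions (illustrative, not taken verbatim from the text above):
      - (x, y) is the image coordinate of the reflected stripe and d is the
        focal point location value, so the viewing ray is X = Z*x/d, Y = Z*y/d;
      - the laser plane passes through (X_L, 0, Z_L) and satisfies
        X = X_L - (Z - Z_L) * tan(theta), i.e. it tilts back toward the
        camera axis as Z grows.
    Returns (X_0, Y_0, Z_0), or None for a degenerate geometry.
    """
    tan_t = math.tan(theta)
    denom = x / d + tan_t
    if abs(denom) < 1e-12:
        return None                     # viewing ray parallel to laser plane
    Z0 = (X_L + Z_L * tan_t) / denom    # depth component of the scanned point
    return (Z0 * x / d, Z0 * y / d, Z0)

# Example: mirror 300 mm beside the lens, stripe approaching at 30 degrees.
print(triangulate_depth(x=1.2, y=0.4, d=8.0, X_L=300.0, Z_L=0.0,
                        theta=math.radians(30.0)))
```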
  • Several different approaches may be taken in addressing the issue of ascertaining the angle θ of the laser stripe relative to the target object.
  • the speed of the motor 1603 may be assumed to be constant.
  • the angular velocity of the motor 1603 and, in turn, the angular velocity of the scanning mirror 1601 may be calculated by analyzing the video data recorded by the video camera 1604.
  • a computer simply compares a sequence of stored video frames to find those frames in which the laser stripes are at identical positions. Based on the number of intervening frames between the two substantially identical video frames, the computer can calculate the angular velocity of the motor 1603 and the scanning mirror 1601.
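  • As a rough illustration of that frame-comparison step, the sketch below estimates the repetition period of the stripe and converts it to a motor speed; the per-frame stripe tracks, the tolerance, and the default 6-facet value are hypothetical inputs for this sketch, not procedures specified in the text.

```python
import numpy as np

def stripe_period_frames(stripe_tracks, tol=1.0):
    """Find how many frames elapse before the stripe returns to (nearly)
    the same position. `stripe_tracks` is assumed to be an array of shape
    (n_frames, n_rows) holding the detected stripe column in each image
    row, with NaN where no stripe was found (an assumed pre-processing step)."""
    ref = stripe_tracks[0]
    for k in range(1, len(stripe_tracks)):
        if np.nanmean(np.abs(stripe_tracks[k] - ref)) < tol:
            return k                    # stripe is back at its start position
    return None

def motor_rpm(period_frames, fps=30.0, facets=6):
    """With a multi-faceted mirror the stripe pattern repeats once per facet,
    i.e. every 1/facets of a motor revolution."""
    return 60.0 * fps / (facets * period_frames)
```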
  • the computer can associate the two-dimensional coordinate points recorded in the video image frames with the corresponding angle θ of the laser stripe to calculate the depth component of the 3D profile of the scanned portion of the target object.
  • the angle θ_0 may be manually "adjusted" by means of a 3-D editor, i.e., by performing graphic manipulation.
  • the embodiment illustrated in Fig. 2a is substantially identical to the embodiment of Fig. 1a, with the addition of a pulse-generating mechanism 1701.
  • the pulse-generating mechanism 1701, which may be a mechanical switch or a low-resolution encoder, is used to provide one or several pulses per rotation of the motor 1603.
  • Each pulse corresponds to a reference angle.
  • the pulses may be fed via connection 1701a to, and recorded on, a separate recording medium, which is not shown, and subsequently processed with the recorded video image frames containing laser-stripe images.
  • the video camera may be adapted to include digital memory for the purpose of storing the encoder pulses.
  • the pulses, along with the video data, may be fed directly to a computer. When the pulses and the video data are processed by the computer, the computer zeroes the time counter at each occurrence of the pulse, thereby synchronizing the recorded video image frames with corresponding angular positions of the motor 1603.
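  • A simple sketch of that synchronization step follows: the motor angle for each video frame is interpolated between reference pulses on the assumption of constant speed. The time-stamp inputs and the pulses-per-revolution value are assumptions of the sketch; the text above only requires one or several pulses per rotation.

```python
import bisect

def angles_from_pulses(frame_times, pulse_times, pulses_per_rev=1):
    """Interpolate the motor angle (degrees, relative to the reference angle)
    for each frame time, zeroing the count at every pulse and assuming
    constant speed between consecutive pulses."""
    step = 360.0 / pulses_per_rev
    angles = []
    for t in frame_times:
        k = bisect.bisect_right(pulse_times, t) - 1
        if k < 0 or k + 1 >= len(pulse_times):
            angles.append(None)          # frame falls outside the pulse record
            continue
        t0, t1 = pulse_times[k], pulse_times[k + 1]
        angles.append((k % pulses_per_rev) * step + step * (t - t0) / (t1 - t0))
    return angles
```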
  • the embodiment of Fig. 3 is substantially identical to the embodiment shown in Fig. 1a, with the addition of an optical encoder 1801.
  • The optical encoder 1801 includes a disk ("code wheel") with holes which is attached to the shaft of the motor 1603, one or more light emitting diodes (LEDs), a set of photodetectors and a signal-processing circuit. As the disk rotates with the shaft of the motor 1603, light emitted from the set of LEDs travels through the holes in the disk and is detected by the photodetectors. The signals from the photodetectors are processed by the signal-processing circuit, and the resulting output signals, indicating the angular position θ of the motor 1603, are provided on output channels 1802 and 1803.
  • the output signals on channels 1802 and 1803 are square waveforms representing binary numbers; the output signal on channel 1802 is 90 degrees out of phase with the output signal on channel 1803.
  • Exemplary, low-cost optical encoders which may be incorporated in the exemplary embodiment of Fig. 3 are HEDS/HEDM series of optical encoders made by Hewlett-Packard Co. These encoders provide 360-1024 counts per revolution, as reflected by output signals on the channels 1802 and 1803.
  • HEDS-5540/5640 encoders generate three-channel digital output, with the output signal on channel 1804 being an index pulse P_0 which is generated once for each full rotation of the motor 1603.
  • the output signals on channels 1802-1804 may be recorded on a recording medium or fed directly to a computer.
  • the computer can calculate the angular position θ of the motor 1603 at any given time with high precision. In this manner, the angular positions of the motor 1603 corresponding to the recorded video image frames may be ascertained.
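  • The sketch below shows one conventional way to turn such two-channel quadrature signals, together with the index pulse, into angles; the sampling format of the recorded channels is an assumption, and 4x decoding of a 360-count encoder is used purely as an example.

```python
def decode_quadrature(samples, counts_per_rev=360):
    """Decode recorded (A, B, index) samples into motor angles in degrees.

    Assumes the channels were sampled fast enough that at most one of the
    two quadrature channels changes between consecutive samples, and that
    the index pulse (channel 1804) marks the zero reference once per
    revolution."""
    order = {(0, 0): 0, (0, 1): 1, (1, 1): 2, (1, 0): 3}  # Gray-code sequence
    count, prev, angles = 0, None, []
    for a, b, idx in samples:
        state = order[(int(a), int(b))]
        if idx:
            count = 0                          # reference angle reached
        elif prev is not None:
            step = (state - prev) % 4
            if step == 1:
                count += 1                     # one quadrature step forward
            elif step == 3:
                count -= 1                     # one quadrature step backward
        prev = state
        angles.append(360.0 * count / (4 * counts_per_rev))
    return angles
```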
  • yet another exemplary embodiment of the portable scanning system utilizes photodiodes for the purpose of facilitating eventual synchronization between the laser-stripe images recorded as video data and the corresponding angular positions of the motor 1603.
  • the embodiment of Fig. 4a is substantially similar to the embodiment shown in Fig. 16, with the addition of two photodiodes 1901 each located at a known position from the scanning mirror 1601. Alternatively, a single photodiode may be used. As the mirror 1601 rotates, the photodiodes 1901 detect the laser stripe 1606 reflected from the mirror 1601, in response to which the photodiodes 1901 generate detection signals.
  • the detection signal indicates orientation of the scanning mirror 1601 at the particular known angle relative to the photodiode.
  • the detection signals may be recorded on a recording medium or fed directly to a computer.
  • the angle θ of the scanning mirror 1601 relative to the target object corresponding to a given laser-stripe image may be ascertained.
  • Still another exemplary embodiment of the portable scanning system according to the present invention, shown in Fig. 5, is substantially similar to the embodiment of Fig. 3, with the addition of logic electronics 2001, a relay 2002 and a lamp 2003.
  • the logic electronics 2001 triggers the lamp 2003 in response to the output signal of the encoder 1801 when the motor 1603 reaches one or more predetermined reference angle(s). Since the lamp 2003 turns on only at specific angular positions of the motor 1603, the angular position θ of the motor corresponding to a given image frame may be ascertained by comparing total brightness of adjacent recorded image frames, i.e., determining when the lamp 2003 is turned on. Based on the angular position of the motor corresponding to an image frame in which the lamp 2003 was turned on, the computer can then calculate the angular position θ of the motor for any other recorded image frame.
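  • One possible implementation of that brightness comparison is sketched below; the 1.5x jump threshold is an arbitrary illustrative value, and grayscale image frames are assumed.

```python
import numpy as np

def find_lamp_frames(frames, ratio=1.5):
    """Return indices of frames whose total brightness jumps relative to the
    preceding frame, taken here as the sign that the lamp 2003 has just been
    turned on."""
    brightness = [float(np.asarray(f, dtype=np.float64).sum()) for f in frames]
    return [k for k in range(1, len(brightness))
            if brightness[k] > ratio * brightness[k - 1]]
```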
  • Fig. 5 shows the output signal of the encoder 1801, transmitted via channels 1802-1804, as the input signal for the logic electronics 2001.
  • the logic electronics 2001 may be designed to trigger the lamp 2003 in response to other input signals which indicate that the motor 1603 has reached a predetermined reference angle.
  • output signals from the mechanical switch 1701, shown in Fig. 2a, or signals from the photodiodes 1901, shown in Fig. 4a may serve as the input signals for the logic electronics 2001, i.e., the embodiments of Figs. 2a and 4a may be modified by adding the logic electronics 2001, the relay 2002 and the lamp 2003.
  • such modifications of the embodiments of Figs. 2a and 4a are illustrated in Figs. 2b and 4b, respectively.
  • the lamp 2003 also serves the function of providing light for taking a texture-map image, in addition to the function of generating information about the angular position of the motor.
  • the lamp 2003 is useful for the purpose of taking a texture-map image because increased uniformity of illumination of the scanned object is desirable. Meanwhile, because the dynamic range of commercially available video cameras is limited, i.e., 256 gradations of gray for a typical low-cost camera, reduced background illumination relative to the object illumination usually results in a more efficient 3D scanning process.
  • Shown in Fig. 6 is yet another exemplary embodiment of the portable scanning system according to the present invention, which system records the angular positions of the motor 1603 preferably in audio format.
  • the exemplary embodiment of Fig. 6 includes substantially similar components as the exemplary embodiment of Fig. 5, with the addition of an audio-signal generator 2101.
  • the angular positions of the motor 1603 may be encoded into audio type electrical signals by the audio-signal generator 2101, which may be a telephone tone generator or a modem encoder chip.
  • the telephone tone generator, used in modern touch-tone telephones, generates audio type electrical signals by mixing two electrical signals of different frequencies. The bandwidth of the resultant signal falls well within the telephone signal bandwidth of 4 kHz.
  • the logic electronics 2001 triggers the telephone tone generator or the modem encoder chip 2101 in response to the output of the optical encoder 1801 transmitted via output channels 1802-1804, such that every count generated by the optical encoder 1801 is represented by a sequence of telephone tone pulses.
  • the logic electronics 2001 and the telephone tone generator (or the modem encoder chip) 2101 may be viewed as components of audio electronics 2102.
  • the logic electronics 2001 includes a quartz generator for triggering the tone generator or the modem encoder chip 2101. Assuming the optical encoder 1801 generates 360 counts per revolution, for example, then the modem encoder chip 2101 needs to encode 360 different numbers, which means using as many as three bits of the modem encoder chip 2101 for encoding the value of a single angle.
  • the quartz generator provides the timing signal for outputting the multiple bits of the modem encoder chip.
  • any commonly used telephone tone generator will be sufficient for the purposes of encoding angular positions of the motor 1603.
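  • A hedged sketch of such tone encoding is shown below: each encoder count (0-359) is spelled out as three decimal digits, and each digit is emitted as a standard DTMF frequency pair. The sample rate, pulse durations and three-digit framing are illustrative choices of this sketch rather than parameters specified in the text.

```python
import numpy as np

# Standard DTMF (touch-tone) frequency pairs for the decimal digits.
DTMF = {'0': (941, 1336), '1': (697, 1209), '2': (697, 1336), '3': (697, 1477),
        '4': (770, 1209), '5': (770, 1336), '6': (770, 1477),
        '7': (852, 1209), '8': (852, 1336), '9': (852, 1477)}

def encode_count_as_tones(count, fs=8000, tone_s=0.04, gap_s=0.02):
    """Encode one angular count as a burst of three telephone tone pulses,
    well within the 4 kHz telephone bandwidth mentioned above."""
    pieces = []
    for digit in f"{count % 1000:03d}":          # e.g. 37 -> '0', '3', '7'
        lo, hi = DTMF[digit]
        t = np.arange(int(fs * tone_s)) / fs
        pieces.append(0.5 * (np.sin(2 * np.pi * lo * t) +
                             np.sin(2 * np.pi * hi * t)))
        pieces.append(np.zeros(int(fs * gap_s)))  # silence between digits
    return np.concatenate(pieces)
```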
  • Fig. 6 shows the output signal of the encoder 1801, transmitted via channels 1802-1804, as the input signal for the logic electronics 2001.
  • the logic electronics 2001 may be designed to trigger the tone generator 2101 in response to other input signals which indicate that the motor 1603 has reached one or more predetermined reference angles.
  • output signals from the mechanical switch 1701, shown in Fig. 2a, or signals from the photodiodes 1901, shown in Fig. 4a may serve as the input signals for the logic electronics 2001, i.e., the embodiments of Figs. 2a and 4a may be modified by adding the logic electronics 2001 and the tone generator 2101.
  • such modifications of the embodiments of Figs. 2a and 4a are illustrated in Figs. 2c and 4c, respectively.
  • the lamp 2003 is an optional component of the scanning system.
  • the lamp 2003 is utilized to enhance the illumination of the target object for taking a texture-map image.
  • the illumination provided by the lamp 2003 reduces the probability that the recorded image frames of the scanned object are too dark, or the probability that the computer processing the recorded image frames will have problems in identifying the laser stripe positioned on the scanned object from a bright background.
  • the video image frames from the video camera 1604 and the audio signals from the audio electronics 2102 may be fed directly to a computer 2205, as shown in Fig. 7, or recorded on a storage medium and processed at a later time.
  • the video signals representing the captured image frames are fed from the video camera 1604 via channel 2104 to a video capture device 2206 of the computer 2205.
  • the video capture device 2206 is a plug-in frame grabber board, while the audio capture device 2207 may be, for example, a standard sound card.
  • Video frame grabbers capable of capturing NTSC format video images with 640x480 pixel resolution at a rate of 30 frames (60 fields) per second are readily available.
  • The Meteor™ video frame grabber board manufactured by Matrox, Inc. takes advantage of the high bandwidth of the PCI bus by transferring the video data directly to the main processor/memory of the computer without buffering it on the board.
  • Since the main processor of a typical PC type computer is capable of directly processing the captured video data in order to determine the recorded laser stripe image and perform the triangulation calculation, no additional buffering of data is necessary for performing the triangulation calculation and determining the 3D coordinates representing the profile of the scanned object.
  • video image frames from the camera 1604 and audio signals representing the angular positions of the motor 1603 may be recorded on a storage medium 2302 by means of a recording device 2301.
  • the storage medium 2302 is, for example, a cassette tape.
  • a typical camcorder, which is designed to record both video and audio signals, combines the video camera 1604 and the recording device 2301.
  • camcorders manufactured by Sony have an audio input for recording
  • the audio input can be used to record audio signals generated by the audio electronics 2102.
  • the video and audio signals recorded on the storage medium 2302 may be reproduced by means of a VCR 2303, for example, and fed to the video capture device 2206 and the audio capture device 2207 of the computer 2205.
  • the computer then performs the triangulation calculation and determines the 3D coordinates representing the profile of the scanned object, as well as performing other post-scanning tasks.
  • in this manner, the video image frames and the angular positions of the motor 1603 may be recorded on an intermediate storage medium.
  • the exemplary embodiments of the scanning system according to the present invention shown in Figs. 1a-8 eliminate the need to have a computer present at the scanning site, thereby achieving practical portability and added flexibility in scanning.
  • the computer is just a passive device used to process the data obtained from the scanning process: the computer need not participate in the data acquisition process.
  • although the exemplary embodiments of the portable scanning system according to the present invention shown in Figs. 1a-8 achieve practical portability and enhanced flexibility of scanning, these embodiments of the portable scanning system do not sacrifice the quality of the resulting 3D profile of the scanned object or significantly increase the time required to process the laser-stripe image data and data reflecting the angular positions of the laser stripe.
  • the computer performs various program routines, e.g., 3D reconstruction, data simplification, mesh generation and stitching, all of which are described in U.S. Patent Application Serial Number 08/620,689 filed on March 21, 1996 by A. Migdal, M. Petrov and A. Lebedev, to generate the 3D model of the scanned object.
  • U.S. Patent Application Serial Number 08/620,689 is expressly incorporated herein by reference as if fully set forth at length herein.
  • Calibration for camera lens distortion may be accomplished by scanning a flat reference surface with a repeated and evenly spaced pattern of laser lines and capturing the image of these laser lines.
  • a frame grabber associated with a computer may be used to collect from the camera 1604 the image of a rectilinear grid pattern drawn on a flat surface. If the image of the rectilinear grid pattern is used for calibration, for example, since the lines forming the rectilinear grid are known to be straight, the deviation between the captured image of the grid lines and the actual grid lines may be used to generate a correction matrix.
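  • One simple stand-in for such a correction matrix is sketched below: a least-squares polynomial mapping from the observed grid crossings to their known ideal positions. The corner-detection step that would supply the point correspondences, and the cubic polynomial degree, are assumptions of this sketch, not the procedure of the calibration application cited above.

```python
import numpy as np

def _design_matrix(points, degree):
    x, y = points[:, 0], points[:, 1]
    cols = [x**i * y**j for i in range(degree + 1)
            for j in range(degree + 1 - i)]
    return np.stack(cols, axis=1)

def fit_correction(observed, ideal, degree=3):
    """Fit per-coordinate polynomials mapping distorted image coordinates of
    the grid crossings (observed) onto their undistorted positions (ideal).
    Both inputs are (N, 2) arrays of pixel coordinates."""
    A = _design_matrix(observed, degree)
    cx, *_ = np.linalg.lstsq(A, ideal[:, 0], rcond=None)
    cy, *_ = np.linalg.lstsq(A, ideal[:, 1], rcond=None)
    return cx, cy

def apply_correction(points, cx, cy, degree=3):
    """Undistort arbitrary image points with the fitted polynomials."""
    A = _design_matrix(np.asarray(points, dtype=np.float64), degree)
    return np.stack([A @ cx, A @ cy], axis=1)
```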
  • the focal point value d of the camera 1604 may be ascertained by one of several different calibration procedures.
  • One calibration procedure involves capturing the image of a flat reference surface on which an equilateral triangle of known dimensions is drawn.
  • each of the three vertices generates a corresponding image point on the camera image plane 1604b, and the imaginary line connecting the vertex of the triangle on the flat reference surface and the corresponding image point on the camera image plane may be expressed as a line equation.
  • the focal distance d may be ascertained.
  • the portable scanning system is calibrated for the following parameters: relative positions of the image collector and the light source (the X_L and Z_L settings); and, for each facet of the multi-faceted mirror 1601, an initial (or reference) mirror angle θ_0 corresponding to a known motor position.
  • the initial mirror angle θ_0 indicates the orientation of a given mirror facet relative to the motor position.

Abstract

A portable 3-D scanning system collects 3-D profile data of objects (101) using a combination of a laser stripe positioning device and a video camera which detects the images of the laser stripe reflected from the object. The scanning system includes a laser stripe generator (1602), a video camera (1604), a scanning mirror (1601) attached to a continuously rotating motor (1603), an encoder or photodiode (1701, 1801, 1901) operationally coupled to the motor, and associated electronics. As the scanning mirror reflects the laser stripe and positions it across the object, the encoder or photodiode generates signals indicating the angular position of the mirror. The video images of the reflected laser stripes and the angular positions of the laser stripes are synchronized by a computer to generate a 3-D model of the object.

Description

PORTABLE 3-D SCANNING SYSTEM AND METHOD FOR RAPID SHAPE DIGITIZING AND ADAPTIVE MESH GENERATION
Field of the Invention
This invention relates generally to three-dimensional ("3D") scanning and measuring systems, and relates particularly to a portable 3D scanning system and method which facilitate acquisition and storage of data relating to 3D profiles of objects for subsequent computer-aided data processing and reproduction of the 3D profiles of the objects by shape digitizing and adaptive mesh generation.
Background of the Invention
Speed, accuracy, and portability have been recurrent and difficult to achieve goals for devices that scan, measure or otherwise collect data about 3D objects for purposes such as reproduction. With the advent of computers, such devices have useful application in many fields, such as digital imaging, computer animation, topography, reconstructive and plastic surgery, dentistry, architecture, industrial design, anthropology, biology, internal medicine, milling and object production, and other fields. These computer-aided systems obtain information about an object and then transform the shape, contour, color, and other information to a useful, digitized form. The technology currently available for shape digitizing falls into two distinct but related groups: mechanical systems and optical systems. All systems within those two general categories struggle with the basic criteria of speed, accuracy, and portability in measuring and generating information about an object. A mechanical system acquires data about an object through the use of a probe that has a sensitive tip. The mechanical system scans an object by manually moving its probe tip across the object's surface and taking readings. Generally, the probe connects to a mechanical arm, and the system tracks the probe's position in space using angle measuring devices as the arm moves. The system calculates the position of the probe with coordinates known from the angle measuring devices .
Although mechanical systems scan with generally high accuracy, the rate at which a mechanical system acquires data is relatively slow and can take several hours for scanning and digitizing. A typical mechanical system measures only one point at a time and can digitize only small, solid objects.
As an alternative to mechanical systems, there are several types of optical object shape digitizers which fall into two basic categories: systems based on triangulation and alternative systems. A triangulation system projects beams of light on an object and then determines three-dimensional spatial locations for points where the light reflects from the object. Ordinarily, the reflected light bounces off the object at an angle relative to the light source. The system collects the reflection information from a location relative to the light source and then determines the coordinates of the point or points of reflection by triangulation. A single dot system projects a single beam of light which, when reflected, produces a single dot of reflection. A scan line system sends a plane of light against the object which projects on the object on a line and reflects as a curvilinear-shaped set of points describing one contour line of the object. The location of each point in that curvilinear set of points can be determined by triangulation.
Some single dot optical scanning systems use a linear reflected light position detector to read information about the object. In such systems a laser projects a dot of light upon the object. The linear reflected light position detector occupies a position relative to the laser which allows the determination of a three dimensional location for the point of reflection. A single dot optical scanner with a linear reflected light position detector can digitize only a single point at a time. Thus, a single dot optical scanning system, like the mechanical system described above, is relatively slow in collecting a full set of points to describe an object. Single dot optical scanners are typically used for applications such as industrial engineering. The digitizing speed is usually slow and is limited by the mechanics of the scanning system, i.e., the moving and positioning of the light beam. However, accuracy of these systems can be high. A scanning head can be mounted on a high-precision, but costly, positioning system to take a digitized image of the object's shape with generally good accuracy. However, because of the high cost, slow speed, and lack of flexibility, single dot optical scanners find generally only limited application.
Scan line systems offer one solution to the speed bottleneck of single point triangulation systems. Those systems typically employ a 2D imager, such as a charge-coupled device (CCD) camera, for signal detection. The systems project a light plane (i.e., a laser stripe) instead of just one dot and then read the reflection of multiple points depicting the contour of an object at a location that is a distance from the CCD camera and from which the position can be triangulated. Some embodiments of the scan line-type system attach the CCD camera to a rotating arm or a moving platform.
During scanning, either the object moves on a known path relative to the camera and laser, or the camera and laser, together, move around the object. In any case, such systems usually depend on this type of fixed rotational movement and typically use a bulky, high-precision mechanical system for positioning. Because of the use of mechanical positioning devices, rescaling flexibility can be very limited, e.g., a scanner designed for objects the size of a basketball may not be useful for scanning apple-sized objects.
Some laser stripe triangulation systems currently available are further limited because the laser stripe stays at a fixed angle relative to the camera and the system makes its calculations based on the cylindrical coordinates of its rotating platform. The mathematical simplicity in such a projection system complicates the hardware portion of these devices as they typically depend on the rotational platform mentioned. Also, the simplified geometry does not generally allow for extremely refined reproduction of topologically nontrivial objects, such as objects with holes in them (e.g., a teapot with a handle). Full realization of triangulation scanning with a non-restrictive geometry has not been achieved in the available devices. Additionally, for those optical triangulation systems employing a computer, there is the further problem of processing the incoming data. The CCD camera typically outputs frames of picture information at a rate of 30 or more frames per second. Each frame is composed of a two dimensional frame matrix of pixels and contains, for example, 640 x 480 pixel values of light intensity information. Thus, laser stripe triangulation systems must sort through many megabytes of information. These systems typically require very powerful computers and have sizeable memory requirements. In addition, they take a relatively long time to process the incoming CCD information into a viable set of points concerning the object. The points created can depict the object, but the systems that create them are also limited in that they typically do not achieve a sophisticated model of the object.
Apart from optical triangulation systems (single dot or scan line systems) , there are alternative optical scanning systems which present a scanning solution different from those employing triangulation techniques. Range meters and multi-camera systems are among those categorized as "alternative" systems. Range meter systems typically use an infrared pulsed laser and mechanical scanning techniques to project a dot laser across an object and then measure the phase delay of the reflected signal. As range meter systems typically incorporate a single dot method of data collection, they generally have the speed limitations that are intrinsic to single-point scanners. Additional accuracy problems occur because depth coordinates are not sufficiently accurate, such that in some systems, when an object is large, ghosts can appear on the scan.
Another type of alternative scanning system is a stereoscopic system which uses several CCD cameras located at known distances from each other. The captured images are processed with a pattern recognition system which maps the various points of an object captured by the cameras, thereby obtaining the shape/contour information. One advanced stereoscopic system uses 16 CCD cameras. Although each camera in such a system has a small exposure time, it takes several minutes to analyze the data for each scan. This can cause the system to delay, sometimes up to six minutes per scan. In this type of system, the device must also project a special grid on an object to obtain reference points for gluing a complete 3D picture. In addition, accuracy is sometimes a problem because stereoscopic scanning relies on light reflecting properties. The systems make assumptions based on Lambertian reflecting properties to determine resolution surface features of the scanned objects. Different surfaces can dictate different results for the same object.
With respect to the embodiments of the 3D scanning systems which utilize a computer to control the scanning parameters, e.g., angular position of the light stripe at a given instant, much of the cost of such 3D scanning systems is attributable to the feedback circuit which allows the computer to control the scanning parameters of the system. Furthermore, because a computer is needed for scanning, practical portability may be difficult to achieve. Accordingly, an ideal, portable, inexpensive 3D scanning system should minimize the components, e.g., include just a video camera and a mechanism for variably projecting a light stripe onto a target object, and preferably not require a computer for scanning. Thus, for devices that scan, measure or otherwise collect data about an object, it would be a substantial advance if a scanner could be created that could rapidly gather highly accurate data concerning a 3D object. It would also be an advance if the device could rapidly process the data in a fashion that did not require a large computing system (and allow for portable embodiments), and after computing, create a descriptive model from the data points collected about the object.
Summary of the Invention
The present invention provides a high speed, accurate and portable system and method for rapidly measuring objects and processing the shape, contour, color and other data it collects for display, graphic manipulation, model building -- such as through adaptive mesh generation -- and other uses. Because the basic information about the object is obtained in rapid fashion, the invention is particularly suited to scan and measure objects which cannot easily stay motionless, such as people or animals. The mechanical and data processing features of the present invention permit the collected data to be processed with high accuracy. The present invention provides a system for rapidly scanning an object with a geometric light shape (such as a laser stripe), recording the shape of the reflected points of light by means of an image collector (such as a camera), and, by a triangulation technique that does not depend on the fixed direction of the light source relative to the camera, reconstructing the 3D shape of the object through a computer using the data points collected from the reflection of the laser stripes. With the collected data points, a user can, inter alia, create, display and manipulate an image of the 3D object on a computer, physically reproduce the object (through computer controlled milling machines and stereolithography), compress the data for easy transmission (such as over the Internet), or use the data in graphic manipulation systems (such as in 3D video animators). The invention includes a model building algorithm which provides multi-resolution analysis and adaptive mesh generation, leading to significant compression of the digital shape information, which for most objects will be an order-of-magnitude compression. The present invention also provides embodiments which are portable (and not limited to any specific background) and can also be implemented using components readily available. A representative embodiment of the portable scanning system according to the present invention does not require a computer as part of the system because data processing contemporaneous with the data collection is obviated in this embodiment. Instead, the portable system collects 3D-profile data of objects and records the data on a storage medium. The data stored on the storage medium is subsequently processed, at any desired time, by a computer system which applies the above-mentioned data-processing routines, e.g., the model building algorithm which provides multi-resolution analysis and adaptive mesh generation. It should be noted, however, that processing of the data collected using the portable scanning system according to the present invention need not be limited to the specific data-processing routines described herein. In the portable scanning system according to the present invention, synchronization of the image-capturing device, e.g., a video camera, and the projection mechanism for variably projecting a stripe of light is not achieved at the time of actual scanning. By avoiding the synchronization issue at the time of actual scanning, the need to utilize a computer to simultaneously control the projection mechanism and the video camera is also obviated. Instead, the portable scanning system according to the present invention records on one or more data storage media both the video data and data relating to the angular positions of the light stripe during scanning.
When the recorded data is eventually processed by a computer, the angular-position data is synchronized with the video data. Several implementations of the synchronization scheme are utilized in the portable scanning system according to the present invention.
The portable scanning system according to the present invention utilizes the principle of optical triangulation. The scanning system includes a laser-stripe generator, a video camera, a scanning mirror attached to a continuously rotating motor, and associated electronics. Optionally, the scanning system may also include an encoder or a photodiode coupled to the motor and a light source for providing ambient light. The rotating, scanning mirror reflects the laser stripe and variably positions the laser stripe across a target object. During scanning, the angular positions of the scanning mirror are encoded, and the encoded data reflecting the angular positions is recorded. The camera, which is positioned at a triangulation distance from the target object, detects the laser-stripe on the object and records the video data. If the relative geometry of the components of the scanning system and the angle of the scanning mirror relative to the object are known, the computer can calculate the depth component of the 3D profile of the target object by combining the video data and the encoded data reflecting the angular positions of the scanning mirror. The present invention provides a further aspect related to rapid processing by providing a system to compress the video camera information as it is input to the computer. In an embodiment of the present invention employing a camera as an image collector, a large amount of video signal information (bytes converted from analog signals) comes to the computer memory for processing. The present invention provides a system which rapidly processes the information by compressing the image information and keeping from each image only the information about the reflected data points collected from the positioned laser stripes. Instead of storing megabytes of information, the data compression system of the present invention reduces the raw data down to only a few kilobytes depending upon the application. As data arrives, the computer applies an "on the fly" test as it first comes into a memory buffer, with the computer rejecting pixel values that fail the test. The compacted data allows for easier, speedier processing and alleviates the need for massive computing power to complete the scanning process. A currently available personal computer is all that is necessary to implement the data gathering system of the present invention.
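A minimal sketch of such an "on the fly" test is given below, assuming grayscale frames in which the laser stripe is the brightest feature in each image row; the fixed intensity threshold and the one-point-per-row policy are illustrative simplifications, not the patent's own criteria.

```python
import numpy as np

def compress_frame(frame, threshold=200):
    """Keep, for each image row, only the brightest pixel if it passes the
    intensity test; every other pixel value is rejected as it arrives, so a
    640x480 frame shrinks to at most 480 small records."""
    kept = []
    for row_index, row in enumerate(np.asarray(frame)):
        col = int(np.argmax(row))
        if row[col] >= threshold:
            kept.append((row_index, col, int(row[col])))
    return kept
```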
The present invention also provides features that allow for more accurate measurement of the object and better display of its image. One feature provided is a system to eliminate "noise" from the image information and determine the most accurate 3D spatial location of any reflection data point on the object to the degree of "subpixel accuracy". The image of the data point created by a laser stripe reflection can be three or more pixels wide, but the most intense portion of the point will be located in only a portion of a pixel. The invention refines the collected data points to subpixel accuracy by locating the most likely center of the data point according to calculation heuristics.
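As a hedged illustration of the subpixel refinement just described (the text leaves the exact heuristic open), the Python sketch below locates the stripe center within one image row as an intensity-weighted centroid. The function name and sample values are assumptions made for the example.

    import numpy as np

    def subpixel_center(intensities, start_column):
        """Estimate the stripe center within a row to subpixel accuracy.

        `intensities` holds the pixel values of a stripe cross-section a few
        pixels wide; `start_column` is the column index of its first pixel.
        An intensity-weighted centroid is one plausible heuristic.
        """
        intensities = np.asarray(intensities, dtype=float)
        offsets = np.arange(len(intensities))
        return start_column + float((offsets * intensities).sum() / intensities.sum())

    # A stripe spanning columns 317-319 with its peak at 318 yields ~318.1.
    print(subpixel_center([40, 250, 90], start_column=317))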
Another feature of the present invention in performing shape digitization is to obtain only the essential information when collecting shape data and color information about an object. In the present invention, it is not necessary in obtaining shape data to process full color information during scanning. The present invention features the ability in an exemplary embodiment to intake laser stripe scanning information about an object with an electronic camera, and to capture that information in black and white or through a single color port, such as the red. The present invention provides that color information about the object can be separately gathered apart from the collection of data points by laser stripe scanning and later matched and used with the three-dimensional data points.
Brief Description of the Drawings
Fig. la illustrates a first exemplary embodiment of the portable scanning system according to the present invention.
Fig. lb illustrates some of the geometric relationships among various components of the first exemplary embodiment of the portable scanning system according to the present invention shown in Fig. la.

Fig. 2a illustrates a second exemplary embodiment of the portable scanning system according to the present invention which includes a mechanical switch or an encoder.

Fig. 2b illustrates a modification of the second exemplary embodiment of the portable scanning system according to the present invention shown in Fig. 2a, which modification adds logic electronics and a lamp.

Fig. 2c illustrates a modification of the second exemplary embodiment of the portable scanning system according to the present invention shown in Fig. 2a, which modification adds logic electronics and a telephone tone generator or a modem encoder chip.

Fig. 3 illustrates a third exemplary embodiment of the portable scanning system according to the present invention which includes an optical encoder.
Fig. 4a illustrates a fourth exemplary embodiment of the portable scanning system according to the present invention which includes photodiodes.

Fig. 4b illustrates a modification of the fourth exemplary embodiment of the portable scanning system according to the present invention shown in Fig. 4a, which modification adds logic electronics and a lamp.
Fig. 4c illustrates a modification of the fourth exemplary embodiment of the portable scanning system according to the present invention shown in Fig. 4a, which modification adds logic electronics and a telephone tone generator or a modem encoder chip.
Fig. 5 illustrates a fifth exemplary embodiment of the portable scanning system according to the present invention which includes logic electronics and a lamp.
Fig. 6 illustrates a sixth exemplary embodiment of the portable scanning system according to the present invention which implements angular data encoding by utilizing a telephone tone generator or a modem encoder.
Fig. 7 illustrates the sixth exemplary embodiment of the portable scanning system shown in Fig. 6 connected to audio and video signal interfaces of a computer.
Fig. 8 illustrates the use of an intermediate storage medium to store 3D-profile data captured using the sixth exemplary embodiment of the portable scanning system according to the present invention shown in Fig. 6.
Detailed Description of the Invention
As shown in Fig. la, a basic embodiment of a portable scanning system according to the present invention includes a laser-stripe generator 1602, a video camera 1604, a continuously spinning motor 1603 and a scanning mirror 1601 attached to the continuously spinning motor 1603. The rotating, scanning mirror reflects the laser stripe 1606 and variably positions the laser stripe across a target object 101. The video camera, which is positioned relative to the target object at a distance which allows triangulation calculation, detects the laser stripe 1608 reflected from the object and records a sequence of image frames each containing an image of the reflected laser stripe. The combination of the scanning mirror 1601 attached to the continuously spinning motor 1603, the laser-stripe generator 1602, and the video camera 1604 is preferably mounted on a mechanical holder 1605. The mechanical holder 1605, which allows the triangulation distance between the scanning mirror 1601 and the video camera 1604 to be varied by manipulation of the adjustment mechanism 1607, may in turn be mounted on a standard camera tripod, which is not shown.
In the embodiment illustrated in Fig. la, the motor 1603 may be a simple DC/AC motor with a gear head. The motor 1603 rotates at a constant speed in the range of 2-4 revolutions per minute (rpm), for example. The angular velocity of the motor 1603 may be adjusted for varying levels of scanning quality and scanning time. High stability of the rotation speed of the motor is not required for the portable scanning system according to the present invention. Because virtually no load is applied to the motor 1603, the motor may be a miniature motor such as model 2230V012S22B sold by MicroMo Corp. of Clearwater, Florida. The model 2230V012S22B is approximately 2" deep, including the gear head, and 0.7" in diameter.
The mirror 1601 shown in Fig. la is a polygonal mirror which has, for example, 6 symmetrical facets. For each degree of angular movement of the motor 1603, the laser stripe 1606 sweeps two degrees of the field of view of the video camera 1604. Accordingly, each of the six facets of the mirror 1601 sweeps 120 degrees ((360 degrees/6) x 2 = 120 degrees) of field of view during each rotation of the motor 1603. The mirror 1601 may alternatively have 5-12 symmetrical facets. If the mirror 1601 has, for example, 8 facets, each facet will sweep 90 degrees ((360 degrees/8) x 2 = 90 degrees) of field of view during each rotation of the motor 1603. Since a typical video camera has a maximum field of view of about 60 degrees, the whole field of view of the video camera will be covered by a sweep of each facet of the mirror as long as the polygonal mirror has 12 or fewer facets. The multi-faceted, polygonal mirror 1601 allows the portable embodiments of the scanning system according to the present invention to achieve shorter "dead periods," i.e., periods when no laser stripe is positioned on the target object, during scanning in comparison to an embodiment utilizing a single, flat mirror mounted on a constant-speed motor.
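The facet arithmetic above can be summarized in a short Python sketch (illustrative only; the function name is an assumption):

    def sweep_per_facet(num_facets):
        """Field-of-view sweep (degrees) produced by one mirror facet per motor
        revolution: the stripe moves two degrees per degree of motor rotation."""
        return (360.0 / num_facets) * 2

    for n in (6, 8, 12):
        print(n, "facets ->", sweep_per_facet(n), "degrees per facet")
    # 6 -> 120, 8 -> 90, 12 -> 60; with 12 or fewer facets each sweep still
    # covers a typical 60-degree camera field of view.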
The video camera 1604 may be any one of the commercially available video cameras as long as the chosen video camera outputs sufficiently high-resolution, real-time video signals. Typical, commercially available video cameras have a resolution ranging from 320x240 pixels per image frame up to 640x480 pixels per image frame. In addition, typical video cameras have an operating speed of 30 image frames per second, which is equivalent to 60 image fields per second. A typical, commercially available video camcorder, which combines a video camera with a recording mechanism, is adapted to record the video signals on a storage medium such as a video cassette tape.
Assuming that the video camera 1604 shown in Fig. la has a resolution of 640x480 pixels and an operating speed of 60 fields per second, about 50-200 laser-stripe images should be recorded to recreate a 3D image of the scanned object. Accordingly, if one laser-stripe image is captured in each image field, the scanning time for obtaining 50-200 laser-stripe images would be approximately 1 to 3 seconds. As explained below, the time required to scan the target object will depend on the angular velocity of the motor 1603 and the desired number of laser-stripe images for a given scan. If the angular velocity of the motor is 4 rpm, for example, and the scanned object corresponds to 30 degrees within the field of view of the camera, then each of the six facets of the mirror 1601 will sweep the 30-degree field of view in (30 degrees/2) x (60 seconds/(4 x 360 degrees)) = 0.625 second. In this case, the time required to scan the object will be a function of the desired number of scan lines. If, on the other hand, the motor 1603 has an angular velocity of 2 rpm and 60 laser-stripe images are desired for scanning an object which corresponds to 30 degrees within the field of view of the camera, the video camera 1604 will record 60 image fields in 1 second, but the scanning mirror will sweep across only (30 degrees/1.25 seconds) x (1 second) = 24 degrees in 1 second. In this case, the time required to scan the target object will be dependent on the time required for the mirror to sweep 30 degrees in the camera's field of view.

With respect to Fig. lb, which illustrates some of the geometric parameters of the various components of the embodiment of the portable scanning system according to the present invention illustrated in Fig. la, if the relative geometry of the components of the scanning system, e.g., the distances XL and ZL between the origin of the coordinate system and the rotation axis of the motor 1603, the perpendicular distance R between the rotation axis and a given facet of the mirror 1601, and the angle of the scanning mirror relative to the object, i.e., the angle θ of approach of the laser stripe on the object, are known, the computer can calculate the depth component Z0 of the 3D profile of the scanned portion of the target object by performing the triangulation calculation with the two-dimensional coordinates of the scanned portion of the target object stored in the video image frames. As can be seen from Fig. lb, the three-dimensional coordinate system for the scanning system according to the present invention is centered at the focal point of the focusing lens 1604a of the camera 1604, with the Y-coordinate extending perpendicular to the plane defined by the drawing sheet. The distance d shown in Fig. lb, which may be described as the focal point location value representing the relationship between the focal point and the image plane (light collection plate) 1604b, is ascertained by a calibration procedure which is described in detail in a commonly-owned U.S. Patent Application entitled "High Accuracy Calibration for 3D Scanning and Measuring Systems" by A. Migdal, A. Lebedev, M. Petrov and A. Zhilyaev, which application is filed concurrently with the present application and expressly incorporated herein by reference. The distance R is determined by measurement, and the distances XL and ZL are determined by the calibration procedure described in the "High Accuracy Calibration for 3D Scanning and Measuring Systems" application. Although the distance ZL illustrated in Fig. lb need not be zero, ZL may be assumed to be zero for the sake of simplicity of explanation in the following discussion. Similarly, the rotational axis of the motor 1603, which need not be coincident with the Y-axis, is assumed to be coincident with the Y-axis for the sake of simplicity of explanation in the following discussion.
Continuing with the description of the scanning system in conjunction with Fig. lb, the angle θ' can be ascertained based on the image coordinate (x, y, d) on the image plane 1604b corresponding to the scanned, actual coordinate (X0, Y0, Z0). Since the distances XL, ZL and R, as well as the angle θ', are known, the Z-coordinate distance Z0 of the scanned point on the target object may be ascertained if the angle θ, which indicates the angle of approach of the laser stripe relative to the target object 101 for the scanned, actual coordinate (X0, Y0, Z0), is known. The angle θ is the angular separation between the normal vector to the given mirror facet and an axis parallel to the Z-axis of the coordinate system. Accordingly, the remaining issue in calculating the depth component of the 3D profile of the target object is how to ascertain the angle θ of the scanning mirror, i.e., the angle of approach of the laser stripe relative to the target object for a given light-stripe position as recorded in the stored video image frames. Several different approaches may be taken in addressing the issue of ascertaining the angle θ of the laser stripe relative to the target object.
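The triangulation itself can be sketched in Python under the simplifying assumptions stated above (origin at the camera focal point, image plane at distance d, and the laser stripe treated as leaving the mirror rotation axis at (XL, 0, ZL), i.e., neglecting the facet offset R). This is an illustrative reading of the geometry rather than the exact formula of the disclosure; the function name, sign conventions and numeric values are assumptions.

    from math import tan, radians

    def triangulate(x, y, d, theta_deg, XL, ZL=0.0):
        """Depth and 3D coordinates of a stripe point by intersecting the camera
        ray through image point (x, y, d) with the laser ray leaving (XL, 0, ZL)
        at angle theta_deg measured from the Z axis."""
        t = tan(radians(theta_deg))
        Z0 = (XL - ZL * t) / (x / d - t)       # camera ray X = (x/d)*Z meets laser X = XL + (Z-ZL)*tan(theta)
        return (x * Z0 / d, y * Z0 / d, Z0)    # (X0, Y0, Z0)

    # Example with purely illustrative numbers: pixel (20, 5), focal value d = 500,
    # laser source 300 units to the side, stripe approaching at -15 degrees.
    print(triangulate(20, 5, 500, -15.0, XL=300.0))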
Within the range of precision of measurement achieved with the embodiment illustrated in Fig. la, the speed of the motor 1603 may be assumed to be constant. With this assumption, the angular velocity of the motor 1603 and, in turn, the angular velocity of the scanning mirror 1601, may be calculated by analyzing the video data recorded by the video camera 1604. As the first step in calculating the angular velocity, a computer simply compares a sequence of stored video frames to find those frames in which the laser stripes are at identical positions. Based on the number of intervening frames between the two substantially identical video frames, the computer can calculate the angular velocity of the motor 1603 and the scanning mirror 1601. For example, if recorded video frames 7 and 100 are found to be substantially identical and the camera used to record these video frames is an NTSC format camera, then the elapsed time between these frames is 93 x (1/30) seconds = 3.1 seconds. Furthermore, assuming an 8-faceted polygonal mirror is utilized as the scanning mirror, the angle between two successive, identical scanning positions is 360/8 = 45 degrees. Accordingly, the rotational speed of the motor and the mirror may be calculated to be ((360/45) x 3.1 seconds) per revolution = 24.8 seconds per revolution, or (60 seconds/minute)/(24.8 seconds/revolution) = 2.42 rpm.
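The same worked example can be expressed as a small Python sketch (illustrative only; the function name is an assumption):

    def motor_rpm(frame_a, frame_b, frames_per_second=30, facets=8):
        """Rotation speed inferred from two video frames that show the stripe at
        identical positions, as in the NTSC frames 7 and 100 example above."""
        elapsed = (frame_b - frame_a) / frames_per_second   # 93 frames -> 3.1 s
        seconds_per_rev = elapsed * facets                   # one facet = 360/facets degrees
        return 60.0 / seconds_per_rev

    print(round(motor_rpm(7, 100), 2))   # -> 2.42 rpm (24.8 seconds per revolution)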
Using the calculated rotational speed of the motor 1603, the computer can calculate the angular position φ of the motor with respect to a chosen reference point, and hence the angle θ of the given mirror facet indicating the angle of approach of the laser stripe relative to the object for a given laser-stripe position on the target object, at any given time if at least one reference angular position, φ0, of the motor 1603 at a corresponding reference time, e.g., t=0, is known. Using the above example of an 8-faceted polygonal mirror having a rotational speed of 2.42 rpm, if the angular position φ of the motor was 46 degrees from a predetermined reference line at t=0 seconds, then the angular position φ of the motor at t=3 seconds will be 46 + ((3 seconds) x (360 degrees/revolution) x (2.42 revolutions/60 seconds)) = 89.56 degrees. The value of the angle θ at time t=0 may be calculated from the value of φ0 because the orientation of each facet of the mirror 1601 relative to the angular position of the motor is known from the calibration procedure described in the "High Accuracy Calibration for 3D Scanning and Measuring Systems" application incorporated herein by reference. Since the value of the angle θ at any given time may be calculated based on ω, the rotational velocity of the motor 1603, and the value of the angle θ at time t=0, the computer can associate the two-dimensional coordinate points recorded in the video image frames with the corresponding angle θ of the laser stripe to calculate the depth component of the 3D profile of the scanned portion of the target object.
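A minimal Python sketch of this angle propagation, using the numbers from the worked example above (function name assumed for illustration):

    def motor_angle(t, phi0_deg, rpm):
        """Angular position of the motor at time t (seconds), given the reference
        angle phi0 at t = 0 and a constant rotation speed in rpm."""
        return (phi0_deg + t * 360.0 * rpm / 60.0) % 360.0

    # phi0 = 46 degrees, 2.42 rpm, t = 3 s -> 89.56 degrees, as in the text.
    print(round(motor_angle(3.0, 46.0, 2.42), 2))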
With respect to the embodiment of the portable scanning system illustrated in Fig. la, it is not possible to ascertain the reference angle φ0 corresponding to time t=0 because no mechanism for indicating the angular position of the motor 1603 is provided. Accordingly, it is difficult to make absolute size measurements using the embodiment of the scanning system shown in Fig. la. However, if the main purpose of the scanning system is to obtain a 3D image of an object without obtaining absolute size measurements, then the computer may be programmed to make an estimation of the reference angle φ0 at time t=0. For example, if the scanned object is substantially symmetric, e.g., a human face, then the computer may implement an optimization program with the angle φ0 at t=0 as a parameter to find the value of φ0 for which the object looks symmetric. In addition, for certain applications, e.g., scanning human faces for subsequent incorporation of 3D images into a computer game or a page on the World Wide Web, high precision in finding the angle φ0 at t=0 may not be very important. In this case, the angle φ0 may be manually "adjusted" by means of a 3-D editor, i.e., by performing graphic manipulation.
Illustrated in Fig. 2a is another exemplary embodiment of the portable scanning system according to the present invention, which embodiment provides an angular position value φ0 for the motor 1603 at time t=0. The embodiment illustrated in Fig. 2a is substantially identical to the embodiment of Fig. la, with the addition of a pulse-generating mechanism 1701. The pulse-generating mechanism 1701, which may be a mechanical switch or a low-resolution encoder, is used to provide one or several pulses per rotation of the motor 1603.
Each pulse corresponds to a reference angle. The pulses may be fed via connection 1701a to, and recorded on, a separate recording medium, which is not shown, and subsequently processed with the recorded video image frames containing laser-stripe images. For example, the video camera may be adapted to include digital memory for the purpose of storing the encoder pulses. Alternatively, the pulses, along with the video data, may be fed directly to a computer. When the pulses and the video data are processed by the computer, the computer zeroes the time counter at each occurrence of the pulse, thereby synchronizing the recorded video image frames with corresponding angular positions of the motor 1603.
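For illustration (not part of the original disclosure), the following Python sketch shows the synchronization step just described: the time counter is zeroed at each reference pulse, and each video-frame timestamp is mapped to a motor angle. Function names and the sample values are assumptions.

    def angles_for_frames(frame_times, pulse_times, rpm, pulse_angle_deg=0.0):
        """Associate each frame timestamp with a motor angle by zeroing the time
        counter at the most recent reference pulse; `pulse_angle_deg` is the
        known angular position that the pulse corresponds to."""
        deg_per_sec = 360.0 * rpm / 60.0
        angles = []
        for t in frame_times:
            last_pulse = max((p for p in pulse_times if p <= t), default=pulse_times[0])
            angles.append((pulse_angle_deg + (t - last_pulse) * deg_per_sec) % 360.0)
        return angles

    # Frames at 1/30 s intervals, one reference pulse per revolution at t = 0, 2.42 rpm.
    frames = [i / 30.0 for i in range(5)]
    print(angles_for_frames(frames, pulse_times=[0.0], rpm=2.42))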
Shown in Fig. 3 is yet another exemplary embodiment of the portable scanning system according to the present invention, which embodiment provides an angular position φ0 for the motor 1603 at time t=0, as well as at regular time intervals thereafter, by means of an optical encoder 1801. With the exception of the optical encoder 1801, the embodiment of Fig. 3 is substantially identical to the embodiment shown in Fig. la. The optical encoder 1801 includes a disk ("code wheel") with holes which is attached to the shaft of the motor 1603, one or more light emitting diodes (LEDs), a set of photodetectors and a signal-processing circuit. As the disk rotates with the shaft of the motor 1603, light emitted from the set of LEDs travels through the holes in the disk and is detected by the photodetectors. The signals from the photodetectors are processed by the signal-processing circuit, and the resulting output signals indicating the angular position φ of the motor 1603 are produced on channels 1802 and 1803, for example. The output signals on channels 1802 and 1803 are square waveforms representing binary numbers; the output signal on channel 1802 is 90 degrees out of phase with the output signal on channel 1803. Exemplary, low-cost optical encoders which may be incorporated in the exemplary embodiment of Fig. 3 are the HEDS/HEDM series of optical encoders made by Hewlett-Packard Co. These encoders provide 360-1024 counts per revolution, as reflected by output signals on the channels 1802 and 1803. HEDS-5540/5640 encoders generate three-channel digital output, with the output signal on channel 1804 being an index pulse P0 which is generated once for each full rotation of the motor 1603. The output signals on channels 1802-1804 may be recorded on a recording medium or fed directly to a computer. By interpolating between adjacent, recorded counts, the computer can calculate the angular position φ of the motor 1603 at any given time with high precision. In this manner, the angular positions of the motor 1603 corresponding to the recorded video image frames may be ascertained.
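As a hedged sketch of the interpolation between recorded encoder counts described above (the function name and sample timing are assumptions), in Python:

    def interpolate_angle(t, count_times, counts_per_rev=360):
        """Motor angle at time t by linear interpolation between the two nearest
        recorded encoder counts (the index pulse is taken as the zero reference).
        count_times[i] is the time at which encoder count i was recorded."""
        for i in range(1, len(count_times)):
            if count_times[i] >= t:
                frac = (t - count_times[i - 1]) / (count_times[i] - count_times[i - 1])
                return (i - 1 + frac) * 360.0 / counts_per_rev
        raise ValueError("time lies outside the recorded counts")

    # 360 counts per revolution at 2.42 rpm -> one count roughly every 0.069 s.
    dt = 60.0 / (2.42 * 360)
    print(round(interpolate_angle(0.1, [i * dt for i in range(10)]), 3))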
As shown in Fig. 4a, yet another exemplary embodiment of the portable scanning system according to the present invention utilizes photodiodes for the purpose of facilitating eventual synchronization between the laser-stripe images recorded as video data and the corresponding angular positions of the motor 1603. The embodiment of Fig. 4a is substantially similar to the embodiment shown in Fig. la, with the addition of two photodiodes 1901, each located at a known position relative to the scanning mirror 1601. Alternatively, a single photodiode may be used. As the mirror 1601 rotates, the photodiodes 1901 detect the laser stripe 1606 reflected from the mirror 1601, in response to which the photodiodes 1901 generate detection signals. For each photodiode 1901, the detection signal indicates orientation of the scanning mirror 1601 at the particular known angle relative to the photodiode. The detection signals may be recorded on a recording medium or fed directly to a computer. By combining the recorded laser-stripe images with the time sequence of corresponding detection signals, the angle θ of the scanning mirror 1601 relative to the target object corresponding to a given laser-stripe image may be ascertained.

Still another exemplary embodiment of the portable scanning system according to the present invention, shown in Fig. 5, is substantially similar to the embodiment of Fig. 3, with the addition of logic electronics 2001, a relay 2002 and a lamp 2003. The logic electronics 2001 triggers the lamp 2003 in response to the output signal of the encoder 1801 when the motor 1603 reaches one or more predetermined reference angle(s). Since the lamp 2003 turns on only at specific angular positions of the motor 1603, the angular position φ of the motor corresponding to a given image frame may be ascertained by comparing the total brightness of adjacent recorded image frames, i.e., determining when the lamp 2003 is turned on. Based on the angular position of the motor corresponding to an image frame in which the lamp 2003 was turned on, the computer can then calculate the angular position φ of the motor for any other recorded image frame.
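For illustration of the brightness comparison just described (not part of the original disclosure; the function name and the jump factor are assumptions), a Python sketch:

    import numpy as np

    def reference_frame_indices(frames, jump_factor=1.5):
        """Find frames in which the lamp was on by comparing the total brightness
        of adjacent frames; `jump_factor` is a hypothetical threshold on the
        ratio of successive frame totals."""
        totals = [float(np.asarray(f, dtype=float).sum()) for f in frames]
        return [i for i in range(1, len(totals))
                if totals[i] > jump_factor * totals[i - 1]]

    dark = np.full((480, 640), 20, dtype=np.uint8)
    bright = np.full((480, 640), 60, dtype=np.uint8)
    print(reference_frame_indices([dark, dark, bright, dark]))   # -> [2]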
Although Fig. 5 shows the output signal of the encoder 1801, transmitted via channels 1802-1804, as the input signal for the logic electronics 2001, the logic electronics 2001 may be designed to trigger the lamp 2003 in response to other input signals which indicate that the motor 1603 has reached a predetermined reference angle. For example, output signals from the mechanical switch 1701, shown in Fig. 2a, or signals from the photodiodes 1901, shown in Fig. 4a, may serve as the input signals for the logic electronics 2001, i.e., the embodiments of Figs. 2a and 4a may be modified by adding the logic electronics 2001, the relay 2002 and the lamp 2003. These alternative embodiments are illustrated in Figs. 2b and 4b. In the exemplary embodiment of Fig. 5, the lamp 2003 also serves the function of providing light for taking a texture-map image, in addition to the function of generating information about the angular position of the motor. The lamp 2003 is useful for the purpose of taking a texture-map image because increased uniformity of illumination of the scanned object is desirable. Meanwhile, because the dynamic range of commercially available video cameras is limited, i.e., 256 gradations of gray for a typical low-cost camera, reduced background illumination relative to the object illumination usually results in a more efficient 3D scanning process.
Shown in Fig. 6 is yet another exemplary embodiment of the portable scanning system according to the present invention, which system records the angular positions of the motor 1603 preferably in audio format. The exemplary embodiment of Fig. 6 includes substantially similar components as the exemplary embodiment of Fig. 5, with the addition of an audio-signal generator 2101. The angular positions of the motor 1603 may be encoded into audio-type electrical signals by the audio-signal generator 2101, which may be a telephone tone generator or a modem encoder chip. For example, the telephone tone generator, used in modern touch-tone telephones, generates audio-type electrical signals by mixing two electrical signals of different frequencies. The bandwidth of the resultant signal falls well within the telephone signal bandwidth of 4 kHz.
As shown in Fig. 6, the logic electronics 2001 triggers the telephone tone generator or the modem encoder chip 2101 in response to the output of the optical encoder 1801 transmitted via output channels 1802-1804, such that every count generated by the optical encoder 1801 is represented by a sequence of telephone tone pulses. The logic electronics 2001 and the telephone tone generator (or the modem encoder chip) 2101 may be viewed as components of audio electronics 2102. The logic electronics 2001 includes a quartz generator for triggering the tone generator or the modem encoder chip 2101. Assuming the optical encoder 1801 generates 360 counts per revolution, for example, then the modem encoder chip 2101 needs to encode 360 different numbers, which means using as many as three bits of the modem encoder chip 2101 for encoding the value of a single angle. The quartz generator provides timing signal for outputting the multiple bits of the modem encoder chip
2101. Assuming the motor 1603 has an angular velocity of 2-4 rpm and the optical encoder 1801 generates 360 counts per revolution, any commonly used telephone tone generator will be sufficient for the purposes of encoding angular positions of the motor 1603.
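For illustration only (not part of the original disclosure), the Python sketch below shows how an encoder count could be represented as a burst of standard DTMF telephone tones, one tone per decimal digit. The tone length, sample rate and function name are assumptions; the frequency pairs are the standard DTMF keypad frequencies.

    import numpy as np

    # Standard DTMF keypad frequencies (Hz): each digit mixes one low and one high tone.
    DTMF = {"1": (697, 1209), "2": (697, 1336), "3": (697, 1477),
            "4": (770, 1209), "5": (770, 1336), "6": (770, 1477),
            "7": (852, 1209), "8": (852, 1336), "9": (852, 1477),
            "0": (941, 1336)}

    def encode_count(count, sample_rate=8000, tone_seconds=0.05):
        """Encode an encoder count (0-359) as a sequence of telephone tones,
        one tone per decimal digit of the count."""
        t = np.arange(int(sample_rate * tone_seconds)) / sample_rate
        bursts = []
        for digit in f"{count:03d}":
            lo, hi = DTMF[digit]
            bursts.append(0.5 * (np.sin(2 * np.pi * lo * t) + np.sin(2 * np.pi * hi * t)))
        return np.concatenate(bursts)

    print(encode_count(127).shape)   # three 50 ms tones at 8 kHz -> (1200,)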
Although Fig. 6 shows the output signal of the encoder 1801, transmitted via channels 1802-1804, as the input signal for the logic electronics 2001, the logic electronics 2001 may be designed to trigger the tone generator 2101 in response to other input signals which indicate that the motor 1603 has reached one or more predetermined reference angles. For example, output signals from the mechanical switch 1701, shown in Fig. 2a, or signals from the photodiodes 1901, shown in Fig. 4a, may serve as the input signals for the logic electronics 2001, i.e., the embodiments of Figs. 2a and 4a may be modified by adding the logic electronics 2001 and the tone generator 2101. These alternative embodiments are illustrated in Figs. 2c and 4c. In the exemplary embodiment shown in Fig. 6, as well as in the embodiments shown in Figs. 2c and 4c, the lamp 2003 is an optional component of the scanning system. In these embodiments, the lamp 2003 is utilized to enhance the illumination of the target object for taking a texture-map image. The illumination provided by the lamp 2003 reduces the probability that the recorded image frames of the scanned object are too dark, or the probability that the computer processing the recorded image frames will have problems in identifying the laser stripe positioned on the scanned object from a bright background.
The video image frames from the video camera 1604 and the audio signals from the audio electronics 2102 may be fed directly to a computer 2205, as shown in Fig. 7, or recorded on a storage medium and processed later by a computer, as shown in Fig. 8. As shown in Fig. 7, the video signals representing the captured image frames are fed from the video camera 1604 via channel 2104 to a video capture device 2206 of the computer 2205. Similarly, the audio signals representing the angular positions of the motor 1603 are fed from the audio electronics 2102 via channel 2103 to an audio capture device 2207 of the computer 2205. For a PC type computer, the video capture device 2206 is a plug-in frame grabber board, while the audio capture device 2207 is an input terminal of a sound board.
For modern PC type computers having a PCI type internal bus, video frame grabbers capable of capturing NTSC format video images with 640x480 pixels resolution at a rate of 30 frames (60 fields) per second are readily available. For example, the Meteor™ video frame grabber board manufactured by Matrox, Inc. takes advantage of the high bandwidth of the PCI bus by transferring the video data directly to the main processor/memory of the computer without buffering it on the board. Since the main processor of a typical PC type computer is capable of directly processing the captured video data in order to determine the recorded laser-stripe image and perform the triangulation calculation, no additional buffering of data is necessary for performing the triangulation calculation and determining the 3D coordinates representing the profile of the scanned object.

Because the sound boards of modern PC type computers typically have a 16-bit resolution and a bandwidth much higher than the bandwidth of the telephone tone generator incorporated in the audio electronics 2102, such sound boards are very well suited for handling audio data representing the angular positions of the motor 1603. Furthermore, due to the limited bandwidth of the audio signals, these sound boards consume just a small fraction of the bus and processor time.
As shown in Fig. 8, video image frames from the camera 1604 and audio signals representing the angular positions of the motor 1603 may be recorded on a storage medium 2302 by means of a recording device 2301. In Fig. 8, the storage medium 2302 is, for example, a cassette tape for recording both video and audio signals. A typical camcorder, which is designed to record both video and audio signals, combines the video camera 1604 and the recording device 2301. For example, camcorders manufactured by Sony have an audio input for recording audio signals from an auxiliary microphone. The audio input can be used to record audio signals generated by the audio electronics 2102. The video and audio signals recorded on the storage medium 2302 may be reproduced by means of a VCR 2303, for example, and fed to the video capture device 2206 and the audio capture device 2207, respectively, of the computer 2205. The computer then performs the triangulation calculation and determines the 3D coordinates representing the profile of the scanned object, as well as performing other post-scanning tasks, e.g., data simplification and adaptive-mesh generation.

It should be emphasized that, for each of the embodiments illustrated in Figs. la-8, video image frames containing the images of laser stripes reflected from the scanned object and signals representing the angular positions of the motor 1603 may be recorded on an intermediate storage medium. By utilizing the intermediate storage medium, the exemplary embodiments of the scanning system according to the present invention shown in Figs. la-8 eliminate the need to have a computer present at the scanning site, thereby achieving practical portability and added flexibility in scanning. For the exemplary embodiments of the portable scanning system according to the present invention, the computer is just a passive device used to process the data obtained from the scanning process: the computer need not participate in the data acquisition process.
Although the exemplary embodiments of the portable scanning system according to the present invention shown in Figs. la-8 achieve practical portability and enhanced flexibility of scanning, these embodiments of the portable scanning system do not sacrifice the quality of the resulting 3D profile of the scanned object or significantly increase the time required to process the laser-stripe image data and the data reflecting the angular positions of the laser stripe. Once the angular positions of the laser stripe corresponding to the recorded laser-stripe images have been ascertained from the recorded audio and/or video data, the computer performs various program routines, e.g., 3D reconstruction, data simplification, mesh generation and stitching, all of which are described in U.S. Patent Application Serial Number 08/620,689 filed on March 21, 1996 by A. Migdal, M. Petrov and A. Lebedev, to generate the 3D model of the scanned object. U.S. Patent Application Serial Number 08/620,689 is expressly incorporated herein by reference as if fully set forth at length herein.
For each of the exemplary embodiments of the portable scanning system illustrated in Figs. la-8, the calibration procedures for setting the initial scanning parameters are described in detail in the "High Accuracy Calibration for 3D Scanning and Measuring Systems" application, which is filed concurrently with the present application and has been expressly incorporated herein by reference. As previously mentioned, the calibration procedures described in the "High Accuracy Calibration for 3D Scanning and Measuring Systems" application include, among others, calibrations for camera lens distortion and the focal point value d. These calibration procedures are briefly discussed below.
Calibration for camera lens distortion may be accomplished by scanning a flat reference surface with a repeated and evenly spaced pattern of laser lines and capturing the image of these laser lines. Alternatively, a frame grabber associated with a computer may be used to collect from the camera 1604 the image of a rectilinear grid pattern drawn on a flat surface. If the image of the rectilinear grid pattern is used for calibration, for example, since the lines forming the rectilinear grid are known to be straight, the deviation between the captured image of the grid lines and the actual grid lines may be used to generate a correction matrix.
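One simple way to tabulate such a correction (purely illustrative; the function name and sample coordinates are assumptions, and a full correction matrix would cover the whole image) is sketched below in Python:

    import numpy as np

    def distortion_corrections(observed, ideal):
        """Per-point correction vectors for lens distortion: the difference between
        where each grid intersection should appear (from the known straight grid)
        and where the camera actually imaged it."""
        return np.asarray(ideal, dtype=float) - np.asarray(observed, dtype=float)

    # Two grid intersections imaged slightly away from their true positions.
    observed = [(100.4, 99.7), (199.1, 100.2)]
    ideal = [(100.0, 100.0), (200.0, 100.0)]
    print(distortion_corrections(observed, ideal))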
Referring to Fig. lb, the focal point value d of the camera 1604 may be ascertained by one of several different calibration procedures. One calibration procedure involves capturing the image of a flat reference surface on which an equilateral triangle of known dimensions is drawn. When the image of the triangle is captured, each of the three vertices generates a corresponding image point on the camera image plane 1604b, and the imaginary line connecting the vertex of the triangle on the flat reference surface and the corresponding image point on the camera image plane may be expressed as a line equation. By simultaneously solving the three line equations corresponding to the three vertices of the equilateral triangle, the focal distance d may be ascertained. In addition to the calibration for the camera lens distortion and the camera focal distance d shown in Fig. lb, the portable scanning system according to the present invention is calibrated for the following parameters: relative positions of the image collector and the light source (the XL and ZL settings); and, for each facet of the multi-faceted mirror 1601, an initial (or reference) mirror angle θ0 corresponding to a known motor position. The initial mirror angle θ0 indicates the orientation of a given mirror facet relative to the motor position. The calibration procedures for the XL, ZL and θ0 parameters are described in detail in the "High Accuracy Calibration for 3D Scanning and Measuring Systems" application.

While the present invention has been described in connection with specific embodiments, it should be understood that these embodiments are exemplary in nature and not to be construed as limiting the scope of protection for the invention as set forth in the appended claims, as certain changes may be made to the exemplary embodiments without departing from the clear teachings of the present invention. For example, although the preferred embodiments utilize the multi-faceted mirror 1601 to reflect and variably position the laser stripe across a target object, other laser-stripe positioning devices such as a holographic disc may be utilized. Accordingly, reference should be made to the following claims which alone define the invention.

Claims

WE CLAIM: 1. A system for determining a three-dimensional profile of an object comprising: a light-source unit for projecting a stripe of light onto said object and creating a luminous contour line at an intersection of said stripe of light and said object, said light-source unit being adapted to variably position said stripe of light onto said object; an image-detecting device for detecting a sequence of images each containing at least one image of luminous contour line, said image-detecting device producing for each image signals representative of said each image, including signals representative of two-dimensional coordinates of detected contour line; and an angle indicator for generating a signal related to an angle of approach of said stripe of light onto said object for at least one position of said stripe of light on said object.
2. The system according to claim 1 wherein said light-source unit comprises a light-stripe generator and a light-stripe positioner for variably deflecting said stripe of light onto said object.
3. The system according to claim 2 wherein said light-stripe positioner comprises a mirror rotatingly driven by a motor.
4. The system according to claim 3 wherein said image-capturing device is located relative to said rotating mirror at a known position, and wherein depth coordinates corresponding to said two-dimensional coordinates of said detected contour line for each image are calculable by triangulation based on an angle of approach of said stripe of light onto said object corresponding to said detected contour line, whereby a plurality of three-dimensional coordinates representative of said three-dimensional profile of said object are determined.
5. The system according to claim 4 wherein said signals representative of said each image and said signal related to said angle of approach of said stripe of light onto said object for said at least one position of said stripe of light on said object are each stored on a data storage medium, and wherein said stored signals are subsequently processed by a data-processing apparatus to determine said plurality of three-dimensional coordinates representative of said three-dimensional profile of said object.
6. The system according to claim 5 wherein said rotating mirror comprises a multi-faceted mirror rotatingly driven by said motor at a known angular velocity, and wherein said angle of approach of said stripe of light onto said object corresponding to each detected contour line is calculable based on said known angular velocity of said motor and said stored signal related to said angle of approach of said stripe of light onto said object for said at least one position of said stripe of light on said object.
7. The system according to claim 6 wherein said angle indicator comprises a pulse-generator for providing at least one pulse per one rotation of said motor, said at least one pulse corresponding to a known angular position of said motor, and wherein an angle of approach of said stripe of light onto said object at any given time is calculable based on said known angular position of said motor and time of occurrence of said at least one pulse corresponding to said known angular position of said motor.
8. The system according to claim 7 further comprising: a lamp operationally coupled to said at least one pulse corresponding to said known angular position of said motor, said lamp being turned on during a finite period of time corresponding to said occurrence of said at least one pulse corresponding to said known angular position of said motor; wherein time of occurrence of said at least one pulse corresponding to said known angular position of said motor is ascertainable based on relative brightness of said detected sequence of images as indicated by said stored signals representative of said detected sequence of images.
9. The system according to claim 7 further comprising: an audio-signal generator operationally coupled to said at least one pulse corresponding to said known angular position of said motor, said audio-signal generator generating an audio signal indicative of time of occurrence of said at least one pulse corresponding to said known angular position of said motor; wherein said time of occurrence of said at least one pulse corresponding to said known angular position of said motor is ascertainable based on said audio signal.
10. The system according to claim 9 wherein said audio-signal generator comprises a telephone tone generator.
11. The system according to claim 9 wherein said audio-signal generator comprises a modem encoder chip.
12. The system according to claim 6 wherein said angle indicator comprises an optical encoder for providing a plurality of signals per one rotation of said motor, said plurality of signals corresponding to known angular positions of said motor, and wherein an angle of approach of said stripe of light onto said object at any given time is calculable based on said known angular positions of said motor and respective times of occurrence of said plurality of signals corresponding to said known angular positions of said motor.
13. The system according to claim 12 further comprising: a lamp operationally coupled to said plurality of signals corresponding to said known angular positions of said motor, said lamp being turned on during a finite period of time corresponding to each occurrence of said plurality of signals corresponding to said known angular positions of said motor; wherein respective times of occurrence of said plurality of signals corresponding to said known angular positions of said motor are ascertainable based on relative brightness of said detected sequence of images as indicated by said stored signals representative of said detected sequence of images.
14. The system according to claim 12 further comprising: an audio-signal generator operationally coupled to said plurality of signals corresponding to said known angular positions of said motor, said audio-signal generator generating a plurality of audio signals corresponding to said plurality of signals corresponding to said known angular positions of said motor; wherein respective times of occurrence of said plurality of signals corresponding to said known angular positions of said motor are ascertainable based on said audio signals.
15. The system according to claim 14 wherein said audio-signal generator comprises a telephone tone generator.
16. The system according to claim 14 wherein said audio-signal generator comprises a digital encoder.
17. The system according to claim 2 wherein said light-stripe positioner comprises a holographic disc rotatingly driven by a motor.
18. The system according to claim 17 wherein said image-capturing device is located relative to said rotating holographic disc at a known position, and wherein depth coordinates corresponding to said two- dimensional coordinates of said detected contour line for each image are calculable by triangulation based on an angle of approach of said stripe of light onto said object corresponding to said detected contour line, whereby a plurality of three-dimensional coordinates representative of said three-dimensional profile of said object are determined.
19. The system according to claim 18 wherein said signals representative of said each image and said signal related to said angle of approach of said stripe of light onto said object for said at least one position of said stripe of light on said object are each stored on a data storage medium, and wherein said stored signals are subsequently processed by a data-processing apparatus to determine said plurality of three-dimensional coordinates representative of said three-dimensional profile of said object.
20. The system according to claim 19 wherein said holographic disc is rotatingly driven by said motor at a known angular velocity, and wherein said angle of approach of said stripe of light onto said object corresponding to each detected contour line is calculable based on said known angular velocity of said motor and said stored signal related to said angle of approach of said stripe of light onto said object for said at least one position of said stripe of light on said object.
21. The system according to claim 20 wherein said angle indicator comprises a pulse-generator for providing at least one pulse per one rotation of said motor, said at least one pulse corresponding to a known angular position of said motor, and wherein an angle of approach of said stripe of light onto said object at any given time is calculable based on said known angular position of said motor and time of occurrence of said at least one pulse corresponding to said known angular position of said motor.
22. The system according to claim 21 further comprising: a lamp operationally coupled to said at least one pulse corresponding to said known angular position of said motor, said lamp being turned on during a finite period of time corresponding to said occurrence of said at least one pulse corresponding to said known angular position of said motor; wherein time of occurrence of said at least one pulse corresponding to said known angular position of said motor is ascertainable based on relative brightness of said detected sequence of images as indicated by said stored signals representative of said detected sequence of images.
23. The system according to claim 21 further comprising: an audio-signal generator operationally coupled to said at least one pulse corresponding to said known angular position of said motor, said audio-signal generator generating an audio signal indicative of time of occurrence of said at least one pulse corresponding to said known angular position of said motor; wherein said time of occurrence of said at least one pulse corresponding to said known angular position of said motor is ascertainable based on said audio signal.
24. The system according to claim 23 wherein said audio-signal generator comprises a telephone tone generator.
25. The system according to claim 23 wherein said audio-signal generator comprises a digital encoder.
26. The system according to claim 20 wherein said angle indicator comprises an optical encoder for providing a plurality of signals per one rotation of said motor, said plurality of signals corresponding to known angular positions of said motor, and wherein an angle of approach of said stripe of light onto said object at any given time is calculable based on said known angular positions of said motor and respective times of occurrence of said plurality of signals corresponding to said known angular positions of said motor.
27. The system according to claim 26 further comprising: a lamp operationally coupled to said plurality of signals corresponding to said known angular positions of said motor, said lamp being turned on during a finite period of time corresponding to each occurrence of said plurality of signals corresponding to said known angular positions of said motor; wherein respective times of occurrence of said plurality of signals corresponding to said known angular positions of said motor are ascertainable based on relative brightness of said detected sequence of images as indicated by said stored signals representative of said detected sequence of images.
28. The system according to claim 26 further comprising: an audio-signal generator operationally coupled to said plurality of signals corresponding to said known angular positions of said motor, said audio-signal generator generating a plurality of audio signals corresponding to said plurality of signals corresponding to said known angular positions of said motor; wherein respective times of occurrence of said plurality of signals corresponding to said known angular positions of said motor are ascertainable based on said audio signals.
29. The system according to claim 28 wherein said audio-signal generator comprises a telephone tone generator.
30. The system according to claim 28 wherein said audio-signal generator comprises a digital encoder.
31. The system according to claim 6 wherein said angle indicator comprises a photodiode for providing at least one signal per one rotation of said motor, said at least one signal corresponding to a known angular position of said motor, and wherein an angle of approach of said stripe of light onto said object at any given time is calculable based on said known angular position of said motor and time of occurrence of said at least one signal corresponding to said known angular position of said motor.
32. The system according to claim 31 further comprising: a lamp operationally coupled to said at least one signal corresponding to said known angular position of said motor, said lamp being turned on during a finite period of time corresponding to said occurrence of said at least one signal corresponding to said known angular position of said motor; wherein time of occurrence of said at least one signal corresponding to said known angular position of said motor is ascertainable based on relative brightness of said detected sequence of images as indicated by said stored signals representative of said detected sequence of images.
33. The system according to claim 31 further comprising: an audio-signal generator operationally coupled to said at least one signal corresponding to said known angular position of said motor, said audio-signal generator generating an audio signal indicative of time of occurrence of said at least one signal corresponding to said known angular position of said motor; wherein said time of occurrence of said at least one signal corresponding to said known angular position of said motor is ascertainable based on said audio signal.
34. The system according to claim 33 wherein said audio-signal generator comprises a telephone tone generator.
35. The system according to claim 33 wherein said audio-signal generator comprises a digital encoder.
36. The system according to claim 20 wherein said angle indicator comprises a photodiode for providing at least one signal per one rotation of said motor, said at least one signal corresponding to a known angular position of said motor, and wherein an angle of approach of said stripe of light onto said object at any given time is calculable based on said known angular position of said motor and time of occurrence of said at least one signal corresponding to said known angular position of said motor.
37. The system according to claim 36 further comprising: a lamp operationally coupled to said at least one signal corresponding to said known angular position of said motor, said lamp being turned on during a finite period of time corresponding to said occurrence of said at least one signal corresponding to said known angular position of said motor; wherein time of occurrence of said at least one signal corresponding to said known angular position of said motor is ascertainable based on relative brightness of said detected sequence of images as indicated by said stored signals representative of said detected sequence of images.
38. The system according to claim 36 further comprising: an audio-signal generator operationally coupled to said at least one signal corresponding to said known angular position of said motor, said audio-signal generator generating an audio signal indicative of time of occurrence of said at least one signal corresponding to said known angular position of said motor; wherein said time of occurrence of said at least one signal corresponding to said known angular position of said motor is ascertainable based on said audio signal.
39. The system according to claim 38 wherein said audio-signal generator comprises a telephone tone generator.
40. The system according to claim 38 wherein said audio-signal generator comprises a digital encoder.
41. A method of determining a three-dimensional profile of an object comprising: sequentially positioning a stripe of light at a plurality of locations on said object to create a corresponding sequence of positions of a luminous contour line, said luminous contour line being generated at an intersection of said stripe of light and said object; detecting by means of an image-detecting device a sequence of images of said luminous contour line corresponding to said sequence of positions of said luminous contour line; for each of said sequence of images of said luminous contour line, generating signals representative of said each image; storing said signals representative of said sequence of images of said luminous contour line; generating a signal related to an angle of approach of said stripe of light onto said object for at least one position of said luminous contour line on said object; storing said signal related to said angle of approach of said stripe of light onto said object for said at least one position of said luminous contour line; calculating a plurality of three-dimensional coordinates representative of said three-dimensional profile of said object based on said stored signals representative of said sequence of images of said luminous contour line and said stored signal related to said angle of approach of said stripe of light onto said object for said at least one position of said luminous contour line.
42. The method according to claim 41, wherein said step of calculating said plurality of three- dimensional coordinates representative of said three- dimensional profile of said object comprises: determining, based on said angle of approach of said stripe of light onto said object for said at least one position of said luminous contour line, an angle of approach of said stripe of light onto said object for each of said sequence of positions of said luminous contour line as indicated by said stored signals representative of said sequence of images of said luminous contour line; and for each of said sequence of positions of said luminous contour line, calculating by triangulation depth coordinates corresponding to two-dimensional coordinates of said luminous contour line as indicated by said stored signals representative of said image of said luminous contour line.
43. The method according to claim 42 wherein said signal related to said angle of approach of said stripe of light onto said object for said at least one position of said luminous contour line comprises an electrical pulse.
44. The method according to claim 42, wherein said signal related to said angle of approach of said stripe of light onto said object for said at least one position of said luminous contour line comprises an intensity level of ambient light illuminating said object.
45. The method according to claim 42, wherein said signal related to said angle of approach of said stripe of light onto said object for said at least one position of said luminous contour line comprises an audio signal.
46. A method of obtaining data representative of a 3-D shape of an object, which comprises: generating a plurality of luminous contour lines of said object by projecting a beam of light onto said object at a corresponding plurality of angles of approach relative to said object; detecting a sequence of images of said object, each image containing a luminous contour line of said object associated with a corresponding one of the plurality of angles of approach of said beam of light relative to said object; recording said sequence of images of said object; and recording at least one signal related to the one of the plurality of angles of approach of said beam of light relative to said object.
47. The method according to claim 46, wherein said step of recording at least one signal related to the one of the plurality of angles of approach of said beam of light relative to said object occurs during said recording of a sequence of images of said object.
48. The method according to claim 47, wherein time of said recording of said at least one signal related to the one of the plurality of angles of approach of said beam of light is calculable relative to time of recording of at least one image recorded during said recording of said sequence of images of said object.
49. The method according to claim 48, wherein each recorded image of said object is in the form of recorded signals comprising pixel information representative of the recorded image, the pixel information for each recorded image comprising information for each of a plurality of individual pixels.
50. The method according to claim 49, wherein said at least one signal related to the one of the plurality of angles of approach of said beam of light relative to said object comprises an audio signal.
51. A portable system for obtaining data representative of a 3-D shape of an object, which comprises: an illumination source for projecting a beam of light onto said object at a plurality of angles of approach relative to said object; a photographic detecting device for detecting a sequence of images of said object, each image containing a luminous contour line of said object associated with a corresponding angle of approach of said beam of light relative to said object; an angle indicator for generating a signal related to one of the plurality of angles of approach of said beam of light relative to said object; and a storage device for storing data representative of said sequence of images of said object and said signal related to one of the plurality of angles of approach of said beam of light relative to said object.
52. The system according to claim 51, wherein said photographic detecting device generates electrical signals comprising pixel information representative of the detected image of said object and said luminous contour line, the pixel information for each image comprising information for each of a plurality of individual pixels, and wherein said storage device stores said electrical signals comprising pixel information.
53. The system according to claim 52 further comprising a mechanical holder, wherein said illumination source and said photographic detecting device are coupled to said mechanical holder, and wherein relative spacing between said illumination source and said photographic detecting device is adjustable.
54. The system according to claim 53, wherein said angle indicator is an audio-signal generator.
55. The system according to claim 54, wherein said photographic detecting device comprises a video camera, and wherein said video camera is adjustably and detachably coupled to said mechanical holder.
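Claims 51-55 describe the portable system as four cooperating parts: an illumination source, a photographic detecting device (for example a video camera adjustably coupled to a mechanical holder), an angle indicator such as an audio-signal generator, and a storage device. A minimal sketch of the data such a storage device might hold appears below; every field name is an illustrative assumption, not a term from the claims.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Frame:
    """One detected image: its capture time plus per-pixel information."""
    timestamp_s: float
    pixels: bytes  # packed pixel information for the whole image

@dataclass
class ScanSession:
    """Illustrative model of the data stored by the portable system."""
    frames: List[Frame] = field(default_factory=list)
    audio_track: List[float] = field(default_factory=list)  # angle-indicator signal
    audio_rate_hz: int = 44100
    camera_to_source_spacing_m: float = 0.3  # adjustable on the mechanical holder
```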
56. A method of using a portable scanning system for measuring 3-D shape of an object, said portable system comprising an illumination source, a photographic detecting device, an angle indicator and a data storage device, which method comprises:
generating a plurality of luminous contour lines of said object by using said illumination source to project a beam of light onto said object at a corresponding plurality of angles of approach relative to said object;
detecting a sequence of images of said object, each image containing a luminous contour line of said object associated with a corresponding angle of approach of said beam of light relative to said object;
recording said sequence of images of said object; and
recording a signal related to one of the plurality of angles of approach of said beam of light relative to said object;
wherein time of said recording of said signal related to one of the plurality of angles of approach of said beam of light is calculable relative to time of recording of at least one of said sequence of images of said object, and wherein each recorded image of said object is in the form of recorded signals comprising pixel information representative of the recorded image, the pixel information for each recorded image comprising information for each of a plurality of individual pixels.
57. The method according to claim 56, further comprising the steps of:
for each recorded image of said object, associating pixel information representative of the recorded luminous contour line of said object with a corresponding angle of approach of said beam of light relative to said object based on said signal related to one of the plurality of angles of approach of said beam of light; and
using the associated angle information and pixel information to determine three-dimensional X, Y, Z coordinates representative of the 3-D shape of said object.
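Putting claims 56 and 57 together: once the angle-indicator signal ties one frame to a known angle of approach, the angle for every other frame can be derived (here assuming a constant, known angular sweep per frame) and each contour pixel triangulated into X, Y, Z coordinates. The sweep rate, pinhole intrinsics, and geometry below are illustrative assumptions, not limitations of the claims.

```python
import math

def reconstruct_points(contours, sync_frame, sync_angle_deg, sweep_deg_per_frame,
                       focal_px, baseline_m, cx, cy):
    """Convert per-frame contour pixels into 3-D points.

    contours            -- {frame_index: [(u, v), ...]} pixel coordinates of the
                           luminous contour line in each recorded frame
    sync_frame          -- frame index tied to the angle-indicator signal
    sync_angle_deg      -- known angle of approach at that frame
    sweep_deg_per_frame -- assumed constant angular increment between frames
    focal_px, cx, cy    -- assumed pinhole intrinsics (focal length, principal point)
    baseline_m          -- camera-to-light-source offset along X
    """
    points = []
    for frame, pixels in sorted(contours.items()):
        theta = math.radians(sync_angle_deg + (frame - sync_frame) * sweep_deg_per_frame)
        for u, v in pixels:
            # Intersect the pixel ray with the light plane (same assumed geometry
            # as the earlier triangulation sketch), then back-project X and Y.
            z = baseline_m / ((u - cx) / focal_px + 1.0 / math.tan(theta))
            points.append((z * (u - cx) / focal_px, z * (v - cy) / focal_px, z))
    return points

# Example: two frames, each with a few contour pixels
contours = {25: [(420, 240), (421, 300)], 26: [(428, 240)]}
print(reconstruct_points(contours, sync_frame=25, sync_angle_deg=60,
                         sweep_deg_per_frame=0.5, focal_px=800,
                         baseline_m=0.3, cx=320, cy=240))
```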
PCT/US1997/011764 1996-07-12 1997-07-02 Portable 3-d scanning system and method for rapid shape digitizing and adaptive mesh generation WO1998002764A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU36523/97A AU3652397A (en) 1996-07-12 1997-06-27 Portable 3-d scanning system and method for rapid shape digitizing and adaptive mesh generation

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US08/679,498 US5870220A (en) 1996-07-12 1996-07-12 Portable 3-D scanning system and method for rapid shape digitizing and adaptive mesh generation
US08/679,498 1996-07-12

Publications (1)

Publication Number Publication Date
WO1998002764A1 1998-01-22

Family

ID=24727148

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1997/011764 WO1998002764A1 (en) 1996-07-12 1997-07-02 Portable 3-d scanning system and method for rapid shape digitizing and adaptive mesh generation

Country Status (4)

Country Link
US (1) US5870220A (en)
AU (1) AU3652397A (en)
TW (1) TW338820B (en)
WO (1) WO1998002764A1 (en)

Families Citing this family (138)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6631842B1 (en) 2000-06-07 2003-10-14 Metrologic Instruments, Inc. Method of and system for producing images of objects using planar laser illumination beams and image detection arrays
US7051922B2 (en) * 1994-08-17 2006-05-30 Metrologic Instruments, Inc. Compact bioptical laser scanning system
US6382515B1 (en) 1995-12-18 2002-05-07 Metrologic Instruments, Inc. Automated system and method for identifying and measuring packages transported through a laser scanning tunnel
US6360947B1 (en) 1995-12-18 2002-03-26 Metrologic Instruments, Inc. Automated holographic-based tunnel-type laser scanning system for omni-directional scanning of bar code symbols on package surfaces facing any direction or orientation within a three-dimensional scanning volume disposed above a conveyor belt
US20020014533A1 (en) * 1995-12-18 2002-02-07 Xiaxun Zhu Automated object dimensioning system employing contour tracing, vertice detection, and corner point detection and reduction methods on 2-d range data maps
US6517004B2 (en) * 1995-12-18 2003-02-11 Metrologic Instruments, Inc. Automated system for identifying and dimensioning packages transported through a laser scanning tunnel using laser scanning beam indexing techniques
US6554189B1 (en) 1996-10-07 2003-04-29 Metrologic Instruments, Inc. Automated system and method for identifying and measuring packages transported through a laser scanning tunnel
US6619550B1 (en) 1995-12-18 2003-09-16 Metrologic Instruments, Inc. Automated tunnel-type laser scanning system employing corner-projected orthogonal laser scanning patterns for enhanced reading of ladder and picket fence oriented bar codes on packages moving therethrough
US6457642B1 (en) 1995-12-18 2002-10-01 Metrologic Instruments, Inc. Automated system and method for identifying and measuring packages transported through a laser scanning tunnel
SE509005C2 (en) * 1997-02-24 1998-11-23 Dentronic Ab Method and arrangement for non-contact measurement of the three-dimensional shape of detail objects
DE19730565A1 (en) * 1997-07-17 1999-02-11 Daimler Benz Ag Use of a holographic screen as a display area for information systems
US7028899B2 (en) 1999-06-07 2006-04-18 Metrologic Instruments, Inc. Method of speckle-noise pattern reduction and apparatus therefore based on reducing the temporal-coherence of the planar laser illumination beam before it illuminates the target object by applying temporal phase modulation techniques during the transmission of the plib towards the target
US6377865B1 (en) 1998-02-11 2002-04-23 Raindrop Geomagic, Inc. Methods of generating three-dimensional digital models of objects by wrapping point cloud data points
US6788807B1 (en) * 1998-02-13 2004-09-07 Minolta Co., Ltd. Three dimensional information measurement method and apparatus
JP3417377B2 (en) * 1999-04-30 2003-06-16 日本電気株式会社 Three-dimensional shape measuring method and apparatus, and recording medium
KR100371078B1 (en) * 1999-05-29 2003-02-06 지스캔(주) striped pattern formatting apparatus and method using polygon mirror
JP4288753B2 (en) * 1999-05-31 2009-07-01 コニカミノルタセンシング株式会社 3D data input device
CA2278108C (en) 1999-07-20 2008-01-29 The University Of Western Ontario Three-dimensional measurement method and apparatus
US7111252B1 (en) * 1999-09-22 2006-09-19 Harris Scott C Enhancing touch and feel on the internet
US8738471B2 (en) * 1999-09-22 2014-05-27 Scott C. Harris Enhancing touch and feel on the internet
US7617135B2 (en) * 2000-02-16 2009-11-10 Illinois Computer Research, Llc Enhancing touch and feel on the internet
US6704791B1 (en) 2000-02-24 2004-03-09 Scott C. Harris Three dimensional experience for thick and thin clients
US7167575B1 (en) 2000-04-29 2007-01-23 Cognex Corporation Video safety detector with projected pattern
US6701005B1 (en) 2000-04-29 2004-03-02 Cognex Corporation Method and apparatus for three-dimensional object segmentation
US7041533B1 (en) * 2000-06-08 2006-05-09 Micron Technology, Inc. Stereolithographic method for fabricating stabilizers for semiconductor devices
US6996505B1 (en) 2000-06-21 2006-02-07 Raindrop Geomagic, Inc. Methods, apparatus and computer program products for automatically generating nurbs models of triangulated surfaces using homeomorphisms
US7625335B2 (en) * 2000-08-25 2009-12-01 3Shape Aps Method and apparatus for three-dimensional optical scanning of interior surfaces
US7358986B1 (en) 2000-09-13 2008-04-15 Nextengine, Inc. Digital imaging system having distribution controlled over a distributed network
US6639684B1 (en) * 2000-09-13 2003-10-28 Nextengine, Inc. Digitizer using intensity gradient to image features of three-dimensional objects
CA2422242A1 (en) * 2000-09-13 2002-03-21 Nextengine, Inc. Imaging system monitored or controlled to ensure fidelity of file captured
US6856407B2 (en) 2000-09-13 2005-02-15 Nextengine, Inc. Method for depth detection in 3D imaging providing a depth measurement for each unitary group of pixels
CA2327894A1 (en) * 2000-12-07 2002-06-07 Clearview Geophysics Inc. Method and system for complete 3d object and area digitizing
US6492651B2 (en) 2001-02-08 2002-12-10 3D Systems, Inc. Surface scanning system for selective deposition modeling
US7233351B1 (en) 2001-02-23 2007-06-19 Nextengine, Inc. Method for high resolution incremental imaging
ES2378060T3 (en) * 2001-03-02 2012-04-04 3Shape A/S Procedure for modeling custom ear pieces
FI113982B (en) * 2001-03-13 2004-07-15 Tamtron Oy Procedure and system for measurement
DE10111919A1 (en) * 2001-03-13 2002-09-19 Boegl Max Bauunternehmung Gmbh guideway beams
KR20020073890A (en) * 2001-03-16 2002-09-28 한국전자통신연구원 Three-Dimensional Modeling System Using Hand-Fumble and Modeling Method
US7365672B2 (en) * 2001-03-16 2008-04-29 Battelle Memorial Institute Detection of a concealed object
US6507309B2 (en) 2001-03-16 2003-01-14 Battelle Memorial Institute Interrogation of an object for dimensional and topographical information
US7405692B2 (en) * 2001-03-16 2008-07-29 Battelle Memorial Institute Detecting concealed objects at a checkpoint
US6853373B2 (en) 2001-04-25 2005-02-08 Raindrop Geomagic, Inc. Methods, apparatus and computer program products for modeling three-dimensional colored objects
US6792140B2 (en) * 2001-04-26 2004-09-14 Mitsubish Electric Research Laboratories, Inc. Image-based 3D digitizer
US20040246334A1 (en) * 2001-08-30 2004-12-09 Dimitri Philippou Image portrayal system
FI111755B (en) * 2001-11-23 2003-09-15 Mapvision Oy Ltd Method and system for calibrating an artificial vision system
TW584736B (en) * 2001-12-07 2004-04-21 Ind Tech Res Inst Shape measurement device of dual-axial anamorphic image magnification
US7344082B2 (en) * 2002-01-02 2008-03-18 Metrologic Instruments, Inc. Automated method of and system for dimensioning objects over a conveyor belt structure by applying contouring tracing, vertice detection, corner point detection, and corner point reduction methods to two-dimensional range data maps of the space above the conveyor belt captured by an amplitude modulated laser scanning beam
DE10204430A1 (en) 2002-02-04 2003-08-07 Zeiss Carl Stereo microscopy method and stereo microscopy system
US7881896B2 (en) 2002-02-14 2011-02-01 Faro Technologies, Inc. Portable coordinate measurement machine with integrated line laser scanner
EP1490651B1 (en) * 2002-04-04 2010-12-08 Amfit, Inc. Compact optical contour digitizer
US7058439B2 (en) * 2002-05-03 2006-06-06 Contourmed, Inc. Methods of forming prostheses
ATE426793T1 (en) 2002-12-31 2009-04-15 D4D Technologies Llc DIGITIZATION SYSTEM WITH A LASER FOR DENTAL APPLICATIONS
JP5189287B2 (en) 2003-03-24 2013-04-24 ディーフォーディー テクノロジーズ エルエルシー Dental laser digitizer system
US20050020910A1 (en) * 2003-04-30 2005-01-27 Henley Quadling Intra-oral imaging system
WO2004100068A2 (en) * 2003-05-05 2004-11-18 D3D, L.P. Optical coherence tomography imaging
US7324132B2 (en) * 2003-05-06 2008-01-29 Hewlett-Packard Development Company, L.P. Imaging three-dimensional objects
JP4913597B2 (en) * 2003-09-17 2012-04-11 ディーフォーディー テクノロジーズ エルエルシー High-speed multiple line 3D digitization method
CA2550842A1 (en) * 2003-12-30 2005-07-21 The Trustees Of The Stevens Institute Of Technology Three-dimensional imaging system using optical pulses, non-linear optical mixers and holographic calibration
US7711179B2 (en) 2004-04-21 2010-05-04 Nextengine, Inc. Hand held portable three dimensional scanner
US20060045174A1 (en) * 2004-08-31 2006-03-02 Ittiam Systems (P) Ltd. Method and apparatus for synchronizing a transmitter clock of an analog modem to a remote clock
US20060153427A1 (en) * 2005-01-12 2006-07-13 Zanzucchi Peter J Image methods for determining contour of terrain
US7253482B2 (en) * 2005-08-03 2007-08-07 International Business Machines Corporation Structure for reducing overlap capacitance in field effect transistors
US7995834B1 (en) 2006-01-20 2011-08-09 Nextengine, Inc. Multiple laser scanner
US7844081B2 (en) * 2006-05-15 2010-11-30 Battelle Memorial Institute Imaging systems and methods for obtaining and using biometric information
US7990397B2 (en) * 2006-10-13 2011-08-02 Leica Geosystems Ag Image-mapped point cloud with ability to accurately represent point coordinates
US8072482B2 (en) * 2006-11-09 2011-12-06 Innovative Signal Anlysis Imaging system having a rotatable image-directing device
WO2009011946A1 (en) * 2007-04-17 2009-01-22 University Of Southern California Rendering for an interactive 360 degree light field display
US7726575B2 (en) * 2007-08-10 2010-06-01 Hand Held Products, Inc. Indicia reading terminal having spatial measurement functionality
EP2180832A2 (en) * 2007-08-22 2010-05-05 Koninklijke Philips Electronics N.V. Method and apparatus for the optical characterization of surfaces
EP2208169A4 (en) * 2007-11-07 2013-06-05 Amfit Inc Impression foam digital scanner
US8908995B2 (en) 2009-01-12 2014-12-09 Intermec Ip Corp. Semi-automatic dimensioning with imager on a portable device
US9569001B2 (en) * 2009-02-03 2017-02-14 Massachusetts Institute Of Technology Wearable gestural interface
TW201032090A (en) * 2009-02-16 2010-09-01 Chih-Hsiung Lin Laser scanning input device
US8643717B2 (en) * 2009-03-04 2014-02-04 Hand Held Products, Inc. System and method for measuring irregular objects with a single camera
US8031345B2 (en) * 2009-05-29 2011-10-04 Perceptron, Inc. Hybrid sensor
US8243289B2 (en) * 2009-05-29 2012-08-14 Perceptron, Inc. System and method for dynamic windowing
US7995218B2 (en) * 2009-05-29 2011-08-09 Perceptron, Inc. Sensor system and reverse clamping mechanism
US9430923B2 (en) 2009-11-30 2016-08-30 Innovative Signal Analysis, Inc. Moving object detection, tracking, and displaying systems
US9002946B2 (en) * 2010-08-25 2015-04-07 Autodesk, Inc. Dual modeling environment in which commands are executed concurrently and independently on both a light weight version of a proxy module on a client and a precise version of the proxy module on a server
US8939369B2 (en) 2011-01-24 2015-01-27 Datalogic ADC, Inc. Exception detection and handling in automated optical code reading systems
DE102011103510A1 (en) 2011-06-03 2012-12-06 Daimler Ag Method for creating three-dimensional CAD representation of e.g. workshop, involves detecting coherent surface elements in scatter-plot representation, and creating three-dimensional CAD representation of object arrangement
WO2013155687A1 (en) * 2012-04-18 2013-10-24 Thomson Licensing Vertex correction method and apparatus for rotated three-dimensional (3d) components
US9779546B2 (en) 2012-05-04 2017-10-03 Intermec Ip Corp. Volume dimensioning systems and methods
US9007368B2 (en) 2012-05-07 2015-04-14 Intermec Ip Corp. Dimensioning system calibration systems and methods
US10007858B2 (en) 2012-05-15 2018-06-26 Honeywell International Inc. Terminals and methods for dimensioning objects
CN103608696B (en) * 2012-05-22 2016-05-11 韩国生产技术研究院 The method of 3D scanning system and acquisition 3D rendering
US9473760B2 (en) * 2012-08-08 2016-10-18 Makerbot Industries, Llc Displays for three-dimensional printers
US10321127B2 (en) 2012-08-20 2019-06-11 Intermec Ip Corp. Volume dimensioning system calibration systems and methods
US9939259B2 (en) 2012-10-04 2018-04-10 Hand Held Products, Inc. Measuring object dimensions using mobile computer
US20140104413A1 (en) 2012-10-16 2014-04-17 Hand Held Products, Inc. Integrated dimensioning and weighing system
US9080856B2 (en) 2013-03-13 2015-07-14 Intermec Ip Corp. Systems and methods for enhancing dimensioning, for example volume dimensioning
US20140281871A1 (en) * 2013-03-15 2014-09-18 Meditory Llc Method for mapping form fields from an image containing text
US10228452B2 (en) 2013-06-07 2019-03-12 Hand Held Products, Inc. Method of error correction for 3D imaging device
US9239950B2 (en) 2013-07-01 2016-01-19 Hand Held Products, Inc. Dimensioning system
DE102013215040B4 (en) * 2013-07-31 2016-09-22 Tangible Engineering Gmbh Compact apparatus for producing a three-dimensional object by solidifying a photo-hardening material
US9464885B2 (en) 2013-08-30 2016-10-11 Hand Held Products, Inc. System and method for package dimensioning
US9091568B2 (en) * 2013-10-21 2015-07-28 Heptagon Micro Optics Pte. Ltd. Optical encoder system including a structured code wheel
US10111714B2 (en) 2014-01-27 2018-10-30 Align Technology, Inc. Adhesive objects for improving image registration of intraoral images
TWI589149B (en) * 2014-04-29 2017-06-21 鈺立微電子股份有限公司 Portable three-dimensional scanner and method of generating a three-dimensional scan result corresponding to an object
US9823059B2 (en) 2014-08-06 2017-11-21 Hand Held Products, Inc. Dimensioning system with guided alignment
US10139819B2 (en) 2014-08-22 2018-11-27 Innovative Signal Analysis, Inc. Video enabled inspection using unmanned aerial vehicles
US10775165B2 (en) 2014-10-10 2020-09-15 Hand Held Products, Inc. Methods for improving the accuracy of dimensioning-system measurements
US10810715B2 (en) 2014-10-10 2020-10-20 Hand Held Products, Inc System and method for picking validation
US9779276B2 (en) 2014-10-10 2017-10-03 Hand Held Products, Inc. Depth sensor based auto-focus system for an indicia scanner
US9557166B2 (en) 2014-10-21 2017-01-31 Hand Held Products, Inc. Dimensioning system with multipath interference mitigation
US10060729B2 (en) 2014-10-21 2018-08-28 Hand Held Products, Inc. Handheld dimensioner with data-quality indication
US9752864B2 (en) 2014-10-21 2017-09-05 Hand Held Products, Inc. Handheld dimensioning system with feedback
US9762793B2 (en) 2014-10-21 2017-09-12 Hand Held Products, Inc. System and method for dimensioning
US9897434B2 (en) 2014-10-21 2018-02-20 Hand Held Products, Inc. Handheld dimensioning system with measurement-conformance feedback
US9786101B2 (en) 2015-05-19 2017-10-10 Hand Held Products, Inc. Evaluating image values
US10066982B2 (en) 2015-06-16 2018-09-04 Hand Held Products, Inc. Calibrating a volume dimensioner
US20160377414A1 (en) 2015-06-23 2016-12-29 Hand Held Products, Inc. Optical pattern projector
US9857167B2 (en) 2015-06-23 2018-01-02 Hand Held Products, Inc. Dual-projector three-dimensional scanner
US9835486B2 (en) 2015-07-07 2017-12-05 Hand Held Products, Inc. Mobile dimensioner apparatus for use in commerce
EP3396313B1 (en) 2015-07-15 2020-10-21 Hand Held Products, Inc. Mobile dimensioning method and device with dynamic accuracy compatible with nist standard
US20170017301A1 (en) 2015-07-16 2017-01-19 Hand Held Products, Inc. Adjusting dimensioning results using augmented reality
US10094650B2 (en) 2015-07-16 2018-10-09 Hand Held Products, Inc. Dimensioning and imaging items
US10249030B2 (en) 2015-10-30 2019-04-02 Hand Held Products, Inc. Image transformation for indicia reading
US10225544B2 (en) 2015-11-19 2019-03-05 Hand Held Products, Inc. High resolution dot pattern
US10025314B2 (en) 2016-01-27 2018-07-17 Hand Held Products, Inc. Vehicle positioning and object avoidance
US10339352B2 (en) 2016-06-03 2019-07-02 Hand Held Products, Inc. Wearable metrological apparatus
US9940721B2 (en) 2016-06-10 2018-04-10 Hand Held Products, Inc. Scene change detection in a dimensioner
US10163216B2 (en) 2016-06-15 2018-12-25 Hand Held Products, Inc. Automatic mode switching in a volume dimensioner
US10436910B2 (en) 2016-09-20 2019-10-08 Apple Inc. Line scan depth sensor comparing a time dependent waveform of the signals to an expected waveform
US10909708B2 (en) 2016-12-09 2021-02-02 Hand Held Products, Inc. Calibrating a dimensioner using ratios of measurable parameters of optically-perceptible geometric elements
CN106713752B (en) * 2016-12-27 2019-05-10 中国科学院长春光学精密机械与物理研究所 The circuit control device and its control method of single-sensor combination optical multiplexer camera lens
US11047672B2 (en) 2017-03-28 2021-06-29 Hand Held Products, Inc. System for optically dimensioning
US10733748B2 (en) 2017-07-24 2020-08-04 Hand Held Products, Inc. Dual-pattern optical 3D dimensioning
US10229537B2 (en) * 2017-08-02 2019-03-12 Omnivor, Inc. System and method for compressing and decompressing time-varying surface data of a 3-dimensional object using a video codec
US10692247B2 (en) * 2017-08-02 2020-06-23 Omnivor, Inc. System and method for compressing and decompressing surface data of a 3-dimensional object using an image codec
US10742881B1 (en) 2017-09-27 2020-08-11 Apple Inc. Combined temporal contrast sensing and line scanning
CN108347596B (en) * 2017-11-02 2020-01-31 广东康云多维视觉智能科技有限公司 laser guide scanning system and method based on feedback
US10473919B2 (en) 2017-12-29 2019-11-12 Konica Minolta Laboratory U.S.A., Inc. Portable 3D scanning system for 3D surface reconstruction
US10584962B2 (en) 2018-05-01 2020-03-10 Hand Held Products, Inc System and method for validating physical-item security
JP7181790B2 (en) * 2018-12-28 2022-12-01 株式会社キーエンス Laser processing equipment
US11639846B2 (en) 2019-09-27 2023-05-02 Honeywell International Inc. Dual-pattern optical 3D dimensioning
US20230148865A1 (en) * 2020-03-30 2023-05-18 Dimension Orthotics, LLC Apparatus for anatomic three dimensional scanning and automated three dimensional cast and splint design
CN116878388B (en) * 2023-09-07 2023-11-14 东莞市兆丰精密仪器有限公司 Line scanning measurement method, device and system and computer readable storage medium

Family Cites Families (56)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6013443B2 (en) * 1978-09-11 1985-04-08 日本碍子株式会社 Device for measuring the height of the object to be measured
US4238147A (en) * 1979-05-23 1980-12-09 Solid Photography Inc. Recording images of a three-dimensional surface by focusing on a plane of light irradiating the surface
JPS581982Y2 (en) * 1979-07-30 1983-01-13 東海電線株式会社 Composite flat terminal
US4575805A (en) * 1980-12-24 1986-03-11 Moermann Werner H Method and apparatus for the fabrication of custom-shaped implants
GB2103355B (en) * 1981-08-03 1985-08-07 Gersan Ets Examining a gemstone
US4529316A (en) * 1982-10-18 1985-07-16 Robotic Vision Systems, Inc. Arrangement of eliminating erroneous data in three-dimensional optical sensors
US4627734A (en) * 1983-06-30 1986-12-09 Canadian Patents And Development Limited Three dimensional imaging method and device
US4653104A (en) * 1984-09-24 1987-03-24 Westinghouse Electric Corp. Optical three-dimensional digital data acquisition system
US4645347A (en) * 1985-04-30 1987-02-24 Canadian Patents And Development Limited-Societe Canadienne Des Brevets Et D'exploitation Limitee Three dimensional imaging device
US4737032A (en) * 1985-08-26 1988-04-12 Cyberware Laboratory, Inc. Surface mensuration sensor
US4705401A (en) * 1985-08-12 1987-11-10 Cyberware Laboratory Inc. Rapid three-dimensional surface digitizer
JPH0323664Y2 (en) * 1986-05-01 1991-05-23
FR2610821B1 (en) * 1987-02-13 1989-06-09 Hennson Int METHOD FOR TAKING MEDICAL IMPRESSION AND DEVICE FOR IMPLEMENTING SAME
US4825263A (en) * 1987-06-02 1989-04-25 University Of Medicine & Dentistry Of New Jersey Optical method and apparatus for determining three-dimensional changes in facial contours
US4800271A (en) * 1987-06-23 1989-01-24 Canadian Patents & Development Ltd. Galvanometric optical scanning system having synchronization photodetectors
US4800270A (en) * 1987-06-23 1989-01-24 Canadian Patents & Development Ltd. Galvanometric optical scanning system having a pair of closely located synchronization
US4961155A (en) * 1987-09-19 1990-10-02 Kabushiki Kaisha Toyota Chuo Kenkyusho XYZ coordinates measuring system
CA1313040C (en) * 1988-03-31 1993-01-26 Mitsuaki Uesugi Method and apparatus for measuring a three-dimensional curved surface shape
NO164946C (en) * 1988-04-12 1990-11-28 Metronor As OPTO-ELECTRONIC SYSTEM FOR EXACTLY MEASURING A FLAT GEOMETRY.
US4948258A (en) * 1988-06-27 1990-08-14 Harbor Branch Oceanographic Institute, Inc. Structured illumination surface profiling and ranging systems and methods
US5030008A (en) * 1988-10-11 1991-07-09 Kla Instruments, Corporation Method and apparatus for the automated analysis of three-dimensional objects
US5187364A (en) * 1989-03-22 1993-02-16 National Research Council Of Canada/Conseil National De Recherches Du Canada Scanning device with waveform generator optimizer
JPH0641851B2 (en) * 1989-04-05 1994-06-01 日本鋼管株式会社 Measuring device for three-dimensional curved surface
CA1316590C (en) * 1989-04-17 1993-04-20 Marc Rioux Three-dimensional imaging device
US5027281A (en) * 1989-06-09 1991-06-25 Regents Of The University Of Minnesota Method and apparatus for scanning and recording of coordinates describing three dimensional objects of complex and unique geometry
US4965665A (en) * 1989-10-25 1990-10-23 At&T Bell Laboratories 3D imaging of a substrate using perpendicular scanning directions
EP0462289B1 (en) * 1989-12-28 1994-11-02 Kabushiki Kaisha Toyota Chuo Kenkyusho Apparatus for measuring three-dimensional coordinates
US5506683A (en) * 1990-04-30 1996-04-09 Kumho & Co., Inc. Non-contact measuring apparatus for the section profile of a tire and its method
CA2017518A1 (en) * 1990-05-24 1991-11-24 Her Majesty The Queen, In Right Of Canada, As Represented By The Minister Of The National Research Council Of Canada Colour-range imaging
CA2044820C (en) * 1990-06-19 1998-05-26 Tsugito Maruyama Three-dimensional measuring apparatus
US5127061A (en) * 1990-12-03 1992-06-30 At&T Bell Laboratories Real-time three-dimensional imaging technique
GB9102903D0 (en) * 1991-02-12 1991-03-27 Oxford Sensor Tech An optical sensor
US5216236A (en) * 1991-02-19 1993-06-01 National Research Council Of Canada Optical tracking system
US5193120A (en) * 1991-02-27 1993-03-09 Mechanical Technology Incorporated Machine vision three dimensional profiling system
US5108320A (en) * 1991-05-20 1992-04-28 Kimber Ray L Electrical lead wire terminal connector
US5345490A (en) * 1991-06-28 1994-09-06 General Electric Company Method and apparatus for converting computed tomography (CT) data into finite element models
US5436655A (en) * 1991-08-09 1995-07-25 Olympus Optical Co., Ltd. Endoscope apparatus for three dimensional measurement for scanning spot light to execute three dimensional measurement
JPH0560528A (en) * 1991-09-03 1993-03-09 Hitachi Ltd Input device for three-dimensional information
US5377011A (en) * 1991-09-06 1994-12-27 Koch; Stephen K. Scanning system for three-dimensional object digitizing
US5218427A (en) * 1991-09-06 1993-06-08 Koch Stephen K Ranging system for three-dimensional object digitizing
US5164793A (en) * 1991-09-13 1992-11-17 Brown Group, Inc. Shoe size selection system and apparatus therefor
GB9127139D0 (en) * 1991-12-20 1992-02-19 3D Scanners Ltd Scanning unit
FR2685764B1 (en) * 1991-12-30 1995-03-17 Kreon Ind HIGH RESOLUTION COMPACT OPTICAL SENSOR FOR THREE-DIMENSIONAL SHAPE ANALYSIS.
GB9127548D0 (en) * 1991-12-31 1992-02-19 3D Scanners Ltd Scanning sensor
JP2680224B2 (en) * 1992-06-25 1997-11-19 松下電工株式会社 Three-dimensional shape detection method and apparatus
US5270795A (en) * 1992-08-11 1993-12-14 National Research Council Of Canada/Conseil National De Rechereches Du Canada Validation of optical ranging of a target surface in a cluttered environment
JP2651093B2 (en) * 1992-10-27 1997-09-10 松下電工株式会社 Shape detection method and device
US5384717A (en) * 1992-11-23 1995-01-24 Ford Motor Company Non-contact method of obtaining dimensional information about an object
JP3186876B2 (en) * 1993-01-12 2001-07-11 株式会社東芝 Surface profile measuring device
US5446549A (en) * 1993-01-14 1995-08-29 The United States Of America As Represented By The Secretary Of The Navy Method and apparatus for noncontact surface contour measurement
US5418608A (en) * 1993-05-04 1995-05-23 Harbor Branch Oceanographic Institution Inc. Three dimensional mapping systems and methods
EP0632349A1 (en) * 1993-06-29 1995-01-04 James Britten Zapka Reflection integral holography - method and apparatus
JP2592690Y2 (en) * 1993-12-10 1999-03-24 住友電装株式会社 Terminal fitting
US5513276A (en) * 1994-06-02 1996-04-30 The Board Of Regents Of The University Of Oklahoma Apparatus and method for three-dimensional perspective imaging of objects
GB2292605B (en) * 1994-08-24 1998-04-08 Guy Richard John Fowler Scanning arrangement and method
US5529509A (en) * 1995-05-12 1996-06-25 Alcoa Fujikura Limited Interlocking ground terminal

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4794262A (en) * 1985-12-03 1988-12-27 Yukio Sato Method and apparatus for measuring profile of three-dimensional object
US4982102A (en) * 1989-06-27 1991-01-01 Mitsubishi Denki Kabushiki Kaisha Apparatus for detecting three-dimensional configuration of object employing optical cutting method
JPH04110707A (en) * 1990-08-31 1992-04-13 Kiyadeitsukusu:Kk Device for measuring three-dimensional shape

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
ES2152171A1 (en) * 1998-11-30 2001-01-16 Univ Madrid Carlos Iii 3D vision system with hardware processing of the video signal
US7200130B2 (en) 2001-02-13 2007-04-03 Nokia Corporation Short range RF network configuration
DE10359781A1 (en) * 2003-12-19 2005-08-04 Krones Ag Apparatus for inspecting returned empty bottles or packages moving along a conveyer belt using a line camera for recording light reflected from the bottles
DE10359781B4 (en) * 2003-12-19 2006-01-05 Krones Ag Device for inspection of empties containers
AU2011202572B2 (en) * 2004-03-19 2013-05-09 Siemens Mobility Pty Ltd Optical method of determining a physical attribute of a moving object
AU2011202572B9 (en) * 2004-03-19 2013-09-19 Siemens Mobility Pty Ltd Optical method of determining a physical attribute of a moving object
CN103295228A (en) * 2013-05-06 2013-09-11 深圳先进技术研究院 Quick data registering method in three-dimensional scanning system and three-dimensional scanning system
CN105806242A (en) * 2016-04-15 2016-07-27 同济大学 Surface type measuring device adopting laser rotary scanning
CN105806242B (en) * 2016-04-15 2018-06-05 同济大学 Using the surface type measurement device of laser rotary scanning
DE102020130746A1 (en) 2020-11-20 2022-05-25 Raphael Ginbar Device and method for examining a surface
DE102020130746B4 (en) 2020-11-20 2022-06-15 Raphael Ginbar Device and method for examining a surface

Also Published As

Publication number Publication date
TW338820B (en) 1998-08-21
AU3652397A (en) 1998-02-09
US5870220A (en) 1999-02-09

Similar Documents

Publication Publication Date Title
US5870220A (en) Portable 3-D scanning system and method for rapid shape digitizing and adaptive mesh generation
US6044170A (en) System and method for rapid shape digitizing and adaptive mesh generation
US6549288B1 (en) Structured-light, triangulation-based three-dimensional digitizer
US6611617B1 (en) Scanning apparatus and method
US5991437A (en) Modular digital audio system having individualized functional modules
CA2511828C (en) Laser digitizer system for dental applications
US6553138B2 (en) Method and apparatus for generating three-dimensional representations of objects
US20090322859A1 (en) Method and System for 3D Imaging Using a Spacetime Coded Laser Projection System
US20090168045A1 (en) Three-dimensional surround scanning device and method thereof
CA2090093A1 (en) Motion detection and image acquisition apparatus and method of detecting the motion of and acquiring and image of an object
WO2009120073A2 (en) A dynamically calibrated self referenced three dimensional structured light scanner
US6927864B2 (en) Method and system for determining dimensions of optically recognizable features
JPH05135155A (en) Three-dimensional model constitution device using successive silhouette image
HU220729B1 (en) Method and apparatus for scanning a three-dimensional object and producing a model of it by computer
Poulsen et al. An optical registration method for 3D ultrasound freehand scanning
Cockshott et al. Experimental 3-D digital TV studio
Rioux Colour 3-D electronic imaging of the surface of the human body
JPH05223524A (en) Method of determining space coordinate of work
CN109959455B (en) Static infrared target scanning imaging device and method based on no lens
Ricci et al. High-resolution laser radar for 3D imaging in artwork cataloging, reproduction, and restoration
Wen et al. A low-cost, user-friendly, and real-time operating 3D camera
JPH01250707A (en) Method and apparatus for measuring shape of three-dimensional curved surface
Drenik et al. An evaluation of low cost scanning versus industrial 3D scanning devices
Schoor et al. A concept for applying VR and AR technologies to support efficient 3D non-contact model digitalization
JP2000227963A (en) Device and method for preparing image with distance information and recording medium recording program for the method

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GE GH HU IL IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG UZ VN YU ZW AM AZ BY KG KZ MD RU TJ TM

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH KE LS MW SD SZ UG ZW AT BE CH DE DK ES FI FR GB GR IE IT LU MC NL PT

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: CA

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

NENP Non-entry into the national phase

Ref country code: JP

Ref document number: 1998506084

Format of ref document f/p: F

122 Ep: pct application non-entry in european phase