WO2015167914A1 - Optical system for measuring particle characteristics - Google Patents

Optical system for measuring particle characteristics

Info

Publication number
WO2015167914A1
Authority
WO
WIPO (PCT)
Prior art keywords
particle
sensor
imaging device
volume
image
Prior art date
Application number
PCT/US2015/027214
Other languages
French (fr)
Inventor
Firat Y. TESTIK
Original Assignee
Clemson University
Priority date
Filing date
Publication date
Application filed by Clemson University
Publication of WO2015167914A1


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N15/00 Investigating characteristics of particles; Investigating permeability, pore-volume, or surface-area of porous materials
    • G01N15/02 Investigating particle size or size distribution
    • G01N15/0205 Investigating particle size or size distribution by optical means, e.g. by light scattering, diffraction, holography or imaging
    • G01N15/0227 Investigating particle size or size distribution by optical means, e.g. by light scattering, diffraction, holography or imaging using imaging, e.g. a projected image of suspension; using holography
    • G01N15/1433
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N15/00 Investigating characteristics of particles; Investigating permeability, pore-volume, or surface-area of porous materials
    • G01N15/10 Investigating individual particles
    • G01N15/14 Electro-optical investigation, e.g. flow cytometers
    • G01N15/1456 Electro-optical investigation, e.g. flow cytometers without spatial resolution of the texture or inner structure of the particle, e.g. processing of pulse signals
    • G01N15/1459 Electro-optical investigation, e.g. flow cytometers without spatial resolution of the texture or inner structure of the particle, e.g. processing of pulse signals the analysis being performed on a sample stream
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01W METEOROLOGY
    • G01W1/00 Meteorology
    • G01W1/14 Rainfall or precipitation gauges
    • G01N15/075
    • G01N2015/1027

Landscapes

  • Chemical & Material Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Analytical Chemistry (AREA)
  • General Physics & Mathematics (AREA)
  • Pathology (AREA)
  • Immunology (AREA)
  • Environmental & Geological Engineering (AREA)
  • General Health & Medical Sciences (AREA)
  • Biochemistry (AREA)
  • Dispersion Chemistry (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Hydrology & Water Resources (AREA)
  • Engineering & Computer Science (AREA)
  • Environmental Sciences (AREA)
  • Ecology (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Atmospheric Sciences (AREA)
  • Investigating Or Analysing Materials By Optical Means (AREA)

Abstract

Systems and methods for detection of particle characteristics are described. Systems include a sensor and an imaging device to obtain a series of images of a particle as it passes through a detection volume. Particles can be hydrometeors and a system can be utilized to obtain information during a weather event. Particles can include solid, liquid, and gaseous particles.

Description

OPTICAL SYSTEM FOR MEASURING PARTICLE CHARACTERISTICS
Cross Reference to Related Application
[0001] This application claims filing benefit of U.S. Provisional Patent Application serial number 61/985,756, having a filing date of April 29, 2014, which is incorporated herein in its entirety by reference.
Statement As to Rights to Inventions Made Under
Federally Sponsored Research
[0002] This invention was made with Government support under Award No. 1144846 by the National Science Foundation. The Government has certain rights in the invention.
Background
[0003] The ability to accurately determine characteristics of particles is important in a wide variety of applications. For instance, accurate radar rainfall estimation requires prior knowledge of the relationship between the radar observables and the rainfall intensity. This relationship in turn depends on the rainfall microphysics, i.e., the characteristics of the individual raindrops such as the velocity, size, acceleration, drop size distribution, etc. Accurate and fast detection of the characteristics of hydrometeors in weather events can also be important with regard to public safety. For instance, early and accurate detection of freezing rain, hail, snow, etc. can lead to earlier and better decisions with regard to business and school closings, plane de-icing materials and timing, location and timing of road salting, and so forth.
[0004] The accurate characterization of particles (e.g., solid particles, liquid droplets, gaseous bubbles, etc.) and small animals such as insects, aquatic creatures, etc. is desirable in applications beyond weather events. For instance, accurate liquid particle characterization such as in development and monitoring of spray nozzles is important in agriculture (e.g., watering systems), safety (e.g., fire sprinkler systems), medical applications (e.g., nebulizers, inhalers), and so forth. Solid particulate characterization is important in waste monitoring (e.g., smoke stack monitoring, soil erosion studies), medical applications (pill formation and packaging), and manufacturing (e.g., powder use and formation as may be found in the ceramic, chemical, and food industries), among others. Accurate characterization of gaseous particles is also important, for instance in aquaculture industries and coatings technologies. Environmental studies to determine the population of small animals such as insects (e.g., mosquitos) and small aquatic creatures (e.g., fish, tadpoles) can be of use in both public health and determination of the health and variation in natural environments.
[0005] A wide variety of devices and methods have been developed to accurately determine the characteristics of particles. For example, a disdrometer is typically used to obtain precipitation microphysical information such as drop size distribution, drop shape, and fall velocity during a rain event. Particle detection methodology in general primarily utilizes impact and optical based devices. Disdrometers likewise can generally be classified as impact or optical types but also include rain radar-type devices. The most popular impact type disdrometer is the Joss Waldvogel disdrometer (JWD). Among the various types of the optical disdrometers are the 2-dimensional video disdrometer (2DVD), the Parsivel disdrometer, the snowflake video imager, and the Thies optical disdrometer. The rain radar type disdrometers include the Micro Rain Radar (MRR) and the Pludix disdrometers.
[0006] The JWD is an electromechanical (impact) type disdrometer. When a raindrop falls vertically, it carries momentum. By knowing the momentum of the falling drop and transforming it into an electrical signal, drop size can be determined. The JWD gives good results for light to moderate intensity rainfall events, but underestimates the amount of small size drops during heavy rainfall events as it cannot detect any drops that are less than 0.3 mm in size. Also, this system assumes that the raindrops are falling at terminal velocity which has been shown to not necessarily be the case and leads to inaccuracies in the data.
[0007] A commonly used optical disdrometer is the 2DVD, which uses two orthogonal line scan cameras for capturing images of raindrops. From the two orthogonal projections, a three dimensional view of the raindrop can be obtained. The line scan camera has a single line of photo sensors. When a raindrop passes through the light sheet, it creates a shadow on the photo sensors of the line scan camera. The size of the hydrometeor can be determined from the number of dark pixels in the image of the camera. Unfortunately, the 2DVD system suffers problems at high temperature (an overheating problem). In addition, if the two orthogonal cameras get misaligned due to transportation or handling, it can be quite difficult to realign them.
[0008] In the Parsivel disdrometer, optical sensors are used for detecting raindrops and snowflakes. It includes a sensor that sends a 5V signal to an output. When a particle passes through the sensor, it blocks some portion of the light beam of the sensor and results in an output voltage of less than 5V. A pre-defined empirical relation between size and voltage is employed to quantify the size of the particle. An updated version of this system improves the measurement accuracy when the hydrometeors do not fall vertically due to the presence of strong winds. The Parsivel system can measure the size and velocity of raindrops and snowflakes, but it does not provide any shape or other detailed information.
[0009] The Thies optical disdrometer has a laser emitter, a receiver and a digital signal processing unit. This system uses laser light to detect raindrops. When a raindrop passes through the laser beam, it dims the light of the beam and voltage drops in the receiver unit. By estimating the voltage drop of the photo diode, the size of the hydrometeor can be determined. However, information with regard to particle type, shape, velocity, acceleration, etc. is limited at best.
[0010] The snowflake video imager has a video camera and a halogen lighting system pointing towards the camera. This imaging system captures images of snowflakes and processes the images to obtain the sizes of the snowflakes. A data logger keeps the images in a computer system. Unfortunately, it is quite limited in application.
[0011] The Pludix disdrometer analyzes the X-band continuous wave (Doppler) radar signal reflected by the hydrometeors to measure the shape and velocity information of the hydrometeor. This type of disdrometer underestimates the rainfall amount when the rainfall is light to moderate; however, it overestimates when the rainfall is intense. The MRR is a vertically looking Doppler radar. In this system, an assumption of the form of the drop size distribution is not required. The major limitation of this system is that it does not consider the effect of the vertical wind velocity and turbulence.
[0012] What are needed in the art are systems and methods that can accurately provide characteristics of particles. For instance, a system and method that can identify a wide variety of characteristics of hydrometeors with accuracy and speed would be of great benefit.
Summary
[0013] Aspects and advantages of the invention will be set forth in part in the following description, or may be obvious from the description, or may be learned through practice of the invention.
[0014] According to one embodiment, disclosed is a system for determining one or more characteristics of a particle. The system can include an imaging device that defines a field of view. The system can also include a sensor that defines a sensing volume. The sensing volume can intersect the field of view of the imaging device such that a detection volume includes this intersection. The system can also include a processor that is in communication with the imaging device and the sensor. The processor can be configured such that upon detection of the particle by the sensor, at least one image of the particle within the detection volume is obtained by the imaging device.
[0015] According to another embodiment, disclosed is a method for determining one or more characteristics of a particle. The method includes sensing the presence of the particle by use of a sensor and obtaining at least one image of the particle as the particle and a detection volume move or are moved with respect to one another. The detection volume is defined by the intersection of a sensing volume with the field of view of the imaging device. The method also includes processing the at least one image to obtain one or more characteristics of the particle.
Brief Description of the Figures
[0016] A full and enabling disclosure of the present invention, including the best mode thereof, directed to one of ordinary skill in the art, is set forth in the specification, which makes reference to the appended figures, in which:
[0017] FIG. 1 is a schematic illustration of one embodiment of a system as disclosed herein.
[0018] FIG. 2 includes a front view of one embodiment of a sensor and a cross-sectional view of the field of view of an imaging device.
[0019] FIG. 3 is a front view of one embodiment of a sensor.
[0020] FIG. 4 is a top view of one embodiment of a sensor.
[0021] FIG. 5 is a schematic representation of a detection volume formed at the intersection of a sensing volume and the field of view of an imaging device.
[0022] FIG. 6 is a side view of one embodiment of a system as disclosed herein.
[0023] FIG. 7 is a side view of another embodiment of a system as disclosed herein.
[0024] FIG. 8 is a flow diagram for one embodiment of a method as disclosed herein.
[0025] FIG. 9 compares an image of a particle as obtained from a system as disclosed herein and that of another optical imaging system.
[0026] FIG. 10 graphically presents the error in particle size determination as a function of distance from the focal plane for an optical imaging system.
[0027] FIG. 11 graphically illustrates the maximum percentage error within the sensor width for different spherical particle sizes.
[0028] FIG. 12 presents images of particles obtained from a system, including an image of a 5 mm diameter spherical lens (FIG. 12A), the processed version of that image (FIG. 12B), an image of a 3 mm diameter spherical lens (FIG. 12C), and the processed version of that image (FIG. 12D).
[0029] FIG. 13 schematically illustrates a laboratory experimental setup utilized in validation of the disclosed system.
[0030] FIG. 14 graphically illustrates the volume of water used for generating drops in laboratory experiments compared to the volume calculated from data obtained from a system as disclosed herein for several different drop diameters.
[0031 ] FIG. 15 illustrates a system as disclosed herein for use in the field.
[0032] FIG. 16 presents images of raindrops obtained from a system as disclosed herein, including an equilibrium-shaped raindrop (FIG. 16A) and a processed version of the image of FIG. 16A (FIG. 16B), and an image of a canted raindrop (FIG. 16C) and a processed version of the image of FIG. 16C (FIG. 16D).
[0033] FIG. 17A, FIG. 17B, FIG. 17C, FIG. 17D, FIG. 17E and FIG. 17F are sequential images of a single raindrop taken at intervals of 1 millisecond as it passes through a detection volume.
[0034] FIG. 18 graphically compares the rainfall amount as determined by use of a system as disclosed herein as compared to the amount as determined by use of a rain gauge for different rainfall events.
[0035] FIG. 19A graphically presents the measured raindrop fall velocity as a function of drop diameter as determined by a system as disclosed herein compared to that predicted by the method of Gunn and Kinzer (1949) for a rainfall event.
[0036] FIG. 19B graphically presents the drop size distribution as determined by a system as disclosed herein during a rainfall event.
[0037] Repeat use of reference characters in the present specification and drawings is intended to represent the same or analogous features or elements of the present disclosure.
Detailed Description
[0038] Reference will now be made in detail to various embodiments of the disclosure, one or more examples of which are illustrated in the accompanying drawings. Each example is provided by way of explanation of the subject matter, not limitation thereof. In fact, it will be apparent to those skilled in the art that various modifications and variations can be made in the present disclosure without departing from the scope or spirit of the subject matter. For instance, features illustrated or described as part of one embodiment, can be used on another embodiment to yield a still further embodiment.
[0039] In general, the present disclosure is directed to systems and methods that can be utilized to accurately detect characteristics of particles, small animals, etc. For instance, disclosed systems can be utilized as disdrometers to detect characteristics of hydrometeors during a weather event. However, the systems are not limited to the detection of hydrometeors, and it should be understood that the systems can be utilized in a wide variety of settings to detect characteristics of any particle, including solid, liquid, and gaseous particles.
[0040] Through utilization of the disclosed systems, one or more characteristics of a single particle, a plurality of particles, and/or a particle event can be determined. As utilized herein, the term 'particle' can generally refer to any relatively small structure (e.g., a cross-sectional dimension of about one inch or less in some embodiments) including, without limitation, solid particles, liquid droplets, gaseous or vaporous bubbles, and small animals (e.g., insects, tadpoles, fish, worms, etc.). A particle is not limited to any particular size; in general, however, a particle will be smaller in cross-sectional area than the detection volume described further herein. A particle can generally be of any material or combination of materials and have a recognizable boundary within a carrier domain (e.g., air, water, fog, etc.). For example, and without limitation, the size, velocity, acceleration, state (e.g., solid, liquid, gas) and nature (e.g., at or near a state change) of individual particles can be obtained.
Information with regard to a plurality of the particles and/or the particle event (i.e., the actions and nature of the plurality of particles) can also be obtained, such as, and without limitation, particle size distribution, total volume of particles (e.g., total precipitation), total particle flow rate, change in particle flow rate over time, and so forth. The information obtained from the system can also be combined with other traditional sensing systems (e.g., radar detection, satellite imaging, wind detectors, etc.) to provide information about the event, and in one particular embodiment, about a weather event.
[0041 ] The system includes a sensor for detecting a particle and an imaging device for obtaining images of the particle. More specifically, the system is designed such that at least one and in some embodiments a series of images of a particle can be obtained as a particle moves or is moved in relation to a predefined detection volume. For instance, the particle can move or be moved through the detection volume as in a raindrop falling to the ground and passing through a detection volume or the detection volume can move past the particle, as in a detection volume on/near an airplane wing moving through an area containing a raindrop suspended in or falling through the atmosphere.
[0042] This predefined detection volume is in a limited and known relationship to the focal plane of the imaging device. A serious issue with previously known optical systems is that if the particle is not at the focal plane, there will be optical measurement errors, as the drop images will be blurry (see, e.g., FIG. 9). In the presently disclosed systems, the sensor is provided with a known sensing area that defines a sensing volume in a known relationship to the focal plane for capturing images of the particle. Consequently, clear particle images can be obtained, which can be digitally processed for accurate measurements of the particles.
[0043] The system also includes an imaging device that in one embodiment is capable of taking one or a plurality of images of a particle as it moves with respect to the detection volume. A series of images of the same particle can provide information with regard to the nature of the particle. For instance, information with regard to oscillations of the particle as it moves with respect to the detection volume can be obtained, which can provide information with regard to the state of the particle (e.g., fluid vs. solid, gas vs. liquid, etc.) and the nature of the particle (e.g., near freezing), as well as the nature of the event (e.g., a weather event). (See, e.g., Testik, et al., Journal of the Atmospheric Sciences 63 (2006) 2663-2668.)
[0044] A series of images of a particle can provide information with regard to collision behavior between particles, which can likewise provide information with regard to the characteristics of the particles and the nature of the event. For instance, the behavior of gas particles upon collision can provide information with regard to the state of the encapsulating fluid (e.g., the viscosity of the surrounding fluid) as well as the total energy of the system. Examination of collisions between hydrometeors in weather events can similarly provide information with regard to the state and nature of the hydrometeors (e.g., hail vs. rain, rain vs. freezing rain, etc.). (See, e.g., Testik, et al., Journal of the Atmospheric Sciences 68 (2011) 1097-1113; Testik, Atmospheric Research 94 (2009) 389-399.)
[0045] FIG. 1 schematically represents one embodiment of a system 10. The system includes an imaging device 3 and a sensor 2. The imaging device 3 can define a field of view 5 within which it is capable of capturing images. To improve imaging of the particles by use of the imaging device 3, it may be beneficial to include a light 1.
[0046] In this particular embodiment, the system 10 is designed to examine particles in a weather event and can include additional weather examination devices such as an anemometer 4. For example, a 3-dimensional ultrasonic anemometer (model: RM Young 81000) can be used to record the wind information. There can be multiple outputs from such a device. For instance, a typical anemometer 4 can include four different analog voltage outputs to provide information with regard to wind speed, wind direction, elevation, and sonic temperature.
[0047] The imaging device 3 can generally be any imaging device capable of obtaining an image of a particle as it passes through the field of view 5. In one embodiment, in order to obtain a series of images of a particle, the imaging device can be a high speed camera, a video camera, or the like. By way of example, the imaging device 3 may include a high speed imaging array sensor, such as a CMOS sensor or a charge-coupled device (CCD) sensor or the like, such as described in U.S. Pat. Nos. 5,550,677; 5,670,935; and 6,313,454, which are hereby incorporated herein by reference.
[0048] When utilizing a high-speed imaging device, the device can generally be capable of capturing multiple images of a particle as it passes through the field of view 5 of the system. For example, the high speed imaging device 3 can be capable of obtaining about 100 images per second or more in some embodiments, such as about 500 or more images per second, or about 1,000 or more images per second.
[0049] FIG. 2 is a view of the system 10 along the line AA of FIG. 1. In this view the field of view 5 is shown in cross section. The size of the field of view 5 can be adjusted depending upon the particular application of the system. For instance, when considering examination of a weather event, a field of view can be relatively small, for instance about 10 cm by about 10 cm or less in some embodiments, such as about 8 cm by about 8 cm. When considering the field of view with regard to the pixels of the imaging device 3, the field of view can be of any size; for instance, in one embodiment the field of view can be about 2000 pixels by about 2000 pixels or less, about 1500 pixels by about 1500 pixels or less, or about 1000 pixels by about 1000 pixels or less. Higher pixel fields of view are encompassed herein as well, i.e., about 2000 pixels by 2000 pixels or greater, such as about 5000 pixels by about 5000 pixels. Of course, the field of view is not limited to a square cross-sectional area, and any shape for the field of view is encompassed herein.
[0050] A relatively small view frame can be preferred in some embodiments, as this can provide a route to better observation of small particles with high accuracy. The camera view frame can be adjusted (if desired) by changing the camera lens setting and the position of the sensor, as is generally known to one of skill in the art.
[0051] When considering utilization of a system in an outdoor environment, it can be beneficial to utilize instruments capable of functioning in a wide range of weather environments. For example, it can prove beneficial to utilize an imaging device 3 that can be operated in a rugged environment including, without limitation, a wide range of temperatures (e.g., from about -40°C to about +50°C) and in the presence of natural interferences such as weeds, spider webs, wind, and so forth.
[0052] The system 10 can include a light 1 to improve imaging of the particles. For example, a light emitting diode (LED) device, optionally including a diffuser, can be included. A non-fluctuating light such as is possible by use of an LED device can be utilized in one embodiment, though the system is not limited to a non-fluctuating light. For instance, the processor can be programmed to compensate for light fluctuations as necessary. Use of a light can improve initial image quality for better image processing. A diffuser can also be incorporated to facilitate uniform background light in the images according to standard practice.
[0053] The system 10 also includes a sensor 2. The sensor is located so as to detect particles that pass through the field of view 5. Upon detection of a particle by the sensor 2, a processor 15 in communication with the sensor 2 and the imaging device 3 can send a trigger signal to the imaging device, and one or more images of the particle can then be obtained. These images can then be further processed to obtain one or more characteristics of the particle.
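By way of illustration only, a minimal MATLAB sketch of one possible sensor-to-camera trigger loop consistent with this description follows. The function names readSensorVoltage, triggerCamera, and saveFrames are hypothetical placeholders (not a vendor API), and the threshold and frame-count values are assumptions for illustration, not elements of the disclosed system.

    % Hypothetical trigger loop: poll the sensor, trigger the camera on detection
    V_THRESHOLD = 4.8;   % assumed trip level; a blocked beam drops the ~5 V output
    N_FRAMES = 20;       % assumed frames captured per trigger (user-selectable)
    while true
        v = readSensorVoltage();              % hypothetical DAQ read of receiver 7
        if v < V_THRESHOLD                    % particle partially blocks the beam
            frames = triggerCamera(N_FRAMES); % hypothetical camera trigger call
            saveFrames(frames);               % hand frames off for processing
        end
    end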
[0054] Any suitable sensing technology can be utilized to detect individual particles including, without limitation, optical sensing, impact sensing, temperature sensing, etc. In the embodiment illustrated in FIG. 2, the sensor 2 is an optical sensor that includes a transmitter 6 and a receiver 7. An optical sensor can function in a standard fashion. Briefly, a transmitter 6 can emit a signal that is received at a receiver 7. A particle moving through the signal can disrupt the reception of the signal, which can be detected, for instance by a change in voltage at the receiver 7. An optical sensor can utilize a signal in any visible or near visible spectrum as is known in the art. For instance, the sensor can operate in visible, infrared (IR), near IR, or ultraviolet (UV) spectra. The transmitter 6 can utilize optical fibers, lasers, or any other suitable emission device. In one embodiment, a line-scan camera, which utilizes a linear CCD array to assemble the images, can be utilized as an optical sensor 2.
[0055] In the embodiment of FIG. 2, and with regard to the direction of the particle motion, the sensor 2 is located downstream of (i.e., beneath) the field of view 5 of the imaging device 3. While the particular location of the sensor is not limited, provided that the sensor can detect particles that pass through the field of view 5, location of the sensor 2 downstream of the field of view 5 can be beneficial in some embodiments. For instance, location of the sensor 2 beneath the field of view 5 in a disdrometer application can help to reduce the effects of rain splashes due to the presence of the sensor. Optionally, the sensor 2 can include components that can further reduce interaction between the sensor 2 and the particles 8. For instance, the sensor 2 can include a covering that can catch and/or divert particles that could otherwise contact the sensor and interfere with accurate detection. However, as stated, with regard to the direction of motion of the particle, the sensor 2 can be located upstream of the field of view 5 or even within the field of view 5, provided that the sensor can detect particles that pass through the field of view 5.
[0056] In an embodiment such as that of FIG. 2, in which the sensor is downstream of the field of view 5 as the particles travel, the images obtained by the imaging device 3 can be recorded using a pre-trigger mode; e.g., the imaging device can operate continuously, and a number of the previously obtained images can be saved to memory upon detection of a particle at the sensor 2. The particular number to be saved can be programmed directly into the system or can be determined based upon some other criteria of the system; e.g., temperature, time, etc. can affect the particular number of images to be saved.
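The pre-trigger mode can be pictured as a circular frame buffer: the camera runs continuously, and when the downstream sensor fires, the most recent frames already in memory are written out. The following minimal MATLAB sketch is illustrative only; grabFrame, sensorFired, and saveFrames are hypothetical placeholders, and the buffer length is an assumed value.

    N = 20;                        % assumed number of pre-trigger frames to keep
    buf = cell(1, N);              % circular buffer of frames
    idx = 0;
    while true
        idx = mod(idx, N) + 1;
        buf{idx} = grabFrame();    % hypothetical continuous acquisition call
        if sensorFired()           % particle detected below the field of view
            % unwrap the buffer so the frames are in chronological order
            frames = [buf(idx+1:end), buf(1:idx)];
            saveFrames(frames);    % hypothetical save for later processing
        end
    end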
[0057] FIG. 3 illustrates one embodiment of an optical sensor including a transmitter 6 and a receiver 7 that define between them a sensing volume 9 (shown in a front cross-sectional view in FIG. 3). However, the sensor is not limited to optical sensors; any suitable sensor can be used. For instance, FIG. 4 illustrates a top view of an impact sensor 12 as may be utilized to detect a particle 8. An impact sensor 12 can utilize electrical, mechanical, electromechanical, etc. processes, or combinations of processes, to detect the presence of a particle 8. For instance, an impact sensor can utilize a sensor similar to that of a JWD disdrometer, as discussed previously.
[0058] As illustrated in FIG. 3, there is a possibility that multiple particles may be present in the sensing volume 9 at one time. To accurately identify each particle, the processor can be programmed to control the triggering of the camera. For example, after the first particle is detected by the sensor 2 and the processor triggers the imaging device, the system can be programmed such that the imaging device will not be triggered again before it finishes capturing the predefined number of frames for that particle. During this time, when a second particle is detected by the sensor, the timing information for this particle can be stored, for instance in a temporary output file. Following storing of the images for the first particle, the stored information for the second particle can then be accessed and transferred to the more permanent output file record, and so forth until the information in the temporary output file has been completely transferred to the permanent output file record. Thus, the overlapping data for each particle can be recorded and the record can be more complete. Unless the detection signals for the first and second particles are almost simultaneous, e.g., less than about 2 milliseconds apart, the system can obtain the characteristics of both of the particles. When considering weather events, the chance of two particles falling in the detection volume within such a short time span is extremely small, even under heavy rain conditions.
[0059] In another embodiment, should there be two sequential triggers within a relatively short time span (i.e., both particles are in the detection volume at one time), one or more images of the first particle can be collected. At the second trigger, the subsequent images can be saved under a file for the second trigger. For example, if the typical number of images that is saved is 20, and 5 images have been saved for a first particle at the time of a second trigger, then these five images are the images ascribed by the system to the first particle and the next 20 images are ascribed by the system to the second particle.
[0060] Other designs for accounting for multiple particles in the detection volume at a single time are likewise encompassed herein.
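As one concrete, purely illustrative reading of the bookkeeping in paragraphs [0058] and [0060], the runnable MATLAB sketch below simulates triggers that arrive while the camera is still capturing frames for an earlier particle: such triggers are time-stamped into a temporary list and transferred to the permanent record once the camera is free. The capture window and trigger times are assumed example values.

    CAPTURE_MS   = 20;               % assumed capture window: 20 frames at 1000 fps
    triggersMs   = [0 7 45 46 120];  % example trigger times (ms); 7 and 46 overlap
    busyUntil    = -inf;
    permanentLog = {};               % stand-in for the permanent output file
    tempLog      = [];               % stand-in for the temporary output file
    for t = triggersMs
        if t >= busyUntil
            busyUntil = t + CAPTURE_MS;   % camera free: accept this trigger
            permanentLog{end+1} = sprintf('particle at %d ms: frames captured', t);
            for tt = tempLog              % transfer overlapped triggers to the record
                permanentLog{end+1} = sprintf('particle at %d ms: timing only', tt);
            end
            tempLog = [];
        else
            tempLog(end+1) = t;           % camera busy: store timing info only
        end
    end
    fprintf('%s\n', permanentLog{:});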
[0061] FIG. 5 provides a perspective view of a detection volume 13 within which images of a particle 8 can be obtained. The detection volume 13 can include the volume defined by the intersection of the field of view 5 of the imaging device and the sensing volume 9 of the sensor. It should be understood that there is no particular limitation on the shape of the detection volume 13. Though illustrated as cubic, this is not a requirement of a system, and any cross-sectional shape that includes an intersection of the field of view 5 and a sensing volume 9 is encompassed, which can define the shape of the detection volume 13.
[0062] The detection volume 13 can be in a known relationship to the focal plane of the imaging device. This can provide for much clearer images of the particles as they pass through the field of view.
[0063] In one embodiment, illustrated in FIG. 6, the focal plane 14 of the imaging device can be within the detection volume 13. For instance, the sensing volume of the sensor 2 can be centered on the focal plane 14, thus centering the focal plane 14 in the depth dimension of the detection volume 13. In another embodiment, the focal plane 14 can be within the detection volume 13 but not necessarily centered in the detection volume, or even exterior to the detection volume 13 but in a known relationship to the detection volume.
[0064] In those embodiments in which the focal plane is within the detection volume, by limiting the distance that the detection volume 13 extends to either side of the focal plane 14, all of the particles 8 that pass through the detection volume 13 will be relatively close to the focal plane 14 and the images of the particles 8 can be clear. Thus, more accurate characteristics (e.g., size, density, etc.) can be obtained by use of the system. By way of example, the detection volume 13 can extend to a distance of about 1 cm or less, or about 5 mm or less in some embodiments, to one or both sides of the focal plane.
[0065] As the particles pass farther away from the focal plane, the images become more and more blurry and increase or decrease in size in the images. This presents great difficulty in obtaining accurate measurement characteristics of the particles. However, in some embodiments, it can be necessary to increase the effective depth of field of the detection volume 13 ('d' on FIG. 6) rather than recording only the particles that are very close to the focal plane 14. In such embodiments, the depth of the detection volume 13 can be designed to extend a greater distance from the focal plane through the introduction of focus correction factors to the image processing. Similarly, in those embodiments in which the focal plane is exterior to the detection volume, but in a known relationship to the detection volume, correction factors can be introduced in the processing of the images to provide accurate information with regard to the particles.
[0066] In one embodiment, illustrated in FIG. 7, a plurality of sensors 2a, 2b, 2c can be utilized side by side, each located at a known distance from the focal plane 14. This distance information can be used to develop correction factors for use in processing the images of the particles 8 that are detected by one of the sensors 2a, 2b, 2c, and thus provide a detection volume 13 having a larger depth d. The particular sensor triggered (2a, 2b, or 2c) for each detected particle can be determined by software or hardware methodology. For instance, multiplexers can be utilized to separate the signals from the different sensors. Using different correction factors that correspond to the different sensors 2a, 2b, 2c, accurate characteristics can be obtained for particles that pass within the detection volume 13, even at a relatively large distance from the focal plane 14.
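A hedged MATLAB sketch of this per-sensor correction idea follows. The sensor offsets, correction factors, and pixel calibration below are illustrative assumptions only; in practice the factors would be determined by calibration at each sensor's known distance from the focal plane.

    sensorOffsetMm = [-10 0 10];       % assumed offsets of sensors 2a, 2b, 2c (mm)
    corrFactor     = [1.04 1.00 0.96]; % assumed size corrections at those offsets

    sensorId   = 3;                    % which sensor fired (e.g., via a multiplexer)
    dPixels    = 57;                   % apparent diameter measured in the image (px)
    mmPerPixel = 70/1024;              % assumed: 70 mm field of view on 1024 px

    dCorrected = dPixels * mmPerPixel * corrFactor(sensorId);
    fprintf('corrected diameter: %.2f mm (sensor offset %+d mm)\n', ...
            dCorrected, sensorOffsetMm(sensorId));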
[0067] The improved image clarity of the system can greatly improve the accuracy of particle characterizations. For example, FIG. 9 compares an image of a particle as may be obtained from a system as disclosed herein (left image) to an image of the same particle taken at a distance from the focal plane and with no correction factors (right image). The lines are parallel and illustrate the error in the edge estimation of the image on the right. While the error for a single particle can be small, when multiplied over the course of an entire event, e.g., a weather event, errors in total particle volume, variations in particle sizes, etc., can become quite large.
[0068] FIG. 8 presents a flow diagram illustrating one embodiment for methods of using disclosed systems. As shown, the sensor, which will define a sensing volume according to the nature of the sensor, can define one or more dimensions of the detection volume. The view field of the imaging device can also define one or more dimensions of the detection volume. Upon alignment of the sensor with the imaging device, the detection volume will be defined.
[0069] When a particle passes through the detection volume, the sensor can detect the particle. This detection can trigger a processor to send a signal to the imaging device. When utilizing a high speed imaging camera, the number of frames captured for each sensor trigger can be preset in the camera software. Beneficially, the number of frames captured for each trigger can be controlled by the user and thus can be varied as desired for each application. For example, a sensor system can send a 5 volt digital signal (e.g., a Transistor-Transistor Logic (TTL) signal) to a processor that can then trigger the camera.
[0070] Any suitable processing system can be utilized, and can incorporate software and hardware components of the sensor, the camera, or both as well as an external processing unit, as desired. For example, the processing can take place within the sensor, within the imaging device, within an external device (e.g., an external CPU), or a combination of locations for the processing and communications between the various components of the system. For instance, a trigger at the sensor can be conveyed directly to an imaging device and/or through an external processing unit via an Ethernet cable or can be via wireless communication to instigate the taking and/or saving of the images.
[0071] Upon triggering the imaging device, the device can capture a predetermined number of sequential frames showing the particle as it passes through the detection volume. As discussed, the capturing instructions can be in either backward or forward mode, generally depending upon the relative location of the sensor to the detection volume (e.g., upstream or downstream as the particle travels).
[0072] The images can then be stored and/or transferred for further processing. For example, the imaging system can be controlled by a LabVIEW code. The output file from the code can contain all of the information regarding the triggering of the camera up to millisecond precision, or even higher precision in some embodiments. Data can be stored in the camera or in another device. For instance, the imaging device can temporarily store the data, and once the camera storage space is filled, the camera can automatically download the stored data to a computer prior to capturing additional images. Of course, other data storage methods can be utilized. For example, the data can be transferred in real time from the camera to a larger storage device, without the need for periodic data dumps.
[0073] The images thus obtained can then be processed to obtain information regarding one or more characteristics of the particles. In one embodiment, a reference image, i.e., an image captured when there are no particles in the detection volume, can be utilized during processing. This can provide a reference background. During processing of each particle image, the background can be subtracted, leaving only the particles against a lighter background.
[0074] The image can be converted into a binary image to obtain the number of particles and their approximate centroidal locations in the image. Based on the number of particles found in an image and their centroidal locations, a processor (e.g., software code) can draw an equal number of polygons that are slightly bigger than the particle diameter so as to analyze each individual particle in the image. Thus, there is a polygon enclosing each particle. The advantage of selecting a polygon slightly bigger than the particle diameter is that no information of that particle is lost. The code can then analyze the specific regions of interest, and the remaining regions can be ignored.
[0075] An edge detection algorithm (e.g., a Canny edge detection algorithm (Canny, 1986)) can be applied to detect the edges of the particles. Instructions including criteria to identify a single particle across the consecutive images can also be included in the processor.
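The processing chain of paragraphs [0073] through [0075] can be sketched with standard MATLAB Image Processing Toolbox calls, as below. This is a minimal sketch assuming 8-bit grayscale frames; the file names, noise-removal size, and region-of-interest padding are placeholders, and imabsdiff is used in place of signed subtraction to avoid unsigned-integer clipping.

    bg  = imread('reference.tif');          % assumed: empty detection volume, grayscale
    img = imread('frame_0001.tif');         % assumed: frame containing particles

    d  = imabsdiff(img, bg);                % |frame - reference|; particles stand out
    bw = imbinarize(d, graythresh(d));      % Otsu global threshold -> particle blobs
    bw = bwareaopen(bw, 10);                % drop isolated noise pixels (assumed size)

    stats = regionprops(bw, 'Centroid', 'BoundingBox'); % particle count + centroids
    pad = 5;                                % enlarge each region slightly so no
    for k = 1:numel(stats)                  % boundary information is lost
        roi   = imcrop(d, stats(k).BoundingBox + [-pad -pad 2*pad 2*pad]);
        edges = edge(roi, 'canny');         % Canny boundary of this particle
    end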
[0076] Following identification of the particle boundaries, geometric quantities of interest (e.g., diameter, location of the centroid, axis ratio) and dynamic quantities of interest (e.g., vertical and lateral fall velocity of the particles) can be calculated in pixel units. In those embodiments in which the shapes of the particles are non-spherical, a characteristic size of the particles can be defined, for instance as the diameter of an equivalent volume spherical particle (i.e., the equivalent volume diameter). The equivalent volume diameters can be calculated from the two-dimensional images of the particles by assuming rotational symmetry of the particles.
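Under the stated rotational symmetry assumption, the equivalent volume diameter can be computed by treating each pixel row of the particle silhouette as a thin disk. The minimal MATLAB sketch below uses a synthetic spherical silhouette and an assumed pixel calibration for demonstration; for a sphere, the result recovers the true diameter.

    mmPerPixel = 70/1024;                        % assumed: 70 mm view frame on 1024 px
    % synthetic silhouette for demonstration: a 3 mm sphere
    [X, Y] = meshgrid(1:200, 1:200);
    rPx = (3/2)/mmPerPixel;                      % 3 mm diameter expressed in pixels
    bwMask = (X-100).^2 + (Y-100).^2 <= rPx^2;   % binary silhouette (true inside)

    widths = sum(bwMask, 2) * mmPerPixel;        % chord width of each pixel row (mm)
    V   = sum(pi/4 * widths.^2 * mmPerPixel);    % stacked-disk volume (mm^3)
    Deq = (6*V/pi)^(1/3);                        % equivalent volume diameter (mm)
    fprintf('Deq = %.2f mm\n', Deq);             % ~3 mm for the synthetic sphere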
[0077] In one embodiment, multiple cameras can be utilized to obtain multiple images of the particles. For instance, multiple cameras can be located with their respective fields of view at an angle to one another so as to obtain three dimensional data of a particle. The field of view of a second and/or additional cameras can overlap the detection volume of a first camera or can define additional detection volumes with the sensing volume. For instance, a second camera can be located upstream or downstream of a first camera to effectively increase the size of a detection volume.
[0078] The vertical and lateral components of velocities can be calculated from the displacement of the centroids between two consecutive images with a known time interval. Once all of the geometric and dynamic quantities of interest are determined, the pixel units can be converted to real world units using predetermined conversion factors and an output file can be provided. For example, the output file can include equivalent volume diameter, vertical and lateral components of fall velocity and acceleration, axis ratio, and canting angle of the particles as well as particle size distribution, changes in total volumetric flow over the course of an event, variations in particle nature over the course of an event (e.g., rain to freezing rain to snow), and so forth.
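A minimal MATLAB sketch of this velocity calculation follows, using assumed example centroids and an assumed pixel calibration; at 1000 frames per second the inter-frame interval is 1 millisecond.

    mmPerPixel = 70/1024;     % assumed calibration (70 mm view frame, 1024 px)
    dt = 1e-3;                % s between consecutive frames at 1000 fps
    c1 = [512.0 300.0];       % centroid in frame n,   [x y] in pixels (example)
    c2 = [512.5 358.0];       % centroid in frame n+1 (y increases downward)

    v = (c2 - c1) * mmPerPixel / dt / 1000;   % [lateral vertical] velocity, m/s
    fprintf('lateral %.2f m/s, fall %.2f m/s\n', v(1), v(2));
    % Acceleration follows analogously from velocities of successive frame pairs:
    % a = (v2 - v1) / dt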
[0079] In addition, the information can be processed with information from other systems, e.g., radar, anemometers, heat sensors, etc. to provide additional information with regard to the event. For example, the data from various different systems can be synchronized to provide highly detailed information with regard to the particle motion during an event.
[0080] The disclosed systems and methods can be beneficially utilized in a wide variety of disciplines. For example, disclosed systems can be utilized to obtain information in manufacturing applications (e.g., particle formation and use), educational applications (e.g., environmental studies of weather events, animal movements and populations, etc.), medical applications (drug dose delivery determination), public health and safety (real time weather knowledge), and the like.
[0081 ] The disclosure may be further understood with reference to the Examples, set forth below.
Example 1
[0082] A set of tests was carried out to evaluate the accuracy of system measurements for drop geometry using spherical lenses of different diameters.
[0083] The different lenses had diameters of 0.5 mm, 1 mm, 3 mm and 5 mm with a diameter tolerance of ±2.5 μm. The lenses were made of either fused silica or sapphire, with refractive indices of 1.46 and 1.77, respectively. Both fused silica and sapphire spheres were used with each of the selected sizes.
[0084] In the tests, images of the spheres were captured at different distances from the focal plane in 1.59 mm increments within the camera's view field. A traverse that provided 1.59 mm horizontal movement per turn of its hand wheel was used to move the spheres. These tests provided information on the overall measurement errors (including both hardware and software related errors) of a system within the detection volume for the simplistic case of spherical drops. FIG. 11 shows maximum percentage errors in diameter measurements of different size spherical lenses at different distances from the focal plane within the measurement volume. This figure indicates that the maximum measurement error for spherical drops that are larger than 0.5 mm is less than approximately 10% of the actual drop diameter. As can be seen in FIG. 11, the maximum measurement error reduced as the drop size increased and became only approximately 2.9% of the actual diameter for 5 mm diameter spheres.
[0085] FIG. 12A presents an image of a spherical lens of 5 mm diameter and FIG. 12C illustrates a spherical lens of 3 mm diameter used in the tests. FIG. 12B and FIG. 12D are the processed images of each, respectively. Briefly, the images were processed using a digital image processing code that was developed using MATLAB. As the first step of the image processing, a background/reference image that did not include any of the spheres was subtracted from the images (i.e., the digital numeric value for the intensity of each pixel of the reference image was subtracted from that of the corresponding pixel in the captured image), and then the resulting images were inverted (i.e., the digital numeric value for the intensity of each pixel of the resulting image was subtracted from that of a white pixel).
[0086] These images were then converted into binary images to obtain the approximate centroidal locations of each sphere in a given image using the global image thresholding technique developed by Otsu (1979). The image processing code used this information to specify regions of interest for the subsequent edge detection analysis to determine the boundaries of each sphere in the image. The Canny edge detection algorithm (Canny, 1986) was used to determine the sphere boundaries (FIG. 12B, FIG. 12D).
Example 2
[0087] Laboratory tests with free-falling water drops were conducted. The schematic of the experimental setup is given in FIG. 13. The system included a constant-head tank 30 and a valve 32 upstream of a needle 34 to generate drops of nearly the same size to fall through a detection volume 36 throughout a test.
[0088] Different size drops were generated using different size needles 34 that were attached to the constant-head tank 30. The average equivalent volume drop diameters in these tests with different needles were 2.0 mm, 2.3 mm, 2.4 mm, 2.6 mm, 3.0 mm, 3.3 mm, and 5 mm. The generated drops fell approximately 60 cm under the action of gravity from the tip of the needle to the detection volume 36. The camera was an NX4-S1 from Integrated Design Tools Company, capable of capturing up to 1000 frames per second and having a digital storage capacity of 3 GB. Using this capacity, the camera could store up to 2325 frames.
[0089] Each of these drops triggered the capture of 10 images of the same water drop. The generated water drops were collected in a graduated cylinder and the total volume of the collected water (Vw) was measured to compare with the total volume of the drops measured by the system (Vd). The experimental error in measuring the collected water volume within the graduated cylinder was 0.5 mL. FIG. 14 shows the comparison of the volume measurements by the system and the collected water volume in the graduated cylinder. The maximum percentage error in system measurements of the water volume for all of the tests was 9.2%. It is important to note that this error value is conservative because detachment from the needle tip induced large amplitude drop oscillations, which were still present within the measurement volume. The water volume calculations for oscillating drops are subject to errors associated with the rotational symmetry assumption that is used to calculate the drop volume from a 2D image. Therefore, the system measurement errors calculated in these tests also include the errors associated with the rotational symmetry assumption.
Example 3
[0090] A photograph of the system used in field testing is shown in FIG. 15. The main components of the system were the high-speed CCD camera, an LED light with a diffuser, and a digital fiber optic sensing unit. These components were installed on a modular frame for easy assembly and alterations. The camera and the light were installed at the ends of the 160 cm long frame. The sensor was installed in between the camera and the light and centered on the camera focal plane, which was 60 cm away from the camera. The camera, LED light, and sensor were all ruggedized for field experiments. The camera was ruggedized in a Pelco outdoor housing with a heater, defroster, and cooler that provided an operational capability for the temperature range of -23°C to 49°C. The LED light was placed in a Plexiglas enclosure, and the sensor was placed in an impermeable enclosure that was designed to minimize the effects of raindrop splashes on the measurements.
[0091] The camera captured raindrop images at 1000 frames per second with a resolution of 1024 pixels x 1024 pixels and had a digital storage capacity of 3 GB (corresponding to 2325 frames). Although the camera could capture images continuously, it did not record all of the captured images. Instead, it recorded sequential images only when there was a raindrop within the detection volume, by utilizing the sensor-based camera triggering system. Once the digital storage space was filled up (i.e., 2325 frames, one batch of images), the images were downloaded to a computer automatically. During the download time (typically around 1-2 minutes), the system did not record raindrop images. As soon as the download was completed, the system restarted to record the next batch of images. The communication between the camera and the computer was via a category 6 Ethernet cable with a maximum data transfer rate of 1000 Mbps.
[0092] This system included an amplifier and two array thru-beam fiber optic sensors. One of the sensors acted as the transmitter and the other acted as the receiver of the signal. The amplifier and sensors were supplied by Keyence Corporation. The beam width of the sensor transmitter and receiver units was 5.25 mm. The transmitter and receiver units were placed at the focal distance at the corners of the bottom edge of the camera view frame (see FIG. 2). The detection volume was defined by the vertical size of the camera view frame (70 mm) and the horizontal sensing area, which is defined by the sensor beam width (5.25 mm, centered around the focal plane) and the transverse size of the camera view frame (70 mm). This configuration corresponded to a measurement volume of 25.73 cm³. An LED light (with 508 ultra-bright LED bulbs placed in a rectangular-shaped array of 33.3 cm x 17.1 cm, the equivalent of a 450 Watt incandescent bulb) served as the light source for the system. An LED light was selected as the light source because it provided a non-fluctuating light intensity and a rather uniform background light with the use of a diffuser. The result was a high-quality (i.e., sharp) raindrop image with a bright uniform background and a dark drop silhouette with a bright central region. It was easy to delineate the raindrop boundary (i.e., the two-dimensional raindrop shape, and subsequently various other geometric and dynamic quantities) from such sharp images accurately by using an image processing code such as described previously.
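As an arithmetic check of the quoted measurement volume (illustrative only):

    % 70 mm (vertical) x 5.25 mm (beam width) x 70 mm (transverse)
    V_mm3 = 70 * 5.25 * 70;               % = 25725 mm^3
    fprintf('%.3f cm^3\n', V_mm3/1000);   % 25.725 cm^3, i.e., 25.73 cm^3 as stated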
[0093] The performance of the system was evaluated through field tests under real rainfall events. Performance evaluations assessed the outdoor image quality (e.g., day and night operations), measurement accuracy (e.g., rainfall amount), the system's capability to monitor dynamic processes (e.g., raindrop oscillations), and its autonomous operation capability for an extended period of time, among others.
[0094] The co-ordinates and elevation of the field site were 34°37'26.48" N and 82°43'58.2" W, and 255 m above sea level. There were no trees or any other obstructions within an approximately one-mile radius of the field site, which eliminated the potential for flow field disturbances induced by obstructions. During the field tests, a trailer provided a dry working space and a 10 m tower served as the platform to install the system for measurements at 10 m above the ground. This height, which is the standard wind measurement height, was chosen for the system measurements to reduce any potential ground effects. In the field tests, a calibrated tipping bucket raingauge (Weather Measure Corp. Model P-501) was used to measure the rainfall amounts. The exact time information for each tip of the raingauge bucket was recorded for rainfall amount comparisons with the system measurements for the same time period.
[0095] The table below provides a summary of the conditions for the rainfall events during which field tests were performed.
Event   Batch   Start time     End time       Rain total (mm)
5       1       6:04:16 AM     6:28:02 AM     0.71
5       2       6:29:55 AM     6:42:58 AM     -
5       3       6:44:31 AM     6:57:18 AM     -
5       4       6:58:36 AM     7:07:33 AM     -
5       5       7:09:16 AM     7:17:06 AM     0.54
5       6       7:18:51 AM     7:31:05 AM     1.17
6       1       11:47:10 PM    11:56:59 PM    -
6       2       11:57:43 PM    12:11:18 AM    1.07
6       3       12:16:40 AM    12:28:58 AM    -
6       4       12:29:44 AM    12:40:00 AM    -
6       5       12:40:53 AM    12:59:04 AM    0.54
6       6       1:00:12 AM     1:10:55 AM     1.22
6       7       1:12:20 AM     1:19:01 AM     1.00
6       8       1:20:17 AM     1:27:17 AM     -
6       9       1:28:57 AM     1:35:29 AM     -
7       1       5:39:03 PM     5:55:24 PM     0.70
7       2       5:56:40 PM     6:07:06 PM     0.50
7       3       6:08:43 PM     6:25:02 PM     0.47
7       4       6:26:34 PM     7:04:13 PM     -
[0096] Sample raindrop images captured by the system are shown in FIG. 16 and FIG. 17. The images in FIG. 16 include an equilibrium-shaped raindrop (FIG. 16A) and a canted raindrop (FIG. 16C). These raindrop images are sharp and represent the typical image quality from the system measurements during rainfall events. The boundaries of the raindrops determined by the system's image processing software are shown in FIG. 16B and FIG. 16D, respectively, for visual comparison purposes.
[0097] Images of an oscillating raindrop are presented in FIG. 17 to demonstrate the advanced observational capabilities of the system for raindrop dynamics. Such image sequences can provide information on various characteristics such as oscillation frequency and amplitude.
[0098] Rain total measurements using the system were compared with the tipping bucket raingauge measurements. Since the system collects raindrop data in batches (each batch corresponding to the system's digital storage capacity), batch-wise system data was compared with the data from the rain gauge for the same time interval for consistency purposes. These comparisons for batches of data during the rainfall events are shown in FIG. 18. This figure indicates a good agreement between the rain total measurements from the system and the rain gauge. The solid line in the figure represents perfect agreement of the measurements from both of the instruments (i.e., the 45° line). Accurate rain total measurements by the system provide additional confidence in the accuracy of the measurements for the raindrop geometric characteristics, because accurate rain total measurements are only possible if the raindrop shape is identified accurately. It is important to note that raindrop shape identification is the basis for all of the calculations for the raindrop geometric characteristics, such as raindrop volume and equivalent volume diameter.
[0099] Typical raindrop fall velocity and raindrop size distribution measurements by the system are presented in FIG. 19A and FIG. 19B, respectively. These measurements were conducted during rainfall event #3 in the table above. In FIG. 19A, raindrop fall velocities measured by the system were compared with the terminal velocity predictions of Gunn and Kinzer (1949). In this graph, average raindrop fall velocities (symbols) for the diameter bins and the corresponding standard deviations (error bars) are plotted. FIG. 19B illustrates the drop size distribution as determined from the data. Standard Parsivel bin sizes are used for the fall velocity graph in FIG. 19A, and a uniform bin size of 0.2 mm is used for the DSD graph in FIG. 19B.
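For illustration, per-drop measurements can be binned into a fall velocity curve and drop size distribution counts as in FIG. 19A and FIG. 19B with a few MATLAB calls. The sketch below uses synthetic stand-in data; as a labeled assumption, the exponential fit of Atlas et al. (1973) is used as a stand-in for the Gunn and Kinzer (1949) terminal velocities, and no sampled-volume normalization (which would be needed for a true concentration N(D)) is attempted.

    rng(1);                                   % synthetic stand-in data only
    D = 0.4 + 4.6*rand(1, 500).^2;            % equivalent volume diameters (mm)
    v = 9.65 - 10.3*exp(-0.6*D);              % assumed Atlas et al. (1973) fall speeds (m/s)

    edges = 0:0.2:6;                          % uniform 0.2 mm bins, as for FIG. 19B
    [n, ~, bin] = histcounts(D, edges);
    vMean = accumarray(bin(:), v(:), [numel(edges)-1, 1], @mean, NaN);

    % n gives the per-bin drop counts underlying a DSD; converting counts to a
    % concentration N(D) additionally requires the effective sampled volume,
    % which depends on the triggering geometry and is not computed here.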
[0100] What have been described above are examples. It is, of course, not possible to describe every conceivable combination of components or methodologies for purposes of describing the subject matter, but one of ordinary skill in the art will recognize that many further combinations and permutations of the subject matter are possible. Accordingly, the disclosure is intended to embrace all such alterations, modifications, and variations that fall within the scope of this application, including any appended claims.

Claims

WHAT IS CLAIMED IS:
1. A system for determining one or more characteristics of a particle comprising: an imaging device defining a field of view;
a sensor defining a sensing volume, the sensing volume intersecting the field of view, a detection volume comprising the intersection; and
a processor in communication with the imaging device and with the sensor, the processor being configured such that, upon detection of the particle by the sensor, at least one image of the particle within the detection volume is obtained by the imaging device.
2. The system of claim 1, wherein the imaging device comprises a charge-coupled device or a CMOS device.
3. The system of claim 1, wherein the imaging device is capable of obtaining multiple images per second.
4. The system of claim 1, wherein the sensor is an optical sensor.
5. The system of claim 4, wherein the optical sensor operates in the visible spectrum, the infrared spectrum, the near infrared spectrum, or the ultraviolet spectrum.
6. The system of claim 4, wherein the optical sensor comprises an optical fiber.
7. The system of claim 4, wherein the optical sensor comprises a laser.
8. The system of claim 1, wherein the sensor is an impact sensor.
9. The system of claim 1, wherein the sensor is located downstream, upstream, or within the field of view with respect to motion of the particle.
10. The system of claim 1, the imaging device defining a focal plane, wherein the focal plane is within the sensing volume.
11. The system of claim 1, further comprising one or more additional sensors.
12. The system of claim 1, further comprising one or more additional imaging devices.
13. The system of claim 1, wherein the processor is in wireless communication with at least one of the imaging device and the sensor.
14. A method for determining one or more characteristics of a particle comprising: sensing the presence of a particle by use of a sensor;
obtaining at least one image of the particle by use of an imaging device, the image being an image of the particle within a detection volume, the detection volume being defined by the intersection of a sensing volume of the sensor and a field of view of the imaging device; and
processing the at least one image and thereby determining one or more characteristics of the particle.
15. The method of claim 14, wherein the particle is a hydrometeor.
16. The method of claim 14, wherein the particle is a gas particle, a liquid particle, a solid particle, or an animal.
17. The method of claim 14, wherein the one or more characteristics comprise particle type, particle volume, particle velocity, particle acceleration, oscillation behavior, nearness to a state change, or collision behavior.
18. The method of claim 14, further comprising combining the particle characteristics with information obtained from a second system to determine one or more secondary characteristics with regard to the actions and/or nature of a plurality of the particles.
19. The method of claim 18, wherein the one or more characteristics comprise the total volume of particles during a weather event.
20. The method of claim 18, wherein the one or more characteristics comprise a particle size distribution.
PCT/US2015/027214 2014-04-29 2015-04-23 Optical system for measuring particle characteristics WO2015167914A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201461985756P 2014-04-29 2014-04-29
US61/985,756 2014-04-29

Publications (1)

Publication Number Publication Date
WO2015167914A1 true WO2015167914A1 (en) 2015-11-05

Family

ID=54359174

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2015/027214 WO2015167914A1 (en) 2014-04-29 2015-04-23 Optical system for measuring particle characteristics

Country Status (1)

Country Link
WO (1) WO2015167914A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4679939A (en) * 1985-12-23 1987-07-14 The United States Of America As Represented By The Secretary Of The Air Force In situ small particle diagnostics
US7249502B2 (en) * 2001-09-24 2007-07-31 Vaisala Oyj Precipitation/hail sensor and method for precipitation rate measurement
US20070132599A1 (en) * 2005-11-19 2007-06-14 Dufaux Douglas P Apparatus and method for measuring precipitation
US20130205890A1 (en) * 2010-06-10 2013-08-15 Korea Meteorological Administration Disdrometer system having a three-dimensional laser array unit
CN103439756A (en) * 2013-07-31 2013-12-11 中国人民解放军理工大学 Natural precipitation particle micro physical characteristic measuring method based on particle forming speed measurement

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10026163B2 (en) 2015-02-25 2018-07-17 Cale Fallgatter Hydrometeor identification methods and systems
US11168854B2 (en) 2016-11-28 2021-11-09 Signify Holding B.V. Precipitation sensing via intelligent lighting
US11674878B2 (en) 2019-11-27 2023-06-13 University Of Utah Research Foundation Differential emissivity based evaporable particle measurement
US20210216039A1 (en) * 2020-03-27 2021-07-15 Jakub Wenus Image processing techniques using digital holographic microscopy

Similar Documents

Publication Publication Date Title
AU2021202277B2 (en) Avian detection systems and methods
WO2015167914A1 (en) Optical system for measuring particle characteristics
US7388662B2 (en) Real-time measuring of the spatial distribution of sprayed aerosol particles
US9342899B2 (en) Method for measuring microphysical characteristics of natural precipitation using particle image velocimetry
Brydegaard et al. Super resolution laser radar with blinking atmospheric particles: application to interacting flying insects
US10026163B2 (en) Hydrometeor identification methods and systems
Prata et al. Atmospheric processes affecting the separation of volcanic ash and SO2 in volcanic eruptions: inferences from the May 2011 Grímsvötn eruption
Beck et al. HoloGondel: in situ cloud observations on a cable car in the Swiss Alps using a holographic imager
CN102436015B (en) Method and rain gauge for measuring rainfall by pulse illumination optics
US9715039B2 (en) Apparatus and system for smart seeding within cloud formations
CN102301269A (en) Optical Sectioning Of A Sample And Detection Of Particles In A Sample
DE602006018418D1 (en) LIDAR device with increased pulse repetition rate
CN103869385A (en) Method and device for detecting rain amount through laser
Schön et al. Particle habit imaging using incoherent light: a first step toward a novel instrument for cloud microphysics
Muecke et al. Terrain echo probability assignment based on full-waveform airborne laser scanning observables
Buntov et al. Four-channel photoelectric counter of saltating sand particles
Maahn et al. Introducing the video in situ snowfall sensor (VISSS)
KR20170088113A (en) Method for real time measurement on meteorological and environmental parameters based on the convergence of TOF Lidar and camera, and measuring system
CN104010165A (en) Automatic collecting device for precipitation particle shadow image
Mylapore et al. A three-beam aerosol backscatter correlation lidar for three-component wind profiling
De Cock Design of a hydraulic nozzle with a narrow droplet size distribution
Garrett et al. The multi-angle snowflake camera
von Lerber Challenges in measuring winter precipitation: Advances in combining microwave remote sensing and surface observations
Chua et al. Effects of target reflectivity on the reflected laser pulse for range estimation
Montalban Advancing LiDAR Perception in Degraded Visual Environments: a Probabilistic Approach for Degradation Analysis and Inference of Visibility

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15786057

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 07/03/2017)

122 Ep: pct application non-entry in european phase

Ref document number: 15786057

Country of ref document: EP

Kind code of ref document: A1