US20070051890A1 - Sensor having differential polarization capability and a network comprised of several such sensors


Info

Publication number
US20070051890A1
US20070051890A1 (application US11/109,022)
Authority
US
United States
Prior art keywords
polarization
sensors
image
polarizer
set forth
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US11/109,022
Other versions
US7193214B1 (en)
Inventor
William Pittman
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
US Department of Army
Original Assignee
US Department of Army
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by US Department of Army filed Critical US Department of Army
Priority to US11/109,022
Assigned to UNITED STATES OF AMERICA, AS REPRESENTED BY THE SECRETARY OF THE ARMY. Assignment of assignors interest (see document for details). Assignor: PITTMAN, WILLIAM C.
Publication of US20070051890A1
Application granted granted Critical
Publication of US7193214B1
Expired - Fee Related
Adjusted expiration

Classifications

    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01LSEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144Devices controlled by radiation
    • H01L27/146Imager structures
    • H01L27/148Charge coupled imagers
    • H01L27/14806Structural or functional details thereof
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J4/00Measuring polarisation of light
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/21Polarisation-affecting properties
    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01LSEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144Devices controlled by radiation
    • H01L27/146Imager structures
    • H01L27/14601Structural or functional details thereof
    • H01L27/14625Optical elements or arrangements associated with the device
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/66Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N23/661Transmitting camera control signals through networks, e.g. control via the Internet
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/30Transforming light or analogous information into electric information
    • H04N5/33Transforming infrared radiation
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/25Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
    • G01N21/31Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry
    • G01N21/35Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry using infrared light
    • G01N21/359Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry using infrared light using near infrared light

Abstract

Polarimetric processing capability is added to a typical sensor so that a target object can be more clearly distinguished from the background clutter in a given scenery. A polarizer with several segments of different polarization orientations is used to improve the typical sensor. The segments are sequentially advanced to pass therethrough infrared radiation images of pre-selected polarization orientations, which are then collected by respective polarized frame grabbers. An image processing circuit processes these images to yield the polarization difference between any given pair of orthogonal polarizations. In a surveillance network, the polarization differences are subsequently used in the control center, to which such sensors are connected, to enhance the distinction of the observed objects against the background clutter suspended in the propagation medium.

Description

    DEDICATORY CLAUSE
  • The invention described herein may be manufactured, used and licensed by or for the Government for governmental purposes without the payment to me of any royalties thereon.
  • BACKGROUND OF THE INVENTION
  • In these times of heightened security, there are many instances when particular geographic areas may need to be placed under surveillance to protect potential targets from terrorist attacks or as an adjunct to military operations.
  • When an object is embedded in a medium such as rain or fog, the presence of scatterers in the medium causes the image of the object to appear indistinct or blurred, making object detection difficult. One aspect of man-made objects that may assist in their detection, however, is that they emit and reflect near-infrared radiation that is more highly polarized than that from natural backgrounds, which may include trees, brush, grass, or terrain.
  • There are two broad categories of infrared systems designed for military use: scanning systems and staring systems. However, infrared sensors for extant missile seekers and forward-looking infrared (FLIR) sensors lack the capability for detecting the polarization orientation of the incident radiation, i.e. they respond to any polarization orientation vector of the incident radiation. Studies have shown that the capability to detect and analyze the polarization orientation of near-infrared radiation emanating or reflecting from targets and background scenery can provide a potential means for improving the detection and discrimination of the targets in military systems. Infrared instrumentation having this feature would also have potential commercial applications in areas such as pattern recognition, materials characterization and analysis.
  • Thus, one way to improve the performance of a target detection system is to utilize polarized infrared signatures of a given scenery. What renders this possible is that polarization properties are independent of target-background contrast (i.e. polarization properties are distinct even when there is no temperature difference between the target and the background) and of normal infrared target and clutter temperature fluctuations (i.e. standard deviation of temperature). As is well known, the formal characterization of an electromagnetic wave, which may be linearly, elliptically or circularly polarized, is accomplished with the derivation of the four Stokes parameters, stated below for reference.
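A standard formulation of the Stokes parameters (general optics background, not specific to this patent), with I_θ denoting the intensity transmitted through a linear polarizer at angle θ, and I_R, I_L the right- and left-circularly polarized components:

$$
\begin{aligned}
S_0 &= I_{0} + I_{90} && \text{(total intensity)} \\
S_1 &= I_{0} - I_{90} && \text{(0° vs. 90° linear)} \\
S_2 &= I_{45} - I_{135} && \text{(45° vs. 135° linear)} \\
S_3 &= I_{R} - I_{L} && \text{(right vs. left circular)}
\end{aligned}
$$

Note that the four segment orientations of the polarizer described below (0°, 45°, 90°, and 135°) are exactly the measurements needed to recover the linear parameters S_0, S_1 and S_2; S_3 would require an additional circular (retarder-based) measurement.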
  • SUMMARY OF THE INVENTION
  • This invention relates to a system for detecting and processing passive and active polarized near-infrared radiation for applications in devices such as research, instrumentation, surveillance systems, infrared missile sensors or seekers, search and acquisition systems and guidance systems.
  • Polarimetric processing capability is added to a typical sensor so that a target object can be more clearly distinguished from the background clutter in a given scenery. An embodiment to accomplish this involves using a polarizer with several segments of different polarization orientations in conjunction with the microchannel image intensifier tube as taught by David B. Johnson, et al in U.S. Pat. No. 5,373,320 (Dec. 13, 1994), resulting in improved sensors. The polarization segments are sequentially advanced to pass therethrough infrared radiation of pre-selected polarization orientations, which radiation, then, impinges on the charge coupled device (CCD) camera. Multiple polarized frame grabbers coupled to the camera produce image frames of the respective pre-selected polarization orientations and image processing circuit then processes these image frames to yield the polarization difference between any given pair of orthogonal polarizations. The polarization differences are subsequently used to enhance the distinction of the objects against the background clutter suspended in the propagation path.
  • In a surveillance system, many such sensors placed to observe diverse geographical locations can be connected to a control center that controls and coordinates the functions of the individual sensors. Also, the control center receives relevant polarization information from the sensors and correlates the information for a more effective surveillance of the diverse locations.
  • DESCRIPTION OF THE DRAWING
  • FIG. 1 shows a diagram of a preferred embodiment of the improved sensor, the parts enclosed in the dashed lines indicating the improvement over the Johnson system.
  • FIG. 2 illustrates the several polarization segments of the polarizer 101.
  • FIG. 3 depicts a surveillance network comprising a plurality of the sensors.
  • FIG. 4 is a flowchart of a typical algorithm that may be used by control center 300 to process the inputs from individual sensors 100.
  • DESCRIPTION OF THE PREFERRED EMBODIMENT
  • This invention is an improvement that may be used with the camera attachment that converts a standard daylight video camera into a day/night-vision video camera as taught by Johnson in the U.S. Pat. No. 5,373,320 (Dec. 13, 1994). Therefore, the disclosure of the Johnson patent is incorporated herein in its entirety.
  • Referring now to the drawing wherein like numbers represent like parts in each of the several figures, the improvement to impart differential polarization capability to sensor 100 is explained in detail. Any and all of the numerical dimensions and values that follow should be taken as nominal values rather than absolutes or as a limitation on the scope of the invention.
  • In FIG. 1, near-infrared radiation 59 emanating or reflecting from the observed scenery (not shown in the figure) enters sensor 100 through iris lens assembly 60 which may be an auto iris lens. At a pre-selected low level of ambient illumination (usually night-time or a very dark day), image intensifier tube 61 having coupled thereto photocathode 117, and polarizer 101 swing into place to be in the path of the incoming radiation 59. The radiation then impinges on polarizer 101, a particular segment of the polarizer having a specific polarization orientation being in the path of the incoming radiation. The particular segment is one of several segments of the polarizer, each segment having a distinct polarization orientation and transmitting only the portion of the incoming radiation that has the same polarization orientation as that of the particular segment in the radiation path. As illustrated in FIG. 2, the several segments of polarizer 101 comprise four at 0°, 45°, 90°, and 135° orientations, respectively. Actuator 103, powered by power source 62, translates the polarizer segments sequentially in discrete steps along the length of image intensifier tube 61 so that they are sequentially engaged to be in the radiation path between iris lens assembly 60 and the image intensifier tube, which may be a two-dimensional microchannel plate.
  • The radiation transmitted through the engaged polarizer segment is then brought to a focus on photocathode 117 that covers all the pixel elements of the microchannel plate (image intensifier tube). It is noted here that the particular segment of the polarizer engaged at any given moment in time should be sufficiently large to cover all the pixel elements of the microchannel plate. For each pixel element of the radiated image, electrons are emitted from the rear of the photocathode. The electrons thusly emitted from each pixel position enter the corresponding channels of the microchannel plate and are multiplied in the channels of the microchannel plate, subsequently exiting as output light from the microchannel plate and being relayed by relay lens 63 to be incident on CCD camera 64.
  • A plurality of polarized frame grabbers 105, 107, 109 and 111, each with a different polarization orientation, are selectively connected to the CCD camera via first switch 119 and second switch 121. The operation of the switches is synchronized with the operation of actuator 103 such that the polarization orientation of the engaged polarizer segment is the same as that of the connected polarized frame grabber. In an exemplary operation of the above-described process, the image at 0° is read into frame grabber 105; then the actuator translates the polarizer to engage the 90° segment, resulting in the image at 90° being read into frame grabber 107. Then the logic circuitry at the output of the two polarized frame grabbers forms both the sum, V0 + V90, and the difference, V0 − V90, of the two orthogonal polarizations. From this, the normalized ratio is formed:

$$\text{Polarization Difference} = \frac{V_{0} - V_{90}}{V_{0} + V_{90}}$$
    This information is transmitted to control center 300 for further processing as will be explained below. Collecting two orthogonal polarizations of an object image and differentiating between the two as illustrated above improves the detectability of the object. This process is known as polarization difference imaging and its general principles are explained in U.S. Pat. No. 5,975,702 (Nov. 2, 1999). The process of summation and differencing and obtaining the ratio can be performed with any other pair of orthogonal polarizations.
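To make the computation above concrete, here is a minimal sketch in Python/NumPy (function and variable names are illustrative, not from the patent) that forms the normalized polarization difference for a pair of orthogonal image frames:

```python
import numpy as np

def polarization_difference(v0: np.ndarray, v90: np.ndarray) -> np.ndarray:
    """Normalized polarization difference (V0 - V90) / (V0 + V90), per pixel."""
    v0 = v0.astype(np.float64)
    v90 = v90.astype(np.float64)
    total = v0 + v90
    diff = v0 - v90
    # Guard against division by zero where the summed intensity is zero.
    return np.divide(diff, total, out=np.zeros_like(diff), where=total > 0)

# Example: two hypothetical 480x640 frames grabbed at the 0° and 90° segments.
frame_0 = np.random.default_rng(0).integers(0, 255, (480, 640))
frame_90 = np.random.default_rng(1).integers(0, 255, (480, 640))
pd_image = polarization_difference(frame_0, frame_90)  # values in [-1, 1]
```

The same routine applies unchanged to the other orthogonal pair (45° and 135°), as the text notes that summation, differencing, and the ratio can be performed with any pair of orthogonal polarizations.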
  • Since the polarizer functions with the microchannel plate, which amplifies low light levels (a condition prevalent at night), the polarizer is not operational during daylight operation of the sensor but is removed from the path of the incoming radiation. Thus, at a selected higher level of ambient illumination, under the control of control center 300 the image intensifier tube (microchannel plate) and the polarizer are rotated out of position and, in their place, optical path length compensator 67 is rotated into position and second switch 121 connects the CCD camera with unpolarized frame grabber 125. With the compensator in place, unpolarized imagery may now be collected by the unpolarized frame grabber and further processed by the control center.
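As a sketch of the day/night switchover logic just described (the threshold value and all names are hypothetical; the patent only speaks of a "pre-selected low level" and a "selected higher level" of ambient illumination):

```python
from dataclasses import dataclass

# Illustrative ambient-light threshold; not specified in the patent.
NIGHT_THRESHOLD_LUX = 10.0

@dataclass
class OpticalPath:
    intensifier_in_path: bool   # image intensifier tube 61 engaged
    polarizer_in_path: bool     # segmented polarizer 101 engaged
    compensator_in_path: bool   # optical path length compensator 67 engaged
    frame_grabber: str          # "polarized" (105/107/109/111) or "unpolarized" (125)

def select_path(ambient_lux: float) -> OpticalPath:
    """Night: intensifier + polarizer + polarized grabbers. Day: compensator + 125."""
    if ambient_lux < NIGHT_THRESHOLD_LUX:
        return OpticalPath(True, True, False, "polarized")
    return OpticalPath(False, False, True, "unpolarized")
```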
  • A plurality of ground sensors 100 (perhaps placed along a defensive perimeter to detect hostile intrusion by man or vehicles) may be connected via wireless links or fiber optic links 301 to control center 300 as depicted in FIG. 3 to form a surveillance network. Additional sensors (having the same or different capabilities from the ground sensors) on air-borne platforms may be connected to the control center as well for greater versatility of the network. In such a network, the control center performs correlation of the imagery inputs from the ground sensors with each other as well as with imagery inputs from air-borne sensors and other sources, such as satellites, to identify and track a target object against background clutter. For example, the satellite or air-borne sensors, having the capability to view a large area from their elevated positions, can alert the ground sensors to hostile movements toward the protected perimeter. The ground sensors, on the other hand, having visibility underneath vegetation canopy, may be able to alert the elevated sensors to movements hidden from their view. In addition to carrying data from the sensors to the control center, links 301 also allow the control center to adjust the settings at each of the sensors: for example, the frame readout rates of each sensor, the selective connections of switches 119 and 121, the occasional activation of power source 62, and the level of incoming radiation through iris lens assembly 60.
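The patent does not define a message format for these remote adjustments; purely as an illustration, a command record that a control center might push to one sensor over link 301 could look like the following (all field names are hypothetical):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SensorCommand:
    """Hypothetical control-center-to-sensor command; field names are illustrative."""
    sensor_id: int
    frame_readout_hz: float            # frame readout rate of this sensor
    grabber_selection: Optional[int]   # which grabber switches 119/121 connect, or None
    power_source_on: bool              # occasional activation of power source 62
    iris_level: float                  # incoming radiation level via iris lens assembly 60
```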
  • It is assumed that the control center is under the supervision of an expert image analyst, as indicated in FIG. 4, who is familiar with the topography and understands the physics of the diurnal and long-term changes in the scenes observed, even though the image processing operations may be performed autonomously or semi-autonomously. The terrain features being observed by the sensors in the surveillance system may include natural and man-made features, both of which change over time. The scenes being observed may include roads (a target area for emplacement of roadside bombs), building structures that may conceal snipers, a forest area that might conceal military movement, or a grassy area where land mines may be buried. In all these applications, the scene images are obtained at different points in time that may extend over a period of an hour to a day, or a period of a day to weeks or months. This requires that a signal model of each of the emplaced sensors be developed so that variations and imperfections over time can be identified and compensated for in each sensor. A suitable calibration procedure is described by Mehrdad Soumekh et al in “Time Series Processing of FLIR Imagery for MTI and Change Detection” in the IEEE's Proceedings of the International Conference on Acoustics, Speech and Signal Processing, Vol. 5, pages 21-24 (2003). The calibration procedure is performed under the supervision of the control center and its result becomes a part of the site model as depicted in FIG. 4.
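The Soumekh et al. paper details the actual procedure; as a simplified illustration of the kind of per-sensor signal model involved, the sketch below fits an assumed linear per-pixel gain/bias model against a reference frame stack. This is not the published algorithm, and all names are hypothetical:

```python
import numpy as np

def fit_gain_bias(observed: np.ndarray, reference: np.ndarray):
    """Least-squares fit of observed ≈ gain * reference + bias, per pixel.

    observed, reference: stacks of shape (T, H, W) imaging the same scene over time.
    Returns (gain, bias), each of shape (H, W).
    """
    x = reference.reshape(reference.shape[0], -1).astype(np.float64)
    y = observed.reshape(observed.shape[0], -1).astype(np.float64)
    x_mean, y_mean = x.mean(axis=0), y.mean(axis=0)
    cov = ((x - x_mean) * (y - y_mean)).mean(axis=0)   # per-pixel covariance
    var = ((x - x_mean) ** 2).mean(axis=0)             # per-pixel variance
    gain = np.divide(cov, var, out=np.ones_like(cov), where=var > 0)
    bias = y_mean - gain * x_mean
    return gain.reshape(observed.shape[1:]), bias.reshape(observed.shape[1:])
```

Applying the inverse of the fitted model to each sensor's output would compensate its drift before the frames enter the site model.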
  • Initially, to build the site model, an aerial image of a site (a given scene) is obtained as a reference. Then subsequent images of the site from emplaced ground sensors or air-borne sensors are used to update and verify the site model, utilizing the principle that when two or more images of a given scene are available and the geometric relationship between the images and the geo-spatial coordinates are known with some degree of accuracy, the collected images can be coordinated. FIG. 4 illustrates a representative software architecture for implementing signal processing operation between the multiple images obtained from the several ground emplaced sensors or aerial sensors. It is noted that the aerial sensors and the ground sensors may not be of the same type; it is envisioned that the aerial sensors would include all-weather sensors, such as synthetic aperture radar (SAR), but the ground sensors would not. The aerial sensors may also be hyperspectral sensors whose several spectral bands can be directly coordinated with the ground sensors.
  • The main operation of control center 300 is achieving the match between images from the same or different sensor types taken at different times and different resolutions. A common approach is to designate one image, an aerial image for example, as a reference and perform a rotation or warping of the second image to bring it into alignment with the reference image. Many algorithms are already known in the art that can be used to coordinate multiple images, depending on the nature of the particular images. For example, when comparing infrared images at wavelengths shorter than 4-5 microns and those at longer infrared wavelengths, a suitable algorithm used to achieve a match between the day-time images and night-time images is the Multivariate Mutual Information Algorithm as is explained in “Registration of Image Cubes Using Multivariate Mutual Information,” by Jeffrey P. Kern et al in IEEE's Thirty-Seventh ASILOMAR Conference: Signals, Systems and Computers, Vol. 2, pages 1645-1649 (2003). Another example is when the air-borne sensor includes a SAR; in such a case, the fusion of the SAR image with the near-infrared image will be the required signal processing. Yet a third example is when the targets are far from the sensors and thus appear in the image as points against the background clutter. In this case, the necessary signal processing may include a multi-scale decomposition based on wavelets that allows the detection and characterization of the dynamic elements in the scene. An algorithm to accomplish such decomposition is explained in “Multiple Single Pixel Dim Target Detection in Infrared Image Sequence,” by Mukesh A. Zaveri et al in IEEE's Proceedings of International Conference on Circuits and Systems, Vol. 2, pages II-380 to II-383 (2000).
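The cited Kern et al. algorithm is multivariate and operates on image cubes; as an illustration of only its bivariate core, the sketch below scores how well two images agree by computing their mutual information from a joint histogram (a registration routine would maximize this score over candidate rotations or warps; names are illustrative):

```python
import numpy as np

def mutual_information(img_a: np.ndarray, img_b: np.ndarray, bins: int = 64) -> float:
    """Mutual information of two same-size images, estimated from a joint histogram."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()                     # joint probability
    px = pxy.sum(axis=1, keepdims=True)           # marginal of img_a
    py = pxy.sum(axis=0, keepdims=True)           # marginal of img_b
    nz = pxy > 0                                  # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())
```

Higher scores indicate better statistical dependence between the two images, which is why mutual information is useful for matching day-time against night-time imagery where raw intensities differ.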
  • Although a particular embodiment and form of this invention have been illustrated, it is apparent that various modifications and embodiments of the invention may be made by those skilled in the art without departing from the scope and spirit of the foregoing disclosure. For example, to obtain two frames of orthogonal polarizations at the same instant in a given sensor, one can use a stereo camera with the two orthogonal polarizers fixed to the two separate sets of optics so that the two frames of orthogonal polarizations are obtained at the same instant. This may avoid the image smearing that can happen if motion occurs during the transition of the polarizer from one polarization segment to another, but would correspondingly require an additional intensifier, relay lens, and switching mechanism, rendering the system bulkier and more complicated. Another modification is to use sensors that are capable of switching between daylight and night-vision operational modes. To provide differential polarization during daylight operations, a second polarizer-actuator assembly would need to be integrated with the optical elements of the sensor. Accordingly, the scope of the invention should be limited only by the claims appended hereto.

Claims (15)

1. In a system for viewing a pre-selected scenery in the near-infrared spectrum, the system having an iris lens assembly for collecting incoming radiation, a charge-coupled device camera; a relay lens for focusing the collected radiation onto the CCD camera; a microchannel image intensifier positioned between the iris lens assembly and the relay lens, an optical path length compensator selectively-placeable between the iris lens and the image intensifier, the intensifier amplifying the incoming radiation level prior to transmitting the radiation to the relay lens; and a photocathode coupled to the image intensifier, AN IMPROVEMENT for providing a polarimetric processing capability to the system so as to enable distinction of various features of the pre-selected scenery, said IMPROVEMENT comprising: a polarizer placeable between the iris lens and the photocathode, said polarizer being composed of several discrete segments of different polarization orientations and being translatable along the length of the image intensifier in discrete steps so as to bring sequentially said discrete polarization segments to be engaged between the iris lens and the photocathode, each of said segments being of sufficient size to cover all the pixels of the image intensifier; a plurality of polarized frame grabbers coupled to the CCD camera, each polarized frame grabber receiving radiation of a particular polarization orientation and outputting an image frame of that particular polarization orientation; and a logic circuit for performing image processing on said image frame outputs from said polarized frame grabbers and producing therefrom polarization difference of any given pair of orthogonal polarization orientations, said polarization difference being used to enhance the visual perception of the diverse features of the scenery.
2. (canceled)
3. An IMPROVEMENT as set forth in claim 2, wherein said improvement further comprises: a means to translate said polarizer in said sequential manner along the length of the image intensifier.
4. An IMPROVEMENT as set forth in claim 3, wherein each of said several discrete segments of said polarizer covers all of the pixel elements in the microchannel image intensifier.
5. An IMPROVEMENT as set forth in claim 4, wherein said improvement still further comprises: a first switch coupled between the CCD camera and said polarized frame grabbers, said first switch selectively connecting, from time to time, said CCD camera with one of said polarized frame grabbers.
6. An IMPROVEMENT as set forth in claim 5, wherein said first switch is selectively connected to said polarized frame grabbers in synchronization with said translating means such that, at any given moment in time, said engaged polarization segment and said connected polarized frame grabber are of the same polarization orientation.
7-11. (canceled)
12. A surveillance network for surveilling multiple sceneries in the near-infrared spectrum, said network comprising: a plurality of ground-based sensors positioned for optically sensing objects in the various sceneries; and a control center, said control center being coupled to each of said ground-based sensors, said center receiving image inputs from said sensors relative to their respective sensed objects and processing, using appropriate algorithms, said inputs to identify and track said sensed objects.
13. A surveillance network as set forth in claim 12, wherein said network further comprises: a plurality of air-borne sensors for sensing objects in the sceneries, said air-borne sensors being coupled to said control center and inputting to said center their respective image information.
14. A surveillance network as set forth in claim 13, wherein said center is coupled to each of said ground-based and air-borne sensors via a wireless link or optical fiber link.
15. A surveillance network as set forth in claim 14, wherein each of said sensors is capable of polarimetric processing of near-infrared radiation that emanates from said sceneries and is incident on said sensors to produce polarization difference of any given pair of orthogonal polarization orientations, said polarization differences being input to said control center, thereby enabling said control center to distinguish target objects from background clutter in the sceneries.
16. A surveillance network as set forth in claim 15, wherein each of said sensors comprises: an iris lens assembly for collecting incoming radiation, a charge-coupled device camera; a relay lens for focusing the collected radiation onto said CCD camera; a microchannel image intensifier positioned between said iris lens assembly and said relay lens, an optical path length compensator selectively-placeable between said iris lens and said image intensifier, said intensifier amplifying the incoming radiation level prior to transmitting the radiation to said relay lens; a photocathode coupled to said image intensifier; a polarizer composed of several discrete segments embodying different polarization orientations, said polarizer being placeable between said iris lens and said photocathode, said placement of said polarizer alternating selectively with the placement of said optical path length compensator in the same position; a plurality of polarized frame grabbers coupled to said CCD camera, each polarized frame grabber receiving radiation of a particular polarization orientation and producing an image frame of that particular polarization orientation; and a logic circuit for performing image processing on said image frame outputs from said polarized frame grabbers and producing therefrom polarization difference of any given pair of orthogonal polarization orientations, said polarization difference being transmitted to said control center to be used thereby to enhance the visual distinction of target objects against background clutter.
17. A surveillance network as set forth in claim 16, wherein said polarizer in each sensor is disposed to be translatable linearly along the length of said image intensifier in discrete steps so as to bring sequentially said discrete polarization segments to be engaged between said iris lens and said photocathode and said each sensor contains therein, an actuator for translating said polarizer in said sequential manner.
18. A surveillance network as set forth in claim 17, wherein each said sensor further comprises: a switch coupled between said CCD camera and said polarized frame grabbers, said switch selectively connecting, from time to time, said CCD camera with one of said polarized frame grabbers, said selective connecting of said switch being synchronized with said translating means such that at any given moment in time, said engaged polarization segment and said connected polarized frame grabber are of the same polarization orientation.
19. A surveillance network as set forth in claim 18, wherein each said sensor still further comprises: an unpolarized frame grabber, said unpolarized frame grabber being activated only when said optical path length compensator is placed between said iris lens and said image intensifier.
US11/109,022 2005-04-08 2005-04-08 Sensor having differential polarization and a network comprised of several such sensors Expired - Fee Related US7193214B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/109,022 US7193214B1 (en) 2005-04-08 2005-04-08 Sensor having differential polarization and a network comprised of several such sensors

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US11/109,022 US7193214B1 (en) 2005-04-08 2005-04-08 Sensor having differential polarization and a network comprised of several such sensors

Publications (2)

Publication Number Publication Date
US20070051890A1 (en) 2007-03-08
US7193214B1 US7193214B1 (en) 2007-03-20

Family

ID=37829206

Family Applications (1)

Application Number Title Priority Date Filing Date
US11/109,022 Expired - Fee Related US7193214B1 (en) 2005-04-08 2005-04-08 Sensor having differential polarization and a network comprised of several such sensors

Country Status (1)

Country Link
US (1) US7193214B1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7381937B2 (en) * 2005-09-13 2008-06-03 The United States Of America As Represented By The Secretary Of The Army Image analysis and enhancement system
CN101569205B (en) * 2007-06-15 2012-05-23 松下电器产业株式会社 Image processing device
US9420202B1 (en) * 2015-04-01 2016-08-16 Aviation Specialties Unlimited, Inc. Compact intensified camera module
CN105959514B (en) * 2016-04-20 2018-09-21 河海大学 A kind of weak signal target imaging detection device

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5386119A (en) * 1993-03-25 1995-01-31 Hughes Aircraft Company Apparatus and method for thick wafer measurement
US5373320A (en) * 1993-05-18 1994-12-13 Intevac, Inc. Surveillance system having a microchannel image intensifier tube
US5416324A (en) * 1993-06-11 1995-05-16 Chun; Cornell S. L. Optical imaging device with integrated polarizer
US5557261A (en) * 1994-05-06 1996-09-17 Nichols Research Corporation Ice monitoring and detection system
US5483066A (en) * 1994-06-08 1996-01-09 Loral Corporation Polarization diverse infrared object recognition system and method
US6023061A (en) * 1995-12-04 2000-02-08 Microcam Corporation Miniature infrared camera
US5975702A (en) * 1996-03-15 1999-11-02 The Trustees Of The University Of Pennsylvania Method of using polarization differencing to improve vision
US6075235A (en) * 1997-01-02 2000-06-13 Chun; Cornell Seu Lun High-resolution polarization-sensitive imaging sensors
US6563582B1 (en) * 1998-10-07 2003-05-13 Cornell Seu Lun Chun Achromatic retarder array for polarization imaging
US6310345B1 (en) * 1999-10-12 2001-10-30 The United States Of America As Represented By The Secretary Of The Army Polarization-resolving infrared imager
US6583415B2 (en) * 2001-04-30 2003-06-24 Lockheed Martin Corporation Method and system for dynamically polarizing electro-optic signals
US20020180866A1 (en) * 2001-05-29 2002-12-05 Monroe David A. Modular sensor array
US6987256B2 (en) * 2004-05-24 2006-01-17 The United States Of America As Represented By The Secretary Of The Army Polarized semi-active laser last pulse logic seeker using a staring focal plane array

Cited By (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7310440B1 (en) * 2001-07-13 2007-12-18 Bae Systems Information And Electronic Systems Integration Inc. Replacement sensor model for optimal image exploitation
GB2451494A (en) * 2007-08-01 2009-02-04 Qinetiq Ltd Polarimetric imaging apparatus
GB2451494B (en) * 2007-08-01 2011-06-08 Qinetiq Ltd Polarimetric imaging apparatus
US9664612B2 (en) * 2010-06-25 2017-05-30 Steropes Technologies, Llc Method and apparatus for generating three-dimensional image information
US20130176566A1 (en) * 2010-06-25 2013-07-11 Front Street Investment Management Inc., As Manager For Front Street Diversified Income Class Method and Apparatus for Generating Three-Dimensional Image Information
US8611642B2 (en) * 2011-11-17 2013-12-17 Apple Inc. Forming a steroscopic image using range map
US20130129193A1 (en) * 2011-11-17 2013-05-23 Sen Wang Forming a steroscopic image using range map
US9041819B2 (en) 2011-11-17 2015-05-26 Apple Inc. Method for stabilizing a digital video
US20130265577A1 (en) * 2012-04-09 2013-10-10 Kla-Tencor Corporation Variable Polarization Wafer Inspection
TWI588473B (en) * 2012-04-09 2017-06-21 克萊譚克公司 Variable polarization wafer inspection
US9239295B2 (en) * 2012-04-09 2016-01-19 Kla-Tencor Corp. Variable polarization wafer inspection
KR101784334B1 (en) 2014-02-06 2017-10-11 더 보잉 컴파니 Systems and methods for measuring polarization of light in images
US9464938B2 (en) 2014-02-06 2016-10-11 The Boeing Company Systems and methods for measuring polarization of light in images
EP2905590A1 (en) * 2014-02-06 2015-08-12 The Boeing Company Systems and methods for measuring polarization of light in images
US20150373286A1 (en) * 2014-06-19 2015-12-24 JVC Kenwood Corporation Imaging device and infrared image generation method
US9967478B2 (en) * 2014-06-19 2018-05-08 JVC Kenwood Corporation Imaging device and infrared image generation method
US10871561B2 (en) 2015-03-25 2020-12-22 Urthecast Corp. Apparatus and methods for synthetic aperture radar with digital beamforming
US10615513B2 (en) 2015-06-16 2020-04-07 Urthecast Corp Efficient planar phased array antenna assembly
US10955546B2 (en) 2015-11-25 2021-03-23 Urthecast Corp. Synthetic aperture radar imaging apparatus and methods
US11754703B2 (en) 2015-11-25 2023-09-12 Spacealpha Insights Corp. Synthetic aperture radar imaging apparatus and methods
US11378682B2 (en) 2017-05-23 2022-07-05 Spacealpha Insights Corp. Synthetic aperture radar imaging apparatus and methods for moving targets
US11506778B2 (en) 2017-05-23 2022-11-22 Spacealpha Insights Corp. Synthetic aperture radar imaging apparatus and methods
US11525910B2 (en) 2017-11-22 2022-12-13 Spacealpha Insights Corp. Synthetic aperture radar apparatus and methods
EP4104004A4 (en) * 2020-02-11 2024-03-06 Valve Corp Polarimetry camera for high fidelity surface characterization measurements
US11396354B2 (en) * 2020-04-15 2022-07-26 Bae Systems Information And Electronic Systems Integration Inc. Covert underwater navigation via polarimetry

Also Published As

Publication number Publication date
US7193214B1 (en) 2007-03-20

Similar Documents

Publication Publication Date Title
US7193214B1 (en) Sensor having differential polarization and a network comprised of several such sensors
US11778289B2 (en) Multi-camera imaging systems
US9906718B2 (en) Biomimetic integrated optical sensor (BIOS) system
US7541588B2 (en) Infrared laser illuminated imaging systems and methods
US6781127B1 (en) Common aperture fused reflective/thermal emitted sensor and system
US20040119020A1 (en) Multi-mode optical imager
US11392805B2 (en) Compact multi-sensor fusion system with shared aperture
US6982743B2 (en) Multispectral omnidirectional optical sensor and methods therefor
JP2009075107A (en) Correlated ghost imager
US10621438B2 (en) Hybrid hyperspectral augmented reality device
US11635328B2 (en) Combined multi-spectral and polarization sensor
US20150185079A1 (en) Hyper-Spectral and Hyper-Spatial Search, Track and Recognition Sensor
US20130235211A1 (en) Multifunctional Bispectral Imaging Method and Device
US11125623B2 (en) Satellite onboard imaging systems and methods for space applications
Perić et al. Analysis of SWIR imagers application in electro-optical systems
US10733442B2 (en) Optical surveillance system
Sadler et al. Mobile optical detection system for counter-surveillance
Dulski et al. Infrared uncooled cameras used in multi-sensor systems for perimeter protection
WO2009156362A1 (en) Imaging apparatus and method
Pręgowski Infrared detector arrays in the detection, automation and robotics-trends and development perspectives
KR20210094872A (en) Integrated monitoring system using multi sensor
Chenault et al. Pyxis: enhanced thermal imaging with a division of focal plane polarimeter
Lavigne et al. Enhanced military target discrimination using active and passive polarimetric imagery
Eismann Emerging research directions in air-to-ground target detection and discrimination
WO2011141329A1 (en) System and method for detection of buried objects

Legal Events

Date Code Title Description
AS Assignment

Owner name: UNITED STATES OF AMERICA, AS REPRESENTED BY THE SECRETARY OF THE ARMY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PITTMAN, WILLIAM C.;REEL/FRAME:018852/0576

Effective date: 20050405

FPAY Fee payment

Year of fee payment: 4

REMI Maintenance fee reminder mailed
LAPS Lapse for failure to pay maintenance fees
STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20150320