WO2011139150A1 - Improved optical rangefinding and imaging apparatus - Google Patents

Publication number
WO2011139150A1
Authority
WO
WIPO (PCT)
Application number
PCT/NL2011/050301
Other languages
French (fr)
Inventor
Aleksey Nikolaevich Simonov
Michiel Christiaan Rombach
Original Assignee
Asmr Holding B.V.
Application filed by Asmr Holding B.V. filed Critical Asmr Holding B.V.
Publication of WO2011139150A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00 Measuring distances in line of sight; Optical rangefinders
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04 Interpretation of pictures

Definitions

  • Active rangefinding is based on emission of a signal, for example sound, laser light or radar waves, by an emitter linked to the rangefinder, and on detection and analysis of reflections of the signal from a target object by the rangefinder.
  • Passive rangefinding is based on detection and analysis of signals emitted by the target itself, for example the infrared emission of an engine, or, alternatively, of ambient light or ambient radio waves scattered by the target.
  • Passive optical rangefinders have been used since the nineteenth century and are generally based on geometrical methods, i.e. measurement of angles. In stadimeters the distance is calculated from the angular size of an object whose dimensions are known a priori; for example, the distance to a battleship can be calculated once its size and angle of view are known.
  • Passive rangefinding triangulation methods are also employed in modern photographic cameras, for example, K. Engelhardt and R. Knop, Appl. Opt. 34, 2339-2344, 1995, to estimate the degree of defocus at multiple locations in the image field.
  • Modern cameras calculate the range by phase-detection or contrast-measurement methods, which can be passive in sufficient ambient light but become active under low-light conditions; for example, a camera may have to pre-flash during the rangefinding procedure prior to the actual picture taking.
  • Phase detection methods as in, for example, WO2005/098501 and US2008/205872 analyze the sub-images for differences in light intensity after splitting the light coming from the primary objective into several channels with precisely known light-path lengths.
  • Contrast measurement methods as in, for example, US2006109369 and US2004017502, optimize the contrast of the image, or part of the image, on the photosensor by changing the focusing condition and detect the optimal contrast corresponding to the optimal focus.
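The contrast-measurement approach described above can be sketched numerically: sweep the focusing condition, score each candidate image with a sharpness metric, and keep the sharpest. This is a generic illustration of contrast-detection focusing, not the specific method of the cited applications; the metric (variance of a discrete Laplacian) and all names are assumptions.

```python
import numpy as np

def focus_metric(image):
    """Variance-of-Laplacian sharpness score: higher means sharper.

    One common contrast metric for contrast-detection autofocus;
    the cited patents may use a different metric.
    """
    img = image.astype(float)
    # Discrete Laplacian via second differences along both axes
    # (periodic boundaries via np.roll, adequate for a relative score).
    lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
           np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4 * img)
    return lap.var()

def best_focus(images):
    """Return the index of the sharpest image in a focus sweep."""
    return int(np.argmax([focus_metric(im) for im in images]))
```

In a real camera the sweep is over lens positions; here it is simply a list of candidate images.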
  • The above methods differ, in virtually all their aspects, from the rangefinder and imaging apparatus and corresponding methods described in the present document, which require neither triangulation nor angular measurement of the object, and neither contrast evaluation nor comparison of phase-delayed sub-images.
  • a rangefinder determines the range, distance, from the apparatus to an object.
  • An optical rangefinder determines the range by means of light, for example, visible, or infrared (IR), or ultraviolet (UV).
  • An imaging system, alternatively imaging optics, comprises focusing and relaying optics such as lenses, mirrors, beam-splitters and various other optical components, functionalities and assemblies thereof, as, for example, in a camera, to project light onto a photosensor.
  • a photosensor is an electronic component converting light intensity into an electronic signal, or digital/electronic image.
  • a rangefinder comprises at least one optical channel which is a combination of an imaging system along the optical path of the incoming light and at least one photosensor.
  • independent photosensors can be either physically separate components or separate sectors on a single photosensor.
  • a rangefinder with multiple optical channels may comprise individual optical channels, generally physically separate with each optical channel comprising individual optical components and individual photosensors.
  • Such a rangefinder may include optical components and/or photosensors shared between the optical channels.
  • An optical rangefinder can include at least two independent optical channels in a specific spatial optical arrangement, for example, all the channels tilted symmetrically with respect to the optical axis, or, alternatively, one of the channels tilted additionally in the tangential plane while other channels have optical axes positioned in the sagittal plane of the optical arrangement.
  • Optical channels in a rangefinder can also be separated based on polarization or spectral characteristics of light. For example, a Wollaston prism separates the incoming unpolarized light beam into two orthogonal, linearly polarized outgoing beams that can be further transformed (and mixed again) and detected by a photosensor.
  • An optical rangefinder with multiple optical channels is referred to as having a temporal optical arrangement when it comprises at least one optical channel that switches sequentially, over time, from one configuration to another configuration. Such switching may be achieved by, for example, displacing the whole individual optical channel from one spatial position to another, or, alternatively, by sequentially turning an optical component, for example a mirror or lens, in a given optical channel.
  • Temporal optical arrangements can be achieved, for example, in most of the modern cameras by adaptation of the vibration correction function included in the lens or sensor systems.
  • Optical arrangement of a rangefinder, in the context of this document, includes any physical arrangement of optical, mechanical and electronic components as well as any virtual arrangement obtained, for example, by digital processing of the electronic images. Both types of optical arrangements (i.e. physical and virtual) fall within this definition.
  • An image is in focus, or sharp, when the image plane coincides with the in-focus image plane; otherwise the image is defocused, or blurred, or, alternatively, the image has a certain degree of defocus.
  • object and image conform to Goodman's definitions for a generalized imaging system (J.W. Goodman, Introduction to Fourier Optics, McGraw-Hill Co., Inc., New York, 1996, Chap. 6).
  • object and image refer to multidimensional data arrays or functions representing distribution of light in the object and image planes, respectively.
  • Spectral response and spatial spectrum: Spectral response is generally obtained by Fourier transformation of the intensity impulse response or, alternatively, by other transformations including, but not restricted to, wavelet decomposition and other spectral decompositions, where a spectral decomposition is the decomposition of the image into a basis of eigenfunctions of an appropriate operator in Hilbert space.
  • Spatial spectrum of an image is obtained by the spectral decomposition of the image data.
  • the spatial spectrum is a multidimensional array of complex values.
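The spatial spectrum as defined above (a multidimensional array of complex values obtained by spectral decomposition of the image data) can be illustrated with the discrete Fourier transform, one of the decompositions the text names. A minimal sketch; the helper name is invented.

```python
import numpy as np

def spatial_spectrum(image):
    """Spatial spectrum of a 2-D image: a complex-valued array obtained
    here by discrete Fourier transformation, with the zero spatial
    frequency shifted to the centre of the array."""
    return np.fft.fftshift(np.fft.fft2(image.astype(float)))
```

For a uniform (featureless) image all spectral power sits at zero spatial frequency, i.e. at the centre of the shifted array.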
  • a reference pattern, in the context of the present document, denotes any pattern, usually in the spatial-spectrum domain, that can be used as a reference scale to measure the degree of displacement of the image spectrum, generally caused by defocus but not restricted to this optical aberration only.
  • the reference pattern can be any synthetic structure in the spatial spectrum domain, for example, a single line, or a grid of lines.
  • the reference pattern can be a characteristic pattern obtained from the inherent spectral response of the optical arrangement, or optical mask.
  • the characteristic pattern can be conveniently represented by the modulus of the incoherent optical transfer function (OTF), i.e. the modulation transfer function (MTF), calculated with the mask amplitude and phase functions.
  • the characteristic pattern is useful for measuring and evaluating the displacement of the spatial spectra.
  • a periodic pattern of lines is an example of the characteristic pattern resulting from the optical mask with a chiral prismatic element, for example, a half-aperture prismatic optical mask.
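The MTF "calculated with the mask amplitude and phase functions" can be computed numerically from a sampled pupil function using the standard Fourier-optics relations (incoherent PSF as the squared modulus of the Fourier transform of the pupil; OTF as the normalized Fourier transform of the PSF). A sketch under those assumptions; the function name is invented and sampling details are simplified.

```python
import numpy as np

def mtf_from_pupil(pupil):
    """MTF (modulus of the incoherent OTF) from a complex pupil
    function that encodes the mask amplitude and phase.

    Incoherent PSF = |FT(pupil)|^2; OTF = FT(PSF), normalized to
    unity at zero spatial frequency; MTF = |OTF|.
    """
    psf = np.abs(np.fft.fft2(pupil)) ** 2
    otf = np.fft.fft2(psf)
    otf = otf / otf[0, 0]          # unit response at zero frequency
    return np.abs(np.fft.fftshift(otf))
```

Because the PSF is non-negative, the MTF is maximal (equal to 1) at zero spatial frequency, which is a quick sanity check on the computation.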
  • defocus map denotes the distribution, generally two-dimensional, of degrees of defocus obtained for a spatially extended object/scene, a depth map provides distances of multiple objects or multiple sub-images and a wavefront map describes the distortion of the wavefront, for example, in the plane of the exit pupil and measured relative to the Gaussian reference sphere.
  • Said maps can be static maps, providing information for a single point in time, or, alternatively, active maps, providing updated information at, for example, real-time speed.
  • EDF is short for Extended Depth of Field.
  • Optical chirality and arrangements - Optical chirality results from chiral modulation, including chiral phase modulation, which, in turn, provides modulation of the angular momentum of light.
  • Modulation of angular momentum includes any change of the angular momentum carried by electromagnetic waves, or, alternatively, its flux (see, for example, S.M. Barnett, J. Opt. B4, S7-S17, 2002).
  • General properties as well as the mathematical description of light beams carrying optical angular momentum are presented in Optical Angular Momentum, L. Allen et al., IOP, 2003, which document is included in this document by reference.
  • Modulation of angular momentum of light, and detection of the degree of such momentum, can be achieved in a number of ways by a number of optical arrangements, of which a few examples are provided below, but which are not necessarily restricted thereto:
  • modulation of angular momentum can be provided by a chiral optical element, which element includes at least one optical surface resulting in chiral phase modulation of the light beam, or alternative light signal.
  • chiral modulation can be obtained with a half-prismatic optical mask.
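The half-prismatic mask mentioned above can be modelled numerically as a phase-only pupil mask in which one half of the aperture carries a linear (prismatic) phase ramp and the other half is left unmodified. This is one plausible numerical model, not the patent's exact mask; array size, slope parameter and all names are assumptions, and whether a particular mask is chiral must be checked against the definitions above.

```python
import numpy as np

def half_prism_mask(n, slope):
    """Complex transmission of a hypothetical half-aperture prismatic
    mask on an n x n grid: the upper half of the pupil receives a
    linear phase ramp of the given slope along x, the lower half is
    unmodulated.  The modulus is 1 everywhere (phase-only mask)."""
    y, x = np.mgrid[0:n, 0:n]
    xc = (x - (n - 1) / 2) / n          # centred x coordinate in [-0.5, 0.5]
    phase = np.where(y < n // 2, slope * xc, 0.0)
    return np.exp(1j * phase)
```

Multiplying such a mask into a pupil function (e.g. the `mtf_from_pupil` style computation above, if used) produces the fringe-type characteristic pattern discussed in the text.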
  • modulation of angular momentum can be provided by a chiral optical arrangement comprised of multiple independent channels of a multichannel imaging system or combination of several independent imaging systems, for example an arrangement with two imaging systems of which one of the channels is tilted optically versus the other channel (for details of such an arrangement based on optical tilt see below).
  • a chiral optical arrangement can be a combination of at least two optical channels, i. e. two independent imaging systems, which channels are arranged such that the combination provides modulation of angular momentum of the incoming light.
  • modulation of angular momentum can be provided by digital processing, in silico, from subimages from a single image resulting from a single exposure on a single photosensor, or, alternatively, be provided by processing, in silico, of a combination of any number of images from any number of combinations of imaging systems and photosensors.
  • An optical chiral arrangement with multiple optical channels can be a spatial chiral arrangement comprising, for example, at least two independent imaging channels that are not separately chiral, but the combination of channels is a chiral arrangement.
  • an optical chiral arrangement can be a temporal chiral arrangement including, for example, at least one non-chiral imaging channel which provides at least two consecutive images of an object at different positions of the optical channel such that the image taking positions constitute a chiral arrangement.
  • an independent imaging channel may consist of an imaging system and a photosensor.
  • Modulation of angular momentum can be provided by chiral modulation of phase, which chiral phase modulation is associated with the chiral phase function.
  • the chiral phase function can be conveniently represented by a three-dimensional chiral surface. By definition, the mirror image of a chiral surface cannot be mapped to the original surface by rotations and translations.
  • an object is defined to be chiral if it is not invariant under parity transformation.
  • a chiral optical arrangement is an arrangement of an optical system that modulates a light beam such that the light approaching the image plane contains chiral phase modulation, or, alternatively, the generalized pupil function of the optical system has a chiral phase.
  • Chiral modulation is the amplitude and phase modulation of light resulting from the chiral optical arrangement. Chirality signs, or, alternatively, directions (clockwise or counterclockwise, or right-handed or left-handed), should preferably be the same for each particular optical mask, i.e. one mask should be comprised of a chirality function, or a combination of chirality functions, of the same sign, although combinations of chirality functions of opposite signs are not excluded.
  • the degree of chirality can, in simple cases, for example a vortex, be quantitatively measured in terms of topological charge; in other cases (for example, a complex three-dimensional surface) the degree of chirality can be calculated in terms of the continuous chirality measure, see, for example, Salomon et al., J. Mater. Chem. 25, 295-308, 1999.
  • Figure 1 shows one of the preferred embodiments of the optical rangefinder.
  • Light from an object, 1, is collected by an imaging system, 2, and projected onto a photosensor, 3.
  • the imaging system comprises optical elements for imaging, 4, and, in this example, one modulator, 5, in the plane of the exit pupil, 6.
  • the modulator modulates the angular momentum of the incoming light, 7 and the electronic image is further processed by digital processing means, 8.
  • Figure 2 shows an alternative embodiment of the optical rangefinder including a polarization demodulator, 9, to convert the degree of spin momentum into a
  • Figure 3 shows an alternative embodiment of the optical rangefinder depicted in Figure 2 including a medium, 10, adapted to convert orbital angular momentum into spin angular momentum and vice versa. (For references 1-9 see Figs. 1 and 2.)
  • Figure 4 shows an alternative embodiment of the optical rangefinder depicted in Figure 3, adapted to convert spin angular momentum into orbital angular momentum. (For references 1-9 see Figs. 1 and 3.)
  • Figure 5 shows the preferred embodiment of the optical rangefinder with two independent optical channels.
  • Light from an object, 1, is projected by two independent imaging systems, 11, 12, on two independent photosensors, 13, 14.
  • the optical axes, 15, 16, of the respective imaging systems, 11, 12, make angles with measures 17 and 18 versus the mutual projection axis 19.
  • the axis 19 is the orthogonal projection of the axis 15 on the plane of the photosensor 13, where the plane of the photosensor 13 is defined by axes 16 and 19.
  • the angles 17 and 18 of both imaging systems can be chosen to make the rangefinder arrangement chiral or non-chiral.
  • the digital images from photosensors are processed by digital processing means, 20, in this example a laptop computer.
  • Figure 6 illustrates rangefinding with the apparatus described in Fig. 5.
  • Images 21 and 22, acquired by two independent photosensors at a given distance from the object (USAF 1951 resolution chart) to the rangefinder, are combined into a single image 23.
  • the spatial spectrum, 24, of the image 23 exhibits a distinct periodic pattern of lines rotated by an angle with the measure 25 versus a reference line 26.
  • the angular position of the periodic pattern of lines is indicated by an additional line, 27.
  • the reference line 26 is chosen to be along the vertical axis.
  • images 28 and 29 acquired by two independent photosensors give rise to a combined image 30.
  • the spectrum, 31, of the combined image 30 reveals a periodic pattern of lines rotated by a different angle with the measure 32 with respect to the vertical axis.
  • Prior calibration of the rotation angle versus the distance to the object, for example a USAF 1951 resolution chart, allows determination of the absolute range from the rangefinder to any object of interest.
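The calibration step described above amounts to a lookup: measure the pattern rotation angle at several known distances once, then invert that curve for any later measurement. The sketch below illustrates the idea with interpolation; all calibration numbers are invented placeholders (real values depend entirely on the particular optical arrangement), and the function name is an assumption.

```python
import numpy as np

# Hypothetical calibration data: rotation angle of the periodic
# pattern (degrees) measured with a test chart at known distances
# (metres).  Angles are assumed to decrease monotonically with range.
cal_distance = np.array([0.5, 1.0, 2.0, 4.0, 8.0])    # metres
cal_angle = np.array([40.0, 25.0, 14.0, 7.0, 3.0])    # degrees

def range_from_angle(angle):
    """Absolute range for a measured pattern rotation angle, by linear
    interpolation of the calibration curve.  np.interp requires an
    increasing abscissa, hence the reversed arrays."""
    return float(np.interp(angle, cal_angle[::-1], cal_distance[::-1]))
```

A measured angle falling between calibration points is interpolated linearly; a denser calibration or a fitted model would improve accuracy.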
  • Figure 7 illustrates the diffraction of a light beam carrying orbital angular momentum.
  • An angular measure 35 represents the angular position of an image axis 36, a preferred direction of intensity distribution, with respect to a vertical axis 37.
  • Such a diffraction picture takes place when the beam propagates in free space or in an imaging system (in the latter case the beam spatial coordinates should be replaced by the "lens-transformed" coordinates; see also explanations in the text).
  • the relative lateral shift (caused by defocus) of two images can be calculated and the unknown range can be found from the defocus-related shift.
  • passive ranging with the extended depth of field according to US7218448B1 is obtained by combining a linear phase modulation filter with the cubic phase modulation mask.
  • the extended depth of ranging can be attained without additional elements and phase masks by using the directional sensitivity of the optical arrangement, for example, by shaping the exit pupil of the optical system.
  • WO2010/080030 which is, by reference, included in full in the present document, describes a system and methods for passive ranging comprising a prismatic phase-mask and a single optical channel.
  • the present invention comprises a rangefinder with multiple optical channels in a chiral configuration and multiple photosensors.
  • WO2010/080030 does not disclose embodiments employing modulation of the angular momentum of light, which includes spin and orbital parts, and WO2010/080030 does not consider rangefinding using the transformations of spin angular momentum into orbital angular momentum and vice versa.
  • WO2010/080030 discloses optical arrangements with multiple optical channels adapted to provide chiral modulation of the incoming light, as shown, for example, in Figures 11 and 12 and, for example, in a combination of Claims 1, 2 and 4.
  • WO2010/080030 also covers non-chiral optical arrangements with multiple non-chiral optical channels with shared optical elements (e. g. photosensor), as it can be constructed, for example, according to Claims 1 and 4 of WO2010/080030.
  • One of the preferred embodiments in the present document is a rangefinder with individual non-chiral optical channels (having no shared optical components or photosensors) in which the virtual, chiral or non-chiral, optical arrangement (providing rangefinding functionality) is achieved by combining in silico the images from the individual photosensors or from separate sectors of the same photosensor.
  • By symmetrizing the canonical stress tensor T_αβ in Eq. (3) and choosing a proper divergence term for J_αβγ in Eq. (2) (see, for example, F.J. Belinfante, Physica VI(9), 887-898, 1939), Eq. (3) can be transformed to the gauge-invariant form satisfying Eq. (2)
  • the angular momentum flux density for a light beam propagating along the Z - axis (through the T - plane) is
  • E(r) ≡ E(x, y, z) = ∫∫ E₀(k₁, k₂) exp(ik·r) dk₁ dk₂,   (13)
  • The formulas derived in this section can be generalized for a light beam propagating through an anisotropic (e.g. birefringent, gyrotropic) medium (see, for example, L. Allen et al. and references therein).
  • the total orbital momentum flux is a combination of the partial chiral-mode powers
  • As follows from Eqs. (22), (25) and (28), the resulting orbital momentum flux depends on light polarization as well as on phase (chirality).
  • phase chirality/helicity of the light wave results in rotation of a light beam when the beam propagates in free space. Similar rotation effects take place when the light beam passes through a defocused optical system, and the rotation can be determined for any given defocus θ.
  • I₁(ω_x, ω_y) = H₁(ω_x, ω_y) I₀(ω_x, ω_y),   (36)
  • I₂(ω_x, ω_y) = H₂(ω_x, ω_y) I₀(ω_x, ω_y) exp[−i(ω_x Δx + ω_y Δy)].
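In the spectral relations above, the defocus-related lateral shift (Δx, Δy) between the two channel images enters as a linear phase factor in the spatial spectrum. A standard way to recover such a shift is phase correlation, sketched below. This is an illustrative technique consistent with that relation, not the patent's prescribed algorithm; the function name is invented and the channels are assumed to have near-identical transfer functions.

```python
import numpy as np

def lateral_shift(img1, img2):
    """Estimate the integer lateral shift (dy, dx) such that img2 is
    (approximately) img1 circularly shifted by that amount, via phase
    correlation: the normalized cross-power spectrum keeps only the
    linear phase, whose inverse transform is a displaced delta peak."""
    f1 = np.fft.fft2(img1.astype(float))
    f2 = np.fft.fft2(img2.astype(float))
    cross = np.conj(f1) * f2
    cross /= np.abs(cross) + 1e-12      # keep only the phase
    corr = np.abs(np.fft.ifft2(cross))
    dy, dx = np.unravel_index(int(np.argmax(corr)), corr.shape)
    n, m = corr.shape
    # Unwrap cyclic indices to signed shifts.
    dy = dy - n if dy > n // 2 else dy
    dx = dx - m if dx > m // 2 else dx
    return dy, dx
```

Once the shift is measured, a prior shift-versus-range calibration (as in Fig. 6 for the rotation angle) converts it to an absolute distance.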
  • the spatial/temporal optical arrangement of a rangefinder apparatus can be made chiral with at least one independent optical channel.
  • a rangefinder with a spatial chiral arrangement comprising two independent optical channels, where each channel has its own optical mask in the plane of the exit pupil.
  • the system may comprise only one channel which moves from one position to another position taking two consecutive pictures of the same object/scene.
  • the second mask is a square opening x ∈ [d − a/2, d + a/2], y ∈ [−b/2, b/2] with a wedge prism introducing the phase delay
  • H₁(ω_x, ω_y, θ) = H₀(ω_x, ω_y, θ) exp(−2iθω_x c),   (42)
  • in which H₀(ω_x, ω_y, θ), given by Eq. (43), is the defocused OTF of a rectangular aperture of size a × b and θ is the defocus parameter as in A.N. Simonov and M.C. Rombach, Opt. Lett. 36, 115-117 (2011). Similarly, the OTF of the second channel becomes
  • Φ(θ) = θ(Aω_x + Bω_y)/2 + θω_x(d − c).   (46)
  • H(ω_x, ω_y, φ) contains a periodic characteristic pattern of fringes specified by the term ~sin(Φ). From Eqs. (45), (46) it follows that the inclination of the periodic pattern caused by defocusing occurs only if B ≠ 0; in this case the two-channel arrangement is chiral.
  • an effective chiral arrangement can be constructed.
  • This arrangement allows accurate measurement of ranges by, for example, determining the tilting of the characteristic pattern.
  • the system with two independent channels provides two undistorted images (when θ is small) of the original object/scene. When θ is not small but can be determined by the rangefinder, an in-focus image of the original object/scene can be obtained by digital processing, e.g. Wiener deconvolution, among other methods.
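The Wiener deconvolution mentioned above can be sketched in a few lines once the channel OTF for the derived degree of defocus is known. A minimal illustration of the standard Wiener filter, not the patent's specific implementation; the function name and the noise-to-signal ratio parameter `nsr` are assumptions.

```python
import numpy as np

def wiener_deconvolve(image, otf, nsr=0.01):
    """Restore an in-focus image from a defocused one by Wiener
    deconvolution.  `otf` is the (unshifted) OTF of the channel for
    the derived defocus; `nsr` is an assumed noise-to-signal power
    ratio that regularizes frequencies where |otf| is small."""
    G = np.fft.fft2(image.astype(float))
    W = np.conj(otf) / (np.abs(otf) ** 2 + nsr)   # Wiener filter
    return np.real(np.fft.ifft2(W * G))
```

With low noise (small `nsr`) and an OTF without zeros, the restoration is nearly exact; larger `nsr` trades sharpness for noise suppression.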
  • Optical rangefinding and imaging apparatus comprises, as main components, optics and processing means. Additionally, the apparatus can comprise supporting components such as, for example, display means.
  • Optical arrangements can include at least one optical channel adapted to project at least two images of at least one object onto at least one photosensor.
  • Examples are two independent optical arrangements, as in, for example, two cameras with their own independent optical channels and their own independent photosensors; or, alternatively, one shared optical channel, a beam splitter and two independent photosensors; or, alternatively, two independent optical channels projecting onto a shared photosensor; and other arrangements not mentioned here, including arrangements with more than two optical channels and more than two photosensors.
  • This document discloses optical arrangements which are chiral optical arrangements adapted to provide at least one chiral arrangement of at least two images, which is an arrangement according to the principles of chirality as set forth elsewhere in this document and in the document WO2010/080030.
  • Processing means include image processing means adapted to provide processing of digital images, spectral processing means adapted to provide spatial spectra and to derive the degree of defocus, and range processing means adapted to provide the range of said object.
  • the image processing means can be processing means as found in standard consumer cameras or, alternatively, can be dedicated processing means, for example, to increase image contrast, reduce noise, or remove or interpolate data points that are not usable due to, for example, sensor saturation; the image processing means can also be adapted to provide sharp or in-focus image restoration from the images detected by the photosensor by, for example, Wiener deconvolution (see the theoretical background above).
  • the spectral processing means process images according to principles for analysis and comparison of image spatial spectra according to
  • the spectral processing means is adapted to derive the degree of defocus from the degree of displacement of the spatial spectrum of the chiral combination of images versus a reference.
  • the range means is adapted to provide the distance from at least one object to the apparatus by deriving said range from said degree of defocus.
  • a chiral combination of images can be the merging of at least two images into a single image which is chiral, as in Fig. 6, with 21 and 22 being the two images and 23 the resulting chiral combination.
  • a chiral combination of images can also be achieved by virtually comparing the spectra of two images without merging the images, as explained in the theoretical background, in the section giving an example of determining the mutual shift of images acquired by different photosensors that belong to different, but highly similar, optical channels, elsewhere in the present document.
  • the reference versus which the spatial spectrum of the chiral combination of images is compared can be, for example, any pattern in the spatial spectrum that can be used as a reference scale to measure the degree of displacement of the spatial spectrum caused by defocus.
  • the reference pattern can be any synthetic structure in the spatial spectrum, for example, a single line, or a grid of lines.
  • the reference pattern can be a characteristic pattern obtained from the inherent spectral response of the chiral optical arrangement.
  • the characteristic pattern can be conveniently represented by the modulus of the incoherent optical transfer function (OTF), i.e. the modulation transfer function (MTF), calculated with the corresponding chiral optical arrangement.
  • the characteristic pattern is useful for measuring and evaluating the displacement of the spatial spectra.
  • a periodic pattern of lines is an example of the characteristic pattern resulting from the chiral optical arrangement.
  • the image processing means can be adapted to provide at least one in-focus image of said object by processing of at least one digital image from at least one optical channel, so that the optical arrangement and image processing means function as a camera. Note that stereoscopic imaging, i.e. imaging suitable for 3D viewing by stereoscopy, is an option for optical arrangements including multiple cameras.
  • Additional processing means can be adapted to combine the ranges of at least two objects into a depth map.
  • objects can be physical objects or, alternatively, objects can be any single detectable feature, for example, a small area with contrast, an edge or a dot.
  • Depth maps contain and, if displayed, show the relative distance of objects in a scene. Depth maps are becoming a main tool for the analysis of scenes and a basis for novel concepts in 3D visualization.
  • the method for optical rangefinding and imaging comprises projecting at least two images of at least one object onto at least one photosensor by at least one optical arrangement including at least one optical channel; processing of digital images by image processing means; deriving the degree of defocus from the spatial spectrum of at least two images by a combination of spectral and defocus processing means; and providing the range of said object by range processing means.
  • the method comprises providing at least one chiral combination of at least two images by at least one chiral optical arrangement, deriving the degree of defocus from the degree of displacement of the spatial spectrum of the chiral combination of images versus a reference by the combination of spectral and defocus processing means, and, providing the distance from at least one object to the apparatus by deriving said range from said degree of defocus by the range means.
  • the method can comprise image processing of at least one digital image from at least one optical channel to provide at least one in-focus image of said object by image processing means and the method can comprise additional processing combining the ranges of at least two objects into a depth map.
  • the preferred embodiments of the present invention include optical rangefinders operating in visible, infrared or ultraviolet light, which fits the majority of practical applications. However, the described methods for rangefinding can be applied to all wave phenomena (for example, acoustic waves, RF fields, etc.).
  • Multiple channels with independent optics.
  • An alternative embodiment of a rangefinder with multiple optical channels may comprise individual optical components and individual photosensors.
  • In a rangefinder comprising two physically separate imaging systems with two individual photosensors, the required sensitivity to defocus can be achieved by tilting the optical axis of one of the optical channels with respect to the other (see the theoretical background above).
  • the combination of independent channels can be a spatial/temporal chiral optical arrangement adapted to provide chiral modulation.
  • Such spatial/temporal chiral arrangement may include, for example, at least one independent optical channel to project at least two images of the same object/scene on at least one photosensor such that the combination of images is adapted to provide range information by processing at least two images.
  • the processing may include evaluating a degree of displacement of the spatial spectrum in response to defocus of the image, or composing a combined (digital) image with a spatial spectrum containing a well-defined characteristic pattern which evolves with the focusing error, and hence range (see the theoretical background in the present document).
  • the chiral optical arrangement can be achieved, for example, by tilting one of the optical channels versus another channel in an asymmetric tilting arrangement (see also the Terms and Definitions section in the present document and Fig. 5).
  • a rangefinder may comprise only one optical channel adapted to obtain at least two images of the same object in time such that the rangefinder arrangement becomes chiral in time.
  • the optical arrangement of such a rangefinder and imaging apparatus can be adapted to provide modulation of the incoming light such that the images, i.e. digital images, acquired by the photosensors exhibit mutual changes in response to defocus that are easily detected by processing of their spatial spectra, which spectra are obtained by spectral processing means.
  • defocusing in the rangefinder and imaging apparatus may cause a linear change, i.e. a displacement in phase, of one spatial spectrum with respect to another; see the theoretical section above.
  • the digital images from the different optical channels can be pre-processed by image processing means, for example, to equalize light intensities in the digital images, or increase their contrast.
  • the degree of displacement of the image spectra caused by defocus can be converted into the corresponding degree of defocus relative to the in-focus image plane. With the derived degree of defocus, one of the digital images can be reconstructed, for example by the Wiener filtering method, as explained in WO2010/080030, and a final in-focus image can be obtained.
  • each optical channel may be a standard camera and the optical arrangement of the rangefinder may only require tilting of one camera with respect to the other.
  • each optical channel provides an undistorted, i.e. with no mask-related distortions, image of the object of interest, though the image may be defocused.
  • the image might still be in-focus when the object is at a large distance because even very small focusing errors will be detected for rangefinding purposes, which errors do not necessarily deteriorate the image.
  • a sharp and in-focus image can be further reconstructed, for example, by the Wiener filtering method (J.W. Goodman, Introduction to Fourier Optics, McGraw-Hill Co., Inc., New York, 1996, Chap. 8).
  • Suitable configurations and optical arrangements of a rangefinder with multiple independent optical channels are determined by the application requirements.
  • a chiral optical arrangement is one of the preferred optical arrangements of a rangefinder.
  • an optical arrangement of a rangefinder can be non-chiral, but still result in detectable defocus-related changes in the images acquired by the photosensors.
  • Accurate rangefinding can be achieved by processing whole images or their multiple pixel portions, i.e. sub-images.
  • An example of such image processing is given in the theoretical section of the present document.
  • processing of multiple sub-images can be utilized to obtain depth maps of distant objects/scenes.
  • two images of the same object/scene taken by a two-channel rangefinder can be combined digitally into one resulting image which, in turn, can be split into multiple sub-images.
  • Local ranges defined by processing of the individual sub-images can be used to produce a depth map of the object/scene, see the explanations in the present document.
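A depth map of the kind described above could, under stated assumptions, be sketched as follows: the image pair is split into sub-images (tiles), and a per-tile displacement magnitude is estimated by phase correlation. The tile size, the correlation method and all names are illustrative choices, not the patent's prescription:

```python
import numpy as np

def tile_shift_map(img_a, img_b, tile=32):
    """Per-tile displacement magnitude between two channel images; a
    calibration (shift -> range) would turn this into a depth map."""
    ny, nx = img_a.shape[0] // tile, img_a.shape[1] // tile
    shifts = np.zeros((ny, nx))
    for j in range(ny):
        for i in range(nx):
            a = img_a[j * tile:(j + 1) * tile, i * tile:(i + 1) * tile]
            b = img_b[j * tile:(j + 1) * tile, i * tile:(i + 1) * tile]
            cross = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
            cross /= np.abs(cross) + 1e-12       # phase correlation
            p = np.unravel_index(np.argmax(np.abs(np.fft.ifft2(cross))),
                                 (tile, tile))
            dy = p[0] if p[0] <= tile // 2 else p[0] - tile
            dx = p[1] if p[1] <= tile // 2 else p[1] - tile
            shifts[j, i] = np.hypot(dy, dx)      # local shift magnitude
    return shifts
```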
  • a rangefinder is adapted to provide the required degree of modulation, caused by defocus, by digital processing only, in silico, by combining at least two sub-images from at least one whole image.
  • further analysis of the combination of sub-images, for example of their distortions, can be used to determine the degree of defocus and thus the range.
  • a standard camera takes a digital image
  • the processing means selects at least one central sub-image, an image which, for example, is positioned symmetrically about the optical axis, as well as a shifted sub-image, an image which only partly overlaps with the central image and is positioned asymmetrically about the optical axis.
  • the shifted sub-images can be chosen such that in combination they form any type of geometrical asymmetry, including chiral asymmetry.
  • Additional processing means combine said images into a combined image from which, first, the degree of defocus and, second, the distance to the object can be found.
  • Pre-processing can include, for example, increase of contrast, digital selection of features, addition of colors and other digital image manipulation procedures. As an example, consider rangefinding with two independent optical channels followed by the subsequent analysis of multiple sub-images within each acquired image.
  • the sub-images cannot be individually analyzed and the degrees of defocus as well as the corresponding ranges cannot be evaluated.
  • Processing and transformations described above can be applied to a single image acquired by one or several photosensors, or any portion of the image, i. e. sub-image, or any combination of sub-images from said image.
  • processing and transformations can be carried out on multiple digital images, or sub-images, or any synthetic combination, meaning combinations in silico, of the multiple images and sub- images.
  • separate processing of a number of sub-images from a single image obtained by a rangefinder with a chiral optical arrangement allows, first, determination of defocus and, thus, the distance to the objects within every sub-image, and, second, reconstruction of a depth map for a whole scene.
  • the optical arrangement of a rangefinder with a single optical channel or multiple optical channels, including chiral optical arrangements, may have a preferred plane of rangefinding, providing a directional sensitivity to defocus.
  • Explanations can be found, for example, in WO2010/080030.
  • the directional sensitivity is of particular interest because, first, the shape of an object can be sensed remotely by turning the rangefinder about its optical axis; second, the dynamic range of the rangefinder can be significantly increased without losing sensitivity in the preferred plane of rangefinding.
  • the directional sensitivity and/or the increased dynamic range can be achieved with additional optical components, including optical phase and amplitude masks (also slits and shaped aperture stops).
  • One example, explained in WO2010/080030, and applicable to a rangefinder with multiple optical channels is a cubic optical phase mask.
  • the preferred method for rangefinding is to measure the defocus-induced displacement of at least one characteristic pattern versus the reference pattern in the spatial spectrum of the detected image.
  • the spatial spectrum is obtained by spectral decomposition of the image, generally, by the Fourier transformation.
  • the image I(x, y), in this context a two-dimensional intensity distribution on a photosensor, results in the spatial spectrum Ĩ(νx, νy), where x, y are the spatial variables and νx, νy are the corresponding spatial frequencies.
  • the spectral decomposition can be accomplished by digital algorithms from the digital image, or directly from the image by an optical processor, for example, the optical processor described in
  • the characteristic pattern results from the inherent spectral response of the optical arrangement, or modulator, see also explanations in WO2010/080030.
  • the characteristic pattern can be conveniently represented by the modulus of the incoherent optical transfer function (OTF), i. e. the modulation transfer function (MTF), calculated for a given optical arrangement.
  • the person skilled in the art may conclude that the defocus-induced changes in the spectral domain of the image are inevitably associated with the spatial transformations of the image itself.
  • the unknown range can be evaluated directly from the spatial displacement of the image on a photosensor.
  • the displacements may include shift, rotation, scaling of the image, and any combination thereof.
  • the displacement is easy to derive in an optical arrangement with multiple optical channels having individual photosensors.
  • An example of the lateral shift calculation from two independent images is considered in the theoretical section of the present document and in WO2009/108050.
  • the degree of the lateral shift of the images can be directly calibrated, for example at a manufacturer's facility, in units of defocus, for example, wavelength, and in units of absolute range such as micrometers, meters or kilometers, depending on the design and application of the particular rangefinder.
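Such a calibration could be sketched as follows. Assuming, as the theory above suggests, that the defocus-induced shift is linear in the inverse object distance 1/L, a least-squares fit of factory measurements yields a two-parameter calibration that is inverted at run time. The numbers and names below are purely illustrative:

```python
import numpy as np

def fit_calibration(ranges_m, shifts_px):
    """Least-squares fit of shift = a / L + b (shift linear in 1/L)."""
    a, b = np.polyfit(1.0 / np.asarray(ranges_m, float),
                      np.asarray(shifts_px, float), 1)
    return a, b

def range_from_shift(shift_px, a, b):
    """Invert the calibration: L = a / (shift - b), in metres."""
    return a / (shift_px - b)
```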
  • This invention - aspects of angular momentum
  • a rangefinder comprises at least one modulator to modulate the angular momentum of the incoming light.
  • Modulation of the angular momentum can be provided by chiral optical elements and/or polarizing elements included in at least one imaging system, or, alternatively, by a spatial chiral optical arrangement, or a temporal chiral optical arrangement, or, alternatively, by chiral modulation at digital processing, in silico, from, for example, at least two sub-images from at least one image.
  • Optical arrangements may include any number of optical channels providing an effective chirality of the rangefinder.
  • Embodiments of modulators to modulate either the orbital angular momentum or the spin angular momentum are disclosed as well as combinations thereof and combinations of modulators with converters to convert orbital angular momentum into spin angular momentum and vice versa.
  • At least one image is projected by at least one imaging system onto at least one photosensor.
  • the imaging system and the photosensor can, for example, be a standard camera lens and a standard photosensor or, alternatively, a special imaging optics and a photosensor optimized for particular rangefinding needs.
  • the light is modulated such that the degree of modulation depends on the degree of defocus in the imaging system.
  • the degree of defocus is derived from the degree of modulation of the angular momentum which can be detected from the intensity variations on the photosensor.
  • the modulation of the angular momentum may include modulation of the orbital momentum and/or spin momentum.
  • the range for a single object, or a group of objects, can be derived from the degree of defocus by using, for example, formulas presented in WO2010/080030, or as described by Nayar (Nayar et al., Proc. of Fifth Intl. Conf. on Computer Vision, 995-1001, Cambridge, MA, USA, 1995).
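For a single thin lens, the range-from-defocus step reduces to the Gaussian lens formula. The sketch below uses its own notation, not the formulas of WO2010/080030: it treats the estimated defocus as an image-plane displacement delta relative to a sensor at distance s from the lens:

```python
def range_from_defocus(f, s, delta):
    """Thin-lens range from defocus: the sharp image lies at v = s + delta,
    and 1/f = 1/L + 1/v gives the object distance L = f*v / (v - f).
    All quantities in metres."""
    v = s + delta
    return f * v / (v - f)
```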
  • the modulation of the angular momentum can be adjusted for a highest sensitivity with respect to defocus variations, i. e. the value
  • the rangefinder also includes image processing means, or digital processing means, including defocus-from-modulation processing means, which derive the degree of defocus from the degree of modulation, and range-from-defocus processing means, which derive the range (as described above) of the object from the degree of defocus.
  • the rangefinder also includes at least one modulator providing a degree of modulation which degree depends on the degree of defocus.
  • the modulator provides modulation of the angular momentum of the incoming light.
  • a rangefinder may comprise a chiral optical arrangement with two independent optical channels to modulate the orbital momentum of light.
  • the degree of modulation in this case can be represented by the angular position and spacing of the periodic pattern of lines in the spatial spectrum of the combined image. Recalculating the angle and spacing of the lines into defocus, the unknown distance can be evaluated, for example, by using formulas from WO2010/080030 or as described by Nayar (Nayar et al., Proc. of Fifth Intl. Conf. on Computer Vision, 995-1001, Cambridge, MA, USA, 1995).
  • Digital processing means may include calculating digital means, digital display means, digital storage means, means for transformations, for example, Fourier transformations, and other supporting means. All calculations are preferably non-iterative, allowing high processing speeds.
  • Defocus-from-modulation processing means and methods thereof are based on novel physical concepts and allow the degree of defocus to be derived directly from the degree of modulation, which degree of modulation is represented by, for example, the degree of at least one displacement of the spatial spectrum of the image relative to a reference pattern or, alternatively, the degree of light intensities of a spectrum or, alternatively, a combination of said displacement and intensity.
  • the term 'derive directly' refers to a single-step process which does not include, for example, deconvolution steps.
  • the reference pattern generally corresponds to the inherent spectral optical response of the modulator, known a priori.
  • Range-from-defocus processing means and methods thereof are based on traditional methods to derive range from defocus discussed above.
  • the modulator providing the degree of modulation of angular momentum can include at least one phase modulator, for example a chiral optical element, to modulate the phase of the incoming light or, alternatively, include at least one polarization modulator to modulate the polarization, spin angular momentum, of the incoming light.
  • a combination of said masks can be adapted to modulate the combination of phase and polarization of the incoming light.
  • the modulator providing the degree of modulation of angular momentum can include a specific arrangement of at least one imaging system adapted to provide the degree of modulation versus defocus.
  • the degree of modulation can be obtained by proper spatial or temporal arrangement of the multiple channels.
  • a beam splitter can provide at least two optical channels which channels can be separately modulated, for example, by introducing a degree of optical tilt in only one of the channels with the other channel remaining the reference channel.
  • at least two imaging systems in combination with at least one photosensor can provide at least two independent optical channels adjusted to provide a degree of modulation versus defocus, by, for example, introducing optical tilt in one of the channels, with the other channels remaining the reference channels.
  • the modulator providing the degree of modulation of the angular momentum can also include any combination of at least one of said physical masks and at least one of said optical channels.
  • a number of embodiments and combinations of embodiments can provide desired modulation of the angular momentum of the incoming light.
  • Alternative embodiments to the same effect can likely be designed, which alternatives are considered to be included here by this reference.
  • the embodiment of choice for a particular application depends, for example, on the specifications of the application, such as the need for sharp imaging in combination with rangefinding, the desired resolution of imaging and desired accuracy, as well as economic considerations.
  • Such embodiments include, but are not restricted to, the following embodiments:
  • Phase modulators - Modulation of phase can be obtained in a number of different ways by phase modulators comprised of various passive and active optical elements, for example, optical phase and amplitude masks, diffractive optical elements, holograms, spatial light modulators, fixed and deformable mirrors etc.
  • the phase modulation can be applied to rangefinders having a single optical channel or multiple optical channels.
  • one or several phase modulators can be integrated in one of the optical channels, or be an integral part of the whole optical arrangement.
  • a phase modulator can provide chiral/helical phase of the light beam resulting, for example, in a non-zero orbital angular momentum.
  • phase modulators can provide a non-zero angular momentum of the light beam such that, for example, the characteristic pattern in the spatial spectrum exhibits displacements in response to defocusing.
  • the a priori unknown defocus can be derived from this displacement and the range can be evaluated.
  • the rangefinder can include at least one phase modulator providing modulation of the orbital angular momentum which causes displacement of the image (or its spatial spectrum) in the plane of the photosensor.
  • the degree of defocus, and thus the range can be derived from said degree of displacement.
  • Polarization modulators - Modulation of polarization can be obtained by polarization modulators, also referred to as polarization masks, comprised of various passive and active optical elements, for example, polarizers, waveplates, Fresnel rhombs and Faraday rotators, electro-optical and magneto-optical devices, liquid-crystal devices etc.
  • the polarization modulation can be applied to rangefinders having a single optical channel or multiple optical channels.
  • a polarization modulator changes the flux of spin angular momentum, so a light beam passing through a polarization modulator acquires a non-zero total spin angular momentum, for example, the light beam becomes circularly-polarized.
  • a polarization modulator may result in a variable degree of modulation depending on defocus.
  • polarization modulators can provide a non-zero angular momentum of the light beam such that, for example, the characteristic pattern in the spatial spectrum exhibits displacements in response to defocusing. The a priori unknown defocus can be derived from this displacement and the range can be evaluated.
  • the rangefinder can include combinations of one or multiple phase modulators and one or multiple polarization modulators, or any combination of said modulators.
  • said modulators can be combined with said converters in at least one modulator/converter combination.
  • the combination of modulators may include at least one phase modulator, which can be a chiral phase modulator and at least one polarization modulator.
  • the modulators can be arranged to maximize the changes in the light beam (intensity distribution, polarization etc.) caused by defocus.
  • the combination of modulators can also provide at least one distinct characteristic pattern in the spatial spectrum of the image such that the degree of displacement of said pattern corresponds to the degree of defocus.
  • Said combination of modulators might be adapted to maximize the detectability of the characteristic pattern, for example, by maximizing the degree of displacement in response to a given degree of defocus.
  • An important role of the characteristic pattern is to provide a reference (for example, in the spectral domain) independent of the spatial structure of the object scene.
  • a photosensor such as a CMOS or CCD sensor can only sense intensities of light and cannot sense polarization and phase of light.
  • the polarization modulator is generally combined with an additional optical component, for example, an additional polarization modulator (in this context, a polarization demodulator), to convert changes in spin angular momentum into a corresponding degree of light intensity that is detected by a photosensor.
  • Such a polarization modulator/demodulator combination provides modulation of light intensity, of which the degree of modulation depends on the degree of defocus.
  • Multiple channels with dependent optics can be used.
  • An alternative embodiment of a rangefinder can provide sensing of defocus and range by using a multiple-channel optical arrangement.
  • Such a rangefinder may include optical components and/or photosensors shared between different optical channels.
  • the light is shared into two optical channels by a beam splitter and is detected by two independent photosensors, or, alternatively, two independent sectors on the same photosensor.
  • the required modulation can be achieved, for example, by a phase modulator placed in only one of the optical channels.
  • First, the primary lens is shared between the optical channels and, second, the channel without the modulator can provide sharp, undistorted imaging of the object of interest.
  • Rangefinding principles disclosed in the present document are based on the fact that electromagnetic waves may carry angular momentum (see, for example, A.I. Akhiezer and V.B. Berestetskii, Quantum Electrodynamics, John Wiley & Sons, 1965, and L. Allen et al., Optical Angular Momentum, IOP, 2003) which, in many practical cases, can be split into orbital and spin parts.
  • Angular momentum of light, in turn, can be modulated by phase and polarization modulators as described above. By making the degree of modulation dependent on defocus, the unknown defocus can be encoded into the degree of modulation of the angular momentum, which can be converted by modulation processing means into a detectable variation of the light intensity on a photosensor.
  • the degree of modulation can be fixed (independent of defocus), but the resulting light beam with non-zero angular momentum may undergo spatial-temporal transformations depending on the degree of defocus, see the example above with a superposition of two circular harmonics. These transformations can be detected directly by a photosensor or converted into detectable variations of light intensity by modulation processing means.
  • Modulation processing means may comprise various optical components, for example, phase and amplitude optical masks, polarizers.
  • One example of a rangefinder employing modulation of the angular momentum is a rangefinder with a phase-only optical mask changing the total orbital angular momentum flux from zero (light in the plane of the first optical surface) to a non-zero value (light after the optical mask), see the formulas above.
  • the light intensity distribution rotates as a function of defocus.
  • the rotation of the light intensity can be detected by a photosensor and, thus, the unknown defocus and range can be determined.
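A crude numerical sketch of detecting such a rotation: both intensity distributions are resampled (nearest-neighbour) onto polar coordinates and their angular profiles are circularly correlated. The sampling densities and all names are arbitrary illustrative choices, not the patent's algorithm:

```python
import numpy as np

def rotation_angle_deg(img_a, img_b, n_theta=360, n_r=20):
    """Rotation of img_b relative to img_a, in degrees, from circular
    correlation of angular intensity profiles in polar coordinates."""
    ny, nx = img_a.shape
    cy, cx = (ny - 1) / 2.0, (nx - 1) / 2.0
    theta = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    r = np.linspace(2.0, min(cy, cx) - 1.0, n_r)
    yy = np.rint(cy + r[:, None] * np.sin(theta)).astype(int)
    xx = np.rint(cx + r[:, None] * np.cos(theta)).astype(int)
    pa = img_a[yy, xx].mean(axis=0)          # angular profile of img_a
    pb = img_b[yy, xx].mean(axis=0)          # angular profile of img_b
    # circular cross-correlation over the angular axis via FFT
    corr = np.fft.ifft(np.fft.fft(pb) * np.conj(np.fft.fft(pa))).real
    k = int(np.argmax(corr))
    k = k if k <= n_theta // 2 else k - n_theta
    return 360.0 * k / n_theta
```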
  • the optical rangefinding can be performed with polarization modulators.
  • defocus may cause variations in the degree of polarization which variations can be detected by a photosensor combined with an additional polarizer (i. e. modulation processing means).
  • the degree of modulation can be fixed, but the total spin momentum flux remains non-zero and the
  • phase and polarization modulators can be employed in a rangefinder on condition that the combination provides directly, or through modulation processing means, a detectable variation of the light intensity on a photosensor depending on defocus.
  • the rangefinder can include at least one polarization-to-phase converter adapted to convert a degree of spin angular momentum into a corresponding degree of orbital angular momentum. Also, the rangefinder can include at least one phase-to-polarization converter adapted to convert a degree of orbital angular momentum into a corresponding degree of spin angular momentum.
  • the rangefinding method comprises projection of an image of at least one object onto a photosensor, modulation of the incoming light such that the degree of modulation depends on the degree of defocus in the imaging system, for example, in the plane of the exit pupil, derivation of the degree of defocus in the imaging system from the degree of modulation, and derivation of the range from the object to the rangefinder based on the degree of defocus.
  • the method also includes modulation of the angular momentum of the incoming light.
  • the modulation of angular momentum can be either modulation of the orbital angular momentum or modulation of the spin angular momentum or modulation of both momenta.
  • the method can include phase-to-polarization conversion which converts orbital angular momentum into spin angular momentum and/or polarization-to-phase conversion to convert spin angular momentum into orbital angular momentum.
  • the method derives the degree of defocus of the imaging system from either the displacement of a pattern in the spatial spectrum or from light intensity.
  • Knowing the distance from a rangefinder to an object, or multiple objects, or an extended scene along one axis (for example, the optical axis of the rangefinder), in a plane or in three-dimensional (3D) space is of practical importance for a variety of consumer, 3D entertainment/imaging, automotive, military/defense, security, technical and scientific applications.
  • the rangefinder disclosed in the present document can be adapted for distance and speed measurements of single or multiple objects using visible, IR or UV light, or multispectral light.
  • the described methods of rangefinding can be, in principle, adapted to any type of wave processes.
  • the presented rangefinders are likely to be inexpensive because they use standard optical components and photosensors.
  • Optical phase masks are of a simple design and can be easily manufactured.
  • a rangefinder can be a standard, mass-produced photocamera with an additional optical phase mask.
  • a rangefinder can be a combination of two standard photocameras without additional optical components. Evaluation of the unknown range from defocus is carried out by digital processing means which may include standard computer devices, digital signal processors (DSP), devices based on field-programmable gate arrays (FPGA) or other digital evaluation means and components.
  • a rangefinder comprising multiple independent optical channels can also be adapted for remote sensing of displacements and speed in lateral directions.
  • the position of objects and their speed in the direction perpendicular, for example, to the optical axis of the rangefinder can be determined by processing the images acquired by the independent photosensors.
  • One example of such processing is the Fourier decomposition of the combined image.
  • the spatial spectrum of the combined image reveals a line pattern whose spacing and orientation depend on the speed and movement direction of the object.
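The line-pattern analysis mentioned above can be sketched generically: locate the dominant off-axis peak of the 2-D spatial spectrum of the combined image; its radial position gives the fringe spacing and its azimuth gives the orientation. This is an illustration, not the patent's algorithm:

```python
import numpy as np

def fringe_spacing_and_angle(img):
    """Spacing (pixels) and orientation (radians) of the dominant fringe
    pattern, from the strongest non-DC peak of the spatial spectrum."""
    F = np.abs(np.fft.fftshift(np.fft.fft2(img)))
    ny, nx = F.shape
    cy, cx = ny // 2, nx // 2
    F[cy, cx] = 0.0                               # suppress the DC term
    py, px = np.unravel_index(np.argmax(F), F.shape)
    fy, fx = (py - cy) / ny, (px - cx) / nx       # cycles per pixel
    return 1.0 / np.hypot(fy, fx), np.arctan2(fy, fx)
```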
  • In contrast to active rangefinders such as, for example, radar or sonar, passive rangefinders can, of course, be adapted for active rangefinding by, for example, combining the rangefinder with a light source projecting a light spot, or a grid, or dots, or any light pattern on the object. Detection, analysis and digital processing in an active rangefinding mode are, in essence, the same as for images obtained in a passive mode.
  • the optical rangefinder can be adapted to provide the degrees of defocus of at least two sub-images from the corresponding sub-areas of the image, or several images, acquired by one or several photosensors. With the information on defocus of sub-images corresponding to multiple objects, or spatially extended objects, a complex scene can be completely characterized so that, for example, a defocus map composed of the local degrees of defocus can be directly constructed.
  • the optical rangefinder can also be adapted to provide a wavefront map, in which the wavefront is constructed from the local degrees of defocus corresponding to said sub-images. Construction of the continuous wavefront map can be carried out analogously to the wavefront construction methods developed for Shack-Hartmann wavefront sensors.
  • Static and dynamic defocus maps, depth maps and wavefront maps are relatively novel and promising approaches for military and home-security applications because of additional options for optical, and generally passive, detection of movement and speed of objects and changes in wavefront characteristics. Said 3D capability for depth map applications will only increase the scope of applications.
  • An observer can obtain a 3D map of a spatially extended scene, for example on a 3D display, as well as indications of lateral displacements of the separate objects.
  • Image reconstruction - An image reconstruction apparatus can be designed including a certain embodiment of the optical rangefinder, as described in the present document, to provide at least one in-focus image of the object which in-focus image is
  • the image reconstruction apparatus may combine, for example, rangefinding using several independent optical channels, as described in the present document, and subsequent processing, as suggested by A. Simonov and M. Rombach, Opt. Lett. 34, 2111 (2009), of at least two, generally defocused, images acquired by the photosensors.
  • any of the image correction/reconstruction methods disclosed in the theoretical section of the present document, or in WO2009/108050 can be used. Note that the accuracy and processing time of the image restoration algorithms (in WO2009/108050 and Opt. Lett.
  • the optical arrangement of the image reconstruction apparatus may include, for example, at least one chiral optical arrangement for imaging such that the OTF of the apparatus has no zeros in at least one region of the spatial spectrum, usually close to zero, even at large focusing errors, see also explanations in WO2010/080030. Those skilled in the art may conclude that zeros in the spatial spectrum of the image cannot be removed by any digital post-filtering. Clearly, such image reconstruction is less necessary with rangefinders comprising multiple independent optical channels since sharp, or reasonably sharp, images can be obtained directly from the image sensors.
  • An optical rangefinder can be adapted to become an optical pathfinder apparatus providing an estimate of at least one optical path length, for example, the optical path length between the fixed object and the optical rangefinder, provided that the absolute distance is known beforehand, or known a priori.
  • the optical path length (OPL) and the absolute distance L are given by OPL = ∫C n(s) ds and L = ∫C ds, respectively, where n(s) is the refractive index along the optical path and C is the integration path.
  • the OPL can vary while the range L remains constant, for example, due to variations in refractive index of the medium, if the medium is air, due to, for example, air pressure, humidity etc.
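The two path integrals above can be evaluated numerically; the sketch below uses a trapezoidal rule along a sampled path, with an illustrative refractive-index value for air:

```python
import numpy as np

def optical_path_length(n_of_s, s_grid):
    """OPL = integral over C of n(s) ds, by the trapezoidal rule; the
    geometric length L is the same integral with n(s) = 1."""
    n = np.asarray([n_of_s(s) for s in s_grid], dtype=float)
    ds = np.diff(np.asarray(s_grid, dtype=float))
    return float(np.sum(0.5 * (n[1:] + n[:-1]) * ds))
```

For air at n of roughly 1.0003, a 100 m geometric path gives an OPL near 100.03 m; pressure or humidity changes alter the OPL while L stays fixed, which is the effect the pathfinder mode senses.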
  • the rangefinder can also be adapted to measure atmospheric disturbances and turbulence, wind speed and even acoustic waves in air, operating as an optical microphone, and be adapted to measure equivalent phenomena in liquids.
  • Modal wavefront sensor -
  • the optical rangefinder described above can be adapted to provide modal wavefront sensing and, thus, become a modal wavefront sensor.
  • the concepts set forth in the present document can be extended to likely any aberration of choice by adaptation and derivation of the formulas and mathematical concepts disclosed in the present document. Since the rangefinding methods disclosed in the present document rely on the 'range from defocus' principle, defocus is the primary aberration determined by the rangefinder.

Abstract

Passive and solid-state optical rangefinders are described comprising at least two optical channels in specific configurations. Displacement of patterns in the spatial spectrum allows calculation of defocus and subsequently calculation of precise distances to one or multiple objects. Other embodiments provide depth maps of complex scenes. Applications include rangefinding, mapping of spatially extended scenes, depth- and wavefront mapping, image reconstruction, pathfinding and modal wavefront sensing for military, medical, consumer and automotive applications.

Description

IMPROVED OPTICAL RANGEFINDING AND IMAGING APPARATUS
Introduction
Active rangefinding is based on emission of a signal, for example sound, laser light or radar waves, by an emitter linked to the rangefinder and detection and analysis of reflections of the signal from a target object by the rangefinder. Passive rangefinding is based on detection and analysis of signals emitted by the target itself, for example, infrared emission of the engine, or, alternatively, scattering of ambient light or ambient radio-waves. Passive optical rangefinders have been used since the nineteenth century and are generally based on geometrical methods, i.e. measurement of angles. In stadimeters the distance is calculated from the angular size of an object with the dimensions of the object known a priori; for example, the distance to a battleship can be calculated once its size and angle of view are known. Passive rangefinding triangulation methods are also employed in modern photographic cameras, for example, K. Engelhardt and R. Knop, Appl. Opt. 34, 2339-2344, 1995, to estimate the degree of defocus at multiple locations in the image field. Modern cameras calculate the range by phase detection or contrast measurement methods which can be passive in sufficient ambient light, but active under low light conditions. For example, a camera has to pre-flash during the rangefinding procedures prior to actual picture taking. Phase detection methods as in, for example, WO2005/098501 and US2008/205872 analyze the sub-images for differences in light intensity after splitting the light coming from the primary objective into several channels with precisely known light-path lengths. Contrast measurement methods as in, for example, US2006109369 and US2004017502, optimize the contrast of the image, or part of the image, on the photosensor by changing the focusing condition and detect the optimal contrast corresponding to the optimal focus.
The above methods differ, in virtually all their aspects, from the rangefinder and imaging apparatus and corresponding methods described in the present document, which require neither triangulation nor angular measurement of the object, nor contrast evaluation or comparison of phase-delayed sub-images.
Terms and definitions General terms: Optical rangefinder and imaging apparatus will also be referred to as rangefinder throughout this document. A rangefinder determines the range, i.e. the distance, from the apparatus to an object. An optical rangefinder determines the range by means of light, for example visible, infrared (IR) or ultraviolet (UV). An imaging system, alternatively imaging optics, comprises focusing and relaying optics such as lenses, mirrors, beam-splitters and various optical components, and functionalities and assemblies thereof, as, for example, in a camera, to project light onto a photosensor. A photosensor is an electronic component converting light intensity into an electronic signal, or digital/electronic image. A rangefinder comprises at least one optical channel, which is a combination of an imaging system along the optical path of the incoming light and at least one photosensor. In the context of the present document, independent photosensors can be either physically separate components or separate sectors on a single photosensor. A rangefinder with multiple optical channels may comprise individual optical channels, generally physically separate, with each optical channel comprising individual optical components and individual photosensors.
Alternatively, such a rangefinder may include optical components and/or photosensors shared between the optical channels. An optical rangefinder can include at least two independent optical channels in a specific spatial optical arrangement, for example, all the channels tilted symmetrically with respect to the optical axis, or, alternatively, one of the channels tilted additionally in the tangential plane while the other channels have optical axes positioned in the sagittal plane of the optical arrangement. Optical channels in a rangefinder can also be separated based on polarization or spectral characteristics of light. For example, a Wollaston prism separates the incoming unpolarized light beam into two orthogonal, linearly polarized outgoing beams that can be further transformed (and mixed again) and detected by a photosensor. An optical rangefinder with multiple optical channels is referred to as having a temporal optical arrangement when it comprises at least one optical channel that switches sequentially, over time, from one configuration to another. Such switching may be achieved by, for example, displacing the whole individual optical channel from one spatial position to another, or, alternatively, sequentially turning an optical component, for example a mirror or lens, in a given optical channel. Temporal optical arrangements can be achieved, for example, in most modern cameras by adaptation of the vibration-correction function included in the lens or sensor systems. The optical arrangement of a rangefinder, in the context of this document, includes any physical arrangement of optical, mechanical and electronic components as well as any virtual arrangement obtained, for example, by digital processing of the electronic images. Both types of optical arrangements (i.e. spatial and temporal) have to be adapted to support the rangefinding functionality.
An image is in-focus or sharp when the image plane coincides with the in-focus image plane; if not, the image is defocused or blurred, or, alternatively, the image has a certain degree of defocus. The terms object and image conform to Goodman's definitions for a generalized imaging system (J.W. Goodman, Introduction to Fourier Optics, McGraw-Hill Co., Inc., New York, 1996, Chap. 6). In the mathematical description, object and image refer to multidimensional data arrays or functions representing the distribution of light in the object and image planes, respectively.
Spectral response and spatial spectrum: The spectral response is generally obtained by Fourier transformation of the intensity impulse response, or, alternatively, by other transformations including, but not restricted to, wavelet decomposition and other spectral decompositions, where spectral decomposition is the decomposition of the image into a basis of eigenfunctions of an appropriate operator in the Hilbert space. The spatial spectrum of an image is obtained by the spectral decomposition of the image data. Generally, the spatial spectrum is a multidimensional array of complex values. A reference pattern, in the context of the present document, denotes any pattern, usually in the spatial spectrum domain, that can be used as a reference scale to measure the degree of displacement of the image spectrum generally caused by defocus, but not restricted to this optical aberration only. An appropriate reference pattern can also be used in the spatial domain. The reference pattern can be any synthetic structure in the spatial spectrum domain, for example, a single line, or a grid of lines. Alternatively, the reference pattern can be a characteristic pattern obtained from the inherent spectral response of the optical arrangement, or optical mask. The characteristic pattern can be conveniently represented by the modulus of the incoherent optical transfer function (OTF), i.e. the modulation transfer function (MTF), calculated with the mask amplitude and phase functions. In practice, the characteristic pattern is useful for measuring and evaluating the displacement of the spatial spectra. A periodic pattern of lines is an example of the characteristic pattern resulting from an optical mask with a chiral prismatic element, for example, a half-aperture prismatic optical mask.
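As a numerical illustration of these definitions, the spatial spectrum of an image and the characteristic pattern (MTF) of a mask can be sketched as follows; the half-aperture prismatic mask, its prism strength and the grid size are illustrative assumptions, not parameters of the apparatus:

```python
import numpy as np

def spatial_spectrum(image):
    """Spatial spectrum of an image: 2D Fourier transform with the zero
    frequency shifted to the center."""
    return np.fft.fftshift(np.fft.fft2(image))

def mtf_from_pupil(pupil):
    """Characteristic pattern (MTF): modulus of the incoherent OTF,
    obtained from the intensity impulse response of the pupil."""
    psf = np.abs(np.fft.fft2(pupil))**2      # intensity impulse response
    otf = np.fft.fft2(psf)                   # incoherent OTF (unnormalized)
    otf = otf/otf.flat[0]                    # normalize to unity at zero frequency
    return np.abs(np.fft.fftshift(otf))

# Illustrative half-aperture prismatic mask (parameters are assumptions)
n = 256
y, x = np.mgrid[-1:1:n*1j, -1:1:n*1j]
aperture = (x**2 + y**2) < 1.0
phase = np.where(x > 0.0, 10.0*x, 0.0)       # prism over one half of the aperture
pupil = aperture*np.exp(1j*phase)
mtf = mtf_from_pupil(pupil)

rng = np.random.default_rng(1)
spec = spatial_spectrum(rng.random((n, n)))  # spatial spectrum of a test image
```

Since the incoherent OTF is the autocorrelation of the pupil function, the MTF is bounded by its zero-frequency value, so the characteristic pattern is directly comparable across masks.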
The term defocus map denotes the distribution, generally two-dimensional, of degrees of defocus obtained for a spatially extended object/scene; a depth map provides distances of multiple objects or multiple sub-images; and a wavefront map describes the distortion of the wavefront, for example, in the plane of the exit pupil and measured relative to the Gaussian reference sphere. Said maps can be static maps, providing information for a single point in time, or, alternatively, active maps, providing updated information at, for example, real-time speed. The abbreviation EDF stands for Extended Depth of Field.
Optical chirality and arrangements - Optical chirality results from chiral modulation, including chiral phase modulation, which, in turn, provides modulation of the angular momentum of light. Modulation of angular momentum, the main concept behind the rangefinding disclosed in the present document, includes any change of the angular momentum carried by electromagnetic waves, or, alternatively, of its flux (see, for example, S.M. Barnett, J. Opt. B 4, S7-S17, 2002). General properties as well as the mathematical description of light beams carrying optical angular momentum are presented in Optical Angular Momentum, L. Allen et al., IOP, 2003, which document is included in this document by reference.
Modulation of the angular momentum of light and detection of the degree of such momentum can be achieved in a number of ways by a number of optical arrangements, of which a few examples are provided below, but which number of ways is not necessarily restricted thereto. First, modulation of angular momentum can be provided by a chiral optical element, which includes at least one optical surface resulting in chiral phase modulation of the light beam, or alternative light signal. In a single-channel imaging system, for example, chiral modulation can be obtained with a half-prismatic optical mask. Second, modulation of angular momentum can be provided by a chiral optical arrangement comprised of multiple independent channels of a multichannel imaging system or a combination of several independent imaging systems, for example an arrangement with two imaging systems of which one channel is tilted optically versus the other channel (for details of such an arrangement based on optical tilt see below). So, a chiral optical arrangement can be a combination of at least two optical channels, i.e. two independent imaging systems, which channels are arranged such that the combination provides modulation of the angular momentum of the incoming light. Third, modulation of angular momentum can be provided by digital processing, in silico, of sub-images from a single image resulting from a single exposure on a single photosensor, or, alternatively, by processing, in silico, of a combination of any number of images from any number of combinations of imaging systems and photosensors. An optical chiral arrangement with multiple optical channels, as defined in the present document, can be a spatial chiral arrangement comprising, for example, at least two independent imaging channels that are not separately chiral, but the combination of channels is a chiral arrangement.
Alternatively, an optical chiral arrangement can be a temporal chiral arrangement including, for example, at least one non-chiral imaging channel which provides at least two consecutive images of an object at different positions of the optical channel such that the image-taking positions constitute a chiral arrangement. In both cases an independent imaging channel may consist of an imaging system and a photosensor.
Chirality concept - Modulation of angular momentum can be provided by chiral modulation of phase, which chiral phase modulation is associated with the chiral phase function. The chiral phase function can be conveniently represented by a three-dimensional chiral surface. By definition, the mirror image of a chiral surface cannot be mapped to the original surface by rotations and translations, see, for example, M. Petitjean, J. Math. Phys. 43, 4147-4157, 2002, which document is included in the present document by reference. In mathematical terms, an object is defined to be chiral if it is not invariant under parity transformation. A chiral optical arrangement is an arrangement of an optical system that modulates a light beam such that the light approaching the image plane contains chiral phase modulation, or, alternatively, such that the generalized pupil function of the optical system has a chiral phase. Chiral modulation is the amplitude and phase modulation of light resulting from the chiral optical arrangement. Chirality signs, or, alternatively, directions (clockwise or counterclockwise, or right-handed or left-handed), should preferably be the same for each particular optical mask, i.e. one mask should only be comprised of a chirality function, or a combination of chirality functions, of the same sign, but combinations of chirality functions of opposite signs are not excluded. The degree of chirality can, in simple cases, for example a vortex, be quantitatively measured in terms of topological charge; in other cases (for example, a complex three-dimensional surface), the degree of chirality can be calculated in terms of the continuous chirality measure, see, for example, Salomon et al., J. Mater. Chem. 25, 295-308, 1999.
Description of figures
Figure 1 shows one of the preferred embodiments of the optical rangefinder. Light from an object, 1, is collected by an imaging system, 2, and projected onto a photosensor, 3. The imaging system comprises optical elements for imaging, 4, and, in this example, one modulator, 5, in the plane of the exit pupil, 6. The modulator modulates the angular momentum of the incoming light, 7, and the electronic image is further processed by digital processing means, 8. Figure 2 shows an alternative embodiment of the optical rangefinder including a polarization demodulator, 9, to convert the degree of spin momentum into a
corresponding degree of intensity on the photosensor. (For references 1-8 see Fig. 1.).
Figure 3 shows an alternative embodiment of the optical rangefinder depicted in Figure 2 including a medium, 10, adapted to convert orbital angular momentum into spin angular momentum and vice versa. (For references 1-9 see Figs. 1 and 2.)
Figure 4 shows an alternative embodiment of the optical rangefinder depicted in Figure 3, adapted to convert spin angular momentum into orbital angular momentum. (For references 1-9 see Figs. 1 and 3.)
Figure 5 shows the preferred embodiment of the optical rangefinder with two independent optical channels. Light from an object, 1, is projected by two independent imaging systems, 11, 12, onto two independent photosensors, 13, 14. The optical axes, 15, 16, of the respective imaging systems, 11, 12, make angles with measures 17 and 18 versus the mutual projection axis 19. The axis 19 is the orthogonal projection of the axis 15 on the plane of the photosensor 13, where the plane of the photosensor 13 is defined by axes 16 and 19. The angles 17 and 18 of both imaging systems can be chosen to make the rangefinder arrangement chiral, or non-chiral. The digital images from the photosensors are processed by digital processing means, 20, in this example a laptop computer. Figure 6 illustrates rangefinding with the apparatus described in Fig. 5. Images 21 and 22, acquired by two independent photosensors at a given distance from the object (USAF 1951 resolution chart) to the rangefinder, are combined into a single image 23. The spatial spectrum, 24, of the image 23 exhibits a distinct periodic pattern of lines rotated by an angle with the measure 25 versus a reference line 26. The angular position of the periodic pattern of lines is indicated by an additional line, 27. In this example, the reference line 26 is chosen to be along the vertical axis. When the object is at a different distance from the rangefinder, images 28 and 29 acquired by the two independent photosensors give rise to a combined image 30. The spectrum, 31, of the combined image 30 reveals a periodic pattern of lines rotated by a different angle with the measure 32 with respect to the vertical axis. Prior calibration of the rotation angle versus the distance to the object, for example, a USAF 1951 resolution chart, allows determination of the absolute range from the rangefinder to any object of interest.
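The angle-to-distance measurement described for Figure 6 can be sketched numerically. The synthetic fringe image, the single-peak angle estimator and the calibration table below are illustrative assumptions, not the patented implementation:

```python
import numpy as np

def pattern_angle(image):
    """Angular position of the dominant periodic pattern: locate the
    strongest non-DC peak of the centered magnitude spectrum and return
    its angle with respect to the vertical axis, folded to [0, pi)."""
    spec = np.abs(np.fft.fftshift(np.fft.fft2(image)))
    n = spec.shape[0]
    spec[n//2, n//2] = 0.0                   # suppress the DC term
    ky, kx = np.unravel_index(np.argmax(spec), spec.shape)
    dy, dx = ky - n//2, kx - n//2
    return np.arctan2(dx, dy) % np.pi

# Synthetic fringe pattern with integer cycle counts across the frame
n = 256
y, x = np.mgrid[0:n, 0:n]
kx_cyc, ky_cyc = 13, 22
img = np.cos(2*np.pi*(kx_cyc*x + ky_cyc*y)/n)

angle = pattern_angle(img)                   # angle of the spectral peak/line pattern

# Hypothetical calibration: pattern angle (rad) versus object distance (m)
calib_angles = np.deg2rad(np.array([10.0, 20.0, 30.0, 40.0, 50.0]))
calib_dists = np.array([10.0, 5.0, 3.3, 2.5, 2.0])
distance = np.interp(angle, calib_angles, calib_dists)
```

In the apparatus the calibration table would come from prior measurements with a known target, as described for Fig. 6; here it is invented purely to show the interpolation step.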
Figure 7 illustrates the diffraction of a light beam carrying orbital angular momentum. In this example, the initial electric field of the light beam is a superposition of two circular harmonics with the azimuthal numbers $n = 0$ and $m = 2$:
$E(\rho,\varphi,z=0) = A_0[1 + \exp(-i2\varphi)]\exp(-\rho^2/a_0^2)$. The intensity distributions of the light beam at $z = 0$, $z_R/2$ and $z_R$, where $z_R = k_0a_0^2/2$ is the Rayleigh length, are shown in panels 32, 33 and 34, respectively. An angular measure 35 represents the angular position of an image axis 36, a preferred direction of the intensity distribution, with respect to a vertical axis 37. The angular positions of the intensity distributions at $z = z_R/2$ and $z = z_R$ are indicated by angular measures 38 and 39. Such a diffraction picture takes place when the beam propagates in free space or in an imaging system (in the latter case the beam spatial coordinates should be replaced by the "lens transformed" coordinates, see also the explanations in the text).
Prior art for passive rangefinding
The prior invention US7218448B1 describes a system and methods for passive ranging which differ in key aspects, including physical principle, design and embodiments, from the passive optical rangefinder in the present document. First, none of the optical arrangements or mask designs (including the linear phase modulation mask with two prisms) disclosed in US7218448B1 is a chiral optical arrangement or chiral optical mask. Second, the optical arrangement in US7218448B1 has only one optical channel and requires a linear phase modulation mask in combination with a cubic surface for extended depth of field. The present document discloses multi-channel optical arrangements that might have no phase or amplitude masks at all (see, for example, Fig. 5). Third, passive ranging according to US7218448B1 (quote) 'is accomplished by modifying the incoherent optical system [...] such a way that range dependent zeros are present in the OTF,' or, in other words, (quote) 'zeros are added to encode the wavefront with information on range.' The rangefinding method described in the present document does not require zeros in the OTF. For example, the images obtained with the setup depicted in Fig. 5 can be in-focus images, meaning that the OTFs of the two channels are close to diffraction-limited OTFs (assuming that defocus is small) and have no zeros at all. Nevertheless, with the algorithm described in the theoretical section below and, for example, in WO2009/108050, the relative lateral shift (caused by defocus) of the two images can be calculated and the unknown range can be found from the defocus-related shift. Fourth, passive ranging with extended depth of field according to US7218448B1 is obtained by combining a linear phase modulation filter with the cubic phase modulation mask. In the present document the extended depth of ranging can be attained without additional elements and phase masks by using the directional sensitivity of the optical arrangement, for example, by shaping the exit pupil of the optical system. It can be proven mathematically that pupil shaping, for example, replacing an open pupil with a narrow slit, may result in an increased ranging depth without losing rangefinder accuracy (explanations are given, for example, in WO2010/080030). Such pupil shaping is naturally achieved in a multiple-channel configuration. The prior invention WO2010/080030, which is, by reference, included in full in the present document, describes a system and methods for passive ranging comprising a prismatic phase-mask and a single optical channel. The present invention comprises a rangefinder with multiple optical channels in a chiral configuration and multiple photosensors.
Also, WO2010/080030 does not disclose embodiments employing modulation of the angular momentum of light, which includes spin and orbital parts, and WO2010/080030 does not consider rangefinding using the transformations of spin angular momentum into orbital and vice versa. WO2010/080030 discloses optical arrangements with multiple optical channels adapted to provide chiral modulation of the incoming light, as shown, for example, in Figures 11 and 12 and, for example, in a combination of Claims 1, 2 and 4. WO2010/080030 also covers non-chiral optical arrangements with multiple non-chiral optical channels with shared optical elements (e.g. a photosensor), as can be constructed, for example, according to Claims 1 and 4 of WO2010/080030. One of the preferred embodiments in the present document is a rangefinder with individual non-chiral optical channels (having no shared optical components and photosensors) in which the virtual, chiral or non-chiral, optical arrangement (providing the rangefinding functionality) is achieved by combining in silico the images from the individual photosensors or from separate sectors of the same photosensor.
Theoretical background
The Lagrangian density of the free electromagnetic field (L.D. Landau and E.M. Lifshitz, The Classical Theory of Fields, Butterworth-Heinemann, 1980) is expressed by

$\mathcal{L} = -\dfrac{1}{16\pi}F_{\alpha\beta}F^{\alpha\beta}$ , (1)

where $F_{\alpha\beta} = \partial_\alpha A_\beta - \partial_\beta A_\alpha$ is the antisymmetric field-strength tensor with the 4-vector potential $A^\alpha = (\varphi, \mathbf{A})$, $\alpha, \beta = 0,\dots,3$; the metric tensor is $g_{\alpha\beta} = \mathrm{diag}\{1,-1,-1,-1\}$. From Noether's theorem, the invariance of the field action with respect to spatial rotations results in the conserved current densities, i.e. the Noether currents (N.N. Bogoliubov, D.V. Shirkov, Introduction to the Theory of Quantized Fields, John Wiley & Sons, 1980):

$\partial_\alpha J^{\alpha\beta\gamma} = 0$ , (2)

where

$J^{\alpha\beta\gamma} = \left(x^\beta T^{\alpha\gamma} - x^\gamma T^{\alpha\beta}\right) + \left(\pi^{\alpha\beta}A^\gamma - \pi^{\alpha\gamma}A^\beta\right)$ (3)

is the density of the Noether currents in gauge-variant form,

$\pi^{\alpha\beta} = \dfrac{\partial\mathcal{L}}{\partial(\partial_\alpha A_\beta)}$ (4)

is the field momentum (J.D. Jackson, Classical Electrodynamics, John Wiley & Sons, 1999), and

$T^{\alpha\beta} = \pi^{\alpha\rho}\,\partial^\beta A_\rho - \mathcal{L}\,g^{\alpha\beta}$ (5)

is the canonical stress tensor.
In the literature (L. Allen et al., Optical Angular Momentum, IOP, 2003), the first and second terms in Eq. (3) are associated with the orbital and spin moments of the electromagnetic field, respectively. Integrating Eq. (2) over the whole 3D space and using the divergence theorem, one may prove that the integral, or total, angular momentum of the free electromagnetic field is a conserved quantity. It was shown (see, for example, L. Allen et al.) that the spin of the electromagnetic light field is carried by polarization and the orbital angular momentum is carried by the chiral wavefront.
By symmetrizing the canonical stress tensor $T^{\alpha\beta}$ and choosing a proper divergent term for $J^{\alpha\beta\gamma}$ in Eq. (2) (see, for example, F.J. Belinfante, Physica VI(9), 887-898, 1939), Eq. (3) can be transformed to the gauge-invariant form satisfying Eq. (2):

$J^{\alpha\beta\gamma} = x^\beta\,\Theta^{\alpha\gamma} - x^\gamma\,\Theta^{\alpha\beta}$ , (6)

where $\Theta^{\alpha\beta}$ is the symmetric energy-momentum tensor (see, for example, J.D. Jackson). Introducing the vector

$m_i = \tfrac{1}{2}\,\varepsilon_{ijk}\,J^{0jk}$ , (7)

dual to $J^{0\beta\gamma}$, the angular momentum density $\mathbf{m} \equiv \{m_1, m_2, m_3\}$ becomes (see L. Allen et al.)

$\mathbf{m} = [\mathbf{r}\times\mathbf{g}]$ , (8)

where $\mathbf{g} = [\mathbf{E}\times\mathbf{B}]/(4\pi c)$ is the linear momentum density of the electromagnetic field, and $\mathbf{E}$ and $\mathbf{B}$ are the electric and magnetic field vectors, respectively. The total angular momentum stored by the electromagnetic field,

$\mathbf{M} = \displaystyle\int \mathbf{m}\,d^3r = \int [\mathbf{r}\times\mathbf{g}]\,d^3r$ , (9)

is a conserved quantity, and is a state function of the free electromagnetic field (V. Fock, Theory of Space, Time and Gravitation, Pergamon Press, 1964).
In a homogeneous and isotropic transparent medium, however, the spin and orbital contributions are conserved separately (if the total angular momentum can be separated into orbital and spin parts). Recently it was found experimentally that the spin angular momentum of the electromagnetic field can be converted into orbital angular momentum by an inhomogeneous anisotropic element (L. Marrucci et al., Phys. Rev. Lett. 96, 163905, 2006). Conversion of spin angular momentum into orbital momentum can also be obtained in a light beam focused by a rotationally symmetric optical system, as described, for example, by Nieminen et al., J. Opt. A: Pure Appl. Opt. 10, 115005, 2008. In this case, the reduction of the spin flux due to focusing is accompanied by an increase of the orbital flux; the total angular momentum flux remains constant before and after focusing. Note that this conversion takes place at the lens interface and results in a mechanical torque.
Separation of the total angular momentum into orbital and spin contributions was firmly established only in the paraxial approximation (see L. Allen et al.). Attempts to go beyond the paraxial approximation immediately revealed that no such lucid separation is possible for common classes of nonparaxial beams. However, instead of the angular momentum stored by the field, Eq. (9), one may study the angular momentum flux (S.M. Barnett, J. Opt. B 4, S7-S17, 2002), i.e. the angular momentum transferred by the electromagnetic field, which allows a clear separation of the angular momentum flux into orbital and spin fluxes even for nonparaxial beams (C.N. Alexeyev et al., J. Opt. Soc. Am. A 25, 643-646, 2008).
Using Eq. (3), the flux density tensor can be defined by the dual tensor

$\Phi^{\alpha}{}_{i} = \tfrac{1}{2}\,\varepsilon_{ijk}\,J^{\alpha jk}$ . (10)

The angular momentum flux density for a light beam propagating along the Z-axis (through the transverse plane) is given by Eq. (11) [equation reproduced only as an image in the original], where the first and the second terms are the orbital and spin angular momentum flux densities, respectively.
Consider now a monochromatic wave (e.g. the electric field satisfies the wave equation $\partial_\alpha\partial^\alpha\mathbf{E} = 0$, $\alpha = 0,\dots,3$) defined by the electric field vector

$\mathbf{E}(\mathbf{r},t) = \mathbf{E}(\mathbf{r})\exp(-i\omega t) + \mathrm{c.c.}$ , (12)

where $\omega$ is the frequency and $\mathbf{E}(\mathbf{r})$ is the amplitude expressed by

$\mathbf{E}(\mathbf{r}) \equiv \mathbf{E}(x,y,z) = \dfrac{1}{2\pi}\displaystyle\int\!\!\int \mathbf{E}_0(k_1,k_2)\exp(i\mathbf{k}\cdot\mathbf{r})\,dk_1\,dk_2$ , (13)

in which $\mathbf{k}$ is the wavevector defined by (we denote $k_0 = \omega/c$)

$\mathbf{k} = \{k_1,\ k_2,\ k_3 = \sqrt{k_0^2 - k_1^2 - k_2^2}\}$ , (14)

and $\mathbf{E}_0(k_1,k_2)$ is the spatial spectrum of $\mathbf{E}(\mathbf{r})$ at $z = 0$:

$\mathbf{E}_0(k_1,k_2) = \dfrac{1}{2\pi}\displaystyle\int\!\!\int \mathbf{E}(x,y,z=0)\exp(-i\mathbf{k}\cdot\mathbf{r})\,dx\,dy$ . (15)
From the definition of spin angular momentum flux density, see Eq. (11), it can be found that the total, cycle-averaged spin angular momentum flux is
(16) where ek = k / k0 is the unit vector along k . It is convenient to decompose Ej^,^) into a combination of circularly-polarized waves
E^ ki, k2)— eRfR (kl , k2) + eLfL(kl , k2) , (17) where , eL are the unit right- and left-hand helicity vectors: eReR = eLeL = 0 , e^e^ = 1 , = ez ; fR , fL are the field amplitudes. Substituting Eq. (17) into Eq. (16), the total spin angular momentum flux becomes
Figure imgf000014_0001
which is similar to the relativistic helicity invariant (G.N. Afanasiev et al., Nuovo Cimento 109, 271-279, 1996). As is clearly seen from Eq. (16), $\partial_3\Phi^{(s)} = 0$, so it is a conserved quantity along the direction of propagation (Z-axis). The total spin angular momentum flux, as seen from Eq. (18), is proportional to the difference between the numbers of right- and left-polarized photons. It can be shown that the total, cycle-averaged Poynting flux along the Z-axis is given by Eq. (19) [equation reproduced only as an image in the original], $\mathbf{e}_3$ being a unit vector along the Z-axis.
Equations (16), (18) and (19) and the conservation condition $\partial_3\Phi^{(s)} = 0$ allow us to relate the polarization of the light beam, expressed by the complex vector $\mathbf{E}_0(k_1,k_2)$, or the partial field amplitudes $f_R(k_1,k_2)$, $f_L(k_1,k_2)$ (complex values) dependent on defocus (for example, due to diffraction), with the detectable quantities, i.e. the spectral intensities $|f_R|^2$ and $|f_L|^2$. The formulas derived in this section can be generalized for a light beam propagating through an anisotropic (e.g. birefringent, gyrotropic) medium (see, for example, L. Allen et al. and references therein).
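The proportionality of the spin flux to the imbalance of right- and left-circular spectral intensities can be illustrated numerically. The sketch below works in the paraxial limit, ignores the obliquity factor, and assumes a particular helicity sign convention; it is an illustration, not the apparatus algorithm:

```python
import numpy as np

def spin_flux_fraction(Ex, Ey):
    """Normalized spin angular momentum flux of a paraxial beam:
    (|f_R|^2 - |f_L|^2)/(|f_R|^2 + |f_L|^2), i.e. the imbalance of the
    right- and left-circular amplitudes (helicity sign convention assumed)."""
    fR = (Ex - 1j*Ey)/np.sqrt(2.0)          # right-circular component
    fL = (Ex + 1j*Ey)/np.sqrt(2.0)          # left-circular component
    pR = np.sum(np.abs(fR)**2)
    pL = np.sum(np.abs(fL)**2)
    return (pR - pL)/(pR + pL)

# A Gaussian amplitude profile sampled on a grid
n = 64
y, x = np.mgrid[-2:2:n*1j, -2:2:n*1j]
g = np.exp(-(x**2 + y**2))

s_linear = spin_flux_fraction(g, 0.0*g)     # x-polarized beam: zero spin flux
s_circular = spin_flux_fraction(g, 1j*g)    # circularly polarized beam: maximal flux
```

A linearly polarized beam carries equal right- and left-circular power and hence zero spin flux, while a circularly polarized beam gives the extreme value of the normalized flux.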
Analogously to the above, the total, cycle-averaged orbital angular momentum flux $\Phi^{(o)}$ can be obtained from Eq. (11) in terms of the amplitude $\mathbf{E} = \mathbf{E}(\mathbf{r})$ in the spatial domain, Eq. (20) [equation reproduced only as an image in the original], in which $\partial/\partial\vartheta = x(\partial/\partial y) - y(\partial/\partial x)$, or in terms of the spatial spectrum $\mathbf{E}_0 = \mathbf{E}_0(k_1,k_2)$ in the reciprocal space, Eq. (21) [equation reproduced only as an image in the original]. From Eq. (21) it is clear that $\partial_3\Phi^{(o)} = 0$, thus $\Phi^{(o)}$ is a conserved quantity along the direction of propagation.
Assuming that $\mathbf{E} = \{E_1, E_2, 0\}$, in the paraxial approximation $\partial E_{1,2}/\partial z \approx ik_0E_{1,2}$, Eq. (20) simplifies to Eq. (22) [equation reproduced only as an image in the original]. When the light wave is plane polarized, e.g. $\mathbf{E} = \{E, 0, 0\}$, Eq. (22) takes the form

$\Phi^{(o)} = \dfrac{c}{8\pi k_0}\displaystyle\int\!\!\int \mathrm{Im}\!\left[E^*\dfrac{\partial E}{\partial\vartheta}\right]dx\,dy$ , (23)

or, representing the complex field $E$ as

$E = \sqrt{I}\exp(i\psi)$ , (24)

where $I \in \mathbb{R}$ is the local intensity and $\psi \in \mathbb{R}$ is the local phase of the light wave, the orbital momentum flux after substitution takes the form

$\Phi^{(o)} = \dfrac{c}{8\pi k_0}\displaystyle\int\!\!\int I\,\dfrac{\partial\psi}{\partial\vartheta}\,dx\,dy$ . (25)

Separating $\psi$ explicitly into the defocus term $\varphi$ and the phase $\psi'$ caused by the optical arrangement (e.g. an optical phase mask), we have $\psi = \varphi(x^2 + y^2) + \psi'$. Taking into account that $\psi$ determines the evolution of $I$ along the spatial direction $z$, it is clear from Eq. (25) that $\psi'$ can be chosen such that $\Phi^{(o)} \neq 0$ and $\partial\Phi^{(o)}/\partial\varphi \neq 0$. This illustrates the modulation of the orbital momentum by defocus.
In polar coordinates $(r,\vartheta)$ the amplitude $\mathbf{E} = \{E_1, E_2, 0\}$ of the electric field can be expanded into circular harmonics

$E_1 = \sum_n a_n(r)\,\psi_n(\vartheta)$ , (26)

and

$E_2 = \sum_n b_n(r)\,\psi_n(\vartheta)$ , (27)

where $\psi_n(\vartheta) = \exp(in\vartheta)/\sqrt{2\pi}$ and $n$ is the mode index, or chirality/helicity index. Substituting Eqs. (26) and (27) into Eq. (22), one may obtain Eq. (28) [equation reproduced only as an image in the original]. Analogously to Eq. (19), the cycle-averaged Poynting flux along the Z-axis becomes Eq. (29) [equation reproduced only as an image in the original]. So, the total orbital momentum flux $\Phi^{(o)}$ is a combination of the partial chiral-mode powers $\int |a_n|^2\,r\,dr$ and $\int |b_n|^2\,r\,dr$. If the light wave is not chiral, i.e. the total power of the modes with positive chirality ($n > 0$) equals the total power of the modes with negative chirality ($n < 0$), then Eq. (23) gives $\Phi^{(o)} = 0$.

Equations (25), (28) and (29) and the conservation condition $\partial_3\Phi^{(o)} = 0$ allow us to relate the phase of the light beam, expressed by the complex vector $\mathbf{E}_0(k_1,k_2)$, or the field amplitudes $a_n(r)$ and $b_n(r)$ (complex values) dependent on defocus (for example, due to diffraction), with the detectable quantities, i.e. the spectral intensities $|a_n(r)|^2$ and $|b_n(r)|^2$. As follows from Eqs. (22), (25) and (28), the resulting orbital momentum flux $\Phi^{(o)}$ depends on the light polarization as well as on the phase (chirality). The formulas derived in this section can be generalized for a light beam propagating through an anisotropic (for example, a birefringent or gyrotropic) medium (see, for example, L. Allen et al. and references therein).
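The circular-harmonic decomposition of Eqs. (26)-(27) can be sketched numerically: an FFT over the azimuth yields the amplitudes $a_n(r)$, from which the partial mode powers and the positive/negative chirality balance follow. The polar sampling grid and the single-harmonic test field are illustrative assumptions:

```python
import numpy as np

def circular_harmonics(field):
    """Circular-harmonic amplitudes a_n(r) of a field sampled on a polar
    grid field[r_index, theta_index], via an FFT over the azimuth."""
    ntheta = field.shape[1]
    return np.fft.fft(field, axis=1)/ntheta  # bins ordered n = 0, 1, ..., -1

# Test field with a single positive-chirality harmonic: u = exp(i*2*theta - r^2)
nr, ntheta = 64, 128
r = np.linspace(0.01, 3.0, nr)[:, None]
theta = np.linspace(0.0, 2*np.pi, ntheta, endpoint=False)[None, :]
u = np.exp(1j*2*theta - r**2)

a = circular_harmonics(u)
power = np.sum(np.abs(a)**2*r, axis=0)       # partial mode powers ~ int |a_n|^2 r dr
n_idx = np.fft.fftfreq(ntheta, d=1.0/ntheta) # harmonic index n of each FFT bin
pos_power = power[n_idx > 0].sum()           # total power of n > 0 modes
neg_power = power[n_idx < 0].sum()           # total power of n < 0 modes
```

For this test field essentially all mode power sits at $n = 2$, so the non-chiral balance condition of the text is violated and the orbital flux is nonzero.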
The phase chirality/helicity of a light wave results in rotation of the light beam as the beam propagates in free space. Similar rotation effects take place when the light beam passes through a defocused optical system. We consider the diffraction of a light beam carrying orbital angular momentum specified in terms of circular harmonics, see Eqs. (26)-(27). Assuming that in Eq. (12) $\mathbf{E}(\mathbf{r}) \equiv \mathbf{e}\,u(\mathbf{r})\exp(ik_0z)$, where $\mathbf{e}$ is the unit polarization vector and $u \in \mathbb{C}$ is the slowly-varying amplitude, and using the cylindrical coordinates $(\rho,\vartheta,z)$, the paraxial approximation $|\partial u/\partial z| \ll k_0|u|$ yields the following Cauchy problem for $u = u(\rho,\vartheta,z)$ in the area $0 < \rho < \infty$, $0 \le \vartheta < 2\pi$:

$2ik_0\,\dfrac{\partial u}{\partial z} + \Delta_\perp u = 0$, $\quad u(\rho,\vartheta,z=0) = u_0(\rho,\vartheta)$ , (30)

where $\Delta_\perp$ denotes the transverse Laplacian. Since $\forall\vartheta:\ u_0(\rho,\vartheta) = u_0(\rho,\vartheta + 2\pi)$, one may represent $u_0(\rho,\vartheta)$ as a superposition of circular harmonics

$u_0(\rho,\vartheta) = \sum_n A_n(\rho)\exp(in\vartheta)$ . (31)

It is convenient to solve Eq. (30) for a single circular harmonic and then generalize the result to an arbitrary number of harmonics. Thus, taking for simplicity

$u_0(\rho,\vartheta) = A_0\exp(in\vartheta - \rho^2/a_0^2)$ , (32)

with $A_0$ and $a_0$ being constants of the beam, the solution of Eq. (30) takes the form of Eq. (33) [equation reproduced only as an image in the original], where $z_R = k_0a_0^2/2$ is the Rayleigh length, $I_\nu(x)$ is the modified Bessel function of the first kind and the complex parameter entering Eq. (33) is defined by Eq. (34) [equation reproduced only as an image in the original].
From Eq. (33) it can be seen that a light beam consisting of a superposition of circular harmonics with different azimuthal numbers, for example,

$u_0 = A_0\exp(-\rho^2/a_0^2)\,[1 + \exp(im\vartheta)]$ ,

rotates in the course of diffraction. The diffracted intensity distribution $|u(\rho,\vartheta,z)|^2$ reveals counterclockwise rotation in the transverse plane (viewed against the Z-axis): at $z \to \infty$ the original intensity distribution tends to rotate by $\pi/2$. This result can be easily generalized to an imaging system by applying the "lens transformations" to Eqs. (30) and (33), see, for example, A.D. Polyanin, Handbook of Linear Partial Differential Equations for Engineers and Scientists, Chapman & Hall/CRC, 2002, Chap. 2.2.1.
In an imaging system, replacing $u_0(\rho,\vartheta) \to u_0(\rho,\vartheta)\exp[ik_0\rho^2/(2f)]$, where $f$ is the back focal length, one may find that $|u(\rho,\vartheta,z)|^2$ rotates by $\pi/2$ when the light beam propagates from the plane of the exit pupil ($z = 0$) to the focal plane ($z = f$) of the system. This means that the intensity distribution at $z = f$ can also rotate from 0 to $\pi/2$ depending on the defocus $\varphi$, where formally $\varphi = k_0/(2f)$. The rotation of $|u(\rho,\vartheta,z)|^2$ with defocus follows, for example, from the fact that $|u(\rho,\vartheta,z)|^2 \sim C_{nm} + \cos[(n - m)(\vartheta - \pi/2 - \chi/2)]$, where $C_{nm}$ depends only on the mode indices $n$ and $m$ and $\chi = \arctan(z_R/z)$; here also $\chi = \pi/2 - \arctan(z/z_R)$, with the last arctangent term being the Gouy phase. Replacing $z$ with the "lens transformed" $z$ specified via $z = z'/(1 - z'/f)$ and recalling that $\varphi = k_0/(2f)$, the rotation angle of $|u(\rho,\vartheta,z)|^2$ can be determined for any given defocus $\varphi$.
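The beam-rotation effect described above can be reproduced with a short paraxial propagation sketch. The angular-spectrum propagator, the grid, the beam radius and the moment-based angle estimator below are illustrative assumptions, not the apparatus itself:

```python
import numpy as np

def propagate_paraxial(u0, dx, wavelength, z):
    """Paraxial free-space propagation of a complex amplitude u0 over a
    distance z with the angular-spectrum (Fresnel) transfer function."""
    n = u0.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    H = np.exp(-1j*np.pi*wavelength*z*(FX**2 + FY**2))
    return np.fft.ifft2(np.fft.fft2(u0)*H)

def lobe_angle(u):
    """Orientation of a two-lobe intensity pattern from the phase of the
    second azimuthal moment of the intensity distribution."""
    n = u.shape[0]
    y, x = np.mgrid[-n//2:n//2, -n//2:n//2]
    theta = np.arctan2(y, x)
    I = np.abs(u)**2
    return 0.5*np.angle(np.sum(I*np.exp(2j*theta)))

# Superposition of azimuthal harmonics n = 0 and m = 2 (cf. Fig. 7)
n = 256
a0 = 20.0                                   # beam radius in pixels (dx = 1)
y, x = np.mgrid[-n//2:n//2, -n//2:n//2]
theta = np.arctan2(y, x)
u0 = (1.0 + np.exp(-2j*theta))*np.exp(-(x**2 + y**2)/a0**2)

wavelength = 1.0
zR = np.pi*a0**2/wavelength                 # Rayleigh length z_R = k0*a0^2/2
u1 = propagate_paraxial(u0, 1.0, wavelength, zR)
rotation = lobe_angle(u1) - lobe_angle(u0)  # two-lobe pattern rotates on propagation
```

The measured rotation at $z = z_R$ is a sizable fraction of the limiting $\pi/2$ rotation; its sign depends on the chosen sign conventions.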
We consider now an example of determining the mutual shift of images acquired by different photosensors that belong to different, but highly similar, optical channels. Denoting by Ix (x, y) and I2 (x, y) the intensities of any selected pair of images from the photosensors, one can define their spatial spectra by
T„ (ωχ , ) = J J /„ (x, y) exp[-z((oxx + &>yy)] dxdy , (35) where n = \,2. These spatial spectra can be conveniently expressed in terms of the defocused OTFs
Ĩn(ωx,ωy) = Hn(ωx,ωy) Ĩ0(ωx,ωy) , (36)

where H1(ωx,ωy) ≡ H(ωx,ωy,φ1) and H2(ωx,ωy) ≡ H(ωx,ωy,φ2) are the corresponding defocused OTFs, usually φ1 ≠ φ2 , and

Ĩ0(ωx,ωy) = ∫∫ I0(x,y) exp[−i(ωx x + ωy y)] dx dy (37)

is the spatial spectrum of the intensity distribution I0(x,y) in the object plane. Note that if φ1 ≠ φ2 then the image scaling effects can be attributed to defocusing. Assuming that the exit pupils of both imaging systems are symmetric (for example, centered squares, rectangles, or circles), the Hermitian symmetry of the defocused OTF, i. e. H(ωx,ωy,φ) = H*(−ωx,−ωy,φ), yields H(ωx,ωy,φ) ∈ R. For the images I1(x,y) and I2(x,y), with I2(x,y) shifted with respect to I1(x,y) by Δx and Δy, it can be found from Eq. (35)

Ĩ1(ωx,ωy) = H1(ωx,ωy) Ĩ0(ωx,ωy) ,
Ĩ2(ωx,ωy) = H2(ωx,ωy) Ĩ0(ωx,ωy) exp[−i(ωx Δx + ωy Δy)] . (38)
Recalling that for a symmetric pupil H(ωx,ωy,φ) ∈ R, from Eqs. (38) it can be found

Ĩ2(ωx,ωy) Ĩ1*(ωx,ωy) / [ Ĩ2*(ωx,ωy) Ĩ1(ωx,ωy) ] = exp[−2i(ωx Δx + ωy Δy)] ≡ exp(−2iζ) . (39)

Denoting for simplicity

Ĩ2(ωx,ωy) Ĩ1*(ωx,ωy) = u + iv , (40)

it can be found that

ζ(ωx,ωy) = ωx Δx + ωy Δy = −(1/2) arctan[ 2uv / (u² − v²) ] . (41)
Thus, from the known spectra Ĩ1(ωx,ωy) and Ĩ2(ωx,ωy), for each point (ωx,ωy) in the spatial spectrum the value ζ(ωx,ωy) = ωx Δx + ωy Δy can be calculated. The values Δx and Δy can be subsequently evaluated, for example, by the least mean square error (LMSE) fit of ζ(ωx,ωy) by a plane surface ζ = Aωx + Bωy + C , where the coefficients A , B and C are to be determined in the LMSE fit. So, A and B give the best estimates for Δx and Δy , respectively.

As discussed above, the spatial/temporal optical arrangement of a rangefinder apparatus can be made chiral with at least one independent optical channel. As an example, consider a rangefinder with a spatial chiral arrangement comprising two independent optical channels, where each channel has its own optical mask in the plane of the exit pupil. Alternatively, the system may comprise only one channel which moves from one position to another, taking two consecutive pictures of the same object/scene. The first optical mask, defined in the canonical pupil coordinates x and y , is a square opening x ∈ [−a/2 − c, a/2 − c] and y ∈ [−b/2, b/2] producing the phase delay ϑ(x,y) = 0 . The second mask is a square opening x ∈ [d − a/2, d + a/2] and y ∈ [−b/2, b/2] with a wedge prism introducing the phase delay
ϑ(x,y) = A(x − d) + By , where A , B are the prism constants; see also A.N. Simonov and M.C. Rombach, Opt. Lett. 36, 115-117 (2011). In the reduced spatial frequencies ωx and ωy , the OTF of the first channel H1(ωx,ωy,φ) is given by
H1(ωx,ωy,φ) = H0(ωx,ωy,φ) exp(−i2φωx c) , (42)

in which

H0(ωx,ωy,φ) = sin[φωx a(1 − |ωx|)] sin[φωy b(1 − |ωy|)] / (φωx a · φωy b) (43)

is the defocused OTF of a rectangular aperture of size a × b and φ is the defocus parameter as in A.N. Simonov and M.C. Rombach, Opt. Lett. 36, 115-117 (2011). Similarly, the OTF of the second channel becomes
Η2χγ ,φ) = Η0χγ,φ) εχρ(ϊ[2<ρωχά + Αωχ + Βωγ ]) . (44) Two independent images from the two independent channels can be merged, for example digitally, into a single image. As clearly seen, obtaining the resulting image is equivalent to using an optical system with the OTF
H(ωx,ωy,φ) = H1(ωx,ωy,φ) + H2(ωx,ωy,φ) = 2H0(ωx,ωy,φ) cos Ψ(+) exp(iΨ(−)) , (45)

where

Ψ(±) = (Aωx + Bωy)/2 + φωx(d ± c) . (46)
So, H(ωx,ωy,φ) contains a periodic characteristic pattern of fringes specified by the term cos Ψ(+) . From Eqs. (45, 46) it follows that the inclination of the periodic pattern caused by defocusing occurs only if B ≠ 0 . In this case the two-channel arrangement is chiral.
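The chirality condition B ≠ 0 can be checked numerically. In the sketch below (Python; a minimal illustration of Eq. (46), with all parameter values hypothetical) the orientation of the fringe pattern cos Ψ(+) is taken from the gradient of Ψ(+), which is (A/2 + φ(d + c), B/2) in the (ωx, ωy) plane:

```python
import math

def fringe_tilt(phi, A, B, c, d):
    # Psi_plus = (A*wx + B*wy)/2 + phi*wx*(d + c)   [Eq. (46)]
    # its gradient, (A/2 + phi*(d + c), B/2), is normal to the fringes;
    # the returned angle is the inclination of that normal
    return math.atan2(B / 2, A / 2 + phi * (d + c))
```

With B = 0 the returned angle is zero for every defocus φ, so the pattern does not incline (non-chiral); with B ≠ 0 the angle varies with φ and the defocus can be read from the inclination.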
As a summary, by combining two independent images from two independent optical channels (which can themselves be non-chiral) an effective chiral arrangement can be constructed. This arrangement allows accurate measurement of ranges by, for example, determining the tilting of the characteristic pattern. In addition to ranging, the system with two independent channels provides two undistorted images (when φ is small) of the original object/scene. When φ is not small but can be determined by the rangefinder (via the distance), an in-focus image of the original object/scene can be obtained by digital processing, e. g. Wiener deconvolution; other methods can be found in WO2010/080030.
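The spectral shift-estimation procedure of Eqs. (35)-(41), followed by the LMSE plane fit, can be sketched as follows. This is a minimal numerical illustration (Python/NumPy), assuming noise-free images, identical real and positive OTFs in both channels (so the phase of the cross-spectrum directly equals −ζ and the sign handling of Eq. (41) is not needed), and shifts small enough that no phase wrapping occurs:

```python
import numpy as np

def estimate_shift(img1, img2):
    """Recover (dx, dy): for each spatial frequency the phase
    zeta = wx*dx + wy*dy is taken from I2~ * conj(I1~) (Eqs. 39-40),
    then (dx, dy) follow from an LMSE fit of zeta by the plane
    zeta = A*wx + B*wy + C (A ~ dx, B ~ dy)."""
    # remove the mean so the DC term does not dominate the magnitude mask
    F1 = np.fft.fft2(img1 - img1.mean())
    F2 = np.fft.fft2(img2 - img2.mean())
    ny, nx = img1.shape
    WX, WY = np.meshgrid(2 * np.pi * np.fft.fftfreq(nx),
                         2 * np.pi * np.fft.fftfreq(ny))
    cross = F2 * np.conj(F1)                           # u + i*v of Eq. (40)
    keep = np.abs(cross) > 1e-3 * np.abs(cross).max()  # well-conditioned points
    zeta = -np.angle(cross)[keep]                      # zeta = wx*dx + wy*dy
    M = np.column_stack([WX[keep], WY[keep], np.ones(zeta.size)])
    A, B, _ = np.linalg.lstsq(M, zeta, rcond=None)[0]
    return A, B
```

For sub-pixel shifts well below the Nyquist limit this returns the displacement directly; for large apparent displacements a coarse pre-alignment of the whole images, as discussed later in the present document, would be applied first.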
This invention - aspects of chiral configuration

The inventions disclosed in the present document concern embodiments of an optical rangefinding and imaging apparatus, or, in preferred embodiments, a passive optical rangefinder, henceforth referred to as a rangefinder, or, alternatively, as apparatus. Preferably the functions of rangefinding (measuring distances) and of imaging (providing images) are combined into one apparatus sharing components, but the same results can be obtained by a rangefinding apparatus which is separate and independent from the imaging apparatus, with the data of both apparatus combined at the processing stages. The inventions cover two main sets of aspects: (1) a set of aspects related to chiral configuration and (2) a set of aspects related to angular momentum of light; both sets will be introduced separately. Note that comments mentioned in connection with one set can apply to the other set as well; not all comments and technical clarifications are repeated in the descriptions of both sets. The optical rangefinding and imaging apparatus comprises as main components optics and processing means. Additionally the apparatus can comprise supporting components such as, for example, display means.
Optical arrangements can include at least one optical channel adapted to project at least two images of at least one object onto at least one photosensor. Note that various arrangements are possible: for example, two independent optical arrangements, as in two cameras with their own independent optical channels and their own independent photosensors; or, alternatively, one shared optical channel, a beam splitter and two independent photosensors; or, alternatively, two independent optical channels projecting onto a shared photosensor; and other arrangements not mentioned here, including arrangements with more than two optical channels and more than two photosensors. This document discloses optical arrangements which are chiral optical arrangements adapted to provide at least one chiral arrangement of at least two images, which is an arrangement according to principles of chirality as set forth elsewhere in this document and in the document WO2010/080030.
Processing means include image processing means adapted to provide processing of digital images, spectral processing means adapted to provide spatial spectra and to derive the degree of defocus, and range processing means adapted to provide the range of said object. The image processing means can be processing means as can be found in standard consumer cameras or, alternatively, can be dedicated processing means, for example, to increase image contrast, reduce noise, or remove or interpolate data points which are not usable due to, for example, sensor saturation; the image processing means can also be adapted to provide sharp or in-focus image restoration from the images detected by the photosensor by, for example, Wiener deconvolution - see the theoretical background above. The spectral processing means process images according to principles for analysis and comparison of image spatial spectra according to descriptions which can be found elsewhere in the present document and in WO2010/080030, which document is incorporated in the present document by reference. The spectral processing means are adapted to derive the degree of defocus from the degree of displacement of the spatial spectrum of the chiral combination of images versus a reference. The range processing means are adapted to provide the distance from at least one object to the apparatus by deriving said range from said degree of defocus.
A chiral combination of images can be the merging of at least two images into a single image which is chiral, as in Fig. 6, with 21 and 22 being the two images and 23 the resulting chiral combination. Alternatively, a chiral combination of images can be achieved by virtually comparing the spectra of two images without merging the images, as explained in the theoretical background in the section giving an example of determining the mutual shift of images acquired by different photosensors that belong to different, but highly similar, optical channels, elsewhere in the present document.
The reference versus which the spatial spectrum of the chiral combination of images is compared can be, for example, any pattern in the spatial spectrum that can be used as a reference scale to measure the degree of displacement of the spatial spectrum caused by defocus. The reference pattern can be any synthetic structure in the spatial spectrum, for example, a single line, or a grid of lines. Alternatively, the reference pattern can be a characteristic pattern obtained from the inherent spectral response of the chiral optical arrangement. The characteristic pattern can be conveniently represented by the modulus of the incoherent optical transfer function (OTF), i. e. the modulation transfer function (MTF), calculated with the corresponding chiral optical arrangement. In practice, the characteristic pattern is useful for measuring and evaluating the displacement of the spatial spectra. A periodic pattern of lines is an example of the characteristic pattern resulting from the chiral optical arrangement.
The image processing means can be adapted to provide at least one in-focus image of said object by processing of at least one digital image from at least one optical channel. So, the optical arrangement and image processing means function as a camera. Note that stereoscopic imaging, imaging suitable for 3D viewing by stereoscopy, is an option for optical arrangements including multiple cameras.
Additional processing means can be adapted to combine the ranges of at least two objects into a depth map. In the context of the present document, objects can be physical objects or, alternatively, objects can be any single detectable feature, for example, a small area with contrast, an edge or a dot.
Depth maps contain, and if displayed, show, the relative distance of objects in a scene. Depth maps are becoming main tools for analysis of scenes and are becoming a basis for novel concepts for 3D visualization.
The method for optical rangefinding and imaging comprises projecting at least two images of at least one object onto at least one photosensor by at least one optical arrangement including at least one optical channel, processing of digital images by image processing means, deriving the degree of defocus from the spatial spectrum of at least two images by a combination of spectral and defocus processing means, and providing the range of said object by range processing means. As disclosed in the present document, the method comprises providing at least one chiral combination of at least two images by at least one chiral optical arrangement, deriving the degree of defocus from the degree of displacement of the spatial spectrum of the chiral combination of images versus a reference by the combination of spectral and defocus processing means, and providing the distance from at least one object to the apparatus by deriving said range from said degree of defocus by the range means.
Also, the method can comprise image processing of at least one digital image from at least one optical channel to provide at least one in-focus image of said object by image processing means, and the method can comprise additional processing combining the ranges of at least two objects into a depth map. The preferred embodiments of the present inventions include optical rangefinders operating in visible, infrared or ultraviolet light, which fits the majority of practical applications. However, the described methods for rangefinding can be applied to all wave phenomena (for example, acoustic waves, RF fields etc.). Multiple channels with independent optics. - An alternative embodiment of a rangefinder with multiple optical channels (this includes one-channel systems having a temporal chiral arrangement; one channel is used to take at least two images of the same object, and the two images are used to get range information) may comprise individual optical components and individual photosensors. For example, in a rangefinder comprising two physically separate imaging systems with two individual photosensors the required sensitivity to defocus can be achieved by tilting the optical axis of one of the optical channels with respect to another - see the theoretical background above. By processing two sharp and undistorted images their relative displacement, caused by defocus, can be derived and, finally, the range can be found. The combination of independent channels can be a spatial/temporal chiral optical arrangement adapted to provide chiral modulation. Such a spatial/temporal chiral arrangement may include, for example, at least one independent optical channel to project at least two images of the same object/scene on at least one photosensor such that the combination of images is adapted to provide range information by processing at least two images.
The processing may include evaluating a degree of displacement of the spatial spectrum in response to defocus of the image, or composing a combined (digital) image with the spatial spectrum containing a well-defined characteristic pattern which evolves with the focusing error, and hence range - see the theoretical background in the present document. In the simplest case, the chiral optical arrangement can be achieved, for example, by tilting one of the optical channels versus another channel in an asymmetric tilting arrangement (see also the section Terms and Definitions in the present document and Fig. 5).
Another embodiment of a multiple-channel rangefinder providing simultaneous rangefinding and in-focus imaging of an object includes at least two optical channels with at least two independent photosensors, which photosensors may be, for example, two independent sections of a single photosensor. Alternatively, a rangefinder may comprise only one optical channel adapted to obtain at least two images of the same object in time such that the rangefinder arrangement becomes chiral in time. The optical arrangement of such a rangefinder and imaging apparatus can be adapted to provide modulation of the incoming light such that the images, i. e. digital images, acquired by the photosensors exhibit mutual changes in response to defocus easily detected by processing of their spatial spectra, which spectra are obtained by spectral processing means. For example, defocusing in the rangefinder and imaging apparatus may cause a linear change, a displacement in phase, of one spatial spectrum with respect to another, see the theoretical section above. Note that prior to the analysis of the image spectra, the digital images from the different optical channels can be pre-processed by image processing means, for example, to equalize light intensities in the digital images, or increase their contrast. In the subsequent processing the degree of displacement of the image spectra, caused by defocus, can be converted into the corresponding degree of defocus relative to the in-focus image plane. It is clear that with the derived degree of defocus, one of the digital images can be reconstructed, for example, by the Wiener filtering method, as explained in WO2010/080030, and a final in-focus image can be obtained. In addition, by combining two reconstructed images from two different optical channels, having different viewing angles with respect to an extended object, a single synthetic 3D image of the object can be generated.
Such 3D imaging may require additional visualization devices, for example, a pair of spectacles adapted for 3D imaging, 3D projectors etc. Rangefinding with multiple independent optical channels is of particular practical interest because, first, in the preferred embodiment with two optical channels, it does not require additional optical components. For example, each optical channel may be a standard camera and the optical arrangement of the rangefinder may only require tilting of one camera with respect to another. Second, each optical channel provides an undistorted, i. e. with no mask-related distortions, image of the object of interest, though the image may be defocused. However, the image might still be in-focus when the object is at a large distance, because even very small focusing errors will be detected for rangefinding purposes, which errors do not necessarily deteriorate the image. Also, with the degree of defocus determined by rangefinding, a sharp and in-focus image can be further reconstructed, for example, by the Wiener filtering method (J.W. Goodman, Introduction to Fourier Optics, McGraw-Hill Co., Inc., New York, 1996, Chap. 8). Suitable configurations and optical arrangements of a rangefinder with multiple independent optical channels are determined by the application requirements. A chiral optical arrangement is one of the preferred optical arrangements of a rangefinder.
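The Wiener filtering step mentioned above can be sketched as follows (Python/NumPy; a generic textbook Wiener deconvolution, not the specific procedure of WO2010/080030 — the blur kernel `psf` and the noise-to-signal ratio `nsr` are assumed known, e. g. the kernel from the degree of defocus determined by rangefinding):

```python
import numpy as np

def wiener_restore(blurred, psf, nsr=1e-2):
    # classic Wiener filter: G = conj(H) / (|H|^2 + NSR), applied in the
    # Fourier domain; psf is the centered blur kernel sampled on the
    # same grid as the image, nsr the assumed noise-to-signal ratio
    H = np.fft.fft2(np.fft.ifftshift(psf))
    G = np.conj(H) / (np.abs(H) ** 2 + nsr)
    return np.fft.ifft2(np.fft.fft2(blurred) * G).real
```

A larger `nsr` suppresses noise amplification at frequencies where the OTF is small, at the cost of a less complete restoration.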
However, an optical arrangement of a rangefinder can be non-chiral, but still result in detectable defocus-related changes in the images acquired by the photosensors. Accurate rangefinding can be achieved by processing whole images or their multiple-pixel portions, i. e. sub-images. An example of such image processing is given in the theoretical section of the present document.
Note that processing of multiple sub-images, as disclosed in WO2010/080030, of the corresponding images can be utilized to obtain depth maps of distant objects/scenes. For example, two images of the same object/scene taken by a two-channel rangefinder can be combined digitally into one resulting image which, in turn, can be split into multiple sub-images. Local ranges defined by processing of the individual sub-images can be used to produce a depth map of the object/scene, see the explanations in WO2010/080030.
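The tiling step can be sketched as follows (Python/NumPy; `range_from_tile` is a hypothetical placeholder for the per-sub-image defocus/range estimation of WO2010/080030, which is not reproduced here):

```python
import numpy as np

def depth_map(image, tile, range_from_tile):
    # split the image into non-overlapping tile x tile sub-images and
    # apply the per-sub-image range estimator to each of them
    ny, nx = image.shape[0] // tile, image.shape[1] // tile
    depths = np.empty((ny, nx))
    for i in range(ny):
        for j in range(nx):
            sub = image[i * tile:(i + 1) * tile, j * tile:(j + 1) * tile]
            depths[i, j] = range_from_tile(sub)
    return depths
```

The tile size trades lateral resolution of the depth map against the reliability of each local range estimate.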
Processing in silico and pre-processing of images. - An alternative embodiment of a rangefinder is adapted to provide the required degree of modulation, caused by defocus, by digital processing only, in silico, by combining at least two sub-images from at least one whole image. A further analysis of the combination of sub-images, for example of their distortions, can be used to determine the degree of defocus and thus range. For example, a standard camera takes a digital image, and the processing means selects at least one central sub-image, an image which, for example, is positioned symmetrically around the optical axis, as well as a shifted sub-image, an image which only partly overlaps with the central image and is positioned asymmetrically around the optical axis. The shifted sub-images can be chosen such that in combination they form any type of geometrical asymmetry, including chiral asymmetry. Additional processing means combine said images into a combined image from which, first, the degree of defocus and, second, the distance to the object can be found. One may note that multiple, non-overlapping images acquired by different photosensors can be initially processed, or pre-processed, before the defocus evaluation. Pre-processing can include, for example, increase of contrast, digital selection of features, addition of colors and other digital image manipulation procedures. As an example, consider rangefinding with two independent optical channels followed by the subsequent analysis of multiple sub-images within each acquired image. Assuming that the apparent, defocus-related, spatial displacement of an object on the acquired images is large compared to the size of a sub-image, the sub-images cannot be individually analyzed and the degrees of defocus as well as the corresponding ranges cannot be evaluated. However, one may first analyze the whole images, estimate their mutual displacement and correct for said mutual displacement, after which the multiple sub-images can be further processed.
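The whole-image pre-alignment step can be sketched with standard phase correlation (a generic technique, not one specific to this document; Python/NumPy, assuming an integer-pixel displacement and periodic image boundaries):

```python
import numpy as np

def coarse_shift(img1, img2):
    # integer-pixel displacement of img2 relative to img1: the peak of
    # the inverse FFT of the normalized cross-power spectrum
    F1, F2 = np.fft.fft2(img1), np.fft.fft2(img2)
    cross = F2 * np.conj(F1)
    cross /= np.abs(cross) + 1e-12
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    if dy > img1.shape[0] // 2:      # map to signed shifts
        dy -= img1.shape[0]
    if dx > img1.shape[1] // 2:
        dx -= img1.shape[1]
    return dy, dx
```

The estimated displacement is then removed (for example with np.roll) before the individual sub-images are processed.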
Processing and transformations described above can be applied to a single image acquired by one or several photosensors, or any portion of the image, i. e. sub-image, or any combination of sub-images from said image. Alternatively, processing and transformations can be carried out on multiple digital images, or sub-images, or any synthetic combination, meaning combinations in silico, of the multiple images and sub-images. For example, separate processing of a number of sub-images from a single image obtained by a rangefinder with a chiral optical arrangement allows, first, determining defocus and thus the distance to the objects within every sub-image, and, second, reconstructing a depth map for a whole scene. The optical arrangement of a rangefinder with a single optical channel or multiple optical channels, including chiral optical arrangements, may have a preferred plane of rangefinding, providing a directional sensitivity to defocus. For example, a rangefinder with two optical channels positioned in the ZY-plane (the Z-axis is along the optical axis) may provide much higher sensitivity to defocus Wy = φy² than to Wx = φx². Explanations can be found, for example, in WO2010/080030. The directional sensitivity is of particular interest because, first, the shape of an object can be sensed remotely by turning the rangefinder about its optical axis; second, the dynamic range of the rangefinder can be significantly increased without losing sensitivity in the preferred plane of rangefinding. Alternatively, the directional sensitivity and/or the increased dynamic range can be achieved with additional optical components, including optical phase and amplitude masks (also slits and shaped aperture stops). One example, explained in WO2010/080030, and applicable to a rangefinder with multiple optical channels, is a cubic optical phase mask.
The preferred method for rangefinding is to measure the defocus-induced displacement of at least one characteristic pattern versus the reference pattern in the spatial spectrum of the detected image. The spatial spectrum is obtained by spectral decomposition of the image, generally by the Fourier transformation. For example, the image I(x,y), in this context a two-dimensional intensity distribution on a photosensor, results in the spatial spectrum Ĩ(ωx,ωy), where x, y are the spatial variables and ωx, ωy are the corresponding spatial spectrum variables. Alternatively, the spectral decomposition can be accomplished by digital algorithms from the digital image, or directly from the image by an optical processor, for example, the optical processor described in US4556950. The reference pattern is any convenient pattern in the spectrum domain, for example, a single vertical line (i. e. ωx = 0, where ωx is the reduced spatial frequency measured along the X-axis) that is used as a reference to measure the degree of displacement of the image spectrum caused by defocus. The characteristic pattern results from the inherent spectral response of the optical arrangement, or modulator, see also the explanations in WO2010/080030. The characteristic pattern can be conveniently represented by the modulus of the incoherent optical transfer function (OTF), i. e. the modulation transfer function (MTF), calculated for a given optical arrangement. The person skilled in the art may conclude that the defocus-induced changes in the spectral domain of the image are inevitably associated with spatial transformations of the image itself. In this connection, the unknown range can be evaluated directly from the spatial displacement of the image on a photosensor. The displacements may include shift, rotation, scaling of the image, and any combination thereof. The displacement is easy to derive in an optical arrangement with multiple optical channels having individual photosensors. An example of the lateral shift calculation from two independent images is considered in the theoretical section of the present document and in WO2009/108050. The degree of the lateral shift of the images can be directly calibrated, for example at a manufacturer's facility, in units of defocus, for example, wavelength, and in units of absolute range such as micrometers, meters or kilometers, depending on the design and application of the particular rangefinder.
This invention - aspects of angular momentum
A rangefinder comprises at least one modulator to modulate the angular momentum of the incoming light. Modulation of the angular momentum (or detection thereof) can be provided by chiral optical elements and/or polarizing elements included in at least one imaging system, or, alternatively, provided by a spatial chiral optical arrangement, or a temporal chiral optical arrangement, or, alternatively, provided by chiral modulation at digital processing, in silico, from, for example, at least two sub-images from at least one image. Optical arrangements may include any number of optical channels providing an effective chirality of the rangefinder. Embodiments of modulators to modulate either the orbital angular momentum or the spin angular momentum are disclosed, as well as combinations thereof and combinations of modulators with converters to convert orbital angular momentum into spin angular momentum and vice versa.
In the preferred embodiment of a rangefinder, first, at least one image is projected by at least one imaging system onto at least one photosensor. The imaging system and the photosensor can, for example, be a standard camera lens and a standard photosensor or, alternatively, special imaging optics and a photosensor optimized for particular rangefinding needs. Second, the light is modulated such that the degree of modulation depends on the degree of defocus in the imaging system. Third, the degree of defocus is derived from the degree of modulation of the angular momentum, which can be detected from the intensity variations on the photosensor. The modulation of the angular momentum may include modulation of the orbital momentum and/or spin momentum. Fourth, the range for a single object, or a group of objects, can be derived from the degree of defocus by using, for example, formulas presented in WO2010/080030, or as described by Nayar (Nayar et al., Proc. of Fifth Intl. Conf. on Computer Vision, 995-1001, Cambridge, MA, USA, 1995). To achieve a high accuracy in the range measurements, the modulation of the angular momentum can be adjusted for the highest sensitivity with respect to defocus variations, i. e. the value |dM/dφ| is maximized, where M denotes the degree of modulation and φ is defocus.
The rangefinder also includes image processing means, or digital processing means including defocus-from-modulation processing means, deriving the degree of defocus from the degree of modulation, and range-from-defocus processing means, deriving the range (as described above) of the object from the degree of defocus. The rangefinder also includes at least one modulator providing a degree of modulation which depends on the degree of defocus. The modulator provides modulation of the angular momentum of the incoming light. For example, a rangefinder may comprise a chiral optical arrangement with two independent optical channels to modulate the orbital momentum of light. The degree of modulation in this case can be represented by the angular position and spacing of the periodic pattern of lines in the spatial spectrum of the combined image. Recalculating the angle and spacing of lines into defocus, the unknown distance can be evaluated, for example, by using formulas from WO2010/080030, or as described by Nayar (Nayar et al., Proc. of Fifth Intl. Conf. on Computer Vision, 995-1001, Cambridge, MA, USA, 1995).
Digital processing means may include calculating digital means, digital display means, digital storage means, means for transformations, for example, Fourier transformations, and other supporting means. All calculations are preferably non-iterative calculations allowing high processing speeds. Defocus-from-modulation processing means and methods thereof are based on novel physical concepts and allow deriving the degree of defocus directly from the degree of modulation, which degree of modulation is represented by, for example, the degree of at least one displacement of the spatial spectrum of the image relative to a reference pattern or, alternatively, the degree of light intensities of a spectrum or, alternatively, a combination of said displacement and intensity. The term directly derive refers to a single-step process which does not include, for example, deconvolution steps. The reference pattern generally corresponds to the inherent spectral optical response of the modulator, known a priori. Range-from-defocus processing means and methods thereof are based on traditional methods to derive range from defocus discussed above. The modulator providing the degree of modulation of angular momentum can include at least one phase modulator, for example a chiral optical element, to modulate the phase of the incoming light or, alternatively, include at least one polarization modulator to modulate the polarization, i. e. spin angular momentum, of the incoming light. Also, a combination of said masks can be adapted to modulate the combination of phase and polarization of the incoming light. Alternatively, the modulator providing the degree of modulation of angular momentum can include a specific arrangement of at least one imaging system adapted to provide the degree of modulation versus defocus. In a rangefinder with multiple optical channels the degree of modulation can be obtained by a proper spatial or temporal arrangement of the multiple channels.
For example, after imaging by an imaging system a beam splitter can provide at least two optical channels which channels can be separately modulated, for example, by introducing a degree of optical tilt in only one of the channels with the other channel remaining the reference channel. Alternatively, at least two imaging systems in combination with at least one photosensor can provide at least two independent optical channels adjusted to provide a degree of modulation versus defocus, by, for example, introducing optical tilt in one of the channels, with the other channels remaining the reference channels. The modulator providing the degree of modulation of the angular momentum can also include any combination of at least one of said physical masks and at least one of said optical channels.
A number of embodiments and combinations of embodiments can provide the desired modulation of the angular momentum of the incoming light. Alternative embodiments to the same effect can likely be designed, which alternatives are considered to be included here by this reference. Clearly, the embodiment of choice for a particular application depends, for example, on the specifications of the application, such as the need for sharp imaging in combination with rangefinding, the desired resolution of imaging and desired accuracy, as well as economic considerations. Such embodiments include, but are not restricted to, the following embodiments:
Phase modulators. - Modulation of phase can be obtained in a number of different ways by phase modulators comprised of various passive and active optical elements, for example, optical phase and amplitude masks, diffractive optical elements, holograms, spatial light modulators, fixed and deformable mirrors etc. The phase modulation can be applied to rangefinders having a single optical channel or multiple optical channels. In a rangefinder with multiple optical channels, one or several phase modulators can be integrated in one of the optical channels, or be an integral part of the whole optical arrangement. In the basic case, a phase modulator can provide a chiral/helical phase of the light beam resulting, for example, in a non-zero orbital angular momentum. In other cases, phase modulators can provide non-zero angular momentum of the light beam such that, for example, the characteristic pattern in the spatial spectrum exhibits displacements in response to defocusing. The a priori unknown defocus can be derived from this displacement and the range can be evaluated. The rangefinder can include at least one phase modulator providing modulation of the orbital angular momentum which causes displacement of the image (or its spatial spectrum) in the plane of the photosensor. The degree of defocus, and thus the range, can be derived from said degree of displacement. Polarization modulators. - Modulation of polarization can be obtained in a number of different ways by polarization modulators, also referred to as polarization masks, comprised of various passive and active optical elements, for example, polarizers, waveplates, Fresnel rhombs and Faraday rotators, electro-optical and magneto-optical devices, liquid-crystal devices etc. The polarization modulation can be applied to rangefinders having a single optical channel or multiple optical channels.
In the basic case, a polarization modulator changes the flux of spin angular momentum, so a light beam passing through a polarization modulator acquires a non-zero total spin angular momentum flux Φ^(s) ≠ 0; for example, the light beam becomes circularly polarized.
Additionally, a polarization modulator may provide a variable degree of modulation depending on defocus. In other cases, polarization modulators can provide non-zero angular momentum of the light beam such that, for example, the characteristic pattern in the spatial spectrum exhibits displacements in response to defocusing. The a priori unknown defocus can be derived from this displacement and the range can be evaluated. The rangefinder can include combinations of one or multiple phase modulators, one or multiple polarization modulators, or any combinations of said modulators. Also, said modulators can be combined with said converters in at least one modulator/converter combination. The combination of modulators may include at least one phase modulator, which can be a chiral phase modulator, and at least one polarization modulator. In such a combination the modulators can be arranged to maximize the changes in the light beam (intensity distribution, polarization etc.) caused by defocus. The combination of modulators can also provide at least one distinct characteristic pattern in the spatial spectrum of the image such that the degree of displacement of said pattern corresponds to the degree of defocus. Said combination of modulators might be adapted to maximize the detectability of the characteristic pattern, for example, by maximizing the degree of displacement in response to a given degree of defocus. An important role of the characteristic pattern is to provide a reference (for example, in the spectral domain) independent of the spatial structure of the object scene.
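The displacement-based readout described above can be sketched numerically. The snippet below is a minimal illustration, not the disclosed apparatus: the linear calibration constant `C_CAL`, the reference position `K0` and the synthetic one-dimensional spectrum are all invented for the example.

```python
import numpy as np

# Assumed linear calibration: the characteristic peak in the spatial
# spectrum shifts by C_CAL pixels per unit of defocus (hypothetical).
C_CAL = 4.0   # spectral shift per unit defocus (assumed)
K0 = 32       # peak position at zero defocus (assumed reference)

def synth_spectrum(defocus, n=128):
    """Synthetic |spatial spectrum| with one characteristic peak."""
    k = np.arange(n)
    centre = K0 + C_CAL * defocus
    return np.exp(-0.5 * ((k - centre) / 2.0) ** 2)

def defocus_from_displacement(spectrum):
    """Invert the calibration: peak displacement -> degree of defocus."""
    peak = int(np.argmax(spectrum))
    return (peak - K0) / C_CAL

print(defocus_from_displacement(synth_spectrum(3.0)))  # 3.0
```

In a real rangefinder the mapping between displacement and defocus would follow from the optical design and a calibration step, not from an assumed constant.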
Note that a photosensor such as a CMOS or CCD sensor can only sense intensities of light and cannot sense polarization and phase of light. So, the polarization modulator is generally combined with an additional optical component, for example, an additional polarization modulator (in this context, a polarization demodulator), to convert changes in spin angular momentum into a corresponding degree of light intensity that is detected by a photosensor. Such a polarization modulator/demodulator combination thus provides modulation of light intensity of which the degree of modulation depends on the degree of defocus.

Multiple channels with dependent optics. - An alternative embodiment of a rangefinder can provide sensing of defocus and range by using a multiple-channel optical arrangement. Such a rangefinder may include optical components and/or photosensors shared between different optical channels. For example, in a rangefinder with a single primary lens the light is split into two optical channels by a beam splitter and is detected by two independent photosensors, or, alternatively, two independent sectors on the same photosensor. The required modulation can be achieved, for example, by a phase modulator placed in only one of the optical channels. In this example, first, the primary lens is shared between the optical channels and, second, the channel without the modulator can provide sharp, undistorted imaging of the object of interest.
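The polarization modulator/demodulator readout described above can be sketched as follows, assuming (hypothetically) that defocus rotates the polarization plane linearly and that the demodulator is a fixed analyzer obeying Malus's law; the calibration constant `G` and the whole model are invented for the illustration.

```python
import numpy as np

G = 0.1    # polarization rotation (rad) per unit defocus (hypothetical)
I0 = 1.0   # detected intensity at zero defocus, analyzer aligned

def detected_intensity(defocus):
    """Malus's law: the analyzer converts polarization rotation into intensity."""
    return I0 * np.cos(G * defocus) ** 2

def defocus_from_intensity(intensity):
    """Invert the model; single-valued while G*defocus stays in [0, pi/2)."""
    return np.arccos(np.sqrt(intensity / I0)) / G

print(round(defocus_from_intensity(detected_intensity(2.5)), 6))  # 2.5
```

The inversion is only unambiguous over a limited defocus interval, which is why a practical design would choose `G` to match the expected defocus range.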
Rangefinding principles disclosed in the present document are based on the fact that electromagnetic waves may carry angular momentum (see, for example, A.I. Akhiezer and V.B. Berestetskii, Quantum Electrodynamics, John Wiley & Sons, 1965, and L. Allen et al., Optical Angular Momentum, IOP, 2003) which, in many practical cases, can be split into orbital and spin parts. The angular momentum of light, in turn, can be modulated by phase and polarization modulators as described above. By making the degree of modulation dependent on defocus, the unknown defocus can be encoded into the degree of modulation of the angular momentum, which can be converted by modulation processing means into a detectable variation of the light intensity on a photosensor. Alternatively, the degree of modulation can be fixed (independent of defocus), but the resulting light beam with non-zero angular momentum may undergo spatial-temporal transformations depending on the degree of defocus; see the example above with a superposition of two circular harmonics. These transformations can be detected directly by a photosensor or converted into detectable variations of light intensity by modulation processing means. Modulation processing means may comprise various optical components, for example, phase and amplitude optical masks and polarizers. One example of a rangefinder employing modulation of the angular momentum is a rangefinder with a phase-only optical mask changing the total orbital angular momentum flux from Φ^(o) = 0 (light in the plane of the first optical surface) to Φ^(o) ≠ 0 (light after the optical mask), see the formulas above. For a light beam with Φ^(o) ≠ 0, it was also shown that the light intensity distribution rotates as a function of defocus. The rotation of the light intensity can be detected by a photosensor and, thus, the unknown defocus and range can be determined. Similarly, optical rangefinding can be performed with polarization modulators. In this case defocus may cause variations in the degree of polarization, which variations can be detected by a photosensor combined with an additional polarizer (i.e. modulation processing means). Alternatively, the degree of modulation can be fixed, but the total spin angular momentum flux Φ^(s) ≠ 0 and the polarization of light changes, for example, on its passage through an additional anisotropic medium depending on defocus. Additionally, any combination of phase and polarization modulators can be employed in a rangefinder on condition that the combination provides, directly or through modulation processing means, a detectable variation of the light intensity on a photosensor depending on defocus.
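The rotating-intensity readout can be illustrated with a toy pattern: a single off-axis spot stands in for the rotating intensity distribution, its angular position is read from the intensity centroid, and a hypothetical linear calibration `ROT_PER_DEFOCUS` maps the rotation back to defocus. None of the numbers below come from the disclosed apparatus.

```python
import numpy as np

ROT_PER_DEFOCUS = np.deg2rad(15.0)  # assumed: 15 deg of rotation per unit defocus

def spot_image(angle, n=129, r=30.0, sigma=3.0):
    """Toy intensity pattern: one Gaussian spot at the given polar angle."""
    y, x = np.mgrid[:n, :n] - n // 2
    cx, cy = r * np.cos(angle), r * np.sin(angle)
    return np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * sigma ** 2))

def rotation_angle(img):
    """Read the pattern orientation from the intensity centroid."""
    n = img.shape[0]
    y, x = np.mgrid[:n, :n] - n // 2
    w = img / img.sum()
    return np.arctan2((y * w).sum(), (x * w).sum())

img = spot_image(ROT_PER_DEFOCUS * 2.0)                 # pattern at defocus = 2
print(round(rotation_angle(img) / ROT_PER_DEFOCUS, 3))  # 2.0
```

A real rotating-PSF system would track the full pattern (for example by correlation in polar coordinates) rather than a single centroid, but the inversion from rotation angle to defocus is the same idea.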
A simultaneous independent coupling of both the spin and orbital momenta of light with inhomogeneous and anisotropic media, predicted by Piccirillo et al., Phys. Rev. E 69, 056613 (2004), allows conversion of spin angular momentum into orbital angular momentum and vice versa. Spin-to-orbital angular momentum conversion can be achieved, for example, in a planar slab of a uniaxial birefringent medium as described, for example, by Marrucci et al., Phys. Rev. Lett. 96, 163905 (2006). The rangefinder can include at least one polarization-to-phase converter adapted to convert a degree of spin angular momentum into a corresponding degree of orbital angular momentum. Also, the rangefinder can include at least one phase-to-polarization converter adapted to convert a degree of orbital angular momentum into a corresponding degree of spin angular momentum.
The rangefinding method comprises projection of an image of at least one object onto a photosensor; modulation of the incoming light such that the degree of modulation depends on the degree of defocus in the imaging system, for example, in the plane of the exit pupil; derivation of the degree of defocus in the imaging system from the degree of modulation; and derivation of the range from the object to the rangefinder based on the degree of defocus. The method also includes modulation of the angular momentum of the incoming light. The modulation of angular momentum can be modulation of the orbital angular momentum, modulation of the spin angular momentum, or modulation of both momenta. The method can include phase-to-polarization conversion, which converts orbital angular momentum into spin angular momentum, and/or polarization-to-phase conversion, which converts spin angular momentum into orbital angular momentum. The method derives the degree of defocus of the imaging system either from the displacement of a pattern in the spatial spectrum or from the light intensity.
Lastly, the range from the object to the imaging system can be derived from the degree of defocus as described above.

Applications
Knowing the distance from a rangefinder to an object, or multiple objects, or an extended scene along one axis (for example, the optical axis of the rangefinder), in a plane or in three-dimensional (3D) space is of practical importance for a variety of consumer, 3D entertainment/imaging, automotive, military/defense, security, technical and scientific applications.
The rangefinder disclosed in the present document can be adapted for distance and speed measurements of single or multiple objects using visible, IR or UV light, or multispectral light. The described methods of rangefinding can, in principle, be adapted to any type of wave process. In addition, the presented rangefinders are likely to be inexpensive because of standard optical components and photosensors. Optical phase masks are of a simple design and can be easily manufactured. In one of the preferred embodiments, a rangefinder is a standard, mass-produced photocamera with an additional optical phase mask. In an alternative embodiment, for example, a rangefinder is a combination of two standard photocameras without additional optical components. Evaluation of the unknown range from defocus is carried out by digital processing means, which may include standard computer devices, digital signal processors (DSP), devices based on field-programmable gate arrays (FPGA) or other digital evaluation means and components.
Note that a rangefinder comprising multiple independent optical channels can also be adapted for remote sensing of displacements and speed in lateral directions. So, the position of objects and their speed in the direction perpendicular, for example, to the optical axis of the rangefinder can be determined by processing of images acquired by the independent photosensors. One example of such processing is the Fourier decomposition of the combined image. In this case, the spatial spectrum of the combined image reveals a line pattern whose spacing and orientation depend on the speed and movement direction of the object.
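The line pattern in the spectrum of a combined image can be read out numerically: by the Wiener-Khinchin relation, fringes of spacing n/d in the power spectrum transform back to an autocorrelation peak at the displacement d. A toy one-dimensional sketch with a synthetic signal (not the patent's processing chain) follows; the displacement is in pixels.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 1024, 37
s = rng.standard_normal(n)            # stand-in for one photosensor image row
combined = s + np.roll(s, d)          # two copies displaced by d pixels

power = np.abs(np.fft.fft(combined)) ** 2    # fringed power spectrum, spacing ~ n/d
acorr = np.fft.ifft(power).real              # fringes <-> peak at the displacement
est = int(np.argmax(acorr[5 : n // 2]) + 5)  # skip the zero-lag peak
print(est)  # 37
```

With two-dimensional images the same peak position would also encode the movement direction, matching the spacing-and-orientation statement above.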
Note that active rangefinders such as, for example, radar or sonar cannot be adapted for passive rangefinding. However, the passive rangefinders as disclosed in the present document can, of course, be adapted for active rangefinding by, for example, combining the rangefinder with a light source projecting a light spot, or a grid, or dots, or any light pattern on the object. Detection, analysis and digital processing in an active rangefinding mode are, in essence, the same as for images obtained in a passive mode.
Depth maps and wavefront maps. - The optical rangefinder can be adapted to provide the degrees of defocus of at least two sub-images from the corresponding sub-areas of the image, or of several images, acquired by one or several photosensors. With the information on defocus of sub-images corresponding to multiple objects, or spatially extended objects, a complex scene can be completely characterized so that, for example, a defocus map composed of the local degrees of defocus can be directly constructed. In addition, the optical rangefinder can also be adapted to provide a wavefront map, in which the wavefront is constructed from the local degrees of defocus corresponding to said sub-images. Construction of the continuous wavefront map can be carried out analogously to the wavefront construction methods developed for Shack-Hartmann wavefront sensors. Static and dynamic defocus maps, depth maps and wavefront maps are relatively novel and promising approaches for military and home-security applications because of additional options for optical, and generally passive, detection of movement and speed of objects and of changes in wavefront characteristics. Said 3D capability for depth-map applications will only increase the scope of applications.
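A defocus map of the kind described above can be assembled by tiling the image and applying a per-tile defocus estimator. In this sketch `estimate_defocus` is a caller-supplied stand-in (here a trivial lambda), not the patent's estimator; only the tiling logic is illustrated.

```python
import numpy as np

def build_defocus_map(image, tile, estimate_defocus):
    """Split the image into tile x tile sub-areas and estimate local defocus."""
    ny, nx = image.shape[0] // tile, image.shape[1] // tile
    dmap = np.empty((ny, nx))
    for i in range(ny):
        for j in range(nx):
            sub = image[i * tile:(i + 1) * tile, j * tile:(j + 1) * tile]
            dmap[i, j] = estimate_defocus(sub)
    return dmap

img = np.zeros((64, 64))
dmap = build_defocus_map(img, 16, lambda sub: float(sub.mean()))
print(dmap.shape)  # (4, 4)
```

A wavefront map would then be fitted through the local defocus values, analogously to Shack-Hartmann reconstruction, as the text notes.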
So, as a result of rangefinding and sensing of displacements in lateral directions, an observer can obtain a 3D map of a spatially extended scene, for example on a 3D display, as well as indications of lateral displacements of the separate objects.

Image reconstruction. - An image reconstruction apparatus can be designed including a certain embodiment of the optical rangefinder, as described in the present document, to provide at least one in-focus image of the object, which in-focus image is reconstructed, for example, from the spatial spectrum of a single defocused/distorted image, or of multiple defocused images. One embodiment of the image reconstruction apparatus may combine, for example, rangefinding using several independent optical channels, as described in the present document, and subsequent processing, as suggested by A. Simonov and M. Rombach, Opt. Lett. 34, 2111 (2009), of at least two, generally defocused, images acquired by the photosensors. Alternatively, any of the image correction/reconstruction methods disclosed in the theoretical section of the present document, or in WO2009/108050, can be used. Note that the accuracy and processing time of the image restoration algorithms (in WO2009/108050 and Opt. Lett. 34, 2111, 2009) depend, to a large degree, on the initial guess for the unknown defocus. Thus, by combining accurate rangefinding with image reconstruction one may finally obtain a sharp image of the object. The optical arrangement of the image reconstruction apparatus may include, for example, at least one chiral optical arrangement for imaging such that the OTF of the apparatus has no zeros in at least one region of the spatial spectrum, usually close to zero, even at large focusing errors; see also the explanations in WO2010/080030. Those skilled in the art will conclude that zeros in the spatial spectrum of the image cannot be removed by any digital post-filtering. Clearly, such image reconstruction is less necessary with rangefinders comprising multiple independent optical channels, since sharp, or reasonably sharp, images can be obtained directly from the image sensors.
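The role of an accurate defocus estimate in reconstruction can be illustrated with a Wiener filter: once the defocus, and hence the blur OTF, is known, an in-focus image can be recovered by regularized inverse filtering. This is a generic sketch, not the methods of WO2009/108050: a Gaussian OTF stands in for the true defocus OTF, and the noise-to-signal ratio `nsr` is an assumed regularization constant.

```python
import numpy as np

def wiener_restore(blurred, otf, nsr=1e-3):
    """Wiener deconvolution with a known OTF and an assumed noise level."""
    B = np.fft.fft2(blurred)
    return np.real(np.fft.ifft2(B * np.conj(otf) / (np.abs(otf) ** 2 + nsr)))

n = 64
f = np.fft.fftfreq(n)
fx, fy = np.meshgrid(f, f)
otf = np.exp(-0.5 * (2 * np.pi * 2.0) ** 2 * (fx ** 2 + fy ** 2))  # sigma = 2 px blur

rng = np.random.default_rng(1)
sharp = rng.random((n, n))
blurred = np.real(np.fft.ifft2(np.fft.fft2(sharp) * otf))  # simulated defocused image
restored = wiener_restore(blurred, otf)

err_blur = np.abs(blurred - sharp).mean()
err_rest = np.abs(restored - sharp).mean()
print(err_rest < err_blur)  # True
```

The example also shows why the document stresses OTF designs without spectral zeros: wherever the OTF vanishes, the Wiener gain collapses to zero and no post-filtering can recover that frequency band.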
Pathfinder. - An optical rangefinder can be adapted to become an optical pathfinder apparatus providing an estimate of at least one optical path length, for example, the optical path length between a fixed object and the optical rangefinder, provided that the absolute distance is known beforehand, or known a priori. In the context of the present document, the optical path length (OPL) and the absolute distance (L) are given by

OPL = ∫_C n(s) ds and L = ∫_C ds,

respectively, where n(s) is the refractive index along the optical path and C is the integration path. The OPL can vary while the range L remains constant, for example, due to variations in the refractive index of the medium; if the medium is air, due to, for example, air pressure, humidity etc. The rangefinder can also be adapted to measure atmospheric disturbances and turbulence, wind speed and even acoustic waves in air, operating as an optical microphone, and be adapted to measure equivalent phenomena in liquids.
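The two path integrals can be evaluated numerically; the sketch below uses the trapezoidal rule on a straight 100 m path through air with an illustrative refractive-index profile (the sinusoidal perturbation is invented for the example).

```python
import numpy as np

# OPL = integral of n(s) ds over C; L = integral of ds over C.
s = np.linspace(0.0, 100.0, 10001)                       # path coordinate [m]
n_air = 1.000293 + 1e-6 * np.sin(2 * np.pi * s / 100.0)  # toy n(s) for air

ds = s[1] - s[0]
L = ds * (len(s) - 1)                           # geometric distance: 100 m
opl = ds * (n_air[:-1] + n_air[1:]).sum() / 2   # trapezoidal rule for OPL
print(round(L, 9), round(opl - L, 6))           # OPL exceeds L by ~0.0293 m
```

The centimetre-scale OPL excess over a constant 100 m range is exactly the kind of refractive-index effect the pathfinder adaptation is meant to sense.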
Modal wavefront sensor. - The optical rangefinder described above can be adapted to provide modal wavefront sensing and, thus, become a modal wavefront sensor. Those skilled in the art will conclude that the concepts set forth in the present document can be extended to likely any aberration of choice by adaptation and derivation of the formulas and mathematical concepts disclosed in the present document. Since the rangefinding methods disclosed in the present document rely on the 'range from defocus' principle, defocus is the primary aberration determined by the rangefinder. One may, however, consider adaptations of the rangefinder such that changes in the degree of modulation of angular momentum, or variations in the spatial/spectral structure of the images, might be caused by any aberration of the wavefront (for example, in the plane of the exit pupil). For example, in the Zernike modal representation, an optical phase mask with the phase function ϑ(x, y) = Axy, where A is the steepness of the phase function, results in scaling and rotation of the OTF in response to an astigmatic distortion W = ϕ(x² − y²) of the wavefront.

Claims
1. Optical rangefinding and imaging apparatus comprising:
- at least one optical arrangement including at least one optical channel adapted to project at least two images of at least one object onto at least one photosensor;
- image processing means adapted to provide processing of digital images;
- spectral processing means adapted to provide spatial spectra and to derive the degree of defocus of images, and,
- range processing means adapted to provide the distance from at least one object to the apparatus,
characterized in that:
- the optical arrangement includes at least one chiral optical arrangement adapted to provide at least one chiral combination of said two images,
- the spectral processing means is adapted to derive the degree of defocus from the degree of displacement of the spatial spectrum of the chiral combination of images versus a reference,
- the range processing means is adapted to provide the distance from at least one object to the apparatus from the degree of defocus.
2. Apparatus according to Claim 1 characterized in that image processing means is adapted to provide at least one in-focus image of said object by processing of at least one digital image from at least one optical channel.
3. Apparatus according to Claims 1-2 characterized in that the apparatus comprises additional processing means adapted to combine the ranges of at least two objects into a depth map.
4. Method for optical rangefinding and imaging comprising projecting at least two images of at least one object onto at least one photosensor by at least one optical arrangement including at least one optical channel, processing of digital images by image processing means, deriving the degree of defocus from the spatial spectrum by spectral processing means, and range processing to provide the distance from at least one object to the apparatus characterized in that the method includes arranging of at least two images into at least one chiral combination of images by at least one chiral optical arrangement, deriving the degree of defocus from the degree of displacement of the spatial spectrum of the chiral combination of images versus a reference by the spectral processing means and providing the range of said object by deriving said range from said degree of defocus by range processing means.
5. Method according to Claim 4 characterized in that the method comprises image processing of at least one digital image from at least one optical channel to provide at least one in-focus image of said object by image processing means.
6. Method according to Claim 4 characterized in that the method comprises additional processing combining the ranges of at least two objects into a depth map.
7. Apparatus comprising at least one imaging system adapted to project at least one image of at least one object on at least one photosensor, at least one modulator adapted to provide a degree of modulation corresponding to the degree of defocus in the imaging system, defocus-from-modulation processing means adapted to derive the degree of defocus from said degree of modulation, and range-from-defocus processing means adapted to derive the range from the object to the apparatus based on said degree of defocus characterized in that the modulator is adapted to provide modulation of the angular momentum of the incoming light.
8. Apparatus according to Claim 7 characterized in that the modulator includes at least one phase modulator adapted to provide modulation of the orbital angular momentum.
9. Apparatus according to Claim 7 characterized in that the modulator includes at least one polarization modulator adapted to provide modulation of the spin angular momentum.
10. Apparatus according to Claims 7-9 characterized in that the modulator includes at least one polarization-to-phase converter adapted to convert a degree of spin angular momentum into a corresponding degree of orbital angular momentum.
11. Apparatus according to Claims 7-9 characterized in that the apparatus also comprises at least one phase-to-polarization converter adapted to convert a degree of orbital angular momentum into a corresponding degree of spin angular momentum.
12. Apparatus according to Claim 7 characterized in that the apparatus comprises at least one modulator/converter combination which combination includes at least one modulator according to Claims 8-9 and/or at least one converter according to Claims 10-11.
13. Apparatus according to any of the foregoing Claims characterized in that the modulator is adapted to provide displacement of at least one characteristic pattern in the spatial spectrum of which the degree of displacement depends on the degree of defocus.
14. Apparatus according to Claim 13 characterized in that the apparatus includes defocus-from-modulation processing means adapted to derive the degree of defocus from said degree of displacement.
15. Apparatus according to any combination of the foregoing Claims characterized in that the apparatus comprises a polarization modulator/demodulator combination including at least one polarization modulator and at least one polarization demodulator which combination is adapted to provide modulation of light intensity of which degree of modulation depends on the degree of defocus.
16. Apparatus according to Claim 15 characterized in that the apparatus includes defocus-from-modulation processing means adapted to derive the degree of defocus from said degree of modulation of light intensity.
17. Apparatus according to Claims 14-16 characterized in that the apparatus includes range-from-defocus processing means adapted to derive the range from the object to the apparatus from said degree of defocus.
18. Method for rangefinding, comprising projecting at least one image of at least one object onto at least one sensor, modulating the light such that the degree of modulation depends on the degree of defocus, deriving the degree of defocus from the degree of modulation, and deriving the range from the object to the apparatus based on the degree of defocus characterized in that the method includes modulation of the angular momentum.
19. Method for rangefmding according to Claim 18 characterized in that the method includes modulation of phase to modulate the orbital angular momentum.
20. Method for rangefinding according to Claim 18 characterized in that the method includes modulation of polarization to modulate the spin angular momentum.
21. Method for rangefinding according to Claims 18-20 characterized in that the method includes phase-to-polarization conversion to convert orbital angular momentum into spin angular momentum.
22. Method for rangefinding according to Claims 18-20 characterized in that the method includes polarization-to-phase conversion to convert spin angular momentum into orbital angular momentum.
23. Method for rangefinding according to Claims 18-22 characterized in that the method includes modulation such that at least one characteristic pattern in the spatial spectrum is provided of which a degree of displacement depends on the degree of defocus.
24. Method for rangefinding according to Claims 18-22 characterized in that the method includes modulation such that a degree of light intensity is provided which degree depends on the degree of defocus.
25. Method for rangefinding according to any of Claims 23 and 24 characterized in that the method includes derivation of the range from at least one object to the apparatus from the degree of defocus.
PCT/NL2011/050301 2010-05-03 2011-05-03 Improved optical rangefinding and imaging apparatus WO2011139150A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
NL2004644 2010-05-03
NL2005234 2010-08-16

Publications (1)

Publication Number Publication Date
WO2011139150A1 true WO2011139150A1 (en) 2011-11-10

Family

ID=44484800

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/NL2011/050301 WO2011139150A1 (en) 2010-05-03 2011-05-03 Improved optical rangefinding and imaging apparatus

Country Status (1)

Country Link
WO (1) WO2011139150A1 (en)


Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4556950A (en) 1983-09-02 1985-12-03 Environmental Research Institute Of Michigan Incoherent optical processor
US4856884A (en) * 1988-03-16 1989-08-15 The United States Of America As Represented By The Secretary Of The Air Force Dynamically matched optical filtering in a multiple telescope imaging system
US5337181A (en) * 1992-08-27 1994-08-09 Kelly Shawn L Optical spatial filter
US20040017502A1 (en) 2002-07-25 2004-01-29 Timothy Alderson Method and system for using an image based autofocus algorithm
WO2005098501A1 (en) 2004-04-09 2005-10-20 Canon Kabushiki Kaisha Solid-state image pickup device for auto-focus and auto-focus camera using the same
US20060109369A1 (en) 2001-10-26 2006-05-25 Fuji Photo Film Co., Ltd. Device and method for autofocus adjustment
US7218448B1 (en) 1997-03-17 2007-05-15 The Regents Of The University Of Colorado Extended depth of field optical systems
US20080137059A1 (en) * 2006-06-05 2008-06-12 University Of Colorado Method And System For Optical Imaging And Ranging
US20080205872A1 (en) 2007-02-27 2008-08-28 Canon Kabushiki Kaisha Focus detection device
WO2009108050A1 (en) 2008-02-27 2009-09-03 Aleksey Nikolaevich Simonov Image reconstructor
WO2010080030A2 (en) 2009-01-09 2010-07-15 Aleksey Nikolaevich Simonov Optical rangefinder an imaging apparatus with chiral optical arrangement

Non-Patent Citations (25)

* Cited by examiner, † Cited by third party
Title
A. SIMONOV, M. ROMBACH, OPT. LETT., vol. 34, 2009, pages 2111
A.D. POLYANIN: "Handbook of Linear Partial Differential Equations for Engineers and Scientists", 2002, CHAPMAN&HALL/CRC
A.I. AKHIEZER, V.B. BERESTETSKII: "Quantum Electrodynamics", 1965, JOHN WILEY & SONS
A.N. SIMONOV, M.C. ROMBACH, OPT. LETT., vol. 36, 2011, pages 115 - 117
C.N. ALEXEYEV ET AL., J. OPT. SOC. AM. A, vol. 25, 2008, pages 643 - 646
F.J. BELINFANTE, PHYSICA, vol. VI, no. 9, 1939, pages 887 - 898
G.N. AFANASIEV ET AL., NUOVO CIMENTO, vol. 109, 1996, pages 271 - 279
J.D. JACKSON: "Classical Electrodynamics", 1999, JOHN WILEY & SONS
J.W. GOODMAN: "Introduction to Fourier Optics", 1996, MCGRAW-HILL CO., INC.
K. ENGELHARDT, R. KNOP, APPL. OPT., vol. 34, 1995, pages 2339 - 2344
L. ALLEN ET AL.: "Optical Angular Momentum", IOP, 2003
L. MARRUCCI ET AL., PHYS. REV. LETT., vol. 96, 2006, pages 163905
LEACHTENAUER J C ET AL: "Linear Shift Invariant Imaging Systems", 1 January 2001, SURVEILLANCE AND RECONNAISSANCE IMAGING SYSTEMS: MODELING AND PERFORMANCE PREDICTION, ARTECH HOUSE, US, PAGE(S) 75 - 113, ISBN: 978-1-58053-132-0, XP002553938 *
M. PETITJEAN, J. MATH. PHYS., vol. 43, 2002, pages 4147 - 4157
MARRUCCI ET AL., PHYS. REV. LETT., vol. 96, 2006, pages 163905
N.N. BOGOLIUBOV, D.V. SHIRKOV: "Introduction to the Theory of Quantized Field", 1980, JOHN WILEY & SONS
NAYAR ET AL., PROC. OF FIFTH INTL. CONF. ON COMPUTER VISION, 1995, pages 995 - 1001
NAYAR, PROC. OF FIFTH INTL. CONF. ON COMPUTER VISION, 1995, pages 995 - 1001
NIEMINEN ET AL., J. OPT. A: PURE APPL. OPT., vol. 10, 2008, pages 115005
OPT. LETT., vol. 34, 2009, pages 2111
PICCIRILLO, PHYS. REV. E, vol. 69, 2004, pages 056613
S.M. BARNETT, J. OPT., vol. 4, 2002, pages S7 - S17
S.M. BARNETT, J. OPT. B, vol. 4, 2002, pages S7 - S17
SALOMON ET AL., J. MATER. CHEM., vol. 25, 1999, pages 295 - 308
V. FOCK: "Theory of Space, Time and Gravitation", 1964, PERGAMON PRESS

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9516242B2 (en) 2012-07-20 2016-12-06 Carl Zeiss Ag Method and apparatus for image reconstruction
DE102012106584B4 (en) * 2012-07-20 2021-01-07 Carl Zeiss Ag Method and device for image reconstruction
DE102012106584A1 (en) * 2012-07-20 2014-01-23 Carl Zeiss Ag Method and apparatus for image reconstruction
US9910255B2 (en) 2013-09-11 2018-03-06 DENTSPLY SIRONA, Inc. Optical system for generating a pattern which changes over time for a confocal microscope
EP3125013A1 (en) * 2013-09-11 2017-02-01 Sirona Dental Systems GmbH Optical system for the generation of a pattern for a confocal microscope which changes over time
EP3096172A1 (en) * 2013-09-11 2016-11-23 Sirona Dental Systems GmbH Optical system for the generation of a pattern for a confocal microscope which changes over time
WO2015036467A1 (en) * 2013-09-11 2015-03-19 Sirona Dental Systems Gmbh Optical system for generating a pattern which changes over time for a confocal microscope
WO2017093227A1 (en) * 2015-12-02 2017-06-08 Carl Zeiss Ag Method and device for image correction
US10748252B2 (en) 2015-12-02 2020-08-18 Carl Zeiss Ag Method and device for image correction
US11145033B2 (en) 2017-06-07 2021-10-12 Carl Zeiss Ag Method and device for image correction
CN113237834A (en) * 2021-07-08 2021-08-10 成都信息工程大学 Chiral molecule chiral resolution device and method based on optical spin Hall effect
CN113237834B (en) * 2021-07-08 2021-09-14 成都信息工程大学 Chiral molecule chiral resolution device and method based on optical spin Hall effect
CN114722354A (en) * 2022-06-10 2022-07-08 苏州大学 Method, device and storage medium for calculating flux density of normalized orbital angular momentum
CN114722354B (en) * 2022-06-10 2022-10-18 苏州大学 Method, apparatus and storage medium for calculating normalized orbital angular momentum flux density
WO2023236323A1 (en) * 2022-06-10 2023-12-14 苏州大学 Method for calculating normalized orbital angular momentum flux density, and device and storage medium

Similar Documents

Publication Publication Date Title
EP2386053B1 (en) Optical rangefinder and imaging apparatus with chiral optical arrangement
WO2011139150A1 (en) Improved optical rangefinding and imaging apparatus
CN108780142A (en) 3D imaging systems and method
CN106017683B (en) Obtaining spectral information from a moving object
JPH10508107A (en) Apparatus and method for determining a three-dimensional shape of an object using relative blur in an image due to active illumination and defocus
WO2008128024A1 (en) Compact snapshot polarimetry camera
US11624700B2 (en) Efficient reading of birefringent data
CN109087395A (en) A kind of method and system of three-dimensional reconstruction
CN108007574A (en) The fast illuminated image spectrum linear polarization detection device of resolution ratio adjustable type and method
CN104535188A (en) Static full-polarization imaging detection system and method for spatial frequency modulation
US11341620B2 (en) Background correction for birefringence measurements
Zhou et al. 3D shape measurement based on structured light field imaging
Cossairt et al. Digital refocusing with incoherent holography
US11017546B2 (en) Method and device for depth detection using stereo images
JP2016134732A (en) Imaging device and imaging method
Colburn et al. Single-shot three-dimensional imaging with a metasurface depth camera
CN108594617A (en) The big view field imaging recording method of incoherent digital hologram and device
JP3495854B2 (en) Phase distribution measuring method and phase shift electronic moiré device using the same
Peña Gutiérrez Design and construction of a snapshot full-Stokes polarimetric camera: seeing through fog
Wang et al. 3D Reconstruction of Weakly Textured Objects Based On Denoising Polarized Images
Zuo Chip-integrated Metasurface Polarimetric Imaging Sensor and Applications
Hu et al. Metasurface-based computational imaging: a review
NL2002406C2 (en) Optical range finder and imaging apparatus.
JP2021060288A (en) Polarization measurement device and polarization measurement method
Eslami Light field analysis and its applications in adaptive optics and surveillance systems

Legal Events

Date Code Title Description

121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 11731143
Country of ref document: EP
Kind code of ref document: A1

NENP Non-entry into the national phase
Ref country code: DE

122 Ep: pct application non-entry in european phase
Ref document number: 11731143
Country of ref document: EP
Kind code of ref document: A1