WO1999030197A1 - An omnidirectional imaging apparatus


Publication number
WO1999030197A1
Authority
WO
WIPO (PCT)
Prior art keywords
paraboloid
shaped reflector
image
scene
imaging apparatus
Application number
PCT/US1998/025689
Other languages
French (fr)
Inventor
Shree K. Nayar
Malcolm J. Macfarlane
Original Assignee
The Trustees Of Columbia University In The City Of New York
Application filed by The Trustees Of Columbia University In The City Of New York
Priority to KR1020007006159A (patent KR100599423B1)
Priority to EP98963782A (patent EP1042697A1)
Priority to BR9813370-5A (patent BR9813370A)
Priority to AU19033/99A (patent AU1903399A)
Priority to CA002312970A (patent CA2312970A1)
Priority to JP2000524698A (patent JP2001526471A)
Publication of WO1999030197A1


Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B13/00Optical objectives specially designed for the purposes specified below
    • G02B13/06Panoramic objectives; So-called "sky lenses" including panoramic objectives having reflecting surfaces
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/58Means for changing the camera field of view without moving the camera body, e.g. nutating or panning of optics or image sensors
    • G06T3/12
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19617Surveillance camera constructional details
    • G08B13/19626Surveillance camera constructional details optical details, e.g. lenses, mirrors or multiple lenses
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19617Surveillance camera constructional details
    • G08B13/19626Surveillance camera constructional details optical details, e.g. lenses, mirrors or multiple lenses
    • G08B13/19628Surveillance camera constructional details optical details, e.g. lenses, mirrors or multiple lenses of wide angled cameras and camera groups, e.g. omni-directional cameras, fish eye, single units having multiple cameras achieving a wide angle view
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19639Details of the system layout
    • G08B13/19641Multiple cameras having overlapping views on a single scene
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/55Optical parts specially adapted for electronic image sensors; Mounting thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/2628Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/74Projection arrangements for image reproduction, e.g. using eidophor

Definitions

  • This invention relates to omnidirectional image sensing with reference to a single viewpoint, and, more particularly to such image sensing using a truncated, substantially paraboloid-shaped reflector.
  • For many applications such as surveillance, teleconferencing, remote sensing, photogrammetry, model acquisition, virtual reality, computer graphics, machine vision and robotics, it is desirable that an imaging system have a large field of view so as to take in as much information as possible about the world around it.
  • Traditional imaging systems include a camera with a lens that provides a perspective projection of an image.
  • a camera with even a very wide angle lens has only a limited field of view (i.e., covering less than a full hemisphere). This limited field of view may be expanded by tilting and panning the entire imaging system about its center of projection.
  • S.E. Chen "QuickTime VR - An Image-Based Approach to Virtual Environment Navigation", Proc.
  • Nalwa discloses the use of multiple planar reflecting surfaces in conjunction with multiple charge coupled device (“CCD”) cameras to obtain a 360 degree panoramic image of a 50 degree band of a hemispherical scene.
  • four planar mirrors are arranged in the shape of a pyramid, with one camera being positioned above each of the four planar reflecting sides, and with each camera viewing slightly more than 90 degrees by 50 degrees of the hemispherical scene.
  • This system suffers from the serious drawback of requiring multiple sensors to capture a 360-degree image.
  • this system suffers from the inherent problems associated with distortion at the "seams" when the separate images are combined to provide a full 360 degree view.
  • Curved reflective surfaces have also been used in conjunction with image sensors.
  • Yagi et al. "Evaluating Effectivity of Map Generation by Tracking Vertical Edges in Omnidirectional Image Sequence", IEEE International Conference on Robotics and Automation, June 1995, p. 2334
  • Yagi et al. "Map-Based Navigation for a Mobile Robot With Omnidirectional Image Sensor COPIS”
  • IEEE Transactions on Robotics and Automation, Vol. 11, No. 5, Oct. 1995, disclose a conical projection image sensor (COPIS) which uses a conical reflecting surface to gather images from the surrounding environment, and processes the information to guide the navigation of a mobile robot.
  • Although the COPIS is able to attain viewing in 360 degrees, it is not a true omnidirectional image sensor because the field of view is limited by the vertex angle of the conical mirror and by the viewing angle of the camera lens. Furthermore, the COPIS does not have a single viewpoint, but instead has a locus of viewpoints lying on a circle. This locus of multiple viewpoints causes distortion in the captured images, which cannot be eliminated to obtain pure perspective images.
  • Yamazawa et al. "Obstacle Detection With Omnidirectional Image Sensor HyperOmni Vision", IEEE International Conference on Robotics and Automation, Oct. 1995, p. 1062, discloses a purported improvement in the COPIS system which involves the use of a hyperboloidal reflecting surface in place of a conical surface. As discussed therein, the rays of light which are reflected off the hyperboloidal surface, no matter where the point of origin, will all converge at a single point, thus enabling perspective viewing.
  • Although a hyperboloidal mirror is advantageous in that it enables full perspective image sensing, the rays of light which make up the reflected image converge at the focal point of the reflector, so positioning of the sensor relative to the reflecting surface is critical, and any disturbance will impair the image quality.
  • the use of a perspective-projection model inherently requires that as the distance between the sensor and the mirror increases, the cross-section of the mirror must increase. Therefore, practical considerations dictate that in order to keep the mirror at a reasonable size, the mirror must be placed close to the sensor. This in turn causes complications to arise with respect to the design of the image sensor optics.
  • Mapping the sensed image to usable coordinates requires complex calibration due to the converging nature of the reflected image.
  • a further drawback is that the relative positions of the mirror and the optics cannot be changed while maintaining a single viewpoint.
  • a hyperboloidal mirror system cannot take advantage of the relative movement of the mirror and optics to adjust the field of view of the system, while maintaining a single viewpoint.
  • The present invention provides an omnidirectional imaging apparatus for sensing an image of a scene from a single viewpoint that includes a truncated, substantially paraboloid-shaped reflector positioned to orthographically reflect principal rays of electromagnetic radiation radiating from the scene.
  • the paraboloid-shaped reflector has a focus coincident with the single viewpoint of the omnidirectional imaging apparatus.
  • the omnidirectional imaging apparatus also includes telecentric means, optically coupled to the paraboloid-shaped reflector, for substantially filtering out principal rays of electromagnetic radiation which are not orthographically reflected by the paraboloid-shaped reflector.
  • the omnidirectional imaging apparatus further includes one or more image sensors positioned to receive the orthographically reflected principal rays of electromagnetic radiation from the paraboloid-shaped reflector, thereby sensing the image of the scene.
  • the paraboloid-shaped reflector of the present invention may be either convex or concave.
  • the telecentric means may include a telecentric lens, a telecentric aperture, or a collimating lens.
  • the paraboloid-shaped reflector comprises a substantially paraboloidal mirror having a surface which substantially obeys the equation, expressed in cylindrical coordinates:

    z = (h² − r²) / 2h,

    z being an axis of rotation of the surface, r being a radial coordinate, and h being a constant.
  • the shape of the surface is not a function of the angular coordinate θ.
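Assuming the paraboloidal surface z = (h² − r²)/2h with its focus at the origin (the relation the constants z, r, and h above describe), the geometry can be checked numerically. The following Python sketch is illustrative only; the function names are not from the patent. It uses the focus-centred polar form of the parabola, ρ = h/(1 + cos θ), from which the orthographically reflected image radius r = h·tan(θ/2) follows.

```python
import math

def paraboloid_z(r, h):
    """Mirror surface height at radius r, per z = (h**2 - r**2) / (2*h)."""
    return (h * h - r * r) / (2.0 * h)

def image_radius(theta, h):
    """Radius at which a scene ray through the focus, at angle theta from
    the paraboloidal axis, lands after orthographic reflection.  It follows
    from the focus-centred polar form rho = h / (1 + cos(theta)) that
    r = rho * sin(theta) = h * tan(theta / 2)."""
    return h * math.tan(theta / 2.0)

h = 1.0
# The mirror meets the plane of truncation (z = 0) at radius h, so a mirror
# truncated through its focus has diameter 2h.
assert abs(paraboloid_z(h, h)) < 1e-12
# A ray arriving along the truncation plane (theta = 90 degrees) reflects
# to the mirror rim, radius h.
assert abs(image_radius(math.pi / 2.0, h) - h) < 1e-12
```

Because the projection depends only on the angle θ and the constant h, the field of view of a mirror truncated through its focus is exactly one hemisphere.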
  • the one or more image sensors comprise one or more video cameras. These video cameras may employ one or more charge-coupled devices or one or more charge injection devices. Alternatively, the one or more image sensors may comprise photographic film. In another preferred embodiment, at least one image sensor has a non-uniform resolution to compensate for the non-uniform resolution of the image reflected from the paraboloid-shaped reflector.
  • the paraboloid-shaped reflector comprises a mirror truncated at a plane which includes the focus of the paraboloid-shaped reflector and which is perpendicular to the axis passing through the focus and the vertex of the paraboloid-shaped reflector.
  • the paraboloid-shaped reflector is mounted on a fixed base and the one or more image sensors are mounted on a movable base, whereby movement of the one or more image sensors produces a changing field of view.
  • the paraboloid-shaped reflector may be mounted on a movable base and the one or more image sensors may be mounted on a fixed base, whereby movement of the paraboloid-shaped reflector produces a changing field of view.
  • the one or more image sensors provide an image signal representative of the image of the scene.
  • An image signal processing apparatus is coupled to the one or more image sensors and converts the image signal from the image sensors into image signal data.
  • the image signal processing apparatus then maps the image signal data into a Cartesian-coordinate system to produce a perspective image or into a cylindrical-coordinate system to produce a panoramic image.
  • the image signal processing may include interpolation means for providing interpolated image data, whereby the interpolated image data and the image signal data are combined to form a digital image.
  • the image processing apparatus may further include means for zooming in on a preselected portion of the digital image to thereby provide an enlarged image of the preselected portion from a predetermined focal distance.
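As a sketch of the Cartesian-coordinate mapping described above (names and conventions are illustrative, not taken from the patent): each pixel of the desired perspective view corresponds to a viewing direction out of the single viewpoint, and that direction indexes the omnidirectional image at radius h·tan(θ/2), where θ is the angle between the direction and the paraboloidal axis.

```python
import math

def omni_coords(dx, dy, dz, h):
    """Map a viewing direction (dx, dy, dz) out of the single viewpoint
    (the focus) to (x, y) in the orthographically reflected omnidirectional
    image.  Azimuth is preserved; the radius follows r = h * tan(theta / 2),
    theta being the angle between the direction and the z axis."""
    norm = math.sqrt(dx * dx + dy * dy + dz * dz)
    theta = math.acos(dz / norm)       # angle from the paraboloidal axis
    r = h * math.tan(theta / 2.0)      # image radius
    phi = math.atan2(dy, dx)           # azimuthal angle, unchanged
    return r * math.cos(phi), r * math.sin(phi)

# A direction along the axis maps to the image centre.
assert omni_coords(0.0, 0.0, 1.0, 1.0) == (0.0, 0.0)
```

Zooming or panning amounts to re-evaluating this lookup for a different set of viewing directions; no image reconstruction is needed because every direction shares the one viewpoint.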
  • the omnidirectional imaging apparatus comprises at least one lens optically coupling the one or more image sensors and the paraboloid-shaped reflector.
  • This coupling lens may be a zoom lens, a microscope objective, or a field-flattening lens.
  • the field-flattening lens has a field curvature approximately opposite to the field curvature of the paraboloid-shaped reflector.
  • the field-flattening lens is either a plano-concave lens or an aplanatic, meniscus lens.
  • the omnidirectional imaging apparatus is used to image a substantially spherical scene by using two truncated, substantially paraboloid-shaped reflectors positioned to orthographically reflect principal rays of electromagnetic radiation radiating from two complementary hemispherical scenes.
  • the two paraboloid-shaped mirrors are positioned to share a common paraboloidal axis.
  • When the two paraboloid-shaped reflectors are convex, they are positioned back-to-back along their planes of truncation, such that they share a common focus point.
  • When the two paraboloid-shaped reflectors are concave, they are positioned such that their vertexes coincide.
  • a plurality of beam splitters are provided for splitting the orthographically reflected principal rays of electromagnetic radiation from the paraboloid-shaped reflector into a plurality of ray bundles.
  • a plurality of image sensors is required, with each image sensor positioned to receive at least one of the plurality of ray bundles, and thereby sensing a portion of the image of the scene.
  • a plurality of dichroic beam splitters is provided for splitting the orthographically reflected principal rays of electromagnetic radiation from the paraboloid-shaped reflector into a plurality of monochromatic principal rays of electromagnetic radiation.
  • a plurality of image sensors is required, with each image sensor positioned to receive at least one of the plurality of monochromatic principal rays of electromagnetic radiation, and thereby sensing at least one monochromatic image of the scene.
  • a method for sensing an image of a scene from a single viewpoint includes the steps of: (a) orthographically reflecting principal rays of electromagnetic radiation radiating from the scene on a truncated, substantially paraboloid-shaped reflector such that the single viewpoint of the omnidirectional imaging method coincides with a focus point of the paraboloid-shaped reflector;
  • a method for omnidirectionally sensing images of a scene from a single viewpoint includes:
  • the paraboloid-shaped reflector may be mounted on a movable base and the image sensors may be mounted on a fixed base.
  • the above-described method also includes the step of optically coupling the paraboloid-shaped reflector and the image sensors with a zoom lens, which may be used to magnify an area of interest in the scene.
  • Fig. 1A is a side view of an exemplary embodiment of an omnidirectional imaging apparatus;
  • Fig. 1B is a side view of an alternate embodiment in which a paraboloid-shaped reflector is connected to an image sensor by a transparent support;
  • Fig. 2 is an isometric view of a paraboloid-shaped reflector mounted on a base plate;
  • Fig. 3 is a partially isometric view of a paraboloid-shaped reflector mapped into a cylindrical coordinate system;
  • Fig. 4 is a geometric representation of an orthographic reflection from a curved reflecting surface;
  • Fig. 5 is an illustration of orthographic reflection from a substantially paraboloid-shaped reflector to an image sensor;
  • Fig. 6 illustrates how any selected portion of a hemispherical scene can be viewed from the single viewpoint using a paraboloid-shaped reflector;
  • Fig. 7 is a side view of an omnidirectional imaging apparatus with two back-to-back substantially paraboloid-shaped reflectors and two image sensors;
  • Fig. 8 is a cross-sectional view of two substantially paraboloid-shaped reflectors positioned back-to-back and having a common paraboloidal axis and a common focus;
  • Fig. 9 illustrates the mapping of image data to cylindrical coordinates to enable production of a panoramic view
  • Fig. 10 is a flowchart of an exemplary embodiment of a method for sensing and processing an image of a substantially hemispherical scene from a single viewpoint;
  • Fig. 11 is a side view of an embodiment of an omnidirectional imaging apparatus according to the present invention, which includes an extended paraboloid-shaped reflector;
  • Fig. 12 is a side view of an embodiment of an omnidirectional imaging apparatus according to the present invention, which includes a paraboloid-shaped reflector truncated at a plane that is tilted with respect to the paraboloidal axis of the reflector;
  • Fig. 13 is a side view of an embodiment of an omnidirectional imaging apparatus according to the present invention, which includes a paraboloid-shaped reflector that is larger than the imaging area of the image sensor;
  • Fig. 14 is a side view of an embodiment of an omnidirectional imaging apparatus according to the present invention, which includes a concave paraboloid-shaped reflector;
  • Fig. 15 is a side view of an embodiment of an omnidirectional imaging apparatus according to the present invention, which includes a zoom lens optically coupling a paraboloid-shaped reflector and an image sensor;
  • Fig. 16 is a partially isometric view of an embodiment of an omnidirectional imaging apparatus according to the present invention, which includes a paraboloid-shaped reflector mounted on a movable base;
  • Fig. 17 A is a side view of an embodiment of an omnidirectional imaging apparatus according to the present invention, which includes an image sensor mounted on a movable base;
  • Fig. 17B is a side view of an embodiment of an omnidirectional imaging apparatus according to the present invention, which includes a movable camera
  • Fig. 17C is a side view of an embodiment of an omnidirectional imaging apparatus according to the present invention, which includes a movable camera and optics;
  • Fig. 18 is a partially isometric view of an embodiment of an omnidirectional imaging apparatus according to the present invention, which includes an image sensor comprising four charge coupled devices positioned side-by-side;
  • Fig. 19 is a side view of an embodiment of an omnidirectional imaging apparatus according to the present invention, which includes multiple image sensors and beam splitters;
  • Fig. 20 is a top view of an image sensor according to an embodiment of the present invention whose sensing elements are non-uniformly distributed and sized;
  • Fig. 21 is a side view of an embodiment of an omnidirectional imaging apparatus according to the present invention, which includes a planar mirror that optically couples a paraboloid-shaped reflector and an image sensor;
  • Fig. 22 is a side view of an embodiment of an omnidirectional imaging apparatus according to the present invention, which includes a microscope objective that optically couples a paraboloid-shaped reflector and an image sensor;
  • Fig. 23 is a side view of an embodiment of an omnidirectional imaging apparatus according to the present invention, which includes a collimator lens that optically couples a paraboloid-shaped reflector and an imaging lens;
  • Fig. 24A is a side view of an embodiment of an omnidirectional imaging apparatus according to the present invention, which includes a field-flattening planoconcave lens;
  • Fig. 24B is a side view of an embodiment of an omnidirectional imaging apparatus according to the present invention, which includes a field-flattening meniscus lens with aplanatic sides; and
  • Fig. 25 is a side view of an embodiment of an omnidirectional imaging apparatus according to the present invention, which includes two concave paraboloid-shaped mirrors used to image a substantially spherical view.
  • FIG. 1A illustrates an omnidirectional imaging apparatus 100 according to an exemplary embodiment of the present invention.
  • An image sensor 110 such as a commercially available Sony 3CCD color video camera device 111 having a telecentric lens or a magnifying lens 112 and a telecentric aperture 113, is positioned to receive the orthographic reflection of the image.
  • the telecentric lens or aperture functions to filter out all rays of light which are not perpendicular to the plane of the lens or aperture, i.e., background light which does not form part of the orthographic reflection of the hemispherical scene.
  • the paraboloid-shaped reflector may be coupled to the image sensor by a transparent support 136, such as a length of clear tubing.
  • the video camera 110 generates an analog video signal representative of the orthographically reflected image which is sent through cable 150.
  • the video signal is converted to a digital signal by digitizer 120, which is a commercially available NTSC video signal analog-to-digital converter.
  • the digital signal is then sent through a cable 155 to a general purpose computer 125, such as a DEC Alpha 3000/600 workstation.
  • the computer 125 is programmed to allow the user to view any desired portion of the hemispherical scene, to zoom in on a selected portion of the scene, or to pan the scene in any desired manner.
  • the image sensor 110 may simply be a still or motion picture photographic camera using conventional photographic film.
  • the image sensor 110 may also be a camcorder or video camera 116 which provides a digital video signal output, which can be provided directly to the computer 125 without the need for the analog-to-digital converter 120.
  • Fig. 2 shows an isometric view of the paraboloid-shaped reflector 135, which extends from base 140 from which it is formed.
  • the reflector 135 may comprise a paraboloid-shaped plastic body coated with a thin layer 145 of highly reflective metal, such as aluminum or silver.
  • the reflector 135 may comprise a paraboloid-shaped, polished metal body.
  • a metal such as stainless steel may be used.
  • Fig. 3 illustrates in greater detail the preferred geometry of the paraboloid-shaped reflector 135, as well as the orthographic reflection of the image of the substantially hemispherical scene 130 onto the image sensor 110.
  • the reflector 135 of Fig. 3 is defined in cylindrical coordinates, r, θ and z, as substantially obeying the equation

    z = (h² − r²) / 2h,     (1)

    z being the axis of rotation, r the radial coordinate, and h a constant.
  • the z axis coincides with the optical axis of the imaging arrangement
  • a focus point 315 of the paraboloid defined by equation (1) coincides with the origin of the coordinate system.
  • the reflector 135 of Fig. 3 is truncated at a plane p which is substantially perpendicular to the z axis 310 and which includes the focus point 315 of its paraboloidal surface.
  • z and r are perpendicular cylindrical coordinates for a given value of θ, the angular coordinate.
  • the angle of an incoming ray 405 relative to the z axis is θ.
  • the incoming ray 405 is orthographically reflected by the reflecting surface 415 as an outgoing ray 410, so the angle between the outgoing ray 410 and the z axis is zero.
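The orthographic property can be verified numerically: reflecting any ray that leaves the focus off the surface z = (h² − r²)/2h yields an outgoing ray parallel to the z axis. The sketch below (illustrative names, not from the patent) works in the r-z plane, where the surface normal is proportional to (r/h, 1) because dz/dr = −r/h.

```python
import math

def reflect_off_paraboloid(theta, h):
    """Reflect a ray leaving the focus (the origin) at angle theta from the
    z axis off the mirror z = (h**2 - r**2) / (2*h); returns the outgoing
    direction (dr, dz) in the r-z plane."""
    # Focus-centred polar form of the parabola gives the surface point.
    rho = h / (1.0 + math.cos(theta))
    r = rho * math.sin(theta)
    # Unit incoming direction from the focus to the surface point.
    ir, iz = math.sin(theta), math.cos(theta)
    # Surface normal, proportional to (r/h, 1), normalized to unit length.
    nr, nz = r / h, 1.0
    nlen = math.hypot(nr, nz)
    nr, nz = nr / nlen, nz / nlen
    # Mirror reflection: d' = d - 2 (d . n) n.
    dot = ir * nr + iz * nz
    return ir - 2.0 * dot * nr, iz - 2.0 * dot * nz

for deg in (10, 45, 80):
    dr, dz = reflect_off_paraboloid(math.radians(deg), 1.0)
    # The radial component vanishes: the outgoing ray is parallel to the
    # z axis, i.e., the reflection is orthographic.
    assert abs(dr) < 1e-12 and abs(dz + 1.0) < 1e-12
```

This is the classical focal property of the paraboloid, and it is what lets the telecentric optics select exactly the rays that passed through the single viewpoint.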
  • the omnidirectional imaging apparatus in accordance with the present invention enables viewing of any portion of the scene, enables zooming in on a selected portion, and enables panning of the scene, all with respect to the single viewpoint and without requiring image reconstruction or complex frame transformation.
  • Fig. 5 illustrates how a portion of the substantially hemispherical scene is viewed by the image sensor from a single viewpoint.
  • Fig. 5 also illustrates how a truncated convex substantially paraboloid-shaped reflector 135 is mapped into a Cartesian coordinate system.
  • the optical axis 502 of the imaging arrangement is coincident with the z axis, and the focus 501 of the substantially paraboloid-shaped reflector 135 is located at the origin.
  • Incoming rays 505 and 510 from a portion of the scene 300 being viewed intersect the reflecting surface at points 515 and 520, which can be defined by their respective x and y coordinates.
  • Points 515 and 520 lie along imaginary radial lines 516 and 521, respectively, which originate at the viewpoint of the scene, i.e., the focus 501 of the paraboloid-shaped reflector.
  • the image sensor 110 includes a planar charge-coupled device ("CCD") image sensor having an array of light sensing cells.
  • Fig. 6 illustrates a technique for zooming in on any selected portion of the substantially hemispherical scene.
  • the reflector 135 is positioned relative to orthogonal x, y and z axes in the same manner as in Fig. 5.
  • As shown in Fig. 6, a line segment drawn between a point 570 in the selected portion of the scene and focus point 551 intersects the reflector 135 at point 552, and the light intensity signal generated by the CCD cell which lies at 580 is chosen.
  • the light intensity at point 570 is set equal to that represented by the image signal generated by the CCD cell at 580 which is located at the x-y coordinate on the grid nearest to the x-y coordinate of point 552.
  • The same procedure is repeated for each CCD cell within the same range of x-y coordinates as the region of the reflecting surface projecting the selected portion of the scene.
  • A general purpose computer 125 can be readily programmed by one skilled in the art to perform the above steps to enable viewing of any portion of the hemispherical scene from a single viewpoint, and also to enable zooming in on any particular portion to provide an enlarged image of that portion. Furthermore, by designating successive points along the reflector, the hemispherical scene can be panned as if one were viewing the scene from a single viewpoint. In the embodiment discussed above, it is readily apparent that as one zooms in on smaller portions of the scene, the number of CCD cells providing information to the computer 125 is reduced, and hence the granularity of the viewed image is increased. In a preferred embodiment, information about points in the scene which do not exactly correspond to CCD cells is more closely approximated by interpolation.
  • a suitable interpolation program which may be executed on computer 125 is included in Appendix I to this specification.
  • the program attached as Appendix I will map the sensed omnidirectional image to an ordinary perspective image that is suitable for display on computer 125.
  • the program requires the user to input the name, center location, and radius of the omnidirectional image to be converted.
  • the program also requires the user to input a name for the generated perspective image, as well as a focal length and size for the perspective image.
  • the image for such scene portions is estimated by the appended program based on a suitable average of image signals generated by CCD cells which correspond to neighboring portions of the scene.
  • interpolation programs known to those skilled in the art, such as those that are based on polynomial or temporal matching, may be used without departing from the scope of the invention, as defined by the claims herein.
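Appendix I itself is not reproduced in this excerpt. As a generic illustration of interpolation of the kind described, a bilinear average of the four CCD cells surrounding a non-integer image coordinate might look like the following (a hypothetical helper, not the appended program):

```python
def bilinear(img, x, y):
    """Estimate intensity at non-integer interior coordinates (x, y) by
    weighting the four surrounding cells; img is a row-major 2-D list."""
    x0, y0 = int(x), int(y)
    fx, fy = x - x0, y - y0
    top = img[y0][x0] * (1 - fx) + img[y0][x0 + 1] * fx
    bot = img[y0 + 1][x0] * (1 - fx) + img[y0 + 1][x0 + 1] * fx
    return top * (1 - fy) + bot * fy

grid = [[0, 10],
        [20, 30]]
# The centre of the four cells averages them.
assert bilinear(grid, 0.5, 0.5) == 15.0
```

Such weighting smooths the granularity that pure nearest-cell lookup exhibits when zooming in on small portions of the scene.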
  • a cylindrical-coordinate mapping may also be performed to achieve a panoramic image of the scene being viewed.
  • Cylindrical- coordinate mapping will be described with reference to Fig. 9.
  • a principal ray 950 from a point 945 in a scene strikes a paraboloid-shaped reflector 935 and is orthographically reflected toward an image sensor 910.
  • the orthographically reflected ray 960 impinges the image sensor at sensing element 965.
  • a truncated cylinder 970 is imagined to surround the paraboloid-shaped reflector 935 and image sensor 910.
  • The ray path of sensing element 965 is then traced back through rays 960 and 950, and the point of intersection 955 of the ray 950 with the truncated cylinder 970 is determined. Point 955 is then assigned the light intensity of sensing element 965. This same calculation is performed for each sensing element of the image sensor 910.
  • the resultant collection of points (with appropriately assigned light intensities) located on the truncated cylinder 970 produces a panoramic image of the scene being viewed. This panoramic image can be viewed on a display by further mapping the truncated cylinder to a planar surface. This mapping is easily performed by those skilled in the art and can be visualized by imagining that the cylinder is cut length-wise and flattened out.
  • interpolation of image data as discussed above in relation to Cartesian- coordinate mapping may also be used with cylindrical-coordinate mapping.
  • a V inch CCD is used with a 0.4 inch focal length paraboloid-shaped mirror truncated through its focus and having a 1.6 inch diameter.
  • a collimating lens such as Model No. P32921 from EDMUND SCIENTIFIC of Barrington, New Jersey, is used with an 8.5 inch focal-length imaging lens to optically couple the mirror to the CCD.
  • the omnidirectional imaging apparatus includes an additional substantially paraboloid-shaped reflector 735 as shown in Fig. 7.
  • the additional reflector is positioned to orthographically project an image of an additional hemispherical scene 730 which is complementary to the hemispherical scene 130 so that together they constitute a spherical scene.
  • An additional image sensor 710 is positioned to receive the image orthographically projected by the additional reflector 735.
  • An image signal representative of the orthographic reflection of the additional reflector 735 is converted to a digital signal by converter 720 in the same manner as described above, and is sent to the same general purpose computer 125 via line 725.
  • the reflectors 135 and 735 are positioned back-to-back, share a common axis of rotation 810, which is also the optical axis of the imaging apparatus, and a common focus 805, and are each truncated at a plane p which is substantially perpendicular to the axis of rotation 810 and which includes the focus 805.
  • FIG. 10 there is shown a flow chart 1000 illustrating a method for sensing an image of a substantially hemispherical or spherical scene from a single viewpoint according to an exemplary embodiment of the present invention.
  • Flowchart 1000 shows the necessary steps for sensing the hemispherical scene from a single viewpoint.
  • the method requires orthographically reflecting the substantially hemispherical scene 1010, and sensing the orthographically reflected image 1020.
  • the method may further include the steps of converting the image signal into image signal data 1030, mapping the image data into an appropriate coordinate system 1040, interpolating the image data 1060 to derive approximate values for missing image data, and forming a digital image 1070 from the mapped image data and from the interpolated image data.
  • the steps of specifying a viewing direction, a focal length, and an image size 1045 and zooming in 1050 on a selected portion of the image data may be performed before the interpolation step.
  • the exemplary embodiments described have all utilized a "normal" paraboloid-shaped reflector.
  • the term "normal" in association with a paraboloid-shaped reflector refers to a paraboloid-shaped reflector that is truncated at a plane that passes through the focal point of the paraboloid-shaped reflector and that is substantially perpendicular to the paraboloidal axis of the paraboloid-shaped reflector.
  • the paraboloidal axis of a paraboloid-shaped reflector is the axis passing through the vertex and focal point of the paraboloid-shaped reflector.
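In terms of the surface equation z = (h^2 - r^2)/2h used in this patent, written with the focus at the origin, a "normal" truncation is simply the plane z = 0: the profile crosses that plane at r = h, and the vertex lies at z = h/2 on the paraboloidal axis. A small sketch (the function name is illustrative):

```c
/* Profile of the paraboloid-shaped reflector, z = (h^2 - r^2)/(2h),
 * written with its focus at the origin; h is the paraboloidal constant
 * of equation (1).  Illustrative helper, not from the patent. */
static double paraboloid_z(double h, double r) {
    return (h * h - r * r) / (2.0 * h);
}
```

A normal paraboloid keeps the portion with z >= 0 (r <= h); the extended and tilted variants described below keep different portions of the same surface.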
  • Figs. 11 to 15 show further exemplary embodiments of the omnidirectional imaging apparatus in which the paraboloid-shaped reflector may also take the form of various non-normal paraboloids.
  • Fig. 11 shows an omnidirectional imaging apparatus that is able to image a field of view ("FOV") greater than a hemisphere using only one camera 1111 and one paraboloid-shaped reflector 1135.
  • the paraboloid-shaped reflector 1135 is an extended paraboloid that is obtained by cutting a suitable reflector with a plane that is normal to the axis of the paraboloid (z) but passes below the focal point 1130 of the paraboloid.
  • the paraboloid-shaped reflector is able to orthographically reflect rays from the hemisphere below its focal point.
  • the FOV covered by the paraboloid-shaped reflector is 240 degrees, or 75% of an entire sphere.
  • the camera 1111 and the paraboloid-shaped reflector 1135 are coupled by optics 1112.
  • Fig. 12 shows an omnidirectional imaging apparatus that may be used to image a FOV that is tilted with respect to the paraboloidal axis of the paraboloid-shaped reflector.
  • the embodiment of Fig. 12 includes a camera 1211, optics 1212, and a paraboloid-shaped reflector 1235.
  • the paraboloid-shaped reflector 1235 is truncated at a plane passing through the focus of the paraboloid-shaped reflector 1235 and tilted with respect to its paraboloidal axis (z).
  • the FOV of this reflector is thus a tilted hemisphere, as shown by the dotted lines in Fig. 12.
  • the truncation plane may also pass above the focus 1230 of the paraboloid (thereby resulting in a FOV smaller than a hemisphere), or the truncation plane may pass below the focus 1230 (thereby resulting in a FOV greater than a hemisphere).
  • Fig. 13 shows an omnidirectional imaging apparatus that may be used to image a FOV smaller than a hemisphere.
  • the embodiment of Fig. 13 includes a camera 1311 coupled to a paraboloid-shaped reflector 1335 by optics 1312.
  • the paraboloid-shaped reflector 1335 is formed such that it is "larger" than the imaging area of the camera 1311.
  • a paraboloid-shaped reflector is "larger" than the imaging area of a camera if the base of a normal paraboloid having the same shape as the reflector (i.e., having the same paraboloidal constant h as defined in equation (1)) is larger than the smallest dimension of the imaging area of the camera.
  • an image captured using such a larger paraboloid-shaped mirror has a higher resolution than a corresponding image captured using a smaller paraboloid.
  • the paraboloidal axis of the paraboloid-shaped reflector 1335 (z') may be shifted with respect to the optical axis (z) to obtain fields of view towards the horizon.
  • the paraboloid-shaped reflector 1335 need not be a normal paraboloid, but may be truncated in accordance with the FOV to be imaged.
  • referring to Fig. 14, an embodiment of an omnidirectional imaging apparatus according to the present invention is shown that includes a camera 1411, optics 1412, and a concave paraboloid-shaped reflector 1435.
  • a concave paraboloid-shaped reflector may be used in applications where the concealment of the reflector is desirable (as, for example, in outdoor applications where protection against weather is desirable).
  • the paraboloidal image of the scene is "flipped," but the image continues to satisfy the single viewpoint constraint disclosed previously. Therefore, pure perspective images can be generated from the concave paraboloidal image, just as with the convex paraboloidal image.
  • a hemispherical field of view can be obtained with a single reflector.
  • This hemispherical FOV is obtained by truncating the paraboloid with a plane that passes through the focal point 1430 of the paraboloid (the plane being either normal or tilted with respect to the axis of the paraboloid (z)).
  • although a concave paraboloid that is truncated above its focal point may also be used, such a paraboloid is not desirable because it causes self-obstruction of the image.
  • a FOV greater than a hemisphere may be obtained by using multiple concave paraboloid-shaped reflectors.
  • two paraboloid-shaped reflectors, 2535a and 2535b, are positioned such that they share a common paraboloidal axis (z) and their vertexes 2545 coincide.
  • the two paraboloid-shaped reflectors 2535a and 2535b are able to image two hemispheres 2530a and 2530b, respectively.
  • This system may be used advantageously when the reflectors are required to be recessed for concealment or protection.
  • Fig. 15 shows an embodiment of an omnidirectional imaging system according to the present invention with zoom capabilities.
  • the omnidirectional imaging system of Fig. 15 includes a paraboloid-shaped reflector 1535, a camera 1511, a zoom lens 1512, and relay optics 1513.
  • the omnidirectional imaging system provides an image of the entire hemisphere (or greater or less than a hemisphere if the embodiments of Fig. 11 or Fig. 13 are used).
  • the zoom lens 1512 provides a higher magnification (and, therefore, a higher resolution) of a smaller FOV.
  • the effective center of projection of the zoom lens 1512 must remain approximately fixed to ensure that the imaging system remains telecentric.
  • the relay optics 1513 is used to ensure that the zoom lens 1512 remains telecentric over its entire range of settings.
  • the zoom lens 1512 may be either fixed or mobile with respect to the paraboloid-shaped reflector 1535. If the zoom lens 1512 is fixed, only regions around the paraboloidal axis (z) can be observed under magnification. Preferably, therefore, the zoom lens 1512 is equipped with some movement means, allowing the zoom lens to be positioned over, and to image, regions along the outer edge of the paraboloid-shaped reflector 1535. Of course, such movement means must ensure that the optical axis of the zoom lens 1512 remains parallel to the paraboloidal axis of the paraboloid-shaped reflector 1535 at all times.
  • Fig. 16 shows an omnidirectional imaging system that may be used to produce dynamically changing fields of view of a scene.
  • a paraboloid-shaped reflector 1635 is mounted on a movable base 1640, which allows translation of the paraboloid-shaped reflector 1635 along the x, y and z axes.
  • the movable base 1640 may be controlled either manually or with a computer.
  • a dynamically changing field of view of a scene could be produced, for example, by a circular motion of the movable base 1640 about the optical axis (z).
  • the images are image processed as described previously to obtain perspective or panoramic views.
  • Fig. 16 further shows the use of a zoom lens 1612 in combination with the movable base 1640.
  • a zoom lens 1612 adds the capability to zoom into sections of the paraboloid-shaped reflector 1635 brought under the view of the imaging system by the movement of the movable base 1640.
  • a relay lens 1613 is used to couple the zoom lens and the paraboloid-shaped reflector 1635.
  • the zoom lens 1612 preferably includes a manual or automatic focus control to ensure that the sharpness of images is maintained over all sections of the paraboloid-shaped reflector 1635.
  • translation of the reflector along the z axis may also be used to adjust the focus of an image.
  • Figs. 17A, 17B, and 17C show various exemplary embodiments of such omnidirectional imaging systems.
  • an image sensor 1710 (such as a CCD) is provided with movable means;
  • a camera 1711 is provided with movable means;
  • a camera 1711 and optics 1712 are provided with movable means, for moving together simultaneously.
  • each of these components may be moved along any of the x, y, or z axes to change the field of view being imaged.
  • a zoom lens may be used to magnify areas of interest.
  • the viewpoint of the omnidirectional imaging system remains fixed in space at the focal point of the paraboloid-shaped reflector.
  • the embodiments of Figs. 16, 17A, 17B, or 17C may be used to great advantage in a surveillance system.
  • the omnidirectional imaging capability of these embodiments allows an operator to monitor an entire area of interest at once. When the operator observes a particular region of interest within the area being monitored, the operator may then select appropriate translational coordinates (for movement of the camera, optics, or paraboloid-shaped reflector), and appropriate zoom settings, to view the region of interest in greater detail.
  • Fig. 18 shows an omnidirectional imaging system that utilizes multiple image sensors to achieve increased image resolution.
  • the embodiment of Fig. 18 includes a paraboloid-shaped reflector 1835, video electronics 1809, four CCD elements 1810a - 1810d, and imaging optics 1812.
  • the four CCD elements 1810a - 1810d are placed side-by-side in a non-overlapping arrangement.
  • the embodiment of Fig. 18 takes advantage of the fact that commercial CCD elements are typically manufactured in standard resolutions regardless of their size. Therefore, by using four commercial CCD elements instead of a single commercial CCD element of the same resolution, the resolution of an image may advantageously be quadrupled.
  • Fig. 19 shows another embodiment that utilizes multiple image sensors to increase image resolution.
  • the multiple image sensors are provided by multiple cameras 1911.
  • Beam splitters 1916 are used to direct separate sections of a paraboloidal image to different cameras.
  • each portion of the paraboloidal image is imaged with a higher resolution than if the entire image were imaged by one camera alone.
  • dichroic beam splitters may be used to split an image into a plurality of monochromatic images, which may be sensed by a plurality of monochromatic image detectors. These monochromatic images may later be suitably combined into a full-color image by image processing means well-known in the art.
  • Fig. 20 shows a planar image sensor 2010, as, for example, a CCD element.
  • when a typical planar image sensor is used with a paraboloid-shaped mirror, the effective resolution of a captured paraboloidal image is greater towards the outer edge of the image than at its center.
  • the resolution of the captured image increases by a factor of four from the center of the image to its fringe.
  • an image sensor has sensing elements 2008 whose sizes and placements are varied to result in a uniform resolution over the entire image. This same approach may also be used to increase resolution in selected parts of the FOV.
  • Fig. 21 shows a preferred embodiment in which an omnidirectional imaging system includes a paraboloid-shaped reflector 2135, a planar mirror 2116, a relay lens 2113, an imaging lens 2112, and a camera 2111.
  • the paraboloid-shaped reflector 2135 is positioned above a surface 2140, and the planar mirror 2116, relay lens 2113, imaging lens 2112, and camera 2111 are concealed below the surface 2140.
  • the planar mirror 2116 is positioned beneath an opening 2145 in the surface 2140 and folds the image from the paraboloid-shaped reflector 90 degrees, thereby redirecting the image to the relay lens, imaging lens, and camera.
  • although the planar mirror is shown between the paraboloid-shaped reflector and the relay lens, the planar mirror may also be placed between the relay lens and the imaging lens or between the imaging lens and the camera, as those skilled in the art will appreciate.
  • Fig. 22 shows an embodiment of an omnidirectional imaging system in which the optics between a paraboloid-shaped mirror 2235 and an image sensor 2210 comprise a low-power, inverted microscope objective 2212.
  • the reflector 2235 is at the position normally occupied by the eyepiece of the microscope and the image sensor 2210 is at the position normally occupied by a slide.
  • the use of an inverted microscope objective is advantageous for imaging since commercial microscope objectives are well corrected for aberrations.
  • Fig. 23 shows an embodiment of an omnidirectional imaging system in which a collimator lens 2313 is placed between a paraboloid-shaped mirror 2335 and imaging optics 2312. It is desirable in many cases to use commercially available imaging lenses to save the cost and time of designing special lenses.
  • Figs. 24A and 24B illustrate the use of field-flattening lenses between an image sensor 2410 and an imaging lens 2413.
  • Field-flattening means are desirable because the paraboloid-shaped reflector of the present invention, which typically has a small focal length of a few millimeters, is afflicted with very strong field curvature.
  • One method of eliminating this imaging defect is to use an image sensor with a curved surface that matches the field curvature. More preferably, however, a special lens, called a field-flattening lens, may be introduced which has a curvature of opposite sign to that of the reflector. Therefore, the two field curvatures cancel, and the resultant image surface is flat, allowing the entire image to be in sharp focus on a planar image sensor.
  • Two types of preferred field-flattening lenses are illustrated in Figs. 24A and 24B.
  • a plano-concave lens 2412a is shown.
  • the plano-concave lens 2412a is placed as close as possible to the image sensor 2410.
  • the plano-concave lens 2412a is placed in contact with the image sensor window 2417. In this position, the plano-concave lens 2412a compensates for the field curvature of the reflector while introducing only small amounts of undesirable aberrations.
  • A second type of preferred field-flattening lens, a meniscus lens 2412b, is shown in Fig. 24B.
  • Both of the surfaces of the meniscus lens 2412b are aplanatic to the incoming light. If a surface is aplanatic, it introduces no spherical aberration, coma or astigmatism into the beam of light; it only introduces field curvature.
  • the meniscus lens 2412b has a marked field flattening effect which is determined by the thickness of the lens: the thicker the lens, the greater the field flattening effect.
  • the meniscus lens 2412b of Fig. 24B is not used in contact with the image sensor 2410.
  • ideally, the surface of best focus of an optical system is a plane.
  • in that case, a CCD or other type of flat image sensor can match the surface of best focus over its entire area, thereby providing maximum resolution for the image.
  • in practice, however, an optical system has a tendency to form its best images on a curved surface. Accordingly, the curved focal surface and the flat CCD surface cannot be matched up over their entire area, and some or all of the image will not be in best focus.
  • the field curvature of an optical system is called its Petzval curvature. Every optical element in an optical system contributes to the Petzval curvature for the system. If a surface of an optical element is refracting, its Petzval contribution to the curvature of the system is (n - 1)/(nR), where n is the refractive index of the optical element and R is the radius of curvature of the surface of the optical element.
  • the field curvature of an image is calculated by taking the sum of the contributions of all of the reflecting and refracting surfaces and multiplying the sum by a simple constant. If this value is not zero, then the field of the image is curved and the problem discussed above will be encountered (i.e., the surface of the image and the surface of the image sensor will not be completely matched).
  • the first method for changing the Petzval curvature depends on the optical characteristics of an optical surface located at the surface of an image. If an optical surface is located at the surface of an image (either an intermediate image or the final image of the optical system), then this surface will not change the spherical aberration, coma or astigmatism of the image. The only change will be to the Petzval curvature.
  • the Petzval curvature of a system can be corrected by inserting a surface with an appropriate radius of curvature at the final focus of the system. This is the basis for the plano-concave field-flattening lens described above.
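The per-surface bookkeeping described above can be sketched in C. The formula (n - 1)/(nR) used here is the standard Petzval contribution for a refracting surface between air and glass of index n, matching the quantities n and R defined above; the function names are illustrative, not from the patent.

```c
/* Illustrative sketch: standard Petzval contribution of an air-glass
 * refracting surface of radius R bounding glass of index n.  The field
 * curvature of the system is proportional to the sum of such terms. */
static double petzval_term(double n, double R) {
    return (n - 1.0) / (n * R);
}

/* Running sum over the surfaces of a system. */
static double petzval_sum(const double *n, const double *R, int count) {
    double sum = 0.0;
    for (int i = 0; i < count; i++)
        sum += petzval_term(n[i], R[i]);
    return sum;
}
```

A field-flattening surface is then chosen with a radius of curvature whose term cancels the rest of the sum, driving the total toward zero and flattening the image surface.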
  • the second method for changing the Petzval curvature depends on the optical characteristics of aplanatic surfaces. Assume there is an aplanatic surface, which is defined as follows: Let s be the object distance for the surface and s' be the image distance. Also, let n and n' be the refractive indices of the materials before and after the surface, respectively (where n = 1 for air and n > 1 for glass). If s and s' are related by s = R(n + n')/n and s' = R(n + n')/n', where R is the radius of curvature of the surface, then
  • the surface will introduce no spherical aberration or coma and only very small amounts of astigmatism. If now a thick lens is introduced, both of whose surfaces satisfy this condition, then the difference in their radii will depend on the thickness of the lens. This fact can again be used to control the Petzval curvature of the system by adjusting the thickness of the aplanatic lens. This is the basis of the thick, meniscus field-flattening lens discussed above.
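The aplanatic conjugates can be verified numerically against the Gaussian single-surface imaging equation n'/s' - n/s = (n' - n)/R. The sketch below is an illustrative check (identifiers are not from the patent), using the textbook aplanatic relations s = R(n + n')/n and s' = R(n + n')/n'.

```c
#include <math.h>

/* Illustrative sketch: aplanatic conjugates of a single refracting
 * surface of radius R between media of index n (before) and n2 (after). */
static double aplanatic_object(double R, double n, double n2) {
    return R * (n + n2) / n;
}
static double aplanatic_image(double R, double n, double n2) {
    return R * (n + n2) / n2;
}

/* Check that the conjugates satisfy the Gaussian single-surface
 * imaging equation  n2/s2 - n/s = (n2 - n)/R. */
static int satisfies_imaging_eq(double R, double n, double n2) {
    double s = aplanatic_object(R, n, n2);
    double s2 = aplanatic_image(R, n, n2);
    return fabs(n2 / s2 - n / s - (n2 - n) / R) < 1e-12;
}
```

Because the difference of the two radii of such a thick lens depends on its thickness, adjusting the thickness adjusts the Petzval contribution, which is the principle behind the meniscus field-flattening lens described above.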
  • the plano-concave lens 2412a of Fig. 24A is composed of BK7 and has a refractive index (n) of 1.517.
  • the radius of the curved (concave) surface r is 6.2 mm.
  • the surface opposite the curved surface r is flat and is placed in contact with the image detector window 2417.
  • the axial thickness of the lens is 1.5 mm, and the optical diameter is 3 mm.
  • the aplanatic lens is composed of acrylic plastic and has a refractive index (n) of 1.494.
  • the radius of the curved (convex) surface r2 is 4.78 mm, and the radius of the curved (concave) surface r3 is 2J 6 mm.
  • the axial thickness of the lens is 6.7 mm.
  • the optical diameter of the curved surface r2 is 7 mm, and the optical diameter of the curved surface r3 is 2.7 mm.
    double sqrt(), atan(), sin(), cos(), acos();
    unsigned char *r, *g, *b;
    unsigned char *red;
    unsigned char *green;
    unsigned char *blue;
    int xsize, ysize;
    int xosize, yosize;
    int i, j, x0, y0, x1, y1;
    double theta, phi;
    double ox, oy, oz;
    double px, py, pz;
    double qx, qy, qz;
    double tempx, tempy, tempz;
    double sx, sy, sz;
    double rad, mag;
    double xs, ys, zs;
    double dispx, dispy;
    int xcent, ycent, xnew, ynew, xpix, ypix, xpoint, ypoint;
    int xpixel, ypixel, indexx, indexy;

    xosize = xpix;
    yosize = ypix;

    px = tempx/sqrt(tempx*tempx + tempy*tempy + tempz*tempz);
    py = tempy/sqrt(tempx*tempx + tempy*tempy + tempz*tempz);
    pz = tempz/sqrt(tempx*tempx + tempy*tempy + tempz*tempz);

    tempx = py*oz - pz*oy;
    tempy = pz*ox - px*oz;
    tempz = px*oy - py*ox;

    qx = tempx/sqrt(tempx*tempx + tempy*tempy + tempz*tempz);
    qy = tempy/sqrt(tempx*tempx + tempy*tempy + tempz*tempz);
    qz = tempz/sqrt(tempx*tempx + tempy*tempy + tempz*tempz);

    mag = sqrt(sx*sx + sy*sy + sz*sz);
    sx = sx/mag;
    sy = sy/mag;
    sz = sz/mag;

    rad = 2*radius*(1 - cos(theta))/(1 - cos(2*theta));

Abstract

The invention disclosed herein is an omnidirectional imaging apparatus for sensing an image of a scene (130) from a single viewpoint. The omnidirectional imaging apparatus includes a truncated, substantially paraboloid-shaped reflector (135) positioned to orthographically reflect principal rays of electromagnetic radiation radiating from said scene, said paraboloid-shaped reflector having a focus coincident with said single viewpoint of said omnidirectional imaging apparatus, including said paraboloid-shaped reflector. The omnidirectional imaging apparatus also includes telecentric means (112, 113), optically coupled to said paraboloid-shaped reflector, for substantially filtering out principal rays of electromagnetic radiation which are not orthographically reflected by said paraboloid-shaped reflector. The omnidirectional imaging apparatus also includes one or more image sensors (111) positioned to receive said orthographically reflected principal rays of electromagnetic radiation from said paraboloid-shaped reflector, thereby sensing said image of said scene.

Description

AN OMNIDIRECTIONAL IMAGING APPARATUS
SPECIFICATION
BACKGROUND OF THE INVENTION
1. Field of the Invention
This invention relates to omnidirectional image sensing with reference to a single viewpoint, and, more particularly, to such image sensing using a truncated, substantially paraboloid-shaped reflector.
2. Discussion of the State of the Art
For many applications such as surveillance, teleconferencing, remote sensing, photogrammetry, model acquisition, virtual reality, computer graphics, machine vision and robotics, it is desirable that an imaging system have a large field of view so as to be able to take in as much information as possible about the world around it. Traditional imaging systems include a camera with a lens that provides a perspective projection of an image. However, a camera with even a very wide angle lens has only a limited field of view (i.e., covering less than a full hemisphere). This limited field of view may be expanded by tilting and panning the entire imaging system about its center of projection. One such system is described in S.E. Chen, "QuickTime VR - An Image-Based Approach to Virtual Environment Navigation", Proc. of SIGGRAPH 95, (8):29-38, August (1995). The article by L. McMillan and G. Bishop, "Plenoptic Modeling: An Image-Based Rendering System", Computer Graphics: Proc. of SIGGRAPH, August 1995, pp. 39-46, also describes a traditional pan-and-tilt system. This type of system has two serious drawbacks, however, one being the obvious disadvantages associated with a device having critical moving parts, and the second being the significant amount of time required to make a full rotation in order to view the surrounding world. This time limitation makes such a device unsuitable for real-time applications.
Another approach to increasing the field of view in an imaging system is by employing a so called "fish eye" lens as is disclosed in E.L. Hall et al., "Omnidirectional Viewing Using a Fish Eye Lens", SPIE Vol. 728 Optics, Illumination, and Image Sensing for Machine Vision (1986), p. 250. Since the fish eye lens has a very short focal length, the field of view may be as large as a hemisphere. The use of such lenses in an imaging system is problematic, however, in that they are significantly larger and more complex than conventional lenses. Moreover, it has been difficult to develop a fish eye lens with a fixed viewpoint for all points of the relevant scene. U.S. Patent No. 5,187,667 to Zimmerman, and U.S. Patent No. 5,359,363 to Kuban et al. are also directed to the use of fish eye lenses to replace conventional pan and tilt mechanisms, and accordingly suffer from the same disadvantages.
Other prior art devices have used reflecting surfaces to increase the field of view. One such prior art device is disclosed in V.S. Nalwa, "A True Omni-Directional Viewer", AT&T Bell Laboratories Technical Memorandum, BL0115500-960115-01, Jan. 1996. Nalwa discloses the use of multiple planar reflecting surfaces in conjunction with multiple charge coupled device ("CCD") cameras to obtain a 360 degree panoramic image of a 50 degree band of a hemispherical scene. Specifically, in Nalwa, four planar mirrors are arranged in the shape of a pyramid, with one camera being positioned above each of the four planar reflecting sides, and with each camera viewing slightly more than 90 degrees by 50 degrees of the hemispherical scene. This system suffers from the serious drawback of requiring multiple sensors to capture a 360-degree image. In addition, this system suffers from the inherent problems associated with distortion at the "seams" when the separate images are combined to provide a full 360 degree view.
Curved reflective surfaces have also been used in conjunction with image sensors. Both Yagi et al., "Evaluating Effectivity of Map Generation by Tracking Vertical Edges in Omnidirectional Image Sequence", IEEE International Conference on Robotics and Automation, June 1995, p. 2334, and Yagi et al., "Map-Based Navigation for a Mobile Robot With Omnidirectional Image Sensor COPIS", IEEE Transactions on Robotics and Automation, Vol. 11, No. 5, Oct. 1995, disclose a conical projection image sensor (COPIS) which uses a conical reflecting surface to gather images from the surrounding environment, and processes the information to guide the navigation of a mobile robot. Although the COPIS is able to attain viewing in 360 degrees, it is not a true omnidirectional image sensor because the field of view is limited by the vertex angle of the conical mirror and by the viewing angle of the camera lens. Furthermore, the COPIS does not have a single viewpoint, but instead has a locus of viewpoints lying on a circle. This locus of multiple viewpoints causes distortion in the captured images, which cannot be eliminated to obtain pure perspective images.
Yamazawa et al., "Obstacle Detection With Omnidirectional Image Sensor HyperOmni Vision", IEEE International Conference on Robotics and Automation, Oct. 1995, p. 1062, discloses a purported improvement in the COPIS system which involves the use of a hyperboloidal reflecting surface in place of a conical surface. As discussed therein, the rays of light which are reflected off the hyperboloidal surface, no matter where the point of origin, will all converge at a single point, thus enabling perspective viewing.
Although the use of a hyperboloidal mirror is advantageous in that it enables full perspective image sensing, because the rays of light which make up the reflected image converge at the focal point of the reflector, positioning of the sensor relative to the reflecting surface is critical, and any disturbance will impair the image quality. Further, the use of a perspective-projection model inherently requires that as the distance between the sensor and the mirror increases, the cross-section of the mirror must increase. Therefore, practical considerations dictate that in order to keep the mirror at a reasonable size, the mirror must be placed close to the sensor. This in turn causes complications to arise with respect to the design of the image sensor optics. In addition, mapping sensed image to usable coordinates requires complex calibration due to the nature of the converging image. A further drawback is that the relative positions of the mirror and the optics cannot be changed while maintaining a single viewpoint. Thus, a hyperboloidal mirror system cannot take advantage of the relative movement of the mirror and optics to adjust the field of view of the system, while maintaining a single viewpoint.
Prior to Yamazawa et al., U.S. Patent No. 3,505,465 to Donald Rees also disclosed the use of a hyperboloidal reflecting surface. Accordingly, the Rees disclosure also suffers from the same drawbacks as that of Yamazawa et al.
The above-described prior art devices fail in one of two ways. They either fail to provide a truly omnidirectional imaging apparatus that is capable of sensing a scene from a single viewpoint, making it impossible to provide distortion-free images with the apparatus, or they provide apparatus that require complex calibration and implementation.
SUMMARY OF THE INVENTION
The drawbacks of the prior art, as discussed above, are substantially solved by the present invention, which in one aspect is an omnidirectional imaging apparatus for sensing an image of a scene from a single viewpoint that includes a truncated, substantially paraboloid-shaped reflector positioned to orthographically reflect principal rays of electromagnetic radiation radiating from the scene. The paraboloid-shaped reflector has a focus coincident with the single viewpoint of the omnidirectional imaging apparatus. The omnidirectional imaging apparatus also includes telecentric means, optically coupled to the paraboloid-shaped reflector, for substantially filtering out principal rays of electromagnetic radiation which are not orthographically reflected by the paraboloid-shaped reflector. The omnidirectional imaging apparatus further includes one or more image sensors positioned to receive the orthographically reflected principal rays of electromagnetic radiation from the paraboloid-shaped reflector, thereby sensing the image of the scene.
The paraboloid-shaped reflector of the present invention may be either convex or concave. The telecentric means may include a telecentric lens, a telecentric aperture, or a collimating lens.
Preferably, the paraboloid-shaped reflector comprises a substantially paraboloidal mirror having a surface which substantially obeys the equation expressed in cylindrical coordinates:
z = (h^2 - r^2) / 2h
z being an axis of rotation of the surface, r being a radial coordinate, and h being a constant. As the equation represents a symmetrical surface of rotation, the shape of the surface is not a function of the angular coordinate φ.
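The surface can be checked numerically. The sketch below (plain Python, written for this description with an arbitrary example value of h) evaluates the equation above and confirms two of its properties: the vertex sits at z = h/2, and the surface meets the z = 0 plane, which contains the focus, at radius r = h.

```python
def paraboloid_z(r, h):
    """Height of the paraboloid-shaped surface z = (h^2 - r^2) / (2h),
    in cylindrical coordinates with the focus at the origin."""
    return (h * h - r * r) / (2 * h)

h = 2.0                      # arbitrary example value of the constant h
print(paraboloid_z(0.0, h))  # vertex: z = h/2
print(paraboloid_z(h, h))    # truncation plane through the focus: z = 0
```

Because the surface is a surface of rotation, a single radial profile like this describes the entire reflector.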
In a preferred embodiment of the invention, the one or more image sensors comprise one or more video cameras. These video cameras may employ one or more charge-coupled devices or one or more charge injection devices. Alternatively, the one or more image sensors may comprise photographic film. In another preferred embodiment, at least one image sensor has a non-uniform resolution to compensate for the non-uniform resolution of the image reflected from the paraboloid-shaped reflector.
Preferably, the paraboloid-shaped reflector comprises a mirror truncated at a plane which includes the focus of the paraboloid-shaped reflector and which is perpendicular to the axis passing through the focus and the vertex of the paraboloid-shaped reflector.
In an exemplary embodiment, the paraboloid-shaped reflector is mounted on a fixed base and the one or more image sensors are mounted on a movable base, whereby movement of the one or more image sensors produces a changing field of view. Alternatively, the paraboloid-shaped reflector may be mounted on a movable base and the one or more image sensors may be mounted on a fixed base, whereby movement of the paraboloid-shaped reflector produces a changing field of view. In each of these embodiments, it is also preferred that a zoom lens be provided for optically coupling the one or more image sensors and the paraboloid-shaped reflector. In a further exemplary embodiment, the one or more image sensors provide an image signal representative of the image of the scene. An image signal processing apparatus is coupled to the one or more image sensors, which converts the image signal from the image sensors into image signal data. The image signal processing apparatus then maps the image signal data into a Cartesian-coordinate system to produce a perspective image or into a cylindrical-coordinate system to produce a panoramic image. The image signal processing may include interpolation means for providing interpolated image data, whereby the interpolated image data and the image signal data are combined to form a digital image. Advantageously, the image processing apparatus may further include means for zooming in on a preselected portion of the digital image to thereby provide an enlarged image of the preselected portion from a predetermined focal distance.
In a preferred arrangement, the omnidirectional imaging apparatus comprises at least one lens optically coupling the one or more image sensors and the paraboloid-shaped reflector. This coupling lens may be a zoom lens, a microscope objective, or a field-flattening lens. Advantageously, the field-flattening lens has a field curvature approximately opposite to the field curvature of the paraboloid-shaped reflector. Preferably, the field-flattening lens is either a plano-concave lens or an aplanatic meniscus lens.
In yet another preferred arrangement, the omnidirectional imaging apparatus is used to image a substantially spherical scene by using two truncated, substantially paraboloid-shaped reflectors positioned to orthographically reflect principal rays of electromagnetic radiation radiating from two complementary hemispherical scenes. The two paraboloid-shaped mirrors are positioned to share a common paraboloidal axis. In addition, when the two paraboloid-shaped reflectors are convex, they are positioned back-to-back along their planes of truncation, such that they share a common focus point. When the two paraboloid-shaped reflectors are concave, they are positioned such that their vertexes coincide.
In a further exemplary embodiment of the present invention, a plurality of beam splitters are provided for splitting the orthographically reflected principal rays of electromagnetic radiation from the paraboloid-shaped reflector into a plurality of ray bundles. In this embodiment, a plurality of image sensors is required, with each image sensor positioned to receive at least one of the plurality of ray bundles, and thereby sensing a portion of the image of the scene.
In yet a further exemplary embodiment, a plurality of dichroic beam splitters is provided for splitting the orthographically reflected principal rays of electromagnetic radiation from the paraboloid-shaped reflector into a plurality of monochromatic principal rays of electromagnetic radiation. As in the previous embodiment, a plurality of image sensors is required, with each image sensor positioned to receive at least one of the plurality of monochromatic principal rays of electromagnetic radiation, and thereby sensing at least one monochromatic image of the scene.
In accordance with the pioneering nature of the present invention, a method for sensing an image of a scene from a single viewpoint is also provided. In an exemplary embodiment, the method includes the steps of: (a) orthographically reflecting principal rays of electromagnetic radiation radiating from the scene on a truncated, substantially paraboloid-shaped reflector such that the single viewpoint of the omnidirectional imaging method coincides with a focus point of the paraboloid-shaped reflector;
(b) telecentrically filtering out a substantial portion of any principal rays of electromagnetic radiation which are not orthographically reflected by the paraboloid-shaped reflector; and
(c) sensing the image of the scene by sensing the orthographically reflected principal rays of electromagnetic radiation from the paraboloid-shaped reflector with one or more image sensors. In a further exemplary embodiment, a method for omnidirectionally sensing images of a scene from a single viewpoint is provided, which includes:
(a) mounting a truncated, substantially paraboloid-shaped reflector on a fixed base;
(b) mounting one or more image sensors on a movable base; (c) orthographically reflecting principal rays of electromagnetic radiation radiating from the scene on the substantially paraboloid-shaped reflector such that the single viewpoint of the omnidirectional imaging method coincides with a focus point of the paraboloid-shaped reflector;
(d) telecentrically filtering out a substantial portion of any principal rays of electromagnetic radiation which are not orthographically reflected from the paraboloid-shaped reflector;
(e) moving the movable base to a first position;
(f) sensing a first image of the scene having a first field of view by sensing the orthographically reflected principal rays of electromagnetic radiation from the paraboloid-shaped reflector with the one or more image sensors;
(g) moving the movable base to a second position different from the first position; and
(h) sensing a second image of the scene having a second field of view by sensing the orthographically reflected principal rays of electromagnetic radiation from the paraboloidal-shaped reflector with the one or more image sensors. Alternatively, instead of mounting the paraboloid-shaped reflector on a fixed base and mounting the image sensors on a movable base, the paraboloid-shaped reflector may be mounted on a movable base and the image sensors may be mounted on a fixed base. Preferably, the above-described method also includes the step of optically coupling the paraboloid-shaped reflector and the image sensors with a zoom lens, which may be used to magnify an area of interest in the scene.
BRIEF DESCRIPTION OF THE DRAWINGS
Exemplary embodiments of the present invention will now be described in detail with reference to the accompanying drawings, in which: Fig. 1A is a side view of an exemplary embodiment of an omnidirectional imaging apparatus;
Fig. 1B is a side view of an alternate embodiment in which a paraboloid-shaped reflector is connected to an image sensor by a transparent support; Fig. 2 is an isometric view of a paraboloid-shaped reflector mounted on a base plate;
Fig. 3 is a partially isometric view of a paraboloid-shaped reflector mapped into a cylindrical coordinate system; Fig. 4 is a geometric representation of an orthographic reflection from a curved reflecting surface;
Fig. 5 is an illustration of orthographic reflection from a substantially paraboloid-shaped reflector to an image sensor;
Fig. 6 illustrates how any selected portion of a hemispherical scene can be viewed from the single viewpoint using a paraboloid-shaped reflector;
Fig. 7 is a side view of an omnidirectional imaging apparatus with two back-to-back substantially paraboloid-shaped reflectors and two image sensors;
Fig. 8 is a cross-sectional view of two substantially paraboloid-shaped reflectors positioned back-to-back and having a common paraboloidal axis and a common focus;
Fig. 9 illustrates the mapping of image data to cylindrical coordinates to enable production of a panoramic view;
Fig. 10 is a flowchart of an exemplary embodiment of a method for sensing and processing an image of a substantially hemispherical scene from a single viewpoint;
Fig. 11 is a side view of an embodiment of an omnidirectional imaging apparatus according to the present invention, which includes an extended paraboloid-shaped reflector;
Fig. 12 is a side view of an embodiment of an omnidirectional imaging apparatus according to the present invention, which includes a paraboloid-shaped reflector truncated at a plane that is tilted with respect to the paraboloidal axis of the reflector;
Fig. 13 is a side view of an embodiment of an omnidirectional imaging apparatus according to the present invention, which includes a paraboloid-shaped reflector that is larger than the imaging area of the image sensor; Fig. 14 is a side view of an embodiment of an omnidirectional imaging apparatus according to the present invention, which includes a concave paraboloid-shaped reflector;
Fig. 15 is a side view of an embodiment of an omnidirectional imaging apparatus according to the present invention, which includes a zoom lens optically coupling a paraboloid-shaped reflector and an image sensor;
Fig. 16 is a partially isometric view of an embodiment of an omnidirectional imaging apparatus according to the present invention, which includes a paraboloid-shaped reflector mounted on a movable base; Fig. 17A is a side view of an embodiment of an omnidirectional imaging apparatus according to the present invention, which includes an image sensor mounted on a movable base;
Fig. 17B is a side view of an embodiment of an omnidirectional imaging apparatus according to the present invention, which includes a movable camera; Fig. 17C is a side view of an embodiment of an omnidirectional imaging apparatus according to the present invention, which includes a movable camera and optics;
Fig. 18 is a partially isometric view of an embodiment of an omnidirectional imaging apparatus according to the present invention, which includes an image sensor comprising four charge coupled devices positioned side-by-side;
Fig. 19 is a side view of an embodiment of an omnidirectional imaging apparatus according to the present invention, which includes multiple image sensors and beam splitters;
Fig. 20 is a top view of an image sensor according to an embodiment of the present invention whose sensing elements are non-uniformly distributed and sized; Fig. 21 is a side view of an embodiment of an omnidirectional imaging apparatus according to the present invention, which includes a planar mirror that optically couples a paraboloid-shaped reflector and an image sensor;
Fig. 22 is a side view of an embodiment of an omnidirectional imaging apparatus according to the present invention, which includes a microscope objective that optically couples a paraboloid-shaped reflector and an image sensor;
Fig. 23 is a side view of an embodiment of an omnidirectional imaging apparatus according to the present invention, which includes a collimator lens that optically couples a paraboloid-shaped reflector and an imaging lens; Fig. 24A is a side view of an embodiment of an omnidirectional imaging apparatus according to the present invention, which includes a field-flattening plano-concave lens;
Fig. 24B is a side view of an embodiment of an omnidirectional imaging apparatus according to the present invention, which includes a field-flattening meniscus lens with aplanatic sides; and
Fig. 25 is a side view of an embodiment of an omnidirectional imaging apparatus according to the present invention, which includes two concave paraboloid-shaped mirrors used to image a substantially spherical view.
DETAILED DESCRIPTION
Fig. 1A illustrates an omnidirectional imaging apparatus 100 according to an exemplary embodiment of the present invention. A convex paraboloid-shaped reflector 135, which is mounted on a base plate 140, is positioned to orthographically reflect an image of a substantially hemispherical scene 130. An image sensor 110, such as a commercially available Sony 3CCD color video camera device 111 having a telecentric lens or a magnifying lens 112 and a telecentric aperture 113, is positioned to receive the orthographic reflection of the image. The telecentric lens or aperture functions to filter out all rays of light which are not perpendicular to the plane of the lens or aperture, i.e., background light which does not form part of the orthographic reflection of the hemispherical scene. Although the description herein is with regard to visible light, the present invention has equal application to other forms of electromagnetic radiation, such as ultraviolet light or infrared light.
In an alternate exemplary embodiment of the imaging apparatus 100 according to the invention shown in Fig. 1B, the paraboloid-shaped reflector may be coupled to the image sensor by a transparent support 136, such as a length of clear tubing.
Referring again to Fig. 1A, the video camera 110 generates an analog video signal representative of the orthographically reflected image, which is sent through cable 150. The video signal is converted to a digital signal by digitizer 120, which is a commercially available NTSC video signal analog-to-digital converter.
The digital signal is then sent through a cable 155 to a general purpose computer 125, such as a DEC Alpha 3000/600 workstation. As will be explained in further detail, the computer 125 is programmed to allow the user to view any desired portion of the hemispherical scene, to zoom in on a selected portion of the scene, or to pan the scene in any desired manner.
The image sensor 110 may simply be a still or motion picture photographic camera using conventional photographic film. The image sensor 110 may also be a camcorder or video camera 116 which provides a digital video signal output, which can be provided directly to the computer 125 without the need for the analog-to-digital converter 120.
Fig. 2 shows an isometric view of the paraboloid-shaped reflector 135, which extends from base 140 from which it is formed. The reflector 135 may comprise a paraboloid-shaped plastic body coated with a thin layer 145 of highly reflective metal, such as aluminum or silver. Alternatively, the reflector 135 may comprise a paraboloid-shaped, polished metal body. For this latter embodiment, a metal such as stainless steel may be used.
Fig. 3 illustrates in greater detail the preferred geometry of the paraboloid-shaped reflector 135, as well as the orthographic reflection of the image of the substantially hemispherical scene 130 onto the image sensor 110. The reflector 135 of Fig. 3 is defined in cylindrical coordinates, r, φ and z, as substantially obeying the equation
z = (h² − r²) / 2h,  (1)
where z is the axis of rotation, r is the radial coordinate, and h is a constant. The z axis coincides with the optical axis of the imaging arrangement, and a focus point 315 of the paraboloid defined by equation (1) coincides with the origin of the coordinate system. The reflector 135 of Fig. 3 is truncated at a plane p which is substantially perpendicular to the z axis 310 and which includes the focus point 315 of its paraboloidal surface.
All incoming rays 305 that would otherwise pass through the focus point 315 are orthographically reflected towards the image sensor 110 by the reflecting paraboloidal surface. Thus, the focus point 315 is coincident with the single viewpoint from which the substantially hemispherical scene 130 is viewed. The image sensor 110 is positioned along the optical axis 310 of the imaging system and the photosensitive surface thereof is perpendicular to the optical axis. The use of orthographic reflection to enable viewing of a substantially hemispherical scene from a single viewpoint is an advantageous feature of the present invention. That orthographic reflection enables viewing from a single viewpoint can be demonstrated by reference to Fig. 4. In Fig. 4, z and r are perpendicular cylindrical coordinates for a given value of φ, the angular coordinate. The angle of an incoming ray 405 relative to the r axis is θ. The incoming ray 405 is orthographically reflected by the reflecting surface 415 as an outgoing ray 410. To have a single viewpoint 420, any incoming ray must satisfy
tan θ = z / r,  (2)
and for orthographic reflection, all rays must be reflected at an angle
α = π/2,  (3)
where α is the angle between the outgoing ray 410 and the r axis. For these two constraints to be satisfied, and for the angle of incidence to equal the angle of reflection, it is clear that the angle, β, between the reflected ray 410 and the normal direction of the surface at the point of reflection, n̂, must equal
β = (α − θ) / 2, or β = (π − 2θ) / 4,  (4)
which can also be expressed as
(tan α − tan θ) / (1 + tan α tan θ) = 2 tan β / (1 − tan²β).  (5)
Finally, the slope of the reflecting surface 415 in the z-r plane at the point of reflection is
dz/dr = −tan β.  (6)
Substituting (6) and (4) into (5) yields the quadratic
(dz/dr)² − 2(z/r)(dz/dr) − 1 = 0.  (7)
The quadratic has two solutions,
dz/dr = (z ± √(z² + r²)) / r,  (8)
but to avoid self occlusion by the reflecting surface, the slope of the curve in the right quadrant is made negative (i.e., the surface is convex). The result is
dz/dr = (z − √(z² + r²)) / r.
If a = z/r, the above expression reduces to
a + √(1 + a²) = h / r,  (9)
where h is a constant of integration. Substituting z = ra into equation (9) yields equation (1).
Thus, there exists a curve, which when rotated about the z axis, generates a surface that will allow orthographic reflection of a substantially hemispherical scene from a single viewpoint. This curve is the parabola defined by equation (1), which has a single viewpoint that is coincident with the focus 420 of the parabola.
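This property can also be verified numerically. The sketch below (Python, not part of the original disclosure; the value of h is an arbitrary example) reflects a ray aimed at the focus off the surface of equation (1), using the surface normal implied by the slope dz/dr, and confirms that the outgoing ray is parallel to the z axis regardless of the radius at which the ray strikes.

```python
import math

H = 2.0  # arbitrary example value of the paraboloidal constant h

def reflect_at(r):
    """Reflect a ray headed for the focus (the origin) off the surface
    z = (H^2 - r^2)/(2H) at radius r; return the outgoing unit direction
    in the (r, z) plane."""
    z = (H * H - r * r) / (2 * H)
    # incident unit direction: from the surface point toward the focus
    length = math.hypot(r, z)
    v = (-r / length, -z / length)
    # outward unit normal, from the gradient of z - (H^2 - r^2)/(2H)
    n = (r / H, 1.0)
    n_len = math.hypot(*n)
    n = (n[0] / n_len, n[1] / n_len)
    # mirror reflection: v' = v - 2 (v . n) n
    d = v[0] * n[0] + v[1] * n[1]
    return (v[0] - 2 * d * n[0], v[1] - 2 * d * n[1])

for r in (0.3, 1.0, 1.9):
    vr, vz = reflect_at(r)
    print(round(vr, 9), round(vz, 9))  # radial component vanishes: orthographic
```

At every radius the outgoing direction is along the z axis, which is exactly the orthographic-reflection condition α = π/2 of equation (3).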
In addition to providing viewing of a substantially hemispherical scene from a single viewpoint, the omnidirectional imaging apparatus in accordance with the present invention enables viewing of any portion of the scene, enables zooming in on a selected portion, and enables panning of the scene, all with respect to the single viewpoint and without requiring image reconstruction or complex frame transformation. Fig. 5 illustrates how a portion of the substantially hemispherical scene is viewed by the image sensor from a single viewpoint. Fig. 5 also illustrates how a truncated convex substantially paraboloid-shaped reflector 135 is mapped into a Cartesian coordinate system. The optical axis 502 of the imaging arrangement is coincident with the z axis, and the focus 501 of the substantially paraboloid-shaped reflector 135 is located at the origin. Incoming rays 505, 510 from a portion of the scene 300 being viewed intersect the reflecting surface at points 515 and 520, which can be defined by their respective x and y coordinates. Points 515 and 520 lie along imaginary radial lines 516 and 521, respectively, which originate at the viewpoint of the scene, i.e., the focus 501 of the paraboloid-shaped reflector. Since these rays are orthographically reflected toward the image sensor 110, which has a planar light-sensitive surface perpendicular to the z-axis, the projected rays will intersect the light-sensitive surface at the same respective x and y coordinates. Only the z coordinate will change. Accordingly, there is a one-to-one correspondence between the x-y coordinate of the point of intersection with the reflector 135 of the orthographically projected ray, and the x-y coordinate of the point at which that orthographically projected ray intersects the planar light-sensitive surface of the image sensor 110. In a preferred arrangement, the image sensor 110 includes a planar charge-coupled device ("CCD") image sensor having an array of light sensing cells.
Each cell senses the intensity of light at its particular location in the array. Therefore, with a one-to-one correspondence, the image signals produced by the CCD cells which cover a particular range of x-y coordinates in the grid are representative of the rays which are orthographically reflected from the reflecting surface 135 at points within the same range of x-y coordinates. Thus, mapping of the image into a Cartesian-coordinate system is a simple task for persons skilled in the art. With the one-to-one correspondence explained above in mind, Fig. 6 illustrates a technique for zooming in on any selected portion of the substantially hemispherical scene. The reflector 135 is positioned relative to orthogonal x, y and z axes in the same manner as in Fig. 5. In order to zoom in at a focal distance f on a selected portion of the scene centered around a point 550, with a specified size, only the image signals of the CCD cells located within the same range of x-y coordinates as the region of the reflecting surface projecting the selected portion of the scene are selected for magnification and viewing.
More particularly, to determine the appropriate light intensity for point 570 in the selected portion of the scene, the light intensity signal generated by the CCD cell which lies at 580 is chosen. As shown in Fig. 6, a line segment drawn between point 570 and focus point 551 intersects the reflector 135 at point 552. The light intensity at point 570 is set equal to that represented by the image signal generated by the CCD cell at 580 which is located at the x-y coordinate on the grid nearest to the x-y coordinate of point 552. The same is repeated for each CCD cell within the same range of x-y coordinates as the region of the reflecting surface projecting the selected portion of the scene. As a result of the orthographic reflection and the one-to-one correspondence described above, no image reconstruction or complex frame transformation is required.
A general purpose computer 125 can be readily programmed by one skilled in the art to perform the above steps to enable viewing of any portion of the hemispherical scene from a single viewpoint, and to also enable zooming in on any particular portion to provide an enlarged image of that portion. Furthermore, by designating successive points along the reflector, the hemispherical scene can be panned as if one were viewing the scene from a single viewpoint. In the embodiment discussed above, it is readily apparent that as one zooms in on smaller portions of the scene, the number of CCD cells providing information to the computer 125 is reduced, and hence the granularity of the viewed image is increased. In a preferred embodiment, information about points in the scene which do not exactly correspond to CCD cells is more closely approximated by interpolation. A suitable interpolation program which may be executed on computer 125 is included in Appendix I to this specification. The program attached as Appendix I will map the sensed omnidirectional image to an ordinary perspective image that is suitable for display on computer 125. The program requires the user to input the name, center location, and radius of the omnidirectional image to be converted. The program also requires the user to input a name for the generated perspective image, as well as a focal length and size for the perspective image.
Thus, instead of simply choosing the image signal generated by the nearest CCD cell to represent portions of the image which do not precisely correspond to a CCD cell, the image for such scene portions is estimated by the appended program based on a suitable average of image signals generated by CCD cells which correspond to neighboring portions of the scene. Of course, more sophisticated interpolation programs known to those skilled in the art, such as those that are based on polynomial or temporal matching, may be used without departing from the scope of the invention, as defined by the claims herein.
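The core of the Cartesian-coordinate mapping can be sketched in a few lines. The fragment below (Python; this is not the Appendix I program, and the function and parameter names are illustrative only) computes, for a viewing direction from the single viewpoint, the x-y coordinate at which the corresponding scene ray meets the paraboloid of equation (1) — which, by the one-to-one correspondence described above, is also the x-y coordinate to read (or interpolate) from the sensor grid.

```python
import math

def viewray_to_sensor_xy(dx, dy, dz, h):
    """Intersect the ray t*(dx, dy, dz), t > 0, from the focus (origin)
    with the paraboloid z = (h^2 - r^2)/(2h); the orthographic reflection
    lands on the sensor at the same (x, y)."""
    s = dx * dx + dy * dy
    if s == 0.0:
        return (0.0, 0.0)  # looking straight up the axis: hits the vertex
    # substituting (t*dx, t*dy, t*dz) into eq. (1) gives
    #   (s / (2h)) t^2 + dz t - h/2 = 0; take the positive root
    a, b, c = s / (2 * h), dz, -h / 2
    t = (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)
    return (t * dx, t * dy)

print(viewray_to_sensor_xy(1.0, 0.0, 0.0, 2.0))  # (2.0, 0.0): horizon ray lands at r = h
```

A perspective image is then formed by sweeping such a direction over the pixels of a virtual image plane at the chosen focal distance and reading, with interpolation, the sensor values at the returned coordinates.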
In addition to the Cartesian-coordinate mapping that has been described, which produces a perspective image, a cylindrical-coordinate mapping may also be performed to achieve a panoramic image of the scene being viewed. Cylindrical-coordinate mapping will be described with reference to Fig. 9. In Fig. 9, a principal ray 950 from a point 945 in a scene strikes a paraboloid-shaped reflector 935 and is orthographically reflected toward an image sensor 910. The orthographically reflected ray 960 impinges the image sensor at sensing element 965. To map the point represented by sensing element 965 into cylindrical coordinates, a truncated cylinder 970 is imagined to surround the paraboloid-shaped reflector 935 and image sensor 910. The point represented by sensing element 965 is then traced back through rays 960 and 950, and the point of intersection 955 of the ray 950 with the truncated cylinder 970 is determined. Point 955 is then assigned the light intensity of sensing element 965. This same calculation is performed for each sensing element of the image sensor 910. The resultant collection of points (with appropriately assigned light intensities) located on the truncated cylinder 970 produces a panoramic image of the scene being viewed. This panoramic image can be viewed on a display by further mapping the truncated cylinder to a planar surface. This mapping is easily performed by those skilled in the art and can be visualized by imagining that the cylinder is cut length-wise and flattened out. Moreover, as those skilled in the art will readily appreciate, interpolation of image data as discussed above in relation to Cartesian-coordinate mapping may also be used with cylindrical-coordinate mapping.
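The cylindrical-coordinate mapping can likewise be sketched. In the fragment below (Python; the function name and the cylinder radius are illustrative assumptions), a sensor point is traced back through the orthographic reflection to the mirror point directly above it, and then out along the scene ray through the focus to an imagined cylinder of radius R, as described for Fig. 9.

```python
import math

def sensor_to_panorama(x, y, h, R):
    """Map a sensor point (x, y), excluding the axis point (0, 0), to
    (azimuth, height) on a cylinder of radius R centered on the
    paraboloidal axis, with the focus at height 0."""
    r = math.hypot(x, y)
    z = (h * h - r * r) / (2 * h)   # mirror height above the focus, eq. (1)
    phi = math.atan2(y, x)          # panoramic column (azimuth)
    height = R * z / r              # similar triangles along the scene ray
    return (phi, height)

# a ray striking the truncation edge (r = h) comes from the horizon:
print(sensor_to_panorama(2.0, 0.0, 2.0, 10.0))  # (0.0, 0.0)
```

Repeating this for every sensing element, and flattening the cylinder, yields the panoramic image; gaps between traced points are filled by the same kind of interpolation used for the Cartesian mapping.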
In a preferred embodiment of the present invention, a V inch CCD is used with a 0.4 inch focal length paraboloid-shaped mirror truncated through its focus and having a 1.6 inch diameter. A collimating lens, such as Model No. P32921 from EDMUND SCIENTIFIC of Barrington, New Jersey, is used with an 8.5 inch focal-length imaging lens to optically couple the mirror to the CCD.
In a further exemplary embodiment of the invention, the omnidirectional imaging apparatus includes an additional substantially paraboloid-shaped reflector 735 as shown in Fig. 7. The additional reflector is positioned to orthographically project an image of an additional hemispherical scene 730 which is complementary to the hemispherical scene 130 so that together they constitute a spherical scene. An additional image sensor 710 is positioned to receive the image orthographically projected by the additional reflector 735.
An image signal representative of the orthographic reflection of the additional reflector 735 is converted to a digital signal by converter 720 in the same manner as described above, and is sent to the same general purpose computer 125 via line 725. As shown in Fig. 8, the reflectors 135 and 735 are positioned back-to-back, share a common axis of rotation 810, which is also the optical axis of the imaging apparatus, and a common focus 805, and are each truncated at a plane p which is substantially perpendicular to the axis of rotation 810 and which includes the focus 805.
Referring to Fig. 10, there is shown a flow chart 1000 illustrating a method for sensing an image of a substantially hemispherical or spherical scene from a single viewpoint according to an exemplary embodiment of the present invention.
Flowchart 1000 shows the necessary steps for sensing the hemispherical scene from a single viewpoint. The method requires orthographically reflecting the substantially hemispherical scene 1010, and sensing the orthographically reflected image 1020.
The method may further include the steps of converting the image signal into image signal data 1030, mapping the image data into an appropriate coordinate system 1040, interpolating the image data 1060 to derive approximate values for missing image data, and forming a digital image 1070 from the mapped image data and from the interpolated image data. Advantageously, the steps of specifying a viewing direction, a focal length, and an image size 1045 and zooming in 1050 on a selected portion of the image data may be performed before the interpolation step. Thus far, the exemplary embodiments described have all utilized a "normal" paraboloid-shaped reflector. As used in this specification and the appended claims, the term "normal" in association with a paraboloid-shaped reflector refers to a paraboloid-shaped reflector that is truncated at a plane that passes through the focal point of the paraboloid-shaped reflector and that is substantially perpendicular to the paraboloidal axis of the paraboloid-shaped reflector. As used in this specification and the appended claims, the paraboloidal axis of a paraboloid-shaped reflector is the axis passing through the vertex and focal point of the paraboloid-shaped reflector. As described above, using a normal paraboloid-shaped reflector, one can image an entire hemisphere (2π steradians), or by placing two such reflectors back-to-back, a complete sphere (4π steradians). Figs. 11 to 15 show further exemplary embodiments of the omnidirectional imaging apparatus in which the paraboloid-shaped reflector may also take the form of various non-normal paraboloids.
Fig. 11 shows an omnidirectional imaging apparatus that is able to image a field of view ("FOV") greater than a hemisphere using only one camera 1111 and one paraboloid-shaped reflector 1135. In the embodiment of Fig. 11, the paraboloid-shaped reflector 1135 is an extended paraboloid that is obtained by cutting a suitable reflector with a plane that is normal to the axis of the paraboloid (z) but passes below the focal point 1130 of the paraboloid. Advantageously, because the paraboloid extends below its focal point, the paraboloid-shaped reflector is able to orthographically reflect rays from the hemisphere below its focal point. In the embodiment illustrated in Fig. 11, for example, the FOV covered by the paraboloid-shaped reflector is 240 degrees, or 75% of an entire sphere. Preferably, as shown in Fig. 11, the camera 1111 and the paraboloid-shaped reflector 1135 are coupled by optics 1112.
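The 75% figure follows from the solid angle of a cone-shaped field of view. A quick check (Python; a small helper written for this description, not part of the disclosure):

```python
import math

def fov_fraction_of_sphere(fov_deg):
    """Fraction of the full sphere (4*pi steradians) covered by a
    cone-shaped FOV with full apex angle fov_deg; the cone's solid
    angle is 2*pi*(1 - cos(half angle))."""
    half = math.radians(fov_deg / 2.0)
    return (1.0 - math.cos(half)) / 2.0

print(fov_fraction_of_sphere(180))  # ~0.5: a hemisphere
print(fov_fraction_of_sphere(240))  # ~0.75: the extended paraboloid of Fig. 11
```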
Fig. 12 shows an omnidirectional imaging apparatus that may be used to image a FOV that is tilted with respect to the paraboloidal axis of the paraboloid-shaped reflector. The embodiment of Fig. 12 includes a camera 1211, optics 1212, and a paraboloid-shaped reflector 1235. The paraboloid-shaped reflector 1235 is truncated at a plane passing through the focus of the paraboloid-shaped reflector 1235 and tilted with respect to its paraboloidal axis (z). The FOV of this reflector is thus a tilted hemisphere, as shown by the dotted lines in Fig. 12. Although the embodiment in Fig. 12 shows the truncation plane passing through the focus of the paraboloid, the invention is not limited to this embodiment. The truncation plane may also pass above the focus 1230 of the paraboloid (thereby resulting in a FOV smaller than a hemisphere), or the truncation plane may pass below the focus 1230 (thereby resulting in a FOV greater than a hemisphere).
Fig. 13 shows an omnidirectional imaging apparatus that may be used to image a FOV smaller than a hemisphere. The embodiment of Fig. 13 includes a camera 1311 coupled to a paraboloid-shaped reflector 1335 by optics 1312. In this embodiment, the paraboloid-shaped reflector 1335 is formed such that it is "larger" than the imaging area of the camera 1311. In this context, a paraboloid-shaped reflector is "larger" than the imaging area of a camera if the base of a normal paraboloid having the same shape as the reflector (i.e., having the same paraboloidal constant h as defined in equation (1)) is larger than the smallest dimension of the imaging area of the camera. Taking the case of a normal paraboloid for illustrative purposes, it is clear that when such a paraboloid is larger than the imaging area of a camera, only a FOV smaller than a full hemisphere is capable of being captured in the imaging area of the camera because the orthographically reflected rays at the outer edges of the paraboloid will not impinge on the imaging area. Advantageously, however, the image captured using such a paraboloid-shaped mirror has a higher resolution than a corresponding image captured using a smaller paraboloid. As shown in Fig. 13, the paraboloidal axis of the paraboloid-shaped reflector 1335 (z') may be shifted with respect to the optical axis (z) to obtain fields of view towards the horizon. In addition, the paraboloid-shaped reflector 1335 need not be a normal paraboloid, but may be truncated in accordance with the FOV to be imaged.
Thus far, all of the embodiments discussed have comprised a convex paraboloid-shaped reflector. In Fig. 14, an embodiment of an omnidirectional imaging apparatus according to the present invention is shown that includes a camera 1411, optics 1412, and a concave paraboloid-shaped reflector 1435. A concave paraboloid-shaped reflector may be used in applications where the concealment of the reflector is desirable (as, for example, in outdoor applications where protection against weather is desirable). In the case of a concave paraboloid-shaped reflector, the paraboloidal image of the scene is "flipped," but the image continues to satisfy the single viewpoint constraint disclosed previously. Therefore, pure perspective images can be generated from the concave paraboloidal image, just as with the convex paraboloidal image. In the case of the concave paraboloid, however, at most a hemispherical field of view can be obtained with a single reflector. This hemispherical FOV is obtained by truncating the paraboloid with a plane that passes through the focal point 1435 of the paraboloid (the plane being either normal or tilted with respect to the axis of the paraboloid (z)). Although a concave paraboloid that is truncated above its focal point may also be used, such a paraboloid is not desirable because it causes self-obstruction of the image.
As shown in Fig. 25, a FOV greater than a hemisphere may be obtained by using multiple concave paraboloid-shaped reflectors. In Fig. 25, two paraboloid-shaped reflectors, 2535a and 2535b, are positioned such that they share a common paraboloidal axis (z) and their vertexes 2545 coincide. Together with image sensors 2511a and 2511b, the two paraboloid-shaped reflectors 2535a and 2535b are able to image two hemispheres 2530a and 2530b, respectively. This system may be used advantageously when the reflectors are required to be recessed for concealment or protection. A disadvantage to using concave mirrors in this arrangement, instead of using convex mirrors in the arrangement of Fig. 7, is that a small blind spot, comprising the area between the truncation planes of the two reflectors, is inevitable.

Fig. 15 shows an embodiment of an omnidirectional imaging system according to the present invention with zoom capabilities. The omnidirectional imaging system of Fig. 15 includes a paraboloid-shaped reflector 1535, a camera 1511, a zoom lens 1512, and relay optics 1513. (As used in this specification and the appended claims, relay optics and collimating optics are synonymous.) With the zoom lens 1512 set to its lowest power, the omnidirectional imaging system provides an image of the entire hemisphere (or greater or less than a hemisphere if the embodiments of Fig. 11 or Fig. 13 are used). When zoomed in, the zoom lens 1512 provides a higher magnification (and, therefore, a higher resolution) of a smaller FOV. While zooming in, the effective center of projection of the zoom lens 1512 must remain approximately fixed to ensure that the imaging system remains telecentric. Preferably, the relay optics 1513 is used to ensure that the zoom lens 1512 remains telecentric over its entire range of settings.
In the embodiment of Fig. 15, the zoom lens 1512 may be either fixed or mobile with respect to the paraboloid-shaped reflector 1535. If the zoom lens 1512 is fixed, only regions around the paraboloidal axis (z) can be observed under magnification. Preferably, therefore, the zoom lens 1512 is equipped with some movement means, allowing the zoom lens to be positioned over, and to image, regions along the outer edges of the paraboloid-shaped reflector 1535. Of course, such movement means must ensure that the optical axis of the zoom lens 1512 remains parallel to the paraboloidal axis of the paraboloid-shaped reflector 1535 at all times.

Fig. 16 shows an omnidirectional imaging system that may be used to produce dynamically changing fields of view of a scene. A paraboloid-shaped reflector 1635 is mounted on a movable base 1640, which allows translation of the paraboloid-shaped reflector 1635 along the x, y and z axes. The movable base 1640 may be controlled either manually or with a computer. Using the movable base 1640, a dynamically changing field of view of a scene could be produced, for example, by a circular motion of the movable base 1640 about the optical axis (z). Preferably, the images are image processed as described previously to obtain perspective or panoramic views.
Fig. 16 further shows the use of a zoom lens 1612 in combination with the movable base 1640. A zoom lens 1612 adds the capability to zoom into sections of the paraboloid-shaped reflector 1635 brought under the view of the imaging system by the movement of the movable base 1640. Preferably, a relay lens 1613 is used to couple the zoom lens and the paraboloid-shaped reflector 1635. In addition, the zoom lens 1612 preferably includes a manual or automatic focus control to ensure that the sharpness of images is maintained over all sections of the paraboloid-shaped reflector 1635. Alternatively, translation of the reflector along the z axis may also be used to adjust the focus of an image.
Instead of moving the paraboloid-shaped reflector as in the embodiment of Fig. 16, one or more parts of the camera or optics of the imaging system may alternatively be moved to achieve the same effect as the Fig. 16 embodiment. Figs. 17A, 17B, and 17C show various exemplary embodiments of such omnidirectional imaging systems. In Fig. 17A, an image sensor 1710 (such as a CCD) is provided with movable means; in Fig. 17B, a camera 1711 is provided with movable means; and in Fig. 17C, both a camera 1711 and optics 1712 are provided with movable means, for moving together simultaneously. As shown in the figures, each of these components may be moved along any of the x, y, or z axes to change the field of view being imaged. As in the embodiment of Fig. 16, a zoom lens may be used to magnify areas of interest. Advantageously, by moving the camera or optics instead of moving the paraboloid-shaped reflector, the viewpoint of the omnidirectional imaging system remains fixed in space at the focal point of the paraboloid-shaped reflector. The embodiments of Figs. 16, 17A, 17B, and 17C may be used to great advantage in a surveillance system. The omnidirectional imaging capability of these embodiments allows an operator to monitor an entire area of interest at once. When the operator observes a particular region of interest within the area being monitored, the operator may then select appropriate translational coordinates (for movement of the camera, optics, or paraboloid-shaped reflector), and appropriate zoom settings, to view the region of interest in greater detail.
Fig. 18 shows an omnidirectional imaging system that utilizes multiple image sensors to achieve increased image resolution. The embodiment of Fig. 18 includes a paraboloid-shaped reflector 1835, video electronics 1809, four CCD elements 1810a - 1810d, and imaging optics 1812. In this embodiment, the four CCD elements 1810a - 1810d are placed side-by-side in a non-overlapping arrangement. The embodiment of Fig. 18 takes advantage of the fact that commercial CCD elements are typically manufactured in standard resolutions regardless of their size. Therefore, by using four commercial ¼-inch CCD elements instead of a single commercial ½-inch CCD element, the resolution of an image may advantageously be quadrupled. Although Fig. 18 shows the use of CCD elements placed in a non-overlapping arrangement, the invention described here is not limited to such an arrangement. Thus, an arrangement where multiple CCD elements partially overlap can likewise be used. Moreover, multiple image sensors may be fabricated into a single integrated circuit, with each image sensor connected to its own video circuitry.
Fig. 19 shows another embodiment that utilizes multiple image sensors to increase image resolution. In this instance, the multiple image sensors are provided by multiple cameras 1911. Beam splitters 1916 are used to direct separate sections of a paraboloidal image to different cameras. Advantageously, therefore, each portion of the paraboloidal image is imaged with a higher resolution than if the entire image were imaged by one camera alone.
In another exemplary embodiment of the present invention, dichroic beam splitters (not shown) may be used to split an image into a plurality of monochromatic images, which may be sensed by a plurality of monochromatic image detectors. These monochromatic images may later be suitably combined into a full-color image by image processing means well-known in the art.
Fig. 20 shows a planar image sensor 2010, as, for example, a CCD element. When a typical planar image sensor is used with a paraboloid-shaped mirror, the effective resolution of the captured paraboloidal image increases from the center of the image toward its outer edge. For example, when a planar image sensor is used to capture an image reflected by a normal paraboloid-shaped reflector, the resolution of the captured image increases by a factor of four from the center of the image to its fringe. To compensate for this effect, an image sensor has sensing elements 2008 whose sizes and placements are varied to result in a uniform resolution over the entire image. This same approach may also be used to increase resolution in selected parts of the FOV. When specific resolution variations are difficult to incorporate, standard resolution variations, such as those provided by log-polar sensors, may also be used.

One or more planar mirrors may be included in an omnidirectional imaging apparatus according to the present invention for flexibility of placement of the optics and reflector. Fig. 21 shows a preferred embodiment in which an omnidirectional imaging system includes a paraboloid-shaped reflector 2135, a planar mirror 2116, a relay lens 2113, an imaging lens 2112, and a camera 2111. In the embodiment shown, the paraboloid-shaped reflector 2135 is positioned above a surface 2140, and the planar mirror 2116, relay lens 2113, imaging lens 2112, and camera 2111 are concealed below the surface 2140. The planar mirror 2116 is positioned beneath an opening 2145 in the surface 2140 and folds the image from the paraboloid-shaped reflector 90 degrees, thereby redirecting the image to the relay lens, imaging lens, and camera.
Although the planar mirror is shown between the paraboloid-shaped reflector and the relay lens, the planar mirror may also be placed between the relay lens and the imaging lens or between the imaging lens and the camera, as those skilled in the art will appreciate.
Fig. 22 shows an embodiment of an omnidirectional imaging system in which the optics between a paraboloid-shaped mirror 2235 and an image sensor 2210 comprise a low-power, inverted microscope objective 2212. In this embodiment, the reflector 2235 is at the position normally occupied by the eyepiece of the microscope and the image sensor 2210 is at the position normally occupied by a slide. The use of an inverted microscope objective is advantageous for imaging since commercial microscope objectives are well corrected for aberrations.

Fig. 23 shows an embodiment of an omnidirectional imaging system in which a collimator lens 2313 is placed between a paraboloid-shaped mirror 2335 and imaging optics 2312. It is desirable to use commercially available imaging lenses in many cases to save the cost and time of designing special lenses. Most commercial imaging lenses, however, are intended to image scenes that are very far from the lens. Indeed, they are normally designed for objects that are infinitely distant from the lens. Therefore, when used to image objects that are close to the lens, the image suffers from various types of aberrations which degrade the effective resolution of the lens. The result is a "fuzzy" or smeared out image. In this embodiment, this problem is solved by the use of a collimating lens 2313, which produces a virtual object at infinity for the imaging optics 2312. Advantageously, therefore, the use of a collimating lens 2313 allows the use of commercially available imaging lenses.
The embodiments of Figs. 24A and 24B illustrate the use of field-flattening lenses between an image sensor 2410 and an imaging lens 2413. Field-flattening means are desirable because the paraboloid-shaped reflector of the present invention, having a typically small focal length of a few millimeters, is afflicted with very strong field curvature. One method of eliminating this imaging defect is to use an image sensor with a curved surface that matches the field curvature. More preferably, however, a special lens, called a field-flattening lens, may be introduced which has a curvature of opposite sign to that of the reflector. Therefore, the two field curvatures cancel, and the resultant image surface is flat, allowing the entire image to be in sharp focus on a planar image sensor.
Two types of preferred field-flattening lenses are illustrated in Figs. 24A and 24B. In Fig. 24A, a plano-concave lens 2412a is shown. The plano-concave lens 2412a is placed as close as possible to the image sensor 2410. Preferably, the plano-concave lens 2412a is placed in contact with the image sensor window 2417. In this position, the plano-concave lens 2412a compensates for the field curvature of the reflector while introducing only small amounts of undesirable aberrations.
A second type of preferred field-flattening lens, a meniscus lens 2412b, is shown in Fig. 24B. Both of the surfaces of the meniscus lens 2412b are aplanatic to the incoming light. If a surface is aplanatic, it introduces no spherical aberration, coma or astigmatism into the beam of light; it only introduces field curvature. The meniscus lens 2412b has a marked field flattening effect which is determined by the thickness of the lens: the thicker the lens, the greater the field flattening effect. In contrast to the plano-concave lens 2412a of Fig. 24A, the meniscus lens 2412b of Fig. 24B is not used in contact with the image sensor 2410.
The theory of the field-flattening lenses will now be explained. Ideally, the surface of best focus of an optical system is a plane. With a planar surface, a CCD or other type of flat image sensor can match the surface of best focus over its entire area, thereby providing maximum resolution for the image. Unfortunately, an optical system has a tendency to form its best images on a curved surface. Accordingly, the curved focal surface and the flat CCD surface cannot be matched up over their entire area, and some or all of the image will not be in best focus.
The field curvature of an optical system is called its Petzval curvature. Every optical element in an optical system contributes to the Petzval curvature for the system. If a surface of an optical element is refracting, its Petzval contribution to the curvature of the system is:

P = (1 - n) / (nR),

where n is the refractive index of the optical element and R is the radius of curvature of the surface of the optical element. Clearly, the Petzval contribution of a surface depends on the sign of the radius. If the surface is a mirror instead of a refracting surface, its Petzval contribution is:

P = -2 / R
The field curvature of an image is calculated by taking the sum of the contributions of all of the reflecting and refracting surfaces and multiplying the sum by a simple constant. If this value is not zero, then the field of the image is curved and the problem discussed above will be encountered (i.e., the surface of the image and the surface of the image sensor will not be completely matched).
Unfortunately, the curvatures of optical surfaces cannot be eliminated because they are necessary for other purposes, such as for controlling spherical aberration, coma and astigmatism. Because the control of these aberrations depends on the curvatures of optical elements, if the curvature of these elements is changed, these aberrations may be adversely affected. There are two ways, however, in which the Petzval curvature of an optical system may be changed without changing the other aberrations of a system. These two methods form the basis for the two types of field-flattening lenses described above.
The first method for changing the Petzval curvature depends on the optical characteristics of an optical surface located at the surface of an image. If an optical surface is located at the surface of an image (either an intermediate image or the final image of the optical system), then this surface will not change the spherical aberration, coma or astigmatism of the image. The only change will be to the Petzval curvature. Thus, the Petzval curvature of a system can be corrected by inserting a surface with an appropriate radius of curvature at the final focus of the system. This is the basis for the plano-concave field-flattening lens described above.
The second method for changing the Petzval curvature depends on the optical characteristics of aplanatic surfaces. Assume there is an aplanatic surface, which is defined as follows: Let s be the object distance for the surface and s' be the image distance. Also, let n and n' be the refractive indices of the materials before and after the surface, respectively (where n = 1 for air and n > 1 for glass). If s and s' are related by

s' = R(n + n')/n' = ns/n',

then the surface will introduce no spherical aberration or coma and only very small amounts of astigmatism. If now a thick lens is introduced, both of whose surfaces satisfy this condition, then the difference in their radii will depend on the thickness of the lens. This fact can again be used to control the Petzval curvature of the system by adjusting the thickness of the aplanatic lens. This is the basis of the thick, meniscus field-flattening lens discussed above.
In a preferred embodiment of the plano-concave lens 2412a of Fig. 24A, the plano-concave lens is composed of BK7 and has a refractive index (n) of 1.517. The radius of the curved (concave) surface r1 is 6.2 mm. The surface opposite the curved surface r1 is flat and is placed in contact with the image detector window 2417. The axial thickness of the lens is 1.5 mm, and the optical diameter is 3 mm.
In a preferred embodiment of the aplanatic lens 2412b of Fig. 24B, the aplanatic lens is composed of acrylic plastic and has a refractive index (n) of 1.494.
The radius of the curved (convex) surface r2 is 4.78 mm, and the radius of the curved (concave) surface r3 is 2.16 mm. The axial thickness of the lens is 6.7 mm. The optical diameter of the curved surface r2 is 7 mm, and the optical diameter of the curved surface r3 is 2.7 mm.
Although the present invention has been described with reference to certain preferred embodiments, various modifications, alterations, and substitutions will be known or obvious to those skilled in the art without departing from the spirit and scope of the invention, as defined by the appended claims.

APPENDIX I
compute_image.c
#include <stdlib.h>
#include <stdio.h>
#include <math.h>
#include "imageutil.h"
int main(int argc, char *argv[])
{
unsigned char *r, *g, *b;
unsigned char *red, *green, *blue;
int xsize, ysize;
int xosize, yosize;
int i, j, x0, y0, x1, y1;
double theta, phi;
double ox, oy, oz;
double px, py, pz;
double qx, qy, qz;
double tempx, tempy, tempz;
double sx, sy, sz;
double rad, mag;
double xs, ys, zs;
double dispx, dispy;
int xcent, ycent, xnew, ynew, xpix, ypix, xpoint, ypoint;
int xpixel, ypixel, indexx, indexy, xcenter, ycenter;
float radius, focal;
/* printf(" completed initializations\n\n"); */
if(argc != 4) { printf("arguments: xcenter, ycenter, radius\n"); exit(0); }
printf("\n");
xcent = atoi(argv[1]);
ycent = atoi(argv[2]);
radius = atof(argv[3]);
printf("omni-image: xcenter = %d ycenter = %d radius = %f\n\n", xcent, ycent, (float)radius);
printf("input view pixel [xnovel ynovel]: "); scanf("%d %d", &xnew, &ynew); printf("\n");
printf("selected view pixel: xnew = %d ynew = %d\n\n", xnew, ynew);
printf("input new image parameters [xpixels ypixels focal]: ");
scanf("%d %d %f", &xpix, &ypix, &focal);
printf("\n");
printf("output image: xpixels = %d ypixels = %d focal = %f\n\n", xpix, ypix, (float)focal);
loadPPM("test.ppm", &r, &g, &b, &xsize, &ysize);
printf(" loaded omni-image file\n\n");
xosize = xpix; yosize = ypix;
/* printf("set new img size, xsize = %d, ysize = %d \n\n", xosize, yosize); */
red = (unsigned char*)malloc(xosize * yosize * sizeof(unsigned char)); green = (unsigned char*)malloc(xosize * yosize * sizeof(unsigned char)); blue = (unsigned char*)malloc(xosize * yosize * sizeof(unsigned char));
printf("allocated memory for new image file\n\n");
xcenter = xcent; ycenter = ycent;
xpoint = ynew - ycent; ypoint = xnew - xcent;
tempx = (double)xpoint;
tempy = (double)ypoint;
tempz = (radius*radius - (tempx*tempx + tempy*tempy))/(2*radius);
ox = tempx/sqrt(tempx*tempx + tempy*tempy + tempz*tempz);
oy = tempy/sqrt(tempx*tempx + tempy*tempy + tempz*tempz);
oz = tempz/sqrt(tempx*tempx + tempy*tempy + tempz*tempz);
/* computed optical (z) axis */
tempx = -oy; tempy = ox; tempz = 0;
px = tempx/sqrt(tempx*tempx + tempy*tempy + tempz*tempz);
py = tempy/sqrt(tempx*tempx + tempy*tempy + tempz*tempz);
pz = tempz/sqrt(tempx*tempx + tempy*tempy + tempz*tempz);
/* computed horizontal axis */
tempx = py*oz - pz*oy; tempy = pz*ox - px*oz; tempz = px*oy - py*ox;
qx = tempx/sqrt(tempx*tempx + tempy*tempy + tempz*tempz); qy = tempy/sqrt(tempx*tempx + tempy*tempy + tempz*tempz); qz = tempz/sqrt(tempx*tempx + tempy*tempy + tempz*tempz);
/* computed vertical axis */
printf("computed perspective image frame\n\n");
/* raster scan perspective image plane */
for(i=0;i<ypix;i++){
    dispy = (double)i - (double)ypix/2;
    for(j=0;j<xpix;j++){
dispx = (double)xpix/2 - (double)j;
sx = ox * focal + px * dispx + qx * dispy; sy = oy * focal + py * dispx + qy * dispy; sz = oz * focal + pz * dispx + qz * dispy;
mag = sqrt(sx*sx + sy*sy + sz*sz);
sx = sx/mag; sy = sy/mag; sz = sz/mag;
/* computed vector in direction of current pixel */
phi = atan2(sy,sx); theta = acos(sz/sqrt(sx*sx + sy*sy + sz*sz));
/* converted vector to polar coordinates */
rad = 2*radius*(1-cos(theta))/(1-cos(2*theta));
/* found radius of intersection on parabola */
xs = rad*sin(theta)*cos(phi); ys = rad*sin(theta)*sin(phi); zs = rad*cos(theta); /* found x, y, z coordinates on paraboloid */
/* printf("xs - %f ys = %f zs = %f\n\n" , (float)xs, (float)ys, (float)zs); */ /* use xs,ys to read from input image and save in output image */
/* check if image point lies outside parabolic image */
if(sqrt(xs*xs + ys*ys) > radius) {
red[i * xpix + j] = 255;
green[i * xpix + j] = 255;
blue[i * xpix + j] = 255;
}
else {
indexx = (int)ys + xcenter; indexy = (int)xs + ycenter;
/* printf("one pixel\n\n"); */
/* write closest color value into pixel */
red[i * xpix + j] = r [indexy * xsize + indexx]; green[i * xpix + j] = g[indexy * xsize + indexx]; blue[i * xpix + j] = b [indexy * xsize + indexx]; } } }
printf(" computed perspective image\n\n");
savePPM("out.ppm", red, green, blue, xpix, ypix);
printf("saved new image file\n\n");
system("xv out.ppm &");
free(r); free(g); free(b);
free(red); free(green); free(blue);
printf("freed allocated memory\n\n");
return 0; }

Claims

1. An omnidirectional imaging apparatus for sensing an image of a scene from a single viewpoint, comprising:
(a) a truncated, substantially paraboloid-shaped reflector positioned to orthographically reflect principal rays of electromagnetic radiation radiating from said scene, said paraboloid-shaped reflector having a focus coincident with said single viewpoint of said omnidirectional imaging apparatus, including said paraboloid-shaped reflector;
(b) telecentric means, optically coupled to said paraboloid-shaped reflector, for substantially filtering out principal rays of electromagnetic radiation which are not orthographically reflected by said paraboloid-shaped reflector; and
(c) one or more image sensors positioned to receive said orthographically reflected principal rays of electromagnetic radiation from said paraboloid-shaped reflector, thereby sensing said image of said scene.
2. An omnidirectional imaging apparatus according to claim 1, wherein said paraboloid-shaped reflector is convex.
3. An omnidirectional imaging apparatus according to claim 1, wherein said paraboloid-shaped reflector is concave.
4. An omnidirectional imaging apparatus according to claim 1, wherein said paraboloid-shaped reflector comprises a substantially paraboloidal mirror having a surface which substantially obeys the equation expressed in cylindrical coordinates:

z = (h^2 - r^2) / 2h,

z being an axis of rotation of said surface, r being a radial coordinate, and h being a constant.
5. An omnidirectional imaging apparatus according to claim 1, wherein said one or more image sensors comprise one or more charge-coupled devices.
6. An omnidirectional imaging apparatus according to claim 1, wherein said one or more image sensors comprise one or more charge injection devices.
7. An omnidirectional imaging apparatus according to claim 1, wherein said one or more image sensors comprise photographic film.
8. An omnidirectional imaging apparatus according to claim 1, wherein said one or more image sensors comprise one or more video cameras.
9. An omnidirectional imaging apparatus according to claim 1, wherein said one or more image sensors has a curved surface that matches a field-curvature of said image.
10. An omnidirectional imaging apparatus according to claim 1, wherein at least one of said one or more image sensors has a non-uniform resolution.
11. An omnidirectional imaging apparatus according to claim 1, wherein said one or more image sensors are positioned along an axis passing through the vertex of said paraboloid-shaped reflector and through said focus of said paraboloid-shaped reflector.
12. An omnidirectional imaging apparatus according to claim 1, further comprising one or more planar mirrors positioned between said paraboloid-shaped reflector and said one or more image sensors, wherein said one or more planar mirrors optically couple said paraboloid-shaped reflector to said one or more image sensors.
13. An omnidirectional imaging apparatus according to claim 1, wherein said paraboloid-shaped reflector comprises a mirror truncated at a plane which includes said focus of said paraboloid-shaped reflector.
14. An omnidirectional imaging apparatus according to claim 1, wherein said paraboloid-shaped reflector comprises a mirror truncated at a plane that is substantially perpendicular to an axis passing through the vertex of said paraboloid-shaped reflector and through said focus of said paraboloid-shaped reflector.
15. An omnidirectional imaging apparatus according to claim 1, wherein said paraboloid-shaped reflector comprises a normal paraboloidal mirror.
16. An omnidirectional imaging apparatus according to claim 1, further comprising a transparent support coupling said paraboloid-shaped reflector to said one or more image sensors to thereby maintain the relative positions thereof.
17. An omnidirectional imaging apparatus according to claim 1, further comprising a fixed base and a movable base, wherein said paraboloid-shaped reflector is mounted on said fixed base and said one or more image sensors are mounted on said movable base, whereby movement of said one or more image sensors produces a changing field of view.
18. An omnidirectional imaging apparatus according to claim 17, further comprising a zoom lens positioned between and optically coupling said one or more image sensors and said paraboloid-shaped reflector.
19. An omnidirectional imaging apparatus according to claim 1, further comprising a fixed base and a movable base, wherein said paraboloid-shaped reflector is mounted on said movable base and said one or more image sensors are mounted on said fixed base, whereby movement of said paraboloid-shaped reflector produces a changing field of view.
20. An omnidirectional imaging apparatus according to claim 19, further comprising a zoom lens positioned between and optically coupling said one or more image sensors and said paraboloid-shaped reflector.
21. An omnidirectional imaging apparatus according to claim 1, wherein said one or more image sensors generate an image signal representative of said image of said scene, further comprising an image signal processing apparatus coupled to said one or more image sensors and receiving said image signal for converting said image signal into image signal data.
22. An omnidirectional imaging apparatus according to claim 21, wherein said image signal processing apparatus maps said image signal data into a Cartesian-coordinate system to produce a perspective image.
23. An omnidirectional imaging apparatus according to claim 21, wherein said image signal processing apparatus maps said image signal data into a cylindrical-coordinate system to produce a panoramic image.
24. An omnidirectional imaging apparatus according to claim 21, wherein said image signal processing apparatus further includes interpolation means for providing interpolated image data, whereby said interpolated image data and said image signal data are combined to form a digital image.
25. An omnidirectional imaging apparatus according to claim 24, wherein said image processing apparatus further includes means for zooming in on a preselected portion of said digital image to thereby provide an enlarged image of said preselected portion from a predetermined focal distance.
26. An omnidirectional imaging apparatus according to claim 1, wherein said telecentric means comprises a telecentric lens.
27. An omnidirectional imaging apparatus according to claim 1, wherein said telecentric means comprises a telecentric aperture.
28. An omnidirectional imaging apparatus according to claim 1, further comprising at least one lens optically coupling said one or more image sensors and said paraboloid-shaped reflector.
29. An omnidirectional imaging apparatus according to claim 28, wherein said at least one lens has a focal plane between said one or more image sensors and said at least one lens, and wherein said telecentric means is a telecentric aperture positioned along said focal plane.
30. An omnidirectional imaging apparatus according to claim 28, wherein said telecentric means comprises a collimating lens optically coupling said paraboloid-shaped reflector and said at least one lens.
31. An omnidirectional imaging apparatus according to claim 1, further comprising a zoom lens optically coupling said one or more image sensors and said paraboloid-shaped reflector.
32. An omnidirectional imaging apparatus according to claim 1 , further comprising a microscope objective optically coupling said one or more image sensors and said paraboloid-shaped reflector.
33. An omnidirectional imaging apparatus according to claim 1, further comprising a field-flattening lens optically coupling said one or more image sensors and said paraboloid-shaped reflector, said field-flattening lens having a field curvature approximately opposite to the field curvature of said paraboloid-shaped reflector.
34. An omnidirectional imaging apparatus according to claim 33, wherein said field-flattening lens comprises a plano-concave lens which is positioned closely to said one or more image sensors.
35. An omnidirectional imaging apparatus according to claim 33, wherein said field-flattening lens comprises a meniscus lens having aplanatic sides.
36. An omnidirectional imaging apparatus according to claim 1, wherein said scene is a substantially hemispherical scene, and further comprising: an additional truncated, substantially paraboloid-shaped reflector positioned to orthographically reflect principal rays of electromagnetic radiation radiating from an additional hemispherical scene, said additional paraboloid-shaped reflector having a focus coincident with a single viewpoint of said additional hemispherical scene; additional telecentric means, optically coupled to said additional paraboloid-shaped reflector, for substantially filtering out principal rays of electromagnetic radiation which are not orthographically reflected by said additional paraboloid-shaped reflector; and additional one or more image sensors positioned to receive said orthographically reflected principal rays of electromagnetic radiation from said additional paraboloid-shaped reflector, thereby sensing said additional substantially hemispherical scene.
37. An omnidirectional imaging apparatus according to claim 36, wherein said additional hemispherical scene and said hemispherical scene are substantially complementary to one another so that the combination thereof is a substantially spherical scene, and wherein said paraboloid-shaped reflector and said additional paraboloid-shaped reflector are normal convex paraboloids, positioned back-to-back along their planes of truncation, and having a common paraboloidal axis and a common focus point.
38. An omnidirectional imaging apparatus according to claim 36, wherein said additional hemispherical scene and said hemispherical scene are substantially complementary to one another so that the combination thereof is a substantially spherical scene, and wherein said paraboloid-shaped reflector and said additional paraboloid-shaped reflector are normal concave paraboloids, positioned such that their vertexes coincide and they share a common paraboloidal axis.
39. An omnidirectional imaging method for sensing an image of a scene from a single viewpoint, comprising the steps of:
(a) orthographically reflecting principal rays of electromagnetic radiation radiating from said scene on a truncated, substantially paraboloid-shaped reflector such that said single viewpoint of said omnidirectional imaging method coincides with a focus point of said paraboloid-shaped reflector;
(b) telecentrically filtering out a substantial portion of any principal rays of electromagnetic radiation which are not orthographically reflected by said paraboloid-shaped reflector; and
(c) sensing said orthographically reflected principal rays of electromagnetic radiation from said paraboloid-shaped reflector with one or more image sensors to thereby sense said image of said scene.
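The geometry underlying steps (a)-(c) of claim 39 has a compact closed form. The sketch below is a minimal illustration in Python, assuming one common convention for a parabolic catadioptric sensor (mirror surface z = (h² − x² − y²)/2h with the focus at the origin and the telecentric optics viewing along the z-axis); the function name and the parameter `h` are illustrative choices, not the patent's notation. Because every reflected ray is parallel to the axis, an image point of the orthographic projection maps directly back to the scene-ray direction through the single viewpoint:

```python
import math

def pixel_to_direction(x, y, h):
    """Map an image point (x, y) of the orthographic (telecentric)
    projection to the unit direction, through the paraboloid's focus,
    of the scene ray that produced it.  Assumed convention: mirror
    surface z = (h^2 - x^2 - y^2) / (2h), focus at the origin."""
    # The orthographically reflected ray came from the mirror point
    # directly above (x, y); the incident ray passed through the focus.
    z = (h * h - x * x - y * y) / (2.0 * h)
    n = math.sqrt(x * x + y * y + z * z)
    return (x / n, y / n, z / n)
```

A point at the image center looks straight up the paraboloidal axis; a point at radius h looks out along the horizon, which is why a single mirror covers a full hemisphere.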
40. The method of claim 39, wherein step (c) comprises sensing said image of said scene from a position along an axis passing through the vertex of said paraboloid-shaped reflector and through said focus of said paraboloid-shaped reflector.
41. The method of claim 39, further comprising the step of optically coupling said paraboloid-shaped reflector and said one or more image sensors with one or more planar mirrors positioned between said paraboloid-shaped reflector and said one or more image sensors.
42. The method of claim 39, further comprising the steps of providing an image signal which is representative of said image of said scene and converting said image signal into image signal data.
43. The method of claim 42, further comprising the step of mapping said image signal data into a Cartesian-coordinate system to produce a perspective image.
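One way to realize the mapping of claim 43 in code, sketched under the same common convention (mirror z = (h² − x² − y²)/2h, focus at the origin, h an illustrative mirror parameter): a ray t·d from the focus meets the mirror at t = h/(1 + d_z), and the orthographic reflection drops that mirror point straight onto the image, so a virtual perspective view can be rendered by casting one ray per output pixel. `sample` and the virtual focal length `f` are hypothetical parameters, not from the patent:

```python
import math

def direction_to_pixel(d, h):
    # Ray t*d from the focus meets z = (h^2 - x^2 - y^2)/(2h) at
    # t = h / (1 + dz); the orthographic reflection images the mirror
    # point's (x, y) coordinates unchanged.
    dx, dy, dz = d
    t = h / (1.0 + dz)
    return (t * dx, t * dy)

def render_perspective(sample, width, height, f, h):
    """Render a virtual perspective view looking up the paraboloid's
    axis by casting one ray through the single viewpoint per output
    pixel.  `sample(x, y)` reads the omnidirectional source image."""
    out = []
    for v in range(height):
        row = []
        for u in range(width):
            dx, dy, dz = u - width / 2.0, v - height / 2.0, f
            n = math.sqrt(dx * dx + dy * dy + dz * dz)
            row.append(sample(*direction_to_pixel((dx / n, dy / n, dz / n), h)))
        out.append(row)
    return out
```

The zooming of claim 46 falls out of the same mapping: increasing `f` narrows the bundle of cast rays, enlarging the preselected portion without moving the viewpoint.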
44. The method of claim 42, further comprising the step of mapping said image signal data into a cylindrical-coordinate system to produce a panoramic image.
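For the cylindrical mapping of claim 44, each panorama column fixes an azimuth and each row an elevation, and the corresponding source coordinates in the paraboloidal image follow in closed form. A minimal sketch, again assuming the convention mirror z = (h² − x² − y²)/2h with focus at the origin; the function name and parameters are illustrative:

```python
import math

def panorama_to_source(col, row, cols, rows, phi_min, phi_max, h):
    """Inverse map for building a cylindrical panorama: panorama pixel
    (col, row) -> coordinates (x, y) in the paraboloidal source image.
    Azimuth wraps 360 degrees over `cols`; elevation phi (radians,
    measured up from the horizontal) spans [phi_min, phi_max] over
    `rows`."""
    theta = 2.0 * math.pi * col / cols
    phi = phi_min + (phi_max - phi_min) * row / (rows - 1)
    # A ray through the focus at elevation phi meets the mirror at
    # distance t = h / (1 + sin(phi)) from the focus.
    t = h / (1.0 + math.sin(phi))
    r = t * math.cos(phi)        # radial image coordinate of that point
    return (r * math.cos(theta), r * math.sin(theta))
```

In practice the mapping is run in this inverse direction (panorama pixel back to source coordinates) so that every output pixel receives a value, with the interpolation of claim 45 filling between source samples.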
45. The method of claim 42, further comprising the steps of interpolating said image signal data to define approximate values for missing image data, and forming a digital image from said mapped image data and said interpolated image data.
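Claim 45 leaves the interpolation filter unspecified; bilinear interpolation is one common choice for defining approximate values at the non-integer source coordinates that the Cartesian and cylindrical mappings produce. A minimal sketch in pure Python (image as a list of rows; the function name is illustrative):

```python
import math

def bilinear(img, x, y):
    """Estimate a value at non-integer coordinates (x, y) by blending
    the four surrounding samples of `img`.  Nearest-neighbour or
    bicubic filters would slot into the method the same way."""
    x0, y0 = int(math.floor(x)), int(math.floor(y))
    fx, fy = x - x0, y - y0
    # Blend horizontally along the two bracketing rows, then vertically.
    a = img[y0][x0] * (1 - fx) + img[y0][x0 + 1] * fx
    b = img[y0 + 1][x0] * (1 - fx) + img[y0 + 1][x0 + 1] * fx
    return a * (1 - fy) + b * fy
```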
46. The method of claim 45, further comprising the steps of zooming in on a preselected portion of said digital image to thereby obtain an enlarged image of said preselected portion from a predetermined focal distance, interpolating said image data to define approximate values for missing image data, and forming a digital image from said mapped image data and said interpolated image data.
47. The method of claim 39, wherein said scene is substantially hemispherical and further comprising the steps of: orthographically reflecting principal rays of electromagnetic radiation radiating from an additional substantially hemispherical scene on an additional truncated, substantially paraboloid-shaped reflector such that a single viewpoint of said additional hemispherical scene coincides with a focus point of said additional paraboloid-shaped reflector; telecentrically filtering out a substantial portion of any principal rays of electromagnetic radiation which are not orthographically reflected by said additional paraboloid-shaped reflector; and sensing said orthographically reflected principal rays of electromagnetic radiation from said additional paraboloid-shaped reflector with additional one or more image sensors to thereby sense said additional hemispherical scene.
48. A method for omnidirectionally sensing images of a scene from a single viewpoint, the method comprising the steps of:
(a) mounting a truncated, substantially paraboloid-shaped reflector on a fixed base;
(b) mounting one or more image sensors on a movable base;
(c) orthographically reflecting principal rays of electromagnetic radiation radiating from said scene on said substantially paraboloid-shaped reflector such that said single viewpoint of said omnidirectional imaging method coincides with a focus point of said paraboloid-shaped reflector;
(d) telecentrically filtering out a substantial portion of any principal rays of electromagnetic radiation which are not orthographically reflected from said paraboloidal-shaped reflector;
(e) moving said movable base to a first position;
(f) sensing a first image of said scene having a first field of view by sensing said orthographically reflected principal rays of electromagnetic radiation from said paraboloidal-shaped reflector with said one or more image sensors;
(g) moving said movable base to a second position different from said first position; and
(h) sensing a second image of said scene having a second field of view by sensing said orthographically reflected principal rays of electromagnetic radiation from said paraboloidal-shaped reflector with said one or more image sensors.
49. A method for omnidirectionally sensing images of a scene according to claim 48, further comprising the step of optically coupling said substantially paraboloid-shaped reflector and said one or more image sensors with a zoom lens.
50. A method for omnidirectionally sensing images of a scene according to claim 49, further comprising the steps of: locating an area of interest within said scene with said zoom lens set at a first power of magnification; and magnifying said area of interest by setting said zoom lens at a second power of magnification greater than said first power of magnification.
51. A method for omnidirectionally sensing images of a scene from a single viewpoint, the method comprising the steps of:
(a) mounting a truncated, substantially paraboloid-shaped reflector on a movable base;
(b) mounting one or more image sensors on a fixed base;
(c) orthographically reflecting principal rays of electromagnetic radiation radiating from said scene on said substantially paraboloid-shaped reflector such that said single viewpoint of said omnidirectional imaging method coincides with a focus point of said paraboloid-shaped reflector;
(d) telecentrically filtering out a substantial portion of any principal rays of electromagnetic radiation which are not orthographically reflected from said paraboloidal-shaped reflector;
(e) moving said movable base to a first position;
(f) sensing a first image of said scene having a first field of view by sensing said orthographically reflected principal rays of electromagnetic radiation from said paraboloidal-shaped reflector with said one or more image sensors;
(g) moving said movable base to a second position different from said first position; and
(h) sensing a second image of said scene having a second field of view by sensing said orthographically reflected principal rays of electromagnetic radiation from said paraboloidal-shaped reflector with said one or more image sensors.
52. A method for omnidirectionally sensing images of a scene according to claim 51, further comprising the step of optically coupling said substantially paraboloid-shaped reflector and said one or more image sensors with a zoom lens.
53. A method for omnidirectionally sensing images of a scene according to claim 52, further comprising the steps of: locating an area of interest within said scene with said zoom lens set at a first power of magnification; and magnifying said area of interest by setting said zoom lens at a second power of magnification greater than said first power of magnification.
54. An omnidirectional imaging apparatus for sensing an image of a scene from a single viewpoint, comprising:
(a) a truncated, substantially paraboloid-shaped reflector positioned to orthographically reflect principal rays of electromagnetic radiation radiating from said scene, said paraboloid-shaped reflector having a focus coincident with said single viewpoint of said omnidirectional imaging apparatus, including said paraboloid-shaped reflector;
(b) telecentric means, optically coupled to said paraboloid-shaped reflector, for substantially filtering out principal rays of electromagnetic radiation which are not orthographically reflected by said paraboloid-shaped reflector;
(c) a plurality of beam splitters for splitting said orthographically reflected principal rays of electromagnetic radiation into a plurality of ray bundles, each ray bundle comprising a portion of said orthographically reflected principal rays of electromagnetic radiation from said paraboloid-shaped reflector; and
(d) a plurality of image sensors, each image sensor positioned to receive at least one of said plurality of ray bundles, each image sensor thereby sensing a portion of said image of said scene.
55. An omnidirectional imaging apparatus for sensing an image of a scene from a single viewpoint, comprising:
(a) a truncated, substantially paraboloid-shaped reflector positioned to orthographically reflect principal rays of electromagnetic radiation radiating from said scene, said paraboloid-shaped reflector having a focus coincident with said single viewpoint of said omnidirectional imaging apparatus, including said paraboloid-shaped reflector;
(b) telecentric means, optically coupled to said paraboloid-shaped reflector, for substantially filtering out principal rays of electromagnetic radiation which are not orthographically reflected by said paraboloid-shaped reflector;
(c) a plurality of dichroic beam splitters for splitting said orthographically reflected principal rays of electromagnetic radiation into a plurality of monochromatic principal rays of electromagnetic radiation; and
(d) a plurality of image sensors, each image sensor positioned to receive at least one of said plurality of monochromatic principal rays of electromagnetic radiation, thereby sensing at least one monochromatic image of said scene.
PCT/US1998/025689 1997-12-05 1998-12-04 An omnidirectional imaging apparatus WO1999030197A1 (en)

Priority Applications (6)

Application Number Priority Date Filing Date Title
KR1020007006159A KR100599423B1 (en) 1997-12-05 1998-12-04 An omnidirectional imaging apparatus
EP98963782A EP1042697A1 (en) 1997-12-05 1998-12-04 An omnidirectional imaging apparatus
BR9813370-5A BR9813370A (en) 1997-12-05 1998-12-04 Omnidirectional imaging device.
AU19033/99A AU1903399A (en) 1997-12-05 1998-12-04 An omnidirectional imaging apparatus
CA002312970A CA2312970A1 (en) 1997-12-05 1998-12-04 An omnidirectional imaging apparatus
JP2000524698A JP2001526471A (en) 1997-12-05 1998-12-04 Omnidirectional imaging device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US08/986,082 1997-12-05
US08/986,082 US6118474A (en) 1996-05-10 1997-12-05 Omnidirectional imaging apparatus

Publications (1)

Publication Number Publication Date
WO1999030197A1 true WO1999030197A1 (en) 1999-06-17

Family

ID=25532061

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1998/025689 WO1999030197A1 (en) 1997-12-05 1998-12-04 An omnidirectional imaging apparatus

Country Status (10)

Country Link
US (1) US6118474A (en)
EP (1) EP1042697A1 (en)
JP (1) JP2001526471A (en)
KR (1) KR100599423B1 (en)
CN (1) CN1290355A (en)
AU (1) AU1903399A (en)
BR (1) BR9813370A (en)
CA (1) CA2312970A1 (en)
RU (1) RU2201607C2 (en)
WO (1) WO1999030197A1 (en)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2000074018A1 (en) * 1999-06-02 2000-12-07 Cyclovision Technologies, Inc. Omni-directional security and lighting system
WO2001071424A1 (en) * 2000-03-22 2001-09-27 Egg Solution Optronics Sa Magnifying device for panoramic anamorphic imaging system
WO2001071423A1 (en) * 2000-03-22 2001-09-27 Egg Solution_Optronics Sa Panoramic image acquisition device
WO2002097730A1 (en) * 2001-05-25 2002-12-05 Matsushita Electric Industrial Co., Ltd. Wide-angle image generating device
FR2830128A1 (en) * 2001-09-26 2003-03-28 Egg Solution Sa Photoelectric sensor with sensing elements of pixels laid out in concentric circles, and device comprising such sensor for acquisition of panoramic images
DE10158415A1 (en) * 2001-11-29 2003-06-18 Daimler Chrysler Ag Method for monitoring the interior of a vehicle, as well as a vehicle with at least one camera in the vehicle interior
EP1377041A2 (en) * 2002-06-27 2004-01-02 Microsoft Corporation Integrated design for omni-directional camera and microphone array
FR2842306A1 (en) * 2002-07-12 2004-01-16 Egg Solution Optronics Wide angle electromagnetic wave acquisition sensor system has part coated infra red and optical reflector with filtering and processing to control zoom camera
KR100442733B1 (en) * 2000-05-23 2004-08-02 샤프 가부시키가이샤 Omniazimuthal visual system
EP1677535A1 (en) * 2004-12-30 2006-07-05 Microsoft Corporation Camera lens shuttering mechanism
US8256902B2 (en) 2008-06-04 2012-09-04 Satoru Yoshii Entire visual-field projection device, and entire-visual-field image system

Families Citing this family (92)

Publication number Priority date Publication date Assignee Title
US6111702A (en) * 1995-11-30 2000-08-29 Lucent Technologies Inc. Panoramic viewing system with offset virtual optical centers
US6493032B1 (en) * 1996-06-24 2002-12-10 Be Here Corporation Imaging arrangement which allows for capturing an image of a view at different resolutions
US6226035B1 (en) 1998-03-04 2001-05-01 Cyclo Vision Technologies, Inc. Adjustable imaging system with wide angle capability
US6333749B1 (en) * 1998-04-17 2001-12-25 Adobe Systems, Inc. Method and apparatus for image assisted modeling of three-dimensional scenes
US6337683B1 (en) 1998-05-13 2002-01-08 Imove Inc. Panoramic movies which simulate movement through multidimensional space
US6304285B1 (en) * 1998-06-16 2001-10-16 Zheng Jason Geng Method and apparatus for omnidirectional imaging
US20010015751A1 (en) * 1998-06-16 2001-08-23 Genex Technologies, Inc. Method and apparatus for omnidirectional imaging
US6545702B1 (en) * 1998-09-08 2003-04-08 Sri International Method and apparatus for panoramic imaging
DE60014317T2 (en) * 1999-01-04 2005-10-06 Cyclovision Technologies, Inc. DEVICE FOR RECORDING PANORAMIC IMAGES
FI114244B (en) * 1999-05-11 2004-09-15 Teknillinen Korkeakoulu Camera system and monitor
US6738073B2 (en) * 1999-05-12 2004-05-18 Imove, Inc. Camera system with both a wide angle view and a high resolution view
US6690374B2 (en) * 1999-05-12 2004-02-10 Imove, Inc. Security camera system for tracking moving objects in both forward and reverse directions
US7050085B1 (en) 2000-10-26 2006-05-23 Imove, Inc. System and method for camera calibration
US7015954B1 (en) * 1999-08-09 2006-03-21 Fuji Xerox Co., Ltd. Automatic video system using multiple cameras
US7256834B1 (en) 2000-03-17 2007-08-14 Axis, Ab Digital camera having panning and/or tilting functionality, and an image rotating device for such a camera
JP3549463B2 (en) * 2000-06-02 2004-08-04 松下電器産業株式会社 Camera device
KR100343836B1 (en) * 2000-06-27 2002-07-20 이성환 Panorama video surveillance system and controlling method therefore
JP4750927B2 (en) * 2000-06-30 2011-08-17 日本ネットワークサービス株式会社 Remote monitoring method and monitoring control server
SE519734C2 (en) * 2000-07-07 2003-04-01 Axis Ab Image changing device for an image generating device and method and digital camera for the same
JP2002196438A (en) * 2000-10-20 2002-07-12 Matsushita Electric Ind Co Ltd Wide angle image pickup apparatus
JP3804916B2 (en) * 2001-02-09 2006-08-02 シャープ株式会社 Imaging system, program used for controlling image data thereof, method for correcting distortion of captured image in imaging system, and storage medium storing procedure thereof
US20020147773A1 (en) * 2001-02-24 2002-10-10 Herman Herman Method and system for panoramic image generation using client-server architecture
CA2439082A1 (en) * 2001-02-24 2002-09-06 Eyesee360, Inc. Method and apparatus for processing photographic images
US6594448B2 (en) 2001-02-24 2003-07-15 Eyesee360, Inc. Radially-oriented planar surfaces for flare reduction in panoramic cameras
US6856472B2 (en) * 2001-02-24 2005-02-15 Eyesee360, Inc. Panoramic mirror and system for producing enhanced panoramic images
US6963355B2 (en) * 2001-02-24 2005-11-08 Eyesee380, Inc. Method and apparatus for eliminating unwanted mirror support images from photographic images
US6831643B2 (en) 2001-04-16 2004-12-14 Lucent Technologies Inc. Method and system for reconstructing 3D interactive walkthroughs of real-world environments
JP2002334322A (en) * 2001-05-10 2002-11-22 Sharp Corp System, method and program for perspective projection image generation, and storage medium stored with perspective projection image generating program
US7362969B2 (en) * 2001-05-29 2008-04-22 Lucent Technologies Inc. Camera model and calibration procedure for omnidirectional paraboloidal catadioptric cameras
US6937266B2 (en) * 2001-06-14 2005-08-30 Microsoft Corporation Automated online broadcasting system and method using an omni-directional camera system for viewing meetings over a computer network
US6744569B2 (en) 2001-06-19 2004-06-01 Genex Technologies, Inc Method and apparatus for omnidirectional three dimensional imaging
WO2003027766A2 (en) * 2001-09-27 2003-04-03 Eyesee360, Inc. System and method for panoramic imaging
US7058239B2 (en) * 2001-10-29 2006-06-06 Eyesee360, Inc. System and method for panoramic imaging
JP2003223633A (en) * 2002-01-29 2003-08-08 Sharp Corp Omnidirectional visual system
US7904826B2 (en) * 2002-03-29 2011-03-08 Microsoft Corporation Peek around user interface
US7429996B2 (en) * 2002-07-16 2008-09-30 Intel Corporation Apparatus and method for sensing depth in every direction
US7042508B2 (en) * 2002-07-26 2006-05-09 Appro Technology Inc. Method for presenting fisheye-camera images
US7133031B2 (en) * 2002-10-31 2006-11-07 Microsoft Corporation Optical system design for a universal computing device
US20040184653A1 (en) * 2003-03-20 2004-09-23 Baer Richard L. Optical inspection system, illumination apparatus and method for use in imaging specular objects based on illumination gradients
US20040254424A1 (en) * 2003-04-15 2004-12-16 Interscience, Inc. Integrated panoramic and forward view endoscope
US7450165B2 (en) * 2003-05-02 2008-11-11 Grandeye, Ltd. Multiple-view processing in wide-angle video camera
US7313285B2 (en) * 2003-05-30 2007-12-25 Lucent Technologies Inc. Method and apparatus for compressing and decompressing images captured from viewpoints throughout N-dimensional space
US7126603B2 (en) * 2003-05-30 2006-10-24 Lucent Technologies Inc. Method and system for creating interactive walkthroughs of real-world environment from set of densely captured images
US7356164B2 (en) * 2003-05-30 2008-04-08 Lucent Technologies Inc. Method and apparatus for finding feature correspondences between images captured in real-world environments
US8896660B2 (en) * 2003-05-30 2014-11-25 Alcatel Lucent Method and apparatus for computing error-bounded position and orientation of panoramic cameras in real-world environments
US7118228B2 (en) * 2003-11-04 2006-10-10 Hewlett-Packard Development Company, L.P. Image display system
US7268956B2 (en) * 2003-11-24 2007-09-11 Electronic Scripting Products, Inc. Solid catadioptric lens with two viewpoints
US7038846B2 (en) * 2003-11-24 2006-05-02 Electronic Scripting Products, Inc. Solid catadioptric lens with a single viewpoint
GB2408661B (en) * 2003-11-27 2008-02-06 Sony Comp Entertainment Europe Image rendering
US7548803B2 (en) * 2004-01-21 2009-06-16 Maccarthy James Vehicle surveillance and control system
US8130827B2 (en) 2004-08-13 2012-03-06 Samsung Electronics Co., Ltd. Method and apparatus for interpolating a reference pixel in an annular image and encoding/decoding an annular image
WO2006030488A1 (en) * 2004-09-14 2006-03-23 Fujitsu Limited Image processor, image processing method and image processing program
EP1729102B1 (en) * 2005-05-24 2019-04-24 Yonathan Gerlitz Detector with miniature optics for constant energy collection from different distances
US20070045522A1 (en) * 2005-09-01 2007-03-01 Yi-Tsung Chien Omnidirectional electromagnetic sensing device
CN100469137C (en) * 2006-06-19 2009-03-11 浙江工业大学 Omnibearing monitor and control sighting device of considering sensory function in the mind
US9361943B2 (en) * 2006-11-07 2016-06-07 The Board Of Trustees Of The Leland Stanford Jr. University System and method for tagging objects in a panoramic video and associating functions and indexing panoramic images with same
WO2008066742A1 (en) * 2006-11-22 2008-06-05 Geng Z Jason Wide field-of-view reflector and method of designing and making same
US20090073254A1 (en) * 2007-09-17 2009-03-19 Hui Li Omnidirectional imaging system with concurrent zoom
US8077401B2 (en) * 2007-10-03 2011-12-13 Ricoh Co., Ltd. Catadioptric imaging system
US8203596B1 (en) * 2007-12-20 2012-06-19 Lockheed Martin Corporation Panoramic imaging system with dual imagers
JP4787292B2 (en) * 2008-06-16 2011-10-05 富士フイルム株式会社 Omni-directional imaging device
US8587770B1 (en) 2008-09-24 2013-11-19 Jetprotect Corporation Aircraft collision warning system
US9100562B2 (en) * 2009-04-13 2015-08-04 Massachusetts Institute Of Technology Methods and apparatus for coordinated lens and sensor motion
CN101923179B (en) * 2009-11-06 2013-04-24 中国科学院空间科学与应用研究中心 All-sky atmosphere gravitational wave imager
CN101706588B (en) * 2009-11-10 2013-04-24 中国科学院空间科学与应用研究中心 All-sky atmospheric gravity wave imaging instrument adopting fish eye lens and telecentric beam path
KR101091564B1 (en) 2009-12-22 2011-12-13 연세대학교 산학협력단 Omnidirectional camera
FI20105058A0 (en) * 2010-01-22 2010-01-22 Valtion Teknillinen Omnidirectional lens, lens utilizing optical devices and optical measurement method
KR20110120590A (en) 2010-04-29 2011-11-04 삼성전자주식회사 Optical system and image projection apparatus
DE102010026572B4 (en) * 2010-07-08 2013-10-31 Michael Kanna Method for recording and reproducing panorama representations
CN102043322A (en) * 2010-11-09 2011-05-04 浙江工业大学 Portable type 360-degree circular-screen theater system
WO2012145317A1 (en) * 2011-04-18 2012-10-26 Eyesee360, Inc. Apparatus and method for panoramic video imaging with mobile computing devices
DE112012005632A5 (en) * 2012-01-11 2014-10-23 Michael Kanna Method and device for recording and reproducing panorama displays
FR2986337B1 (en) * 2012-01-31 2014-09-05 Jean-Pierre Lauret OPTICAL SYSTEM FOR MEASURING BRDF, BSDF AND BDTF
CN102736396B (en) * 2012-07-23 2015-02-04 中国人民解放军国防科学技术大学 Hyperbolic concave refractive and reflective panorama camera and making method and application thereof
US9071752B2 (en) * 2012-09-25 2015-06-30 National Chiao Tung University Scene imaging method using a portable two-camera omni-imaging device for human-reachable environments
FR3007613A1 (en) * 2013-06-20 2014-12-26 Eurekam SHOOTING DEVICE FOR SECURE PREPARATION OF MEDICINAL PREPARATIONS, ASSOCIATED OBJECT POSITIONING MEDIUM, AND SYSTEM INCLUDING SUCH A DEVICE AND SUCH A MEDIUM
US9817243B2 (en) * 2015-01-27 2017-11-14 Microsoft Technology Licensing, Llc Imaging apparatus
US9979885B2 (en) * 2015-02-09 2018-05-22 Steven Christopher Sparks Apparatus and method for capture of 360° panoramic video image and simultaneous assembly of 360° panoramic zoetropic video image
EP3275172B1 (en) 2015-03-24 2021-12-15 Battelle Memorial Institute Imaging system and method of creating images
DE102015215836B4 (en) 2015-08-19 2017-05-18 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Multiaperture imaging device with a reflective facet beam deflection device
JP6833348B2 (en) * 2016-05-25 2021-02-24 キヤノン株式会社 Information processing device, image processing system, information processing device control method, virtual viewpoint image generation method, and program
TWI773677B (en) * 2017-06-30 2022-08-11 揚明光學股份有限公司 Wide-angle projection lens
WO2019117569A1 (en) 2017-12-14 2019-06-20 Samsung Electronics Co., Ltd. Method and apparatus for managing immersive data
CN108055513A (en) * 2017-12-31 2018-05-18 北京机械设备研究所 Panoramic video security monitoring device
JP7129654B2 (en) * 2018-04-04 2022-09-02 パナソニックIpマネジメント株式会社 Infrared detector
US10951859B2 (en) 2018-05-30 2021-03-16 Microsoft Technology Licensing, Llc Videoconferencing device and method
US10573060B1 (en) * 2018-06-14 2020-02-25 Kilburn Live, Llc Controller binding in virtual domes
US10740957B1 (en) * 2018-06-14 2020-08-11 Kilburn Live, Llc Dynamic split screen
WO2020197532A1 (en) * 2019-03-22 2020-10-01 Source Photonics, Inc. System and method for transferring optical signals in photonic devices and method of making the system
WO2020264098A1 (en) * 2019-06-25 2020-12-30 Intelligent Commute Llc Extensiview and adaptive lka for adas and autonomous driving
CN110441311B (en) * 2019-07-22 2021-10-08 中国科学院上海光学精密机械研究所 Multi-axis and multi-focus lens for multi-object plane imaging
CN115327849B (en) * 2022-09-05 2024-02-06 同济人工智能研究院(苏州)有限公司 Panoramic lens and gas monitoring equipment

Citations (5)

Publication number Priority date Publication date Assignee Title
US3505465A (en) * 1967-04-21 1970-04-07 Us Army Panoramic television viewing system
US4549208A (en) * 1982-12-22 1985-10-22 Hitachi, Ltd. Picture processing apparatus
US5530650A (en) * 1992-10-28 1996-06-25 Mcdonnell Douglas Corp. Computer imaging system and method for remote in-flight aircraft refueling
WO1997043854A1 (en) * 1996-05-10 1997-11-20 The Trustees Of Columbia University In The City Of New York Omnidirectional imaging apparatus
WO1997050252A1 (en) * 1996-06-24 1997-12-31 Behere Corporation Panoramic camera

Family Cites Families (20)

Publication number Priority date Publication date Assignee Title
US2638033A (en) * 1950-12-19 1953-05-12 Buchele Donald Robert Unitary catadioptric objective lens system
US4045116A (en) * 1975-02-27 1977-08-30 Russa Joseph A Wide angle optical imaging system
US4136926A (en) * 1978-04-14 1979-01-30 The United States Of America As Represented By The United States Department Of Energy Achromatic illumination system for small targets
US4395093A (en) * 1981-05-21 1983-07-26 The United States Of America As Represented By The Secretary Of The Navy Lens system for panoramic imagery
US4421721A (en) * 1981-10-02 1983-12-20 The Board Of Trustees Of The Leland Stanford Junior University Apparatus for growing crystal fibers
HU192125B (en) * 1983-02-08 1987-05-28 Budapesti Mueszaki Egyetem Block of forming image for centre theory projection adn reproduction of spaces
USD312263S (en) 1987-08-03 1990-11-20 Charles Jeffrey R Wide angle reflector attachment for a camera or similar article
US4820048A (en) * 1987-11-19 1989-04-11 The Perkin-Elmer Corporation Detector for a spectrometer
US5029963A (en) * 1990-02-15 1991-07-09 Itt Corporation Replacement device for a driver's viewer
JPH04185075A (en) * 1990-11-20 1992-07-01 Canon Inc Color picture processor
US5185667A (en) * 1991-05-13 1993-02-09 Telerobotics International, Inc. Omniview motionless camera orientation system
US5359363A (en) * 1991-05-13 1994-10-25 Telerobotics International, Inc. Omniview motionless camera surveillance system
US5563650A (en) * 1992-11-24 1996-10-08 Geeris Holding Nederland B.V. Method and device for producing panoramic images, and a method and device for consulting panoramic images
US5473474A (en) * 1993-07-16 1995-12-05 National Research Council Of Canada Panoramic lens
US5610391A (en) * 1994-08-25 1997-03-11 Owens-Brockway Glass Container Inc. Optical inspection of container finish dimensional parameters
FR2731896B1 (en) * 1995-03-24 1997-08-29 Commissariat Energie Atomique DEVICE FOR MEASURING THE POSITION OF THE FIXING POINT OF AN EYE ON A TARGET, METHOD FOR LIGHTING THE EYE AND APPLICATION TO THE DISPLAY OF IMAGES OF WHICH THE IMAGES CHANGE ACCORDING TO THE MOVEMENTS OF THE EYE
JPH08275066A (en) * 1995-03-29 1996-10-18 Toshiba Corp Panoramic camera
CA2146406A1 (en) * 1995-04-05 1996-10-06 Ian Powell Panoramic fish-eye imaging system
US5627675A (en) * 1995-05-13 1997-05-06 Boeing North American Inc. Optics assembly for observing a panoramic scene
US5539483A (en) * 1995-06-30 1996-07-23 At&T Corp. Panoramic projection apparatus


Cited By (17)

Publication number Priority date Publication date Assignee Title
WO2000074018A1 (en) * 1999-06-02 2000-12-07 Cyclovision Technologies, Inc. Omni-directional security and lighting system
WO2001071424A1 (en) * 2000-03-22 2001-09-27 Egg Solution Optronics Sa Magnifying device for panoramic anamorphic imaging system
WO2001071423A1 (en) * 2000-03-22 2001-09-27 Egg Solution_Optronics Sa Panoramic image acquisition device
FR2806809A1 (en) * 2000-03-22 2001-09-28 Powell Group PANORAMIC IMAGE AQUISITION DEVICE
KR100442733B1 (en) * 2000-05-23 2004-08-02 샤프 가부시키가이샤 Omniazimuthal visual system
WO2002097730A1 (en) * 2001-05-25 2002-12-05 Matsushita Electric Industrial Co., Ltd. Wide-angle image generating device
US7312810B2 (en) 2001-05-25 2007-12-25 Matsushita Electric Industrial Co., Ltd. Wide-angle image generating device
FR2830128A1 (en) * 2001-09-26 2003-03-28 Egg Solution Sa Photoelectric sensor with sensing elements of pixels laid out in concentric circles, and device comprising such sensor for acquisition of panoramic images
DE10158415A1 (en) * 2001-11-29 2003-06-18 Daimler Chrysler Ag Method for monitoring the interior of a vehicle, as well as a vehicle with at least one camera in the vehicle interior
DE10158415C2 (en) * 2001-11-29 2003-10-02 Daimler Chrysler Ag Method for monitoring the interior of a vehicle, as well as a vehicle with at least one camera in the vehicle interior
EP1377041A2 (en) * 2002-06-27 2004-01-02 Microsoft Corporation Integrated design for omni-directional camera and microphone array
EP1377041A3 (en) * 2002-06-27 2004-08-25 Microsoft Corporation Integrated design for omni-directional camera and microphone array
CN1479525B (en) * 2002-06-27 2010-05-12 微软公司 System and method for capturing audio and video frequency data
FR2842306A1 (en) * 2002-07-12 2004-01-16 Egg Solution Optronics Wide angle electromagnetic wave acquisition sensor system has part coated infra red and optical reflector with filtering and processing to control zoom camera
EP1677535A1 (en) * 2004-12-30 2006-07-05 Microsoft Corporation Camera lens shuttering mechanism
US7812882B2 (en) 2004-12-30 2010-10-12 Microsoft Corporation Camera lens shuttering mechanism
US8256902B2 (en) 2008-06-04 2012-09-04 Satoru Yoshii Entire visual-field projection device, and entire-visual-field image system

Also Published As

Publication number Publication date
KR20010024698A (en) 2001-03-26
CN1290355A (en) 2001-04-04
EP1042697A1 (en) 2000-10-11
CA2312970A1 (en) 1999-06-17
US6118474A (en) 2000-09-12
KR100599423B1 (en) 2006-07-10
JP2001526471A (en) 2001-12-18
BR9813370A (en) 2001-08-28
RU2201607C2 (en) 2003-03-27
AU1903399A (en) 1999-06-28

Similar Documents

Publication Publication Date Title
US6118474A (en) Omnidirectional imaging apparatus
EP0897636B1 (en) Omnidirectional imaging apparatus
US6313865B1 (en) Method and apparatus for implementing a panoptic camera system
US8213087B2 (en) Integrated panoramic and forward optical device, system and method for omnidirectional signal processing
Nayar Catadioptric omnidirectional camera
JP3485261B2 (en) System and method for electronic imaging and processing of a hemispherical field of view
US6856472B2 (en) Panoramic mirror and system for producing enhanced panoramic images
Nayar Omnidirectional video camera
US7298548B2 (en) Multi-directional viewing and imaging
Nayar Omnidirectional vision
US20030081952A1 (en) Method and apparatus for omnidirectional three dimensional imaging
JP2006011103A (en) Spherical mirror imaging apparatus
KR100482727B1 (en) Omnidirectional imaging device and method
CA2251879C (en) Omnidirectional imaging apparatus
MXPA00005531A (en) An omnidirectional imaging apparatus
JP2002262157A (en) Multiple focus omnidirectional image pickup device
WO2002069035A2 (en) Panoramic mirror and system for producing panoramic images

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase
    Ref document number: 98813411.X
    Country of ref document: CN
AK Designated states
    Kind code of ref document: A1
    Designated state(s): AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GE GH GM HR HU ID IL IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG US UZ VN YU ZW
AL Designated countries for regional patents
    Kind code of ref document: A1
    Designated state(s): GH GM KE LS MW SD SZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG
121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
REG Reference to national code
    Ref country code: DE
    Ref legal event code: 8642
ENP Entry into the national phase
    Ref document number: 2312970
    Country of ref document: CA
    Ref document number: 2312970
    Country of ref document: CA
    Kind code of ref document: A
    Ref document number: 2000 524698
    Country of ref document: JP
    Kind code of ref document: A
WWE Wipo information: entry into national phase
    Ref document number: 1020007006159
    Country of ref document: KR
    Ref document number: PA/a/2000/005531
    Country of ref document: MX
WWE Wipo information: entry into national phase
    Ref document number: 1998963782
    Country of ref document: EP
WWP Wipo information: published in national office
    Ref document number: 1998963782
    Country of ref document: EP
WWP Wipo information: published in national office
    Ref document number: 1020007006159
    Country of ref document: KR
WWW Wipo information: withdrawn in national office
    Ref document number: 1998963782
    Country of ref document: EP
WWG Wipo information: grant in national office
    Ref document number: 1020007006159
    Country of ref document: KR