US20050059886A1 - Method and system for creating task-dependent three-dimensional images - Google Patents
- Publication number: US20050059886A1 (application US 10/958,972)
- Authority: US (United States)
- Prior art keywords: image, images, projected, source, fiducial
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N23/00—Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00
- G01N23/02—Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00 by transmitting the radiation through the material
- G01N23/04—Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00 by transmitting the radiation through the material and forming images of the material
- G01N23/046—Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00 by transmitting the radiation through the material and forming images of the material using tomography, e.g. computed tomography [CT]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
- A61B6/52—Devices using data or image processing specially adapted for radiation diagnosis
- A61B6/5258—Devices using data or image processing specially adapted for radiation diagnosis involving detection or reduction of artifacts or noise
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
- A61B6/54—Control of apparatus or devices for radiation diagnosis
- A61B6/548—Remote control of the apparatus or devices
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N2223/00—Investigating materials by wave or particle radiation
- G01N2223/40—Imaging
- G01N2223/419—Imaging computed tomograph
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10—TECHNICAL SUBJECTS COVERED BY FORMER USPC
- Y10S—TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10S378/00—X-ray or gamma ray systems or devices
- Y10S378/901—Computer tomography program or processor
Definitions
- the present invention relates to a method and system for creating three-dimensional displays or images from a multiplicity of two-dimensional projected images and, more specifically, to a method and system for producing task-dependent radiographic images of an object of interest which are substantially free of blurring artifacts.
- imaging aperture approaches zero (i.e., a pinhole) and the resulting display is characterized by images produced from a single transmission radiograph. This yields an infinitely wide depth of field and therefore no depth information can be extracted from the image.
- the aperture approaches a surrounding ring delimiting an infinite numerical aperture resulting in projection angles orthogonal to the long axis of the irradiated object. This yields an infinitely narrow depth of field and hence no information about adjacent slices through the object can be ascertained. It therefore follows that a “middle ground” approach, which provides the ability to adapt a sampling aperture to a particular task, would be highly advantageous.
- the key to achieving the full potential of diagnostic flexibility lies in the fact that perceptually meaningful three-dimensional reconstructions can be produced from optical systems having any number of different aperture functions. That fact can be exploited since any aperture can be approximated by summation of a finite number of appropriately distributed point apertures.
- the key is to map all incrementally obtained projective data into a single three dimensional matrix. To accomplish this goal, one needs to ascertain all positional degrees of freedom existing between the object of interest, the source of radiation, and the detector.
- the relative positions of the object, the source, and the detector have been determined by fixing the position of the object relative to the detector while the source of radiation is moved along a predetermined path, i.e. a path of known or fixed geometry. Projective images of the object are then recorded at known positions of the source of radiation. In this way, the relative positions of the source of radiation, the object of interest, and the detector can be determined for each recorded image.
- each incrementally obtained projective image is mapped into a single three-dimensional matrix.
- the mapping is performed by laterally shifting and summing the projective images to yield tomographic images at a selected slice position through the object of interest.
- a three-dimensional representation of the object can be obtained by repeating the mapping process for a series of slice positions through the object.
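The shift-and-sum mapping described above can be sketched in a few lines. This is a minimal illustration of linear tomosynthesis, assuming integer pixel shifts that have already been derived from the projection geometry for the chosen slice plane; the function name and array conventions are hypothetical, not from the patent.

```python
import numpy as np

def shift_and_add(projections, shifts):
    """Linear tomosynthesis for one slice plane.

    Each projection is laterally shifted by the (dy, dx) offset that
    superimposes details lying in the selected slice, then the shifted
    projections are averaged.  In-plane details reinforce; out-of-plane
    details are distributed (blurred) across the slice.
    """
    acc = np.zeros_like(projections[0], dtype=float)
    for img, (dy, dx) in zip(projections, shifts):
        acc += np.roll(np.roll(img, dy, axis=0), dx, axis=1)
    return acc / len(projections)
```

Repeating the call with the shift set for each slice position yields the stack of slices that forms the three-dimensional representation.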
- the quality and independence of the tomographic images is compromised by blurring artifacts produced from unregistered details located outside the plane of reconstruction.
- the present invention relates to a system and a method for synthesizing an image slice through a selected object from a plurality of projected radiographic images of the selected object.
- the system comprises a radiation source for irradiating the object.
- the preferred radiation source depends upon the particular application.
- the present invention may be practiced using x-rays, electron microscopy, ultrasound, visible light, infrared light, ultraviolet light, microwaves, or virtual radiation simulated by manipulation of magnetic fields (magnetic resonance imaging (MRI)).
- the position of the radiation source within a plane parallel to an image plane is determined from projected images of two object points associated with a fiducial reference which is maintained in fixed position relative to the selected object. Once the projected images are compensated for differences in magnification, the relative position of the radiation source within the plane parallel to the image plane is determined from an estimate of the actual distance between the two object points obtained from a sinusoidal fit of the distances between the projected images of the object points.
- a recording medium or radiation detector is used to record a series of projected images of the selected object.
- the recording medium may be in the form of a photographic plate or a radiation-sensitive, solid-state image detector such as a charge-coupled device (CCD), or any other system capable of producing two-dimensional projections or images suitable for digitization or other analysis.
- An image synthesizer is provided for transforming the series of projected images of the selected object into an image slice.
- the image slice consists of an array of pixels with each pixel having an associated attenuation value and corresponds to a cross-sectional slice through the selected object at a selected slice position.
- a three-dimensional representation of the object can be obtained by repeating the transformation at a series of slice positions through the object.
- an optional source comparator is provided for adjusting the radiation source to enable meaningful quantitative comparisons between projected images recorded at different times and/or using different radiation sources.
- the source comparator is positionable between the radiation source and the radiographic medium for producing a gradient image indicative of characteristics associated with the output from the radiation source.
- the source comparator is used to record a first gradient image using a first radiation source at the same time that a first projected image or series of projected images is recorded.
- the source comparator is used to record a second gradient image.
- the second gradient image is compared to the first gradient image, and differences between the two gradient images are noted.
- the beam energy, filtration, and beam exposure associated with the radiation source used to record the second gradient image are then adjusted to minimize the differences between the first gradient image and the second gradient image.
- the source comparator comprises two wedges or five-sided polyhedrons of equal dimension having a rectangular base and two right-triangular faces.
- the triangular faces lie in parallel planes at opposite edges of the base such that the triangular faces are oriented as mirror images of each other.
- each wedge has a tapered edge and provides a uniformly increasing thickness from the tapered edge in a direction parallel to the plane of the base and perpendicular to the tapered edge.
- the wedges are arranged with the base of one wedge adjacent to the base of the other wedge such that the tapered edges of the two wedges are at adjacent edges of the base.
- One wedge is formed from a uniform high attenuation material while the other wedge is formed from a uniform low attenuation material. Accordingly, when the source comparator is irradiated from a radiation source directed perpendicularly to the bases of the wedges, the resulting image will be a quadrilateral having an intensity gradient that is maximized in a particular direction.
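The transmitted intensity through the paired wedges can be sketched with the Beer-Lambert law. This is an illustrative simulation only: the function name, grid size, and the attenuation coefficients `mu_hi`, `mu_lo`, and maximum thickness `t_max` are assumed values, not taken from the patent.

```python
import numpy as np

def comparator_image(n=64, mu_hi=0.8, mu_lo=0.1, t_max=1.0):
    """Simulated radiograph of the two-wedge source comparator.

    Each wedge's thickness increases linearly from its tapered edge,
    and the two ramps run in perpendicular directions (the tapered
    edges sit at adjacent edges of the shared base).  Transmission
    follows Beer-Lambert: I = I0 * exp(-(mu_hi*t_hi + mu_lo*t_lo)).
    """
    x = np.linspace(0.0, t_max, n)
    t_hi = np.tile(x, (n, 1))           # high-attenuation wedge ramps along x
    t_lo = np.tile(x[:, None], (1, n))  # low-attenuation wedge ramps along y
    return np.exp(-(mu_hi * t_hi + mu_lo * t_lo))
```

Because `mu_hi > mu_lo`, the intensity falls off fastest along the high-attenuation ramp, giving the quadrilateral image its gradient direction.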
- the system of the present invention is used to produce an image slice through the selected object that is substantially free of blurring artifacts from unregistered details located outside a plane of reconstruction.
- the radiation source and recording medium are used to record a series of two-dimensional projected images of the selected object.
- the series of two-dimensional projected images are then shifted by an amount and in a direction required to superimpose the object images of the two-dimensional images.
- the shifted two-dimensional images can then be combined in a non-linear manner to generate a tomosynthetic slice through the selected object.
- the two-dimensional images are combined by selecting details from a single projection demonstrating the most relative attenuation at each pixel.
- a different non-linear operator could be used wherein the two-dimensional images are combined by selecting details from a single projection demonstrating the least relative attenuation at each pixel in the reconstructed image.
- a series of reconstructed images at varying slice positions through the selected object are determined to create a three-dimensional representation of the selected object.
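The non-linear combination step above can be sketched as follows: once the projections are shifted into register, each reconstructed pixel takes its value from the single projection showing the most (or least) relative attenuation, instead of the linear sum. The function and parameter names are illustrative.

```python
import numpy as np

def nonlinear_slice(shifted_projections, operator="max"):
    """Combine registered projections with a non-linear operator.

    At each pixel, keep the value from the one projection showing the
    most ("max") or least ("min") relative attenuation, suppressing
    blur from out-of-plane details that appear in only some projections.
    """
    stack = np.stack(shifted_projections)
    return stack.max(axis=0) if operator == "max" else stack.min(axis=0)
```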
- the system of the present invention is used to synthesize a three-dimensional reconstruction of the object from as few as two projected images of the object.
- a first projected image of the object is recorded in a first projection plane and a second projected image is recorded in a second projection plane.
- Each of the first and the second projected images is then rendered at a common magnification.
- the first and the second projected images are transformed to occupy the same volume.
- the transformed first and second projected images are then combined into a three-dimensional representation of the selected object. Additional projected images are optionally combined with the three-dimensional representation to refine the three-dimensional representation.
- the system of the present invention is used to synthesize a three-dimensional representation of the selected object from two or more sets of projected images of the selected object.
- the first and second sets of projected images are tomosynthetically transformed into a series of contiguous slices forming a first and a second three-dimensional volume, respectively, using previously disclosed methods (e.g., U.S. Pat. No. 5,668,844) or those in the public domain (e.g., tomosynthesis).
- the first and second three-dimensional volumes are then rendered at a common magnification.
- the second three-dimensional volume is then rotated by an angle corresponding to the angular disparity between the first and the second three-dimensional volumes.
- the rotated second three-dimensional volume is then merged with the first three-dimensional volume to produce a three-dimensional representation of the selected object.
- the system of the present invention can be used to determine temporal changes in the selected object.
- the radiation source and recording medium are used to record a first series of two-dimensional projected images of the selected object.
- the radiation source and recording medium are used to record a second series of two-dimensional projected images of the selected object.
- Both series are tomosynthetically converted into a series of slices via previously disclosed methods (TACT®) or those in the public domain (tomosynthesis).
- Each slice of the first series is then correlated with a corresponding slice of the second series to form pairs of correlated slices.
- Each pair of slices is then aligned to maximize the overlap between homologous structures.
- Each pair of correlated slices is then subtracted to produce a difference image.
- Each difference image is then displayed individually.
- all of the difference images can be overlapped to yield a complete difference image corresponding to the volumetric difference associated with the entire tomosynthetically reconstructed volume.
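The temporal-subtraction steps above reduce to a pairwise difference over correlated slices. This sketch assumes each slice pair has already been aligned so homologous structures overlap, and it uses a simple sum of the per-slice differences as a stand-in for the complete difference image of the reconstructed volume; the names are hypothetical.

```python
import numpy as np

def temporal_difference(slices_t1, slices_t2):
    """Subtract correlated, pre-aligned tomosynthetic slices from two
    time points.

    Returns the individual difference images (one per slice pair) and
    their overlap, corresponding to the volumetric difference over the
    whole tomosynthetically reconstructed volume.
    """
    diffs = [s2 - s1 for s1, s2 in zip(slices_t1, slices_t2)]
    return diffs, np.sum(diffs, axis=0)
```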
- the three-dimensional representation can be viewed holographically using a display in accordance with the present invention.
- the display comprises stereoscopic spectacles which are worn by an observer and a target operatively associated with the spectacles. Accordingly, as the observer changes his or her vantage point, movement of the spectacles translates into a corresponding movement of the target.
- a detector is operatively associated with the target for tracking movement of the target. The detector is connected to a monitor such that the monitor receives a signal from the detector indicative of movement of the target. In response to the signal from the detector, the monitor displays an image pair of the three-dimensional representation which, when viewed through the spectacles produces a stereoscopic effect. The image pair which is displayed is changed to compensate for changes in the vantage point of the observer.
- FIG. 1 is a schematic representation of a system for creating three-dimensional radiographic displays using computed tomography in accordance with the present invention;
- FIG. 2 is a flow chart showing the steps involved in creating three-dimensional radiographic displays using computed tomography in accordance with the present invention;
- FIG. 3 is a flow chart showing details of a method of projectively warping or transforming a projected image from an actual plane of projection onto a virtual projection plane;
- FIG. 4 is a schematic representation of a system having nine degrees of freedom in which a source is shifted and displaced relative to an original projection plane and in which a projection plane of a recording medium is shifted, rotated, displaced, and tilted relative to the original projection plane;
- FIG. 5 is a schematic representation showing an arrangement of reference markers in accordance with an embodiment of the present invention, wherein five spherical reference markers are positioned at five of the eight vertices of a cube;
- FIG. 6 is a schematic representation of a system having seven degrees of freedom in which an infinite point source is shifted relative to an original projection plane and in which a projection plane of a recording medium is shifted, displaced, and tilted relative to the original projection plane;
- FIG. 7 is a schematic representation of a system having four degrees of freedom in which an infinite point source is shifted relative to an original projection plane and in which a projection plane of a recording medium is shifted relative to the original projection plane;
- FIG. 8 is an exploded, schematic representation of a charge-coupled device (CCD) for use as a recording medium with intrinsic components enabling automated determination of projective geometry;
- FIG. 9 is a schematic representation of an embodiment of the present invention wherein the recording medium is smaller than the projected image of the object;
- FIG. 10 is a schematic representation of an embodiment of the present invention wherein the source is a hand-held X-ray source with a laser aiming device;
- FIG. 11 is a schematic representation of an embodiment of the present invention wherein the reference markers of the fiducial reference are positioned at the vertices of a square pyramid;
- FIG. 12 is a schematic representation of an embodiment of the present invention wherein the source is a hand-held X-ray source which is constrained relative to the recording medium by a C-arm;
- FIG. 13 is an enlarged schematic representation of the object of interest and the recording medium depicted in FIG. 14;
- FIG. 14 is a schematic representation of an embodiment of the present invention wherein the reference markers of the fiducial reference are positioned at the centers of the faces of a parallelepiped;
- FIG. 15 is a schematic representation of an embodiment of the present invention wherein the corners of a frame define four reference markers;
- FIG. 16 is a schematic representation of a reference image cast by a spherical reference marker showing the resulting brightness profile
- FIG. 17 is a schematic representation of the parameters associated with a system comprising three spherical, non-collinear reference markers wherein the orthogonal distance between the radiation source and the recording medium is fixed at a distance short enough so that the images cast by the reference markers are magnified relative to the size of the actual reference markers;
- FIG. 18 is a schematic representation of the relevant parameters associated with a reference image associated with a spherical reference marker
- FIG. 19 is a schematic representation of an embodiment of the present invention wherein the fiducial reference comprises a radiopaque shield with a ring-like aperture;
- FIG. 20 is a schematic, perspective view of an embodiment of the present invention, wherein the detector comprises a charge-coupled device (CCD) and the fiducial reference comprises a frame, shown with the front and a section of the top removed;
- FIG. 21 is a sectional view of the embodiment depicted in FIG. 20 taken along the 23-23 line;
- FIG. 22 is an alternate embodiment of a laser aiming device in accordance with the present invention.
- FIG. 23 is a flow chart showing the steps involved in a method for task-dependent tomosynthetic image reconstruction in accordance with the present invention.
- FIGS. 24A and 24B are schematic representations of a linear tomosynthetic reconstruction and a non-linear tomosynthetic reconstruction in accordance with the present invention;
- FIG. 25 is a flow chart showing the steps involved in a method for determining temporal changes in accordance with the present invention.
- FIG. 26 is a schematic representation of a source comparator used for matching X-ray sources
- FIG. 27 is a flow chart showing the steps of a method for using the source comparator of FIG. 26 ;
- FIG. 28 is a schematic representation of a pseudo-holographic image display
- FIG. 29 is a tomosynthetic slice through a human breast reconstructed using a linear summation of projected images
- FIG. 30 is a tomosynthetic slice through the human breast reconstructed using a linear summation of projected images augmented by a deconvolution filter.
- FIG. 31 is a tomosynthetic slice through the human breast reconstructed using a non-linear reconstruction scheme
- FIG. 32 is a flow chart showing the steps of a method for creating nearly isotropic three-dimensional images from a single pair of arbitrary two-dimensional images
- FIG. 33 is a flow chart showing the steps of a method for creating a three-dimensional image from two series of two-dimensional images
- FIG. 34 is a flow chart showing the steps of a method for producing a three-dimensional representation of a stationary object from multiple plane projections recorded by an arbitrarily positionable camera;
- FIG. 35 is a schematic representation of a three-dimensional scaling calibration for determining the relative position of a camera in two planes orthogonal to the projection plane of the camera;
- FIG. 36 is a schematic representation of a remote-controlled mobile radiation source
- FIG. 37 is a schematic representation of an embodiment of the present invention wherein a camera is used to record overlapping sets of projected images
- FIG. 38 is a schematic representation of an embodiment of the present invention wherein laser light sources provide fiducial reference points.
- FIG. 39 is a schematic representation of a three-dimensional image of an object produced from a single pair of arbitrary two-dimensional images.
- the present invention generally relates to a system 20 , as depicted schematically in FIG. 1 , for synthesizing an image of an object 21 at a selected slice position 35 through the object 21 from a plurality of radiographic projected images 38 of the selected object 21 .
- a fiducial reference 22 is held in a fixed position relative to the selected object 21 , for example, by directly attaching the fiducial reference 22 to the object 21 .
- the fiducial reference comprises two finite-sized, identifiable reference markers, 23 and 123, which are coupled together in a fixed geometry relative to each other by a radiolucent bar 24.
- the fiducial reference 22 may comprise various numbers and arrangements of reference markers 23 .
- the reference markers, 23 and 123 are provided by the reflection of laser light from the surface of the selected object.
- a radiation source 27 is provided to irradiate the object 21 along with the fiducial reference 22 . Irradiation of the object 21 casts a projected image 38 onto a recording medium 31 .
- the projected image 38 comprises an object image 40 of the object 21 and reference images, 39 and 139 , of the reference markers, 23 and 123 , respectively.
- the pattern of source 27 positions does not need to be in any fixed geometry or position. Indeed, the position of the source 27 may be totally arbitrary in translation and displacement relative to the object 21 . Likewise, the recording medium 31 may also be arbitrarily movable relative to the object 21 by translation, displacement, tilting, or rotation. The only requirement is that for every degree of freedom in the system resulting from movement of the source 27 or the recording medium 31 relative to the object 21 , the fiducial reference 22 must include sufficient measurable or defined characteristics, such as size, shape, or numbers of reference markers 23 , to account for each degree of freedom.
- the minimum number of reference markers required to completely determine the system depends on the constraints, if any, imposed on the relative positions of (1) the radiation source, (2) the object and fiducial reference, and (3) the recording medium.
- the system may have a total of nine possible relative motions (2 translations and 1 displacement for the radiation source relative to a desired projection plane, and 2 translations, 1 displacement, 2 tilts, and 1 rotation for the recording medium relative to the desired projection plane).
- Each of these possible relative motions must be capable of analysis either by constraining the system and directly measuring the quantity, by providing a sufficient number of reference markers to enable the quantity to be determined, or by estimating the value of the quantity.
- Each unconstrained relative motion represents a degree of freedom for the system. For a system to be completely determined, the total number of degrees of freedom in the system must be less than or equal to the total number of degrees of freedom associated with the fiducial reference.
- More than the minimum number of reference markers can be used. In such cases, the system is overdetermined and least-squares fitting can be used to improve the accuracy of the resulting image slices. If, however, fewer than the minimum number of reference markers are used, then the system is underdetermined and the unknown degrees of freedom must either be estimated or measured directly.
- Although the reference markers can be essentially any size and shape, spherical reference markers of known diameter may be used. When using spherical reference markers of a finite size, a single reference marker can account for up to five degrees of freedom.
- the reference image cast by the spherical reference marker is elliptical and is independent of any rotation of the reference marker. Determining the position of the reference image in the projection plane (X- and Y-coordinates) and the magnitudes of the major and minor diameters of the elliptical image accounts for four degrees of freedom.
- the reference image will be magnified relative to the actual size of the reference marker, thereby accounting for an additional degree of freedom.
- only two degrees of freedom are typically associated with the reference image of a point-size reference marker.
- The most complex, yet most generally applicable, arrangement is depicted in FIG. 4, wherein the radiation source 27 and the recording medium 31 are completely unconstrained and uncoupled from the selected object 21.
- in this arrangement there are nine degrees of freedom: 2 translational (ΔX and ΔY) and 1 displacement (ΔZ) degrees of freedom for the radiation source 27 relative to an original or desired projection plane 37, and 2 translational (ΔX′ and ΔY′), 1 displacement (ΔZ′), 2 tilting, and 1 rotational degrees of freedom for the recording medium 31 relative to the original or desired projection plane.
- a fiducial reference system sufficient to solve a projection system having nine degrees of freedom is needed to completely determine the system.
- One embodiment of the present invention that permits this general arrangement to be realized conveniently involves two-dimensional projected images from a system comprised of a fiducial reference having five point-size or finite reference markers.
- This approach conveniently facilitates three-dimensional reconstructions when exactly four reference markers are coplanar and no three reference markers are collinear. Under these conditions, only the projection from the non-coplanar marker need be distinguished from the other four, because the projections from the latter always bear a fixed sequential angular arrangement relative to each other, which simplifies identification of homologous points in all projections.
- the reference markers can be placed at five contiguous vertices of a cube as shown in FIG. 5 .
- Fiducial reference 122 comprises five reference markers, 23 , 123 , 223 , 323 , 423 , positioned contiguously at five vertices of a cube.
- the object 121 is preferably positioned within the cube.
- the four co-planar reference markers, 23 , 123 , 223 , and 323 then can be used for projectively warping or transforming the projected images onto a desired projection plane while the remaining reference marker 423 serves as the alignment marker required to determine the normalized projection angle as described in U.S. Pat. No. 5,359,637.
- Another useful arrangement of the fiducial reference comprising five reference markers is shown in FIG. 11, wherein a fiducial reference 222 employing a pyramidal distribution of reference markers is used.
- the fiducial reference 222 comprises five reference markers 23 , 123 , 223 , 323 , and 423 , which are held in a fixed relationship relative to each other and to the object 221 .
- four of the reference markers, 23 , 123 , 223 , and 323 lie in a plane that can be used to establish the desired projection plane.
- they define the four corners of the base of a pyramid.
- the fifth reference marker 423 is positioned to define the apex of the pyramid and serves as the means for determining the projection angles relative to the desired projection plane as described in U.S. Pat. No. 5,359,637.
- the fiducial reference 222 may be attached or fixed relative to the object 221 such that the base of the pyramid is proximate to the recording medium and the apex of the pyramid is proximate to the source.
- a fiducial reference 322 having an alternative arrangement of reference markers in a pyramidal distribution is shown.
- the fiducial reference 322 comprises a radiopaque frame 25 having a radiolucent central window.
- the four inside corners of the radiopaque frame 25 define four reference markers, 23 , 123 , 223 , and 323 , at the base of the pyramid.
- the fifth reference marker 423 is positioned at the apex of the pyramid.
- the object 321 is positioned between the frame 25 and the reference marker 423 .
- Fiducial reference 422, which is also useful for solving a system with nine degrees of freedom, is shown.
- Fiducial reference 422 comprises a rectangular parallelepiped 33 with radiopaque reference markers, 23, 123, 223, 323, 423, and 523, centered on each of the six faces of the parallelepiped 33.
- the reference markers, 23, 123, 223, 323, 423, and 523, are marked with distinguishable indicia, such as X, Y, Z, Ⓧ, Ⓨ, and Ⓩ, so that the reference images cast by the markers can be identified easily and distinguished from one another.
- two or more of the edges of the parallelpiped 33 may be defined by radiopaque bars 26 such that the intersections of the bars 26 provide additional reference markers, such as reference marker 623 located at the intersection of the three bars labeled 26 in FIG. 14 .
- the six degrees of freedom for the radiation source 27 relative to the desired projection plane 37 can be determined independently from the use of the fiducial reference when the orientation of the detector is fixed or known relative to either the object of interest or the radiation source.
- the position of the radiation source 27 can be determined from multiple plane projections recorded from an arbitrarily positioned camera provided that the lens aperture is adjusted such that the entire object always appears in focus.
- the three relative angles associated with each projection are determined by attaching three orthogonally oriented angle sensing devices, such as gyroscopes, to the camera.
- the displacement of the radiation source relative to the object is determined using a range finder associated with the camera.
- the remaining degrees of freedom need only be measured relative to one another and, therefore, can be fixed from a geometric analysis of paired point projections.
- the projected distances, D1, D2, and D3, between the paired points P1 and P2 are a sinusoidal function of the corrected projection angle.
- the actual distance between P1 and P2 can be estimated from a non-linear curve fit to the observed projection distances.
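The curve fit can be reduced to ordinary linear least squares. The sketch below is illustrative Python, not the patent's implementation; the model form D(θ) = A·cos(θ − φ) and all names are assumptions consistent with the sinusoidal description, with the fitted amplitude A approximating the actual point separation.

```python
import numpy as np

def fit_actual_distance(angles, projected_distances):
    """Estimate the true separation of two paired points from their
    projected separations at several corrected projection angles.

    Model (assumed): D(theta) = A*cos(theta - phi).  Rewriting as
    a*cos(theta) + b*sin(theta) makes the problem linear in (a, b);
    the amplitude A = hypot(a, b) approximates the actual distance.
    """
    X = np.column_stack([np.cos(angles), np.sin(angles)])
    (a, b), *_ = np.linalg.lstsq(X, np.asarray(projected_distances, float),
                                 rcond=None)
    return float(np.hypot(a, b))
```

With exact sinusoidal data, e.g. `observed = 12.0 * np.cos(angles - 0.3)`, the fitted amplitude recovers 12.0; with noisy data the least-squares solution averages over the observations.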
- A method for determining the position of the radiation source relative to the object using an arbitrarily positionable camera in accordance with the present invention is depicted in FIG. 34 .
- angle sensors attached to the camera are initialized in order to eliminate possible drift in accuracy.
- the object is then roughly centered within the viewfinder of the camera and an object image, the nominal displacement of the camera from the object, and the angle data are recorded at step 1002 .
- An intrinsic range finder associated with the camera is used to determine the nominal distance from the camera to the object of interest and the angle sensors are used to determine the angle data.
- At step 1004, it is determined whether additional object images are desired. If additional object images are desired, the camera is repositioned at step 1005 and the process returns to step 1002. It should be appreciated that a minimum of three object images is required to produce a meaningful sinusoidal regression, as discussed in detail below. If no additional object images are to be recorded, the recorded object images and data are optionally stored in a computer-readable format and the process proceeds to step 1007.
- Each of the object images is then individually scaled to render all of the object images at the same magnification at step 1009 .
- the scaling is possible using the range recorded for each object image because the linear magnification is inversely proportional to the range.
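Because linear magnification is inversely proportional to the recorded range, the per-image scale factor follows directly. A minimal illustrative sketch (function names are assumptions, not the patent's implementation):

```python
def rescale_factor(image_range, reference_range):
    # Magnification varies as 1/range, so bringing an image recorded at
    # `image_range` to the magnification of the image recorded at
    # `reference_range` requires scaling by image_range / reference_range.
    return image_range / reference_range

def rescale_points(points, image_range, reference_range):
    # Apply the factor to projected (x, y) coordinates about the origin.
    s = rescale_factor(image_range, reference_range)
    return [(s * x, s * y) for x, y in points]
```

For example, an image recorded at twice the reference range has half the magnification and is enlarged by a factor of 2.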
- an effective displacement between the camera and the object can be defined.
- a first object point visible on all of the projected object images, is selected.
- a representative object image is then selected at step 1013 .
- the representative object image should be the object image which best approximates the orientation to which desired reconstructed tomosynthetic slices are to be parallel.
- Each object image is then rotated and translated, at step 1015 , so that all of the object images are brought into tomosynthetic registration. Specifically, each object image is rotated by an amount sufficient to adjust the rotational orientation of the camera about an axis perpendicular to the projection plane to match that of the representative object image. Rotational adjustment of the object images assures that the registrations which follow will not exclude a second reference point, whose selection is discussed below.
- Each rotated object image is then translated both vertically and horizontally by an amount which causes superposition of the projected image of the first object point within each object image with the projected image of the first object point within the representative object image.
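The rotate-then-translate registration of steps 1015 and following can be sketched as below. This is illustrative Python; the roll-angle convention and all names are assumptions, and a real implementation would resample whole images rather than individual points.

```python
import math

def register_to_representative(point, roll, ref_point, ref_roll):
    """Rotation about the axis perpendicular to the projection plane,
    plus the translation that superposes the projected first object
    point with its position in the representative image.

    `point`: projected first object point in this image.
    `ref_point`: the same point in the representative image.
    `roll` / `ref_roll`: camera roll angles from the angle sensors.
    """
    dtheta = ref_roll - roll
    c, s = math.cos(dtheta), math.sin(dtheta)
    rotated = (c * point[0] - s * point[1], s * point[0] + c * point[1])
    shift = (ref_point[0] - rotated[0], ref_point[1] - rotated[1])
    return dtheta, shift

def apply_registration(point, dtheta, shift):
    # Rotate, then translate, any projected point of the same image.
    c, s = math.cos(dtheta), math.sin(dtheta)
    x, y = c * point[0] - s * point[1], s * point[0] + c * point[1]
    return x + shift[0], y + shift[1]
```

By construction, applying the returned transform to the first object point lands it exactly on its position in the representative image.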
- a second object point visible on all of the scaled, rotated, and translated object images is selected.
- the distance between the projected images of the second object point and the first object point is measured, at step 1019 , for each of the object images. If the relative change in distance does not exceed a task-dependent threshold value and produce a well-distributed range of values, the accuracy of the subsequent non-linear regression may be compromised. Accordingly, at step 1021 , it is determined whether the measured distances exceed the task-dependent threshold. If the threshold is not exceeded, a new second object point is selected at step 1017 . If the threshold is exceeded, the process proceeds to step 1023 .
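The threshold test of step 1021 might be sketched as follows (illustrative; the relative-spread metric is an assumption about how "relative change in distance" is quantified):

```python
def distances_adequate(measured_distances, threshold):
    # The relative change across all recorded images must exceed the
    # task-dependent threshold; otherwise a new second object point
    # is selected before the non-linear regression is attempted.
    spread = max(measured_distances) - min(measured_distances)
    return spread / max(measured_distances) > threshold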
- the actual distance between the first object point and the second object point is estimated from the measured distance separating the projected images of the first and second object points in the recorded object images.
- the estimate of the actual distance is determined using the effective displacement of the camera from the object and a sinusoidal curve fitting procedure, as well as the projection angle defined by a line connecting the first and second object points and the plane of the representative object image.
- each object image is remapped onto the plane defined by the representative object image selected above at step 1025 .
- the remapping is performed using the first object point as the common point of superposition.
- the object images are then tomosynthetically reconstructed using the second object point as a disparity marker.
- the distances between object images are then calibrated, at step 1029, using the estimate for the distance between the first and second object points and trigonometrically correcting the object images for foreshortening caused by variations in the projection angle.
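The trigonometric correction can be illustrated as a simple cosine foreshortening model (a sketch under that assumption; names are illustrative):

```python
import math

def correct_foreshortening(projected_length, projection_angle):
    # A segment inclined at `projection_angle` to the image plane
    # projects with length actual * cos(angle); dividing the measured
    # projected length by cos(angle) recovers the actual length.
    return projected_length / math.cos(projection_angle)
```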
- a radiation source 1050 is mounted on a mobile carriage 1052 .
- the carriage 1052 is controlled remotely using a transmitter 1054 which transmits a signal to the carriage 1052 through an antenna 1056 .
- the transmitter 1054 is operated to maneuver the carriage 1052 , and thereby the radiation source, 1050 , to move around a selected object 1058 to enable projected images of the object 1058 to be recorded on a detector 1060 at a variety of relative positions of the radiation source 1050 , the object 1058 and fiducial reference 1062 , and the detector 1060 .
- the elevation and angle of tilt of the radiation source 1050 relative to the object 1058 and fiducial reference 1062 is also controllable through the transmitter 1054 .
- An arrangement of the system of the present invention which is somewhat constrained is depicted in FIGS. 12 and 13 , wherein a hand-held X-ray source is provided such that the orthogonal distance between the radiation source 127 and the recording medium 131 is fixed by a C-arm 129 at a distance short enough so that the image cast by the fiducial reference 122 is magnified relative to the size of the actual fiducial reference 122 .
- the C-arm 129 is connected to the recording medium 131 by a concentric swivel collar 149 to allow the C-arm 129 to be rotated relative to the recording medium 131 .
- a disposable and crushable radiolucent foam cushion 130 may be attached to the surface of the recording medium 131 to permit comfortable customized stable adaptation of the detector 131 to the object 121 .
- the other end of the C-arm 129 is attached to a potted X-ray source 145 so that radiation emanating from the potted X-ray source 145 impinges upon the recording medium 131 .
- a trigger 146 is provided for operating the source 127 .
- the source 127 optionally comprises a circular beam collimator 147 for collimating radiation emanating from the source 127 .
- the collimator 147 may provide a relatively long focal-object distance to provide nearly affine projection geometries.
- a handle 148 is also provided to enable the operator to more easily maneuver the source 127 .
- the hand-held X-ray source 127 is connected to a computer/high voltage source 128 for controlling operation of the device.
- a disposable plastic bag 132 can be positioned around the detector 131 for microbial isolation.
- the source 127 can optionally comprise a rotatable transparent radiopaque plastic cylinder 119 and a transparent radiopaque shield 152 to protect the operator from scattered radiation.
- A fiducial reference 122 comprising a single radiopaque sphere of finite diameter may also be used. Under those conditions, the length of the minor axis of the resulting elliptical shadow plus two translational measurements are sufficient to define the projection geometry completely.
- In the accompanying projection geometry: c is the fixed distance between the source and the projection plane; PS is the orthogonal projection of the source onto the projection plane; B, M, and T are the reference markers; r is the radius of the reference markers; ap is the distance from the center of a reference marker to the source; θ is the angle subtended by the center of a reference marker relative to a line orthogonal to the projection plane through the source; φ is the angle at the apex of an isosceles triangle having a base of length r and a height of length ap; Bs, Ms, and Ts are the reference images associated with the reference markers; a (or, alternatively, dp) is the major diameter of the reference images; b is the minor diameter of the reference images; x is the length of a section of an arc associated with a reference image measured from the projection of the center of the corresponding reference marker onto the projection plane along the major diameter, a, in a direction toward PS;
- FIG. 6 another arrangement of the system of the present invention is depicted wherein the radiation source 27 is located at a fixed distance from the selected object 21 and sufficiently far so that magnification is not significant.
- the recording medium 31 is allowed to be shifted, displaced, and tilted relative to the selected object 21 and an original or desired projection plane 37 .
- a fiducial reference having at least seven degrees of freedom is needed to solve the system.
- a fiducial reference comprising at least four point-size reference markers can be used to determine the position of the radiation source relative to the selected object 21 and the recording medium 31 .
- FIG. 7 yet another arrangement of the system of the present invention is depicted wherein the distance between the object 21 and the radiation source 27 is sufficiently large so that magnification can be ignored and wherein the recording medium 31 is free to shift laterally relative to the object 21 and the desired or original projection plane 37 .
- In this arrangement, there are four degrees of freedom (two translational degrees of freedom for the radiation source 27 and two translational degrees of freedom for the recording medium 31 ). Therefore, a fiducial reference having at least four degrees of freedom is necessary to completely determine the system. Accordingly, a fiducial reference comprising at least two point-size reference markers can be used to determine the position of the radiation source relative to the selected object 21 and the recording medium 31 .
- This relatively constrained system may be useful in three-dimensional reconstructions of transmission electron micrographs produced from video projections subtending various degrees of specimen tilt and exhibiting various amounts of arbitrary and unpredictable lateral shift due to intrinsic instability associated with the instrument's electron lenses.
- the radiation source 27 may be either a portable or a stationary X-ray source.
- the radiation source 27 is not limited to an X-ray source.
- the specific type of source 27 which is utilized will depend upon the particular application.
- the present invention can also be practiced using magnetic resonance imaging (MRI), ultrasound, visible light, infrared light, ultraviolet light, or microwaves.
- the source 227 is a hand-held X-ray source, similar to that described above in reference to source 127 , except that a low power laser aiming device 250 and an alignment indicator 251 are provided to insure that the source 227 and the recording medium 231 are properly aligned.
- a radiolucent bite block 218 is provided to constrain the detector 231 relative to the object 221 , thereby constraining the system to three degrees of freedom (two translational and one displacement for the radiation source 227 relative to the object 221 and detector 231 ). Consequently, the fiducial reference 222 can be fixed directly to the bite block 218 .
- the source 227 When the source 227 is properly aligned with the recording medium 231 , radiation emanating from the aiming device 250 impinges on the recording medium 231 . In response to a measured amount of radiation impinging on the recording medium 231 , a signal is sent to activate the alignment indicator 251 which preferably produces a visible and/or auditory signal. With the alignment indicator 251 activated, the X-ray source 245 can be operated at full power to record a projected image.
- the source 227 can optionally comprise a collimator 247 to collimate the radiation from the X-ray source and/or a transparent scatter shield 252 to protect the operator from scattered radiation.
- the operator can stand behind a radiopaque safety screen when exposing the patient to radiation from the source 227 .
- a handle 248 and trigger 246 may be provided to facilitate the handling and operation of the source 227 .
- the source 227 is connected to a computer/high voltage source 228 and an amplifier 260 for controlling operation of the device.
- the aiming device 250 comprises an X-ray source operated in an ultra-low exposure mode and the projected image is obtained using the same X-ray source operated in a full-exposure mode.
- a real-time ultra-low dose fluoroscopic video display can be mounted into the handle 248 of the source 227 via a microchannel plate (MCP) coupled to a CCD.
- the video display switches to a lower gain (high signal-to-noise) frame grabbing mode when the alignment is considered optimal and the trigger 246 is squeezed more tightly.
- the aiming device 850 comprises a laser source 857 and a radiolucent angled mirror 858 which produces a laser beam, illustrated by dashed line 859 , which is concentric with the radiation emanating from the source 827 .
- the alignment indicator 851 comprises a radiolucent spherical surface 861 which is rigidly positioned relative to the detector 831 by a C-arm 829 that is plugged into the bite block 818 .
- the fiducial reference 822 comprises a radiolucent spacer containing a fiducial pattern that is affixed to the detector 831 .
- a central ring area 863 can be designated at the center of the spherical surface 861 such that aiming the laser beam 859 at the central ring area 863 assures an essentially orthogonal arrangement of the source 827 and the detector 831 .
- Replacing the concentric laser source 857 with a laser source that produces two laser beams that are angled relative to the radiation emanating from the source 827 permits the distance between the source 827 and the detector 831 to be set to a desired value, provided that the two laser beams are constrained to converge at the spherical surface 861 when the desired distance has been established.
- the recording medium 31 is provided for recording the projected object image 40 of the selected object 21 and the projected reference images, 39 and 139 of the reference markers 23 and 123 .
- the recording medium 31 may be in the form of a photographic plate or a radiation-sensitive, solid-state image detector such as a radiolucent charge-coupled device (CCD).
- the recording medium 331 comprises a CCD having a top screen 200 , a bottom screen 206 positioned below the top screen 200 , and a detector 210 positioned below the bottom screen 206 .
- the top screen 200 is monochromatic so that a projected image projected onto the top screen 200 causes the top screen 200 to fluoresce or phosphoresce a single color.
- the bottom screen 206 is dichromatic, so that the bottom screen 206 fluoresces or phosphoresces in a first color in response to a projected image projected directly onto the bottom screen 206 and fluoresces or phosphoresces in a second color in response to fluorescence or phosphorescence from the top screen 200 .
- the detector 210 is also dichromatic so as to allow for the detection and differentiation of the first and the second colors.
- the recording medium 331 may also comprise a radiolucent optical mask 202 to modulate the texture and contrast of the fluorescence or phosphorescence from the top screen 200 , a radiolucent fiber-optic spacer 204 to establish a known projection disparity, and a radiopaque fiber-optic faceplate 208 to protect the detector 210 from radiation emanating directly from the radiation source.
- Yet another embodiment is depicted in FIGS. 20 and 21 , wherein the detector 731 comprises a phosphor-coated CCD and the fiducial reference 722 comprises a radiopaque rectangular frame 725 .
- Both the detector 731 and the fiducial reference 722 are contained within a light-tight package 756 .
- the detector 731 and fiducial reference 722 are preferably positioned flush with an upper, inner surface of the package 756 .
- the dimensions of the frame 725 are selected such that the frame 725 extends beyond the perimeter of the detector 731 .
- Phosphor-coated strip CCDs 754 are also contained within the package 756 .
- the strip CCDs 754 are positioned below the frame 725 such that radiation impinging upon the frame 725 casts an image of each edge of the frame 725 onto one of the strip CCDs 754 .
- the positions of the frame shadow on the strip CCDs 754 are used to determine the projection geometry.
- the recording medium 431 is smaller than the projected image of object 521 .
- Provided that the reference images, 39 and 139, corresponding to the reference markers, 23 and 123, can be identified on all of the projected images, image slices extending across the union of all the projected images can be obtained. This is illustrated schematically in FIG. 9 , wherein the reference images, 39 and 139, are taken with the source 27 and the recording medium 431 in the image positions indicated by the solid lines. Similarly, the dashed images, 39′ and 139′, are taken with the source 27′ and the recording medium 431′ in the positions indicated by the dashed lines.
- image slices of an object which casts an object image that is larger than the recording medium 431 can be synthesized. Further, by using multiple fiducial references spaced in a known pattern which are all linked to the object of interest, additional regions of commonality can be identified between multiple overlapping projection geometries, so that a region of any size can be propagated into a single, unified reconstruction. Thus, it is possible to accommodate an object much larger than the recording medium used to record individual projection images.
- regions of overlap between two or more sets of projected images recorded can be used as a basis for extrapolating registration and calibration of the sets of projected images.
- a first set of projected images is recorded using an X-ray camera configured to provide a first aperture.
- a second set of projected images is then recorded using the camera configured to provide a second aperture.
- the first and second sets of projected images are then brought into alignment by identifying fiducial reference points that are common to the overlapping regions of the projected images.
- the present invention also relates to a method for creating a slice image through the object 21 of FIG. 1 from a series of two-dimensional projected images of the object 21 , as shown in FIG. 2 .
- the method of synthesizing the image slice starts at step 45 .
- Each step of the method can be performed as part of a computer-executed process.
- a fiducial reference 22 comprising at least two reference markers, 23 and 123 , is selected which bears a fixed relationship to the selected object 21 . Accordingly, the fiducial reference 22 may be affixed directly to the selected object 21 .
- the minimum required number of reference markers 23 is determined by the number of degrees of freedom in the system, as discussed above.
- When the fiducial reference 22 comprises reference markers 23 of a finite size, the size and shape of the reference markers 23 are typically recorded.
- the selected object 21 and fiducial reference 22 are exposed to radiation from any desired projection geometry at step 49 and a two-dimensional projected image 38 is recorded at step 51 .
- the projected image 38 contains an object image 40 of the selected object 21 and a reference image, 39 and 139 , respectively, for each of the reference markers 23 and 123 of the fiducial reference 22 .
- At step 53, it is determined whether additional projected images 38 are desired.
- the desired number of projected images 38 is determined by the task to be accomplished. Fewer images reduce the signal-to-noise ratio of the reconstructions and increase the intensities of component “blur” artifacts. Additional images provide information which supplements the information contained in the prior images, thereby improving the accuracy of the three-dimensional radiographic display. If additional projected images 38 are not desired, then the process continues at step 60 .
- the system geometry is altered at step 55 by varying the relative positions of (1) the radiation source 27 , (2) the selected object 21 and the fiducial reference 22 , and (3) the recording medium 31 .
- the geometry of the system can be varied by moving the radiation source 27 and/or the recording medium 31 .
- the source 27 and recording medium 31 , the selected object 21 and fiducial reference 22 are moved.
- When the radiation source and recording medium produce images using visible light (e.g., with a video camera), the geometry of the system must be varied to produce images from various sides of the object in order to obtain information about the entire object. After the system geometry has been varied, the process returns to step 49.
- a slice position is selected at step 60 .
- the slice position corresponds to the position at which the image slice is to be generated through the object.
- each projected image 38 is projectively warped onto a virtual projection plane 37 at step 65 .
- the warping procedure produces a virtual image corresponding to each of the actual projected images.
- Each virtual image is identical to the image which would have been produced had the projection plane been positioned at the virtual projection plane with the projection geometry for the radiation source 27 , the selected object 21 , and the fiducial reference 22 of the corresponding actual projected image.
- the details of the steps involved in warping the projection plane 37 are shown in FIG. 3 .
- the process starts at step 70 .
- a virtual projection plane 37 is selected. In most cases it is possible to arrange for one of the projected images to closely approximate the virtual projection plane position. That image can then be used as the basis for transformation of all the other images 38 .
- a plane which is parallel to the plane containing the co-planar reference markers 23 can be selected as the virtual projection plane 37 .
- the reconstruction yields a slice image which may be deformed due to variations in magnification. The deformation becomes more prominent when the magnification varies significantly over the range in which the reconstruction is carried out. In such cases, an additional geometric transformation to correct for differential magnification may be individually performed on each projected image 38 to correct for image deformation.
- One of the recorded projected images 38 is selected at step 74 and the identity of the reference images 39 cast by each reference marker 23 is determined at step 76 .
- assignment of each elliptical image 39 to a corresponding reference marker 23 can be accomplished simply by inspection. Under such conditions, the minor diameter of the elliptical image 39 is always larger the closer the reference marker 23 is to the radiation source 27 . This is shown most clearly in FIG.
- spherical reference markers 23 which are hollow having different wall thicknesses and hence, different attenuations can be used. Accordingly, the reference image 39 cast by each spherical reference marker 23 can be easily identified by the pattern of the reference images 39 . Analogously, spherical reference markers 23 of different colors could be used in a visible light mediated system.
- each reference image 39 cast by each reference marker 23 is measured at step 78 .
- the projected center 41 of the reference marker 23 does not necessarily correspond to the center 42 of the reference image 39 cast by that reference marker 23 . Accordingly, the projected center 41 of the reference marker 23 must be determined.
- One method of determining the projected center 41 of the reference marker 23 is shown in FIG. 16 .
- the variation in intensity of the reference image 39 associated with reference marker 23 along the length of the major diameter of the reference image 39 is represented by the brightness profile 43 . The method depicted in FIG.
- the projected center 41 always intersects the brightness profile 43 of the reference image 39 at, or very near, the maximum 44 of the brightness profile 43 . Accordingly, the projected center 41 of a spherical reference marker 23 produced by penetrating radiation can be approximated by smoothing the reference image 39 to average out quantum mottle or other sources of brightness variations which are uncorrelated with the attenuation produced by the reference marker 23 . An arbitrary point is then selected which lies within the reference image 39 . A digital approximation to the projected center 41 is isolated by performing a neighborhood search of adjacent pixels and propagating the index position iteratively to the brightest (most attenuated) pixel in the group until a local maximum is obtained. The local maximum then represents the projected center 41 of the reference marker 23 .
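The iterative neighborhood search described above can be sketched as follows (illustrative Python; it assumes the brightness array has already been smoothed so that the marker's projected center is the only local maximum near the starting point):

```python
import numpy as np

def projected_center(image, start):
    """Locate the projected center of a reference marker by stepping
    from `start` to the brightest pixel in the 3x3 neighborhood,
    repeating until a local maximum of the brightness is reached."""
    r, c = start
    while True:
        r0, r1 = max(r - 1, 0), min(r + 2, image.shape[0])
        c0, c1 = max(c - 1, 0), min(c + 2, image.shape[1])
        patch = image[r0:r1, c0:c1]
        dr, dc = np.unravel_index(int(np.argmax(patch)), patch.shape)
        nr, nc = r0 + dr, c0 + dc
        if (nr, nc) == (r, c):
            return r, c          # local maximum: the projected center
        r, c = nr, nc
```

On a smooth, unimodal brightness blob the walk converges to the peak pixel regardless of which interior pixel is chosen as the arbitrary starting point.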
- When the fiducial reference 22 comprises reference markers 23 of finite size, the size of each image 39 cast by each reference marker 23 is also recorded.
- the lengths of the major and minor diameters of elliptical reference images cast by spherical reference markers 23 can be measured.
- Computerized fitting procedures can be used to assist in measuring the elliptical reference images 39 cast by spherical reference markers 23 .
- Such procedures which are well-known in the art, may be used to isolate the elliptical reference images 39 from the projected image 38 and determine the major and minor diameters of the reference images 39 .
- the projected minor diameter of resulting elliptical reference images 39 will be slightly smaller than that determined geometrically by projection of the reference marker's actual diameter.
- the amount of the resulting error is a function of the energy of the X-ray beam and the spectral sensitivity of the recording medium 31 . This error can be eliminated by computing an effective radiographic diameter of the reference marker 23 as determined by the X-ray beam energy and the recording medium sensitivity in lieu of the actual diameter.
- One method of obtaining the effective radiographic diameter is to generate a series of tomosynthetic slices through the center of the reference marker 23 using a range of values for the reference marker diameter decreasing systematically from the actual value and noting when the gradient of the reference image 39 along the minor diameter is a maximum.
- the value for the reference marker diameter resulting in the maximum gradient is the desired effective radiographic diameter to be used for computing magnification.
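The diameter sweep can be sketched as below. This is illustrative only: `minor_axis_gradient` is a hypothetical caller-supplied function standing in for the tomosynthetic slice reconstruction and gradient measurement, and the step size and count are arbitrary assumptions.

```python
def effective_diameter(actual_diameter, minor_axis_gradient,
                       steps=40, step=0.005):
    """Sweep trial marker diameters downward from the actual value and
    return the trial that maximizes the reference image's minor-axis
    gradient in the reconstructed slice (per the criterion above)."""
    trials = [actual_diameter * (1.0 - step * k) for k in range(steps)]
    return max(trials, key=minor_axis_gradient)
```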
- each projected image can be scaled by an appropriate magnification.
- the minor diameter of the reference image 39 is preferably used to determine the magnification since the minor diameter does not depend on the angle between the source 27 and the recording medium 31 . Accordingly, the magnification of a spherical reference marker 23 can be determined from the measured radius of the reference marker 23 , the minor diameter of the reference image 39 on the recording medium 31 , the vertical distance between the center of the reference marker 23 and the recording medium 31 , and the vertical distance between the recording medium 31 and the virtual projection plane 37 .
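Since the minor diameter is independent of the source angle, the magnification follows directly from it and the marker's (effective) radius. A minimal sketch (illustrative names; the vertical-distance correction to the virtual projection plane is omitted):

```python
def marker_magnification(marker_radius, minor_diameter):
    # The minor diameter of the elliptical reference image equals the
    # marker's true diameter times the linear magnification, so dividing
    # by the true diameter (2 * radius) recovers the magnification.
    return minor_diameter / (2.0 * marker_radius)
```

For example, a sphere of radius 1.5 casting an image with a minor diameter of 3.9 was projected at a magnification of 1.3.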
- a projection transformation matrix is then generated, representing a series of transformation operations necessary to map the selected projected image 38 onto the virtual projection plane 37 .
- the projection transformation matrix is generated by solving each projected image 38 relative to the virtual projection plane 37 .
- the positions of the co-planar reference markers 23 are used to determine the transformation matrix by mapping the position of the reference images 39 cast by each co-planar reference marker 23 in the projected image onto its corresponding position in the virtual projection plane.
- When the fiducial reference comprises a radiopaque frame 25 , the positions of the reference images 39 cast by the reference markers 23 formed at the corners of the frame 25 are mapped to a canonical rectangle having the same dimensions and scale as the frame 25 .
- This approach also serves to normalize the projective data. Depending on the number of degrees of freedom, the transformation operations range from complex three-dimensional transformations to simple planar rotations or translations.
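Mapping four projected corner images onto a canonical rectangle is a planar projective (homography) estimation, for which the standard Direct Linear Transform applies. The sketch below is illustrative (names are assumptions, and DLT is a standard technique rather than necessarily the patent's exact procedure):

```python
import numpy as np

def projection_transform(corner_images, frame_width, frame_height):
    """Direct Linear Transform: the 3x3 matrix mapping the four projected
    corner reference images onto the canonical rectangle with corners
    (0,0), (w,0), (w,h), (0,h) at the frame's own dimensions and scale."""
    dst = [(0.0, 0.0), (frame_width, 0.0),
           (frame_width, frame_height), (0.0, frame_height)]
    rows = []
    for (x, y), (u, v) in zip(corner_images, dst):
        # Each correspondence contributes two linear constraints on H.
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The homography is the null vector of the 8x9 constraint matrix.
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return vt[-1].reshape(3, 3)

def warp_point(H, point):
    # Apply the projective map in homogeneous coordinates.
    x, y, w = H @ np.array([point[0], point[1], 1.0])
    return x / w, y / w
```

Warping every pixel of a projected image through `H` performs the remapping onto the virtual projection plane; warping only the corner images reproduces the canonical rectangle exactly.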
- At step 84, it is determined whether all of the projected images 38 have been analyzed. If all of the projected images 38 have not been analyzed, the process returns to step 74 , wherein an unanalyzed image 38 is selected. If no additional projected images 38 are to be analyzed, then the process proceeds through step 85 of FIG. 3 to step 90 of FIG. 2 .
- an image slice through the object 21 at the selected slice position is generated at step 90 .
- An algorithm such as that described in U.S. Pat. No. 5,359,637, which is incorporated herein by reference, can be used for that purpose.
- the positions of the reference images cast by the alignment marker or markers 23 in each projected image 38 are used as the basis for application of the algorithm to generate the image slices.
- a true three-dimensional representation can be synthesized. Accordingly, it is determined whether an additional slice position is to be selected at step 92 . If an additional slice position is not desired, the process proceeds to step 94 . If a new slice position is to be selected, the process returns to step 60 .
- the entire set of image slices is integrated into a single three-dimensional representation at step 94 .
- Alternative bases for interactively analyzing and displaying the three-dimensional data can be employed using any number of well-established three-dimensional recording and displaying methods.
- the three-dimensional representation can be displayed using the display device depicted in FIG. 28 in order to produce a holographic-type display.
- the display device comprises a pair of stereoscopic eyeglasses or spectacles 1080 which are worn by an observer 1082 .
- the eyeglasses 1080 contain lenses which are either cross-polarized or which pass complementary colored light.
- a target 1084 is positioned on the eyeglass frame 1080 .
- a color computer monitor 1086 and video camera or detector 1088 are provided in association with the eyeglasses 1080 .
- the color monitor 1086 is used to display complementary-colored or cross-polarized stereoscopic image pairs 1090 of the three-dimensional representation.
- the video camera 1088 is used to track the target 1084 as the observer's head is moved. When the observer's head is moved to a different position, the video camera 1088 relays information either directly to the color monitor 1086 or to the color monitor 1086 through computer-related hardware. The information relayed by the video camera relates to the angle subtended by the target 1084 relative to the video camera 1088 .
- the relayed information is then used to alter the angular disparity associated with the stereoscopic image pairs 1090 being displayed on the color monitor 1086 in quasi-realtime, so that the resulting display is adjusted to correlate with the movement of the observer's head and appears holographic to the observer.
- a nearly isotropic three-dimensional image can be created from a single pair of two-dimensional projections as depicted in FIG. 39 .
- the two-dimensional images are combined and overlap to produce a three-dimensional image. Since only one two-dimensional image is utilized to reconstruct each slice image, the method depicted in FIG. 39 represents a completely degenerate case wherein the slice image is infinitely thick. When the slice image is infinitely thick, the slice image is indistinguishable from a conventional two-dimensional projection of a three-dimensional object.
- a three-dimensional fiducial reference is functionally associated with an object of interest.
- the association need only be complete enough to permit the location of all of the details in the object to be determined relative to the position of the object.
- the fiducial reference must occupy a volume and be defined spatially such that a minimum of six points can be unequivocally generated and/or identified individually.
- the object may be encased inside a cubic reference volume wherein the corners of adjacent faces are rendered identifiable by tiny, spherical fiducial markers.
- a first projected image is then produced on a first projection plane at step 1102 .
- the relative positions of the object, the radiation source, and the detector are then altered so that a second projected image can be recorded on a second projection plane at step 1104 .
- the second projection plane must be selected so that it intersects the first projection plane at a known angle.
- the angle should equal, or at least closely approach, 90° (orthogonality).
- a projective transformation of each projected image is performed to map the images of the fiducial reference on each face into an orthogonal, affine representation of the face.
- the projective transformation amounts to converting the identifiable corners of the image of the fiducial reference corresponding to a projected face of the fiducial reference into a perfect square having the same dimensions as a face of the fiducial reference.
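Mapping the four projected corner markers onto a perfect square is a standard four-point projective (homography) fit. The following is a generic sketch of that computation, not the patent's own implementation; the corner coordinates are invented for the demo:

```python
import numpy as np

def homography_to_square(corners, side):
    """Solve the 8-parameter projective transform mapping four projected
    fiducial corners onto a square of the given side length."""
    dst = [(0, 0), (side, 0), (side, side), (0, side)]
    A, b = [], []
    for (x, y), (u, v) in zip(corners, dst):
        # Each point correspondence contributes two linear equations in h0..h7.
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def apply(H, x, y):
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w

# A skewed projected quadrilateral is rectified onto a 100-unit square;
# its third corner lands on the square's far corner.
H = homography_to_square([(10, 12), (108, 5), (115, 97), (4, 90)], 100)
print(apply(H, 115, 97))
```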
- Each of the transformed projected images is then extruded, at step 1108 , such that both projected images occupy the same virtual volume.
- the extrusion step is equivalent to the creation of a virtual volume having the same dimensions as the fiducial reference containing the sum of the transformed projected images.
- an optional non-linear filtering technique is used to limit visualization of the three-dimensional representation to the logical intersection of the transformed projected images.
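The extrusion step and the optional non-linear filtering can be sketched with two orthogonal projections of a toy volume; the 8×8×8 geometry and the use of a voxel-wise minimum as the intersection operator are illustrative assumptions:

```python
import numpy as np

# Two orthogonal projections of a small object inside an 8x8x8 fiducial cube.
n = 8
vol = np.zeros((n, n, n))
vol[3:5, 3:5, 3:5] = 1.0          # the "object": a 2x2x2 block
proj_xy = vol.sum(axis=2)          # projection onto the first plane (along z)
proj_xz = vol.sum(axis=1)          # projection onto the orthogonal plane (along y)

# Extrude each transformed projection through the shared virtual volume.
ext_xy = np.repeat(proj_xy[:, :, None], n, axis=2)
ext_xz = np.repeat(proj_xz[:, None, :], n, axis=1)

# Linear combination: the sum of the two extruded projections.
summed = ext_xy + ext_xz
# Non-linear filter: keep only the logical intersection of the extrusions.
intersection = np.minimum(ext_xy, ext_xz)

# Only voxels seen in both views survive the intersection: the 2x2x2 object.
print(np.count_nonzero(intersection))  # -> 8
```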
- the three-dimensional representation can be refined by optionally recording additional projected images.
- the present invention also relates to a method for reducing distortions in the three-dimensional representation.
- Tomosynthesis uses two-dimensional image projections constrained within a limited range of angles relative to the irradiated object to produce a three-dimensional representation of the object.
- the limited range of angles precludes complete and uniform sampling of the object. This results in incomplete three-dimensional visualization of spatial relationships hidden in the resulting undersampled shadows or null spaces.
- Another limiting factor which interferes with artifact-free tomosynthetic reconstruction is the change in slice magnification with depth caused by the relative proximity of the source of radiation.
- a fiducial reference is functionally associated with the object and at least two independent sets of image slices are recorded.
- the angular disparity between the sets of image slices is noted.
- the first set of image slices may comprise multiple anterior-posterior projections while the second set of image slices comprises multiple lateral projections.
- the sets of image slices are then integrated to create a first and a second three-dimensional tomosynthetic matrix volume at step 1122 .
- the affine correction matrix C is determined by the number of slices comprising the three-dimensional matrix volume, the correlation angle (i.e., the greatest angle of the projection sequence, in the range [−π/4, π/4], measured from an axis normal to the detector surface), and the correlation distance (i.e., the apex-to-apex distance created by the intersection of the most disparate projections of the sequence).
- the second three-dimensional matrix volume is rotated by an angle ⁇ .
- the angle ⁇ is defined as the angular disparity between the first and the second three-dimensional matrix volumes.
- the transformed matrix volumes, A′ and L′′, are then merged using matrix averaging at step 1128 .
- M is the averaged matrix of the two component transformed matrix volumes, A′ and L′′ (i.e., M = (A′ + L′′)/2).
- a non-linear combination of the transformed matrix volumes, A′ and L′′ is performed.
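A minimal sketch of the two merge options, assuming the transformed volumes A′ and L′′ are already on a common voxel grid; the 3×3×3 array values are invented for the demo, and the voxel-wise minimum is one plausible non-linear operator, not necessarily the patent's:

```python
import numpy as np

# Illustrative transformed tomosynthetic matrix volumes on a common grid.
rng = np.random.default_rng(0)
A_p = rng.random((3, 3, 3))    # stands in for A'
L_pp = rng.random((3, 3, 3))   # stands in for L''

# Linear merge: simple matrix averaging, M = (A' + L'') / 2.
M = (A_p + L_pp) / 2.0

# Non-linear merge: a voxel-wise minimum, which suppresses features that
# appear in only one of the component volumes.
M_nl = np.minimum(A_p, L_pp)

# The minimum of two values never exceeds their average.
assert np.all(M_nl <= M)
```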
- the present invention further relates to a method for generating tomosynthetic images optimized for a specific diagnostic task.
- a task-dependent method for tomosynthetic image reconstruction can be used to mitigate the effects of ringing artifacts from unregistered details located outside the focal plane of reconstruction, which are intrinsic to the tomosynthetic reconstruction process.
- the production and elimination of blurring artifacts is depicted schematically in FIG. 24 .
- a first radiopaque object 1140 within the focal plane 1141 and a second radiopaque object 1142 above the focal plane are irradiated from two different source positions 1144 to produce two distinct data images.
- the first data image 1146 contains an image of the first radiopaque object 1140 at relative position C and an image of the second radiopaque object 1142 at relative position B.
- the second data image 1148 contains an image of the first radiopaque object 1140 at relative position F and an image of the second radiopaque object 1142 at relative position G.
- the image intensity at the same relative position of both data images is averaged. For example, relative position B in one data image corresponds to relative position E in the other data image and, therefore, the corresponding relative position in the tomosynthetic image is assigned an intensity equal to the average of the intensity measured at relative position B and relative position E (i.e., (B+E)/2).
- the tomosynthetic image 1150 is marked by a blurring of the image produced by the first radiopaque object 1140 .
- both data images are compared and, for example, only the minimum intensity at each relative position is retained.
- relative position B in one data image corresponds to relative position E in the other data image and, therefore, the corresponding relative position in the tomosynthetic image is assigned an intensity equal to the lesser of the intensities measured at relative position B and relative position E (i.e., B or E).
- the blurring shadows are eliminated from the tomosynthetic image 1152 .
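The averaging-versus-minimum behavior above can be sketched numerically with a one-dimensional toy model of the FIG. 24 geometry; the position labels follow the text, but the intensity values are invented for the demo:

```python
# In-plane detail: same relative position ("C") in both data images.
# Out-of-plane detail: different positions ("B" in one image, "E" in the other).
img1 = {"A": 0, "B": 9, "C": 9, "D": 0, "E": 0}   # first data image
img2 = {"A": 0, "B": 0, "C": 9, "D": 0, "E": 9}   # second data image

# Linear tomosynthesis: average the two images position by position.
average = {k: (img1[k] + img2[k]) / 2 for k in img1}
# Non-linear tomosynthesis: retain only the minimum at each position.
minimum = {k: min(img1[k], img2[k]) for k in img1}

print(average["B"], minimum["B"])  # -> 4.5 0 : the out-of-plane ghost is removed
print(average["C"], minimum["C"])  # -> 9.0 9 : the in-plane detail survives
```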
- the non-linear tomosynthetic approach in accordance with the present invention is beneficial when, for example, physicians want to know with relative assurance whether a lesion or tumor has encroached into a vital organ.
- the ringing artifacts tend to blur the interface between the lesion or tumor and the surrounding tissues.
- the non-linear tomosynthetic reconstruction can be employed such that only the relatively radiopaque tumor structures of interest are retained in the reconstructed image.
- a different non-linear operator could be used such that only relatively radiolucent structures of interest are retained in the reconstructed image to determine whether a lytic process is occurring in relatively radiopaque tissues.
- a method for task-dependent tomosynthetic image reconstruction is depicted in the flow chart of FIG. 23 .
- the method begins at step 900 and proceeds to step 902 where a series of projected images are acquired.
- the projected images are acquired in the same manner already described in connection with steps 49-55 of FIG. 2 .
- the projected images are shifted laterally, in the plane of the projection, by the amounts required to produce a desired tomosynthetic slice, and all of the images are then superimposed, in a manner identical to the method described in connection with steps 60 and 65 of FIG. 2 .
- the type and degree of task-dependent processing is chosen.
- at step 906 , it is determined whether features having a low attenuation are to be identified or whether the entire range of attenuating structures is to be identified. If only features having a low attenuation are to be identified, a pixel value corresponding to a desired maximum attenuation is selected. The selected pixel value is used as a maximum threshold value: each projected image is analyzed, pixel by pixel, and all pixels having an associated attenuation value above the selected pixel value are disregarded when an image slice is generated.
- if it is determined at step 910 that features having a low attenuation are not to be identified, or that the entire range of attenuating structures is to be identified, then it is determined at step 916 whether an unbiased estimate of the three-dimensional configuration of the entire range of attenuating structures is desired. If the entire range of attenuating structures is to be identified, then conventional tomosynthesis is performed at step 918 , whereby the attenuation values from all of the projected images are averaged.
- it is then determined at step 920 whether the user desires to restart the selection of features to be identified. If the user wants to restart the identification process, the method returns to step 906 . If the user decides not to restart the identification process, the method ends at step 922 .
- an image slice is generated at a selected slice position at step 924 .
- the process for generating the image slice at step 924 is essentially the same as discussed previously in connection with step 90 of FIG. 2 . However, when only features having either a high attenuation or a low attenuation are to be identified, the image generation process is performed only on the non-linearly selected images, instead of on all of the projected images as initially acquired.
- the image slice is displayed at step 926 and the method ends at step 922 .
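The branches of the FIG. 23 method reduce to a choice of per-pixel combination operator. The following sketch assumes the projections have already been shifted for the desired slice; the `task` names and toy arrays are ours, not the patent's:

```python
import numpy as np

def reconstruct(shifted_projections, task):
    """Combine already-shifted projected images into a single slice using a
    task-dependent operator (illustrative operator names)."""
    stack = np.stack(shifted_projections)
    if task == "radiopaque":       # retain only high-attenuation features
        return stack.max(axis=0)
    if task == "radiolucent":      # retain only low-attenuation features
        return stack.min(axis=0)
    return stack.mean(axis=0)      # unbiased estimate: conventional averaging

# Two toy projections: one detail registered at [1, 0], one unregistered.
p1 = np.array([[0.0, 9.0], [9.0, 0.0]])
p2 = np.array([[0.0, 0.0], [9.0, 9.0]])
print(reconstruct([p1, p2], "radiolucent"))  # the unregistered shadows vanish
```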
- the present invention also relates to a method for determining temporal changes in three dimensions.
- the method enables two or more sets of image data collected at different times to be compared by adjusting the recorded sets of image data for arbitrary changes in the vantage points from which the image data were recorded.
- the method takes advantage of the fact that a single three-dimensional object will present a variety of different two-dimensional projection patterns, depending on the object's orientation to the projection system. Most of this variety is caused by the fact that a three-dimensional structure is being collapsed into a single two-dimensional image by the projection system. Limiting projection options to only two-dimensional slices precludes this source of variation. The result is a much reduced search space for appropriate registration of the images required to accomplish volumetrically meaningful subtraction.
- a flow chart showing the steps involved in the method for determining temporal changes in three-dimensions of the present invention is depicted in FIG. 25 .
- a first set of image slices is generated at step 1180 .
- the object is positioned in roughly the same position as it was when the first set of image slices was produced and a second set of image slices is generated, at step 1182 , using a similar exposure protocol.
- the first set of image slices is spatially cross-correlated with the second set of image slices.
- the cross-correlation is accomplished by individually comparing each image slice comprising the first set of image slices with the individual image slices comprising the second set of image slices. The comparison is performed in order to determine which image slice in the second set of image slices corresponds to a slice through the object at approximately the same relative position through the object as that of the image slice of the first set of image slices to which the comparison is being made.
- each of the correlated pairs of image slices are individually aligned at step 1186 .
- the alignment is performed in order to maximize the associated cross-correlations by maximizing the overlap between the image slices comprising the correlated pairs of image slices.
- the cross-correlations are maximized by shifting the image slices relative to one another until the projected image of the object on one image slice is optimally aligned with the projected image of the object on the other image slice.
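The per-pair alignment step can be sketched as a brute-force search for the integer shift that maximizes the cross-correlation between two slices. This is an illustrative implementation, not the patent's; the test images are invented:

```python
import numpy as np

def best_shift(ref, moving, max_shift=3):
    """Return the (dy, dx) shift of `moving` that maximizes its overlap
    (cross-correlation) with `ref`, searched over a small window."""
    best, best_score = (0, 0), -np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(moving, dy, axis=0), dx, axis=1)
            score = float((ref * shifted).sum())   # overlap for this shift
            if score > best_score:
                best, best_score = (dy, dx), score
    return best

ref = np.zeros((8, 8)); ref[3:5, 3:5] = 1.0
mov = np.roll(np.roll(ref, 2, axis=0), 1, axis=1)  # same detail, displaced
print(best_shift(ref, mov))  # -> (-2, -1): the shift that undoes the offset
```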
- the difference images are displayed.
- the difference images can be presented as a series of individual differences corresponding to various different slice positions.
- the individual difference images can be integrated to yield a composite difference representing a three-dimensional image of the temporal changes associated with the selected object.
- the present invention further relates to a source comparator and a method for matching radiation sources for use in quantitative radiology. Meaningful quantitative comparisons of different image data can be made only when the radiation source or sources used to record the image data is very nearly unchanged.
- conventional radiation sources produce radiation that varies with changes in tube potential, beam filtration, beam orientation with respect to the radiation target, tube current, and distance from the focal spot.
- the source comparator and method of the present invention enable the radiation output from one radiation source to be matched to that of another radiation source or to that of the same radiation source at a different time.
- the source comparator 1200 for matching radiation sources in accordance with the present invention is depicted in FIG. 26 .
- the source comparator 1200 comprises two wedges or five-sided polyhedrons, 1202 and 1204 , of equal dimension having a rectangular base and two right-triangular faces.
- the triangular faces lie in parallel planes at opposite edges of the base such that the triangular faces are oriented as mirror images of each other.
- each wedge, 1202 and 1204 , has a tapered edge and provides a uniformly increasing thickness from the tapered edge in a direction parallel to the plane of the base and perpendicular to the tapered edge.
- the wedges, 1202 and 1204 , are arranged with the base of one wedge 1202 adjacent to the base of the other wedge 1204 such that the tapered edges of the two wedges are at adjacent edges of the base.
- One wedge is formed from a uniform high attenuation material while the other wedge is formed from a uniform low attenuation material to differentially attenuate the relative proportion of high and low energy photons in the output from the radiation source.
- the resulting image will be a quadrilateral having an intensity gradient that varies uniformly in a single direction with the angle of the gradient being determined by the distribution of high and low energy photons in the output from the radiation source.
- the source comparator 1200 of FIG. 26 is used in the method of matching radiation sources in accordance with the present invention as shown in FIG. 27 .
- the source comparator is positioned between a radiation source and a detector.
- An original gradient image is then recorded by exposing the source comparator to radiation from the radiation source at the source settings to be used for recording a first set of data images.
- the first set of data images is then recorded.
- the source settings for the radiation source to be used to record the second set of data images are adjusted to match the settings used for recording the first set of data images.
- the source comparator is positioned between the radiation source and the detector and a first gradient image is recorded.
- the source comparator is then rotated by 180° about an axis perpendicular to the detector, and a second gradient image is recorded at step 1224 .
- the first and second gradient images are compared, and the source comparator is oriented to produce the smaller gradient, at step 1226 .
- the individual settings on the radiation source are then iteratively adjusted.
- the beam energy is matched by adjusting the kVp on the radiation source so that the measured gradient value approaches the gradient value of the original gradient image.
- the beam quality is then matched at step 1232 by adjusting the filtration of the radiation source so that the angle of the maximum gradient relative to the edge of the source comparator approaches that of the original gradient image.
- the beam exposures are then estimated by integrating the detector response across a fixed region of the source comparator and matched at step 1234 by adjusting the mAs of the radiation source so that the exposure approaches that of the original gradient image.
- if the two gradient images are significantly different, the beam energy, beam quality, and exposure are readjusted. If, however, asymptotic convergence has been reached and the two gradient images are substantially the same, the radiation sources are matched and the process ends at step 1238 . Once the radiation sources have been matched, the second set of data images can be recorded and quantitatively compared to the first set of data images.
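The three quantities iterated over in steps 1230-1234 can be estimated from a digitized gradient image along the following lines. The function name, the metric definitions, and the synthetic ramp image are our assumptions; only the mapping of gradient value, gradient angle, and integrated exposure to kVp, filtration, and mAs follows the text:

```python
import numpy as np

def gradient_metrics(img):
    """Summarize a source-comparator exposure: mean gradient magnitude
    (tracks beam energy), overall gradient angle (tracks filtration),
    and integrated detector response (tracks mAs)."""
    gy, gx = np.gradient(img.astype(float))     # row- and column-direction slopes
    magnitude = float(np.hypot(gx, gy).mean())
    angle = float(np.degrees(np.arctan2(gy.mean(), gx.mean())))
    exposure = float(img.sum())
    return magnitude, angle, exposure

# A synthetic comparator image whose intensity ramps purely along x.
x = np.tile(np.arange(8.0), (8, 1))
mag, ang, total = gradient_metrics(x)
print(mag, ang)  # -> 1.0 0.0 : unit gradient, aligned with the x edge
```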
- the source 627 is an unconstrained point source and the detector 631 is completely constrained relative to the object 621 . Accordingly, the system has three degrees of freedom (two translational and one displacement for the radiation source 627 relative to the object 621 and detector 631 ).
- a beam collimator 647 can be positioned between the source 627 and the object 621 to collimate the radiation from the source 627 .
- the detector 631 comprises a primary imager 632 and a secondary imager 634 positioned a known distance below the primary imager 632 . In one embodiment, both the primary and secondary imagers, 632 and 634 , are CCD detectors.
- the fiducial reference 622 comprises a radiopaque shield 633 with a ring-shaped aperture 636 of known size positioned between the primary imager 632 and the secondary imager 634 .
- Radiation from the source 627 passes through collimator 647 , irradiates object 621 , and produces an object image on the primary imager 632 .
- radiation from the source 627 which impinges upon the radiopaque shield 633 passes through the aperture 636 to produce a ring-shaped reference image of the aperture 636 on the secondary imager 634 .
- the secondary imager 634 can be a low quality imager such as a low resolution CCD.
- a lower surface of the primary imager 632 can be coated with a phosphorescent material 635 , so that radiation impinging upon the primary imager 632 causes the phosphorescent material 635 to phosphoresce.
- the phosphorescence passes through the aperture 636 to produce the reference image on the secondary imager 634 .
- the reference image produced using the system depicted in FIG. 19 can be used to determine the position of the source 627 relative to the object 621 and the detector 631 .
- a circle, or ellipse, is fitted to the projected reference image.
- the position of the center of the fitted circle, or ellipse, relative to the known center of the aperture 636 is determined.
- the angle ⁇ of a central ray 637 radiating from the source 627 relative to the object 621 and the detector 631 can then be determined.
- the length of the minor diameter of the projected reference image is determined and compared to the known diameter of the aperture 636 to provide a relative magnification factor.
- the relative magnification factor can then be used to determine the distance of the source 627 from the object 621 .
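Under a simple point-source, similar-triangles model, recovering the source distance from the relative magnification factor can be sketched as follows; the function and parameter names are ours, and the numbers are invented:

```python
def source_distance(projected_minor, aperture_diameter, gap):
    """Estimate the source-to-aperture distance from the magnification of the
    ring aperture's projected image. `gap` is the aperture-to-secondary-imager
    distance; a point-source similar-triangles model is assumed."""
    m = projected_minor / aperture_diameter       # relative magnification factor
    # Similar triangles: m = (d + gap) / d  =>  d = gap / (m - 1).
    return gap / (m - 1.0)

# A 10 mm aperture projecting a 15 mm minor diameter across a 20 mm gap
# places the source 40 mm above the aperture plane.
print(source_distance(15.0, 10.0, 20.0))  # -> 40.0
```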
- the center of the fitted circle can be determined as follows.
- a pixel or point on the secondary imager 634 that lies within the fitted circle is selected as a seed point.
- the center pixel of the secondary imager 634 can be selected, since the center point will typically lie within the fitted circle.
- a point R is determined by propagating from the seed point towards the right until the fitted circle is intersected.
- a point L is determined by propagating from the seed point towards the left until the fitted circle is intersected. For each pixel along the arc L-R, the average of the number of pixels traversed by propagating from that pixel upwardly until the fitted circle is intersected and the number of pixels traversed by propagating from that pixel downwardly until the fitted circle is intersected is determined.
- This average represents the row address of the fitted circle's center.
- the entire reference image is rotated by 90° and the process is repeated.
- the row address and column address together represent the position of the center of the fitted circle.
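The center-finding procedure of the preceding steps can be sketched on a binary reference image. This is a simplified version: it averages the up/down extents along the chord through the seed (equivalent to the described pixel-count averaging for a filled circle), uses a transpose in place of the 90° rotation, and assumes the image-center seed lies inside the circle:

```python
import numpy as np

def circle_center(mask):
    """Locate the center of a filled circle in a binary image via the
    seed-and-propagate scheme described in the text (simplified sketch)."""
    def row_address(m):
        r, c = m.shape[0] // 2, m.shape[1] // 2   # seed: the center pixel
        cols = np.where(m[r])[0]                  # chord L..R through the seed
        centers = []
        for col in range(cols.min(), cols.max() + 1):
            rows = np.where(m[:, col])[0]         # up/down extent at this column
            centers.append((rows.min() + rows.max()) / 2.0)
        return float(np.mean(centers))            # averaged midpoints -> row address

    row = row_address(mask)
    col = row_address(mask.T)                     # "rotate 90°" and repeat
    return row, col

# A filled circle of radius 5 centered at (12, 20) in a 32x40 image.
yy, xx = np.mgrid[0:32, 0:40]
mask = (yy - 12) ** 2 + (xx - 20) ** 2 <= 25
print(circle_center(mask))  # -> (12.0, 20.0)
```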
- the present invention is equally applicable to images produced using a variety of technologies, such as visible light, ultrasound, or electron microscopy images.
- the present invention can also be used to reconstruct three-dimensional images of objects which either emit or scatter radiation.
- the present invention allows cellular changes to be detected and quantified in an efficient and cost-effective manner. Quantitation of three-dimensional structure facilitates comparison with other quantitative techniques, such as biochemical analysis. For example, increases in the Golgi apparatus in cells accumulating abnormal amounts of cholesterol can be measured and correlated with biochemically measured increases in cellular cholesterol.
- the present invention can be applied to construct topological images of geological structures by recording images of the structure created by the sun.
- the present invention is useful for other three-dimensional imaging modalities.
- the present invention is also intended to relate to images obtained using magnetic resonance imaging (MRI), single photon emission computed tomography (SPECT), positron emission tomography (PET), conventional tomography, tomosynthesis, and tuned-aperture computed tomography (TACT), as well as microscopic methods including confocal optical schemes.
Abstract
A system and method for producing three-dimensional representations of an object. A first series of projected images of the selected object is recorded in a first projection plane and a second series of projected images of the selected object is recorded in a second projection plane. The first and the second series of projected images are rendered at a common magnification. The first series of projected images is then integrated into a first three-dimensional volume and the second series of projected images is integrated into a second three-dimensional volume. The three-dimensional representation of the object is then produced by combining one projected image from the first series with one projected image from the second series. Alternatively, the three-dimensional representation is produced by merging the first three-dimensional volume with the second three-dimensional volume.
Description
- This application is a continuation-in-part of co-pending application Ser. No. 09/252,632, entitled “Method And System For Creating Task-Dependent Three-Dimensional Images,” filed on Feb. 19, 1999, such application being incorporated herein by reference.
- The present invention relates to a method and system for creating three-dimensional displays or images from a multiplicity of two-dimensional projected images and, more specifically, to a method and system for producing task-dependent radiographic images of an object of interest which are substantially free of blurring artifacts.
- A variety of three-dimensional imaging modalities has been developed for medical applications, as well as for use in non-destructive testing of manufactured parts. In particular, a wide range of tomosynthetic imaging techniques has previously been demonstrated to be useful in examining three-dimensional objects by means of radiation. These imaging techniques differ in the size and configuration of the effective imaging aperture. At one extreme, the imaging aperture approaches zero (i.e., a pinhole) and the resulting display is characterized by images produced from a single transmission radiograph. This yields an infinitely wide depth of field and therefore no depth information can be extracted from the image. At the other extreme, the aperture approaches a surrounding ring delimiting an infinite numerical aperture resulting in projection angles orthogonal to the long axis of the irradiated object. This yields an infinitely narrow depth of field and hence no information about adjacent slices through the object can be ascertained. It therefore follows that a “middle ground” approach, which provides the ability to adapt a sampling aperture to a particular task, would be highly advantageous.
- The key to achieving the full potential of diagnostic flexibility lies in the fact that perceptually meaningful three-dimensional reconstructions can be produced from optical systems having any number of different aperture functions. That fact can be exploited since any aperture can be approximated by summation of a finite number of appropriately distributed point apertures. The key is to map all incrementally obtained projective data into a single three dimensional matrix. To accomplish this goal, one needs to ascertain all positional degrees of freedom existing between the object of interest, the source of radiation, and the detector.
- In the past, the relative positions of the object, the source, and the detector have been determined by fixing the position of the object relative to the detector while the source of radiation is moved along a predetermined path, i.e. a path of known or fixed geometry. Projective images of the object are then recorded at known positions of the source of radiation. In this way, the relative positions of the source of radiation, the object of interest, and the detector can be determined for each recorded image.
- A method and system which enables the source of radiation to be decoupled from the object of interest and the detector has been described in U.S. Pat. No. 5,359,637, that issued on Oct. 25, 1994, which is incorporated herein by reference. This is accomplished by fixing the position of the object of interest relative to the detector and providing a fiducial reference which is in a fixed position relative to the coupled detector and object. The position of the image of the fiducial reference in the recorded image then can be used to determine the position of the source of radiation. In addition, a technique for solving the most general application wherein the radiation source, the object of interest, and the detector are independently positioned for each projection has been described by us in co-pending U.S. patent application Ser. No. 09/034,922, filed on Mar. 5, 1998, which is also incorporated herein by reference.
- Once the relative positions of the radiation source, the object, and the detector are determined, each incrementally obtained projective image is mapped into a single three-dimensional matrix. The mapping is performed by laterally shifting and summing the projective images to yield tomographic images at a selected slice position through the object of interest. A three-dimensional representation of the object can be obtained by repeating the mapping process for a series of slice positions through the object. However, the quality and independence of the tomographic images is compromised by blurring artifacts produced from unregistered details located outside the plane of reconstruction.
- In addition, quantitative information has traditionally been difficult to determine from conventional tomography. Although many questions of medical interest are concerned with temporal changes of a structure (e.g., changes in the size and shape of a tumor over time), the ability to compare diagnostic measurements made over time is complicated by the fact that factors other than the parameter of diagnostic interest often contribute to the measured differences. For example, spatial variations produced from arbitrary changes in the observational vantage point(s) of the radiation source create differences between the measurements which are unrelated to temporal changes of the object being investigated. In addition, conventional X-ray sources produce radiation that varies with changes in tube potential, beam filtration, beam orientation, tube current, distance from the focal spot, and exposure time. Fluctuations in the output of radiation sources are therefore another factor that limits the ability to derive quantitative information from conventional tomography.
- In light of the foregoing, it would be highly beneficial to provide a method for producing a three-dimensional representation of an object that is substantially free of blurring artifacts from unregistered details. In addition, the method should enable quantitative information related to temporal changes associated with the object to be measured.
- The present invention relates to a system and a method for synthesizing an image slice through a selected object from a plurality of projected radiographic images of the selected object. The system comprises a radiation source for irradiating the object. The preferred radiation source depends upon the particular application. For example, the present invention may be practiced using x-rays, electron microscopy, ultrasound, visible light, infrared light, ultraviolet light, microwaves, or virtual radiation simulated by manipulation of magnetic fields (magnetic resonance imaging (MRI)). In one embodiment of the present invention, the position of the radiation source within a plane parallel to an image plane is determined from projected images of two object points associated with a fiducial reference which is maintained in fixed position relative to the selected object. Once the projected images are compensated for differences in magnification, the relative position of the radiation source within the plane parallel to the image plane is determined from an estimate of the actual distance between the two object points obtained from a sinusoidal fit of the distances between the projected images of the object points.
- A recording medium or radiation detector is used to record a series of projected images of the selected object. The recording medium may be in the form of a photographic plate or a radiation-sensitive, solid-state image detector such as a charge-coupled device (CCD), or any other system capable of producing two-dimensional projections or images suitable for digitization or other analysis.
- An image synthesizer is provided for transforming the series of projected images of the selected object into an image slice. The image slice consists of an array of pixels with each pixel having an associated attenuation value and corresponds to a cross-sectional slice through the selected object at a selected slice position. A three-dimensional representation of the object can be obtained by repeating the transformation at a series of slice positions through the object.
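A minimal sketch of this repeated slice transformation, under a simplified parallel-beam model in which the shift that registers a projection for a slice at height z is proportional both to z and to that exposure's lateral source offset (the linear shift model, simple averaging, and all names are assumptions of this sketch):

```python
import numpy as np

def reconstruct_volume(projections, source_offsets, slice_heights):
    """Repeat the slice transformation at several slice positions and
    stack the results into a three-dimensional representation.

    For each slice height z, every projection is laterally shifted in
    proportion to z and its source offset, and the shifted projections
    are averaged (simple tomosynthesis); details lying in that slice
    come into register while others blur.
    """
    slices = []
    for z in slice_heights:
        acc = np.zeros_like(projections[0], dtype=float)
        for proj, (ox, oy) in zip(projections, source_offsets):
            shift = (int(round(oy * z)), int(round(ox * z)))  # (row, col)
            acc += np.roll(proj, shift, axis=(0, 1))
        slices.append(acc / len(projections))
    return np.stack(slices)   # shape: (n_slices, rows, cols)
```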
- In addition, an optional source comparator is provided for adjusting the radiation source to enable meaningful quantitative comparisons between projected images recorded at different times and/or using different radiation sources. The source comparator is positionable between the radiation source and the radiographic medium for producing a gradient image indicative of characteristics associated with the output from the radiation source. In operation, the source comparator is used to record a first gradient image using a first radiation source at the same time that a first projected image or series of projected images is recorded. When a second projected image or series of projected images is to be recorded, the source comparator is used to record a second gradient image. The second gradient image is compared to the first gradient image and differences between the two gradient images are noted. The beam energy, filtration, and beam exposure associated with the radiation source used to record the second gradient image are then adjusted to minimize the differences between the first gradient image and the second gradient image.
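The comparison step can be reduced to a scalar figure of merit that the operator (or an automated loop) drives toward zero by adjusting beam energy, filtration, and exposure. A minimal sketch, assuming a simple mean-absolute-difference metric (the metric choice and names are not from the original disclosure):

```python
import numpy as np

def gradient_mismatch(ref_gradient, new_gradient):
    """Mean absolute difference between two recorded gradient images.

    Adjusting the second source until this value is minimized matches
    its effective output to that of the first source.  The plain L1
    metric is an illustrative assumption.
    """
    ref = np.asarray(ref_gradient, dtype=float)
    new = np.asarray(new_gradient, dtype=float)
    return float(np.mean(np.abs(ref - new)))
```

A perfectly matched source reproduces the reference gradient image exactly and yields a mismatch of zero.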
- In one embodiment, the source comparator comprises two wedges or five-sided polyhedrons of equal dimension having a rectangular base and two right-triangular faces. The triangular faces lie in parallel planes at opposite edges of the base such that the triangular faces are oriented as mirror images of each other. As a result, each wedge has a tapered edge and provides a uniformly increasing thickness from the tapered edge in a direction parallel to the plane of the base and perpendicular to the tapered edge. The wedges are arranged with the base of one wedge adjacent to the base of the other wedge such that the tapered edges of the two wedges are at adjacent edges of the base. One wedge is formed from a uniform high attenuation material while the other wedge is formed from a uniform low attenuation material. Accordingly, when the source comparator is irradiated from a radiation source directed perpendicularly to the bases of the wedges, the resulting image will be a quadrilateral having an intensity gradient that is maximized in a particular direction.
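The gradient image produced by such a wedge pair can be modeled with the Beer-Lambert law, I = I0·exp(−μt), applied to a linearly increasing thickness t. A sketch with illustrative attenuation coefficients and a simplified side-by-side wedge layout (all parameter values and the layout are assumptions):

```python
import numpy as np

def wedge_pair_image(n=64, mu_high=0.8, mu_low=0.1, max_thickness=2.0):
    """Simulate the image cast by the two-wedge source comparator.

    Thickness ramps linearly away from each tapered edge; one half is a
    high-attenuation material, the other a low-attenuation material.
    Transmitted intensity follows I = I0 * exp(-mu * t) with I0 = 1.
    """
    t = np.linspace(0.0, max_thickness, n)    # thickness ramp from tapered edge
    high = np.exp(-mu_high * t)               # transmission, high-mu wedge
    low = np.exp(-mu_low * t)                 # transmission, low-mu wedge
    # Side-by-side halves; intensity falls along the ramp direction in each,
    # giving a quadrilateral with a maximal gradient in that direction.
    return np.hstack([np.tile(high, (n, 1)), np.tile(low, (n, 1))])
```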
- In operation, the system of the present invention is used to produce an image slice through the selected object that is substantially free of blurring artifacts from unregistered details located outside a plane of reconstruction. The radiation source and recording medium are used to record a series of two-dimensional projected images of the selected object. The series of two-dimensional projected images are then shifted by an amount and in a direction required to superimpose the object images of the two-dimensional images. The shifted two-dimensional images can then be combined in a non-linear manner to generate a tomosynthetic slice through the selected object. In one embodiment, the two-dimensional images are combined by selecting details from a single projection demonstrating the most relative attenuation at each pixel. Alternatively, a different non-linear operator could be used wherein the two-dimensional images are combined by selecting details from a single projection demonstrating the least relative attenuation at each pixel in the reconstructed image. Optionally, a series of reconstructed images at varying slice positions through the selected object are determined to create a three-dimensional representation of the selected object.
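A minimal sketch of the shift-and-combine operation described above, using wrap-around shifts for brevity; the per-pixel "max" and "min" operators correspond to keeping, respectively, the greatest or least attenuation seen in any single projection (names and the use of np.roll are simplifications):

```python
import numpy as np

def tomosynthetic_slice(projections, shifts, mode="max"):
    """Combine laterally shifted projections into one slice.

    `projections` is a list of equally sized 2-D arrays of attenuation
    values; `shifts` gives the (row, col) shift registering each
    projection for the chosen plane of reconstruction.  The non-linear
    combination keeps, per pixel, either the greatest ("max") or least
    ("min") attenuation from any single projection.
    """
    stack = np.stack([np.roll(p, s, axis=(0, 1))
                      for p, s in zip(projections, shifts)])
    return stack.max(axis=0) if mode == "max" else stack.min(axis=0)
```

Replacing the non-linear operator with `stack.mean(axis=0)` would give the conventional linear tomosynthetic reconstruction for comparison.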
- Alternatively, the system of the present invention is used to synthesize a three-dimensional reconstruction of the object from as few as two projected images of the object. A first projected image of the object is recorded in a first projection plane and a second projected image is recorded in a second projection plane. Each of the first and the second projected images are then rendered at a common magnification. Using a known angle between the first and the second projection planes, the first and the second projected images are transformed to occupy the same volume. The transformed first and second projected images are then combined into a three-dimensional representation of the selected object. Additional projected images are optionally combined with the three-dimensional representation to refine the three-dimensional representation.
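For the special case of two projection planes separated by 90°, the transform-and-combine step can be sketched by back-projecting each image along its own viewing direction and rotating the second volume into the first volume's frame (arbitrary angles would require an interpolating rotation; the multiplicative merge rule and all names are assumptions of this sketch):

```python
import numpy as np

def two_view_volume(img_a, img_b):
    """Merge two projected images, recorded in planes 90 degrees apart
    and already rendered at a common magnification, into a crude volume.

    Each image is back-projected (replicated) along its own viewing
    direction; the second volume is rotated so both occupy the same
    space, and the volumes are combined multiplicatively so structure
    survives only where both views agree.
    """
    n = img_a.shape[0]   # assume square images of equal size
    vol_a = np.repeat(img_a[np.newaxis, :, :], n, axis=0)  # smear along z
    vol_b = np.repeat(img_b[np.newaxis, :, :], n, axis=0)
    vol_b = np.rot90(vol_b, k=1, axes=(0, 1))  # rotate into frame of view a
    return vol_a * vol_b
```

Additional views could be multiplied into the same volume to refine the representation, in the spirit of the refinement step described above.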
- In yet another embodiment, the system of the present invention is used to synthesize a three-dimensional representation of the selected object from two or more sets of projected images of the selected object. The first and second sets of projected images are tomosynthetically transformed into a series of contiguous slices forming a first and a second three-dimensional volume, respectively, using previously disclosed methods (e.g., U.S. Pat. No. 5,668,844) or those in the public domain (e.g., tomosynthesis). The first and second three-dimensional volumes are then rendered at a common magnification. The second three-dimensional volume is then rotated by an angle corresponding to the angular disparity between the first and the second three-dimensional volumes. The rotated second three-dimensional volume is then merged with the first three-dimensional volume to produce a three-dimensional representation of the selected object.
- Alternatively, the system of the present invention can be used to determine temporal changes in the selected object. The radiation source and recording medium are used to record a first series of two-dimensional projected images of the selected object. At some later time, the radiation source and recording medium are used to record a second series of two-dimensional projected images of the selected object. Both series are tomosynthetically converted into a series of slices via previously disclosed methods (TACT®) or those in the public domain (tomosynthesis). Each slice of the first series is then correlated with a corresponding slice of the second series to form pairs of correlated slices. Each pair of slices is then aligned to maximize the overlap between homologous structures. Each pair of correlated slices is then subtracted to produce a difference image. Each difference image is then displayed individually. Alternatively, all of the difference images can be overlapped to yield a complete difference image corresponding to the volumetric difference associated with the entire tomosynthetically reconstructed volume.
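The align-and-subtract step for one pair of correlated slices can be sketched with an FFT-based cross-correlation that finds the integer shift maximizing overlap of homologous structure (sub-pixel alignment and the TACT-style reconstruction that produces the slices are outside this sketch, and the names are illustrative):

```python
import numpy as np

def difference_image(slice_then, slice_now):
    """Align a pair of correlated slices and subtract them.

    The integer shift maximizing overlap is located at the peak of the
    circular cross-correlation (computed via FFT); the later slice is
    shifted accordingly and the difference returned.
    """
    a = slice_then - slice_then.mean()          # zero-mean for correlation
    b = slice_now - slice_now.mean()
    corr = np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))).real
    dr, dc = np.unravel_index(np.argmax(corr), corr.shape)
    aligned = np.roll(slice_now, (dr, dc), axis=(0, 1))
    return slice_then - aligned
```

Summing the per-slice difference images would then approximate the complete volumetric difference described above.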
- When a three-dimensional representation of the selected object is produced, the three-dimensional representation can be viewed holographically using a display in accordance with the present invention. The display comprises stereoscopic spectacles which are worn by an observer and a target operatively associated with the spectacles. Accordingly, as the observer changes his or her vantage point, movement of the spectacles translates into a corresponding movement of the target. A detector is operatively associated with the target for tracking movement of the target. The detector is connected to a monitor such that the monitor receives a signal from the detector indicative of movement of the target. In response to the signal from the detector, the monitor displays an image pair of the three-dimensional representation which, when viewed through the spectacles, produces a stereoscopic effect. The image pair which is displayed is changed to compensate for changes in the vantage point of the observer.
- The foregoing summary, as well as the following detailed description of the preferred embodiments of the present invention, will be better understood when read in conjunction with the accompanying drawings, in which:
-
FIG. 1 is a schematic representation of a system for creating three-dimensional radiographic displays using computed tomography in accordance with the present invention; -
FIG. 2 is a flow chart showing the steps involved in creating three-dimensional radiographic displays using computed tomography in accordance with the present invention; -
FIG. 3 is a flow chart showing details of a method of projectively warping or transforming a projected image from an actual plane of projection onto a virtual projection plane; -
FIG. 4 is a schematic representation of a system having nine degrees of freedom in which a source is shifted and displaced relative to an original projection plane and in which a projection plane of a recording medium is shifted, rotated, displaced, and tilted relative to the original projection plane; -
FIG. 5 is a schematic representation showing an arrangement of reference markers in accordance with an embodiment of the present invention, wherein five spherical reference markers are positioned at five of the eight vertices of a cube; -
FIG. 6 is a schematic representation of a system having seven degrees of freedom in which an infinite point source is shifted relative to an original projection plane and in which a projection plane of a recording medium is shifted, displaced, and tilted relative to the original projection plane; -
FIG. 7 is a schematic representation of a system having four degrees of freedom in which an infinite point source is shifted relative to an original projection plane and in which a projection plane of a recording medium is shifted relative to the original projection plane; -
FIG. 8 is an exploded, schematic representation of a charge-coupled device (CCD) for use as a recording medium with intrinsic components enabling automated determination of projective geometry; -
FIG. 9 is a schematic representation of an embodiment of the present invention wherein the recording medium is smaller than the projected image of the object; -
FIG. 10 is a schematic representation of an embodiment of the present invention wherein the source is a hand-held X-ray source with a laser aiming device; -
FIG. 11 is a schematic representation of an embodiment of the present invention wherein the reference markers of the fiducial reference are positioned at the vertices of a square pyramid; -
FIG. 12 is a schematic representation of an embodiment of the present invention wherein the source is a hand-held X-ray source which is constrained relative to the recording medium by a C-arm; -
FIG. 13 is an enlarged schematic representation of the object of interest and the recording medium depicted in FIG. 14; -
FIG. 14 is a schematic representation of an embodiment of the present invention wherein the reference markers of the fiducial reference are positioned at the centers of the faces of a parallelepiped; -
FIG. 15 is a schematic representation of an embodiment of the present invention wherein the corners of a frame define four reference markers; -
FIG. 16 is a schematic representation of a reference image cast by a spherical reference marker showing the resulting brightness profile; -
FIG. 17 is a schematic representation of the parameters associated with a system comprising three spherical, non-collinear reference markers wherein the orthogonal distance between the radiation source and the recording medium is fixed at a distance short enough so that the images cast by the reference markers are magnified relative to the size of the actual reference markers; -
FIG. 18 is a schematic representation of the relevant parameters associated with a reference image associated with a spherical reference marker; -
FIG. 19 is a schematic representation of an embodiment of the present invention wherein the fiducial reference comprises a radiopaque shield with a ring-like aperture; -
FIG. 20 is a schematic, perspective view of an embodiment of the present invention, wherein the detector comprises a charge-coupled device (CCD) and the fiducial reference comprises a frame, shown with the front and a section of the top removed; -
FIG. 21 is a sectional view of the embodiment depicted in FIG. 20 taken along the 23-23 line; -
FIG. 22 is an alternate embodiment of a laser aiming device in accordance with the present invention; -
FIG. 23 is a flow chart showing the steps involved in a method for task-dependent tomosynthetic image reconstruction in accordance with the present invention; -
FIGS. 24A and B are schematic representations of a linear tomosynthetic reconstruction and a non-linear tomosynthetic reconstruction in accordance with the present invention; -
FIG. 25 is a flow chart showing the steps involved in a method for determining temporal changes in accordance with the present invention; -
FIG. 26 is a schematic representation of a source comparator used for matching X-ray sources; -
FIG. 27 is a flow chart showing the steps of a method for using the source comparator of FIG. 26; -
FIG. 28 is a schematic representation of a pseudo-holographic image display; -
FIG. 29 is a tomosynthetic slice through a human breast reconstructed using a linear summation of projected images; -
FIG. 30 is a tomosynthetic slice through the human breast reconstructed using a linear summation of projected images augmented by a deconvolution filter; and -
FIG. 31 is a tomosynthetic slice through the human breast reconstructed using a non-linear reconstruction scheme; -
FIG. 32 is a flow chart showing the steps of a method for creating nearly isotropic three-dimensional images from a single pair of arbitrary two-dimensional images; -
FIG. 33 is a flow chart showing the steps of a method for creating a three-dimensional image from two series of two-dimensional images; -
FIG. 34 is a flow chart showing the steps of a method for producing a three-dimensional representation of a stationary object from multiple plane projections recorded by an arbitrarily positionable camera; -
FIG. 35 is a schematic representation of a three-dimensional scaling calibration for determining the relative position of a camera in two planes orthogonal to the projection plane of the camera; -
FIG. 36 is a schematic representation of a remote-controlled mobile radiation source; -
FIG. 37 is a schematic representation of an embodiment of the present invention wherein a camera is used to record overlapping sets of projected images; -
FIG. 38 is a schematic representation of an embodiment of the present invention wherein laser light sources provide fiducial reference points; and -
FIG. 39 is a schematic representation of a three-dimensional image of an object produced from a single pair of arbitrary two-dimensional images. - The present invention generally relates to a
system 20, as depicted schematically in FIG. 1, for synthesizing an image of an object 21 at a selected slice position 35 through the object 21 from a plurality of radiographic projected images 38 of the selected object 21. A fiducial reference 22 is held in a fixed position relative to the selected object 21, for example, by directly attaching the fiducial reference 22 to the object 21. The fiducial reference comprises two finite-sized, identifiable reference markers, 23 and 123, which are maintained coupled together in a fixed geometry relative to each other by a radiolucent bar 24. However, the fiducial reference 22 may comprise various numbers and arrangements of reference markers 23. Alternatively, as shown in FIG. 38, the reference markers, 23 and 123, are provided by the reflection of laser light from the surface of the selected object. A radiation source 27 is provided to irradiate the object 21 along with the fiducial reference 22. Irradiation of the object 21 casts a projected image 38 onto a recording medium 31. The projected image 38 comprises an object image 40 of the object 21 and reference images, 39 and 139, of the reference markers, 23 and 123, respectively. - In general, the pattern of
source 27 positions does not need to be in any fixed geometry or position. Indeed, the position of the source 27 may be totally arbitrary in translation and displacement relative to the object 21. Likewise, the recording medium 31 may also be arbitrarily movable relative to the object 21 by translation, displacement, tilting, or rotation. The only requirement is that for every degree of freedom in the system resulting from movement of the source 27 or the recording medium 31 relative to the object 21, the fiducial reference 22 must include sufficient measurable or defined characteristics, such as size, shape, or numbers of reference markers 23, to account for each degree of freedom. - The minimum number of reference markers required to completely determine the system depends on the constraints, if any, imposed on the relative positions of (1) the radiation source, (2) the object and fiducial reference, and (3) the recording medium. The system may have a total of nine possible relative motions (2 translations and 1 displacement for the radiation source relative to a desired projection plane and 2 translations, 1 displacement, 2 tilts, and 1 rotation for the recording medium relative to the desired projection plane). Each of these possible relative motions must be capable of analysis either by constraining the system and directly measuring the quantity, by providing a sufficient number of reference markers to enable the quantity to be determined, or by estimating the value of the quantity. Each unconstrained relative motion represents a degree of freedom for the system. For a system to be completely determined, the total number of degrees of freedom in the system must be less than or equal to the total number of degrees of freedom associated with the fiducial reference.
- More than the minimum number of reference markers can be used. In such cases, the system is overdetermined and least squares fitting can be used to improve the accuracy of the resulting image slices. If, however, less than the minimum number of reference markers is used, then the system is underdetermined and the unknown degrees of freedom must either be estimated or measured directly.
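For the overdetermined case, least-squares fitting can be sketched as follows: three non-collinear markers determine a 2-D affine projection map exactly, and additional markers let the fit average out measurement noise in the reference images (the affine model and all names are illustrative assumptions):

```python
import numpy as np

def fit_affine(object_xy, image_xy):
    """Least-squares fit of a 2-D affine map taking marker positions to
    their projected reference-image positions.

    With exactly three non-collinear markers the map is determined
    exactly; with more markers the system is overdetermined and
    np.linalg.lstsq minimizes the squared residuals, improving accuracy
    when marker measurements are noisy.
    """
    obj = np.asarray(object_xy, dtype=float)
    img = np.asarray(image_xy, dtype=float)
    A = np.hstack([obj, np.ones((len(obj), 1))])   # rows of [x, y, 1]
    coeffs, *_ = np.linalg.lstsq(A, img, rcond=None)
    return coeffs                                  # 3x2 affine coefficients
```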
- Although the reference markers can be essentially any size and shape, spherical reference markers of known diameter may be used. When using spherical reference markers of a finite size, a single reference marker can account for up to five degrees of freedom. When a spherical reference marker is projected obliquely onto the recording medium, the reference image cast by the spherical reference marker is elliptical and is independent of any rotation of the reference marker. Determining the position of the reference image in the projection plane (X- and Y-coordinates) and the magnitudes of the major and minor diameters of the elliptical image accounts for four degrees of freedom. Further, when the distance between the radiation source and the reference marker is sufficiently short, the reference image will be magnified relative to the actual size of the reference marker, thereby accounting for an additional degree of freedom. In contrast, only two degrees of freedom (the X- and Y-coordinates) are typically associated with the reference image of a point-size reference marker.
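As a rough numerical illustration of the extra degree of freedom provided by magnification: if the minor axis of the elliptical shadow is taken to equal the magnified sphere diameter (obliquity stretching only the major axis), the magnification follows directly, and for a known source-to-image distance the marker height above the recording medium follows from similar triangles. The point-source, small-sphere approximation and the names are assumptions of this sketch:

```python
def sphere_shadow_geometry(minor_axis_mm, sphere_diameter_mm, sid_mm):
    """Recover magnification and marker height from the elliptical
    shadow of a spherical reference marker of known diameter.

    M = minor_axis / diameter (minor axis taken as the magnified
    diameter), and with a fixed source-to-image distance SID, similar
    triangles give the height above the recording medium,
    h = SID * (1 - 1/M).
    """
    m = minor_axis_mm / sphere_diameter_mm
    height = sid_mm * (1.0 - 1.0 / m)
    return m, height
```

For example, a 4 mm sphere casting a shadow with a 6 mm minor axis at a 300 mm source-to-image distance implies a magnification of 1.5 and a marker height of about 100 mm.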
- The most complex, yet most generally applicable, arrangement is depicted in
FIG. 4, wherein the radiation source 27 and the recording medium 31 are completely unconstrained and uncoupled from the selected object 21. In this arrangement, there are nine degrees of freedom: 2 translational (ΔX and ΔY) and 1 displacement (ΔZ) degrees of freedom for the radiation source 27 relative to an original or desired projection plane 37, and 2 translational, 1 displacement, 2 tilt, and 1 rotational degrees of freedom for the recording medium 31 relative to the original or desired projection plane. Accordingly, a fiducial reference system sufficient to solve a projection system having nine degrees of freedom is needed to completely determine the system. - One embodiment of the present invention that permits this general arrangement to be realized conveniently involves two-dimensional projected images from a system comprised of a fiducial reference having five point-size or finite reference markers. This approach conveniently facilitates three-dimensional reconstructions when exactly four reference markers are coplanar and no three or more reference markers are collinear. Under these conditions, only the projection from the non-coplanar marker need be distinguished from the other four because the projections from the latter always bear a fixed sequential angular arrangement relative to each other which simplifies identification of homologous points in all projections. For example, the reference markers can be placed at five contiguous vertices of a cube as shown in
FIG. 5. Fiducial reference 122 comprises five reference markers, 23, 123, 223, 323, 423, positioned contiguously at five vertices of a cube. The object 121 is preferably positioned within the cube. The four co-planar reference markers, 23, 123, 223, and 323, then can be used for projectively warping or transforming the projected images onto a desired projection plane while the remaining reference marker 423 serves as the alignment marker required to determine the normalized projection angle as described in U.S. Pat. No. 5,359,637. - The most general reconstruction task requiring information sufficient to determine all nine possible degrees of freedom requires computation of separate projective transformations for each projected image in each and every slice. However, by limiting the region of interest to a subvolume constrained such that the magnification across and between its slices may be considered constant, it is possible to generate veridical three-dimensional images within the volume much more efficiently. The increase in efficiency under these conditions results from the fact that all projections within this region can be mapped by a single fixed transformation, and that associated slice generation can be accomplished by simple tomosynthetic averaging of laterally shifted projections as described in U.S. Pat. No. 5,359,637.
- Another useful arrangement of the fiducial reference comprising five reference markers is shown in
FIG. 11, wherein a fiducial reference 222 employing a pyramidal distribution of reference markers is used. The fiducial reference 222 comprises five reference markers, 23, 123, 223, 323, and 423, arranged about the object 221. As was the case in FIG. 5, four of the reference markers, 23, 123, 223, and 323, lie in a plane that can be used to establish the desired projection plane. Here, they define the four corners of the base of a pyramid. The fifth reference marker 423 is positioned to define the apex of the pyramid and serves as the means for determining the projection angles relative to the desired projection plane as described in U.S. Pat. No. 5,359,637. In use, the fiducial reference 222 may be attached or fixed relative to the object 221 such that the base of the pyramid is proximate to the recording medium and the apex of the pyramid is proximate to the source. - In
FIG. 15, a fiducial reference 322 having an alternative arrangement of reference markers in a pyramidal distribution is shown. In this arrangement, the fiducial reference 322 comprises a radiopaque frame 25 having a radiolucent central window. The four inside corners of the radiopaque frame 25 define four reference markers, 23, 123, 223, and 323, at the base of the pyramid. The fifth reference marker 423 is positioned at the apex of the pyramid. Preferably, the object 321 is positioned between the frame 25 and the reference marker 423. - In
FIG. 14, a fiducial reference 422 which is also useful for solving a system with nine degrees of freedom is shown. Fiducial reference 422 comprises a rectangular parallelepiped 33 with radiopaque reference markers, 23, 123, 223, 323, 423, and 523, centered on each of the six faces of the parallelepiped 33. The reference markers, 23, 123, 223, 323, 423, and 523, are marked with distinguishable indicia, such as X, Y, Z, {circle over (X)}, {circle over (Y)}, and {circle over (Z)}, so that the reference images cast by the markers, 23, 123, 223, 323, 423, and 523, can be identified easily and distinguished from one another. Alternatively or additionally, two or more of the edges of the parallelepiped 33 may be defined by radiopaque bars 26 such that the intersections of the bars 26 provide additional reference markers, such as reference marker 623 located at the intersection of the three bars labeled 26 in FIG. 14. - Alternatively, the six degrees of freedom for the
radiation source 27 relative to the desired projection plane 37 (two translational, one displacement, two rotational, and one tilting degree of freedom) can be determined independently from the use of the fiducial reference when the orientation of the detector is fixed or known relative to either the object of interest or the radiation source. For example, the position of the radiation source 27 can be determined from multiple plane projections recorded from an arbitrarily positioned camera provided that the lens aperture is adjusted such that the entire object always appears in focus. The three relative angles associated with each projection are determined by attaching three orthogonally oriented angle sensing devices, such as gyroscopes, to the camera. The displacement of the radiation source relative to the object is determined using a range finder associated with the camera. Since the position of the camera within a plane parallel to the camera's projection plane is used only to determine the three-dimensional geometric relationships underlying the disparity observed between object images, the remaining degrees of freedom need only be measured relative to one another and, therefore, can be fixed from a geometric analysis of paired point projections. Referring to FIG. 35, it can be seen that, when the arbitrary camera positions are compensated for displacement and projection orthogonality, the projected distances, D1, D2, and D3, between the paired points P1 and P2 are a sinusoidal function of the corrected projection angle. Hence, the actual distance between P1 and P2 can be estimated from a non-linear curve fit to the observed projection distances. - A method for determining the position of the radiation source relative to the object using an arbitrarily positionable camera in accordance with the present invention is depicted in
FIG. 34. At step 1000, angle sensors attached to the camera are initialized in order to eliminate possible drift in accuracy. The object is then roughly centered within the viewfinder of the camera and an object image, the nominal displacement of the camera from the object, and the angle data are recorded at step 1002. An intrinsic range finder associated with the camera is used to determine the nominal distance from the camera to the object of interest and the angle sensors are used to determine the angle data. - At
step 1004, it is determined whether additional object images are desired. If additional object images are desired, the camera is repositioned at step 1005 and the process returns to step 1002. It should be appreciated that a minimum of three object images is required to produce a meaningful sinusoidal regression, as discussed in detail below. If no additional object images are to be recorded, the recorded object images and data are optionally stored in a computer readable format and the process proceeds to step 1007. - Each of the object images is then individually scaled to render all of the object images at the same magnification at
step 1009. The scaling is possible using the range recorded for each object image because the linear magnification is inversely proportional to the range. By scaling the object images, an effective displacement between the camera and the object can be defined. - At
step 1011, a first object point, visible on all of the projected object images, is selected. A representative object image is then selected at step 1013. The representative object image should be the object image which best approximates the orientation to which desired reconstructed tomosynthetic slices are to be parallel. - Each object image is then rotated and translated, at
step 1015, so that all of the object images are brought into tomosynthetic registration. Specifically, each object image is rotated by an amount sufficient to adjust the rotational orientation of the camera about an axis perpendicular to the projection plane to match that of the representative object image. Rotational adjustment of the object images assures that the registrations which follow will not exclude a second reference point, whose selection is discussed below. Each rotated object image is then translated both vertically and horizontally by an amount which causes superposition of the projected image of the first object point within each object image with the projected image of the first object point within the representative object image. - At
step 1017, a second object point visible on all of the scaled, rotated, and translated object images is selected. The distance between the projected images of the second object point and the first object point is measured, at step 1019, for each of the object images. If the relative change in distance does not exceed a task-dependent threshold value and produce a well-distributed range of values, the accuracy of the subsequent non-linear regression may be compromised. Accordingly, at step 1021, it is determined whether the measured distances exceed the task-dependent threshold. If the threshold is not exceeded, a new second object point is selected at step 1017. If the threshold is exceeded, the process proceeds to step 1023. - At
step 1023, the actual distance between the first object point and the second object point is estimated from the measured distance separating the projected images of the first and second object points in the recorded object images. The estimate of the actual distance is determined using the effective displacement of the camera from the object and a sinusoidal curve fitting procedure, as well as the projection angle defined by a line connecting the first and second object points and the plane of the representative object image. - Using affine projection geometry, the recorded angle data, and the recorded displacement data, each object image is remapped onto the plane defined by the representative object image selected above at
step 1025. The remapping is performed using the first object point as the common point of superposition. At step 1027, the object images are then tomosynthetically reconstructed using the second object point as a disparity marker. The distances between object images are then calibrated, at step 1029, using the estimate for the distance between the first and second object points and trigonometrically correcting the object images for foreshortening caused by variations in the projection angle. - Referring to
FIG. 36, one arrangement for unconstraining and uncoupling the radiation source from the selected object is depicted. As shown in the figure, a radiation source 1050 is mounted on a mobile carriage 1052. The carriage 1052 is controlled remotely using a transmitter 1054 which transmits a signal to the carriage 1052 through an antenna 1056. In operation, the transmitter 1054 is operated to maneuver the carriage 1052, and thereby the radiation source 1050, to move around a selected object 1058 to enable projected images of the object 1058 to be recorded on a detector 1060 at a variety of relative positions of the radiation source 1050, the object 1058 and fiducial reference 1062, and the detector 1060. In order to provide essentially complete freedom in positioning the radiation source 1050 relative to the object 1058 and fiducial reference 1062, the elevation and angle of tilt of the radiation source 1050 relative to the object 1058 and fiducial reference 1062 are also controllable through the transmitter 1054. - Reducing the uncertainty of the projection geometry through the constraint of one or more degrees of freedom reduces the complexity of the resulting reconstruction. An arrangement of the system of the present invention which is somewhat constrained is depicted in
FIGS. 12 and 13, wherein a hand-held X-ray source is provided such that the orthogonal distance between the radiation source 127 and the recording medium 131 is fixed by a C-arm 129 at a distance short enough that the image cast by the fiducial reference 122 is magnified relative to the size of the actual fiducial reference 122. Preferably, the C-arm 129 is connected to the recording medium 131 by a concentric swivel collar 149 to allow the C-arm 129 to be rotated relative to the recording medium 131. A disposable and crushable radiolucent foam cushion 130 may be attached to the surface of the recording medium 131 to permit comfortable, customized, stable adaptation of the detector 131 to the object 121. The other end of the C-arm 129 is attached to a potted X-ray source 145 so that radiation emanating from the potted X-ray source 145 impinges upon the recording medium 131. A trigger 146 is provided for operating the source 127. The source 127 optionally comprises a circular beam collimator 147 for collimating radiation emanating from the source 127. The collimator 147 may provide a relatively long focal-object distance to provide nearly affine projection geometries. Preferably, a handle 148 is also provided to enable the operator to more easily maneuver the source 127. The hand-held X-ray source 127 is connected to a computer/high voltage source 128 for controlling operation of the device. In addition, a disposable plastic bag 132 can be positioned around the detector 131 for microbial isolation. The source 127 can optionally comprise a rotatable transparent radiopaque plastic cylinder 119 and a transparent radiopaque shield 152 to protect the operator from scattered radiation. In this arrangement, there are three degrees of freedom (two translational and one displacement for the radiation source 127). Accordingly, a fiducial reference compensating for at least three degrees of freedom is necessary to completely describe or analyze the system.
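The magnification bookkeeping that such a fixed source-to-detector geometry makes possible can be sketched numerically. The following Python fragment is an illustrative sketch, not the patent's own algorithm; the function name, the numeric values, and the ideal point-source assumption are all hypothetical. Because the minor diameter of a spherical marker's elliptical shadow is insensitive to beam obliquity, the magnification is m = b / (2r), and with the source-to-detector distance c fixed by the C-arm, the source-to-marker distance follows as d = c / m.

```python
def sphere_shadow_geometry(r_mm, b_mm, c_mm):
    """Estimate magnification and source-to-marker distance from the
    minor diameter of a spherical marker's elliptical shadow.

    r_mm: known sphere radius; b_mm: measured minor shadow diameter;
    c_mm: fixed orthogonal source-to-detector distance (the C-arm).
    Assumes an ideal point source, so magnification m = c / d.
    """
    m = b_mm / (2.0 * r_mm)   # minor axis is tilt-invariant
    d = c_mm / m              # source-to-marker distance
    return m, d

# Hypothetical example: a 1 mm-radius sphere casting a 2.5 mm
# minor-axis shadow with the source fixed 300 mm from the medium.
m, d = sphere_shadow_geometry(r_mm=1.0, b_mm=2.5, c_mm=300.0)
print(m, d)  # 1.25 240.0
```

Two translational measurements of the shadow position would then complete the projection geometry for this three-degree-of-freedom arrangement, consistent with the requirement stated above.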
One convenient embodiment for solving the system depicted in FIGS. 12 and 13 employs a fiducial reference 122 comprising a single radiopaque sphere of finite diameter. Under those conditions, the length of the minor axis of the resulting elliptical shadow plus two translational measurements are sufficient to define the projection geometry completely. - The computational steps involved in synthesizing a three-dimensional image using three spherical, non-linear reference markers in a system wherein the orthogonal distance between the radiation source and the recording medium is fixed at a distance short enough so that the images cast by the reference markers are magnified relative to the size of the actual reference markers (i.e., a system with eight degrees of freedom as depicted in
FIGS. 12 and 13) can be derived with reference to FIGS. 17 and 19. In the drawings, c is the fixed distance between the source and the projection plane; Ps is the orthogonal projection of the source onto the projection plane; B, M, and T are the reference markers; r is the radius of the reference markers; ap is the distance from the center of a reference marker to the source; θ is the angle subtended by the center of a reference marker relative to a line orthogonal to the projection plane through the source; φ is the angle at the apex of an isosceles triangle having a base of length r and a height of length ap; Bs, Ms, and Ts are the reference images associated with the reference markers; a (or, alternatively, dp) is the major diameter of the reference images; b is the minor diameter of the reference images; x is the length of a section of an arc associated with a reference image measured from the projection of the center of the corresponding reference marker onto the projection plane along the major diameter, a, in a direction toward Ps; y is the length of an arc associated with a reference image through the projection of the center of the corresponding reference marker onto the projection plane and parallel to the minor diameter of the reference image; and ds is the major diameter of a reference image in a virtual projection plane. - In
FIG. 6, another arrangement of the system of the present invention is depicted wherein the radiation source 27 is located at a fixed distance from the selected object 21 and sufficiently far so that magnification is not significant. However, the recording medium 31 is allowed to be shifted, displaced, and tilted relative to the selected object 21 and an original or desired projection plane 37. In this arrangement, there are seven degrees of freedom (two translational degrees of freedom for the radiation source 27 and five degrees of freedom, accounting for shift, displacement, and tilt, for the recording medium 31). Accordingly, a fiducial reference compensating for at least seven degrees of freedom is necessary to determine the relative positions of the radiation source 27, the selected object 21 and the recording medium 31. - In
FIG. 7, yet another arrangement of the system of the present invention is depicted wherein the distance between the object 21 and the radiation source 27 is sufficiently large so that magnification can be ignored and wherein the recording medium 31 is free to shift laterally relative to the object 21 and the desired or original projection plane 37. In this arrangement, there are four degrees of freedom (two translational degrees of freedom for the radiation source 27 and two translational degrees of freedom for the recording medium 31). Therefore, a fiducial reference having at least four degrees of freedom is necessary to completely determine the system. Accordingly, a fiducial reference comprising at least two point-size reference markers can be used to determine the position of the radiation source relative to the selected object 21 and the recording medium 31. This relatively constrained system may be useful in three-dimensional reconstructions of transmission electron micrographs produced from video projections subtending various degrees of specimen tilt and exhibiting various amounts of arbitrary and unpredictable lateral shift due to intrinsic instability associated with the instrument's electron lenses. - Referring to
FIG. 1, the radiation source 27 may be either a portable or a stationary X-ray source. However, the radiation source 27 is not limited to an X-ray source. The specific type of source 27 which is utilized will depend upon the particular application. For example, the present invention can also be practiced using magnetic resonance imaging (MRI), ultrasound, visible light, infrared light, ultraviolet light, or microwaves. - In the embodiment shown in
FIG. 10, the source 227 is a hand-held X-ray source, similar to that described above in reference to source 127, except that a low power laser aiming device 250 and an alignment indicator 251 are provided to ensure that the source 227 and the recording medium 231 are properly aligned. In addition, a radiolucent bite block 218 is provided to constrain the detector 231 relative to the object 221, thereby constraining the system to three degrees of freedom (two translational and one displacement for the radiation source 227 relative to the object 221 and detector 231). Consequently, the fiducial reference 222 can be fixed directly to the bite block 218. When the source 227 is properly aligned with the recording medium 231, radiation emanating from the aiming device 250 impinges on the recording medium 231. In response to a measured amount of radiation impinging on the recording medium 231, a signal is sent to activate the alignment indicator 251, which preferably produces a visible and/or auditory signal. With the alignment indicator 251 activated, the X-ray source 245 can be operated at full power to record a projected image. In addition, the source 227 can optionally comprise a collimator 247 to collimate the radiation from the X-ray source and/or a transparent scatter shield 252 to protect the operator from scattered radiation. In lieu of the scatter shield 252, the operator can stand behind a radiopaque safety screen when exposing the patient to radiation from the source 227. A handle 248 and trigger 246 may be provided to facilitate the handling and operation of the source 227. The source 227 is connected to a computer/high voltage source 228 and an amplifier 260 for controlling operation of the device. - In one embodiment, the aiming
device 250 comprises an X-ray source operated in an ultra-low exposure mode and the projected image is obtained using the same X-ray source operated in a full-exposure mode. Alternatively, a real-time ultra-low dose fluoroscopic video display can be mounted into the handle 248 of the source 227 via a microchannel plate (MCP) coupled to a CCD. The video display switches to a lower gain (high signal-to-noise) frame-grabbing mode when the alignment is considered optimal and the trigger 246 is squeezed more tightly. - An alternate embodiment of an aiming device in accordance with the present invention is shown in
FIG. 22. The aiming device 850 comprises a laser source 857 and a radiolucent angled mirror 858 which produces a laser beam, illustrated by dashed line 859, which is concentric with the radiation emanating from the source 827. The alignment indicator 851 comprises a radiolucent spherical surface 861 which is rigidly positioned relative to the detector 831 by a C-arm 829 that is plugged into the bite block 818. When the aiming device 850 is aimed such that the laser beam 859 impinges upon the spherical surface 861, the specular component of the laser beam 859 is reflected by the spherical surface 861. Accordingly, proper alignment of the source 827, the object 821, and the detector 831 is obtained when the reflected portion of the laser beam 859 is within a small solid angle determined by the position of the aiming device 850. Direct observation of the reflected portion of the laser beam 859 by a detector or observer 862 can be used to verify the alignment. As shown in the figure, the fiducial reference 822 comprises a radiolucent spacer containing a fiducial pattern that is affixed to the detector 831. Further, a central ring area 863 can be designated at the center of the spherical surface 861 such that aiming the laser beam 859 at the central ring area 863 assures an essentially orthogonal arrangement of the source 827 and the detector 831. In addition, replacing the concentric laser source 857 with a laser source that produces two laser beams that are angled relative to the radiation emanating from the source 827 permits the distance between the source 827 and the detector 831 to be set to a desired distance, provided that the two laser beams are constrained to converge at the spherical surface 861 when the desired distance has been established. - Referring again to
FIG. 1, the recording medium 31 is provided for recording the projected object image 40 of the selected object 21 and the projected reference images, 39 and 139, of the reference markers 23 and 123. The recording medium 31 may be in the form of a photographic plate or a radiation-sensitive, solid-state image detector such as a radiolucent charge-coupled device (CCD). - In one particular embodiment depicted in
FIG. 8, the recording medium 331 comprises a CCD having a top screen 200, a bottom screen 206 positioned below the top screen 200, and a detector 210 positioned below the bottom screen 206. The top screen 200 is monochromatic so that a projected image projected onto the top screen 200 causes the top screen 200 to fluoresce or phosphoresce a single color. In contrast, the bottom screen 206 is dichromatic, so that the bottom screen 206 fluoresces or phosphoresces in a first color in response to a projected image projected directly onto the bottom screen 206 and fluoresces or phosphoresces in a second color in response to fluorescence or phosphorescence from the top screen 200. The detector 210 is also dichromatic so as to allow for the detection and differentiation of the first and the second colors. The recording medium 331 may also comprise a radiolucent optical mask 202 to modulate the texture and contrast of the fluorescence or phosphorescence from the top screen 200, a radiolucent fiber-optic spacer 204 to establish a known projection disparity, and a radiopaque fiber-optic faceplate 208 to protect the detector 210 from radiation emanating directly from the radiation source. - Yet another embodiment is depicted in
FIGS. 20 and 21, wherein the detector 731 comprises a phosphor-coated CCD and the fiducial reference 722 comprises a radiopaque rectangular frame 725. Both the detector 731 and the fiducial reference 722 are contained within a light-tight package 756. The detector 731 and fiducial reference 722 are preferably positioned flush with an upper, inner surface of the package 756. The dimensions of the frame 725 are selected such that the frame 725 extends beyond the perimeter of the detector 731. Phosphor-coated strip CCDs 754 are also contained within the package 756. The strip CCDs 754 are positioned below the frame 725 such that radiation impinging upon the frame 725 casts an image of each edge of the frame 725 onto one of the strip CCDs 754. The positions of the frame shadow on the strip CCDs 754 are used to determine the projection geometry. - In the embodiment shown in
FIG. 9, the recording medium 431 is smaller than the projected image of object 521. Provided that the reference images, 39 and 139, corresponding to the reference markers, 23 and 123, can be identified on all the projected images, image slices extending across the union of all the projected images can be obtained. This is illustrated schematically in FIG. 9, wherein the reference images, 39 and 139, are taken with the source 27 and the recording medium 431 in the image positions indicated by the solid lines. Similarly, the dashed images, 39′ and 139′, are taken with the source 27′ and the recording medium 431′ in the positions indicated by the dashed lines. Accordingly, image slices of an object which casts an object image that is larger than the recording medium 431 can be synthesized. Further, by using multiple fiducial references spaced in a known pattern which are all linked to the object of interest, additional regions of commonality can be identified between multiple overlapping projection geometries, so that a region of any size can be propagated into a single, unified reconstruction. Thus, it is possible to accommodate an object much larger than the recording medium used to record individual projection images. - Similarly, as depicted in
FIG. 37, regions of overlap between two or more recorded sets of projected images can be used as a basis for extrapolating registration and calibration of the sets of projected images. As shown, a first set of projected images is recorded using an X-ray camera configured to provide a first aperture. A second set of projected images is then recorded using the camera configured to provide a second aperture. The first and second sets of projected images are then brought into alignment by identifying fiducial reference points that are common to the overlapping regions of the projected images. - The present invention also relates to a method for creating a slice image through the
object 21 of FIG. 1 from a series of two-dimensional projected images of the object 21, as shown in FIG. 2. The method of synthesizing the image slice starts at step 45. Each step of the method can be performed as part of a computer-executed process. - At
step 47, a fiducial reference 22 comprising at least two reference markers, 23 and 123, is selected which bears a fixed relationship to the selected object 21. Accordingly, the fiducial reference 22 may be affixed directly to the selected object 21. The minimum required number of reference markers 23 is determined by the number of degrees of freedom in the system, as discussed above. When the fiducial reference 22 comprises reference markers 23 of a finite size, the size and shape of the reference markers 23 are typically recorded. - The selected
object 21 and fiducial reference 22 are exposed to radiation from any desired projection geometry at step 49 and a two-dimensional projected image 38 is recorded at step 51. Referring to FIG. 1, the projected image 38 contains an object image 40 of the selected object 21 and a reference image, 39 and 139, respectively, for each of the reference markers 23 and 123 of the fiducial reference 22. - At
step 53, it is determined whether additional projected images 38 are desired. The desired number of projected images 38 is determined by the task to be accomplished. Fewer images reduce the signal-to-noise ratio of the reconstructions and increase the intensities of component “blur” artifacts. Additional images provide information which supplements the information contained in the prior images, thereby improving the accuracy of the three-dimensional radiographic display. If additional projected images 38 are not desired, then the process continues at step 60. - If additional projected images 38 are desired, the system geometry is altered at
step 55 by varying the relative positions of (1) the radiation source 27, (2) the selected object 21 and the fiducial reference 22, and (3) the recording medium 31. The geometry of the system can be varied by moving the radiation source 27 and/or the recording medium 31. Alternatively, with the source 27 and recording medium 31 held fixed, the selected object 21 and fiducial reference 22 are moved. When the radiation source and recording medium produce images using visible light (e.g., a video camera), the geometry of the system must be varied to produce images from various sides of the object in order to obtain information about the entire object. After the system geometry has been varied, the process returns to step 49. - After all of the desired projected images have been recorded, a slice position is selected at
step 60. The slice position corresponds to the position at which the image slice is to be generated through the object. - After the slice position has been selected, each projected image 38 is projectively warped onto a
virtual projection plane 37 at step 65. The warping procedure produces a virtual image corresponding to each of the actual projected images. Each virtual image is identical to the image which would have been produced had the projection plane been positioned at the virtual projection plane with the projection geometry for the radiation source 27, the selected object 21, and the fiducial reference 22 of the corresponding actual projected image. The details of the steps involved in warping the projection plane 37 are shown in FIG. 3. The process starts at step 70. - At
step 72, a virtual projection plane 37 is selected. In most cases it is possible to arrange for one of the projected images to closely approximate the virtual projection plane position. That image can then be used as the basis for transformation of all the other images 38. Alternatively, as shown for example in FIG. 4, if the fiducial reference 22 comprises more than two co-planar reference markers 23, a plane which is parallel to the plane containing the co-planar reference markers 23 can be selected as the virtual projection plane 37. When the virtual projection plane 37 is not parallel to the plane containing the co-planar reference markers 23, the validity of the slice reconstruction is maintained, but the reconstruction yields a slice image which may be deformed due to variations in magnification. The deformation becomes more prominent when the magnification varies significantly over the range in which the reconstruction is carried out. In such cases, an additional geometric transformation to correct for differential magnification may be individually performed on each projected image 38 to correct for image deformation. - One of the recorded projected images 38 is selected at
step 74 and the identity of the reference images 39 cast by each reference marker 23 is determined at step 76. In the specialized case, such as the one shown in FIG. 1, where spherical reference markers 23 of the same radius are used and the relative proximal distance of each reference marker 23 to the radiation source 27 at the time that the image 38 was recorded is known, assignment of each elliptical image 39 to a corresponding reference marker 23 can be accomplished simply by inspection. Under such conditions, the minor diameter of the elliptical image 39 is always larger the closer the reference marker 23 is to the radiation source 27. This is shown most clearly in FIG. 17, wherein the minor diameter of reference image Bs corresponding to reference marker B is smaller than the minor diameter of reference image Ts corresponding to reference marker T. Alternatively, when applied to radiation capable of penetrating the fiducial reference 22 (i.e., X-rays), spherical reference markers 23 which are hollow, having different wall thicknesses and hence different attenuations, can be used. Accordingly, the reference image 39 cast by each spherical reference marker 23 can be easily identified by the pattern of the reference images 39. Analogously, spherical reference markers 23 of different colors could be used in a visible light mediated system. - The position of each
reference image 39 cast by each reference marker 23 is measured at step 78. When a spherical reference marker 23 is irradiated by source 27, the projected center 41 of the reference marker 23 does not necessarily correspond to the center 42 of the reference image 39 cast by that reference marker 23. Accordingly, the projected center 41 of the reference marker 23 must be determined. One method of determining the projected center 41 of the reference marker 23 is shown in FIG. 16. The variation in intensity of the reference image 39 associated with reference marker 23 along the length of the major diameter of the reference image 39 is represented by the brightness profile 43. The method depicted in FIG. 16 relies on the fact that the projected center 41 always intersects the brightness profile 43 of the reference image 39 at, or very near, the maximum 44 of the brightness profile 43. Accordingly, the projected center 41 of a spherical reference marker 23 produced by penetrating radiation can be approximated by smoothing the reference image 39 to average out quantum mottle or other sources of brightness variations which are uncorrelated with the attenuation produced by the reference marker 23. An arbitrary point is then selected which lies within the reference image 39. A digital approximation to the projected center 41 is isolated by performing a neighborhood search of adjacent pixels and propagating the index position iteratively to the brightest (most attenuated) pixel in the group until a local maximum is obtained. The local maximum then represents the projected center 41 of the reference marker 23. - Returning to step 78 of
FIG. 3, when the fiducial reference 22 comprises reference markers 23 of finite size, the size of each image 39 cast by each reference marker 23 is also recorded. For example, the lengths of the major and minor diameters of elliptical reference images cast by spherical reference markers 23 can be measured. Computerized fitting procedures can be used to assist in measuring the elliptical reference images 39 cast by spherical reference markers 23. Such procedures, which are well-known in the art, may be used to isolate the elliptical reference images 39 from the projected image 38 and determine the major and minor diameters of the reference images 39. - Because the attenuation of a
spherical reference marker 23 to X-rays approaches zero at tangential extremes, the projected minor diameter of resulting elliptical reference images 39 will be slightly smaller than that determined geometrically by projection of the reference marker's actual diameter. The amount of the resulting error is a function of the energy of the X-ray beam and the spectral sensitivity of the recording medium 31. This error can be eliminated by computing an effective radiographic diameter of the reference marker 23, as determined by the X-ray beam energy and the recording medium sensitivity, in lieu of the actual diameter. - One method of obtaining the effective radiographic diameter is to generate a series of tomosynthetic slices through the center of the
reference marker 23 using a range of values for the reference marker diameter decreasing systematically from the actual value and noting when the gradient of the reference image 39 along the minor diameter is a maximum. The value for the reference marker diameter resulting in the maximum gradient is the desired effective radiographic diameter to be used for computing magnification. - Further, each projected image can be scaled by an appropriate magnification. For
fiducial references 22 comprising spherical reference markers 23, the minor diameter of the reference image 39 is preferably used to determine the magnification since the minor diameter does not depend on the angle between the source 27 and the recording medium 31. Accordingly, the magnification of a spherical reference marker 23 can be determined from the measured radius of the reference marker 23, the minor diameter of the reference image 39 on the recording medium 31, the vertical distance between the center of the reference marker 23 and the recording medium 31, and the vertical distance between the recording medium 31 and the virtual projection plane 37. - Returning to
FIG. 3 with reference to FIG. 1, a projection transformation matrix, representing a series of transformation operations necessary to map the selected projected image 38 onto the virtual projection plane 37, is generated at step 80. The projection transformation matrix is generated by solving each projected image 38 relative to the virtual projection plane 37. In one embodiment, the positions of the co-planar reference markers 23 are used to determine the transformation matrix by mapping the position of the reference images 39 cast by each co-planar reference marker 23 in the projected image onto its corresponding position in the virtual projection plane. For example, when the fiducial reference comprises a radiopaque frame 25, the positions of the reference images 39 cast by the reference markers 23 formed at the corners of the frame 25 are mapped to a canonical rectangle having the same dimensions and scale as the frame 25. This approach also serves to normalize the projective data. Depending on the number of degrees of freedom, the transformation operations range from complex three-dimensional transformations to simple planar rotations or translations. Once the projective transformation matrix has been generated, the matrix is used to map the projected image 38 onto the virtual projection plane 37 at step 82. - At
step 84, it is determined whether all of the projected images 38 have been analyzed. If all of the projected images 38 have not been analyzed, the process returns to step 74, wherein an unanalyzed image 38 is selected. If no additional projected images 38 are to be analyzed, then the process proceeds through step 85 of FIG. 3 to step 90 of FIG. 2. - After each image has been warped onto the virtual projection plane, an image slice through the
object 21 at the selected slice position is generated at step 90. An algorithm, such as that described in U.S. Pat. No. 5,359,637, which is incorporated herein by reference, can be used for that purpose. The positions of the reference images cast by the alignment marker or markers 23 in each projected image 38 are used as the basis for application of the algorithm to generate the image slices. - By generating image slices at more than one slice position, a true three-dimensional representation can be synthesized. Accordingly, it is determined whether an additional slice position is to be selected at
step 92. If an additional slice position is not desired, the process proceeds to step 94. If a new slice position is to be selected, the process returns to step 60. - If image slices at multiple slice positions have been generated, the entire set of image slices is integrated into a single three-dimensional representation at
step 94. Alternative bases for interactively analyzing and displaying the three-dimensional data can be employed using any number of well-established three-dimensional recording and displaying methods. Additionally, the three-dimensional representation can be displayed using the display device depicted in FIG. 28 in order to produce a holographic-type display. The display device comprises a pair of stereoscopic eyeglasses or spectacles 1080 which are worn by an observer 1082. The eyeglasses 1080 contain lenses which are either cross-polarized or which pass complementary colored light. In addition, a target 1084 is positioned on the eyeglass frame 1080. A color computer monitor 1086 and video camera or detector 1088 are provided in association with the eyeglasses 1080. The color monitor 1086 is used to display complementary-colored or cross-polarized stereoscopic image pairs 1090 of the three-dimensional representation. The video camera 1088 is used to track the target 1084 as the observer's head is moved. When the observer's head is moved to a different position, the video camera 1088 relays information either directly to the color monitor 1086 or to the color monitor 1086 through computer-related hardware. The information relayed by the video camera relates to the angle subtended by the target 1084 relative to the video camera 1088. The relayed information is then used to alter the angular disparity associated with the stereoscopic image pairs 1090 being displayed on the color monitor 1086 in quasi-realtime, so that the resulting display is adjusted to correlate with the movement of the observer's head and appears holographic to the observer. - Instead of creating a slice image or a three-dimensional representation from one or more series of two-dimensional images, a nearly isotropic three-dimensional image can be created from a single pair of two-dimensional projections as depicted in
FIG. 39. As shown, the two-dimensional images are combined and overlap to produce a three-dimensional image. Since only one two-dimensional image is utilized to reconstruct each slice image, the method depicted in FIG. 39 represents a completely degenerate case wherein the slice image is infinitely thick. When the slice image is infinitely thick, the slice image is indistinguishable from a conventional two-dimensional projection of a three-dimensional object. - The steps of a method for producing a three-dimensional image of an object from a single pair of two-dimensional projections are shown in
FIG. 32. At step 1100, a three-dimensional fiducial reference is functionally associated with an object of interest. The association need only be complete enough to permit the location of all of the details in the object to be determined relative to the position of the object. The fiducial reference must occupy a volume and be defined spatially such that a minimum of six points can be unequivocally generated and/or identified individually. For example, the object may be encased inside a cubic reference volume wherein the corners of adjacent faces are rendered identifiable by tiny, spherical fiducial markers. - A first projected image is then produced on a first projection plane at
step 1102. The relative positions of the object, the radiation source, and the detector are then altered so that a second projected image can be recorded on a second projection plane at step 1104. The second projection plane must be selected so that it intersects the first projection plane at a known angle. However, for the resultant three-dimensional representation to be mathematically well conditioned, the angle should be orthogonal, or as close to orthogonal as practicable. - At
step 1106, a projective transformation of each projected image is performed to map the images of the fiducial reference on each face into an orthogonal, affine representation of the face. For example, when a cubic fiducial reference is used, the projective transformation amounts to converting the identifiable corners of the image of the fiducial reference corresponding to a projected face of the fiducial reference into a perfect square having the same dimensions as a face of the fiducial reference. - Each of the transformed projected images is then extruded, at
step 1108, such that both projected images occupy the same virtual volume. The extrusion step is equivalent to the creation of a virtual volume having the same dimensions as the fiducial reference containing the sum of the transformed projected images. At step 1110, an optional non-linear filtering technique is used to limit visualization of the three-dimensional representation to the logical intersection of the transformed projected images. - The three-dimensional representation can be refined by optionally recording additional projected images. At
step 1112, it is determined whether additional projected images are to be recorded. If additional projected images are desired, the process returns to step 1104. However, if additional projected images are not desired, the three-dimensional representation is displayed at step 1114. - The present invention also relates to a method for reducing distortions in the three-dimensional representation. Tomosynthesis uses two-dimensional image projections constrained within a limited range of angles relative to the irradiated object to produce a three-dimensional representation of the object. The limited range of angles precludes complete and uniform sampling of the object. This results in incomplete three-dimensional visualization of spatial relationships hidden in the resulting undersampled shadows or null spaces. Another limiting factor which interferes with artifact-free tomosynthetic reconstruction is the change in slice magnification with depth caused by the relative proximity of the source of radiation. These distortions can be reduced by merging independently generated sets of tomosynthetic image slices, as shown in
FIG. 33 . - At
step 1120, a fiducial reference is functionally associated with the object and at least two independent sets of image slices are recorded. The angular disparity between the sets of image slices is noted. For example, the first set of image slices may comprise multiple anterior-posterior projections while the second set of image slices comprises multiple lateral projections. The sets of image slices are then integrated to create a first and a second three-dimensional tomosynthetic matrix volume at step 1122. - At
step 1124, the resulting three-dimensional matrix volumes are affinized to counteract the effects of having a finite focal-object distance. Affinization is accomplished by first identifying the reference images of the appropriate reference markers of the fiducial reference. Once the reference images have been identified, the three-dimensional matrix volumes are shifted and scaled in order to correct for geometrical and surface imperfections. The transformation of the first three-dimensional matrix volumes is carried out in accordance with the following equation:
A′=CA
where A is the first three-dimensional matrix volume, A′ is the shifted and scaled first three-dimensional matrix volume, and C is the affine correction matrix for the first three-dimensional matrix volume. The affine correction matrix C is determined by the number of slices comprising the three-dimensional matrix volume, the correlation angle (i.e., the greatest angle of the projection sequence in the range
measured from an axis normal to the detector surface), and the correlation distance (i.e., the apex-to-apex distance created by the intersection of the most disparate projections of the sequence). The transformation of the second three-dimensional matrix volume is analogously determined in accordance with the following equation:
L′=DL
where L is the second three-dimensional matrix volume, L′ is the shifted and scaled second three-dimensional matrix volume, and D is the affine correction matrix for the second three-dimensional matrix volume. - At
step 1126, the second three-dimensional matrix volume is rotated by an angle φ. The angle φ is defined as the angular disparity between the first and the second three-dimensional matrix volumes. Specifically, the shifted and scaled second three-dimensional matrix volume, L′, is rotated in accordance with the following equation:
L″=RφL′
where L″ is the rotated, shifted, and scaled second three-dimensional matrix volume and Rφ is the rotational transform matrix. - The transformed matrix volumes, A′ and L″, are then merged using matrix averaging at
step 1128. The matrix averaging is accomplished in accordance with the following equation:
M=(A′+L″)/2
where M is the averaged matrix of the two component transformed matrix volumes, A′ and L″. Alternatively, a non-linear combination of the transformed matrix volumes, A′ and L″, is performed. - The present invention further relates to a method for generating tomosynthetic images optimized for a specific diagnostic task. A task-dependent method for tomosynthetic image reconstruction can be used to mitigate the effects of ringing artifacts from unregistered details located outside the focal plane of reconstruction, which are intrinsic to the tomosynthetic reconstruction process. The production and elimination of blurring artifacts is depicted schematically in
FIG. 24. As shown, a first radiopaque object 1140 within the focal plane 1141 and a second radiopaque object 1142 above the focal plane are irradiated from two different source positions 1144 to produce two distinct data images. The first data image 1146 contains an image of the first radiopaque object 1140 at relative position C and an image of the second radiopaque object 1142 at relative position B. The second data image 1148 contains an image of the first radiopaque object 1140 at relative position F and an image of the second radiopaque object 1142 at relative position G. When a linear combination of the first and second data images is performed, the image intensity at the same relative position of both data images is averaged. For example, relative position B in one data image corresponds to relative position E in the other data image and, therefore, the corresponding relative position in the tomosynthetic image is assigned an intensity equal to the average of the intensity measured at relative position B and relative position E (i.e., (B+E)/2). As a result, the tomosynthetic image 1150 is marked by a blurring of the image produced by the first radiopaque object 1140. However, when a non-linear combination of the first and second data images is performed, both data images are compared and, for example, only the minimum intensity at each relative position is retained. For example, relative position B in one data image corresponds to relative position E in the other data image and, therefore, the corresponding relative position in the tomosynthetic image is assigned an intensity equal to the lesser of the intensities measured at relative position B and relative position E (i.e., min(B, E)). As a result, the blurring shadows are eliminated from the tomosynthetic image 1152.
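The two combination rules just illustrated can be sketched in a few lines (the function name, the NumPy usage, and the toy one-pixel images are our own illustration, not part of the disclosed system):

```python
import numpy as np

def combine_projections(shifted, mode="mean"):
    """Combine already-registered projections into one tomosynthetic slice.

    mode="mean" is the conventional linear combination, which averages
    out-of-plane shadows into blur; mode="min" keeps only the lowest
    intensity at each pixel, suppressing ringing from radiopaque details
    outside the focal plane (a "max" rule would be the radiolucent
    counterpart).
    """
    stack = np.stack([np.asarray(p, dtype=float) for p in shifted])
    if mode == "mean":
        return stack.mean(axis=0)
    if mode == "min":
        return stack.min(axis=0)
    return stack.max(axis=0)

# Two one-pixel "data images" carrying intensities B and E.
B, E = 4.0, 2.0
linear = combine_projections([[[B]], [[E]]], mode="mean")    # (B + E) / 2 = 3.0
nonlinear = combine_projections([[[B]], [[E]]], mode="min")  # min(B, E) = 2.0
```

With B=4 and E=2 the linear rule reports 3, preserving the blurred shadow, while the minimum rule reports 2 and discards it.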
- The non-linear tomosynthetic approach in accordance with the present invention is beneficial when, for example, physicians want to know with relative assurance whether a lesion or tumor has encroached into a vital organ. When viewing a linear tomosynthetic reconstruction of the general region in three dimensions, the ringing artifacts tend to blur the interface between the lesion or tumor and the surrounding tissues. However, since tumors are typically more dense than the tissues that are at risk of invasion, the non-linear tomosynthetic reconstruction can be employed such that only the relatively radiopaque tumor structures of interest are retained in the reconstructed image. Similarly, a different non-linear operator could be used such that only relatively radiolucent structures of interest are retained in the reconstructed image to determine whether a lytic process is occurring in relatively radiopaque tissues.
- The use of non-linear operators to reduce the effects of ringing artifacts is effective because images of many structures of radiographic interest have projection patterns determined almost entirely by discrete variations in mass or thickness of relatively uniform materials. Under these conditions, changes in radiographic appearance map closely with simple changes in either material thickness or density. In other words, complicating attributes associated with visual images, such as specular reflections, diverse energy-dependent (e.g., color) differences, etc., do not contribute significantly to many diagnostic radiographic applications. This simplification assures that many tissues can be identified easily by their position in a monotonic range of X-ray attenuations. Accordingly, selection of only projections yielding maximum or minimum attenuations when performing tomosynthetic reconstructions derived from such structures assures that resulting image slices yield results characterized by only extremes of a potential continuum of display options. Such displays make sense when the diagnostic task is more concerned with specificity (i.e., a low likelihood of mistaking an artifact for a diagnostic signal) than sensitivity (i.e., a low likelihood of missing a diagnostic signal).
- A method for task-dependent tomosynthetic image reconstruction is depicted in the flow chart of
FIG. 23. The method begins at step 900 and proceeds to step 902 where a series of projected images is acquired. In one embodiment, the projected images are acquired in the same manner already described in connection with steps 49-55 of FIG. 2. At step 904, the projected images are shifted laterally, in the plane of the projection, by amounts required to produce a desired tomosynthetic slice where all the images are then superimposed, in a manner identical to the method described in connection with the corresponding steps of FIG. 2. - Once the projected images have been acquired and appropriately shifted, the type and degree of task-dependent processing is chosen. At
step 906, it is determined whether only those features characterized by a relatively high attenuation are to be unequivocally identified. If only features having a high attenuation are to be identified, a pixel value corresponding to a desired minimum attenuation is selected. The selected pixel value is used as a minimum threshold value whereby each projected image is analyzed, pixel by pixel, and all pixels having an associated attenuation value below the selected pixel value are disregarded when an image slice is generated. - If, however, at
step 906, it is determined that features having a low attenuation are to be identified or that the entire range of attenuating structures is to be identified, then it is determined at step 910 whether only features characterized by a relatively low attenuation are to be unequivocally identified. If only features having a low attenuation are to be identified, a pixel value corresponding to a desired maximum attenuation is selected. The selected pixel value is used as a maximum threshold value whereby each projected image is analyzed, pixel by pixel, and all pixels having an associated attenuation value above the selected pixel value are disregarded when an image slice is generated. - If it is determined at
step 910 that features having a low attenuation are not to be identified or that the entire range of attenuating structures is to be identified, then it is determined at step 916 whether an unbiased estimate of the three-dimensional configuration of the entire range of attenuating structures is to be identified. If the entire range of attenuating structures is to be identified, then conventional tomosynthesis is performed at step 918, whereby the attenuation values from all of the projected images are averaged. - If the features having a high attenuation, the features having a low attenuation, and the features covering the entire range of attenuations are not to be identified, then it is determined at
step 920 whether the user desires to restart the selection of features to be identified. If the user wants to restart the identification process, then the method returns to step 906. If the user decides not to restart the identification process, then the method ends at step 922. - Once it has been determined which features are to be identified, then an image slice is generated at a selected slice position at
step 924. The process for generating the image slice at step 924 is essentially the same as discussed previously in connection with step 90 of FIG. 2. However, when only features having either a high attenuation or a low attenuation are to be identified, the image generation process is performed only on the non-linearly selected images, instead of on all of the projected images as initially acquired. Once the image slice has been generated, the image slice is displayed at step 926 and the method ends at step 922. - In another aspect of the present invention, a method is provided for determining temporal changes in three dimensions. The method enables two or more sets of image data collected at different times to be compared by adjusting the recorded sets of image data for arbitrary changes in the vantage points from which the image data were recorded. The method takes advantage of the fact that a single three-dimensional object will present a variety of different two-dimensional projection patterns, depending on the object's orientation to the projection system. Most of this variety is caused by the fact that a three-dimensional structure is being collapsed into a single two-dimensional image by the projection system. Limiting projection options to only two-dimensional slices precludes this source of variation. The result is a much reduced search space for appropriate registration of the images required to accomplish volumetrically meaningful subtraction.
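Under simplifying assumptions (both slice sets already reconstructed, a plain correlation coefficient standing in for the cross-correlation measure, and the per-pair sub-pixel alignment omitted), this slice-wise comparison might be sketched as:

```python
import numpy as np

def best_match(slice_a, set_b):
    """Index of the slice in set_b with the highest correlation to slice_a."""
    scores = [np.corrcoef(slice_a.ravel(), b.ravel())[0, 1] for b in set_b]
    return int(np.argmax(scores))

def difference_images(set_a, set_b):
    """Pair each slice in set_a with its best match in set_b and subtract."""
    return [a - set_b[best_match(a, set_b)] for a in set_a]

set_a = [np.array([[1.0, 2.0], [3.0, 4.0]])]
set_b = [np.array([[4.0, 3.0], [2.0, 1.0]]),  # reversed copy: correlation -1
         np.array([[1.0, 2.0], [3.0, 5.0]])]  # near-identical: high correlation
diffs = difference_images(set_a, set_b)       # pairs slice 0 with set_b[1]
```

The resulting difference image is zero except where the object changed between exposures, which is the behavior the subtraction step relies on.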
- A flow chart showing the steps involved in the method for determining temporal changes in three-dimensions of the present invention is depicted in
FIG. 25. A first set of image slices is generated at step 1180. After the desired time period to be assessed for changes has passed, the object is positioned in roughly the same position as it was when the first set of image slices was produced and a second set of image slices is generated, at step 1182, using a similar exposure protocol. - At
step 1184, the first set of image slices is spatially cross-correlated with the second set of image slices. The cross-correlation is accomplished by individually comparing each image slice comprising the first set of image slices with the individual image slices comprising the second set of image slices. The comparison is performed in order to determine which image slice in the second set of image slices corresponds to a slice through the object at approximately the same relative position through the object as that of the image slice of the first set of image slices to which the comparison is being made. - After each of the image slices in the first set of image slices is correlated to an image slice in the second set of image slices, each of the correlated pairs of image slices is individually aligned at step 1186. The alignment is performed in order to maximize the associated cross-correlations by maximizing the overlap between the image slices comprising the correlated pairs of image slices. The cross-correlations are maximized by shifting the image slices relative to one another until the projected image of the object on one image slice is optimally aligned with the projected image of the object on the other image slice. Once each correlated pair of image slices has been aligned, the image slices from one set of image slices are subtracted from the image slices from the other set of image slices at
step 1188 to form a set of difference images. - At
step 1190, the difference images are displayed. The difference images can be presented as a series of individual differences corresponding to various different slice positions. Alternatively, the individual difference images can be integrated to yield a composite difference representing a three-dimensional image of the temporal changes associated with the selected object. - The present invention further relates to a source comparator and a method for matching radiation sources for use in quantitative radiology. Meaningful quantitative comparisons of different image data can be made only when the radiation source or sources used to record the image data are very nearly unchanged. However, conventional radiation sources produce radiation that varies with changes in tube potential, beam filtration, beam orientation with respect to the radiation target, tube current, and distance from the focal spot. The source comparator and method of the present invention enable the radiation output from one radiation source to be matched to that of another radiation source or to that of the same radiation source at a different time.
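As one hypothetical way to reduce a comparator exposure to numbers that the matching procedure of FIG. 27 could drive toward the values recorded for the original gradient image (the function name and the NumPy usage are assumptions, not part of the disclosure):

```python
import numpy as np

def gradient_signature(image):
    """Reduce a comparator exposure to (mean gradient magnitude, direction).

    The crossed high/low-attenuation wedges yield an intensity ramp whose
    steepness tracks beam energy and exposure, and whose direction tracks
    the proportion of high- to low-energy photons in the beam.
    """
    img = np.asarray(image, dtype=float)
    gy, gx = np.gradient(img)  # per-axis slopes: rows first, then columns
    magnitude = float(np.hypot(gx, gy).mean())
    direction = float(np.degrees(np.arctan2(gy.mean(), gx.mean())))
    return magnitude, direction

# A pure left-to-right ramp has unit gradient pointing along +x (0 degrees).
ramp = np.tile(np.arange(5.0), (5, 1))
mag, ang = gradient_signature(ramp)
```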
- The
source comparator 1200 for matching radiation sources in accordance with the present invention is depicted in FIG. 26. The source comparator 1200 comprises two wedges or five-sided polyhedrons, 1202 and 1204, of equal dimension having a rectangular base and two right-triangular faces. The triangular faces lie in parallel planes at opposite edges of the base such that the triangular faces are oriented as mirror images of each other. As a result, each wedge, 1202 and 1204, has a tapered edge and provides a uniformly increasing thickness from the tapered edge in a direction parallel to the plane of the base and perpendicular to the tapered edge. The wedges, 1202 and 1204, are arranged with the base of one wedge 1202 adjacent to the base of the other wedge 1204 such that the tapered edges of the two wedges are at adjacent edges of the base. One wedge is formed from a uniform high attenuation material while the other wedge is formed from a uniform low attenuation material to differentially attenuate the relative proportion of high and low energy photons in the output from the radiation source. Accordingly, when the source comparator 1200 is irradiated from a radiation source directed perpendicularly to the bases of the wedges, the resulting image will be a quadrilateral having an intensity gradient that varies uniformly in a single direction with the angle of the gradient being determined by the distribution of high and low energy photons in the output from the radiation source. - The
source comparator 1200 of FIG. 26 is used in the method of matching radiation sources in accordance with the present invention as shown in FIG. 27. At step 1220, the source comparator is positioned between a radiation source and a detector. An original gradient image is then recorded by exposing the source comparator to radiation from the radiation source at the source settings to be used for recording a first set of data images. The first set of data images is then recorded. - When a second set of data images is to be recorded, the source settings for the radiation source to be used to record the second set of data images are adjusted to match the settings used for recording the first set of data images. At
step 1222, the source comparator is positioned between the radiation source and the detector and a first gradient image is recorded. The source comparator is then rotated perpendicularly to the detector by an angle of 180° and a second gradient image is recorded at step 1224. The first and second gradient images are compared and the source comparator is oriented to produce the smaller gradient at step 1226. By so doing, it is assured that the source comparator bears the same relative relationship to the radiation source for both sets of data and, thereby, eliminates the potential for confounding the data by spatial variations in the cross-sectional intensity of the output from the radiation source. - The individual settings on the radiation source are then iteratively adjusted. At
step 1230, the beam energy is matched by adjusting the kVp on the radiation source so that the measured gradient value approaches the gradient value of the original gradient image. The beam quality is then matched at step 1232 by adjusting the filtration of the radiation source so that the angle of the maximum gradient relative to the edge of the source comparator approaches that of the original gradient image. The beam exposures are then estimated by integrating the detector response across a fixed region of the source comparator and matched at step 1234 by adjusting the mAs of the radiation source so that the exposure approaches that of the original gradient image. At step 1236 it is determined whether the gradient image is substantially the same as the original gradient image. If the two images are significantly different, the beam energy, beam quality, and exposure are readjusted. If, however, asymptotic convergence has been reached and the two gradient images are substantially the same, the radiation sources are matched and the process ends at step 1238. Once the radiation sources have been matched, the second set of data images can be recorded and quantitatively compared to the first set of data images. - In the embodiment shown in
FIG. 19, the source 627 is an unconstrained point source and the detector 631 is completely constrained relative to the object 621. Accordingly, the system has three degrees of freedom (two translational and one displacement for the radiation source 627 relative to the object 621 and detector 631). A beam collimator 647 can be positioned between the source 627 and the object 621 to collimate the radiation from the source 627. The detector 631 comprises a primary imager 632 and a secondary imager 634 positioned a known distance below the primary imager 632. In one embodiment, both the primary and secondary imagers, 632 and 634, are CCD detectors. The fiducial reference 622 comprises a radiopaque shield 633 with a ring-shaped aperture 636 of known size positioned between the primary imager 632 and the secondary imager 634. - Radiation from the
source 627 passes through collimator 647, irradiates object 621, and produces an object image on the primary imager 632. In addition, radiation from the source 627 which impinges upon the radiopaque shield 633 passes through the aperture 636 to produce a ring-shaped reference image of the aperture 636 on the secondary imager 634. Since the secondary imager 634 is not used to record object images, the secondary imager 634 can be a low quality imager such as a low resolution CCD. Alternatively, a lower surface of the primary imager 632 can be coated with a phosphorescent material 635, so that radiation impinging upon the primary imager 632 causes the phosphorescent material 635 to phosphoresce. The phosphorescence passes through the aperture 636 to produce the reference image on the secondary imager 634. - In operation, the reference image produced using the system depicted in
FIG. 19 can be used to determine the position of the source 627 relative to the object 621 and the detector 631. A circle, or ellipse, is fitted to the projected reference image. By fitting a circle, or ellipse, to the reference image, the effect of dead areas and/or poor resolution of the secondary imager 634 can be eliminated by averaging. The position of the center of the fitted circle, or ellipse, relative to the known center of the aperture 636 is determined. The angle α of a central ray 637 radiating from the source 627 relative to the object 621 and the detector 631 can then be determined. In addition, the length of the minor diameter of the projected reference image is determined and compared to the known diameter of the aperture 636 to provide a relative magnification factor. The relative magnification factor can then be used to determine the distance of the source 627 from the object 621. - The center of the fitted circle can be determined as follows. A pixel or point on the
secondary imager 634 that lies within the fitted circle is selected as a seed point. For convenience, the center pixel of the secondary imager 634 can be selected, since the center point will typically lie within the fitted circle. A point R is determined by propagating from the seed point towards the right until the fitted circle is intersected. Similarly, a point L is determined by propagating from the seed point towards the left until the fitted circle is intersected. For each pixel along the chord L-R, the average of the number of pixels traversed by propagating from that pixel upwardly until the fitted circle is intersected and the number of pixels traversed by propagating from that pixel downwardly until the fitted circle is intersected is determined. Any statistical outliers from the averages can be discarded and the average of the remaining values calculated. This average represents the row address of the fitted circle's center. To obtain the column address, the entire reference image is rotated by 90° and the process is repeated. The row address and column address together represent the position of the center of the fitted circle. - Although the above embodiments have been described in relation to projected images of objects produced using X-rays, the present invention is equally applicable to images produced using a variety of technologies, such as visible light, ultrasound, or electron microscopy images. Specifically, intermediate voltage electron microscope (IVEM) images can be used to provide quantitative three-dimensional ultrastructural information. Further, the present invention can also be used to reconstruct three-dimensional images of objects which either emit or scatter radiation.
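The centre-finding scan described above (seed point, left/right walks to L and R, then averaging of vertical chord midpoints) can be sketched in pure Python; the binary mask, names, and seed choice are our own illustration, and the outlier rejection mentioned in the text is omitted:

```python
def fitted_circle_center(grid):
    """Row and column address of a filled circle's centre on a binary grid.

    grid[r][c] is True for pixels inside the fitted circle.
    """
    def row_address(g, seed_r, seed_c):
        # Walk left and right from the seed to the circle boundary (L and R).
        left = seed_c
        while left > 0 and g[seed_r][left - 1]:
            left -= 1
        right = seed_c
        while right < len(g[0]) - 1 and g[seed_r][right + 1]:
            right += 1
        # Average the midpoints of the vertical chords between L and R.
        mids = []
        for col in range(left, right + 1):
            top = seed_r
            while top > 0 and g[top - 1][col]:
                top -= 1
            bottom = seed_r
            while bottom < len(g) - 1 and g[bottom + 1][col]:
                bottom += 1
            mids.append((top + bottom) / 2.0)
        return sum(mids) / len(mids)

    seed_r, seed_c = len(grid) // 2, len(grid[0]) // 2  # assume centre pixel is inside
    row = row_address(grid, seed_r, seed_c)
    # "Rotate by 90 degrees" = transpose, then repeat the scan for the column.
    transposed = [list(t) for t in zip(*grid)]
    col = row_address(transposed, seed_c, seed_r)
    return row, col

# A filled circle of radius 4 centred at (7, 7) on a 15 x 15 grid.
circle = [[(r - 7) ** 2 + (c - 7) ** 2 <= 16 for c in range(15)]
          for r in range(15)]
```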
- When IVEM images are used, the present invention allows cellular changes to be detected and quantified in an efficient and cost-effective manner. Quantitation of three-dimensional structure facilitates comparison with other quantitative techniques, such as biochemical analysis. For example, increases in the Golgi apparatus in cells accumulating abnormal amounts of cholesterol can be measured and correlated with biochemically measured increases in cellular cholesterol.
- When photographic images are used, it is possible to create a true three-dimensional model of a diffusely illuminated fixed scene from any number of arbitrary camera positions and angles. The resulting three-dimensional image permits inverse engineering of structural sizes and shapes, and may be expressed as a series of topographic slices or as a projective model that can be manipulated interactively. This capability is particularly useful in retrofitting existing structures or quantifying three-dimensional attributes using non-invasive methods. In addition, the present invention can be applied to construct topographic images of geological structures by recording images of the structure created by the sun.
- Representative lumpectomy specimens containing cancer from human breasts were radiographed using a digital mammographic machine (Delta 16, Instrumentarium, Inc.). Exposure parameters were regulated by an automatic exposure control mechanism built into the unit. Seven distinct projections of each specimen were made using a swing arm containing the tube head that swept across each specimen in a single arched path. This resulted in mammographic projections having angular disparities of 15, 10, 5, 0, −5, −10, and −15 degrees from vertical. These data were processed to yield a series of tomosynthetic slices distributed throughout the breast tissues in three ways: 1) conventional linear summation of all seven appropriately shifted projections (
FIG. 29), 2) identical linear summation augmented by the application of an iterative deconvolution filter known to minimize tomographic blur (FIG. 30), and 3) a nonlinear tomosynthetic reconstruction scheme based on selection of only the projection(s) yielding the minimum brightness at each pixel (FIG. 31). Notice the lack of “ringing” artifacts caused by the wire used to locate the lesion in FIG. 31 corresponding to the nonlinear reconstruction method. Five board-certified radiologists compared tomographic displays of these tissues produced from all three methods and ranked them in terms of their perceived interpretability with regard to cancer recognition and relative freedom from apparent tomosynthetic artifacts. A related exercise involved having a different set of eight observers estimate the relative depths of a series of seven holes bored in a solid Lucite block exposed under comparable conditions. - All five radiologists preferred the nonlinearly generated tomosynthetic mammograms over those produced conventionally (with or without subsequent deblurring via iterative deconvolution). A similar statistically significant result (p<0.05) was produced when the performance of the hole-depth experiment was objectively determined.
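The conventional linear reconstruction used as method 1 can be sketched as a shift-and-add over the seven projection angles (the function and parameter names are ours; the geometry is idealized to a flat detector with shifts proportional to the tangent of the projection angle, whereas the experiment used a swing-arm arc):

```python
import numpy as np

def tomosynthetic_slice(projections, angles_deg, z, pixel_pitch=1.0):
    """Linear shift-and-add: shift each projection by about z*tan(angle)
    pixels (z = height of the desired slice plane) and average."""
    acc = np.zeros(np.asarray(projections[0], dtype=float).shape)
    for proj, angle in zip(projections, angles_deg):
        shift = int(round(z * np.tan(np.radians(angle)) / pixel_pitch))
        # np.roll wraps at the edges -- tolerable only in a sketch.
        acc += np.roll(np.asarray(proj, dtype=float), shift, axis=1)
    return acc / len(projections)

angles = [15, 10, 5, 0, -5, -10, -15]  # degrees from vertical, as above
point = np.zeros((1, 32))
point[0, 16] = 1.0                     # one radiopaque detail
in_plane = tomosynthetic_slice([point] * 7, angles, z=0.0)
```

At z=0 no shifts are applied and the detail stays registered; at other slice heights each projection is displaced by its angle-dependent amount before averaging, which is what smears out-of-plane details into the blur the nonlinear scheme suppresses.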
- This approach is very efficient: it is simpler to implement than conventional tomosynthetic back-projection methods, and it produces sharp-appearing images that do not require additional computationally intensive inverse filtering or iterative deconvolution schemes. Therefore, it has the potential for implementation with full-field digital mammograms using only modest computer processing resources that lie well within the current state of the art. For certain tasks that are unduly compromised by tomosynthetic blurring, a simple nonlinear tomosynthetic reconstruction algorithm may improve diagnostic performance over the status quo with no increase in cost or complexity.
- Although the above discussion has centered around computed tomography, it will be appreciated by those skilled in the art that the present invention is useful for other three-dimensional imaging modalities. For example, the present invention is also intended to relate to images obtained using magnetic resonance imaging (MRI), single photon emission computed tomography (SPECT), positron emission tomography (PET), conventional tomography, tomosynthesis, and tuned-aperture computed tomography (TACT), as well as microscopic methods including confocal optical schemes.
- It will be recognized by those skilled in the art that changes or modifications may be made to the above-described embodiments without departing from the broad inventive concepts of the invention. It should therefore be understood that this invention is not limited to the particular embodiments described herein, but is intended to include all changes and modifications that are within the scope and spirit of the invention as set forth in the claims.
Claims (23)
1. A system for synthesizing an image of an object at a selected slice position through the selected object comprising:
a. an identifiable fiducial reference located at a fixed position relative to the selected object, the fiducial reference providing constraint for a number of degrees of freedom correlated to a number of degrees of freedom of the system;
b. a source of radiation for irradiating the selected object and the fiducial reference;
c. a recording medium for recording a first set and a second set of projected images of the fiducial reference and a region of interest of the selected object; and
d. an image synthesizer for determining overlapping regions of the projected images by identifying fiducial reference points common to the overlapping regions and for bringing the first and second sets of projected images into alignment based on the identified reference points to reconstruct a tomographic slice from the object images;
whereby the fiducial reference, the source of radiation, and the recording medium are arbitrarily positioned relative to one another.
2. The system according to claim 1 , wherein the recording medium includes a CCD device.
3. The system according to claim 1 , wherein the fiducial reference comprises at least five identifiable reference markers in a fixed geometry relative to each other.
4. The system according to claim 3 , wherein at least four of the reference markers are co-planar.
5. The system according to claim 4 , wherein a maximum of any two of the four co-planar reference markers are co-linear.
6. The system according to claim 5 , wherein a fifth reference marker is not co-planar with the four co-planar reference markers.
7. The system according to claim 1, wherein the source of radiation comprises an electron source for irradiating the selected object and the fiducial reference with electrons.
8. The system according to claim 1, wherein the source of radiation comprises a visible light source for irradiating the selected object and the fiducial reference, and the recording medium comprises a photographic recording medium for recording photographic images of the selected object and fiducial reference.
9. The system according to claim 8, wherein the source of radiation and the recording medium comprise a video camera.
10. A system according to claim 1, wherein the fiducial reference comprises a rectangular parallelepiped having at least six reference markers, each face of the parallelepiped comprising at least one reference marker, the parallelepiped comprising at least two bars disposed at intersecting edges of the parallelepiped such that the bars provide at least one additional reference marker disposed at the intersection of the bars.
11. A system for synthesizing an image of an object according to claim 1, wherein the fiducial reference comprises at least two identifiable reference markers in a fixed geometry relative to each other.
12. A system for synthesizing an image of an object according to claim 11, wherein the markers differ in opacity from one another.
13. A system for synthesizing an image of an object according to claim 1, wherein the image synthesizer is adapted to extrapolate registration and calibration of the sets of projected images.
14. A method for synthesizing an image slice through a selected object at a selected slice position through the object from a plurality of projected images of the object comprising the steps of:
a. providing a fiducial reference having a total number of degrees of freedom associated therewith greater than or equal to the total number of degrees of freedom of the system;
b. recording projected images of a region of interest of the selected object and the fiducial reference on a recording means at different arbitrary relative positions between (1) a source of radiation, (2) the selected object and fiducial reference, and (3) the recording means; and
c. synthesizing an image slice of the selected object at a selected slice position by determining overlapping regions of the projected images by identifying fiducial reference points common to the overlapping regions and bringing the recorded projected images into alignment based on the identified reference points.
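Claims 1 and 14 leave the alignment estimator unspecified. One conventional choice, sketched below as an illustration rather than the patent's prescribed method, is to fit a 2-D affine transform to matched fiducial-marker image positions by least squares and use it to bring one projected image into register with another (the helper name `fit_affine` is an assumption):

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares 2-D affine transform mapping src points onto dst.

    src, dst: (N, 2) arrays of matched fiducial-marker image positions
    (N >= 3, not all collinear). Returns a 3x3 homogeneous matrix A such
    that dst ~= A @ src in homogeneous coordinates.
    """
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    n = len(src)

    # Design matrix for the six affine parameters (a, b, tx, c, d, ty):
    #   x' = a*x + b*y + tx,   y' = c*x + d*y + ty
    X = np.zeros((2 * n, 6))
    X[0::2, 0:2] = src
    X[0::2, 2] = 1.0
    X[1::2, 3:5] = src
    X[1::2, 5] = 1.0
    y = dst.reshape(-1)  # interleaved [x0', y0', x1', y1', ...]

    params, *_ = np.linalg.lstsq(X, y, rcond=None)
    a, b, tx, c, d, ty = params
    return np.array([[a, b, tx], [c, d, ty], [0.0, 0.0, 1.0]])
```

With five or more markers the fit is overdetermined, so residual error also gives a rough check on marker identification; a full projective (homography) model would follow the same least-squares pattern with eight parameters.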
15. A method for synthesizing an image slice according to claim 14, comprising the step of extrapolating registration and calibration of the sets of projected images.
16. A method according to claim 14, wherein the region of interest comprises a subvolume in which the magnification of the projected images is substantially constant.
17. A method according to claim 14, wherein the fiducial reference comprises identifiable reference markers.
18. A method according to claim 17, comprising the steps of associating each reference marker projected image with the corresponding reference marker and measuring the position and size of each reference marker image.
19. A method according to claim 18, comprising the step of determining the magnification of the projected image of the reference marker.
20. A method according to claim 19, wherein the image of the reference marker comprises a minor diameter and the step of determining the magnification uses the minor diameter to determine the magnification.
21. A method according to claim 19, comprising the step of generating a projected transformation matrix.
22. A method according to claim 21, wherein the step of generating the projected transformation matrix comprises mapping the position of the reference marker image in the projected image onto a corresponding position of the reference marker in a virtual projection plane.
23. A method according to claim 22, comprising the steps of synthesizing a plurality of image slices and generating a three-dimensional representation of the selected object from the plurality of image slices.
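Claims 19-23 describe estimating local magnification from a marker image's minor diameter and then combining aligned projections into slices. A minimal sketch follows; the function names, the spherical-marker assumption, and the integer-shift simplification are illustrative choices, not the patent's algorithm:

```python
import numpy as np

def magnification_from_minor_diameter(minor_diameter_px, marker_diameter_mm,
                                      pixel_pitch_mm):
    """Claim-20 style magnification estimate for a spherical marker.

    In cone-beam geometry a sphere of physical diameter d projects to an
    ellipse whose minor diameter is approximately M * d, so the measured
    minor diameter yields the local magnification M directly.
    """
    return (minor_diameter_px * pixel_pitch_mm) / marker_diameter_mm

def shift_and_add(projections, shifts):
    """Average projections after per-image integer shifts (tomosynthesis).

    projections: list of equally sized 2-D arrays; shifts: per-image
    (dy, dx) offsets derived from the registration. Structures in the
    selected slice plane add coherently; structures at other depths blur.
    """
    acc = np.zeros_like(projections[0], dtype=float)
    for img, (dy, dx) in zip(projections, shifts):
        acc += np.roll(np.roll(img, dy, axis=0), dx, axis=1)
    return acc / len(projections)
```

Repeating the shift-and-add step with shift sets corresponding to different slice positions, then stacking the resulting slices, gives the three-dimensional representation of claim 23.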
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/958,972 US20050059886A1 (en) | 1998-07-24 | 2004-10-05 | Method and system for creating task-dependent three-dimensional images |
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US9546398P | 1998-07-24 | 1998-07-24 | |
US09/252,632 US6081577A (en) | 1998-07-24 | 1999-02-19 | Method and system for creating task-dependent three-dimensional images |
US09/561,376 US6549607B1 (en) | 1998-07-24 | 2000-04-28 | Method and system for creating task-dependent three-dimensional images |
US10/414,439 US6801597B2 (en) | 1998-07-24 | 2003-04-14 | Method and system for creating task-dependent three-dimensional images |
US10/958,972 US20050059886A1 (en) | 1998-07-24 | 2004-10-05 | Method and system for creating task-dependent three-dimensional images |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/414,439 Continuation US6801597B2 (en) | 1998-07-24 | 2003-04-14 | Method and system for creating task-dependent three-dimensional images |
Publications (1)
Publication Number | Publication Date |
---|---|
US20050059886A1 true US20050059886A1 (en) | 2005-03-17 |
Family
ID=26790258
Family Applications (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/252,632 Expired - Fee Related US6081577A (en) | 1998-07-24 | 1999-02-19 | Method and system for creating task-dependent three-dimensional images |
US09/561,376 Expired - Fee Related US6549607B1 (en) | 1998-07-24 | 2000-04-28 | Method and system for creating task-dependent three-dimensional images |
US10/414,439 Expired - Fee Related US6801597B2 (en) | 1998-07-24 | 2003-04-14 | Method and system for creating task-dependent three-dimensional images |
US10/958,972 Abandoned US20050059886A1 (en) | 1998-07-24 | 2004-10-05 | Method and system for creating task-dependent three-dimensional images |
Family Applications Before (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/252,632 Expired - Fee Related US6081577A (en) | 1998-07-24 | 1999-02-19 | Method and system for creating task-dependent three-dimensional images |
US09/561,376 Expired - Fee Related US6549607B1 (en) | 1998-07-24 | 2000-04-28 | Method and system for creating task-dependent three-dimensional images |
US10/414,439 Expired - Fee Related US6801597B2 (en) | 1998-07-24 | 2003-04-14 | Method and system for creating task-dependent three-dimensional images |
Country Status (3)
Country | Link |
---|---|
US (4) | US6081577A (en) |
AU (1) | AU5223599A (en) |
WO (1) | WO2000004830A1 (en) |
Cited By (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050163279A1 (en) * | 2003-12-19 | 2005-07-28 | Matthias Mitschke | Method and apparatus for image support of an operative procedure implemented with a medical instrument |
US20050182319A1 (en) * | 2004-02-17 | 2005-08-18 | Glossop Neil D. | Method and apparatus for registration, verification, and referencing of internal organs |
US20060082590A1 (en) * | 2004-10-14 | 2006-04-20 | Stevick Glen R | Method and apparatus for dynamic space-time imaging system |
US20060122497A1 (en) * | 2004-11-12 | 2006-06-08 | Glossop Neil D | Device and method for ensuring the accuracy of a tracking device in a volume |
US20060173269A1 (en) * | 2004-11-12 | 2006-08-03 | Glossop Neil D | Integrated skin-mounted multifunction device for use in image-guided surgery |
US20060173291A1 (en) * | 2005-01-18 | 2006-08-03 | Glossop Neil D | Electromagnetically tracked K-wire device |
US20060184016A1 (en) * | 2005-01-18 | 2006-08-17 | Glossop Neil D | Method and apparatus for guiding an instrument to a target in the lung |
US7110807B2 (en) | 1998-03-05 | 2006-09-19 | Wake Forest University Health Sciences | Method and system for creating three-dimensional images using tomosynthetic computed tomography |
US20070032723A1 (en) * | 2005-06-21 | 2007-02-08 | Glossop Neil D | System, method and apparatus for navigated therapy and diagnosis |
US20070055128A1 (en) * | 2005-08-24 | 2007-03-08 | Glossop Neil D | System, method and devices for navigated flexible endoscopy |
US20070167787A1 (en) * | 2005-06-21 | 2007-07-19 | Glossop Neil D | Device and method for a trackable ultrasound |
US20080071215A1 (en) * | 2004-11-05 | 2008-03-20 | Traxtal Technologies Inc. | Access System |
US20080183071A1 (en) * | 2007-01-10 | 2008-07-31 | Mediguide Ltd. | System and method for superimposing a representation of the tip of a catheter on an image acquired by a moving imager
US20100171811A1 (en) * | 2006-04-21 | 2010-07-08 | Expert Treuhand Gmbh | Method and device for the creation of pseudo-holographic images |
US20120313943A1 (en) * | 2011-06-09 | 2012-12-13 | Toshiba Medical Systems Corporation | Image processing system and method thereof |
US20130064430A1 (en) * | 2010-05-26 | 2013-03-14 | Nec Corporation | Image processing device, image processing method, and image processing program |
US20130094755A1 (en) * | 2007-09-26 | 2013-04-18 | Carl Zeiss MicroImaging GmbH | Method for the microscopic three-dimensional reproduction of a sample
US20150145961A1 (en) * | 2010-03-01 | 2015-05-28 | Apple Inc. | Non-uniform spatial resource allocation for depth mapping |
US9510771B1 (en) | 2011-10-28 | 2016-12-06 | Nuvasive, Inc. | Systems and methods for performing spine surgery |
US9848922B2 (en) | 2013-10-09 | 2017-12-26 | Nuvasive, Inc. | Systems and methods for performing spine surgery |
WO2019038304A1 (en) * | 2017-08-23 | 2019-02-28 | Carestream Dental Technology Topco Limited | Dental chair-side tomosynthesis system |
Families Citing this family (204)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8556983B2 (en) | 2001-05-25 | 2013-10-15 | Conformis, Inc. | Patient-adapted and improved orthopedic implants, designs and related tools |
US8480754B2 (en) | 2001-05-25 | 2013-07-09 | Conformis, Inc. | Patient-adapted and improved articular implants, designs and related guide tools |
US9603711B2 (en) | 2001-05-25 | 2017-03-28 | Conformis, Inc. | Patient-adapted and improved articular implants, designs and related guide tools |
US8545569B2 (en) | 2001-05-25 | 2013-10-01 | Conformis, Inc. | Patient selectable knee arthroplasty devices |
US8771365B2 (en) | 2009-02-25 | 2014-07-08 | Conformis, Inc. | Patient-adapted and improved orthopedic implants, designs, and related tools |
US8882847B2 (en) | 2001-05-25 | 2014-11-11 | Conformis, Inc. | Patient selectable knee joint arthroplasty devices |
US8735773B2 (en) | 2007-02-14 | 2014-05-27 | Conformis, Inc. | Implant device and method for manufacture |
JP2856207B1 (en) * | 1997-09-10 | 1999-02-10 | 日本電気株式会社 | Image position adjusting device and computer readable recording medium storing image position adjusting program |
FR2780183B1 (en) * | 1998-06-19 | 2000-07-28 | Commissariat Energie Atomique | METHOD FOR IMPROVING THE SIGNAL TO NOISE RATIO OF THE IMAGE OF A MOVING OBJECT |
US6081577A (en) | 1998-07-24 | 2000-06-27 | Wake Forest University | Method and system for creating task-dependent three-dimensional images |
AU772012B2 (en) | 1998-09-14 | 2004-04-08 | Board Of Trustees Of The Leland Stanford Junior University | Assessing the condition of a joint and preventing damage |
US7239908B1 (en) | 1998-09-14 | 2007-07-03 | The Board Of Trustees Of The Leland Stanford Junior University | Assessing the condition of a joint and devising treatment |
US6341152B1 (en) * | 1998-10-02 | 2002-01-22 | Kabushiki Kaisha Toshiba | X-ray computerized tomography apparatus |
US6912491B1 (en) * | 1999-05-25 | 2005-06-28 | Schlumberger Technology Corp. | Method and apparatus for mapping uncertainty and generating a map or a cube based on conditional simulation of random variables |
US8233968B1 (en) * | 1999-06-21 | 2012-07-31 | Victor John Yannacone, Jr. | Method and apparatus for high resolution dynamic digital infrared imaging |
US7408156B2 (en) * | 1999-06-21 | 2008-08-05 | Yannacone Jr Victor John | System and method for identifying and classifying dynamic thermodynamic processes in mammals and discriminating between and among such processes |
JP4653270B2 (en) * | 1999-08-25 | 2011-03-16 | 東芝医用システムエンジニアリング株式会社 | Magnetic resonance imaging system |
US6804683B1 (en) * | 1999-11-25 | 2004-10-12 | Olympus Corporation | Similar image retrieving apparatus, three-dimensional image database apparatus and method for constructing three-dimensional image database |
US6856827B2 (en) * | 2000-04-28 | 2005-02-15 | Ge Medical Systems Global Technology Company, Llc | Fluoroscopic tracking and visualization system |
US6484049B1 (en) * | 2000-04-28 | 2002-11-19 | Ge Medical Systems Global Technology Company, Llc | Fluoroscopic tracking and visualization system |
US6856826B2 (en) * | 2000-04-28 | 2005-02-15 | Ge Medical Systems Global Technology Company, Llc | Fluoroscopic tracking and visualization system |
US7023448B1 (en) | 2000-08-22 | 2006-04-04 | Adobe Systems Incorporated | Selecting rendering intent |
ATE413135T1 (en) | 2000-09-14 | 2008-11-15 | Univ Leland Stanford Junior | ASSESSMENT OF THE CONDITION OF A JOINT AND THE LOSS OF CARTILAGE TISSUE |
DE60138116D1 (en) | 2000-09-14 | 2009-05-07 | Univ R | ASSESSMENT OF THE CONDITION OF A JOINT AND PLANNING OF A TREATMENT |
EP1199886B1 (en) * | 2000-10-13 | 2004-03-10 | Applied Scintillation Technologies Ltd. | Infrared camera with phosphor coated CCD |
US6671349B1 (en) | 2000-11-13 | 2003-12-30 | Olganix Corporation | Tomosynthesis system and registration method |
JP2002148153A (en) * | 2000-11-15 | 2002-05-22 | Inst Of Physical & Chemical Res | Method and device for analyzing three-dimensional internal structure |
US6980682B1 (en) * | 2000-11-22 | 2005-12-27 | Ge Medical Systems Group, Llc | Method and apparatus for extracting a left ventricular endocardium from MR cardiac images |
US7209777B2 (en) | 2000-11-30 | 2007-04-24 | General Electric Company | Method and apparatus for automated tracking of non-linear vessel movement using MR imaging |
US6422750B1 (en) * | 2000-12-22 | 2002-07-23 | Ge Medical Systems Global Technology Company, Llc | Digital x-ray imager alignment method |
FI117317B (en) * | 2001-04-20 | 2006-09-15 | Instrumentarium Corp | Imaging, mammography, and biopsy |
US9308091B2 (en) | 2001-05-25 | 2016-04-12 | Conformis, Inc. | Devices and methods for treatment of facet and other joints |
EP1389980B1 (en) | 2001-05-25 | 2011-04-06 | Conformis, Inc. | Methods and compositions for articular resurfacing |
US8061006B2 (en) * | 2001-07-26 | 2011-11-22 | Powderject Research Limited | Particle cassette, method and kit therefor |
US6611575B1 (en) * | 2001-07-27 | 2003-08-26 | General Electric Company | Method and system for high resolution 3D visualization of mammography images |
JP2003043991A (en) * | 2001-08-02 | 2003-02-14 | Fujitsu Hitachi Plasma Display Ltd | Plasma display device |
AU2002332758A1 (en) * | 2001-08-31 | 2003-03-18 | Analogic Corporation | Image positioning method and system for tomosynthesis in a digital x-ray radiography system |
US20030072478A1 (en) * | 2001-10-12 | 2003-04-17 | Claus Bernhard Erich Hermann | Reconstruction method for tomosynthesis |
US7286866B2 (en) * | 2001-11-05 | 2007-10-23 | Ge Medical Systems Global Technology Company, Llc | Method, system and computer product for cardiac interventional procedure planning |
FR2833100B1 (en) * | 2001-11-30 | 2004-03-12 | Ge Med Sys Global Tech Co Llc | METHOD FOR RECONSTRUCTING AN IMAGE OF AN ORGAN |
US6978040B2 (en) * | 2001-12-19 | 2005-12-20 | Canon Kabushiki Kaisha | Optical recovery of radiographic geometry |
US7311705B2 (en) | 2002-02-05 | 2007-12-25 | Medtronic, Inc. | Catheter apparatus for treatment of heart arrhythmia |
US20030154201A1 (en) * | 2002-02-13 | 2003-08-14 | Canon Kabushiki Kaisha | Data storage format for topography data |
US7283253B2 (en) * | 2002-03-13 | 2007-10-16 | Applied Precision, Llc | Multi-axis integration system and method |
WO2003077758A1 (en) * | 2002-03-14 | 2003-09-25 | Netkisr Inc. | System and method for analyzing and displaying computed tomography data |
US7346381B2 (en) * | 2002-11-01 | 2008-03-18 | Ge Medical Systems Global Technology Company Llc | Method and apparatus for medical intervention procedure planning |
US7499743B2 (en) * | 2002-03-15 | 2009-03-03 | General Electric Company | Method and system for registration of 3D images within an interventional system |
US20050075564A1 (en) * | 2002-04-17 | 2005-04-07 | Ballard Marlin Daniel | Method and system configured for counting surgical articles |
US7778686B2 (en) * | 2002-06-04 | 2010-08-17 | General Electric Company | Method and apparatus for medical intervention procedure planning and location and navigation of an intervention tool |
DE10227307A1 (en) * | 2002-06-19 | 2004-01-15 | Siemens Ag | System for generating a 3D data record |
DE10393169T5 (en) * | 2002-08-26 | 2006-02-02 | Orthosoft, Inc., Montreal | A method of placing multiple implants during surgery using a computer-aided surgery system |
US6970531B2 (en) * | 2002-10-07 | 2005-11-29 | General Electric Company | Continuous scan RAD tomosynthesis system and method |
US6947579B2 (en) * | 2002-10-07 | 2005-09-20 | Technion Research & Development Foundation Ltd. | Three-dimensional face recognition |
EP1555962B1 (en) * | 2002-10-07 | 2011-02-09 | Conformis, Inc. | Minimally invasive joint implant with 3-dimensional geometry matching the articular surfaces |
AU2003290757A1 (en) | 2002-11-07 | 2004-06-03 | Conformis, Inc. | Methods for determing meniscal size and shape and for devising treatment |
US8565372B2 (en) | 2003-11-26 | 2013-10-22 | Hologic, Inc | System and method for low dose tomosynthesis |
US7616801B2 (en) | 2002-11-27 | 2009-11-10 | Hologic, Inc. | Image handling and display in x-ray mammography and tomosynthesis |
US10638994B2 (en) | 2002-11-27 | 2020-05-05 | Hologic, Inc. | X-ray mammography with tomosynthesis |
US7123684B2 (en) | 2002-11-27 | 2006-10-17 | Hologic, Inc. | Full field mammography with tissue exposure control, tomosynthesis, and dynamic field of view processing |
US8571289B2 (en) | 2002-11-27 | 2013-10-29 | Hologic, Inc. | System and method for generating a 2D image from a tomosynthesis data set |
US7577282B2 (en) | 2002-11-27 | 2009-08-18 | Hologic, Inc. | Image handling and display in X-ray mammography and tomosynthesis |
EP1567985B1 (en) * | 2002-12-04 | 2019-04-24 | ConforMIS, Inc. | Fusion of multiple imaging planes for isotropic imaging in mri and quantitative image analysis using isotropic or near-isotropic imaging |
US7747047B2 (en) * | 2003-05-07 | 2010-06-29 | Ge Medical Systems Global Technology Company, Llc | Cardiac CT system and method for planning left atrial appendage isolation |
US7565190B2 (en) * | 2003-05-09 | 2009-07-21 | Ge Medical Systems Global Technology Company, Llc | Cardiac CT system and method for planning atrial fibrillation intervention |
US7343196B2 (en) * | 2003-05-09 | 2008-03-11 | Ge Medical Systems Global Technology Company Llc | Cardiac CT system and method for planning and treatment of biventricular pacing using epicardial lead |
US7344543B2 (en) * | 2003-07-01 | 2008-03-18 | Medtronic, Inc. | Method and apparatus for epicardial left atrial appendage isolation in patients with atrial fibrillation |
US7813785B2 (en) * | 2003-07-01 | 2010-10-12 | General Electric Company | Cardiac imaging system and method for planning minimally invasive direct coronary artery bypass surgery |
US20050010105A1 (en) * | 2003-07-01 | 2005-01-13 | Sra Jasbir S. | Method and system for Coronary arterial intervention |
US7433507B2 (en) * | 2003-07-03 | 2008-10-07 | Ge Medical Systems Global Technology Co. | Imaging chain for digital tomosynthesis on a flat panel detector |
WO2005011947A2 (en) * | 2003-07-28 | 2005-02-10 | Fluidigm Corporation | Image processing method and system for microfluidic devices |
US20050054918A1 (en) * | 2003-09-04 | 2005-03-10 | Sra Jasbir S. | Method and system for treatment of atrial fibrillation and other cardiac arrhythmias |
US20060009755A1 (en) * | 2003-09-04 | 2006-01-12 | Sra Jasbir S | Method and system for ablation of atrial fibrillation and other cardiac arrhythmias |
WO2005031635A1 (en) | 2003-09-25 | 2005-04-07 | Paieon, Inc. | System and method for three-dimensional reconstruction of a tubular organ |
US7308299B2 (en) | 2003-10-22 | 2007-12-11 | General Electric Company | Method, apparatus and product for acquiring cardiac images |
US7308297B2 (en) * | 2003-11-05 | 2007-12-11 | Ge Medical Systems Global Technology Company, Llc | Cardiac imaging system and method for quantification of desynchrony of ventricles for biventricular pacing |
US20050143777A1 (en) * | 2003-12-19 | 2005-06-30 | Sra Jasbir S. | Method and system of treatment of heart failure using 4D imaging |
US20050137661A1 (en) * | 2003-12-19 | 2005-06-23 | Sra Jasbir S. | Method and system of treatment of cardiac arrhythmias using 4D imaging |
US20050149132A1 (en) * | 2003-12-24 | 2005-07-07 | Imad Libbus | Automatic baroreflex modulation based on cardiac activity |
US7454248B2 (en) * | 2004-01-30 | 2008-11-18 | Ge Medical Systems Global Technology, Llc | Method, apparatus and product for acquiring cardiac images |
US7035371B2 (en) | 2004-03-22 | 2006-04-25 | Siemens Aktiengesellschaft | Method and device for medical imaging |
US20050228280A1 (en) * | 2004-03-31 | 2005-10-13 | Siemens Medical Solutions Usa, Inc. | Acquisition and display methods and systems for three-dimensional ultrasound imaging |
US20050238539A1 (en) * | 2004-04-01 | 2005-10-27 | Gal Shafirstein | Apparatus for automated fresh tissue sectioning |
JP4647360B2 (en) * | 2004-04-05 | 2011-03-09 | 富士フイルム株式会社 | DIFFERENTIAL IMAGE CREATION DEVICE, DIFFERENTIAL IMAGE CREATION METHOD, AND PROGRAM THEREOF |
DE102004021772B4 (en) * | 2004-04-30 | 2007-05-24 | Siemens Ag | Method and apparatus for enhanced radial magnetic data acquisition PPA magnetic resonance imaging and computer software product |
EP1750584B1 (en) * | 2004-05-14 | 2020-10-14 | Philips Intellectual Property & Standards GmbH | System and method for diagnosing breast cancer |
US7522779B2 (en) * | 2004-06-30 | 2009-04-21 | Accuray, Inc. | Image enhancement method and system for fiducial-less tracking of treatment targets |
US8774355B2 (en) * | 2004-06-30 | 2014-07-08 | General Electric Company | Method and apparatus for direct reconstruction in tomosynthesis imaging |
US7366278B2 (en) * | 2004-06-30 | 2008-04-29 | Accuray, Inc. | DRR generation using a non-linear attenuation model |
US7369695B2 (en) * | 2004-08-20 | 2008-05-06 | General Electric Company | Method and apparatus for metal artifact reduction in 3D X-ray image reconstruction using artifact spatial information |
DE102004043739A1 (en) * | 2004-09-09 | 2006-03-30 | Siemens Ag | Mammography device with height-adjustable stage |
DE102004047519B4 (en) * | 2004-09-28 | 2021-08-12 | Leica Microsystems Cms Gmbh | Method for superimposing optical information in a scanning microscope |
US8515527B2 (en) * | 2004-10-13 | 2013-08-20 | General Electric Company | Method and apparatus for registering 3D models of anatomical regions of a heart and a tracking system with projection images of an interventional fluoroscopic system |
US7327872B2 (en) * | 2004-10-13 | 2008-02-05 | General Electric Company | Method and system for registering 3D models of anatomical regions with projection images of the same |
WO2006055830A2 (en) | 2004-11-15 | 2006-05-26 | Hologic, Inc. | Matching geometry generation and display of mammograms and tomosynthesis images |
EP1816965B1 (en) | 2004-11-26 | 2016-06-29 | Hologic, Inc. | Integrated multi-mode mammography/tomosynthesis x-ray system |
US7522755B2 (en) * | 2005-03-01 | 2009-04-21 | General Electric Company | Systems, methods and apparatus for filtered back-projection reconstruction in digital tomosynthesis |
FI20055168A0 (en) * | 2005-04-13 | 2005-04-13 | Gen Electric | Tomography method
US7330578B2 (en) * | 2005-06-23 | 2008-02-12 | Accuray Inc. | DRR generation and enhancement using a dedicated graphics device |
US7494344B2 (en) * | 2005-12-29 | 2009-02-24 | Molex Incorporated | Heating element connector assembly with press-fit terminals |
CN100447816C (en) * | 2005-12-31 | 2008-12-31 | 清华大学 | 3D analysis and analog method for CT projection data |
WO2007095330A2 (en) | 2006-02-15 | 2007-08-23 | Hologic Inc | Breast biopsy and needle localization using tomosynthesis systems |
FR2897461A1 (en) * | 2006-02-16 | 2007-08-17 | Gen Electric | X-RAY DEVICE AND IMAGE PROCESSING METHOD |
US7466790B2 (en) * | 2006-03-02 | 2008-12-16 | General Electric Company | Systems and methods for improving a resolution of an image |
DE102006021051A1 (en) * | 2006-05-05 | 2007-11-15 | Siemens Ag | Medical image e.g. computer tomography image, generating method for use during surgery, involves producing volume image data set based on reconstruction volume, and generating medical images with respect to section plane of image data set |
US20080008372A1 (en) * | 2006-07-07 | 2008-01-10 | General Electric Company | A method and system for reducing artifacts in a tomosynthesis imaging system |
US8577171B1 (en) * | 2006-07-31 | 2013-11-05 | Gatan, Inc. | Method for normalizing multi-gain images |
US7348776B1 (en) * | 2006-09-01 | 2008-03-25 | The Board Of Trustees Of The Leland Stanford Junior University | Motion corrected magnetic resonance imaging |
US7621169B2 (en) * | 2006-10-26 | 2009-11-24 | General Electric Company | Systems and methods for integrating a navigation field replaceable unit into a fluoroscopy system |
JP4350738B2 (en) * | 2006-10-27 | 2009-10-21 | ジーイー・メディカル・システムズ・グローバル・テクノロジー・カンパニー・エルエルシー | X-ray tomography apparatus and artifact reduction method |
JP4414420B2 (en) * | 2006-10-27 | 2010-02-10 | ジーイー・メディカル・システムズ・グローバル・テクノロジー・カンパニー・エルエルシー | X-ray tomography apparatus and artifact reduction method |
US8000522B2 (en) * | 2007-02-02 | 2011-08-16 | General Electric Company | Method and system for three-dimensional imaging in a non-calibrated geometry |
EP2114312B1 (en) | 2007-02-14 | 2014-01-08 | ConforMIS, Inc. | Method for manufacture of an implant device |
US8971658B2 (en) * | 2007-04-23 | 2015-03-03 | Xerox Corporation | Edge contrast adjustment filter |
WO2008153836A2 (en) * | 2007-05-31 | 2008-12-18 | President And Fellows Of Harvard College | Target-locking acquisition with real-time confocal (tarc) microscopy |
US7920669B2 (en) * | 2007-07-25 | 2011-04-05 | Siemens Aktiengesellschaft | Methods, apparatuses and computer readable mediums for generating images based on multi-energy computed tomography data |
US9786022B2 (en) | 2007-09-30 | 2017-10-10 | DePuy Synthes Products, Inc. | Customized patient-specific bone cutting blocks |
US8357111B2 (en) | 2007-09-30 | 2013-01-22 | Depuy Products, Inc. | Method and system for designing patient-specific orthopaedic surgical instruments |
US9173662B2 (en) | 2007-09-30 | 2015-11-03 | DePuy Synthes Products, Inc. | Customized patient-specific tibial cutting blocks |
ES2802126T3 (en) | 2007-09-30 | 2021-01-15 | Depuy Products Inc | Patient Specific Custom Orthopedic Surgical Instrument |
RU2359614C1 (en) * | 2007-10-31 | 2009-06-27 | Закрытое Акционерное Общество "Импульс" | Method of calibrating digital x-ray machine (versions) |
EP2901969B1 (en) * | 2008-03-05 | 2018-07-04 | ConforMIS, Inc. | Method of making an edge-matched articular implant |
US8682052B2 (en) | 2008-03-05 | 2014-03-25 | Conformis, Inc. | Implants for altering wear patterns of articular surfaces |
DE102008025538B4 (en) * | 2008-05-28 | 2017-08-17 | Siemens Healthcare Gmbh | Method for calibrating a multilevel X-ray machine, calibration unit and multilevel X-ray machine |
JP5380121B2 (en) * | 2008-06-09 | 2014-01-08 | 株式会社東芝 | Ultrasonic diagnostic equipment |
US8024060B2 (en) * | 2008-06-16 | 2011-09-20 | Electro Scientific Industries, Inc. | Method for defining safe zones in laser machining systems |
DE102008032479A1 (en) * | 2008-07-10 | 2010-01-21 | Siemens Aktiengesellschaft | Method for determining attenuation values of an object |
US9170200B2 (en) | 2008-07-24 | 2015-10-27 | Massachusetts Institute Of Technology | Inflatable membrane with hazard mitigation |
US9291565B2 (en) * | 2008-07-24 | 2016-03-22 | Massachusetts Institute Of Technology | Three dimensional scanning using membrane with optical features |
US9140649B2 (en) | 2008-07-24 | 2015-09-22 | Massachusetts Institute Of Technology | Inflatable membrane having non-uniform inflation characteristic |
US9170199B2 (en) | 2008-07-24 | 2015-10-27 | Massachusetts Institute Of Technology | Enhanced sensors in three dimensional scanning system |
EP2156790B1 (en) * | 2008-08-22 | 2012-03-28 | BrainLAB AG | Allocation of x-ray markers to picture markers depicted in an x-ray picture |
EP2192546A1 (en) * | 2008-12-01 | 2010-06-02 | Nederlandse Organisatie voor toegepast-natuurwetenschappelijk Onderzoek TNO | Method for recognizing objects in a set of images recorded by one or more cameras |
US9213176B2 (en) | 2008-12-02 | 2015-12-15 | The Regents Of The University Of California | Imaging arrangement and microscope |
US8515004B2 (en) * | 2009-01-16 | 2013-08-20 | Varian Medical Systems, Inc. | Real-time motion tracking using tomosynthesis |
EP2405865B1 (en) | 2009-02-24 | 2019-04-17 | ConforMIS, Inc. | Automated systems for manufacturing patient-specific orthopedic implants and instrumentation |
WO2011043838A1 (en) | 2009-10-08 | 2011-04-14 | Hologic, Inc . | Needle breast biopsy system and method of use |
JP2011089897A (en) * | 2009-10-22 | 2011-05-06 | Mitsutoyo Corp | Form measuring device and method of aligning form data |
US9082182B2 (en) | 2009-11-25 | 2015-07-14 | Dental Imaging Technologies Corporation | Extracting patient motion vectors from marker positions in x-ray images |
US9826942B2 (en) | 2009-11-25 | 2017-11-28 | Dental Imaging Technologies Corporation | Correcting and reconstructing x-ray images using patient motion vectors extracted from marker positions in x-ray images |
US9082177B2 (en) | 2009-11-25 | 2015-07-14 | Dental Imaging Technologies Corporation | Method for tracking X-ray markers in serial CT projection images |
US9082036B2 (en) | 2009-11-25 | 2015-07-14 | Dental Imaging Technologies Corporation | Method for accurate sub-pixel localization of markers on X-ray images |
CA2782137A1 (en) | 2009-12-11 | 2011-06-16 | Conformis, Inc. | Patient-specific and patient-engineered orthopedic implants |
JP4995888B2 (en) * | 2009-12-15 | 2012-08-08 | 株式会社神戸製鋼所 | Stainless steel arc welding flux cored wire |
EP2538853A4 (en) | 2010-02-25 | 2016-07-27 | Depuy Products Inc | Customized patient-specific bone cutting blocks |
WO2011106407A1 (en) | 2010-02-25 | 2011-09-01 | Depuy Products, Inc. | Method of fabricating customized patient-specific bone cutting blocks |
EP2538855A4 (en) | 2010-02-25 | 2016-08-03 | Depuy Products Inc | Customized patient-specific tibial cutting blocks |
DE102010040963A1 (en) * | 2010-09-17 | 2012-03-22 | Siemens Aktiengesellschaft | Method and X-ray machine for generating an X-ray projection image |
WO2012071429A1 (en) | 2010-11-26 | 2012-05-31 | Hologic, Inc. | User interface for medical image review workstation |
EP2754419B1 (en) | 2011-02-15 | 2024-02-07 | ConforMIS, Inc. | Patient-adapted and improved orthopedic implants |
AU2012225398B2 (en) | 2011-03-08 | 2017-02-02 | Hologic, Inc. | System and method for dual energy and/or contrast enhanced breast imaging for screening, diagnosis and biopsy |
US9360571B2 (en) * | 2011-04-25 | 2016-06-07 | Generic Imaging Ltd. | System and method for correction of vignetting effect in multi-camera flat panel x-ray detectors |
US8641721B2 (en) | 2011-06-30 | 2014-02-04 | DePuy Synthes Products, LLC | Customized patient-specific orthopaedic pin guides |
US10491915B2 (en) | 2011-07-05 | 2019-11-26 | Texas Instruments Incorporated | Method, system and computer program product for encoding disparities between views of a stereoscopic image |
KR102109588B1 (en) | 2011-11-27 | 2020-05-12 | 홀로직, 인크. | Methods for processing, displaying and navigating breast tissue images |
ES2641456T3 (en) | 2012-02-13 | 2017-11-10 | Hologic, Inc. | System and method for navigating a tomosynthesis stack using synthesized image data |
KR101318552B1 (en) * | 2012-03-12 | 2013-10-16 | 가톨릭대학교 산학협력단 | Method for measuring recognition warping about 3d image |
US9439622B2 (en) | 2012-05-22 | 2016-09-13 | Covidien Lp | Surgical navigation system |
US9439627B2 (en) | 2012-05-22 | 2016-09-13 | Covidien Lp | Planning system and navigation system for an ablation procedure |
US9439623B2 (en) | 2012-05-22 | 2016-09-13 | Covidien Lp | Surgical planning system and navigation system |
US8750568B2 (en) | 2012-05-22 | 2014-06-10 | Covidien Lp | System and method for conformal ablation planning |
US9498182B2 (en) | 2012-05-22 | 2016-11-22 | Covidien Lp | Systems and methods for planning and navigation |
US8982338B2 (en) * | 2012-05-31 | 2015-03-17 | Thermo Scientific Portable Analytical Instruments Inc. | Sample analysis |
US20140052420A1 (en) * | 2012-08-20 | 2014-02-20 | Ingrain Inc. | Digital Rock Analysis Systems and Methods that Estimate a Maturity Level |
US9091628B2 (en) | 2012-12-21 | 2015-07-28 | L-3 Communications Security And Detection Systems, Inc. | 3D mapping with two orthogonal imaging views |
US9131922B2 (en) * | 2013-01-29 | 2015-09-15 | Eigen, Inc. | Calibration for 3D reconstruction of medical images from a sequence of 2D images |
US10092358B2 (en) | 2013-03-15 | 2018-10-09 | Hologic, Inc. | Tomosynthesis-guided biopsy apparatus and method |
US20150120777A1 (en) * | 2013-10-24 | 2015-04-30 | Olivia Ramos | System and Method for Mining Data Using Haptic Feedback |
WO2015130916A1 (en) | 2014-02-28 | 2015-09-03 | Hologic, Inc. | System and method for generating and displaying tomosynthesis image slabs |
US9861332B2 (en) * | 2014-09-19 | 2018-01-09 | Fujifilm Corporation | Tomographic image generation device and method, and recording medium |
US9986983B2 (en) | 2014-10-31 | 2018-06-05 | Covidien Lp | Computed tomography enhanced fluoroscopic system, device, and method of utilizing the same |
US9565421B2 (en) | 2014-11-25 | 2017-02-07 | Harold O. Hosea | Device for creating and enhancing three-dimensional image effects |
US10306208B2 (en) | 2014-11-05 | 2019-05-28 | Harold O. Hosea | Device for creating and enhancing three-dimensional image effects |
US9872663B2 (en) * | 2015-02-04 | 2018-01-23 | Dentsply Sirona Inc. | Methods, systems, apparatuses, and computer programs for removing marker artifact contribution from a tomosynthesis dataset |
DE112016000842T5 (en) | 2015-02-20 | 2017-11-23 | Artium Technologies, Inc. | Cross-beam imaging using multiple beam and convergent-light illumination |
DE102015206630B4 (en) * | 2015-04-14 | 2022-05-05 | Siemens Healthcare Gmbh | Multispectral CT imaging |
US10163262B2 (en) | 2015-06-19 | 2018-12-25 | Covidien Lp | Systems and methods for navigating through airways in a virtual bronchoscopy view |
US10702226B2 (en) | 2015-08-06 | 2020-07-07 | Covidien Lp | System and method for local three dimensional volume reconstruction using a standard fluoroscope |
US10716525B2 (en) | 2015-08-06 | 2020-07-21 | Covidien Lp | System and method for navigating to target and performing procedure on target utilizing fluoroscopic-based local three dimensional volume reconstruction |
US10674982B2 (en) | 2015-08-06 | 2020-06-09 | Covidien Lp | System and method for local three dimensional volume reconstruction using a standard fluoroscope |
US11172895B2 (en) | 2015-12-07 | 2021-11-16 | Covidien Lp | Visualization, navigation, and planning with electromagnetic navigation bronchoscopy and cone beam computed tomography integrated |
US11029263B2 (en) * | 2015-12-09 | 2021-06-08 | Integrated-X, Inc. | Systems and methods for inspection using electromagnetic radiation |
US11076820B2 (en) | 2016-04-22 | 2021-08-03 | Hologic, Inc. | Tomosynthesis with shifting focal spot x-ray system using an addressable array |
JP6791479B2 (en) * | 2016-06-06 | 2020-11-25 | 富士電機株式会社 | Inspection system |
US11051886B2 (en) | 2016-09-27 | 2021-07-06 | Covidien Lp | Systems and methods for performing a surgical navigation procedure |
JP7174710B2 (en) | 2017-03-30 | 2022-11-17 | ホロジック, インコーポレイテッド | Systems and Methods for Targeted Object Augmentation to Generate Synthetic Breast Tissue Images |
EP3600047A1 (en) | 2017-03-30 | 2020-02-05 | Hologic, Inc. | System and method for hierarchical multi-level feature image synthesis and representation |
JP7169986B2 (en) | 2017-03-30 | 2022-11-11 | ホロジック, インコーポレイテッド | Systems and methods for synthesizing low-dimensional image data from high-dimensional image data using object grid augmentation |
WO2018204705A1 (en) | 2017-05-03 | 2018-11-08 | Turner Innovations, Llc. | Three dimensional x-ray imaging system |
US11403483B2 (en) | 2017-06-20 | 2022-08-02 | Hologic, Inc. | Dynamic self-learning medical image method and system |
US10699448B2 (en) | 2017-06-29 | 2020-06-30 | Covidien Lp | System and method for identifying, marking and navigating to a target using real time two dimensional fluoroscopic data |
EP4129188A1 (en) | 2017-08-16 | 2023-02-08 | Hologic, Inc. | Techniques for breast imaging patient motion artifact compensation |
EP3449835B1 (en) | 2017-08-22 | 2023-01-11 | Hologic, Inc. | Computed tomography system and method for imaging multiple anatomical targets |
CN111163697B (en) | 2017-10-10 | 2023-10-03 | 柯惠有限合伙公司 | System and method for identifying and marking targets in fluorescent three-dimensional reconstruction |
US10905498B2 (en) | 2018-02-08 | 2021-02-02 | Covidien Lp | System and method for catheter detection in fluoroscopic images and updating displayed position of catheter |
US10705001B2 (en) * | 2018-04-23 | 2020-07-07 | Artium Technologies, Inc. | Particle field imaging and characterization using VCSEL lasers for convergent multi-beam illumination |
US11051829B2 (en) | 2018-06-26 | 2021-07-06 | DePuy Synthes Products, Inc. | Customized patient-specific orthopaedic surgical instrument |
US10743822B2 (en) * | 2018-06-29 | 2020-08-18 | Carestream Health, Inc. | Fiducial marker for geometric calibration of bed-side mobile tomosynthesis system |
CN112566581B (en) | 2018-08-10 | 2024-03-19 | 柯惠有限合伙公司 | System for ablation visualization |
US11090017B2 (en) | 2018-09-13 | 2021-08-17 | Hologic, Inc. | Generating synthesized projection images for 3D breast tomosynthesis or multi-mode x-ray breast imaging |
DE102019204765B3 (en) * | 2019-04-03 | 2020-06-18 | Siemens Healthcare Gmbh | Method for determining a three-dimensional tomosynthesis data set, X-ray device, computer program and electronically readable data carrier |
US11172908B2 (en) * | 2019-07-30 | 2021-11-16 | GE Precision Healthcare LLC | Method and systems for correcting x-ray detector tilt in x-ray imaging |
EP3832689A3 (en) | 2019-12-05 | 2021-08-11 | Hologic, Inc. | Systems and methods for improved x-ray tube life |
US11471118B2 (en) | 2020-03-27 | 2022-10-18 | Hologic, Inc. | System and method for tracking x-ray tube focal spot position |
US11786191B2 (en) | 2021-05-17 | 2023-10-17 | Hologic, Inc. | Contrast-enhanced tomosynthesis with a copper filter |
Citations (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4662379A (en) * | 1984-12-20 | 1987-05-05 | Stanford University | Coronary artery imaging system using gated tomosynthesis |
US4722056A (en) * | 1986-02-18 | 1988-01-26 | Trustees Of Dartmouth College | Reference display systems for superimposing a tomagraphic image onto the focal plane of an operating microscope |
US4920491A (en) * | 1988-05-16 | 1990-04-24 | General Electric Company | Enhancement of image quality by utilization of a priori information |
US4941164A (en) * | 1987-10-29 | 1990-07-10 | The Governors Of The University Of Alberta | Method and apparatus for improving the alignment of radiographic images |
US5008947A (en) * | 1988-10-13 | 1991-04-16 | Kabushiki Kaisha Toshiba | Method and apparatus for correcting extension rates of images |
US5051904A (en) * | 1988-03-24 | 1991-09-24 | Olganix Corporation | Computerized dynamic tomography system |
US5070454A (en) * | 1988-03-24 | 1991-12-03 | Olganix Corporation | Reference marker orientation system for a radiographic film-based computerized tomography system |
US5081577A (en) * | 1989-12-22 | 1992-01-14 | Harris Corporation | State controlled device driver for a real time computer control system |
US5227969A (en) * | 1988-08-01 | 1993-07-13 | W. L. Systems, Inc. | Manipulable three-dimensional projection imaging method |
US5299254A (en) * | 1989-11-24 | 1994-03-29 | Technomed International | Method and apparatus for determining the position of a target relative to a reference of known co-ordinates and without a priori knowledge of the position of a source of radiation |
US5359637A (en) * | 1992-04-28 | 1994-10-25 | Wake Forest University | Self-calibrated tomosynthetic, radiographic-imaging system, method, and device |
US5446548A (en) * | 1993-10-08 | 1995-08-29 | Siemens Medical Systems, Inc. | Patient positioning and monitoring system |
US5642293A (en) * | 1996-06-03 | 1997-06-24 | Camsys, Inc. | Method and apparatus for determining surface profile and/or surface strain |
US5751787A (en) * | 1996-09-05 | 1998-05-12 | Nanoptics, Inc. | Materials and methods for improved radiography |
US5755725A (en) * | 1993-09-07 | 1998-05-26 | Deemed International, S.A. | Computer-assisted microsurgery methods and equipment |
US5828722A (en) * | 1996-05-17 | 1998-10-27 | Sirona Dental Systems Gmbh & Co., Kg | X-ray diagnostic apparatus for tomosynthesis having a detector that detects positional relationships |
US5872828A (en) * | 1996-07-23 | 1999-02-16 | The General Hospital Corporation | Tomosynthesis system for breast imaging |
US5878104A (en) * | 1996-05-17 | 1999-03-02 | Sirona Dental Systems Gmbh & Co. Kg | Method for producing tomosynthesis exposures employing a reference object formed by a region of the examination subject |
US5964530A (en) * | 1996-06-24 | 1999-10-12 | Fons; Lloyd C. | Method for compensating earth surface temperatures for the skyward effect thereon |
US6081577A (en) * | 1998-07-24 | 2000-06-27 | Wake Forest University | Method and system for creating task-dependent three-dimensional images |
US6118845A (en) * | 1998-06-29 | 2000-09-12 | Surgical Navigation Technologies, Inc. | System and methods for the reduction and elimination of image artifacts in the calibration of X-ray imagers |
US6120180A (en) * | 1997-10-17 | 2000-09-19 | Siemens Aktiengesellschaft | X-ray exposure system for 3D imaging |
US6122541A (en) * | 1995-05-04 | 2000-09-19 | Radionics, Inc. | Head band for frameless stereotactic registration |
US6146390A (en) * | 1992-04-21 | 2000-11-14 | Sofamor Danek Holdings, Inc. | Apparatus and method for photogrammetric surgical localization |
US6249568B1 (en) * | 1998-06-19 | 2001-06-19 | Commissariat A L'energie Atomique | Process for improving a signal/noise ratio of the image of a moving object |
US6275725B1 (en) * | 1991-01-28 | 2001-08-14 | Radionics, Inc. | Stereotactic optical navigation |
US6289235B1 (en) * | 1998-03-05 | 2001-09-11 | Wake Forest University | Method and system for creating three-dimensional images using tomosynthetic computed tomography |
US6351573B1 (en) * | 1994-01-28 | 2002-02-26 | Schneider Medical Technologies, Inc. | Imaging device and method |
US6405072B1 (en) * | 1991-01-28 | 2002-06-11 | Sherwood Services Ag | Apparatus and method for determining a location of an anatomical target with reference to a medical apparatus |
US20030026469A1 (en) * | 2001-07-30 | 2003-02-06 | Accuimage Diagnostics Corp. | Methods and systems for combining a plurality of radiographic images |
US6684098B2 (en) * | 1996-08-16 | 2004-01-27 | Brigham And Women's Hospital, Inc. | Versatile stereotactic device and methods of use |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE2810608A1 (en) * | 1978-03-11 | 1979-09-20 | Philips Patentverwaltung | PROCESS FOR LOW-DISTURBANCE LAYER REPRESENTATION OF SPATIAL OBJECTS BY USING DIFFERENT PERSPECTIVE IMAGES |
US4554676A (en) | 1983-03-16 | 1985-11-19 | The S. S. White Company | Dental aiming device |
IL109385A (en) | 1993-04-22 | 1998-03-10 | Pixsys | System for locating the relative positions of objects in three dimensional space |
US6158888A (en) * | 1996-09-05 | 2000-12-12 | University Of Florida | Materials and methods for improved radiography |
1999
- 1999-02-19 US US09/252,632 patent/US6081577A/en not_active Expired - Fee Related
- 1999-07-22 AU AU52235/99A patent/AU5223599A/en not_active Abandoned
- 1999-07-22 WO PCT/US1999/016618 patent/WO2000004830A1/en active Application Filing

2000
- 2000-04-28 US US09/561,376 patent/US6549607B1/en not_active Expired - Fee Related

2003
- 2003-04-14 US US10/414,439 patent/US6801597B2/en not_active Expired - Fee Related

2004
- 2004-10-05 US US10/958,972 patent/US20050059886A1/en not_active Abandoned
Patent Citations (37)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4662379A (en) * | 1984-12-20 | 1987-05-05 | Stanford University | Coronary artery imaging system using gated tomosynthesis |
US4722056A (en) * | 1986-02-18 | 1988-01-26 | Trustees Of Dartmouth College | Reference display systems for superimposing a tomagraphic image onto the focal plane of an operating microscope |
US4941164A (en) * | 1987-10-29 | 1990-07-10 | The Governors Of The University Of Alberta | Method and apparatus for improving the alignment of radiographic images |
US5051904A (en) * | 1988-03-24 | 1991-09-24 | Olganix Corporation | Computerized dynamic tomography system |
US5070454A (en) * | 1988-03-24 | 1991-12-03 | Olganix Corporation | Reference marker orientation system for a radiographic film-based computerized tomography system |
US5319550A (en) * | 1988-03-24 | 1994-06-07 | Olganix Corporation | High resolution digital image registration |
US4920491A (en) * | 1988-05-16 | 1990-04-24 | General Electric Company | Enhancement of image quality by utilization of a priori information |
US5227969A (en) * | 1988-08-01 | 1993-07-13 | W. L. Systems, Inc. | Manipulable three-dimensional projection imaging method |
US5008947A (en) * | 1988-10-13 | 1991-04-16 | Kabushiki Kaisha Toshiba | Method and apparatus for correcting extension rates of images |
US5299254A (en) * | 1989-11-24 | 1994-03-29 | Technomed International | Method and apparatus for determining the position of a target relative to a reference of known co-ordinates and without a priori knowledge of the position of a source of radiation |
US5081577A (en) * | 1989-12-22 | 1992-01-14 | Harris Corporation | State controlled device driver for a real time computer control system |
US6275725B1 (en) * | 1991-01-28 | 2001-08-14 | Radionics, Inc. | Stereotactic optical navigation |
US6405072B1 (en) * | 1991-01-28 | 2002-06-11 | Sherwood Services Ag | Apparatus and method for determining a location of an anatomical target with reference to a medical apparatus |
US6146390A (en) * | 1992-04-21 | 2000-11-14 | Sofamor Danek Holdings, Inc. | Apparatus and method for photogrammetric surgical localization |
US5668844A (en) * | 1992-04-28 | 1997-09-16 | Webber; Richard L. | Self-calibrated tomosynthetic, radiographic-imaging system, method, and device |
US5359637A (en) * | 1992-04-28 | 1994-10-25 | Wake Forest University | Self-calibrated tomosynthetic, radiographic-imaging system, method, and device |
US5755725A (en) * | 1993-09-07 | 1998-05-26 | Deemed International, S.A. | Computer-assisted microsurgery methods and equipment |
US5446548A (en) * | 1993-10-08 | 1995-08-29 | Siemens Medical Systems, Inc. | Patient positioning and monitoring system |
US6351573B1 (en) * | 1994-01-28 | 2002-02-26 | Schneider Medical Technologies, Inc. | Imaging device and method |
US6122541A (en) * | 1995-05-04 | 2000-09-19 | Radionics, Inc. | Head band for frameless stereotactic registration |
US5878104A (en) * | 1996-05-17 | 1999-03-02 | Sirona Dental Systems Gmbh & Co. Kg | Method for producing tomosynthesis exposures employing a reference object formed by a region of the examination subject |
US5828722A (en) * | 1996-05-17 | 1998-10-27 | Sirona Dental Systems Gmbh & Co., Kg | X-ray diagnostic apparatus for tomosynthesis having a detector that detects positional relationships |
US5642293A (en) * | 1996-06-03 | 1997-06-24 | Camsys, Inc. | Method and apparatus for determining surface profile and/or surface strain |
US5964530A (en) * | 1996-06-24 | 1999-10-12 | Fons; Lloyd C. | Method for compensating earth surface temperatures for the skyward effect thereon |
US5872828A (en) * | 1996-07-23 | 1999-02-16 | The General Hospital Corporation | Tomosynthesis system for breast imaging |
US6684098B2 (en) * | 1996-08-16 | 2004-01-27 | Brigham And Women's Hospital, Inc. | Versatile stereotactic device and methods of use |
US5751787A (en) * | 1996-09-05 | 1998-05-12 | Nanoptics, Inc. | Materials and methods for improved radiography |
US6120180A (en) * | 1997-10-17 | 2000-09-19 | Siemens Aktiengesellschaft | X-ray exposure system for 3D imaging |
US6289235B1 (en) * | 1998-03-05 | 2001-09-11 | Wake Forest University | Method and system for creating three-dimensional images using tomosynthetic computed tomography |
US20010034482A1 (en) * | 1998-03-05 | 2001-10-25 | Webber Richard L. | Method and system for creating three-dimensional images using tomosynthetic computed tomography |
US6810278B2 (en) * | 1998-03-05 | 2004-10-26 | Wake Forest University | Method and system for creating three-dimensional images using tomosynthetic computed tomography |
US6249568B1 (en) * | 1998-06-19 | 2001-06-19 | Commissariat A L'energie Atomique | Process for improving a signal/noise ratio of the image of a moving object |
US6118845A (en) * | 1998-06-29 | 2000-09-12 | Surgical Navigation Technologies, Inc. | System and methods for the reduction and elimination of image artifacts in the calibration of X-ray imagers |
US6549607B1 (en) * | 1998-07-24 | 2003-04-15 | Wake Forest University | Method and system for creating task-dependent three-dimensional images |
US6081577A (en) * | 1998-07-24 | 2000-06-27 | Wake Forest University | Method and system for creating task-dependent three-dimensional images |
US6801597B2 (en) * | 1998-07-24 | 2004-10-05 | Wake Forest University Health Sciences | Method and system for creating task-dependent three-dimensional images |
US20030026469A1 (en) * | 2001-07-30 | 2003-02-06 | Accuimage Diagnostics Corp. | Methods and systems for combining a plurality of radiographic images |
Cited By (44)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7801587B2 (en) | 1998-03-05 | 2010-09-21 | Wake Forest University Health Sciences | Method and system for creating three-dimensional images using tomosynthetic computed tomography |
US20110092812A1 (en) * | 1998-03-05 | 2011-04-21 | Webber Richard L | Method and system for creating three-dimensional images using tomosynthetic computed tomography |
US7110807B2 (en) | 1998-03-05 | 2006-09-19 | Wake Forest University Health Sciences | Method and system for creating three-dimensional images using tomosynthetic computed tomography |
US20070165922A1 (en) * | 1998-03-05 | 2007-07-19 | Webber Richard L | Method and system for creating three-dimensional images using tomosynthetic computed tomography |
US7519415B2 (en) * | 2003-12-19 | 2009-04-14 | Siemens Aktiengesellschaft | Method and apparatus for image support of an operative procedure implemented with a medical instrument |
US20050163279A1 (en) * | 2003-12-19 | 2005-07-28 | Matthias Mitschke | Method and apparatus for image support of an operative procedure implemented with a medical instrument |
US20050182319A1 (en) * | 2004-02-17 | 2005-08-18 | Glossop Neil D. | Method and apparatus for registration, verification, and referencing of internal organs |
US10582879B2 (en) | 2004-02-17 | 2020-03-10 | Philips Electronics Ltd | Method and apparatus for registration, verification and referencing of internal organs |
US20060082590A1 (en) * | 2004-10-14 | 2006-04-20 | Stevick Glen R | Method and apparatus for dynamic space-time imaging system |
US7620209B2 (en) * | 2004-10-14 | 2009-11-17 | Stevick Glen R | Method and apparatus for dynamic space-time imaging system |
US20080071215A1 (en) * | 2004-11-05 | 2008-03-20 | Traxtal Technologies Inc. | Access System |
US7722565B2 (en) | 2004-11-05 | 2010-05-25 | Traxtal, Inc. | Access system |
US7805269B2 (en) | 2004-11-12 | 2010-09-28 | Philips Electronics Ltd | Device and method for ensuring the accuracy of a tracking device in a volume |
US20060173269A1 (en) * | 2004-11-12 | 2006-08-03 | Glossop Neil D | Integrated skin-mounted multifunction device for use in image-guided surgery |
US7751868B2 (en) | 2004-11-12 | 2010-07-06 | Philips Electronics Ltd | Integrated skin-mounted multifunction device for use in image-guided surgery |
US20060122497A1 (en) * | 2004-11-12 | 2006-06-08 | Glossop Neil D | Device and method for ensuring the accuracy of a tracking device in a volume |
US7840254B2 (en) | 2005-01-18 | 2010-11-23 | Philips Electronics Ltd | Electromagnetically tracked K-wire device |
US20060184016A1 (en) * | 2005-01-18 | 2006-08-17 | Glossop Neil D | Method and apparatus for guiding an instrument to a target in the lung |
US20060173291A1 (en) * | 2005-01-18 | 2006-08-03 | Glossop Neil D | Electromagnetically tracked K-wire device |
US8611983B2 (en) | 2005-01-18 | 2013-12-17 | Philips Electronics Ltd | Method and apparatus for guiding an instrument to a target in the lung |
US9398892B2 (en) | 2005-06-21 | 2016-07-26 | Koninklijke Philips N.V. | Device and method for a trackable ultrasound |
US8632461B2 (en) | 2005-06-21 | 2014-01-21 | Koninklijke Philips N.V. | System, method and apparatus for navigated therapy and diagnosis |
US20070032723A1 (en) * | 2005-06-21 | 2007-02-08 | Glossop Neil D | System, method and apparatus for navigated therapy and diagnosis |
US20070167787A1 (en) * | 2005-06-21 | 2007-07-19 | Glossop Neil D | Device and method for a trackable ultrasound |
WO2008045016A3 (en) * | 2005-06-21 | 2008-09-25 | Traxtal Inc | Device and method for a trackable ultrasound |
US9661991B2 (en) | 2005-08-24 | 2017-05-30 | Koninklijke Philips N.V. | System, method and devices for navigated flexible endoscopy |
US20070055128A1 (en) * | 2005-08-24 | 2007-03-08 | Glossop Neil D | System, method and devices for navigated flexible endoscopy |
US20140152782A1 (en) * | 2006-04-21 | 2014-06-05 | Expert Treuhand Gmbh | Method and device for the creation of pseudo-holographic images |
US20100171811A1 (en) * | 2006-04-21 | 2010-07-08 | Expert Treuhand Gmbh | Method and device for the creation of pseudo-holographic images |
US8633967B2 (en) * | 2006-04-21 | 2014-01-21 | Expert Treuhand Gmbh | Method and device for the creation of pseudo-holographic images |
US9083963B2 (en) * | 2006-04-21 | 2015-07-14 | Expert Treuhand Gmbh | Method and device for the creation of pseudo-holographic images |
US20080183071A1 (en) * | 2007-01-10 | 2008-07-31 | Mediguide Ltd. | System and method for superimposing a representation of the tip of a catheter on an image acquired by a moving imager |
US20130094755A1 (en) * | 2007-09-26 | 2013-04-18 | Carl Zeiss Microlmaging Gmbh | Method for the microscopic three-dimensional reproduction of a sample |
US9697605B2 (en) * | 2007-09-26 | 2017-07-04 | Carl Zeiss Microscopy Gmbh | Method for the microscopic three-dimensional reproduction of a sample |
US10291905B2 (en) * | 2010-03-01 | 2019-05-14 | Apple Inc. | Non-uniform spatial resource allocation for depth mapping |
US20150145961A1 (en) * | 2010-03-01 | 2015-05-28 | Apple Inc. | Non-uniform spatial resource allocation for depth mapping |
US20130064430A1 (en) * | 2010-05-26 | 2013-03-14 | Nec Corporation | Image processing device, image processing method, and image processing program |
US9053522B2 (en) * | 2010-05-26 | 2015-06-09 | Nec Corporation | Image processing device, image processing method, and image processing program |
US8884958B2 (en) * | 2011-06-09 | 2014-11-11 | Kabushiki Kaisha Toshiba | Image processing system and method thereof |
US20120313943A1 (en) * | 2011-06-09 | 2012-12-13 | Toshiba Medical Systems Corporation | Image processing system and method thereof |
US9510771B1 (en) | 2011-10-28 | 2016-12-06 | Nuvasive, Inc. | Systems and methods for performing spine surgery |
USRE49094E1 (en) | 2011-10-28 | 2022-06-07 | Nuvasive, Inc. | Systems and methods for performing spine surgery |
US9848922B2 (en) | 2013-10-09 | 2017-12-26 | Nuvasive, Inc. | Systems and methods for performing spine surgery |
WO2019038304A1 (en) * | 2017-08-23 | 2019-02-28 | Carestream Dental Technology Topco Limited | Dental chair-side tomosynthesis system |
Also Published As
Publication number | Publication date |
---|---|
US20040008809A1 (en) | 2004-01-15 |
US6549607B1 (en) | 2003-04-15 |
US6801597B2 (en) | 2004-10-05 |
US6081577A (en) | 2000-06-27 |
WO2000004830A1 (en) | 2000-02-03 |
AU5223599A (en) | 2000-02-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6549607B1 (en) | Method and system for creating task-dependent three-dimensional images | |
EP1059877B1 (en) | System for creating three-dimensional images using tomosynthetic computed tomography | |
US9907520B2 (en) | Digital tomosynthesis systems, methods, and computer readable media for intraoral dental tomosynthesis imaging | |
EP0932363B1 (en) | Tomosynthesis system for breast imaging | |
EP2614773B1 (en) | X-ray tomogram imaging device | |
RU2545319C2 (en) | Phase-contrast image formation | |
JP7382042B2 (en) | Fixed intraoral tomosynthesis imaging system, method, and computer-readable medium for three-dimensional dental imaging | |
US20100172472A1 (en) | Collecting images for image stitching with rotating a radiation detector | |
WO2011016508A1 (en) | Radiation imaging apparatus and imaging method using radiation | |
US20170319160A1 (en) | Stationary intraoral tomosynthesis imaging systems, methods, and computer readable media for three dimensional dental imaging | |
JPH06269444A (en) | Method for generating three-dimensional radiograph | |
KR20180004134A (en) | Method for improving image data from a tooth image generation system | |
US6317481B1 (en) | Stereo x-ray image processing | |
JP2000237177A (en) | X-ray three-dimensional image photographing method and device | |
US11162909B2 (en) | System and method for colorizing a radiograph from cabinet X-ray systems | |
JP2000135211A (en) | Radiographic instrument, radiography, memory medium recording program to realize radiography, and transmission medium to be transmitted | |
Ye | Advanced Image Reconstruction for Limited View Cone-Beam CT | |
JPH0461851A (en) | Photographing of tomography image by radioactive ray and device therefor |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |