Publication number: US 8175681 B2
Publication type: Grant
Application number: US 12/336,085
Publication date: 8 May 2012
Filing date: 16 Dec 2008
Priority date: 16 Dec 2008
Also published as: EP2376935A1, EP2376935B1, US20100152571, US20120220860, WO2010074986A1
Inventors: Steven L. Hartmann, Andrew Bzostek, Bradley A. Jascob
Original Assignee: Medtronic Navigation, Inc.
Combination of electromagnetic and electropotential localization
US 8175681 B2
Abstract
A navigation system or combination of navigation systems can be used to provide two or more types of navigation or modalities of navigation to navigate a single instrument. The single instrument can be positioned within the patient and tracked. For example, both an Electromagnetic (EM) and Electropotential (EP) navigation system can be used to navigate an instrument within a patient.
Claims (29)
1. A system for tracking an instrument within a volume, comprising:
a first electromagnetic tracking system including a first tracking device operable to determine a position of the first tracking device within a first tracking domain;
a second electropotential tracking system including:
a plurality of current electrodes operable to induce a current into the volume to form a second tracking domain;
a second tracking device operable to sense a voltage at the second tracking device based upon the induced current;
an instrument having a first portion connected to the first tracking device and a second portion connected to the second tracking device to allow tracking of the instrument with at least one of the first tracking system or the second tracking system, wherein the second portion is operable to move relative to the first portion to place the second tracking device at a known position relative to the first tracking device;
a controller in communication with the first electromagnetic tracking system and the second electropotential tracking system;
wherein the controller is operable to determine a registration of the first tracking domain and the second tracking domain based on at least the position of the first tracking device relative to the second tracking device within the volume;
wherein the registration is operable to allow tracking of the instrument with either of the first electromagnetic tracking system or the second electropotential tracking system in the volume.
2. The system of claim 1, further comprising:
an imaging system operable to acquire image data of the volume;
wherein the first electromagnetic tracking system is operable to generate an electromagnetic field defining a tracking volume domain of a determined uniformity within the volume allowing the first tracking domain to be registered to the image data.
3. The system of claim 2, wherein the first electromagnetic tracking system includes a localizer;
wherein either of the localizer or the first tracking device is operable to generate the electromagnetic field and the other of the localizer or the first tracking device is operable to sense the electromagnetic field;
wherein the electromagnetic field defines the first tracking domain.
4. The system of claim 2, wherein the imaging system is operable to acquire image data of the first tracking device and the second tracking device to determine the position of the first tracking device and the second tracking device relative to each other within the volume.
5. The system of claim 1, wherein the instrument having the first portion and the second portion includes:
an outer member as the first portion and an inner member as the second portion, wherein the outer member and the inner member are operable to move relative to one another;
wherein the outer member is connected to the first tracking device and the inner member is connected to the second tracking device.
6. The system of claim 1, wherein the first tracking device is configured to be fixedly associated with the volume;
wherein the second tracking device is moveable to a known position relative to the first tracking device;
wherein when the second tracking device is at the known position, the controller is operable to correlate the position of the first tracking device relative to the second tracking device and correlate the first tracking domain relative to the second tracking domain.
7. The system of claim 1, wherein the instrument is a catheter and lead assembly;
wherein the catheter includes a first tracking device and the lead includes the second tracking device.
8. The system of claim 7, wherein the first tracking device is one or more coils of wire and the second tracking device includes an electrode.
9. The system of claim 1, wherein the first tracking device and the second tracking device are formed as a single device operable to be tracked within both of the first electromagnetic tracking system and the second electropotential tracking system.
10. The system of claim 9, wherein the single device includes a first portion defining an electrode to sense a voltage and a second portion operable to sense or generate a field.
11. The system of claim 1, wherein the instrument includes both the first tracking device and the second tracking device operable to be positioned at a known position relative to one another in at least a first orientation of the instrument.
12. The system of claim 1, wherein the instrument includes a lead having an electrode;
wherein the electrode of the lead is the second tracking device.
13. A method of tracking an instrument using at least a first tracking system and a second tracking system, comprising:
providing the first tracking system to determine a position of a first tracking device, including:
generating an electromagnetic field to define a first navigation space in a volume;
determining a first position of the first tracking device with the generated electromagnetic field;
providing the second tracking system to determine a position of a second tracking device, including:
generating a current in the volume to define a second navigation space;
sensing a voltage with the second tracking device;
determining a second position of the second tracking device based upon the sensed voltage;
registering the first tracking system and the second tracking system based upon moving the second tracking device to a known position relative to the first tracking device; and
tracking the second tracking device with the second tracking system and illustrating on a display device a position of the second tracking device based upon the registration of the first tracking system and the second tracking system.
14. The method of claim 13, further comprising:
providing image data of the volume;
registering the image data to the first navigation space; and
illustrating on the display device the determined first position of the first tracking device.
15. The method of claim 14, wherein moving the second tracking device to a known position relative to the first tracking device includes:
positioning the first tracking device on a tube;
moving the second tracking device through the tube; and
measuring a signal change when the second tracking device moves past the first tracking device.
16. The method of claim 14, wherein registering the first tracking system and the second tracking system, includes:
determining a first identity position in the volume with the first tracking device in the first navigation space;
determining a second identity position in the volume with the second tracking device in the second navigation space;
relating a selected plurality of points in the second navigation space with a selected plurality of points in the first navigation space based upon the known first identity position and the second identity position;
calculating a position with the second tracking system of the second tracking device once the second tracking device is no longer at the identity position; and
displaying a tracked position of the second tracking device on the display device relative to the image data.
17. The method of claim 16, wherein determining the first identity position includes determining a plurality of first identity positions and determining a second identity position includes determining a plurality of second identity positions; and
registering the first tracking system and the second tracking system based on the determined plurality of first identity positions and plurality of second identity positions.
18. The method of claim 17, wherein determining a first identity position includes positioning the first tracking device at substantially the same location as the second tracking device.
19. The method of claim 16, wherein determining the first position of the first tracking device and the second position of the second tracking device includes:
acquiring image data of the first tracking device and the second tracking device; and
determining spatial positions of both the first tracking device and the second tracking device in the image data.
20. The method of claim 16, further comprising:
selecting a position to implant a lead; and
navigating a lead with the image data based upon the tracked position of the second tracking device alone.
21. The method of claim 16, further comprising:
illustrating a selected position of implantation for a lead on the image data; and
navigating a lead to the selected position while substantially only tracking the position of the second tracking device.
22. The method of claim 13, further comprising:
providing a catheter including the first tracking device; and
providing a lead including the second tracking device.
23. The method of claim 13, further comprising:
providing a lead including the second tracking device;
wherein the lead includes an electrode as the second tracking device.
24. A system for tracking an instrument including a tracking device operable to be tracked within a volume, comprising:
a tracking device including:
a tip electrode,
an annular ring electrode, and
a coil of conductive material wrapped relative to a surface of the annular ring; and
a first electromagnetic tracking system for determining a position of the coil of conductive material within a first tracking domain;
a second electropotential tracking system including:
a plurality of current electrodes operable to induce a current into the volume to form a second tracking domain;
wherein the tip electrode and the annular ring electrode are operable to sense a voltage based upon the induced current;
a communication system interconnecting the tip electrode, the annular ring and the coil of conductive material with both the first tracking system and the second tracking system;
an instrument having a first portion connected to the tip electrode and a second portion connected to the annular ring electrode and the coil of conductive material to allow tracking of the instrument with at least the second tracking system, wherein the second portion is operable to move relative to the first portion to place the annular ring electrode at a known position relative to the tip electrode;
a controller in communication with the first electromagnetic tracking system and the second electropotential tracking system;
wherein the controller is operable to determine a registration of the first tracking domain and the second tracking domain based on at least the position of the annular ring relative to the coil of conductive material within the volume;
wherein the registration is operable to allow tracking of the first portion with the second electropotential tracking system in the volume.
25. The system of claim 24, wherein the first electromagnetic tracking system includes:
a first localizer operable to generate an electromagnetic field or sense an electromagnetic field;
wherein the coil of conductive material is operable to generate an electromagnetic field or sense an electromagnetic field.
26. The system of claim 24, further comprising:
a display device;
wherein the first or second tracking system is in communication with the display device and operable to display a represented position of the annular ring, the coil of conductive material, or combinations thereof.
27. The system of claim 26, wherein image data is operable to be displayed on the display device relative to the representation of the annular ring and the coil of conductive material.
28. The system of claim 24, further comprising:
wherein the second portion defines a cannula and the annular ring and coil of conductive material are positioned around the cannula.
29. The system of claim 28, wherein the instrument includes at least one of a catheter, an implantable lead, or combinations thereof.
Description
FIELD

The present disclosure relates generally to a system for localizing a tracked instrument, and particularly to a localization system using two or more modalities for localizing the instrument within a volume.

BACKGROUND

This section provides background information related to the present disclosure which is not necessarily prior art.

A navigation system can be used to track and navigate an instrument within a volume. For example, a navigation system can be used to track an instrument during a procedure, such as a surgical procedure. Various systems can be used to track instruments including electromagnetic systems, optical systems, acoustic systems, and other appropriate systems.

Tracking an instrument can allow for determination of a position of the instrument relative to the patient without directly viewing the instrument within the patient. Various methods can be used to achieve this result, such as directly tracking a particular portion of the instrument exterior to the patient or tracking a distal point of the instrument within the patient.

Differing navigation systems can be used to track different instruments within a patient. For example, a long, substantially rigid instrument can be tracked with an optical navigation system that can track a proximal portion or end of the instrument that is external to the patient. Based on these determinations, a position of a distal tip or end of the instrument within the patient can be determined. Additionally, navigation systems can use fields, such as electromagnetic fields, to track and navigate a distal portion of an instrument that is within a patient.

SUMMARY

This section provides a general summary of the disclosure, and is not a comprehensive disclosure of its full scope or all of its features.

A navigation system or combination of navigation systems can be used to provide two or more types or modalities of navigation to navigate a single instrument. The single instrument can be positioned within the patient and tracked. For example, both an Electromagnetic (EM) and an Electropotential (EP) tracking system can be used to navigate an instrument within a patient.

A navigation system can generally include a localizer and a tracking sensor. One skilled in the art will understand that the localizer can either transmit or receive a signal and the tracking sensor can also transmit or receive a signal to allow for a determination of a location of the tracking sensor associated with the surgical instrument. A surgical instrument can have associated therewith two or more tracking sensors that can be used in two or more modalities of navigation. For example, a surgical instrument may include an electrode that can be used with an EP tracking system and can also be associated or moved relative to a tracking sensor that includes an EM coil to be used with an EM tracking system.

An instrument can include one or more tracking sensors to be used with two or more navigation systems during a single procedure. In addition, a method can be used to register the two navigation systems during a single procedure. The registration of the two navigation systems can allow all or a selected number of points within one navigational domain to be coordinated or correlated to all or selected points in a second navigational domain. For example, a surgical instrument can include a single tracking sensor that can be tracked within two navigation modalities. Also, a surgical instrument with a single tracking sensor can be moved relative to a second tracking sensor, where each of the tracking sensors is tracked in a different navigation modality. According to various embodiments, when a first tracking sensor is positioned at a known location relative to a second tracking sensor, a navigation volume or domain of the first navigation system can be registered to a navigation volume or domain of a second navigation system. In this way, a first and second navigation system can be registered for navigating a tracking sensor or a surgical instrument within the two navigation modalities.

Further areas of applicability will become apparent from the description provided herein. The description and specific examples in this summary are intended for purposes of illustration only and are not intended to limit the scope of the present disclosure.

DRAWINGS

The drawings described herein are for illustrative purposes only of selected embodiments and not all possible implementations, and are not intended to limit the scope of the present disclosure.

FIG. 1 is an environmental view of a navigation system;

FIG. 2A is a detailed cross-section view of an instrument, according to various embodiments;

FIG. 2B is a detailed cross-section and environmental view of an instrument, according to various embodiments;

FIG. 3 is a detailed cross-section view of an instrument, according to various embodiments;

FIG. 4 is an environmental view of a navigation system, according to various embodiments;

FIG. 5 is a flow chart of a method of registering two navigation systems;

FIG. 6 is a view of image data and icons displayed relative to the image data, according to various embodiments;

FIG. 7A-7C are detailed flowcharts of registration of two tracking systems; and

FIG. 8 is a detailed view of navigating a registered instrument.

Corresponding reference numerals indicate corresponding parts throughout the several views of the drawings.

DETAILED DESCRIPTION

Example embodiments will now be described more fully with reference to the accompanying drawings.

A surgical navigation system 20 is illustrated in FIG. 1. A first tracking system can include an electropotential (EP) tracking system 22. A second tracking system can include an electromagnetic (EM) tracking system 24. Appropriate tracking systems can include those disclosed in U.S. patent application Ser. No. 12/117,537, filed on May 8, 2008 and U.S. Patent Publication No. 2004/0097805, published on May 20, 2004, both incorporated herein by reference. The first and second tracking systems 22, 24 can be used to track a surgical instrument 26. The surgical instrument 26 can be any appropriate instrument, including a lead used as a part of an implantable medical device (IMD) for heart rhythm treatment, neurological treatment, or other appropriate purposes.

The surgical navigation system 20 can also include other components, such as an imaging system 30. The imaging system 30 can be any appropriate imaging system and is illustrated, by way of example, as a fluoroscopic C-arm system 32. Other imaging systems can include computed tomography (CT) imaging systems, magnetic resonance imaging (MRI) systems, and positron emission tomography (PET) imaging systems. The imaging system 30 can be used by a surgeon 34 to image a patient 36 prior to (preoperatively), during (intraoperatively), or after (postoperatively) a procedure. Imaging the patient 36 can create image data that can be viewed on a display device 38 or a display device 40. The display devices 38, 40 can be provided alone, such as on a stand 42, or with a processing system as a part of a workstation or processing system 44. The image data can be transferred from the imaging system 30 through a data transmission system 46, such as a wired or wireless transmission system, to the display devices 38, 40.

The navigation system 20, including the tracking systems 22, 24, can be incorporated into or connected to the processor system 44. The processor system 44 can include human input devices such as a keyboard 48, a joystick or mouse 50, a foot pedal 52, or any other appropriate human input device. Each of the human input devices 48-52 can be connected with the processor system 44 or other systems, such as the imaging system 30, for control or actuation thereof.

The EP tracking system 22 can include components to generate a current in the patient 36. The EP tracking system 22 can include or be based on the Localisa™ intracardiac tracking system sold by Medtronic, Inc., having a place of business in Minneapolis, Minn. The EP tracking system 22 can also include portions disclosed in U.S. Pat. Nos. 5,697,377 or 5,983,126 to Wittkampf, incorporated herein by reference.

Briefly, the EP tracking system 22 can include pairs of axis electrodes, which can also be referred to as a localizer, operable to generate a current within a volume, such as the patient 36. The axis electrodes can include three pairs of axis electrodes to generate three substantially orthogonal axes of current within the patient 36 (also see FIG. 4). The axis electrodes can include a first pair 60 a, 60 b, a second pair 62 a, 62 b, and a third pair 64 a, 64 b. Each axis can be defined by an alternating current that is generated between a pair of the axis electrodes. For example, the first pair of axis electrodes 60 a and 60 b can be positioned on a left and right side of the patient 36 to define an X-axis when a current is generated between the two axis electrodes 60 a and 60 b.

The substantially orthogonal axes of current can be used to determine or calculate a location of a tracking device 70. The tracking device 70 can include a first or EP tracking device 70 a and a second or EM tracking device 70 b. The EP tracking system 22 can be used to track the EP tracking device 70 a. The first tracking device 70 a can sense a voltage in the patient 36 based upon the induced current between any pair of the axis electrodes 60 a-64 b. The voltage can be related to a position of the first tracking device 70 a in the patient 36.
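
The relationship between sensed voltage and position can be illustrated with a minimal sketch, which is not part of the patented disclosure. It assumes, purely for illustration, a linear voltage gradient along each current axis; the gradient and reference values below are hypothetical calibration numbers, and real tissue is not uniformly conductive, as discussed later.

import numpy as np

# Hypothetical calibration: volts-per-millimeter gradient and reference
# voltage for each of the three substantially orthogonal current axes
# (X: electrodes 60a/60b, Y: 62a/62b, Z: 64a/64b). Real gradients are not
# uniform in tissue; these values are purely illustrative.
GRADIENT_V_PER_MM = np.array([0.012, 0.011, 0.013])
REFERENCE_V = np.array([0.85, 0.79, 0.91])

def ep_position_from_voltages(sensed_voltages):
    """Estimate an EP tracking device position (mm) from the voltages sensed
    while each axis-electrode pair is driven in turn."""
    sensed_voltages = np.asarray(sensed_voltages, dtype=float)
    return (sensed_voltages - REFERENCE_V) / GRADIENT_V_PER_MM

# Example: voltages sensed at the electrode for the X, Y, and Z axes.
print(ep_position_from_voltages([0.97, 0.90, 0.95]))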

The pairs of axis electrodes 60 a-64 b can be driven with a generator in a controller 72 that is connected via wires or wirelessly with the axis electrodes 60 a-64 b. The generator can provide the power to generate the alternating current in the patient 36 between the respective axis electrodes 60 a-64 b. The controller 72 can also include a connection for the instrument 26 to communicate a signal from the tracking device 70 to the controller 72. The connection with the instrument 26 can be wired or wireless, according to various embodiments. In addition, the controller 72 can include a processor portion or simply be a transmitter to transmit signals from the tracking device 70. Signals can be transmitted from the controller 72 to the processor system 44 with a transmission system 74. The transmission system 74 can be a wired or wireless transmission system.

The EM tracking system 24 can also be associated with the controller 72 or can be provided with a separate controller system. It will be understood that various separate circuitry portions may be provided in the controller 72 to generate or operate the EP tracking system 22 or the EM tracking system 24.

The EM tracking system 24 includes an EM localizer 76 that can be positioned relative to the patient 36. The EM tracking system can include the AxiEM™ electromagnetic tracking system sold by Medtronic Navigation, Inc. having a place of business in Colorado, USA. The localizer 76 can generate an electromagnetic field that is sensed by the EM tracking device 70 b. Alternatively, the EM tracking device 70 b can generate a field that is sensed by the localizer 76.

A localizer can be used as a part of a tracking system to determine the location of the tracking device 70. For example, the localizer 76 can be interconnected with the controller 72 to transmit a signal to the processor system 44 regarding the position of the EM tracking device 70 b. The axis electrodes 60 a-64 b can be a localizer that induces axes of current in the patient 36 to localize the EP tracking device 70 a. Accordingly, the localizer can refer to a portion of the tracking system which can be exterior to the volume, such as the patient 36, that is used to determine a position of the tracking device 70.

According to various embodiments, the localizer devices, including the EM localizer 76 and the axis electrodes 60 a-64 b, can be used to define a navigation domain in a patient space of the patient 36. Patient space can be the physical space that is being operated on during the operative procedure. The patient space can also include the navigated space through which the surgical instrument 26 is being navigated. Image space can be defined by image data 80 that is displayed on the display devices 38, 40. Image data 80 can include any appropriate image data, such as image data of a heart 84 (FIG. 4) of the patient 36. The image data 80 displayed on the display devices 38, 40 can also include atlas data. Atlas data can include statistical or historical data. The atlas data can be registered or morphed to the patient image data or patient space. It will be understood that atlas data may be used in an imageless navigation system. For example, an imageless navigation system may not require the acquisition of image data of the patient 36.

The patient space can be registered to the image space of the image data 80 according to any appropriate technique, including those discussed herein. Generally, however, the patient space is registered to the image data 80 to allow for displaying or superimposing an icon or representation of a tracked device, for example the surgical instrument 26, over the image data 80 on the display devices 38, 40. Registration generally allows for a transformation of the image data to the patient space. Various registration techniques can include contour matching, fiducial or point matching, automatic registration, or any other appropriate registration. For example, various landmarks or fiducials can be identified in the image data 80 and the same fiducials or landmarks can be identified in the patient 36, such as within the heart 84. The image data 80 can then be transformed to the patient space of the patient 36 so that a proper location of a superimposed icon 26 i can be shown relative to the image data 80 of the heart 84. Registration techniques can include those discussed in the U.S. patent applications incorporated above. In addition, as discussed herein, the EP tracking system 22 can be registered to the EM tracking system 24. The registration of the EP tracking system 22 to the EM tracking system 24 can allow navigation of the EP tracking device 70 a with the image data 80.
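
As an illustrative aside, not part of the patented disclosure, fiducial or point matching can be expressed as a least-squares rigid transform between corresponding points. The sketch below uses the standard SVD-based solution; the fiducial coordinates are hypothetical.

import numpy as np

def rigid_registration(patient_pts, image_pts):
    """Least-squares rigid transform (rotation R, translation t) mapping
    patient-space fiducial points onto their image-space counterparts."""
    P = np.asarray(patient_pts, dtype=float)
    Q = np.asarray(image_pts, dtype=float)
    p_c, q_c = P.mean(axis=0), Q.mean(axis=0)
    H = (P - p_c).T @ (Q - q_c)                      # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                               # proper rotation, no reflection
    t = q_c - R @ p_c
    return R, t

# Hypothetical fiducial locations touched with a tracked device (patient
# space) and identified in the image data 80 (image space), in millimeters.
patient = [[0, 0, 0], [40, 0, 0], [0, 55, 0], [0, 0, 30]]
image = [[10, 12, 5], [50, 12, 5], [10, 67, 5], [10, 12, 35]]
R, t = rigid_registration(patient, image)
print(R @ np.array([20.0, 10.0, 15.0]) + t)          # a tracked point in image space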

Turning to FIGS. 2A and 2B, the tracking device 70 can include the two tracking devices 70 a and 70 b. The first tracking device 70 a can be a single electrode or a tip electrode 90 or ring electrode (not illustrated) of a lead assembly 92. The lead assembly 92 can be a lead for any appropriate device, such as a pacing or defibrillator system. The lead assembly 92 can be positioned or delivered within a sheath 94 according to generally known lead assemblies, such as the Attain family of catheters sold by Medtronic, Inc., having a place of business in Minneapolis, Minn.

The lead assembly 92 can be positioned within the patient 36, such as relative to the heart 84, with a catheter assembly 100. The catheter assembly 100 can be any appropriate configuration. The catheter 100 can include a body molded to substantially define a cannula. The catheter assembly 100 can include the second tracking device 70 b. The second tracking device 70 b can include a first coil 102 and a second coil 104, or any appropriate number of coils, as part of the EM tracking device 70 b. The coils can be wound in any appropriate configuration, such as around axes that are substantially orthogonal to one another. The second tracking device 70 b can sense an electromagnetic field generated with the localizer 76 or generate an electromagnetic field that is sensed by the localizer 76.

The two tracking devices 70 a, 70 b can be used with the respective tracking systems 22, 24. The first tracking device 70 a can sense a voltage or determine bioimpedance (such as an impedance of a tissue of the patient 36) due to the current induced by the axis electrodes 60 a-64 b. The induced current generates a voltage that can be sensed with the EP tracking device 70 a. The voltage sensed by the EP tracking device 70 a can be transmitted to the controller 72 with an appropriate communication line, such as a conductor 106. The conductor 106 can be conductively coupled to the EP tracking device 70 a. It will be understood that although the EP tracking device 70 a is illustrated as the tip electrode 90 of the lead assembly 92, the EP tracking device 70 a can also include an alternative EP tracking device 70 a′ formed as a part of the sheath 94. Regardless of the position of the EP tracking device 70 a, increasing its contact with a conductive medium or electrolyte of the patient 36 (e.g., by removal of a portion of the insulation around the electrode) improves the efficiency of detecting an appropriate voltage. The voltage sensed by the EP tracking device 70 a can be used to determine the position of the EP tracking device 70 a as discussed further herein and also described in the above incorporated U.S. patent applications and patents.

The second tracking device 70 b, according to various embodiments, can sense an electromagnetic field generated by the localizer 76. For example, a current can be induced in one or more of the coils 102, 104 that is dependent upon the position of the coils 102, 104 in a portion of the electromagnetic field. The generated current can be sent as a signal along a transmission line 108 to the controller 72.

As discussed further herein, and illustrated in FIG. 2B, the lead assembly 92 can be moved relative to tissue of the heart 84 to position the distal tip electrode 90 into the heart 84. When positioning the distal tip electrode 90 into the heart 84, the sheath 94 and the tip 90, which can include the first tracking device 70 a, can move relative to the catheter assembly 100. Moving the first tracking device 70 a relative to the catheter assembly 100 moves the first tracking device 70 a relative to the second tracking device 70 b. As discussed herein, this can be used to determine the location of the first tracking device 70 a relative to the second tracking device 70 b for registration of the EP tracking system 22 and the EM tracking system 24. This determination can be used to track the first tracking device 70 a relative to the patient 36 and with the registered image data 80.

In addition, the tracking devices 70 a and 70 b could be the same coil of wire or conductive material provided with different insulation characteristics. For example, the loops or turns of the first tracking device 70 a can be electrically separated from the loops or turns of wire for the second tracking device 70 b. Both sets of loops can be formed from wire of the same length, positioned over top of one another. The conductive loops of the first tracking device 70 a can be external and exposed to the patient to sense or measure the voltage in the patient. The second portion of the loops can be isolated from the patient and insulated, but they can, along with the first portion, sense the field of the EM tracking system 24.

Turning to FIG. 3, an instrument 120, according to various embodiments, is illustrated. The instrument 120 can include a lead assembly 121 substantially similar to the lead assembly 92 discussed above, including a tip electrode 90 and a sheath 94. The instrument 120 can also include a catheter assembly 122. The lead assembly 121, including the distal tip 90 and the sheath 94, can be moved relative to the catheter assembly 122.

The catheter assembly 122 can include the tracking device 70′ as a single unit or device including an EP tracking device 70 a′ and one or more windings of an EM tracking device 70 b′. The EM tracking device 70 b′ can be positioned substantially over or around the EP tracking device 70 a′. The EP tracking device 70 a′ can include an annular ring that is molded into or formed with the catheter assembly 122. The EP tracking device 70 a′ can be used with the EP tracking system 22 similar to the distal tip electrode 90 of the lead assembly 92. The EM tracking device 70 b′ can be used with the EM tracking system 24 similar to the windings 102, 104 of the EM tracking device 70 b. Nevertheless, the EP tracking device 70 a′ and the EM tracking device 70 b′ can be positioned substantially atop one another. This allows for the tracked position of the EP tracking device 70 a′ and the tracked position of the EM tracking device 70 b′ to be substantially coincident throughout a tracked procedure. A signal from either of the EP tracking device 70 a′ or the EM tracking device 70 b′ can be transmitted along or with a communication system 124, which can include a wired or wireless transmission system.

Again, it will be understood that the tracking device 70′ can be tracked with the two tracking systems 22, 24. As discussed above, the electrode of the EP tracking device 70 a′ can sense a voltage within the patient 36. The EM tracking device 70 b′ can sense a magnetic field or electromagnetic field or transmit a magnetic field or electromagnetic field. Accordingly, the single tracking device 70′ can be used with two or more tracking systems 22, 24 to determine a location of the tracking device 70′ and the catheter and lead assembly 120. It will be further understood that the tip electrode 90 of the lead assembly 121 can also be used as the EP tracking device with the EP tracking system 22.

With reference to FIG. 4, a tracking device 70″ can include an EP tracking device 70 a″ and an EM tracking device 70 b″. The EP tracking device 70 a″ can be positioned on a first instrument portion 26 a and the EM tracking device 70 b″ can be positioned on a second instrument portion 26 b. The two instrument portions 26 a, 26 b can be positioned within the patient 36. Alternatively, one of the two instrument portions 26 a, 26 b can be positioned relative to the patient 36 in any appropriate manner. For example, the second instrument portion 26 b including the EM tracking device 70 b″ can be positioned on an exterior surface of the patient 36 or be implanted as a fiducial or dynamic reference frame in the patient 36, such as fixed relative to the heart 84.

The two tracking devices 70 a″ and 70 b″ can be moved relative to one another during an operative procedure. For example, if both of the tracking devices 70 a″ and 70 b″ are positioned on two separate and moveable instruments 26 a, 26 b they can be moved to a known position relative to one another within the patient 36 during an operative procedure. Alternatively, if the second instrument 26 b is positioned at a fixed location relative to the patient 36, the first instrument portion 26 a can be moved to a known position relative to the second instrument portion 26 b during an operative procedure. For example, fluoroscopic or ultrasound imaging, such as with the imaging system 30, can be used to confirm or determine the known position of the first surgical instrument 26 a and the second instrument 26 b. Accordingly, during a second procedure, a position of the EP tracking device 70 a″ and the EM tracking device 70 b″ can be determined.

A location of the EP tracking device 70 a″ can be determined with the EP tracking system 22. The EM tracking system 24 can be used to determine the location of the EM tracking device 70 b″. As discussed further herein, the determined location of the two tracking devices 70 a″, 70 b″ can be used to register the EP tracking system 22 and the EM tracking system 24. The tracked position of the two instruments 26 a, 26 b can be used for illustration of an icon representing one or both of the instruments 26 a, 26 b on the display devices 38, 40 relative to the image data 80.

Turning to FIG. 5, a flow chart of a navigation method 130 for registering or coordinating a dual tracking system is illustrated. The navigation method 130 is illustrated briefly in FIG. 5 and further detailed in FIGS. 7A-7C and 8. The method of using a navigation system with two tracking systems will be discussed herein in an exemplary embodiment. It will be understood, however, that a navigation system including two or more tracking systems can be used according to various embodiments, including those discussed above. The lead assembly 92, however, is discussed as an exemplary embodiment.

The navigation method 130, as discussed in detail herein, allows for registration of the EP tracking system 22 to the EM tracking system 24 and further to the image data 80. The EM tracking system 24 can be registered to the image data 80, as discussed herein, including registering the navigation domain of the EM tracking system 24 with the image space. The EP tracking system 22, including the navigation domain of the EP tracking system 22, can be registered to the EM tracking system 24, including the EM navigation domain, according to various embodiments, such as using the devices discussed above. The registration of the EP tracking system 22 to the EM tracking system 24 can allow the EP tracking system 22 to be registered to the image data 80.

The navigation method 130 can include starting in start block 132. The image data 80 can then be acquired in block 134. In addition, with reference to FIG. 6, the image data 80 can be displayed on the display device 40. As discussed above, an icon 92 i can be superimposed on the image data 80 to represent a location of an appropriate instrument, such as the surgical instrument 26. The image data 80 can include three-dimensional or two-dimensional image data that is acquired for representation or illustration of a portion of the patient 36. It will be further understood that the image data 80 acquired in block 134 can be image data that is acquired preoperatively, intraoperatively, or at any appropriate time. It may also include a combination of preoperative and intraoperative image data. For example, preoperative image data can be merged or registered with intraoperative image data according to any appropriate technique. For instance, 2D to 3D image registration can occur as described in U.S. patent application Ser. No. 10/644,680, filed Aug. 20, 2003, incorporated herein by reference.

The acquired image data can be stored or transferred to the processor system 44, which is a part of the navigation system 20, for use in illustrating a tracked location of the surgical instrument 26 relative to the patient 36. To assist in illustrating the correct location of the surgical instrument 26 relative to the patient 36, the patient space, generally defined by the tracking systems 22, 24, can be registered to the image data 80 or image space in block 136. The registration of the image data 80 to the patient space can be performed with any appropriate method, as discussed above.

The registration of the image data 80 to the patient space can be performed with the EM tracking system 24. The EM tracking system 24, including the localizer 76, can generate a field and a navigation space that can be defined in substantially known Euclidean coordinates. The known navigation space can be efficiently and directly registered to the Euclidean coordinates of the image data 80. The known field of the EM localizer 76 allows a detected change in the field generated with the EM localizer 76 to be directly related to a distinct position or movement in the field at substantially all points in the field. In other words, a detected change or movement of the EM tracking device 70 b generally equals a selected known movement of the EM tracking device 70 b within the field regardless of the position of the EM tracking device 70 b within the field generated by the EM localizer 76. Also, every point in the EM navigation domain is known due to the uniform electromagnetic field. Accordingly, a coordinate system identified or defined by the EM tracking system 24 can be substantially known and efficiently applied to the coordinate system of the image data 80.

The registration of the image data 80 to the patient space identified with the EM tracking system 24 can be performed in any appropriate manner. As discussed above, point, contour, or any other appropriate registration processes can be used. For example, the EM tracking device 70 b can be positioned relative to known fiducials or landmarks within the patient 36 and similar or correlated landmarks or fiducials can be identified in the image data 80. The processor system 44, or any appropriate processor system, can then be used to register or correlate the points in the image data 80 to the points of the patient space. Once the registration has occurred, the image data 80 is registered to the patient space identified or within the navigation space defined by the EM tracking system 24.

The EM tracking system 24 can be registered to the EP tracking system 22 in block 138. The registration or coordination between the EM tracking system 24 and the EP tracking system 22 can occur at any appropriate time, such as before or after the EM tracking system 24 is registered to the image data in block 136. The EP tracking system 22 can be registered to the EM tracking system 24 in block 138 in any appropriate manner. As discussed further herein, exemplary registration methods 138 a, 138 b, and 138 c are illustrated and described in greater detail in relation to FIGS. 7A-7C. Once the EP tracking system 22 is registered with the EM tracking system 24, navigation of the instrument 26 with only the EP tracking device 70 a can occur in block 140. A position of the instrument 26, including the EP tracking device 70 a, can be navigated relative to the image data 80 due to the registration of the EP tracking system 22 and the EM tracking system 24 in block 138. Accordingly, navigation using only the EP tracking system 22 can occur in block 140.

With continuing reference to FIGS. 5 and 6 and additional reference to FIG. 7A, registration of the EM tracking system and the EP tracking system, according to various embodiments, is illustrated in block 138 a. As discussed above, the lead assembly 92 can include the EP tracking device 70 a that can be defined by the tip electrode 90 of the lead 92. The catheter 100 can include one or more coils 102, 104 of the EM tracking device 70 b. As illustrated in FIG. 6, the EM tracking device 70 b can be used to register the image data 80 to the patient space of the patient 36.

Once the registration has occurred in block 136, the EP tracking system 22 can be registered with the EM tracking system 24 in block 138 a, illustrated in FIG. 7A. A lead or instrument including an EP electrode can be provided in block 152. The EP electrode can be a distal tip electrode of the lead or can be provided in any other portion, such as in the sheath 94. For example, as illustrated in FIG. 2A, the alternative EP tracking device 70 a′ can be provided in the sheath 94. Regardless of its position, the electrode can be used as the EP tracking device 70 a and can be positioned relative to the EM tracking device 70 b that is positioned within the catheter 100. In addition, as illustrated in FIG. 2B, the lead including the EP tracking device 70 a can be moved relative to the catheter 100 in block 154.

When moving the lead relative to the catheter 100, it can be determined when the EP tracking device 70 a moves past or is near the coils 102, 104 of the EM tracking device 70 b in block 156. Various mechanisms can be used to determine when the EP tracking device 70 a moves past the EM tracking device 70 b. For example, a change in impedance, a change in measured voltage, or other criteria can be used to determine when the EP electrode is next to or immediately past the EM tracking device 70 b.
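
Purely as an illustrative aside, not drawn from the patent itself, such a pass-by event can be detected by watching for a peak in the monitored signal. The sketch below assumes a simple threshold-on-peak criterion and hypothetical sample values; a real system would apply its own detection criterion.

def index_of_passby(samples, threshold):
    """Return the sample index at which the monitored signal (e.g. measured
    voltage or impedance) peaks above the threshold, taken here as the
    moment the EP electrode passes the EM coils."""
    best_index, best_value = None, threshold
    for i, value in enumerate(samples):
        if value > best_value:
            best_index, best_value = i, value
    return best_index

# Hypothetical signal recorded while the lead is advanced through the catheter.
signal = [0.02, 0.03, 0.05, 0.21, 0.64, 0.58, 0.19, 0.04]
print(index_of_passby(signal, threshold=0.5))   # peak at index 4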

When the determination is made that the EP tracking device 70 a has been positioned relative to the EM tracking device 70 b, such as substantially in the same position, a registration of the EM tracking system 24 and the EP tracking system 22 can occur in block 158. The registration can occur by substantially coordinating or registering the EP tracking system 22 and the EM tracking system 24. In other words, because the EP tracking system 22 can be used to determine the position of the EP tracking device 70 a and the EM tracking system 24 can be used to determine the position of the EM tracking device 70 b, these two positions or points in patient space can be identified as the same. Accordingly, the navigation space of the EP tracking system 22 can be overlaid on or registered with the navigation space of the EM tracking system 24.

The coordination or registration between the EP tracking system 22 and the EM tracking system 24 can be performed by acquiring a selected number of points that are identical or at known locations relative to one another, as discussed above, with both of the tracking systems. For example, at least three corresponding points may be acquired, though more points may be used to actually model or characterize the non-orthogonal navigation space defined by the EP tracking system 22. Less information may be necessary in a local or small region than would be needed for a larger space, such as an entire navigation space. Once points have been acquired with both of the tracking systems, a curvature model, such as a spline model, can be used to model the coordinate system or navigation space of the EP tracking system 22. Other appropriate modeling calculations could also be used to computationally coordinate the EP tracking system 22 and the EM tracking system 24.
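
As an illustrative, simplified stand-in (not the spline model the description contemplates, and not part of the patented disclosure), the sketch below fits an affine map by least squares from EP-space points to corresponding EM-space points. Four non-coplanar hypothetical point pairs are used so the affine map is fully determined.

import numpy as np

def fit_affine(ep_pts, em_pts):
    """Least-squares affine map (A, b) such that A @ ep + b approximates em
    for the corresponding points acquired with both tracking systems."""
    EP = np.asarray(ep_pts, dtype=float)
    EM = np.asarray(em_pts, dtype=float)
    X = np.hstack([EP, np.ones((len(EP), 1))])       # homogeneous coordinates
    M, *_ = np.linalg.lstsq(X, EM, rcond=None)       # solves X @ M ~= EM
    return M[:3].T, M[3]                             # A is 3x3, b is length 3

# Hypothetical identity positions: the same physical locations as reported by
# the EP tracking system 22 and by the EM tracking system 24.
ep = [[0.1, 0.2, 0.0], [1.3, 0.1, 0.2], [0.2, 1.4, 0.1], [0.3, 0.2, 1.2]]
em = [[5.0, 7.0, 2.0], [17.8, 6.4, 4.1], [6.2, 20.9, 3.0], [7.1, 7.3, 14.6]]
A, b = fit_affine(ep, em)
print(A @ np.array([0.5, 0.5, 0.5]) + b)             # an EP reading mapped into EM space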

Once the EM tracking system 24 and the EP tracking system 22 have been registered, movement of the EP tracking device 70 a within the patient space of the patient 36 can be illustrated superimposed on the image data 80. As illustrated in FIG. 6, an icon 70 ai for the first tracking device 70 a and an icon 70 bi for the second tracking device 70 b can be illustrated and superimposed on the image data 80. Once registration has occurred, however, the EP tracking device icon 70 ai, illustrating the position of the EP tracking device 70 a, can be illustrated separately from the EM tracking device icon 70 bi, representing the position of the EM tracking device 70 b, but correctly related to the image data 80. It will be understood that an icon 92 i can represent generally the surgical instrument 26, or any portion thereof, and not only the tracking devices. The position of the surgical instrument, however, can be identified or determined based upon the tracked position of the tracking device 70.

Registration of the EP tracking system 22 with the second navigation space, such as that of the EM tracking system 24, can allow for image navigation of the instrument 26 tracked with only the EP tracking system 22. The navigation space of the EP tracking system 22 may not be substantially uniform or strictly aligned with the image data 80. For example, the tissue of the patient 36 may not have a substantially uniform impedance: the impedance of muscle tissue may be substantially different from the impedance of blood or another electrolyte. Accordingly, a particular change in voltage may not always be related to a single physical movement amount of the EP tracking device 70 a. Movement of the EP tracking device 70 a within the patient 36, however, can be measured using the EP tracking system 22 once it is registered with a tracking system, such as the EM tracking system 24, which can be registered to the image data 80. A registered position of the EP tracking device 70 a can be superimposed on the image data 80. Therefore, a position of the EP tracking device 70 a can be superimposed on the image data 80 even if a non-uniform navigation space is generated with the EP tracking system 22.

Turning to FIG. 7B, registering the EP tracking system 22 and the EM tracking system 24 can be achieved with registration method 138 b. According to the registration method 138 b, a catheter can be provided with an EP electrode as the EP tracking device 70 a in block 170. A lead assembly can be provided with the EM tracking device 70 b in block 172. The lead can then be moved relative to the catheter in block 174. A determination can be made when the EM tracking device 70 b is aligned with or at a selected and known position relative to the EP tracking device 70 a in block 176. A registration of the EM tracking system 24 and the EP tracking system 22 can then occur in block 178. The registration method 138 b can be substantially similar to the registration method 138 a (illustrated in FIG. 7A) save that the EP electrode is positioned in the catheter 100 and the EM tracking device 70 b is positioned on the lead. Therefore, registration can occur in substantially the same way, the EP tracking device 70 a can be tracked, and its position can be superimposed relative to the image data 80.

Turning to FIG. 7C, a registration method 138 c is illustrated. The registration method 138 c can include positioning the EM tracking device 70 b at a known location in the patient 36 or other navigable space in block 184. The EM tracking device 70 b can be carried by any appropriate device, for example the second tracked instrument 26 b illustrated in FIG. 4. The second tracked device 26 b can be a second instrument moved relative to the patient 36, a dynamic reference frame (DRF) fixed relative to the patient 36, or any appropriate device including the EM tracking device 70 b. For example, the DRF 26 b′ can be positioned relative to the patient 36 at a fixed and known location. The known location of the DRF 26 b′ can be determined in any appropriate manner. For example, a registration probe (not illustrated) can be moved relative to the DRF 26 b′ to determine the location of the DRF 26 b′. In addition, the DRF 26 b′ can be positioned as, or include, a fiducial that is identified in the image data 80 to allow for identification and registration to the image data 80. Alternatively, if the second instrument 26 b is a moveable instrument, it can be moved to a landmark that can also be identified within the image data 80.

When the second tracked device 26 b, 26 b′ is identified relative to the image data 80 and the EM tracking system 24 is registered to the image data 80, the first tracked instrument 26 a including the EP tracking device 70 a can be moved relative to the second tracked device 26 b, 26 b′. For example, the first instrument 26 a, illustrated in FIG. 4, can move to the location of the DRF 26 b′ in block 186. Once the first tracked instrument 26 a is at the same position as the DRF 26 b′, registration of the EM tracking system 24 and the EP tracking system 22 can occur in block 188. As discussed above, the location of the two tracking devices 70 a, 70 b can be determined to be substantially identical when they are positioned next to each other to allow for registration of the two tracking systems 22, 24.

It will be further understood that when two tracked instruments 26 a, 26 b are provided, they can be positioned at a known position and orientation relative to one another to allow for registration to occur in block 188. For example, the first tracked instrument 26 a can be positioned at a known position and orientation relative to the DRF 26 b′ and registration can occur. In other words, knowing a position and orientation of the DRF 26 b′ and a position and orientation of the EP tracking device 70 a can allow for registration of the two tracking systems 22, 24 even if the two tracking devices 70 a, 70 b are not in substantially identical locations. As discussed above, imaging systems can be used to determine or identify the known locations of the two tracking devices 70 a, 70 b.

Registration of the EP tracking system 22 and the EM tracking system 24 can also occur by providing the EP tracking device 70 a and the EM tracking device 70 b at substantially the same position on the tracked instrument 26, as illustrated with the instrument 120 in FIG. 3. When the tracking device 70 has substantially only one location for both the EP tracking system 22 and the EM tracking system 24, a separate determination of registration is not otherwise required, such as by positioning the EP tracking device 70 a relative to the EM tracking device 70 b. Rather, the tracked position of the EM tracking device 70 b with the EM tracking system 24 can be used to correlate the position of the EP tracking device 70 a, since all positions determined with the EM tracking device 70 b are inherently registered with the EP tracking device 70 a. Therefore, the coordinate system of the EM tracking system 24 can be used to illustrate a position of the EP tracking device 70 a on the image data 80 at all times. This can allow or be used to acquire more than one point that is at the same position with both of the tracking devices 70 a and 70 b. This can assist in registration of the EP tracking system 22 and the EM tracking system 24. It will be understood, however, that the two tracking devices 70 a and 70 b need not be the same device to acquire more than one point that is at the same position with both of the tracking devices 70 a and 70 b.

Even when the two tracking devices 70 a, 70 b are the same device or formed to be at the same or fixed relative positions, a third tracking device can be provided. For example, the tip electrode 90 can also be interconnected with the controller 72. Thus, the position of the tip electrode 90 can be tracked once it has exited the catheter 122.

In addition, or alternatively, it will be understood that the EP tracking device 70 a and the EM tracking device 70 b need not be positioned on top of one another, but can be positioned at a substantially known, fixed location relative to one another or next to each other within a selected assembly. For example, an electrode of the EP tracking device 70 a can be positioned substantially abutting coils of wire defining the EM tracking device 70 b. They can also be positioned a distance from one another at a substantially known relative location, at least when the device is in a known configuration. The known relationship or relative positions of the EP tracking device 70 a and the EM tracking device 70 b can be used to register the EP tracking system 22 and the EM tracking system 24 even if the EP tracking device 70 a and the EM tracking device 70 b are not at the same location.
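
A minimal illustrative sketch of using such a known offset, not drawn from the patent, is shown below: the EM-tracked pose of the EM tracking device predicts where the EP tracking device sits. It assumes the EM system reports a position and an orientation matrix, and the 8 mm offset along the instrument axis is a hypothetical value.

import numpy as np

def ep_device_position(em_position, em_rotation, offset_in_instrument):
    """Predict where the EP tracking device sits from the EM-tracked pose of
    the EM tracking device plus a known, fixed offset expressed in the
    instrument's own coordinate frame."""
    return (np.asarray(em_position, dtype=float)
            + np.asarray(em_rotation, dtype=float) @ np.asarray(offset_in_instrument, dtype=float))

# Hypothetical values: EM-reported position (mm) and orientation matrix, with
# the electrode assumed to sit 8 mm distal along the instrument axis.
em_pos = [12.0, 30.5, -4.0]
em_rot = np.eye(3)                 # instrument aligned with the tracker axes
offset = [0.0, 0.0, 8.0]
print(ep_device_position(em_pos, em_rot, offset))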

Turning to FIG. 8, navigating the EP tracking device 70 a in block 140 is described in further detail. Movement of the EP tracking device 70 a can be determined in block 200. The movements of the EP tracking device 70 a can then be registered to the coordinates of the EM tracking system 24 in block 202. As discussed above, registration of the EP tracking system 22 and the EM tracking system 24 allows for a registration of a coordinate in the EM tracking system 24 with a determined position of the EP tracking device 70 a in the EP tracking system 22.
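
Purely as an illustration, not part of the patented disclosure, the two registrations can be chained so that a reading from the EP tracking system alone lands in image coordinates. The transforms below (a diagonal EP-to-EM map and an identity EM-to-image rotation) are hypothetical placeholders for the fits produced earlier.

import numpy as np

# Hypothetical transforms produced earlier: an EP-to-EM registration (A, b)
# from the identity-point fit, and an EM-to-image rigid registration (R, t).
A = np.diag([12.5, 12.1, 12.9])
b = np.array([4.0, 5.5, 1.0])
R = np.eye(3)
t = np.array([10.0, 12.0, 5.0])

def ep_to_image(ep_reading):
    """Chain the two registrations so a device tracked only by the EP
    tracking system can be drawn over the image data: EP coordinates are
    mapped into EM coordinates and then into image coordinates."""
    em = A @ np.asarray(ep_reading, dtype=float) + b
    return R @ em + t

# A streamed EP reading mapped into image space for display as icon 70ai.
print(ep_to_image([0.5, 0.4, 0.6]))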

Because of the registration of the EP tracking system 22 and the EM tracking system 24, a position of the EP tracking device 70 a can be illustrated or displayed on the display device 38, 40 in block 204. As discussed above regarding FIG. 6, a tracked position of just the EP tracking device 70 a with the EP tracking system 22 can be displayed on the display device 40 relative to the image data 80. For example, the icon 70 ai representing a position of the instrument tracked with the EP tracking device 70 a can be displayed on the image data 80.

Preoperatively acquired image data, such as the image data 80, can be merged with intraoperatively acquired image data in block 206. The merging of the image data can occur in any appropriate manner. One appropriate method can include contour merging, which matches contours in the preoperatively acquired image data and the intraoperatively acquired image data. For example, if image data of a vertebra is acquired preoperatively and contours of the vertebra are acquired intraoperatively, they can be matched. The contours can be manually or automatically determined in the image data and matched between image data sets.

Additionally, tracking the EP tracking device 70 a can be used to create point clouds for various organs. For example, a point cloud or point cloud map can be generated for a portion of the heart 84. The point cloud can then be matched, such as with contour matching or landmark matching, with preoperatively acquired image data. Point cloud generation or matching includes identifying one or more points with the tracking device 70, such as the EP tracking device 70 a, to generate a surface of a volume. Appropriate cloud mapping techniques include those described in U.S. patent application Ser. No. 12/117,537, filed on May 8, 2008, incorporated herein by reference. It will be understood, however, that the point cloud can be generated with either the EP tracking device 70 a or the EM tracking device 70 b. The EP tracking device 70 a, which can include an electrode, can be provided at a selected size, such as one that will easily maneuver within the heart 84, to allow efficient generation of the cloud map by identifying a plurality of points. Accordingly, a selected one of the tracking devices 70 a, 70 b can be used to efficiently generate a selected type of data, such as a landmark or cloud map, for merging of intraoperative and preoperative image data.
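A point cloud built from a swept electrode tends to contain many near-duplicate samples; one plausible helper, sketched below under assumed names and a 2 mm voxel size, thins the stream so the retained points outline the chamber surface. This is not the referenced application's technique, only an illustration of cloud generation.

```python
# Minimal sketch (assumed helper): thin streamed electrode positions on a
# coarse voxel grid to form a manageable point cloud of the swept surface.
import numpy as np

def thin_point_cloud(samples, voxel_mm=2.0):
    """Keep one representative sample per voxel of side voxel_mm."""
    samples = np.asarray(samples, dtype=float)
    keys = np.floor(samples / voxel_mm).astype(int)
    seen, kept = set(), []
    for key, point in zip(map(tuple, keys), samples):
        if key not in seen:
            seen.add(key)
            kept.append(point)
    return np.array(kept)

# stand-in stream of tracked electrode positions (mm)
sweep = np.cumsum(np.random.randn(1000, 3), axis=0)
cloud = thin_point_cloud(sweep, voxel_mm=2.0)
print(len(sweep), "samples thinned to", len(cloud), "cloud points")
```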

In addition, the electrode 92 of the lead 90 can be used as the EP tracking device 70 a. The tip electrode 92 can be implanted in the heart 84. Accordingly, the image data 80, which can be pre- or intra-operatively acquired, can be used to identify or suggest a selected location for the lead tip 92. By correlating the EM tracking system 24 and the EP tracking system 22, a selected location identified relative to the image data 80 can be used to guide the electrode 92 to an appropriate or selected location for implantation. An additional tracking device, such as the EM tracking device 70 b, is not required to track the electrode 92 to a selected location within the heart 84 with the image data 80 because of the registration of the EM tracking system 24 and the EP tracking system 22. Suggesting a placement of a lead tip can be based on any appropriate information, such as historical data, statistical data, or atlas models. Exemplary suggestion systems include those disclosed in U.S. Patent Application Publication No. 2002/0097806, published on May 20, 2004, incorporated herein by reference.
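Once the systems are registered to the image data, guidance toward a suggested implant site reduces to comparing two positions in the same coordinate frame. The sketch below is illustrative only; the coordinate values and function name are assumptions.

```python
# Minimal sketch (illustrative only): report how far the tracked tip electrode
# is from a target location suggested on the registered image data.
import numpy as np

def guidance(tip_in_image_mm, target_in_image_mm):
    delta = np.asarray(target_in_image_mm, float) - np.asarray(tip_in_image_mm, float)
    distance = float(np.linalg.norm(delta))
    direction = delta / distance if distance > 0 else delta
    return distance, direction

tip = np.array([42.0, -10.5, 88.0])     # tracked tip electrode, image coords (mm, assumed)
target = np.array([45.0, -8.0, 90.0])   # location suggested on the image data
dist, unit = guidance(tip, target)
print(f"{dist:.1f} mm to target along {unit}")
```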

Providing registered tracking systems can serve various purposes. As discussed above, the EM tracking system 24 and the EP tracking system 22 can be used for different tracking purposes or in different locations. In addition, the EP tracking system 22 may not generate an appropriate signal in various portions of the patient 36. For example, if the EP tracking device 70 a is not positioned within a portion of the patient 36 that includes an electrolyte or other appropriately conductive material, a voltage may not be generated at the EP tracking device 70 a when a current is induced in the patient 36. In that case, the EM tracking device 70 b can be used to track the position of the instrument 26 relative to the patient 36.
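A selection rule of this kind can be pictured as a simple fallback: prefer the EP reading when its sensed voltage is valid, otherwise use the EM reading. The threshold, field names, and sample values below are assumptions for illustration.

```python
# Minimal sketch (assumed threshold and field names): fall back to the EM
# tracking device when the EP electrode does not sense a usable voltage.
def select_position(ep_sample, em_sample, min_ep_voltage=0.05):
    """Each sample is a dict such as {'position': (x, y, z), 'signal': volts}."""
    if ep_sample is not None and ep_sample.get("signal", 0.0) >= min_ep_voltage:
        return ep_sample["position"], "EP"
    return em_sample["position"], "EM"

pos, source = select_position(
    {"position": (1.0, 2.0, 3.0), "signal": 0.01},   # weak EP signal outside a conductive region
    {"position": (1.1, 2.1, 3.0), "signal": 1.0},
)
print(source, pos)   # falls back to the EM tracking device
```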

According to various embodiments, the EP tracking device 70 a can be substantially smaller than the EM tracking device 70 b. For example, the EP tracking device 70 a may include only a single wire or small conductive member to act as an electrode. The small dimensions of the electrode of the EP tracking device 70 a can allow it to move to selected locations, such as within the heart 84, that may not be accessible with a larger tracking device, such as the EM tracking device 70 b. Therefore, providing the EP tracking system 22 and the EM tracking system 24 can allow for tracking the instrument 26, or any appropriate device, with more than one modality.

The EP tracking system 22 can be used to track the electrode 92 of the lead 90 as the EP tracking device 70 a. Accordingly, the EP tracking system 22 can track the lead electrode to its intended implantation site or location. The tracked position can then be displayed on the display devices 38, 40 for viewing by the surgeon 34.

The EP tracking system 22, however, may not be directly registrable to the image data 80. As discussed above, varying impedances of the tissue of the patient 36 may inhibit registration of the EP tracking system 22 with the image data 80. Lack of registration with the image data 80 can reduce the effectiveness of image-guided navigation.

The EM tracking system 24, however, can be registered with the image data 80. The EM tracking system 24, with its more uniform navigation domain, can be registered to the image data 80. By determining one or more points, also referred to as identity points, in both the navigation domain of the EP tracking system 22 and the navigation domain of the EM tracking system 24, the two tracking systems can be registered. This, in turn, allows the EP tracking system 22 to be registered to the image data 80. Registration can also allow the use of pre-acquired image data that can be registered to intraoperative image data, or other appropriate image data, for navigation of the instrument 26 with the EP tracking device 70 a.
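One standard way to realize an identity-point registration, offered here only as a sketch, is a least-squares rigid fit (Kabsch/SVD) over the paired points, such as those gathered in the earlier collector sketch. A rigid relationship is assumed for simplicity; since the EP domain can be distorted by varying tissue impedance, a scale or non-rigid correction might also be needed in practice.

```python
# Minimal sketch (assumed approach): solve for a rigid transform mapping the
# EP domain into the EM domain from matched identity points.
import numpy as np

def register_identity_points(ep_pts, em_pts):
    """Return (R, t) with em ~ R @ ep + t via the SVD-based Kabsch solution."""
    ep, em = np.asarray(ep_pts, float), np.asarray(em_pts, float)
    c_ep, c_em = ep.mean(axis=0), em.mean(axis=0)
    H = (ep - c_ep).T @ (em - c_em)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, c_em - R @ c_ep

# identity points captured with co-located (or offset-compensated) devices (stand-in data)
ep_pts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
em_pts = ep_pts + np.array([10.0, -5.0, 20.0])
R, t = register_identity_points(ep_pts, em_pts)
```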

In addition, the two tracking systems 22, 24 can be used for complementary purposes. For example, the EM tracking system 24 may have a higher accuracy than the EP tracking system 22. Therefore, the EM tracking system 24 can be used to determine locations of various landmarks for registration, while the EP tracking system 22 is used for navigation of the instrument 26 for implantation. Also, if location and size permit, the EM tracking system 24 can be used to confirm a location of the instrument 26 after implantation.

Further, the EM tracking system 24 can track the tracking device 70 b in the absence of a conductive material. Thus, the EP tracking device 70 a can be used to track the instrument 26 when a conductive medium and current are present (e.g., within the heart 84), and the EM tracking device 70 b can be used to track the instrument 26 when the conductive medium is not present (e.g., exterior to the heart 84).

It will be further understood that the tracking systems 22, 24 can be used to track any appropriate device relative to any appropriate volume. For example, positioning a device within an enclosed volume may be selected for building, manufacturing, or repairing various workpieces in selected workspaces. A device can be moved relative to an enclosed volume, such as within an airplane, a robot, or another enclosed area, without requiring open visualization of or access within the volume. The enclosed volume of the workpiece or workspace may also include more than one type of environment. Accordingly, multiple tracking systems using differing tracking modalities can be used to track a single instrument, or two parts of a single instrument, within any appropriate volume.

The foregoing description of the embodiments has been provided for purposes of illustration and description. It is not intended to be exhaustive or to limit the invention. Individual elements or features of a particular embodiment are generally not limited to that particular embodiment, but, where applicable, are interchangeable and can be used in a selected embodiment, even if not specifically shown or described. The same may also be varied in many ways. Such variations are not to be regarded as a departure from the invention, and all such modifications are intended to be included within the scope of the invention.
