|Publication number||WO1999033406 A1|
|Publication date||8 Jul 1999|
|Filing date||30 Dec 1998|
|Priority date||31 Dec 1997|
|Also published as||DE19882935B4, DE19882935T0, DE19882935T1|
|Publication number||PCT/US1998/027841, WO 1999/033406 A1|
|Inventors||Mark W. Hunter, Paul J. Kessman, Scott R. Smith|
|Applicant||Surgical Navigation Technologies, Inc.|
WIRELESS PROBE SYSTEM FOR USE WITH A STEREOTACTIC SURGICAL DEVICE
BACKGROUND OF THE INVENTION
Technical Field
The present invention relates generally to computer assisted image guided medical and surgical navigation systems that generate images during medical and surgical procedures indicating the relative position of various body parts, surgical implants, and instruments. In particular, the present invention relates to a system for determining the positions of a wireless probe and a reference arc or frame by using a sequenced lighting pattern to synchronize the lights on the probe with those on the reference arc or frame.
Computer assisted image guided medical and surgical navigation systems are known and used to generate images in order to guide a doctor during a surgical procedure. Such systems are disclosed, for example, in U.S. Patent No. 5,383,454 to Bucholz; PCT Application No. PCT/US94/04530 (Publication No. WO 94/24933) to Bucholz; and PCT Application No. PCT/US95/12984 (Publication No. WO 96/11624) to Bucholz et al., all incorporated herein by reference.
In general, these image guided systems use images of a body part, such as CT scans, taken before surgery to generate images on a display, such as a CRT monitor screen, during surgery for representing the position of a surgical instrument with respect to the body part. The systems typically include tracking devices such as, for example, an LED array mounted on a surgical instrument as well as a body part, a digitizer or a camera device to track in real time the position of the LED arrays and thus the body part and the instrument used during surgery, and a monitor screen to display images representing the body part and the position of the instrument relative to the body part as the surgical procedure is performed.
Current stereotactic surgical systems, such as those described in U.S. Patent No. 5,383,454 and PCT Application No. PCT/US95/12984, track a probe as well as a body part during surgery. As described therein, a reference arc or frame is positioned in fixed relation to the head of a patient. Both the probe and the reference arc include emitters which are detected and used by a processing system to determine the position of the probe and arc in three dimensional space. In these prior art systems, both the reference arc and the probe must receive external signals, typically through a wire, cable, or other suitable connection, for activating the emitters. The surgeon typically initiates these external signals by, for example, depressing a button on a foot switch. The button may be connected to a processor that sends a signal to the probe emitters directing them to emit radiation. The processor must also direct the emitters on the reference arc to light. One problem with this prior art system is that the wires that connect the probe to the processor can interfere with the surgeon's work. In addition, the surgeon must take the time to press the foot switch and cannot receive constant updates on the probe position without repeatedly doing so.
Disclosure of the Invention
Systems and methods consistent with the present invention include a device to determine a position of a probe with respect to an object, such as, for example, a cranium or spinal vertebrae. A reference arc has a first set of emitters positioned on it and is fixedly positioned on the object. A probe has a second set of emitters positioned on it and is positioned by a surgeon near the object. An activating circuit in the probe causes each of the second set of emitters to turn on and off in a predetermined sequence to produce a first ordered pattern of light. A detector, which is part of a digitizer, is positioned to detect radiation from the first and second sets of emitters. A processor includes instructions for recognizing the ordered pattern output by the second set of emitters based on the radiation detected by the detector and for instructing the first set of emitters to turn on and off in a second ordered pattern that limits overlap with the first ordered pattern, whereby the positions of both the reference arc and the probe may be determined by the processor based on detected radiation.
In an alternate embodiment the reference arc includes an activating circuit for lighting the first set of emitters in a pattern.
In an alternate embodiment the probe includes an activating circuit that lights emitters on the probe at a different rate than an activating circuit in the reference arc lights emitters on the arc. A processor includes instructions for recognizing the differing ordered pattern outputs of the set of emitters on the arc and probe, whereby a position of both the reference arc and the probe may be determined based on detected radiation.
A method is also provided for indicating a position of a surgical instrument comprising the steps of positioning a reference arc near a patient and emitting radiation using a first set of emitters positioned on the reference arc. Radiation is emitted from a second set of emitters positioned on a second structure in an ordered pattern. Detectors detect radiation from the first and second set of emitters. The ordered pattern is recognized by a processor based on detected radiation. The processor activates the first set of emitters during times when the second set of emitters is inactive, whereby a position of the reference arc and surgical instrument may be determined based on detected radiation.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are not restrictive of the invention as claimed.
Brief Description of the Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and, together with the description, serve to explain the objects, advantages and principles of the invention. In the drawings,
Fig. 1 is a schematic front view of a computer assisted image guided surgery system used with a wireless probe according to the present invention;
Fig. 2 shows a control circuit that is positioned in the probe shown in Fig. 1 according to the present invention;
Fig. 3 shows a timing diagram of the pattern of activation of the LEDs on the probe using the control circuit of Fig. 2 and includes the timing of activation of the LEDs on a reference arc activated by the system shown in Fig. 2 and the detection frames and localization frames of the processor;
Fig. 4 shows a flow chart of the steps for initializing the system of the present invention and determining the position of the probe and reference arc;
Fig. 5 shows a timing diagram of the emission of light from the reference arc and the probe according to one embodiment of the present invention; and
Fig. 6 shows a timing diagram of the emission of light from the reference arc and the probe according to one embodiment of the present invention.
Best Mode for Carrying Out the Invention
Reference will now be made in detail to the present preferred embodiments of the invention, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers are used in the drawings and the description to refer to the same or like parts.
The medical instrument of the present invention is shown generally as part of the system 11 in Fig. 1. An instrument such as probe 100 can be used in known computer assisted image guided surgical navigation systems such as the system 11 shown in Fig. 1 and disclosed in PCT Application No. PCT/US95/12984 (Publication No. WO 96/11624) to Bucholz et al. A computer assisted image guided surgery system 11 generates an image for display on a CRT monitor 106 representing the real time position of a body part such as a cranium, represented generally by circle 119, having reference points 118 and the position of probe 100 relative to the body part 119.
An image may be generated on monitor 106 from an image data set stored in a controller, such as computer 108, usually generated preoperatively by some scanning technique such as, for example, by a CT scanner or by magnetic resonance imaging (MRI). The image data set and the image generated have reference points for at least one body part. The reference points for the particular body part have a fixed spatial relation to the particular body part.
System 11 also generally includes a processor for processing image data, shown as computer 108, which displays the processed data on monitor 106. Digitizer control unit 114 is under control of computer 108. Digitizer 114, in conjunction with a reference frame or arc 120 and a sensor array 110 or other known position sensing unit, tracks the real time position of a body part, such as a cranium shown at 119 clamped in reference arc 120, and a probe 100. Reference arc 120 has emitters 122 (such as LEDs A1, A2, A3, A4, and A5) or other tracking means that generate signals representing the position of the various body reference points. Reference arc 120 is fixed spatially in relation to a body part by a clamp assembly, indicated generally at 124, 125, and 126, so that arc 120 moves as the body part is moved during surgery. Probe 100 also has a tracking device shown as an emitter array set 40 (such as LEDs 366, 368, and 369) which generates signals representing the position of the probe during the procedure.
Sensor array 110, mounted on support 112, receives the signals generated by emitters 122 and emitter array set 40 and, in conjunction with digitizer 114 and computer 108, triangulates those signals in order to identify during the procedure the relative position of the reference points on arc 120 and the probe 100. Digitizer 114 and computer 108 may then modify the image data set according to the identified relative position of each of the reference points during the procedure. Computer 108 may then generate an image data set for display on monitor 106 representing the position of the body part and the probe on the appropriate CT or MRI scan slice during the operation. The general structure and operation of an image guided surgery system is well known in the art and need not be discussed further here.
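The specification does not detail the triangulation itself, since it is well known. As a rough illustration of the principle only (a hypothetical sketch, not the patented implementation), the following intersects two bearing rays from known sensor positions to locate a single emitter in a plane; a real system works in three dimensions with calibrated cameras:

```python
import math

def triangulate_2d(s1, a1, s2, a2):
    """Intersect two bearing rays (sensor position, bearing angle in
    radians) to estimate an emitter position in the plane."""
    # Ray i runs from s_i in direction (cos a_i, sin a_i).
    d1 = (math.cos(a1), math.sin(a1))
    d2 = (math.cos(a2), math.sin(a2))
    # Solve s1 + t1*d1 = s2 + t2*d2 for t1 via a 2x2 determinant.
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-12:
        raise ValueError("parallel bearings: no unique intersection")
    dx, dy = s2[0] - s1[0], s2[1] - s1[1]
    t1 = (dx * d2[1] - dy * d2[0]) / denom
    return (s1[0] + t1 * d1[0], s1[1] + t1 * d1[1])
```

For example, two sensors at (-3, 0) and (3, 0) both sighting an emitter at (0, 5) recover that position from their bearing angles alone.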
In order to initiate the emitters on a reference arc and probe some prior art systems required the surgeon to take a positive action such as pressing a foot switch connected to a digitizer 114 attached by a cable to a probe 100. However, this unduly burdened the surgeon and might not provide for continual and easy updates on the position of the probe. The present invention does not require any action on the part of the surgeon to activate emitters on the reference arc and eliminates the need for the surgeon to attach a cable to the probe 100 to activate the emitters.
Fig. 1 shows the probe 100 having a body 41 and an emitter set 40. Probe 100 may be any one of a plurality of types of surgical instruments such as a surgical coagulating forceps, a bipolar coagulating forceps, drill, suction tube, bayonet cauterizing device, catheter guide, drill guide, or any other surgical instrument modified as discussed with respect to Figs. 2-4. Probe 100 can be wireless and has a battery 361 for power and internal circuitry shown in Fig. 2 for controlling the light emitter set 40. In one embodiment a switch may be on the probe to activate and disable the emitter set 40.
The probe 100 includes within body 41 a control circuit 362 for controlling the emission of light from emitter set 40. Figure 2 shows the control circuit 362 hardware, which includes a microprocessor 364 shown as the Motorola 68HC705J2CS. Any suitable microprocessor or pattern generator circuit known in the art may be used. Control circuit 362 drives the light emitter set 40, which includes at least two light emitters but as shown includes three LEDs 366, 368, and 369, all suitably positioned on the probe 100 for monitoring by sensor array 110. PA0 output 18 drives the first LED 366, PA1 output 17 drives the second LED 368, and PA2 output 16 drives the third LED 369. If additional emitters are desired, then other outputs such as 11-15 may act as drivers. The clock timer signal generated by the 4 megahertz crystal of clock 370 is input to OSC1 and OSC2.
Further with respect to Fig. 2, the microprocessor 364 includes routine software and hardware that generate the signals driving emitter set 40 on the probe 100. The present invention activates the emitters on the probe 100 in a predetermined ordered pattern such as that indicated generally at 400 in Fig. 3. The probe of the present invention behaves independently of the digitizer 114 in that probe 100 is neither connected to nor controlled by the digitizer 114. The probe 100 includes the internal circuitry described above for activating the emitters on the probe 100 in a predetermined order at predetermined times. This order and timing of activating the emitters is referred to as an ordered pattern, one example of which is illustrated in Fig. 3. Other patterns, such as any cyclical pattern with a sufficient period for detection by digitizer 114, may also be used as long as the digitizer 114 and/or the computer 108 are programmed to recognize the predetermined pattern produced by probe 100. The computer 108 has a memory that is programmed beforehand to store the predefined pattern 400, and this pattern 400 can be programmed into the digitizer 114 during initialization; the digitizer 114 can therefore recognize the pattern 400 when it is output by the probe 100.
Preferably the emission of radiation from one LED does not overlap with that of another LED, and there may be empty time between emitter transitions. Preferably there is an empty time cycle between transitioning from the probe to the arc to account for errors in the synchronization. In addition, in a preferred embodiment, the geometry of the emitters may be determined from detected radiation to verify that the detected structure has the appropriate shape for the probe or arc. This best-fit geometry check can be performed to verify correct synchronization throughout the localization process.
The timing diagram of Fig. 3 is divided into time slots, with each time slot generally corresponding to an LED activation period shown as 1-20 on the bottom of the drawing. The time slots need not all be equal, but in a preferred embodiment each time slot is about 30 milliseconds.
With reference to Fig. 3, control circuit 362 activates (turns on) first emitter LED 366 for the first time slot S1 of the pattern 400. Then control circuit 362 activates LED 368 for the second time slot S2. Next, a third LED 369 is activated for a third time slot S3. Each emitter is on for its entire time slot, and preferably the emitters are activated sequentially such that the end of the first time slot corresponds to the start of the second time slot. Preferably the periods of activation do not overlap, and there may be time gaps between the periods of activation. Changes to these parameters merely change the pattern 400. The same pattern 400 must be stored in both the probe 100 and the computer 108 to enable the digitizer 114 to recognize the pattern.
After the sequential lighting of the LED set 40, the clock 370 input to microprocessor 364 counts seven time slots (slots S4-S10), represented as T1-T7 in Fig. 3, during which the probe 100 does not activate any of the emitters. Microprocessor 364 does not directly control emitters 122 on the reference arc 120; instead, in this embodiment, the computer 108 controls emitter set 122, which includes emitters A1-A5. The timing diagram shows the emission of radiation from emitters A1-A5, which can begin, as described in further detail below, after the probe pattern is recognized and synchronization has occurred. The overall pattern 400 generated by the probe 100 alone thus sequences through ten equal time slots: three in which a single LED is lit and seven in which no action occurs. During the period of no action by the probe, the computer 108 will activate emitters A1-A5 on the reference arc 120. Once the probe completes its pattern 400, the microprocessor restarts the pattern 400 at the beginning with LED 366.
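The ten-slot cycle just described can be sketched as a simple generator. The slot length and the three-lit/seven-dark structure are taken from the description; the code itself is illustrative only and is not the firmware on microprocessor 364:

```python
SLOT_MS = 30                  # preferred slot length from the description
PROBE_LEDS = ["LED366", "LED368", "LED369"]
IDLE_SLOTS = 7                # slots T1-T7, in which the probe is dark

def probe_pattern():
    """Yield (slot_index, active_led_or_None) for one cycle of pattern 400."""
    for i, led in enumerate(PROBE_LEDS):
        yield i, led          # one LED lit per slot, in sequence
    for i in range(IDLE_SLOTS):
        yield len(PROBE_LEDS) + i, None   # no probe emission

cycle = list(probe_pattern())  # ten slots: three lit, seven dark
```

With 30 ms slots, one full cycle of pattern 400 takes 300 ms before restarting at LED 366.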
Referring to Fig. 4, a flow chart is provided further showing the steps of the present invention. The microprocessor 364 of probe 100 activates LEDs 366, 368, and 369, executing the pattern 400 of Fig. 3 (step 500) independently of any other part of the system shown in Fig. 1. During this initializing period, the emitter set 122 on the reference arc 120 is not activated by the computer 108. While the probe 100 executes the emission pattern 400, sensor array 110 is constantly detecting received light (step 510).
After a suitable initialization period, the digitizer 114 samples the received light in units called detection frames to recognize the probe's pattern 400. As illustrated in Fig. 3, each detection frame 115 spans a fraction of the probe's emitter activation time, that is, the time each emitter is on (oversampling), so that the digitizer 114 can determine when the probe 100 emitters are on and off. The detection frame sampling rate must be at least two times the individual emitter rate (the Nyquist criterion), but it is preferable to use a faster sampling rate such as four times the emitter lighting rate. The more samples that are taken in a given time period, the greater the precision in determining when the emitters are on or off. In the example of Fig. 3, there are about four frames per time slot. Based on the detection frames in which light above a threshold is detected and those in which it is not, the digitizer 114 recognizes the pattern 400 (step 520).
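This oversampling and pattern recognition can be sketched as follows, assuming four detection frames per slot as in Fig. 3 (the actual digitizer firmware is not disclosed; the function names are illustrative). Each slot is expanded into several boolean frames, and the stream of frames is matched against the stored template at every cyclic phase:

```python
def oversample(slots, frames_per_slot=4):
    """Expand per-slot on/off states into detection frames (True = lit)."""
    frames = []
    for lit in slots:
        frames.extend([lit] * frames_per_slot)
    return frames

def recognize_phase(frames, template):
    """Return the cyclic shift at which the detected frames match the
    stored pattern template, or None if no shift matches (e.g. the
    probe signal is lost or blocked)."""
    n = len(template)
    for shift in range(n):
        if all(frames[i] == template[(i + shift) % n] for i in range(n)):
            return shift
    return None
```

A frame stream that is the template delayed by some number of frames is recognized at exactly that shift; an all-dark stream is reported as no match, which is the condition for falling back to resynchronization.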
The digitizer 114 has an internal clock timer that preferably runs at the same frequency as (or an integral multiple of) the clock timer 370 of the probe. Based on the recognized pattern 400, the digitizer 114 synchronizes the pulse rate of the emitters 122 on the reference arc 120 to that of the emitters 40 on the probe 100 (step 530). Synchronization of signals is well known in the art and need not be discussed further. Once pulse rates are synchronized, the digitizer 114 knows the times at which the light emitter set 40 on the probe 100 will light and defines localization frames 116 for those times (step 540). The digitizer 114 can then direct the emitters 122 on the reference arc 120 to sequentially light, A1-A5, at times when the LEDs on the probe are not lit (step 550). The digitizer 114 samples the sensor array 110 at the times when emitters are expected to be on, shown as probe localization frames 116, and can also sample during arc localization frames 117 (step 560). The position of the reference arc 120 and the probe 100 may now be calculated as is known in the art (step 570).
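Placing the arc emitters A1-A5 into the probe's dark slots, with the empty guard slot at the probe-to-arc transition that the description prefers, can be sketched like this (an illustrative assignment, assuming one guard slot; the disclosed system does not specify the exact mapping):

```python
def schedule_arc(probe_cycle, arc_leds, guard_slots=1):
    """Map each reference-arc emitter to a time slot in which the probe
    is dark, skipping guard slots at the probe-to-arc transition to
    absorb synchronization error."""
    dark = [slot for slot, led in probe_cycle if led is None]
    usable = dark[guard_slots:]
    if len(arc_leds) > len(usable):
        raise ValueError("not enough dark slots for the arc emitters")
    return dict(zip(arc_leds, usable))
```

With the ten-slot probe cycle (slots 3-9 dark) and one guard slot, A1 through A5 land in slots 4 through 8, leaving slot 9 free before the probe's pattern restarts.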
If the probe signal is lost for a significant time, for example when the surgeon removes the probe from the line of sight of the digitizer 114, the digitizer 114 will assume the probe is off. The probe signal could also be lost if it were blocked or if the clock signals were out of phase. The digitizer 114 will then attempt to resynchronize using detection frames until the pattern 400 of the probe 100 is recognized. Once resynchronized, the digitizer 114 resumes processing as discussed above.
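This lost-signal handling amounts to a two-state tracking loop. The following is a hypothetical sketch; the dark-frame threshold is an assumption, not a value from the patent:

```python
SYNCED, SEARCHING = "synchronized", "searching"

def next_state(state, consecutive_dark_frames, dark_limit, pattern_found):
    """One tick of the digitizer's tracking logic: assume the probe is
    off after too many consecutive dark frames and fall back to
    detection frames; resume localization once pattern 400 is
    recognized again."""
    if state == SYNCED and consecutive_dark_frames > dark_limit:
        return SEARCHING
    if state == SEARCHING and pattern_found:
        return SYNCED
    return state
```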
In another embodiment of the present invention, the circuit shown in Fig. 2 can be included in the reference arc 120 instead of the probe 100 so that the reference arc can be wireless and independently perform the pattern 400 of Fig. 3. In this case, computer 108 does not activate the emitters on the reference arc but must synchronize and recognize the reference pattern 400 in the same manner discussed above with respect to the probe. In this embodiment, it is the probe instead of the arc that may be wired and controlled by the digitizer 114.
In yet another embodiment of the present invention, the circuit shown in Fig. 2 can be included in both the probe 100 and the reference arc 120. Therefore, both devices can be wireless and independently perform different patterns that can be recognized by the digitizer 114. In this case, one of the devices will have a pattern that is half the period of the other, or the two will have otherwise differing patterns that can be recognized. The digitizer 114 will distinguish between lights on the arc and lights on the probe by using the known geometries of the probe and the arc to determine which detected lights belong to which element. The pattern and its detection are repeated, allowing enough data points to be collected to determine the geometry of the detected lights. Once synchronized, the digitizer 114, as discussed for the previous embodiment, knows the times at which lights are expected to be lit on both the probe and the arc. Figs. 5 and 6 show example timing diagrams corresponding to this embodiment. As shown in Figs. 5 and 6, the probe is localized during periods when only the probe emitters 366, 368, and 369 are on, and the reference arc 120 is localized during periods when only the emitters A1-A5 on the reference arc are on. Fig. 5 shows an example where the patterns of the emitters on the reference arc 120 and on the probe 100 are close in phase, and Fig. 6 shows an example where the patterns are out of phase. During periods when both the emitters on the probe and those on the arc are off, the digitizer 114 may turn on emitters on a third device, such as another probe, to detect the position of that third device. For example, in Fig. 6, during the no-localization period when both the probe and arc emitters are off, the digitizer 114 may direct LEDs on a third structure to light.
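Whether two independently running cyclic patterns ever light at the same time can be checked over one super-cycle, the least common multiple of the two periods. This sketch illustrates the design constraint only, not any disclosed firmware:

```python
from math import lcm

def overlap_slots(period_a, lit_a, period_b, lit_b):
    """Return the slots within one super-cycle in which both devices
    would be lit; an empty list means the digitizer can always tell
    the two patterns apart by timing alone."""
    horizon = lcm(period_a, period_b)
    return [t for t in range(horizon)
            if (t % period_a) in lit_a and (t % period_b) in lit_b]

# A probe cycle of ten slots (lit in slots 0-2) and an arc cycle of
# half that period (lit in slots 3-4) never collide:
clashes = overlap_slots(10, {0, 1, 2}, 5, {3, 4})
```

If the check returns clashes, the digitizer would have to fall back on the known device geometries, as the embodiment above describes, to assign the detected lights to the probe or the arc.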
The pattern lighting registration system of the present invention may be used in connection with any type of image guided surgery system, including those described in U.S. Patent No. 5,383,454 to Bucholz; PCT Application No. PCT/US94/04530 (Publication No. WO 94/24933) to Bucholz; and PCT Application No. PCT/US95/12984 (Publication No. WO 96/11624) to Bucholz et al.
In another embodiment, the emitters on either the probe or the reference arc are replaced with reflectors that reflect light emitted from an external source, such as the digitizer unit. In this case either the probe or reference arc with reflectors will be localized during the A1-A5 cycles.
Systems and methods consistent with the present invention allow for a probe or reference arc used in image guided surgery to be continuously and easily monitored without the surgeon having to repeatedly initiate the monitoring or having the surgical instruments cabled. This is achieved by placing emitters on the probe 100 or reference arc 120 that light in an ordered pattern 400 recognized by a processor so that the processing system knows the times at which the emitters will be activated.
The foregoing description is presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed, and modifications and variations are possible in light of the above teachings or may be acquired from practice of the invention. The embodiments were chosen and described in order to explain the principles of the invention and its practical application to enable one skilled in the art to utilize the invention in various embodiments and with various modifications as are suited to the particular use contemplated. The scope of the invention is defined by the following claims and their equivalents.
|Cited Patent||Filing date||Publication date||Applicant||Title|
|WO1994024933A1||25 Apr 1994||10 Nov 1994||St. Louis University||Indicating the position of a surgical probe|
|WO1995025475A1 *||21 Mar 1995||28 Sep 1995||Elekta Instrument Ab||Device for detecting position of surgical instruments|
|WO1996011624A2||5 Oct 1995||25 Apr 1996||St. Louis University||Surgical navigation systems including reference and localization frames|
|DE4225112C1 *||30 Jul 1992||9 Dec 1993||Bodenseewerk Geraetetech||Instrument position relative to processing object measuring apparatus - has measuring device for measuring position of instrument including inertia sensor unit|
|US5383454||2 Jul 1992||24 Jan 1995||St. Louis University||System for indicating the position of a surgical probe within a head on an image of the head|
|US5622170 *||4 Oct 1994||22 Apr 1997||Image Guided Technologies, Inc.||Apparatus for determining the position and orientation of an invasive portion of a probe inside a three-dimensional body|
|Citing Patent||Filing date||Publication date||Applicant||Title|
|EP1341465A1 *||17 Nov 2000||10 Sep 2003||Calypso Medical, Inc||Systems and methods for locating and defining a target location within a human body|
|EP1341465A4 *||17 Nov 2000||23 Aug 2006||Calypso Medical Inc||Systems and methods for locating and defining a target location within a human body|
|US7998062||19 Jun 2007||16 Aug 2011||Superdimension, Ltd.||Endoscope structures and techniques for navigating to a target in branched structure|
|US8246602||22 Jun 2004||21 Aug 2012||Medtronic, Inc.||Catheters with tracking elements and permeable membranes|
|US8452068||2 Nov 2011||28 May 2013||Covidien Lp||Hybrid registration method|
|US8467589||2 Nov 2011||18 Jun 2013||Covidien Lp||Hybrid registration method|
|US8473032||2 Jun 2009||25 Jun 2013||Superdimension, Ltd.||Feature-based registration method|
|US8611984||6 Apr 2010||17 Dec 2013||Covidien Lp||Locatable catheter|
|US8663088||2 Dec 2009||4 Mar 2014||Covidien Lp||System of accessories for use with bronchoscopes|
|US8696548||9 Jun 2011||15 Apr 2014||Covidien Lp||Endoscope structures and techniques for navigating to a target in branched structure|
|US8764725||14 Nov 2008||1 Jul 2014||Covidien Lp||Directional anchoring mechanism, method and applications thereof|
|US8838199||14 Feb 2005||16 Sep 2014||Medtronic Navigation, Inc.||Method and apparatus for virtual digital subtraction angiography|
|US8905920||19 Sep 2008||9 Dec 2014||Covidien Lp||Bronchoscope adapter and method|
|US8932207||10 Jul 2009||13 Jan 2015||Covidien Lp||Integrated multi-functional endoscopic tool|
|US9055881||1 May 2005||16 Jun 2015||Super Dimension Ltd.||System and method for image-based alignment of an endoscope|
|US9072895||27 Sep 2010||7 Jul 2015||Varian Medical Systems, Inc.||Guided radiation therapy system|
|US9089261||14 Sep 2004||28 Jul 2015||Covidien Lp||System of accessories for use with bronchoscopes|
|US9113813||17 Dec 2013||25 Aug 2015||Covidien Lp||Locatable catheter|
|US9117258||20 May 2013||25 Aug 2015||Covidien Lp||Feature-based registration method|
|US9168102||18 Jan 2006||27 Oct 2015||Medtronic Navigation, Inc.||Method and apparatus for providing a container to a sterile environment|
|US9237860||5 Jun 2009||19 Jan 2016||Varian Medical Systems, Inc.||Motion compensation for medical imaging and associated systems and methods|
|US9238151||6 May 2013||19 Jan 2016||Varian Medical Systems, Inc.||Dynamic/adaptive treatment planning for radiation therapy|
|US9248003||31 Dec 2003||2 Feb 2016||Varian Medical Systems, Inc.||Receiver used in marker localization sensing system and tunable to marker frequency|
|US9271803||2 May 2013||1 Mar 2016||Covidien Lp||Hybrid registration method|
|US9283053||19 Sep 2006||15 Mar 2016||Varian Medical Systems, Inc.||Apparatus and methods for implanting objects, such as bronchoscopically implanting markers in the lung of patients|
|US9486162||7 Jan 2011||8 Nov 2016||Ultrasonix Medical Corporation||Spatial needle guidance system and associated methods|
|US9504530||3 Feb 2014||29 Nov 2016||Medtronic Navigation, Inc.||Method and apparatus for surgical navigation|
|US9575140||2 Apr 2009||21 Feb 2017||Covidien Lp||Magnetic interference detection system and method|
|US9586059||25 Jul 2005||7 Mar 2017||Varian Medical Systems, Inc.||User interface for guided radiation therapy|
|US9597154||24 Feb 2014||21 Mar 2017||Medtronic, Inc.||Method and apparatus for optimizing a computer assisted surgical procedure|
|US9616248||26 Jul 2012||11 Apr 2017||Varian Medical Systems, Inc.||Integrated radiation therapy systems and methods for treating a target in a patient|
|US9623208||12 Jan 2005||18 Apr 2017||Varian Medical Systems, Inc.||Instruments with location markers and methods for tracking instruments through anatomical passageways|
|US9642514||11 Apr 2014||9 May 2017||Covidien Lp||Endoscope structures and techniques for navigating to a target in a branched structure|
|US9659374||24 Jul 2015||23 May 2017||Covidien Lp||Feature-based registration method|
|US9668639||8 Jun 2012||6 Jun 2017||Covidien Lp||Bronchoscope adapter and method|
|US9675424||22 Jul 2009||13 Jun 2017||Surgical Navigation Technologies, Inc.||Method for calibrating a navigation system|
|US9682253||25 Jul 2005||20 Jun 2017||Varian Medical Systems, Inc.||Integrated radiation therapy systems and methods for treating a target in a patient|
|US9757087||29 Jun 2009||12 Sep 2017||Medtronic Navigation, Inc.||Method and apparatus for perspective inversion|
|USRE46409||20 Apr 2015||23 May 2017||Medtronic Navigation, Inc.||Image guided awl/tap/screwdriver|
|USRE46422||20 Apr 2015||6 Jun 2017||Medtronic Navigation, Inc.||Image guided awl/tap/screwdriver|
|Cooperative Classification||A61B90/36, A61B34/20, A61B34/10, A61B2034/2072, A61B2034/2068, A61B2034/2055, A61B90/10, A61B2090/363|
|European Classification||A61B19/20, A61B19/52, A61B19/52H12|
|8 Jul 1999||AL||Designated countries for regional patents|
Kind code of ref document: A1
Designated state(s): GH GM KE LS MW SD SZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG
|8 Jul 1999||AK||Designated states|
Kind code of ref document: A1
Designated state(s): AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG UZ VN YU ZW
|5 Aug 1999||DFPE||Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)|
|8 Sep 1999||121||Ep: the epo has been informed by wipo that ep was designated in this application|
|30 Jun 2000||NENP||Non-entry into the national phase in:|
Ref country code: KR
|2 May 2001||122||Ep: pct application non-entry in european phase|
|4 Oct 2001||RET||De translation (de og part 6b)|
Ref country code: DE
Ref document number: 19882935
Date of ref document: 20011004
Format of ref document f/p: P