WO2010102197A2 - Thermal surgical monitoring - Google Patents

Thermal surgical monitoring

Info

Publication number
WO2010102197A2
WO2010102197A2 (PCT/US2010/026349)
Authority
WO
WIPO (PCT)
Prior art keywords
probe
temperature
tracking
distal end
surgical
Application number
PCT/US2010/026349
Other languages
French (fr)
Other versions
WO2010102197A3 (en)
Inventor
James Henry Boll
Richard Shaun Welches
Mirko Mirkov
Original Assignee
Cynosure, Inc.
Application filed by Cynosure, Inc.
Priority to PCT/US2010/026349
Publication of WO2010102197A2
Publication of WO2010102197A3


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B18/00 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body
    • A61B18/18 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body by applying electromagnetic radiation, e.g. microwaves
    • A61B18/20 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body by applying electromagnetic radiation, e.g. microwaves using laser
    • A61B18/203 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body by applying electromagnetic radiation, e.g. microwaves using laser applying laser energy to the outside of the body
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/10 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges for stereotaxic surgery, e.g. frame-based stereotaxis
    • A61B90/11 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges for stereotaxic surgery, e.g. frame-based stereotaxis with guides for needles or instruments, e.g. arcuate slides or ball joints
    • A61B90/13 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges for stereotaxic surgery, e.g. frame-based stereotaxis with guides for needles or instruments, e.g. arcuate slides or ball joints guided by light, e.g. laser pointers
    • A61B90/50 Supports for surgical instruments, e.g. articulated arms
    • A61B17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B2017/00017 Electrical control of surgical instruments
    • A61B2017/00022 Sensing or detecting at the treatment site
    • A61B2017/00084 Temperature
    • A61B2017/00088 Temperature using thermistors
    • A61B2017/00115 Electrical control of surgical instruments with audible or visual output
    • A61B2017/00119 Electrical control of surgical instruments with audible or visual output alarm; indicating an abnormal situation
    • A61B2018/00315 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body for treatment of particular body parts
    • A61B2018/00452 Skin
    • A61B2018/0047 Upper parts of the skin, e.g. skin peeling or treatment of wrinkles
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/107 Visualisation of planned trajectories or target regions
    • A61B2034/2046 Tracking techniques
    • A61B2034/2048 Tracking techniques using an accelerometer or inertia sensor
    • A61B2034/2055 Optical tracking systems
    • A61B2034/2065 Tracking using image or pattern recognition
    • A61B90/30 Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure
    • A61B2090/306 Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure using optical fibres
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364 Correlation of different images or relation of image positions in respect to the body
    • A61B2090/365 Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image
    • A61B90/39 Markers, e.g. radio-opaque or breast lesions markers
    • A61B2090/3937 Visible markers
    • A61B2090/3945 Active visible markers, e.g. light emitting diodes
    • A61B2090/397 Markers, e.g. radio-opaque or breast lesions markers electromagnetic other than visible, e.g. microwave
    • A61B2090/3975 Markers, e.g. radio-opaque or breast lesions markers electromagnetic other than visible, e.g. microwave active
    • A61B2090/3979 Markers, e.g. radio-opaque or breast lesions markers electromagnetic other than visible, e.g. microwave active infrared
    • A61B2090/5025 Supports for surgical instruments, e.g. articulated arms with a counter-balancing mechanism
    • A61B2090/504 Supports for surgical instruments, e.g. articulated arms with a counter-balancing mechanism with a counterweight
    • A61B34/25 User interfaces for surgical systems
    • A61B90/361 Image-producing devices, e.g. surgical cameras

Definitions

  • the present invention relates to a surgical theater camera and a surgical IR sensor. More particularly, the present invention relates to a surgical camera and sensor adapted to monitor tissue temperature of a patient during a thermal surgical procedure.
  • Plastic surgeons, dermatologists and their patients continually search for new and improved methods for treating the effects of aging or otherwise damaged skin.
  • One common procedure for rejuvenating the appearance of aged or photodamaged skin is laser skin resurfacing using a carbon dioxide laser.
  • Another technique is non-ablative laser skin tightening, which does not take the top layer of skin off, but instead uses a deep-penetrating laser to treat the layers of skin beneath the outer epidermal layer, tightening the skin and reducing wrinkles to provide a more youthful appearance.
  • Some skin tightening techniques include using a hollow tubular cannula that contains an optical fiber connected to a laser source.
  • the cannula can be inserted subcutaneously into a patient so that the end of the fiber is located within the tissue underlying the dermis.
  • the source emits a treatment output, for example an output pulse that is conveyed by the fiber to the dermis, which causes collagen shrinkage within the treatment area, thus tightening the skin.
  • patients have also turned to surgical methods for removing undesirable tissue from areas of their body. For example, to remove fat tissue, some patients have preferred liposuction, a procedure in which fat is removed by a suction mechanism, because despite strenuous dieting and exercise some patients cannot lose fat, particularly in certain areas.
  • laser or other light sources have been applied for heating, removal, destruction (for example, killing), photocoagulation, eradication or otherwise treating (hereinafter collectively referred to as "treating" or "treatment") the tissue.
  • a thermo-surgical theater with appropriate sensors and processing may provide hands-free surface and/or internal temperature monitoring and/or feedback to a surgeon during thermal surgical procedures, for example laser surgery or any other procedure/surgery where energy is deposited into surface tissue to effect a tissue temperature rise.
  • Various embodiments could include surgical systems employing, e.g., laser, radio frequency (RF), acoustic or other suitable energy source.
  • an apparatus for monitoring a thermal surgical procedure includes a thermal camera for monitoring temperature at a plurality of locations within at least a portion of a surgical field undergoing thermal surgical treatment and generating a series of thermal images based on the monitoring.
  • the apparatus further includes a processor for processing the thermal images and a display for displaying, in real time, a series of display images indicative of temperature at the plurality of locations.
  • the display images indicative of temperature at the plurality of locations comprise false color images.
  • the thermal camera is an infrared camera.
  • the thermal camera generates the series of thermal images at a rate of greater than about 30 frames per second.
  • the display displays the series of display images at a rate of greater than about 30 frames per second.
  • the apparatus for monitoring a thermal surgical procedure further includes a video camera for monitoring at least a portion of the surgical field, where the display is configured to display, in real time, video images of said portion of the surgical field.
  • the apparatus for monitoring a thermal surgical procedure further includes a processor configured to superimpose the thermal images with the video images, where the display images comprise the superimposed images.
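  The superimposition of thermal and video images described above can be sketched as a false-color mapping followed by an alpha blend. This is a minimal NumPy illustration; the color map, the 20 to 60 degree endpoints, and the blend weight are assumptions, not details from the patent.

```python
import numpy as np

def false_color(thermal, t_min=20.0, t_max=60.0):
    """Map a 2-D temperature array (deg C) to an RGB false-color image.

    Blue encodes cool and red encodes hot; the endpoints are illustrative.
    """
    norm = np.clip((thermal - t_min) / (t_max - t_min), 0.0, 1.0)
    rgb = np.zeros(thermal.shape + (3,))
    rgb[..., 0] = norm          # red channel rises with temperature
    rgb[..., 2] = 1.0 - norm    # blue channel falls with temperature
    return rgb

def superimpose(video_rgb, thermal, alpha=0.4):
    """Alpha-blend the false-color thermal image over the video frame."""
    overlay = false_color(thermal)
    return (1.0 - alpha) * video_rgb + alpha * overlay

video = np.full((4, 4, 3), 0.5)                   # grey placeholder frame
thermal = np.linspace(20, 60, 16).reshape(4, 4)   # synthetic temperature map
blended = superimpose(video, thermal)
```

  A real system would run this per frame on the registered camera pair; here the cool corner of `blended` leans blue and the hot corner leans red.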
  • the field of view of the thermal camera substantially overlaps the field of view of the video camera.
  • the apparatus for monitoring a thermal surgical procedure further includes an optical element which selectively directs light from the surgical field in the infrared spectrum to the thermal camera, and selectively directs light from the surgical field in the visible spectrum to the video camera.
  • at least one of the thermal camera and the video camera include an autofocus.
  • the apparatus for monitoring a thermal surgical procedure further includes a servo unit for directing the thermal camera to monitor a selected portion of the surgical field.
  • the apparatus for monitoring a thermal surgical procedure further includes an indicator which illuminates the selected portion of the surgical field, where the servo unit comprises a tracking unit adapted to track the illuminated portion and direct the thermal camera to monitor the selected portion.
  • the processor determines temperature information about the portion of the surgical field based on the thermal images.
  • the processor is coupled to a laser surgical device and is configured to control the operation of the laser surgical device based on said temperature information.
  • the processor is configured to process the thermal images and determine information indicative of temperature at the plurality of locations, compare the information indicative of temperature at the plurality of locations to a selected master threshold temperature, and produce a master alarm if the temperature at any of the plurality of locations exceeds the selected master threshold temperature.
  • the processor is coupled to a laser surgical device and is configured to control the operation of the laser surgical device in response to the master alarm.
  • the processor is configured to process the thermal images and determine information indicative of temperature at a subset of the plurality of locations, compare the information indicative of temperature at the subset of plurality of locations to a selected range of temperature, and produce a range alarm if the temperature at any of the subset of the plurality of locations falls outside the selected range.
  • the processor is coupled to a laser surgical device and is configured to control the operation of the laser surgical device in response to the range alarm.
  • the processor is configured to process the thermal images and determine information indicative of a rate of change of temperature at the plurality of locations, compare the information indicative of temperature at the plurality of locations to a selected secondary threshold temperature, compare the information indicative of the rate of change of temperature at the plurality of locations to a selected rate threshold, and produce a rate alarm if, at any one of the plurality of locations, the temperature exceeds the secondary threshold and the rate of change of temperature exceeds the rate threshold.
  • the processor is coupled to a laser surgical device and is configured to control the operation of the laser surgical device in response to the rate alarm.
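  The three alarm conditions described above (master, range, and rate) reduce to simple comparisons over the per-location temperature data. A minimal sketch, with all function names and threshold values illustrative rather than from the patent:

```python
def master_alarm(temps, master_threshold):
    """Master alarm: fires if any monitored location exceeds the threshold."""
    return any(t > master_threshold for t in temps)

def range_alarm(subset_temps, low, high):
    """Range alarm: fires if any location in the subset leaves [low, high]."""
    return any(t < low or t > high for t in subset_temps)

def rate_alarm(temps, rates, secondary_threshold, rate_threshold):
    """Rate alarm: fires only where the temperature exceeds the secondary
    threshold AND its rate of change exceeds the rate threshold."""
    return any(t > secondary_threshold and r > rate_threshold
               for t, r in zip(temps, rates))
```

  Any of these booleans could then gate the laser surgical device, e.g. pausing emission while an alarm is asserted.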
  • the display comprises at least one chosen from the group consisting of: a video monitor, a flat panel monitor, a heads up display, and a virtual reality display.
  • a method for monitoring a thermal surgical procedure including using a thermal camera to monitor temperature at a plurality of locations within at least a portion of a surgical field undergoing thermal surgical treatment and generating a series of thermal images based on the monitoring, and displaying, in real time, a series of display images indicative of temperature at the plurality of locations based on the series of thermal images.
  • displaying, in real time, a series of display images indicative of temperature at the plurality of locations includes displaying false color images.
  • the method further includes using a video camera to monitor at least a portion of the surgical field and displaying, in real time, video images of the portion of the surgical field. In some embodiments, the method further includes processing the thermal images and the video images to superimpose the thermal images with the video images and displaying display images corresponding to the superimposed images.
  • the method further includes determining temperature information about the portion of the surgical field based on the thermal images. In some embodiments, the method further includes controlling the operation of a laser surgical device based on the temperature information.
  • the method further includes processing the thermal images and determining information indicative of temperature at the plurality of locations, comparing the information indicative of temperature at the plurality of locations to a selected master threshold temperature, and producing a master alarm if the temperature at any of the plurality of locations exceeds the selected master threshold temperature.
  • the method further includes processing the thermal images and determining information indicative of temperature at a subset of the plurality of locations, comparing the information indicative of temperature at the subset of plurality of locations to a selected range of temperature, and producing a range alarm if the temperature at any of the subset of the plurality of locations falls outside the selected range.
  • the method further includes processing the thermal images and determining information indicative of a rate of change of temperature at the plurality of locations, comparing the information indicative of temperature at the plurality of locations to a selected secondary threshold temperature, comparing the information indicative of the rate of change of temperature at the plurality of locations to a selected rate threshold, and producing a rate alarm if, at any one of the plurality of locations, the temperature exceeds the secondary threshold and the rate of change of temperature exceeds the rate threshold.
  • the method further includes providing an indicator identifying an area of interest within the surgical field, tracking the position of the indicator, and adjusting the thermal camera to monitor the area of interest.
  • a method including: using a video camera to generate a video stream of a surgical field including an area of skin of a patient; inserting a surgical probe into the patient such that at least a distal end of the probe is located in a treatment region under the area of skin; emitting tracking light from the distal end of the probe to illuminate a portion of the area of skin overlaying the distal end of the probe; processing the video stream to determine tracking information indicative of the position of the distal end of the probe based on the illumination of the portion of the area of skin; displaying the video stream; and based on the tracking information, displaying an indicia of the position of the distal end of the probe overlaying the display of the video stream.
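  One way to realize the "processing the video stream to determine tracking information" step above is to threshold each frame for the bright transilluminated spot over the probe tip and take its centroid. This is a hedged sketch, not the patent's stated implementation; the threshold value and names are assumptions.

```python
import numpy as np

def track_spot(frame, threshold=0.8):
    """Locate the skin area illuminated by the tracking light.

    `frame` is a 2-D brightness image; pixels above `threshold` are taken
    as the transilluminated spot, and their centroid is returned as
    (row, col). Returns None when no pixel qualifies, i.e. tracking
    lock is lost (for instance, the spot is obstructed).
    """
    ys, xs = np.nonzero(frame > threshold)
    if ys.size == 0:
        return None
    return float(ys.mean()), float(xs.mean())

frame = np.zeros((10, 10))
frame[4:6, 6:8] = 1.0   # simulated glow over the probe tip
```

  For this synthetic frame, `track_spot(frame)` returns the centroid (4.5, 6.5), which the display layer would convert to an on-screen indicium of the probe's distal end.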
  • the surgical probe includes an inertial sensor, and the method further includes using the inertial sensor to determine position information indicative of the position or movement of the probe.
  • the distal end of the probe includes a temperature sensor, and the method further includes: moving the distal end of the probe to a plurality of locations in the treatment region; using the temperature sensor to determine temperature information indicative of the temperature at each of the locations; and overlaying the display of the video stream with graphical information indicative of the temperature of the locations based on the tracking information and the temperature information.
  • Some embodiments include, for each of the plurality of locations: determining the position of the illuminated portion of the area of skin overlaying the probe; determining the temperature at the location of the distal end of the probe; and displaying a graphical element overlaying the display of the video stream for a selected duration at a display location corresponding to the position of the illuminated portion of the area of skin, where a property of the graphical element is indicative of the temperature of the probe at the location.
  • the property of the graphical element includes at least one from the list consisting of: color, shape, size, brightness, grey scale, blink rate, and an alphanumeric indicator.
  • displaying a graphical element overlaying the display of the video stream for a selected duration includes causing the graphical element to fade away at a selected rate.
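  The fade-away behavior described above can be sketched as a simple opacity ramp over the element's age. The linear profile and the 3-second duration are illustrative assumptions, not values from the patent.

```python
def element_alpha(age_s, duration_s=3.0):
    """Opacity of a temperature marker that fades linearly from fully
    opaque (1.0) to fully transparent (0.0) over `duration_s` seconds."""
    return max(0.0, 1.0 - age_s / duration_s)
```

  The display loop would evaluate this each frame per marker, so recently visited locations appear bright while older ones fade, leaving a trail of the probe's recent path.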
  • the temperature sensor includes a thermistor.
  • the surgical probe includes an inertial sensor, and the method further includes using the inertial sensor to determine position information indicative of the position or movement of the probe. For each of the plurality of locations, a property of the respective graphical element is determined based on the position information.
  • the position information includes information indicative of a depth under the skin of the distal end of the probe at the location. In some embodiments, the position information includes information indicative of a dwell time of the distal end of the probe at the location. In some embodiments, the position information includes information indicative of a speed of motion of the distal end of the probe at the location.
  • the surgical probe includes an inertial sensor, and the method further includes: using the inertial sensor to determine position information indicative of the position or movement of the probe; providing an obstruction blocking the video camera from viewing the illumination of the portion of the area of skin by the tracking light, thereby causing an interruption of the determination of the tracking information; and during the interruption, providing substitute tracking information based on the position information determined by the inertial sensor.
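  The substitution of inertial data during an optical interruption can be sketched as a small fusion loop: while the camera fix is available it resets the state, and while the view is blocked the accelerometer signal is integrated (dead reckoning). The class name, the 2-D simplification, and the reset-on-lock policy are all illustrative assumptions.

```python
class FusedTracker:
    """Fall back to inertial dead reckoning whenever the optical (video)
    fix of the probe tip is lost, e.g. when the camera's view of the
    tracking light is obstructed."""

    def __init__(self, position):
        self.position = position          # (x, y) estimate
        self.velocity = (0.0, 0.0)

    def update(self, optical_fix, inertial_accel, dt):
        if optical_fix is not None:
            # Optical tracking locked: trust the camera and reset drift.
            self.position = optical_fix
            self.velocity = (0.0, 0.0)
        else:
            # Interruption: integrate acceleration to substitute a position.
            vx = self.velocity[0] + inertial_accel[0] * dt
            vy = self.velocity[1] + inertial_accel[1] * dt
            self.position = (self.position[0] + vx * dt,
                             self.position[1] + vy * dt)
            self.velocity = (vx, vy)
        return self.position
```

  Integrated inertial estimates drift over time, which is one reason the optical fix remains the primary source and the inertial path serves only to bridge interruptions.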
  • the tracking information includes information indicative of the depth of the distal end of the probe beneath the skin.
  • the step of emitting tracking light includes emitting a first portion of tracking light at a first wavelength and a second portion of tracking light at a second wavelength; and the step of processing the video stream to determine tracking information indicative of the position of the distal end of the probe based on the illumination of the portion of the area of skin includes comparing an intensity of the illumination at the first wavelength to an intensity of the illumination at the second wavelength.
  • the first wavelength is in the red portion of the visible spectrum and the second wavelength is in the green portion of the visible spectrum.
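  If each tracking wavelength is assumed to follow Beer-Lambert attenuation through tissue, and green light is attenuated more strongly than red (as is typical in skin), then the red-to-green intensity ratio observed at the surface grows with probe depth. This gives one plausible reading of the two-wavelength comparison above; the attenuation coefficients below are placeholders, not measured values.

```python
import math

# Illustrative effective attenuation coefficients (per cm) for skin;
# real values depend on tissue composition and are not from the patent.
MU_GREEN = 10.0
MU_RED = 2.0

def depth_from_ratio(i_red, i_green, i0_red=1.0, i0_green=1.0):
    """Estimate probe depth (cm) from surface intensities of the two
    tracking wavelengths, assuming I = I0 * exp(-mu * d) per wavelength.

    Solving the two Beer-Lambert equations for d gives
    d = ln((I_red/I0_red) / (I_green/I0_green)) / (mu_green - mu_red).
    """
    ratio = (i_red / i0_red) / (i_green / i0_green)
    return math.log(ratio) / (MU_GREEN - MU_RED)
```

  Feeding in intensities generated from a known depth recovers that depth exactly under these idealized assumptions; real tissue scattering would require calibration.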
  • the surgical probe includes an optical delivery device, and the method further includes using the optical delivery device to deliver therapeutic light from a source to the treatment region.
  • the therapeutic light includes laser light. In some embodiments, the therapeutic light includes light having a wavelength in the visible or infrared. In some embodiments, the therapeutic light has a wavelength of about 1064 nm, 1320 nm, or 1440 nm. In some embodiments, the delivered therapeutic light has a total power in the range of 1 W to 20 W. In some embodiments, the delivered therapeutic light has a power density in the range of 200 W/cm² to 20,000 W/cm², or any sub range therein, at the target region.
  • the step of delivering therapeutic light from the light emitting portion of the delivery device includes delivering a series of light pulses.
  • the series of pulses includes a pulse having a duration in the range of about 0.1 ms to about 1.0 ms.
  • the series of pulses has a repetition rate in the range of about 10 to about 100 Hz.
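  With the pulse durations (about 0.1 ms to about 1.0 ms) and repetition rates (about 10 to about 100 Hz) cited above, per-pulse energy and duty cycle follow directly from E = P * t; this sketch uses an illustrative 10 W peak power, which is not a value stated in the patent.

```python
def pulse_energy_mj(peak_power_w, pulse_ms):
    """Per-pulse energy in millijoules: E = P * t (W times ms gives mJ)."""
    return peak_power_w * pulse_ms

def duty_cycle(pulse_ms, rep_hz):
    """Fraction of time the source is emitting."""
    return pulse_ms * 1e-3 * rep_hz
```

  For example, a 1.0 ms pulse at an assumed 10 W peak power carries 10 mJ, and at 50 Hz the source is on only 5% of the time, so average delivered power stays well below the peak.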
  • the surgical probe includes at least one sensor, and the method further includes: using the at least one sensor, generating a signal indicative of at least one property of the delivery device or the treatment region; and controlling the delivery of therapeutic light based on the sensor signal.
  • the property of the delivery device or the target region includes at least one selected from the list consisting of: a position of the optical delivery device, a movement of the optical delivery device, temperature of the optical delivery device, a tissue type in the vicinity of the optical delivery device, an amount of energy delivered by the optical delivery device, and a temperature of tissue in the target region.
  • the sensor includes at least one selected from the list consisting of: a thermistor, an inertial sensor, an accelerometer, a gyroscope, and a color sensor.
  • Some embodiments include generating a display based on the signal indicative of at least one property of the delivery device or the target region.
  • the surgical probe includes a cannula and the optical delivery device includes an optical fiber having at least a portion located in or proximate to the cannula.
  • an apparatus including: a video camera configured to generate a video stream of a surgical field including an area of skin of a patient; a surgical probe configured for insertion into the patient such that at least a distal end of the probe is located in a treatment region under the area of skin; an optical emitter configured to emit tracking light from the distal end of the probe to illuminate a portion of the area of skin overlaying the distal end of the probe; a processor in communication with the camera and including a visual tracking module which processes the video stream to determine tracking information indicative of the position of the distal end of the probe based on the illumination of the portion of the area of skin; and a display in communication with the processor and the camera, the display configured to display the video stream and, based on the tracking information, display an indicia of the position of the distal end of the probe overlaying the display of the video stream.
  • the surgical probe includes an inertial sensor in communication with the processor, and the processor includes an inertial tracking module which processes a signal from the inertial sensor to determine position information indicative of the position or movement of the probe.
  • the distal end of the probe includes a temperature sensor in communication with the processor, and the processor is configured to process a signal from the temperature sensor to determine temperature information indicative of the temperature at each of a plurality of locations traversed by the probe.
  • the processor includes an image generation module which forms a graphical representation indicative of the temperature of the locations based on the tracking information and the temperature information; and a video mixer which overlays the display of the video stream with the graphical representations.
  • the processor includes: the visual tracking module, configured to process the video stream to determine the position of the illuminated portion of the area of skin overlaying the probe; and a temperature sensing module configured to determine, based on the signal from the temperature sensor, the temperature at a plurality of locations in the treatment region.
  • the image generation module and video mixing module cooperate to generate a respective graphical element overlaying the display of the video stream for a selected duration at a location corresponding to the position of the illuminated portion of the area of skin, where a property of the graphical element is indicative of the temperature of the probe at the location.
  • the property of the graphical element includes at least one from the list consisting of: color, shape, size, brightness, grey scale, blink rate, and an alphanumeric indicator.
  • the image generation module includes a persistence module which causes each graphical element to become increasingly transparent at a selected rate.
  • the temperature sensor includes a thermistor.
  • the surgical probe includes an inertial sensor
  • the processor includes an inertial tracking module configured to use the inertial sensor to determine position information indicative of the position or movement of the probe, and for each of the plurality of locations, determine a property of the respective graphical element based on the position information.
  • the position information includes information indicative of a depth under the skin of the distal end of the probe.
  • the position information includes information indicative of a dwell time of the distal end of the probe.
  • the position information includes information indicative of a speed of motion of the distal end of the probe.
  • the surgical probe includes an inertial sensor
  • the processor includes: an inertial tracking module configured to use the inertial sensor to determine inertial position information indicative of the position or movement of the probe; and a visual tracking lock indicator module configured to process the video stream to identify an interruption in tracking by the visual tracking module; and upon identifying the interruption, substitute inertial position information for tracking information from the visual tracking module.
  • the tracking information includes information indicative of the depth of the distal end of the probe beneath the skin.
  • the optical emitter emits a first portion of tracking light at a first wavelength and a second portion of tracking light at a second wavelength; and the visual tracking module processes the video stream to determine the information indicative of the depth of the distal end of the probe beneath the skin by comparing a property of the illumination at the first wavelength to a property of the illumination at the second wavelength.
  • the first wavelength is in the red portion of the visible spectrum and the second wavelength is in the green portion of the visible spectrum.
  • the surgical probe includes an optical delivery device configured to deliver therapeutic light from a therapeutic light source to the treatment region.
  • the surgical probe includes at least one sensor in communication with the processor, and the at least one sensor is configured to generate a signal indicative of at least one property of the delivery device or the treatment region; and the processor is configured to control the delivery of therapeutic light to the treatment region based on the sensor signal.
  • the property of the delivery device or the target region includes at least one selected from the list consisting of: a position of the optical delivery device, a movement of the optical delivery device, temperature of the optical delivery device, a tissue type in the vicinity of the optical delivery device, an amount of energy delivered by the optical delivery device, and a temperature of tissue in the target region.
  • the sensor includes at least one selected from the list consisting of: a thermistor, an inertial sensor, an accelerometer, a gyroscope, and a color sensor.
  • the processor and the display are configured to cooperate to generate a display based on the signal indicative of at least one property of the delivery device or the target region.
  • Various embodiments may include any of the features or techniques described herein, either alone or in any suitable combination.
  • FIG. 1 shows a thermal surgical monitoring system
  • FIG. 2 shows a thermal surgical monitoring system with a surgical theatre thermal camera and laser treatment instrument mounted to an adjustable pole.
  • FIG. 2A shows a thermal surgical monitoring system with a surgical theatre thermal camera and laser treatment instrument mounted to a portable station.
  • FIG. 3 shows a display of a thermal image taken with a surgical theatre thermal camera superimposed onto a display of a video image of a treatment site taken with a video camera.
  • FIG. 3 A shows a block diagram of a thermal surgical monitoring system with feedback control to a thermal surgical device.
  • FIGs. 4 and 4A show a thermal surgical monitoring system.
  • FIGs. 5A-5E show a surgical field monitored by the system of FIGs. 4 and 4A.
  • FIG. 6 is a schematic of a processor of the system of FIGs. 4 and 4A.
  • FIG. 1 shows a thermal surgical monitoring system 100 .
  • a surgeon performs a thermal surgical procedure within surgical field 101.
  • laser light generated by a laser 125 is directed to through handpiece 126 to tissue within surgical field 101.
  • Handpiece 126 includes a cannula for insertion into the patient through an incision located in surgical field 101.
  • the cannula contains a waveguide, e.g. an optical fiber, which delivers laser light from laser 125 to tissue to be treated.
  • laser 125 may be used, for example, to treat tissues within the dermis to cause skin tightening, for laser lipolysis, or for other such applications.
  • Thermal camera 105 (e.g. an infrared camera) monitors surface temperatures within surgical field 101.
  • An exemplary suitable thermal camera is a high resolution infrared (IR) camera such as the ThermaCam45 Infrared Camera, manufactured by FLIR, Wilsonville, OR, which produces both good resolution and a wide field of view.
  • Display 115 (e.g. an LCD screen) is located within convenient view of a surgeon 120 such that real time surface temperature information, in the form of, for example, a false color isotherm map shown on display 115, is available to the surgeon during surgery.
  • the surgeon can use the real time surface temperature information to control a deposition of energy within surgical field 101 in a tissue or tissues to be treated, such that, for example, a desired temperature rise in the treated tissue or tissues is achieved.
  • the thermal camera 105 includes an auto focus 106.
  • thermal camera 105 includes a manual focus control, e.g. a zoom lever. Such a control could be covered with a sterile bag 130.
  • thermal camera 105 has a preset depth of field that would include the full surgical working range, reducing or eliminating the need for an auto focus feature 106.
  • the mounting location for thermal camera 105 would be over a surgical arena adjacent to or part of a set of ceiling mounted surgical lights.
  • the camera may be light enough to mount to existing operating room (OR) lighting booms.
  • a ceiling mounted camera boom places the thermal camera 105 advantageously close to a surgical field 101, yet out of the way of the surgeon.
  • the display 115 can also be ceiling boom mounted. In some embodiments, the display 115 may be mounted on top of treatment laser 125.
  • a wide angle or standard lens may be used to provide a suitable view for typical surgical applications, roughly framing a human abdomen, for example, at a working distance of 3-6 feet.
  • a temperature measurement accuracy may be approximately ⁇ 2°C, with a temperature drift over a 1 hour period of ⁇ 2°C.
  • Some embodiments may feature a calibration shutter which operates, for example, about every minute. Some embodiments may feature less frequent calibration, more suitable for the particular application at hand.
  • system 100 provides real time surface temperature information which can be used as "feedback" by the surgeon performing the treatment procedure.
  • the treatment can be improved or optimized for safety and efficacy. In some embodiments, the surgeon manually adjusts his actions based on the displayed temperature information (e.g. by adjusting applied dosage).
  • temperature information measured by system 100 may be processed and used to provide alerts to the surgeon and/or automatically control laser 125 (e.g. to shut off the laser system in the event of the appearance of a high temperature "hot spot" within surgical field 101).
  • FIG. 2 shows an embodiment of system 100 where thermal camera 105 is mounted on an adjustable pole 210 connected to laser 125. Such mounting may reduce or eliminate operating room installation costs and issues concerning the surgical sterile field.
  • thermal camera 105 may be controlled (e.g., turned On/Off) directly from touchscreen display 117 on the laser 125.
  • laser 125 and system 100 are integrated such that no external wiring beyond the self contained wiring for the laser 125 is required.
  • FIG. 2A shows a thermal surgical monitoring system with a surgical theatre thermal camera and laser treatment instrument mounted to a portable station 165.
  • An autoclavable handle 170 allows for the adjustment of the thermal camera 105.
  • the thermal camera 105 itself is supported and balanced by a boom 172 and counterweight 174, respectively, that allows the thermal camera 105 to pivot about a portable station pivot point 176.
  • the bottom of the portable station 165 is configured with a support ballast 178 and a set of castors 179 to both stabilize the portable station 165 and to give it portability.
  • a display 115 is connected to the portable station 165 by a mounting bracket 180 at approximately eye level to the surgeon 120.
  • a processor 180 interfaces the thermal camera 105 with the display 115 and the laser 125.
  • FIG. 3 shows an embodiment of system 100 featuring a conventional video camera 305 (e.g. operating within the visual spectrum) in addition to thermal camera 105.
  • This arrangement permits a simultaneous display (e.g. a superposition) of a thermal image (e.g. a false color image) with a conventional video image of the surgical field.
  • thermal camera 105 is attached to ceiling mount 110 at right angles to the standard video camera 305.
  • Light from surgical field 101 is directed to thermal camera 105 and standard video camera 305 using beam splitting optic 306.
  • For example, beam splitting optic 306 may selectively direct light in the IR spectrum to thermal camera 105 while directing light in the visible spectrum to conventional camera 305.
  • the optic 306 may be made of ZnSe or Germanium, both of which have good IR transmission, and both of which can be coated to reflect the visible wavelengths. If the visible full-color spectrum is distorted due to a coating or to a substrate reflectivity, a black and white standard video image would, in some embodiments, suffice.
  • the two cameras 105, 305 have matched fields of view
  • the laser 125 irradiates a target tissue 309 in the field of view 308.
  • the video images of the two cameras may be processed by a processor 307, which may combine the images for display on display 115.
  • Processor 307 may include high speed image processing capabilities to combine and analyze a set of real time images for potential hot spots.
  • processor 307 may control laser 125 based on information determined by processing the images.
  • in some embodiments, no connection is made from the laser driver 307 to the laser 125. In that case, the laser driver provides temperature and position data to the surgeon by displaying the data on the display 115.
  • cameras 105 and 305 may be arranged in a stereoscopic version using, for example, a standard video camera (visual spectrum) 305 synchronized with and aligned immediately adjacent to the thermal camera 105.
  • both the standard video camera 305 and the thermal camera 105 have the same field of view.
  • images from video camera 305 may be displayed on display 115 superimposed with images from thermal camera 105.
  • an image from thermal camera 105 can be a transparent layer of false color superimposed on top of an image from the standard video camera 305.
  • a set of hot spot islands could be immediately linked to a corresponding anatomical treatment area without a need to use other indicia (e.g. a warm or cold cannula placed within the field of view) to identify where the hot spots are located.
  • images superimposed from thermal camera 105 onto images from the standard video camera 305 provide data to a laser driver 307 to determine laser power levels as a function of a set of treatment site surface temperatures. Controlling the laser 125 based on a set of treatment site surface temperatures may improve safety and efficacy.
  • an image from the standard video camera 305 and an image from the thermal camera 105 can be displayed side-by-side; in some applications, an image from the standard video camera 305 and an image from the thermal camera 105 can be displayed using a screen-in-screen format.
  • display 115 is a virtual reality image projection display. Such displays are commonly available and low cost.
  • the thermal data for a virtual reality image projection is a three dimensional (3D) projection superimposed onto a 3D projection of the target tissue 309.
  • display 115 is a head mounted heads-up display (HUD) used to superimpose a set of images from the thermal camera directly onto the target tissue 309 in the field of view of the surgeon.
  • a focal distance for the HUD may be set approximately to the distance from the eyes of the surgeon 120 to the target tissue 309.
  • a servo driven stereo camera assembly 310 supporting both the thermal camera 105 and the standard video camera 305, allows for automatic tracking and auto focus of the target tissue 309.
  • the stereo camera assembly may, for example, track an aim beam directed from the laser 125, and position itself such that the thermal camera is focused on the target tissue with a 10-15 cm diameter field of view 308.
  • an aim beam may be used with sufficient intensity to shine through tissue and to provide a visible indicia of illumination on the outside of the patient's skin.
  • processing of thermal data taken from thermal camera allows for numerous types of alarms, thresholds and/or automatic control of a surgical device (e.g. laser 125) based on temperature information derived from the thermal images.
  • thermal images from thermal camera 105 and, optionally, video images from video camera 305 are sent to processor 307.
  • Processor 307 processes the images to determine temperature information about the portion of the surgical field 101 within thermal camera's field of view.
  • the processor may determine, in real time, the temperature and/or change in temperature at all positions (or a subset thereof) within the field of view.
  • the processor may analyze this temperature information and provide alarms or other indications to the surgeon.
  • processor 307 is in communication with display 115, and can display information or alarm indications on the display based on the analyzed temperature information. For example, visual indications could be provided on the display highlighting "hot spot" locations within the field of view having temperatures above a selected threshold.
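The hot spot identification described above might be sketched as follows, assuming the processor has already converted a thermal frame into a per-pixel temperature map. The function name and the 48°C threshold below are illustrative assumptions, not the disclosed implementation:

```python
def find_hot_spots(temp_map, threshold_c=48.0):
    """Return (row, col) positions whose temperature exceeds threshold_c.

    temp_map is a 2D list of per-pixel temperatures in degrees C,
    e.g. derived from a calibrated thermal camera frame.
    """
    spots = []
    for r, row in enumerate(temp_map):
        for c, t in enumerate(row):
            if t > threshold_c:
                spots.append((r, c))
    return spots

# A small hypothetical temperature map with one pixel above threshold.
temp_map = [
    [36.5, 37.0, 36.8],
    [37.2, 49.1, 37.5],  # center pixel exceeds the 48 C alarm threshold
    [36.9, 37.1, 36.7],
]
hot = find_hot_spots(temp_map)
```

In a real system the positions returned could drive both the display overlay and the alarm logic.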
  • Processor 307 may also be in communication with separate alarm unit 312.
  • Alarm unit 312 may provide a visual alarm indication (e.g. a flashing light), an audible alarm indication (e.g. a buzzer or tone generator), or any other suitable indication.
  • Processor 307 may also control laser 125 based on the analyzed temperature information.
  • the processor may be programmed to shut down the laser if one or more hot spots are detected.
  • the processor may, in general, control any aspect of the laser (e.g. wavelength, pulse rate, intensity, etc.) based on the analyzed temperature data.
  • processor 307 may receive other inputs from, e.g. acceleration, speed, or position sensors located in handpiece 126, thermal sensors located in handpiece 126, tissue type sensors located in handpiece 126, etc.
  • Processor 307 may analyze information from these sensor inputs in conjunction with the temperature information from the thermal camera and display information, provide alarms, and/or control laser 125 based on the analysis.
  • the laser 125 may be directed by the processor 307 to automatically self limit the laser power applied to the target tissue 309.
  • the laser driver 307 automatically adjusts the laser 125 power to achieve a selected surface sector temperature within the field of view 308.
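The automatic power adjustment described above could, for example, take the form of a simple proportional control step toward the selected sector temperature. The names, gain, and power limits below are hypothetical placeholders rather than the disclosed laser driver logic:

```python
def adjust_power(current_power_w, measured_c, target_c,
                 gain_w_per_c=0.5, max_power_w=20.0):
    """One proportional control step toward the target surface temperature.

    Raises power when the sector is below target, lowers it when above,
    and clamps the result to the laser's allowed output range.
    """
    error = target_c - measured_c
    new_power = current_power_w + gain_w_per_c * error
    return min(max(new_power, 0.0), max_power_w)

# Sector reads 40 C against a 42 C target: power is stepped up.
p = adjust_power(current_power_w=10.0, measured_c=40.0, target_c=42.0)
```

A production controller would likely add integral/derivative terms and rate limiting, but the clamped proportional step captures the basic closed-loop idea.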
  • the surgeon 120 may thereby select a treatment area by, for example, aiming a treatment beam or simply placing a surgical waveguide into the selected treatment sector and stepping on a footswitch.
  • the system may also be optionally placed in a manual mode where the surgeon 120 is in direct control of the laser 125.
  • Some embodiments may feature a master alarm, where an alarm occurs when any point within the thermal camera field of view exceeds a threshold, for example 48°C.
  • the alarm is not user selectable (i.e. hardwired). It is possible that the threshold is exceeded as a result of a hot cannula 207, which may be a result of a surgical waveguide 208 slipping in the cannula 207 as the surgical waveguide is plunged in and out of the tissue.
  • a false alarm resulting from a hot cannula 207 may be reduced or eliminated by weighting the points within the thermal camera field of view, allowing for localized hot spots that do not set off the alarm.
  • Some embodiments feature a "fast alarm", where an alarm occurs when any point within the thermal camera field of view exceeds a threshold, for example 45°C, and a rapid rate of change of temperature (ΔT) is detected within the thermal camera field of view.
  • the fast alarm is not user selectable.
  • ⁇ T is approximately 0.2 0 C /min.; ⁇ T on the order of 0.2 0 C /min. may detect a leakage of a set of warm fluids from an incision site.
  • a ⁇ T on the order of 0.2 0 C /min. may give an indirect indication of an excessive internal deep tissue temperature, as a deep tissue temperature may appear in a high resolution IR video image rapidly.
  • a more conservative threshold is chosen for the fast alarm than for the master alarm.
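The fast alarm condition described above combines an absolute threshold with a rate-of-change test, and might be sketched as follows. The function name and parameter defaults are illustrative assumptions based on the example values in the text:

```python
def fast_alarm(temp_c, prev_temp_c, dt_min,
               threshold_c=45.0, rate_c_per_min=0.2):
    """Trip when a point both exceeds the absolute threshold AND is
    rising at or above the configured rate (deg C per minute)."""
    rate = (temp_c - prev_temp_c) / dt_min
    return temp_c > threshold_c and rate >= rate_c_per_min

# 45.5 C and rising 0.5 C/min: both conditions met, alarm trips.
tripped = fast_alarm(temp_c=45.5, prev_temp_c=45.0, dt_min=1.0)
```

Requiring both conditions is what lets the fast alarm use a more conservative temperature threshold than the master alarm without tripping on slow, benign warming.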
  • Some embodiments feature a local treatment zone High/Low alarm, where the treatment zone High/Low alarm thresholds are adjustable.
  • the treatment zone High/Low alarm thresholds may be set to 39°C and 42°C, respectively.
  • the treatment zone High/Low alarm is accompanied by a soft audible tone.
  • Two ways of selecting data for determination of treatment zone thresholds include: selecting the entire thermal camera field of view and selecting a subset of the field of view, e.g. a 10 cm diameter area proximate to an aim beam, or other indicator indicative of the area targeted for treatment.
  • the dynamic range of thermal camera 105 may be adjusted to correspond to an expected temperature range for a given surgical application.
  • the display 115 displays a treatment area within the surgical field 101 and the surgical surroundings.
  • a temperature range between the surgical surroundings and an expected maximum safe tissue temperature in the treatment area may be 10°C-80°C.
  • a typical laser lipolysis treatment may require a range of approximately 20°C-45°C for some applications.
  • a software feature set of the thermal camera may be limited. For example, in some embodiments the camera may not provide 'ON/OFF' controls, may default to display a false color image, and may have only one to three easily set temperature ranges.
  • Referring to FIG. 4, in one embodiment, a thermal surgical monitoring system is provided.
  • a conventional camera 405 (e.g. operating within the visual spectrum) is positioned to view a surgical field 407 to produce a video stream (e.g. a sequence of images of the surgical field).
  • camera 405 may be a digital camera (e.g. a CCD camera).
  • camera 405 is a digital color CCD camera which produces an RGB video stream familiar in the art.
  • the video camera 405 communicates with a processor 410 to provide the video stream for processing, as described in detail below.
  • the surgical field includes an area of skin 408 of a patient.
  • a surgical probe 415 is inserted through an incision 409 in the skin such that a distal end of the probe is located below the skin in a treatment region 411.
  • the probe 415 includes an optical delivery device (as shown, an optical fiber 416).
  • the optical fiber 416 is coupled to a therapeutic light source 420 (e.g. a laser) and can transmit therapeutic light from the therapeutic source 420 to the treatment region 411 to effect treatment (e.g. heating of fatty tissue to induce lipolysis, thermal ablation, etc.).
  • the surgical probe 415 may include one or more cannulas 412 (one is shown), which may optionally house the optical fiber 416.
  • the cannula 412 may be a suction cannula to which a vacuum may be applied, e.g. to remove the byproducts produced by application of the therapeutic light to tissue in the target region.
  • the optical fiber is also coupled to a tracking light source 425 (e.g. a laser light source) to transmit visible tracking light which is emitted from the distal end of the probe 415 in the treatment region 411.
  • the visible tracking light is transmitted and/or scattered through the treatment region to illuminate an area 417 of the skin 408 which overlays the distal end of the probe.
  • tracking light source is a laser diode emitting light in the red portion of the visible spectrum (e.g. at 808 nm).
  • tracking light source may be incorporated in the probe 415, for example, as an element (e.g. an LED) positioned on the distal end of the probe to emit light directly into the treatment region 411.
  • the illuminated area 417 appears in the video stream from the camera 405 as an externally visible spot on the skin 408 in the surgical field 407 (e.g. the exemplary field shown in FIG. 5A).
  • the processor 410 processes the video stream to track the location of the illuminated area 417 to provide tracking information indicative of the position of the distal end of the probe 415 under the patient's skin. Tracking information obtained in this fashion may be described as "visual" tracking information.
  • the tracking light may be modulated (e.g. pulsed at a known frequency) to aid in identification of the illuminated area 417 in the video stream (e.g. using lock-in detection processing techniques known in the art).
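Lock-in style detection of a pulsed tracking spot might work by correlating each pixel's brightness over a sequence of frames with a reference pattern at the pulse frequency: a spot blinking in step with the reference scores high, while steady background averages toward zero. This toy sketch assumes frames are small grayscale arrays; all names are hypothetical:

```python
def lockin_score(frames, ref):
    """Per-pixel correlation of brightness with a +/-1 reference pattern.

    frames: list of 2D brightness arrays (one per video frame).
    ref: list of +1/-1 values, one per frame, at the pulse frequency.
    """
    rows, cols = len(frames[0]), len(frames[0][0])
    n = len(frames)
    score = [[0.0] * cols for _ in range(rows)]
    for k in range(n):
        for r in range(rows):
            for c in range(cols):
                score[r][c] += ref[k] * frames[k][r][c] / n
    return score

# Pixel (0,1) blinks 100/0 in step with the reference; pixel (0,0)
# is steady background at brightness 50.
ref = [1, -1, 1, -1]
frames = [
    [[50, 100]],
    [[50, 0]],
    [[50, 100]],
    [[50, 0]],
]
score = lockin_score(frames, ref)
```

The modulated spot then stands out in the score map even when its raw brightness is comparable to ambient lighting.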
  • the surgical probe 415 includes a thermal sensor 435 which is in communication with processor 410 to provide a temperature signal indicative of the temperature of the treatment region surrounding the distal end of the probe 415.
  • thermal sensor 435 may be a thermistor mounted on the distal end of the probe 415 which samples the temperature of its environment, e.g. continuously or at regular intervals.
  • the thermal sensor 435 may include a thermocouple, a pyrometer, or an infrared (IR) thermal sensor.
  • the thermal sensor 435 exhibits a finite reaction rate in response to changes in temperature.
  • the response rate of the sensor may be characterized by a time constant.
  • Typical sensors known in the art operate with a variety of response time constants ranging from 1 ms or less to a second or more, e.g., about 250 ms.
  • a display 440 is in communication with processor 410 and camera 405.
  • the display 440 allows a user to view, in real time, video images from the video stream of the surgical field 407 showing the skin surface area 408 overlaying the internal treatment region 411.
  • An exemplary display view of the surgical field 407 is shown in FIG. 5C.
  • the display 440 shows grayscale video images of the surgical field, but in other embodiments a color, false color, or other video display may be used.
  • processor 410 processes the video stream to determine tracking information indicative of the position of the spot area 417 on skin surface 408 illuminated by the tracking light from the distal end of the surgical probe 415. Using this information, the processor superimposes a graphical tracking indicia (as shown, a cross hair, although any other suitable indicia may be used) on the display of the skin surface 408 indicating the position on the surface overlaying the distal end of the probe 415. As the probe 415 is moved throughout the treatment region 411, the cross hair indicia is repositioned, providing the practitioner real time visual tracking of the probe end.
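One simple way the visual tracking module might locate the illuminated spot for crosshair placement is a brightness-weighted centroid over pixels above a cutoff. The function name and cutoff value below are illustrative assumptions:

```python
def spot_centroid(frame, min_brightness=200):
    """Brightness-weighted centroid of pixels above a cutoff.

    frame: 2D grayscale brightness array.
    Returns a (row, col) subpixel position, or None when no spot
    is visible (e.g. the camera view is obstructed).
    """
    total = wr = wc = 0.0
    for r, row in enumerate(frame):
        for c, v in enumerate(row):
            if v >= min_brightness:
                total += v
                wr += r * v
                wc += c * v
    if total == 0:
        return None
    return (wr / total, wc / total)

# A small frame with a two-pixel bright spot on row 1.
frame = [
    [0,   0,   0, 0],
    [0, 250, 250, 0],
    [0,   0,   0, 0],
]
pos = spot_centroid(frame)
```

The returned position can be mapped to display coordinates to reposition the crosshair indicia each frame; a None result is one way the system could detect loss of visual tracking.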
  • temperature sensor 435 on the probe samples the temperature at several locations.
  • This temperature information is communicated to the processor 410 and combined with the tracking information to determine positions on the skin surface corresponding to the underlying temperature sample locations.
  • the processor 410 uses this information to generate graphical elements 501 (as shown, dots) superimposed on the video display at these positions.
  • a graphical property of the dots is indicative of the temperature of the corresponding internal location.
  • other graphical properties may be used to represent temperature including: color, shape, size, brightness, grey scale, blink rate, an alphanumeric indicator, etc.
  • the display further includes an alphanumeric indication 503 corresponding to the temperature measured at the position currently occupied by the cross hair indicia.
  • a linear track 505 of false color dots 501 representing the measured internal temperatures along the path is generated and superimposed onto the gray scale background.
  • a false color temperature map 507 of a portion of underlying treatment region is generated over the display of the surgical field. This map may be generated in a diagnostic mode, during which no therapeutic light is delivered by the surgical probe 415. Alternatively, the map may be generated in a treatment mode during a procedure in which therapeutic light is applied, thereby generating a visual representation of the distribution of internal heating effected during the procedure.
  • each color dot persists for a selected time (e.g. a few seconds) and then is removed or, in some embodiments, slowly fades to transparent (as shown in FIG. 5E). If the location of the removed or faded dot is traversed again by the distal end of the surgical probe 415, a new color dot is generated with new temperature information.
  • In the treatment mode, when the measurement is made, the temperature is accurate for that location and point in time, but as the surgical probe moves away to deliver therapeutic light to other locations, the temperature typically drops as heat dissipates. Accordingly, the fading persistence of the dots removes older measurements from the temperature map as they become less accurate.
  • the color map can fade on its own until a new map is made or the existing map is cleared by operator command.
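The fading persistence behavior might be sketched as a dot whose opacity decays toward transparent on each display update. The class name, tick interface, and fade rate below are hypothetical:

```python
class FadingDot:
    """A temperature marker whose opacity decays toward transparent,
    so stale measurements gradually drop out of the displayed map."""

    def __init__(self, temp_c, opacity=1.0):
        self.temp_c = temp_c
        self.opacity = opacity

    def tick(self, fade_per_tick=0.25):
        """Advance one display update; opacity never goes below zero."""
        self.opacity = max(0.0, self.opacity - fade_per_tick)

    def visible(self):
        return self.opacity > 0.0

# A 43 C dot fades over successive display updates.
dot = FadingDot(temp_c=43.0)
for _ in range(3):
    dot.tick()
```

If the probe traverses the same location again, the system would simply create a fresh fully opaque dot carrying the new temperature reading.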
  • the temperature map shown in FIG. 5D is two dimensional only (depth may be subjectively known and controlled by the physician, e.g. by manually limiting the movement of the distal end of the probe 415 to a chosen operating plane). However, in other embodiments, information indicative of the depth of the distal end of the probe 415 may be generated (e.g. using an inertial sensor or other sensing techniques) and represented in the temperature map (e.g. by indicating the depth of a given measurement location by the size of the corresponding color dot).
  • the depth of the temperature measurement can be determined by using a dual color tracking light.
  • tracking light source 425 may emit both red and green light which is combined in the fiber 416 and delivered to the treatment region 411 to be scattered to illuminate skin surface area 417.
  • the intensity of each color component of the light from illuminated area 417 may be detected in the red and green channels, respectively, of the RGB CCD camera 405 (leaving the blue channel to produce a grayscale image of the surgical field 407 for viewing on display 440).
  • the red and green wavelengths have different absorption and scattering coefficients in typical tissue structures including fat, dermis and epidermal melanin. Accordingly, each color of tracking light will experience different effects as it passes through treatment region 411 to illuminate area 417.
  • the ratio of the green versus red scattered spot intensity and the ratio of the green versus red scattered spot diameters can be used to indicate the depth of the distal end of probe 415.
  • the ratios could be stored in processor 410 as entries in lookup tables used to convert these ratios to corresponding depths.
  • the lookup tables may be pre-calculated in advance, e.g. based on Monte Carlo modeling of a range of probe depths in various skin types. Alternatively, the tables may be determined based on empirical data.
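The ratio-to-depth conversion might be sketched as a simple lookup, exploiting the fact that tissue attenuates green light more strongly than red, so a deeper probe yields a lower green/red intensity ratio at the surface. The table values below are invented placeholders; real entries would come from Monte Carlo modeling or empirical calibration as described above:

```python
# Hypothetical calibration table: (max green/red intensity ratio, depth mm).
# Entries are ordered by increasing ratio (shallower probes last).
DEPTH_TABLE = [
    (0.10, 15.0),
    (0.30, 10.0),
    (0.60, 5.0),
    (1.00, 2.0),
]

def depth_from_ratio(green_intensity, red_intensity):
    """Map the measured green/red spot-intensity ratio to a depth.

    Deeper probes attenuate green more than red, lowering the ratio.
    """
    ratio = green_intensity / red_intensity
    for max_ratio, depth_mm in DEPTH_TABLE:
        if ratio <= max_ratio:
            return depth_mm
    return DEPTH_TABLE[-1][1]

# Measured spot is 5x brighter in red than green (ratio 0.2).
d = depth_from_ratio(green_intensity=20.0, red_intensity=100.0)
```

A second table keyed on the green/red spot-diameter ratio could be combined with this one to improve the depth estimate.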
  • inertial sensor data from the probe 415 may be used in some embodiments to maintain position tracking even during intervals in which visual tracking is lost.
  • the inertial sensor 430 includes a single or multiple degree of freedom MEMS sensor mounted on the surgical probe 415 (e.g. in a handpiece element).
  • a sensor may include, for example, multiple accelerometers (e.g. to measure linear acceleration along 3 axes) and/or multiple gyroscopes (e.g. to measure angular acceleration along 3 axes).
  • Some embodiments may use a magnetometer (e.g. a 3 axis magnetometer) to measure motion relative to the earth's magnetic field. Information from the magnetometer may be used to compensate for gyro drift or to provide position information measured relative to a fixed 3D reference frame. In various embodiments, any other number and type of sensors may be used. Further examples of inertial sensors for surgical probes are described in the references incorporated above.
  • Signals from the inertial sensor may be processed using techniques known in the art to determine information related to the position or movement of the probe.
  • the information may include an absolute position, a relative position, a linear or angular speed, a linear or angular velocity, a linear or angular acceleration, and orientation in space, etc.
  • the position information determined from the inertial sensor signal may be used to provide inertial tracking of the surgical probe 415 during periods when the above described visual tracking technique fails (e.g. due to an obstruction in the view of camera 405).
  • the inertial position information may be used by processor 410, e.g., in combination with the last known position of the probe 415 prior to the loss of visual tracking, to track the position of the probe. As with the visual tracking information, this inertial tracking information may be used to position the cross hair in the display and/or allow for uninterrupted generation of the displayed temperature map.
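The fallback behavior described above amounts to dead reckoning from the last known position whenever the visual fix is lost. This minimal sketch assumes a 2D skin-surface coordinate frame and an inertial displacement already integrated from accelerometer data; all names are hypothetical:

```python
def track_position(last_pos, visual_pos, inertial_delta):
    """Prefer the visual fix; otherwise dead-reckon from the inertial
    displacement accumulated since the last update.

    last_pos, visual_pos: (x, y) surface coordinates (visual_pos may
    be None when the camera view is obstructed).
    inertial_delta: (dx, dy) displacement from the inertial sensor.
    """
    if visual_pos is not None:
        return visual_pos
    return (last_pos[0] + inertial_delta[0],
            last_pos[1] + inertial_delta[1])

# Visual tracking available, then lost for one update.
pos = (10.0, 20.0)
pos = track_position(pos, visual_pos=(11.0, 20.5), inertial_delta=(0.9, 0.4))
pos = track_position(pos, visual_pos=None, inertial_delta=(1.0, -0.5))
```

Because inertial dead reckoning drifts over time, a practical system would re-anchor to the visual fix as soon as the illuminated spot reappears in the video stream.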
  • the devices and techniques described above can aid a surgeon in applying a desired distribution of therapeutic light to the treatment region 411 by allowing the surgeon to visualize the internal temperatures produced in response to the therapeutic light.
  • FIG. 5B illustrates a typical thermal surgical field.
  • the surgeon inserts the distal end of the surgical probe 415 into the incision point, and moves the probe throughout the indicated zone to provide sub surface heating.
  • FIG. 5B shows exemplary temperature isoclines indicative of the internal temperatures in the treatment region 411 underlying the skin surface of the patient.
  • the internal temperature visualization techniques described above provide real time guidance allowing the practitioner to reduce or eliminate these difficulties.
  • in some embodiments, processor 410 controls the therapeutic light source based on the tissue temperature information.
  • Tissue temperature information feedback allows for closed loop tissue temperature control wherein a property of the therapeutic light output by source 425 (e.g. output power, pulse rate, wavelength etc.) may be controlled (e.g. modulated) to effect a desired tissue temperature profile for a given procedure.
  • tissue temperatures can more easily be maintained, simplifying the procedure for the clinician and providing improved efficacy with enhanced safety.
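For illustration only, the closed loop idea above might be sketched as a simple proportional controller that modulates output power toward a tissue temperature set point. The function name, gain, and power limits are invented placeholders, not values from this disclosure.

```python
# Hypothetical sketch of closed loop tissue temperature control: raise laser
# output power when the measured temperature is below the set point, and cut
# it back when the temperature is above. Gain and limits are assumed values.

def adjust_power(current_power, measured_temp_c, setpoint_c,
                 gain=0.5, min_power=0.0, max_power=20.0):
    """Proportional controller step; returns the new (clamped) output power."""
    error = setpoint_c - measured_temp_c
    new_power = current_power + gain * error
    return max(min_power, min(max_power, new_power))
```

A real controller would likely add integral/derivative terms and rate limiting, but even this sketch shows how a temperature feedback signal can be folded into the light source's output setting each control cycle.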
  • one area where closed loop temperature management is beneficial is in skin tightening procedures where the cannula tip is placed proximal to the sub dermal layer.
  • the laser heats fat adjacent to these deeper dermal areas and the heat acts on the entire dermis to affect so called collagen remodeling (skin tightening).
  • a difficulty is that thermal conduction through dermal layers (to effect skin tightening) varies greatly based on skin type and thickness. Thermal gradients from deep dermis to epidermal layers may vary considerably. Thus it is possible to overheat deeper sub-dermal areas while effecting optimum surface temperatures. This may cause vascular damage and other side effects.
  • using closed loop thermal control and/or internal thermal visualization, a compromise between optimum epidermal temperature and sub dermal temperatures can be made.
  • Some embodiments may further include surface temperature measurement devices and techniques as described herein, to provide even more control over both internal and surface temperatures.
  • the optimum time constant (response rate) of the temperature sensor may vary.
  • a faster response time has the advantage of actively measuring tissue temperature throughout the surgeon's treatment stroke.
  • the thermal mass of, e.g., a thermistor or thermocouple should be reduced or minimized.
  • Another possibility is to measure the treatment stroke length (e.g. using an accelerometer to measure a sign change in the velocity of the probe), divide the treatment stroke into near, mid and far "ranges" and then sample average temperature for the period the cannula tip is present in each range. This allows a slower response time thermocouple to generate a relatively precise average temperature feedback signal for each of the near, mid and far range areas. The feedback can then be used by the laser to adjust or even out temperature accumulation through each "range" of the probe stroke.
  • This approach combined with the temperature visualization techniques described herein may help compensate for poor clinician technique.
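The near/mid/far averaging scheme described above might be sketched as follows; the sample format and the division into equal thirds of the stroke length are assumptions for illustration.

```python
# Hypothetical sketch of per-range temperature averaging over one treatment
# stroke: bin (distance, temperature) samples into near/mid/far thirds of the
# stroke length, then average the temperature within each bin.

def range_averages(samples):
    """samples: list of (distance_from_incision, temperature_c) for one stroke.
    Returns (near_avg, mid_avg, far_avg); a bin with no samples yields None."""
    stroke_len = max(d for d, _ in samples)  # stroke length from farthest sample
    bins = {"near": [], "mid": [], "far": []}
    for d, t in samples:
        frac = d / stroke_len
        if frac < 1 / 3:
            bins["near"].append(t)
        elif frac < 2 / 3:
            bins["mid"].append(t)
        else:
            bins["far"].append(t)
    return tuple(sum(b) / len(b) if b else None
                 for b in (bins["near"], bins["mid"], bins["far"]))
```

In use, a stroke boundary would be detected from a velocity sign change (as the text suggests) and one such average computed per range per stroke, giving the laser three slowly varying feedback values to even out.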
  • any of the temperature feedback and safety devices and techniques described in the references incorporated above may be used in conjunction with system 400.
  • the processor 410 receives a color (RGB) video stream from a CCD camera 405.
  • the video stream is passed through a color splitter 601 which extracts a mono-color (as shown, green) portion of the video stream.
  • the color splitter may be a digital color splitter. However, in some embodiments the splitter may be an analog device.
  • the splitter is replaced by an optical element (e.g. a dichroic mirror) which directs a selected color of light to a camera to generate a mono-color video stream.
  • the mono-color video stream is passed to a black and white video stream generator 602 which converts the mono-color video stream to grey scale.
  • a color video stream passes through color splitter 601 and is passed to a video-to-X-Y converter 603 which converts the video stream images to a pixelated map, with each pixel having corresponding X and Y coordinates. As shown, each pixel has an associated RGB color value.
  • the RGB X-Y pixelated map is passed to a color filter 604 to provide a mono-color pixel map (as shown, red).
  • the chosen color may be matched to the color of the tracking light to enhance the appearance of the illuminated tracking spot in the pixelated map (e.g. by increasing the contrast between the spot and the surrounding background).
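As a minimal sketch of the color-filter step (the function name and data layout are hypothetical), reducing an RGB X-Y pixel map to a single channel matched to the tracking light color might look like:

```python
# Hypothetical sketch of color filter 604: keep only the channel that matches
# the tracking light (channel 0 = red in an (r, g, b) tuple), so the
# illuminated spot stands out against the background.

def mono_color_map(rgb_map, channel=0):
    """rgb_map: dict mapping (x, y) -> (r, g, b).
    Returns a dict mapping (x, y) -> single-channel intensity."""
    return {xy: rgb[channel] for xy, rgb in rgb_map.items()}
```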
  • the mono-color pixelated X-Y map is then passed to a detection module 605 which detects the location of the tracking light illuminated spot.
  • Any suitable detection and tracking algorithm known in the art may be used.
  • the algorithm may detect a peak intensity pixel in the mono-color pixelated map, and identify that pixel with the position of the spot.
  • the perimeter of the spot may be identified (e.g. by examining adjacent pixel-to-pixel intensity changes), and the geometric centroid of the spot calculated and identified with the position of the spot.
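Both detection strategies can be sketched briefly. The intensity-weighted centroid below is a simple stand-in for the perimeter-based centroid calculation; the names and threshold parameter are assumptions.

```python
# Hypothetical sketches of the two spot-detection strategies: (1) the
# brightest pixel, and (2) an intensity-weighted centroid over pixels above
# a brightness threshold.

def peak_pixel(mono_map):
    """mono_map: dict (x, y) -> intensity. Returns the brightest pixel's (x, y)."""
    return max(mono_map, key=mono_map.get)

def spot_centroid(mono_map, threshold):
    """Intensity-weighted centroid of all pixels brighter than threshold."""
    bright = {xy: v for xy, v in mono_map.items() if v > threshold}
    total = sum(bright.values())
    cx = sum(x * v for (x, y), v in bright.items()) / total
    cy = sum(y * v for (x, y), v in bright.items()) / total
    return (cx, cy)
```

The centroid gives sub-pixel resolution, which matters when the diffuse glow of the tracking light covers many pixels rather than one bright point.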
  • a data stream indicating the time varying X-Y position of the spot is passed to a track lock indicator module 606.
  • the visual track lock indicator module 606 monitors the spot position data to determine if the spot is following a realistic track. For example, the module 606 may monitor the rate of change of the spot position and compare this rate to a threshold value above which the tracking data is unreliable (e.g. indicating that the camera view of the illuminated spot has been obstructed.)
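A plausibility check of this kind might be sketched as follows; the speed threshold is an invented placeholder, not a value from this disclosure.

```python
# Hypothetical sketch of the track-lock check in module 606: declare loss of
# visual track lock when the apparent spot speed between consecutive frames
# exceeds what a hand-held probe could plausibly produce.

def track_locked(prev_xy, curr_xy, dt, max_speed=500.0):
    """Return True if the spot moved at a plausible speed (pixels/second)."""
    dx = curr_xy[0] - prev_xy[0]
    dy = curr_xy[1] - prev_xy[1]
    speed = (dx * dx + dy * dy) ** 0.5 / dt
    return speed <= max_speed
```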
  • the X-Y spot information is passed to a crosshair generator which generates an image of a cross hair indicia located at an X-Y frame position corresponding to the spot information.
  • the frame cross hair will thus move in real time in response to changes in the spot position.
  • when the track is determined to be unreliable, the module sends a "no visual track lock" indicator to inertial tracking module 608.
  • the inertial tracking module 608 receives an inertial sensor signal 609 from the surgical probe 415.
  • the inertial tracking signal includes information indicative of the position of the handpiece which can be used to substitute for the X-Y spot position data once the visual tracking lock has been lost.
  • the inertial tracking module 608 may enable inertial tracking and process the inertial sensor signal to determine the change in position of the surgical probe relative to its last known position determined by the visual tracking (e.g. using MEMS accelerometer and/or gyroscope signals). This information can be used to determine corresponding X-Y position data, which is passed on to the crosshair generator 607 in place of the unrealistic X-Y data generated by the visual tracking of the illuminated spot.
  • the inertial sensor signal may be subject to drift errors which increase as a function of the inertial measurement time. After a time period of a sufficient duration, the inertial measurement becomes unreliable.
  • upon determination that the visual tracking has failed, the inertial tracking module 608 begins an inertial tracking countdown clock. Once the countdown completes, the module 608 determines that the inertial tracking is no longer reliable, and terminates the inertial tracking. At this time visual tracking can recommence. If visual tracking remains unrealistic, all tracking may cease and an alarm or other indication of tracking failure (e.g. disappearance of the cross hair indicia from the display) may be produced.
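The countdown arbitration between visual and inertial tracking might be sketched as a small state machine; the class name and the two-second drift window are assumptions for illustration.

```python
# Hypothetical sketch of the countdown logic: inertial tracking is trusted
# only for a fixed window after visual lock is lost, because drift grows
# with the inertial measurement time.

class TrackingArbiter:
    INERTIAL_WINDOW_S = 2.0  # how long drift is tolerable (assumed value)

    def __init__(self):
        self.inertial_started_at = None

    def mode(self, visual_ok, now):
        """Return 'visual', 'inertial', or 'none' for the current frame."""
        if visual_ok:
            self.inertial_started_at = None  # visual lock resets the countdown
            return "visual"
        if self.inertial_started_at is None:
            self.inertial_started_at = now   # start the countdown clock
        if now - self.inertial_started_at <= self.INERTIAL_WINDOW_S:
            return "inertial"
        return "none"  # drift too large: stop tracking, raise an alarm
```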
  • the cross hair generator 607 passes the X-Y data on to a false color generator 610.
  • a thermal sensor signal 611 from thermal sensor 435 on the distal end of probe 415 is passed through a scaling module 612 to generate color information corresponding to the temperature measured by the thermal sensor.
  • the scaling information may be provided in the form of a look up table, or any suitable scaling formula or algorithm.
  • the false color generator 610 combines the X-Y tracking data with the color scaled temperature data to generate a false color temperature map image which includes graphical elements (e.g. dots) at positions in the frame having a color indicating a corresponding measured temperature.
  • the false color temperature map is passed on to a persistence generator 613, which causes the graphical elements to be removed from the temperature map or to fade away after a given time period (e.g. at a set time after the cross hair indicia has moved away from the corresponding X-Y position). Accordingly, as described above, older, less reliable temperature data is continuously removed from the temperature map.
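The temperature-to-color scaling and persistence fade might be sketched as below; the scale endpoints, the linear blue-to-red mapping, and the fade time are invented placeholders.

```python
# Hypothetical sketch of scaling module 612 and persistence generator 613:
# map a temperature reading onto a blue (cool) to red (hot) false-color
# scale, and compute a graphical element's opacity as it ages out of the map.

def temp_to_rgb(temp_c, t_min=30.0, t_max=60.0):
    """Linear blue-to-red false-color scale, clamped to [t_min, t_max]."""
    frac = (temp_c - t_min) / (t_max - t_min)
    frac = max(0.0, min(1.0, frac))
    return (int(255 * frac), 0, int(255 * (1 - frac)))

def opacity(age_s, fade_after_s=5.0):
    """Linear fade: fully opaque when fresh, fully transparent at fade_after_s."""
    return max(0.0, 1.0 - age_s / fade_after_s)
```

A lookup table, as the text suggests, could replace the linear formula when a perceptually tuned color ramp is wanted.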
  • the generated cross hair image and the temperature map are combined at a mixer 614, and subsequently converted out of the X-Y format to a video stream by converter 615.
  • This cross hair and temperature map video stream is overlaid on the black and white video stream of the surgical field 407 by a video mixer 616.
  • the combined video stream is passed to the display 440 for viewing (e.g. as shown in FIGs. 5C-5E).
  • processor 410 may be implemented in software (e.g. run on a general purpose microprocessor) or in hardware (e.g. including one or more integrated circuit devices) or in a combination thereof.
  • Any of the measurement, feedback, and/or display techniques described above, or any part thereof, can be implemented in computer hardware or software, or a combination of both.
  • the methods can be implemented in computer programs using standard programming techniques following the method and figures described herein.
  • Program code is applied to input data to perform the functions described herein and generate output information.
  • the output information is applied to one or more output devices such as a display monitor.
  • Each program may be implemented in a high level procedural or object oriented programming language to communicate with a computer system.
  • the programs can be implemented in assembly or machine language, if desired. In any case, the language can be a compiled or interpreted language.
  • the program can run on dedicated integrated circuits preprogrammed for that purpose.
  • Each such computer program is preferably stored on a storage medium or device (e.g., ROM or magnetic diskette) readable by a general or special purpose programmable computer, for configuring and operating the computer when the storage media or device is read by the computer to perform the procedures described herein.
  • the computer program can also reside in cache or main memory during program execution.
  • the analysis method can also be implemented as a computer-readable storage medium, configured with a computer program, where the storage medium so configured causes a computer to operate in a specific and predefined manner to perform the functions described herein.
  • while laser light is used for treatment in the embodiments described above, other sources of treatment light (e.g. flash lamps, light emitting diodes) may also be used.
  • the term 'light' is to be understood to include electromagnetic radiation both within and outside of the visible spectrum, including, for example, ultraviolet and infrared radiation.

Abstract

An apparatus is disclosed for monitoring a thermal surgical procedure including a thermal camera for monitoring temperature at a plurality of locations within at least a portion of a surgical field undergoing thermal surgical treatment and generating a series of thermal images based on said monitoring, a processor for processing the thermal images, and a display for displaying, in real time, a series of display images indicative of temperature at the plurality of positions.

Description

THERMAL SURGICAL MONITORING
CROSS REFERENCE TO RELATED APPLICATIONS
The present application claims benefit of U.S. Provisional Patent Application Ser. No. 61/157862 filed March 5, 2009, the entire contents of which is incorporated by reference herein in its entirety.
The present application is further related to U.S. Patent Application Serial. No. 12/135968 filed 6/9/2008, International Patent Application Serial No. PCT/US2008/007227 filed 6/9/2008, U.S. Patent Application Serial No. 12/135967 filed 6/9/2008, International Patent Application Serial No. PCT/US2008/007225 filed 6/9/2008, U.S. Patent Application Serial No. 12/135970 filed 6/9/2008, International Patent Application Serial No. PCT/US2008/007226 filed 6/9/2008, U.S. Patent Application Serial No. 12/135971 filed 6/9/2008, International Patent Application Serial No. PCT/US2008/007218 filed 6/9/2008 the entire contents of each of which is incorporated by reference herein in its entirety.
BACKGROUND
The present invention relates to a surgical theatre camera and a surgical IR sensor. More particularly, the present invention relates to a surgical camera and sensor adapted to monitor tissue temperature of a patient during a thermal surgical procedure.
Plastic surgeons, dermatologists and their patients continually search for new and improved methods for treating the effects of an aging or otherwise damaged skin. One common procedure for rejuvenating the appearance of aged or photodamaged skin is laser skin resurfacing using a carbon dioxide laser. Another technique is non-ablative laser skin tightening, which does not take the top layer of skin off, but instead uses a deep-penetrating laser to treat the layers of skin beneath the outer epidermal layer, tightening the skin and reducing wrinkles to provide a more youthful appearance.
For such techniques for laser skin tightening treatment, it has been difficult to control the depth and amount of energy delivered to the collagen without also damaging or killing the dermal cells. Much of the energy of the treatment pulse is wasted due to scattering and absorption in the outer epidermal layer, and the relatively high pulse energy required to penetrate this outer layer can cause pain and epidermal damage.
Some skin tightening techniques include using a hollow tubular cannula that contains an optical fiber connected to a laser source. The cannula can be inserted subcutaneously into a patient so that the end of the fiber is located within the tissue underlying the dermis. The source emits a treatment output, for example an output pulse that is conveyed by the fiber to the dermis, which causes collagen shrinkage within the treatment area, thus tightening the skin.
To improve one's health or shape, patients have also turned to surgical methods for removing undesirable tissue from areas of their body. For example, to remove fat tissue, some patients have preferred liposuction, a procedure in which fat is removed by a suction mechanism, because despite strenuous dieting and exercise, some patients cannot lose fat, particularly in certain areas. Alternatively, laser or other light sources have been applied for heating, removal, destruction (for example, killing), photocoagulation, eradication or otherwise treating (hereinafter collectively referred to as "treating" or "treatment") the tissue.
In applications including those mentioned above, it is often desirable to monitor the temperature of a specific location, for example, a location within a surgical field, in real time. Such monitoring may prevent, for example, skin or other tissue damage caused by, for example, overheating.
SUMMARY
The inventors have realized that a thermo-surgical theater with appropriate sensors and processing may provide hands free surface and/or internal temperature monitoring and/or feedback to a surgeon during thermal surgical procedures, for example laser surgery or any other procedure/surgery where energy is deposited into surface tissue to effect a tissue temperature rise. Various embodiments could include surgical systems employing, e.g., laser, radio frequency (RF), acoustic or other suitable energy source.
In some embodiments, an apparatus for monitoring a thermal surgical procedure includes a thermal camera for monitoring temperature at a plurality of locations within at least a portion of a surgical field undergoing thermal surgical treatment and generating a series of thermal images based on the monitoring. The apparatus further includes a processor for processing the thermal images and a display for displaying, in real time, a series of display images indicative of temperature at the plurality of positions.
In some embodiments, the display images indicative of temperature at the plurality of locations comprise false color images. In some embodiments, the thermal camera is an infrared camera. In some embodiments, the thermal camera generates the series of thermal images at a rate of greater than about 30 frames per second. In some embodiments, the display displays the series of display images at a rate of greater than about 30 frames per second.
In some embodiments, the apparatus for monitoring a thermal surgical procedure further includes a video camera for monitoring at least a portion of the surgical field, where the display is configured to display, in real time, video images of said portion of the surgical field.
In some embodiments, the apparatus for monitoring a thermal surgical procedure further includes a processor configured to superimpose the thermal images with the video images, where the display images comprise the superimposed images. In some embodiments, the field of view of the thermal camera substantially overlaps the field of view of the video camera.
In some embodiments, the apparatus for monitoring a thermal surgical procedure further includes an optical element which selectively directs light from the surgical field in the infrared spectrum to the thermal camera, and selectively directs light from the surgical field in the visible spectrum to the video camera. In some embodiments, at least one of the thermal camera and the video camera include an autofocus. In some embodiments, the apparatus for monitoring a thermal surgical procedure further includes a servo unit for directing the thermal camera to monitor a selected portion of the surgical field.
In some embodiments, the apparatus for monitoring a thermal surgical procedure further includes an indicator which illuminates the selected portion of the surgical field, where the servo unit comprises a tracking unit adapted to track the illuminated portion and direct the thermal camera to monitor the selected portion.
In some embodiments, the processor determines temperature information about the portion of the surgical field based on the thermal images. In some embodiments, the processor is coupled to a laser surgical device and is configured to control the operation of the laser surgical device based on said temperature information. In some embodiments, the processor is configured to process the thermal images and determine information indicative of temperature at the plurality of locations, compare the information indicative of temperature at the plurality of locations to a selected master threshold temperature, and produce a master alarm if the temperature at any of the plurality of locations exceeds the selected master threshold temperature.
In some embodiments, the processor is coupled to a laser surgical device and is configured to control the operation of the laser surgical device in response to the master alarm. In some embodiments, the processor is configured to process the thermal images and determine information indicative of temperature at a subset of the plurality of locations, compare the information indicative of temperature at the subset of plurality of locations to a selected range of temperature, and produce a range alarm if the temperature at any of the subset of the plurality of locations falls outside the selected range.
In some embodiments, the processor is coupled to a laser surgical device and is configured to control the operation of the laser surgical device in response to the range alarm. In some embodiments, the processor is configured to process the thermal images and determine information indicative of a rate of change of temperature at the plurality of locations, compare the information indicative of temperature at the plurality of locations to a selected secondary threshold temperature, compare the information indicative of the rate of change of temperature at the plurality of locations to a selected threshold rate, and produce a rate alarm if, at any one of the plurality of locations, the temperature exceeds the secondary threshold and the rate of change of temperature exceeds the rate threshold. In some embodiments, the processor is coupled to a laser surgical device and is configured to control the operation of the laser surgical device in response to the rate alarm.
In some embodiments, the display comprises at least one chosen from the group consisting of: a video monitor, a flat panel monitor, a heads up display, and a virtual reality display.
In some embodiments, a method is defined for monitoring a thermal surgical procedure including using a thermal camera to monitor temperature at a plurality of locations within at least a portion of a surgical field undergoing thermal surgical treatment and generating a series of thermal images based on the monitoring, and displaying, in real time, a series of display images indicative of temperature at the plurality of positions based on the series of thermal images. In some embodiments, displaying, in real time, a series of display images indicative of temperature at the plurality of positions includes displaying false color images.
In some embodiments, the method further includes using a video camera to monitor at least a portion of the surgical field and displaying, in real time, video images of the portion of the surgical field. In some embodiments, the method further includes processing the thermal images and the video images to superimpose the thermal images with the video images and displaying display images corresponding to the superimposed images.
In some embodiments, the method further includes determining temperature information about the portion of the surgical field based on the thermal images. In some embodiments, the method further includes controlling the operation of a laser surgical device based on the temperature information.
In some embodiments, the method further includes processing the thermal images and determining information indicative of temperature at the plurality of locations, comparing the information indicative of temperature at the plurality of locations to a selected master threshold temperature, and producing a master alarm if the temperature at any of the plurality of locations exceeds the selected master threshold temperature.
In some embodiments, the method further includes processing the thermal images and determining information indicative of temperature at a subset of the plurality of locations, comparing the information indicative of temperature at the subset of plurality of locations to a selected range of temperature, and producing a range alarm if the temperature at any of the subset of the plurality of locations falls outside the selected range.
In some embodiments, the method further includes processing the thermal images and determining information indicative of a rate of change of temperature at the plurality of locations, comparing the information indicative of temperature at the plurality of locations to a selected secondary threshold temperature, comparing the information indicative of the rate of change of temperature at the plurality of locations to a selected rate threshold, and producing a rate alarm if, at any one of the plurality of locations, the temperature exceeds the secondary threshold and the rate of change of temperature exceeds the rate threshold.
In some embodiments, the method further includes providing an indicator identifying an area of interest within the surgical field, tracking the position of the indicator, and adjusting the thermal camera to monitor the area of interest. In another aspect, a method is disclosed including: using a video camera to generate a video stream of a surgical field including an area of skin of a patient; inserting a surgical probe into the patient such that at least a distal end of the probe is located in a treatment region under the area of skin; emitting tracking light from the distal end of the probe to illuminate a portion of the area of skin overlaying the distal end of the probe; processing the video stream to determine tracking information indicative of the position of the distal end of the probe based on the illumination of the portion of the area of skin; displaying the video stream; and based on the tracking information, displaying an indicia of the position of the distal end of the probe overlaying the display of the video stream.
In some embodiments, the surgical probe includes an inertial sensor, and the method further includes using the inertial sensor to determine position information indicative of the position or movement of the probe.
In some embodiments, the distal end of the probe includes a temperature sensor, and the method further includes: moving the distal end of the probe to a plurality of locations in the treatment region; using the temperature sensor to determine temperature information indicative of the temperature at each of the locations; and overlaying the display of the video stream with graphical information indicative of the temperature of the locations based on the tracking information and the temperature information.
Some embodiments include, for each of the plurality of locations: determining the position of the illuminated portion of the area of skin overlaying the probe; determining the temperature at the location of the distal end of the probe; displaying a graphical element overlaying the display of the video stream for a selected duration at a display location corresponding to the position of the illuminated portion of the area of skin, where a property of the graphical element is indicative of the temperature of the probe at the location.
In some embodiments, the property of the graphical element includes at least one from the list consisting of: color, shape, size, brightness, grey scale, blink rate, and an alphanumeric indicator. In some embodiments, displaying a graphical element overlaying the display of the video stream for a selected duration includes causing the graphical element to fade away at a selected rate.
In some embodiments, the temperature sensor includes a thermistor.
In some embodiments, the surgical probe includes an inertial sensor, and the method further includes: using the inertial sensor to determine position information indicative of the position or movement of the probe. For each of the plurality of locations, a property of the respective graphical element is determined based on the position information.
In some embodiments, the position information includes information indicative of a depth under the skin of the distal end of the probe at the location. In some embodiments, the position information includes information indicative of a dwell time of the distal end of the probe at the location. In some embodiments, the position information includes information indicative of a speed of motion of the distal end of the probe at the location.
In some embodiments, the surgical probe includes an inertial sensor, and the method further includes: using the inertial sensor to determine position information indicative of the position or movement of the probe; providing an obstruction blocking the video camera from viewing the illumination of the portion of the area of skin by the tracking light, thereby causing an interruption of the determination of the tracking information; and during the interruption, providing substitute tracking information based on the position information determined by the inertial sensor.
In some embodiments, the tracking information includes information indicative of the depth of the distal end of the probe beneath the skin. In some embodiments, the step of emitting tracking light includes emitting a first portion of tracking light at a first wavelength and a second portion of tracking light at a second wavelength; and the step of processing the video stream to determine tracking information indicative of the position of the distal end of the probe based on the illumination of the portion of the area of skin includes comparing an intensity of the illumination at the first wavelength to an intensity of the illumination at the second wavelength. In some embodiments, the first wavelength is in the red portion of the visible spectrum and the second wavelength is in the green portion of the visible spectrum.
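One hedged way to picture the two-wavelength comparison: if tissue attenuates green light more strongly than red, under Beer-Lambert-style attenuation I = I0·exp(−μd) the red/green intensity ratio observed at the surface grows with probe depth. The sketch below follows that model for illustration only; the function name and attenuation coefficients are invented placeholders, not calibration data from this disclosure.

```python
# Hypothetical sketch: estimate probe depth from the red/green intensity
# ratio, assuming Beer-Lambert attenuation I = I0 * exp(-mu * d) with a
# larger effective attenuation coefficient for green than for red.

import math

MU_GREEN = 1.2  # assumed effective attenuation for green, mm^-1
MU_RED = 0.4    # assumed effective attenuation for red, mm^-1

def depth_from_ratio(red_intensity, green_intensity, i0_ratio=1.0):
    """Solve red/green = i0_ratio * exp((MU_GREEN - MU_RED) * d) for d (mm)."""
    ratio = red_intensity / green_intensity
    return math.log(ratio / i0_ratio) / (MU_GREEN - MU_RED)
```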
In some embodiments, the surgical probe includes an optical delivery device, and the method further includes using the optical delivery device to deliver therapeutic light from a source to the treatment region. In some embodiments, the therapeutic light includes laser light. In some embodiments, the therapeutic light includes light having a wavelength in the visible or infrared. In some embodiments, the therapeutic light has a wavelength of about 1064 nm, 1320 nm, or 1440 nm. In some embodiments, the delivered therapeutic light has a total power in the range of 1 W to 20 W. In some embodiments, the delivered therapeutic light has a power density in the range of 200 W/cm^2 to 20,000 W/cm^2, or any sub range therein, at the target region.
In some embodiments, the step of delivering therapeutic light from the light emitting portion of the delivery device includes delivering a series of light pulses. In some embodiments, the series of pulses includes a pulse having a duration in the range of about 0.1 ms to about 1.0 ms. In some embodiments, the series of pulses has a repetition rate in the range of about 10 to about 100 Hz.
In some embodiments, the surgical probe includes at least one sensor, and the method further includes: using the at least one sensor, generating a signal indicative of at least one property of the delivery device or the treatment region; and controlling the delivery of therapeutic light based on the sensor signal.
In some embodiments, the property of the delivery device or the target region includes at least one selected from the list consisting of: a position of the optical delivery device, a movement of the optical delivery device, temperature of the optical delivery device, a tissue type in the vicinity of the optical delivery device, an amount of energy delivered by the optical delivery device, and a temperature of tissue in the target region. In some embodiments, the sensor includes at least one selected from the list consisting of: a thermistor, an inertial sensor, an accelerometer, a gyroscope, and a color sensor.
Some embodiments include generating a display based on the signal indicative of at least one property of the delivery device or the target region.
In some embodiments, the surgical probe includes a cannula and the optical delivery device includes an optical fiber having at least a portion located in or proximate to the cannula.
In another aspect, an apparatus is disclosed including: a video camera configured to generate a video stream of a surgical field including an area of skin of a patient; a surgical probe configured for insertion into the patient such that at least a distal end of the probe is located in a treatment region under the area of skin; an optical emitter configured to emit tracking light from the distal end of the probe to illuminate a portion of the area of skin overlaying the distal end of the probe; a processor in communication with the camera and including a visual tracking module which processes the video stream to determine tracking information indicative of the position of the distal end of the probe based on the illumination of the portion of the area of skin; and a display in communication with the processor and the camera, the display configured to display the video stream and, based on the tracking information, display an indicia of the position of the distal end of the probe overlaying the display of the video stream.
In some embodiments, the surgical probe includes an inertial sensor in communication with the processor, and the processor includes an inertial tracking module which processes a signal from the inertial sensor to determine position information indicative of the position or movement of the probe.
In some embodiments, the distal end of the probe includes a temperature sensor in communication with the processor, and the processor is configured to process a signal from the temperature sensor to determine temperature information indicative of the temperature at each of a plurality of locations traversed by the probe. In some embodiments, the processor includes an image generation module which forms a graphical representation indicative of the temperature of the locations based on the tracking information and the temperature information; and a video mixer which overlays the display of the video stream with the graphical representations.
In some embodiments, the processor includes: the visual tracking module, the module configured to process the video stream to determine the position of the illuminated portion of the area of skin overlaying the probe; and a temperature sensing module configured to determine, based on the signal from the temperature sensor, the temperature at a plurality of locations in the treatment region. The image generation module and video mixing module cooperate to generate a respective graphical element overlaying the display of the video stream for a selected duration at a location corresponding to the position of the illuminated portion of the area of skin, where a property of the graphical element is indicative of the temperature of the probe at the location.
In some embodiments, the property of the graphical element includes at least one from the list consisting of: color, shape, size, brightness, grey scale, blink rate, and an alphanumeric indicator.
In some embodiments, the image generation module includes a persistence module which causes each graphical element to become increasingly transparent at a selected rate.
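Such a persistence module can be sketched as follows. This Python fragment is illustrative only and not part of the disclosure; the `TempDot` structure, field names, and fade rate are assumptions:

```python
import dataclasses

@dataclasses.dataclass
class TempDot:
    x: int              # display position (pixels)
    y: int
    temp_c: float       # sampled temperature at this location
    alpha: float = 1.0  # fully opaque when first drawn

def fade_dots(dots, fade_rate, dt):
    """Make each graphical element increasingly transparent at a selected
    rate; fully transparent elements are dropped from the overlay."""
    for d in dots:
        d.alpha = max(0.0, d.alpha - fade_rate * dt)
    return [d for d in dots if d.alpha > 0.0]
```

With a fade rate of 0.2 per second, for example, a dot drawn fully opaque fades out over five seconds, consistent with the few-second persistence described for the treatment-mode map later in this document.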
In some embodiments, the temperature sensor includes a thermistor.
In some embodiments, the surgical probe includes an inertial sensor, and the processor includes an inertial tracking module configured to use the inertial sensor to determine position information indicative of the position or movement of the probe, and for each of the plurality of locations, determine a property of the respective graphical element based on the position information. In some embodiments, the position information includes information indicative of a depth under the skin of the distal end of the probe. In some embodiments, the position information includes information indicative of a dwell time of the distal end of the probe. In some embodiments, the position information includes information indicative of a speed of motion of the distal end of the probe.
In some embodiments, the surgical probe includes an inertial sensor, and the processor includes: an inertial tracking module configured to use the inertial sensor to determine inertial position information indicative of the position or movement of the probe; and a visual tracking lock indicator module configured to process the video stream to identify an interruption in tracking by the visual tracking module and, upon identifying the interruption, substitute inertial position information for tracking information from the visual tracking module.
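The substitution of inertial position information upon a tracking interruption can be sketched as follows; an illustrative Python fragment in which the class and method names are assumptions, not part of the disclosure:

```python
class TrackerFusion:
    """Sketch of the lock-indicator logic: while the visual module reports
    a fix, pass it through; on interruption (visual_fix is None), dead-
    reckon from the last known position using inertial displacement
    deltas from the probe's inertial sensor."""

    def __init__(self):
        self.last = (0.0, 0.0)  # last known (x, y) position

    def update(self, visual_fix, inertial_delta):
        if visual_fix is not None:   # visual lock held: use it directly
            self.last = visual_fix
        else:                        # lock lost: substitute inertial data
            dx, dy = inertial_delta
            self.last = (self.last[0] + dx, self.last[1] + dy)
        return self.last
```

When the illuminated spot reappears in the video stream, the next visual fix simply overwrites the dead-reckoned estimate, bounding the accumulated inertial drift.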
In some embodiments, the tracking information includes information indicative of the depth of the distal end of the probe beneath the skin. In some embodiments, the optical emitter emits a first portion of tracking light at a first wavelength and a second portion of tracking light at a second wavelength; and the visual tracking module processes the video stream to determine the information indicative of the depth of the distal end of the probe beneath the skin by comparing a property of the illumination at the first wavelength to a property of the illumination at the second wavelength. In some embodiments, the first wavelength is in the red portion of the visible spectrum and the second wavelength is in the green portion of the visible spectrum.
In some embodiments, the surgical probe includes an optical delivery device configured to deliver therapeutic light from a therapeutic light source to the treatment region.
In some embodiments, the surgical probe includes at least one sensor in communication with the processor, and the at least one sensor is configured to generate a signal indicative of at least one property of the delivery device or the treatment region; and the processor is configured to control the delivery of therapeutic light to the treatment region based on the sensor signal. In some embodiments, the property of the delivery device or the target region includes at least one selected from the list consisting of: a position of the optical delivery device, a movement of the optical delivery device, temperature of the optical delivery device, a tissue type in the vicinity of the optical delivery device, an amount of energy delivered by the optical delivery device, and a temperature of tissue in the target region.
In some embodiments, the sensor includes at least one selected from the list consisting of: a thermistor, an inertial sensor, an accelerometer, a gyroscope, and a color sensor.
In some embodiments, the processor and the display are configured to cooperate to generate a display based on the signal indicative of at least one property of the delivery device or the target region.
Various embodiments may include any of the features of techniques described herein, either alone or in any suitable combination.
BRIEF DESCRIPTION OF THE DRAWINGS
The foregoing and other objects, features and advantages of the invention will be apparent from the following more particular description of preferred embodiments of the invention, as illustrated in the accompanying drawings in which like reference characters refer to the same parts throughout the different views. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the invention.
FIG. 1 shows a thermal surgical monitoring system.
FIG. 2 shows a thermal surgical monitoring system with a surgical theatre thermal camera and laser treatment instrument mounted to an adjustable pole.
FIG. 2A shows a thermal surgical monitoring system with a surgical theatre thermal camera and laser treatment instrument mounted to a portable station.
FIG. 3 shows a display of a thermal image taken with a surgical theatre thermal camera superimposed onto a display of a video image of a treatment site taken with a video camera.
FIG. 3 A shows a block diagram of a thermal surgical monitoring system with feedback control to a thermal surgical device.
FIGs. 4 and 4A show a thermal surgical monitoring system.
FIGs. 5A-5E show a surgical field monitored by the system of FIGs. 4 and 4A.
FIG. 6 is a schematic of a processor of the system of FIGs. 4 and 4A.
DETAILED DESCRIPTION
FIG. 1 shows a thermal surgical monitoring system 100. A surgeon performs a thermal surgical procedure within surgical field 101. In the embodiment shown, laser light generated by a laser 125 is directed through handpiece 126 to tissue within surgical field 101. Handpiece 126 includes a cannula for insertion into the patient through an incision located in surgical field 101. The cannula contains a waveguide, e.g. an optical fiber, which delivers laser light from laser 125 to tissue to be treated. Accordingly, laser 125 may be used, for example, to treat tissues within the dermis to cause skin tightening, for laser lipolysis, or for other such applications.
System 100 measures temperatures at positions within surgical field 101 with thermal camera 105 which is, in some embodiments, mounted to a ceiling 110 above a surgical site. An exemplary suitable thermal camera is a high resolution infrared (IR) camera such as the ThermaCam45 Infrared Camera, manufactured by FLIR, Wilsonville, OR, which produces both good resolution and a wide field of view.
Display 115 (e.g. an LCD screen) is located within convenient view of a surgeon 120 such that real time surface temperature information in the form of, for example, a false color isotherm map shown on display 115 is available to the surgeon during surgery. The surgeon can use the real time surface temperature information to control a deposition of energy within surgical field 101 in a tissue or tissues to be treated, such that, for example, a desired temperature rise in the treated tissue or tissues is achieved.
In some embodiments, the thermal camera 105 includes an auto focus 106.
In typical applications, practical difficulties regarding, e.g., the sterility of surgical field 101 and accessibility of thermal camera 105 make such a feature, allowing hands free operation, desirable. In some embodiments, thermal camera 105 includes a manual focus control, e.g. a zoom lever. Such a control could be covered with a sterile bag 130. In some embodiments, thermal camera 105 has a preset depth of field that would include the full surgical working range, reducing or eliminating the need for an auto focus feature 106.
In some embodiments, the mounting location for thermal camera 105 would be over a surgical arena adjacent to or part of a set of ceiling mounted surgical lights. As an example, the camera may be light enough to mount to existing operating room (OR) lighting booms. In some embodiments, a ceiling mounted camera boom places the thermal camera 105 advantageously close to a surgical field 101, yet out of the way of the surgeon. In various embodiments, the display 115 can also be ceiling boom mounted. In some embodiments, the display 115 may be mounted on top of treatment laser 125.
In various embodiments, a wide angle or standard lens may be used to provide a suitable view for typical surgical applications, roughly framing a human abdomen, for example, at a working distance of 3-6 feet. In some embodiments, a temperature measurement accuracy may be approximately ±2°C, with a temperature drift over a 1 hour period of ±2°C. Some embodiments may feature a calibration shutter which operates, for example, about every minute. Some embodiments may feature less frequent calibration, more suitable for the particular application at hand.
In typical thermal surgical applications, providing real time surface temperature information is important because the large number of uncontrollable variations in surgery, such as inhomogeneities in skin type, circulation/dilation, skin thickness, fibrotic tissue, geometry of surgical instruments with respect to anatomical structures, and surgical technique, all affect the resulting temperature rise for a given energy input. For this reason, system 100 provides real time surface temperature information which can be used as "feedback" by the surgeon performing the treatment procedure. By monitoring the surface temperature directly, the treatment can be improved or optimized for safety and efficacy. In some embodiments, the surgeon manually adjusts his actions based on the displayed temperature information (e.g. by adjusting applied dosage). As discussed in detail below, in some embodiments, temperature information measured by system 100 may be processed and used to provide alerts to the surgeon and/or automatically control laser 125 (e.g. to shut off the laser system in the event of the appearance of a high temperature "hot spot" within surgical field 101).
FIG. 2 shows an embodiment of system 100 where thermal camera 105 is mounted on an adjustable pole 210 connected to laser 125. Such mounting may reduce or eliminate operating room installation costs and issues concerning the surgical sterile field. In some embodiments, thermal camera 105 may be controlled (e.g., turned on/off) directly from touchscreen display 117 on the laser 125. In such configurations, laser 125 and system 100 are integrated such that no external wiring beyond the self contained wiring for the laser 125 may be required.
FIG. 2A shows a thermal surgical monitoring system with a surgical theatre thermal camera and laser treatment instrument mounted to a portable station 165. An autoclavable handle 170 allows for the adjustment of the thermal camera 105. The thermal camera 105 itself is supported and balanced by a boom 172 and counterweight 174, respectively, that allows the thermal camera 105 to pivot about a portable station pivot point 176. The bottom of the portable station 165 is configured with a support ballast 178 and a set of castors 179 to both stabilize the portable station 165 and to give it portability.
A display 115 is connected to the portable station 165 by a mounting bracket 180 at approximately eye level to the surgeon 120. A processor 180 to interface the thermal camera 105 to the display 115 and the laser 125, and to interface the display 115 to the laser 125, is mounted to a back side of the portable station 165.
FIG. 3 shows an embodiment of system 100 featuring a conventional video camera 305 (e.g. operating within the visual spectrum) in addition to thermal camera 105. This allows a simultaneous display (e.g. superposition) of a thermal image (e.g. a false color image) taken with thermal camera 105 and a video image of a treatment site taken with video camera 305. As shown, thermal camera 105 is ceiling mounted 110 at right angles to the standard video camera 305. Light from surgical field 101 is directed to thermal camera 105 and standard video camera 305 using beam splitting optic 306. For example, light in the IR spectrum may be selectively directed to thermal camera 105 while light in the visible spectrum is directed to conventional camera 305. The optic 306 may be made of ZnSe or Germanium, both of which have good IR transmission, and both of which can be coated to reflect the visible wavelengths. If the visible full-color spectrum is distorted due to a coating or to substrate reflectivity, a black and white standard video image would, in some embodiments, suffice.
As shown, the two cameras 105, 305 have matched fields of view 308, preferably with a working range of several feet, reducing the need to refocus the cameras. The laser 125 irradiates a target tissue 309 in the field of view 308. The video images of the two cameras may be processed by a processor 307, which may combine the images for display on display 115. Processor 307 may include high speed image processing capabilities to combine and analyze a set of real time images for potential hot spots. In some embodiments processor 307 may control laser 125 based on information determined by processing the images. In some embodiments, no connection is made from the laser driver 307 to the laser 125. In the case where no connection is made, the laser driver provides temperature and position data to the surgeon, displaying the temperature and position data on the display 115.
Although a particular arrangement of cameras 105 and 305 is shown, other arrangements are possible. For example, some embodiments may feature a stereoscopic version using, for example, a standard video camera (visual spectrum) 305 in sync and aligned immediately adjacent to the thermal camera 105. In some embodiments, both the standard video camera 305 and the thermal camera 105 have the same field of view.
In some embodiments, images from video camera 305 may be displayed on display 115 superimposed with images from thermal camera 105. For example, in some embodiments, an image from thermal camera 105 can be a transparent layer of false color superimposed on top of an image from the standard video camera 305. In this way, a set of hot spot islands could be immediately linked to a corresponding anatomical treatment area without a need to use other indicia (e.g. a warm or cold cannula placed within the field of view) to identify where the hot spots are located.
As discussed below, in some embodiments, images superimposed from thermal camera 105 onto images from the standard video camera 305 provide data to a laser driver 307 to determine laser power levels as a function of a set of treatment site surface temperatures. Controlling the laser 125 based on a set of treatment site surface temperatures may improve safety and efficacy.
In various embodiments, other display arrangements may be used. In some applications, an image from the standard video camera 305 and an image from the thermal camera 105 can be displayed side-by-side; in some applications, an image from the standard video camera 305 and an image from the thermal camera 105 can be displayed using a screen-in-screen format.
In another embodiment, display 115 is a virtual reality image projection display. Such displays are commonly available and are low cost. In some embodiments, the thermal data for a virtual reality image projection is a three dimensional (3D) projection superimposed onto a 3D projection of the target tissue 309. In another embodiment, display 115 is a head mounted heads-up display (HUD) used to superimpose a set of images from the thermal camera directly onto the target tissue 309 in the field of view of the surgeon. A focal distance for the HUD may be set approximately to the distance from the eyes of the surgeon 120 to the target tissue 309. In some embodiments, a servo driven stereo camera assembly 310, supporting both the thermal camera 105 and the standard video camera 305, allows for automatic tracking and auto focus of the target tissue 309. The stereo camera assembly may, for example, track an aim beam directed from the laser 125 and position itself such that the thermal camera is focused on the target tissue with a 10-15 cm diameter field of view 308. In embodiments featuring a surgical device used internally (e.g. the cannula with optical fiber described above), an aim beam may be used with sufficient intensity to shine through tissue and to provide a visible indicia of illumination on the outside of the patient's skin. By tracking the aim beam and automatically focusing on a corresponding treatment area, system 100 provides higher resolution thermal data and greater ease of use in the surgical theater.
As noted above, in various embodiments, processing of thermal data taken from the thermal camera allows for numerous types of alarms, thresholds and/or automatic control of a surgical device (e.g. laser 125) based on temperature information derived from the thermal images. For example, referring to FIG. 3A, thermal images from thermal camera 105 and, optionally, video images from video camera 305 are sent to processor 307. Processor 307 processes the images to determine temperature information about the portion of the surgical field 101 within the thermal camera's field of view. For example, the processor may determine, in real time, the temperature and/or change in temperature at all positions (or a subset thereof) within the field of view.
The processor may analyze this temperature information and provide alarms or other indications to the surgeon. For example, still referring to FIG. 3A, processor 307 is in communication with display 115, and can display information or alarm indications on the display based on the analyzed temperature information. For example, visual indications could be provided on the display highlighting "hot spot" locations within the field of view having temperatures above a selected threshold. Processor 307 may also be in communication with separate alarm unit 312. Alarm unit 312 may provide a visual alarm indication (e.g. a flashing light), an audible alarm indication (e.g. a buzzer or tone generator), or any other suitable indication. Processor 307 may also control laser 125 based on the analyzed temperature information. For example, the processor may be programmed to shut down the laser if the presence of one or more hot spots is detected. The processor may, in general, control any aspect of the laser (e.g. wavelength, pulse rate, intensity, etc.) based on the analyzed temperature data. In some embodiments processor 307 may receive other inputs from, e.g., acceleration, speed, or position sensors located in handpiece 126, thermal sensors located in handpiece 126, tissue type sensors located in handpiece 126, etc. Processor 307 may analyze information from these sensor inputs in conjunction with the temperature information from the thermal camera and display information, provide alarms, and/or control laser 125 based on the analysis.
For example, in some embodiments, the laser 125 may be directed by the processor 307 to automatically self limit the laser power applied to the target tissue 309. The laser driver 307 automatically adjusts the laser 125 power to achieve a selected surface sector temperature within the field of view 308. The surgeon 120 may thereby select a treatment area by, for example, aiming a treatment beam or simply placing a surgical waveguide into the selected treatment sector and stepping on a footswitch. In some such embodiments, the system may also optionally be placed in a manual mode where the surgeon 120 is in direct control of the laser 125.
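Such self-limiting behavior could be realized, for example, with a simple proportional rule that backs power off as the measured sector temperature approaches the selected target. The following Python sketch is illustrative only; the function name, gain, and signature are assumptions, not part of the disclosure:

```python
def limit_laser_power(target_c, measured_c, max_power_w, gain=0.5):
    """Proportional power limiter sketch: commanded power shrinks with
    the remaining temperature error and is clamped to [0, max_power_w].
    Power is cut entirely at or above the selected target temperature."""
    error = target_c - measured_c
    if error <= 0.0:
        return 0.0
    return min(max_power_w, gain * error)
```

A practical controller would add integral/derivative terms and rate limiting, but the clamp-to-zero behavior at the target temperature is the essential safety property.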
The following describes a number of exemplary alarm settings which are useful in various thermal surgical settings.
Some embodiments may feature a master alarm, where an alarm occurs when any point within the thermal camera field of view exceeds a threshold, for example 48°C. In some embodiments, the alarm is not user selectable (i.e. hardwired). It is possible that the threshold is exceeded as a result of a hot cannula 207, which may be a result of a surgical waveguide 208 slipping in the cannula 207 as the surgical waveguide is plunged in and out of the tissue. In some embodiments, a false alarm resulting from a hot cannula 207 may be reduced or eliminated by weighting the points within the thermal camera field of view, allowing for localized hot spots that do not set off the alarm. Some embodiments feature a "fast alarm", where an alarm occurs when any point within the thermal camera field of view exceeds a threshold, for example 45°C, and a rapid rate of change of temperature (ΔT) is detected within the thermal camera field of view. In some embodiments, the fast alarm is not user selectable. In some embodiments, ΔT is approximately 0.2°C/min.; a ΔT on the order of 0.2°C/min. may detect a leakage of warm fluids from an incision site. Further, a ΔT on the order of 0.2°C/min. may give an indirect indication of an excessive internal deep tissue temperature, as a deep tissue temperature may appear in a high resolution IR video image rapidly. In some embodiments, a more conservative threshold is chosen for the fast alarm than for the master alarm.
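The master and fast alarm rules can be sketched as follows. This Python fragment is illustrative only; the function name and signature are assumptions, with default thresholds taken from the example values in the text (48°C master, 45°C fast, ΔT ≈ 0.2°C/min):

```python
def check_alarms(temps_c, dT_per_min, master_c=48.0, fast_c=45.0,
                 fast_rate=0.2):
    """Evaluate one thermal frame. Master alarm: any point exceeds
    master_c. Fast alarm: any point exceeds fast_c AND a rapid rate of
    temperature change is observed in the field of view."""
    alarms = []
    peak = max(temps_c)
    if peak > master_c:
        alarms.append("master")
    if peak > fast_c and dT_per_min >= fast_rate:
        alarms.append("fast")
    return alarms
```

A weighting scheme to suppress false alarms from a hot cannula, as described above, would replace the plain `max()` with a spatially weighted maximum.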
Some embodiments feature a local treatment zone High/Low alarm, where a set of treatment zone High/Low alarm thresholds are adjustable thresholds. In one embodiment, for example, the treatment zone High/Low alarm thresholds may be set to 39°C and 42°C, respectively. In some embodiments, the treatment zone High/Low alarm is accompanied by a soft audible tone. Two ways of selecting data for determination of treatment zone thresholds include: selecting the entire thermal camera field of view and selecting a subset of the field of view, e.g. a 10 cm diameter area proximate to an aim beam, or other indicator indicative of the area targeted for treatment.
In some embodiments, the dynamic range of thermal camera 105 may be adjusted to correspond to an expected temperature range for a given surgical application. For example, in some embodiments, the display 115 displays a treatment area within the surgical field 101 and the surgical surroundings. In such embodiments, a temperature range between the surgical surroundings and an expected maximum safe tissue temperature in the treatment area may be 10°C-80°C. As an example, a typical laser lipolysis treatment may require a range of approximately 20°C-45°C for some applications. By tailoring the dynamic range of the camera to the expected temperature range, cost and processing time reduction may be gained. In some embodiments, a software feature set of the thermal camera may be limited. For example, in some embodiments the camera may not provide 'ON/OFF' controls, may default to display a false color image, and may have only one to three easily set temperature ranges.
Referring to FIG. 4, in one embodiment, thermal surgical monitoring system
400 omits a thermal camera. A conventional camera 405 (e.g. operating within the visual spectrum) is positioned to view a surgical field 407 to produce a video stream (e.g. a sequence of images of the surgical field). In some embodiments, camera 405 may be a digital camera (e.g. a CCD camera). As shown, camera 405 is a digital color CCD camera which produces an RGB video stream familiar in the art. The video camera 405 communicates with a processor 410 to provide the video stream for processing, as described in detail below.
As shown, the surgical field includes an area of skin 408 of a patient. A surgical probe 415 is inserted through an incision 409 in the skin such that a distal end of the probe is located below the skin in a treatment region 411.
The probe 415 includes an optical delivery device (as shown, an optical fiber 416). The optical fiber 416 is coupled to a therapeutic light source 420 (e.g. a laser) and can transmit therapeutic light from the therapeutic source 420 to the treatment region 411 to effect treatment (e.g. heating of fatty tissue to induce lipolysis, thermal ablation, etc.). The surgical probe 415 may include one or more cannulas 412 (one is shown), which may optionally house the optical fiber 416. In some embodiments, the cannula 412 may be a suction cannula to which a vacuum may be applied, e.g. to remove the byproducts produced by application of the therapeutic light to tissue in the target region.
The optical fiber is also coupled to a tracking light source 425 (e.g. a laser light source) to transmit visible tracking light which is emitted from the distal end of the probe 415 in the treatment region 411. The visible tracking light is transmitted and/or scattered through the treatment region to illuminate an area 417 of the skin 408 which overlays the distal end of the probe. In one embodiment, the tracking light source is a laser diode emitting light in the red portion of the visible spectrum (e.g. at 808 nm). In alternative embodiments, the tracking light source may be incorporated in the probe 415, for example, as an element (e.g. an LED) positioned on the distal end of the probe to emit light directly into the treatment region 411.
Accordingly, the illuminated area 417 appears in the video stream from the camera 405 as an externally visible spot on the skin 408 in the surgical field 407 (e.g. the exemplary field shown in FIG. 5A). The processor 410 processes the video stream to track the location of the illuminated area 417 to provide tracking information indicative of the position of the distal end of the probe 415 under the patient's skin. Tracking information obtained in this fashion may be described as "visual" tracking information. In some embodiments, the tracking light may be modulated (e.g. pulsed at a known frequency) to aid in identification of the illuminated area 417 in the video stream (e.g. using lock-in detection processing techniques known in the art).
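One way to exploit a modulated tracking light is a discrete lock-in correlation on the per-frame brightness of a candidate spot, which rejects unmodulated background illumination. The following Python sketch is illustrative only; the function name and the use of a simple full-record correlation (rather than a running low-pass filter) are assumptions:

```python
import math

def lockin_amplitude(samples, fs_hz, f_mod_hz):
    """Correlate a brightness sample record against sine/cosine references
    at the modulation frequency; the resulting magnitude recovers the
    modulated component while constant (ambient) light averages to zero."""
    n = len(samples)
    i = sum(s * math.cos(2 * math.pi * f_mod_hz * k / fs_hz)
            for k, s in enumerate(samples))
    q = sum(s * math.sin(2 * math.pi * f_mod_hz * k / fs_hz)
            for k, s in enumerate(samples))
    return 2.0 * math.hypot(i, q) / n
```

For example, a spot pulsing at 2 Hz with unit amplitude on top of a large constant background yields an amplitude near 1 when sampled over an integer number of modulation cycles.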
In some embodiments, the surgical probe 415 includes a thermal sensor 435 which is in communication with processor 410 to provide a temperature signal indicative of the temperature of the treatment region surrounding the distal end of the probe 415. For example, thermal sensor 435 may be a thermistor mounted on the distal end of the probe 415 which samples the temperature of its environment, e.g. continuously or at regular intervals. In other embodiments, the thermal sensor 435 may include a thermocouple, a pyrometer, or an infrared (IR) thermal sensor.
In some embodiments, the thermal sensor 435 exhibits a finite reaction rate in response to changes in temperature. As is known in the art, the response rate of the sensor may be characterized by a time constant. Typical sensors (e.g. thermistors) known in the art operate with a variety of response time constants ranging from 1 ms or less to a second or more, e.g., about 250 ms.
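A sensor with such a time constant follows a first-order exponential response, which can be modeled as follows. This Python sketch is illustrative only; the function name is an assumption, and the ~250 ms default time constant is taken from the text:

```python
import math

def sensor_reading(true_c, prev_reading_c, dt_s, tau_s=0.25):
    """First-order thermal sensor model: over a step of dt_s seconds the
    reading relaxes toward the true temperature with time constant tau_s,
    covering a fraction (1 - exp(-dt/tau)) of the remaining difference."""
    a = 1.0 - math.exp(-dt_s / tau_s)
    return prev_reading_c + a * (true_c - prev_reading_c)
```

After one time constant the reading has covered about 63% of a temperature step, which bounds how fast measurements remain meaningful as the probe tip moves between locations.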
Referring back to FIG. 4, a display 440 is in communication with processor 410 and camera 405. The display 440 allows a user to view, in real time, video images from the video stream of the surgical field 407 showing the skin surface area 408 overlaying the internal treatment region 411. An exemplary display view of the surgical field 407 is shown in FIG. 5C. As shown, the display 440 shows grayscale video images of the surgical field, but in other embodiments a color, false color, or other video display may be used.
As described above, processor 410 processes the video stream to determine tracking information indicative of the position of the spot area 417 on skin surface 408 illuminated by the tracking light from the distal end of the surgical probe 415. Using this information, the processor superimposes a graphical tracking indicia (as shown, a cross hair, although any other suitable indicia may be used) on the display of the skin surface 408 indicating the position on the surface overlaying the distal end of the probe 415. As the probe 415 is moved throughout the treatment region 411, the cross hair indicia is repositioned, providing the practitioner real time visual tracking of the probe end.
As the distal end of the probe 415 is moved throughout the treatment region, temperature sensor 435 on the probe samples the temperature at several locations. This temperature information is communicated to the processor 410, and combined with the tracking information to determine positions on the skin surface corresponding to the underlying temperature sample locations. Referring to FIG. 5C, the processor 410 uses this information to generate graphical elements 501 (as shown, dots) superimposed on the video display at these positions. A graphical property of the dots (as shown, color) is indicative of the temperature of the corresponding internal location. In other embodiments, other graphical properties may be used to represent temperature including: color, shape, size, brightness, grey scale, blink rate, an alphanumeric indicator, etc. The display further includes an alphanumeric indication 503 corresponding to the temperature measured at the position currently occupied by the cross hair indicia.
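The temperature-to-color mapping for the dots 501 might, for example, be a simple linear blend from a cool to a hot color. The following Python sketch is illustrative only; the range endpoints and function name are assumptions, not values from the disclosure:

```python
def temp_to_color(temp_c, lo=37.0, hi=48.0):
    """Map a sampled temperature to an RGB false color: blue at or below
    lo, red at or above hi, with a linear blend in between."""
    t = (temp_c - lo) / (hi - lo)
    t = min(1.0, max(0.0, t))          # clamp to the display range
    return (int(255 * t), 0, int(255 * (1.0 - t)))
```

A clinical implementation would likely use a perceptually uniform palette, but a linear blend suffices to show how each dot's color encodes the sampled internal temperature.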
Referring to FIG. 5C, as the surgical probe is moved back and forth along a stroke path, a linear track 505 of false color dots 501 representing the measured internal temperatures along the path is generated and superimposed onto the gray scale background. Referring to FIG. 5D, as the surgical probe 415 traverses additional stroke paths, a false color temperature map 507 of a portion of the underlying treatment region is generated over the display of the surgical field. This map may be generated in a diagnostic mode, during which no therapeutic light is delivered by the surgical probe 415. Alternatively, the map may be generated in a treatment mode during a procedure in which therapeutic light is applied, thereby generating a visual representation of the distribution of internal heating effected during the procedure.
In some embodiments, each color dot persists for a selected time (e.g. a few seconds) and then is removed or, in some embodiments, slowly fades to transparent (as shown in FIG. 5E). If the location of the removed or faded dot is traversed again by the distal end of the surgical probe 415, a new color dot is generated with new temperature information. In the treatment mode, when the measurement is made, the temperature is accurate for that location and point in time, but as the surgical probe moves away to deliver therapeutic light to other locations, the temperature typically drops as heat dissipates. Accordingly, the fading persistence of the dots removes older measurements from the temperature map as they become less accurate. Similarly, if the camera 405 or the patient is moved, the color map can fade on its own until a new map is made or the existing map is cleared by operator command.
The temperature map shown in FIG. 5D is two dimensional only (depth may be subjectively known and controlled by the physician, e.g. by manually limiting the movement of the distal end of the probe 415 to a chosen operating plane). However, in other embodiments, information indicative of the depth of the distal end of the probe 415 may be generated (e.g. using an inertial sensor or other sensing techniques) and represented in the temperature map (e.g. by indicating the depth of a given measurement location by the size of the corresponding color dot).
In some embodiments, the depth of the temperature measurement can be determined by using a dual color tracking light. For example, tracking light source 425 may emit both red and green light which is combined in the fiber 416 and delivered to the treatment region 411 to be scattered to illuminate skin surface area 417. The intensity of each color component of the light from illuminated area 417 may be detected in the red and green channels, respectively, of the RGB CCD camera 405, leaving the blue channel to produce a grayscale image of the surgical field 407 for viewing on display 440. The red and green wavelengths have different absorption and scattering coefficients in typical tissue structures including fat, dermis, and epidermal melanin. Accordingly, each color of tracking light will experience different effects as it passes through treatment region 411 to illuminate area 417. For example, if the red wavelength is chosen to avoid the strong blood absorption band with a 577 nm peak, the red light will experience less absorption and less scattering than the green. For light emitted at deeper locations in the treatment region 411, this will lead to a red spot on the skin surface that is more intense and less dispersed than the green spot. Accordingly, the ratio of the green to red scattered spot intensities and the ratio of the green to red scattered spot diameters can be used to indicate the depth of the distal end of probe 415. For example, the ratios could be stored in processor 410 as entries in lookup tables that convert these ratios to corresponding depths. In some embodiments, the lookup tables may be pre-calculated, e.g. based on Monte Carlo modeling of a range of probe depths in various skin types. Alternatively, the tables may be determined based on empirical data.
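One way the ratio-to-depth lookup could work is sketched below. The table values are invented for demonstration (a real table would come from Monte Carlo modeling or empirical calibration, as the text notes), and linear interpolation between rows is an assumed choice.

```python
# Hypothetical calibration table: (green/red intensity ratio, depth in mm).
# The ratio falls as the probe moves deeper, because green light is absorbed
# and scattered more strongly than red on its way to the skin surface.
RATIO_TO_DEPTH = [(0.9, 1.0), (0.6, 3.0), (0.4, 6.0), (0.25, 10.0)]

def estimate_depth_mm(green_intensity, red_intensity):
    """Estimate probe depth from the green/red scattered-spot intensity ratio,
    interpolating linearly between the (descending) table rows and clamping
    at the table ends."""
    ratio = green_intensity / red_intensity
    if ratio >= RATIO_TO_DEPTH[0][0]:
        return RATIO_TO_DEPTH[0][1]
    if ratio <= RATIO_TO_DEPTH[-1][0]:
        return RATIO_TO_DEPTH[-1][1]
    for (r_hi, d_hi), (r_lo, d_lo) in zip(RATIO_TO_DEPTH, RATIO_TO_DEPTH[1:]):
        if r_lo <= ratio <= r_hi:
            frac = (r_hi - ratio) / (r_hi - r_lo)
            return d_hi + frac * (d_lo - d_hi)
```

A second, analogous table could convert the green/red spot-diameter ratio to depth, with the two estimates averaged or cross-checked.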
Obstructions in the field of view of the camera (e.g. the surgeon placing a hand over the surgical site) may result in a loss of visual tracking. However, as described in detail below, inertial sensor data from the probe 415 may be used in some embodiments to maintain position tracking even during intervals in which visual tracking is lost.
As shown in FIG. 4, the inertial sensor 430 includes a single or multiple degree of freedom MEMS sensor mounted on the surgical probe 415 (e.g. in a handpiece element). Such a sensor may include, for example, multiple accelerometers (e.g. to measure linear acceleration along 3 axes) and/or multiple gyroscopes (e.g. to measure angular acceleration along 3 axes). Some embodiments may use a magnetometer (e.g. a 3 axis magnetometer) to measure motion relative to the earth's magnetic field. Information from the magnetometer may be used to compensate for gyro drift or to provide position information measured relative to a fixed 3D reference frame. In various embodiments, any other number and type of sensors may be used. Further examples of inertial sensors for surgical probes are described in the references incorporated above.
Signals from the inertial sensor may be processed using techniques known in the art to determine information related to the position or movement of the probe. The information may include an absolute position, a relative position, a linear or angular speed, a linear or angular velocity, a linear or angular acceleration, an orientation in space, etc.
In some embodiments, the position information determined from the inertial sensor signal may be used to provide inertial tracking of the surgical probe 415 during periods when the above described visual tracking technique fails (e.g. due to an obstruction in the view of camera 405). The inertial position information may be used by processor 410, e.g., in combination with the last known position of the probe 415 prior to the loss of visual tracking, to track the position of the probe. As with the visual tracking information, this inertial tracking information may be used to position the cross hair in the display and/or allow for uninterrupted generation of the displayed temperature map.
The devices and techniques described above can aid a surgeon in applying a desired distribution of therapeutic light to the treatment region 411 by allowing the surgeon to visualize the internal temperature produced in response to the therapeutic light.
For example, deep "fat busting" procedures typically place the distal end of the surgical probe 415 out of range of surface temperature feedback techniques such as IR camera based systems, as described above. It is easy to unintentionally overheat deep tissue layers (e.g. beyond the temperature required for optimum safe lipo disruption). For example, FIG. 5B illustrates a typical thermal surgical field. The surgeon inserts the distal end of the surgical probe 415 into the incision point, and moves the probe throughout the indicated zone to provide sub-surface heating. FIG. 5B shows exemplary temperature isoclines indicative of the internal temperatures in the treatment region 411 underlying the skin surface of the patient. As indicated, due to variation in tissue thermal properties across the region, variations in the surgeon's technique, etc., very irregular temperature distributions may develop. Without the ability to visualize the internal temperature distribution, unwanted hot or cold spots may develop, leading to patient injury or unsatisfactory treatment results. The internal temperature visualization techniques described above provide real time guidance allowing the practitioner to reduce or eliminate these difficulties.
In various embodiments, processor 410 controls the therapeutic light source 425 based on temperature information from the temperature sensor 435. Tissue temperature information feedback allows for closed loop tissue temperature control, wherein a property of the therapeutic light output by source 425 (e.g. output power, pulse rate, wavelength, etc.) may be controlled (e.g. modulated) to effect a desired tissue temperature profile for a given procedure.
For example, as noted above, excessive deep heating is associated with various deleterious side effects such as necrosis of blood vessels, or even thermal damage to adjacent tissue layers (muscle, fascia, etc). By employing closed loop temperature management and/or the internal temperature visualization techniques described herein, optimum tissue temperatures can more easily be maintained, simplifying the procedure for the clinician and providing improved efficacy with enhanced safety.
Another example of closed loop temperature management benefits is in skin tightening procedures where the cannula tip is placed proximal to the sub-dermal layer. In essence, the laser heats fat adjacent to these deeper dermal areas and the heat acts on the entire dermis to effect so-called collagen remodeling (skin tightening). In some applications, a difficulty is that thermal conduction through dermal layers (to effect skin tightening) varies greatly based on skin type and thickness. Thermal gradients from the deep dermis to epidermal layers may vary considerably. Thus it is possible to overheat deeper sub-dermal areas while effecting optimum surface temperatures. This may cause vascular damage and other side effects. With closed loop thermal control and/or internal thermal visualization, a compromise between optimum epidermal and sub-dermal temperatures can be made. Some embodiments may further include surface temperature measurement devices and techniques as described herein, to provide even more control over both internal and surface temperatures.
For various applications, the optimum time constant (response rate) of the temperature sensor may vary. A faster response time has the advantage of actively measuring tissue temperature throughout the surgeon's treatment stroke. To accomplish this, the thermal mass of, e.g., a thermistor or thermocouple should be reduced or minimized. Another possibility is to measure the treatment stroke length (e.g. using an accelerometer to detect a sign change in the velocity of the probe), divide the treatment stroke into near, mid, and far "ranges," and then sample the average temperature for the period the cannula tip is present in each range. This allows a slower response time thermocouple to generate a relatively precise average temperature feedback signal for each of the near, mid, and far range areas. The feedback can then be used by the laser to adjust or even out temperature accumulation through each "range" of the probe stroke. This approach, combined with the temperature visualization techniques described herein, may help compensate for poor clinician technique.
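The per-range averaging above can be sketched as follows. This is an assumed implementation for illustration: it takes position samples along one stroke (e.g. distance of the cannula tip from the incision) with concurrent temperature readings, splits the stroke span into equal near/mid/far ranges, and averages the readings falling in each.

```python
def range_averages(positions, temps, n_ranges=3):
    """Average the temperature samples taken in each of n_ranges equal
    segments of a treatment stroke (near/mid/far by default).

    `positions` are tip distances along one stroke; `temps` are the
    concurrent sensor readings. A slow-responding thermocouple can still
    yield a usable average feedback value per range this way."""
    lo, hi = min(positions), max(positions)
    span = (hi - lo) / n_ranges or 1.0       # guard against a zero-length stroke
    buckets = [[] for _ in range(n_ranges)]
    for p, t in zip(positions, temps):
        i = min(int((p - lo) / span), n_ranges - 1)   # clamp the far endpoint
        buckets[i].append(t)
    return [sum(b) / len(b) if b else None for b in buckets]
```

The three returned averages would then feed the laser's closed loop control to even out heating across the stroke.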
In various embodiments, any of the temperature feedback and safety devices and techniques described in the references incorporated above may be used in conjunction with system 400.
Referring to FIG. 6, the operation of an exemplary embodiment of the processor 410 is shown. In the embodiment shown, the processor 410 receives a color (RGB) video stream from a CCD camera 405. The video stream is passed through a color splitter 601 which extracts a mono-color (as shown, green) portion of the video stream. The color splitter may be a digital color splitter; however, in some embodiments the splitter may be an analog device. In some embodiments the splitter is replaced by an optical element (e.g. a dichroic mirror) which directs a selected color of light to a camera to generate a mono-color video stream. The mono-color video stream is passed to a black and white video stream generator 602 which converts the mono-color video stream to grey scale.
A color video stream passes through color splitter 601 and is passed to a video-to-X-Y converter 603 which converts the video stream images to a pixelated map, with each pixel having corresponding X and Y coordinates. As shown, each pixel has an associated RGB color value. The RGB X-Y pixelated map is passed to a color filter 604 to provide a mono-color pixel map (as shown, red). The chosen color may be matched to the color of the tracking light to enhance the appearance of the illuminated tracking spot in the pixelated map (e.g. by increasing the contrast between the spot and the surrounding background).
The mono-color pixelated X-Y map is then passed to a detection module 605 which detects the location of the spot illuminated by the tracking light. Any suitable detection and tracking algorithm known in the art may be used. For example, the algorithm may detect a peak intensity pixel in the mono-color pixelated map and identify that pixel with the position of the spot. In other embodiments, the perimeter of the spot may be identified (e.g. by examining adjacent pixel-to-pixel intensity changes), and the geometric centroid of the spot calculated and identified with the position of the spot.
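A minimal sketch of the centroid approach mentioned above, operating on a small 2-D intensity map: the intensity-weighted centroid of above-threshold pixels is computed, with the half-of-peak threshold an assumed choice (the disclosure does not specify one).

```python
def spot_centroid(pixels):
    """Locate the tracking spot in a mono-color X-Y pixel map.

    `pixels` is a 2-D list of intensities. Pixels at or above half the peak
    intensity are treated as part of the spot, and their intensity-weighted
    centroid is returned as (x, y)."""
    peak = max(max(row) for row in pixels)
    thresh = peak / 2.0
    sx = sy = w = 0.0
    for y, row in enumerate(pixels):
        for x, v in enumerate(row):
            if v >= thresh:
                sx += x * v
                sy += y * v
                w += v
    return (sx / w, sy / w)
```

The weighted centroid gives sub-pixel resolution, which helps the cross hair track smoothly even with a coarse camera.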
A data stream indicating the time-varying X-Y position of the spot is passed to a track lock indicator module 606. The visual track lock indicator module 606 monitors the spot position data to determine if the spot is following a realistic track. For example, the module 606 may monitor the rate of change of the spot position and compare this rate to a threshold value above which the tracking data is unreliable (e.g. indicating that the camera view of the illuminated spot has been obstructed).
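The rate-of-change check described above amounts to comparing the frame-to-frame displacement of the spot against a plausibility threshold. A minimal sketch, where the threshold value is an assumption (in practice it would be tuned to the camera frame rate and expected hand speed):

```python
MAX_SPEED_PX_PER_FRAME = 40.0  # assumed threshold, tuned per system in practice

def track_locked(prev_xy, cur_xy):
    """Return False when the spot jumps faster than a plausible hand motion,
    which module 606 would treat as a sign the camera view is obstructed."""
    dx = cur_xy[0] - prev_xy[0]
    dy = cur_xy[1] - prev_xy[1]
    return (dx * dx + dy * dy) ** 0.5 <= MAX_SPEED_PX_PER_FRAME
```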
During periods where the module 606 determines that a realistic track has been maintained, the X-Y spot information is passed to a crosshair generator 607 which generates an image of a cross hair indicia at an X-Y frame position corresponding to the spot information. The cross hair will thus move in real time in response to changes in the spot position. During periods where the module 606 determines that a realistic track has not been maintained, the module sends a "no visual track lock" indicator to inertial tracking module 608. The inertial tracking module 608 receives an inertial sensor signal 609 from the surgical probe 415. The inertial sensor signal includes information indicative of the position of the handpiece which can be used as a substitute for the X-Y spot position data once the visual tracking lock has been lost.
For example, upon determination that the visual tracking has failed, the inertial tracking module 608 may enable inertial tracking and process the inertial sensor signal to determine the change in position of the surgical probe relative to its last known position determined by the visual tracking (e.g. using MEMS accelerometer and/or gyroscope signals). This information can be used to determine corresponding X-Y position data, which is passed on to the crosshair generator 607 in place of the unrealistic X-Y data generated by the visual tracking of the illuminated spot.
In some embodiments, the inertial sensor signal may be subject to drift errors which increase as a function of the inertial measurement time. After a period of sufficient duration, the inertial measurement becomes unreliable. Thus, in some embodiments, upon determination that the visual tracking has failed, the inertial tracking module 608 begins an inertial tracking countdown clock. Once the countdown completes, the module 608 determines that the inertial tracking is no longer reliable and terminates the inertial tracking. At this time visual tracking can recommence. If visual tracking remains unrealistic, all tracking may cease and an alarm or other indication of tracking failure (e.g. disappearance of the cross hair indicia from the display) may be produced.
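The fallback logic of the preceding paragraphs can be sketched as a small state machine. Everything here is an illustrative assumption: the 2 second timeout, the class and method names, and the simplification that the MEMS signal has already been integrated into a per-frame (dx, dy) displacement.

```python
INERTIAL_TIMEOUT_S = 2.0  # assumed drift limit before inertial data is distrusted

class FallbackTracker:
    """Substitute dead-reckoned position for visual tracking, for a bounded time."""
    def __init__(self):
        self.last_visual = None   # last accepted (x, y) position
        self.lost_since = None    # time visual lock was lost, or None

    def update(self, visual_xy, inertial_delta, now):
        """visual_xy: spot position, or None while visual lock is lost;
        inertial_delta: (dx, dy) integrated from the inertial sensor this frame;
        returns the tracked position, or None once all tracking has ceased."""
        if visual_xy is not None:                 # visual lock: reset everything
            self.last_visual, self.lost_since = visual_xy, None
            return visual_xy
        if self.last_visual is None:
            return None                           # never had a fix to extend
        if self.lost_since is None:
            self.lost_since = now                 # start the countdown clock
        if now - self.lost_since > INERTIAL_TIMEOUT_S:
            return None                           # countdown expired: stop tracking
        x, y = self.last_visual                   # dead-reckon from last position
        self.last_visual = (x + inertial_delta[0], y + inertial_delta[1])
        return self.last_visual
```

Returning None maps onto the "disappearance of the cross hair indicia" failure indication described above.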
While tracking (either visual or inertial) is maintained, the cross hair generator 607 passes the X-Y data on to a false color generator 610. A thermal sensor signal 611 from thermal sensor 435 on the distal end of probe 415 is passed through a scaling module 612 to generate color information corresponding to the temperature measured by the thermal sensor. The scaling information may be provided in the form of a look up table, or any suitable scaling formula or algorithm. The false color generator 610 combines the X-Y tracking data with the color scaled temperature data to generate a false color temperature map image which includes graphical elements (e.g. dots) at positions in the frame, each having a color indicating the corresponding measured temperature.
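One simple form the scaling module's lookup table could take is a step lookup from temperature breakpoints to RGB colors. The breakpoint temperatures and colors below are invented for demonstration; a real system would use clinically chosen values.

```python
# Hypothetical scaling table: (upper temperature limit in deg C, RGB color).
TEMP_COLORS = [(35.0, (0, 0, 255)),    # cool: blue
               (42.0, (0, 255, 0)),    # mid range: green
               (47.0, (255, 255, 0)),  # warm: yellow
               (50.0, (255, 0, 0))]    # hot: red

def temp_to_color(temp_c):
    """Map a measured tissue temperature to a false color for the map dots,
    standing in for whatever table or formula scaling module 612 uses."""
    for limit, color in TEMP_COLORS:
        if temp_c <= limit:
            return color
    return TEMP_COLORS[-1][1]          # clamp everything above the hottest band
```

A smoother variant would interpolate between adjacent colors rather than stepping, at the cost of a slightly busier map.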
The false color temperature map is passed on to a persistence generator 613, which causes the graphical elements to be removed from the temperature map or to fade away after a given time period (e.g. at a set time after the cross hair indicia has moved away from the corresponding X-Y position). Accordingly, as described above, older, less reliable temperature data is continuously removed from the temperature map.
The generated cross hair image and the temperature map are combined at a mixer 614, and subsequently converted out of the X-Y format to a video stream by converter 615. This cross hair and temperature map video stream is overlaid on the black and white video stream of the surgical field 407 by a video mixer 616. The combined video stream is passed to the display 440 for viewing (e.g. as shown in FIGs. 5C-5E).
The various components of processor 410 described above may be implemented in software (e.g. run on a general purpose microprocessor) or in hardware (e.g. including one or more integrated circuit devices) or in a combination thereof.
One or more or any part thereof of the measurement, feedback, and/or display techniques described above can be implemented in computer hardware or software, or a combination of both. The methods can be implemented in computer programs using standard programming techniques following the method and figures described herein. Program code is applied to input data to perform the functions described herein and generate output information. The output information is applied to one or more output devices such as a display monitor. Each program may be implemented in a high level procedural or object oriented programming language to communicate with a computer system. However, the programs can be implemented in assembly or machine language, if desired. In any case, the language can be a compiled or interpreted language. Moreover, the program can run on dedicated integrated circuits preprogrammed for that purpose.
Each such computer program is preferably stored on a storage medium or device (e.g., ROM or magnetic diskette) readable by a general or special purpose programmable computer, for configuring and operating the computer when the storage media or device is read by the computer to perform the procedures described herein. The computer program can also reside in cache or main memory during program execution. The analysis method can also be implemented as a computer-readable storage medium, configured with a computer program, where the storage medium so configured causes a computer to operate in a specific and predefined manner to perform the functions described herein.
A number of embodiments of the invention have been described. Nevertheless, it will be understood that various modifications may be made without departing from the spirit and scope of the invention.
For example, it is to be understood that although in the examples provided above laser light is used for treatment, other sources of treatment light (e.g. flash lamps, light emitting diodes) may be used.
As used herein the term 'light' is to be understood to include electromagnetic radiation both within and outside of the visible spectrum, including, for example, ultraviolet and infrared radiation.
While the invention has been described in connection with the specific embodiments thereof, it will be understood that it is capable of further modification. Furthermore, this application is intended to cover any variations, uses, or adaptations of the invention, including such departures from the present disclosure as come within known or customary practice in the art to which the invention pertains, and as fall within the scope of the appended claims. All publications, patents, and patent applications mentioned in this specification are herein incorporated by reference to the same extent as if each individual publication, patent, or patent application was specifically and individually indicated to be incorporated by reference.

Claims

WHAT IS CLAIMED IS:
1. A method comprising:
using a video camera to generate a video stream of a surgical field comprising an area of skin of a patient;
inserting a surgical probe into the patient such that at least a distal end of the probe is located in a treatment region under the area of skin;
emitting tracking light from the distal end of the probe to illuminate a portion of the area of skin overlaying the distal end of the probe;
processing the video stream to determine tracking information indicative of the position of the distal end of the probe based on the illumination of the portion of the area of skin;
displaying the video stream; and
based on the tracking information, displaying an indicia of the position of the distal end of the probe overlaying the display of the video stream.
2. The method of claim 1, wherein the surgical probe comprises an inertial sensor, and further comprising using the inertial sensor to determine position information indicative of the position or movement of the probe.
3. The method of any preceding claim, wherein the distal end of the probe comprises a temperature sensor, and further comprising: moving the distal end of the probe to a plurality of locations in the treatment region; using the temperature sensor to determine temperature information indicative of the temperature at each of the locations; and overlaying the display of the video stream with graphical information indicative of the temperature of the locations based on the tracking information and the temperature information.
4. The method of claim 3, comprising, for each of the plurality of locations: determining the position of the illuminated portion of the area of skin overlaying the probe; determining the temperature at the location of the distal end of the probe; and displaying a graphical element overlaying the display of the video stream for a selected duration at a display location corresponding to the position of the illuminated portion of the area of skin, wherein a property of the graphical element is indicative of the temperature of the probe at the location.
5. The method of claim 4, wherein the property of the graphical element comprises at least one from the list consisting of: color, shape, size, brightness, grey scale, blink rate, and an alphanumeric indicator.
6. The method of claim 4 or claim 5, wherein displaying a graphical element overlaying the display of the video stream for a selected duration comprises causing the graphical element to fade away at a selected rate.
7. The method of any of claims 3-6, wherein the temperature sensor comprises a thermistor.
8. The method of any of claims 4-7, wherein the surgical probe comprises an inertial sensor, and further comprising: using the inertial sensor to determine position information indicative of the position or movement of the probe, and wherein, for each of the plurality of locations, a property of the respective graphical element is determined based on the position information.
9. The method of claim 8, wherein the position information includes information indicative of a depth under the skin of the distal end of the probe at the location.
10. The method of claim 8 or 9, wherein the position information includes information indicative of a dwell time of the distal end of the probe at the location.
11. The method of claim 8, 9, or 10, wherein the position information includes information indicative of a speed of motion of the distal end of the probe at the location.
12. The method of any preceding claim wherein the surgical probe comprises an inertial sensor, and further comprising: using the inertial sensor to determine position information indicative of the position or movement of the probe; providing an obstruction blocking the video camera from viewing the illumination of the portion of the area of skin by the tracking light, thereby causing an interruption of the determination of the tracking information; and during the interruption, providing substitute tracking information based on the position information determined by the inertial sensor.
13. The method of any preceding claim, wherein the tracking information comprises information indicative of the depth of the distal end of the probe beneath the skin.
14. The method of claim 13, wherein: the step of emitting tracking light comprises emitting a first portion of tracking light at a first wavelength and a second portion of tracking light at a second wavelength; and the step of processing the video stream to determine tracking information indicative of the position of the distal end of the probe based on the illumination of the portion of the area of skin comprises comparing an intensity of the illumination at the first wavelength to an intensity of the illumination at the second wavelength.
15. The method of claim 14, wherein the first wavelength is in the red portion of the visible spectrum and the second wavelength is in the green portion of the visible spectrum.
16. The method of any preceding claim wherein the surgical probe comprises an optical delivery device, and further comprising: using the optical delivery device to deliver therapeutic light from a source to the treatment region.
17. The method of any preceding claim, wherein the therapeutic light comprises laser light.
18. The method of claim 17, wherein the therapeutic light comprises light having a wavelength in the visible or infrared.
19. The method of any of claims 17-18, wherein the therapeutic light has a wavelength of about 1440 nm.
20. The method of any of claims 17-19, wherein the delivered therapeutic light has a total power in the range of 1W to 20W.
21. The method of any of claims 17-20, wherein the delivered therapeutic light has a power density in the range of 200 W/cm² to 20,000 W/cm² at the target region.
22. The method of any of claims 17-21, wherein the step of delivering therapeutic light from the light emitting portion of the delivery device comprises delivering a series of light pulses.
23. The method of claim 22, wherein the series of pulses comprises a pulse having a duration in the range of about 0.1 ms to about 1.0 ms.
24. The method of claim 22 or claim 23, wherein the series of pulses has a repetition rate in the range of about 10 to about 100 Hz.
25. The method of any of claims 17-24, wherein the surgical probe comprises at least one sensor, and further comprising: using the at least one sensor, generating a signal indicative of at least one property of the delivery device or the treatment region; and controlling the delivery of therapeutic light based on the sensor signal.
26. The method of claim 25, wherein the property of the delivery device or the target region comprises at least one selected from the list consisting of: a position of the optical delivery device, a movement of the optical delivery device, temperature of the optical delivery device, a tissue type in the vicinity of the optical delivery device, an amount of energy delivered by the optical delivery device, and a temperature of tissue in the target region.
27. The method of claim 25 or 26, wherein the sensor comprises at least one selected from the list consisting of: a thermistor, an inertial sensor, an accelerometer, a gyroscope, and a color sensor.
28. The method of claim 25, 26, or 27, further comprising generating a display based on the signal indicative of at least one property of the delivery device or the target region.
29. The method of claim 25, 26, 27, or 28, wherein the surgical probe comprises a cannula and the optical delivery device comprises an optical fiber having at least a portion located in or proximate to the cannula.
30. An apparatus comprising:
a video camera configured to generate a video stream of a surgical field comprising an area of skin of a patient;
a surgical probe configured for insertion into the patient such that at least a distal end of the probe is located in a treatment region under the area of skin;
an optical emitter configured to emit tracking light from the distal end of the probe to illuminate a portion of the area of skin overlaying the distal end of the probe;
a processor in communication with the camera and comprising a visual tracking module which processes the video stream to determine tracking information indicative of the position of the distal end of the probe based on the illumination of the portion of the area of skin; and
a display in communication with the processor and the camera, the display configured to display the video stream and, based on the tracking information, display an indicia of the position of the distal end of the probe overlaying the display of the video stream.
31. The apparatus of claim 30, wherein the surgical probe comprises an inertial sensor in communication with the processor, and wherein the processor comprises an inertial tracking module which processes a signal from the inertial sensor to determine position information indicative of the position or movement of the probe.
32. The apparatus of any preceding claim, wherein the distal end of the probe comprises a temperature sensor in communication with the processor, and wherein: the processor is configured to process a signal from the temperature sensor to determine temperature information indicative of the temperature at each of a plurality of locations traversed by the probe; the processor comprises an image generation module which forms a graphical representation indicative of the temperature of the locations based on the tracking information and the temperature information; and the processor comprises a video mixer which overlays the display of the video stream with the graphical representations.
33. The apparatus of claim 32, wherein the processor comprises: the visual tracking module, said module configured to process the video stream to determine the position of the illuminated portion of the area of skin overlaying the probe; and a temperature sensing module configured to determine, based on the signal from the temperature sensor, the temperature at a plurality of locations in the treatment region; wherein the image generation module and video mixing module cooperate to generate a respective graphical element overlaying the display of the video stream for a selected duration at a location corresponding to the position of the illuminated portion of the area of skin, wherein a property of the graphical element is indicative of the temperature of the probe at the location.
34. The apparatus of claim 33, wherein the property of the graphical element comprises at least one from the list consisting of: color, shape, size, brightness, grey scale, blink rate, and an alphanumeric indicator.
35. The apparatus of claim 33 or 34, wherein the image generation module comprises a persistence module which causes each graphical element to become increasingly transparent at a selected rate.
36. The apparatus of any of claims 32-35, wherein the temperature sensor comprises a thermistor.
37. The apparatus of any of claims 33-36, wherein the surgical probe comprises an inertial sensor, and wherein the processor comprises an inertial tracking module configured to use the inertial sensor to determine position information indicative of the position or movement of the probe, and for each of the plurality of locations, determine a property of the respective graphical element based on the position information.
38. The apparatus of claim 37, wherein the position information includes information indicative of a depth under the skin of the distal end of the probe.
39. The apparatus of claim 37 or 38, wherein the position information includes information indicative of a dwell time of the distal end of the probe.
40. The apparatus of claim 37, 38, or 39, wherein the position information includes information indicative of a speed of motion of the distal end of the probe.
41. The apparatus of any preceding claim wherein the surgical probe comprises an inertial sensor, and wherein the processor comprises: an inertial tracking module configured to use the inertial sensor to determine inertial position information indicative of the position or movement of the probe; and a visual tracking lock indicator module configured to process the video stream to identify an interruption of tracking by the visual tracking module and, upon identifying the interruption, to substitute inertial position information for tracking information from the visual tracking module.
42. The apparatus of any preceding claim, wherein the tracking information comprises information indicative of the depth of the distal end of the probe beneath the skin.
43. The apparatus of claim 42, wherein: the optical emitter emits a first portion of tracking light at a first wavelength and a second portion of tracking light at a second wavelength; and the visual tracking module processes the video stream to determine the information indicative of the depth of the distal end of the probe beneath the skin by comparing a property of the illumination at the first wavelength to a property of the illumination at the second wavelength.
44. The apparatus of claim 43, wherein the first wavelength is in the red portion of the visible spectrum and the second wavelength is in the green portion of the visible spectrum.
45. The apparatus of any preceding claim, wherein the surgical probe comprises an optical delivery device configured to deliver therapeutic light from a therapeutic light source to the treatment region.
46. The apparatus of claim 45, wherein the therapeutic light comprises laser light.
47. The apparatus of claim 46, wherein the therapeutic light comprises light having a wavelength in the visible or infrared.
48. The apparatus of any of claims 46-47, wherein the therapeutic light has a wavelength of about 1440 nm.
49. The apparatus of any of claims 46-48, wherein the delivered therapeutic light has a total power in the range of 1 W to 20 W.
50. The apparatus of any of claims 46-49, wherein the delivered therapeutic light has a power density in the range of 200 W/cm^2 to 20,000 W/cm^2 at the target region.
51. The apparatus of any of claims 46-50, wherein the surgical probe comprises at least one sensor in communication with the processor, and wherein: the at least one sensor is configured to generate a signal indicative of at least one property of the delivery device or the treatment region; and the processor is configured to control the delivery of therapeutic light to the treatment region based on the sensor signal.
52. The apparatus of claim 51, wherein the property of the delivery device or the target region comprises at least one selected from the list consisting of: a position of the optical delivery device, a movement of the optical delivery device, a temperature of the optical delivery device, a tissue type in the vicinity of the optical delivery device, an amount of energy delivered by the optical delivery device, and a temperature of tissue in the target region.
53. The apparatus of claim 51 or 52, wherein the sensor comprises at least one selected from the list consisting of: a thermistor, an inertial sensor, an accelerometer, a gyroscope, and a color sensor.
54. The apparatus of claim 51, 52, or 53, wherein the processor and the display are configured to cooperate to generate a display based on the signal indicative of at least one property of the delivery device or the target region.
55. The apparatus of any of claims 45-54, wherein the surgical probe comprises a cannula and the optical delivery device comprises an optical fiber having at least a portion located in or proximate to the cannula.
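Claim 41 describes substituting inertial position information for visual tracking information when the visual tracking lock is interrupted. The sketch below illustrates that fallback logic with hypothetical 2-D position fixes; none of the names or the dead-reckoning scheme come from the application itself, and a real implementation would fuse full 3-D inertial data.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class TrackerState:
    """Illustrative visual/inertial fusion for a probe position estimate."""
    last_visual_fix: Optional[Tuple[float, float]] = None
    inertial_offset: Tuple[float, float] = (0.0, 0.0)

    def update(self, visual_fix, inertial_delta):
        """Prefer the visual fix; fall back to inertial dead reckoning
        when the visual tracking lock is interrupted (visual_fix is None)."""
        if visual_fix is not None:
            # Visual lock held: reset the dead-reckoning origin.
            self.last_visual_fix = visual_fix
            self.inertial_offset = (0.0, 0.0)
            return visual_fix
        # Visual lock lost: integrate inertial motion from the last good fix.
        if self.last_visual_fix is None:
            raise RuntimeError("no position fix available yet")
        dx, dy = inertial_delta
        ox, oy = self.inertial_offset
        self.inertial_offset = (ox + dx, oy + dy)
        x, y = self.last_visual_fix
        return (x + self.inertial_offset[0], y + self.inertial_offset[1])
```

Resetting the inertial offset on every good visual fix keeps the dead-reckoned drift bounded to the duration of the interruption.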
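Claims 42-44 describe estimating the depth of the probe tip beneath the skin by comparing a property of the illumination at a red wavelength to one at a green wavelength. One way such a comparison could work, sketched here under a simple Beer-Lambert attenuation assumption with illustrative attenuation coefficients that are not specified in the application:

```python
import math

# Hypothetical effective attenuation coefficients of skin, per mm.
# Red light penetrates tissue better than green, so the red/green
# intensity ratio grows with depth.
MU_RED = 0.25
MU_GREEN = 0.90

def estimate_depth_mm(red_intensity, green_intensity, surface_ratio=1.0):
    """Estimate probe-tip depth from the red/green intensity ratio.

    Assuming Beer-Lambert attenuation, I(d) = I0 * exp(-mu * d), the
    log of the detected ratio is linear in depth d:
        ln(R/G) = ln(R0/G0) + (MU_GREEN - MU_RED) * d
    where R0/G0 is the ratio measured with the tip at the surface.
    """
    if red_intensity <= 0 or green_intensity <= 0:
        raise ValueError("intensities must be positive")
    ratio = red_intensity / green_intensity
    depth = math.log(ratio / surface_ratio) / (MU_GREEN - MU_RED)
    return max(depth, 0.0)
```

Calibrating `surface_ratio` at the skin surface cancels out source brightness and sensor gain, so only the differential attenuation between the two wavelengths carries the depth information.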
PCT/US2010/026349 2009-03-05 2010-03-05 Thermal surgical monitoring WO2010102197A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2010/026349 WO2010102197A2 (en) 2009-03-05 2010-03-05 Thermal surgical monitoring

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US15786209P 2009-03-05 2009-03-05
US61/157,862 2009-03-05
PCT/US2010/026349 WO2010102197A2 (en) 2009-03-05 2010-03-05 Thermal surgical monitoring

Publications (2)

Publication Number Publication Date
WO2010102197A2 true WO2010102197A2 (en) 2010-09-10
WO2010102197A3 WO2010102197A3 (en) 2010-11-11

Family

ID=43795098

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2010/026349 WO2010102197A2 (en) 2009-03-05 2010-03-05 Thermal surgical monitoring

Country Status (1)

Country Link
WO (1) WO2010102197A2 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060058604A1 (en) * 2004-08-25 2006-03-16 General Electric Company System and method for hybrid tracking in surgical navigation
US20070208252A1 (en) * 2004-04-21 2007-09-06 Acclarent, Inc. Systems and methods for performing image guided procedures within the ear, nose, throat and paranasal sinuses
WO2008154005A1 (en) * 2007-06-08 2008-12-18 Cynosure, Inc. Thermal surgical monitoring

Cited By (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102010052219A1 (en) * 2010-11-24 2012-05-24 Karl Storz Gmbh & Co. Kg Holding system for medical instruments
US9387008B2 (en) 2011-09-08 2016-07-12 Stryker European Holdings I, Llc Axial surgical trajectory guide, and method of guiding a medical device
EP2567668A1 (en) * 2011-09-08 2013-03-13 Stryker Leibinger GmbH & Co. KG Axial surgical trajectory guide for guiding a medical device
CN102988109A (en) * 2011-09-08 2013-03-27 史赛克莱宾格股份有限公司 Axial surgical trajectory guide, and method of guiding medical device
GB2496376A (en) * 2011-11-02 2013-05-15 Inst Medical Informatics Audio visual camera support for operating theatre
WO2014026126A1 (en) * 2012-08-10 2014-02-13 The General Hospital Corporation Method and apparatus for dermatological treatment
KR20150058194A (en) * 2012-09-20 2015-05-28 소니 주식회사 Information processing apparatus, information processing method, program, and measuring system
KR102200740B1 (en) * 2012-09-20 2021-01-08 소니 주식회사 Information processing apparatus, information processing method, program, and measuring system
US9646378B2 (en) 2012-09-20 2017-05-09 Sony Corporation Information processing apparatus, information processing method, program, and measuring system
WO2014045558A1 (en) * 2012-09-20 2014-03-27 Sony Corporation Information processing apparatus, information processing method, program, and measuring system
EP3138526A1 (en) * 2014-06-18 2017-03-08 Covidien LP Augmented surgical reality environment system
EP3096065A1 (en) * 2015-05-21 2016-11-23 Euromedis Groupe Video system including a hinged frame
FR3036458A1 (en) * 2015-05-21 2016-11-25 Euromedis Groupe VIDEO SYSTEM COMPRISING ARTICULATED ARMATURE
WO2018118390A1 (en) * 2016-12-19 2018-06-28 Ethicon Llc Hot device indication of video display
US10537394B2 (en) 2016-12-19 2020-01-21 Ethicon Llc Hot device indication of video display
CN108345097A (en) * 2017-01-25 2018-07-31 三鹰光器株式会社 Operating microscope system
WO2019224278A1 (en) * 2018-05-22 2019-11-28 Eurofeedback Device for treatment with the emission of laser pulses
WO2019224276A1 (en) * 2018-05-22 2019-11-28 Eurofeedback Device for treatment with the emission of laser pulses
WO2019224274A1 (en) * 2018-05-22 2019-11-28 Eurofeedback Device for treatment with the emission of laser pulses
FR3081314A1 (en) * 2018-05-22 2019-11-29 Eurofeedback LIGHT IMPULSE TRANSMITTER TREATMENT DEVICE
FR3081311A1 (en) * 2018-05-22 2019-11-29 Eurofeedback LIGHT IMPULSE TRANSMITTER TREATMENT DEVICE
FR3081312A1 (en) * 2018-05-22 2019-11-29 Eurofeedback LASER IMPULSE TRANSMISSION PROCESSING DEVICE
EP3948778A4 (en) * 2019-03-21 2023-04-26 Verb Surgical Inc. Method and system for automatically repositioning a viewable area within an endoscope video view
US11818510B2 (en) 2019-03-21 2023-11-14 Verb Surgical Inc. Monitoring adverse events in the background while displaying a higher resolution surgical video on a lower resolution display
WO2023175588A1 (en) * 2022-03-18 2023-09-21 DePuy Synthes Products, Inc. Surgical systems, methods, and devices employing augmented reality (ar) guidance

Also Published As

Publication number Publication date
WO2010102197A3 (en) 2010-11-11

Similar Documents

Publication Publication Date Title
WO2010102197A2 (en) Thermal surgical monitoring
US8190243B2 (en) Thermal surgical monitoring
US11147640B2 (en) Medical devices, systems, and methods using eye gaze tracking
US20220151702A1 (en) Context aware surgical systems
US11648062B2 (en) System for controlling ablation treatment and visualization
US20150202005A1 (en) System and Method To Control Surgical Energy Devices
JP2005500108A (en) Apparatus and method for thermal excision of biological tissue
WO2000053261A1 (en) An apparatus for tissue treatment and having a monitor for display of tissue features
CN101653376A (en) Laser treatment apparatus
JP2004530485A (en) Guide systems and probes therefor
JP2022551642A (en) Gaze detection-based smart glasses display device
US11576555B2 (en) Medical imaging system, method, and computer program
WO2020220471A1 (en) Endoscopic imaging-guided photothermal treatment apparatus
KR20210042784A (en) Smart glasses display device based on eye tracking
Clancy et al. Gaze-contingent autofocus system for robotic-assisted minimally invasive surgery
JP2005287832A (en) Heat treatment apparatus
JP2022185357A (en) Imaging device
WO2024050335A2 (en) Automatically controlling an integrated instrument

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10749384

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10749384

Country of ref document: EP

Kind code of ref document: A2