WO2003002011A1 - Stereoscopic video magnification and navigation system - Google Patents


Info

Publication number
WO2003002011A1
WO2003002011A1 PCT/IL2001/000598
Authority
WO
WIPO (PCT)
Prior art keywords
images
operator
camera
video
stereoscopic
Prior art date
Application number
PCT/IL2001/000598
Other languages
French (fr)
Inventor
Guy Koskas
Original Assignee
Surgyvision Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Surgyvision Ltd. filed Critical Surgyvision Ltd.
Priority to PCT/IL2001/000598 priority Critical patent/WO2003002011A1/en
Publication of WO2003002011A1 publication Critical patent/WO2003002011A1/en

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/20Surgical microscopes characterised by non-optical aspects
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/70Manipulators specially adapted for use in surgery
    • A61B34/74Manipulators with manual electric input means
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/194Transmission of image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/239Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/296Synchronisation thereof; Control thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/332Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N13/344Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00Surgical instruments, devices or methods, e.g. tourniquets
    • A61B2017/00017Electrical control of surgical instruments
    • A61B2017/00203Electrical control of surgical instruments with speech control or speech recognition
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00Surgical instruments, devices or methods, e.g. tourniquets
    • A61B2017/00017Electrical control of surgical instruments
    • A61B2017/00221Electrical control of surgical instruments with wireless transmission of data, e.g. by infrared radiation or radiowaves
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2055Optical tracking systems
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2072Reference field transducer attached to an instrument or patient
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364Correlation of different images or relation of image positions in respect to the body
    • A61B2090/365Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/371Surgical systems with images on a monitor during operation with simultaneous use of two cameras
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/50Supports for surgical instruments, e.g. articulated arms
    • A61B2090/502Headgear, e.g. helmet, spectacles
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/361Image-producing devices, e.g. surgical cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/189Recording image signals; Reproducing recorded image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/398Synchronisation thereof; Control thereof

Definitions

  • the present invention relates generally to a stereoscopic observation system, and more particularly to a stereoscopic video observation and magnification system integrated with an imaging guided system to be utilized in surgery and other medical applications.
  • Magnification observation systems are known in the art as optical instruments comprising magnifying lenses for the magnification of the view of small objects.
  • Such instruments typically include simple hand-held loupes, wearable binocular loupes with or without a headlight, or, in the case of microsurgery, surgical microscopes.
  • surgical microscopes are used by a single surgeon, leaving the other members of the surgical team with a minimal view of the operating field.
  • Surgical microscopes are large, cumbersome and hard to manipulate.
  • The magnification range of the wide array of surgical microscopes is from 2 (where 1 is the unaided eye) to about 12.
  • Loupe-aided observation provides magnifications in the range of 2 to 3 (larger magnifications are available, but the equipment is unwieldy and less frequently used), whereas the range of magnifications for a microscope is typically 8 or above.
  • Stereoscopic observation systems are known in the art as optical devices comprising two magnifying lenses having two different viewpoints from two adjacent angles of view and providing two images such as to reproduce the characteristics of human binocular vision.
  • the images provided by the optical devices are combined in a user's brain thereby enabling the user to perceive depth.
  • Current surgical microscopes are typically stereoscopic, but only one operator can observe the operating field in stereoscopy while the additional observers are provided with a monocular view only.
  • a stereoscopic video magnifying system is known in the art as a mechanism that utilizes two video cameras typically mounted on an observer's head.
  • the cameras record two video images of an object, magnify the images, and forward the magnified images to an observer.
  • the magnification provided is typically in the range of 1 to more than 8.
  • the stereoscopic video magnification system having the cameras mounted on the observer's head is very useful for the performance of surgical procedures in field conditions where proper mounting devices for the cameras may be unavailable.
  • The efficiency of the system is limited: wearing the cameras is uncomfortable and causes fatigue during prolonged operations.
  • Such a system provides a full range of magnifications, making it suitable for many different types of operations and procedures, and also includes a useful "see through" capability that enables the surgeon to work continuously and with minimum head movements.
  • The "see through" capability refers to the attribute of the system allowing a surgeon to observe the operating field directly through the LCD glasses when not observing the video image of the magnifying cameras.
  • the system also provides more than one surgeon with a stereoscopic view of the operating field.
  • Navigation systems in the medical field are a group of devices brought to use together in order to allow a surgeon to pinpoint the location of a probe with high accuracy.
  • An object located within the operating field, for example the brain in neurosurgery, is scanned, typically by an MRI or a CT scanner, prior to an operation.
  • Sets of markers are attached to the object and located around the operating field. The markers can be seen on the scanning image as well as detected thereafter by a sensor device.
  • Prior to the surgical procedure, the scanning information is fed to a computer.
  • The markers can be detected by a special camera, typically an infrared camera.
  • a pinpoint location on the object within the operating field is achieved by fixing at least three reference points in space.
  • Two markers around the operative field and a special surgery probe utilized by the surgeon constitute the three reference points designed to locate a desired location.
  • the location is displayed on a separate view screen with reference to the pre-operative scanning images.
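The navigation scheme described above, in which markers visible both in the pre-operative scan and to the tracking camera fix reference points in space, amounts to registering tracker coordinates to scan coordinates. The following is a hypothetical sketch, not part of the patent, assuming at least three non-collinear markers whose positions are known in both coordinate systems; all function names and values are illustrative:

```python
# Hypothetical sketch of marker-based navigation: given the marker positions
# seen by the tracking camera and the same markers' known positions in the
# pre-operative scan, compute the rigid transform (Kabsch algorithm) that
# maps tracker space into scan space, then map the probe tip through it.
import numpy as np

def rigid_transform(tracker_pts, scan_pts):
    """Least-squares rotation R and translation t with scan ~= R @ tracker + t."""
    ct, cs = tracker_pts.mean(axis=0), scan_pts.mean(axis=0)
    H = (tracker_pts - ct).T @ (scan_pts - cs)          # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))              # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cs - R @ ct
    return R, t

def locate_probe(probe_tracker, tracker_pts, scan_pts):
    """Express a tracked probe-tip position in pre-operative scan coordinates."""
    R, t = rigid_transform(tracker_pts, scan_pts)
    return R @ probe_tracker + t
```

With the transform in hand, any tracked position, such as a probe tip, can be plotted on the pre-operative scan images.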
  • Existing systems operative in the typical neurosurgery procedure assist the surgeon in locating a desired area on the object, but the surgeon must remove his eyes from the operative microscope as well as utilize a special probe instead of a surgical tool in order to view the desired location.
  • the system relieves the surgeon from the necessity of removing his eyes from the operating field and from the necessity of manually manipulating a probe.
  • The surgeon is free to operate and to observe the magnified operative field in three-dimensional view, as well as to pinpoint the exact location of a target point.
  • Such location can be displayed to the surgeon's eyes as a two or three-dimensional image.
  • It is also the purpose of the present invention to improve the ergonomics of the stereoscopic video magnification system, to allow a surgeon further freedom of movement, and to provide a stereoscopic view of the operating field simultaneously to more than one member of a surgical team. It is another object of the present invention to provide new and improved means for physically supporting the video cameras.
  • The supporting apparatus provides means for stable, smooth and easy positioning of the video magnification system. Furthermore, said apparatus can be automatically controlled and can provide space for extra equipment.
  • One aspect of the present invention regards an apparatus for providing stereoscopic magnified observation enabling an operator to perform surgical procedures without having to remove his eyes from the operating field.
  • the apparatus comprises a head-mounted display for providing the operator with stereoscopic magnified images in an operating field.
  • The camera module can further comprise a camera mount for mounting at least two video cameras, and at least two video cameras attached to the camera mount for acquiring video images of the operating field. It can also include a converging system for interconnecting and adjusting the at least two video cameras with respect to each other and for obtaining a focal point associated with a point in the operating field, resulting in stereoscopic images.
  • The interface-processing unit can further include a central processing unit for processing stereoscopic images and for calculating the convergence angle required at a specific focal distance for generating stereoscopic images. It can also include at least one memory device, a camera control unit for controlling at least the focus level, focus distance and the distance angle and for synchronizing the received values from the at least two video cameras, and a camera convergence unit for controlling a convergence system. In addition, it can include an imaging sampler unit for sampling video images received from the at least two video cameras and forwarding the sampled video images to the central processing unit, a video switcher unit for processing video images obtained from external video sources and for converting the images so that the head mounted display can display such images, and an on screen display control unit for each head mounted display for sending operational information to it.
  • a head mounted display driver for translating video image signal into information and commands fed to the head mounted display.
  • a head mounted display-inverting driver for inverting the video images and translating the video images into information fed to an inverting head mounted display.
  • the interface processing unit may also include a display control unit for controlling the information sent to HMD, and an external display driver for controlling information and format to be displayed onto the external display.
  • The interface-processing unit can further comprise an operator controller interface unit for handling the control commands input by the operator and sent to the interface-processing unit.
  • a second aspect of the present invention regards an apparatus for providing stereoscopic magnified observation and precise imaging location of a point in an operating field, enabling an operator to perform surgical procedures without having to remove his eyes from the operating field.
  • the apparatus includes at least one interface-processing unit for processing and transmitting data and for controlling peripheral devices.
  • a display device for displaying the stereoscopic magnification images, information and precise location of a point in an operating field.
  • An input device for selecting and inputting data and commands into the interface processing unit.
  • A camera module comprising at least two cameras for acquiring magnified and stereoscopic images from the operating field, for localizing markers around the operative field, and for obtaining focal point distance information.
  • The interface processing unit processes and dynamically presents the stereoscopic magnified images, calculates the precise location of a point in an operating field, and plots said point on a display.
  • The interface processing unit can also include a central processing unit, a memory device, a communication device and a navigation system for obtaining imaging data, for localizing a point in the operative field, and for displaying such a point on a three-dimensional image representation shown on a display device.
  • a third aspect of the present invention regards an apparatus for providing stereoscopic magnified observation and precise imaging location of a point in an operating field.
  • The apparatus comprises a camera module having one or more infrared cameras. The cameras observe the operative field, and the camera module has two or more cameras focused on a focal point at a focal distance.
  • a focal distance point is superimposed on the head-mounted display.
  • A head mounted display worn by the operator.
  • An interface-processing unit receives information, which is processed, sent and stored.
  • A fourth aspect of the present invention regards a method for providing stereoscopic magnified observation and precise imaging location of a point in an operating field, enabling an operator to perform surgical procedures. The method steps are as follows: receiving the distance and attitude of two or more markers located about the operating field from the infrared camera; receiving the distance and attitude of one or more markers located on a camera module from an infrared camera; and calculating the distance of one or more markers located on the camera module from a base line between the two cameras within the camera module.
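The last calculation named above, the distance of a camera-module marker from the base line between the two cameras, is a point-to-line distance. A hypothetical sketch, with all coordinates as illustrative values in tracker space:

```python
# Illustrative geometry for the localization step: the perpendicular distance
# of a marker from the baseline running between the two stereo cameras.
import numpy as np

def distance_from_baseline(marker, cam_left, cam_right):
    """Perpendicular distance of `marker` from the line through both cameras."""
    baseline = cam_right - cam_left
    v = marker - cam_left
    # reject the component of v along the baseline; the remainder is perpendicular
    proj = (v @ baseline) / (baseline @ baseline) * baseline
    return float(np.linalg.norm(v - proj))
```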
  • a camera support apparatus can support the camera module and markers.
  • The infrared camera and the interface processing unit can be mounted on the camera support apparatus.
  • Such camera support apparatus can be manually and automatically controlled.
  • a fifth aspect of the present invention regards a method for providing stereoscopic magnified observation images and precise imaging location of a point in an operating field and information, enabling an operator to perform surgical procedures without having to remove his eyes from the operating field.
  • The method comprises the steps of: displaying images from the stereoscopic magnifying video cameras to the eyes of an operator; obtaining and storing imaging images; displaying the imaging images to the eyes of the operator; plotting focal point location information on the imaging images and displaying the plotted imaging images to the eyes of the operator; and selecting the information to be displayed to the operator.
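The plotting step can be illustrated by overlaying a targeting cross-hair at the focal-point pixel of a frame, in the spirit of the cross-hair marker of Fig. 3C. This is an illustrative sketch, not the patent's implementation; the frame is a plain numpy array with hypothetical dimensions:

```python
# Minimal sketch: draw a cross-hair centred on the focal-point pixel (cx, cy)
# of a single-channel frame. Arm length and intensity are illustrative.
import numpy as np

def draw_crosshair(frame, cx, cy, arm=5, value=255):
    """Return a copy of `frame` with a cross-hair centred on (cx, cy)."""
    out = frame.copy()
    h, w = out.shape
    out[max(0, cy - arm):min(h, cy + arm + 1), cx] = value  # vertical arm
    out[cy, max(0, cx - arm):min(w, cx + arm + 1)] = value  # horizontal arm
    return out
```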
  • FIG. 1 is a block diagram illustrating the main components of a stereoscopic video magnification system, in accordance with a preferred embodiment of the present invention.
  • Fig. 2 is a block diagram illustration of the preferred embodiment;
  • Fig. 3A is a schematic illustration of the operation sequence of the stereoscopic video magnification system in concert with the navigation system and navigational data;
  • Fig. 3B is a schematic illustration of the camera module;
  • Fig. 3C is a schematic illustration of the targeting cross hair marker displayed onto the head mounted display (HMD) screen; and
  • Fig. 4A is a schematic illustration of the floor stand for the stereoscopic video magnification and navigation system and of the swivel arm supporting the camera module.
  • The present invention overcomes the disadvantages of the prior art by providing a novel method and system that enhance and add to the capabilities of a stereoscopic video magnification system.
  • the following description is presented in order to enable any person skilled in the art to make and use the invention.
  • specific terminology is set forth to provide a thorough understanding of the invention.
  • descriptions of a specific application are provided only as an example.
  • Various modifications to the preferred embodiment will be readily apparent to those skilled in the art and the general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the invention.
  • the present invention is not limited to the embodiment shown, but it is to be accorded the widest scope consistent with the principles and features disclosed herein.
  • Some of the elements presented in the following description are computer programs implemented in a computing device.
  • the device presented in the following description contains computer software instructions specifically developed for the practice of the present invention.
  • the software in the presented computing device causes the device to perform the various functions described herein, although it should be noted that it is possible to use dedicated electronic hardware to perform all the functionality described herein.
  • The application can be implemented as hardware by the embedding of the predetermined program instructions and/or appropriate control information within suitable electronic hardware devices containing application-specific integrated circuits.
  • PCT application number IL00/00398 dated 1 January 2001, assigned to Surgivision Ltd., is incorporated herein by reference.
  • the present invention provides a novel and useful apparatus for stereoscopic magnified observation and precise location of a surgical or operating field by enabling an operator, typically a surgeon, to observe, to magnify, and to locate a point in the operating field while performing a surgical procedure, without having to remove his eyes from the operating field or substantially moving his head.
  • the present invention can also provide a view substantially similar to the view provided to the surgeon to the other members of the surgical team simultaneously.
  • Fig. 1 is a schematic illustration of the configuration and the related operation of a stereoscopic video magnification system, generally referenced as 100, in accordance with a preferred embodiment of the present invention.
  • System 100 includes an Interface Processing Unit (IPU) 102, a Head Mounted Display (HMD) 104, an External Display Unit 106, a Camera Module 108 and an Operator Control Unit 110.
  • the camera module 108 includes a camera mount 112, two video cameras 114, a converging system 116, and suitable connecting cables with associated input and output sockets.
  • module 108 includes two analog-to-digital converters 118.
  • IPU 102 includes a Central Processing Unit (CPU) with an internal memory device 120, a flash memory device 122, a camera control unit 124, a camera convergence unit 126, an image sampler unit 128, an operator controller interface unit 132, a video switcher unit 134, an On Screen Display (OSD) unit 148 for each HMD connected to the system, a HMD driver 136, a HMD inverting driver 138, a test connection interface 140, a display control unit 130, and an external display driver 144, all of which are connected by main bus 142.
  • the camera mount 112 typically is attached to the head of the surgeon by a headband, headrest or a helmet.
  • Cameras 114 are lightweight video cameras such as Model EVI 370DG from Sony, Japan or the like.
  • The cameras 114 are interconnected by a convergence system 116 allowing them to be adjusted with respect to each other.
  • The adjustment effects motions of the cameras with respect to each other along the longitudinal axis.
  • The aforementioned motions permit the positioning of both cameras at such an angle that a certain target's focal point may fall on the cameras' charge-coupled devices, thereby generating a stereoscopic view.
  • the analog-to-digital (A/D) converter 118 converts an analog visual signal captured by cameras 114 to digital signals to be fed to IPU 102.
  • When digital cameras are used, suitable analog-to-digital converters are supplied with and situated within the cameras. In contrast, when an analog camera is used, suitable A/D converters are to be acquired separately and placed in IPU 102.
  • Camera control unit 124 receives information regarding focus level and distance, distance angle and the like from the cameras 114 and automatically synchronizes the cameras 114 according to the received values. Focus level and distance, distance angle and other information relating to the cameras 114 are fed to CPU 120 of IPU 102.
  • CPU 120 is programmed with specifically developed software instructions that perform suitable calculation concerning the convergence angle required at a specific focal distance in order to generate proper stereoscopic vision.
  • the required convergence angle is fed to the camera convergence unit 126 that performs the necessary correction in the disposition of the cameras 114 by sending suitable instructions to the mechanical convergence system 116 via electrical connections.
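The geometry behind this calculation can be sketched as follows: for two cameras separated by a baseline b and both aimed at a target at focal distance d, the total convergence angle between the two optical axes is 2·atan(b/(2d)). The formula and the numeric values below are an illustrative assumption, not taken from the patent:

```python
# Hedged sketch of the convergence calculation: the angle at which the two
# optical axes must cross so that both cameras are aimed at the same target.
import math

def convergence_angle_deg(baseline_mm, focal_distance_mm):
    """Total angle between the two optical axes so both cross at the target."""
    return math.degrees(2.0 * math.atan(baseline_mm / (2.0 * focal_distance_mm)))
```

As expected, the required angle shrinks as the focal distance grows, so distant targets need nearly parallel cameras.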
  • the process is designed to be automatic but can be also induced by requests of the operator of the system (not shown) such as a surgeon, or the like.
  • Video images captured by the cameras 114 are fed to the image sampler 128 of IPU 102 via A/D converters 118.
  • Image sampler unit 128 samples the video images received from the cameras 114 by periodically obtaining specific values associated with the signal representing the video images, and forwards the sampled images to the CPU 120.
  • CPU 120 processes the received images. The processing involves examining the images obtained by the cameras 114 and appropriately correcting any stereoscopically essential differences between the cameras 114. As a result of the processing, appropriate control information is sent to the camera convergence unit 126.
  • Camera convergence unit 126 sends correction instruction signals to the convergence system 116. In accordance with the received instructions, convergence system 116 mechanically changes the convergence angle between the cameras 114.
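The sample-process-correct cycle of the preceding paragraphs can be viewed as a closed control loop. A hypothetical sketch, in which measure_disparity stands in for the real image processing that compares the two camera views:

```python
# Illustrative closed-loop convergence correction: nudge the convergence angle
# proportionally to the measured disparity until the disparity is near zero.
# `measure_disparity` is a stand-in for the image sampler / CPU processing.
def converge(angle_deg, measure_disparity, gain=0.1, tol=0.01, max_steps=200):
    """Iteratively correct the convergence angle from measured disparity."""
    for _ in range(max_steps):
        disparity = measure_disparity(angle_deg)
        if abs(disparity) < tol:
            break                           # target centred in both views
        angle_deg += gain * disparity       # proportional correction
    return angle_deg
```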
  • Video images obtained by the camera module 108 are also sent to HMD 104 via video switcher unit 134, via On Screen Display (OSD) unit 148, and via HMD driver 136 or HMD inverting driver 138.
  • Video switcher unit 134 processes video images from different sources.
  • Unit 134 receives video signals such as digital signals directly from A/D converters 118, video image files from the flash memory 122, compressed video files such as MPEG files from external video source 146, and the like.
  • Video switcher unit 134 converts the received video image signals to a format appropriate for display to HMD drivers 136 and 138.
  • On Screen Display (OSD) unit 148 sends operational information to the HMD 104, such as zoom level values, and the like.
  • The HMD driver 136 translates the different video image signals into information and commands fed to the HMD 104.
  • HMD inverting driver 138 first inverts the video image and then translates the video signals into information and commands fed to the inverting HMD 104.
  • Inverting HMD driver 138 thus creates an opposite point of view that is displayed to an additional user such as a second surgeon donning an inverting HMD 104.
  • Inverting HMD 104 is a head mounted display receiving information and commands from inverting HMD driver 138.
  • two users (such as two surgeons) are situated on the opposite sides of an operating field.
  • Camera module 108 sends video images that allow the first surgeon to observe the scene from his point of view.
  • the second surgeon receives the same view, but as a result of being located on the opposite side of the operating field the video image received is not the proper representation of his natural view of the field.
  • the image received is inverted to the point of view of the second surgeon.
  • the inverted view may confuse the second surgeon as to the spatial location of left and right. Inverting driver 138 thus allows both surgeons to have a directionally correct view of the operating field each from his respective point of view.
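The patent does not spell out the inverting transform; under the assumption that "inverting" means presenting the scene as if viewed from the opposite side of the operating field, one illustrative sketch rotates each frame 180 degrees and swaps the left and right eye channels so that depth cues remain consistent for the second surgeon:

```python
# Hypothetical sketch of the inverting driver's transform. The 180-degree
# rotation and eye-channel swap are an assumption, not the patent's stated
# implementation; frames are plain numpy arrays.
import numpy as np

def invert_stereo_pair(left, right):
    """Rotate both frames 180 degrees and swap the eye channels."""
    return np.rot90(right, 2), np.rot90(left, 2)
```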
  • HMD 104 is typically a pair of head mounted display units, such as LCD glasses model LDI-D100BE from Sony, Japan.
  • Display control unit 130 controls the information sent to HMD 104. Such information can include direct video images feed, commands, stored video images feed, patient data, operational information, other data and the like.
  • Display control unit 130 receives orders from CPU 120 through operator controller unit 110 as well as through other input devices such as a keyboard, a touch screen, a pointing device, or the like.
  • Operator controller unit 110 is connected to the IPU 102 via the operator controller interface 132.
  • The operator controller interface 132 handles the control commands input by the operator and sent to the interface processing unit.
  • Operator controller unit 110 is used by the user (not shown) to transmit suitable instructions to IPU 102 by suitable manipulation of the unit 110.
  • the instructions transmitted by the operator controller unit via the operator controller interface unit 132 can include focusing the cameras 114, changing the zoom values, activating and deactivating the "see-through" capability, initializing the system and readjusting the parameters of the system such as luminance, contrast, and the like.
  • IPU 102 executes commands received from the operator by reading the introduced command parameters from display control unit 130. Consequently IPU 102 calculates the required parameters needed to be set and sends the instructions involving the processed command information to the camera control unit 124, and to the camera convergence unit 126.
  • Camera control unit 124 and camera convergence unit 126 send the required set of instructions to the cameras 114 and the convergence system 116 for execution.
  • The operator controller unit 110 is a specially designed hardware control board that can be attached to the user's arm, placed on a special pedestal near the operating field, or the like.
  • The IPU 102 tests all the connections and the settings of the system and receives test and software updates through test connection interface 140.
  • External display driver 144 translates different image signals into information and commands fed to external display 106.
  • External display 106 can be a computer screen, a TV set, a handheld device screen, an LCD screen and the like.
  • IPU 200 receives, processes, and transmits data. IPU 200 also controls the peripheral devices. IPU 200 comprises central processing unit (CPU) 230, memory device 231, communication device 233 and navigation system 244. Peripheral devices 270 communicate with IPU 200 via CPU 230. Peripheral devices 270 comprise display devices 250, input devices 220, cameras 210 and camera support apparatus 240.
  • Display devices 250 can be a head mounted display (HMD) 254 described in Fig. 1 and PCT application number IL00/00398 dated 18th January 2001 assigned to Surgivision Ltd., an external display 256 such as a TV screen or a computer screen such as a 17" AE-1769 from AST Vision, a printer device 258 such as the Phaser 2135 from Xerox, and the like.
  • Display device 250 provides the operator with the stereoscopic magnified images, imaging data, precise location and localization data, patient related data, surgery and surgical procedure related data as well as any other data and information required by the operator during surgical sessions, and the like.
  • Input device 220 typically comprises Operator Controller (OC) 222 such as operator controller 110 of Fig. 1.
  • Input device 220 can also comprise a pointing device 226 such as a PS/2 mouse from Microsoft™, a keyboard such as a BTC 8110M ergonomic keyboard with touch pad, a microphone such as a VR250BT unidirectional electret microphone with single earphone speaker, and the like. Input devices 220 are used for selecting and inputting data and commands into the interface processing unit.
  • Cameras 210 typically comprise a set of two types of cameras. Cam module 208 is described in Fig. 1 and in PCT application number IL00/00398 dated 18th January 2001 assigned to Surgivision Ltd.
  • IR camera 206, EM camera 204 and matching markers are typically supplied with Navigation system 244.
  • Camera supporting apparatus 240 can contain supporting arm 216 and arm control panel 214, both of which are described herein below in Fig. 4.
  • Cam module 208 is designed to capture and transmit preferably magnified video images from the observed field of operation (not shown). Camera module 208 is thus placed within camera support apparatus 240. Camera module 208 is positioned and moved via camera support apparatus 240 by support arm 216 and controlled by arm control panel 214 as well as by input devices 220.
  • The positioning can be automatic or manual.
  • Camera support apparatus 240 is further described in Fig. 4.
  • Magnified video images obtained by cam module 208 are transmitted to CPU 230 of IPU 200 as well as to HMD 254 of display device 250. Said video images can also be transmitted for display on external display 256, and for printing a hard copy from printer 258.
  • Operator (not shown) using system 1 can use input device 220 to instruct CPU 230 of IPU 200 to perform certain operations.
  • Command data can be fed to IPU 200 by touch, pointing device click, finger tap, voice, or by any other means suitable for a human operator.
  • Such command data can be sent to IPU 200 via hard cables, radio frequency wireless transmission, Infrared transmission, and the like.
  • Navigation system 244 of IPU 200 is designed to receive imaging data 242 of operative field and the surrounding areas (not shown).
  • Imaging data 242 can be an MRI scan, a CT scan, ultrasound data and the like.
  • Imaging data 242 can be acquired pre-operatively or intra-operatively and is fed to navigation system 244 directly, such as via a hard copy imaging file, or via network 234.
  • Network 234 typically comprises a hospital network connecting computers 235, the Internet network 236 and the like.
  • Communication device 233 of IPU 200 is designed to receive and transmit data between network 234 and CPU 230 of IPU 200. Such data can be made available for navigation system 244 as well as for memory device 231 for storage, as well as for display on display devices 250.
  • Communication device 233 can be an X2 modem device from US Robotics and the like.
  • Memory device 231 can be a Flash memory device, or any other memory device, and the like.
  • Navigation system 244, such as VectorVision from BrainLab™, IR camera 206 or EM camera 204, as well as a set of markers (not shown), are used to localize a point in space, preferably in the operative field (not shown), and to display such a point on a three dimensional image representation reconstructed from imaging data 242 obtained in advance. Such a three dimensional image is displayed on display device 250 such as HMD 254 and the like.
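The localization described above can be viewed as a rigid registration problem: the markers are known both in the tracker's coordinate frame and in the scan (MRI/CT) frame, which determines a rotation and translation, and a point measured in the tracker frame can then be mapped into the scan frame for display. The following is a generic Kabsch-style sketch with invented coordinates; it is not the actual VectorVision algorithm.

```python
# Generic rigid-registration sketch: recover rotation R and translation t
# mapping tracker-frame marker positions onto scan-frame positions, then
# map the focal point into the scan frame. All coordinates are invented.
import numpy as np

def rigid_transform(src: np.ndarray, dst: np.ndarray):
    """Least-squares R, t with dst ~= R @ src + t (Kabsch method)."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:   # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t

# Marker positions as seen by the IR camera (tracker frame, mm) ...
markers_tracker = np.array([[0., 0., 0.], [100., 0., 0.],
                            [0., 80., 0.], [0., 0., 60.]])
# ... and the same markers located in the scan data (scan frame, mm).
markers_scan = markers_tracker + np.array([10., 10., 5.])

R, t = rigid_transform(markers_tracker, markers_scan)
focal_point_tracker = np.array([50., 40., 0.])
focal_point_scan = R @ focal_point_tracker + t  # point to plot on the scan
```

With these invented markers the transform is a pure translation of (10, 10, 5) mm, so the focal point maps from (50, 40, 0) to (60, 50, 5).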
  • Fig. 3 illustrates the operational sequence of the stereoscopic video magnification system in association with the navigational data and system.
  • In order for a navigational system to locate a point in space, it must have at least three coordinates in space. Markers placed near and around the operative field provide such coordinates. Using multiple markers enhances the accuracy of the system.
  • Multiple active or passive markers located near and around the operative field are used as reference points. The reference points are used to establish the location of a target probe. Typically the operator manually manipulates the target probe by physically placing it on a target point.
  • The navigation system calculates the location of the target point and displays the location thus calculated on a display monitor, where the indicated location is superimposed on an imaging image, such as an MRI two dimensional image.
  • In the present invention the target probe is unnecessary, as will be clearly shown from the following description.
  • The target point is superimposed on a three dimensional imaging image constructed from imaging data and displayed on visualization devices such as HMD.
  • In Figs. 3A, 3B and 3C the same numbers relate to like components. The following discussion relates to said figures as a whole.
  • Fig. 3A illustrates the preferred embodiment of the present invention, in which cam module 208, external display 256, keyboard 224, infrared (IR) camera 344 and IPU 200 are all mounted on camera support apparatus 240.
  • Camera support apparatus 240 is situated preferably close to operative field 300 such that cam module 208 can be situated in a position permitting cameras 314 direct visualization of operative field 300.
  • IR camera 344 is mounted on camera support apparatus 240 such that both operative field 300 and cam module 208 mounted markers are directly visible.
  • The user usually dons HMD 254 during the operative session.
  • HMD 254 is connected to IPU 200 via suitable conduits.
  • Camera module 208, best seen in Fig. 3B, comprises camera mount 312, two cameras 314, convergence system 316 and markers 302. Cameras 314 typically observe operative field 300 such that both obtain a single focal point 308, illustrated as a cross best seen in Fig. 3A.
  • A preferably magnified stereoscopic video image of operative field 300 is displayed stereoscopically on HMD 254.
  • Focal point 308, marked as a cross in Fig. 3C, is superimposed onto HMD display 306 via a crosshair generator such as the FK 1/F crosshair generator from KAPPA Miltram Industries Ltd.
  • Focal point 308 is thus visible to the operator (not shown) at all times and is represented as a crosshair on HMD display 306 of HMD 254.
  • An imaging machine such as an MRI scanner or CT scanner scans operating field 300 and its surroundings.
  • Markers 302 surround operative field 300. Markers 302 are made of a substance perceived by imaging scanners and will appear as dark or light spots on imaging data 242, which is fed to IPU 200. The process by which navigation system 244 determines the exact location of focal point 308 is now discussed in further detail.
  • Focal point 308 is a virtual target probe used in place of the traditional physical target probe. Markers 302 can be passive Markers such as IR or light reflectors or active markers such as IR emitting markers or electromagnetic radiation emitting markers and the like. Markers 302 are situated on camera mount 312 as well as around the operating field 300 in the same location as during the imaging session.
  • Markers 302 emit IR radiation, as shown by the parallel lines in Fig. 3A.
  • The IR radiation is perceived by IR camera 344 of navigation system 244.
  • The analog information representing the IR radiation is converted to digital information by navigation system 244 of Fig. 2 located within IPU 200. Consequently CPU 230 of IPU 200 performs the suitable calculation regarding the location of focal point 308 with respect to markers 302 mounted on cam module 208.
  • The location of focal point 308 in space is then compared to the spatial location of markers 302 located around operative field 300. The calculated location of focal point 308 is plotted on pre- or intra-operative imaging data.
  • The plotted imaging data 242 is sent via suitable conduits to output unit 250 of Fig. 2 for display on HMD 254 and external display 256, and for transmission to remote units via communication device 233 of Fig. 2.
  • Focal point 308 superimposed on imaging data 242 is preferably displayed as a three dimensional reconstruction of the operative field and its surroundings.
  • The user can choose to observe said imaging data 242 with superimposed focal point 308 on display devices 250 of Fig. 2 by introducing suitable requests through the appropriate manipulation of input devices 220 of Fig. 2, such as operator controller 222 of Fig. 2, keyboard 224 and the like.
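Superimposing the focal point on imaging data 242 requires converting the point's millimetre coordinates into voxel indices of the scan volume. A minimal sketch, assuming an axis-aligned volume with a known origin and voxel spacing (real scanners encode these in the image headers, e.g. DICOM); all numeric values below are invented:

```python
# Convert a point given in scanner millimetre coordinates into the
# (i, j, k) voxel indices of an axis-aligned image volume.
def point_to_voxel(point_mm, origin_mm, spacing_mm):
    """Round to the nearest voxel along each axis."""
    return tuple(round((p - o) / s)
                 for p, o, s in zip(point_mm, origin_mm, spacing_mm))

# Hypothetical volume: origin at (-100, -100, -50) mm, 1 x 1 x 2 mm voxels.
voxel = point_to_voxel((12.0, -4.0, 10.0),
                       (-100.0, -100.0, -50.0),
                       (1.0, 1.0, 2.0))
```

The resulting indices select the slice and pixel on which the crosshair would be drawn; oblique volumes would additionally need the direction cosines of the image axes.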
  • When electromagnetic probes 302, such as magnets, are used, IR camera 344 is not needed and a special crown (not shown) is placed around the magnetic probes. The crown is constructed to measure signals from each probe.
  • Reference is now made to Fig. 4A, which describes in further detail camera support apparatus 400.
  • Apparatus 400 typically consists of a floor-stand support 410 or a ceiling mount (not shown), a handling apparatus 420, and the camera module 430.
  • Camera support apparatus 400 is intended to provide easy, effortless, positioning of the camera module 430.
  • Camera module 430 is positioned above or around operative field 500 such that the cameras 422 capture a clear and uninterrupted view of operative field 500. Said view is transmitted to the user 600.
  • Handling apparatus 420 includes a set of arms, marked 406, 408 and 412, permitting easy and smooth handling of camera module 430 into a stable but changeable position during an operating session.
  • Handling apparatus 420 also provides suitable tubing for cable routing within said arms from cameras 210 of Fig. 2, such as camera module 430, as well as from display device 250 of Fig. 2 and from command board 414, to IPU 200 of Fig. 3A.
  • Said cables are suitable conduits for the power supply, data transfer and control of certain aspects related to said devices, mounted on camera support apparatus 400.
  • Floor-stand 410 is intended to provide support, stability, mobility, cable routing, storage place, and the like to camera module 430.
  • Floor-stand support 410 comprises a base 402, a controlling handle 403, and a set of wheels 404.
  • The base 402 can consist of a rectangular shaped box or a multiple-protrusion shaped round flat plate, or the like, fitted with wheels 404 at the bottom side of the base. Wheels 404 are typically made from hard plastic but may also be pneumatic. Wheels 404 can be fitted with a braking mechanism (not shown) for static positioning of floor-stand 410. Camera support apparatus 400 is easily maneuverable with the help of controlling handle 403.
  • Floor-stand support 410 can be of varied forms according to the operating room type and available space.
  • Floor-stand 410 is designed to fit the stereoscopic video magnifying and navigation system's requirements.
  • Said box type base 402 may contain within it IPU 200 of Fig. 3A as well as other components related to the system.
  • Handling apparatus 420 comprising a stand arm 406, a reaching arm 408 and a swivel arm 412 is intended for easy and versatile handling of the camera module 430.
  • Handling apparatus 420 is preferably made from light components such as composite metals, plastics and the like, and is suitable for conducting electrical and data wires within tubes there within.
  • Stand arm 406, which is a vertical beam, reaching arm 408, which is a horizontal arm, and swivel arm 412, which is a special support arm for camera module 430, are interconnected arms, set with oil damping bearings to allow operator 600 easy manipulation of camera module 430.
  • The camera module 430 is attached to the end of the swivel arm 412 via the camera mount 416.
  • Camera mount 416 is connected to swivel arm 412 by a ball and socket type joint 435 allowing movement in any direction.
  • Joint 435 is fitted with vertical and horizontal-locking screws (not shown) for the fixation of camera mount 416 in any desired position.
  • Arm 412 is joined with horizontal arm 408 via a rotational joint 445 allowing horizontal rotation of arm 412 around axis 480.
  • Joint 445 is fitted with locking screws (not shown) permitting fixation of arm 412 in any desired location around axis 480.
  • Horizontal arm 408 is interconnected with vertical beam 406 via joint 455.
  • Joint 455 is such that horizontal arm 408 can be elevated and descended along axis 490 as well as rotated around axis 490.
  • Joint 455 is also fitted with two locking screws (not shown).
  • Said locking screws permit the fixation of arm 408 at any point along and around axis 490.
  • Vertical beam 406 is fitted with a stopper (not shown) at beam 406 upper margin, such that arm 408 is prevented from slipping over beam 406.
  • Camera mount 416, swivel arm 412 as well as horizontal arm 408 can be displaced manually by operator 600 or others. Such displacement can also be achieved by an electrical motor system apparatus (not shown) controlled by command board 414 of handling apparatus 420.
  • Such board 414 can allow operator 600 to set fixed, memorized positions of the handling apparatus as well as control other aspects of the camera apparatus, such as activating the braking mechanism for wheels 404, activating lighting apparatus 426 on camera module 430, and the like.
  • Floor-stand support 410 and handling apparatus 420 typically convey within their apparatuses, hidden from external observation, cable connections operative in the activation of camera module 430. It should be easily appreciated that handling apparatus 420 can be attached to a ceiling mount (not shown) and function such as to allow easy and precise handling of camera module 430.
  • Camera module 430 consists of camera mount 416, cameras 422, convergence system 424, a pair of handles 418, a lighting apparatus 426, and a set of at least three probes 428.
  • Camera mount 416 is linked via connecting cables inserted via a specialized aperture (not shown) to the power supply as well as to the cameras and the other components of the system for data and command transfer.
  • Camera mount 416 is fitted with two handles to allow operator 600 ready, precise positioning of the camera module 430 above or around the operating field 500.
  • Camera mount 416 has at least three fitted probes 428 for the transmission or the reflection of electromagnetic or infrared radiation for the purpose of spatial localization of the camera module 430.
  • Camera mount 416 contains two cameras 422 attached to each other via convergence system 424, as well as a lighting apparatus 426 such as an illumination Tl-25 3mm white LED lamp from The Led Light and the like.
  • Cameras 422, convergence system 424 and lighting apparatus 426 are fitted within camera mount 416 in such a manner as to allow unobstructed motion of cameras 422 and convergence system 424.

Abstract

An apparatus and method for providing stereoscopic magnified observation enabling an operator to perform surgical procedures without having to remove his eyes from the operating field, comprising a head mounted display for providing the operator with stereoscopic magnified images in an operating field, a camera module for providing stereoscopic magnified images, an operator controller unit for enabling an operator to control the operation of the apparatus; and an interface processing unit for processing and dynamically presenting the stereoscopic magnified images in an operating field.

Description

STEREOSCOPIC VIDEO MAGNIFICATION AND NAVIGATION
SYSTEM
BACKGROUND OF THE INVENTION
FIELD OF THE INVENTION
The present invention relates generally to a stereoscopic observation system, and more particularly to a stereoscopic video observation and magnification system integrated with an imaging guided system to be utilized in surgery and other medical applications.
DISCUSSION OF THE RELATED ART Magnification observation systems are known in the art as optical instruments comprising magnifying lenses for the magnification of the view of small objects. In the field of medicine such instruments typically include simple hand-held loupes, wearable binocular loupes with or without headlight, or, in the case of microsurgery, surgical microscopes. Typically surgical microscopes are used by a single surgeon, leaving the other members of the surgical team with a minimal view of the operating field. Surgical microscopes are large, cumbersome and hard to manipulate. The magnification range of the wide array of surgical microscopes is from 2 (where 1 is the unaided eye) to about 12. Currently no single magnification device exists that can provide the whole range of magnification values necessary for all the stages of a surgical procedure. Typically, loupe-aided observation provides magnifications in the range of 2 to 3 (larger magnifications are also available but the equipment is unwieldy and less frequently used) whereas the range of magnifications for a microscope is typically 8 or above.
Stereoscopic observation systems are known in the art as optical devices comprising two magnifying lenses having two different viewpoints from two adjacent angles of view and providing two images such as to reproduce the characteristics of human binocular vision. The images provided by the optical devices are combined in a user's brain thereby enabling the user to perceive depth. Current surgical microscopes are typically stereoscopic, but only one operator can observe the operating field in stereoscopy while the additional observers are provided with a monocular view only.
A stereoscopic video magnifying system is known in the art as a mechanism that utilizes two video cameras typically mounted on an observer's head. The cameras record two video images of an object, magnify the images, and forward the magnified images to an observer. The magnification provided is typically in the range of 1 to more than 8. The video images captured by the cameras are transmitted via a computing device such as a computer to respective observation devices such as display screens or LCD head mounted glasses. The stereoscopic video magnification system having the cameras mounted on the observer's head is very useful for the performance of surgical procedures in field conditions where proper mounting devices for the cameras may be unavailable. However, in an orderly and properly equipped medical environment such as a hospital-based operating room the efficiency of the system is limited. Wearing the cameras is uncomfortable and causes fatigue during prolonged operations. Due to the observer's head movements and accumulated fatigue the method becomes inefficient. Currently such systems are not widespread, but it is the view of the applicants that they will replace many of the operative microscopes due to their several advantages. Such a system provides a full range of magnifications, making it suitable for many different types of operations and procedures, and also includes a useful "see through" capability that enables the surgeon to work continuously and with minimum head position movements. The "see through" capability refers to the attribute of the system allowing a surgeon to observe the operating field directly through the LCD glasses when not observing the video image of the magnifying cameras. The system also provides more than one surgeon with a stereoscopic view of the operating field.
Navigation systems in the medical field are a group of devices used together in order to allow a surgeon to pinpoint the location of a probe with high accuracy. An object located within the operating field, for example the brain in neurosurgery, is scanned, typically by an MRI or a CT scanner, prior to an operation. During the scanning of the object, sets of markers are attached to the object and located around the operating field. The markers can be seen on the scanning image and can be detected thereafter by a sensor device. Prior to the surgical procedure the scanning information is fed to a computer. During the surgical procedure markers that can be detected by a special camera (typically an infrared camera) surround the object within the operating field. A pinpoint location on the object within the operating field is achieved by fixing at least three reference points in space. Two markers around the operative field and a special surgery probe utilized by the surgeon constitute the three reference points designed to locate a desired location. The location is displayed on a separate view screen with reference to the pre-operative scanning images. Currently existing systems operative in the typical neurosurgery procedure assist the surgeon in locating a desired area on the object, but the surgeon must remove his eyes from the operative microscope as well as utilize a special probe instead of a surgical tool in order to view the desired location.
SUMMARY OF THE PRESENT INVENTION It is therefore the purpose of the current invention to allow operators such as surgeons engaged in a surgical procedure to use a stereoscopic magnifying system in combination with a navigation system or a system providing accurate information as to the location of a point within the operative field.
It is a further object of the present invention to use the stereoscopic magnifying system for the localization of a target point in the operative field and to synchronize said target with navigational data. The system relieves the surgeon from the necessity of removing his eyes from the operating field and from the necessity of manually manipulating a probe. Thus the surgeon is free to operate and observe in three-dimensional view the magnified operative field as well as to pinpoint an exact location of a target point. Such location can be displayed to the surgeon's eyes as a two or three-dimensional image. It is also the purpose of the present invention to improve the ergonomics of the stereoscopic video magnification system, to allow a surgeon further freedom of movement, and to provide a stereoscopic view of the operating field simultaneously to more than one member of the surgical team. It is another object of the present invention to provide new and improved means for physically supporting the video cameras. The supporting apparatus provides means for stable, smooth and easy placement of the video magnification system. Furthermore, said apparatus can be automatically controlled and can provide space for extra equipment. One aspect of the present invention regards an apparatus for providing stereoscopic magnified observation enabling an operator to perform surgical procedures without having to remove his eyes from the operating field. The apparatus comprises a head-mounted display for providing the operator with stereoscopic magnified images in an operating field. It also comprises a camera module for providing stereoscopic magnified images, an operator controller unit for enabling an operator to control the operation of the apparatus, and an interface processing unit for processing and dynamically presenting the stereoscopic magnified images in an operating field.
Within the apparatus the camera module can further comprise a camera mount for mounting at least two video cameras, and at least two video cameras attached to the camera mount for acquiring video images of the operating field. It can also include a converging system for interconnecting and adjusting the at least two video cameras with respect to each other and obtaining a focal point associated with a point in the operating field, resulting in obtaining stereoscopic images. The interface processing unit can further include a central processing unit for processing stereoscopic images and for calculating the convergence angle required at a specific focal distance for generating stereoscopic images. It can also include at least one memory device, a camera control unit for controlling at least the focus level, the focus distance and the distance angle and for synchronizing the values received from the at least two video cameras, and a camera convergence unit for controlling a convergence system. In addition, it can include an imaging sampler unit for sampling video images received from the at least two video cameras and forwarding the sampled video images to the central processing unit, a video switcher unit for processing video images obtained from external video sources and for converting the images so that the head mounted display can display such images, an on screen display control unit for each head mounted display for sending operational information to each head mounted display, a head mounted display driver for translating the video image signal into information and commands fed to the head mounted display, and a head mounted display inverting driver for inverting the video images and translating the video images into information fed to an inverting head mounted display.
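The convergence angle calculation mentioned above follows from simple trigonometry: if the two cameras are separated by a baseline b and must both aim at a focal point at distance d, each camera toes in by atan(b/(2d)), for a total convergence angle of 2·atan(b/(2d)). The sketch below illustrates this geometry; the baseline and distance values are invented for the example and are not taken from the application.

```python
# Geometric sketch of the convergence angle between two cameras aimed
# at a common focal point. Assumes the focal point lies on the
# perpendicular bisector of the camera baseline.
import math

def convergence_angle_deg(baseline_mm: float, focal_distance_mm: float) -> float:
    """Total angle between the two optical axes at the focal point."""
    return math.degrees(2 * math.atan(baseline_mm / (2 * focal_distance_mm)))

# Hypothetical values: 60 mm between the cameras, focus at 300 mm.
angle = convergence_angle_deg(60.0, 300.0)
```

As the focal distance grows the angle falls toward zero (parallel axes), which is why the convergence system must be readjusted whenever the operator refocuses.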
The interface processing unit may also include a display control unit for controlling the information sent to HMD, and an external display driver for controlling information and format to be displayed onto the external display. The interface-processing unit can further comprise an operator controller interface unit for handling the control commands input by the operator sent to the interface-processing unit.
A second aspect of the present invention regards an apparatus for providing stereoscopic magnified observation and precise imaging location of a point in an operating field, enabling an operator to perform surgical procedures without having to remove his eyes from the operating field. The apparatus includes at least one interface processing unit for processing and transmitting data and for controlling peripheral devices; a display device for displaying the stereoscopic magnification images, information and the precise location of a point in an operating field; an input device for selecting and inputting data and commands into the interface processing unit; and a camera module comprising at least two cameras for acquiring magnified and stereoscopic images from the operation field, for localizing markers around the operative field and for obtaining focal point distance information. The interface processing unit processes and dynamically presents the stereoscopic magnified images, calculates the precise location of a point in an operating field and plots said point on a display. The interface processing unit can also include a central processing unit, a memory device, a communication device and a navigation system for obtaining imaging data, for localizing a point in the operative field and for displaying such a point on a three dimensional image representation shown on a display device. A third aspect of the present invention regards an apparatus for providing stereoscopic magnified observation and precise imaging location of a point in an operating field. The apparatus comprises a camera module and one or more infrared cameras. The infrared cameras observe the operative field, and the camera module has two or more cameras focused on a focal point at a focal distance. The focal point is superimposed on a head-mounted display worn by the operator.
An interface processing unit receives information, which is processed, sent and stored. There are two or more markers located about the operative field and one or more markers located about the camera module. A fourth aspect of the present invention regards a method for providing stereoscopic magnified observation and precise imaging location of a point in an operating field, enabling an operator to perform surgical procedures. The method steps are as follows: receiving the distance and attitude of the two or more markers located about the operating field from the infrared camera; receiving the distance and attitude of the one or more markers located on a camera module from an infrared camera; calculating the distance of the one or more markers located on the camera module from a base line between two cameras within the camera module; obtaining and providing the focal point distance to the interface processing unit; calculating the relative location of the two or more markers located about the operative field in relation to the focal point distance; and storing the focal point location information in relation to the two or more markers about the operative field. The stored focal point location information can be plotted on imaging images of the operative field, sent for display, or sent via the communication device. The operator can manipulate the focal point location information. The stored focal point location information can be displayed in a three dimensional fashion. A camera support apparatus can support the camera module and markers. The infrared camera and the interface processing unit can be mounted on the camera support apparatus. Such camera support apparatus can be manually and automatically controlled.
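The method steps of the fourth aspect can be sketched as straightforward vector arithmetic. The geometry below is deliberately simplified and hypothetical: the optical axis is taken as a known direction of the camera module, the focal point is placed along it at the reported focal distance, and the field markers are then expressed relative to that point. None of the function names come from the application.

```python
# Simplified sketch of the focal-point localization steps: place the
# focal point along the camera module's optical axis at the reported
# focal distance, then express the field markers relative to it.
import numpy as np

def focal_point_location(baseline_midpoint, optical_axis, focal_distance_mm):
    """Place the focal point along the (normalized) optical axis."""
    axis = np.asarray(optical_axis, dtype=float)
    axis /= np.linalg.norm(axis)
    return np.asarray(baseline_midpoint, dtype=float) + focal_distance_mm * axis

def markers_relative_to_focal_point(field_markers, focal_point):
    """Express each field marker as an offset from the focal point."""
    return [np.asarray(m, dtype=float) - focal_point for m in field_markers]

# Hypothetical geometry: baseline midpoint at the origin, optical axis
# along +z, focal distance 300 mm, one field marker at (10, 0, 300) mm.
midpoint = (0.0, 0.0, 0.0)
fp = focal_point_location(midpoint, (0.0, 0.0, 1.0), 300.0)
rel = markers_relative_to_focal_point([(10.0, 0.0, 300.0)], fp)
```

Storing these offsets is what lets the system re-plot the focal point on the imaging data even as the camera module is moved, since the field markers remain fixed relative to the patient.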
A fifth aspect of the present invention regards a method for providing stereoscopic magnified observation images and precise imaging location of a point in an operating field and information, enabling an operator to perform surgical procedures without having to remove his eyes from the operating field. The method comprises the steps of: displaying images from stereoscopic magnifying video cameras to the eye of an operator; obtaining and storing imaging images; displaying imaging images to the eye of the operator; plotting focal point location information on imaging images and displaying the plotted imaging images to the eye of the operator; and selecting information to be displayed to the operator.
BRIEF DESCRIPTION OF THE DRAWINGS The present invention will be understood and appreciated more fully from the following detailed description taken in conjunction with the drawings in which: Fig. 1 is a block diagram illustrating the main components of a stereoscopic video magnification system, in accordance with a preferred embodiment of the present invention; and
Fig. 2 is a block diagram illustration of the preferred embodiment; and Fig. 3A is a schematic illustration of the operation sequence of the stereoscopic video magnification system in concert with navigational data and system; and
Fig. 3B is a schematic illustration of the camera module; and Fig. 3C is a schematic illustration of a targeting crosshair marker displayed on the head mounted display (HMD) screen; and Fig. 4A is a schematic illustration of the floor stand for the stereoscopic video magnification and navigation system and of the swivel arm supporting the camera module.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
The present invention overcomes the disadvantages of the prior art by providing a novel method and system that enhance and add to the capabilities of a stereoscopic video magnification system. The following description is presented in order to enable any person skilled in the art to make and use the invention. For the purposes of explanation, specific terminology is set forth to provide a thorough understanding of the invention. However, it will be apparent to one with ordinary skill in the art that the specific details introduced in the description are not required to practice the present invention. Descriptions of a specific application are provided only as an example. Various modifications to the preferred embodiment will be readily apparent to those skilled in the art, and the general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the invention. Thus, the present invention is not limited to the embodiment shown, but is to be accorded the widest scope consistent with the principles and features disclosed herein.
Some of the elements presented in the following description are computer programs implemented in a computing device. The device presented in the following description contains computer software instructions specifically developed for the practice of the present invention. The software in the presented computing device causes the device to perform the various functions described herein, although it should be noted that it is possible to use dedicated electronic hardware to perform all the functionality described herein. In the latter case the application can be implemented in hardware by embedding the predetermined program instructions and/or appropriate control information within suitable electronic hardware devices containing application-specific integrated circuits. Reference is made to PCT application number IL00/00398 dated 18th
January 2001, assigned to Surgivision Ltd., which is incorporated herein by reference. The present invention provides a novel and useful apparatus for stereoscopic magnified observation and precise location of a surgical or operating field by enabling an operator, typically a surgeon, to observe, to magnify, and to locate a point in the operating field while performing a surgical procedure, without having to remove his eyes from the operating field or substantially moving his head. The present invention can also simultaneously provide the other members of the surgical team with a view substantially similar to the view provided to the surgeon.
Fig. 1 is a schematic illustration of the configuration and the related operation of a stereoscopic video magnification system, generally referenced as 100, in accordance with a preferred embodiment of the present invention. System 100 includes an Interface Processing Unit (IPU) 102, a Head Mounted Display (HMD) 104, an External Display Unit 106, a Camera Module 108 and an Operator Control Unit 110. The camera module 108 includes a camera mount 112, two video cameras 114, a converging system 116, and suitable connecting cables with associated input and output sockets. Optionally module 108 includes two analog-to-digital converters 118. IPU 102 includes a Central Processing Unit (CPU) with an internal memory device 120, a flash memory device 122, a camera control unit 124, a camera convergence unit 126, an image sampler unit 128, an operator controller interface unit 132, a video switcher unit 134, an On Screen Display (OSD) unit 148 for each HMD connected to the system, a HMD driver 136, a HMD inverting driver 138, a test connection interface 140, a display control unit 130, and an external display driver 144, all of which are connected by main bus 142. The camera mount 112 typically is attached to the head of the surgeon by a headband, headrest or a helmet. Cameras 114 are lightweight video cameras such as Model EVI 370DG from Sony, Japan or the like. The cameras 114 are interconnected by a convergence system 116 allowing them to be adjusted with respect to each other. The adjustment effects motion of the cameras with respect to each other along the longitudinal axis. The aforementioned motions permit the positioning of both cameras at such an angle that a certain target's focal point may fall on the cameras' charge-coupled devices, thereby generating a stereoscopic view. The analog-to-digital (A/D) converter 118 converts an analog visual signal captured by cameras 114 to digital signals to be fed to IPU 102.
When using digital video cameras, suitable analog-to-digital converters are supplied with and situated within the cameras. In contrast, when an analog camera is used, suitable A/D converters are to be acquired separately and placed within IPU 102. Camera control unit 124 receives information regarding focus level, focus distance, distance angle and the like for the cameras 114 and automatically synchronizes the cameras 114 according to the received values. Focus level, focus distance, distance angle and other information relating to the cameras 114 are fed to CPU 120 of IPU 102. CPU 120 is programmed with specifically developed software instructions that perform suitable calculations concerning the convergence angle required at a specific focal distance in order to generate proper stereoscopic vision. The required convergence angle is fed to the camera convergence unit 126, which performs the necessary correction in the disposition of the cameras 114 by sending suitable instructions to the mechanical convergence system 116 via electrical connections. The process is designed to be automatic but can also be induced by requests of the operator of the system (not shown), such as a surgeon, or the like. Video images captured by the cameras 114 are fed to the image sampler 128 of IPU 102 via A/D converters 118. Image sampler unit 128 samples the video images received from the cameras 114 by periodically obtaining specific values associated with the signal representing the video images, and forwards the sampled images to the CPU 120. CPU 120 processes the received images. The processing involves the examination of the images obtained by the cameras 114, and appropriately corrects any stereoscopically essential differences between the cameras 114. As a result of the processing, appropriate control information is sent to the camera convergence unit 126. Camera convergence unit 126, in turn, sends correction instruction signals to the convergence system 116.
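The convergence-angle calculation described above can be sketched as follows. This is a minimal illustrative model only: it assumes a simple toe-in geometry in which each camera sits half a baseline from the midline, and the `baseline_mm` parameter and function name are hypothetical, not taken from the disclosed system.

```python
import math

def convergence_angle(baseline_mm: float, focal_distance_mm: float) -> float:
    """Return the inward rotation (degrees) each camera needs so that
    both optical axes intersect at the focal distance.

    Each camera sits baseline/2 from the midline, so its axis must be
    rotated toward the target by atan((baseline/2) / focal_distance).
    """
    if focal_distance_mm <= 0:
        raise ValueError("focal distance must be positive")
    half_angle = math.atan2(baseline_mm / 2.0, focal_distance_mm)
    return math.degrees(half_angle)

# e.g. an assumed 65 mm baseline refocused at 300 mm:
angle = convergence_angle(65.0, 300.0)
```

As the focal distance grows, the required angle shrinks toward zero (parallel axes), which matches the intuition that distant targets need almost no convergence.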
In accordance with the received instructions, convergence system 116 mechanically changes the convergence angle between the cameras 114. Video images obtained by the camera module 108 are also sent to HMD 104 via video switcher unit 134, via On Screen Display (OSD) unit 148, and via HMD driver 136 or HMD inverting driver 138. Video switcher unit 134 processes video images from different sources. Unit 134 receives video signals such as digital signals directly from A/D converters 118, video image files from the flash memory 122, compressed video files such as MPEG files from external video source 146, and the like. Video switcher unit 134 converts the received video image signals to a format appropriate for display to HMD drivers 136 and 138. On Screen Display (OSD) unit 148 sends operational information to the HMD 104, such as zoom level values, and the like. The HMD driver 136 translates the different video image signals into information and commands fed to the HMD 104. HMD inverting driver 138 first inverts the video image and then translates the video signals into information and commands fed to the inverting HMD 104. HMD inverting driver 138 thus creates an opposite point of view that is displayed to an additional user such as a second surgeon donning an inverting HMD 104. Inverting HMD 104 is a head mounted display receiving information and commands from HMD inverting driver 138. In a typical operation, two users (such as two surgeons) are situated on opposite sides of an operating field. Camera module 108 sends video images that allow the first surgeon to observe the scene from his point of view. The second surgeon receives the same view, but as a result of being located on the opposite side of the operating field, the video image received is not the proper representation of his natural view of the field. Thus the image received is inverted to the point of view of the second surgeon.
Without such inversion the view may confuse the second surgeon as to the spatial location of left and right. HMD inverting driver 138 thus allows both surgeons to have a directionally correct view of the operating field, each from his respective point of view.
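One plausible realization of the inversion performed by HMD inverting driver 138 is a 180 degree rotation of each video frame. A minimal sketch, treating a frame as a row-major grid of pixel values (the function name and frame representation are illustrative assumptions, not the disclosed hardware implementation):

```python
def invert_frame(frame):
    """Rotate a video frame 180 degrees (flip vertically and
    horizontally), giving a viewer standing on the opposite side of
    the operating field a directionally correct image.

    `frame` is a row-major grid (list of rows) of pixel values.
    """
    # Reverse the row order, then reverse each row's pixel order.
    return [row[::-1] for row in reversed(frame)]

# A 2x2 toy frame rotated for the second viewer:
rotated = invert_frame([[1, 2], [3, 4]])
```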
Video images obtained by camera module 108 in real time are processed and fed to HMD 104. HMD 104 is typically a pair of head mounted display units, such as LCD glasses model LDI-D100BE from Sony, Japan. Display control unit 130 controls the information sent to HMD 104. Such information can include direct video image feeds, commands, stored video image feeds, patient data, operational information, other data and the like. Display control unit 130 receives orders from CPU 120 through operator controller unit 110 as well as through other input devices such as a keyboard, a touch screen, a pointing device, or the like. Operator controller unit 110 is connected to the IPU 102 via the operator controller interface 132. The operator controller interface 132 handles the control commands input by the operator and sent to the interface processing unit. Operator controller unit 110 is used by the user (not shown) to transmit suitable instructions to IPU 102 by suitable manipulation of the unit 110. The instructions transmitted by the operator controller unit via the operator controller interface unit 132 can include focusing the cameras 114, changing the zoom values, activating and deactivating the "see-through" capability, initializing the system and readjusting the parameters of the system such as luminance, contrast, and the like. IPU 102 executes commands received from the operator by reading the introduced command parameters from display control unit 130. Consequently IPU 102 calculates the required parameters to be set and sends the instructions involving the processed command information to the camera control unit 124 and to the camera convergence unit 126. Camera control unit 124 and camera convergence unit 126 send the required set of instructions to the cameras 114 and the convergence system 116 for execution.
The operator controller unit 110 is a specially designed hardware control board that can be attached to the user's arm, placed on a special pedestal near the operating field, or the like. Subsequent to the initialization of the stereoscopic video magnifying system, the IPU 102 tests all the connections and the settings of the system and receives test and software updates through test connection interface 140. External display driver 144 translates different image signals into information and commands fed to external display 106. External display 106 can be a computer screen, a TV set, a handheld device screen, an LCD screen and the like. Reference is now made to Fig. 2, which describes the main components of the proposed system according to the preferred embodiment of the present invention. IPU 200 receives, processes, and transmits data. IPU 200 also controls the peripheral devices. IPU 200 comprises a central processing unit (CPU) 230, a memory device 231, a communication device 233 and a navigation system 244. Peripheral devices 270 communicate with IPU 200 via CPU 230. Peripheral devices 270 comprise display devices 250, input devices 220, cameras 210 and camera support apparatus 240. Display devices 250 can be a head mounted display (HMD) 254 described in Fig. 1 and PCT application number IL00/00398 dated 18th January 2001 assigned to Surgivision Ltd., an external display 256 such as a TV screen or a computer screen such as a 17" AE-1769 from AST Vision, a printer device 258 such as a Phaser 2135 from Xerox, and the like. Display devices 250 provide the operator with the stereoscopic magnified images, imaging data, precise location and localization data, patient related data, surgery and surgical procedure related data, as well as any other data and information required by the operator during surgical sessions, and the like. Input devices 220 typically comprise Operator Controller (OC) 222 such as operator controller 110 of Fig.
1 and PCT application number IL00/00398 dated 18th January 2001 assigned to Surgivision Ltd. Input devices 220 can also comprise a pointing device 226 such as a PS/2 mouse from Microsoft™, a keyboard such as a BTC 8110M ergonomic keyboard with touch pad, a microphone such as a VR250BT Unidirectional Electret Microphone with Single Earphone Speaker, and the like. Input devices 220 are used for selecting and inputting data and commands into the interface processing unit. Cameras 210 typically comprise a set of two types of cameras: cam module 208, as described in Fig. 1 and PCT application number IL00/00398 dated 18th January 2001 assigned to Surgivision Ltd., for the acquiring of preferably stereoscopic magnified images from the operative field (not shown), and an Infrared (IR) camera or Electromagnetic (EM) camera used to localize special markers (not shown) around the operative field. IR camera 206, EM camera 204 and matching markers (not shown) are typically supplied with navigation system 244. Camera support apparatus 240 can contain supporting arm 216 and arm control panel 214, both of which are described below in Fig. 4. Cam module 208 is designed to capture and transmit preferably magnified video images from the observed field of operation (not shown). Camera module 208 is thus placed within camera support apparatus 240. Camera module 208 is positioned and moved via camera support apparatus 240 by support arm 216 and controlled by arm control panel 214 as well as by input devices 220. The positioning can be automatic or manual. Camera support apparatus 240 is further described in Fig. 4. Magnified video images obtained by cam module 208 are transmitted to CPU 230 of IPU 200 as well as to HMD 254 of display devices 250. Said video images can also be transmitted for display on external display 256, or for printing of a hard copy from printer 258. The operator (not shown) using system 1 can use input devices 220 to instruct CPU 230 of IPU 200 to perform certain operations.
Command data can be fed to IPU 200 by touch, pointing device click, finger tap, voice, or by any other means suitable for a human operator. Such command data can be sent to IPU 200 via hard cables, radio frequency wireless transmission, infrared transmission, and the like.
Navigation system 244 of IPU 200 is designed to receive imaging data 242 of the operative field and the surrounding areas (not shown). Imaging data 242 can be an MRI scan, a CT scan, ultrasound data and the like. Imaging data 242 can be acquired pre-operatively or intra-operatively and is fed to navigation system 244 directly, such as via a hard copy imaging file or via network 234. Network 234 typically comprises a hospital network connecting computers 235, the Internet network 236 and the like. Communication device 233 of IPU 200 is designed to receive and transmit data between network 234 and CPU 230 of IPU 200. Such data can be made available for navigation system 244 as well as for memory device 231 for storage, as well as for display on display devices 250. Communication device 233 can be an X2 modem device from US Robotics and the like. Memory device 231 can be a flash memory device, or any other memory device, and the like. Navigation system 244 such as VectorVision from BrainLab™, IR camera 206 or EM camera 204, as well as a set of markers (not shown), are used to localize a point in space, preferably in the operative field (not shown), and to display such point on a three dimensional image representation reconstructed from imaging data 242 obtained in advance. Such a three dimensional image is displayed on a display device 250 such as HMD 254 and the like.
The method for point localization and display, as well as the integration of other components of system 1, is now further disclosed in Fig. 3. Fig. 3, described hereinafter, illustrates the operational sequence of the stereoscopic video magnification system in association with the navigational data and system. In order for a navigational system to locate a point in space, it must have at least three coordinates in space. Markers placed near and around the operative field provide such coordinates. Using multiple markers enhances the accuracy of the system. In currently operative navigation systems, multiple active or passive markers located near and around the operative field are used as reference points. The reference points are used to establish the location of a target probe. Typically the operator manually manipulates the target probe by physically placing it on a target point. Subsequently the navigation system calculates the location of the target point and displays the location thus calculated on a display monitor, where the indicated location is superimposed on an imaging image, such as a two dimensional MRI image. In the preferred embodiment of the present invention, the target probe is unnecessary, as will be clearly shown from the following description. In addition, in the preferred embodiment the target point is superimposed on a three dimensional imaging image constructed from imaging data and displayed on visualization devices such as the HMD. In Figs. 3A, 3B and 3C the same numbers relate to like components. The following discussion relates to said figures as a whole.
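The role of the reference markers can be illustrated as follows: three non-collinear markers define a local coordinate frame, and a tracked point expressed in that frame can be replotted at the same relative position on imaging data that shows the same markers. This is a simplified sketch under idealized assumptions (exact marker positions, pure vector arithmetic); the function names are hypothetical and do not reflect any particular navigation system's implementation.

```python
import math

def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def norm(a):
    n = math.sqrt(dot(a, a))
    return (a[0] / n, a[1] / n, a[2] / n)

def marker_frame(m0, m1, m2):
    """Orthonormal frame (origin plus three axes) built from three
    non-collinear marker positions."""
    x = norm(sub(m1, m0))
    z = norm(cross(x, sub(m2, m0)))
    y = cross(z, x)
    return m0, (x, y, z)

def to_marker_coords(point, m0, m1, m2):
    """Express a tracked point in the marker-defined frame, so the
    same coordinates locate it relative to the same markers as they
    appear in the imaging scan."""
    origin, axes = marker_frame(m0, m1, m2)
    d = sub(point, origin)
    return tuple(dot(d, ax) for ax in axes)
```

Because the returned coordinates are relative to the markers, they are unaffected by where the patient happens to lie in the tracker's own coordinate system.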
Fig. 3A illustrates the preferred embodiment of the present invention, in which cam module 208, external display 256, keyboard 224, Infrared (IR) camera 344 and IPU 200 are all mounted on camera support apparatus 240. Camera support apparatus 240 is situated preferably close to operative field 300, such that cam module 208 can be situated in a position permitting cameras 314 direct visualization of operative field 300. IR camera 344 is mounted on camera support apparatus 240 such that both operative field 300 and the markers mounted on cam module 208 are directly visible. The user (not shown) usually dons HMD 254 during the operative session. HMD 254 is connected to IPU 200 via suitable conduits.
Camera module 208, best seen in Fig. 3B, comprises camera mount 312, two cameras 314, convergence system 316 and markers 302. Cameras 314 typically observe operative field 300 such that both obtain a single focal point 308, illustrated as a cross best seen in Fig. 3A. A preferably magnified stereoscopic video image of operative field 300 is displayed stereoscopically on HMD 254. Focal point 308, marked as a cross in Fig. 3C, is superimposed onto HMD display 306 via a crosshair generator such as the FK 1/F crosshair generator from KAPPA Miltram Industries Ltd. Focal point 308 is thus visible to the operator (not shown) at all times and is represented as a crosshair on HMD display 306 of HMD 254. Prior to the performance of the surgical procedure, an imaging machine such as an MRI scanner or CT scanner scans the operating field 300 and its surroundings. Markers 302 surround operative field 300. Markers 302 are made of a substance perceived by imaging scanners and will appear as dark or light spots on the imaging data 242 which is fed to IPU 200. The process by which navigation system 244 determines the exact location of focal point 308 is now discussed in further detail. Focal point 308 is a virtual target probe used in place of the traditional physical target probe. Markers 302 can be passive markers such as IR or light reflectors, or active markers such as IR emitting markers or electromagnetic radiation emitting markers and the like. Markers 302 are situated on camera mount 312 as well as around the operating field 300, in the same location as during the imaging session. With the active marker type given here as an example, markers 302 emit IR radiation, as seen by parallel lines in Fig. 3A. The IR radiation is perceived by IR camera 344 of navigation system 244. The analog information representing the IR radiation is converted to digital information by navigation system 244 of Fig. 2 located within IPU 200.
Consequently CPU 230 of IPU 200 performs the suitable calculations regarding the location of focal point 308 with respect to markers 302 mounted on cam module 208. The location of focal point 308 in space is then compared to the spatial location of markers 302 located around operative field 300. The calculated location of focal point 308 is plotted on pre- or intra-operative imaging data. The plotted imaging data 242 is sent via suitable conduits to display devices 250 of Fig. 2 for display on HMD 254 and external display 256, and for transmission to remote units via communication device 233 of Fig. 2. The location of focal point 308 superimposed on imaging data 242 is displayed preferably as a three dimensional reconstruction of the operative field and surroundings. The user can choose to observe said imaging data 242 with superimposed focal point 308 on display devices 250 of Fig. 2 by introducing suitable requests through the appropriate manipulation of input devices 220 of Fig. 2, such as operator controller 222 of Fig. 2, keyboard 224 and the like. When using electromagnetic probes 302, such as magnets, IR camera 344 is not needed and a special crown (not shown) is placed around the magnetic probes. The crown is constructed to measure signals from each probe. The location of the magnetic probes is then used to calculate the location of focal point 308 as described above for IR probes. Such a system requires no optical cameras and has no line-of-sight restrictions. Realization of such a system can include combining a No-Block™ tracking system, such as the InstaTrak 3500 from InstaTrak, with the previously disclosed stereoscopic magnification system, or using similar probes and crown only, performing the rest of the calculations and display using the disclosed stereoscopic magnification and navigation system. Turning now to Fig. 4A, which describes camera support apparatus 400 in further detail.
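The idea of the focal point as a virtual target probe reduces to a short calculation: the probe tip lies one focal distance along the camera module's viewing direction, measured from the module's tracked position. A sketch under the assumption that the tracked pose yields a position and a viewing direction vector (the function name and parameters are illustrative):

```python
import math

def focal_point_location(camera_position, view_direction, focal_distance):
    """Focal point as a virtual target probe: the point lying
    focal_distance along the camera module's viewing direction,
    measured from the tracked position of the module's markers."""
    # Normalize the direction so focal_distance is in true units.
    n = math.sqrt(sum(d * d for d in view_direction))
    return tuple(p + focal_distance * d / n
                 for p, d in zip(camera_position, view_direction))

# A module at the origin looking down the z axis, focused at 300 mm:
tip = focal_point_location((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), 300.0)
```

Feeding this point through a marker-relative transform, as outlined earlier, is what lets it be replotted on the pre- or intra-operative imaging data without any physical probe touching the field.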
Apparatus 400 typically consists of a floor-stand support 410 or a ceiling mount (not shown), a handling apparatus 420, and the camera module 430. Camera support apparatus 400 is intended to provide easy, effortless positioning of the camera module 430. Camera module 430 is positioned above or around operative field 500 such that the cameras 422 capture a clear and uninterrupted view of operative field 500. Said view is transmitted to the user 600. Handling apparatus 420 includes a set of arms, marked 406, 408 and 412, permitting easy and smooth handling of camera module 430 to a stable but changeable position during an operating session. Handling apparatus 420 also provides suitable tubing for cable routing within said arms from cameras 210 of Fig. 2, such as camera module 430, as well as from display devices 250 of Fig. 2, as well as from a command board 414 to IPU 200 of Fig. 3A. Said cables are suitable conduits for the power supply, data transfer and control of certain aspects related to said devices mounted on camera support apparatus 400. Floor-stand 410 is intended to provide support, stability, mobility, cable routing, storage place, and the like to camera module 430.
Floor-stand support 410 comprises a base 402, a controlling handle 403, and a set of wheels 404. The base 402 can consist of a rectangular shaped box or a multiple protrusion shaped round flat plate, or the like, fitted with wheels 404 at the bottom side of the base. Wheels 404 are typically made from hard plastic but may also be pneumatic. Wheels 404 can be fitted with a braking mechanism (not shown) for static positioning of floor-stand 410. Camera support apparatus 400 is easily maneuverable with the help of controlling handle 403. Floor-stand support 410 can be of varied forms according to the operating room type and available space. Floor-stand 410 is designed to fit the requirements of the stereoscopic video magnifying and navigation systems. Said box type base 402 may contain within it IPU 200 of Fig. 3A as well as other components related to the system.
Handling apparatus 420, comprising a stand arm 406, a reaching arm 408 and a swivel arm 412, is intended for easy and versatile handling of the camera module 430. Handling apparatus 420 is preferably made from light components such as composite metals, plastics and the like, and is suitable for conducting electrical and data wires within tubes therewithin. Stand arm 406, which is a vertical beam, reaching arm 408, which is a horizontal arm, and swivel arm 412, which is a special support arm for camera module 430, are interconnected arms, set with oil damping bearings to allow operator 600 easy manipulation of camera module 430. The camera module 430 is attached to the end of the swivel arm 412 via the camera mount 416. Camera mount 416 is connected to swivel arm 412 by a ball and socket type joint 435 allowing movement in any direction. Joint 435 is fitted with vertical and horizontal locking screws (not shown) for the fixation of camera mount 416 in any desired position. Arm 412 is joined with horizontal arm 408 via a rotational joint 445 allowing horizontal rotation of arm 412 around axis 480. Joint 445 is fitted with locking screws (not shown) permitting fixation of arm 412 in any desired location around axis 480. Horizontal arm 408 is interconnected with vertical beam 406 via joint 455. Joint 455 is such that horizontal arm 408 can be elevated and descended along axis 490 as well as rotated around axis 490. Joint 455 is also fitted with two locking screws (not shown). Said locking screws permit the fixation of arm 408 at any point along and around axis 490. Vertical beam 406 is fitted with a stopper (not shown) at the upper margin of beam 406, such that arm 408 is prevented from slipping over beam 406. Camera mount 416, swivel arm 412 as well as horizontal arm 408 can be displaced manually by operator 600 or others. Such displacement can also be achieved by an electrical motor system apparatus (not shown) controlled by command board 414 of handling apparatus 420.
Such board 414 can allow operator 600 to set fixed, memorable positions of the handling apparatus as well as control other aspects of the camera apparatus, such as activating the braking mechanism for wheels 404, activating lighting apparatus 426 on camera module 430, and the like. Floor-stand support 410 and handling apparatus 420 typically convey within their apparatuses, hidden from external observation, cable connections operative in the activation of camera module 430. It should be easily appreciated that handling apparatus 420 can be attached to a ceiling mount (not shown) and function so as to allow easy and precise handling of camera module 430.
Camera module 430 consists of camera mount 416, cameras 422, convergence system 424, a pair of handles 418, a lighting apparatus 426, and a set of at least three probes 428. Camera mount 416 is linked via connecting cables inserted via a specialized aperture (not shown) to the power supply as well as to the cameras and the other components of the system for data and command transfer. Camera mount 416 is fitted with two handles 418 to allow operator 600 ready, precise positioning of the camera module 430 above or around the operating field 500. Camera mount 416 has at least three fitted probes 428 for the transmission or the reflection of electromagnetic or infrared radiation for the purpose of spatial localization of the camera module 430. In addition camera mount 416 contains two cameras 422 attached to each other via a convergence system 424, as well as a lighting apparatus 426 such as an illumination T1-25 3mm white LED lamp from The LED Light and the like. Cameras 422, convergence system 424 and lighting apparatus 426 are fitted within camera mount 416 in such a manner as to allow unobstructed motion to the cameras 422 and the convergence system 424.
It will be appreciated by persons skilled in the art that the present invention is not limited to what has been particularly shown and described herein above. Rather the scope of the present invention is defined only by the claims, which follow.

Claims

1. An apparatus for providing stereoscopic magnified observation enabling an operator to perform surgical procedures without having to remove his eyes from the operating field, the apparatus comprising: a head mounted display for providing the operator with stereoscopic magnified images of an operating field; a camera module for providing stereoscopic magnified images; an operator controller unit for enabling an operator to control the operation of the apparatus; and an interface processing unit for processing and dynamically presenting the stereoscopic magnified images of an operating field.
2. The apparatus of claim 1 wherein the camera module comprises: a camera mount for mounting at least two video cameras; at least two video cameras attached to the camera mount for acquiring video images of the operating field; and a converging system for interconnecting and adjusting the at least two video cameras with respect to each other and obtaining a focal point associated with a point in the operating field resulting in obtaining stereoscopic images.
3. The apparatus of claim 1 wherein the interface processing unit comprises: a central processing unit for processing stereoscopic images and for calculating a convergence angle required at a specific focal distance for generating stereoscopic images; at least one memory device; a camera control unit for controlling at least the focus level, focus distance and the distance angle and for the synchronizing of received values from the at least two video cameras; a camera convergence unit for controlling a convergence system; an image sampler unit for sampling video images received from the at least two video cameras and forwarding the sampled video images to the central processing unit; a video switcher unit for processing video images obtained from external video sources and for converting the images so that such images can be displayed by the head mounted display; an on screen display control unit for each head mounted display for sending operational information to each head mounted display; a head mounted display driver for translating video image signals into information and commands fed to the head mounted display; and a head mounted display inverting driver for inverting the video images and translating the video images into information fed to an inverting head mounted display.
4. The apparatus of claim 3 wherein the interface processing unit further comprises a display control unit for controlling the information sent to the HMD; and an external display driver for controlling the information and format to be displayed on the external display.
5. The apparatus of claim 3 wherein the interface processing unit further comprises an operator controller interface unit for handling the control commands input by the operator and sent to the interface processing unit.
6. An apparatus for providing stereoscopic magnified observation and precise imaging location of a point in an operating field, enabling an operator to perform surgical procedures without having to remove his eyes from the operating field, the apparatus comprising:
at least one interface processing unit for processing and transmitting data and for controlling peripheral devices; at least one display device for displaying the stereoscopic magnification images, information and the precise location of a point in an operating field; at least one input device for selecting and inputting data and commands into the interface processing unit; a camera module comprising at least two cameras for acquiring magnified and stereoscopic images from the operating field, for localizing markers around the operative field and for obtaining focal point distance information; and
at least one interface processing unit for processing and dynamically presenting the stereoscopic magnified images and for calculating the precise location of a point in an operating field and plotting said point on a display.
7. The apparatus of claim 6 wherein the interface processing unit comprises at least one central processing unit; at least one memory device; at least one communication device; and at least one navigation system for obtaining imaging data, for localizing a point in the operative field and for displaying such point on a three dimensional image representation shown on a display device.
8. The apparatus of claim 6 wherein the display device is a head mounted display.
9. An apparatus for providing stereoscopic magnified observation and precise imaging location of a point in an operating field, enabling an operator to perform surgical procedures without having to remove his eyes from the operating field, the apparatus comprising: a camera module having at least one infra red camera, the cameras observing the operative field, the camera module having at least two cameras focused on a focal point at a focal distance, the focal distance point being superimposed on the head mounted display; at least one head mounted display worn by the operator; at least one interface processing unit where information is received, processed, sent and stored; at least two markers located about the operative field; and at least one marker located about the camera module.
10. A method for providing stereoscopic magnified observation and precise imaging location of a point in an operating field, enabling an operator to perform surgical procedures, the method comprising: receiving the distance and attitude of at least two markers located about the operating field from an infra red camera; receiving the distance and attitude of at least one marker located on a camera module from the infra red camera; calculating the distance of the at least one marker located on the camera module from a base line between two cameras within the camera module; obtaining and providing the focal point distance to the interface processing unit; calculating the relative location of the at least two markers located about the operative field in relation to the focal point distance; and storing focal point location information in relation to the at least two markers about the operative field.
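The method of claim 10 ultimately expresses the focal point relative to the markers about the operative field rather than relative to the tracker. A minimal sketch of that last step, assuming the tracking camera reports the camera module's position and viewing direction in its own frame (the function name and tuple-based coordinates are illustrative, not from the claims):

```python
import math

def focal_point_in_field_frame(cam_pos, cam_dir, focal_distance, markers):
    """Locate the focal point and its offset from each field marker.

    cam_pos        -- camera module position (x, y, z) in the tracker frame
    cam_dir        -- viewing direction vector (need not be unit length)
    focal_distance -- distance from the module to the focal point
    markers        -- {name: (x, y, z)} field-marker positions

    The focal point lies focal_distance along the (normalized) viewing
    direction; returning its offset from each marker ties the point to
    the anatomy instead of to the tracker.
    """
    norm = math.sqrt(sum(c * c for c in cam_dir))
    unit = tuple(c / norm for c in cam_dir)
    focal = tuple(p + focal_distance * u for p, u in zip(cam_pos, unit))
    offsets = {name: tuple(f - m for f, m in zip(focal, pos))
               for name, pos in markers.items()}
    return focal, offsets
```

Because the stored information is marker-relative, it survives later movement of the camera support apparatus, which is what lets claims 11 through 15 re-use it for plotting and display.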
11. The method of claim 10 wherein the stored focal point location information is plotted on imaging images of the operative field.
12. The method of claim 10 wherein the stored focal point location information is sent for display.
13. The method of claim 10 wherein the stored focal point location information is sent via at least one communication device.
14. The method of claim 10 wherein the stored focal point location information is manipulated by the operator.
15. The method of claim 10 wherein the stored focal point location information is displayed in a three dimensional fashion.
16. The apparatus of claim 9 wherein the infra red camera is an electromagnetic camera.
17. The method of claim 10 wherein the infra red markers are electromagnetic markers.
18. The apparatus of claim 9 wherein the infra red markers are active.
19. The apparatus of claim 9 wherein the electromagnetic markers are active.
20. The method of claim 10 wherein the infra red markers are active.
21. The method of claim 10 wherein the electromagnetic markers are active.
22. The method of claim 10 wherein the camera module and at least one marker are supported by a camera support apparatus.
23. The method of claim 10 wherein the infra red camera and the interface processing unit are mounted on the camera support apparatus.
24. The method of claim 22 wherein the camera support apparatus is manually and automatically controlled.
25. A method for providing stereoscopic magnified observation images, information and precise imaging location of a point in an operating field, enabling an operator to perform surgical procedures without having to remove his eyes from the operating field, the method comprising: displaying images from stereoscopic magnifying video cameras to the eye of an operator; obtaining and storing imaging images; displaying imaging images to the eye of the operator; plotting focal point location information on the imaging images and displaying the imaging images with the plotted focal point location information to the eye of the operator; and selecting information to be displayed to the operator.
26. The method of claim 25 further comprising displaying plotted imaging images in a three dimensional representation to the eye of the operator.
27. The method of claim 25 wherein the display is a head mounted display.
28. The method of claim 25 wherein the operator selects the information to be displayed by an input device.
29. The method of claim 25 wherein the step of selecting the information comprises selecting images and text.
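Plotting the stored focal point on imaging images (claims 11 and 25) amounts to mapping a position in millimetres into indices of the pre-operative image volume. A hedged sketch of that mapping, assuming an axis-aligned volume with a known origin and voxel spacing (both function and parameter names are hypothetical, not recited in the claims):

```python
def plot_on_slice(focal_point_mm, volume_origin_mm, voxel_size_mm):
    """Convert a focal-point position (mm, imaging frame) to voxel
    indices (i, j, k) so the point can be drawn on the matching slice.

    Each coordinate is offset by the volume origin and scaled by the
    voxel size along that axis, then rounded to the nearest voxel.
    """
    return tuple(round((p - o) / s)
                 for p, o, s in zip(focal_point_mm,
                                    volume_origin_mm,
                                    voxel_size_mm))
```

The third index selects the slice to display; the first two give the pixel on which the point is overlaid, whether in a two dimensional or a three dimensional rendering.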
PCT/IL2001/000598 2001-06-28 2001-06-28 Stereoscopic video magnification and navigation system WO2003002011A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/IL2001/000598 WO2003002011A1 (en) 2001-06-28 2001-06-28 Stereoscopic video magnification and navigation system

Publications (1)

Publication Number Publication Date
WO2003002011A1 true WO2003002011A1 (en) 2003-01-09

Family

ID=11043065

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2001/000598 WO2003002011A1 (en) 2001-06-28 2001-06-28 Stereoscopic video magnification and navigation system

Country Status (1)

Country Link
WO (1) WO2003002011A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4395731A (en) * 1981-10-16 1983-07-26 Arnold Schoolman Television microscope surgical method and apparatus therefor
EP0629963A2 (en) * 1993-06-21 1994-12-21 General Electric Company A display system for visualization of body structures during medical procedures
WO1999038449A1 (en) * 1998-01-28 1999-08-05 Cosman Eric R Optical object tracking system
US5961456A (en) * 1993-05-12 1999-10-05 Gildenberg; Philip L. System and method for displaying concurrent video and reconstructed surgical views
US6006126A (en) * 1991-01-28 1999-12-21 Cosman; Eric R. System and method for stereotactic registration of image scan data

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7203277B2 (en) 2003-04-25 2007-04-10 Brainlab Ag Visualization device and method for combined patient and object image data
EP1470791A1 (en) * 2003-04-25 2004-10-27 BrainLAB AG Visualisation apparatus with input means, for the combination of scanned and video images, and visualisation method
WO2004100815A2 (en) * 2003-05-16 2004-11-25 Carl Zeiss Operation lamp comprising an integrated optical observation device
WO2004100815A3 (en) * 2003-05-16 2005-02-10 Zeiss Carl Operation lamp comprising an integrated optical observation device
US7463823B2 (en) 2003-07-24 2008-12-09 Brainlab Ag Stereoscopic visualization device for patient image data and video images
EP1621153A1 (en) * 2004-07-28 2006-02-01 BrainLAB AG Stereoscopic visualisation apparatus for the combination of scanned and video images
EP1969416A2 (en) * 2005-12-12 2008-09-17 Universidade Federal de Sao Paulo - UNIFESP Enlarged reality visualization system with pervasive computation
EP1969416A4 (en) * 2005-12-12 2010-03-03 Univ Fed Sao Paulo Unifesp Enlarged reality visualization system with pervasive computation
CN103211655A (en) * 2013-04-11 2013-07-24 深圳先进技术研究院 Navigation system and navigation method of orthopedic operation
US10571671B2 (en) 2014-03-31 2020-02-25 Sony Corporation Surgical control device, control method, and imaging control system
WO2015151447A1 (en) * 2014-03-31 2015-10-08 Sony Corporation Surgical control device, control method, and imaging control system
EP3096065A1 (en) * 2015-05-21 2016-11-23 Euromedis Groupe Video system including a hinged frame
FR3036458A1 (en) * 2015-05-21 2016-11-25 Euromedis Groupe VIDEO SYSTEM COMPRISING ARTICULATED ARMATURE
CN109715107A (en) * 2016-09-23 2019-05-03 索尼奥林巴斯医疗解决方案公司 Medical observation device and medical viewing system
EP3517070A4 (en) * 2016-09-23 2019-09-04 Sony Olympus Medical Solutions Inc. Medical observation device and medical observation system
US11432899B2 (en) 2016-09-23 2022-09-06 Sony Olympus Medical Solutions Inc. Medical observation device and medical observation system
CN107865702A (en) * 2016-09-28 2018-04-03 李健 A kind of medicinal intelligent operation microscopic system
AT521076A1 (en) * 2018-03-26 2019-10-15 Bhs Tech Gmbh Stereomicroscope for use in microsurgical procedures on the patient and methods for controlling the stereomicroscope
AT521076B1 (en) * 2018-03-26 2020-11-15 Bhs Tech Gmbh Stereo microscope for use in microsurgical interventions on patients and methods for controlling the stereo microscope
US11516437B2 (en) 2018-03-26 2022-11-29 Bhs Technologies Gmbh Stereo microscope for use in microsurgical operations on a patient and method for controlling the stereo microscope

Similar Documents

Publication Publication Date Title
US20050090730A1 (en) Stereoscopic video magnification and navigation system
US6919867B2 (en) Method and apparatus for augmented reality visualization
US11336804B2 (en) Stereoscopic visualization camera and integrated robotics platform
US11147443B2 (en) Surgical visualization systems and displays
US9766441B2 (en) Surgical stereo vision systems and methods for microsurgery
US6891518B2 (en) Augmented reality visualization device
US7907166B2 (en) Stereo telestration for robotic surgery
JP2575586B2 (en) Surgical device positioning system
WO2019210322A1 (en) Stereoscopic visualization camera and integrated robotics platform
US9330477B2 (en) Surgical stereo vision systems and methods for microsurgery
EP3912588A1 (en) Imaging system for surgical robot, and surgical robot
US20150085095A1 (en) Surgical visualization systems
US20060176242A1 (en) Augmented reality device and method
EP3725254A2 (en) Microsurgery system with a robotic arm controlled by a head-mounted display
WO2003002011A1 (en) Stereoscopic video magnification and navigation system
CN1894618A (en) System for 3-D observation of real-time or static-state picture
EP4221581A1 (en) Auto-navigating digital surgical microscope
CN116568219A (en) Automatic navigation digital operation microscope

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP