WO2014065336A1 - Insertion system, insertion support device, insertion support method and program - Google Patents

Info

Publication number
WO2014065336A1
WO2014065336A1 (application PCT/JP2013/078738, JP2013078738W)
Authority
WO
WIPO (PCT)
Prior art keywords
unit
output
information
inserted body
positional relationship
Prior art date
Application number
PCT/JP2013/078738
Other languages
French (fr)
Japanese (ja)
Inventor
Hiromasa Fujita (藤田 浩正)
Ryo Tojo (東條 良)
Original Assignee
Olympus Corporation (オリンパス株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Corporation (オリンパス株式会社)
Priority to CN201380055958.4A priority Critical patent/CN104755007B/en
Priority to EP13849891.0A priority patent/EP2912987A4/en
Publication of WO2014065336A1 publication Critical patent/WO2014065336A1/en
Priority to US14/693,142 priority patent/US20150223670A1/en

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/005 Flexible endoscopes
    • A61B1/009 Flexible endoscopes with bending or curvature detection of the insertion part
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002 Operational features of endoscopes
    • A61B1/00004 Operational features of endoscopes characterised by electronic signal processing
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002 Operational features of endoscopes
    • A61B1/00004 Operational features of endoscopes characterised by electronic signal processing
    • A61B1/00009 Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002 Operational features of endoscopes
    • A61B1/00025 Operational features of endoscopes characterised by power management
    • A61B1/00036 Means for power saving, e.g. sleeping mode
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002 Operational features of endoscopes
    • A61B1/00043 Operational features of endoscopes provided with output arrangements
    • A61B1/00045 Display arrangement
    • A61B1/0005 Display arrangement combining images e.g. side-by-side, superimposed or tiled
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002 Operational features of endoscopes
    • A61B1/00059 Operational features of endoscopes provided with identification means for the endoscope
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B1/05 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances characterised by the image sensor, e.g. camera, being in the distal end portion
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/06 Devices, other than using radiation, for detecting or locating foreign bodies; determining position of probes within or on the body of the patient
    • A61B5/061 Determining position of a probe within the body employing means separate from the probe, e.g. sensing internal probe position employing impedance electrodes on the surface of the body
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/06 Devices, other than using radiation, for detecting or locating foreign bodies; determining position of probes within or on the body of the patient
    • A61B5/065 Determining position of the probe employing exclusively positioning means located on or in the probe, e.g. using position sensors arranged on the probe
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B23/00 Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B23/24 Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
    • G02B23/2407 Optical details
    • G02B23/2423 Optical details of the distal end
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B23/00 Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B23/24 Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
    • G02B23/26 Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes, using light guides
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/107 Measuring physical dimensions, e.g. size of the entire body or parts thereof
    • A61B5/1076 Measuring physical dimensions, e.g. size of the entire body or parts thereof for measuring dimensions inside body cavities, e.g. using catheters
    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F04 POSITIVE-DISPLACEMENT MACHINES FOR LIQUIDS; PUMPS FOR LIQUIDS OR ELASTIC FLUIDS
    • F04C ROTARY-PISTON, OR OSCILLATING-PISTON, POSITIVE-DISPLACEMENT MACHINES FOR LIQUIDS; ROTARY-PISTON, OR OSCILLATING-PISTON, POSITIVE-DISPLACEMENT PUMPS
    • F04C2270/00 Control; Monitoring or safety arrangements
    • F04C2270/04 Force
    • F04C2270/042 Force radial
    • F04C2270/0421 Controlled or regulated

Definitions

  • The present invention relates to an insertion system for inserting an insertion portion of an insertion tool, such as an endoscope or a catheter, into an inserted body, and for assisting the insertion of the insertion portion into the inserted body in such a system.
  • The present invention also relates to an insertion support apparatus, an insertion support method, and a program for causing a computer to execute the procedures of the insertion support apparatus.
  • As an assisting device for observation performed by inserting an insertion portion into an inserted body, Patent Document 1, for example, discloses a configuration in which, when an endoscope insertion portion is inserted into a human body, the shape of the endoscope insertion portion is displayed on a display unit.
  • Specifically, a plurality of flexible bending-detection optical fibers, each having a bending detection portion whose light transmission amount changes according to the bending angle, are attached side by side to a flexible belt-shaped member, which is inserted over almost the entire length of the endoscope insertion portion. The bending state of the belt-shaped member at the location of each bending detection portion is detected from the light transmission amount of the corresponding optical fiber, and the bending state of the endoscope insertion portion is thereby displayed on a monitor screen.
  • The device of Patent Document 1 can thus present the shape of the insertion portion inside the inserted body, which cannot be seen from outside the inserted body. However, it has not been proposed to detect and display which part of the inserted body is being imaged (observed).
  • A method for detecting, and a method for displaying, the position inside the inserted body are therefore also desired.
  • The present invention has been made in view of the above points, and its object is to provide an insertion system, an insertion support apparatus, an insertion support method, and a program that can give an operator information for determining where in the inserted body imaging or work is being performed.
  • According to one aspect of the present invention, an insertion system is provided that includes an insertion portion to be inserted into an inserted body; an insertion state acquisition unit that acquires insertion state information of the insertion portion; an inserted body shape acquisition unit that acquires shape information of the inserted body; a positional relationship calculation unit that receives the insertion state information and the inserted body shape information and calculates the positional relationship of the insertion portion with respect to the inserted body; an output unit that outputs the calculation result of the positional relationship calculation unit; and a control unit that controls the output state of the calculation result at the output unit.
  • According to another aspect, an insertion support device is provided that includes an insertion state acquisition unit that acquires insertion state information of the insertion portion; an inserted body shape acquisition unit that acquires shape information of the inserted body; a positional relationship calculation unit that receives the insertion state information and the inserted body shape information and calculates the positional relationship of the insertion portion with respect to the inserted body; an output unit that outputs the calculation result of the positional relationship calculation unit as display information; and a control unit that controls the output state of the calculation result at the output unit.
  • According to a further aspect, an insertion support method is provided that comprises an insertion state acquisition step of acquiring insertion state information of the insertion portion; an inserted body shape acquisition step of acquiring shape information of the inserted body; a positional relationship calculation step of receiving the insertion state information and the inserted body shape information and calculating the positional relationship of the insertion portion with respect to the inserted body; an output step of outputting the calculation result of the positional relationship calculation step as display information; and a control step of controlling the output state of the calculation result at the output step.
  • According to yet another aspect, a program is provided that causes an insertion support device, which supports insertion of an insertion portion of an insertion tool into an inserted body, to execute an insertion state acquisition procedure of acquiring insertion state information of the insertion portion; an inserted body shape acquisition procedure of acquiring shape information of the inserted body; a positional relationship calculation procedure of receiving the insertion state information and the inserted body shape information and calculating the positional relationship of the insertion portion with respect to the inserted body; an output procedure of outputting the calculation result of the positional relationship calculation procedure as display information; and a control procedure of controlling the output state of the calculation result at the output procedure.
  • The present invention can accordingly provide an insertion system, an insertion support device, an insertion support method, and a program that make it easy for an operator to determine whether everything has been imaged or worked on, so that oversight of an observation or work site can be prevented.
  • Further, according to the present invention, when the positional relationship of the insertion portion with respect to the inserted body calculated by the positional relationship calculation unit is output at the output unit, the control unit sets at least the output start timing, so the result is output only when necessary rather than constantly. This makes it possible to provide output that is easy to understand (easy to see, hard to misread, and quick to judge) without presenting unnecessary information.
  • FIG. 1A is a diagram showing a schematic configuration of an insertion system to which an insertion support apparatus according to the first embodiment of the present invention is applied.
  • FIG. 1B is a block configuration diagram of the insertion support apparatus according to the first embodiment.
  • FIG. 1C is a diagram for explaining an example of provision of information by the output unit of the insertion support apparatus according to the first embodiment.
  • FIG. 2A is a diagram illustrating a schematic configuration of a flexible endoscope apparatus as an insertion tool in the insertion system according to the first embodiment.
  • FIG. 2B is a perspective view of the distal end of the insertion portion.
  • FIG. 3A is a diagram for explaining a configuration of an insertion / rotation detection unit.
  • FIG. 3B is a diagram for explaining the operation principle of the insertion / rotation detection unit.
  • FIG. 4 is a diagram illustrating an insertion state of the insertion portion into the inserted body.
  • FIG. 5A is a diagram illustrating the case where the bending portion is bent upward in the drawing, for explaining the principle of the fiber shape sensor.
  • FIG. 5B is a diagram illustrating the case where the bending portion is not bent, for explaining the principle of the fiber shape sensor.
  • FIG. 5C is a diagram illustrating the case where the bending portion is bent downward in the drawing, for explaining the principle of the fiber shape sensor.
  • FIG. 6 is a view showing a structure for attaching the fiber shape sensor to the insertion portion.
  • FIG. 7 is a diagram illustrating an operation flowchart of the insertion support apparatus according to the first embodiment.
  • FIG. 8A is a diagram for explaining an example of an entrance/exit of the inserted body.
  • FIG. 8B is a diagram for explaining another example of an entrance/exit of the inserted body.
  • FIG. 8C is a diagram for explaining still another example of an entrance/exit of the inserted body.
  • FIG. 8D is a diagram for explaining yet another example of an entrance/exit of the inserted body.
  • FIG. 9A is a diagram for explaining where the first position display shows in the inserted body.
  • FIG. 9B is a diagram for explaining where the second position display shows in the inserted body.
  • FIG. 10 is a timing chart for explaining the output timing of the output unit under the control of the control unit.
  • FIG. 11 is a diagram for explaining another example of a form of providing information.
  • FIG. 12A is a diagram for explaining a display example when the insertion portion is inserted into an inserted body with a branch, as yet another example of a form of providing information.
  • FIG. 12B is a diagram for explaining another display example.
  • FIG. 12C is a diagram for explaining still another display example.
  • FIG. 13A is a diagram illustrating a state before rotation for explaining a change in a captured image due to rotation of the insertion unit.
  • FIG. 13B is a diagram illustrating a state after rotation for explaining a change in a captured image due to rotation of the insertion unit.
  • FIG. 14A is a diagram for describing another example of a form of providing information.
  • FIG. 14B is a diagram for explaining still another example of the information provision form.
  • FIG. 14C is a diagram for describing another example of a form of providing information.
  • FIG. 14D is a diagram for explaining yet another example of the information provision form.
  • FIG. 14E is a diagram for describing another example of a form of providing information.
  • FIG. 14F is a diagram for describing still another example of the information provision form.
  • FIG. 15 is a diagram for explaining still another example of the information provision form.
  • FIG. 16A is a block diagram of an insertion system according to the second embodiment of the present invention.
  • FIG. 16B is a diagram for explaining an example of provision of information by the display device of the insertion system according to the second embodiment and the output unit of the insertion support device according to the second embodiment of the present invention.
  • FIG. 17 is a diagram showing a schematic configuration of a rigid endoscope apparatus as an insertion tool in the insertion system according to the third embodiment of the present invention.
  • FIG. 18 is a diagram showing a schematic configuration of a catheter as an insertion tool in the insertion system according to the fourth embodiment of the present invention.
  • As shown in FIG. 1A, an insertion system 1 according to the first embodiment of the present invention includes an insertion tool 3 having an insertion portion 31 that is inserted into an inserted body 2 and an imaging unit 32 that images the inside of the inserted body 2; an insertion/rotation detection unit 4 and a fiber shape sensor 5 as detection units that detect displacement amount information of the insertion portion 31; and an insertion support apparatus 6 according to the first embodiment of the invention, which acquires the insertion state information of the insertion portion 31 based on the displacement amount information from the insertion/rotation detection unit 4 and the fiber shape sensor 5, calculates the positional relationship of the insertion portion 31 with respect to the inserted body 2 based on the insertion state information and the shape information of the inserted body 2, and displays and outputs the calculation result.
  • Here, the insertion tool 3 is described as a flexible endoscope apparatus as shown in FIG. 2A, for example. This flexible endoscope apparatus includes the insertion portion 31 and an operation portion 33 configured integrally with the insertion portion 31.
  • The insertion portion 31 is a flexible tubular member and can be inserted into the inserted body 2 from its insertion port 21. The inside of the inserted body 2 is filled with a predetermined substance such as air, physiological saline, or a chemical solution.
  • As shown in FIG. 2B, an imaging opening 34 is provided at the end of the insertion portion 31 in the insertion direction (hereinafter, the insertion portion distal end), and the imaging unit 32 is built into the insertion portion 31 in the vicinity of the distal end. Light incident on the imaging opening 34 is received and imaged by the imaging unit 32, and the image captured by the imaging unit 32 is displayed and output by the insertion support device 6 according to the first embodiment.
  • Alternatively, the imaging unit 32 may be disposed in the operation portion 33 rather than near the distal end inside the insertion portion 31, and connected to the imaging opening 34 by a light guide or the like, so that light incident on the imaging opening 34 is guided to the imaging unit 32 for imaging.
  • The insertion portion 31 has a bending portion 35 in the vicinity of its distal end. Although not specifically illustrated, the bending portion 35 is connected by a wire to an operation lever provided on the operation portion 33; moving the operation lever pulls the wire, enabling the bending operation of the bending portion 35.
  • The insertion portion 31 also contains an illumination optical fiber, which guides light from an illumination light source (not shown) disposed in the operation portion 33 and emits it from a light supply unit 36 at the insertion portion distal end as illumination light for imaging.
  • Further, a treatment opening 37 is provided at the insertion portion distal end, and a treatment instrument passed through the insertion portion 31 from the operation portion 33 can extend out of the insertion portion 31 through the treatment opening 37.
  • The insertion/rotation detection unit 4 is installed in the vicinity of the insertion port 21 of the inserted body 2, detects the insertion amount and rotation amount of the insertion portion 31, and outputs them to the insertion support device 6 as part of the displacement amount information of the insertion portion 31.
  • As shown in FIG. 3A, the insertion/rotation detection unit 4 includes a light source 41, a light projecting lens 42, a light receiving lens 43, an optical pattern detection unit 44, and a displacement amount calculation unit 45.
  • Light emitted from the light source 41 is applied to the insertion portion 31 through the light projecting lens 42, and the light reflected by the insertion portion 31 is received by the optical pattern detection unit 44 through the light receiving lens 43.
  • The optical pattern detection unit 44 continuously captures images of the surface of the insertion portion 31, which constitute an optical pattern, at detection times t0, t1, t2, ..., tn, ....
  • As shown in FIG. 3B, the displacement amount calculation unit 45 compares an arbitrarily selected reference pattern α present in the image (optical pattern PTn) captured by the optical pattern detection unit 44 at an arbitrary time tn with the pattern α' matching the reference pattern α that is present in part of the image (optical pattern PTn+1) captured at a time tn+1 after an arbitrary interval, and calculates from their positions the displacement amounts on the image in the x-axis and y-axis directions.
  • Here, the optical pattern detection unit 44 is positioned so that its x-axis coincides with the axial direction of the insertion portion 31. Therefore, the displacement amount Δxf in the x-axis direction calculated by the displacement amount calculation unit 45 is proportional to the insertion amount of the insertion portion 31, and the displacement amount Δyf in the y-axis direction is proportional to the rotation amount of the insertion portion 31.
  • The displacement amounts (insertion amount and rotation amount) on the image thus calculated by the displacement amount calculation unit 45 are output to the insertion support device 6 as displacement amount information.
  • The displacement amount information also includes the insertion direction and the rotation direction.
  • Acquisition of the displacement amount information is not limited to the above method; it may be obtained by other processing based on the image data.
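As a rough illustration of this displacement calculation, the matching of the reference pattern α between two successive optical-pattern images can be sketched as a brute-force template search. The code below is a hypothetical reconstruction for explanation only; the function name, array layout, and sum-of-squared-differences criterion are assumptions of this sketch, not the patent's implementation.

```python
import numpy as np

def estimate_displacement(frame_prev, frame_next, ref_top, ref_left, ref_h, ref_w):
    """Find where the reference patch from frame_prev reappears in frame_next
    by exhaustive sum-of-squared-differences search; return (dx, dy) on the image."""
    alpha = frame_prev[ref_top:ref_top + ref_h, ref_left:ref_left + ref_w]
    best_score, best_pos = None, (ref_top, ref_left)
    rows, cols = frame_next.shape
    for top in range(rows - ref_h + 1):
        for left in range(cols - ref_w + 1):
            cand = frame_next[top:top + ref_h, left:left + ref_w]
            score = float(np.sum((cand - alpha) ** 2))
            if best_score is None or score < best_score:
                best_score, best_pos = score, (top, left)
    dx = best_pos[1] - ref_left  # x displacement: proportional to insertion amount
    dy = best_pos[0] - ref_top   # y displacement: proportional to rotation amount
    return dx, dy
```

With the x-axis aligned to the axial direction of the insertion portion as described above, multiplying dx and dy by calibration constants would give the physical insertion and rotation amounts.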
  • The fiber shape sensor 5 is installed inside the insertion portion 31 as shown in FIG. 6.
  • The fiber shape sensor 5 is composed of a plurality of optical fibers, each having one curvature detection unit 51.
  • Each curvature detection unit 51 is formed by removing the cladding of the optical fiber to expose the core and applying a light-absorbing member there.
  • When the bending portion 35 bends, the amount of light absorbed by the curvature detection unit 51 changes according to the curvature; that is, the light transmission amount of the fiber changes.
  • Although the fiber shape sensor 5 here is composed of a plurality of optical fibers, it is not limited thereto and may be a single optical fiber.
  • In the fiber shape sensor 5 configured as described above, two optical fibers 52 are arranged so that their curvature detection units 51 form a pair facing the X-axis direction and the Y-axis direction, respectively, in order to detect the X-axis and Y-axis bending shown in FIG. 4; each pair detects the amount of bending at one location. A plurality of such optical fibers 52 are arranged along the longitudinal direction of the insertion portion 31 so that the bending amount can be detected at a plurality of locations.
  • It is desirable that the curvature detection units 51 be provided not only in the bending portion 35 of the insertion portion 31 but also on the operation portion side thereof, so that the bending state of the flexible part of the insertion portion 31 other than the bending portion 35, which bends freely according to the internal structure of the inserted body 2, can also be detected.
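Once a bend angle has been derived from each curvature detection unit's light transmission amount, the detected angles can be chained into a shape of the insertion portion. The sketch below is a deliberately simplified illustration restricted to a single bending plane (2D), with constant bend per segment; the function name and the segment model are assumptions of this sketch, not the patent's method.

```python
import math

def reconstruct_shape(bend_angles_rad, segment_len):
    """Integrate per-segment bend angles (one per curvature detection
    location) into 2D positions of each detector along the insertion portion."""
    x = y = 0.0
    heading = 0.0  # cumulative direction of the insertion portion axis
    points = [(x, y)]
    for angle in bend_angles_rad:
        heading += angle
        x += segment_len * math.cos(heading)
        y += segment_len * math.sin(heading)
        points.append((x, y))
    return points
```

A straight (unbent) insertion portion yields points along the x-axis, while nonzero angles curve the reconstructed path, mirroring how the bending state would be drawn on a monitor.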
  • As shown in FIG. 6, an illumination optical fiber 38 and imaging portion wiring 39 are also provided inside the insertion portion 31. The illumination optical fiber 38 guides light from an illumination light source (not shown) disposed in the operation portion 33 and emits it from the insertion portion distal end as illumination light, so that the imaging unit 32 can image the inside of the inserted body 2 even though it is a dark place.
  • As shown in FIG. 1B, the insertion support device 6 includes an insertion state acquisition unit 61, an inserted body shape acquisition unit 62, a positional relationship calculation unit 63, an output unit 64, a storage unit 65, and a control unit 66.
  • The insertion state acquisition unit 61 acquires, as insertion state information of the insertion portion 31 at least partially inserted into the inserted body 2, for example the position and orientation of a certain point of the insertion portion 31. Further, based on the light transmission amounts, which change with the bending amount of each optical fiber 52, detected by the fiber shape sensor 5 as displacement amount information, it also acquires the position and orientation of each curvature detection unit 51 of the insertion portion 31 as insertion state information with respect to the inserted body 2.
  • The inserted body shape acquisition unit 62 acquires the shape information of the inserted body 2 (inserted body shape information).
  • The inserted body shape information is constructed, before the insertion portion 31 is inserted into the inserted body 2, from data obtained from outside or inside the inserted body 2.
  • Inserted body shape information based on external data is constructed using, for example, a CT diagnostic apparatus, an ultrasonic diagnostic apparatus, or an X-ray apparatus, whose signals can pass through the inserted body 2 for detection.
  • Inserted body shape information based on internal data is constructed by connecting trajectory data obtained while the insertion portion 31 is moved through the space of the inserted body 2, or position information obtained when the insertion portion distal end contacts the inserted body 2. If the position information at the moments of contact between the distal end and the inserted body 2 is used, the size of the space can be detected, and more accurate inserted body shape information can be acquired.
  • When the inserted body 2 is a human internal organ, the shape information can be estimated from the physique; when the inserted body 2 is a structure, it can also be constructed by entering the shape from a drawing.
  • The inserted body shape acquisition unit 62 may acquire the inserted body shape information directly, by connecting a device, such as a CT diagnostic apparatus, that constructs the information.
  • Alternatively, the inserted body shape information output from such a device may be stored temporarily on a storage medium and then read out, or it may be downloaded via a network.
  • The inserted body shape acquisition unit 62 is not limited to such an interface or data reader, and may itself be the device that constructs the inserted body shape information.
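As a toy illustration of the internal-data approach, the contact positions recorded whenever the insertion portion distal end touches the wall could be aggregated into a rough estimate of the space's extent. The function below and its axis-aligned-box summary are assumptions of this sketch; the patent does not specify this aggregation.

```python
import numpy as np

def cavity_extent(contact_points):
    """Summarize positions recorded at tip-wall contact as the axis-aligned
    bounding box of the contacted points, a crude measure of the space's size."""
    pts = np.asarray(contact_points, dtype=float)
    lo, hi = pts.min(axis=0), pts.max(axis=0)
    return lo, hi
```

A real implementation would instead connect the points into a surface, but even this box conveys why contact positions let the size of the space be detected.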
  • The positional relationship calculation unit 63 calculates the positional relationship of the insertion portion 31 with respect to the inserted body 2, that is, where the insertion portion 31 as a whole is and where its distal end points within the inserted body 2, based on the insertion state information acquired by the insertion state acquisition unit 61 and the inserted body shape information acquired by the inserted body shape acquisition unit 62. Specifically, the positional relationship calculation unit 63 first calculates the shape information of the insertion portion 31 from the position and orientation of each curvature detection unit 51 contained in the insertion state information acquired by the insertion state acquisition unit 61.
  • From this, the positional relationship of the insertion portion 31 with respect to the inserted body 2, that is, the position of the insertion portion distal end and the direction (axial direction) in which it faces, is calculated.
  • Next, the intersection between the direction in which the insertion portion distal end faces and the inserted body 2 is calculated. That is, as shown in FIG. 4, the positional relationship calculation unit 63 obtains, as the imaging position P, the intersection 72 between the straight line along the direction in which the distal end faces (imaging direction 71) and the shape of the inserted body 2, i.e., the center of the field of view (imaging region 73).
  • Alternatively, based on the inserted body shape information, the field of view of the imaging unit 32 (imaging region 73), that is, the area of the inserted body 2 captured from the distance between the position of the distal end of the insertion portion and the imaging surface of the inserted body 2, may be calculated as the imaging position P. In this way, the imaging region 73 can be obtained more accurately, and by using the imaging region 73 as the imaging position P, the range captured by the imaging unit 32 can be grasped.
  • Alternatively, a partial region 74 or a point within the field of view (imaging region 73) may be calculated as the imaging position P. For example, when the imaging region 73 cannot be detected accurately, calculating a smaller region that allows for the error prevents the erroneous detection that a range not actually imaged has been captured. That is, observation omissions can be prevented.
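The margin idea in the preceding paragraph can be sketched as follows (a toy model assuming a circular field of view with a known half-angle; the names and the error model are illustrative assumptions):

```python
import math

def imaging_region_radius(distance, half_angle_deg, angle_error_deg=0.0):
    """Radius, on the imaging surface, of the field of view (imaging
    region 73) for a tip at `distance` from the wall.  Subtracting an
    angular error margin yields a smaller, conservative region (in the
    spirit of the partial region 74), so an area is never counted as
    observed when it may not actually have been imaged."""
    a = math.radians(max(half_angle_deg - angle_error_deg, 0.0))
    return distance * math.tan(a)
```

For example, with a 45-degree half-angle at distance 10 mm the full region radius is 10 mm; allowing a 15-degree error shrinks the counted region to roughly 5.8 mm.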
  • The positional relationship calculation unit 63 outputs, as its calculation result to the control unit 66, the obtained shape information of the insertion portion 31, the obtained positional relationship of the insertion portion 31 with respect to the inserted body 2 (that is, the position of the distal end of the insertion portion and the direction (axial direction) in which it faces), the inserted body shape information acquired by the inserted body shape acquisition unit 62, and imaging position information indicating the obtained imaging position P (for example, the intersection 72).
  • Under the control of the control unit 66, the output unit 64 displays the calculation result of the positional relationship calculation unit 63 so that the operator can determine where in the inserted body 2 the distal end of the insertion portion faces.
  • the output unit 64 also functions as a display unit that can display the captured image obtained from the imaging unit 32.
  • the storage unit 65 stores the calculation result of the positional relationship calculation unit 63 and the captured image obtained from the imaging unit 32 under the control of the control unit 66.
  • The control unit 66 controls the output state of the display output on the output unit 64 and the storage of the calculation result of the positional relationship calculation unit 63 in the storage unit 65. That is, although the calculation results from the positional relationship calculation unit 63 are sequentially output to the control unit 66, the control unit 66 does not cause the output unit 64 to display all of them. Rather, for the imaging position information indicating at least a part of the imaging position P (for example, the intersection 72), the control unit 66 controls the presence or absence of the display output, that is, the display output start time and end time, and controls the type of display, that is, the display method, when the calculation result is displayed.
  • the storage in the storage unit 65 is controlled so that at least a part of the calculation result output from the positional relationship calculation unit 63 is stored. Further, not only the calculation result of the positional relationship calculation unit 63 but also the storage of the captured image from the imaging unit 32 can be similarly controlled.
  • the control unit 66 can include a determination unit 67 that determines whether the distal end of the insertion unit has reached a predetermined position.
  • the “predetermined position” refers to a portion that requires observation or treatment inside the inserted body 2.
  • The determination unit 67 makes this determination based on a part of the calculation result output from the positional relationship calculation unit 63, that is, the shape information of the insertion portion 31, the position of the distal end of the insertion portion and the direction (axial direction) in which it faces, and the inserted body shape information, or by pattern matching or the like on the captured image obtained from the imaging unit 32. The control unit 66 then controls the output unit 64 and the storage unit 65 based on the determination result of the determination unit 67.
  • the control unit 66 also has a function of sending information stored in the storage unit 65 (calculation results of the positional relationship calculation unit 63, captured images, etc.) to the output unit 64 for display.
  • the operation of the insertion support apparatus 6 configured as described above will be described with reference to FIG.
  • Here, the control by the control unit 66 of the calculation result of the positional relationship calculation unit 63, that is, the output state of the display output of the imaging position P (for example, the intersection 72) on the output unit 64 and the storage in the storage unit 65, will be described taking as an example the case where only the start of display output and storage is controlled.
  • First, the inserted body shape information is acquired by the inserted body shape acquisition unit 62 (step S1). Thereafter, the insertion state acquisition unit 61 acquires the insertion state information of the insertion portion 31 with respect to the inserted body 2 (step S2). Then, the shape of the insertion portion 31 is calculated by the positional relationship calculation unit 63 based on the position and orientation of each curvature detection unit 51 of the insertion portion 31 given as the insertion state information (step S3).
  • Next, based on the position and orientation of a point of the insertion portion 31 given as the acquired insertion state information and on the acquired inserted body shape information, the positional relationship of the insertion portion 31 with respect to the inserted body 2, that is, the position of the distal end of the insertion portion and the direction (axial direction) in which it faces, is calculated (step S4).
  • Next, the determination unit 67 of the control unit 66 determines whether or not the distal end of the insertion portion has reached a predetermined position (step S5). For example, taking the bladder as an example of the inserted body 2, whether the distal end of the insertion portion has reached the predetermined position is determined based on whether the distal end of the insertion portion 31 inserted from the urethral opening, which is the insertion port 21, has reached the entrance/exit 22 of the bladder as shown in FIG. 8A, that is, the connecting portion between the bladder and the urethra.
  • Alternatively, the determination can be made based on whether the distal end of the insertion portion has come within a predetermined distance of the bladder wall, for example, a distance of about 2 cm. Whether the entrance/exit 22 or the predetermined distance from the bladder wall has been reached can be determined from the calculation result output from the positional relationship calculation unit 63 (the position of the distal end of the insertion portion and the inserted body shape information) or from the captured image obtained from the imaging unit 32. For example, it is possible to judge from the captured image whether the blood vessel spacing on the bladder wall has become approximately 100 pixels. Further, since the imaging unit 32 cannot obtain a captured image suitable for observation if it is too close to the subject, the predetermined position can have a range: for example, approximately 2 cm to 5 mm in terms of distance to the bladder wall, or approximately 100 to 500 pixels in terms of the blood vessel spacing on the bladder wall.
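Putting the thresholds above into a single check (the ranges are the approximate figures from the text; the function and its arguments are a hypothetical sketch, with distance in metres):

```python
def tip_reached(distance_to_wall=None, vessel_spacing_px=None,
                dist_range=(0.005, 0.02), px_range=(100, 500)):
    """Sketch of the determination unit 67's reach check: the distal end
    counts as having reached the predetermined position when the tip-to-wall
    distance is within roughly 5 mm to 2 cm, or when the blood-vessel
    spacing measured in the captured image is roughly 100 to 500 pixels."""
    if distance_to_wall is not None and dist_range[0] <= distance_to_wall <= dist_range[1]:
        return True
    if vessel_spacing_px is not None and px_range[0] <= vessel_spacing_px <= px_range[1]:
        return True
    return False
```

Either criterion alone suffices, mirroring the text's "distance or pixel spacing" alternatives; both ranges can be tuned per inserted body.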
  • If the distal end of the insertion portion has not reached the predetermined position, the above operation is repeated from step S2. When the determination unit 67 determines that the distal end of the insertion portion has reached the predetermined position, the control unit 66 causes the output unit 64 to display and output imaging position information indicating the imaging position P (for example, the intersection 72), which is a part of the calculation result, and causes the storage unit 65 to store it (step S6). Thereafter, the above operation is repeated from step S2.
  • the output unit 64 displays as shown in FIG. 1C. That is, in the present embodiment, the output unit 64 displays and outputs the captured image (captured image display 641) from the imaging unit 32 and the insertion state (insertion state display 642) of the insertion unit 31.
  • The insertion state display 642 is displayed and output as two-dimensional diagrams 6421 and 6422 obtained by dividing the inserted body 2, given as the inserted body shape information, at predetermined locations. Further, on the two-dimensional diagrams 6421 and 6422, information relating to the imaging position P, that is, the intersection 72 between the inserted body 2 and the axial direction (imaging direction 71) of the distal end of the insertion portion, is displayed and output.
  • The two-dimensional diagrams 6421 and 6422 are generated by the control unit 66 based on the inserted body shape information. That is, the first two-dimensional diagram 6421 is a diagram showing the shape of the inserted body 2 divided by the Y-Z plane in the coordinate system of the inserted body 2 and opened left and right, as shown in FIG. 9A. The second two-dimensional diagram 6422 is a view from a viewpoint different from that of the first two-dimensional diagram 6421; as shown in FIG. 9B, it shows the shape divided by a different plane and opened up and down. The control unit 66 then causes a current position display 643 to be displayed and output as information relating to the imaging position P on the two-dimensional diagrams 6421 and 6422 of the output unit 64.
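The idea of placing one surface point into both opened-out views can be sketched as below (a toy stand-in for the real unfolding; the split planes and the orthographic drop of one axis per view are illustrative assumptions):

```python
def to_two_views(point):
    """Place a 3-D point on the inserted-body surface into the two 2-D
    diagrams: view 6421 splits the shape by the Y-Z plane and opens it left
    and right, while view 6422 splits by a different plane and opens it up
    and down.  Each view keeps the two coordinates lying in its own plane,
    so a point hard to locate in one view is still visible in the other."""
    x, y, z = point
    view_6421 = ('left' if x < 0 else 'right', (y, z))
    view_6422 = ('lower' if z < 0 else 'upper', (x, y))
    return view_6421, view_6422
```

The same 3-D location thus appears in both diagrams, which is what lets a marking or current position be recognized even when one projection hides it.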
  • However, the current position display 643 is not always shown. That is, as shown in FIG. 10, when the insertion system starts operating at time T1, the captured image display 641 starts immediately, but the current position display 643 on the two-dimensional diagrams 6421 and 6422 is not yet shown. The display output of the current position display 643 on the two-dimensional diagrams 6421 and 6422 is started only when the determination unit 67 of the control unit 66 determines that the distal end of the insertion portion has reached the predetermined position (time T2). Thereafter, the display output of the captured image display 641 and of the current position display 643 on the two-dimensional diagrams 6421 and 6422 continues until the insertion system is shut down at time T4.
  • Alternatively, the determination unit 67 of the control unit 66 may also determine that the distal end of the insertion portion has retreated back past the predetermined position, and the display output may be terminated at that time (time T3).
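The timing behaviour of FIG. 10 can be summarised in a minimal sketch (the class and attribute names are hypothetical):

```python
class DisplayController:
    """Sketch of the control unit 66's output-state control: the captured
    image display 641 runs from system start (time T1), while the current
    position display 643 is shown only while the determination unit 67
    reports that the tip is at or beyond the predetermined position
    (started at T2, and optionally ended at T3 if the tip retreats)."""
    def __init__(self):
        self.captured_image_on = True     # starts immediately at T1
        self.current_position_on = False  # withheld until T2

    def update(self, tip_at_predetermined_position):
        # determination unit 67's result drives display start/end
        self.current_position_on = bool(tip_at_predetermined_position)
```

Calling `update` each cycle with the determination result reproduces the T1 to T4 sequence described above.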
  • The current position display 643 may be the intersection 72 itself between the inserted body 2 and the axial direction of the distal end of the insertion portion, but a certain range, such as the imaging region 73 or a partial region 74 within the imaging region centered on the intersection 72, is easier to recognize visually.
  • In addition to the current position display 643 as information relating to the current imaging position P, the output unit 64 can display and output an insertion portion shape schematic display 644 indicating the shape of the insertion portion 31. That is, since the shape of the insertion portion 31 and the position and orientation of the distal end of the insertion portion are calculated by the positional relationship calculation unit 63 as described above, the insertion portion shape schematic display 644 can show what the insertion state of the insertion portion 31 in the inserted body 2 is.
  • the output unit 64 can also display and output the position trajectory display 645 using the calculation result stored in the storage unit 65.
  • The calculation results stored in the storage unit 65 can likewise be recorded starting from the time (time T2) at which the distal end of the insertion portion reaches the predetermined position, based on the determination of the determination unit 67.
  • The position trajectory display 645 likewise covers a certain range, as the current position display 643 does. In this case, to distinguish the current position display 643 from the position trajectory display 645, it is desirable to apply some identification display, such as giving them different colors, densities, or patterns, or making the position trajectory display 645 blink. This identification display form may be controlled by the control unit 66 in response to selection by an operator's switch operation.
  • The switch can be arranged on the operation unit 33 of the flexible endoscope apparatus serving as the insertion tool 3, on a foot switch, at a location where the operator's brain waves or muscle movements can be detected, on the housing of the insertion support device 6, or on a treatment instrument inserted through the insertion portion 31 from the operation unit 33 of the endoscope apparatus.
  • The determination unit 67 may determine whether the distal end of the insertion portion has reached the predetermined position not only from the calculation result of the positional relationship calculation unit 63 or the captured image from the imaging unit 32, but also, or instead, in response to the operator's operation of a switch. That is, the operator observes the captured image shown as the captured image display 641 and operates the switch upon judging that the distal end of the insertion portion has reached the predetermined position, whereupon the determination unit 67 determines that the distal end of the insertion portion has reached the predetermined position.
  • The operator may also set the predetermined position arbitrarily by operating this switch; that is, the predetermined position can be changed according to the type and size of the inserted body 2. For example, regarding the entrance/exit 22: if the inserted body 2 is a bladder, it is the connecting portion between the urethra and the bladder as shown in FIG. 8A; if the inserted body 2 is the stomach, it is the corresponding connecting portion as shown in FIG. 8B; if the inserted body 2 is the large intestine comprising the cecum, ascending colon, transverse colon, descending colon, sigmoid colon, and rectum, it is as shown in FIG. 8C; and if it is piping as shown in FIG. 8D, it is the opening of the piping or a branch location.
  • Further, a function that allows the operator to apply a marking 646 by a switch operation may be added. When the switch is operated, the control unit 66 controls the storage unit 65 to store information on the location concerned and causes the output unit 64 to display the marking 646 at that location. In this case, since the marking 646 is displayed on both two-dimensional diagrams 6421 and 6422, which show the shape of the inserted body 2 divided by different planes in its coordinate system and opened out, even if the position is difficult to recognize in one two-dimensional diagram (6421 in this example), it can easily be recognized in the other (6422 in this example). This makes it easy for the operator to determine which part has been given the marking 646.
  • The location of the marking 646 may be fixed to any one of the intersection 72, the imaging region 73, and a partial region 74 within the imaging region, or may be set arbitrarily by the operator. Making such a marking 646 possible allows the operator to confirm an observed part, a part that requires re-observation, or a part that requires some kind of treatment (removal, sampling, repair, etc.). Furthermore, when re-observing the inserted body 2 on a different occasion, it is possible to identify a previously marked location and to bring the distal end of the insertion portion to that location quickly.
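One way to sketch the storage side of the marking 646 (the record fields here are illustrative, not prescribed by the text):

```python
def add_marking(store, position, kind='observed', insertion_direction=None):
    """Sketch of the control unit 66 recording a marking 646 in the storage
    unit 65: the marked location, a type such as 'observed' or
    'needs-reobservation', and optionally the insertion direction of the
    insertion portion 31 are stored so the marking can be redrawn later on
    both two-dimensional diagrams, or used to guide a later re-observation."""
    record = {'position': tuple(position),
              'kind': kind,
              'insertion_direction': insertion_direction}
    store.append(record)
    return record
```

Storing the `kind` with each record is what allows the display form (color, shape) to differ per marking type, as described below.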
  • At the time of storage, not only the information on the location where the marking 646 was made but also the insertion direction of the insertion portion 31 and the like can be stored, so that the stored data can be used for confirmation after observation, for observing in the same state next time, and so on.
  • If the display form, for example color and shape, is also varied according to type information and symptom/status information, the operator can easily determine the meaning of each marking 646.
  • Further, a pointing device or a visual recognition device may be connected to the insertion support device 6 so that the operator can designate an arbitrary range or point on the captured image display 641 or on the two-dimensional diagrams 6421 and 6422 displayed on the output unit 64.
  • In addition, an area division display 647 is performed. This divides the inserted body into two or more areas according to a convention or regulation of an academic society, areas used as a standard, or a predetermined division method. This makes it easy for the operator to determine which part of the inserted body 2 is being viewed.
  • Further, a curvature detection unit display 648 indicating the positions of the curvature detection units 51 may be superimposed on the insertion portion shape schematic display 644.
  • Although the insertion/rotation detection unit 4 and the fiber shape sensor 5 optically detect the shape of the insertion portion 31 with respect to the inserted body 2 and the position and orientation of the imaging opening 34 as described above, other detection methods may be used. For example, a coil is provided inside the insertion portion 31, at least in the vicinity of the imaging opening 34; a magnetic field is generated by passing a current through the coil and received outside, or a magnetic field distribution generated outside is received by the coil. The position and orientation of the coil, that is, of the imaging opening 34, can thereby be detected. If a plurality of coils are arranged along the longitudinal direction of the insertion portion 31, the shape of the insertion portion 31 can also be detected.
  • As described above, the insertion state acquisition unit 61 acquires, as the insertion state information of (at least a part of) the insertion portion 31 inserted into the inserted body 2, for example the position and orientation of a point (the insertion portion tip) of the insertion portion 31 and the position and orientation of each curvature detection unit 51 of the insertion portion 31; the inserted body shape acquisition unit 62 acquires the shape information of the inserted body 2; the insertion state information and the inserted body shape information are input to the positional relationship calculation unit 63, which calculates the positional relationship of the insertion portion 31 with respect to the inserted body 2 (the position and orientation of the entire insertion portion 31 and of the distal end of the insertion portion); and the output unit 64 displays and outputs the calculation result of the positional relationship calculation unit 63.
  • Accordingly, it is possible to understand at which location of the inserted body 2 and from which direction the captured image displayed by the output unit 64 is being observed. In addition, observed and unobserved parts of the inserted body 2 can easily be identified, preventing oversight.
  • When the output unit 64 displays the calculation result of the positional relationship calculation unit 63, that is, the imaging position P, it would always be displayed if the control unit 66 were not provided, and would be output even when no output is necessary. In the present embodiment, however, the control unit 66 controls the display so that it appears only when necessary. Therefore, unnecessary calculation results are not displayed, and the screen remains easy to understand (easy to see, hard to misread, and quick to judge). In addition, if the control unit 66 is controlled by the operator of the flexible endoscope apparatus serving as the insertion tool 3, only the calculation results the operator needs (current position, already observed parts, affected parts, parts requiring extraction, etc.) can reliably be displayed and output.
  • Further, since the storage unit 65 stores at least a part of the calculation result of the positional relationship calculation unit 63, the observation range of the inserted body 2 displayed on the output unit 64 (the positions observed with respect to the inserted body 2) can be retained and reviewed. In addition, the imaging unit 32 having the imaging opening 34 is provided in the insertion portion 31, and by displaying the captured image obtained from this imaging unit 32 on the output unit 64, the region ahead of the distal end of the insertion portion can be observed.
  • The control unit 66 includes the determination unit 67, which determines from the calculation result of the positional relationship calculation unit 63 whether the distal end of the insertion portion has reached the predetermined position. When the determination unit 67 determines that it has, the control unit 66 controls the output state of the output unit 64; that is, it switches the start/end of the continuous display output of the current position display 643 indicating the imaging position P (the calculation result of the positional relationship calculation unit 63), the display output content, and the display output method (start/end of the position trajectory display 645 and the marking 646, their type, size, etc.).
  • Here, the predetermined position is a position set in advance on the shape information, such as the entrance/exit 22 of the observation location of the inserted body 2 (a predetermined organ in the case of organs; the insertion opening of the inserted body 2, the entrance of the observation region, etc.), a known branch location, or a location where the captured image shows a specific pattern (cancer, polyp, crack, etc.).
  • The size of the position trajectory display 645 and the marking 646 may be varied according to the distance between the distal end of the insertion portion and the inserted body 2, or may be made just large enough that the intersection of the imaging direction 71 with the inserted body 2, serving as the imaging position P, can be discriminated.
  • Since the control unit 66 automatically controls the output state of the output unit 64 as described above when the distal end of the insertion portion reaches the predetermined position of the inserted body 2, only the locations and states set in advance as necessary are displayed and output, and the operator of the insertion tool 3 can concentrate on the operation.
  • Alternatively, if the output state is switched in response to the operator's switch operation, the control unit 66 can perform the necessary display output according to the operator's intention, and only the information the operator needs can reliably be output. Also, a plurality of results judged by the operator (including those that are difficult to determine automatically by image recognition such as pattern matching) can be output in a distinguishable manner.
  • The number of switches may be one or more (for instructing start/end of the current position display 643, start/end of the position trajectory display 645, the marking 646, another marking, and deletion). With a single switch, a plurality of output states can be switched by the number or pattern of on/off operations, sequential function switching, and the like.
  • The control unit 66 may also display a state display range corresponding to the state of the observation (work) target part of the inserted body 2, such as a cancer, polyp, or crack, and to the size and extent of that target part.
  • Since the control unit 66 stores the captured image in the storage unit 65 in association with the calculation result of the positional relationship calculation unit 63, the state of a predetermined location at the time of storage can be compared, as the stored observation state, with the current observation state, and differences and changes can be obtained. That is, since the captured image of the observation range and the calculation result of the positional relationship calculation unit 63, that is, the imaging position P, correspond to each other, it is possible to know which part each image is the result of imaging, and to match changes of the same affected part accurately across multiple examinations.
  • Likewise, by having the storage unit 65 store the captured image in association with at least the calculation result of the positional relationship calculation unit 63 and symptom information, differences and changes between the observation state at the time of storage and the current observation state of the predetermined location can be obtained.
  • Further, the control unit 66 causes the output unit 64 to display, as the calculation result of the positional relationship calculation unit 63, markings that present two or more different predetermined states in a distinguishable manner. Thus, multiple states (current observation range, trajectory, affected part 1 (cancer, inflammation, defect, wound, corrosion, etc.), affected part 2 (cancer, inflammation, defect, wound, corrosion, etc.), treated, follow-up required, etc.) can be distinguished visually. If these states are stored in the storage unit 65, then when the same place is to be observed again at a different time, only the parts in a given state need be observed, improving observation efficiency. Moreover, the degree of change of each part can be obtained.
  • The operator of the insertion portion 31 can select the case classification for these predetermined states. Since cases can thus be classified according to the operator's intention, re-observation criteria and the like can be set arbitrarily based on the standards of the field to which the operator belongs. In other words, the system can be used appropriately in each field, such as the medical field and the industrial field.
  • The control unit 66 also controls the output unit 64 to perform a display divided into a plurality of areas. If the display screen is divided into areas following the conventions or regulations of an academic society, or areas used as a standard, it becomes easier to understand which part is being observed when viewing the observation range.
  • Further, the control unit 66 causes the output unit 64 to display the same region of the inserted body 2 in two differently divided views, for example the different two-dimensional diagrams 6421 and 6422, so that the observation (work) target site and/or the positional relationship of the insertion portion 31 to that site can be specified. When the three-dimensional inserted body 2 is expressed in two dimensions, the portion corresponding to the depth direction can become invisible or hard to see, but this arrangement prevents that. Therefore, overlooked parts and marking positions can be recognized reliably (easy to see, hard to mistake, and quick to judge).
  • The control unit 66 also functions as a related information calling unit that causes the information stored in the storage unit 65 to be displayed on the output unit 64. From the stored results of the field of view (imaging region 73), it is possible to look back on the observation results at the time of insertion, that is, on which places of the inserted body 2 have been observed and which have not. Further, since the locations observed previously are stored, at the time of the next insertion the same locations can easily be guided to or specified.
  • In the present embodiment, the insertion amount and rotation amount of the insertion portion 31 are detected by the insertion/rotation detection unit 4, and the shape of the insertion portion 31 is detected by the fiber shape sensor 5. Since the insertion amount and rotation amount of the insertion portion 31 into the inserted body 2 from the reference position and the shape of the insertion portion 31 can be detected, the insertion state information with respect to the inserted body 2 (the shape of the insertion portion 31, the position and orientation of the imaging opening 34, and the like) can be detected.
  • the positional relationship calculation unit 63 calculates the position and orientation of the distal end of the insertion unit and outputs them to the control unit 66. Since the calculation of the position and orientation of the distal end of the insertion portion is the same as the above calculation, the description is omitted here.
  • FIG. 12A shows, on a two-dimensional diagram 6421A representing the shape of the inserted body, an insertion portion shape schematic display 644 showing the current shape of the insertion portion 31, a current position display 643A representing the current position of the imaging opening 34 (the distal end position of the insertion portion 31), and a position trajectory display 645A tracing the trajectory of the imaging opening 34. Note that the position trajectory display 645, the trajectory of the imaging position, is omitted here. With this information on the distal end position, that is, the position of the imaging opening 34, the operator can carry out observation or treatment appropriate to the current position, and examine the route from the current position to the target position, without having to guess where the tip is. Therefore, there is no need to repeat trial and error to reach the target position, or to verify by various means from the observation image whether the target position has been reached.
  • As a result, the possibility of reaching the target position from the current position by the shortest course in a single attempt increases, time can be shortened, and since the situation regarding the position can be grasped, the operator can work calmly and with confidence.
  • In addition to the position of the distal end of the insertion portion (that is, the imaging opening 34), the direction in which it faces may also be displayed. Here, the direction of the distal end of the insertion portion, that is, the direction in which the imaging opening 34 faces, is indicated by an arrow 649, and direction information is added by arrows 649 at several positions along the trajectory of the imaging opening 34. The position trajectory display 645A of the distal end position and the arrows 649 indicating direction thus convey the position of the imaging opening 34 at the distal end of the insertion portion, the direction of the distal end, and how the position and direction have changed.
  • The direction in which the imaging opening 34 at the distal end of the insertion portion faces corresponds to the center of the field of view, that is, the center of the captured image, so the arrival position and orientation with respect to the imaging target can be known. Since the viewing direction and the center of the field of view are known from the current position and orientation, the observation or treatment to be performed according to the current position and orientation, the route from the current position to the target position, and the shape and operation method of the insertion portion 31 for moving there can be examined without guessing at the current position and orientation.
  • By knowing the direction of the distal end of the insertion portion, operation methods and procedures, such as insertion/withdrawal and bending, for reaching the target position and orientation can be studied.
  • the direction of the distal end of the insertion section that is, the direction in which the imaging opening 34 faces may be shown three-dimensionally as representing the orientation of the distal end of the insertion section or rotation.
  • the rotation of the coordinate system fixed to the distal end of the insertion portion that is, the coordinate system in which the position and orientation of the distal end of the insertion portion does not change is defined as the “three-dimensional direction” of the distal end of the insertion portion
  • the three-dimensional direction of the distal end of the insertion portion The trajectory is shown in FIG. 12C.
  • the direction of the distal end of the insertion portion that is, the direction in which the imaging opening 34 is directed is indicated by an arrow 649A in three directions (x direction, y direction, z direction). Show.
  • information regarding the three-dimensional direction including rotation may be displayed together with the image captured by the imaging unit 32.
  • the imaging direction including rotation at the imaging position of the distal end of the insertion portion can be known.
  • the effect of rotation around the tip can be taken into account for treatment and the like.
  • as for the imaging direction including rotation at the imaging position: for example, as shown in FIGS. 13A and 13B, even if the direction in which the imaging opening 34 faces is the same, when the insertion portion 31 rotates around the imaging direction, the imaging opening 34 also rotates with respect to the object 81.
  • between FIG. 13A and FIG. 13B the image is rotated by 180°, so it is upside down. In such a case, the captured image I captured by the imaging unit 32 is also displayed upside down. The effect of rotation around the imaging direction can thus be taken into account during observation and treatment, and the top and bottom of the captured image can be grasped accurately without mistake.
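The relationship between roll of the insertion portion about the imaging direction and the orientation of the displayed captured image can be sketched as follows. This is a minimal illustration, not part of the disclosure; the function name, the NumPy representation of the frame, and the restriction to 90° steps are assumptions made for brevity.

```python
import numpy as np

def apply_roll_to_image(image: np.ndarray, roll_deg: float) -> np.ndarray:
    """Rotate a captured frame to reflect roll of the insertion portion
    about the imaging direction (restricted to 90-degree steps here)."""
    quarter_turns = int(round(roll_deg / 90.0)) % 4
    return np.rot90(image, k=quarter_turns)

frame = np.array([[1, 2],
                  [3, 4]])
# A 180-degree roll, as in FIGS. 13A/13B, shows the frame upside down.
print(apply_roll_to_image(frame, 180.0))
```

With `roll_deg=180` the printed array is the original frame reversed in both axes, matching the upside-down display described above.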
  • the form of providing information is not limited to the display layout of the output unit 64 as shown in FIG. 1C.
  • the captured image display 641 and the insertion state display 642 can be laid out in various sizes and positions, and in various overlapping states.
  • the operator may arbitrarily select, from the plurality of display layouts, whichever one the operator finds easiest to see, and change it.
  • the insertion state display 642 is not limited to displays like the two-dimensional diagrams 6421 and 6422; for example, it may be displayed as a three-dimensional diagram 6423 as shown in FIG.
  • the display unit that displays the captured image is configured as the same device as the output unit 64.
  • the insertion system 1 according to the second embodiment is configured as illustrated in FIG. 16A.
  • as a display unit, a display device 9, which is a device separate from the output unit 64, is further provided. In such a configuration, as shown in FIG. 16B, only the insertion state display 642 is shown on the output unit 64, and only the captured image display 641 is shown on the display device 9.
  • by arranging the output unit 64 and the display device 9 near each other, the calculation result of the positional relationship calculation unit 63 and the captured image obtained by the imaging unit 32 can be viewed almost simultaneously, so both can be recognized at the same time and the calculation result of the positional relationship calculation unit 63 can be recognized without burden.
  • the insertion system 1 according to the third embodiment is an example in which the insertion tool 3 is a rigid endoscope apparatus, as shown in FIG. 17.
  • a rigid endoscope apparatus differs from the flexible endoscope apparatus of the first (and second) embodiment described above in that the insertion portion 31 has no flexibility, and its shape is therefore constant.
  • the shape information of the insertion unit 31 may be stored in advance in the positional relationship calculation unit 63 or may be simply input.
  • the insertion system 1 according to the fourth embodiment is an example in which the insertion tool 3 is a treatment tool that has no imaging unit, such as a catheter, as shown in FIG. 18.
  • a treatment instrument includes a flexible insertion portion 31 as in the flexible endoscope apparatus in the first (and second) embodiment.
  • the configuration, operation, and effects other than the portion related to the captured image are the same as those of the insertion system according to the first (and second) embodiment.
  • the above functions can also be realized by supplying a computer with a software program implementing the functions shown in the flowchart of FIG. 7 and having the computer execute the program.
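As a hedged sketch of how such a program might divide the work among the units of FIG. 1B (61 through 66), the following reduces the inserted body to a single path length; all names, the one-dimensional model, and the numeric values are illustrative assumptions, not details from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class InsertionState:            # output of insertion state acquisition unit 61
    insertion_mm: float
    rotation_deg: float

@dataclass
class BodyShape:                 # output of inserted body shape acquisition unit 62
    path_length_mm: float        # inserted body reduced to one path for brevity

def positional_relationship(state: InsertionState, shape: BodyShape) -> float:
    """Positional relationship calculation unit 63: fraction of the body's
    path covered by the distal end, clamped to [0, 1]."""
    return max(0.0, min(state.insertion_mm / shape.path_length_mm, 1.0))

def render_output(relation: float, output_enabled: bool):
    """Output unit 64, gated by the control unit 66 so that the result
    is presented only when needed."""
    return f"distal end at {relation:.0%} of path" if output_enabled else None

state = InsertionState(insertion_mm=250.0, rotation_deg=15.0)
shape = BodyShape(path_length_mm=500.0)
print(render_output(positional_relationship(state, shape), True))
```

Gating the output through `output_enabled` mirrors the control unit setting at least the output start time, so that nothing is shown until the result is wanted.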

Abstract

Insertion state information about an insertion portion inserted into an inserted body is acquired by an insertion state acquisition unit (61); shape information about the inserted body is acquired by an inserted body shape acquisition unit (62); and the insertion state information and inserted body shape information are input to a positional relationship calculation unit (63), which calculates the positional relationship of the insertion portion relative to the inserted body. When the calculation result of the positional relationship calculation unit (63) is output by an output unit (64), the output state of the calculation result at the output unit (64) is controlled by a control unit (66).

Description

Insertion system, insertion support apparatus, insertion support method, and program
 The present invention relates to an insertion system in which an insertion portion of an insertion tool such as an endoscope or a catheter is inserted into an inserted body to perform observation or work, to an insertion support apparatus and an insertion support method that support insertion of the insertion portion into the inserted body in such an insertion system, and to a program that causes a computer to execute the procedures of the insertion support apparatus.
 As a support device for observation performed by inserting an insertion portion into an inserted body, for example, Patent Document 1 discloses a configuration in which, when an endoscope insertion portion is inserted into a human body, the shape of the endoscope insertion portion is displayed on a display unit.
 In this endoscope apparatus, a plurality of flexible bend-detecting optical fibers, each having a bend detection portion whose light transmission amount changes according to the magnitude of the bend angle, are attached side by side to a flexible belt-like member, which is inserted through almost the entire length of the endoscope insertion portion. The bending state of the belt-like member at the location of each bend detection portion is detected from the light transmission amount of the corresponding bend-detecting optical fiber, and that bending state is displayed on a monitor screen as the bending state of the endoscope insertion portion.
Patent Document 1: US Pat. No. 6,846,286
 In general, there are few landmarks inside an inserted body; when it is difficult to tell from the captured image alone which part of the inserted body is being observed, it is also difficult to tell whether all the necessary parts have been imaged (observed).
 The technique disclosed in Patent Document 1 can present the shape of the insertion portion inside the inserted body, which cannot be seen from outside the inserted body during insertion. However, it does not propose detecting which part of the inserted body is being imaged (observed), nor a method of displaying this.
 A way to detect and display which part of the inserted body is being worked on is likewise desired for tubular insertion tools, such as catheters, that have no imaging unit but are inserted into the inserted body to perform predetermined work.
 The present invention has been made in view of the above points, and an object thereof is to provide an insertion system, an insertion support apparatus, an insertion support method, and a program that can provide an operator with information for judging where in the inserted body imaging, or work, is being performed.
 According to a first aspect of the present invention, there is provided an insertion system comprising: an insertion portion to be inserted into an inserted body; an insertion state acquisition unit that acquires insertion state information of the insertion portion; an inserted body shape acquisition unit that acquires shape information of the inserted body; a positional relationship calculation unit that receives the insertion state information and the inserted body shape information and calculates a positional relationship of the insertion portion with respect to the inserted body; an output unit for outputting a calculation result of the positional relationship calculation unit; and a control unit that controls an output state of the calculation result at the output unit.
 According to a second aspect of the present invention, there is provided an insertion support apparatus that supports insertion of an insertion portion of an insertion tool into an inserted body, comprising: an insertion state acquisition unit that acquires insertion state information of the insertion portion; an inserted body shape acquisition unit that acquires shape information of the inserted body; a positional relationship calculation unit that receives the insertion state information and the inserted body shape information and calculates a positional relationship of the insertion portion with respect to the inserted body; an output unit for outputting a calculation result of the positional relationship calculation unit as display information; and a control unit that controls an output state of the calculation result at the output unit.
 According to a third aspect of the present invention, there is provided an insertion support method used in an insertion support apparatus that supports insertion of an insertion portion of an insertion tool into an inserted body, comprising: an insertion state acquisition step of acquiring insertion state information of the insertion portion; an inserted body shape acquisition step of acquiring shape information of the inserted body; a positional relationship calculation step of receiving the insertion state information and the inserted body shape information and calculating a positional relationship of the insertion portion with respect to the inserted body; an output step of outputting a calculation result of the positional relationship calculation step as display information; and a control step of controlling an output state of the calculation result at the output step.
 According to a fourth aspect of the present invention, there is provided a program that causes a computer to execute: an insertion state acquisition procedure of acquiring insertion state information of an insertion portion in an insertion support apparatus that supports insertion of the insertion portion of an insertion tool into an inserted body; an inserted body shape acquisition procedure of acquiring shape information of the inserted body; a positional relationship calculation procedure of receiving the insertion state information and the inserted body shape information and calculating a positional relationship of the insertion portion with respect to the inserted body; an output procedure of outputting a calculation result of the positional relationship calculation procedure as display information; and a control procedure of controlling an output state of the calculation result in the output procedure.
 According to the present invention, information for judging where in the inserted body imaging or work is being performed can be provided, so the operator can easily judge which part of the inserted body is being imaged or worked on and whether all necessary parts have been imaged or worked on; thus an insertion system, an insertion support apparatus, an insertion support method, and a program that can prevent observation or work locations from being overlooked can be provided.
 In addition, according to the present invention, when the output unit outputs the positional relationship of the insertion portion with respect to the inserted body calculated by the positional relationship calculation unit, the control unit sets at least the output start time, so the result is output only when needed rather than at all times; unnecessary information is not provided, and the output becomes easy to understand (easy to see, hard to mistake, quick to judge).
FIG. 1A is a diagram showing the schematic configuration of an insertion system to which an insertion support apparatus according to the first embodiment of the present invention is applied.
FIG. 1B is a block configuration diagram of the insertion support apparatus according to the first embodiment.
FIG. 1C is a diagram for explaining an example of information provision by the output unit of the insertion support apparatus according to the first embodiment.
FIG. 2A is a diagram showing the schematic configuration of a flexible endoscope apparatus as the insertion tool in the insertion system according to the first embodiment.
FIG. 2B is a perspective view of the distal end of the insertion portion.
FIG. 3A is a diagram for explaining the configuration of the insertion/rotation detection unit.
FIG. 3B is a diagram for explaining the operation principle of the insertion/rotation detection unit.
FIG. 4 is a diagram showing the insertion state of the insertion portion into the inserted body.
FIG. 5A is a diagram for explaining the principle of the fiber shape sensor, showing a case where the bending portion is bent upward on the page.
FIG. 5B is a diagram for explaining the principle of the fiber shape sensor, showing a case where the bending portion is not bent.
FIG. 5C is a diagram for explaining the principle of the fiber shape sensor, showing a case where the bending portion is bent downward on the page.
FIG. 6 is a view showing the structure for attaching the fiber shape sensor to the insertion portion.
FIG. 7 is an operation flowchart of the insertion support apparatus according to the first embodiment.
FIG. 8A is a diagram for explaining an example of the entrance/exit of the inserted body.
FIG. 8B is a diagram for explaining another example of the entrance/exit of the inserted body.
FIG. 8C is a diagram for explaining still another example of the entrance/exit of the inserted body.
FIG. 8D is a diagram for explaining another example of the entrance/exit of the inserted body.
FIG. 9A is a diagram for explaining where inside the inserted body the first position display indicates.
FIG. 9B is a diagram for explaining where inside the inserted body the second position display indicates.
FIG. 10 is a timing chart for explaining the output timing of the output unit under the control of the control unit.
FIG. 11 is a diagram for explaining another example of the form of providing information.
FIG. 12A is a diagram for explaining, as still another example of the form of providing information, a display example when the insertion portion is inserted into an inserted body with a branch.
FIG. 12B is a diagram for explaining another display example.
FIG. 12C is a diagram for explaining still another display example.
FIG. 13A is a diagram showing the state before rotation, for explaining the change in the captured image caused by rotation of the insertion portion.
FIG. 13B is a diagram showing the state after rotation, for explaining the change in the captured image caused by rotation of the insertion portion.
FIG. 14A is a diagram for explaining a further example of the form of providing information.
FIG. 14B is a diagram for explaining a further example of the form of providing information.
FIG. 14C is a diagram for explaining a further example of the form of providing information.
FIG. 14D is a diagram for explaining a further example of the form of providing information.
FIG. 14E is a diagram for explaining a further example of the form of providing information.
FIG. 14F is a diagram for explaining a further example of the form of providing information.
FIG. 15 is a diagram for explaining still another example of the form of providing information.
FIG. 16A is a block configuration diagram of an insertion system according to the second embodiment of the present invention.
FIG. 16B is a diagram for explaining an example of information provision by the display device of the insertion system according to the second embodiment and the output unit of the insertion support apparatus according to the second embodiment of the present invention.
FIG. 17 is a diagram showing the schematic configuration of a rigid endoscope apparatus as the insertion tool in the insertion system according to the third embodiment of the present invention.
FIG. 18 is a diagram showing the schematic configuration of a catheter as the insertion tool in the insertion system according to the fourth embodiment of the present invention.
 Hereinafter, embodiments for carrying out the present invention will be described with reference to the drawings.
 [First Embodiment]
 As shown in FIG. 1A, the insertion system 1 according to the first embodiment of the present invention comprises: an insertion tool 3 having an insertion portion 31 that is inserted into an inserted body 2 and an imaging unit 32 that images the inserted body 2; an insertion/rotation detection unit 4 and a fiber shape sensor 5 serving as detection units that detect displacement amount information of the insertion portion 31; and an insertion support apparatus 6 according to the first embodiment of the present invention, which acquires insertion state information of the insertion portion 31 from the displacement amount information supplied by the insertion/rotation detection unit 4 and the fiber shape sensor 5, calculates the positional relationship of the insertion portion 31 with respect to the inserted body 2 based on that insertion state information and the shape information of the inserted body 2, and displays and outputs the calculation result.
 Hereinafter, the insertion tool 3 will be described as being, for example, a flexible endoscope apparatus as shown in FIG. 2A. This flexible endoscope apparatus includes an insertion portion 31 and an operation portion 33 configured integrally with the insertion portion 31. The insertion portion 31 is a flexible tubular member and can be inserted into the inserted body 2 from its insertion opening 21. The inside of the inserted body 2 is filled with a predetermined material such as air, physiological saline, or a chemical solution. As shown in FIG. 2B, an imaging opening 34 is provided at the insertion-direction end of the insertion portion 31 (hereinafter, the insertion portion distal end), and, as shown in FIG. 1A, an imaging unit 32 is built into the insertion portion 31 near the distal end. Light entering the imaging opening 34 is received by the imaging unit 32, which performs imaging. The image captured by the imaging unit 32 is displayed and output by the insertion support apparatus 6 according to the first embodiment.
 The imaging unit 32 may, of course, instead be disposed in the operation portion 33 rather than near the distal end inside the insertion portion 31, with a light guide or the like connecting it to the imaging opening 34 so that light entering the imaging opening 34 is guided to the imaging unit 32 for imaging.
 The insertion portion 31 also has a bending portion 35 near its distal end; although not shown, the bending portion 35 is connected by wires to an operation lever provided on the operation portion 33. Moving the operation lever pulls the wires, making bending operation of the bending portion 35 possible.
 Although not shown, the insertion portion 31 also contains an illumination optical fiber, which guides light from an illumination light source (not shown) arranged in the operation portion 33 and emits it from a light supply portion 36 at the insertion portion distal end as illumination light for imaging. Further, a treatment opening 37 is provided at the insertion portion distal end, and a treatment instrument passed through the insertion portion 31 from the operation portion 33 can extend out of the insertion portion 31 through the treatment opening 37.
 Hereinafter, the configuration of each unit will be described in detail.
 The insertion/rotation detection unit 4 is installed near the insertion opening 21 of the inserted body 2, detects the insertion amount and rotation amount of the insertion portion 31, and outputs them to the insertion support apparatus 6 as part of the displacement amount information of the insertion portion 31. Specifically, as shown in FIG. 3A, the insertion/rotation detection unit 4 comprises a light source 41, a light projection lens 42, a light receiving lens 43, an optical pattern detection unit 44, and a displacement amount calculation unit 45.
 Light emitted from the light source 41 is projected onto the insertion portion 31 through the light projection lens 42, and the light reflected by the insertion portion 31 is received by the optical pattern detection unit 44 through the light receiving lens 43. The optical pattern detection unit 44 continuously detects images of the surface of the insertion portion 31, which form an optical pattern, at detection times t0, t1, t2, ..., tn, ....
 As shown in FIG. 3B, the displacement amount calculation unit 45 compares, on the image data, an arbitrarily selected reference pattern α present in the image (optical pattern PTn) captured by the optical pattern detection unit 44 at an arbitrary time tn with the matching optical pattern α' present in part of the image (optical pattern PTn+1) captured at a time tn+1 after an arbitrary interval, and calculates the displacement on the image in each of the x-axis and y-axis directions. Here, as shown in FIG. 3B, the optical pattern detection unit 44 is positioned so that its x-axis coincides with the axial direction of the insertion portion 31. The x-direction displacement Δxf calculated by the displacement amount calculation unit 45 is therefore proportional to the insertion amount of the insertion portion 31, and the y-direction displacement Δyf is proportional to its rotation amount. The image displacements (insertion amount and rotation amount) calculated in this way by the displacement amount calculation unit 45 are output to the insertion support apparatus 6 as displacement amount information. Since the sign of each displacement indicates the direction of insertion or rotation of the insertion portion 31, the displacement amount information also includes information on the insertion direction and the rotation direction.
 The acquisition of the displacement amount information is not limited to the above method; it may also be obtained based on image data.
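The proportionality described above (Δxf to insertion, Δyf to rotation) can be sketched as follows; the calibration factors, per-frame displacement values, and function name are invented for illustration and are not values from the disclosure.

```python
def displacement_to_motion(dx_px: float, dy_px: float,
                           mm_per_px: float, deg_per_px: float):
    """Convert image-plane displacement of the surface pattern into
    insertion (mm) and rotation (deg); the signs carry the direction."""
    return dx_px * mm_per_px, dy_px * deg_per_px

# Accumulate over successive detection times t0, t1, ...
insertion_mm = rotation_deg = 0.0
for dx, dy in [(12.0, 0.0), (8.0, -4.0)]:   # example per-frame displacements
    d_ins, d_rot = displacement_to_motion(dx, dy, mm_per_px=0.05, deg_per_px=0.2)
    insertion_mm += d_ins
    rotation_deg += d_rot
print(insertion_mm, rotation_deg)
```

Accumulating per-frame displacements like this is how an optical-mouse-style sensor tracks total insertion and rotation from many small pattern shifts.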
 As shown in FIG. 4, the fiber shape sensor 5 is installed inside the insertion portion 31. The fiber shape sensor 5 is composed of a plurality of optical fibers, each optical fiber having one bend detection portion 51. The bend detection portion 51 is formed by removing the cladding of the optical fiber to expose the core and applying a light absorbing member. As shown in FIGS. 5A to 5C, the amount of light absorbed at the bend detection portion 51 changes according to the bending of the bending portion 35, so the amount of light guided in the optical fiber 52, that is, the light transmission amount, changes. In this embodiment the fiber shape sensor 5 is composed of a plurality of optical fibers, but it is not limited to this and may be a single optical fiber.
 In the fiber shape sensor 5 thus configured, two optical fibers 52 are arranged so that their bend detection portions 51 form a pair facing the X-axis and Y-axis directions, respectively, in order to detect the X-axis and Y-axis bending shown in FIG. 6; each such pair detects the bend amount at one location. A plurality of optical fibers 52 are arranged so that the paired bend detection portions 51 line up along the longitudinal (insertion) direction of the insertion portion 31. Light from a light source (not shown) is guided through each optical fiber 52, and a light receiving unit (not shown) detects the light transmission amount, which changes with the bend amount of each optical fiber 52. The detected light transmission amounts are output to the insertion support apparatus 6 as part of the displacement amount information of the insertion portion 31.
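The bend detection principle can be sketched under a simple assumed model: transmitted light falls off as the bend grows, so a linear map recovers a bend angle per detector, with paired X/Y detectors giving the two bend components at one location. The linear model and all constants below are illustrative assumptions, not calibration values from the disclosure.

```python
def bend_angle_from_transmission(transmission: float,
                                 flat_transmission: float,
                                 gain_deg_per_unit: float) -> float:
    """Map the measured light transmission of one bend detection portion
    to a bend angle, assuming transmission falls linearly with the bend."""
    return (flat_transmission - transmission) * gain_deg_per_unit

# Paired X/Y detectors at one point along the insertion portion:
# the X fiber has lost 10% of its straight-fiber transmission,
# the Y fiber is unchanged, so the bend lies in the X direction.
x_angle = bend_angle_from_transmission(0.90, 1.00, 300.0)
y_angle = bend_angle_from_transmission(1.00, 1.00, 300.0)
print(x_angle, y_angle)
```

Repeating this per detector pair along the fiber yields the bend profile of the whole insertion portion.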
 It is desirable to provide bend detection portions 51 not only in the bending portion 35 of the insertion portion 31 but also on the operation portion side of it, so that the bending state of portions of the insertion portion 31 other than the bending portion 35, which flex freely according to the internal structure of the inserted body 2 owing to the flexibility of the insertion portion 31, can also be detected.
 As shown in FIG. 6, an illumination optical fiber 38 and imaging portion wiring 39 are also provided inside the insertion portion 31. The illumination optical fiber 38 guides light from an illumination light source (not shown) disposed in the operation portion 33 and emits it as illumination light from the distal end of the insertion portion, so that the imaging portion 32 can image the interior of the inserted body 2, which is a dark place.
 As shown in FIG. 1B, the insertion support device 6 according to the present embodiment comprises an insertion state acquisition unit 61, an inserted body shape acquisition unit 62, a positional relationship calculation unit 63, an output unit 64, a storage unit 65, and a control unit 66.
 Based on the displacement amount information output from the displacement amount calculation unit 45 of the insertion/rotation detection unit 4, the insertion state acquisition unit 61 acquires insertion state information of at least part of the insertion portion 31 inside the inserted body 2, for example the position and orientation of a certain point of the insertion portion 31. In addition, based on the light transmission amounts detected by the fiber shape sensor 5 as displacement amount information, which vary with the bending amount of each optical fiber 52, it further acquires, as insertion state information inside the inserted body 2, the position and orientation of each curvature detection portion 51 of the insertion portion 31.
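To illustrate how per-detector bend values can yield the position and orientation of points along the insertion portion, here is a hedged two-dimensional sketch; the real device works in three dimensions, and `tip_pose_2d` with its piecewise-rigid-segment assumption is illustrative only:

```python
import math

def tip_pose_2d(segment_lengths, bend_angles_deg):
    # Chain rigid segments in a plane: each detected bend adds to the
    # running heading, and the tip position is the vector sum of segments.
    x = y = heading = 0.0
    for length, angle in zip(segment_lengths, bend_angles_deg):
        heading += math.radians(angle)
        x += length * math.cos(heading)
        y += length * math.sin(heading)
    return (x, y), heading
```

Accumulating poses at every detection portion in the same way gives the whole-shape information used later by the positional relationship calculation unit 63.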
 The inserted body shape acquisition unit 62 acquires shape information of the inserted body 2 (inserted body shape information). This inserted body shape information is constructed from data obtained from the outside or the inside of the inserted body 2 before the insertion portion 31 is inserted into it.
 Inserted body shape information based on external data is constructed using, for example, an apparatus capable of detection through the inserted body 2, such as a CT diagnostic apparatus, an ultrasonic diagnostic apparatus, or an X-ray apparatus.
 Inserted body shape information based on internal data is constructed, for example, from trajectory data obtained by moving the insertion portion 31 through the space inside the inserted body 2, or by connecting position information recorded when the distal end of the insertion portion touches the inserted body 2. Using the position information at the moments of contact between the insertion portion tip and the inserted body 2 also makes it possible to detect the size of the space, so more accurate inserted body shape information can be acquired. Alternatively, when the inserted body 2 is a human organ, the shape can be estimated from the patient's physique; when the inserted body 2 is a structure, the shape can be entered from drawings.
 The inserted body shape acquisition unit 62 may acquire the inserted body shape information directly from a connected apparatus that constructs it, such as a CT diagnostic apparatus. Alternatively, the inserted body shape information output from such an apparatus may be stored once on a storage medium and then read out, or downloaded over a network. Furthermore, the inserted body shape acquisition unit 62 is not limited to such an interface or data reader and may itself be a device that constructs the inserted body shape information.
 Based on the insertion state information acquired by the insertion state acquisition unit 61 and the inserted body shape information acquired by the inserted body shape acquisition unit 62, the positional relationship calculation unit 63 calculates the positional relationship of the insertion portion 31 with respect to the inserted body 2, that is, which part of the interior of the inserted body 2 the insertion portion 31 as a whole and its distal end face. Specifically, the positional relationship calculation unit 63 first calculates the shape information of the insertion portion 31 from the position and orientation of each curvature detection portion 51 acquired by the insertion state acquisition unit 61 as insertion state information. Then, from this insertion portion shape information, from the insertion state information (for example the position and orientation of a certain point of the insertion portion 31), and from the inserted body shape information acquired by the inserted body shape acquisition unit 62, it calculates the positional relationship of the insertion portion 31 with respect to the inserted body 2, that is, the position of the insertion portion tip and the direction (axial direction) it faces. Based on these calculation results and the inserted body shape information, it then calculates the intersection between the direction in which the insertion portion tip faces and the inserted body 2. That is, as shown in FIG. 4, the positional relationship calculation unit 63 obtains, as the imaging position P, the intersection 72 between the straight line along the direction the insertion portion tip faces (imaging direction 71) and the shape of the inserted body 2, in other words the center of the visual field (imaging region 73).
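As a sketch of this intersection computation, if the inserted body shape were approximated by a sphere (a simplification not stated in the patent; a bladder-like cavity is merely a convenient model), the imaging position P could be found as the first forward intersection of the tip's axial ray with that surface:

```python
import math

def imaging_position(tip, direction, center, radius):
    # Solve |tip + t*d - center|^2 = radius^2 for the forward hit (t > 0).
    ox, oy, oz = (tip[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    norm = math.sqrt(dx * dx + dy * dy + dz * dz)
    dx, dy, dz = dx / norm, dy / norm, dz / norm
    b = ox * dx + oy * dy + oz * dz
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - c
    if disc < 0:
        return None  # ray misses the modeled surface
    t = -b + math.sqrt(disc)  # tip is inside the cavity, so take the far root
    if t < 0:
        return None
    return (tip[0] + t * dx, tip[1] + t * dy, tip[2] + t * dz)
```

A real implementation would intersect the ray with the mesh or voxel model delivered by the inserted body shape acquisition unit 62 rather than with an analytic sphere.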
 Since the point of interest under observation is usually captured at the center of the visual field, the center is often more important than the periphery. Although an example has been shown here in which the intersection is obtained as the imaging position P, the visual field (imaging region 73), that is, the region of the inserted body 2 being imaged by the imaging portion 32, may instead be calculated as the imaging position P from the distance between the insertion portion tip and the imaging surface of the inserted body 2, based on the inserted body shape information. In this case, using parameters such as the refractive index of the material interposed between the insertion portion 31 and the inserted body 2 and the angle-of-view information of the imaging portion 32 (such as the focal length of the lens) allows the imaging region 73 to be obtained more accurately. By obtaining the imaging region 73 as the imaging position P in this way, the range imaged by the imaging portion 32 can be grasped. A partial region 74 or a point within the visual field (imaging region 73) may also be calculated as the imaging position P. For example, when the imaging region 73 cannot be detected accurately, calculating a smaller region that allows for the error prevents a range that has not actually been imaged from being erroneously recorded as imaged; that is, observation omissions can be prevented.
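For a sense of how the imaging region, or a deliberately smaller sub-region 74, might be sized, a hedged sketch follows; the angle-of-view parameter and the `margin` factor used to under-estimate the region are assumptions for illustration, not values from the patent:

```python
import math

def imaging_region_radius(distance, view_angle_deg, margin=1.0):
    # Radius of the imaged patch on a wall 'distance' away, for a cone of
    # full angle 'view_angle_deg'. margin < 1 shrinks the patch so that
    # uncertain border areas are never marked as already observed.
    return margin * distance * math.tan(math.radians(view_angle_deg) / 2.0)
```

With `margin=0.5`, for instance, only the inner half of the nominal field of view would be recorded as observed, which matches the error-tolerant sub-region idea above.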
 The positional relationship calculation unit 63 outputs, as calculation results to the control unit 66, the obtained shape information of the insertion portion 31; the obtained positional relationship of the insertion portion 31 with respect to the inserted body 2, that is, the position of the insertion portion tip and the direction (axial direction) it faces; the inserted body shape information acquired by the inserted body shape acquisition unit 62; and imaging position information indicating the obtained imaging position P (for example the intersection 72).
 Under the control of the control unit 66, the output unit 64 displays the calculation results of the positional relationship calculation unit 63 in a form that lets the operator judge which part of the interior of the inserted body 2 the insertion portion tip faces. The output unit 64 also functions as a display unit capable of displaying the captured image obtained from the imaging portion 32.
 Under the control of the control unit 66, the storage unit 65 stores the calculation results of the positional relationship calculation unit 63 as well as the captured images obtained from the imaging portion 32.
 The control unit 66 controls the output state of the display, on the output unit 64, of the calculation results of the positional relationship calculation unit 63, as well as their storage in the storage unit 65. That is, although the calculation results from the positional relationship calculation unit 63 are output to the control unit 66 sequentially, the control unit 66 does not have the output unit 64 display all of them. For at least part of them, namely the imaging position information indicating the imaging position P (for example the intersection 72), it controls whether display output occurs, that is, when display output starts and ends, and it controls the kind of display output of the calculation results, that is, how they are displayed. Similarly, it controls the storage unit 65 so that at least a part, rather than all, of the calculation results output from the positional relationship calculation unit 63 is stored. Not only the storage of the calculation results of the positional relationship calculation unit 63 but also that of the captured images from the imaging portion 32 can be controlled in the same way.
 For example, the control unit 66 can include a determination unit 67 that determines whether the insertion portion tip has reached a predetermined position. Here, the predetermined position refers to a location inside the inserted body 2 that requires observation or treatment. The determination unit 67 makes this determination based on part of the calculation results output from the positional relationship calculation unit 63, namely the shape information of the insertion portion 31, the position of the insertion portion tip and the direction (axial direction) it faces, and the inserted body shape information, or by pattern matching or the like on the captured image obtained from the imaging portion 32. The control unit 66 then controls the output unit 64 and the storage unit 65 according to the determination result of the determination unit 67.
 The control unit 66 also has a function of sending information stored in the storage unit 65 (calculation results of the positional relationship calculation unit 63, captured images, and so on) to the output unit 64 for display.
 The operation of the insertion support device 6 configured as described above will now be described with reference to FIG. 7. Here, for simplicity of explanation, the case is described in which the control unit 66 controls the output state of the display of the calculation result of the positional relationship calculation unit 63, that is, of the imaging position P (for example the intersection 72), on the output unit 64, and its storage in the storage unit 65, only by controlling when display output and storage begin.
 First, the inserted body shape acquisition unit 62 acquires the inserted body shape information (step S1). The insertion state acquisition unit 61 then acquires the insertion state information of the insertion portion 31 inside the inserted body 2 (step S2). Next, the positional relationship calculation unit 63 calculates the shape of the insertion portion 31 from the position and orientation of each curvature detection portion 51 acquired as insertion state information (step S3). After that, based on this calculated shape of the insertion portion 31, on the position and orientation of a certain point of the insertion portion 31 acquired as insertion state information, and on the acquired inserted body shape information, it calculates the positional relationship of the insertion portion 31 with respect to the inserted body 2, that is, the position of the insertion portion tip and the direction (axial direction) it faces (step S4).
 The determination unit 67 of the control unit 66 then determines whether the insertion portion tip has reached the predetermined position (step S5). Taking the bladder as an example of the inserted body 2, this is, for example, a determination of whether the tip of the insertion portion 31 inserted through the urethral opening serving as the insertion port 21 has reached the entrance 22 of the bladder shown in FIG. 8A, that is, the junction between the bladder and the urethra. Alternatively, it can be determined by whether the insertion portion tip has come within a predetermined distance of the bladder wall, for example approximately 2 cm. Whether the tip has reached the entrance 22 or come within the predetermined distance of the bladder wall can be determined from the calculation results output from the positional relationship calculation unit 63 (the position of the insertion portion tip and the inserted body shape information) or from the captured image obtained from the imaging portion 32. It is also possible to determine this from the captured image obtained from the imaging portion 32 by whether the blood vessel spacing on the bladder wall has become, for example, approximately 100 pixels. Furthermore, since the imaging portion 32 cannot obtain a captured image suitable for observation when it is too close to the subject, the predetermined position can have a range: in terms of distance, from approximately 2 cm down to approximately 5 mm from the bladder wall; in terms of pixels, a bladder-wall blood vessel spacing of approximately 100 to approximately 500 pixels.
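A minimal sketch of such a ranged check, using the example numbers from the text (roughly 2 cm down to 5 mm of wall distance, or roughly 100 to 500 pixels of vessel spacing); the function name and argument forms are hypothetical:

```python
def tip_in_observation_range(wall_distance_cm=None, vessel_spacing_px=None):
    # True if either available cue places the tip in the usable window:
    # close enough to observe but not too close to focus.
    if wall_distance_cm is not None and 0.5 <= wall_distance_cm <= 2.0:
        return True
    if vessel_spacing_px is not None and 100 <= vessel_spacing_px <= 500:
        return True
    return False
```

Either cue alone suffices here; an implementation might instead require both, or weight them, depending on which sensors are trusted.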
 If the insertion portion tip has not yet reached the predetermined position, the control unit 66 causes the output unit 64 to display the insertion state of the insertion portion 31 relative to the shape of the inserted body 2, based on part of the calculation results output from the positional relationship calculation unit 63, namely the shape information of the insertion portion 31, the position of the insertion portion tip and the direction (axial direction) it faces, and the inserted body shape information (step S6), after which the operation is repeated from step S2.
 When the insertion portion tip has reached the predetermined position, the control unit 66 causes the output unit 64 to display the imaging position information indicating the imaging position P (for example the intersection 72), which is part of the calculation results, and causes the storage unit 65 to store it (step S7). Thereafter, the operation is repeated from step S2.
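The loop over steps S2 to S7 can be sketched as follows; all five callables are hypothetical stand-ins for the units of FIG. 1B, since the patent describes behavior rather than an API:

```python
def support_loop(acquire_state, compute_relation, tip_at_position,
                 show_insertion_state, show_and_store_position, iterations):
    # One pass per iteration over steps S2-S7 (S1, shape acquisition,
    # happens once before this loop starts).
    for _ in range(iterations):
        state = acquire_state()                # S2: insertion state info
        relation = compute_relation(state)     # S3-S4: shape + positional relation
        if tip_at_position(relation):          # S5: reached predetermined position?
            show_and_store_position(relation)  # S7: display and store imaging position
        else:
            show_insertion_state(relation)     # S6: display insertion state only
```

A real system would loop until shutdown rather than for a fixed `iterations` count; the bound is only to keep the sketch terminating.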
 Through this operation, the output unit 64 produces a display like that shown in FIG. 1C. That is, in the present embodiment, the output unit 64 displays the captured image from the imaging portion 32 (captured image display 641) and the insertion state of the insertion portion 31 (insertion state display 642). The insertion state display 642 is output as two-dimensional diagrams 6421 and 6422 obtained by cutting open the inserted body 2, given by the inserted body shape information, at predetermined locations. Furthermore, on these two-dimensional diagrams 6421 and 6422, information related to the imaging position P, that is, the intersection 72 between the inserted body 2 and the axial direction of the insertion portion tip (imaging direction 71), is displayed. Here, the two-dimensional diagrams 6421 and 6422 are generated by the control unit 66 based on the shape information of the insertion portion 31. The first two-dimensional diagram 6421, as shown in FIG. 9A, shows the shape of the inserted body 2 cut along the Y-Z plane in the coordinates of the inserted body 2 and opened out to the left and right; the second two-dimensional diagram 6422, a view from a viewpoint different from that of the first two-dimensional diagram 6421, shows the shape of the inserted body 2 cut along the X-Z plane in the coordinates of the inserted body 2 and opened out up and down, as shown in FIG. 9B. The control unit 66 then causes a current position display 643 to be shown on these two-dimensional diagrams 6421 and 6422 of the output unit 64 as the information related to the imaging position P.
 However, the current position display 643 is not always shown. As shown in FIG. 10, when the insertion system starts operating at time T1, the captured image display 641 begins immediately, but the current position display 643 on the two-dimensional diagrams 6421 and 6422 is not yet output. Only when the determination unit 67 of the control unit 66 determines that the insertion portion tip has reached the predetermined position (time T2) does the display output of the current position display 643 on the two-dimensional diagrams 6421 and 6422 begin. The display of the captured image display 641 and of the current position display 643 on the two-dimensional diagrams 6421 and 6422 then continues until the insertion system is shut down at time T4. Alternatively, for the current position display 643 on the two-dimensional diagrams 6421 and 6422, the determination unit 67 of the control unit 66 may also determine that the insertion portion tip has withdrawn back past the predetermined position, and the display output may be terminated at that point (time T3).
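The gating of the current position display 643 between times T2 and T3 amounts to a small state machine; a hedged sketch follows (the class and flag names are invented, not from the patent):

```python
class CurrentPositionDisplayGate:
    # Off until the tip first reaches the predetermined position (T2);
    # optionally off again once the tip withdraws past it (T3).
    def __init__(self, end_on_withdraw=False):
        self.visible = False
        self.end_on_withdraw = end_on_withdraw

    def update(self, tip_at_position):
        if tip_at_position:
            self.visible = True       # T2: start the current position display
        elif self.end_on_withdraw:
            self.visible = False      # T3: stop when the tip has withdrawn
        return self.visible
```

With `end_on_withdraw=False` the display, once started, persists until system shutdown (T4), matching the default behavior described above.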
 The current position display 643 may be the intersection 72 itself between the inserted body 2 and the axial direction of the insertion portion tip, but as described above it is easier to see if it has some extent, such as the imaging region 73 or a partial region 74 of the imaging region centered on the intersection 72.
 In addition to the current position display 643 as information related to the current imaging position P, the output unit 64 can display an insertion portion shape outline display 644 showing the shape of the insertion portion 31. That is, since the shape of the insertion portion 31 and the position and orientation of the insertion portion tip are calculated by the positional relationship calculation unit 63 as described above, the insertion state of the insertion portion 31 in the inserted body 2 is known, and the insertion portion shape outline display 644 can be produced.
 Furthermore, the output unit 64 can display a position trajectory display 645 using the calculation results stored in the storage unit 65. These stored calculation results, too, can be accumulated, based on the determination of the determination unit 67, from the time the insertion portion tip reaches the predetermined position (time T2). Like the current position display 643, the position trajectory display 645 is given some extent. In this case, so that the current position display 643 and the position trajectory display 645 can be distinguished, it is desirable to use display information that identifies them in some way, for example by differing colors, densities, or patterns, or by making the position trajectory display 645 blink. The form of this identification display may be made controllable by the control unit 66 in response to a selection made by the operator with a switch. The switch can be arranged on the operation portion 33 of the flexible endoscope apparatus serving as the insertion tool 3, on a foot switch, at a location where the operator's brain waves or muscle movements can be detected, on the housing of the insertion support device 6, or on a treatment instrument passed through the insertion portion 31 from the operation portion 33 of the flexible endoscope apparatus.
 The determination unit 67 may also determine whether the insertion portion tip has reached the predetermined position in response to the operator's operation of such a switch, instead of, or in addition to, basing the determination on the calculation results from the positional relationship calculation unit 63 or the captured image from the imaging portion 32. That is, the operator observes the captured image shown as the captured image display 641 and operates the switch upon judging that the insertion portion tip has reached the predetermined position, whereupon the determination unit 67 determines that it has.
 Furthermore, the operator may be allowed to set the predetermined position arbitrarily by operating this switch, so that the predetermined position can be changed according to the type and size of the inserted body 2. For example, regarding the entrance 22: if the inserted body 2 is a bladder, it is the junction between the urethra and the bladder as shown in FIG. 8A; if the inserted body 2 is a stomach, it is the junction of the stomach as shown in FIG. 8B; if the inserted body 2 is the large intestine, consisting of the cecum, ascending colon, transverse colon, descending colon, sigmoid colon, and rectum, it is the end of the rectum as shown in FIG. 8C; if the inserted body 2 is a human body, it is the mouth as shown in FIG. 8D; and if it is piping, it is an opening or a branch point of the piping.
 A function may also be added that lets the operator mark (marking 646), by switch operation, locations that have already been observed, or locations that need to be observed later or again (locations that differ from the surrounding image, excluding known patterns; characteristic cancer patterns; bleeding sites; and so on). For example, in response to the switch operation, the control unit 66 stores the information of the location in the storage unit 65 and controls the output unit 64 to display a marking 646 at that location. In this case, as shown in FIG. 1C, the marking 646 is displayed on the two-dimensional diagrams 6421 and 6422, which show the shape of the inserted body 2 cut open along different planes in the coordinates of the inserted body 2. Thus even a position that is hard to recognize in one of the two-dimensional diagrams (6421 in this example) can be displayed so as to be easy to recognize in the other (6422 in this example), making it easy for the operator to judge which part of the inserted body 2 has been marked.
 The location of the marking 646 may be fixed to any of the intersection 72, the imaging region 73, or a partial region 74 of the imaging region, or may be set arbitrarily by the operator. Making such marking 646 possible allows confirmation of locations of the inserted body 2 that have been observed, locations that require re-observation, and locations that require some treatment (removal, sampling, repair, and so on). Furthermore, when re-observing the inserted body 2 on a different occasion, the markings can be used to identify the previous locations requiring re-observation and to bring the insertion portion tip to such a location quickly. As for storage, if not only the information on the marked locations but also the insertion direction of the insertion portion 31 and the like are stored together, the stored data can be used when checking after observation or when observing in the same state next time.
 If type information identifying whether each location is an observed location, a location requiring re-observation, or a location requiring treatment, as well as symptom/state information such as cancer, polyp, or crack, can also be set and stored, the display form can vary its color and shape according to that type information and symptom/state information so that the operator can easily grasp the meaning of each marking 646.
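A minimal sketch of how markings with type and symptom information might be recorded and styled; the `Marking` record and the color palette are illustrative assumptions only, not part of the patent:

```python
from dataclasses import dataclass

@dataclass
class Marking:
    position: tuple        # point or region on the inserted-body surface
    kind: str              # "observed" | "re-observe" | "treat"
    symptom: str = ""      # e.g. "cancer", "polyp", "crack"

def marking_style(mark):
    # Hypothetical palette: color by kind so the operator can read
    # the meaning of each marking at a glance.
    palette = {"observed": "green", "re-observe": "yellow", "treat": "red"}
    return palette.get(mark.kind, "white")
```

The symptom field could drive a secondary cue (shape or blinking) in the same way the kind drives color here.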
 Regarding the marking 646, a pointing device or a visual recognition device may be connected to the insertion support device 6 so that the operator can designate an arbitrary range or point on the captured image display 641 or on the two-dimensional diagrams 6421 and 6422 shown on the output unit 64.
 In the two-dimensional views 6421 and 6422 shown in FIG. 1C, a region division display 647 is performed: the body is divided into two or more regions according to the accepted conventions or rules of an academic society, regions in standard use, or a predetermined division method. This makes it easy for the operator to judge which part of the inserted body 2 is being viewed.
 As shown in FIG. 11, a bend detection unit display 648 indicating the positions of the bend detection units 51 may be superimposed on the insertion portion shape schematic display 644.
 The insertion/rotation detection unit 4 and the fiber shape sensor 5 described above optically detect the shape of the insertion portion 31 relative to the inserted body 2 and the position and orientation of the imaging opening 34, but other detection methods may be used. For example, a coil may be provided inside the insertion portion 31 at least near the imaging opening 34; by passing a current through the coil to generate a magnetic field that is received externally, or by receiving with the coil a magnetic field distribution generated externally, the position and orientation of the coil, and hence of the imaging opening 34, can be detected. If a plurality of coils are arranged along the longitudinal direction of the insertion portion 31, the shape of the insertion portion 31 can also be detected.
 As described above, according to the present embodiment, the insertion state acquisition unit 61 acquires insertion state information of (at least part of) the insertion portion 31 inserted into the inserted body 2 (for example, the position and orientation of a point on the insertion portion 31, such as its distal end, and the position and orientation of each bend detection unit 51 of the insertion portion 31), and the inserted-body shape acquisition unit 62 acquires shape information of the inserted body 2. The insertion state information and the inserted-body shape information are input to the positional relationship calculation unit 63, which calculates the positional relationship of the insertion portion 31 with respect to the inserted body 2 (the position and orientation of the insertion portion 31 as a whole and of its distal end), and the output unit 64 displays the calculation result. Information for judging which part of the inserted body is being imaged can therefore be provided. That is, when the inside of an inserted body that cannot be seen directly is observed with the flexible endoscope apparatus, i.e. the insertion tool 3 having the insertion portion 31, it is possible to understand at which location of the inserted body 2 and from which direction the captured image displayed on the output unit 64 is being observed. It also becomes possible to easily identify observed and unobserved locations inside the inserted body 2, preventing oversights.
 When the output unit 64 displays the calculation result of the positional relationship calculation unit 63, that is, the imaging position P, without the control unit 66 the result would be displayed continuously, including when no output is needed. In the present embodiment, the control unit 66 controls the display so that it appears only when necessary; unneeded calculation results are not displayed, and the screen remains easy to understand (easy to see, hard to misread, quick to judge). Moreover, if the operator of the flexible endoscope apparatus, i.e. the insertion tool 3, controls the control unit 66, only the calculation results the operator needs (current position, already-observed part, affected part, part requiring excision, and so on) are reliably displayed.
 The system further has a storage unit 65 that stores at least part of the calculation results of the positional relationship calculation unit 63. By storing in the storage unit 65 the observation range of the inserted body 2 displayed on the output unit 64 (the positional relationship of the insertion portion 31 with respect to the inserted body 2), locations on the inner wall of the inserted body 2 that have been observed or treated with the distal end of the insertion portion can be distinguished from locations that have not. For example, if the observation or treatment range currently shown as the current position display 643 on the output unit 64 is left as the position trajectory display 645 rather than being erased when moving to another location, the position trajectory display 645 makes it easy to distinguish observed or treated locations from unobserved or untreated ones. With this configuration, the entire inserted body 2 can be observed or treated without oversight.
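 The bookkeeping behind such an observed/unobserved distinction can be sketched as follows: each computed imaging position is appended to a trajectory instead of being erased, and the set of surface regions not yet covered falls out by set difference. The grid of inner-wall patches is a simplification for illustration; all names are assumptions.

```python
# Illustrative sketch: keeping the observation trajectory (display 645)
# and deriving unobserved regions from it. A real system would work on
# the actual inner-wall geometry, not a toy grid.

all_regions = {(r, c) for r in range(4) for c in range(4)}  # inner-wall patches
trajectory = []   # history of imaging positions, never erased
observed = set()

def record_position(region):
    """Store the current imaging position and mark its region as observed."""
    trajectory.append(region)
    observed.add(region)

for region in [(0, 0), (0, 1), (1, 1), (2, 1)]:
    record_position(region)

unobserved = all_regions - observed   # candidate locations for oversight
```

 Rendering `observed` and `unobserved` in different styles gives exactly the at-a-glance distinction the text describes.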
 The system also has the imaging unit 32 with the imaging opening 34 provided in the insertion portion 31. By displaying the captured image obtained from the imaging unit 32 on the output unit 64, an image of the location inside the inserted body 2 that the distal end of the insertion portion is observing (the calculation result of the positional relationship calculation unit 63) can be obtained directly. The state of the observed location can therefore be obtained, and by observing (reaching) the same location again at a different time, changes can be detected concretely.
 The control unit 66 also has a judgment unit 67 that judges from the calculation result of the positional relationship calculation unit 63 whether the distal end of the insertion portion has reached a predetermined position. When the judgment unit 67 judges that it has, the control unit 66 controls the output state of the output unit 64: it starts or ends the continuous display of the current position display 643 indicating the imaging position P, which is the calculation result of the positional relationship calculation unit 63, and switches the display output content and method (start/end, type, size, etc. of the position trajectory display 645 and the marking 646). Here, the predetermined position is the entrance/exit 22 of the observation site of the inserted body 2 (a predetermined organ in the case of organs) or another position set in advance on the shape information (the insertion opening of the inserted body 2, the entrance of the observation region, etc.), a known branch location, or a location where the captured image shows a specific pattern (cancer, polyp, crack, etc.). The size of the position trajectory display 645 and the marking 646 may be varied depending on the distance between the distal end of the insertion portion and the inserted body 2, or may be sized (adjustably) so as to remain identifiable when the intersection 72 of the imaging direction 71 with the inserted body 2 is displayed as the imaging position P. When the imaging unit 32 is present, it is preferable to display an area matched to the size of the affected part or crack seen in the captured image.
 Further, by configuring the control unit 66 to control the output state of the output unit 64 as described above automatically when the distal end of the insertion portion reaches a predetermined position of the inserted body 2, only the necessary locations and states set in advance are displayed, and the operator of the insertion tool 3 can concentrate on the operation.
 Here, by having the control unit 66 control the predetermined position in response to the operator's switch operation, display output can be produced according to the operator's intention, and only the information the operator needs can be output reliably. Multiple results judged by the operator (including ones that are difficult to judge automatically by image recognition such as pattern matching) can also be distinguished in the output. The number of switches may be one or several (for starting/ending the current position display 643, for starting/ending the position trajectory display 645, for the marking 646, for further markings, for deletion). With a single switch, multiple output states can be switched by the number or pattern of on/off operations, by cycling through functions sequentially, and so on.
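 The single-switch case, where successive presses cycle through the output functions, can be sketched as below. The state names and class layout are illustrative assumptions, not the disclosed control logic.

```python
# Hedged sketch: cycling several output states from one switch, advancing
# to the next function on each press, as the text suggests for the
# single-switch configuration. State names are illustrative.

STATES = ["current-position", "trajectory", "marking", "delete"]

class SwitchController:
    def __init__(self):
        self.index = 0   # start on the current-position display

    def press(self):
        """One switch press advances to the next output state."""
        self.index = (self.index + 1) % len(STATES)
        return STATES[self.index]

sw = SwitchController()
first = sw.press()    # next state after current-position
second = sw.press()
```

 A multi-switch configuration would instead map each switch directly to one of these functions.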
 The control unit 66 also allows the operator's setting operation to define the state of the observation (/work) target site of the inserted body 2, such as a cancer, polyp, or crack, and a state display range corresponding to the size or extent of that target site. Since the output state and range can thus be set according to the operator's intention, criteria such as those for re-observation can be set freely by the operator or per the standards of the operator's field. In other words, the system can be adapted to each field, such as the medical or industrial field. When the output state is controlled automatically, it suffices to set the judgment criteria for that control in advance.
 Further, by having the control unit 66 store captured images in the storage unit 65 in association with the calculation results of the positional relationship calculation unit 63, the observation state at the time of storage can be compared with the current observation state for a given location to obtain their difference or change. That is, since the captured image of the observation range and the calculation result of the positional relationship calculation unit 63, i.e. the imaging position P, correspond to each other, information on which part was imaged can be obtained, and the progression of the same affected area can be matched accurately across multiple examinations.
 Likewise, by having the control unit 66 store captured images in the storage unit 65 in association with at least the calculation results of the positional relationship calculation unit 63 and symptom information, the difference or change between the stored observation state and the current observation state of a given location can be obtained.
 The control unit 66 also has the output unit 64 display two or more different predetermined states in a distinguishable manner, for example with distinguishable markings, as the calculation result of the positional relationship calculation unit 63. Multiple states (current observation range, trajectory, affected part 1 (cancer, inflammation, defect, flaw, corrosion, etc.), affected part 2 (cancer, inflammation, defect, flaw, corrosion, etc.), ..., treated, follow-up required, and so on) can therefore be distinguished at a glance. If these states are stored in the storage unit 65, then when the same place is to be observed again at a different time, only the locations in a given state need to be observed, improving observation efficiency, and the degree of change at those locations can be obtained.
 The classification of the predetermined states can be made selectable by the operator of the insertion portion 31. In that case, since the classification is based on the operator's intention, criteria such as those for re-observation can be set freely by the operator or per the standards of the operator's field. In other words, the system can be adapted to each field, such as the medical or industrial field.
 The control unit 66 also controls the output unit 64 to present a display divided into a plurality of regions. If the display screen is divided into regions reflecting the accepted conventions or rules of an academic society, or regions in standard use, it becomes easier to understand which part is being observed while the observation range is displayed.
 The control unit 66 may further have the output unit 64 display the same region of the inserted body 2 in two or more different region-division displays, for example as the different two-dimensional views 6421 and 6422, so that the observation (/work) target site and/or the positional relationship of the insertion portion 31 to that site can be identified. When the three-dimensional inserted body 2 is rendered in two dimensions, portions lying in the depth direction can become invisible or hard to see; this arrangement prevents that. Overlooked locations and marking positions can therefore be recognized reliably (easy to see, hard to misread, quick to judge).
 The control unit 66 also functions as a related-information recall unit that can display information stored in the storage unit 65 on the output unit 64. From the stored results of the field of view (imaging region 73) output by the output unit 64, the observation results at the time of insertion, that is, which locations of the inserted body 2 have been observed and which have not, can be reviewed even after the insertion. Since the locations of the previous observation are stored, at the next insertion it becomes easier to guide the insertion portion to, or identify, the same locations. Furthermore, from the second observation onward, if information about the same location from previous sessions exists (captured images, symptoms, size, etc.), recalling and displaying it as related information makes it possible to judge on the spot which locations were previously marked, how they differ from last time, and how they have progressed.
 Further, by detecting the insertion amount and rotation amount of the insertion portion 31 with the insertion/rotation detection unit 4 and detecting the shape of the insertion portion 31 with the fiber shape sensor 5, the insertion amount and rotation amount of the insertion portion 31 into the inserted body 2 relative to a reference position, such as the entrance/exit 22 of the inserted body 2, and the shape of the insertion portion 31 from that reference position can be detected. The shape of the insertion portion 31 relative to the inserted body 2 and the insertion state information for the interior of the inserted body 2 (the position and orientation of the imaging opening 34, etc.) can therefore be detected.
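 How an insertion amount measured at the reference position combines with per-segment bend information to give a tip position can be sketched in two dimensions as follows. A real system works in 3D with calibrated sensors; the equal-length segment model and all names here are assumptions for illustration only.

```python
# Illustrative sketch: dead-reckoning the tip position from the insertion
# amount at the entrance (reference position 22) and bend angles reported
# per sensor segment. 2D only; purely a toy model of the idea.

import math

def tip_position(insertion_mm, bend_angles_deg, segment_mm=50.0):
    """Walk from the entrance along equal-length segments, turning by each bend angle."""
    x, y, heading = 0.0, 0.0, 0.0     # start at the entrance, pointing along +x
    remaining = insertion_mm
    for angle in bend_angles_deg:
        heading += math.radians(angle)
        step = min(segment_mm, remaining)
        x += step * math.cos(heading)
        y += step * math.sin(heading)
        remaining -= step
        if remaining <= 0:
            break
    return x, y

# A straight 100 mm insertion over two straight segments ends 100 mm in.
pos = tip_position(100.0, [0.0, 0.0])
```

 Adding the measured rotation amount about the insertion axis would extend the same bookkeeping to the orientation of the imaging opening.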
 Although an example of displaying the imaging position has been described, the position and orientation of the distal end of the insertion portion, and further the history of that position and orientation, may be displayed in addition. This is explained with reference to FIGS. 12A to 12C and FIGS. 13A and 13B. FIGS. 12A to 12C show the content displayed on the output unit 64 when the insertion portion 31 is inserted into an inserted body 2 that has branches.
 The positional relationship calculation unit 63 calculates the position and orientation of the distal end of the insertion portion and outputs them to the control unit 66. Since this calculation is the same as the calculation described above, its explanation is omitted here.
 FIG. 12A shows, on a two-dimensional view 6421A of the shape of the inserted body, an insertion portion shape schematic display 644 showing the current shape of the insertion portion 31, a current position display 643A representing the current position of the distal end of the insertion portion 31 (i.e. the position of the imaging opening 34), and a position trajectory display 645A tracing the trajectory of the position of the imaging opening 34. The position trajectory display 645 of the imaging position is omitted here. The trajectory display of the distal end position makes it clear which positions in the inserted body 2 the distal end position (i.e. the position of the imaging opening 34) has passed through and where it is now.
 Knowing the distal end position (the position of the imaging opening 34) shows how far into the imaging target the insertion portion has reached. Because the current position is known accurately, this information can be used to decide the observation or treatment appropriate to that position and to plan a route from the current position to a target position, without having to guess where the distal end might be. There is then no need to repeat trial and error to reach the target position, or to verify arrival by various means such as examining the observation image. As a result, the chance of reaching the target position in a single attempt along a route close to the shortest one increases, saving time, and the awareness of the positional situation leads to calm, confident operation.
 In addition to the position trajectory display 645A, the history of the distal end position (the position of the imaging opening 34) shown in FIG. 12A, the direction in which the distal end of the insertion portion (the imaging opening 34) faces may also be displayed. In FIG. 12B, the orientation of the distal end of the insertion portion, that is, the direction in which the imaging opening 34 faces, is indicated by arrows 649. Direction information is added with arrows 649 not only at the current position and orientation of the imaging opening 34 but also at several positions along its trajectory. The position trajectory display 645A of the distal end position together with the direction arrows 649 conveys the positional information of the imaging opening 34 at the distal end of the insertion portion: the trajectory of the distal end position and which direction it faced as the position changed.
 Although it depends on the imaging optics, in this example the direction in which the imaging opening 34 at the distal end of the insertion portion faces is the center of the field of view and the center of the captured image.
 Knowing the distal end position and orientation reveals the arrival position and orientation on the imaging target. From the current position and orientation, the viewing direction and the center of the field of view are known. Since the arrival position and orientation, the viewing direction, and the field center are known accurately, this information can be used to decide the observation or treatment appropriate to the current position and orientation, the route from the current position to a target position, and the shape and operation method of the insertion portion 31 while moving, without having to guess the current position and orientation. In particular, knowing the orientation of the distal end makes it possible to plan the operation methods and procedures, such as insertion/withdrawal and bending, needed to reach a target position and orientation.
 The orientation of the distal end of the insertion portion, that is, the direction in which the imaging opening 34 faces, may be shown three-dimensionally so as to represent the posture of the distal end, including its rotation.
 When the rotation of a coordinate system fixed to the distal end of the insertion portion, i.e. a coordinate system whose position and posture do not change relative to the distal end, is defined as the "three-dimensional direction" of the distal end, the trajectory of that three-dimensional direction is as shown in FIG. 12C. In FIG. 12C, to indicate the three-dimensional direction (posture), the orientation of the distal end of the insertion portion, i.e. the direction in which the imaging opening 34 faces, is shown by arrows 649A in three directions (the x, y, and z directions).
 Information on the three-dimensional direction including rotation may also be displayed together with the image captured by the imaging unit 32.
 Knowing the position and three-dimensional direction of the distal end in this way reveals, for example, the imaging direction including the rotation at the imaging position. Beyond imaging, the effect of rotation about the distal end's axis can also be taken into account in treatment and other tasks.
 Regarding the imaging direction including rotation at the imaging position: as shown in FIGS. 13A and 13B, even when the imaging opening 34 faces the same direction, rotating the insertion portion 31 about the imaging direction rotates the imaging opening 34 relative to the imaging target 81. Between FIG. 13A and FIG. 13B the rotation is 180 degrees, so the view is upside down, and in such a case the captured image I captured by the imaging unit 32 is also displayed upside down. The effect of rotation about the imaging direction during such observation or treatment can thus be taken into account, and the orientation of the captured image can be grasped accurately without mistaking top for bottom.
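 The 180-degree effect described for FIGS. 13A and 13B can be checked with a small rotation of image-plane coordinates about the optical axis: a half-turn sends the top of the image to the bottom. This is a generic 2D rotation used purely for illustration, not the system's actual image processing.

```python
# Sketch: rotating the insertion portion about the imaging direction
# rotates the captured image; a 180-degree turn flips it upside down.

import math

def rotate_point(x, y, degrees):
    """Rotate an image-plane point about the optical axis (image center at origin)."""
    t = math.radians(degrees)
    return (round(x * math.cos(t) - y * math.sin(t), 9),
            round(x * math.sin(t) + y * math.cos(t), 9))

top_of_image = (0.0, 1.0)
after_half_turn = rotate_point(*top_of_image, 180)   # lands at the bottom of the image
```

 Tracking this rotation alongside the tip position is what lets the display keep the captured image's top and bottom consistent.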
 The form in which the information is provided is not limited to the display layout of the output unit 64 shown in FIG. 1C.
 For example, as shown in FIGS. 14A to 14F, the captured image display 641 and the insertion state display 642 can be laid out in various sizes and positions, and with various degrees of overlap. The operator may also be allowed to freely select and change, from among these multiple display layouts, the one the operator finds easiest to view.
 The insertion state display 642 is not limited to displays such as the two-dimensional views 6421 and 6422. For example, it may be presented as a three-dimensional view 6423 as shown in FIG. 15.
 [Second Embodiment]
 Next, a second embodiment of the present invention will be described.
 In the insertion system 1 according to the first embodiment, the display unit that displays the captured image is the same device as the output unit 64. The insertion system 1 according to the second embodiment, as shown in FIG. 16A, further includes a display device 9 as a display unit, a device separate from the output unit 64. In this configuration, as shown in FIG. 16B, only the insertion state display 642 appears on the output unit 64, and only the captured image display 641 appears on the display device 9.
 Accordingly, by placing the output unit 64 and the display device 9 close to each other, the calculation result of the positional relationship calculation unit 63 and the captured image obtained by the imaging unit 32 can be viewed almost simultaneously; both can be recognized at the same time, and the calculation result of the positional relationship calculation unit 63 can be recognized without burden.
 [Third Embodiment]
 Next, a third embodiment of the present invention will be described.
 The insertion system 1 according to the third embodiment is an example in which the insertion tool 3 is a rigid endoscope apparatus as shown in FIG. 17. Unlike the flexible endoscope apparatus of the first (and second) embodiment, the insertion portion 31 of such a rigid endoscope apparatus is not flexible, and its shape is therefore constant.
 There is consequently no need for the fiber shape sensor 5 or for arranging a plurality of coils along the longitudinal direction of the insertion portion 31, nor is it necessary for the positional relationship calculation unit 63 to calculate the shape information of the insertion portion 31. It suffices to store the shape information of the insertion portion 31 in the positional relationship calculation unit 63 in advance, or simply to input it.
 The rest is the same as the insertion system according to the first (and second) embodiment.
 The insertion system 1 according to the third embodiment also provides the same effects as the insertion system according to the first (and second) embodiment.
 [Fourth Embodiment]
 Next, a fourth embodiment of the present invention will be described.
 The insertion system 1 according to the fourth embodiment is an example in which the insertion tool 3 is a treatment tool without an imaging unit, such as a catheter, as shown in FIG. 18. Like the flexible endoscope apparatus of the first (and second) embodiment, such a treatment tool includes a flexible insertion portion 31.
 Therefore, the configuration, operation, and effects other than those related to the captured image are the same as those of the insertion system according to the first (and second) embodiment.
 Although the present invention has been described above based on the embodiments, the present invention is not limited to the embodiments described above, and various modifications and applications are of course possible within the scope of the gist of the present invention.
 For example, the above functions can also be realized by supplying a computer with a software program that implements the functions shown in the flowchart of FIG. 7 and having the computer execute that program.
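 As a rough, non-authoritative sketch of such a program (every name, data structure, and threshold below is a hypothetical assumption for illustration; the patent discloses only the functional flow, not code), the processing described in the embodiments — acquire insertion state information, look up the inserted body's shape information, compute the positional relationship, and control the output state when the distal end reaches a predetermined position — might look like:

```python
# Hypothetical sketch of the FIG. 7 flow; illustrative only.
from dataclasses import dataclass


@dataclass
class InsertionState:
    depth_mm: float      # insertion amount detected at the insertion opening
    rotation_deg: float  # rotation amount detected at the insertion opening


def compute_positional_relationship(state, body_shape):
    """Map the inserted depth onto a pre-stored body shape model,
    given as a list of (region name, cumulative length in mm) pairs."""
    for region, end_mm in body_shape:
        if state.depth_mm <= end_mm:
            return region
    return body_shape[-1][0]  # beyond the model: report the last region


def control_output(region, predetermined):
    """Change the output state when the distal end reaches a predetermined position."""
    if region == predetermined:
        return f"ALERT: distal end has reached {region}"
    return f"current region: {region}"


# Example: a simplified body shape model with assumed cumulative lengths.
shape = [("rectum", 150.0), ("sigmoid colon", 550.0), ("descending colon", 800.0)]
region = compute_positional_relationship(InsertionState(300.0, 10.0), shape)
print(control_output(region, "sigmoid colon"))
```

 In this sketch the positional relationship is reduced to a one-dimensional depth lookup purely to keep the example short; the embodiments describe richer shape information and output-state control than this.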

Claims (19)

  1.  An insertion system characterized by comprising:
     an insertion portion to be inserted into an inserted body;
     an insertion state acquisition unit configured to acquire insertion state information of the insertion portion;
     an inserted body shape acquisition unit configured to acquire shape information of the inserted body;
     a positional relationship calculation unit configured to receive the insertion state information and the inserted body shape information and to calculate a positional relationship of the insertion portion with respect to the inserted body;
     an output unit for outputting a calculation result of the positional relationship calculation unit; and
     a control unit configured to control an output state of the calculation result in the output unit.
  2.  The insertion system according to claim 1, further comprising a storage unit configured to store at least a part of the calculation result of the positional relationship calculation unit.
  3.  The insertion system according to claim 1 or 2, further comprising:
     an imaging unit provided in the insertion portion; and
     a display unit capable of displaying at least a captured image obtained from the imaging unit.
  4.  The insertion system according to any one of claims 1 to 3, wherein the control unit has a determination unit that determines, from the calculation result of the positional relationship calculation unit, whether the distal end of the insertion portion has reached a predetermined position, and the control unit controls the output state of the calculation result in the output unit when the determination unit determines that the distal end of the insertion portion has reached the predetermined position.
  5.  The insertion system according to claim 4, wherein the predetermined position is one of an entrance or exit of an observation location of the inserted body, a known branch location, and a location where the captured image shows a specific pattern.
  6.  The insertion system according to claim 4 or 5, wherein the control unit automatically controls the output state of the output unit when the distal end of the insertion portion reaches the predetermined position of the inserted body.
  7.  The insertion system according to claim 4 or 5, wherein the control unit controls the predetermined position in accordance with a switch operation by an operator.
  8.  The insertion system according to any one of claims 1 to 7, wherein the control unit is capable of setting the state of an insertion target region of the inserted body, and a state display range corresponding to the size and extent of the insertion target region, in accordance with a setting operation by an operator of the insertion portion.
  9.  The insertion system according to claim 3, wherein the control unit causes the storage unit to store the captured image obtained from the imaging unit in association with the calculation result of the positional relationship calculation unit.
  10.  The insertion system according to claim 3, wherein the control unit causes the storage unit to store the captured image obtained from the imaging unit in association with at least the calculation result of the positional relationship calculation unit and symptom information.
  11.  The insertion system according to any one of claims 1 to 3, wherein the control unit causes the output unit to display the calculation result of the positional relationship calculation unit by two or more distinguishable methods.
  12.  The insertion system according to any one of claims 1 to 3, wherein the control unit causes the output unit to perform a display that is divided into a plurality of regions.
  13.  The insertion system according to any one of claims 1 to 3, wherein the control unit causes the output unit to display the same region of the inserted body by two or more different region division displays.
  14.  The insertion system according to claim 2 or 3, wherein the control unit has a related information calling unit that allows information stored in the storage unit to be displayed on the output unit.
  15.  The insertion system according to claim 3, wherein the output of the output unit and the display of the display unit are displayed on the same device or on different devices placed near each other.
  16.  The insertion system according to any one of claims 1 to 3, further comprising a detection unit configured to detect an insertion state of the insertion portion, wherein the detection unit includes at least one of:
     a shape sensor mounted inside the insertion portion and configured to detect the shape of the insertion portion; and
     an insertion portion sensor provided at an insertion opening of the insertion portion into the inserted body and configured to detect an insertion amount and a rotation amount of the insertion portion as it passes.
  17.  An insertion support apparatus for supporting insertion of an insertion portion of an insertion tool into an inserted body, characterized by comprising:
     an insertion state acquisition unit configured to acquire insertion state information of the insertion portion;
     an inserted body shape acquisition unit configured to acquire shape information of the inserted body;
     a positional relationship calculation unit configured to receive the insertion state information and the inserted body shape information and to calculate a positional relationship of the insertion portion with respect to the inserted body;
     an output unit for outputting a calculation result of the positional relationship calculation unit as display information; and
     a control unit configured to control an output state of the calculation result in the output unit.
  18.  An insertion support method used in an insertion support apparatus for supporting insertion of an insertion portion of an insertion tool into an inserted body, the method characterized by comprising:
     an insertion state acquisition step of acquiring insertion state information of the insertion portion;
     an inserted body shape acquisition step of acquiring shape information of the inserted body;
     a positional relationship calculation step of receiving the insertion state information and the inserted body shape information and calculating a positional relationship of the insertion portion with respect to the inserted body;
     an output step of outputting the calculation result of the positional relationship calculation step as display information; and
     a control step of controlling an output state of the calculation result in the output step.
  19.  A program for causing a computer to execute:
     an insertion state acquisition procedure of acquiring insertion state information of an insertion portion in an insertion support apparatus that supports insertion of the insertion portion of an insertion tool into an inserted body;
     an inserted body shape acquisition procedure of acquiring shape information of the inserted body;
     a positional relationship calculation procedure of receiving the insertion state information and the inserted body shape information and calculating a positional relationship of the insertion portion with respect to the inserted body;
     an output procedure of outputting the calculation result of the positional relationship calculation procedure as display information; and
     a control procedure of controlling an output state of the calculation result in the output procedure.
PCT/JP2013/078738 2012-10-25 2013-10-23 Insertion system, insertion support device, insertion support method and program WO2014065336A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201380055958.4A CN104755007B (en) 2012-10-25 2013-10-23 Insertion system and insertion assisting system
EP13849891.0A EP2912987A4 (en) 2012-10-25 2013-10-23 Insertion system, insertion support device, insertion support method and program
US14/693,142 US20150223670A1 (en) 2012-10-25 2015-04-22 Insertion system, insertion supporting device, insertion supporting method and recording medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012235634A JP6128796B2 (en) 2012-10-25 2012-10-25 INSERTION SYSTEM, INSERTION SUPPORT DEVICE, OPERATION METHOD AND PROGRAM FOR INSERTION SUPPORT DEVICE
JP2012-235634 2012-10-25

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/693,142 Continuation US20150223670A1 (en) 2012-10-25 2015-04-22 Insertion system, insertion supporting device, insertion supporting method and recording medium

Publications (1)

Publication Number Publication Date
WO2014065336A1 true WO2014065336A1 (en) 2014-05-01

Family

ID=50544711

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/078738 WO2014065336A1 (en) 2012-10-25 2013-10-23 Insertion system, insertion support device, insertion support method and program

Country Status (5)

Country Link
US (1) US20150223670A1 (en)
EP (1) EP2912987A4 (en)
JP (1) JP6128796B2 (en)
CN (1) CN104755007B (en)
WO (1) WO2014065336A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016162947A1 (en) * 2015-04-07 2016-10-13 オリンパス株式会社 Insertion/extraction device, direct-manipulation estimation method for insertion unit, and direct-manipulation estimation program for insertion unit
US20160353968A1 (en) * 2014-06-10 2016-12-08 Olympus Corporation Endoscope system and method for operating endoscope system
CN107105967A (en) * 2014-12-19 2017-08-29 奥林巴斯株式会社 Plugging auxiliary apparatus and plug householder method

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5942047B2 (en) * 2014-03-17 2016-06-29 オリンパス株式会社 Endoscope system
WO2016072288A1 (en) * 2014-11-06 2016-05-12 ソニー株式会社 Endoscopic device, method for operating endoscopic device, and program
DE112014007269T5 (en) * 2014-12-19 2017-10-12 Olympus Corporation Insertion / removal support apparatus and insertion / removal support method
JP6800567B2 (en) * 2015-09-03 2020-12-16 キヤノンメディカルシステムズ株式会社 Medical image display device
JPWO2017104080A1 (en) * 2015-12-18 2018-11-08 オリンパス株式会社 Insertion system
WO2017203814A1 (en) * 2016-05-25 2017-11-30 オリンパス株式会社 Endoscope device and operation method for endoscope device
JP6211239B1 (en) * 2016-05-25 2017-10-11 オリンパス株式会社 Endoscope device
WO2018122946A1 (en) * 2016-12-27 2018-07-05 オリンパス株式会社 Shape acquisition method and control method for medical manipulator
CN106691386A (en) * 2016-12-31 2017-05-24 中国科学院昆明动物研究所 Detection device of living body fluorescence signal and method
JP6624705B2 (en) * 2017-01-17 2019-12-25 オリンパス株式会社 Endoscope insertion shape observation device
WO2018179986A1 (en) * 2017-03-30 2018-10-04 富士フイルム株式会社 Endoscope system, processor device, and method for operating endoscope system
US11229492B2 (en) 2018-10-04 2022-01-25 Biosense Webster (Israel) Ltd. Automatic probe reinsertion
CN114554937A (en) 2019-12-02 2022-05-27 富士胶片株式会社 Endoscope system, control program, and display method
JPWO2021200295A1 (en) * 2020-03-31 2021-10-07
JP2022163562A (en) * 2021-04-14 2022-10-26 i-PRO株式会社 endoscope system
JPWO2023286196A1 (en) * 2021-07-14 2023-01-19
WO2023228659A1 (en) * 2022-05-24 2023-11-30 富士フイルム株式会社 Image processing device and endoscope system

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002200030A (en) * 2000-12-27 2002-07-16 Olympus Optical Co Ltd Position detecting device for endoscope
US6846286B2 (en) 2001-05-22 2005-01-25 Pentax Corporation Endoscope system
JP2005211535A (en) * 2004-01-30 2005-08-11 Olympus Corp Insertion support system
JP2009077765A (en) * 2007-09-25 2009-04-16 Fujifilm Corp Endoscopic system
JP2009125394A (en) * 2007-11-26 2009-06-11 Toshiba Corp Intravascular image diagnostic apparatus and intravascular image diagnostic system
JP2011206425A (en) * 2010-03-30 2011-10-20 Fujifilm Corp Image processor, image processing method, image processing program, and stereoscopic endoscope
WO2012132638A1 (en) * 2011-03-30 2012-10-04 オリンパスメディカルシステムズ株式会社 Endoscope system
WO2013011733A1 (en) * 2011-07-15 2013-01-24 株式会社 日立メディコ Endoscope guidance system and endoscope guidance method

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6059718A (en) * 1993-10-18 2000-05-09 Olympus Optical Co., Ltd. Endoscope form detecting apparatus in which coil is fixedly mounted by insulating member so that form is not deformed within endoscope
AU2003264354A1 (en) * 2002-08-30 2004-04-30 Olympus Corporation Medical treatment system, endoscope system, endoscope insert operation program, and endoscope device
JP4077716B2 (en) * 2002-11-20 2008-04-23 オリンパス株式会社 Endoscope insertion direction detection device
IL181470A (en) * 2006-02-24 2012-04-30 Visionsense Ltd Method and system for navigating within a flexible organ of the body of a patient
JP4153963B2 (en) * 2006-06-12 2008-09-24 オリンパスメディカルシステムズ株式会社 Endoscope insertion shape detection device
ES2511033T3 (en) * 2008-02-12 2014-10-22 Covidien Lp Controlled Perspective Orientation System
US8337397B2 (en) * 2009-03-26 2012-12-25 Intuitive Surgical Operations, Inc. Method and system for providing visual guidance to an operator for steering a tip of an endoscopic device toward one or more landmarks in a patient
US20130109957A1 (en) * 2010-01-14 2013-05-02 Koninklijke Philips Electronics N.V. Flexible instrument channel insert for scope with real-time position tracking
EP2446806A4 (en) * 2010-05-31 2017-06-21 Olympus Corporation Endoscope shape detection device and method for detecting shape of inserted part of endoscope
WO2012026184A1 (en) * 2010-08-27 2012-03-01 オリンパスメディカルシステムズ株式会社 Endoscope shape detection device and method for detecting shape of insertion portion of endoscope
JP5603745B2 (en) * 2010-11-10 2014-10-08 オリンパス株式会社 Endoscope device
JP5766940B2 (en) * 2010-12-01 2015-08-19 オリンパス株式会社 Tubular insertion system
JP5160699B2 (en) * 2011-01-24 2013-03-13 オリンパスメディカルシステムズ株式会社 Medical equipment
JP6128792B2 (en) * 2012-10-16 2017-05-17 オリンパス株式会社 Observation device, observation support device, method of operating observation device, and program


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2912987A4

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160353968A1 (en) * 2014-06-10 2016-12-08 Olympus Corporation Endoscope system and method for operating endoscope system
US9918614B2 (en) * 2014-06-10 2018-03-20 Olympus Corporation Endoscope system with angle-of-view display information overlapped on three-dimensional image information
CN107105967A (en) * 2014-12-19 2017-08-29 奥林巴斯株式会社 Plugging auxiliary apparatus and plug householder method
WO2016162947A1 (en) * 2015-04-07 2016-10-13 オリンパス株式会社 Insertion/extraction device, direct-manipulation estimation method for insertion unit, and direct-manipulation estimation program for insertion unit
CN107529939A (en) * 2015-04-07 2018-01-02 奥林巴斯株式会社 The direct operation estimating program of plug-in and pull-off device, the direct operation estimation method of insertion section and insertion section
JPWO2016162947A1 (en) * 2015-04-07 2018-02-01 オリンパス株式会社 Insertion / removal device, insertion portion direct operation estimation method, and insertion portion direct operation estimation program
US10502693B2 (en) 2015-04-07 2019-12-10 Olympus Corporation Insertion/removal apparatus, insertion section direct manipulation estimation method and storage medium which non-transitory stores insertion section direct manipulation estimation program

Also Published As

Publication number Publication date
CN104755007B (en) 2016-11-09
EP2912987A4 (en) 2016-07-06
US20150223670A1 (en) 2015-08-13
JP2014083289A (en) 2014-05-12
EP2912987A1 (en) 2015-09-02
JP6128796B2 (en) 2017-05-17
CN104755007A (en) 2015-07-01

Similar Documents

Publication Publication Date Title
JP6128796B2 (en) INSERTION SYSTEM, INSERTION SUPPORT DEVICE, OPERATION METHOD AND PROGRAM FOR INSERTION SUPPORT DEVICE
US11911011B2 (en) Endolumenal object sizing
US11529197B2 (en) Device and method for tracking the position of an endoscope within a patient's body
KR102356881B1 (en) Graphical user interface for catheter positioning and insertion
JP5160699B2 (en) Medical equipment
CN102449666B (en) To turn to towards one or more boundary mark for the end for handling endoscopic apparatus and provide vision guide and the system of assist operator in endoscope navigation
JP4738270B2 (en) Surgery support device
US11786106B2 (en) Robotic endoscope probe having orientation reference markers
KR20190015580A (en) A graphical user interface for displaying guide information during an image guided procedure
CN103607941A (en) Medical system with multiple operating modes for steering a medical instrument through linked body passages
CN107072736A (en) The enhanced fluoroscopy systems of computed tomography, device and its application method
US20150216392A1 (en) Observation apparatus, observation supporting device, observation supporting method and recording medium
JP2023080220A (en) Endoscopic image of invasive procedures in narrow passages
WO2014061428A1 (en) Observation device, observation assistance device, observation assistance method and program
JP6749020B2 (en) Endoscope navigation device
EP3530221B1 (en) System for performing a percutaneous navigation procedure
US20190254563A1 (en) Endoscope insertion shape observation apparatus
US20150045648A1 (en) Integrated system for focused treatment and methods thereof
JP7323647B2 (en) Endoscopy support device, operating method and program for endoscopy support device
JP7300514B2 (en) Endoscope insertion control device, endoscope operation method and endoscope insertion control program
CN220655593U (en) Ultrasound imaging system
KR20240009905A (en) System and method for providing guidance for stone removal

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13849891

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2013849891

Country of ref document: EP