US20230180998A1 - Endoscope system, controller, control method, and recording medium - Google Patents

Info

Publication number
US20230180998A1
Authority
US
United States
Prior art keywords
endoscope
rotation angle
region
processor
position information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US18/105,300
Inventor
Chiharu MIZUTANI
Masaru YANAGIHARA
Hiroto OGIMOTO
Hiro HASEGAWA
Daichi KITAGUCHI
Nobuyoshi TAKESHITA
Shigehiro KOJIMA
Yuki Furusawa
Yumi KINEBUCHI
Masaaki Ito
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Olympus Corp
National Cancer Center Japan
National Cancer Center Korea
Original Assignee
Olympus Corp
National Cancer Center Japan
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Corp, National Cancer Center Japan filed Critical Olympus Corp
Priority to US18/105,300 priority Critical patent/US20230180998A1/en
Assigned to OLYMPUS CORPORATION, NATIONAL CANCER CENTER reassignment OLYMPUS CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: KOJIMA, Shigehiro, MIZUTANI, Chiharu, OGIMOTO, Hiroto, TAKESHITA, Nobuyoshi, YANAGIHARA, MASARU, HASEGAWA, Hiro, ITO, MASAAKI, FURUSAWA, Yuki, KINEBUCHI, Yumi, KITAGUCHI, Daichi
Publication of US20230180998A1 publication Critical patent/US20230180998A1/en
Pending legal-status Critical Current

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30Surgical robots
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002Operational features of endoscopes
    • A61B1/00043Operational features of endoscopes provided with output arrangements
    • A61B1/00045Display arrangement
    • A61B1/0005Display arrangement combining images e.g. side-by-side, superimposed or tiled
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002Operational features of endoscopes
    • A61B1/00004Operational features of endoscopes characterised by electronic signal processing
    • A61B1/00006Operational features of endoscopes characterised by electronic signal processing of control signals
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002Operational features of endoscopes
    • A61B1/00004Operational features of endoscopes characterised by electronic signal processing
    • A61B1/00009Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002Operational features of endoscopes
    • A61B1/00039Operational features of endoscopes provided with input arrangements for the user
    • A61B1/0004Operational features of endoscopes provided with input arrangements for the user for electronic operation
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002Operational features of endoscopes
    • A61B1/00039Operational features of endoscopes provided with input arrangements for the user
    • A61B1/00042Operational features of endoscopes provided with input arrangements for the user for mechanical operation
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002Operational features of endoscopes
    • A61B1/00043Operational features of endoscopes provided with output arrangements
    • A61B1/00045Display arrangement
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00147Holding or positioning arrangements
    • A61B1/00149Holding or positioning arrangements using articulated arms
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00147Holding or positioning arrangements
    • A61B1/0016Holding or positioning arrangements using motor drive units
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B1/045Control thereof
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30Surgical robots
    • A61B34/32Surgical robots operating autonomously
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/06Devices, other than using radiation, for detecting or locating foreign bodies ; Determining position of diagnostic devices within or on the body of the patient
    • A61B5/065Determining position of the probe employing exclusively positioning means located on or in the probe, e.g. using position sensors arranged on the probe
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B23/00Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B23/24Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/60Rotation of whole images or parts thereof
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/62Control of parameters via user interfaces
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/667Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002Operational features of endoscopes
    • A61B1/00004Operational features of endoscopes characterised by electronic signal processing
    • A61B1/00009Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • A61B1/000094Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope extracting biological structures
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002Operational features of endoscopes
    • A61B1/00004Operational features of endoscopes characterised by electronic signal processing
    • A61B1/00009Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • A61B1/000096Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope using artificial intelligence
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00163Optical arrangements
    • A61B1/00193Optical arrangements adapted for stereoscopic vision
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2055Optical tracking systems
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2065Tracking using image or pattern recognition
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2068Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis using pointers, e.g. pointers having reference marks for determining coordinates of body points
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30Surgical robots
    • A61B2034/301Surgical robots for introducing or steering flexible instruments inserted into the body, e.g. catheters or endoscopes
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30Surgical robots
    • A61B2034/305Details of wrist mechanisms at distal ends of robotic arms
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/363Use of fiducial points
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/25User interfaces for surgical systems
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6846Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be brought in contact with an internal body part, i.e. invasive
    • A61B5/6847Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be brought in contact with an internal body part, i.e. invasive mounted on an invasive device
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image
    • G06T2207/10012Stereo images
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10068Endoscopic image
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30244Camera pose

Definitions

  • the present invention relates to an endoscope system, a controller, a control method, and a recording medium.
  • An endoscope system of PTL 1 stores time series variations in the rotation angle of each joint of the holder in a manual mode while an operator moves the endoscope, and the endoscope system reversely reproduces the time series variations in the rotation angle of each joint in an automatic return mode.
  • the endoscope moves reversely along a movement path in the manual mode and automatically returns to the initial position and orientation.
  • An aspect of the present invention is an endoscope system including an endoscope that is inserted into a subject (into the body cavity of a patient) and captures an endoscope image in the subject; a moving device that holds the endoscope and moves the endoscope; a storage unit; and a controller including at least one processor, wherein the storage unit stores first position information and first rotation angle information on a first region in the subject and second position information and second rotation angle information on a second region different from the first region in the subject, the first rotation angle information defining a rotation angle of the endoscope image of the first region, the second rotation angle information defining a rotation angle of the endoscope image of the second region, the at least one processor calculates third rotation angle information on a third region in the subject on the basis of the first position information, the first rotation angle information, the second position information, the second rotation angle information, and third position information on the third region, the third region being different from the first and second regions, and the at least one processor rotates the endoscope image on the basis of the third rotation angle information if the third region includes a current imaging region that is currently being imaged by the endoscope.
  • Another aspect of the present invention is a controller configured to control an endoscope image that is captured by an endoscope and is displayed on a display device, the controller including: a storage unit; and at least one processor, wherein the storage unit stores first position information and first rotation angle information on a first region in a subject and second position information and second rotation angle information on a second region different from the first region in the subject, the first rotation angle information defining a rotation angle of the endoscope image of the first region, the second rotation angle information defining a rotation angle of the endoscope image of the second region, the at least one processor calculates third rotation angle information on a third region in the subject on the basis of the first position information, the first rotation angle information, the second position information, the second rotation angle information, and third position information on the third region, the third region being different from the first and second regions, the at least one processor rotates the endoscope image on the basis of the third rotation angle information if the third region includes a current imaging region that is currently being imaged by the endoscope, and the at least one processor outputs the rotated endoscope image to the display device.
  • Another aspect of the present invention is a control method for controlling an endoscope image that is captured by an endoscope and is displayed on a display device, by using first position information and first rotation angle information on a first region in a subject and second position information and second rotation angle information on a second region different from the first region in the subject, the first rotation angle information defining a rotation angle of the endoscope image of the first region, the second rotation angle information defining a rotation angle of the endoscope image of the second region, the control method including the steps of: calculating third rotation angle information on a third region in the subject on the basis of the first position information, the first rotation angle information, the second position information, the second rotation angle information, and third position information on the third region, the third region being different from the first and second regions; rotating the endoscope image on the basis of the third rotation angle information if the third region includes a current imaging region that is currently being imaged by the endoscope, and outputting the rotated endoscope image to the display device.
  • Another aspect of the present invention is a computer-readable non-transitory recording medium in which a control program for causing a computer to perform the control method is recorded.
  • FIG. 1 A is an external view illustrating the overall configuration of an endoscope system.
  • FIG. 1 B is an explanatory drawing of a movement of an endoscope inserted in an abdominal cavity.
  • FIG. 1 C illustrates the tip portion of a robot arm and the endoscope.
  • FIG. 2 is a block diagram illustrating the overall configuration of the endoscope system.
  • FIG. 3 A is a sequence diagram of a control method according to a first embodiment and an explanatory drawing of a user operation and the processing of a processor in a manual mode.
  • FIG. 3 B is a flowchart of the control method according to the first embodiment and an explanatory drawing of the processing of the processor in an autonomous mode.
  • FIG. 4 A is an explanatory drawing of an endoscope operation in the step of determining first position information and first rotation angle information.
  • FIG. 4 B is an explanatory drawing of an endoscope operation in the step of determining second position information and second rotation angle information.
  • FIG. 5 A illustrates an endoscope image at O-point.
  • FIG. 5 B illustrates an endoscope image at B-point.
  • FIG. 5 C illustrates the endoscope image of FIG. 5 B when the vertical direction is adjusted by a rotation.
  • FIG. 6 A illustrates an endoscope image at A-point.
  • FIG. 6 B illustrates the endoscope image of FIG. 6 A when the vertical direction is adjusted by a rotation.
  • FIG. 7 indicates position information and rotation angle information that are stored in a storage unit in the manual mode.
  • FIG. 8 A is a sequence diagram of a control method according to a second embodiment and an explanatory drawing of a user operation and the processing of a processor in a manual mode.
  • FIG. 8 B is a flowchart of the control method according to the second embodiment and an explanatory drawing of the processing of the processor in an autonomous mode.
  • FIG. 9 is a flowchart of a control method according to a third embodiment and an explanatory drawing of the processing of a processor in an autonomous mode.
  • FIG. 10 illustrates an oblique endoscope according to a first modification.
  • FIG. 11 A is a sequence diagram of a control method according to the first modification and an explanatory drawing of a user operation and the processing of the processor in the manual mode.
  • FIG. 11 B is a flowchart of the control method according to the first modification and an explanatory drawing of the processing of the processor in the autonomous mode.
  • FIG. 12 illustrates an endoscope with a curved portion according to a second modification.
  • FIG. 13 A is a sequence diagram of a control method according to another modification and an explanatory drawing of a user operation and the processing of the processor in the manual mode.
  • FIG. 13 B is a flowchart of the control method according to another modification and an explanatory drawing of the processing of the processor in the autonomous mode.
  • FIG. 14 A is an external view illustrating the overall configuration of a modification of the endoscope system in FIG. 1 A .
  • FIG. 14 B is an external view illustrating the overall configuration of another modification of the endoscope system in FIG. 1 A .
  • an endoscope system 10 is used for a surgical operation in which an endoscope 2 and at least one surgical instrument 6 are inserted into the body of a patient X serving as a subject and an affected part is treated with the surgical instrument 6 while the surgical instrument 6 is observed through the endoscope 2 .
  • the endoscope system 10 is used for, for example, laparoscopic surgery.
  • the endoscope 2 is inserted into the subject, for example, an abdominal cavity through a hole H formed on the body wall.
  • the endoscope 2 is fixed to the subject, is supported by the body wall at the position of the hole H serving as a pivot point, and is pivotable about a pivot axis (first pivot axis) P 1 passing through the pivot point H.
  • the pivot axis P 1 extends in the anteroposterior direction of the patient X from the abdomen to the back.
  • the endoscope 2 pivots about the pivot axis P 1 so as to move an imaging region of the endoscope 2 between a first region including an aorta F and a second region including a pelvis G.
  • the endoscope 2 and the surgical instrument 6 may be inserted into the subject through a cannula passing through the hole H.
  • the cannula is a cylindrical instrument opened at both ends.
  • the endoscope 2 is supported by the cannula at the position of the hole H.
  • the endoscope system 10 includes the endoscope 2 , a moving device 3 that holds the endoscope 2 and moves the endoscope 2 in the subject, an endoscope processor 4 that is connected to the endoscope 2 and processes an endoscope image E captured by the endoscope 2 , a controller 1 that is connected to the moving device 3 and the endoscope processor 4 and controls the moving device 3 , and a display device 5 that is connected to the endoscope processor 4 and displays the endoscope image E.
  • the endoscope 2 is a direct-view endoscope having a visual axis (optical axis) C coaxial with a longitudinal axis I of the endoscope 2 .
  • the endoscope 2 is, for example, a rigid endoscope.
  • the endoscope 2 including an image sensor 2 a captures an image in the subject X, for example, an abdominal cavity and acquires the endoscope image E including the tip of the surgical instrument 6 (see FIGS. 5 A to 6 B ).
  • the image sensor 2 a is, for example, a three-dimensional camera provided at the tip portion of the endoscope 2 and captures a stereo image as the endoscope image E.
  • the image sensor 2 a is an image sensor such as a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor.
  • the image sensor 2 a generates an image of a predetermined region by converting received light from the predetermined region into an electric signal through photoelectric conversion.
  • a stereo image as the endoscope image E is generated by performing image processing on two images with a parallax through the endoscope processor 4 or the like. In this case, the tip portion of the endoscope 2 has a stereo optical system.
  • the endoscope image E is transmitted from the endoscope 2 to the endoscope processor 4 , is subjected to necessary processing in the endoscope processor 4 , is transmitted from the endoscope processor 4 to the display device 5 , and is displayed on a display screen 5 a of the display device 5 .
  • the display device 5 may be any display, for example, a liquid crystal display or an organic electroluminescent display. A surgeon operates the surgical instrument 6 in a body while observing the endoscope image E displayed on the display screen 5 a .
  • the display device 5 may include an audio system, for example, a speaker.
  • a user terminal for communications with the controller 1 and the endoscope processor 4 via a communication network may be provided to display the endoscope image E at the terminal.
  • the terminal is, for example, a notebook computer, a laptop computer, a tablet computer, or a smartphone but is not particularly limited thereto.
  • the moving device 3 includes a robot arm 3 a (including an electric scope holder) that holds the endoscope 2 and three-dimensionally controls the position and orientation of the endoscope 2 .
  • the moving device 3 includes a plurality of joints 3 b and 3 c that operate to move the endoscope 2 with the pivot axis P 1 serving as a supporting point, thereby three-dimensionally changing the position and orientation of the endoscope 2 .
  • the joint 3 c is a rotary joint that rotates the endoscope 2 about the longitudinal axis I and is provided at the tip portion of the robot arm 3 a .
  • the endoscope 2 rotates about the optical axis C coaxial with the longitudinal axis I, thereby changing the rotation angle of a subject in the endoscope image E, that is, the vertical direction of the endoscope image E.
  • the moving device 3 includes a plurality of angle sensors 3 d that detects the rotation angles of the joints 3 b and 3 c .
  • the angle sensor 3 d is, for example, an encoder, a potentiometer, or a Hall sensor that is provided at each of the joints 3 b and 3 c.
  • the controller 1 includes at least one processor 11 like a central processing unit, a memory 12 , a storage unit 13 , an input interface 14 , an output interface 15 , and a user interface 16 .
  • the controller 1 may be, for example, a desktop computer, a tablet computer, a laptop computer, a smartphone, or a cellular phone.
  • the processor 11 may be a single processor, a multiprocessor, or a multicore processor.
  • the processor 11 reads and executes a program stored in the storage unit 13 .
  • the memory 12 is, for example, a semiconductor memory including a ROM (read-only memory) or RAM (Random Access Memory) area.
  • the memory 12 may store data necessary for the processing of the processor 11 (that is, the memory 12 may operate as “storage unit”) like the storage unit 13 , which will be described later.
  • the storage unit 13 is a computer-readable non-transitory recording medium, e.g., a hard disk or a nonvolatile recording medium including a semiconductor memory such as flash memory.
  • the storage unit 13 stores various programs including a follow-up control program (not illustrated) and an image control program (control program) 1 a and data necessary for the processing of the processor 11 .
  • Processing performed by the processor 11 may be implemented by dedicated logic circuits or hardware, for example, an FPGA (Field Programmable Gate Array), a SoC (System-on-a-Chip), an ASIC (Application Specific Integrated Circuit), and a PLD (Programmable Logic Device).
  • the storage unit 13 may be a server, e.g., a cloud server connected via a communication network to the controller 1 provided with a communication interface, instead of a recording medium integrated in the controller 1 .
  • the communication network may be, for example, a public network such as the Internet, a dedicated line, or a LAN (Local Area Network).
  • the connection of the devices may be wired connection or wireless connection.
  • the endoscope processor 4 for processing the endoscope image E may be provided with the processor 11 .
  • the endoscope processor 4 may be provided with processors, dedicated logic circuits, or hardware to perform processing like the processor 11 . The processing will be described later.
  • the endoscope processor 4 and the controller 1 may be integrated into one unit. Each of the endoscope processor 4 and the controller 1 may be provided with at least one processor.
  • any one of the configurations of the at least one processor 11 , the memory 12 , the storage unit 13 , the input interface 14 , the output interface 15 , and the user interface 16 in the controller 1 may be provided for a user terminal, aside from the endoscope processor 4 and the controller 1 .
  • the controller 1 may be integrated with the moving device 3 .
  • the input interface 14 and the output interface 15 are connected to the endoscope processor 4 .
  • the controller 1 can acquire the endoscope image E from the endoscope 2 via the endoscope processor 4 and output the endoscope image E to the display device 5 via the endoscope processor 4 .
  • the input interface 14 may be directly connected to the endoscope 2 and the output interface 15 may be directly connected to the display device 5 such that the controller 1 can directly acquire the endoscope image E from the endoscope 2 and directly output the endoscope image E to the display device 5 .
  • the input interface 14 and the output interface 15 are connected to the moving device 3 .
  • the controller 1 acquires, from the moving device 3 , information on rotation angles detected by the angle sensors 3 d at the joints 3 b and 3 c and transmits, to the moving device 3 , a control signal for driving the joints 3 b and 3 c.
  • the user interface 16 has input devices for inputs to the user interface 16 by users such as a surgeon and receives a user input.
  • the input devices include a button, a mouse, a keyboard, and a touch panel.
  • the user interface 16 has a means that allows a user to switch between a manual mode and an autonomous mode, which will be described later.
  • the means is, for example, a switch.
  • the user interface 16 is configured to receive a first instruction and a second instruction from a user.
  • the first instruction and the second instruction are instructions for causing the controller 1 to register position information and rotation angle information, which will be described later.
  • the user interface 16 has a button operated by an operator. The user interface 16 receives the first instruction in response to a first button operation and receives the second instruction in response to a second button operation.
  • the processor 11 can be operated in the manual mode or the autonomous mode.
  • the manual mode is a mode that permits users such as a surgeon to operate the endoscope 2 .
  • a surgeon can manually move the endoscope 2 with a hand holding the proximal end portion of the endoscope 2 .
  • the surgeon can remotely operate the endoscope 2 by using an operating device connected to the moving device 3 .
  • the operating device can include a button, a joystick, and a touch panel.
  • the autonomous mode is a mode that causes the endoscope 2 to automatically follow the surgical instrument 6 by controlling the moving device 3 on the basis of the position of the surgical instrument 6 in the endoscope image E.
  • the processor 11 acquires the three-dimensional position of the tip of the surgical instrument 6 from the endoscope image E and controls the moving device 3 on the basis of the three-dimensional position of the tip of the surgical instrument 6 and the three-dimensional position of a predetermined target point set in the field of view of the endoscope 2 .
  • the target point is, for example, a point that is located on the optical axis C and corresponds to the center point of the endoscope image E.
  • the controller 1 controls a movement of the endoscope 2 and causes the endoscope 2 to follow the surgical instrument 6 such that the tip of the surgical instrument 6 is disposed at the center point in the endoscope image E.
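As a rough illustration of this follow-up control, the sketch below uses a simple proportional law in Python; the function name, the gain, and the omitted mapping of the Cartesian command onto the joints 3 b and 3 c are assumptions for illustration, not details given by the embodiment.

```python
import numpy as np

def follow_up_command(tip_xyz, target_xyz, gain=0.5):
    """Proportional follow-up: move the endoscope so that the predetermined
    target point on the optical axis C approaches the instrument tip, whose
    3-D position is measured from the stereo endoscope image."""
    error = np.asarray(tip_xyz, dtype=float) - np.asarray(target_xyz, dtype=float)
    return gain * error  # Cartesian velocity command; joint mapping is done elsewhere
```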
  • the processor 11 controls the rotation angle of the endoscope image E displayed on the display screen 5 a.
  • the control method includes step SB 2 of setting the initial position of the endoscope 2 , steps SB 3 and SB 4 of determining first position information and first rotation angle information on the first region in a subject, steps SB 5 and SB 6 of determining second position information and second rotation angle information on the second region in the subject, steps SB 7 and SB 8 of calculating third position information and third rotation angle information on a third region in the subject, step SB 9 of storing the position information and the rotation angle information in the storage unit 13 , steps SC 4 to SC 9 of rotating the endoscope image E according to a current imaging region that is currently being imaged by the endoscope 2 , and step SC 10 of outputting the rotated endoscope image E to the display device 5 .
  • steps SB 2 to SB 9 are performed in the manual mode.
  • steps SC 3 to SC 10 are performed in the autonomous mode.
  • a user, e.g., a surgeon, inserts the endoscope 2 held by the moving device 3 into an abdominal cavity, switches to the manual mode (SA 1 , SB 1 ), and starts panning around by moving the endoscope 2 in the abdominal cavity (SA 3 ).
  • Panning around is an operation for observing the overall abdominal cavity to confirm the positions or the like of organs and tissues. The positions of organs and tissues vary among patients, so that the operation is required each time the endoscope is inserted.
  • the surgeon rotates the endoscope 2 about the pivot axis P 1 so as to observe, through the endoscope 2 , a range including at least two specific tissues having anatomical characteristics.
  • the specific tissues are the aorta F and the pelvis G.
  • the surgeon registers the initial position of the endoscope 2 in the controller 1 before panning around (SA 2 ). For example, the surgeon places the endoscope 2 at a desired initial position and operates a predetermined button of the user interface 16 .
  • the position ρ is the position of the endoscope 2 in a circumferential direction around the pivot axis P 1 and is calculated from the rotation angles detected by the angle sensors 3 d at the joints 3 b and 3 c .
  • the position ρ represents the position of an imaging region in the circumferential direction around the pivot axis P 1 .
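How ρ is recovered from the joint angles depends on the arm's kinematics, which the text does not detail. The sketch below assumes a hypothetical forward_kinematics function that returns the direction of the longitudinal axis I in a frame whose z-axis coincides with the pivot axis P 1 .

```python
import numpy as np

def pivot_position(joint_angles, forward_kinematics):
    # Direction of the endoscope shaft (longitudinal axis I), expressed in a
    # frame whose z-axis is the pivot axis P1 (assumed convention).
    axis = forward_kinematics(joint_angles)
    # Circumferential position rho of the shaft around P1, in degrees.
    return np.degrees(np.arctan2(axis[1], axis[0]))
```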
  • the surgeon places the endoscope 2 at a position (O-point) for imaging the aorta F from the front and adjusts a rotation angle ω of the endoscope 2 about the optical axis C such that the aorta F is placed at a desired rotation angle in the endoscope image E (SA 4 ).
  • the rotation angle of the aorta F is a position in the circumferential direction around the center point of the endoscope image E.
  • the rotation angle ω is adjusted such that the aorta F is horizontally placed in the endoscope image E.
  • the surgeon then inputs the first instruction to the user interface 16 (SA 5 ).
  • the surgeon observes the overall aorta F through the endoscope 2 by rotating the endoscope 2 from O-point about the pivot axis P 1 while keeping the rotation angle ω adjusted at O-point.
  • the aorta F makes a rotational movement in the endoscope image E as the endoscope 2 rotates from O-point to B-point.
  • B-point is the end point of the observation range of the aorta F in the endoscope image E.
  • the processor 11 determines, on the basis of the endoscope image E, the first position information and the first rotation angle information on the first region including the aorta (first specific tissue) F (SB 3 , SB 4 ).
  • the first rotation angle information is information that defines the rotation angle of the endoscope image E of the first region.
  • the storage unit 13 stores a learned model 1 b obtained by machine learning of the correspondence between an image including a specific tissue and the type of the specific tissue.
  • the processor 11 recognizes the aorta F in the endoscope image E by using the learned model 1 b and determines, as the first position information, the range of the position ρ of the endoscope 2 with the aorta F included in the endoscope image E.
  • the first region is a region between O-point and B-point.
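A minimal sketch of this determination, assuming the panning-around sweep is available as recorded (ρ, image) pairs and that model.predict stands in for inference with the learned model 1 b :

```python
def first_position_info(samples, model, tissue="aorta"):
    """samples: iterable of (rho, image) pairs recorded while panning around.
    Returns the range of rho over which the learned model recognizes the
    given specific tissue, i.e. the first position information."""
    hits = [rho for rho, image in samples if model.predict(image) == tissue]
    return (min(hits), max(hits)) if hits else None
```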
  • the initial position is determined at a time and a location as requested by the user.
  • the processor 11 may set the position ρ of the endoscope 2 at the time of the reception of the first instruction, as the first position information without processing using the learned model 1 b .
  • the first position information is determined at a time and a location as requested by the user.
  • in step SB 4 , the processor 11 sets the endoscope image E and the rotation angle ω of the endoscope 2 at the time of the reception of the first instruction by the user interface 16 , as a first reference endoscope image and a first reference rotation angle, and the processor 11 determines the first rotation angle information on the basis of the first reference endoscope image and the first reference rotation angle.
  • the calculated target rotation angle θt represents a required rotation amount of the endoscope image E when the aorta F is to be horizontally placed in the endoscope image E at the position ρ at the time of the reception of the first instruction.
  • the first reference rotation angle ω is set at the initial rotation angle 0°.
  • the processor 11 calculates a required rotation amount Δθ of the endoscope image E, which is obtained at another position ρ included in the first position information, when the aorta F in the endoscope image E is to be aligned with the aorta F in the first reference endoscope image. Subsequently, the processor 11 calculates a target rotation angle θt at another position ρ by adding the rotation amount Δθ to the first reference rotation angle.
  • the calculated target rotation angle θt represents a required rotation amount of the endoscope image E when the aorta F is to be horizontally placed in the endoscope image E at another position ρ.
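The embodiment does not specify how the rotation amount Δθ is estimated between the current image and the reference image; feature-based registration is one plausible approach, sketched below with OpenCV (the choice of ORB features and a similarity transform is an assumption, not the patent's stated method):

```python
import cv2
import numpy as np

def rotation_amount(reference_image, current_image):
    """Estimate the in-plane rotation (degrees) that aligns the tissue in the
    current endoscope image with the first reference endoscope image."""
    orb = cv2.ORB_create()
    kp_ref, des_ref = orb.detectAndCompute(reference_image, None)
    kp_cur, des_cur = orb.detectAndCompute(current_image, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_cur, des_ref)
    src = np.float32([kp_cur[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_ref[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    M, _ = cv2.estimateAffinePartial2D(src, dst)  # similarity: rotation + scale
    return float(np.degrees(np.arctan2(M[1, 0], M[0, 0])))

# Target rotation angle at this position: theta_t = reference angle + delta_theta.
```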
  • FIG. 5 C illustrates the endoscope image E of FIG. 5 B when a rotation is made by the target rotation angle θt at B-point.
  • the surgeon places the endoscope 2 at a position (D-point) for imaging the pelvis G.
  • the pelvis G may be placed at an improper position in the endoscope image E as illustrated in FIG. 6 A .
  • the surgeon adjusts the rotation angle ω about the optical axis C of the endoscope 2 such that the pelvis G is placed at a desired rotation angle in the endoscope image E (SA 6 ), and inputs the second instruction to the user interface 16 (SA 7 ).
  • the rotation angle ω is adjusted such that the pelvis G is placed in an upper part in the endoscope image E.
  • the surgeon observes the overall pelvis G through the endoscope 2 by rotating the endoscope 2 from D-point about the pivot axis P 1 while keeping the rotation angle ω adjusted at D-point. Also at this point, the pelvis G makes a rotational movement in the endoscope image E as the endoscope 2 rotates from D-point to A-point.
  • A-point is the end point of the observation range of the pelvis G in the endoscope image E.
  • the processor 11 determines, on the basis of the endoscope image E, the second position information and the second rotation angle information on the second region including the pelvis (second specific tissue) G (SB 5 , SB 6 ).
  • the second rotation angle information is information that defines the rotation angle of the endoscope image E of the second region.
  • in step SB 5 , the processor 11 recognizes the pelvis G in the endoscope image E by using the learned model 1 b and determines, as the second position information, the range of the position ρ of the endoscope 2 with the pelvis G included in the endoscope image E.
  • the second region is a region between D-point and A-point.
  • the processor 11 may set the position ρ of the endoscope 2 at the time of the reception of the second instruction, as the second position information without processing using the learned model 1 b .
  • the second position information is determined at a time and a location as requested by the user.
  • in step SB 6 , the processor 11 sets the endoscope image E and the rotation angle ω of the endoscope 2 at the time of the reception of the second instruction by the user interface 16 , as a second reference endoscope image and a second reference rotation angle, and the processor 11 determines the second rotation angle information on the basis of the second reference endoscope image and the second reference rotation angle.
  • the calculated target rotation angle θt represents a required rotation amount of the endoscope image E when the pelvis G is to be placed in an upper part in the endoscope image E at the position ρ at the time of the reception of the second instruction.
  • the processor 11 calculates a required rotation amount Δθ of the endoscope image E, which is obtained at another position ρ included in the second position information, when the pelvis G in the endoscope image E is to be aligned with the pelvis G in the second reference endoscope image. Subsequently, the processor 11 calculates a target rotation angle θt at another position ρ by adding the rotation amount Δθ to the second reference rotation angle. The calculated target rotation angle θt represents a required rotation amount of the endoscope image E when the pelvis G is to be placed in an upper part in the endoscope image E at another position ρ.
  • the processor 11 calculates third position information and third rotation angle information on a third region on the basis of the first position information, the first rotation angle information, the second position information, and the second rotation angle information (SB 7 , SB 8 ).
  • the third region is different from the first region and the second region and is located between A-point and B-point in the present embodiment.
  • in step SB 7 , the processor 11 determines, as the third position information, the range of the position ρ between the first position information and the second position information.
  • in step SB 8 , the processor 11 then calculates the third rotation angle information on the basis of the first, second, and third position information and the first and second rotation angle information.
  • the third rotation angle information is information that defines the rotation angle of the endoscope image E of the third region.
  • the processor 11 calculates the positional relationship between the third position information and the first and second position information and calculates the third rotation angle information on the basis of the positional relationship, the first rotation angle information, and the second rotation angle information.
  • each position ρ (M-point) of the third position information is an internally dividing point that internally divides a path between A-point and B-point in an m:n ratio.
  • the processor 11 calculates a target rotation angle θt at each position ρ on the basis of the ratio m:n, the rotation angle of 100° at A-point, and the rotation angle of −10° at B-point.
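In other words, θt is linearly interpolated along the path from A-point to B-point. A minimal sketch, with the positions normalized for illustration:

```python
def third_region_angle(rho, rho_a, rho_b, theta_a, theta_b):
    """Target rotation angle at an M-point that internally divides the path
    from A-point to B-point in an m:n ratio, where m/(m+n) equals
    (rho - rho_a) / (rho_b - rho_a)."""
    t = (rho - rho_a) / (rho_b - rho_a)
    return theta_a + t * (theta_b - theta_a)

# Midway between A-point (100 deg) and B-point (-10 deg), i.e. m:n = 1:1:
# third_region_angle(0.5, 0.0, 1.0, 100.0, -10.0) -> 45.0
```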
  • the third region is a region where a specific tissue like the pelvis G and the aorta F in the first region and the second region is not included in an endoscope image, the specific tissue serving as an index of the rotation angle of the endoscope image E.
  • such a region makes it difficult for the learned model 1 b to recognize a specific tissue and for a user to determine a desired rotation angle. The third position information and the third rotation angle information therefore need to be calculated on the basis of the first and second position information and the first and second rotation angle information of the first region and the second region.
  • in step SB 9 , the processor 11 stores the first position information, the first rotation angle information, the second position information, the second rotation angle information, the third position information, and the third rotation angle information, which are determined in steps SB 3 to SB 8 , in the storage unit 13 .
  • data is generated in the storage unit 13 , the data including the position ρ of the endoscope 2 , which indicates the position of the imaging region, and a target rotation angle θt of the endoscope image E at each position ρ.
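The stored data can be pictured as a lookup table from ρ to θt. The θt values below follow the angles quoted in this embodiment (O-point 0°, B-point −10°, A-point 100°, D-point 90°); the ρ values and the interpolated third-region entry are illustrative assumptions:

```python
# position rho (degrees about P1) -> target rotation angle theta_t (degrees)
angle_table = {
    0.0:    0.0,   # O-point: aorta horizontal
    20.0:  -10.0,  # B-point: end of the first region
    60.0:   45.0,  # M-point in the third region (interpolated)
    100.0: 100.0,  # A-point: start of the second region
    120.0:  90.0,  # D-point: pelvis in the upper part
}
```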
  • the surgeon switches from the manual mode to the autonomous mode and performs treatment on the aorta F and the pelvis G with the surgical instrument 6 .
  • the processor 11 rotates the rotary joint 3 c so as to match the rotation angle ω of the endoscope 2 with the initial rotation angle 0° and causes the endoscope 2 to follow the tip of the surgical instrument 6 by controlling the moving device 3 while keeping the rotation angle ω at 0° (SC 3 ).
  • the processor 11 controls the vertical direction of the endoscope image E displayed on the display screen 5 a (SC 4 to SC 10 ).
  • the processor 11 sequentially receives the rotation angles of the joints 3 b and 3 c from the moving device 3 and calculates the current position ρ of the endoscope 2 from the rotation angles of the joints 3 b and 3 c (SC 1 ).
  • the processor 11 determines which one of the first region, the second region, and the third region includes the current imaging region on the basis of the current position of the endoscope 2 , the first position information, and the second position information (SC 4 , SC 6 , SC 8 ).
  • if the processor 11 determines that the current imaging region is included in the first region (YES at SC 4 ), the processor 11 rotates the endoscope image E in the plane of the endoscope image E on the basis of the first rotation angle information stored in the storage unit 13 (SC 5 ). Specifically, the processor 11 reads the target rotation angle θt of the current position ρ from the storage unit 13 and rotates the endoscope image E by the target rotation angle θt through image processing. The processor 11 then outputs the rotated endoscope image E from the controller 1 to the display device 5 and displays the image on the display screen 5 a (SC 10 ).
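All three branches (SC 5 , SC 7 , SC 9 ) end in the same read-and-rotate step; only which stored information is consulted differs. A minimal sketch of the rotation through image processing, reusing the angle_table above and assuming OpenCV:

```python
import cv2

def rotate_for_display(frame, rho, angle_table):
    """Rotate the endoscope image in its own plane by the target rotation
    angle stored for the position rho nearest the current one."""
    nearest = min(angle_table, key=lambda r: abs(r - rho))
    theta_t = angle_table[nearest]
    h, w = frame.shape[:2]
    M = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), theta_t, 1.0)
    return cv2.warpAffine(frame, M, (w, h))  # then output to the display (SC 10)
```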
  • as a result, the aorta F in the endoscope image E displayed on the display screen 5 a is kept in a horizontal position. For example, if the endoscope 2 pivots 20° from O-point to B-point about the pivot axis P 1 , the endoscope image E rotates from 0° to −10°.
  • if the processor 11 determines that the current imaging region is included in the second region (NO at SC 4 and YES at SC 6 ), the processor 11 rotates the endoscope image E in the plane of the endoscope image E on the basis of the second rotation angle information stored in the storage unit 13 (SC 7 ).
  • the processor 11 reads the target rotation angle θt of the current position ρ from the storage unit 13 and rotates the endoscope image E by the target rotation angle θt through image processing.
  • the processor 11 then outputs the rotated endoscope image E from the controller 1 to the display device 5 and displays the image on the display screen 5 a (SC 10 ).
  • as a result, the pelvis G in the endoscope image E displayed on the display screen 5 a is kept in the upper part. For example, if the endoscope 2 pivots 20° from A-point to D-point about the pivot axis P 1 , the endoscope image E rotates from 100° to 90°.
  • if the processor 11 determines that the current imaging region is included in the third region (SC 8 ), the processor 11 rotates the endoscope image E in the plane of the endoscope image E on the basis of the third rotation angle information stored in the storage unit 13 (SC 9 ). Specifically, the processor 11 reads the target rotation angle θt of the current position ρ from the storage unit 13 and rotates the endoscope image E by the target rotation angle θt through image processing. The processor 11 then outputs the rotated endoscope image E from the controller 1 to the display device 5 and displays the image on the display screen 5 a (SC 10 ).
  • the endoscope image E displayed on the display screen 5 a is rotated by the target rotation angle θt corresponding to the position ρ.
  • the target rotation angle θt gradually changes from the target rotation angle of the first region to the target rotation angle of the second region as the position ρ changes from the first region to the second region.
  • the endoscope image E displayed on the display screen 5 a rotates from −10° to 100° in one direction.
  • the storage unit 13 stores the first position information on the first region including a specific tissue F and the first rotation angle information for defining the target rotation angle θt of the endoscope image E, the target rotation angle θt being defined for placing the specific tissue F at a desired rotation angle by the surgeon. Furthermore, the storage unit 13 stores the second position information on the second region including a specific tissue G and the second rotation angle information for defining the target rotation angle θt of the endoscope image E, the target rotation angle θt being defined for placing the specific tissue G at a desired rotation angle by the surgeon.
  • the target rotation angle θt that gradually changes between the target rotation angle θt of the first rotation angle information and the target rotation angle θt of the second rotation angle information is interpolated and is stored in the storage unit 13 .
  • the endoscope image E is rotated by the target rotation angle θt corresponding to the position ρ of the current imaging region, thereby automatically adjusting the vertical direction of the endoscope image E.
  • if the current imaging region is the first or second region, the endoscope image E is automatically rotated by the target rotation angle θt that places the specific tissues F and G at a predetermined rotation angle; if the current imaging region is the third region, the endoscope image E is automatically rotated by a proper target rotation angle θt that is estimated from the first and second rotation angle information.
  • the operator can be provided with the endoscope image E in a proper vertical direction according to the position of the current imaging region in an abdominal cavity.
  • an automatic adjustment of the vertical direction of the endoscope image E can relieve the stress of the surgeon and shorten the treatment time. Specifically, if the surgeon adjusts the vertical direction of the endoscope image E, the surgeon needs to take a hand off the surgical instrument 6 during an operation and then manually rotate the endoscope 2 . According to the present embodiment, the surgeon does not need to operate the endoscope 2 to adjust the vertical direction, so the surgeon can continue treatment without being interrupted.
  • the present embodiment is different from the first embodiment in that a processor 11 rotates an endoscope image E by a rotation of an endoscope 2 instead of image processing.
  • configurations different from those of the first embodiment will be described. Configurations in common with the first embodiment are indicated by the same reference numerals and an explanation thereof is omitted.
  • An endoscope system 10 includes a controller 1 , the endoscope 2 , a moving device 3 , an endoscope processor 4 , and a display device 5 as in the first embodiment.
  • FIGS. 8 A and 8 B indicate a control method performed by the processor 11 in the present embodiment.
  • the control method includes step SB 2 of determining the initial position of the endoscope 2 , steps SB 3 and SB 4 ′ of determining first position information and first rotation angle information on a first region in a subject, steps SB 5 and SB 6 ′ of determining second position information and second rotation angle information on a second region in the subject, steps SB 7 and SB 8 ′ of determining third position information and third rotation angle information on a third region in the subject, step SB 9 of storing the position information and the rotation angle information in the storage unit 13 , steps SC 4 to SC 9 ′ of rotating the endoscope image E according to a current imaging region that is currently being imaged by the endoscope 2 , and step SC 10 of outputting the rotated endoscope image E to the display device 5 .
  • steps SB 2 to SB 9 are performed in a manual mode.
  • steps SC 4 to SC 10 are performed in an autonomous mode.
  • a user performs steps SA 1 to SA 5 .
  • the processor 11 determines the first position information and the first rotation angle information on the first region on the basis of the endoscope image E (SB 3 , SB 4 ′).
  • in step SB 4 ′ subsequent to step SB 3 , the processor 11 sets the endoscope image E and a rotation angle ω of the endoscope 2 at the time of the reception of the first instruction by the user interface 16 , as a first reference endoscope image and a first reference rotation angle.
  • the processor 11 calculates a required rotation amount Δθ of the endoscope image E, which is obtained at another position ρ included in the first position information, when an aorta F in the endoscope image E is to be aligned with the aorta F in the first reference endoscope image. Subsequently, the processor 11 calculates a target rotation angle ωt of the endoscope 2 at another position ρ by adding the rotation amount Δθ to the first reference rotation angle.
  • The processor 11 determines the second position information and the second rotation angle information on the second region on the basis of the endoscope image E (SB5, SB6′).
  • In step SB6′ subsequent to step SB5, the processor 11 sets the endoscope image E and the rotation angle ω of the endoscope 2 at the time of the reception of the second instruction by the user interface 16 as a second reference endoscope image and a second reference rotation angle.
  • The processor 11 calculates a required rotation amount Δθ of the endoscope image E, which is obtained at another position φ included in the second position information, when the pelvis G in the endoscope image E is to be aligned with the pelvis G in the second reference endoscope image. Subsequently, the processor 11 calculates a target rotation angle ωt of the endoscope 2 at another position φ by adding the rotation amount Δθ to the second reference rotation angle.
  • In step SB9, the processor 11 stores the position information and the rotation angle information, which are determined in steps SB3, SB4′, SB5, SB6′, SB7, and SB8′, in the storage unit 13.
  • Thus, data is generated in the storage unit 13, the data including the rotation angle φ of the endoscope 2 and the target rotation angle ωt of the endoscope 2 at each rotation angle φ, the rotation angle φ indicating a position of the imaging region.
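  • For illustration, the data generated in step SB9 can be pictured as a lookup table from the position φ to the target rotation angle ωt. The following is a minimal Python sketch, assuming hypothetical names, illustrative angle values, and linear interpolation as one possible way of estimating angles between the stored samples:

        # Hypothetical rotation-angle table: position phi (deg, about pivot axis P1)
        # -> target rotation angle omega_t (deg, about the optical axis C).
        rotation_table = {
            0.0: 0.0,     # O-point (first reference rotation angle)
            20.0: -10.0,  # B-point (reference angle plus required rotation amount)
            70.0: 35.0,   # D-point (second reference; value is illustrative)
            90.0: 45.0,   # A-point (illustrative)
        }

        def target_rotation(phi: float) -> float:
            """Return omega_t at position phi, interpolating between stored samples."""
            keys = sorted(rotation_table)
            if phi <= keys[0]:
                return rotation_table[keys[0]]
            if phi >= keys[-1]:
                return rotation_table[keys[-1]]
            for lo, hi in zip(keys, keys[1:]):
                if lo <= phi <= hi:
                    t = (phi - lo) / (hi - lo)
                    return rotation_table[lo] + t * (rotation_table[hi] - rotation_table[lo])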
  • After step SB9, the processor 11 calculates the current position φ of the endoscope 2 (SC1).
  • The processor 11 determines which one of the first region, the second region, and the third region includes the current imaging region (SC4, SC6, SC8).
  • If the processor 11 determines that the current imaging region is included in the first region (YES at SC4), the processor 11 rotates the endoscope 2 on the basis of the first rotation angle information stored in the storage unit 13 (SC5′). Specifically, the processor 11 reads the target rotation angle ωt for the current position φ from the storage unit 13 and rotates the endoscope image E by rotating the endoscope 2 to the target rotation angle ωt.
  • If the processor 11 determines that the current imaging region is included in the second region (NO at SC4 and YES at SC6), the processor 11 rotates the endoscope 2 on the basis of the second rotation angle information stored in the storage unit 13 (SC7′). Specifically, the processor 11 reads the target rotation angle ωt for the current position φ from the storage unit 13 and rotates the endoscope image E by rotating the endoscope 2 to the target rotation angle ωt.
  • If the processor 11 determines that the current imaging region is included in the third region (NO at SC6 and YES at SC8), the processor 11 rotates the endoscope 2 on the basis of the third rotation angle information stored in the storage unit 13 (SC9′). Specifically, the processor 11 reads the target rotation angle ωt for the current position φ from the storage unit 13 and rotates the endoscope image E by rotating the endoscope 2 to the target rotation angle ωt.
  • After step SC5′, SC7′, or SC9′, the processor 11 outputs the rotated endoscope image E from the controller 1 to the display device 5 and displays the image on a display screen 5a (SC10).
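  • For illustration, the branch structure of steps SC4 to SC10 can be sketched as follows (a minimal Python sketch; the region boundaries and helper names are assumptions, and each branch uses its own stored rotation angle information, collapsed here because the table read is uniform):

        FIRST_REGION = (0.0, 20.0)    # phi range from the first position information
        SECOND_REGION = (70.0, 90.0)  # phi range from the second position information

        def rotate_endoscope_to(omega_t: float) -> None:
            print(f"rotate endoscope about the optical axis to {omega_t:.1f} deg")

        def autonomous_step(phi: float, omega_t: float) -> None:
            """One pass of SC4 to SC10 for the current position phi and the target
            angle omega_t read from the storage unit."""
            if FIRST_REGION[0] <= phi <= FIRST_REGION[1]:      # SC4 -> SC5'
                rotate_endoscope_to(omega_t)
            elif SECOND_REGION[0] <= phi <= SECOND_REGION[1]:  # SC6 -> SC7'
                rotate_endoscope_to(omega_t)
            else:                                              # SC8 -> SC9' (third region)
                rotate_endoscope_to(omega_t)
            print("output the rotated image to the display device")  # SC10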
  • As described above, according to the present embodiment, the endoscope 2 is rotated to the target rotation angle ωt corresponding to the position φ of the current imaging region, thereby automatically adjusting the vertical direction of the endoscope image E as in the first embodiment.
  • If the current imaging region is the first or second region including the specific tissues F and G, the endoscope 2 is automatically rotated to the target rotation angle ωt that places the specific tissues F and G at a predetermined rotation angle.
  • If the current imaging region is the third region that does not include the specific tissues F and G, the endoscope 2 is automatically rotated to a proper target rotation angle ωt that is estimated from the first and second rotation angle information.
  • An operator can be provided with the endoscope image E in a proper vertical direction according to the position of the current imaging region in an abdominal cavity. Moreover, an automatic adjustment to the vertical direction of the endoscope image E can relieve the stress of the surgeon and shorten the treatment time.
  • Furthermore, rotating the endoscope image E by rotating the endoscope 2 about the optical axis C can eliminate the need for image processing for rotating the endoscope image E, thereby reducing the load on the processor 11.
  • Moreover, the user can intuitively recognize the vertical direction of the endoscope image E by confirming the target rotation angle ωt of a portion of the endoscope 2 outside the body.
  • In the present embodiment, the endoscope image E is rotated by rotating the overall endoscope 2 about the optical axis C.
  • Alternatively, an image sensor 2a may be rotated about the optical axis C while the rotation angle ω of the endoscope 2 about the optical axis C is kept.
  • In this case, the endoscope 2 includes a rotating mechanism for rotating the image sensor 2a.
  • A rotation of the image sensor 2a relative to the body of the endoscope 2 can rotate the endoscope image E like a rotation of the overall endoscope 2.
  • (Third Embodiment) The present embodiment is different from the first and second embodiments in that an endoscope image E is rotated by a combination of a rotation of an endoscope 2 about an optical axis C and image processing.
  • Hereinafter, configurations different from those of the first and second embodiments will be described. Configurations in common with the first and second embodiments are indicated by the same reference numerals and an explanation thereof is omitted.
  • An endoscope system 10 includes a controller 1, the endoscope 2, a moving device 3, an endoscope processor 4, and a display device 5 as in the first embodiment.
  • FIG. 9 indicates a control method performed by a processor 11 in an autonomous mode in the present embodiment.
  • The control method according to the present embodiment includes step SC11 of determining whether a rotation angle ω of the endoscope 2 has reached a predetermined critical angle and step SC12 of rotating the endoscope image E by image processing, in addition to steps SB2, SB3, SB4′, SB5, SB6′, SB7, SB8′, SB9, SC1 to SC4, SC5′, SC6, SC7′, SC8, and SC9′ that are described in the second embodiment.
  • After step SB9, the processor 11 calculates the current position φ of the endoscope 2 (SC1).
  • The processor 11 performs steps SC1 to SC4, SC5′, SC6, SC7′, SC8, and SC9′ as in the second embodiment.
  • The processor 11 determines whether the rotation angle ω of the endoscope 2 has reached the critical angle of the rotatable range of the endoscope 2 on the basis of a rotation angle detected by an angle sensor 3d at a rotary joint 3c (SC11).
  • The rotatable range of the endoscope 2 may be limited by physical constraints or the like. For example, a cable in the endoscope 2 and the moving device 3 is twisted by a rotation of the endoscope 2, and thus the rotatable range of the endoscope 2 is set so as not to cause an excessive twist.
  • If the endoscope 2 rotates to a target rotation angle ωt before the rotation angle ω reaches the critical angle (NO at SC11), the processor 11 outputs the rotated endoscope image E to the display device 5 (SC10).
  • If the rotation angle ω reaches the critical angle before the endoscope 2 reaches the target rotation angle ωt (YES at SC11), the processor 11 stops the rotation of the endoscope 2 at the critical angle, rotates the endoscope image E through image processing by the remaining rotation angle needed to reach the target rotation angle ωt (SC12), and outputs the rotated endoscope image E to the display device 5 (SC10).
  • Thus, the endoscope image E can be rotated by a combination of a rotation of the endoscope 2 about the optical axis C and image processing even if the endoscope image E is hard to rotate by a rotation of the endoscope 2 alone.
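  • For illustration, the division of labor in steps SC11 and SC12 can be sketched as follows (a minimal Python sketch; the critical angles and names are assumptions):

        CRITICAL_MIN, CRITICAL_MAX = -180.0, 180.0  # rotatable range limited by, e.g., cable twist

        def split_rotation(omega_t: float) -> tuple[float, float]:
            """Split the target rotation into a mechanical part (rotation of the
            endoscope, stopped at the critical angle) and a digital remainder
            (rotation of the image by image processing)."""
            mechanical = max(CRITICAL_MIN, min(CRITICAL_MAX, omega_t))
            digital = omega_t - mechanical
            return mechanical, digital

        print(split_rotation(200.0))  # -> (180.0, 20.0): stop at the critical angle (YES
                                      # at SC11) and rotate the image a further 20 deg (SC12)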
  • (First Modification) The present modification is different from the first to third embodiments in that the endoscope 2 is an oblique endoscope.
  • As illustrated in FIG. 10, the oblique endoscope 2 includes a long insertion portion 2b that is inserted into a subject along the longitudinal axis I, and an imaging portion 2c that includes the image sensor 2a and is connected to the proximal end of the insertion portion 2b.
  • The insertion portion 2b and the imaging portion 2c are integrally rotated about the longitudinal axis I by a rotation of the rotary joint 3c.
  • A camera head (the imaging portion 2c) and an optical visual tube (the insertion portion 2b) may have different pieces of rotation angle information. In the present modification, however, the camera head and the optical visual tube are integrally rotated to perform processing using common rotation angle information.
  • In a direct-view endoscope, the visual axis (optical axis) C is coaxial with the longitudinal axis I, so that the position of the visual axis C is kept even if the endoscope 2 rotates about the longitudinal axis I. In the oblique endoscope 2, by contrast, the visual axis C tilts with respect to the longitudinal axis I and thus makes a rotational movement about the longitudinal axis I in response to a rotation of the endoscope 2 about the longitudinal axis I, thereby moving the imaging region.
  • FIGS. 11A and 11B indicate a control method performed by the processor 11 in the present modification. As indicated in FIGS. 11A and 11B, the control method according to the present modification includes steps SB2′ and SB3 to SB9 and steps SC3′ and SC4 to SC10.
  • The orientation of the endoscope 2 is a rotation angle about the longitudinal axis I and corresponds to the orientation of the visual axis C with respect to the longitudinal axis I.
  • The processor 11 determines the first position information and the first rotation angle information (SB3, SB4) and holds information on a first orientation of the endoscope 2 when the first instruction is received.
  • The processor 11 determines the second position information and the second rotation angle information (SB5, SB6) and holds information on a second orientation of the endoscope 2 when the second instruction is received.
  • In step SB9, the processor 11 stores the first orientation and the second orientation in the storage unit 13 in addition to the position information and the rotation angle information.
  • Thus, data is generated in the storage unit 13, the data including a rotation angle φ of the endoscope 2, the target rotation angle θt of the endoscope image E at each rotation angle φ, and the first orientation and the second orientation of the endoscope 2, the rotation angle φ indicating the position of the imaging region, the first and second orientations corresponding to each imaging region.
  • In the autonomous mode, the processor 11 controls the position and orientation of the endoscope 2 by controlling the moving device 3 and causes the endoscope 2 to follow the tip of the surgical instrument 6 (SC3′).
  • The processor 11 controls the position and orientation of the endoscope 2 on the basis of the first and second position information and the first and second orientations stored in the storage unit 13, so that the orientation of the endoscope 2 is controlled to the first orientation when the imaging region is included in the first region, whereas the orientation of the endoscope 2 is controlled to the second orientation when the imaging region is included in the second region.
  • The processor 11 rotates the endoscope image E by the target rotation angle θt according to the current imaging region through image processing (SC4 to SC9).
  • In the oblique endoscope 2, the imaging region is moved by a rotation of the endoscope 2 about the longitudinal axis I. Therefore, the vertical direction of the endoscope image E is hard to control only by the control method of the second embodiment, in which the endoscope image E is rotated by a rotation of the endoscope 2.
  • According to the present modification, the first orientation of the endoscope 2 at the time of imaging of the first region and the second orientation of the endoscope 2 at the time of imaging of the second region are stored.
  • When the first region is imaged, the orientation of the endoscope 2 is controlled to the first orientation and the vertical direction of the endoscope image E is adjusted by a rotation through image processing.
  • When the second region is imaged, the orientation of the endoscope 2 is controlled to the second orientation and the vertical direction of the endoscope image E is adjusted by a rotation through image processing. This can properly control the vertical direction of the endoscope image E captured by the oblique endoscope 2.
  • (Second Modification) The present modification is different from the first to third embodiments in that the endoscope 2 has a curved portion 2d.
  • As illustrated in FIG. 12, the endoscope 2 includes the long insertion portion 2b that is inserted into a subject and the curved portion 2d that is provided at the tip portion of the insertion portion 2b and can be curved in a direction that crosses the longitudinal axis I of the insertion portion 2b.
  • Also in this endoscope 2, the visual axis C tilts with respect to the longitudinal axis I and thus makes a rotational movement about the longitudinal axis I in response to a rotation of the endoscope 2 about the longitudinal axis I, thereby moving the imaging region. Moreover, the tilt direction and the tilt angle of the visual axis C with respect to the longitudinal axis I change according to the curving direction and the curving angle of the curved portion 2d.
  • The control method performed by the processor 11 in the present modification includes steps SB2′ and SB3 to SB9 and steps SC3′ and SC4 to SC10, as in the first modification.
  • As the orientation of the endoscope 2, the curving direction and the curving angle of the curved portion 2d are used instead of the rotation angle about the longitudinal axis I.
  • In step SB2′, the processor 11 sets the current curving direction and curving angle of the curved portion 2d as an initial orientation.
  • In step SB9, the curving direction and the curving angle of the curved portion 2d at the time of the reception of the first instruction are stored as a first orientation in the storage unit 13 by the processor 11, and the curving direction and the curving angle of the curved portion 2d at the time of the reception of the second instruction are stored as a second orientation in the storage unit 13 by the processor 11.
  • In step SC3′ of the autonomous mode, the processor 11 controls the position and orientation of the endoscope 2 on the basis of the first and second position information and the first and second orientations stored in the storage unit 13, so that the curving direction and the curving angle of the curved portion 2d are controlled to the first orientation when the imaging region is included in the first region, whereas the curving direction and the curving angle of the curved portion 2d are controlled to the second orientation when the imaging region is included in the second region.
  • In this endoscope 2, the imaging region makes a rotational movement by a rotation of the endoscope 2 according to the curving direction and the curving angle of the curved portion 2d. Therefore, the vertical direction of the endoscope image E is hard to control by the control method of the second embodiment, in which the endoscope image E is rotated by a rotation of the endoscope 2.
  • According to the present modification, when the first region is imaged, the orientation of the endoscope 2 is controlled to the first orientation stored in the manual mode and the vertical direction of the endoscope image E is adjusted by a rotation through image processing, as in the first modification. When the second region is imaged, the orientation of the endoscope 2 is controlled to the second orientation stored in the manual mode and the vertical direction of the endoscope image E is adjusted by a rotation through image processing. This can properly control the vertical direction of the endoscope image E captured by the endoscope 2 including the curved portion 2d.
  • In the embodiments described above, the processor 11 calculates the third rotation angle information in the manual mode and stores the information in the storage unit 13.
  • Alternatively, the processor 11 may calculate the third rotation angle information in real time during the autonomous mode (SC13). In other words, the processor 11 does not need to determine or store the third position information and the third rotation angle information in the manual mode.
  • In this case, the third region is assumed to be a region other than the first region and the second region.
  • If the current imaging region is included in the third region, the processor 11 may calculate the target rotation angle θt or ωt at the current position φ of the endoscope 2 in real time on the basis of the current position φ, the first position information, the first rotation angle information, the second position information, and the second rotation angle information (SC13). If the current imaging region is included in one of the first region and the second region (that is, not included in the third region), the processor 11 may match the target rotation angle θt or ωt with the first rotation angle information or the second rotation angle information without calculating the target rotation angle θt or ωt in real time. This can reduce the amount of position information and rotation angle information to be stored in the storage unit 13 during the manual mode and only requires the calculation of the third position information and the third rotation angle information that are required for an operation in the autonomous mode, thereby reducing the load on the system.
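  • For illustration, the real-time calculation in step SC13 can be sketched as follows (a minimal Python sketch; linear interpolation is an assumed scheme, since the description only states that the value is calculated from the first and second position and rotation angle information):

        def realtime_third_angle(phi, first, second):
            """Estimate the target angle at position phi from the stored
            (position, angle) pairs of the first and second regions."""
            (phi1, theta1), (phi2, theta2) = first, second
            t = (phi - phi1) / (phi2 - phi1)
            return theta1 + t * (theta2 - theta1)

        # Illustrative values: the first region ends at phi=20 deg with a target angle
        # of -10 deg and the second region starts at phi=70 deg with 35 deg.
        print(realtime_third_angle(45.0, (20.0, -10.0), (70.0, 35.0)))  # -> 12.5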
  • The processor 11 may update the stored first position information or second position information, or the stored first rotation angle information or second rotation angle information, to the current position information and rotation angle information.
  • The endoscope 2 is moved after the update. If it is determined that the current imaging region is included in the first region or the second region, the updated first position information, second position information, first rotation angle information, and second rotation angle information can be used.
  • For example, the user may provide an instruction to update from the user interface 16.
  • Thus, the position information and the rotation angle information can be updated to correct information according to the current circumstances.
  • In the embodiments described above, the processor 11 recognizes a specific tissue in the endoscope image E and determines the position information and the rotation angle information on the basis of the recognized specific tissue.
  • Alternatively, the position information and the rotation angle information may be determined on the basis of the position φ and the rotation angle ω of the endoscope 2 at the time of the reception of the instruction.
  • Specifically, the surgeon places the endoscope 2 at a desired position at a desired rotation angle ω and inputs the first instruction.
  • The processor 11 determines, as the first position information, a range around the position φ of the endoscope 2 at the time of the reception of the first instruction by the user interface 16 and determines, as the first rotation angle information, the rotation angle ω of the endoscope 2 at the time of the reception of the first instruction by the user interface 16.
  • Similarly, the surgeon places the endoscope 2 at another desired position at a desired rotation angle ω and inputs the second instruction.
  • The processor 11 determines, as the second position information, a range around the position φ of the endoscope 2 at the time of the reception of the second instruction by the user interface 16 and determines, as the second rotation angle information, the rotation angle ω of the endoscope 2 at the time of the reception of the second instruction by the user interface 16.
  • Thus, the surgeon can register any regions in a subject as the first region and the second region, thereby determining position information and rotation angle information that are further adapted to the feeling of the surgeon. Also, when the first and second regions do not include a specific tissue, any position information and rotation angle information can be determined and stored for the first and second regions without performing the processing of the learned model 1b.
  • The determination of the position information and the rotation angle information on the basis of a specific tissue in the endoscope image E may be used in combination with the determination of the position information and the rotation angle information on the basis of the position φ and the rotation angle ω of the endoscope 2 at the time of the reception of an instruction.
  • The processor 11 may further determine position information and rotation angle information on any region different from the first and second regions on the basis of an instruction of the surgeon.
  • In the embodiments described above, the specific tissues are the aorta F and the pelvis G. Alternatively, the specific tissues may be any organs or tissues having anatomical characteristics. For example, a uterus may be used.
  • In the embodiments described above, the position information and the rotation angle information on the two regions are stored. Position information and rotation angle information on three or more regions may be stored instead. This can improve accuracy when position information and rotation angle information are calculated on the basis of the stored information.
  • In the embodiments described above, the position φ of the endoscope 2, which indicates the position of the imaging region, is expressed by a two-dimensional polar coordinate system with the pivot point H serving as an origin. Alternatively, the position may be expressed by a three-dimensional polar coordinate system.
  • For example, the endoscope 2 may be supported so as to pivot about a second pivot axis P2 that passes through the pivot point H and is orthogonal to the first pivot axis P1, and the position of the imaging region may be expressed as (φ1, φ2), where φ1 is a rotation angle about the first pivot axis P1 and φ2 is a rotation angle about the second pivot axis P2. In this case, the first position information, the second position information, and the third position information are three-dimensional information including the rotation angles φ1 and φ2.
  • The position of the imaging region may be expressed by other kinds of coordinate systems instead of a polar coordinate system. For example, the position of the imaging region may be expressed by a Cartesian coordinate system with the hole H serving as an origin.
  • In the embodiments described above, the coordinate system of the position φ of the imaging region is a global coordinate system fixed relative to a subject. Alternatively, a relative coordinate system for the tip of the endoscope 2 may be used.
  • In the embodiments described above, the first and second position information are determined in the manual mode and are stored in the storage unit 13. Alternatively, the first and second position information may be stored in advance in the storage unit 13 before a surgical operation.
  • For example, an examination image of a range including an affected part, for example, a CT image of an abdominal region, may be captured before the surgical operation. Three-dimensional reconstruction of multiple CT images generates a three-dimensional image of the abdominal cavity.
  • The first and second position information may be determined and stored in the storage unit 13 on the basis of such a three-dimensional image before the surgical operation. In this case, steps SB3 and SB5 are omitted in the manual mode.
  • This configuration can reduce the computational complexity of the processor 11 in the manual mode.
  • The processor 11 in the manual mode may store a first endoscope image and a second endoscope image in the storage unit 13, the first endoscope image being the endoscope image E of the first region and the second endoscope image being the endoscope image E of the second region.
  • The processor 11 stores at least one endoscope image E, in which the aorta F is recognized, as the first endoscope image in the storage unit 13.
  • The processor 11 stores at least one endoscope image E, in which the pelvis G is recognized, as the second endoscope image in the storage unit 13.
  • The processor 11 in the autonomous mode may determine which one of the first region, the second region, and the third region includes the current imaging region on the basis of the first endoscope image and the second endoscope image. In other words, the processor 11 compares the current endoscope image E with the first endoscope image and the second endoscope image. The processor 11 determines that the current imaging region is included in the first region in the presence of a first endoscope image identical or similar to the current endoscope image E. The processor 11 determines that the current imaging region is included in the second region in the presence of a second endoscope image identical or similar to the current endoscope image E.
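  • For illustration, the comparison against the stored first and second endoscope images can be sketched as follows (a minimal Python sketch using NumPy; normalized cross-correlation and the threshold are assumptions, since the description does not fix a similarity measure):

        import numpy as np

        def ncc(a: np.ndarray, b: np.ndarray) -> float:
            """Normalized cross-correlation of two equally sized grayscale images."""
            a = (a - a.mean()) / (a.std() + 1e-9)
            b = (b - b.mean()) / (b.std() + 1e-9)
            return float((a * b).mean())

        def classify_region(current, first_images, second_images, threshold=0.8):
            """Return which region the current imaging region is judged to belong to."""
            if any(ncc(current, img) >= threshold for img in first_images):
                return "first"
            if any(ncc(current, img) >= threshold for img in second_images):
                return "second"
            return "third"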
  • The processor 11 may read information on the rotation angle of a specific tissue from a database 1c stored in the storage unit 13 and then rotate the endoscope image E on the basis of the read information on the rotation angle. Here, the rotation angle is an angle around the center point of the endoscope image E. This configuration can rotate the endoscope image E such that a specific tissue in the endoscope image E is placed at a predetermined rotation angle.
  • Registered in the database 1c are the type of at least one specific tissue other than the aorta F and the pelvis G, and the rotation angle for that type of specific tissue. The processor 11 recognizes a specific tissue in the endoscope image E, reads the rotation angle of the specific tissue from the database 1c, and rotates the endoscope image E such that the specific tissue is placed at the rotation angle.
  • For example, a uterus J as a specific tissue is preferably placed in an upper part of the endoscope image E, and thus 90°, equivalent to the 12 o'clock position, is registered as the rotation angle of the uterus J. When the uterus J is recognized in the endoscope image E, the processor 11 rotates the endoscope image E such that the recognized uterus J is placed at the position of 90°. Thus, the vertical direction of the endoscope image E is automatically adjusted such that the uterus J is placed at the position of 90°.
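  • For illustration, the use of the database 1c can be sketched as follows (a minimal Python sketch; the table contents and the detected angle are hypothetical, with angles measured around the center point of the image and 90° corresponding to the 12 o'clock position):

        # Hypothetical contents of the database 1c: preferred rotation angle per tissue.
        database_1c = {"uterus": 90.0}

        def image_rotation_for(tissue: str, detected_angle: float) -> float:
            """Rotation to apply to the image so that the recognized tissue moves
            from its detected angle to its registered angle."""
            return database_1c[tissue] - detected_angle

        print(image_rotation_for("uterus", 30.0))  # rotate the image by 60 deg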
  • In the embodiments described above, the rotation of the endoscope image E is controlled on the basis of the specific tissues F and G in the endoscope image E. Additionally, the rotation of the endoscope image E may be controlled on the basis of the surgical instrument 6 in the endoscope image E.
  • In this case, the processor 11 can operate in a first rotation mode for controlling the rotation of the endoscope image E on the basis of the specific tissues F and G and in a second rotation mode for controlling the rotation of the endoscope image E on the basis of the surgical instrument 6.
  • A user, for example, a surgeon, can switch between the first rotation mode and the second rotation mode by using the user interface 16.
  • In the second rotation mode, the processor 11 detects the angle of the surgical instrument 6 in the current endoscope image E, rotates the endoscope image E by a rotation of the endoscope 2 or by image processing such that the angle of the surgical instrument 6 is equal to a predetermined target angle, outputs the rotated endoscope image E to the display device 5, and displays the image on the display screen 5a.
  • The angle of the surgical instrument 6 is, for example, the angle of the longitudinal axis of the shaft of the surgical instrument 6 with respect to the horizontal direction of the endoscope image E.
  • The surgeon optionally switches from the first rotation mode to the second rotation mode such that the surgical instrument 6 in the endoscope image E can be displayed at the target angle on the display screen 5a.
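  • For illustration, the second rotation mode can be sketched as follows (a minimal Python sketch; how the shaft is detected is left open in the description, so the two shaft points are hypothetical inputs):

        import math

        def instrument_angle(tip, proximal):
            """Angle of the instrument shaft with respect to the image horizontal,
            computed from two points on the shaft in image coordinates."""
            dx, dy = tip[0] - proximal[0], tip[1] - proximal[1]
            return math.degrees(math.atan2(dy, dx))

        def rotation_to_target(tip, proximal, target_angle=0.0):
            """Image rotation that brings the shaft to the target angle."""
            return target_angle - instrument_angle(tip, proximal)

        print(rotation_to_target((320, 200), (120, 300)))  # e.g. about 26.57 deg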
  • In the embodiments described above, the surgeon manually operates the surgical instrument 6 held with his/her hand.
  • Alternatively, the surgical instrument 6 may be held and controlled by a second moving device 31 that is different from the moving device 3.
  • In this case, the controller 1 may acquire position information on the endoscope 2 and the surgical instrument 6 from the moving device 3 for moving the endoscope 2 and the second moving device 31 for moving the surgical instrument 6.
  • The second moving device 31 holds the surgical instrument 6 with a robot arm or an electric holder and three-dimensionally changes the position and orientation of the surgical instrument 6 under the control of a controller 101.
  • The surgical instrument 6 may be connected to the tip of the robot arm and integrated with the robot arm, or the surgical instrument 6 may be a separate instrument held by the robot arm.

Abstract

An endoscope system includes an endoscope, a moving device that moves the endoscope, a storage unit, and a processor. The storage unit stores first position information and first rotation angle information, which defines a rotation angle of an endoscope image of a first region, on the first region in a subject and second position information and second rotation angle information, which defines a rotation angle of an endoscope image of a second region, on the second region in the subject. The processor calculates third rotation angle information on a third region in the subject based on the first and second position information, the first and second rotation angle information, and third position information on the third region. If the third region includes the current imaging region, the processor rotates the endoscope image based on the third rotation angle information and outputs the rotated endoscope image to a display device.

Description

    TECHNICAL FIELD
  • The present invention relates to an endoscope system, a controller, a control method, and a recording medium.
  • The present application claims priority to U.S. Provisional Application No. 63/076,408, filed on Sep. 10, 2020, and is a continuation of International Application No. PCT/JP2021/033210; both applications are hereby incorporated herein by reference in their entirety.
  • BACKGROUND ART
  • Conventionally, an endoscope system that controls an electric holder so as to move an endoscope held by the holder has been known (for example, see PTL 1).
  • An endoscope system of PTL 1 stores time series variations in the rotation angle of each joint of the holder in a manual mode while an operator moves the endoscope, and the endoscope system reversely reproduces the time series variations in the rotation angle of each joint in an automatic return mode. Thus, the endoscope moves reversely along a movement path in the manual mode and automatically returns to the initial position and orientation.
  • CITATION LIST Patent Literature
  • {PTL 1} Japanese Patent No. 6161687
  • SUMMARY OF INVENTION
  • An aspect of the present invention is an endoscope system including an endoscope that is inserted into a subject (into the body cavity of a patient) and captures an endoscope image in the subject; a moving device that holds the endoscope and moves the endoscope; a storage unit; and a controller including at least one processor, wherein the storage unit stores first position information and first rotation angle information on a first region in the subject and second position information and second rotation angle information on a second region different from the first region in the subject, the first rotation angle information defining a rotation angle of the endoscope image of the first region, the second rotation angle information defining a rotation angle of the endoscope image of the second region, the at least one processor calculates third rotation angle information on a third region in the subject on the basis of the first position information, the first rotation angle information, the second position information, the second rotation angle information, and third position information on the third region, the third region being different from the first and second regions, the at least one processor rotates the endoscope image on the basis of the third rotation angle information if the third region includes a current imaging region that is currently being imaged by the endoscope, and the at least one processor outputs the rotated endoscope image to a display device.
  • Another aspect of the present invention is a controller configured to control an endoscope image that is captured by an endoscope and is displayed on a display device, the controller including: a storage unit; and at least one processor, wherein the storage unit stores first position information and first rotation angle information on a first region in a subject and second position information and second rotation angle information on a second region different from the first region in the subject, the first rotation angle information defining a rotation angle of the endoscope image of the first region, the second rotation angle information defining a rotation angle of the endoscope image of the second region, the at least one processor calculates third rotation angle information on a third region in the subject on the basis of the first position information, the first rotation angle information, the second position information, the second rotation angle information, and third position information on the third region, the third region being different from the first and second regions, and the at least one processor rotates the endoscope image on the basis of the third rotation angle information if the third region includes a current imaging region that is currently being imaged by the endoscope, and the at least one processor outputs the rotated endoscope image to the display device.
  • Another aspect of the present invention is a control method for controlling an endoscope image that is captured by an endoscope and is displayed on a display device, by using first position information and first rotation angle information on a first region in a subject and second position information and second rotation angle information on a second region different from the first region in the subject, the first rotation angle information defining a rotation angle of the endoscope image of the first region, the second rotation angle information defining a rotation angle of the endoscope image of the second region, the control method including the steps of: calculating third rotation angle information on a third region in the subject on the basis of the first position information, the first rotation angle information, the second position information, the second rotation angle information, and third position information on the third region, the third region being different from the first and second regions; rotating the endoscope image on the basis of the third rotation angle information if the third region includes a current imaging region that is currently being imaged by the endoscope, and outputting the rotated endoscope image to the display device.
  • Another aspect of the present invention is a computer-readable non-transitory recording medium in which a control program for causing a computer to perform the control method is recorded.
  • BRIEF DESCRIPTION OF DRAWINGS
  • FIG. 1A is an appearance illustrating the overall configuration of an endoscope system.
  • FIG. 1B is an explanatory drawing of a movement of an endoscope inserted in an abdominal cavity.
  • FIG. 1C illustrates the tip portion of a robot arm and the endoscope.
  • FIG. 2 is a block diagram illustrating the overall configuration of the endoscope system.
  • FIG. 3A is a sequence diagram of a control method according to a first embodiment and an explanatory drawing of a user operation and the processing of a processor in a manual mode.
  • FIG. 3B is a flowchart of the control method according to the first embodiment and an explanatory drawing of the processing of the processor in an autonomous mode.
  • FIG. 4A is an explanatory drawing of an endoscope operation in the step of determining first position information and first rotation angle information.
  • FIG. 4B is an explanatory drawing of an endoscope operation in the step of determining second position information and second rotation angle information.
  • FIG. 5A illustrates an endoscope image at O-point.
  • FIG. 5B illustrates an endoscope image at B-point.
  • FIG. 5C illustrates the endoscope image of FIG. 5B when the vertical direction is adjusted by a rotation.
  • FIG. 6A illustrates an endoscope image at A-point.
  • FIG. 6B illustrates the endoscope image of FIG. 6A when the vertical direction is adjusted by a rotation.
  • FIG. 7 indicates position information and rotation angle information that are stored in a storage unit in the manual mode.
  • FIG. 8A is a sequence diagram of a control method according to a second embodiment and an explanatory drawing of a user operation and the processing of a processor in a manual mode.
  • FIG. 8B is a flowchart of the control method according to the second embodiment and an explanatory drawing of the processing of the processor in an autonomous mode.
  • FIG. 9 is a flowchart of a control method according to a third embodiment and an explanatory drawing of the processing of a processor in an autonomous mode.
  • FIG. 10 illustrates an oblique endoscope according to a first modification.
  • FIG. 11A is a sequence diagram of a control method according to the first modification and an explanatory drawing of a user operation and the processing of the processor in the manual mode.
  • FIG. 11B is a flowchart of the control method according to the first modification and an explanatory drawing of the processing of the processor in the autonomous mode.
  • FIG. 12 illustrates an endoscope with a curved portion according to a second modification.
  • FIG. 13A is a sequence diagram of a control method according to another modification and an explanatory drawing of a user operation and the processing of the processor in the manual mode.
  • FIG. 13B is a flowchart of the control method according to another modification and an explanatory drawing of the processing of the processor in the autonomous mode.
  • FIG. 14A is an appearance illustrating the overall configuration of a modification of the endoscope system in FIG. 1A.
  • FIG. 14B is an appearance illustrating the overall configuration of another modification of the endoscope system in FIG. 1A.
  • DESCRIPTION OF EMBODIMENTS (First Embodiment)
  • An endoscope system, a controller, a control method, and a recording medium according to a first embodiment of the present invention will be described below with reference to the accompanying drawings.
  • As illustrated in FIG. 1A, an endoscope system 10 according to the present embodiment is used for a surgical operation in which an endoscope 2 and at least one surgical instrument 6 are inserted into the body of a patient X serving as a subject and an affected part is treated with the surgical instrument 6 while the surgical instrument 6 is observed through the endoscope 2. The endoscope system 10 is used for, for example, laparoscopic surgery.
  • As illustrated in FIG. 1B, the endoscope 2 is inserted into the subject, for example, an abdominal cavity through a hole H formed on the body wall. Thus, the endoscope 2 is fixed to the subject, is supported by the body wall at the position of the hole H serving as a pivot point, and is pivotable about a pivot axis (first pivot axis) P1 passing through the pivot point H. In laparoscopic surgery illustrated in FIGS. 1A and 1B, the pivot axis P1 extends in the anteroposterior direction of the patient X from the abdomen to the back. The endoscope 2 pivots about the pivot axis P1 so as to move an imaging region of the endoscope 2 between a first region including an aorta F and a second region including a pelvis G.
  • The endoscope 2 and the surgical instrument 6 may be inserted into the subject through a cannula passing through the hole H. The cannula is a cylindrical instrument opened at both ends. In this case, the endoscope 2 is supported by the cannula at the position of the hole H.
  • As illustrated in FIGS. 1A and 2 , the endoscope system 10 includes the endoscope 2, a moving device 3 that holds the endoscope 2 and moves the endoscope 2 in the subject, an endoscope processor 4 that is connected to the endoscope 2 and processes an endoscope image E captured by the endoscope 2, a controller 1 that is connected to the moving device 3 and the endoscope processor 4 and controls the moving device 3, and a display device 5 that is connected to the endoscope processor 4 and displays the endoscope image E.
  • The endoscope 2 is a direct-view endoscope having a visual axis (optical axis) C coaxial with a longitudinal axis I of the endoscope 2. The endoscope 2 is, for example, a rigid endoscope. The endoscope 2 including an image sensor 2 a captures an image in a subject X, for example, an abdominal cavity and acquires the endoscope image E including the tip of the surgical instrument 6 (see FIGS. 5A to 6B). The image sensor 2 a is, for example, a three-dimensional camera provided at the tip portion of the endoscope 2 and captures a stereo image as the endoscope image E. The image sensor 2 a is an image sensor such as a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor. The image sensor 2 a generates an image of a predetermined region by converting received light from the predetermined region into an electric signal through photoelectric conversion. A stereo image as the endoscope image E is generated by performing image processing on two images with a parallax through the endoscope processor 4 or the like. In this case, the tip portion of the endoscope 2 has a stereo optical system.
  • The endoscope image E is transmitted from the endoscope 2 to the endoscope processor 4, is subjected to necessary processing in the endoscope processor 4, is transmitted from the endoscope processor 4 to the display device 5, and is displayed on a display screen 5 a of the display device 5. The display device 5 may be any display, for example, a liquid crystal display or an organic electroluminescent display. A surgeon operates the surgical instrument 6 in a body while observing the endoscope image E displayed on the display screen 5 a. The display device 5 may include an audio system, for example, a speaker.
  • In addition to the display device 5, a user terminal for communications with the controller 1 and the endoscope processor 4 via a communication network may be provided to display the endoscope image E at the terminal. The terminal is, for example, a notebook computer, a laptop computer, a tablet computer, or a smartphone but is not particularly limited thereto.
  • The moving device 3 includes a robot arm 3 a (including an electric scope holder) that holds the endoscope 2 and three-dimensionally controls the position and orientation of the endoscope 2. The moving device 3 includes a plurality of joints 3 b and 3 c that operate to move the endoscope 2 with the pivot axis P1 serving as a supporting point, thereby three-dimensionally changing the position and orientation of the endoscope 2.
  • As illustrated in FIG. 1C, the joint 3c is a rotary joint that rotates the endoscope 2 about the longitudinal axis I and is provided at the tip portion of the robot arm 3a. In response to the rotation of the joint 3c, the endoscope 2 rotates about the optical axis C coaxial with the longitudinal axis I, thereby changing the rotation angle of a subject in the endoscope image E, that is, the vertical direction of the endoscope image E.
  • The moving device 3 includes a plurality of angle sensors 3 d that detects the rotation angles of the joints 3 b and 3 c. The angle sensor 3 d is, for example, an encoder, a potentiometer, or a Hall sensor that is provided at each of the joints 3 b and 3 c.
  • As illustrated in FIG. 2 , the controller 1 includes at least one processor 11 like a central processing unit, a memory 12, a storage unit 13, an input interface 14, an output interface 15, and a user interface 16. The controller 1 may be, for example, a desktop computer, a tablet computer, a laptop computer, a smartphone, or a cellular phone.
  • The processor 11 may be a single processor, a multiprocessor, or a multicore processor. The processor 11 reads and executes a program stored in the storage unit 13.
  • The memory 12 is, for example, a semiconductor memory including a ROM (read-only memory) or RAM (Random Access Memory) area. The memory 12 may store data necessary for the processing of the processor 11 (that is, the memory 12 may operate as “storage unit”) like the storage unit 13, which will be described later.
  • The storage unit 13 is a computer-readable non-transitory recording medium, e.g., a hard disk or a nonvolatile recording medium including a semiconductor memory such as flash memory. The storage unit 13 stores various programs including a follow-up control program (not illustrated) and an image control program (control program) 1 a and data necessary for the processing of the processor 11. Processing performed by the processor 11 may be implemented by dedicated logic circuits or hardware, for example, an FPGA (Field Programmable Gate Array), a SoC (System-on-a-Chip), an ASIC (Application Specific Integrated Circuit), and a PLD (Programmable Logic Device).
  • The storage unit 13 may be a server, e.g., a cloud server connected via a communication network to the controller 1 provided with a communication interface, instead of a recording medium integrated in the controller 1. The communication network may be, for example, a public network such as the Internet, a dedicated line, or a LAN (Local Area Network). The connection of the devices may be wired connection or wireless connection.
  • The endoscope processor 4 for processing the endoscope image E may be provided with the processor 11. Specifically, like the processor 11 included in the controller 1, the endoscope processor 4 may be provided with processors, dedicated logic circuits, or hardware to perform processing like the processor 11. The processing will be described later. The endoscope processor 4 and the controller 1 may be integrated into one unit. Each of the endoscope processor 4 and the controller 1 may be provided with at least one processor.
  • Any one of the configurations of the at least one processor 11, the memory 12, the storage unit 13, the input interface 14, the output interface 15, and the user interface 16 in the controller 1 may be provided for a user terminal, aside from the endoscope processor 4 and the controller 1. The controller 1 may be integrated with the moving device 3.
  • The input interface 14 and the output interface 15 are connected to the endoscope processor 4. The controller 1 can acquire the endoscope image E from the endoscope 2 via the endoscope processor 4 and output the endoscope image E to the display device 5 via the endoscope processor 4. The input interface 14 may be directly connected to the endoscope 2 and the output interface 15 may be directly connected to the display device 5 such that the controller 1 can directly acquire the endoscope image E from the endoscope 2 and directly output the endoscope image E to the display device 5.
  • The input interface 14 and the output interface 15 are connected to the moving device 3. The controller 1 acquires, from the moving device 3, information on rotation angles detected by the angle sensors 3 d at the joints 3 b and 3 c and transmits, to the moving device 3, a control signal for driving the joints 3 b and 3 c.
  • The user interface 16 has input devices for inputs to the user interface 16 by users such as a surgeon and receives a user input. The input devices include a button, a mouse, a keyboard, and a touch panel.
  • Moreover, the user interface 16 has a means that allows a user to switch between a manual mode and an autonomous mode, which will be described later. The means is, for example, a switch.
  • The user interface 16 is configured to receive a first instruction and a second instruction from a user. The first instruction and the second instruction are instructions for causing the controller 1 to register position information and rotation angle information, which will be described later. For example, the user interface 16 has a button operated by an operator. The user interface 16 receives the first instruction in response to a first button operation and receives the second instruction in response to a second button operation.
  • The processor 11 can be operated in the manual mode or the autonomous mode.
  • The manual mode is a mode that permits users such as a surgeon to operate the endoscope 2. In the manual mode, a surgeon can manually move the endoscope 2 with a hand holding the proximal end portion of the endoscope 2. Furthermore, the surgeon can remotely operate the endoscope 2 by using an operating device connected to the moving device 3. The operating device can include a button, a joystick, and a touch panel.
  • The autonomous mode is a mode that causes the endoscope 2 to automatically follow the surgical instrument 6 by controlling the moving device 3 on the basis of the position of the surgical instrument 6 in the endoscope image E. In the autonomous mode, the processor 11 acquires the three-dimensional position of the tip of the surgical instrument 6 from the endoscope image E and controls the moving device 3 on the basis of the three-dimensional position of the tip of the surgical instrument 6 and the three-dimensional position of a predetermined target point set in the field of view of the endoscope 2. The target point is, for example, a point that is located on the optical axis C and corresponds to the center point of the endoscope image E. Thus, the controller 1 controls a movement of the endoscope 2 and causes the endoscope 2 to follow the surgical instrument 6 such that the tip of the surgical instrument 6 is disposed at the center point in the endoscope image E.
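  • For illustration, one step of this follow control can be sketched as follows (a minimal Python sketch with an assumed proportional control law; the description does not specify the actual law):

        import numpy as np

        def follow_step(tip_3d, target_3d, gain=0.5):
            """Command a camera displacement proportional to the offset between the
            instrument tip and the target point on the optical axis."""
            error = np.asarray(tip_3d, dtype=float) - np.asarray(target_3d, dtype=float)
            return gain * error  # displacement command for the moving device 3

        print(follow_step((10.0, -5.0, 80.0), (0.0, 0.0, 80.0)))  # -> [ 5.  -2.5  0. ]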
  • In the autonomous mode, by performing a control method in FIGS. 3A and 3B according to the image control program 1 a read into the memory 12, the processor 11 controls the rotation angle of the endoscope image E displayed on the display screen 5 a.
  • The control method performed by the processor 11 will be described below.
  • As indicated in FIGS. 3A and 3B, the control method according to the present embodiment includes step SB2 of setting the initial position of the endoscope 2, steps SB3 and SB4 of determining first position information and first rotation angle information on the first region in a subject, steps SB5 and SB6 of determining second position information and second rotation angle information on the second region in the subject, steps SB7 and SB8 of calculating third position information and third rotation angle information on a third region in the subject, step SB9 of storing the position information and the rotation angle information in the storage unit 13, steps SC4 to SC9 of rotating the endoscope image E according to a current imaging region that is currently being imaged by the endoscope 2, and step SC10 of outputting the rotated endoscope image E to the display device 5.
  • As indicated in FIG. 3A, steps SB2 to SB9 are performed in the manual mode. As indicated in FIG. 3B, steps SC3 to SC10 are performed in the autonomous mode.
  • A user, e.g., a surgeon, inserts the endoscope 2 held by the moving device 3 into an abdominal cavity, switches to the manual mode (SA1, SB1), and starts panning around by moving the endoscope 2 in the abdominal cavity (SA3). Panning around is an operation for observing the overall abdominal cavity to confirm the positions or the like of organs and tissues. The positions of organs and tissues vary among patients, so that the operation is required each time the endoscope is inserted. When panning around, the surgeon rotates the endoscope 2 about the pivot axis P1 so as to observe, through the endoscope 2, a range including at least two specific tissues having anatomical characteristics. In the present embodiment, the specific tissues are the aorta F and the pelvis G.
  • As indicated in FIG. 3A, the surgeon registers the initial position of the endoscope 2 in the controller 1 before panning around (SA2). For example, the surgeon places the endoscope 2 at a desired initial position and operates a predetermined button of the user interface 16. In response to the operation of the predetermined button, the processor 11 calculates a current position φ of the endoscope 2 and stores the current position φ as an initial position φ=0° in the storage unit 13 (SB2). The position φ is the position of the endoscope 2 in a circumferential direction around the pivot axis P1 and is calculated from the rotation angles detected by the angle sensors 3d at the joints 3b and 3c. The position φ represents the position of an imaging region in the circumferential direction around the pivot axis P1.
  • Subsequently, as illustrated in FIGS. 4A and 5A, the surgeon places the endoscope 2 at a position (O-point) for imaging the aorta F from the front and adjusts a rotation angle ω of the endoscope 2 about the optical axis C such that the aorta F is placed at a desired rotation angle in the endoscope image E (SA4). In this case, the rotation angle of the aorta F is a position in the circumferential direction around the center point of the endoscope image E. In the present embodiment, as illustrated in FIG. 5A, the rotation angle ω is adjusted such that the aorta F is horizontally placed in the endoscope image E. The surgeon then inputs the first instruction to the user interface 16 (SA5).
  • After the first instruction is inputted, the surgeon observes the overall aorta F through the endoscope 2 by rotating the endoscope 2 from O-point about the pivot axis P1 while keeping the rotation angle ω adjusted at O-point. As illustrated in FIGS. 5A and 5B, the aorta F makes a rotational movement in the endoscope image E as the endoscope 2 rotates from O-point to B-point. B-point is the end point of the observation range of the aorta F in the endoscope image E.
  • In response to the first instruction received by the user interface 16, the processor 11 determines, on the basis of the endoscope image E, the first position information and the first rotation angle information on the first region including the aorta (first specific tissue) F (SB3, SB4). The first rotation angle information is information that defines the rotation angle of the endoscope image E of the first region.
  • Specifically, the storage unit 13 stores a learned model 1b of machine learning of the correspondence between an image including a specific tissue and the type of the specific tissue. In step SB3, the processor 11 recognizes the aorta F in the endoscope image E by using the learned model 1b and determines, as the first position information, the range of the position φ of the endoscope 2 with the aorta F included in the endoscope image E. In other words, the first region is a region between O-point and B-point.
  • For example, the first position information is φ=0° to 20°. As described above, instead of the setting of the initial position in steps SA2 and SB2, the processor 11 may set the position φ of the endoscope 2 at the time of the reception of the first instruction to the initial position φ=0°. In other words, the initial position is determined at a time and a location as requested by the user.
  • Alternatively, the processor 11 may set the position φ of the endoscope 2 at the time of the reception of the first instruction, as the first position information without processing using the learned model 1 b. In other words, the first position information is determined at a time and a location as requested by the user.
  • Subsequently, in step SB4, the processor 11 sets the endoscope image E and the rotation angle ω of the endoscope 2 at the time of the reception of the first instruction by the user interface 16, as a first reference endoscope image and a first reference rotation angle, and the processor 11 determines the first rotation angle information on the basis of the first reference endoscope image and the first reference rotation angle.
  • Specifically, the processor 11 calculates the first reference rotation angle corresponding to a predetermined initial rotation angle ω=0°, as a target rotation angle θt of the endoscope image E at the position φ at the time of the reception of the first instruction. The calculated target rotation angle θt represents a required rotation amount of the endoscope image E when the aorta F is to be horizontally placed in the endoscope image E at the position φ at the time of the reception of the first instruction. In the present embodiment, the first reference rotation angle ω is set at the initial rotation angle 0°.
  • The processor 11 then calculates a required rotation amount Δθ of the endoscope image E, which is obtained at another position φ included in the first position information, when the aorta F in the endoscope image E is to be aligned with the aorta F in the first reference endoscope image. Subsequently, the processor 11 calculates a target rotation angle θt at another position φ by adding the rotation amount Δθ to the first reference rotation angle. The calculated target rotation angle θt represents a required rotation amount of the endoscope image E when the aorta F is to be horizontally placed in the endoscope image E at another position φ. FIG. 5C illustrates the endoscope image E of FIG. 5B when a rotation is made by the target rotation angle θt at B-point.
  • As described above, the processor 11 calculates the target rotation angle θt of the endoscope image E when the aorta F is to be horizontally placed at each position φ=0°, . . . , 20° included in the first position information, and the processor 11 determines the target rotation angle θt at each position φ=0°, . . . , 20° as the first rotation angle information. FIG. 7 only indicates representative target rotation angles θt=0°, −10° at φ=0°, 20° as the first rotation angle information.
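  • As an illustrative sketch (not part of the disclosure), the per-position calculation of the first rotation angle information can be expressed as follows; estimate_delta_theta is a hypothetical placeholder for the image alignment that yields the rotation amount Δθ, which the disclosure does not specify in detail:

```python
def estimate_delta_theta(reference_image, image):
    """Placeholder: rotation (deg) that aligns the tissue in `image` with
    its pose in `reference_image`, e.g., via feature matching."""
    raise NotImplementedError

def build_rotation_angle_info(reference_image, reference_angle_deg, images_by_phi):
    # images_by_phi: {phi_deg: endoscope image in which the tissue is recognized}
    info = {}
    for phi, image in images_by_phi.items():
        delta_theta = estimate_delta_theta(reference_image, image)
        info[phi] = reference_angle_deg + delta_theta  # target theta_t at phi
    return info
```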
  • Thereafter, as illustrated in FIG. 4B, the surgeon places the endoscope 2 at a position (D-point) for imaging the pelvis G. When the pelvis G is observed at the initial rotation angle ω=0°, the pelvis G may be placed at an improper position in the endoscope image E as illustrated in FIG. 6A. The surgeon adjusts the rotation angle ω about the optical axis C of the endoscope 2 such that the pelvis G is placed at a desired rotation angle in the endoscope image E (SA6), and inputs the second instruction to the user interface 16 (SA7). In the present embodiment, as illustrated in FIG. 6B, the rotation angle ω is adjusted such that the pelvis G is placed in an upper part in the endoscope image E.
  • After the second instruction is inputted, the surgeon observes the overall pelvis G through the endoscope 2 by rotating the endoscope 2 from D-point about the pivot axis P1 while keeping the rotation angle ω adjusted at D-point. Also at this point, the pelvis G makes a rotational movement in the endoscope image E as the endoscope 2 rotates from D-point to A-point. A-point is the end point of the observation range of the pelvis G in the endoscope image E.
  • In response to the second instruction received by the user interface 16, the processor 11 determines, on the basis of the endoscope image E, the second position information and the second rotation angle information on the second region including the pelvis (second specific tissue) G (SB5, SB6). The second rotation angle information is information that defines the rotation angle of the endoscope image E of the second region.
  • Specifically, in step SB5, the processor 11 recognizes the pelvis G in the endoscope image E by using the learned model 1 b and determines, as the second position information, the range of the position φ of the endoscope 2 with the pelvis G included in the endoscope image E. In other words, the second region is a region between D-point and A-point. For example, the second position information is φ=70° to 90°.
  • Also for the second position information, the processor 11 may set the position φ of the endoscope 2 at the time of the reception of the second instruction, as the second position information without processing using the learned model 1 b. In other words, the second position information is determined at a time and a location as requested by the user.
  • Subsequently, in step SB6, the processor 11 sets the endoscope image E and the rotation angle ω of the endoscope 2 at the time of the reception of the second instruction by the user interface 16, as a second reference endoscope image and a second reference rotation angle, and the processor 11 determines the second rotation angle information on the basis of the second reference endoscope image and the second reference rotation angle.
  • Specifically, the processor 11 calculates the second reference rotation angle corresponding to an initial rotation angle ω=0°, as a target rotation angle θt of the endoscope image E at the position φ at the time of the reception of the second instruction. The calculated target rotation angle θt represents a required rotation amount of the endoscope image E when the pelvis G is to be placed in an upper part in the endoscope image E at the position φ at the time of the reception of the second instruction.
  • The processor 11 then calculates a required rotation amount Δθ of the endoscope image E, which is obtained at another position φ included in the second position information, when the pelvis G in the endoscope image E is to be aligned with the pelvis G in the second reference endoscope image. Subsequently, the processor 11 calculates a target rotation angle θt at another position φ by adding the rotation amount Δθ to the second reference rotation angle. The calculated target rotation angle θt represents a required rotation amount of the endoscope image E when the pelvis G is to be placed in an upper part in the endoscope image E at another position φ.
  • As described above, the processor 11 calculates the target rotation angle θt of the endoscope image E when the pelvis G is to be placed in an upper part at each position φ=70°, . . . , 90° included in the second position information, and the processor 11 determines the target rotation angle θt at each position φ=70°, . . . , 90° as the second rotation angle information. FIG. 7 only indicates representative target rotation angles θt=100°, 90° at φ=70°, 90° as the second rotation angle information.
  • The processor 11 then calculates third position information and third rotation angle information on a third region on the basis of the first position information, the first rotation angle information, the second position information, and the second rotation angle information (SB7, SB8). The third region is different from the first region and the second region and is located between A-point and B-point in the present embodiment.
  • In step SB7, the processor 11 determines, as the third position information, the range of the position φ between the first position information and the second position information. For example, the third position information is φ=20° to 70°.
  • In step SB8, the processor 11 then calculates the third rotation angle information on the basis of the first, second, and third position information and the first and second rotation angle information. The third rotation angle information is information that defines the rotation angle of the endoscope image E of the third region.
  • Specifically, the processor 11 calculates the positional relationship between the third position information and the first and second position information and calculates the third rotation angle information on the basis of the positional relationship, the first rotation angle information, and the second rotation angle information.
  • For example, it is assumed that each position φ (M-point) of the third position information is an internally dividing point that internally divides a path between A-point and B-point in a m:n ratio. The processor 11 calculates a target rotation angle θt at each position φ on the basis of the ratio m:n, the rotation angle of 100° at A-point, and the rotation angle of −10° at B-point. For example, the position φ=45° internally divides the path between B-point and A-point in a 1:1 ratio, so that the target rotation angle θt at the position φ=45° is 45°, the median value between −10° and 100°.
  • In this manner, the processor 11 calculates the target rotation angle θt that gradually changes from −10° to 100° as the position φ changes from B-point to A-point.
  • The processor 11 determines the target rotation angle θt at each position φ=20°, . . . , 70° as the third rotation angle information. FIG. 7 only indicates a representative target rotation angle θt=45° at φ=45° as the third rotation angle information.
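  • As an illustrative worked sketch of this internal division (boundary values from the present embodiment):

```python
# Linear interpolation of the third rotation angle information (step SB8):
# an M-point dividing the path from B-point to A-point in an m:n ratio is
# assigned the correspondingly divided target angle.

def interpolate_theta(phi, phi_b, theta_b, phi_a, theta_a):
    t = (phi - phi_b) / (phi_a - phi_b)  # internal-division ratio as a fraction
    return theta_b + t * (theta_a - theta_b)

# theta_t = -10 deg at B-point (phi = 20), 100 deg at A-point (phi = 70):
print(interpolate_theta(45.0, 20.0, -10.0, 70.0, 100.0))  # -> 45.0
```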
  • In other words, the third region is a region where a specific tissue that serves as an index of the rotation angle of the endoscope image E, like the aorta F in the first region and the pelvis G in the second region, is not included in an endoscope image. In such a region, it is difficult for the learned model 1 b to recognize a specific tissue and for a user to determine a desired rotation angle. For this reason, the third position information and the third rotation angle information are calculated on the basis of the first and second position information and the first and second rotation angle information of the first region and the second region.
  • Subsequently, in step SB9, the processor 11 stores the first position information, the first rotation angle information, the second position information, the second rotation angle information, the third position information, and the third rotation angle information, which are determined in steps SB3 to SB8, in the storage unit 13. Thus, as indicated in FIG. 7, data is generated in the storage unit 13, the data including a rotation angle φ of the endoscope 2 and a target rotation angle θt of the endoscope image E at each rotation angle φ indicating a position of the imaging region.
  • After the completion of panning, the surgeon switches from the manual mode to the autonomous mode and performs treatment on the aorta F and the pelvis G with the surgical instrument 6. As indicated in FIG. 3B, when the surgeon switches to the autonomous mode (SC2), the processor 11 rotates the rotary joint 3 c so as to match the rotation angle ω of the endoscope 2 with the initial rotation angle 0° and causes the endoscope 2 to follow the tip of the surgical instrument 6 by controlling the moving device 3 while keeping the rotation angle ω at 0° (SC3). Moreover, in parallel with the tracking of the endoscope 2, the processor 11 controls the vertical direction of the endoscope image E displayed on the display screen 5 a (SC4 to SC10).
  • During the startup of the devices 1 and 3, the processor 11 sequentially receives the rotation angles of the joints 3 b and 3 c from the moving device 3 and calculates the current position φ of the endoscope 2 from the rotation angles of the joints 3 b and 3 c (SC1).
  • The processor 11 determines which one of the first region, the second region, and the third region includes the current imaging region on the basis of the current position of the endoscope 2, the first position information, and the second position information (SC4, SC6, SC8).
  • Specifically, if the current position φ is included in the first position information (φ=0° to 20°), the processor 11 determines that the current imaging region is included in the first region (YES at SC4). The processor 11 then rotates the endoscope image E in the plane of the endoscope image E on the basis of the first rotation angle information stored in the storage unit 13 (SC5). Specifically, the processor 11 reads the target rotation angle θt of the current position φ from the storage unit 13 and rotates the endoscope image E by the target rotation angle θt through image processing. The processor 11 then outputs the rotated endoscope image E from the controller 1 to the display device 5 and displays the image on the display screen 5 a (SC10).
  • In the rotated endoscope image E, the aorta F is horizontally placed. Thus, while the endoscope 2 moves in the range of φ=0° to 20° and captures the endoscope image E including the aorta F, the aorta F in the endoscope image E displayed on the display screen 5 a is kept in a horizontal position. For example, if the endoscope 2 pivots 20° from O-point to B-point about the pivot axis P1, the endoscope image E rotates from 0° to −10°.
  • If the current position φ is included in the second position information (φ=70° to 90°), the processor 11 determines that the current imaging region is included in the second region (NO at SC4 and YES at SC6). The processor 11 then rotates the endoscope image E in the plane of the endoscope image E on the basis of the second rotation angle information stored in the storage unit 13 (SC7). Specifically, the processor 11 reads the target rotation angle θt of the current position φ from the storage unit 13 and rotates the endoscope image E by the target rotation angle θt through image processing. The processor 11 then outputs the rotated endoscope image E from the controller 1 to the display device 5 and displays the image on the display screen 5 a (SC10).
  • In the rotated endoscope image E, the pelvis G is placed in an upper part. Thus, while the endoscope 2 moves in the range of φ=70° to 90° and captures the endoscope image E including the pelvis G, the pelvis G in the endoscope image E displayed on the display screen 5 a is kept in the upper part. For example, if the endoscope 2 pivots 20° from A-point to D-point about the pivot axis P1, the endoscope image E rotates from 100° to 90°.
  • If the current position φ is not included in the first position information or the second position information (NO at SC4 and NO at SC6), the processor 11 determines that the current imaging region is included in the third region (SC8). The processor 11 then rotates the endoscope image E in the plane of the endoscope image E on the basis of the third rotation angle information stored in the storage unit 13 (SC9). Specifically, the processor 11 reads the rotation angle of the current position φ from the storage unit 13 and rotates the endoscope image E by the rotation angle through image processing. The processor 11 then outputs the rotated endoscope image E from the controller 1 to the display device 5 and displays the image on the display screen 5 a (SC10).
  • The endoscope image E displayed on the display screen 5 a is rotated by the target rotation angle θt corresponding to the position φ. The target rotation angle θt gradually changes from the target rotation angle of the first region to the target rotation angle of the second region as the position φ changes from the first region to the second region. Thus, for example, if the endoscope 2 pivots from B-point to A-point about the pivot axis P1, the endoscope image E displayed on the display screen 5 a rotates from −10° to 100° in one direction.
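  • As an illustrative sketch (the region boundaries are the example values above; rotate_image, display, and the per-region angle tables are assumptions), the region determination and display steps SC4 to SC10 can be expressed as follows:

```python
# Sketch of the autonomous-mode display loop (SC4 to SC10). The angle tables
# mirror the FIG. 7 data: for each region, a mapping from the position phi
# (deg) to the stored target rotation angle theta_t (deg).

FIRST_REGION = (0.0, 20.0)    # first position information (phi range)
SECOND_REGION = (70.0, 90.0)  # second position information (phi range)

def classify_region(phi):
    if FIRST_REGION[0] <= phi <= FIRST_REGION[1]:
        return "first"   # YES at SC4
    if SECOND_REGION[0] <= phi <= SECOND_REGION[1]:
        return "second"  # YES at SC6
    return "third"       # SC8

def display_frame(phi, frame, theta_tables, rotate_image, display):
    region = classify_region(phi)               # SC4, SC6, SC8
    theta_t = theta_tables[region][round(phi)]  # stored target rotation angle
    display(rotate_image(frame, theta_t))       # SC5/SC7/SC9, then SC10
```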
  • As described above, according to the present embodiment, the storage unit 13 stores the first position information on the first region including a specific tissue F and the first rotation angle information for defining the target rotation angle θt of the endoscope image E, the target rotation angle θt being defined for placing the specific tissue F at a desired rotation angle by the surgeon. Furthermore, the storage unit 13 stores the second position information on the second region including a specific tissue G and the second rotation angle information for defining the target rotation angle θt of the endoscope image E, the target rotation angle θt being defined for placing the specific tissue G at a desired rotation angle by the surgeon. Moreover, as the third rotation angle information of the third region between the first region and the second region, the target rotation angle θt that gradually changes between the target rotation angle θt of the first rotation angle information and the target rotation angle θt of the second rotation angle information is interpolated and is stored in the storage unit 13.
  • Thereafter, in the autonomous mode, the endoscope image E is rotated by the target rotation angle θt corresponding to the position φ of the current imaging region, thereby automatically adjusting the vertical direction of the endoscope image E. Specifically, when the current imaging region is the first or second region including the specific tissues F and G, the endoscope image E is automatically rotated by the target rotation angle θt that places the specific tissues F and G at a predetermined rotation angle. When the current imaging region is the third region that does not include the specific tissues F and G, the endoscope image E is automatically rotated by a proper target rotation angle θt that is estimated from the first and second rotation angle information.
  • As described above, the operator can be provided with the endoscope image E in a proper vertical direction according to the position of the current imaging region in an abdominal cavity.
  • Moreover, an automatic adjustment to the vertical direction of the endoscope image E can relieve the stress of the surgeon and shorten the treatment time. Specifically, if the surgeon adjusts the vertical direction of the endoscope image E, the surgeon needs to take a hand off from the surgical instrument 6 during an operation and then manually rotate the endoscope 2. According to the present embodiment, the surgeon does not need to operate the endoscope 2 to adjust the vertical direction, so that the surgeon can continue treatment without being interrupted.
  • (Second Embodiment)
  • An endoscope system, a controller, a control method, and a recording medium according to a second embodiment of the present invention will be described below with reference to the accompanying drawings.
  • The present embodiment is different from the first embodiment in that a processor 11 rotates an endoscope image E by a rotation of an endoscope 2 instead of image processing. In the present embodiment, configurations different from those of the first embodiment will be described. Configurations in common with the first embodiment are indicated by the same reference numerals and an explanation thereof is omitted.
  • An endoscope system 10 according to the present embodiment includes a controller 1, the endoscope 2, a moving device 3, an endoscope processor 4, and a display device 5 as in the first embodiment.
  • FIGS. 8A and 8B indicate a control method performed by the processor 11 in the present embodiment.
  • As indicated in FIGS. 8A and 8B, the control method according to the present embodiment includes step SB2 of determining the initial position of the endoscope 2, steps SB3 and SB4′ of determining first position information and first rotation angle information on a first region in a subject, steps SB5 and SB6′ of determining second position information and second rotation angle information on a second region in the subject, steps SB7 and SB8′ of determining third position information and third rotation angle information on a third region in the subject, step SB9 of storing the position information and the rotation angle information in the storage unit 13, steps SC4 to SC9′ of rotating the endoscope image E according to a current imaging region that is currently being imaged by the endoscope 2, and step SC10 of outputting the rotated endoscope image E to the display device 5.
  • As indicated in FIG. 8A, steps SB2 to SB9 are performed in a manual mode. As indicated in FIG. 8B, steps SC4 to SC10 are performed in an autonomous mode.
  • As in the first embodiment, a user performs steps SA1 to SA5. In response to a first instruction received by a user interface 16, the processor 11 determines the first position information and the first rotation angle information on the first region on the basis of the endoscope image E (SB3, SB4′).
  • Specifically, in step SB4′ subsequent to step SB3, the processor 11 sets the endoscope image E and a rotation angle ω of the endoscope 2 at the time of the reception of the first instruction by the user interface 16, as a first reference endoscope image and a first reference rotation angle.
  • Subsequently, the processor 11 calculates the first reference rotation angle corresponding to a predetermined initial rotation angle ω=0°, as a target rotation angle ωt of the endoscope 2 at a position φ at the time of the reception of the first instruction.
  • The processor 11 then calculates a required rotation amount Δθ of the endoscope image E, which is obtained at another position φ included in the first position information, when an aorta F in the endoscope image E is to be aligned with the aorta F in the first reference endoscope image. Subsequently, the processor 11 calculates a target rotation angle ωt of the endoscope 2 at another position φ by adding the rotation amount Δθ to the first reference rotation angle.
  • As described above, the processor 11 calculates the target rotation angle ωt of the endoscope 2 when the aorta F is to be horizontally placed at each position φ=0°, . . . , 20° included in the first position information, and the processor 11 determines the target rotation angle ωt at each position φ=0°, . . . , 20° as the first rotation angle information.
  • The user then performs steps SA6 and SA7. In response to a second instruction received by the user interface 16, the processor 11 determines the second position information and the second rotation angle information on the second region on the basis of the endoscope image E (SB5, SB6′).
  • Specifically, in step SB6′ subsequent to step SB5, the processor 11 sets the endoscope image E and the rotation angle ω of the endoscope 2 at the time of the reception of the second instruction by the user interface 16, as a second reference endoscope image and a second reference rotation angle.
  • Subsequently, the processor 11 calculates the second reference rotation angle corresponding to an initial rotation angle ω=0°, as a target rotation angle ωt of the endoscope 2 at the position φ at the time of the reception of the second instruction.
  • The processor 11 then calculates a required rotation amount Δθ of the endoscope image E, which is obtained at another position φ included in the second position information, when the pelvis G in the endoscope image E is to be aligned with the pelvis G in the second reference endoscope image. Subsequently, the processor 11 calculates a target rotation angle ωt of the endoscope 2 at another position φ by adding the rotation amount Δθ to the second reference rotation angle.
  • As described above, the processor 11 calculates the target rotation angle ωt of the endoscope 2 when the pelvis G is to be placed in an upper part at each position φ=70°, . . . , 90° included in the second position information, and the processor 11 determines the target rotation angle ωt at each position φ=70°, . . . , 90° as the second rotation angle information.
  • The processor 11 then calculates the third position information and the third rotation angle information on the third region on the basis of the first position information, the first rotation angle information, the second position information, and the second rotation angle information (SB7, SB8′). Specifically, in step SB8′ subsequent to step SB7, the processor 11 determines the target rotation angle ωt at each position φ=20°, . . . , 70° of the third position information as the third rotation angle information as in step SB8.
  • Subsequently, in step SB9, the processor 11 stores the position information and the rotation angle information, which are determined in steps SB3, SB4′, SB5, SB6′, SB7, and SB8′, in the storage unit 13. Thus, data is generated in the storage unit 13, the data including the rotation angle φ of the endoscope 2 and the target rotation angle ωt of the endoscope 2 at each rotation angle φ indicating a position of the imaging region.
  • As indicated in FIG. 8B, the processor 11 then calculates the current position φ of the endoscope 2 (SC1). When switching to the autonomous mode (YES at SC2), the processor 11 determines which one of the first region, the second region, and the third region includes the current imaging region (SC4, SC6, SC8).
  • If the processor 11 determines that the current imaging region is included in the first region (YES at SC4), the processor 11 rotates the endoscope 2 on the basis of the first rotation angle information stored in the storage unit 13 (SC5′). Specifically, the processor 11 reads the target rotation angle ωt of the current position φ from the storage unit 13 and rotates the endoscope image E by rotating the endoscope 2 to the target rotation angle ωt.
  • If the processor 11 determines that the current imaging region is included in the second region (NO at SC4 and YES at SC6), the processor 11 rotates the endoscope 2 on the basis of the second rotation angle information stored in the storage unit 13 (SC7′). Specifically, the processor 11 reads the target rotation angle ωt of the current position φ from the storage unit 13 and rotates the endoscope image E by rotating the endoscope 2 to the target rotation angle ωt.
  • If the processor 11 determines that the current imaging region is included in the third region (SC8), the processor 11 rotates the endoscope 2 on the basis of the third rotation angle information stored in the storage unit 13 (SC9′). Specifically, the processor 11 reads the target rotation angle ωt of the current position φ from the storage unit 13 and rotates the endoscope image E by rotating the endoscope 2 to the target rotation angle ωt.
  • Subsequent to step SC5′, SC7′, or SC9′, the processor 11 outputs the rotated endoscope image E from the controller 1 to the display device 5 and displays the image on a display screen 5 a (SC10).
  • As described above, in the autonomous mode according to the present embodiment, the endoscope 2 is rotated to the target rotation angle ωt corresponding to the position φ of the current imaging region, thereby automatically adjusting the vertical direction of the endoscope image E as in the first embodiment. Specifically, when the current imaging region is the first or second region including the specific tissues F and G, the endoscope 2 is automatically rotated to the target rotation angle ωt that places the specific tissues F and G at a predetermined rotation angle. When the current imaging region is the third region that does not include the specific tissues F and G, the endoscope 2 is automatically rotated to a proper target rotation angle ωt that is estimated from the first and second rotation angle information.
  • As described above, an operator can be provided with the endoscope image E in a proper vertical direction according to the position of the current imaging region in an abdominal cavity. Moreover, an automatic adjustment to the vertical direction of the endoscope image E can relieve the stress of the surgeon and shorten the treatment time.
  • According to the present embodiment, a rotation of the endoscope image E by rotating the endoscope 2 about an optical axis C can eliminate the need for image processing for rotating the endoscope image E, thereby reducing a load of the processor 11. Moreover, the user can intuitively recognize the vertical direction of the endoscope image E by confirming the rotation angle ω of a portion of the endoscope 2 outside the body.
  • In the present embodiment, the endoscope image E is rotated by rotating the overall endoscope 2 about the optical axis C. Alternatively, an image sensor 2 a may be rotated about the optical axis C while the rotation angle ω of the endoscope 2 about the optical axis C is kept unchanged. In this case, the endoscope 2 includes a rotating mechanism for rotating the image sensor 2 a.
  • A rotation of the image sensor 2 a relative to the body of the endoscope 2 can rotate the endoscope image E like a rotation of the overall endoscope 2.
  • (Third Embodiment)
  • An endoscope system, a controller, a control method, and a recording medium according to a third embodiment of the present invention will be described below with reference to the accompanying drawings.
  • The present embodiment is different from the first and second embodiments in that an endoscope image E is rotated by a combination of a rotation of an endoscope 2 about an optical axis C and image processing. In the present embodiment, configurations different from those of the first and second embodiments will be described. Configurations in common with the first and second embodiments are indicated by the same reference numerals and an explanation thereof is omitted.
  • An endoscope system 10 according to the present embodiment includes a controller 1, the endoscope 2, a moving device 3, an endoscope processor 4, and a display device 5 as in the first embodiment.
  • FIG. 9 indicates a control method performed by a processor 11 in an autonomous mode in the present embodiment. The control method according to the present embodiment includes step SC11 of determining whether a rotation angle ω of the endoscope 2 has reached a predetermined critical angle and step SC12 of rotating the endoscope image E by image processing in addition to steps SB2, SB3, SB4′, SB5, SB6′, SB7, SB8′, SB9, SC1 to SC4, SC5′, SC6, SC7′, SC8, and SC9′ that are described in the second embodiment.
  • After Step SB9, as indicated in FIG. 9 , the processor 11 calculates the current position φ of the endoscope 2 (SC1). When switching to the autonomous mode (YES at SC2), the processor 11 performs steps SC1 to SC4, SC5′, SC6, SC7′, SC8, and SC9′.
  • In steps SC5′, SC7′, and SC9′, the processor 11 determines whether the rotation angle ω of the endoscope 2 has reached the critical angle of the rotatable range of the endoscope 2 on the basis of a rotation angle detected by an angle sensor 3 d at a rotary joint 3 c (SC11). The rotatable range in which the endoscope 2 is rotatable may be limited by physical constraints or the like. For example, a cable in the endoscope 2 and the moving device 3 is twisted by a rotation of the endoscope 2, and thus the rotatable range of the endoscope 2 is set so as not to cause an excessive twist.
  • If the endoscope 2 rotates to a target rotation angle ωt before the rotation angle ω reaches the critical angle (NO at SC11), the processor 11 outputs the rotated endoscope image E to the display device 5 (SC10).
  • If the rotation angle ω reaches the critical angle before the target rotation angle ωt is reached (YES at SC11), the processor 11 stops the rotation of the endoscope 2 at the critical angle, rotates the endoscope image E through image processing by the remaining rotation angle required to reach the target rotation angle ωt (SC12), and outputs the rotated endoscope image E to the display device 5 (SC10).
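  • As an illustrative sketch (the critical angle value is an assumption; the disclosure does not specify one), the split between the mechanical rotation and the image-processing rotation can be expressed as follows:

```python
# Sketch of steps SC11/SC12: rotate the endoscope mechanically up to the
# critical angle of its rotatable range (e.g., limited to avoid excessive
# cable twist) and make up the remainder by image processing.

CRITICAL_ANGLE_DEG = 170.0  # illustrative limit of the rotatable range

def split_rotation(omega_target_deg):
    mechanical = max(-CRITICAL_ANGLE_DEG,
                     min(CRITICAL_ANGLE_DEG, omega_target_deg))
    image_processing = omega_target_deg - mechanical  # residual rotation
    return mechanical, image_processing

print(split_rotation(200.0))  # -> (170.0, 30.0): stop at the critical angle
print(split_rotation(-90.0))  # -> (-90.0, 0.0): no image processing needed
```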
  • As described above, according to the present embodiment, the endoscope image E can be rotated by a combination of a rotation of the endoscope 2 about the optical axis C and image processing even if the endoscope image E is hard to rotate by a rotation of the endoscope 2 alone.
  • Other effects of the present embodiment are identical to those of the first and second embodiments and thus an explanation thereof is omitted.
  • (First Modification)
  • A first modification of the endoscope system 10, the controller 1, the control method, and the recording medium according to the first to third embodiments will be described below.
  • As illustrated in FIG. 10 , the present modification is different from the first to third embodiments in that the endoscope 2 is an oblique type.
  • The oblique endoscope 2 includes a long insertion portion 2 b that is inserted along the longitudinal axis I into a subject, and an imaging portion 2 c that includes the image sensor 2 a and is connected to the proximal end of the insertion portion 2 b. The insertion portion 2 b and the imaging portion 2 c are integrally rotated about the longitudinal axis I by a rotation of the rotary joint 3 c. In the case of a separate-type oblique endoscope, a camera head (imaging portion 2 c) and an optical viewing tube (insertion portion 2 b) have different pieces of rotation angle information. In the present modification, the camera head and the optical viewing tube are integrally rotated to perform processing using common rotation angle information.
  • In the case of the direct-vision endoscope 2, a visual axis (optical axis) C is coaxial with the longitudinal axis I, so that the position of the visual axis C is maintained even if the endoscope 2 rotates about the longitudinal axis I. In the case of the oblique endoscope 2, the visual axis C tilts with respect to the longitudinal axis I and thus makes a rotational movement about the longitudinal axis I in response to a rotation of the endoscope 2 about the longitudinal axis I, thereby moving the imaging region.
  • FIGS. 11A and 11B indicate a control method performed by the processor 11 in the present modification. As indicated in FIGS. 11A and 11B, the control method according to the present modification includes steps SB2′ and SB3 to SB9 and steps SC3′ and SC4 to SC10.
  • In step SB2′, the processor 11 sets the current position φ of the endoscope 2 at the initial position φ=0° and sets the current orientation ω of the endoscope 2 at the initial orientation ω=0°. The orientation ω of the endoscope 2 is a rotation angle about the longitudinal axis I and corresponds to the orientation of the visual axis C with respect to the longitudinal axis I.
  • In response to the first instruction received by the user interface 16 (SA5), the processor 11 determines the first position information and the first rotation angle information (SB3, SB4) and holds information on a first orientation of the endoscope 2 when the first instruction is received.
  • Subsequently, in response to the second instruction received by the user interface 16 (SA7), the processor 11 determines the second position information and the second rotation angle information (SB5, SB6) and holds information on a second orientation of the endoscope 2 when the second instruction is received.
  • In step SB9, the processor 11 stores the first orientation and the second orientation in the storage unit 13 in addition to the position information and the rotation angle information. Thus, data is generated in the storage unit 13, the data including a rotation angle φ of the endoscope 2, the target rotation angle θt of the endoscope image E at each rotation angle φ, and the first orientation and the second orientation of the endoscope 2, the rotation angle φ indicating the position of the imaging region, the first and second orientations corresponding to each imaging region.
  • Subsequently, in the autonomous mode, the processor 11 controls the position and orientation of the endoscope 2 by controlling the moving device 3 and causes the endoscope 2 to follow the tip of the surgical instrument 6 (SC3′). At this point, the processor 11 controls the position and orientation of the endoscope 2 on the basis of the first and second position information and the first and second orientations stored in the storage unit 13, so that an orientation ω of the endoscope 2 is controlled to the first orientation when the imaging region is included in the first region, whereas the orientation ω of the endoscope 2 is controlled to the second orientation when the imaging region is included in the second region.
  • As in the first embodiment, the processor 11 rotates the endoscope image E by the target rotation angle θt according to the current imaging region through image processing (SC4 to SC9).
  • As described above, in the case of the oblique endoscope 2, the imaging region is moved by a rotation of the endoscope 2 about the longitudinal axis I. Thus, the vertical direction of the endoscope image E is hard to control only by the control method of the second embodiment, in which the endoscope image E is rotated by a rotation of the endoscope 2.
  • According to the present modification, in the manual mode, the first orientation of the endoscope 2 at the time of imaging of the first region and the second orientation of the endoscope 2 at the time of imaging of the second region are stored. At the time of imaging of the first region in the autonomous mode, the orientation of the endoscope 2 is controlled to the first orientation and the vertical direction of the endoscope image E is adjusted by a rotation through image processing. At the time of imaging of the second region in the autonomous mode, the orientation of the endoscope 2 is controlled to the second orientation and the vertical direction of the endoscope image E is adjusted by a rotation through image processing. This can properly control the vertical direction of the endoscope image E captured by the oblique endoscope 2.
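  • As an illustrative sketch only (the stored orientation values and helper names are assumptions), the orientation command of step SC3′ can be expressed as a lookup that restores the stored orientation in the first and second regions and otherwise leaves the current orientation unchanged:

```python
# Sketch of the orientation control in step SC3' for the oblique endoscope:
# restore the first/second orientation stored in the manual mode when the
# imaging region falls in the corresponding region; otherwise keep the
# current orientation. Values are illustrative.

STORED_ORIENTATION_DEG = {"first": 0.0, "second": 35.0}

def commanded_orientation(region, current_omega_deg):
    # region is "first", "second", or "third" (e.g., from classify_region)
    return STORED_ORIENTATION_DEG.get(region, current_omega_deg)

print(commanded_orientation("second", 10.0))  # -> 35.0
print(commanded_orientation("third", 10.0))   # -> 10.0
```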
  • (Second Modification)
  • A second modification of the endoscope system 10, the controller 1, the control method, and the recording medium 13 according to the first to third embodiments will be described below.
  • As illustrated in FIG. 12 , the present modification is different from the first to third embodiments in that the endoscope 2 has a curved portion 2 d.
  • The endoscope 2 includes the long insertion portion 2 b that is inserted into a subject and the curved portion 2 d that is provided at the tip portion of the insertion portion 2 b and can be curved in a direction that crosses the longitudinal axis I of the insertion portion 2 b. When the curved portion 2 d is bent, the visual axis C tilts with respect to the longitudinal axis I and thus makes a rotational movement about the longitudinal axis I in response to a rotation of the endoscope 2 about the longitudinal axis I, thereby moving the imaging region. Moreover, the tilt direction and the tilt angle of the visual axis C with respect to the longitudinal axis I change according to the curving direction and the curving angle of the curved portion 2 d.
  • The control method performed by the processor 11 in the present modification includes steps SB2′ and SB3 to SB9 and steps SC3′ and SC4 to SC10 as in the first modification. As the orientation of the endoscope 2, the curving direction and the curving angle of the curved portion 2 d are used instead of the rotation angle ω about the longitudinal axis I.
  • Specifically, in step SB2′, the processor 11 sets the current curving direction and curving angle of the curved portion 2 d as an initial orientation. Subsequently, in step SB9, the curving direction and the curving angle of the curved portion 2 d at the time of the reception of the first instruction are stored as a first orientation in the storage unit 13 by the processor 11, and the curving direction and the curving angle of the curved portion 2 d at the time of the reception of the second instruction are stored as a second orientation in the storage unit 13 by the processor 11.
  • In step SC3′ of the autonomous mode, the processor 11 controls the position and orientation of the endoscope 2 on the basis of the first and second position information and the first and second orientations stored in the storage unit 13, so that the curving direction and the curving angle of the curved portion 2 d are controlled to the first orientation when the imaging region is included in the first region, whereas the curving direction and the curving angle of the curved portion 2 d are controlled to the second orientation when the imaging region is included in the second region (SC3′).
  • As described above, in the case of the endoscope 2 including the curved portion 2 d, the imaging region makes a rotational movement by a rotation of the endoscope 2 according to the curving direction and the curving angle of the curved portion 2 d. Thus, the vertical direction of the endoscope image E is hard to control by the control method of the second embodiment, in which the endoscope image E is rotated by a rotation of the endoscope 2.
  • According to the present modification, at the time of imaging of the first region in the autonomous mode, the orientation of the endoscope 2 is controlled to the first orientation stored in the manual mode and the vertical direction of the endoscope image E is adjusted by a rotation through image processing as in the first modification. At the time of imaging of the second region in the autonomous mode, the orientation of the endoscope 2 is controlled to the second orientation stored in the manual mode and the vertical direction of the endoscope image E is adjusted by a rotation through image processing. This can properly control the vertical direction of the endoscope image E captured by the endoscope 2 including the curved portion 2 d.
  • In the embodiments and the modifications, the processor 11 calculates the third rotation angle information in the manual mode and stores the information in the storage unit 13. Alternatively, as indicated in FIGS. 13A and 13B, the processor 11 may calculate the third rotation angle information in real time during the autonomous mode (SC13). In other words, the processor 11 does not determine or store the third position information and the third rotation angle information in the manual mode. In this case, the third region is assumed to be a region other than the first region and the second region.
  • In the autonomous mode of the embodiments and the modifications, if it is determined that the current imaging region is included in the third region (not included in the first region or the second region), the processor 11 may calculate the target rotation angle θt or ωt at the current position φ of the endoscope 2 in real time on the basis of the current position φ, the first position information, the first rotation angle information, the second position information, and the second rotation angle information (SC13). If the current imaging region is included in one of the first region and the second region (not included in the third region), the processor 11 may match the target rotation angle θt or ωt with the first rotation angle information or the second rotation angle information without calculating the target rotation angle θt or ωt in real time. This can reduce the amount of position information and rotation angle information to be stored in the storage unit 13 during the manual mode and only requires the calculation of the third position information and the third rotation angle information that are required for an operation of the autonomous mode, thereby reducing a load to the system.
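  • As an illustrative sketch reusing the classify_region and interpolate_theta helpers from the earlier sketches (the boundary positions are the example values of the first embodiment), this real-time variant might look like the following:

```python
# Sketch of on-demand calculation (SC13): inside the first or second region
# the stored angle is returned; in the third region the target angle is
# interpolated in real time between the boundary angles at B-point (phi = 20)
# and A-point (phi = 70), so no third-region table is stored.

def target_angle(phi, theta_tables):
    region = classify_region(phi)
    if region in ("first", "second"):
        return theta_tables[region][round(phi)]
    return interpolate_theta(phi,
                             20.0, theta_tables["first"][20],
                             70.0, theta_tables["second"][70])
```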
  • If the current imaging region is included in the first region or the second region, the processor 11 may update the stored first position information or second position information or the stored first rotation angle information or second rotation angle information to the current position information and rotation angle information. After the update, when the endoscope 2 is moved and it is determined that the current imaging region is included in the first region or the second region, the updated first position information, second position information, first rotation angle information, and second rotation angle information can be used. For the update, the user may provide an instruction to update from the user interface 16. Thus, even if the body of a patient is deformed by, for example, an adjustment to pneumoperitoneum or a body posture, the position information and the rotation angle information can be updated to correct information according to the current circumstances.
  • In the embodiments and the modifications, the processor 11 recognizes a specific tissue in the endoscope image E and determines the position information and the rotation angle information on the basis of the recognized specific tissue. Alternatively, the position information and the rotation angle information may be determined on the basis of the position φ and the rotation angle ω of the endoscope 2 at the time of the reception of the instruction.
  • Specifically, in the manual mode, the surgeon places the endoscope 2 at a desired position at a desired rotation angle ω and inputs the first instruction. The processor 11 determines, as the first position information, a range around the position φ of the endoscope 2 at the time of the reception of the first instruction by the user interface 16 and determines, as the first rotation angle information, the rotation angle ω of the endoscope 2 at the time of the reception of the first instruction by the user interface 16.
  • Similarly, the surgeon places the endoscope 2 at another desired position at a desired rotation angle ω and inputs the second instruction. The processor 11 determines, as the second position information, a range around the position φ of the endoscope 2 at the time of the reception of the second instruction by the user interface 16 and determines, as the second rotation angle information, the rotation angle ω of the endoscope 2 at the time of the reception of the second instruction by the user interface 16.
  • With this configuration, the surgeon can register any regions in a subject as the first region and the second region, thereby determining position information and rotation angle information that are better adapted to the preference of the surgeon. Also, when the first and second regions do not include a specific tissue, any position information and rotation angle information can be determined and stored for the first and second regions without performing the processing of the learned model 1 b.
  • The determination of the position information and the rotation angle information on the basis of a specific tissue in the endoscope image E may be used in combination with the determination of the position information and the rotation angle information on the basis of the position φ and the rotation angle ω of the endoscope 2 at the time of the reception of an instruction.
  • For example, after determining the first and second position information and the first and second rotation angle information on the basis of the specific tissues F and G in the endoscope image E as described in the first to third embodiments and the modifications thereof, the processor 11 may further determine position information and rotation angle information on any region different from the first and second regions on the basis of an instruction of the surgeon.
  • In the embodiments and the modifications, specific tissues are the aorta F and the pelvis G. The specific tissues may be any organs or tissues having anatomical characteristics. For example, a uterus may be used.
  • In the embodiments and the modifications, the position information and the rotation angle information on the two regions are stored. Position information and rotation angle information on three or more regions may be stored instead. This can improve accuracy when position information and rotation angle information are calculated on the basis of stored information.
  • In the embodiments and the modifications, the position φ of the endoscope 2 is expressed by a two-dimensional polar coordinate system with the pivot point H serving as an origin, the position φ indicating the position of the imaging region. The position φ may be expressed by a three-dimensional polar coordinate system. Specifically, the endoscope 2 may be supported so as to pivot about a second pivot axis P2 that passes through the pivot point H and is orthogonal to the first pivot axis P1, and the position of the imaging region may be expressed as (φ1, φ2), where φ1 is a rotation angle about the first pivot axis P1 and φ2 is a rotation angle about the second pivot axis P2. In this case, the first position information, the second position information, and the third position information are three-dimensional information including rotation angles φ1 and φ2.
  • In the embodiments and the modifications, the position of the imaging region may be expressed by other kinds of coordinate systems instead of a polar coordinate system. For example, the position of the imaging region may be expressed by a cartesian coordinate system with the hole H serving as an origin.
  • In the embodiments and the modifications, the coordinate system of the position φ of the imaging region is a global coordinate system fixed relative to a subject. A relative coordinate system for the tip of the endoscope 2 may be used instead.
  • In the embodiments and the modifications, the first and second position information are determined in the manual mode and are stored in the storage unit 13. Alternatively, the first and second position information may be stored in advance in the storage unit 13 before a surgical operation.
  • Before a surgical operation, an examination image of a range including an affected part, for example, a CT image of an abdominal region may be captured. Three-dimensional reconstruction from multiple CT images generates a three-dimensional image of the abdominal cavity. The first and second position information may be determined and stored in the storage unit 13 on the basis of such a three-dimensional image before a surgical operation. In this case, steps SB4 and SB6 are omitted in the manual mode.
  • This configuration can reduce the computational complexity of the processor 11 in the manual mode.
  • In the embodiments and the modifications, the processor 11 in the manual mode may store a first endoscope image and a second endoscope image in the storage unit 13. The first endoscope image is the endoscope image E of the first region, and the second endoscope image is the endoscope image E of the second region. For example, in step SB3, the processor 11 stores at least one endoscope image E, in which the aorta F is recognized, as the first endoscope image in the storage unit 13. In step SB6, the processor 11 stores at least one endoscope image E, in which the pelvis G is recognized, as the second endoscope image in the storage unit 13.
  • In this case, the processor 11 in the autonomous mode may determine which one of the first region, the second region, and the third region includes the current imaging region on the basis of the first endoscope image and the second endoscope image. In other words, the processor 11 compares the current endoscope image E with the first endoscope image and the second endoscope image. The processor 11 determines that the current imaging region is included in the first region in the presence of a first endoscope image identical or similar to the current endoscope image E. The processor 11 determines that the current imaging region is included in the second region in the presence of a second endoscope image identical or similar to the current endoscope image E.
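  • As an illustrative sketch (the mean-absolute-difference measure and the threshold are assumptions; the disclosure does not prescribe a similarity measure), such a comparison might be implemented as follows, with frames held as equally sized NumPy arrays:

```python
import numpy as np

def classify_by_image(frame, first_images, second_images, threshold=10.0):
    # Compare the current frame with the stored first/second endoscope images;
    # pick the region of the closest match, else fall back to the third region.
    def dissimilarity(a, b):
        return float(np.mean(np.abs(a.astype(np.float32) - b.astype(np.float32))))
    d1 = min(dissimilarity(frame, ref) for ref in first_images)
    d2 = min(dissimilarity(frame, ref) for ref in second_images)
    best, region = min((d1, "first"), (d2, "second"))
    return region if best < threshold else "third"
```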
  • In the embodiments and the modifications, if a specific tissue is included in the endoscope image E, the processor 11 may read information on the rotation angle of the specific tissue from a database 1 c stored in the storage unit 13 and then rotate the endoscope image E on the basis of the read information on the rotation angle. The rotation angle is an angle around the center point of the endoscope image E. This configuration can rotate the endoscope image E such that a specific tissue in the endoscope image E is placed at a predetermined rotation angle.
  • For example, registered in the database 1 c are the type of at least one specific tissue other than the aorta F and the pelvis G and the rotation angle of the type of the specific tissue. The processor 11 recognizes a specific tissue in the endoscope image E, reads the rotation angle of the specific tissue from the database 1 c, and rotates the endoscope image E such that the specific tissue is placed at the rotation angle.
  • For example, a uterus J as a specific tissue is preferably placed in an upper part of the endoscope image E and thus 90° equivalent to the 12 o'clock position is registered as a rotation angle of the uterus J. The processor 11 rotates the endoscope image E such that the recognized uterus J is placed at the position of 90°. Thus, if the endoscope image E includes the uterus J, the vertical direction of the endoscope image E is automatically adjusted such that the uterus J is placed at the position of 90°.
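  • As an illustrative sketch (the angle convention and table contents beyond the uterus example are assumptions), the database lookup can be expressed as follows:

```python
# Sketch of the database-driven rotation: database 1c maps a recognized
# tissue type to its registered rotation angle around the image center
# (90 degrees, the 12 o'clock position, for the uterus J in the example).

TISSUE_TARGET_ANGLE_DEG = {"uterus": 90.0}

def correction_angle(tissue_type, current_angle_deg):
    # Rotation to apply to the endoscope image so the recognized tissue
    # lands at its registered angle.
    return TISSUE_TARGET_ANGLE_DEG[tissue_type] - current_angle_deg

print(correction_angle("uterus", 30.0))  # rotate the image by 60 degrees
```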
  • In the embodiments and the modifications, the rotation of the endoscope image E is controlled on the basis of the specific tissues F and G in the endoscope image E. Additionally, the rotation of the endoscope image E may be controlled on the basis of the surgical instrument 6 in the endoscope image E.
  • For example, the processor 11 can operate in a first rotation mode for controlling the rotation of the endoscope image E on the basis of the specific tissues F and G and a second rotation mode for controlling the rotation of the endoscope image E on the basis of the surgical instrument 6. A user, for example, a surgeon can switch the first rotation mode and the second rotation mode by using the user interface 16.
  • In the second rotation mode, the processor 11 detects the angle of the surgical instrument 6 in the current endoscope image E, rotates the endoscope image E by a rotation of the endoscope 2 or image processing such that the angle of the surgical instrument 6 is equal to a predetermined target angle, outputs the rotated endoscope image E to the display device 5, and displays the image on the display screen 5 a. The angle of the surgical instrument 6 is, for example, the angle of the longitudinal axis of the shaft of the surgical instrument 6 with respect to the horizon of the endoscope image E.
  • For a proper operation of the surgical instrument 6 by the surgeon who is observing the endoscope image E, it is important to properly set the angle of the surgical instrument 6 in the endoscope image E displayed on the display screen 5 a. However, a movement of the surgical instrument 6 by the surgeon or a change of the orientation of the endoscope 2 following the surgical instrument 6 leads to a change of the angle of the surgical instrument 6 in the endoscope image E.
  • The surgeon optionally switches from the first rotation mode to the second rotation mode such that the surgical instrument 6 in the endoscope image E can be displayed at a target angle on the display screen 5 a.
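  • As an illustrative sketch (detect_shaft_angle is a hypothetical placeholder for the instrument-angle detection, which the disclosure does not specify), the second rotation mode reduces to:

```python
# Sketch of the second rotation mode: measure the angle of the surgical
# instrument's shaft relative to the image horizon and rotate the image so
# the shaft sits at the target angle.

def detect_shaft_angle(frame):
    raise NotImplementedError  # angle (deg) of the shaft's longitudinal axis,
                               # e.g., from line fitting on a segmented shaft

def second_mode_correction(frame, target_angle_deg=45.0):
    # Rotation (deg) to apply so the instrument appears at target_angle_deg.
    return target_angle_deg - detect_shaft_angle(frame)
```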
  • In the embodiments and the modifications, the surgeon manually operates the surgical instrument 6 held with his/her hand. Alternatively, as illustrated in FIGS. 14A and 14B, the surgical instrument 6 may be held and controlled by a second moving device 31 that is different from the moving device 3. In this case, the controller 1 may acquire position information on the endoscope 2 and the surgical instrument 6 from the moving device 3 for moving the endoscope 2 and the second moving device 31 for moving the surgical instrument 6. Like the moving device 3, the second moving device 31 holds the surgical instrument 6 with a robot arm or an electric holder and three-dimensionally changes the position and orientation of the surgical instrument 6 under the control of a controller 101. As illustrated in FIG. 14A, the surgical instrument 6 may be connected to the tip of the robot arm and is integrated with the robot arm. As illustrated in FIG. 14B, the surgical instrument 6 may be a separate part held by a robot arm.
  • REFERENCE SIGNS LIST
    • 1 Controller
    • 11 Processor
    • 12 Memory
    • 13 Storage unit, recording medium
    • 14 Input interface
    • 15 Output interface
    • 16 User interface
    • 1 a Image control program
    • 1 b Learned model
    • 1 c Database
    • 2 Endoscope
    • 2 a Image sensor
    • 3 Moving device
    • 3 a Robot arm
    • 3 b, 3 c Joint
    • 3 d Angle sensor
    • 4 Endoscope processor
    • 5 Display device
    • 5 a Display screen
    • 6 Surgical instrument
    • A, B, D, O Position
    • C Optical axis, visual axis
    • P1 First pivot axis
    • P2 Second pivot axis
    • E Endoscope image
    • F Aorta, first specific tissue
    • G Pelvis, second specific tissue
    • H Hole

Claims (26)

1. An endoscope system comprising:
an endoscope that is inserted into a subject and captures an endoscope image in the subject;
a moving device that holds the endoscope and moves the endoscope;
a storage unit; and
a controller including at least one processor,
wherein the storage unit stores first position information and first rotation angle information on a first region in the subject and second position information and second rotation angle information on a second region different from the first region in the subject, the first rotation angle information defining a rotation angle of the endoscope image of the first region, the second rotation angle information defining a rotation angle of the endoscope image of the second region,
the at least one processor calculates third rotation angle information on a third region in the subject on a basis of the first position information, the first rotation angle information, the second position information, the second rotation angle information, and third position information on the third region, the third region being different from the first and second regions,
the at least one processor rotates the endoscope image on a basis of the third rotation angle information if the third region includes a current imaging region that is currently being imaged by the endoscope, and
the at least one processor outputs the rotated endoscope image to a display device.
2. The endoscope system according to claim 1, wherein the at least one processor rotates the endoscope image by image processing.
3. The endoscope system according to claim 1, wherein the moving device is configured to rotate the endoscope about an optical axis of the endoscope, and
the at least one processor rotates the endoscope image by controlling the moving device so as to rotate the endoscope about the optical axis.
4. The endoscope system according to claim 1, wherein the at least one processor is operable in a manual mode that permits a user to operate the endoscope, and
in the manual mode, the at least one processor determines the first position information, the first rotation angle information, the second position information, and the second rotation angle information and stores the first position information, the first rotation angle information, the second position information, and the second rotation angle information in the storage unit.
5. The endoscope system according to claim 4, wherein the at least one processor determines the first position information and the first rotation angle information on a basis of a first specific tissue included in the endoscope image, and
the at least one processor determines the second position information and the second rotation angle information on a basis of a second specific tissue included in the endoscope image.
6. The endoscope system according to claim 5, wherein the storage unit stores a learned model obtained by machine learning of a correspondence between an image including a specific tissue and a type of the specific tissue,
the at least one processor recognizes the first specific tissue and the second specific tissue in the endoscope image by using the learned model stored in the storage unit,
the at least one processor determines the first position information on a basis of a position of an imaging region of the endoscope image in which the first specific tissue is recognized, and determines the first rotation angle information on a basis of a rotation angle of the first specific tissue in the endoscope image, and
the at least one processor determines the second position information on a basis of a position of an imaging region of the endoscope image in which the second specific tissue is recognized, and determines the second rotation angle information on a basis of a rotation angle of the second specific tissue in the endoscope image.
7. The endoscope system according to claim 4, wherein the controller further includes a user interface that receives a user instruction,
the at least one processor determines the first position information on a basis of a position of the imaging region at a time of reception of a first instruction by the user interface, and determines the first rotation angle information on a basis of a rotation angle about an optical axis of the endoscope at a time of reception of the first instruction by the user interface, and
the at least one processor determines the second position information on a basis of a position of the imaging region at a time of reception of a second instruction by the user interface, and determines the second rotation angle information on a basis of a rotation angle about the optical axis of the endoscope at a time of reception of the second instruction by the user interface.
8. The endoscope system according to claim 4, wherein the at least one processor stores a first endoscope image and a second endoscope image in the storage unit, the first endoscope image serving as an endoscope image of the first region, the second endoscope image serving as an endoscope image of the second region.
9. The endoscope system according to claim 8, wherein the at least one processor determines which one of the first region, the second region, and the third region includes the current imaging region on a basis of the first endoscope image and the second endoscope image that are stored in the storage unit.
10. The endoscope system according to claim 1, wherein the first position information and the second position information are stored in advance in the storage unit, the first and second position information being determined on a basis of an examination image in the subject captured before a surgical operation.
11. The endoscope system according to claim 1, wherein the at least one processor rotates the endoscope image on a basis of the first rotation angle information if the first region includes the current imaging region, and
the at least one processor rotates the endoscope image on a basis of the second rotation angle information if the second region includes the current imaging region.
12. The endoscope system according to claim 11, wherein the at least one processor determines which one of the first region, the second region, and the third region includes the current imaging region on a basis of a position of the imaging region, the first position information, and the second position information.
13. The endoscope system according to claim 1, wherein the endoscope is supported so as to pivot about a first pivot axis at a predetermined pivot point fixed to the subject, the endoscope pivots about the first pivot axis so as to move the imaging region between the first region and the second region, and
the first position information, the second position information, and the third position information each include a rotation angle of the endoscope about the first pivot axis.
14. The endoscope system according to claim 13, wherein the endoscope is supported so as to pivot about a second pivot axis at the predetermined pivot point, the second pivot axis being orthogonal to the first pivot axis, and
the first position information, the second position information, and the third position information are each three-dimensional information and further include a rotation angle of the endoscope about the second pivot axis.
15. The endoscope system according to claim 13, wherein the moving device includes at least one joint and at least one angle sensor that detects a rotation angle of the at least one joint, and
the at least one processor calculates a rotation angle of the endoscope about the first pivot axis on a basis of the rotation angle detected by the at least one angle sensor.
16. The endoscope system according to claim 1, wherein the storage unit stores a database in which a type of a specific tissue and rotation angle information are associated with each other, and
if the specific tissue is included in the endoscope image of the third region,
the at least one processor reads, from the database, the rotation angle information corresponding to the type of the specific tissue in the endoscope image and
the at least one processor rotates the endoscope image on a basis of the read rotation angle information.
17. The endoscope system according to claim 1, wherein the at least one processor calculates a positional relationship between the third position information and the first and second position information and
the at least one processor calculates the third rotation angle information on a basis of the positional relationship, the first rotation angle information, and the second rotation angle information.
18. The endoscope system according to claim 3, wherein the at least one processor rotates the endoscope image by image processing if a rotation angle of the endoscope about the optical axis reaches a critical angle of a predetermined rotatable range.
19. The endoscope system according to claim 1, wherein the endoscope is a direct-view endoscope or an oblique-view endoscope.
20. The endoscope system according to claim 1, wherein the endoscope has, at a tip portion of the endoscope, a curved portion that is electrically operated to be bent.
21. The endoscope system according to claim 1, wherein the at least one processor is configured to update the first position information or the first rotation angle information to position information or rotation angle information on the current imaging region if the current imaging region is included in the first region, and
the at least one processor is configured to update the second position information or the second rotation angle information to the position information or the rotation angle information on the current imaging region if the current imaging region is included in the second region.
22. The endoscope system according to claim 4, wherein in the manual mode, the at least one processor calculates the third position information and the third rotation angle information and stores the third position information and the third rotation angle information in the storage unit.
23. The endoscope system according to claim 1, wherein the at least one processor is operable in an autonomous mode for autonomously moving the endoscope by controlling the moving device, and
the at least one processor calculates the third position information and the third rotation angle information during the autonomous mode.
24. A controller configured to control an endoscope image that is captured by an endoscope and is displayed on a display device,
the controller comprising:
a storage unit; and
at least one processor,
wherein the storage unit stores first position information and first rotation angle information on a first region in a subject and second position information and second rotation angle information on a second region different from the first region in the subject, the first rotation angle information defining a rotation angle of the endoscope image of the first region, the second rotation angle information defining a rotation angle of the endoscope image of the second region,
the at least one processor calculates third rotation angle information on a third region in the subject on a basis of the first position information, the first rotation angle information, the second position information, the second rotation angle information, and third position information on the third region, the third region being different from the first and second regions,
the at least one processor rotates the endoscope image on a basis of the third rotation angle information if the third region includes a current imaging region that is currently being imaged by the endoscope, and
the at least one processor outputs the rotated endoscope image to the display device.
25. A control method for controlling an endoscope image that is captured by an endoscope and is displayed on a display device, by using first position information and first rotation angle information on a first region in a subject and second position information and second rotation angle information on a second region different from the first region in the subject,
the first rotation angle information defining a rotation angle of the endoscope image of the first region, the second rotation angle information defining a rotation angle of the endoscope image of the second region,
the control method comprising the steps of:
calculating third rotation angle information on a third region in the subject on a basis of the first position information, the first rotation angle information, the second position information, the second rotation angle information, and third position information on the third region, the third region being different from the first and second regions;
rotating the endoscope image on a basis of the third rotation angle information if the third region includes a current imaging region that is currently being imaged by the endoscope; and
outputting the rotated endoscope image to the display device.
26. A computer-readable non-transitory recording medium in which a control program for causing a computer to perform the control method according to claim 25 is stored.
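
As a reading aid only, the calculation recited in claims 1 and 17 can be sketched in Python as follows. Linear interpolation along the direction from the first region to the second region is one plausible reading of calculating the third rotation angle information "on a basis of the positional relationship"; the function names, the clamping of the blend factor, and the use of plain angle vectors as position information are assumptions, not the claimed method itself.

    import numpy as np

    def third_rotation_angle(p1, a1, p2, a2, p3):
        # p1, p2, p3: position information (e.g. pivot angles) as 1-D arrays.
        # a1, a2: stored rotation angles (degrees) for the first and second
        # regions. Project p3 onto the segment p1 -> p2 and blend the two
        # stored angles by the normalized projection t.
        p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
        d = p2 - p1
        denom = float(d @ d)
        if denom == 0.0:
            return a1
        t = float(np.clip(((p3 - p1) @ d) / denom, 0.0, 1.0))
        return (1.0 - t) * a1 + t * a2

    # A third region midway between regions 1 and 2 gets the mean angle: 5.0
    print(third_rotation_angle([0.0, 0.0], -10.0, [30.0, 0.0], 20.0, [15.0, 0.0]))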
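Claims 13 to 15 tie the position information to rotation angles of the endoscope about pivot axes at a fixed pivot point, computed from the joint angle sensors of the moving device. A minimal planar forward-kinematics sketch follows, with hypothetical link lengths and frame conventions (a real arm is spatial and calibrated):

    import numpy as np

    LINK_LENGTHS = [0.30, 0.25]  # hypothetical link lengths of the robot arm (m)

    def endoscope_pivot_angle(joint_angles_rad, pivot_xy):
        # Chain the joint rotations reported by the angle sensors to get the
        # arm tip, then measure the direction of the endoscope as seen from
        # the pivot point: because the scope is constrained to pass through
        # the pivot, this is its rotation angle about the first pivot axis.
        x = y = 0.0
        heading = 0.0
        for q, l in zip(joint_angles_rad, LINK_LENGTHS):
            heading += q
            x += l * np.cos(heading)
            y += l * np.sin(heading)
        px, py = pivot_xy
        return float(np.arctan2(y - py, x - px))

    # Example: joints at 30 and -10 degrees, pivot point fixed at (0.2, 0.0).
    print(np.degrees(endoscope_pivot_angle(np.radians([30.0, -10.0]), (0.2, 0.0))))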
US18/105,300 2020-09-10 2023-02-03 Endoscope system, controller, control method, and recording medium Pending US20230180998A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US18/105,300 US20230180998A1 (en) 2020-09-10 2023-02-03 Endoscope system, controller, control method, and recording medium

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US202063076408P 2020-09-10 2020-09-10
PCT/JP2021/033210 WO2022054884A1 (en) 2020-09-10 2021-09-09 Endoscope system, control device, control method, and recording medium
US18/105,300 US20230180998A1 (en) 2020-09-10 2023-02-03 Endoscope system, controller, control method, and recording medium

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/033210 Continuation WO2022054884A1 (en) 2020-09-10 2021-09-09 Endoscope system, control device, control method, and recording medium

Publications (1)

Publication Number Publication Date
US20230180998A1 (en) 2023-06-15

Family

ID=80629721

Family Applications (4)

Application Number Title Priority Date Filing Date
US18/105,305 Pending US20230172675A1 (en) 2020-09-10 2023-02-03 Controller, endoscope system, and control method
US18/105,314 Pending US20230180996A1 (en) 2020-09-10 2023-02-03 Controller, endoscope system, control method, and control program
US18/105,300 Pending US20230180998A1 (en) 2020-09-10 2023-02-03 Endoscope system, controller, control method, and recording medium
US18/105,291 Abandoned US20230180995A1 (en) 2020-09-10 2023-02-03 Medical system and control method

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US18/105,305 Pending US20230172675A1 (en) 2020-09-10 2023-02-03 Controller, endoscope system, and control method
US18/105,314 Pending US20230180996A1 (en) 2020-09-10 2023-02-03 Controller, endoscope system, control method, and control program

Family Applications After (1)

Application Number Title Priority Date Filing Date
US18/105,291 Abandoned US20230180995A1 (en) 2020-09-10 2023-02-03 Medical system and control method

Country Status (4)

Country Link
US (4) US20230172675A1 (en)
JP (3) JP7535587B2 (en)
CN (3) CN116171122A (en)
WO (4) WO2022054428A1 (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9204939B2 (en) * 2011-08-21 2015-12-08 M.S.T. Medical Surgery Technologies Ltd. Device and method for assisting laparoscopic surgery—rule based approach
TWI782409B (en) * 2020-03-09 2022-11-01 陳階曉 Endoscopic image correction system and method thereof
WO2023195326A1 * 2022-04-05 2023-10-12 Olympus Corporation Endoscope system, procedure supporting method, and procedure supporting program
WO2024009901A1 * 2022-07-08 2024-01-11 Olympus Corporation Endoscope system, control method, and control program
WO2024157360A1 * 2023-01-24 2024-08-02 National Cancer Center Japan Treatment instrument detection device for endoscopic images, treatment instrument detection method for endoscopic images, and treatment instrument detection device program for endoscopic images
US20240349985A1 (en) * 2023-04-24 2024-10-24 Karl Storz Se & Co. Kg Corrective adjustment of image parameters using artificial intelligence
CN118319430A * 2023-12-29 2024-07-12 Beijing Zhiyu Medical Technology Co., Ltd. Endoscope-based monitoring device for a water-jet motion trajectory
WO2025163471A1 (en) * 2024-01-29 2025-08-07 Covidien Lp Hysteroscopic surgical systems for use with surgical robotic systems and surgical robotic systems incorporating the same

Family Cites Families (42)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2797830B2 * 1992-03-31 1998-09-17 Victor Company of Japan, Ltd. Object tracking method for video camera
JP3348933B2 * 1993-03-19 2002-11-20 Olympus Optical Co., Ltd. Electronic endoscope device
JP2833425B2 * 1993-06-30 1998-12-09 Victor Company of Japan, Ltd. Object tracking device for video camera
JP3419869B2 * 1993-12-28 2003-06-23 Olympus Optical Co., Ltd. Medical equipment
JPH0938030A (en) * 1995-07-28 1997-02-10 Shimadzu Corp Endoscope device
JPH09266882A (en) * 1996-04-02 1997-10-14 Olympus Optical Co Ltd Endoscope device
US7037258B2 (en) 1999-09-24 2006-05-02 Karl Storz Imaging, Inc. Image orientation for endoscopic video displays
JP2001112704A (en) * 1999-10-20 2001-04-24 Olympus Optical Co Ltd Endoscope system
JP2003088532A (en) * 2001-09-19 2003-03-25 Olympus Optical Co Ltd Operation instrument
JP4331541B2 2003-08-06 2009-09-16 Olympus Corporation Endoscope device
US20050123179A1 (en) * 2003-12-05 2005-06-09 Eastman Kodak Company Method and system for automatic axial rotation correction in vivo images
US7654997B2 (en) * 2004-04-21 2010-02-02 Acclarent, Inc. Devices, systems and methods for diagnosing and treating sinusitus and other disorders of the ears, nose and/or throat
JP4377745B2 * 2004-05-14 2009-12-02 Olympus Corporation Electronic endoscope
JP4699040B2 * 2005-02-15 2011-06-08 Panasonic Corporation Automatic tracking control device, automatic tracking control method, program, and automatic tracking system
JP4785127B2 * 2005-12-08 2011-10-05 Waseda University Endoscopic visual field expansion system, endoscopic visual field expansion device, and endoscope visual field expansion program
JP4980625B2 * 2006-02-21 2012-07-18 Fujifilm Corporation Body cavity observation device
US7841980B2 (en) * 2006-05-11 2010-11-30 Olympus Medical Systems Corp. Treatment system, trocar, treatment method and calibration method
JP5030639B2 * 2007-03-29 2012-09-19 Olympus Medical Systems Corp. Endoscope device treatment instrument position control device
US8083669B2 (en) * 2007-06-22 2011-12-27 Olympus Medical Systems Corp. Medical device for maintaining state of treatment portion
JP5192898B2 * 2008-04-25 2013-05-08 Olympus Medical Systems Corp. Manipulator system
WO2012078989A1 (en) * 2010-12-10 2012-06-14 Wayne State University Intelligent autonomous camera control for robotics with medical, military, and space applications
JP6021369B2 * 2012-03-21 2016-11-09 Hoya Corporation Endoscope system
TWI517828B (en) * 2012-06-27 2016-01-21 國立交通大學 Image tracking system and image tracking method thereof
EP3125806B1 (en) * 2014-03-28 2023-06-14 Intuitive Surgical Operations, Inc. Quantitative three-dimensional imaging of surgical scenes
CN106456267B (en) * 2014-03-28 2020-04-03 直观外科手术操作公司 Quantitative 3D visualization of instruments in the field of view
JP6177488B2 * 2015-07-23 2017-08-09 Olympus Corporation Manipulator and medical system
WO2017082047A1 (en) * 2015-11-13 2017-05-18 オリンパス株式会社 Endoscope system
JPWO2017130567A1 * 2016-01-25 2018-11-22 Sony Corporation Medical safety control device, medical safety control method, and medical support system
JP6150968B1 * 2016-02-10 2017-06-21 Olympus Corporation Endoscope system
CN107456278B (en) * 2016-06-06 2021-03-05 北京理工大学 Endoscopic surgery navigation method and system
JP2019165270A * 2016-08-03 2019-09-26 Sharp Corporation Video image output system, video image output method, and control apparatus
WO2018051565A1 (en) * 2016-09-15 2018-03-22 オリンパス株式会社 Ultrasonic endoscope and ultrasonic endoscope system
WO2018159328A1 (en) * 2017-02-28 2018-09-07 ソニー株式会社 Medical arm system, control device, and control method
EP3603562B1 (en) * 2017-03-28 2022-06-29 Sony Olympus Medical Solutions Inc. Medical observation apparatus and observation field correction method
WO2018235255A1 (en) * 2017-06-23 2018-12-27 オリンパス株式会社 Medical system and its operating method
WO2019035206A1 (en) * 2017-08-18 2019-02-21 オリンパス株式会社 Medical system and image generation method
DE102017219621B4 (en) * 2017-09-22 2025-11-13 Carl Zeiss Meditec Ag Visualization system with an observation device and an endoscope
WO2019116592A1 (en) * 2017-12-14 2019-06-20 オリンパス株式会社 Device for adjusting display image of endoscope, and surgery system
JP7151109B2 * 2018-03-19 2022-10-12 Sony Group Corporation Medical imaging device and medical observation system
WO2020070883A1 (en) * 2018-10-05 2020-04-09 オリンパス株式会社 Endoscopic system
JP7596269B2 * 2019-02-21 2024-12-09 Theator Inc. Systems and methods for analysis of surgical videos
IL290896B2 (en) * 2019-08-30 2025-07-01 Brainlab Ag Image based motion control correction

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160353970A1 (en) * 2014-02-20 2016-12-08 Olympus Corporation Endoscope system and the method of controlling the endoscope
US20220079415A1 (en) * 2017-09-22 2022-03-17 Carl Zeiss Meditec Ag Visualization system comprising an observation apparatus and an endoscope
US20200297200A1 (en) * 2019-03-18 2020-09-24 Sony Olympus Medical Solutions Inc. Medical observation apparatus
US20220192777A1 (en) * 2019-07-10 2022-06-23 Sony Group Corporation Medical observation system, control device, and control method
US20220354347A1 (en) * 2019-09-12 2022-11-10 Sony Group Corporation Medical support arm and medical system

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180085175A1 (en) * 2015-08-19 2018-03-29 Brainlab Ag Determining a configuration of a medical robotic arm
US12201373B2 (en) * 2015-08-19 2025-01-21 Brainlab Ag Determining a configuration of a medical robotic arm
US20230255442A1 (en) * 2022-02-11 2023-08-17 Canon U.S.A., Inc. Continuum robot apparatuses, methods, and storage mediums

Also Published As

Publication number Publication date
CN116018538A (en) 2023-04-25
US20230180996A1 (en) 2023-06-15
CN115996662B (en) 2025-11-18
JP7522840B2 (en) 2024-07-25
JPWO2022054884A1 (en) 2022-03-17
US20230172675A1 (en) 2023-06-08
WO2022054883A1 (en) 2022-03-17
JP7534423B2 (en) 2024-08-14
WO2022054882A1 (en) 2022-03-17
WO2022054428A1 (en) 2022-03-17
US20230180995A1 (en) 2023-06-15
JPWO2022054428A1 (en) 2022-03-17
JP7535587B2 (en) 2024-08-16
CN116171122A (en) 2023-05-26
WO2022054884A1 (en) 2022-03-17
CN115996662A (en) 2023-04-21
JPWO2022054882A1 (en) 2022-03-17

Similar Documents

Publication Publication Date Title
US20230180998A1 (en) Endoscope system, controller, control method, and recording medium
CN110225720B (en) Operation support device, recording medium, and operation support system
JP7160033B2 (en) Input control device, input control method, and surgical system
JP7697551B2 (en) Medical observation system, medical observation device, and medical observation method
US10638915B2 (en) System for moving first insertable instrument and second insertable instrument, controller, and computer-readable storage device
KR102358967B1 (en) Systems and methods for control of imaging instrument orientation
KR101038417B1 (en) Surgical Robot System and Its Control Method
US10441146B2 (en) Method of measuring distance by an endoscope, and endoscope system
CN113786152B (en) Endoscope lens tracking method and endoscope system
CN113645919A (en) Medical arm system, control device, and control method
US20250117073A1 (en) Systems and methods for facilitating optimization of an imaging device viewpoint during an operating session of a computer-assisted operation system
US20190183321A1 (en) Image output system, image output method and control device
US10799100B2 (en) Image processing device, method, and program
US20230139425A1 (en) Systems and methods for optimizing configurations of a computer-assisted surgical system for reachability of target objects
US20190159860A1 (en) Photographing system, photographing method and control device
CN113614607B (en) Medical observation system, method and medical observation device
CN118490146B (en) Capsule endoscope control method, device and control system
EP4636696A1 (en) Medical image processing device and method of operating the same
US20230131209A1 (en) Treatment device and endoscope system
JP2020018492A (en) Medical drone system
US20210298854A1 (en) Robotically-assisted surgical device, robotically-assisted surgical method, and system
CN120605115A (en) A surgical imaging system
WO2025206245A1 (en) Surgery assistance system and surgery assistance method
CN116152331A (en) Image acquisition assembly adjusting method, device and operating system
CN117398180A (en) Viewing angle adjustment method of three-dimensional reconstruction model, surgical navigation host, and system

Legal Events

Date Code Title Description
AS Assignment

Owner name: NATIONAL CANCER CENTER, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MIZUTANI, CHIHARU;YANAGIHARA, MASARU;OGIMOTO, HIROTO;AND OTHERS;SIGNING DATES FROM 20230105 TO 20230111;REEL/FRAME:062583/0700

Owner name: OLYMPUS CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MIZUTANI, CHIHARU;YANAGIHARA, MASARU;OGIMOTO, HIROTO;AND OTHERS;SIGNING DATES FROM 20230105 TO 20230111;REEL/FRAME:062583/0700

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION COUNTED, NOT YET MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED