US20230180995A1 - Medical system and control method
- Publication number
- US20230180995A1 (application No. US 18/105,291; also published as US 2023/0180995 A1)
- Authority
- US
- United States
- Prior art keywords
- endoscope
- tip
- controller
- specific region
- surgical instrument
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- A61B34/30—Surgical robots
- A61B34/32—Surgical robots operating autonomously
- A61B34/25—User interfaces for surgical systems
- A61B1/0005—Display arrangement combining images e.g. side-by-side, superimposed or tiled
- A61B1/00006—Operational features of endoscopes characterised by electronic signal processing of control signals
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
- A61B1/000094—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope extracting biological structures
- A61B1/000096—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope using artificial intelligence
- A61B1/0004—Operational features of endoscopes provided with input arrangements for the user for electronic operation
- A61B1/00042—Operational features of endoscopes provided with input arrangements for the user for mechanical operation
- A61B1/00045—Display arrangement
- A61B1/00149—Holding or positioning arrangements using articulated arms
- A61B1/0016—Holding or positioning arrangements using motor drive units
- A61B1/00193—Optical arrangements adapted for stereoscopic vision
- A61B1/045—Control of endoscopes combined with photographic or television appliances
- A61B5/065—Determining position of the probe employing exclusively positioning means located on or in the probe, e.g. using position sensors arranged on the probe
- A61B5/6847—Detecting, measuring or recording means, e.g. sensors, mounted on an invasive device
- A61B2034/2055—Optical tracking systems
- A61B2034/2065—Tracking using image or pattern recognition
- A61B2034/2068—Surgical navigation using pointers, e.g. pointers having reference marks for determining coordinates of body points
- A61B2034/301—Surgical robots for introducing or steering flexible instruments inserted into the body, e.g. catheters or endoscopes
- A61B2034/305—Details of wrist mechanisms at distal ends of robotic arms
- A61B2090/363—Use of fiducial points
- G02B23/24—Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
- G06T3/60—Rotation of whole images or parts thereof
- G06T7/70—Determining position or orientation of objects or cameras
- G06T2207/10012—Stereo images
- G06T2207/10068—Endoscopic image
- G06T2207/30004—Biomedical image processing
- G06T2207/30244—Camera pose
- H04N23/62—Control of parameters via user interfaces
- H04N23/667—Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
Definitions
- the present invention relates to a medical system and a control method and particularly relates to a medical system having the function of causing an endoscope to follow an object, and a control method thereof.
- the present application claims priority to U.S. Provisional Patent Application No. 63/076,408, filed on Sep. 10, 2020, and is a continuation of International Application PCT/JP2021/027564; each of these applications is incorporated herein by reference in its entirety.
- the field of view may become unstable and cause a surgeon to feel stress.
- the field of view desirably stays stationary during procedures such as blunt dissection and thus a movement of the field of view may interfere with the procedure.
- a permissible region is set in an image so as to extend around the central region of the image, an endoscope follows a surgical instrument to place the surgical instrument back to the central region when the surgical instrument moves out of the permissible region, and the following is terminated when the surgical instrument moves into the central region.
- This configuration prevents the endoscope from following the surgical instrument insofar as the surgical instrument stays in the permissible region and the central region, thereby suppressing an excessive movement of the field of view.
- Another aspect of the present invention is a control method for controlling a movement of an endoscope on the basis of the position of an object, the endoscope capturing an image including the object, the control method including: controlling the movement of the endoscope in a first control mode in which the endoscope is caused to follow the object at a first speed, when the object is located outside a predetermined three-dimensional region set in a field of view of the endoscope; and controlling the movement of the endoscope in a second control mode in which the endoscope is caused to follow the object at a second speed lower than the first speed, when the object is located in the predetermined three-dimensional region.
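The two-mode control of this aspect can be sketched in a few lines. The box-shaped region, coordinate frame, and speed values below are illustrative assumptions, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class Region3D:
    """Axis-aligned bounds of the specific region in the endoscope frame (mm, assumed)."""
    x: tuple  # depth bounds along the optical axis
    y: tuple  # lateral bounds
    z: tuple  # longitudinal bounds

    def contains(self, p):
        px, py, pz = p
        return (self.x[0] <= px <= self.x[1] and
                self.y[0] <= py <= self.y[1] and
                self.z[0] <= pz <= self.z[1])

FIRST_SPEED = 10.0   # first control mode, object outside the region (assumed value)
SECOND_SPEED = 2.0   # second control mode, object inside the region (assumed value)

def following_speed(tip_position, region):
    """Select the following speed from the object's three-dimensional position."""
    return SECOND_SPEED if region.contains(tip_position) else FIRST_SPEED
```

The key point is that the mode switch is driven purely by the object's 3D position relative to the predetermined region in the field of view.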
- FIG. 1 illustrates an appearance of a medical system according to an embodiment of the present invention.
- FIG. 2 is a block diagram of the medical system illustrated in FIG. 1 .
- FIG. 3 illustrates a three-dimensional specific region set in the field of view of an endoscope.
- FIG. 4 B is an endoscope image illustrating another example of a cross section of the specific region.
- FIG. 4 C is an endoscope image illustrating another example of a cross section of the specific region.
- FIG. 5 is an explanatory drawing of the displayed size of the specific region on the endoscope image at the depth positions X1, X2, and X3 in FIG. 3.
- FIG. 6 is an explanatory drawing of a movement of a surgical instrument followed by the endoscope in the endoscope image.
- FIG. 7 D is an explanatory drawing of the specific example of the method for calculating the specific region.
- FIG. 8 A is a flowchart of a control method performed by a controller in FIG. 1 .
- FIG. 8 B is a flowchart of a modification of the control method performed by the controller in FIG. 1 .
- FIG. 9 A is an explanatory drawing of a method for setting the size of the specific region according to the viewing angle of the endoscope.
- FIG. 9 B is an explanatory drawing of the method for setting the size of the specific region according to the viewing angle of the endoscope.
- FIG. 10 is an explanatory drawing of a modification of a movement of the surgical instrument followed by the endoscope in the endoscope image.
- FIG. 11 illustrates a three-dimensional specific region in a reference example.
- the endoscope 1 is, for example, a rigid endoscope and includes an imaging portion 1 a that has an image sensor and captures an endoscope image (see FIG. 2 ).
- the endoscope 1 captures an endoscope image D (see FIGS. 5 and 6 ), which includes a tip 2 a of the surgical instrument 2 , through the imaging portion 1 a and transmits the endoscope image D to the controller 4 .
- the imaging portion 1 a is, for example, a three-dimensional camera provided at the tip portion of the endoscope 1 and captures a stereo image, which includes information on the three-dimensional position of the tip 2 a of the surgical instrument 2 , as the endoscope image D.
- the moving device 3 includes a robot arm 3 a having a plurality of joints 3 b and holds the proximal portion of the endoscope 1 at the tip portion of the robot arm 3 a .
- the robot arm 3 a has three degrees of freedom of movement: a back-and-forth linear motion along the X axis, a rotation (pitch) about the Y axis, and a rotation (yaw) about the Z axis.
- a rotation (roll) about the X axis is preferably added as a degree of freedom of movement.
- the X axis is an axis on the same straight line as an optical axis A of the endoscope 1
- the Y and Z axes are axes that are orthogonal to the optical axis A and extend in respective directions corresponding to the lateral direction and the longitudinal direction of the endoscope image D.
- the controller 4 includes at least one processor 4 a like a central processing unit, a memory 4 b , a storage unit 4 c , an input interface 4 d , an output interface 4 e , and a network interface 4 f .
- the endoscope images D transmitted from the endoscope 1 are sequentially inputted to the controller 4 through the input interface 4 d , are sequentially outputted to the display device 5 through the output interface 4 e , and are displayed on the display device 5 .
- a surgeon operates the surgical instrument 2 inserted into a body while observing the endoscope image D displayed on the display device 5 , and performs an operation on an affected area in the body by using the surgical instrument 2 .
- the storage unit 4 c is a ROM (read-only memory) or a nonvolatile recording medium such as a hard disk and stores a program and data necessary for causing the processor 4 a to perform processing.
- the program is read in the memory 4 b and is executed by the processor 4 a , thereby implementing the functions of the controller 4 .
- the functions will be described later.
- Some of the functions of the controller 4 may be implemented by dedicated logic circuits or the like.
- the controller 4 has a manual mode and a follow-up mode.
- the manual mode is a mode in which an operator, e.g., a surgeon manually operates the endoscope 1
- the follow-up mode is a mode in which the controller 4 causes the endoscope 1 to automatically follow the tip 2 a of the surgical instrument (object) 2 .
- the controller 4 controls the moving device 3 on the basis of the three-dimensional position of the tip 2 a of the surgical instrument 2 , so that the endoscope 1 is caused to three-dimensionally follow the tip 2 a so as to move the tip 2 a toward the center of the endoscope image D and to a predetermined depth of the endoscope image D. Specifically, the controller 4 recognizes the surgical instrument 2 in the endoscope image D and calculates the three-dimensional position of the tip 2 a by using the endoscope image D.
- the controller 4 then operates the joints 3 b such that the optical axis A of the endoscope 1 moves to the tip 2 a in a direction that crosses the optical axis A and the tip of the endoscope 1 moves to a position at a predetermined observation distance from the tip 2 a in the depth direction extending along the optical axis A.
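The per-cycle motion described here can be sketched as a simple proportional law: re-center the optical axis on the tip in the Y and Z directions and drive the insertion depth toward the observation distance along X. The gains, signs, and command names are assumptions for illustration:

```python
def motion_command(tip_xyz, observation_distance_mm, k_lateral=0.5, k_depth=0.5):
    """Velocity commands steering the optical axis toward the tip (gains assumed)."""
    x, y, z = tip_xyz  # tip position in the endoscope frame, X along the optical axis
    return {
        "yaw":   -k_lateral * y,                            # re-center laterally
        "pitch": -k_lateral * z,                            # re-center longitudinally
        "insert": k_depth * (x - observation_distance_mm),  # hold the observation distance
    }
```

When the tip sits on the optical axis at the target distance, all three commands are zero, so the endoscope holds its pose.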
- the follow-up mode includes a first control mode in which the endoscope 1 is caused to follow the tip 2 a of the surgical instrument 2 at a first speed and a second control mode in which the endoscope 1 is caused to follow the tip 2 a of the surgical instrument 2 at a second speed lower than the first speed.
- the controller 4 controls the moving device 3 in the first control mode when the tip 2 a is located outside a predetermined specific region B, and the controller 4 controls the moving device 3 in the second control mode when the tip 2 a is located in the specific region B.
- in the second control mode, the sensitivity with which the endoscope 1 follows a movement of the tip 2a decreases, thereby suppressing excessive following motion of the endoscope 1 with respect to the tip 2a.
- the specific region B is a predetermined three-dimensional region that is set in a field of view F of the endoscope 1 and has dimensions in the X direction, the Y direction, and the Z direction that are orthogonal to one another.
- the X direction is a depth direction parallel to the optical axis A of the endoscope 1 .
- the Y direction and the Z direction are directions that are orthogonal to the optical axis A and are parallel respectively to the lateral direction and the longitudinal direction of the endoscope image D.
- the specific region B and the endoscope image D are identical in shape in cross section; since the endoscope image D is rectangular, the specific region B is also rectangular in cross section.
- the specific region B displayed on the endoscope image D may interfere with an observation of the endoscope image D and thus is preferably hidden.
- even when hidden, the specific region B occupies a fixed portion of the endoscope image D, and thus the surgeon easily recognizes the position of the hidden specific region B.
- the field of view F of the endoscope 1 is typically shaped like a cone having the vertex at or near the tip of the endoscope 1 .
- the specific region B is preferably a frustum with the vertex shared with the field of view F of the endoscope 1 . As illustrated in FIG. 5 , the size and position of the displayed specific region B are fixed on the endoscope image D regardless of positions X1, X2, and X 3 in the X direction.
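The geometric point, that a frustum sharing its vertex with the conical field of view projects to a fixed displayed size at any depth, can be checked with a short sketch (the half-angles below are illustrative assumptions):

```python
import math

def half_width(depth_mm, half_angle_deg):
    # Cross-section half-width of a cone or frustum with its vertex at the
    # endoscope tip: it grows linearly with depth along the optical axis.
    return depth_mm * math.tan(math.radians(half_angle_deg))

def displayed_fraction(depth_mm, region_half_angle_deg, fov_half_angle_deg):
    # Fraction of the image width occupied by the region's cross section.
    # The depth factor cancels, so the displayed size is depth-independent.
    return (half_width(depth_mm, region_half_angle_deg) /
            half_width(depth_mm, fov_half_angle_deg))
```

This is why the frustum shape keeps the region's on-screen footprint constant at positions X1, X2, and X3, unlike the fixed-cross-section region B′ of the reference example.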
- if the specific region B is too small relative to the endoscope image D, the effect of suppressing excessive following motion of the endoscope 1 relative to a movement of the tip 2a may become insufficient, leading to frequent movements of the field of view F.
- if the size of the specific region B is larger than 55% of the size of the endoscope image D, the tip 2a is frequently disposed at a position remote from the center of the endoscope image D, leading to difficulty in placing the tip 2a at the center.
- the specific region B includes a non-following region B 1 and a following region B 2 .
- the non-following region B 1 is a central region of the specific region B including the optical axis A.
- the following region B 2 is an outer region of the specific region B and surrounds the non-following region B 1 .
- the non-following region B 1 has a three-dimensional shape that decreases in size in cross section toward the tip of the endoscope 1 .
- the non-following region B 1 is preferably a frustum.
- the controller 4 keeps the position of the endoscope 1 without causing the endoscope 1 to follow the tip 2 a . Specifically, the controller 4 controls the angular velocities of the joints 3 b to zero. Thus, the second speed in the non-following region B 1 is zero.
- the following region B 2 acts as a trigger for starting following of the tip 2 a by the endoscope 1
- the non-following region B 1 acts as a trigger for terminating following of the tip 2 a by the endoscope 1 .
- when the tip 2a moves from the following region B2 to an outer region C outside the specific region B, the endoscope 1 starts following the tip 2a.
- when the tip 2a enters the non-following region B1 from the outer region C through the following region B2, following of the tip 2a by the endoscope 1 is terminated.
- in this embodiment, the first speed V1 and the second speed V2 are each kept constant, so that the following speed of the endoscope 1 is changed in two steps.
- the first speed V 1 and the second speed V 2 may change according to a distance from the center of the endoscope image D to the tip 2 a .
- the controller 4 may calculate distances from the optical axis A of the endoscope 1 to the tip 2 a in the Y direction and the Z direction and increase the speeds V 1 and V 2 according to the distances.
- the following speeds V 1 and V 2 of the endoscope 1 may continuously decrease from the outer region C to the non-following region B 1 .
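A distance-dependent speed profile of the kind described in the preceding bullets can be sketched as follows; the gain and the speed cap are assumed values, not from the patent:

```python
import math

def distance_scaled_speed(dy_mm, dz_mm, base_speed, gain=0.2, max_speed=20.0):
    """Increase the following speed with the tip's off-axis distance (gain assumed)."""
    r = math.hypot(dy_mm, dz_mm)  # distance from the optical axis in the YZ plane
    return min(base_speed + gain * r, max_speed)
```

Passing V1 or V2 as `base_speed` gives a speed that grows with the Y/Z distance from the optical axis and decreases continuously as the tip approaches the non-following region.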
- FIGS. 7 A to 7 D are explanatory drawings of a method for calculating the specific region B.
- the controller 4 sets, as a fiducial point E, an intersection point of the optical axis A and a YZ plane P that passes through the tip 2 a of the surgical instrument 2 and is perpendicular to the optical axis A.
- the controller 4 then defines, as the specific region B, a region like a rectangular solid or a sphere that is centered around the fiducial point E.
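The fiducial-point construction can be sketched as follows, assuming the endoscope frame of FIG. 3 with the optical axis taken as the X axis (the coordinate convention and box variant are assumptions):

```python
def fiducial_point(tip_xyz):
    # The YZ plane through the tip, perpendicular to the optical axis,
    # meets the axis at the tip's depth coordinate.
    return (tip_xyz[0], 0.0, 0.0)

def box_about(fiducial, half_extents):
    # Rectangular-solid specific region centered on the fiducial point,
    # given half-extents in the X, Y, and Z directions.
    return tuple((c - h, c + h) for c, h in zip(fiducial, half_extents))
```

A sphere centered on the same fiducial point would be handled analogously, with a radius test instead of per-axis bounds.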
- FIGS. 7 B and 7 C are explanatory drawings of a method for calculating an actual size [mm] of the specific region B.
- an actual size L_dz [mm] of the specific region B in the Z direction is calculated from the following formula by using the pixel size of the specific region B in the Z direction.
- An actual size L_dy[mm] of the specific region B in the Y direction is also calculated by the same method as L_dz.
- an actual size L_dx of the specific region B in the X direction is also set.
- the actual size L_dx may be set at a fixed value regardless of the observation distance di.
- alternatively, an actual size L_dx set at a reference observation distance di (e.g., d1) may be converted into an actual size L_dx at another observation distance di (e.g., d2).
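The patent's formula for L_dz is not reproduced in this excerpt; as a stand-in, a standard similar-triangles (pinhole) conversion illustrates how a pixel extent scales to an actual size with the observation distance di. The focal-length parameter is an assumption:

```python
def actual_size_mm(pixel_extent, observation_distance_mm, focal_length_px):
    # Pinhole-style conversion: for a fixed pixel extent on the image,
    # the corresponding physical size grows linearly with the observation
    # distance and shrinks with the focal length (in pixels).
    return pixel_extent * observation_distance_mm / focal_length_px
```

The same conversion applied in the Y direction yields L_dy, which is why the text notes that L_dy is calculated by the same method as L_dz.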
- a surgeon performs a procedure by operating the surgical instrument 2 inserted into a body while observing the endoscope image D displayed on the display device 5 .
- the surgeon switches from the manual mode to the follow-up mode or from the follow-up mode to the manual mode in response to, for example, a voice.
- when switching to the follow-up mode in step S1, the controller 4 performs the control method of steps S2 to S8 and controls the moving device 3 in the follow-up mode.
- the control method includes step S 2 of determining whether the tip 2 a of the surgical instrument 2 is located in the specific region B and steps S 3 to S 8 of causing, when the position of the tip 2 a is located outside the specific region B, the endoscope 1 to follow the surgical instrument 2 until the tip 2 a of the surgical instrument 2 reaches the non-following region B 1 .
- the controller 4 calculates the three-dimensional position of the tip 2 a by using the endoscope image D, which is a stereo image, and determines whether the tip 2 a is located in the predetermined specific region B (step S 2 ).
- when the tip 2a is located in the specific region B (YES at step S2), the controller 4 keeps the position of the endoscope 1 without causing the endoscope 1 to follow the surgical instrument 2.
- when the tip 2a is located outside the specific region B (NO at step S2), the controller 4 starts following of the surgical instrument 2 by the endoscope 1 (step S3).
- the controller 4 selects one of the first control mode and the second control mode on the basis of the position of the tip 2 a .
- the tip 2 a is located outside the specific region B at the start of following (NO at step S 4 ) and thus the controller 4 controls the moving device 3 in the first control mode, so that the endoscope 1 is caused to follow the tip 2 a of the surgical instrument 2 at the first speed V 1 so as to move the tip 2 a of the surgical instrument 2 toward the center of the endoscope image D (step S 5 ).
- the controller 4 controls the moving device 3 in the first control mode until the tip 2 a enters the specific region B.
- the controller 4 controls the moving device 3 in the second control mode, so that the endoscope 1 is caused to follow the tip 2 a of the surgical instrument 2 at the second speed V 2 so as to move the tip 2 a of the surgical instrument 2 toward the center of the endoscope image D. Since the second speed V 2 is lower than the first speed V 1 , the responsivity for following a movement of the tip 2 a by the endoscope 1 decreases. In other words, after the tip 2 a returns to the specific region B from the outer region C, the endoscope 1 is prevented from excessively following a movement of the surgical instrument 2 . The controller 4 controls the moving device 3 in the second control mode until the tip 2 a enters the non-following region B 1 .
- the controller 4 causes the endoscope 1 to finish following the surgical instrument 2 (step S 8 ).
- while the follow-up mode continues (NO at step S9), the controller 4 repeats steps S1 to S8.
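The loop of steps S2 to S8 can be condensed into a small per-cycle state machine. The step labels in the comments follow the flowchart, while the speed values are assumptions:

```python
FIRST_SPEED = 10.0   # V1, first control mode (assumed value)
SECOND_SPEED = 2.0   # V2, second control mode (assumed value)

def follow_step(following, in_b, in_b1):
    """One control cycle; returns (new following state, commanded speed)."""
    if not following:
        if in_b:                      # S2: tip already in the specific region B
            return False, 0.0         # hold the endoscope position
        return True, FIRST_SPEED      # S3/S5: start following at V1
    if in_b1:                         # S8: tip reached the non-following region B1
        return False, 0.0             # terminate following
    if in_b:                          # S7: inside B but outside B1
        return True, SECOND_SPEED     # slow second control mode at V2
    return True, FIRST_SPEED          # S4/S5: still outside B, first mode at V1
```

Calling `follow_step` once per cycle with the tip's region membership reproduces the start-fast, slow-near-center, stop-at-B1 behavior of the flowchart.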
- to allow a surgeon to follow an object through the endoscope 1 with ease of operation, it is desirable to cause the endoscope 1 to follow the surgical instrument 2 so as to satisfy three conditions: suppressing excessive following motion, bringing the tip 2a of the surgical instrument 2 to the center of the endoscope image D, and bringing the tip 2a of the surgical instrument 2 to a proper distance in the X direction.
- the specific region B is a three-dimensional region set in the field of view F and thus can be properly designed.
- a distance between the tip of the endoscope 1 and the specific region B in the X direction and a size of the specific region B in cross section at each position in the X direction are designed to satisfy the three conditions. This allows the endoscope 1 to follow the surgical instrument 2 with ease of operation.
- the specific region B is shaped to decrease in size in cross section toward the tip of the endoscope 1 , thereby suppressing a difference in the size of the displayed specific region B on the endoscope image D between positions in the X direction.
- the specific region B is preferably displayed with a fixed size regardless of the position in the X direction. This can suppress excessive following motion of the endoscope 1 and place the tip 2 a at the center regardless of the position of the tip 2 a in the X direction.
- FIG. 11 illustrates a specific region B′ as a reference example.
- Since the specific region B′ is formed by simply extending a two-dimensional region on the image plane of the endoscope image D, the specific region B′ extends from the tip of the endoscope 1 in the X direction. Therefore, the endoscope 1 cannot be caused to follow the surgical instrument 2 such that the tip 2 a is brought to a proper distance in the X direction.
- the size of the specific region B′ is fixed in cross section and thus the displayed specific region B′ on the endoscope image D changes in size according to the position in the X direction.
- the displayed specific region B′ decreases in size at a position X 3 remote from the tip of the endoscope 1 in the X direction, so that excessive following motion of the endoscope 1 cannot be suppressed, though the tip 2 a can be placed at the center.
- the size of the displayed specific region B′ increases at a position X1 close to the tip of the endoscope 1 in the X direction. This can suppress excessive following motion of the endoscope 1 but leads to difficulty in placing the tip 2 a at the center.
- When the tip 2 a is located in the following region B 2 , the controller 4 continues the operation of the endoscope 1 in the previous control cycle.
- the endoscope 1 may be caused to always follow the tip 2 a at the second speed V 2 larger than zero.
- the controller 4 may control the moving device 3 in the second control mode in either of the following cases: where the tip 2 a enters the following region B 2 from the outer region C, and where the tip 2 a enters the following region B 2 from the non-following region B 1 .
- the controller 4 determines whether the tip 2 a is located in the non-following region B 1 (step S 2 ′). When the tip 2 a moves from the non-following region B 1 into the following region B 2 (NO at step S 2 ′), the controller 4 starts following the surgical instrument 2 by the endoscope 1 (step S 3 ).
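The start/stop behavior described here is a small hysteresis that can be sketched as one state update per control cycle; the region labels and function name are illustrative, not identifiers from the embodiment:

```python
def update_following(was_following: bool, region: str) -> bool:
    """One control cycle of the start/stop hysteresis.

    Leaving the specific region outward ('outer', region C) starts following,
    reaching the non-following region B1 ('non_following') stops it, and
    inside the following region B2 ('following') the previous cycle's
    behavior is continued unchanged.
    """
    if region == "outer":
        return True
    if region == "non_following":
        return False
    return was_following  # inside B2: keep doing what the last cycle did
```

Because the stop trigger (B1) is strictly inside the start trigger (the boundary of B2), small movements of the tip inside B2 cannot toggle the endoscope on and off repeatedly.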
- the tip 2 a is located in the following region B 2 at the start of following (YES at step S 4 and NO at step S 6 ) and thus the controller 4 controls the moving device 3 in the second control mode, so that the endoscope 1 is caused to follow the tip 2 a of the surgical instrument 2 at the second speed V 2 so as to move the tip 2 a of the surgical instrument 2 toward the center of the endoscope image D (step S 7 ).
- the controller 4 controls the moving device 3 in the second control mode until the tip 2 a enters the non-following region B 1 .
- the second speed V 2 is lower than the first speed V 1 , the endoscope 1 is prevented from excessively following a movement of the surgical instrument 2 while the tip 2 a moves in the following region B 2 .
- the controller 4 switches from the second control mode to the first control mode (step S 5 ) and controls the moving device 3 in the first control mode until the tip 2 a returns to the specific region B.
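The mode selection in this control flow reduces to choosing a following speed from the tip's current region; a sketch with illustrative region labels (the step numbers in the comments refer to the flowchart described above):

```python
def select_speed(region: str, v1: float, v2: float) -> float:
    """Choose the following speed for one control cycle.

    First control mode (speed V1) applies outside the specific region B,
    second control mode (speed V2 < V1) applies inside B but outside the
    non-following region B1, and the speed is zero inside B1.
    """
    if region == "outside_B":
        return v1   # first control mode (step S5)
    if region == "B2":
        return v2   # second control mode (step S7)
    return 0.0      # non-following region B1: following finishes (step S8)
```
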
- The controller 4 may change the size of the specific region B in cross section according to the viewing angle α of the endoscope 1 .
- A value of the viewing angle α is stored in the storage unit 4 c for each type of the endoscope 1 .
- The controller 4 recognizes the type of the endoscope 1 held by the robot arm 3 a , reads the value of the viewing angle α of the recognized type from the storage unit 4 c , and sets the vertex angle β of the specific region B at a predetermined rate of the viewing angle α .
- The vertex angle β is calculated by multiplying the value of the viewing angle α by a predetermined rate k selected from 25% to 55%.
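As arithmetic this is a single multiplication; a minimal sketch (the function name and the range check are illustrative additions):

```python
def vertex_angle_deg(viewing_angle_deg: float, k: float) -> float:
    """Vertex angle of the specific region as a predetermined rate k of the
    endoscope's viewing angle, with k selected from 25% to 55%."""
    if not 0.25 <= k <= 0.55:
        raise ValueError("rate k should be selected from 25% to 55%")
    return viewing_angle_deg * k
```
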
- The specific region B increases in size in cross section in proportion to the viewing angle α .
- The area ratio of the cross section of the specific region B to the cross section of the field of view F is kept constant regardless of a difference in the viewing angle α of the used endoscope 1 .
- The displayed specific region B on the endoscope image D displayed on the display device 5 can have the same size regardless of the viewing angle α of the endoscope 1 .
- the specific region B includes the non-following region B 1 where the endoscope 1 is not caused to follow the surgical instrument 2 .
- the non-following region B 1 may be absent in the specific region B.
- the controller 4 causes the endoscope 1 to follow the surgical instrument 2 at the second speed V 2 until the tip 2 a of the surgical instrument 2 is located at the center of the endoscope image D.
- the controller 4 causes the endoscope 1 to finish following the surgical instrument 2 .
- the endoscope 1 finishes following the tip 2 a when the tip 2 a reaches the end of the following region B 2 remote from the center of the endoscope image D.
- the endoscope 1 is caused to follow the tip 2 a until the tip 2 a reaches the center of the endoscope image D.
- a procedure can be performed with the tip 2 a located at the center of the endoscope image D.
- the second speed V 2 is preferably 50% or less of the first speed V 1 .
- the second speed V 2 may remain constant or gradually decrease as the tip 2 a of the surgical instrument 2 moves close to the center of the endoscope image D. If the second speed V 2 is higher than 50% of the first speed V 1 , it is difficult to sufficiently obtain the effect of suppressing excessive following motion of the endoscope 1 .
- the shape of the specific region B may be changed in cross section.
- the cross-sectional shape can be selected from a rectangle, a circle, and an ellipse in FIGS. 4 A to 4 C , and parameters dy, dz, R, a, and b can be set to determine the size of each shape in cross section.
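A membership test for these cross-sectional shapes can be sketched as follows. The parameter conventions (dy and dz as half-sizes of the rectangle, R as the circle radius, a and b as the ellipse semi-axes in the Y and Z directions) are assumptions for illustration; the embodiment only names the parameters:

```python
def in_cross_section(y: float, z: float, shape: str, **p) -> bool:
    """Test whether an image-plane offset (y, z) from the optical axis lies
    inside the specific region's cross section.

    Assumed conventions: dy, dz are rectangle half-sizes, R is the circle
    radius, a and b are ellipse semi-axes (Y and Z directions).
    """
    if shape == "rectangle":
        return abs(y) <= p["dy"] and abs(z) <= p["dz"]
    if shape == "circle":
        return y * y + z * z <= p["R"] ** 2
    if shape == "ellipse":
        return (y / p["a"]) ** 2 + (z / p["b"]) ** 2 <= 1.0
    raise ValueError("unknown shape: " + shape)
```
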
- the selection of the shape and the setting of the parameters may be manual operations performed by the surgeon or automatic operations performed by the controller 4 .
- the shape and size of the specific region B in cross section can be set according to, for example, the technique, the contents of the procedure, or the preferences of the surgeon.
- For example, the cross section is set as a vertically oriented ellipse as in FIG. 4 C . This can prevent the field of view F from excessively reacting to a longitudinal movement of the tip 2 a and vibrating in the longitudinal direction, thereby keeping the field of view F stationary regardless of longitudinal movements of the tip 2 a during the procedure.
- the controller 4 may recognize the type of the surgical instrument 2 or a procedure and automatically change at least one of the shape of the specific region B, the sizes of the specific region B in the X, Y, and Z directions, and the position of the specific region B according to the type of the surgical instrument 2 or the procedure. Furthermore, the controller 4 may automatically change the first speed and the second speed according to the type of the surgical instrument 2 or the procedure. For example, the controller 4 recognizes the type of the surgical instrument 2 on the basis of the endoscope image D and recognizes the type of the procedure according to the type of the surgical instrument 2 .
- the proper shape, size, and position of the specific region B change according to the type of the surgical instrument 2 or the procedure.
- the shape, size, and position of the specific region B can be automatically set to be suitable for the type of the surgical instrument 2 and the procedure.
- the specific region B having a larger size in the X direction is set to be located at a larger distance from the tip of the endoscope 1 .
- a range of 90 mm to 190 mm from the tip of the endoscope 1 is set as the specific region B.
- the specific region B is set to be located at a shorter distance from the tip of the endoscope 1 in order to perform an elaborate procedure. For example, a range of 60 mm to 90 mm from the tip of the endoscope 1 is set as the specific region B.
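Such a procedure-dependent placement amounts to a lookup from procedure type to a depth range; a sketch using the example values above (the procedure labels and the idea of a dictionary lookup are illustrative):

```python
def region_depth_range_mm(procedure: str) -> tuple:
    """Depth range of the specific region measured from the endoscope tip.

    Uses the example ranges given above: a larger, farther region for
    procedures with wide motion, and a closer region for elaborate work.
    The procedure labels are assumptions for illustration.
    """
    ranges = {
        "wide_motion": (90.0, 190.0),  # larger region, farther from the tip
        "elaborate": (60.0, 90.0),     # closer region for fine manipulation
    }
    return ranges[procedure]
```
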
- the size of the specific region B may be increased in cross section or the second speed may be reduced.
- the controller 4 may learn a movement of the tip 2 a during a procedure and change the shape and size of the specific region B such that the motion range of the tip 2 a is included in the specific region B during the procedure.
- a definite border may be absent between the specific region B and the outer region C.
- the controller 4 may continuously change the following speed according to a distance from the center of the endoscope image D to the tip 2 a .
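A continuously varying speed without a hard region border can be sketched with a simple ramp; the linear profile and clamping are one illustrative choice, not the embodiment's specific law:

```python
def following_speed(dist_from_center: float, v_max: float, outer_dist: float) -> float:
    """Following speed that grows continuously with the tip's distance from
    the image center: zero at the center, rising linearly to v_max at
    outer_dist, and clamped to v_max beyond it."""
    if outer_dist <= 0.0:
        return 0.0
    return v_max * min(dist_from_center / outer_dist, 1.0)
```

With this profile the speed falls off smoothly as the tip approaches the center, which realizes the "no definite border" behavior described above.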
- For example, the controller 4 may calculate an angular velocity Vp of a rotation about the Y axis and an angular velocity Vy of a rotation about the Z axis according to the formulas below, and rotate the robot arm 3 a at the calculated angular velocities Vp and Vy:
- Vp = Gz × pz, Vy = Gy × py
- where py is a distance from the center of the endoscope image D to the tip 2 a in the Y direction, pz is a distance from the center of the endoscope image D to the tip 2 a in the Z direction, and Gy and Gz are predetermined coefficients of proportionality.
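This proportional control (angular velocities proportional to the tip's offsets py and pz, with coefficients Gy and Gz) can be sketched as follows. Pairing the pitch rotation (about the Y axis) with the longitudinal offset pz and the yaw rotation (about the Z axis) with the lateral offset py is an assumption for illustration:

```python
def follow_velocities(py: float, pz: float, gy: float, gz: float):
    """Proportional following control: angular velocities grow linearly with
    the tip's offset from the image center.

    Assumption: a pitch (about Y) shifts the image longitudinally, so Vp is
    driven by pz; a yaw (about Z) shifts it laterally, so Vy is driven by py.
    """
    vp = gz * pz  # pitch rate from the longitudinal (Z-direction) offset
    vy = gy * py  # yaw rate from the lateral (Y-direction) offset
    return vp, vy
```
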
- the endoscope 1 captures a three-dimensional stereo image as the endoscope image D.
- a two-dimensional endoscope image D may be captured.
- the position of the tip 2 a of the surgical instrument 2 in the X direction may be measured by another distance-measuring means, e.g., a distance sensor provided at the tip of the endoscope 1 .
- an object to be followed by the endoscope 1 is the surgical instrument 2 but is not limited thereto.
- the endoscope 1 may follow any object in the endoscope image D during a surgical operation.
- an object may be a lesion, an organ, a blood vessel, a marker, a biomedical material such as gauze, or a medical instrument other than the surgical instrument 2 .
- Reference Signs List: 1 Endoscope; 1 a Imaging portion; 2 Surgical instrument (object); 3 Moving device; 3 a Robot arm; 3 b Joint; 4 Controller; 5 Display device; 10 Medical system; A Optical axis; B Specific region (predetermined three-dimensional region); B 1 Non-following region (specific region); B 2 Following region (specific region); C Outer region; D Endoscope image; F Field of view; α Viewing angle
Description
- The present invention relates to a medical system and a control method, and particularly relates to a medical system having the function of causing an endoscope to follow an object, and a control method thereof. The present application claims priority to U.S. Provisional Application No. 63/076408 filed on Sep. 10, 2020, which is incorporated herein by reference. This application is a continuation of International Application PCT/JP2021/027564, which is hereby incorporated by reference herein in its entirety.
- Conventionally, a system has been proposed to move the field of view of an endoscope in a semiautonomous manner by causing the endoscope to follow an object, e.g., a surgical instrument (for example, see PTL 1).
- To allow a surgeon to follow an object with ease of operation, it is desirable to suppress excessive following motion of an endoscope so that an excessive movement of the field of view is prevented. Specifically, if an endoscope follows all the movements of an object, the field of view may become unstable and cause a surgeon to feel stress. Moreover, the field of view desirably stays stationary during procedures such as blunt dissection and thus a movement of the field of view may interfere with the procedure.
- In PTL 1, a permissible region is set in an image so as to extend around the central region of the image, an endoscope follows a surgical instrument to place the surgical instrument back in the central region when the surgical instrument moves out of the permissible region, and the following is terminated when the surgical instrument moves into the central region. This configuration prevents the endoscope from following the surgical instrument insofar as the surgical instrument stays in the permissible region and the central region, thereby suppressing an excessive movement of the field of view.
- {PTL 1} U.S. Patent Application Publication No. 2002/0156345
- An aspect of the present invention is a medical system including an endoscope that captures an image including an object, a moving device that includes a robot arm and that moves the endoscope in a body, and a controller that controls the moving device on the basis of the position of the object, wherein the controller is configured to control the moving device in a first control mode in which the endoscope is caused to follow the object at a first speed and a second control mode in which the endoscope is caused to follow the object at a second speed lower than the first speed, the controller controls the moving device in the first control mode when the object is located outside a predetermined three-dimensional region set in the field of view of the endoscope, and the controller controls the moving device in the second control mode when the object is located in the predetermined three-dimensional region.
- Another aspect of the present invention is a control method for controlling a movement of an endoscope on the basis of the position of an object, the endoscope capturing an image including the object, the control method including: controlling the movement of the endoscope in a first control mode in which the endoscope is caused to follow the object at a first speed, when the object is located outside a predetermined three-dimensional region set in a field of view of the endoscope; and controlling the movement of the endoscope in a second control mode in which the endoscope is caused to follow the object at a second speed lower than the first speed, when the object is located in the predetermined three-dimensional region.
- FIG. 1 illustrates an appearance of a medical system according to an embodiment of the present invention.
- FIG. 2 is a block diagram of the medical system illustrated in FIG. 1 .
- FIG. 3 illustrates a three-dimensional specific region set in the field of view of an endoscope.
- FIG. 4A is an endoscope image illustrating an example of a cross section of the specific region.
- FIG. 4B is an endoscope image illustrating another example of a cross section of the specific region.
- FIG. 4C is an endoscope image illustrating another example of a cross section of the specific region.
- FIG. 5 is an explanatory drawing of a size on the endoscope image of the specific region at depth positions X1, X2, and X3 in FIG. 3 .
- FIG. 6 is an explanatory drawing of a movement of a surgical instrument followed by the endoscope in the endoscope image.
- FIG. 7A is an explanatory drawing of a specific example of a method for calculating the specific region.
- FIG. 7B is an explanatory drawing of the specific example of the method for calculating the specific region.
- FIG. 7C is an explanatory drawing of the specific example of the method for calculating the specific region.
- FIG. 7D is an explanatory drawing of the specific example of the method for calculating the specific region.
- FIG. 8A is a flowchart of a control method performed by a controller in FIG. 1 .
- FIG. 8B is a flowchart of a modification of the control method performed by the controller in FIG. 1 .
- FIG. 9A is an explanatory drawing of a method for setting the size of the specific region according to the viewing angle of the endoscope.
- FIG. 9B is an explanatory drawing of the method for setting the size of the specific region according to the viewing angle of the endoscope.
- FIG. 10 is an explanatory drawing of a modification of a movement of the surgical instrument followed by the endoscope in the endoscope image.
- FIG. 11 illustrates a three-dimensional specific region in a reference example.
- A medical system and a control method according to an embodiment of the present invention will be described below with reference to the accompanying drawings.
- As illustrated in
FIG. 1 , amedical system 10 according to the present embodiment includes anendoscope 1 and asurgical instrument 2 that are inserted into the body of a patient, a movingdevice 3 that holds theendoscope 1 and moves theendoscope 1 in the body, a controller 4 that is connected to theendoscope 1 and the movingdevice 3 and controls themoving device 3, and adisplay device 5 that displays an endoscope image. - The
endoscope 1 is, for example, a rigid endoscope and includes an imaging portion 1 a that has an image sensor and captures an endoscope image (seeFIG. 2 ). Theendoscope 1 captures an endoscope image D (seeFIGS. 5 and 6 ), which includes atip 2 a of thesurgical instrument 2, through the imaging portion 1 a and transmits the endoscope image D to the controller 4. The imaging portion 1 a is, for example, a three-dimensional camera provided at the tip portion of theendoscope 1 and captures a stereo image, which includes information on the three-dimensional position of thetip 2 a of thesurgical instrument 2, as the endoscope image D. - The
moving device 3 includes arobot arm 3 a having a plurality ofjoints 3 b and holds the proximal portion of theendoscope 1 at the tip portion of therobot arm 3 a. In an example, therobot arm 3 a has three degrees of freedom of movement: a back-and-forth linear motion along the X axis, a rotation (pitch) about the Y axis, and a rotation (yaw) about the Z axis. A rotation (roll) about the X axis is preferably added as a degree of freedom of movement. The X axis is an axis on the same straight line as an optical axis A of theendoscope 1, and the Y and Z axes are axes that are orthogonal to the optical axis A and extend in respective directions corresponding to the lateral direction and the longitudinal direction of the endoscope image D. - As illustrated in
FIG. 2 , the controller 4 includes at least oneprocessor 4 a like a central processing unit, amemory 4 b, astorage unit 4 c, aninput interface 4 d, anoutput interface 4 e, and anetwork interface 4 f. - The endoscope images D transmitted from the
endoscope 1 are sequentially inputted to the controller 4 through theinput interface 4 d, are sequentially outputted to thedisplay device 5 through theoutput interface 4 e, and are displayed on thedisplay device 5. A surgeon operates thesurgical instrument 2 inserted into a body while observing the endoscope image D displayed on thedisplay device 5, and performs an operation on an affected area in the body by using thesurgical instrument 2. - The
storage unit 4 c is a ROM (read-only memory) or a nonvolatile recording medium such as a hard disk and stores a program and data necessary for causing theprocessor 4 a to perform processing. The program is read in thememory 4 b and is executed by theprocessor 4 a, thereby implementing the functions of the controller 4. The functions will be described later. Some of the functions of the controller 4 may be implemented by dedicated logic circuits or the like. - The controller 4 has a manual mode or a follow-up mode. The manual mode is a mode in which an operator, e.g., a surgeon manually operates the
endoscope 1, whereas the follow-up mode is a mode in which the controller 4 causes theendoscope 1 to automatically follow thetip 2 a of the surgical instrument (object) 2. - The controller 4 switches the manual mode and the follow-up mode on the basis of an instruction from the operator. For example, the controller 4 has AI capable of recognizing a human voice. When recognizing a voice of “manual mode,” the controller 4 switches to the manual mode. When recognizing a voice of “follow-up mode,” the controller 4 switches to the follow-up mode. The controller 4 may switch the manual mode and the follow-up mode in response to turn-on or turn-off of a manual operation switch (not illustrated) provided on the
endoscope 1. - In the manual mode, for example, an operator, e.g., a surgeon can remotely operate the
robot arm 3 a by operating an operating device (not illustrated) connected to the controller 4. - In the follow-up mode, the controller 4 controls the moving
device 3 on the basis of the three-dimensional position of thetip 2 a of thesurgical instrument 2, so that theendoscope 1 is caused to three-dimensionally follow thetip 2 a so as to move thetip 2 a toward the center of the endoscope image D and to a predetermined depth of the endoscope image D. Specifically, the controller 4 recognizes thesurgical instrument 2 in the endoscope image D and calculates the three-dimensional position of thetip 2 a by using the endoscope image D. The controller 4 then operates thejoints 3 b such that the optical axis A of theendoscope 1 moves to thetip 2 a in a direction that crosses the optical axis A and the tip of theendoscope 1 moves to a position at a predetermined observation distance from thetip 2 a in the depth direction extending along the optical axis A. - In this case, the follow-up mode includes a first control mode in which the
endoscope 1 is caused to follow thetip 2 a of thesurgical instrument 2 at a first speed and a second control mode in which theendoscope 1 is caused to follow thetip 2 a of thesurgical instrument 2 at a second speed lower than the first speed. As illustrated inFIG. 3 , the controller 4 controls the movingdevice 3 in the first control mode when thetip 2 a is located outside a predetermined specific region B, and the controller 4 controls the movingdevice 3 in the second control mode when thetip 2 a is located in the specific region B. Thus, when thetip 2 a is located in the specific region B, sensitivity for following a movement of thetip 2 a by theendoscope 1 decreases, thereby suppressing excessive following motion of theendoscope 1 with respective to thetip 2 a. - The specific region B is a predetermined three-dimensional region that is set in a field of view F of the
endoscope 1 and has dimensions in the X direction, the Y direction, and the Z direction that are orthogonal to one another. The X direction is a depth direction parallel to the optical axis A of theendoscope 1. The Y direction and the Z direction are directions that are orthogonal to the optical axis A and are parallel respectively to the lateral direction and the longitudinal direction of the endoscope image D. - The specific region B is separated from the tip of the
endoscope 1 in the X direction and is set in a part of the range of the field of view F in the X direction. Moreover, the specific region B includes the optical axis A and has a three-dimensional shape that decreases in size in cross section toward the tip of theendoscope 1. Hence, the specific region B on the endoscope image D is a central region including the center of the endoscope image D. As illustrated inFIGS. 4A to 4C , the specific region B orthogonal to the optical axis A may be rectangular, circular, or oval in cross section and may have any other shapes such as a polygon. The specific region B may be superimposed on the endoscope image D or may be hidden. - In an example, the specific region B and the endoscope image D are identical in shape in cross section. For example, when the endoscope image D is rectangular, the specific region B is also rectangular in cross section. The specific region B displayed on the endoscope image D may interfere with an observation of the endoscope image D and thus is preferably hidden. When the specific region B and the endoscope image D are identical in shape, the surgeon easily recognizes the position of the hidden specific region B.
- The field of view F of the
endoscope 1 is typically shaped like a cone having the vertex at or near the tip of theendoscope 1. The specific region B is preferably a frustum with the vertex shared with the field of view F of theendoscope 1. As illustrated inFIG. 5 , the size and position of the displayed specific region B are fixed on the endoscope image D regardless of positions X1, X2, and X3 in the X direction. - The size of the specific region B on the endoscope image D (that is, the size of the specific region B relative to the field of view F in cross section) is preferably 25% to 55% of the size of the endoscope image D. In the case of the specific region B like a frustum, a vertex angle β of the specific region B is preferably 25% to 55% of a viewing angle α of the
endoscope 1. This configuration can place thetip 2 a of thesurgical instrument 2 at the center of the endoscope image D and suppress excessive following motion of theendoscope 1 with respective to thetip 2 a. - When the size of the specific region B is less than 25% of the size of the endoscope image D, the effect of suppressing excessive following motion of the
endoscope 1 relative to a movement of thetip 2 a may become insufficient, leading to frequent movements of the field of view F. When the size of the specific region B is larger than 55% of the size of the endoscope image D, thetip 2 a is frequently disposed at a position remote from the center of the endoscope image D, leading to difficulty in placing thetip 2 a at the center. - As illustrated in
FIGS. 3 to 5 , the specific region B includes a non-following region B1 and a following region B2. The non-following region B1 is a central region of the specific region B including the optical axis A. The following region B2 is an outer region of the specific region B and surrounds the non-following region B1. Like the specific region B, the non-following region B1 has a three-dimensional shape that decreases in size in cross section toward the tip of theendoscope 1. The non-following region B1 is preferably a frustum. - As illustrated in
FIG. 6 , when thetip 2 a of thesurgical instrument 2 is disposed outside the following region B2, the controller 4 rotates, for example, therobot arm 3 a about the Y axis and the Z axis so as to cause theendoscope 1 to follow thetip 2 a at a first speed V1. - When the
tip 2 a is disposed in the non-following region B1, the controller 4 keeps the position of theendoscope 1 without causing theendoscope 1 to follow thetip 2 a. Specifically, the controller 4 controls the angular velocities of thejoints 3 b to zero. Thus, the second speed in the non-following region B1 is zero. - When the
tip 2 a is disposed in the following region B2, the controller 4 continues the operation of theendoscope 1 in the previous control cycle. Specifically, when the position of theendoscope 1 is kept in the previous control cycle, the controller 4 keeps the position of theendoscope 1 also in the current control cycle. When theendoscope 1 is caused to follow thetip 2 a in the previous control cycle, the controller 4 causes theendoscope 1 to follow thetip 2 a also in the current control cycle. At this point, the speed of following is a second speed V2 higher than zero. - In the foregoing control, the following region B2 acts as a trigger for starting following of the
tip 2 a by theendoscope 1, and the non-following region B1 acts as a trigger for terminating following of thetip 2 a by theendoscope 1. Specifically, when thetip 2 a moves from the following region B2 to an outer region C, theendoscope 1 starts following thetip 2 a. When thetip 2 a enters the non-following region B1 from the outer region C through the following region B2, following of thetip 2 a by theendoscope 1 is terminated. - The first speed V1 and the second speed V2 are each kept constant, whereas the following speed of the
endoscope 1 may be changed in two steps. - Alternatively, the first speed V1 and the second speed V2 may change according to a distance from the center of the endoscope image D to the
tip 2 a. For example, the controller 4 may calculate distances from the optical axis A of theendoscope 1 to thetip 2 a in the Y direction and the Z direction and increase the speeds V1 and V2 according to the distances. In this case, the following speeds V1 and V2 of theendoscope 1 may continuously decrease from the outer region C to the non-following region B1. -
FIGS. 7A to 7D are explanatory drawings of a method for calculating the specific region B. - As illustrated in
FIG. 7A , the controller 4 sets, as a fiducial point E, an intersection point of the optical axis A and a YZ plane P that passes through thetip 2 a of thesurgical instrument 2 and is perpendicular to the optical axis A. The controller 4 then defines, as the specific region B, a region like a rectangular solid or a sphere that is centered around the fiducial point E. -
FIGS. 7B and 7C are explanatory drawings of a method for calculating an actual size [mm] of the specific region B. - A size Lmax_dz[mm] of the endoscope image D (the size of the field of view F in the Z direction) at an observation distance di (i = 1, 2, ... ) in the Z direction (longitudinal direction) is expressed by the following formula according to the geometric relationship of
FIG. 7C . -
- where α[deg] is the viewing angle (half angle of view) of the
endoscope 1. A pixel size Lmax_dz_pixel [px] of the endoscope image D in the Z direction is known and is expressed by, for example, the following formula: -
- Thus, an actual size L_dz[mm] of the specific region B in the Z direction is calculated from the following formula by using a pixel size[dx] of the specific region B in the Z direction.
-
- An actual size L_dy[mm] of the specific region B in the Y direction is also calculated by the same method as L_dz.
- An actual size L_dx of the specific region B in the X direction is also set. For example, the actual size L_dx may be set at a fixed value regardless of the observation distance di. Alternatively, as illustrated in
FIG. 7D , an actual size L_dx at a reference observation distance di (e.g., d1) may be preset and L_dx at another observation distance di (e.g., d2) may be set at a value proportionate to a change of the observation distance. - The operation of the
medical system 10 will be described below. - A surgeon performs a procedure by operating the
surgical instrument 2 inserted into a body while observing the endoscope image D displayed on thedisplay device 5. During the procedure, the surgeon switches from the manual mode to the follow-up mode or from the follow-up mode to the manual mode in response to, for example, a voice. - As indicated in
FIG. 8A , when switching to the follow-up mode in step S1, the controller 4 performs the control method of steps S2 to S8 and controls the movingdevice 3 in the follow-up mode. - The control method includes step S2 of determining whether the
tip 2 a of thesurgical instrument 2 is located in the specific region B and steps S3 to S8 of causing, when the position of thetip 2 a is located outside the specific region B, theendoscope 1 to follow thesurgical instrument 2 until thetip 2 a of thesurgical instrument 2 reaches the non-following region B1. - After the start of the follow-up mode (YES at step S1), the controller 4 calculates the three-dimensional position of the
tip 2 a by using the endoscope image D, which is a stereo image, and determines whether thetip 2 a is located in the predetermined specific region B (step S2). When thetip 2 a is located in the specific region B (YES at step S2), the controller 4 keeps the position of theendoscope 1 without causing theendoscope 1 to follow thesurgical instrument 2. When thetip 2 a is located outside the specific region B (NO at step S2), the controller 4 starts following of thesurgical instrument 2 by the endoscope 1 (step S3). - In following of the
surgical instrument 2, the controller 4 selects one of the first control mode and the second control mode on the basis of the position of the tip 2 a. As illustrated in FIG. 6, the tip 2 a is located outside the specific region B at the start of following (NO at step S4), and thus the controller 4 controls the moving device 3 in the first control mode, so that the endoscope 1 is caused to follow the tip 2 a of the surgical instrument 2 at the first speed V1 so as to move the tip 2 a of the surgical instrument 2 toward the center of the endoscope image D (step S5). The controller 4 controls the moving device 3 in the first control mode until the tip 2 a enters the specific region B. - After the
tip 2 a enters the specific region B (YES at step S4), the controller 4 then controls the moving device 3 in the second control mode, so that the endoscope 1 is caused to follow the tip 2 a of the surgical instrument 2 at the second speed V2 so as to move the tip 2 a of the surgical instrument 2 toward the center of the endoscope image D. Since the second speed V2 is lower than the first speed V1, the responsiveness of the endoscope 1 in following a movement of the tip 2 a decreases. In other words, after the tip 2 a returns to the specific region B from the outer region C, the endoscope 1 is prevented from excessively following a movement of the surgical instrument 2. The controller 4 controls the moving device 3 in the second control mode until the tip 2 a enters the non-following region B1. - When the
tip 2 a of the surgical instrument 2 enters the non-following region B1 (YES at step S6), the controller 4 causes the endoscope 1 to finish following the surgical instrument 2 (step S8). - While the follow-up mode continues (NO at step S9), the controller 4 repeats steps S1 to S8.
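The loop of steps S1 to S8 above can be sketched as a small state machine. The region test, command strings, and function names below are illustrative assumptions, not part of the specification:

```python
from enum import Enum

class Region(Enum):
    """Where the tip 2 a lies relative to the specific region B."""
    NON_FOLLOWING = "B1"  # central non-following region
    FOLLOWING = "B2"      # rest of the specific region B
    OUTER = "C"           # outside the specific region B

def follow_step(region: Region, following: bool) -> tuple[str, bool]:
    """One control cycle of the follow-up mode (steps S2 to S8).

    Returns an assumed command string for the moving device 3 and the
    updated 'following' flag. V1 > V2, so the second mode is the less
    responsive one.
    """
    if not following:
        if region is Region.OUTER:       # NO at S2: start following (S3)
            return "follow_at_V1", True  # first control mode (S5)
        return "hold", False             # tip inside B: keep position
    if region is Region.OUTER:           # NO at S4
        return "follow_at_V1", True      # first mode until the tip enters B
    if region is Region.FOLLOWING:       # YES at S4, NO at S6
        return "follow_at_V2", True      # second control mode (S7)
    return "stop", False                 # tip reached B1: finish following (S8)
```

Driving `follow_step` repeatedly with the measured region reproduces the sequence described above: first mode outside B, second mode inside B, and a stop once the tip 2 a reaches the non-following region B1.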
- To allow a surgeon to follow an object through the
endoscope 1 with ease of operation, it is desirable to cause the endoscope 1 to follow the surgical instrument 2 so as to satisfy three conditions: suppressing excessive following motion, bringing the tip 2 a of the surgical instrument 2 to the center of the endoscope image D, and keeping the tip 2 a of the surgical instrument 2 at a proper distance in the X direction. - According to the present embodiment, the specific region B is a three-dimensional region set in the field of view F and thus can be properly designed. For example, a distance between the tip of the
endoscope 1 and the specific region B in the X direction and a size of the specific region B in cross section at each position in the X direction are designed to satisfy the three conditions. This allows the endoscope 1 to follow the surgical instrument 2 with ease of operation. - Moreover, the specific region B is shaped to decrease in size in cross section toward the tip of the
endoscope 1, thereby suppressing a difference in the size of the displayed specific region B on the endoscope image D between positions in the X direction. The specific region B is preferably displayed with a fixed size regardless of the position in the X direction. This can suppress excessive following motion of the endoscope 1 and place the tip 2 a at the center regardless of the position of the tip 2 a in the X direction. -
FIG. 11 illustrates a specific region B′ as a reference example. As illustrated in FIG. 11, when the specific region B′ is formed by simply extending a two-dimensional region on the image plane of the endoscope image D, the specific region B′ extends from the tip of the endoscope 1 in the X direction. Thus, the endoscope 1 cannot be caused to follow the surgical instrument 2 such that the tip 2 a is brought to a proper distance in the X direction. - Moreover, the size of the specific region B′ is fixed in cross section and thus the displayed specific region B′ on the endoscope image D changes in size according to the position in the X direction. This leads to difficulty in suppressing excessive following motion of the
endoscope 1 with respect to the surgical instrument 2 while placing the tip 2 a at the center. Specifically, the displayed specific region B′ decreases in size at a position X3 remote from the tip of the endoscope 1 in the X direction, so that excessive following motion of the endoscope 1 cannot be suppressed, though the tip 2 a can be placed at the center. The size of the displayed specific region B′ increases at a position X1 close to the tip of the endoscope 1 in the X direction. This can suppress excessive following motion of the endoscope 1 but leads to difficulty in placing the tip 2 a at the center. - In the foregoing embodiment, when the
tip 2 a is located in the following region B2, the controller 4 continues the operation of the endoscope 1 in the previous control cycle. Alternatively, the endoscope 1 may be caused to always follow the tip 2 a at the second speed V2 greater than zero. In other words, the controller 4 may control the moving device 3 in the second control mode in either of the following cases: where the tip 2 a enters the following region B2 from the outer region C, and where the tip 2 a enters the following region B2 from the non-following region B1. - In this case, as indicated in
FIG. 8B, the controller 4 determines whether the tip 2 a is located in the non-following region B1 (step S2′). When the tip 2 a moves from the non-following region B1 into the following region B2 (NO at step S2′), the controller 4 starts following the surgical instrument 2 by the endoscope 1 (step S3). - The
tip 2 a is located in the following region B2 at the start of following (YES at step S4 and NO at step S6), and thus the controller 4 controls the moving device 3 in the second control mode, so that the endoscope 1 is caused to follow the tip 2 a of the surgical instrument 2 at the second speed V2 so as to move the tip 2 a of the surgical instrument 2 toward the center of the endoscope image D (step S7). The controller 4 controls the moving device 3 in the second control mode until the tip 2 a enters the non-following region B1. As described above, since the second speed V2 is lower than the first speed V1, the endoscope 1 is prevented from excessively following a movement of the surgical instrument 2 while the tip 2 a moves in the following region B2. - When the
tip 2 a moves out of the specific region B while being followed by the endoscope 1 in the second control mode (NO at step S4), the controller 4 switches from the second control mode to the first control mode (step S5) and controls the moving device 3 in the first control mode until the tip 2 a returns to the specific region B. - In the foregoing embodiment, as illustrated in
FIGS. 9A and 9B, the controller 4 may change the size of the specific region B in cross section according to the viewing angle α of the endoscope 1. - For example, in the
storage unit 4 c, a value of the viewing angle α is stored for each type of the endoscope 1. The controller 4 recognizes the type of the endoscope 1 held by the robot arm 3 a, reads the value of the viewing angle α of the recognized type from the storage unit 4 c, and sets the vertex angle β of the specific region B to a predetermined proportion of the viewing angle α. For example, the vertex angle β is calculated by multiplying the value of the viewing angle α by a predetermined ratio k selected from the range of 25% to 55%. Thus, the specific region B increases in size in cross section in proportion to the viewing angle α. - With this configuration, the area ratio of the cross section of the specific region B to the cross section of the field of view F is kept constant regardless of a difference in the viewing angle α of the used
endoscope 1. Hence, the displayed specific region B on the endoscope image D displayed on the display device 5 can have the same size regardless of the viewing angle α of the endoscope 1. - In the foregoing embodiment, the specific region B includes the non-following region B1 where the
endoscope 1 is not caused to follow the surgical instrument 2. Alternatively, as illustrated in FIG. 10, the non-following region B1 may be absent from the specific region B. In this modification, the controller 4 causes the endoscope 1 to follow the surgical instrument 2 at the second speed V2 until the tip 2 a of the surgical instrument 2 is located at the center of the endoscope image D. When the tip 2 a is located at the center of the endoscope image D, the controller 4 causes the endoscope 1 to finish following the surgical instrument 2. - In the case of
FIG. 6, the endoscope 1 finishes following the tip 2 a when the tip 2 a reaches the end of the following region B2 remote from the center of the endoscope image D. In contrast, in FIG. 10, the endoscope 1 is caused to follow the tip 2 a until the tip 2 a reaches the center of the endoscope image D. Thus, a procedure can be performed with the tip 2 a located at the center of the endoscope image D. - In the modification of
FIG. 10, the second speed V2 is preferably 50% or less of the first speed V1. The second speed V2 may remain constant or gradually decrease as the tip 2 a of the surgical instrument 2 moves closer to the center of the endoscope image D. If the second speed V2 is higher than 50% of the first speed V1, it is difficult to sufficiently obtain the effect of suppressing excessive following motion of the endoscope 1. - In the foregoing embodiment, the shape of the specific region B may be changed in cross section. For example, the cross-sectional shape can be selected from a rectangle, a circle, and an ellipse in
FIGS. 4A to 4C, and parameters dy, dz, R, a, and b can be set to determine the size of each shape in cross section. The selection of the shape and the setting of the parameters may be manual operations performed by the surgeon or automatic operations performed by the controller 4. - With this configuration, the shape and size of the specific region B in cross section can be set according to, for example, the technique, the contents of the procedure, or the preferences of the surgeon.
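A minimal sketch of such a configurable cross section follows. The parameter letters dy, dz, R, a, and b are taken from the text; the class structure and containment tests are assumptions:

```python
from dataclasses import dataclass

# Selectable cross-sectional shape of the specific region B on the Y-Z plane.
# Each shape reports whether an offset (y, z) from the optical axis lies inside it.

@dataclass
class Rectangle:
    dy: float  # full size in the Y direction
    dz: float  # full size in the Z direction

    def contains(self, y: float, z: float) -> bool:
        return abs(y) <= self.dy / 2 and abs(z) <= self.dz / 2

@dataclass
class Circle:
    r: float  # radius R

    def contains(self, y: float, z: float) -> bool:
        return y * y + z * z <= self.r * self.r

@dataclass
class Ellipse:
    a: float  # semi-axis in the Y direction
    b: float  # semi-axis in the Z direction (b > a gives a vertically oriented ellipse)

    def contains(self, y: float, z: float) -> bool:
        return (y / self.a) ** 2 + (z / self.b) ** 2 <= 1.0
```

A vertically oriented ellipse such as `Ellipse(a=10.0, b=25.0)` tolerates large longitudinal (Z) excursions of the tip before following is triggered, matching the example discussed next.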
- In an example, in a procedure where the
tip 2 a frequently makes large movements in the longitudinal direction of the endoscope image D, the cross section is set as a vertically oriented ellipse in FIG. 4C. This can prevent the field of view F from excessively reacting to a longitudinal movement of the tip 2 a and vibrating in the longitudinal direction, thereby keeping the field of view F stationary despite longitudinal movements of the tip 2 a during the procedure. - The controller 4 may recognize the type of the
surgical instrument 2 or a procedure and automatically change at least one of the shape of the specific region B, the sizes of the specific region B in the X, Y, and Z directions, and the position of the specific region B according to the type of the surgical instrument 2 or the procedure. Furthermore, the controller 4 may automatically change the first speed and the second speed according to the type of the surgical instrument 2 or the procedure. For example, the controller 4 recognizes the type of the surgical instrument 2 on the basis of the endoscope image D and recognizes the type of the procedure according to the type of the surgical instrument 2. - The proper shape, size, and position of the specific region B change according to the type of the
surgical instrument 2 or the procedure. With this configuration, the shape, size, and position of the specific region B can be automatically set to be suitable for the type of the surgical instrument 2 and the procedure. - In an example, when the type of the
surgical instrument 2 is gripping forceps, the specific region B is set with a larger size in the X direction and at a larger distance from the tip of the endoscope 1. For example, a range of 90 mm to 190 mm from the tip of the endoscope 1 is set as the specific region B. - In another example, when the type of the
surgical instrument 2 is an energy treatment tool, the specific region B is set at a shorter distance from the tip of the endoscope 1 in order to perform an elaborate procedure. For example, a range of 60 mm to 90 mm from the tip of the endoscope 1 is set as the specific region B. Moreover, in order to prevent a movement of the field of view F during a blunt dissection operation, the size of the specific region B may be increased in cross section or the second speed may be reduced. - In still another example, the controller 4 may learn a movement of the
tip 2 a during a procedure and change the shape and size of the specific region B such that the motion range of the tip 2 a is included in the specific region B during the procedure. - In the foregoing embodiment, a definite border may be absent between the specific region B and the outer region C. In other words, the controller 4 may continuously change the following speed according to a distance from the center of the endoscope image D to the
tip 2 a. - For example, the controller 4 may calculate an angular velocity Vp of a rotation about the Y axis and an angular velocity Vy of a rotation about the Z axis according to the formulas below, and rotate the
robot arm 3 a at the calculated angular velocities Vp and Vy. py is a distance from the center of the endoscope image D to the tip 2 a in the Y direction, pz is a distance from the center of the endoscope image D to the tip 2 a in the Z direction, and Gy and Gz are predetermined coefficients of proportionality. - Vp = Gy × py
- Vy = Gz × pz
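Assuming simple proportional control, with the coefficient-to-axis pairing taken in the order the text lists py, pz, Gy, and Gz (an assumption, since the published formulas are not reproduced here), one control cycle might look like:

```python
def angular_velocities(py: float, pz: float, gy: float, gz: float) -> tuple[float, float]:
    """Continuous follow control with no definite border between B and C.

    py, pz: distances from the center of the endoscope image D to the
    tip 2 a in the Y and Z directions; gy, gz: predetermined coefficients
    of proportionality. Returns (Vp, Vy), the angular velocities about the
    Y and Z axes; both shrink smoothly to zero as the tip nears the center.
    """
    v_p = gy * py  # rotation about the Y axis, proportional to the Y offset
    v_y = gz * pz  # rotation about the Z axis, proportional to the Z offset
    return v_p, v_y
```

Because the commanded velocities scale with the offset, the robot arm 3 a slows continuously as the tip 2 a approaches the image center, with no discrete mode switch.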
- In the foregoing embodiment, the
endoscope 1 captures a three-dimensional stereo image as the endoscope image D. Alternatively, a two-dimensional endoscope image D may be captured. In this case, for example, the position of the tip 2 a of the surgical instrument 2 in the X direction may be measured by another distance-measuring means, e.g., a distance sensor provided at the tip of the endoscope 1. - In the foregoing embodiment, an object to be followed by the
endoscope 1 is the surgical instrument 2 but is not limited thereto. The endoscope 1 may follow any object in the endoscope image D during a surgical operation. For example, an object may be a lesion, an organ, a blood vessel, a marker, a biomedical material such as gauze, or a medical instrument other than the surgical instrument 2.
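The instrument-dependent settings described earlier (gripping forceps: 90 mm to 190 mm from the endoscope tip; energy treatment tool: 60 mm to 90 mm) can be sketched as a lookup table. The numeric ranges come from the examples in the text; the dictionary keys and function name are assumed:

```python
# X-direction range of the specific region B, in mm from the endoscope tip,
# per recognized instrument type (illustrative keys, values from the text).
REGION_RANGE_MM: dict[str, tuple[float, float]] = {
    "gripping_forceps": (90.0, 190.0),      # larger region, farther from the tip
    "energy_treatment_tool": (60.0, 90.0),  # closer to the tip for elaborate work
}

def region_range_for(instrument_type: str) -> tuple[float, float]:
    """Look up the (near, far) X-direction bounds of the specific region B."""
    return REGION_RANGE_MM[instrument_type]
```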
REFERENCE SIGNS LIST
- 1 Endoscope
- 1 a Imaging portion
- 2 Surgical instrument (object)
- 3 Moving device
- 3 a Robot arm
- 3 b Joint
- 4 Controller
- 5 Display device
- 10 Medical system
- A Optical axis
- B Specific region (predetermined three-dimensional region)
- B1 Non-following region (specific region)
- B2 Following region (specific region)
- C Outer region
- D Endoscope image
- F Field of view
- α Viewing angle
Claims (9)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US18/105,291 US20230180995A1 (en) | 2020-09-10 | 2023-02-03 | Medical system and control method |
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202063076408P | 2020-09-10 | 2020-09-10 | |
| PCT/JP2021/027564 WO2022054428A1 (en) | 2020-09-10 | 2021-07-26 | Medical system and control method |
| US18/105,291 US20230180995A1 (en) | 2020-09-10 | 2023-02-03 | Medical system and control method |
Related Parent Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2021/027564 Continuation WO2022054428A1 (en) | 2020-09-10 | 2021-07-26 | Medical system and control method |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20230180995A1 true US20230180995A1 (en) | 2023-06-15 |
Family
ID=80629721
Family Applications (4)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US18/105,305 Pending US20230172675A1 (en) | 2020-09-10 | 2023-02-03 | Controller, endoscope system, and control method |
| US18/105,314 Pending US20230180996A1 (en) | 2020-09-10 | 2023-02-03 | Controller, endoscope system, control method, and control program |
| US18/105,300 Pending US20230180998A1 (en) | 2020-09-10 | 2023-02-03 | Endoscope system, controller, control method, and recording medium |
| US18/105,291 Abandoned US20230180995A1 (en) | 2020-09-10 | 2023-02-03 | Medical system and control method |
Family Applications Before (3)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US18/105,305 Pending US20230172675A1 (en) | 2020-09-10 | 2023-02-03 | Controller, endoscope system, and control method |
| US18/105,314 Pending US20230180996A1 (en) | 2020-09-10 | 2023-02-03 | Controller, endoscope system, control method, and control program |
| US18/105,300 Pending US20230180998A1 (en) | 2020-09-10 | 2023-02-03 | Endoscope system, controller, control method, and recording medium |
Country Status (4)
| Country | Link |
|---|---|
| US (4) | US20230172675A1 (en) |
| JP (3) | JP7535587B2 (en) |
| CN (3) | CN116171122A (en) |
| WO (4) | WO2022054428A1 (en) |
Families Citing this family (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9204939B2 (en) * | 2011-08-21 | 2015-12-08 | M.S.T. Medical Surgery Technologies Ltd. | Device and method for assisting laparoscopic surgery—rule based approach |
| EP3337419B1 (en) * | 2015-08-19 | 2020-08-12 | Brainlab AG | Reference array holder |
| TWI782409B (en) * | 2020-03-09 | 2022-11-01 | 陳階曉 | Endoscopic image correction system and method thereof |
| US20230255442A1 (en) * | 2022-02-11 | 2023-08-17 | Canon U.S.A., Inc. | Continuum robot apparatuses, methods, and storage mediums |
| WO2023195326A1 (en) * | 2022-04-05 | 2023-10-12 | オリンパス株式会社 | Endoscope system, procedure supporting method, and procedure supporting program |
| WO2024009901A1 (en) * | 2022-07-08 | 2024-01-11 | オリンパス株式会社 | Endoscope system, control method, and control program |
| WO2024157360A1 (en) * | 2023-01-24 | 2024-08-02 | 国立研究開発法人国立がん研究センター | Treatment instrument detection device for endoscopic images, treatment instrument detection method for endoscopic images, and treatment instrument detection device program for endoscopic images |
| US20240349985A1 (en) * | 2023-04-24 | 2024-10-24 | Karl Storz Se & Co. Kg | Corrective adjustment of image parameters using artificial intelligence |
| CN118319430A (en) * | 2023-12-29 | 2024-07-12 | 北京智愈医疗科技有限公司 | Monitoring device of water sword motion trail based on endoscope |
| WO2025163471A1 (en) * | 2024-01-29 | 2025-08-07 | Covidien Lp | Hysteroscopic surgical systems for use with surgical robotic systems and surgical robotic systems incorporating the same |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20190000585A1 (en) * | 2016-01-25 | 2019-01-03 | Sony Corporation | Medical safety control apparatus, medical safety control method, and medical support system |
| US20190365499A1 (en) * | 2017-02-28 | 2019-12-05 | Sony Corporation | Medical arm system, control device, and control method |
Family Cites Families (45)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2797830B2 (en) * | 1992-03-31 | 1998-09-17 | 日本ビクター株式会社 | Object Tracking Method for Video Camera |
| JP3348933B2 (en) * | 1993-03-19 | 2002-11-20 | オリンパス光学工業株式会社 | Electronic endoscope device |
| JP2833425B2 (en) * | 1993-06-30 | 1998-12-09 | 日本ビクター株式会社 | Object tracking device for video camera |
| JP3419869B2 (en) * | 1993-12-28 | 2003-06-23 | オリンパス光学工業株式会社 | Medical equipment |
| JPH0938030A (en) * | 1995-07-28 | 1997-02-10 | Shimadzu Corp | Endoscope device |
| JPH09266882A (en) * | 1996-04-02 | 1997-10-14 | Olympus Optical Co Ltd | Endoscope device |
| US7037258B2 (en) | 1999-09-24 | 2006-05-02 | Karl Storz Imaging, Inc. | Image orientation for endoscopic video displays |
| JP2001112704A (en) * | 1999-10-20 | 2001-04-24 | Olympus Optical Co Ltd | Endoscope system |
| JP2003088532A (en) * | 2001-09-19 | 2003-03-25 | Olympus Optical Co Ltd | Operation instrument |
| JP4331541B2 (en) | 2003-08-06 | 2009-09-16 | オリンパス株式会社 | Endoscope device |
| US20050123179A1 (en) * | 2003-12-05 | 2005-06-09 | Eastman Kodak Company | Method and system for automatic axial rotation correction in vivo images |
| US7654997B2 (en) * | 2004-04-21 | 2010-02-02 | Acclarent, Inc. | Devices, systems and methods for diagnosing and treating sinusitus and other disorders of the ears, nose and/or throat |
| JP4377745B2 (en) * | 2004-05-14 | 2009-12-02 | オリンパス株式会社 | Electronic endoscope |
| JP4699040B2 (en) * | 2005-02-15 | 2011-06-08 | パナソニック株式会社 | Automatic tracking control device, automatic tracking control method, program, and automatic tracking system |
| JP4785127B2 (en) * | 2005-12-08 | 2011-10-05 | 学校法人早稲田大学 | Endoscopic visual field expansion system, endoscopic visual field expansion device, and endoscope visual field expansion program |
| JP4980625B2 (en) * | 2006-02-21 | 2012-07-18 | 富士フイルム株式会社 | Body cavity observation device |
| US7841980B2 (en) * | 2006-05-11 | 2010-11-30 | Olympus Medical Systems Corp. | Treatment system, trocar, treatment method and calibration method |
| JP5030639B2 (en) * | 2007-03-29 | 2012-09-19 | オリンパスメディカルシステムズ株式会社 | Endoscope device treatment instrument position control device |
| US8083669B2 (en) * | 2007-06-22 | 2011-12-27 | Olympus Medical Systems Corp. | Medical device for maintaining state of treatment portion |
| JP5192898B2 (en) * | 2008-04-25 | 2013-05-08 | オリンパスメディカルシステムズ株式会社 | Manipulator system |
| WO2012078989A1 (en) * | 2010-12-10 | 2012-06-14 | Wayne State University | Intelligent autonomous camera control for robotics with medical, military, and space applications |
| JP6021369B2 (en) * | 2012-03-21 | 2016-11-09 | Hoya株式会社 | Endoscope system |
| TWI517828B (en) * | 2012-06-27 | 2016-01-21 | 國立交通大學 | Image tracking system and image tracking method thereof |
| JP6218634B2 (en) * | 2014-02-20 | 2017-10-25 | オリンパス株式会社 | ENDOSCOPE SYSTEM AND ENDOSCOPE OPERATING METHOD |
| EP3125806B1 (en) * | 2014-03-28 | 2023-06-14 | Intuitive Surgical Operations, Inc. | Quantitative three-dimensional imaging of surgical scenes |
| CN106456267B (en) * | 2014-03-28 | 2020-04-03 | 直观外科手术操作公司 | Quantitative 3D visualization of instruments in the field of view |
| JP6177488B2 (en) * | 2015-07-23 | 2017-08-09 | オリンパス株式会社 | Manipulator and medical system |
| WO2017082047A1 (en) * | 2015-11-13 | 2017-05-18 | オリンパス株式会社 | Endoscope system |
| JP6150968B1 (en) * | 2016-02-10 | 2017-06-21 | オリンパス株式会社 | Endoscope system |
| CN107456278B (en) * | 2016-06-06 | 2021-03-05 | 北京理工大学 | Endoscopic surgery navigation method and system |
| JP2019165270A (en) * | 2016-08-03 | 2019-09-26 | シャープ株式会社 | Video image output system, video image output method, and control apparatus |
| WO2018051565A1 (en) * | 2016-09-15 | 2018-03-22 | オリンパス株式会社 | Ultrasonic endoscope and ultrasonic endoscope system |
| EP3603562B1 (en) * | 2017-03-28 | 2022-06-29 | Sony Olympus Medical Solutions Inc. | Medical observation apparatus and observation field correction method |
| WO2018235255A1 (en) * | 2017-06-23 | 2018-12-27 | オリンパス株式会社 | Medical system and its operating method |
| WO2019035206A1 (en) * | 2017-08-18 | 2019-02-21 | オリンパス株式会社 | Medical system and image generation method |
| US12262866B2 (en) * | 2017-09-22 | 2025-04-01 | Carl Zeiss Meditec Ag | Visualization system comprising an observation apparatus and an endoscope |
| DE102017219621B4 (en) * | 2017-09-22 | 2025-11-13 | Carl Zeiss Meditec Ag | Visualization system with an observation device and an endoscope |
| WO2019116592A1 (en) * | 2017-12-14 | 2019-06-20 | オリンパス株式会社 | Device for adjusting display image of endoscope, and surgery system |
| JP7151109B2 (en) * | 2018-03-19 | 2022-10-12 | ソニーグループ株式会社 | Medical imaging device and medical observation system |
| WO2020070883A1 (en) * | 2018-10-05 | 2020-04-09 | オリンパス株式会社 | Endoscopic system |
| JP7596269B2 (en) * | 2019-02-21 | 2024-12-09 | シアター・インコーポレイテッド | SYSTEMS AND METHODS FOR ANALYSIS OF SURGICAL VIDEOS - Patent application |
| JP2020151044A (en) * | 2019-03-18 | 2020-09-24 | ソニー・オリンパスメディカルソリューションズ株式会社 | Medical observation device |
| JP7480477B2 (en) * | 2019-07-10 | 2024-05-10 | ソニーグループ株式会社 | Medical observation system, control device and control method |
| IL290896B2 (en) * | 2019-08-30 | 2025-07-01 | Brainlab Ag | Image based motion control correction |
| JP2021040987A (en) * | 2019-09-12 | 2021-03-18 | ソニー株式会社 | Medical support arm and medical system |
- 2021
- 2021-07-26 CN CN202180053633.7A patent/CN116171122A/en active Pending
- 2021-07-26 JP JP2022547429A patent/JP7535587B2/en active Active
- 2021-07-26 WO PCT/JP2021/027564 patent/WO2022054428A1/en not_active Ceased
- 2021-09-09 WO PCT/JP2021/033205 patent/WO2022054882A1/en not_active Ceased
- 2021-09-09 CN CN202180053602.1A patent/CN116018538A/en active Pending
- 2021-09-09 JP JP2022547657A patent/JP7522840B2/en active Active
- 2021-09-09 CN CN202180053634.1A patent/CN115996662B/en active Active
- 2021-09-09 WO PCT/JP2021/033209 patent/WO2022054883A1/en not_active Ceased
- 2021-09-09 WO PCT/JP2021/033210 patent/WO2022054884A1/en not_active Ceased
- 2021-09-09 JP JP2022547659A patent/JP7534423B2/en active Active
- 2023
- 2023-02-03 US US18/105,305 patent/US20230172675A1/en active Pending
- 2023-02-03 US US18/105,314 patent/US20230180996A1/en active Pending
- 2023-02-03 US US18/105,300 patent/US20230180998A1/en active Pending
- 2023-02-03 US US18/105,291 patent/US20230180995A1/en not_active Abandoned
Patent Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20190000585A1 (en) * | 2016-01-25 | 2019-01-03 | Sony Corporation | Medical safety control apparatus, medical safety control method, and medical support system |
| US20190365499A1 (en) * | 2017-02-28 | 2019-12-05 | Sony Corporation | Medical arm system, control device, and control method |
Also Published As
| Publication number | Publication date |
|---|---|
| CN116018538A (en) | 2023-04-25 |
| US20230180996A1 (en) | 2023-06-15 |
| CN115996662B (en) | 2025-11-18 |
| JP7522840B2 (en) | 2024-07-25 |
| JPWO2022054884A1 (en) | 2022-03-17 |
| US20230172675A1 (en) | 2023-06-08 |
| WO2022054883A1 (en) | 2022-03-17 |
| JP7534423B2 (en) | 2024-08-14 |
| WO2022054882A1 (en) | 2022-03-17 |
| US20230180998A1 (en) | 2023-06-15 |
| WO2022054428A1 (en) | 2022-03-17 |
| JPWO2022054428A1 (en) | 2022-03-17 |
| JP7535587B2 (en) | 2024-08-16 |
| CN116171122A (en) | 2023-05-26 |
| WO2022054884A1 (en) | 2022-03-17 |
| CN115996662A (en) | 2023-04-21 |
| JPWO2022054882A1 (en) | 2022-03-17 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20230180995A1 (en) | Medical system and control method | |
| US11406460B2 (en) | Surgery assisting apparatus, method of controlling the same, storage medium, and surgery assisting system | |
| US10638915B2 (en) | System for moving first insertable instrument and second insertable instrument, controller, and computer-readable storage device | |
| CN102791214B (en) | Uncalibrated visual servoing with real-time speed optimization | |
| KR102363661B1 (en) | Systems and methods for offscreen indication of instruments in a teleoperational medical system | |
| US10155315B2 (en) | Medical system and method for controlling the same | |
| JP6091410B2 (en) | Endoscope apparatus operating method and endoscope system | |
| JP4460297B2 (en) | Directional view end scope interface | |
| JP6180692B1 (en) | Medical manipulator system | |
| US5572999A (en) | Robotic system for positioning a surgical instrument relative to a patient's body | |
| JP4152402B2 (en) | Surgery support device | |
| US20020151784A1 (en) | Surgical microscope | |
| JP2020156800A (en) | Medical arm system, control device, and control method | |
| JP2013516264A5 (en) | ||
| US11540699B2 (en) | Medical manipulator system | |
| US20220218427A1 (en) | Medical tool control system, controller, and non-transitory computer readable storage | |
| US20210393331A1 (en) | System and method for controlling a robotic surgical system based on identified structures | |
| US11241144B2 (en) | Medical system and operation method of medical system | |
| KR20200107615A (en) | Ultrasound imaging apparatus, method for controlling the same, and computer program product | |
| US20250302273A1 (en) | Medical system, control device, control method, and control program | |
| KR20150105803A (en) | Three dimension endoscope system using giro sensor | |
| US20220323157A1 (en) | System and method related to registration for a medical procedure | |
| US12383126B2 (en) | Surgery system and control method for surgery system to adjust position and orientation of imager | |
| CN120602778A (en) | Registration method for exterior mirror system and exterior mirror system | |
| CN117372667A (en) | Pose adjusting method and device of image acquisition assembly and controller |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AS | Assignment |
Owner name: NATIONAL CANCER CENTER, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TAKAYAMA, HIROYUKI;SASAI, RYOTA;YANAGIHARA, MASARU;AND OTHERS;SIGNING DATES FROM 20221222 TO 20230111;REEL/FRAME:062582/0949 Owner name: OLYMPUS CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TAKAYAMA, HIROYUKI;SASAI, RYOTA;YANAGIHARA, MASARU;AND OTHERS;SIGNING DATES FROM 20221222 TO 20230111;REEL/FRAME:062582/0949 |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
| STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |