
WO2019035206A1 - Endoscope system and image generation method - Google Patents

Endoscope system and image generation method

Info

Publication number
WO2019035206A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
display image
virtual image
medical device
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2017/029614
Other languages
English (en)
Japanese (ja)
Inventor
井上 慎太郎 (Shintaro Inoue)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Olympus Corp
Original Assignee
Olympus Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Corp filed Critical Olympus Corp
Priority to PCT/JP2017/029614
Publication of WO2019035206A1


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/04 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B 1/045 Control thereof

Definitions

  • the present invention relates to a medical system and an image generation method using the medical system.
  • the appropriate type of medical device is selected in accordance with the procedure, and the selected medical device is inserted into the abdominal cavity.
  • the operator needs to select a mesh of the optimal size, necessary and sufficient to close the hernia gate, and insert the selected mesh into the abdominal cavity.
  • the operator confirms the image captured by the endoscope, estimates the length and area of the contour of the tissue in the three-dimensional space, and selects a mesh of the optimal size. If the selected mesh is inserted into the abdominal cavity and found to be small relative to the hernia gate, the operator needs to remove the mesh from the abdominal cavity and insert a reselected mesh into the abdominal cavity. Depending on the operator's experience and technique, this trial-and-error selection of a mesh can take considerable time.
  • Patent Document 1 describes a telesurgery robot system capable of measuring the length and area of a contour of a tissue in a three-dimensional space from an endoscopic image of the tissue acquired by the endoscope.
  • the telesurgical robot system can measure, for example, the distance between measurement points designated by a treatment tool, and superimpose the measured distance information on an endoscopic image for display.
  • the operator who uses the telesurgical robot system can grasp the dimensions and the like in the three-dimensional space of the tissue shown in the endoscopic image.
  • Patent Document 1 requires procedures such as designation of measurement points in order to perform measurement in a three-dimensional space, so dimensions and the like in the three-dimensional space could not be measured rapidly. In addition, it has been difficult for the operator to intuitively determine the most appropriate type of medical device from such dimensional measurement results in the three-dimensional space.
  • the present invention aims to provide a medical system capable of supporting selection of an appropriate type of medical device and an image generation method using the medical system.
  • a medical system includes a treatment tool, an endoscope having an imaging unit, a control device that generates a display image from a captured image acquired by the imaging unit, a display device that displays the display image, and an input device for selecting a medical device. The control device measures the relative position from the endoscope to the treatment tool and, based on that relative position, superimposes on the display image a virtual image of the medical device at a position, and with a size, relative to the treatment tool shown in the display image.
  • the image generation method is an image generation method for a medical system in which a display image is generated from a captured image acquired by an imaging unit of an endoscope, and includes: a medical device selection step of selecting a medical device using an input device; a relative position measurement step of measuring the relative position from the endoscope to a treatment tool; and a virtual image superposition step of superimposing on the display image, based on the relative position, a virtual image of the medical device having a size relative to the treatment tool shown in the display image.
  • selection of an appropriate type of medical device can be assisted.
  • the display image on which the virtual image of the selected mesh A is superimposed is shown.
  • the display image on which the virtual image of the selected mesh B is superimposed is shown.
  • the display image on which the virtual image of the selected stapler A is superimposed is shown.
  • the display image on which the virtual image of the selected stapler B is superimposed is shown. Two further figures each show the overall configuration of a modification of the medical system.
  • the endoscope of the medical system according to a third embodiment of the present invention is shown. In this medical system, the display image on which the virtual image of the selected snare A is superimposed is shown, as is the display image on which the virtual image of the selected snare B is superimposed.
  • FIG. 1 is a diagram showing an entire configuration of a medical system 100 according to the present embodiment.
  • the medical system 100 includes a treatment tool 1, an endoscope 2, a control device 3, a display device 4, and an input device 5.
  • the medical system 100 is a system that supports a treatment performed by inserting the treatment tool 1, the endoscope 2, and the like through separate holes (openings) opened in the abdominal wall in laparoscopic surgery.
  • the treatment tool 1 has an elongated insertion portion 10 that can be inserted into the abdominal cavity of a patient, and an operation portion 11 provided on the proximal end side of the insertion portion 10.
  • the operator passes the insertion portion 10 through the trocar T which is punctured in the abdomen B of the patient and introduces the insertion portion 10 into the abdominal cavity.
  • the operator may introduce a plurality of treatment tools 1 into the abdominal cavity.
  • the insertion portion 10 has a treatment portion 12 at the tip end for treating an affected area of a patient.
  • the treatment unit 12 is a gripping mechanism configured by a pair of gripping members 12a.
  • the operation unit 11 is a member for operating the pair of gripping members 12a.
  • the operation unit 11 has a handle; moving the handle relative to the other portion of the operation unit 11 opens and closes the pair of gripping members 12a of the treatment unit 12.
  • the operator can operate the treatment unit 12 by holding the operation unit 11 with one hand.
  • FIG. 2 is a hardware configuration diagram of the medical system 100 excluding the treatment tool 1.
  • the endoscope 2 has an elongated rigid insertion portion 20 which can be inserted into the abdominal cavity of a patient, and a handle portion 21.
  • the operator passes the insertion portion 20 through the trocar T which is punctured in the abdomen B of the patient and introduces the insertion portion 20 into the abdominal cavity.
  • the insertion unit 20 is provided at the tip thereof with an imaging unit 22 having a lens for imaging a situation in the abdomen of a patient and an imaging element.
  • the insertion unit 20 introduced into the abdominal cavity is disposed at a position where the imaging unit 22 can photograph an affected area in the abdomen that is a treatment target.
  • the imaging unit 22 may have an optical zoom or electronic zoom function.
  • the operator can change the position and the orientation of the imaging unit 22 of the endoscope 2 by holding the handle portion 21 and moving the endoscope 2.
  • the insertion portion 20 may further include an active bending portion that actively bends. By bending the active bending portion provided in a part of the insertion portion 20, the position and the orientation of the imaging unit 22 can be changed.
  • a control signal line for controlling the imaging unit 22, a transmission signal line for transferring a captured image captured by the imaging unit 22, and the like are wired inside the handle portion 21.
  • the control device 3 has a control unit 33, as shown in FIG.
  • the control device 3 receives the captured image captured by the imaging unit 22 of the endoscope 2 and transfers it to the display device 4 as a display image.
  • the control unit 33 is configured by a device (computer) provided with hardware capable of executing programs such as a CPU (Central Processing Unit) and a memory.
  • the function of the control unit 33 can be realized as a function of software by the control unit 33 reading and executing a program for controlling the CPU.
  • at least a part of the control unit 33 may be configured by a dedicated logic circuit or the like.
  • the same function can be realized by connecting at least a part of hardware configuring the control unit 33 with a communication line.
  • FIG. 3A shows an example of the overall configuration of the control unit 33.
  • the control unit 33 includes a CPU 34, a memory 35 capable of reading a program, a storage unit 36, and an input / output control unit 37.
  • a program provided to the control unit 33 for controlling the operation of the control unit 33 is read into the memory 35 and executed by the CPU 34.
  • the storage unit 36 is a non-volatile storage medium storing the above-described program and necessary data.
  • the storage unit 36 is configured of, for example, a ROM, a hard disk or the like.
  • the program recorded in the storage unit 36 is read into the memory 35 and executed by the CPU 34.
  • the storage unit 36 stores, for example, virtual images of the “types” of medical devices introduced into the abdominal cavity, such as forceps, electric scalpels, staplers, snares, and meshes, or shape data from which such virtual images can be generated. Further, for medical devices that come in multiple sizes, a virtual image, or virtual image shape data, for each “size” is also stored in the storage unit 36. In the following description, virtual images and virtual image shape data are collectively referred to as “virtual image data”, and the virtual image data for each “type” and “size” of medical device is referred to as “medical device information”.
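The "medical device information" described above can be pictured as a lookup keyed by device "type" and "size". The following is a minimal hypothetical sketch; the dictionary layout, field names, and dimension values are illustrative assumptions, not the patent's actual data format.

```python
# Hypothetical store of "medical device information": virtual-image shape
# data keyed by the device "type" and "size" selected via the input device.
# All names and dimensions below are assumed example values.
MEDICAL_DEVICE_INFO = {
    ("mesh", "A"): {"width_mm": 150, "height_mm": 100},
    ("mesh", "B"): {"width_mm": 100, "height_mm": 70},
    ("stapler", "A"): {"jaw_length_mm": 45},
    ("stapler", "B"): {"jaw_length_mm": 60},
}

def lookup_device(kind: str, size: str) -> dict:
    """Return the shape data for the selected device, as the storage
    unit 36 would when generating a virtual image."""
    return MEDICAL_DEVICE_INFO[(kind, size)]
```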
  • the input / output control unit 37 receives input data from the endoscope 2 and transfers the input data to the CPU 34 or the like.
  • the input / output control unit 37 also generates data, control signals, and the like for the endoscope 2 and the display device 4 based on an instruction of the CPU 34.
  • the control unit 33 receives a captured image as input data from the endoscope 2 and reads the captured image into the memory 35. Based on the program read into the memory 35, the CPU 34 performs image processing on the captured image. The captured image subjected to the image processing is transferred to the display device 4 as a display image.
  • the control unit 33 has at least two image-processing operation modes: a normal mode and a virtual image superposition mode.
  • the operating mode is controlled by the program.
  • the control unit 33 operating in the normal mode performs image processing such as image format conversion, contrast adjustment, and resizing processing on the captured image to generate a display image.
  • the control unit 33 operating in the virtual image superposition mode performs an image processing of superimposing a virtual image of a medical device described later on a display image.
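The two operation modes above amount to a dispatch in the display-image pipeline. The sketch below illustrates this flow; the helper names (`resize`, `adjust_contrast`, `superimpose`) are placeholders standing in for the processing steps named in the text, not an actual API.

```python
# Sketch of the control unit's two image-processing operation modes.
# Images are modeled as simple lists so the flow stays self-contained;
# the helpers are trivial stand-ins for real image processing.

def resize(img):
    return img  # stand-in for resizing processing

def adjust_contrast(img):
    return img  # stand-in for contrast adjustment

def superimpose(img, overlay):
    # stand-in for drawing the virtual image onto the display image
    return img + ["overlay:" + overlay]

def generate_display_image(captured, mode, virtual_image=None):
    """Normal mode: format conversion / contrast / resize only.
    Virtual image superposition mode: additionally overlays the
    virtual image of the selected medical device."""
    frame = adjust_contrast(resize(list(captured)))
    if mode == "virtual_image_superposition" and virtual_image is not None:
        frame = superimpose(frame, virtual_image)
    return frame
```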
  • control unit 33 is not limited to the device provided in one piece of hardware.
  • the control unit 33 may be configured by separating the CPU 34, the memory 35, the storage unit 36, and the input/output control unit 37 into separate pieces of hardware connected through communication lines. Alternatively, by separating out the storage unit 36 and connecting it in the same way, the control unit 33 may be realized as a cloud system.
  • in addition to the CPU 34, the memory 35, the storage unit 36, and the input/output control unit 37 shown in FIG. 3A, the control unit 33 may further include components necessary for controlling the operation of the control device 3.
  • the control unit 33 may further include an image operation unit 38 that performs part or all of the image processing and image recognition processing performed by the CPU 34.
  • the control unit 33 can thereby execute specific image processing and image recognition processing at high speed. It may also include an image transfer unit for transferring the display image from the memory 35 to the display device 4.
  • the display device 4 is a device that displays the display image transferred by the control device 3.
  • the display device 4 has a known monitor 41 such as an LCD display.
  • the display device 4 may have a plurality of monitors 41.
  • the display device 4 may include a head mounted display or a projector instead of the monitor 41.
  • the monitor 41 can also display a GUI (Graphical User Interface) image generated by the control device 3.
  • the monitor 41 can display a control information display of the medical system 100 and an alert display on the GUI for the operator.
  • the display device 4 can also display a message prompting the user to input information from the input device 5 or a GUI display necessary for the information input.
  • the input device 5 is a device by which an operator inputs an instruction or the like to the control unit 33.
  • the input device 5 has an input unit 51 as shown in FIG.
  • the input unit 51 is a device that selects an operation mode of image processing of the control unit 33, and selects a medical device to be superimposed and displayed as a virtual image on a display image.
  • the input unit 51 may be configured by a switch or may be configured by a touch panel.
  • the touch panel may be integrated with the monitor 41.
  • the input unit 51 may be configured by a keyboard or a mouse.
  • the control unit 33 may display a list of operation modes and a list of medical devices on the monitor 41 as a GUI, and the operator may select the desired operation mode and medical device from the lists displayed on the monitor 41 using the keyboard and mouse. The control device 3 acquires the selection information input by the operator from the input unit 51.
  • the operator can use the input unit 51 to select the “type” of the medical device to be introduced into the abdominal cavity, such as forceps, electric scalpel, stapler, snare or mesh, from the medical device list. Also, for medical devices that have variations in size, "size" can also be selected.
  • FIG. 4 is a view showing laparoscopic surgery using the medical system 100.
  • the operator provides a plurality of holes (openings) for placing the trocar T in the abdomen B of the patient, and punctures the trocar T in the holes.
  • the operator passes the insertion portion 10 of the treatment tool 1 of the medical system 100 and the insertion portion 20 of the endoscope 2 through the trocar T punctured in the abdomen B of the patient, Introduce into the abdominal cavity.
  • the operator holds the handle portion 21 and moves the endoscope 2, changing the position and orientation of the imaging unit 22 so that the hernia gate H to be treated is included in the display image D. If the imaging unit 22 has an optical zoom or electronic zoom function, that function is adjusted as well.
  • the treatment unit 12 of the insertion portion 10 of the treatment tool 1 also appears in the display image D. Since the operation mode of the image processing of the control unit 33 is initially set to the “normal mode”, the virtual image of the medical device is not superimposed on the display image D.
  • the control unit 33 in which the operation mode is set to the “virtual image superposition mode” displays on the monitor 41 a medical device list and a message prompting the user to select the “type” and “size” of the medical device from the input unit 51.
  • the control unit 33 in which the operation mode is set to the “virtual image superposition mode” measures the three-dimensional relative position from the imaging unit 22 of the endoscope 2 to the treatment unit 12 of the treatment instrument (relative position measurement step).
  • the measurement of the three-dimensional relative position can be performed by a method appropriately selected from known methods. For example, methods such as position measurement by image processing (case 1), position measurement by a stereo camera (case 2), position measurement by a laser or the like (case 3), and position measurement by sensing (case 4) can be used.
  • the three-dimensional relative position to the treatment unit 12 can be measured using the captured image captured by the imaging unit 22 (case 1). For example, the moving treatment unit 12 can be photographed, and the three-dimensional relative position can be measured from the movement amount of image features. If the dimensions of the treatment unit 12 are known, the three-dimensional relative position can also be measured based on those dimensions. In addition, providing a mark such as an optical marker on the treatment unit 12 makes the three-dimensional relative position easier to measure.
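When the dimensions of the treatment unit are known (case 1), its distance from the camera can be estimated from its apparent size in the image via the pinhole camera model. A minimal sketch follows; the focal length and size values are assumed examples, not calibration data from the patent.

```python
# Pinhole-model distance estimation from a known object size (case 1):
#   apparent_size_px = focal_length_px * true_size_mm / distance_mm
# rearranged to solve for the distance. All numbers are assumed examples.

def distance_from_known_size(true_size_mm: float,
                             apparent_size_px: float,
                             focal_length_px: float) -> float:
    """Distance from the imaging unit to an object of known real size,
    given its measured size in pixels in the captured image."""
    return focal_length_px * true_size_mm / apparent_size_px

# e.g. a 10 mm wide gripper appearing 100 px wide with f = 800 px
# is roughly 80 mm from the imaging unit.
d_mm = distance_from_known_size(10.0, 100.0, 800.0)
```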
  • Laser, light, ultrasonic waves and the like are emitted from the imaging unit 22, and the three-dimensional relative position from the imaging unit 22 to the treatment unit 12 can be measured from the feedback information (case 3).
  • the three-dimensional relative position to the treatment unit 12 may be measured by performing pattern projection from the imaging unit 22 and analyzing the pattern projection result.
  • a magnetic sensor or the like may be installed in the treatment unit 12 or the imaging unit 22, and the three-dimensional relative position from the imaging unit 22 to the treatment unit 12 may be measured from position information, such as that of the magnetic sensor, received by an external antenna (case 4).
  • the control unit 33 may perform the medical device selection step and the relative position measurement step in parallel, or may perform either one first.
  • FIG. 5A shows a display image D in which the virtual image VM1 of “mesh A” selected as the medical device is superimposed on the position where the treatment unit 12 appears.
  • FIG. 5 (b) shows a display image D when the treatment tool 1 is moved.
  • the control unit 33 can set the transparency in the process of superimposing the virtual image VM1 on the display image D.
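Setting transparency when superimposing a virtual image is typically done by per-pixel alpha blending. The sketch below illustrates this for a single 8-bit channel value; the implementation shown is a generic assumption, not the patent's specific method.

```python
# Per-channel alpha blending for the transparency of a superimposed
# virtual image: alpha = 0 leaves the display image unchanged,
# alpha = 1 shows the virtual image fully opaque.

def blend_pixel(display_px: float, virtual_px: float, alpha: float) -> float:
    """out = (1 - alpha) * display + alpha * virtual"""
    return (1.0 - alpha) * display_px + alpha * virtual_px
```

With 50% transparency, a fully white virtual pixel (255) over a black background (0) blends to mid-grey (127.5).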
  • based on the relative distance from the imaging unit 22 to the treatment unit 12, the control unit 33 generates a virtual image VM1 of the selected medical device having a size relative to the treatment unit 12 shown in the display image D. The medical device information stored in the storage unit 36 is used to generate the virtual image VM1.
  • the display image D on which the virtual image VM1 is superimposed is displayed on the monitor 41. As shown in FIG. 5 (a), the operator can virtually see the treatment unit 12 holding “mesh A”.
  • the virtual image VM1 of the mesh A is superimposed at a position relative to the treatment unit 12 shown in the display image D. Therefore, when the treatment tool 1 is moved, the position where the virtual image VM1 of the mesh A is superimposed also changes. That is, the position where the virtual image VM1 of the mesh A is superimposed follows the position of the treatment unit 12 in the display image D.
  • the operator can move the treatment tool 1 to change the position of the virtual image VM1 of the mesh A in the display image D.
  • as shown in FIG. 5 (b), the operator can superimpose the virtual image VM1 of the selected mesh A in the vicinity of the hernia gate H by moving the treatment unit 12 to the vicinity of the hernia gate H formed in the abdominal wall.
  • the virtual image VM1 of the mesh A is superimposed on the display image D so as to have a size relative to the treatment unit 12 shown in the display image D. Therefore, the size of the virtual image VM1 of the mesh A superimposed on the display image D is substantially the same as the size a real mesh A would have in the display image if it actually existed in the three-dimensional space. That is, the size of the virtual image VM1 of the mesh A follows the size of the treatment unit 12 in the display image D. For example, when the treatment unit 12 is far from the imaging unit 22, the virtual image VM1 of the mesh A is superimposed at a smaller size than when the treatment unit 12 is near the imaging unit 22.
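The size-following behaviour above corresponds to scaling the virtual image in inverse proportion to the measured distance from the imaging unit to the treatment unit, as in pinhole projection. A minimal sketch under assumed example values:

```python
# Pinhole projection: the on-screen size of a device of known real size
# shrinks in inverse proportion to its distance from the camera.
# Focal length and sizes are assumed example values.

def virtual_image_scale_px(device_size_mm: float,
                           distance_mm: float,
                           focal_length_px: float) -> float:
    """Apparent on-screen size (px) at which the virtual image of a
    device of the given real size should be rendered."""
    return focal_length_px * device_size_mm / distance_mm

near_px = virtual_image_scale_px(100.0, 50.0, 800.0)   # treatment unit near
far_px = virtual_image_scale_px(100.0, 100.0, 800.0)   # treatment unit far
# near_px > far_px: the virtual mesh shrinks as the treatment unit recedes
```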
  • by looking at the virtual image VM1 of the mesh A shown in the display image D, the operator can intuitively grasp whether the selected mesh A is large enough to close the hernia gate H, without actually measuring dimensions in the three-dimensional space.
  • the three-dimensional relative attitude can be used in the virtual image superposition process.
  • the virtual image VM1 of the mesh A is superimposed on the display image D so as to have a posture relative to the treatment unit 12 shown in the display image D.
  • the virtual image VM1 of the mesh is generated and superimposed as an image rotated about an axis parallel to the longitudinal axis of the treatment tool 1.
  • FIG. 6A shows a display image D on which the virtual image VM2 of “mesh B” is superimposed when “mesh B” smaller in size than “mesh A” is selected as the medical device.
  • FIG. 6 (b) shows a display image D when the treatment tool 1 is moved.
  • as shown in FIG. 6B, the operator can display the virtual image VM2 of the selected mesh B superimposed in the vicinity of the hernia gate H by moving the treatment unit 12 to the vicinity of the hernia gate H formed on the abdominal wall.
  • by looking at the virtual image VM2 of the mesh B shown in the display image D, the operator can intuitively grasp that the selected mesh B does not have the necessary and sufficient size for closing the hernia gate H.
  • FIG. 7 is a view showing laparoscopic surgery in which a large intestine is detached using the medical system 100.
  • as shown in FIG. 7, the operator holds the handle portion 21 and moves the endoscope 2, changing the position and orientation of the imaging unit 22 so that the display image D includes the separated portion P of the large intestine to be treated.
  • the treatment portion 12 of the insertion portion 10 of the treatment tool 1 is imaged in the display image D shown in FIG. 7. Since the operation mode of the image processing of the control unit 33 is initially set to the “normal mode”, the virtual image of the medical device is not superimposed on the display image D.
  • before introducing into the abdominal cavity a stapler for cutting off the separated portion P of the large intestine, the operator examines what stapler size is necessary and sufficient for cutting off the separated portion P.
  • the operator sets the operation mode of the image processing of the control unit 33 to the “virtual image superposition mode”.
  • the operator selects a stapler as the type of medical device, and further selects “stapler A” which is estimated to be the optimum size.
  • in the virtual image superposition step, the virtual image VS1 of “stapler A” is superimposed on the display image D based on the three-dimensional relative position from the imaging unit 22 of the endoscope 2 to the treatment unit 12 of the treatment tool, measured in the relative position measurement step.
  • by looking at the virtual image VS1 of the stapler A shown in the display image D, the operator can intuitively grasp that the selected stapler A does not have the necessary and sufficient size to cut off the separated portion P of the large intestine.
  • the operator reselects the "stapler B" which is larger than the stapler A.
  • the control unit 33 superimposes the virtual image VS2 of the "stapler B" on the display image D as shown in FIG. 9 in the virtual image superposition step.
  • by looking at the virtual image VS2 of the stapler B shown in the display image D, the operator can intuitively grasp that the selected stapler B has the necessary and sufficient size to cut off the separated portion P of the large intestine.
  • by looking at the virtual image of the medical device shown in the display image D, the operator can intuitively grasp whether the selected medical device has the necessary and sufficient size in the three-dimensional space, without actually measuring dimensions. The time spent on trial and error in the selection of medical devices can thus be reduced, and the selection of an appropriate type of medical device can be supported.
  • the virtual image of the medical device is superimposed on the position where the treatment unit 12 appears in the display image D, but the superimposed position of the virtual image on the display image is not limited to the aspect of the above embodiment.
  • the virtual image may be superimposed on the position where the tip or side portion of the treatment unit is located.
  • the superimposition of the virtual image on the display image is performed in the control unit 33, but the mode of the superimposition of the virtual image on the display image is not limited to the aspect of the above embodiment.
  • the superimposition of the virtual image on the display image may instead be performed by the display device.
  • the control unit 33 may calculate information such as the superposition position and size necessary for the superimposed display, the display device may superimpose the virtual image on the display image based on the calculation result, and the resulting display image may then be displayed on the monitor 41.
  • although the display device 4 shown in the above embodiment is configured with the monitor 41, the display device 4 may instead be configured with a projector.
  • a projector may be incorporated into the imaging unit 22 of the endoscope 2, and projection (projection mapping) of a virtual image of a medical device may be performed by the projector in the real three-dimensional space. This allows the operator to select the most appropriate medical device even more intuitively.
  • when the projector is incorporated into the imaging unit 22 of the endoscope 2, infrared pattern light may be emitted from the projector to measure the three-dimensional relative position to the treatment unit or the like.
  • although the endoscope 2 shown in the above embodiment is held and operated directly by the operator, the form of the endoscope is not limited to the endoscope 2 of the above embodiment.
  • in the medical system 100B, a modification of the medical system 100 shown in FIG. 10, the endoscope 2B includes an articulated robot arm 21B having a plurality of joints 23.
  • the three-dimensional position of the imaging unit 22 at the tip of the endoscope 2B can be calculated from control information of the articulated robot arm 21B.
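Computing the tip position from the arm's control information is a forward-kinematics calculation over the joint angles and link lengths. The sketch below shows this for a simplified planar chain of revolute joints; the link lengths and angles are assumed examples, not parameters of the arm 21B.

```python
# Forward kinematics for a planar chain of revolute joints: the tip
# position is accumulated link by link from the joint angles reported
# by the arm's controller. A real arm would use full 3-D transforms;
# this planar version illustrates the principle.
import math

def forward_kinematics(link_lengths_mm, joint_angles_rad):
    """Return the (x, y) position of the arm tip, in mm."""
    x = y = 0.0
    theta = 0.0
    for length, angle in zip(link_lengths_mm, joint_angles_rad):
        theta += angle          # joint angles accumulate along the chain
        x += length * math.cos(theta)
        y += length * math.sin(theta)
    return x, y

# Two 100 mm links with both joints straight: tip is 200 mm out
tip = forward_kinematics([100.0, 100.0], [0.0, 0.0])
```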
  • although the treatment tool 1 shown in the above embodiment is held and operated directly by the operator, the form of the treatment tool is not limited to the treatment tool 1 of the above embodiment.
  • the treatment tool 1B includes an articulated robot arm 11B having a plurality of joints 13.
  • the three-dimensional position of the treatment section 12 at the tip of the treatment tool 1B can be calculated from control information of the articulated robot arm 11B.
  • A second embodiment of the present invention will be described with reference to FIGS. 12 to 15.
  • the present embodiment is different from the first embodiment in that depth measurement in a three-dimensional space shown in a display image is performed.
  • the same reference numerals are assigned to components common to those described above, and redundant description will be omitted.
  • the control unit 33 of the medical system 200 differs in that, in addition to the three-dimensional relative position from the imaging unit 22 of the endoscope 2 to the treatment unit 12 of the treatment tool, it also measures the “three-dimensional relative depth” from the imaging unit 22 of the endoscope 2 to tissue or other objects present in the three-dimensional space shown in the display image.
  • the virtual image shape data is likewise stored in the storage unit 36 of the control unit 33 of the medical system 200.
  • an operation of the medical system 200 and an image generation method using the medical system 200 will be described with reference to FIGS. 12 to 15, taking as an example laparoscopic surgery in which a hernia gate formed in the abdominal wall is closed with a mesh. Only differences from the medical system 100 of the first embodiment will be described.
  • the operator sets the operation mode of the image processing of the control unit 33 to the “virtual image superposition mode”.
  • the operator operates the input unit 51 to select “mesh A” which is estimated to be the optimum size (medical device selection step).
  • the control unit 33 whose operation mode is set to the “virtual image superposition mode” measures the three-dimensional relative position from the imaging unit 22 of the endoscope 2 to the treatment unit 12 of the treatment tool, as in the first embodiment (relative position measurement step).
  • in the relative position measurement step, the “three-dimensional relative depth” from the imaging unit 22 of the endoscope 2 to the tissue or the like present in the three-dimensional space shown in the display image is also measured.
  • The measurement of the three-dimensional relative depth can be performed by a method appropriately selected from known methods.
  • For example, passive depth measurement by image processing (case P), active depth measurement by projection of a pattern or the like (case A), and similar methods can be used.
  • The three-dimensional relative depth to the treatment unit 12 can be measured using the captured image captured by the imaging unit 22 (case P). For example, the moving treatment unit 12 can be photographed, and the three-dimensional relative depth can be measured from the amount of movement of image features.
  • When the imaging unit 22 is configured as a stereo camera, the three-dimensional relative depth from the imaging unit 22 to a tissue or the like can be measured using the parallax images of the stereo camera.
  • Alternatively, laser light, ultrasound, or the like can be emitted from the imaging unit 22, and the three-dimensional relative depth from the imaging unit 22 to the tissue or the like can be measured from the returned signal (case A).
  • The three-dimensional relative depth to the tissue or the like may also be measured by projecting a striped or random pattern from the imaging unit 22 and analyzing the pattern projection result.
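The stereo-parallax measurement mentioned above (case P with a stereo imaging unit) reduces to standard triangulation: depth equals focal length times baseline divided by disparity. The following Python sketch is illustrative only and not part of the patent; the parameter values (500 px focal length, 4 mm baseline) are assumptions chosen to resemble a small stereo endoscope.

```python
import numpy as np

def depth_from_disparity(disparity_px, focal_px, baseline_mm):
    """Triangulate per-pixel depth from a rectified stereo disparity map.

    disparity_px : 2-D array of disparities in pixels (0 where unmatched)
    focal_px     : focal length of the rectified cameras, in pixels
    baseline_mm  : distance between the two camera centres, in mm
    Returns a depth map in mm, with 0 where no disparity was found.
    """
    disparity = np.asarray(disparity_px, dtype=float)
    depth = np.zeros_like(disparity)
    valid = disparity > 0
    # Z = f * B / d for every pixel with a valid stereo match
    depth[valid] = focal_px * baseline_mm / disparity[valid]
    return depth

disp = np.array([[10.0, 20.0], [0.0, 40.0]])
depth = depth_from_disparity(disp, focal_px=500.0, baseline_mm=4.0)
```

Larger disparities map to nearer surfaces, which is why fine disparity resolution matters most for tissue close to the imaging unit.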
  • The control unit 33 superimposes the virtual image of the selected medical device on the display image D based on the measurement results of the three-dimensional relative position and the three-dimensional relative depth (virtual image superposition step).
  • FIGS. 12 to 14 show the display image D on which the virtual image VM1 of "mesh A", selected as the medical device, is superimposed.
  • Based on the measurement result of the three-dimensional relative depth, the control unit 33 superimposes on the display image D a wire frame F indicating the unevenness of the tissue or the like present in the three-dimensional space shown in the display image D.
  • By viewing the display image D on which the wire frame F is superimposed, the operator can easily grasp the unevenness of the tissue or the like present in the three-dimensional space shown in the display image D.
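A wire frame such as F can be derived from the measured depth map in a simple way: sample the map on a coarse pixel grid and connect neighbouring samples with line segments. The sketch below is a hypothetical illustration, not the patent's implementation; `step` is an assumed sampling interval.

```python
import numpy as np

def wireframe_vertices(depth_map, step=8):
    """Sample a coarse grid of 3-D points (u, v, z) from a per-pixel depth map.

    Connecting horizontally and vertically adjacent grid points with line
    segments yields a wire frame whose depth follows the tissue surface.
    Returns an array of shape (rows, cols, 3).
    """
    h, w = depth_map.shape
    us = np.arange(0, w, step)
    vs = np.arange(0, h, step)
    grid = np.array([[(u, v, depth_map[v, u]) for u in us] for v in vs],
                    dtype=float)
    return grid

dm = np.full((16, 16), 5.0)   # flat dummy surface, 5 mm everywhere
grid = wireframe_vertices(dm, step=8)
```

A renderer would then draw each grid edge, shading or displacing it by the z component so the overlay conveys surface relief.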
  • The control unit 33 measures in real time the "three-dimensional relative position" from the imaging unit 22 to the treatment unit 12 and the "three-dimensional relative depth" from the imaging unit 22 to the tissue or the like present in the three-dimensional space shown in the display image. Further, for the medical device whose virtual image is superimposed, shape data for the virtual image is stored in the storage unit 36. Therefore, the control unit 33 can determine whether the medical device, were it a real medical device, would touch the tissue or the like present in the three-dimensional space shown in the display image (contact determination).
  • When the control unit 33 determines that contact with the tissue or the like present in the three-dimensional space occurs, it superimposes a display indicating the contact on the display image D, as shown in FIG. 13.
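The contact determination described above can be illustrated as a per-pixel depth comparison between the virtual device's surface and the measured tissue surface. This is only a sketch under assumed data layouts (per-pixel depth images, with `np.inf` marking pixels the device does not cover); `tolerance_mm` is a made-up parameter, not something the patent specifies.

```python
import numpy as np

def contact_mask(device_depth, tissue_depth, tolerance_mm=1.0):
    """Per-pixel contact determination between a virtual device and tissue.

    device_depth : depth of the virtual device surface at each pixel it
                   covers (np.inf where the device is absent)
    tissue_depth : measured depth of the tissue at the same pixels
    Returns a boolean mask marking pixels where the device footprint would
    reach the tissue surface within the given tolerance.
    """
    covered = device_depth != np.inf
    # Contact where the device surface lies at, or beyond, the tissue
    # surface (allowing a small tolerance for measurement noise).
    return covered & (device_depth >= tissue_depth - tolerance_mm)

dev = np.array([[50.0, np.inf], [60.0, 58.0]])
tis = np.array([[55.0, 55.0], [59.0, 70.0]])
mask = contact_mask(dev, tis, tolerance_mm=1.0)
```

Pixels where the mask is true would receive the superimposed contact indication on the display image D.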
  • By viewing the virtual image VM1 of the mesh A in the display image D, the operator can intuitively grasp whether the selected mesh A can close the hernia gate H without being obstructed by surrounding obstacles.
  • As shown in FIG. 14, when the treatment unit 12 approaches the vicinity of the hernia gate H formed in the abdominal wall, the control unit 33 may deform the virtual image of the mesh A along the tissue or the like it contacts in the three-dimensional space before superimposing it. By viewing the virtual image VM1 of the mesh A deformed along the tissue or the like, the operator can understand even more intuitively whether the selected mesh A can close the hernia gate H.
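Deforming the virtual mesh along the tissue, as in FIG. 14, can be approximated by draping: each mesh vertex keeps its image-plane position while its depth is snapped to the measured tissue depth at that pixel. The function below is an illustrative assumption, not the patent's deformation method.

```python
import numpy as np

def drape_mesh_on_tissue(mesh_uvz, tissue_depth):
    """Deform a flat virtual mesh so it follows the tissue surface.

    mesh_uvz     : (N, 3) array of mesh vertices as (u, v, z) in image space
    tissue_depth : per-pixel measured tissue depth map
    Each vertex keeps its (u, v) position, but its depth is replaced by the
    tissue depth at that pixel, draping the mesh over the surface.
    """
    draped = np.array(mesh_uvz, dtype=float)
    u = draped[:, 0].astype(int)
    v = draped[:, 1].astype(int)
    draped[:, 2] = tissue_depth[v, u]
    return draped

tissue = np.arange(16, dtype=float).reshape(4, 4)
mesh = np.array([[1.0, 2.0, 0.0], [3.0, 0.0, 0.0]])
draped = drape_mesh_on_tissue(mesh, tissue)
```

A real implementation would also interpolate between pixels and preserve the mesh's surface area, but the principle is the same.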
  • FIG. 15 shows a display image D on which the virtual image VS2 of the stapler B, selected to perform separation of the large intestine in the medical system 200, is superimposed.
  • The control unit 33 determines that the virtual image touches the tissue located on the right, and superimposes a display indicating the contact on the display image D.
  • Likewise, the control unit 33 determines that the virtual image touches the tissue located on the left, and superimposes a display indicating the contact on the display image D. The display indicating the contact is superimposed at the place where the contact is determined to occur.
  • By viewing the virtual image VS2 of the stapler B shown in the display image D, the operator can intuitively grasp whether the selected stapler B can separate the large intestine without being obstructed by surrounding obstacles.
  • As described above, by viewing the virtual image of the medical device shown in the display image D, the operator can intuitively grasp whether the selected medical device is obstructed by an obstacle. The time taken for trial and error in selecting medical devices can thus be reduced, and the selection of an appropriate type of medical device can be supported.
  • The overall configuration of the medical system 300 according to the present embodiment is the same as that of the medical system 100 according to the first embodiment.
  • The difference is that the endoscope 2C has a flexible insertion portion.
  • FIG. 16 shows an endoscope 2C of the medical system 300.
  • A treatment tool to which the local injection needle 12b is attached as the treatment unit 12 is inserted through a lumen provided in the endoscope 2C.
  • The operator pierces the local injection needle 12b into the peripheral edge around the lesion and slowly injects the local injection material into the submucosal layer to lift the lesion.
  • A treatment tool to which the snare 12c is attached as the treatment unit 12 is then inserted through the lumen provided in the endoscope 2C.
  • The operator separates the lesion using the snare 12c.
  • The operator needs to select a snare 12c of the optimum size and shape in accordance with the size and shape of the lesion. Depending on the experience and skill of the operator, trial and error in selecting the snare can take time.
  • When introducing the treatment tool to which the local injection needle 12b is attached as the treatment unit 12 into the body, the operator sets the operation mode of the image processing of the control unit 33 to the "virtual image superposition mode".
  • The operator operates the input unit 51 to select "snare A", which is estimated to be the optimum size (medical device selection step).
  • FIG. 17 shows a display image D in which the virtual image VW1 of the selected snare A is superimposed at the position where the local injection needle 12b appears in the medical system 300.
  • By viewing the virtual image VW1 of the snare A shown in the display image D, the operator can intuitively grasp that the selected snare A has a necessary and sufficient size for separation of the lesion.
  • FIG. 18 shows a display image in which the snare B, smaller in size than the snare A, is selected in the medical system 300 and the virtual image VW2 of the snare B is superimposed.
  • By viewing the virtual image VW2 of the snare B shown in the display image D, the operator can intuitively understand that the selected snare B does not have a necessary and sufficient size for separation of the lesion.
  • The present embodiment differs from the first to third embodiments in that the medical device selection process is linked with a stock management system for medical devices.
  • The same reference numerals are assigned to components common to those already described, and redundant descriptions are omitted.
  • The overall configuration of the medical system 400 according to the present embodiment is the same as that of the medical system 100 according to the first embodiment.
  • The control unit 33, whose operation mode is set to the "virtual image superposition mode", displays on the monitor 41 a medical device list and a message prompting the user to select the "type" and "size" of the medical device via the input unit 51.
  • The control unit 33 is connected to an in-hospital inventory management system for medical devices, and can display the inventory status of each medical device alongside the displayed medical device list.
  • The operator can select the medical device to be displayed as a virtual image after confirming whether it is in stock. If a medical device that is not actually in stock is selected, the device cannot be used immediately even if it is the optimum size. By displaying the stock status of medical devices, a device can be selected from among those actually available, which reduces the time taken for trial and error in selecting medical devices and supports the selection of an appropriate type of medical device.
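The linkage between device selection and the in-hospital inventory system can be pictured as a filter over the displayed device list. The data shapes here (a `name` key per device, a stock dictionary keyed by name) are assumptions made only for this sketch, not structures defined by the patent.

```python
def selectable_devices(device_list, stock):
    """Annotate each device with its in-hospital stock count and keep
    only devices that are actually available for immediate use."""
    return [
        {**dev, "in_stock": stock[dev["name"]]}
        for dev in device_list
        if stock.get(dev["name"], 0) > 0
    ]

devices = [
    {"name": "mesh A", "size": "10 cm"},
    {"name": "mesh B", "size": "15 cm"},
]
stock = {"mesh A": 2, "mesh B": 0}
available = selectable_devices(devices, stock)
```

Presenting only in-stock devices (or flagging out-of-stock ones) keeps the virtual-image preview limited to devices the operator could actually pick up.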
  • the present invention can be applied to a medical system provided with an endoscope.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biomedical Technology (AREA)
  • Optics & Photonics (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biophysics (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Endoscopes (AREA)

Abstract

According to the invention, this medical system is provided with: a treatment tool; an endoscope comprising an image capturing unit; a control device that generates a display image from an image captured by the image capturing unit; a display device that displays the display image; and an input device for selecting a medical device, wherein the control device measures a relative position from the endoscope to the treatment tool and superimposes on the display image, based on the relative position, a virtual image of the medical device located at the relative position with respect to the treatment tool shown in the display image and having a relative size.
PCT/JP2017/029614 2017-08-18 2017-08-18 Endoscope system and image generation method Ceased WO2019035206A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/029614 WO2019035206A1 (fr) Endoscope system and image generation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/029614 WO2019035206A1 (fr) Endoscope system and image generation method

Publications (1)

Publication Number Publication Date
WO2019035206A1 true WO2019035206A1 (fr) 2019-02-21

Family

ID=65362425

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/029614 Ceased WO2019035206A1 (fr) 2017-08-18 2017-08-18 Système d'endoscope et procédé de génération d'image

Country Status (1)

Country Link
WO (1) WO2019035206A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116171122A (zh) * 2020-09-10 2023-05-26 Olympus Corporation Medical system and control method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05337118A (ja) * 1992-06-04 1993-12-21 Olympus Optical Co Ltd Scope holding device
JPH08332169A (ja) * 1995-06-08 1996-12-17 Olympus Optical Co Ltd Intra-body-cavity observation device
JPH0938030A (ja) * 1995-07-28 1997-02-10 Shimadzu Corp Endoscope apparatus
JP2002238844A (ja) * 2001-02-16 2002-08-27 Olympus Optical Co Ltd Endoscope apparatus

Similar Documents

Publication Publication Date Title
JP7275204B2 (ja) System and method for on-screen menus in a teleoperated medical system
JP7080945B2 (ja) Systems and methods for onscreen identification of instruments in a teleoperated medical system
JP7086150B2 (ja) Systems and methods for rendering onscreen identification of instruments in a teleoperated medical system
CN109890311B (zh) Reconfigurable displays in computer-assisted teleoperated surgery
EP1937176B1 (fr) Display and manipulation of auxiliary images on a computer display of a medical robotic system
US9615890B2 Surgical robot system and method of controlling the same
KR101038417B1 (ko) Surgical robot system and control method thereof
US20190053872A1 Systems and methods for removing occluding objects in surgical images and/or video
JP2019162339A (ja) Surgery support system and display method
EP4642366A1 (fr) Systems and methods for generating 3D navigation interfaces for medical procedures
US20250205003A1 Robotic surgical system and method for providing a stadium view with arm set-up guidance
KR100962472B1 (ko) Surgical robot system and control method thereof
KR101715026B1 (ko) Surgical robot system and operation restriction method thereof
WO2019035206A1 (fr) Endoscope system and image generation method
KR101683057B1 (ko) Surgical robot system and operation restriction method thereof
CN118284380A (zh) Navigation assistance for instruments
JP7495242B2 (ja) Medical image diagnostic apparatus, surgery support robot apparatus, control device for surgery support robot, and control program
US11850004B2 Systems and methods for determining an arrangement of explanted tissue and for displaying tissue information
CN117355862A (zh) Systems, methods, and media including instructions for connecting model structures representing anatomical pathways
WO2022147074A1 (fr) Systems and methods for tracking objects crossing a body wall for operations associated with a computer-assisted system
CN119014982A (zh) Systems and methods for generating workspace geometry for an instrument
JP2024514640A (ja) Blending directly visualized with rendered elements to display blended elements and actions occurring on and off screen

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17921626

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17921626

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP