WO2025069575A1 - Information processing device and control program - Google Patents
Information processing device and control program
- Publication number
- WO2025069575A1 (application PCT/JP2024/020434)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information processing
- treatment
- processing device
- dimensional model
- resection
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/045—Control thereof
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
Definitions
- This disclosure relates to an information processing device and a control program.
- JP 2015-107268 A discloses an endoscopic observation support device that displays a three-dimensional image of a tubular organ by superimposing a lesion extracted based on the path of the endoscope, the endoscopic image acquired by the endoscope, and the observation position of an ultrasound probe attached to the tip of the endoscope.
- JP 2006-61274 A discloses an endoscope system that generates a virtual endoscope image by volume rendering and displays the virtual endoscope image superimposed on an actual endoscope image.
- JP 2013-202313 A discloses a surgery support device that displays the movement of surgical tools on a three-dimensional image in real time, displays the status of organ resection by the surgical tools, and issues warnings according to the position and speed of the surgical tools.
- a method is used in which the positional information of blood vessels and tumors extracted from image data taken by a radiological imaging device such as a CT (Computed Tomography) scanner is combined with the images taken by the endoscope.
- the present disclosure has been made in consideration of the above circumstances, and aims to provide an information processing device and control program that can three-dimensionally grasp the difference between the positional relationship of the treatment site shown by image data of the treatment site and the positional relationship of the treatment site shown by an image captured by an endoscope.
- the information processing device includes a processor, and the processor performs control to synthesize, onto a three-dimensional model of the treatment area generated from image data of the treatment area captured in advance, an image of the treatment area captured by an imaging device attached to an endoscope, and to display the composite three-dimensional model in which the image of the treatment area is combined.
- the processor controls the display of related information associated with each stage of treatment for the treatment area together with the three-dimensional model or the composite three-dimensional model, depending on the stage of treatment for the treatment area.
- the processor controls to superimpose and display at least one of the following on the corresponding position of the three-dimensional model: the position of the affected area, the preplanned resection line of the treatment area including the affected area, the resection range to be resected along the resection line, the position of the blood vessel passing through the treatment area, the preplanned ligation position of the blood vessel, and the blood flow cessation area where blood flow is halted by ligation of the blood vessel.
- the processor in the information processing device according to the third aspect performs control to display the remaining volume of the organ containing the diseased area together with the three-dimensional model when the diseased area is resected along the resection line.
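The disclosure does not specify how the remaining organ volume would be computed. As a hedged illustration, one common approach is to count voxels in segmentation masks derived from the preoperative image data; the function and mask names below are assumptions, not taken from the patent.

```python
import numpy as np

def remaining_volume_ratio(organ_mask: np.ndarray, resection_mask: np.ndarray) -> float:
    """Fraction of the organ volume that would remain after resection.

    organ_mask     -- boolean voxel mask of the whole organ (e.g. from CT)
    resection_mask -- boolean voxel mask of the planned resection range
    """
    organ_voxels = int(organ_mask.sum())
    if organ_voxels == 0:
        raise ValueError("organ mask is empty")
    removed_voxels = int((organ_mask & resection_mask).sum())
    return (organ_voxels - removed_voxels) / organ_voxels

# Toy example: resecting half of a 10x10x10 organ leaves 50% of its volume.
organ = np.ones((10, 10, 10), dtype=bool)
resection = np.zeros_like(organ)
resection[:5] = True
print(remaining_volume_ratio(organ, resection))  # 0.5
```

Multiplying the ratio by the organ volume in millilitres (voxel count times voxel size) would give the absolute figure a device could display next to the three-dimensional model.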
- the processor performs control to superimpose and display, on the corresponding positions of the composite three-dimensional model, at least one of the following: the position of the affected area, the resection line of the treatment area including the affected area, the resection range to be resected along the resection line, the position of the blood vessel passing through the treatment area, the ligation position of the blood vessel, the blood flow cessation area in which blood flow is halted by ligation of the blood vessel, and the position of the resection mark placed on the treatment area by a medical professional as a guide for resecting the affected area.
- the processor controls to associate an identifier that identifies the ligation position of the blood vessel with the ligation position of the blood vessel and display it together with the synthetic three-dimensional model.
- the identifier indicates the order in which blood vessels are ligated.
- the processor performs control to distinguish between the ligation positions of blood vessels that have already been ligated and the ligation positions of blood vessels that have not yet been ligated, and to superimpose and display them at the corresponding positions of the composite three-dimensional model.
- the processor performs control to distinguish between blood vessels in the blood flow cessation area in which blood flow has been stopped by ligation and blood vessels in the blood flow cessation area in which blood flow is still present because the blood vessels have not yet been ligated, and to display them superimposed on the corresponding positions of the composite three-dimensional model.
- the processor in the information processing device according to the tenth aspect performs control to superimpose and display a video showing blood flow on the blood vessels in the blood flow cessation area in which blood flow is present.
- the processor controls to display the number of ligation positions of blood vessels that have already been ligated and the number of ligation positions of blood vessels that have not yet been ligated together with the composite three-dimensional model.
- the processor in the information processing device according to the sixth aspect performs control to update the range of the blood flow cessation area according to the ligation state of the blood vessels.
- the processor performs control to output a warning if the resection line of the treatment area and the resection execution line represented by the resection mark are separated by a predetermined distance or more.
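The patent leaves the deviation test unspecified. A minimal sketch is to sample both lines as 3D points and warn when the largest distance from the executed resection line to the planned resection line reaches a threshold; the function names and the threshold default of 5 below are illustrative assumptions.

```python
import math

def max_separation(plan_line, exec_line):
    """Largest distance from any point of the executed line to the planned line,
    with both lines approximated by lists of sampled (x, y, z) points."""
    return max(min(math.dist(p, q) for q in plan_line) for p in exec_line)

def should_warn(plan_line, exec_line, threshold=5.0):
    """True when the two lines are separated by the predetermined distance or more."""
    return max_separation(plan_line, exec_line) >= threshold

plan = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0)]
executed = [(0.0, 1.0, 0.0), (10.0, 6.0, 0.0)]
print(should_warn(plan, executed))  # True (separation of 6 >= threshold of 5)
```

A production implementation would interpolate along the line segments rather than compare only sampled vertices, but the warning condition is the same.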
- the processor of the information processing device performs control to display the composite 3D model from an angle specified by a medical professional.
- the processor performs control to composite, into the composite 3D model, new images of the treatment site captured by the imaging device as the endoscope moves, and to display the composite 3D model to which the new images have been added.
- the processor stores in a storage device historical information regarding the operation of an instrument performed by a medical professional while performing treatment on the treatment area together with the composite 3D model, and performs control to display the treatment details for the treatment area using the historical information and the composite 3D model in response to instructions from the medical professional.
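As a rough sketch of the kind of history bookkeeping described above — recording instrument operations during treatment and replaying them later — consider the following. The class name, method names, and event format are assumptions for illustration only.

```python
import json
import time

class TreatmentHistory:
    """Record instrument operations performed during treatment so the
    treatment details can be reviewed later together with the 3D model."""

    def __init__(self):
        self._events = []

    def record(self, instrument, action, position):
        self._events.append({
            "time": time.time(),
            "instrument": instrument,
            "action": action,
            "position": list(position),
        })

    def to_json(self):
        # In the device this serialized form would go to the storage unit.
        return json.dumps(self._events)

    def replay(self):
        # A real viewer would re-render each step on the composite 3D model.
        return list(self._events)

history = TreatmentHistory()
history.record("electric scalpel", "mark", (12.0, 3.5, 8.2))
history.record("forceps", "ligate", (14.1, 2.0, 7.7))
print([e["action"] for e in history.replay()])  # ['mark', 'ligate']
```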
- FIG. 1 is a diagram showing an example of the configuration of an endoscopic surgery system.
- FIG. 2 is a diagram showing an example of insertion of an endoscope.
- FIG. 3 is a diagram showing an example of an endoscopic image.
- FIG. 4 is a diagram showing an example of a three-dimensional model.
- FIG. 5 is a diagram showing an example of a composite three-dimensional model.
- FIG. 6 is a diagram showing an example of the configuration of an information processing device.
- FIG. 7 is a flowchart showing an example of the flow of a preoperative marking process.
- FIG. 8 is a diagram showing a display example of preoperative planning information.
- FIG. 9 is a flowchart showing an example of the flow of a composite 3D model generation process.
- FIG. 10 is a flowchart showing an example of the flow of intraoperative marking processing.
- FIG. 11 is a diagram showing a display example of intraoperative planning information.
- FIG. 12 is a flowchart showing an example of the flow of resection support processing.
- FIG. 13 is a diagram showing a display example of resection support information.
- preoperative stage before the endoscope is inserted into the patient's body and the procedure is performed
- intraoperative stage after the endoscope has been inserted into the patient's body
- FIG. 1 is a diagram showing an example of the configuration of an endoscopic surgery system 100.
- the endoscopic surgery system 100 includes an image capturing device 1, an information processing device 2, and an endoscope 3.
- the imaging device 1 is a device that captures images of the treatment site from outside the body in order to understand the positional relationship of the treatment site before inserting an endoscope 3 into the patient's body and removing a lesion such as a tumor (also called the "affected area").
- a CT scanning device that captures images of the treatment site using radiation
- an ultrasound imaging device that captures images of the treatment site using ultrasound are used as the imaging device 1.
- an image of the treatment site is captured using a CT scanning device.
- the imaging device 1 converts the captured images into image data.
- the treatment site refers to an area that includes a lesion such as a tumor.
- the positional relationship of the treatment site refers to the positional relationship between the treatment site and the areas inside and around the treatment site, including, for example, the size of the treatment site, the position of the treatment site, the positions of blood vessels that pass through the treatment site, and the positional relationship with other organs around the treatment site.
- the endoscope 3 transmits the captured endoscopic video 5 to the information processing device 2.
- the endoscopic video 5 may be a video or a still image, and may be monochrome or color, but in this disclosure, as an example, the endoscopic video 5 is a color video.
- the information processing device 2 in FIG. 1 generates a three-dimensional model 6 of the treatment area from the image data received from the imaging device 1, and synthesizes the endoscopic video 5 received from the endoscope 3 onto the surface of the generated three-dimensional model 6, thereby generating a composite three-dimensional model 7 in which the video of the treatment area is combined with the three-dimensional model 6.
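The patent does not describe the compositing algorithm itself. One conventional way to paint an endoscopic frame onto a 3D surface is projective texturing with a pinhole camera model, sketched below with NumPy; the function name, the camera-at-origin convention, and the intrinsic matrix `K` are all assumptions, and occlusion handling (z-buffering) is deliberately omitted.

```python
import numpy as np

def project_frame_onto_model(vertices, frame, K):
    """Assign each model vertex the color of the endoscope pixel it projects to.

    vertices -- (N, 3) vertex positions expressed in the camera frame
    frame    -- (H, W, 3) endoscopic video frame
    K        -- (3, 3) pinhole camera intrinsic matrix
    Returns an (N, 3) color array; vertices not seen by the camera stay NaN.
    """
    h, w = frame.shape[:2]
    colors = np.full((len(vertices), 3), np.nan)
    uvw = (K @ vertices.T).T                      # homogeneous pixel coordinates
    in_front = uvw[:, 2] > 0
    uv = np.zeros((len(vertices), 2))
    uv[in_front] = uvw[in_front, :2] / uvw[in_front, 2:3]
    u = uv[:, 0].astype(int)
    v = uv[:, 1].astype(int)
    visible = in_front & (u >= 0) & (u < w) & (v >= 0) & (v < h)
    colors[visible] = frame[v[visible], u[visible]]
    return colors

K = np.array([[100.0, 0.0, 50.0], [0.0, 100.0, 50.0], [0.0, 0.0, 1.0]])
frame = np.zeros((100, 100, 3))
frame[50, 50] = [1.0, 2.0, 3.0]
vertices = np.array([[0.0, 0.0, 1.0],    # projects to pixel (50, 50)
                     [0.0, 0.0, -1.0]])  # behind the camera, stays NaN
colors = project_frame_onto_model(vertices, frame, K)
```

In a real system the endoscope pose would come from tracking, so vertices would first be transformed from model coordinates into the camera frame before projection.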
- step S20 the CPU 10A generates a three-dimensional model 6 of the treatment area using the image data acquired by the processing in step S10.
- the three-dimensional model 6 is generated, for example, by using three-dimensional model generation software provided by the developer of the image capture device 1.
- the CPU 10A displays the generated three-dimensional model 6 on the screen 30.
- FIG. 9 is a flowchart showing an example of the flow of the process for generating a composite 3D model 7 executed by the information processing device 2 when the endoscope 3 is inserted into a patient by a medical professional.
- the CPU 10A of the information processing device 2 reads the control program 11 from the ROM 10B and executes the process for generating a composite 3D model 7.
- the memory unit 12 is assumed to have stored in advance image data of the treatment area of the patient captured by the image capture device 1.
- the memory unit 12 is also assumed to have stored in advance the 3D model 6 generated by the preoperative marking process shown in FIG. 7 and the preoperative planning information acquired in the preoperative marking process.
- step S130 the CPU 10A determines whether or not an end instruction has been received from the medical staff via the operation unit 13. If an end instruction has not been received, the process returns to step S120. That is, until the endoscope 3 finishes capturing the treatment site, the CPU 10A continues to composite new endoscopic video 5 of the treatment site, captured by the imaging device attached to the endoscope 3 as the endoscope 3 moves, into the composite 3D model 7, thereby generating a composite 3D model 7 to which the new endoscopic video 5 has been added.
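The S120-S130 loop described above can be sketched as follows; the four callables stand in for the device's real capture, tracking, compositing, and operation-unit interfaces, and are purely hypothetical names.

```python
def composite_until_end(get_frame, get_pose, end_requested, composite):
    """Keep compositing newly captured endoscope frames into the composite
    3D model until the medical staff issues an end instruction (step S130)."""
    frames_added = 0
    while not end_requested():
        frame = get_frame()       # new endoscopic footage (step S120)
        pose = get_pose()         # current endoscope position/orientation
        composite(frame, pose)    # add the footage at the matching surface
        frames_added += 1
    return frames_added

# Stubbed run: pretend three frames arrive before the end instruction.
state = {"target": 3, "composited": []}
added = composite_until_end(
    get_frame=lambda: "frame",
    get_pose=lambda: "pose",
    end_requested=lambda: len(state["composited"]) >= state["target"],
    composite=lambda f, p: state["composited"].append((f, p)),
)
print(added)  # 3
```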
- FIG. 10 is a flowchart showing an example of the flow of intraoperative marking processing that is executed after the composite 3D model 7 is generated.
- the CPU 10A of the information processing device 2 reads the control program 11 from the ROM 10B and executes the intraoperative marking processing.
- step S200 of FIG. 10 the CPU 10A displays the composite 3D model 7 on the screen 30.
- the medical staff refers to the endoscopic video 5 displayed on the surface of the synthetic 3D model 7 and considers whether the contents of the treatment plan formulated from the synthetic 3D model 7 are the same as those represented by the preoperative planning information. Because the synthetic 3D model 7 has the endoscopic video 5 synthesized therein, it is easier to grasp the characteristics and positional relationship of the treatment area compared to the 3D model 6 on which the endoscopic video 5 is not synthesized.
- step S220 CPU 10A determines whether an instruction to change the preoperative planning information has been received. If an instruction to change the preoperative planning information has been received, the process proceeds to step S230.
- FIG. 11 is a diagram showing an example of the display of intraoperative planning information.
- the intraoperative planning information includes information indicated by the preoperative planning information.
- the CPU 10A in response to a change in the resection line 22A, displays the resection line 22A at the preoperative labeling stage and the updated resection line 22B, each superimposed on the corresponding positions on the composite 3D model 7.
- the CPU 10A displays the affected area 24A at the preoperative labeling stage and the updated affected area 24B, each superimposed on the corresponding positions on the composite 3D model 7.
- step S240 of FIG. 10 the CPU 10A determines whether or not a viewpoint change instruction to change the viewpoint from which the medical professional views the composite 3D model 7 has been received. If a viewpoint change instruction has been received, the process proceeds to step S250.
- step S250 the CPU 10A changes the viewpoint from which the composite 3D model 7 is viewed, and the process then proceeds to step S260.
- if it is determined in step S240 that a viewpoint change instruction has not been received, the process proceeds to step S260 without executing the process of step S250.
- step S260 CPU 10A determines whether the medical professional has started to resect the diseased area 24, i.e., whether the operation has progressed to the resection stage.
- the medical professional may mark the operation site using an electric scalpel to mark the position where the diseased area 24 will be resected.
- the mark that marks the position where the diseased area 24 will be resected is referred to as a "resection mark 27". Therefore, the CPU 10A may determine that the operation has progressed from the intraoperative marking stage to the resection stage when a resection instrument, such as an electric scalpel attached to the endoscope 3, has been operated.
- in the following, it is assumed that the medical professional marks the operation site with resection marks 27.
- if the operation has not progressed to the resection stage, the process returns to step S220 and the intraoperative labeling stage processing continues.
- the intraoperative labeling process shown in FIG. 10 ends.
- FIG. 12 is a flowchart showing an example of the flow of the resection support process executed after the intraoperative marking process shown in FIG. 10 is completed.
- the CPU 10A of the information processing device 2 reads the control program 11 from the ROM 10B and executes the resection support process.
- the resection support process is executed in parallel with the process of generating the composite 3D model 7 shown in FIG. 9. Therefore, if the endoscope 3 moves during the resection support process, the range of the endoscopic video 5 that is composited into the composite 3D model 7 may expand.
- step S300 CPU 10A displays on screen 30 the intraoperative planning information finally formulated in the intraoperative labeling stage, superimposed on the corresponding position of the synthetic 3D model 7. Furthermore, if the acquired endoscopic video 5 includes a resection mark 27, CPU 10A displays the resection mark 27 at the corresponding position of the synthetic 3D model 7.
- the intraoperative planning information finally formulated in the intraoperative labeling stage, the resection mark 27, and the synthetic 3D model 7 are examples of resection support information.
- the resection support information is an example of related information associated with the resection stage.
- FIG. 13 is a diagram showing an example of the display of resection support information.
- the virtual lines (not shown) connecting each of the resection marks 27 in FIG. 13 represent the resection execution lines along which the medical staff will actually perform the resection.
- the CPU 10A associates an identifier for identifying the ligation position 21B with the ligation position 21B of each blood vessel 20 and displays them together with the composite three-dimensional model 7.
- the identifier "P1" is associated with the ligation position 21B of the first blood vessel 20
- the identifier "P2" is associated with the ligation position 21B of the second blood vessel 20.
- the numbers included in the identifiers indicate the order in which the blood vessels 20 are ligated. That is, in the display example shown in FIG. 13, the blood vessel 20 at the ligation position 21B associated with the identifier "P1" is ligated first, and the blood vessel 20 at the ligation position 21B associated with the identifier "P2" is ligated next. Therefore, medical personnel can check the ligation order of the blood vessels 20 by referring to the identifier associated with the ligation position 21B of each blood vessel 20.
- the identifiers of the ligation positions 21B are displayed based on the ligation order instructed by the medical staff during the intraoperative labeling stage.
- the CPU 10A may display the ligation order of the blood vessels 20 using a notation other than numbers. Specifically, the ligation order may be indicated by letters assigned in alphabetical order.
- before resecting the diseased area 24B, the medical professional ligates the blood vessels 20 while referring to the resection support information, and depending on the situation, he or she may want to confirm whether or not a blood vessel 20 has been ligated. Therefore, when displaying the ligation positions 21B of the blood vessels 20, the CPU 10A distinguishes between the ligation positions 21B of blood vessels 20 that have already been ligated and the ligation positions 21B of blood vessels 20 that have not yet been ligated, and displays them superimposed on the corresponding positions of the composite three-dimensional model 7.
- the blood vessel 20 at the ligation position 21B associated with the identifier "P1" has already been ligated, and the blood vessel 20 at the ligation position 21B associated with the identifier "P2" has not yet been ligated.
- the CPU 10A displays the ligation position 21B associated with the identifier "P1" and the ligation position 21B associated with the identifier "P2" in different display forms.
- the CPU 10A displays the ligation position 21B that has already been ligated as a straight line crossing the blood vessel 20, and displays the ligation position 21B that has not yet been ligated as an "x". Note that there are no restrictions on the display form of the ligation position 21B, as long as it is possible to distinguish between the ligation position 21B that has already been ligated and the ligation position 21B that has not yet been ligated.
- the CPU 10A also displays on the screen 30 the number of ligation positions 21B that have been ligated and the number of ligation positions 21B that have not yet been ligated together with the composite 3D model 7.
- the display "Remaining: 1" indicates the number of ligation positions 21B that have not yet been ligated
- the display "Completed: 1" indicates the number of ligation positions 21B that have been ligated.
- the CPU 10A can determine from the endoscopic video 5 at which ligation position 21B a blood vessel 20 has been ligated, and can update the display form of the ligation positions 21B and the numbers of ligated and unligated positions accordingly.
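The ligated / not-yet-ligated bookkeeping behind the on-screen "Completed" and "Remaining" counters could be kept in a small structure like the one below; the class name and method names are illustrative assumptions, not part of the disclosure.

```python
class LigationTracker:
    """Track planned ligation positions (identified as P1, P2, ...) and how
    many have been completed, mirroring the on-screen counters."""

    def __init__(self, identifiers):
        # identifiers are given in the planned ligation order, e.g. ["P1", "P2"]
        self._ligated = {ident: False for ident in identifiers}

    def mark_ligated(self, identifier):
        self._ligated[identifier] = True

    def counts(self):
        completed = sum(self._ligated.values())
        return {"completed": completed, "remaining": len(self._ligated) - completed}

tracker = LigationTracker(["P1", "P2"])
tracker.mark_ligated("P1")
print(tracker.counts())  # {'completed': 1, 'remaining': 1}
```

This matches the display example in which "Completed: 1" and "Remaining: 1" are shown after the first of two planned ligations.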
- the CPU 10A may also distinguish between the blood vessel 20A in which blood flow has stopped due to ligation and the blood vessel 20B in which blood flow exists because the blood vessel has not yet been ligated, and superimpose and display them at the corresponding positions on the composite three-dimensional model 7.
- the blood vessel 20A located downstream, in the blood flow direction, of the ligation position 21B associated with the identifier "P1" has been ligated at that ligation position 21B.
- the blood vessel 20B located downstream, in the blood flow direction, of the ligation position 21B associated with the identifier "P2" has not yet been ligated at that ligation position 21B.
- the CPU 10A distinguishes between the blood vessel 20A and the blood vessel 20B, and superimposes and displays them at the corresponding positions on the composite three-dimensional model 7.
- the display color of the blood vessel 20 may be changed.
- CPU 10A may display the blood vessels 20A and 20B in a distinguishable manner by superimposing a video showing the flow of blood on the blood vessel 20B in which blood flow is present.
- the video showing the flow of blood may be an image showing the actual flow of blood or an animation.
- Such a video showing the flow of blood may be stored in advance in storage unit 12, for example.
Description
本開示は、情報処理装置、及び制御プログラムに関する。 This disclosure relates to an information processing device and a control program.
患者の負担を軽減するため、内視鏡を用いた施術が広く行われるようになった。内視鏡手術によって腫瘍等を切除する場合、施術箇所を正しく把握することは、例えば施術時間を削減するうえで重要な要素である。 In order to reduce the burden on patients, procedures using endoscopes have become widespread. When removing tumors or other lesions using endoscopic surgery, correctly identifying the treatment site is an important factor in, for example, reducing the procedure time.
特開2015-107268号には、管腔臓器の三次元画像に対して、内視鏡の経路、内視鏡により取得された内視鏡画像、及び内視鏡先端に設けられた超音波プローブの観察位置に基づいて抽出された病変部を重畳表示する内視鏡観察支援装置が開示されている。 JP 2015-107268 A discloses an endoscopic observation support device that displays a three-dimensional image of a tubular organ by superimposing a lesion extracted based on the path of the endoscope, the endoscopic image acquired by the endoscope, and the observation position of an ultrasound probe attached to the tip of the endoscope.
特開2006-61274号には、ボリュームレンダリングによって仮想内視鏡画像を生成し、仮想内視鏡画像と内視鏡実画像とを重畳表示する内視鏡システムが開示されている。 JP 2006-61274 A discloses an endoscope system that generates a virtual endoscope image by volume rendering and displays the virtual endoscope image superimposed on an actual endoscope image.
特開2013-202313号には、三次元画像上にリアルタイムで術具の動きを表示し、術具による臓器の切除状況を表示し、また、術具の位置や速度に応じた警告を行う手術支援装置が開示されている。 JP Patent Publication No. 2013-202313 discloses a surgery support device that displays the movement of surgical tools on a three-dimensional image in real time, displays the status of organ resection by the surgical tools, and issues warnings according to the position and speed of the surgical tools.
施術箇所を把握するため、例えばCT(Computed Tomography)スキャン装置のような放射線撮影装置によって撮影した画像データから抽出した血管及び腫瘍の位置情報を、内視鏡によって撮影された映像に合成する手法がとられる。しかしながら、内視鏡によって撮影された映像は二次元映像であるため、映像に奥行き感が感じられない。したがって、施術箇所の距離感を把握することが困難になる。 In order to determine the treatment area, a method is used in which the positional information of blood vessels and tumors extracted from image data taken by a radiological imaging device such as a CT (Computed Tomography) scanner is combined with the images taken by the endoscope. However, because the images taken by the endoscope are two-dimensional, there is no sense of depth in the images. This makes it difficult to grasp the sense of distance to the treatment area.
本開示は、上記事情を考慮して成されたものであり、施術箇所を撮影した画像データによって示される施術箇所の位置関係と、内視鏡によって撮影された映像によって示される施術箇所の位置関係との差異を立体的に把握することができる情報処理装置、及び制御プログラムを提供することを目的とする。 The present disclosure has been made in consideration of the above circumstances, and aims to provide an information processing device and control program that can three-dimensionally grasp the difference between the positional relationship of the treatment site shown by image data of the treatment site and the positional relationship of the treatment site shown by an image captured by an endoscope.
第1態様に係る情報処理装置はプロセッサを備え、前記プロセッサが、予め撮影された施術箇所の画像データから生成された前記施術箇所の三次元モデルに、内視鏡に取り付けられた撮影装置によって撮影された前記施術箇所の映像を合成し、前記施術箇所の映像が合成された合成三次元モデルを表示する制御を行う。 The information processing device according to the first aspect includes a processor, and the processor controls the synthesis of a three-dimensional model of the treatment area generated from image data of the treatment area captured in advance with an image of the treatment area captured by an imaging device attached to an endoscope, and the display of the synthesized three-dimensional model with the image of the treatment area combined.
第2態様に係る情報処理装置は、第1態様に係る情報処理装置において、前記プロセッサは、前記施術箇所に対する施術の段階に応じて、前記段階の各々と関連付けられた関連情報を前記三次元モデル、又は前記合成三次元モデルと共に表示する制御を行う。 In the information processing device according to the second aspect, in the information processing device according to the first aspect, the processor controls the display of related information associated with each stage of treatment for the treatment area together with the three-dimensional model or the composite three-dimensional model, depending on the stage of treatment for the treatment area.
第3態様に係る情報処理装置は、第2態様に係る情報処理装置において、前記施術箇所に対する施術計画を策定する術前標識段階の場合、前記プロセッサは、患部の位置、予め計画した前記患部を含む前記施術箇所の切除ライン、前記切除ラインに沿って切除される切除範囲、前記施術箇所を通る血管の位置、予め計画した血管の結紮位置、及び血管の結紮によって血流が停止する血流停止領域の少なくとも1つを、前記三次元モデルの対応する位置に重畳して表示する制御を行う。 In the information processing device according to the third aspect, in the information processing device according to the second aspect, in the preoperative marking stage in which a treatment plan for the treatment area is formulated, the processor controls to superimpose and display at least one of the following on the corresponding position of the three-dimensional model: the position of the affected area, the preplanned resection line of the treatment area including the affected area, the resection range to be resected along the resection line, the position of the blood vessel passing through the treatment area, the preplanned ligation position of the blood vessel, and the blood flow cessation area where blood flow is halted by ligation of the blood vessel.
第4態様に係る情報処理装置は、第3態様に係る情報処理装置において、前記プロセッサは、前記切除ラインに沿って前記患部を切除した場合、前記患部が含まれる臓器の残存体積を前記三次元モデルと共に表示する制御を行う。 In the information processing device according to the fourth aspect, the processor in the information processing device according to the third aspect performs control to display the remaining volume of the organ containing the diseased area together with the three-dimensional model when the diseased area is resected along the resection line.
第5態様に係る情報処理装置は、第2態様に係る情報処理装置において、前記施術箇所に対する施術計画を策定する術前標識段階において決定した術前計画情報を、前記合成三次元モデルに合成された前記施術箇所の映像から得られる前記施術箇所の性状に基づき再確認する術中標識段階の場合、前記プロセッサは、前記術前標識段階において決定した患部を含む前記施術箇所の切除ラインの変更に伴い、前記術前標識段階における前記切除ライン及び更新後の前記切除ラインをそれぞれ前記合成三次元モデルの対応する位置に重畳して表示すると共に、前記術前標識段階において決定した血管の結紮位置を更新し、更新後の血管の結紮位置を前記合成三次元モデルの対応する位置に重畳して表示する制御を行う。 In the information processing device according to the fifth aspect, in the information processing device according to the second aspect, in the intraoperative marking stage in which the preoperative planning information determined in the preoperative marking stage in which a treatment plan for the treatment site is formulated is reconfirmed based on the characteristics of the treatment site obtained from the image of the treatment site synthesized into the synthetic three-dimensional model, the processor performs control such that, in response to a change in the resection line for the treatment site including the diseased area determined in the preoperative marking stage, the resection line in the preoperative marking stage and the updated resection line are each superimposed and displayed on the corresponding positions of the synthetic three-dimensional model, and the ligation position of the blood vessel determined in the preoperative marking stage is updated, and the updated ligation position of the blood vessel is superimposed and displayed on the corresponding position of the synthetic three-dimensional model.
第6態様に係る情報処理装置は、第2態様に係る情報処理装置において、前記施術箇所に含まれる患部を切除する切除段階の場合、前記プロセッサは、患部の位置、前記患部を含む前記施術箇所の切除ライン、前記切除ラインに沿って切除される切除範囲、前記施術箇所を通る血管の位置、血管の結紮位置、血管の結紮によって血流が停止する血流停止領域、及び医療従事者が前記患部を切除する目印として前記施術箇所に付けた切除マークの位置の少なくとも1つを、前記合成三次元モデルの対応する位置に重畳して表示する制御を行う。 In the information processing device according to the sixth aspect, in the information processing device according to the second aspect, in the case of a resection stage in which an affected area included in the treatment area is resected, the processor performs control to superimpose and display, on the corresponding positions of the composite three-dimensional model, at least one of the following: the position of the affected area, the resection line of the treatment area including the affected area, the resection range to be resected along the resection line, the position of the blood vessel passing through the treatment area, the ligation position of the blood vessel, the blood flow cessation area in which blood flow is halted by ligation of the blood vessel, and the position of the resection mark placed on the treatment area by a medical professional as a guide for resecting the affected area.
第7態様に係る情報処理装置は、第6態様に係る情報処理装置において、前記プロセッサは、血管の結紮位置を識別する識別子を、血管の結紮位置と対応付けて前記合成三次元モデルと共に表示する制御を行う。 In the information processing device according to the seventh aspect, in the information processing device according to the sixth aspect, the processor performs control to display an identifier identifying the ligation position of a blood vessel, in association with that ligation position, together with the composite three-dimensional model.
第8態様に係る情報処理装置は、第7態様に係る情報処理装置において、前記識別子は、血管の結紮順序を示す。 In the information processing device according to the eighth aspect, in the information processing device according to the seventh aspect, the identifier indicates the order in which the blood vessels are ligated.
第9態様に係る情報処理装置は、第6態様に係る情報処理装置において、前記プロセッサは、結紮済みの血管の結紮位置と、まだ結紮されていない血管の結紮位置をそれぞれ区別して前記合成三次元モデルの対応する位置に重畳して表示する制御を行う。 In the information processing device according to the ninth aspect, in the information processing device according to the sixth aspect, the processor performs control to distinguish between the ligation positions of blood vessels that have already been ligated and the ligation positions of blood vessels that have not yet been ligated, and to superimpose and display them at the corresponding positions of the composite three-dimensional model.
第10態様に係る情報処理装置は、第9態様に係る情報処理装置において、前記プロセッサは、結紮により血流が停止した前記血流停止領域における血管と、まだ結紮されていないために血流が存在する前記血流停止領域における血管をそれぞれ区別して前記合成三次元モデルの対応する位置に重畳して表示する制御を行う。 In the information processing device according to the tenth aspect, in the information processing device according to the ninth aspect, the processor performs control to distinguish between blood vessels in the blood flow stop region where blood flow has been stopped by ligation and blood vessels in the blood flow stop region where blood flow is present because the blood vessels have not yet been ligated, and to display them superimposed on the corresponding positions of the composite three-dimensional model.
第11態様に係る情報処理装置は、第10態様に係る情報処理装置において、前記プロセッサは、血流が存在する前記血流停止領域における血管に、血液の流れを示す動画を重畳して表示する制御を行う。 In the information processing device according to the eleventh aspect, the processor in the information processing device according to the tenth aspect performs control to superimpose and display a video showing blood flow on the blood vessels in the blood flow stop region where blood flow is present.
第12態様に係る情報処理装置は、第9態様に係る情報処理装置において、前記プロセッサは、結紮済みの血管の結紮位置の数と、まだ結紮されていない血管の結紮位置の数を前記合成三次元モデルと共に表示する制御を行う。 In the information processing device according to the twelfth aspect, in the information processing device according to the ninth aspect, the processor performs control to display the number of ligation positions of blood vessels that have already been ligated and the number of ligation positions of blood vessels that have not yet been ligated, together with the composite three-dimensional model.
第13態様に係る情報処理装置は、第6態様に係る情報処理装置において、前記プロセッサは、血管の結紮状態に伴い、前記血流停止領域の範囲を更新する制御を行う。 In the information processing device according to the thirteenth aspect, in the information processing device according to the sixth aspect, the processor performs control to update the range of the blood flow stop region according to the ligation state of the blood vessels.
第14態様に係る情報処理装置は、第6態様に係る情報処理装置において、前記プロセッサは、前記施術箇所の切除ラインと前記切除マークによって表される切除実施ラインが予め定めた距離以上離れている場合、警告を出力する制御を行う。 In the information processing device according to the fourteenth aspect, in the information processing device according to the sixth aspect, the processor performs control to output a warning when the resection line of the treatment site and the resection execution line represented by the resection marks are separated by a predetermined distance or more.
第15態様に係る情報処理装置は、第6態様に係る情報処理装置において、前記プロセッサは、医療従事者からの指示に従い、前記合成三次元モデルの対応する位置に重畳して表示される前記患部、前記切除範囲、前記施術箇所を含む臓器、及び前記施術箇所を通る血管の表示形態をそれぞれ変化させる制御を行う。 In the information processing device according to the fifteenth aspect, the processor of the information processing device according to the sixth aspect performs control to change the display form of the affected area, the resection range, the organ including the treatment site, and the blood vessels passing through the treatment site, which are displayed superimposed on the corresponding positions of the composite three-dimensional model, in accordance with instructions from a medical professional.
第16態様に係る情報処理装置は、第1態様から第15態様の何れか1つの態様に係る情報処理装置において、前記プロセッサは、医療従事者が指示した角度から前記合成三次元モデルを表示する制御を行う。 In the information processing device according to the sixteenth aspect, in the information processing device according to any one of the first to fifteenth aspects, the processor performs control to display the composite three-dimensional model from an angle specified by a medical professional.
第17態様に係る情報処理装置は、第16態様に係る情報処理装置において、前記プロセッサは、前記内視鏡の移動に伴い前記撮影装置によって撮影された前記施術箇所における新たな映像を、前記内視鏡の移動と共に前記合成三次元モデルに合成しながら、前記新たな映像が追加された前記合成三次元モデルを表示する制御を行う。 In the information processing device according to the seventeenth aspect, in the information processing device according to the sixteenth aspect, the processor performs control to composite new images of the treatment site, captured by the imaging device as the endoscope moves, into the composite three-dimensional model along with the movement of the endoscope, while displaying the composite three-dimensional model to which the new images have been added.
第18態様に係る情報処理装置は、第17態様に係る情報処理装置において、前記プロセッサは、医療従事者が前記施術箇所の施術中に行った器具の操作に関する履歴情報を前記合成三次元モデルと共に記憶装置に記憶し、医療従事者からの指示に応じて、前記施術箇所に対する施術内容を前記履歴情報と前記合成三次元モデルを用いて表示する制御を行う。 In the information processing device according to the eighteenth aspect, in the information processing device according to the seventeenth aspect, the processor stores, in a storage device, history information on the instrument operations performed by a medical professional during treatment of the treatment site together with the composite three-dimensional model, and performs control to display the details of the treatment performed on the treatment site using the history information and the composite three-dimensional model in response to an instruction from the medical professional.
第19態様に係る制御プログラムは、予め撮影された施術箇所の画像データから生成された前記施術箇所の三次元モデルに、内視鏡に取り付けられた撮影装置によって撮影された前記施術箇所の映像を合成し、前記施術箇所の映像が合成された合成三次元モデルを表示する制御をコンピュータに実行させるためのプログラムである。 The control program according to the nineteenth aspect is a program for causing a computer to execute control to composite an image of a treatment site, captured by an imaging device attached to an endoscope, onto a three-dimensional model of the treatment site generated from previously captured image data of the treatment site, and to display a composite three-dimensional model into which the image of the treatment site has been composited.
本開示によれば、施術箇所を撮影した画像データによって示される施術箇所の位置関係と、内視鏡によって撮影された映像によって示される施術箇所の位置関係との差異を立体的に把握することができる。 According to the present disclosure, it is possible to three-dimensionally grasp the difference between the positional relationship of the treatment site shown by image data capturing the treatment site and the positional relationship of the treatment site shown by an image captured by an endoscope.
以下、本実施の形態について図面を参照しながら説明する。なお、同じ構成要素及び同じ処理には全図面を通して同じ符号を付与し、重複する説明を省略する。また、図面の寸法比率は、説明の都合上誇張されており、実際の比率とは異なる場合がある。 The present embodiment will now be described with reference to the drawings. Note that the same components and processes are given the same reference numerals throughout the drawings, and duplicate explanations will be omitted. Also, the dimensional ratios in the drawings have been exaggerated for the sake of explanation, and may differ from the actual ratios.
また、患者の体内に内視鏡を挿入して施術を行う前の段階を「術前」と表し、患者の体内に内視鏡を挿入した以降の段階を「術中」と表す。 Furthermore, the stage before the endoscope is inserted into the patient's body and the procedure is performed is referred to as "preoperative," and the stage after the endoscope has been inserted into the patient's body is referred to as "intraoperative."
図1は、内視鏡施術システム100の構成例を示す図である。内視鏡施術システム100は、一例として画像撮影装置1、情報処理装置2、及び内視鏡3を含む。 FIG. 1 is a diagram showing an example of the configuration of an endoscopic surgery system 100. As an example, the endoscopic surgery system 100 includes an image capturing device 1, an information processing device 2, and an endoscope 3.
画像撮影装置1は、患者の体内に内視鏡3を挿入して、腫瘍等の病変部(「患部」ともいう)を切除する前に、施術箇所の位置関係を把握するため、施術箇所の画像を体外から撮影する装置である。具体的には、放射線を用いて施術箇所の画像を撮影するCTスキャン装置、及び超音波を用いて施術箇所の画像を撮影する超音波撮影装置が画像撮影装置1として用いられる。本開示では一例として、CTスキャン装置を用いて施術箇所の画像を撮影する。画像撮影装置1は、撮影した画像を画像データに変換する。 The imaging device 1 is a device that captures images of the treatment site from outside the body in order to understand the positional relationship of the treatment site before inserting an endoscope 3 into the patient's body and removing a lesion such as a tumor (also called the "affected area"). Specifically, a CT scanning device that captures images of the treatment site using radiation, and an ultrasound imaging device that captures images of the treatment site using ultrasound are used as the imaging device 1. In this disclosure, as an example, an image of the treatment site is captured using a CT scanning device. The imaging device 1 converts the captured images into image data.
本開示において施術箇所とは、腫瘍等の病変部を含む領域を表す。また、施術箇所の位置関係とは、施術箇所と、施術箇所内部の部位及び施術箇所周辺の部位との位置関係のことであり、例えば施術箇所の大きさ、施術箇所の位置、施術箇所を通る血管の位置、施術箇所周辺にある他の臓器との位置関係を含む。 In this disclosure, the treatment site refers to an area that includes a lesion such as a tumor. In addition, the positional relationship of the treatment site refers to the positional relationship between the treatment site and the areas inside and around the treatment site, including, for example, the size of the treatment site, the position of the treatment site, the positions of blood vessels that pass through the treatment site, and the positional relationship with other organs around the treatment site.
画像撮影装置1によって撮影された施術箇所の画像データは、情報処理装置2に通知される。 The image data of the treatment area captured by the image capture device 1 is sent to the information processing device 2.
内視鏡3は、鼻、口、肛門、及び腹部や胸部に開けた孔から体内に挿入され、内視鏡3に取り付けられた撮影装置(図示省略)を用いて、体内から臓器4の映像を撮影する医療器具である。図2は、腹部に開けた孔から臓器4の映像を撮影する内視鏡3の挿入例を示す図である。また、図3は、内視鏡3によって撮影された映像の一例を示す図である。以降では、内視鏡3によって撮影された映像を内視鏡映像5という。なお、内視鏡3の先端には、組織採取や組織切除を行うための器具の一例である処置具を取り付けることができる。 The endoscope 3 is a medical instrument that is inserted into the body through the nose, mouth, or anus, or through a hole made in the abdomen or chest, and captures images of an organ 4 from inside the body using an imaging device (not shown) attached to the endoscope 3. FIG. 2 is a diagram showing an example of insertion of the endoscope 3, which captures images of the organ 4 through a hole made in the abdomen. FIG. 3 is a diagram showing an example of an image captured by the endoscope 3. Hereinafter, the image captured by the endoscope 3 is referred to as the endoscopic video 5. A treatment tool, which is an example of an instrument for performing tissue sampling or tissue resection, can be attached to the tip of the endoscope 3.
内視鏡3は、撮影した内視鏡映像5を情報処理装置2に通知する。なお、内視鏡映像5は動画であっても静止画であってもよく、また、モノクロであってもカラーであってもよいが、本開示では一例として、内視鏡映像5はカラー動画とする。 The endoscope 3 notifies the information processing device 2 of the captured endoscopic video 5. Note that the endoscopic video 5 may be a video or a still image, and may be monochrome or color, but in this disclosure, as an example, the endoscopic video 5 is a color video.
図1の情報処理装置2は、画像撮影装置1から受け付けた画像データを用いて施術箇所の三次元モデル6を生成し、生成した三次元モデル6の表面に内視鏡3から受け付けた内視鏡映像5を合成することにより、三次元モデル6に施術箇所の映像が合成された合成三次元モデル7を生成する。 The information processing device 2 in FIG. 1 generates a three-dimensional model 6 of the treatment area using image data received from the image capture device 1, and synthesizes the endoscopic image 5 received from the endoscope 3 onto the surface of the generated three-dimensional model 6 to generate a composite three-dimensional model 7 in which the image of the treatment area is synthesized onto the three-dimensional model 6.
図4は、施術箇所の三次元モデル6の一例を示す図である。図5は、施術箇所の合成三次元モデル7の一例を示す図である。図5に示すように、情報処理装置2は、画像データから生成された三次元モデル6の表面に内視鏡映像5を合成する。内視鏡映像5が存在しない三次元モデル6の表面には内視鏡映像5は合成されず、三次元モデル6によって表される施術箇所の表面が表示される。 FIG. 4 is a diagram showing an example of a three-dimensional model 6 of the treatment area. FIG. 5 is a diagram showing an example of a composite three-dimensional model 7 of the treatment area. As shown in FIG. 5, the information processing device 2 composites the endoscopic video 5 onto the surface of the three-dimensional model 6 generated from the image data. On surface regions of the three-dimensional model 6 for which no endoscopic video 5 exists, the video is not composited, and the surface of the treatment area represented by the three-dimensional model 6 itself is displayed.
図4に示す三次元モデル6、及び図5に示す合成三次元モデル7には、血管等の施術箇所内部を通る部位を表示していないが、画像撮影装置1によって施術箇所内部の構造も撮影されるため、三次元モデル6及び合成三次元モデル7では施術箇所内部を通る部位の表示も可能である。 The three-dimensional model 6 shown in FIG. 4 and the composite three-dimensional model 7 shown in FIG. 5 do not show the parts that pass through the inside of the treatment area, such as blood vessels, but because the image capturing device 1 also captures the structure inside the treatment area, it is possible for the three-dimensional model 6 and composite three-dimensional model 7 to show the parts that pass through the inside of the treatment area.
次に、情報処理装置2の構成例について説明する。図6は、情報処理装置2の構成例を示す図である。 Next, we will explain an example of the configuration of the information processing device 2. Figure 6 is a diagram showing an example of the configuration of the information processing device 2.
図6に示すように、情報処理装置2は、制御部10、記憶部12、操作部13、表示部14、及びI/F(Interface)部15を含む。制御部10、記憶部12、操作部13、表示部14、及びI/F部15はバス16を介して接続され、相互に各種情報の授受を行う。 As shown in FIG. 6, the information processing device 2 includes a control unit 10, a memory unit 12, an operation unit 13, a display unit 14, and an I/F (Interface) unit 15. The control unit 10, the memory unit 12, the operation unit 13, the display unit 14, and the I/F unit 15 are connected via a bus 16, and exchange various types of information with each other.
制御部10は、医療従事者の指示に基づいて情報処理装置2の動作を制御する。制御部10は、プロセッサの一例であるCPU(Central Processing Unit)10A、ROM(Read Only Memory)10B、及びRAM(Random Access Memory)10Cを備える。 The control unit 10 controls the operation of the information processing device 2 based on instructions from a medical professional. The control unit 10 includes a CPU (Central Processing Unit) 10A, which is an example of a processor, a ROM (Read Only Memory) 10B, and a RAM (Random Access Memory) 10C.
ROM10Bには、CPU10Aが合成三次元モデル7を生成し、かつ、生成した合成三次元モデル7を表示する制御を行うために読み込む制御プログラム11を含む各種プログラム、及びCPU10Aが情報処理装置2の動作を制御するうえで参照する各種パラメータが予め記憶されている。RAM10Cは、CPU10Aの一時的な作業領域として使用される。 The ROM 10B stores in advance various programs, including a control program 11 that the CPU 10A reads in order to generate the composite three-dimensional model 7 and to control the display of the generated composite three-dimensional model 7, as well as various parameters that the CPU 10A refers to when controlling the operation of the information processing device 2. The RAM 10C is used as a temporary work area for the CPU 10A.
記憶部12は、例えば画像データ、内視鏡映像5、制御部10によって画像データから生成された施術箇所の三次元モデル6、及び三次元モデル6と内視鏡映像5とを用いて制御部10によって生成された合成三次元モデル7を記憶する。記憶部12は、記憶部12に供給される電力が遮断されても記憶した情報が維持される記憶装置の一例であり、例えばSSD(Solid State Drive)等の半導体メモリが用いられるがハードディスクを用いてもよい。 The memory unit 12 stores, for example, image data, an endoscopic video 5, a three-dimensional model 6 of the treatment area generated by the control unit 10 from the image data, and a composite three-dimensional model 7 generated by the control unit 10 using the three-dimensional model 6 and the endoscopic video 5. The memory unit 12 is an example of a storage device in which stored information is maintained even if the power supplied to the memory unit 12 is cut off, and for example, a semiconductor memory such as an SSD (Solid State Drive) is used, but a hard disk may also be used.
操作部13は、情報処理装置2に対する指示、及び情報処理装置2が動作するうえで参照する各種パラメータ等を医療従事者が入力するために用いられる。操作部13における操作形態に制約はなく、例えばキーボード、タッチパネル、タッチペン、及びマウス等による操作の受付が可能である。 The operation unit 13 is used by medical personnel to input instructions to the information processing device 2 and various parameters referenced when the information processing device 2 operates. There are no restrictions on the type of operation on the operation unit 13, and operations can be accepted, for example, from a keyboard, a touch panel, a touch pen, a mouse, etc.
表示部14は、例えば合成三次元モデル7のように、内視鏡3を用いた施術に関連する情報を画面30(例えば図8参照)に表示する。 The display unit 14 displays information related to the treatment using the endoscope 3, such as the composite three-dimensional model 7, on the screen 30 (see, for example, FIG. 8).
I/F部15は通信機能を備え、無線通信又は有線通信により、例えばLAN(Local Area Network)等の通信回線(図示省略)に接続された外部装置の一例である画像撮影装置1から画像データを受信する。また、I/F部15は、映像入力端子を備え、内視鏡3から内視鏡映像5を取得する。内視鏡3が通信回線を通じて内視鏡映像5を送信する場合には、I/F部15の映像入力端子は不要となる。 The I/F unit 15 has a communication function and receives image data from the image capturing device 1, which is an example of an external device connected to a communication line (not shown) such as a LAN (Local Area Network), via wireless or wired communication. The I/F unit 15 also has a video input terminal and acquires the endoscopic video 5 from the endoscope 3. When the endoscope 3 transmits the endoscopic video 5 via the communication line, the video input terminal of the I/F unit 15 is not necessary.
次に、合成三次元モデル7を生成する情報処理装置2の処理について詳細に説明する。 Next, the process of the information processing device 2 that generates the composite 3D model 7 will be described in detail.
図7は、操作部13を通じた医療従事者による操作によって三次元モデル6の生成を開始する三次元モデル生成指示を受け付けた場合に、情報処理装置2によって実行される術前標識処理の流れの一例を示すフローチャートである。情報処理装置2のCPU10Aは、ROM10Bから制御プログラム11を読み込んで術前標識処理を実行する。なお、記憶部12には、画像撮影装置1によって撮影された患者の施術箇所における画像データが予め記憶されているものとする。 FIG. 7 is a flowchart showing an example of the flow of preoperative marking processing executed by the information processing device 2 when a three-dimensional model generation instruction to start generating a three-dimensional model 6 is received by a medical professional operating the operation unit 13. The CPU 10A of the information processing device 2 reads the control program 11 from the ROM 10B and executes the preoperative marking processing. It is assumed that the memory unit 12 has stored in advance image data of the patient's treatment site captured by the image capture device 1.
患者に対して内視鏡3を用いた施術を行う場合、施術には複数の段階が存在する。内視鏡3を患者に挿入する前に、画像撮影装置1によって撮影された患者の画像データから、施術箇所に対する施術計画を策定する段階を「術前標識段階」という。施術箇所を表す三次元モデル6の生成は、内視鏡3を患者に挿入する前に行われることから、図7に示す術前標識処理は術前標識段階における処理の一例である。 When a procedure using the endoscope 3 is performed on a patient, the procedure involves multiple stages. The stage in which a treatment plan for the treatment site is formulated from image data of the patient captured by the image capture device 1, before the endoscope 3 is inserted into the patient, is called the "preoperative marking stage." Since the three-dimensional model 6 representing the treatment site is generated before the endoscope 3 is inserted into the patient, the preoperative marking process shown in FIG. 7 is an example of processing in the preoperative marking stage.
まず、ステップS10において、CPU10Aは、記憶部12から患者の施術箇所における画像データを取得する。 First, in step S10, the CPU 10A acquires image data of the patient's treatment area from the memory unit 12.
ステップS20において、CPU10Aは、ステップS10の処理によって取得した画像データを用いて、施術箇所の三次元モデル6を生成する。三次元モデル6の生成には、例えば画像撮影装置1の開発企業が提供する三次元モデル生成ソフトウェアを利用する等の手法が用いられる。CPU10Aは、生成した三次元モデル6を画面30に表示する。 In step S20, the CPU 10A generates a three-dimensional model 6 of the treatment area using the image data acquired by the processing in step S10. The three-dimensional model 6 is generated, for example, by using three-dimensional model generation software provided by the developer of the image capture device 1. The CPU 10A displays the generated three-dimensional model 6 on the screen 30.
医療従事者は、画面30に表示された三次元モデル6を見ながら、患部の大きさや血管の位置を確認し、例えば患部を取り除くためには何れの血管をどの位置で結紮してどのように患部を切除すればよいのかといった施術計画を、施術を行う前に検討する。医療従事者は操作部13を通じて、検討した施術計画の内容を術前計画情報としてCPU10Aに通知する。 While looking at the three-dimensional model 6 displayed on the screen 30, the medical staff confirms the size of the affected area and the position of the blood vessels, and considers a treatment plan before performing the treatment, such as which blood vessels should be ligated and where to ligate them to remove the affected area, and how to excise the affected area. The medical staff notifies the CPU 10A of the contents of the treatment plan they have considered via the operation unit 13 as preoperative planning information.
術前計画情報には、患部の位置、患部を含む施術箇所の切除ライン、計画した切除ラインに沿って切除される切除範囲、施術箇所を通る血管の位置、計画した血管の結紮位置、及び血管の結紮によって血流が停止する血流停止領域の少なくとも1つが含まれる。術前計画情報は、術前標識段階と関連付けられた関連情報の一例である。 The preoperative planning information includes at least one of the following: the position of the affected area, a resection line at the treatment site including the affected area, the resection range to be resected along the planned resection line, the positions of blood vessels passing through the treatment site, the planned ligation positions of the blood vessels, and the blood flow stop region in which blood flow will be stopped by ligating the blood vessels. The preoperative planning information is an example of related information associated with the preoperative marking stage.
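Gathered into one record, the planning items above might look like the following Python sketch; every field name and coordinate here is an illustrative assumption for exposition, not part of the disclosure:

```python
from dataclasses import dataclass, field

@dataclass
class PreoperativePlan:
    """Illustrative container for the preoperative planning information.
    Positions are placeholder (x, y, z) coordinates in model space."""
    lesion_position: tuple       # position of the affected area
    resection_line: list         # polyline of points along the planned cut
    resection_region: list       # model-element ids inside the planned cut
    vessel_positions: list       # centerline points of relevant vessels
    ligation_positions: list     # planned ligation points
    # region whose blood flow stops once the planned ligations are done
    flow_stop_region: list = field(default_factory=list)

plan = PreoperativePlan(
    lesion_position=(12.0, 8.5, 3.2),
    resection_line=[(10, 8, 3), (11, 9, 3), (12, 10, 3)],
    resection_region=[101, 102, 103],
    vessel_positions=[(9, 7, 2), (10, 8, 2)],
    ligation_positions=[(10, 8, 2)],
)
```

Keeping the plan as one record makes it straightforward to overlay each populated field at its corresponding position on the model, as the processing in step S40 describes.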
ステップS30において、CPU10Aは、通知された術前計画情報を取得する。 In step S30, the CPU 10A acquires the notified preoperative planning information.
ステップS40において、CPU10Aは、ステップS30の処理によって取得した術前計画情報を、三次元モデル6の対応する位置に重畳して画面30に表示する。 In step S40, the CPU 10A displays the preoperative planning information acquired by the processing of step S30 on the screen 30, superimposed on the corresponding position of the three-dimensional model 6.
図8は、術前計画情報の表示例を示す図である。説明の便宜上、患部を「患部24」、切除ラインを「切除ライン22」、血管を「血管20」、血管20の結紮位置を「結紮位置21」、血流停止領域を「血流停止領域25」として表す。特に、三次元モデル6から推定される患部24を「患部24A」と表し、三次元モデル6から推定される切除ライン22を「切除ライン22A」と表し、三次元モデル6から推定される結紮位置21を「結紮位置21A」と表す。 FIG. 8 is a diagram showing an example of the display of preoperative planning information. For ease of explanation, the affected area is represented as "affected area 24", the resection line as "resection line 22", the blood vessel as "blood vessel 20", the ligation position of blood vessel 20 as "ligation position 21", and the blood flow stopped area as "blood flow stopped area 25". In particular, the affected area 24 estimated from the three-dimensional model 6 is represented as "affected area 24A", the resection line 22 estimated from the three-dimensional model 6 as "resection line 22A", and the ligation position 21 estimated from the three-dimensional model 6 as "ligation position 21A".
なお、CPU10Aは、切除ライン22Aに沿って患部24Aを切除した場合、患部24Aが含まれる臓器4の残存体積を、三次元モデル6と共に術前計画情報を表示する画面30に表示する。臓器4の残存体積とは、切除ライン22Aを境界とする、患部24Aを含んでいない方の臓器4の体積である。したがって、CPU10Aは、三次元モデル6から算出される患部24Aを含んだ臓器4全体の体積と、医療従事者によって指示された切除ライン22Aから臓器4の残存体積を三次元モデル6から算出し、算出した臓器4の残存体積を画面30に表示する。 When the diseased area 24A is resected along the resection line 22A, the CPU 10A displays the remaining volume of the organ 4 including the diseased area 24A on the screen 30 that displays the preoperative planning information together with the three-dimensional model 6. The remaining volume of the organ 4 is the volume of the organ 4 that does not include the diseased area 24A, with the resection line 22A as the boundary. Therefore, the CPU 10A calculates the entire volume of the organ 4 including the diseased area 24A calculated from the three-dimensional model 6, and the remaining volume of the organ 4 from the resection line 22A indicated by the medical professional, and displays the calculated remaining volume of the organ 4 on the screen 30.
臓器4の残存体積を表示する位置に制約はないが、CPU10Aは、三次元モデル6と重複しない位置に臓器4の残存体積を表示することが好ましい。画面30に表示される臓器4の残存体積も術前計画情報の一例である。なお、CPU10Aは、臓器4の残存体積に加えて、又は、臓器4の残存体積の代わりに、例えば切除ライン22Aに沿って患部24Aを切除した場合における臓器4の切除体積、切除した臓器4の重量、残った臓器4の残存重量等を画面30に表示してもよい。 There are no restrictions on the position where the remaining volume of the organ 4 is displayed, but it is preferable that the CPU 10A displays the remaining volume of the organ 4 at a position that does not overlap with the three-dimensional model 6. The remaining volume of the organ 4 displayed on the screen 30 is also an example of preoperative planning information. Note that in addition to or instead of the remaining volume of the organ 4, the CPU 10A may display on the screen 30 the resected volume of the organ 4 when the diseased area 24A is resected along the resection line 22A, the weight of the resected organ 4, the remaining weight of the remaining organ 4, etc.
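As a concrete illustration of the volume bookkeeping described above, the remaining volume is the organ's total volume minus the part lying on the resection side of the line. The voxel-mask representation and function below are assumptions made for this sketch, not the patented computation:

```python
import numpy as np

def remaining_volume(organ_mask, resect_mask, voxel_mm3=1.0):
    """Remaining organ volume after resection, from binary voxel masks.

    organ_mask  : boolean array marking voxels belonging to the organ
    resect_mask : boolean array marking voxels on the lesion side of the
                  resection line (the part to be removed)
    voxel_mm3   : volume of one voxel in cubic millimetres
    """
    total = organ_mask.sum() * voxel_mm3
    resected = (organ_mask & resect_mask).sum() * voxel_mm3
    return total - resected, resected, total

# Toy model: a 4x4x4 organ with a 2x4x4 block marked for resection.
organ = np.ones((4, 4, 4), dtype=bool)
resect = np.zeros_like(organ)
resect[:2] = True
remaining, resected, total = remaining_volume(organ, resect)
print(remaining, resected, total)  # 32.0 32.0 64.0
```

The resected weight mentioned above would follow from the same masks by multiplying the resected volume by an assumed tissue density.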
以上により、図7に示す術前標識処理を終了する。情報処理装置2は、三次元モデル6の対応する位置に術前計画情報を重畳して表示することから、医療従事者は、単に三次元モデル6を見ながら頭の中で施術計画を検討する場合と比較して、内視鏡3を患者に挿入する前の段階で施術計画を立体的にイメージすることができる。 This completes the preoperative marking process shown in FIG. 7. Since the information processing device 2 displays the preoperative planning information superimposed at the corresponding positions of the three-dimensional model 6, the medical professional can visualize the treatment plan three-dimensionally before the endoscope 3 is inserted into the patient, compared with simply considering the treatment plan mentally while looking at the three-dimensional model 6.
術前標識段階で施術箇所に対する施術計画が策定された後、実際に内視鏡3を患者に挿入して施術を開始する段階に入る。医療従事者によって施術が開始される段階を「術中段階」という。医療従事者は、術中段階になったからといってすぐに施術箇所の処置に取り掛かるわけでなく、内視鏡3を用いて施術箇所の実際の性状を確認し、術前標識段階において決定した施術計画の妥当性を判断することになる。具体的には、医療従事者は、術前標識段階において決定した切除ライン22Aの妥当性を、施術箇所の性状に基づき再検討する。以降では、術中段階のうち、内視鏡3によって撮影された施術箇所の実際の性状に基づき、術前計画情報の再検討を行う段階を「術中標識段階」と表し、施術箇所に含まれる患部24を切除する段階を「切除段階」と表す。 After the treatment plan for the treatment area is formulated in the preoperative marking stage, the endoscope 3 is actually inserted into the patient and the treatment begins. The stage at which the treatment is started by the medical professional is called the "intraoperative stage." The medical professional does not immediately begin treating the treatment area just because it is the intraoperative stage, but uses the endoscope 3 to confirm the actual characteristics of the treatment area and judge the validity of the treatment plan decided in the preoperative marking stage. Specifically, the medical professional reconsiders the validity of the resection line 22A decided in the preoperative marking stage based on the characteristics of the treatment area. Hereinafter, the stage of the intraoperative stage at which the preoperative plan information is reconsidered based on the actual characteristics of the treatment area photographed by the endoscope 3 is referred to as the "intraoperative marking stage," and the stage at which the diseased area 24 included in the treatment area is resected is referred to as the "resection stage."
図9は、医療従事者によって内視鏡3が患者に挿入された場合に、情報処理装置2によって実行される合成三次元モデル7の生成処理の流れの一例を示すフローチャートである。情報処理装置2のCPU10Aは、ROM10Bから制御プログラム11を読み込んで合成三次元モデル7の生成処理を実行する。なお、記憶部12には、画像撮影装置1によって撮影された患者の施術箇所における画像データが予め記憶されているものとする。また、記憶部12には、図7に示した術前標識処理によって生成された三次元モデル6と術前標識処理において取得した術前計画情報が予め記憶されているものとする。 FIG. 9 is a flowchart showing an example of the flow of the process for generating a composite 3D model 7 executed by the information processing device 2 when the endoscope 3 is inserted into a patient by a medical professional. The CPU 10A of the information processing device 2 reads the control program 11 from the ROM 10B and executes the process for generating a composite 3D model 7. Note that the memory unit 12 is assumed to have stored in advance image data of the treatment area of the patient captured by the image capture device 1. The memory unit 12 is also assumed to have stored in advance the 3D model 6 generated by the preoperative marking process shown in FIG. 7 and the preoperative planning information acquired in the preoperative marking process.
ステップS100において、CPU10Aは、内視鏡3から内視鏡映像5を取得する。 In step S100, the CPU 10A acquires the endoscopic image 5 from the endoscope 3.
ステップS110において、CPU10Aは、ステップS100の処理によって取得した内視鏡映像5を、内視鏡映像5に映っている場所と同じ場所に相当する三次元モデル6の表面に合成して、合成三次元モデル7を生成する。 In step S110, the CPU 10A composites the endoscopic video 5 acquired by the processing of step S100 onto the surface of the 3D model 6 corresponding to the same location shown in the endoscopic video 5, to generate a composite 3D model 7.
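Step S110 can be pictured as painting registered video pixels onto the model surface: each pixel that has been mapped to a surface position overwrites that position's appearance, while unmapped surface stays as the plain model. In the sketch below, the pixel-to-vertex mapping stands in for the result of the alignment step, and all names are assumptions:

```python
import numpy as np

def paint_surface(vertex_colors, pixel_to_vertex, frame):
    """Composite one endoscope frame onto the model surface.

    vertex_colors   : per-vertex RGB colours of the model surface
    pixel_to_vertex : {(row, col): vertex_id} produced by registration
    frame           : the current colour frame from the endoscope
    """
    for (row, col), vid in pixel_to_vertex.items():
        vertex_colors[vid] = frame[row, col]  # registered pixel wins
    return vertex_colors

colors = np.zeros((4, 3), dtype=np.uint8)        # 4 surface vertices, RGB
frame = np.full((2, 2, 3), 200, dtype=np.uint8)  # 2x2 colour frame
mapping = {(0, 0): 1, (1, 1): 3}                 # registered pixels
paint_surface(colors, mapping, frame)
print(colors[1], colors[0])  # [200 200 200] [0 0 0]
```

Vertex 1 and vertex 3 take on the video colour, while unmapped vertices keep the model's own surface, matching the behaviour described for regions without endoscopic footage.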
三次元モデル6に対して内視鏡映像5の位置合わせを行う手法には公知の手法が用いられる。具体的には、CPU10Aは、変分オートエンコーダー(Variational Autoencoder:VAE)等の深層学習を用いて体内の内視鏡映像5を体内の位置と対応付けて予め学習させた画像モデルに、取得した内視鏡映像5を入力して、内視鏡映像5に映っている場所を特定する。また、CPU10Aは、最初に内視鏡映像5が三次元モデル6の何れの場所を撮影しているのかの対応付けを行う位置合わせを行っておき、位置合わせを行った位置からの内視鏡3の移動量に基づいて、移動後の内視鏡映像5が三次元モデル6の何れの場所を撮影しているのかを特定してもよい。内視鏡3の移動量は、例えば内視鏡3に内蔵された慣性センサによる計測データから算出すればよい。 A known method is used for aligning the endoscopic video 5 with the three-dimensional model 6. Specifically, the CPU 10A inputs the acquired endoscopic video 5 into an image model that has been trained in advance, using deep learning such as a variational autoencoder (VAE), to associate endoscopic video 5 of the inside of the body with positions inside the body, and thereby identifies the location shown in the endoscopic video 5. Alternatively, the CPU 10A may first perform alignment to associate the endoscopic video 5 with the location of the three-dimensional model 6 being photographed, and then, based on the amount of movement of the endoscope 3 from the aligned position, identify which location of the three-dimensional model 6 the endoscopic video 5 is photographing after the movement. The amount of movement of the endoscope 3 may be calculated, for example, from measurement data obtained by an inertial sensor built into the endoscope 3.
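The second alignment option, estimating the current view from the initially registered position plus the endoscope's movement, reduces to accumulating displacement over sensor samples. A toy dead-reckoning sketch follows; it ignores rotation and sensor drift, which any real system must handle, and its interface is an assumption:

```python
import numpy as np

def track_view_position(initial_position, accel_samples, dt):
    """Integrate acceleration twice to estimate how far the endoscope tip
    has moved from the initially registered position (naive dead reckoning;
    a real system would fuse gyro data and correct drift)."""
    velocity = np.zeros(3)
    position = np.asarray(initial_position, dtype=float)
    for a in accel_samples:
        velocity += np.asarray(a, dtype=float) * dt  # v += a * dt
        position += velocity * dt                    # x += v * dt
    return position

# Constant 1 m/s^2 acceleration along x for three 1-second samples.
p = track_view_position((0, 0, 0), [(1, 0, 0)] * 3, dt=1.0)
print(p)  # [6. 0. 0.]
```

The estimated position then selects which part of the three-dimensional model 6 the current frame should be composited onto.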
なお、CPU10Aは、内視鏡3による施術箇所の撮影終了を指示する終了指示を受け付けるまで、内視鏡映像5を取得し続ける。 The CPU 10A continues to acquire the endoscopic video 5 until it receives an end instruction to end imaging of the treatment site using the endoscope 3.
ステップS120において、CPU10Aは、内視鏡3が移動したか否かを判定する。内視鏡3の移動の有無は、例えば時系列に沿って撮影された内視鏡映像5に変化が認められるか否かによって判定すればよい。CPU10Aは、内視鏡映像5に変化が認められる場合に内視鏡3が移動したと判定する。当然のことながら、CPU10Aは、内視鏡3に内蔵された慣性センサによる計測データから、内視鏡3が移動したか否かを判定してもよい。 In step S120, the CPU 10A determines whether the endoscope 3 has moved. Whether the endoscope 3 has moved or not may be determined, for example, based on whether a change is recognized in the endoscope video 5 captured in time series. The CPU 10A determines that the endoscope 3 has moved when a change is recognized in the endoscope video 5. Naturally, the CPU 10A may also determine whether the endoscope 3 has moved or not from measurement data obtained by an inertial sensor built into the endoscope 3.
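The frame-difference test described above can be sketched as follows; the mean-absolute-difference metric and the threshold value are assumptions for illustration:

```python
import numpy as np

def has_moved(prev_frame, curr_frame, threshold=10.0):
    """Decide whether the endoscope moved by checking whether consecutive
    frames differ; threshold is an assumed mean-absolute-difference cutoff."""
    diff = np.abs(curr_frame.astype(float) - prev_frame.astype(float)).mean()
    return diff > threshold

a = np.zeros((8, 8), dtype=np.uint8)        # unchanged scene
b = np.full((8, 8), 80, dtype=np.uint8)     # clearly different scene
print(has_moved(a, a), has_moved(a, b))  # False True
```

An identical frame pair falls under the threshold (no movement), while a clearly changed frame exceeds it, triggering the return to step S110.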
内視鏡3が移動した場合、ステップS110に移行し、内視鏡3の移動後に取得した施術箇所における新たな内視鏡映像5を、内視鏡映像5に映っている場所と同じ場所に相当する三次元モデル6の表面に合成して、合成三次元モデル7を生成する。これにより、三次元モデル6の表面に合成された内視鏡映像5の表示面積が拡大していく。 If the endoscope 3 has moved, the process proceeds to step S110, where a new endoscopic video 5 of the treatment site acquired after the endoscope 3 has moved is composited onto the surface of the three-dimensional model 6 corresponding to the same location as that shown in the endoscopic video 5, generating a composite three-dimensional model 7. As a result, the display area of the endoscopic video 5 composited onto the surface of the three-dimensional model 6 increases.
一方、ステップS120の判定処理によって内視鏡3は移動していないと判定された場合、ステップS130に移行する。 On the other hand, if the determination process in step S120 determines that the endoscope 3 has not moved, the process proceeds to step S130.
この場合、ステップS130において、CPU10Aは、操作部13を通じて医療従事者から終了指示を受け付けたか否かを判定する。終了指示を受け付けていない場合、ステップS120に移行する。すなわち、CPU10Aは、内視鏡3による施術箇所の撮影が終了するまで、内視鏡3の移動に伴い、内視鏡3に取り付けられた撮影装置によって撮影された施術箇所における新たな内視鏡映像5を、内視鏡3の移動と共に合成三次元モデル7に合成することによって、新たな内視鏡映像5を追加した合成三次元モデル7を生成し続ける。 In this case, in step S130, the CPU 10A determines whether or not an end instruction has been received from the medical staff via the operation unit 13. If an end instruction has not been received, the process proceeds to step S120. That is, the CPU 10A continues to generate a composite 3D model 7 to which a new endoscopic video 5 has been added by compositing a new endoscopic video 5 of the treatment site captured by the imaging device attached to the endoscope 3 as the endoscope 3 moves, into the composite 3D model 7 as the endoscope 3 moves, until the endoscope 3 finishes capturing the image of the treatment site.
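The S110–S130 loop amounts to: composite the first frame, then keep appending new footage until the end instruction arrives. A stripped-down sketch with a stand-in video feed, where all class and method names are illustrative rather than taken from the disclosure:

```python
class FakeScope:
    """Stand-in for the endoscope feed (illustrative only)."""
    def __init__(self, frames):
        self.frames = list(frames)
    def next_frame(self):
        return self.frames.pop(0)
    def finished(self):
        return not self.frames  # end instruction received

def build_composite_model(scope):
    """Sketch of the S110-S130 loop: composite each new frame onto the
    model until the feed reports the end instruction. The 'model' is
    simply a list of composited frames here."""
    model = [scope.next_frame()]          # S110: first composition
    while not scope.finished():           # S130: end instruction?
        model.append(scope.next_frame())  # S120/S110: scope moved, add footage
    return model

model = build_composite_model(FakeScope(["f1", "f2", "f3"]))
print(model)  # ['f1', 'f2', 'f3']
```

Each iteration enlarges the composited area, matching the description of the displayed video area expanding as the endoscope moves.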
ステップS130の判定処理において終了指示を受け付けたと判定された場合、図9に示す合成三次元モデル7の生成処理を終了する。 If it is determined in the determination process of step S130 that an end instruction has been received, the process of generating the composite 3D model 7 shown in FIG. 9 is terminated.
Since the endoscopic video 5 captured by the endoscope 3 is two-dimensional, it conveys no sense of depth. The positional relationships of the treatment site as recognized from the endoscopic video 5 may therefore differ from those represented by the stereoscopically rendered 3D model 6. Moreover, when the positional relationships are inferred from the endoscopic video 5 alone, the treatment site can be confirmed only within the range shown in the video, making it difficult to grasp the site three-dimensionally from an overhead perspective.
By contrast, when the endoscopic video 5 corresponding to each position of the 3D model 6 is composited onto the surface of the 3D model 6, as in the information processing device 2 of the present disclosure, medical staff can more easily grasp in three dimensions the difference between the positional relationships of the treatment site indicated by the captured image data and those indicated by the endoscopic video 5. Medical staff can thus more easily identify which position of the treatment site is shown in the endoscopic video 5 than when, for example, the positional relationships are understood from the endoscopic video 5 alone.
Next, the processing of the information processing device 2 in the intraoperative labeling stage, in which the generated composite 3D model 7 is used to examine the validity of the treatment plan determined in the preoperative labeling stage, will be described.
FIG. 10 is a flowchart showing an example of the flow of the intraoperative labeling process executed after the composite 3D model 7 is generated. The CPU 10A of the information processing device 2 reads the control program 11 from the ROM 10B and executes the intraoperative labeling process.
Note that the intraoperative labeling process is executed in parallel with the generation process of the composite 3D model 7 shown in FIG. 9. Therefore, if the endoscope 3 moves during the intraoperative labeling process, the range of the endoscopic video 5 composited into the composite 3D model 7 may expand.
In step S200 of FIG. 10, the CPU 10A displays the composite 3D model 7 on the screen 30.
In step S210, the CPU 10A acquires the preoperative planning information from the storage unit 12 and displays it on the screen 30, superimposed at the corresponding position of the composite 3D model 7.
Referring to the endoscopic video 5 displayed on the surface of the composite 3D model 7, the medical staff considers whether the treatment plan formulated from the composite 3D model 7 may remain as represented by the preoperative planning information. Because the endoscopic video 5 is composited into the composite 3D model 7, the characteristics and positional relationships of the treatment site are easier to grasp than with the 3D model 6, onto which no endoscopic video 5 is composited.
If, as a result of this examination, for example, the position of the diseased area 24 and the position of the resection line 22A recognized from the composite 3D model 7 differ from their respective positions in the preoperative planning information, the medical staff notifies the CPU 10A of an instruction to change the preoperative planning information via the operation unit 13.
Accordingly, in step S220, the CPU 10A determines whether an instruction to change the preoperative planning information has been received. If such an instruction has been received, the process proceeds to step S230.
In step S230, the CPU 10A displays the intraoperative planning information, which represents the treatment plan reflecting the changes instructed for the preoperative planning information, on the screen 30, superimposed at the corresponding position of the composite 3D model 7.
For example, if the CPU 10A receives an instruction to change the resection line 22A, it changes the position of the resection line 22A accordingly. For convenience of explanation, the resection line 22A after the position change is referred to as the "resection line 22B." Similarly, if the CPU 10A receives an instruction to change the size or position of the diseased area 24A, it changes the size or position of the diseased area 24A accordingly; the diseased area 24A after the change is referred to as the "diseased area 24B."
FIG. 11 shows a display example of the intraoperative planning information. The intraoperative planning information includes the information indicated by the preoperative planning information. As shown in FIG. 11, when the resection line 22A is changed, the CPU 10A displays both the resection line 22A of the preoperative labeling stage and the updated resection line 22B, each superimposed at the corresponding position of the composite 3D model 7. Likewise, when the size or position of the diseased area 24A is changed, the CPU 10A displays both the diseased area 24A of the preoperative labeling stage and the updated diseased area 24B, each superimposed at the corresponding position of the composite 3D model 7.
Furthermore, when the resection line 22A is changed, the CPU 10A updates the ligation position 21A of the blood vessel 20 determined in the preoperative labeling stage and displays the updated ligation position 21 of the blood vessel 20 superimposed at the corresponding position of the composite 3D model 7. Hereinafter, the updated ligation position 21 is referred to as the "ligation position 21B." The CPU 10A also recalculates the remaining volume of the organ 4 in response to the change of the resection line 22A and displays the recalculated remaining volume on the screen 30.
Although not shown in FIG. 11, changing the ligation position 21 of the blood vessel 20 from the ligation position 21A to the ligation position 21B may change the blood flow stop region 25 included in the preoperative planning information. In this case, the CPU 10A displays the updated blood flow stop region 25 superimposed at the corresponding position of the composite 3D model 7.
Such intraoperative planning information is an example of the related information associated with the intraoperative labeling stage. By displaying the intraoperative planning information together with the composite 3D model 7, medical staff can resect at a more accurate position than when resecting the treatment site according to the preoperative planning information displayed together with the 3D model 6. In addition, because the preoperative and intraoperative planning information are displayed on the composite 3D model 7 in a comparable state, medical staff can proceed to the resection stage while checking the differences from the preoperative planning information.
If, for example, the endoscopic video 5 of the treatment site displayed on the composite 3D model 7 is difficult to see, the medical staff can change the viewpoint from which the composite 3D model 7 is viewed via the operation unit 13.
Accordingly, in step S240 of FIG. 10, the CPU 10A determines whether a viewpoint change instruction to change the viewpoint from which the composite 3D model 7 is viewed has been received from the medical staff. If a viewpoint change instruction has been received, the process proceeds to step S250.
In step S250, the CPU 10A updates the orientation of the composite 3D model 7 so that the part viewed from the viewpoint instructed by the medical staff, that is, from the instructed angle, faces the front, and displays the model on the screen 30. The composite 3D model 7 can be rotated by, for example, applying a rotation matrix to it. In this way, the CPU 10A rotates the composite 3D model 7 and displays it on the screen 30 with an arbitrary direction facing the front.
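The rotation-matrix operation mentioned for step S250 could be sketched as follows. This is a minimal sketch under stated assumptions: a model reduced to an array of vertex coordinates and a single rotation about the vertical axis; the function name `rotate_model` is illustrative only.

```python
# Illustrative sketch of step S250: rotating the composite 3D model so that
# the operator-specified direction faces the viewer.
import numpy as np

def rotate_model(vertices, yaw_deg):
    """Apply a rotation matrix about the vertical (y) axis to row vectors."""
    t = np.radians(yaw_deg)
    r = np.array([[ np.cos(t), 0.0, np.sin(t)],
                  [ 0.0,       1.0, 0.0      ],
                  [-np.sin(t), 0.0, np.cos(t)]])
    # For row-vector vertices, v' = v @ R.T is equivalent to R @ v.
    return vertices @ r.T

# A single vertex on the x-axis, rotated by 90 degrees.
verts = np.array([[1.0, 0.0, 0.0]])
rotated = rotate_model(verts, 90.0)
```

In practice every vertex of the composite 3D model 7 would be transformed by the same matrix, so the texture composited onto the surface rotates with the geometry.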
After the viewpoint is changed, the process proceeds to step S260.
On the other hand, if it is determined in step S240 that no viewpoint change instruction has been received, the process proceeds to step S260 without executing step S250.
In step S260, the CPU 10A determines whether the medical staff has started resecting the diseased area 24, that is, whether the procedure has moved to the resection stage. Before resecting the diseased area 24, the medical staff may mark the treatment site with an electric scalpel to indicate the position to be resected. In the present disclosure, such a mark indicating the position at which the diseased area 24 is to be resected is referred to as a "resection mark 27." The CPU 10A may therefore determine that the procedure has moved from the intraoperative labeling stage to the resection stage when a resection instrument such as an electric scalpel attached to the endoscope 3 is operated. As an example in the present disclosure, it is assumed that the medical staff places resection marks 27 at the treatment site.
If the procedure has not moved to the resection stage, the process returns to step S220 and the intraoperative labeling stage continues. If the procedure has moved to the resection stage, the intraoperative labeling process shown in FIG. 10 ends.
Next, the processing of the information processing device 2 in the resection stage, in which the medical staff resects the diseased area 24, will be described.
FIG. 12 is a flowchart showing an example of the flow of the resection support process executed after the intraoperative labeling process shown in FIG. 10 is completed. The CPU 10A of the information processing device 2 reads the control program 11 from the ROM 10B and executes the resection support process.
Note that the resection support process is executed in parallel with the generation process of the composite 3D model 7 shown in FIG. 9. Therefore, if the endoscope 3 moves during the resection support process, the range of the endoscopic video 5 composited into the composite 3D model 7 may expand.
In step S300, the CPU 10A displays the intraoperative planning information finalized in the intraoperative labeling stage on the screen 30, superimposed at the corresponding position of the composite 3D model 7. If the acquired endoscopic video 5 contains a resection mark 27, the CPU 10A also displays the resection mark 27 at the corresponding position of the composite 3D model 7. The finalized intraoperative planning information, the resection marks 27, and the composite 3D model 7 are examples of resection support information, which in turn is an example of the related information associated with the resection stage.
FIG. 13 shows a display example of the resection support information. A virtual line (not shown) connecting the resection marks 27 in FIG. 13 represents the resection execution line along which the medical staff will actually resect.
When there are multiple ligation positions 21B of blood vessels 20, the CPU 10A associates an identifier with each ligation position 21B and displays the identifiers together with the composite 3D model 7. In the display example of FIG. 13, the identifier "P1" is associated with the ligation position 21B of the first blood vessel 20, and the identifier "P2" with that of the second blood vessel 20. The number in each identifier indicates the ligation order: the blood vessel 20 at the ligation position 21B associated with "P1" is ligated first, and the blood vessel 20 at the ligation position 21B associated with "P2" is ligated next. Medical staff can therefore confirm the ligation order of the blood vessels 20 by referring to the identifiers associated with the ligation positions 21B.
The identifiers of the ligation positions 21B are displayed based on the ligation order instructed by the medical staff in the intraoperative labeling stage. The CPU 10A may represent the ligation order with a notation other than numbers; for example, the ligation order of the blood vessels 20 may be represented alphabetically.
Before resecting the diseased area 24B, the medical staff ligates the blood vessels 20 with reference to the resection support information, and depending on the situation may want to confirm whether a blood vessel 20 has already been ligated. Therefore, when displaying the ligation positions 21B, the CPU 10A distinguishes the ligation positions 21B of blood vessels 20 that have already been ligated from those of blood vessels 20 not yet ligated, and displays them superimposed at the corresponding positions of the composite 3D model 7.
For example, in FIG. 13, assume that the blood vessel 20 at the ligation position 21B associated with "P1" has already been ligated, while the blood vessel 20 at the ligation position 21B associated with "P2" has not. In this case, as shown in FIG. 13, the CPU 10A displays the two ligation positions 21B in different display forms; for example, a ligated position 21B is displayed as a straight line crossing the blood vessel 20, and an unligated position 21B as an "x." There is no restriction on the display form of the ligation positions 21B as long as ligated and unligated positions can be distinguished.
Displaying the ligation positions 21B in this way allows medical staff to confirm on the screen 30 at which ligation positions 21B the blood vessels 20 have been ligated and at which they have not.
The CPU 10A also displays, together with the composite 3D model 7, the number of ligated and the number of unligated ligation positions 21B on the screen 30. In FIG. 13, "Remaining: 1" indicates the number of ligation positions 21B not yet ligated, and "Completed: 1" indicates the number already ligated.
The CPU 10A may determine from the endoscopic video 5 at which ligation position 21B a blood vessel 20 has been ligated, and update the display form of the ligation positions 21B and the ligated and unligated counts accordingly.
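The bookkeeping behind the identifiers and the "Completed"/"Remaining" counters could be sketched as follows. The class name `LigationTracker` and its methods are assumptions for illustration, not the disclosed implementation.

```python
# Hypothetical bookkeeping for the ligation display of FIG. 13: each
# ligation position 21B carries an order identifier ("P1", "P2", ...) and
# a done/pending state, from which the on-screen counters are derived.

class LigationTracker:
    def __init__(self, n_positions):
        # Identifiers encode the ligation order instructed intraoperatively.
        self.state = {f"P{i + 1}": False for i in range(n_positions)}

    def mark_ligated(self, identifier):
        # Called when the endoscopic video shows this position was ligated.
        self.state[identifier] = True

    def counts(self):
        done = sum(self.state.values())
        return {"completed": done, "remaining": len(self.state) - done}


tracker = LigationTracker(2)
tracker.mark_ligated("P1")   # vessel at P1 has been tied off
counts = tracker.counts()    # corresponds to "Completed: 1 / Remaining: 1"
```

The display update of step S350 would then amount to re-rendering each position's symbol from its boolean state and refreshing the two counters.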
The CPU 10A may also display, at the corresponding positions of the composite 3D model 7, the blood vessel 20A in which blood flow has stopped due to ligation and the blood vessel 20B in which blood flow still exists because it has not yet been ligated, distinguished from each other. In FIG. 13, the blood vessel 20A downstream of the ligation position 21B associated with "P1" has been ligated at that position, whereas the blood vessel 20B downstream of the ligation position 21B associated with "P2" has not. The CPU 10A therefore displays the blood vessels 20A and 20B in distinguishable forms superimposed at the corresponding positions of the composite 3D model 7. There is no restriction on how the blood vessels 20A and 20B are distinguished; for example, the display color of the blood vessels 20 may be changed.
Alternatively, the CPU 10A may distinguish the blood vessels 20A and 20B by superimposing, on the blood vessel 20B in which blood is present, a moving image showing the flow of blood. The moving image may be footage of actual blood flow or an animation, and may be stored in advance in the storage unit 12, for example.
Displaying the blood vessel 20A with stopped blood flow and the blood vessel 20B with remaining blood flow in distinguishable forms on the screen 30 reduces situations in which medical staff forget to ligate a blood vessel 20, compared with displaying all blood vessels 20 in the same form regardless of ligation state.
After displaying the resection support information on the screen 30, the CPU 10A determines in step S310 of FIG. 12 whether the resection position has shifted. The resection position is considered shifted when some part of the resection execution line represented by the resection marks 27 is separated from the resection line 22B of the treatment site by at least a predetermined distance (hereinafter, the "prescribed distance"). If the resection position has shifted, the process proceeds to step S320.
If part of the resection execution line is separated from the resection line 22B by the prescribed distance or more, resecting the diseased area 24B along the resection execution line would resect a location away from the planned resection line 22B. Therefore, in step S320, the CPU 10A outputs a warning. There is no restriction on the output form of the warning as long as the medical staff can notice it; for example, a display on the screen 30 or an audio notification may be used.
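The deviation test of step S310 could be sketched as a point-to-polyline distance check. This is an illustrative sketch under stated assumptions (a 2D polyline for simplicity, and hypothetical helper names); the disclosure does not specify the geometric computation.

```python
# Sketch of step S310: the resection position is "shifted" when any
# resection mark 27 lies farther than the prescribed distance from the
# planned resection line 22B.
import math

def point_to_segment(p, a, b):
    """Distance from point p to segment ab (2D for simplicity)."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    # Project p onto the segment, clamped to its endpoints.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def resection_shifted(marks, line, prescribed_distance):
    """True if any mark exceeds the prescribed distance from the line."""
    segments = list(zip(line, line[1:]))
    return any(
        min(point_to_segment(m, a, b) for a, b in segments) > prescribed_distance
        for m in marks
    )

line = [(0.0, 0.0), (10.0, 0.0)]   # planned resection line 22B (polyline)
marks = [(2.0, 0.5), (6.0, 3.0)]   # resection marks 27
shifted = resection_shifted(marks, line, prescribed_distance=1.0)
```

Here the second mark lies 3.0 units from the line, exceeding the prescribed distance of 1.0, so the check reports a shift and the warning of step S320 would be output.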
Having received the warning, the medical staff notifies the CPU 10A via the operation unit 13 whether to resect the diseased area 24B along the resection execution line as it is despite the warning. Accordingly, in step S330, the CPU 10A determines whether a continue instruction, indicating that the medical staff will resect the diseased area 24B without correcting the positions of the resection marks 27, has been received. If no continue instruction has been received, the medical staff will correct the positions of the resection marks 27 and change the resection position of the diseased area 24B, so the process returns to step S300 and the display of the resection support information continues. Note that the CPU 10A may instead proceed to step S300 upon receiving from the medical staff a position correction instruction indicating that the positions of the resection marks 27 are to be corrected.
If a continue instruction has been received, the medical staff is expected to leave the resection marks 27 uncorrected and, using them as reference points, deliberately resect at locations away from the resection marks 27 so that the resection position approaches the resection line 22B. The process therefore proceeds to step S340.
On the other hand, if it is determined in step S310 that the resection position has not shifted, no warning is required, and the process proceeds to step S340 without executing steps S320 and S330.
In step S340, the CPU 10A determines whether a blood vessel 20 has been newly ligated at any of the ligation positions 21B; this determination may be made from the endoscopic video 5. If a blood vessel 20 has been newly ligated, the process proceeds to step S350.
In step S350, the CPU 10A displays the newly ligated position 21B as a straight line crossing the blood vessel 20, superimposing on the corresponding position of the composite 3D model 7 the indication that this ligation position 21B has been ligated. That is, the CPU 10A updates the displayed ligation state to match the latest ligation state. The CPU 10A also updates the counts of ligated and unligated ligation positions 21B displayed on the screen 30.
When a blood vessel 20 is ligated, blood flow stops in the portion of the blood vessel 20 downstream of the ligation position 21B, and blood no longer flows into the organ 4.
Accordingly, in step S360, the CPU 10A updates the range of the blood flow stop region 25 based on the newly ligated position 21B and displays the updated blood flow stop region 25 superimposed at the corresponding position of the composite 3D model 7.
Further, in step S370, the CPU 10A updates the display of the blood vessel 20A downstream of the newly ligated position 21B to the display form indicating that blood flow has stopped, and proceeds to step S380.
On the other hand, if it is determined in step S340 that no blood vessel 20 has been newly ligated, the ligation state is unchanged, and the process proceeds to step S380 without executing steps S350 to S370.
In step S380, the CPU 10A determines whether a resection completion instruction, indicating that the resection of the diseased area 24B is complete, has been received from the medical staff via the operation unit 13. If no resection completion instruction has been received, ligation of blood vessels 20 may still continue, so the process returns to step S340. That is, until a resection completion instruction is received, each time a blood vessel 20 is ligated, the CPU 10A updates the displayed ligation state, the counts of ligated and unligated ligation positions 21B, the range of the blood flow stop region 25, and the displayed blood flow state of the blood vessels 20 to match the latest ligation state.
When a resection completion instruction is received, the resection support process shown in FIG. 12 ends.
The CPU 10A may display, in the area 26 of the screen 30, a setting region for setting the display forms of, for example, the diseased area 24, the resection range, the organ 4 including the treatment site, and the blood vessels 20 passing through the treatment site. The items "Tumor," "Resection area," "Liver," "Artery," and "Vein" in the area 26 can be selected by the medical staff via the operation unit 13, and each time an item is selected, the CPU 10A cycles the indication following the separator ":" after the selected item through "display," "semi-transparent display," and "hidden" in order.
"Display" is a mode in which the part represented by the item is shown on the screen 30. "Semi-transparent display" is a mode in which the part represented by the item is rendered translucently so that other parts behind it remain visible on the screen 30. "Hidden" is a mode in which the part represented by the item is not shown on the screen 30.
In the example of FIG. 13, the Tumor item is set to "display," so the CPU 10A displays the diseased area 24B containing the tumor on the screen 30. The Resection area, Liver, and Artery items are also set to "display," so the CPU 10A displays the portion of the organ 4 to be resected along the resection line 22B, the liver as an example of the organ 4, and the arteries among the blood vessels 20. The Vein item, however, is set to "hidden," so the CPU 10A does not display the veins among the blood vessels 20 on the screen 30.
In this way, the information processing device 2 can change the display form of the part corresponding to each item selected by the medical staff. Parts that obstruct the medical staff's view of the treatment site can thus be hidden, allowing the characteristics of the treatment site to be examined in more detail than when the display forms of the parts constituting the composite 3D model 7 cannot be selected individually. The setting region displayed in the area 26 of FIG. 13 may also be displayed on the screen 30 in the preoperative labeling stage shown in FIG. 8 and in the intraoperative labeling stage shown in FIG. 11.
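The per-item cycling of display modes described above could be sketched as follows; the class and mode strings are assumptions chosen to mirror the three modes named in the disclosure.

```python
# Illustrative sketch of the setting region in area 26: each selection of
# an item advances its display mode in the fixed order
# display -> semi-transparent display -> hidden -> display -> ...

MODES = ["display", "semi-transparent display", "hidden"]

class DisplaySettings:
    def __init__(self, items):
        # All parts start in the "display" mode.
        self.modes = {item: "display" for item in items}

    def select(self, item):
        # Each selection via the operation unit wraps to the next mode.
        current = MODES.index(self.modes[item])
        self.modes[item] = MODES[(current + 1) % len(MODES)]
        return self.modes[item]


settings = DisplaySettings(["Tumor", "Resection area", "Liver", "Artery", "Vein"])
settings.select("Vein")          # display -> semi-transparent display
mode = settings.select("Vein")   # semi-transparent display -> hidden
```

After the two selections the Vein item is hidden, matching the FIG. 13 example, while the untouched items remain in the "display" mode.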
また、既に説明したように、内視鏡3には、患部24を切除する切除器具、及び患部24の組織を採取する採取器具等の処置具を取り付けることができる。したがって、CPU10Aは、医療従事者がいつどのように処置具を動作させ、施術箇所にどのような処置を行ったのかに関する記録を、処置具の操作履歴として記憶部12に記憶してもよい。処置具の操作履歴は、処置具の操作に関する履歴情報の一例である。 As already explained, the endoscope 3 can be fitted with treatment tools such as a resection tool for resecting the diseased area 24 and a collection tool for collecting tissue from the diseased area 24. Therefore, the CPU 10A may store in the memory unit 12 a record of when and how the medical staff operated the treatment tool and what treatment was performed on the treatment site as a treatment tool operation history. The treatment tool operation history is an example of history information related to the operation of the treatment tool.
なお、処置が行われた時間は、例えばCPU10Aに内蔵されるタイマ機能から得られる。処置具の移動状況は、例えば処置具に内蔵された慣性センサによる計測データから得られる。処置具を用いた処置内容は、例えば内視鏡映像5から得られる。 The time the treatment was performed can be obtained, for example, from a timer function built into the CPU 10A. The movement status of the treatment tool can be obtained, for example, from measurement data obtained by an inertial sensor built into the treatment tool. The details of the treatment using the treatment tool can be obtained, for example, from the endoscopic video 5.
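As one way of turning the inertial-sensor measurements mentioned above into a movement status, acceleration samples can be numerically integrated into velocity and displacement. The update rule below is standard dead reckoning, shown only as an assumption about how such data might be processed; the patent does not specify this computation.

```python
# Minimal dead-reckoning sketch: integrate per-axis acceleration samples
# (sampled every dt seconds) into velocity and displacement estimates.

def integrate_motion(accels, dt):
    """Integrate acceleration samples (m/s^2) taken every dt seconds.

    accels: list of (ax, ay, az) tuples.
    Returns the final (velocity, displacement) tuples per axis.
    """
    v = [0.0, 0.0, 0.0]
    x = [0.0, 0.0, 0.0]
    for a in accels:
        for i in range(3):
            v[i] += a[i] * dt          # velocity: integral of acceleration
            x[i] += v[i] * dt          # displacement: integral of velocity
    return tuple(v), tuple(x)
```

In practice such raw integration drifts quickly, which is why real systems combine inertial data with other cues (here, presumably the endoscopic video); this sketch only illustrates the basic relationship between the sensor samples and the tool's movement.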
CPU10Aは、医療従事者が施術箇所の施術中に行った処置具の操作履歴を、施術箇所を含む合成三次元モデル7と共に記憶部12に記憶する。 The CPU 10A stores in the memory unit 12 the operation history of the treatment tool performed by the medical staff while performing treatment on the treatment area, together with the composite 3D model 7 including the treatment area.
このように処置具の操作履歴と合成三次元モデル7を記憶部12に記憶しておけば、医療従事者によって指示された施術内容を、処置具の操作履歴と合成三次元モデル7を用いて再現して画面30に表示できるため、医療従事者は過去の施術内容を立体的に確認することができる。したがって、自分が行った施術内容を振り返って改善点があれば次の施術に生かしたり、他の医療従事者がどのように施術を行っているのかを学習したりすることができる。 If the operation history of the treatment tools and the composite 3D model 7 are stored in the memory unit 12 in this way, the treatment content instructed by the medical staff can be reproduced using the operation history of the treatment tools and the composite 3D model 7 and displayed on the screen 30, allowing the medical staff to check the past treatment content in three dimensions. Therefore, the medical staff can look back on the treatment content they performed and use any improvements they find in the next treatment, or learn how other medical staff perform treatments.
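The stored operation history described above pairs time-stamped tool events with the composite 3D model so a procedure can be replayed later. A minimal sketch of such a record might look as follows; all class and field names are illustrative assumptions, not structures defined in the patent.

```python
# Illustrative sketch of a treatment-tool operation history stored alongside
# an identifier for the composite 3D model, supporting time-ordered replay.

from dataclasses import dataclass, field

@dataclass(frozen=True)
class ToolEvent:
    time_s: float          # when the tool was operated (e.g. from a CPU timer)
    tool: str              # e.g. "resection_tool" or "collection_tool"
    action: str            # what was done (e.g. inferred from the endoscope video)
    position: tuple        # tool position (e.g. from an inertial sensor)

@dataclass
class ProcedureRecord:
    model_id: str                      # identifies the stored composite 3D model
    events: list = field(default_factory=list)

    def log(self, event: ToolEvent):
        """Append one tool operation to the history."""
        self.events.append(event)

    def replay(self):
        """Return events in time order for playback on screen."""
        return sorted(self.events, key=lambda e: e.time_s)
```

Replaying `ProcedureRecord.replay()` against the stored model is one plausible way to reproduce past treatment content in three dimensions, as the paragraph above describes.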
以上、実施形態を用いて内視鏡施術システム100の一形態について説明したが、開示した内視鏡施術システム100の形態は一例であり、内視鏡施術システム100の形態は実施形態に記載の範囲に限定されない。本開示の要旨を逸脱しない範囲で実施形態に多様な変更又は改良を加えることができ、当該変更又は改良を加えた形態も開示の技術的範囲に含まれる。 The above describes one form of the endoscopic procedure system 100 using an embodiment, but the disclosed form of the endoscopic procedure system 100 is only one example, and the form of the endoscopic procedure system 100 is not limited to the scope described in the embodiment. Various modifications or improvements can be made to the embodiment without departing from the gist of this disclosure, and forms with such modifications or improvements are also included in the technical scope of the disclosure.
例えば本開示の要旨を逸脱しない範囲で、図7に示した術前標識処理、図9に示した合成三次元モデル7の生成処理、図10に示した術中標識処理、及び図12に示した切除支援処理(以降、「情報処理装置2の処理」という)の各フローチャートにおける内部の処理順序を変更してもよい。 For example, the internal processing order in each of the flowcharts for the preoperative marking process shown in FIG. 7, the synthetic 3D model 7 generation process shown in FIG. 9, the intraoperative marking process shown in FIG. 10, and the resection support process shown in FIG. 12 (hereinafter referred to as "processing of the information processing device 2") may be changed without departing from the scope of the present disclosure.
上記の実施形態では、一例として、情報処理装置2の処理をソフトウェア処理で実現する形態について説明した。しかしながら、情報処理装置2の処理を示すフローチャートと同等の処理をハードウェアで処理させるようにしてもよい。この場合、情報処理装置2の処理をソフトウェア処理で実現した場合と比較して処理の高速化が図られる。 In the above embodiment, as an example, a form in which the processing of the information processing device 2 is realized by software processing has been described. However, processing equivalent to the flowchart showing the processing of the information processing device 2 may be processed by hardware. In this case, the processing speed can be increased compared to when the processing of the information processing device 2 is realized by software processing.
上記の実施形態において、プロセッサとは広義的なプロセッサを指し、汎用的なプロセッサ(例えばCPU10A)や、専用のプロセッサ(例えば GPU:Graphics Processing Unit、ASIC:Application Specific Integrated Circuit、FPGA:Field Programmable Gate Array、プログラマブル論理デバイス、等)を含むものである。 In the above embodiment, the term "processor" refers to a processor in a broad sense, including general-purpose processors (e.g., CPU 10A) and dedicated processors (e.g., GPU: Graphics Processing Unit, ASIC: Application Specific Integrated Circuit, FPGA: Field Programmable Gate Array, programmable logic device, etc.).
また、上記の実施形態におけるプロセッサの動作は、1つのプロセッサによって成すのみでなく、物理的に離れた位置に存在する複数のプロセッサが協働して成すものであってもよい。また、プロセッサの各動作の順序は上記の実施形態において記載した順序のみに限定されるものではなく、適宜変更してもよい。 Furthermore, the processor operations in the above embodiments may not only be performed by a single processor, but may also be performed by multiple processors in physically separate locations working together. Furthermore, the order of each processor operation is not limited to the order described in the above embodiments, and may be changed as appropriate.
上記の実施形態では、ROM10Bに制御プログラム11が記憶されている例について説明した。しかしながら、制御プログラム11の記憶先はROM10Bに限定されない。本開示の制御プログラム11は、コンピュータで読み取り可能な記憶媒体に記録された形態で提供することも可能である。 In the above embodiment, an example has been described in which the control program 11 is stored in ROM 10B. However, the storage destination of the control program 11 is not limited to ROM 10B. The control program 11 of the present disclosure can also be provided in a form recorded on a computer-readable storage medium.
例えば制御プログラム11をCD-ROM(Compact Disk Read Only Memory)、DVD-ROM(Digital Versatile Disk Read Only Memory)、及びブルーレイディスクのような光ディスクに記録した形態で提供してもよい。また、制御プログラム11を、USB(Universal Serial Bus)メモリ及びメモリカードのような可搬型の半導体メモリに記録した形態で提供してもよい。ROM10B、CD-ROM、DVD-ROM、ブルーレイディスク、USB、及びメモリカードは非一時的(non-transitory)記憶媒体の一例である。 For example, the control program 11 may be provided in a form recorded on an optical disk such as a CD-ROM (Compact Disk Read Only Memory), a DVD-ROM (Digital Versatile Disk Read Only Memory), or a Blu-ray disc. The control program 11 may also be provided in a form recorded on a portable semiconductor memory such as a USB (Universal Serial Bus) memory or a memory card. ROM 10B, CD-ROM, DVD-ROM, Blu-ray disc, USB, and memory card are examples of non-transitory storage media.
更に、制御部10はI/F部15を通じて、通信回線に接続された外部装置から制御プログラム11をダウンロードし、ダウンロードした制御プログラム11を制御部10のROM10Bに記憶してもよい。この場合、制御部10のCPU10Aは、外部装置からダウンロードした制御プログラム11をROM10Bから読み込んで情報処理装置2の処理を実行する。 Furthermore, the control unit 10 may download the control program 11 from an external device connected to a communication line through the I/F unit 15, and store the downloaded control program 11 in the ROM 10B of the control unit 10. In this case, the CPU 10A of the control unit 10 reads the control program 11 downloaded from the external device from the ROM 10B and executes the processing of the information processing device 2.
本願の制御プログラム11は、プログラム製品として提供可能である。プログラム製品とは、プログラムを提供するためのあらゆる態様の製品を含む。例えば、プログラム製品は、インターネット等のネットワークを通じて提供されるプログラム、及びプログラムを保存したCD-ROM、DVD等の非一時的コンピュータ可読記録媒体等を含む。 The control program 11 of the present application can be provided as a program product. A program product includes any type of product for providing a program. For example, a program product includes a program provided over a network such as the Internet, and a non-transitory computer-readable recording medium such as a CD-ROM or DVD on which a program is stored.
2023年9月28日に出願された日本国特許出願2023-168676号の開示は、その全体が参照により本明細書に取り込まれる。本明細書に記載された全ての文献、特許出願、及び技術規格は、個々の文献、特許出願、及び技術規格が参照により取り込まれることが具体的かつ個々に記された場合と同程度に、本明細書中に参照により取り込まれる。 The disclosure of Japanese Patent Application No. 2023-168676, filed on September 28, 2023, is incorporated herein by reference in its entirety. All documents, patent applications, and technical standards described herein are incorporated herein by reference to the same extent as if each individual document, patent application, and technical standard was specifically and individually indicated to be incorporated by reference.
以下に本実施形態に係る付記を示す。 The following are additional notes related to this embodiment.
(付記1)
プロセッサを備え、
前記プロセッサが、
予め撮影された施術箇所の画像データから生成された前記施術箇所の三次元モデルに、内視鏡に取り付けられた撮影装置によって撮影された前記施術箇所の映像を合成し、前記施術箇所の映像が合成された合成三次元モデルを表示する制御を行う
を備えた情報処理装置。
(Appendix 1)
A processor is provided.
The processor,
An information processing device, wherein the processor performs control to combine an image of the treatment area captured by an imaging device attached to an endoscope with a three-dimensional model of the treatment area generated from image data of the treatment area captured in advance, and to display a composite three-dimensional model in which the image of the treatment area has been combined.
(付記2)
前記プロセッサは、前記施術箇所に対する施術の段階に応じて、前記段階の各々と関連付けられた関連情報を前記三次元モデル、又は前記合成三次元モデルと共に表示する制御を行う
付記1に記載の情報処理装置。
(Appendix 2)
The information processing device according to Appendix 1, wherein the processor performs control to display related information associated with each of the stages, together with the three-dimensional model or the composite three-dimensional model, according to the stage of treatment for the treatment area.
(付記3)
前記施術箇所に対する施術計画を策定する術前標識段階の場合、前記プロセッサは、患部の位置、予め計画した前記患部を含む前記施術箇所の切除ライン、前記切除ラインに沿って切除される切除範囲、前記施術箇所を通る血管の位置、予め計画した血管の結紮位置、及び血管の結紮によって血流が停止する血流停止領域の少なくとも1つを、前記三次元モデルの対応する位置に重畳して表示する制御を行う
付記2に記載の情報処理装置。
(Appendix 3)
In the case of a preoperative labeling stage in which a treatment plan for the treatment site is formulated, the processor performs control to superimpose and display at least one of the position of the affected area, a pre-planned resection line for the treatment site including the affected area, the resection range to be resected along the resection line, the position of the blood vessel passing through the treatment site, the pre-planned ligation position of the blood vessel, and a blood flow cessation region where blood flow is halted by ligation of the blood vessel, on a corresponding position of the three-dimensional model. The information processing device according to Appendix 2.
(付記4)
前記プロセッサは、前記切除ラインに沿って前記患部を切除した場合、前記患部が含まれる臓器の残存体積を前記三次元モデルと共に表示する制御を行う
付記3に記載の情報処理装置。
(Appendix 4)
The information processing device according to Appendix 3, wherein the processor performs control to display, when the diseased area is resected along the resection line, the remaining volume of the organ including the diseased area together with the three-dimensional model.
(付記5)
前記施術箇所に対する施術計画を策定する術前標識段階において決定した術前計画情報を、前記合成三次元モデルに合成された前記施術箇所の映像から得られる前記施術箇所の性状に基づき再確認する術中標識段階の場合、
前記プロセッサは、前記術前標識段階において決定した患部を含む前記施術箇所の切除ラインの変更に伴い、前記術前標識段階における前記切除ライン及び更新後の前記切除ラインをそれぞれ前記合成三次元モデルの対応する位置に重畳して表示すると共に、前記術前標識段階において決定した血管の結紮位置を更新し、更新後の血管の結紮位置を前記合成三次元モデルの対応する位置に重畳して表示する制御を行う
付記2に記載の情報処理装置。
(Appendix 5)
In the case of an intraoperative marking step of reconfirming the preoperative planning information determined in the preoperative marking step of formulating a treatment plan for the treatment area based on the characteristics of the treatment area obtained from the image of the treatment area synthesized in the synthetic three-dimensional model,
The information processing device described in Appendix 2, wherein the processor performs control to superimpose and display the resection line at the preoperative labeling stage and the updated resection line on the corresponding positions of the synthetic three-dimensional model in response to a change in the resection line of the treatment site including the affected area determined in the preoperative labeling stage, and to update the ligation position of the blood vessel determined in the preoperative labeling stage and superimpose and display the updated ligation position of the blood vessel on the corresponding position of the synthetic three-dimensional model.
(付記6)
前記施術箇所に含まれる患部を切除する切除段階の場合、前記プロセッサは、患部の位置、前記患部を含む前記施術箇所の切除ライン、前記切除ラインに沿って切除される切除範囲、前記施術箇所を通る血管の位置、血管の結紮位置、血管の結紮によって血流が停止する血流停止領域、及び医療従事者が前記患部を切除する目印として前記施術箇所に付けた切除マークの位置の少なくとも1つを、前記合成三次元モデルの対応する位置に重畳して表示する制御を行う
付記2に記載の情報処理装置。
(Appendix 6)
In the case of a resection stage in which an affected area included in the treatment area is resected, the processor performs control to superimpose and display at least one of the following on a corresponding position of the composite three-dimensional model: a position of the affected area, a resection line of the treatment area including the affected area, a resection range to be resected along the resection line, a position of a blood vessel passing through the treatment area, a ligation position of the blood vessel, a blood flow halt area in which blood flow is halted by ligation of the blood vessel, and a position of a resection mark placed on the treatment area by a medical professional as a mark for resecting the affected area. The information processing device described in Appendix 2.
(付記7)
前記プロセッサは、血管の結紮位置を識別する識別子を、血管の結紮位置と対応付けて前記合成三次元モデルと共に表示する制御を行う
付記6に記載の情報処理装置。
(Appendix 7)
The information processing device according to Appendix 6, wherein the processor performs control to display an identifier identifying a ligation position of a blood vessel, in association with that ligation position, together with the composite three-dimensional model.
(付記8)
前記識別子は、血管の結紮順序を示す
付記7に記載の情報処理装置。
(Appendix 8)
The information processing device according to Appendix 7, wherein the identifier indicates the order in which the blood vessels are ligated.
(付記9)
前記プロセッサは、結紮済みの血管の結紮位置と、まだ結紮されていない血管の結紮位置をそれぞれ区別して前記合成三次元モデルの対応する位置に重畳して表示する制御を行う
付記6~付記8の何れか1つに記載の情報処理装置。
(Appendix 9)
The information processing device according to any one of Appendices 6 to 8, wherein the processor performs control to distinguish between the ligation positions of blood vessels that have already been ligated and the ligation positions of blood vessels that have not yet been ligated, and to superimpose and display them at corresponding positions on the composite three-dimensional model.
(付記10)
前記プロセッサは、結紮により血流が停止した前記血流停止領域における血管と、まだ結紮されていないために血流が存在する前記血流停止領域における血管をそれぞれ区別して前記合成三次元モデルの対応する位置に重畳して表示する制御を行う
付記6~付記9の何れか1つに記載の情報処理装置。
(Appendix 10)
The information processing device according to any one of Appendices 6 to 9, wherein the processor performs control to distinguish between blood vessels in the blood flow stop region where blood flow has been stopped by ligation and blood vessels in the blood flow stop region where blood flow is present because the blood vessels have not yet been ligated, and to superimpose and display them at corresponding positions on the composite three-dimensional model.
(付記11)
前記プロセッサは、血流が存在する前記血流停止領域における血管に、血液の流れを示す動画を重畳して表示する制御を行う
付記10に記載の情報処理装置。
(Appendix 11)
The information processing device according to Appendix 10, wherein the processor performs control to superimpose and display a video showing blood flow on the blood vessels in the blood flow stop region where blood flow exists.
(付記12)
前記プロセッサは、結紮済みの血管の結紮位置の数と、まだ結紮されていない血管の結紮位置の数を前記合成三次元モデルと共に表示する制御を行う
付記6~付記11の何れか1つに記載の情報処理装置。
(Appendix 12)
The information processing device according to any one of Appendices 6 to 11, wherein the processor performs control to display the number of ligation positions of blood vessels that have already been ligated and the number of ligation positions of blood vessels that have not yet been ligated, together with the composite three-dimensional model.
(付記13)
前記プロセッサは、血管の結紮状態に伴い、前記血流停止領域の範囲を更新する制御を行う
付記6~付記12の何れか1つに記載の情報処理装置。
(Appendix 13)
The information processing device according to any one of Appendices 6 to 12, wherein the processor performs control to update the range of the blood flow stop region in accordance with the ligation state of the blood vessels.
(付記14)
前記プロセッサは、前記施術箇所の切除ラインと前記切除マークによって表される切除実施ラインが予め定めた距離以上離れている場合、警告を出力する制御を行う
付記6~付記13の何れか1つに記載の情報処理装置。
(Appendix 14)
The information processing device according to any one of Appendices 6 to 13, wherein the processor performs control to output a warning when the resection line of the treatment area and the resection execution line represented by the resection mark are separated by a predetermined distance or more.
(付記15)
前記プロセッサは、医療従事者からの指示に従い、前記合成三次元モデルの対応する位置に重畳して表示される前記患部、前記切除範囲、前記施術箇所を含む臓器、及び前記施術箇所を通る血管の表示形態をそれぞれ変化させる制御を行う
付記6~付記14の何れか1つに記載の情報処理装置。
(Appendix 15)
The information processing device according to any one of Appendices 6 to 14, wherein the processor performs control to change the display forms of the affected area, the resection range, the organ including the treatment site, and the blood vessels passing through the treatment site, which are displayed superimposed on corresponding positions of the composite three-dimensional model, in accordance with instructions from a medical professional.
(付記16)
前記プロセッサは、医療従事者が指示した角度から前記合成三次元モデルを表示する制御を行う
付記1~付記15の何れか1つに記載の情報処理装置。
(Appendix 16)
The information processing device according to any one of Appendices 1 to 15, wherein the processor performs control to display the composite 3D model from an angle specified by a medical professional.
(付記17)
前記プロセッサは、前記内視鏡の移動に伴い前記撮影装置によって撮影された前記施術箇所における新たな映像を、前記内視鏡の移動と共に前記合成三次元モデルに合成しながら、前記新たな映像が追加された前記合成三次元モデルを表示する制御を行う
付記16に記載の情報処理装置。
(Appendix 17)
The information processing device described in Appendix 16, wherein the processor performs control to display the composite 3D model to which new images have been added, while compositing new images of the treatment site, captured by the imaging device as the endoscope moves, into the composite 3D model.
(付記18)
前記プロセッサは、医療従事者が前記施術箇所の施術中に行った器具の操作に関する履歴情報を前記合成三次元モデルと共に記憶装置に記憶し、
医療従事者からの指示に応じて、前記施術箇所に対する施術内容を前記履歴情報と前記合成三次元モデルを用いて表示する制御を行う
付記17に記載の情報処理装置。
(Appendix 18)
The information processing device according to Appendix 17, wherein the processor stores, in a storage device, history information regarding the operation of a tool performed by a medical professional during treatment of the treatment area together with the composite three-dimensional model, and performs control to display, in response to an instruction from a medical professional, the treatment content for the treatment area using the history information and the composite three-dimensional model.
(付記19)
予め撮影された施術箇所の画像データから生成された前記施術箇所の三次元モデルに、内視鏡に取り付けられた撮影装置によって撮影された前記施術箇所の映像を合成し、前記施術箇所の映像が合成された合成三次元モデルを表示する制御をコンピュータに実行させるための制御プログラム。
(Appendix 19)
A control program for causing a computer to execute control for combining a three-dimensional model of a treatment area generated from image data of the treatment area photographed in advance with an image of the treatment area photographed by an imaging device attached to an endoscope, and displaying a composite three-dimensional model into which the image of the treatment area has been combined.
(付記20)
予め定めた処理を実行するようにコンピュータによって実行可能なプログラムを記憶した非一時的記憶媒体であって、
前記処理が、
予め撮影された施術箇所の画像データから生成された前記施術箇所の三次元モデルに、内視鏡に取り付けられた撮影装置によって撮影された前記施術箇所の映像を合成する合成ステップと、
前記施術箇所の映像が合成された合成三次元モデルを表示する表示ステップと、
を含む非一時的記憶媒体。
(Appendix 20)
A non-transitory storage medium storing a program executable by a computer to execute a predetermined process,
The process,
A synthesis step of synthesizing an image of the treatment site photographed by an imaging device attached to an endoscope with a three-dimensional model of the treatment site generated from image data of the treatment site photographed in advance;
A display step of displaying a synthetic three-dimensional model synthesized with the image of the treatment area;
Non-transitory storage media including:
(付記21)
予め撮影された施術箇所の画像データから生成された前記施術箇所の三次元モデルに、内視鏡に取り付けられた撮影装置によって撮影された前記施術箇所の映像を合成し、前記施術箇所の映像が合成された合成三次元モデルを表示する制御をコンピュータに実行させる制御プログラムを含むコンピュータプログラム製品。
(Appendix 21)
A computer program product including a control program that causes a computer to execute control for combining a three-dimensional model of a treatment area generated from image data of the treatment area photographed in advance with an image of the treatment area photographed by an imaging device attached to an endoscope, and displaying a composite three-dimensional model into which the image of the treatment area has been combined.
Claims (19)
前記プロセッサが、
予め撮影された施術箇所の画像データから生成された前記施術箇所の三次元モデルに、内視鏡に取り付けられた撮影装置によって撮影された前記施術箇所の映像を合成し、前記施術箇所の映像が合成された合成三次元モデルを表示する制御を行う
情報処理装置。 An information processing device comprising a processor, wherein the processor performs control to combine an image of the treatment area captured by an imaging device attached to an endoscope with a three-dimensional model of the treatment area generated from image data of the treatment area captured in advance, and to display a composite three-dimensional model in which the image of the treatment area has been combined.
請求項1に記載の情報処理装置。 The information processing device according to claim 1 , wherein the processor performs control to display related information associated with each of the stages, together with the three-dimensional model or the composite three-dimensional model, in accordance with a stage of treatment for the treatment location.
請求項2に記載の情報処理装置。 The information processing device according to claim 2, wherein in a preoperative labeling stage of formulating a treatment plan for the treatment site, the processor performs control to superimpose and display at least one of the position of the affected area, a preplanned resection line for the treatment site including the affected area, a resection range to be resected along the resection line, the position of a blood vessel passing through the treatment site, a preplanned ligation position of the blood vessel, and a blood flow cessation region where blood flow is halted by ligation of the blood vessel, on a corresponding position of the three-dimensional model.
請求項3に記載の情報処理装置。 The information processing device according to claim 3 , wherein the processor performs control to display, when the diseased part is resected along the resection line, a remaining volume of an organ including the diseased part together with the three-dimensional model.
前記プロセッサは、前記術前標識段階において決定した患部を含む前記施術箇所の切除ラインの変更に伴い、前記術前標識段階における前記切除ライン及び更新後の前記切除ラインをそれぞれ前記合成三次元モデルの対応する位置に重畳して表示すると共に、前記術前標識段階において決定した血管の結紮位置を更新し、更新後の血管の結紮位置を前記合成三次元モデルの対応する位置に重畳して表示する制御を行う
請求項2に記載の情報処理装置。 In the case of an intraoperative marking step of reconfirming the preoperative planning information determined in the preoperative marking step of formulating a treatment plan for the treatment area based on the characteristics of the treatment area obtained from the image of the treatment area synthesized in the synthetic three-dimensional model,
The information processing device according to claim 2, wherein the processor performs control to superimpose and display the resection line at the preoperative labeling stage and the updated resection line on the corresponding positions of the synthetic 3D model in response to a change in the resection line of the treatment site including the diseased area determined in the preoperative labeling stage, and to update the ligation position of the blood vessel determined in the preoperative labeling stage and superimpose and display the updated ligation position of the blood vessel on the corresponding position of the synthetic 3D model.
請求項2に記載の情報処理装置。 The information processing device according to claim 2, wherein in a resection stage of resecting an affected area included in the treatment area, the processor performs control to superimpose and display at least one of the position of the affected area, a resection line of the treatment area including the affected area, a resection range to be resected along the resection line, a position of a blood vessel passing through the treatment area, a ligation position of the blood vessel, a blood flow stoppage area where blood flow is stopped by ligation of the blood vessel, and a position of a resection mark placed by a medical professional at the treatment area as a mark for resecting the affected area, on a corresponding position of the synthetic three-dimensional model.
請求項6に記載の情報処理装置。 The information processing device according to claim 6, wherein the processor performs control to display an identifier for identifying a ligation position of a blood vessel together with the synthetic three-dimensional model, in association with the ligation position of the blood vessel.
請求項7に記載の情報処理装置。 The information processing device according to claim 7 , wherein the identifier indicates a ligation order of blood vessels.
請求項6に記載の情報処理装置。 The information processing device according to claim 6 , wherein the processor performs control to distinguish between ligation positions of blood vessels that have already been ligated and ligation positions of blood vessels that have not yet been ligated and to superimpose and display them at corresponding positions on the composite three-dimensional model.
請求項9に記載の情報処理装置。 The information processing device according to claim 9 , wherein the processor performs control to distinguish between blood vessels in the blood flow stop region where blood flow has been stopped by ligation and blood vessels in the blood flow stop region where blood flow is present because the blood vessels have not yet been ligated, and to superimpose and display them at corresponding positions on the synthetic three-dimensional model.
請求項10に記載の情報処理装置。 The information processing device according to claim 10 , wherein the processor performs control to superimpose and display a moving image showing blood flow on the blood vessel in the blood flow stop region where blood flow exists.
請求項9に記載の情報処理装置。 The information processing device according to claim 9 , wherein the processor performs control to display, together with the composite three-dimensional model, the number of ligation positions of blood vessels that have already been ligated and the number of ligation positions of blood vessels that have not yet been ligated.
請求項6に記載の情報処理装置。 The information processing device according to claim 6 , wherein the processor performs control to update the range of the blood flow stop region in accordance with a ligated state of the blood vessel.
請求項6に記載の情報処理装置。 The information processing device according to claim 6 , wherein the processor performs control to output a warning when the resection line of the treatment site and the resection execution line represented by the resection mark are separated by a predetermined distance or more.
請求項6に記載の情報処理装置。 The information processing device according to claim 6, wherein the processor performs control to change display forms of the diseased area, the resection range, the organ including the treatment site, and the blood vessels passing through the treatment site, which are displayed superimposed on corresponding positions of the synthetic three-dimensional model, in accordance with instructions from a medical professional.
請求項1~請求項15の何れか1項に記載の情報処理装置。 The information processing device according to any one of claims 1 to 15, wherein the processor performs control to display the composite three-dimensional model from an angle specified by a medical professional.
請求項16に記載の情報処理装置。 The information processing device according to claim 16, wherein the processor performs control to display the composite 3D model to which new images have been added, while compositing new images of the treatment site, captured by the imaging device as the endoscope moves, into the composite 3D model.
医療従事者からの指示に応じて、前記施術箇所に対する施術内容を前記履歴情報と前記合成三次元モデルを用いて表示する制御を行う
請求項17に記載の情報処理装置。 The information processing device according to claim 17, wherein the processor stores, in a storage device, history information regarding the operation of a tool performed by a medical professional during treatment of the treatment site together with the composite three-dimensional model, and performs control to display, in response to an instruction from a medical professional, the details of treatment for the treatment site using the history information and the composite three-dimensional model.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2023-168676 | 2023-09-28 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025069575A1 (en) | 2025-04-03 |
Family
ID=95202760
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2024/020434 (WO2025069575A1, pending) | Information processing device and control program | 2023-09-28 | 2024-06-04 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025069575A1 (en) |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2005278888A (en) * | 2004-03-29 | 2005-10-13 | Olympus Corp | Procedure support system |
| JP2005287893A (en) * | 2004-04-01 | 2005-10-20 | Olympus Corp | Procedure support system |
| JP2006320427A (en) * | 2005-05-17 | 2006-11-30 | Hitachi Medical Corp | Endoscopic operation support system |
| JP2013045284A (en) * | 2011-08-24 | 2013-03-04 | Nara Institute Of Science & Technology | Image processing system, image processing method, and program |
| WO2019116592A1 (en) * | 2017-12-14 | 2019-06-20 | オリンパス株式会社 | Device for adjusting display image of endoscope, and surgery system |
- 2024-06-04: WO application PCT/JP2024/020434 filed (published as WO2025069575A1); status pending.
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24869451; Country of ref document: EP; Kind code of ref document: A1 |