WO2020203405A1 - Medical observation system and method, and medical observation device
Medical observation system and method, and medical observation device
- Publication number
- WO2020203405A1 (PCT/JP2020/012676)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- medical observation
- optical system
- unit
- change
- surgical field
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00163—Optical arrangements
- A61B1/00188—Optical arrangements with focusing or zooming features
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00043—Operational features of endoscopes provided with output arrangements
- A61B1/00045—Display arrangement
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00147—Holding or positioning arrangements
- A61B1/00149—Holding or positioning arrangements using articulated arms
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00163—Optical arrangements
- A61B1/00193—Optical arrangements adapted for stereoscopic vision
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00163—Optical arrangements
- A61B1/00194—Optical arrangements adapted for three-dimensional imaging
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B21/00—Microscopes
- G02B21/0004—Microscopes specially adapted for specific applications
- G02B21/0012—Surgical microscopes
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B21/00—Microscopes
- G02B21/36—Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
- G02B21/365—Control or image processing arrangements for digital or video microscopes
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B23/00—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
- G02B23/24—Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
- G02B23/2476—Non-optical details, e.g. housings, mountings, supports
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B23/00—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
- G02B23/24—Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
- G02B23/2476—Non-optical details, e.g. housings, mountings, supports
- G02B23/2484—Arrangements in relation to a camera or imaging device
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
- G06T7/579—Depth or shape recovery from multiple images from motion
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10068—Endoscopic image
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30096—Tumor; Lesion
Definitions
- The present technology relates to medical observation systems and methods, and to medical observation devices, and in particular to medical observation systems, methods, and devices capable of maintaining the accuracy of three-dimensional information even when the optical system is changed.
- Patent Document 1 proposes a technique for generating three-dimensional information by SLAM and displaying it on a screen.
- The optical system of the medical observation device may be changed during surgery. For example, when the focus is adjusted, the position of the focus lens in the optical system moves.
- The scope of the endoscope may also be replaced during surgery, which likewise changes the optics.
- The present technology was devised in view of this situation and makes it possible to maintain the accuracy of three-dimensional information even if the optical system is changed.
- The medical observation system of one aspect of the present technology includes an acquisition unit that acquires surgical field data obtained by the medical observation device, a detection unit that detects changes in the optical system of the medical observation device, an estimation unit that, when the detection unit detects a change, estimates the parameters determined by the changed optical system, and a setting unit that sets, using the estimation result of the estimation unit, the conditions for generating three-dimensional information based on the surgical field data.
- The medical observation device of another aspect of the present technology includes an imaging unit that images the surgical field and generates surgical field data, and an output unit that outputs the surgical field data. The device is used in an observation system in which, when a change in the optical system of the imaging unit is detected, the parameters determined by the changed optical system are estimated and the estimation result is used to set the conditions for generating three-dimensional information based on the surgical field data.
- In the medical observation method of one aspect of the present technology, the surgical field data obtained by the medical observation device is acquired and a change in the optical system of the medical observation device is detected. When a change in the optical system is detected, the parameters determined by the changed optical system are estimated, and the estimation result is used to set the conditions for generating three-dimensional information based on the surgical field data.
- FIG. 1 is a diagram showing a configuration example of a surgery support system according to the first embodiment of the present technology.
- FIG. 1 shows an example of an endoscopic surgery system used in abdominal endoscopic surgery, which is performed in medical practice in place of conventional open surgery.
- In abdominal endoscopic surgery, trocars 25a and 25b are attached to the abdominal wall at several places. A laparoscope (hereinafter also referred to as an endoscope) 11, which is an observation medical device for observing the inside of the patient, an energy treatment tool 22, forceps 23, and the like are inserted into the body through the holes provided in the trocars 25a and 25b.
- The surgeon performs treatment such as excising the affected area U with the energy treatment tool 22 or the like while viewing, in real time, the image of the affected area (a tumor or the like) U inside the patient's body captured by the endoscope 11.
- The endoscope 11, the energy treatment tool 22, and the forceps 23 are held by the surgeon, a robot, or the like.
- Here, "surgeon" refers to any medical worker involved in the surgery performed in the operating room, including, for example, the operating surgeon, assistants, scopists, nurses, and doctors monitoring the surgery from outside the operating room.
- The endoscope 11 is held by, for example, a scopist.
- The endoscope 11 includes a scope inserted into the patient and a camera head including an image pickup element that receives and images the light guided by the scope.
- The scope may be a rigid type or a flexible type. Further, the scope and the image sensor may be integrated.
- In the operating room, a cart 31 equipped with devices for endoscopic surgery, a patient bed 33 on which the patient lies, a foot switch 35, and the like are installed.
- On the cart 31, devices such as a camera control unit (CCU) 13, a light source device 17, a treatment tool device 21, a pneumoperitoneum device 24, a display device 15, a recorder 26, and a printer 27 are placed as medical devices.
- The image signal of the affected area U captured through the observation optical system of the endoscope 11 is transmitted to the CCU 13 via a camera cable, which is a signal transmission cable.
- The CCU 13 may be connected to the endoscope 11 via the camera cable, or via a wireless communication path.
- The CCU 13 performs signal processing on the image signal output from the endoscope 11 and outputs the processed image signal to the display device 15. With this configuration, the surgical field image of the affected area U is displayed on the display device 15.
- The CCU 13 may output the processed image signal to the recorder 26 so that the recorder 26 records the surgical field image of the affected area U as image data (for example, moving image data). The CCU 13 may also output the processed image signal to the printer 27 so that the printer 27 prints the surgical field image of the affected area U.
- The light source device 17 is connected to the endoscope 11 via a light guide cable and irradiates the affected area U with light of various wavelengths, switching among them as needed.
- The light emitted from the light source device 17 may be used, for example, as auxiliary light.
- The treatment tool device 21 corresponds to, for example, a high-frequency output device that outputs a high-frequency current to the energy treatment tool 22, which cuts the affected area U using electric heat.
- The pneumoperitoneum device 24 includes air supply and intake means and supplies air to, for example, the abdominal region in the patient's body.
- The foot switch 35 controls the CCU 13 and the treatment tool device 21, using a foot operation by the surgeon or an assistant as a trigger signal.
- FIG. 2 is a block diagram showing a functional configuration example of the surgery support system.
- The surgery support system 100 of FIG. 2 is composed of an imaging unit 101, an information processing unit 102, and a display unit 103.
- The imaging unit 101 corresponds to the endoscope 11 in FIG. 1.
- The imaging unit 101 images the surgical field in response to operations by the scopist and outputs the image signal obtained by the imaging to the information processing unit 102.
- That is, the imaging unit 101 is a medical observation device that images the surgical field and outputs the obtained surgical field data.
- A microscope may be used instead of an endoscope.
- The medical observation device is equipped with a circuit (for example, a CPU (Central Processing Unit), RAM (Random Access Memory), ROM (Read Only Memory), or FPGA (Field-Programmable Gate Array)) that controls the imaging process and processes the generated image signal.
- The information processing unit 102 corresponds to the CCU 13 in FIG. 1.
- The information processing unit 102 acquires the image signal supplied from the imaging unit 101, performs signal processing on it, and outputs the resulting surgical field image signal to the display unit 103.
- The information processing unit 102 may be configured in a device other than the CCU 13.
- The display unit 103 corresponds to the display device 15 in FIG. 1.
- The display unit 103 displays the surgical field image based on the image signal supplied from the information processing unit 102.
- The information processing unit 102 includes an optical system change detection unit 111, a parameter estimation unit 112, a three-dimensional information generation unit 113, and a display information generation unit 114.
- At least a part of the information processing unit 102 is realized by a circuit including the CPU of the CCU 13 of FIG. 1 executing a predetermined program.
- The image signal output from the imaging unit 101 is input to the optical system change detection unit 111, the three-dimensional information generation unit 113, and the display information generation unit 114.
- At least a part of the functions of the information processing unit 102 may be realized by the FPGA.
- The optical system change detection unit 111 detects changes in the optical system of the imaging unit 101 that occur during surgery.
- A change in the optical system occurs when the imaging unit 101 adjusts the optical system, for example through zoom (angle of view) adjustment (movement of the zoom lens) or focus adjustment (movement of the focus lens), and, when the imaging unit 101 includes a scope, when the scope is replaced.
- Information on the optical system can be obtained electronically from the imaging unit 101. For example, since zoom adjustment and focus adjustment are performed by moving some of the optical members of the optical system included in the imaging unit 101 based on the output of the CCU 13, information indicating a change in the optical system (for example, information indicating the position of the zoom lens or the focus lens) is stored in the CCU 13. In this case, the optical system change detection unit 111 detects the change in the optical system based on the information stored in the CCU 13. Alternatively, the imaging unit 101 may include a removable scope, and the scope may be provided with a storage unit that stores information indicating the type of the scope. The circuit included in the imaging unit 101 may then read this scope information and output it to the CCU 13, and the optical system change detection unit 111 detects the change in the optical system based on the information obtained from the imaging unit 101.
- Alternatively, the optical system change detection unit 111 detects a change in the optical system based on the image signal obtained from the imaging unit 101.
- For example, the optical system change detection unit 111 repeatedly detects a mask region in the image signals sequentially supplied from the imaging unit 101.
- The mask region is the vignetted region formed around the effective region in which the surgical field appears in the surgical field image generated from the image signal.
- When the optical system changes, the vignetting changes, so the mask region changes and the diameter of the circular effective region changes.
- The optical system change detection unit 111 detects a change in the optical system of the imaging unit 101 by detecting such a change in the mask region (see the sketch below).
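- As a rough illustration (not taken from the patent itself), the mask-region check could be sketched in Python as follows; the function names and threshold values are assumptions for illustration only.

```python
import numpy as np


def effective_region_radius(gray_frame, dark_thresh=10):
    """Estimate the radius of the circular effective (non-vignetted) region.

    A minimal sketch assuming the mask region is near-black and the
    effective region is roughly circular; the threshold is illustrative.
    """
    mask = gray_frame > dark_thresh        # pixels inside the effective region
    area = np.count_nonzero(mask)          # area of the effective region
    return np.sqrt(area / np.pi)           # radius of an equal-area circle


def mask_region_changed(prev_radius, curr_radius, rel_tol=0.05):
    """Flag a change in the optical system when the effective radius jumps."""
    return abs(curr_radius - prev_radius) > rel_tol * prev_radius
```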
- The optical system change detection unit 111 may also detect a change in the optical system of the imaging unit 101 by using the singular values of the essential matrix computed between frames. A change in the focal length can be detected based on these singular values, which makes it possible to detect changes in the optical system such as movement of the focus lens or of the zoom lens.
- The detection method using the singular values exploits the property that the two non-zero singular values of the essential matrix calculated from two viewpoints are equal when the focal lengths are the same; a change in focal length is therefore detected from the ratio of the singular values of the essential matrix. The method is described in, for example, Kazuki Nozawa, "Stabilization of 3D reconstruction for an input image group having an unknown focal length", CVIM-182, vol. 2012, no. 19.
- Specifically, the optical system change detection unit 111 performs the following processes (a) to (d).
- (a) The optical system change detection unit 111 records a key frame that serves as a reference for generating three-dimensional information in SLAM.
- (b) The optical system change detection unit 111 sequentially calculates the essential matrix E using the key frames.
- (c) The optical system change detection unit 111 calculates the non-zero singular values of the essential matrix E.
- The essential matrix E is a 3×3 matrix. When the epipolar condition, which is the basis of three-dimensional reconstruction, is satisfied, the third diagonal entry of Σ in the singular value decomposition E = UΣVᵀ (equation (1)) becomes 0, so the diagonal matrix Σᵢ in frame i has the form of equation (2): Σᵢ = diag(σᵢ₁, σᵢ₂, 0).
- (d) The optical system change detection unit 111 detects that the focal length has changed by comparing the ratios of the non-zero singular values at each time.
- That is, a change in the optical system can be detected by comparing the ratio of the singular values of the diagonal matrix Σᵢ in frame i against a threshold value th (see the sketch below).
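- A minimal numerical sketch of this singular-value test, assuming the essential matrices have already been computed from matched key frames (the names and the threshold are illustrative):

```python
import numpy as np


def singular_value_ratio(E):
    """Ratio of the two non-zero singular values of a 3x3 essential matrix."""
    s = np.linalg.svd(E, compute_uv=False)  # singular values, descending order
    return s[0] / s[1]                      # s[2] is ~0 when the epipolar condition holds


def focal_length_changed(E_i, E_j, th=0.1):
    """Detect a focal-length change between frames i and j.

    When the focal length is unchanged, the two non-zero singular values
    stay (ideally) equal, so the ratios should agree between frames.
    """
    return abs(singular_value_ratio(E_i) - singular_value_ratio(E_j)) > th
```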
- The optical system change detection unit 111 outputs the detection result obtained as described above to the parameter estimation unit 112.
- The detection result output to the parameter estimation unit 112 also includes information on the optical system of the imaging unit 101.
- The method for detecting changes in the optical system is not limited to the above methods; other methods can also be adopted.
- When a change in the optical system is detected, the parameter estimation unit 112 of FIG. 2 estimates the parameters that serve as the conditions for generating three-dimensional information based on the surgical field image.
- A parameter here is a value determined by the optical system, for example information indicating the focal length, the image center, the magnification, or the distortion coefficient of a lens.
- The information constituting the parameters may include at least one parameter determined by the optical system, for example at least one of the focal length, the image center, the magnification, and the distortion coefficient.
- The parameters determined by the optical system include parameters determined by the arrangement of the optical system in the imaging unit 101. For example, even with the same scope, the image center may shift slightly when the scope is removed and reattached.
- For example, the parameter estimation unit 112 refers to a table representing the relationship between optical system information and parameters, and finds the parameters corresponding to the optical system information obtained from the imaging unit 101, as in the sketch below.
- For this purpose, the parameter estimation unit 112 is provided with a table that is generated in advance and represents the relationship between optical system information and parameters.
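- As a purely hypothetical illustration of such a table (the scope names and numeric values below are invented, not taken from the patent):

```python
# Hypothetical pre-calibrated table: (scope type, zoom state) -> intrinsic
# parameters (focal lengths fx/fy in pixels, image center cx/cy, distortion k1).
PARAMETER_TABLE = {
    ("scope_A", "zoom_1x"): {"fx": 900.0,  "fy": 900.0,  "cx": 640.0, "cy": 360.0, "k1": -0.12},
    ("scope_A", "zoom_2x"): {"fx": 1800.0, "fy": 1800.0, "cx": 640.0, "cy": 360.0, "k1": -0.05},
    ("scope_B", "zoom_1x"): {"fx": 1100.0, "fy": 1100.0, "cx": 632.0, "cy": 355.0, "k1": -0.10},
}


def lookup_parameters(scope_type, zoom_state):
    """Return the pre-calibrated parameters for the reported optical state."""
    return PARAMETER_TABLE[(scope_type, zoom_state)]
```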
- Alternatively, the parameter estimation unit 112 estimates a matrix of parameters based on the image signal obtained from the imaging unit 101.
- For this, self-calibration, which can estimate a matrix of parameters without using a calibration pattern, can be used.
- Self-calibration is described in, for example, O. D. Faugeras, "Camera self-calibration: Theory and experiments", European Conference on Computer Vision, 1992, pp. 321-334.
- The parameter estimation unit 112 also calculates information that serves as a reliability index of the estimated parameter matrix.
- The parameter estimation unit 112 then determines whether to newly set the three-dimensional information generation conditions using the parameter estimation result, that is, whether to update the parameters that serve as the generation conditions with the estimated parameters. When it determines that they should be updated, the parameters serving as the conditions for generating three-dimensional information are updated.
- For example, the parameter estimation unit 112 determines whether to update the parameters by a threshold test on the reliability index of the estimated parameter matrix: if the reliability index is higher than a preset threshold, the parameters are updated; if it is lower, they are not.
- Alternatively, the parameter estimation unit 112 may present the estimation result on the display unit 103 and determine whether to update the parameters according to a selection by the user who has seen the estimation result.
- The three-dimensional information generation unit 113 generates three-dimensional information, using the parameters that serve as the generation conditions, based on each frame of the surgical field image represented by the image signal supplied from the imaging unit 101.
- That is, the three-dimensional information is generated from the surgical field image using the above-mentioned parameters.
- The three-dimensional information includes a three-dimensional map showing the three-dimensional structure of the subject appearing in the surgical field image (an organ or the interior of a body cavity) and position/posture information showing the self-position and posture of the imaging unit 101.
- For generating the three-dimensional information, Visual SLAM, which takes only the surgical field image as input, or RGB-D SLAM, which measures depth with a ToF sensor or lidar and takes the surgical field image and the depth information as input, can be used, as illustrated by the sketch below.
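- To make concrete how these parameters enter the generation of three-dimensional information, the following sketch (assumed names, a standard pinhole model rather than anything specific to the patent) shows how a focal length and image center form the intrinsic matrix used to lift pixels to 3D points:

```python
import numpy as np


def intrinsic_matrix(fx, fy, cx, cy):
    """Pinhole intrinsic matrix K built from the estimated parameters."""
    return np.array([[fx, 0.0, cx],
                     [0.0, fy, cy],
                     [0.0, 0.0, 1.0]])


def back_project(u, v, depth, K):
    """Lift pixel (u, v) with known depth to a 3D point in the camera frame.

    If K no longer matches the actual optics, these points (and hence the
    3D map) come out at the wrong scale, which is why the parameters must
    be re-estimated after a change in the optical system.
    """
    x = (u - K[0, 2]) * depth / K[0, 0]
    y = (v - K[1, 2]) * depth / K[1, 1]
    return np.array([x, y, depth])
```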
- When a change in the optical system is detected, the three-dimensional information generation unit 113 stops generating three-dimensional information until new parameters are estimated by the parameter estimation unit 112.
- When the parameters are updated, the three-dimensional information generation unit 113 restarts the generation of three-dimensional information using the new parameters.
- Alternatively, the three-dimensional information generation unit 113 may continue generating three-dimensional information without stopping, storing the three-dimensional information from before the change in the optical system separately from the three-dimensional information from after the change.
- In that case, when the imaging unit 101 later images a place for which three-dimensional information was generated before the change, the three-dimensional information generation unit 113 updates the three-dimensional information by replacing the three-dimensional information of that place (generated before the change of the optical system) with three-dimensional information generated using the new parameters (after the change of the optical system).
- The three-dimensional information generation unit 113 outputs the three-dimensional information generated in this way to the display information generation unit 114.
- The display information generation unit 114 causes the display unit 103 to display the surgical field image based on the image signal supplied from the imaging unit 101.
- The display information generation unit 114 also causes the display unit 103 to display the three-dimensional map based on the three-dimensional information supplied from the three-dimensional information generation unit 113.
- The display style of the three-dimensional map, such as its color, may be changed between before and after the parameters are updated.
- Further, the display information generation unit 114 may display the detection result of the change in the optical system produced by the optical system change detection unit 111.
- For example, when the imaging unit 101 includes a scope, information indicating that the scope has been exchanged, or information such as the type of the scope after the exchange, may be displayed.
- The display information generation unit 114 may also cause the display unit 103 to display the new parameters set as the conditions for generating three-dimensional information.
- FIG. 4 is a flowchart illustrating a three-dimensional information generation process in the surgery support system 100.
- In step S111, the three-dimensional information generation unit 113 generates three-dimensional information, using the parameters, based on the surgical field image represented by the image signal obtained from the imaging unit 101.
- In step S112, the three-dimensional information generation unit 113 updates the three-dimensional information generated so far using the newly generated three-dimensional information.
- In step S113, the display information generation unit 114 causes the display unit 103 to display the three-dimensional map based on the three-dimensional information supplied from the three-dimensional information generation unit 113.
- In step S114, the optical system change detection unit 111 determines whether a change in the optical system has been detected.
- If it is determined in step S114 that a change in the optical system has been detected, the parameter estimation unit 112 estimates the parameters in step S115. Generation of three-dimensional information is stopped until the parameters are updated.
- In step S116, the parameter estimation unit 112 determines whether to update the parameters serving as the conditions for generating three-dimensional information with the estimated parameters. This determination is made based on the reliability index of the parameter estimation result, as described above.
- If it is determined in step S116 that the parameters are not to be updated, the process returns to step S115 and the parameter estimation is repeated.
- If it is determined that the parameters are to be updated, in step S117 the parameter estimation unit 112 updates the parameters serving as the conditions for generating three-dimensional information with the new parameters.
- The parameters updated by the parameter estimation unit 112 are supplied to the three-dimensional information generation unit 113.
- The scale of the three-dimensional map is then adjusted using the new parameters so as to correspond to the three-dimensional map from before the change in the optical system was detected, and the generation of three-dimensional information continues.
- In step S118, the three-dimensional information generation unit 113 determines whether to end the three-dimensional information generation process. If it is determined in step S118 that the process is not to be ended, or if it is determined in step S114 that no change in the optical system has been detected, the process returns to step S111 and the subsequent processing is repeated.
- If it is determined in step S118 that the three-dimensional information generation process is to be ended, the processing of the surgery support system 100 ends. The overall control flow is sketched below.
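- The control flow of FIG. 4 might be summarized by the following sketch; the unit interfaces (generate, change_detected, estimate, and so on) are assumed names, not the patent's API:

```python
def run_generation_loop(frames, detector, estimator, generator, display):
    """Sketch of steps S111-S118: generate 3D information per frame and
    re-estimate parameters whenever an optical-system change is detected."""
    estimating = False
    for frame in frames:
        if estimating:
            params, reliability = estimator.estimate(frame)   # S115
            if reliability > estimator.threshold:             # S116
                generator.set_parameters(params)              # S117
                estimating = False                            # resume generation
            continue  # 3D generation stays stopped while estimating
        info = generator.generate(frame)                      # S111
        generator.update(info)                                # S112
        display.show_map(generator.three_d_map)               # S113
        if detector.change_detected(frame):                   # S114
            estimating = True
    # S118: processing ends when the frame stream is exhausted
```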
- As described above, when a change in the optical system is detected, the parameters serving as the conditions for generating three-dimensional information are updated, and the generation of three-dimensional information continues using the updated parameters.
- Generating accurate three-dimensional information requires setting the parameters, including the focal length, image center, and distortion coefficient, to appropriate values.
- Normally, these parameters are obtained in advance by camera calibration, and during operation (during surgery) the pre-obtained parameters are treated as fixed values while three-dimensional information is generated.
- During surgery, however, the optical system may be adjusted (for example, the zoom) or the scope itself may be replaced, which changes the parameters.
- If the parameters no longer match the actual optical system, the scale of the generated three-dimensional information may change or errors may occur.
- The parameters could be readjusted, but this would require withdrawing the scope and manually calibrating the camera, which is not realistic during surgery.
- FIG. 5 is a diagram showing another configuration example of the surgery support system.
- The configuration of the surgery support system 200 shown in FIG. 5 differs from the configuration shown in FIG. 1 in that a robot arm device 212 including a robot arm 211, and a cart 213 on which various devices for endoscopic surgery are mounted, are provided.
- The robot arm device 212 holds the endoscope 11 using the robot arm 211.
- The position/orientation information of the endoscope 11 acquired by the robot arm device 212 is supplied to the CCU 13 (the information processing unit 102 in FIG. 2).
- The position/orientation information of the endoscope 11 supplied from the robot arm device 212 is used for detecting changes in the optical system and for estimating parameters.
- The functional configuration of the surgery support system 200 of FIG. 5 is the same as the configuration described with reference to FIG. 2. Referring again to FIG. 2, a method of detecting changes in the optical system and a method of estimating parameters using the position/posture information of the imaging unit 101 (endoscope 11) will be described for the surgery support system 200.
- The position/orientation information of the imaging unit 101 supplied from the robot arm device 212 is input to the optical system change detection unit 111 and the three-dimensional information generation unit 113.
- The optical system change detection unit 111 detects a change in the optical system based on the locus of the self-position of the imaging unit 101 supplied from the robot arm device 212.
- The self-position of the imaging unit 101 estimated by SLAM in the three-dimensional information generation unit 113 acquires an error when the optical system changes.
- The optical system change detection unit 111 therefore compares the locus of the actual self-position of the imaging unit 101 obtained from the robot arm device 212 with the locus of the self-position estimated by the three-dimensional information generation unit 113, and when the error between them becomes large, it detects this as a change in the optical system, as in the sketch below.
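- A minimal sketch of this comparison, assuming both trajectories are given as N×3 arrays of positions already expressed in a common coordinate frame and timebase (the threshold and names are illustrative):

```python
import numpy as np


def trajectory_error(arm_positions, slam_positions):
    """RMS discrepancy between the robot-arm and SLAM-estimated trajectories."""
    arm = np.asarray(arm_positions, dtype=float)
    slam = np.asarray(slam_positions, dtype=float)
    return np.sqrt(np.mean(np.sum((arm - slam) ** 2, axis=1)))


def optics_changed(arm_positions, slam_positions, th=5.0):
    """Flag an optical-system change when the trajectories diverge
    beyond the threshold th (same length unit as the positions)."""
    return trajectory_error(arm_positions, slam_positions) > th
```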
- Normally, when the angle of view changes, it is difficult to distinguish between a change due to zooming and movement of the imaging unit 101 in the optical axis direction. When the imaging unit 101 is held by the robot arm 211, however, the presence or absence of movement of the imaging unit 101 can be determined, so changes in the optical system can be detected using the angle of view: when the angle of view of the surgical field image changes while the imaging unit 101 has not moved, this is detected as a change in the optical system.
- Even when the imaging unit 101 is held by a scopist, if the presence or absence of movement of the imaging unit 101 can be detected by a sensor or the like, the change in the angle of view and the presence or absence of movement of the imaging unit 101 may be used to detect changes in the optical system, as in the case of the robot arm 211.
- As another method of detecting changes in the optical system, there is, for example, a method of recording feature points between frames and tracking the changes of feature points near the outer periphery of the surgical field image.
- The parameter estimation unit 112 estimates the parameters using the position/orientation information of the imaging unit 101 obtained from the robot arm device 212. For parameter estimation based on information obtained from a robot arm, see, for example, Radu Horaud, "The Advantage of Mounting a Camera onto a Robot Arm", Europe-China Workshop on Geometrical Modelling and Invariants for Computer Vision, 1995, pp. 206-213.
- The operation of the surgery support system 200 of FIG. 5 is basically the same as the operation described with reference to FIG. 4.
- Even when the optical system changes, the accuracy of the three-dimensional information after the change can be maintained, and the three-dimensional information generated before the change and the three-dimensional information generated after the change can be used continuously.
- FIG. 6 is a block diagram showing an example of the hardware configuration of the information processing device 300 constituting the surgery support system according to the second embodiment of the present technology.
- The surgery support system including the information processing device 300 of FIG. 6 is a system that, for example, plays back images recorded during surgery by the surgery support system of FIG. 1 after the surgery, and is used for the education of surgeons and students.
- The surgery support system including the information processing device 300 can also be called an endoscopic surgery education system.
- The information processing device 300 is composed of, for example, a computer.
- A CPU 301, ROM 302, and RAM 303 are connected to each other by a bus 304.
- An input/output interface 305 is further connected to the bus 304.
- An input unit 306 including a keyboard and a mouse, and an output unit 307 including a display and a speaker, are connected to the input/output interface 305.
- Also connected to the input/output interface 305 are a storage unit 308 composed of a hard disk, non-volatile memory, or the like, a communication unit 309 composed of a network interface or the like, and a drive 310 that drives removable media 311.
- When the surgery support system is used for education, it differs from the first embodiment in that the parameters do not need to be estimated immediately; it is possible to read the entire surgical field image once and then perform the processing.
- Therefore, a three-dimensional map optimized over the entire recorded image (an integrated three-dimensional map) is generated first, and then SLAM, including camera posture estimation, is run and the three-dimensional map is displayed.
- FIG. 7 is a block diagram showing a functional configuration example of the surgery support system.
- The surgery support system 350 of FIG. 7 is composed of an image storage unit 351, an information processing unit 352, and a display unit 353.
- The image storage unit 351 corresponds to the storage unit 308 in FIG. 6.
- The image storage unit 351 stores the surgical field images captured by the endoscope 11 (FIG. 1) during the operation.
- The information processing unit 352 is realized by the CPU 301 of FIG. 6.
- The information processing unit 352 performs signal processing on the surgical field images stored in the image storage unit 351 and supplies the processed surgical field images to the display unit 353.
- The display unit 353 corresponds to the display constituting the output unit 307 of FIG. 6.
- The display unit 353 displays the surgical field image based on the image signal supplied from the information processing unit 352.
- The information processing unit 352 is composed of an optical system change detection unit 361, a three-dimensional map generation unit 362, a three-dimensional map storage unit 363, a three-dimensional information generation unit 364, and a display information generation unit 365. At least a part of the information processing unit 352 is realized by the CPU 301 of FIG. 6 executing a predetermined program. Descriptions overlapping with the above will be omitted as appropriate.
- The optical system change detection unit 361 refers to the entire surgical field image stored in the image storage unit 351 and detects changes in the optical system. Changes in the optical system are detected in the same manner as by the optical system change detection unit 111 of FIG. 2.
- The optical system change detection unit 361 sets each run of frames having the same parameters, that is, each run of frames over which the optical system does not change, as a section.
- The optical system change detection unit 361 then estimates the parameters of each section. The parameter estimation is performed in the same manner as by the parameter estimation unit 112 of FIG. 2. The optical system change detection unit 361 outputs the parameters of each section to the three-dimensional map generation unit 362.
- The three-dimensional map generation unit 362 generates a three-dimensional map for each section using the parameters supplied from the optical system change detection unit 361.
- The three-dimensional map generated by the three-dimensional map generation unit 362 is a three-dimensional map of the subject appearing in the plural frames of the surgical field image constituting the section.
- Multi-view stereo or SfM (Structure from Motion), which can generate three-dimensional maps from multiple viewpoints, can be used for generating the three-dimensional maps.
- Multi-view stereo is described in, for example, "Multi-View Stereo: A Tutorial", Foundations and Trends in Computer Graphics and Vision, vol. 9, no. 1-2, 2013, pp. 1-148, and "Evaluation of multi-view 3D reconstruction software", CAIP 2015: Computer Analysis of Images and Patterns, pp. 450-461.
- The three-dimensional map generation unit 362 outputs the three-dimensional map of each section to the three-dimensional map storage unit 363.
- The three-dimensional map storage unit 363 stores the three-dimensional map of each section generated by the three-dimensional map generation unit 362.
- The three-dimensional information generation unit 364 integrates the three-dimensional maps of the sections stored in the three-dimensional map storage unit 363 and generates an integrated three-dimensional map spanning all the sections.
- Because the three-dimensional map generated in each section uses different parameters, the scale and position differ from section to section, and it is difficult to integrate the maps as they are and use them for SLAM processing. Therefore, the three-dimensional information generation unit 364 corrects the scale and the positional relationship of each section and integrates the three-dimensional maps while optimizing the scale and the like.
- Specifically, the three-dimensional information generation unit 364 estimates the scale of the three-dimensional maps of the other sections with respect to a reference three-dimensional map, and thereby scales and integrates the three-dimensional maps of all sections.
- Each point of the three-dimensional map generated in each section holds a vector called a feature, which represents the characteristics of that point.
- The three-dimensional information generation unit 364 can identify overlapping portions of the three-dimensional maps by searching for points that hold the same feature across different three-dimensional maps.
- Using the least squares method on the overlapping portions, the three-dimensional information generation unit 364 identifies the scale and the positional relationship that minimize the residual.
- The points holding features include feature points of the surgical field image and feature points of the three-dimensional map.
- Feature points of the surgical field image are, for example, SIFT, SURF, ORB, and AKAZE features.
- Feature points of the three-dimensional map are, for example, SHOT, PFH, and PPF features.
- When the points of the generated three-dimensional maps do not hold features and corresponding points cannot be identified between the maps, that is, when the overlapping portions cannot be identified, it is also possible to use ICP, which can register two point clouds while estimating their correspondence at the same time. A least-squares alignment of matched points is sketched below.
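- For the feature-matched case, the least-squares scale and pose between two section maps can be obtained in closed form; below is a sketch using the Umeyama-style solution (an assumption about the concrete solver, which the patent does not specify), where src and dst are N×3 arrays of corresponding points:

```python
import numpy as np


def similarity_align(src, dst):
    """Least-squares similarity transform (scale, R, t) with dst ~ s*R@src + t."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    var_s = ((src - mu_s) ** 2).sum() / len(src)      # variance of source points
    cov = (dst - mu_d).T @ (src - mu_s) / len(src)    # 3x3 cross-covariance
    U, D, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1.0                                # avoid a reflection
    R = U @ S @ Vt
    scale = np.trace(np.diag(D) @ S) / var_s
    t = mu_d - scale * R @ mu_s
    return scale, R, t
```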
- The three-dimensional information generation unit 364 uses the integrated three-dimensional map to generate three-dimensional information by SLAM, including estimation of the camera's self-position and orientation.
- The three-dimensional information generation unit 364 outputs the generated three-dimensional information to the display information generation unit 365.
- In the above, the three-dimensional map generation unit 362 generates a three-dimensional map for each section, but the three-dimensional map generation unit 362 may instead generate the three-dimensional information (position/orientation information and a three-dimensional map) of each section.
- In that case, the three-dimensional map storage unit 363 stores the three-dimensional information of each section.
- The three-dimensional information generation unit 364 then integrates the three-dimensional information of the sections and performs SLAM processing, including self-position and orientation estimation of the camera, using the integrated three-dimensional information to generate the three-dimensional information.
- The display information generation unit 365 causes the display unit 353 to display the surgical field image based on the image signal read from the image storage unit 351.
- The display information generation unit 365 also causes the display unit 353 to display the integrated three-dimensional map based on the three-dimensional information supplied from the three-dimensional information generation unit 364.
- FIG. 8 is a flowchart illustrating a three-dimensional information generation process in the surgery support system 350.
- In step S311, the optical system change detection unit 361 reads the surgical field image represented by the image signal obtained from the image storage unit 351.
- In step S312, the optical system change detection unit 361 refers to the entire surgical field image and, based on the detection results for changes in the optical system, sets each run of frames having the same parameters, that is, each run of frames over which the optical system does not change, as a section.
- In step S313, the optical system change detection unit 361 estimates the parameters of each section.
- In step S314, the three-dimensional map generation unit 362 generates the three-dimensional map of each section.
- In step S315, the three-dimensional map storage unit 363 stores the three-dimensional map of each section generated by the three-dimensional map generation unit 362.
- In step S316, the three-dimensional information generation unit 364 integrates the three-dimensional maps of the sections stored in the three-dimensional map storage unit 363 to generate an integrated three-dimensional map.
- The three-dimensional information generation unit 364 uses the integrated three-dimensional map to generate three-dimensional information by SLAM, including self-position and orientation estimation of the camera.
- In step S317, the display information generation unit 365 causes the display unit 353 to display the three-dimensional map based on the three-dimensional information supplied from the three-dimensional information generation unit 364.
- As described above, the parameters serving as the conditions for generating three-dimensional information are updated for each section set according to the changes in the optical system of the imaging unit 101, and the three-dimensional maps generated for the individual sections are integrated.
- With the second embodiment, when the system is used for post-surgery education or the like, errors in the three-dimensional information can be suppressed even if a change in the optical system occurred during the surgery.
- FIG. 9 shows an example of a microsurgery system using a surgical video microscope device as an observation medical device for observing the inside of a patient.
- FIG. 9 shows a doctor, the practitioner (user) 520, performing an operation on an operation target (patient) 540 on an operating table 530 using surgical instruments 521 such as a scalpel, tweezers, and forceps.
- In the following description, "treatment" is a general term for various medical acts, such as surgery and examination, performed by the doctor (the user 520) on the patient (the treatment target 540).
- In the example of FIG. 9, a surgical operation is shown, but the treatment using the surgical video microscope device 510 is not limited to surgery and may be various other treatments.
- A surgical video microscope device 510 according to an embodiment of the present technology is provided beside the operating table 530.
- The surgical video microscope device 510 includes a base portion 511 serving as a base, an arm portion 512 extending from the base portion 511, and an imaging unit 515 connected to the tip of the arm portion 512 as a tip unit.
- The arm portion 512 has a plurality of joint portions 513a, 513b, and 513c, a plurality of links 514a and 514b connected by the joint portions, and the imaging unit 515 provided at its tip.
- In the example of FIG. 9, for simplicity, the arm portion 512 has three joint portions 513a to 513c and two links 514a and 514b.
- In practice, the numbers and shapes of the joint portions 513a to 513c and the links 514a and 514b, and the directions of the drive shafts of the joint portions 513a to 513c, may be set appropriately so as to realize the desired degrees of freedom.
- The joint portions 513a to 513c have the function of rotatably connecting the links 514a and 514b to each other, and the drive of the arm portion 512 is controlled by driving the rotation of the joint portions 513a to 513c.
- The imaging unit 515 is connected to the tip of the arm portion 512 as a tip unit.
- The imaging unit 515 is a unit that includes an optical system for acquiring an optical image of a subject and acquires an image of the imaging target; it is configured, for example, as a camera capable of capturing moving images or still images. As shown in FIG. 9, the surgical video microscope device 510 controls the self-positions and postures of the arm portion 512 and the imaging unit 515 so that the imaging unit 515 provided at the tip of the arm portion 512 images the treatment site of the operation target 540.
- The configuration of the imaging unit 515 connected to the tip of the arm portion 512 as a tip unit is not particularly limited; for example, the imaging unit 515 may be configured as an endoscope or a microscope. The imaging unit 515 may also be configured to be detachable from the arm portion 512.
- In this way, an imaging unit 515 according to the intended use may be appropriately connected to the tip of the arm portion 512 as a tip unit.
- Although a case where the imaging unit 515 is applied as the tip unit is described here, the tip unit connected to the tip of the arm portion 512 is, of course, not necessarily limited to the imaging unit 515.
- At a position facing the user 520, a display device 550 such as a monitor or display is installed.
- The image of the treatment site acquired by the imaging unit 515 is subjected to various image processing by, for example, an image processing device built into or externally attached to the surgical video microscope device 510, and is then displayed as an electronic image on the display screen of the display device 550.
- This allows the user 520 to perform various treatments (for example, surgery) while viewing the electronic image of the treatment site displayed on the display screen of the display device 550.
- Here, the imaging unit 515 corresponds to, for example, the imaging unit 101 described with reference to FIG. 2. The image processing device that performs various image processing on the image of the treatment site acquired by the imaging unit 515 corresponds to an example of the information processing unit 102 described with reference to FIG. 2. Similarly, the display device 550 corresponds to an example of the display unit 103 described with reference to FIG. 2.
- The arm portion 512 corresponds to an example of the robot arm 211 described with reference to FIG. 5.
- The surgical video microscope device 510 includes a device corresponding to the robot arm device 212 described with reference to FIG. 5.
- FIG. 10 is a block diagram showing an example of the hardware configuration of the information processing device 900 constituting the surgery support system according to the embodiment of the present technology.
- The information processing device 900 includes a CPU 901, a ROM 903, and a RAM 905. The information processing device 900 further includes a host bus 907, a bridge 909, an external bus 911, an interface 913, an input device 915, an output device 917, and a storage device 919. The information processing device 900 may also include a drive 921, a connection port 923, and a communication device 925.
- The CPU 901 functions as an arithmetic processing device and a control device and controls all or part of the operation of the information processing device 900 according to various programs recorded in the ROM 903, the RAM 905, the storage device 919, or a removable recording medium 927.
- The ROM 903 stores programs and calculation parameters used by the CPU 901.
- The RAM 905 temporarily stores programs used by the CPU 901, parameters that change as the programs execute, and the like. These are connected to each other by the host bus 907, which is composed of an internal bus such as a CPU bus.
- Each component of the information processing unit 102 described with reference to FIG. 2 is realized by, for example, the CPU 901.
- The host bus 907 is connected to the external bus 911, such as a PCI (Peripheral Component Interconnect/Interface) bus, via the bridge 909.
- The input device 915, the output device 917, the storage device 919, the drive 921, the connection port 923, and the communication device 925 are connected to the external bus 911 via the interface 913.
- The input device 915 is an operating means operated by the user, such as a mouse, keyboard, touch panel, buttons, switches, levers, or pedals. The input device 915 may also be, for example, a remote control means (a so-called remote controller) using infrared rays or other radio waves, or an externally connected device 929 such as a mobile phone, smartphone, or tablet terminal that supports the operation of the information processing device 900.
- The input device 915 includes, for example, an input control circuit that generates an input signal based on the information input by the user using the above-mentioned operating means and outputs the input signal to the CPU 901.
- By operating the input device 915, the user can input various data to the information processing device 900 and issue processing instructions.
- The output device 917 is composed of a device capable of visually or audibly notifying the user of acquired information.
- The output device 917 may be configured as a display device such as a CRT display device, a liquid crystal display device, a plasma display device, an EL display device, or a lamp; an audio output device such as a speaker or headphones; a printer device; or the like.
- The output device 917 outputs, for example, the results obtained by the various processes performed by the information processing device 900.
- Specifically, a display device displays the results obtained by the various processes performed by the information processing device 900 as text or images.
- An audio output device converts an audio signal composed of reproduced audio data, acoustic data, and the like into an analog signal and outputs it.
- The display unit 103 described with reference to FIG. 2 is realized by, for example, the output device 917.
- The storage device 919 is a data storage device configured as an example of the storage unit of the information processing device 900.
- The storage device 919 is composed of, for example, a magnetic storage device such as an HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, or a magneto-optical storage device.
- The storage device 919 stores programs executed by the CPU 901, various data, and the like.
- The drive 921 is a reader/writer for recording media and is built into or externally attached to the information processing device 900.
- The drive 921 reads information recorded on a mounted removable recording medium 927 such as a magnetic disk, optical disc, magneto-optical disc, or semiconductor memory, and outputs the information to the RAM 905.
- The drive 921 can also write records to a mounted removable recording medium 927 such as a magnetic disk, optical disc, magneto-optical disc, or semiconductor memory.
- The removable recording medium 927 is, for example, DVD media, HD-DVD media, or Blu-ray (registered trademark) media. The removable recording medium 927 may also be a CompactFlash (registered trademark) (CF) card, a flash memory, an SD (Secure Digital) memory card, or the like. Further, the removable recording medium 927 may be, for example, an IC (Integrated Circuit) card equipped with a non-contact IC chip, an electronic device, or the like.
- The connection port 923 is a port for directly connecting the externally connected device 929 to the information processing device 900.
- Examples of the connection port 923 include a USB (Universal Serial Bus) port, an IEEE 1394 port, and a SCSI (Small Computer System Interface) port.
- Other examples of the connection port 923 include an RS-232C port, an optical audio terminal, and an HDMI (registered trademark) (High-Definition Multimedia Interface) port.
- The communication device 925 is, for example, a communication interface composed of a communication device for connecting to a communication network 931.
- The communication device 925 is, for example, a communication card for a wired or wireless LAN (Local Area Network), Bluetooth (registered trademark), or WUSB (Wireless USB). The communication device 925 may also be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), a modem for various kinds of communication, or the like.
- The communication device 925 can transmit and receive signals to and from the Internet and other communication devices according to a predetermined protocol such as TCP/IP. The communication network 931 connected to the communication device 925 is configured as a network connected by wire or wirelessly.
- The communication network 931 may be, for example, the Internet or a home LAN, or a communication network over which infrared communication, radio wave communication, or satellite communication is performed.
- Each component of the information processing device 300 of FIG. 6 and the information processing device 900 of FIG. 10 described above may be configured using general-purpose members, or may be configured with hardware specialized for its function. The hardware configuration used can therefore be changed appropriately according to the technical level at the time the embodiment of the present technology is implemented.
- It is also possible to create a computer program for realizing each function of the information processing device 300 and the information processing device 900 constituting the surgery support system according to the embodiments of the present technology, and to implement the computer program on a personal computer or the like. A computer-readable recording medium storing such a computer program can also be provided.
- The recording medium is, for example, a magnetic disk, an optical disc, a magneto-optical disc, or a flash memory.
- The computer program may also be distributed via a network, for example, without using a recording medium.
- The program executed by the computer may be a program whose processing is performed in chronological order in the order described in this specification, or a program whose processing is performed in parallel or at necessary timings, such as when a call is made.
- As described above, in the present technology, the surgical field data obtained by the medical observation device is acquired, and changes in the optical system of the medical observation device are detected.
- When a change in the optical system is detected, the parameters determined by the changed optical system are estimated, and the estimation result is used to set the conditions for generating three-dimensional information based on the surgical field data.
- As a result, the accuracy of the three-dimensional information can be maintained even when the optical system is changed.
- In this specification, a system means a set of a plurality of components (devices, modules (parts), and the like), regardless of whether all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules are housed in one housing, are both systems.
- The present technology can have a cloud computing configuration in which one function is shared and processed jointly by a plurality of devices via a network.
- Each step described in the above flowcharts can be executed by one device or shared among a plurality of devices.
- When one step includes a plurality of processes, the plurality of processes included in that one step can be executed by one device or shared among a plurality of devices.
- The present technology can also have the following configurations.
- (1) A medical observation system including: an acquisition unit that acquires surgical field data acquired by a medical observation device; a detection unit that detects a change in an optical system of the medical observation device; an estimation unit that estimates, when a change in the optical system is detected by the detection unit, parameters determined by the changed optical system; and a setting unit that sets, using the estimation result of the estimation unit, conditions for generating three-dimensional information based on the surgical field data.
- (2) The medical observation system according to (1), wherein the detection unit detects a change in the optical system by detecting a change in the surgical field image represented by the surgical field data.
- (3) The medical observation system according to (1), wherein the detection unit detects a change in the optical system by detecting a change in the focal length of the surgical field image represented by the surgical field data.
- (4) The medical observation system according to (1), wherein the detection unit detects a change in the optical system by using the trajectory of the position of the medical observation device held by a robot arm.
- (5) The medical observation system according to (1), wherein the detection unit detects that there has been a change in the optical system when the angle of view of the surgical field image represented by the surgical field data has changed and the medical observation device held by the robot arm has not moved.
- (6) The medical observation system according to any one of (1) to (5), wherein the detection unit divides, based on the change in the optical system, the surgical field data into sections each composed of a plurality of frames of surgical field images, and the estimation unit estimates the parameters for each section.
- (7) The medical observation system according to any one of (1) to (5), wherein the estimation unit estimates the parameters corresponding to the information of the optical system acquired from the medical observation device, based on a table obtained in advance and representing the relationship between the information of the optical system and the parameters.
- (8) The medical observation system according to any one of (1) to (5), wherein the estimation unit estimates the parameters based on the surgical field data.
- (9) The medical observation system according to (8), wherein the estimation unit estimates the parameters from the surgical field image represented by the surgical field data and generates a reliability index of a matrix of the parameters.
- (10) The medical observation system according to any one of (1) to (9), further including a three-dimensional information generation unit that generates the three-dimensional information using the parameters estimated by the estimation unit.
- (11) The medical observation system according to (10), wherein the three-dimensional information generation unit stops the generation of the three-dimensional information when a change in the optical system is detected by the detection unit, and resumes the generation of the three-dimensional information using the estimated parameters when the parameters are estimated by the estimation unit.
- (12) The medical observation system according to any one of (1) to (11), further including a display control unit that controls the display of the surgical field image represented by the surgical field data or of the three-dimensional information.
- (13) The medical observation system according to (12), wherein the display control unit displays the result of detection of the change in the optical system by the detection unit.
- (14) The medical observation system according to (13), wherein the display control unit displays, as the detection result, information indicating that the scope of the medical observation device has been exchanged.
- (15) The medical observation system according to (13), wherein the display control unit displays, as the detection result, information related to the scope of the medical observation device.
- (16) The medical observation system according to (13), wherein the display control unit displays the three-dimensional information before the change and the three-dimensional information after the change.
- (17) A medical observation method in which a medical observation system acquires surgical field data acquired by a medical observation device, detects a change in an optical system of the medical observation device, estimates, when a change in the optical system is detected, parameters determined by the changed optical system, and sets, using the estimation result, conditions for generating three-dimensional information based on the surgical field data.
- (18) A medical observation device including: an imaging unit that images a surgical field and generates surgical field data; and an output unit that outputs the surgical field data, the medical observation device being used in a medical observation system that detects a change in an optical system of the imaging unit, estimates, when a change in the optical system is detected, parameters determined by the changed optical system, and sets, using the estimation result, conditions for generating three-dimensional information based on the surgical field data.
- Surgery support system, 11 Endoscope, 13 CCU, 15 Display device, 100 Surgery support system, 101 Camera, 102 Information processing unit, 103 Display unit, 111 Optical system change detection unit, 112 Parameter estimation unit, 113 3D information generation unit, 114 Display information generation unit, 200 Surgery support system, 211 Robot arm, 212 Robot arm device, 300 Surgery support system, 301 CPU, 307 Output unit, 308 Storage unit, 350 Surgery support system, 351 Image storage unit, 352 Information processing unit, 353 Display unit, 361 Optical system change detection unit, 362 3D map generation unit, 363 3D map storage unit, 364 3D information generation unit, 365 Display information generation unit
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Physics & Mathematics (AREA)
- Surgery (AREA)
- Engineering & Computer Science (AREA)
- Optics & Photonics (AREA)
- General Health & Medical Sciences (AREA)
- Veterinary Medicine (AREA)
- Public Health (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- Radiology & Medical Imaging (AREA)
- Pathology (AREA)
- Biophysics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Chemical & Material Sciences (AREA)
- Analytical Chemistry (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Astronomy & Astrophysics (AREA)
- Robotics (AREA)
- Theoretical Computer Science (AREA)
- Signal Processing (AREA)
- Endoscopes (AREA)
- Instruments For Viewing The Inside Of Hollow Bodies (AREA)
Abstract
Description
1. First embodiment (use during surgery)
2. Second embodiment (use in education)
3. Application examples
4. Hardware configuration
5. Others
<Configuration example of a surgery support system (example in which a scopist holds the endoscope)>
FIG. 1 is a diagram showing a configuration example of a surgery support system according to the first embodiment of the present technology.
(Configuration around the CCU 13)
FIG. 2 is a block diagram showing a functional configuration example of the surgery support system.
The information processing unit 102 includes an optical system change detection unit 111, a parameter estimation unit 112, a three-dimensional information generation unit 113, and a display information generation unit 114.
The optical system change detection unit 111 detects a change in the optical system that occurs in the imaging unit 101 during surgery. A change in the optical system occurs, for example, when an adjustment of the optical system, such as a zoom (angle of view) adjustment (movement of the zoom lens) or a focus adjustment (movement of the focus lens), is performed in the imaging unit 101, or, when the imaging unit 101 includes a scope, when the scope is exchanged.
Since, for example, zoom and focus adjustments move some of the optical members of the optical system included in the imaging unit 101 based on the output of the CCU 13, information indicating the change in the optical system (for example, information indicating the position of the zoom lens or of the focus lens) is stored in the CCU 13. In this case, the optical system change detection unit 111 detects the change in the optical system based on the information indicating the change in the optical system stored in the CCU 13.
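By way of illustration only, the following Python sketch shows one way such stored lens positions and a scope identifier might be compared between successive readings. The CcuState structure, its field names, and the tolerance value are assumptions made for this example and are not part of the disclosure.

```python
# Hypothetical sketch: detect an optical-system change from values
# stored in the CCU (zoom/focus lens positions and a scope ID).
from dataclasses import dataclass


@dataclass
class CcuState:
    zoom_lens_position: float   # normalized 0.0-1.0 (assumed encoding)
    focus_lens_position: float  # normalized 0.0-1.0 (assumed encoding)
    scope_id: str               # identifier read from the attached scope


def optical_system_changed(prev: CcuState, curr: CcuState,
                           tol: float = 1e-3) -> bool:
    """Return True when any stored optical-system value has changed."""
    return (abs(curr.zoom_lens_position - prev.zoom_lens_position) > tol
            or abs(curr.focus_lens_position - prev.focus_lens_position) > tol
            or curr.scope_id != prev.scope_id)


if __name__ == "__main__":
    before = CcuState(0.30, 0.50, "scope-A")
    after = CcuState(0.45, 0.50, "scope-A")  # zoom lens has moved
    print(optical_system_changed(before, after))  # True
```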
The imaging unit 101 may also include a removable scope, and the scope may be provided with a storage unit that stores information indicating the type of the scope. In that case, a circuit included in the imaging unit 101 may acquire the scope information and output it to the CCU 13, and the optical system change detection unit 111 detects the change in the optical system based on the information obtained from the imaging unit 101.
In this case, the optical system change detection unit 111 detects the change in the optical system based on the image signal obtained from the imaging unit 101.
The parameter estimation unit 112 of FIG. 2 estimates parameters that serve as conditions for generating three-dimensional information based on the surgical field image. The parameters are determined by the optical system and are, for example, information indicating the focal length, the image center, the magnification, and the lens distortion coefficients. The information constituting the parameters need only include at least one parameter determined by the optical system, for example, at least one of the focal length, the image center, the magnification, and the distortion coefficients. Note that the parameters determined by the optical system include parameters determined by the arrangement of the optical system in the imaging unit 101; for example, even with the same scope, detaching and reattaching the scope may slightly shift the image center.
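For reference, these quantities correspond to the intrinsic camera model commonly used in computer vision. A minimal sketch follows, with placeholder numbers rather than values from the disclosure.

```python
# Hypothetical sketch: the kind of parameter matrix determined by the
# optical system -- a pinhole intrinsic matrix built from focal length
# and image center, plus separate lens distortion coefficients.
import numpy as np

fx, fy = 800.0, 800.0   # focal lengths in pixels (placeholders)
cx, cy = 640.0, 360.0   # image center in pixels (placeholders)

K = np.array([[fx, 0.0, cx],
              [0.0, fy, cy],
              [0.0, 0.0, 1.0]])

dist = np.array([0.1, -0.05, 0.0, 0.0, 0.0])  # k1, k2, p1, p2, k3

# A zoom adjustment or scope exchange changes fx, fy, cx, cy and the
# distortion coefficients, which is why they must be re-estimated.
print(K)
```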
In this case, the parameter estimation unit 112 refers to a table representing the relationship between the information of the optical system and the parameters, and obtains the parameters corresponding to the information of the optical system obtained from the imaging unit 101. Such a table, generated in advance and representing the relationship between the information of the optical system and the parameters, is given to the parameter estimation unit 112.
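A minimal sketch of such a pre-computed table follows; the scope identifiers, quantized zoom steps, and numeric values are hypothetical.

```python
# Hypothetical sketch: a table prepared in advance that maps
# optical-system information (scope ID, quantized zoom step) to
# intrinsic parameters.
from typing import NamedTuple


class Intrinsics(NamedTuple):
    fx: float
    fy: float
    cx: float
    cy: float


PARAM_TABLE = {
    ("scope-A", 0): Intrinsics(780.0, 780.0, 640.0, 360.0),
    ("scope-A", 1): Intrinsics(960.0, 960.0, 642.0, 358.0),
    ("scope-B", 0): Intrinsics(700.0, 700.0, 639.0, 361.0),
}


def lookup_parameters(scope_id: str, zoom_step: int) -> Intrinsics:
    """Return the pre-measured parameters for the reported state."""
    return PARAM_TABLE[(scope_id, zoom_step)]


print(lookup_parameters("scope-A", 1))
```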
In this case, the parameter estimation unit 112 estimates a parameter matrix as the parameters based on the image signal obtained from the imaging unit 101.
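A minimal sketch of image-based estimation of a parameter matrix is shown below, using feature matching and a RANSAC fit of the fundamental matrix with OpenCV. Treating the RANSAC inlier ratio as the reliability index is an assumption of this example; the disclosure states only that a reliability index of the parameter matrix is generated.

```python
# Hypothetical sketch: estimate a parameter (fundamental) matrix from
# two grayscale frames of the image signal and score its reliability.
import cv2
import numpy as np


def estimate_f_matrix(frame1: np.ndarray, frame2: np.ndarray):
    """frame1/frame2: grayscale frames of the same scene taken from
    slightly different viewpoints. Returns (F, reliability)."""
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(frame1, None)
    kp2, des2 = orb.detectAndCompute(frame2, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
    F, mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 3.0, 0.99)
    if F is None:
        return None, 0.0
    reliability = float(mask.sum()) / len(mask)  # RANSAC inlier ratio
    return F, reliability
```

Self-calibration methods such as the Faugeras approach listed in the non-patent citations can then recover intrinsic parameters from matrices of this kind.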
The three-dimensional information generation unit 113 generates three-dimensional information based on each frame of the surgical field image represented by the image signal supplied from the imaging unit 101, using the parameters that serve as the generation conditions for the three-dimensional information. The three-dimensional information is information generated from the surgical field image using the parameters described above, and includes a three-dimensional map representing the three-dimensional structure of the subject (such as organs or the inside of a body cavity) shown in the surgical field image, and position/orientation information representing the position and orientation of the imaging unit 101 itself.
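A minimal two-view sketch of this kind of processing follows, assuming matched image points pts1 and pts2 (N x 2 float32 arrays) and an intrinsic matrix K are already available; the disclosed generation method is not limited to this pipeline.

```python
# Hypothetical sketch: recover camera pose and a sparse 3D point set
# from two views once the intrinsic parameters K are known.
import cv2
import numpy as np


def two_view_reconstruction(pts1: np.ndarray, pts2: np.ndarray,
                            K: np.ndarray):
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
    _, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    # Projection matrices for the first (reference) and second view.
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([R, t])
    pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
    pts3d = (pts4d[:3] / pts4d[3]).T  # homogeneous -> Euclidean
    return R, t, pts3d  # pose of view 2 and sparse 3D map points
```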
The display information generation unit 114 causes the display unit 103 to display the surgical field image based on the image signal supplied from the imaging unit 101.
FIG. 4 is a flowchart illustrating the three-dimensional information generation processing in the surgery support system 100.
FIG. 5 is a diagram showing another configuration example of the surgery support system.
The functional configuration of the surgery support system 200 of FIG. 5 is the same as the configuration described with reference to FIG. 2. Referring again to FIG. 2, a method of detecting a change in the optical system using the position/orientation information of the imaging unit 101 (endoscope 11), and a method of estimating the parameters, will be described for the surgery support system 200.
The optical system change detection unit 111 detects a change in the optical system based on the trajectory of the position of the imaging unit 101 supplied from the robot arm device 212.
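A minimal sketch of this check follows, corresponding to configuration (5) below: when the apparent image scale changes although the arm trajectory shows no camera motion, the change is attributed to the optical system (for example, a zoom adjustment). The thresholds and the scale-ratio input are assumptions of the example.

```python
# Hypothetical sketch: attribute an angle-of-view change to the optical
# system when the robot arm reports that the camera has not moved.
import numpy as np


def changed_without_motion(trajectory: np.ndarray,
                           scale_ratio: float,
                           motion_tol: float = 1e-3,
                           scale_tol: float = 0.05) -> bool:
    """trajectory: (N, 3) camera positions from the robot arm device.
    scale_ratio: apparent image scale between two frames (1.0 = same)."""
    displacement = np.linalg.norm(trajectory[-1] - trajectory[0])
    arm_static = displacement < motion_tol
    view_changed = abs(scale_ratio - 1.0) > scale_tol
    return arm_static and view_changed


traj = np.zeros((30, 3))  # arm stationary
print(changed_without_motion(traj, scale_ratio=1.4))  # True
```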
The parameter estimation unit 112 estimates the parameters using the position/orientation information of the imaging unit 101 obtained from the robot arm device 212. Estimation of parameters based on information obtained from a robot arm is disclosed in, for example, Radu Horaud, "The Advantage of Mounting a Camera onto a Robot Arm", Europe-China Workshop on Geometrical Modelling and Invariants for Computer Vision, 1995, pp. 206-213.
<Configuration example of the surgery support system>
FIG. 6 is a block diagram showing an example of the hardware configuration of the information processing device 300 constituting the surgery support system according to the second embodiment of the present technology.
(Overall configuration)
FIG. 7 is a block diagram showing a functional configuration example of the surgery support system.
The information processing unit 352 includes an optical system change detection unit 361, a three-dimensional map generation unit 362, a three-dimensional map storage unit 363, a three-dimensional information generation unit 364, and a display information generation unit 365. At least part of the information processing unit 352 is realized by the CPU 301 of FIG. 6 executing a predetermined program. Descriptions that overlap with the explanations given above are omitted as appropriate.
The optical system change detection unit 361 refers to the entire surgical field image stored in the image storage unit 351 and detects changes in the optical system. A change in the optical system is detected in the same manner as by the optical system change detection unit 111 of FIG. 2.
The three-dimensional map generation unit 362 generates a three-dimensional map of each section using the parameters supplied from the optical system change detection unit 361. The three-dimensional map generated by the three-dimensional map generation unit 362 is a three-dimensional map of the subject shown in the plurality of frames of surgical field images constituting the section.
The three-dimensional map storage unit 363 stores the three-dimensional map of each section generated by the three-dimensional map generation unit 362.
The three-dimensional information generation unit 364 integrates the three-dimensional maps of the sections stored in the three-dimensional map storage unit 363, generating a three-dimensional map unified across all the sections.
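A minimal sketch of this section-wise flow follows, assuming the detected changes are given as frame indices; the integration step here simply concatenates the per-section point clouds, whereas an actual system would align them beforehand.

```python
# Hypothetical sketch: split frames into sections at detected
# optical-system changes, then merge the per-section 3D maps.
import numpy as np


def split_into_sections(n_frames: int, change_frames: list[int]) -> list[range]:
    """Split frame indices into sections at each detected change."""
    bounds = [0] + sorted(change_frames) + [n_frames]
    return [range(a, b) for a, b in zip(bounds[:-1], bounds[1:]) if a < b]


def integrate_maps(section_maps: list[np.ndarray]) -> np.ndarray:
    """Stand-in for map integration: concatenate (M_i, 3) point sets."""
    return np.vstack(section_maps)


sections = split_into_sections(100, change_frames=[40, 70])
print(sections)  # [range(0, 40), range(40, 70), range(70, 100)]
```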
The display information generation unit 365, like the display information generation unit 114 of FIG. 2, causes the display unit 353 to display the surgical field image based on the image signal read from the image storage unit 351.
FIG. 8 is a flowchart illustrating the three-dimensional information generation processing in the surgery support system 350.
Next, with reference to FIG. 9, an example in which a surgical video microscope device equipped with an arm is used will be described as an application example of the surgery support system according to the embodiment of the present technology.
Next, with reference to FIG. 10, an example of the hardware configuration of the information processing device constituting the surgery support system according to the embodiment of the present technology will be described in detail.
As described above, in the present technology, surgical field data acquired by the medical observation device is acquired, and a change in the optical system of the medical observation device is detected. When a change in the optical system is detected, parameters determined by the changed optical system are estimated, and the estimation result is used to set conditions for generating three-dimensional information based on the surgical field data. As a result, the accuracy of the three-dimensional information can be maintained even when a change occurs in the optical system.
The present technology can also have the following configurations.
(1)
A medical observation system including:
an acquisition unit that acquires surgical field data acquired by a medical observation device;
a detection unit that detects a change in an optical system of the medical observation device;
an estimation unit that estimates, when a change in the optical system is detected by the detection unit, parameters determined by the changed optical system; and
a setting unit that sets, using an estimation result of the estimation unit, conditions for generating three-dimensional information based on the surgical field data.
(2)
The medical observation system according to (1), wherein the detection unit detects the change in the optical system by detecting a change in a surgical field image represented by the surgical field data.
(3)
The medical observation system according to (1), wherein the detection unit detects the change in the optical system by detecting a change in a focal length of the surgical field image represented by the surgical field data.
(4)
The medical observation system according to (1), wherein the detection unit detects the change in the optical system by using a trajectory of a position of the medical observation device held by a robot arm.
(5)
The medical observation system according to (1), wherein the detection unit detects that there has been a change in the optical system when a change in an angle of view of the surgical field image represented by the surgical field data has occurred and the medical observation device held by the robot arm has not moved.
(6)
The medical observation system according to any one of (1) to (5), wherein the detection unit divides, based on the change in the optical system, the surgical field data into sections each being an interval composed of a plurality of frames of surgical field images, and the estimation unit estimates the parameters for each of the sections.
(7)
The medical observation system according to any one of (1) to (5), wherein the estimation unit estimates the parameters corresponding to information of the optical system acquired from the medical observation device, based on a table obtained in advance and representing a relationship between the information of the optical system and the parameters.
(8)
The medical observation system according to any one of (1) to (5), wherein the estimation unit estimates the parameters based on the surgical field data.
(9)
The medical observation system according to (8), wherein the estimation unit estimates the parameters from the surgical field image represented by the surgical field data and generates a reliability index of a matrix of the parameters.
(10)
The medical observation system according to any one of (1) to (9), further including a three-dimensional information generation unit that generates the three-dimensional information using the parameters estimated by the estimation unit.
(11)
The medical observation system according to (10), wherein the three-dimensional information generation unit stops the generation of the three-dimensional information when the change in the optical system is detected by the detection unit, and resumes the generation of the three-dimensional information using the estimated parameters when the parameters are estimated by the estimation unit.
(12)
The medical observation system according to any one of (1) to (11), further including a display control unit that controls display of the surgical field image represented by the surgical field data or of the three-dimensional information.
(13)
The medical observation system according to (12), wherein the display control unit displays a result of detection of the change in the optical system by the detection unit.
(14)
The medical observation system according to (13), wherein the display control unit displays, as the detection result, information indicating that a scope of the medical observation device has been exchanged.
(15)
The medical observation system according to (13), wherein the display control unit displays, as the detection result, information related to the scope of the medical observation device.
(16)
The medical observation system according to (13), wherein the display control unit displays the three-dimensional information before the change and the three-dimensional information after the change.
(17)
A medical observation method in which a medical observation system:
acquires surgical field data acquired by a medical observation device;
detects a change in an optical system of the medical observation device;
estimates, when a change in the optical system is detected, parameters determined by the changed optical system; and
sets, using an estimation result, conditions for generating three-dimensional information based on the surgical field data.
(18)
A medical observation device including:
an imaging unit that images a surgical field and generates surgical field data; and
an output unit that outputs the surgical field data,
the medical observation device being used in a medical observation system that detects a change in an optical system of the imaging unit, estimates, when the change in the optical system is detected, parameters determined by the changed optical system, and sets, using an estimation result, conditions for generating three-dimensional information based on the surgical field data.
Claims (18)
- A medical observation system including: an acquisition unit that acquires surgical field data acquired by a medical observation device; a detection unit that detects a change in an optical system of the medical observation device; an estimation unit that estimates, when a change in the optical system is detected by the detection unit, parameters determined by the changed optical system; and a setting unit that sets, using an estimation result of the estimation unit, conditions for generating three-dimensional information based on the surgical field data.
- The medical observation system according to claim 1, wherein the detection unit detects the change in the optical system by detecting a change in a surgical field image represented by the surgical field data.
- The medical observation system according to claim 1, wherein the detection unit detects the change in the optical system by detecting a change in a focal length of the surgical field image represented by the surgical field data.
- The medical observation system according to claim 1, wherein the detection unit detects the change in the optical system by using a trajectory of a position of the medical observation device held by a robot arm.
- The medical observation system according to claim 1, wherein the detection unit detects that there has been a change in the optical system when a change in an angle of view of the surgical field image represented by the surgical field data has occurred and the medical observation device held by the robot arm has not moved.
- The medical observation system according to claim 1, wherein the detection unit divides, based on the change in the optical system, the surgical field data into sections each being an interval composed of a plurality of frames of surgical field images, and the estimation unit estimates the parameters for each of the sections.
- The medical observation system according to claim 1, wherein the estimation unit estimates the parameters corresponding to information of the optical system acquired from the medical observation device, based on a table obtained in advance and representing a relationship between the information of the optical system and the parameters.
- The medical observation system according to claim 1, wherein the estimation unit estimates the parameters based on the surgical field data.
- The medical observation system according to claim 8, wherein the estimation unit estimates the parameters from the surgical field image represented by the surgical field data and generates a reliability index of a matrix of the parameters.
- The medical observation system according to claim 1, further including a three-dimensional information generation unit that generates the three-dimensional information using the parameters estimated by the estimation unit.
- The medical observation system according to claim 10, wherein the three-dimensional information generation unit stops the generation of the three-dimensional information when the change in the optical system is detected by the detection unit, and resumes the generation of the three-dimensional information using the estimated parameters when the parameters are estimated by the estimation unit.
- The medical observation system according to claim 1, further including a display control unit that controls display of the surgical field image represented by the surgical field data or of the three-dimensional information.
- The medical observation system according to claim 12, wherein the display control unit displays a result of detection of the change in the optical system by the detection unit.
- The medical observation system according to claim 13, wherein the display control unit displays, as the detection result, information indicating that a scope of the medical observation device has been exchanged.
- The medical observation system according to claim 13, wherein the display control unit displays, as the detection result, information related to the scope of the medical observation device.
- The medical observation system according to claim 13, wherein the display control unit displays the three-dimensional information before the change and the three-dimensional information after the change.
- A medical observation method in which a medical observation system acquires surgical field data acquired by a medical observation device, detects a change in an optical system of the medical observation device, estimates, when a change in the optical system is detected, parameters determined by the changed optical system, and sets, using an estimation result, conditions for generating three-dimensional information based on the surgical field data.
- A medical observation device including: an imaging unit that images a surgical field and generates surgical field data; and an output unit that outputs the surgical field data, the medical observation device being used in a medical observation system that detects a change in an optical system of the imaging unit, estimates, when the change in the optical system is detected, parameters determined by the changed optical system, and sets, using an estimation result, conditions for generating three-dimensional information based on the surgical field data.
Priority Applications (5)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP20785287.2A EP3933481A4 (en) | 2019-03-29 | 2020-03-23 | SYSTEM AND METHOD AND DEVICE OF MEDICAL OBSERVATION |
| CN202080023381.9A CN113614607B (zh) | 2019-03-29 | 2020-03-23 | Medical observation system, method, and medical observation device |
| US17/441,698 US12207794B2 (en) | 2019-03-29 | 2020-03-23 | Medical observation system, method, and medical observation device |
| JP2021511472A JP7571722B2 (ja) | 2019-03-29 | 2020-03-23 | Medical observation system and method, and medical observation device |
| US18/988,944 US20250127382A1 (en) | 2019-03-29 | 2024-12-20 | Medical observation system, method, and medical observation device |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2019-065756 | 2019-03-29 | ||
| JP2019065756 | 2019-03-29 |
Related Child Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US17/441,698 A-371-Of-International US12207794B2 (en) | 2019-03-29 | 2020-03-23 | Medical observation system, method, and medical observation device |
| US18/988,944 Continuation US20250127382A1 (en) | 2019-03-29 | 2024-12-20 | Medical observation system, method, and medical observation device |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2020203405A1 true WO2020203405A1 (ja) | 2020-10-08 |
Family
ID=72667721
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2020/012676 Ceased WO2020203405A1 (ja) | Medical observation system and method, and medical observation device | 2019-03-29 | 2020-03-23 |
Country Status (5)
| Country | Link |
|---|---|
| US (2) | US12207794B2 (ja) |
| EP (1) | EP3933481A4 (ja) |
| JP (1) | JP7571722B2 (ja) |
| CN (1) | CN113614607B (ja) |
| WO (1) | WO2020203405A1 (ja) |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20230156174A1 (en) * | 2021-11-17 | 2023-05-18 | 3Dintegrated Aps | Surgical visualization image enhancement |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH11337845A (ja) * | 1998-05-25 | 1999-12-10 | Mitsubishi Electric Corp | Endoscope device |
| JP2009258273A (ja) * | 2008-04-15 | 2009-11-05 | Olympus Corp | Endoscope device for measurement, and program |
| JP2010263949A (ja) * | 2009-05-12 | 2010-11-25 | Hoya Corp | Medical video processor |
| JP2015195844A (ja) * | 2014-03-31 | 2015-11-09 | FUJIFILM Corporation | Endoscope system, processor device, light source device, method for operating endoscope system, method for operating processor device, and method for operating light source device |
| JP2017225700A (ja) | 2016-06-23 | 2017-12-28 | Olympus Corporation | Observation support device and endoscope system |
| JP2018161377A (ja) * | 2017-03-27 | 2018-10-18 | Sony Corporation | Control device for medical system, control method for medical system, and medical system |
Family Cites Families (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2015029318A1 (ja) * | 2013-08-26 | 2015-03-05 | Panasonic Intellectual Property Management Co., Ltd. | Three-dimensional display device and three-dimensional display method |
| WO2016154589A1 (en) * | 2015-03-25 | 2016-09-29 | Camplex, Inc. | Surgical visualization systems and displays |
- 2020
  - 2020-03-23 CN CN202080023381.9A patent/CN113614607B/zh active Active
  - 2020-03-23 JP JP2021511472A patent/JP7571722B2/ja active Active
  - 2020-03-23 EP EP20785287.2A patent/EP3933481A4/en active Pending
  - 2020-03-23 US US17/441,698 patent/US12207794B2/en active Active
  - 2020-03-23 WO PCT/JP2020/012676 patent/WO2020203405A1/ja not_active Ceased
- 2024
  - 2024-12-20 US US18/988,944 patent/US20250127382A1/en active Pending
Patent Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH11337845A (ja) * | 1998-05-25 | 1999-12-10 | Mitsubishi Electric Corp | Endoscope device |
| JP2009258273A (ja) * | 2008-04-15 | 2009-11-05 | Olympus Corp | Endoscope device for measurement, and program |
| JP2010263949A (ja) * | 2009-05-12 | 2010-11-25 | Hoya Corp | Medical video processor |
| JP2015195844A (ja) * | 2014-03-31 | 2015-11-09 | FUJIFILM Corporation | Endoscope system, processor device, light source device, method for operating endoscope system, method for operating processor device, and method for operating light source device |
| JP2017225700A (ja) | 2016-06-23 | 2017-12-28 | Olympus Corporation | Observation support device and endoscope system |
| JP2018161377A (ja) * | 2017-03-27 | 2018-10-18 | Sony Corporation | Control device for medical system, control method for medical system, and medical system |
Non-Patent Citations (6)
| Title |
|---|
| "Evaluation of multi-view 3D reconstruction software", CAIP 2015: COMPUTER ANALYSIS OF IMAGES AND PATTERNS, pages 450 - 461 |
| KAZUKI NOZAWA: "Stabilization of three-dimensional restoration for input image group with unknown focal length", CVIM-182, vol. 2012, no. 19 |
| MULTI-VIEW STEREO: A TUTORIAL, FOUNDATIONS AND TRENDS IN COMPUTER GRAPHICS AND VISION, vol. 9, no. 1-2, 2013, pages 1 - 148 |
| O. D. FAUGERAS, EUROPEAN CONFERENCE ON COMPUTER VISION, 1992, pages 321 - 334 |
| RADU HORAUD: "The Advantage of Mounting a Camera onto a Robot Arm", EUROPE-CHINA WORKSHOP ON GEOMETRICAL MODELLING AND INVARIANTS FOR COMPUTER VISION, 1995, pages 206 - 213 |
| See also references of EP3933481A4 |
Also Published As
| Publication number | Publication date |
|---|---|
| JP7571722B2 (ja) | 2024-10-23 |
| EP3933481A1 (en) | 2022-01-05 |
| EP3933481A4 (en) | 2022-05-11 |
| CN113614607A (zh) | 2021-11-05 |
| JPWO2020203405A1 (ja) | 2020-10-08 |
| US12207794B2 (en) | 2025-01-28 |
| US20220160217A1 (en) | 2022-05-26 |
| CN113614607B (zh) | 2025-10-03 |
| US20250127382A1 (en) | 2025-04-24 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20240033014A1 (en) | Guidance for placement of surgical ports | |
| WO2019181632A1 (en) | Surgical assistance apparatus, surgical method, non-transitory computer readable medium and surgical assistance system | |
| CN102821671B | Endoscope observation support system and device | |
| US6511418B2 | Apparatus and method for calibrating an endoscope | |
| JP4630564B2 | Surgery support device, method, and program | |
| JPWO2019044328A1 | Medical image processing device, medical image processing system, and method for driving medical image processing device | |
| US11344180B2 (en) | System, apparatus, and method for calibrating oblique-viewing rigid endoscope | |
| WO2013138079A2 (en) | Otoscanner with camera for video and scanning | |
| WO2013141155A1 | Image completion system for occluded regions in an image, image processing device, and program therefor | |
| US12183038B2 (en) | Camera calibration using fiducial markers on surgical tools | |
| JP2022020592A | Medical arm control system, medical arm control method, and program | |
| CN118510463A | Automatic registration of preoperative volumetric image data using search images | |
| CN114126531B | Medical imaging system, medical imaging processing method, and medical information processing device | |
| US20250127382A1 (en) | Medical observation system, method, and medical observation device | |
| JP7517325B2 | Medical system, signal processing device, and signal processing method | |
| US10049480B2 (en) | Image alignment device, method, and program | |
| CN115105202B | Lesion confirmation method and system for use in endoscopic surgery | |
| CN114727860A | Physical medical element placement system | |
| JP2020534050A (ja) | ロボット外科手技中に立体視知覚通知および/または推奨事項を提供するためのシステム、方法、およびコンピュータ可読媒体 | |
| US12220226B2 (en) | Surgical site measurement, and camera calibration using fiducial markers on surgical tools | |
| Boudreault et al. | 6-degree vision based tracking of a mandible phantom with deep learning |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20785287 Country of ref document: EP Kind code of ref document: A1 |
|
| ENP | Entry into the national phase |
Ref document number: 2021511472 Country of ref document: JP Kind code of ref document: A |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| ENP | Entry into the national phase |
Ref document number: 2020785287 Country of ref document: EP Effective date: 20210929 |
|
| WWG | Wipo information: grant in national office |
Ref document number: 202080023381.9 Country of ref document: CN |