WO2022168242A1 - Information processing device and support system - Google Patents
Information processing device and support system
- Publication number
- WO2022168242A1 (PCT/JP2021/004163)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image data
- image
- annotation
- unit
- generated
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B5/00—Electrically-operated educational appliances
- G09B5/02—Electrically-operated educational appliances with visual presentation of the material to be studied, e.g. using film strip
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/361—Image-producing devices, e.g. surgical cameras
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00043—Operational features of endoscopes provided with output arrangements
- A61B1/00045—Display arrangement
- A61B1/0005—Display arrangement combining images e.g. side-by-side, superimposed or tiled
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/22—Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition
- G06V10/235—Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition based on user input or interaction
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/25—Determination of region of interest [ROI] or a volume of interest [VOI]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/94—Hardware or software architectures specially adapted for image or video understanding
- G06V10/95—Hardware or software architectures specially adapted for image or video understanding structured as a network, e.g. client-server architectures
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/20—Scenes; Scene-specific elements in augmented reality scenes
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/40—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H80/00—ICT specially adapted for facilitating communication between medical practitioners or patients, e.g. for collaborative diagnosis, therapy or health monitoring
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/313—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for introducing through surgical openings, e.g. laparoscopes
- A61B1/3132—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for introducing through surgical openings, e.g. laparoscopes for laparoscopy
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
- A61B2090/365—Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/24—Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/03—Recognition of patterns in medical or anatomical images
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/20—ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
- G16H40/67—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
Definitions
- The present invention relates to a technique by which a practitioner receives remote instruction from an instructor.
- A support system is known that allows a practitioner to perform a treatment on a patient while receiving instructions from a remote instructor.
- This support system is used, for example, when an inexperienced surgeon (practitioner) performs an operation under the guidance of an experienced doctor (instructor).
- In such a system, a monitor for displaying an image captured by an endoscope inserted into the patient (hereinafter also referred to as an endoscope-captured image) is installed at the remote location.
- The instructor can input instructions while observing the endoscope-captured image displayed on the monitor.
- On the practitioner's side, a composite image obtained by superimposing the remotely input instructions on the endoscope-captured image is displayed, so that the practitioner can perform the operation on the patient while receiving instructions from the instructor at the remote location.
- The present invention has been made in view of the above circumstances, and an object of the present invention is to provide a support system that enables a remote instructor to provide appropriate support to a practitioner.
- An information processing apparatus according to one aspect is the information processing apparatus of the second system in a support system having a first system and a second system separated from the first system.
- It comprises a data transmission unit that transmits, to the first system, drawing information that is based on an operation input and is necessary for generating annotation image data; an image generation unit that generates annotation image data from the drawing information; and an image synthesizing unit that synthesizes the annotation image data generated by the image generation unit with image data received from the first system.
- The annotation image data here is image data to be superimposed on the endoscope-captured image, and is generated based on the drawing information.
- Drawing information is information such as coordinates, line types, colors, and line widths specified by the operation input.
- In the first system, annotation image data is generated from the drawing information.
- Synthesized image data is generated by synthesizing the generated annotation image data with the captured image data.
- The image synthesizing unit of the second system synthesizes the synthesized image data received from the first system with the annotation image data generated by its own image generation unit.
- In this way, annotation image data is generated in each of the first system and the second system.
- It is conceivable that, in the first system, annotation image data is generated from the drawing information and a plurality of pieces of image data including the generated annotation image data are combined to generate composite image data, and that the image synthesizing unit of the information processing device of the second system synthesizes the composite image data received from the first system with the annotation image data generated by its own image generation unit.
- This makes it possible for the second system to check whether the drawing information sent to the first system is reflected in the composite image data received from the first system.
- It is also conceivable that the image generation unit in the information processing device of the second system generates annotation image data that can be distinguished from the annotation image data included in the composite image data.
- Since the display based on the annotation image data included in the composite image data can then be distinguished from the display based on the annotation image data generated by the image generation unit, it is possible to confirm whether annotation image data reflecting the drawing information has been generated on the first system side.
- An information processing apparatus according to another aspect is the information processing apparatus of the first system in a support system having a first system and a second system separated from the first system.
- It comprises an image generation unit that receives drawing information based on an operation input from the second system and generates annotation image data from the received drawing information; an image synthesizing unit that synthesizes the input image data with the annotation image data generated by the image generation unit to generate synthesized image data; and a data transmission unit that transmits the synthesized image data generated by the image synthesizing unit to the second system.
- In one configuration, the delay time from the input of the plurality of image data to the output of the synthesized image data by the image synthesizing unit is less than 30 milliseconds.
- A support system includes the information processing device of the first system and the information processing device of the second system.
- Such a support system is implemented by the information processing apparatuses described above communicating with each other.
- With this configuration, the instruction content based on the input by the instructor at the remote location is displayed on the monitors of both the practitioner and the instructor with extremely low delay, so that the practitioner and the instructor can share the instruction content in real time.
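The round trip described above can be sketched in code. The patent does not specify any implementation, so all names below (`DrawingInfo`, `TreatmentSide`, `InstructionSide`) are hypothetical, and image data is modeled as simple coordinate sets rather than pixel buffers:

```python
# Illustrative sketch of the support-system data flow: the second (instruction)
# system sends compact drawing information; the first (treatment) system renders
# it and composites it onto the endoscope image; the instruction side overlays
# its own locally rendered annotation. All class names are hypothetical.
from dataclasses import dataclass

@dataclass
class DrawingInfo:
    """Compact drawing instructions (coordinates plus style), sent instead of pixels."""
    coords: list                 # list of (x, y) points
    line_type: str = "solid"
    color: str = "yellow"
    width: int = 2

class TreatmentSide:
    """First system: renders received drawing info and composites it onto the frame."""
    def receive_drawing_info(self, pi):
        # Image generation unit 12: rasterize PI into annotation image data Ad1.
        # Pixels are modeled as a set of coordinates for brevity.
        return set(pi.coords)

    def synthesize(self, frame, ad1):
        # Image synthesizing unit 13: overlay Ad1 on the endoscope image (Id -> Cd1).
        return frame | ad1

class InstructionSide:
    """Second system: produces drawing info and overlays its own annotation Ad2."""
    def make_drawing_info(self, touched_points):
        return DrawingInfo(coords=touched_points)

    def synthesize(self, cd1, ad2):
        # Image synthesizing unit 24: overlay Ad2 on Cd1 (-> Cd2).
        return cd1 | ad2

# One round trip: the instructor draws, the treatment side composites,
# and the instructor's monitor shows the result with the local overlay.
treatment, instruction = TreatmentSide(), InstructionSide()
frame = {(0, 0), (1, 1)}                      # stand-in for endoscope image data Id
pi = instruction.make_drawing_info([(5, 5), (6, 6)])
ad1 = treatment.receive_drawing_info(pi)      # rendered on the treatment side
cd1 = treatment.synthesize(frame, ad1)        # shown on monitor 14
ad2 = set(pi.coords)                          # rendered locally on the instruction side
cd2 = instruction.synthesize(cd1, ad2)        # shown on monitor 25
assert cd2 == {(0, 0), (1, 1), (5, 5), (6, 6)}
```

The key point the sketch illustrates is that only `DrawingInfo` crosses the network in the instruction-to-treatment direction; both sides rasterize it independently.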
- FIG. 1 is a diagram schematically showing an example of a configuration of the support system according to this embodiment.
- Several figures show annotation images according to the embodiment.
- Several figures show composite images in which an annotation image is superimposed on an endoscope-captured image according to the present embodiment.
- An embodiment will be described below with reference to FIGS. 1 to 9.
- Each configuration described in the referenced drawings merely shows an example for realizing the present invention, so various modifications can be made in accordance with the design and the like without departing from the technical idea of the present invention. In addition, to avoid duplication, configurations that have already been described may be given the same reference numerals and their description omitted.
- FIG. 1 shows a support system 100.
- The support system 100 is a surgery support system in which the practitioner 1 in the operating room Rm1 can perform surgery on the patient 3 while confirming instructions from the instructor 2 in the instruction room Rm2, which is separated from the operating room Rm1.
- In the surgery support system, for example, the practitioner 1 is the surgeon operating on the patient 3, and the instructor 2 is a doctor who instructs the surgeon.
- An endoscope-captured image 30 of the body cavity, as shown in FIG. 2, is displayed on the monitor 14, and the practitioner 1 can check the endoscope-captured image 30 there.
- The endoscope-captured image 30 is also displayed on the monitor 25 of the instruction terminal 26 in the instruction room Rm2.
- The instructor 2 can thus check the endoscope-captured image 30 on the monitor 25 while staying in the instruction room Rm2.
- The instruction terminal 26 is, for example, a tablet terminal having a touch panel 27, and the instructor 2 can input instructions on the endoscope-captured image 30 by operating the touch panel 27 with a finger or a touch pen.
- FIG. 4 is a block diagram showing an example of the configuration of the support system 100.
- As shown in FIG. 4, the support system 100 has a treatment-side system 10 configured on the operating room Rm1 side and an instruction-side system 20 configured on the instruction room Rm2 side.
- The treatment-side system 10 and the instruction-side system 20 are separated from each other and can communicate with each other through a wired or wireless transmission path.
- The treatment-side system 10 has an endoscope 11, an image generation unit 12, an image synthesizing unit 13, a monitor 14, and a data transmission unit 15.
- The endoscope 11 has an imaging device, and the captured image signals obtained by the imaging device are A/D-converted into endoscope-captured image data Id (image data for displaying an endoscope-captured image 30 as shown in FIG. 2).
- The image generation unit 12 is configured by, for example, an image processor or the like, and generates annotation image data Ad1 based on the drawing information PI transmitted from the instruction-side system 20.
- The drawing information PI includes information such as the coordinates, line type, color, and line width designated by the operation input.
- The annotation image data Ad1 is image data, such as the annotation images 41 and 42 shown in FIG. 5, to be superimposed on the endoscope-captured image 30, and is generated based on the drawing information PI. This also applies to the annotation image data Ad2, which will be described later.
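Conceptually, annotation image data is a transparent overlay rasterized from the drawing information. The helper below is a hypothetical sketch of that step, representing the overlay as a sparse map from pixel coordinates to a color rather than a real image format:

```python
# Hypothetical sketch: rasterizing drawing information (PI) into annotation
# image data (Ad1/Ad2). Consecutive points are connected with straight
# segments; the result is a sparse {(x, y): color} overlay.

def render_annotation(coords, color="yellow"):
    """Rasterize a polyline given as a list of (x, y) points."""
    overlay = {}
    for (x0, y0), (x1, y1) in zip(coords, coords[1:]):
        steps = max(abs(x1 - x0), abs(y1 - y0), 1)
        for i in range(steps + 1):
            x = round(x0 + (x1 - x0) * i / steps)
            y = round(y0 + (y1 - y0) * i / steps)
            overlay[(x, y)] = color
    return overlay

ad = render_annotation([(0, 0), (4, 0)])      # a short horizontal stroke
assert ad == {(x, 0): "yellow" for x in range(5)}
```

Both image generation units (12 and 23) can apply the same rasterization to the same drawing information, which is why the two systems can independently produce matching annotations.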
- The image synthesizing unit 13 is realized, for example, by a hardware video mixer device or a dedicated circuit implemented with an FPGA (Field Programmable Gate Array), and synthesizes various input image data.
- In this example, the endoscope-captured image data Id and the annotation image data Ad1 are synthesized in the image synthesizing unit 13 to generate treatment-side synthesized image data Cd1.
- The monitor 14 is composed of, for example, a liquid crystal display device, and displays an image on the display panel based on the supplied image data.
- In this example, the monitor 14 displays images based on the supplied endoscope-captured image data Id or the treatment-side synthesized image data Cd1.
- The data transmission unit 15 is configured to be able to communicate with the instruction-side system 20 and can transmit various data to it.
- In this example, it transmits the endoscope-captured image data Id obtained from the image synthesizing unit 13 and the treatment-side synthesized image data Cd1.
- The instruction-side system 20 has an annotation interface 21, a data transmission unit 22, an image generation unit 23, an image synthesizing unit 24, and a monitor 25.
- These components are integrally configured as, for example, the instruction terminal 26.
- The annotation interface 21 is composed of input devices such as a touch panel, a touch pad, a mouse, and a keyboard, and a processor or the like that generates drawing information PI according to operation inputs to these input devices.
- In this example, input to the annotation interface 21 is performed by the instructor 2 operating the touch panel 27 in FIG. 1 with a finger or a touch pen.
- The data transmission unit 22 is configured, for example, to be able to communicate with the treatment-side system 10, and can transmit various data to it; in this example, it transmits in particular the drawing information PI obtained by the annotation interface 21.
- The image generation unit 23 is composed of, for example, an image processor and the like, and generates annotation image data Ad2 from the drawing information PI.
- The image synthesizing unit 24 is realized, for example, by a hardware video mixer device or by a software mixer configured in an information processing device, and synthesizes various types of supplied image data.
- In this example, the image synthesizing unit 24 synthesizes the annotation image data Ad2 with the endoscope-captured image data Id or the treatment-side synthesized image data Cd1, thereby generating instruction-side synthesized image data Cd2.
- The monitor 25 is composed of, for example, a liquid crystal display device, and displays an image on the display panel based on the supplied image data; in this example, the display is based on the instruction-side synthesized image data Cd2.
- Each of the above systems (10, 20) can be configured by hardware, but the configuration of each unit is not limited to the above.
- All or part of each functional unit may be realized by one or more hardware devices.
- A functional unit may also be configured as a microcomputer including a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), and the like.
- In that case, each function is realized by the CPU executing processing based on a program stored as software in the ROM or the like.
- The image synthesizing unit 13 supplies the endoscope-captured image data Id to the monitor 14 and the data transmission unit 15.
- As a result, the endoscope-captured image 30 is displayed on the monitor 14 (see FIG. 2), and the practitioner 1 can operate while observing the state inside the body cavity of the patient 3 on the monitor 14.
- The data transmission unit 15 also transmits the endoscope-captured image data Id supplied from the image synthesizing unit 13 to the instruction-side system 20.
- In the instruction-side system 20, the image synthesizing unit 24 receives the endoscope-captured image data Id from the treatment-side system 10 (data transmission unit 15).
- When annotation image data Ad2 is not supplied from the image generation unit 23, the image synthesizing unit 24 supplies the endoscope-captured image data Id to the monitor 25.
- As a result, the endoscope-captured image 30 is displayed on the monitor 25 (see FIG. 2).
- Thereby, the instructor 2 can share the practitioner 1's viewpoint while staying in the instruction room Rm2, observing the state of the inside of the body cavity of the patient 3 and the progress of the operation by the practitioner 1.
- The instructor 2 operates an input device such as the touch panel 27 while observing the endoscope-captured image 30 displayed on the monitor 25, and thereby inputs figures, characters, symbols, and the like indicating the content of instructions to the practitioner 1 on the endoscope-captured image 30.
- The annotation interface 21 generates drawing information PI corresponding to the operation input on the touch panel 27.
- The drawing information PI obtained by the annotation interface 21 is supplied to the image generation unit 23 and the data transmission unit 22.
- The data transmission unit 22 transmits the drawing information PI obtained from the annotation interface 21 to the treatment-side system 10.
- Because the drawing information PI requires a smaller communication capacity than the annotation image data Ad2 generated by the image generation unit 23, transmitting the drawing information PI rather than the rendered image data from the instruction-side system 20 to the treatment-side system 10 reduces the communication delay.
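The bandwidth argument can be illustrated with rough numbers. The stroke length, frame size, and JSON encoding below are illustrative assumptions, not figures from the patent:

```python
# Rough illustration of why sending drawing information PI is cheaper than
# sending rendered annotation image data: a 200-point stroke serialized as
# JSON versus the same stroke rendered into a full-frame RGBA overlay.
import json

# Drawing information: coordinates plus style attributes (hypothetical format).
pi = {"coords": [(i, i) for i in range(200)],
      "line_type": "solid", "color": "yellow", "width": 2}
pi_bytes = len(json.dumps(pi).encode("utf-8"))

# The same stroke rendered as an uncompressed RGBA overlay at 1920x1080.
overlay_bytes = 1920 * 1080 * 4

assert pi_bytes < overlay_bytes
print(f"PI: {pi_bytes} B vs rendered overlay: {overlay_bytes} B")
```

Even with compression on the image side, the per-stroke payload of drawing information remains orders of magnitude smaller, which is what makes the low-latency round trip described here plausible.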
- In the treatment-side system 10, the image generation unit 12 receives the drawing information PI from the instruction-side system 20 (data transmission unit 22).
- The image generation unit 12 generates annotation image data Ad1 for displaying the annotation images 41 and 42 (see FIG. 5) from the received drawing information PI.
- The annotation image data Ad1 generated by the image generation unit 12 is supplied to the image synthesizing unit 13.
- The image synthesizing unit 13 synthesizes the endoscope-captured image data Id supplied from the endoscope 11 with the annotation image data Ad1 supplied from the image generation unit 12 to generate treatment-side synthesized image data Cd1.
- The treatment-side synthesized image data Cd1 is image data for displaying a composite image 50 (see FIG. 3) in which the annotation images 41 and 42 (see FIG. 5) are superimposed on the endoscope-captured image 30 (see FIG. 2).
- When the image synthesizing unit 13 is configured as a hardware video mixer device, the delay time from the input of the endoscope-captured image data Id and the annotation image data Ad1 to the output of the treatment-side synthesized image data Cd1 is very short.
- The image synthesizing unit 13 can also be realized by a processor with high computing power instead of a hardware video mixer device. By executing synthesis processing defined by a program, such a processor can achieve a processing speed comparable to that of a hardware video mixer, for example a delay time of less than 30 milliseconds from input to output. In such a case, the image synthesizing unit 13 is implemented as a software mixer.
- The treatment-side synthesized image data Cd1 generated by the synthesis processing is supplied to the monitor 14 without delay and displayed with little time lag.
- As a result, the practitioner 1 in the operating room Rm1 can operate on the patient 3 while confirming the instructions of the instructor 2 in the instruction room Rm2 on the monitor 14 without feeling a time lag.
- Whether the image synthesizing unit 13 is a hardware video mixer or a software mixer on a processor with an extremely high processing speed, if the delay time is less than 30 ms as described above, the practitioner 1 feels almost no time lag in the displayed image.
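The sub-30 ms budget of a software mixer can be sketched with a toy compositing pass. A real mixer would operate on full video frames, likely on a GPU or dedicated circuit, so the data structures and measured time below are only illustrative:

```python
# Sketch of a software mixer: composite an annotation overlay onto a frame
# and time one synthesis pass against the 30 ms budget mentioned above.
# Frames are modeled as {(x, y): color} maps, not real pixel buffers.
import time

def composite(frame, overlay):
    """Overlay pixels win wherever both layers define a value (opaque overlay)."""
    merged = dict(frame)
    merged.update(overlay)
    return merged

frame = {(x, y): "gray" for x in range(100) for y in range(100)}   # stand-in for Id
overlay = {(x, 50): "yellow" for x in range(100)}                  # stand-in for Ad1

start = time.perf_counter()
cd1 = composite(frame, overlay)          # Id + Ad1 -> Cd1
elapsed_ms = (time.perf_counter() - start) * 1000.0

assert cd1[(10, 50)] == "yellow" and cd1[(10, 10)] == "gray"
print(f"synthesis took {elapsed_ms:.3f} ms (budget: 30 ms)")
```

The point is not the absolute numbers but the structure: as long as one composite pass plus display fits under the budget, the practitioner perceives the annotation as appearing instantly.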
- The data transmission unit 15 transmits the treatment-side synthesized image data Cd1 supplied from the image synthesizing unit 13 to the instruction-side system 20.
- Meanwhile, in the instruction-side system 20, the image generation unit 23 generates annotation image data Ad2 for displaying the annotation images 41 and 42 (see FIG. 5) based on the drawing information PI obtained from the annotation interface 21.
- The annotation image data Ad2 obtained by the image generation unit 23 is supplied to the image synthesizing unit 24.
- The image synthesizing unit 24 synthesizes the treatment-side synthesized image data Cd1 received from the treatment-side system 10 (data transmission unit 15) with the annotation image data Ad2 supplied from the image generation unit 23 to generate instruction-side synthesized image data Cd2. The generated instruction-side synthesized image data Cd2 is supplied to the monitor 25.
- Alternatively, the image synthesizing unit 24 may synthesize the endoscope-captured image data Id with the annotation image data Ad2 to generate the instruction-side synthesized image data Cd2.
- The instruction-side synthesized image data Cd2 generated at this time is supplied to the monitor 25 in the same manner as described above.
- Since the annotation image data Ad2 is synthesized when the instruction-side synthesized image data Cd2 is generated, the monitor 25 displays a composite image (see FIG. 3) in which the annotation images 41 and 42 (see FIG. 5) are superimposed on the endoscope-captured image.
- Because the annotation image data Ad2 is generated locally, the instruction-side synthesized image data Cd2 can be supplied to the monitor 25 without the longer delay that would occur if display had to wait for composite image data generated in the treatment-side system 10 based on the transmitted drawing information PI.
- On the monitor 25, the instructor 2 can check the display based on both the treatment-side synthesized image data Cd1 and the annotation image data Ad2.
- For example, the instructor 2 can confirm a deviation of the annotation images 41 and 42, as shown in FIG. 6, on the monitor 25.
- The image generation unit 23 may generate the annotation image data Ad2 as image data having a display mode different from that of the annotation image data Ad1 generated by the image generation unit 12. For example, by displaying the annotation images 41 and 42 based on the annotation image data Ad2 with broken lines as shown in FIG. 7, they can be displayed on the monitor 25 so as to be distinguished from the display based on the annotation image data Ad1. In FIG. 7, the dashed lines are drawn alongside the solid lines for convenience of explanation, but in reality they overlap each other.
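A minimal sketch of such a distinguishable display mode, assuming pixels are again modeled as coordinate maps: the same stroke is rendered solid for Ad1 but as a broken line for Ad2 by dropping alternate pixels. The dash scheme is an illustrative assumption, not the patent's method:

```python
# Hypothetical sketch: rendering the same stroke in two display modes so the
# locally generated annotation (Ad2) is visually distinguishable from the
# annotation returned from the treatment side (Ad1).

def render(coords, dashed=False):
    pixels = {}
    for i, (x, y) in enumerate(coords):
        if dashed and i % 2 == 1:
            continue                      # skip alternate pixels -> broken line
        pixels[(x, y)] = "on"
    return pixels

stroke = [(x, 0) for x in range(8)]
ad1 = render(stroke)                      # solid line (treatment side, Ad1)
ad2 = render(stroke, dashed=True)         # broken line (instruction side, Ad2)
assert set(ad2) < set(ad1)                # dashed covers a strict subset of pixels
```

Because the dashed Ad2 pixels sit on top of the solid Ad1 pixels when both are present, a dashed segment with no solid line underneath stands out immediately as a stroke that never reached the treatment side.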
- The drawing information PI transmitted from the instruction-side system 20 may be partially lost when received by the treatment-side system 10 due to factors such as communication failure, or the display coordinates of the annotation images 41 and 42 may be misaligned due to unadjusted differences in settings or monitor resolution between the systems (10, 20).
- In such cases, an image such as that shown in FIG. 6 may be displayed on the monitor 25 because of the displacement of the display coordinates.
- The image generation unit 12 of the treatment-side system 10 should normally generate annotation image data Ad1 for displaying the annotation images 41 and 42, as shown in FIG. 5, based on the received drawing information PI.
- However, due to the loss of part of the drawing information PI, annotation image data Ad1 displaying only the annotation image 42 may be generated.
- In that case, the image synthesizing unit 13 synthesizes the incomplete annotation image data Ad1 generated by the image generation unit 12 with the endoscope-captured image data Id, generating treatment-side synthesized image data Cd1 for displaying a composite image 51 in which only the annotation image 42 is superimposed on the endoscope-captured image 30, as shown in FIG. 6.
- The composite image 51 lacking the annotation image 41 is then displayed on the monitor 14 based on the supplied treatment-side synthesized image data Cd1.
- As a result, the inputs of the instructor 2 are not accurately reflected on the monitor 14, and the practitioner 1 may perform the treatment without grasping all of the instructor 2's instructions.
- In the instruction-side system 20, the treatment-side synthesized image data Cd1 received from the treatment-side system 10 is synthesized with the annotation image data Ad2, generated by the image generation unit 23, for displaying the annotation images 41 and 42. Therefore, instruction-side synthesized image data Cd2 for displaying the complete composite image 50 is generated regardless of whether anything is missing from the annotation image data Ad1 in the treatment-side synthesized image data Cd1.
- If the composite image 50 based on the instruction-side synthesized image data Cd2 were simply displayed on the monitor 25, the instructor 2 could not verify that the annotation image 41 is missing from the display on the monitor 14 on the practitioner 1's side.
- the image generation unit 23 generates annotation image data Ad2 for displaying the annotation images 41 and 42 with broken lines, for example, based on the drawing information PI.
- the image synthesizing unit 24 synthesizes the annotation image data Ad2 supplied from the image generating unit 23 and the treatment-side synthesized image data Cd1 received from the treatment-side system 10 to generate instruction-side synthesized image data Cd2.
- A synthesized image 52 as shown in FIG. 7 is displayed on the monitor 25 by supplying the generated instructor-side synthesized image data Cd2 to the monitor 25.
- This allows the instructor 2 to confirm that the portion of the annotation image 41 displayed only with broken lines is not displayed on the monitor 14 of the treatment-side system 10.
- the instructor 2 can easily compare the contents of his instruction with the contents of the monitor 14 viewed by the practitioner 1.
- By looking at the image based on the instructor-side combined image data Cd2, the instructor 2 can see both the images based on the annotation image data Ad1 and Ad2, and can check whether his or her instructions have been correctly transmitted to the practitioner 1. If the instructor 2 recognizes that an instruction given by an image has not been properly transmitted, he or she can take the necessary measures.
- In the present embodiment, the annotation image data Ad1 and the annotation image data Ad2 in the instructor-side combined image data Cd2 are distinguished by a dashed line and a solid line as an example of a display mode. However, the display mode is not limited to this as long as the annotation image data Ad1 and Ad2 can be distinguished from each other; they can also be distinguished by differentiating line attributes such as line width, color, and brightness, or by highlighting one of them. It is further conceivable to distinguish and display the portion of the annotation image data Ad2 generated by the image generation unit 23 that does not match the annotation image data Ad1.
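As a hedged sketch of such a display-mode distinction (the `Stroke` dataclass and `restyle_for_instructor` helper below are illustrative assumptions, not part of the patent): the instructor-side annotation Ad2 is given a line style that differs from Ad1's solid rendering so the two layers can be told apart on the monitor.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Stroke:
    """One drawn annotation stroke (hypothetical representation)."""
    points: tuple               # polyline vertices as (x, y) tuples
    line_style: str = "solid"   # "solid" | "dashed"
    color: str = "#00ff00"
    width: int = 2

def restyle_for_instructor(strokes):
    """Return Ad2 strokes restyled (dashed) so they differ from Ad1's solid display."""
    return [replace(s, line_style="dashed") for s in strokes]

ad2 = [Stroke(points=((10, 10), (40, 25)))]
styled = restyle_for_instructor(ad2)
assert styled[0].line_style == "dashed"
assert ad2[0].line_style == "solid"   # originals are left untouched
```

The same mechanism could vary color, width, or brightness instead of line style, matching the alternatives listed above.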
- The support system 100 has a first system (the treatment-side system 10) and a second system (the instruction-side system 20) separated from the treatment-side system 10.
- the information processing device of the instructor system 20 has a data transmission unit 22, an image generation unit 23, and an image synthesis unit 24 (see FIG. 4).
- the data transmission unit 22 transmits to the treatment-side system 10 the drawing information PI based on the operation input, which is required for generating the annotation image data Ad1.
- the image generator 23 also generates annotation image data Ad2 from the drawing information PI.
- The image synthesizing unit 24 synthesizes the annotation image data Ad2 generated by the image generating unit 23 and the treatment-side composite image data Cd1 (or endoscope-captured image data Id) received from the treatment-side system 10.
- The instructor-side system 20 generates the annotation image data Ad2 and synthesizes the generated annotation image data Ad2 with the treatment-side composite image data Cd1 (or the endoscope-captured image data Id) to generate the instructor-side composite image data Cd2. In this way, the instructor-side composite image data Cd2 can be obtained more quickly than by waiting to receive equivalent data generated by the treatment-side system 10.
- As a result, the instructor 2 can check the graphics and the like that he or she has entered on the monitor 25 without being affected by communication delays. The instructor 2 can therefore perform input comfortably and accurately, and can give appropriate instructions to the practitioner 1.
- the image generating unit 23 of the instructor-side system 20 generates annotation image data Ad2 that can be distinguished from the annotation image data Ad1 included in the treatment-side combined image data Cd1.
- the instructor 2 can easily confirm whether or not the content of his instruction is appropriately displayed on the monitor 14 that the practitioner 1 is looking at.
- the information processing device of the treatment-side system 10 has an image generator 12, an image synthesizer 13, and a data transmitter 15 (see FIG. 4).
- the image generator 12 receives the drawing information PI based on the operation input from the instructor system 20, and generates annotation image data Ad1 from the received drawing information PI.
- the image synthesizing unit 13 also synthesizes the input image data (endoscope-captured image data Id) and the annotation image data Ad1 generated by the image generating unit 12 to generate treatment-side synthetic image data Cd1.
- The data transmission unit 15 transmits the treatment-side combined image data Cd1 (or the endoscope-captured image data Id) generated by the image combining unit 13 to the instruction-side system 20.
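The treatment-side flow described above — receive the drawing information PI, render Ad1, composite it onto the captured frame Id, and transmit the result Cd1 — could be sketched as follows. The RGBA nested-list image representation and the function name `composite_over` are assumptions for illustration, not the patented implementation:

```python
def composite_over(base, overlay):
    """Overlay annotation pixels (alpha > 0) onto the captured frame."""
    out = []
    for base_row, over_row in zip(base, overlay):
        row = []
        for b, o in zip(base_row, over_row):
            row.append(o if o[3] > 0 else b)   # opaque annotation pixel wins
        out.append(row)
    return out

frame = [[(30, 30, 30, 255)] * 3 for _ in range(2)]          # Id (2x3 toy frame)
ann   = [[(0, 0, 0, 0), (0, 255, 0, 255), (0, 0, 0, 0)],     # Ad1 (one drawn pixel)
         [(0, 0, 0, 0), (0, 0, 0, 0), (0, 0, 0, 0)]]
cd1 = composite_over(frame, ann)
assert cd1[0][1] == (0, 255, 0, 255)   # annotated pixel replaced
assert cd1[1][0] == (30, 30, 30, 255)  # background preserved
```

A real implementation would blend partial alpha values; the binary opaque/transparent rule here is the simplest case.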
- In the present embodiment, an example has been described in which the image data input to the image synthesizing unit 13 is the endoscope-captured image data Id supplied from the endoscope 11. However, the input image data is not limited to captured image data such as the endoscope-captured image data Id; various other forms are conceivable, such as image data recorded in a memory and acquired by reading it out, or image data received from an external computer device.
- It is desirable that the delay time from the input of the endoscope-captured image data Id and the annotation image data Ad1 to the output of the treatment-side combined image data Cd1 by the image synthesizing unit 13 be less than 30 msec.
- This allows the practitioner 1 to visually recognize the synthesized image 50 and the like on the monitor 14 with low delay, so that the practitioner 1 can perform accurate and safe treatment on the patient 3 while confirming the instructions of the instructor 2.
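A minimal way to check such a latency budget is to time each synthesis pass against the 30 msec target; the figure comes from the embodiment, while the helper below is an illustrative sketch, not the patented implementation:

```python
import time

BUDGET_SEC = 0.030  # the embodiment targets < 30 msec input-to-output delay

def timed_synthesize(synthesize, *inputs):
    """Run one synthesis pass and report whether it met the latency budget."""
    t0 = time.perf_counter()
    out = synthesize(*inputs)
    elapsed = time.perf_counter() - t0
    return out, elapsed, elapsed < BUDGET_SEC

# Trivial stand-in for the compositing step, just to exercise the timer.
out, elapsed, within_budget = timed_synthesize(lambda a, b: a + b, [1], [2])
assert out == [1, 2] and within_budget
```

In a production system the measurement would cover the whole path from frame capture to display, but the per-stage check above is a useful first gate.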
- As the support system 100, a surgery support system has been described in which the practitioner 1 can perform surgery on the patient 3 while receiving instructions from the instructor 2, who is in a remote location.
- However, the support system 100 can be widely applied to any situation in which the instructor 2 at a remote location gives instructions to the practitioner 1 while visually recognizing a captured image on the practitioner side.
- the support system 100 can be applied to various uses such as athletes and coaches in sports guidance, instructors and students in learning support such as education and vocational training, and presenters and listeners in remote conferences.
- Although a tablet terminal having a touch panel has been described as an example of the instruction terminal 26, the instruction terminal 26 can be realized by various devices, such as a VR (Virtual Reality) device having a display and a remote controller for operation input, or a master console of a surgical robot.
Abstract
Description
In addition, so that the instructor can also enter instructions without a feeling of incongruity, it is desirable that the endoscope-captured image reflecting the input be displayed on the instructor's own monitor with as little delay as possible.
The present invention has been made in view of the above circumstances, and an object thereof is to provide a support system that enables an instructor at a remote location to appropriately support a practitioner.
Here, drawing information refers to information such as the coordinates, line type, color, and line width specified by an operation input.
Each configuration shown in the referenced drawings is merely an example for realizing the present invention. Accordingly, various modifications are possible according to design and the like without departing from the technical idea of the present invention. To avoid repetition, a configuration that has been described once may subsequently be given the same reference sign and its description omitted.
FIG. 1 shows the support system 100.
In the present embodiment, as an example of the support system 100, a surgery support system will be described in which a practitioner 1 in an operating room Rm1 can perform surgery on a patient 3 while confirming instructions from an instructor 2 in an instruction room Rm2 separated from the operating room Rm1. In this surgery support system, for example, the practitioner 1 is the surgeon operating on the patient 3, and the instructor 2 is a supervising physician who instructs that surgeon.
Next, the configuration of the support system 100 will be described with reference to FIG. 4. FIG. 4 is a block diagram showing an example of the configuration of the support system 100.
Here, the drawing information PI includes information such as the coordinates, line type, color, and line width specified by the operation input.
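One possible encoding of such drawing information PI is shown below; the field names are assumptions for illustration, since the patent only specifies that coordinates, line type, color, and line width are carried by the operation input:

```python
import json

def make_drawing_info(coords, line_type="solid", color="#00ff00", width=2):
    """Build a PI record carrying only the drawing parameters, not pixels."""
    return {"coords": coords, "line_type": line_type,
            "color": color, "width": width}

pi = make_drawing_info([(120, 80), (180, 140), (200, 150)])
payload = json.dumps(pi)            # compact form sent to the treatment side
decoded = json.loads(payload)
assert decoded["width"] == 2
assert decoded["coords"] == [[120, 80], [180, 140], [200, 150]]
```

Because only these parameters travel over the link, both sides can independently render annotation images (Ad1 and Ad2) from the same PI.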
The annotation image data Ad1 is image data, such as the annotation images 41 and 42 shown in FIG. 5, to be superimposed on the endoscope-captured image 30, and is generated based on the drawing information PI. The same applies to the annotation image data Ad2 described later.
The instructor-side system 20 has an annotation interface 21, a data transmission unit 22, an image generation unit 23, an image synthesis unit 24, and a monitor 25. These components are integrally configured as, for example, the instruction terminal 26.
Each functional unit may be configured as, for example, a microcomputer including a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), and the like. In this case, each function can be realized by the CPU executing processing based on a program stored as software in the ROM or the like.
Details of the support system 100 in the present embodiment will be described with reference to FIG. 4.
First, in the treatment-side system 10, the endoscope-captured image data Id obtained by the endoscope 11 is supplied to the image synthesis unit 13.
When the annotation image data Ad2 is not supplied from the image generation unit 23, the image synthesis unit 24 supplies the endoscope-captured image data Id to the monitor 25.
When transmitting to the treatment-side system 10, sending the drawing information PI, which requires less communication capacity than the annotation image data Ad2 generated by the image generation unit 23, reduces the communication delay in transmitting data from the instructor-side system 20 to the treatment-side system 10.
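A back-of-the-envelope comparison illustrates why sending PI rather than rendered annotation pixels reduces the communication load; the sizes below are for a toy example, not measurements from the patent:

```python
import json

# A stroke described by a handful of vertices (PI)...
pi = {"coords": [[120, 80], [180, 140]], "line_type": "solid",
      "color": "#00ff00", "width": 2}
pi_bytes = len(json.dumps(pi).encode())

# ...versus the uncompressed raster overlay (Ad2) it would produce.
w, h = 1920, 1080                      # a full-HD RGBA annotation layer
raster_bytes = w * h * 4

assert pi_bytes < 200                  # a couple hundred bytes at most
assert raster_bytes == 8_294_400       # ~8 MB uncompressed
assert pi_bytes < raster_bytes
```

Even with image compression, the vector parameters remain orders of magnitude smaller, which is why transmitting PI keeps the instructor-to-practitioner delay low.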
The annotation image data Ad1 generated by the image generation unit 12 is supplied to the image synthesis unit 13.
If a deviation occurs, the instructor 2 can confirm the deviation of the annotation images 41 and 42 on the monitor 25, for example as shown in FIG. 6.
For example, by displaying the annotation images 41 and 42 based on the annotation image data Ad2 with broken lines as shown in FIG. 7, they can be displayed on the monitor 25 in a manner distinguished from the display (solid lines) of the annotation image 42 based on the annotation image data Ad1 included in the treatment-side composite image data Cd1.
In FIG. 7, the portions where the broken lines run along the solid lines are drawn that way for convenience of explanation; in practice, these portions overlap each other.
In this case, an image such as that shown in FIG. 6 may be displayed on the monitor 25 due to a deviation in display coordinates.
In order for the instructor 2 to check the display state of the monitor 14 on the practitioner 1 side, it is desirable to display the images based on the annotation image data Ad1 and Ad2 in different modes.
The image synthesis unit 24 synthesizes the annotation image data Ad2 supplied from the image generation unit 23 and the treatment-side composite image data Cd1 received from the treatment-side system 10 to generate the instructor-side composite image data Cd2.
This allows the instructor 2 to confirm that the portion of the annotation image 41 displayed only with broken lines is not displayed on the monitor 14 of the treatment-side system 10.
In other words, on the monitor 25, the instructor 2 can easily compare the content of his or her own instructions with the content of the monitor 14 viewed by the practitioner 1.
It is also conceivable to distinguish and display the portion of the annotation image data Ad2 generated by the image generation unit 23 that does not match the annotation image data Ad1.
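One conceivable comparison, treating each annotation as a set of drawn pixel coordinates, is a simple set difference; this representation is an assumption for illustration, since the patent leaves the matching method open:

```python
def missing_portion(ad2_pixels, ad1_pixels):
    """Pixels the instructor drew (Ad2) that never reached the treatment side (Ad1)."""
    return ad2_pixels - ad1_pixels

ad2 = {(1, 1), (1, 2), (2, 2)}   # full instructor annotation
ad1 = {(1, 1), (1, 2)}           # partially lost in transit
assert missing_portion(ad2, ad1) == {(2, 2)}
```

The resulting pixel set could then be rendered in a highlighted color so the instructor sees exactly which part of the drawing is absent from the practitioner's monitor.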
According to the present embodiment described above, the support system 100 has a first system (the treatment-side system 10) and a second system (the instructor-side system 20) separated from the treatment-side system 10.
The data transmission unit 22 transmits to the treatment-side system 10 the drawing information PI based on the operation input, which is required for generating the annotation image data Ad1. The image generation unit 23 generates the annotation image data Ad2 from the drawing information PI. Further, the image synthesis unit 24 synthesizes the annotation image data Ad2 generated by the image generation unit 23 and the treatment-side composite image data Cd1 (or the endoscope-captured image data Id) received from the treatment-side system 10.
The image generation unit 12 receives the drawing information PI based on the operation input from the instructor-side system 20 and generates the annotation image data Ad1 from the received drawing information PI. The image synthesis unit 13 synthesizes the input image data (the endoscope-captured image data Id) and the annotation image data Ad1 generated by the image generation unit 12 to generate the treatment-side composite image data Cd1. Further, the data transmission unit 15 transmits the treatment-side composite image data Cd1 (or the endoscope-captured image data Id) generated by the image synthesis unit 13 to the instructor-side system 20.
In the present embodiment, an example has been described in which the image data input to the image synthesis unit 13 is the endoscope-captured image data Id; however, the input image data is not limited to captured image data such as the endoscope-captured image data Id supplied from the endoscope 11, and various other forms are conceivable, such as image data recorded in a memory and acquired by reading it out, or image data received from an external computer device.
For example, the support system 100 can be applied to various uses, such as athletes and coaches in sports instruction, lecturers and students in learning support such as education and vocational training, and presenters and listeners in remote conferences.
2 Instructor
3 Patient
10 Treatment-side system
11 Endoscope
12 Image generation unit
13 Image synthesis unit
14 Monitor
15 Data transmission unit
20 Instructor-side system
21 Annotation interface
22 Data transmission unit
23 Image generation unit
24 Image synthesis unit
25 Monitor
100 Support system
Ad1, Ad2 Annotation image data
PI Drawing information
Id Endoscope-captured image data
Cd1 Treatment-side composite image data
Cd2 Instructor-side composite image data
Rm1 Operating room
Rm2 Instruction room
Claims (7)
- An information processing device of the second system in a support system having a first system and a second system separated from the first system, the information processing device comprising:
a data transmission unit that transmits, to the first system, drawing information based on an operation input, the drawing information being required for generating annotation image data;
an image generation unit that generates annotation image data from the drawing information; and
an image synthesis unit that synthesizes the annotation image data generated by the image generation unit and image data received from the first system.
- The information processing device according to claim 1, wherein, in the first system, annotation image data is generated from the drawing information, and composite image data is generated by synthesizing a plurality of pieces of image data including the generated annotation image data, and
the image synthesis unit synthesizes the composite image data received from the first system and the annotation image data generated by the image generation unit.
- The information processing device according to claim 2, wherein the image generation unit generates annotation image data that is distinguishable from the annotation image data included in the composite image data.
- An information processing device of the first system in a support system having a first system and a second system separated from the first system, the information processing device comprising:
an image generation unit that receives drawing information based on an operation input from the second system and generates annotation image data from the received drawing information;
an image synthesis unit that synthesizes input image data and the annotation image data generated by the image generation unit to generate composite image data; and
a data transmission unit that transmits the composite image data generated by the image synthesis unit to the second system.
- The information processing device according to claim 4, wherein a delay time of the image synthesis unit, from input of a plurality of pieces of image data to output of the composite image data, is less than 30 milliseconds.
- The information processing device according to claim 4 or claim 5, wherein the input image data is captured image data.
- A support system having a first system and a second system separated from the first system, wherein
the information processing device of the second system comprises:
a second data transmission unit that transmits, to the first system, drawing information based on an operation input, the drawing information being required for generating annotation image data;
a second image generation unit that generates annotation image data from the drawing information; and
a second image synthesis unit that synthesizes the annotation image data generated by the second image generation unit and image data received from the first system, and
the information processing device of the first system comprises:
a first image generation unit that generates annotation image data from the drawing information received from the second system;
a first image synthesis unit that synthesizes input image data and the annotation image data generated by the first image generation unit to generate composite image data; and
a first data transmission unit that transmits the composite image data generated by the first image synthesis unit to the second system.
Priority Applications (5)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2022530288A JP7156751B1 (ja) | 2021-02-04 | 2021-02-04 | 情報処理装置、支援システム |
| PCT/JP2021/004163 WO2022168242A1 (ja) | 2021-02-04 | 2021-02-04 | 情報処理装置、支援システム |
| CN202180080620.9A CN116528787A (zh) | 2021-02-04 | 2021-02-04 | 信息处理装置、辅助系统 |
| EP21924637.8A EP4275638B1 (en) | 2021-02-04 | 2021-02-04 | Information processing device and assistance system |
| US18/363,794 US12340710B2 (en) | 2021-02-04 | 2023-08-02 | Information processing device and assistance system |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2021/004163 WO2022168242A1 (ja) | 2021-02-04 | 2021-02-04 | 情報処理装置、支援システム |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US18/363,794 Continuation US12340710B2 (en) | 2021-02-04 | 2023-08-02 | Information processing device and assistance system |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2022168242A1 true WO2022168242A1 (ja) | 2022-08-11 |
Family
ID=82740970
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2021/004163 Ceased WO2022168242A1 (ja) | 2021-02-04 | 2021-02-04 | 情報処理装置、支援システム |
Country Status (5)
| Country | Link |
|---|---|
| US (1) | US12340710B2 (ja) |
| EP (1) | EP4275638B1 (ja) |
| JP (1) | JP7156751B1 (ja) |
| CN (1) | CN116528787A (ja) |
| WO (1) | WO2022168242A1 (ja) |
Families Citing this family (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN102482682A (zh) | 2009-08-25 | 2012-05-30 | 巴斯夫植物科学有限公司 | 抗线虫的转基因植物 |
| US20240148399A1 (en) * | 2022-11-04 | 2024-05-09 | Saphena Medical, Inc. | Unitary device for vessel harvesting and method of using same |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2000271147A (ja) | 1999-03-19 | 2000-10-03 | Olympus Optical Co Ltd | 遠隔手術支援システム |
| JP2005021354A (ja) * | 2003-07-01 | 2005-01-27 | Olympus Corp | 遠隔手術支援装置 |
| WO2012081194A1 (ja) * | 2010-12-17 | 2012-06-21 | パナソニック株式会社 | 医療支援装置、医療支援方法および医療支援システム |
Family Cites Families (24)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20020128662A1 (en) * | 1998-02-24 | 2002-09-12 | Brock David L. | Surgical instrument |
| US6490490B1 (en) * | 1998-11-09 | 2002-12-03 | Olympus Optical Co., Ltd. | Remote operation support system and method |
| US6852107B2 (en) * | 2002-01-16 | 2005-02-08 | Computer Motion, Inc. | Minimally invasive surgical training using robotics and tele-collaboration |
| US6652452B1 (en) * | 1999-10-25 | 2003-11-25 | Advanced Medical Electronics Corporation | Infrared endoscope with sensor array at the distal tip |
| IL135571A0 (en) * | 2000-04-10 | 2001-05-20 | Doron Adler | Minimal invasive surgery imaging system |
| US6834207B2 (en) * | 2001-02-08 | 2004-12-21 | Kabushiki Kaisha Toshiba | Operating guidance system for medical equipment |
| US6728599B2 (en) * | 2001-09-07 | 2004-04-27 | Computer Motion, Inc. | Modularity system for computer assisted surgery |
| US20050033117A1 (en) * | 2003-06-02 | 2005-02-10 | Olympus Corporation | Object observation system and method of controlling object observation system |
| JP2008253586A (ja) * | 2007-04-06 | 2008-10-23 | Hoya Corp | 内視鏡支援システム |
| US10022041B2 (en) * | 2012-06-27 | 2018-07-17 | Camplex, Inc. | Hydraulic system for surgical applications |
| EP3060117B1 (en) * | 2013-10-25 | 2019-10-23 | ResMed Inc. | Electronic management of sleep related data |
| US9679411B2 (en) * | 2014-12-19 | 2017-06-13 | International Business Machines Corporation | Hardware management and reconstruction using visual graphics |
| JP7021110B2 (ja) * | 2016-05-09 | 2022-02-16 | マジック リープ, インコーポレイテッド | ユーザ健康分析のための拡張現実システムおよび方法 |
| US10319128B2 (en) * | 2016-09-26 | 2019-06-11 | Rockwell Automation Technologies, Inc. | Augmented reality presentation of an industrial environment |
| WO2018163977A1 (ja) * | 2017-03-08 | 2018-09-13 | ソニー株式会社 | 画像処理装置および画像処理方法 |
| US20190005841A1 (en) * | 2017-06-30 | 2019-01-03 | Intel Corporation | Representation of group emotional response |
| US20190108578A1 (en) * | 2017-09-13 | 2019-04-11 | Magical Technologies, Llc | Systems and methods of rewards object spawning and augmented reality commerce platform supporting multiple seller entities |
| JP2021531883A (ja) * | 2018-07-24 | 2021-11-25 | ソニーグループ株式会社 | 手術室における分散型画像処理システム |
| US12016566B2 (en) * | 2020-10-02 | 2024-06-25 | Cilag Gmbh International | Surgical instrument with adaptive function controls |
| US11672534B2 (en) * | 2020-10-02 | 2023-06-13 | Cilag Gmbh International | Communication capability of a smart stapler |
| US11883022B2 (en) * | 2020-10-02 | 2024-01-30 | Cilag Gmbh International | Shared situational awareness of the device actuator activity to prioritize certain aspects of displayed information |
| US11911030B2 (en) * | 2020-10-02 | 2024-02-27 | Cilag Gmbh International | Communication capability of a surgical device with component |
| US11992372B2 (en) * | 2020-10-02 | 2024-05-28 | Cilag Gmbh International | Cooperative surgical displays |
| US11963683B2 (en) * | 2020-10-02 | 2024-04-23 | Cilag Gmbh International | Method for operating tiered operation modes in a surgical system |
- 2021
- 2021-02-04 JP JP2022530288A patent/JP7156751B1/ja active Active
- 2021-02-04 EP EP21924637.8A patent/EP4275638B1/en active Active
- 2021-02-04 WO PCT/JP2021/004163 patent/WO2022168242A1/ja not_active Ceased
- 2021-02-04 CN CN202180080620.9A patent/CN116528787A/zh active Pending
- 2023
- 2023-08-02 US US18/363,794 patent/US12340710B2/en active Active
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2000271147A (ja) | 1999-03-19 | 2000-10-03 | Olympus Optical Co Ltd | 遠隔手術支援システム |
| JP2005021354A (ja) * | 2003-07-01 | 2005-01-27 | Olympus Corp | 遠隔手術支援装置 |
| WO2012081194A1 (ja) * | 2010-12-17 | 2012-06-21 | パナソニック株式会社 | 医療支援装置、医療支援方法および医療支援システム |
Non-Patent Citations (1)
| Title |
|---|
| See also references of EP4275638A4 |
Also Published As
| Publication number | Publication date |
|---|---|
| CN116528787A (zh) | 2023-08-01 |
| EP4275638A1 (en) | 2023-11-15 |
| JPWO2022168242A1 (ja) | 2022-08-11 |
| EP4275638B1 (en) | 2025-03-26 |
| JP7156751B1 (ja) | 2022-10-19 |
| EP4275638A4 (en) | 2024-03-06 |
| US20230377473A1 (en) | 2023-11-23 |
| US12340710B2 (en) | 2025-06-24 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| EP1965699B1 (en) | Medical robotic system providing three-dimensional telestration | |
| US11620920B2 (en) | Surgical training systems and methods | |
| US12165268B2 (en) | Remote surgical mentoring | |
| KR101108927B1 (ko) | 증강현실을 이용한 수술 로봇 시스템 및 그 제어 방법 | |
| US20110306986A1 (en) | Surgical robot system using augmented reality, and method for controlling same | |
| Marohn et al. | Twenty-first century surgery using twenty-first century technology: surgical robotics | |
| Bui et al. | Tele-mentoring using augmented reality technology in healthcare: A systematic review | |
| US12340710B2 (en) | Information processing device and assistance system | |
| US20060257008A1 (en) | Method and apparatus for generating an image including editing comments in a sterile working area of a medical facility | |
| Carbone et al. | A wearable augmented reality platform for telemedicine | |
| KR100957470B1 (ko) | 증강현실을 이용한 수술 로봇 시스템 및 그 제어 방법 | |
| US20210375452A1 (en) | Method and System for Remote Augmented Reality Medical Examination | |
| Fu et al. | Effects of optical see-through head-mounted display use for simulated laparoscopic surgery | |
| Charlet | Surgery of the future: Harnessing the power of virtual reality and augmented reality | |
| TWI636768B (zh) | Surgical assist system | |
| CN115836915A (zh) | 手术器械操控系统和手术器械操控系统的控制方法 | |
| US20190374298A1 (en) | Enhanced haptic feedback system | |
| Dewaele et al. | Is the human brain capable of controlling seven degrees of freedom? | |
| JP2020134710A (ja) | 手術トレーニング装置 | |
| US20240371063A1 (en) | Assistance system, assistance device, and assisted device | |
| US20240371118A1 (en) | Assistance system, assistance device, and assisted device | |
| Musthafa et al. | Tools and applications for telesurgery in healthcare industry | |
| WO2024006348A1 (en) | Systems and methods for clinical procedure training using mixed environment technology | |
| EP4607495A1 (en) | Medical training system and method for medical training | |
| JP7141209B2 (ja) | 内視鏡表示装置、及び、内視鏡システム |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| ENP | Entry into the national phase |
Ref document number: 2022530288 Country of ref document: JP Kind code of ref document: A |
|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21924637 Country of ref document: EP Kind code of ref document: A1 |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 202180080620.9 Country of ref document: CN |
|
| ENP | Entry into the national phase |
Ref document number: 2021924637 Country of ref document: EP Effective date: 20230811 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |