
WO2024128212A1 - Information processing device for outputting information based on a gaze point, and information processing method and program for outputting information based on a gaze point - Google Patents

Information processing device for outputting information based on a gaze point, and information processing method and program for outputting information based on a gaze point

Info

Publication number
WO2024128212A1
WO2024128212A1 (PCT/JP2023/044369; JP2023044369W)
Authority
WO
WIPO (PCT)
Prior art keywords
information processing
processing device
captured image
gaze point
surgery
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2023/044369
Other languages
English (en)
Japanese (ja)
Inventor
潔 長谷川
裕一郎 三原
顕児 唐子
佐々木 脩
▲高▼山 真秀 神子
橋司 伊藤
▲ゆ▼ 陳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Tokyo NUC
Original Assignee
University of Tokyo NUC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Tokyo NUC filed Critical University of Tokyo NUC
Priority to JP2024564388A priority Critical patent/JPWO2024128212A1/ja
Publication of WO2024128212A1 publication Critical patent/WO2024128212A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 20/00 Machine learning
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 30/00 ICT specially adapted for the handling or processing of medical images
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • the present invention relates to an information processing device, an information processing method, and a program.
  • Patent Document 1 discloses a technology for distributing and recording video of a surgical site (operative field).
  • an information processing device acquires captured images of a surgery, determines a gaze point related to the surgery based on the captured images of the surgery, and outputs information based on the determined gaze point.
  • FIG. 1 is a diagram illustrating an example of a system configuration of an information processing system 1000.
  • FIG. 2 is a diagram illustrating an example of a hardware configuration of the server device 100.
  • FIG. 3 is a diagram illustrating an example of a hardware configuration of the client device 110.
  • FIG. 4 is an activity diagram showing an example of information processing for storing captured images.
  • FIG. 5 is an activity diagram showing an example of main information processing.
  • FIG. 6 is a diagram showing an example of an acquired captured image.
  • FIG. 7 is a diagram for explaining how to determine the axis of an object.
  • FIG. 8 is a diagram showing an example of a gaze point of a captured image (current frame) to be processed.
  • FIG. 9 is a diagram showing an example of determining a comprehensive gaze point based on the gaze point of the current frame and gaze points of several past frames.
  • FIG. 10 is a diagram showing an example of a gaze point of a captured image (current frame) to be processed in the fourth modification.
  • In the embodiments, the term "part" may include, for example, a combination of hardware resources implemented by a circuit in a broad sense and software information processing that can be concretely realized by those hardware resources.
  • In the embodiments, various kinds of information are handled; such information can be communicated and computed on a circuit in a broad sense regardless of whether it is represented by high/low signal levels forming a binary bit string of 0s and 1s, by physical signal values, or by quantum superposition.
  • A circuit in the broad sense is realized by appropriately combining at least circuits, circuitry, processors, and memory, and may include devices such as ASICs (application-specific integrated circuits), SPLDs (simple programmable logic devices), CPLDs (complex programmable logic devices), and FPGAs (field-programmable gate arrays).
  • The programs for realizing the software appearing in the embodiments may be provided in a form downloadable from a server, may be executed on a cloud computer, or may be stored in non-volatile or volatile non-transitory storage media and distributed.
  • System configuration: Fig. 1 is a diagram showing an example of the system configuration of the information processing system 1000.
  • the information processing system 1000 includes, as a system configuration, a server device 100, a client device 110, and a camera 120.
  • the server device 100, the client device 110, and the camera 120 are communicatively connected via a network 150.
  • the camera 120 is a device that captures an image of a surgical site (surgical field) in an operating room. In the case of an open surgery, the camera 120 is a Web camera or the like installed in the operating room.
  • Based on control information from the server device 100, the camera 120 changes its imaging direction, imaging position, and camera settings (e.g., ISO sensitivity, white balance, shutter mode, microphone, self-timer, focus setting, camera-shake reduction, flicker prevention, storage destination, time-lapse shooting interval, etc.), and also changes its digital zoom setting.
  • Digital zoom is a process of stretching and enlarging a part of an image by image processing.
  • the server device 100 is a device that executes the process according to the embodiment on captured images transmitted from the camera 120 or the like.
  • the client device 110 is a device that displays the results of the processing performed by the server device 100, etc.
  • In FIG. 1, only one client device 110 is illustrated in the information processing system 1000 for the sake of simplicity, but the information processing system 1000 may include multiple client devices 110.
  • Likewise, in FIG. 1, only one camera 120 is illustrated in the information processing system 1000 for the sake of simplicity, but the information processing system 1000 may include multiple cameras 120.
  • the types of the multiple cameras 120 may be the same or different. Examples of the types of camera 120 include a web camera, an endoscope camera, etc.
  • the processes of the following embodiments and the like will be described as being executed by the server device 100, but instead of the server device 100, a PC (Personal Computer), a cloud system, etc. may execute the processes of the embodiments and the like.
  • FIG. 2 is a diagram showing an example of the hardware configuration of the server device 100.
  • the server device 100 includes, as its hardware configuration, a control unit 210, a storage unit 220, and a communication unit 230.
  • the control unit 210 is a central processing unit (CPU) or the like, and controls the entire server device 100.
  • the storage unit 220 is any one of a hard disk drive (HDD), a read only memory (ROM), a random access memory (RAM), a solid state drive (SSD), or the like, or any combination thereof, and stores programs and data used when the control unit 210 executes processing based on the programs.
  • the control unit 210 executes processes based on the programs stored in the storage unit 220, thereby realizing the functions of the server device 100.
  • the communication unit 230 is a NIC (Network Interface Card) or the like, which connects the server device 100 to the network 150 and controls communication with other devices.
  • the data used by the control unit 210 to execute processing based on a program is described as being stored in the storage unit 220, but the data may be stored in a storage unit or the like of another device with which the server device 100 can communicate. Also, in FIG. 2, the control unit 210 is described as being one unit, but multiple control units may execute processing based on a program stored in a storage unit or the like.
  • FIG. 3 is a diagram showing an example of the hardware configuration of the client device 110.
  • the client device 110 includes, as a hardware configuration, a control unit 310, a storage unit 320, an input unit 330, an output unit 340, and a communication unit 350.
  • the control unit 310 is a CPU or the like, and controls the entire client device 110.
  • the storage unit 320 is any one of an HDD, a ROM, a RAM, and an SSD, or any combination thereof, and stores a program, data used when the control unit 310 executes processing based on the program, and the like.
  • the control unit 310 executes processing based on the program stored in the storage unit 320, thereby realizing the function of the client device 110.
  • the input unit 330 is a keyboard and/or a mouse, and the like, and inputs user operation information, etc. to the control unit 310.
  • the output unit 340 is a display, and the like, and displays the results of information processing by the control unit 310, etc.
  • the communication unit 350 is a NIC or the like, which connects the client device 110 to the network 150 and controls communication with other devices.
  • the camera 120 also has at least a control unit, a memory unit, and an imaging unit.
  • the imaging unit captures an image of a subject (the surgical site in this embodiment) based on the control of the control unit.
  • the control unit controls the overall processing of the camera 120.
  • the control unit realizes the functions of the camera 120 by executing processing based on a program stored in the memory unit.
  • the memory unit stores the program for the camera 120 and data used when the control unit executes processing based on the program.
  • the memory unit may also be configured to store images captured by the camera 120.
  • FIG. 4 is an activity diagram showing an example of information processing for storing captured images.
  • In activity A401, the control unit 210 waits to receive a captured image of the surgical site (hereinafter also simply referred to as a captured image) from the camera 120. If it receives one, it proceeds to activity A402; otherwise, it repeats the processing of activity A401.
  • the control unit 210 stores the received captured image in the storage unit 220 or the like.
  • the server device 100 repeatedly executes the process shown in Fig. 4.
  • the camera 120 captures images of the surgical site in the operating room and transmits chronologically continuous captured images (i.e., video) to the server device 100.
  • the server device 100 stores the video transmitted from the camera 120 in the storage unit 220 or the like.
  • In the following, the server device 100 is described as executing the information processing described below after storing the captured image received from the camera 120 in the storage unit 220 or the like.
  • However, the server device 100 may instead acquire the captured image from the data received from the camera 120 and execute the information processing described below before storing the captured image in the storage unit 220 or the like.
  • FIG. 5 is an activity diagram showing an example of main information processing.
  • The control unit 210 acquires a captured image (frame) to be processed from the storage unit 220 or from the data received from the camera 120.
  • Fig. 6 is a diagram showing an example of the acquired captured image.
  • the captured image includes a surgical site of an abdominal surgery. Also, as shown in Fig. 6, the captured image includes an operator's hand 610, a surgical instrument 620, a surgical instrument 630, and a surgical instrument 640. That is, the captured image of the surgery shown in Fig. 6 is an image of a surgical site during an abdominal surgery.
  • the operator's hand 610, the surgical instrument 620, the surgical instrument 630, and the surgical instrument 640 included in the captured image are examples of objects.
  • the control unit 210 extracts objects from the captured image.
  • the control unit 210 inputs the captured image to the trained model.
  • The trained model has been trained using captured images of surgery as input data and the objects contained in those captured images as output data.
  • the control unit 210 extracts objects from the captured image by acquiring the objects output from the trained model.
  • the control unit 210 may perform image analysis of the captured image and extract objects from the captured image based on the image analysis. When multiple objects are present, the control unit 210 extracts each of the multiple objects. In the following explanation, unless otherwise specified, it is assumed that multiple objects are present.
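  • As an illustrative sketch only (the embodiment does not tie the extraction step to a particular architecture), the object extraction could be performed with a generic instance-segmentation network. In the example below, the torchvision Mask R-CNN weights, the score threshold, and the helper name extract_objects are assumptions for demonstration, not part of the original disclosure.

```python
import torch
import torchvision
from torchvision.transforms.functional import to_tensor

# Hypothetical stand-in for the trained model described above: any instance-
# segmentation network trained on surgical images (hands, instruments, etc.)
# could take its place. A generic pretrained Mask R-CNN is loaded here purely
# as an example.
model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

def extract_objects(frame_rgb, score_threshold=0.7):
    """Return boolean HxW masks of the objects detected in one captured frame."""
    with torch.no_grad():
        prediction = model([to_tensor(frame_rgb)])[0]
    masks = []
    for mask, score in zip(prediction["masks"], prediction["scores"]):
        if score >= score_threshold:
            masks.append((mask[0] > 0.5).numpy())
    return masks
```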
  • the control unit 210 obtains the axes of each of a plurality of objects extracted from the captured image, and obtains a gaze point from the intersection of the obtained axes.
  • a method for determining the axis of an object will be described with reference to Fig. 7.
  • Fig. 7 is a diagram for explaining a method for determining the axis of an object.
  • the control unit 210 determines a rectangular area that circumscribes an area from which an object included in a captured image is extracted, and determines a width W and a height H of the rectangular area.
  • the control unit 210 also determines a center of gravity (X, Y) 710 of the rectangular area.
  • The control unit 210 obtains the area and the center of gravity of the extraction area existing within a circle of radius √(W² + H²) × 3/8 centered at each of the following eight points: 1. (X − W/4, Y − H/4); 2. (X, Y − H/4); 3. (X + W/4, Y − H/4); 4. (X − W/4, Y); 5. (X + W/4, Y); 6. (X − W/4, Y + H/4); 7. (X, Y + H/4); 8. (X + W/4, Y + H/4). In Fig. 7, these eight circles are shown as the areas of interest, and the control unit 210 calculates the area and the center of gravity of the extraction area that exists within each of these circles.
  • The control unit 210 then selects, from the eight circles, the area of interest whose overlap with the extraction area is largest.
  • In Fig. 7, area of interest 720 is the selected area of interest.
  • The control unit 210 also selects the three areas of interest that lie opposite the selected area of interest with respect to the center of gravity.
  • In Fig. 7, areas of interest 721, 722, and 723 are selected as these three areas of interest.
  • The control unit 210 determines a center of gravity 730 of the area where the selected area of interest 720 overlaps with the extraction area.
  • Similarly, the control unit 210 determines centers of gravity 731, 732, and 733 of the areas where the areas of interest 721, 722, and 723, respectively, overlap with the extraction area.
  • the control unit 210 also determines a center of gravity 735 which is the center of gravity of the centers of gravity 731, 732, and 733. Then, the control unit 210 sets the straight line connecting the center of gravity 730, the center of gravity 710, and the center of gravity 735 as the axis of the object to be processed among the extracted objects.
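  • A minimal NumPy sketch of this axis-determination procedure is shown below. It assumes the object is supplied as a boolean mask, reconstructs the radius formula as read from the text, and picks the three opposite areas of interest as the three closest to the point reflection of the selected area about the center of gravity; that selection rule and the function name are our reading, not a verbatim specification.

```python
import numpy as np

def object_axis(mask):
    """Estimate the axis of one extracted object given as a boolean HxW mask."""
    ys, xs = np.nonzero(mask)
    x_min, x_max = xs.min(), xs.max()
    y_min, y_max = ys.min(), ys.max()
    W, H = x_max - x_min, y_max - y_min
    X, Y = (x_min + x_max) / 2.0, (y_min + y_max) / 2.0  # centre of the circumscribing rectangle
    radius = np.sqrt(W**2 + H**2) * 3 / 8

    # Eight areas of interest offset by +/- W/4 and +/- H/4 from the centre of gravity.
    offsets = [(-W/4, -H/4), (0, -H/4), (W/4, -H/4),
               (-W/4, 0),               (W/4, 0),
               (-W/4, H/4),  (0, H/4),  (W/4, H/4)]
    centres = [(X + dx, Y + dy) for dx, dy in offsets]

    areas, centroids = [], []
    for cx, cy in centres:
        inside = (xs - cx) ** 2 + (ys - cy) ** 2 <= radius ** 2
        areas.append(inside.sum())
        centroids.append((xs[inside].mean(), ys[inside].mean()) if inside.any() else (cx, cy))

    best = int(np.argmax(areas))               # area of interest with the largest overlap (720)
    bx, by = centres[best]
    reflected = (2 * X - bx, 2 * Y - by)       # point reflection of 720 about the centre of gravity
    opposite = sorted(range(8), key=lambda i: (centres[i][0] - reflected[0]) ** 2 +
                                              (centres[i][1] - reflected[1]) ** 2)[:3]
    c_near = np.array(centroids[best])                          # centre of gravity 730
    c_far = np.mean([centroids[i] for i in opposite], axis=0)   # centre of gravity 735
    # The axis is the line through c_near and c_far, which passes close to (X, Y).
    return c_near, c_far
```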
  • The control unit 210 determines the axis of each of the multiple objects extracted from the captured image, and determines the gaze point of the captured image to be processed from the intersections of the determined axes. More specifically, the control unit 210 removes outlier intersections from all intersections of the determined axes. The control unit 210 removes outliers using an outlier-removal algorithm called LocalOutlierFactor, but is not limited to this method. The control unit 210 sets the average point of the intersections from which the outliers have been removed as the gaze point of the captured image to be processed (current frame).
  • Figure 8 is a diagram showing an example of the gaze point of the captured image to be processed (current frame). Point 810 is an example of the determined gaze point.
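  • A minimal sketch of this step is shown below. scikit-learn's LocalOutlierFactor is used as one concrete instance of the outlier-removal algorithm named above; the neighbor count, the minimum-point threshold, and the pairwise-intersection helper are assumptions. Combined with the earlier sketches, a per-frame call could look like frame_gaze_point([object_axis(m) for m in extract_objects(frame)]).

```python
import numpy as np
from sklearn.neighbors import LocalOutlierFactor

def line_intersection(p1, p2, q1, q2):
    """Intersection of the line through p1-p2 with the line through q1-q2 (None if parallel)."""
    d1 = np.subtract(p2, p1)
    d2 = np.subtract(q2, q1)
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:
        return None
    t = ((q1[0] - p1[0]) * d2[1] - (q1[1] - p1[1]) * d2[0]) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

def frame_gaze_point(axes):
    """Gaze point of one frame: mean of the pairwise axis intersections after outlier removal."""
    points = []
    for i in range(len(axes)):
        for j in range(i + 1, len(axes)):
            p = line_intersection(*axes[i], *axes[j])
            if p is not None:
                points.append(p)
    if not points:
        return None
    points = np.array(points)
    if len(points) >= 3:
        labels = LocalOutlierFactor(n_neighbors=min(5, len(points) - 1)).fit_predict(points)
        points = points[labels == 1]           # keep inliers; outliers are labelled -1
    return points.mean(axis=0)
```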
  • The control unit 210 acquires, from the storage unit 220 or the like, the gaze point of the captured image to be processed (current frame) and the gaze points of the past captured images (past frames) that are chronologically consecutive with the current frame.
  • the control unit 210 acquires gaze points for four frames as the gaze points of the past captured images (past frames), but is not limited to this.
  • the control unit 210 calculates the average of the gaze points of the captured image (current frame) and the gaze points of the past captured images (past frames) to obtain a comprehensive gaze point.
  • This process is an example of obtaining a gaze point related to the surgery based on the gaze point determined from the captured image to be processed (current frame) and the gaze points of the past captured images (past frames) that are chronologically consecutive with the captured image to be processed.
  • FIG. 9 is a diagram showing an example of obtaining a comprehensive gaze point based on the gaze point of the current frame and the gaze points of several past frames.
  • The comprehensive gaze point determined in this way is the gaze point determined in activity A503.
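  • The averaging over the current frame and the preceding frames can be expressed with a simple running window, as in the sketch below; the window length (current frame plus four past frames) follows the example above, and the class name is an assumption.

```python
from collections import deque

import numpy as np

class GazeSmoother:
    """Comprehensive gaze point: mean of the current frame's gaze point and the
    gaze points of the most recent past frames."""

    def __init__(self, past_frames=4):
        # Keep the current frame plus `past_frames` past frames.
        self.history = deque(maxlen=past_frames + 1)

    def update(self, gaze_point):
        self.history.append(np.asarray(gaze_point, dtype=float))
        return np.stack(self.history).mean(axis=0)
```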
  • control unit 210 outputs information based on the gaze point obtained in activity A503.
  • the control unit 210 outputs information based on the obtained gaze point (for example, an object indicating the gaze point) by displaying information indicating the obtained gaze point superimposed on the captured image displayed on the output unit 340 of the client device 110.
  • the control unit 210 outputs information based on the obtained gaze point by transmitting control information for controlling an imaging device that captures an image based on the obtained gaze point to the imaging device.
  • the control information includes information for controlling the camera 120 so that the gaze point is included in the captured image.
  • the control unit 210 transmits control information to the camera 120 for shifting the imaging orientation or position of the camera 120 so that the gaze point is included in the captured image.
  • Upon receiving the control information, the camera 120 changes its imaging orientation or its position based on the control information so that the gaze point is included in the captured image.
  • the camera 120 can be controlled to record an image including the gaze point of the surgical site.
  • an image including the gaze point of the surgical site is recorded in the camera 120 or the server device 100.
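  • As a hedged illustration of such control information (the actual control interface of the camera 120 is not specified here), the sketch below derives a pan/tilt correction whenever the comprehensive gaze point drifts toward the border of the captured image; the margin value and the dictionary format of the control message are assumptions.

```python
def gaze_recentre_command(gaze_point, frame_shape, margin=0.1):
    """Build control information that re-centres the camera on the gaze point.

    Returns a pan/tilt correction (normalised to the frame size) when the gaze
    point lies within `margin` of the image border, and None when no correction
    is needed. The returned dictionary is a hypothetical message format.
    """
    h, w = frame_shape[:2]
    gx, gy = gaze_point
    dx, dy = gx / w - 0.5, gy / h - 0.5      # offset from the image centre, in [-0.5, 0.5]
    if abs(dx) > 0.5 - margin or abs(dy) > 0.5 - margin:
        return {"pan": dx, "tilt": dy}
    return None
```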
  • Modifications of the first embodiment will now be described. Modification 1 describes another example of part of the processing of the first embodiment; it is included in the first embodiment and is not a separate embodiment. The same applies to the modifications described below.
  • Another example of the control information is information for controlling the imaging device to focus on the gaze point. For example, when the focus of the camera 120 deviates from the gaze point, the control unit 210 transmits control information to the camera 120 for changing the focus setting of the camera 120 so that the focus of the camera 120 coincides with the gaze point of the surgical site (surgical field). Upon receiving the control information, the camera 120 focuses the camera 120 on the gaze point of the surgical site (surgical field) based on the control information.
  • the camera 120 can be controlled to record an image that includes the gaze point of the surgical site that is not out of focus. As a result, an image that includes the gaze point of the surgical site and is not out of focus is recorded in the camera 120 or the server device 100.
  • the control unit 210 may output audio information as information based on the obtained gaze point.
  • An example of the audio information is an utterance such as "The gaze point is blocked by the surgeon's head."
  • In this way, information regarding the gaze point can be output as audio.
  • the control unit 210 transmits control information to the laparoscopic camera for controlling the laparoscopic camera so that an image including the gaze point of the surgical site (operative field) is always captured during the laparoscopic surgery.
  • the laparoscopic camera moves, for example, automatically during the laparoscopic surgery so that an image including the gaze point of the surgical site (operative field) is always captured.
  • the camera 120 can be controlled to record an image including the gaze point of the surgical site even in surgery other than open surgery, such as laparoscopic surgery. As a result, an image including the gaze point of the surgical site is recorded in the camera 120 or the server device 100.
  • In activity A502, the control unit 210 extracts surgical instruments and anatomical structures from the captured image.
  • the surgical instruments are an example of a first object.
  • the anatomical structures are an example of a second object.
  • the anatomical structures are anatomical parts that constitute the human body, and examples of such anatomical structures include the gallbladder, liver, and heart.
  • the control unit 210 inputs the acquired captured image into the trained model.
  • the trained model is a trained model trained using the surgical captured image as input data and the objects and object types included in the surgical captured image as output data.
  • the control unit 210 determines the axis of the surgical instrument. Then, the control unit 210 determines the intersection of the determined axis and the extracted anatomical structure, and sets the determined intersection as the gaze point.
  • FIG. 10 is a diagram showing an example of a gaze point of a captured image (current frame) to be processed in Modification Example 4.
  • An axis 1010 is an axis of a surgical instrument extracted from the captured image.
  • a gaze point 1020 is an intersection point of an anatomical structure and the axis 1010. According to the fourth modification, even if there is only one surgical instrument, the gaze point can be obtained based on the captured image. Therefore, even if there is only one surgical instrument, the camera 120 can be controlled to record an image including the gaze point of the surgical site. As a result, an image including the gaze point of the surgical site is recorded in the camera 120 or the server device 100.
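  • A possible sketch of this fourth modification is shown below: it walks along the determined instrument axis and returns the first point that falls on the extracted anatomical structure mask. The sampling range, the number of samples, and the assumption that the axis is ordered toward the instrument tip are ours, not part of the original description.

```python
import numpy as np

def gaze_from_axis_and_structure(axis, structure_mask, n_samples=500):
    """Gaze point as the intersection of the instrument axis with the anatomical structure.

    `axis` is a pair of points (e.g. the output of object_axis above) ordered so
    that walking from the first to the second point heads toward the instrument
    tip; `structure_mask` is the boolean HxW mask of the anatomical structure.
    """
    (x0, y0), (x1, y1) = axis
    h, w = structure_mask.shape
    for t in np.linspace(0.0, 3.0, n_samples):   # extend the axis beyond its second point
        x, y = x0 + t * (x1 - x0), y0 + t * (y1 - y0)
        xi, yi = int(round(x)), int(round(y))
        if 0 <= xi < w and 0 <= yi < h and structure_mask[yi, xi]:
            return (x, y)
    return None                                  # the axis does not meet the structure
```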
  • An information processing device that acquires captured images of a surgery, determines a gaze point related to the surgery based on the captured images of the surgery, and outputs information based on the determined gaze point.
  • an information processing device extracts multiple objects included in the captured image, determines the axes of the extracted multiple objects, and determines the gaze point from the intersection of the determined axes.
  • the object is an instrument related to the surgery.
  • a first object and a second object included in the captured image are extracted, an axis of the extracted first object is determined, and the gaze point is determined from the intersection of the axis and the second object.
  • the first object is an instrument related to the surgery
  • the second object is an anatomical structure.
  • An information processing device which determines a gaze point related to the surgery based on the gaze point determined based on the captured image and the gaze point of a past captured image that is chronologically consecutive to the captured image.
  • An information processing device which outputs information based on the determined gaze point by displaying information indicating the determined gaze point superimposed on the captured image.
  • An information processing device which outputs information based on the determined gaze point by transmitting control information to an imaging device that captures the captured image based on the determined gaze point.
  • control information is information for controlling the imaging device so that the gaze point is included in the captured image.
  • control information is information for controlling the imaging device so as to focus on the gaze point.
  • An information processing method that acquires an image of a surgery, determines a gaze point related to the surgery based on the image of the surgery, and outputs information based on the determined gaze point.
  • 100: Server device, 110: Client device, 120: Camera, 150: Network, 210: Control unit, 220: Storage unit, 230: Communication unit, 310: Control unit, 320: Storage unit, 330: Input unit, 340: Output unit, 350: Communication unit, 610: Hand, 620: Instrument, 630: Instrument, 640: Instrument, 710: Center of gravity, 720: Area of interest, 721: Area of interest, 722: Area of interest, 723: Area of interest, 730: Center of gravity, 731: Center of gravity, 732: Center of gravity, 733: Center of gravity, 735: Center of gravity, 1000: Information processing system

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Artificial Intelligence (AREA)
  • Data Mining & Analysis (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Evolutionary Computation (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Epidemiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Public Health (AREA)
  • Image Analysis (AREA)

Abstract

One embodiment of the present invention relates to an information processing device. This information processing device: acquires a captured image of a surgery; determines a gaze point related to the surgery on the basis of the captured image of the surgery; and outputs information on the basis of the determined gaze point.
PCT/JP2023/044369 2022-12-15 2023-12-12 Information processing device for outputting information based on a gaze point, and information processing method and program for outputting information based on a gaze point Ceased WO2024128212A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2024564388A JPWO2024128212A1 (fr) 2022-12-15 2023-12-12

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-200600 2022-12-15
JP2022200600 2022-12-15

Publications (1)

Publication Number Publication Date
WO2024128212A1 true WO2024128212A1 (fr) 2024-06-20

Family

ID=91485739

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/044369 2022-12-15 2023-12-12 Information processing device for outputting information based on a gaze point, and information processing method and program for outputting information based on a gaze point Ceased WO2024128212A1 (fr)

Country Status (2)

Country Link
JP (1) JPWO2024128212A1 (fr)
WO (1) WO2024128212A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090245600A1 (en) * 2008-03-28 2009-10-01 Intuitive Surgical, Inc. Automated panning and digital zooming for robotic surgical systems
WO2018159328A1 (fr) * 2017-02-28 2018-09-07 Sony Corporation Medical arm system, control device, and control method
WO2019087904A1 (fr) * 2017-11-01 2019-05-09 Sony Corporation Surgical arm system and surgical arm control system
JP2021013412A (ja) * 2019-07-10 2021-02-12 Sony Corporation Medical observation system, control device, and control method
WO2021193697A1 (fr) * 2020-03-26 2021-09-30 University of Tsukuba Multi-viewpoint video imaging device

Also Published As

Publication number Publication date
JPWO2024128212A1 (fr) 2024-06-20

Similar Documents

Publication Publication Date Title
US12444094B2 (en) Systems and methods for controlling surgical data overlay
US10169535B2 (en) Annotation of endoscopic video using gesture and voice commands
JP5904812B2 (ja) Surgeon assistance for a medical display
JP7160033B2 (ja) Input control device, input control method, and surgery system
JP6663571B2 (ja) Endoscope image processing device, operation method of endoscope image processing device, and program
JP2005110878A (ja) Surgery support system
JPWO2016208246A1 (ja) Medical stereoscopic observation device, medical stereoscopic observation method, and program
EP3415076B1 (fr) Medical image processing device, system, method, and program
JPWO2018096987A1 (ja) Information processing device and method, and program
CN113768619B (zh) Path positioning method, information display device, storage medium, and integrated circuit chip
US20230410491A1 (en) Multi-view medical activity recognition systems and methods
WO2023002661A1 (fr) Information processing system, information processing method, and program
JP7521538B2 (ja) Medical imaging system, medical imaging method, and image processing device
JP7417337B2 (ja) Information processing system, information processing method, and program
WO2020152758A1 (fr) Endoscope instrument and endoscope system
WO2024128212A1 (fr) Information processing device for outputting information based on a gaze point, and information processing method and program for outputting information based on a gaze point
US12008682B2 (en) Information processor, information processing method, and program image to determine a region of an operation target in a moving image
JP7563384B2 (ja) Medical image processing device and medical image processing program
Green et al. Microanalysis of video from a robotic surgical procedure: implications for observational learning in the robotic environment
WO2018087977A1 (fr) Information processing device, information processing method, and program
JP7571722B2 (ja) Medical observation system and method, and medical observation device
JP7480783B2 (ja) Endoscope system, control device, and control method
JP7451707B2 (ja) Control device, data log display method, and medical centralized control system
WO2020049993A1 (fr) Image processing device, image processing method, and program
US12290232B2 (en) Enhanced video enabled software tools for medical environments

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23903496

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2024564388

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 23903496

Country of ref document: EP

Kind code of ref document: A1