WO2024202956A1 - Medical data processing device and medical system - Google Patents
Medical data processing device and medical system
- Publication number
- WO2024202956A1 (application PCT/JP2024/007940, JP2024007940W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- display area
- surgeon
- image display
- video
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/045—Control thereof
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
- A61B34/35—Surgical robots for telesurgery
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B23/00—Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
- G09B23/28—Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B5/00—Electrically-operated educational appliances
- G09B5/02—Electrically-operated educational appliances with visual presentation of the material to be studied, e.g. using film strip
Definitions
- This technology relates to medical data processing devices and medical systems, for example, medical data processing devices and medical systems that allow medical staff to check the details of surgery during or after surgery.
- Endoscopic surgery and robotic surgery are becoming more common.
- The medical equipment used in such surgeries is often equipped with foot switches operated with the feet, and some equipment allows the surgeon to operate the foot switches while performing the surgery.
- In some cases, the foot switch is shown on a monitor different from the monitor that displays the surgical field, so the surgeon must move his or her gaze from the monitor displaying the surgical field to the monitor displaying the foot switch.
- A medical data processing device according to one aspect of the present technology includes a video data generating unit that generates video data of an image in which a surgical field image display area that displays an image of the surgical field and a part image display area that displays an image including a specified part of the surgeon's body are provided on the same screen.
- A medical system according to one aspect of the present technology includes a surgical field camera that photographs the surgical field, a hand camera that photographs the hands of the surgeon who operates the operating levers of the surgical system, a foot camera that photographs the feet of the surgeon who operates the foot pedals of the surgical system, and a display device that displays at least two of the following on the same screen: a surgical field image display area that displays the surgical field image photographed by the surgical field camera, a hand image display area that displays the hand image photographed by the hand camera, and a foot image display area that displays the foot image photographed by the foot camera.
- In the medical data processing device of one aspect of the present technology, video data is generated of an image in which a surgical field image display area that displays an image of the surgical field and a body part image display area that displays an image including a specified part of the surgeon's body are provided on the same screen.
- In the medical system of one aspect of the present technology, a surgical field camera photographs the surgical field, a hand camera photographs the hands of the surgeon operating the operating levers of the surgical system, a foot camera photographs the feet of the surgeon operating the foot pedals of the surgical system, and a display device displays at least two of the surgical field image display area, the hand image display area, and the foot image display area on the same screen.
- The medical data processing device may be an independent device or an internal block that constitutes a single device.
- FIG. 1 is a diagram showing the configuration of an embodiment of a medical system to which the present technology is applied.
- FIGS. 2A and 2B are diagrams for explaining the mounting positions of cameras and the images to be captured.
- FIG. 3 is a diagram showing the configuration of another embodiment of a medical system to which the present technology is applied.
- FIG. 4 is a diagram illustrating an example of the configuration of a closed console type operation unit.
- FIG. 5 is a diagram illustrating an example of the configuration of an open console type operation unit.
- FIG. 6 is a diagram illustrating an example of the configuration of a medical system.
- FIG. 7 is a diagram for explaining a screen displayed on a display device.
- FIG. 8 is a diagram illustrating an example of the configuration of a data processing device.
- FIG. 9 is a diagram for explaining distortion correction.
- FIG. 10 is a diagram for explaining how videos are synchronized.
- FIG. 11 is a diagram for explaining video clipping.
- FIG. 12 is a diagram for explaining superimposition of an image of a hand.
- FIG. 13 is a diagram for explaining how an image of a hand is generated.
- FIG. 14 is a diagram illustrating an example of the configuration of a PC.
- <Example of a medical system> The technology disclosed herein can be applied to medical systems that enable the viewing of images of the surgical field, the surgeon's hands, the surgeon's feet, and the like during or after surgery.
- Fig. 1 is a schematic diagram of an endoscopic surgery system 1 as an example of a medical system to which the technology according to the present disclosure is applied.
- Fig. 1 shows a state in which an operator (doctor) 81 is performing surgery on a patient PT lying on a patient bed BD.
- The endoscopic surgery system 1 is composed of an endoscope 10, other surgical tools 20, and a support arm device 30 that supports the endoscope 10.
- The endoscopic surgery system 1 also includes various devices for endoscopic surgery.
- The endoscope 10 may also be supported by a member of the medical staff known as a scopist.
- The lens barrel 11 of the endoscope 10 and the other surgical tools 20 are inserted into the body cavity of the patient PT through the trocars 41a to 41d.
- The other surgical tools 20 inserted into the body cavity of the patient PT include a pneumoperitoneum tube 21, an energy treatment tool 22 that uses high-frequency current or ultrasonic vibration to cut or dissect tissue and seal blood vessels, and forceps 23.
- The illustrated surgical tools 20 are merely an example, and various surgical tools generally used in endoscopic surgery, such as tweezers or a retractor, may be used as the surgical tools 20.
- An image of the surgical site inside the body cavity of the patient PT captured by the endoscope 10 is displayed on the display 50. While viewing the image of the surgical site displayed on the display 50 in real time, the surgeon 81 performs treatment such as resecting the affected area using the energy treatment tool 22 and forceps 23. Although not shown, the insufflation tube 21, energy treatment tool 22, and forceps 23 are supported by the surgeon 81 or an assistant during surgery.
- the support arm device 30 includes an arm portion 32 extending from a base portion 31.
- the arm portion 32 is composed of joints 33a, 33b, and 33c and links 34a and 34b, and is driven under the control of a control device (not shown).
- the arm portion 32 supports the endoscope 10, and controls its position and orientation. This allows the endoscope 10 to be stably fixed in its position.
- The endoscope 10 is composed of a lens barrel 11, the tip of which is inserted into the body cavity of the patient PT to a predetermined length, and a camera head 12 connected to the base end of the lens barrel 11.
- The endoscope 10 is configured as a so-called rigid scope having a rigid lens barrel 11, but the endoscope 10 may also be configured as a so-called flexible scope having a flexible lens barrel 11.
- the tip of the lens barrel 11 has an opening in which a wide-angle lens is fitted as an objective lens.
- a light source device (not shown) is connected to the endoscope 10, and light generated by the light source device is guided to the tip of the lens barrel 11 by a light guide that extends inside the lens barrel 11, and is irradiated via the wide-angle lens towards the object to be observed inside the body cavity of the patient PT.
- the endoscope 10 may be a direct-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
- An optical system and an image sensor are provided inside the camera head 12, and reflected light from the object being observed (observation light) is focused onto the image sensor by the optical system.
- the observation light is photoelectrically converted by the image sensor to generate an electrical signal corresponding to the observation light, i.e., an image signal corresponding to the observed image.
- the image signal is sent as RAW data to a CCU (Camera Control Unit) (not shown).
- the camera head 12 is equipped with a function for adjusting the magnification and focal length by appropriately driving the optical system.
- multiple imaging elements may be provided in the camera head 12.
- multiple relay optical systems are provided inside the lens barrel 11 to guide the observation light to each of the multiple imaging elements.
- the display 50 may also be capable of 3D display.
- the endoscopic surgery system 1 is provided with an input device that is an input interface for the endoscopic surgery system 1.
- the input device is composed of, for example, a mouse, a keyboard, a touch panel, a switch, a foot pedal 61, a lever, etc.
- the touch panel may be provided on the display surface of the display 50.
- the input device may be composed of a device worn by the surgeon, such as a glasses-type wearable device or a head mounted display (HMD), or a camera capable of detecting the surgeon's movements, and various inputs may be made according to the surgeon's gestures and line of sight.
- This configuration allows the surgeon 81 to perform various procedures while viewing the image of the surgical site displayed on the display 50.
- the endoscopic surgery system 1 is equipped with a gaze position detection camera 71-1 that captures the face (eyes) of the surgeon 81 to detect the gaze position of the surgeon 81, a hand video camera 71-2 that captures the hands of the surgeon 81, and a foot video camera 71-3 that captures the feet of the surgeon 81.
- The gaze position detection camera 71-1 is attached to a part of the display 50 so as to face the surgeon 81.
- The hand video camera 71-2 is attached to the shadowless lamp 52 on the ceiling. For example, as shown in FIG. 2A, the hand video camera 71-2 is attached on the illuminated side of the shadowless lamp 52 in a position where it does not block the light from the shadowless lamp 52.
- The foot video camera 71-3 is attached to the underside of the patient bed BD. For example, as shown in FIG. 2B, the foot video camera 71-3 captures the feet of the surgeon 81 and captures the operation of the foot pedal 61 by the surgeon 81.
- the images captured by such cameras 71 are viewed during or after surgery by the surgeon 81 and other medical staff (such as medical students), as described below.
- the images captured by the camera 71 are displayed on the display 50 or on the display device 380 ( Figure 6).
- <Robotic endoscopic surgery system> The present technology can also be applied to a medical system known as a robotic endoscopic surgery system.
- FIG. 3 shows an example of the configuration of a robotic endoscopic surgery system 100.
- The robotic endoscopic surgery system 100 is a system in which a robot takes over some of the functions of the endoscopic surgery system 1 shown in FIG. 1.
- the robotic endoscopic surgery system 100 is composed of an operation unit 101, a main body 102, and a monitor unit 103.
- the operation unit 101 is a device for operating the main body 102.
- the main body 102 has, for example, three arms 111 to 113.
- the operation unit 101 is operated by the surgeon 81a and remotely controls the arms 111 to 113 of the main body 102.
- the surgeon 81a operates the arms 111 to 113 of the main body 102 while looking at a display provided on the operation unit 101.
- Attached to the arms 111 to 113 of the main body 102 are, for example, an electric scalpel, an endoscope, and forceps.
- the monitor unit 103 is installed near the main body 102 and is used to monitor the progress of the surgery.
- the surgeon 81b watches the monitor unit 103 and provides assistance with the surgery as necessary.
- FIG. 4 is a diagram showing an example of the configuration of the operation unit 101.
- the operation unit 101 shown in FIG. 4 is a type of operation unit known as a closed console type.
- The operation unit 101 is provided with a display 121. The surgical field imaged by an imaging device attached to the main body 102 is displayed on this display 121, and the surgeon 81a performs surgery while viewing the image.
- The gaze position detection camera 71-1, which captures the gaze position of the surgeon 81a looking at the display 121, is attached in a position where it can capture the gaze position without getting in the way of the surgeon 81a. Note that if there is no suitable position for attaching the gaze position detection camera 71-1, the system can also be configured without it.
- The operation unit 101 is provided with operation levers 122-1 and 122-2 that operate the arms 111 to 113 of the main body 102.
- A hand video camera 71-2 is attached in a position where it can capture images of the operation levers 122-1 and 122-2, for example, in a position outside the recess in which the display 121 is provided.
- The hand video camera 71-2 captures images of the hands of the surgeon 81a, and captures the surgeon 81a operating the operation levers 122.
- the operation unit 101 is provided with a foot pedal 123.
- a foot video camera 71-3 is attached in a position where it can capture an image of this foot pedal 123, for example, on the underside of the platform on which the operation lever 122 is provided.
- the foot video camera 71-3 captures the feet of the surgeon 81a, and captures the surgeon 81a operating the foot pedal 123.
- the images captured by such cameras 71 are viewed during or after surgery by the surgeon 81a or medical staff other than the surgeon (e.g., medical students), as described below.
- the images captured by the camera 71 are displayed on a display (not shown) or on the display of the monitor unit 103.
- FIG. 5 is a diagram showing another example of the configuration of the operation unit 101.
- the operation unit 101 shown in FIG. 5 is of a type known as an open console type.
- the operation unit 101 is provided with a display 221.
- the surgical field captured by an imaging device attached to the main body 102 is displayed on this display 221, and the surgeon 81a performs surgery while viewing the image.
- a gaze position detection camera 71-1 that captures the gaze position of the surgeon 81a looking at the display 221 is attached to the frame of the display 221 at the upper side in the figure.
- the operation unit 101 is provided with operation levers 222-1, 222-2 that operate the arms 111 to 113 of the main body 102.
- A hand video camera 71-2 is attached in a position where it can capture images of the operation levers 222-1 and 222-2, for example, on the lower frame of the display 221 in the figure.
- The hand video camera 71-2 captures images of the hands of the surgeon 81a, and captures the surgeon 81a operating the operation levers 222.
- the operation unit 101 is provided with a foot pedal 223.
- a foot video camera 71-3 is attached to a position where it can capture an image of this foot pedal 223, for example, to a frame that is provided so as to surround the operation lever 222.
- the foot video camera 71-3 captures the feet of the surgeon 81a, and captures the surgeon 81a operating the foot pedal 223.
- the images captured by such cameras 71 are viewed during or after surgery by the surgeon 81a or medical staff other than the surgeon (e.g., medical students), as described below.
- the images captured by the camera 71 are displayed on a display (not shown) or on the display 221.
- <Medical system configuration> FIG. 6 is a diagram showing the configuration of an embodiment of a medical system 300 to which the present technology is applied.
- the medical system 300 includes a surgery system 320, an in-hospital information system 340, a data processing device 360, and a display device 380.
- the surgical system 320 is the endoscopic surgical system 1 or the robotic endoscopic surgical system 100 described with reference to Figures 1 to 5.
- Figure 6 shows the parts of the endoscopic surgical system 1 or the robotic endoscopic surgical system 100 that are necessary for the following explanation.
- The surgery system 320 includes a surgical video camera 322, a gaze position detection camera 71-1, a hand video camera 71-2, a foot video camera 71-3, an operating room video camera 324, and an audio capture unit 326.
- In the endoscopic surgery system 1, the surgical video camera 322 is the camera that captures the video displayed on the display 50 (FIG. 1), that is, the camera that captures the surgical site inside the body cavity of the patient PT.
- In the robotic endoscopic surgery system 100, the surgical video camera 322 is the camera that captures the video displayed on the display 121 (FIG. 4) or the display 221 (FIG. 5), that is, the camera that captures the surgical site of the patient PT.
- The gaze position detection camera 71-1, the hand video camera 71-2, and the foot video camera 71-3 are cameras that capture images for detecting the surgeon's gaze position, images of the surgeon's hands, and images of the surgeon's feet, respectively.
- the operating room video camera 324 is a camera installed, for example, on the ceiling or wall of the operating room in which the surgical system 320 is located, and is a camera that captures images of the inside of the operating room.
- the audio capture unit 326 includes a microphone and mainly captures the voice of the surgeon.
- the audio capture unit 326 may be configured to capture not only the voice of the surgeon, but also the voices of people other than the surgeon.
- The in-hospital information system 340, which will be described later with reference to FIG. 8, is a system that manages, acquires, and provides information acquired within the hospital, such as vital signs (vital monitor information) and patient attribute information.
- the data processing device 360 includes a workstation and a server, and uses data supplied from the surgery system 320 and the in-hospital information system 340 to generate video data to be displayed on the display device 380.
- the display device 380 is a display and provides a screen such as that shown in FIG. 7 to the viewer.
- a person who views the image displayed on the display device 380 is appropriately referred to as a viewer.
- When a screen (image) such as that shown in FIG. 7 is viewed during surgery, it is viewed by the surgeon or other medical staff, and these people, including the surgeon, can be viewers.
- When a screen (image) such as that shown in FIG. 7 is viewed after surgery, it is likewise viewed by the surgeon and other medical staff, and these people, including the surgeon, can be viewers.
- the surgeon can view a screen like that shown in Figure 7 to confirm the details of the surgery he or she performed.
- a medical student can view a screen like that shown in Figure 7 to learn about the details of the surgery.
- The display device 380 is provided with a surgical field image display area 382 in which an image of the surgical field captured by the surgical video camera 322 (FIG. 6) is displayed, and a hand image display area 384 in which an image of the surgeon's hands captured by the hand video camera 71-2 (FIG. 6) is displayed.
- The display device 380 also has a foot image display area 386 in which an image of the surgeon's feet captured by the foot video camera 71-3 (FIG. 6) is displayed, and an information display area 388 in which other information is displayed.
- a pointer 391 indicating the eye gaze position of the surgeon detected by the eye gaze position detection camera 71-1 ( Figure 6) is displayed within the surgical field video display area 382.
- a mechanism may be provided that allows the viewer to select whether or not to display the pointer 391.
- the screen displayed on the display device 380 includes a surgical field image display area 382, a hand image display area 384, a foot image display area 386, and an information display area 388, but the screen may display two or three of these four areas.
- Two areas may be displayed: a surgical field image display area 382, and a hand image display area 384 or a foot image display area 386.
- An information display area 388 may be further displayed in addition to these two areas.
- the area to be displayed on the display device 380 among the surgical field image display area 382, hand image display area 384, foot image display area 386, and information display area 388 may be selected by the viewer, and a screen in which the area selected by the user is laid out may be displayed.
- the display device 380 may be a display capable of displaying three-dimensional images.
- When the display device 380 is capable of displaying three-dimensional images, at least the images displayed in the surgical field image display area 382 are displayed as three-dimensional images.
- In this case, the surgical video camera 322 is a camera that captures three-dimensional images.
- the surgical video camera 322 may be a camera that captures two-dimensional images, and the two-dimensional images captured by the surgical video camera 322 may be processed into three-dimensional images, and the three-dimensional images may be displayed in the surgical field image display area 382.
- the surgical field image display area 382 and the hand image display area 384 are provided on the same screen, allowing the image of the surgical field and the image of the surgeon's hand to be viewed simultaneously, making it possible to check how the surgeon has operated the operating lever 222 ( Figure 5) while performing the procedure displayed in the surgical field image display area 382.
- the surgical field image display area 382 and foot image display area 386 are provided on the same screen, allowing the image of the surgical field and the image of the surgeon's feet to be viewed simultaneously, making it possible to check how the surgeon has operated the foot pedal 223 ( Figure 5) while performing the procedure displayed in the surgical field image display area 382.
- Images of the inside of the operating room captured by the operating room video camera 324 (FIG. 6) and text transcribed from the voice of the surgeon captured by the audio capture unit 326 (FIG. 6) are also displayed.
- the voice may be provided to the viewer through a speaker (not shown).
- Information display area 388 displays information on the patient's vital signs monitor, X-ray fluoroscopic image data, ultrasound image data, CT and MRI image data, and patient attribute information supplied from the in-hospital information system 340.
- Patient attribute information includes, for example, the patient's age, sex, disease name, medical history, etc.
- These images and information are displayed on the display device 380.
- When the surgeon performs surgery while watching the display device 380 during surgery, he or she can, for example, check the operation of the foot pedal 223 at his or her feet without looking down, or check the values on a vital signs monitor while checking the surgical field.
- When viewing after surgery, the surgeon can confirm the details of the surgery and identify areas for improvement.
- Various information such as vital signs, X-ray fluoroscopic images, and the overall state of the operating room can also be checked, allowing the surgeon to grasp the situation at the time.
- The surgeon's voice can be converted into text and displayed, or presented as audio, allowing the surgeon's intentions and explanations during surgery to be confirmed.
- If the display device 380 is a display that allows 3D images to be viewed with the naked eye, the 3D images can be viewed without wearing glasses, reducing the strain on the viewer even when viewing for long periods of time.
- Figure 8 shows an example of the internal configuration of the data processing device 360 and the data exchanged within the medical system 300.
- the surgical system 320 and the in-hospital information system 340 are each connected to an NTP (Network Time Protocol) server 420.
- The NTP server 420 is a server that acquires and distributes correct time information. By connecting to the NTP server 420, the internal clocks of the computers in the surgery system 320 and the in-hospital information system 340 can be set correctly: time information is exchanged between the server and clients by making queries using a communication protocol called NTP over a TCP/IP network.
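- As a purely illustrative sketch (not part of the patent disclosure), the following Python snippet shows one way captured frames could be stamped with NTP-derived time so that a downstream device such as the data processing device 360 can later align them; the third-party `ntplib` package and the server address are assumptions.

```python
import time
import ntplib  # third-party package, assumed available for this sketch

def ntp_clock_offset(server: str = "pool.ntp.org") -> float:
    """Return the offset in seconds between the local clock and the NTP server."""
    client = ntplib.NTPClient()
    response = client.request(server, version=3)
    return response.offset  # add to time.time() to approximate NTP time

def timestamp_frame(frame: bytes, offset: float) -> dict:
    """Attach an NTP-corrected timestamp to a captured frame."""
    return {"ntp_time": time.time() + offset, "frame": frame}

offset = ntp_clock_offset()
stamped = timestamp_frame(b"...raw image bytes...", offset)
```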
- a surgical field image based on surgical image data supplied from the surgical system 320 is displayed in the surgical field image display area 382, and a hand image based on hand image data is displayed in the hand image display area 384.
- the surgical field image displayed in the surgical field image display area 382 and the hand image displayed in the hand image display area 384 need to be synchronized.
- time information from the NTP server 420 is added to the data supplied from the surgical system 320 to the data processing device 360.
- time information from the NTP server 420 is added to the data supplied from the in-hospital information system 340 to the data processing device 360.
- The data processing device 360 includes an image distortion correction unit 361, a pointer superimposition unit 362, a transcription unit 363, a video time synchronization unit 364, a display layout unit 365, a 3D rendering unit 366, a video output control unit 367, a video display angle determination unit 368, a gaze detection unit 369, and a display content selection unit 370.
- The configuration of the data processing device 360 shown in FIG. 8 is a configuration for the case where 3D video data is acquired as the surgical video data.
- When 3D video data is not handled, the data processing device 360 may be configured with the parts that process 3D video data omitted, or configured so that such processing is simply not performed.
- The surgical video data from the surgical system 320 is supplied to the image distortion correction unit 361.
- An image based on the surgical video data may be suitable for display on the display 121 of the operation unit 101, but may appear distorted when displayed as is on another display.
- The image distortion correction unit 361 converts such an image based on the surgical video data into an image that fits the surgical field image display area 382 of the display device 380 (a distortion-free image such as that shown in the right diagram of FIG. 9).
- When no correction is necessary, the data may be supplied to the video time synchronization unit 364 without being processed by the image distortion correction unit 361; the image distortion correction unit 361 may also be omitted, with the data supplied directly to the video time synchronization unit 364.
- The surgical video data corrected by the image distortion correction unit 361 is supplied to the video time synchronization unit 364 as a corrected surgical 3D video.
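- The following is a minimal sketch of the kind of lens-distortion correction the image distortion correction unit 361 might perform, using OpenCV; the camera matrix and distortion coefficients are placeholder values, since the patent does not specify a correction algorithm.

```python
import cv2
import numpy as np

def correct_distortion(frame: np.ndarray,
                       camera_matrix: np.ndarray,
                       dist_coeffs: np.ndarray) -> np.ndarray:
    """Undistort a frame and crop it to the valid region of interest."""
    h, w = frame.shape[:2]
    # alpha=0 keeps only valid pixels so the corrected image fits the display area
    new_matrix, roi = cv2.getOptimalNewCameraMatrix(camera_matrix, dist_coeffs, (w, h), 0)
    undistorted = cv2.undistort(frame, camera_matrix, dist_coeffs, None, new_matrix)
    x, y, rw, rh = roi
    return undistorted[y:y + rh, x:x + rw]

# Placeholder calibration values, for illustration only.
camera_matrix = np.array([[800.0, 0.0, 640.0],
                          [0.0, 800.0, 360.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.array([-0.3, 0.1, 0.0, 0.0, 0.0])
```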
- the surgeon's gaze position data from the surgical system 320 is supplied to the pointer superimposition unit 362.
- the pointer superimposition unit 362 sets the display position (coordinates) of the pointer 391 ( Figure 7) representing the surgeon's gaze position within the surgical field image display area 382 ( Figure 7).
- the data for the corrected surgical 3D image also includes information on the coordinates for superimposing the pointer 391 set by the pointer superimposition unit 362.
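- As an illustration of the coordinate handling described above, the sketch below maps a normalized gaze position onto the surgical field image display area 382 and draws a pointer corresponding to the pointer 391; the area geometry and drawing style are assumptions, not values from the patent.

```python
import cv2
import numpy as np

def draw_gaze_pointer(screen: np.ndarray,
                      gaze_xy: tuple,
                      area_rect: tuple) -> np.ndarray:
    """gaze_xy: normalized (x, y) in [0, 1]; area_rect: (left, top, width, height) of area 382."""
    left, top, width, height = area_rect
    px = left + int(gaze_xy[0] * width)
    py = top + int(gaze_xy[1] * height)
    out = screen.copy()
    cv2.circle(out, (px, py), 12, (0, 0, 255), 2)  # pointer drawn as a red circle
    return out

screen = np.zeros((1080, 1920, 3), dtype=np.uint8)
marked = draw_gaze_pointer(screen, (0.5, 0.4), (0, 0, 1280, 1080))
```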
- The surgeon voice data from the surgery system 320 is supplied to the transcription unit 363.
- Since the surgeon voice data is audio data, the transcription unit 363 executes a process of converting the voice based on the audio data into text data.
- The text data generated by converting the audio data is supplied to the video time synchronization unit 364.
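- The patent does not specify a speech-to-text engine, so the following sketch of the transcription step uses the third-party SpeechRecognition package purely for illustration; the file name is a placeholder.

```python
import speech_recognition as sr  # third-party "SpeechRecognition" package

def transcribe(wav_path: str, language: str = "ja-JP") -> str:
    """Convert a recorded WAV file of the surgeon's voice into text."""
    recognizer = sr.Recognizer()
    with sr.AudioFile(wav_path) as source:
        audio = recognizer.record(source)  # read the whole file
    try:
        return recognizer.recognize_google(audio, language=language)
    except sr.UnknownValueError:
        return ""  # speech could not be understood

text = transcribe("surgeon_voice.wav")
```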
- The hand video data from the surgery system 320 is supplied to the video time synchronization unit 364.
- Although the hand video data has been described as 2D video data, if it is 3D video data, the data can also be supplied to the video time synchronization unit 364 after correction processing is performed by the image distortion correction unit 361.
- The foot video data from the surgery system 320 is supplied to the video time synchronization unit 364.
- Although the foot video data has been described as 2D video data, if it is 3D video data, the data can also be supplied to the video time synchronization unit 364 after correction processing is performed by the image distortion correction unit 361.
- the operating room video data from the surgery system 320 is supplied to the video time synchronization unit 364.
- the video time synchronization unit 364 executes a process to synchronize the time of each piece of video data based on the time information attached to the supplied data.
- the time-synchronized video data is supplied to the display layout unit 365.
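- A minimal sketch of the kind of alignment the video time synchronization unit 364 could perform is shown below: frames tagged with NTP timestamps are resampled onto a common timeline. The data structures and the stream names are assumptions made for illustration.

```python
from bisect import bisect_right

def synchronize(streams: dict, tick: float) -> dict:
    """streams: name -> list of frames (dicts with an 'ntp_time' key), sorted by time.
    Returns name -> frames resampled onto a common timeline with step 'tick' seconds."""
    times = {name: [f["ntp_time"] for f in frames] for name, frames in streams.items()}
    start = max(ts[0] for ts in times.values())   # earliest moment covered by all streams
    end = min(ts[-1] for ts in times.values())    # latest moment covered by all streams
    out = {name: [] for name in streams}
    t = start
    while t <= end:
        for name, frames in streams.items():
            i = bisect_right(times[name], t) - 1  # latest frame at or before t
            out[name].append(frames[i])
        t += tick
    return out

synced = synchronize(
    {"field": [{"ntp_time": 0.00}, {"ntp_time": 0.04}],
     "hands": [{"ntp_time": 0.01}, {"ntp_time": 0.05}],
     "feet":  [{"ntp_time": 0.02}, {"ntp_time": 0.06}]},
    tick=0.04,
)
```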
- the display layout unit 365 generates image data for a screen to be displayed on the display device 380 so that an image based on the supplied video data is displayed in the layout described with reference to FIG. 7.
- the display layout unit 365 is also supplied with data from the in-hospital information system 340.
- the in-hospital information system 340 supplies data such as vital monitor information, X-ray fluoroscopic image data, ultrasound image data, and patient attribute information to the display layout unit 365.
- the display layout unit 365 lays out this data supplied from the in-hospital information system 340 so that it is displayed in the information display area 388 ( Figure 7).
- the display layout unit 365 is also supplied with data from the 3D rendering unit 366.
- the 3D rendering unit 366 is supplied with CT/MRI image data from the in-hospital information system 340.
- the 3D rendering unit 366 performs processing to convert an image based on the CT/MRI image data into an image that can be displayed in the information display area 388.
- the converted CT/MRI image data is supplied to the display layout unit 365.
- The display layout unit 365 is also supplied with data from the display content selection unit 370.
- The display content selection unit 370 allows the viewer to select the content to be displayed on the display device 380, that is, which of the surgical field image display area 382, the hand image display area 384, the foot image display area 386, and the information display area 388 will be displayed, and what information will be displayed in the information display area 388.
- the display content selection unit 370 accepts a selection (operation related to selection) from the viewer and supplies the accepted operation information to the display layout unit 365.
- the display layout unit 365 generates video data in which images and information are arranged so that the content selected by the viewer is displayed on the display device 380.
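- The sketch below illustrates, under assumed region coordinates and an assumed 1920x1080 canvas, how a display layout step could paste the selected videos into the areas of FIG. 7 on a single screen; it is not the patent's implementation, and the area geometry is invented for the example.

```python
import cv2
import numpy as np

AREAS = {  # (left, top, width, height) on a 1920x1080 canvas -- assumed layout
    "surgical_field": (0, 0, 1280, 1080),   # area 382
    "hands": (1280, 0, 640, 360),           # area 384
    "feet": (1280, 360, 640, 360),          # area 386
    "info": (1280, 720, 640, 360),          # area 388
}

def compose_screen(sources: dict, selected: set) -> np.ndarray:
    """Paste each selected source image into its display area on a single canvas."""
    canvas = np.zeros((1080, 1920, 3), dtype=np.uint8)
    for name in selected & sources.keys():
        left, top, w, h = AREAS[name]
        canvas[top:top + h, left:left + w] = cv2.resize(sources[name], (w, h))
    return canvas

frame = compose_screen(
    {"surgical_field": np.zeros((720, 1280, 3), np.uint8),
     "hands": np.zeros((360, 640, 3), np.uint8)},
    selected={"surgical_field", "hands"},
)
```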
- the video data from the display layout unit 365 is supplied to the video display angle determination unit 368.
- The video display angle determination unit 368 performs processing so that the 3D video is displayed appropriately regardless of the angle from which the viewer views the display device 380.
- The video display angle determination unit 368 processes the video data, using the gaze information supplied from the gaze detection unit 369, so that video appropriate for the viewer's gaze direction is displayed.
- The gaze detection unit 369 includes a camera that captures an image of the viewer, detects the viewer's gaze position (gaze direction) by analyzing the captured image, and supplies the detected gaze position to the video display angle determination unit 368. If the image displayed in the surgical field image display area 382 is not a 3D image, the processing of the gaze detection unit 369 and the video display angle determination unit 368 can be omitted.
- the video data from the video display angle determination unit 368 is supplied to the video output control unit 367.
- the video output control unit 367 supplies the video data to the display device 380.
- the data processing device 360 processes various types of video data supplied from the surgical system 320 and generates screen data to be displayed on the display device 380 as shown in FIG. 7.
- For example, there may be cases where the hand image corresponding to the surgical field image at time t1 is the hand image at time t2, and the foot image corresponding to the surgical field image at time t1 is the foot image at time t3.
- In other words, times t1, t2, and t3 are all different, and the images are not synchronized.
- If the viewer feels uncomfortable because of the lack of synchronization when watching such images, he or she can perform an operation to synchronize them. For example, the surgical field video is stopped at time t1, the hand video at time t2, and the foot video at time t3. In other words, the viewer stops playback of each video at the point where he or she judges the surgical field video, the hand video, and the foot video to be synchronized.
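- The bookkeeping behind this manual synchronization could look like the following sketch, in which the pause positions t1, t2, and t3 are turned into playback offsets; the function name and the numeric values are illustrative assumptions.

```python
def playback_offsets(t_field: float, t_hands: float, t_feet: float) -> dict:
    """Given the positions at which the viewer paused each video on the same moment,
    return the offset (seconds) to apply to the surgical field position when seeking
    the other streams, so the three stay aligned during subsequent playback."""
    return {
        "surgical_field": 0.0,
        "hands": t_hands - t_field,
        "feet": t_feet - t_field,
    }

offsets = playback_offsets(t_field=120.0, t_hands=121.5, t_feet=119.8)
# To show the hand video for field position p, seek the hand video to p + offsets["hands"].
```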
- The viewer specifies the start and end positions of the video they wish to cut out from the video being displayed on the display device 380.
- For example, a cut-out button (not shown) is displayed on the display device 380, and the start and end positions for cutting out are specified by operating this cut-out button.
- When the start and end positions are specified, video data 501 relating to the video for that section is created.
- The created video data 501 is, for example, a 2D video clip, and is video data showing the image of the surgical field, the image of the hands, and the image of the feet.
- a mechanism may be provided in which a check box (not shown) is displayed in each of the surgical field video display area 382, the hand video display area 384, the foot video display area 386, and the information display area 388, and the video to clip is selected by checking the check box.
- a mechanism may be provided that allows the display layout to be changed, such as by rearranging the area in which the selected image is displayed. For example, a mechanism may be provided that allows the areas to be swapped by dragging and dropping any of the surgical field image display area 382, hand image display area 384, foot image display area 386, and information display area 388.
- Viewers can play the video data 501 on their own information terminals to study, or play it back when giving a presentation at an academic conference or other event.
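- As an illustrative sketch of the clipping step, the snippet below cuts the section between the specified start and end positions into a separate file corresponding to the video data 501; it assumes the ffmpeg command-line tool is available, and the file names are placeholders.

```python
import subprocess

def clip_video(src: str, dst: str, start_s: float, end_s: float) -> None:
    """Copy the section between start_s and end_s (seconds) of src into dst without re-encoding."""
    subprocess.run(
        ["ffmpeg", "-y", "-i", src, "-ss", str(start_s), "-to", str(end_s),
         "-c", "copy", dst],
        check=True,
    )

clip_video("composited_screen.mp4", "clip_501.mp4", start_s=95.0, end_s=180.0)
```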
- a surgical field image display area 382 and a hand image display area 384 are displayed on the display device 380.
- the hand image display area 384 displays the operating levers 222-1 and 222-2, and the surgeon's hand 551-1 operating the operating lever 222-1 and the surgeon's hand 551-2 operating the operating lever 222-2 are respectively displayed.
- the learner's hands 561-1 and 561-2 are further displayed in the hand image display area 384.
- the learner performs image training for operating the operating lever 222 while imitating the movements of the surgeon's hands.
- The learner's hand 561-1 is displayed superimposed on the surgeon's hand 551-1, and the learner's hand 561-2 is displayed superimposed on the surgeon's hand 551-2.
- the image of the learner's hand 561 is displayed in a transparent state so that the image of the surgeon's hand 551 can be seen.
- the image of the learner's hand 561 is shown by a dotted line.
- the image of the learner's hand 561 can be an image generated by CG (Computer Graphics).
- the learner's hand 561 is superimposed on the image of the surgeon's hand 551 actually operating the operating lever 222, allowing the learner to learn how to operate in a way that is similar to that of the surgeon while viewing a visually easy-to-understand display.
- the learner's actual hand 561 is photographed with a camera, the photographed image of the hand 561 is analyzed, and feature points are extracted. A CG image of the hand 561 is generated using the extracted feature points. This generated CG image of the hand 561 is displayed in the hand image display area 384, superimposed on the image of the surgeon's actual hand 551.
- Alternatively, a controller 571-1 may be attached to the learner's hand 561-1, and a controller 571-2 may be attached to the learner's hand 561-2.
- In this case, a CG image of the hand 561 is generated from position information of the hand 561 (the fingers and the back of the hand) obtained from the attached controllers 571.
- This generated CG image of the hand 561 is displayed in the hand image display area 384, superimposed on the image of the surgeon's actual hand 551.
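- The semi-transparent superimposition described above could be realized as in the following sketch, which alpha-blends a rendered image of the learner's hand 561 over the captured image of the surgeon's hand 551; the alpha value and the mask handling are illustrative assumptions, not details from the patent.

```python
import cv2
import numpy as np

def overlay_learner_hand(surgeon_view: np.ndarray,
                         learner_hand: np.ndarray,
                         mask: np.ndarray,
                         alpha: float = 0.4) -> np.ndarray:
    """Blend the learner's rendered hand over the surgeon's hand image.
    mask is a single-channel image that is non-zero where the learner's hand is drawn."""
    blended = cv2.addWeighted(learner_hand, alpha, surgeon_view, 1.0 - alpha, 0)
    out = surgeon_view.copy()
    out[mask > 0] = blended[mask > 0]  # keep the surgeon's view outside the hand region
    return out

surgeon_view = np.zeros((360, 640, 3), np.uint8)
learner_hand = np.zeros((360, 640, 3), np.uint8)
mask = np.zeros((360, 640), np.uint8)
combined = overlay_learner_hand(surgeon_view, learner_hand, mask)
```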
- Hand 551 displayed in hand image display area 384 may be the learner's hand, and hand 561 may be the hand of the surgeon (supervisor). This type of learning can be applied when the learner is performing surgery, or when learning is taking place assuming that the learner is performing surgery, and the supervising surgeon is teaching the ideal operation of operating lever 222.
- the display device 380 displays a surgical field image display area 382 and a hand image display area 384.
- the learner performs surgery while checking the surgical field displayed in the surgical field image display area 382, and can check (learn) the operation of the operating lever 222 while checking the hands 561 of the supervising surgeon displayed in the hand image display area 384.
- Similarly, images of feet may be superimposed in the foot image display area 386.
- In this way, the operation of the foot pedal 223 can also be learned.
- The above-mentioned series of processes can be executed by hardware or by software.
- When the series of processes is executed by software, the programs constituting the software are installed in a computer.
- Here, the computer includes a computer built into dedicated hardware and, for example, a general-purpose personal computer capable of executing various functions by installing various programs.
- FIG. 14 is a block diagram showing an example of the hardware configuration of a computer that executes the above-mentioned series of processes by a program.
- In the computer, a CPU (Central Processing Unit) 2001, a ROM (Read Only Memory) 2002, and a RAM (Random Access Memory) 2003 are interconnected via a bus 2004.
- An input/output interface 2005 is further connected to the bus 2004.
- An input unit 2006, an output unit 2007, a storage unit 2008, a communication unit 2009, and a drive 2010 are connected to the input/output interface 2005.
- the input unit 2006 includes a keyboard, a mouse, a microphone, etc.
- the output unit 2007 includes a display, a speaker, etc.
- the storage unit 2008 includes a hard disk, a non-volatile memory, etc.
- the communication unit 2009 includes a network interface, etc.
- the drive 2010 drives removable media 2011 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
- the CPU 2001 loads a program stored in the storage unit 2008, for example, into the RAM 2003 via the input/output interface 2005 and the bus 2004, and executes the program, thereby performing the above-mentioned series of processes.
- the program executed by the computer can be provided by being recorded on removable media 2011 such as package media, for example.
- the program can also be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
- the program can be installed in the storage unit 2008 via the input/output interface 2005 by inserting the removable medium 2011 into the drive 2010.
- the program can also be received by the communication unit 2009 via a wired or wireless transmission medium and installed in the storage unit 2008.
- the program can be pre-installed in the ROM 2002 or storage unit 2008.
- the program executed by the computer may be a program in which processing is performed chronologically in the order described in this specification, or a program in which processing is performed in parallel or at the required timing, such as when called.
- In this specification, a system refers to an entire apparatus made up of multiple devices.
- A medical data processing device comprising: a surgical field image display area for displaying an image of a surgical field; a part image display area for displaying an image including a specified part of the surgeon, the surgical field image display area and the part image display area being arranged on the same screen; and a video data generating unit for generating video data of the image.
- The medical data processing device according to (1), wherein the part image display area displays the hands of the surgeon operating an operation lever of a surgical system.
- The part image display area includes a hand image display area for displaying an image of the hands of the surgeon who operates an operation lever of the surgical system, and a foot image display area for displaying an image of the feet of the surgeon who operates a foot pedal of the surgical system,
- and the video data generating unit generates video data of the image in which the surgical field image display area, the hand image display area, and the foot image display area are displayed on the same screen.
- The medical data processing device according to any one of (1) to (5), wherein the image displayed in the surgical field image display area is a three-dimensional image.
- The medical data processing device according to any one of (1) to (6), further displaying an information display area for displaying at least one of vital signs, X-ray fluoroscopic images, ultrasound images, patient attribute information, CT images, and MRI images.
- The medical data processing device according to (7), wherein the voice of the surgeon is converted into text data, and text based on the text data is displayed in the information display area.
- A pointer indicating the gaze position of the surgeon is displayed in the surgical field image display area.
- The medical data processing device according to any one of (1) to (9), wherein video data is generated by extracting a part of the total video time designated by a user.
- The part image display area is a hand image display area for displaying an image of the hands of the surgeon operating an operation lever of the surgical system,
- and the medical data processing device according to (2) is one wherein the hand image display area displays an image of the hands of a learner who is learning the surgeon's operation of the operating lever, superimposed on the hands of the surgeon.
- A medical system comprising: a surgical field camera for photographing the surgical field; a hand camera for photographing the hands of a surgeon who operates an operating lever of the surgical system; a foot camera for photographing the feet of the surgeon who operates a foot pedal of the surgical system; and a display device that displays at least two of the following areas on the same screen: a surgical field image display area that displays the surgical field image captured by the surgical field camera, a hand image display area that displays the hand image captured by the hand camera, and a foot image display area that displays the foot image captured by the foot camera.
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Physics & Mathematics (AREA)
- Surgery (AREA)
- General Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Veterinary Medicine (AREA)
- Public Health (AREA)
- Animal Behavior & Ethology (AREA)
- Heart & Thoracic Surgery (AREA)
- Biomedical Technology (AREA)
- Business, Economics & Management (AREA)
- Educational Administration (AREA)
- Educational Technology (AREA)
- Theoretical Computer Science (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Mathematical Optimization (AREA)
- Radiology & Medical Imaging (AREA)
- Medicinal Chemistry (AREA)
- Pure & Applied Mathematics (AREA)
- Mathematical Physics (AREA)
- Chemical & Material Sciences (AREA)
- Computational Mathematics (AREA)
- Robotics (AREA)
- Algebra (AREA)
- Biophysics (AREA)
- Optics & Photonics (AREA)
- Pathology (AREA)
- Mathematical Analysis (AREA)
- Closed-Circuit Television Systems (AREA)
Abstract
The present technology relates to a medical data processing device and a medical system with which the content of a surgery can be reviewed. The present invention includes a video data generating unit that generates video data of a video in which a surgical field video display area for displaying a video in which a surgical field is captured and a part video display area for displaying a video including a predetermined part of an operator are arranged within the same screen. The hand of the operator operating an operating lever of a surgery system is displayed in the part video display area. The present technology can be applied, for example, to a data processing device that generates video data to be supplied to a display device that displays video during surgery.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2023-049459 | 2023-03-27 | | |
| JP2023049459 | 2023-03-27 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2024202956A1 (fr) | 2024-10-03 |
Family
ID=92905670
Family Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2024/007940 (WO2024202956A1, pending) | 2023-03-27 | 2024-03-04 | Medical data processing device and medical system |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2024202956A1 (fr) |
Patent Citations (13)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH11318935A (ja) * | 1998-05-19 | 1999-11-24 | Olympus Optical Co Ltd | 医療システムの制御装置 |
| JP2002112958A (ja) * | 2000-10-10 | 2002-04-16 | Olympus Optical Co Ltd | 撮像システム |
| JP2002153487A (ja) * | 2000-11-17 | 2002-05-28 | Topcon Corp | 顕微鏡 |
| JP2004041605A (ja) * | 2002-07-16 | 2004-02-12 | Toshiba Corp | 検査/処置情報記録システム、情報処理装置、情報端末、及び情報記録媒体 |
| JP2006218229A (ja) * | 2005-02-14 | 2006-08-24 | Olympus Corp | 医療支援システム |
| JP2007175428A (ja) * | 2005-12-28 | 2007-07-12 | Olympus Medical Systems Corp | 制御情報入力装置 |
| JP2008058364A (ja) * | 2006-08-29 | 2008-03-13 | National Institute Of Advanced Industrial & Technology | 手術用トレーニング装置 |
| JP2009246864A (ja) * | 2008-03-31 | 2009-10-22 | Toshiba Teli Corp | カメラヘッドおよび管内検査カメラ装置 |
| JP2013106752A (ja) * | 2011-11-21 | 2013-06-06 | National Cancer Center | 電子内視鏡システム |
| WO2015114901A1 (fr) * | 2014-01-30 | 2015-08-06 | オリンパス株式会社 | Système médical d'enregistrement et de lecture de vidéo, et dispositif médical d'enregistrement et de lecture de vidéo |
| JP2018183314A (ja) * | 2017-04-25 | 2018-11-22 | 林栄精器株式会社 | 足元確認システム |
| JP2019013397A (ja) * | 2017-07-05 | 2019-01-31 | ソニー・オリンパスメディカルソリューションズ株式会社 | 医療用観察装置 |
| WO2019078237A1 (fr) * | 2017-10-18 | 2019-04-25 | 富士フイルム株式会社 | Dispositif de traitement d'image médicale, système d'endoscope, dispositif d'aide au diagnostic, et dispositif d'aide à l'industrie médicale |
Similar Documents
| Publication | Title | Publication Date |
|---|---|---|
| CN102821671B (zh) | 内窥镜观察支持系统和设备 | |
| JP4296278B2 (ja) | 医療用コクピットシステム | |
| US10169535B2 (en) | Annotation of endoscopic video using gesture and voice commands | |
| CN102811655B (zh) | 内窥镜观察支持系统和设备 | |
| US20210205027A1 (en) | Context-awareness systems and methods for a computer-assisted surgical system | |
| US20170160549A1 (en) | Augmented reality glasses for medical applications and corresponding augmented reality system | |
| JP2004181229A (ja) | 遠隔手術支援システム及び支援方法 | |
| JP7226325B2 (ja) | 焦点検出装置および方法、並びにプログラム | |
| KR20080089376A (ko) | 3차원 텔레스트레이션을 제공하는 의료용 로봇 시스템 | |
| EP4161426A1 (fr) | Mentorat chirurgical à distance utilisant la réalité augmentée | |
| JP2021192313A (ja) | 情報処理装置および方法、並びにプログラム | |
| JP2012075507A (ja) | 手術用カメラ | |
| US11883120B2 (en) | Medical observation system, medical signal processing device, and medical signal processing device driving method | |
| CN110913787B (zh) | 手术支持系统、信息处理方法和信息处理装置 | |
| EP1705513A1 (fr) | Systeme de vision stereoscopique d'images en temps reel ou statiques | |
| WO2020054595A1 (fr) | Système d'assistance chirurgicale, dispositif de commande d'affichage et procédé de commande d'affichage | |
| US10330945B2 (en) | Medical image display apparatus, medical information processing system, and medical image display control method | |
| WO2024202956A1 (fr) | Dispositif de traitement de données médicales et système médical | |
| JP7086929B2 (ja) | 医療情報処理システム | |
| JP7230923B2 (ja) | 情報処理装置、情報処理方法及びプログラム | |
| Ilgner et al. | Using a high-definition stereoscopic video system to teach microscopic surgery | |
| US20250387175A1 (en) | Context-awareness systems and methods for a computer-assisted surgical system | |
| US20240358439A1 (en) | Augmented reality surgery set-up for robotic surgical procedures | |
| WO2023052535A1 (fr) | Dispositifs et systèmes destinés à être utilisés en imagerie pendant une chirurgie | |
| WO2023052474A1 (fr) | Dispositifs et systèmes utilisés pour l'imagerie pendant la chirurgie |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24779094; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |