
WO2024047696A1 - Information processing system, information processing method, and program - Google Patents

Information processing system, information processing method, and program

Info

Publication number
WO2024047696A1
WO2024047696A1 (PCT/JP2022/032390, JP2022032390W)
Authority
WO
WIPO (PCT)
Prior art keywords
user
information processing
processing system
video
annotation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2022/032390
Other languages
English (en)
Japanese (ja)
Inventor
賢世 坂井
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Endov Inc
Original Assignee
Endov Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Endov Inc filed Critical Endov Inc
Priority to PCT/JP2022/032390 priority Critical patent/WO2024047696A1/fr
Publication of WO2024047696A1 publication Critical patent/WO2024047696A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current


Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 5/00 Electrically-operated educational appliances
    • G09B 5/02 Electrically-operated educational appliances with visual presentation of the material to be studied, e.g. using film strip
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 5/00 Electrically-operated educational appliances
    • G09B 5/08 Electrically-operated educational appliances providing for individual presentation of information to a plurality of student stations
    • G09B 5/14 Electrically-operated educational appliances providing for individual presentation of information to a plurality of student stations with provision for individual teacher-student communication

Definitions

  • the present invention relates to an information processing system, an information processing method, and a program.
  • Patent Document 1 discloses a technique aimed at improving communication between surgeons.
  • The technology disclosed in Patent Document 1 is configured so that annotations can be written to specific parts of a video. However, it has been found that inexperienced users (physicians) may not be able to understand the instructor's ideas clearly from such annotations alone.
  • the present invention aims to provide an information processing system etc. that can efficiently provide medical education.
  • an information processing system that supports medical education.
  • This information processing system includes a control section.
  • the control unit is configured to perform the following steps.
  • In the video reception step, a surgical operation video is received from the first user.
  • In the display control step, the operation video is displayed as a shared video so that the first user and a second user different from the first user can view it.
  • The shared video is configured to allow the first user and/or the second user to write an annotation, and on the screen that displays the shared video, the first user and/or the second user can add a text comment linked to the annotation.
  • FIG. 1 is a configuration diagram showing an information processing system 1 according to the present embodiment.
  • FIG. 2 is a block diagram showing the hardware configuration of a first user terminal 2.
  • FIG. 3 is a block diagram showing the hardware configuration of a server 3.
  • FIG. 4 is a block diagram showing the hardware configuration of a second user terminal 4.
  • FIG. 5 is a block diagram showing functions realized by a control unit 33 and the like in the server 3.
  • FIG. 6 is an activity diagram showing the flow of information processing according to the present embodiment.
  • FIG. 7 is a schematic diagram for explaining a shared video.
  • FIG. 8 is an example of a screen displaying search results shown on the first user terminal 2.
  • An information processing system that supports medical education, comprising a control unit configured to execute the following steps: in a video reception step, a surgical operation video is received from a first user; in a display control step, the operation video is displayed as a shared video so as to be viewable by the first user and a second user different from the first user.
  • the shared video is configured to allow the first user and/or the second user to write an annotation
  • the information processing system is configured to be able to add a text comment of the first user and/or the second user that is linked to the annotation on a screen on which the shared video is displayed.
  • the program for realizing the software appearing in this embodiment may be provided as a non-transitory computer-readable medium, or may be downloaded from an external server.
  • the program may be provided in a manner that allows the program to be started on an external computer and the function thereof is realized on the client terminal (so-called cloud computing).
  • The term "unit" may include, for example, a combination of hardware resources implemented by circuits in a broad sense and information processing by software that can be concretely realized by these hardware resources.
  • Various types of information are handled in this embodiment. This information is represented, for example, by the physical values of signal values representing voltage and current, by high and low signal levels forming a binary bit collection of 0s and 1s, or by quantum superposition (so-called quantum bits), and communication and calculation can be performed on circuits in a broad sense.
  • A circuit in a broad sense is a circuit realized by at least appropriately combining circuits, circuitry, processors, memories, and the like.
  • ASIC: Application-Specific Integrated Circuit
  • SPLD: Simple Programmable Logic Device
  • CPLD: Complex Programmable Logic Device
  • FPGA: Field-Programmable Gate Array
  • FIG. 1 is a configuration diagram showing an information processing system 1 according to the present embodiment.
  • the information processing system 1 is an information processing system that supports medical education.
  • the information processing system 1 includes a first user terminal 2, a server 3, and a second user terminal 4, which are connected through a network 5. These components will be further explained. Note that a system exemplified by the information processing system 1 is composed of one or more devices or components. Therefore, even the server 3 alone is an example of a system.
  • the first user terminal 2 is typically a terminal owned by a user receiving operational guidance.
  • FIG. 2 is a block diagram showing the hardware configuration of the first user terminal 2. The first user terminal 2 has a communication unit 21, a storage unit 22, a control unit 23, a display unit 24, an input unit 25, and an audio output unit 26, and these components are electrically connected within the first user terminal 2 via a communication bus 20. Descriptions of the communication unit 21, storage unit 22, and control unit 23 are omitted because they are substantially the same as the communication unit 31, storage unit 32, and control unit 33 in the server 3, which will be described later.
  • the display unit 24 may be included in the casing of the first user terminal 2, or may be attached externally.
  • the display unit 24 displays a screen of a graphical user interface (GUI) that can be operated by the user.
  • GUI graphical user interface
  • This is preferably implemented using a display device such as a CRT display, a liquid crystal display, an organic EL display, or a plasma display, depending on the type of the first user terminal 2.
  • display unit 24 will be described as being included in the casing of the first user terminal 2.
  • the input unit 25 may be included in the casing of the first user terminal 2, or may be externally attached.
  • the input unit 25 may be integrated with the display unit 24 and implemented as a touch panel. With a touch panel, the user can input tap operations, swipe operations, and the like. Of course, a switch button, a mouse, a QWERTY keyboard, etc. may be used instead of the touch panel. That is, the input unit 25 receives operation inputs made by the user. The input is transferred as a command signal to the control unit 23 via the communication bus 20, and the control unit 23 can perform predetermined control or calculation as necessary.
  • the audio output unit 26 may be included in the housing of the first user terminal 2, or may be externally attached.
  • the audio output unit 26 outputs audio that can be recognized by the user.
  • the audio output unit 26 may be a non-directional speaker, a directional speaker, or both.
  • the audio output unit 26 will be described as being included in the casing of the first user terminal 2.
  • FIG. 3 is a block diagram showing the hardware configuration of the server 3.
  • the server 3 includes a communication section 31, a storage section 32, and a control section 33, and these components are electrically connected via a communication bus 30 inside the server 3. Each component will be further explained.
  • The communication unit 31 is preferably a wired communication means such as USB, IEEE 1394, Thunderbolt (registered trademark), or wired LAN network communication, but may also include wireless LAN network communication, mobile communication such as 3G/LTE/5G, Bluetooth (registered trademark) communication, and the like as necessary. That is, it is more preferable to implement the communication unit 31 as a set of a plurality of these communication means. The server 3 communicates various information with the first user terminal 2 and the second user terminal 4 via the network 5 through the communication unit 31.
  • The storage unit 32 stores various information defined by the above description. It may be implemented, for example, as a storage device such as a solid state drive (SSD) that stores the various programs related to the server 3 executed by the control unit 33, as a memory such as a random access memory (RAM) that temporarily stores information needed for program calculations (arguments, arrays, etc.), or as a combination of these. In particular, the storage unit 32 stores the various programs related to the server 3 that are executed by the control unit 33.
  • the control unit 33 processes and controls the overall operation related to the server 3.
  • The control unit 33 is, for example, a central processing unit (CPU) (not shown).
  • The control unit 33 implements various functions related to the server 3 by reading predetermined programs stored in the storage unit 32. That is, information processing by the software stored in the storage unit 32 is concretely implemented by the control unit 33, which is an example of hardware, and can be executed as each functional unit included in the control unit 33. These functional units are described in further detail below.
  • the control section 33 is not limited to a single control section, and may be implemented so as to have a plurality of control sections 33 for each function. It may also be a combination thereof.
  • FIG. 4 is a block diagram showing the hardware configuration of the second user terminal 4.
  • The second user terminal 4 includes a communication unit 41, a storage unit 42, a control unit 43, a display unit 44, an input unit 45, and an audio output unit 46, and these components are electrically connected within the second user terminal 4 via a communication bus 40.
  • Descriptions of the communication unit 41, storage unit 42, control unit 43, display unit 44, input unit 45, and audio output unit 46 are omitted because they are substantially the same as the communication unit 21, storage unit 22, control unit 23, display unit 24, input unit 25, and audio output unit 26 described above for the first user terminal 2.
  • FIG. 5 is a block diagram showing the functions realized by the control unit 33 and the like in the server 3.
  • the server 3, which is an example of the information processing system 1, includes a video reception section 331 and a display control section 332.
  • FIG. 5 shows an embodiment in which the server 3 includes a search request reception section 333, a search execution section 334, a payment execution section 335, and a storage management section 336 as functions provided in the server 3.
  • the video reception unit 331 is configured to be able to execute the video reception step. In this video reception step, a surgical operation video is received from the first user. This will be explained in more detail later.
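The video reception step described above can be sketched in code. The following Python snippet is a minimal, hypothetical illustration (the class and method names are not taken from the specification) of a reception unit that accepts a surgical operation video from the first user and assigns it an identifier:

```python
from dataclasses import dataclass, field


@dataclass
class VideoReceiver:
    """Minimal sketch of the video reception unit 331 (hypothetical API)."""
    # video_id -> (uploader_id, raw video bytes); an in-memory stand-in
    # for whatever storage the real system would use
    store: dict = field(default_factory=dict)
    next_id: int = 1

    def receive(self, first_user_id: str, video_bytes: bytes) -> int:
        """Accept an operation video from the first user and return its id."""
        video_id = self.next_id
        self.next_id += 1
        self.store[video_id] = (first_user_id, video_bytes)
        return video_id


receiver = VideoReceiver()
vid = receiver.receive("first_user", b"\x00\x01fake-video-data")
print(vid)  # 1
```

In practice the upload would arrive over HTTP after the first user logs in to a web page, as the embodiment describes; the sketch only shows the bookkeeping.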
  • the display control unit 332 is configured to be able to execute display control steps. In this display control step, various display information is generated to control display contents that can be viewed by the user.
  • The display information may be visual information itself, such as a screen, image, icon, or text, generated in a form visible to the user, or it may be rendering information for displaying such visual information on various terminals.
  • The display control unit 332 displays the operation video as a shared video so that the first user and a second user different from the first user can view it.
  • The shared video is configured to allow the first user and/or the second user to write an annotation, and on the screen that displays the shared video, the first user and/or the second user can add a text comment linked to the annotation. Details of this display content will be explained later.
  • the search request receiving unit 333 is configured to be able to execute a search request receiving step.
  • a search request for a user who shares the operation video is received from the first user.
  • the search execution unit 334 is configured to be able to execute a search execution step.
  • a search for users who share the operation video is executed based on the above-mentioned search request. Information processing related to these searches will be explained later.
  • the payment execution unit 335 is configured to be able to execute the payment execution step.
  • In this payment execution step, a process is executed in which the first user pays the second user compensation for the writing of an annotation and/or the adding of a text comment by the second user.
  • The storage management unit 336 is configured to be able to execute a storage management step, in which various information related to the information processing system 1 of this embodiment is stored and managed. Typically, the storage management unit 336 stores received videos, annotations, comments, and the like in a storage area. This storage area is exemplified by the storage unit 32 of the server 3, but may also be the storage unit 22 of the first user terminal 2 or the storage unit 42 of the second user terminal 4. That is, through the function of the storage management unit 336, various information related to education can be made viewable not only by the first user but also by the second user. Note that this storage area does not necessarily have to be within the information processing system 1; the storage management unit 336 may also store various information in an external storage device or the like.
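As a rough illustration of the storage management step, the sketch below keeps a received video together with its annotations and comments in one record and makes it viewable only by the users it is shared with. All names (`StorageManager`, `save_video`, and so on) are hypothetical; the patent does not prescribe an implementation:

```python
class StorageManager:
    """Sketch of the storage management unit 336 (hypothetical names)."""

    def __init__(self):
        # video_id -> record holding the video plus its annotations/comments
        self.records = {}

    def save_video(self, video_id, video, viewers):
        self.records[video_id] = {
            "video": video,
            "annotations": [],
            "comments": [],
            "viewers": set(viewers),  # the first user and second user(s)
        }

    def add_annotation(self, video_id, annotation):
        self.records[video_id]["annotations"].append(annotation)

    def add_comment(self, video_id, comment):
        self.records[video_id]["comments"].append(comment)

    def view(self, video_id, user):
        """Return the record only to users the video is shared with."""
        record = self.records[video_id]
        if user not in record["viewers"]:
            raise PermissionError("video not shared with this user")
        return record


sm = StorageManager()
sm.save_video(1, b"raw-bytes", viewers=["first_user", "second_user"])
sm.add_annotation(1, "OBJ1")
print(sm.view(1, "second_user")["annotations"])  # ['OBJ1']
```

The same record could equally be persisted in the storage unit of either terminal or in an external device, as the embodiment allows.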
  • the information processing system 1 of this embodiment is used to support medical education by receiving surgical operation videos from the first user.
  • The operation video to which this information processing system 1 is applied is not particularly limited, and may be a video related to surgery, ophthalmology, dentistry, urology, otorhinolaryngology, or obstetrics.
  • The surgical target may be a human, or may be an animal other than a human. This embodiment will be described based on a case where the operation video is a video of surgery performed on a human.
  • FIG. 6 is an activity diagram showing the flow of information processing according to this embodiment. Specifically, in the information processing method of this embodiment, at least a video reception step and a display control step are executed. The details of the processing of each activity will be described below.
  • the video reception unit 331 receives a surgery operation video from the first user (activity A101).
  • This operation video is typically a video of the surgery performed by the first user, and there are no particular restrictions on the file extension or the like.
  • Acceptance of this video can be achieved, for example, by the first user logging into a predetermined web page and uploading the video file.
  • the display control unit 332 displays this operation video so that the first user and a second user different from the first user can view it as a shared video (activity A102).
  • Here, the shared video is configured to allow the first user and/or the second user to write an annotation, and on the screen that displays the shared video, the first user and/or the second user can add a text comment linked to the annotation.
  • FIG. 7 is a schematic diagram for explaining a shared video.
  • On the display screen D, a video MV1 (the shared video) is displayed, and an input form IF1 is provided in which a text comment can be posted.
  • The shared video is configured to allow the first user and/or the second user to write annotations. That is, the video MV1 is a recording of an operation performed on an organ OG1, and is configured such that an object OBJ1 can be written onto it as an annotation, allowing notes to be made regarding the operation and the like.
  • the object OBJ1 that functions as this annotation may be any object that is displayed using a known method. Typically, this object OBJ1 may be provided as a handwritten object using a pen tablet device or the like, or may be based on a stamp function prepared in advance.
  • a text comment of the first user and/or the second user can be attached to this annotation.
  • The text comment associated with this annotation (the comment "in this direction" in FIG. 7) can be posted (assigned) via the input form IF1.
  • a text comment can be added by inputting a predetermined text into the input form IF1 and then pressing the button BT2 labeled "ENTER" by a click operation or the like.
  • Before being saved, the comment written in the input form IF1 and the annotation (object OBJ1) can be deleted.
  • the poster of this comment may be displayed.
  • the annotation and text comment may be displayed in association with the time information of the operation video.
  • For example, the configuration may be such that it is possible to jump to a predetermined time in the video from the content of an annotation or comment. Referring to FIG. 7, by clicking on the comment "Pinch further down," the viewer can jump to the 16 minutes 56 seconds portion of the video.
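The linkage between an annotation, its text comment, and a position in the video, including the jump-to-time behavior just described, can be sketched with a hypothetical data model (the comment text and the 16 min 56 s timestamp are taken from the FIG. 7 example; the names are illustrative):

```python
from dataclasses import dataclass


@dataclass
class AnnotatedComment:
    """An annotation's text comment, tied to a timestamp in the video."""
    author: str
    text: str
    time_sec: int  # position in the operation video the annotation refers to


def seek_target(comments, clicked_text):
    """Return the playback position to jump to when a comment is clicked."""
    for c in comments:
        if c.text == clicked_text:
            return c.time_sec
    return None


comments = [AnnotatedComment("X.Y", "Pinch further down", 16 * 60 + 56)]
print(seek_target(comments, "Pinch further down"))  # 1016 (16 min 56 s)
```

Storing the timestamp with each comment is what lets the comment list double as a table of contents for the operation video.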
  • In FIG. 7, a comment made by a preceptor named "XY" is displayed, and a reply to this comment can be sent by another user (the first user, or a second user different from XY).
  • the reply to this comment may be a text reply.
  • The comment field on the display screen D may be provided with a "like" button, buttons for various reactions, a button for displaying a stamp, and the like, and the above-mentioned reply may be made by pressing such a button.
  • The display screen D in FIG. 7 is provided with an object OBJ2 that can control playback/stop of the video, an object OBJ3 that enables repeat playback, and an object OBJ4 that can adjust the volume.
  • An object OBJ5 that can change the color used when drawing the annotation (OBJ1) and an object OBJ6 that can change the line width used when drawing the annotation (OBJ1) are also provided.
  • Further, an object OBJ7 that can display the video MV1 in full screen is provided. Note that the functions that can be provided on the display screen D are not limited to these, and may be added or deleted as necessary.
  • the storage management unit 336 stores the displayed shared videos, annotations, comments, etc. as data in a predetermined storage area (activity A103).
  • In the information processing method of this embodiment, a search process for a second user who provides guidance to the first user may be performed.
  • This search process may be performed after the video reception unit 331 of the server 3 receives the operation video, as shown in the activity diagram of FIG. 6, but is not necessarily limited to this timing; it may also be performed before the operation video is accepted.
  • The search request receiving unit 333 receives from the first user a search request for a user who will share the operation video (activity A104). Furthermore, the search execution unit 334 executes a search for users who will share the operation video based on the search request (activity A105). This search process is then completed by specifying the second user who will share the operation video (activity A106).
  • This search request may be based on various information; for example, the first user may search for a predetermined second user based on field of expertise, age, years of experience, affiliation, location, and the like. The second user is then identified by selecting, from among the second user candidates extracted as a result of the search, the user to whom sharing of the video will be requested.
  • The search execution unit 334 may be configured to identify, from the contents of the operation video, the body part on which the operation is being performed and to search for a user who has expertise in that part.
  • the video posted by the first user includes images of body parts such as the organ OG1.
  • When the search execution unit 334 of the server 3 receives a search request from the first user, it may be configured to determine the body part captured in the video and search for a user who has expertise in surgery on this part.
  • expertise in a region refers to, for example, having specialized knowledge regarding a predetermined region, and typically refers to having expertise according to the classification of clinical medicine.
  • the degree of expertise may be determined based on the number of submitted papers, the position at the hospital to which the person belongs, etc.
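A search that filters second user candidates by body-part expertise and ranks them by a simple expertise score (here, paper count; the embodiment also mentions hospital position) might look like the following hypothetical sketch:

```python
def search_second_users(candidates, body_part):
    """Keep candidates with expertise in the given body part and rank
    them by a crude expertise score (number of submitted papers)."""
    hits = [c for c in candidates if body_part in c["expertise"]]
    return sorted(hits, key=lambda c: c["papers"], reverse=True)


# Illustrative candidate records; field names are assumptions
candidates = [
    {"name": "Dr. A", "expertise": {"pancreas"}, "papers": 12},
    {"name": "Dr. B", "expertise": {"liver"}, "papers": 40},
    {"name": "Dr. C", "expertise": {"pancreas", "liver"}, "papers": 25},
]
print([c["name"] for c in search_second_users(candidates, "pancreas")])
# ['Dr. C', 'Dr. A']
```

A real system would combine several signals (papers, position, reviews) into the ranking; the single-field score is only for illustration.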
  • FIG. 8 is an example of a screen displayed on the first user terminal 2 that displays search results.
  • For example, the search execution unit 334 may specify that the organ OG1 captured in the operation video is the pancreas and, based on this identification, present doctors who have expertise in the pancreas as second user candidates to the first user.
  • The first user who views such a display screen D can specify the second user who will share the video by pressing at least one of the buttons BT11, BT21, and BT31.
  • When the buttons BT12, BT22, and BT32 are pressed, reviews from posters of other operation videos are displayed. In this way, in the information processing method of this embodiment, evaluations of the second user may be displayed so that the first user can view them.
  • compensation may be paid to the second user with whom the video is shared.
  • the second user with whom the video is shared may not be paid (free of charge).
  • The following description assumes that the information processing method of this embodiment is one in which compensation is paid to the second user with whom the video is shared.
  • In the information processing system of this embodiment, as shown in the activity diagram of FIG. 6, payment to the second user may be executed.
  • the payment execution unit 335 of the server 3 executes a process in which the first user pays the second user compensation for writing an annotation and/or adding a text comment by the second user.
  • This consideration may be appropriately set between the first user and the second user.
  • the amount of consideration may depend on the number and quality of annotations and comments.
  • The compensation may be set higher depending on the degree of expertise. Note that, together with the search results shown in FIG. 8 described above, the compensation for requesting each second user candidate to share the video or add comments may be clearly displayed.
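Since the patent leaves the fee structure open, the following is purely an illustrative formula in which compensation grows with the number of annotations and comments and is scaled by an expertise level (all parameter names and the formula itself are assumptions):

```python
def compensation(n_annotations, n_comments, expertise_level, unit_fee=1000):
    """Hypothetical fee: per-annotation and per-comment fees scaled by
    the second user's expertise level; the embodiment only says the
    amount may depend on annotation/comment count, quality, and expertise."""
    return (2 * n_annotations + n_comments) * unit_fee * expertise_level


print(compensation(n_annotations=3, n_comments=4, expertise_level=2))
# (2*3 + 4) * 1000 * 2 = 20000
```

Quality-based weighting, which the embodiment also mentions, would replace the fixed per-item fees with scored ones.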
  • According to the information processing method of this embodiment, the annotation and the text comment linked to this annotation are visible to the first user, which helps the first user receiving the education to understand easily. Therefore, medical education can be provided efficiently.
  • In the above embodiment, the first user searches for candidates for the second user; conversely, the second user may search for a first user whom he or she wishes to instruct.
  • For example, the first user can set a tag associated with the operation video, and the second user can find the first user by searching for this tag.
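This tag-based reverse search could be sketched as follows (field names are hypothetical):

```python
def search_by_tag(first_users, tag):
    """Return the first users whose operation videos carry the given tag,
    so a second user can find learners to instruct."""
    return [u["name"] for u in first_users if tag in u["tags"]]


first_users = [
    {"name": "resident1", "tags": {"laparoscopy", "pancreas"}},
    {"name": "resident2", "tags": {"cataract"}},
]
print(search_by_tag(first_users, "pancreas"))  # ['resident1']
```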
  • In the above embodiment, the server 3 performs the various storage and control operations, but a plurality of external devices may be used instead of the server 3. That is, using blockchain technology or the like, information such as the attendance history may be distributed and stored across a plurality of external devices.
  • An information processing system that supports medical education, comprising a control unit configured to execute the following steps: in a video reception step, a surgical operation video is received from a first user; in a display control step, the operation video is displayed as a shared video so as to be viewable by the first user and a second user different from the first user; here, the shared video is configured to allow the first user and/or the second user to write an annotation, and a text comment of the first user and/or the second user, linked to the annotation, can be added on the screen on which the shared video is displayed.
  • In a search request receiving step, a search request for a user who will share the operation video is received from the first user, and in a search execution step, a search for the user who will share the operation video is executed based on the search request.

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Medical Treatment And Welfare Office Work (AREA)

Abstract

The problem addressed by the present invention is to provide an information processing system and the like that enable efficient medical education. According to one aspect of the present invention, the solution is an information processing system for supporting medical education. This information processing system comprises a control unit. The control unit is configured to execute each of the following steps. In a video reception step, an operation video showing a surgery is received from a first user. In a display control step, the operation video is displayed visibly as a shared video presented to the first user and a second user different from the first user. The shared video is configured to allow the first user and/or the second user to annotate it, and the screen on which the shared video is displayed is configured to allow the first user and/or the second user to add a text comment associated with the annotation.
PCT/JP2022/032390 2022-08-29 2022-08-29 Information processing system, information processing method, and program Ceased WO2024047696A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/032390 WO2024047696A1 (fr) 2022-08-29 2022-08-29 Information processing system, information processing method, and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/032390 WO2024047696A1 (fr) 2022-08-29 2022-08-29 Information processing system, information processing method, and program

Publications (1)

Publication Number Publication Date
WO2024047696A1 true WO2024047696A1 (fr) 2024-03-07

Family

ID=90099086

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/032390 Ceased WO2024047696A1 (fr) 2022-08-29 2022-08-29 Information processing system, information processing method, and program

Country Status (1)

Country Link
WO (1) WO2024047696A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002207832A (ja) * 2000-12-28 2002-07-26 Atsushi Takahashi インターネット技術指導教育配信システム、及び通信網を利用した指導システム
WO2005093687A1 (fr) * 2004-03-26 2005-10-06 Atsushi Takahashi Système de verre grossissant numérique d'entité 3d ayant une fonction d'instruction visuelle 3d
KR20150117165A (ko) * 2014-04-09 2015-10-19 (주)라파로넷 인터넷 기반 의료 술기 교육 정보 제공 시스템 및 제공 방법
JP2019125211A (ja) * 2018-01-17 2019-07-25 株式会社教育ネット 擬似同一空間授業システム
JP2019162339A (ja) * 2018-03-20 2019-09-26 ソニー株式会社 手術支援システムおよび表示方法
US20200273359A1 (en) * 2019-02-26 2020-08-27 Surg Time, Inc. System and method for teaching a surgical procedure



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 22957300; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 22957300; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: JP