WO2025169527A1 - Program, information processing method, information processing device, and information processing system - Google Patents
Program, information processing method, information processing device, and information processing system
- Publication number
- WO2025169527A1 (PCT/JP2024/032795)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- puncture
- living body
- deviation
- target
- surgeon
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/15—Devices for taking samples of blood
- A61B5/151—Devices specially adapted for taking samples of capillary blood, e.g. by lancets, needles or blades
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61M—DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
- A61M25/00—Catheters; Hollow probes
- A61M25/01—Introducing, guiding, advancing, emplacing or holding catheters
- A61M25/06—Body-piercing guide needles or the like
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61M—DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
- A61M5/00—Devices for bringing media into the body in a subcutaneous, intra-vascular or intramuscular way; Accessories therefor, e.g. filling or cleaning devices, arm-rests
- A61M5/14—Infusion devices, e.g. infusing by gravity; Blood infusion; Accessories therefor
- A61M5/158—Needles for infusions; Accessories therefor, e.g. for inserting infusion needles, or for holding them on the body
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61M—DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
- A61M5/00—Devices for bringing media into the body in a subcutaneous, intra-vascular or intramuscular way; Accessories therefor, e.g. filling or cleaning devices, arm-rests
- A61M5/42—Devices for bringing media into the body in a subcutaneous, intra-vascular or intramuscular way; Accessories therefor, e.g. filling or cleaning devices, arm-rests having means for desensitising skin, for protruding skin to facilitate piercing, or for locating point where body is to be pierced
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
Definitions
- This disclosure relates to a program, an information processing method, an information processing device, and an information processing system.
- Patent Document 1 discloses technology that presents recommended surgical strategies based on the usage status of medical devices during surgery and the condition of the patient's affected area. Using the technology disclosed in Patent Document 1 can support doctors and other medical professionals who perform medical procedures during surgery.
- One aspect is to provide a program or the like that can assist with the puncture procedure.
- The program described in (1) above further causes the computer to execute a process of outputting an alert in accordance with the calculated amount of deviation.
- The program described in (1) or (2) above further causes the computer to execute a process of detecting the user's palpation action on the living body and identifying the target puncture position based on the user's viewpoint position during the palpation action.
- The program described in any one of (1) to (3) above further causes the computer to execute a process of detecting the movement of the puncture device and, when the puncture device has stopped moving, detecting the position of the tip of the puncture device relative to the living body.
- The program described in any one of (1) to (4) above further causes the computer to execute the following process: acquiring a blood vessel visualization image that visualizes the blood vessels of the living body; calculating a second deviation amount between the blood vessels in the blood vessel visualization image and the target puncture position; and outputting an alert in accordance with the second deviation amount.
- In the program described in any one of (1) to (5) above, the puncture device has a marker provided at a predetermined position, and the program further causes the computer to execute a process of detecting the position of the marker on the puncture device and detecting the position of the tip of the puncture device based on the position of the marker.
- The program described in any one of (1) to (6) above further causes the computer to input biometric information about the living body into a learning model that has been trained to output, when biometric information about a living body is input, a threshold value used to determine whether to output an alert regarding the amount of deviation between the target puncture position and the position of the tip of the puncture device; acquire the threshold value of the deviation from the learning model; and output an alert if the amount of deviation between the target puncture position and the position of the tip of the puncture device is equal to or greater than the threshold value of the deviation.
- The program described in any one of (1) to (7) above further causes the computer to execute a process of detecting the user's palpation position relative to the living body, calculating a third deviation amount between the palpation position and the target puncture position, and outputting an alert if the third deviation amount is equal to or greater than a predetermined value.
- The present disclosure also relates to an information processing method in which a computer executes a process of detecting a user's viewpoint position relative to a living body, identifying a target puncture position on the living body based on the user's viewpoint position, detecting the position of a tip of a puncture device relative to the living body, and calculating the amount of deviation between the target puncture position and the position of the tip of the puncture device.
- The present disclosure also provides an information processing device including a control unit that detects a user's viewpoint position relative to a living body, identifies a target puncture position on the living body based on the user's viewpoint position, detects the position of a tip of a puncture device relative to the living body, and calculates the amount of deviation between the target puncture position and the position of the tip of the puncture device.
- The present disclosure also provides an information processing system including an information processing device having a control unit, a first camera that captures an image of a user's eyes, and a second camera that captures an image in the user's line of sight, wherein the control unit detects the user's viewpoint position relative to a living body based on the images captured by the first camera and the second camera, identifies a target puncture position on the living body based on the user's viewpoint position, detects the position of the tip of a puncture device relative to the living body based on the image captured by the second camera, and calculates the amount of deviation between the target puncture position and the position of the tip of the puncture device.
- In one aspect, it is possible to assist with puncture procedures.
- FIG. 1 is an explanatory diagram showing an example of the configuration of a puncture support system.
- FIG. 2 is a block diagram showing an example of the configuration of an information processing device and a wearable device.
- FIG. 3 is an explanatory diagram showing an example of the configuration of an indwelling needle.
- FIG. 4 is a flowchart showing an example of a procedure for supporting a puncture operation.
- FIGS. 5A to 7 are explanatory diagrams of the puncture operation support process.
- FIG. 8 is an explanatory diagram showing a configuration example of an indwelling needle according to a second embodiment.
- FIG. 9 is an explanatory diagram of a process for estimating the tip position of an indwelling needle.
- FIG. 10 is a flowchart showing an example of a procedure for supporting a puncture operation according to a third embodiment.
- FIG. 11 is an explanatory diagram of the puncture operation support process according to the third embodiment.
- FIG. 12A is an explanatory diagram showing an example of the configuration of a first learning model.
- FIG. 12B is an explanatory diagram showing an example of the configuration of a second learning model.
- FIG. 13 is a flowchart showing an example of a procedure for supporting a puncture operation according to a fourth embodiment.
- FIG. 14 is a flowchart showing an example of a procedure for supporting a puncture operation according to a fifth embodiment.
- FIG. 15 is an explanatory diagram of the puncture operation support process according to the fifth embodiment.
- In the following, a puncture operation will be described using as an example a puncture site in the forearm of a subject such as a patient or examinee, but the puncture site is not limited to the forearm and may be any suitable location such as the arm or lower leg.
- The purpose of the puncture operation may be any purpose, such as blood collection, injection, transfusion, or placement of an intravascular catheter.
- FIG. 1 is an explanatory diagram showing an example of the configuration of the puncture support system.
- the puncture support system 100 of this embodiment includes an information processing device 10, a wearable device 20, and an indwelling needle 30.
- the puncture support system 100 of this embodiment is installed in a facility such as a medical institution or testing institution where the puncture operation is performed.
- the indwelling needle 30 is an example of a puncture device.
- the indwelling needle 30 is, for example, an indwelling needle for a peripheral artery or vein, and a medical professional uses the indwelling needle 30 to puncture the blood vessel of a living body 40 (subject) to be punctured.
- the indwelling needle 30 may be any medical device that requires puncturing a blood vessel of the living body 40, and may be an indwelling needle for a peripheral artery or vein, a dialysis indwelling needle, a vascular access device such as a PICC (Peripherally Inserted Central venous Catheter), a midline catheter, or a CVC (Central Venous Catheter), or a blood collection and drug administration device composed of a puncture needle and a suction device such as a syringe.
- the information processing device 10 and the wearable device 20 are configured to perform wireless communication.
- the information processing device 10 is a device capable of various information processing and information transmission and reception, and is composed of, for example, a personal computer, a server computer, etc.
- the wearable device 20 is worn by the surgeon (medical professional) performing the puncture procedure.
- the wearable device 20 is formed, for example, in the shape of glasses and worn on the head (face) of the user (surgeon).
- The wearable device 20 shown in Figure 1 has a frame formed by temples 20a worn on the user's face and nose pads 20b, and pupil cameras 21a, 21b and a scenery camera 22 are provided at appropriate locations on the frame.
- The pupil cameras 21a, 21b are cameras that capture images of the left and right eyes (the eye area including at least the pupil, iris, and cornea) of the user wearing the wearable device 20, and at least one is provided for each eye (eyeball).
- the pupil cameras 21a and 21b may be configured to capture images of both the left and right eyes (eyeballs) with a single camera.
- the scenery camera 22 is a camera that captures the scenery viewed by the user (the scenery in the direction of the surgeon's line of sight), and at least one scenery camera 22 is provided.
- The wearable device 20 is not limited to the eyeglass-type configuration shown in FIG. 1.
- the information processing device 10 and the wearable device 20 are preferably configured to perform wireless communication, but may also be configured to perform wired communication via a cable such as a USB (Universal Serial Bus) cable.
- the information processing device 10 and the wearable device 20 may also be configured to communicate via a network, which may be the Internet, a public communication line, or a LAN (Local Area Network) constructed within the facility where the puncture support system 100 is installed.
- FIG. 2 is a block diagram showing an example configuration of the information processing device 10 and the wearable device 20.
- the wearable device 20 has pupil cameras 21a and 21b (first cameras), a scenery camera 22 (second camera), and a communication unit 23.
- The pupil cameras 21a and 21b are, for example, near-infrared cameras, each having a near-infrared LED (Light Emitting Diode) that emits near-infrared light with a wavelength in the range of approximately 750 nm to 2500 nm toward the user's eye, and a light receiving unit that receives the near-infrared light emitted by the near-infrared LED and reflected from the user's eye (cornea).
- the pupil cameras 21a and 21b acquire pupil images using near-infrared light.
- The scenery camera 22 is, for example, a visible light camera, and is an imaging device having a lens, an image sensor, etc.; it acquires scenery images using visible light.
- the scenery camera 22 may be a near-infrared camera, or may be configured to include both a visible light camera and a near-infrared camera.
- the pupil cameras 21a, 21b and the scenery camera 22 capture video (pupil images, scenery images) at a rate of, for example, 15 or 30 frames per second, and sequentially transmit the captured pupil images and scenery images to the information processing device 10 via the communication unit 23.
- the communication unit 23 is a communication module for transmitting pupil images captured by the pupil cameras 21a and 21b and scenery images captured by the scenery camera 22 to the information processing device 10 via wireless communication.
- the communication unit 23 transmits the pupil images and scenery images via wireless communication compliant with, for example, IEEE 802.15.1, i.e., Bluetooth (registered trademark).
- the communication unit 23 may be a communication module for wired communication.
- the communication unit 23 may also be a communication module for connecting to a network, in which case the pupil images and scenery images are transmitted to the information processing device 10 via the network.
- the wearable device 20 may also be configured to include an input unit that accepts operational inputs from the user.
- the wearable device 20 may also be configured to include a memory unit, in which case the pupil images captured by the pupil cameras 21a and 21b and the scenery images captured by the scenery camera 22 may be temporarily stored in the memory unit and then transmitted together to the information processing device 10 at a predetermined timing.
- the information processing device 10 has a control unit 11, a memory unit 12, a communication unit 13, an input unit 14, a display unit 15, a speaker 16, and a reading unit 17, and each of these units is connected via a bus.
- the control unit 11 is configured using one or more processors such as a CPU (Central Processing Unit), MPU (Micro-Processing Unit), GPU (Graphics Processing Unit), or TPU (Tensor Processing Unit).
- The control unit 11 executes the various information processing and control processes performed by the information processing device 10 by executing, as appropriate, a program P stored in the storage unit 12. Note that if the control unit 11 includes multiple processors, each process may be executed by a different processor.
- the storage unit 12 includes RAM (Random Access Memory), flash memory, a hard disk, an SSD (Solid State Drive), etc.
- the storage unit 12 pre-stores the program P (program product) executed by the control unit 11 and various data required to execute the program P.
- the storage unit 12 also temporarily stores data generated when the control unit 11 executes the program P.
- the storage unit 12 may be made up of multiple storage devices, and part of the storage unit 12 may be another storage device connected to the information processing device 10, or another storage device with which the information processing device 10 can communicate.
- the communication unit 13 has a configuration similar to the communication unit 23 of the wearable device 20, and receives pupil images and scenery images captured by the wearable device 20 from the communication unit 23.
- the communication unit 13 may be a communication module for wired communication, or a communication module for connecting to a network.
- the communication unit 13 may also be configured to have a separate communication module for communicating with the wearable device 20 and a separate communication module for connecting to the network.
- the input unit 14 accepts operation input by the user and sends a control signal corresponding to the operation content to the control unit 11.
- the display unit 15 is an LCD display, an organic EL display, or the like, and displays various information in accordance with instructions from the control unit 11. A part of the input unit 14 and the display unit 15 may be integrated into a touch panel.
- the speaker 16 is an audio output unit that outputs messages, warning sounds, etc. in accordance with instructions from the control unit 11.
- the reading unit 17 reads information stored on portable storage media 10a, including CDs (Compact Discs), DVDs (Digital Versatile Discs), USB (Universal Serial Bus) memories, SD (Secure Digital) cards, etc.
- the program P and various data stored in the storage unit 12 may be read by the control unit 11 from the portable storage medium 10a via the reading unit 17 and stored in the storage unit 12.
- the program P and various data may also be written to the storage unit 12 during the manufacturing stage of the information processing device 10, or may be downloaded by the control unit 11 from another device via the communication unit 13 and stored in the storage unit 12.
- the information processing device 10 is not limited to a single computer, but may be a multi-computer consisting of multiple computers. Furthermore, the information processing device 10 may be a virtual machine virtually constructed within a single device using software, or may be a cloud server. The following description will be given assuming that the information processing device 10 is a single computer. Furthermore, the program P may be located on a single computer or at a single site, or may be distributed across multiple sites and executed in a distributed manner on multiple computers interconnected via a network. Furthermore, the input unit 14 and display unit 15 are not essential for the information processing device 10, and the information processing device 10 may be configured to accept operations via a connected computer, or to output information to be displayed to an external display device.
- the indwelling needle 30 comprises a catheter 31, a catheter hub 32 connected to the base end of the catheter 31, an inner needle 33 (needle) inserted into the catheter 31, an inner needle hub 34 connected to the base end of the inner needle 33, and a filter 35 connected to the base end of the inner needle hub 34.
- a syringe (not shown) can be connected to the base end of the inner needle hub 34.
- a hemostatic valve (not shown) may be provided within the catheter hub 32.
- the indwelling needle 30 does not need to include the filter 35.
- the catheter 31 constitutes the outer needle and extends to the vicinity of the tip of the inner needle 33 inserted into the catheter 31.
- a blade surface 331 formed at the tip of the inner needle 33 is exposed (protrudes) from the tip of the catheter 31.
- the following describes a method of puncture using the indwelling needle 30 configured as described above.
- The surgeon grasps the indwelling needle 30 with the inner needle 33 inserted into the catheter 31, inserts the tip of the inner needle 33 from the surface of the living body 40 (the subject's skin surface) toward the blood vessel, and gradually advances the inner needle 33 toward the desired site. The tip of the inner needle 33 thereby advances while cutting through the body tissue.
- When the tip of the inner needle 33 enters the blood vessel, the catheter 31, which extends to the vicinity of the tip, is also inserted into the same blood vessel.
- The surgeon then moves the catheter 31 and catheter hub 32 toward the tip side and inserts the catheter 31 further into the blood vessel.
- Finally, the surgeon removes the inner needle 33, inner needle hub 34, and filter 35 while leaving the catheter 31 and catheter hub 32 behind, thereby leaving the catheter 31 and catheter hub 32 indwelling in the living body 40.
- the medical professional wears the wearable device 20, for example, on their head (face) when performing the puncture procedure.
- the surgeon also palpates the living body 40 to confirm the location of blood vessels within the living body 40, and then performs the puncture procedure.
- The pupil cameras 21a and 21b of the wearable device 20 capture images of the surgeon's eyes (at least the pupil and iris), and the scenery camera 22 captures images of the scenery in the surgeon's line of sight.
- The scenery in the surgeon's line of sight includes the state of the surgeon's palpation procedure and the state of the puncture procedure on the living body 40.
- The wearable device 20 transmits the pupil images captured by the pupil cameras 21a and 21b and the scenery images captured by the scenery camera 22 to the information processing device 10.
- the information processing device 10 performs processing to track the surgeon's line of sight (eye movement) based on the pupil images and view images captured by the wearable device 20.
- the information processing device 10 also detects whether the surgeon is performing a palpation operation or a puncture operation based on the scenery image.
- the information processing device 10 identifies the target puncture position based on the surgeon's line of sight (the viewpoint relative to the living body 40) during the palpation operation, and estimates the planned puncture position based on the tip position of the indwelling needle 30 in the scenery image during the puncture operation.
- the information processing device 10 determines whether the puncture state (planned puncture position) is appropriate based on the target puncture position and the planned puncture position, and outputs an alert according to the determination result, thereby supporting the medical professional in the puncture operation.
- Figure 4 is a flowchart showing an example of the puncture operation support process procedure, and Figures 5A to 7 are explanatory diagrams of the puncture operation support process.
- The following process is executed by the control unit 11 in accordance with the program P stored in the storage unit 12 of the information processing device 10, but some of the following process may also be implemented by a dedicated hardware circuit (for example, an FPGA (Field Programmable Gate Array) or an ASIC (Application Specific Integrated Circuit)).
- the wearable device 20 starts capturing images using pupil cameras 21a, 21b and scenery camera 22, and sequentially transmits the pupil images captured by pupil cameras 21a, 21b and the scenery images captured by scenery camera 22 to the information processing device 10.
- the wearable device 20 may start capturing images using pupil cameras 21a, 21b and scenery camera 22 after starting communication with the information processing device 10, or, if the wearable device 20 is provided with a power button for instructing the start and end of operation, the wearable device 20 may start capturing images using pupil cameras 21a, 21b and scenery camera 22 after the power button is turned on.
- The wearable device 20 continues capturing images using the pupil cameras 21a, 21b and the scenery camera 22 and transmitting the pupil images and scenery images to the information processing device 10 until the puncture support process by the puncture support system 100 is completed. For example, when communication with the information processing device 10 ends, or when the power button is turned off, the wearable device 20 stops capturing images using the pupil cameras 21a and 21b and the scenery camera 22, and stops transmitting the pupil images and scenery images to the information processing device 10.
- the control unit 11 of the information processing device 10 begins acquiring pupil images and scenery images sequentially transmitted from the wearable device 20 (S11).
- the control unit 11 first performs a detection process to determine whether the surgeon is performing palpation on the living body 40 based on the scenery image (S12), and determines whether the surgeon has started the palpation (S13).
- Figure 5A shows an example of a scenery image, and the scenery image in Figure 5A captures the surgeon's palpation of the living body 40.
- The control unit 11 determines whether the living body 40 and the surgeon's fingers, etc. are present in the scenery image, for example, by template matching using template images. If it determines that the living body 40 and the surgeon's fingers, etc. are present, it determines that the surgeon has started the palpation.
- For example, template images are generated in advance by extracting the features of the puncture target site (such as a typical subject's forearm), the surgeon's fingers, and the instruments used for the puncture procedure (such as a tourniquet), and these template images are stored in the storage unit 12.
- the control unit 11 determines whether or not there is an area in the scenery image that matches each template image, thereby determining whether or not there is a living body 40 and the surgeon's fingers, etc. in the scenery image.
- the control unit 11 may also determine whether or not the living body 40 and the surgeon's fingers, etc. are present in the scenery image by performing object detection processing on the scenery image to detect the presence or absence of the living body 40 and the surgeon's fingers, etc.
- the object detection processing here can be performed, for example, using a learning model that has been machine-learned to determine whether or not a subject in an input scenery image is a pre-learned subject (e.g., the living body 40, the surgeon's fingers, a tourniquet, etc.) when a scenery image is input.
- the control unit 11 can input the scenery image to the learning model and detect the presence or absence of the living body 40 and the surgeon's fingers, etc. in the scenery image based on output information from the learning model.
- The predetermined operation here may be an operation input by the surgeon's blink, an operation input using an operation button provided on the wearable device 20 or the information processing device 10, a voice input of a predetermined message (for example, "Set this as the puncture position"), a gesture input by the surgeon, or the like.
- The control unit 11 may be configured to output the message "Let's start puncturing" if the amount of deviation is less than a first threshold (e.g., 1 mm), to output the message "There is a slight deviation from the target position" if the amount of deviation is greater than or equal to the first threshold and less than a second threshold (e.g., 2 mm), and to output the message "Let's try again" if the amount of deviation is greater than or equal to the second threshold.
- As described above, the target puncture position is set based on the surgeon's line of sight (the viewpoint relative to the living body 40) during palpation, and the planned puncture position is set based on the position of the tip of the indwelling needle 30 during the puncture operation. By outputting an alert if the planned puncture position and the target puncture position are separated by a predetermined distance or more, it is possible to notify the surgeon that the puncture operation is highly likely to fail because the planned puncture position deviates from the target puncture position. The surgeon can therefore prevent the puncture operation from failing by, for example, redoing the puncture operation.
- Since the target puncture position is set based on the surgeon's viewpoint relative to the living body 40 and the planned puncture position is set based on the tip position of the indwelling needle 30 relative to the living body 40, the process is not affected by ambient light, the skin color (race) of the living body 40, the condition of the skin surface of the living body 40 (amount of body hair), etc.
- a server that executes eye-tracking processing may also be provided.
- the information processing device 10 can be configured to transmit pupil images and scenery images acquired from the wearable device 20 to the server and receive a viewpoint map generated by the server that shows the trajectory of the surgeon's gaze point.
- a server may be provided that executes the process of detecting the indwelling needle 30 in the scenery image based on the scenery image.
- the information processing device 10 can be configured to transmit the scenery image acquired from the wearable device 20 to a server and receive the result determined by the server (information indicating the presence or absence of the indwelling needle 30 in the scenery image).
- the wearable device 20 may be provided with a control unit so that at least one of the above-mentioned processes is executed locally by the wearable device 20.
- In Embodiment 2, a puncture support system that supports a puncture operation using an indwelling needle 30 provided with markers will be described.
- the puncture support system of this embodiment can be realized by devices similar to the puncture support system 100 of embodiment 1 shown in Fig. 1 except for the indwelling needle 30, so a description of the configuration of each device will be omitted.
- The puncture procedure is performed with marker M1 positioned on the same side as the blade surface 331. Since the indwelling needle 30 is used with the blade surface 331 facing upward during puncture, marker M1 can be accurately imaged when the indwelling needle 30 is photographed from above.
- the control unit 11 of the information processing device 10 is capable of executing processing similar to that shown in FIG. 4, and can assist the surgeon in the puncture procedure based on pupil images captured by the pupil cameras 21a and 21b of the wearable device 20 and scenery images captured by the scenery camera 22.
- the control unit 11 detects the presence or absence of the indwelling needle 30 in the scenery image by executing processing to detect the presence or absence of markers M1 and M2 attached to the indwelling needle 30 in the scenery image.
- the control unit 11 determines that the surgeon should start the puncture procedure if the positions of markers M1 and M2 in the scenery images captured in time series have not moved for a predetermined time (e.g., 2 or 3 seconds) or more.
- the control unit 11 estimates the tip position of the indwelling needle 30 based on the positions of markers M1 and M2 in the scenery image, and identifies the estimated position as the planned puncture position.
- Figure 9 is an explanatory diagram of the process of estimating the tip position of the indwelling needle 30.
- Figure 9 shows a scenery image of a puncture operation using an indwelling needle 30 equipped with only marker M1. As shown in Figure 9, the position of marker M1 does not coincide with the tip position of the indwelling needle 30, so the control unit 11 estimates that a predetermined position based on marker M1 is the tip position of the indwelling needle 30.
- the scenery image during the puncture operation is captured from the base end side of the indwelling needle 30.
- the control unit 11 may estimate that a position in the scenery image that is a predetermined distance (predetermined pixels) above the position of marker M1 (position P2 in Figure 9) is the tip position of the indwelling needle 30. If the indwelling needle 30 in the scenery image is photographed in a state where it is tilted relative to the vertical direction, the control unit 11 may estimate the tip position of the indwelling needle 30 to be a position a predetermined distance above the position of marker M1 in the direction of tilt (the axial direction of the indwelling needle 30), or may estimate the tip position of the indwelling needle 30 to be a position a predetermined distance above the longitudinal direction of the living body 40 (e.g., the forearm) as shown in FIG. 7. When an indwelling needle 30 equipped with markers M1 and M2 is used, the tip position of the indwelling needle 30 may be estimated from the relative positions of markers M1 and M2 in the scenery image.
- markers M1 and M2 are preferably provided on the tip side of indwelling needle 30, but they do not have to be provided on the tip side.
- In this case, the control unit 11 can estimate the tip position of the indwelling needle 30 from the positions of markers M1 and M2 in the scenery image.
- the scenery camera 22 may be a near-infrared camera having a near-infrared LED and a near-infrared light receiving unit.
- the markers M1 and M2 may be markers containing a near-infrared fluorescent dye that emits near-infrared fluorescence of a specific wavelength when irradiated with excitation light.
- The scenery camera 22 emits near-infrared light in the direction of the surgeon's line of sight using the near-infrared LED, and receives the near-infrared light emitted by the near-infrared LED and reflected from the living body 40, the indwelling needle 30, etc.
- Since the markers M1 and M2 contain a near-infrared fluorescent dye, the shape, range (area), etc. of the markers M1 and M2 in the scenery image can be detected with high accuracy.
- the markers M1 and M2 may contain, for example, a near-infrared fluorescent dye that emits near-infrared fluorescence in a wavelength range of approximately 750 nm to 2500 nm, or may contain a fluorescent dye that emits fluorescence in a wavelength range corresponding to green, blue, etc. (e.g., wavelengths of 400 to 600 nm) that is not present in the subject's living body 40.
- The configurations of the markers M1, M2 and the scenery camera 22 are not limited, as long as the markers M1, M2 can be detected by the scenery camera 22.
- Markers M1, M2 may include, for example, a light-emitting body that emits light of a specific wavelength, an upconversion (UC) phosphor that emits light of a specific wavelength, or a reflector that reflects irradiated light of a specific wavelength.
- Markers M1, M2 may also have a reflective structure that reflects light by applying surface treatment such as embossing or unevenness.
- Markers M1, M2 may also include a material that absorbs visible light, such as colored ink, and be detected by a visible light camera.
- The puncture support system of Embodiment 3 can be realized by devices similar to the puncture support system 100 of Embodiment 1 shown in Fig. 1, and therefore a description of the configuration of each device will be omitted.
- the scenery camera 22 of the wearable device 20 is a near-infrared camera having a near-infrared LED and a near-infrared light receiving unit.
- the light-receiving unit of the scenery camera 22 receives near-infrared light reflected from the living body 40, the indwelling needle 30, and the surgeon's fingers, and acquires a scenery image using near-infrared light.
- the scenery camera 22 captures the status of the surgeon's palpation and puncture operations during the palpation and puncture operations. If a living body 40 is present within the imaging range of the scenery camera 22, the scenery image includes a visualized image of blood vessels representing the blood vessels within the living body 40.
- The spectral sensitivity wavelength range of the light receiving element of the scenery camera 22 is not particularly limited as long as it corresponds to the emission wavelength of the near-infrared LED, and can be set appropriately taking into account the response wavelength of the blood vessels of the living body 40 (i.e., the wavelength of light absorbed by the blood vessels).
- For example, the spectral sensitivity wavelength range of the light receiving element of the scenery camera 22 can be set to a range including 800 nm to 900 nm.
- the spectral sensitivity wavelength range of the light receiving element of the scenery camera 22 can be set to a range including the response wavelength of the markers M1 and M2, i.e., the wavelength of light reflected by the markers M1 and M2, and the response wavelength of the blood vessels.
- The light receiving element of the scenery camera 22 may include a light receiving element for blood vessel detection having a spectral sensitivity wavelength corresponding to the response wavelength of the blood vessels, and a light receiving element for marker detection having a spectral sensitivity wavelength corresponding to the response wavelength of the markers M1 and M2.
- In this case, the light receiving unit for blood vessel detection acquires a scenery image including a visualized blood vessel image, and the light receiving unit for marker detection acquires a scenery image including a marker image.
- the spectral sensitivity wavelengths of the light receiving unit for blood vessel detection and the light receiving unit for marker detection may be the same or partially overlap, or may be different from each other.
- the scenery camera 22 of this embodiment may include a near-infrared camera for acquiring a visualized image of blood vessels, and a visible light camera for acquiring photographic images of the living body 40, the indwelling needle 30, etc.
- If the markers M1 and M2 contain a near-infrared fluorescent dye, the marker images may be acquired by the near-infrared camera; if the markers M1 and M2 do not contain a near-infrared fluorescent dye, the scenery image including the markers M1 and M2 may be acquired by the visible light camera.
- FIG. 10 is a flowchart showing an example of the puncture operation support processing procedure in embodiment 3, and FIG. 11 is an explanatory diagram of the puncture operation support processing.
- the processing shown in FIG. 10 is the processing shown in FIG. 4 with steps S31 to S35 added between steps S16 and S17. Explanations of steps that are the same as those in FIG. 4 will be omitted. Note that steps S18 to S22 from FIG. 4 are not shown in FIG. 10.
- the scenery camera 22 of the wearable device 20 acquires a scenery image using near-infrared light. Therefore, in step S12, the control unit 11 of the information processing device 10 performs a detection process to determine whether the surgeon is performing a palpation operation on the living body 40, based on the scenery image using near-infrared light. In addition, in step S14, the control unit 11 performs an eye tracking process to track the surgeon's line of sight, based on the pupil image and the scenery image using near-infrared light.
- The control unit 11 calculates the amount of deviation (second deviation amount) between the blood vessels in the blood vessel visualization image generated in step S31 and the target puncture position set in step S16 (S32).
- Figure 11 shows a scenery image including a blood vessel visualization image, and shows an enlarged view of an area including target puncture position P1 on the living body 40 and blood vessel 40b in the blood vessel visualization image.
- the control unit 11 identifies the blood vessel closest to target puncture position P1 in the two-dimensional scenery image, and calculates the distance (number of pixels) between the identified blood vessel and target puncture position P1, which is used as the amount of deviation.
- The control unit 11 determines whether the calculated amount of deviation is greater than or equal to a predetermined value (S33). If it determines that the amount of deviation is greater than or equal to the predetermined value (S33: YES), it outputs a message such as "The target position deviates from the blood vessel" or "Please start again from the palpation operation" to warn the surgeon (outputs an alert) (S35). If the control unit 11 determines that the amount of deviation is less than the predetermined value (S33: NO), it determines whether the target puncture position is appropriate based on the positional relationship between the blood vessel in the blood vessel visualization image and the target puncture position (S34).
- For example, the control unit 11 determines whether the target puncture position is near a point where the blood vessel branches, as in the area circled by a dashed line in Figure 11. If the target puncture position is near such a branching point, the control unit 11 determines that the target puncture position is inappropriate; otherwise, it determines that the target puncture position is appropriate. Specifically, the control unit 11 measures the distance between the target puncture position and the point where the blood vessel branches, and if the measured distance is less than a predetermined value, it determines that the target puncture position is inappropriate.
- If the control unit 11 determines that the target puncture position is inappropriate (S34: NO), it outputs a message such as "The target position is close to a branching point of the blood vessel" or "Select a straight section of the blood vessel" to warn the surgeon (outputs an alert) (S35).
- If the control unit 11 determines that the target puncture position is appropriate (S34: YES), it proceeds to the processing of step S17. If an alert is output, the surgeon starts over from the palpation action; therefore, after the processing of step S35, the control unit 11 returns to step S12 and repeats the processing from step S12 onwards.
- the above-described processing allows a blood vessel visualization image to be generated from a scenery image captured by the scenery camera 22, which is a near-infrared camera, and it is then possible to determine whether the target puncture position is appropriate for the blood vessels in the blood vessel visualization image and output an alert.
- the configuration of this embodiment is applicable to the puncture support system 100 of the first and second embodiments described above, and similar processing can be performed and similar effects can be obtained even when applied to the puncture support system 100 of the first and second embodiments.
- the modified examples described as appropriate in the first and second embodiments described above can also be applied to this embodiment.
- the predetermined value used to determine whether to output an alert in step S33 (threshold value for the distance between the blood vessel and the target puncture position P1) and the predetermined value used to determine whether to output an alert in step S34 (threshold value for the distance between the branching point of the blood vessel and the target puncture position P1) may be configurable, and may be switchable depending on attribute information such as the subject's age and gender.
- the puncture support systems 100 of the first to third embodiments described above are configured to output a message according to the amount of deviation between the target puncture position and the planned puncture position, thereby alerting the surgeon.
- This embodiment describes a puncture support system configured to use a learning model to determine a threshold value used to determine whether to output an alert based on the amount of deviation between the target puncture position and the planned puncture position.
- the puncture support system of this embodiment can be implemented using devices similar to the puncture support system 100 of the first embodiment shown in FIG. 1 , and therefore a description of the configuration of each device will be omitted.
- the information processing device 10 of this embodiment stores, in the storage unit 12, a learning model that has learned training data by, for example, machine learning.
- FIG. 12A is an explanatory diagram showing an example of the configuration of a first learning model
- FIG. 12B is an explanatory diagram showing an example of the configuration of a second learning model.
- FIG. 12A shows a first learning model Ma that estimates, from the subject's biometric information, a threshold used to determine whether to issue an alert for the amount of deviation between the target puncture position and the planned puncture position.
- FIG. 12B shows a second learning model Mb, as a variation of the first learning model Ma, that estimates the success rate of puncture for each of pre-set deviation amounts (first to third deviation amounts, etc.).
- the first learning model Ma is generated by machine learning using training data that associates training biometric information with a correct threshold for the amount of deviation between the target puncture position and the planned puncture position.
- the correct threshold can be a threshold determined by medical professionals such as doctors and nurses based on the results (success or failure) of a puncture procedure performed on a subject of the training biometric information.
- When the first learning model Ma receives biometric information contained in the training data as input, it learns so that the output value from the output node corresponding to the correct threshold approaches 1 and the output values from the other output nodes approach 0. In the learning process, the first learning model Ma performs calculations based on the input biometric information and calculates output values from each output node.
- the second learning model Mb is generated by machine learning using training data that associates training biometric information with the correct puncture success rate for each amount of deviation between the target puncture position and the planned puncture position.
- the correct puncture success rate can be the puncture success rate determined by a medical professional based on the results (success or failure) of puncturing a subject with the training biometric information at each amount of deviation.
- the second learning model Mb performs calculations based on the input biometric information and calculates the output values from each output node.
- the second learning model Mb compares the calculated output value of each output node for each deviation amount with a value corresponding to the correct puncture success rate (1 for the output node corresponding to the correct puncture success rate, 0 for the other output nodes), and optimizes the parameters used in the calculation process so that the two values approximate each other.
- parameters such as the weights (coupling coefficients) between nodes in the second learning model Mb are also optimized using methods such as backpropagation and steepest descent. This allows for the acquisition of a second learning model Mb that, when the subject's biometric information is input, outputs the puncture success rate when a puncture procedure is performed on the subject at each deviation amount.
- the learning models Ma and Mb may be learned on another learning device.
- the learned learning models Ma and Mb generated by learning on another learning device are downloaded from the learning device to the information processing device 10 via a network or via the portable storage medium 10a and stored in the storage unit 12.
- FIG. 13 is a flowchart showing an example of the puncture support processing procedure of embodiment 4.
- the processing shown in FIG. 13 is the same as the processing shown in FIG. 4, except that steps S41 and S42 are added before step S11, and steps S43 and S44 are added instead of step S22. Explanations of steps that are the same as those in FIG. 4 will be omitted. Note that steps S11 to S20 from FIG. 4 are not shown in FIG. 13.
- the control unit 11 first acquires biometric information of the person to be punctured (S41).
- the biometric information may be manually input by the surgeon via the input unit 14, for example, or may be acquired by the control unit 11 from an electronic medical record server or the like via the communication unit 13 over a network.
- the biometric information may include, for example, the age and gender of the person, and may further include information such as the person's diagnosis, medical history, edema status, height, and weight. If the input data of the first learning model Ma includes environmental information, the control unit 11 may acquire the temperature and humidity of the location where the puncture is to be performed, for example, from a thermometer and hygrometer connected to the input unit 14.
- Based on the acquired biometric information, the control unit 11 sets a threshold for the amount of deviation between the target puncture position and the planned puncture position during the puncture operation on the person to be punctured (S42).
- the control unit 11 inputs the biometric information into the first learning model Ma and acquires the threshold for the amount of deviation as an output value from the first learning model Ma.
- the control unit 11 identifies the output node that outputs the largest output value (certainty) among the output values from each output node of the first learning model Ma, and estimates and sets the numerical value associated with the identified output node as the threshold for the amount of deviation.
- the control unit 11 may also estimate the threshold for the amount of deviation using the second learning model Mb.
- the control unit 11 inputs the biometric information into the second learning model Mb, identifies the output node that outputs the largest output value (certainty) among the output nodes corresponding to each amount of deviation, and estimates the puncture success rate associated with the identified output node as the puncture success rate for each amount of deviation.
- the control unit 11 sets the deviation amount at which the puncture success rate is estimated to be 80% or higher as the threshold for the deviation amount during the puncture operation. For example, if the puncture success rate for the first and second deviation amounts is estimated to be 80% and the puncture success rate for the third deviation amount is estimated to be less than 40%, the second deviation amount is set as the threshold.
- the process of estimating the threshold for the deviation amount from the biometric information is not limited to processing using the learning models Ma and Mb, and may be rule-based processing. For example, by previously associating the biometric information with the threshold for the deviation amount, it is possible to estimate the threshold for the deviation amount corresponding to the subject's biometric information.
- Thereafter, the control unit 11 proceeds to step S11 and executes the processes of steps S11 to S21. The control unit 11 then determines whether the amount of deviation calculated in step S21 is equal to or greater than the threshold value set in step S42 (S43). If the control unit 11 determines that the amount of deviation is equal to or greater than the threshold value (S43: YES), it outputs a message such as "There is a large deviation from the target position" or "Please try again" to warn the surgeon (outputs an alert) (S44). If the control unit 11 determines that the amount of deviation is less than the threshold value (S43: NO), it skips the processing of step S44 and ends the series of processes.
- When the amount of deviation is less than the threshold value, the control unit 11 may output a message such as "Close to the target position" or "Let's start puncturing."
- In this embodiment, the threshold for the amount of deviation between the target puncture position and the planned puncture position, which is used to determine whether or not to output an alert during a puncture operation, can be switched according to the biometric information of the person to be punctured. For example, since blood vessels become less elastic with age, the puncture operation must capture the center of the blood vessel, and a stricter threshold for the amount of deviation between the target puncture position and the planned puncture position must therefore be set. Because the threshold for the amount of deviation can be switched according to the subject's biometric information, a more appropriate alert can be output depending on the subject.
- the learning models Ma and Mb are used to determine the threshold for the amount of deviation between the target puncture position and the planned puncture position. Additionally, the learning model may be used to determine the gaze time (e.g., 2 or 3 seconds) of the surgeon at the gaze point, which is used as a condition for setting the target puncture position (target setting condition). For example, the state of the surgeon's palpation movement may be detected based on a scenery image, and the gaze time used as the target setting condition may be determined according to the state of the palpation movement.
- for example, a learning model may be used that has been trained to output the gaze time used as the target setting condition when given information such as the time required for the surgeon's palpation movement (elapsed time) and the agility of the palpation movement.
- the target setting condition can be switched according to the state of the surgeon's palpation movement, enabling support processing tailored to the surgeon.
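As a rough sketch of such a model, a simple regressor can stand in for the trained learning model; the training pairs, the feature definitions, and the use of scikit-learn's `LinearRegression` are all assumptions for illustration, not details from the embodiment.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Invented training pairs: (elapsed palpation time in seconds, agility score
# in [0, 1]) -> gaze time (s) to require as the target setting condition.
X_train = np.array([[5.0, 0.9], [12.0, 0.6], [20.0, 0.3]])
y_train = np.array([2.0, 2.5, 3.0])

gaze_model = LinearRegression().fit(X_train, y_train)

def gaze_time_for(elapsed_s, agility):
    """Predict the gaze time to use as the target setting condition."""
    return float(gaze_model.predict(np.array([[elapsed_s, agility]]))[0])
```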
- This section describes a puncture support system that outputs an alert when the position palpated by the surgeon on the living body 40 deviates from the set target puncture position by a predetermined distance or more.
- the puncture support system of this embodiment can be realized by devices similar to those of the puncture support system 100 of the first embodiment shown in Fig. 1, and a description of the configuration of each device is therefore omitted.
- FIG. 14 is a flowchart showing an example of the puncture operation support processing procedure in embodiment 5, and FIG. 15 is an explanatory diagram of the puncture operation support processing.
- the processing shown in FIG. 14 is the processing shown in FIG. 4, with step S51 added between YES in step S13 and step S14, and steps S52 to S54 added between steps S16 and S17. Explanations of steps that are the same as those in FIG. 4 will be omitted. Note that steps S18 to S22 from FIG. 4 are not shown in FIG. 14.
- the control unit 11 of the information processing device 10 of this embodiment executes, based on the scenery image, a detection process for determining whether the surgeon is performing a palpation operation on the living body 40 (S12). If it determines that the surgeon has started the palpation operation (S13: YES), the control unit 11 identifies the surgeon's palpation position on the living body 40 (S51).
- FIG. 15 shows an example of a scenery image captured during the palpation operation.
- in this example, the surgeon is performing the palpation operation with the thumb, and the center position of the first joint of the thumb is identified as palpation position P3.
- the surgeon's fingers can be identified in the scenery image by pattern matching, by a learning model trained through machine learning to extract the surgeon's fingers from the scenery image, or the like.
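As one concrete instance of the pattern-matching option, the OpenCV sketch below locates a thumb template in the scenery image and takes the center of the best match as the palpation position; the template image, the matching method, and the score threshold are assumptions, not details from the embodiment.

```python
import cv2

def find_palpation_position(scene_bgr, thumb_template_bgr, min_score=0.7):
    """Locate the surgeon's thumb by template matching and return the (x, y)
    center of the best match, or None when the match score is too low."""
    result = cv2.matchTemplate(scene_bgr, thumb_template_bgr, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    if max_val < min_score:
        return None
    h, w = thumb_template_bgr.shape[:2]
    return (max_loc[0] + w // 2, max_loc[1] + h // 2)  # match center
```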
- the control unit 11 then executes the eye tracking process (S14) and sets the target puncture position (S16).
- the control unit 11 calculates the amount of deviation (third deviation) between the palpation position identified in step S51 and the target puncture position set in step S16 (S52).
- the example in FIG. 15 shows a state in which the fourth gaze point detected by the eye tracking process is set as the target puncture position P1. The control unit 11 calculates the straight-line distance (number of pixels) of the line segment connecting the palpation position P3 and the target puncture position P1, and uses this as the amount of deviation.
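In image coordinates this is simply the Euclidean distance between the two points; a minimal sketch, assuming both positions are (x, y) pixel tuples and the function name is hypothetical:

```python
import math

def pixel_deviation(palpation_xy, target_xy):
    """Straight-line pixel distance between palpation position P3 and
    target puncture position P1 in the scenery image."""
    dx = palpation_xy[0] - target_xy[0]
    dy = palpation_xy[1] - target_xy[1]
    return math.hypot(dx, dy)

# e.g. pixel_deviation((412, 305), (430, 290)) -> about 23.4 pixels
```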
- the control unit 11 determines whether the calculated amount of deviation is equal to or greater than a predetermined value (S53). If it determines that the amount of deviation is equal to or greater than the predetermined value (S53: YES), it outputs a message such as "Look at the location where you are about to puncture" to warn the surgeon (outputs an alert) (S54). If it determines that the amount of deviation is less than the predetermined value (S53: NO), it proceeds to step S17. When an alert is output, the surgeon starts over from the palpation operation; therefore, after step S54, the control unit 11 returns to step S12 and repeats the processing from step S12 onwards.
- if the palpation position at which the surgeon palpates the living body 40 and the target puncture position set based on the surgeon's viewpoint position relative to the living body 40 deviate by a predetermined distance or more, an alert is output, notifying the surgeon that there is a high possibility that the puncture operation will fail.
- the surgeon searches for the blood vessel to be punctured by palpating the living body 40, so the palpation position is likely to lie on the blood vessel. Therefore, by moving the target puncture position closer to the palpation position, the target puncture position can be set on the blood vessel, improving the accuracy with which the target puncture position is set.
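One simple way to realize "moving the target puncture position closer to the palpation position" is a weighted blend of the two points; the function name and the default weight below are illustrative assumptions, not values from the text.

```python
def refine_target(target_xy, palpation_xy, weight=0.5):
    """Blend the gaze-derived target toward the palpation position.
    weight=0 keeps the original target, weight=1 snaps to the palpation
    position; 0.5 is an illustrative midpoint."""
    return tuple(t + weight * (p - t) for t, p in zip(target_xy, palpation_xy))
```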
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Biomedical Technology (AREA)
- Veterinary Medicine (AREA)
- Public Health (AREA)
- General Health & Medical Sciences (AREA)
- Animal Behavior & Ethology (AREA)
- Heart & Thoracic Surgery (AREA)
- Surgery (AREA)
- Hematology (AREA)
- Molecular Biology (AREA)
- Anesthesiology (AREA)
- Medical Informatics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Biophysics (AREA)
- Physics & Mathematics (AREA)
- Robotics (AREA)
- Vascular Medicine (AREA)
- Theoretical Computer Science (AREA)
- Dermatology (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Pulmonology (AREA)
- Pathology (AREA)
- Image Processing (AREA)
Abstract
The invention provides a program and the like capable of supporting puncture work. According to this program, a computer detects a user's viewpoint position relative to a living body and identifies a target puncture position on the living body based on the user's viewpoint position. The computer also detects the position of the tip of a puncture instrument relative to the living body, and calculates the amount of deviation between the target puncture position and the position of the tip of the puncture instrument.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2024-017930 | 2024-02-08 | ||
| JP2024017930 | 2024-02-08 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025169527A1 (fr) | 2025-08-14 |
Family
ID=96699694
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2024/032795 Pending WO2025169527A1 (fr) | 2024-02-08 | 2024-09-13 | Programme, procédé de traitement d'informations, dispositif de traitement d'informations et système de traitement d'informations |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025169527A1 (fr) |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2002058658A (ja) * | 2000-06-05 | 2002-02-26 | Toshiba Corp | インターベンショナルmri用の磁気共鳴イメージング装置及びその準備方法 |
| JP2006055407A (ja) * | 2004-08-20 | 2006-03-02 | Toshiba Corp | 超音波診断装置及びその制御方法 |
| JP2006102110A (ja) * | 2004-10-05 | 2006-04-20 | Matsushita Electric Ind Co Ltd | 血管位置提示装置 |
| JP2011227365A (ja) * | 2010-04-22 | 2011-11-10 | Osaka Prefecture Univ | 医療看護技術学習支援装置および医療看護技術学習方法 |
- 2024-09-13: WO PCT/JP2024/032795 published as WO2025169527A1 (fr); status: active, Pending
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| AU2023214273B2 (en) | Imaging modification, display and visualization using augmented and virtual reality eyewear | |
| US12161410B2 (en) | Systems and methods for vision assessment | |
| CN114980810B (zh) | 用于检测人的运动过程和/或生命体征参数的系统 | |
| WO2014015378A1 (fr) | Dispositif informatique mobile, serveur d'application, support de stockage lisible par ordinateur et système pour calculer des indices de vitalité, détecter un danger environnemental, fournir une aide à la vision et détecter une maladie | |
| US10835120B2 (en) | Extended medical test system | |
| WO2020005053A1 (fr) | Système portatif pour l'identification de cas potentiels d'œdème maculaire diabétique par traitement d'image et intelligence artificielle | |
| KR101637314B1 (ko) | 안구 촬영 장치 및 방법 | |
| Mishra et al. | Artificial intelligence and ophthalmic surgery | |
| KR20170095992A (ko) | 헤드 장착식 컴퓨팅 장치, 방법 및 컴퓨터 프로그램 제품 | |
| WO2025169527A1 (fr) | Programme, procédé de traitement d'informations, dispositif de traitement d'informations et système de traitement d'informations | |
| WO2024181549A1 (fr) | Programme informatique, procédé de traitement d'informations, dispositif de traitement d'informations et système de ponction | |
| EP4231893A1 (fr) | Dispositif de thérapie de neuromodulation rétinienne et de lecture extrafovéale chez des sujets affectés par une déficience visuelle | |
| US12504634B2 (en) | Imaging modification, display and visualization using augmented and virtual reality eyewear | |
| KR102752731B1 (ko) | 착용시에 주변 환경 또는 생체징후 감시가 가능한 스마트 글래스와 그의 제어 방법 및 프로그램 | |
| Smith et al. | The EYESIGHT Robotic Eye Examination System for Non-Mydriatic Indirect Retinal Imaging on Unconstrained Individuals | |
| Maeshiba et al. | Development for tablet-based perimeter using temporal characteristics of saccadic durations | |
| Dragusin et al. | Development of a System for Correlating Ocular Biosignals to Achieve the Movement of a Wheelchair | |
| CN120616716A (zh) | 静脉穿刺辅助方法、系统及相关设备 | |
| CN120693108A (zh) | 穿刺系统、信息处理方法、计算机程序以及信息处理装置 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24923846; Country of ref document: EP; Kind code of ref document: A1 |