WO2025070545A1 - Image processing device, image processing system, image processing method, and image processing program - Google Patents
Image processing device, image processing system, image processing method, and image processing program
- Publication number
- WO2025070545A1 (PCT/JP2024/034284)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- point
- control unit
- sensor
- ray
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/46—Arrangements for interfacing with the operator or the patient
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/12—Diagnosis using ultrasonic, sonic or infrasonic waves in body cavities or body tracts, e.g. by using catheters
Definitions
- This disclosure relates to an image processing device, an image processing system, an image processing method, and an image processing program.
- Patent Document 1 discloses a method for identifying the direction of the patient's ventral side in a three-dimensional image of biological tissue that is generated and displayed based on a two-dimensional ultrasound image acquired by a transducer of an ultrasound catheter.
- CT is an abbreviation for computed tomography.
- DSA is an abbreviation for digital subtraction angiography.
- the purpose of this disclosure is to make it easier to reproduce the shape of a biological lumen in a three-dimensional image when imaging the three-dimensional structure of the biological lumen to generate a three-dimensional image.
- An image processing device that generates a three-dimensional image by imaging a three-dimensional structure of a biological lumen based on a plurality of two-dimensional images obtained by moving a sensor, provided on a catheter inserted into the biological lumen, along an axis of the catheter through a section from a first point to a second point within the biological lumen, the image processing device comprising: a control unit that adjusts the shape of the three-dimensional image to match the shape of the catheter axis in the section, the shape being identified using at least one X-ray image, including an image of at least a portion of the catheter, taken by an X-ray imaging device that sees through the biological lumen.
- the control unit associates a first position on the at least one X-ray image corresponding to the first point with the position of one end of the movement line, associates a second position on the at least one X-ray image corresponding to the second point with the position of the other end of the movement line, and sets a connecting line connecting the first position and the second position on the at least one X-ray image;
- the image processing device according to [2], wherein the shape of the movement line is set in response to an operation for manually changing the shape of the set connecting line.
- the control unit associates a first position on the at least one X-ray image corresponding to the first point with the position of one end of the movement line, associates a second position on the at least one X-ray image corresponding to the second point with the position of the other end of the movement line, automatically detects an image of the catheter shaft from the at least one X-ray image, and sets the shape of the movement line based on the detected image.
- the image processing device according to [2], wherein the control unit associates a first position on the at least one X-ray image corresponding to the first point with the position of one end of the movement line, associates a second position on the at least one X-ray image corresponding to the second point with the position of the other end of the movement line, automatically detects an image of the sensor from each of a plurality of X-ray images captured by the X-ray imaging device while the sensor moves through the section, and sets the shape of the movement line based on the detected images.
- the control unit associates the first position with the position of one end of the movement line in response to an operation of manually specifying a first position corresponding to the first point on the at least one X-ray image when the sensor is present at the first point.
- the control unit associates the second position with the position of the other end of the movement line in response to an operation of manually specifying a second position corresponding to the second point on the at least one X-ray image when the sensor is present at the second point.
- the image processing device according to [7], wherein the control unit obtains ratio information indicating the ratio of the dimensions of the at least one X-ray image to the dimensions of the three-dimensional image and, based on the acquired ratio information, associates a position on the at least one X-ray image that is away from the first position by a distance corresponding to the length of the movement line with the position of the other end of the movement line, as the second position on the at least one X-ray image corresponding to the second point.
- the control unit automatically detects a first position corresponding to the first point on the at least one X-ray image when the sensor is present at the first point, and associates the first position with the position of one end of the movement line.
- the control unit automatically detects a second position corresponding to the second point on the at least one X-ray image when the sensor is present at the second point, and associates the second position with the position of the other end of the movement line.
- the control unit causes the three-dimensional image to be displayed on a display superimposed on the at least one X-ray image.
- the control unit updates the three-dimensional image in accordance with changes in the movement line while the movement line is being set.
- an image processing program that causes a computer to generate a three-dimensional image by imaging a three-dimensional structure of a biological lumen based on a plurality of two-dimensional images obtained by moving a sensor, provided on a catheter inserted into the biological lumen, along an axis of the catheter through a section from a first point to a second point in the biological lumen, and to execute an operation including adjusting the shape of the three-dimensional image to match the shape of the catheter axis in the section, which is identified using at least one X-ray image including an image of at least a portion of the catheter taken by an X-ray imaging device that sees through the biological lumen.
- FIG. 1 is a block diagram showing a configuration of an image processing system according to an embodiment of the present disclosure.
- FIG. 2 is a flowchart illustrating an operation of an image processing device according to an embodiment of the present disclosure.
- FIG. 3 is a diagram illustrating an example of a screen displayed on a display according to an embodiment of the present disclosure.
- FIG. 4 is a diagram showing an example of updating a screen displayed on a display according to an embodiment of the present disclosure in step S4.
- FIG. 5 is a diagram illustrating an example of a three-dimensional image generated by an image processing device according to an embodiment of the present disclosure.
- FIG. 6 is a diagram showing an example of shape adjustment in step S7 of a three-dimensional image generated by an image processing device according to an embodiment of the present disclosure.
- FIG. 7 is a diagram showing an example of updating a screen displayed on a display according to an embodiment of the present disclosure in step S7.
- FIG. 8 is a flowchart showing a first example of a detailed procedure of step S6.
- FIG. 9 is a diagram showing an example of updating a screen displayed on a display according to an embodiment of the present disclosure in step S603.
- FIG. 10 is a diagram showing an example of updating a screen displayed on a display according to an embodiment of the present disclosure in step S604.
- FIG. 11 is a flowchart showing a second example of the detailed procedure of step S6.
- FIG. 12 is a flowchart showing a third example of the detailed procedure of step S6.
- FIG. 13 is a diagram showing an example of updating a screen displayed on a display according to an embodiment of the present disclosure in step S623.
- FIG. 14 is a flowchart showing a fourth example of the detailed procedure of step S6.
- FIG. 15 is a flowchart showing a fifth example of the detailed procedure of step S6.
- FIG. 16 is a flowchart illustrating a modified example of the operation of the image processing device according to the embodiment of the present disclosure.
- the image processing system 10 includes an image processing device 20, a sensor 28, an X-ray imaging device 30, an input device 40, and a display 50.
- the image processing device 20 is connected to the sensor 28, the X-ray imaging device 30, the input device 40, and the display 50 via a cable or a network, or wirelessly.
- the image processing device 20 is, for example, a general-purpose computer such as a PC, a server computer such as a cloud server, or a dedicated computer. "PC" is an abbreviation for personal computer.
- the image processing device 20 may be installed in a medical facility such as a hospital, or may be installed in a facility separate from the medical facility, such as a data center.
- the sensor 28 is, for example, an ultrasound transducer used in IVUS. "IVUS" is an abbreviation for intravascular ultrasound.
- the sensor 28 transmits ultrasound while moving along the axis 27 of a catheter 26 inserted into a biological lumen such as a blood vessel, as shown in FIG. 3, and receives the reflected waves of the transmitted ultrasound.
- the X-ray imaging device 30 is, for example, a C-arm.
- the X-ray imaging device 30 is installed in a medical facility and sees through the biological lumen.
- the input device 40 is, for example, a pointing device such as a mouse, a keyboard, or a touch screen integrated with the display 50.
- the input device 40 is installed in a medical facility and is used by an operator such as a doctor or clinical engineer to control the display of various information, including images, on the display 50.
- the display 50 is, for example, an LCD or an organic EL display.
- LCD is an abbreviation for liquid crystal display.
- EL is an abbreviation for electro luminescent.
- the display 50 is installed in a medical facility and displays various information, including images, to the operator to assist in catheter surgery, such as stent graft placement procedures.
- the image processing device 20 includes a control unit 21, a memory unit 22, a communication unit 23, an input unit 24, and an output unit 25.
- the control unit 21 includes at least one processor, at least one programmable circuit, at least one dedicated circuit, or any combination of these.
- the processor is a general-purpose processor such as a CPU or GPU, or a dedicated processor specialized for specific processing.
- CPU is an abbreviation for central processing unit.
- GPU is an abbreviation for graphics processing unit.
- An example of the programmable circuit is an FPGA.
- FPGA is an abbreviation for field-programmable gate array.
- An example of the dedicated circuit is an ASIC.
- ASIC is an abbreviation for application specific integrated circuit.
- the control unit 21 executes processing related to the operation of the image processing device 20 while controlling each part of the image processing device 20.
- the memory unit 22 includes at least one semiconductor memory, at least one magnetic memory, at least one optical memory, or any combination thereof.
- the semiconductor memory is, for example, a RAM, a ROM, or a flash memory.
- RAM is an abbreviation for random access memory.
- ROM is an abbreviation for read only memory.
- the RAM is, for example, an SRAM or a DRAM.
- SRAM is an abbreviation for static random access memory.
- DRAM is an abbreviation for dynamic random access memory.
- the ROM is, for example, an EEPROM.
- EEPROM is an abbreviation for electrically erasable programmable read only memory.
- the flash memory is, for example, an SSD.
- SSD is an abbreviation for solid-state drive.
- the magnetic memory is, for example, an HDD. "HDD" is an abbreviation for hard disk drive.
- the memory unit 22 functions, for example, as a main memory device, an auxiliary memory device, or a cache memory. The memory unit 22 stores data used in the operation of the image processing device 20 and data obtained by the operation of the image processing device 20.
- the communication unit 23 includes at least one communication module.
- the communication module is, for example, a module compatible with a wired LAN communication standard such as Ethernet (registered trademark) or a wireless LAN communication standard such as IEEE 802.11. "IEEE" is an abbreviation for Institute of Electrical and Electronics Engineers.
- the communication unit 23 receives data used in the operation of the image processing device 20, and transmits data obtained by the operation of the image processing device 20.
- the communication unit 23 is connected to the sensor 28 and the X-ray imaging device 30. Note that the communication unit 23 does not have to be a communication module as described above, as long as it can receive image signals from the sensor 28 and the X-ray imaging device 30.
- the input unit 24 includes at least one input interface.
- the input interface is, for example, a USB interface, an HDMI (registered trademark) interface, or an interface compatible with a short-range wireless communication standard such as Bluetooth (registered trademark).
- USB is an abbreviation for Universal Serial Bus.
- "HDMI" is an abbreviation for High-Definition Multimedia Interface. HDMI is a registered trademark.
- the output unit 25 includes at least one output interface.
- the output interface is, for example, a USB interface, an HDMI (registered trademark) interface, or an interface compatible with a short-range wireless communication standard such as Bluetooth (registered trademark).
- the output unit 25 outputs data obtained by the operation of the image processing device 20.
- the output unit 25 is connected to the display 50.
- the image processing program according to this embodiment may be stored on a non-transitory computer-readable medium.
- non-transitory computer-readable media include flash memory, magnetic recording devices, optical disks, magneto-optical recording media, and ROMs.
- the program may be distributed, for example, by selling, transferring, or lending portable media such as SD cards, DVDs, or CD-ROMs on which the program is stored.
- SD is an abbreviation for Secure Digital.
- DVD is an abbreviation for digital versatile disc.
- CD-ROM is an abbreviation for compact disc read only memory.
- the program may be distributed by storing it in the storage of a server and transferring it from the server to other computers.
- the program may be provided as a program product.
- the computer temporarily stores in its main storage device, for example, a program stored in a portable medium or a program transferred from a server.
- the computer then reads the program stored in the main storage device with a processor, and executes processing in accordance with the read program with the processor.
- the computer may read the program directly from the portable medium and execute processing in accordance with the program.
- the computer may execute processing in accordance with the received program each time a program is transferred to it from the server. Alternatively, the processing may be executed by a so-called ASP-type service that realizes the functions only by issuing execution instructions and obtaining results, without transferring the program from the server to the computer.
- "ASP" is an abbreviation for application service provider.
- Programs include information used for processing by a computer that is equivalent to a program. For example, data that is not a direct command to a computer but has properties that define computer processing falls under "information equivalent to a program."
- Some or all of the functions of the image processing device 20 may be realized by a programmable circuit or a dedicated circuit as the control unit 21. In other words, some or all of the functions of the image processing device 20 may be realized by hardware.
- the operation of the image processing device 20 will be described with reference to FIG. 2.
- the operation described below corresponds to the image processing method according to this embodiment.
- the image processing method according to this embodiment includes steps S1 to S7 shown in FIG. 2.
- in S1, the control unit 21 acquires one or more X-ray images taken by the X-ray imaging device 30. Specifically, the control unit 21 receives one or more X-ray images from the X-ray imaging device 30 via the communication unit 23.
- in S2, the control unit 21 acquires multiple two-dimensional images obtained by the sensor 28 while the sensor 28 moves through the section from the first point P1 to the second point P2 in the biological lumen. Specifically, the control unit 21 receives reflected wave information from the sensor 28 via the communication unit 23 regarding reflected waves received by the sensor 28 while the sensor 28 moves through the section from the first point P1 to the second point P2 in the biological lumen. The control unit 21 generates multiple cross-sectional images as multiple two-dimensional images by imaging the cross-sectional structure of the biological lumen based on the received reflected wave information. Alternatively, the control unit 21 may receive multiple cross-sectional images generated by a separately provided IVUS device based on the reflected wave information as multiple two-dimensional images from the IVUS device via the communication unit 23.
- Step S1 may be executed before S2, during S2, or after S2.
- the control unit 21 causes the display 50 to display a screen 51 such as that shown in FIG. 3 via the output unit 25.
- the screen 51 includes a first area 52 and a second area 53 adjacent to the first area 52.
- the second area 53 is disposed to the right of the first area 52, but the second area 53 may also be disposed to the left of the first area 52.
- in the first area 52, the latest of the multiple two-dimensional images acquired in S2 is displayed as the two-dimensional image 60, together with the human body model image 61.
- the human body model image 61 is displayed so that its orientation is aligned with that of the two-dimensional image 60.
- the catheter 26 is inserted through the groin of the patient; the upper side of the two-dimensional image 60 corresponds to the side where the abdomen of the patient is located, and the lower side of the two-dimensional image 60 corresponds to the side where the back of the patient is located. Therefore, the human body model image 61 is displayed so that its bottom surface is visible, with the abdomen facing upward and the back facing downward.
- the human body model image 61 is positioned at the upper left of the two-dimensional image 60, but the human body model image 61 may be positioned anywhere in the vicinity of the two-dimensional image 60.
- the position of the human body model image 61 may be fixed or may be arbitrarily changeable.
- a slider 54 is further displayed in the first area 52.
- the slider 54 indicates the position of the sensor 28 in the longitudinal direction of the catheter 26.
- the slider 54 is disposed at the end of the first area 52 that is closer to the second area 53. Therefore, in the example shown in FIG. 3, the slider 54 is disposed at the right end of the first area 52.
- the second area 53 displays the X-ray image 80 acquired in S1.
- the X-ray image 80 was taken by the X-ray imaging device 30 while the axis 27 of the catheter 26 extended at least over the section from the first point P1 to the second point P2 within the biological lumen.
- the X-ray image 80 includes an image of at least a portion of the catheter 26, specifically, an image of the axis 27 of the catheter 26, and an image of the sensor 28. In the example shown in FIG. 3, the X-ray image 80 further includes an image of the guidewire 29.
- in S3, the control unit 21 images the three-dimensional structure of the biological lumen based on the multiple two-dimensional images acquired in S2 and generates a three-dimensional image 70 as shown in FIG. 5. Specifically, the control unit 21 stacks the multiple cross-sectional images generated or received in S2 to generate the three-dimensional image 70.
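The stacking in S3 can be illustrated with a minimal sketch: assuming each cross-sectional image generated in S2 is a grayscale frame of identical size, the frames are concatenated along the pullback axis to form voxel data. The function and variable names below are illustrative and not taken from the embodiment.

```python
import numpy as np

def stack_cross_sections(frames):
    """Stack N cross-sectional frames (each H x W) into an (N, H, W) volume.

    Axis 0 corresponds to the position of the sensor along the catheter axis,
    so consecutive slices follow the pullback through the section P1 -> P2.
    """
    return np.stack(frames, axis=0)

# usage sketch: volume = stack_cross_sections([frame_0, frame_1, frame_2])
```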
- in S4, the control unit 21 causes the three-dimensional image 70 generated in S3 to be displayed on the display 50 superimposed on the X-ray image 80 acquired in S1.
- for example, the control unit 21 updates the screen 51 displayed on the display 50 via the output unit 25 as shown in FIG. 4.
- in the second area 53, the three-dimensional image 70 generated in S3 and the human body model image 71 are displayed superimposed on the X-ray image 80 acquired in S1.
- the angle of the three-dimensional image 70 corresponds to the angle at which the surgical subject is viewed from the front; the upper side of the three-dimensional image 70 corresponds to the side where the head of the surgical subject is located, and the lower side corresponds to the side where the feet of the surgical subject are located. Therefore, the human body model image 71 is displayed so that the front is visible, with the head facing up and the feet facing down.
- the human body model image 71 is placed in the upper left of the three-dimensional image 70, but the human body model image 71 may be placed in any position near the three-dimensional image 70.
- the position of the human body model image 71 may be fixed or may be arbitrarily changeable.
- the three-dimensional image 70 may be cut out in part along the long axis direction of the catheter 26 so that the internal structure of the biological lumen can be observed, and the cut-out part may be made invisible.
- a cross-sectional image 72 is further displayed in the second area 53.
- the cross-sectional image 72 is a reduced version of the two-dimensional image 60 displayed in the first area 52.
- a camera icon 73 indicating the viewpoint position is placed on the edge of the cross-sectional image 72. This makes it easy to see from what direction the three-dimensional image 70 displayed in the second area 53 is viewed in the cross-sectional image 72.
- the orientation in which the three-dimensional image 70 is displayed on the display 50 is adjusted, manually or automatically, to the orientation seen from a viewpoint corresponding to the imaging angle at which the X-ray imaging device 30 captured the X-ray image 80.
- the relative position and size of the three-dimensional image 70 displayed on the display 50 with respect to the X-ray image 80 have already been adjusted in the example shown in FIG. 4, but at this stage they need not yet have been adjusted.
- FIG. 4 shows a first position L1 corresponding to the first point P1 and a second position L2 corresponding to the second point P2 on the X-ray image 80; the first position L1 and the second position L2 may be manually specified or automatically detected in S6.
- in S5, the control unit 21 identifies the position of the sensor 28 as the sensor position for each of the two-dimensional images included in the multiple two-dimensional images acquired in S2. Specifically, the control unit 21 identifies the position of the center of the two-dimensional image 60 obtained by the sensor 28 when the sensor 28 is located at the first point P1 as the first sensor position V1. The control unit 21 identifies the position of the center of the two-dimensional image 60 obtained by the sensor 28 when the sensor 28 is located at the second point P2 as the second sensor position V2.
- the control unit 21 identifies the positions of the center of the two-dimensional image 60 obtained by the sensor 28 when the sensor 28 is located at the third point P3 and the fourth point P4 that are different from the first point P1 and the second point P2 as the third sensor position V3 and the fourth sensor position V4, respectively.
- the control unit 21 may further identify the position of the center of the two-dimensional image 60 obtained by the sensor 28 when the sensor 28 is located at a point different from the first point P1, the second point P2, the third point P3, and the fourth point P4 as the sensor position.
- the control unit 21 does not need to identify the fourth sensor position V4.
- in S6, the control unit 21 sets a movement line 74 as shown in FIG. 6 in the three-dimensional image 70 generated in S3.
- the movement line 74 is a line extending from the position of the first sensor position V1 identified in S5 in the three-dimensional image 70 in a shape corresponding to the shape of the axis 27 of the catheter 26 in the section from the first point P1 to the second point P2 in the biological lumen.
- the shape of the axis 27 of the catheter 26 in the section from the first point P1 to the second point P2 in the biological lumen is identified using at least one X-ray image acquired in S1.
- the shape of the axis 27 of the catheter 26 from the first position L1 to the second position L2 on the X-ray image 80 acquired in S1 is identified as the shape of the axis 27 of the catheter 26 in the section from the first point P1 to the second point P2 in the biological lumen.
- in S7, the control unit 21 adjusts the shape of the three-dimensional image 70 generated in S3 to match the shape of the axis 27 of the catheter 26 in the section from the first point P1 to the second point P2 in the biological lumen, which is identified using at least one X-ray image. Specifically, as shown in FIG. 6, the control unit 21 adjusts the shape of the three-dimensional image 70 generated in S3 by placing each sensor position identified in S5 on the movement line 74 set in S6. For example, as shown in FIG. 7, the control unit 21 updates the screen 51 displayed on the display 50 via the output unit 25.
- Step S7 may be executed after S6, but is preferably executed simultaneously with S6.
- the control unit 21 updates the three-dimensional image 70 in response to changes in the movement line 74 while the movement line 74 is being set.
- in order to reduce the amount of calculation required when adjusting the shape of the three-dimensional image 70 in S7, it is desirable for the control unit 21 to convert the voxel data of the three-dimensional image 70 into triangular or quadrilateral mesh data when generating the three-dimensional image 70 in S3. Any method, such as the marching cubes method, Surface Nets, or Delaunay triangulation, can be used to convert the voxel data into mesh data. Any method, such as RMF or the Frenet-Serret formulas, can be used to deform the mesh data when adjusting the shape of the three-dimensional image 70. "RMF" is an abbreviation for rotation minimizing frames.
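As one hedged illustration of deforming such mesh data along the movement line 74, the sketch below computes rotation minimizing frames with the double-reflection method; each cross section can then be re-embedded in the frame at its sensor position. The centerline sampling and the choice of initial reference vector are assumptions for illustration, not details prescribed by this embodiment.

```python
import numpy as np

def rotation_minimizing_frames(points):
    """Return (tangents, references): a unit tangent and a unit reference
    vector at each centerline point, such that the frame (t, r, t x r)
    twists minimally along the curve (double-reflection method)."""
    points = np.asarray(points, dtype=float)
    tangents = np.gradient(points, axis=0)
    tangents /= np.linalg.norm(tangents, axis=1, keepdims=True)
    # pick any unit vector not parallel to the first tangent, then project
    helper = np.array([1.0, 0.0, 0.0])
    if abs(np.dot(helper, tangents[0])) > 0.9:
        helper = np.array([0.0, 1.0, 0.0])
    r = helper - np.dot(helper, tangents[0]) * tangents[0]
    r /= np.linalg.norm(r)
    frames = [r]
    for i in range(len(points) - 1):
        # first reflection: across the bisecting plane of p_i and p_{i+1}
        v1 = points[i + 1] - points[i]
        c1 = np.dot(v1, v1)
        r_l = frames[i] - (2.0 / c1) * np.dot(v1, frames[i]) * v1
        t_l = tangents[i] - (2.0 / c1) * np.dot(v1, tangents[i]) * v1
        # second reflection: aligns the reflected tangent with t_{i+1}
        v2 = tangents[i + 1] - t_l
        c2 = np.dot(v2, v2)
        frames.append(r_l - (2.0 / c2) * np.dot(v2, r_l) * v2)
    return tangents, np.asarray(frames)
```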
- the control unit 21 may cause the three-dimensional image 70 and the X-ray image 80 to be displayed separately on the display 50 in S4.
- the control unit 21 may further acquire, in S1, another X-ray image taken by the X-ray imaging device 30 at an angle different from that of the X-ray image 80, and may further adjust the shape of the three-dimensional image 70 in the depth direction using that other X-ray image in S4 to S7.
- the control unit 21 generates a three-dimensional image 70 by imaging the three-dimensional structure of the biological lumen based on a plurality of two-dimensional images 60 obtained by moving the sensor 28 provided on the catheter 26 inserted into the biological lumen along the axis 27 of the catheter 26 in the section from the first point P1 to the second point P2 in the biological lumen.
- the control unit 21 adjusts the shape of the three-dimensional image 70 to match the shape of the axis 27 of the catheter 26 in the section from the first point P1 to the second point P2 in the biological lumen, which is identified using at least one X-ray image including an image of the sensor 28 taken by an X-ray imaging device 30 that sees through the biological lumen.
- therefore, the shape of the biological lumen can be easily reproduced in the three-dimensional image 70.
- for example, when the biological lumen is a blood vessel, the shape of the blood vessel can be easily reproduced in the three-dimensional image 70.
- a first example of the detailed procedure of step S6 will now be described with reference to FIG. 8.
- in S601, the control unit 21 associates the first position L1 with the position of one end of the movement line 74 in response to an operation of manually specifying the first position L1 when the sensor 28 is at the first point P1. Specifically, the control unit 21 accepts an operation of clicking or touching, with the input device 40 via the input unit 24, the position where the sensor 28 is shown on the X-ray image taken by the X-ray imaging device 30 when the sensor 28 is at the first point P1. The control unit 21 receives, via the communication unit 23, a notification from the MDU that drives the catheter 26 regarding the position of the sensor 28 in the longitudinal direction of the catheter 26 when the sensor 28 is at the first point P1. "MDU" is an abbreviation for motor drive unit.
- the control unit 21 specifies, as the position of the first sensor position V1 in the three-dimensional image 70, the first sensor position V1 identified in S5 within the cross section that exists at the position notified by the MDU.
- the control unit 21 stores the clicked or touched position as the first position L1 and the position of the first sensor position V1 in the three-dimensional image 70 as one end position of the movement line 74 in the storage unit 22 in association with each other.
- in S602, the control unit 21 associates the second position L2 with the position of the other end of the movement line 74 in response to an operation of manually specifying the second position L2 when the sensor 28 is at the second point P2.
- specifically, the control unit 21 accepts an operation of clicking or touching, with the input device 40 via the input unit 24, the position at which the sensor 28 is shown on the X-ray image taken by the X-ray imaging device 30 when the sensor 28 is at the second point P2.
- the control unit 21 receives, via the communication unit 23, a notification from the MDU that drives the catheter 26 regarding the position of the sensor 28 in the longitudinal direction of the catheter 26 when the sensor 28 is at the second point P2.
- the control unit 21 specifies, as the position of the second sensor position V2 in the three-dimensional image 70, the second sensor position V2 identified in S5 within the cross section that exists at the position notified by the MDU.
- the control unit 21 stores the clicked or touched position as the second position L2 and the position of the second sensor position V2 in the three-dimensional image 70 as the other end position of the movement line 74 in the storage unit 22 in association with each other.
- in S603, the control unit 21 sets a connecting line 82 as shown in FIG. 9 on the X-ray image 80.
- the connecting line 82 is a line connecting the first position L1 and the second position L2.
- the control unit 21 places marks such as circles at the first position L1 and the second position L2 on the X-ray image 80 displayed on the screen 51, and draws a straight line connecting the placed marks as the connecting line 82.
- the X-ray image 80 was taken when the sensor 28 was at the first point P1, but it may be taken at any time as long as it was taken when the axis 27 of the catheter 26 was at least in the section from the first point P1 to the second point P2 in the biological lumen.
- the control unit 21 adjusts the relative position and size of the three-dimensional image 70 displayed on the display 50 with respect to the X-ray image 80 so that the position and length of the movement line 74 match the position and length of the connecting line 82. Specifically, the control unit 21 adjusts the relative position and size of the three-dimensional image 70 displayed on the display 50 with respect to the X-ray image 80 so that the positions of both ends of the movement line 74 stored in the memory unit 22 match the first position L1 and second position L2 stored in the memory unit 22 in association with them.
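Under the assumption that the display orientation has already been aligned in S4, matching the two endpoints of the movement line 74 to the first position L1 and the second position L2 reduces to a uniform scale plus a translation. The following sketch, with illustrative names not taken from the embodiment, fits that transform from the two point pairs.

```python
import numpy as np

def fit_scale_translation(end_a, end_b, target_a, target_b):
    """Return (scale, translation) mapping the movement-line endpoints onto
    the positions marked on the X-ray image, as p' = scale * p + translation."""
    end_a, end_b = np.asarray(end_a, float), np.asarray(end_b, float)
    target_a, target_b = np.asarray(target_a, float), np.asarray(target_b, float)
    scale = np.linalg.norm(target_b - target_a) / np.linalg.norm(end_b - end_a)
    translation = target_a - scale * end_a
    return scale, translation
```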
- in S604, the control unit 21 sets the shape of the movement line 74 in response to an operation to manually change the shape of the connecting line 82 set in S603. Specifically, the control unit 21 accepts an operation to bend the connecting line 82 by dragging any position of the connecting line 82 other than the first position L1 and the second position L2 on the X-ray image 80 displayed on the screen 51 with the input device 40 via the input unit 24. Each time the connecting line 82 is bent, the control unit 21 adjusts the shape of the movement line 74 so that it matches the bent shape of the connecting line 82.
- when step S7 is then executed, for example, the control unit 21 updates the screen 51 displayed on the display 50 via the output unit 25 as shown in FIG. 10.
- a second example of the detailed procedure of step S6 will now be described with reference to FIG. 11.
- Step S611 is similar to step S601 shown in FIG. 8, so a description of it will be omitted.
- in S612, the control unit 21 acquires ratio information indicating the ratio of the dimensions of the X-ray image 80 to the dimensions of the three-dimensional image 70. Specifically, when the actual scale of the X-ray image 80 is known, the control unit 21 calculates the ratio of the distance on the X-ray image 80 to the movement distance of the sensor 28 as the ratio information.
- in S613, based on the ratio information acquired in S612, the control unit 21 associates a position on the X-ray image 80 that is away from the first position L1 by a distance corresponding to the length of the movement line 74 with the position of the other end of the movement line 74, as the second position L2.
- the control unit 21 receives a notification from the MDU that drives the catheter 26 via the communication unit 23 regarding the position of the sensor 28 in the longitudinal direction of the catheter 26 when the sensor 28 is at the second point P2.
- the control unit 21 specifies, as the position of the second sensor position V2 in the three-dimensional image 70, the second sensor position V2 identified in S5 within the cross section that exists at the position notified by the MDU.
- the control unit 21 calculates the movement distance when the sensor 28 moves in the section from the first point P1 to the second point P2 based on the information notified by the MDU.
- the control unit 21 converts the calculated movement distance into a distance on the X-ray image 80 according to the ratio calculated in S612.
- the control unit 21 stores, in the storage unit 22, the position that is the converted distance away from the first position L1 specified in S611 as the second position L2, in association with the position of the second sensor position V2 in the three-dimensional image 70 as the position of the other end of the movement line 74.
- the direction in which the second position L2 exists relative to the first position L1 may be specified manually, or may be automatically determined according to the position of the axis 27 of the catheter 26 on the X-ray image 80.
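A hedged arithmetic sketch of S612 and S613: the pullback distance reported by the MDU is converted into image pixels using the acquired ratio information, and the second position L2 is placed that far from the first position L1 along a unit direction (specified manually or derived from the shaft position, per the preceding item). The pixel scale, direction, and function names below are illustrative assumptions.

```python
import numpy as np

def locate_second_position(first_pos, direction, pullback_mm, px_per_mm):
    """Place L2 on the X-ray image from L1, a direction along the shaft,
    the sensor travel reported by the MDU (mm), and the image scale."""
    d = np.asarray(direction, dtype=float)
    d /= np.linalg.norm(d)                       # normalize the direction
    offset_px = pullback_mm * px_per_mm          # S612: mm -> pixels
    return np.asarray(first_pos, dtype=float) + offset_px * d  # S613

# e.g. a 60 mm pullback at 2.5 px/mm places L2 150 pixels from L1.
```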
- Steps S614 and S615 are similar to steps S603 and S604 shown in FIG. 8, so a description of these steps will be omitted.
- a third example of the detailed procedure of step S6 will now be described with reference to FIG. 12.
- Steps S621 and S622 are similar to steps S601 and S602 shown in FIG. 8, respectively, and therefore will not be described.
- Step S622 may be replaced with steps S612 and S613 shown in FIG. 11. In such a modified example, the operation of manually specifying the second position L2 becomes unnecessary.
- in S623, the control unit 21 sets the shape of the movement line 74 to a shape corresponding to the connecting line 82 that connects the first position L1, the second position L2, and at least one further position, in response to an operation of manually specifying the at least one position on the X-ray image 80 different from the first position L1 and the second position L2.
- specifically, the control unit 21 accepts an operation of clicking or touching, with the input device 40 via the input unit 24, an arbitrary position on the shaft 27 of the catheter 26 other than the first position L1 and the second position L2 on the X-ray image 80 displayed on the screen 51, as a third position L3.
- the control unit 21 further accepts an operation of clicking or touching, via the input unit 24, an arbitrary position of the shaft 27 of the catheter 26 other than the first position L1, the second position L2, and the third position L3 on the X-ray image 80 displayed on the screen 51 as a fourth position L4 using the input device 40.
- the control unit 21 places marks such as circles at the first position L1, the second position L2, the third position L3, and the fourth position L4 on the X-ray image 80 displayed on the screen 51, and draws a straight line connecting all the placed marks as a connecting line 82.
- the control unit 21 may further accept an operation of clicking or touching any position of the axis 27 of the catheter 26 other than the first position L1, the second position L2, the third position L3, and the fourth position L4 on the X-ray image 80 displayed on the screen 51 with the input device 40 via the input unit 24. Alternatively, the control unit 21 may not accept an operation of clicking or touching the fourth position L4.
- the control unit 21 adjusts the relative position and size of the three-dimensional image 70 displayed on the display 50 with respect to the X-ray image 80 so that the position and length of the movement line 74 match the position and length of the connecting line 82. Specifically, the control unit 21 adjusts the relative position and size of the three-dimensional image 70 displayed on the display 50 with respect to the X-ray image 80 so that the positions of both ends of the movement line 74 stored in the memory unit 22 match the first position L1 and second position L2 stored in the memory unit 22 in association with them.
- the control unit 21 automatically changes the shape of the connecting line 82 each time a position other than the first position L1 and the second position L2 is specified, and changes the shape of the movement line 74 accordingly. Specifically, each time a position other than the first position L1 and the second position L2 is clicked or touched, the control unit 21 places a new mark such as a circle on the X-ray image 80 displayed on the screen 51 at the clicked or touched position, and updates the connecting line 82 so that it passes through the newly placed mark. The control unit 21 then adjusts the shape of the movement line 74 so that it matches the updated shape of the connecting line 82.
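One way to realize this per-click update, sketched below under illustrative assumptions, is to keep the clicked interior points ordered by their projection onto the chord from the first position L1 to the second position L2 and rebuild the polyline after every click; the ordering rule and the names are assumptions for illustration, not details of the embodiment.

```python
import numpy as np

def update_connecting_line(l1, l2, clicked_points):
    """Rebuild the polyline through L1, all clicked interior points, and L2,
    ordering interior points by their projection onto the L1 -> L2 chord."""
    l1, l2 = np.asarray(l1, float), np.asarray(l2, float)
    chord = l2 - l1

    def along_chord(p):
        return float(np.dot(np.asarray(p, float) - l1, chord))

    interior = sorted(clicked_points, key=along_chord)
    return [tuple(l1)] + [tuple(np.asarray(p, float)) for p in interior] + [tuple(l2)]
```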
- when step S7 is then executed, for example, the control unit 21 updates the screen 51 displayed on the display 50 via the output unit 25 as shown in FIG. 13.
- a fourth example of the detailed procedure of step S6 will now be described with reference to FIG. 14.
- Steps S631 and S632 are similar to steps S601 and S602 shown in FIG. 8, respectively, and therefore will not be described.
- Step S632 may be replaced with steps S612 and S613 shown in FIG. 11. In such a modified example, the operation of manually specifying the second position L2 becomes unnecessary.
- in S633, the control unit 21 automatically detects an image of the shaft 27 of the catheter 26 from the X-ray image 80.
- a known image recognition technique can be used as a method for detecting the image of the shaft 27 of the catheter 26.
- Machine learning such as deep learning may also be used.
- in S634, the control unit 21 sets the shape of the movement line 74 based on the image detected in S633. Specifically, the control unit 21 adjusts the relative position and size of the three-dimensional image 70 displayed on the display 50 with respect to the X-ray image 80 so that the positions of both ends of the movement line 74 stored in the memory unit 22 match the first position L1 and second position L2 stored in the memory unit 22 in association with them. The control unit 21 then adjusts the shape of the movement line 74 so that it matches the shape of the image detected in S633.
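As a hedged stand-in for such image recognition, the sketch below segments the radiopaque shaft by intensity thresholding and thins the mask to a one-pixel centerline. A production system would more likely use a trained detector; the use of scikit-image with Otsu thresholding here is an assumption for illustration only.

```python
import numpy as np
from skimage.filters import threshold_otsu
from skimage.morphology import skeletonize

def detect_shaft_centerline(xray):
    """xray: 2D grayscale array. Returns (row, col) pixels on the centerline
    of the catheter shaft, which appears darker than surrounding tissue."""
    mask = xray < threshold_otsu(xray)  # segment the radiopaque shaft
    return np.argwhere(skeletonize(mask))
```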
- a fifth example of the detailed procedure of step S6 will now be described with reference to FIG. 15.
- in the fifth example, step S632 in the fourth example is replaced with steps S612 and S613 shown in FIG. 11.
- Step S641 is similar to step S601 shown in FIG. 8, so a description will be omitted.
- Steps S642 and S643 are similar to steps S612 and S613 shown in FIG. 11, respectively, so a description will be omitted.
- Steps S644 and S645 are similar to steps S633 and S634 shown in FIG. 14, respectively, so a description will be omitted.
- instead of accepting an operation to manually specify the first position L1 when the sensor 28 is at the first point P1, the control unit 21 may automatically detect the first position L1 and associate it with the position of one end of the movement line 74. In such a modified example, the control unit 21 automatically detects the position where the sensor 28 is captured on the X-ray image taken by the X-ray imaging device 30 when the sensor 28 is at the first point P1.
- a known image recognition technique can be used as a method for detecting the position where the sensor 28 is captured.
- Machine learning such as deep learning may also be used.
- the control unit 21 receives a notification from the MDU that drives the catheter 26 via the communication unit 23 about the position of the sensor 28 in the longitudinal direction of the catheter 26 when the sensor 28 is at the first point P1.
- the control unit 21 identifies, as the position of the first sensor position V1 in the three-dimensional image 70, the first sensor position V1 identified in S5 within the cross section that exists at the position notified by the MDU.
- the control unit 21 stores the automatically detected position as the first position L1 and the position of the first sensor position V1 in the three-dimensional image 70 as the position of one end of the movement line 74 in the memory unit 22 in association with each other.
- instead of accepting an operation to manually specify the second position L2 when the sensor 28 is at the second point P2, the control unit 21 may automatically detect the second position L2 and associate it with the position of the other end of the movement line 74.
- specifically, the control unit 21 automatically detects the position where the sensor 28 is captured on the X-ray image taken by the X-ray imaging device 30 when the sensor 28 is at the second point P2.
- a known image recognition technique can be used as a method for detecting the position where the sensor 28 is captured.
- Machine learning such as deep learning may also be used.
- the control unit 21 receives a notification from the MDU that drives the catheter 26 via the communication unit 23 about the position of the sensor 28 in the longitudinal direction of the catheter 26 when the sensor 28 is at the second point P2.
- the control unit 21 identifies, as the position of the second sensor position V2 in the three-dimensional image 70, the second sensor position V2 identified in S5 within the cross section that exists at the position notified by the MDU.
- the control unit 21 stores the automatically detected position as the second position L2 and the position of the second sensor position V2 in the three-dimensional image 70 as the position of the other end of the movement line 74 in the memory unit 22 in association with each other.
- for steps S1A to S7A shown in FIG. 16, the description of the processes that are the same as steps S1 to S7 shown in FIG. 2 will be omitted or simplified as appropriate.
- X-ray images are continuously taken during the pull-back operation, which is the operation of pulling the sensor 28 toward the user, and each time a two-dimensional image is generated, that is, for each frame, the position of the sensor 28 on the corresponding X-ray image is continuously captured.
- a curved three-dimensional image 70 can be constructed in real time.
- i is set to "1" when steps S1A to S7A are executed for the first time
- i is set to "2" when steps S1A to S7A are executed for the last time
- i is set to "3" or greater when steps S1A to S7A are executed for the second or subsequent time excluding the last time, and is incremented by 1 each time. For example, if steps S1A to S7A are executed a total of four times, i will have the values "1", "3", "4", and "2" in that order. When i is "1", steps S3A, S4A, S6A, and S7A may be skipped. After S7A, steps S1A and onward are repeatedly executed until some end operation is performed, such as pressing the end button or ending the pullback.
- in S1A, the control unit 21 acquires an X-ray image taken by the X-ray imaging device 30 when the sensor 28 is located at the i-th point Pi in the biological lumen. Specifically, the control unit 21 receives, via the communication unit 23, the X-ray image taken when the sensor 28 is located at the i-th point Pi from the X-ray imaging device 30. This X-ray image needs to include at least an image of the sensor 28 and does not need to include an image of the shaft 27 of the catheter 26.
- in S2A, the control unit 21 acquires a two-dimensional image obtained by the sensor 28 when the sensor 28 is located at the i-th point Pi. Specifically, the control unit 21 receives, via the communication unit 23, reflected wave information regarding the reflected wave received by the sensor 28 when the sensor 28 is located at the i-th point Pi. The control unit 21 generates a cross-sectional image corresponding to the i-th point Pi as a two-dimensional image by imaging the cross-sectional structure of the biological lumen based on the received reflected wave information. Alternatively, the control unit 21 may receive a cross-sectional image corresponding to the i-th point Pi, generated by a separately provided IVUS device based on the reflected wave information, as a two-dimensional image from the IVUS device via the communication unit 23.
- Step S1A may be performed before S2A, simultaneously with S2A, or after S2A.
- in S3A, the control unit 21 generates a three-dimensional image 70 by imaging the three-dimensional structure of the biological lumen based on the two-dimensional image acquired in S2A and, if any, one or more two-dimensional images acquired earlier. Specifically, the control unit 21 generates the three-dimensional image 70 by stacking the cross-sectional image generated or received in S2A and, if any, one or more cross-sectional images generated or received earlier.
- in S4A, the control unit 21 causes the display 50 to display the three-dimensional image 70 generated in S3A superimposed on the X-ray image acquired in S1A.
- in S5A, the control unit 21 identifies the position of the sensor 28 as the sensor position for the two-dimensional image acquired in S2A. Specifically, the control unit 21 identifies the position of the center of the two-dimensional image obtained by the sensor 28 when the sensor 28 is located at the i-th point Pi as the i-th sensor position Vi.
- in S6A, the control unit 21 sets at least a portion of the movement line 74 in the three-dimensional image 70 generated in S3A.
- the movement line 74 is a line that extends from the position of the first sensor position V1 in the three-dimensional image 70 in a shape corresponding to the shape of the shaft 27 of the catheter 26 in the section from the first point P1 to the second point P2 in the biological lumen.
- the shape of the shaft 27 of the catheter 26 in the section from the first point P1 to the second point P2 in the biological lumen is identified using multiple X-ray images acquired in S1A up to when i becomes "2". If i is not "2", the shape of the shaft 27 of the catheter 26 from the first point P1 to partway through the section is identified using the X-ray images acquired in S1A up to that point.
- the control unit 21 automatically detects an image of the sensor 28 from the X-ray image acquired in S1A. That is, the control unit 21 automatically detects the position where the sensor 28 is imaged on the X-ray image acquired in S1A.
- a known image recognition technique can be used as a method for detecting the position where the sensor 28 is imaged.
- Machine learning such as deep learning may also be used.
- the control unit 21 receives a notification from the MDU driving the catheter 26 via the communication unit 23 regarding the position of the sensor 28 in the longitudinal direction of the catheter 26 when the sensor 28 is present at the i-th point Pi.
- the control unit 21 identifies, as the position of the i-th sensor position Vi in the three-dimensional image 70, the i-th sensor position Vi identified in S5A within the cross section that exists at the position notified by the MDU.
- the control unit 21 stores, in the storage unit 22, the automatically detected position as the i-th position Li, in association with the position of the i-th sensor position Vi in the three-dimensional image 70 as the i-th set position of the movement line 74.
- the first set position of the movement line 74 corresponds to the position of one end of the movement line 74.
- the second set position of the movement line 74 corresponds to the position of the other end of the movement line 74.
- in S7A, the control unit 21 adjusts the shape of the three-dimensional image 70 generated in S3A to match the shape of the axis 27 of the catheter 26 in the section from the first point P1 to the second point P2 in the biological lumen, which was identified using the multiple X-ray images acquired in S1A up to the time when i becomes "2". If i is not yet "2", the control unit 21 adjusts the shape of the three-dimensional image 70 generated in S3A to match the shape of the axis 27 of the catheter 26 from the first point P1 to the middle of the section, which was identified using the X-ray images acquired in S1A up to that point.
- specifically, the control unit 21 adjusts the shape of the three-dimensional image 70 generated in S3A by placing the sensor positions identified in S5A up to that point on the portion of the movement line 74 set in S6A up to that point. More specifically, the control unit 21 adjusts the relative position and size of the three-dimensional image 70 displayed on the display 50 with respect to the X-ray image 80, as well as the shape of the three-dimensional image 70, so that each set position of the movement line 74 stored in the memory unit 22 matches the position on the X-ray image 80 displayed on the screen 51 that has already been associated with it and stored in the memory unit 22.
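The per-frame flow of S1A to S7A can be summarized with the sketch below. Every function named here is a hypothetical stand-in (for example, detect_sensor, deform_along, and render_overlay are not defined in this disclosure), and stack_cross_sections is the stacking sketch given earlier.

```python
def realtime_pullback_loop(xray_stream, ivus_stream, pullback_ended):
    """Grow the volume and the movement line one frame at a time."""
    movement_line, frames = [], []
    for xray, ivus_frame in zip(xray_stream, ivus_stream):  # S1A and S2A
        frames.append(ivus_frame)
        volume = stack_cross_sections(frames)    # S3A: restack the volume
        sensor_px = detect_sensor(xray)          # S6A: auto-detect the sensor
        movement_line.append(sensor_px)          # extend the movement line
        deformed = deform_along(volume, movement_line)  # S7A: adjust shape
        render_overlay(deformed, xray)           # S4A: superimposed display
        if pullback_ended():
            break
```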
- Steps S5A to S7A may be performed before S4A.
- the control unit 21 associates a first position L1 corresponding to the first point P1 on an X-ray image taken by the X-ray imaging device 30 when the sensor 28 is at the first point P1 in the biological lumen with the position of one end of the movement line 74.
- the control unit 21 associates a second position L2 corresponding to the second point P2 on an X-ray image taken by the X-ray imaging device 30 when the sensor 28 is at the second point P2 in the biological lumen with the position of the other end of the movement line 74.
- the control unit 21 automatically detects an image of the sensor 28 from each of the multiple X-ray images taken by the X-ray imaging device 30 while the sensor 28 moves through the section from the first point P1 to the second point P2 in the biological lumen.
- the control unit 21 sets the shape of the movement line 74 based on the detected images.
- a curved three-dimensional image 70 can be constructed in real time. Although a corresponding X-ray image is required for each cross-sectional image, it is sufficient that at least the sensor 28 is visible in each X-ray image, and the axis 27 of the catheter 26 does not have to be visible.
- the present disclosure is not limited to the above-described embodiment.
- two or more blocks shown in the block diagram may be integrated, or one block may be divided.
- the steps may be executed in parallel or in a different order depending on the processing capacity of the device executing each step, or as necessary.
- Other modifications are possible without departing from the spirit of the present disclosure.
- the control unit 21 may not display the X-ray image 80 on the display 50 in S4.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Medical Informatics (AREA)
- Animal Behavior & Ethology (AREA)
- Molecular Biology (AREA)
- Biophysics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Veterinary Medicine (AREA)
- Pathology (AREA)
- Radiology & Medical Imaging (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Public Health (AREA)
- Surgery (AREA)
- Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- High Energy & Nuclear Physics (AREA)
- Optics & Photonics (AREA)
- Human Computer Interaction (AREA)
- Apparatus For Radiation Diagnosis (AREA)
Abstract
An image processing device generates a three-dimensional image by imaging a three-dimensional structure of a biological lumen based on a plurality of two-dimensional images obtained by moving a sensor, which is provided on a catheter inserted into the biological lumen, through a section from a first point to a second point in the biological lumen along the shaft of the catheter. The image processing device comprises a control unit that adjusts the shape of the three-dimensional image to match the shape of the catheter shaft in the section, the shape being identified using at least one X-ray image, including an image of at least a portion of the catheter, captured by an X-ray imaging device that sees through the biological lumen.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2023-169236 | 2023-09-29 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025070545A1 | 2025-04-03 |
Family
ID=95201554
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2024/034284 (WO2025070545A1, pending) | Image processing device, image processing system, image processing method, and image processing program | 2023-09-29 | 2024-09-25 |
Country Status (1)
| Country | Link |
|---|---|
| WO | WO2025070545A1 |
Citations (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6148095A (en) * | 1997-09-08 | 2000-11-14 | University Of Iowa Research Foundation | Apparatus and method for determining three-dimensional representations of tortuous vessels |
| US20070093710A1 (en) * | 2005-10-20 | 2007-04-26 | Michael Maschke | Cryocatheter for introduction into a body vessel together with medical investigation and treatment equipment |
| JP2010526556A (ja) * | 2006-11-22 | 2010-08-05 | Koninklijke Philips Electronics N.V. | Combining X-ray with intravascularly acquired data |
| WO2015045368A1 (fr) * | 2013-09-26 | 2015-04-02 | Terumo Corporation | Image processing device, image display system, imaging system, image processing method, and program |
| JP2017131348A (ja) * | 2016-01-26 | 2017-08-03 | Terumo Corporation | Image display device, control method therefor, and radiopaque marker detection method |
| US20180168732A1 (en) * | 2016-12-16 | 2018-06-21 | General Electric Company | Combination Of 3D Ultrasound And Computed Tomography For Guidance In Interventional Medical Procedures |
| US20220087752A1 (en) * | 2017-12-05 | 2022-03-24 | Covidien Lp | Multi-rigid registration of magnetic navigation to a computed tomography volume |
| JP2022523445A (ja) * | 2019-03-14 | 2022-04-22 | Koninklijke Philips N.V. | Dynamic interventional three-dimensional model deformation |
- 2024-09-25: WO application PCT/JP2024/034284 filed; published as WO2025070545A1 (pending)
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24872331; Country of ref document: EP; Kind code of ref document: A1 |