
WO2025087777A1 - Lost position warning for endovascular devices - Google Patents

Lost position warning for endovascular devices

Info

Publication number
WO2025087777A1
WO2025087777A1 (PCT/EP2024/079284; EP2024079284W)
Authority
WO
WIPO (PCT)
Prior art keywords
interventional device
movement
interventional
monitored
unintended
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/EP2024/079284
Other languages
English (en)
Inventor
Hendrik MANDELKOW
Torre Michelle BYDLON
Alyssa Torjesen
Leili SALEHI
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Koninklijke Philips NV
Original Assignee
Koninklijke Philips NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips NV filed Critical Koninklijke Philips NV
Publication of WO2025087777A1 publication Critical patent/WO2025087777A1/fr
Pending legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/30 Surgical robots
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/37 Surgical systems with images on a monitor during operation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 17/00 Surgical instruments, devices or methods
    • A61B 2017/00017 Electrical control of surgical instruments
    • A61B 2017/00115 Electrical control of surgical instruments with audible or visual output
    • A61B 2017/00119 Electrical control of surgical instruments with audible or visual output alarm; indicating an abnormal situation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2051 Electromagnetic tracking systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2061 Tracking techniques using shape-sensors, e.g. fiber shape sensors with Bragg gratings
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2065 Tracking using image or pattern recognition
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/30 Surgical robots
    • A61B 2034/301 Surgical robots for introducing or steering flexible instruments inserted into the body, e.g. catheters or endoscopes
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/37 Surgical systems with images on a monitor during operation
    • A61B 2090/374 NMR or MRI
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/37 Surgical systems with images on a monitor during operation
    • A61B 2090/376 Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/37 Surgical systems with images on a monitor during operation
    • A61B 2090/378 Surgical systems with images on a monitor during operation using ultrasound

Definitions

  • the following relates generally to endovascular therapy. More particularly, embodiments herein relate to interventional devices losing position during endovascular therapy.
  • FEVAR: Fenestrated Endovascular Aortic Repair. SFA: superficial femoral artery.
  • During FEVAR, for example, a guidewire may be navigated into a branching vessel, such as the celiac artery, through endograft fenestrations. The guidewire is then parked in that branching vessel until the interventionalist is ready to bring in a stent or balloon catheter to continue the procedure.
  • Similarly, a guidewire or catheter may be navigated into a pedal artery where it will be parked while the interventionalist is preparing or using other devices in another area of the limb.
  • Interventional devices (e.g., guidewires, catheters, or therapy devices) may be parked in position, e.g., inside a cannulated branch of the vasculature, such as a side branch of the aorta or the femoral artery.
  • Patient motion or the manipulation of other devices can unintentionally cause such parked interventional devices (e.g., guidewires and/or catheters) to slip from their parked position. This slippage may delay or even foil the ongoing procedure.
  • The loss of position may go unnoticed if the tip of the device is not actively observed (e.g., by fluoroscopy). This not only increases procedural time but can also lead to device breakage or vessel injury, such as perforation of the vessel wall.
  • Interventional device lost position support includes 1) a system that measures any changes in the position of an endovascular device over time (e.g., via fluoroscopy, Fiber Optic RealShape (FORS) interventional device tracking, electromagnetic interventional device tracking, etc.), 2) an algorithm that detects or predicts any critical change in device position to sound an alarm or provide a warning, and 3) a procedure that defines the critical motion parameters algorithmically or by human input. While the techniques described herein refer to “endovascular” procedures, it will be appreciated that these techniques are equally applicable to endoluminal procedures (e.g., endobronchial procedures, intracardiac procedures, and/or the like).
  • An interventional device displacement warning system for patients is provided. Such an interventional device displacement warning system monitors a shape and a movement of an interventional device in an anatomy of a patient. A determination is made that the interventional device has been placed in a parked position at a target location based on the monitored shape and movement of the interventional device. A determination is made that the interventional device has been displaced from the parked position based on the monitored movement relative to the target location.
  • In an embodiment, a system includes a processor and a memory communicatively coupled to the processor. The memory stores logic that includes a set of instructions executable by the processor, which, when executed by the processor, cause the processor to monitor a shape and a movement of an interventional device in an anatomy of a patient. A determination is made that the interventional device has been placed in a parked position at a target location based on the monitored shape and movement of the interventional device. A determination is made that the interventional device has been displaced from the parked position based on the monitored movement relative to the target location.
  • At least one non-transitory machine-readable storage includes a set of instructions, which when executed by a computing device, cause the computing device to monitor a shape and a movement of an interventional device in an anatomy of a patient. A determination is made that the interventional device has been placed in a parked position at a target location based on the monitored shape and movement of the interventional device. Future movement of the interventional device is predicted relative to the target location based on the monitored movement of the interventional device. An alarm is triggered when the predicted future movement of the interventional device is predicted to be displaced beyond a displacement threshold value.
  • a method includes monitoring a shape and a movement of an interventional device in an anatomy of a patient. A determination is made that the interventional device has been placed in a parked position at a target location based on the monitored shape and movement of the interventional device. A determination is made that the interventional device has been displaced from the parked position based on the monitored movement relative to the target location.
  • FIG. 1 is a schematic diagram illustrating an example C-arm imaging system for use in interventional device lost position warnings, in accordance with some aspects of the present disclosure.
  • FIG. 2 is an illustration of a block diagram of an example interventional device lost position warning system according to an embodiment.
  • FIG. 3 is an illustration of an interventional treatment according to an embodiment.
  • FIG. 4 is an illustration of a flowchart of a method for managing an interventional device lost position warning system according to an embodiment.
  • FIG. 5 is an illustration of a flowchart of another method for managing an interventional device lost position warning system according to an embodiment.
  • FIG. 6 is an illustration of a flowchart of a further method for managing an interventional device lost position warning system according to an embodiment.
  • FIG. 7 is an illustration of a flowchart of a still further method for managing an interventional device lost position warning system according to an embodiment.
  • FIG. 8 is an illustration of a block diagram of a computer program product according to an embodiment.
  • FIG. 9 is a further illustration of a system according to an embodiment.
  • FIG. 10 is an illustration of a hardware apparatus including a semiconductor package according to an embodiment.
  • FIG. 1 is a schematic diagram illustrating an example interventional imaging system 100, in accordance with some aspects of the present disclosure.
  • the system 100 includes one or more processors 110 that are configured to perform one or more aspects of the below-described methods.
  • the system 100 may include an X-ray imaging system 120, which may be configured to provide a single X-ray image, or a temporal sequence of X-ray images also known as fluoroscopy images.
  • the X-ray imaging system 120 may also be configured to provide a temporal sequence of contrast-enhanced images, from which digital subtraction angiography (DSA) images are generated by subtracting a mask image (and upon which motion compensation may be performed).
  • X-ray imaging system 120 is illustrated as a C-arm X-ray system with the top of the X-ray imaging system 120 being the X-ray detector and the bottom of the X-ray imaging system 120 being the X-ray source.
  • the system 100 may also include a display 130 for displaying the acquired X-ray and/or DSA image.
  • the system 100 may also include a patient bed 140.
  • the system 100 may also include a user interface device such as a keyboard, and/or a pointing device such as a mouse, and/or a joystick to control any of the components of the system 100. These items may be in communication with each other via wired or wireless communication.
  • FIG. 2 is an illustration of a block diagram of an example interventional device lost position warning system 200 according to an embodiment.
  • the interventional device lost position warning system 200 includes an imaging system 220 (e.g., such as the example interventional imaging system 100 of FIG. 1), a workstation 240, an Artificial Intelligence (AI) controller 270 and an AI training system 280.
  • the imaging system 220 captures medical images that include imagery of an interventional medical device 201.
  • the interventional medical device 201 may enable the administration of one or more patient procedures (e.g., endovascular therapy).
  • an interventional medical device 201 is used to perform endovascular coiling, flow diverter deployment, stent deployment, balloon angioplasty therapy, atherectomy therapy, the like, and/or combinations thereof.
  • the imaging system 220 may include one or more medical imaging devices, medical imaging systems, the like, and/or combinations thereof.
  • the imaging system 220 may include an interventional imaging device including an X-ray device (such as a fixed C-arm imaging system and/or a mobile C-arm imaging device), an ultrasound device, and/or the like.
  • X-ray imaging systems include a fixed C-arm X-ray system such as Azurion from Koninklijke Philips N.V. or a mobile X-ray system such as Veradius from Koninklijke Philips N.V.
  • the imaging system 220 may include a non-interventional imaging device such as a magnetic resonance imaging (MRI) device, a computed tomography (CT) device, and/or the like.
  • the imaging system 220 may include a radiology information system (RIS), a picture archiving and communication system (PACS), the like, and/or combinations thereof, where images acquired from interventional or non-interventional imaging systems may be stored.
  • the imaging system 220 is associated with standardized format medical imaging information to acquire, transmit, store, retrieve, print, process, and display medical imaging information (e.g., Digital Imaging and Communications in Medicine (DICOM) data).
  • Machine learning, artificial intelligence and/or image processing techniques can be used to segment and track the interventional medical device 201.
  • the interventional medical device may be tracked in the image during live fluoroscopic imaging or other imaging procedures performed by imaging system 220.
  • the interventional medical device 201 may itself provide such data.
  • a fiber optic shape sensing system may be built into the interventional medical device 201 (or an electromagnetic tracking device may be built into the interventional medical device 201).
  • such a fiber optic shape sensing system may be a Fiber Optic RealShape (FORS) system.
  • FORS is an alternative guidance system for endovascular surgery. FORS uses light reflected along optical fibers embedded within wires and catheters to generate real-time, high-fidelity, three-dimensional (3D) reconstructions of endovascular devices (referred to herein as “3D device reconstructions”) without fluoroscopy.
  • gratings are manufactured into the fiber cores (filaments), which reflect specific wavelengths of light when a bend is applied.
  • a laser light source and sensor are used to measure light backscatter.
  • FORS-enabled devices are embedded with an optical fiber composed of multiple fiber cores, enabling differentiation of axial force and twisting from the applied bend.
  • Similar 3D device reconstructions may also be provided via an electromagnetic tracking device built into the interventional medical device 201.
  • One or multiple electromagnetic sensors could be embedded throughout the length of the interventional device, and based on the location and orientation of those sensors, an estimate of the shape, location, and orientation of the interventional device could be captured.
  • the electromagnetic sensors are tracked by an electromagnetic tracker installed below the operating table.
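  • By way of illustration only, the shape estimate described above might be computed by interpolating the sparse sensor poses along the device, for example as in the following sketch (the function name and sampling choices are illustrative assumptions, not part of this disclosure):

```python
import numpy as np
from scipy.interpolate import CubicSpline

def estimate_shape(sensor_arclen_mm, sensor_xyz_mm, n_samples=100):
    """Interpolate a smooth 3D centerline through sparse electromagnetic
    sensor positions embedded along the length of the device.
    sensor_arclen_mm: 1D array, arc-length position of each sensor.
    sensor_xyz_mm: (N, 3) array of tracked sensor coordinates.
    Returns an (n_samples, 3) estimate of the device shape."""
    spline = CubicSpline(sensor_arclen_mm, sensor_xyz_mm, axis=0)
    s = np.linspace(sensor_arclen_mm[0], sensor_arclen_mm[-1], n_samples)
    return spline(s)
```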
  • a video system 291 may be utilized to monitor conditions of the interventional device and its surroundings outside the patient body (referred to herein as “external conditions”). For example, the video system 291 may monitor movement of a patient 214, whether a user is actively engaged with the interventional medical device 201 (e.g., has hands on the device), and/or monitor the exterior portions of the interventional medical device 201 itself.
  • the workstation 240 includes a controller 250, an interface 253, a monitor 255 and a user interface 256.
  • the controller 250 includes a memory 251 that stores instructions and a processor 252 that executes the instructions.
  • the AI controller 270 includes a memory 271 that stores instructions and a processor 272 that executes the instructions.
  • the AI controller 270 may dynamically implement trained machine learning models based on images received by the workstation 240 from the imaging system 220, 3D device reconstructions from interventional medical device 201, and/or external conditions from video system 291.
  • the AI controller 270 is integrated with the workstation 240.
  • functionality of the AI controller 270 as described herein may be performed by the controller 250.
  • the AI controller 270 may include a network computer configured to receive a stream of a plurality of images from a system computer implemented by the workstation 240, 3D device reconstructions from interventional medical device 201, and/or external conditions from video system 291 and apply the trained machine learning model to the plurality of inputs.
  • the significance of the input may be determined according to a standardized metric that may be applied across different clinical sites and users.
  • the metric used to determine the significance may vary based on the context in which the inputs are being analyzed. For example, the metric may vary based on the subject matter being sought as triggers, and the subject matter may include medical instruments, anatomical structures, motion(s), or presence of people.
  • the significance metric may be determined by experts in the field(s) in which the machine learning model described herein is applied. Accordingly, the significance may be appropriately termed a contextual significance that reflects the context of the subject matter in the input deemed significant.
  • the AI training system 280 includes an AI training controller 281.
  • the AI training controller 281 may include a memory (not shown) that stores instructions and a processor (not shown) that executes the instructions.
  • the AI training system 280 may train machine learning models as described herein and provide the trained machine learning model to the AI controller 270 and/or the workstation 240 in FIG. 2.
  • the AI training system 280 may include a controller configured to train the machine learning model.
  • the machine learning model may be a convolutional neural network (CNN) model, a temporal convolutional network (TCN) model, a recurrent neural network (RNN) model, a transformer model, a Hidden Markov model (HMM), and so forth.
  • the controller 250 may be a data processing controller that is configured to receive consecutive fluoroscopy images from the imaging system 220, 3D device reconstructions from interventional medical device 201, and/or external conditions from video system 291.
  • the fluoroscopy images may be images that are generated and stored by the imaging system 220 during a clinical procedure.
  • the fluoroscopy images may be further enhanced by other information from the imaging system 220 such as a C-arm position, radiation dose, identification of procedure phase, identification of the interventional medical device 201 in the image, image generation settings, type of scanning protocol, as well as patient characteristics including information from the patient's electronic medical record (EMR).
  • the interface 253 interfaces the workstation 240 to the imaging system 220.
  • the interface 253 may include a receiver that receives a stream of a plurality of images from the X-ray imaging system 220.
  • the interface 253 may be a port or an adapter that accepts a cable line from the imaging system 220.
  • the monitor 255 displays images generated by the imaging system 220 and/or 3D device reconstructions from interventional medical device 201.
  • the monitor 255 may also display instructions or alerts for a clinician using the interventional device lost position warning system 200.
  • the user interface 256 accepts touch instructions from a clinician, such as instructions input via a mouse or keyboard.
  • the user interface 256 may also accept touch input such as via a keypad, touchpad, touchscreen, or the like.
  • the workstation 240 may receive time-series interventional images such as fluoroscopy X-ray images from the imaging system 220.
  • the workstation 240 may also receive digital subtraction angiography (DSA) images from the imaging system 220.
  • the workstation 240 may also receive radiation exposure information, and other system information such as C-arm position, radiation settings and table position.
  • the workstation 240 applies trained machine learning models to raw sensor data (e.g., x-ray images, ultrasound images, fiber optic interventional device data, electromagnetic interventional device data, video images, etc.) to monitor a shape and a movement of an interventional device 201 in an anatomy of a patient, determine that the interventional device 201 has been placed in a parked position at a target location based on the monitored shape and movement of the interventional device 201, and determine that the interventional device 201 has been displaced from the parked position based on the monitored movement relative to the target location.
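  • As one non-limiting sketch of this determination logic, the parked/displaced decisions could be organized as a small state machine fed by the fused tip position (the class names and default thresholds here are illustrative assumptions, not values from this disclosure):

```python
from enum import Enum, auto
import numpy as np

class DeviceState(Enum):
    NAVIGATING = auto()
    PARKED = auto()
    DISPLACED = auto()

class ParkMonitor:
    """Tracks parked/displaced status of a device tip at a target location."""
    def __init__(self, target_mm, park_tol_mm=0.5, displace_mm=2.0):
        self.target = np.asarray(target_mm, float)
        self.park_tol = park_tol_mm      # how close counts as "at the target"
        self.displace = displace_mm      # monitored-movement alarm threshold
        self.state = DeviceState.NAVIGATING

    def update(self, tip_mm):
        d = np.linalg.norm(np.asarray(tip_mm, float) - self.target)
        if self.state is DeviceState.NAVIGATING and d < self.park_tol:
            self.state = DeviceState.PARKED         # device placed at target
        elif self.state is DeviceState.PARKED and d > self.displace:
            self.state = DeviceState.DISPLACED      # caller raises the alarm
        return self.state
```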
  • a robotic control manipulator 290 is coupled to the interventional medical device 201.
  • the robotic control manipulator 290 is configured to perform one or more operations in response to an alarm.
  • such operations may be utilized to inhibit movement of the interventional medical device 201, relieve rotational torque of the interventional medical device 201, relieve linear tension of the interventional medical device 201, the like, and/or combinations thereof.
  • FIG. 3 is an illustration of an interventional treatment of a patient vasculature 300 according to an embodiment. It will be understood that the techniques described herein are applicable to any interventional procedure where the motion of the interventional device from a parked position needs to be monitored. As illustrated, an interventional medical device 201 cannulates a blood vessel 301 past a branching point 303.
  • the interventional device 201 may be utilized to access patient vasculature 300 or other anatomical structures accessible through patient vasculature (e.g., using catheters, guidewires, sheaths, and so on) and/or to deliver treatment to the patient (e.g., using stents, coils, flow diverters, and so on).
  • the interventional device 201 may enable the administration of one or more patient procedures (e.g., endovascular therapy).
  • endovascular therapy refers to endovascular coiling, flow diverter deployment, stent deployment, balloon angioplasty, atherectomy procedures, the like, and/or combinations thereof.
  • an interventional medical device 201 enters a “parked” position when the end 309 of the interventional medical device 201 remains at rest at a target location. Such parking may occur while a user of the interventional medical device 201 is busy with another task. During such parking, it is often important for the end 309 of the interventional medical device 201 to remain in place.
  • the target location is a location adjacent to the branching point 303 where a branching vessel 311 extends from another blood vessel 301.
  • FIG. 4 is an illustration of a flowchart of a method 400 for managing interventional device displacement warnings according to an embodiment.
  • method 400 includes human operator 410 interaction with an environment 402 (e.g., including a patient and various devices, etc.).
  • observational systems 404 obtain data regarding the patient, the human operator 410, and the interventional devices.
  • various aspects of the environment are captured by sensors (e.g., x-ray images, ultrasound images, fiber optic interventional device data, electromagnetic interventional device data, video images, etc.).
  • the position of interventional devices relative to patient anatomy is monitored by x-ray imaging, a video camera monitors the external parts of those devices possibly in contact with moving objects like an x-ray C-arm or people in the operating room, and/or a fiber optic interventional device provides shape information of the interventional device itself.
  • the observational systems 404 provide raw sensor data (e.g., x-ray images, ultrasound images, fiber optic interventional device data, electromagnetic interventional device data, video images, etc.) to computer vision and/or AI algorithms that extract relevant features from the data.
  • Such features based on AI could be predictive in nature (e.g., by predicting the position of a device into the future).
  • relevant thresholds may be defined automatically (e.g., algorithmically via AI) or by human input based on the extracted features and possibly the imaging data. For example, a physician may point and click on an x-ray image to identify the tip of a device and a critical point in the anatomy to set a threshold distance. In another example, an AI algorithm may be trained on such human input to do this task automatically.
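  • A minimal sketch of that point-and-click workflow follows; the helper name, the pixel-spacing parameter, and the 10% safety fraction (echoing the precision guideline discussed below) are illustrative assumptions:

```python
import numpy as np

def threshold_from_clicks(tip_px, critical_px, px_spacing_mm):
    """Turn two physician clicks on an x-ray image -- the device tip and a
    critical anatomical point -- into a displacement threshold in mm."""
    gap_mm = np.linalg.norm(np.asarray(tip_px, float)
                            - np.asarray(critical_px, float)) * px_spacing_mm
    return 0.10 * gap_mm  # alarm well before the tip nears the critical point

# Example: clicks at (412, 280) and (455, 305) with 0.2 mm pixels -> ~1.0 mm
```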
  • the extracted features and associated thresholds are passed to a monitoring system to determine if the associated thresholds are violated. For example, this could be based on a single distance measure (e.g., a distance of device tip from a branching point) or it could be a complex machine learning prediction based on multiple features. For example, if the patient moves and the electromagnetic tracker equipped interventional device also moves but in a different direction, complex machine learning prediction based on multiple features may predict a threshold being exceeded by unintended movement of the interventional device.
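  • The patient-versus-device example above could be scored, in its simplest form, by comparing the two displacement vectors. The heuristic below (all weights assumed) is one sketch of such a multi-feature check, which in practice might be replaced by a learned model:

```python
import numpy as np

def unintended_motion_score(patient_disp_mm, device_disp_mm):
    """Score in [0, 1]: patient and device both moved, but in different
    directions, suggesting the device motion is unintended slippage."""
    p = np.asarray(patient_disp_mm, float)
    d = np.asarray(device_disp_mm, float)
    if np.linalg.norm(p) < 1e-6 or np.linalg.norm(d) < 1e-6:
        return 0.0
    cos_sim = float(np.dot(p, d) / (np.linalg.norm(p) * np.linalg.norm(d)))
    # Opposed motion (cos_sim near -1) scores high; co-moving scores low.
    return 0.5 * (1.0 - cos_sim) * min(float(np.linalg.norm(d)), 1.0)
```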
  • a visual, acoustic, or other sensory warning is generated to alert the human operators 410 to take corrective action (e.g., to adjust a device manually or stop the interfering motion of other equipment like the C-arm, etc.). Additionally, or alternatively, the alarm may trigger a robotic control manipulator to react instead of, or in addition to, a human operator (e.g., to inhibit movement of the interventional device).
  • interventional device lost position support includes 1) a system that measures any changes in the position of an endovascular device over time (e.g., via fluoroscopy, Fiber Optic RealShape (FORS) interventional device tracking, electromagnetic interventional device tracking, etc.), 2) an algorithm that detects or predicts any critical change in device position to sound an alarm or provide a warning, and 3) a procedure that defines the critical motion parameters algorithmically or by human input.
  • Some implementations herein envisage a warning system that monitors the position of an endovascular device inside a vessel and detects any movement that will lead to a critical displacement of the device from its intended position.
  • x-ray fluoroscopy, Fiber Optic RealShape (FORS) interventional device tracking, electromagnetic interventional device tracking, or similar data may be used as input to a device tracking algorithm.
  • a video surveillance system may be utilized that monitors the external parts of interventional devices, the hands of the interventionalist with respect to the interventional devices, patient movement, or the like to detect critical motion.
  • any kind of endovascular device tracking technology with sufficient spatial and temporal resolution may be utilized.
  • the temporal and spatial resolution will be such that the system can alert the human interventionalist in time to react and prevent device slippage.
  • the required spatial resolution depends on the anatomical context of the operation (e.g., the size of cannulated vessels and the required precision of the device position to be monitored and maintained). In most cases the required precision will be a small fraction (e.g., 10%) of the distance between the tip of the cannulating device and the most distal branching point.
  • a tracking resolution of 1 mm or less is achievable by multiple device tracking technologies like x-ray fluoroscopy, FORS, or electromagnetic sensing.
  • the required temporal resolution depends on the typical reaction time of a (human) interventionalist as well as the maximal speed of involuntary motion for the monitored endovascular device (e.g., motion that is not the result of intentional manipulation). Given a typical human reaction time of roughly 200 ms, a temporal resolution well below 1 s is achievable by multiple device tracking technologies like x-ray fluoroscopy, FORS, or electromagnetic sensing.
  • An automatic warning system utilizes a tracking algorithm to monitor the aforementioned device position data for critical changes.
  • the specifics of this tracking algorithm will depend on the chosen input modality and on the selection of critical features / motion parameters discussed below.
  • the tracking algorithm will continuously extract relevant features like the location of the device tip or other relevant parts as well as boundaries located at some relative distance and/or defined by proximity to other features of the surrounding anatomy or devices.
  • Computer vision methods including neural networks are utilized for localizing such features in imaging data.
  • the tracking algorithm will monitor distances and positions of relevant features to sound the alarm when predetermined displacement threshold safety margins are exceeded. In the simplest case this would mean alerting the user when a device has moved more than a predefined distance from its intended position or has come too close to an undesired position.
  • a dynamic model of device motion could be added to make a prediction that extrapolates device motion from recent history and may thus afford a longer warning time.
  • computer vision and machine learning methods can be used to extract information about device position in relation to relevant anatomical structures like the branching point of the cannulated vessel.
  • feature tracking algorithms for tracking image features in video sequences over time.
  • the position of the device in the anatomy may also be determined based on information from tracked devices, such as using shape features of a FORS equipped interventional device to determine if it is parked in a cannulated vessel or tracking the path that an electromagnetic (EM) equipped interventional device has travelled to determine if it is parked in a cannulated vessel.
  • Identifying the relevant features of the patient anatomy and the device for tracking may be done from user input (e.g., by marking the device tip and an anatomical location that will trigger the alarm if approached). Alternatively, these features can be identified automatically using computer vision and machine learning algorithms trained on previously recorded data.
  • the algorithm that predicts device slippage and triggers the warning can be varied in complexity.
  • the simplest option might be a predefined threshold on the distance between the device tip and another location along the same device (e.g., the nearest branching point of the cannulated vessel).
  • a more sophisticated algorithm might include a prediction of the tip's future position given its recent evolution. Such predictions may be made using Kalman filters, in one example. Given a sufficient amount of training data from previously recorded slippage events, one could employ machine learning algorithms like recurrent or convolutional neural networks, which might simultaneously be trained to identify the critical points mentioned above.
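  • For illustration, a hand-rolled constant-velocity Kalman filter over the tracked 3D tip position is one minimal way to realize such a prediction (the class name and noise levels q, r are assumed values, not taken from this disclosure):

```python
import numpy as np

class TipKalman:
    """Constant-velocity Kalman filter; state is [position, velocity] in 3D."""
    def __init__(self, dt, q=0.01, r=0.25):
        self.F = np.eye(6); self.F[:3, 3:] = dt * np.eye(3)  # motion model
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])    # observe position
        self.Q, self.R = q * np.eye(6), r * np.eye(3)        # noise (assumed)
        self.x, self.P = np.zeros(6), np.eye(6)

    def step(self, z_mm):
        """Predict, then correct with a new tracked tip position z_mm."""
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        K = self.P @ self.H.T @ np.linalg.inv(self.H @ self.P @ self.H.T + self.R)
        self.x = self.x + K @ (np.asarray(z_mm, float) - self.H @ self.x)
        self.P = (np.eye(6) - K @ self.H) @ self.P
        return self.x[:3]

    def predict_ahead(self, seconds, dt):
        """Extrapolated tip position, for comparison against the threshold."""
        x = self.x.copy()
        for _ in range(int(seconds / dt)):
            x = self.F @ x
        return x[:3]
```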
  • FIG. 5 shows an example method 500 for managing interventional device displacement warnings according to an embodiment.
  • the method 500 may generally be implemented in the interventional device lost position warning system 200 (FIG. 2), already discussed.
  • the methods described herein may be performed at least in part by cloud processing.
  • Illustrated processing block 502 provides for monitoring shape and movement of an interventional device in an anatomy of a patient.
  • a fiber optic shape sensing system may be built into the interventional medical device (or an electromagnetic tracking device may be built into the interventional medical device) to monitor the shape and the movement of an interventional device in an anatomy of a patient.
  • such a fiber optic shape sensing system may be a Fiber Optic RealShape (FORS) system; alternatively, an electromagnetic tracking device may be built into the interventional medical device.
  • the overall shape of the device can provide a great deal of information about what the interventional device is doing, such as enabling one to infer where the interventional device is parked (e.g., in a branching vessel) or determining how one segment of the interventional device is moving relative to another segment (e.g., if the proximal and distal ends are both advancing or parked as expected).
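  • For example, two consecutive shape samples could be compared segment by segment, as in this sketch (it assumes both samples carry the same number of points along the device; the names, segment count, and tolerance are illustrative):

```python
import numpy as np

def segment_motion(shape_prev, shape_now, n_seg=10):
    """Mean displacement (mm) of each of n_seg segments, distal to proximal,
    between two (N, 3) device-shape samples (e.g., from a FORS system).
    A parked tip with an advancing shaft shows near-zero motion distally."""
    disp = np.linalg.norm(np.asarray(shape_now, float)
                          - np.asarray(shape_prev, float), axis=1)
    return np.array([s.mean() for s in np.array_split(disp, n_seg)])

# e.g. tip_parked = segment_motion(prev, now)[0] < 0.2  # assumed tolerance, mm
```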
  • an imaging system may include one or more medical imaging devices, medical imaging systems, the like, and/or combinations thereof to monitor the shape and the movement of an interventional device in an anatomy of a patient.
  • Machine learning, artificial intelligence and/or image processing techniques can be used to segment and track the interventional medical device via data from the medical images.
  • the interventional medical device may be tracked in the image during live fluoroscopic imaging or other imaging procedures performed by the imaging system.
  • the interventional device tip and length may be tracked from fluoroscopic image data.
  • interventional devices are tracked temporally to estimate location with respect to a target location.
  • machine learning algorithms can be used to train models that can segment and track the interventional device present in the image using training data consisting of annotated interventional devices present in fluoroscopic images containing various target locations.
  • Annotations may be made manually by an expert.
  • annotations may be made automatically by collecting fluoroscopic image data with the interventional device on known anatomical background which can then be subtracted from the fluoroscopic image data.
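  • A sketch of that automatic-annotation idea (the intensity threshold tau is an assumed value):

```python
import numpy as np

def auto_annotate(frame, background, tau=25):
    """Subtract a known anatomical background from a fluoroscopic frame so
    that mostly the device remains, then threshold to a binary device mask
    usable as a training label."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return diff > tau  # boolean mask of candidate device pixels
```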
  • location may be determined from the data from the interventional device itself, which may include positional location data.
  • Illustrated processing block 504 provides for determining that the interventional device has been placed in a parked position at a target location. For example, determining that the interventional device has been placed in a parked position at a target location is based on the monitored shape and movement of the interventional device.
  • an interventional medical device enters a “parked” position when the end of the interventional medical device remains at rest at a target location. Such parking may occur while a user of the interventional medical device is busy with another task. During such parking, it is often important for the end of the interventional medical device to remain in place.
  • the position of the device in a parked position in the anatomy may also be determined based on information from tracked devices. For example, shape features of a FORS equipped interventional device may be used to determine if the interventional device is parked in a cannulated vessel. Similarly, tracking the path that an electromagnetic (EM) equipped interventional device has travelled may be utilized to determine if the interventional device is parked in a cannulated vessel.
  • the target destination is set manually by a user of the interventional device (e.g., by marking the device tip and an anatomical location that will trigger the alarm if approached).
  • the target location may be identified manually by a user using a user interface such as by pointing on a touch screen, or clicking using a mouse, or drawing a bounding box using a mouse, and so forth.
  • these features can be identified automatically using computer vision algorithms, machine learning algorithms, artificial intelligence (AI) algorithms, image processing algorithms, the like, or some combination thereof trained on previously recorded data.
  • Illustrated processing block 506 provides for determining that the interventional device has been displaced from the parked position. For example, determining that the interventional device has been displaced from the parked position is based on the monitored movement relative to the target location.
  • FIG. 6 is a flowchart of an example of another method 600 for managing interventional device displacement warnings according to an embodiment.
  • the method 600 may generally be implemented in the interventional device lost position warning system 200 (FIG. 2), already discussed.
  • Illustrated processing block 602 provides for predicting future movement of the interventional device relative to the target location. For example, predicting future movement of the interventional device relative to the target location is based on the monitored movement of the interventional device.
  • a prediction of future movement may be made. For example, a dynamic model of device motion could be added to make a prediction that extrapolates device motion from recent history and may thus afford a longer warning time.
  • a sophisticated algorithm might include a prediction of the tip's future position given its recent evolution. Such predictions may be made using Kalman filters, in one example.
  • machine learning algorithms like recurrent or convolutional neural networks may also be employed, and might simultaneously be trained to make a prediction of future movement.
  • Advanced warning algorithms could also take into account “supplemental” information to make a prediction of future movement, e.g., the strain and tension in a FORS-equipped interventional device that make it likely to “snap” out of place, or any physical interactions at the proximal end where a physician might touch the device.
  • Illustrated processing block 604 provides for triggering an alarm. For example, an alarm is triggered when the predicted future movement of the interventional device is predicted to be displaced beyond a displacement threshold value.
  • such a displacement threshold value may be defined automatically (e.g., algorithmically via AI) or by human input based on the extracted features and possibly the imaging data. For example, a physician may point and click on an x-ray image to identify the tip of a device and a critical point in the anatomy to set a threshold distance. An AI algorithm may be trained on such human input to do this task automatically.
  • FIG. 7 is a flowchart of an example of another method 700 for managing interventional device displacement warnings according to an embodiment.
  • the method 700 may generally be implemented in the interventional device lost position warning system 200 (FIG. 2), already discussed.
  • various processing blocks are illustrated as being performed by displacement warning logic 702, an interventional device 704, an imaging system 706, video system 708, a display 710, and/or a robotic control manipulator 712 in conjunction with one another (e.g., as discussed above in FIG. 2).
  • Illustrated processing blocks 720 provide for receiving interventional device data.
  • Illustrated processing blocks 722 provide for receiving imaging system data.
  • Illustrated processing blocks 724 provide for receiving video system data.
  • Illustrated processing block 726 provides for monitoring shape and movement of an interventional device in an anatomy of a patient.
  • the monitoring the shape and movement of the interventional device is based on data from one or more sources comprising a fiber optic shape sensing system built into the interventional device, an electromagnetic shape sensing system built into the interventional device, an x-ray medical imaging system, an ultrasound medical imaging system, a magnetic resonance imaging system, a video system, the like, and/or combinations thereof.
  • Illustrated processing block 728 provides for determining that the interventional device has been placed in a parked position at a target location. For example, determining that the interventional device has been placed in a parked position at a target location is based on the monitored shape and movement of the interventional device.
  • Illustrated processing block 734 provides for monitoring a position and a shape of the interventional device outside the anatomy. For example, monitoring a position and a shape of the interventional device outside the anatomy is based on the video system data.
  • the video system in the room captures frames that are fed into an object detection algorithm which is trained to detect different interventional devices or different portions of an interventional device.
  • object detection algorithms could be trained on labeled data or in an unsupervised fashion based on correlations between the data from different imaging modalities.
  • triggering the alarm is based on the determination that the movement of the interventional device is unintended.
  • the monitored position and shape of the interventional device outside the anatomy may indicate unintended movement due to movement of a coaxial device (e.g., if a guidewire portion of the interventional device is moving unintentionally due to the movement of a coaxial catheter portion of the interventional device).
  • Illustrated processing block 736 provides for monitoring a position of an interventionist's hands with respect to the interventional device. For example, monitoring a position of an interventionist's hands with respect to the interventional device is based on the video system data.
  • AI-based computer vision models may be used for estimating the pose of human bodies as well as their hands. Pretrained models are available in this domain.
  • triggering the alarm is based on the determination that the movement of the interventional device is unintended. For example, if the interventional device is moving or predicted to be moving, but the interventionist’s hands are not on the interventional device, this may be an indication that any movement is unintended.
  • Illustrated processing block 738 provides for monitoring movement of a patient. For example, monitoring movement of a patient is based on the video system data.
  • AI-based computer vision models may be used for estimating the pose of human bodies. Pretrained models are available in this domain.
  • triggering the alarm is based on the determination that the movement of the interventional device is unintended. For example, if the interventional device is moving or predicted to be moving, and the patient is also moving, this may be an indication that any movement is unintended.
  • Illustrated processing block 740 provides for monitoring position and strain information of the interventional device. For example, monitoring the position and strain information of the interventional device is based on data from the interventional device itself.
  • triggering the alarm is based on the determination that the movement of the interventional device is unintended.
  • the monitored shape, position, and strain information of the interventional device may be used to estimate torque or tension build-up such that when the interventional device is near the target location, the aggregate information can be used to estimate risk of sudden movement of the interventional device.
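  • One hedged sketch of such an aggregate estimate, with all weights assumed rather than taken from this disclosure:

```python
def snap_risk(strain_max, torque_turns, dist_to_target_mm):
    """Illustrative heuristic: stored strain and rotational torque raise the
    risk of sudden motion, weighted more heavily the closer the tip sits to
    the target location. Returns a score in [0, 1]."""
    stored_energy = 0.7 * strain_max + 0.3 * torque_turns  # assumed weights
    proximity = 1.0 / (1.0 + dist_to_target_mm)
    return min(1.0, stored_energy * proximity)
```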
  • Illustrated processing block 742 provides for determining/predicting that movement is unintended. Such a determination/prediction that movement is unintended may be made based on one or many of the factors described above at blocks 734-740.
  • Illustrated processing block 746 provides for determining/predicting displacement. For example, determining that the interventional device has been displaced from the parked position (or predicted to be displaced from the parked position) is based on the monitored movement relative to the target location.
  • Illustrated processing block 748 provides for triggering the alarm. For example, the alarm will be triggered when the monitored movement or predicted future movement of the interventional device is displaced beyond a displacement threshold and/or when the monitored movement or predicted future movement is determined to be unintended.
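  • That trigger condition reduces to a short predicate, sketched here with illustrative names:

```python
def should_alarm(displacement_mm, predicted_mm, threshold_mm, unintended):
    """Alarm when monitored or predicted displacement exceeds the threshold,
    and/or when the monitored or predicted movement is deemed unintended."""
    exceeded = max(displacement_mm, predicted_mm) > threshold_mm
    return exceeded or unintended
```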
  • Illustrated processing block 750 provides for presenting the alarm to a user.
  • the alarm may be presented to a user via the display 710.
  • Illustrated processing block 752 provides for controlling the interventional device in response to triggering the alarm.
  • the robotic control manipulator may be utilized automatically to counteract or inhibit any unintentional movement of the interventional device.
  • the robotic control manipulator is to perform one or more operations in response to the alarm.
  • the robotic control manipulator is to inhibit movement of the interventional device, relieve rotational torque of the interventional device, and/or relieve linear tension of the interventional device in response to triggering the alarm.
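  • As a sketch only, the mitigation described above might look as follows, where `manipulator` stands for a hypothetical software interface to the robotic control manipulator of FIG. 2 (none of these method names are defined by this disclosure):

```python
def on_alarm(manipulator, alarm):
    """React to a triggered alarm with robotic mitigation."""
    manipulator.hold_position()          # inhibit movement of the device
    if alarm.get("torque_buildup"):
        manipulator.release_torque()     # relieve rotational torque
    if alarm.get("linear_tension"):
        manipulator.release_tension()    # relieve linear tension
```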
  • the procedures described herein may provide a framework for clinical deployment of decision support algorithms. These procedures can work together with many clinical decision support (CDS) algorithms (such as acute kidney injury (AKI), acute respiratory distress syndrome (ARDS), acute decompensated heart failure (ADHF), etc.).
  • Clinical decision support (CDS) refers to computer-based support of clinical staff responsible for making decisions for the care of patients. Computer-based support for clinical decision-making staff may take many forms, from patient-specific visual/numeric health status indicators to patient-specific health status predictions and patient-specific health care recommendations.
  • the procedures described herein may be deployed on analytics platforms (such as Inference Engine, Critical Care Information System, Interoperability Solution, etc.) in conjunction with CDS algorithms.
  • FIG. 8 illustrates a block diagram of an example computer program product 800.
  • computer program product 800 includes a machine-readable storage 802 that may also include logic 804.
  • the machine-readable storage 802 may be implemented as a non-transitory machine-readable storage.
  • the logic 804 may be implemented as machine-readable instructions, such as software, for example.
  • the logic 804, when executed, implements one or more aspects of the method 500 (FIG. 5), the method 600 (FIG. 6), and/or the method 700 (FIG. 7), and/or realizes the system 200 (FIG. 2), already discussed.
  • FIG. 9 shows an illustrative example of a system 900.
  • the system 900 may include a processor 902 and a memory 904 communicatively coupled to the processor 902.
  • the memory 904 may include logic 906 as a set of instructions.
  • the logic 906 may be implemented as software.
  • the logic 906, when executed by the processor 902, implements one or more aspects of the method 500 (FIG. 5), the method 600 (FIG. 6), and/or the method 700 (FIG. 7), and/or realizes the system 200 (FIG. 2), already discussed.
  • the processor 902 may include a general purpose controller, a special purpose controller, a storage controller, a storage manager, a memory controller, a microcontroller, a general purpose processor, a special purpose processor, a central processing unit (CPU), the like, and/or combinations thereof.
  • implementations may include distributed processing, component/object distributed processing, parallel processing, the like, and/or combinations thereof.
  • virtual computer system processing may implement one or more of the methods or functionalities as described herein, and the processor 902 described herein may be used to support such virtual processing.
  • the memory 904 is an example of a computer-readable storage medium.
  • memory 904 may be any memory which is accessible to the processor 902, including, but not limited to, RAM, registers, register files, the like, and/or combinations thereof. References to “computer memory” or “memory” should be interpreted as possibly being multiple memories.
  • the memory may for instance be multiple memories within the same computer system.
  • the memory may also be multiple memories distributed amongst multiple computer systems or computing devices.
  • FIG. 10 shows an illustrative semiconductor apparatus 1000 (e.g., chip and/or package).
  • the illustrated semiconductor apparatus 1000 includes one or more substrates 1002 (e.g., silicon, sapphire, or gallium arsenide) and logic 1004 (e.g., configurable logic and/or fixed-functionality hardware logic) coupled to the substrate(s) 1002.
  • the logic 1004 implements one or more aspects of the method 500 (FIG. 5), the method 600 (FIG. 6), and/or the method 700 (FIG. 7), and/or realizes the system 200 (FIG. 2), already discussed.
  • logic 1004 may include transistor array and/or other integrated circuit/IC components.
  • configurable logic and/or fixed-functionality hardware logic implementations of the logic 1004 may include configurable logic such as, for example, programmable logic arrays (PLAs), field programmable gate arrays (FPGAs), complex programmable logic devices (CPLDs), or fixed-functionality logic hardware using circuit technology such as, for example, application specific integrated circuit (ASIC), complementary metal oxide semiconductor (CMOS) or transistor-transistor logic (TTL) technology, the like, and/or combinations thereof.
  • any two components so associated can also be viewed as being “operably connected”, or “operably coupled”, to each other to achieve the desired functionality, and any two components capable of being so associated can also be viewed as being “operably couplable”, to each other to achieve the desired functionality.
  • operably couplable include but are not limited to physically mateable and/or physically interacting components.
  • the phrase “at least one,” in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements.
  • This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase “at least one” refers, whether related or unrelated to those elements specifically identified.
  • a list of items joined by the term “one or more of” may mean any combination of the listed terms.
  • the phrases “one or more of A, B or C” may mean A; B; C; A and B; A and C; B and C; or A, B and C.
  • a processor, other unit, the like, and/or combinations thereof may fulfill the functions of several items recited in the claims.
  • a computer program may be stored/distributed on a suitable computer readable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Robotics (AREA)
  • Gynecology & Obstetrics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Pathology (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

Systems, apparatuses, and methods according to the invention provide an interventional device displacement warning system for interventionalists. Such an interventional device displacement warning system monitors a shape and a movement of an interventional device in an anatomy of a patient. A determination is made that the interventional device has been placed in a parked position at a target location based on the monitored shape and movement of the interventional device. A determination is made that the interventional device has been displaced from the parked position based on the monitored movement relative to the target location.
PCT/EP2024/079284 2023-10-25 2024-10-17 Lost position warning for endovascular devices Pending WO2025087777A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202363592960P 2023-10-25 2023-10-25
US63/592,960 2023-10-25

Publications (1)

Publication Number Publication Date
WO2025087777A1 (fr) 2025-05-01

Family

ID=93211603

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2024/079284 Pending Lost position warning for endovascular devices

Country Status (1)

Country Link
WO (1) WO2025087777A1 (fr)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190192234A1 (en) * 2016-08-23 2019-06-27 Intuitive Surgical Operations, Inc. Systems and methods for monitoring patient motion during a medical procedure
US20190239723A1 (en) * 2016-09-21 2019-08-08 Intuitive Surgical Operations, Inc. Systems and methods for instrument buckling detection
EP4016455A1 (fr) * 2020-12-16 2022-06-22 Koninklijke Philips N.V. Mappage de mouvement prédictif pour dispositifs flexibles

Similar Documents

Publication Publication Date Title
US12053144B2 (en) Robotic systems for navigation of luminal networks that compensate for physiological noise
US11266424B2 (en) Autonomous catheterization assembly
US9675302B2 (en) Prolapse detection and tool dislodgement detection
US10524652B2 (en) Information processing device, imaging system, information processing method and program
US10362943B2 (en) Dynamic overlay of anatomy from angiography to fluoroscopy
EP4016455A1 Predictive motion mapping for flexible devices
JP2020511255A OSS guidance and monitoring system, controller, and method
WO2014091380A1 Interventional system
JP2023552645A Lead adhesion estimation
JP7679473B2 Predictive motion mapping for flexible devices
WO2025087777A1 Lost position warning for endovascular devices
CN116669634A Lead adhesion estimation
WO2025002905A1 Atherosclerotic tissue characterization based on endovascular treatment dynamics
US12161431B2 (en) Monitoring method and medical system
WO2025061532A1 Detection and alerting for device intrusion into vulnerable anatomical regions
US20250062000A1 (en) Image-guided therapy system
EP4600967A1 Providing annotated data for training a machine learning model to monitor events during a medical procedure
WO2024088836A1 Systems and methods for estimating time relative to a target from image features
US20230386113A1 (en) Medical image processing apparatus and medical image processing method
EP4625434A1 Supporting an interventional procedure
US20250356990A1 (en) Percutaneous coronary intervention planning
JP2024050046A Computer program, information processing method, and information processing apparatus
Cao et al. Design of an Autonomous Delivery System for Vascular Intervention Robots Based on Curvature Variation
JP2025062654A Medical image processing apparatus, medical image processing method, and medical image processing program
CN121003490A System, method, and computer program product for moving a medical object

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24793758

Country of ref document: EP

Kind code of ref document: A1