
WO2024252227A1 - System and method for updating registration and localization during surgical navigation - Google Patents


Info

Publication number
WO2024252227A1
WO2024252227A1 (PCT/IB2024/055206)
Authority
WO
WIPO (PCT)
Prior art keywords
catheter
computing device
node
coordinates
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/IB2024/055206
Other languages
English (en)
Inventor
Dany JUNIO
Steven J. LEVINE
Jing Zhao
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Covidien LP
Original Assignee
Covidien LP
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US18/669,674 external-priority patent/US20240407741A1/en
Application filed by Covidien LP filed Critical Covidien LP
Publication of WO2024252227A1

Classifications

    • A61B 34/20: Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 90/361: Image-producing devices, e.g. surgical cameras
    • G06T 7/70: Image analysis; determining position or orientation of objects or cameras
    • A61B 1/2676: Bronchoscopes
    • A61B 2017/00809: Lung operations
    • A61B 2034/101: Computer-aided simulation of surgical operations
    • A61B 2034/105: Modelling of the patient, e.g. for ligaments or bones
    • A61B 2034/2051: Electromagnetic tracking systems
    • A61B 2034/2065: Tracking using image or pattern recognition
    • A61B 2090/061: Measuring instruments for measuring dimensions, e.g. length
    • A61B 2090/376: Surgical systems with images on a monitor during operation, using X-rays, e.g. fluoroscopy
    • G06T 2207/10081: Computed x-ray tomography [CT]
    • G06T 2207/30101: Blood vessel; artery; vein; vascular

Definitions

  • This disclosure relates to systems, methods, and devices for dynamically updating registration between sensor data and CT data during a surgical navigation procedure.
  • Imaging modalities such as magnetic resonance imaging (MRI), computed tomography (CT), and fluoroscopy are techniques for identifying, and navigating to, areas of interest within a patient and, ultimately, a target for biopsy or treatment.
  • Pre-operative scans may be utilized for target identification and intraoperative guidance.
  • An endoscopic approach has proven useful in navigating to areas of interest within a patient, and particularly so for areas within luminal networks of the body such as the lungs.
  • Endobronchial navigation systems have been developed that use previously acquired MRI data or CT image data to generate a three-dimensional (3D) rendering, model, or volume of the particular body part, such as the lungs.
  • The resulting volume generated from the MRI scan or CT scan may be utilized to create a navigation plan to facilitate the advancement of a navigation catheter (or other suitable medical device) through a bronchoscope and a branch of the bronchus of a patient to an area of interest.
  • A locating or tracking system, such as an electromagnetic (EM) tracking system, may be utilized in conjunction with, for example, CT data, to facilitate guidance of the navigation catheter through the branch of the bronchus to the area of interest.
  • The navigation catheter may be positioned within one of the airways of the branched luminal networks adjacent to, or within, the area of interest to provide access for one or more medical instruments.
  • A surgical navigation system includes a catheter and a computing device.
  • The catheter is configured to be navigated through a luminal network and to capture images during navigation.
  • The computing device includes a processor and memory storing instructions which, when executed by the processor, cause the computing device to register data, detected by the catheter in the luminal network, to CT data of the luminal network, and to detect when the catheter is located at a node of the luminal network.
  • The computing device is further configured to determine coordinates of the detected node, determine a difference between the determined coordinates of the node and expected coordinates of the node, and determine whether the difference between the determined coordinates and the expected coordinates is greater than a predetermined threshold.
  • In an aspect, the computing device is configured to determine coordinates of the detected node based on data detected from the catheter.
  • In an aspect, the computing device is configured to perform image depth sensing to determine a distance between the catheter and the detected node, and to determine coordinates of the detected node based on data detected from the catheter and the determined distance between the catheter and the detected node.
  • In an aspect, the computing device is configured to update registration of the location data to the CT data of the luminal network if it is determined that the difference between the determined coordinates and the expected coordinates is greater than the predetermined threshold.
  • In an aspect, the computing device is configured to determine a node corresponding to a location of the catheter if it is determined that the difference between the determined coordinates and the expected coordinates is not greater than the predetermined threshold.
  • In an aspect, the computing device is configured to detect when the catheter is at the node based on an image analysis of the images captured during navigation.
  • In an aspect, the computing device is configured to estimate expected changes of the luminal network during navigation, and to determine if changes of the luminal network during navigation differ from the estimated expected changes.
  • In an aspect, the computing device is configured to analyze a captured image of the node to calculate an angle and a distance between two lumens at the node, and to determine translation and rotation differences between the catheter data and the CT data based on the calculated angle and distance between the two lumens at the node.
  • In another aspect, a navigation system includes a catheter, a tracking system operably coupled to the catheter, and a computing device operably coupled to the catheter and the tracking system.
  • The catheter includes a sensor and is configured to navigate along a path through a luminal network.
  • The tracking system is configured to generate location data corresponding to locations of the catheter within the luminal network based on signals received from the sensor as the catheter is navigated through the luminal network.
  • The computing device includes a processor and memory storing instructions which, when executed by the processor, cause the computing device to register the generated location data to CT data of the luminal network, and to detect when the catheter is located at a node of the luminal network.
  • The computing device is further configured to determine coordinates of the detected node, determine a difference between the determined coordinates of the node and expected coordinates of the node, and determine whether the difference between the determined coordinates and the expected coordinates is greater than a predetermined threshold.
  • In an aspect, the computing device is configured to determine coordinates of the detected node based on data sensed from the catheter.
  • In an aspect, the computing device is configured to perform image depth sensing to determine a distance between the catheter and the detected node, and to determine coordinates of the detected node based on data sensed from the catheter and the determined distance between the catheter and the detected node.
  • In an aspect, the computing device is configured to update registration of the data to the CT data of the luminal network if it is determined that the difference between the determined coordinates and the expected coordinates is greater than the predetermined threshold.
  • In an aspect, the computing device is configured to determine a node corresponding to a location of the catheter if it is determined that the difference between the determined coordinates and the expected coordinates is not greater than the predetermined threshold.
  • In an aspect, the computing device is configured to detect when the catheter is at the node based on an image analysis of the images captured during navigation.
  • In an aspect, the computing device is further configured to estimate expected changes of the luminal network during navigation, and to determine if changes of the luminal network during navigation differ from the estimated expected changes.
  • In an aspect, the computing device is configured to analyze a captured image of the node to calculate an angle and a distance between two lumens at the node, and to determine translation and rotation differences between the sensor data and the CT data based on the calculated angle and distance between the two lumens at the node.
  • In another aspect, a method includes registering data, detected by a catheter in a luminal network, to CT data of the luminal network; detecting when the catheter is located at a node of the luminal network; determining coordinates of the detected node; determining a difference between the determined coordinates of the node and expected coordinates of the node; and determining whether the difference between the determined coordinates and the expected coordinates is greater than a predetermined threshold.
  • In an aspect, the method further includes updating registration of the data detected by the catheter to the CT data of the luminal network if it is determined that the difference between the determined coordinates and the expected coordinates is greater than the predetermined threshold.
  • In an aspect, the method further includes determining a node corresponding to a location of the catheter if it is determined that the difference between the determined coordinates and the expected coordinates is not greater than the predetermined threshold.
  • In an aspect, the method further includes detecting when the catheter is at the node based on images captured during navigation.
  • FIG. 1 is a schematic diagram of a surgical navigation procedure system in accordance with an illustrative aspect of this disclosure
  • FIG. 2 is a schematic diagram of a computing device which forms part of the surgical navigation procedure system of FIG. 1 in accordance with an aspect of this disclosure
  • FIG. 3 is a flowchart illustrating a method for dynamic localization and registration in accordance with an aspect of this disclosure
  • FIG. 4A illustrates a tree representation of a luminal network derived from CT data of the luminal network in accordance with an aspect of this disclosure
  • FIG. 4B illustrates an indicator based on location data of a tracked surgical navigation catheter relative to the tree representation of the luminal network of FIG. 4A before dynamic updating of registration between the location data and the CT data is applied in accordance with an aspect of this disclosure
  • FIG. 4C illustrates an indicator based on location data of a tracked surgical navigation catheter relative to the tree representation of the luminal network of FIG. 4A after dynamic updating of registration between the location data and the CT data is applied in accordance with an aspect of this disclosure
  • FIG. 5A illustrates a three-dimensional rendering of a luminal network derived from CT data of the luminal network in accordance with an aspect of this disclosure
  • FIG. 5B illustrates an image of a parent node of the luminal network in accordance with an aspect of this disclosure
  • FIG. 5C illustrates an image of a child node of the luminal network in accordance with an aspect of this disclosure
  • FIG. 6 illustrates an image of a node within a luminal network in accordance with an aspect of this disclosure.
  • This disclosure provides a system and method for dynamically updating registration between location data, corresponding to a location of a surgical navigation catheter, and CT data of a luminal network during a surgical navigation procedure. Updating registration is based on the detection of anatomical landmarks during navigation of the surgical navigation catheter through the luminal network.
  • In aspects, this disclosure utilizes image data of the luminal network, as captured by the surgical navigation catheter, to determine the location of the surgical navigation catheter and update the registration of the location data to the CT data.
  • FIG. 1 depicts a surgical navigation system 10 configured for reviewing CT image data to identify one or more targets, planning a pathway to an identified target (planning phase), navigating a catheter 12 (e.g., an extended working channel) of a catheter guide assembly 40 to a target (navigation phase) via a user interface, and confirming placement of the catheter 12 relative to the target.
  • The target may be tissue of interest identified by review of the CT image data during the planning phase.
  • A medical instrument, such as a biopsy tool, ablation tool (e.g., ablation device 130), or other tool, may be inserted into the catheter 12 to treat the tissue or obtain a tissue sample from the tissue located at, or proximate to, the target.
  • Although CT data is referred to herein, any form of imaging data acquired by any imaging device may be utilized prior to, or during, a procedure.
  • System 10 generally includes an operating table 20 configured to support the patient “P;” a bronchoscope 30 configured for insertion through patient’s “P’s” mouth into patient’s “P’s” airways; monitoring equipment including a display 120 coupled to bronchoscope 30 (e.g., a video display, for displaying the video images received from the video imaging system of bronchoscope 30); a tracking system 50 including a tracking module 52, a plurality of reference sensors 54, and a transmitter mat 56; and a computing device 100 including software and/or hardware used to facilitate identification of a target, pathway planning to the target, navigation of a medical instrument to the target, and confirmation of placement of the catheter 12, or a suitable device therethrough, relative to the target.
  • Catheter 12 is part of a catheter guide assembly 40.
  • Catheter 12 is inserted into bronchoscope 30 for access to a luminal network of patient “P.”
  • Catheter 12 of catheter guide assembly 40 may be inserted into a working channel of bronchoscope 30 for navigation through a patient’s luminal network.
  • A distal portion of the catheter 12 includes a sensor 44 (such as, e.g., a location sensor) and a camera 126.
  • The position and orientation of the sensor 44 within an electromagnetic field, and thus, the distal portion of the catheter 12 relative to a reference coordinate system, can be derived by the tracking system 50.
  • The camera 126 may be any type of sensing device capable of capturing images.
  • Computing device 100 analyzes the images captured by the camera 126 to detect when the catheter 12 is located at a bifurcation within the luminal network.
  • An imaging device 110 capable of acquiring images or video of patient “P” (e.g., fluoroscopic, x-ray, MRI, CT, ultrasonic, etc.) may also be included in this particular aspect of system 10.
  • The imaging device 110 acquires image data (e.g., images, series of images, or video).
  • The imaging device 110 may move relative to patient “P” so that images may be acquired from different angles or perspectives relative to patient “P” to create a video from a sweep.
  • Computing device 100 may be any suitable computing device including a processor and storage medium, wherein the processor is capable of executing instructions stored on the storage medium.
  • the computing device 100 may further include a database configured to store patient data, CT data sets including CT images, fluoroscopic data sets including fluoroscopic images and video, navigation plans, and any other such data.
  • the computing device 100 may include inputs, or may otherwise be configured to receive, CT data sets, fluoroscopic images/video and other data described herein.
  • Computing device 100 includes a display (e.g., display 206) configured to display graphical user interfaces.
  • Computing device 100 utilizes previously acquired image data (e.g., CT image data, MRI image data, etc.) for generating and viewing a three-dimensional model or rendering of patient “P’s” airways, enables the identification of a target on the three-dimensional model (automatically, semi-automatically, or manually), and allows for determining a pathway through patient “P’s” airways to tissue located at and around the target. More specifically, in an aspect, CT images acquired from previous CT scans are processed and assembled into a three-dimensional CT volume, which is then utilized to generate a three-dimensional model of patient “P’s” airways. The three-dimensional model may be displayed on a display 206 associated with computing device 100, or in any other suitable fashion.
  • The enhanced two-dimensional images may possess some three-dimensional capabilities because they are generated from three-dimensional data.
  • The three-dimensional model may be manipulated to facilitate identification of a target on the three-dimensional model or two-dimensional images, and selection of a suitable pathway through patient “P’s” airways to access tissue located at the target can be made. Once selected, the pathway plan, three-dimensional model, and images derived therefrom can be saved and exported to a navigation system for use during the navigation phase(s).
  • One example of such planning software is the ILLUMISITE® planning suite currently sold by Medtronic plc.
  • Tracking system 50 is utilized for performing registration of the images and the pathway for navigation, although other configurations are also contemplated.
  • Tracking system 50 includes a tracking module 52, a plurality of reference sensors 54, and a transmitter mat 56 (including markers if applicable).
  • Tracking system 50 is configured for use with a sensor 44 (such as, e.g., a location sensor) of catheter 12 and may be configured to track, for example, the electromagnetic position thereof within an electromagnetic coordinate system.
  • Transmitter mat 56 is positioned beneath patient “P.” Transmitter mat 56 generates an electromagnetic field around at least a portion of patient “P” within which the position of a plurality of reference sensors 54 and the sensor 44 can be determined with use of a tracking module 52. One or more of reference sensors 54 are attached to the chest of patient “P.” The six degrees of freedom coordinates of reference sensors 54 are sent to computing device 100 (which includes the appropriate software) where they are used to calculate a patient coordinate frame of reference.
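  • The disclosure does not spell out how the six degrees of freedom coordinates of reference sensors 54 are turned into a patient coordinate frame of reference. A minimal sketch of one conventional construction (centroid origin plus an orthonormal basis built from three chest sensors) follows; all names and the choice of axes are illustrative assumptions, not the patent's method.

```python
import numpy as np

def patient_frame(ref_positions):
    """Build a patient-fixed frame from chest reference sensor positions.

    ref_positions: (3, 3) array of reference sensor 54 positions reported
    by tracking module 52 in transmitter mat 56 coordinates.
    Returns (origin, R), where the columns of R are the frame's axes.
    """
    p = np.asarray(ref_positions, dtype=float)
    origin = p.mean(axis=0)                       # centroid of the sensors
    n = np.cross(p[1] - origin, p[2] - origin)
    z = n / np.linalg.norm(n)                     # normal to the sensor plane
    x = p[0] - origin
    x -= (x @ z) * z                              # keep x in the sensor plane
    x /= np.linalg.norm(x)
    y = np.cross(z, x)                            # complete right-handed frame
    return origin, np.column_stack([x, y, z])

def to_patient_coords(point, origin, R):
    """Express a transmitter-mat point in the patient frame of reference."""
    return R.T @ (np.asarray(point, dtype=float) - origin)
```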
  • Registration is generally performed to coordinate locations of the three dimensional model and two dimensional images from the planning phase with patient’s “P’s” airways as observed through the bronchoscope 30, and to allow for the navigation phase to be undertaken with precise knowledge of the location of the sensor 44, even in portions of the airway where the bronchoscope 30 cannot reach.
  • Initial registration of patient’s “P’s” location on the transmitter mat 56 is performed by moving sensor 44 through the airways of patient “P.” More specifically, data pertaining to locations of sensor 44, while catheter 12 is moving through the airways, is recorded using transmitter mat 56, reference sensors 54, and tracking module 52. A shape resulting from this location data is compared to an interior geometry of passages of the three dimensional model generated in the planning phase, and a location correlation between the shape and the three dimensional model based on the comparison is determined, e.g., utilizing the software on computing device 100. In addition, the software identifies non-tissue space (e.g., air filled cavities) in the three dimensional model.
  • The software aligns, or registers, an image representing a location of sensor 44 with the three-dimensional model and the two-dimensional images generated from the three-dimensional model, based on the recorded location data and an assumption that sensor 44 remains located in non-tissue space in patient’s “P’s” airways.
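  • The shape-to-model comparison described above is, in essence, a point-set registration problem. The sketch below shows one standard way it could be done (a few iterations of iterative closest point against sampled airway-centerline points, with the Kabsch algorithm for the rigid fit); the disclosure does not name a specific algorithm, so this is an assumption rather than the patent's method.

```python
import numpy as np
from scipy.spatial import cKDTree

def kabsch(P, Q):
    """Least-squares rigid transform (R, t) mapping point set P onto Q."""
    pc, qc = P.mean(axis=0), Q.mean(axis=0)
    U, _, Vt = np.linalg.svd((P - pc).T @ (Q - qc))
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, qc - R @ pc

def register_path_to_model(sensor_pts, centerline_pts, iterations=20):
    """Align the shape traced by sensor 44 to airway centerline samples."""
    tree = cKDTree(centerline_pts)                # model points in CT space
    R, t = np.eye(3), np.zeros(3)
    for _ in range(iterations):
        moved = sensor_pts @ R.T + t              # current alignment guess
        _, idx = tree.query(moved)                # closest model point to each
        R, t = kabsch(sensor_pts, centerline_pts[idx])
    return R, t                                   # tracking-to-CT transform
```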
  • A manual registration technique may be employed by navigating the bronchoscope 30 with the sensor 44 to pre-specified locations in the lungs of patient “P,” and manually correlating the images from the bronchoscope to the model data of the three-dimensional model.
  • The instant disclosure is not so limited, however, and may be used in conjunction with flexible sensors, ultrasonic sensors, or without sensors. Additionally, the methods described herein may be used in conjunction with robotic systems such that robotic actuators drive the catheter 12, catheter guide assembly 40 components, or bronchoscope 30 proximate the target.
  • A user interface is displayed in the navigation software which sets forth the pathway that the catheter 12 is to follow to reach the target.
  • Ablation device 130 is a flexible surgical navigation catheter which is guided through catheter 12 for placement relative to a target and ablation of the target.
  • Ablation device 130 is configured to connect to microwave generator 33 (FIG. 1), which generates and controls the application of microwave energy through the ablation device 130.
  • Microwave generator 33 may be a component of computing device 100 or may be a separate stand-alone component.
  • FIG. 2 illustrates a system diagram of computing device 100.
  • Computing device 100 may include memory 202, processor 204, display 206, network interface 208, input device 210, and/or output module 212.
  • Memory 202 includes any non-transitory computer-readable storage media for storing data and/or software that is executable by processor 204 and which controls the operation of computing device 100.
  • memory 202 may include one or more solid-state storage devices such as flash memory chips.
  • Computer-readable media can be any available media that can be accessed by the processor 204. That is, computer-readable storage media includes non-transitory, volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data.
  • Computer-readable storage media includes RAM, ROM, EPROM, EEPROM, flash memory or other solid state memory technology, CD-ROM, DVD, Blu-Ray or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information, and which can be accessed by computing device 100.
  • Memory 202 may store application 216 and/or functional respiratory imaging data 214 of one or more patients.
  • Application 216 may, when executed by processor 204, cause display 206 to present user interfaces.
  • Processor 204 may be a general-purpose processor, a specialized graphics processing unit (GPU) configured to perform specific graphics processing tasks while freeing up the general-purpose processor to perform other tasks, and/or any number or combination of such processors.
  • Display 206 may be touch sensitive and/or voice activated, enabling display 206 to serve as both an input and output device. Alternatively, a keyboard (not shown), mouse (not shown), or other data input devices may be employed.
  • Network interface 208 may be configured to connect to a network such as a local area network (LAN) consisting of a wired network and/or a wireless network, a wide area network (WAN), a wireless mobile network, a Bluetooth network, and/or the internet.
  • Computing device 100 may receive functional respiratory imaging data, DICOM imaging data, computed tomographic (CT) image data, or other imaging data of a patient from an imaging workstation and/or a server, for example, a hospital server, internet server, or other similar servers, for use during surgical ablation planning.
  • Patient functional respiratory imaging data may also be provided to computing device 100 via a removable memory 202.
  • Computing device 100 may receive updates to its software, for example, application 216, via network interface 208.
  • Computing device 100 may also display notifications on display 206 that a software update is available.
  • Input device 210 may be any device by means of which a user may interact with computing device 100, such as, for example, a mouse, keyboard, foot pedal, touch screen, and/or voice interface.
  • Output module 212 may include any connectivity port or bus, such as, for example, parallel ports, serial ports, universal serial busses (USB), or any other similar connectivity port known to those skilled in the art.
  • Application 216 may be one or more software programs stored in memory 202 and executed by processor 204 of computing device 100. During a planning phase, application 216 guides a clinician through a series of steps to identify a target, size the target, size a treatment zone, and/or determine an access route to the target for later use during the procedure phase. In some embodiments, application 216 is loaded on computing devices in an operating room or other facility where surgical procedures are performed, and is used as a plan or map to guide a clinician performing a surgical procedure, but without any feedback from ablation device 130 used in the procedure to indicate where ablation device 130 is located in relation to the plan. In other embodiments, system 10 provides computing device 100 with data regarding the location of ablation device 130 within the body of the patient, such as by EM tracking, which application 216 may then use to indicate on the plan where ablation device 130 is located.
  • Application 216 may be installed directly on computing device 100, or may be installed on another computer, for example, a central server, and opened on computing device 100 via network interface 208.
  • Application 216 may run natively on computing device 100, as a web-based application, or any other format known to those skilled in the art.
  • In an aspect, application 216 is a single software program having all of the features and functionality described in this disclosure.
  • Alternatively, application 216 may be two or more distinct software programs providing various parts of these features and functionality.
  • For example, application 216 may include one software program for use during the planning phase, and a second software program for use during the procedure phase of the microwave ablation treatment.
  • The various software programs forming part of application 216 may be enabled to communicate with each other and/or import and export various settings and parameters relating to the microwave ablation treatment and/or the patient to share information.
  • For example, a treatment plan and any of its components generated by one software program during the planning phase may be stored and exported to be used by a second software program during the procedure phase.
  • Application 216 communicates with a user interface 218 that generates a user interface for presenting visual interactive features to a clinician, for example, on display 206 and for receiving clinician input, for example, via a user input device.
  • For example, user interface 218 may generate a graphical user interface (GUI) and output the GUI to display 206 for viewing by a clinician.
  • Computing device 100 is linked to display 120, thus enabling computing device 100 to control the output on display 120 along with the output on display 206.
  • Computing device 100 may control display 120 to display output which is the same as or similar to the output displayed on display 206.
  • For example, the output on display 206 may be mirrored on display 120.
  • Alternatively, computing device 100 may control display 120 to display output different from that displayed on display 206.
  • For example, display 120 may be controlled to display guidance images and information during the microwave ablation procedure, while display 206 is controlled to display other output, such as configuration or status information.
  • Turning now to FIG. 3, a method for dynamically updating registration between location data and CT data during a surgical navigation procedure, based on anatomical landmarks within the luminal network, is illustrated and described as method 300.
  • Method 300 is described as being executed by computing device 100, but some or all of the steps of method 300 may be implemented by one or more other components of the system 10, alone or in combination. Additionally, although method 300 is illustrated and described as including specific steps, and is described as being carried out in a particular order, it is understood that method 300 may include some or all of the steps described and may be carried out in any order not specifically described.
  • Method 300 begins at step 301 where a catheter 12 is navigated manually or robotically through a patient’s luminal network along a planned path using location data of the catheter 12 that is registered to CT data of the patient’s luminal network.
  • Computing device 100 registers location data of the catheter 12, acquired by tracking system 50, to CT data of the luminal network.
  • A graphical user interface may be displayed, including a CT data rendering (derived from the CT data) and a catheter rendering (derived from the location data of the sensor 44 of the catheter 12 as tracked by the tracking system 50) positioned relative to the CT data rendering based on the registration between the location data and the CT data.
  • A tree model 401 (FIGS. 4A-4C) and a three-dimensional rendering (FIGS. 5A-5C) of a patient’s luminal network are shown, respectively, each having a parent node 403 and a child node 405 stemming from the parent node 403 along a piece 407.
  • Each node within the tree model 401 represents an anatomical landmark within the luminal network, such as, for example, a bifurcation within the luminal network.
  • The tree model 401 is uniquely defined by a union of all pieces 407.
  • A catheter indicator 409, representing a location of the catheter 12 derived from the location data of the tracking system 50, is shown positioned relative to the tree model 401 of the patient’s luminal network.
  • In step 303, computing device 100 detects that the catheter 12 is positioned at (or near) the parent node 403.
  • For example, computing device 100 may analyze the images captured by the camera 126 of the catheter 12 and determine that an image contains distinctive features that correspond to a bifurcation (e.g., the parent node 403) within the luminal network.
  • In an aspect, the computing device 100 generates virtual fly-through images of the luminal network based on the CT data, and the images captured by the camera 126 are compared to the virtual fly-through images to detect which bifurcation within the luminal network corresponds to the bifurcation in the captured image.
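  • The disclosure does not specify the similarity measure used to compare camera frames with the virtual fly-through renderings. A minimal sketch using normalized cross-correlation over candidate bifurcation views follows; the mapping-of-views interface is an illustrative assumption.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equal-size grayscale images."""
    a = (a - a.mean()) / (a.std() + 1e-8)
    b = (b - b.mean()) / (b.std() + 1e-8)
    return float((a * b).mean())

def match_bifurcation(camera_frame, virtual_views):
    """Pick the CT-derived fly-through view most similar to the live frame.

    virtual_views: mapping of node id -> virtual fly-through rendering.
    Returns the node id whose rendering best matches camera_frame.
    """
    return max(virtual_views,
               key=lambda nid: ncc(camera_frame, virtual_views[nid]))
```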
  • In another aspect, the computing device 100 conducts image recognition on an image 601 captured by the camera 126 based on the number of pixels forming dark areas, and when two dark areas are present within an image (e.g., first dark area 603 and second dark area 605), the computing device 100 determines that a bifurcation is present within the image.
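  • A minimal sketch of the dark-area test described above, assuming an 8-bit grayscale frame and using connected-component labeling; the intensity threshold and minimum region size are illustrative values, not taken from the disclosure.

```python
import numpy as np
from scipy import ndimage

def count_dark_areas(frame, thresh=40, min_pixels=200):
    """Count large dark regions (candidate lumen openings) in a frame.

    frame: 2-D uint8 grayscale image such as image 601; distal lumens
    appear as dark holes. Returns (count, centroids in pixel coords).
    """
    dark = frame < thresh                          # candidate lumen pixels
    labels, n = ndimage.label(dark)                # connected components
    sizes = ndimage.sum(dark, labels, index=range(1, n + 1))
    keep = [i + 1 for i, s in enumerate(sizes) if s >= min_pixels]
    centroids = ndimage.center_of_mass(dark, labels, keep) if keep else []
    return len(keep), centroids

def depicts_bifurcation(frame):
    """Two distinct dark areas (e.g., 603 and 605) imply a bifurcation."""
    count, _ = count_dark_areas(frame)
    return count == 2
```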
  • In step 305, the computing device 100 detects that the catheter 12 has been moved to the child node 405 (e.g., along a piece 407 connecting the child node 405 to the parent node 403). As with the detection of the parent node 403, the computing device 100 may analyze the images captured by the camera 126 of the catheter 12 and determine that an image contains distinctive features that correspond to a bifurcation (e.g., the child node 405) within the luminal network, and may also factor the distance traveled from the parent node 403 in determining whether a potential image includes a bifurcation corresponding to the child node 405.
  • In an aspect, the computing device 100 may utilize image-based depth sensing techniques to calculate a distance between the catheter 12 and the bifurcation that is depicted in the captured image.
  • The calculated distance between the catheter 12 and the bifurcation that is depicted in the captured image may be factored, along with the location data of the tracking system 50, in determining the actual location of the catheter 12 within the luminal network.
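  • How the image-derived depth is fused with the electromagnetic location data is left open by the disclosure. One simple possibility, sketched below with illustrative names, is to project the estimated depth along the camera's viewing axis from the tracked tip pose of the catheter 12.

```python
import numpy as np

def node_coordinates(tip_position, tip_rotation, depth_mm,
                     view_axis=(0.0, 0.0, 1.0)):
    """Estimate the bifurcation's position in the tracking frame.

    tip_position: (3,) EM-tracked position of the catheter 12 tip.
    tip_rotation: (3, 3) EM-tracked orientation of the tip (sensor 44).
    depth_mm: camera-to-bifurcation distance from image depth sensing.
    view_axis: camera boresight in the tip's local frame (an assumption).
    """
    d = np.asarray(view_axis, dtype=float)
    d /= np.linalg.norm(d)
    return np.asarray(tip_position, dtype=float) + depth_mm * (tip_rotation @ d)
```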
  • In an aspect, deep learning techniques may be utilized to conduct the image analysis of the captured images for determining whether an image depicts a bifurcation representing a node in the tree model 401.
  • In an aspect, computing device 100 may analyze the captured images by detecting lumens within the image and calculating an angle and distance between the lumens detected in the captured image.
  • The distinct angle and distance between the lumens detected in the captured image may be considered when determining the proper translation and rotation factor to apply to the registration update. Additionally, or alternatively, the computing device 100 may factor any physical lumen interaction (e.g., as sensed by the camera 126 and/or by a force sensor connected to the catheter 12) between the catheter 12 and a wall or anatomical feature of the luminal network in steps 303 and/or 305 to detect a parent node 403 and/or a child node 405.
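  • As a sketch of the angle-and-distance analysis above: the line joining the two detected lumen centroids can be compared against the same line predicted from the CT data at the corresponding node, yielding a candidate roll correction and a coarse depth cue. The signature and its interpretation below are assumptions for illustration, not the disclosure's stated computation.

```python
import numpy as np

def lumen_signature(c1, c2):
    """Angle (rad) and length (px) of the line joining two lumen centroids."""
    v = np.asarray(c2, dtype=float) - np.asarray(c1, dtype=float)
    return np.arctan2(v[1], v[0]), float(np.linalg.norm(v))

def registration_cues(observed_pair, expected_pair):
    """Compare observed vs CT-predicted lumen pairs at the same node.

    Each argument is a (c1, c2) pair of centroids. Returns
    (roll_offset_rad, separation_ratio): the roll offset suggests a rotation
    correction, and the separation ratio is a coarse depth/translation cue.
    """
    ang_o, d_o = lumen_signature(*observed_pair)
    ang_e, d_e = lumen_signature(*expected_pair)
    roll = (ang_o - ang_e + np.pi) % (2 * np.pi) - np.pi   # wrap to [-pi, pi)
    return roll, d_o / max(d_e, 1e-8)
```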
  • In step 307, the computing device 100 determines the actual coordinates (AC) of the child node 405 based on the location data derived from the tracking system 50 (e.g., directly corresponding to the location of the catheter indicator 409).
  • In step 309, the actual coordinates of the child node 405 are compared to the expected coordinates (EC) of the child node 405.
  • The expected coordinates of the child node 405 may be determined by the computing device 100 based on the CT data.
  • Step 309 may include determining the distance between the actual coordinates of the child node 405 and the expected coordinates of the child node 405, determining the offset between the two, and/or determining a rotation factor between the two.
  • In step 311, a determination is made as to whether the difference determined in step 309 exceeds a predetermined threshold, which may distinguish between an instance where the location data and the CT data are misaligned due to CT-to-body divergence and an instance where the catheter 12 has been navigated through a different path and is located at a different node than expected.
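  • Steps 309 and 311 reduce to a vector difference and a threshold test. The sketch below mirrors the branch taken in steps 313 and 314; the threshold value is illustrative, not taken from the disclosure.

```python
import numpy as np

def classify_node_error(actual_xyz, expected_xyz, threshold_mm=10.0):
    """Steps 309 and 311: compare actual vs expected node coordinates.

    Returns ("relocalize", offset) when the error is large (likely a wrong
    branch; step 313) or ("update_registration", offset) when it is small
    (likely CT-to-body divergence; step 314).
    """
    offset = (np.asarray(actual_xyz, dtype=float)
              - np.asarray(expected_xyz, dtype=float))
    if float(np.linalg.norm(offset)) > threshold_mm:
        return "relocalize", offset
    return "update_registration", offset
```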
  • If, in step 311, it is determined that the difference between the actual coordinates of the child node 405 (e.g., the location of the catheter 12) and the expected coordinates of the child node 405 exceeds the predetermined threshold (YES in step 311), then method 300 proceeds to step 313, where computing device 100 determines the correct (e.g., actual) child node 405 corresponding to the location of the catheter 12 and either repositions the catheter indicator 409 to the correct child node 405 or repositions the correct child node 405 to the catheter indicator 409 to align the location data with the CT data.
  • Step 313 may be carried out by computing device 100 by performing an image analysis of the captured image corresponding to the location of the catheter 12.
  • In an aspect, step 313 additionally includes generating a notification to alert the clinician that the catheter 12 has entered a wrong or unintended branch of the luminal network.
  • Alternatively, if in step 311 it is determined that the difference between the actual coordinates of the child node 405 (e.g., the location of the catheter 12) and the expected coordinates of the child node 405 does not exceed the predetermined threshold (NO in step 311), then method 300 proceeds to step 314, where computing device 100 updates the registration between the location data and the CT data based on one or more of a translation or rotation factor offset. In step 314, the offset is applied to the child node 405 and all offspring (e.g., downstream child nodes) of the parent node 403.
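  • Applying the offset to the child node 405 and all of its offspring is a straightforward tree traversal. The node structure below is an assumed stand-in for the tree model 401, and only the translation component is shown for brevity.

```python
from dataclasses import dataclass, field

import numpy as np

@dataclass
class LumenNode:
    """A bifurcation in a stand-in for tree model 401 (CT coordinates)."""
    coords: np.ndarray
    children: list = field(default_factory=list)

def apply_offset(node, offset):
    """Step 314: translate a node and every downstream child node."""
    node.coords = node.coords + np.asarray(offset, dtype=float)
    for child in node.children:
        apply_offset(child, offset)
```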
  • In an aspect, the computing device 100 may estimate expected changes in the luminal network (e.g., using an atlas or a learning-based model) based on previous, similar cases, and the expected changes may be considered in generating the CT data with predictions. In such a case, during a navigation procedure, the computing device 100 may compare the expected changes to the actual changes and determine whether the actual changes of the luminal network during navigation differ from the estimated expected changes. If the difference between the expected changes and the actual changes exceeds a predetermined threshold, the computing device 100 may generate a notification to alert the clinician of an abnormality in the procedure.
  • Example 1 - A navigation system including a catheter configured to be navigated through a luminal network and to capture images during navigation; a computing device including a processor and memory storing instructions which, when executed by the processor, cause the computing device to: register data, detected by the catheter in the luminal network, to CT data of the luminal network; detect when the catheter is located at a node of the luminal network; determine coordinates of the detected node; determine a difference between the determined coordinates of the node and expected coordinates of the node; and determine whether the difference between the determined coordinates and the expected coordinates is greater than a predetermined threshold.
  • Example 2 - The navigation system according to example 1, wherein the computing device is configured to determine coordinates of the detected node based on data detected from the catheter.
  • Example 3 - The navigation system according to any of examples 1 or 2, wherein the computing device is configured to: perform image depth sensing to determine a distance between the catheter and the detected node; and determine coordinates of the detected node based on data detected from the catheter and the determined distance between the catheter and the detected node.
  • Example 4 - The navigation system according to any of examples 1-3, wherein the computing device is configured to update registration of the location data to the CT data of the luminal network if it is determined that the difference between the determined coordinates and the expected coordinates is greater than the predetermined threshold.
  • Example 5 - The navigation system according to any of examples 1-4, wherein the computing device is configured to determine a node corresponding to a location of the catheter if it is determined that the difference between the determined coordinates and the expected coordinates is not greater than the predetermined threshold.
  • Example 6 - The navigation system according to any of examples 1-5, wherein the computing device is configured to detect when the catheter is at the node based on an image analysis of the images captured during navigation.
  • Example 7 - The navigation system according to any of examples 1-6, wherein the computing device is further configured to: estimate expected changes of the luminal network during navigation; and determine if changes of the luminal network during navigation differ from the estimated expected changes.
  • Example 8 - The navigation system according to any of examples 1-7, wherein the computing device is configured to: analyze a captured image of the node to calculate an angle and a distance between two lumens at the node; and determine translation and rotation differences between the catheter data and the CT data based on the calculated angle and distance between the two lumens at the node.
  • Example 9 - A navigation system including a catheter configured to navigate along a path through a luminal network, the catheter including a sensor; a tracking system operably coupled to the catheter and configured to generate location data corresponding to locations of the catheter within the luminal network based on signals received from the sensor as the catheter is navigated through the luminal network; and a computing device operably coupled to the catheter and the tracking system, the computing device including a processor and memory storing instructions which, when executed by the processor, cause the computing device to: register the generated location data to CT data of the luminal network; detect when the catheter is located at a node of the luminal network; determine coordinates of the detected node; determine a difference between the determined coordinates of the node and expected coordinates of the node; and determine if the difference between the determined coordinates and the expected coordinates is greater than a predetermined threshold.
  • Example 10 - The navigation system according to example 9, wherein the computing device is configured to determine coordinates of the detected node based on data sensed from the catheter.
  • Example 11 - The navigation system according to any of examples 9 or 10, wherein the computing device is configured to: perform image depth sensing to determine a distance between the catheter and the detected node; and determine coordinates of the detected node based on data sensed from the catheter and the determined distance between the catheter and the detected node.
  • Example 12 - The navigation system according to any of examples 9-11, wherein the computing device is configured to update registration of the data to the CT data of the luminal network if it is determined that the difference between the determined coordinates and the expected coordinates is greater than the predetermined threshold.
  • Example 13 - The navigation system according to any of examples 9-12, wherein the computing device is configured to determine a node corresponding to a location of the catheter if it is determined that the difference between the determined coordinates and the expected coordinates is not greater than the predetermined threshold.
  • Example 14 - The navigation system according to any of examples 9-13, wherein the computing device is configured to detect when the catheter is at the node based on an image analysis of the images captured during navigation.
  • Example 15 - The navigation system according to any of examples 9-14, wherein the computing device is further configured to: estimate expected changes of the luminal network during navigation; and determine if changes of the luminal network during navigation differ from the estimated expected changes.

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Veterinary Medicine (AREA)
  • Molecular Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Pathology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Robotics (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

A surgical navigation system includes a navigation catheter and a computing device. The computing device is configured to register data, detected by a sensor in a luminal network, to CT data of the luminal network, and to detect when the catheter is located at a node of the luminal network. The computing device is further configured to determine coordinates of the detected node, determine a difference between the determined coordinates of the node and expected coordinates of the node, and determine whether the difference between the determined coordinates and the expected coordinates is greater than a predetermined threshold.
PCT/IB2024/055206 2023-06-08 2024-05-29 System and method for updating registration and localization during surgical navigation Pending WO2024252227A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US202363471782P 2023-06-08 2023-06-08
US63/471,782 2023-06-08
US18/669,674 US20240407741A1 (en) 2023-06-08 2024-05-21 System and method for updating registration and localization during surgical navigation
US18/669,674 2024-05-21

Publications (1)

Publication Number Publication Date
WO2024252227A1 2024-12-12

Family

ID=91465302

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2024/055206 Pending WO2024252227A1 (fr) System and method for updating registration and localization during surgical navigation

Country Status (1)

Country Link
WO (1) WO2024252227A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160073928A1 (en) * 2003-12-12 2016-03-17 University Of Washington Catheterscope 3d guidance and interface system
US20180368920A1 (en) * 2017-06-23 2018-12-27 Auris Health, Inc. Robotic systems for determining a roll of a medical device in luminal networks
WO2022035584A1 (fr) * 2020-08-13 2022-02-17 Intuitive Surgical Operations, Inc. Alerte et atténuation de divergence d'emplacements de caractéristiques anatomiques à partir d'images antérieures à l'interrogation en temps réel
WO2022123577A1 (fr) * 2020-12-10 2022-06-16 Magnisity Ltd. Suivi de déformation dynamique de bronchoscopie de navigation


Similar Documents

Publication Publication Date Title
US11341692B2 (en) System and method for identifying, marking and navigating to a target using real time two dimensional fluoroscopic data
US11622815B2 (en) Systems and methods for providing proximity awareness to pleural boundaries, vascular structures, and other critical intra-thoracic structures during electromagnetic navigation bronchoscopy
US12059281B2 (en) Systems and methods of fluoro-CT imaging for initial registration
US20230172670A1 (en) Systems and methods for visualizing navigation of medical devices relative to targets
EP3164050B1 Dynamic 3D lung map view for tool navigation inside the lung
AU2017312764B2 (en) Method of using soft point features to predict breathing cycles and improve end registration
EP3910591B1 Mapping disease spread
EP3607906A1 Identification and notification of tool displacement during a medical procedure
US20250152252A1 MRI based navigation
US20240225584A1 (en) Systems and methods of assessing breath hold during intraprocedural imaging
US20240407741A1 (en) System and method for updating registration and localization during surgical navigation
WO2024252227A1 (fr) Système et procédé de mise à jour d'enregistrement et de localisation pendant une navigation chirurgicale
WO2024241218A1 (fr) Système et procédé de mise à jour d'enregistrement et de localisation pendant une navigation chirurgicale
EP4601574A1 (fr) Systèmes et procédés de déplacement d'un instrument médical avec une cible dans un système de visualisation ou un système robotique pour des rendements supérieurs

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 24732077

Country of ref document: EP

Kind code of ref document: A1