
WO2025054333A1 - Needle trajectory prediction - Google Patents

Needle trajectory prediction

Info

Publication number
WO2025054333A1
Authority
WO
WIPO (PCT)
Prior art keywords
trajectory
sampling device
ultrasound
real
fov
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/US2024/045395
Other languages
English (en)
Inventor
Vignesh MANDALAPA BHOOPATHY
Joshua Michael ADLER
Richard Rhodes
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Veran Medical Technologies Inc
Original Assignee
Veran Medical Technologies Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Veran Medical Technologies Inc filed Critical Veran Medical Technologies Inc
Publication of WO2025054333A1
Pending legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/08 Clinical applications
    • A61B 8/0833 Clinical applications involving detecting or locating foreign bodies or organic structures
    • A61B 8/0841 Clinical applications involving detecting or locating foreign bodies or organic structures for locating instruments
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/08 Clinical applications
    • A61B 8/0833 Clinical applications involving detecting or locating foreign bodies or organic structures
    • A61B 8/085 Clinical applications involving detecting or locating foreign bodies or organic structures for locating body or organic structures, e.g. tumours, calculi, blood vessels, nodules
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/12 Diagnosis using ultrasonic, sonic or infrasonic waves in body cavities or body tracts, e.g. by using catheters
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/44 Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
    • A61B 8/4444 Constructional features of the ultrasonic, sonic or infrasonic diagnostic device related to the probe
    • A61B 8/445 Details of catheter construction
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/46 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B 8/461 Displaying means of special interest
    • A61B 8/465 Displaying means of special interest adapted to display user selection data, e.g. icons or menus

Definitions

  • the present document relates generally to ultrasound equipment, and more particularly but without limitation, to ultrasound-guided tissue acquisition equipment including a graphical user interface to assist with navigation of a tissue sampling device to a target to collect tissue samples.
  • Endoscopes have been used in a variety of clinical procedures, including, for example, illuminating, imaging, detecting and diagnosing one or more disease states, providing fluid delivery (e.g., saline or other preparations via a fluid channel) toward an anatomical region, providing a passage (e.g., via a working channel) for passing a tissue sampling or biopsy device to collect tissue samples from an anatomical target or for passing a diagnostic or therapeutic device for medical diagnosis or treatment, or providing a suction passage for collecting fluids and unwanted objects (e.g., tissue or calculi structures) from an anatomical region, among other procedures.
  • the anatomical regions or targets to be intervened on include the gastrointestinal tract, respiratory tract and lungs, renal system organs and tissues, sinus cavities, submucosal regions, and reproductive system organs, among others.
  • Some endoscopes can be used with an energy source, such as a laser or plasma system, to provide treatment energy (e.g., laser pulses) to the anatomical target such as soft or hard tissue or calculi structures to achieve various treatment goals.
  • endoscopic lasers have been used in applications including tissue ablation, coagulation, vaporization, fragmentation, and lithotripsy to break down calculi in the kidney, gallbladder, or ureter, among other stone-forming regions, or to ablate large calculi into smaller fragments.
  • Endoscopic ultrasound (EUS) is a specialized endoscopy that combines conventional endoscopy with ultrasound imaging to obtain ultrasound images of an anatomical target or a region of interest.
  • Such a specialized endoscope, also known as an echoendoscope, includes an ultrasound transducer typically located at the distal portion of the elongate endoscope body. The ultrasound transducer emits ultrasound waves toward an anatomical target, and converts the ultrasound echoes into ultrasound images (e.g., a live image stream).
  • EUS has been used in diagnosing lung disease.
  • Endobronchial ultrasound (EBUS) is a minimally invasive and highly effective procedure to diagnose lung cancer, infections, and other diseases causing enlarged lymph nodes in the chest.
  • EBUS is typically performed with the use of a specialized bronchoscope associated with an ultrasound transducer or removable ultrasound probe delivered through a working channel and out of a port located at a distal end of the bronchoscope.
  • EBUS has been used in a tissue sampling or biopsy procedure, known as an endobronchial ultrasound-guided transbronchial fine needle aspiration (EBUS-TBNA) procedure, where a specialized sampling device, such as a biopsy needle, may be passed down and then extended from the bronchoscope, and under the real-time guidance of EBUS, to collect tissue samples from peribronchial masses such as peribronchial nodules or tissue from peribronchial lymph nodes.
  • the sampled tissue may be analyzed to assist in diagnosis of various diseases such as tuberculosis, sarcoid, or cancer.
  • the present inventors have recognized several technological problems to be solved with conventional EUS systems and technologies, particularly those related to EBUS-TBNA procedures.
  • One such technological problem is that the existing endobronchial sampling devices are generally designed for procedures involving shallow and large-diameter airways of the respiratory system, within which a clinician may freely advance, retract, or angulate a transbronchial aspiration or sampling device (e.g., a needle) to align a nominal trajectory.
  • the sampling device may be manipulated to traverse along the nominal trajectory and finally to intersect an anatomical target of interest, such as a peribronchial nodule.
  • if a target nodule becomes clearly imaged at a location below some nominal trajectory line, the clinician may find it difficult to further advance the sampling device into the airway to align the target nodule with the nominal trajectory (e.g., to shove or force a 1.9 mm OD sampling device into a portion of the airway that has less than a 1.9 mm ID), because further advancing the sampling device may cause trauma to the airway wall or other unintended anatomical structures.
  • a saline-filled balloon that encompasses an ultrasound transducer is typically used to achieve apposition, which refers to an appropriate level of direct contact or air-gapless contact and pressure needed for ultrasound energy to propagate from and back to the ultrasound transducer.
  • the presence of air gaps between the balloon and the contacting tissue may result in a high acoustic impedance mismatch at typical frequencies of the ultrasound transducer. This may cause deterioration in image quality or even a total loss of image beyond the air gap.
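The impedance-mismatch point above can be made concrete with a back-of-the-envelope calculation. The sketch below is not from the patent; the impedance values are standard textbook figures. It computes the fraction of ultrasound intensity reflected at a tissue/air interface versus a tissue/saline interface, which illustrates why a saline-filled balloon restores imaging while an air gap effectively blocks it.

```python
# Back-of-the-envelope illustration (not from the patent) of why an air gap
# blocks imaging while a saline-filled balloon restores it. The intensity
# reflection coefficient at an interface is R = ((Z2 - Z1) / (Z2 + Z1))**2,
# where Z is acoustic impedance; the values below are textbook figures.
Z_TISSUE = 1.63e6  # Pa*s/m, typical soft tissue
Z_AIR    = 4.0e2   # Pa*s/m, air
Z_SALINE = 1.48e6  # Pa*s/m, water/saline

def reflection_coefficient(z1: float, z2: float) -> float:
    """Fraction of incident ultrasound intensity reflected at a z1/z2 interface."""
    return ((z2 - z1) / (z2 + z1)) ** 2

print(f"tissue/air:    R = {reflection_coefficient(Z_TISSUE, Z_AIR):.4f}")    # ~0.999
print(f"tissue/saline: R = {reflection_coefficient(Z_TISSUE, Z_SALINE):.4f}") # ~0.002
```

Nearly all of the incident energy is reflected at a tissue/air boundary, so little returns from beyond the gap, consistent with the image loss described above.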
  • existing EBUS-TBNA systems generally rely on the operator visually confirming that the tissue sampling needle penetrates a desired lesion (or a portion thereof) as the needle traverses toward and ultimately enters the target within an ultrasound field of view (FOV).
  • existing EBUS-TBNA systems generally do not provide adequate visual aids to assist the clinician with navigating the sampling needle to the target, such as precisely aligning the target with the nominal needle trajectory prior to extension of the sampling needle out of the echoendoscope and prior to the sampling needle entering the ultrasound FOV.
  • Another technical problem with present EBUS-TBNA systems relates to uncertainty of the initial nominal needle trajectory, or a deviation from it. Obtaining adequate biopsy samples during an EBUS-TBNA procedure requires accurately hitting a target nodule with the sampling needle. However, during an EBUS-TBNA procedure, the needle trajectory may vary depending at least on the initial needle posture and puncturing angle when the airway wall is first penetrated after leaving the side exit ramp.
  • the nominal needle trajectory is typically determined to be the linear axis of a side exit ramp from which the sampling needle exits the echoendoscope.
  • a slight deviation in the puncture angle at which the sampling needle penetrates the airway wall may result in a substantial deviation from the nominal needle trajectory as the sampling needle moves farther across the ultrasound FOV and deeper into the patient’s tissue.
  • existing EBUS-TBNA systems generally do not account for an actual traversed needle trajectory or real-time needle trajectory parameters to provide real-time feedback as to whether a needle being extended will intersect or miss the anatomical target, much less a quantitative assessment such as a probability of intersecting the target.
  • when a target lesion is very large and takes up substantially all of the ultrasound FOV produced by a linear or curvilinear ultrasound transducer, even roughly aligning the lesion within the ultrasound FOV is precise enough because of the large size of the target lesion.
  • solitary pulmonary nodules may be very small (e.g., 3 millimeters (mm) in diameter or less) and may be upwards of 10-20 mm deep in the tissue. As such, more precision is needed than the mere rough alignment typically performed with existing systems.
  • the present inventors have recognized an unmet need for improved technologies that can effectively produce clear ultrasound images of an anatomical target during an EBUS-TBNA procedure, while maintaining, and dynamically adjusting as needed, proper alignment of the anatomical target with a nominal trajectory upon extending the sampling device into the lung periphery (e.g., primary, secondary, or tertiary bronchi, or bronchioles).
  • An exemplary system comprises an ultrasound imaging device to generate ultrasound images of an anatomical target in a real-time ultrasound field of view (FOV) of a region of interest, a display to display the ultrasound images in the real-time ultrasound FOV, and a controller circuit.
  • prior to an extension of a tissue sampling device into the real-time ultrasound FOV, the controller circuit can display graphical user interface elements (UIEs) indicating a predicted nominal sampling device trajectory that intersects the anatomical target in the real-time ultrasound FOV.
  • upon the extension of the tissue sampling device into the real-time ultrasound FOV, the controller circuit can determine one or more real-time trajectory parameters based on an analysis of the ultrasound images, and update the UIEs to represent an actual sampling device trajectory.
  • the controller circuit can display the updated UIEs on the real-time ultrasound FOV to provide visual feedback regarding whether the tissue sampling device will intersect the anatomical target.
  • the controller circuit can determine an intersection probability, and dynamically update the sampling device trajectory based at least in part on the determined one or more real-time trajectory parameters.
  • Example 1 is a system for planning an ultrasound-guided tissue acquisition procedure.
  • the system includes: an ultrasound imaging device configured to generate ultrasound images of an anatomical target in a real-time ultrasound field of view (FOV) of a region of interest; a display configured to display the ultrasound images in the real-time ultrasound FOV; and a controller circuit configured to: prior to an extension of a tissue sampling device into the real-time ultrasound FOV, display graphical user interface elements (UIEs) indicating a predicted sampling device trajectory that intersects the anatomical target on the real-time ultrasound FOV; upon the extension of the tissue sampling device into the real-time ultrasound FOV, determine one or more real-time trajectory parameters based on an analysis of the ultrasound images; update the UIEs based on the determined one or more real-time trajectory parameters to represent at least one of an actual sampling device trajectory or an updated predicted sampling device trajectory; and display the updated UIEs on the real-time ultrasound FOV to provide a visual feedback regarding whether the tissue sampling device will intersect the anatomical target.
  • In Example 2, the subject matter of Example 1 optionally includes the controller circuit that can be configured to: determine a location of the anatomical target within the real-time ultrasound FOV; and determine a hit probability of the tissue sampling device intersecting the anatomical target based at least in part on the determined target location within the real-time ultrasound FOV.
  • In Example 3, the subject matter of Example 2 optionally includes, wherein to determine the location of the anatomical target, the controller circuit is configured to: receive, via a user interface, a user input identifying the anatomical target within the real-time ultrasound FOV; and analyze an ultrasound image within the real-time ultrasound FOV to determine the location of the anatomical target as identified by the user.
  • In Example 4, the subject matter of any one or more of Examples 1-3 optionally includes the one or more real-time trajectory parameters that can include one or more of a position, a posture, a heading direction, or an entering angle of the tissue sampling device upon entering the real-time ultrasound FOV.
  • In Example 5, the subject matter of Example 4 optionally includes the controller circuit that can be configured to determine the predicted sampling device trajectory further based on a posture or an entering angle of the tissue sampling device upon entering the real-time ultrasound FOV.
  • In Example 6, the subject matter of any one or more of Examples 1-5 optionally includes the controller circuit that can be configured to determine the predicted sampling device trajectory further based on a hardware configuration of the tissue sampling device, including a pre-formed curvature or bending angle of the tissue sampling device.
  • In Example 7, the subject matter of any one or more of Examples 1-6 optionally includes the controller circuit that can be configured to determine at least one of the predicted sampling device trajectory or the updated predicted sampling device trajectory using population data including sampling device trajectories collected from similar ultrasound-guided tissue acquisition procedures performed on a patient population.
  • In Example 8, the subject matter of any one or more of Examples 1-7 optionally includes the controller circuit that can be further configured to: determine an upper trajectory limit and a lower trajectory limit that define a trajectory range within which an intersection with the anatomical target is likely to occur at a specific confidence level; and prior to the extension of the tissue sampling device into the real-time ultrasound FOV, display UIEs indicating the determined upper trajectory limit and the lower trajectory limit on the real-time ultrasound FOV.
  • In Example 9, the subject matter of Example 8 optionally includes the controller circuit that can be configured to determine the upper trajectory limit and the lower trajectory limit using population data including sampling device trajectories collected from similar ultrasound-guided tissue acquisition procedures performed on a patient population.
  • In Example 10, the subject matter of any one or more of Examples 1-9 optionally includes the controller circuit that can be configured to determine a hit probability of the tissue sampling device intersecting the anatomical target based at least in part on the actual sampling device trajectory or the one or more real-time trajectory parameters relative to the anatomical target within the real-time ultrasound FOV.
  • In Example 11, the subject matter of Example 10 optionally includes the controller circuit that can be configured to: in response to the hit probability being lower than a probability threshold, provide a recommendation to withdraw the tissue sampling device or to adjust a position or posture of the tissue sampling device relative to the anatomical target within the real-time ultrasound FOV; and in response to the hit probability being greater than the probability threshold, provide a notification to a user to extend the tissue sampling device in conformity with the predicted sampling device trajectory.
  • In Example 12, the subject matter of any one or more of Examples 1-11 optionally includes the controller circuit that can be further configured to: dynamically update the predicted sampling device trajectory based at least in part on the determined one or more real-time trajectory parameters; and update the UIEs to represent the updated predicted sampling device trajectory.
  • In Example 13, the subject matter of Example 12 optionally includes the controller circuit that can be further configured to dynamically update one or more of an upper trajectory limit or a lower trajectory limit, the upper trajectory limit and the lower trajectory limit defining a trajectory range within which an intersection with the anatomical target is likely to occur at a specific confidence level.
  • In Example 14, the subject matter of any one or more of Examples 12-13 optionally includes the controller circuit that can be further configured to determine an updated hit probability of the tissue sampling device intersecting the anatomical target when traversing along the updated predicted sampling device trajectory.
  • In Example 15, the subject matter of any one or more of Examples 1-14 optionally includes the ultrasound imaging device that can be located at a distal end of an endobronchial sampling device configured to be inserted into a patient's airway during an endobronchial ultrasound-guided transbronchial fine needle aspiration (EBUS-TBNA) procedure.
  • Example 16 is an intraluminal imaging device system that includes: an intraluminal imaging device, comprising a tubular device body having an inner lumen and a side exit port, and an ultrasound transducer configured to produce ultrasound scans of an anatomical target and generate ultrasound images in a real-time ultrasound field of view (FOV) of a region of interest; a display configured to display the ultrasound images in the real-time ultrasound FOV; a tissue sampling device configured to operatively, under ultrasound guidance, pass through the inner lumen and exit from the side exit port of the tubular device body, and to advance toward the anatomical target in accordance with a predicted nominal trajectory and subsequently intersect the anatomical target; and a controller circuit configured to: prior to an extension of the tissue sampling device into the real-time ultrasound FOV, display graphical user interface elements (UIEs) indicating a predicted sampling device trajectory that intersects the anatomical target on the real-time ultrasound FOV; upon the extension of the tissue sampling device into the real-time ultrasound FOV, analyze the ultrasound images to determine one or more real-time trajectory parameters; update the UIEs based on the determined one or more real-time trajectory parameters to represent at least one of an actual sampling device trajectory or an updated predicted sampling device trajectory; and display the updated UIEs on the real-time ultrasound FOV to provide a visual feedback regarding whether the tissue sampling device will intersect the anatomical target.
  • In Example 17, the subject matter of Example 16 optionally includes the tissue sampling device that can include a needle, a brush, a snare, a suction device, forceps, or an assistive insertion device.
  • In Example 18, the subject matter of any one or more of Examples 16-17 optionally includes the tissue sampling device that can include a transbronchial needle for sampling tissue from a lung periphery target during an endobronchial ultrasound-guided transbronchial fine needle aspiration (EBUS-TBNA) procedure.
  • In Example 19, the subject matter of any one or more of Examples 16-18 optionally includes the tissue sampling device that can include a needle-stylet combination comprising a needle and a stylet insertable into the needle, wherein one or more of the stylet or the needle has a pre-formed curvature or bending angle substantially conforming to the predicted nominal sampling device trajectory.
  • In Example 20, the subject matter of any one or more of Examples 16-19 optionally includes the controller circuit that can be configured to: determine a location of the anatomical target within the real-time ultrasound FOV; and determine a hit probability of the tissue sampling device intersecting the anatomical target based at least in part on the determined target location within the real-time ultrasound FOV.
  • In Example 21, the subject matter of any one or more of Examples 16-20 optionally includes the controller circuit that can be further configured to: determine an upper trajectory limit and a lower trajectory limit that define a trajectory range within which an intersection with the anatomical target is likely to occur at a specific confidence level; and prior to the extension of the tissue sampling device into the real-time ultrasound FOV, display UIEs indicating the determined upper trajectory limit and the lower trajectory limit on the real-time ultrasound FOV.
  • In Example 22, the subject matter of any one or more of Examples 16-21 optionally includes the controller circuit that can be configured to: determine a hit probability of the tissue sampling device intersecting the anatomical target based at least in part on the actual sampling device trajectory relative to the anatomical target within the real-time ultrasound FOV; and display the hit probability on the display.
  • In Example 23, the subject matter of Example 22 optionally includes the controller circuit that can be configured to: in response to the hit probability being lower than a probability threshold, provide a recommendation to withdraw the tissue sampling device or to adjust a position or posture of the tissue sampling device relative to the anatomical target within the FOV; and in response to the hit probability being greater than the probability threshold, provide a notification to a user to extend the tissue sampling device in conformity with the predicted sampling device trajectory.
  • In Example 24, the subject matter of any one or more of Examples 16-23 optionally includes the controller circuit that can be further configured to: determine whether the tissue sampling device has intersected the anatomical target within the real-time ultrasound FOV based on an analysis of the ultrasound images; in response to the analysis indicating the tissue sampling device has not intersected the anatomical target, dynamically update the predicted sampling device trajectory based on the determined one or more real-time trajectory parameters; and update the UIEs to represent the updated predicted sampling device trajectory.
  • In Example 25, the subject matter of Example 24 optionally includes the controller circuit that can be further configured to, in response to the analysis indicating the tissue sampling device has not intersected the anatomical target, dynamically update one or more of an upper trajectory limit or a lower trajectory limit, the upper trajectory limit and the lower trajectory limit defining a trajectory range within which an intersection with the anatomical target is likely to occur at a specific confidence level.
  • In Example 26, the subject matter of any one or more of Examples 24-25 optionally includes the controller circuit that can be further configured to determine an updated hit probability of the tissue sampling device intersecting the anatomical target when traversing along the updated predicted sampling device trajectory.
  • Example 27 is a method for planning an ultrasound-guided tissue acquisition procedure.
  • the method includes steps of: generating and displaying, in a real-time ultrasound field of view (FOV) of a region of interest, ultrasound images of an anatomical target; prior to an extension of a tissue sampling device into the real-time ultrasound FOV, displaying graphical user interface elements (UIEs) indicating a predicted sampling device trajectory that intersects the anatomical target on the real-time ultrasound FOV; upon the extension of the tissue sampling device into the real-time ultrasound FOV, determining one or more real-time trajectory parameters based on an analysis of ultrasound images; updating the UIEs based on the determined one or more real-time trajectory parameters to represent at least one of an actual needle trajectory or an updated predicted sampling device trajectory; and displaying the updated UIEs on the real-time ultrasound FOV to provide a visual feedback regarding whether the needle will intersect the anatomical target.
  • In Example 28, the subject matter of Example 27 optionally includes determining a location of the anatomical target within the real-time ultrasound FOV; and determining a hit probability of the tissue sampling device intersecting the anatomical target based at least in part on the determined target location within the real-time ultrasound FOV.
  • In Example 29, the subject matter of any one or more of Examples 27-28 optionally includes determining an upper trajectory limit and a lower trajectory limit that define a trajectory range within which an intersection with the anatomical target is likely to occur at a specific confidence level; and displaying UIEs on the real-time ultrasound FOV indicating the determined upper trajectory limit and the lower trajectory limit.
  • In Example 30, the subject matter of any one or more of Examples 27-29 optionally includes: determining a hit probability of the tissue sampling device intersecting the anatomical target based at least in part on the actual needle trajectory relative to the anatomical target within the real-time ultrasound FOV; and displaying the hit probability.
  • In Example 31, the subject matter of Example 30 optionally includes, in response to the hit probability being lower than a probability threshold, providing a recommendation to withdraw the tissue sampling device or to adjust a position or posture of the tissue sampling device relative to the anatomical target within the FOV; and in response to the hit probability being greater than the probability threshold, providing a notification to a user to extend the tissue sampling device in conformity with the predicted sampling device trajectory.
  • In Example 32, the subject matter of any one or more of Examples 27-31 optionally includes: dynamically updating the predicted sampling device trajectory based on the determined one or more real-time trajectory parameters; and updating the UIEs to represent the updated predicted sampling device trajectory.
  • In Example 33, the subject matter of Example 32 optionally includes dynamically updating one or more of an upper trajectory limit or a lower trajectory limit, the upper trajectory limit and the lower trajectory limit defining a trajectory range within which an intersection with the anatomical target is likely to occur at a specific confidence level.
  • In Example 34, the subject matter of any one or more of Examples 32-33 optionally includes determining an updated hit probability of the tissue sampling device intersecting the anatomical target when traversing along the updated predicted sampling device trajectory.
  • FIG. 1 is a schematic diagram illustrating an example of an echoendoscopy system for use in an endoscopic ultrasound (EUS) procedure.
  • FIG. 2 illustrates a perspective view of a distal end portion of an echoendoscope such as that illustrated in FIG. 1.
  • FIG. 3 illustrates an example of an endobronchial ultrasound-guided transbronchial fine needle aspiration (EBUS-TBNA) procedure, and a portion of the EBUS-TBNA system being used in the procedure.
  • FIG. 4 is a block diagram illustrating an example of an EUS- guided tissue acquisition (EUS-TA) planning system that can automatically generate an EUS-TA plan for use in a medical procedure, such as an EBUS- TBNA procedure.
  • FIGS. 5A-5B, 6A-6B, and 7A-7B are diagrams illustrating an example endobronchial ultrasound (EBUS) system with an associated computing system that determines and displays on a user interface graphical representations of nominal sampling device trajectories and user interface elements (UIEs) at different operation states of an exemplary operational flow in an EBUS-TBNA procedure.
  • FIGS. 8A-8C are diagrams illustrating UIEs of hit probabilities that can be estimated and dynamically updated in association with the determination and dynamic update of the nominal needle trajectory and the associated upper trajectory limit (UTL) and lower trajectory limit (LTL), at various operation states as described above with respect to FIGS. 5A-5B, 6A-6B, and 7A-7B.
  • FIG. 9 is a flow chart illustrating an example method for generating an EUS-TA plan and presenting on a user interface visual aids to assist with navigation of a tissue sampling device during a medical procedure, such as an EBUS-TBNA procedure.
  • FIG. 10 is a block diagram illustrating an example machine upon which any one or more of the techniques (e.g., methodologies) discussed herein may be performed.
  • An exemplary system comprises an ultrasound imaging device to generate ultrasound images of an anatomical target in a real-time ultrasound field of view (FOV) of a region of interest, a display to display the ultrasound images in the real-time ultrasound FOV, and a controller circuit.
  • prior to an extension of a tissue sampling device (e.g., a biopsy needle) into the real-time ultrasound FOV, the controller circuit can display graphical user interface elements (UIEs) indicating a predicted nominal sampling device trajectory that intersects the anatomical target in the real-time ultrasound FOV.
  • the controller circuit can determine one or more real-time trajectory parameters based on an analysis of the ultrasound images, and update the UIEs to represent an actual sampling device trajectory.
  • the controller circuit can display the updated UIEs to provide a visual feedback regarding whether the tissue sampling device will intersect the anatomical target.
  • the controller circuit can determine an intersection probability, and dynamically update the predicted sampling device trajectory and the intersection probability based on the determined one or more real-time trajectory parameters.
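As a rough illustration of the feedback loop described above, the following Python sketch estimates an intersection ("hit") probability for a predicted trajectory and then refits the heading from device points observed in the live images. The straight-line motion model, the Gaussian angle uncertainty, the Monte Carlo scheme, and all names are illustrative assumptions, not the patent's stated method.

```python
# Hypothetical sketch of the predict-observe-update loop: estimate a hit
# probability for the current predicted trajectory, then refit the heading
# once device points are observed in the live images.
import numpy as np

def hit_probability(entry_xy, angle_rad, angle_sigma_rad,
                    target_xy, target_radius_mm,
                    length_mm=40.0, trials=5000, seed=0):
    """Monte Carlo estimate of P(straight device path intersects target disk)."""
    rng = np.random.default_rng(seed)
    angles = rng.normal(angle_rad, angle_sigma_rad, trials)
    d = np.asarray(target_xy, dtype=float) - np.asarray(entry_xy, dtype=float)
    # Closest approach of each sampled ray to the target center.
    along = np.clip(d[0] * np.cos(angles) + d[1] * np.sin(angles), 0.0, length_mm)
    miss = np.hypot(d[0] - along * np.cos(angles), d[1] - along * np.sin(angles))
    return float((miss <= target_radius_mm).mean())

def refit_heading(observed_xy):
    """Refit entry point and heading from device tip points segmented out of
    successive ultrasound frames after the device enters the FOV."""
    obs = np.asarray(observed_xy, dtype=float)
    dx, dy = obs[-1] - obs[0]
    return obs[0], float(np.arctan2(dy, dx))

# Example: probability before entry, then an updated one after two observed points.
p0 = hit_probability((0.0, 0.0), 0.6, 0.08, target_xy=(20.0, 14.0), target_radius_mm=2.0)
entry, angle = refit_heading([(0.0, 0.0), (5.0, 3.4)])
p1 = hit_probability(entry, angle, 0.03, target_xy=(20.0, 14.0), target_radius_mm=2.0)
```

In this toy version, observing the device in the FOV shrinks the angular uncertainty, which is what lets the updated probability give more decisive feedback than the pre-entry prediction.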
  • FIG. 1 is a schematic diagram illustrating an example of an echoendoscopy system 100 for use in an endoscopic ultrasound (EUS) procedure for diagnostic or treatment purposes, such as EUS-guided tissue acquisition.
  • the echoendoscopy system 100 may be configured for collecting tissue samples from targets located deep in the peripheral regions of the lungs (e.g., primary, secondary, or tertiary bronchi, or bronchioles) during an EBUS- TBNA procedure.
  • the echoendoscopy system 100 comprises an echoendoscope 120, a light source apparatus 130, a video processor 140, a first monitor 150 for displaying an optical image, an ultrasound observation apparatus 160, and a second monitor 170 for displaying an ultrasound image.
  • the echoendoscope 120 includes an insertion portion 111, an operation portion 112 from which the insertion portion 111 extends, and a universal cord 113 that extends from the operation portion 112.
  • the insertion portion 111 extends in a longitudinal direction and is configured to be inserted into a living body.
  • the universal cord 113 can be connected to the light source apparatus 130 via a scope connector 113A provided at a proximal end portion.
  • a coiled scope cable 114 and an ultrasound signal cable 115 extend from the scope connector 113A.
  • An electric connector portion 114A is provided at one end of the scope cable 114.
  • the electric connector portion 114A can be connected to the video processor 140.
  • An ultrasound connector portion 115A is provided at one end of the ultrasound signal cable 115.
  • the ultrasound connector portion 115A can be connected to the ultrasound observation apparatus 160.
  • the insertion portion 111 of the echoendoscope 120 can be configured such that a distal end portion 121, a bending portion 122, and a flexible tube portion 123 are consecutively connected in that order from the distal end.
  • Channel opening portions, an optical observation window, an optical illuminating window and an ultrasound transducer, or the like are arranged on one side of the distal end portion 121, as to be described further with reference to FIG. 2.
  • the operation portion 112 may include a bend preventing portion 124 from which the insertion portion 111 extends, a channel opening setting portion 125, an operation portion body 126 making up a grip portion, a bending operation portion 127 including two bending operation knobs 127A and 127B provided so as to superimpose on one another on one upper side of this operation portion body 126, a plurality of switches 128 that instruct the execution of various endoscope functions and a raising lever 129 for operating a raising stand.
  • the switches 128 include an air/water feeding button, a suction button and a freeze button.
  • the channel opening setting portion 125 is provided on one side in the lower part of the operation portion body 126 and provided with one or more ports each configured to receive respective treatment instruments.
  • two instrument ports 125A and 125B are disposed at the channel opening setting portion 125.
  • Such instrument ports can communicate with respective two channel opening portions provided at the distal end portion 121 of the insertion portion 111 via two respective treatment instrument channels (not shown) inside the insertion portion 111.
  • the instrument port 125A can receive a tissue acquisition tool, such as a fine needle for use in EUS-guided tissue acquisition, such as EUS-guided fine-needle aspiration (FNA) or fine-needle biopsy (FNB).
  • the instrument port 125B can receive a cannula for use in endoscopic retrograde cholangiopancreatography (ERCP).
  • a puncture needle handle portion Nh shown by a single-dot dashed line is fitted into the instrument port 125A.
  • the two instrument ports 125A and 125B can be arranged at the channel opening setting portion 125 such that when the operator brings the right hand RH close to the channel opening setting portion 125, the instrument port closer to the right hand RH becomes the instrument port 125B and the instrument port farther from the right hand RH becomes the instrument port 125A. More specifically, as shown by a dotted line in FIG. 1, the operator manipulates the treatment instrument inserted into each instrument port by the right hand RH while holding the operation portion body 126 by the left hand LH.
  • the manipulation using the treatment instrument such as ERCP cannula has a higher degree of difficulty than manipulation using a treatment instrument of an EUS-FNA puncture apparatus.
  • the instrument port 125B for a treatment instrument, such as a cannula requiring fine manipulation when the operator holds the operation portion body 126 by the left hand LH, is arranged at the channel opening setting portion 125 so as to be located on the right side compared to the instrument port 125A when seen from the operator.
  • the bending knob 127A is a vertical direction bending knob and the bending knob 127B is a horizontal direction bending knob.
  • a bending fixing lever 127A1 for fixing the vertical direction bending state is provided on the proximal end side of the bending knob 127A and a bending fixing lever 127B1 for fixing the horizontal direction bending state is provided on the distal end side of the bending knob 127B.
  • An image pickup section for acquiring an optical image inside a subject, and an illumination section and an ultrasound transducer section (see FIG. 2) for acquiring an ultrasound tomographic image inside the subject are provided at the distal end portion 121 of the echoendoscope 120. This allows the operator to insert the echoendoscope 120 into the subject and causes the monitors 150 and 170 to display an optical image and an ultrasound tomographic image inside the subject at a desired position in the subject respectively.
  • FIG. 2 illustrates a perspective view of the distal end portion 121 of the insertion portion 111 of the echoendoscope 120 as illustrated in FIG. 1.
  • the distal end portion 121 may include a metallic distal end rigid member 131 and a cylindrical synthetic resin cover member 132 in which the distal end rigid member 131 is inserted, such that the cover member 132 can partially cover the distal end rigid member 131.
  • An ultrasound transducer section 133 is accommodated within the distal end portion 121.
  • the ultrasound transducer section 133 may include an ultrasound transducer configured to emit ultrasound waves sideward at a predetermined angle with respect to an insertion axis of the insertion portion 111.
  • the ultrasound transducer may have a linear array, a curvilinear, or a phased array configuration.
  • the cylindrical synthetic resin cover member 132 provides insulation of the distal end portion 121, and allows the ultrasound transducer section 133 to be reliably fixed therein.
  • part of the opening portion of the cylindrical cover member 132 is covered with part of the distal end rigid member 131 on which an illuminating window 141 and an optical observation window 142 are arranged.
  • Optical light, emitted from a light source such as one located at the distal end portion 121 of the echoendoscope 120 and coupled to the light source apparatus 130, can pass through the illuminating window 141 and be incident on the anatomical target and surrounding environment.
  • the optical observation window 142 allows an imaging device (e.g., a camera lens, not shown) at the distal end portion 121 of the echoendoscope 120 to view target tissue.
  • The other part of the opening portion not covered with part of the distal end rigid member 131 forms an opening portion 144 from which a raising stand 151 protrudes when the raising stand 151 is raised.
  • One or more diagnostic or treatment instruments can be activated to protrude from the opening portion 144.
  • an instrument 240 may protrude from the opening portion 144 when the raising stand 151 is in a maximum raised position.
  • the instrument 240 may be inserted into one of the instrument ports on the operation portion 112 of the echoendoscope 120, such as the instrument port 125A, passed through a channel within the echoendoscope 120, and extended from the opening portion 144 at the distal end portion 121 (see FIG. 1).
  • the instrument 240 is a puncture device from which a needle 242 protrudes.
  • the needle 242 may be used to sample tissues from an anatomical target, such as a peripheral lung target in an EBUS-TBNA procedure.
  • other types of tissue sampling devices may be used and extended from the instrument 240, such as a brush, a snare, forceps, a suction device, among others.
  • Other devices that may be used with the instrument 240 may include, for example, an object retrieval device for retrieving biological matters (e.g., soft or hard tissue samples, cancerous tissue, or calculi structures), a resection device for surgically removing tissue, a diagnostic device for performing in vivo analysis of the sampled tissue and make diagnosis, or a therapeutic device for delivering therapeutic agents (e.g., cancer treatment drugs) or energy of different modalities (e.g., ultrasound, radiofrequency, laser, or thermal energy) to the anatomical target.
  • EUS-guided therapeutic devices may include an ablation device, a drainage device such as a needle or tube to drain pancreatic cysts or pseudocysts, or a stricture management device to open or dilate a narrowed or obstructed portion of a duct in the pancreaticobiliary system, among others.
  • the instrument 240 may house an assistive insertion device, such as a stylet, a guidewire, a catheter, a probe, or a cannula capable of being endoscopically inserted and passed through the lung periphery during an EBUS-TBNA procedure.
  • the instrument 240 may house a needle-stylet combination comprising a needle and a stylet insertable into the needle to facilitate maneuvering (e.g., advancement and retraction) and positioning of the needle during the procedure.
  • Tissue sampling devices, stylets or other assistive insertion devices, or combined needle-stylet devices used with the instrument 240 may differ in stiffness.
  • such devices may be formed into pre-shaped configurations, such as pre-formed curvatures or bending angles. The stiffness and the pre-shaped configurations of such devices dictate nominal needle trajectories that the respective devices are likely to follow within the ultrasound FOV.
  • the echoendoscope 120 can be robotically controlled, such as by a robot arm attached thereto.
  • the robot arm can automatically, or semi-automatically (e.g., with a certain degree of user manual control or commands), via an actuator, position and navigate an instrument such as the echoendoscope 120 toward an anatomical target, or position a device at a desired location with a desired posture to facilitate an operation on the anatomical target (e.g., to collect tissue samples from the anatomical target using a brush, a snare, forceps, or a suction device).
  • a controller can use artificial intelligence (AI) or machine learning (ML) technologies to determine navigation parameters and/or tool operational parameters (e.g., position, angle, posture, force, and navigation path), and generate a control signal to the actuator of the robot arm to facilitate operation of such instruments or tools in accordance with the determined navigation and operational parameters in a robot-assisted procedure.
  • FIG. 3 illustrates an example of an endobronchial ultrasound-guided transbronchial fine needle aspiration (EBUS-TBNA) procedure, and a portion of the EBUS-TBNA system being used in the procedure.
  • EBUS-TBNA uses a specialized bronchoscope 320 equipped with ultrasound capabilities to image beyond the walls of the airways to detect in real time the precise locations of anatomical targets of interest, such as a lymph node.
  • a clinician inserts the bronchoscope 320 into a lung periphery region 301, such as primary, secondary, or tertiary bronchi, or bronchioles.
  • the bronchoscope 320, similar to the echoendoscope 120 shown in FIG. 1, includes an ultrasound transducer 333 which may be located at the distal end of the bronchoscope.
  • the ultrasound transducer 333 may have a linear array, a curvilinear, or a phased array configuration.
  • the ultrasound transducer 333 can produce a live stream of ultrasound images of a target 302 (e.g., a peribronchial nodule) simultaneously in a real-time ultrasound field of view (FOV) 350 of the region of interest.
  • An endobronchial sampling device 342 (e.g., a biopsy needle such as the needle 242, or a needle-stylet combination) can be inserted from a proximal port of the bronchoscope 320, passed through the entire length of the bronchoscope 320, and extended from a side exit port 344 (also referred to as a side exit ramp, an example of the opening portion 144 of FIG. 2) located at the distal end of the bronchoscope 320, slightly proximal of the ultrasound transducer 333.
  • the endobronchial sampling device 342 can then be operatively protruded from the side exit port 344 and enter the ultrasound FOV 350. Under the real-time ultrasound guidance, the endobronchial sampling device 342 may traverse toward and ultimately intersect the target 302 (e.g., a peribronchial nodule), and collect tissue samples therefrom.
  • FIG. 4 is a block diagram illustrating an example of an EUS- guided tissue acquisition (EUS-TA) planning system 400 that can automatically generate an EUS-TA plan to navigate a sampling device during a medical procedure, such as an EBUS-TBNA procedure.
  • the EUS-TA plan may include a determined navigation path, also referred to as a nominal trajectory, extending from the endoscope (such as via the side exit port 344 at a distal portion of the bronchoscope 320, as shown in FIG. 3) to the anatomical target within an ultrasound FOV.
  • the nominal trajectory may be initially predicted upon the extension of the sampling device out of the endoscope but prior to entering the ultrasound FOV.
  • the nominal trajectory may be initially predicted even before the sampling device is extended out of the endoscope, such as based on a selected sampling device with a specific hardware configuration (e.g., a pre-formed curvature or bending angle).
  • the nominal trajectory can be dynamically updated during the procedure to ensure that the sampling device will intersect the anatomical target with a high probability.
  • the system 400 may include a controller circuit 410, a device controller 420, an input interface 430, and a user interface 440.
  • the controller circuit 410 may include circuit sets comprising one or more other circuits or subcircuits that may, alone or in combination, perform the functions, methods, or techniques described herein.
  • the controller circuit 410 and the circuit sets therein may be implemented as a part of a microprocessor circuit, which may be a dedicated processor such as a digital signal processor, application specific integrated circuit (ASIC), microprocessor, or other type of processor for processing information including physical activity information.
  • the microprocessor circuit may be a general-purpose processor that may receive and execute a set of instructions for performing the functions, methods, or techniques described herein.
  • hardware of the circuit set may be immutably designed to carry out a specific operation (e.g., hardwired).
  • the hardware of the circuit set may include variably connected physical components (e.g., execution units, transistors, simple circuits, etc.) including a computer readable medium physically modified (e.g., magnetically, electrically, moveable placement of invariant massed particles, etc.) to encode instructions of the specific operation.
  • the underlying electrical properties of a hardware constituent are changed, for example, from an insulator to a conductor or vice versa.
  • the instructions enable embedded hardware (e.g., the execution units or a loading mechanism) to create members of the circuit set in hardware via the variable connections to carry out portions of the specific operation when in operation.
  • the computer readable medium is communicatively coupled to the other components of the circuit set member when the device is operating.
  • any of the physical components may be used in more than one member of more than one circuit set.
  • execution units may be used in a first circuit of a first circuit set at one point in time and reused by a second circuit in the first circuit set, or by a third circuit in a second circuit set at a different time.
  • the controller circuit 410 may generate an EUS-TA plan using image data received from the input interface 430.
  • the input interface 430 may be a direct data link between the system 400 and one or more medical devices that generate at least some of the input features.
  • the input interface 430 may transmit EUS images 431, optionally along with endoscopic images 432, images from external image sources 433, or other information such as data collected by physiological sensors during a medical procedure, directly to the system 400.
  • the input interface 430 may be a part of the user interface 440 that facilitates interaction between a user and the system 400.
  • the user may manually provide input data to the system 400 via the user interface 440.
  • the input interface 430 may provide the system 400 with access to an electronic patient record from which one or more data features may be extracted. In any of these cases, the input interface 430 can collect one or more of sources of patient information before and during the procedure.
  • the EUS images 431 may include perioperative EUS images (e.g., a live stream of real-time EUS images) of the anatomical target and surrounding environment during an EUS-guided procedure.
  • the EUS images may be produced by an ultrasound transducer, such as the ultrasound transducer associated with the echoendoscope 120, or the ultrasound transducer 333 associated with the bronchoscope 320.
  • a real-time ultrasound field of view (FOV) of the target may be continuously generated and presented to the user via a display 443 of the user interface 440.
  • the endoscopic images 432 may include perioperative endoscope images or videos of the anatomical target and its surrounding environment captured by a camera associated with the echoendoscope.
  • the external image sources 433 may include preoperative or perioperative images of the anatomical target acquired by external imaging devices other than the echoendoscope, which may include, for example, X-ray or fluoroscopy images, electrical potential map or an electrical impedance map, CT images, or MRI images, among others.
  • the input interface 430 may receive other information including, for example, endo-therapeutic device information including, for example: size, dimension, shape, and structures of tissue sampling devices (e.g., needles, forceps, brushes, snares, knives, suction devices) or assistive insertion devices (e.g., a stylet, a guidewire, a probe, a catheter, or a cannula) to assist insertion and maneuvering of the tissue sampling device.
  • tissue sampling devices and/or assistive insertion devices may have pre-shaped hardware configurations, such as pre-formed curvatures or bending angles.
  • the device information of the tissue sampling device and/or the assistive insertion devices may be used to assist in selecting a proper tissue sampling device, and determining device operation parameters to navigate the selected sampling device to the anatomical target accurately and efficiently.
  • the device information may be received directly from the input interface 430. Alternatively or additionally, the device information may be stored in a memory accessible to the controller circuit 410.
  • the input interface 430 may receive information from sensors coupled to the echoendoscope or a treatment device passing through the endoscope, or otherwise associated with the patient.
  • a proximity sensor positioned at a distal end portion of the echoendoscope can sense information including position, direction, or proximity of a distal portion of the echoendoscope relative to an anatomical target.
  • the controller circuit 410 may include an image processor 411, a target identification and localization circuit 412, a trajectory prediction circuit 413, a sampling device tracker 414, and a trajectory assessment and adjustment circuit 415.
  • the image processor 411 may process the images received from the input interface 430, and extract image features characterizing an anatomical target of interest (e.g., peribronchial nodule).
  • the target identification and localization circuit 412 may use the extracted image features to detect a presence and location of the anatomical target within the real-time ultrasound FOV.
  • the location of the anatomical target may be represented by a longitudinal location and a depth relative to the surface of the ultrasound transducer in a coordinate system of the ultrasound FOV.
  • a user may provide, via the input unit 445 of the user interface 440, an input identifying the anatomical target, such as by highlighting or clicking on the anatomical target as it appears on the display 443, or by drawing a bounding box around the anatomical target on the display 443.
  • the target identification and localization circuit 412 may analyze the image to determine the location of the user- identified anatomical target.
  • the target identification and localization circuit 412 may determine other characteristics of the anatomical target, including, for example, a size, a shape, or a structure of the anatomical target, and/or recognize a pathophysiological property, such as a lesion, an inflammation state, a stricture level, or a malignancy state (e.g., degree or area of invasion by cancer) of the anatomical target using on-site, real-time in vivo tissue diagnosis.
  • the target identification and localization circuit 412 may perform image processing techniques such as edge detection operations to identify the boundary edges of a solitary pulmonary nodule (or other anomaly or target of interest) within the real-time ultrasound image stream.
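One plausible realization of that edge-detection step is sketched below with OpenCV. The patent mentions edge detection but prescribes no specific algorithm; the Canny thresholds, the speckle-suppression blur, and the function names here are assumptions.

```python
# Hypothetical sketch of locating a user-identified nodule in an ultrasound
# frame via edge detection; parameters and names are illustrative only.
import cv2
import numpy as np

def locate_target(frame_gray: np.ndarray, seed_xy):
    """Return (centroid, bounding_box) of the edge-bounded region enclosing a
    user-clicked seed point, or None if no contour encloses the seed."""
    blurred = cv2.GaussianBlur(frame_gray, (5, 5), 0)       # suppress speckle
    edges = cv2.Canny(blurred, 30, 90)                      # assumed thresholds
    contours, _ = cv2.findContours(edges, cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)
    seed = (float(seed_xy[0]), float(seed_xy[1]))
    for c in contours:
        if cv2.pointPolygonTest(c, seed, False) >= 0:       # seed inside contour?
            m = cv2.moments(c)
            if m["m00"] == 0:
                continue
            centroid = (m["m10"] / m["m00"], m["m01"] / m["m00"])
            return centroid, cv2.boundingRect(c)            # (x, y, w, h) in pixels
    return None
```

The pixel-space centroid and bounding box would then be scaled into the longitudinal-location/depth coordinate system of the FOV mentioned above.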
  • the trajectory prediction circuit 413 may predict a nominal sampling device trajectory (S*) when the sampling device (e.g., a biopsy needle or a needle-stylet combination) is extended out of the endoscope (such as via the side exit port 344 of the bronchoscope 320 as shown in FIG. 3) but prior to the sampling device entering the ultrasound FOV, or even before the sampling device is extended out of the endoscope.
  • the nominal trajectory S* may be initially predicted based at least in part on the detected target location in the real-time ultrasound FOV, such that the predicted nominal trajectory S* begins at an entry point into the real-time ultrasound FOV, and intersects the anatomical target at the detected target location visible in the real-time ultrasound FOV.
  • the trajectory prediction circuit 413 may predict the nominal trajectory S* further based on a posture and a puncture angle of the sampling device (e.g., a needle) upon piercing through the airway wall, or a posture and an entering angle upon entering the real-time ultrasound FOV.
  • the nominal trajectory S* may be predicted additionally or alternatively based on hardware configurations (e.g., pre-formed curvatures or bending angles) of available tissue sampling devices. The predicted nominal trajectory S* thus determined corresponds to a tissue sampling device X* with a specific pre-shaped configuration, such as a pre-formed curvature or bending angle.
  • the trajectory prediction circuit 413 may further determine an upper trajectory limit (UTL) above the predicted nominal trajectory S* and a lower trajectory limit (LTL) below the predicted nominal trajectory S* in the real-time ultrasound FOV.
  • the UTL and LTL define a trajectory range or “zone” within which an intersection with the anatomical target is likely to occur at a sufficiently high confidence (e.g., 95%).
  • the UTL and the LTL may each be determined based on uncertainties associated with determination of device posture and puncturing angle upon penetrating through the airway wall, and/or device posture and entering angle upon entering the real-time ultrasound FOV.
  • the nominal trajectory S* may be initially predicted (e.g., before the sampling device enters the ultrasound FOV, or even before the sampling device is extended out of the endoscope) based on an axis of the side exit ramp 344 from which the sampling needle is extended out of the endoscope with respect to the FOV in one example, or based on a population-based estimate of the puncturing angle in another example.
  • a nominal trajectory S* may be initially determined based on a relative pose of the side exit ramp with respect to the FOV prior to a needle entering the FOV.
  • as the sampling device punctures through the lung membrane or airway wall, a slight deflection may occur, which may introduce deviations from the hardware-based or the population-based estimate of the puncturing angle and thus uncertainties in the predicted nominal trajectory S*.
  • the UTL and the LTL may also be determined based on uncertainties associated with measurements of the target location from the images in the real-time ultrasound FOV.
  • the trajectory prediction circuit 413 may predict the nominal sampling device trajectory S* using procedure data collected from a patient population.
  • the nominal sampling device trajectory S* may represent a best fit line or curve obtained from, for example, linear or nonlinear regression analysis of needle trajectories collected under real-life or modeled circumstances from similar procedures performed on a selected patient population under similar conditions (e.g., similar types or models of bronchoscopes).
  • ultrasound image streams may be collected from N (e.g., 1000) procedures in which a needle is extended from the side exit ramp into patient tissue within the FOV.
  • the UTL 511 and LTL 512 may represent the amount of variability of the population-based trajectories away from the nominal needle trajectory 510.
  • the UTL 511 and LTL 512 may be determined based on a variance of needle puncturing angles in similar procedures performed on a selected patient population (e.g., based on a standard deviation across procedures); a minimal sketch of such a population-based fit follows below.
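  • The sketch below illustrates one way such a population-based nominal trajectory and its limits might be computed, assuming each recorded trajectory has been resampled as depth values at shared lateral positions; the quadratic fit and the k-sigma band width are illustrative assumptions, not disclosed parameters.

```python
# Illustrative sketch, assuming each of the N recorded needle trajectories is
# resampled as a depth y at M shared lateral positions x. The quadratic fit
# and the k-sigma band are illustrative choices, not disclosed parameters.
import numpy as np

def fit_nominal_trajectory(x, trajectories, k_sigma=2.0):
    """x: (M,) lateral positions; trajectories: (N, M) depth samples."""
    coeffs = np.polyfit(x, trajectories.mean(axis=0), deg=2)  # best-fit curve
    nominal = np.polyval(coeffs, x)                           # nominal S*
    sigma = trajectories.std(axis=0)                          # variability
    # Which band is "upper" on screen depends on the image coordinates.
    return nominal, nominal - k_sigma * sigma, nominal + k_sigma * sigma
```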
  • Graphical user interface elements (UIEs), including graphical representations of the predicted nominal trajectory S* and an indication that S* intersects the anatomical target, may be overlaid upon the real-time ultrasound FOV and displayed on the display 443 during the sampling procedure.
  • the UIEs may additionally include graphical representations of UTL and LTL overlaid upon the real-time ultrasound FOV.
  • such UIEs may be displayed prior to an extension of the tissue sampling device into the real-time ultrasound FOV, or even before the sampling device is extended out of the endoscope.
  • the UIEs may be continuously or periodically updated upon the extension of the tissue sampling device into the ultrasound FOV and traversing toward the anatomical target.
  • Such UIEs may serve as real-time feedback to the user on the state of the sampling device, and indicators of how well the sampling device is tracking the predicted nominal trajectory S* and how likely an intersection with the anatomical target will occur. Examples of the UIEs indicating the predicted nominal trajectory S* and associated UTL and LTL are described further below with respect to FIGS. 5A-5B, 6A-6B, and 7A-7B.
  • the sampling device tracker 414 may track the actual state and trajectory of the sampling device as soon as it enters the real-time ultrasound FOV.
  • the sampling device tracker 414 may determine one or more real-time trajectory parameters based on an analysis of the ultrasound images or image features produced by the image processor 411.
  • the real-time trajectory parameters may include a position, a posture, or a heading direction or angle of the sampling device within the real-time ultrasound FOV.
  • a graphical representation of the actual tracked trajectory (St) as of time t (and prior to an occurrence of intersection with the target) may be overlaid upon the real-time ultrasound FOV and displayed on the display 443 during the sampling procedure.
  • the trajectory assessment and adjustment circuit 415 may assess a trajectory tracking performance during the procedure prior to the occurrence of intersection.
  • the tracking performance may be quantified by a “hit probability,” which can be estimated using a hit probability estimator 416.
  • the hit probability, denoted by Pst, represents a conditional probability of the sampling device intersecting the anatomical target given the one or more real-time trajectory parameters and/or the actual tracked trajectory (St), as determined by the sampling device tracker 414.
  • the hit probability estimator 416 may estimate the hit probability Pst based on the actual tracked trajectory St or the one or more real-time trajectory parameters relative to the target location within the real-time ultrasound FOV.
  • the hit probability estimator 416 may estimate the hit probability Pst based on a level of discrepancy between the actual tracked trajectory (St) and the corresponding portion of the predicted nominal trajectory S*. In another example, the hit probability estimator 416 may estimate the hit probability Pst based on a distance between the current location of the sampling device (which is one of the real-time trajectory parameters produced by the sampling device tracker 414) and the target location within the real-time ultrasound FOV. In some examples, the hit probability estimator 416 may estimate the hit probability Pst further based on the homogeneity of media in the region of interest within the FOV. A heuristic sketch combining these factors is given below.
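  • The sketch below is one hypothetical heuristic combining trajectory discrepancy, remaining travel distance, and media homogeneity into a single probability; the weights and the logistic mapping are assumptions for illustration, not the estimator specified above.

```python
# Illustrative heuristic for Pst; weights and logistic squashing are assumed.
import math

def estimate_hit_probability(tracked, nominal, tip_to_target_mm, homogeneity,
                             w_dev=0.5, w_dist=0.05, w_homog=1.0):
    """tracked/nominal: equal-length depth samples of St and the matching
    portion of S*; homogeneity in [0, 1], where 1 means uniform media."""
    deviation = sum(abs(a - b) for a, b in zip(tracked, nominal)) / len(tracked)
    score = (-w_dev * deviation            # penalize discrepancy from S*
             - w_dist * tip_to_target_mm   # penalize remaining travel distance
             + w_homog * homogeneity)      # reward homogeneous media
    return 1.0 / (1.0 + math.exp(-score))  # squash to a probability in (0, 1)
```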
  • the controller circuit 410 may generate feedback or recommendation to adjust subsequent tracking strategies to ensure a high likelihood that the sampling device will intersect the anatomical target.
  • the estimated hit probability Pst may be displayed on a user interface. In one example, when the hit probability is greater than a probability threshold PTH, the trajectory tracking performance is deemed satisfactory, and the user may be notified (such as via the output unit 442) to continue advancing the sampling device along the existing predicted nominal trajectory S*.
  • the controller circuit 410 may provide a recommendation to the user (such as via the output unit 442) to adjust the position or posture of the sampling device relative to the target location within the FOV, such as withdrawing the sampling device by a specific amount from its current location.
  • one or more UIEs (e.g., the actual tracked trajectory St) may be displayed in different colors based on the estimated hit probability. For example, the actual tracked trajectory (St) may be displayed in red if Pst < PTH, or displayed in green if Pst > PTH.
  • the trajectory assessment and adjustment circuit 415 may include a trajectory update circuit 417 that can dynamically update the predicted nominal trajectory S* based on the one or more real-time trajectory parameters produced by the sampling device tracker 414, or a level of discrepancy between the actual tracked trajectory St and the corresponding portion of the predicted nominal trajectory S*.
  • the update may be initiated in response to the trajectory tracking performance satisfying a specific condition, such as the estimated hit probability Pst being lower than the probability threshold PTH.
  • the trajectory update circuit 417 can dynamically update one or more of the UTL or LTL associated with the predicted nominal trajectory S*.
  • the UTL and LTL define a trajectory range within which an intersection with the anatomical target is likely to occur at a specific confidence level.
  • the update of the UTL and LTL may be triggered by the trajectory tracking performance satisfying a specific condition, such as Pst < PTH.
  • the controller circuit 410 may update the UIEs to represent an updated prediction of the sampling device trajectory (S*)’ and the updated UTL’ and LTL’. Some or all of (S*)’, UTL’, and LTL’ may be superimposed on the real-time ultrasound FOV.
  • the original predicted nominal trajectory S* and the associated UTL and LTL may be erased from the FOV, or displayed distinguishably from the updated counterparts (i.e., (S*)’, UTL’ and LTL’), such as in different colors.
  • Such update of the nominal trajectory S* and the associated UTL and LTL as described above is “dynamic” in that the update can occur continuously, periodically, or on-demand over the course of the sampling device being advanced toward, but prior to intersecting, the anatomical target; a condensed sketch of such an update loop follows below.
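  • The following is a condensed, hypothetical sketch of such a dynamic update loop, reusing the estimate_hit_probability() sketch above; repredict_from_pose() is a hypothetical helper standing in for the trajectory update circuit 417, and the threshold value is an assumption.

```python
# Hypothetical update loop: re-predict the trajectory and its limits whenever
# the estimated hit probability drops below the threshold. The helper
# repredict_from_pose() is a stand-in, not a disclosed function.
P_TH = 0.85  # illustrative probability threshold

def maybe_update(state):
    p_st = estimate_hit_probability(state.tracked, state.nominal,
                                    state.tip_to_target_mm, state.homogeneity)
    if p_st < P_TH:
        state.nominal, state.utl, state.ltl = repredict_from_pose(
            state.tip_position, state.heading_angle, state.target_location)
    return p_st
```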
  • updated UIEs including the updated prediction of nominal trajectory (S*)’ and the associated UTL’ and LTL’ are described below such as with respect to FIGS. 7A-7B and 8A-8C.
  • one or more of the trajectory prediction circuit 413, the hit probability estimator 416, or the trajectory update circuit 417 may use artificial intelligence (AI) or machine-learning (ML) technology to perform the respective tasks stated above.
  • One or more ML models may provide the system 400 with the ability to perform tasks, without explicitly being programmed, by making inferences based on patterns found in the analysis of data.
  • ML explores the study and construction of algorithms (e.g., ML algorithms) that may learn from existing data and make predictions about new data. Such algorithms operate by building the ML model(s) from training data in order to make data-driven predictions or decisions expressed as outputs or assessments.
  • the ML model(s) may be trained using supervised learning or unsupervised learning.
  • Supervised learning uses prior knowledge (e.g., examples that correlate inputs to outputs or outcomes) to learn the relationships between the inputs and the outputs.
  • the goal of supervised learning is to learn a function that, given some training data, best approximates the relationship between the training inputs and outputs so that the ML model can implement the same relationships when given inputs to generate the corresponding outputs.
  • Unsupervised learning is the training of an ML algorithm using information that is neither classified nor labeled, and allowing the algorithm to act on that information without guidance. Unsupervised learning is useful in exploratory analysis because it can automatically identify structure in data.
  • Common tasks for supervised learning are classification problems and regression problems.
  • Classification problems, also referred to as categorization problems, aim at classifying items into one of several category values.
  • Regression algorithms aim at quantifying some items (for example, by providing a score to the value of some input).
  • Some examples of commonly used supervised-ML algorithms are Logistic Regression (LR), Naive-Bayes, Random Forest (RF), neural networks (NN), deep neural networks (DNN), matrix factorization, and Support Vector Machines (SVM).
  • DNN include a convolutional neural network (CNN), a recurrent neural network (RNN), a deep belief network (DBN), or a hybrid neural network comprising two or more neural network models of different types or different model configurations.
  • Some common tasks for unsupervised learning include clustering, representation learning, and density estimation.
  • Some examples of commonly used unsupervised learning algorithms are K-means clustering, principal component analysis, and autoencoders.
  • federated learning (also known as collaborative learning) may be used to train the ML model(s) across multiple decentralized devices or servers holding local data samples, without exchanging the data samples.
  • This approach stands in contrast to traditional centralized machine-learning techniques where all the local datasets are uploaded to one server, as well as to more classical decentralized approaches which often assume that local data samples are identically distributed.
  • Federated learning enables multiple actors to build a common, robust machine learning model without sharing data, thus addressing critical issues such as data privacy, data security, data access rights, and access to heterogeneous data.
  • a trained ML model can have a neural network structure comprising an input layer, one or more hidden layers, and an output layer.
  • Information fed to the input layer may include image data or image features derived therefrom, such as those produced by the image processor 411, the location and/or characteristics of the anatomical target produced by the target identification and localization circuit 412, and the actual device trajectory detected by the sampling device tracker 414, among others.
  • the input information may be propagated through one or more hidden layers to the output layer that outputs one or more of the predicted nominal trajectory S* and the associated UTL or LTL, the hit probability Pst, or the updated predicted nominal trajectory (S*)’ and the associated updated UTL’ and LTL’, among others.
  • a plurality of ML models can be separately trained, validated, and used (in an inference phase) in different applications.
  • a first ML model (or a first set of ML models) may be trained to predict or update S*, a second ML model (or a second set of ML models) may be trained to predict or update the associated UTL or LTL, and a third ML model (or a third set of ML models) may be trained to estimate the hit probability Pst; a minimal sketch using stand-in estimators follows below.
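  • The sketch below uses off-the-shelf scikit-learn estimators as stand-ins for these separately trained models; the synthetic feature and label layouts are assumptions purely for illustration, not the training data described above.

```python
# Minimal sketch with scikit-learn estimators standing in for the separately
# trained models; feature/label shapes are illustrative assumptions.
import numpy as np
from sklearn.neural_network import MLPClassifier, MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))          # stand-in image/target features
Y_traj = rng.normal(size=(500, 3))     # stand-in S* curve coefficients
Y_band = rng.random(size=(500, 2))     # stand-in UTL/LTL band widths
y_hit = rng.integers(0, 2, size=500)   # stand-in hit / no-hit labels

traj_model = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=500).fit(X, Y_traj)
band_model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=500).fit(X, Y_band)
hit_model = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500).fit(X, y_hit)

p_st = hit_model.predict_proba(X[:1])[0, 1]  # estimated hit probability
```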
  • the device controller 420 can generate a control signal to one or more actuators 450, such as a motor actuating a robot arm.
  • the one or more actuators 450 may be coupled to one or more devices, such as the sampling device (or an assistive insertion device to facilitate insertion and maneuvering of the sampling device), the EUS probe, the echoendoscope 120, or the bronchoscope 320.
  • the one or more actuators 450 can robotically adjust position, posture, direction, and navigation path of the devices coupled thereto in accordance with the (initial) predicted nominal trajectory S* or the updated predicted nominal trajectory (S*)’.
  • the user interface 440 may include an output unit 442 and an input unit 445.
  • the input unit 445 can receive input from the user or from other data sources.
  • the input interface 430 can be included in the input unit 445.
  • the output unit 442 may include a display 443 to display the ultrasound FOV, with images of the anatomical target and real-time ultrasound images of the tracked trajectory and position and motion information of the sampling device during the procedure, such as being advanced toward or retracted from the target anatomy within the ultrasound FOV. Display settings can be adjusted by the user via the input unit 445.
  • the visual indication may take the format of markers, annotations (icons, texts, or graphs), highlights, or animation, among other visual indicators. For example, markers of different shapes, colors, forms, or sizes can be displayed on the reconstructed or integrated image to distinguish different tissues, anatomical regions, and their accessibility or criticality.
  • the output unit 442 may include an alert and feedback generator 444 that can generate an alert, a notification, or other formats of human-perceptible feedback to the operating physician on the status or progress of the cannulation or navigation in reference to the navigation plan.
  • an alert can be generated to indicate a risk of tissue damage.
  • the feedback can be in one or more forms of audio feedback, visual feedback, or haptic feedback.
  • an alert can be generated when the endoscope tip approaches a critical zone (e.g., a proximity sensor detecting a distance to a critical anatomy of interest shorter than a threshold distance). The critical zone can be shown in different colors to represent such distance (e.g., green zone, yellow zone, and red zone as the endoscope gets closer to the critical zone).
  • haptic feedback such as touch or vibration may be generated and felt by the operating physician.
  • the alert and feedback generator 444 can automatically adjust the vibration strength according to the distance to the critical zone. For example, a low vibration can be generated when the endoscope tip is in a green zone. If the system predicts, based on the present advancing speed and direction of the endoscope, that the endoscope tip will reach the critical zone in a time lower than a predetermined threshold, then the alert and feedback generator 444 can apply moderate vibration when the endoscope tip reaches a yellow zone, and high vibration when the endoscope tip reaches a red zone to indicate a risk of tissue damage. A simple mapping of this logic is sketched below.
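  • The following is a minimal sketch of such zone-based haptic feedback, assuming distance-based zone boundaries; all boundary values and the time threshold are illustrative assumptions, not disclosed parameters.

```python
# Illustrative mapping from proximity and predicted arrival time to vibration
# strength; zone boundaries and the time threshold are assumptions.
def vibration_strength(distance_mm, speed_mm_s, eta_threshold_s=2.0):
    if distance_mm > 20.0:                       # green zone
        return "low"
    eta_s = distance_mm / max(speed_mm_s, 1e-6)  # predicted time to contact
    if distance_mm > 10.0:                       # yellow zone
        return "moderate" if eta_s < eta_threshold_s else "low"
    return "high"                                # red zone
```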
  • the real-time alert and feedback in an image-guided medical procedure as described herein can improve the efficiency of cannulation and endoscope navigation, especially for inexperienced physicians, and can improve procedure success rate and patient outcome.
  • FIGS. 5A-5B, 6A-6B, and 7A-7B illustrate schematic views of an exemplary EBUS-TBNA system with an associated computing system that determines and displays graphical representations of one or more nominal sampling device trajectories and user interface elements (UIEs) at different operation states of an exemplary operational flow in an EBUS-TBNA procedure.
  • the graphical representations of the one or more nominal sampling device trajectories and the UIEs may be overlaid onto the ultrasound images in a real-time ultrasound FOV.
  • the EBUS-TBNA system includes an EBUS sampling device 520 located in an airway enclosed by the airway walls 501.
  • a predicted sampling device trajectory 510 may be displayed over ultrasound images of a lung periphery region in a real-time ultrasound FOV 550.
  • the real-time ultrasound FOV 550 can be produced by an ultrasound transducer 533 located at a distal end of the EBUS sampling device 520.
  • the EBUS sampling device 520 has a side exit port 544 (also known as a needle ramp) proximal to the ultrasound transducer 533 along the length of a tubular device body of the EBUS sampling device 520.
  • a tissue sampling device (not shown here), such as the endobronchial sampling device 342 as shown in FIG. 3, may be extended from the side exit port 544 toward the target nodule 540.
  • FIGS. 5A, 6A, and 7A each illustrate aspects of the UIEs within the real-time ultrasound FOV 550 along with the EBUS sampling device 520, the airway, and the target nodule 540.
  • the UIEs are computer-generated graphical elements displayed as overlays onto an ultrasound image in the real-time ultrasound FOV 550 to indicate a likely needle trajectory in relation to the target nodule 540 visible within the ultrasound image. Aspects of the UIEs may include graphical representations of the nominal needle trajectory 510, an upper trajectory limit (UTL) 511 above the nominal needle trajectory 510, and a lower trajectory limit (LTL) 512 below the nominal needle trajectory 510.
  • the nominal needle trajectory 510 and the associated UTL 511 and LTL 512 may be predicted by the trajectory prediction circuit 413.
  • FIGS. 5A-5B illustrate a first operation state of an exemplary operational flow associated with use of the EUS-TA planning system 400 in accordance with an embodiment as described herein.
  • in the first operation state, no sampling device (e.g., sampling needle) has yet been extended into the real-time ultrasound FOV 550.
  • the EBUS sampling device 520 may be moved fore and/or aft within the airway enclosed by the airway walls 501 to align the target nodule 540 with the nominal needle trajectory 510.
  • the nominal needle trajectory 510 may be an extension of the axis of the side exit ramp 344, or any other trajectory which a needle exiting the ramp is most likely to follow when extended.
  • the nominal needle trajectory 510 may represent a best fit line or curve of a sample of needle trajectories collected under real-life or modeled circumstances from similar procedures performed on a selected patient population.
  • ultrasound image streams may be collected from N (e.g., 1000) procedures in which a needle is extended from the side exit ramp into patient tissue within the real-time ultrasound FOV 550, and the nominal needle trajectory may represent the best fit line or curve obtained, for example, from linear or nonlinear regression analysis or an average of all the data collected.
  • the UTL 511 and LTL 512 may represent some predetermined amount of variability (e.g., standard deviation) away from the nominal needle trajectory 510.
  • the nominal needle trajectory 510 may also be dependent on the sampling device configuration (e.g., needle and stylet combination). Accordingly, the displayed trajectory may also be dependent on the selected sampling device and may update with different selections from the clinician.
  • the nominal needle trajectory 510 and the associated UTL 511 and LTL 512 may be determined as a whole and are not specific to an individual procedure. Accordingly, the UIEs indicating the nominal needle trajectory 510 and the associated UTL 511 and LTL 512 may be displayed before a sampling needle is extended into the tissue, or even before it is inserted into the proximal end (outside of the patient) of the EBUS sampling device 520. For example, these UIEs may be displayed while the user is still free to slide the EBUS sampling device through the airway and/or to select a different combination of needle and stylet. Thus, the UIEs displayed at this state of the procedure may serve as useful visual aids for the user to align the target nodule 540 with the nominal needle trajectory 510.
  • FIGS. 6A-6B illustrate a second operation state of an exemplary operational flow associated with use of the EUS-TA planning system 400 in accordance with an embodiment as described herein.
  • the second operation state may happen right after the first operation state as described above with respect to FIGS. 5A-5B.
  • the user may extend a sampling needle 642 (an example of the endobronchial sampling device 342 as shown in FIG. 3) slightly from the side exit port 544 but not so far that the sampling needle 642 has entered the real-time ultrasound FOV 550. Accordingly, the sampling needle 642 is not yet visible in the ultrasound image as displayed to the user.
  • the nominal needle trajectory 510 and the associated UTL 511 and LTL 512 may remain unchanged, because the lack of the sampling needle 642 being within the real-time ultrasound FOV 550 suggests that the system does not have real-time needle trajectory information usable to determine whether or not to update the nominal needle trajectory 510 or the associated UTL 511 and LTL 512.
  • FIGS. 7A-7B illustrate a third operation state of an exemplary operational flow associated with use of the EUS-TA planning system 400 in accordance with an embodiment as described herein.
  • the third operation state may happen right after the second operation state as described above with respect to FIGS. 6A-6B.
  • the sampling needle 642 has entered the real-time ultrasound FOV 550.
  • Actual needle trajectory can be tracked, and one or more real-time needle trajectory parameters may be derived from the images, by the sampling device tracker 414.
  • a tracking performance may be assessed.
  • the nominal needle trajectory 510 and the associated UTL 511 and LTL 512 may be dynamically updated.
  • the trajectory assessment and adjustment circuit 415 may analyze the ultrasound images to distinguish the needle from the tissue, and to determine the real-time needle trajectory parameters which may include, for example, a height within the FOV that the needle enters, an angle of the needle within the FOV, an amount of curvature and/or direction of curvature of the needle, and any other parameter which may have some predictive value to the trajectory the sampling needle 642 is likely to follow. Then, based on the real-time needle trajectory parameters, a hit probability may be estimated by the hit probability estimator 416.
  • the trajectory update circuit 417 may update one or more of the nominal needle trajectory 510 or the associated UTL 511 and LTL 512 when the estimated hit probability falls below a probability threshold. In this way, if the sampling needle 642 enters the real-time ultrasound FOV 550 and is advancing in a manner indicating that the sampling needle 642 is likely to deviate from the original nominal needle trajectory 510 (such that a lower hit probability is likely to result), then the user may be provided with visual indications to this effect. As shown in FIGS. 7A-7B, the updated UTL 711 and LTL 712 have narrowed in relation to the originally displayed UTL 511 and LTL 512, even though the updated nominal needle trajectory 710 does not differ much from the original nominal needle trajectory 510 (thus the label “510(710)” in FIG. 7A). This is the case because, as the sampling needle 642 progresses through the tissue toward the target nodule 540, the possible trajectory within the FOV closes in around the actually followed needle trajectory. A minimal sketch of this band narrowing is given below.
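  • The sketch below illustrates one simple way such narrowing might be modeled: re-center the limits on the actually tracked depth and shrink the half-width with the fraction of the path already traversed. The linear shrink model is an assumption for illustration only.

```python
# Illustrative band narrowing: re-center the limits on the tracked depth and
# shrink the half-width as more of the path is traversed (linear model
# assumed for illustration).
def update_limits(tracked_depth, initial_halfwidth_mm, traversed_fraction):
    """traversed_fraction in [0, 1]: portion of S* already covered."""
    halfwidth = initial_halfwidth_mm * (1.0 - traversed_fraction)
    return tracked_depth - halfwidth, tracked_depth + halfwidth  # UTL', LTL'
```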
  • the system may receive some indication of where within the FOV the targeted tissue resides, and may use this indication to further enhance the visual aids (e.g., UIEs) displayed to the clinician.
  • the system may receive a user input which indicates a location of the target nodule within the FOV.
  • the system may perform some image processing techniques to automatically identify a portion of the ultrasound images having characteristics indicative of a target tissue (e.g., a cancerous lesion, a solitary pulmonary nodule (SPN), etc.).
  • FIGS. 8A-8C are diagrams illustrating UIEs, displayed as overlays over the real-time ultrasound FOV 550, that indicate hit probabilities in association with the determination and dynamic update of the nominal needle trajectory and the associated UTL and LTL at the various operation states described above with respect to FIGS. 5A-5B, 6A-6B, and 7A-7B.
  • FIG. 8A illustrates that, at a first time prior to the sampling needle 642 having entered the FOV (corresponding to the first and second operation states) and after the system has received some indication of where the target nodule 540 is within the real-time ultrasound FOV 550, the system determines a 74% probability of the target nodule 540 being hit if the sampling needle 642 follows the original nominal needle trajectory 510.
  • the determination of such initial hit probability (Po) can be made based on, for example, a distance the sampling needle 642 will need to travel to reach the target nodule 540, the homogeneity of the tissue within the FOV, or any other factor which may indicate how likely the sampling needle 642 is to deviate from the original nominal needle trajectory 510.
  • the initial hit probability (Po) may be estimated using data collected from similar procedures performed on a selected patient population under similar conditions (e.g., similar types or models of bronchoscopes). UIEs 810 may be displayed with such initial estimate of hit probability.
  • the user may advance, retract, or rotate the EBUS sampling device 520 (or the EUS probe if maneuverable separately from the EBUS sampling device 520) to change the location of the target nodule 540 as it appears in the real-time ultrasound FOV 550.
  • because the initial hit probability Po can be affected by the sampling device hardware configuration and the puncturing angle relative to the target location, changing the bronchoscope or EUS probe positions relative to the target may result in different Po values.
  • Each estimate can be made using data collected from similar procedures performed on a selected patient population under similar conditions (e.g., similar types or models of bronchoscopes, and the same positions of the bronchoscopes or EUS probes relative to respective targets during the procedures).
  • a “sweet spot” corresponding to the highest or a satisfactory Po value may be identified from the bronchoscope or EUS probe positions that have been tested. The bronchoscope or EUS probe may then be settled at the “sweet spot,” and the sampling needle can be extended out of the side exit port 544 to puncture into the tissue. A sketch of such a search is given below.
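  • The following hypothetical sketch shows such a search over candidate probe poses; estimate_po() is a hypothetical stand-in for the population-based Po estimator described above, and the satisfactory threshold is an assumption.

```python
# Hypothetical "sweet spot" search: estimate Po at each candidate probe pose
# and settle at the best one. estimate_po() is a stand-in helper.
def find_sweet_spot(candidate_poses, estimate_po, po_satisfactory=0.9):
    best_pose, best_po = None, -1.0
    for pose in candidate_poses:
        po = estimate_po(pose)          # population-based Po at this pose
        if po > best_po:
            best_pose, best_po = pose, po
        if po >= po_satisfactory:
            break                       # good enough; stop repositioning
    return best_pose, best_po
```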
  • FIGS. 8B-8C illustrate a second time, corresponding to the third operation state, when the sampling needle 642 has entered the real-time ultrasound FOV 550 and traversed roughly 1/3 of the original nominal needle trajectory 510 toward the target nodule 540.
  • FIG. 8B illustrates an example where the sampling needle 642 has remained very closely aligned with the original nominal needle trajectory 510 (e.g., the updated nominal needle trajectory 710 is nearly aligned with the original nominal needle trajectory 510, thus the label “510(710)”), and the trajectory assessment and adjustment circuit 415 has determined an increase in the hit probability 820 up to 95%.
  • the UIEs may be updated to reflect this new estimate of hit probability.
  • the updated UIEs may be displayed distinguishably over the previous UIEs, such as colored (e.g., in green) to indicate a determined “go” or “hit” scenario for the current sampling attempts.
  • FIG. 8C illustrates an example where the sampling needle 642 deviates substantially from the original nominal needle trajectory 510 or diverges substantially away from the target nodule 540.
  • the trajectory assessment and adjustment circuit 415 may determine a reduced hit probability 830 down to 14%.
  • the updated UIEs may include such reduced hit probability value, optionally in a different color (e.g., yellow or red) to indicate the lower likelihood of hitting the target.
  • FIG. 9 is a flow chart illustrating an example method 900 for generating an EUS-TA plan and presenting on a graphical user interface visual aids to assist with navigation of a tissue sampling device during a medical procedure, such as an EBUS-TBNA procedure.
  • the method 900 may be implemented in and executed by the system 400. Although the processes of the method 900 are drawn in one flow chart, they are not required to be performed in a particular order. In various examples, some of the processes can be performed in a different order than that illustrated.
  • ultrasound images of an anatomical target may be generated using an ultrasound transducer associated with an endoscope, such as the ultrasound transducer associated with the echoendoscope 120, or the ultrasound transducer 333 associated with the bronchoscope 320.
  • a real-time ultrasound field of view (FOV) of the target may be presented to the user on the graphical user interface.
  • other images may be generated or otherwise received at 910, including, for example, perioperative endoscope images or videos of the anatomical target and its surrounding environment captured by a camera associated with the echoendoscope, or preoperative or perioperative images of the anatomical target acquired by external imaging devices, such as X-ray or fluoroscopy images, an electrical potential map or an electrical impedance map, CT images, or MRI images, among others.
  • a predicted nominal sampling device trajectory S* may be determined, and graphical user interface elements (UIEs) indicating the predicted trajectory S* intersecting the anatomical target on the real-time ultrasound FOV may be displayed on the graphical user interface.
  • the predicted nominal trajectory S* begins at an entry point into the real-time ultrasound FOV, and intersects the anatomical target at the detected target location in the real-time ultrasound FOV.
  • the prediction of the nominal sampling device trajectory S* may be based at least in part on the location of the anatomical target in the real-time ultrasound FOV, which can be detected by the target identification and localization circuit 412 using image features extracted from the ultrasound images received at step 910.
  • Other factors taken into consideration when predicting the nominal trajectory S* may include, for example, a posture and a puncture angle of the sampling device (e.g., a needle) upon piercing through the airway wall, or a posture and an entering angle upon entering the real-time ultrasound FOV.
  • the nominal trajectory S* may be predicted further based on hardware configurations (e.g., pre-formed curvatures or bending angles) of available tissue sampling devices.
  • the predicted nominal trajectory S* thus determined corresponds to a tissue sampling device X* with a specific pre-shaped configuration, such as a pre-formed curvature or bending angle.
  • the nominal sampling device trajectory S* may be predicted using procedure data collected from a patient population.
  • the nominal sampling device trajectory S* may represent a best fit line or curve obtained from, for example, linear or nonlinear regression analysis of needle trajectories collected under real-life or modeled circumstances from similar procedures performed on a selected patient population under similar conditions (e.g., similar types or models of bronchoscopes).
  • an upper trajectory limit (UTL) above the predicted nominal trajectory S* and a lower trajectory limit (LTL) below the predicted nominal trajectory S* may be determined at step 920.
  • the UTL and LTL define a trajectory range or “zone” within which an intersection with the anatomical target is likely to occur at a sufficiently high confidence (e.g., 95%).
  • the UTL and the LTL may each be determined based on uncertainties associated with determination of device posture and puncturing angle upon penetrating through the airway wall, and/or device posture and entering angle upon entering the real-time ultrasound FOV, as described above with respect to FIG. 4.
  • one or more real-time trajectory parameters may be determined based on an analysis of ultrasound images. Examples of the real-time trajectory parameters may include a position, a posture, or a heading direction or angle of the sampling device within the real-time ultrasound FOV.
  • the UIEs as they appear on the user interface may be updated based on the determined one or more real-time trajectory parameters.
  • the updated UIEs may include a graphical representation of the actual tracked trajectory (St) as of time t (and prior to an occurrence of intersection with the target) overlaid upon the real-time ultrasound FOV and displayed on the user interface during the procedure.
  • the update of UIEs may be triggered by a trajectory tracking performance satisfying a specific condition.
  • the trajectory tracking performance may be assessed during the procedure prior to the occurrence of intersection using the trajectory assessment and adjustment circuit 415.
  • the tracking performance may be quantified by a hit probability Pst representing a conditional probability of the sampling device intersecting the anatomical target given the one or more real-time trajectory parameters and/or the actual tracked trajectory (St).
  • the hit probability Pst may be estimated based on the actual tracked trajectory St or the one or more real-time trajectory parameters relative to the target location within the real-time ultrasound FOV.
  • the hit probability Pst may be estimated based on a level of discrepancy between the actual tracked trajectory (St) and the corresponding portion of the predicted nominal trajectory S*. In another example, the hit probability Pst may be estimated based on a distance between the current location of the sampling device (which is one of the real-time trajectory parameters produced at step 930) and the target location within the real-time ultrasound FOV. In some examples, the hit probability Pst may be determined further based on the homogeneity of media in the region of interest within the FOV.
  • the update of the UIEs may include dynamic update of the predicted nominal trajectory S* based on the one or more real-time trajectory parameters or a level of discrepancy between the actual tracked trajectory St and the corresponding portion of the predicted nominal trajectory S*.
  • the update may be initiated in response to the estimated hit probability Pst being lower than the probability threshold PTH.
  • one or more of the UTL or LTL associated with the predicted nominal trajectory S* may also be updated.
  • the update of the UTL and LTL may be triggered by the trajectory tracking performance satisfying a specific condition, such as Pst < PTH.
  • the update of the UIEs may further include dynamic update of the hit probability.
  • one or more artificial intelligence (AI) or machine-learning (ML) models may be respectively trained to predict or update S*, to predict or update UTL or LTL, or to estimate the hit probability Pst, as described above with respect to FIG. 4.
  • the updated UIEs, including the updated prediction of sampling device trajectory (S*)’ and the updated UTL’ and LTL’, may be superimposed on the real-time ultrasound FOV to provide visual feedback regarding whether the tissue sampling device will intersect the anatomical target.
  • the original predicted nominal trajectory S* and the associated UTL and LTL may be erased from the FOV, or displayed distinguishably from the updated counterparts (i.e., (S*)’, UTL’ and LTL’), such as in different colors.
  • Such update of the UIEs is “dynamic” in that the update can occur continuously, periodically, or on-demand over the course of the sampling device being advanced toward, but prior to intersecting, the anatomical target.
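  • To summarize the flow of method 900, the following condensed, hypothetical sketch strings the steps together; every helper function is a stand-in for the corresponding circuit described above, not a real API, and the threshold is an assumption.

```python
# Condensed, hypothetical pseudo-flow of method 900; all helper functions are
# stand-ins for the circuits described above, not a real API.
P_TH = 0.85  # illustrative probability threshold

def run_method_900(ultrasound, display):
    frame = ultrasound.next_frame()                    # step 910: imaging
    target = locate_target(frame)
    nominal, utl, ltl = predict_trajectory(target)     # step 920: predict S*
    display.overlay(frame, nominal, utl, ltl)
    while not intersected(frame, target):
        frame = ultrasound.next_frame()
        params = track_device(frame)                   # step 930: track device
        p_st = estimate_pst(params, nominal, target)
        if p_st < P_TH:                                # step 940: update UIEs
            nominal, utl, ltl = update_prediction(params, target)
        display.overlay(frame, nominal, utl, ltl, p_st)
```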
  • FIG. 10 illustrates generally a block diagram of an example machine 1000 upon which any one or more of the techniques (e.g., methodologies) discussed herein may perform. Portions of this description may apply to the computing framework of various portions of the EUS-TA planning system 400, such as one or more of components 411-417 of the controller circuit 410.
  • the machine 1000 may operate as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine 1000 may operate in the capacity of a server machine, a client machine, or both in server-client network environments. In an example, the machine 1000 may act as a peer machine in peer-to-peer (P2P) (or other distributed) network environment.
  • the machine 1000 may be a personal computer (PC), a tablet PC, a set-top box (STB), a personal digital assistant (PDA), a mobile telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing instructions (sequential or otherwise) that specify actions to be taken by that machine.
  • the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein, such as cloud computing, software as a service (SaaS), or other computer cluster configurations.
  • Examples, as described herein, may include, or may operate by, logic or a number of components, or mechanisms.
  • Circuit sets are a collection of circuits implemented in tangible entities that include hardware (e.g., simple circuits, gates, logic, etc.). Circuit set membership may be flexible over time and underlying hardware variability. Circuit sets include members that may, alone or in combination, perform specified operations when operating. In an example, hardware of the circuit set may be immutably designed to carry out a specific operation (e.g., hardwired).
  • the hardware of the circuit set may include variably connected physical components (e.g., execution units, transistors, simple circuits, etc.) including a computer readable medium physically modified (e.g., magnetically, electrically, moveable placement of invariant massed particles, etc.) to encode instructions of the specific operation.
  • the instructions enable embedded hardware (e.g., the execution units or a loading mechanism) to create members of the circuit set in hardware via the variable connections to carry out portions of the specific operation when in operation.
  • the computer readable medium is communicatively coupled to the other components of the circuit set member when the device is operating.
  • any of the physical components may be used in more than one member of more than one circuit set.
  • execution units may be used in a first circuit of a first circuit set at one point in time and reused by a second circuit in the first circuit set, or by a third circuit in a second circuit set at a different time.
  • Machine 1000 may include a hardware processor 1002 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), a hardware processor core, or any combination thereof), a main memory 1004 and a static memory 1006, some or all of which may communicate with each other via an interlink (e.g., bus) 1008.
  • the machine 1000 may further include a display unit 1010 (e.g., a raster display, vector display, holographic display, etc.), an alphanumeric input device 1012 (e.g., a keyboard), and a user interface (UI) navigation device 1014 (e.g., a mouse).
  • the display unit 1010, input device 1012 and UI navigation device 1014 may be a touch screen display.
  • the machine 1000 may additionally include a storage device (e.g., drive unit) 1016, a signal generation device 1018 (e.g., a speaker), a network interface device 1020, and one or more sensors 1021, such as a global positioning system (GPS) sensor, compass, accelerometer, or other sensors.
  • the machine 1000 may include an output controller 1028, such as a serial (e.g., universal serial bus (USB), parallel, or other wired or wireless (e.g., infrared (IR), near field communication (NFC), etc.) connection to communicate or control one or more peripheral devices (e.g., a printer, card reader, etc.).
  • the storage device 1016 may include a machine readable medium 1022 on which is stored one or more sets of data structures or instructions 1024 (e.g., software) embodying or utilized by any one or more of the techniques or functions described herein.
  • the instructions 1024 may also reside, completely or at least partially, within the main memory 1004, within static memory 1006, or within the hardware processor 1002 during execution thereof by the machine 1000.
  • one or any combination of the hardware processor 1002, the main memory 1004, the static memory 1006, or the storage device 1016 may constitute machine readable media.
  • while the machine readable medium 1022 is illustrated as a single medium, the term “machine readable medium” may include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) configured to store the one or more instructions 1024.
  • the term “machine readable medium” may include any medium that is capable of storing, encoding, or carrying instructions for execution by the machine 1000 and that cause the machine 1000 to perform any one or more of the techniques of the present disclosure, or that is capable of storing, encoding or carrying data structures used by or associated with such instructions.
  • Nonlimiting machine-readable medium examples may include solid-state memories, and optical and magnetic media.
  • a massed machine-readable medium comprises a machine readable medium with a plurality of particles having invariant (e.g., rest) mass. Accordingly, massed machine-readable media are not transitory propagating signals.
  • Specific examples of massed machine-readable media may include: non-volatile memory, such as semiconductor memory devices (e.g., Electrically Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM)) and flash memory devices; magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
  • the instructions 1024 may further be transmitted or received over a communication network 1026 using a transmission medium via the network interface device 1020 utilizing any one of a number of transfer protocols (e.g., frame relay, internet protocol (IP), transmission control protocol (TCP), user datagram protocol (UDP), hypertext transfer protocol (HTTP), etc.).
  • Example communication networks may include a local area network (LAN), a wide area network (WAN), a packet data network (e.g., the Internet), mobile telephone networks (e.g., cellular networks), Plain Old Telephone (POTS) networks, and wireless data networks (e.g., Institute of Electrical and Electronics Engineers (IEEE) 802.11 family of standards known as WiFi®, IEEE 802.16 family of standards known as WiMax®), IEEE 802.15.4 family of standards, peer-to-peer (P2P) networks, among others.
  • the network interface device 1020 may include one or more physical jacks (e.g., Ethernet, coaxial, or phone jacks) or one or more antennas to connect to the communication network 1026.
  • the network interface device 1020 may include a plurality of antennas to wirelessly communicate using at least one of single-input multiple-output (SIMO), multiple-input multiple-output (MIMO), or multiple-input single-output (MISO) techniques.
  • the term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding, or carrying instructions for execution by the machine 1000, and includes digital or analog communications signals or other intangible medium to facilitate communication of such software.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Human Computer Interaction (AREA)
  • Vascular Medicine (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)

Abstract

Systems, devices, and methods for planning an ultrasound-guided tissue acquisition procedure are disclosed. An example system includes an ultrasound imaging device to generate ultrasound images of an anatomical target in a real-time ultrasound field of view (FOV), a display to display the ultrasound images, and a controller circuit. Prior to an extension of a tissue sampling device into the ultrasound FOV, the controller circuit displays graphical user interface elements (UIEs) indicating a predicted nominal sampling device trajectory intersecting the anatomical target in the ultrasound FOV. Upon the extension of the tissue sampling device into the ultrasound FOV, the controller circuit determines one or more real-time trajectory parameters and updates the UIEs. The controller circuit displays the updated UIEs to provide visual feedback indicating whether the tissue sampling device will intersect the anatomical target.
PCT/US2024/045395 2023-09-07 2024-09-05 Needle trajectory prediction Pending WO2025054333A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US202363581084P 2023-09-07 2023-09-07
US63/581,084 2023-09-07
US202363607654P 2023-12-08 2023-12-08
US63/607,654 2023-12-08

Publications (1)

Publication Number Publication Date
WO2025054333A1 true WO2025054333A1 (fr) 2025-03-13

Family

ID=92926148

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2024/045395 Pending WO2025054333A1 (fr) Needle trajectory prediction

Country Status (1)

Country Link
WO (1) WO2025054333A1 (fr)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140276051A1 (en) * 2013-03-13 2014-09-18 Gyrus ACM, Inc. (d.b.a Olympus Surgical Technologies America) Device for Minimally Invasive Delivery of Treatment Substance
EP3206617B1 (fr) * 2014-10-17 2019-11-27 Koninklijke Philips N.V. Système de segmentation d'organe en temps réel et navigation d'outil pendant l'insertion d'un outil au cours d'une thérapie interventionnelle

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
FELIX J.F. HERTH ET AL: "Flexible Bronchoscopy and its Role in the Staging of Non-Small Cell Lung Cancer", CLINICS IN CHEST MEDICINE, vol. 31, no. 1, 2 March 2010 (2010-03-02), pages 87 - 100, XP055131121, ISSN: 0272-5231, DOI: 10.1016/j.ccm.2009.08.006 *

Similar Documents

Publication Publication Date Title
US20250345124A1 (en) Anatomical feature tracking
US20250325327A1 (en) Alerting and mitigating divergence of anatomical feature locations from prior images to real-time interrogation
JP2023519714A (ja) 標的解剖学的特徴の位置特定
EP4170675A1 (fr) Réglage automatique de la position et de la force dans une endoscopie
US12478433B2 (en) Image guidance during cannulation
US20250288186A1 (en) Ai-based endoscopic tissue acquisition planning
CN117615724A (zh) 医疗器械指导系统和相关联方法
US20240324870A1 (en) Medical instrument guidance systems, including guidance systems for percutaneous nephrolithotomy procedures, and associated devices and methods
US20240197403A1 (en) Endoscopic ultrasound guided tissue acquisition
WO2025054333A1 (fr) Prédiction de trajectoire d'aiguille
WO2025117336A1 (fr) Cathéters orientables et différences de force de fil
US20230122179A1 (en) Procedure guidance for safety
WO2025054243A1 (fr) Recommandation d'équipement d'échantillonnage basée sur un emplacement cible
US20230119097A1 (en) Endoluminal transhepatic access procedure
US20240197163A1 (en) Endoscopy in reversibly altered anatomy
US20230363628A1 (en) Wire puncture of stricture for pancreaticobiliary access
US20230225802A1 (en) Phase segmentation of a percutaneous medical procedure
US20250170363A1 (en) Robotic catheter tip and methods and storage mediums for controlling and/or manufacturing a catheter having a tip
WO2025085314A1 (fr) Détection d'anomalie d'instrument pour dispositif médical
WO2025059143A1 (fr) Commande de retrait d'endoscope guidé par l'image
WO2025111486A1 (fr) Endoscope à capacité de navigation
WO2024081745A2 (fr) Localisation et ciblage de petites lésions pulmonaires
WO2025059207A1 (fr) Appareil médical doté d'une structure de support et son procédé d'utilisation
WO2025072201A1 (fr) Commande robotique pour robot continuum

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24782379

Country of ref document: EP

Kind code of ref document: A1