
WO2024263947A1 - Transponder tracking and ultrasound image enhancement - Google Patents

Transponder tracking and ultrasound image enhancement

Info

Publication number
WO2024263947A1
Authority
WO
WIPO (PCT)
Prior art keywords
optical sensor
ultrasound
acoustic
sensor
signals
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/US2024/035062
Other languages
English (en)
Inventor
Jiangang Zhu
Mucong Li
Linhua Xu
Michael Hazarian
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Deepsight Technology Inc
Original Assignee
Deepsight Technology Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US 18/492,593 (external priority, US12025489B1)
Priority claimed from US 18/382,984 (external priority, US20240423482A1)
Application filed by Deepsight Technology Inc
Priority claimed from US 18/749,712 (external priority, US20240426650A1)
Publication of WO2024263947A1
Pending legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/08 Clinical applications
    • A61B 8/0833 Clinical applications involving detecting or locating foreign bodies or organic structures
    • A61B 8/0841 Clinical applications involving detecting or locating foreign bodies or organic structures for locating instruments
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/0059 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/44 Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
    • A61B 8/4416 Constructional features of the ultrasonic, sonic or infrasonic diagnostic device related to combined acquisition of different diagnostic modalities, e.g. combination of ultrasound and X-ray acquisitions
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5215 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
    • A61B 8/5238 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image
    • A61B 8/5261 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image combining images from different diagnostic modalities, e.g. ultrasound and X-ray

Definitions

  • Figure 3 is a graph illustrating a method of triangulation in one example; and Figures 4A, 4B, and 4C provide examples of fiber sensors detecting acoustic signals as a point sensor or a line sensor.
  • Figures 5 and 6 show example methods for transponder tracking and ultrasound image enhancement.
  • Figure 7 depicts a flow chart of an embodiment of a method for enhancing an ultrasound image using a point sensor or a line sensor.
  • Object visualization, tracking, and location in medical applications may be important aspects for performing medical procedures in a safe and reliable manner.
  • Therapeutic and diagnostic medical applications include ultrasound imaging as well as sensing (e.g., tracking, visualizing, and monitoring) of objects (e.g., needle, catheter, guidewire, etc.) during guided needle access, biopsy, aspiration, delivery of drugs, biologics, anesthesia, or other therapeutics, catheterization, minimally invasive procedures, ablation, cauterization, placement or moving of objects or tissue, cutting, sectioning, and other medical procedures.
  • Procedures and applications in the following disciplines are examples of the wide usage and need for accurate guidance and imaging during diagnostic and therapeutic procedures: anesthesia, cardiology, critical care, dermatology, emergency medicine, endocrinology, gastroenterology, gynecology and obstetrics, hepatology, infectious diseases, interventional radiology, musculoskeletal medicine, nephrology, neurology, oncology, orthopedics, pain management, pediatrics, plastic and reconstructive surgery, urology, vascular access, and other disciplines.
  • ultrasound is used in industrial applications for defect detection, microparticle sorting, non-destructive testing, structural testing, geological applications including mining and drilling operations, and underwater marine applications, among others. Such applications are consistent with embodiments described herein.
  • Objects for tracking, visualization, and location may include any type of medical device that travels or is located within the body of a subject. For instance, medical practitioners visualize and track a needle tip while conducting a biopsy to ensure safety. In such instances, accurate needle tip visualization or tracking may help to prevent or reduce unintentional vascular, neural, tissue, or visceral injury. Similarly, it may be helpful to visualize, track, or locate needles, endoscopes, cannulas, laparoscopic tools, or other medical device tools when performing medical procedures.
  • an ultrasound transponder is coupled to a medical device, such as a needle, that is to be inserted into the tissue or body lumen of a patient, such as a human or animal.
  • the transponder may be an ultrasound receiver or transmitter or a combination of both.
  • the example transponder includes a sensor, such as a point sensor, a line sensor, or a sensor formed in some other known shape.
  • the transponder may be coupled to one end of the needle, such as the distal end of a needle, which is the end that first penetrates the tissue or enters a body cavity or lumen.
  • multiple transponders are coupled to the needle or medical device.
  • one transponder may be coupled to the distal end while another is coupled to the mid-point of the needle or other area that will provide positional information helpful during the procedure.
  • the transponder may also be formed in an array (e.g., 1D, 1.5D, 2D, etc.) that may be linear, annular, or curved depending on a form factor of the needle or medical tool to which the transponder is secured and/or the imaging area of interest.
  • the transponder includes a transmitter
  • the transmitter may or may not be integral with the sensor and may be on the medical device being tracked or on a component of the medical device delivery system, such as a catheter, cannula, or endoscope.
  • the example system also includes an ultrasound probe.
  • the ultrasound probe includes an array of transducers that output and receive a plurality of acoustic beamforming pulses or signals.
  • the example system also includes a computer processor, display and associated electronics for receiving data from the ultrasound probe and utilizing the data to generate an ultrasound image.
  • the transponder in the example system is also in communication with a processor and associated electronics. When the transponder senses the acoustic pulses from the probe, it provides information to the processor that may be used to determine the location of the transponder in relation to the probe. For example, the location of the transponder may be determined by triangulation or by coherent image formation. The location of the transponder sensor can then be used to display the transponder in conjunction with the ultrasound image, e.g., the transponder location overlaid on the ultrasound image.
  • the transponder may also work as a receiver that detects scattered acoustic signals and/or tissue harmonics.
  • the transponder may detect weak scattered or harmonic signals that are unable to propagate very far (e.g., acoustic signals that have too low of a signal-to-noise ratio to be detected by probe 100 in Figure 1).
  • the transponder transmits detection of these signals to the processor.
  • the processor uses the signals detected by the transponder to reconstruct the ultrasound image of the anatomy and insonified region surrounding the transponder (e.g., with a delay and sum beamforming method).
  • the transponder also includes an emitter, such as a transducer, which can transmit a plurality of ultrasound pulses.
  • the ultrasound probe receives these pulses and transmits corresponding signals to the processor.
  • the transponder sensor may also receive reflections of these ultrasound pulses and transmit corresponding signals to the processor.
  • the processor uses the signals in conjunction with the location of the transponder to coherently reconstruct the ultrasound image of the anatomy surrounding the transponder. This allows the ultrasound processor to generate an image of better quality than one generated solely based on the pulses emitted by the ultrasound probe. It is to be understood that the transponder does not include an emitter in some embodiments.
  • Figure 1 is an example of a system 101 for ultrasound visualization of a transponder, such as a transponder coupled to a medical device.
  • System 101 may be used for ultrasound transponder visualization of a medical device, such as needle 10 present in a media 5 (e.g., body tissue, body cavity, body lumen).
  • system 101 may be used for ultrasound visualization of other medical devices such as a catheter, a guidewire, an intravenous (IV) line, an endoscope, a trocar, an implant, or combinations thereof.
  • System 101 may also be used to enhance visualization of aspects present in the medium 5, such as, for example, organs, vessels, tissue, tumors, other anatomical structures, other medical devices, or implants.
  • examples of this disclosure may be utilized to locate non-medical devices as well, such as applications in non-medical industries that use ultrasound imaging and/or tracking.
  • the elements of the probe 100 may be arranged as an array such as an ultrasound array.
  • probe 100 may include one or more acoustic energy generating (AEG) transducers, such as one or more of a piezoelectric transducer, a lead zirconate titanate (PZT) transducer, a polymer thick film (PTF) transducer, a polyvinylidene fluoride (PVDF) transducer, a capacitive micromachined ultrasound transducer (CMUT).
  • the probe 100 can be a traditional ultrasound probe with an acoustic energy generating transmitter and receiver, or the probe 100 can be an acoustic-optical probe (e.g., as described in US application 63/450,554, filed on March 7, 2023, titled “Mixed Array Imaging Probe,” US application 17/990,596, filed on November 18, 2022, titled “Mixed Ultrasound Transducer Arrays,” and US application 17/244,605, filed on April 29, 2021, titled “Modularized Acoustic Probe”).
  • the fiber optical sensors include resonant structures, including, but not limited to, Fabry-Perot (FP) resonators, whispering-gallery-mode resonators, optical cavities, and photonic crystal resonators; interferometers, including, but not limited to, MZI, phase-shift coherent interferometers, and self-mixing interferometers; acoustic-induced birefringent polarization sensors; fiber end facets with acoustic responsive structures, such as metasurfaces including patterns of small elements arranged to change the wavefront shape of the acoustic signals and maximize the collection of acoustic signals, and low-dimensional materials with special optomechanical features that are more prone to deformation; and plasmonic structures patterned to amplify light-matter interactions.
  • the fiber end facet structures can also be added to the other fiber optical sensors to further enhance acoustic response.
  • These optical structures are configured to respond to acoustic (such as ultrasound) signals. Responses to acoustic signals in interference-based fiber optical sensors may be due to the photo-elastic effect and/or physical deformation of the structures.
  • the resonant structures, interferometer structures, or fiber end facets with acoustic responsive structures are subject to mechanical stress and/or strain from the alternating pressures of the acoustic signal sound waves.
  • a system comprises the optical sensor for sensing acoustic signals used for calculating a position of a device within a medium, while the optical sensor is also within the medium.
  • the sensor can be coupled with the device (e.g., a needle) for insertion into the medium.
  • the device can be part of a third-party system (e.g., so that the sensor provides additional capabilities to the third-party system).
  • the sensor and the device are provided as a unit to be incorporated into a third-party system (e.g., a third-party system comprising the probe 100 and processing system 200 in Figure 1).
  • the cylindrical body may be open at one end and may taper into a distal tip (e.g., hollow tip) at the other end.
  • the tip of the needle 10 may include an attachment (e.g., connector) for a stem having a piercing tip configured to pierce through a predetermined medium (e.g., skin of a patient or tissue in order to obtain a biopsy sample).
  • the stem may be slender so as to be narrower in diameter than the needle 10.
  • the tip may be any suitable type of tip, such as Slip-Tip®, Luer-Lok®, eccentric, etc.
  • the optical sensor 20 may be coupled to one or more optical waveguides 22 (e.g., optical fibers, photonic integrated circuit waveguides, or other optical transmitting channel) to transmit the set of optical signals to the processing system 200.
  • the processing system 200 may be configured to generate a real time transponder location indicator based on the optical signals.
  • the transponder indicator may be representative of a position of the tip of the needle 10 and/or may be used to track the tip of the needle 10.
  • the tip of the needle 10 may be visualized and tracked based on the transponder indicator. Accordingly, a needle 10 may be reliably visualized and tracked during a medical procedure using at least a single optical sensor 20.
  • the example needle 10 shown in Figure 2A may also include at least one emitter 24 as part of the system delivering the needle, such as a catheter, cannula, endoscope or the like.
  • the emitter may be, for example, an AEG transducer, such as a PZT transducer element or array.
  • the example shown in Figure 2A includes 4 emitters, but examples may include fewer or additional emitters.
  • the emitters generate signals that can be received by transducers on the probe 100.
  • the emitter 24 and sensor 20 may be combined in some embodiments.
  • the signals received by the probe can be used to determine the location of the needle 10 either by triangulation or coherent image formation as described herein.
  • the signals can also be used to enhance the ultrasound image produced by the processing system 200.
  • the processing system 200 can combine information from the signals generated by the probe 100 and by the emitter 24 coupled to the needle 10 to provide a higher quality image, particularly of structures surrounding the tip of the needle 10.
  • Figure 2B illustrates a cross-sectional view of an example of a system in which two optical sensors 20 are attached to a needle 10 for tracking and/or determining a position of needle 10.
  • a first optical sensor 20 may be arranged on a distal tip of the needle 10 while a second optical sensor 20 may be proximal to the first optical sensor 20 (e.g., arranged on an elongate member of the needle 10) or may be coupled at the mid-point or elsewhere on the needle 10.
  • the first and second optical sensors 20 may be configured to receive acoustic signals generated by probe 100 in Figure 1.
  • Figure 2B also illustrates an emitter 24.
  • the example shown in Figure 2B illustrates 2 emitters, but examples may include fewer or additional emitters.
  • the emitters generate signals that can be received by transducers on the probe 100.
  • the emitter 24 and sensor 20 may be combined in some embodiments.
  • Figure 2A illustrates a single optical sensor 20 for visualizing and tracking a needle 10
  • Figure 2B illustrates two optical sensors 20 for visualizing and tracking the needle 10
  • a suitable number of optical sensors may be used to visualize and track a medical device (e.g., three or more optical sensors, such as three, four, five, or more optical sensors and/or sensors configured in a linear, annular, curved, or other suitable array).
  • These optical sensors may be attached to, coupled to, integrated with, or otherwise mounted on a suitable part of a medical device/instrument.
  • multiple optical sensors 20 on a single needle 10 may facilitate tracking of a bend of the needle 10 in addition to visualizing and tracking the position of the needle tip.
  • the system 101 in Figure 1 is described and depicts needle tracking solely for illustrative purposes. It should be readily understood that any other object (e.g., end effector, catheter, guidewire, endoscope, trocar, implant) may be visualized and/or tracked using the systems and methods described herein.
  • the transponder can include an interferometer sensor, a resonator sensor, a fiber end facet with acoustic responsive structures, and/or a polarization (birefringence) sensor (e.g., as described in U.S. Application No. 18/492,593, titled “FIBER-OPTICAL SENSOR SYSTEM FOR ULTRASOUND SENSING AND IMAGING”).
  • the fiber end facet structures may include acoustically responsive microstructures, such as metasurfaces including patterns of small elements arranged to change the wavefront shape of the acoustic signals and maximize the detection of acoustic signals, acoustically responsive low-dimensional materials with optomechanical features selected to optimize acoustic response (e.g., features that are more prone to deformation when receiving acoustic signals or exhibit greater material responses to acoustic signals), and plasmonic structures patterned to amplify light-matter interactions. Plasmonic structures may locally amplify incident light due to their plasmonic resonance.
  • the transponder can be used to ascertain a device's location and/or orientation while a fiber sensor is mounted on the device.
  • the device can be a needle, catheter, endoscope, surgical tool, biopsy tool, etc.
  • Previously described transponder sensors (e.g., sensor 20 in Figure 2) may be “point like” in that the sensor 20 has a dimension close to or smaller than a certain feature size that is meaningful for an application, such as a wavelength of an acoustic signal or a diameter of a needle (e.g., sensor 20a in Figure 4A).
  • the fiber sensor using polarization may, in addition to being “point like,” also be “line like” or “line type” (e.g., sensors 20b and 20c in Figures 4B and 4C).
  • a line type sensor can use a polarization sensitive detection mechanism in an optical fiber.
  • an acoustic signal can be sensed using polarization of light within a waveguide (e.g., an optical fiber).
  • acoustic signals can be detected most strongly when the acoustic signal propagates in a direction orthogonal to (e.g., in a direction orthogonal to a tangent of) the optical fiber (e.g., see Figure 4B and Figure 4C).
  • the orthogonal direction may also be referred to as lateral, substantially lateral, or from any direction relative to the axis of the optical fiber.
  • Many sections, or portions, of the optical fiber can be sensitive to an acoustic signal, because the acoustic signal changes the polarization state of light within the sections of the optical fiber.
  • Detection of lateral signals at multiple points along the length of the sensors 20b and 20c may enhance an ability to track and/or locate the sensor fibers when they are disposed within a medium (e.g., within a human body during a medical procedure).
  • multiple signals incident along the length of the sensor fibers may enhance an ability to determine the location of different portions of the sensor fibers along their length and therefore to identify the location of the entire sensor fibers 20b and 20c, and not just a tip region as with sensor 20a.
  • multiple signals incident along the length of the sensor fibers 20b and 20c may enhance an ability to determine the location of different portions of the sensor fibers 20b and 20c and therefore to identify curvature of the sensor fiber 20c with greater accuracy.
  • a groove or channel may be fabricated on a device inner or outer surface to allow the optical fiber to be embedded in it; the optical fiber can be glued on the surface directly; and/or the optical fiber can be covered in a protective material layer, such as a polymer coating or other acoustically transparent material.
  • the line type fiber sensor 20b or 20c can be used in lieu of, or in combination with, one or more point like 20a sensors.
  • an imaging system comprises the probe 100 and a transponder sensor 20.
  • a “delay-and-sum” beamforming method may be applied to generate an ultrasound image of the surrounding medium (tissue).
  • ultrasound is transmitted from a probe/transducer array (possibly multiple transmits with different transmit patterns), and the medium/tissue scattering signal is received by the transponder sensor/sensors to form an ultrasound image.
  • Signals from multiple transponder sensors, or signals from the same sensor but at different locations, can be coherently combined to form the ultrasound image.
  • the locations of the transponder sensors are known or can be calculated at the time of signal acquisition.
  • a delay used to calculate the delay-and-sum beamforming corresponds to an orthogonal line distance from each pixel (or voxel in 3D imaging) to the transponder sensor 20b line location (e.g., see Figure 4B). If the “line type” transponder is curved, there may be multiple delay values for each pixel (or voxel) since there may be multiple orthogonal line paths from it to the transponder line sensor 20c (e.g., see Figure 4C).
  • the line type sensor 20b and/or 20c offers a simpler front-end design; optical detection is performed on the back end (e.g., using a polarization analyzer), and/or wavelength locking may not be required.
  • a location of tissue scattering can be calculated based on a propagation time of the acoustic signal (e.g., assuming the scattering signal is incident orthogonal to the optical fiber).
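As an illustrative aid (not taken from the patent text), the following is a minimal delay-and-sum sketch in Python/NumPy for a point-type transponder receiver, assuming single-element transmits, a known sensor location, and a speed of sound of 1540 m/s; all names (`rf`, `elem_pos`, and so on) are hypothetical:

```python
import numpy as np

def das_transponder_image(rf, elem_pos, sensor_pos, pixels, fs, c=1540.0):
    """Delay-and-sum image from traces recorded by a point transponder sensor.
    rf: (n_tx, n_samples) trace per transmitting element
    elem_pos: (n_tx, 3) transmit element positions (m)
    sensor_pos: (3,) transponder sensor position (m)
    pixels: (n_pix, 3) pixel positions (m); fs: sample rate (Hz)"""
    n_tx, n_samp = rf.shape
    img = np.zeros(len(pixels))
    for p, px in enumerate(pixels):
        d_tx = np.linalg.norm(elem_pos - px, axis=1)   # element -> scatterer
        d_rx = np.linalg.norm(px - sensor_pos)         # scatterer -> sensor
        idx = np.round((d_tx + d_rx) / c * fs).astype(int)
        ok = (idx >= 0) & (idx < n_samp)
        img[p] = np.abs(rf[np.flatnonzero(ok), idx[ok]].sum())
    return img
```

For the line type sensor 20b, the receive distance `d_rx` would instead be the orthogonal distance from the pixel to the sensor line, and a curved sensor 20c may contribute multiple delay values per pixel, as described above.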
  • triangulation may be used to determine a position of one or more of the optical sensors.
  • Ultrasound is transmitted from the probe 100, one or more external elements or array, or an in vivo array (e.g., an array for EBUS, EUS, IVUS).
  • the transducers on the probe 100 emit at least two signals with different wavefronts.
  • the transponder sensor 20 location is determined by the interception point of the different transmit wavefronts at respective received pulse timing.
  • the pulse timing for the ultrasound transmission is determined by extracting and matching the known pulse shape from the transponder-received time sequence ultrasound signal.
  • the pulse timing can be extracted when the pulse signal's signal-to-noise ratio is higher than a certain threshold.
  • a matched filter for known pulse shape or a Wiener filter can be used to enhance the pulse detection fidelity.
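A minimal sketch of this pulse-timing step, assuming the transmitted pulse shape is known and using NumPy cross-correlation as the matched filter (the variable names and SNR threshold are illustrative assumptions):

```python
import numpy as np

def pulse_arrival_time(trace, pulse, fs, snr_threshold=5.0):
    """Return the arrival time (s) of a known pulse in a recorded trace,
    or None when the matched-filter peak is below the SNR threshold."""
    mf = np.correlate(trace, pulse, mode="valid")  # matched filtering
    peak = int(np.argmax(np.abs(mf)))
    noise = np.median(np.abs(mf)) + 1e-12          # crude noise-floor estimate
    if np.abs(mf[peak]) / noise < snr_threshold:
        return None                                # SNR too low for timing
    return peak / fs                               # sample index -> seconds
```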
  • Figure 3 is a schematic illustrating example positions of probe transducer elements 122 configured to emit acoustic pulses and an example position of an optical sensor 20 in a Cartesian coordinate system.
  • the optical sensor 20 may be arranged on an object (not shown) to be tracked. The location of the transponder optical sensor 20 may be determined using the Cartesian coordinate system as described in the example below.
  • three probe transducer elements 122 may be configured to emit acoustic pulses.
  • the probe transducer elements 122 may form an array (e.g., a 1.5D ultrasound array) of a probe (e.g., probe 100).
  • the probe may be configured to emit acoustic beamforming pulses.
  • Optical sensor 20 may be configured to detect the beamforming signals corresponding to the acoustic beamforming pulses.
  • the three probe transducer elements 122 are located at P1: (−a, 0, 0), P2: (a, 0, 0), P3: (0, b, 0), and the optical sensor is located at P: (x, y, z).
  • Equation 5 may be determined from Equation 4. Equation 5 indicates that b ≠ 0. That is, the third element cannot be on the line determined by the first element and the second element. For example, the first, second, and third elements may form a triangle. Accordingly, the third element is offset in a first dimension (e.g., elevation dimension).
  • r2 and r3 may be determined in a similar manner as r1. Therefore, the location of the optical sensor 20 may be determined based on the time required for an acoustic pulse to travel from an element 122 to the optical sensor 20, as in the sketch below.
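A closed-form sketch of this triangulation for the Figure 3 geometry, with elements at P1: (−a, 0, 0), P2: (a, 0, 0), P3: (0, b, 0); it assumes b ≠ 0 and z > 0 (sensor in front of the probe), and converts times of flight to distances with an assumed speed of sound:

```python
import numpy as np

def triangulate(t1, t2, t3, a, b, c=1540.0):
    """Sensor position from one-way times of flight t1, t2, t3 (s) measured
    from elements at P1 (-a,0,0), P2 (a,0,0), P3 (0,b,0); b must be nonzero."""
    r1, r2, r3 = c * t1, c * t2, c * t3             # distances r_i = c * t_i
    x = (r1**2 - r2**2) / (4.0 * a)                 # from r1^2 - r2^2 = 4ax
    y = (r1**2 - r3**2 - 2.0 * a * x - a**2 + b**2) / (2.0 * b)
    z2 = r1**2 - (x + a)**2 - y**2                  # r1^2 = (x+a)^2 + y^2 + z^2
    return np.array([x, y, np.sqrt(max(z2, 0.0))])  # z > 0 assumed
```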
  • the location of the optical sensor 20 may be determined by detecting acoustic signals (e.g., echoes) corresponding to acoustic pulses from three probe transducer elements 122. In some examples, more than three elements 122 may be used to determine the location of the optical sensor.
  • the elements 122 may be positioned in any suitable manner.
  • for three-dimensional tracking, the elements 122 and the sensor 20 cannot all be in the same plane.
  • a first and second element may be arranged along a lateral dimension and a third element may be arranged along an elevation dimension transverse to the lateral dimension where the third element does not intersect the lateral dimension (e.g., so as to be arranged as vertices of a triangle).
  • the third element in this example is not aligned with respect to the lateral dimension of the first and second elements.
  • the first and second elements are offset with respect to each other but are aligned in the lateral dimension.
  • using more than three elements 122 may improve the accuracy of the determined location of the optical sensor 20.
  • more than one optical sensor 20 may be used to detect acoustic signals.
  • the position of each optical sensor 20 may be determined in a manner similar to that described above. If probe transducer elements 122 and the optical sensor 20 are in the same plane, 2D tracking information within that plane can still be obtained. In this case, at least two transducer elements 122 are used.
  • the location of the optical sensor 20 is determined by coherent image forming.
  • Features are most easily identified in ultrasound images when they differ in image brightness.
  • the intensity of the image in an ultrasound imaging system is a function of the amplitude of the beamformed received signal, i.e., the amplitude after coherent addition of the delayed received signal from each transducer element.
  • multiple ultrasound firings are transmitted by the external elements or array of the probe 100 from different locations and/or directions and with different wavefronts (similar to ultrasound imaging transmit sequences).
  • the pixel values are calculated from the transponder-received signal of the multiple transmissions, with the assumption that the optical sensor 20 is at the location of that pixel.
  • the obtained image adds signals coherently only at the true transponder location where the received signal aligns, and ultrasound interference is constructive.
  • the transponder signal image allows transponder sensor 20 position determination because only the transponder location will light up in the image (with the ultrasound physics limiting the transponder image spot size).
  • a single point transponder location can be extracted from the bright transponder spot in the transponder signal image by different methods (e.g., maximal pixel value, median filter, center of brightness weight, etc.).
  • the advantage of using the coherent transponder tracking image is that the received transponder signal from different transmits is first added coherently, and then the pulse timing is determined on the coherently summed signal, where the signal-to-noise ratio (SNR) is much higher than for a single received time sequence signal.
  • an ultrasound image can be generated at the same time of transponder tracking.
  • This coherent beamforming transponder imaging method can also be used for 3D tracking of the transponder.
  • the probe 100 will have (e.g., at least) three probe transducer elements 122, and (e.g., at least) one probe transducer element 122 is outside the plane defined by the optical sensor 20 and the other two probe transducer elements 122 of the probe 100, as shown in Figure 3.
  • the acoustic sensing signals received by the optical sensor 20 from different transducer elements 122 of the probe 100 are summed at the processing system 200 so that a net signal representing the ultrasound signal emitted from each transducer element 122 of the probe 100 is obtained.
  • the amplitude of the summed signal represents the intensity of the signal received and thus corresponds to the distance along the beam associated with the signal at the angle from the sensor 20 to the probe transducer element 122.
  • Summing of the individual signals is accomplished by providing separate time delay (and/or phase) and gain to the signal from each transducer element 122 in the probe 100.
  • the output signal from the sensor 20 corresponding to each beam forming channel is then coherently added, i.e., each channel is summed, to form a respective pixel intensity value for each beam.
  • the pixel intensity values can be logarithmically compressed, scan converted, and then displayed as an image of the tip of the needle 10 where the sensor 20 is located or the entire needle when multiple sensors 20 are utilized.
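The following is an illustrative sketch (not the patent's implementation) of forming the coherent transponder tracking image and extracting the bright-spot location; it again assumes single-element transmits and hypothetical variable names:

```python
import numpy as np

def transponder_tracking_image(rf, elem_pos, pixels, fs, c=1540.0):
    """Coherent transponder image: for each candidate pixel, sample every
    transmit's sensor trace at the arrival time that pixel would imply and
    sum coherently; only the true sensor location adds constructively."""
    n_tx, n_samp = rf.shape
    img = np.zeros(len(pixels))
    for p, px in enumerate(pixels):
        d = np.linalg.norm(elem_pos - px, axis=1)   # one-way element -> pixel
        idx = np.round(d / c * fs).astype(int)
        ok = idx < n_samp
        img[p] = np.abs(rf[np.flatnonzero(ok), idx[ok]].sum())
    return img

def extract_location(img, pixels):
    # simplest extraction method named above: maximal pixel value
    return pixels[np.argmax(img)]
```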
  • transponder sensors 20 can share or receive the same external elements or array firing sequence signals from probe 100 for tracking each of their respective locations.
  • Coded excitation may be used to increase the signal-to-noise ratio (SNR).
  • Such coded excitation may be used in conjunction with a long or multi-pulse, chirp-signal technique for the ultrasound firing sequences.
  • the received transponder sensor signals can be applied to a matched filter/Wiener filter for pulse compression to achieve a much higher SNR for the pulse timing determination and/or a much better axial resolution in the beamformed transponder signal image. The resulting higher SNR can increase transponder tracking accuracy.
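A self-contained sketch of coded excitation with a linear chirp and matched-filter pulse compression; all parameters (sample rate, bandwidth, burst length, noise level) are assumptions chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 40e6                                    # sample rate (Hz), assumed
t = np.arange(0, 10e-6, 1 / fs)              # 10 us coded transmit burst
f0, f1 = 2e6, 8e6                            # assumed chirp band (Hz)
k = (f1 - f0) / t[-1]                        # linear sweep rate
chirp = np.sin(2 * np.pi * (f0 * t + 0.5 * k * t**2))

rx = rng.normal(0.0, 0.5, 4096)              # simulated noisy sensor trace
true_delay = 1200
rx[true_delay:true_delay + chirp.size] += chirp

# pulse compression: the long, low-amplitude code collapses to a sharp
# peak, improving both SNR and axial resolution for timing determination
compressed = np.correlate(rx, chirp, mode="valid")
print("estimated delay (samples):", int(np.argmax(np.abs(compressed))))
```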
  • the transponder includes an emitter 24
  • the external element or array of the probe 100 can be used to triangulate or beamform to get the transponder location.
  • the single point transponder transmits a signal towards the probe 100, and the signal is received by the individual external transducer elements 122 of the probe 100.
  • the position of the transponder can be determined by either the triangulation method or the coherent transponder tracking image method as described above. Multiple transponder emitters can be used and can transmit at the same time, each of which will show up as a bright spot in the transponder tracking image.
  • Figures 4A, 4B, and 4C depict embodiments of sensing using sensors 20a, 20b, and 20c.
  • sensor 20a is a point-like sensor.
  • sensor 20b is a fiber polarimetric sensor that is a straight line receiver. Sensor 20b receives scattering from lateral directions.
  • sensor 20c is a fiber polarimetric sensor that is a curved line receiver. Sensor 20c receives scattering from orthogonal directions.
  • the optical sensor structures are configured to detect the acoustic signal across a directional range of at least 180 degrees, at least 270 degrees, at least 300 degrees, at least 330 degrees, or at least 360 degrees.
  • One or more electrical signals can be generated as sensor data based on one or more detected optical responses to light propagation within one or more optical sensors 20 in response to one or more acoustic signals incident on the one or more sensors 20.
  • the sensor data can be used to enhance an ultrasound image.
  • the probe 100 is used to generate an ultrasound image (e.g., a first image); sensor data is used to generate a sensor image (e.g., a second image, based on known time and location generation of acoustic pulses from the probe 100 and/or a known location of the sensor 20 with respect to the probe 100); and the sensor image is combined with the ultrasound image (e.g., by image fusion using processor 260 in Figure 1).
  • sensor data is sent to the processor 260 in Figure 1 without generating a sensor image (e.g., the processor 260 generates the enhanced image based on the sensor data and data from the probe 100 so that one image, the third image, is generated and the first image and/or the second image is not generated separately from the third image).
  • the first image (the ultrasound image) and third image (the enhanced image) are generated without the second image (the sensor image).
  • the second image (the sensor image) is generated without generating the third image (the enhanced image) or the first image (the ultrasound image).
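A minimal image-fusion sketch for the combination step described above, assuming the sensor image and the probe image are already co-registered on the same pixel grid (the blending weight is an arbitrary illustrative choice):

```python
import numpy as np

def fuse_images(ultrasound_img, sensor_img, alpha=0.6):
    """Blend two co-registered images after normalizing each to [0, 1]."""
    def normalize(im):
        im = np.asarray(im, dtype=float)
        span = im.max() - im.min()
        return (im - im.min()) / span if span > 0 else np.zeros_like(im)
    # weighted blend; other fusion rules (e.g., per-pixel max) also work
    return alpha * normalize(ultrasound_img) + (1.0 - alpha) * normalize(sensor_img)
```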
  • a device path can be ascertained by a transponder sensor.
  • a transponder sensor or multiple transponder sensors, are integrated on a device (e.g., a needle, catheter, etc.)
  • the location history of the transponder sensor/sensors can be used to determine the path the device has taken.
  • the history path can be used to provide valuable medical information. In some applications, it can be used to predict the device movement. For example, when a needle has travelled a certain distance, using its location history, a projected needle path can be predicted and/or overlaid on the ultrasound image. In doing so, one can assume, in some embodiments, that the needle is taking a straight path, or a curved path that can be defined by the history locations.
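A sketch of projecting a path from the location history under the straight-path assumption mentioned above (a simple least-squares line fit; the extrapolation distance is an assumed parameter):

```python
import numpy as np

def project_path(history, extend=0.02, n_points=20):
    """history: (n, 3) past sensor locations, oldest first.
    Returns (n_points, 3) projected locations up to `extend` meters ahead."""
    history = np.asarray(history, dtype=float)
    centroid = history.mean(axis=0)
    _, _, vt = np.linalg.svd(history - centroid)   # principal direction
    direction = vt[0]
    if direction @ (history[-1] - history[0]) < 0: # orient with the motion
        direction = -direction
    steps = np.linspace(0.0, extend, n_points)
    return history[-1] + steps[:, None] * direction
```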
  • the history path can also be used to indicate the physiological structure the device has gone through. For example, a catheter device travelling through a blood vessel can map the shape of the vessel from the history path of the device transponder sensor.
  • the history path of a device can also serve as records of medical operation and/or to evaluate operation performance and safety. For example, the history of the two transponders on the two sides of a forceps can be used to determine how many times they have closed/opened.
  • One or more transponder sensors can be used to ascertain the shape and/or orientation of the device.
  • a transponder sensor or multiple transponder sensors are integrated on a device (e.g., a needle, catheter, etc.)
  • the locations of the transponder sensor/sensors can be used to ascertain the shape and/or orientation of the device.
  • the locations of multiple transponder sensors along a catheter can be used to ascertain the shape of the catheter (e.g., as a point-by-point curve).
  • the shape of the catheter can then be used to ascertain the shape of the physiological structure it is in, for example a blood vessel or a lung bronchus.
  • the locations of two transponder sensors on a needle can be used to ascertain the orientation and position of the needle (e.g., assuming the needle is a straight line).
  • the locations of three transponder sensors can be used to ascertain the orientation and position of a surface of a medical device (three points form a surface), or the medical device itself if it is a rigid body.
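As a sketch of these geometric uses (illustrative only): two sensor locations give the axis of a straight needle, and three give the normal of a device surface:

```python
import numpy as np

def needle_axis(tip, other):
    """Unit vector along a straight needle from two transponder locations."""
    v = np.asarray(tip, float) - np.asarray(other, float)
    return v / np.linalg.norm(v)

def surface_normal(p1, p2, p3):
    """Unit normal of the plane through three transponder locations."""
    p1, p2, p3 = (np.asarray(p, float) for p in (p1, p2, p3))
    n = np.cross(p2 - p1, p3 - p1)   # three points form a surface
    return n / np.linalg.norm(n)
```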
  • When a polarization line sensor is used, multiple transmits can be programmed to emit from a probe to “scan” the line sensor. Since the line sensor is sensitive to ultrasound that arrives laterally at the sensor, the “scan” will generate signals at the sensor when the transmitted ultrasound is lateral to part of the line, therefore locating the section of the line that is lateral to a specific transmit pattern.
  • the shape and position of the line can be ascertained/estimated from the sectional information.
  • the shape and position of the line sensor can therefore be used to indicate the shape and position of a medical device that integrates the line sensor.
  • Figure 5 shows an example method 500 for transponder tracking and ultrasound image enhancement. This example method 500 will be described with respect to the system shown in Figures 1 and 2; however, another suitable system according to this disclosure may be employed.
  • an ultrasound probe transmits and receives acoustic signals.
  • the ultrasound probe 100 shown in Figure 1 transmits acoustic pulses from an array of transducers into the medium 5, which represents the anatomy of a patient.
  • the probe 100 may transmit these pulses using a variety of known methods or as described above.
  • the probe 100 receives the acoustic signals (e.g., probe 100 receives acoustic signals reflected or scattered from objects and/or features, such as tissue, in the medium 5). For example, echoes might be reflected off of a tumor present in the medium.
  • the probe 100 converts the ultrasound pulses to signals that are then transmitted to the processing system 200.
  • a transponder senses acoustic signals.
  • sensor 20 coupled to needle 10 in Figure 1 also receives ultrasound pulses that were emitted by the probe 100.
  • the sensor 20 converts the ultrasound pulses to signals that are then transmitted to the processing system 200.
  • the processing system 200 determines the location of the transponder based, at least in part, on the signals received from sensor 20.
  • the processing system 200 may utilize triangulation and/or the beamformed transponder signal image method to determine the position of the transponder based on a plurality of signals received from the sensor 20.
  • the processing system 200 generates an ultrasound image.
  • the ultrasound image is generated from acoustic signals received by the probe 100.
  • the ultrasound image may be transmitted to and displayed on the display 300.
  • the processing system 200 overlays the location of the transponder over the ultrasound image.
  • a graphic, such as cross hairs (e.g., “+”) or a circle, is overlaid on the ultrasound image to correspond to a location of the needle tip 14 in the ultrasound image.
  • the transponder is shown on the same display as the ultrasound image, indicating where in the medium 5 the transponder (sensor 20 on needle 10) is located.
  • the image may also display the path and/or projected path.
  • Figure 6 shows an example method 600 for transponder tracking and ultrasound image enhancement. This example method 600 will be described with respect to the system shown in Figures 1 and 2; however, any suitable system according to this disclosure may be employed.
  • an ultrasound probe transmits and receives acoustic signals (e.g., conventional ultrasound).
  • the ultrasound probe 100 shown in Figure 1 transmits acoustic pulses from an array of transducers into the medium 5, which represents the anatomy of a patient.
  • the probe 100 may transmit these pulses using a variety of known methods or as described above.
  • the probe 100 receives the acoustic signals (e.g., probe 100 receives acoustic signals reflected or scattered from objects and/or features, such as tissue, in the medium 5). For example, echoes might be reflected off of a tumor present in the medium.
  • the probe 100 converts the ultrasound pulses to signals that are then transmitted to the processing system 200.
  • a transponder senses acoustic signals.
  • sensor 20 coupled to needle 10 in Figure 1 also receives ultrasound pulses that were emitted by the probe 100.
  • the sensor 20 converts the ultrasound pulses to signals that are then transmitted to the processing system 200.
  • the processing system 200 determines the location of the transponder based, at least in part, on the signals received from sensor 20.
  • the processing system 200 may utilize triangulation and/or beamformed transponder signal image method to determine the position of the transponder based on a plurality of signals received from the sensor 20.
  • acoustic signals are transmitted from a transponder emitter located proximate the distal end of a device towards the probe transducer elements 122.
  • the transponder on the needle 10 shown in Figure 1 transmits acoustic pulses from an array of emitters 24 into the medium 5.
  • the transponder may transmit these pulses using a variety of known methods or as described above.
  • acoustic signals generated from the emitters 24 proximate the distal end of the device are received by the ultrasound probe.
  • probe 100 receives signals generated by emitters 24.
  • the probe 100 then converts the ultrasound pulses to signals that are then transmitted to the processing system 200. These signals may be in addition to the echoes received by the probe 100 as described, for example, in relation to Figure 5.
  • the processing system 200 can also determine the location of the transponder based at least in part on the signals received from the probe 100. For example, the processing system 200 may utilize triangulation to determine the position of the transponder based on a plurality of signals received from the probe 100.
  • the processing system 200 generates an ultrasound image.
  • the ultrasound image is generated from acoustic signals received by the probe 100.
  • the ultrasound image is generated using acoustic signals emitted by the probe 100.
  • the ultrasound image is generated using acoustic signals emitted from the emitters 24.
  • the ultrasound image may be transmitted to and displayed on the display 300.
  • an ultrasound image is generated from acoustic signals transmitted and received by the probe 100, and then the image is modified based on ultrasound pulses received by the probe 100 that were emitted by emitters 24.
  • the processing system may be able to improve the resolution of the ultrasound image, particularly in relation to objects in the medium 5 that are near the transponder.
  • the processing system 200 overlays the location of the transponder over the ultrasound image.
  • a graphic, such as cross hairs (e.g., “+”) or a circle, is overlaid on the ultrasound image to correspond to a location of the tip 14 in the ultrasound image.
  • the transponder is shown on the same display as the ultrasound image, indicating where in the medium 5, the transponder, sensor 20 on needle 10, is located.
  • Figure 7 shows an example method 700 for ultrasound image enhancement with a point sensor (e.g., using a fiber end optical sensor) or a line sensor (e.g., using polarization in an optical fiber or multiple point sensors).
  • the fiber sensor 20 can detect scattered signals and tissue harmonics.
  • an ultrasound probe transmits acoustic pulses.
  • the ultrasound probe 100 shown in Figure 1 transmits acoustic pulses from an array of transducers into the medium 5, which represents the anatomy of a patient.
  • the probe 100 may transmit these pulses using a variety of known methods and/or as described above.
  • the point sensor or line sensor senses the direct acoustic signals (e.g., from a probe 100), acoustic signals reflected and/or scattered from objects and/or features such as tissue in the medium 5, and/or tissue harmonics.
  • echoes might be reflected off a tumor present in the medium 5 in Figure 1.
  • the point type sensor will receive scattering from any direction or axially as shown in Figure 4A while the line sensor will receive scattering from orthogonal or transverse directions as shown in Figures 4B and 4C.
  • the transponder signal can be used together with signals received by elements in the probe (e.g., probe 100 in Figure 1) for beamforming of ultrasound images, harmonics etc.
  • Transponder sensors can be useful for harmonic imaging of surroundings because transponders are very close to an imaging area of interest, and a harmonic signal is usually weak or unable to propagate very far. Tissue can scatter acoustic signals and/or generate tissue harmonics.
  • the fiber sensor can detect direct signals (e.g., from a probe), scattered signals, and/or tissue harmonics.
  • the ultrasound probe senses acoustic signals.
  • Signals (e.g., electrical and/or optical signals from the transducer and/or the probe) may be transmitted to a processing system (e.g., system 200 in Figure 1).
  • a point-like sensor (e.g., sensor 20 in Figure 1 or sensor 20a in Figure 4A) may be used to track a position of a device (e.g., as described in conjunction with Figure 5).
  • the processing system 200 generates an ultrasound image.
  • the ultrasound image is generated from acoustic signals received by the probe 100 in Figure 1.
  • the processing system 200 enhances the ultrasound image to generate an enhanced ultrasound image.
  • the processing system 200 uses data from the fiber sensor 20 to enhance the ultrasound image. This data includes the direct signals and scattered signals.
  • the enhanced ultrasound image may be transmitted to and displayed on the display 300.
  • the data from the fiber sensor may also be used to create a separate image of the insonified region surrounding the sensor that is then transmitted to and displayed on display 300.
  • a method comprises receiving, by an optical sensor coupled with a medical device, a plurality of acoustic beamforming signals, each acoustic beamforming signal corresponding to one of a plurality of acoustic beamforming pulses emitted from an ultrasound transducer array; and ascertaining, by a processor, a location of the optical sensor based on one or more of the plurality of acoustic beamforming signals received by the optical sensor.
  • the method comprises generating an ultrasound image based on acoustic signals detected by an ultrasound receiver array and the plurality of acoustic beamforming signals received by the optical sensor; generating an ultrasound image based on the plurality of acoustic beamforming signals received by the optical sensor; real-time generation of the location of the optical sensor during an ultrasound procedure; tracking a path of the optical sensor based on a history of locations ascertained of the optical sensor based on the plurality of acoustic beamforming signals received by the optical sensor; displaying the path of the optical sensor during an ultrasound-guided procedure; projecting a path of the optical sensor during an ultrasound-guided procedure based on a history of locations ascertained of the optical sensor based on the plurality of acoustic beamforming signals received by the optical sensor; and/or displaying the projected path of the optical sensor during the ultrasound-guided procedure.
  • the optical sensor comprises a line sensor, a point sensor, or both a line sensor and a point sensor; ascertaining the location of the optical sensor comprises triangulating the location of the optical sensor; ascertaining the location of the optical sensor comprises coherent image forming; one or more sensors are coupled to the medical device to enable real-time generation of a shape or orientation of the medical device during an ultrasound procedure; the optical sensor is one of a plurality of optical sensors coupled with the medical device; and/or the method comprises calculating an orientation of the medical device based on ascertained locations of the plurality of optical sensors.
  • a system comprises an optical sensor coupled with a medical device and configured to receive a plurality of acoustic beamforming signals corresponding to a plurality of acoustic beamforming pulses emitted from an ultrasound array; and a processor configured to ascertain a location of the optical sensor based on at least some of the plurality of acoustic beamforming signals received by the optical sensor.
  • the optical sensor is configured to receive a plurality of acoustic signals from a surrounding insonified region; the processor is configured to create an ultrasound image of at least a portion of the surrounding insonified region adjacent the medical device based on at least some of the plurality of acoustic signals from the surrounding insonified region received by the optical sensor; the optical sensor comprises a fiber optical sensor; the optical sensor is configured to optically sense a deformation of a material of the optical sensor caused by the acoustic beamforming signals incident on the optical sensor; and/or the optical sensor is configured to detect a polarization change in light guided in the optical sensor as the acoustic beamforming signals are incident on the optical sensor.
  • a system comprises an optical sensor coupled with a medical device and configured to receive a plurality of acoustic beamforming signals corresponding to a plurality of acoustic beamforming pulses emitted from an ultrasound array, and a plurality of acoustic signals from a surrounding insonified region; and a processor configured to ascertain a location of the optical sensor based on at least some of the plurality of acoustic beamforming signals received by the optical sensor, and create an ultrasound image of at least a portion of the surrounding insonified region adjacent the medical device based on at least some of the plurality of acoustic signals from the surrounding insonified region received by the optical sensor.
  • the processor is configured to present the location of the optical sensor and the ultrasound image in real time; the ultrasound image of at least the portion of the surrounding insonified region is combined with an image generated by the ultrasound array; and/or the optical sensor comprises a fiber sensor.
  • a system comprises an optical sensor coupled with a needle and configured to receive a plurality of acoustic signals from a surrounding insonified region; and a processor configured to generate an image of at least a portion of the surrounding insonified region adjacent the needle based on at least some of the plurality of acoustic signals from the surrounding insonified region received by the optical sensor.
  • the optical sensor is configured to receive a plurality of acoustic beamforming signals corresponding to a plurality of acoustic beamforming pulses emitted from an ultrasound array; the processor is configured to ascertain the location of the optical sensor based on at least some of the plurality of acoustic beamforming signals received by the optical sensor; the optical sensor is coupled with the needle at a distal portion of the needle; the optical sensor is arranged on the needle for a diagnostic or therapeutic procedure; the image of at least a portion of the surrounding insonified region is generated in real time; the optical sensor is arranged to detect a change in polarization of light in response to the plurality of acoustic signals; the optical sensor is configured to optically sense a deformation of a material of the optical sensor caused by the acoustic beamforming signals incident on the optical sensor; and/or the optical sensor is arranged to amplify light matter interactions.
  • a device may include a processor or processors.
  • the processor comprises a computer-readable medium, such as a random access memory (RAM) coupled to the processor.
  • the processor executes computer-executable program instructions stored in memory, such as executing one or more computer programs.
  • Such processors may comprise a microprocessor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), field programmable gate arrays (FPGAs), and state machines.
  • Such processors may further comprise programmable electronic devices such as PLCs, programmable interrupt controllers (PICs), programmable logic devices (PLDs), programmable read-only memories (PROMs), electronically programmable read-only memories (EPROMs or EEPROMs), or other similar devices.
  • Such processors may comprise, or may be in communication with, media, for example one or more non-transitory computer-readable media, that may store processor-executable instructions that, when executed by the processor, can cause the processor to perform methods according to this disclosure as carried out, or assisted, by a processor.
  • a non-transitory computer-readable medium may include, but is not limited to, an electronic, optical, magnetic, or other storage device capable of providing a processor, such as the processor in a web server, with processor-executable instructions.
  • non-transitory computer-readable media include, but are not limited to, a floppy disk, CD-ROM, magnetic disk, memory chip, ROM, RAM, ASIC, configured processor, optical media, magnetic tape or other magnetic media, or any other medium from which a computer processor can read.
  • the processor, and the processing, described may be in one or more structures, and may be dispersed through one or more structures.
  • the processor may comprise code to carry out methods (or parts of methods) according to this disclosure.
  • A or B or C includes any or all of the following alternative combinations as appropriate for a particular usage: A alone; B alone; C alone; A and B only; A and C only; B and C only; and A and B and C.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Surgery (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Pathology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)

Abstract

A transponder is used to track a position of a distal end of a medical device in an ultrasound image and/or to enhance an ultrasound image.
PCT/US2024/035062 2023-06-23 2024-06-21 Transponder tracking and ultrasound image enhancement Pending WO2024263947A1 (fr)

Applications Claiming Priority (12)

Application Number Priority Date Filing Date Title
US202363510079P 2023-06-23 2023-06-23
US202363522994P 2023-06-23 2023-06-23
US202363522793P 2023-06-23 2023-06-23
US63/522,793 2023-06-23
US63/522,994 2023-06-23
US63/510,079 2023-06-23
US18/492,593 US12025489B1 (en) 2023-06-23 2023-10-23 Fiber-optical sensor system for ultrasound sensing and imaging
US18/382,984 US20240423482A1 (en) 2023-06-23 2023-10-23 Transponder tracking and ultrasound image enhancement
US18/382,984 2023-10-23
US18/492,593 2023-10-23
US18/749,712 US20240426650A1 (en) 2023-06-23 2024-06-21 Optical fiber with an acoustically sensitive fiber bragg grating and ultrasound sensor including the same
US18/749,712 2024-06-21

Publications (1)

Publication Number Publication Date
WO2024263947A1 (fr)

Family

ID=91960451

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2024/035062 Pending WO2024263947A1 (fr) Transponder tracking and ultrasound image enhancement

Country Status (1)

Country Link
WO (1) WO2024263947A1 (fr)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210401398A1 (en) * 2016-12-05 2021-12-30 Fujifilm Sonosite, Inc. Method and apparatus for visualizing a medical instrument under ultrasound guidance
WO2023060235A1 (fr) * 2021-10-08 2023-04-13 Deepsight Technology, Inc. Visualisation de balise ultrasonore avec des capteurs optiques

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
MATHEWS SUNISH J ET AL: "Ultrasonic Needle Tracking with Dynamic Electronic Focusing", ULTRASOUND IN MEDICINE AND BIOLOGY, NEW YORK, NY, US, vol. 48, no. 3, 30 December 2021 (2021-12-30), pages 520 - 529, XP086950504, ISSN: 0301-5629, [retrieved on 20211230], DOI: 10.1016/J.ULTRASMEDBIO.2021.11.008 *

Similar Documents

Publication Publication Date Title
EP2996606B1 System for determining the location of a medical instrument with respect to ultrasound imaging, and medical instrument to facilitate such determination
CN105899143B Combined ultrasound navigation/tissue characterization
US7068867B2 (en) Ultrasonic position indicator
JP3772002B2 Intra-subject tomographic imaging apparatus
US7999945B2 (en) Optical coherence tomography / acoustic radiation force imaging probe
US9636083B2 (en) High quality closed-loop ultrasound imaging system
US9486143B2 (en) Intravascular forward imaging device
EP3013245B1 Shape injection into an ultrasound image to calibrate beam patterns in real time
Xia et al. In‐plane ultrasonic needle tracking using a fiber‐optic hydrophone
US20110066073A1 (en) Biopsy device with acoustic element
US11123141B2 (en) Systems and methods for navigating a catheter and delivering a needle
CN116058873A Interoperation-optimized function through Doppler-based and image-based vessel differentiation
JP2020506005A Path tracking in an ultrasound system for device tracking
US20130204138A1 (en) Steerable catheter navigation with the use of interference ultrasonography
EP3890615B1 Endobronchial catheter system for rapid diagnosis of lung disease
JP2015503392A Systems and methods for needle navigation using the PA effect in US imaging
JP6732054B2 Photoacoustic image generation apparatus
US20240423482A1 (en) Transponder tracking and ultrasound image enhancement
WO2024263947A1 Transponder tracking and ultrasound image enhancement
CN118355293A Ultrasound beacon visualization with optical sensors
EP4340731B1 Surface-depth imaging device for registering ultrasound images to one another and to surface images using surface information
US20240299009A1 (en) Spectroscopic photoacoustic imaging probe
JP2004141346A Puncture difficulty evaluation apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24745828

Country of ref document: EP

Kind code of ref document: A1