
WO2025174993A1 - Systems and methods for improved ultrasound-guided medical imaging - Google Patents

Systems and methods for improved ultrasound-guided medical imaging

Info

Publication number
WO2025174993A1
Authority
WO
WIPO (PCT)
Prior art keywords
vertebrae
stage
transducer
rail
link
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/US2025/015770
Other languages
English (en)
Inventor
Emad Boctor
Peter Kazanzides
Baichuan JIANG
Keshuai XU
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Johns Hopkins University
Original Assignee
Johns Hopkins University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Johns Hopkins University filed Critical Johns Hopkins University
Publication of WO2025174993A1 publication Critical patent/WO2025174993A1/fr
Pending legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/08 Clinical applications
    • A61B8/0875 Clinical applications for diagnosis of bone
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/08 Clinical applications
    • A61B8/0833 Clinical applications involving detecting or locating foreign bodies or organic structures
    • A61B8/0841 Clinical applications involving detecting or locating foreign bodies or organic structures for locating instruments
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/42 Details of probe positioning or probe attachment to the patient
    • A61B8/4209 Details of probe positioning or probe attachment to the patient by using holders, e.g. positioning frames
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/48 Diagnostic techniques
    • A61B8/483 Diagnostic techniques involving the acquisition of a 3D volume of data

Definitions

  • This disclosure relates to systems and methods for ultrasound-guided medical imaging, such as an ultrasound-guided imaging system for spine intervention, and an ultrasound-guided imaging system for lumbar puncture guidance.
  • the image reconstruction module can be further configured to process the vertebrae imaging data based, at least in part, on an estimated vertebrae surface map.
  • embodiments consistent with the present disclosure can include a non-transitory computer-readable medium storing instructions that when executed by a module for image reconstruction cause the module to perform a method for processing vertebrae imaging data based, at least in part, on vertebrae imaging data acquired from redundant insonification angles using a plurality of beams and based on an estimated vertebrae surface.
  • the method can include: acquiring the vertebrae imaging data for processing; assigning at least a first weight to a first contribution from a first beam of the plurality of beams to a reconstructed voxel based on a first incidence angle from the first beam to the estimated vertebrae surface; and assigning at least a second weight to a second contribution from a second beam of the plurality of beams to the reconstructed voxel based on a second incidence angle from the second beam to the estimated vertebrae surface.
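The per-beam weighting described above can be sketched as follows. This is a minimal illustration with hypothetical helper names, assuming a simple cosine weighting of each beam's incidence angle against the estimated surface normal; the disclosure's actual reconstruction also folds in a bone-probability weight, as described later in this document.

```python
import numpy as np

def incidence_weight(beam_dir, surface_normal):
    """Weight a beam's contribution by its incidence angle to the
    estimated bone surface: 1.0 for perpendicular incidence, smaller
    for oblique beams (illustrative cosine weighting)."""
    b = beam_dir / np.linalg.norm(beam_dir)
    n = surface_normal / np.linalg.norm(surface_normal)
    return abs(float(np.dot(b, n)))

def fuse_contributions(intensities, beam_dirs, surface_normal):
    """Combine redundant insonifications of one voxel as a weighted
    average of the per-beam intensities (hypothetical fusion rule)."""
    w = np.array([incidence_weight(d, surface_normal) for d in beam_dirs])
    return float(np.sum(w * intensities) / np.sum(w))

# Example: a perpendicular beam dominates an oblique one.
normal = np.array([0.0, 0.0, 1.0])
beams = [np.array([0.0, 0.0, -1.0]),                # head-on: weight 1.0
         np.array([0.0, 1.0, -1.0]) / np.sqrt(2)]   # 45 degrees: weight ~0.707
voxel = fuse_contributions(np.array([0.9, 0.4]), beams, normal)
```

The fused value lands between the two intensities but closer to the perpendicular beam's contribution, which is the intended behavior of the weighting.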
  • the vertebrae imaging data can be generated by a wearable ultrasound device comprising a transducer, the wearable ultrasound device configured to generate the vertebrae imaging data when situated to a side of the vertebrae; and, in a further aspect, the generated vertebrae imaging data can be acquired based on translational scans of the imaged vertebrae by said transducer and based on rotational scans of the imaged vertebrae by said transducer.
  • the estimated vertebrae surface map can be generated by a deep-learning bone surface estimator.
  • FIGS. 5A and 5B depict further aspects of the mechanism of FIG. 4A;
  • FIGS. 6A and 6B depict a device utilizing the mechanism of FIGS. 4A, 5A, and 5B consistent with this disclosure, where FIG. 6B is an exploded view of the device of FIG. 6A;
  • FIG. 9 depicts an embodiment of a device consistent with this disclosure used for a medical procedure;
  • FIG. 10 depicts an image of a vertebra obtained from devices and methods consistent with this disclosure;
  • FIG. 11 depicts an image reconstruction algorithm consistent with this disclosure; and
  • FIGS. 12A-12C depict the results of a reconstruction method consistent with this disclosure.
  • Redundant ultrasound beams refers to ultrasound beams generated within a workspace of a motorized phased array probe consistent with this disclosure such that one point on a surface of a vertebra can be hit by ultrasound beams multiple times from different kinematic configurations.
  • Methods and systems consistent with this disclosure can assign weights to each beam’s contribution to the same reconstructed voxel during the reconstruction process based on its incidence angle to the estimated bone surface.
  • FIG. 2 illustrates, in schematic form, a configuration involving a bone surface 250 and an ultrasound transducer 200 (with the elevational direction indicated by arrow 250).
  • Beam 230 is also shown in FIG. 2, where the narrowing of beam 230 about the focal point (where the beam has narrowed according to its elevational profile) is shown.
  • Angle 255 is the angle between the center of beam 230 and the normal to the bone surface (arrow 265).
  • Band 270 is associated with the displayed response of the bone surface 250, where the actual bone surface (line 260) is merged within the response 270.
  • FIG. 2 illustrates early and late echo artifacts and their relation to the beam angle of incidence.
  • a wearable 3D ultrasound device capable of imaging the vertebra from multiple insonification angles to improve the 3D bone surface visualization for interventional guidance.
  • FIG. 3 depicts the general functionality of a phased array transducer 300 consistent with this disclosure. Also shown in FIG. 3 is a portion of a spine 395 to be imaged, and the general scope of a beam 330 available to transducer 300 in a fixed position with respect to spine 395. Consistent with this disclosure, transducer 300 (and beam 330) can be configured to exhibit rotational motion (indicated by arrow 355) and translational motion (indicated by arrow 365). Accordingly, phased array transducer 300 is configured to exhibit two degrees-of-freedom ("2-DOF") in motion. Components of a 2-DOF wobbler probe 490, consistent with this disclosure, are depicted in FIG. 4A.
  • Transducer 300 can be attached to a link that spans two parallel motor-driven linear stages.
  • the link can be connected to a first linear stage (485) with a revolute joint and to the second linear stage (475) with a pin slot joint.
  • a common motion of the two linear stages (475 and 485) along the direction of the linear rails 481 and 471 can cause the transducer 300 (and therefore the beam 330) to translate along the direction parallel to that of the linear rails 481 and 471.
  • the kinematics is equivalent to a prismatic joint followed by a rotation joint, as depicted in FIG. 4B.
  • the linear stages 475 and 485 can have a 50 mm range of motion.
  • the device can be positioned to image a vertebrae from a side of the vertebrae, without obstructing a space along a midline of the vertebrae for access by a medical device. Accordingly, in the central 26 mm of the linear motion, the pin slot joint rotates the transducer link up to 45 degrees in either direction. The maximum rotation decays toward the limits of the linear motion.
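The equivalence between the two-stage mechanism and a prismatic-plus-revolute chain can be sketched numerically. The function below is an illustrative simplification under assumed geometry (the `link_offset` parameter and the mapping itself are hypothetical, not the patent's exact kinematics): common stage motion yields pure translation, while differential motion tilts the link about the revolute joint.

```python
import math

def transducer_pose(q1, q2, link_offset):
    """Map the two linear-stage positions (q1: revolute-joint stage,
    q2: pin-slot-joint stage) to an equivalent prismatic translation
    and revolute rotation of the transducer link.

    link_offset is the assumed fixed separation between the two rails.
    """
    translation = q1                       # prismatic joint follows stage 1
    # Differential stage motion tilts the link about the revolute joint.
    rotation = math.atan2(q2 - q1, link_offset)
    return translation, rotation

# Common motion of both stages -> pure translation, no rotation.
t1, r1 = transducer_pose(10.0, 10.0, link_offset=20.0)
# Differential motion -> rotation about the revolute joint.
t2, r2 = transducer_pose(10.0, 30.0, link_offset=20.0)
```

With equal stage positions the rotation is zero; a 20 mm stage difference over a 20 mm offset gives a 45 degree tilt, matching the intuition that larger differential motion produces larger beam rotation.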
  • the transducer 300 can be tilted 15 degrees toward the midline to further enlarge the useful workspace.
  • acoustic coupling gel pad 441 which can be positioned in the gap between transducer 300 and a patient (not shown).
  • gel pad 441 can be a wedge-shaped elastic gel pad (Gelatin #4, Humimic Medical) to maintain acoustic coupling under the conical transducer surface trajectory.
  • transducer 300 can be an ATL P7-4 transducer. Consistent with this disclosure, transducer 300 can be modified to reduce its axial length by replacing micro-coax cables with flat-flex cables to allow the transducer 300 to freely translate and rotate inside the device housing.
  • FIGS. 5A and 5B illustrate aspects of the motion discussed above.
  • transducer 300 can be attached to linear stages 475 and 485, each of which can be configured to move along rails 471 and 481 (respectively).
  • linear stages 475 and 485 maintain relative separation during translational movement (in the direction of arrow 365 along the linear rails 471 and 481, respectively)
  • beam 330 can translationally "sweep" the region below the device (and/or to the side of the device) in the direction of arrow 365.
  • the differential movement in the linear stages can cause the transducer 300 (and therefore the beam 330) to rotate about arrow 355.
  • Further aspects of the device 690, utilizing the mechanism of FIGS. 4A-5B, are depicted in FIGS. 6A and 6B.
  • FIG. 6B provides an exploded view of the device shown in FIG. 6A, which includes the following components: transducer 300, linear stage 475, linear rail 471, linear stage 485, linear rail 481, motors 665 and 675, electronics 655, acoustic coupling gel pad 641, and enclosure 642.
  • FIG. 7 is another view of FIG. 3, but with device 690 producing beam 330 according to the configuration and mechanism discussed above.
  • FIG. 8 is another view of device 690 within the enclosure 642, with arrows 355 and 365 depicting the 2-DOF available.
  • FIG. 9 depicts device 690 attached to patient 900 (allowing the ultrasound device 690 to operate in a "hands-free" mode), where the vertebrae 995 of patient 900 are being imaged using device 690 and output to display device 982.
  • the display 982 is generated using device 690, which can be positioned to the side of the vertebrae 995, so as to permit the person operating the medical device 998 to operate the device 998 directly over the vertebrae without obstruction from the ultrasound device 690 or an additional ultrasound operator.
  • FIG. 10 provides a 3D visualization (point cloud, MIP) (image 1096) of the lumbar spine in an agar phantom. Each graticule is 1 cm. The axis markers indicate the poses of the transducer when the data was recorded. The top-left inset (1095) shows the geometry of the phantom viewed from a similar direction. The bounding box 1091 indicates the visualized area.
  • a pre-procedure volume can be acquired in a shuttling motion.
  • the transducer 300 can move back and forth along the linear axis. At the end of each linear motion segment, the transducer 300 can rotate a small increment. Consistent with an embodiment, one can use a linear velocity of 2 mm/s and increment by 5 degrees. Further still, B-mode images can be acquired on an ultrasound machine (Ultrasonix SonixTablet) at 30 fps.
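The shuttling acquisition pattern can be sketched as a simple trajectory generator. This is a hypothetical sampling of the motion described above, using the 50 mm stage range, 2 mm/s velocity, 5 degree increments, and 45 degree rotation limit mentioned elsewhere in this disclosure; it deliberately ignores the decay of the maximum rotation near the ends of the linear range.

```python
def shuttle_trajectory(length_mm=50.0, velocity_mm_s=2.0,
                       angle_step_deg=5.0, max_angle_deg=45.0):
    """Generate (time_s, position_mm, angle_deg) waypoints for the
    shuttling pre-procedure scan: sweep the full linear range at
    constant velocity, rotate by angle_step_deg at each end, and
    sweep back, covering -max_angle_deg .. +max_angle_deg."""
    samples, t, angle, direction = [], 0.0, -max_angle_deg, 1
    while angle <= max_angle_deg:
        start = 0.0 if direction > 0 else length_mm
        end = length_mm if direction > 0 else 0.0
        duration = length_mm / velocity_mm_s  # constant-velocity sweep
        samples.append((t, start, angle))
        t += duration
        samples.append((t, end, angle))
        angle += angle_step_deg
        direction *= -1
    return samples

traj = shuttle_trajectory()
```

With the defaults this produces 19 sweeps (angles from -45 to +45 degrees in 5 degree steps) of 25 s each, i.e. 38 waypoints over 475 s.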
  • a scalar volume map that represents the probability of a voxel being located on the bone surface.
  • this volume map can be the output of a deep learning-based bone surface estimator.
  • a 3D Gaussian filter with a 10-voxel standard deviation can be applied to simulate the uncertainty in estimation, and the intensities can then be max-normalized between 0 and 1 to represent the bone probability.
  • with the 3D surface probability map, one can estimate the surface normal orientation by computing the 3D gradient of the surface probability.
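The gradient-based normal estimate can be sketched directly with NumPy. This is a minimal illustration assuming a finite-difference gradient of the probability volume; the function name and toy volume are hypothetical.

```python
import numpy as np

def surface_gradient(prob):
    """Estimate bone-surface normal directions as the normalized 3D
    gradient of the surface probability volume (Sx x Sy x Sz)."""
    gx, gy, gz = np.gradient(prob.astype(float))
    g = np.stack([gx, gy, gz], axis=-1)           # Sx x Sy x Sz x 3
    norm = np.linalg.norm(g, axis=-1, keepdims=True)
    # Avoid division by zero in flat regions of the probability map.
    return np.divide(g, norm, out=np.zeros_like(g), where=norm > 0)

# Toy volume: probability rising along z, so normals point along +z.
prob = np.tile(np.linspace(0.0, 1.0, 8), (8, 8, 1))
normals = surface_gradient(prob)
```

The output has the Sx × Sy × Sz × 3 shape of the bone surface gradient map defined in the algorithm's variable list.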
  • the backbone of the volume reconstruction algorithm consistent with this disclosure is the pixel-based method (PBM) as introduced in Solberg et al., "Freehand 3D ultrasound reconstruction algorithms — a review," Ultrasound in Medicine and Biology, 33(7), pp. 991-1009 (2007), which contains a distribution step (DS) followed by a hole-filling step (HFS).
  • DS: distribution step
  • HFS: hole-filling step
  • in the DS, one can align the ultrasound image to the global reference coordinates based on its tracked transformation.
  • one can apply the pixel nearest neighbor as the distribution method, such that the pixel intensity is assigned to the nearest voxel in the reconstruction volume.
  • empty voxels will also be filled with nearest-neighbor values up to a limit distance, such that if the gap is larger than the threshold limit, one can determine that the space was never visited by the ultrasound scan.
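The DS and HFS described above can be sketched as follows. This is an illustrative implementation under stated assumptions: pixel positions are taken to be already transformed into volume coordinates, and SciPy's Euclidean distance transform stands in for whatever nearest-neighbor search an actual implementation would use.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def distribution_step(volume, filled, pixels, positions):
    """DS: pixel-nearest-neighbor distribution. Each tracked pixel
    intensity is assigned to the nearest voxel of the volume."""
    for p, pos in zip(pixels, positions):
        idx = tuple(np.round(pos).astype(int))
        volume[idx] = p
        filled[idx] = True

def hole_filling_step(volume, filled, limit=2):
    """HFS: fill empty voxels from the nearest filled voxel, but only
    within `limit` voxels; larger gaps are treated as never scanned."""
    dist, (ix, iy, iz) = distance_transform_edt(~filled,
                                                return_indices=True)
    out = volume.copy()
    fillable = (~filled) & (dist <= limit)
    out[fillable] = volume[ix[fillable], iy[fillable], iz[fillable]]
    return out

# Toy example: one pixel distributed, then holes filled within 2 voxels.
vol = np.zeros((1, 1, 7))
filled = np.zeros((1, 1, 7), dtype=bool)
distribution_step(vol, filled, [5.0], [np.array([0.0, 0.0, 0.0])])
out = hole_filling_step(vol, filled, limit=2)
```

Voxels within the limit distance inherit the scanned intensity, while the far end of the gap stays empty, reflecting the "never visited" determination.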
  • T: Recorded transformation matrices for the tracked ultrasound images.
  • Target reconstruction volume of size Sx × Sy × Sz.
  • Estimated bone probability map (a scalar volume) of size Sx × Sy × Sz.
  • Bone surface gradient map (a vector volume) of size Sx × Sy × Sz × 3.
  • Vcount: Beam visiting score volume (a scalar volume) counting how much good insonification has been received at each voxel.
  • pair: The ultrasound beam direction map (volume size Sx × Sy × Sz), which can be pre-computed when using a fixed imaging depth. For each pixel, the direction of the beam going through that pixel can be computed.
  • pdata: The scalar intensity value of a pixel to fill in the nearest voxel.
  • vtemp: The temporary value to be assigned to the reconstructed voxel.
  • α: The empirical enhancement factor for controlling the level of enhancement (saturation) of the bone response (chosen as 0.1).
  • β: The empirical incidence-angle cosine value threshold for applying energy compensation.
  • boolext: The Boolean variable controlling whether energy compensation is active.
  • in step 4, we first find the correspondence between the image pixel index and the nearest reconstructed-volume voxel index for distributing the intensity value.
  • in step 7, we compute the angle weight Wangle, which is the dot product between the transformed US scanline beam direction and the surface gradient. Intuitively, if the scanline is perpendicular to the surface, it is aligned with the surface gradient and the dot product is 1; otherwise, it is less than 1.
  • in step 14, we assign the bone weight Wbone as the probability of the voxel being located on the bone surface.
  • in step 15, we compute a temporary voxel value by multiplying the original pixel intensity I with both the angle weight and the bone weight as well as an enhancement factor α, and then we add this value to the original pixel intensity. This enhances bone surfaces with high bone weight and high angle weight, while leaving voxels not on bone surfaces unchanged.
  • in step 19, we use a running-average scheme so that the current voxel value is updated based on how often this voxel has already been visited by good input frames.
  • in step 20, we update the visiting count not by simply adding 1 each time, but by using the bone weight and angle weight, so that frames with higher quality have a much larger impact on the reconstructed voxel intensity.
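Steps 7 through 20 can be sketched for a single voxel as follows. This is an illustrative condensation under the variable names defined above; it omits the energy-compensation branch (the β threshold and boolext flag) described next, and the exact step numbering and update order in the actual algorithm may differ.

```python
import numpy as np

def update_voxel(v_old, count_old, pixel_intensity, beam_dir, grad,
                 bone_prob, alpha=0.1):
    """One weighted running-average update for a reconstructed voxel.

    Wangle: dot product of the beam direction with the surface gradient
            (1.0 when the scanline is perpendicular to the bone surface).
    Wbone:  probability that the voxel lies on the bone surface.
    """
    w_angle = abs(float(np.dot(beam_dir, grad)))            # step 7
    w_bone = bone_prob                                       # step 14
    # Step 15: enhance the intensity by alpha * Wangle * Wbone.
    v_temp = pixel_intensity * (1.0 + alpha * w_angle * w_bone)
    w = w_angle * w_bone
    count_new = count_old + w                                # step 20
    if count_new > 0:
        # Step 19: running average weighted by the visiting score.
        v_new = (v_old * count_old + v_temp * w) / count_new
    else:
        v_new = v_old  # no useful insonification yet
    return v_new, count_new

# A perpendicular, on-bone beam (Wangle = Wbone = 1) hitting a fresh voxel.
v, c = update_voxel(0.0, 0.0, 100.0,
                    beam_dir=np.array([0.0, 0.0, 1.0]),
                    grad=np.array([0.0, 0.0, 1.0]),
                    bone_prob=1.0)
```

For this ideal frame the voxel takes the enhanced value 100 × (1 + 0.1) = 110 and the visiting count grows by the full weight of 1; an oblique or off-bone frame would contribute a smaller weight and barely move the average.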
  • the disclosed algorithm enhances the bone surface voxels with good beam angles.
  • an energy compensation framework is added so that surface voxels reached by sub-optimal beams can be compensated.
  • FIGS. 12A-12C provide a qualitative comparison between different reconstruction methods using cross-section views of reconstructed volumes.
  • FIG. 12A depicts the results of a baseline method that averages all contributing beams for each reconstructed voxel.
  • FIG. 12B depicts the algorithm of FIG. 11, with only bone surface enhancement.
  • FIG. 12C depicts the algorithm of FIG. 11 with non-optimal beam energy compensation.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • Rheumatology (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

Systems and methods for imaging vertebrae are disclosed. The systems and methods can comprise a wearable ultrasound device provided with a transducer, the wearable ultrasound device being configured to generate vertebrae imaging data when placed to a side of the vertebrae, the generated vertebrae imaging data being acquired based on translational scans of the imaged vertebrae by the transducer and based on rotational scans of the imaged vertebrae by the transducer. In another aspect, the systems and methods can comprise an image reconstruction module configured to process the vertebrae imaging data based, at least in part, on vertebrae imaging data acquired from redundant insonification angles.
PCT/US2025/015770 2024-02-14 2025-02-13 Systems and methods for improved ultrasound-guided medical imaging Pending WO2025174993A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202463553479P 2024-02-14 2024-02-14
US63/553,479 2024-02-14

Publications (1)

Publication Number Publication Date
WO2025174993A1 true WO2025174993A1 (fr) 2025-08-21

Family

ID=96773505

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2025/015770 Pending WO2025174993A1 (fr) 2024-02-14 2025-02-13 Systems and methods for improved ultrasound-guided medical imaging

Country Status (1)

Country Link
WO (1) WO2025174993A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011089160A1 (fr) * 2010-01-19 2011-07-28 Otto-von-Guericke-Universität Magdeburg Device for determining the spacing of vertebrae of the spinal column
US20200253554A1 (en) * 2019-02-13 2020-08-13 DePuy Synthes Products, Inc. Noninvasive spinal tracking
US11253729B2 (en) * 2016-03-11 2022-02-22 Sorbonne Universite External ultrasound generating treating device for spinal cord and/or spinal nerve treatment, apparatus comprising such device and method
US11341634B2 (en) * 2017-07-18 2022-05-24 Koninklijke Philips N.V. Fetal ultrasound image processing
US20220160332A1 (en) * 2020-03-09 2022-05-26 Christopher Schlenger Apparatus and method for automatic ultrasound segmentation for visualization and measurement


Similar Documents

Publication Publication Date Title
Mohamed et al. A survey on 3D ultrasound reconstruction techniques
Huang et al. A review on real‐time 3D ultrasound imaging technology
US10426345B2 (en) System for generating composite images for endoscopic surgery of moving and deformable anatomy
EP3003161B1 (fr) Procédé d'acquisition en 3d d'images ultrasonores
EP2701607B1 (fr) Reconstruction de l'image d'une surface osseuse par ultrasons
CN100522066C (zh) 超声波诊断装置和图像处理方法
US20210059762A1 (en) Motion compensation platform for image guided percutaneous access to bodily organs and structures
US10130328B2 (en) Method and apparatus for ultrasound image acquisition
Chen et al. Reconstruction of freehand 3D ultrasound based on kernel regression
JP2013505778A (ja) 運動情報を用いた医用画像解析のためのコンピュータ可読媒体、システム、および方法
Poon et al. Three-dimensional extended field-of-view ultrasound
US20220151496A1 (en) Device and method for analyzing optoacoustic data, optoacoustic system and computer program
Welch et al. A real-time freehand 3D ultrasound system for image-guided surgery
Huang et al. 2.5-D extended field-of-view ultrasound
Zimmer et al. Multi-view image reconstruction: Application to fetal ultrasound compounding
Xu et al. AutoInFocus, a new paradigm for ultrasound-guided spine intervention: a multi-platform validation study
Zhao et al. Endobronchial ultrasound image simulation for image-guided bronchoscopy
JP2011156286A (ja) 超音波診断装置及び超音波画像表示プログラム
WO2025174993A1 (fr) Systems and methods for improved ultrasound-guided medical imaging
US12357274B2 (en) Systems and methods for acquiring ultrasonic data
Jiang et al. Insonification Angle-based Ultrasound Volume Reconstruction for Spine Intervention
CN108289655B (zh) 具有中轴弯曲和横向偏心的心脏的超声心脏评估
Deshmukh et al. Five-dimensional ultrasound system for soft tissue visualization
Khamene et al. Local 3D reconstruction and augmented reality visualization of free-hand ultrasound for needle biopsy procedures
Ji et al. Coregistered volumetric true 3D ultrasonography in image-guided neurosurgery

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 25755681

Country of ref document: EP

Kind code of ref document: A1