
WO2025051557A1 - Systems and methods for ultrasound blind sweep interpretation and fidelity assessment - Google Patents

Systems and methods for ultrasound blind sweep interpretation and fidelity assessment

Info

Publication number
WO2025051557A1
WO2025051557A1 (application PCT/EP2024/073634; EP2024073634W)
Authority
WO
WIPO (PCT)
Prior art keywords
image
ultrasound
images
probe
signal processor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/EP2024/073634
Other languages
English (en)
Inventor
Leili SALEHI
Shyam Bharat
Sean Flannery
Leila KALANTARI
Jonathan Thomas SUTTON
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Koninklijke Philips NV
Original Assignee
Koninklijke Philips NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips NV filed Critical Koninklijke Philips NV
Publication of WO2025051557A1
Pending legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/48 Diagnostic techniques
    • A61B8/483 Diagnostic techniques involving the acquisition of a 3D volume of data
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/42 Details of probe positioning or probe attachment to the patient
    • A61B8/4245 Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/42 Details of probe positioning or probe attachment to the patient
    • A61B8/4245 Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient
    • A61B8/4254 Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient, using sensors mounted on the probe
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/46 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B8/461 Displaying means of special interest
    • A61B8/466 Displaying means of special interest adapted to display 3D data
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/5207 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of raw data to produce diagnostic data, e.g. for generating an image
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/54 Control of the diagnostic device

Definitions

  • the present disclosure generally relates to medical diagnostic ultrasound systems. More specifically, the present disclosure is directed to systems and methods for guiding ultrasound imaging procedures using information detected from the ultrasound sweeps themselves.
  • Ultrasound diagnostic systems have become increasingly portable in recent years. In the early 2000s, system miniaturization started with desktop units, and laptop-based configurations appeared thereafter. In recent years, there has been an increasing need for medical ultrasound imaging equipment that is portable, to allow medical personnel to easily transport and deploy the equipment to, from, or within healthcare facilities and/or field locations, and more user-friendly, to accommodate medical personnel who may possess a range of skill levels.
  • a blind sweep is a known protocol for helping a novice operator acquire ultrasound images.
  • the operator sweeps the ultrasound over a pre-determined pattern, as opposed to moving the probe freely.
  • This pre-determined pattern greatly simplifies the procedure because the operator is not required to interpret the ultrasound images in real time and reposition the device according to those interpretations.
  • the present disclosure is generally directed to systems and methods for evaluating an ultrasound imaging procedure. This includes acquiring a series of 2D ultrasound images of an anatomical area of interest by sweeping an ultrasound probe over a predetermined pattern, and then reconstructing a 3D image from the sequence of 2D images. The system or method further includes determining a validation metric for the reconstructed 3D image and communicating evaluation feedback based on the validation metric for the 3D image.
  • One manner in which the systems and methods disclosed herein improve upon conventional approaches is through the realization that acceleration and angular velocity acquired during an ultrasound imaging procedure can be used to generate a 3D image from the series of 2D images.
  • some ultrasound probes, such as the Philips Lumify, already include an inertial measurement unit (“IMU”) that could provide one approach for determining the acceleration and angular velocity.
  • the disclosed systems and methods can automatically generate the reconstructed 3D image as part of the blind sweep protocol, and communicate evaluation feedback based on the 3D image to assist the operator in correcting any errors in executing the blind sweep protocol and obtaining the imaging information of the anatomical area of interest with sufficient fidelity to provide reliable diagnostic information.
  • the disclosure relates to an ultrasound system configured to evaluate a plurality of ultrasound sweeps in a blind sweep protocol.
  • the ultrasound system includes an ultrasound probe configured to acquire a plurality of two-dimensional (2D) ultrasound images of an anatomical area of interest.
  • the system further includes a signal processor configured to generate at least one three-dimensional (3D) image of at least a portion of the anatomical area of interest based at least in part on the plurality of 2D images (48), and further configured to determine a validation metric for the 3D image.
  • the system further includes a user interface for communicating evaluation feedback at least partially based on the validation metric for the 3D image.
  • the probe estimation sensors are selected from the group consisting of an accelerometer, a gyroscope, and a magnetometer.
  • the system may further include an image sensor.
  • the image sensor is configured to track a movement of the ultrasound probe.
  • the signal processor may be further configured to reconstruct the 3D image, at least partially based on the movement tracked by the image sensor.
  • the validation metric is based, at least partially, on a completeness of the 3D image.
  • the evaluation feedback includes identifying one or more low-fidelity regions. In some additional or alternative embodiments, the evaluation feedback includes displaying an alphanumeric image fidelity score.
  • the evaluation feedback includes an instruction to repeat all or a portion of the blind sweep protocol.
  • the evaluation feedback may further include an instruction to perform an abridged sweep, wherein at least a portion of the abridged sweep is distinct from the blind sweep protocol.
  • the signal processor is configured to generate the 3D image by inputting the plurality of 2D images into a neural network.
  • the signal processor is configured to generate the 3D image by inserting a filling image into the plurality of 2D images.
  • the signal processor may produce the filling image at least partially based on calculating an intermediate image value between two or more images of the plurality of 2D images.
  • a further aspect of the disclosure relates to a method for evaluating a plurality of ultrasound sweeps in a blind sweep protocol.
  • the method includes: i) receiving a plurality of 2D images acquired with an ultrasound probe, ii) detecting an acceleration and an angular velocity of the ultrasound probe as the ultrasound probe is moved, via a plurality of sweeps, in a predetermined pattern over an anatomical area of interest in accordance with the blind sweep protocol, iii) generating a 3D image, with reference to the predetermined pattern, using the plurality of 2D images and based, at least partially, on the detected acceleration and angular velocity, iv) determining a validation metric for the 3D image, and v) communicating evaluation feedback, at least partially, based on the validation metric.
  • a processor or controller can be associated with one or more storage media (generically referred to herein as “memory,” e.g., volatile and non-volatile computer memory such as ROM, RAM, PROM, EPROM, and EEPROM, floppy disks, compact disks, optical disks, magnetic tape, Flash, OTP-ROM, SSD, HDD, etc.).
  • the storage media can be encoded with one or more programs that, when executed on one or more processors and/or controllers, perform at least some of the functions discussed herein.
  • program or “computer program” are used herein in a generic sense to refer to any type of computer code (e.g., software, firmware, or microcode) that can be employed to program one or more processors or controllers.
  • FIG. 1 is an illustration of an ultrasound system according to some aspects of the present disclosure.
  • FIG. 2 is a schematic of one embodiment of an inertial measurement unit according to some aspects of the present disclosure.
  • FIG. 3 is a schematic of an electromagnetic tracking system according to some aspects of the present disclosure.
  • FIG. 4 is a schematic of an ultrasound probe and a computer according to some aspects of the present disclosure.
  • FIG. 5 is an illustration of a blind sweep protocol according to some aspects of the present disclosure.
  • FIG. 6 is a method for evaluating a plurality of ultrasound sweeps in a blind sweep protocol according to some aspects of the present disclosure.
  • FIG. 7 is an illustration of a 3D image being generated from a plurality of 2D images according to some aspects of the present disclosure.
  • FIG. 8 is an illustration of one method of generating a 3D image from a plurality of 2D images according to some aspects of the present disclosure.
  • FIG. 9 is an illustration of a filling image being interpolated into a plurality of 2D images according to some aspects of the present disclosure.
  • FIG. 10 is a method for communicating an alphanumeric fidelity score according to some aspects of the present disclosure.
  • FIG. 11 is an illustration of an abridged sweep according to some aspects of the present disclosure.
  • the present disclosure is generally directed to systems and methods for evaluating an ultrasound imaging procedure. This includes acquiring a series of 2D ultrasound images by sweeping an ultrasound probe over a predetermined pattern over an anatomical area of interest, and then generating a reconstructed 3D image from the sequence of 2D images. The system or method further includes determining a validation metric for the reconstructed 3D image and communicating evaluation feedback based on the validation metric for the 3D image.
  • FIG 1 is an illustration of an ultrasound system 10 according to some aspects of the present disclosure.
  • the ultrasound system 10 includes an ultrasound probe 12 that acquires ultrasound images of an anatomical area of interest as it is moved along the body of a patient 16.
  • ultrasound system 10 is a portable app-based ultrasound system designed to perform diagnostic imaging at the point of care.
  • Ultrasound probe 12 is a small handheld probe including an array of transducers and the circuitry necessary to produce an ultrasound image and is further connectable to a compatible smart device, such as a smartphone or tablet running the suitable software application.
  • the ultrasound system 10 is implemented based on the Philips Lumify system available from Royal Philips, N.A. of Cambridge, Massachusetts.
  • the handheld ultrasound probe 12, configured in accordance with the principles of this disclosure, could be deployed with other ultrasound systems, including mid-range and high-end non-portable ultrasound systems, without deviating from the scope and spirit of this disclosure.
  • Ultrasound probe 12 can be connected to a display device having a user interface 20 where the operator can view the acquired ultrasound images.
  • ultrasound probe 12 can be further connected to a tablet 18 that an operator uses to adjust various settings of the ultrasound probe 12.
  • the ultrasound probe 12 can be connected to only the tablet 18, so that the tablet is used both for displaying the acquired ultrasound images via the user interface 20 and controlling the ultrasound probe settings.
  • the ultrasound probe could alternatively be connected to any other device suitable for displaying ultrasound images or controlling ultrasound probe settings, for example, a smartphone.
  • the connection between the probe 12 and the display/control components can take the form of a wired connection, e.g. via USB port.
  • the connection can also take the form of a wireless connection using any known wireless connection protocol and hardware, such as Bluetooth or Wi-Fi.
  • the ultrasound system 10 may also include one or more probe estimation sensors.
  • the probe estimation sensors can be implemented as components of an inertial measurement unit (“IMU”) 14.
  • the IMU 14 can be a separate device releasably attached to the ultrasound probe 12, or it can be integrated into the ultrasound probe 12.
  • the probe estimation sensors may be configured to detect a three-dimensional acceleration and an angular velocity using an accelerometer 22 and a gyroscope 24, as shown in FIG. 2.
  • Such an accelerometer 22 and gyroscope 24 are possible components of an IMU 14.
  • the probe estimation sensors may include a magnetometer 26.
  • Including the magnetometer 26 would have the advantage of leading to even more accurate measurements of angular velocity.
  • magnetometer 26 may be used to compensate for gyroscope drift error in yaw angle estimation over time. For example, this can be applied to reducing errors when transforming acceleration measurements to a global frame.
  • a magnetometer is a sensor that is used to detect and measure magnetic fields. When deployed in conjunction with ultrasound probe 12, magnetometer 26 can be used to improve tracking and localization of the probe during imaging procedures. One way this can be achieved is by embedding a small magnetometer sensor in the ultrasound probe 12 itself.
  • magnetometer 26 can also be used to reduce the effects of motion artifacts, such as hand tremors, on image fidelity. By detecting and compensating for small movements of the ultrasound probe 12, magnetometer 26 can help to produce more stable and consistent images, reducing the need for repeated scans and improving diagnostic accuracy.
  • a fixed external magnetic field generator 28 along with three orthogonal coils 30 coupled to the ultrasound probe 12 can be provided as shown in FIG. 3.
  • a voltage is induced in the three coils. This induced voltage can be used to determine the angular orientation and position of the ultrasound probe 12.
  • the system 10 can include an external image sensor 32.
  • the image sensor 32 can take the form of, for example, an RGB or RGBD camera.
  • the image sensor 32 can generate a video feed of the movements of the ultrasound probe 12.
  • This video feed can be used to visually track the movement of the ultrasound probe 12 as an additional or alternative technique for determining the location, acceleration, and angular orientation.
  • the intended path or the probe itself can be marked using markers which are picked up by the camera. This visual tracking can be accomplished using any known visual odometry or simultaneous localization and mapping (“SLAM”) approach.
  • a neural network could be used that is trained on videos taken with the image sensor 32.
  • the neural network could extract and calculate any deviation from the intended path indicated by the markers.
  • the network may be a convolutional neural network which is trained on many images of ultrasound probes with different known orientations moving in known directions on a marked surface. The network learns the association between the visible markers/features in consecutive video frames and the probe location, orientation, and velocity.
  • image processing of the ultrasound image frames themselves may be used for motion detection in supplement to or in absence of IMU 14 motion data.
  • One approach to using ultrasound image frames for motion detection is to compare consecutive frames and calculate the displacement of each pixel between frames. This displacement can be used to estimate the motion of the probe and to track its position and orientation over time. By analyzing multiple frames, it is possible to detect changes in the motion of the probe and to adjust for any movements or vibrations that may be affecting image fidelity. For instance, seeing a mother’s kidneys in fetal ultrasound images indicates that the probe is either on the right or on the left side of the mother’s abdomen, and the velocity of the probe can be estimated using displacements in these images given a known average size for adult kidneys. This processing of the ultrasound images themselves helps create a map of the probe’s location and movements.
  • Another approach is to use feature detection and tracking techniques to detect and track specific features in the ultrasound image frames, such as the edges of structures or the location of blood vessels. By tracking these features over time, it is possible to estimate the motion of the probe and to compensate for any movements or vibrations that may be affecting image quality.
  • ultrasound image frame-based motion detection can be particularly useful in situations where IMU 14 motion data is not available or unreliable, such as when using an ultrasound system that does not include IMU 14. It can also be used to supplement IMU 14 data to improve accuracy and robustness of motion detection.
  • motion between ultrasound image frames may be assessed by optical flow or by similar visual odometry methods. The start of a new sweep is detectable when the probe transitions from a static to moving state. The conclusion of a sweep is detectable as an end to probe motion. If coupling is maintained during intra-sweep transitions, relative position and orientation of sweeps may also be obtained by tracking probe repositioning via the ultrasound image frames.
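  • As a concrete illustration of the frame-based motion estimation described above, the Python sketch below recovers the in-plane shift between two consecutive frames by phase correlation (a simple stand-in for the optical flow or visual odometry methods mentioned above) and converts it to an approximate probe speed. The pixel spacing and frame rate are assumed acquisition parameters, not values from the disclosure.

```python
import numpy as np

def frame_displacement(prev_frame: np.ndarray, next_frame: np.ndarray) -> tuple[float, float]:
    """Estimate the (row, col) pixel shift between two consecutive B-mode frames
    using phase correlation, a simple stand-in for optical flow / visual odometry."""
    f1 = np.fft.fft2(prev_frame.astype(np.float64))
    f2 = np.fft.fft2(next_frame.astype(np.float64))
    cross_power = f1 * np.conj(f2)
    cross_power /= np.abs(cross_power) + 1e-12            # keep only the phase information
    correlation = np.abs(np.fft.ifft2(cross_power))
    peak = np.array(np.unravel_index(np.argmax(correlation), correlation.shape), dtype=float)
    shape = np.array(correlation.shape, dtype=float)
    shift = np.where(peak > shape / 2, peak - shape, peak)  # wrap circular shifts to signed values
    return float(shift[0]), float(shift[1])

def probe_speed_mm_s(shift_px: tuple[float, float], pixel_spacing_mm: float, frame_rate_hz: float) -> float:
    """Convert a per-frame pixel shift into an approximate probe speed.
    pixel_spacing_mm and frame_rate_hz are assumed acquisition parameters."""
    return float(np.hypot(*shift_px)) * pixel_spacing_mm * frame_rate_hz
```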
  • the ultrasound probe 12 transmits the acquired ultrasound images to a computing device 34.
  • the computing device 34 may take several forms, including being part of the tablet 18 discussed in conjunction with FIG. 1.
  • the computing device 34 may be in a location remote from the ultrasound probe 12, and the ultrasound probe 12 transmits the acquired ultrasound images to the remote computing device 34 according to any data transmission method previously known in the art.
  • the computing device 34 includes a memory 36 for storing acquired ultrasound images and processor instructions.
  • the computing device 34 may also include a signal processor 38 used to generate a 3D ultrasound image from a series of 2D ultrasound images, and further configured to determine a validation metric for the generated 3D ultrasound image.
  • the present disclosure involves image acquisition of an anatomical area of interest of a patient 16 via a blind sweep protocol, a technique used in ultrasound imaging that involves systematically scanning an area of interest without any prior knowledge of the location or orientation of the target anatomical structure.
  • This technique is often used in applications where the target structure is difficult to visualize, such as in deep tissue imaging or when the target is obscured by surrounding structures, or, as described herein, when assisting an inexperienced operator to conduct antenatal screening of a gravid abdomen.
  • the parameters of a blind sweep protocol can vary depending on the specific imaging application and the type of ultrasound system being used. In general, the protocol will involve systematically scanning the area of interest using a variety of imaging modes and parameters to optimize image fidelity and increase the chances of detecting the target structure. Advantages of a blind sweep protocol include an increased detection rate: by systematically scanning the entire area of interest, a blind sweep protocol increases the chances of detecting the target structure, even if it is small or obscured by surrounding tissues. A further advantage is reduced operator bias, because the protocol removes the need for the operator to make assumptions about the location or orientation of the target structure.
  • FIG. 5 is an illustration of a blind sweep protocol according to some aspects of the present disclosure.
  • the blind sweep protocol utilizes a predetermined pattern 40.
  • the predetermined pattern 40 may be a grid shape, for example, an orthogonal grid, such as the one shown in FIG. 5. In the embodiment shown in FIG. 5, the predetermined pattern 40 is located along the abdomen of patient 16. However, other pattern locations or geometries may be appropriate depending on the particular ultrasound procedure being performed.
  • the predetermined pattern 40 may be used to guide a novice operator in making a series of longitudinal sweeps 42 and lateral sweeps 44.
  • the example embodiment of FIG. 5 shows predetermined pattern 40 including three longitudinal sweeps 42 and three lateral sweeps 44.
  • the user will need to execute repositioning movements 46 between sweeps.
  • FIG. 5 shows repositioning movements 46 being made between each of the longitudinal sweeps 42.
  • the repositioning movements 46 introduce a chance for error if the ultrasound operator fails to correctly pause the acquisition of ultrasound images during the repositioning movements 46.
  • FIG. 6 is a method 100 of evaluating a plurality of ultrasound sweeps in a blind sweep protocol according to some aspects of the present disclosure.
  • Step 102 of the method 100 includes acquiring ultrasound images by sweeping an ultrasound probe 12 over a predetermined pattern. This can be achieved, for example, by sweeping the ultrasound probe 12 previously discussed over a predetermined pattern 40 such as the one shown in FIG. 5.
  • Step 104 of method 100 includes detecting an acceleration and an angular velocity of the ultrasound probe 12 as the ultrasound probe 12 is swept over the predetermined pattern 40. Acceleration and angular velocity can be detected using any of the methods or devices previously discussed including: an IMU 14, electromagnetic tracking, image tracking with a camera, or any additional detection method previously known in the art.
  • step 106 of method 100 includes fusing the acceleration and angular velocity data. For example, if the acceleration and the angular velocity are detected at different sampling rates, the sampling must be unified. This may be accomplished by any synchronization algorithm in the art, for example, by resampling one data stream to match the other or by only using samples taken at the same timepoint. Once sampling frequencies are unified, there are several approaches to perform sensor fusion. For example, we can define a reference frame for the IMU 14 where the origin is the IMU’s 14 location within the ultrasound probe 12.
  • the probe estimation sensor may detect an angular velocity as opposed to directly detecting an angular orientation. This would be the case, for example, if the probe estimation sensor were implemented as a gyroscope.
  • the angular velocity detected by the probe estimation sensor would then be integrated to provide the orientation of the IMU 14 in both reference frames.
  • angular velocity and acceleration data may be fused with an Extended Kalman Filter, Kalman Filter, Complementary Filter, Madgwick Filter, or similarly known sensor fusion methodology.
  • the output in this case would be a more accurate estimate of orientation than from simple integration.
  • orientation is then used to transform the detected acceleration to the global reference frame.
  • the gravitational acceleration is subtracted from the vertical axis acceleration.
  • the acceleration is then single and double integrated to obtain velocity and position, respectively, as sketched below.
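  • The fusion and integration chain just described (orientation from the gyroscope and accelerometer, rotation of the acceleration to the global frame, gravity subtraction, then single and double integration) can be sketched as a minimal complementary-filter illustration in Python. The sampling interval, filter gain, and the assumption that the IMU axes coincide with the probe frame are illustrative choices, not details taken from the disclosure; a production system would more likely use a Kalman or Madgwick filter as noted above.

```python
import numpy as np

GRAVITY = 9.81   # m/s^2, removed from the vertical axis after rotating to the global frame
DT = 0.01        # assumed 100 Hz sampling after the two streams are unified
ALPHA = 0.98     # complementary-filter gain (illustrative value)

def rotation_matrix(roll, pitch, yaw):
    """Z-Y-X Euler rotation taking probe-frame vectors into the global frame."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    return Rz @ Ry @ Rx

def fuse_and_integrate(accel, gyro):
    """accel: (N, 3) probe-frame accelerations [m/s^2]; gyro: (N, 3) angular velocities [rad/s].
    Returns the final (roll, pitch, yaw) estimate and the integrated position trace."""
    angles = np.zeros(3)
    velocity = np.zeros(3)
    position = np.zeros(3)
    trajectory = []
    for a, w in zip(np.asarray(accel, float), np.asarray(gyro, float)):
        # Gyroscope prediction: integrate angular velocity to update orientation.
        angles = angles + w * DT
        # Accelerometer correction of roll/pitch from the gravity direction;
        # yaw is left to the gyroscope (or corrected by a magnetometer, as discussed above).
        roll_acc = np.arctan2(a[1], a[2])
        pitch_acc = np.arctan2(-a[0], np.hypot(a[1], a[2]))
        angles[0] = ALPHA * angles[0] + (1.0 - ALPHA) * roll_acc
        angles[1] = ALPHA * angles[1] + (1.0 - ALPHA) * pitch_acc
        # Transform acceleration to the global frame and subtract gravity from the vertical axis.
        a_global = rotation_matrix(*angles) @ a
        a_global[2] -= GRAVITY
        # Single and double integration: velocity, then position.
        velocity = velocity + a_global * DT
        position = position + velocity * DT
        trajectory.append(position.copy())
    return angles, np.array(trajectory)
```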
  • Step 108 of method 100 includes generating a reconstructed 3D image, with reference to the predetermined pattern 40, using the plurality of 2D images 48.
  • the signal processor 38 receives a plurality of 2D images 48 and outputs a 3D image 50. This step can be implemented in several different ways.
  • the ultrasound probe 12 acquires the plurality of 2D images 48 using an array of one-dimensional transducers.
  • the signal processor 38 applies a Fourier transform to the signals acquired by the one-dimensional transducers and combines these into the 3D image 50 in the frequency domain. This can be achieved in accordance with known methods, for example, as disclosed by Han et al. “3D ultrasound imaging in frequency domain with 1D array transducer”, Ultrasonics, volume 76, pages 28-34 (December 2016), incorporated herein by reference.
  • the 3D image 50 can be generated using raw ultrasound signals from the transducers. These raw ultrasound signals can either replace or supplement the acceleration and angular velocity data detected by the IMU 14.
  • this data can be used to map each of the plurality of 2D images 48 to a particular location in 3D space. This would significantly improve the accuracy and fluidity of previously known 3D models built using only raw ultrasound data.
  • the signal processor 38 receives the plurality of 2D images 48 as the ultrasound probe 12 moves along the predetermined sweeping pattern 40.
  • the signal processor may further receive the acceleration and angular velocity data detected by the IMU 14.
  • the signal processor 38 may use this acceleration and angular velocity data to determine areas of the plurality of 2D images 48 acquired either along the sweep paths 52, or along a space perpendicular to the sweep path 54.
  • the signal processor 38 would then use only overlapping areas 56 along the space perpendicular to the sweep path 54 in reconstructing the 3D image 50.
  • the overlapping areas 56 can be stitched together in accordance with known methods, for example, as disclosed by Deng et al. “Generating panorama photos”, Internet Multimedia Management Systems IV, Proceedings volume 5242 (2003), incorporated herein by reference. This method would have the advantage of reducing the number of visible misalignments in the generated 3D image 50.
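  • As an illustration of mapping each tracked 2D image to a location in 3D space, the sketch below scatters pixels from pose-tagged frames into a common voxel grid and averages overlapping contributions. The frame geometry, voxel size, and averaging rule are assumptions made for the example; the disclosure itself leaves the reconstruction and stitching method open (e.g., the Deng et al. panorama approach cited above).

```python
import numpy as np

VOXEL_MM = 1.0  # assumed isotropic voxel size

def compound_volume(frames, poses, pixel_spacing_mm, volume_shape):
    """frames: iterable of (H, W) B-mode images.
    poses: matching iterable of (R, t) pairs, each a 3x3 rotation and a 3-vector translation (mm)
    taking frame-plane coordinates into a common global frame (e.g., from the tracked probe pose).
    Overlapping contributions are averaged, a simple form of spatial compounding."""
    volume = np.zeros(volume_shape, dtype=np.float32)
    counts = np.zeros(volume_shape, dtype=np.float32)
    for frame, (R, t) in zip(frames, poses):
        H, W = frame.shape
        rows, cols = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
        # Pixels lie in the imaging plane: x lateral, y elevation (zero), z depth.
        pts_frame = np.stack([cols * pixel_spacing_mm,
                              np.zeros_like(rows, dtype=float),
                              rows * pixel_spacing_mm], axis=-1).reshape(-1, 3)
        pts_global = pts_frame @ R.T + t
        idx = np.round(pts_global / VOXEL_MM).astype(int)
        inside = np.all((idx >= 0) & (idx < np.array(volume_shape)), axis=1)
        idx, vals = idx[inside], frame.reshape(-1).astype(np.float32)[inside]
        np.add.at(volume, (idx[:, 0], idx[:, 1], idx[:, 2]), vals)
        np.add.at(counts, (idx[:, 0], idx[:, 1], idx[:, 2]), 1.0)
    return np.divide(volume, counts, out=np.zeros_like(volume), where=counts > 0)
```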
  • the 3D image 50 could be generated from a neural network trained to estimate the underlying 3D anatomy of the patient 16 from which the plurality of 2D images 48 were acquired.
  • the neural network may be a convolutional neural network (CNN), temporal convolutional network (TCN), recurrent neural network (RNN), transformer or any other variation known in the art.
  • An RNN-based implementation may use unidirectional or bidirectional long short-term memory (LSTM) architecture.
  • the network may include several fully connected or convolutional layers with pooling, normalization, dropout, or non-linear layers between them.
  • the detected acceleration and angular orientation data can be used to train the neural network by projecting the 3D estimate back into the 2D space the ultrasound images were acquired from.
  • the neural network would then reassess the projected 3D estimate in the 2D space to evaluate the loss of the 3D estimate. Additionally or alternatively, the neural network could use raw ultrasound data to estimate the underlying 3D anatomy of patient 16. As shown in FIG. 9, the signal processor 38 can further improve the reconstructed 3D image 50 by generating and interpolating a filling image 58 into the plurality of 2D images 48. For example, if the ultrasound operator moves through a portion of the blind sweep protocol too quickly, the ultrasound probe 12 may be unable to acquire enough 2D images 48 to produce a complete and continuous 3D image. In this case, the signal processor 38 could fill in areas lacking in image data with one or more filling images 58.
  • the filling image 58 could be produced using any interpolation method previously known in the art.
  • the signal processor 38 could produce filling image 58 by calculating a series of intermediate pixel values between each pixel of any two of the acquired 2D ultrasound images 48.
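  • A minimal sketch of producing a filling image 58 from intermediate pixel values is shown below; the linear weighting between the two neighbouring frames is an illustrative choice and stands in for whatever interpolation method is actually used.

```python
import numpy as np

def filling_image(frame_a: np.ndarray, frame_b: np.ndarray, weight: float = 0.5) -> np.ndarray:
    """Produce a filling image by linearly interpolating intermediate pixel values between
    two neighbouring 2D frames. weight=0.0 reproduces frame_a, 1.0 reproduces frame_b."""
    if frame_a.shape != frame_b.shape:
        raise ValueError("neighbouring frames must have the same dimensions")
    return (1.0 - weight) * frame_a.astype(np.float32) + weight * frame_b.astype(np.float32)
```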
  • the filling image 58 could be generated using a deep neural network architecture such as a generative adversarial network (“GAN”) or any of the other neural network configurations previously discussed.
  • the neural network that generates the filling image 58 could be the same or a different neural network from the one that generates the 3D image 50 from the plurality of 2D images 48.
  • the validation metric could be based on the completeness of the 3D image 50.
  • the signal processor 38 may detect portions of 3D image 50 where there is an insufficient number of 2D images 48 to create an accurate and/or continuous 3D reconstruction. This would indicate a low level of completeness. Additionally or alternatively, the number of filling images 58 the signal processor 38 was required to generate and interpolate could be an indication of the completeness of the 3D image. More filling images 58 would indicate a less complete 3D model 50.
  • the validation metric could be based on parameters relating to the sweep fidelity. For example, sweeping with a uniform velocity throughout the blind sweep protocol would be an indication of higher sweep fidelity, as the plurality of 2D images 48 would be uniformly distributed among each area of the patient’s 16 anatomy of interest. Meanwhile, sweeps performed at a high velocity would indicate a lower sweep fidelity as they might result in an insufficient number of image frames for a reliable 3D reconstruction. Moreover, detected deviations from the predetermined pattern 40 could also be an indication of lower sweep fidelity.
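  • These sweep-fidelity notions could be quantified, for example, as in the sketch below; the chosen statistics (coefficient of variation of the speed and maximum deviation from the predetermined pattern) are illustrative assumptions rather than metrics specified by the disclosure.

```python
import numpy as np

def sweep_fidelity(positions: np.ndarray, pattern: np.ndarray, dt: float) -> dict:
    """positions: (N, 3) tracked probe positions along one sweep [mm].
    pattern: (M, 3) points sampled along the corresponding predetermined sweep path [mm].
    dt: time between position samples [s]."""
    speeds = np.linalg.norm(np.diff(positions, axis=0), axis=1) / dt
    # Uniform velocity -> low coefficient of variation -> higher sweep fidelity.
    speed_cv = float(np.std(speeds) / (np.mean(speeds) + 1e-9))
    # Deviation from the pattern: distance from each sample to its nearest pattern point.
    dists = np.linalg.norm(positions[:, None, :] - pattern[None, :, :], axis=2).min(axis=1)
    return {
        "speed_cv": speed_cv,                    # lower is better
        "max_speed_mm_s": float(speeds.max()),   # flags sweeps performed too quickly
        "max_deviation_mm": float(dists.max()),  # flags departures from the pattern 40
    }
```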
  • the validation metric could be based on parameters relating to the ultrasound images themselves.
  • the signal processor 38 could evaluate the plurality of 2D images 48 for shadows obscuring the images or for overall image contrast.
  • approaches such as Monte Carlo dropout could be used to determine the validation metric based on network confidence.
  • the network, trained with drop-out layers, may be run multiple times on the same input during inference to generate slightly different outputs (since dropout drops the outputs from a specified number of nodes at random). These different outputs can be used to compute the mean and variance of the reconstructed 3D image 50.
  • the variance can be used to indicate a level of confidence in the output (i.e., high variance indicates that network output is not consistent and, therefore, confidence is low, while low variance indicates consistent output and high confidence).
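  • A Monte Carlo dropout pass of this kind might look like the following PyTorch sketch; the reconstruction network itself, the number of stochastic passes, and the use of voxel-wise variance as the confidence signal are assumptions made for illustration.

```python
import torch

def mc_dropout_confidence(model: torch.nn.Module, images_2d: torch.Tensor, passes: int = 20):
    """Run a dropout-trained reconstruction network several times on the same 2D input stack
    and return the mean reconstruction and a voxel-wise variance map; high variance marks
    voxels where the output is inconsistent (low confidence)."""
    model.train()  # keep dropout layers stochastic at inference time
    with torch.no_grad():
        samples = torch.stack([model(images_2d) for _ in range(passes)], dim=0)
    model.eval()
    return samples.mean(dim=0), samples.var(dim=0)
```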
  • the validation metric could be an alphanumeric fidelity score that is communicated as evaluation feedback.
  • the 3D image 50 could be made up of a plurality of voxels.
  • An example embodiment of a method 200 for calculating an alphanumeric quality score based on the voxels of 3D image 50 is shown in FIG. 10.
  • Step 202 of method 200 includes comparing a fidelity of each voxel in the 3D image 50 with a voxel threshold. This threshold can be determined using any of the previously discussed methods for determining the validation metric.
  • Step 204 of method 200 includes determining a total number of voxels in the 3D image 50. This could, for example, be performed by the signal processor 38.
  • Step 206 of method 200 includes comparing the number of voxels with fidelity greater than the voxel threshold to the total number of voxels. In some examples, this comparison includes calculating a ratio or percentage of the number of voxels with fidelity greater than or equal to the voxel threshold over the total number of voxels.
  • Step 208 of method 200 includes communicating an alphanumeric fidelity score based on the comparison. This alphanumeric fidelity score could again be the calculated ratio or percentage. The comparison may take many forms depending on the particular application and/or imaging ultrasound procedure. Alternatively, this alphanumeric score could include a message, such as a message indicating that the model is “reliable”, “unreliable”, or “moderately reliable”.
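  • Method 200 could be realized numerically as in the sketch below; the voxel threshold and the cut-offs for the "reliable", "moderately reliable", and "unreliable" messages are hypothetical values, not taken from the disclosure.

```python
import numpy as np

def fidelity_score(voxel_fidelity: np.ndarray, voxel_threshold: float = 0.7) -> tuple[float, str]:
    """voxel_fidelity: per-voxel fidelity values for the reconstructed 3D image 50.
    Returns the percentage of voxels at or above the threshold and a short message."""
    total = voxel_fidelity.size                                        # step 204
    above = int(np.count_nonzero(voxel_fidelity >= voxel_threshold))   # step 202
    percent = 100.0 * above / total                                    # step 206
    if percent >= 90.0:        # illustrative cut-offs
        message = "reliable"
    elif percent >= 70.0:
        message = "moderately reliable"
    else:
        message = "unreliable"
    return percent, message                                            # step 208
```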
  • step 112 of method 100 includes communicating evaluation feedback at least partially based on the validation metric.
  • This feedback could be based on the fidelity of individual sweeps. This could, for example, take the form of visual feedback where the sweeps that were performed correctly are shown on the predetermined pattern 40 as green and the sweeps that need to be repeated are shown as red. Additionally or alternatively, the feedback could take the form of auditory feedback that outputs an auditory message to the ultrasound probe 12 operator. Additionally or alternatively, the feedback could take the form of haptic feedback. In which case, the ultrasound probe 12 or tablet 18 could vibrate when the user deviates from the predetermined pattern 40. In some further examples, the system may communicate: a fidelity score for individual sweeps, the maximum and minimum ultrasound probe 12 acceleration/velocity, a maximum deviation from the predetermined pattern 40, or guidance to the operator for improving their next sweep.
  • the evaluation feedback could include visually displaying the 3D image 50 and indicating to the operator less complete areas of the 3D image 50.
  • the system may further communicate instructions to repeat all or a portion of the blind sweep protocol. Referring to FIG. 11, the system can identify low-fidelity regions 60 on the 3D image 50.
  • the sweeps on predetermined pattern 40 corresponding to the low-fidelity region 60 can be filled in or re-taken with a different sweep than in the original sweeping protocol.
  • One way this can be achieved is by using the detected acceleration and angular velocity to determine locations in the blind sweep protocol that correspond to coordinates on the 3D image 50.
  • the system 10 feedback may further include a user interface 20 that includes two separate displays, where one display shows the 3D image 50 and the other display shows both the path and orientation of the ultrasound probe 12, as detected by the IMU 14, and the areas of the predetermined pattern 40 corresponding to the low-fidelity regions 60.
  • This 3D image 50 along with the motion tracking information from the IMU 14 allows for further guidance of the ultrasound probe 12 operator.
  • the system 10 guides the ultrasound probe 12 operator to fill in low-fidelity regions 60 with an abridged sweep 62, as opposed to the complete longitudinal 42 or lateral 44 sweep shown in FIG. 5.
  • Such an abridged sweep 62 can be a subset of the original sweeps, but could also be a shorter sweep in a completely different direction, so long as the low-fidelity region 60 is covered appropriately.
  • the start position will be conveyed to the operator via the user interface 20, and the sweep will be monitored via the IMU 14 for adherence to the intended pattern.
  • the benefit to the operator is shorter additional sweeps that they can complete faster than if they were to repeat the entire blind sweep protocol.
  • the phrase “at least one,” in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements.
  • This definition also allows that elements can optionally be present other than the elements specifically identified within the list of elements to which the phrase “at least one” refers, whether related or unrelated to those elements specifically identified.
  • the present disclosure can be implemented as a system, a method, and/or a computer program product at any possible technical detail level of integration.
  • the computer program product can include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present disclosure.
  • the computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device.
  • the computer readable storage medium can be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing.
  • a non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing.
  • a computer readable storage medium is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
  • Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network.
  • the network can comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers.
  • a network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
  • Computer readable program instructions for carrying out operations of the present disclosure can be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, configuration data for integrated circuitry, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++, or the like, and procedural programming languages, such as the “C” programming language or similar programming languages.
  • the computer readable program instructions can execute entirely on the user’s computer, partly on the user's computer, as a stand-alone software package, partly on the user’s computer and partly on a remote computer or entirely on the remote computer or server.
  • the remote computer can be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection can be made to an external computer (for example, through the Internet using an Internet Service Provider).
  • electronic circuitry including, for example, programmable logic circuitry, field- programmable gate arrays (FPGA), or programmable logic arrays (PLA) can execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present disclosure.
  • the computer readable program instructions can be provided to a processor of a computer, a special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • These computer readable program instructions can also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
  • the computer readable program instructions can also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus, or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • each block in the flowchart or block diagrams can represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s).
  • the functions noted in the blocks can occur out of the order noted in the Figures.
  • two blocks shown in succession can, in fact, be executed substantially concurrently, or the blocks can sometimes be executed in the reverse order, depending upon the functionality involved.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Graphics (AREA)
  • General Engineering & Computer Science (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

Disclosed are ultrasound methods and systems that evaluate a plurality of ultrasound sweeps in a blind sweep protocol. The methods and systems include: i) an ultrasound probe configured to acquire a plurality of two-dimensional (2D) images; ii) a signal processor configured to generate a three-dimensional (3D) image using the plurality of 2D images, and further configured to determine a validation metric for the 3D image; and iii) a user interface that communicates evaluation feedback at least partially based on the 3D image.
PCT/EP2024/073634 2023-09-06 2024-08-23 Systems and methods for ultrasound blind sweep interpretation and fidelity assessment Pending WO2025051557A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202363536750P 2023-09-06 2023-09-06
US63/536,750 2023-09-06

Publications (1)

Publication Number Publication Date
WO2025051557A1 (fr) 2025-03-13

Family

ID=92582969

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2024/073634 Pending WO2025051557A1 (fr) Systems and methods for ultrasound blind sweep interpretation and fidelity assessment

Country Status (1)

Country Link
WO (1) WO2025051557A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210068782A1 (en) * 2019-09-10 2021-03-11 Metritrack, Inc. System and method for tracking completeness of co-registered medical image data
WO2021099214A1 (fr) * 2019-11-21 2021-05-27 Koninklijke Philips N.V. Systèmes et procédés destinés à l'obtention d'images ultrasonores médicales
US20230026942A1 (en) * 2019-11-22 2023-01-26 Koninklijke Philips N.V. Intelligent measurement assistance for ultrasound imaging and associated devices, systems, and methods
US20230037923A1 (en) * 2021-02-26 2023-02-09 Cae Healthcare Canada Inc System and method for evaluating the performance of a user in capturing an ultrasound image of an anatomical region

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210068782A1 (en) * 2019-09-10 2021-03-11 Metritrack, Inc. System and method for tracking completeness of co-registered medical image data
WO2021099214A1 (fr) * 2019-11-21 2021-05-27 Koninklijke Philips N.V. Systèmes et procédés destinés à l'obtention d'images ultrasonores médicales
US20230026942A1 (en) * 2019-11-22 2023-01-26 Koninklijke Philips N.V. Intelligent measurement assistance for ultrasound imaging and associated devices, systems, and methods
US20230037923A1 (en) * 2021-02-26 2023-02-09 Cae Healthcare Canada Inc System and method for evaluating the performance of a user in capturing an ultrasound image of an anatomical region

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
DENG ET AL.: "Generating panorama photos", INTERNET MULTIMEDIA MANAGEMENT SYSTEMS IV, PROCEEDINGS, vol. 5242, 2003, XP002400359
HAN ET AL.: "3D ultrasound imaging in frequency domain with 1D array transducer", ULTRASONICS, vol. 76, December 2016 (2016-12-01), pages 28 - 34, XP029921017, DOI: 10.1016/j.ultras.2016.12.007
WEIN WOLFGANG ET AL: "Three-Dimensional Thyroid Assessment from Untracked 2D Ultrasound Clips", 1 January 2020, SPRINGER, PAGE(S) 514 - 523, XP047594668 *

Similar Documents

Publication Publication Date Title
EP3478209B1 (fr) Système de suivi à dispositif inertiel et procédé de fonctionnement de celui-ci
EP3056151B1 (fr) Procédé d'imagerie par fusion ultrasonore et système de navigation d'imagerie par fusion ultrasonore
US20200069285A1 (en) System and method for ultrasound navigation
CN100496407C (zh) 超声波诊断装置
CN111292277B (zh) 超声融合成像方法及超声融合成像导航系统
US20140243671A1 (en) Ultrasound imaging system and method for drift compensation
CN113260313A (zh) 用于超声数据收集的方法和装置
US20190231438A1 (en) Intravascular catheter for modeling blood vessels
US10362964B2 (en) Method, apparatus, and system for providing medical image
CN107111875A (zh) 用于多模态自动配准的反馈
EP2944258A1 (fr) Procédé et appareil d'enregistrement d'images médicales
KR20230173714A (ko) 초음파 프로브의 위치결정 및 배향을 유도하기 위한 시스템 및 방법
KR20160046670A (ko) 영상 진단 보조 장치 및 방법
JP2022037101A (ja) 超音波システム及び方法
US11911213B2 (en) Techniques for determining ultrasound probe motion
CN107072638B (zh) 对超声图像的序列进行可视化的方法、计算机程序产品和超声系统
US10925679B2 (en) Position determination device for determining a position of an instrument within a tubular structure
JP2014212904A (ja) 医用投影システム
US20160345937A1 (en) System and method for imaging using ultrasound
WO2025051557A1 (fr) Systems and methods for ultrasound blind sweep interpretation and fidelity assessment
US20240324848A1 (en) Information processing apparatus, information processing method, and information processing program
CN102119002A (zh) 超声成像
CN104887271B (zh) 输出包括在感兴趣区域中的血流信息的方法、设备和系统
CN110546684B (zh) 时变数据的定量评估
WO2025051602A1 (fr) System and methods for automatic labeling of ultrasound blind sweeps

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24762264

Country of ref document: EP

Kind code of ref document: A1