
WO2025021632A1 - Dynamic ultrasound recommendations - Google Patents

Dynamic ultrasound recommendations

Info

Publication number
WO2025021632A1
WO2025021632A1 · PCT/EP2024/070341 · EP2024070341W
Authority
WO
WIPO (PCT)
Prior art keywords
ultrasound
procedure
recommendation
image
processor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/EP2024/070341
Other languages
English (en)
Inventor
Sven Peter PREVRHAL
Faik Can MERAL
William Tao Shi
Nicolas YANEZ MORENO
Antonio Luigi PERRONE
Khaled Salem Abdalleh YOUNIS
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Koninklijke Philips NV
Original Assignee
Koninklijke Philips NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips NV filed Critical Koninklijke Philips NV
Publication of WO2025021632A1
Legal status: Pending (current)


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/46 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B 8/461 Displaying means of special interest
    • A61B 8/463 Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/46 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B 8/461 Displaying means of special interest
    • A61B 8/465 Displaying means of special interest adapted to display user selection data, e.g. icons or menus
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/54 Control of the diagnostic device
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/56 Details of data transmission or power supply
    • A61B 8/565 Details of data transmission or power supply involving data transmission via a network
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/58 Testing, adjusting or calibrating the diagnostic device
    • A61B 8/585 Automatic set-up of the device
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H 40/60 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H 40/63 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/20 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/08 Clinical applications
    • A61B 8/0833 Clinical applications involving detecting or locating foreign bodies or organic structures
    • A61B 8/085 Clinical applications involving detecting or locating foreign bodies or organic structures for locating body or organic structures, e.g. tumours, calculi, blood vessels, nodules
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/44 Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
    • A61B 8/4427 Device being portable or laptop-like

Definitions

  • Ultrasound imaging is increasingly being performed by low-cost, ultra-mobile ultrasound devices in addition to ultrasound imaging devices with ultrasound carts.
  • the growing capabilities of cart-based ultrasound systems and the advent of ultra-mobile ultrasound devices place an increased training burden on user groups.
  • ultrasound users may not be adequately trained to administer an ultrasound examination in full compliance with device capabilities, guidelines, state-of-the-art knowledge and understanding, and stakeholder preferences.
  • Ultrasound examinations are typically dynamic in that a sonographer may dynamically determine which ultrasound images to capture next and for how long during the ultrasound examination. The variations may be based on findings and insights during the ultrasound examination, and the ultrasound examination is only complete when the sonographer deems the ultrasound images and measurements to be complete.
  • an ultrasound system includes a memory that stores instructions; and a processor that executes the instructions.
  • the instructions When executed by the processor, the instructions cause the ultrasound system to: receive static information relating to an ultrasound procedure and dynamic information dynamically obtained from performing the ultrasound procedure; input the static information and the dynamic information to a trained machine learning model; generate, by the trained machine learning model during the ultrasound procedure, a recommendation for performing the ultrasound procedure based on the static information and the dynamic information; and overlay a marker for the recommendation on an ultrasound image to recommend an adjustment to the ultrasound procedure.
  • a method for ultrasound imaging includes receiving, by a controller comprising a memory that stores instructions and a processor that executes the instructions, static information relating to an ultrasound procedure and dynamic information dynamically obtained from performing the ultrasound procedure; inputting the static information and the dynamic information to a trained machine learning model; generating, by the trained machine learning model during the ultrasound procedure, a recommendation for performing the ultrasound procedure based on the static information and the dynamic information; and overlaying a marker for the recommendation on an ultrasound image to recommend an adjustment to the ultrasound procedure.
  • a tangible non-transitory computer- readable storage medium stores a computer program.
  • the computer program When executed by a processor, the computer program causes a system to: receive static information relating to an ultrasound procedure and dynamic information dynamically obtained from performing the ultrasound procedure; input the static information and the dynamic information to a trained machine learning model; generate, by the trained machine learning model during the ultrasound procedure, a recommendation for performing the ultrasound procedure based on the static information and the dynamic information; and overlay a marker for the recommendation on an ultrasound image to recommend an adjustment to the ultrasound procedure.
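  • The claimed flow can be summarized in a short sketch. The Python below is illustrative only: the class and function names (StaticInfo, DynamicInfo, run_recommendation_cycle) and the model/ui interfaces are assumptions made for this sketch and do not appear in the publication.

```python
from dataclasses import dataclass, field

@dataclass
class StaticInfo:
    """Information known before the exam starts (hypothetical fields)."""
    diagnostic_question: str   # e.g. an ICD-10 code
    transducer_type: str
    patient_age: int
    patient_bmi: float
    operator_experience: str

@dataclass
class DynamicInfo:
    """Information accumulated while the exam is performed (hypothetical fields)."""
    captured_images: list = field(default_factory=list)
    measurements: dict = field(default_factory=dict)

def run_recommendation_cycle(model, static_info, dynamic_info, ui):
    """One pass of the claimed cycle: input the static and dynamic information
    to a trained machine learning model, then overlay a marker on the current
    ultrasound image for any recommendation the model returns."""
    recommendation = model.predict(static_info, dynamic_info)
    if recommendation is not None and dynamic_info.captured_images:
        ui.overlay_marker(image=dynamic_info.captured_images[-1],
                          label=recommendation)
```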
  • FIG. 1 illustrates a system for dynamic ultrasound recommendations, in accordance with a representative embodiment.
  • FIG. 2 illustrates another system for dynamic ultrasound recommendations, in accordance with a representative embodiment.
  • FIG. 3 illustrates a method for dynamic ultrasound recommendations, in accordance with a representative embodiment.
  • FIG. 4 illustrates inputs and outputs for dynamic ultrasound recommendations, in accordance with a representative embodiment.
  • FIG. 5 illustrates a dynamic ultrasound recommendation, in accordance with a representative embodiment.
  • FIG. 6 illustrates another dynamic ultrasound recommendation, in accordance with a representative embodiment.
  • FIG. 7 illustrates a user interface for dynamic ultrasound recommendation, in accordance with a representative embodiment.
  • FIG. 8 illustrates another user interface for dynamic ultrasound recommendation, in accordance with a representative embodiment.
  • FIG. 9 illustrates another user interface for dynamic ultrasound recommendation, in accordance with a representative embodiment.
  • FIG. 10 illustrates another user interface for dynamic ultrasound recommendation, in accordance with a representative embodiment.
  • FIG. 11 illustrates a computer system, on which a method for dynamic ultrasound recommendations is implemented, in accordance with another representative embodiment.
  • ultrasound devices, including ultra-mobile ultrasound devices, may be used to receive guidance before, during and after ultrasound examinations, to assist in reaching adequate diagnostic relevance.
  • Dynamic ultrasound recommendations may be used both by highly trained professional sonographers and by ultrasound users from other specialties.
  • the teachings herein may also be applicable to both relatively simple and complex ultrasound examinations.
  • Guidance may be provided by a user interface specific to ultrasound examinations, and may be used to guide capturing of additional ultrasound images or measurements based on the capture history of the current ultrasound examination and the capture history of previous, similar ultrasound examinations.
  • FIG. 1 illustrates a system 100 for dynamic ultrasound recommendations, in accordance with a representative embodiment.
  • the system 100 in FIG. 1 is a system for dynamic ultrasound recommendations and includes components that may be provided together or that may be distributed.
  • the system 100 includes an ultrasound probe 110, an ultrasound base 120, and a display 180.
  • the ultrasound probe 110 includes a processing circuit 115 and a transducer array 113.
  • the processing circuit 115 may comprise a memory for storing data and instructions, an application-specific integrated circuit (ASIC) and/or a processor for processing data and instructions.
  • the transducer array 113 includes an array of transducer elements including at least a first transducer element 1131, a second transducer element 1132, and an Xth transducer element 113X.
  • the transducer array 113 converts electrical energy into sound waves which bounce off of body tissue, and receives echoes of the sound waves and converts the echoes into electrical energy.
  • the transducer array 113 may include dozens, hundreds or thousands of individual transducer elements.
  • the ultrasound probe 110 may transmit a beam to produce images and may detect the echoes.
  • the processing circuit 115 may process ultrasound images captured by the transducer array 113 of the ultrasound probe 110.
  • the ultrasound base 120 may comprise an ultrasound cart, and the memory and the processor may be implemented in the ultrasound cart.
  • the ultrasound base 120 includes a first interface 121, a second interface 122, a third interface 123, and a controller 150.
  • a computer that can be used to implement the ultrasound base 120 is depicted in FIG. 11, though an ultrasound base 120 may include more or fewer elements than depicted in FIG. 1 or FIG. 11.
  • One or more of the interfaces may include ports, disk drives, wireless antennas, or other types of receiver circuitry that connect the controller 150 to other electronic elements.
  • the first interface 121 connects the ultrasound base 120 to the ultrasound probe 110, and may comprise a port, an antenna, and/or another type of physical component for wired or wireless communications.
  • the second interface 122 connects the ultrasound base 120 to the display 180, and may also comprise a port, an antenna, and/or another type of physical component for wired or wireless communications.
  • the third interface 123 is a user interface, and may comprise buttons, keys, a mouse, a microphone, a speaker, switches, a touchscreen or other type of display separate from the display 180, and/or other types of physical components that allow a user to interact with the ultrasound base 120 such as to enter instructions and receive output.
  • the controller 150 includes at least a memory 151 that stores instructions and a processor 152 that executes the instructions.
  • the instructions stored in the memory 151 may comprise a software program including instructions for implementing a model such as an artificial intelligence model, as well as the same or a different software program for generating a user interface on an ultrasound device.
  • the user interface may be generated by the instructions stored in the memory 151 and may be displayed on the display 180.
  • the software program(s) in the memory 151 generate recommendations to be displayed on the display 180 for further ultrasound scans and for which images to capture and which measurements to take.
  • the display 180 may be local to the ultrasound base 120 or may be remotely connected to the ultrasound base 120, such as wirelessly.
  • the display 180 includes or otherwise provides a graphical user interface 181 which is configured to display various of the user interfaces described herein, for example with respect to FIG. 5, FIG. 6, FIG. 7, FIG. 8, FIG. 9 and FIG. 10.
  • the display 180 may be connected to the ultrasound base 120 via a local wired interface such as an Ethernet cable or via a local wireless interface such as a Wi-Fi connection.
  • the display 180 may be interfaced with other user input devices by which users can input instructions, including mice, keyboards, thumbwheels and so on.
  • the display 180 may be a monitor such as a computer monitor, a display on a mobile device, an augmented reality display, a television, an electronic whiteboard, or another screen configured to display electronic imagery.
  • the display 180 may also include one or more input interface(s) such as those noted above that may connect to other elements or components, as well as an interactive touch screen configured to display prompts to users and collect touch input from users.
  • the controller 150 may perform some of the operations described herein directly and may implement other operations described herein indirectly. For example, the controller 150 may indirectly control operations such as by generating and transmitting content to be displayed on the display 180. The controller 150 may directly control other operations such as logical operations performed by the processor 152 executing instructions from the memory 151 based on input received from electronic elements and/or users via the interfaces. Accordingly, the processes implemented by the controller 150 when the processor 152 executes instructions from the memory 151 may include steps not directly performed by the controller 150.
  • FIG. 2 illustrates another system for dynamic ultrasound recommendations, in accordance with a representative embodiment.
  • an ultrasound probe 210, a smartphone A, and a smartphone B are connected over a network 201.
  • the network 201 may comprise a local wireless network such as a Wi-Fi network, though the network 201 may also or alternatively include wired elements, such as USB cables connecting smartphone A and smartphone B.
  • the ultrasound probe 210 may comprise a portable transducer.
  • the ultrasound probe 210 includes a transducer array 213, a lens 214, a user interface 223, a controller 250 and a wireless communication circuit 290.
  • the transducer array 213 includes at least a first transducer element 2131, a second transducer element 2132, and an Xth transducer element 213X.
  • the transducer array 213 converts electrical energy into sound waves which bounce off of body tissue, and receives echoes of the sound waves and converts the echoes into electrical energy.
  • the transducer array 213 may include dozens, hundreds or thousands of individual transducer elements.
  • the ultrasound probe 210 may transmit a beam to produce images and may detect the echoes.
  • the processor 252 may process ultrasound images captured by the transducer array 213 of the ultrasound probe 210.
  • the lens 214 may be used to transmit the ultrasound beams and to receive echoes of ultrasound beams.
  • the user interface 223 may be used by a user to interact with the ultrasound probe 210.
  • the wireless communication circuit 290 may be used to communicate with smartphone A and smartphone B via the network 201.
  • the ultrasound probe may be configured to link to an external device such as the smartphone A and smartphone B via applications installed on the external device(s).
  • Smartphone A stores and executes an ultrasound application 299A.
  • Smartphone B stores and executes an ultrasound application 299B.
  • the ultrasound application 299A and the ultrasound application 299B may be configured to enable smartphone A and smartphone B to interact with the ultrasound probe 210 via the network 201.
  • ultrasound application 299A and ultrasound application 299B may be configured to enable displays of ultrasound images from the ultrasound probe 210.
  • Ultrasound application 299A and ultrasound application 299B may also be configured to overlay, or generate and overlay, markers on the displays of ultrasound images from the ultrasound probe 210.
  • determinations of whether recommendations are to be made may be performed by the ultrasound application 299A and the ultrasound application 299B.
  • the ultrasound application 299A and the ultrasound application 299B may each comprise a model that can be applied to static information and dynamic information as described below.
  • the controller 250 includes at least a memory 251 that stores instructions and a processor 252 that executes the instructions.
  • the instructions stored in the memory 251 may comprise a software program with a model such as an artificial intelligence model, as well as the same or a different software program for generating a user interface on an ultrasound device.
  • the user interface may be generated by the instructions stored in the memory 251 and may be displayed on a display of the smartphone A or smartphone B.
  • the software program(s) in the memory 251 generate recommendations to be displayed on displays of the smartphone A and/or the smartphone B for further ultrasound scans and which images to capture and which measurements to take.
  • the controller 250 may perform some of the operations described herein directly and may implement other operations described herein indirectly.
  • the controller 250 may indirectly control operations such as by generating and transmitting content to be displayed on a display of the smartphone A or smartphone B.
  • the controller 250 may directly control other operations such as logical operations performed by the processor 252 executing instructions from the memory 251 based on input received from electronic elements and/or users via the interfaces. Accordingly, the processes implemented by the controller 250 when the processor 252 executes instructions from the memory 251 may include steps not directly performed by the controller 250.
  • FIG. 3 illustrates a method for dynamic ultrasound recommendations, in accordance with a representative embodiment.
  • the method of FIG. 3 may be performed by the system 100 in FIG. 1 or by the ultrasound probe 210 in FIG. 2. In some embodiments based on FIG. 2 and FIG. 3, the method of FIG. 3 may be performed by a combination of the ultrasound probe 210 and one or both of the smartphone A and the smartphone B in FIG. 2.
  • the method of FIG. 3 includes obtaining static information.
  • the static information is information relating to an ultrasound procedure, such as a type of diagnosis, the type of exam prescribed by the physician, the type of transducer, demographic information of the patient, the identity, experience level, and training level of the sonographer performing the ultrasound procedure, and other types of static information that may be obtained before an ultrasound procedure.
  • the static information may also comprise a feature of the ultrasound system, such as a type of the ultrasound probe 110, the ultrasound base 120 or the ultrasound probe 210.
  • an ultrasound procedure is started.
  • the ultrasound probe 110 in FIG. 1 or the ultrasound probe 210 in FIG. 2 may be turned on, and may be initiated or activated, and a sonographer may begin obtaining ultrasound images from a patient.
  • the dynamic information is information dynamically obtained from performing the ultrasound procedure.
  • the dynamic information may be generated based on usage of the ultrasound system during the ultrasound procedure.
  • the dynamic information is information obtained directly from, or derived indirectly from, the ultrasound procedure started at S320.
  • the dynamic information may include the actual content of the ultrasound images, information derived from the content of the ultrasound images, as well as the context of the ultrasound imaging examination such as diagnosis.
  • the static information and the dynamic information are input to a model.
  • the model may comprise a trained machine learning model stored in and executed by the controller 150 in FIG. 1 or the controller 250 in FIG. 2.
  • the model may be stored in the memory 151 or the memory 251 as a software program.
  • the software program(s) in the memory 151 or the memory 251 generate recommendations to be displayed on the display 180 or on smartphone A or smartphone B for further ultrasound scans and for which images to capture and which measurements to take.
  • the model may operate in two modes, either continuous or summary.
  • the continuous mode involves continuous update of recommendations based on the captured information of the current examination, from start until the current time.
  • the summary mode involves summary recommendations when the user indicates that the examination is complete in the opinion of the user.
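  • As a sketch of how the two modes might be dispatched (the mode names and handler signatures below are assumptions for illustration, not from the publication):

```python
from enum import Enum, auto

class RecommendationMode(Enum):
    CONTINUOUS = auto()  # recommendations updated after every capture
    SUMMARY = auto()     # one review when the user marks the exam complete

def on_capture(mode, model, exam_state, ui):
    """Continuous mode: re-run the model on everything captured so far."""
    if mode is RecommendationMode.CONTINUOUS:
        for recommendation in model.predict(exam_state):
            ui.show(recommendation)

def on_exam_marked_complete(mode, model, exam_state, ui):
    """Summary mode: list whatever is still recommended at the end."""
    if mode is RecommendationMode.SUMMARY:
        for recommendation in model.predict(exam_state):
            ui.show(recommendation)
```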
  • the model may comprise an artificial intelligence model trained on archived ultrasound imaging reports. Inputs for training the artificial intelligence model may include metadata and image and measurement contents.
  • the metadata may comprise static information such as a diagnostic question, patient characteristics such as body mass index (BMI) or age, and reading/referring physician preferences.
  • the image and measurement contents (IMC) may comprise dynamic information that can be obtained during an ultrasound examination, such as from image analysis by an image analysis program, and measurements by the image analysis program.
  • the ultrasound base 120 or the ultrasound probe 210 may be configured to analyze an ultrasound image generated during the ultrasound procedure, and generate the dynamic information based on analyzing the ultrasound image. Also, or alternatively, the ultrasound base 120 or the ultrasound probe 210 may be configured to identify their usage during the ultrasound procedure, and generate the dynamic information based on the usage during the ultrasound procedure.
  • the training may use the complete metadata set together with a leave-one-out subset of the image and measurement content (IMC) to predict the difference between the full IMC set and the current subset.
  • For example, if a full set is A-B-C-D-E, then A-B-C-D may be trained to predict E, A-B-C-E may be trained to predict D, and so on.
  • the predicted images may be images that get recommended during inference in the deployment of the trained artificial intelligence model.
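  • The leave-one-out construction can be made concrete. Below is a minimal sketch, assuming each archived exam is represented as a list of labelled views; make_training_pairs is an illustrative name. At inference time, the views captured so far play the role of the context, and a predicted held-out view that has not yet been captured becomes the recommendation.

```python
def make_training_pairs(exam_views):
    """Leave-one-out pairs: the input is every view but one, and the target is
    the held-out view. For a full set A-B-C-D-E this yields (A-B-C-D -> E),
    (A-B-C-E -> D), and so on."""
    pairs = []
    for i, held_out in enumerate(exam_views):
        context = exam_views[:i] + exam_views[i + 1:]
        pairs.append((context, held_out))
    return pairs

# Example with the full set named in the text:
for context, target in make_training_pairs(["A", "B", "C", "D", "E"]):
    print(context, "->", target)
```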
  • the recommendation is generated by the trained machine learning model during the ultrasound procedure insofar as the recommendation is for one or more next steps to take or consider taking during the ultrasound procedure.
  • the recommendation is generated by the model applied at S340.
  • the recommendation may be provided as guidance to be output on a user interface used during ultrasound examinations, such as the display 180 in FIG. 1 or smartphone A and/or smartphone B in FIG. 2.
  • recommendations may be output audibly instead of or together with visualizations of the recommendations.
  • the recommendation may be a recommendation to capture additional ultrasound images or measurements based on the capture history of the current ultrasound examination and the capture history of previous, similar examinations, as well as other types of static and dynamic information.
  • a recommendation may be generated by the trained machine learning model based on actions taken by one or more sonographers in one or more matching previous contexts used in training the trained machine learning model.
  • the recommendation may be implemented to alert a sonographer to take additional measurements that other sonographers in a matching imaging context have taken.
  • the alerts and recommendations described herein may indicate how to integrate the recommendations into the examination workflow.
  • a marker is generated.
  • the marker is for the recommendation and is used to recommend an adjustment to the ultrasound procedure.
  • the marker may comprise text and/or icons to be overlaid on or adjacent to ultrasound images on the display 180 or smartphone A and/or smartphone B.
  • the marker may be generated by the controller 150 of the ultrasound base 120 in FIG. 1, or by either the controller 250 of the ultrasound probe 210 in FIG. 2 or else by the ultrasound application 299A on smartphone A or by the ultrasound application 299B of smartphone B in FIG. 2.
  • the marker is overlaid on or adjacent to ultrasound images from the ultrasound procedure.
  • the marker is overlaid on the ultrasound images on the display 180.
  • the marker may be overlaid on the ultrasound images on smartphone A or smartphone B.
  • the method of FIG. 3 may use software executing in the background during and immediately after the acquisition phase of an ultrasound examination.
  • the use of a suitable user interface on the display 180 or smartphone A or smartphone B enables automatic recommendations for acquisition of additional scans and/or for capture of additional images and measurements.
  • the model applied at S340 may comprise an artificial intelligence model that matches the current examination to an archive of ultrasound reports.
  • the matching may be based on metadata such as the diagnostic question, patient characteristics, and preferences of the reading/referring physician.
  • the model may also be applied dynamically to the image-level content of the actual ultrasound images from scans, image captures and measurements already completed during the current examination.
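  • The publication leaves the matching mechanism abstract. One plausible stand-in, shown here only as an assumption, is nearest-neighbour retrieval over joint feature vectors of metadata and image-level content; the feature-extraction step itself is not specified by the publication.

```python
import numpy as np

def match_archive(current: np.ndarray, archive: np.ndarray) -> int:
    """Return the index of the archived exam most similar to the current exam,
    by cosine similarity over (metadata + image-content) feature vectors.
    `current` has shape (d,); `archive` has shape (n, d)."""
    archive_unit = archive / np.linalg.norm(archive, axis=1, keepdims=True)
    current_unit = current / np.linalg.norm(current)
    return int(np.argmax(archive_unit @ current_unit))
```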
  • FIG. 4 illustrates inputs and outputs for dynamic ultrasound recommendations, in accordance with a representative embodiment.
  • a series of factors used as inputs for a dynamic ultrasound recommendation includes information from other sonographers 401 who have performed ultrasound examinations, and information on similar patients 402.
  • the series of factors may also or alternatively include information particular to the sonographer 403 and particular to the ultrasound device(s) 404 being used by the sonographer.
  • the series of factors may also or alternatively include information particular to the diagnosis 405 for the current patient.
  • the output in FIG. 4 comprises a recommendation 406 of a particular capture for the sonographer to take.
  • the label for the input for information from other sonographers 401 in FIG. 4 states “other sonographers who scanned”.
  • the label for the input for similar patients 402 in FIG. 4 states “such patients”.
  • the various inputs in FIG. 4 may correspond to some or all of the inputs used as training inputs when training a trained machine learning algorithm.
  • FIG. 5 illustrates a dynamic ultrasound recommendation, in accordance with a representative embodiment.
  • an ultrasound image on the user interface 581A is analyzed to recommend capture of an ultrasound image on the user interface 581B. Nodularity is not shown in the ultrasound image on the user interface 581A.
  • a curvilinear transducer is used for liver imaging to provide coverage and penetration. However, curvilinear transducers are lower in frequency and not optimized for near-field visualization.
  • the ultrasound image on the user interface 581B is from a high-frequency transducer optimized for near-field visualization and reveals nodularity at the liver capsule, as shown by the line below the portal vein as indicated by the arrow on the user interface 581B.
  • an overlaid icon 582 may be provided with text and is separate from the ultrasound images.
  • the overlaid icon 582 may be provided on the display 180 in FIG. 1 as a recommendation to capture an image with a high frequency probe to check for nodularity.
  • the recommendation may include text to suggest that a sonographer changes transducers.
  • the overlaid icon 582 may have text that states “Recommended: image with high freq. probe to check for nodularity”. With sufficient data, the machine learning model used to generate the recommendation may also be trained to provide the reason for the recommendation.
  • the context for the dynamic ultrasound recommendation in FIG. 5 is for a sonographic examination of the liver.
  • Because the liver is a complex organ with many vessels, multiple different views may be necessary during a liver ultrasound examination, depending on the diagnostic question. Variations may be based, for example, on confirming and characterizing a liver tumor versus assessing the severity of fatty liver disease. Variations may also be based on patient characteristics, such as obese patients requiring increased signal and depth. Variations may also be based on the dynamically generated content during the examination, resulting in the ultrasound image on the user interface 581A.
  • the model used to generate the recommendation on the overlaid icon 582 may comprise an artificial intelligence model trained on the metadata and image and measurement contents of native historical archive data.
  • the historical archive data may be native in the sense that there is no human annotation of salient features. For example, in the ultrasound image on the user interface 581A, the echogenic thrombus does not need to be marked up in the image by a human reader.
  • the artificial intelligence model may learn that such image characteristics triggered Doppler imaging in all or at least some cases and thus the model may be trained to recommend color and spectral Doppler mode for a current examination that exhibits similar characteristics in a B-mode image.
  • Matching using only metadata drawn from the content of the archive reports is less effective, because the corresponding metadata for the current examination only exists after its own report is made. For example, if a recommendation for a Doppler examination relied solely on report metadata, it would arrive too late: the omission of the Doppler examination could not easily be remedied.
  • FIG. 6 illustrates another dynamic ultrasound recommendation, in accordance with a representative embodiment.
  • In FIG. 6, a different imaging mode is recommended based on what is seen during the ultrasound examination.
  • the ultrasound image on the user interface 681A is analyzed, resulting in an overlaid icon 682 with a recommendation that leads to the ultrasound image on the user interface 681B.
  • the ultrasound image with the overlaid icon 682 may be provided on the display 180 in FIG. 1.
  • the overlaid icon 682 may have text that states “Recommended: switch to color and spectral Doppler”.
  • the ultrasound image on the user interface 681A includes a portal vein and reveals an echogenic thrombus indicated by the arrow on the user interface 681A. The echogenic thrombus nearly occludes the vessel lumen.
  • the recommendation on the overlaid icon 682 is to switch to color and spectral Doppler imaging modes.
  • the ultrasound image on the user interface 681B comprises a Doppler image that demonstrates low-resistance arterial vascularity within the thrombus, consistent with a tumor in a vein.
  • a new thrombus in a vein, whether a tumor in the vein or a bland thrombus, qualifies a hepatocellular carcinoma (HCC) surveillance ultrasound examination as “US-3 Positive” based on the Ultrasound Liver Imaging Reporting and Data System (US LI-RADS) scoring metric.
  • the ultrasound image on the user interface 681B may result in a recommendation for a further diagnostic evaluation with a multiphase contrast-enhanced imaging study based on identification of the new thrombus in the vein.
  • FIG. 7 illustrates a user interface for dynamic ultrasound recommendation, in accordance with a representative embodiment.
  • FIG. 7 illustrates a current view of an EPIQ main screen with a variety of labels.
  • the ultrasound image shown on the user interface 781 is from a continuous-update mode of the model used to generate recommendations.
  • the model is trained on historical ultrasound examinations including exam metadata such as the various labels shown on the user interface 781.
  • Exam metadata may include the diagnostic question, the hosting department, the referring physician, and patient characteristics.
  • the diagnostic question may be represented by one or more International Statistical Classification of Diseases and Related Health Problems, 10th revision (ICD-10) or newer codes for easier indexation. All captured ultrasound images in the training data may also be labeled to link the captured ultrasound images with anatomy and view.
  • the software program used to implement the model analyzes each screen capture or measurement acquired by the user immediately after the screen capture or measurement is taken.
  • the model predicts the next view within the context of the examination metadata and the screens and measurements captured previously during the present exam. The label of the predicted view or measurement is communicated to the user either right away or after the following screen capture, in the event that the following screen capture does not match the prediction.
  • the continuous-update mode may be appropriate for novice or less well-trained users and may improve overall exam diagnostic quality and completeness, as well as reduce exam time.
  • it may be useful to display the label of the next recommended view or measurement, as well as a picture rendition for further guidance.
  • the thumbnail views labeled 1, 2, 3, 4, 5, 6 and 7 may each correspond to other labelled ultrasound images used as training inputs for a trained machine learning model.
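  • The continuous-update behavior described above can be sketched as an event handler. The deferred-notification logic (surface the predicted label only if the following capture does not match it) follows the description; all names and interfaces are illustrative assumptions.

```python
def on_screen_capture(model, exam_meta, history, capture, pending, ui,
                      defer=True):
    """Handle one screen capture or measurement in continuous-update mode.

    `pending` is the view label predicted after the previous capture, or None.
    Returns the new pending prediction for the next call."""
    if pending is not None and capture.view_label != pending:
        ui.show_label(pending)  # the recommended view was not captured
    history.append(capture)
    predicted = model.predict_next(exam_meta, history)
    if not defer:
        ui.show_label(predicted)  # communicate the prediction right away
    return predicted
```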
  • FIG. 8 illustrates another user interface for dynamic ultrasound recommendation, in accordance with a representative embodiment.
  • FIG. 8 illustrates an ultrasound image on the user interface 881.
  • the ultrasound image in FIG. 8 may be the same as or similar to the ultrasound image in FIG. 7.
  • suggested views are provided with indicators for corresponding recommendations.
  • the ultrasound image on the user interface 881 is overlaid with alert icons to indicate for which captures or measurements a recommendation is found.
  • the exclamation point used as an alert icon 882 may be colored yellow on the user interface 881.
  • Similar alert icons on thumbnails on the right may also be colored yellow, including the alert icon 883 and the alert icon 884. The user can then click on these icons to obtain further guidance on the additional recommended captures or measurements.
  • FIG. 9 illustrates another user interface for dynamic ultrasound recommendation, in accordance with a representative embodiment.
  • In FIG. 9, a series of ultrasound images is shown as a dynamic view for suggested recommendations on a user interface 981.
  • the series of ultrasound images in FIG. 9 provide an example of how completed follow-ups to recommendations may be indicated.
  • Icons in FIG. 9 may be shown in different color codes. For example, a yellow color may indicate a recommendation identified by the model, as demonstrated by the low-quality image shown as alert icon 982. If the user ignores the yellow recommendation at the moment but captures the recommended views or measurements in the rest of the examination, the exclamation mark color code may change to green, as shown in alert icon 983 with the improved image quality. If the user interrupts the examination flow to follow a recommendation, clicks/taps the icon and performs the recommended task, then the icon may turn into a green check mark to indicate completion via recommender interaction. This is demonstrated as alert icon 984, showing an image with optimal image quality.
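  • The color coding of FIG. 9 implies a small state machine per recommendation. A sketch under that reading (the state names and transition flags are assumptions):

```python
from enum import Enum, auto

class AlertState(Enum):
    OPEN = auto()       # yellow exclamation mark: recommendation outstanding
    SATISFIED = auto()  # green exclamation mark: a later capture matched it
    COMPLETED = auto()  # green check mark: user followed the recommendation

def update_alert(state, *, followed_via_icon=False, later_capture_matched=False):
    """Apply the transitions described for alert icons 982, 983 and 984."""
    if state is AlertState.OPEN:
        if followed_via_icon:
            return AlertState.COMPLETED
        if later_capture_matched:
            return AlertState.SATISFIED
    return state
```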
  • FIG. 10 illustrates another user interface for dynamic ultrasound recommendation, in accordance with a representative embodiment.
  • a live scanning screen is shown on the user interface 1081.
  • the live scanning screen is from after a user clicked on a recommendation.
  • the portion labelled A is a reference image 1082.
  • the portion labelled B is the recommended capture 1083.
  • the portion labelled C is the actual capture 1084 which triggered the recommendation.
  • the portion labelled D is the live image on the primary window of the user interface 1081.
  • the model may change the view on the display 180 or the smartphone A and/or the smartphone B.
  • the view may change to overlay an image pair as shown in FIG. 5 and FIG. 6 on top of the current captured image.
  • the arrow with text in FIG. 5 and FIG. 6 may be replaced by a set of arrows such as in FIG. 4 to show the meta information that the recommendation is based on.
  • An example of the live scanning screen after the user activated a recommendation is shown in FIG. 10.
  • Interaction with an exclamation mark of a different color, such as green, may be slightly different. Since there is another capture which satisfies the recommendation, the exclamation mark of the different color will be shown together with the recommendation pair, i.e., the live scanning screen in the portion labelled D in FIG. 10 is replaced by the capture which the system found to be close enough to the recommended image. If the user agrees that the capture is indeed equivalent to the recommended image and is satisfied, the recommendation may be considered completed. Alternatively, a user may find the recommended image different from the other acquisition and may want to acquire another image based on the recommendation; in that case, the center portion of the user interface 1081 turns into the live scanning screen again and the user continues with the acquisition.
  • the summary-recommendation mode may be more appropriate for highly trained users and is akin to a review of examination completeness performed at the end of the examination.
  • the advantage of the summary-recommendation mode is that it is less obtrusive, yet the user may find it cumbersome to perform an extra task (reviewing the summary recommendations) at the end of an exam and then potentially restart scanning to capture missed screens or measurements.
  • user input is not specifically required by the prediction software.
  • one of the two modes may be implemented without requiring the other of the two modes.
  • FIG. 11 illustrates a computer system, on which a method for dynamic ultrasound recommendations is implemented, in accordance with another representative embodiment.
  • the computer system 1100 includes a set of software instructions that can be executed to cause the computer system 1100 to perform any of the methods or computer-based functions disclosed herein.
  • the computer system 1100 may operate as a standalone device or may be connected, for example, using a network 1101, to other computer systems or peripheral devices.
  • a computer system 1100 performs logical processing based on digital signals received via an analog-to-digital converter.
  • the computer system 1100 operates in the capacity of a server or as a client user computer in a server-client user network environment, or as a peer computer system in a peer-to-peer (or distributed) network environment.
  • the computer system 1100 can also be implemented as or incorporated into various devices, such as a workstation that includes a controller, a stationary computer, a mobile computer, a personal computer (PC), a laptop computer, a tablet computer, or any other machine capable of executing a set of software instructions (sequential or otherwise) that specify actions to be taken by that machine.
  • the computer system 1100 can be incorporated as or in a device that in turn is in an integrated system that includes additional devices.
  • the computer system 1100 can be implemented using electronic devices that provide voice, video or data communication. Further, while the computer system 1100 is illustrated in the singular, the term “system” shall also be taken to include any collection of systems or sub-systems that individually or jointly execute a set, or multiple sets, of software instructions to perform one or more computer functions.
  • the computer system 1100 includes a processor 1110.
  • the processor 1110 may be considered a representative example of a processor of a controller and executes instructions to implement some or all aspects of methods and processes described herein.
  • the processor 1110 is tangible and non-transitory.
  • “non-transitory” is to be interpreted not as an eternal characteristic of a state, but as a characteristic of a state that will last for a period.
  • non-transitory specifically disavows fleeting characteristics such as characteristics of a carrier wave or signal or other forms that exist only transitorily in any place at any time.
  • the processor 1110 is an article of manufacture and/or a machine component.
  • the processor 1110 is configured to execute software instructions to perform functions as described in the various embodiments herein.
  • the processor 1110 may be a general-purpose processor or may be part of an application specific integrated circuit (ASIC).
  • the processor 1110 may also be a microprocessor, a microcomputer, a processor chip, a controller, a microcontroller, a digital signal processor (DSP), a state machine, or a programmable logic device.
  • the processor 1110 may also be a logical circuit, including a programmable gate array (PGA), such as a field programmable gate array (FPGA), or another type of circuit that includes discrete gate and/or transistor logic.
  • the processor 1110 may be a central processing unit (CPU), a graphics processing unit (GPU), or both.
  • any processor described herein may include multiple processors, parallel processors, or both. Multiple processors may be included in, or coupled to, a single device or multiple devices.
  • the term “processor” as used herein encompasses an electronic component able to execute a program or machine executable instruction. References to a computing device comprising “a processor” should be interpreted to include more than one processor or processing core, as in a multi-core processor. A processor may also refer to a collection of processors within a single computer system or distributed among multiple computer systems. The term computing device should also be interpreted to include a collection or network of computing devices each including a processor or processors. Programs have software instructions performed by one or multiple processors that may be within the same computing device or which may be distributed across multiple computing devices.
  • the computer system 1100 further includes a main memory 1120 and a static memory 1130, where memories in the computer system 1100 communicate with each other and the processor 1110 via a bus 1108.
  • main memory 1120 and static memory 1130 may be considered representative examples of a memory of a controller, and store instructions used to implement some or all aspects of methods and processes described herein.
  • Memories described herein are tangible storage mediums for storing data and executable software instructions and are non-transitory during the time software instructions are stored therein. As used herein, the term “non-transitory” is to be interpreted not as an eternal characteristic of a state, but as a characteristic of a state that will last for a period.
  • non-transitory specifically disavows fleeting characteristics such as characteristics of a carrier wave or signal or other forms that exist only transitorily in any place at any time.
  • the main memory 1120 and the static memory 1130 are articles of manufacture and/or machine components.
  • the main memory 1120 and the static memory 1130 are computer-readable mediums from which data and executable software instructions can be read by a computer (e.g., the processor 1110).
  • Each of the main memory 1120 and the static memory 1130 may be implemented as one or more of random access memory (RAM), read only memory (ROM), flash memory, electrically programmable read only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), registers, a hard disk, a removable disk, tape, compact disk read only memory (CD-ROM), digital versatile disk (DVD), floppy disk, blu-ray disk, or any other form of storage medium known in the art.
  • the memories may be volatile or non-volatile, secure and/or encrypted, unsecure and/or unencrypted.
  • “Memory” is an example of a computer-readable storage medium.
  • Computer memory is any memory which is directly accessible to a processor. Examples of computer memory include, but are not limited to RAM memory, registers, and register files. References to “computer memory” or “memory” should be interpreted as possibly being multiple memories. The memory may for instance be multiple memories within the same computer system. The memory may also be multiple memories distributed amongst multiple computer systems or computing devices.
  • the computer system 1100 further includes a video display unit 1150, such as a liquid crystal display (LCD), an organic light emitting diode (OLED), a flat panel display, a solid-state display, or a cathode ray tube (CRT), for example.
  • the computer system 1100 includes an input device 1160, such as a keyboard/virtual keyboard or touch-sensitive input screen or speech input with speech recognition, and a cursor control device 1170, such as a mouse or touch-sensitive input screen or pad.
  • the computer system 1100 also optionally includes a disk drive unit 1180, a signal generation device 1190, such as a speaker or remote control, and/or a network interface device 1140.
  • the disk drive unit 1180 includes a computer-readable medium 1182 in which one or more sets of software instructions 1184 (software) are embedded.
  • the sets of software instructions 1184 are read from the computer-readable medium 1182 to be executed by the processor 1110. Further, the software instructions 1184, when executed by the processor 1110, perform one or more steps of the methods and processes as described herein.
  • the software instructions 1184 reside all or in part within the main memory 1120, the static memory 1130 and/or the processor 1110 during execution by the computer system 1100.
  • the computer-readable medium 1182 may include software instructions 1184 or receive and execute software instructions 1184 responsive to a propagated signal, so that a device connected to a network 1101 communicates voice, video or data over the network 1101.
  • the software instructions 1184 may be transmitted or received over the network 1101 via the network interface device 1140.
  • dedicated hardware implementations such as application-specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), programmable logic arrays and other hardware components, are constructed to implement one or more of the methods described herein.
  • One or more embodiments described herein may implement functions using two or more specific interconnected hardware modules or devices with related control and data signals that can be communicated between and through the modules. Accordingly, the present disclosure encompasses software, firmware, and hardware implementations. None in the present application should be interpreted as being implemented or implementable solely with software and not hardware such as a tangible non-transitory processor and/or memory.
  • the methods described herein may be implemented using a hardware computer system that executes software programs. Further, in an exemplary, non-limited embodiment, implementations can include distributed processing, component/object distributed processing, and parallel processing. Virtual computer system processing may implement one or more of the methods or functionalities as described herein, and a processor described herein may be used to support a virtual processing environment.
  • dynamic ultrasound recommendations enable ultrasound imaging technologists and other users who administer ultrasound imaging examinations to comply with device capabilities, guidelines, the state of the art of ultrasound imaging techniques, and stakeholder preferences.
  • although dynamic ultrasound recommendations have been described with reference to several exemplary embodiments, it is understood that the words that have been used are words of description and illustration, rather than words of limitation. Changes may be made within the purview of the appended claims, as presently stated and as amended, without departing from the scope and spirit of dynamic ultrasound recommendations in its aspects.
  • although dynamic ultrasound recommendations have been described with reference to particular means, materials and embodiments, they are not intended to be limited to the particulars disclosed; rather, dynamic ultrasound recommendations extend to all functionally equivalent structures, methods, and uses such as are within the scope of the appended claims.
  • the illustrations of the embodiments described herein are intended to provide a general understanding of the structure of the various embodiments.
  • inventions of the disclosure may be referred to herein, individually and/or collectively, by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any particular invention or inventive concept.
  • although specific embodiments have been illustrated and described herein, it should be appreciated that any subsequent arrangement designed to achieve the same or similar purpose may be substituted for the specific embodiments shown.
  • This disclosure is intended to cover any and all subsequent adaptations or variations of various embodiments. Combinations of the above embodiments, and other embodiments not specifically described herein, will be apparent to those of skill in the art upon reviewing the description.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Pathology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Radiology & Medical Imaging (AREA)
  • Veterinary Medicine (AREA)
  • Animal Behavior & Ethology (AREA)
  • Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • General Business, Economics & Management (AREA)
  • Business, Economics & Management (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Human Computer Interaction (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

According to the invention, an ultrasound system includes a memory that stores instructions and a processor that executes the instructions. The instructions cause the ultrasound system to implement a process. The process includes receiving static information relating to an ultrasound procedure and dynamic information dynamically obtained from performing the ultrasound procedure. The static information and the dynamic information are input to a trained machine learning model during the ultrasound procedure. The trained machine learning model generates a recommendation for performing the ultrasound procedure based on the static information and the dynamic information. The ultrasound system overlays a marker for the recommendation on an ultrasound image to recommend an adjustment to the ultrasound procedure.
PCT/EP2024/070341 2023-07-21 2024-07-18 Dynamic ultrasound recommendations Pending WO2025021632A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202363528110P 2023-07-21 2023-07-21
US63/528,110 2023-07-21

Publications (1)

Publication Number Publication Date
WO2025021632A1 (fr) 2025-01-30

Family

Family ID: 91966735

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2024/070341 Pending WO2025021632A1 (fr) Dynamic ultrasound recommendations

Country Status (1)

Country Link
WO (1) WO2025021632A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040147840A1 (en) * 2002-11-08 2004-07-29 Bhavani Duggirala Computer aided diagnostic assistance for medical imaging
US20170071468A1 (en) * 2015-09-11 2017-03-16 Carestream Health, Inc. Motion tracking method for sonographer
US20170360401A1 (en) * 2016-06-20 2017-12-21 Alex Rothberg Automated image acquisition for assisting a user to operate an ultrasound device
US20190350564A1 (en) * 2018-05-21 2019-11-21 Siemens Medical Solutions Usa, Inc. Tuned medical ultrasound imaging
US20200126661A1 (en) * 2017-01-17 2020-04-23 Koninklijke Philips N.V. Augmented reality for predictive workflow in an operating room
US20220087644A1 (en) * 2020-09-24 2022-03-24 GE Precision Healthcare LLC Systems and methods for an adaptive interface for an ultrasound imaging system


Similar Documents

Publication Publication Date Title
JP7373494B2 (ja) Method and apparatus for annotating ultrasound examinations
US12070360B2 (en) Method of sharing information in ultrasound imaging
US20110245632A1 (en) Medical Diagnosis Using Biometric Sensor Protocols Based on Medical Examination Attributes and Monitored Data
JP6916876B2 (ja) Ultrasound imaging apparatus having an image selector
US20210280298A1 (en) Methods and systems for detecting abnormalities in medical images
CN111374698A Ultrasound imaging system and associated workflow system and method
KR102207255B1 Method of sharing information of an ultrasound apparatus, communication method of a medical expert device, and information sharing system
US20140200449A1 (en) Ultrasound apparatus and method of providing information of the same
US20180368812A1 (en) Ultrasound imaging apparatus and control method thereof
JP2023503818A (ja) Systems and methods for acquiring medical ultrasound images
KR101263831B1 Apparatus for generating a diagnostic image, probe, method of controlling the probe, and method of generating a diagnostic image
CN117334112A Ultrasound imaging system with improved training mode
CN104856727B Method and ultrasound apparatus for displaying position information of a bursa
US20220079553A1 (en) Ultrasound diagnosis apparatus, measurement condition setting method, and non-transitory computer-readable storage medium
WO2025021632A1 (fr) Dynamic ultrasound recommendations
US20230238151A1 (en) Determining a medical professional having experience relevant to a medical procedure
CN111096765B Ultrasound diagnostic device, method for quickly finding incomplete views thereof, and storage medium
WO2023242072A1 (fr) Completed ultrasound
WO2020103103A1 (fr) Ultrasound data processing method, ultrasound device and storage medium
US20170372019A1 (en) Ultrasound system and method
CN109259802B Contrast-enhanced ultrasound imaging method and system
WO2021190984A1 (fr) Improvement of workflow efficiency through artificial intelligence (AI)-based graphical user interface (GUI) content analysis
US12430015B2 (en) Image display apparatus including display controller and operation acceptance section, image display method therefor, and program therefor
US12144683B2 (en) Ultrasound diagnosis apparatus and operating method thereof for displaying ultrasound elasticity images
US20240099686A1 (en) Ultrasound diagnostic system and control method for ultrasound diagnostic system

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 24746256

Country of ref document: EP

Kind code of ref document: A1