
WO2024223029A1 - Method for determining an ultrasound acquisition parameter - Google Patents

Method for determining an ultrasound acquisition parameter

Info

Publication number
WO2024223029A1
Authority
WO
WIPO (PCT)
Prior art keywords
roi
ultrasound
feature
subset
medium
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/EP2023/060791
Other languages
English (en)
Inventor
Bo Zhang
Biao Chen
Junchi Liu
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SuperSonic Imagine SA
Original Assignee
SuperSonic Imagine SA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SuperSonic Imagine SA
Priority to PCT/EP2023/060791
Publication of WO2024223029A1
Legal status: Pending


Classifications

    • A61B 6/4417: Constructional features of apparatus for radiation diagnosis related to combined acquisition of different diagnostic modalities
    • A61B 6/502: Apparatus or devices for radiation diagnosis specially adapted for diagnosis of the breast, i.e. mammography
    • A61B 8/0825: Clinical applications for diagnosis of the breast, e.g. mammography
    • A61B 8/085: Clinical applications involving detecting or locating foreign bodies or organic structures, e.g. tumours, calculi, blood vessels, nodules
    • A61B 8/14: Echo-tomography
    • A61B 8/4416: Constructional features of the ultrasonic diagnostic device related to combined acquisition of different diagnostic modalities, e.g. combination of ultrasound and X-ray acquisitions
    • A61B 8/463: Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
    • A61B 8/485: Diagnostic techniques involving measuring strain or elastic properties
    • A61B 8/54: Control of the diagnostic device
    • A61B 6/025: Tomosynthesis
    • A61B 6/037: Emission tomography
    • A61B 8/462: Displaying means of special interest characterised by constructional features of the display
    • A61B 8/469: Special input means for selection of a region of interest
    • G01S 7/52042: Details of receivers using analysis of echo signal for target characterisation, determining elastic properties of the propagation medium or of the reflective target
    • G06T 7/0012: Biomedical image inspection

Definitions

  • Ultrasound imaging provides a non-invasive method to visualize the internal structure of a material, an animal, a human body or parts of it (for example organs).
  • visualization methods can be used to screen for and diagnose cancer in a patient.
  • early breast screening can allow detection of lesions that might be malignant or about to become cancerous, so that treatment can take place at an early stage of the disease.
  • Mammography and tomosynthesis utilize x-ray radiation to allow visualization of the breast in a compressed position. These techniques are used to screen patients for detection of potentially cancerous lesions or for cancer monitoring.
  • Traditional mammograms involve acquiring two-dimensional images of the compressed breast.
  • Tomosynthesis produces a plurality of x-ray images, each of discrete layers or slices of the breast, through the entire thickness thereof.
  • Tomosynthesis pieces together a three- dimensional visualization of the breast.
  • Mammography and tomosynthesis are typically performed while the patient is standing or sitting and the patient's breast tissue is under compression.
  • an ultrasound scan may be the next step in better identifying the potential lesion.
  • Ultrasound uses sound waves, typically produced by piezoelectric transducers, to image tissues in a patient.
  • An ultrasound probe focuses the sound waves by producing an arc-shaped sound wave that travels into the body and is partially reflected or diffused by the layers between different tissues in the patient.
  • the reflected sound wave is detected by the transducers and converted into electrical signals that can be processed by the ultrasound scanner, for example to form an ultrasound image of the tissue.
  • Ultrasound is typically performed while the patient is in a supine position and the patient's breast tissue is not under compression.
  • a computer-implemented method of determining an ultrasound acquisition parameter comprises: obtaining a feature associated with a medium, and determining the ultrasound acquisition parameter based on the feature.
  • an ultrasound acquisition process may be optimized based on one or several specific characteristics (i.e., the feature) derived, for example, from another imaging modality.
  • the ultrasound acquisition process may be adjusted in this way to better acquire data from a particular region of interest (ROI) in the medium, for example a lesion.
  • the obtained ultrasound data of the ROI may correlate more precisely and/or more reliably with image data of the other imaging modality.
  • the ultrasound acquisition process quality can be significantly improved in terms of time consumption, optimized image acquisition parameters, automatic image review (i.e., selection/weighting of subsets, post-processing), and accuracy of detection, segmentation, correlation, and classification of a lesion (i.e., of a region of interest in the medium, as described below).
  • the workflow from lesion detection and correlation to classification in ultrasound imaging can be enhanced by automating several critical operations of the ultrasound scan and decision-making process, thus reducing the ultrasound procedure time.
  • the method of the present disclosure makes different exam modalities (for example, tomosynthesis and ultrasound imaging) work collaboratively to enhance the image formation and image review performance.
  • the method of the present disclosure may lead to a reduction of the user dependency (i.e. novice versus expert) for ultrasound data acquisition, and optionally interpretation, due to the possible automation of determining the individual examination process. Interpretation may be improved, being based on more relevant ultrasound data obtained during the exam thanks to the disclosure.
  • the method may further allow: an increased exam reproducibility for follow-up exams over time; an increased exam standardization, reporting, and documentation; the capability to provide a personalized and optimal examination with relevant detection/assessment of potential lesions and well-targeted acquisition and measurements (for example via depth focalization on a detected lesion), leading to an improved accuracy of the exam outcomes; and a cost reduction (for example manpower of medical workers, optimization of medical device usage, more generally a reduction of healthcare costs) due to an improved subject management.
  • the dynamic examination process may for example avoid useless exams, irrelevant additional exams/useless interventional procedures, and unnecessary follow-up exams, which are generally due to partial and/or improper acquisition of the data leading to potential misinterpretation of the data. It may also provide a cost reduction due to an increased productivity, for example due to an accelerated examination process.
  • An ultrasound acquisition process (as described below) may be provided for examinations of a subject.
  • the method of the present disclosure may thus improve the outcomes of the ultrasound acquisition process, in particular in case the user of an ultrasound imaging system performing the ultrasound acquisition process is not skilled to manually detect and correlate a particular lesion within the medium.
  • the medium may be or may comprise a tissue of a subject, for example a breast.
  • subject may refer to a patient who may be the focus of clinical attention, investigation, or treatment.
  • subject may include sick or healthy humans who are receiving medical care, as well as sick animals who are being treated by veterinarians.
  • a healthy subject may also be examined for prophylactic or preventive purposes, i.e. to avoid any illness or pathology.
  • a subject may also be examined during or after a treatment.
  • the term "subject" may also be used to refer to individuals or animals who are participating in the study, whether as patients or healthy volunteers. The use of the term "subject" may reflect the fact that these individuals may be the primary focus of attention and observation in the medical or research context.
  • the feature may be obtained based on data of a sensor modality.
  • the feature may be obtained based on image data of an imaging modality.
  • the imaging modality may hence be a specific type of the sensor modality.
  • the feature may be associated with a region of interest, ROI, in the medium.
  • the ROI may be or may comprise a lesion in the medium (for example a human or animal tissue), as described in more detail below.
  • the feature may comprise at least one of: a geometric feature of the medium and/or the region of interest, a tissue feature of the medium and/or the region of interest, and a category of the medium and/or the region of interest.
  • the ultrasound acquisition parameter may comprise at least one of: a type of a transducer device for obtaining ultrasound image data, a frequency, a focal depth, a frame rate, a ROI location, a display range, a quantification range, an attenuation of sound, a speed of sound characterizing the ROI and/or the medium, and an image processing property.
  • ultrasound acquisition parameters may comprise settings, such as acquisition parameters (for example depth, focus, frequency), processing parameters (for example speed of sound, edge enhancement, speckle reduction), display parameters (for example dynamic range, quantification range), or other settings (for example a mode, gain, persistence, compression, harmonic imaging, spatial compounding, speckle reduction, edge enhancement, color Doppler settings, pulsed wave Doppler settings, and power Doppler settings, etc.). It is noted that the afore-mentioned list is not exhaustive and may also comprise further settings to configure ultrasound acquisition parameters.
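  • As a rough, non-authoritative illustration of how such settings could be grouped and derived, consider the following Python sketch; it is not the patent's implementation, and all names and threshold values in it (for example determine_params, the 25 mm depth cut-off) are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class UltrasoundAcquisitionParams:
    """Hypothetical grouping of the settings listed above."""
    transducer_type: str = "linear"
    center_frequency_mhz: float = 7.5    # acquisition parameter
    focal_depth_mm: float = 30.0         # acquisition parameter
    frame_rate_hz: float = 25.0          # acquisition parameter
    speed_of_sound_mps: float = 1540.0   # processing parameter
    dynamic_range_db: float = 60.0       # display parameter

def determine_params(lesion_depth_mm: float) -> UltrasoundAcquisitionParams:
    """Toy rule-based mapping from a feature (lesion depth) to parameters:
    focus at the lesion depth and trade frequency against depth, since
    deeper targets need a lower frequency for penetration."""
    params = UltrasoundAcquisitionParams()
    params.focal_depth_mm = lesion_depth_mm
    params.center_frequency_mhz = 10.0 if lesion_depth_mm < 25.0 else 5.0
    return params

print(determine_params(lesion_depth_mm=32.0))
```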
  • the imaging modality for obtaining the feature may be a medical imaging modality, and/or the imaging modality may comprise at least one of: mammography, tomosynthesis, magnetic resonance imaging (MRI), single-photon emission computerized tomography (SPECT) scan, positron emission tomography (PET) scan, optical coherence tomography (OCT), optical tomography, X-ray exam, and ultrasound imaging.
  • Obtaining the feature may comprise: detecting or localizing or identifying and optionally segmenting the ROI in the medium based on the data of the sensor modality of the medium. Accordingly, a ROI may be detected or identified in the medium or its position may be additionally localized. Moreover, once a ROI is detected it may be segmented, for example using a predefined computer-implemented segmentation algorithm.
  • the data of the sensor modality may be or may comprise, for example, image data.
  • the geometric feature may comprise at least one of: a distance from the ROI to a predefined landmark in the medium, and a location and/or size of the ROI in relation to a predefined area in the medium.
  • the tissue feature may comprise at least one of: a density, a texture, a distribution, a stiffness, and a visco-elasticity of the ROI and/or of the medium.
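  • Purely as an illustrative sketch (not taken from the patent), geometric features such as an equivalent lesion size and a lesion-to-landmark distance could be computed from a binary segmentation mask as follows; the function and variable names are hypothetical.

```python
import numpy as np

def geometric_features(mask: np.ndarray, landmark_rc: tuple,
                       pixel_mm: float) -> dict:
    """Compute simple geometric features of a segmented ROI: centroid,
    equivalent diameter, and distance to a landmark (e.g. the nipple).
    `mask` is a binary 2-D array; `landmark_rc` is (row, col) in pixels."""
    rows, cols = np.nonzero(mask)
    centroid = np.array([rows.mean(), cols.mean()])
    size_mm = 2.0 * np.sqrt(mask.sum() / np.pi) * pixel_mm  # equivalent diameter
    dist_mm = float(np.linalg.norm(centroid - np.asarray(landmark_rc)) * pixel_mm)
    return {"centroid_rc": centroid, "size_mm": size_mm,
            "lesion_to_landmark_mm": dist_mm}

# Example: a 10 x 10 pixel square lesion in a 100 x 100 pixel image
m = np.zeros((100, 100))
m[45:55, 45:55] = 1
print(geometric_features(m, landmark_rc=(0.0, 50.0), pixel_mm=0.2))
```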
  • the method may further comprise obtaining follow-up information of the ROI and/or the medium and determining the ultrasound acquisition parameter further based on the follow-up information.
  • the follow-up information may be or may also be referred to as progressive, evolutionary, or historic information of the ROI and/or the medium.
  • the follow-up information may comprise several data sets, wherein each data set has been collected at a different point in time, for instance over several weeks, months or even years. In this way, an evolution of the medium in general and of the ROI in particular can be derived from the follow-up information.
  • the follow-up information may serve as an information source for determining the ultrasound acquisition parameter.
  • the ultrasound acquisition parameter may not only be determined or adapted based on the feature, but also on the follow-up information.
  • the follow-up information may comprise at least one of image data of an imaging modality (as described above), subject information, and clinical records obtained at different time points.
  • the follow-up information may comprise information about a temporal evolution of a lesion (i.e. the ROI), for instance regarding its location, size, and/or stiffness. Based on said information, the current location, size, and/or stiffness may be predicted; this information may be used together with the feature for determining the ultrasound acquisition parameter.
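  • As a minimal sketch of such a prediction, assuming follow-up size measurements at known time points, the current lesion size could be extrapolated linearly (hypothetical names; not the disclosure's actual model):

```python
import numpy as np

def predict_current_size(times_days: list, sizes_mm: list, t_now: float) -> float:
    """Least-squares linear extrapolation of a lesion size from
    follow-up measurements to the time of the current exam."""
    slope, intercept = np.polyfit(times_days, sizes_mm, deg=1)
    return slope * t_now + intercept

# Example: sizes measured at days 0, 90 and 180; exam takes place at day 270
print(predict_current_size([0, 90, 180], [8.0, 9.1, 10.2], 270))  # ~11.3 mm
```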
  • the method may further comprise configuring an ultrasound acquisition process based on the ultrasound acquisition parameter.
  • the ultrasound acquisition parameter may configure, set, or define at least one property or setting of an ultrasound acquisition process.
  • the method may further comprise obtaining ultrasound image data, for example of a plurality of image frames in the configured ultrasound acquisition process.
  • Ultrasound image data may be obtained using an ultrasound imaging system associated with an ultrasound probe (i.e., an ultrasound transducer device).
  • the ultrasound acquisition parameter may be used for controlling the ultrasound probe and/or the ultrasound imaging system.
  • the obtained ultrasound image data may comprise B-mode (brightness mode) image data and/or ShearWave™ Elastography image data, or data of any other ultrasound modality.
  • the ultrasound acquisition process may comprise guiding a pose, for instance 3D or 6D pose, of a transducer device for obtaining the ultrasound image data.
  • a 3D pose may be defined by three-dimensional translation information, and a 6D pose by three-dimensional translation information and in addition three-dimensional rotation information.
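  • A minimal illustration of the distinction, with hypothetical type names:

```python
from dataclasses import dataclass

@dataclass
class Pose3D:
    """3D pose: three-dimensional translation only (in mm)."""
    x: float
    y: float
    z: float

@dataclass
class Pose6D(Pose3D):
    """6D pose: translation plus three-dimensional rotation (in degrees)."""
    roll: float = 0.0
    pitch: float = 0.0
    yaw: float = 0.0
```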
  • the pose may optionally be guided as a function of already obtained plurality of image frames and/or the feature of the ROI in the image data of the medical imaging modality.
  • the guidance may happen in real-time or pseudo real-time, based on the already obtained ultrasound image data, for example such that the ultrasound image data cover or contain the ROI.
  • the method may further comprise evaluating and/or selecting a subset of ultrasound image data by comparing the subset with the feature.
  • the subset may be or may comprise one or several ultrasound image frames.
  • the subset may also be a particular type of ultrasound modality, for example ShearWave Elastography.
  • the selection and/or evaluation may imply selecting and/or weighting one or several image frames, and/or selecting a particular type of ultrasound modality, for example SWE.
  • the evaluation may comprise assigning the subset with a weighting coefficient. In this way, a soft selection of subsets may be achieved by assigning respective weights to the subsets.
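  • A minimal sketch of such a soft selection, assuming each subset has already been given a scalar quality score; the softmax-style weighting below is one possible choice, not one prescribed by the disclosure:

```python
import numpy as np

def soft_select(scores: np.ndarray, temperature: float = 0.1) -> np.ndarray:
    """Turn per-subset quality scores into weighting coefficients:
    higher score -> higher weight, and the weights sum to 1."""
    z = scores / temperature
    z = z - z.max()            # subtract the max for numerical stability
    weights = np.exp(z)
    return weights / weights.sum()

# Example: three candidate frames; the second matches the feature best
print(soft_select(np.array([0.2, 0.9, 0.4])))
```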
  • the feature, for instance a geometric feature of a lesion in tomosynthesis image data, may be used to select an ultrasound image frame, for instance by comparing and/or correlating the lesion of the tomosynthesis image data with a lesion on the image frame.
  • Evaluating and/or selecting the subset may comprise or may be based on at least one of: detecting a ROI based on the subset, assigning the ROI with a detection confidence coefficient, and assigning the ROI with a correlation coefficient.
  • Detecting may also be referred to as, for example, "localizing" or "identifying". At least one of these operations may thus be part of the evaluating and/or selecting operation. However, at least one of these operations, or all of these operations, may also be performed in advance, and the results may be provided to the evaluating and/or selecting operation, for example during and/or after the ultrasound acquisition process.
  • the evaluating and/or selecting operation may also be based on one of the aforementioned operations. By performing at least one of these operations in advance, the evaluating and/or selecting operation may be accelerated and a real-time requirement of said operation may be achieved with less computational resources or more reliably.
  • the detection confidence coefficient may for example indicate a probability that the detected area is a lesion, and/or a probability that the ROI belongs to a predefined ROI category.
  • the correlation coefficient may for example indicate a strength of relationship between the ROI of the subset (for example a selected image frame) and the ROI of the sensor modality data.
  • the ROI may be detected and optionally segmented in each subset.
  • a subset among a plurality of subsets having each a detected ROI may then be selected and/or weighted.
  • a subset with an ROI having a relatively high detection confidence coefficient and/or a relatively high correlation coefficient may be selected and/or may be assigned with a relatively high weighting value, for example when compared to other subsets.
  • the correlation coefficient may be determined based on the feature associated with the sensor modality data and a second feature determined based on the subset.
  • the second feature may comprise at least one of: a geometric feature of the ROI, a tissue feature of the ROI and/or the medium, and a ROI categorization.
  • a tissue feature of a medium may for example comprise information about the vicinity of the ROI, for example of vessels in the vicinity.
  • Such areas in the vicinity may for example provide helpful reference points for correlating ROIs in image data of two different modalities. Accordingly, it may be advantageous to not only take into account information of the ROI itself, but also from other areas in the medium.
  • in case the ROI of the sensor modality data and the ROI of the subset have similar characteristics (i.e. similar or corresponding features), the correlation coefficient may be determined to be relatively high.
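  • As one purely illustrative realization (the disclosure does not prescribe a formula), a correlation coefficient between two ROI feature vectors could be derived from their relative differences, yielding a value near 1 for matching lesions:

```python
import numpy as np

def roi_correlation(feat_a: np.ndarray, feat_b: np.ndarray) -> float:
    """Hypothetical correlation coefficient between two ROI feature
    vectors (e.g. [size_mm, lesion_to_nipple_mm, depth_mm]): 1.0 for
    identical features, decaying towards 0 as they diverge."""
    rel_diff = np.abs(feat_a - feat_b) / (np.abs(feat_a) + np.abs(feat_b) + 1e-9)
    return float(np.exp(-5.0 * rel_diff.mean()))

tomo_roi = np.array([12.0, 40.0, 25.0])    # features from tomosynthesis
us_roi = np.array([11.0, 43.0, 24.0])      # features from an ultrasound frame
print(roi_correlation(tomo_roi, us_roi))   # close to 1 -> likely the same lesion
```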
  • the method may further comprise post-processing the subset and optionally displaying an image based on the post-processed subset.
  • post-processing may comprise at least one of: zooming to and/or cropping of the ROI, performing a measurement of the ROI, and mapping the ROI in the displayed image to the image data of the sensor modality.
  • the image or the post-processed subset may be displayed on a display device.
  • the method may further comprise reconstructing the data of the sensor modality based on the post-processed subset.
  • the obtained ultrasound data may be fed back to a reconstruction operation which reconstructs the data of the sensor modality, for example the tomosynthesis image data, in order to adjust and/or optimize the reconstructing process.
  • the post-processed subset can give insights regarding the content or characteristics of a lesion, for example tissue characteristics inside the lesion. It is thus possible, for example, to focalize the sensor modality (for example the tomosynthesis process) according to the information obtained from the post-processed subset, in order to reconstruct only a relatively small part of the initial image data of the sensor modality (for example only the ROI or parts of an identified area in the subset).
  • the precision and/or resolution of the reconstructed ROI can be increased. Accordingly, the reconstructed ROI may become more robust. In other words, artefacts from, for instance, other tissues or bones in the medium may be reduced, as these areas are not reconstructed anymore, but only the ROI.
  • the method may be performed at least partially in real-time or pseudo real-time. For example, determining the ultrasound acquisition parameter based on a feature, configuring an ultrasound acquisition process, obtaining ultrasound image data, and optionally also evaluating and/or selecting a subset, post-processing the subset, and displaying the subset may be performed in real-time or in quasi real-time, such that a user steering the ultrasound acquisition process (for example by holding an ultrasound probe) does not perceive any significant delay when looking at the display device.
  • the method may be at least partially performed using an artificial intelligence (AI) model.
  • Examples of AI models may comprise machine learning models, such as decision trees, random forests, logistic regression, gradient boosting, neural networks, and/or support vector machines. Predefined algorithms may be used to perform classification and prediction tasks, such as image recognition. For example, a decision tree algorithm may be used to classify images of objects based on their features. Other examples of machine learning models may comprise deep learning models, such as neural networks, as well as k-nearest neighbors and/or clustering algorithms, which may be used for tasks such as anomaly detection and data segmentation.
  • AI may refer to the development of computer systems and algorithms that may perform tasks that typically require human intelligence, such as visual perception, speech recognition, decision-making, and language translation. AI systems and algorithms may be trained using large sets of data and sophisticated algorithms that may allow them to identify patterns and trends, make predictions, and optimize outcomes.
  • the present disclosure may further refer to a computing device, comprising: at least one processor, and at least one memory storing computer-executable instructions, the computer-executable instructions, when executed by the processor, causing the computing device to perform a method according to any one of the preceding claims.
  • the computing device may be configured to obtain a feature associated with a medium, and to determine an ultrasound acquisition parameter based on the feature.
  • the processor may be a component of electronic devices that may be responsible for carrying out computational tasks. There may be different types of processing units, each designed for specific purposes.
  • a processing unit may be or may comprise a Central Processing Unit (CPU), Graphics Processing Unit (GPU), Digital Signal Processor (DSP), Field-Programmable Gate Array (FPGA), and/or Application-Specific Integrated Circuit (ASIC).
  • the CPU may be the primary processing unit in a computer that may be responsible for executing most of the instructions and calculations required by the computer.
  • GPUs may be specialized processing units that may be designed to handle the complex calculations required for rendering graphics and video.
  • DSPs may be specialized processing units that may handle signal processing tasks
  • FPGAs may be reconfigurable processing units that may be programmed to perform various computational tasks
  • ASICs may be customized processing units designed to perform a specific set of tasks, optionally making them highly efficient and effective for their intended purpose. These processing units may be found in a wide range of electronic devices, such as medical electronic devices. Medical electronic devices may include computers, smartphones, and other digital devices, optionally enabling these devices to perform various computational tasks efficiently and accurately. The method according to the present disclosure may also run on a virtual server.
  • the present disclosure may also relate to a system for determining an ultrasound acquisition parameter, the system comprising means for carrying out the method according to any examples of the present disclosure.
  • the system may comprise or may be a computing device, as described above.
  • the present disclosure may also relate to an imaging system (e.g., ultrasound imaging) comprising means for carrying out the method according to any examples of the present disclosure.
  • an imaging system e.g., ultrasound imaging
  • the present disclosure may also relate to a computer program comprising instructions which, when the program is executed by a computer, cause the computer to carry out the method according to any examples of the present disclosure.
  • the method operations may comprise any aspects (for example physical) which go beyond mere data processing (for example an ultrasound signal processing).
  • the computer program may further comprise computer-readable instructions which when executed by a data processing system cause any external elements of a system (for example an ultrasound transducer device) to carry out these operations.
  • the present disclosure may also relate to a computer-readable medium comprising instructions which, when executed by a computer, cause the computer to carry out the method according to any examples of the present disclosure.
  • FIG. 1 schematically shows a method of determining an ultrasound acquisition parameter according to examples of the present disclosure.
  • FIG. 2a shows an example of stage A. (feature extraction process) of the method of Fig. 1.
  • FIG. 2b shows an example of stage B. (an ultrasound acquisition process) of the method of Fig. 1.
  • FIG. 2c shows an example of stage C. (review process) of the method of Fig. 1.
  • FIG. 3a shows a schematic example of a region of interest (ROI) mapping process according to the present disclosure.
  • FIG. 3b shows a schematic example of a post-processing process (zooming, among others) according to the present disclosure.
  • Fig. 3c shows a further schematic example of a post-processing process (measurement) according to the present disclosure.
  • FIG. 3d shows a schematic example of a feedback-controlled reconstruction process according to the present disclosure.
  • FIG. 4 shows a schematic drawing of an ultrasound system 10 according to examples of the present disclosure.
  • Fig. 1 schematically shows a method 100 of determining an ultrasound acquisition parameter according to examples of the present disclosure.
  • the method may comprise several stages, as described in more detail in the following.
  • Each stage may comprise one or several operations.
  • Each stage and/or each operation may for example be implemented by a respective software module.
  • a first optional stage A. may concern a feature extraction process. In this context, reference is also made to Fig. 2a.
  • Fig. 2a shows an example of stage A. (feature extraction process) of the method of Fig. 1.
  • the imaging modality may be a medical imaging modality.
  • the imaging modality may comprise at least one of: mammography, tomosynthesis, magnetic resonance imaging (MRI), single-photon emission computerized tomography (SPECT) scan, positron emission tomography (PET) scan, optical coherence tomography (OCCT), optical tomography (OCT), X-ray exam, and ultrasound imaging.
  • In a first operation S1, data of a sensor modality may be obtained and reconstructed, for example prior tomosynthesis or other modalities' images (e.g., MRI, SPECT, PET, and OCT), prior B-mode images, and corresponding imaging parameters and patient history records.
  • This may be done using one or several systems, for example a tomosynthesis imaging system and an ultrasound imaging system, or a single system combining both. At least one of the tomosynthesis imaging system and the ultrasound imaging system or another imaging or scanning system as for example mentioned above may be part of the system for determining an ultrasound acquisition parameter according to the present disclosure.
  • a feature associated with a medium may be obtained. The operations S1 and S2 may be performed simultaneously or subsequently.
  • the feature may be obtained in operation S2 based on image data of an imaging modality reconstructed in operation S1.
  • the feature may be associated with a region of interest, ROI, in the medium.
  • the ROI may be or may comprise a lesion in the medium (for example a human or animal tissue), as described in more detail below.
  • the feature may comprise at least one of: a geometric feature of the medium and/or the region of interest, a tissue of the medium and/or the region of interest, and a category of the medium and/or the region of interest.
  • the geometric feature may comprise at least one of a distance from the ROI to a predefined landmark in the medium, and a location and/or size of the ROI in relation to a predefined area in the medium.
  • the tissue feature may comprise at least one of a density, a stiffness, and a visco-elasticity of the ROI and/or of the medium.
  • the operation S2 may comprise a sub-operation S2a of detecting and optionally segmenting a region of interest (ROI) in the medium based on the data of the sensor modality of the medium.
  • a ROI may be detected or identified in the medium or its position may be additionally localized.
  • a ROI may be segmented, for example using a predefined computer-implemented segmentation algorithm, for example an AI-based algorithm, such as a neural network, for example a convolutional neural network (CNN).
  • the data of the sensor modality may be or may comprise, for example image data.
  • It is also possible that the feature is merely obtained from a communication interface, a data storage or the like, i.e. without any detecting or segmenting operation.
  • the detected and/or segmented feature may be predetermined and/or pre-stored (for example together with the reconstructed data of the sensor modality) and provided to a system performing the method according to the present disclosure.
  • geometric features of lesions may be extracted in tomosynthesis / B-mode and/or other modalities' prior images, such as for instance lesion locations, lesion sizes, lesion-to-nipple distance, lesion-to-skin-distance over chest-wall-distance ratios, and/or lesion-to-landmark (e.g., skin / pectoral muscle line / nipple) scales, etc.
  • the subject information such as lesion size change, prior quadrant and/or regions, and scores, may be mined.
  • the prior B-mode imaging parameters such as B-mode probe type, scan frequency, focal depths, and image processing preferences, etc. may be explored and tabulated as prior imaging parameters.
  • FIG. 2b shows an example of stage B. (an ultrasound acquisition process) of the method of Fig. 1
  • an ultrasound acquisition parameter is determined (or estimated or predicted) based on the feature obtained in stage A., i.e. in operation S2.
  • the ultrasound acquisition parameter may comprise at least one of: a type of a transducer device for obtaining ultrasound image data, a frequency, a focal depth, a frame rate, a speed of sound of the ROI and/or the medium, and an image processing property.
  • the live ultrasound image acquisition parameters may be determined, such as for instance transducer type, frequency, focal depth, frame rate, and image processing preference setting, etc. as outputs.
  • a module for determining the ultrasound acquisition parameter may be implemented as a multi-input and multi-output neural network to perform a high-dimensional mapping from image features, geometric features, and temporal and spatial information to acquisition parameters, for example with assistance of AI-powered lesion detection, segmentation, and correlation.
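  • A minimal numpy sketch of such a multi-input, multi-output mapping (untrained random weights and hypothetical dimensions, for illustration only; a real module would be trained on annotated exam data):

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_forward(x: np.ndarray, w1, b1, w2, b2) -> np.ndarray:
    """One hidden layer mapping a concatenated feature vector (image,
    geometric, temporal and spatial features) to a vector of acquisition
    parameters (e.g. frequency, focal depth, frame rate)."""
    hidden = np.maximum(0.0, x @ w1 + b1)  # ReLU hidden layer
    return hidden @ w2 + b2                # linear output head

n_in, n_hidden, n_out = 16, 32, 3          # hypothetical dimensions
w1, b1 = rng.normal(size=(n_in, n_hidden)), np.zeros(n_hidden)
w2, b2 = rng.normal(size=(n_hidden, n_out)), np.zeros(n_out)

features = rng.normal(size=n_in)           # stand-in for extracted features
print(mlp_forward(features, w1, b1, w2, b2))
```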
  • ultrasound image data may be obtained, optionally of a plurality of image frames and/or in different ultrasound modalities.
  • the operation S4 may comprise the optional sub-operation S4' of configuring the ultrasound acquisition process based on the ultrasound acquisition parameter (i.e. prior to or simultaneously with obtaining the ultrasound image data). Accordingly, the ultrasound acquisition parameter may configure, set, or define at least one property or setting of an ultrasound acquisition process.
  • the ultrasound acquisition parameter may be used for controlling an ultrasound probe 20 and/or the ultrasound imaging system 10, as for example described in context of Fig. 4.
  • the obtained ultrasound image data may comprise B-mode (brightness mode) image data and/or ShearWave™ Elastography (SWE) image data, or data of any other ultrasound modality.
  • Obtaining ultrasound image data may comprise or may be based on an optional sub-operation S4a of detecting and optionally segmenting a ROI in one or several ultrasound image frames.
  • Detecting and optionally segmenting a ROI may further comprise or be based on an optional sub-operation S4b of assigning the ROI of the subset with a detection confidence coefficient.
  • the detection confidence coefficient may for example indicate a probability that the detected area is really a lesion, and/or a probability that the ROI belongs to a predefined ROI category.
  • Detecting and optionally segmenting a ROI may further comprise or be based on an optional sub-operation S4c of assigning the ROI of the subset with a correlation coefficient determined based on a correlation between the ROI of the subset and the ROI of the sensor modality data.
  • the correlation coefficient may for example indicate a strength of relationship between the ROI of the subset (for example a selected image frame) and the ROI of the sensor modality data.
  • the correlation coefficient may be determined based on the feature associated with the sensor modality data and a second feature determined based on the subset.
  • the second feature may comprise at least one of: a geometric feature of the ROI, a tissue of the ROI and/or the medium, and a ROI categorization. Accordingly, in case the ROI of the sensor modality data and ROI of the subset have similar characteristics (i.e. similar or corresponding features) the correlation coefficient may be determined to be relatively high.
  • the ultrasound acquisition process may comprise guiding a pose, for instance 3D or 6D pose, of a transducer device for obtaining the ultrasound image data.
  • a 3D pose may be defined by three-dimensional translation information, and a 6D pose by three-dimensional translation information and in addition three-dimensional rotation information.
  • the pose may optionally be guided as a function of detected ROIs in the obtained plurality of image frames, as described in more detail in the example below.
  • the guidance may happen in real-time, based on the already obtained ultrasound image data, for example such that the ultrasound image data cover the ROI.
  • sonographers may use the determined ultrasound acquisition parameter(s) and may quickly maneuver an ultrasound probe to the lesion quadrants and/or regions, assisted by the real-time AI lesion detection, correlation, and/or classification. They may then focus on the probe scan along the breast skin with real-time visual feedback of the lesion's information shown on the display or AR (augmented reality) glasses, such as lesion bounding boxes, boundaries, masks, lesion probabilities, and malignancy probabilities. Eventually they may find the expected probe pose or posture (i.e., location, clock angle, and image plane) for optimal lesion presentation, and voice-activate the ultrasound image acquisition and recording while the live SWE image shows the least artifacts with adapted and optimized image quality.
  • the ultrasound acquisition parameter may be determined further based on follow-up information, for example of a subject record.
  • the follow-up information may be for example external follow-up information.
  • the follow-up information may be or may also be referred to as progressive, evolutionary or historic information of the ROI and/or the medium.
  • the follow-up information may comprise several data sets, wherein each data set has been collected at a different point in time, for instance over several weeks, months or even years. In this way, an evolution of the medium in general and of the ROI in particular can be derived from the follow-up information.
  • the follow-up information may serve as an information source for determining the ultrasound acquisition parameter. In other words, the ultrasound acquisition parameter may not only be determined or adapted based on the feature, but also on the follow-up information.
  • the follow-up information may comprise at least one of image data of an imaging modality (as described above), subject information, and clinical records obtained at different points in time.
  • the follow-up information may comprise information about a temporal evolution of a lesion (i.e. the ROI), for instance regarding its location, size, and/or stiffness. Based on said information, the current location, size, and/or stiffness may be predicted; this information may be used together with the feature for determining the ultrasound acquisition parameter.
  • A further optional stage C. concerns a review process. In this context, reference is also made to Fig. 2c.
  • Fig. 2c shows an example of stage C. (review process) of the method of Fig. 1.
  • a subset of ultrasound image data is evaluated and/or selected by comparing the subset with the feature.
  • the subset may be or may comprise one or several ultrasound image frames.
  • the subset may also be a particular type of ultrasound modality, for example SWE.
  • the selection (and/or evaluation) may imply selecting (and/or weighting) one or several image frames, and/or selecting a particular type of ultrasound modality, for example SWE.
  • the evaluation may comprise assigning the subset with a weighting coefficient.
  • a soft selection of subsets may be achieved by assigning respective weights to the subsets.
  • the feature for instance a geometric feature of a lesion in tomosynthesis image data, may be used to select an ultrasound image frame, for instance by comparing and/or correlating the lesion of the tomosynthesis image data with a lesion on the image frame.
  • Evaluating and/or selecting the subset may comprise or may be based on an optional sub-operation S5a of detecting and optionally segmenting a ROI based on the subset. It is also possible that operation S5a is configured to detect and optionally segment one or several ROIs in each subset.
  • At least one or a plurality of subsets may then be selected and/or at least one or a plurality of or all subsets may be weighted.
  • key frames with optimal lesion presentation regarding at least one of lesion shape, orientation, margins, boundaries, echo pattern, posterior features, and surrounding tissue findings may be determined, for instance in terms of at least one of duct and Cooper ligament changes, architectural distortion, skin thickening, retraction, and irregularities, etc.
  • the obtained ultrasound image frames may also be displayed to a user (optionally with the above-mentioned ROI detection results), such that the user can review and/or select them manually.
  • Evaluating and/or selecting the subset may further comprise or be based on an optional sub-operation S5b of assigning the ROI of the subset with a detection confidence coefficient.
  • the detection confidence coefficient may for example indicate a probability that the detected area is really a lesion, and/or a probability that the ROI belongs to a predefined ROI category.
  • Evaluating and/or selecting the subset may further comprise or be based on an optional sub-operation S5c of assigning the ROI of the subset with a correlation coefficient determined based on a correlation between the ROI of the subset and the ROI of the sensor modality data.
  • the correlation coefficient may for example indicate a strength of relationship between the ROI of the subset (for example a selected image frame) and the ROI of the sensor modality data.
  • the correlation coefficient may be determined based on the feature associated with the sensor modality data and a second feature determined based on the subset.
  • the second feature may comprise at least one of: a geometric feature of the ROI, a tissue of the ROI and/or the medium, and a ROI categorization. Accordingly, in case the ROI of the sensor modality data and ROI of the subset have similar characteristics (i.e. similar or corresponding features) the correlation coefficient may be determined to be relatively high.
  • a subset among a plurality of subsets may then be selected and/or weighted in operation S5. For example, a subset with a ROI having a relatively high detection confidence coefficient and/or a relatively high correlation coefficient may be selected and/or may be assigned with a relatively high weighting value (compared to other subsets).
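  • A minimal sketch of such a selection, assuming per-subset detection confidence and correlation coefficients are available; taking their product is one possible combination rule, not one prescribed by the disclosure:

```python
import numpy as np

def select_best_subset(detection_conf: np.ndarray,
                       correlation: np.ndarray) -> int:
    """Select the subset (e.g. an image frame) whose ROI has the highest
    combined detection-confidence times correlation score."""
    combined = detection_conf * correlation
    return int(np.argmax(combined))

conf = np.array([0.70, 0.95, 0.80])    # per-frame detection confidence
corr = np.array([0.60, 0.90, 0.50])    # per-frame correlation with the tomo ROI
print(select_best_subset(conf, corr))  # -> 1 (the second frame)
```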
  • At least one of the operations S5a, S5b, S5c may be part of the evaluating and/or selecting operation S5 (option 2). However, at least one of these operations S5a, S5b, S5c, or all of these operations, may also be performed in advance, for example during and/or after the ultrasound acquisition process in stage B., in particular operation S4 (option 1, cf. operations S4a to S4c in Fig. 2b). In the latter case, the results of operations S4a to S4c may be provided to the evaluating and/or selecting operation S5.
  • It is further possible that stages B. and C. are performed simultaneously and/or in real-time. For example, ultrasound image frames may be obtained and ROIs may be detected, and simultaneously an image frame may be selected, post-processed and displayed.
  • the subset may be post-processed.
  • post-processing operations may comprise zooming to and/or cropping of the ROI, performing a measurement of the ROI, and mapping the ROI in the displayed image to the image data of the sensor modality. Respective examples are shown in Fig. 3b and Fig. 3c and described in more detail in this context.
  • an image may be displayed based on the post-processed subset in an optional operation S6.
  • the image (or the post-processed subset) may be displayed on a display device.
  • Exemplary images are shown in Fig. 3a to Fig. 3c and described in more detail in this context.
  • the method may further comprise a feedback loop from operation S6 to operation S1.
  • the data of the sensor modality may be reconstructed again based on the post-processed subset of operation S6.
  • the obtained ultrasound data may be fed back to a reconstruction operation which reconstructs the data of the sensor modality, for example the tomosynthesis image data, in order to adjust and/or optimize the reconstructing process.
  • the selected and/or post- processed subset may be fed back.
  • the post-processed subset can give insights regarding the content or characteristics of a lesion, for example tissue characteristics inside the lesion. It is thus possible, for example, to focalize the sensor modality according to the information obtained from the post-processed subset, in order to reconstruct only a relatively small part of the initial image data of the sensor modality.
  • the relatively small part may be, for example, only the ROI or parts of an area identified in the subset.
  • the precision and/or resolution of the reconstructed ROI can be increased. Accordingly, the reconstructed ROI may become more robust.
  • the method may be performed at least partially in real-time. For example, determining the ultrasound acquisition parameter, configuring an ultrasound acquisition process, obtaining ultrasound image data, and optionally also evaluating and/or selecting a subset, post-processing the subset, and displaying the subset may be performed in real-time or in quasi real-time, such that a user steering the ultrasound acquisition process (for example by holding an ultrasound probe) does not perceive any significant delay when concentrating on the post-processed image data displayed on a display device.
  • Fig. 3a shows a schematic example of a region of interest (ROI) mapping process according to the present disclosure.
  • the medium may correspond to a human breast comprising two ROIs 201, 202 in the form of lesions (lesion A and lesion B).
  • the example thereby schematically illustrates the task of mapping the ROIs 201, 202 in a tomosynthesis image 200a (i.e. the sensor modality), to respective ROIs 201', 202' in ultrasound image frames 200b.
  • ROIs may be detected and segmented in the tomosynthesis image 200a and in each of the ultrasound image frames 200b. Then, an image frame among the sequence may be selected having a ROI with a relatively high detection confidence coefficient and correlation coefficient in view of a lesion in the tomosynthesis image 200a.
  • This mapping operation may be made automatically, such that both the tomosynthesis image 200a and the selected ultrasound frame 200b are displayed with respectively segmented and mapped ROIs (for example 201 and 201').
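  • As a purely illustrative sketch of such an automatic mapping (not the patent's algorithm), detected lesions could be paired greedily by the distance between their feature vectors:

```python
import numpy as np

def map_rois(tomo_feats: np.ndarray, us_feats: np.ndarray) -> list:
    """Greedily pair each tomosynthesis ROI (e.g. lesions A and B in image
    200a) with the closest unassigned ultrasound ROI (201', 202'), using
    the Euclidean distance between feature vectors."""
    pairs, used = [], []
    for i, tomo_feat in enumerate(tomo_feats):
        dist = np.linalg.norm(us_feats - tomo_feat, axis=1)
        dist[used] = np.inf                # do not reuse an ultrasound ROI
        j = int(np.argmin(dist))
        pairs.append((i, j))
        used.append(j)
    return pairs

tomo = np.array([[12.0, 40.0], [6.0, 70.0]])  # [size_mm, nipple_dist_mm]
us = np.array([[5.5, 72.0], [11.5, 41.0]])
print(map_rois(tomo, us))                     # -> [(0, 1), (1, 0)]
```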
  • a sonographer (for example a user of a system according to the present disclosure) holds an ultrasound probe in B-mode, in both B-mode and shear wave elastography (SWE) mode, or in other modes (e.g., Doppler and H-mode, where the 'H' stands for Hermite or hue) to scan the breast in live video, while a patient lies on their back with breasts in supine position, sits in an upright position, or faces down in a prone position, and the lesions in the pre-acquired tomosynthesis pictures (Tomo) or other imaging modalities' images are shown on monitors or other displays.
  • the sonographers assess and annotate the ultrasound images by selecting the regions of interest (ROIs) of possible lesions on ultrasound images, may compare those ultrasound findings with findings from a screening mammogram or other imaging modalities, and determine the correlations between ultrasound and other modalities' (e.g., MRI, SPECT, PET, and optical CT) findings.
  • the sonographers may report the lesion BI-RADS scores, lesion descriptors, and tissue description, for example according to definitions in the literature of this field and make clinical decisions after assessing the lesion malignancy.
  • conventionally, the ultrasound image acquisition parameters, such as transducer type, frequency, focal depth, frame rate, and image processing preference setting, are used without adaptation to the individual patient or pathology.
  • the ultrasound scan is affected significantly by the acquisition parameters in terms of the image quality in general, lesion visibility, lesion detectability, and lesion descriptors.
  • the continuous changes of the acquisition parameters increase breast scan time, lower the sonographers' confidence, and make them pay more attention to the parameter adjustments instead of steady and robust probe maneuvering.
  • the probe pressure on the skin may cause apparent artifacts in the ultrasound image, in particular in ShearWave mode, and subtle hand shaking can change the SWE features adversely with artifacts.
  • since the ultrasound imaging plane constantly depends on the handheld probe's posture, the acquired ultrasound image frames after freezing often are not exactly what the sonographers tried to freeze as shown on the display, resulting in sub-optimal image frames recorded for image review.
  • the technology according to the present disclosure can address current issues for the whole breast lesion continuum of detection, correlation, and classification, and can adapt and optimize the breast ultrasound imaging workflow in collaboration with tomosynthesis and other image modalities.
  • the method of the present disclosure may perform at least one of the operations described above, for example in B-mode and/or shear wave elastography (SWE) imaging.
  • the ultrasound, tomosynthesis, and/or other modalities can improve each other's lesion presentation, resulting in optimized lesion assessment.
  • Fig. 3b shows a schematic example of a post-processing process (zooming, among others) according to the present disclosure.
  • the image 300a may be a selected ultrasound image frame, for example as described in context of Fig. 3a.
  • the post-processing process may thus be an optional enhancement of the mapping process of Fig. 3a.
  • the image accordingly has been selected in view of its relatively good representation of an ROI 301 compared to other ultrasound image frames.
  • the image 300a comprises a B-mode image 305 and a partially overlapping SWE image 304.
  • the SWE image, which can give valuable insights into tissue stiffness and thus into present lesions, is relatively small and has been reconstructed (i.e. beamformed) with a relatively coarse resolution.
  • the image is post-processed by zooming into the SWE image, i.e. into the area of the ROI 301' (cf. post-processed image 300b).
  • the zoomed area (i.e. the SWE image or the respective area of the B-mode image), in particular the area of the ROI 301, may be reconstructed again with an increased resolution or quality.
  • an enhanced beamforming technique can be used for this area, or the area can be beamformed in isolation from adjacent areas, such that clutter or other disturbing factors can be reduced.
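  • A purely illustrative crop-and-upsample sketch of the zooming step; a real system would rather re-beamform the raw channel data of the ROI on a finer grid than interpolate display pixels, and all names here are hypothetical:

```python
import numpy as np

def zoom_roi(image: np.ndarray, r0: int, r1: int, c0: int, c1: int,
             factor: int = 4) -> np.ndarray:
    """Crop the ROI bounding box and upsample it by nearest-neighbour
    repetition (a simple stand-in for finer-grid reconstruction)."""
    crop = image[r0:r1, c0:c1]
    return np.repeat(np.repeat(crop, factor, axis=0), factor, axis=1)

swe = np.arange(64, dtype=float).reshape(8, 8)  # stand-in for a coarse SWE image
print(zoom_roi(swe, 2, 6, 2, 6).shape)          # (16, 16) zoomed ROI
```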
  • This post-processing operation may be made automatically, such that a system user directly sees the zoomed image 300b.
  • Fig. 3c shows a further schematic example of a post-processing process (measurement) according to the present disclosure.
  • the image 400 may be a selected ultrasound image frame, as for example described in context of Fig. 3a, or a zoomed image, as for example described in context of Fig. 3b.
  • the post-processing process may be an optional enhancement of the mapping process of Fig. 3a or of the zooming process of Fig. 3b.
  • One or several measurements may be made, for example of a distance 401 between two detected reference points (for example a ROI and a geometric feature in the medium, not shown in Fig. 3c). It is also possible that a tissue feature is measured of a ROI 402, for example in relation to a corresponding tissue feature in a reference region 403.
  • this post-processing operation may be made automatically, such that a system user directly sees the measurements in the image.
  • the ultrasound image may be displayed in real time, for example together with images of other modalities.
  • the SWE image area may be cropped with a spatial super-resolution technique to show more detail of the tissue and fatty background distribution, occupying most of the display, while the much smaller overlaid ultrasound image serves as a lesion location map; the full-sized B-mode and overlaid SWE images can still be toggled.
  • the SWE feature areas may be detected, and the SWE feature values may be measured automatically.
  • the lesion BI-RADS descriptors may be computed by AI-powered regression and classification methods; a minimal measurement sketch follows this list item.
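As a minimal sketch of such automatic measurements, the snippet below computes a point-to-point distance in millimetres and a simple stiffness summary of an ROI relative to a reference region. The function names, the kPa values, and the mask-based region definition are assumptions for illustration, not the measurement pipeline of the present disclosure.

```python
import numpy as np

def distance_mm(p1, p2, pixel_spacing_mm=(0.1, 0.1)):
    """Euclidean distance between two pixel coordinates, converted to mm."""
    d = (np.asarray(p1) - np.asarray(p2)) * np.asarray(pixel_spacing_mm)
    return float(np.hypot(*d))

def swe_summary(swe_map, roi_mask, ref_mask):
    """Mean stiffness (kPa) in the ROI and its ratio to a reference region."""
    roi_kpa = float(swe_map[roi_mask].mean())
    ref_kpa = float(swe_map[ref_mask].mean())
    return {"roi_kpa": roi_kpa, "ref_kpa": ref_kpa, "ratio": roi_kpa / ref_kpa}

# Usage on a synthetic stiffness map with a stiff 10x10 inclusion.
swe = np.full((64, 64), 8.0); swe[20:30, 20:30] = 45.0
roi = np.zeros_like(swe, bool); roi[20:30, 20:30] = True
ref = np.zeros_like(swe, bool); ref[45:55, 45:55] = True
print(distance_mm((25, 25), (50, 50)), swe_summary(swe, roi, ref))
```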
  • Fig. 3d shows a schematic example of a feedback-controlled reconstructions process according to the present disclosure.
  • the post-processed ultrasound image (for example an SWE image) can give insights regarding the content or characteristics of a lesion, for example tissue characteristics inside a lesion 501 detected and segmented in a tomosynthesis image 500.
  • the ultrasound image can point out a relatively small area in the ROI of the tomosynthesis image, which can be of particular interest for assessing the ROI. It is thus possible to focus on this small area when reconstructing the tomosynthesis image again.
  • the small area may be reconstructed in isolation, thereby allowing a higher resolution with the same or fewer computational resources, and greater robustness against artefacts from any cluttering elements (bones, tissues, etc.) in the vicinity of the small area.
  • the automatic review of ultrasound and other modalities' images may generate requirements, as feedback, for tomosynthesis and other-modality imaging, for further ROI reconstruction and lesion confirmation; a minimal sketch of such a feedback request follows this list item.
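As an illustration of this feedback loop, the sketch below turns an ultrasound finding into a focused reconstruction request for the tomosynthesis side, so that only a small sub-area around the flagged position is re-reconstructed on a finer voxel grid. The ReconRequest structure, the margin, and the voxel pitch are hypothetical interface choices, not prescribed by the disclosure.

```python
from dataclasses import dataclass

@dataclass
class ReconRequest:
    """Hypothetical request for an isolated, fine-grained tomosynthesis
    reconstruction of a sub-area flagged during the ultrasound review."""
    center_mm: tuple[float, float, float]  # sub-area centre in tomo coordinates
    size_mm: tuple[float, float, float]    # extent of the isolated grid
    voxel_mm: float                        # finer pitch than the full-volume pass

def feedback_request(us_finding_mm, margin_mm=5.0, fine_voxel_mm=0.05):
    """Build a focused reconstruction request around an ultrasound finding."""
    size = tuple(2.0 * margin_mm for _ in us_finding_mm)
    return ReconRequest(center_mm=tuple(us_finding_mm), size_mm=size, voxel_mm=fine_voxel_mm)

# Usage: the ultrasound review flags a suspicious point inside lesion 501.
print(feedback_request((31.0, 22.5, 40.0)))
```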
  • Fig. 4 shows a schematic drawing of an ultrasound system 10 according to examples of the present disclosure.
  • the ultrasound system 10 may be an example of a system for determining an ultrasound acquisition parameter.
  • the method according to the present disclosure may also be (at least partially) performed by an external system.
  • the ultrasound imaging system 10 may comprise:
  • a processing unit 30 for processing an image on the basis of signals received by the probe,
  • a control panel 40a connected to the processing unit, said control panel at least comprising buttons 41 and a touch pad 42, and
  • a display screen 50 for visualizing the processed image.
  • the probe 20 may be associated with the processing unit 30 via a cable 21 or via a wireless connection, and it is able to emit ultrasound waves W into a medium M and to receive ultrasound waves W from the medium M, said received ultrasound waves being processed by the processing unit 30.
  • the display screen 50 may be a screen for visualizing the image processed by the processing unit 30.
  • the display 50 may also visualize other information such as scales used in the image, or configuration information for the processing or any information such as help information or contextual gesture help for the touch pad 42.
  • the display screen may be articulated on a support arm 51 for better positioning by the user.
  • the display screen is usually a high-definition screen of a large size (at least 20 inches) for better image visualization for the user.
  • the control panel 40a is for example a portion of the system casing 31, said portion comprising a panel casing having a substantially flat surface inclined towards the user for manipulation by one hand of said user.
  • the control panel 40a may be moved by a hanger upwards and downwards to adapt to the user's size, and may optionally be moved frontward and rearward to adapt to the user's position.
  • the control panel 40a may include a control panel display screen 49 for visualizing several configuration information or any information dedicated to the user.
  • the processing unit 30 is configured to determine an ultrasound acquisition parameter according to the present disclosure.
  • the system may further display images, as described in context of Fig. 3a to 3c, in particular with mapped ROIs in two different images (cf. Fig. 3a). This may be particularly advantageous when the user of the system is a novice. That means the operator or user of the system 10 may be, for example, an unskilled or merely technically skilled person.
  • the system may use an (AI) algorithm.
  • the algorithm may automatically process and/or analyze information available in the subject record.
  • the algorithm may at least partially be external to the system 10, i.e. run at least partially on an external computing device; a minimal sketch of feature-based parameter determination follows this list.
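As a minimal, rule-based sketch of the core idea (obtaining a feature associated with the medium and determining an acquisition parameter from it), the snippet below maps an expected lesion depth, taken for example from a prior tomosynthesis exam in the subject record, to a few acquisition parameters. All threshold values and parameter names are invented for illustration; a trained (AI) model could replace the rules.

```python
def determine_acquisition_params(medium_feature: dict) -> dict:
    """Toy rule-based mapping from a medium feature (here the expected lesion
    depth in mm) to ultrasound acquisition parameters; values are illustrative."""
    depth_mm = medium_feature["lesion_depth_mm"]
    return {
        # shallower targets tolerate a higher transmit frequency (finer resolution)
        "tx_frequency_mhz": 12.0 if depth_mm < 20.0 else 7.5,
        "focal_depth_mm": depth_mm,      # focus on the expected lesion depth
        "swe_box_center_mm": depth_mm,   # centre the SWE box on the lesion
    }

# Usage: a feature obtained from the subject record.
print(determine_acquisition_params({"lesion_depth_mm": 28.0}))
# {'tx_frequency_mhz': 7.5, 'focal_depth_mm': 28.0, 'swe_box_center_mm': 28.0}
```

In practice the mapping would be learned or calibrated per probe and per application rather than hard-coded.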

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Pathology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Optics & Photonics (AREA)
  • Vascular Medicine (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

The invention concerns a method for determining an ultrasound acquisition parameter, the method comprising: obtaining a feature associated with a medium; and determining the ultrasound acquisition parameter on the basis of the feature.
PCT/EP2023/060791 2023-04-25 2023-04-25 Method for determining an ultrasound acquisition parameter Pending WO2024223029A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2023/060791 WO2024223029A1 (fr) Method for determining an ultrasound acquisition parameter


Publications (1)

Publication Number Publication Date
WO2024223029A1 (fr)

Family

ID=86332168

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2023/060791 Pending WO2024223029A1 (fr) Method for determining an ultrasound acquisition parameter

Country Status (1)

Country Link
WO (1) WO2024223029A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170128037A1 (en) * 2015-11-11 2017-05-11 Toshiba Medical Systems Corporation Medical image-processing apparatus and ultrasonic diagnostic device
US20190325573A1 (en) * 2018-04-24 2019-10-24 General Electric Company Multimodality 2D To 3D Imaging Navigation
US20200100759A1 (en) * 2018-09-27 2020-04-02 Fujifilm Corporation Information processing apparatus, program for operating information processing apparatus, method for operating information processing apparatus, and mammography apparatus
US20200323512A1 (en) * 2017-11-08 2020-10-15 Koninklijke Philips N.V. Ultrasound system and method for correlation between ultrasound breast images and breast images of other imaging modalities
US20230124481A1 (en) * 2020-03-27 2023-04-20 Hologic, Inc. Systems and methods for identifying regions of interest in multiple imaging modalities


Similar Documents

Publication Publication Date Title
CN112469340A (zh) Ultrasound system having an artificial neural network for guided liver imaging
US11627941B2 (en) Methods and systems for detecting pleural irregularities in medical images
US11653897B2 (en) Ultrasonic diagnostic apparatus, scan support method, and medical image processing apparatus
CN111683600B (zh) Apparatus and method for obtaining anatomical measurements from an ultrasound image
CN112641464B (zh) Method and system for enabling context-aware ultrasound scanning
Chen et al. Improvement of 3-D ultrasound spine imaging technique using fast reconstruction algorithm
JP2008073305A (ja) Ultrasonic breast diagnosis system
KR102539922B1 (ko) Method and system for calculating strain and automatically measuring deformation in ultrasound elastography images
JP2019526357A (ja) Ultrasonic diagnostic apparatus
EP2601637B1 (fr) Système et procédé de segmentation multi-modalité d'un tissu interne avec rétroaction en direct
US11941806B2 (en) Methods and systems for automatic assessment of fractional limb volume and fat lean mass from fetal ultrasound scans
CN115666400B (zh) Assisting a user in performing a medical ultrasound examination
CN114902288A (zh) Method and system for three-dimensional (3D) printing using anatomy-based three-dimensional (3D) model cutting
WO2024042044A1 (fr) Guided ultrasound for point-of-care staging of medical conditions
WO2024223029A1 (fr) Method for determining an ultrasound acquisition parameter
CN117036302A (zh) Method and system for determining the degree of aortic valve calcification
WO2021230230A1 (fr) Ultrasonic diagnostic device, medical image processing device, and medical image processing method
EP4327750A1 (fr) Guided ultrasound imaging for point-of-care staging of medical conditions
US12183008B2 (en) Device agnostic systems and methods for acquiring and analyzing images from an ultrasound probe
CN114098687B (zh) Method and system for automatic heart rate measurement in ultrasound motion mode
US11881301B2 (en) Methods and systems for utilizing histogram views for improved visualization of three-dimensional (3D) medical images
US20240407762A1 (en) Methods and systems for generating 3d pleural surfaces
EP4665235A1 (fr) Device-agnostic systems and methods for acquiring and analyzing images from an ultrasound probe
CN119654106A (zh) Ultrasound image acquisition
CN117157013A (zh) Method for ultrasound imaging

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 23723156

Country of ref document: EP

Kind code of ref document: A1