WO2025178564A1 - Ultrasonic probe, ultrasonic scanner and a method of ultrasonic scanning - Google Patents
Info
- Publication number
- WO2025178564A1 (PCT/SG2025/050114)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- ultrasonic
- reflection signal
- transducer array
- recited
- ultrasonic transducer
- Prior art date
- Legal status
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/42—Details of probe positioning or probe attachment to the patient
- A61B8/4245—Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient
- A61B8/4254—Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient using sensors mounted on the probe
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Clinical applications
- A61B8/0833—Clinical applications involving detecting or locating foreign bodies or organic structures
- A61B8/085—Clinical applications involving detecting or locating foreign bodies or organic structures for locating body or organic structures, e.g. tumours, calculi, blood vessels, nodules
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/42—Details of probe positioning or probe attachment to the patient
- A61B8/4209—Details of probe positioning or probe attachment to the patient by using holders, e.g. positioning frames
- A61B8/4236—Details of probe positioning or probe attachment to the patient by using holders, e.g. positioning frames characterised by adhesive patches
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/44—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
- A61B8/4477—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device using several separate ultrasound transducers or probes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5215—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
- A61B8/5223—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for extracting a diagnostic or physiological parameter from medical diagnostic data
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/89—Sonar systems specially adapted for specific applications for mapping or imaging
- G01S15/8906—Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
- G01S15/8909—Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using a static transducer configuration
- G01S15/8915—Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using a static transducer configuration using a transducer array
- G01S15/8925—Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using a static transducer configuration using a transducer array the array being a two-dimensional transducer configuration, i.e. matrix or orthogonal linear arrays
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/89—Sonar systems specially adapted for specific applications for mapping or imaging
- G01S15/8906—Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
- G01S15/8934—Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using a dynamic transducer configuration
- G01S15/8936—Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using a dynamic transducer configuration using transducers mounted for mechanical movement in three dimensions
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/89—Sonar systems specially adapted for specific applications for mapping or imaging
- G01S15/8906—Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
- G01S15/8993—Three dimensional imaging systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/42—Details of probe positioning or probe attachment to the patient
- A61B8/4209—Details of probe positioning or probe attachment to the patient by using holders, e.g. positioning frames
- A61B8/4227—Details of probe positioning or probe attachment to the patient by using holders, e.g. positioning frames characterised by straps, belts, cuffs or braces
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/44—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
- A61B8/4444—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device related to the probe
- A61B8/4455—Features of the external shape of the probe, e.g. ergonomic aspects
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/44—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
- A61B8/4483—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device characterised by features of the ultrasound transducer
- A61B8/4488—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device characterised by features of the ultrasound transducer the transducer being a phased array
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/48—Diagnostic techniques
- A61B8/483—Diagnostic techniques involving the acquisition of a 3D volume of data
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5269—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving detection or reduction of artifacts
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/58—Testing, adjusting or calibrating the diagnostic device
Definitions
- This application relates to the field of ultrasonic imaging, and more particularly to an ultrasonic probe, an ultrasonic scanner, and a method of ultrasonic scanning.
- Ultrasonic imaging or ultrasound imaging is often recommended for bladder volume estimation in the initial assessment of lower urinary tract symptoms (LUTS), in addition to the evaluation of treatment effects.
- Several problems arise when conventional ultrasonic imaging methods are used for characterizing bladder volume: (1) bulky equipment is typically available only in medical facilities such as clinics or hospitals, requiring patients to be present at the facility for the imaging process; (2) post-void residual urine volume estimates are inaccurate because of the long wait between voiding and measurement at the medical facility; (3) real-time monitoring is not possible.
- the ultrasonic probe comprises a flexible base; a first ultrasonic transducer array coupled to the flexible base, the first ultrasonic transducer array arranged along a first axis; a second ultrasonic transducer array coupled to the flexible base, the second ultrasonic transducer array arranged along a second axis, the second axis orthogonal to the first axis; and a sensor layer coupled to the first ultrasonic transducer array and the second ultrasonic transducer array, the sensor layer being configured to provide a sensor signal corresponding to at least one of: a first bending of the first ultrasonic transducer array about the second axis; and a second bending of the second ultrasonic transducer array about the first axis.
- the ultrasonic scanner comprises: the ultrasonic probe as described above, the ultrasonic probe attachable to a target surface; a controller in signal communication with the ultrasonic probe, wherein the controller is configured to: receive a first reflection signal of a target from the first ultrasonic transducer array; receive a second reflection signal of the target from the second ultrasonic transducer array; adjust the first reflection signal based on the first bending to obtain a first adjusted reflection signal; and adjust the second reflection signal based on the second bending to obtain a second adjusted reflection signal.
- a method of ultrasonic scanning comprises: attaching the ultrasonic probe as described above to a target surface, the target surface spaced apart from a target; receiving a first reflection signal of the target from the first ultrasonic transducer array; receiving a second reflection signal of the target from the second ultrasonic transducer array; adjusting the first reflection signal based on the first bending to obtain a first adjusted reflection signal; and adjusting the second reflection signal based on the second bending to obtain a second adjusted reflection signal.
- FIG. 1 is a schematic diagram of an ultrasonic scanner according to various embodiments
- FIG. 3 is an exploded view of the ultrasonic probe of FIG. 2;
- FIG. 4 is a partial exploded view of the ultrasonic probe of FIG. 2;
- FIG. 5 is a partial perspective view of an ultrasonic probe according to various embodiments.
- FIGs. 6A and 6B show respective top views of ultrasonic transducer layers according to various embodiments
- FIGs. 7A to 7C show a side view of an ultrasonic probe with different curvatures according to various embodiments
- FIGs. 8A and 8B show the respective phase errors of beamforming distortion and image reconstruction distortion
- FIG. 8C is a flowchart of a strain sensor-based method for phase error correction according to various embodiments.
- FIGs. 9A and 9B show a bladder ultrasound image and a segmented bladder area using an FCN model according to various embodiments;
- FIG. 10 is an image of an incomplete bladder image due to limited field of view, with the dotted line representing the estimated bladder contour;
- FIG. 11 is a flowchart of a proposed automatic volume estimation algorithm according to various embodiments.
- FIG. 12 is a flowchart of a method of ultrasonic scanning according to various embodiments.
- FIG. 13A is an image of a flexible polyimide (PI) PCB-based orthogonal ultrasonic probe
- FIGs. 14A to 14C show the ultrasonic pressure intensity when the transducer number (N) is 1, 64, and 128, respectively.
- the pitch, that is the space one transducer occupies, is 0.4 mm.
- the ultrasonic pressure is normalized to 64 dB and the focal length is 50 mm;
- FIG. 15A is an exemplary bladder ultrasonic image
- FIG. 17B shows an exemplary ultrasonic signal of a transducer with a backing layer
- FIG. 19 shows an exemplary image of a strain sensor of an ultrasonic probe according to various embodiments
- FIG. 20 shows the maximum ultrasonic beamforming error before and after strain sensor-based phase error compensation
- FIGs. 21 to 25 show the various performance parameters of the proposed ultrasonic probe according to various embodiments
- FIG. 26 shows the ultrasound images of a phantom bladder using the proposed ultrasonic probe according to various embodiments
- FIGs. 27A to 27D show a comparison of bladder volume estimation between a manual method and a fully convolutional neural network (FCN) method
- FIG. 28 shows a comparison of bladder volume estimation between the proposed ultrasonic probe and conventional systems
- FIG. 29A shows a temperature profile of the proposed ultrasonic probe according to various embodiments.
- FIG. 29B shows a temperature data over an hour of the proposed ultrasonic probe of FIG. 29A.
- the articles “a”, “an” and “the” as used with regard to a feature or element include a reference to one or more of the features or elements.
- Bladder volume measurement often requires multiple ultrasonic images to be obtained, followed by the extraction of the bladder width, bladder depth and the bladder length for volume estimation.
- the operator has to hold onto the probe during the imaging and repositioning.
- the manual process requires a precise repositioning (such as a 90° probe rotation) of the probe.
- the ultrasonic probe may be positioned and/or oriented at different positions relative to the target (such as the bladder) in obtaining the multiple ultrasonic images for volumetric estimation.
- the inconsistency in the positioning and orientation may introduce undesirable measurement errors during measurement, resulting in inaccurate bladder volume estimation. Error may be introduced due to the assumption of exact orthogonality between the repositioned probes / captured ultrasonic images.
- the ultrasonic transducer layer 120 and the sensor layer 130 may both be bendable in response to bending of the flexible base 110. This enables the ultrasonic probe 100 to be conformable to a target surface or an attachment surface of the subject, such as an abdomen of the subject, for bladder ultrasound measurement.
- the target surface may be spaced apart from a target of ultrasonic imaging, such as the bladder of the subject.
- the sensor layer 130 may bend together or in tandem with the ultrasonic transducer layer 120.
- the sensor layer 130 may be configured to provide a sensor signal corresponding to a bending of the ultrasonic transducer layer 120. Therefore, bending of the ultrasonic transducer layer 120 may be measured and estimated by the sensor layer 130.
- the sensor signal may correspond to: a first bending of the ultrasonic transducer layer 120 about the second axis 74 and/or a second bending of the ultrasonic transducer layer 120 about the first axis 72.
- the sensor signal may collectively represent one of or a combination of the first bending and the second bending.
- the ultrasonic transducer layer 120 may comprise a first ultrasonic transducer array 122 coupled to the flexible base 110 and a second ultrasonic transducer array 124 also coupled to the flexible base 110.
- the first ultrasonic transducer array 122 may be arranged along the first axis 72 and the second ultrasonic transducer array 124 may be arranged along the second axis 74.
- the first ultrasonic transducer array 122 may be said to be orthogonal to the second ultrasonic transducer array 124.
- the sensing zone 133 may cover the transducer zone 123.
- the sensing zone 133 may cover the first ultrasonic transducer array 122 along the first axis 72 and the second ultrasonic transducer array 124 along the second axis 74.
- the first ultrasonic transducer array 122 may comprise a plurality of first transducer units 125.
- the second ultrasonic transducer array 124 may also comprise a plurality of second transducer units 127.
- the plurality of first transducer units 125 of the first ultrasonic transducer array 122 may align along the first axis 72 and the plurality of second transducer units 127 of the second ultrasonic transducer array 124 may be aligned along the second axis 74.
- the first ultrasonic transducer array 122 may obtain a first reflection signal corresponding to a first ultrasonic image of the target along the first axis 72.
- the second ultrasonic transducer array 124 may obtain a second reflection signal corresponding to a second ultrasonic image of the target along the second axis 74.
- the first ultrasonic transducer array 122 may also include first transducer units 125 aligned along the second axis 74.
- the second ultrasonic transducer array 124 may also include second transducer units 127 aligned along the first axis 72.
- the first reflection signal from the first ultrasonic transducer array 122 may still correspond to the first ultrasonic image of the target along the first axis 72.
- the second reflection signal from the second ultrasonic transducer array 124 may still correspond to the second ultrasonic image of the target along the second axis 74.
- each pair of first transducer units 125 and/or each pair of second transducer units 127 may define a plurality of transducer pitches (P).
- the plurality of transducer pitches (P) may define a common pitch length.
- each of the plurality of transducer pitches (P) may be in a range between a full transducer wavelength (λ) and half the transducer wavelength (λ/2).
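As an illustration of this pitch constraint, the admissible pitch range can be computed from the transducer centre frequency. This is a hedged sketch: the function name, the 2 MHz example frequency, and the 1540 m/s soft-tissue sound speed are assumptions for illustration, not values taken from the disclosure.

```python
def pitch_bounds_mm(center_freq_hz, sound_speed_m_s=1540.0):
    """Return (min, max) allowable transducer pitch in mm,
    i.e. half a wavelength to one full wavelength."""
    wavelength_mm = sound_speed_m_s / center_freq_hz * 1000.0
    return wavelength_mm / 2.0, wavelength_mm

# Example: a 2 MHz transducer in soft tissue (c ≈ 1540 m/s)
lo, hi = pitch_bounds_mm(2e6)   # wavelength ≈ 0.77 mm
```

With these assumed values the pitch would have to lie between roughly 0.39 mm and 0.77 mm, consistent with the 0.4 mm pitch mentioned elsewhere in the description.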
- FIGs. 7A to 7C illustrate the ultrasonic probe 100 in various bending states.
- FIG. 7A shows the ultrasonic probe 100 in a neutral state or a planar state with minimal curvature (C0) or bending.
- the ultrasonic probe 100 may assume the neutral state when the ultrasonic probe 100 is not attached to a target surface.
- FIG. 7B shows a first side view of an ultrasonic probe 100A when viewed along the first axis 72.
- the ultrasonic probe 100A may be in a bent state with a first curvature (C1).
- FIG. 7C shows a second side view of another ultrasonic probe 100B when viewed along the second axis 74.
- the ultrasonic probe 100B may be in a bent state with a second curvature (C2). It may be appreciated that when the ultrasonic probe 100 is attached to a target surface of a subject, the ultrasonic probe 100 will conform to the target surface to bend along both the first axis 72 and the second axis 74. As such, bending of the ultrasonic probe 100 will be measured by the sensor layer 130 for phase error correction.
- the ultrasonic probe 100 may conform to a target surface of the subject.
- the ultrasonic probe 100 may be configured to obtain ultrasonic reflection signals of a target, such as a bladder of the subject, using the first ultrasonic transducer array 122 and the second ultrasonic transducer array 124.
- the ultrasonic reflection signals may include a first reflection signal of the target received from the first ultrasonic transducer array 122, and a second reflection signal of the target from the second ultrasonic transducer array 124.
- the first reflection signal may include a plurality of first unit reflection signals obtained from the first ultrasonic transducer array 122. Each of the plurality of first unit reflection signals may correspond to a respective first transducer unit in the first ultrasonic transducer array 122.
- the second reflection signal may include a plurality of second unit reflection signals obtained from the second ultrasonic transducer array 124. Each of the plurality of second unit reflection signals may correspond to a respective second transducer unit in the second ultrasonic transducer array 124.
- the first reflection signal and the second reflection signal may each be a 2D ultrasonic image of the target.
- the first reflection signal and the second reflection signal may be an electrical signal corresponding to a 2D ultrasonic image.
- the first reflection signal and the second reflection signal may collectively form a 3D ultrasonic image of the target.
- the sensor layer 130 may provide a sensor signal corresponding to a bending of the ultrasonic transducer layer 120.
- the bending may include at least one of: a first bending of the first ultrasonic transducer array 122 about the second axis 74; and a second bending of the second ultrasonic transducer array 124 about the first axis 72.
- the ultrasonic reflection signals may include phase errors such as beamforming distortion and image reconstruction distortion, for the outgoing and incoming ultrasonic waves.
- the controller 60 may correct or adjust the ultrasonic reflection signals based on the bending of the ultrasonic transducer layer 120 as determined or measured by the sensor layer 130.
- the controller 60 may correct the first reflection signal to obtain a first adjusted reflection signal based on the first bending (e.g. C2) of the first ultrasonic transducer array 122.
- the controller 60 may correct the second reflection signal to obtain a second adjusted reflection signal based on the second bending (e.g. C1) of the second ultrasonic transducer array 124.
- the corrected or adjusted ultrasonic reflection signals may then be used for volumetric estimation of the target, e.g. the bladder.
- FIG. 8C illustrates an exemplary curvature estimation using a strain sensor-based sensor layer 130.
- the controller 60 may first determine a bending in the ultrasonic probe 100 based on a resistance change in the strain sensor.
- the strain sensor may be calibrated for any bending state between a flat state and a bending state, such as a minimum radius bending state.
- One or more curvatures of the ultrasonic probe 100 may be estimated based on the strain sensor signals. Hence, phase error compensation or correction may be performed based on the estimated curvature.
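The resistance-to-curvature-to-delay chain described above can be sketched as follows. This is an illustrative model, not the patent's calibration: the linear gauge relation ΔR/R = GF·ε, the strain model ε = y/R for a sensor offset y from the neutral plane, and all numeric defaults are assumptions.

```python
import numpy as np

def bending_radius_mm(delta_r_over_r, gauge_factor=2.0, sensor_offset_mm=0.1):
    """Estimate bending radius (mm) from the strain sensor's relative
    resistance change, assuming ΔR/R = GF * strain and strain = y / R
    for a sensor a distance y from the neutral bending plane."""
    strain = delta_r_over_r / gauge_factor
    return sensor_offset_mm / strain

def delay_correction_s(n_elems, pitch_mm, radius_mm, c_m_s=1540.0):
    """Per-element delay correction (seconds) for a linear array bent onto
    an arc: each element is lifted off the nominal flat plane by
    R * (1 - cos(theta)), adding that much one-way path at broadside."""
    x_mm = (np.arange(n_elems) - (n_elems - 1) / 2) * pitch_mm  # flat positions
    theta = x_mm / radius_mm                                    # arc angle
    dz_mm = radius_mm * (1.0 - np.cos(theta))                   # lift-off depth
    return dz_mm * 1e-3 / c_m_s                                 # extra travel time

radius = bending_radius_mm(0.002)            # e.g. 0.2 % resistance change
delays = delay_correction_s(64, 0.4, radius)
```

The corrections are symmetric about the array centre and grow toward the edges, which is where beamforming phase errors from bending are largest.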
- the controller 60 may transmit ultrasonic waves towards the target, and concurrently receive both the first reflection signal and the second reflection signal.
- ultrasonic imaging may be performed in intervals or periodically. For example, an ultrasonic imaging of the subject may be performed every 3 seconds in observing the change in bladder volume responsive to a medication. Therefore, in various embodiments, the controller 60 may periodically receive both the first reflection signal and the second reflection signal for volumetric estimation.
- the multiplication coefficient (0.72, which is frequently used) is dependent on the shape of the bladder.
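The coefficient-based estimate is the widely used ellipsoid-style formula V ≈ k × width × depth × height with k = 0.72. A minimal sketch (the function name is an illustrative assumption):

```python
def bladder_volume_ml(width_cm, depth_cm, height_cm, k=0.72):
    """Ellipsoid-style bladder volume estimate: V = k * W * D * H.
    With dimensions in cm the result is in cm^3, i.e. approximately mL."""
    return k * width_cm * depth_cm * height_cm

# Example: width 8 cm, depth 6 cm, height 7 cm
v = bladder_volume_ml(8.0, 6.0, 7.0)   # 0.72 * 8 * 6 * 7 = 241.92 mL
```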
- the target surface or attachment surface such as the human torso
- the first bending of the first ultrasonic transducer array 122 may have a different curvature from the second bending of the second ultrasonic transducer array 124.
- the first bending and/or the second bending may vary during the course of ultrasonic measurement.
- the controller 60 may estimate a volume of the target using a first machine learning model.
- the controller 60 may estimate the volume of the target by providing the first adjusted reflection signal and the second adjusted reflection signal to the first machine learning model.
- the first machine learning model may be a convolutional neural network, such as a FCN32 fully convolutional network.
- the controller 60 may perform feature extraction or segmentation of the target from the ultrasonic signals or ultrasound images. As shown in FIGs. 9A and 9B, the first machine learning model may segment the target from each of the first adjusted ultrasonic signal and the second adjusted ultrasonic signal, respectively. This enables the controller 60 to obtain a first segmented target based on the first adjusted ultrasonic signal and a second segmented target based on the second adjusted ultrasonic signal. Thereafter, the controller 60 may estimate the volume of the target based on the first segmented target and the second segmented target. In an example, the first machine learning model may be trained based on a training dataset of 761 images, and validated with a validation dataset of 84 images.
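One plausible way to turn the two segmented cross-sections into a volume estimate is to take bounding-box extents from each binary mask and apply the coefficient-based formula. This is a sketch under stated assumptions: the function names, the bounding-box extent measure, and the averaging of depth across the two views are illustrative choices, not the disclosed algorithm.

```python
import numpy as np

def mask_extents_cm(mask, px_cm):
    """Bounding-box extents (horizontal, vertical) in cm of a binary
    segmentation mask, given square pixel spacing px_cm."""
    rows = np.flatnonzero(mask.any(axis=1))
    cols = np.flatnonzero(mask.any(axis=0))
    if rows.size == 0:
        return 0.0, 0.0
    return (cols[-1] - cols[0] + 1) * px_cm, (rows[-1] - rows[0] + 1) * px_cm

def volume_from_masks(mask1, mask2, px_cm, k=0.72):
    """Combine two orthogonal cross-section masks into a k*W*L*D estimate:
    width from the first view, length from the second, depth averaged."""
    w, d1 = mask_extents_cm(mask1, px_cm)
    l, d2 = mask_extents_cm(mask2, px_cm)
    return k * w * l * (d1 + d2) / 2.0
```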
- the ultrasonic signals or ultrasound images may include one or more defective region(s) of the target.
- a part of the target may be missing or distorted on the ultrasound image(s).
- each defective region may be characterized by one or a combination of: an occluded region, a missing region, and a distorted region.
- the controller 60 may be configured to estimate the defective region based on a second machine learning model.
- the controller 60 may first receive the ultrasonic signals or ultrasound images, for example, the first adjusted ultrasonic signal and the second adjusted ultrasonic signal. Thereafter, the controller 60 determines a defective region of the target in one or both of the first adjusted reflection signal and the second adjusted reflection signal. Using the second machine learning model, such as a Pix2Pix conditional generative adversarial network, the controller 60 may estimate or determine the respective defective region of the target. Thereafter, the defective region may be replaced or overlaid for bladder shape feature extraction or segmentation using the first machine learning model. This may be followed by a step of volume estimation. In some examples, the process of volume estimation of the bladder may be performed by the controller 60 quickly and locally, thus providing the subject a near real-time volume estimation. In addition, the results from the volume estimation may be used for diagnosis or tracking.
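A simple heuristic for flagging such a defective region before handing the image to the generative model is to check whether the segmented mask is cut off at the image border, as happens with a limited field of view. This is an illustrative sketch, not the patent's detection method; the function name and border check are assumptions.

```python
import numpy as np

def is_truncated(mask, border=1):
    """Heuristic 'defective region' flag: True if the segmented target's
    binary mask touches the image border, suggesting the target extends
    beyond the field of view and its contour must be estimated."""
    return bool(mask[:border, :].any() or mask[-border:, :].any()
                or mask[:, :border].any() or mask[:, -border:].any())
```

When the flag is raised, the image and mask would be passed to the inpainting model to estimate the missing contour before segmentation-based volume estimation.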
- the second machine learning model, such as a Pix2Pix conditional generative adversarial network
- the controller 60 may be configured to perform a method 700 of ultrasonic scanning.
- the method 700 of ultrasonic scanning may include: in 710, attaching the ultrasonic probe 100 as described above to a target surface, the target surface spaced apart from a target.
- the target surface may refer to an abdomen of a subject, and the target may refer to a bladder of the subject.
- the method 700 may further include: in 720, receiving a first reflection signal of the target from the first ultrasonic transducer array; in 730, receiving a second reflection signal of the target from the second ultrasonic transducer array; in 740, adjusting the first reflection signal based on the first bending to obtain a first adjusted reflection signal; and in 750, adjusting the second reflection signal based on the second bending to obtain a second adjusted reflection signal.
- the method 700 may further include: in 760, estimating a volume of the target based on the first adjusted reflection signal and the second adjusted reflection signal.
- estimating the volume of the target by providing the first adjusted reflection signal and the second adjusted reflection signal to a first machine learning model.
- the first machine learning model comprises a FCN32 fully convolutional network.
- the method may further comprise: segmenting the target from each of the first adjusted reflection signal and the second adjusted reflection signal to obtain a first segmented target and a second segmented target; and estimating the volume of the target based on the first segmented target and the second segmented target.
- the method may further comprise: estimating a defective region of the target using a second machine learning model based on at least one of: the first adjusted reflection signal and the second adjusted reflection signal.
- the second machine learning model comprises a Pix2Pix conditional generative adversarial network.
- the defective region is characterized by at least one of: an occluded region, a missing region, and a distorted region.
- the method may further comprise: concurrently receiving both the first reflection signal and the second reflection signal. In various embodiments, the method may further comprise: periodically receiving both the first reflection signal and the second reflection signal. In various embodiments, the first bending has a different curvature from the second bending. In various embodiments, each of the first bending and the second bending varies in operation. In other words, during the method of ultrasonic scanning, the first bending and the second bending may change dynamically.
- Exemplary ultrasonic scanner and ultrasonic probe for bladder volume monitoring utilize ultrasonic imaging, which is a safe diagnosis method, for real-time and extended-period bladder volume monitoring. Estimation of the bladder volume uses two orthogonal cross-section ultrasonic images to estimate three scales in three directions: width, length, and depth. Departing from conventional ultrasonic probes, which require a 90° or orthogonal rotation mid-way during the ultrasound process, the proposed wearable ultrasonic probe comprises a pair of orthogonal linear transducer arrays, which alleviates the need for probe rotation.
- the transducer number of 64 can be chosen in consideration of compatibility with existing ultrasound systems, and is also often used in existing medical probes.
- when the pitch is larger than one wavelength, the ultrasonic beam pattern exhibits strong side lobe(s), which may result in artifacts when imaging (see FIG. 14F, dotted box). Therefore, to balance the ultrasonic intensity and to mitigate the formation of side lobe(s), the transducer pitch has to be determined based on a trade-off: a large pitch reduces resolution while increasing the imaging area for the same number of channels. It was determined that a preferred pitch is one between a full transducer wavelength and half the transducer wavelength.
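The side-lobe condition can be checked numerically: for a linear array at broadside, a grating lobe appears at sin θ = λ/p once the pitch p exceeds one wavelength λ. This is a hedged sketch; the 2 MHz example frequency and 1540 m/s sound speed are assumed values, not taken from the disclosure.

```python
import math

def grating_lobe_angle_deg(pitch_mm, freq_hz, c_m_s=1540.0):
    """First grating-lobe angle (degrees) for a linear array at broadside.
    Returns None when pitch <= one wavelength, i.e. no grating lobe
    exists at any real steering angle."""
    wavelength_mm = c_m_s / freq_hz * 1000.0
    s = wavelength_mm / pitch_mm
    if s >= 1.0:
        return None          # pitch <= wavelength: lobe pushed beyond 90 deg
    return math.degrees(math.asin(s))

# A 0.4 mm pitch at 2 MHz (lambda ~ 0.77 mm) produces no grating lobe,
# whereas a 1.0 mm pitch would place one at roughly 50 degrees.
```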
- FIG. 15A shows the B-mode ultrasound image of an actual bladder
- FIG. 15B shows the B-mode ultrasound image of a synthetic bladder used to demonstrate the ultrasonic probe imaging ability.
- B-mode refers to the brightness mode of an ultrasound image.
- the simulation of artificial phantoms was done by simulating and summing the received ultrasonic fields from scatter points.
- the scatterers were extracted from an existing bladder ultrasonic image, which includes 128 scanning lines.
- a single scan line in an image can be calculated by summing the responses from the scatterers, in which the scattering strength was determined by the density and the change in ultrasonic speed in the tissue. Thereafter, the ultrasonic image was reconstructed using the signals from each line. It can be seen that the proposed ultrasound probe is able to perform ultrasonic imaging of the bladder.
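The line-by-line synthesis described above can be sketched as follows. This is an illustrative model only: the Gaussian-windowed tone burst, the sampling rate, and all parameter values are assumptions, not the simulation actually used in the disclosure.

```python
import numpy as np

def simulate_scan_line(scatterer_depths_mm, amplitudes, fs_hz=20e6,
                       f0_hz=2e6, c_m_s=1540.0, n_samples=2048):
    """Synthesize one pulse-echo scan line by summing scatterer responses:
    each scatterer contributes a short Gaussian-windowed tone burst,
    delayed by its round-trip time 2d/c and scaled by its strength."""
    t = np.arange(n_samples) / fs_hz
    line = np.zeros(n_samples)
    for d_mm, a in zip(scatterer_depths_mm, amplitudes):
        tau = 2.0 * d_mm * 1e-3 / c_m_s           # round-trip delay (s)
        envelope = np.exp(-((t - tau) * f0_hz * 2.0) ** 2)
        line += a * envelope * np.sin(2.0 * np.pi * f0_hz * (t - tau))
    return line

# Two scatterers at 10 mm and 30 mm depth, the deeper one weaker
line = simulate_scan_line([10.0, 30.0], [1.0, 0.5])
```

Summing such lines across the 128 scan positions and envelope-detecting them would reconstruct the B-mode image, as the description outlines.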
- the size of the bladder can be as large as ~10 cm when the bladder is full.
- FBW: fractional bandwidth
- the proposed ultrasonic probe enables measurement of the bladder volume, with no need for probe rotation.
- the ultrasonic probe may include two linear ultrasonic arrays/probes intersecting at 90 degrees, with each linear probe configured to obtain one respective cross-section image of the bladder.
- the two orthogonal linear probes may obtain two orthogonal cross-section images of the bladder for estimating the length, depth, and width of the bladder.
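The length, depth, and width from the two cross-sections can be combined into a volume estimate. The sketch below uses a common clinical ellipsoid approximation; the π/6 coefficient is a standard textbook choice assumed here for illustration and is not stated in this disclosure.

```python
import math

# Hedged sketch: a common clinical ellipsoid approximation for bladder
# volume from the three orthogonal scales. The pi/6 coefficient is an
# assumed textbook value, not taken from this disclosure.
def bladder_volume_ml(width_cm, length_cm, depth_cm):
    return math.pi / 6.0 * width_cm * length_cm * depth_cm  # cm^3 == mL

# e.g. a full bladder of roughly 10 cm in each direction
v = bladder_volume_ml(10.0, 10.0, 10.0)   # about 524 mL
```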
- Resolution is one key performance parameter for ultrasonic imaging.
- the axial resolution is proportional to the spatial pulse length.
- the proposed ultrasonic probe includes a backing layer coupled or attached to the ultrasonic transducer layer for decreasing vibration or oscillatory motions, improving the axial resolution.
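The axial-resolution relationship above can be sketched numerically. The cycle counts and centre frequency below are illustrative assumptions showing why a backing layer, by damping ring-down and shortening the pulse, improves axial resolution.

```python
# Sketch of the relationship: axial resolution is about half the spatial
# pulse length (pulse cycles x wavelength). Cycle counts are assumed
# illustrative values, not measurements from this disclosure.
def axial_resolution_mm(freq_hz, n_cycles, c_m_s=1540.0):
    wavelength_mm = c_m_s / freq_hz * 1e3
    return n_cycles * wavelength_mm / 2.0

without_backing = axial_resolution_mm(3e6, n_cycles=6)  # long ring-down
with_backing = axial_resolution_mm(3e6, n_cycles=2)     # damped pulse
```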
- a sensor layer for measuring bending in the ultrasonic probe may be a strain sensor layer integrated with the ultrasonic probe.
- the strain sensor deforms accordingly, causing a resistance change in the strain sensor.
- the bending radius of the ultrasonic probe is estimated according to the resistance change.
- the phase error is then calculated based on the bending radius, and the phase error is corrected or compensated in real time.
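The compensation chain just described (resistance change, then strain, then bending radius, then delay correction) can be sketched as below. The gauge factor, gauge offset from the neutral bending plane, element count, and pitch are all assumed illustrative values.

```python
import numpy as np

GF = 2.0         # assumed strain-gauge gauge factor
T_OFF = 0.5e-3   # assumed gauge offset from the neutral bending plane, m
C = 1540.0       # assumed sound speed in tissue, m/s

def bending_radius_m(delta_r_over_r0):
    """Strain from the fractional resistance change, then r = offset / strain."""
    strain = delta_r_over_r0 / GF
    return T_OFF / strain

def element_delay_error_s(x_m, radius_m):
    """Beamforming assumes a flat array; on an arc of radius r an element
    at lateral position x sits deeper by r - sqrt(r^2 - x^2). Return the
    resulting one-way delay error per element."""
    dz = radius_m - np.sqrt(radius_m ** 2 - x_m ** 2)
    return dz / C

x = (np.arange(64) - 31.5) * 0.4e-3        # 64 elements, 0.4 mm pitch
r = bending_radius_m(0.01)                 # 1 % resistance change
err = element_delay_error_s(x, r)
# compensation: subtract err from each channel's beamforming delay
```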
- a wearable ultrasonic imaging method for quantitatively characterizing the bladder volume using a pair of orthogonal linear transducer arrays which may capture two orthogonal cross-section images of the bladder.
- a real-time phase error compensation method to enhance the accuracy of the imaging. With the integration of a sensor layer in the ultrasonic probe, real-time phase error compensation may be performed, addressing the change in probe curvature during imaging due to movement of the subject, such as breathing.
- the proposed ultrasonic scanner and ultrasonic probe may be used in the following non-limiting scenarios:
- Post-void residual urine measurement: The volume of the post-void residual urine may be measured immediately after voiding. This avoids inaccuracies caused by long waits after urination at a medical centre.
- Urination speed characterization: A slow urine flow rate may indicate an obstruction at the bladder neck or in the urethra, an enlarged prostate, or a weak bladder. Ultrasound imaging speed can be as high as dozens of frames per second. During the voiding process, the urination speed between any two frames can be estimated, which can aid diagnosis of lower urinary tract symptoms.
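The per-frame flow-rate estimate described above reduces to a volume difference over a frame interval. The frame rate and volumes below are illustrative; per-frame bladder volumes are assumed to come from the imaging pipeline described above.

```python
# Sketch: urination speed from bladder volume estimates in consecutive
# frames. Values are illustrative assumptions.
def flow_rate_ml_s(v_prev_ml, v_next_ml, frame_interval_s):
    """Voiding flow rate between two frames (positive while emptying)."""
    return (v_prev_ml - v_next_ml) / frame_interval_s

rate = flow_rate_ml_s(400.0, 399.0, 1.0 / 20.0)  # 20 frames per second
```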
- On-demand self-catheterization: For bladder sensation problems, a common therapy is regular intermittent self-catheterization. However, the emptying interval has to be chosen properly. Real-time monitoring which can provide on-demand self-catheterization will make patients' lives easier.
Abstract
Disclosed herein is an ultrasonic probe. The ultrasonic probe comprises: a flexible base; a first ultrasonic transducer array coupled to the flexible base, the first ultrasonic transducer array arranged along a first axis; a second ultrasonic transducer array coupled to the flexible base, the second ultrasonic transducer array arranged along a second axis, the second axis orthogonal to the first ultrasonic transducer array; and a sensor layer coupled to the first ultrasonic transducer array and the second ultrasonic transducer array, the sensor layer being configured to provide a sensor signal corresponding to at least one of: a first bending of the first ultrasonic transducer array about the second axis; and a second bending of the second ultrasonic transducer array about the first axis.
Description
ULTRASONIC PROBE, ULTRASONIC SCANNER AND A METHOD OF ULTRASONIC SCANNING
CROSS-REFERENCE TO RELATED APPLICATION
[0001] This application claims the benefit of priority to Singapore patent application no. 10202400458X which was filed on 20 February 2024, the contents of which are hereby incorporated by reference in their entirety for all purposes.
TECHNICAL FIELD
[0002] This application relates to the field of ultrasonic imaging, and more particularly to an ultrasonic probe, an ultrasonic scanner, and a method of ultrasonic scanning.
BACKGROUND
[0003] Lower urinary tract symptoms (LUTSs) impact over 2.3 billion individuals globally each year. Chronic urinary retention patients are unable to completely empty their bladders. The chronic urinary retention symptom often results in complications such as urinary tract infection, bladder and kidney damage, and urinary incontinence.
[0004] Ultrasonic imaging or ultrasound imaging is often recommended for bladder volume estimation in LUTS initial assessment, in addition to the evaluation of treatment effects. However, there are limitations for conventional ultrasonic imaging methods when used for characterizing bladder volume, such as: (1) bulky equipment which is often only available in medical facilities such as clinics or hospitals, requiring patients to be present at the medical facility for the imaging process; (2) inaccurate post-void residual urine volume estimation caused by long waiting times between voiding and measurement in medical facilities; (3) inability to provide real-time monitoring.
SUMMARY
[0005] According to an aspect, disclosed herein is an ultrasonic probe. The ultrasonic probe comprises a flexible base; a first ultrasonic transducer array coupled to the flexible base, the first ultrasonic transducer array arranged along a first axis; a second ultrasonic transducer array coupled to the flexible base, the second ultrasonic transducer array arranged along a second axis, the second axis orthogonal to the first ultrasonic transducer array; and a sensor layer coupled to the first ultrasonic transducer array and the second ultrasonic transducer array, the sensor layer being configured to provide a sensor signal corresponding to at least one of: a first bending of the first ultrasonic transducer array about the second axis; and a second bending of the second ultrasonic transducer array about the first axis.
[0006] According to another aspect, disclosed herein is an ultrasonic scanner. The ultrasonic scanner comprises: the ultrasonic probe as described above, the ultrasonic probe attachable to a target surface; a controller in signal communication with the ultrasonic probe, wherein the controller is configured to: receive a first reflection signal of a target from the first ultrasonic transducer array; receive a second reflection signal of the target from the second ultrasonic transducer array; adjust the first reflection signal based on the first bending to obtain a first adjusted reflection signal; and adjust the second reflection signal based on the second bending to obtain a second adjusted reflection signal.
[0007] According to yet another aspect, disclosed herein is a method of ultrasonic scanning. The method comprises: attaching the ultrasonic probe as described above to a target surface, the target surface spaced apart from a target; receiving a first reflection signal of the target from the first ultrasonic transducer array; receiving a second reflection signal of the target from the second ultrasonic transducer array; adjusting the first reflection signal based on the first bending
to obtain a first adjusted reflection signal; and adjusting the second reflection signal based on the second bending to obtain a second adjusted reflection signal.
BRIEF DESCRIPTION OF THE DRAWINGS
[0008] Various embodiments of the present disclosure are described below with reference to the following drawings:
[0009] FIG. 1 is a schematic diagram of an ultrasonic scanner according to various embodiments;
[0010] FIG. 2 is a perspective view of an ultrasonic probe according to various embodiments;
[0011] FIG. 3 is an exploded view of the ultrasonic probe of FIG. 2;
[0012] FIG. 4 is a partial exploded view of the ultrasonic probe of FIG. 2;
[0013] FIG. 5 is a partial perspective view of an ultrasonic probe according to various embodiments;
[0014] FIGs. 6 A and 6B show respective top views of ultrasonic transducer layers according to various embodiments;
[0015] FIGs. 7A to 7C show a side view of an ultrasonic probe with different curvatures according to various embodiments;
[0016] FIGs. 8A and 8B show the respective phase errors of beamforming distortion and image reconstruction distortion;
[0017] FIG. 8C is a flowchart of a strain sensor-based method for phase error correction according to various embodiments;
[0018] FIGs. 9A and 9B show a bladder ultrasound image and a segmented bladder area using an FCN model according to various embodiments;
[0019] FIG. 10 is an image of an incomplete bladder image due to limited field of view, with the dotted line representing the estimated bladder contour;
[0020] FIG. 11 is a flowchart of a proposed automatic volume estimation algorithm according to various embodiments;
[0021] FIG. 12 is a flowchart of a method of ultrasonic scanning according to various embodiments;
[0022] FIG. 13 A is an image of a flexible PI PCB (Polyimide PCB) based orthogonal ultrasonic probe;
[0023] FIG. 13B is an exploded view of an ultrasonic probe according to various embodiments;
[0024] FIGs. 14A to 14C show the ultrasonic pressure intensity when the transducer number (N) is 1, 64, and 128, respectively. The pitch, that is, the space one transducer occupies, is 0.4 mm. The ultrasonic pressure is normalized to 64 dB and the focal length is 50 mm;
[0025] FIGs. 14D to 14F show the ultrasonic pressure intensity when the pitch is 0.2, 0.4, and 0.8 mm, and N = 64;
[0026] FIG. 15A is an exemplary bladder ultrasonic image;
[0027] FIG. 15B is an exemplary simulated bladder ultrasonic image, with parameters of the linear probe: frequency = 3 MHz, N = 64, pitch = 0.4 mm, focal depth = 50 mm;
[0028] FIG. 16 shows the steps of an assembly process of an ultrasonic probe according to various embodiments;
[0029] FIG. 17A shows an exemplary ultrasonic signal of a bare transducer;
[0030] FIG. 17B shows an exemplary ultrasonic signal of a transducer with a backing layer;
[0031] FIG. 18A shows the measurements of the toughness between the PI substrate and polyacrylate adhesive, with sample width of 0.5 cm;
[0032] FIG. 18B shows the measurements of the toughness between the polyacrylate adhesive and the human skin, with sample width of 2.5 cm;
[0033] FIG. 19 shows an exemplary image of a strain sensor of an ultrasonic probe according to various embodiments;
[0034] FIG. 20 shows the maximum ultrasonic beamforming error before and after strain sensor-based phase error compensation;
[0035] FIGs. 21 to 25 show the various performance parameters of the proposed ultrasonic probe according to various embodiments;
[0036] FIG. 26 shows the ultrasound images of a phantom bladder using the proposed ultrasonic probe according to various embodiments;
[0037] FIGs. 27A to 27D show a comparison of bladder volume estimation between a manual method and a fully convolutional neural network (FCN) method;
[0038] FIG. 28 shows a comparison of bladder volume estimation between the proposed ultrasonic probe and conventional systems;
[0039] FIG. 29A shows a temperature profile of the proposed ultrasonic probe according to various embodiments; and
[0040] FIG. 29B shows a temperature data over an hour of the proposed ultrasonic probe of FIG. 29A.
DETAILED DESCRIPTION
[0041] The following detailed description is made with reference to the accompanying drawings, showing details and embodiments of the present disclosure for the purposes of illustration. Features that are described in the context of an embodiment may correspondingly be applicable to the same or similar features in the other embodiments, even if not explicitly described in these other embodiments. Additions and/or combinations and/or alternatives as
described for a feature in the context of an embodiment may correspondingly be applicable to the same or similar feature in the other embodiments.
[0042] In the context of various embodiments, the articles “a”, “an” and “the” as used with regard to a feature or element include a reference to one or more of the features or elements.
[0043] In the context of various embodiments, the term “about” or “approximately” as applied to a numeric value encompasses the exact value and a reasonable variance as generally understood in the relevant technical field, e.g., within 10% of the specified value.
[0044] As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
[0045] The term “pose” may include a position and an orientation of an object or part of an object. The term “position” may refer to a location or coordinate (for example, X-coordinate, Y-coordinate, Z-coordinate) of an object or part of an object in a space or a frame. The term “orientation” may refer to a facing or angle (for example, an X-direction vector, a Y-direction vector, a Z-direction vector) of an object or part of an object in a space or a frame.
[0046] Patients with lower urinary tract symptoms often experience urinary retention or bladder fullness sensing problems which require monitoring of the bladder volume. Traditional ultrasonic bladder volume estimation procedures require the patients to visit medical facilities frequently, often with long waiting times prior to the actual measurement. The waiting time may result in inaccuracy of the ultrasonic measurements, as the patient may not be in the optimum state for measurement. For example, the patient may have just visited the lavatory prior to measurement. In addition, conventional ultrasonic methods do not provide a real-time or long-term monitoring modality, which is often essential for patients who require self-catheterization.
[0047] In addition, conventional ultrasound measurement methods often require a 3D scanner and 3D surface reconstruction for estimating an attachment surface position. This is a
complex process and impractical to be adapted for home use. In addition, the attachment surface position may be constantly changing due to the patient’s breathing or body posture change. This introduces errors in the attachment surface position, leading to imaging phase errors. With the various limitations, real-time imaging is not viable using conventional ultrasound imaging methods.
[0048] Bladder volume measurement often requires multiple ultrasonic images to be obtained, followed by the extraction of the bladder width, bladder depth, and bladder length for volume estimation. For a conventional ultrasonic probe/procedure, the operator has to hold onto the probe during imaging and repositioning. The manual process requires precise repositioning (such as a 90° probe rotation) of the probe. However, in the actual process, the ultrasonic probe may be positioned and/or oriented at different positions relative to the target (such as the bladder) in obtaining the multiple ultrasonic images for volumetric estimation. The inconsistency in positioning and orientation may introduce undesirable measurement errors during measurement, resulting in inaccurate bladder volume estimation. Error may be introduced due to the assumption of exact orthogonality between the repositioned probes / captured ultrasonic images.
[0049] In addition, as the ultrasonic images are acquired sequentially, there is a time space/gap between the acquisition of the ultrasonic images. Sequential acquisition of the ultrasonic images limits or even prohibits the possibility of real-time ultrasonic imaging and measurements. As the ultrasonic probe may shift or be in motion during the measurement process, due to breathing or accidental movements, further error may be introduced during volume estimation. Hence, for the purpose of medical condition observation or diagnosis, such as the above-mentioned self-catheterization time schedules, the bladder volume estimation may only be observed in an intermittent manner instead of the desirable real-time continuous manner.
[0050] In view of the above non-exhaustive limitations, proposed herein is an ultrasonic scanner for real-time ultrasonic measurement. In addition, a proposed ultrasonic probe may be adapted for real-time monitoring for medical diagnosis. In various embodiments of the proposed ultrasonic probe, ultrasound image(s) (cross-section views) of a target may be obtained concurrently, allowing monitoring to be performed in real time.
[0051] In various exemplary embodiments, the target may be a bladder of a subject. As such, the volume of the bladder may be computed and determined using the ultrasound images in real time. The proposed ultrasonic probe and ultrasonic scanner may provide a real-time, long-term, and high-accuracy bladder volume monitoring solution for home use. In applications such as the establishment of appropriate self-catheterization time schedules for patients with bladder fullness sensation problems, real-time bladder volume monitoring aids in diagnosis and treatment.
[0052] The proposed ultrasonic probe allows volume estimation without the need for repositioning or rotating the ultrasonic probe. This advantageously alleviates the error caused by manual repositioning of the ultrasonic probe for traditional ultrasonic imaging procedure/probe as well as the error in volume estimation incurred due to the manual repositioning.
[0053] The proposed wearable ultrasonic probe may include a pair of ultrasonic transducer array sensors. The pair of ultrasonic transducer array sensors may be orthogonal to one another. The ultrasonic probe may be configured to measure the bladder width, depth, and length in a single measurement for volume estimation.
[0054] In various embodiments, the proposed ultrasonic probe may be a wearable device, or a wearable probe which is deformable relative to a target surface or an attachment surface. The target surface may be spaced apart from a target of ultrasonic imaging. As the wearable probe may conform to the skin contour or body contour of a subject, the bending of the probe may
cause phase error and artifacts in the captured ultrasonic images. Hence, the proposed ultrasonic scanner / ultrasonic probe comprises real-time compensation of the phase error for high-quality ultrasonic imaging. The compensation or correction for phase error may be performed using a bending sensor for measuring the curvature(s) of the ultrasonic probe. In various embodiments, the bending sensor may be a strain-based sensor integrated with the ultrasonic probe. Such a strain-based sensor may deform together with the ultrasonic probe, such that a bending radius of the ultrasonic probe may be measured/estimated for phase error compensation in real time.
[0055] FIG. 1 illustrates an ultrasonic scanner 50 for ultrasonic imaging according to various embodiments of the present disclosure. The ultrasonic scanner 50 may include an ultrasonic probe 100 and a controller 60 in signal communication with the ultrasonic probe 100. The ultrasonic probe 100 may be attachable to a target surface, such as a torso of a subject 80. The ultrasonic probe 100 may also be wearable by a subject 80. In exemplary embodiments, the ultrasonic probe 100 may be worn around a waist/hip of the subject 80 for ultrasonic imaging of the bladder. It may be appreciated that the ultrasonic probe 100 may be worn on other parts of the subject’s 80 body according to the respective medical procedure or application.
[0056] According to various embodiments, the controller 60 may be configured to transmit control signals to and receive sensor signals from the ultrasonic probe 100. Further, for embodiments where the ultrasonic probe 100 does not include a dedicated power source, the controller 60 may also transmit power signals or act as a power source to the ultrasonic probe 100. In various embodiments, the controller 60 may be connected to the ultrasonic probe 100 via a wired connection, such as via a signal cable. Alternatively, the controller 60 may be connected to the ultrasonic probe 100 via a wireless connection or a wireless protocol, such as Bluetooth or Wi-Fi. In various embodiments, for instance where the controller 60 is connected to the ultrasonic probe 100 via a wired connection, the controller 60 and the ultrasonic probe 100 may be located in a common location. This enables sensor and measurement data to be
stored locally in the ultrasonic scanner 50. Alternatively, the controller 60 may be located remotely from the ultrasonic probe 100, such that sensor data may be stored in a remote location, such as in the cloud or in a remote server.
[0057] FIGs. 2 to 4 illustrate an ultrasonic probe 100 according to various embodiments of the disclosure. The ultrasonic probe 100 may comprise a flexible base 110. The flexible base 110 may be a planar member. When the ultrasonic probe 100 or the flexible base 110 is in a neutral state or a generally flat planar state, the flexible base 110 may define a probe plane 70. The flexible base 110 may be bendable or deformable in bending. In other words, the flexible base 110 may bend towards and/or away from the probe plane 70 into a bending state. In various embodiments, the flexible base 110 may be substantially non-extensible along the probe plane 70, such that stretching or extension of the flexible base 110 is limited. In other embodiments, the flexible base 110 may be extensible along the probe plane 70, such that the flexible base 110 is stretchable.
[0058] In various embodiments, the ultrasonic probe 100 may include an adhesive contact surface 112 for attaching the ultrasonic probe 100 relative to a target surface, such as an abdomen of a subject. In various embodiments, the ultrasonic probe 100 may be deformable to conform to a contour of a target surface.
[0059] Referring to FIG. 2, the ultrasonic probe 100 may further comprise an ultrasonic transducer layer 120 and a sensor layer 130, both of which are coupled to the flexible base 110. The ultrasonic transducer layer 120 and the sensor layer 130 may be disposed on opposing sides or surfaces of the flexible base 110 along a thickness axis 76 of the flexible base 110. The thickness axis 76 is in alignment with a thickness direction of the flexible base 110 or the ultrasonic probe 100. In other embodiments, the ultrasonic transducer layer 120 and the sensor layer 130 may be disposed on a common side of the flexible base 110.
[0060] In various embodiments, the ultrasonic transducer layer 120 may be configured in a transverse or cross configuration. Hence, the ultrasonic transducer layer 120 may define two orthogonal axes, such as a first axis 72 and a second axis 74. The first axis 72 and the second axis 74 may both be co-planar or parallel to the probe plane 70 when the ultrasonic probe 100 or the flexible base 110 is in the neutral state. Both of the first axis 72 and the second axis 74 may be orthogonal to the thickness axis 76 of the flexible base 110.
[0061] The ultrasonic transducer layer 120 and the sensor layer 130 may both be bendable in response to bending of the flexible base 110. This enables the ultrasonic probe 100 to be conformable to a target surface or an attachment surface of the subject, such as an abdomen of the subject, for bladder ultrasound measurement. The target surface may be spaced apart from a target of ultrasonic imaging, such as the bladder of the subject.
[0062] In various embodiments, the sensor layer 130 may bend together or in tandem with the ultrasonic transducer layer 120. Hence, the sensor layer 130 may be configured to provide a sensor signal corresponding to a bending of the ultrasonic transducer layer 120. Therefore, bending of the ultrasonic transducer layer 120 may be measured and estimated by the sensor layer 130. As an exemplary embodiment, the sensor signal may correspond to: a first bending of the ultrasonic transducer layer 120 about the second axis 74 and/or a second bending of the ultrasonic transducer layer 120 about the first axis 72. Hence, the sensor signal may collectively represent one of or a combination of the first bending and the second bending.
[0063] In various embodiments, the ultrasonic probe 100 may be a wearable probe. Hence, the ultrasonic probe 100 may further include a belt 102 or a wearable band for attaching to a subject. The belt 102 may be an elastic belt which applies a compressive force on the ultrasonic probe 100 such that the ultrasonic probe 100 conforms to the target surface of the subject, such as the abdomen of the subject. In other embodiments, the belt 102 may include a fastening and
tightening mechanism to enable the ultrasonic probe 100 to conform to the target surface of the subject.
[0064] Referring to FIG. 3, in various embodiments, the ultrasonic transducer layer 120 may comprise a first ultrasonic transducer array 122 coupled to the flexible base 110 and a second ultrasonic transducer array 124 also coupled to the flexible base 110. The first ultrasonic transducer array 122 may be arranged along the first axis 72 and the second ultrasonic transducer array 124 may be arranged along the second axis 74. As the first axis 72 is orthogonal to the second axis 74, the first ultrasonic transducer array 122 may be said to be orthogonal to the second ultrasonic transducer array 124.
[0065] The ultrasonic probe 100 may be configured in a multi-layer structure along the thickness axis 76. In some embodiments, the flexible base 110 may be configured as a printed circuit board (PCB) comprising a printed circuit layer 140 disposed thereon. The printed circuit layer 140 may include electrodes for connecting to the ultrasonic transducer layer 120 as well as the sensor layer 130. In addition, the ultrasonic probe 100 may include a ground electrode 150 coupled to the ultrasonic transducer layer 120, such that the printed circuit layer 140 and the ground electrode 150 sandwich the ultrasonic transducer layer 120. In various embodiments, an Electro Magnetic Interference (EMI) shielding layer 160 may be disposed between the ultrasonic transducer layer 120 and the sensor layer 130. The EMI shielding layer 160 may shield the ultrasonic transducer layer 120 and/or the sensor layer 130 from electromagnetic interferences.
[0066] In various embodiments, the sensor layer 130 may be coupled to the first ultrasonic transducer array 122 and the second ultrasonic transducer array 124 such that there is minimal relative movement between the sensor layer 130 and each of the first ultrasonic transducer array 122 and the second ultrasonic transducer array 124. The sensor layer 130 and the ultrasonic transducer layer 120 may each be fixedly coupled to each other via the intermediate layers, such
as the flexible base 110 and the EMI shielding layer 160. For example, the sensor layer 130 and the ultrasonic transducer layer 120 may be fixedly coupled to each other via the intermediate layers using adhesives. This enables the sensor layer 130 to bend together with the first ultrasonic transducer array 122 and the second ultrasonic transducer array 124. Hence, the sensor layer 130 may be configured to provide a sensor signal corresponding to one of or a combination of: a first bending of the first ultrasonic transducer array 122 about the second axis 74; and a second bending of the second ultrasonic transducer array 124 about the first axis 72. [0067] In various embodiments, the first ultrasonic transducer array 122 and the second ultrasonic transducer array 124 may be aligned along the thickness axis 76 such that the first ultrasonic transducer array 122 and the second ultrasonic transducer array 124 are disposed on a common plane 73. The common plane 73 may be parallel to the probe plane 70. In various embodiments, the sensor layer 130 may be disposed adjacent to the common plane 73 along the thickness axis 76 of the flexible base 110.
[0068] Further referring to FIG. 4, in various embodiments, the alignment of the first ultrasonic transducer array 122 and the second ultrasonic transducer array 124 along the thickness axis 76 allows the first ultrasonic transducer array 122 and the second ultrasonic transducer array 124 to intersect with or overlap on each other. The first ultrasonic transducer array 122 and the second ultrasonic transducer array 124 may overlap at an intersecting zone 121. The intersecting zone 121 may define an imaging datum or a reference position for each of the first ultrasonic transducer array 122 and the second ultrasonic transducer array 124 during ultrasonic imaging/measurement. In other words, the intersecting zone 121 may determine a positional and/or orientational reference between the first ultrasonic transducer array 122 and the second ultrasonic transducer array 124.
[0069] In various embodiments, due to the overlap between the first ultrasonic transducer array 122 and the second ultrasonic transducer array 124, the intersecting zone 121 may also
have a higher transducer array density in comparison to the other zones of the first ultrasonic transducer array 122 and the second ultrasonic transducer array 124.
[0070] In some embodiments, the first ultrasonic transducer array 122 and the second ultrasonic transducer array 124 may have a common length. However, in alternative embodiments, the first ultrasonic transducer array 122 and the second ultrasonic transducer array 124 may have differing lengths. In an example where the ultrasonic probe 100 is placed across an abdomen of a subject, the first ultrasonic transducer array 122 may be attached to align with a height direction of the subject and the second ultrasonic transducer array 124 may align with a lateral direction of the subject. As the second ultrasonic transducer array 124 has to bend around the subject’s abdomen, the second ultrasonic transducer array 124 may bend more than the first ultrasonic transducer array 122. Hence, the second ultrasonic transducer array 124 may be longer than the first ultrasonic transducer array 122.
[0071] Still referring to FIG. 4, in various embodiments, the sensor layer 130 may be a strain-based sensor. In an exemplary embodiment, the sensor layer 130 may comprise one or more first strain sensors 132/136 arranged along the first axis 72, and one or more second strain sensors 134/138 arranged along the second axis 74. As such, the first strain sensors 132/136 may measure a first bending of the first ultrasonic transducer array 122 about the second axis 74, and the second strain sensors 134/138 may measure a second bending of the second ultrasonic transducer array 124 about the first axis 72. In various embodiments, the strain sensors 132/134/136/138 may be strain gauges. In other embodiments, the sensor layer 130 may comprise a non-strain-based bending sensor, such as a MEMS-based bending sensor, an optical fiber-based bending sensor, etc.
[0072] Referring now to FIG. 5, in various embodiments, the ultrasonic transducer layer 120 may define a transducer zone 123, and the sensor layer 130 may define a sensing zone 133. The transducer zone 123 may correspond to a transducer area for transmitting and receiving
ultrasonic waves. The sensing zone 133 may correspond to a sensing area sensitive to bending of the ultrasonic transducer layer 120. In some embodiments, the sensing zone 133 may be sensitive to the first bending of the first ultrasonic transducer array 122 about the second axis 74, and to the second bending of the second ultrasonic transducer array 124 about the first axis 72.
[0073] In various embodiments, the sensing zone 133 may cover the transducer zone 123. The sensing zone 133 may cover the first ultrasonic transducer array 122 along the first axis 72 and the second ultrasonic transducer array 124 along the second axis 74.
[0074] In various embodiments, the flexible base 110 may be configured as a backing layer 110 disposed between the sensor layer 130 and the ultrasonic transducer layer 120. The backing layer 110 may have a higher stiffness than each of the first ultrasonic transducer array 122 and the second ultrasonic transducer array 124. It was found that a backing layer 110 with higher stiffness than the ultrasonic transducer arrays 122/124 improves the axial resolution of the ultrasonic measurements.
[0075] Referring to FIGs. 4 and 5, in various embodiments, the first ultrasonic transducer array 122 may comprise a plurality of first transducer units 125. Similarly, the second ultrasonic transducer array 124 may also comprise a plurality of second transducer units 127. In various embodiments, the plurality of first transducer units 125 of the first ultrasonic transducer array 122 may align along the first axis 72 and the plurality of second transducer units 127 of the second ultrasonic transducer array 124 may align along the second axis 74. The first ultrasonic transducer array 122 may obtain a first reflection signal corresponding to a first ultrasonic image of the target along the first axis 72. The second ultrasonic transducer array 124 may obtain a second reflection signal corresponding to a second ultrasonic image of the target along the second axis 74.
[0076] In various embodiments, the first ultrasonic transducer array 122 may also include first transducer units 125 aligned along the second axis 74. Similarly, the second ultrasonic transducer array 124 may also include second transducer units 127 aligned along the first axis 72. However, it may be appreciated that the first reflection signal from the first ultrasonic transducer array 122 may still correspond to the first ultrasonic image of the target along the first axis 72, and the second reflection signal from the second ultrasonic transducer array 124 may still correspond to the second ultrasonic image of the target along the second axis 74.
[0077] In various embodiments, each pair of first transducer units 125 and/or each pair of second transducer units 127 may define a plurality of transducer pitches (P). In various embodiments, the plurality of transducer pitches (P) may define a common pitch length. In various embodiments, each of the plurality of transducer pitches (P) may be in a range between a full transducer wavelength (λ) and half the transducer wavelength (λ/2).
[0078] In various embodiments, the ultrasonic transducer layer 120 may include multiple first ultrasonic transducer arrays and multiple second ultrasonic transducer arrays. As shown in FIG. 6A, the ultrasonic transducer layer 120 may include two first ultrasonic transducer arrays 122a/122b and two second ultrasonic transducer arrays 124a/124b. Further, as shown in FIG. 6B, the ultrasonic transducer layer 120 may include three first ultrasonic transducer arrays 122a/122b/122c and three second ultrasonic transducer arrays 124a/124b/124c. The number of transducer arrays may be configured according to the application, and is not limited to the numbers disclosed herein.
[0079] FIGs. 7A to 7C illustrate the ultrasonic probe 100 in various bending states. FIG. 7A shows the ultrasonic probe 100 in a neutral state or a planar state with minimal curvature (C0) or bending. The ultrasonic probe 100 may assume the neutral state when the ultrasonic probe
100 is placed on a flat surface or prior to being worn by a subject. FIG. 7B shows a first side view of an ultrasonic probe 100A when viewed along the first axis 72. The ultrasonic probe
100A may be in a bent state with a first curvature (C1). FIG. 7C shows a second side view of another ultrasonic probe 100B when viewed along the second axis 74. The ultrasonic probe 100B may be in a bent state with a second curvature (C2). It may be appreciated that when the ultrasonic probe 100 is attached to a target surface of a subject, the ultrasonic probe 100 will conform to the target surface, bending along both the first axis 72 and the second axis 74. As such, the bending of the ultrasonic probe 100 will be measured by the sensor layer 130 for phase error correction.
[0080] In various embodiments, during operation of the ultrasonic scanner 50, i.e., when the ultrasonic probe 100 is attached to a subject, the ultrasonic probe 100 may conform to a target surface of the subject. The ultrasonic probe 100 may be configured to obtain ultrasonic reflection signals of a target, such as a bladder of the subject, using the first ultrasonic transducer array 122 and the second ultrasonic transducer array 124. In various embodiments, the ultrasonic reflection signals may include a first reflection signal of the target received from the first ultrasonic transducer array 122, and a second reflection signal of the target from the second ultrasonic transducer array 124.
[0081] As exemplary embodiments, the first reflection signal may include a plurality of first unit reflection signals obtained from the first ultrasonic transducer array 122. Each of the plurality of first unit reflection signals may correspond to a respective first transducer unit in the first ultrasonic transducer array 122. Similarly, the second reflection signal may include a plurality of second unit reflection signals obtained from the second ultrasonic transducer array 124. Each of the plurality of second unit reflection signals may correspond to a respective second transducer unit in the second ultrasonic transducer array 124.
[0082] As exemplary embodiments, the first reflection signal and the second reflection signal may each be a 2D ultrasonic image of the target. In other embodiments, the first reflection signal and the second reflection signal may be an electrical signal corresponding to a 2D
ultrasonic image. In various embodiments, the first reflection signal and the second reflection signal may collectively form a 3D ultrasonic image of the target.
[0083] Sequentially or concurrently with obtaining ultrasonic reflection signals, the sensor layer 130 may provide a sensor signal corresponding to a bending of the ultrasonic transducer layer 120. In various embodiments, the bending may include at least one of: a first bending of the first ultrasonic transducer array 122 about the second axis 74; and a second bending of the second ultrasonic transducer array 124 about the first axis 72.
[0084] Referring to FIGs. 8A and 8B, due to bending of the ultrasonic probe 100 and/or the ultrasonic transducer layer 120, the ultrasonic reflection signals may include phase errors, such as beamforming distortion and image reconstruction distortion, affecting both the outgoing and incoming ultrasonic waves. Hence, in various embodiments, the controller 60 may correct or adjust the ultrasonic reflection signals based on the bending of the ultrasonic transducer layer 120 as determined or measured by the sensor layer 130. In various embodiments, the controller 60 may correct the first reflection signal to obtain a first adjusted reflection signal based on the first bending (e.g. C2) of the first ultrasonic transducer array 122. Similarly, the controller 60 may correct the second reflection signal to obtain a second adjusted reflection signal based on the second bending (e.g. C1) of the second ultrasonic transducer array 124. The corrected or adjusted ultrasonic reflection signals may then be used for volumetric estimation of the target, e.g. the bladder.
[0085] FIG. 8C illustrates an exemplary curvature estimation using a strain sensor-based sensor layer 130. The controller 60 may first determine a bending in the ultrasonic probe 100 based on a resistance change in the strain sensor. The strain sensor may be calibrated for any bending state between a flat state and a bending state, such as a minimum radius bending state.
One or more curvatures of the ultrasonic probe 100 may be estimated based on the strain sensor
signals. Hence, phase error compensation or correction may be performed based on the estimated curvature.
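The curvature estimation above can be sketched as a linear mapping from the strain sensor's resistance reading to curvature between two calibration states; the function and calibration values below are hypothetical, assuming only the resistance-curvature proportionality stated in the text.

```python
def estimate_curvature(resistance, r_flat, r_bent, c_bent):
    """Map a strain-sensor resistance reading to a curvature (1/mm).

    r_flat: calibrated resistance in the flat state (zero curvature)
    r_bent: calibrated resistance at the bending state with curvature c_bent
    Resistance is assumed linearly proportional to curvature, per the text.
    """
    return c_bent * (resistance - r_flat) / (r_bent - r_flat)

# Hypothetical calibration: 100.0 ohm flat, 104.0 ohm at a 50 mm bending radius.
curvature = estimate_curvature(102.0, 100.0, 104.0, 1.0 / 50.0)
radius_mm = 1.0 / curvature  # estimated bending radius, here 100 mm
```

The estimated curvature (or radius) would then feed the per-channel delay correction used in beamforming.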
[0086] In various embodiments, to enable real-time ultrasonic imaging or measurement, the controller 60 may transmit ultrasonic waves towards the target, and concurrently receive both the first reflection signal and the second reflection signal. In some embodiments, ultrasonic imaging may be performed in intervals or periodically. For example, ultrasonic imaging of the subject may be performed every 3 seconds to observe the change in bladder volume responsive to a medication. Therefore, in various embodiments, the controller 60 may periodically receive both the first reflection signal and the second reflection signal for volumetric estimation.
[0087] In various embodiments, the bladder volume may be estimated from the ultrasonic reflection signals or the adjusted ultrasonic reflection signals by the formula: Volume = 0.72*W*L*D, where W, L, and D are the width, length, and depth of the bladder, respectively. The multiplication coefficient (0.72, a frequently used value) is dependent on the shape of the bladder.
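As a minimal sketch, the volume formula above can be expressed directly in code; the example dimensions are hypothetical.

```python
def bladder_volume_ml(width_cm, length_cm, depth_cm, coeff=0.72):
    """Bladder volume estimate V = coeff * W * L * D, with W, L, D in cm.

    The default coefficient 0.72 is the shape-dependent factor given in
    the description; the result is in millilitres (cm^3).
    """
    return coeff * width_cm * length_cm * depth_cm

# Hypothetical bladder dimensions of 8 x 9 x 6 cm:
vol = bladder_volume_ml(8.0, 9.0, 6.0)  # 0.72 * 432 = 311.04 ml
```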
[0088] As the target surface or attachment surface, such as the human torso, may be a non-spherical surface or area, in various embodiments, the first bending of the first ultrasonic transducer array 122 may have a different curvature from the second bending of the second ultrasonic transducer array 124. In addition, as the subject may not be stationary during measurement due to motion of the subject or breathing, the first bending and/or the second bending may vary during the course of ultrasonic measurement.
[0089] In various embodiments, the controller 60 may estimate a volume of the target using a first machine learning model. The controller 60 may estimate the volume of the target by providing the first adjusted reflection signal and the second adjusted reflection signal to the first
machine learning model. As exemplary embodiments, the first machine learning model may be a convolutional neural network, such as an FCN32 fully convolutional network.
[0090] The controller 60 may perform feature extraction or segmentation of the target from the ultrasonic signals or ultrasound images. As shown in FIGs. 9A and 9B, the first machine learning model may segment the target from each of the first adjusted ultrasonic signal and the second adjusted ultrasonic signal, respectively. This enables the controller 60 to obtain a first segmented target based on the first adjusted ultrasonic signal and a second segmented target based on the second adjusted ultrasonic signal. Thereafter, the controller 60 may estimate the volume of the target based on the first segmented target and the second segmented target. In an example, the first machine learning model may be trained based on a training dataset of 761 images, and validated with a validation dataset of 84 images.
[0091] Referring to FIG. 10, in various embodiments, the ultrasonic signals or ultrasound images may include one or more defective region(s) of the target. For example, a part of the target may be missing or distorted in the ultrasound image(s). As examples, each defective region may be characterized by one or a combination of: an occluded region, a missing region, and a distorted region. Hence, in various embodiments, the controller 60 may be configured to estimate the defective region based on a second machine learning model.
[0092] Referring to FIG. 11, the controller 60 may first receive the ultrasonic signals or ultrasound images, for example, the first adjusted ultrasonic signal and the second adjusted ultrasonic signal. Thereafter, the controller 60 determines a defective region of the target in one or both of the first adjusted reflection signal and the second adjusted reflection signal. Using the second machine learning model, such as a Pix2Pix conditional generative adversarial network, the controller 60 may estimate or determine the respective defective region of the target. Thereafter, the defective region may be replaced or overlaid, for bladder shape feature extraction or segmentation using the first machine learning model. This may be followed by
a step of volume estimation. In some examples, the process of volume estimation of the bladder may be performed by the controller 60 quickly and locally, thus providing the subject a near real-time volume estimation. In addition, the results from the volume estimation may be used for diagnosis or tracking.
[0093] In various embodiments, with reference to FIG. 12, the controller 60 may be configured to perform a method 700 of ultrasonic scanning. The method 700 of ultrasonic scanning may include: in 710, attaching the ultrasonic probe 100 as described above to a target surface, the target surface spaced apart from a target. In an exemplary embodiment, the target surface may refer to an abdomen of a subject, and the target may refer to a bladder of the subject. The method 700 may further include: in 720, receiving a first reflection signal of the target from the first ultrasonic transducer array; in 730, receiving a second reflection signal of the target from the second ultrasonic transducer array; in 740, adjusting the first reflection signal based on the first bending to obtain a first adjusted reflection signal; and in 750, adjusting the second reflection signal based on the second bending to obtain a second adjusted reflection signal. In various embodiments, the method 700 may further include: in 760, estimating a volume of the target based on the first adjusted reflection signal and the second adjusted reflection signal.
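The steps of method 700 can be sketched as a single pipeline function; all callables below are placeholders standing in for the probe hardware, the bending-based correction, and the volume model described in the text.

```python
def method_700(receive_first, receive_second, read_bending,
               correct, estimate_volume):
    """Sketch of steps 720-760: receive both reflection signals, adjust
    each for the measured bending, then estimate the target volume."""
    first = receive_first()                        # 720: first array signal
    second = receive_second()                      # 730: second array signal
    first_bend, second_bend = read_bending()       # sensor layer reading
    first_adj = correct(first, first_bend)         # 740: adjust first signal
    second_adj = correct(second, second_bend)      # 750: adjust second signal
    return estimate_volume(first_adj, second_adj)  # 760: volume estimate

# Toy stand-ins exercising the pipeline shape:
volume = method_700(lambda: [1.0, 2.0], lambda: [3.0, 4.0],
                    lambda: (0.01, 0.02),
                    lambda sig, bend: sig,          # identity "correction"
                    lambda a, b: sum(a) + sum(b))   # dummy volume model
```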
[0094] In various embodiments, the method may comprise estimating the volume of the target by providing the first adjusted reflection signal and the second adjusted reflection signal to a first machine learning model. In various embodiments, the first machine learning model comprises an FCN32 fully convolutional network.
[0095] In various embodiments, the method may further comprise: segmenting the target from each of the first adjusted reflection signal and the second adjusted reflection signal to obtain a first segmented target and a second segmented target; and estimating the volume of the target based on the first segmented target and the second segmented target. In various embodiments, the method may further comprise: estimating a defective region of the target
using a second machine learning model based on at least one of: the first adjusted reflection signal and the second adjusted reflection signal. In various embodiments, the second machine learning model comprises a Pix2Pix conditional generative adversarial network. In various embodiments, the defective region is characterized by at least one of: an occluded region, a missing region, and a distorted region.
[0096] In various embodiments, the method may further comprise: concurrently receiving both the first reflection signal and the second reflection signal. In various embodiments, the method may further comprise: periodically receiving both the first reflection signal and the second reflection signal. In various embodiments, the first bending has a different curvature from the second bending. In various embodiments, each of the first bending and the second bending varies in operation. In other words, during the method of ultrasonic scanning, the first bending and the second bending may change dynamically.
[0097] Exemplary ultrasonic scanner and ultrasonic probe for bladder volume monitoring
[0098] The proposed ultrasonic scanner and ultrasonic probe utilize ultrasonic imaging, which is a safe diagnostic method, for real-time and extended-period bladder volume monitoring. Estimation of the bladder volume uses two orthogonal cross-section ultrasonic images to estimate three scales in three directions: width, length, and depth. Departing from a conventional ultrasonic probe, which requires a 90° or orthogonal rotation mid-way through the ultrasound process, the proposed wearable ultrasonic probe comprises a pair of orthogonal linear transducer arrays, which alleviates the need for probe rotation.
[0099] FIGs. 13A and 13B illustrate the proposed wearable ultrasonic probe. FIG. 13A shows a two-layer flexible polyimide (PI) printed circuit board (PCB) and an orthogonal ultrasonic transducer array which also includes an orthogonal transducer array electrode. The two linear arrays orthogonal to each other share a centre area. The proposed PCB configuration reduces the risk of wires getting disconnected, ensuring high reliability of the proposed system.
FIG. 13B shows the schematic diagram of an exploded view of the wearable ultrasonic probe with components labelled, including: a strain sensor, an electromagnetic interference (EMI) shielding layer, a PCB circuit, a transducer array, and ground (GND) electrodes. The strain sensor, which is used for phase correction when the ultrasonic probe bends, is integrated on the PCB circuit directly by thermal evaporation. The EMI shielding layer is used for blocking external electromagnetic interference.
[00100] FIGs. 14A to 14F show the optimization of the ultrasonic probe parameters. The impact of the transducer number (N) and the transducer size/pitch on ultrasonic beamforming was used to determine the optimal parameters. The frequency of the ultrasonic transducer was 3 MHz, and the wavelength was 0.5 mm, which provided adequate penetration depth and resolution. The distance between each transducer was determined by the transducer dicing; in the example, the distance was chosen as 30 µm. As shown in FIGs. 14A to 14F, it was found that the choice of the linear transducer array parameters is a trade-off. As the transducer number increases, the focal performance of the ultrasonic wave improves. However, with more transducers, the ultrasound wave diverges faster after the focal point. Furthermore, more transducers lead to a more complicated system. A transducer number of 64 can be chosen in consideration of compatibility with existing ultrasound systems, and is also often used in existing medical probes. When the pitch is larger than one wavelength, the ultrasonic beam pattern exhibits strong side lobe(s), which may result in artifacts when imaging (see FIG. 14F, dotted box). Therefore, to balance the ultrasonic intensity and to mitigate the formation of side lobe(s), the transducer pitch has to be determined based on a trade-off, i.e., a larger pitch reduces resolution while increasing the imaging area with the same number of channels. It was determined that a preferred pitch is one between a full transducer wavelength and half the transducer wavelength.
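The wavelength and preferred pitch range above follow directly from the operating frequency and sound speed; the sketch below assumes a soft-tissue sound speed of 1500 m/s, consistent with 3 MHz giving a 0.5 mm wavelength.

```python
def wavelength_mm(freq_hz, speed_m_s=1500.0):
    """Transducer wavelength in mm for a given frequency and sound speed."""
    return speed_m_s / freq_hz * 1000.0

wl = wavelength_mm(3e6)            # 0.5 mm at 3 MHz, 1500 m/s
pitch_min, pitch_max = wl / 2, wl  # preferred pitch: half to one wavelength
```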
[00101] FIG. 15A shows the B-mode ultrasound image of an actual bladder, and FIG. 15B shows the B-mode ultrasound image of a synthetic bladder used to demonstrate
the ultrasonic probe imaging ability. B-mode refers to the brightness mode of an ultrasound image. The simulation of artificial phantoms was done by simulating and summing the received ultrasonic fields from scatter points. The scatterers were extracted from an existing bladder ultrasonic image, which includes 128 scanning lines. A single scan line in an image can be calculated by summing the responses from the scatterers, in which the scattering strength was determined by the changes in density and ultrasonic speed in the tissue. Thereafter, the ultrasonic image was reconstructed using the signals from each line. It can be seen that the proposed ultrasound probe is able to perform ultrasonic imaging of the bladder.
[00102] Assembly process
[00103] FIG. 16 shows the assembly process of the ultrasonic probe, as follows: (1) The 1-3 composite transducer was bonded to a 2 mm backing layer (Ag:epoxy) and then diced onto adhesive tape with precise alignment. (2) Ag:epoxy conductive paste was dispensed on the orthogonal electrode, and the whole orthogonal transducer array was attached to the PI PCB in one operation. (3) The assembly was baked in the oven at 80°C for 2 hours so that the transducer and the electrode on the PI PCB adhered firmly via the Ag:epoxy paste. (4) The GND electrode was attached to the transducer, Ag:epoxy paste was dispensed on the GND electrode, and the assembly was baked under the same conditions. (5) Before the polydimethylsiloxane (PDMS) packaging, the PI PCB was treated with (3-aminopropyl)triethoxysilane (APTES) to enhance the adhesion between PI and PDMS. The details are as follows: the PI PCB was treated with oxygen plasma for 5 minutes, then soaked in 1% APTES aqueous solution for 20 minutes and dried with N2 gas. (6) Thereafter, the probe was packaged with PDMS. The PDMS (10:1, i.e., 10 parts silicone elastomer base to 1 part silicone elastomer curing agent, by weight) was stirred for 5 min for even mixing and centrifuged for 10 min to remove air bubbles. The PDMS was poured into the mold to package the probe, which was then baked in the oven at 60°C for 12 hours. (7) The strain sensor was fabricated on the PI PCB-based ultrasonic probe using thermal evaporation. The strain sensor has
a Cr/Au thickness of 5/5 nm and comprises serpentine lines. (8) The polyacrylate adhesive interface was fabricated on the ultrasonic probe by pouring the polyacrylate into the mold, followed by baking in the oven at 60°C for 24 hours.
[00104] Backing layer
[00105] A 2 mm Ag:epoxy plate was bonded to the transducer as the backing layer to improve the axial resolution. The axial resolution equals the spatial pulse length times the ultrasound speed, divided by 2. Referring to FIG. 17A (without backing layer) and FIG. 17B (with backing layer), the vibration cycle was shortened by the backing layer. The axial resolution was improved from 2.25 mm to 1.05 mm, which represents a more than two-fold improvement in axial resolution.
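The axial-resolution relation above can be sketched as below, with the spatial pulse length expressed as pulse duration times sound speed; the pulse duration is hypothetical, chosen to reproduce the 1.05 mm figure at an assumed 1500 m/s soft-tissue sound speed.

```python
def axial_resolution_mm(pulse_duration_us, speed_m_s=1500.0):
    """Axial resolution = pulse duration x sound speed / 2, in mm."""
    return pulse_duration_us * 1e-6 * speed_m_s / 2.0 * 1000.0

# A hypothetical 1.4 us pulse at 1500 m/s gives the 1.05 mm resolution
# reported with the backing layer:
res = axial_resolution_mm(1.4)
```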
[00106] Adhesive interface
[00107] In some exemplary embodiments, polyacrylate was used as the adhesive interface in addition to being the ultrasonic couplant. The Young's modulus of the adhesive interface is about 16 kPa. As the Young's modulus of human muscle is several hundred kilopascals to megapascals, the soft adhesive interface may insulate the ultrasonic probe from skin deformation during dynamic body motions. FIG. 18A shows the measurement of the toughness between the PI substrate and the polyacrylate adhesive. FIG. 18B shows the measurement of the toughness between the polyacrylate adhesive and human skin. It was shown that the adhesion toughness between the ultrasonic probe and the adhesive interface is about 38 N/m. Additionally, for the 2.5 cm wide interface, a force of 4 N is needed to peel it off from the skin, which is sufficiently robust for wearable usage.
[00108] Strain Sensor
[00109] When the ultrasonic probe is bent or conformed to a subject's skin, the exact position of each transducer is unknown. This may lead to time delay errors for ultrasonic
beamforming, thus resulting in a decrease in ultrasonic imaging accuracy manifesting as artifacts. This presents a challenge to the typical ultrasonic imaging process.
[00110] FIG. 19 shows a strain sensor for phase error correction. Referring again to FIG. 8C, the strain sensor resistance in the flat and bending (minimum bending radius, e.g., 5 cm) states was measured for calibration. As the resistance of the strain sensor is proportional to the curvature, in practical use it is possible to infer the bending radius from the resistance change of the strain sensor. When the ultrasonic probe is deformed, the strain sensor deforms accordingly. Therefore, from the resistance change, the bending radius can be estimated.
[00111] As shown in Table 1, the bending radius estimation error using the proposed strain sensor is below 10 percent. Beamforming simulation was done to evaluate the performance of the strain sensor-based phase error compensation. As shown in FIG. 20, when no phase error compensation was performed, the maximum beamforming error can be as large as 43 dB when the bending radius is 28 mm. After the phase error compensation, the beamforming error is reduced to 2 dB at the same bending radius. Hence, the beamforming error was reduced by more than 95% after strain sensor correction.
Table 1. Actual bending radius of the ultrasonic probe, the estimated bending radius of the strain sensor, and estimation error.
Bending radius (mm) Estimated bending radius (mm) Error (%)
74 71 3.9%
50 46 7.9%
28 26 5.4%
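A relative-error check of the kind reported in Table 1 can be computed as below; since the exact error definition used in Table 1 is not stated, a plain percent error relative to the actual radius is assumed here.

```python
def radius_error_pct(actual_mm, estimated_mm):
    """Percent error of the strain-sensor bending radius estimate."""
    return abs(estimated_mm - actual_mm) / actual_mm * 100.0

# Actual vs. estimated bending radii from Table 1:
errors = [radius_error_pct(a, e) for a, e in [(74, 71), (50, 46), (28, 26)]]
# Each error stays below the 10 percent bound stated in the text.
```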
[00112] Defects in ultrasound images
[00113] In addition, the size of the bladder can be as large as ~ 10 cm when the bladder is full.
In situations where the ultrasonic probe can image a complete bladder in the field of view, the
size of the bladder, i.e., the width, length, and depth, can then be calculated directly by extracting the bladder shape using a machine learning model, such as an FCN32.
[00114] Referring again to FIG. 10, for a larger bladder or when there is an occlusion, the ultrasonic probe may not be able to obtain a complete image of the bladder due to a limited field of view. The incomplete image may cause a smaller size estimation, leading to a smaller bladder volume estimate. Hence, an automatic volume estimation algorithm for the proposed wearable ultrasonic probe is proposed herein. As shown in FIG. 11, upon capturing or obtaining the ultrasonic bladder image, the ultrasonic image is first processed by a neural network or a machine learning model, such as a Pix2Pix model. This enables the missing part of the ultrasonic image to be completed by the Pix2Pix model prior to extracting the bladder shape using the FCN32 neural network model. The bladder shape feature contributes to the volume estimation, thereby improving the volume estimation accuracy. In alternative embodiments, the bladder shape may be extracted by curve fitting methods such as polynomial fitting (e.g., quadratic or cubic polynomials) or ellipse fitting. In addition, edge detection algorithms such as Canny, Sobel, and Prewitt can also be used for the bladder size extraction.
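As a minimal stand-in for the shape-extraction step, the sketch below derives a bladder's pixel extent from a binary segmentation mask by bounding-box extraction; in the described system the mask would come from the FCN32 model (after Pix2Pix completion), whereas here it is a toy example.

```python
def bladder_extent_px(mask):
    """Return (width, depth) in pixels of the segmented region in a
    binary mask given as a list of rows of 0/1 values."""
    rows = [i for i, row in enumerate(mask) if any(row)]
    cols = [j for row in mask for j, v in enumerate(row) if v]
    return max(cols) - min(cols) + 1, max(rows) - min(rows) + 1

# Toy 5x4 mask with a 3-wide, 2-deep segmented bladder region:
mask = [
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]
width_px, depth_px = bladder_extent_px(mask)  # (3, 2)
```

A pixel-to-millimetre scale factor from the imaging geometry would convert these extents to the W, L, D values used in the volume formula.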
[00115] Experiments
[00116] FIGs. 21 to 25 show various performance parameters of the proposed ultrasonic probe. FIG. 21 shows the impedance at the resonance frequency (fr) and anti-resonance frequency (fa) of the ultrasonic transducer array. FIG. 22 shows the resonance frequency (fr) and anti-resonance frequency (fa) for different element numbers, i.e., numbers of ultrasonic transducer units. FIG. 23 shows a pulse response of the ultrasonic probe based on: sampling rate: 12 MHz, stimulation voltage: 96 V, amplifier: 42 dB, measured in a water tank. The values shown represent the mean and standard deviation for all 125 channels. The axial resolution is 1.5 mm, calculated by axial resolution = spatial pulse length (SPL) × ultrasound speed × 0.5.
[00117] FIG. 24 shows the fractional bandwidth (FBW) of the ultrasonic probe, wherein FBW = (Δf/fc) × 100% = 37±6%, with Δf = fH − fL and fc = (fH + fL)/2, where fH is the high frequency limit, fL is the low frequency limit, and fc is the central frequency. The values shown represent the mean and standard deviation of all 125 channels. FIG. 25 shows the bandwidth for different element numbers.
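The fractional-bandwidth formula above can be sketched directly; the band-edge frequencies below are hypothetical.

```python
def fractional_bandwidth_pct(f_low, f_high):
    """FBW = (fH - fL) / fc x 100%, with fc = (fH + fL) / 2."""
    fc = (f_high + f_low) / 2.0
    return (f_high - f_low) / fc * 100.0

# Hypothetical band edges of 2.5 and 3.6 MHz give roughly 36%:
fbw = fractional_bandwidth_pct(2.5, 3.6)
```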
[00118] FIG. 26 shows exemplary ultrasound images obtained from the proposed ultrasonic scanner and ultrasonic probe in both the longitudinal and transverse directions. FIGs. 27A to 27D and Table 2 show a bladder volume estimation comparison of a bladder phantom between a manual method and the fully convolutional neural network (FCN) method. The values represent the mean and standard deviation (n=5), with the bottom values showing the 150 ml phantom testing results, with units in millimetres. It was shown that the manual method and the FCN method have a size estimation difference of less than 4 mm, thus providing the justification for utilizing the FCN method for near-instant, real-time bladder volume estimation.
Table 2. Comparison between Manual method and FCN method
Volume = Width*Length*Depth/2; and Error = |estimated volume/standard volume − 1|
[00119] FIG. 28 shows a bladder size estimation comparison between the proposed wearable ultrasonic scanner (WUBS) and conventional scanners such as the C5-2V and Apogee 2G EXP.
[00120] FIG. 29A shows a temperature profile of the proposed ultrasonic probe over a 5-minute operational duration. FIG. 29B shows the surface temperature of the ultrasonic probe
over an hour. It may be seen that the temperature was kept below 42 °C after operating over an hour, at a 10 frames per second (fps) rate.
[00121] The proposed ultrasonic probe enables measurement of the bladder volume, with no need for probe rotation. The ultrasonic probe may include two linear ultrasonic arrays/probes intersecting at 90 degrees, with each linear probe configured to obtain one respective cross-section image of the bladder. The two orthogonal linear probes may obtain two orthogonal cross-section images of the bladder for estimating the length, depth, and width of the bladder. Further, the bladder volume may be estimated by the formula: Vol = 0.72*W*L*D, where W, L, and D are the width, length, and depth of the bladder, respectively.
[00122] Resolution is one key performance parameter for ultrasonic imaging. The axial resolution is proportional to the spatial pulse length. The proposed ultrasonic probe includes a backing layer coupled or attached to the ultrasonic transducer layer for decreasing vibration or oscillatory motions, improving the axial resolution.
[00123] In addition, when the ultrasonic probe bends in conformance with the attachment surface, such as the skin of a subject, the unavailability of the exact ultrasonic transducer locations results in phase errors which detrimentally impact imaging accuracy. Proposed herein is a sensor layer for measuring bending in the ultrasonic probe. The sensor layer may be a strain sensor layer integrated with the ultrasonic probe. When the ultrasonic probe deforms, the strain sensor deforms accordingly, causing a resistance change in the strain sensor. Thus, the bending radius of the ultrasonic probe is estimated according to the resistance change. The phase error is then calculated based on the bending radius, and the phase error is corrected or compensated in real-time.
[00124] Disclosed herein is a wearable ultrasonic imaging method for quantitatively characterizing the bladder volume using a pair of orthogonal linear transducer arrays which may capture two orthogonal cross-section images of the bladder. Further, proposed herein is a real-
time phase error compensation method to enhance the accuracy of the imaging. With the integration of a sensor layer in the ultrasonic probe, real-time phase error compensation may be performed, addressing changes in probe curvature during imaging due to movement of the subject, such as breathing.
[00125] The proposed ultrasonic scanner and ultrasonic probe may be used in the following non-limiting scenarios:
[00126] Post-void residual urine measurement: The volume of the post-void residual urine may be measured immediately after voiding. This avoids inaccuracies caused by long waits after urination at a medical centre.
[00127] Urination speed characterization: A slow urine flow rate may indicate an obstruction at the bladder neck or in the urethra, an enlarged prostate, or a weak bladder. Ultrasound imaging speed can be as high as dozens of frames per second. During the voiding process, the urination speed between any two frames can be estimated, which can aid diagnosis of lower urinary tract symptoms.
[00128] On-demand self-catheterization: For bladder sensation problems, a common therapy is regular intermittent self-catheterization. However, the emptying interval has to be chosen properly. Real-time monitoring that enables on-demand self-catheterization will make patients' lives easier.
[00129] All examples described herein, whether of methods, materials, or products, are presented for the purpose of illustration and to aid understanding and are not intended to be limiting or exhaustive. Modifications may be made by one of ordinary skill in the art without departing from the scope of the invention as claimed.
Claims
1. An ultrasonic probe, comprising: a flexible base; a first ultrasonic transducer array coupled to the flexible base, the first ultrasonic transducer array arranged along a first axis; a second ultrasonic transducer array coupled to the flexible base, the second ultrasonic transducer array arranged along a second axis, the second axis orthogonal to the first axis; and a sensor layer coupled to the first ultrasonic transducer array and the second ultrasonic transducer array, the sensor layer being configured to provide a sensor signal corresponding to at least one of: a first bending of the first ultrasonic transducer array about the second axis; and a second bending of the second ultrasonic transducer array about the first axis.
2. The ultrasonic probe as recited in claim 1, wherein the first ultrasonic transducer array and the second ultrasonic transducer array are disposed on a common plane intersecting each other.
3. The ultrasonic probe as recited in claim 2, wherein the sensor layer is disposed adjacent to the common plane along a thickness axis of the flexible base.
4. The ultrasonic probe as recited in any one of claims 2 and 3, wherein the first ultrasonic transducer array and the second ultrasonic transducer array overlap at an intersecting zone, wherein the intersecting zone defines an imaging datum for each of the first ultrasonic transducer array and the second ultrasonic transducer array.
5. The ultrasonic probe as recited in any one of the above claims, wherein the sensor layer comprises: at least one first strain sensor arranged along the first axis; and at least one second strain sensor arranged along the second axis.
6. The ultrasonic probe as recited in any one of the above claims, wherein the sensor layer defines a sensing zone responsive to the first bending and the second bending to provide the sensor signal, the sensing zone covering the first ultrasonic transducer array along the first axis and the second ultrasonic transducer array along the second axis.
7. The ultrasonic probe as recited in any one of the above claims, further comprising a backing layer disposed between the sensor layer and each of the first ultrasonic transducer array and the second ultrasonic transducer array, wherein the backing layer has a higher stiffness than each of the first ultrasonic transducer array and the second ultrasonic transducer array.
8. The ultrasonic probe as recited in any one of the above claims, further comprising an adhesive contact surface for attaching the ultrasonic probe relative to a target surface.
9. The ultrasonic probe as recited in any one of the above claims, wherein the ultrasonic probe is deformable to conform to a contour of a target surface.
10. The ultrasonic probe as recited in any one of the above claims, wherein each of the first ultrasonic transducer array and the second ultrasonic transducer array comprises a plurality of ultrasonic transducers defining a plurality of transducer pitches, each of the plurality of transducer pitches being between a full transducer wavelength and half the transducer wavelength.
11. An ultrasonic scanner, comprising: the ultrasonic probe as recited in any one of the above claims, the ultrasonic probe attachable to a target surface; a controller in signal communication with the ultrasonic probe, wherein the controller is configured to: receive a first reflection signal of a target from the first ultrasonic transducer array; receive a second reflection signal of the target from the second ultrasonic transducer array; adjust the first reflection signal based on the first bending to obtain a first adjusted reflection signal; and adjust the second reflection signal based on the second bending to obtain a second adjusted reflection signal.
12. The ultrasonic scanner as recited in claim 11, wherein the controller is further configured to: estimate a volume of the target based on the first adjusted reflection signal and the second adjusted reflection signal.
13. The ultrasonic scanner as recited in claim 12, wherein the controller is further configured to: estimate the volume of the target by providing the first adjusted reflection signal and the second adjusted reflection signal to a first machine learning model.
14. The ultrasonic scanner as recited in claim 13, wherein the first machine learning model comprises an FCN32 fully convolutional network.
15. The ultrasonic scanner as recited in any one of claims 12 to 14, wherein the controller is further configured to: segment the target from each of the first adjusted reflection signal and the second adjusted reflection signal to obtain a first segmented target and a second segmented target; and estimate the volume of the target based on the first segmented target and the second segmented target.
16. The ultrasonic scanner as recited in any one of claims 11 to 15, wherein responsive to a defective region of the target in at least one of: the first adjusted reflection signal and the second adjusted reflection signal, the controller is further configured to: estimate the defective region of the target using a second machine learning model.
17. The ultrasonic scanner as recited in claim 16, wherein the second machine learning model comprises a Pix2Pix conditional generative adversarial network.
18. The ultrasonic scanner as recited in any one of claims 16 and 17, wherein the defective region is characterized by at least one of: an occluded region, a missing region, and a distorted region.
19. The ultrasonic scanner as recited in any one of claims 11 to 18, wherein the controller is configured to concurrently receive both the first reflection signal and the second reflection signal.
20. The ultrasonic scanner as recited in any one of claims 11 to 19, wherein the controller is configured to periodically receive both the first reflection signal and the second reflection signal.
21. The ultrasonic scanner as recited in any one of claims 11 to 20, wherein the first bending has a different curvature from the second bending.
22. The ultrasonic scanner as recited in any one of claims 11 to 21, wherein each of the first bending and the second bending varies in operation.
23. A method of ultrasonic scanning, the method comprising: attaching the ultrasonic probe as recited in any one of the above claims to a target surface, the target surface spaced apart from a target; receiving a first reflection signal of the target from the first ultrasonic transducer array; receiving a second reflection signal of the target from the second ultrasonic transducer array; adjusting the first reflection signal based on the first bending to obtain a first adjusted reflection signal; and adjusting the second reflection signal based on the second bending to obtain a second adjusted reflection signal.
24. The method as recited in claim 23, further comprising: estimating a volume of the target based on the first adjusted reflection signal and the second adjusted reflection signal.
25. The method as recited in claim 24, further comprising: estimating the volume of the target by providing the first adjusted reflection signal and the second adjusted reflection signal to a first machine learning model.
26. The method as recited in claim 25, wherein the first machine learning model comprises an FCN32 fully convolutional network.
27. The method as recited in any one of claims 24 to 26, further comprising: segmenting the target from each of the first adjusted reflection signal and the second adjusted reflection signal to obtain a first segmented target and a second segmented target; and estimating the volume of the target based on the first segmented target and the second segmented target.
28. The method as recited in any one of claims 23 to 27, further comprising: estimating a defective region of the target using a second machine learning model based on at least one of: the first adjusted reflection signal and the second adjusted reflection signal.
29. The method as recited in claim 28, wherein the second machine learning model comprises a Pix2Pix conditional generative adversarial network.
30. The method as recited in any one of claims 28 and 29, wherein the defective region is characterized by at least one of: an occluded region, a missing region, and a distorted region.
31. The method as recited in any one of claims 23 to 30, further comprising: concurrently receiving both the first reflection signal and the second reflection signal.
32. The method as recited in any one of claims 23 to 31, further comprising: periodically receiving both the first reflection signal and the second reflection signal.
33. The method as recited in any one of claims 23 to 32, wherein the first bending has a different curvature from the second bending.
34. The method as recited in any one of claims 23 to 33, wherein each of the first bending and the second bending varies in operation.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| SG10202400458X | 2024-02-20 | ||
| SG10202400458X | 2024-02-20 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025178564A1 (en) | 2025-08-28 |
Family
ID=96847917
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/SG2025/050114 (WO2025178564A1, pending) | Ultrasonic probe, ultrasonic scanner and a method of ultrasonic scanning | 2024-02-20 | 2025-02-19 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025178564A1 (en) |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20070078345A1 (en) * | 2005-09-30 | 2007-04-05 | Siemens Medical Solutions Usa, Inc. | Flexible ultrasound transducer array |
| US20170311924A1 (en) * | 2014-10-23 | 2017-11-02 | Koninklijke Philips N.V. | Shape sensing for flexible ultrasound transducers |
| US20180168544A1 (en) * | 2015-06-30 | 2018-06-21 | Koninklijke Philips N.V. | Methods, apparatuses, and systems for coupling a flexible transducer to a surface |
| US20220175340A1 (en) * | 2019-04-18 | 2022-06-09 | The Regents Of The University Of California | System and method for continuous non-invasive ultrasonic monitoring of blood vessels and central organs |
| WO2022240843A1 (en) * | 2021-05-11 | 2022-11-17 | The Regents Of The University Of California | Wearable ultrasound imaging device for imaging the heart and other internal tissue |
- 2025-02-19: WO application PCT/SG2025/050114 (WO2025178564A1, en), active, pending
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 25758740 Country of ref document: EP Kind code of ref document: A1 |