
WO2019048284A1 - Intra-procedure calibration for image-based tracking - Google Patents


Info

Publication number
WO2019048284A1
WO2019048284A1 (PCT/EP2018/073054)
Authority
WO
WIPO (PCT)
Prior art keywords
image
probe
frame
scanner
speckle decorrelation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/EP2018/073054
Other languages
English (en)
Inventor
Jochen Kruecker
Faik Can MERAL
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Koninklijke Philips NV
Original Assignee
Koninklijke Philips NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips NV filed Critical Koninklijke Philips NV
Publication of WO2019048284A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/483: Diagnostic techniques involving the acquisition of a 3D volume of data
    • A61B 8/4254: Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient, using sensors mounted on the probe
    • A61B 8/58: Testing, adjusting or calibrating the diagnostic device
    • A61B 2034/2048: Surgical navigation; tracking techniques using an accelerometer or inertia sensor
    • A61B 2090/367: Correlation of different images or relation of image positions in respect to the body; creating a 3D dataset from 2D images using position information
    • A61B 2090/378: Surgical systems with images on a monitor during operation using ultrasound
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/5205: Systems according to group G01S 15/00 particularly adapted to short-range imaging; means for monitoring or calibrating
    • G01S 7/52098: Short-range imaging related to workflow protocols
    • G01S 15/894: Short-range pulse-echo imaging systems using transducers mounted for mechanical movement in two dimensions by rotation about a single axis
    • G01S 15/8993: Three-dimensional imaging systems

Definitions

  • the device to be tracked is an imaging device (e.g. ultrasound probe), and it is desirable to provide the tracking information economically.
  • ultrasound-guided prostate biopsy, in particular ultrasound-MRI fusion biopsy, in which live ultrasound images are referenced to a previously acquired MRI (“magnetic resonance imaging”) image.
  • Lower-cost solutions for ultrasound tracking commonly involve image-based tracking and low-cost inertial sensors such as gyroscopes and accelerometers attached to the ultrasound probe.
  • Low-cost sensors combined with image-based position estimates can achieve tracking accuracies of less than 3 mm for brief scans and simple scan geometries, such as a uni-directional "sweep" across an organ that is required to reconstruct a three-dimensional (3D) view of that organ.
  • a speckle decorrelation based method is typically used to estimate out-of-plane distance and angulation between image frames. The overall accuracy of this method depends in part on careful calibration of the frame-to-frame speckle decorrelation as a function of distance between adjacent frames.
  • the decorrelation between ultrasound images depends on the density and distribution of scatterers and on the presence of specular reflectors (e.g. interfaces) in the tissue/phantom being imaged.
  • Variations in these tissue parameters affect the calibration of speckle decorrelation vs. frame spacing. If the scatterer parameters in the calibration phantom differ substantially from those in a patient to be scanned, then image-based frame spacing estimates in that patient are inaccurate. Moreover, since the image-based tracking determines the pose of each successive 2D image frame relative to the previous 2D image frame, any errors can be cumulative over time, leading to increasingly larger errors for longer or more complex ultrasound imaging sessions.
  • an ultrasound (“US”) device includes a US scanner and a US probe operatively connected to the US scanner.
  • the US scanner and US probe are configured to acquire a succession of two-dimensional (2D) image frames of a portion of a patient.
  • At least one electronic processor is programmed to: track a pose of each 2D image frame relative to a preceding 2D image frame in the succession of 2D image frames based at least in part on distance from the preceding 2D image frame determined by computing a speckle decorrelation value between the 2D image frame and the preceding 2D image frame and using a speckle decorrelation versus frame spacing calibration; and calibrate the speckle decorrelation versus frame spacing calibration by operations including: acquiring a reference three-dimensional (3D) image volume comprising a succession of 2D image frames orthogonal to and acquired along a first direction using the US scanner and US probe; determining distances between pairs of 2D image frames of the reference 3D image volume along the first direction without using the speckle decorrelation versus frame spacing calibration; determining speckle decorrelation values between pairs of 2D image frames of the reference 3D image volume; and updating the speckle decorrelation versus frame spacing calibration along the first direction using the determined speckle decorrelation values and the determined distances.
  • a non-transitory computer readable medium stores instructions executable by at least one electronic processor to perform a probe calibration method comprising: acquiring a succession of two-dimensional (2D) image frames of a portion of a patient using a US scanner and a US probe; tracking a pose of each 2D image frame relative to a preceding 2D image frame in the succession of 2D image frames based at least in part on distance from the preceding 2D image frame determined by computing a speckle decorrelation value between the 2D image frame and the preceding 2D image frame and using a speckle decorrelation versus frame spacing calibration; calibrating the speckle decorrelation versus frame spacing calibration by operations including: acquiring a reference three-dimensional (3D) image volume comprising a succession of 2D image frames orthogonal to and acquired along a first direction using the US scanner and US probe; determining distances between pairs of 2D image frames of the reference 3D image volume along the first direction without using the speckle decorrelation versus frame spacing calibration; determining speckle decorrelation values between pairs of 2D image frames of the reference 3D image volume; and updating the speckle decorrelation versus frame spacing calibration along the first direction using the determined speckle decorrelation values and the determined distances.
  • an ultrasound (US) device in another disclosed aspect, includes a US scanner and a US probe operatively connected to the US scanner.
  • the US scanner and US probe are configured to acquire a succession of two-dimensional (2D) image frames of a portion of a patient.
  • a tracking sensor comprises an inertial sensor tracking relative changes in the position of the US probe and the US scanner.
  • At least one electronic processor is programmed to: track a pose of each 2D image frame relative to a preceding 2D image frame in the succession of 2D image frames based at least in part on distance from the preceding 2D image frame determined by computing a speckle decorrelation value between the 2D image frame and the preceding 2D image frame and using a speckle decorrelation versus frame spacing calibration; and calibrate the speckle decorrelation versus frame spacing calibration by operations including: acquiring a reference three-dimensional (3D) image volume comprising a succession of 2D image frames orthogonal to and acquired along a first direction using the US scanner and US probe; determining distances between pairs of 2D image frames of the reference 3D image volume along the first direction without using the speckle decorrelation versus frame spacing calibration; determining speckle decorrelation values between pairs of 2D image frames of the reference 3D image volume; and updating the speckle decorrelation versus frame spacing calibration along the first direction using the determined speckle decorrelation values and the determined distances.
  • At least one of the calibration further includes acquiring a second reference 3D image volume comprising a succession of 2D image frames orthogonal to and acquired along a second direction orthogonal to the first direction using the US probe and the US scanner; and the determining of distances between pairs of 2D image frames of the reference 3D image volume uses measurements of distances along the first direction within single 2D image frames of the second reference 3D image volume; and the at least one electronic processor is programmed to track the pose of each 2D image frame relative to the preceding 2D image frame further based on the relative changes in the position of the US probe tracked by the inertial sensor.
  • One advantage resides in providing a low cost US probe tracking device.
  • Another advantage resides in correcting for errors in frame to frame pose measurements of a US probe relative to a patient. Another advantage resides in providing for use of image-based and/or inertial sensor-based US probe tracking with improved performance for longer scans and/or complex free-hand probe manipulations.
  • Another advantage resides in calibrating a frame-to-frame decorrelation curve used to estimate the spacing and angulation between image frames.
  • a given embodiment may provide none, one, two, more, or all of the foregoing advantages, and/or may provide other advantages as will become apparent to one of ordinary skill in the art upon reading and understanding the present disclosure.
  • FIGURE 1 diagrammatically shows an illustrative ultrasound (US) device in accordance with one aspect.
  • FIGURE 2 shows an exemplary flow chart operation of the device of FIGURE 1;
  • FIGURES 3A and 3B schematically show first and second scans acquired by the device of FIGURE 1 ;
  • FIGURE 4 shows an example operation of the device of FIGURE 1;
  • FIGURES 5A and 5B show calibration curves used by the device of FIGURE 1;
  • FIGURE 6 shows another exemplary flow chart operation of the device of FIGURE 1.
  • the following relates to ultrasound tracking of interventional procedures.
  • the US system acquires two-dimensional (2D)-US images (or "frames") fast enough to approximate real-time imaging (i.e., "live” ultrasound imaging).
  • each successive frame is compared with the previous frame to assess any movement of the US probe between the frames.
  • Motion in the plane of the frame is easy to detect as it merely entails performing a spatial registration of the (already close) successive frames.
  • tracking motion perpendicular to the frame is more challenging.
  • One approach relies on the observation that there is decorrelation of speckle patterns between the frames, and this decorrelation increases with increasing distance between the successive frames.
  • a factory calibration is typically performed using a realistic phantom to measure a speckle correlation coefficient as a function of frame spacing.
  • the speckle decorrelation between successive frames is measured and the result used to estimate the distance between the successive frames in the direction perpendicular to the frames.
  • Angulation between successive 2D image frames can also be estimated based on spatial variation of the local speckle decorrelation over the image area.
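  The decorrelation-to-distance step described above can be sketched in code. This is an illustrative sketch, not the patent's implementation: the Gaussian curve shape rho(d) = exp(-d^2 / (2 sigma^2)) and the parameter sigma_mm stand in for the factory calibration curve (the patent does not specify its functional form), and the correlation coefficient is the ordinary Pearson coefficient over the frame pixels.

```python
import numpy as np

def speckle_correlation(frame_a: np.ndarray, frame_b: np.ndarray) -> float:
    """Pearson correlation coefficient between two B-mode frames."""
    a = frame_a.ravel().astype(float)
    b = frame_b.ravel().astype(float)
    a -= a.mean()
    b -= b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def spacing_from_decorrelation(rho: float, sigma_mm: float) -> float:
    """Invert a hypothetical Gaussian decorrelation model
    rho(d) = exp(-d^2 / (2 sigma^2)) to estimate the out-of-plane
    spacing d in mm; sigma_mm plays the role of the factory calibration."""
    rho = float(min(max(rho, 1e-6), 1.0))  # guard against rho <= 0 or > 1
    return float(sigma_mm * np.sqrt(-2.0 * np.log(rho)))
```

  In this model, identical frames (rho = 1) map to zero spacing, and lower correlation maps to larger out-of-plane distance.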
  • two orthogonal linear sweeps are performed, with each linear sweep acquiring a series of frames whose planes are transverse to the sweep direction and are spaced apart from one another.
  • the approach leverages the insight that the spacing between image features within a frame is accurate - thus, the spacing between features measured in the orthogonal frames can be used to compute a scaling factor for the correlation curve. This can be done in both orthogonal scan directions, as the scaling for each direction can be determined by features imaged in frames of the orthogonal scan.
  • the probe is rotated through a large angle, instead of using the linear sweeps.
  • the single rotational scan provides known inter-frame distances at a given radius from the rotation axis (the distance is rθ, where r is the radial distance from the rotation axis and θ is the angle of rotation in radians, which can be obtained from an accelerometer or gyroscope mounted on the US probe).
  • the known distance rθ can be used to compute the scaling of the calibration speckle correlation coefficient versus frame spacing curve.
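  A minimal sketch of the rθ arithmetic, assuming the per-frame rotation angle comes from the gyroscope and the radii are depths measured in-plane (both function names are hypothetical):

```python
def inter_frame_distance(r_mm: float, theta_rad: float) -> float:
    """Arc length r * theta traversed between consecutive frames of a
    rotational sweep, at radius r from the rotation axis."""
    return r_mm * theta_rad

def rotational_calibration_pairs(theta_rad, depths_mm, decorr_at_depth):
    """Pair the known spacing r * theta at each depth with the speckle
    decorrelation measured in the image band at that depth, yielding
    (distance, decorrelation) samples that do not rely on the old curve."""
    return [(inter_frame_distance(r, theta_rad), rho)
            for r, rho in zip(depths_mm, decorr_at_depth)]
```

  Because each depth row of the image sits at a different radius, a single rotational sweep yields calibration samples at many distances at once.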
  • An ultrasound (US) imaging device 10 may, for example, be an EPIQ™ ultrasound imaging system available from Koninklijke Philips N.V., Amsterdam, the Netherlands, a UroNav® system for US/MRI-fusion-guided prostate biopsy available from Koninklijke Philips N.V., Amsterdam, the Netherlands, the PercuNav® system (available from Koninklijke Philips N.V., Amsterdam) for general fusion of US with prior 3D imaging (CT, MR, cone-beam CT, etc.), or may be another commercial or custom-built ultrasound imaging system.
  • the ultrasound imaging device 10 includes an US probe 12 operatively connected to an US scanner 14 to perform ultrasound imaging.
  • the illustrative ultrasound probe 12 is connected with the ultrasound imaging system 10 via cabling 15, though a wireless connection is contemplated.
  • the US probe 12 includes a sensor array 16 that acquires a two-dimensional (2D) image frame in a sonicated plane 17.
  • the surgeon or other operator can adjust the location and orientation (i.e. "pose") of the image frame by free-hand movement of the ultrasound probe 12.
  • Such free-hand motion may entail a translational sweep of the US probe 12 (and hence of the sonicated plane 17) and/or may include rotating the US probe 12 about an axis 18, e.g. through an angle theta (θ).
  • the US scanner 14 and the US probe 12 are configured to acquire a succession of two-dimensional (2D) image frames of a portion of a patient, where each 2D image frame corresponds to the current pose of the sonicated plane 17.
  • the US scanner 14 also includes at least one electronic processor 20 (e.g., a microprocessor, a microcontroller, and the like), at least one user input device (e.g., a mouse, a keyboard, a trackball, and/or the like) 22, and a display 24 for displaying ultrasound images and/or US scanner settings, image parameters, and/or so forth.
  • the at least one electronic processor 20 is operatively connected with a non-transitory storage medium (not shown) that stores instructions which are readable and executable by the at least one electronic processor 20 to perform disclosed operations including, e.g. operating the US scanner 14 to perform live US imaging and performing a probe calibration method or process 100 to calibrate a position of the US probe 12 relative to the portion of the patient being scanned.
  • the non-transitory storage medium may, for example, comprise a hard disk drive or other magnetic storage medium; a solid state drive (SSD), flash drive, electronically erasable read-only memory (EEROM) or other electronic memory; an optical disk or other optical storage; various combinations thereof; or so forth.
  • the non-transitory storage medium storing the instructions is disposed in the US scanner 14, e.g. in the form of an internal hard drive, SSD, flash memory, and/or the like.
  • the US imaging device 10 includes at least one tracking sensor 28 disposed on a portion of the US probe 12.
  • the tracking sensor 28 is configured to track orientation of the US probe 12.
  • the tracking sensor 28 comprises the US scanner 14 and the US probe 12.
  • the tracking sensor 28 comprises an inertial sensor 30 configured to track relative changes in the position of the US probe 12.
  • the inertial sensor 30 can be a gyroscope, an accelerometer, or any other suitable sensor that tracks position of the US probe 12.
  • the US probe 12 is positioned on or near a portion of the patient to be scanned (e.g., the abdomen of the patient, inside a rectum in the case of transrectal prostate imaging, or so forth).
  • the at least one electronic processor 20 is programmed to control the US scanner 14 and the US probe 12 to acquire a plurality of live 2D images and, during the live 2D imaging, track a pose of each 2D image frame relative to the pose of the last 2D image frame in the succession of 2D image frames based at least in part on distance from the preceding 2D image frame determined by computing a speckle decorrelation value between the 2D image frame and the preceding 2D image frame and using a speckle decorrelation versus frame spacing calibration 101.
  • the pose refers to the spatial position and angle of the image frame in three-dimensional space.
  • the US probe 12 is shown in different positions in each image frame.
  • the distance data includes tracking data acquired by the tracking sensor 28.
  • the at least one electronic processor 20 is programmed to track the pose of each 2D image frame relative to the pose of the last 2D image frame in the succession of 2D image frames based on one or more detected changes of the 2D image frame relative to the last 2D image frame in the succession. Changes "in-plane" are readily detected by spatially registering the current frame with the previous frame. Changes "out-of-plane" are more difficult to quantify; one known approach exploits the fact that the amount of speckle decorrelation increases as the current frame moves out-of-plane relative to the previous frame.
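  The in-plane spatial registration of successive, nearly identical frames can be sketched with FFT phase correlation. This is one standard registration technique, not necessarily the one used in the patent, and this sketch recovers only integer-pixel translation:

```python
import numpy as np

def in_plane_shift(prev: np.ndarray, curr: np.ndarray):
    """Integer-pixel in-plane translation between consecutive frames,
    estimated by FFT phase correlation (subpixel refinement omitted)."""
    cross = np.conj(np.fft.fft2(prev)) * np.fft.fft2(curr)
    cross /= np.abs(cross) + 1e-12  # keep only the phase
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap peak coordinates to signed shifts
    if dy > prev.shape[0] // 2:
        dy -= prev.shape[0]
    if dx > prev.shape[1] // 2:
        dx -= prev.shape[1]
    return int(dy), int(dx)
```

  The peak of the inverse-transformed phase spectrum sits at the translation that best aligns the two frames.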
  • the speckle decorrelation-versus-distance calibration curve 101 may be used to quantify the out-of-plane distance the current 2D image frame has moved respective to the last 2D image frame.
  • the at least one electronic processor 20 is programmed to track the pose of each 2D image frame relative to the pose of the last 2D image frame in the succession of 2D image frames based at least in part on the relative changes in the position of the US probe 12 tracked by the inertial sensor.
  • a combination of imaging-based tracking and inertial sensor-based tracking is contemplated to provide improved accuracy over either technique alone.
  • accurate imaging-based in-plane motion tracking may be combined with a combination of speckle decorrelation-based and inertial tracking of out-of-plane motion or angulation.
  • the tracking performed at 102 is relative, in the sense that the pose of each 2D image frame is tracked relative to the pose of the last 2D image frame in the succession of 2D image frames.
  • any errors in the pose of the current frame estimated respective to the last frame will generally accumulate over time.
  • Such cumulative error is particularly likely to be problematic when the US probe 12 is moved over large distances or in a complex way (e.g. a combination of a translational sweep and a rotation), and when the duration of the imaging is long. This cumulative error is also referred to herein as "drift".
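  A toy illustration of why this relative tracking drifts: chaining per-frame spacing estimates means any systematic per-frame bias compounds over the whole sweep (the numbers below are purely illustrative):

```python
def cumulative_position(spacing_estimates_mm):
    """Chain frame-to-frame spacing estimates into an absolute out-of-plane
    position for each frame. A fixed relative bias in each estimate produces
    a position error that grows with the distance traveled (i.e., drift)."""
    pos = [0.0]
    for d in spacing_estimates_mm:
        pos.append(pos[-1] + d)
    return pos
```

  For example, a 5% per-frame overestimate leaves the final frame of a 100 mm sweep about 5 mm off, which is why recalibrating the decorrelation curve intra-procedure matters for long or complex scans.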
  • the accuracy depends upon how accurately the speckle decorrelation versus frame spacing calibration 101 captures the relationship between frame spacing and speckle decorrelation.
  • the accuracy can be affected by factors such as density and distribution of ultrasonic wave scatterers in the tissue, and on the presence of specular ultrasonic wave reflectors (e.g. interfaces) in the tissue/phantom being imaged.
  • the speckle decorrelation versus frame spacing calibration 101 may be more or less accurate for a given patient, or for a given region of the patient.
  • the at least one electronic processor 20 is programmed to calibrate the speckle decorrelation versus frame spacing calibration 101 of the US probe 12.
  • part of the pose estimation is typically obtained using image-based methods.
  • the frame-to-frame decorrelation is used to estimate the spacing and angulation between image frames.
  • Calibration 101 of the frame spacing vs. decorrelation is inaccurate if the decorrelation in the calibration phantom differs from the decorrelation behaviour in the patient to be scanned. Such inaccuracy can affect the clinical utility of the low-cost tracking solution.
  • the speckle decorrelation calibration 104 may be performed automatically upon initiation of live US imaging. Additionally or alternatively, the calibration 104 may be triggered by a substantial change in the US image, e.g. detection of a large gradient in the image indicative of a (potentially specularly reflecting) interface or detection of a large change in tissue density (potentially associated with a large change in density of US scattering centers). Additionally or alternatively, the calibration 104 can be triggered by manual activation of a user input (e.g., via the at least one user input device 22). These are merely non-limiting illustrative examples.
  • the speckle decorrelation calibration 104 proceeds as follows.
  • the at least one electronic processor 20 is programmed to control the US scanner 14 and the US probe 12 to acquire a reference three-dimensional (3D) image volume comprising a succession of 2D image frames using the US scanner and US probe. This may be done directly if the US sensor array 16 is capable of sonicating a 3D volume, or may be done in conjunction with a free-hand sweep of the US probe 12 performed by the surgeon.
  • the display 24 may request that the surgeon perform a free-hand sweep, and the inertial sensor 30 detects when this occurs.
  • the free-hand sweep may be a translation, or in some embodiments may be a rotation of the US probe 12 about the axis 18.
  • the acquisition 106 should employ planes that are orthogonal to, or at least at a large angle to, the imaging plane of the live US imaging.
  • the 3D image volume is acquired in a first scanning procedure by acquiring 2D images orthogonal to and acquired along a first direction 32.
  • the at least one electronic processor 20 is programmed to control the US scanner 14 and the US probe 12 to acquire a second reference 3D image volume comprising a succession of 2D image frames orthogonal to and acquired along a second direction 34 orthogonal to the first direction 32 using the US probe and the US scanner.
  • the second reference 3D image volume should substantially overlap with the (first) reference 3D image volume.
  • the first and second reference image volumes are acquired by performing linear sweeps with the US probe 12 and US scanner 14 along the first and second directions 32 and 34 to acquire the reference 3D image volume and the second reference 3D image volume, respectively.
  • either the reference 3D image volume or the second reference 3D image volume can contain a single image approximately perpendicular to the corresponding first direction 32 or second direction 34.
  • the reference 3D image volume is obtained by acquiring the succession of 2D image frames while rotating the US probe through an angle θ, sweeping the sonicated plane 17.
  • the scan with the US probe 12 can be predominantly rotational around a y-axis (depth axis) of the image plane.
  • the inter-plane spacing depends on the radius r from the rotational axis 18 and the rotation angle θ, and is given by rθ, as diagrammatically shown in FIGURE 4.
  • the inter-frame spacing between 2D US image frames can be afflicted with error due to the uncertainty in the image-based (or other low-cost) estimation of the frame spacing and any inaccuracy of the speckle decorrelation versus frame spacing calibration 101.
  • Distances measured (or structures depicted) within any of the image planes are generally accurate, since these are provided directly by the ultrasound system 10.
  • the in-plane depiction of structures in the 2D image frames of scan #2 can therefore be used to calibrate the frame spacing in scan #1, and vice versa.
  • This mutual calibration is achieved by image registration of scan #2 on scan #1 in which a scaling parameter is introduced for the frame spacing.
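  A toy sketch of this mutual calibration, under a simplifying assumption: the extent of a single structure along the scan #1 direction is measured in-plane in one frame of scan #2 (where distances are accurate) and compared with the extent implied by scan #1's current, uncalibrated spacing estimate. The function and its arguments are hypothetical; the patent's actual registration introduces the scaling parameter within a full image registration.

```python
def frame_spacing_scale(in_plane_extent_mm: float,
                        n_frame_intervals: int,
                        est_spacing_mm: float) -> float:
    """Scaling factor for scan #1's frame spacing: the structure's true
    extent (measured in-plane in the orthogonal scan #2) divided by the
    extent implied by the current spacing estimate across scan #1's frames."""
    return in_plane_extent_mm / (n_frame_intervals * est_spacing_mm)
```

  For instance, a structure spanning 12 mm in-plane but only 10 estimated frame intervals of 1 mm in scan #1 implies the spacing estimates should be scaled up by 1.2.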
  • the inter-frame spacing rθ is accurately known from the rotation angle θ and the distance r, the latter again being an accurately measurable in-plane distance.
  • the at least one electronic processor 20 is programmed to determine distances between pairs of 2D image frames of the reference 3D image volume along the first direction 32 without using the speckle decorrelation versus frame spacing calibration (acquired at 102).
  • When the two orthogonal scans (scan #1 and scan #2) are performed to obtain the first and second reference 3D image volumes, the at least one electronic processor 20 is programmed to determine the distances between pairs of 2D image frames of the reference 3D image volume using measurements of distances along the first direction within single 2D image frames of the second reference 3D image volume.
  • the at least one electronic processor 20 is programmed to determine speckle decorrelation values between pairs of 2D image frames of the reference 3D image volume.
  • the speckle decorrelation values can be determined by computing the correlation between the two planes. This speckle decorrelation determination can be performed by operations known in the art (see, e.g., Chen et al., "Determination of Scan-Plane Motion Using Speckle Decorrelation: Theoretical Considerations and Initial Test," Dep't of Radiology, University of Michigan Medical Center, 1996).
  • the at least one electronic processor 20 is programmed to update the speckle decorrelation versus frame spacing calibration 101 along the first direction using the determined speckle decorrelation values and the determined inter-frame distances.
  • the at least one electronic processor 20 is programmed to update the speckle decorrelation versus frame spacing calibration 101 by adjusting a scale factor of the speckle decorrelation versus frame spacing calibration.
  • the scaling factor s_m is determined from the speckle decorrelation values determined at operation 110 and the corresponding inter-frame distances determined at operation 108.
  • FIGURE 5A shows an illustrative representation of the speckle decorrelation versus frame spacing calibration 101, depicted as a calibration curve of frame-to-frame decorrelation (correlation coefficient) vs. frame spacing between ultrasound images.
  • FIGURE 5B shows the updating operation 112 with the determined scaling factor s_m.
  • the dashed curve is the original (e.g., phantom-based) pre-calibration curve of FIGURE 5A
  • the solid curve is the scaled version determined at 112 (i.e., stretched in the x-direction) to be used for subsequent frame spacing computations.
  • the scaling factor s_m can be determined to be 1.2, meaning that if the factory calibration of FIGURE 5A generates an inter-frame spacing of D for a given measured speckle decorrelation value, then the updated curve of FIGURE 5B generates an inter-frame spacing of 1.2D for that speckle decorrelation value.
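The effect of the scale factor on the calibration lookup can be sketched as follows (the curve samples below are illustrative placeholders, not the actual factory calibration; `np.interp` requires increasing x-coordinates, hence the flip of the monotonically decreasing curve):

```python
import numpy as np

# Illustrative pre-calibration samples: speckle correlation falls off
# monotonically with frame spacing (cf. FIGURE 5A).
spacing_mm  = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])
correlation = np.array([1.0, 0.85, 0.62, 0.41, 0.25, 0.15])

def spacing_from_decorrelation(rho, s_m=1.0):
    """Invert the calibration curve (correlation -> frame spacing) and
    stretch it along the spacing axis by the scale factor s_m."""
    base = np.interp(rho, correlation[::-1], spacing_mm[::-1])
    return float(s_m * base)

d0 = spacing_from_decorrelation(0.62)           # raw curve: 0.4 mm
d1 = spacing_from_decorrelation(0.62, s_m=1.2)  # scaled curve: 0.48 mm, i.e. 1.2*D
```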
  • the at least one electronic processor 20 is programmed to then resume (or start, in the case of a preparatory calibration) the live imaging with processing flowing back to 102 to continue tracking, but henceforth using the updated speckle decorrelation versus frame spacing calibration generated at 112.
  • FIGURE 6 is a flow chart showing a method 200 describing one illustrative implementation of the operations 102-114 in more detail. Steps 201-205 correspond to operation 106 of FIGURE 2, i.e., acquiring the first and second reference 3D image volumes.
  • the US imaging device 10 is configured to prompt the operator (e.g., doctor, nurse, technician, and the like) to scan a region of interest (ROI) in a first direction (e.g., craniocaudal) with the probe oriented such that the individual image planes are approximately perpendicular to the scan direction.
  • the prompt can be a visual prompt (e.g., displayed on the display device 24), an audible prompt (e.g., via a speaker (not shown)), or a haptic prompt (e.g., the probe 12 can vibrate).
  • the at least one electronic processor 20 is programmed to use a pre-acquired calibration curve of frame-to-frame decorrelation vs. frame spacing (e.g., such as the one shown in FIGURE 5A) to reconstruct the first scan into a first volume.
  • the US imaging device 10 is configured to prompt the operator to obtain a second scan of the same ROI in a direction roughly perpendicular to first scan and with image planes roughly perpendicular to the scan direction.
  • the prompt can be a visual prompt, an audible prompt, or a haptic prompt.
  • the at least one electronic processor 20 is programmed to again use the pre-acquired calibration curve of frame-to-frame decorrelation vs. frame spacing to reconstruct the second scan into a second volume.
  • the at least one electronic processor 20 is programmed to map the second scan onto the first scan using an image-based rigid registration algorithm. Any structures within the ROI are now depicted in both scans, but with roughly perpendicular view directions. Structures visible in the in-plane direction in the first scan are visible in the out-of-plane direction in the second scan, and vice versa. Any depiction or measurement within the individual image planes is accurate since these image planes are generated directly by the ultrasound scanner.
  • Depictions or measurements in the out-of-plane direction depend on the image-based (or other low-cost) determination of the frame spacing (that is, the speckle decorrelation versus frame spacing calibration 101), which is a factory phantom-based calibration and likely afflicted with error when applied to human tissue that may materially differ from the phantom.
  • a probe calibration procedure is triggered, comprising operations 207-210, which correspond to the calibration 104 of FIGURE 2.
  • the probe calibration can be triggered in various ways, such as automatically as a preparatory procedure performed prior to interventional imaging, or in response to a user input received via the user input device 22 (e.g., a mouse click, a keyboard button press, or the like).
  • the calibration procedure aims to improve the speckle decorrelation-based estimation of the frame spacing between successive images during the live US imaging.
  • the at least one electronic processor 20 is programmed to introduce a scaling factor s to scale the frame spacing with each scan.
  • the at least one electronic processor 20 is programmed to optimize the registration of the first and second scans by modifying the scaling factor until the image similarity between the first and second scans is maximized or optimized. This can be achieved using a similarity metric, such as a correlation coefficient, mutual information, or sum of squared differences.
  • the optimized scale factor is denoted as s_m.
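A minimal sketch of this one-dimensional optimization (the similarity callable and search grid are hypothetical stand-ins; in practice each candidate scale factor would require re-registering the rescaled second volume onto the first and evaluating the chosen similarity metric):

```python
import numpy as np

def optimize_scale_factor(similarity, s_grid):
    """Return the frame-spacing scale factor s_m that maximizes the image
    similarity between the two reconstructed, registered scans.

    similarity -- callable mapping a candidate scale factor s to a score
                  (e.g. correlation coefficient, mutual information, or
                  negated sum of squared differences)
    s_grid     -- candidate scale factors to evaluate
    """
    scores = [similarity(s) for s in s_grid]
    return float(s_grid[int(np.argmax(scores))])

# Toy stand-in for the registration similarity, peaked at the (unknown)
# true scale factor of 1.2.
sim = lambda s: -(s - 1.2) ** 2
s_m = optimize_scale_factor(sim, np.linspace(0.8, 1.5, 71))  # -> ~1.2
```

A grid search keeps the sketch simple; a bounded 1-D optimizer would serve equally well for this single-parameter problem.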
  • the at least one electronic processor 20 is programmed to scale the calibration curve based on which the frame spacings were computed (e.g., the decorrelation-vs-frame-spacing measurements initially obtained during pre-calibration).
  • the at least one electronic processor 20 is programmed to use the scaled calibration curve for all frame tracking calculations to improve spatial accuracy for visualizing frames, making measurements between frames, or registering frames to additional images.
  • the scaled calibration curve is then used for the next iteration of operation 102.
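Once updated, the scaled curve feeds directly into the tracking loop; a sketch of the accumulation of out-of-plane positions (names are illustrative, and `spacing_fn` stands in for the decorrelation measurement combined with the scaled calibration lookup):

```python
def track_out_of_plane_positions(frames, spacing_fn):
    """Accumulate out-of-plane positions along a freehand sweep: each
    frame's position is the previous position plus the inter-frame
    spacing inferred from speckle decorrelation.

    frames     -- sequence of 2D image frames
    spacing_fn -- callable (frame_a, frame_b) -> spacing in mm
    """
    positions = [0.0]
    for prev, cur in zip(frames, frames[1:]):
        positions.append(positions[-1] + spacing_fn(prev, cur))
    return positions

# With a constant inferred spacing of 0.5 mm over a 4-frame sweep:
pos = track_out_of_plane_positions([None] * 4, lambda a, b: 0.5)  # [0.0, 0.5, 1.0, 1.5]
```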

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

An ultrasound (US) device (10) includes a US scanner (14) and a US probe (12) operatively connected to the US scanner. The US scanner and the US probe are configured to acquire a succession of two-dimensional (2D) image frames of a portion of a patient. At least one electronic processor (20) is programmed to: track a pose of each 2D image frame relative to a previous 2D image frame in the succession of 2D image frames based, at least in part, on the distance from the previous 2D image frame determined by computing a speckle decorrelation value between the 2D image frame and the previous 2D image frame and using a speckle decorrelation versus frame spacing calibration; and calibrate the speckle decorrelation versus frame spacing calibration by operations including: acquiring a reference three-dimensional (3D) image volume comprising a succession of 2D image frames orthogonal to, and acquired along, a first direction (32) using the US scanner and the US probe; determining distances between pairs of 2D image frames of the reference 3D image volume along the first direction without using the speckle decorrelation versus frame spacing calibration; determining speckle decorrelation values between pairs of 2D image frames of the reference 3D image volume; and updating the speckle decorrelation versus frame spacing calibration along the first direction using the determined speckle decorrelation values and the determined distances.
PCT/EP2018/073054 2017-09-08 2018-08-28 Intra-procedure calibration for image-based tracking Ceased WO2019048284A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201762555723P 2017-09-08 2017-09-08
US62/555,723 2017-09-08

Publications (1)

Publication Number Publication Date
WO2019048284A1 true WO2019048284A1 (fr) 2019-03-14

Family

ID=63407215

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2018/073054 Ceased WO2019048284A1 (fr) Intra-procedure calibration for image-based tracking

Country Status (1)

Country Link
WO (1) WO2019048284A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114945327A (zh) * 2019-12-12 2022-08-26 皇家飞利浦有限公司 System and method for guiding an ultrasound probe
CN117814844A (zh) * 2022-09-27 2024-04-05 慧威医疗科技(台州)有限公司 Ultrasound image registration method, system, and imaging device
US20240282047A1 (en) * 2023-02-22 2024-08-22 Aaron Fenster System and method for display plane visualization through an ultrasound imaging volume

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
CHEN ET AL.: "Determination of Scan-Plane Motion Using Speckle Decorrelation: Theoretical Considerations and Initial Test", Dep't of Radiology, University of Michigan Medical Center, 1996
GEE A H ET AL: "Sensorless freehand 3D ultrasound in real tissue: Speckle decorrelation without fully developed speckle", MEDICAL IMAGE ANALYSIS, OXFORD UNIVERSITY PRESS, OXOFRD, GB, vol. 10, no. 2, 1 April 2006 (2006-04-01), pages 137 - 149, XP028013445, ISSN: 1361-8415, [retrieved on 20060401], DOI: 10.1016/J.MEDIA.2005.08.001 *
HOUSDEN R J ET AL: "Rotational motion in sensorless freehand three-dimensional ultrasound", ULTRASONICS, IPC SCIENCE AND TECHNOLOGY PRESS LTD. GUILDFORD, GB, vol. 48, no. 5, 1 September 2008 (2008-09-01), pages 412 - 422, XP024338729, ISSN: 0041-624X, [retrieved on 20080215], DOI: 10.1016/J.ULTRAS.2008.01.008 *
HOUSDEN RICHARD JAMES ET AL: "Calibration of an orientation sensor for freehand 3D ultrasound and its use in a hybrid acquisition system", BIOMEDICAL ENGINEERING ONLINE, BIOMED CENTRAL LTD, LONDON, GB, vol. 7, no. 1, 24 January 2008 (2008-01-24), pages 5, XP021036722, ISSN: 1475-925X *


Similar Documents

Publication Publication Date Title
EP3478209B1 (fr) Inertial device tracking system and method of operation thereof
US11116582B2 (en) Apparatus for determining a motion relation
CN103654784B (zh) Method for acquiring patient motion during a medical imaging examination
JP7089521B2 (ja) System and method for fast and automated ultrasound probe calibration
US9526476B2 (en) System and method for motion tracking using unique ultrasound echo signatures
EP3454757B1 (fr) 3D tracking of an interventional instrument in 2D ultrasound-guided interventions
EP3393366B1 (fr) Ultrasound imaging apparatus and ultrasound imaging method for inspecting a volume of a subject
WO2013033552A2 (fr) Needle detection and tracking methods
EP3968861B1 (fr) Ultrasound system and method for tracking the movement of an object
KR20140144633A (ko) Image registration method and apparatus
CN105433977A (zh) Medical imaging system, surgical guidance system, and medical imaging method
WO2016175758A2 (fr) Image-guided steering of a transducer array and/or an instrument
WO2019048284A1 (fr) Intra-procedure calibration for image-based tracking
US20180333138A1 (en) Ultrasonic diagnostic apparatus, and ultrasonic diagnostic method
US8870750B2 (en) Imaging method for medical diagnostics and device operating according to this method
EP3738515A1 (fr) Ultrasound system and method for tracking the movement of an object
EP3975859B1 (fr) Relative position determination for passive ultrasound sensors
US20200275915A1 (en) Ultrasound probe localization with drift correction
Baumann et al. 3-D ultrasound probe calibration for computer-guided diagnosis and therapy
Hummel et al. Evaluation of three 3D US calibration methods
EP3771432A1 (fr) Relative position determination for passive ultrasound sensors
US20110098567A1 (en) Three dimensional pulsed wave spectrum ultrasonic diagnostic apparatus and three dimensional pulsed wave spectrum data generation method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18762063

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18762063

Country of ref document: EP

Kind code of ref document: A1