
WO2007101346A1 - Ultrasound simulator and method of simulating an ultrasound examination - Google Patents

Ultrasound simulator and method of simulating an ultrasound examination

Info

Publication number
WO2007101346A1
WO2007101346A1 PCT/CA2007/000370 CA2007000370W WO2007101346A1 WO 2007101346 A1 WO2007101346 A1 WO 2007101346A1 CA 2007000370 W CA2007000370 W CA 2007000370W WO 2007101346 A1 WO2007101346 A1 WO 2007101346A1
Authority
WO
WIPO (PCT)
Prior art keywords
ultrasound
doppler
image
data
dus
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/CA2007/000370
Other languages
English (en)
Inventor
David Steinman
Samira Hirji
David Holdsworth
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robarts Research Institute
Original Assignee
Robarts Research Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robarts Research Institute filed Critical Robarts Research Institute
Publication of WO2007101346A1 publication Critical patent/WO2007101346A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N29/00Investigating or analysing materials by the use of ultrasonic, sonic or infrasonic waves; Visualisation of the interior of objects by transmitting ultrasonic or sonic waves through the object
    • G01N29/22Details, e.g. general constructional or apparatus details
    • G01N29/30Arrangements for calibrating or comparing, e.g. with standard objects
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N29/00Investigating or analysing materials by the use of ultrasonic, sonic or infrasonic waves; Visualisation of the interior of objects by transmitting ultrasonic or sonic waves through the object
    • G01N29/04Analysing solids
    • G01N29/06Visualisation of the interior, e.g. acoustic microscopy
    • G01N29/0654Imaging
    • G01N29/0672Imaging by acoustic tomography
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N29/00Investigating or analysing materials by the use of ultrasonic, sonic or infrasonic waves; Visualisation of the interior of objects by transmitting ultrasonic or sonic waves through the object
    • G01N29/22Details, e.g. general constructional or apparatus details
    • G01N29/26Arrangements for orientation or scanning by relative movement of the head and the sensor
    • G01N29/265Arrangements for orientation or scanning by relative movement of the head and the sensor by moving the sensor relative to a stationary material
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N29/00Investigating or analysing materials by the use of ultrasonic, sonic or infrasonic waves; Visualisation of the interior of objects by transmitting ultrasonic or sonic waves through the object
    • G01N29/44Processing the detected response signal, e.g. electronic circuits specially adapted therefor
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N29/00Investigating or analysing materials by the use of ultrasonic, sonic or infrasonic waves; Visualisation of the interior of objects by transmitting ultrasonic or sonic waves through the object
    • G01N29/44Processing the detected response signal, e.g. electronic circuits specially adapted therefor
    • G01N29/4463Signal correction, e.g. distance amplitude correction [DAC], distance gain size [DGS], noise filtering
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/58Testing, adjusting or calibrating the diagnostic device
    • A61B8/587Calibration phantoms

Definitions

  • the present invention relates generally to the field of ultrasound and in particular, to an ultrasound simulator and method of simulating an ultrasound examination.
  • Duplex ultrasound is among the most accessible imaging modalities available today for the diagnosis of carotid disease, a common precursor to the incidence of stroke.
  • Duplex ultrasound provides a view of the anatomy under examination overlaid with blood flow velocity information by combining both B- mode and Doppler ultrasound.
  • Doppler ultrasound detects the velocity of blood travelling through an individual's artery, by transmitting a high frequency signal into the body and detecting shifts in the frequency of returned signals. The detected velocity in turn can be used to approximate the degree of occlusion in the artery due to plaque build-up, or atherosclerosis.
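  • The standard Doppler shift relation underlying this velocity estimate (stated here for reference; it is the textbook form rather than a reproduction of the patent's own equation) is $f_D = \frac{2 f_0 v \cos\theta}{c}$, where $f_D$ is the detected frequency shift, $f_0$ the transmitted frequency, $v$ the blood speed, $\theta$ the angle between the ultrasound beam and the flow direction, and $c$ the speed of sound in blood.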
  • the degree of narrowing in the artery, or its stenosis severity is a known indicator of an individual's risk of stroke.
  • B-mode ultrasound provides information relating to tissue properties.
  • Duplex ultrasound combines the blood flow velocity information extracted from Doppler ultrasound with the tissue property information extracted from B-mode ultrasound, thereby enabling blood flow and anatomical visualization of the anatomy under examination.
  • DUS is increasingly becoming the sole imaging modality used to determine the appropriate treatment and management steps required for patients with carotid atherosclerotic disease. This is due to the fact that DUS holds many advantages over other imaging modalities, including its capability to display in vivo images non-invasively and in real-time. Ultrasound also remains the least expensive diagnostic imaging tool to purchase, operate and maintain as compared to X-ray computed tomography (CT) or magnetic resonance imaging (MRI). Furthermore, ultrasound machines are comparatively smaller in size than CT or MRI machines making them more portable and convenient for use in clinics.
  • Computer-based DUS simulators that make use of pre-recorded clinical data are also available, such as that sold by MedSim Ltd. of Fort Lauderdale, Florida under the name Ultrasim®.
  • the Ultrasim ® simulator couples a motion-tracked mock probe to pre-recorded, three-dimensional ultrasound clinical datasets virtually placed within an anthropomorphic mannequin.
  • the Ultrasim ® simulator offers a single DUS module based on pre-recorded Doppler audio clips sampled at a few points on a plane through an artery, under the assumption that blood flow dynamics are symmetric around the artery axis.
  • U.S. Patent No. 5,609,485 to Bergman et al. discloses a computer- based interactive reproduction system device designed to be used by physicians and technicians in medical training and diagnosis using medical systems such as ultrasound machines. Biological data is collected from a living body and stored in memory. An operator manipulates a simulated sensor over a transmitter which may be attached to a simulated body. The transmitter transmits position data to a receiver in the sensor. A reproduction unit processes the pre-recorded biological data and displays data corresponding to the position of the sensor with respect to the transmitter.
  • U.S. Patent No. 6,117,078 to Lysyansky et al. discloses a method and apparatus for providing a virtual volumetric ultrasound phantom to construct an ultrasound training system from any ultrasound system.
  • the ultrasound system and method retrieve and display previously stored ultrasound data to simulate an ultrasound scanning session.
  • a real ultrasound system acquires an image of an ultrasound phantom.
  • the ultrasound image comprises ultrasound echo data for an image/scan plane representing a cross-section or partial volume of the ultrasound phantom.
  • the ultrasound image is analyzed to identify image attributes that are unique for each image/scan plane. A portion of the previously stored data that corresponds to the image attributes is retrieved and displayed.
  • actual position and orientation of the acquired image/scan plane with respect to a known structure within the ultrasound phantom are determined by processing the image/scan plane to obtain a number of geometrical image parameters. Position and orientation of the image/scan plane are calculated from the image parameters using formulas based on a known three-dimensional structure within the phantom. The determination of actual image/scan plane position and orientation is enhanced using image de-correlation techniques. Retrieval of the stored data is based upon the calculated position and orientation or on the obtained image parameters.
  • U.S. Patent No. 6,210,168 to Aiger et al. discloses a method and system for simulating, on a B-mode ultrasound simulator, a D-mode and C-mode Doppler ultrasound examination.
  • Velocity and sound data describing blood flow at selected locations within blood vessels of a patient are gathered during an actual Doppler ultrasound examination.
  • the gathered data are processed off-line to generate data that is stored in memory in association with a virtual frame buffer.
  • Doppler simulation at a designated location on a B-mode image generated from the virtual frame buffer is achieved by performing bilinear interpolation, at the time of simulation, from the data stored in the memory, so as to determine blood flow velocity and sound values for all designated virtual frame buffer voxels.
  • the interpolated blood flow velocity values are depicted as either a gray scale Doppler spectral waveform or a color scale flow map on the screen of the B-mode ultrasound simulator.
  • the sound values are depicted as an audible signal simulating a Doppler sound waveform.
  • U.S. Patent Application Publication No. 2005/0283075 to Ma et al. discloses a three-dimensional, fly-through ultrasound system.
  • a volume is represented using high spatial resolution ultrasound data.
  • the same set of ultrasound data is used for identifying a boundary, for placing the perspective position within the volume and rendering from the perspective position. The identification of the boundary and rendering are automated and performed by a processor.
  • Ultrasound data is used to generate three-dimensional, fly-through representations allowing for virtual endoscopy or other diagnostically useful views of structure or fluid flow channels.
  • the computer-based ultrasound simulators described above all make use of pre-recorded clinical data in order to generate audio and/or video ultrasound images.
  • relying on pre-recorded clinical data reduces the effectiveness of the ultrasound simulators as training tools.
  • Because displayed images are based on pre-recorded clinical data and are not generated on-the-fly, operators are unable to learn about the many important operating parameters that influence the representation, and hence the interpretation, of the returned Doppler signal.
  • improvements in Doppler ultrasound simulators are desired. It is therefore an object of the present invention to provide a novel ultrasound simulator and method of simulating an ultrasound examination. Summary of the Invention
  • a method of simulating an ultrasound examination comprising: synthesizing ultrasound data using a computational phantom; and coupling the simulated ultrasound data to motion of a sensor manipulated over a target volume thereby to simulate said ultrasound examination.
  • the synthesized ultrasound data comprises both synthesized Doppler ultrasound data and B-mode data and the computational phantom comprises a computational fluid dynamics (CFD) model.
  • the synthesizing comprises interrogating the CFD model at points within a sample volume; for each point, determining from the CFD model, a velocity vector in the direction of the sensor and converting the velocity vector into a frequency; and summing the frequencies to yield Doppler ultrasound data.
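  • a minimal sketch of this synthesis loop is given below. It is written in C++ under stated assumptions: sampleVelocity() is a hypothetical stand-in for interpolation of the CFD velocity field, and the Gaussian weighting width and frequency-bin mapping are illustrative rather than the patent's actual implementation.

    #include <array>
    #include <cmath>
    #include <random>
    #include <vector>

    using Vec3 = std::array<double, 3>;

    // Hypothetical stand-in: interpolate the CFD velocity field at a point and
    // time (the real simulator reads this from the CFD mesh and flow velocity
    // datafiles via a geometric search).
    static Vec3 sampleVelocity(const Vec3& /*point*/, double /*time*/)
    {
        return {0.5, 0.0, 0.0};  // placeholder: 0.5 m/s along x
    }

    // Synthesize a Doppler power spectrum for one placement of the sample volume.
    std::vector<double> synthesizeDopplerSpectrum(
        const Vec3& centre, double radius, const Vec3& beamDir /*unit vector*/,
        double time, double f0 /*Hz*/, double c /*m/s*/,
        int nPoints, int nBins, double maxFreq /*Hz*/)
    {
        std::vector<double> spectrum(nBins, 0.0);
        std::mt19937 rng(42);
        std::uniform_real_distribution<double> u(-radius, radius);

        for (int i = 0; i < nPoints; ++i) {
            // Uniformly distributed random scatterer inside the spherical volume.
            Vec3 p;
            do {
                p = {u(rng), u(rng), u(rng)};
            } while (p[0]*p[0] + p[1]*p[1] + p[2]*p[2] > radius*radius);
            double r2 = p[0]*p[0] + p[1]*p[1] + p[2]*p[2];
            Vec3 q = {centre[0] + p[0], centre[1] + p[1], centre[2] + p[2]};

            // Velocity component along the virtual beam direction.
            Vec3 v = sampleVelocity(q, time);
            double vBeam = v[0]*beamDir[0] + v[1]*beamDir[1] + v[2]*beamDir[2];

            // Doppler equation: convert the beam-aligned velocity to a frequency.
            double fD = 2.0 * f0 * vBeam / c;

            // Gaussian power weighting with distance from the volume centre.
            double sigma = 0.5 * radius;
            double w = std::exp(-r2 / (2.0 * sigma * sigma));

            // Accumulate power into the corresponding frequency bin.
            int bin = static_cast<int>((fD / maxFreq * 0.5 + 0.5) * (nBins - 1));
            if (bin >= 0 && bin < nBins) spectrum[bin] += w;
        }
        return spectrum;
    }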
  • the synthesized ultrasound data is displayed thereby to render an image selected from the group comprising a spectrograph, a colour Doppler image, a power Doppler image, a B-mode image, a duplex ultrasound (DUS) image, a colour DUS image and a power Doppler DUS image.
  • the image is displayed at clinical frame rates in response to manipulation of the sensor. Frames of the image are manipulated to maintain synchronism between the displayed image and the cycle of anatomy encompassed by the target volume.
  • a Doppler audio signal is also generated and is represented by a summation of signals generated for each sampled point within the target volume.
  • an ultrasound simulator comprising a motion tracking device outputting position data when moved over a target volume; and processing structure communicating with said motion tracking device, said processing structure synthesizing ultrasound data using a computational phantom and said position data thereby to simulate said ultrasound examination.
  • the ultrasound simulator provides the operator with a realistic experience in the operation of a true clinical DUS system.
  • the library of carotid CFD models allows a variety of carotid conditions to be simulated, including hazardous high-grade stenosis, thereby enabling the operator to gain both practical and theoretical experience in assessing the health of a carotid artery via DUS.
  • Because Doppler ultrasound data, B-mode data or both Doppler ultrasound and B-mode data are synthesized on-the-fly, the subjectivity, machine parameters and measurement errors associated with using pre-recorded clinical data are avoided. Also, the development, capital and on-going costs associated with tissue-mimicking physical phantoms are avoided.
  • Figure 1 shows a duplex ultrasound simulator comprising a motion tracker, a mannequin and a computer;
  • Figure 2 shows a sensor and a control unit forming part of the motion tracker, a portion of the mannequin and conversion of motion tracker coordinate values from the motion tracker coordinate system to a computational fluid dynamics (CFD) model coordinate system via transformations T_FoB→sensor and T_CFD→FoB;
  • Figure 3 shows the general steps performed during Doppler ultrasound simulation
  • Figure 4 shows information processed and generated during ultrasound simulation
  • Figure 5 shows calibration of the DUS simulator
  • Figure 6 is a time-domain waveform for synthesized Doppler audio
  • Figure 7 shows a frame dropping algorithm employed by the DUS simulator
  • Figure 8 shows three cases handled by the frame dropping algorithm
  • Figure 9 shows B-mode ultrasound data simulation via texture mapping
  • Figure 10 shows the union of a sphere and a plane, and the intersection of a sphere and a plane, achieved via CSG using a stencil buffer
  • Figure 11 shows a DUS simulator image
  • Figure 12 shows a DUS simulator image with no beam steering and an insonation angle that is perpendicular to blood flow
  • Figure 13 shows the effects of positive and negative beam steering on the DUS simulator image
  • Figure 14 shows the effects of sample volume size on DUS simulator images.
  • the DUS simulator employs Doppler physics and clinically relevant carotid geometries and blood flow velocity fields together with B-mode physics and clinically relevant anatomical structures to simulate accurately a DUS system allowing colour DUS, power Doppler DUS, DUS, B-mode, colour Doppler, power Doppler and spectrograph images to be rendered at clinically relevant frame rates.
  • the DUS simulator also simulates the Doppler audio signal which sonographers listen to in order to detect abnormalities in blood flow. Embodiments of the DUS simulator will now be described with particular reference to Figures 1 to 14.
  • the DUS simulator 20 comprises a computer 22, external DUS controls 24 connected to the computer 22, a mannequin 26 in the shape of a torso upon which DUS examinations are performed and a motion tracker 27 such as that manufactured by Ascension Technology Corporation under the name of Flock of Birds ® (FoB).
  • the motion tracker 27 comprises a transmitter 28 embedded within the mannequin 26, a hand-held magnetically tracked, motion sensor 30 embedded in a shell resembling a conventional linear array ultrasound transducer and a control unit 32.
  • the external DUS controls 24 comprise a trackball 40 to provide the operator with fine control over sample volume placement and a programmable keypad 42 to provide the operator with dedicated control over Doppler parameters.
  • the trackball 40 or keypad 42 can be manipulated to enable the operator to steer the virtual beam, rotate an angle correction marker, adjust spectral gain, move and resize the colour box, change pulse repetition frequency (PRF) etc.
  • Other Doppler and B- mode settings may be adjusted using the external DUS controls 24.
  • the computer 22 executes software that enables realistic ultrasound images as well as Doppler audio to be generated in real-time and on-the-fly in response to movement of the motion sensor 30 over the mannequin 26 as will be described.
  • the computer 22 also stores a library of computational fluid dynamics (CFD) models representing a variety of carotid conditions including, for example, hazardous high-grade stenosis.
  • fluid flow is governed by a set of partial differential equations, the Navier-Stokes equations.
  • the complex domain (here, an artery or vein) must be divided into a volumetric "mesh" of contiguous, regular, three-dimensional shapes (finite elements), connected to each other at their corners (nodes).
  • a set of algebraic equations can be formulated, and solved simultaneously to obtain nodal flow velocities and pressures.
  • the nodal flow velocities are written to a series of datafiles, with each datafile corresponding to one time point during a cardiac cycle.
  • a separate datafile provides information about how the nodes are connected together (i.e., the mesh).
  • Each CFD model is therefore made up of CFD mesh and flow velocity datafiles. Together the CFD mesh and velocity datafiles allow the DUS simulator 20 to identify the flow velocity at any point in the artery or vein of interest.
  • a separate, structural mesh datafile describes the tissue surrounding the artery or vein. Because this surrounding tissue is not moving, velocity information is not required.
  • each structural mesh element has assigned ultrasound properties, such as for example acoustic impedances and attenuation coefficients, that are used by the DUS simulator 20 to determine the intensity of simulated B-mode ultrasound data.
  • the operator manipulates the hand-held motion sensor 30 over the mannequin 26.
  • the transmitter 28 in the mannequin 26 transmits position data which is received by the sensor 30.
  • the received position data is then conveyed to the control unit 32.
  • the control unit 32 continuously tracks the relative position and orientation of the sensor 30 as the sensor 30 is manipulated over the mannequin 26 and generates (x,y) coordinate values that are transformed to the motion tracker's coordinate system.
  • the x-vector of the sensor 30 represents the direction in which the virtual beam of the sensor is emitted (the axial direction on an image) and the y- vector represents the lateral direction of the virtual beam or the direction that the virtual beam is swept.
  • the motion tracker 27 in turn conveys the coordinate values to the computer 22.
  • the computer 22 upon receipt of the coordinate data further processes the received data to generate either a colour DUS, power Doppler DUS, DUS, B-mode, colour Doppler, power Doppler or spectrograph image depending on the selected mode of operation as well as to simulate Doppler audio.
  • the generated image is displayed on the display screen of the computer 22 on a graphical user interface that resembles a clinical DUS system thereby to provide the operator with a realistic simulation experience. Further specifics concerning the DUS simulator 20 will now be described.
  • in order to simulate Doppler ultrasound, the computer 22 combines a model of Doppler physics and a pulsatile velocity field that is derived from one of the CFD models, where nodal velocities throughout the mesh are solved using equations governing pulsatile blood flow.
  • Figure 3 shows the general steps performed by the computer 22 during Doppler ultrasound simulation. Initially, the sample volume is placed by the operator via manipulation of the trackball 40 at a known location within the CFD model. The sample volume comprises between 100 and 1000 sampling points, which are distributed uniformly and randomly throughout a spherical volume. This number of sampling points has been found to produce realistic spectrograph images.
  • each sampling point, or scatterer, is weighted according to its distance from the centre of the sample volume via a Gaussian power distribution, to account for non-uniform virtual beam intensity. Efficient velocity interpolation of each sampled point within the sample volume is made possible via the application of a fast, geometric searching algorithm. Once the true velocity value is determined, the scalar velocity component along the direction of the virtual beam is computed. The specified Doppler angle is then applied to solve for the corrected Doppler frequency or velocity.
  • the effects of spectral broadening due to both geometric and transit-time broadening are empirically applied by convolving the Doppler frequency with an intrinsic spectral broadening function. This serves to give the spectrographs a realistic, "smeared" appearance. These steps are performed to produce a power versus velocity spectrum for each time step within a full cardiac cycle. From this, a spectrograph is produced, and characteristic information such as mean and peak velocities are derived.
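  • as a minimal illustration of the broadening step just described (assuming, for the sketch only, a Gaussian intrinsic spectral broadening kernel; the patent does not give the kernel's exact form), the computed power spectrum can be smeared as follows:

    #include <cmath>
    #include <vector>

    // Convolve a power-vs-frequency spectrum with a Gaussian broadening kernel.
    // sigmaBins controls the amount of intrinsic spectral broadening, in bins.
    std::vector<double> broadenSpectrum(const std::vector<double>& spectrum,
                                        double sigmaBins)
    {
        const int n = static_cast<int>(spectrum.size());
        const int halfWidth = static_cast<int>(std::ceil(3.0 * sigmaBins));
        std::vector<double> out(n, 0.0);
        for (int i = 0; i < n; ++i) {
            if (spectrum[i] == 0.0) continue;
            for (int k = -halfWidth; k <= halfWidth; ++k) {
                int j = i + k;
                if (j < 0 || j >= n) continue;
                double w = std::exp(-0.5 * (k / sigmaBins) * (k / sigmaBins));
                out[j] += spectrum[i] * w;  // smear power into neighbouring bins
            }
        }
        return out;
    }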
  • Figure 4 shows information processed and generated by the computer 22 during ultrasound simulation.
  • the computer 22 uses the CFD mesh, flow velocity and structural mesh datafiles as well as the motion tracker coordinate value output.
  • the output coordinate values and datafiles are used to compute spectral data at the sensor's location as described above.
  • the DUS simulator outputs either a B- mode image, a colour DUS image, a DUS image, a power Doppler DUS image, a power Doppler image, a colour Doppler image or a spectrograph that is rendered in real-time on the display of the computer 22.
  • Real-time Doppler audio is also produced at the desired location.
  • the position and orientation of the sensor 30 in terms of the CFD model and structural mesh coordinates is required. This involves performing two coordinate transformations to map the coordinate systems of the motion tracker 27 and the CFD model, as shown in Figure 2.
  • One mapping determines the position and orientation of the sensor 30 relative to the transmitter 28, and is established via the transformation matrix T_FoB→sensor. This mapping is performed by the control unit 32.
  • a second transformation matrix, T_CFD→FoB, is applied to complete the conversion to CFD model coordinates. Calibration of the CFD model over a sample volume representing the anatomy under examination is however required.
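  • expressed compactly, a point reported in sensor coordinates is mapped into CFD model coordinates by composing the two transformations: $p_{CFD} = T_{CFD \leftarrow FoB}\, T_{FoB \leftarrow sensor}\, p_{sensor}$.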
  • the control unit 32 of the motion tracker 27 outputs coordinate values representing the positions of the fiducial markers in the motion tracker's coordinate system or space. As the positions of these fiducial markers in the CFD model coordinate system are known, from these two sets of coordinate values a mapping of the motion tracker and CFD model spaces can be obtained.
  • Figure 5 shows placement of the fiducial markers on a neck mannequin.
  • the fiducial markers are chosen to be noncollinear so that a similarity transformation can be performed, where a translation, scaling and rotation is applied so that all angles and changes in distances are preserved between the CFD model and motion tracker coordinate systems. This is achieved using a least-squares approximation, although those of skill in the art will appreciate that other approximations may be used.
  • three fiducial markers are placed on the surface of the mannequin for virtual examination of the left carotid artery.
  • the first fiducial marker is placed on the left side of the neck mannequin on the same plane as the bifurcation apex to position correctly the CFD model when the first marker is probed with the sensor 30.
  • the second fiducial marker is placed at a known distance below the first fiducial marker to orient the long axis and scale the CFD model when the second marker is probed with the sensor 30.
  • the third fiducial marker is placed at a point that was one quarter of the neck's perimeter from the first fiducial marker in the clockwise direction, along the circumference of the neck mannequin to correctly rotate the CFD model when the third marker is probed with the sensor 30.
  • the colour DUS image is composed of an array of sample volumes whose corresponding pixels characterize the spectral mean velocity of that sample volume.
  • a colour Doppler image is superimposed over a simulated grayscale B-mode image.
  • the spectral mean at a time interval is defined by the following equation as:
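  • the equation itself does not survive in this copy of the text; the standard power-weighted form consistent with the description (an assumption as to the patent's exact expression) is $\bar{v}(t) = \sum_i P_i(t)\, v_i(t) \,/\, \sum_i P_i(t)$, where $v_i$ and $P_i$ are the velocity and power of the $i$-th spectral component at time $t$.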
  • the colour Doppler image is defined as the cross-product of the axial and lateral directions of the motion sensor 30, restricted to the height and width of the colour box.
  • Each sample volume is colour encoded using a lookup table consisting of one- hundred and seven (107) colour shades that was extracted from a present-day ultrasound machine.
  • Red and yellow hues indicate blood flow towards the sensor, and by convention represent blood flow towards the brain.
  • Blue and cyan hues indicate blood flow away from the sensor, and by convention represent blood flow away from the brain. Darker colours characterize slower blood flow (red or blue), while brighter colours indicate faster blood flow (yellow or cyan).
  • a power Doppler DUS image is simulated in a similar manner, however, power information is displayed in place of velocities.
  • the spectral analyzer of an ultrasound system collects short periods of a signal, and via a fast Fourier transform (FFT), extracts the relative frequencies and amplitudes contributing to that signal.
  • Each block of analyzed data shows up as a vertical line on the spectral display, and comprises a number of frequency bins that depend on the chosen FFT size.
  • the power contained in each frequency bin is encoded as the intensity of the corresponding pixels.
  • Temporal resolutions for spectral displays vary from 20 to 200 Hz, where each block of data would have typically been analyzed via a 128 or 256 point FFT.
  • the computer 22 generates spectrographs using a combination of
  • a real spectral display is mimicked by creating a two-dimensional image consisting of 256 frequency bins along the vertical axis and 400 time intervals along the horizontal axis. The 400 time intervals extend over four seconds, which is long enough to display approximately four cardiac cycles.
  • a grey-scale lookup table consisting of 128 entries of black, white and shades of grey, is applied to linearly encode the power values in the spectrograph.
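  • a minimal sketch of this display construction follows (C++; the linear power scaling and the maxPower normalisation are illustrative assumptions, as the patent does not specify the exact mapping):

    #include <algorithm>
    #include <array>
    #include <vector>

    constexpr int kFreqBins   = 256;  // vertical axis of the spectral display
    constexpr int kTimeCols   = 400;  // horizontal axis: 400 intervals over 4 s
    constexpr int kGreyLevels = 128;  // entries in the grey-scale lookup table

    // spectra[t][f] holds broadened power for time column t and frequency bin f.
    std::vector<std::array<unsigned char, kFreqBins>>
    renderSpectrograph(const std::vector<std::array<double, kFreqBins>>& spectra,
                       double maxPower)
    {
        std::vector<std::array<unsigned char, kFreqBins>> image(kTimeCols);
        for (int t = 0; t < kTimeCols && t < static_cast<int>(spectra.size()); ++t) {
            for (int f = 0; f < kFreqBins; ++f) {
                // Linearly encode power as an index into the grey-scale LUT.
                double norm = std::clamp(spectra[t][f] / maxPower, 0.0, 1.0);
                int level = static_cast<int>(norm * (kGreyLevels - 1));
                image[t][f] = static_cast<unsigned char>(level * 255 / (kGreyLevels - 1));
            }
        }
        return image;
    }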
  • Synthesizing B-mode ultrasound images in the DUS simulator 20 involves use of a semi-empirical acoustic field model which is applied onto a number of randomly distributed scatterers located within a precomputed, anthropomorphic, computational volume mesh.
  • ultrasound waves are transmitted from the transducer into the body along what is known as the axial direction. These waves are reflected back at varying intensities depending on the acoustic impedances encountered, the attenuation coefficient of the encountered tissue, and the depth from which the echo originated. The time at which echoes are returned to the transducer determines the depth from which the echoes were produced.
  • As the ultrasound beam is swept laterally across the elements of the transducer face, the corresponding columns of pixels in the frame buffer are rendered, until a two-dimensional image is produced.
  • the motion tracker 27 continuously outputs positional and orientation information as the sensor 30 is manipulated, from which a virtual beam is computed. Scatterers within the structural mesh phantom are then sampled at discrete locations along each virtual beam. Tissue properties such as acoustic impedance and attenuation coefficient are known at each of the scatterer locations throughout the structural mesh. The arrangement of acoustic impedances and attenuation coefficients along the axial direction of the virtual beam, along with the depth of the sampled scatterers, determine the signal intensity of the corresponding pixel.
  • I(d) is the intensity; a is the attenuation coefficient; f is the transmit frequency; and d is the round trip distance of the virtual ultrasound wave.
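  • the intensity relation itself is not reproduced in this copy; a standard exponential attenuation model consistent with these symbols (an assumption as to the patent's exact form, writing the attenuation coefficient as $\alpha$) is $I(d) = I_0\, e^{-\alpha f d}$, with $I_0$ the transmitted intensity.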
  • Speckle is modeled by applying Rician noise along the virtual beam paths. As speckle is spatially-correlated, this noise is spatially convolved with a Gaussian distribution in the lateral direction.
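  • a minimal sketch of this speckle model follows (C++; the noise level, the 5-tap lateral kernel and the image layout are illustrative assumptions, not the patent's stated parameters):

    #include <cmath>
    #include <random>
    #include <vector>

    // image[beam][sample]: intensities along each virtual beam (column).
    // Apply Rician-distributed speckle along each beam, then smooth across
    // beams with a small Gaussian kernel so the noise is spatially correlated.
    void applySpeckle(std::vector<std::vector<double>>& image, double noiseSigma)
    {
        std::mt19937 rng(7);
        std::normal_distribution<double> gauss(0.0, noiseSigma);

        // Rician noise: magnitude of the signal plus complex Gaussian noise.
        for (auto& beam : image)
            for (auto& sample : beam) {
                double re = sample + gauss(rng);
                double im = gauss(rng);
                sample = std::sqrt(re * re + im * im);
            }

        // Lateral smoothing with a 5-tap Gaussian kernel.
        const double kernel[5] = {0.06, 0.24, 0.40, 0.24, 0.06};
        const int nBeams = static_cast<int>(image.size());
        if (nBeams == 0) return;
        const int nSamples = static_cast<int>(image[0].size());
        std::vector<std::vector<double>> smoothed = image;
        for (int b = 0; b < nBeams; ++b)
            for (int s = 0; s < nSamples; ++s) {
                double acc = 0.0, wsum = 0.0;
                for (int k = -2; k <= 2; ++k) {
                    int bb = b + k;
                    if (bb < 0 || bb >= nBeams) continue;
                    acc  += kernel[k + 2] * image[bb][s];
                    wsum += kernel[k + 2];
                }
                smoothed[b][s] = acc / wsum;
            }
        image.swap(smoothed);
    }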
  • Real-time rendering of the simulated B-mode images is accomplished by exploiting the graphics processing unit (GPU) of the computer 22 to compute and write pixel values directly. This is beneficial since current graphics cards contain multiple processors to render pixels in parallel, and bandwidth limitations between computer memory and the memory of the graphics card are circumvented. These processors are accessed via a pixel shader program, which is a function that computes effects on a per-pixel basis. Computation of intensity values and rendering of each pixel is executed largely in parallel.
  • texture mapping can be employed.
  • pre-acquired three-dimensional textures of various tissues, and an anthropomorphic computational volume mesh are texture mapped.
  • texture mapping is often applied when modelling a detailed scene where the number of polygons required to produce an image becomes too great and impractical. Similar to pasting wallpaper onto a white wall rather than drawing out an exact pattern by hand, texture mapping allows for a digital image to be pasted onto a single polygon rather than having to model the image explicitly.
  • a three-dimensional volume mesh whose nodal values contain tissue information such as for example, that of blood, fat, or a calcified vessel, at their respective locations, is provided.
  • the sensor 30 supplies the plane normal of the anatomical slice that is displayed on computer screen.
  • a set of pre-recorded three-dimensional textures, which are preacquired from a real ultrasound system, is also supplied. These textures contain the B-mode representations of various tissues found in the anatomy. Each of these textures comprises a volume encompassing the entire mesh.
  • the smallest unit within a texture is known as a "texel", analogous to a pixel being the smallest element within an image. Every texel within a texture has an associated weighting, which indicates the amount of tissue present at that location. The assignment of weights is described in further detail below. In order to achieve real-time performance, speedy access to the weights is desired. This is achieved by storing the weighting information on the graphics card of the computer 22 in RGBA format, which consists of a red, green, and blue channel and an additional transparency channel (known as alpha). These channels are used to store the weightings of each tissue texture so that quick access to these weightings is achieved.
  • the center point of every texel within each of the textures is firstly probed.
  • the element that contains this point is found within the volume mesh using an efficient geometric search algorithm.
  • the interpolated nodal values indicate the type of tissue present at the sampled point location, and the tissue type is used to assign the weightings at that texel. For instance, if the point corresponds to fat tissue, then the respective texel within the texture corresponding to fat tissue will record a full weighting of 1.0, while the other textures will record a weighting of 0.0.
  • this pre-processing step serves to record the tissue information, as weights in the textures.
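  • a minimal sketch of this pre-processing pass follows (C++; the tissue enumeration, texel grid parameters and tissueAt() lookup are hypothetical stand-ins for the mesh search and nodal interpolation described above):

    #include <array>
    #include <cstdint>
    #include <vector>

    enum class Tissue { Blood = 0, Fat = 1, Muscle = 2, CalcifiedVessel = 3 };

    // Hypothetical stand-in for the geometric search / nodal interpolation that
    // returns the tissue type at a point inside the structural volume mesh.
    static Tissue tissueAt(double /*x*/, double /*y*/, double /*z*/)
    {
        return Tissue::Fat;  // placeholder
    }

    // For every texel centre, record a full weighting (1.0, stored as 255) in
    // the RGBA channel of the tissue found there and zero in the other channels.
    std::vector<std::array<std::uint8_t, 4>>
    buildTissueWeights(int nx, int ny, int nz, double dx, double dy, double dz)
    {
        std::vector<std::array<std::uint8_t, 4>> rgba(
            static_cast<std::size_t>(nx) * ny * nz);  // zero-initialized
        for (int k = 0; k < nz; ++k)
            for (int j = 0; j < ny; ++j)
                for (int i = 0; i < nx; ++i) {
                    // Probe the centre point of the texel.
                    Tissue t = tissueAt((i + 0.5) * dx, (j + 0.5) * dy, (k + 0.5) * dz);
                    auto& texel = rgba[(static_cast<std::size_t>(k) * ny + j) * nx + i];
                    texel[static_cast<int>(t)] = 255;  // full weighting of 1.0
                }
        return rgba;  // uploaded to the graphics card as an RGBA texture
    }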
  • Figure 9 illustrates this process.
  • the coordinates of the ultrasound image plane are determined.
  • the resolution of the image is set equivalent to that of an actual B-mode image. Every pixel within this image is then probed for its mesh coordinate value, which serves as an index into each of the textures. Each texture returns its weighting at the probed texel, and the resulting shade is applied. Once all pixels have been colour-coded, the scene is rendered and displayed yielding a realistic B-mode simulation requiring little computation.
  • constructive solid geometry is used to synthesize and render B-mode ultrasound.
  • constructive solid geometry (CSG) objects are created via a stencil buffer, to efficiently outline various anatomical structures present in the ultrasound image plane. Pixels on the plane are colour-coded using pre-acquired textures derived from true B-mode ultrasound images.
  • CSG is a technique which allows for the combination of objects (where objects refer to closed polygonal surfaces), using Boolean set operators, such as union, difference and intersection, to create more complex objects.
  • an intersection operation is performed between the ultrasound image plane and a number of anthropomorphic surface meshes which model the morphology of all tissues found in the region.
  • the image plane is texture mapped with realistic B-mode images which are preacquired from an actual ultrasound machine. The normal and position of the ultrasound image plane are provided by the output of the sensor.
  • the intersection operation described above is performed by means of the stencil buffer, which is among the several buffers that reside on the computer graphics card.
  • the stencil buffer can be employed in an analogous manner as a real world stencil, or outline.
  • a stencil test compares the value in the stencil buffer to a reference value and determines whether a pixel is eliminated or not, hence acting as a mask. This test is first set up by disabling colour bits from being written to the frame buffer, so that draw calls are not displayed on the screen. The front face of the intersecting plane is then drawn, or written to the frame buffer.
  • the next draw call is to the front face of the surface mesh, where the first stencil test is performed: glEnable(GL_STENCIL_TEST); glStencilFunc(GL_ALWAYS, 0, 0); glStencilOp(GL_KEEP, GL_KEEP, GL_INCR);
  • bits in the stencil buffer are incremented wherever the front face of the surface mesh is drawn.
  • the next call decrements values in the stencil buffer wherever the back face of the surface mesh is drawn: glStencilOp(GL_KEEP, GL_KEEP, GL_DECR);
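  • putting the quoted calls together, a minimal sketch of the complete intersection pass is given below (legacy OpenGL; drawPlane() and drawMesh() are placeholders for the application's own draw routines, and the simulator's exact sequence may differ):

    #include <GL/gl.h>

    // Placeholders for the application's own draw routines.
    static void drawPlane() { /* draw the texture-mapped ultrasound image plane */ }
    static void drawMesh()  { /* draw one closed tissue surface mesh */ }

    void drawPlaneMeshIntersection()
    {
        glClear(GL_DEPTH_BUFFER_BIT | GL_STENCIL_BUFFER_BIT);

        // Write the image plane's depth only; colour writes disabled.
        glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
        glEnable(GL_DEPTH_TEST);
        drawPlane();

        // Count mesh faces lying in front of the plane: +1 per front face,
        // -1 per back face.  A non-zero count marks plane pixels inside the mesh.
        glEnable(GL_STENCIL_TEST);
        glEnable(GL_CULL_FACE);
        glDepthMask(GL_FALSE);

        glStencilFunc(GL_ALWAYS, 0, 0);
        glStencilOp(GL_KEEP, GL_KEEP, GL_INCR);
        glCullFace(GL_BACK);                     // draw front faces only
        drawMesh();

        glStencilOp(GL_KEEP, GL_KEEP, GL_DECR);
        glCullFace(GL_FRONT);                    // draw back faces only
        drawMesh();

        // Re-enable colour writes and draw the texture-mapped plane only where
        // the stencil count is non-zero (inside the tissue surface mesh).
        glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
        glDepthMask(GL_TRUE);
        glDisable(GL_DEPTH_TEST);
        glStencilFunc(GL_NOTEQUAL, 0, 0xFFFFFFFFu);
        glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
        drawPlane();

        glDisable(GL_STENCIL_TEST);
        glDisable(GL_CULL_FACE);
    }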
  • Figure 10 shows an example of the stencilling procedure described above used to display the intersection between a plane and a sphere.
  • the virtual DUS simulator involves the intersection of the provided anthropomorphic surface meshes and the ultrasound image plane.
  • the surface meshes provided represent surfaces of the tissues occurring throughout the anatomical region. Each surface mesh belongs to a certain tissue type such as for example, fat.
  • the above stencilling procedure is then executed on all surface meshes that belong to a particular tissue type.
  • the CFD model velocity field information is contained in a number of time-steps that make up one complete heart cycle period. Hence, new velocities that are rendered at every frame update are derived from a particular time-step from the velocity field data.
  • a frame-dropping algorithm is employed. As will be appreciated, render speeds and thus frame rates, may fluctuate depending on computation times and therefore might not be in sync with the hypothetical heart rate of the CFD model.
  • the frame-dropping algorithm ensures that frames are rendered in real-time irrespective of variations in the underlying computation time.
  • the frame dropping algorithm either discards or appends frames that will cause the display to become out of sync depending on whether the software is ahead of or behind "schedule".
  • Figures 7 and 8 illustrate an example of what is meant by ahead of and behind schedule.
  • “Elapsed time” is the true time, or the time dictated by the actual computer CPU wall time that has passed (i.e. the physical time that has passed, as opposed to the number of CPU clock cycles) and “theoretical time” is the time dictated by the program's next scheduled frame number.
  • if frame number seven is due, but enough CPU wall time has passed that frame number eight should now be rendered, then the program is behind schedule, and in this case frame number seven is discarded.
  • the last rendered frame will be conserved until it is time for the next frame.
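  • a minimal sketch of this scheduling logic follows (C++; class and member names are illustrative, not taken from the patent):

    #include <chrono>

    // Choose which CFD time-step frame to render, based on elapsed wall-clock
    // time: frames are dropped when rendering falls behind schedule, and the
    // previous frame is held when rendering runs ahead of schedule.
    class FrameScheduler {
    public:
        FrameScheduler(int framesPerCycle, double cyclePeriodSeconds)
            : framesPerCycle_(framesPerCycle),
              framePeriod_(cyclePeriodSeconds / framesPerCycle),
              start_(std::chrono::steady_clock::now()),
              nextFrame_(0) {}

        int frameToRender()
        {
            double elapsed = std::chrono::duration<double>(
                std::chrono::steady_clock::now() - start_).count();
            int dueFrame = static_cast<int>(elapsed / framePeriod_);

            if (dueFrame >= nextFrame_) {
                // On or behind schedule: jump to the due frame, discarding any
                // frames skipped while computation ran long.
                nextFrame_ = dueFrame + 1;
                return dueFrame % framesPerCycle_;
            }
            // Ahead of schedule: hold the previously rendered frame.
            int last = nextFrame_ > 0 ? nextFrame_ - 1 : 0;
            return last % framesPerCycle_;
        }

    private:
        int framesPerCycle_;
        double framePeriod_;
        std::chrono::steady_clock::time_point start_;
        int nextFrame_;
    };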
  • CFD model velocities are sampled at a cloud of randomly distributed points within a predefined sample volume power distribution.
  • Each velocity is converted to a Doppler frequency via the Doppler equation, weighted according to the power at its sample volume location, and convolved with an intrinsic spectral broadening function.
  • Spectra are constructed at discrete times, with the velocity sampling points randomly distributed within a nominal temporal window Δt.
  • each point within the CFD velocity field is sampled at some time t0 at a random location within the sample volume, and a velocity, v, is returned which is converted to a Doppler frequency, f. From this, the audio waveform basis function expressed below is constructed:
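  • the basis function itself is not reproduced in this copy; a plausible form consistent with the surrounding description (an assumption, not the patent's exact expression) is a tone at the Doppler frequency gated over the transit time: $b(t) = \sqrt{P}\,\cos\big(2\pi f (t - t_0) + \phi\big)$ for $t_0 \le t \le t_0 + T$ and zero otherwise, where $P$ is the power weighting of the sampled point and $\phi$ a random phase.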
  • T is the time required to traverse the sample volume, which is assumed to be the nominal sample volume diameter divided by the velocity.
  • waveform basis functions are generated at each broadened frequency and their associated powers are summed together.
  • a continuous audio signal is built representing the summation of signal contributions from every point within the sample volume.
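  • a minimal sketch of this summation follows (C++; the gated-cosine basis is the assumed form given above, and the random phase and parameter layout are illustrative):

    #include <cmath>
    #include <random>
    #include <vector>

    // Sum one gated-cosine basis function per sampled point into an audio buffer.
    // freqs, powers, startTimes and durations each hold one entry per point.
    std::vector<float> synthesizeDopplerAudio(
        const std::vector<double>& freqs, const std::vector<double>& powers,
        const std::vector<double>& startTimes, const std::vector<double>& durations,
        double sampleRate, double totalSeconds)
    {
        const double pi = 3.14159265358979323846;
        std::vector<float> audio(static_cast<std::size_t>(sampleRate * totalSeconds), 0.0f);
        std::mt19937 rng(11);
        std::uniform_real_distribution<double> phase(0.0, 2.0 * pi);

        for (std::size_t i = 0; i < freqs.size(); ++i) {
            double phi = phase(rng);
            std::size_t first = static_cast<std::size_t>(startTimes[i] * sampleRate);
            std::size_t last  = static_cast<std::size_t>((startTimes[i] + durations[i]) * sampleRate);
            for (std::size_t n = first; n < last && n < audio.size(); ++n) {
                double t = n / sampleRate - startTimes[i];  // time within the gate
                audio[n] += static_cast<float>(
                    std::sqrt(powers[i]) * std::cos(2.0 * pi * freqs[i] * t + phi));
            }
        }
        return audio;
    }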
  • the DUS simulator 20 allows the operator to steer the virtual beam at three angles, +20°, 0°, -20°. Multiple steering angles are incorporated into the application by rotating the "axial direction" vector of the sensor 30 (i.e. its x-axis) about the slicing plane normal, by the steering angle as shown in Figures 12 and 13. The new vector is used to calculate the blood velocity component along the axial direction. As can be seen in Figure 13, both the colour DUS and spectrograph images are correctly updated.
  • the DUS simulator 20 starts with a 60° default angle between the virtual beam and angle correction marker but allows the angle correction marker to be rotated at increments of 2° from -70° to +70°, as is permitted in conventional ultrasound systems. Adjustment of the angle correction marker then alters the Doppler angle which is used for the derivation of velocities displayed on the spectrogram.
  • the system can support any starting angle and incremental rotation.
  • the spectrograph utilizes a grey scale lookup table. Computed power values serve as indices into the lookup table. By adjusting the grey scale levels, or the scalar range to which the colours are mapped, the basic use of the spectral gain feature, i.e. to vary the strength of the backscattered signal can be mimicked.
  • the operator is allowed to move and resize the colour box. In doing so, the frame rate and pulse repetition frequency (PRF) may be affected.
  • c represents the speed of sound in
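  • the relation this fragment belongs to is not reproduced; the standard limit on pulse repetition frequency imposed by imaging depth, which the context suggests (an assumption as to the patent's exact form), is $PRF_{max} = c / (2 d_{max})$, where $d_{max}$ is the maximum depth of the colour box and $c$ the speed of sound in the tissue.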
  • the DUS simulator 20 allows for interactivity via the keyboard and trackball and detects certain keyboard and trackball movement events. As an example, Figure 11 shows the list of keyboard events and their associated functions.
  • the sample volume marker may be moved about the display screen via the trackball and keyboard for selection of a region to be viewed in spectrogram mode.
  • the gate size of the sample volume may also be increased or decreased as shown in Figure 14.
  • the left side shows a 1 mm sample volume that yields a clean spectrograph with little spectral broadening.
  • the right side shows an enlarged sample volume that produces a broader spectrum. Since, in ultrasonography, the operator only has control of the axial size of the sample volume, i.e. along the beam direction, an oblate shaped sample volume with radii equivalent to the standard deviation of the Gaussian power distribution along the respective axes is employed.
  • the PRF of a Doppler ultrasound system is primarily associated with the velocity or frequency scale of the colour flow map or spectrogram. Depending on the system's scanhead, various sets of PRF ranges are available for the sonographer. For both CDUS and spectrogram images, various PRFs are provided for and in this example include 3500 Hz, 5000 Hz, 6250 Hz, 8333 Hz, 10000 Hz, 11950 Hz, and
  • where f_Dmax is the Doppler frequency, f_0 is the transmit frequency, V_max is the maximum velocity limit seen on the velocity scale, θ is the Doppler angle, and c is the speed of ultrasound in blood.
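  • the equation to which these symbols belong is not reproduced; the standard aliasing relation they suggest (an assumption as to the patent's exact form) is $f_{Dmax} = \frac{2 f_0 V_{max} \cos\theta}{c} \le \frac{PRF}{2}$, or equivalently $V_{max} = \frac{c \cdot PRF}{4 f_0 \cos\theta}$.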
  • Settings that modify B-mode ultrasound on typical ultrasound systems are mimicked. This includes, but is not limited to i) gain level, ii) time-gain compensation, and iii) dynamic range.
  • the DUS simulator provides a strong tool for the advancement of current diagnosis protocols for the widespread problem of carotid disease. By realistically simulating a DUS examination, operators are able to gain useful experience that translates directly to real life DUS examinations. Signal processing techniques for improving and expanding the realm of information obtainable from Doppler ultrasound can be tested and analyzed. The DUS simulator also opens the door for the discovery of new and better risk indicators for stroke using Doppler ultrasound.

Landscapes

  • Physics & Mathematics (AREA)
  • Analytical Chemistry (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Signal Processing (AREA)
  • Engineering & Computer Science (AREA)
  • Acoustics & Sound (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)

Abstract

The present invention relates to a method of simulating an ultrasound examination comprising synthesizing ultrasound data using a computational phantom, and coupling the simulated ultrasound data to motion of a sensor manipulated over a target volume, thereby to simulate the ultrasound examination.
PCT/CA2007/000370 2006-03-07 2007-03-07 Simulateur d'ultrasons et procédé pour simuler un examen par ultrasons Ceased WO2007101346A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US77941806P 2006-03-07 2006-03-07
US60/779,418 2006-03-07

Publications (1)

Publication Number Publication Date
WO2007101346A1 true WO2007101346A1 (fr) 2007-09-13

Family

ID=38474572

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA2007/000370 Ceased WO2007101346A1 (fr) 2006-03-07 2007-03-07 Simulateur d'ultrasons et procédé pour simuler un examen par ultrasons

Country Status (1)

Country Link
WO (1) WO2007101346A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014134289A1 (fr) * 2013-03-01 2014-09-04 Heartflow, Inc. Procédé et système pour déterminer des traitements en modifiant des modèles géométriques spécifiques aux patients
US10573200B2 (en) 2017-03-30 2020-02-25 Cae Healthcare Canada Inc. System and method for determining a position on an external surface of an object
EP3071113B1 (fr) * 2013-11-21 2020-07-29 Samsung Medison Co., Ltd. Procédé et appareil d'affichage d'image ultrasonore

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5230339A (en) * 1991-06-13 1993-07-27 Array Tech, Inc. Performance evaluation of ultrasonic examination equipment
US6117075A (en) * 1998-09-21 2000-09-12 Meduck Ltd. Depth of anesthesia monitor
US6193657B1 (en) * 1998-12-31 2001-02-27 Ge Medical Systems Global Technology Company, Llc Image based probe position and orientation detection
US6210168B1 (en) * 1998-03-16 2001-04-03 Medsim Ltd. Doppler ultrasound simulator

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5230339A (en) * 1991-06-13 1993-07-27 Array Tech, Inc. Performance evaluation of ultrasonic examination equipment
US6210168B1 (en) * 1998-03-16 2001-04-03 Medsim Ltd. Doppler ultrasound simulator
US6117075A (en) * 1998-09-21 2000-09-12 Meduck Ltd. Depth of anesthesia monitor
US6193657B1 (en) * 1998-12-31 2001-02-27 Ge Medical Systems Global Technology Company, Llc Image based probe position and orientation detection

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014134289A1 (fr) * 2013-03-01 2014-09-04 Heartflow, Inc. Procédé et système pour déterminer des traitements en modifiant des modèles géométriques spécifiques aux patients
US9449146B2 (en) 2013-03-01 2016-09-20 Heartflow, Inc. Method and system for determining treatments by modifying patient-specific geometrical models
US10390885B2 (en) 2013-03-01 2019-08-27 Heartflow, Inc. Method and system for determining treatments by modifying patient-specific geometrical models
US11185368B2 (en) 2013-03-01 2021-11-30 Heartflow, Inc. Method and system for image processing to determine blood flow
US11564746B2 (en) 2013-03-01 2023-01-31 Heartflow, Inc. Method and system for image processing to determine blood flow
US11869669B2 (en) 2013-03-01 2024-01-09 Heartflow, Inc. Method and system for image processing to model vasculasture
EP3071113B1 (fr) * 2013-11-21 2020-07-29 Samsung Medison Co., Ltd. Procédé et appareil d'affichage d'image ultrasonore
US10573200B2 (en) 2017-03-30 2020-02-25 Cae Healthcare Canada Inc. System and method for determining a position on an external surface of an object

Similar Documents

Publication Publication Date Title
KR101717695B1 (ko) 의료 영상의 시뮬레이션
EP3563769B1 (fr) Procédé et système à ultrasons pour imagerie d'élasticité d'ondes de cisaillement
US11471130B2 (en) Method and ultrasound system for shear wave elasticity imaging
US11006926B2 (en) Region of interest placement for quantitative ultrasound imaging
JP6297085B2 (ja) 関心ボリュームの超音波イメージングのための超音波イメージングシステムおよびその作動方法
Burger et al. Real-time GPU-based ultrasound simulation using deformable mesh models
JP5567502B2 (ja) 医療用訓練方法及び装置
US20110125016A1 (en) Fetal rendering in medical diagnostic ultrasound
Aiger et al. Real-time ultrasound imaging simulation
US20170032702A1 (en) Method and Apparatus For Generating an Ultrasound Scatterer Representation
WO2007100263A1 (fr) Procédé de simulation d'images ultrasonores
US6458082B1 (en) System and method for the display of ultrasound data
CN102481141B (zh) 基于多普勒的流量测量
WO2007101346A1 (fr) Simulateur d'ultrasons et procédé pour simuler un examen par ultrasons
CN114173673A (zh) 使用图像数据进行的超声系统声学输出控制
EP3843637B1 (fr) Système à ultrasons et procédés d'élastographie intelligente à onde de cisaillement
Hirji et al. Real-time and interactive virtual Doppler ultrasound
Rohling et al. Issues in 3-D free-hand medical ultrasound imaging
Wiesauer Methodology of three-dimensional ultrasound
Petrinec Patient-specific interactive ultrasound image simulation based on the deformation of soft tissue
Petrinec Patient-specific interactive ultrasound image simulation with soft-tissue deformation
Gjerald et al. Real-time ultrasound simulation for low cost training simulators
Wei Distance Estimation for 3D Freehand Ultrasound-Scans with Visualization System
HK1160697A (en) Simulation of medical imaging
Karadayi Study on error and image quality degradation in three-dimensional ultrasound imaging with a mechanical probe

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 07710704

Country of ref document: EP

Kind code of ref document: A1