WO2015092628A1 - Ultrasound imaging systems and methods for tracking locations of an invasive medical device - Google Patents
- Publication number
- WO2015092628A1 (PCT/IB2014/066788)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- invasive medical
- medical device
- array transducer
- scan plane
- ultrasound
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Clinical applications
- A61B8/0833—Clinical applications involving detecting or locating foreign bodies or organic structures
- A61B8/0841—Clinical applications involving detecting or locating foreign bodies or organic structures for locating instruments
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/44—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
- A61B8/4483—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device characterised by features of the ultrasound transducer
- A61B8/4488—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device characterised by features of the ultrasound transducer the transducer being a phased array
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/46—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
- A61B8/461—Displaying means of special interest
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B10/00—Instruments for taking body samples for diagnostic purposes; Other methods or instruments for diagnosis, e.g. for vaccination diagnosis, sex determination or ovulation-period determination; Throat striking implements
- A61B10/02—Instruments for taking cell samples or for biopsy
- A61B10/0233—Pointed or sharp biopsy instruments
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods
- A61B17/34—Trocars; Puncturing needles
- A61B17/3403—Needle locating or guiding means
- A61B2017/3413—Needle locating or guiding means guided by ultrasound
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/42—Details of probe positioning or probe attachment to the patient
- A61B8/4245—Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/89—Sonar systems specially adapted for specific applications for mapping or imaging
- G01S15/8906—Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
- G01S15/8993—Three dimensional imaging systems
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/52—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
- G01S7/52017—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
- G01S7/52053—Display arrangements
- G01S7/52057—Cathode ray tube displays
- G01S7/52073—Production of cursor lines, markers or indicia by electronic means
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/52—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
- G01S7/52017—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
- G01S7/52053—Display arrangements
- G01S7/52057—Cathode ray tube displays
- G01S7/52074—Composite displays, e.g. split-screen displays; Combination of multiple images or of images and alphanumeric tabular information
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Biomedical Technology (AREA)
- Biophysics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Pathology (AREA)
- Radiology & Medical Imaging (AREA)
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Gynecology & Obstetrics (AREA)
- Ultra Sonic Diagnosis Equipment (AREA)
Abstract
A diagnostic ultrasound system is provided which acquires volume image data by sub-sampling a volumetric region with a low (widely spaced) scan plane density. A transducer can be pointed in a section of the volumetric region that includes an invasive medical device, and the volumetric region can be quickly scanned and used to identify the invasive medical device. The system locates a scan plane of an array transducer that is aligned with the invasive medical device. An ultrasound image of the invasive medical device and surrounding tissue is displayed and used to guide the device during an interventional procedure. In addition, the sub-sampled volumetric region is repeatedly scanned so as to provide a real-time location of the invasive device and to continually show the device in plane by shifting which scan plane is being imaged by the array transducer.
Description
ULTRASOUND IMAGING SYSTEMS AND METHODS FOR TRACKING LOCATIONS OF AN INVASIVE MEDICAL DEVICE
The present application claims the benefit of U.S. Provisional Patent Application No. 61/918,867, filed December 20, 2013, which is hereby incorporated by reference in its entirety.
This invention relates to medical diagnostic ultrasound systems and, in particular, to imaging systems and methods for tracking an invasive medical device during an interventional procedure.
Ultrasound guidance of needle visualization in biopsies and interventional procedures is commonly used to assist accurate placement of interventional devices in a patient. It is, however, difficult to accurately align the imaging plane of the ultrasound with the interventional device during a procedure. Some approaches that have attempted to address these difficulties include a mechanical "biopsy guide" accessory that is attached to the ultrasound
transducer to align the nominal path of the needle with the imaging plane. These guidance devices restrict needle motion to varying degrees, but serve primarily to assist alignment of the needle path with the imaging plane.
While mechanical accessories can be useful, they do not generally provide support for free-hand procedures. In particular, some clinicians prefer to do "free-hand" biopsies in which they rely on
experience to provide the necessary alignment of an interventional device. For free-hand biopsies, a physician guides a needle to the target with one hand while the other hand positions the ultrasound
transducer for adequate visualization. Guidance under free-hand conditions is challenging and may be time-
consuming because of the necessity to align the scanning plane and needle axis. Moreover, the ultrasound scanning plane should remain coplanar with the needle axis for adequate visualization of the needle tip. Therefore, the experience of the
physician dictates the success of the freehand ultrasound-guided biopsy procedure, which can lead to problems with achieving reliable biopsy results.
Accordingly, there is a need for better methods to monitor an invasive medical device during free-hand interventional procedures.
In accordance with the principles of the present invention, methods and systems are provided for using ultrasound to visualize an invasive medical device during an interventional procedure in real-time. A diagnostic ultrasound system is described which acquires volume image data by sub-sampling a
volumetric region with a low (widely spaced) scan plane density that is sufficient to sub-sample the entire volumetric region in a time interval
sufficient for a desired volumetric frame rate of display. In general, the transducer is pointed in a section of the volumetric region that includes an invasive medical device, such as a biopsy needle. Owing to the low scan plane density, the volumetric region is quickly scanned and used to identify the invasive medical device. Various known techniques, such as orthogonal projections, are used to locate which scan plane of an array transducer is aligned with the invasive medical device. An ultrasound image of the invasive medical device and surrounding tissue is displayed and used to guide the device during an interventional procedure. In addition, the sub-sampling of the volumetric region is repeatedly scanned so as to provide a real-time location of the
invasive device and to continually show the device in plane by shifting which scan plane is being imaged by the array transducer.
In the drawings:
FIGURE 1 illustrates in block diagram form a three dimensional ultrasonic imaging system suitable for use in an embodiment of the present invention.
FIGURE 2 illustrates a workflow in accordance with the present invention for identifying and displaying the location of the needle with respect to a lesion being biopsied.
FIGURE 3 depicts an example procedure of subsampling a volumetric region that includes a lesion of interest and an invasive medical device.
FIGURE 4 shows two of the scan planes from
FIGURE 3 displayed on a screen for easy viewing of the needle during an interventional procedure.
FIGURE 5 illustrates identification of multiple needles during an interventional procedure.
In one aspect, the present invention includes an ultrasonic imaging system for identifying a scan plane including an invasive medical device. The system can include an ultrasonic probe, which can include an array transducer adapted to generate a tracking volume of ultrasonic beam slices over a volumetric tissue region including an invasive medical device. The system can further include a transmit beamformer coupled to the array transducer and adapted to control a spatial beam density of the beam slices transmitted by the array transducer in the volumetric region. The system can further include a receive beamformer coupled to the array transducer and responsive to echo signals from array elements for the production of received scanlines in the vicinity of the invasive medical device and in
the volumetric region at locations removed from the invasive medical device location. An image processor in the system is also responsive to the received scanlines and adapted to identify a scan plane of the array transducer that includes the invasive medical device. A display module in the system is coupled to the image processor and adapted to display an
ultrasound image of the scan plane including the invasive medical device. In certain aspects, the system is adapted to repeatedly scan the tracking volume to locate the position of the invasive medical device and display it in real-time.
Referring first to FIG. 1, an ultrasonic imaging system is shown in block diagram form and includes a transducer probe 10 that is used to guide or monitor an invasive medical device. A two dimensional array transducer 12 having a plurality of elements is coupled to microbeamformers 14. The microbeamformers control the transmission of ultrasound by the
elements of the array transducer 12 and partially beamform echoes returned to groups of the elements of the array transducer. The microbeamformers 14 can be fabricated in integrated circuit form and located in the housing of the probe 10 near the array
transducer. Microbeamformers, or subarray beamformers as they are often called, are more fully described in U.S. Pat. Nos. 6,375,617 and 5,997,479, which are hereby incorporated by reference in their entirety. The partially beamformed signals produced by the microbeamformers 14 are coupled to a beamformer 22 where the beamformation process is completed. The resultant coherent echo signals along the beams are processed by a signal processor 24. The echo signals are then processed into image signals in the
coordinate system of the probe by an image processor
26. The image signals are converted to a desired image format (x,y,z Cartesian coordinates, for example) by a scan converter 28.
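By way of illustration only, the following Python sketch (toy array sizes, integer sample delays, and assumed function names) shows the two-stage delay-and-sum idea underlying partial beamforming in the probe followed by completion in the system beamformer; it is a conceptual sketch, not the implementation of microbeamformers 14 or beamformer 22.

```python
# Minimal two-stage delay-and-sum sketch with toy sizes and integer sample
# delays; it only illustrates the idea of partial (subarray) beamforming
# followed by completion in a main beamformer, not the patent's implementation.
import numpy as np

def delay(trace, n):
    """Delay a 1-D RF trace by n samples (n >= 0), zero-padding at the start."""
    return np.concatenate([np.zeros(n), trace])[: trace.size]

def micro_beamform(subarray_rf, fine_delays):
    """Partial beamforming in the probe: delay-and-sum within one subarray."""
    return sum(delay(ch, d) for ch, d in zip(subarray_rf, fine_delays))

def main_beamform(partial_sums, coarse_delays):
    """System beamformer: complete the beam from per-subarray partial sums."""
    return sum(delay(p, d) for p, d in zip(partial_sums, coarse_delays))

# Toy example: 4 subarrays of 8 channels, 1024 RF samples per channel.
rng = np.random.default_rng(0)
rf = rng.standard_normal((4, 8, 1024))
partials = [micro_beamform(sub, [0, 1, 2, 3, 3, 2, 1, 0]) for sub in rf]
beam = main_beamform(partials, [0, 2, 2, 0])
print(beam.shape)  # (1024,)
```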
If a tracking system is used during an
interventional procedure, the probe 10 may also include a position sensor 16 which provides signals indicative of the position of the probe 10 to a transducer position detector 18. The sensor 16 may be a magnetic, electromagnetic, radio frequency, infrared, or other type of sensor such as one which transmits a signal that is detected by a voltage impedance circuit. The transducer position signal 20 produced by the detector 18 may be used by the ultrasound system or coupled to an interventional device system when useful for the formation of spatially coordinated images containing information from both systems. For example, while the present invention is particularly useful for free-hand procedures, it can also be used during tracked, needle-guided procedures.
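As a minimal sketch, assuming the position detector supplies a 4 x 4 homogeneous pose matrix (an assumption; the patent does not specify the signal format), a point in the probe's image coordinates could be mapped into a shared reference frame as follows, which is the kind of mapping that spatially coordinated images rely on.

```python
# Hedged sketch (assumed data layout): mapping a point expressed in the probe's
# image coordinates into a shared reference frame using a probe pose reported
# by a position sensor, so two tracked systems can be spatially coordinated.
import numpy as np

def image_point_to_reference(point_probe_mm, probe_pose):
    """point_probe_mm: (x, y, z) in the probe/image coordinate system.
    probe_pose: 4 x 4 homogeneous transform from probe frame to reference frame."""
    p = np.append(np.asarray(point_probe_mm, dtype=float), 1.0)  # homogeneous point
    return (probe_pose @ p)[:3]

# Example with an assumed pose: probe translated 50 mm along x in the room frame.
pose = np.eye(4)
pose[0, 3] = 50.0
print(image_point_to_reference([0.0, 10.0, 30.0], pose))  # -> [50. 10. 30.]
```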
For display of two-dimensional images, the image plane data from the scan converter 28 can be coupled directly to a video processor 30 which produces video drive signals compatible with the requirements of the display 32. Volume rendered 3D images can also be coupled to the video processor 30 for display. Three dimensional image data from the probe 10 is sent to a volume processor 34, which renders a three
dimensional view of a volumetric region as seen from a selected look direction. The system can display individual volume rendered images or a series of volume rendered images showing the dynamic flow and motion of the anatomy being imaged in real time.
Volume rendering is well known in the art and is described, e.g., in U.S. Pat. No. 5,474,073, which is
hereby incorporated by reference in its entirety. Volume rendering may also be performed on image data which has not been scan converted as described in U.S. Patent Application No. 10/026,996, filed Dec. 19, 2001, which is hereby incorporated by reference in its entirety. In some embodiments, the ultrasound system acquires data by scanning a patient's target anatomy with the transducer probe and by receiving multiple frames of ultrasound data. The system derives
position and orientation indicators for each frame relative to a prior frame, a reference frame or a reference position. Then, the system uses the frame data and corresponding indicators for each frame as inputs for the volume reconstruction and image visualization processes. The 3D ultrasound system performs volume reconstruction by defining a
reference coordinate system within which each image frame in a sequence of the registered image frames is positioned.
The reference coordinate system is the
coordinate system for a 3D volume encompassing all image planes to be used in generating a 3D image and for identifying the location of the invasive medical device within the 3D volume. The system in FIGURE 1 further includes an invasive device (e.g., a biopsy needle) position processor 36, which is coupled to the scan converter 28 and volume processor 34. The invasive device position processor 36 processes the ultrasound image data in 2D and/or 3D to identify the position of the invasive device (e.g., biopsy needle) in relation to the array transducer. For example, using the 3D information that is acquired at low scan plane densities, the system can apply known
techniques, such as orthogonal projection techniques, to locate a scan plane including the invasive device in the imaging volume. After identifying the location of a biopsy needle, the invasive device position processor 36 causes the array transducer 12 to automatically generate a scan plane that is aligned with a spatial plane containing the biopsy needle. As will be described in more detail below, any scan plane can be generated in relation to the needle depending on the user's viewing preferences. In some embodiments, a user may select scan planes via a controller 38, which is also used to control the ultrasound imaging system.
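A hedged sketch of one way such a localization could be carried out, assuming the tracking volume is available as a 3-D intensity array and that the needle appears as bright, roughly linear echoes; the threshold and least-squares line fit below stand in for the orthogonal projection techniques referred to above and are not taken from the patent.

```python
# One possible (assumed) realization of needle localization in the coarse
# tracking volume: threshold the bright, roughly linear needle echoes, fit a
# line to the candidate voxels, and report the coarse elevation plane index.
import numpy as np

def locate_needle_plane(volume, intensity_thresh=0.8):
    """volume: 3-D array indexed (elevation_plane, azimuth_line, depth_sample).
    Returns (elevation_index, direction): the coarse scan plane best aligned
    with the needle and the fitted needle axis in index units."""
    mask = volume >= intensity_thresh * volume.max()   # keep strong echoes only
    pts = np.argwhere(mask).astype(float)              # N x 3 candidate voxels
    if pts.shape[0] < 2:
        raise ValueError("no needle-like echoes found in tracking volume")

    centroid = pts.mean(axis=0)
    # Least-squares (orthogonal) line fit: dominant axis of the point cloud.
    _, _, vt = np.linalg.svd(pts - centroid, full_matrices=False)
    direction = vt[0]

    # The azimuth-oriented scan plane containing the needle corresponds to the
    # elevation index through which the fitted line passes.
    return int(round(centroid[0])), direction
```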
For displaying graphics, such as a needle trajectory on a screen, the system includes a
graphics processor 40 that receives either scan converted image data from the scan converter 28 or unscan-converted image data from the image processor
26 for analysis. Other graphics can also be
generated, such as visual emphasis of the tip of an interventional device or the detection of the border of an organ within the image field. The visual emphasis may be provided by an enhanced or unique brightness, color, or volume rendering process for imaging the tip of the device, for example. The resultant graphics are coupled to the video processor 30 where they are coordinated and overlaid with the image for display.
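For illustration, a trajectory overlay of this kind could be rasterized into the image array as in the following sketch; the pixel coordinates, overlay intensity, and function name are assumptions.

```python
# Hedged sketch: rasterizing a straight needle-trajectory overlay into a 2-D
# image array (pixel coordinates and overlay intensity are assumed values).
import numpy as np

def draw_trajectory(image, entry_rc, tip_rc, value=255, n_samples=200):
    """Mark a straight line from entry_rc to tip_rc (row, col) in place."""
    r = np.linspace(entry_rc[0], tip_rc[0], n_samples).round().astype(int)
    c = np.linspace(entry_rc[1], tip_rc[1], n_samples).round().astype(int)
    keep = (r >= 0) & (r < image.shape[0]) & (c >= 0) & (c < image.shape[1])
    image[r[keep], c[keep]] = value
    return image

overlay = draw_trajectory(np.zeros((128, 128), dtype=np.uint8), (10, 5), (100, 90))
```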
In another aspect, the present invention
provides methods for identifying a scan plane of an array transducer that includes an invasive medical device during an interventional procedure. The method can include using an ultrasonic probe
including an array transducer to generate a tracking volume of ultrasonic beam slices over a volumetric tissue region including the invasive medical device. Steps of the method can include acquiring ultrasound image data from the ultrasound beam slices, the image
data including structures in the volumetric region and the invasive medical device, processing the ultrasound image data to identify the scan plane of the array transducer that includes the invasive medical device, acquiring ultrasound image data from the scan plane that includes the invasive medical device, and displaying an ultrasound image of the scan plane including the invasive medical device, wherein the tracking volume is repeatedly scanned to locate the position of the invasive medical device for real-time display.
FIGURE 2 describes an example workflow 42 for implementing a procedure in accordance with an embodiment of the present invention. In step 44, a coarse 3D ultrasound image is acquired to include an invasive medical device, such as a biopsy needle or ablation needle. Here, the transducer probe 10 including an array transducer 12 can be used to generate a tracking volume of ultrasonic beam slices over a volumetric tissue region including the
invasive medical device. Ultrasound image data is acquired from the ultrasound beam slices and includes the volumetric region and the invasive medical device. In particular, the ultrasound imaging system can acquire volume image data by sub-sampling a volumetric region with a low (widely spaced) scan plane density that is sufficient to sub-sample the entire volumetric region in a time interval
sufficient for a desired volumetric frame rate of display. As described in step 46, the workflow includes determining a scan plane of the array transducer that includes the invasive medical device. For example, using the 3D information that is
acquired at low scan plane densities, the system can apply techniques, such as orthogonal projection
techniques, to locate a scan plane including the invasive device in the imaging volume. Step 48 includes displaying an ultrasound image of the scan plane including the invasive medical device. For example, a B-mode image can be displayed in which the tip and/or a portion of the needle are visualized in plane with the scan plane of the B-mode image.
Furthermore, step 50 includes monitoring the location of the needle during the procedure so it can be displayed in real-time. In particular, the tracking volume in step 44 is repeatedly scanned to locate the position of the invasive medical device for real-time display. Here, a tracking volume of a desired beam density can be used to sample the patient's tissue and identify the needle in relative space within the
3D volume. Using the methods described herein, the system will automatically identify the needle 56 and its position in relation to the array transducer 12. Once the position is identified, the scan plane of interest (e.g., a scan plane parallel to the needle) can be identified and displayed. If the scan plane has shifted from the previous tracking volume, then different elements in the array transducer are used to generate an image in the proper scan plane.
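The loop formed by steps 44 through 50 can be summarized by the following schematic sketch; the probe methods acquire_tracking_volume, acquire_plane_image, and display are hypothetical placeholders for system-specific calls, not an actual API.

```python
# Schematic sketch of the loop formed by steps 44-50; acquire_tracking_volume,
# acquire_plane_image and display are placeholder names, not an actual API.
def track_needle(probe, locate_needle_plane, stop_requested):
    current_plane = None
    while not stop_requested():
        # Step 44: coarse 3-D acquisition with widely spaced scan planes.
        volume = probe.acquire_tracking_volume(plane_density="coarse")

        # Step 46: determine which scan plane of the array contains the needle.
        plane_index, _ = locate_needle_plane(volume)

        # If the needle has moved, different array elements / delays are used
        # so that the newly identified plane is the one being imaged.
        if plane_index != current_plane:
            current_plane = plane_index

        # Step 48: image the identified plane at full resolution and show it;
        # step 50: the loop repeats so the displayed location stays current.
        probe.display(probe.acquire_plane_image(current_plane, resolution="high"))
```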
FIGURE 3 provides an example implementation of the present invention for identifying a scan plane containing an invasive medical device, such as a biopsy needle. As shown, a transducer probe 10 including an array transducer is positioned on the skin 52 of a patient. A tissue region of interest 54 that is to be biopsied is present under the skin 52. A biopsy needle 56 is inserted into the patient and the transducer probe 10 is used to locate the needle 56 and track its movement towards the tissue region 54 for an accurate and successful biopsy during a
free-hand interventional procedure. To locate the needle, the ultrasound system acquires volume image data by sub-sampling a volumetric region or tracking volume 58 with a low (widely spaced) scan plane density that is sufficient to sub-sample the entire volumetric region in a time interval sufficient for a desired volumetric frame rate of display. An example scan plane density is illustrated by dashed lines 60, which represent scan lines in both the elevation and azimuth directions of the 2D array transducer. It will be appreciated that virtually any scan plane density can be used, but the density will generally be defined according to a desired time resolution that does not sacrifice accuracy and time efficiency for generating an adequate number of scan planes that intersect the needle. The coarseness of the beam scan plane density can be controlled by the user or be implemented according to a preset protocol that uses an optimal scan plane density.
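As a rough, illustrative budget (the frame rate, line count, and per-line round-trip time below are assumed values, not figures from the patent), the number and spacing of the widely spaced planes can be estimated from the desired volumetric frame rate:

```python
# Illustrative budget for the coarse scan-plane density: how many widely spaced
# planes fit in one volumetric frame period. The 10 Hz rate, 64 lines per plane
# and ~200 us round-trip time per line (about 15 cm depth) are assumed values.
def max_tracking_planes(volume_rate_hz, lines_per_plane, line_time_s):
    """Number of sub-sampling planes that fit in one tracking-volume period."""
    return int((1.0 / volume_rate_hz) // (lines_per_plane * line_time_s))

def plane_spacing_deg(elevation_fov_deg, n_planes):
    """Angular spacing of the widely spaced planes across the elevation FOV."""
    return elevation_fov_deg / max(n_planes - 1, 1)

n = max_tracking_planes(volume_rate_hz=10, lines_per_plane=64, line_time_s=200e-6)
print(n, plane_spacing_deg(elevation_fov_deg=60.0, n_planes=n))  # 7 planes, 10.0 deg
```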
As described above, the volumetric ultrasound data is processed by the invasive device position processor 36 to calculate which scan plane of the array transducer is aligned with the needle. In FIGURE 3, the needle 56 is aligned with an azimuth- oriented scan plane of the array transducer. The thicker dashed line shows the scan plane 62 that is aligned with the needle. Thus, after acquiring the ultrasound data from the coarse 3D scan, the needle can be located and the properly aligned scan plane can be imaged in real-time and with high resolution to assist in the interventional procedure. As is well known in the art, a 2D array transducer can be operated to generate scan lines in a wide variety of orientations along the azimuth and elevation
dimensions of the array. Therefore, if the needle is
oriented partially in the azimuthal and elevation dimensions, the system can identify that orientation and implement transmit and receive cycles to provide optimal alignment of the image plane with the plane containing the biopsy needle. In some embodiments, the identified scan plane can be imaged as a single thin plane within the elevational resolution of the probe and system. In other embodiments, the
identified scan plane can also be imaged as a thick slice image of a plane thickness greater than that of a single thin plane as described in US patent
publication no. US2010/0168580A1 (Thiele et al.), which is hereby incorporated by reference in its entirety. The use of thick slice imaging enables the needle to be continually visualized in the image even if its path of insertion varies from a perfectly straight line, so long as the path remains within the thickness of the thick slice image.
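The thick-slice rationale can be illustrated with a minimal sketch, under assumed geometry and units, that checks whether a sampled needle path stays within half the slice thickness of the imaging plane:

```python
# Hedged sketch: checking whether a slightly bent or off-axis needle path stays
# within a thick-slice image of given thickness (illustrative geometry only).
import numpy as np

def path_within_thick_slice(needle_points_mm, plane_point_mm, plane_normal, thickness_mm):
    """needle_points_mm: N x 3 sampled points along the needle path.
    Returns True if every point lies within +/- thickness/2 of the slice plane."""
    n = np.asarray(plane_normal, dtype=float)
    n /= np.linalg.norm(n)
    offsets = (np.asarray(needle_points_mm, dtype=float) - plane_point_mm) @ n
    return bool(np.all(np.abs(offsets) <= thickness_mm / 2.0))

# Example: a needle path that drifts 1.5 mm out of plane still fits a 4 mm slice.
path = np.array([[0.0, 0.0, 10.0], [0.0, 0.8, 30.0], [0.0, 1.5, 50.0]])
print(path_within_thick_slice(path, plane_point_mm=np.zeros(3),
                              plane_normal=[0.0, 1.0, 0.0], thickness_mm=4.0))
```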
FIGURE 4 shows an example implementation of displaying orthogonal scan planes simultaneously including an invasive medical device. In one
embodiment, a display 32 can include two panels that show different scan plane cross-sections of a biopsy needle or ablation needle. In FIGURE 4, the top panel shows a high resolution 2D image 62 that is aligned with the needle 56 in the vicinity of the lesion of interest 54. Tissue structures 64a are shown in the image, and can be used to further guide the needle insertion path. In the bottom panel, the biplane view of the scan plane in the top panel is provided. Here, the physician can simultaneously get a different view of how close the needle 56 is to the lesion 54. Similarly, tissue structures 64b are also shown in the 2D image. For additional reference, the display can also show an icon indicating an
orientation of the scan plane and/or the needle in relation to the array transducer. In 66a and 66b, the needle is shown in relation to a 3D volume
generated by the array transducer, showing the user how the 3D volume is oriented with respect to the probe and thereby further facilitating the procedure.
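As an illustrative sketch only, assuming the reconstructed volume is stored as an (elevation, azimuth, depth) array, the two orthogonal panels of such a display could be extracted as follows:

```python
# Hedged sketch: pulling two orthogonal cross-sections out of a reconstructed
# volume for side-by-side display (array axis order is an assumed convention).
import numpy as np

def biplane_views(volume, needle_elevation_idx, needle_azimuth_idx):
    """volume: 3-D array indexed (elevation, azimuth, depth).
    Returns (in_plane_image, orthogonal_image) as 2-D arrays."""
    in_plane = volume[needle_elevation_idx, :, :]    # plane aligned with the needle
    orthogonal = volume[:, needle_azimuth_idx, :]    # crossing plane through it
    return in_plane, orthogonal

vol = np.random.default_rng(1).random((16, 64, 256))
top_panel, bottom_panel = biplane_views(vol, needle_elevation_idx=8, needle_azimuth_idx=32)
print(top_panel.shape, bottom_panel.shape)           # (64, 256) (16, 256)
```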
In a given invasive procedure, it may be desirable to access anatomy in the body with several invasive instruments simultaneously. For example, in one embodiment, the insertion planes of multiple needles can be identified, facilitating use of an ultrasound system of the present invention for procedures such as r.f. ablation using multiple needles. In another embodiment, the insertion planes of differently inclined needles can be identified and visualized. In some instances, multiple needles can be inserted at the same time in different identified scan planes. For example, two different instruments may be used for microwave ablation of target anatomy, in which case the clinician will want to visually guide both ablation needles to the target so that their tips are in contact with the anatomy to be ablated. And, because the 3D ultrasound data can be repeatedly updated during an interventional procedure, the ablation needles can be inserted at different times and the system will automatically register another needle in the view plane. In some embodiments, multiple needles that are in different planes can be imaged simultaneously. In certain embodiments, a volume image can be generated and displayed simultaneously along with the needles in different image planes. In FIGURE 5, three different needles, 56a, 56b, and 56c, are being used and imaged at the same time. The respective scan planes 62a, 62b, and 62c of the array transducer can be imaged and each of the needles can be displayed in real-time, for
example, in separate image displays on a screen using different colors for each needle. Identifying and coloring a needle in an ultrasound image can be
performed by a segmentation technique that specifically identifies the needle in an image from its surrounding tissue as described in US patent pub. no. 2004/0002653 (Greppi et al.) and in the paper "Enhancement of Needle Visibility in Ultrasound-guided Percutaneous
Procedures" by S. Cheung et al . , Ultrasound in Med. & Biol., vol. 30, no. 5 (2004) at pp 617-24, which are hereby incorporated by reference in their entirety. Similarly, the needles can be shown in respective 2D images along with their insertion paths that can be outlined in distinctive colors, such as red and yellow. In addition, a full 3D volumetric image of the region of the procedure, which shows the target anatomy being accessed, can be taken to include all three needles. In the 3D image, each needle is colored with its distinctive color, blue, red, or yellow, so that the clinician can easily relate each needle in the 3D image to its own 2D insertion plane image. Each 2D image plane and the full 3D volume are scanned in a time interleaved manner, with the individual insertion planes being scanned at a greater repetition rate (and hence real time frame rate of display) than the 3D image. Once the needles are in their desired positions in the target anatomy the individual 2D images can be frozen on the screen so that the full acquisition time is devoted to 3D imaging and the procedure at the target anatomy can continue to be imaged in live 3D.
The specific dimensions of any of the detection devices, systems, and components thereof, of the present invention may vary depending upon the intended application, as may be apparent to those of skill in the art in view of the disclosure herein. For example,
a selected probe or needle size, design or dimensions can differ depending on intended use.
It will be understood that the examples and embodiments described herein are for illustrative purposes and that various modifications or changes in light thereof may be suggested to persons skilled in the art and are included within the spirit and purview of this application and scope of the appended claims. Moreover, different combinations of embodiments described herein are possible, and such combinations are considered part of the present invention. In addition, all features discussed in connection with any one embodiment herein can be readily adapted for use in other embodiments herein. The use of different terms or reference numerals for similar features in different embodiments does not necessarily imply differences other than those which may be expressly set forth.
Accordingly, the present invention is intended to be described solely by reference to the appended claims, and not limited to the preferred embodiments disclosed herein.
Claims
1. A method for identifying a scan plane of an array transducer that includes an invasive medical device during an interventional procedure, the method comprising:
using an ultrasonic probe including an array transducer to generate a tracking volume of
ultrasonic beam slices over a volumetric tissue region including the invasive medical device;
acquiring ultrasound image data from the
ultrasound beam slices, the image data including structures in the volumetric region and the invasive medical device;
processing the ultrasound image data to identify the scan plane of the array transducer that includes the invasive medical device;
acquiring ultrasound image data from the scan plane that includes the invasive medical device; and displaying an ultrasound image of the scan plane including the invasive medical device, wherein the tracking volume is repeatedly scanned to locate the position of the invasive medical device for real-time display.
2. The method of claim 1, wherein the
displayed ultrasound image including the invasive medical device has higher spatial resolution than the ultrasonic beam slices.
3. The method of claim 1, comprising
displaying at least some ultrasound images generated from the ultrasonic beam slices.
4. The method of claim 1, wherein the invasive
medical device comprises a needle or an ablation probe.
5. The method of claim 1, comprising
displaying an icon showing an orientation of the scan plane in relation to the array transducer.
6. The method of claim 5, wherein the icon changes positions with respect to the array
transducer as the tracking volume is repeatedly scanned to locate the position of the invasive medical device.
7. The method of claim 1, comprising
displaying a path trajectory graphic over the ultrasound image to guide insertion of the invasive medical device into a tissue for biopsy, wherein the position of the path trajectory graphic changes as the tracking volume is repeatedly scanned to locate the position of the invasive medical device.
8. The method of claim 1, wherein the
processing comprises identifying a plurality of scan planes corresponding to a plurality of invasive medical devices, and the displaying comprises displaying a plurality of images of each of the plurality of invasive medical devices.
9. An ultrasonic imaging system for
identifying a scan plane including an invasive medical device, the system comprising:
an ultrasonic probe including an array
transducer adapted to generate a tracking volume of ultrasonic beam slices over a volumetric tissue region including the invasive medical device;
a transmit beamformer coupled to the array transducer and adapted to control a spatial beam density of the beam slices transmitted by the array transducer in the volumetric region;
a receive beamformer coupled to the array transducer and responsive to echo signals from array elements for the production of received scanlines in the vicinity of the invasive medical device and in the volumetric region at locations removed from the invasive medical device location;
an image processor responsive to the received scanlines and adapted to identify a scan plane of the array transducer that includes the invasive medical device; and
a display module coupled to the image processor and adapted to display an ultrasound image of the scan plane including the invasive medical device, wherein the system is adapted to repeatedly scan the tracking volume to locate the position of the
invasive medical device and display it in real-time.
10. The system of claim 9, wherein the spatial beam density of the beam slices is lower than a spatial density used to generate a 3D ultrasound volume of the volumetric tissue region.
11. The system of claim 9, wherein a spatial resolution of the displayed ultrasound image is higher than a spatial resolution of the beam slices transmitted by the array transducer.
12. The system of claim 9, wherein the invasive medical device comprises a needle or an ablation probe.
13. The system of claim 9, wherein the display module is further adapted to display at least some ultrasound images generated from the ultrasonic beam slices.
14. The system of claim 9, wherein the display module is further adapted to display an icon showing an orientation of the scan plane in relation to the array transducer.
15. The system of claim 14, wherein the display module changes display positions of the icon with respect to the array transducer as the tracking volume is repeatedly scanned to locate the position of the invasive medical device.
16. The system of claim 9, wherein the display module is further adapted to display a path trajectory graphic over the ultrasound image to guide insertion of the invasive medical device into a tissue for biopsy, wherein the position of the path trajectory graphic changes as the tracking volume is repeatedly scanned to locate the position of the invasive medical device.
17. The system of claim 9, wherein the image processor is adapted to identify a plurality of scan planes corresponding to a plurality of invasive medical devices, and the display module is adapted to display a plurality of images of each of the
plurality of invasive medical devices.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201361918867P | 2013-12-20 | 2013-12-20 | |
| US61/918,867 | 2013-12-20 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2015092628A1 (en) | 2015-06-25 |
Family
ID: 52432860
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/IB2014/066788 (WO2015092628A1, Ceased) | Ultrasound imaging systems and methods for tracking locations of an invasive medical device | 2013-12-20 | 2014-12-11 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2015092628A1 (en) |
Cited By (12)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2017211774A1 (en) * | 2016-06-06 | 2017-12-14 | Koninklijke Philips N.V. | Medical ultrasound image processing device |
| CN108634984A (en) * | 2018-04-20 | 2018-10-12 | 李辉 | It is a kind of that there is the Ultrasonic-B probe for puncturing function |
| EP3392862A1 (en) * | 2017-04-20 | 2018-10-24 | Fundació Hospital Universitari Vall d' Hebron - Institut de Recerca | Medical simulations |
| WO2019153352A1 (en) * | 2018-02-12 | 2019-08-15 | 深圳迈瑞生物医疗电子股份有限公司 | Display method and system for ultrasound-guided intervention |
| CN110636799A (en) * | 2017-03-16 | 2019-12-31 | 皇家飞利浦有限公司 | Optimal scan plane selection for organ viewing |
| CN110868939A (en) * | 2017-06-07 | 2020-03-06 | 皇家飞利浦有限公司 | Ultrasound systems and methods |
| CN111093516A (en) * | 2017-11-21 | 2020-05-01 | 深圳迈瑞生物医疗电子股份有限公司 | Ultrasound system and method for planning ablation |
| CN111093519A (en) * | 2017-09-14 | 2020-05-01 | 皇家飞利浦有限公司 | Ultrasound image processing |
| CN112334076A (en) * | 2018-06-29 | 2021-02-05 | 皇家飞利浦有限公司 | Biopsy prediction and guidance using ultrasound imaging and associated devices, systems and methods |
| CN112672696A (en) * | 2018-09-14 | 2021-04-16 | 皇家飞利浦有限公司 | System and method for tracking tools in ultrasound images |
| CN113038901A (en) * | 2018-11-15 | 2021-06-25 | 皇家飞利浦有限公司 | Simultaneous sensor tracking in medical interventions |
| CN113950294A (en) * | 2019-05-31 | 2022-01-18 | 皇家飞利浦有限公司 | Passive ultrasound sensor based initialization for image based device segmentation |
Patent Citations (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US2699601A (en) | 1950-03-13 | 1955-01-18 | Howard O Darnell | Positioning means for fabrication of jointed caulked pipe |
| US5474073A (en) | 1994-11-22 | 1995-12-12 | Advanced Technology Laboratories, Inc. | Ultrasonic diagnostic scanning for three dimensional display |
| US5997479A (en) | 1998-05-28 | 1999-12-07 | Hewlett-Packard Company | Phased array acoustic systems with intra-group processors |
| US6375617B1 (en) | 2000-08-24 | 2002-04-23 | Atl Ultrasound | Ultrasonic diagnostic imaging system with dynamic microbeamforming |
| US20040002653A1 (en) | 2002-06-26 | 2004-01-01 | Barbara Greppi | Method and apparatus for ultrasound imaging of a biopsy needle or the like during an ultrasound imaging examination |
| US20100168580A1 (en) | 2007-04-13 | 2010-07-01 | Koninklijke Philips Electronics N.V. | High speed ultrasonic thick slice imaging |
| US20090306511A1 (en) * | 2008-06-09 | 2009-12-10 | Kabushiki Kaisha Toshiba | Ultrasound imaging apparatus and method for generating ultrasound image |
| US20100022871A1 (en) * | 2008-07-24 | 2010-01-28 | Stefano De Beni | Device and method for guiding surgical tools |
| US20100121190A1 (en) * | 2008-11-12 | 2010-05-13 | Sonosite, Inc. | Systems and methods to identify interventional instruments |
| EP2363071A1 (en) * | 2010-03-03 | 2011-09-07 | Technische Universiteit Eindhoven | Needle detection in medical image data |
Non-Patent Citations (1)
| Title |
|---|
| S. CHEUNG ET AL.: "Enhancement of Needle Visibility in Ultrasound-guided Percutaneous Procedures", ULTRASOUND IN MED. & BIOL., vol. 30, no. 5, 2004, pages 617 - 24 |
Cited By (22)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US11266380B2 (en) | 2016-06-06 | 2022-03-08 | Koninklijke Philips N.V. | Medical ultrasound image processing device |
| CN109310399A (en) * | 2016-06-06 | 2019-02-05 | 皇家飞利浦有限公司 | Medical Ultrasound Image Processing equipment |
| WO2017211774A1 (en) * | 2016-06-06 | 2017-12-14 | Koninklijke Philips N.V. | Medical ultrasound image processing device |
| CN109310399B (en) * | 2016-06-06 | 2022-12-06 | 皇家飞利浦有限公司 | Medical ultrasonic image processing apparatus |
| CN110636799A (en) * | 2017-03-16 | 2019-12-31 | 皇家飞利浦有限公司 | Optimal scan plane selection for organ viewing |
| EP3392862A1 (en) * | 2017-04-20 | 2018-10-24 | Fundació Hospital Universitari Vall d' Hebron - Institut de Recerca | Medical simulations |
| WO2018193064A1 (en) * | 2017-04-20 | 2018-10-25 | Fundació Hospital Universitari Vall D'hebron - Institut De Recerca | Medical simulations |
| CN110868939A (en) * | 2017-06-07 | 2020-03-06 | 皇家飞利浦有限公司 | Ultrasound systems and methods |
| CN111093519B (en) * | 2017-09-14 | 2023-10-13 | 皇家飞利浦有限公司 | Ultrasound image processing |
| CN111093519A (en) * | 2017-09-14 | 2020-05-01 | 皇家飞利浦有限公司 | Ultrasound image processing |
| CN111093516B (en) * | 2017-11-21 | 2023-01-10 | 深圳迈瑞生物医疗电子股份有限公司 | Ultrasound system and method for planning ablation |
| CN111093516A (en) * | 2017-11-21 | 2020-05-01 | 深圳迈瑞生物医疗电子股份有限公司 | Ultrasound system and method for planning ablation |
| WO2019153352A1 (en) * | 2018-02-12 | 2019-08-15 | 深圳迈瑞生物医疗电子股份有限公司 | Display method and system for ultrasound-guided intervention |
| CN111065341A (en) * | 2018-02-12 | 2020-04-24 | 深圳迈瑞生物医疗电子股份有限公司 | Ultrasonic intervention display method and system |
| US12144554B2 (en) | 2018-02-12 | 2024-11-19 | Shenzhen Mindray Bio-Medical Electronics Co., Ltd. | Display method and system for ultrasound-guided intervention |
| CN108634984B (en) * | 2018-04-20 | 2021-11-30 | 李辉 | B-ultrasonic probe with puncture function |
| CN108634984A (en) * | 2018-04-20 | 2018-10-12 | 李辉 | It is a kind of that there is the Ultrasonic-B probe for puncturing function |
| CN112334076A (en) * | 2018-06-29 | 2021-02-05 | 皇家飞利浦有限公司 | Biopsy prediction and guidance using ultrasound imaging and associated devices, systems and methods |
| CN112672696A (en) * | 2018-09-14 | 2021-04-16 | 皇家飞利浦有限公司 | System and method for tracking tools in ultrasound images |
| CN113038901A (en) * | 2018-11-15 | 2021-06-25 | 皇家飞利浦有限公司 | Simultaneous sensor tracking in medical interventions |
| CN113038901B (en) * | 2018-11-15 | 2023-03-10 | 皇家飞利浦有限公司 | Simultaneous sensor tracking in medical interventions |
| CN113950294A (en) * | 2019-05-31 | 2022-01-18 | 皇家飞利浦有限公司 | Passive ultrasound sensor based initialization for image based device segmentation |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2015092628A1 (en) | Ultrasound imaging systems and methods for tracking locations of an invasive medical device | |
| US10891777B2 (en) | Ultrasound imaging system and method for image guidance procedure | |
| CN101601590B (en) | Ultrasound imaging apparatus | |
| US9597054B2 (en) | Ultrasonic guidance of a needle path during biopsy | |
| JP4443672B2 (en) | Ultrasonic diagnostic equipment | |
| US6764449B2 (en) | Method and apparatus for enabling a biopsy needle to be observed | |
| US8303502B2 (en) | Method and apparatus for tracking points in an ultrasound image | |
| EP3888559A1 (en) | Ultrasound probe, user console, system and method | |
| US10123767B2 (en) | Ultrasonic guidance of multiple invasive devices in three dimensions | |
| US20140039316A1 (en) | Ultrasonic diagnostic apparatus and ultrasonic image processing method | |
| US11250603B2 (en) | Medical image diagnostic apparatus and medical image diagnostic method | |
| EP2866672B1 (en) | Ultrasonically guided biopsies in three dimensions | |
| CN1764849A (en) | Guidance of invasive medical devices by wide view three dimensional ultrasonic imaging | |
| CN101779969A (en) | Ultrasound diagnosis apparatus, medical image display apparatus and medical image displaying method | |
| WO2015068073A1 (en) | Multi-plane target tracking with an ultrasonic diagnostic imaging system |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 14831079; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 14831079; Country of ref document: EP; Kind code of ref document: A1 |