US20220273258A1 - Path tracking in ultrasound system for device tracking - Google Patents
Info
- Publication number
- US20220273258A1
- Authority
- US
- United States
- Prior art keywords
- processor
- ultrasound
- display
- track
- ultrasound probe
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- A61B8/0841 — Clinical applications involving detecting or locating foreign bodies or organic structures, for locating instruments
- A61B8/14 — Echo-tomography
- A61B8/4254 — Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient, using sensors mounted on the probe
- A61B8/463 — Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
- A61B8/5215 — Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves, involving processing of medical diagnostic data
- G06T7/20 — Image analysis; analysis of motion
- G06T7/248 — Analysis of motion using feature-based methods, e.g. the tracking of corners or segments, involving reference images or patches
- G06T7/74 — Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
- G16H50/20 — ICT specially adapted for medical diagnosis, medical simulation or medical data mining, e.g. for computer-aided diagnosis based on medical expert systems
- G06T2207/10132 — Image acquisition modality: ultrasound image
- G06T2207/30004 — Subject of image: biomedical image processing
- G06T2207/30021 — Subject of image: catheter; guide wire
- G06T2207/30241 — Subject of image: trajectory
Abstract
A method for determining a projected track of an object (230) includes measuring movement from frame to frame of a detected object point in a field of view by periodic comparison of positions, extrapolating a locus of periodically detected object points, and qualifying the locus by calculating and applying a threshold to the linearity in a sequence of positions and a threshold to consistency in strength. The method further includes rendering a plurality of lines (310) as a path track indicator (330) on one or more ultrasound images (305) and displaying the projected track of the object when a user moves the tracked object a minimum distance in a region of interest (242) of a subject (240). The method also includes utilizing a motion sensor (207) in a probe (205) to suppress calculation and display of the projected track.
Description
- This application is a Continuation of application Ser. No. 16/485,463, filed Aug. 13, 2019, which is the U.S. National Phase application under 35 U.S.C. § 371 of International Application No. PCT/EP2018/052730, filed on Feb. 5, 2018, which claims the benefit of U.S. Provisional Patent Application Ser. No. 62/458,789, filed Feb. 14, 2017. These applications are hereby incorporated by reference herein.
- This disclosure relates to ultrasound devices and more particularly to path tracking in ultrasound systems which have the capability of tracking a device and displaying the position of the device in the ultrasound images.
- Precise visualization of objects such as needles or catheters and real-time localization with respect to imaged anatomy are needed for minimally invasive interventions. Intra-operative ultrasound is often used for these purposes. Various ultrasound systems are available in the market which utilize some method for tracking the location of an object in the body of the patient. Such systems share the common attribute that each detected position of the object is digitally represented in the system, allowing display of the positions, and that the positions are updated periodically, typically in conjunction with active scanning, so that the real time ultrasound image display can also show the detected location of the object being tracked. Some systems offer a means of showing the path of the detected object in the image, either as history (where the object came from), or future extrapolation (where it will go if moved in the same direction), or both. Such a projected path is typically generated by a method well understood in the art. One method is to include a mechanical fixture like a needle guide mounted on the ultrasound probe which simply constrains the object to follow a predetermined path, i.e., to physically constrain the path of the object with respect to the ultrasound probe as the object is inserted. Other means include locating the device such as by magnetic or electro-magnetic (EM) sensing of the location of the object with respect to similar sensing of the ultrasound probe position.
- These systems suffer from complex, expensive parts and circuitry, susceptibility to interference, positional ambiguity due to the deformation of the object (such as bending of the needle), workflow burden such as the obligation to calibrate the positional sensing, etc. There is one system which requires no physical registration of the relative positions of the ultrasound probe (and thus the displayed image) and the object whose position is displayed in the image. U.S. Pat. No. 9,282,946, commonly owned, and incorporated herein in its entirety, describes a system wherein an acoustic signal from the probe is used to activate an acoustic sensor on the tracked object, and via the timing of a returned electrical signal from the object, detect the position of the object with respect to the image itself, thereby obviating all mechanical, magnetic, electromagnetic (EM), or other mechanisms for tracking, and thus also eliminating their cost, complexity, calibration, and susceptibility to error.
- In any ultrasound imaging system that also tracks and displays the position of an object, it would be desirable to show the path of the tracked object throughout the ongoing series of displayed images (i.e. through time) without relying upon positioning fixtures or circuitry to detect the relative position of the object with respect to the ultrasound probe. In a system which indeed possesses no such relative registration apparatus, such as the simplified, lower cost system of U.S. Pat. No. 9,282,946, which uses only an acoustic sensor on the object for position detection, showing the path of the detected object presents an unsolved problem. A particular impediment is that while the position of the object is continuously and accurately located in the displayed ultrasound image, the ultrasound probe itself may be rotated or translated with respect to the object, which is largely indistinguishable from motion of the object itself in the medium being scanned.
- As further background, a very brief review of ultrasound probes and imaging follows. The versatility of a diagnostic ultrasound system is largely determined by the types of probes which can be used with the system. Linear array transducer probes are generally preferred for abdominal and small parts imaging, and phased array transducer probes are preferred for cardiac imaging. Probes may have 1D or 2D array transducers for two dimensional or three dimensional imaging. Indwelling probes are in common use, as are specialty probes such as surgical probes. Each type of probe can operate at a unique frequency range and have a unique aperture and array element count. Some ultrasound systems are designed for operation at the transmit frequency, such as for grayscale and color Doppler imaging, while others can additionally perform harmonic imaging. For each of the intended imaging modes, the functional characteristics of the probes, such as physical aperture, transducer element spacing, passband frequencies, etc., determine the requirements for transmitting ultrasound pulses and processing the received echoes. The variation in probe characteristics and functionality means that a processing system operable with a variety of probes must be reprogrammed each time a different probe is put to use.
- An example of an object that is tracked during an ultrasound procedure is a needle. During needle biopsy and some interventional therapy, clinicians insert a needle into a subject, such as the body, to reach a target mass. For regional anesthesia, a needle is used to deliver anesthetic to the vicinity of a target nerve bundle in the body, typically in preparation for a surgical procedure. Usually ultrasound imaging is used for live monitoring of the needle insertion procedure. To perform a safe and successful insertion, it is necessary to locate the needle accurately in the guided ultrasound image. Unfortunately, in clinical practice the visibility of the needle itself in the conventional ultrasound image is poor, resulting in difficulty for clinicians to insert the needle accurately. Hence the desirability of a needle tracking system and further, a means of projecting the path of the needle on the image display.
- Different techniques have been used to achieve better needle visualization in ultrasound images, for example, adaptively steering the ultrasound beam towards the needle to improve the acoustic reflection of the needle and compounding with the non-steered ultrasound image; manipulating the needle surface coating, geometry and diameter to enhance acoustic reflection; providing an extra optical, magnetic, or electro-magnetic position sensor on the needle to track the needle location in the ultrasound image, etc. In these techniques, either a specially designed needle is used, or an extra position sensor is attached to the needle, or the ultrasound imaging system is manipulated to enhance the visualization of the needle. Those approaches lead to an increase of the cost of providing enhanced needle visualization. In contrast, the simple system mentioned above, which utilizes only an acoustic sensor on the object to provide an electrical signal to the system for location detection, reduces the cost and complexity of the tracking apparatus while increasing its accuracy. But it presents the challenge of how to effectively project the path of the tracked object.
- In accordance with the present principles, an ultrasound probe communicates with an image processor for producing a plurality of ultrasound images by standard methods known in the art, and also provides a method of detecting an object in the ultrasound image field, preferably without the complexity and cost of an apparatus to measure the relative position of probe and object, such as by instead utilizing an acoustic sensor in the object. The system then additionally measures movement from frame to frame of a detected object point in a field of view by periodic comparison of positions, extrapolating a locus of periodically detected object points, and qualifying the locus by calculating and applying a threshold to the linearity in a sequence of positions and a threshold to consistency in strength. The image processor further produces the plurality of ultrasound images by including thereon a rendering of a plurality of lines as a path track indicator on one or more ultrasound images of the plurality of ultrasound images, displaying the projected track of the object when a user moves the tracked object a minimum distance in a region of interest of a subject, and utilizing a motion sensor in the ultrasound probe or motion detection from image data to suppress calculation and display of the projected track when the ultrasound probe is rotating or translating in space.
- A system includes an ultrasound probe and an image processor for producing a plurality of ultrasound images by including a pair of reference lines passing as parallel tangents on opposite sides of a location circle displayed by the system to locate an object, displaying a projected track of the object as the tracked object moves within a region of interest of a subject, and utilizing a motion sensor to suppress calculation and display of the projected track when the ultrasound probe is rotating or translating in space.
- A method for determining a projected track of an object includes measuring movement from frame to frame of a detected object point in a field of view by periodic comparison of positions, extrapolating a locus of periodically detected object points, and qualifying the locus by calculating and applying a threshold to the linearity in a sequence of positions and a threshold to consistency in strength. The method further includes rendering a plurality of lines as a path track indicator on an ultrasound image and displaying the projected track of the object when a user moves the tracked object a minimum distance in a region of interest of a subject. The method also includes utilizing a motion sensor in an ultrasound probe or motion detection from image data to suppress calculation and display of the projected track when the ultrasound probe is rotating or translating in space.
- These and other objects, features and advantages of the present disclosure will become apparent from the following detailed description of illustrative embodiments thereof, which is to be read in connection with the accompanying drawings.
- This disclosure will present in detail the following description of preferred embodiments with reference to the following figures wherein:
- FIG. 1 is a block/flow diagram showing an ultrasonic diagnostic imaging system, in accordance with one embodiment;
- FIG. 2 is a diagram showing a needle tip tracking (NTT) system in communication with an ultrasound system, in accordance with one embodiment;
- FIG. 3 is a diagram showing an ultrasound image depicting a pair of reference lines passing as parallel tangents on opposite sides of a location circle displayed by the system in response to the position of the object/needle, in accordance with one embodiment;
- FIG. 4 is a diagram showing a needle inserted into a patient with the ultrasound probe scanning it "in-plane" or "transverse," in accordance with one embodiment; and
- FIG. 5 is a flow diagram showing a method for determining and displaying a projected track of an object/needle within a region of interest of a subject, in accordance with illustrative embodiments.
- In accordance with the present principles, systems, devices and methods are provided to track and display a projected track of an object. The present principles provide embodiments where the systems, devices and methods are self-referential in that they do not rely on an external positioning system to determine the validity of the detected path.
- In one useful embodiment, an ultrasound system includes an object location apparatus where the object location apparatus utilizes the ultrasound acoustic pulses generated by the ultrasound system to energize the tracking sensor and a method of automatically determining and displaying the projected track of the object in the scanned medium. The method comprises a) measuring movement from frame to frame of the detected object point in the field of view by periodic comparison of positions, b) extrapolating the locus of periodically detected points, c) qualifying the locus by calculating and applying a threshold to the linearity in the sequence of positions and a threshold to consistency in strength, d) rendering a plurality of lines or another path track indicator on the ultrasound image as an overlay, e) using data from the previous steps for displaying the projected track when the user moves the tracked object a minimum distance in the medium, and f) utilizing a motion sensor in the ultrasound probe or motion detection from the image data to suppress calculation and display of the track projection when the ultrasound probe is rotating or translating in space.
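- As a concrete illustration of steps (a) through (c), the following is a minimal sketch, assuming tip positions in image X, Y coordinates and one signal amplitude per detection; the function name and the threshold values are illustrative assumptions, not figures from this disclosure.

```python
import numpy as np

def qualify_locus(points, strengths, max_rms_residual=1.5, max_strength_cv=0.3):
    """Qualify a locus of periodically detected tip points.

    points    : (N, 2) array of X, Y tip positions, one per frame.
    strengths : (N,) array of tracking-signal amplitudes.
    Returns True when the sequence of positions is sufficiently linear
    and the signal strength sufficiently consistent to display a track.
    Assumes the locus is not vertical in image coordinates.
    """
    points = np.asarray(points, dtype=float)
    strengths = np.asarray(strengths, dtype=float)
    if len(points) < 3:
        return False  # too few samples to judge co-linearity

    # Linearity: RMS perpendicular residual about the best-fit line.
    x, y = points[:, 0], points[:, 1]
    m, b = np.polyfit(x, y, 1)
    residuals = np.abs(y - (m * x + b)) / np.sqrt(1.0 + m * m)
    if np.sqrt(np.mean(residuals ** 2)) > max_rms_residual:
        return False  # sequence of positions is excessively non-linear

    # Consistency in strength: coefficient of variation of the amplitudes.
    return float(np.std(strengths) / np.mean(strengths)) <= max_strength_cv
```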
- It should be understood that the present invention will be described in terms of medical instruments; however, the teachings of the present invention are much broader and are applicable to any acoustic instruments. In some embodiments, the present principles are employed in tracking or analyzing complex biological or mechanical systems. In particular, the present principles are applicable to internal and/or external tracking procedures of biological systems and procedures in all areas of the body such as the lungs, gastro-intestinal tract, excretory organs, blood vessels, etc. The functional elements depicted in the FIGS. may be implemented in various combinations of hardware and software and provide functions which may be combined in a single element or multiple functional elements.
- The functions of the various elements shown in the FIGS. can be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software. When provided by a processor, the functions can be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which can be shared. Moreover, explicit use of the term “processor” or “controller” should not be construed to refer exclusively to hardware capable of executing software, and can implicitly include, without limitation, digital signal processor (“DSP”) hardware, read-only memory (“ROM”) for storing software, random access memory (“RAM”), non-volatile storage, etc.
- Moreover, all statements herein reciting principles, aspects, and embodiments of the invention, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future (i.e., any elements developed that perform the same function, regardless of structure). Thus, for example, it will be appreciated by those skilled in the art that the block diagrams presented herein represent conceptual views of illustrative system components and/or circuitry embodying the principles of the invention. Similarly, it will be appreciated that any flow charts, flow diagrams and the like represent various processes which may be substantially represented in computer readable storage media and so executed by a computer or processor, whether or not such computer or processor is explicitly shown.
- Furthermore, embodiments of the present invention can take the form of a computer program product accessible from a computer-usable or computer-readable storage medium providing program code for use by or in connection with a computer or any instruction execution system. For the purposes of this description, a computer-usable or computer readable storage medium can be any apparatus that may include, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device) or a propagation medium. Examples of a computer-readable medium include a semiconductor or solid state memory, magnetic tape, a removable computer diskette, a random access memory (RAM), a read-only memory (ROM), a rigid magnetic disk and an optical disk. Current examples of optical disks include compact disk-read only memory (CD-ROM), compact disk-read/write (CD-R/W), Blu-Ray™ and DVD.
- Reference in the specification to “one embodiment” or “an embodiment” of the present principles, as well as other variations thereof, means that a particular feature, structure, characteristic, and so forth described in connection with the embodiment is included in at least one embodiment of the present principles. Thus, the appearances of the phrase “in one embodiment” or “in an embodiment”, as well any other variations, appearing in various places throughout the specification are not necessarily all referring to the same embodiment.
- It is to be appreciated that the use of any of the following “/”, “and/or”, and “at least one of”, for example, in the cases of “A/B”, “A and/or B” and “at least one of A and B”, is intended to encompass the selection of the first listed option (A) only, or the selection of the second listed option (B) only, or the selection of both options (A and B). As a further example, in the cases of “A, B, and/or C” and “at least one of A, B, and C”, such phrasing is intended to encompass the selection of the first listed option (A) only, or the selection of the second listed option (B) only, or the selection of the third listed option (C) only, or the selection of the first and the second listed options (A and B) only, or the selection of the first and third listed options (A and C) only, or the selection of the second and third listed options (B and C) only, or the selection of all three options (A and B and C). This may be extended, as readily apparent by one of ordinary skill in this and related arts, for as many items listed.
- It will also be understood that when an element such as a layer, region or material is referred to as being “on” or “over” another element, it can be directly on the other element or intervening elements may also be present. In contrast, when an element is referred to as being “directly on” or “directly over” another element, there are no intervening elements present. It will also be understood that when an element is referred to as being “connected” or “coupled” to another element, it can be directly connected or coupled to the other element or intervening elements may be present. In contrast, when an element is referred to as being “directly connected” or “directly coupled” to another element, there are no intervening elements present.
- Referring now to the drawings, in which like numerals represent the same or similar elements, and initially to FIG. 1, an ultrasonic diagnostic imaging system is illustratively shown in accordance with one embodiment.
- Referring first to FIG. 1, an ultrasonic diagnostic imaging system showing one embodiment of the present invention is shown in block diagram form. An ultrasound probe 10 transmits and receives ultrasound waves from the piezoelectric elements of an array of transducer elements 12. For imaging a planar region of the body a one-dimensional (1-D) array of elements may be used, and for imaging a volumetric region of the body a two-dimensional (2-D) array of elements may be used to steer and focus ultrasound beams over the image region. A transmit beamformer actuates elements of the array to transmit ultrasound waves into the subject. The signals produced in response to the reception of ultrasound waves are coupled to a receive beamformer 14. The beamformer 14 delays and combines the signals from the individual transducer elements to form coherent beamformed echo signals. When the probe includes a 2-D array for 3D imaging, it may also include a microbeamformer which does partial beamforming in the probe by combining signals from a related group ("patch") of transducer elements as described in U.S. Pat. No. 6,709,394. In that case the microbeamformed signals are coupled to the main beamformer 14 in the system, which completes the beamforming process.
- The beamformed echo signals are coupled to a signal processor 16 which processes the signals in accordance with the information desired. The signals may be filtered, for instance, and/or harmonic signals may be separated out for processing. The processed signals are coupled to a detector 18 which detects the information of interest. For B mode imaging, amplitude detection is usually employed, whereas for spectral and color Doppler imaging the Doppler shift or frequency can be detected. The detected signals are coupled to a scan converter 20 where the signals are coordinated to the desired display format, generally in a Cartesian coordinate system. Common display formats used are sector, rectilinear, and parallelogram display formats. The scan converted signals are coupled to an image processor for further desired enhancement such as persistence processing. The scan converter may be bypassed for some image processing; for example, it may be bypassed when 3D image data is volume rendered by the image processor by direct operation on a 3D data set. The resulting two dimensional or three dimensional image is stored temporarily in an image memory 24, from which it is coupled to a display processor 26. The display processor 26 produces the necessary drive signals to display the image on a docking station image display 28 or the flat panel display 38 of the portable system. The display processor also overlays the ultrasound image with graphical information from a graphics processor 30 such as system configuration and operating information, patient identification data, and the time and date of the acquisition of the image.
- A central controller 40 responds to user input from the user interface and coordinates the operation of the various parts of the ultrasound system, as indicated by the arrows drawn from the central controller to the beamformer 14, the signal processor 16, the detector 18, and the scan converter 20, and the arrow 42 indicating connections to the other parts of the system. The user control panel 44 is shown coupled to the central controller 40, by which the operator enters commands and settings for response by the central controller 40. The central controller 40 is also coupled to an a.c. power supply 32 to cause the a.c. supply to power a battery charger 34 which charges the battery 36 of the portable ultrasound system when the portable system is docked in the docking station.
- It is thus seen that, in this embodiment, the partitioning of the components of FIG. 1 is as follows. The central controller 40, beamformer 14, signal processor 16, detector 18, scan converter 20, image processor 22, image memory 24, display processor 26, graphics processor 30, flat panel display 38, and battery 36 reside in the portable ultrasound system. The control panel 44, display 28, a.c. supply 32 and charger 34 reside on the docking station. In other embodiments the partitioning of these subsystems may be done in other ways as design objectives dictate.
- Referring to FIG. 2, a diagram showing a needle tip tracking (NTT) system in communication with an ultrasound system is presented in accordance with one embodiment.
- The tracking system 200 includes an ultrasound system 210 in communication with a needle tip tracking (NTT) module 220. The NTT module 220 is connected to an object, such as an NTT needle 230, via NTT cable 225. The ultrasound system 210 may include an image processor 202, a user interface 204, a display 206, and a memory 208. Additionally, an ultrasound probe 205 may be connected to the ultrasound system 210. The ultrasound probe 205 may be positioned adjacent the subject 240. The subject 240 can be, e.g., a patient. The ultrasound probe may include a motion sensor 207. The motion sensor 207 of the probe 205 detects motion of the probe 205 with respect to the tissue of the subject 240.
- The NTT needle 230 is inserted into a volume or region of interest 242 of the subject 240. The needle 230 is tracked ideally from the point of entry at the skin surface all the way to the point where it stops insertion. For regional anesthesia, for instance, the stopping point is near a visualized nerve bundle, at which point the anesthetic is injected through the needle cannula so that it optimally bathes the nerve bundle.
- The distal end of the NTT needle 230 may include an ultrasound sensor 234, whereas the proximal end of the NTT needle 230 may include a hub 232. The distal end of the NTT needle may be, e.g., a pointed end or beveled tip 231. Of course, one skilled in the art may contemplate a number of different design configurations for the distal end of the NTT needle 230. U.S. Pat. No. 9,282,946, commonly owned, and incorporated herein in its entirety, provides further information regarding the tracking system 200 and various beamforming techniques.
- Referring to FIG. 3, a diagram showing an ultrasound image depicting a pair of reference lines passing as parallel tangents on opposite sides of a location circle displayed by the system in response to the position of the object/needle is presented in accordance with one embodiment.
- The diagram illustrates an ultrasound image 305. The ultrasound image 305 is shown on a screen 300 of a display device 301. The ultrasound image 305 depicts the NTT needle 230 travelling along a tracked path 330 defined by a pair of parallel lines 310. The pair of parallel lines 310 engage opposed endpoints of a location circle 320. The distal end of the NTT needle 230 includes a pointed end or beveled tip 231. The distal end of the NTT needle 230 further includes an ultrasound sensor 234. Sensor 234 may comprise, in one embodiment, a single piezoelectric transducer element. A fine cable connection from sensor 234 is integrated into the NTT needle 230 and connects to NTT module 220. The ultrasound probe 205 includes the motion sensor 207, which can include an accelerometer 235 and a gyroscope 237 to continuously monitor movement of the ultrasound probe 205.
- FIGS. 2 and 3 will be discussed in tandem. In the needle tracking system that accompanies the ultrasound scanning system, the path tracking feature automatically displays a pair of reference lines 310 passing as parallel tangents on opposite sides of the location circle 320 displayed at the needle tip 231. The lines 310 project in the direction of the last motion of the location circle 320 and also in the reverse direction. When the needle tip 231 is motionless for a predetermined period, path tracking disappears (i.e., is not displayed on the display screen 300).
- As the physician inserts or withdraws the needle 230, the pair of lines 310 appear or are displayed, thus forming a constructed, virtual lane 330 in which the needle 230 is moving, that is, a linear region, a straight path, in which the needle shaft and tip 231 proceed to move if insertion or withdrawal continues. The path 330 is shown when the needle tip location locus meets certain conditions, such as minimum strength and stability of the tracking signal, movement over a minimum distance in a minimally co-linear series of sample positions, etc. The path lines 310 extend to the boundaries of the image 305, and may be solid, dotted, colored, etc. to indicate status, such as confidence based on tracking signal strength.
- Further, the motion sensor 207, which includes both accelerometer and gyroscope components 235, 237, is used to continuously monitor movement of the probe 205, and to suppress display of the path lines 310 if the probe 205 is in motion, since such movement results in changes to the tracked needle position on the displayed image, independent of any actual insertion or withdrawal of the needle. In general, detecting probe motion causes immediate suppression of the lane lines 310, whereas mere lack of needle insertion/withdrawal results in lane display suppression after a number of seconds. Thus, the physician may effectively invoke the display of the tracked path of the instrument/needle 230 by holding the ultrasound probe 205 steady and moving the needle 230, per the typical workflow, yet may also pause for some seconds to study the projected path lane 330 while not moving the needle 230. Translation of the probe 205 in any axis is detected as a change in total force, equivalent to a change in acceleration, such that the magnitude of the 3-dimensional force vector deviates from 1.0 g, the baseline gravity vector. Rotation in any axis is detected as non-zero angular velocity, but rotation in the X axis is ignored, since that corresponds to elevation tilt of the ultrasound probe 205, which by itself does not necessarily diminish the needle tip display nor invalidate the displayed path 330. Instead, the above-mentioned constraint on needle tracking signal strength serves to suppress the path display as soon as pure X rotation effectively moves the needle tip 231 out of the rendered image plane.
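- As a minimal sketch of this suppression rule, assuming accelerometer readings in units of g and gyroscope readings in degrees per second with index 0 as the X (elevation-tilt) axis, one might write the following; the function name and numeric tolerances are illustrative, not values taken from the patent.

```python
import numpy as np

def probe_is_moving(accel_g, gyro_dps, force_tol=0.05, rot_tol=2.0):
    """Return True when the probe appears to translate or rotate.

    accel_g  : 3-vector accelerometer reading, in units of g.
    gyro_dps : 3-vector angular velocity in degrees/s; index 0 is the
               X (elevation-tilt) axis, which is deliberately ignored.
    """
    # Translation: the magnitude of the total force vector deviates
    # from the 1.0 g baseline gravity vector.
    translating = abs(np.linalg.norm(accel_g) - 1.0) > force_tol

    # Rotation: non-zero angular velocity on any axis except X, since
    # pure elevation tilt need not invalidate the displayed path.
    rotating = bool(np.any(np.abs(np.asarray(gyro_dps)[1:]) > rot_tol))

    return translating or rotating
```

- When such a check returns True, the system would immediately suppress the lane lines 310, consistent with the behavior described above.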
- Therefore, FIGS. 2 and 3 provide a novel way of both locating and tracking the tip 231 of needle 230. The ultrasound system 210, in conjunction with probe 205, actuates a series of acoustic transmissions that generate 2D sweeps of scan lines or scan beams. The acoustic echo data gathered from the sweeps is detected, scan converted, and rendered as image frames on a system display as described above. Additionally, in each sweep, the acoustic transmit beams insonicate the sensor 234 on tip 231 of needle 230. The scan line with the transmit beam that is closest to the sensor 234 produces the return signal from that sensor with the highest amplitude of the sweep. Further, the depth of the sensor 234, and therefore the tip 231, is determined from the acoustic time of flight of the transmit pulse from the probe 205 transmission surface to the sensor 234, as indicated by the time of the return electrical signal with respect to the time of the start of transmission at the probe in any given scan line. In this way, the system detects the location of the needle tip as a point in the 2D sweep of scan lines, with both sweep line position coordinate and line depth coordinate. Utilizing standard scan conversion geometry, those two coordinates are transformed into the standard Cartesian X, Y coordinates used in rendering a point. In this embodiment, the coordinates are used as a center of a rendered circle indicating the position of the needle tip. The circle has a small radius to represent the uncertainty arising from variations in acoustic time of flight through the tissue, from resolution limits of the scan timing, etc. The needle tip is thus represented as lying somewhere within the rendered circle on the displayed image on which the circle is overlaid, frame by frame. Over a continuous series of scan sweeps, which create a series of displayed image frames, the series of object points are thus detected, rendered as circles on the image frames, recorded, and analyzed for sufficiently accurate needle location detection. If the needle does not move, the path lines or lane is suppressed (i.e., not illustrated/depicted), whereas if the needle moves, a path track is displayed.
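- The conversion from (scan line, time of flight) to Cartesian coordinates can be sketched as follows; this is a simplified sector-geometry example with an assumed soft-tissue sound speed, not the system's actual scan-conversion code, and the function name is illustrative.

```python
import math

SPEED_OF_SOUND_M_PER_S = 1540.0  # conventional soft-tissue average

def tip_position(line_angle_rad, t_return_s):
    """Locate the tip sensor from one sweep.

    line_angle_rad : steering angle of the scan line whose transmit beam
                     produced the highest-amplitude sensor return.
    t_return_s     : time from the start of transmission to the sensor's
                     electrical return signal (one-way time of flight,
                     since the sensor is energized directly by the beam).
    Returns (x, y) in metres with the probe face at the origin.
    """
    depth = SPEED_OF_SOUND_M_PER_S * t_return_s  # distance along the line
    x = depth * math.sin(line_angle_rad)         # lateral position
    y = depth * math.cos(line_angle_rad)         # axial position
    return x, y
```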
- To calculate the path track, a common linear regression may be utilized on a series of the stored object points as X, Y coordinates. For a series of N object points, the standard linear regression formula is shown below, yielding the equation for the line of the path track in the same coordinate system. Rendering the path track on the ultrasound image 305 is preferably represented by lines 310 surrounding virtual lane 330, where the line equation as calculated in the regression is further offset in a direction orthogonal to its slope by a distance equal to the radius of the rendered object circle, in both positive and negative directions.
- linear regression form: y = m·x + b for N value pairs (X_i, Y_i)
- slope: m = (N·Σ_i(X_i·Y_i) − Σ_i(X_i)·Σ_i(Y_i)) / (N·Σ_i(X_i²) − (Σ_i(X_i))²)
- intercept: b = (Σ_i(Y_i) − m·Σ_i(X_i)) / N
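- A minimal sketch of this calculation, including the orthogonal offset that produces the two lane lines, might look as follows; the function name is an illustrative assumption, and the offset uses the fact that a line parallel to y = m·x + b at perpendicular distance d has intercept b ± d·√(1 + m²).

```python
import numpy as np

def path_track_lines(points, circle_radius):
    """Fit the path-track line to stored object points and derive the
    pair of lane lines offset by the location-circle radius.

    points        : (N, 2) array of X, Y tip positions (N >= 2).
    circle_radius : radius of the rendered object circle, same units.
    Returns ((m, b_plus), (m, b_minus)); assumes the track is not
    vertical in image coordinates.
    """
    points = np.asarray(points, dtype=float)
    x, y = points[:, 0], points[:, 1]
    n = len(x)
    # Standard linear regression, matching the formulas above.
    m = (n * np.sum(x * y) - np.sum(x) * np.sum(y)) / (
        n * np.sum(x ** 2) - np.sum(x) ** 2)
    b = (np.sum(y) - m * np.sum(x)) / n
    # Offset orthogonally to the slope by the circle radius, in both
    # the positive and negative directions.
    db = circle_radius * np.sqrt(1.0 + m * m)
    return (m, b + db), (m, b - db)
```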
- Besides the aforementioned motion sensor 207 in probe 205, an alternative method of detecting probe motion is to detect relative movement between image data and the ultrasound probe on a frame by frame basis. The technique performs simple image correlation to detect gross probe motion when the ultrasound probe is coupled to the body, and can thereby suppress the display of the tracked path until the ultrasound probe is once again steady and the needle track has been re-established. Standard methods of correlation of image data between successive frames, optimally taken from a region of the image near the probe face, can be used to generate an average correlation of the whole frame, which is then compared to a threshold that represents substantial image motion. If above the threshold, image motion is asserted, and display of the tracked path is suppressed.
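- A sketch of this image-based alternative, assuming two grayscale frames cropped to a region near the probe face, is shown below; normalized cross-correlation is one standard choice, and the threshold value is a placeholder rather than a figure from the patent.

```python
import numpy as np

def image_motion_detected(prev_frame, frame, min_correlation=0.85):
    """Return True (suppress the track) when frame-to-frame similarity
    drops, indicating substantial image motion from probe movement.

    prev_frame, frame : 2D arrays of pixel intensities, ideally taken
    from a region of the image near the probe face.
    """
    a = prev_frame.astype(float).ravel()
    b = frame.astype(float).ravel()
    a -= a.mean()
    b -= b.mean()
    # Normalized cross-correlation of the two frames.
    corr = float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
    # Low correlation means high image motion, so the tracked-path
    # display is suppressed until the probe is steady again.
    return corr < min_correlation
```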
- FIG. 4 is a diagram 400 showing a needle inserted into a patient with the ultrasound probe scanning it "in-plane" or "transverse," in accordance with one embodiment.
- In practice, a needle 230 may be inserted into a patient 240 with the ultrasound probe 205 scanning it "in-plane" or "transverse/out-of-plane." The left-hand side illustration shows "in-plane" scanning, whereas the right-hand side illustration shows "out-of-plane" or "transverse" scanning. When scanning in-plane, the majority of the needle shaft is typically visualized (though frequently poorly, as mentioned previously), and insertion of the needle produces a natural projection of the path 330 of the needle shaft in the plane, where the tip may continue to appear but is in any case tracked. In the transverse scanning position, the path 330 is mostly axial with respect to the probe face and, thus, has a different interpretation. While still representing the path of the needle tip 231, it is effectively showing the projection of the path 330 on the image plane as the tip 231 of the needle 230 moves from behind the plane to the front of the plane, or in the reverse direction. In this case, moderate rotation of the probe 205 in the X axis results in generation of sufficient needle tip locus points to display the projected vertical path. Rotation to the point where the needle tip signal is lost results in suppression of the path lines 310, as is appropriate. Therefore, in summary, there is no change to the path tracking algorithm for these two cases. Indeed, the system is unaware of which type of scan is chosen by the physician and the algorithm behaves in the same manner for each scan.
- Referring to
- Referring to FIG. 5, a flow diagram showing a method for determining and displaying a projected track of an object/needle within a volume or region of interest of a subject is illustrated.
- In block 502, measure movement from frame to frame of a detected object point in a field of view by periodic comparison of positions.
- In block 504, extrapolate a locus of periodically detected object points.
- In block 506, qualify the locus by calculating and applying a threshold to the linearity in a sequence of positions and a threshold to consistency in strength.
- In block 508, render a plurality of lines as a path track indicator on an ultrasound image.
- In block 510, display the projected track of the object when a user moves the tracked object a minimum distance in a region of interest of a subject.
- In block 512, utilize a motion sensor in an ultrasound probe or motion detection from image data to suppress calculation and display of the projected track when the ultrasound probe is rotating or translating in space.
- In some alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
- In interpreting the appended claims, it should be understood that:
-
- a) the word “comprising” does not exclude the presence of other elements or acts than those listed in a given claim;
- b) the word “a” or “an” preceding an element does not exclude the presence of a plurality of such elements;
- c) any reference signs in the claims do not limit their scope;
- d) several “means” may be represented by the same item or hardware or software implemented structure or function; and
- e) no specific sequence of acts is intended to be required unless specifically indicated.
- Having described preferred embodiments for calculation and display of a projected path track of object points in an ultrasound image (which are intended to be illustrative and not limiting), it is noted that modifications and variations can be made by persons skilled in the art in light of the above teachings. It is therefore to be understood that changes may be made in the particular embodiments of the disclosure disclosed which are within the scope of the embodiments disclosed herein as outlined by the appended claims. Having thus described the details and particularity required by the patent laws, what is claimed and desired protected by Letters Patent is set forth in the appended claims.
Claims (20)
1. An ultrasound system comprising:
an ultrasound probe configured to produce acoustic transmissions and receive acoustic echo data in response to the acoustic transmissions;
a processor coupled to the ultrasound probe, the processor configured to:
produce a plurality of ultrasound images based on the acoustic echo data,
produce a projected track of an object by:
measuring movement from frame to frame of an object point detected in a field of view by periodic comparison of positions of the object,
extrapolating a locus of periodically detected object points, and
providing, on the plurality of ultrasound images, a path track indicator, and
suppress calculation and display of the projected track of the object when the ultrasound probe rotates or translates in space; and
a display coupled to the processor, the display configured to display the plurality of ultrasound images, including the projected track of the object, when a user moves the object a minimum distance in a region of interest.
2. The system of claim 1, further comprising a motion sensor in the ultrasound probe, the motion sensor configured to detect a motion of the ultrasound probe that is used by the processor to suppress the calculation and display of the projected track of the object.
3. The system of claim 2, wherein the motion sensor includes an accelerometer and a gyroscope configured to continuously monitor movement of the ultrasound probe.
4. The system of claim 1, wherein the processor is configured to detect a motion of the ultrasound probe from image data and use the detected motion to suppress the calculation and the display of the projected track of the object.
5. The system of claim 4, wherein the processor is configured to detect a motion of the ultrasound probe based on detection of relative movement between the image data and the ultrasound probe on a frame by frame basis.
6. The system of claim 1, wherein the path track indicator comprises a rendering of a plurality of lines and the plurality of lines comprises a pair of reference lines separate from the object and positioned on opposite sides of the object to indicate the projected track of the object.
7. The system of claim 6, wherein the processor is further configured to render a location circle to locate the object in the plurality of ultrasound images and render the pair of reference lines as parallel tangents on opposite sides of the location circle.
8. The system of claim 6, wherein the processor is configured to render the pair of reference lines projected in a direction of a last motion of the location circle.
9. The system of claim 6, wherein the processor is configured to render the pair of reference lines to form a virtual lane that extends up to a boundary of one or more images of the plurality of ultrasound images.
10. The system of claim 6, wherein, when the object is motionless, the processor is configured to display the pair of reference lines for a predetermined period of time for observation by the user.
11. The system of claim 1, wherein the processor is further configured to qualify the locus by calculating and applying a threshold to a linearity in a sequence of positions and a threshold to consistency in strength of the detected object point.
12. A controller for tracking an object, the controller comprising:
at least one processor configured to:
produce a plurality of ultrasound images based on acoustic echo data received by an ultrasound probe,
produce a projected track of an object by:
measuring movement from frame to frame of an object point detected in a field of view by periodic comparison of positions of the object,
extrapolating a locus of periodically detected object points, and
providing, on the plurality of ultrasound images, a path track indicator,
display the projected track of the object when a user moves the object a minimum distance in a region of interest; and
suppress calculation and display of the projected track of the object when the ultrasound probe rotates or translates in space.
13. The controller of claim 12, wherein the path track indicator comprises a rendering of a plurality of lines and the plurality of lines comprises a pair of reference lines separate from the object and positioned on opposite sides of the object to indicate the projected track of the object.
14. The controller of claim 13, wherein the processor is further configured to render a location circle to locate the object in the plurality of ultrasound images and render the pair of reference lines as parallel tangents on opposite sides of the location circle.
15. The controller of claim 13, wherein the processor is configured to render the pair of reference lines projected in a direction of a last motion of the location circle.
16. The controller of claim 13, wherein the processor is configured to render the pair of reference lines to form a virtual lane that extends up to a boundary of one or more images of the plurality of ultrasound images.
17. The controller of claim 13, wherein, when the object is motionless, the processor is configured to display the pair of reference lines for a predetermined period of time for observation by the user.
18. The controller of claim 12, wherein the processor is further configured to qualify the locus by calculating and applying a threshold to a linearity in a sequence of positions and a threshold to consistency in strength of the detected object point.
19. The controller of claim 12, wherein the processor is further configured to utilize a motion signal obtained from a motion sensor in the ultrasound probe or motion detected from image data to suppress the calculation and the display of the projected track of the object.
20. A method for tracking an object, the method comprising:
producing a plurality of ultrasound images based on acoustic echo data received by an ultrasound probe,
producing a projected track of an object by:
measuring movement from frame to frame of an object point detected in a field of view by periodic comparison of positions of the object,
extrapolating a locus of periodically detected object points, and
providing, on the plurality of ultrasound images, a path track indicator,
displaying, on a display, the projected track of the object when a user moves the object a minimum distance in a region of interest; and
suppressing calculation and display of the projected track of the object when the ultrasound probe rotates or translates in space.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US17/745,431 US20220273258A1 (en) | 2017-02-14 | 2022-05-16 | Path tracking in ultrasound system for device tracking |
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201762458789P | 2017-02-14 | 2017-02-14 | |
| PCT/EP2018/052730 WO2018149671A1 (en) | 2017-02-14 | 2018-02-05 | Path tracking in ultrasound system for device tracking |
| US201916485463A | 2019-08-13 | 2019-08-13 | |
| US17/745,431 US20220273258A1 (en) | 2017-02-14 | 2022-05-16 | Path tracking in ultrasound system for device tracking |
Related Parent Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/485,463 Continuation US11357473B2 (en) | 2017-02-14 | 2018-02-05 | Path tracking in ultrasound system for device tracking |
| PCT/EP2018/052730 Continuation WO2018149671A1 (en) | 2017-02-14 | 2018-02-05 | Path tracking in ultrasound system for device tracking |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20220273258A1 (en) | 2022-09-01 |
Family
ID=61599090
Family Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/485,463 Active 2039-03-20 US11357473B2 (en) | 2017-02-14 | 2018-02-05 | Path tracking in ultrasound system for device tracking |
| US17/745,431 Abandoned US20220273258A1 (en) | 2017-02-14 | 2022-05-16 | Path tracking in ultrasound system for device tracking |
Family Applications Before (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/485,463 Active 2039-03-20 US11357473B2 (en) | 2017-02-14 | 2018-02-05 | Path tracking in ultrasound system for device tracking |
Country Status (5)
| Country | Link |
|---|---|
| US (2) | US11357473B2 (en) |
| EP (1) | EP3582692A1 (en) |
| JP (1) | JP7218293B2 (en) |
| CN (1) | CN110300549A (en) |
| WO (1) | WO2018149671A1 (en) |
Families Citing this family (21)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US11638569B2 (en) | 2018-06-08 | 2023-05-02 | Rutgers, The State University Of New Jersey | Computer vision systems and methods for real-time needle detection, enhancement and localization in ultrasound |
| WO2020036968A1 (en) * | 2018-08-13 | 2020-02-20 | Rutgers, The State University Of New Jersey | Computer vision systems and methods for real-time localization of needles in ultrasound images |
| EP3886737A4 (en) | 2018-11-28 | 2022-08-24 | | HISTOTRIPSY SYSTEMS AND METHODS |
| WO2020114815A2 (en) | 2018-12-03 | 2020-06-11 | 3Mensio Medical Imaging B.V. | Method, device and system for intracavity probe procedure planning |
| KR102747176B1 (en) * | 2018-12-11 | 2024-12-27 | 삼성메디슨 주식회사 | Ultrasound imaging apparatus, method for controlling the same, and computer program product |
| JP7168474B2 (en) * | 2019-01-31 | 2022-11-09 | 富士フイルムヘルスケア株式会社 | ULTRASOUND IMAGING DEVICE, TREATMENT ASSISTANCE SYSTEM, AND IMAGE PROCESSING METHOD |
| EP3738515A1 (en) | 2019-05-17 | 2020-11-18 | Koninklijke Philips N.V. | Ultrasound system and method for tracking movement of an object |
| CN113853162B (en) | 2019-05-17 | 2024-07-30 | 皇家飞利浦有限公司 | Ultrasound system and method for tracking motion of a subject |
| JP7507792B2 (en) * | 2019-05-30 | 2024-06-28 | コーニンクレッカ フィリップス エヌ ヴェ | Determining the relative position of passive ultrasonic sensors |
| CN114222531A (en) | 2019-07-17 | 2022-03-22 | 亚克安娜生命科学有限公司 | Image-guided lumbar puncture aspiration and injection system and method |
| US11813485B2 (en) | 2020-01-28 | 2023-11-14 | The Regents Of The University Of Michigan | Systems and methods for histotripsy immunosensitization |
| EP4204084A4 (en) | 2020-08-27 | 2024-10-09 | The Regents Of The University Of Michigan | Ultrasound transducer with transmit-receive capability for histotripsy |
| EP4011299A1 (en) * | 2020-12-11 | 2022-06-15 | Sorbonne Université | Ultrasound device tracking system |
| US20230030941A1 (en) * | 2021-07-29 | 2023-02-02 | GE Precision Healthcare LLC | Ultrasound imaging system and method for use with an adjustable needle guide |
| US12329582B2 (en) * | 2022-02-17 | 2025-06-17 | Procept Biorobotics Corporation | Apparatus to detect tissue stretching during insertion of probes |
| CN117137519A (en) * | 2022-04-29 | 2023-12-01 | 通用电气精准医疗有限责任公司 | Ultrasound imaging method and ultrasound imaging system |
| CN114897879A (en) * | 2022-06-08 | 2022-08-12 | 北京永新医疗设备有限公司 | Axial scanning path planning method for intelligent fitting of SPECT-CT (single photon emission computed tomography-computed tomography) human body contour |
| KR20250102047A (en) | 2022-10-28 | 2025-07-04 | 히스토소닉스, 인크. | Histotripsy systems and methods |
| WO2024221001A2 (en) | 2023-04-20 | 2024-10-24 | Histosonics, Inc. | Histotripsy systems and associated methods including user interfaces and workflows for treatment planning and therapy |
| TWI837015B (en) * | 2023-06-06 | 2024-03-21 | 國立臺中科技大學 | Process for rendering real-time ultrasound images used in virtual reality |
| KR20250076283A (en) * | 2023-11-22 | 2025-05-29 | 삼성메디슨 주식회사 | Method for providing user interface and ultrasound imaging apparatus thereof |
Family Cites Families (22)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6709394B2 (en) | 2000-08-17 | 2004-03-23 | Koninklijke Philips Electronics N.V. | Biplane ultrasonic imaging |
| WO2006054635A1 (en) | 2004-11-17 | 2006-05-26 | Hitachi Medical Corporation | Ultrasonograph and ultrasonic image display method |
| JP2007007343A (en) * | 2005-07-04 | 2007-01-18 | Matsushita Electric Ind Co Ltd | Ultrasonic diagnostic equipment |
| US20100234731A1 (en) * | 2006-01-27 | 2010-09-16 | Koninklijke Philips Electronics, N.V. | Automatic Ultrasonic Doppler Measurements |
| US8303502B2 (en) * | 2007-03-06 | 2012-11-06 | General Electric Company | Method and apparatus for tracking points in an ultrasound image |
| US9895135B2 (en) * | 2009-05-20 | 2018-02-20 | Analogic Canada Corporation | Freehand ultrasound imaging systems and methods providing position quality feedback |
| US9282947B2 (en) * | 2009-12-01 | 2016-03-15 | Inneroptic Technology, Inc. | Imager focusing based on intraoperative data |
| JP5889874B2 (en) | 2010-05-03 | 2016-03-22 | コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. | Ultrasonic tracking of ultrasonic transducers mounted on interventional instruments |
| JP5337782B2 (en) * | 2010-10-13 | 2013-11-06 | 富士フイルム株式会社 | Ultrasonic diagnostic equipment |
| US10130340B2 (en) | 2011-12-30 | 2018-11-20 | Koninklijke Philips N.V. | Method and apparatus for needle visualization enhancement in ultrasound images |
| RU2638621C2 (en) * | 2012-01-18 | 2017-12-14 | Конинклейке Филипс Н.В. | Ultrasonic management of needle trajectory during biopsy |
| US9295449B2 (en) * | 2012-01-23 | 2016-03-29 | Ultrasonix Medical Corporation | Landmarks for ultrasound imaging |
| JP6123458B2 (en) * | 2013-04-25 | 2017-05-10 | コニカミノルタ株式会社 | Ultrasonic diagnostic imaging apparatus and method of operating ultrasonic diagnostic imaging apparatus |
| CN105491955B (en) | 2013-08-30 | 2018-07-03 | 富士胶片株式会社 | Diagnostic ultrasound equipment and method of generating ultrasonic image |
| WO2015068073A1 (en) | 2013-11-11 | 2015-05-14 | Koninklijke Philips N.V. | Multi-plane target tracking with an ultrasonic diagnostic imaging system |
| US9622720B2 (en) * | 2013-11-27 | 2017-04-18 | Clear Guide Medical, Inc. | Ultrasound system with stereo image guidance or tracking |
| US20150327841A1 (en) | 2014-05-13 | 2015-11-19 | Kabushiki Kaisha Toshiba | Tracking in ultrasound for imaging and user interface |
| EP4011298B1 (en) * | 2014-11-18 | 2025-03-05 | C. R. Bard, Inc. | Ultrasound imaging system having automatic image presentation |
| EP3236859B1 (en) * | 2014-12-24 | 2021-03-31 | Koninklijke Philips N.V. | Needle trajectory prediction for target biopsy |
| US20180132944A1 (en) | 2015-05-18 | 2018-05-17 | Koninklijke Philips N.V. | Intra-procedural accuracy feedback for image-guided biopsy |
| JP2018522646A (en) * | 2015-06-25 | 2018-08-16 | リヴァンナ メディカル、エルエルシー. | Probe ultrasound guidance for anatomical features |
| CN106344153B (en) * | 2016-08-23 | 2019-04-02 | 深圳先进技术研究院 | A kind of flexible puncture needle needle point autotracker and method |
- 2018
- 2018-02-05 EP EP18709474.3A patent/EP3582692A1/en not_active Withdrawn
- 2018-02-05 WO PCT/EP2018/052730 patent/WO2018149671A1/en not_active Ceased
- 2018-02-05 US US16/485,463 patent/US11357473B2/en active Active
- 2018-02-05 JP JP2019542565A patent/JP7218293B2/en active Active
- 2018-02-05 CN CN201880011815.6A patent/CN110300549A/en active Pending
- 2022
- 2022-05-16 US US17/745,431 patent/US20220273258A1/en not_active Abandoned
Also Published As
| Publication number | Publication date |
|---|---|
| WO2018149671A1 (en) | 2018-08-23 |
| US11357473B2 (en) | 2022-06-14 |
| JP7218293B2 (en) | 2023-02-06 |
| CN110300549A (en) | 2019-10-01 |
| EP3582692A1 (en) | 2019-12-25 |
| JP2020506005A (en) | 2020-02-27 |
| US20200037983A1 (en) | 2020-02-06 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20220273258A1 (en) | 2022-09-01 | Path tracking in ultrasound system for device tracking |
| US11147532B2 (en) | | Three-dimensional needle localization with a two-dimensional imaging probe |
| EP2566394B1 (en) | | Ultrasonic tracking of ultrasound transducer(s) aboard an interventional tool |
| King et al. | | Three‐dimensional spatial registration and interactive display of position and orientation of real‐time ultrasound images. |
| CN107920861B (en) | | Device for determining a kinematic relationship |
| US6733458B1 (en) | | Diagnostic medical ultrasound systems and methods using image based freehand needle guidance |
| KR102607014B1 (en) | | Ultrasound probe and manufacturing method for the same |
| US20190298457A1 (en) | | System and method for tracking an interventional instrument with feedback concerning tracking reliability |
| US20190219693A1 (en) | | 3-D US Volume From 2-D Images From Freehand Rotation and/or Translation of Ultrasound Probe |
| US20230380805A1 (en) | | Systems and methods for tissue characterization using multiple aperture ultrasound |
| EP3582693B1 (en) | | Focus tracking in ultrasound system for device tracking |
| US10213185B2 (en) | | Ultrasonic diagnostic apparatus |
| EP3446150B1 (en) | | Acoustic registration of internal and external ultrasound probes |
| US20220241024A1 (en) | | Ultrasound object point tracking |
| CN114269252A (en) | | Ultrasound based device positioning |
| CN109310393B (en) | | Image orientation identification for external microprotrusion linear ultrasound probe |
| Baker et al. | | Real-Time Ultrasonic Tracking of an Intraoperative Needle Tip with Integrated Fibre-Optic Hydrophone |
| EP3804629A1 (en) | | Ultrasound object point tracking |
| Tamura et al. | | Intrabody three-dimensional position sensor for an ultrasound endoscope |
| Liang et al. | | Volumetric ultrasound imaging with a sparse matrix array and integrated fiber-optic sensing for robust needle tracking in interventional procedures |
| CN116421216A (en) | | System and method for displaying ultrasound probe position using post diastole 3D imaging |
| US20110098567A1 (en) | | Three dimensional pulsed wave spectrum ultrasonic diagnostic apparatus and three dimensional pulsed wave spectrum data generation method |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |