
US20160174932A1 - Ultrasonic diagnostic device and ultrasonic image generation method - Google Patents


Info

Publication number
US20160174932A1
US20160174932A1 (application US 15/055,143; application number US201615055143A)
Authority
US
United States
Prior art keywords
needle
needle tip
image
unit
ultrasonic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/055,143
Other languages
English (en)
Inventor
Kimito Katsuyama
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujifilm Corp
Original Assignee
Fujifilm Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujifilm Corp filed Critical Fujifilm Corp
Assigned to FUJIFILM CORPORATION (assignment of assignors' interest; see document for details). Assignors: KATSUYAMA, KIMITO
Publication of US20160174932A1 (en)
Priority claimed by US 17/681,492, now US12186126B2
Priority claimed by US 18/965,809, published as US20250090134A1
Current legal status: Abandoned

Classifications

    • A61B 8/0841: Clinical applications of ultrasonic diagnosis involving detecting or locating foreign bodies or organic structures, for locating instruments
    • A61B 8/14: Echo-tomography
    • A61B 8/145: Echo-tomography characterised by scanning multiple planes
    • A61B 8/4488: Constructional features of the ultrasonic diagnostic device characterised by features of the ultrasound transducer, the transducer being a phased array
    • A61B 8/461: Devices with special arrangements for interfacing with the operator or the patient; displaying means of special interest
    • A61B 8/466: Displaying means of special interest adapted to display 3D data
    • A61B 8/5207: Devices using data or image processing for diagnosis, involving processing of raw data to produce diagnostic data, e.g. for generating an image
    • A61B 8/5223: Devices using data or image processing for diagnosis, involving extracting a diagnostic or physiological parameter from medical diagnostic data
    • G01S 15/8915: Short-range pulse-echo imaging systems using a static transducer configuration with a transducer array
    • G01S 7/52036: Details of receivers using analysis of the echo signal for target characterisation
    • G16H 50/30: ICT specially adapted for medical diagnosis, for calculating health indices or for individual health risk assessment
    • A61B 2017/3413: Needle locating or guiding means guided by ultrasound
    • A61B 2090/378: Surgical systems with images on a monitor during operation using ultrasound
    • A61M 25/0105: Steering means as part of the catheter or advancing means; markers for positioning

Definitions

  • the present invention relates to an ultrasonic diagnostic device and an ultrasonic image generation method, and in particular, to an ultrasonic diagnostic device that visualizes the positional relationship between the needle direction and the target tissue and the positional relationship between the needle tip and the target tissue by visualizing the needle tip of the needle inserted into the subject in an ultrasonic image.
  • this kind of ultrasonic diagnostic device includes an ultrasonic probe with built-in ultrasonic transducers and a device body connected to the ultrasonic probe, and generates an ultrasonic image by transmitting an ultrasonic wave toward a subject from the ultrasonic probe, receiving an ultrasonic echo from the subject using the ultrasonic probe, and performing electrical processing on the reception signal in the device body.
  • the needle that is inserted so as to be inclined at a predetermined angle with respect to the skin surface of the subject is inclined with respect to the ultrasonic wave transmitting and receiving surface of the ultrasonic probe, as shown in FIG. 16A .
  • in this case, the specular reflection wave from the needle may deviate from the reception opening.
  • since the reflection at the needle tip is not a perfect specular reflection, a slight reflection returns to the reception opening.
  • however, since the received signal strength is low, it is difficult to visualize the needle to the extent that it can be visually recognized.
  • since the visualization depth is limited by tilting the ultrasonic beam, the needle tip or the target tissue may not be drawn even if the needle itself can be drawn. Accordingly, the positional relationship between the needle direction or the needle tip and the target tissue is not known.
  • JP2010-183935A focuses on the fact that the amount of high frequency components in the reflection signal from the needle tip portion is smaller than that in the reflection signal from portions other than the needle tip portion, and an ultrasonic image in which the position of the needle tip portion can be easily visually recognized is generated by capturing an image of the low frequency band and an image of the high frequency band, taking a difference therebetween, and superimposing the difference image on an image of another high frequency band.
  • JP2012-213606A improves the visibility of both the body tissue and the puncture needle in a displayed image by capturing reflected waves from the puncture needle by performing a plurality of scans while changing the transmission direction of the ultrasonic wave, generating ultrasonic images with improved visibility of the puncture needle, generating a needle image based on the plurality of ultrasonic images with the changed transmission directions and the normal tissue image, and combining the normal tissue image and the needle image.
  • in JP2010-183935A, however, the frequency difference between the reflection signal from the needle tip portion and reflection signals from portions other than the needle tip portion is small. Therefore, the same frequency may also be obtained in portions other than the needle tip portion due to isolated point-like reflection or reflection conditions. For this reason, it is difficult to visualize only the needle tip.
  • JP2012-213606A does not describe the visualization of the needle tip even though a plurality of needle images are generated by performing a scan in a plurality of directions.
  • the present invention provides an ultrasonic diagnostic device that transmits an ultrasonic wave toward a subject from an ultrasonic probe and generates an ultrasonic image based on obtained reception data.
  • the ultrasonic diagnostic device includes: a tissue image generation unit that generates a tissue image of the subject by transmitting a transmission wave in a normal direction of an ultrasonic wave transmitting and receiving surface of the ultrasonic probe and receiving a reception wave from the normal direction of the subject; a needle information generation unit that generates needle information of a needle inserted into the subject by steering at least one of the transmission wave and the reception wave; a needle direction estimation unit that estimates a direction of the needle based on the needle information generated by the needle information generation unit; a search region setting unit that sets a search region of a needle tip in the tissue image based on the needle direction estimated by the needle direction estimation unit; a needle tip search unit that searches for the needle tip in the search region set by the search region setting unit; and a needle tip visualizing unit that visualizes the needle tip on the tissue image based on the needle tip found by the needle tip search unit.
  • the needle information generation unit generates a plurality of pieces of the needle information with different steering directions by changing a steering direction to steer at least one of the transmission wave and the reception wave, and the needle direction estimation unit estimates the needle direction based on the plurality of pieces of needle information with different steering directions. It is preferable that the needle information generated by the needle information generation unit is needle image data.
  • the needle direction estimation unit can estimate the needle direction by the Hough transform.
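  • as a rough illustration (not the patented implementation), estimating a dominant line angle from a binary needle image by Hough-style voting can be sketched as follows; the function name, angle resolution, and synthetic image are invented for the example:

```python
import numpy as np

def hough_line_direction(binary_img, angle_res_deg=1.0):
    """Estimate the dominant line direction in a binary image by Hough voting.

    Each foreground pixel votes for (theta, rho) pairs along
    rho = x*cos(theta) + y*sin(theta); the angle of the strongest
    accumulator cell approximates the needle direction.
    """
    ys, xs = np.nonzero(binary_img)
    thetas = np.deg2rad(np.arange(0.0, 180.0, angle_res_deg))
    diag = int(np.ceil(np.hypot(*binary_img.shape)))
    acc = np.zeros((len(thetas), 2 * diag + 1), dtype=np.int32)
    for x, y in zip(xs, ys):
        rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int) + diag
        acc[np.arange(len(thetas)), rhos] += 1
    t_idx, _ = np.unravel_index(np.argmax(acc), acc.shape)
    return np.rad2deg(thetas[t_idx])

# Synthetic "needle image": a diagonal line of bright pixels.
img = np.zeros((64, 64), dtype=np.uint8)
for i in range(64):
    img[i, i] = 1
angle = hough_line_direction(img)
```

a real implementation would typically threshold the B-mode needle image first and use an optimized Hough routine rather than this per-pixel loop.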
  • the search region setting unit sets the search region that extends to both sides of the needle direction estimated by the needle direction estimation unit with a predetermined width.
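  • a minimal sketch of such a band-shaped search region, assuming the estimated needle line is given by a point and an angle (the function name and parameters are illustrative, not from the patent):

```python
import numpy as np

def band_search_region(shape, point, angle_deg, half_width):
    """Boolean mask of pixels within `half_width` of the line through
    `point` at `angle_deg`, i.e. a band extending to both sides of the
    estimated needle direction."""
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]
    theta = np.deg2rad(angle_deg)
    # Unit normal to the line direction (cos(theta), sin(theta)).
    nx, ny = -np.sin(theta), np.cos(theta)
    dist = np.abs((xs - point[0]) * nx + (ys - point[1]) * ny)
    return dist <= half_width

# Band of half-width 2 pixels around the 45-degree line through the origin.
mask = band_search_region((32, 32), point=(0, 0), angle_deg=45.0, half_width=2.0)
```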
  • the needle tip search unit searches for a point, at which a brightness value is a maximum, in the search region as the needle tip.
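  • restricting the brightness-maximum search to the region mask can be sketched as below (a toy example with an invented image; the real search operates on the B-mode tissue image):

```python
import numpy as np

def find_needle_tip(image, region_mask):
    """Return the (row, col) of the brightest pixel inside the search region."""
    masked = np.where(region_mask, image, -np.inf)
    return np.unravel_index(np.argmax(masked), image.shape)

img = np.zeros((16, 16))
img[5, 9] = 1.0          # simulated bright needle-tip echo
mask = np.zeros((16, 16), dtype=bool)
mask[4:8, 8:12] = True   # search region around the estimated needle line
tip = find_needle_tip(img, mask)
```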
  • the needle tip search unit includes a needle tip pattern of the needle tip and searches for a point, at which a correlation with the needle tip pattern is a maximum, in the search region as the needle tip.
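  • one plausible form of such a pattern search is normalized cross-correlation between a stored tip template and each window of the search region; the cross-shaped pattern below is a hypothetical stand-in for a real needle tip echo pattern:

```python
import numpy as np

def match_tip_pattern(image, pattern):
    """Slide `pattern` over `image` and return the top-left position of the
    window with maximum normalized cross-correlation."""
    ph, pw = pattern.shape
    p = pattern - pattern.mean()
    best, best_pos = -np.inf, (0, 0)
    for r in range(image.shape[0] - ph + 1):
        for c in range(image.shape[1] - pw + 1):
            win = image[r:r + ph, c:c + pw]
            wz = win - win.mean()
            denom = np.sqrt((wz ** 2).sum() * (p ** 2).sum())
            score = (wz * p).sum() / denom if denom > 0 else 0.0
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos

pattern = np.array([[0., 1., 0.],
                    [1., 1., 1.],
                    [0., 1., 0.]])   # hypothetical tip echo template
img = np.zeros((12, 12))
img[6:9, 4:7] = pattern              # embed the template at row 6, col 4
pos = match_tip_pattern(img, pattern)
```

in practice the search would be limited to the band-shaped region rather than the full image.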
  • the needle tip visualizing unit visualizes a point image having a predetermined size at a position of the needle tip.
  • the needle tip visualizing unit visualizes a frame of a predetermined range from a position of the needle tip.
  • the needle tip visualizing unit may change a brightness value or a color of the tissue image inside or outside the frame, or the needle tip visualizing unit may apply a translucent mask onto the tissue image inside or outside the frame.
  • the needle tip search unit may compare a tissue image before movement of the needle tip with a tissue image after movement of the needle tip and search for the needle tip based on a change between the tissue images.
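  • a simple frame-differencing sketch of this idea, assuming the advancing tip is the deepest pixel whose brightness increased between frames (threshold and heuristic are invented for the example):

```python
import numpy as np

def tip_from_frame_difference(before, after, threshold=0.5):
    """Locate the needle tip as the deepest (largest-row) pixel whose
    brightness increased by more than `threshold` between two frames."""
    diff = after.astype(float) - before.astype(float)
    rows, cols = np.nonzero(diff > threshold)
    if rows.size == 0:
        return None
    k = np.argmax(rows)          # the advancing tip is the deepest change
    return (rows[k], cols[k])

before = np.zeros((10, 10))
after = before.copy()
after[3, 4] = 1.0   # shaft brightens as the needle advances
after[6, 5] = 1.0   # deepest new echo: the moving tip
tip = tip_from_frame_difference(before, after)
```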
  • the present invention provides an ultrasonic image generation method of transmitting an ultrasonic wave toward a subject from an ultrasonic probe and generating an ultrasonic image based on obtained reception data.
  • the ultrasonic image generation method includes: generating a tissue image of the subject by transmitting a transmission wave from the ultrasonic probe and receiving a reception wave from the subject; generating needle information of a needle inserted into the subject by steering at least one of the transmission wave and the reception wave; estimating a direction of the needle based on the needle information; setting a search region of a needle tip in the tissue image based on the estimated needle direction; searching for the needle tip in the set search region; and visualizing the needle tip on the tissue image based on the found needle tip.
  • according to the present invention, by specifying the position of the needle tip present in a deep portion of the subject during insertion and visualizing the needle tip in the tissue image, it is possible to visualize the positional relationship between the needle direction and the target tissue and the positional relationship between the needle tip and the target tissue.
  • FIG. 1 is a block diagram showing the overall configuration of an ultrasonic diagnostic device according to an embodiment of the present invention.
  • FIG. 2 is a flowchart showing the operation of the ultrasonic diagnostic device shown in FIG. 1 .
  • FIG. 3A is an explanatory view for explaining a scanning line V_i in a normal direction and a scanning line H_i in a steering direction in the ultrasonic diagnostic device shown in FIG. 1
  • FIG. 3B is an explanatory view of a tissue image corresponding to the scanning line V_i in the normal direction
  • FIG. 3C is an explanatory view of a needle image corresponding to the scanning line H_i in the steering direction.
  • FIG. 4 is an example of a tissue image, which is generated by the ultrasonic diagnostic device shown in FIG. 1 and in which a needle direction L estimated from the needle image is visualized.
  • FIG. 5 is an example of a tissue image in which a search region F, which is set based on the needle direction L in the tissue image shown in FIG. 4 , is visualized.
  • FIG. 6 is an enlarged extraction image of a region W shown in FIG. 5 .
  • FIG. 7 is an example when a needle tip N, which is a point image in FIG. 6 , is visualized.
  • FIG. 8 is an example when a needle tip N, a needle tip region NF, a needle body NB, and a search region F are visualized in the tissue image generated by the ultrasonic diagnostic device shown in FIG. 1 .
  • FIG. 9 is an example when the brightness value inside or outside the needle tip region NF in the tissue image shown in FIG. 8 is changed.
  • FIG. 10 is an example when a translucent mask is applied onto the inside or the outside of the needle tip region NF in the tissue image shown in FIG. 8 .
  • FIG. 11 is an explanatory view when performing transmission focus processing in the normal direction and reception focus processing in the needle direction in the ultrasonic diagnostic device shown in FIG. 1 .
  • FIG. 12 is an explanatory view when selecting a needle image for estimating the needle direction from a plurality of needle images with different steering directions.
  • FIG. 13 is a schematic diagram showing an example of the needle tip pattern.
  • FIG. 14 is an explanatory view when searching for the needle tip based on the needle tip pattern.
  • FIG. 15A is an example of a tissue image captured before the movement of the needle tip when capturing a plurality of tissue images with the movement of the needle tip
  • FIG. 15B is an example of a tissue image captured after the movement of the needle tip.
  • FIG. 16A is a diagram showing that the specular reflection of a needle by the ultrasonic beam in the normal direction deviates from the reception opening in the subject into which the needle is inserted
  • FIG. 16B is a diagram showing that an ultrasonic echo based on reflection from the needle can be received by transmitting the ultrasonic beam in a state in which the ultrasonic beam is steered in the needle direction in the subject into which the needle is inserted.
  • FIG. 1 is a block diagram showing the overall configuration of an ultrasonic diagnostic device according to an embodiment of the present invention.
  • the ultrasonic diagnostic device includes an ultrasonic probe 1 , and a transmission circuit 2 and a reception circuit 3 are connected to the ultrasonic probe 1 .
  • a tissue image generation unit 4 and a needle image generation unit 5 are connected in parallel to the reception circuit 3 .
  • a needle tip visualizing unit 9 is connected to the tissue image generation unit 4
  • a display unit 11 is connected to the needle tip visualizing unit 9 through a display control unit 10 .
  • a needle direction estimation unit 6 is connected to the needle image generation unit 5
  • a needle tip search unit 8 is connected to the needle direction estimation unit 6 through a search region setting unit 7
  • the needle tip search unit 8 is connected to the needle tip visualizing unit 9 .
  • the search region setting unit 7 is connected to the tissue image generation unit 4 .
  • a control unit 12 is connected to the transmission circuit 2 , the reception circuit 3 , the tissue image generation unit 4 , the needle image generation unit 5 , the needle tip visualizing unit 9 , the needle direction estimation unit 6 , the search region setting unit 7 , the needle tip search unit 8 , and the display control unit 10 .
  • An operation unit 13 and a storage unit 14 are connected to the control unit 12 .
  • the tissue image generation unit 4 includes a phasing addition section 15 A, a detection processing section 16 A, a digital scan converter (DSC) 17 A, and an image processing section 18 A, which are connected sequentially from the reception circuit 3 , and an image memory 19 A connected to the DSC 17 A.
  • the needle image generation unit 5 includes a phasing addition section 15 B, a detection processing section 16 B, a digital scan converter (DSC) 17 B, and an image processing section 18 B, which are connected sequentially from the reception circuit 3 , and an image memory 19 B connected to the DSC 17 B.
  • the ultrasonic probe 1 includes a plurality of elements arranged in a one-dimensional or two-dimensional array, and transmits an ultrasonic beam (transmission wave) based on a transmission signal supplied from the transmission circuit 2 , receives an ultrasonic echo (reception wave) from the subject, and outputs a reception signal.
  • each element that forms the ultrasonic probe 1 is formed by a transducer in which electrodes are formed at both ends of the piezoelectric body formed of piezoelectric ceramic represented by lead zirconate titanate (PZT), a polymer piezoelectric element represented by polyvinylidene fluoride (PVDF), piezoelectric single crystal represented by lead magnesium niobate-lead titanate (PMN-PT), or the like.
  • when a pulsed or continuous-wave transmission signal voltage is applied to the electrodes of the transducer, the piezoelectric body expands and contracts to generate pulsed or continuous-wave ultrasonic waves from each transducer. By combination of these ultrasonic waves, an ultrasonic beam is formed. In addition, the respective transducers expand and contract by receiving the propagating ultrasonic waves, thereby generating electrical signals. These electrical signals are output as reception signals of the ultrasonic waves.
  • the transmission circuit 2 includes a plurality of pulsers, for example.
  • the transmission circuit 2 performs transmission focus processing so that ultrasonic waves transmitted from the plurality of elements of the ultrasonic probe 1 form an ultrasonic beam based on the transmission delay pattern selected according to the control signal from the control unit 12 , adjusts the amount of delay of each transmission signal, and supplies the adjusted signals to the plurality of elements.
  • the ultrasonic beam from the ultrasonic probe 1 can be steered at a predetermined angle with respect to the normal direction of the ultrasonic wave transmitting and receiving surface.
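  • the classic phased-array relation behind this steering, delay_n = n · pitch · sin(θ) / c, can be sketched as follows (the function name, element count, pitch, and sound speed are illustrative assumptions, not values from the patent):

```python
import numpy as np

def steering_delays(n_elements, pitch_m, angle_deg, c_m_s=1540.0):
    """Per-element transmit delays (seconds) that steer a linear-array beam
    by `angle_deg` from the normal of the transmitting surface.

    delay_n = n * pitch * sin(theta) / c, shifted so the smallest delay is 0.
    """
    n = np.arange(n_elements)
    d = n * pitch_m * np.sin(np.deg2rad(angle_deg)) / c_m_s
    return d - d.min()

# 8 elements at 0.3 mm pitch, steered 30 degrees at c = 1540 m/s.
delays = steering_delays(8, pitch_m=0.3e-3, angle_deg=30.0)
```

the same delay pattern, applied on reception, steers the receive sensitivity in the corresponding direction.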
  • the reception circuit 3 performs amplification and A/D conversion of the analog reception signals output from the plurality of elements of the ultrasonic probe 1 , and outputs digital reception signals to the phasing addition section 15 A of the tissue image generation unit 4 or the phasing addition section 15 B of the needle image generation unit 5 or to both of the phasing addition section 15 A of the tissue image generation unit 4 and the phasing addition section 15 B of the needle image generation unit 5 in response to the instruction from the control unit 12 .
  • the phasing addition section 15 A of the tissue image generation unit 4 acquires the digital reception signals from the reception circuit 3 in response to the instruction from the control unit 12 , and performs reception focus processing by delaying the reception signals based on the reception delay pattern from the control unit 12 and adding the delayed reception signals.
  • by this reception focus processing, reception data (sound ray signal) based on the ultrasonic echo from the target tissue is generated.
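  • the delay-and-sum operation at the heart of this phasing addition can be sketched with integer sample delays (a toy 3-channel example; real beamformers use fractional delays per focal point):

```python
import numpy as np

def delay_and_sum(channel_data, delays_samples):
    """Delay each channel by its (integer) sample delay and sum, producing
    one focused sound-ray signal. Echoes from the focal point align and
    add coherently; off-focus echoes do not."""
    n_ch, n_s = channel_data.shape
    out = np.zeros(n_s)
    for ch, d in enumerate(delays_samples):
        out[d:] += channel_data[ch, :n_s - d]
    return out

# Three channels carrying the same echo, arriving 0/1/2 samples apart.
echo = np.array([0., 0., 1., 0., 0., 0.])
data = np.stack([np.roll(echo, d) for d in (0, 1, 2)])
ray = delay_and_sum(data, delays_samples=[2, 1, 0])
```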
  • the detection processing section 16 A generates a B-mode image signal, which is tomographic image information regarding a tissue within the subject, by correcting the attenuation due to the distance according to the depth of the reflection position of the ultrasonic wave and then performing envelope detection processing for the reception data.
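  • a minimal sketch of these two steps, attenuation (depth-gain) correction followed by envelope detection via an FFT-based analytic signal (an assumed implementation; the patent does not specify the detection algorithm):

```python
import numpy as np

def envelope_with_tgc(rf_line, fs_hz, atten_db_per_s):
    """Depth-dependent gain (time-gain compensation) followed by envelope
    detection via the analytic signal (FFT-based Hilbert transform)."""
    n = rf_line.size
    t = np.arange(n) / fs_hz
    gain = 10.0 ** (atten_db_per_s * t / 20.0)   # compensate depth attenuation
    x = rf_line * gain
    # Analytic signal: zero negative frequencies, double positive ones.
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.abs(np.fft.ifft(X * h))

# Toy RF line: a pure tone with 64 full cycles; its envelope is constant 1.
fs = 1e6
n = 1024
rf = np.sin(2 * np.pi * 64 * np.arange(n) / n)
env = envelope_with_tgc(rf, fs, atten_db_per_s=0.0)
```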
  • the DSC 17 A converts the B-mode image signal generated by the detection processing section 16 A into an image signal according to the normal television signal scanning method (raster conversion). In addition, by converting the B-mode image signal in the DSC 17 A, it is possible to grasp the positional relationship or the distance corresponding to the tissue of the actual subject on the B-mode image.
  • the image processing section 18 A generates a B-mode image signal of the tissue image by performing various kinds of required image processing, such as gradation processing, on the B-mode image signal input from the DSC 17 A.
  • the phasing addition section 15 B of the needle image generation unit 5 acquires the digital reception signals from the reception circuit 3 in response to the instruction from the control unit 12 , and performs reception focus processing by delaying the reception signals based on the reception delay pattern from the control unit 12 and adding the delayed reception signals.
  • the phasing addition section 15 B generates reception data (sound ray signal) based on the ultrasonic echo from the needle, which is steered at a predetermined angle with respect to the normal direction of the ultrasonic wave transmitting and receiving surface by adjusting the amount of delay of each reception signal.
  • similar to the detection processing section 16 A, the detection processing section 16 B generates a B-mode image signal, which is tomographic image information regarding a tissue within the subject, by correcting the attenuation due to the distance according to the depth of the reflection position of the ultrasonic wave and then performing envelope detection processing for the reception data.
  • the DSC 17 B converts the B-mode image signal generated by the detection processing section 16 B into an image signal according to the normal television signal scanning method (raster conversion).
  • by converting the B-mode image signal in the DSC 17 B, it is possible to grasp the positional relationship or the distance corresponding to the tissue of the actual subject on the B-mode image.
  • the image processing section 18 B generates a B-mode image signal of the needle image from the B-mode image signal input from the DSC 17 B.
  • the needle direction estimation unit 6 estimates a needle direction, which indicates a direction in which the needle inserted into the subject is present, from the B-mode image signal of the needle image output from the image processing section 18 B, and generates needle direction information indicating the position of the needle direction.
  • the search region setting unit 7 acquires the needle direction information from the needle direction estimation unit 6 and acquires the B-mode image signal of the tissue image from the image processing section 18 A of the tissue image generation unit 4 , visualizes a needle direction on the tissue image based on the needle direction information, and sets a search region for searching for the needle tip based on the needle direction on the tissue image. For example, a region that extends to both sides of the needle direction with a predetermined width may be set as a search region.
  • the needle tip search unit 8 generates the position information of the needle tip by searching for the needle tip in the search region, which is set by the search region setting unit 7 , in the tissue image in which the needle direction and the search region are set.
  • the needle tip visualizing unit 9 acquires the position information of the needle tip from the needle tip search unit 8 and acquires the B-mode image signal of the tissue image from the image processing section 18 A of the tissue image generation unit 4 , and visualizes the needle tip on the tissue image.
  • the needle tip visualizing unit 9 may visualize a needle direction from the needle tip to the base of the needle based on the needle direction information, or may visualize a search region based on the information of the search region.
  • the display control unit 10 acquires a B-mode image signal of the tissue image in which the needle tip is visualized by the needle tip visualizing unit 9 , and displays the tissue image in which the needle tip is visualized on the display unit 11 .
  • the display unit 11 includes a display device, such as an LCD, and displays a tissue image, which is an ultrasonic image, under the control of the display control unit 10 .
  • the control unit 12 controls each unit based on the instruction input from the operation unit by the operator. As described above, the control unit 12 selects and outputs a transmission delay pattern for the transmission circuit 2 or selects and outputs a reception delay pattern for the reception circuit 3 , and outputs an instruction on phasing addition or the correction of attenuation and envelope detection processing, based on the reception delay pattern or the transmission delay pattern, to the phasing addition section 15 A or the detection processing section 16 A of the tissue image generation unit 4 or to the phasing addition section 15 B or the detection processing section 16 B of the needle image generation unit 5 .
  • the operation unit 13 is used when the operator performs an input operation, and can be formed by a keyboard, a mouse, a trackball, a touch panel, and the like.
  • Various kinds of information input from the operation unit 13, information based on the above-described transmission delay pattern or reception delay pattern, information regarding the sound speed in an inspection target region of the subject, the focal position of the ultrasonic beam, and the transmission opening and the reception opening of the ultrasonic probe 1, an operation program required for the control of each unit, and the like are stored in the storage unit 14.
  • Recording media such as a hard disk, a flexible disk, an MO, an MT, a RAM, a CD-ROM, and a DVD-ROM, can be used as the storage unit 14 .
  • FIG. 2 is a flowchart showing the operation of an embodiment.
  • i is the order of the scanning line of the ultrasonic probe 1 , and the ultrasonic probe 1 acquires a reception signal corresponding to each scanning line.
  • In step S2, corresponding to the scanning line V_1 in the normal direction, the ultrasonic probe 1 acquires a reception signal corresponding to the scanning line V_1 by transmitting the ultrasonic beam toward the target tissue T in the normal direction of the ultrasonic wave transmitting and receiving surface S and receiving the ultrasonic echo from that direction, and the tissue image generation unit 4 generates a tissue image corresponding to the normal direction scanning line V_1 shown in FIG. 3B and stores the tissue image in the image memory 19A.
  • In step S3, corresponding to the scanning line H_1 in the steering direction, the ultrasonic probe 1 acquires a reception signal corresponding to the scanning line H_1 by transmitting the ultrasonic beam in the steering direction, which is steered by the predetermined angle θ from the normal direction of the ultrasonic wave transmitting and receiving surface S toward the needle direction, and receiving the ultrasonic echo from the steering direction, and the needle image generation unit 5 generates a needle image corresponding to the steering direction scanning line H_1 shown in FIG. 3C and stores the needle image in the image memory 19B.
  • the predetermined angle θ may be a fixed value set in advance, or may be acquired from a device (not shown) that calculates the angle formed by the normal direction of the probe and the needle insertion direction. Alternatively, the direction in which the strongest signal is returned after transmitting and receiving in a plurality of directions in advance may be set as the predetermined angle.
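The last option, transmitting and receiving in several candidate directions in advance and keeping the direction with the strongest return, can be sketched as follows. This is an illustrative sketch only; the candidate angles and echo frames are hypothetical stand-ins for trial transmissions, not values from this disclosure.

```python
import numpy as np

def pick_steering_angle(candidate_angles_deg, echo_frames):
    """Pick the steering angle whose trial frame returns the most
    echo energy (sum of squared sample amplitudes)."""
    energies = [float(np.sum(np.asarray(f, dtype=float) ** 2))
                for f in echo_frames]
    return candidate_angles_deg[int(np.argmax(energies))]

# Hypothetical trial data: three candidate steering angles and the
# echo frames received for each of them.
angles = [20.0, 30.0, 40.0]
frames = [np.ones((4, 4)) * 0.1,
          np.ones((4, 4)) * 0.5,   # strongest return
          np.ones((4, 4)) * 0.2]
best = pick_steering_angle(angles, frames)
```

In practice the energy measure could be replaced by any signal-strength statistic; the idea is only that the direction most nearly perpendicular to the needle reflects most strongly.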
  • The process then proceeds to step S5 to increase i by 1, that is, to move to the second scanning line, and steps S2 to S4 are repeated to generate B-mode image signals of the corresponding tissue image and needle image.
  • In this way, steps S2 and S3 are repeated.
  • When B-mode image signals of tissue images for all of the "n" scanning lines V_1 to V_n and B-mode image signals of needle images for all of the "n" scanning lines H_1 to H_n have been generated, the process proceeds from step S4 to step S6.
  • In step S6, the needle direction estimation unit 6 estimates a needle direction L based on the B-mode image signal obtained by scan-converting the needle image stored in the image memory 19B.
  • The estimation of the needle direction is performed by calculating the brightness distribution in the entire needle image or in a predetermined region in which a needle is assumed to be included, detecting a straight line in the entire needle image or in the predetermined region by the Hough transform, setting the detected straight line as the needle direction, and setting the position information of the needle direction as the needle direction information.
  • When a straight line is detected by the Hough transform, the brightness value may be multiplied as a weighting factor when converting each pixel to a curve in the (ρ, θ) parameter space and superimposing the curves on each other. Through this method, a high-brightness straight line, such as a needle, can be easily detected.
  • the needle direction information of the needle direction L estimated by the needle direction estimation unit 6 is output to the search region setting unit 7 .
  • In step S7, the search region setting unit 7 superimposes a signal in the needle direction L on the B-mode image signal, which is obtained by scan-converting the tissue image stored in the image memory 19A, based on the needle direction information output from the needle direction estimation unit 6 as shown in FIG. 4, and sets a search region F extending from the needle direction L of the tissue image to both sides of the needle direction L with a predetermined width r as shown in FIG. 5.
  • the B-mode image signal of the tissue image in which the needle direction L and the search region F are set is output to the needle tip search unit 8 .
  • the predetermined width r may be set, based on the width of the needle inserted into the body, to three to five times the needle width.
  • In step S8, the needle tip search unit 8 calculates the brightness distribution of the tissue image and determines the maximum brightness point B in the search region F as the needle tip, as shown in FIG. 6, which shows the region W in FIG. 5 in an enlarged manner.
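A minimal sketch of this search step, assuming the needle direction is available in (ρ, θ) line form: build the band-shaped search region of half-width r around the line, then take the maximum-brightness point inside the band. The image and parameter values are illustrative only.

```python
import numpy as np

def find_needle_tip(img, rho, theta, r):
    """Search region F: pixels within distance r of the line
    x*cos(theta) + y*sin(theta) = rho. The needle tip candidate is the
    maximum-brightness point inside that band."""
    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    dist = np.abs(xs * np.cos(theta) + ys * np.sin(theta) - rho)
    band = dist <= r
    masked = np.where(band, img, -np.inf)   # exclude pixels outside F
    y, x = np.unravel_index(np.argmax(masked), img.shape)
    return x, y

img = np.zeros((16, 16))
img[5, 2:12] = 0.5          # needle shaft along the estimated line
img[5, 11] = 1.0            # brightest point: the tip
img[12, 3] = 0.9            # bright speckle far from the line, ignored
tip = find_needle_tip(img, rho=5.0, theta=np.pi / 2, r=2.0)
```

Restricting the argmax to the band is what prevents a bright reflector elsewhere in the tissue image (the speckle above) from being mistaken for the tip.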
  • the needle tip search unit 8 may have a needle tip pattern, such as an image of the needle tip, in advance, take a correlation with the needle tip pattern in the tissue image in the search region F, and determine a point at which the correlation is the maximum as the needle tip.
  • the position information of the needle tip found by the needle tip search unit 8 is output to the needle tip visualizing unit 9 .
  • the needle tip search unit 8 may output needle direction information or the information of the search region together with the position information of the needle tip.
  • In step S9, the needle tip visualizing unit 9 visualizes the needle tip N, as a point image having a predetermined size, in the tissue image based on the position information of the needle tip found by the needle tip search unit 8.
  • the B-mode image signal of the tissue image in which the needle tip is visualized is output to the display control unit 10 , and is displayed as a tissue image in which the needle tip is visualized on the display unit 11 .
  • In the example above, the needle tip visualizing unit 9 visualizes the needle tip N in the tissue image, but it is also possible to adopt various other display methods for making the needle tip clear in the tissue image.
  • For example, a circular frame showing a needle tip region NF that extends by a predetermined radius from the position of the needle tip may be displayed; the search region F may be displayed based on the information of the above-described search region; or a needle body NB, obtained by visualizing the needle direction L from the needle tip N or by connecting the needle tip N to the base portion of the needle direction L with a straight line, may be visualized based on the needle direction L.
  • the needle tip visualizing unit 9 may change the brightness value or the color of a tissue image inside or outside the needle tip region NF surrounded by the circular frame as shown in FIG. 9 , or may apply a translucent mask onto the tissue image of the inside or the outside of the needle tip region NF surrounded by the circular frame as shown in FIG. 10 .
  • Alternatively, a frame having a predetermined shape, for example, a rectangular frame or a rhombic frame having the position of the needle tip at its center, may be displayed.
  • By emphasizing the needle tip through visualization in the tissue image as described above, the needle tip can be easily recognized visually in the tissue image. Therefore, it is possible to clearly grasp the positional relationship between the needle direction and the target tissue and the positional relationship between the needle tip and the target tissue.
  • a tissue image can be generated by performing transmission focus processing on the ultrasonic wave toward a predetermined focal point in the normal direction of the ultrasonic wave transmitting and receiving surface and performing reception focus processing on the ultrasonic echo from the target tissue in the same direction, and a needle image can be generated by performing reception focus processing on the ultrasonic echo from the needle in the R direction indicated by the dotted arrow.
  • In the example above, the needle direction is estimated based on one needle image.
  • However, a plurality of needle images with different steering directions may be generated by changing the steering direction, that is, by steering at least one of the direction of transmission focus processing (the transmission direction of the ultrasonic beam) and the direction of reception focus processing of the ultrasonic echo; the sharpest needle image among the plurality of needle images may then be selected, and the above-described needle direction may be estimated based on the selected image.
  • the needle direction estimation unit 6 acquires a plurality of needle images with different steering directions from the needle image generation unit 5 , and selects a needle image in which the needle is visualized best as shown in FIG. 12 .
  • For the selection, the brightness distribution in the entire needle image, or in a predetermined region in which a needle is assumed to be included, may be calculated for each needle image, and the needle image containing the point of the highest brightness value or the needle image having the maximum average brightness value may be selected, for example.
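The first of these criteria, selecting the needle image that contains the single highest brightness value, can be sketched as follows; the frames are hypothetical stand-ins for images acquired at different steering angles.

```python
import numpy as np

def select_best_needle_image(needle_images):
    """Among needle images taken with different steering directions,
    select the index of the one containing the single highest
    brightness value."""
    peaks = [float(np.max(im)) for im in needle_images]
    return int(np.argmax(peaks))

# Hypothetical frames for three steering angles; the middle one shows
# the strongest specular reflection from the needle.
frames = [np.full((8, 8), 0.2),
          np.full((8, 8), 0.2),
          np.full((8, 8), 0.1)]
frames[1][4, :] = 0.9
best_index = select_best_needle_image(frames)
```

Swapping `np.max` for `np.mean` gives the alternative criterion mentioned above (maximum average brightness) without changing the surrounding logic.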
  • it is possible to estimate the needle direction by selecting a needle image in which the needle is visualized best. That is, a direction perpendicular to the steering direction in which the needle is visualized best can be estimated to be the needle direction.
  • Alternatively, the brightness distribution in the entire needle image, or in a predetermined region in which a needle is assumed to be included, may be calculated for each of the plurality of needle images with different steering directions, a straight line may be detected by the Hough transform or the like, and the needle image in which the average brightness value along the straight line is the maximum may be selected.
  • A needle image whose maximum brightness value on the straight line is higher than the maximum brightness values on the straight lines in the other needle images may also be selected.
  • The predetermined region in which a needle is assumed to be included can be determined from the approximate insertion angle of the needle, for example.
  • the needle direction estimation unit 6 estimates the needle direction based on the selected needle image.
  • Alternatively, the needle direction estimation unit 6 may calculate the brightness distribution in the entire needle image, or in a predetermined region in which a needle is assumed to be included, using all of the plurality of needle images with different steering directions, detect a straight line based on that brightness distribution by the Hough transform or the like, and set the straight line as the needle direction.
  • In the example above, the needle tip search unit 8 determines the maximum brightness point B in the search region F as the needle tip.
  • the needle tip search unit 8 may have a needle tip pattern in advance and search for the needle tip based on the needle tip pattern.
  • As a needle tip pattern, for example, an image such as that shown in FIG. 13 may be used: a line segment of a predetermined length d connecting the needle tip and the end of the cut surface of the needle, with a high-brightness point at each end of the line segment.
  • The needle tip search unit 8 may hold the above-described needle tip pattern. Then, as shown in FIG. 14, the needle tip search unit 8 may search for a high-brightness point B1 and a high-brightness point B2 that are the most correlated with the needle tip pattern in the search region F, and determine, of the two points, the high-brightness point B1 located deeper in the subject as the needle tip.
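A simplified sketch of this two-point pattern search, assuming the pattern is reduced to two bright points a distance d apart and that depth increases with the row index; `tol` and `thresh` are illustrative parameters, not values from the disclosure.

```python
import numpy as np

def tip_from_two_point_pattern(img, d, tol=1.5, thresh=0.8):
    """Find two high-brightness points roughly a distance d apart
    (needle tip and end of the needle's cut surface) and return the one
    lying deeper in the subject (larger row index = greater depth)."""
    ys, xs = np.nonzero(img >= thresh)
    pts = list(zip(ys.tolist(), xs.tolist()))
    for i in range(len(pts)):
        for j in range(i + 1, len(pts)):
            dist = np.hypot(pts[i][0] - pts[j][0], pts[i][1] - pts[j][1])
            if abs(dist - d) <= tol:
                deeper = max(pts[i], pts[j])   # larger row = deeper point B1
                return deeper[1], deeper[0]    # (x, y)
    return None

img = np.zeros((16, 16))
img[6, 4] = 1.0    # end of the cut surface (B2)
img[9, 8] = 1.0    # needle tip, deeper in the subject (B1)
tip = tip_from_two_point_pattern(img, d=5.0)
```

A full implementation would correlate the actual pattern image over the search region instead of thresholding isolated points, but the tie-breaking rule (keep the deeper of the two matches) is the same.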
  • the needle tip search unit 8 may search for the needle tip by comparing the tissue image before the movement with the tissue image after the movement.
  • the needle tip search unit 8 may calculate the brightness distribution in each of the tissue image before movement and the tissue image after movement and search for the needle tip based on the change in the brightness value.
  • For example, comparing FIG. 15A, which is a tissue image before movement, with FIG. 15B, which is a tissue image after movement, a point P2 where the brightness value suddenly becomes large in FIG. 15B may be determined as the needle tip, or a point P1 where the brightness value suddenly becomes small may be determined as the needle tip.
  • Alternatively, the point P2 may be determined as the needle tip based on the fact that the point P2, where the brightness value suddenly becomes large, and the point P1, where the brightness value suddenly becomes small, are adjacent to each other.
  • Alternatively, a needle tip pattern image of the brightness change, including the point P2 where the brightness value suddenly becomes large and the point P1 where the brightness value suddenly becomes small, may be prepared in advance; the point in the search region F that is the most correlated with this pattern in the image of the brightness change between the tissue images before and after movement may then be searched for and determined to be the needle tip.
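The simplest variant, determining P2 directly as the point of the largest brightness increase between the two frames, might look like the following sketch (synthetic images for illustration):

```python
import numpy as np

def tip_from_brightness_change(before, after):
    """Compare tissue images before and after a small needle movement
    and return the point where the brightness increases the most
    (the point P2 in the description)."""
    diff = after.astype(float) - before.astype(float)
    y, x = np.unravel_index(np.argmax(diff), diff.shape)
    return x, y

before = np.zeros((16, 16))
after = np.zeros((16, 16))
before[7, 6] = 1.0          # tip position before movement (becomes P1)
after[7, 9] = 1.0           # tip position after movement (P2)
p2 = tip_from_brightness_change(before, after)
```

Taking the argmin of the same difference image would give P1, and requiring the two extrema to be adjacent implements the combined criterion described above.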
  • the needle tip search unit 8 may compare the tissue image before movement with the tissue image after movement, calculate the amount of movement and the movement direction between the images at each point in a predetermined region including the needle tip by the two-dimensional correlation operation or the like, and determine a point of the largest amount of movement or a point of the largest spatial change in the amount of movement or the movement direction as the needle tip.
  • the needle tip search unit 8 may compare the tissue image before movement with the tissue image after movement, calculate a change before and after movement in the image pattern near each point in a predetermined region including the needle tip by the two-dimensional correlation operation or the like, and determine a point of the largest image pattern change or a point of the largest spatial change of the image pattern change as the needle tip.
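A toy version of this movement-based search, using a sum-of-squared-differences block search in place of the two-dimensional correlation operation mentioned above; the patch and search window sizes are arbitrary illustrative choices.

```python
import numpy as np

def motion_magnitude(before, after, y, x, half=2, search=3):
    """Amount of movement at (y, x): match the patch around (y, x) in
    the 'before' frame against shifted positions in the 'after' frame
    (sum-of-squared-differences block search) and return the length of
    the best-matching displacement vector."""
    patch = before[y - half:y + half + 1, x - half:x + half + 1]
    best, best_dv = np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cand = after[y + dy - half:y + dy + half + 1,
                         x + dx - half:x + dx + half + 1]
            ssd = float(np.sum((patch - cand) ** 2))
            if ssd < best:
                best, best_dv = ssd, (dy, dx)
    return float(np.hypot(*best_dv))

before = np.zeros((20, 20))
after = np.zeros((20, 20))
before[10, 8] = 1.0
after[10, 11] = 1.0          # the bright point moved 3 pixels laterally
amount = motion_magnitude(before, after, 10, 8)
```

Evaluating this magnitude at every point of the predetermined region and taking the argmax would give the "point of the largest amount of movement" criterion; comparing neighboring magnitudes would give the spatial-change criterion.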
  • In the embodiment described above, the needle image generation unit 5 generates a needle image, and the needle direction estimation unit 6 estimates a needle direction based on the needle image.
  • However, the needle direction may instead be estimated based on the reception signal from each element of the ultrasonic probe 1, or based on the reception data (sound ray signal) after phasing addition.
  • 12 control unit
  • 19A, 19B image memory
  • V_i normal direction scanning line

US15/055,143 2013-08-30 2016-02-26 Ultrasonic diagnostic device and ultrasonic image generation method Abandoned US20160174932A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US17/681,492 US12186126B2 (en) 2013-08-30 2022-02-25 Control method of an ultrasonic diagnostic apparatus and non-transitory computer readable recording medium
US18/965,809 US20250090134A1 (en) 2013-08-30 2024-12-02 Control method of an ultrasonic diagnostic apparatus and non-transitory computer readable recording medium

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2013-179830 2013-08-30
JP2013179830 2013-08-30
PCT/JP2014/062064 WO2015029499A1 (fr) 2013-08-30 2014-05-01 Ultrasonic diagnostic device and ultrasonic image generation method

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/062064 Continuation WO2015029499A1 (fr) 2013-08-30 2014-05-01 Ultrasonic diagnostic device and ultrasonic image generation method

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/681,492 Division US12186126B2 (en) 2013-08-30 2022-02-25 Control method of an ultrasonic diagnostic apparatus and non-transitory computer readable recording medium

Publications (1)

Publication Number Publication Date
US20160174932A1 (en) 2016-06-23

Family

ID=52586081

Family Applications (3)

Application Number Title Priority Date Filing Date
US15/055,143 Abandoned US20160174932A1 (en) 2013-08-30 2016-02-26 Ultrasonic diagnostic device and ultrasonic image generation method
US17/681,492 Active US12186126B2 (en) 2013-08-30 2022-02-25 Control method of an ultrasonic diagnostic apparatus and non-transitory computer readable recording medium
US18/965,809 Pending US20250090134A1 (en) 2013-08-30 2024-12-02 Control method of an ultrasonic diagnostic apparatus and non-transitory computer readable recording medium

Family Applications After (2)

Application Number Title Priority Date Filing Date
US17/681,492 Active US12186126B2 (en) 2013-08-30 2022-02-25 Control method of an ultrasonic diagnostic apparatus and non-transitory computer readable recording medium
US18/965,809 Pending US20250090134A1 (en) 2013-08-30 2024-12-02 Control method of an ultrasonic diagnostic apparatus and non-transitory computer readable recording medium

Country Status (4)

Country Link
US (3) US20160174932A1 (fr)
JP (1) JP6097258B2 (fr)
CN (1) CN105491955B (fr)
WO (1) WO2015029499A1 (fr)


Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6589619B2 (ja) * 2015-01-09 2019-10-16 Konica Minolta, Inc. Ultrasonic diagnostic device
JP6746895B2 (ja) * 2015-11-06 2020-08-26 Konica Minolta, Inc. Ultrasonic diagnostic device and ultrasonic signal processing method
JP6668817B2 (ja) * 2016-02-26 2020-03-18 Konica Minolta, Inc. Ultrasonic diagnostic device and control program
CN119014903A (zh) * 2018-01-23 2024-11-26 Koninklijke Philips N.V. Ultrasound imaging system providing needle insertion guidance
JP2021534861A (ja) * 2018-08-22 2021-12-16 Koninklijke Philips N.V. System, device, and method for constraining sensor tracking estimates in interventional acoustic imaging
CN110251210B (zh) * 2019-05-28 2021-01-01 聚融医疗科技(杭州)有限公司 Puncture enhancement method and device based on block-wise RHT
EP4252672B1 (fr) * 2020-11-27 2025-07-02 FUJIFILM Corporation Information processing device, ultrasonic diagnostic device, information processing method, and information processing program

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6048312A (en) * 1998-04-23 2000-04-11 Ishrak; Syed Omar Method and apparatus for three-dimensional ultrasound imaging of biopsy needle
JP2004208859A (ja) * 2002-12-27 2004-07-29 Toshiba Corp Ultrasonic diagnostic device
JP2006346477A (ja) * 2006-08-21 2006-12-28 Olympus Corp Ultrasonic diagnostic device
US20070270687A1 (en) * 2004-01-13 2007-11-22 Gardi Lori A Ultrasound Imaging System and Methods Of Imaging Using the Same
US20100056917A1 (en) * 2008-08-26 2010-03-04 Fujifilm Corporation Ultrasonic diagnostic apparatus
US20100298705A1 (en) * 2009-05-20 2010-11-25 Laurent Pelissier Freehand ultrasound imaging systems and methods for guiding fine elongate instruments
US20120078103A1 (en) * 2010-09-28 2012-03-29 Fujifilm Corporation Ultrasound diagnostic system, ultrasound image generation apparatus, and ultrasound image generation method
US20120253181A1 (en) * 2011-04-01 2012-10-04 Toshiba Medical Systems Corporation Ultrasound diagnosis apparatus and controlling method
US8343054B1 (en) * 2010-09-30 2013-01-01 Hitachi Aloka Medical, Ltd. Methods and apparatus for ultrasound imaging
US20140031673A1 (en) * 2012-07-26 2014-01-30 Ge Medical Systems Global Technology Company, Llc Ultrasonic diagnostic apparatus and control program thereof
US20150320386A9 (en) * 2013-06-27 2015-11-12 Ge Medical Systems Global Technology Company, Llc Ultrasonic diagnostic device and control program for the same
US9642592B2 (en) * 2013-01-03 2017-05-09 Siemens Medical Solutions Usa, Inc. Needle enhancement in diagnostic ultrasound imaging

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6951542B2 (en) * 2002-06-26 2005-10-04 Esaote S.P.A. Method and apparatus for ultrasound imaging of a biopsy needle or the like during an ultrasound imaging examination
JP5416900B2 (ja) * 2007-11-22 2014-02-12 Toshiba Corp Ultrasonic diagnostic device and puncture support control program
CN101744639A (zh) * 2008-12-19 2010-06-23 GE Medical Systems Global Technology Co., LLC Ultrasound imaging method and apparatus
JP5438985B2 (ja) 2009-02-10 2014-03-12 Toshiba Corp Ultrasonic diagnostic device and control program for ultrasonic diagnostic device
JP5495593B2 (ja) * 2009-03-23 2014-05-21 Toshiba Corp Ultrasonic diagnostic device and puncture support control program
WO2011058840A1 (fr) * 2009-11-16 2011-05-19 Olympus Medical Systems Corp. Ultrasonic observation device and control method for ultrasonic observation device
US8861822B2 (en) * 2010-04-07 2014-10-14 Fujifilm Sonosite, Inc. Systems and methods for enhanced imaging of objects within an image
EP2566394B1 (fr) * 2010-05-03 2016-12-14 Koninklijke Philips N.V. Ultrasonic tracking of ultrasound transducer(s) aboard an interventional tool
JP5645628B2 (ja) * 2010-12-09 2014-12-24 FUJIFILM Corporation Ultrasonic diagnostic device
JP5486449B2 (ja) * 2010-09-28 2014-05-07 FUJIFILM Corporation Ultrasonic image generation device and operation method of ultrasonic image generation device
EP2454996A1 (fr) 2010-11-17 2012-05-23 Samsung Medison Co., Ltd. Providing an optimal ultrasound image for interventional treatment in a medical system
JP5435751B2 (ja) * 2011-03-03 2014-03-05 FUJIFILM Corporation Ultrasonic diagnostic device, ultrasonic transmission/reception method, and ultrasonic transmission/reception program
WO2014002963A1 (fr) * 2012-06-25 2014-01-03 Toshiba Corp Ultrasonic diagnostic apparatus and image processing method
CN105518482B (zh) * 2013-08-19 2019-07-30 BK Medical Holding Company, Inc. Ultrasound imaging instrument visualization


Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3278738A4 (fr) * 2015-04-03 2018-05-02 Fujifilm Corporation Acoustic wave image generation device and method
US10646198B2 (en) * 2015-05-17 2020-05-12 Lightlab Imaging, Inc. Intravascular imaging and guide catheter detection methods and systems
US11850089B2 (en) 2015-11-19 2023-12-26 Lightlab Imaging, Inc. Intravascular imaging and guide catheter detection methods and systems
US12226256B2 (en) 2015-11-19 2025-02-18 Lightlab Imaging, Inc. Intravascular imaging and guide catheter detection methods and systems
JP2020506005A (ja) * 2017-02-14 2020-02-27 Koninklijke Philips N.V. Path tracking in an ultrasound system for device tracking
US11357473B2 (en) 2017-02-14 2022-06-14 Koninklijke Philips N.V. Path tracking in ultrasound system for device tracking
US20230131115A1 (en) * 2021-10-21 2023-04-27 GE Precision Healthcare LLC System and Method for Displaying Position of Echogenic Needles

Also Published As

Publication number Publication date
CN105491955B (zh) 2018-07-03
US12186126B2 (en) 2025-01-07
CN105491955A (zh) 2016-04-13
WO2015029499A1 (fr) 2015-03-05
US20250090134A1 (en) 2025-03-20
JP2015062668A (ja) 2015-04-09
US20220175343A1 (en) 2022-06-09
JP6097258B2 (ja) 2017-03-15

Similar Documents

Publication Publication Date Title
US12186126B2 (en) Control method of an ultrasonic diagnostic apparatus and non-transitory computer readable recording medium
US10588598B2 (en) Ultrasonic inspection apparatus
US10687786B2 (en) Ultrasound inspection apparatus, ultrasound inspection method and recording medium
US11116475B2 (en) Ultrasound diagnostic apparatus and method for controlling ultrasound diagnostic apparatus
US11278262B2 (en) Ultrasonic diagnostic device and ultrasonic image generation method
US11666310B2 (en) Ultrasound diagnostic apparatus and method for controlling ultrasound diagnostic apparatus using predetermined imaging conditions for B-mode image generation
US11096665B2 (en) Ultrasound diagnostic device, ultrasound diagnostic method, and ultrasound diagnostic program
US20140031687A1 (en) Ultrasonic diagnostic apparatus
US20160157830A1 (en) Ultrasonic diagnostic device and ultrasonic image generation method
CN115103634A (zh) Ultrasonic diagnostic device, control method of ultrasonic diagnostic device, and processor for ultrasonic diagnostic device
US10980515B2 (en) Acoustic wave processing apparatus, signal processing method, and program for acoustic wave processing apparatus
US10792014B2 (en) Ultrasound inspection apparatus, signal processing method for ultrasound inspection apparatus, and recording medium
US20130060142A1 (en) Ultrasound diagnostic apparatus and method of producing ultrasound image
US9907532B2 (en) Ultrasound inspection apparatus, signal processing method for ultrasound inspection apparatus, and recording medium
US20160139252A1 (en) Ultrasound diagnostic device, method for generating acoustic ray signal of ultrasound diagnostic device, and program for generating acoustic ray signal of ultrasound diagnostic device
US11812920B2 (en) Ultrasound diagnostic apparatus and method for controlling ultrasound diagnostic apparatus
JP5836241B2 (ja) Ultrasonic inspection device, signal processing method for ultrasonic inspection device, and program
JP2008048951A (ja) Ultrasonic diagnostic device
JP6275960B2 (ja) Image display device

Legal Events

Date Code Title Description
AS Assignment

Owner name: FUJIFILM CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KATSUYAMA, KIMITO;REEL/FRAME:037854/0137

Effective date: 20160106

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STCV Information on status: appeal procedure

Free format text: NOTICE OF APPEAL FILED

STCV Information on status: appeal procedure

Free format text: APPEAL BRIEF (OR SUPPLEMENTAL BRIEF) ENTERED AND FORWARDED TO EXAMINER

STCV Information on status: appeal procedure

Free format text: EXAMINER'S ANSWER TO APPEAL BRIEF MAILED

STCV Information on status: appeal procedure

Free format text: ON APPEAL -- AWAITING DECISION BY THE BOARD OF APPEALS

STCV Information on status: appeal procedure

Free format text: BOARD OF APPEALS DECISION RENDERED

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STCV Information on status: appeal procedure

Free format text: NOTICE OF APPEAL FILED

STCV Information on status: appeal procedure

Free format text: NOTICE OF APPEAL FILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION