
WO2015088277A1 - Method and apparatus for displaying an ultrasound image - Google Patents

Method and apparatus for displaying an ultrasound image

Info

Publication number
WO2015088277A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
plaque
ultrasound
tubular tissue
region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/KR2014/012272
Other languages
English (en)
Korean (ko)
Inventor
이봉헌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Medison Co Ltd
Original Assignee
Samsung Medison Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Medison Co Ltd filed Critical Samsung Medison Co Ltd
Priority to US15/103,555 (published as US10631823B2)
Priority to CN201480075482.5A (published as CN106028950B)
Priority to EP14869861.6A (published as EP3081169B1)
Publication of WO2015088277A1

Classifications

    • A61B 8/085: Clinical applications involving detecting or locating foreign bodies or organic structures, e.g. tumours, calculi, blood vessels, nodules
    • A61B 8/0891: Clinical applications for diagnosis of blood vessels
    • A61B 8/14: Echo-tomography
    • A61B 8/4427: Device being portable or laptop-like
    • A61B 8/462: Displaying means characterised by constructional features of the display
    • A61B 8/463: Displaying multiple images or images and diagnostic data on one display
    • A61B 8/466: Displaying means adapted to display 3D data
    • A61B 8/467: Interfacing with the operator or the patient, characterised by special input means
    • A61B 8/485: Diagnostic techniques involving measuring strain or elastic properties
    • A61B 8/5207: Processing of raw data to produce diagnostic data, e.g. for generating an image
    • A61B 8/5215: Processing of medical diagnostic data
    • A61B 8/5223: Extracting a diagnostic or physiological parameter from medical diagnostic data
    • A61B 8/523: Generating planar views from image data in a user-selectable plane not corresponding to the acquisition plane
    • A61B 8/543: Control of the diagnostic device involving acquisition triggered by a physiological signal
    • G16H 50/20: ICT specially adapted for computer-aided diagnosis, e.g. based on medical expert systems

Definitions

  • the present invention relates to a method and apparatus for displaying an ultrasound image, and more particularly, to an ultrasound image display method and apparatus for providing a 3D ultrasound image for a tubular tissue.
  • Ultrasound systems have non-invasive and non-destructive properties and are widely used in the medical field for obtaining information inside an object. Ultrasound systems are very important in the medical field because they can provide a doctor with a high-resolution image of the internal tissue of a subject without the need for a surgical operation to directly incise and observe the subject.
  • an ultrasound system transmits an ultrasound signal to an object and receives an ultrasound signal (hereinafter referred to as an echo signal) reflected from the object while the probe is in contact with the surface of the object.
  • the ultrasound system forms an ultrasound image of the object based on an echo signal received through the probe, and displays the formed ultrasound image on the display.
  • Ultrasound images are often displayed in B-mode, using reflection coefficients that depend on the difference in acoustic impedance between tissues.
  • Among the many blood vessels in the human body, the carotid arteries connect the aorta, coming from the heart, to the cerebral blood vessels; there is one on each of the left and right sides of the neck. About 80% of the blood going to the brain passes through the carotid arteries. Carotid artery examination using an ultrasound system is a useful test method to accurately assess the degree of narrowing of the carotid artery.
  • An ultrasound image display method and apparatus for providing an ultrasound image for easily diagnosing an object having tubular tissue are provided.
  • an ultrasound image display method and apparatus for providing a 3D image of a blood vessel are provided.
  • the ultrasound image display apparatus and method unfold the tubular tissue to generate and display an ultrasound image in which the tissue is shown three-dimensionally on a reference plane, so that the user can easily diagnose the inside and the outside of the tubular tissue.
  • FIG. 1 is a block diagram of an ultrasound image display device according to an exemplary embodiment.
  • FIG. 2A is a flowchart of a method of displaying an ultrasound image, according to an exemplary embodiment.
  • FIG. 2B is a flowchart of a method of displaying an ultrasound image, according to another exemplary embodiment.
  • FIG. 3A is a conceptual diagram illustrating a method of generating a first image according to an embodiment of the present invention.
  • FIG. 3B illustrates an example of a screen displaying a plaque image in which contour lines are displayed according to an embodiment of the present invention.
  • FIG. 4A is a conceptual diagram illustrating a method of generating a first image according to another embodiment of the present invention.
  • FIG. 4B illustrates an example of a screen displaying a plaque image in which contour lines are displayed according to another exemplary embodiment of the present invention.
  • FIG. 5 illustrates an example of a screen displaying an image in which the first image is rotated based on a user input according to an embodiment of the present invention.
  • FIG. 6A illustrates an example of a screen displaying a plaque image to which at least one color is mapped according to an embodiment of the present invention.
  • FIG. 6B illustrates an example of a screen displaying a plaque image mapped with at least one color according to another embodiment of the present invention.
  • FIG. 7 illustrates a plane corresponding to a cutting plane for a blood vessel according to an embodiment of the present invention.
  • FIG. 8 illustrates examples of a screen including a first image output according to an exemplary embodiment of the present invention.
  • FIG. 9 illustrates examples of a screen including a first image output according to another exemplary embodiment of the present invention.
  • FIG. 10 is a block diagram of an ultrasound system to which an ultrasound image display device according to an exemplary embodiment of the present invention may be applied.
  • An ultrasound image display apparatus includes an image processing unit that generates, based on ultrasound data corresponding to an object including tubular tissue, a first image three-dimensionally representing the surface forming the tubular tissue on a reference plane; and a display unit that displays the first image.
  • the first image may be a three-dimensional image that shows the surface forming the tubular tissue unfolded.
  • the image processor may detect at least one of a predetermined portion and a predetermined tissue present on a surface of the tubular tissue based on the ultrasound data, and may generate the first image by displaying the detected predetermined portion or predetermined tissue on the surface forming the tubular tissue.
  • the image processor may obtain, based on the ultrasound data, first region data including at least one of at least two two-dimensional ultrasound images from which a three-dimensional shape of the tubular tissue can be acquired and three-dimensional data representing the tubular tissue in three dimensions, and may generate the first image based on the first region data.
  • the image processor may detect a first region corresponding to the tubular tissue from first volume data acquired based on the ultrasound data, map the volume data corresponding to the first region onto the reference plane to generate second volume data, and generate the first image based on the second volume data.
  • the image processor may be further configured to acquire a plurality of two-dimensional ultrasound images corresponding to a plurality of consecutive slices based on the ultrasound data, and to obtain a three-dimensional shape of the tubular tissue based on the plurality of two-dimensional ultrasound images.
  • the first image may be generated based on the three-dimensional shape of the tubular tissue.
  • the tubular tissue may include blood vessels
  • the first region may include a blood vessel region.
  • the image processor may generate the first image representing information on the plaque included in the blood vessel.
  • the image processing unit may detect a first region corresponding to the tubular tissue from first volume data acquired based on the ultrasound data, set a cutting line in the length direction of the tubular tissue with respect to the first region, and cut the first region along the cutting line to generate the first image representing an unfolded surface of the tubular tissue.
  • the display unit may further display a second image, which is generated based on the first volume data acquired based on the ultrasound data, and is a 3D ultrasound image representing the tubular tissue.
  • the ultrasound image display apparatus may further include a user input unit for receiving a first user input for setting a cut line parallel to the longitudinal direction of the tubular tissue on the second image.
  • the image processing unit may set a cutting line for a first region corresponding to the tubular tissue included in the first volume data based on the first user input, and may generate the first image by cutting, along the cutting line, the surface forming the tubular tissue.
  • the image processor may detect a plaque region corresponding to the plaque based on the ultrasound data, generate a plaque image representing the plaque region, and generate the first image including the plaque image. .
  • the image processor may generate the plaque image in which contour lines are displayed based on the height of the plaque region.
  • the image processor may generate the plaque image to which at least one color determined based on the height of the plaque region is mapped.
  • the image processor may generate the plaque image by mapping at least one color to the plaque region based on at least one of the ratio between the height of the plaque region and the diameter of the blood vessel, the elastic value of the plaque region, and the brightness value of the image in the plaque region.
  • the image processor may generate the plaque image so that the plaques protruding into the tubular tissue and the plaques protruding out of the tubular tissue may be distinguished from each other and displayed based on the reference plane.
  • the image processing unit may detect a plurality of plaque areas corresponding to the plaques based on the ultrasound data, generate a plaque image representing the plurality of plaque areas, and generate the first image including the plaque images.
  • the plaque image may include a plurality of identifiers corresponding to the plurality of plaque regions.
  • the ultrasound image display apparatus may further include a user input unit for receiving a second user input for rotating the first image.
  • the image processor may control the rotated first image to be displayed by rotating the first image based on the second user input.
  • the ultrasound image display apparatus may further include a probe that transmits an ultrasound signal to the object and receives an echo signal reflected from the object.
  • the image processor may receive the ultrasound data including the echo signal.
  • an ultrasound image display method includes generating a first image three-dimensionally representing a surface forming tubular tissue on a reference plane, based on ultrasound data corresponding to an object including the tubular tissue; and displaying the first image.
  • an “ultrasound image” refers to an image of an object obtained using ultrasound.
  • the object may be a living or non-living object that the image is intended to represent.
  • the subject may mean a part of the body, and may include organs such as the liver, the heart, the uterus, the brain, the breast, and the abdomen, as well as a fetus, and may include any cross-section of the body.
  • a user may be a doctor, a nurse, a clinical pathologist, a sonographer, a medical imaging expert, or the like as a medical expert, but is not limited thereto.
  • FIG. 1 is a block diagram of an ultrasound image display device according to an exemplary embodiment.
  • the ultrasound image display apparatus 100 includes an image processor 120 and a display unit 130.
  • the ultrasound image display apparatus 100 may further include a probe 110.
  • the ultrasound image display apparatus 100 may be any image display apparatus capable of processing and displaying an ultrasound image.
  • the ultrasound image display apparatus 100 may be implemented as a portable type as well as a cart type.
  • examples of the portable ultrasound diagnostic apparatus include, but are not limited to, a PACS viewer, a smartphone, a laptop computer, a PDA, a tablet PC, and the like.
  • the ultrasound image display apparatus 100 may include a probe 110.
  • a first image for diagnosis may be generated based on ultrasound data received from the probe 110, for example, an ultrasound echo signal.
  • the probe 110 may be a wired probe or a wireless probe.
  • the ultrasound image display apparatus 100 may not include the probe 110 itself and may instead externally receive ultrasound data including an ultrasound echo signal obtained by ultrasonically scanning an object with a probe.
  • the image processor 120 may receive ultrasound data from an external server (not shown), an ultrasound diagnosis apparatus (not shown), or a medical imaging system (not shown).
  • the image processor 120 may include a communication module (not shown) for transmitting and receiving data to and from an external server (not shown), an ultrasound diagnosis apparatus (not shown), or a medical imaging system (not shown) over a wired or wireless communication network.
  • the image processor 120 may receive ultrasonic data through a communication module (not shown).
  • the probe 110 transmits an ultrasound signal to an object and receives an echo signal reflected from the object.
  • the probe 110 transmits an ultrasonic signal to the object according to a driving signal applied to the probe 110 and receives an echo signal reflected from the object.
  • the probe 110 includes a plurality of transducers, and the plurality of transducers vibrate according to an electrical signal transmitted and generate ultrasonic waves which are acoustic energy.
  • the probe 110 may be connected to the main body of the ultrasound image display apparatus 100 by wire or wirelessly, and the ultrasound image display apparatus 100 may include a plurality of probes 110 according to an implementation form.
  • The probe 110 according to an embodiment of the present invention may include at least one of a 1D (one-dimensional), 1.5D, 2D (matrix), and 3D probe.
  • the image processor 120 generates a first image three-dimensionally representing a surface forming the tubular tissue on the reference plane, based on the ultrasound data corresponding to the object including the tubular tissue.
  • the subject is the subject of diagnosis and includes certain body parts of the patient.
  • the subject may comprise tubular tissue.
  • a first image from which at least one of the inside and the outside of the tubular tissue can be easily observed is provided.
  • the tubular tissue may be at least one of all body tissues, organs, and sites having a tube form.
  • the tubular tissue may be digestive tissue such as small intestine, large intestine, stomach, esophagus, duodenum, and the like.
  • the tubular tissue may be a blood vessel.
  • the tubular tissue can also be a urethral canal, a prostate, or the like.
  • the first image may be a 3D image in which the tubular tissue is unfolded.
  • the reference plane refers to a plane for showing the tubular tissue unfolded.
  • the reference plane may be the outer boundary surface of the tubular tissue or the inner boundary surface of the tubular tissue.
  • the reference plane may be a two-dimensional plane for unfolding and showing tubular tissue.
  • the first image refers to an image in which at least one of the inner surface and the outer surface forming the tubular tissue is displayed on the reference plane so that the surface of the tubular tissue is shown three-dimensionally.
  • ultrasound data refers to data obtained by ultrasound scanning an object.
  • the ultrasound data may include an ultrasound echo signal received through the probe 110.
  • the ultrasound data may be two-dimensional ultrasound data or volume data formed based on the ultrasound echo signal.
  • the image processor 120 may obtain, based on the ultrasound data, first region data including at least one of at least two two-dimensional ultrasound images from which a three-dimensional shape of the tubular tissue can be acquired and three-dimensional data representing the tubular tissue in three dimensions.
  • the first image may be generated based on the first region data.
  • the image processor 120 may acquire a plurality of two-dimensional ultrasound images corresponding to a plurality of consecutive slices based on the ultrasound data, and may obtain a three-dimensional shape of the tubular tissue based on the plurality of two-dimensional ultrasound images.
  • the image processor 120 may acquire a three-dimensional shape of the tubular tissue by using the plurality of two-dimensional ultrasound images obtained based on the ultrasound data.
  • the image processor 120 may detect a first region corresponding to the tubular tissue from first volume data acquired based on the ultrasound data, and may map the volume data corresponding to the first region onto the reference plane to generate second volume data.
  • the first image may be generated based on the second volume data.
  • the image processor 120 may obtain first volume data based on ultrasound data corresponding to the object.
  • the ultrasound data may include an ultrasound echo signal received by the probe 110.
  • the ultrasound data may be data obtained by processing an ultrasound echo signal.
  • the first volume data refers to data expressed so that the object has a predetermined volume based on the ultrasound data corresponding to the object.
  • the image processor 120 may generate ultrasound data by processing the echo signal received from the probe 110.
  • the image processor 120 may generate an ultrasound image through a scan conversion process on the generated ultrasound data.
  • the ultrasound image may represent the movement of the object as a Doppler image, as well as the gray scale ultrasound image in which the object is scanned according to the A mode, the B mode, and the M mode.
  • the Doppler image may include a blood flow Doppler image (also called a color Doppler image) representing blood flow, a tissue Doppler image representing tissue movement, and a spectral Doppler image displaying the moving speed of an object as a waveform.
  • the image processor 120 may generate volume data by processing the ultrasound data and generate a 3D ultrasound image through a volume rendering process of the volume data.
  • the image processor 120 may further generate an elastic image that images the degree of deformation of the object according to the pressure, and may display various additional information as text or graphics on the ultrasound image.
  • the image processor 120 obtains first volume data based on ultrasound data corresponding to the object.
  • The first region corresponding to the tubular tissue included in the object is detected from the first volume data, and the second volume data is generated by mapping the volume data included in the first region onto the reference plane. Then, using the second volume data, a first image representing the surface forming the tubular tissue is generated.
  • the first image may be an image representing the inner surface of the tubular tissue.
  • the first image may be an image showing the inner surface of the tubular tissue unfolded.
  • the first image may be a virtual endoscope image having the same view as the image acquired through the endoscope of the tubular tissue.
  • the 'virtual endoscope image' may be displayed by applying a method such as a fish-eye or perspective view. Virtual endoscope images allow more intuitive recognition of the inner surface of tissue with a tubular shape.
  • the first image may be an image representing at least one of an inner surface and an outer surface forming the tubular tissue.
  • the first image may be an image representing the inner surface and the outer surface of the tubular tissue unfolded.
  • the first image may be an image representing a thickness difference between the inner surface and the outer surface forming the tubular tissue.
  • the first image may be an image in which the inner surface and the outer surface forming the tubular tissue are distinguished and displayed.
  • the first image may be an image showing the inner wall, which is the inner surface of the colon, and / or the outer wall, which is the outer surface. The user can easily diagnose whether the inner wall or outer wall of the colon is swollen or perforated through the first image.
  • As another example, in order to prevent or treat a disease caused by vascular stenosis, it is necessary to diagnose whether vascular stenosis has occurred.
  • coronary artery stenosis is a typical case in which a diagnosis of vascular stenosis is needed. Coronary artery stenosis can develop into serious diseases such as myocardial infarction, arrhythmia, and angina pectoris. Therefore, an accurate diagnosis must be made so that the stenosis can be relieved or removed. To do this, the constricted blood vessel must be found through medical images, and the degree of narrowing of the constricted blood vessel must be accurately observed and diagnosed.
  • the tubular tissue may include blood vessels to be diagnosed
  • the first region may include a blood vessel region
  • the first image may be an image representing a surface forming a blood vessel.
  • the image processor 120 may detect a first region corresponding to the tubular tissue from the first volume data acquired based on the ultrasound data, and set a cutting line in the longitudinal direction of the tubular tissue with respect to the first region.
  • the first region may be cut by the cutting line to generate a first image representing an unfolded surface of the tubular tissue.
  • the first image will be described in detail with reference to FIGS. 3A and 4A below.
  • Hereinafter, a case in which the tubular tissue includes a blood vessel and the first region is a blood vessel region will be described and illustrated as an example.
  • the image processor 120 detects a first area corresponding to the tubular tissue included in the object from the first volume data.
  • the image processor 120 detects a blood vessel region corresponding to a blood vessel included in the object from the first volume data.
  • the blood vessel used to diagnose the degree of narrowing of the blood vessel may include, for example, a jugular vein, a lower extremity vein, a coronary artery, or the like.
  • the image processor 120 may generate a first image representing information on the plaques included in the blood vessels, which are tubular tissues.
  • When the reference plane is set as the inner wall of the blood vessel, the image processor 120 generates second volume data by mapping the volume data included in the blood vessel region onto a plane corresponding to the inner wall of the blood vessel. Alternatively, the image processor 120 may set the reference plane as a surface obtained by unfolding a surface set between the inner wall and the outer wall of the blood vessel. The image processor 120 then generates a first image representing the surface forming the blood vessel based on the second volume data.
  • the image processor 120 may set a cut line parallel to the longitudinal direction of the blood vessel, to the blood vessel region included in the first volume data.
  • the image processor 120 may generate second volume data such that both ends of a plane corresponding to the inner wall of the blood vessel included in the second volume data correspond to a cutting line set for the blood vessel region.
  • the second volume data may be volume data showing the inner wall of the blood vessel.
  • the image processor 120 may generate a first image, which is a 3D image representing an inner wall of the blood vessel, based on the second volume data showing the inner wall of the blood vessel.
  • the image processor 120 may detect at least one of a predetermined portion and a predetermined tissue present on the surface of the tubular tissue based on the second volume data, and may generate the first image by displaying the detected predetermined portion or predetermined tissue on the surface forming the tubular tissue.
  • the image processor 120 may detect a specific tissue present on an inner wall forming a blood vessel, based on the second volume data.
  • a specific tissue such as plaque causing disease may be detected, and a first image representing information on at least one of the presence, location, size, and shape of the specific tissue may be generated.
  • the specific tissue may be a specific tissue, a body part, an abnormal area, or a suspected disease site existing inside the tubular tissue.
  • certain tissues may be perforation of the colon, plaques of blood vessels, malignant tumors of the stomach, abnormal tissues of the stomach, and the like.
  • the image processor 120 may detect a body part or tissue that needs to be observed for diagnosing a disease in the tubular tissue, and generate a first image representing the detected body part or tissue. In the above-described example, when perforation exists in the colon, the image processor 120 may generate a first image indicating the perforation. In addition, when tumor tissue is present on the inner wall of the stomach, the image processor 120 may display the tumor tissue on a first image representing the stomach wall.
  • the image processor 120 may generate a first image representing information on plaques included in the blood vessel.
  • Plaque is a deposit that accumulates on the inner wall of a blood vessel and causes vascular narrowing, that is, an atherosclerotic plaque. Plaques may include fibrous plaques, lipid plaques, and the like.
  • the image processor 120 may set cut lines parallel to the length direction of the blood vessel with respect to the blood vessel region included in the first volume data.
  • the image processor 120 may generate second volume data such that both ends of a plane corresponding to the inner wall of the blood vessel included in the second volume data correspond to a cutting line set for the blood vessel region.
  • the image processor 120 may detect a plaque region corresponding to the plaque from the second volume data.
  • the image processor 120 may generate a plaque image representing the plaque region and generate a first image including the plaque image.
  • the plaque image generated by the image processor 120 may be an image in which contour lines are displayed based on the height of the detected plaque region, or an image to which at least one color is mapped based on the height of the detected plaque region.
  • the image processor 120 may detect a plurality of plaque regions corresponding to the plaques from the second volume data.
  • the image processor 120 may generate a plaque image representing a plurality of plaque regions and generate a first image including the plaque image.
  • the image processor 120 may generate a plaque image including a plurality of identifiers corresponding to the plurality of plaque regions.
  • the display unit 130 displays and outputs the ultrasound image generated by the image processor 120.
  • the display unit 130 may display and output not only an ultrasound image but also various information processed by the ultrasound image display apparatus 100 on a screen through a graphical user interface (GUI).
  • the ultrasound image display apparatus 100 may include two or more display units 130 according to an implementation form.
  • the display unit 130 according to an embodiment of the present invention displays the first image generated by the image processor 120.
  • Compared with the ultrasound image display apparatus 100 illustrated in FIG. 1A, the ultrasound image display apparatus 100 may further include a user input unit 140. Therefore, for the ultrasound image display apparatus 100 of FIG. 1B, descriptions overlapping those of the ultrasound image display apparatus 100 of FIG. 1A are omitted.
  • the user input unit 140 receives a user input.
  • the user input unit 140 refers to a means for a user to input data for controlling the ultrasound image display apparatus 100.
  • the user input unit 140 includes a key pad, a dome switch, a touch pad (contact capacitive type, pressure resistive type, infrared sensing type, surface acoustic wave type, integral strain gauge type, piezoelectric type, etc.), a jog wheel, a jog switch, and the like, but is not limited thereto.
  • the touch pad forming a layer structure with the display panel of the display unit 130 may be referred to as a touch screen.
  • the display unit 130 may further display a second image generated based on the first volume data, and the user input unit 140 may receive a user input for the second image displayed through the display unit 130.
  • the image processor 120 may set a cutting line for the first area corresponding to the tubular tissue included in the first volume data, based on the user input.
  • the display unit 130 may display an image to which the first image is moved based on a user input received through the user input unit 140.
  • the display unit 130 may provide a GUI for setting a cutting line and a GUI for receiving a direction and an angle for rotating the first image.
  • the user input unit 140 receives a user input for rotating the first image from the user.
  • the image processor 120 may control the first image to be rotated and displayed at a predetermined angle in a predetermined direction based on a user input.
  • the ultrasound image display apparatus 100 displays a plaque located on the inner wall of a cylindrical vessel on a plane, so that a user can quickly and accurately recognize the size, shape, and position distribution of the plaque and, furthermore, vascular stenosis can be diagnosed quickly and accurately.
  • a method of displaying an ultrasound image of a blood vessel by the ultrasound image display apparatus 100 according to an exemplary embodiment of the present invention will be described in detail with reference to FIG. 2A.
  • FIG. 2A is a flowchart of a method of displaying an ultrasound image, according to an exemplary embodiment.
  • the ultrasound image display method 200 according to an embodiment of the present invention illustrated in FIG. 2A includes the same operational features as the ultrasound image display apparatus according to the embodiments of the present invention described with reference to FIGS. 1A and 1B. Therefore, in describing the ultrasound image display method 200 illustrated in FIG. 2A, descriptions overlapping those of FIGS. 1A and 1B are omitted.
  • the ultrasound image display method 200 generates a first image three-dimensionally representing a surface forming tubular tissue on a reference plane based on ultrasound data corresponding to an object including the tubular tissue ( S201).
  • the operation of step S201 may be performed by the image processor 120.
  • The first image generated in step S201 is displayed (S205).
  • the operation of step S205 may be performed by the display unit 130.
  • FIG. 2B is a flowchart of a method of displaying an ultrasound image, according to another exemplary embodiment.
  • the ultrasound image display method according to an embodiment of the present invention illustrated in FIG. 2B includes the same operational features as the ultrasound image display apparatus according to the embodiments of the present invention described with reference to FIGS. 1A and 1B.
  • steps S220, S230, S240, and S250 shown in FIG. 2B correspond to step S201 shown in FIG. 2A, and step S260 shown in FIG. 2B corresponds to step S205 shown in FIG. 2A. Therefore, in describing the ultrasound image display method illustrated in FIG. 2B, descriptions overlapping those of FIGS. 1A, 1B, and 2A are omitted.
  • the ultrasound image display apparatus 100 transmits an ultrasound signal to an object and receives an echo signal reflected from the object.
  • the ultrasound image display apparatus 100 acquires first volume data by using the received echo signal.
  • the ultrasound image display apparatus 100 may obtain ultrasound image data of a plurality of cross sections included in the object.
  • the ultrasound image display apparatus 100 may generate first volume data of the object by reconstructing the ultrasound image data of the plurality of cross sections.
  • the ultrasound image display apparatus 100 may obtain first volume data in consideration of electrocardiography (ECG).
  • the ultrasound image display apparatus 100 may consider the ECG in order to acquire first volume data capable of accurately diagnosing whether a blood vessel is narrowed.
  • the ultrasound image display apparatus 100 may consider the ECG to determine the timing for acquiring the first volume data. For example, the ultrasound image display apparatus 100 may acquire the first volume data at end-systole or end-diastole; one possible way of deriving such trigger times is sketched below.
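  • The following is a minimal illustrative sketch, not the patent's implementation, of how acquisition trigger times near end-diastole could be derived from an ECG trace. The function names, the fixed offset before the R peak, and the synthetic signal are assumptions introduced only for illustration; scipy's find_peaks is assumed to be available.
```python
# Illustrative sketch only: pick volume-acquisition times shortly before each detected
# R peak of an ECG trace, as a crude proxy for end-diastole. Names are hypothetical.
import numpy as np
from scipy.signal import find_peaks

def end_diastole_times(ecg, fs, offset_s=0.05):
    """Return times (s) slightly before each detected R peak, usable as acquisition triggers."""
    r_peaks, _ = find_peaks(ecg, height=np.percentile(ecg, 95), distance=int(0.4 * fs))
    return np.clip(r_peaks / fs - offset_s, 0, None)

fs = 500                                   # Hz, hypothetical ECG sampling rate
t = np.arange(0, 5, 1 / fs)
ecg = np.sin(2 * np.pi * 1.2 * t) ** 21    # crude synthetic signal with sharp R-like peaks
print(end_diastole_times(ecg, fs))         # candidate trigger times in seconds
```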
  • the ultrasound image display apparatus 100 detects a blood vessel region corresponding to a blood vessel included in the object from the first volume data.
  • the ultrasound image display apparatus 100 may perform noise filtering on the first volume data before detecting the blood vessel region in order to increase the accuracy of the blood vessel region detection.
  • a method of acquiring the second volume data and generating the first image will be described in detail with reference to FIG. 3A below.
  • FIG. 3A is a conceptual diagram illustrating a method of generating a first image according to an embodiment of the present invention.
  • In FIG. 3A, the tubular tissue is a blood vessel.
  • A case in which the image processor 120 detects a plaque present in the tubular tissue and displays it to generate the first image is described as an example.
  • FIG. 3A illustrates first volume data 310 generated based on ultrasound data obtained from an object.
  • the first volume data 310 is volume data representing an object and includes a tubular tissue in a cylindrical shape.
  • FIG. 3A illustrates an example of first volume data 310 obtained from a subject including blood vessels.
  • the ultrasound image display apparatus 100 detects a first region corresponding to the tubular tissue from the first volume data 310.
  • the ultrasound image display apparatus 100 may detect the blood vessel region 315 corresponding to the blood vessel by analyzing the first volume data 310.
  • the blood vessel region 315 may be bounded by the outer wall region 311 and the inner wall region 312, which are the outer and inner surfaces of the blood vessel, respectively.
  • the ultrasound image display apparatus 100 may estimate the blood vessel wall by computing the brightness or gradient of the pixels constituting the first volume data 310, or by analyzing the ultrasound signal received from the probe, and may detect the blood vessel region based on the estimated blood vessel wall.
  • the ultrasound image display apparatus 100 may detect the blood vessel region by using a conventionally known intima-media thickness (IMT) measurement method.
  • blood vessels generally appear dark in the first volume data 310, and the blood vessel inner wall region 312 and the blood vessel outer wall region 311 corresponding to the blood vessel boundary are generally brightly expressed.
  • the blood vessel region 315 can therefore be detected by detecting the inflection points of a profile of pixel brightness, for example as in the sketch below.
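  • The following is a minimal sketch, not taken from the patent, of how candidate wall positions could be found along one brightness profile (a dark lumen flanked by bright wall echoes). The function name, smoothing width, and brightness threshold are assumptions for illustration only.
```python
# Illustrative sketch only: locate approximate vessel-wall positions along one scan line
# by finding sign changes of the brightness gradient (inflection-like points) that are
# brighter than the dark lumen. Names and thresholds are hypothetical.
import numpy as np

def detect_wall_candidates(brightness_profile, smooth=5):
    """Return indices where the smoothed brightness gradient changes sign."""
    p = np.convolve(brightness_profile, np.ones(smooth) / smooth, mode="same")  # denoise
    grad = np.gradient(p)                                    # first derivative of brightness
    sign_change = np.where(np.diff(np.sign(grad)) != 0)[0]   # candidate wall positions
    lumen_level = p.min()                                    # darkest value ~ vessel lumen
    return [i for i in sign_change if p[i] > lumen_level + 0.2 * (p.max() - lumen_level)]

# usage: one depth profile (A-line) taken from the first volume data
profile = np.array([80, 78, 120, 180, 60, 20, 15, 18, 55, 170, 175, 90], dtype=float)
print(detect_wall_candidates(profile))
```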
  • the ultrasound image display apparatus 100 may generate second volume data by mapping volume data included in a first area on a reference plane.
  • the ultrasound image display apparatus 100 generates second volume data by mapping volume data included in a blood vessel region on a plane corresponding to an inner wall of the blood vessel.
  • the ultrasound image display apparatus 100 may include at least one of an inner wall and an outer wall of the tubular tissue, or a predetermined tissue or a predetermined portion of the inner wall and the outer wall of the tubular tissue, for example, plaque, perforation, swollen portion, or the like.
  • the tubular tissue may be cut out to generate the flattened second volume data 320.
  • When obtaining information on a plaque located inside the blood vessel using the first volume data 315, the plaque may be located on the lower side or the upper side of the cylindrical inner wall of the blood vessel. Therefore, with only a 3D image generated by rendering the first volume data 315, it may be difficult to quickly recognize the shape, size, height, and position distribution of a plaque located on the upper surface or the side surface of the blood vessel inner wall.
  • Therefore, the ultrasound image display apparatus 100 may generate a first image 320 that three-dimensionally represents the surface forming the tubular tissue on a reference plane.
  • In (c) of FIG. 3A, a case in which the surface forming the tubular tissue and the region forming the tubular tissue are expressed as a volume in the first image 320, in which the tubular tissue is unfolded, is illustrated as an example.
  • the image processor 120 may cut the cylindrical blood vessel region in the longitudinal direction using the first volume data 315 to generate the flattened second volume data 320.
  • the 3D image corresponding to the second volume data 320 may be generated as the first image.
  • the ultrasound image display apparatus 100 may generate the second volume data 320 by setting a cutting line for the detected blood vessel region and rearranging the first volume data on a plane based on the set cutting line.
  • the ultrasound image display apparatus 100 may set the cutting line parallel to the longitudinal direction of the blood vessel region. Alternatively, a cutting line parallel to the direction of blood flow may be set for the blood vessel region. As another example, the ultrasound image display apparatus 100 may display a second image generated based on the first volume data or the detected blood vessel region on a screen, and may set the cutting line based on a user input for the second image displayed on the screen.
  • the ultrasound image display apparatus 100 may set a cutting line by displaying a 3D image of a blood vessel on a screen and receiving a command for setting a cutting line on the 3D image of the blood vessel displayed on the screen.
  • With reference to the line segment AB, the ultrasound image display apparatus 100 may generate the second volume data 320 by mapping the volume data included in the blood vessel region onto the plane 322 corresponding to the blood vessel outer wall 311.
  • the ultrasound image display apparatus 100 may set the line segment AB as a cutting line, and then set the line segment CD corresponding to the line segment AB as an auxiliary line.
  • the ultrasound image display apparatus 100 may determine, on a plane perpendicular to the blood vessel that contains the point A, the point C farthest from the point A among the points corresponding to the blood vessel inner wall 312.
  • likewise, the ultrasound image display apparatus 100 may determine, on a plane perpendicular to the blood vessel that contains the point B, the point D farthest from the point B among the points corresponding to the blood vessel inner wall 312.
  • the line segment CD connecting the determined point C and the point D can be set as an auxiliary line.
  • Here, a case in which the cutting line and the auxiliary line are straight lines is described as an example, but the present invention is not limited thereto, and the cutting line and the auxiliary line may be curved.
  • the image processor 120 cuts the blood vessel region 315, which is the first region, along a cutting line parallel to the longitudinal direction of the tubular tissue (for example, the line segment AB), unfolds the cut blood vessel region 315, and displays, on the reference plane 322 (specifically, the plane corresponding to the vessel outer wall 311), the surface forming the tubular tissue, for example, the surface forming the inside of the tubular tissue.
  • the first image may then be generated, for example, as a 3D image formed by rendering the second volume data 320.
  • the reference plane is a plane formed by connecting vertices A ′, A ′′, B ′′, and B ′, and may correspond to the outer wall 311 of the tubular tissue.
  • a curved surface formed by connecting O', O''', P''', and P' may correspond to the inner wall 312, which is the surface forming the inner side of the tubular tissue.
  • plaques 331 and 332, which are predetermined tissues present inside the tubular tissue, are shown.
  • the image processor 120 may generate a first image displayed as shown in (c) of FIG. 3A. The user can easily check the state of the plaques 331 and 332 and of the blood vessel through the first image.
  • the ultrasound image display apparatus 100 may generate the second volume data 320 such that both ends of the plane 322, which corresponds to the inner wall 312 of the blood vessel, correspond to the line segment AB set as the cutting line for the blood vessel region.
  • the curved surface ABDC corresponding to the inner wall of the right side of the blood vessel corresponds to the plane A''B''D'C' of the second volume data 320.
  • the curved surface ABDC corresponding to the inner wall of the left side of the blood vessel corresponds to the plane A'B'D'C' of the second volume data 320.
  • the point O, which is the midpoint between the points A and C, corresponds to the line O'O''O''' of the second volume data 320.
  • the point P, which is the midpoint between the points B and D, corresponds to the line P'P''P''' of the second volume data 320.
  • the upper surface of the second volume data 320 may be flat or curved depending on whether the blood vessel has a constant thickness.
  • (c) of FIG. 3A illustrates an example in which the upper surface of the second volume data 320 is curved because the thickness of the blood vessel varies slightly. A sketch of one way such an unfolding onto the reference plane could be implemented follows.
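  • The following is a minimal sketch, under assumptions not stated in the patent, of "unfolding" a cylindrical vessel region onto a plane by resampling each cross-section on polar coordinates around the vessel centerline; the cut line corresponds to angle 0 and the unwrapped angle axis becomes the width of the second volume data. Function and variable names, and the use of scipy's map_coordinates, are illustrative choices only.
```python
# Illustrative sketch only (not the patent's algorithm): resample a (Z, Y, X) volume so
# that the vessel surface is laid flat, producing "second volume data" of shape
# (Z, n_angle, n_radius). Centerline and radius per slice are assumed to be known.
import numpy as np
from scipy.ndimage import map_coordinates

def unfold_vessel(volume, centers, radii, n_angle=180, n_radius=32):
    """volume: first volume data; centers: per-slice (cy, cx); radii: per-slice outer radius."""
    z_dim = volume.shape[0]
    unfolded = np.zeros((z_dim, n_angle, n_radius), dtype=volume.dtype)
    angles = np.linspace(0.0, 2 * np.pi, n_angle, endpoint=False)   # cut line at angle 0
    for z in range(z_dim):
        cy, cx = centers[z]
        r = np.linspace(0.0, radii[z], n_radius)                    # lumen center -> outer wall
        yy = cy + np.outer(np.sin(angles), r)                       # polar sample grid (y coords)
        xx = cx + np.outer(np.cos(angles), r)                       # polar sample grid (x coords)
        unfolded[z] = map_coordinates(volume[z], [yy, xx], order=1, mode="nearest")
    return unfolded

# usage with toy data: a 40-slice volume, fixed centerline and radius
vol = np.random.rand(40, 64, 64).astype(np.float32)
second_volume = unfold_vessel(vol, centers=[(32.0, 32.0)] * 40, radii=[20.0] * 40)
print(second_volume.shape)  # (40, 180, 32)
```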
  • In operation S250, the ultrasound image display apparatus 100 according to an exemplary embodiment generates a first image 330 representing information on the plaques included in the blood vessel, using the second volume data 320.
  • the ultrasound image display apparatus 100 may render the second volume data to generate a first image representing information on the plaque included in the blood vessel.
  • the first image may include a 2D image, a 3D image, and a stereoscopic image.
  • the ultrasound image display apparatus 100 may detect a plaque region corresponding to the plaque from the second volume data. For example, the ultrasound image display apparatus 100 may analyze the brightness of the pixels included in the second volume data 320 along the direction from the upper surface O'O''O'''P'''P''P' of the second volume data 320 toward the plane A'C'A''B''D'B'. The ultrasound image display apparatus 100 may detect a peak of the plaque region based on the change in brightness of the pixels, and may treat a predetermined region near the detected peak as corresponding to one plaque. To detect plaque regions, for example, a watershed algorithm or the like can be used; a sketch appears after this passage.
  • the ultrasound image display apparatus 100 may detect the boundary of the plaque region by using an edge-detection mask such as a Sobel, Prewitt, Roberts, Laplacian of Gaussian, or Canny mask, or by using intensity or gradient values.
  • the ultrasound image display apparatus 100 may determine the plaque region based on the detected boundary of the plaque region.
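  • The following is a minimal sketch of one possible realization of the peak-plus-watershed detection mentioned above; it is not the patent's implementation. The plaque "height map" over the unfolded inner wall, the threshold, and the use of scikit-image are assumptions introduced for illustration.
```python
# Illustrative sketch only: segment plaque regions on the unfolded inner-wall height map
# by finding height peaks and separating touching plaques with a watershed.
import numpy as np
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

def segment_plaques(height_map, min_height=0.5):
    """height_map: (angle, length) plaque height above the reference plane, in mm."""
    mask = height_map > min_height                                   # candidate plaque pixels
    peaks = peak_local_max(height_map, min_distance=5, threshold_abs=min_height)
    markers = np.zeros_like(height_map, dtype=int)
    markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)           # one marker per peak
    # watershed on the inverted height map grows each marker outward within the mask
    return watershed(-height_map, markers=markers, mask=mask)

# toy data: two Gaussian-shaped plaques on the unfolded wall
yy, xx = np.mgrid[0:60, 0:120]
heights = 1.2 * np.exp(-(((xx - 40) / 8.0) ** 2 + ((yy - 25) / 5.0) ** 2))
heights += 0.9 * np.exp(-(((xx - 90) / 6.0) ** 2 + ((yy - 44) / 4.0) ** 2))
labels = segment_plaques(heights)
print(np.unique(labels))  # 0 = background, 1..N = detected plaque regions
```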
  • By rendering the second volume data 320 of (c) of FIG. 3A, the ultrasound image display apparatus 100 may generate the first image 330 representing information on the plaques 331 and 332 included in the blood vessel.
  • the surface of the blood vessel is illustrated as a three-dimensional image.
  • the ultrasound image display apparatus 100 provides the first image 330, in which a plaque located on the cylindrical inner wall of the blood vessel is rearranged on a plane, so that the user can quickly and accurately recognize the size, height, and position distribution of the plaque located on the blood vessel inner wall.
  • So that the shapes, sizes, heights, and positions of the plaques present in the first image 330 may be easily understood, the plaques 331 and 332 may be displayed in the first image 330 by applying at least one of different colors, shapes, and marks according to their sizes, heights, and positions.
  • Here, a tissue that protrudes inward of the tubular tissue, for example a plaque, has been illustrated as an example.
  • However, the tissue may also be formed to protrude outward from the tubular tissue, or may be formed through the tubular tissue.
  • the first image 400 illustrated in FIG. 3B may correspond to the first ultrasound image 330 illustrated in FIG. 3A (d).
  • the image processor 120 may detect a plaque region corresponding to the plaque based on the ultrasound data, and generate a plaque image representing the plaque region.
  • the first image including the plaque image may be generated. That is, the plaque image may be generated as the first image.
  • the image processor 120 may generate a plaque image in which contour lines are displayed based on the height of the plaque region.
  • the ultrasound image display apparatus 100 may generate a first image 400 including plaque images representing the plaque regions 410 and 420.
  • the ultrasound image display apparatus 100 may generate the first image 400 by rendering volume data corresponding to the detected plaque areas 410 and 420.
  • the ultrasound image display apparatus 100 may generate a plaque image in which the contour line 401 is displayed based on the height of the plaque region. By referring to the contour line 401, the user can easily grasp the shape and size of the plaque areas 410 and 420.
  • the first image 400 may be generated by mapping different colors according to the contour line 401.
  • the first image 400 may be generated so that the degree of occurrence of the plaque can be intuitively determined by displaying the different colors according to the height of the plaque.
  • the image processing unit 120 may map at least one color to the plaque region based on at least one of the ratio between the height of the plaque region and the diameter of the blood vessel, the elastic value of the plaque region, and the brightness value of the image in the plaque region, and may thereby generate the plaque image, as in the sketch below.
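  • The following is a minimal sketch, with hypothetical values, of how contour lines based on plaque height and a color coding based on the height-to-diameter ratio could be drawn over the unfolded wall; matplotlib and a synthetic height map are assumptions, not the patent's display pipeline.
```python
# Illustrative sketch only: contour lines encode plaque height (mm) while the background
# color encodes the ratio of plaque height to vessel diameter on the unfolded wall.
import numpy as np
import matplotlib.pyplot as plt

vessel_diameter_mm = 6.0                                        # hypothetical vessel diameter
yy, xx = np.mgrid[0:60, 0:120]
height = 1.5 * np.exp(-(((xx - 40) / 10.0) ** 2 + ((yy - 25) / 6.0) ** 2))  # toy plaque, mm

ratio = height / vessel_diameter_mm                             # height-to-diameter ratio
fig, ax = plt.subplots(figsize=(6, 3))
im = ax.imshow(ratio, cmap="jet", origin="lower")               # color = degree of protrusion
cs = ax.contour(height, levels=[0.25, 0.5, 0.75, 1.0, 1.25], colors="k", linewidths=0.5)
ax.clabel(cs, fmt="%.2f mm")                                    # label contours with height
fig.colorbar(im, ax=ax, label="plaque height / vessel diameter")
ax.set_xlabel("along vessel (unfolded)")
ax.set_ylabel("around vessel (unfolded)")
plt.show()
```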
  • FIG. 4A is a conceptual diagram illustrating a method of generating a first image according to another embodiment of the present invention.
  • the configuration overlapping with FIG. 3A is illustrated using the same reference numerals. Therefore, in FIG. 4A, the description overlapping with FIG. 3A is omitted.
  • In FIG. 3A, a tissue that protrudes inward of the tubular tissue, for example a plaque, is illustrated as an example.
  • FIG. 4A illustrates an example in which the plaques include plaques 414 and 415 formed to protrude outward of the tubular tissue, in addition to plaques protruding inward of the tubular tissue as in FIG. 3A.
  • the second volume data 320 shows both the plaques 412 and 413, which protrude in the inner direction of the blood vessel, and the plaques 414 and 415, which protrude outward of the blood vessel.
  • the first image 330, which is an image in which the plaques are displayed more clearly, shows both the plaques 412 and 413 formed to protrude in the inner direction of the blood vessel and the plaques 414 and 415 formed to protrude outward of the blood vessel.
  • FIG. 4B illustrates an example of a screen displaying a plaque image in which contour lines are displayed according to another exemplary embodiment of the present invention.
  • the first image 450 illustrated in FIG. 4B may correspond to the first ultrasound image 330 illustrated in (d) of FIG. 4A.
  • the first image 450 illustrated in FIG. 4B may include a plaque image in which the contour line 401 is displayed based on the height of the plaque region.
  • the user can easily grasp the shape and size of the plaque areas 410, 420, 430, and 440.
  • the image processor 120 may generate a plaque image so that the plaques protruding into the tubular tissue and the plaques protruding outward of the tubular tissue may be distinguished from each other and displayed based on the reference plane.
  • the plaque regions 410 and 420 protruding into the blood vessel may be represented by a solid line as shown, and the plaque regions 430 and 440 protruding out of the blood vessel. ) May be indicated by a dotted line as shown.
  • the image processor 120 may generate the first image 450 so that the plaque regions 410 and 420 formed to protrude into the blood vessel and the plaque regions 430 and 440 formed to protrude out of the blood vessel can be easily distinguished, for example by using different colors, different textures, different shapes, or different symbols.
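  • As a hypothetical illustration of that inward/outward distinction (not part of the disclosure), the sketch below splits a signed height map, measured relative to the reference plane, into the two groups that could then be drawn with solid and dotted contours or with different colors. All names and the threshold value are assumptions.
```python
# Illustrative sketch: separate plaque regions protruding into the vessel from
# those protruding outward, using the sign of the surface height relative to
# the reference plane (the unfolded inner wall).
import numpy as np

def split_plaques_by_direction(height_map, threshold=0.5):
    """height_map: signed height relative to the reference plane
    (positive = into the lumen, negative = outward).
    Returns two boolean masks for the two kinds of plaque regions."""
    inward_mask = height_map > threshold      # e.g. regions like 410, 420
    outward_mask = height_map < -threshold    # e.g. regions like 430, 440
    return inward_mask, outward_mask

signed = np.array([[0.0, 1.2, 0.1],
                   [-0.9, 0.0, 2.1],
                   [-1.5, 0.2, 0.0]])
inward, outward = split_plaques_by_direction(signed)
print(inward.sum(), outward.sum())  # 2 2
```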
  • the ultrasound image display apparatus 100 may generate an image viewed from different angles based on a user input. That is, the ultrasound image display apparatus 100 may generate an image in which the first image is rotated based on a user input.
  • the ultrasound image display apparatus 100 may rotate the first image 400 shown in FIG. 3B to generate a rotated image in which the plaques are viewed in a direction parallel to the plane 422 corresponding to the inner wall of the blood vessel, as shown in FIG. 5A.
  • referring to FIG. 5A, the user can easily grasp the heights of the plaque regions 410 and 420.
  • the user input unit 140 may receive a user input for requesting the movement of the first image 400, including at least one of rotation and translation of the first image 400. Then, the image processor 120 may generate and output the first image moved according to the user input.
  • the ultrasound image display apparatus 100 may rotate the first image 400 illustrated in FIG. 3B to generate a rotated image in which the plaques are viewed in a direction perpendicular to the plane 422 corresponding to the inner wall of the blood vessel, as illustrated in FIG. 5B. Referring to FIG. 5B, the user can easily grasp the shapes of the plaque regions 410 and 420.
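  • The rotation requested through the user input is not specified in code by the patent; the sketch below merely illustrates, under assumed names, applying a rotation matrix to the vertices of the rendered first image before re-rendering it from the new viewpoint.
```python
# Illustrative sketch: rotate rendered surface vertices about one axis in
# response to a user-requested angle (hypothetical names and convention).
import numpy as np

def rotation_about_z(angle_deg):
    a = np.radians(angle_deg)
    return np.array([[np.cos(a), -np.sin(a), 0.0],
                     [np.sin(a),  np.cos(a), 0.0],
                     [0.0,        0.0,       1.0]])

def rotate_view(points_xyz, angle_deg):
    """points_xyz: (N, 3) vertices of the rendered surface."""
    return points_xyz @ rotation_about_z(angle_deg).T

verts = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.5]])
print(np.round(rotate_view(verts, 90.0), 3))
# approximately [[0, 1, 0], [-1, 0, 0.5]]
```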
  • FIG. 6A illustrates an example of a screen displaying a plaque image to which at least one color is mapped according to an embodiment of the present invention.
  • the plaque image 600 illustrated in FIG. 6A is an image corresponding to the first image or the plaque image illustrated in FIG. 3A (d) or FIG. 3B.
  • the ultrasound image display apparatus 100 may generate the plaque image 600 to which at least one color is mapped based on the heights of the plaque regions 410 and 420.
  • the screen including the plaque image 600 generated by the ultrasound image display apparatus 100 may include a map 620 including information on the plaque region.
  • the map 620 may include a color map 620 in which the height of the plaque and the plurality of colors are mapped.
  • the color map 620 is a map representing a color that varies depending on the ratio of the height of the plaque region to the diameter of the blood vessel.
  • the plaque region may be displayed in a plurality of colors according to the above-described ratio of height to diameter.
  • the plaque image 600 may display the plaque region using one color corresponding to the height of the peak point of the plaque region.
  • the plaque image 600 may display one plaque area in a plurality of colors by displaying the plaque area using a color corresponding to the height of each point included in the plaque area.
  • the ultrasound image display apparatus 100 may detect the plurality of plaque regions 410 and 420, and may generate the plaque image 600 including a plurality of identifiers 615 corresponding to the plurality of plaque regions 410 and 420.
  • the plurality of identifiers 615 may include numbers assigned to each plaque region.
  • the ultrasound image display apparatus 100 may assign ranks based on at least one of the height, length, area, and volume of the detected plaque regions, and may display numbers corresponding to the assigned ranks on the plaque image 600.
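  • A minimal, hypothetical sketch of that ranking step: plaque regions are sorted by one of the listed measures (volume here) and numbered accordingly. The dictionary structure is an assumption, not the patent's data model.
```python
# Illustrative sketch: assign ranked identifiers (1, 2, ...) to detected plaque
# regions according to a chosen measure such as volume.
def rank_plaque_regions(regions, key="volume"):
    """regions: list of dicts like {"id": ..., "volume": ...}.
    Returns the regions with a 'rank' field added, largest measure first."""
    ordered = sorted(regions, key=lambda r: r[key], reverse=True)
    for rank, region in enumerate(ordered, start=1):
        region["rank"] = rank          # number shown next to the plaque region
    return ordered

regions = [{"id": 410, "volume": 120.0}, {"id": 420, "volume": 340.0}]
for r in rank_plaque_regions(regions):
    print(r["id"], r["rank"])   # 420 1 / 410 2
```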
  • the ultrasound image display apparatus 100 displays the first image generated in operation S250.
  • the first image may be included and displayed on the screen through the display unit 130.
  • the ultrasound image display apparatus 100 may further display direction information so that a user may easily identify which part of the blood vessel the plane 422 corresponding to the inner wall of the blood vessel corresponds to.
  • the direction information of plane 422 may be represented as far, near, left, and right.
  • based on the direction information shown in FIG. 3B, the user can see that the far-near direction is the length direction of the blood vessel.
  • that is, the far-near direction is parallel to the direction of blood flow, and the left-right direction is perpendicular to the direction of blood flow.
  • the ultrasound image display apparatus 100 may display at least one of the height, length, area, and volume of the plaque region so that the severity of the narrowing of the blood vessel can be determined.
  • the ultrasound image display apparatus 100 may determine the risk of a disease (eg, atherosclerosis) according to the degree of narrowing of the blood vessel based on a predefined guideline, and may display the result in color, symbols, numbers, or letters.
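  • The guideline itself is not given in the patent, so the thresholds in the sketch below are placeholder values chosen only to illustrate turning a narrowing ratio into a severity label that could then be rendered as a color, symbol, number, or letter.
```python
# Illustrative sketch only: the thresholds are invented placeholders, not
# clinical values from any guideline.
def stenosis_severity(narrowing_ratio):
    """narrowing_ratio: fraction of the vessel diameter occluded (0.0-1.0)."""
    if narrowing_ratio < 0.3:
        return "mild"
    if narrowing_ratio < 0.7:
        return "moderate"
    return "severe"

print(stenosis_severity(0.55))   # moderate
```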
  • FIG. 6B illustrates an example of a screen displaying a plaque image mapped with at least one color according to another embodiment of the present invention.
  • the plaque image 650 illustrated in FIG. 6B is an image corresponding to the first image or the plaque image illustrated in FIG. 4A (d) or FIG. 4B.
  • the plaque image 650 illustrated in FIG. 6B includes plaque regions protruding into the blood vessel (the plaque regions indicated by the identifiers 615) and plaque regions formed to protrude out of the blood vessel (the regions indicated by the identifiers 651 and 652).
  • the screen including the plaque image 650 may include a map 620 indicating information about the plaque region. Since the map 620 illustrated in FIG. 6B is the same as the map 620 illustrated in FIG. 6A, a detailed description thereof is omitted.
  • the plaque region protruding into the blood vessel and the plaque region protruding outward of the blood vessel may be marked with markers having different signs.
  • a (+) marker may be placed on the plaque region protruding into the blood vessel, and a (-) marker may be displayed on the plaque region protruding outward from the blood vessel.
  • '-100%' represents the ratio of height to diameter for a plaque region formed by protruding outward from the blood vessel, and '+100%' represents the ratio of height to diameter for a plaque region protruding into the blood vessel.
  • the ultrasound image display apparatus 100 may set a cut surface 701 of the blood vessel region 715 detected from the first volume data, and may generate third volume data based on the cut surface 701.
  • the ultrasound image display apparatus 100 may set the cut surface 701 automatically or manually.
  • the ultrasound image display apparatus 100 may generate the third volume data 720 by mapping the volume data included in the blood vessel region 715 onto a plane A'C'C''A''B''D''D'B' corresponding to the plane ACBD.
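  • The mapping onto the cut plane is described only geometrically in the patent; the following simplified, hypothetical sketch unfolds a single cross-sectional slice about a cut line (angle zero) using nearest-neighbour sampling, to illustrate how volume data inside the vessel region could be laid out flat.
```python
# Illustrative sketch (one 2D slice only, hypothetical names): unfold the
# vessel cross-section about a cut line so the data can be laid out flat.
import numpy as np

def unfold_cross_section(slice_2d, center, radius, n_angles=180, n_radii=32):
    cy, cx = center
    angles = np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False)
    radii = np.linspace(0.0, radius, n_radii)
    unfolded = np.zeros((n_radii, n_angles), dtype=slice_2d.dtype)
    for j, theta in enumerate(angles):          # theta = 0 is the cut line
        for i, r in enumerate(radii):
            y = int(round(cy + r * np.sin(theta)))
            x = int(round(cx + r * np.cos(theta)))
            if 0 <= y < slice_2d.shape[0] and 0 <= x < slice_2d.shape[1]:
                unfolded[i, j] = slice_2d[y, x]
    return unfolded   # rows: distance from the center, columns: angle from cut

demo = np.random.rand(64, 64)
print(unfold_cross_section(demo, center=(32, 32), radius=20).shape)  # (32, 180)
```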
  • the ultrasound image display apparatus 100 may detect the plaque regions 731 and 732 corresponding to the plaques from the third volume data, and may generate a 3D image 730 indicating the plaque regions 731 and 732.
  • the ultrasound image display apparatus 100 may acquire a 3D image of the inside of the blood vessel and extract an accurate plaque region, thereby displaying the severity of fatty (lipid) buildup or calcification of the blood vessel. Thus, the speed and accuracy of disease diagnosis can be increased.
  • FIGS. 8A and 8B illustrate examples of screens output through the display unit 130.
  • the screen 810 may include a first image 820.
  • the first image 820 may include at least one of the image illustrated in (c) of FIG. 3A and the image 400 illustrated in FIG. 3B.
  • FIG. 8A illustrates a case where the first image 820 corresponds to the image 400 illustrated in FIG. 3B as an example.
  • the user may move the first image 820 using the movement cursor 835.
  • the movement may include at least one of movement and rotation of the first image 820.
  • the image processor 120 may generate a second image 815 representing the object region including the tubular tissue using the ultrasound data.
  • the second image 815 may be a 3D ultrasound image generated based on the first volume data acquired based on the ultrasound data and represent the tubular tissue.
  • the second image 815 may include at least one of the images illustrated in (a) or (b) of FIG. 3A. That is, the second image 815 may be a 3D ultrasound image representing an object including tubular tissue.
  • the second image 815 may be a 3D ultrasound image which extracts and shows only tubular tissue from the object.
  • the screen 810 may include a first image 820 and a second image 815.
  • the second image 815 may also be displayed by moving to correspond to the first image 820.
  • the user may move the second image 815 using the movement cursor 835.
  • the first image 820 may also be displayed by moving to correspond to the second image 815.
  • the first image 870 included in the screen 850 may include markers 871 and 872 for displaying cut lines.
  • the description of the overlapping part with the screen 810 of FIG. 8A is omitted.
  • markers 861 and 862 may be included in portions corresponding to the cut line of the first image 870. That is, the markers 861 and 871 are markers indicating the point where the cut line exists around the blood vessel, and the markers 862 and 872 are markers indicating the cut line.
  • the user can easily identify the positional relationship between the tubular tissue before unfolding and the unfolded tubular tissue by looking at the markers displayed at corresponding positions of the first image 870 and the second image 860.
  • FIGS. 9A and 9B illustrate examples of screens output through the display unit 130.
  • the screen illustrated in FIG. 9A may include a first image 920.
  • FIG. 9A illustrates a case where the first image 920 corresponds to the image illustrated in FIG. 3A (c) as an example.
  • the screen may further include a second image 915.
  • a description of the overlapping part with the screen 810 of FIG. 8A is omitted.
  • the screen 950 may include a first image 970.
  • FIG. 9B illustrates a case in which the first image 970 corresponds to the image shown in FIG. 3A (c) as an example.
  • the screen 950 may further include a second image 960.
  • the first image 970 may include markers 971 and 972 for displaying cut lines.
  • markers 961 and 962 may be included in a portion corresponding to the cut line of the first image 970.
  • a screen including a first image (eg, 810 and 910) may additionally display information (eg, information about the size, height, volume, brightness, or severity of the plaque in the blood vessel) about at least one of a predetermined portion and a predetermined tissue detected in the tubular tissue (hereinafter referred to as "detected tissue").
  • a screen may be displayed that includes information indicating numerical values, such as at least one of the size, height, and volume of the detected tissue.
  • the ultrasound image display apparatus and method unfold the tubular tissue and generate and display an ultrasound image represented in three dimensions on a reference plane, so that the user can easily diagnose the inside and the outside of the tubular tissue. Accordingly, the accuracy of disease diagnosis can be improved.
  • FIG. 10 is a block diagram of an ultrasound system to which an ultrasound image display apparatus and method according to an exemplary embodiment may be applied.
  • the ultrasound image display method according to an embodiment of the present invention may be performed by the ultrasound system 1000 shown in FIG. 10, and the ultrasound image display device according to an embodiment of the present invention may be the ultrasound system shown in FIG. 10. 1000 may be included.
  • the ultrasound image display apparatus 100 of FIG. 1 may perform some or all of the functions performed by the ultrasound system 1000 of FIG. 10.
  • the probe 110 and the image processor 120 of FIG. 1 may correspond to the probe 1020, the ultrasound transceiver 1100, and the image processor 1200 of FIG. 10; the display unit 130 of FIG. 1 may correspond to the display unit 1400 of FIG. 10; and the user input unit 140 of FIG. 1 may correspond to the input device 1500 of FIG. 10.
  • the ultrasound diagnosis apparatus 1000 may include a probe 20, an ultrasound transceiver 1100, an image processor 1200, a communicator 1300, a display 1400, a memory 1500, an input device 1600, and a controller 1700, and the above-described various components may be connected to one another through a bus 1800.
  • the ultrasound diagnosis apparatus 1000 may be implemented as a portable type as well as a cart type.
  • Examples of the portable ultrasound diagnosis apparatus may include a PACS (Picture Archiving and Communication System) viewer, a smart phone, a laptop computer, a PDA, and a tablet PC, but are not limited thereto.
  • the probe 20 transmits an ultrasound signal to the object 10 according to a driving signal applied from the ultrasound transceiver 1100, and receives an echo signal reflected from the object 10.
  • the probe 20 includes a plurality of transducers, and the plurality of transducers vibrate according to an electrical signal transmitted and generate ultrasonic waves which are acoustic energy.
  • the probe 20 may be connected to the main body of the ultrasound diagnosis apparatus 1000 by wire or wirelessly, and the ultrasound diagnosis apparatus 1000 may include a plurality of probes 20 according to an implementation form.
  • the transmitter 1110 supplies a driving signal to the probe 20, and includes a pulse generator 1112, a transmission delay unit 1114, and a pulser 1116.
  • the pulse generator 1112 generates pulses for forming transmission ultrasonic waves according to a predetermined pulse repetition frequency (PRF), and the transmission delay unit 1114 applies, to the pulses, delay times for determining transmission directionality. Each pulse to which a delay time is applied corresponds to one of a plurality of piezoelectric vibrators included in the probe 20.
  • the pulser 1116 applies a driving signal (or a driving pulse) to the probe 20 at a timing corresponding to each pulse to which a delay time is applied.
  • the receiver 1120 processes the echo signals received from the probe 20 to generate ultrasound data, and may include an amplifier 1122, an analog-to-digital converter (ADC) 1124, a reception delay unit 1126, and an adder 1128.
  • the amplifier 1122 amplifies the echo signal for each channel, and the ADC 1124 performs analog-to-digital conversion on the amplified echo signal.
  • the reception delay unit 1126 applies, to the digitally converted echo signals, delay times for determining reception directionality, and the adder 1128 generates ultrasound data by summing the echo signals processed by the reception delay unit 1126.
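  • The reception delay and summation described above correspond to the classic delay-and-sum idea; the sketch below is a simplified, hypothetical illustration with integer-sample delays, not the apparatus's actual beamformer.
```python
# Illustrative sketch of delay-and-sum: apply a per-channel delay for reception
# directionality, then add the channels into a single line of ultrasound data.
import numpy as np

def delay_and_sum(channel_data, delays_samples):
    """channel_data: (n_channels, n_samples) echo signals after the ADC.
    delays_samples: per-channel delay in samples. Returns one summed line."""
    n_channels, n_samples = channel_data.shape
    out = np.zeros(n_samples)
    for ch in range(n_channels):
        d = int(delays_samples[ch])
        out[d:] += channel_data[ch, :n_samples - d]   # shift, then accumulate
    return out

rf = np.random.randn(8, 1024)            # 8 channels, 1024 samples each
delays = np.array([0, 1, 2, 3, 3, 2, 1, 0])
line = delay_and_sum(rf, delays)
print(line.shape)                        # (1024,)
```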
  • the receiver 1120 may not include the amplifier 1122 according to the implementation form. That is, when the sensitivity of the probe 20 is improved or the number of processing bits of the ADC 1124 is improved, the amplifier 1122 may be omitted.
  • the image processor 1200 generates an ultrasound image through a scan conversion process on the ultrasound data generated by the ultrasound transceiver 1100.
  • the ultrasound image may be not only a grayscale image obtained by scanning the object in an amplitude (A) mode, a brightness (B) mode, or a motion (M) mode, but also a Doppler image representing a moving object by using the Doppler effect.
  • the Doppler image may be a blood flow Doppler image (or also referred to as a color Doppler image) representing a blood flow, a tissue Doppler image representing a tissue movement, or a spectral Doppler image displaying a moving speed of an object as a waveform.
  • the B mode processing unit 1212 included in the data processing unit 1210 extracts and processes the B mode component from the ultrasonic data.
  • the image generator 1220 may generate an ultrasound image in which the intensity of the signal is expressed as brightness based on the B mode component extracted by the B mode processor 1212.
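  • As a hypothetical illustration of expressing signal intensity as brightness, the sketch below applies envelope detection and log compression to one beamformed RF line; the parameters and the use of scipy are choices made for the example, not details from the patent.
```python
# Illustrative sketch: turn a beamformed RF line into B-mode brightness via
# envelope detection followed by log compression.
import numpy as np
from scipy.signal import hilbert

def rf_to_bmode(rf_line, dynamic_range_db=60.0):
    envelope = np.abs(hilbert(rf_line))                 # envelope detection
    envelope /= envelope.max() + 1e-12                  # normalize
    db = 20.0 * np.log10(envelope + 1e-12)              # log compression
    brightness = np.clip((db + dynamic_range_db) / dynamic_range_db, 0.0, 1.0)
    return brightness                                   # 0 (dark) .. 1 (bright)

t = np.linspace(0.0, 1.0, 2048)
rf = np.sin(2 * np.pi * 200 * t) * np.exp(-3 * t)       # toy decaying echo
print(rf_to_bmode(rf).max())                            # 1.0
```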
  • the Doppler processor 1214 included in the data processor 1210 extracts a Doppler component from the ultrasound data, and the image generator 1220 may generate a Doppler image expressing the movement of the object in color or as a waveform based on the extracted Doppler component.
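  • Color-flow Doppler processing commonly estimates the mean phase shift between successive pulses with a lag-1 autocorrelation; the sketch below illustrates that general idea with example constants, and is not taken from the patent.
```python
# Illustrative sketch: estimate axial velocity for one pixel from an ensemble
# of complex IQ samples using the lag-1 autocorrelation phase.
import numpy as np

def mean_velocity(iq_ensemble, prf=4000.0, f0=5e6, c=1540.0):
    """iq_ensemble: complex IQ samples for one pixel, shape (n_pulses,)."""
    r1 = np.mean(iq_ensemble[1:] * np.conj(iq_ensemble[:-1]))  # lag-1 autocorr
    phase = np.angle(r1)                                        # mean phase shift
    return c * prf * phase / (4.0 * np.pi * f0)                 # m/s, signed

# toy ensemble: constant phase step of 0.2 rad per pulse
iq = np.exp(1j * 0.2 * np.arange(16))
print(round(mean_velocity(iq), 4))   # 0.0196 (m/s, example value)
```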
  • the image generator 1220 may generate a 3D ultrasound image through a volume rendering process on volume data, and may also generate an elasticity image that visualizes the degree of deformation of the object 10 according to pressure. In addition, the image generator 1220 may express various additional information in text or graphics on the ultrasound image. The generated ultrasound image may be stored in the memory 1500.
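  • As a hypothetical stand-in for the volume rendering process, the sketch below computes a maximum-intensity projection of a 3D volume; actual renderers typically use ray casting with opacity transfer functions, so this only shows the volume-to-2D-image idea.
```python
# Illustrative sketch: a very simple volume-rendering step (maximum-intensity
# projection along one axis of the volume data).
import numpy as np

def max_intensity_projection(volume, axis=0):
    """volume: 3D array of scalar ultrasound intensities."""
    return volume.max(axis=axis)

vol = np.random.rand(64, 128, 128)        # depth x height x width
image = max_intensity_projection(vol)
print(image.shape)                        # (128, 128)
```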
  • the display unit 1400 displays and outputs the generated ultrasound image.
  • the display unit 1400 may display and output not only an ultrasound image but also various information processed by the ultrasound diagnosis apparatus 1000 on a screen through a graphical user interface (GUI).
  • the ultrasound diagnosis apparatus 1000 may include two or more display units 1400 according to an implementation form.
  • the communication unit 1300 is connected to the network 30 by wire or wirelessly to communicate with an external device or a server.
  • the communication unit 1300 may exchange data with a hospital server or another medical device in a hospital connected through a medical image information system (PACS).
  • the communication unit 1300 may perform data communication according to a digital imaging and communications in medicine (DICOM) standard.
  • the communication unit 1300 may transmit and receive data related to diagnosis of the object 10, such as an ultrasound image, ultrasound data, and Doppler data, through the network 30, and may also transmit and receive medical images captured by other medical devices such as a CT device, an MRI device, and an X-ray device.
  • the communication unit 1300 may receive information on a diagnosis history, a treatment schedule, and the like of the patient from the server and use the same to diagnose the object 10.
  • the communication unit 1300 may perform data communication with a portable terminal of a doctor or a patient, as well as a server or a medical device in a hospital.
  • the communication unit 1300 may be connected to the network 30 by wire or wirelessly to exchange data with the server 32, the medical device 34, or the portable terminal 36.
  • the communication unit 1300 may include one or more components that enable communication with an external device, and may include, for example, a short range communication module 1310, a wired communication module 1320, and a mobile communication module 1330.
  • the short range communication module 1310 refers to a module for short range communication within a predetermined distance.
  • Examples of short range communication technology include wireless LAN, Wi-Fi, Bluetooth, ZigBee, Wi-Fi Direct (WFD), ultra wideband (UWB), Infrared Data Association (IrDA) communication, Bluetooth Low Energy (BLE), and Near Field Communication (NFC), but are not limited thereto.
  • the wired communication module 1320 refers to a module for communication using an electrical signal or an optical signal, and wired communication technologies according to an embodiment include a twisted pair cable, a coaxial cable, an optical fiber cable, an Ethernet cable, and the like.
  • the mobile communication module 1330 transmits and receives a radio signal with at least one of a base station, an external terminal, and a server on a mobile communication network.
  • the wireless signal may include a voice call signal, a video call signal, or various types of data according to transmission and reception of text/multimedia messages.
  • the memory 1500 stores various types of information processed by the ultrasound diagnosis apparatus 1000.
  • the memory 1500 may store medical data related to diagnosis of an object, such as input / output ultrasound data and an ultrasound image, or may store an algorithm or a program performed in the ultrasound diagnosis apparatus 1000.
  • the memory 1500 may be implemented with various types of storage media such as a flash memory, a hard disk, and an EEPROM. Also, the ultrasound diagnosis apparatus 1000 may operate a web storage or a cloud server that performs a storage function of the memory 1500 on the web.
  • the input device 1600 refers to a means for receiving data for controlling the ultrasound diagnosis apparatus 1000 from a user.
  • Examples of the input device 1600 may include, but are not limited to, hardware components such as a keypad, a mouse, a touch pad, a touch screen, a trackball, and a jog switch, and may further include various input means such as an ECG measurement module, a respiration measurement module, a voice recognition sensor, a gesture recognition sensor, a fingerprint recognition sensor, an iris recognition sensor, a depth sensor, and a distance sensor.
  • the controller 1700 controls the overall operation of the ultrasound diagnosis apparatus 1000. That is, the controller 1700 may control operations of the probe 20, the ultrasound transceiver 1100, the image processor 1200, the communication unit 1300, the display unit 1400, the memory 1500, and the input device 1600 illustrated in FIG. 10.
  • some or all of the probe 20, the ultrasound transceiver 1100, the image processor 1200, the communication unit 1300, the display unit 1400, the memory 1500, the input device 1600, and the controller 1700 may be operated by software modules, but are not limited thereto; some of the above-described components may be operated by hardware.
  • in addition, at least some of the ultrasound transceiver 1100, the image processor 1200, and the communicator 1300 may be included in the controller 1700, but embodiments are not limited thereto.
  • Computer readable media can be any available media that can be accessed by a computer and includes both volatile and nonvolatile media, removable and non-removable media.
  • Computer readable media may include both computer storage media and communication media.
  • Computer storage media includes both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data.
  • Communication media typically includes computer readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave, or other transmission mechanism, and includes any information delivery media.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Public Health (AREA)
  • Medical Informatics (AREA)
  • Biomedical Technology (AREA)
  • Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Veterinary Medicine (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Vascular Medicine (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physiology (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Ultra Sonic Daignosis Equipment (AREA)

Abstract

One embodiment of the present invention relates to a method of displaying an ultrasound image that provides a user with a three-dimensional image enabling easy diagnosis of a tubular tissue and, in turn, easy diagnosis of a disease occurring in the tubular tissue. An apparatus for displaying an ultrasound image according to one embodiment of the present invention comprises: an image processing unit for generating a first image that shows in three dimensions, on a reference plane, the surface forming the tubular tissue, based on ultrasound data corresponding to an object including the tubular tissue; and a display for displaying the first image.
PCT/KR2014/012272 2013-12-12 2014-12-12 Procédé et appareil d'affichage d'une image ultrasonore Ceased WO2015088277A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US15/103,555 US10631823B2 (en) 2013-12-12 2014-12-12 Method and apparatus for displaying ultrasonic image
CN201480075482.5A CN106028950B (zh) 2013-12-12 2014-12-12 显示超声图像的设备和方法
EP14869861.6A EP3081169B1 (fr) 2013-12-12 2014-12-12 Procédé et appareil d'affichage d'une image ultrasonore

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR20130154899 2013-12-12
KR10-2013-0154899 2013-12-12

Publications (1)

Publication Number Publication Date
WO2015088277A1 true WO2015088277A1 (fr) 2015-06-18

Family

ID=53371499

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2014/012272 Ceased WO2015088277A1 (fr) 2013-12-12 2014-12-12 Procédé et appareil d'affichage d'une image ultrasonore

Country Status (5)

Country Link
US (1) US10631823B2 (fr)
EP (1) EP3081169B1 (fr)
KR (1) KR101660370B1 (fr)
CN (1) CN106028950B (fr)
WO (1) WO2015088277A1 (fr)

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9042613B2 (en) * 2013-03-01 2015-05-26 Heartflow, Inc. Method and system for determining treatments by modifying patient-specific geometrical models
CN105232086A (zh) * 2015-10-29 2016-01-13 深圳市德力凯医疗设备股份有限公司 一种基于经颅多普勒的颅内血流三维信息显示方法及系统
US10332305B2 (en) * 2016-03-04 2019-06-25 Siemens Healthcare Gmbh Cinematic rendering of unfolded 3D volumes
KR102032611B1 (ko) * 2017-08-23 2019-10-15 주식회사 메디웨일 Ct 영상을 이용하여 심혈관 병변을 판단하는 방법 및 애플리케이션
EP3586759A1 (fr) * 2018-06-28 2020-01-01 Koninklijke Philips N.V. Procédés et systèmes de réalisation d'imagerie par ultrasons doppler en couleur
KR102714846B1 (ko) * 2018-08-09 2024-10-10 삼성메디슨 주식회사 초음파 진단 장치, 초음파 영상을 표시하는 방법, 및 컴퓨터 프로그램 제품
KR102372826B1 (ko) * 2019-12-26 2022-03-11 계명대학교 산학협력단 결석 제거장치
US11497465B2 (en) 2019-10-25 2022-11-15 Bard Peripheral Vascular, Inc. Method for treatment of a vascular lesion
US11151718B2 (en) * 2019-10-30 2021-10-19 Nikon Corporation Image processing method, image processing device, and storage medium
CN113545807A (zh) * 2020-04-26 2021-10-26 深圳迈瑞生物医疗电子股份有限公司 血管斑块的超声测量方法、装置和存储介质
CN113693628A (zh) * 2020-05-20 2021-11-26 深圳迈瑞生物医疗电子股份有限公司 一种超声成像的方法和系统
CN111920450A (zh) * 2020-07-14 2020-11-13 上海市同仁医院 一种可视化颈动脉斑块超声自查仪
CN115192073A (zh) * 2021-07-21 2022-10-18 上海市同仁医院 心血管疾病风险评估分层管理及颈动脉超声一体化系统
US20240081782A1 (en) * 2022-09-14 2024-03-14 Boston Scientific Scimed, Inc. Graphical user interface for intravascular ultrasound calcium display

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080051660A1 (en) * 2004-01-16 2008-02-28 The University Of Houston System Methods and apparatuses for medical imaging
JP2009195585A (ja) * 2008-02-25 2009-09-03 Toshiba Corp 超音波診断装置、及び超音波診断装置の制御プログラム
JP2013052131A (ja) * 2011-09-05 2013-03-21 Toshiba Corp 超音波診断装置及び血管狭窄改善表示プログラム
JP2013118932A (ja) * 2011-12-07 2013-06-17 Toshiba Corp 超音波診断装置及び血管厚測定プログラム

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4421203B2 (ja) * 2003-03-20 2010-02-24 株式会社東芝 管腔状構造体の解析処理装置
CN101166470B (zh) 2005-04-28 2016-04-06 株式会社日立医药 图像显示装置及图像显示方法
JP2006346022A (ja) * 2005-06-14 2006-12-28 Ziosoft Inc 画像表示方法及び画像表示プログラム
EP1897063B1 (fr) * 2005-06-22 2018-05-02 Koninklijke Philips N.V. Procede de visualisation de plans de coupe pour des structures courbes allongees
US7668342B2 (en) 2005-09-09 2010-02-23 Carl Zeiss Meditec, Inc. Method of bioimage data processing for revealing more meaningful anatomic features of diseased tissues
JP4958901B2 (ja) * 2006-03-29 2012-06-20 株式会社日立メディコ 医用画像表示システム及び医用画像表示プログラム
JP4937843B2 (ja) 2007-06-11 2012-05-23 株式会社日立メディコ 医用画像表示装置、医用画像表示方法及びプログラム
JP5191989B2 (ja) * 2007-07-31 2013-05-08 株式会社日立メディコ 医用画像表示装置、医用画像表示方法
US8743118B2 (en) * 2008-03-21 2014-06-03 Hitachi Medical Corporation Medical image display device and medical image display method
US8818060B2 (en) * 2008-12-25 2014-08-26 Hitachi Medical Corporation Medical image display device and method
KR20130056855A (ko) 2010-03-01 2013-05-30 카리스 라이프 사이언스 룩셈부르크 홀딩스 치료진단용 생물학적 지표들
JP5683831B2 (ja) * 2010-04-14 2015-03-11 株式会社東芝 医用画像処理装置、及び医用画像処理プログラム
US9196057B2 (en) * 2011-03-10 2015-11-24 Kabushiki Kaisha Toshiba Medical image diagnosis apparatus, medical image display apparatus, medical image processing apparatus, and medical image processing program
KR20210133321A (ko) 2012-11-08 2021-11-05 클리어사이드 바이오메디컬, 인코포레이드 인간 대상체에서 안구 질병을 치료하기 위한 방법 및 장치
CN104968276B (zh) 2013-02-28 2018-02-27 株式会社日立制作所 图像处理装置以及区域抽出方法

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080051660A1 (en) * 2004-01-16 2008-02-28 The University Of Houston System Methods and apparatuses for medical imaging
JP2009195585A (ja) * 2008-02-25 2009-09-03 Toshiba Corp 超音波診断装置、及び超音波診断装置の制御プログラム
JP2013052131A (ja) * 2011-09-05 2013-03-21 Toshiba Corp 超音波診断装置及び血管狭窄改善表示プログラム
JP2013118932A (ja) * 2011-12-07 2013-06-17 Toshiba Corp 超音波診断装置及び血管厚測定プログラム

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3081169A4 *

Also Published As

Publication number Publication date
CN106028950A (zh) 2016-10-12
KR20150068922A (ko) 2015-06-22
CN106028950B (zh) 2020-03-31
EP3081169A1 (fr) 2016-10-19
EP3081169A4 (fr) 2017-11-08
EP3081169B1 (fr) 2023-06-28
KR101660370B1 (ko) 2016-09-27
US20160310101A1 (en) 2016-10-27
US10631823B2 (en) 2020-04-28

Similar Documents

Publication Publication Date Title
WO2015088277A1 (fr) Procédé et appareil d'affichage d'une image ultrasonore
WO2015080522A1 (fr) Méthode et appareil ultrasonore pour le marquage de tumeur sur une image élastographique ultrasonore
WO2015130070A2 (fr) Appareil de diagnostic à ultrasons et son procédé de fonctionnement
KR102618500B1 (ko) 초음파 진단장치 및 그에 따른 초음파 진단 방법
WO2015102474A1 (fr) Appareil de diagnostic à ultrasons, procédé de capture d'image à ultrasons et support d'enregistrement lisible par ordinateur
WO2016186279A1 (fr) Procédé et appareil de synthèse d'images médicales
EP3073930A1 (fr) Méthode et appareil ultrasonore pour le marquage de tumeur sur une image élastographique ultrasonore
WO2016182166A1 (fr) Procédé d'affichage d'une image obtenue par élastographie et appareil de diagnostic à ultrasons mettant en œuvre le procédé
EP3185778A1 (fr) Appareil de diagnostic à ultrasons pour auto-diagnostic et diagnostic à distance, et procédé de fonctionnement de l'appareil de diagnostic à ultrasons
WO2015076508A1 (fr) Procédé et appareil d'affichage d'image ultrasonore
WO2016027959A1 (fr) Procédé, appareil et système pour délivrer une image médicale représentant un objet et une image de clavier
WO2015141913A1 (fr) Appareil de diagnostic par ultrasons et son procédé de fonctionnement
WO2016052817A1 (fr) Procédé et appareil d'imagerie médicale pour générer une image élastique à l'aide d'une sonde à réseau incurvé
WO2017135500A1 (fr) Procédé permettant d'émettre la vitesse d'un objet et dispositif de diagnostic ultrasonore associé
WO2015160047A1 (fr) Appareil d'imagerie médicale et son procédé de fonctionnement
WO2018092993A1 (fr) Dispositif de diagnostic ultrasonore et son procédé de fonctionnement
US20160085328A1 (en) Ultrasound diagnosis apparatus and method and computer-readable storage medium
EP3071113A1 (fr) Procédé et appareil d'affichage d'image ultrasonore
WO2017179782A1 (fr) Dispositif de diagnostic ultrasonore et son procédé de commande
WO2016093453A1 (fr) Appareil de diagnostic à ultrasons et son procédé de fonctionnement
WO2015002400A1 (fr) Appareil de diagnostic par ultrasons et son procédé de fonctionnement
WO2016047867A1 (fr) Procédé de traitement d'image à ultrasons et appareil d'imagerie à ultrasons associé
WO2016098929A1 (fr) Dispositif d'imagerie ultrasonore et son procédé de commande
WO2022019616A1 (fr) Procédé de fourniture d'informations concernant le diagnostic d'un polype de la vésicule biliaire et dispositif de fourniture d'informations concernant le diagnostic d'un polype de la vésicule biliaire l'utilisant
WO2015137616A1 (fr) Appareil de diagnostic médical et son procédé de fonctionnement

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14869861

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 15103555

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

REEP Request for entry into the european phase

Ref document number: 2014869861

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2014869861

Country of ref document: EP