
WO2014208977A1 - Ultrasonic imaging apparatus and control method thereof - Google Patents

Ultrasonic imaging apparatus and control method thereof

Info

Publication number
WO2014208977A1
WO2014208977A1 (application PCT/KR2014/005572)
Authority
WO
WIPO (PCT)
Prior art keywords
ultrasonic
image
operator
imaging apparatus
eyeline
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/KR2014/005572
Other languages
French (fr)
Inventor
Yun Tae Kim
Jung Ho Kim
Kyu Hong Kim
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Publication of WO2014208977A1
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/42: Details of probe positioning or probe attachment to the patient
    • A61B 8/4209: Details of probe positioning or probe attachment to the patient by using holders, e.g. positioning frames
    • A61B 8/4227: Details of probe positioning or probe attachment to the patient by using holders, e.g. positioning frames, characterised by straps, belts, cuffs or braces
    • A61B 8/44: Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
    • A61B 8/4444: Constructional features of the ultrasonic, sonic or infrasonic diagnostic device related to the probe
    • A61B 8/4455: Features of the external shape of the probe, e.g. ergonomic aspects
    • A61B 8/46: Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B 8/461: Displaying means of special interest
    • A61B 8/462: Displaying means of special interest characterised by constructional features of the display
    • A61B 8/463: Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
    • A61B 8/465: Displaying means of special interest adapted to display user selection data, e.g. icons or menus
    • A61B 8/466: Displaying means of special interest adapted to display 3D data
    • A61B 8/48: Diagnostic techniques
    • A61B 8/483: Diagnostic techniques involving the acquisition of a 3D volume of data
    • A61B 8/52: Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5215: Devices using data or image processing specially adapted for diagnosis, involving processing of medical diagnostic data
    • A61B 8/523: Devices using data or image processing specially adapted for diagnosis, involving processing of medical diagnostic data for generating planar views from image data in a user selectable plane not corresponding to the acquisition plane
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 15/00: Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S 15/88: Sonar systems specially adapted for specific applications
    • G01S 15/89: Sonar systems specially adapted for specific applications for mapping or imaging
    • G01S 15/8906: Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
    • G01S 15/8909: Short-range imaging systems using pulse-echo techniques using a static transducer configuration
    • G01S 15/8915: Short-range imaging systems using pulse-echo techniques using a static transducer configuration using a transducer array
    • G01S 15/8925: Short-range imaging systems using pulse-echo techniques, the array being a two-dimensional transducer configuration, i.e. matrix or orthogonal linear arrays
    • G01S 15/899: Combination of imaging systems with ancillary equipment
    • G01S 15/8993: Three dimensional imaging systems

Definitions

  • Exemplary embodiments relate to an ultrasonic imaging apparatus and a control method thereof, and more particularly, to an ultrasonic imaging apparatus which is capable of providing intuitive ultrasonic images, and a control method thereof.
  • Medical imaging apparatuses include an X-ray imaging apparatus, a fluoroscopy system, a Computerized Tomography (CT) scanner, a Magnetic Resonance Image (MRI) apparatus, Positron Emission Tomography (PET), and an ultrasonic imaging apparatus.
  • the ultrasonic imaging apparatus irradiates ultrasonic waves toward an object (e.g., a human body), and receives an ultrasonic echo which is reflected from the inside of the object so as to non-invasively acquire section images which relate to the inner tissues of the object or images which relate to blood vessels of the object based on the ultrasonic echo.
  • the ultrasonic imaging apparatus has advantages that it is a compact, low-priced apparatus compared to other medical imaging apparatuses and it can display images in real time. In addition, the ultrasonic imaging apparatus is relatively safe, because there is no risk that patients will be exposed to radiation such as X-rays. Because of these advantages, the ultrasonic imaging apparatus is widely used to diagnose medical conditions which relate to the heart, breasts, abdomen, urinary organs, uterus, and other parts of the human body.
  • an ultrasonic imaging apparatus includes: a flexible probe which has a planar shape, and which is configured to emit ultrasonic waves toward a subject, and to receive at least one ultrasonic echo which is reflected from the subject; an image processor which is configured to trace an eyeline of an operator from an image which is acquired by photographing the operator, and to generate an ultrasonic image by processing three-dimensional (3D) volume data which is acquired from the at least one ultrasonic echo based on the traced eyeline; and a flexible display which has a planar shape, and which is disposed on one side of the flexible probe, and which is configured to display the generated ultrasonic image.
  • a control method which is executable by using an ultrasonic imaging apparatus includes: emitting ultrasonic waves toward a subject by using a flexible probe which has a planar shape, and receiving at least one ultrasonic echo which is reflected from the subject; tracing an eyeline of an operator from an image which is acquired by photographing the operator, and generating an ultrasonic image by processing 3D volume data which is acquired from the at least one ultrasonic echo based on the traced eyeline; and displaying the generated ultrasonic image by using a flexible display which has a planar shape and which is disposed on one side of the flexible probe.
  • the flexible probe and the flexible display are embodied integrally with each other, it is possible to provide an intuitive ultrasonic image.
  • FIG. 1 is a perspective view of an ultrasonic imaging apparatus, according to an exemplary embodiment
  • FIG. 2 is a perspective view of an ultrasonic imaging apparatus, according to another exemplary embodiment
  • FIG. 3 is a block diagram of an ultrasonic imaging apparatus, according to an exemplary embodiment
  • FIG. 4 is a block diagram of a transmit beamformer, according to an exemplary embodiment
  • FIG. 5 is a block diagram of a receive beamformer, according to an exemplary embodiment
  • FIG. 6 is a block diagram of an image processor, according to an exemplary embodiment
  • FIGS. 7A and 7B are views which illustrate section images that are generated by the section image generator of FIG. 6;
  • FIG. 8 is a view which illustrates a projection image that is generated by the image processor of FIG. 6;
  • FIG. 9 is a block diagram of an image processor, according to another exemplary embodiment.
  • FIG. 10 is a block diagram of an image processor, according to still another exemplary embodiment.
  • FIG. 11 is a flowchart which illustrates a control method which is executable by using an ultrasonic imaging apparatus including the image processor of FIG. 6, according to an exemplary embodiment
  • FIG. 12 is a flowchart which illustrates a control method which is executable by using an ultrasonic imaging apparatus including the image processor of FIG. 9, according to another exemplary embodiment.
  • FIG. 13 is a flowchart which illustrates a control method which is executable by using an ultrasonic imaging apparatus including the image processor of FIG. 10, according to still another exemplary embodiment.
  • the ultrasonic imaging apparatus irradiates ultrasonic waves toward a target area of an object, receives at least one ultrasonic echo which is reflected from the target area of the object, converts the received at least one ultrasonic echo into electrical signals, and then creates an ultrasonic image which relates to the target area based on the electrical signals.
  • FIG. 1 is a perspective view of an ultrasonic imaging apparatus 200, according to an exemplary embodiment.
  • the ultrasonic imaging apparatus 200 may include a body 201, and a belt 290 which is connected to the body 201.
  • the body 201 may include a flexible probe 230, a flexible display 220, and one or more photographing devices 210.
  • the flexible probe 230 contacts the skin surface of an object 10, and may have a planar shape, such as, for example, a quadrangle shape. Although not illustrated in FIG. 1, the flexible probe 230 may include a flexible substrate and a plurality of ultrasonic elements T which are mounted on the flexible substrate.
  • the flexible substrate may be flexible, foldable, and/or bendable. Accordingly, the flexible substrate may be made from one or more of plastic (a polymer film), a metal foil, and/or thin glass.
  • each ultrasonic element T irradiates ultrasonic waves toward the object 10, receives at least one ultrasonic echo which is reflected from the inside of the object 10, and converts the received at least one ultrasonic echo into electrical signals.
  • each ultrasonic element T may include an ultrasonic generator for generating ultrasonic waves, and an ultrasonic receiver for receiving an ultrasonic echo and converting the received ultrasonic echo into electrical signals.
  • each ultrasonic element T itself may perform all functions of generating ultrasonic waves and receiving an ultrasonic echo.
  • the ultrasonic element T may include an ultrasonic transducer.
  • a transducer is a device for converting a specific first type of energy into a second type of energy.
  • an ultrasonic transducer may convert electrical energy into wave energy, and wave energy into electrical energy.
  • the ultrasonic transducer may perform the functions of an ultrasonic generator and an ultrasonic receiver.
  • the ultrasonic transducer may include a piezoelectric material and/or a piezoelectric thin film.
  • when an alternating current from an internal power storage device, such as a battery, or from an external power source is applied to the piezoelectric material or the piezoelectric thin film, the piezoelectric material or the piezoelectric thin film vibrates at a specific frequency, and ultrasonic waves of the specific frequency are generated according to the vibration frequency.
  • when an ultrasonic echo of a specific frequency arrives at the piezoelectric material or the piezoelectric thin film, the piezoelectric material or the piezoelectric thin film vibrates at the specific frequency of the ultrasonic echo so as to output an alternating current of the specific frequency which corresponds to the vibration frequency.
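As an illustrative sketch only (not part of the disclosed embodiment), the alternating-current excitation that drives a piezoelectric element at a chosen vibration frequency can be modeled as a windowed tone burst; the function name, sampling rate, and burst length below are assumptions.

```python
import numpy as np

def tone_burst(center_freq_hz, num_cycles, sample_rate_hz):
    """Hann-windowed sinusoidal burst, an illustrative stand-in for the
    alternating-current transmission signal that drives a piezoelectric
    element at a chosen vibration frequency (assumed parameters)."""
    duration_s = num_cycles / center_freq_hz
    t = np.arange(0.0, duration_s, 1.0 / sample_rate_hz)
    carrier = np.sin(2.0 * np.pi * center_freq_hz * t)
    window = np.hanning(len(t))            # soften the edges of the burst
    return t, carrier * window

# Example (assumed values): a 5 MHz, 4-cycle excitation sampled at 100 MHz
t, pulse = tone_burst(5e6, 4, 100e6)
```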
  • the ultrasonic transducer may include a magnetostrictive ultrasonic transducer which uses the magnetostrictive effect of a magnetic material, a piezoelectric ultrasonic transducer which uses the piezoelectric effect of a piezoelectric material, and/or a capacitive micromachined ultrasonic transducer (CMUT) that transmits and receives ultrasonic waves by using a vibration of several hundreds or thousands of micromachined thin films.
  • the ultrasonic transducer may include any other type of transducer which is capable of generating ultrasonic waves according to an electrical signal, or which is capable of generating an electrical signal according to ultrasonic waves.
  • a plurality of ultrasonic transducers may be arranged in a matrix form on a flexible substrate.
  • 3D volume data can be acquired.
  • a plurality of ultrasonic transducers may be arranged in a line on a flexible substrate such that they are movable with respect to the flexible substrate.
  • rails (not shown), on which both ends of the plurality of transducers are placed, may be provided perpendicular to the arrangement direction of the transducers. Accordingly, by moving the transducers along the rails in a scanning direction, a plurality of ultrasonic images can be acquired, and the ultrasonic images are accumulated in order to obtain 3D volume data.
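A minimal sketch of how such accumulation of section images into 3D volume data might look, assuming each scan position along the rails yields one equally sized 2D frame; the function name and array shapes are illustrative and not taken from the disclosure.

```python
import numpy as np

def accumulate_volume(frames):
    """Stack equally sized 2D ultrasound frames (depth x lateral), acquired at
    successive positions along the scanning direction, into a 3D volume."""
    return np.stack(list(frames), axis=0)    # result shape: (scan, depth, lateral)

# Example with assumed sizes: 64 frames of 256 x 128 pixels -> a 64 x 256 x 128 volume
volume = accumulate_volume(np.random.rand(256, 128) for _ in range(64))
print(volume.shape)                          # (64, 256, 128)
```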
  • the flexible display 220 is disposed on the flexible probe 230.
  • the flexible display 220 may have, similarly as the flexible probe 230, a planar shape, such as, for example, a quadrangle shape.
  • the flexible display 220 may include any one or more of a Liquid Crystal Display (LCD), an Organic Light Emitting Diode (OLED), and/or an Electrophoretic Display (EPD).
  • the flexible display 220 may include a flexible substrate, and a Thin Film Transistor (TFT) array which is formed on the flexible substrate.
  • the flexible substrate may be flexible, foldable, and/or bendable. Accordingly, the flexible substrate may be made from at least one of plastic (a polymer film), a metal foil, and/or thin glass.
  • the TFT array has a structure in which a plurality of TFTs are arranged.
  • the TFTs are located to correspond to individual pixels of a display screen.
  • Each TFT controls current flowing through a thin film type semiconductor by applying an electric field in a direction which is perpendicular to the direction of the current flow.
  • the TFT is a kind of Field Effect Transistor (FET).
  • the TFT may include a source electrode and a drain electrode spaced apart on a substrate, a semiconductor thin film formed across the source electrode and the drain electrode, an insulating layer formed on the semiconductor thin film, and a gate formed on the insulating layer.
  • the source electrode and the drain electrode may be formed by depositing aluminum or indium on the substrate.
  • the semiconductor thin film may be formed by depositing cadmium sulfide (CdS) on the source electrode and the drain electrode.
  • the insulating layer may be formed by depositing silicon oxide (SiO2) on the semiconductor thin film.
  • the flexible display 220 may have a display function.
  • the flexible display 220 may display an ultrasonic image.
  • the ultrasonic image may include any one or more of a section image which is obtainable from 3D volume data, and a projection image which is obtainable by volume-rendering 3D volume data with respect to a specific viewpoint.
  • a type of ultrasonic images to be displayed via the flexible display 220 may be set in advance by an operator.
  • Information relating to the type of ultrasonic images to be displayed via the flexible display 220 may be stored in a storage unit (see, e.g., item 280 of FIG. 2) of the ultrasonic imaging apparatus 200.
  • the flexible display 220 may function as an input device.
  • the flexible display 220 may be implemented as a touch display.
  • a display screen of the flexible display 220 may display menus in the forms of images, numerals, and/or any other suitable format.
  • the menus may be displayed in a separate area from ultrasonic images, and/or may overlap ultrasonic images.
  • An operator may touch one of the displayed menus with his/her finger or a stylus pen, thereby inputting an instruction or command for manipulating the ultrasonic imaging apparatus 200.
  • the operator may mark a lesion area in an ultrasonic image by using his/her finger or a stylus pen.
  • the ultrasonic imaging apparatus 200 may further include an input device and/or an input unit (not shown) which is configured for enabling an operator to input instructions or commands, and/or a communication device and/or a communication unit (not shown) which is configured for receiving the instructions or commands input by the operator from an external device.
  • the communication device and/or the communication unit may receive the instructions or commands from the external device via wired communication and/or wireless communication.
  • main components of the ultrasonic imaging apparatus 200 may be disposed between the flexible probe 230 and the flexible display 220.
  • any one or more of a controller, a transmit beamformer, a receive beamformer, an image processor, and a storage unit may be disposed between the flexible probe 230 and the flexible display 220. Details about the above-mentioned components will be described below with reference to FIG. 3.
  • the photographing units (also referred to herein as photographing devices) 210 may photograph the operator in order to acquire an image of the operator. More specifically, each photographing unit 210 may include an infrared emitter which is configured for emitting infrared light toward the operator's eye, and a camera which is configured for photographing infrared light which is reflected from the operator's cornea.
  • the infrared emitter may include a Light Emitting Diode (LED).
  • An image which is acquired by the camera may be used to trace the operator's eyeline. A degree of reflection of the infrared light varies depending on the operator's eyeline. Accordingly, the operator's eyeline can be determined based on the reflected infrared light.
  • the number and installation locations of the photographing units 210 may vary.
  • two photographing units 210 may be installed in one edge of the flexible display 220.
  • the two photographing units 210 may be arranged in the center of one edge of the flexible display 220 such that they are spaced apart by a predetermined distance from each other.
  • the two photographing units 210 may be respectively installed in left and right corners of the flexible display 220.
  • a single photographing unit 210 may be installed in one edge of the flexible display 220.
  • the photographing unit 210 may be located in the center of one edge of the flexible display 220.
  • the location of the photographing unit 210 is not limited to this, and therefore, as another example, the photographing unit 210 may be installed in one of four corners of the flexible display 220.
  • the photographing unit 210 may be installed in wearable equipment, such as, e.g., glasses, goggles, or a headset.
  • images photographed by the photographing unit 210 may be transmitted to the body 201 via wired communication or wireless communication.
  • the belt 290 may be connected to both ends of the body 201.
  • the belt 290 acts to fasten the body 201 to the object 10, such as, for example, a patient's abdomen. Further, the belt 290 acts to cause the flexible probe 230 to closely contact the object 10.
  • the belt 290 may be configured to have an adjustable length.
  • the belt 290 may be made of a material having elasticity.
  • FIG. 3 is a block diagram of an ultrasonic imaging apparatus 200, according to an exemplary embodiment.
  • the ultrasonic imaging apparatus 200 may include a photographing unit 210, a flexible display 220, a flexible probe 230, a controller 240, a transmit beamformer 250, a receive beamformer 260, an image processor 270, and a storage unit (also referred to herein as a “storage” or as a “storage device”) 280.
  • the controller 240 may control the entire operation of the ultrasonic imaging apparatus 200. More specifically, the controller 240 may generate a control signal for controlling at least one of the transmit beamformer 250, the receive beamformer 260, the image processor 270, the storage unit 280, and the flexible display 220 based on an instruction or command received from an operator or an external device. For example, when an operator marks a lesion area by using his/her finger or a stylus pen, the controller 240 may display a solid line which corresponds to the operator's marking so that the lesion area can be distinguished from other areas.
  • the transmit beamformer 250 may perform transmit beamforming.
  • the transmit beamforming is performed in order to focus ultrasonic waves which are generated by one or more ultrasonic elements T to a focal point.
  • the transmit beamforming may be performed by coordinating one or more ultrasonic elements T to generate ultrasonic waves in an appropriate order, so as to compensate for the time differences with which ultrasonic waves generated by the individual ultrasonic elements T arrive at a focal point.
  • the transmit beamforming will be described in more detail with reference to FIG. 4, below.
  • FIG. 4 is a block diagram of the transmit beamformer 250, according to an exemplary embodiment.
  • the transmit beamformer 250 may include a transmission signal generator 251 and a time delay unit (also referred to herein as a “time delay device”) 252.
  • the transmission signal generator 251 may generate transmission signals (such as, for example, high frequency alternating current signals) that are to be applied to one or more ultrasonic elements T based on a control signal which is received from the controller 240.
  • the transmission signals generated by the transmission signal generator 251 may be provided to the time delay unit 252.
  • the time delay unit 252 delays each transmission signal generated by the transmission signal generator 251 in order to adjust a time at which the transmission signal arrives at the corresponding ultrasonic element T. If a transmission signal delayed by the time delay unit 252 is applied to the corresponding ultrasonic element T, the ultrasonic element T generates ultrasonic waves which correspond to the frequency of the transmission signal. Ultrasonic waves generated by the individual ultrasonic elements T are focused at the focal point. The location of the focal point at which ultrasonic waves generated by the ultrasonic elements T are focused may vary based on the type of delay pattern that is applied to transmission signals.
  • in one example, five ultrasonic elements t1 through t5 are provided, and three delay patterns that can be applied to transmission signals are respectively represented by using thick solid lines, medium solid lines, and thin solid lines.
  • the location of a focal point varies based on the type of delay pattern that is applied to transmission signals generated by the transmission signal generator 251. Accordingly, when a delay pattern is applied, ultrasonic waves that are to be applied to an object are focused at a fixed focal point (fixed-focusing). However, when two or more different delay patterns are applied, ultrasonic waves that are to be applied to the object are focused at several focal points (multi-focusing).
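A hedged sketch of how such a delay pattern might be computed for a one-dimensional array, assuming a uniform speed of sound; the element spacing, focal point, and function name are illustrative only.

```python
import numpy as np

def transmit_delays(element_x_m, focus_xz_m, speed_of_sound_m_s=1540.0):
    """Per-element transmit delays (seconds) for focusing at one focal point.

    element_x_m : 1D array of element positions along the array (metres)
    focus_xz_m  : (x, z) focal point, z being the depth (metres)
    Elements farther from the focal point fire earlier, so that the wavefronts
    from all elements arrive at the focal point at the same time."""
    fx, fz = focus_xz_m
    path = np.sqrt((np.asarray(element_x_m) - fx) ** 2 + fz ** 2)  # element-to-focus distance
    time_of_flight = path / speed_of_sound_m_s
    return time_of_flight.max() - time_of_flight                   # non-negative delay pattern

# Example with assumed geometry: 5 elements (t1..t5) pitched 0.3 mm, focus 30 mm deep on-axis
x = np.arange(5) * 0.3e-3
delays = transmit_delays(x, (x.mean(), 30e-3))
```

Applying a second, different delay pattern to subsequent transmissions would correspond to the multi-focusing case described above.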
  • ultrasonic waves generated by the individual ultrasonic elements T are fixed-focused at a focal point, or multi-focused at several focal points.
  • the focused ultrasonic waves are directed to the inside of the object.
  • the ultrasonic waves directed to the inside of the object are reflected from a target area of the object.
  • An ultrasonic echo which is reflected from the target area is received by the ultrasonic elements T.
  • the ultrasonic elements T convert the received ultrasonic echo into electrical signals.
  • the converted electrical signals will be simply referred to as received signals (ultrasonic echo signals).
  • the received signals output from the ultrasonic elements T are amplified and filtered, then converted into digital signals, and provided to the receive beamformer 260.
  • the receive beamformer 260 may perform receive beamforming on the received signals which are converted into the digital signals.
  • the receive beamforming is performed in order to correct time differences between the received signals output from the individual ultrasonic elements T, and then to focus the resultant received signals.
  • the receive beamforming will be described in more detail with reference to FIG. 5, below.
  • FIG. 5 is a block diagram of the receive beamformer 260, according to an exemplary embodiment.
  • the receive beamformer 260 may include a time-difference corrector 262 and a focusing unit (also referred to herein as a “focuser”) 261.
  • the time-difference corrector 262 delays a respective received signal which is output from each ultrasonic element T by a predetermined time period so that all received signals output from the individual ultrasonic elements T can be transferred to the focusing unit 261 at the same time.
  • the focusing unit 261 may focus the received signals based on the time-difference correction which is performed by the time-difference corrector 262. In particular, the focusing unit 261 may focus the received signals after allocating a predetermined weight (such as, for example, a beamforming coefficient) to each received signal in order to enhance or attenuate the corresponding received signal with respect to the other received signals.
  • the focused, received signal may be provided to the image processor 270.
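The time-difference correction and weighted focusing described above correspond to classical delay-and-sum beamforming. A simplified sketch is shown below, assuming sample-aligned delays and ignoring wrap-around effects; the function name, array shapes, and default weights are assumptions.

```python
import numpy as np

def delay_and_sum(rf, delays_samples, weights=None):
    """Delay-and-sum receive beamforming sketch.

    rf             : (num_elements, num_samples) array of received signals
    delays_samples : per-element delays, in whole samples, that align echoes
                     from the focal point in time (time-difference correction)
    weights        : optional per-element beamforming coefficients (apodization)"""
    num_elements, num_samples = rf.shape
    if weights is None:
        weights = np.ones(num_elements)
    focused = np.zeros(num_samples)
    for ch in range(num_elements):
        aligned = np.roll(rf[ch], -int(delays_samples[ch]))  # wrap-around ignored for brevity
        focused += weights[ch] * aligned                     # weighted focusing
    return focused

# Example with assumed sizes: 64 channels, 2048 samples, zero delays, uniform weights
echo_line = delay_and_sum(np.random.randn(64, 2048), np.zeros(64))
```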
  • the image processor 270 may acquire 3D volume data based on the received signal focused by the focusing unit 261 of the receive beamformer 260. Further, the image processor 270 may trace an operator's eyeline based on an image acquired by the photographing unit 210. Then, the image processor 270 may process the 3D volume data based on the traced operator's eyeline, thereby generating an ultrasonic image.
  • the ultrasonic image may include a section image which is obtained from 3D volume data, and/or a projection image which is obtained by volume-rendering 3D volume data with respect to a specific viewpoint.
  • the image processor 270 will be described in more detail with reference to FIG. 6, below.
  • FIG. 6 is a block diagram of the image processor 270, according to an exemplary embodiment.
  • the image processor 270 may include a volume data acquiring unit (also referred to herein as a “volume data acquirer” or a “volume data acquisition device”) 271, an eyeline tracing unit (also referred to herein as an “eyeline tracer” or an “eyeline tracing device”) 272, a section image generator 273, and a volume rendering unit (also referred to herein as a “volume renderer” or a “volume rendering device”) 274.
  • the volume data acquiring unit 271 may acquire 3D volume data from a received signal focused by the focusing unit 261 of the receive beamformer 260 (see FIG. 5).
  • the eyeline tracing unit 272 may trace an operator's eyeline based on an image acquired by the photographing unit 210 (see FIG. 3). More specifically, if the infrared emitter of the photographing unit 210 emits infrared light onto an operator's eye, the infrared light is reflected from the operator's cornea, and the reflected light (hereinafter, referred to as "cornea light") is photographed by the camera of the photographing unit 210. The eyeline tracing unit 272 may detect the location of the cornea light and the location of the operator's pupil from the image acquired by the photographing unit 210, and then obtain a trace of the operator's eyeline based on the results of the detection. The results of the eyeline tracing may be provided to the section image generator 273 and the volume rendering unit 274, which will be described below.
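A greatly simplified sketch of pupil-centre / corneal-reflection gaze estimation is shown below. The 2x2 calibration matrix and the mapping to a unit eyeline vector are assumptions for illustration and do not reflect the exact algorithm used by the eyeline tracing unit 272.

```python
import numpy as np

def estimate_eyeline(pupil_center_px, glint_center_px, calib_matrix):
    """Very simplified pupil-centre / corneal-reflection gaze estimate.

    The offset between the pupil centre and the corneal glint (the reflected
    infrared light) is mapped through a pre-computed 2x2 calibration matrix
    to a gaze direction; a real system would use a per-operator calibration."""
    offset = np.asarray(pupil_center_px, float) - np.asarray(glint_center_px, float)
    gaze_xy = calib_matrix @ offset                 # direction in display coordinates
    eyeline = np.append(gaze_xy, 1.0)               # assumed unit depth component
    return eyeline / np.linalg.norm(eyeline)        # unit eyeline vector

# Example with assumed pixel coordinates and an identity calibration matrix
eyeline = estimate_eyeline((412.0, 305.0), (400.0, 300.0), np.eye(2))
```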
  • the section image generator 273 may generate a section image which corresponds to a predetermined plane from the 3D volume data, based on the results of the eyeline tracing. More specifically, the section image generator 273 may generate a section image which corresponds to a plane which is perpendicular to the operator's eyeline, from the 3D volume data.
  • FIGS. 7A and 7B are views which illustrate section images that are generated by the section image generator 273 of FIG. 6.
  • FIG. 7A shows an example in which a section image corresponding to a y-z plane is generated from 3D volume data located in a 3D space that is defined by x, y, and z axes.
  • alternatively, a section image which corresponds instead to an x-y plane or a z-x plane may be generated.
  • FIG. 7B shows an example in which a section image corresponding to a plane which is not parallel to any of the x-y, y-z, and z-x planes is generated.
  • a plane to be used for generating a section image from among the planes which are perpendicular to the operator's eyeline may be selected by using any one or more of various methods.
  • a plane to be used for generating a section image from among planes which are perpendicular to an operator's eyeline may be automatically selected.
  • a plane which is located closest to the operator's viewpoint from among the planes which are perpendicular to the operator's eyeline may be automatically selected.
  • a plane which is located most distant from the operator's viewpoint from among the planes which are perpendicular to the operator's eyeline may be automatically selected.
  • as illustrated in FIG. 7B, when planes which are perpendicular to an operator's eyeline have different sizes, a plane having the largest size from among the planes which are perpendicular to the operator's eyeline may be automatically selected.
  • a plane which is used for generating a section image from among a plurality of planes which are perpendicular to an operator's eyeline may be manually selected.
  • a section image which corresponds to an arbitrary plane from among the planes which are perpendicular to the operator's eyeline is first generated, and the section image is displayed via the flexible display 220 (see FIG. 3).
  • if an instruction or command which designates a different plane is then received from the operator, the section image generator 273 may select a plane which corresponds to the instruction or command, and then generate a section image which corresponds to the selected plane.
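One possible way to sample a section image on a plane perpendicular to the traced eyeline is nearest-neighbour resampling of the volume, as sketched below; the axis order, plane anchor point, and sampling spacing are assumptions made for illustration.

```python
import numpy as np

def section_image(volume, eyeline, plane_point, size=(128, 128), spacing=1.0):
    """Nearest-neighbour sampling of a planar section perpendicular to the eyeline.

    volume      : 3D array of voxel intensities (index order assumed (z, y, x))
    eyeline     : 3-vector, the traced eyeline direction (the plane normal)
    plane_point : 3-vector on the chosen plane (e.g. the plane closest to the
                  operator's viewpoint)"""
    n = np.asarray(eyeline, float)
    n /= np.linalg.norm(n)
    u = np.cross(n, [0.0, 0.0, 1.0])            # first in-plane axis
    if np.linalg.norm(u) < 1e-9:                # eyeline parallel to the reference axis
        u = np.cross(n, [0.0, 1.0, 0.0])
    u /= np.linalg.norm(u)
    v = np.cross(n, u)                          # second in-plane axis
    rows, cols = size
    img = np.zeros(size)
    for i in range(rows):
        for j in range(cols):
            p = (np.asarray(plane_point, float)
                 + (i - rows / 2) * spacing * u
                 + (j - cols / 2) * spacing * v)
            idx = np.floor(p).astype(int)
            if np.all(idx >= 0) and np.all(idx < volume.shape):
                img[i, j] = volume[tuple(idx)]
    return img

# Example with assumed data: a random 64^3 volume sampled along a tilted eyeline
img = section_image(np.random.rand(64, 64, 64), (1.0, 0.5, 0.2), (32.0, 32.0, 32.0))
```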
  • the volume rendering unit 274 may perform volume rendering with respect to the 3D volume data based on the results of the eyeline tracing.
  • the volume rendering is performed in order to project 3D volume data to a 2D plane with respect to a predetermined viewpoint.
  • the volume rendering operation may be classified into surface rendering and direct volume rendering.
  • the surface rendering is performed in order to extract surface information from volume data based on predetermined scalar values and amounts of spatial changes, to convert the surface information into a geometric factor, such as a polygon or a surface patch, and then to apply a conventional rendering technique to the geometric factor.
  • a geometric factor such as a polygon or a surface patch
  • Examples of algorithms which are used to implement the surface rendering are a marching cubes algorithm and a dividing cubes algorithm.
  • the direct volume rendering is performed in order to directly render volume data without converting volume data into a geometric factor.
  • the direct volume rendering is useful to represent a translucent structure, because direct volume rendering effectively enables a visualization of the inside of an object as it is.
  • the direct volume rendering operation may be classified into an object-order method and an image-order method, according to the way in which the volume data is approached.
  • the object-order method is performed in order to search for volume data in its storage order and to synthesize each voxel with the corresponding pixel.
  • a representative example of an operation which implements the object-order method is splatting.
  • the image-order method is performed in order to sequentially decide pixel values in the order of scan lines of an image.
  • Examples of methods which implement the image-order method are Ray-Casting and Ray-Tracing.
  • in the Ray-Casting, a virtual ray is irradiated from a specific viewpoint toward a predetermined pixel of a display screen, and then voxels through which the virtual ray has been transmitted from among the voxels of the volume data are detected. Then, brightness values of the detected voxels are accumulated in order to determine a brightness value of the corresponding pixel of the display screen.
  • an average value of the detected voxels may be determined as a brightness value of the corresponding pixel of the display screen.
  • a weighted average value of the detected voxels may be determined as a brightness value of the corresponding pixel of the display screen.
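A compact sketch of the Ray-Casting accumulation described above is given below, supporting either a plain average or a front-weighted average of the traversed voxels; the step size, weighting scheme, and function name are assumptions.

```python
import numpy as np

def cast_ray(volume, origin, direction, step=1.0, mode="average"):
    """Accumulate voxel brightness along one virtual ray (simplified Ray-Casting).

    Marches from `origin` along `direction`, collects the voxels the ray passes
    through, and reduces them to a single pixel brightness using either a plain
    average or a front-weighted average of the detected voxels."""
    d = np.asarray(direction, float)
    d /= np.linalg.norm(d)
    pos = np.asarray(origin, float)
    samples = []
    while np.all(pos >= 0) and np.all(pos < np.asarray(volume.shape)):
        samples.append(volume[tuple(pos.astype(int))])     # nearest voxel along the ray
        pos = pos + step * d
    if not samples:
        return 0.0
    samples = np.asarray(samples, float)
    if mode == "average":
        return samples.mean()
    weights = np.linspace(1.0, 0.1, len(samples))           # voxels nearer the viewpoint weigh more
    return np.average(samples, weights=weights)

# Example with assumed data: one ray traced through a random 64^3 volume
pixel = cast_ray(np.random.rand(64, 64, 64), origin=(0.0, 32.0, 32.0), direction=(1.0, 0.0, 0.0))
```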
  • the Ray-Tracing is performed in order to trace a path of a ray which is traveling toward an observer's eyes. Unlike the Ray-Casting, which entails detecting an intersection at which a ray meets volume data, the Ray-Tracing can trace an irradiated ray and thereby determine parameters or qualities which relate to how the ray travels, such as reflection, refraction, etc. of the ray.
  • the Ray-Tracing can be classified into Forward Ray-Tracing and Backward Ray-Tracing.
  • the Forward Ray-Tracing is performed in order to model a phenomenon in which a ray irradiated from a virtual light source arrives at volume data to be reflected, scattered, or transmitted, thereby finding a ray which finally arrives at an observer's eyes.
  • the Backward Ray-Tracing is performed in order to backwardly trace a path of a ray which is traveling toward an observer's eyes.
  • the volume rendering unit 274 may perform volume rendering with respect to the 3D volume data by using any one or more of the above-mentioned volume rendering methods.
  • if volume rendering is performed with respect to a single viewpoint, a 2D projection image may be obtained.
  • if volume rendering is performed with respect to two viewpoints, two 2D projection images, that is, left and right images, may be obtained.
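A sketch of how left and right projections might be rendered from two horizontally offset viewpoints and combined is shown below; the render function is assumed to wrap whichever volume-rendering method is in use, and the side-by-side output is only one possible stereoscopic format.

```python
import numpy as np

def render_stereo(render_fn, volume, viewpoint, baseline):
    """Render left and right 2D projections from two horizontally offset
    viewpoints and combine them side by side.

    render_fn(volume, viewpoint) -> 2D image is an assumed callable wrapping
    whichever volume-rendering method (e.g. Ray-Casting) is in use."""
    viewpoint = np.asarray(viewpoint, float)
    offset = np.array([baseline / 2.0, 0.0, 0.0])
    left = render_fn(volume, viewpoint - offset)     # left-eye projection
    right = render_fn(volume, viewpoint + offset)    # right-eye projection
    return np.hstack([left, right])                  # one possible stereoscopic layout
```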
  • a 2D projection image and/or a 3D stereoscopic image which are acquired by the volume rendering unit 274 and a section image generated by the section image generator 273 may be displayed via the flexible display 220.
  • An indicator which relates to a type of images to be displayed via the flexible display 220 may be set in advance by the operator. In addition, even when a predetermined type of images is displayed via the flexible display 220, a different type of images can be displayed if an instruction or command for changing the type of images to be displayed is received.
  • the storage unit 280 may store data and/or algorithms which are required for operations of the ultrasonic imaging apparatus 200.
  • the storage unit 280 may store any one or more of an algorithm for tracing an operator's eyeline from an image acquired via the photographing unit 210, an algorithm for generating a section image from 3D volume data, and an algorithm for volume rendering 3D volume data.
  • the storage unit 280 may store values which are set in advance by the operator.
  • the storage unit 280 may store information which relates to a type of ultrasonic images to be displayed via the flexible display 220, information relating to whether or not to automatically generate a section image, and a criterion for selecting a plane for generating a section image from among planes which are perpendicular to an operator's eyeline.
  • the storage unit 280 may include any one or more of Read Only Memory (ROM), Random Access Memory (RAM), Programmable Read Only Memory (PROM), Erasable Programmable Read Only Memory (EPROM), flash memory, Hard Disk Drive (HDD), Optical Disk Drive (ODD), and/or a combination thereof.
  • the storage unit 280 is not limited to these, and may include any other storage device which is well-known in the art.
  • the above-described exemplary embodiment relates to an example in which the image processor 270 includes both the section image generator 273 and the volume rendering unit 274.
  • as illustrated in FIG. 9, the image processor 270 may include the volume data acquiring unit 271, the eyeline tracing unit 272, and the section image generator 273. Because these components have been described above with reference to FIGS. 6, 7A, and 7B, further descriptions will be omitted. If the image processor 270 has a configuration as illustrated in FIG. 9, a section image may be automatically generated from 3D volume data even when no instruction or command from an operator is received.
  • as illustrated in FIG. 10, the image processor 270 may include the volume data acquiring unit 271, the eyeline tracing unit 272, and the volume rendering unit 274. Because these components have been described above with reference to FIGS. 6 to 8, further descriptions will be omitted. If the image processor 270 has a configuration as illustrated in FIG. 10, a 2D projection image and/or a 3D stereoscopic image may be generated from 3D volume data. The type of images to be displayed via the flexible display 220 from among 2D projection images and 3D stereoscopic images may be set in advance by an operator. Further, even when a predetermined type of images is displayed via the flexible display 220, a different type of images can be displayed if an instruction or command for changing the type of images to be displayed is received.
  • FIG. 11 is a flowchart which illustrates a control method which is executable by using the ultrasonic imaging apparatus 200 including the image processor 270 of FIG. 6, according to an exemplary embodiment.
  • the flexible probe 230 irradiates ultrasonic waves toward the object 10, and receives at least one ultrasonic echo which is reflected from the object 10.
  • the irradiation of ultrasonic waves and the reception of the at least one ultrasonic echo may be performed by at least one ultrasonic element T (see FIG. 4), such as, for example, at least one ultrasonic transducer.
  • the ultrasonic element T converts the received ultrasonic echo into an electrical signal in order to output a received signal.
  • the received signal output from the ultrasonic element T is amplified and filtered, and then converted into a digital signal.
  • the received signal converted into the digital signal may be received and focused by the receive beamformer 260.
  • 3D volume data may be acquired based on the received signal focused by the receive beamformer 260.
  • the 3D volume data may be acquired by the volume data acquiring unit 271 of the image processor 270.
  • in operation S730, if the photographing unit 210 photographs the operator in order to acquire an image of the operator, the eyeline of the operator may be traced from the acquired image.
  • Operation S730 of tracing the operator's eyeline may include sub-operations of: emitting infrared light onto the operator; photographing cornea light which is reflected from the operator's cornea; detecting the location of the cornea light and the location of the operator's pupil from an image which is obtained by photographing the cornea light; and tracing the operator's eyeline based on the results of the detection.
  • next, it is determined whether or not volume rendering should be performed. Whether or not volume rendering should be performed may be determined according to the type of images to be displayed. For example, if the type of images to be displayed has been set to section images, a determination may be made that volume rendering need not be performed. Conversely, if the type of images to be displayed has been set to 2D projection images or 3D stereoscopic images, a determination may be made that volume rendering should be performed.
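This decision can be summarized as a small dispatch, sketched below with the rendering and section-generation steps passed in as functions; the image-type labels and function signatures are assumptions and stand in for the steps sketched earlier.

```python
def build_display_image(volume, eyeline, image_type, render_projection, generate_section):
    """Miniature version of the decision above: volume-render when a projection or
    stereoscopic image is wanted, otherwise generate a section image.

    render_projection(volume, eyeline, stereo) and generate_section(volume, eyeline)
    are assumed callables standing in for the steps sketched earlier."""
    if image_type in ("2d_projection", "3d_stereoscopic"):
        # Volume rendering should be performed (e.g. operation S750).
        return render_projection(volume, eyeline, stereo=(image_type == "3d_stereoscopic"))
    # Volume rendering need not be performed; a section image is generated instead (e.g. S770).
    return generate_section(volume, eyeline)
```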
  • if it is determined that volume rendering should be performed, volume rendering may be performed with respect to the 3D volume data according to the operator's eyeline in operation S750.
  • operation S750 of performing volume rendering with respect to the 3D volume data may include performing volume rendering on the 3D volume data with respect to a predetermined viewpoint in order to generate a 2D projection image.
  • operation S750 of performing volume rendering with respect to the 3D volume data may include performing volume rendering on the 3D volume data with respect to two viewpoints in order to generate left and right images, and synthesizing the left image with the right image in order to generate a 3D stereoscopic image.
  • the results of the volume rendering may be displayed via the flexible display 220.
  • the 2D projection image may be displayed via the flexible display 220.
  • the 3D stereoscopic image may be displayed via the flexible display 220.
  • if it is determined that volume rendering need not be performed, a section image which corresponds to a plane which is perpendicular to the operator's eyeline may be generated from the 3D volume data in operation S770.
  • Operation S770 of generating the section image may include selecting a predetermined plane from among a plurality of planes which are perpendicular to the operator's eyeline, and generating a section image which corresponds to the selected plane. Selecting the predetermined plane from among the plurality of planes which are perpendicular to the operator's eyeline may be automatically performed, or may be manually performed based on an instruction or command from the operator.
  • the generated section image may be displayed via the flexible display 220.
  • FIG. 12 is a flowchart which illustrates a control method which is executable by using the ultrasonic imaging apparatus 200 including the image processor 270 of FIG. 9, according to another exemplary embodiment.
  • the operator causes the flexible probe 230 of the body 201 of the ultrasonic imaging apparatus 200 to contact the object 10, and then fixes the flexible probe 230 on the object 10 by using the belt 290.
  • the flexible probe 230 irradiates ultrasonic waves toward the object 10, and receives at least one ultrasonic echo which is reflected from the object 10.
  • the irradiation of ultrasonic waves and the reception of the at least one ultrasonic echo may be performed by at least one ultrasonic element T (see FIG. 4), such as, for example, at least one ultrasonic transducer.
  • the ultrasonic element T converts the received at least one ultrasonic echo into an electrical signal in order to output a received signal.
  • the received signal output from the ultrasonic element T is amplified and filtered, and then converted into a digital signal.
  • the received signal converted into the digital signal may be received and focused by the receive beamformer 260.
  • 3D volume data may be acquired based on the received signal focused by the receive beamformer 260.
  • the 3D volume data may be acquired by the volume data acquiring unit 271 of the image processor 270.
  • in operation S830, if the photographing unit 210 photographs the operator in order to acquire an image of the operator, the eyeline of the operator may be traced based on the acquired image.
  • Operation S830 of tracing the operator's eyeline may include sub-operations of: emitting infrared light toward the operator; photographing cornea light which is reflected from the operator's cornea; detecting the location of the cornea light and the location of the operator's pupil from an image which is obtained by photographing the cornea light; and tracing the operator's eyeline based on the results of the detection.
  • in operation S870, a section image which corresponds to a plane which is perpendicular to the operator's eyeline may be generated from the acquired 3D volume data.
  • operation S870 of generating the section image may include selecting a predetermined plane from among a plurality of planes which are perpendicular to the operator's eyeline, and generating a section image which corresponds to the selected plane. Selecting the predetermined plane from among the plurality of planes which are perpendicular to the operator's eyeline may be automatically performed, or may be manually performed based on an instruction or command from the operator.
  • the generated section image may be displayed via the flexible display 220.
  • FIG. 13 is a flowchart which illustrates a control method which is executable by using the ultrasonic imaging apparatus 200 including the image processor 270 of FIG. 10, according to still another exemplary embodiment.
  • the operator causes the flexible probe 230 of the body 201 of the ultrasonic imaging apparatus 200 to contact the object 10, and then fixes the flexible probe 230 on the object 10 by using the belt 290.
  • the flexible probe 230 irradiates ultrasonic waves toward the object 10, and receives at least one ultrasonic echo which is reflected from the object 10.
  • the irradiation of ultrasonic waves and the reception of the at least one ultrasonic echo may be performed by at least one ultrasonic element T (see FIG. 4), such as, for example, at least one ultrasonic transducer.
  • the ultrasonic element T converts the received at least one ultrasonic echo into an electrical signal in order to output a received signal.
  • the received signal output from the ultrasonic element T is amplified and filtered, and then converted into a digital signal.
  • the received signal converted into the digital signal may be received and focused by the receive beamformer 260.
  • 3D volume data may be acquired based on the received signal focused by the receive beamformer 260.
  • the 3D volume data may be acquired by the volume data acquiring unit 271 of the image processor 270.
  • in operation S930, if the photographing unit 210 photographs the operator in order to acquire an image of the operator, the eyeline of the operator may be traced based on the acquired image. Operation S930 of tracing the operator's eyeline may include sub-operations of: emitting infrared light toward the operator; photographing cornea light which is reflected from the operator's cornea; detecting the location of the cornea light and the location of the operator's pupil from an image which is obtained by photographing the cornea light; and tracing the operator's eyeline based on the results of the detection.
  • in operation S950, volume rendering may be performed with respect to the acquired 3D volume data according to the operator's eyeline.
  • operation S950 of performing volume rendering with respect to the 3D volume data may include volume rendering on the 3D volume data with respect to a predetermined viewpoint in order to generate a 2D projection image.
  • operation S950 of performing volume rendering with respect to the 3D volume data may include performing volume rendering on the 3D volume data with respect to two viewpoints in order to generate left and right images, and synthesizing the left image with the right image in order to generate a 3D stereoscopic image.
  • the results of the volume rendering may be displayed via the flexible display 220.
  • the generated 2D projection image may be displayed via the flexible display 220.
  • the 3D stereoscopic image may be displayed via the flexible display 220.
  • the methods and/or operations described above may be recorded, stored, or fixed in one or more transitory or non-transitory computer-readable storage media that includes program instructions to be implemented by a computer to cause a processor to execute or perform the program instructions.
  • the media may also include, alone or in combination with the program instructions, data files, data structures, and the like.
  • Examples of computer-readable storage media include magnetic media, such as hard disks, floppy disks, and magnetic tape; optical media such as compact disk read-only memory (CD ROM) disks and digital versatile disks (DVDs); magneto-optical media, such as optical disks; and hardware devices that are specially configured to store and perform program instructions, such as read-only memory (ROM), random access memory (RAM), flash memory, and the like.
  • Examples of program instructions include machine code, such as that produced by a compiler, and files containing higher level code that may be executed by the computer by using an interpreter.
  • the described hardware devices may be configured to act as one or more software modules in order to perform the operations and methods described above, or vice versa.
  • a computer-readable storage medium may be distributed among computer systems connected through a network and computer-readable codes or program instructions may be stored and executed in a decentralized manner.
  • functional programs, codes and code segments to implement those exemplary embodiments may be easily inferred by programmers who are skilled in the related art.

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Veterinary Medicine (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biophysics (AREA)
  • Public Health (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Acoustics & Sound (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • General Engineering & Computer Science (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

Disclosed herein are an ultrasonic imaging apparatus which is capable of providing intuitive ultrasonic images, and a control method which is implementable by using the apparatus. According to an exemplary embodiment, there is provided an ultrasonic imaging apparatus including: a flexible probe which has a planar shape, and which is configured to emit ultrasonic waves toward a subject and to receive an ultrasonic echo which is reflected from the subject; an image processor which is configured to trace an operator's eyeline from an image acquired by photographing the operator, and to process three-dimensional (3D) volume data acquired from the ultrasonic echo based on the traced eyeline, thereby generating an ultrasonic image; and a flexible display which has a planar shape, which is disposed on one side of the flexible probe, and which is configured to display the ultrasonic image.

Description

ULTRASONIC IMAGING APPARATUS AND CONTROL METHOD THEREOF
Exemplary embodiments relate to an ultrasonic imaging apparatus and a control method thereof, and more particularly, to an ultrasonic imaging apparatus which is capable of providing intuitive ultrasonic images, and a control method thereof.
Medical imaging apparatuses include an X-ray imaging apparatus, a fluoroscopy system, a Computerized Tomography (CT) scanner, a Magnetic Resonance Image (MRI) apparatus, Positron Emission Tomography (PET), and an ultrasonic imaging apparatus.
The ultrasonic imaging apparatus irradiates ultrasonic waves toward an object (e.g., a human body), and receives an ultrasonic echo which is reflected from the inside of the object so as to non-invasively acquire section images which relate to the inner tissues of the object or images which relate to blood vessels of the object based on the ultrasonic echo.
The ultrasonic imaging apparatus has the advantages of being compact and inexpensive compared to other medical imaging apparatuses, and of displaying images in real time. In addition, the ultrasonic imaging apparatus is relatively safe, because there is no risk that patients will be exposed to radiation such as X-rays. Because of these advantages, the ultrasonic imaging apparatus is widely used to diagnose medical conditions which relate to the heart, breasts, abdomen, urinary organs, uterus, and other parts of the human body.
Therefore, it is an aspect of one or more exemplary embodiments to provide an ultrasonic imaging apparatus which is capable of providing intuitive ultrasonic images, and a control method thereof.
Additional aspects of the exemplary embodiments will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the exemplary embodiments.
In accordance with one aspect of one or more exemplary embodiments, an ultrasonic imaging apparatus includes: a flexible probe which has a planar shape, and which is configured to emit ultrasonic waves toward a subject, and to receive at least one ultrasonic echo which is reflected from the subject; an image processor which is configured to trace an eyeline of an operator from an image which is acquired by photographing the operator, and to generate an ultrasonic image by processing three-dimensional (3D) volume data which is acquired from the at least one ultrasonic echo based on the traced eyeline; and a flexible display which has a planar shape, and which is disposed on one side of the flexible probe, and which is configured to display the generated ultrasonic image.
In accordance with another aspect of one or more exemplary embodiments, a control method which is executable by using an ultrasonic imaging apparatus includes: emitting ultrasonic waves toward a subject by using a flexible probe which has a planar shape, and receiving at least one ultrasonic echo which is reflected from the subject; tracing an eyeline of an operator from an image which is acquired by photographing the operator, and generating an ultrasonic image by processing 3D volume data which is acquired from the at least one ultrasonic echo based on the traced eyeline; and displaying the generated ultrasonic image by using a flexible display which has a planar shape and which is disposed on one side of the flexible probe.
Therefore, according to the ultrasonic imaging apparatus as described above, because the flexible probe and the flexible display are embodied integrally with each other, it is possible to provide an intuitive ultrasonic image.
These and/or other aspects will become apparent and more readily appreciated from the following description of exemplary embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a perspective view of an ultrasonic imaging apparatus, according to an exemplary embodiment;
FIG. 2 is a perspective view of an ultrasonic imaging apparatus, according to another exemplary embodiment;
FIG. 3 is a block diagram of an ultrasonic imaging apparatus, according to an exemplary embodiment;
FIG. 4 is a block diagram of a transmit beamformer, according to an exemplary embodiment;
FIG. 5 is a block diagram of a receive beamformer, according to an exemplary embodiment;
FIG. 6 is a block diagram of an image processor, according to an exemplary embodiment;
FIGS. 7A and 7B are views which illustrate section images that are generated by the section image generator of FIG. 6;
FIG. 8 is a view which illustrates a projection image that is generated by the image processor of FIG. 6;
FIG. 9 is a block diagram of an image processor, according to another exemplary embodiment;
FIG. 10 is a block diagram of an image processor, according to still another exemplary embodiment;
FIG. 11 is a flowchart which illustrates a control method which is executable by using an ultrasonic imaging apparatus including the image processor of FIG. 6, according to an exemplary embodiment;
FIG. 12 is a flowchart which illustrates a control method which is executable by using an ultrasonic imaging apparatus including the image processor of FIG. 9, according to another exemplary embodiment; and
FIG. 13 is a flowchart which illustrates a control method which is executable by using an ultrasonic imaging apparatus including the image processor of FIG. 10, according to still another exemplary embodiment.
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to like elements throughout.
Hereinafter, exemplary embodiments of an ultrasonic imaging apparatus and a control method thereof will be described in detail with reference to the accompanying drawings.
The ultrasonic imaging apparatus irradiates ultrasonic waves toward a target area of an object, receives at least one ultrasonic echo which is reflected from the target area of the object, converts the received at least one ultrasonic echo into electrical signals, and then creates an ultrasonic image which relates to the target area based on the electrical signals.
FIG. 1 is a perspective view of an ultrasonic imaging apparatus 200, according to an exemplary embodiment.
Referring to FIG. 1, the ultrasonic imaging apparatus 200 may include a body 201, and a belt 290 which is connected to the body 201.
The body 201 may include a flexible probe 230, a flexible display 220, and one or more photographing devices 210.
The flexible probe 230 contacts the skin surface of an object 10, and may have a planar shape, such as, for example, a quadrangle shape. Although not illustrated in FIG. 1, the flexible probe 230 may include a flexible substrate and a plurality of ultrasonic elements T which are mounted on the flexible substrate.
The flexible substrate may be flexible, foldable, and/or bendable. Accordingly, the flexible substrate may be made from one or more of plastic (a polymer film), a metal foil, and/or thin glass.
The ultrasonic elements T irradiate ultrasonic waves toward the object 10, receive at least one ultrasonic echo which is reflected from the inside of the object 10, and convert the received at least one ultrasonic echo into electrical signals. For example, each ultrasonic element T may include an ultrasonic generator for generating ultrasonic waves, and an ultrasonic receiver for receiving an ultrasonic echo and converting the received ultrasonic echo into electrical signals. Alternatively, each ultrasonic element T may itself perform both functions of generating ultrasonic waves and receiving an ultrasonic echo.
The ultrasonic element T may include an ultrasonic transducer. A transducer is a device for converting a specific first type of energy into a second type of energy. For example, an ultrasonic transducer may convert electrical energy into wave energy, and wave energy into electrical energy. In particular, the ultrasonic transducer may perform the functions of an ultrasonic generator and an ultrasonic receiver.
The ultrasonic transducer may include a piezoelectric material and/or a piezoelectric thin film. When alternating current from an internal power source, such as a battery, or from an external power source is applied to the piezoelectric material or the piezoelectric thin film, the piezoelectric material or the piezoelectric thin film vibrates at a specific frequency, and ultrasonic waves of the specific frequency are generated according to the vibration frequency. Further, when an ultrasonic echo of a specific frequency arrives at the piezoelectric material or the piezoelectric thin film, the piezoelectric material or the piezoelectric thin film vibrates at the specific frequency of the ultrasonic echo so as to output an alternating current of the specific frequency which corresponds to the vibration frequency.
The ultrasonic transducer may include a magnetostrictive ultrasonic transducer which uses the magnetostrictive effect of a magnetic material, a piezoelectric ultrasonic transducer which uses the piezoelectric effect of a piezoelectric material, and/or a capacitive micromachined ultrasonic transducer (CMUT) that transmits and receives ultrasonic waves by using a vibration of several hundreds or thousands of micromachined thin films. However, the ultrasonic transducer may include any other type of transducer which is capable of generating ultrasonic waves according to an electrical signal, or which is capable of generating an electrical signal according to ultrasonic waves.
For example, a plurality of ultrasonic transducers may be arranged in a matrix form on a flexible substrate. In this case, via a one-time transmission of ultrasonic waves, 3D volume data can be acquired.
As another example, a plurality of ultrasonic transducers may be arranged in a line on a flexible substrate such that they are movable with respect to the flexible substrate. In this case, rails (not shown), on which both ends of the plurality of transducers are placed, may be provided perpendicular to the arrangement direction of the transducers. Accordingly, by moving the transducers along the rails in a scanning direction, a plurality of ultrasonic images can be acquired, and the ultrasonic images are accumulated in order to obtain 3D volume data.
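By way of a purely illustrative sketch that is not part of the disclosure, the following Python fragment shows how the 2D frames captured at successive rail positions might be accumulated into 3D volume data; the callback name scan_frame, the array shapes, and the dummy frame source are assumptions introduced only for this example.

    import numpy as np

    def acquire_volume(scan_frame, num_positions, frame_shape):
        # Accumulate 2D ultrasonic frames captured at successive rail positions
        # into a single 3D volume with shape (position, depth, width).
        # scan_frame(i) is a hypothetical callback returning the (depth, width)
        # frame acquired at rail position i.
        volume = np.empty((num_positions, *frame_shape), dtype=np.float32)
        for i in range(num_positions):
            volume[i] = scan_frame(i)
        return volume

    # Usage with a dummy frame source standing in for the real probe.
    rng = np.random.default_rng(0)
    vol = acquire_volume(lambda i: rng.random((128, 96), dtype=np.float32),
                         num_positions=64, frame_shape=(128, 96))
    print(vol.shape)  # (64, 128, 96)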
Hereinafter, an example in which a plurality of ultrasonic transducers are arranged in a matrix form on a flexible substrate will be described with reference to FIG. 1.
The flexible display 220 is disposed on the flexible probe 230. The flexible display 220 may have, similarly as the flexible probe 230, a planar shape, such as, for example, a quadrangle shape. The flexible display 220 may include any one or more of a Liquid Crystal Display (LCD), an Organic Light Emitting Diode (OLED), and/or an Electrophoretic Display (EPD).
The flexible display 220 may include a flexible substrate, and a Thin Film Transistor (TFT) array which is formed on the flexible substrate.
The flexible substrate may be flexible, foldable, and/or bendable. Accordingly, the flexible substrate may be made from at least one of plastic (a polymer film), a metal foil, and/or thin glass.
The TFT array has a structure in which a plurality of TFTs are arranged. The TFTs are located to correspond to individual pixels of a display screen. Each TFT controls current flowing through a thin film type semiconductor by applying an electric field in a direction which is perpendicular to the direction of the current flow. The TFT is a kind of Field Effect Transistor (FET). Although not illustrated in FIG. 1, the TFT may include a source electrode and a drain electrode spaced apart on a substrate, a semiconductor thin film formed across the source electrode and the drain electrode, an insulating layer formed on the semiconductor thin film, and a gate formed on the insulating layer. The source electrode and the drain electrode may be formed by depositing aluminum or indium on the substrate. The semiconductor thin film may be formed by depositing cadmium sulfide (CdS) on the source electrode and the drain electrode. The insulating layer may be formed by depositing silicon oxide (SiO2) on the semiconductor thin film.
The flexible display 220 may have a display function. For example, the flexible display 220 may display an ultrasonic image. The ultrasonic image may include any one or more of a section image which is obtainable from 3D volume data, and a projection image which is obtainable by volume-rendering 3D volume data with respect to a specific viewpoint. A type of ultrasonic images to be displayed via the flexible display 220 may be set in advance by an operator. Information relating to the type of ultrasonic images to be displayed via the flexible display 220 may be stored in a storage unit (see, e.g., item 280 of FIG. 2) of the ultrasonic imaging apparatus 200.
In addition to the display function, the flexible display 220 may function as an input device. For example, the flexible display 220 may be implemented as a touch display. In this case, a display screen of the flexible display 220 may display menus in the forms of images, numerals, and/or any other suitable format. The menus may be displayed in a separate area from ultrasonic images, and/or may overlap ultrasonic images. An operator may touch one of the displayed menus with his/her finger or a stylus pen, thereby inputting an instruction or command for manipulating the ultrasonic imaging apparatus 200. Alternatively, the operator may mark a lesion area in an ultrasonic image by using his/her finger or a stylus pen.
In the case in which the flexible display 220 has only a display function, the ultrasonic imaging apparatus 200 may further include an input device and/or an input unit (not shown) which is configured for enabling an operator to input instructions or commands, and/or a communication device and/or a communication unit (not shown) which is configured for receiving the instructions or commands input by the operator from an external device. The communication device and/or the communication unit may receive the instructions or commands from the external device via wired communication and/or wireless communication.
Further, main components of the ultrasonic imaging apparatus 200 may be disposed between the flexible probe 230 and the flexible display 220. For example, any one or more of a controller, a transmit beamformer, a receive beamformer, an image processor, and a storage unit (see 240, 250, 260, 270, and 280 of FIG. 3) may be disposed between the flexible probe 230 and the flexible display 220. Details about the above-mentioned components will be described below with reference to FIG. 3.
The photographing units (also referred to herein as photographing devices) 210 may photograph the operator in order to acquire an image of the operator. More specifically, each photographing unit 210 may include an infrared emitter which is configured for emitting infrared light toward the operator's eye, and a camera which is configured for photographing infrared light which is reflected from the operator's cornea. The infrared emitter may include a Light Emitting Diode (LED). An image which is acquired by the camera may be used to trace the operator's eyeline. A degree of reflection of the infrared light varies depending on the operator's eyeline. Accordingly, the operator's eyeline can be determined based on the reflected infrared light.
Further, the number and installation locations of the photographing units 210 may vary.
For example, as illustrated in FIG. 1, two photographing units 210 may be installed in one edge of the flexible display 220. In this case, the two photographing units 210 may be arranged in the center of one edge of the flexible display 220 such that they are spaced apart by a predetermined distance from each other. Alternatively, the two photographing units 210 may be respectively installed in left and right corners of the flexible display 220.
As another example, as illustrated in FIG. 2, a single photographing unit 210 may be installed in one edge of the flexible display 220. In this case, the photographing unit 210 may be located in the center of one edge of the flexible display 220. However, the location of the photographing unit 210 is not limited to this, and therefore, as another example, the photographing unit 210 may be installed in one of four corners of the flexible display 220.
As still another example, the photographing unit 210 may be installed in wearable equipment, such as, e.g., glasses, goggles, or a headset. In this case, images photographed by the photographing unit 210 may be transmitted to the body 201 via wired communication or wireless communication.
The belt 290 may be connected to both ends of the body 201. The belt 290 acts to fasten the body 201 to the object 10, such as, for example, a patient's abdomen. Further, the belt 290 acts to cause the flexible probe 230 to closely contact the object 10. The belt 290 may be configured to have an adjustable length. In addition, the belt 290 may be made of a material having elasticity.
FIG. 3 is a block diagram of an ultrasonic imaging apparatus 200, according to an exemplary embodiment.
Referring to FIG. 3, the ultrasonic imaging apparatus 200 may include a photographing unit 210, a flexible display 220, a flexible probe 230, a controller 240, a transmit beamformer 250, a receive beamformer 260, an image processor 270, and a storage unit (also referred to herein as a “storage” or as a “storage device”) 280.
Because the photographing unit 210, the flexible display 220, and the flexible probe 230 have been described above with reference to FIGS. 1 and 2, further descriptions thereof will be omitted.
The controller 240 may control the entire operation of the ultrasonic imaging apparatus 200. More specifically, the controller 240 may generate a control signal for controlling at least one of the transmit beamformer 250, the receive beamformer 260, the image processor 270, the storage unit 280, and the flexible display 220 based on an instruction or command received from an operator or an external device. For example, when an operator marks a lesion area by using his/her finger or a stylus pen, the controller 240 may display a solid line which corresponds to the operator's marking so that the lesion area can be distinguished from other areas.
The transmit beamformer 250 may perform transmit beamforming. The transmit beamforming is performed in order to focus ultrasonic waves which are generated by one or more ultrasonic elements T to a focal point. In particular, the transmit beamforming may be performed by coordinating one or more ultrasonic elements T to generate ultrasonic waves in an appropriate order in order to compensate for time differences with which ultrasonic waves generated by the ultrasonic elements T arrive at a focal point. The transmit beamforming will be described in more detail with reference to FIG. 4, below.
FIG. 4 is a block diagram of the transmit beamformer 250, according to an exemplary embodiment. Referring to FIG. 4, the transmit beamformer 250 may include a transmission signal generator 251 and a time delay unit (also referred to herein as a “time delay device”) 252.
The transmission signal generator 251 may generate transmission signals (such as, for example, high frequency alternating current signals) that are to be applied to one or more ultrasonic elements T based on a control signal which is received from the controller 240. The transmission signals generated by the transmission signal generator 251 may be provided to the time delay unit 252.
The time delay unit 252 delays each transmission signal generated by the transmission signal generator 251 in order to adjust a time at which the transmission signal arrives at the corresponding ultrasonic element T. If a transmission signal delayed by the time delay unit 252 is applied to the corresponding ultrasonic element T, the ultrasonic element T generates ultrasonic waves which correspond to the frequency of the transmission signal. Ultrasonic waves generated by the individual ultrasonic elements T are focused at the focal point. The location of the focal point at which ultrasonic waves generated by the ultrasonic elements T are focused may vary based on the type of delay pattern that is applied to transmission signals.
In the exemplary embodiment of FIG. 4, five ultrasonic elements t1 through t5 are provided, and three delay patterns that can be applied to transmission signals are respectively represented by using thick solid lines, medium solid lines, and thin solid lines.
When the delay pattern represented as the thick solid lines is applied to transmission signals generated by the transmission signal generator 251, ultrasonic waves generated by the ultrasonic elements t1 through t5 are focused at a first focal point F1.
When the delay pattern represented as the medium solid lines is applied to transmission signals generated by the transmission signal generator 251, ultrasonic waves generated by the ultrasonic elements t1 through t5 are focused at a second focal point F2 which is more distant than the first focal point F1.
When the delay pattern represented as the thin solid lines is applied to transmission signals generated by the transmission signal generator 251, ultrasonic waves generated by the ultrasonic elements t1 through t5 are focused at a third focal point F3 which is more distant than the second focal point F2.
As described above, the location of a focal point varies based on the type of delay pattern that is applied to transmission signals generated by the transmission signal generator 251. Accordingly, when a delay pattern is applied, ultrasonic waves that are to be applied to an object are focused at a fixed focal point (fixed-focusing). However, when two or more different delay patterns are applied, ultrasonic waves that are to be applied to the object are focused at several focal points (multi-focusing).
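The relationship between a delay pattern and its focal point can be made concrete with the short sketch below; it is an illustration under stated assumptions (element coordinates, an assumed speed of sound, and the helper name transmit_delays), not the delay computation actually used by the time delay unit 252.

    import numpy as np

    SPEED_OF_SOUND = 1540.0  # m/s, a typical soft-tissue value (assumption)

    def transmit_delays(element_positions, focal_point, c=SPEED_OF_SOUND):
        # Per-element transmit delays (seconds) chosen so that the waves emitted
        # by all elements arrive at the focal point simultaneously: the farthest
        # element fires first (zero delay) and nearer elements wait.
        dist = np.linalg.norm(element_positions - focal_point, axis=1)
        return (dist.max() - dist) / c

    # Five elements in a line, focused at three increasingly distant points
    # (compare the focal points F1, F2, and F3 of FIG. 4).
    elements = np.column_stack([np.linspace(-2e-3, 2e-3, 5),
                                np.zeros(5), np.zeros(5)])
    for depth in (10e-3, 20e-3, 40e-3):
        print(transmit_delays(elements, np.array([0.0, 0.0, depth])))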
As such, ultrasonic waves generated by the individual ultrasonic elements T are fixed-focused at a focal point, or multi-focused at several focal points. The focused ultrasonic waves are directed to the inside of the object. The ultrasonic waves directed to the inside of the object are reflected from a target area of the object. An ultrasonic echo which is reflected from the target area is received by the ultrasonic elements T. Then, the ultrasonic elements T convert the received ultrasonic echo into electrical signals. Hereinafter, the converted electrical signals will be simply referred to as received signals (ultrasonic echo signals). The received signals output from the ultrasonic elements T are amplified and filtered, then converted into digital signals, and provided to the receive beamformer 260.
Referring again to FIG. 3, the receive beamformer 260 may perform receive beamforming on the received signals which are converted into the digital signals. The receive beamforming is performed in order to correct time differences between the received signals output from the individual ultrasonic elements T, and then to focus the resultant received signals. The receive beamforming will be described in more detail with reference to FIG. 5, below.
FIG. 5 is a block diagram of the receive beamformer 260, according to an exemplary embodiment. Referring to FIG. 5, the receive beamformer 260 may include a time-difference corrector 262 and a focusing unit (also referred to herein as a “focuser”) 261.
The time-difference corrector 262 delays a respective received signal which is output from each ultrasonic element T by a predetermined time period so that all received signals output from the individual ultrasonic elements T can be transferred to the focusing unit 261 at the same time.
The focusing unit 261 may focus the received signals based on the time-difference correction which is performed by the time-difference corrector 262. In particular, the focusing unit 261 may focus the received signals after allocating a predetermined weight (such as, for example, a beamforming coefficient) to each received signal in order to enhance or attenuate the corresponding received signal with respect to the other received signals. The focused, received signal may be provided to the image processor 270.
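A minimal delay-and-sum sketch of this receive path follows; the integer sample delays, the apodization weights, and the helper name delay_and_sum are assumptions for illustration, not the beamformer actually implemented in the apparatus.

    import numpy as np

    def delay_and_sum(rx_signals, delays_samples, weights=None):
        # rx_signals:     (N, T) array, one row of samples per ultrasonic element.
        # delays_samples: (N,) integer sample delays correcting the arrival-time
        #                 differences (the role of the time-difference corrector 262).
        # weights:        (N,) optional beamforming coefficients applied before the
        #                 summation (the role of the focusing unit 261).
        n, t = rx_signals.shape
        weights = np.ones(n) if weights is None else np.asarray(weights, float)
        aligned = np.zeros((n, t))
        for i, d in enumerate(np.asarray(delays_samples, int)):
            aligned[i, : t - d] = rx_signals[i, d:]  # shift each channel earlier by d samples
        return weights @ aligned                      # focused received signal, shape (T,)

    # Three channels whose echoes arrive 0, 2, and 4 samples late:
    sig = np.zeros((3, 32)); sig[0, 10] = sig[1, 12] = sig[2, 14] = 1.0
    print(delay_and_sum(sig, [0, 2, 4]).argmax())     # 10, the echoes add coherently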
The image processor 270 may acquire 3D volume data based on the received signal focused by the focusing unit 261 of the receive beamformer 260. Further, the image processor 270 may trace an operator's eyeline based on an image acquired by the photographing unit 210. Then, the image processor 270 may process the 3D volume data based on the traced operator's eyeline, thereby generating an ultrasonic image. The ultrasonic image may include a section image which is obtained from 3D volume data, and/or a projection image which is obtained by volume-rendering 3D volume data with respect to a specific viewpoint. The image processor 270 will be described in more detail with reference to FIG. 6, below.
FIG. 6 is a block diagram of the image processor 270, according to an exemplary embodiment. Referring to FIG. 6, the image processor 270 may include a volume data acquiring unit (also referred to herein as a “volume data acquirer” or a “volume data acquisition device”) 271, an eyeline tracing unit (also referred to herein as an “eyeline tracer” or an “eyeline tracing device”) 272, a section image generator 273, and a volume rendering unit (also referred to herein as a “volume renderer” or a “volume rendering device”) 274.
The volume data acquiring unit 271 may acquire 3D volume data from a received signal focused by the focusing unit 261 of the receive beamformer 260 (see FIG. 5).
The eyeline tracing unit 272 may trace an operator's eyeline based on an image acquired by the photographing unit 210 (see FIG. 3). More specifically, if the infrared emitter of the photographing unit 210 emits infrared light onto an operator's eye, the infrared light is reflected from the operator's cornea, and the reflected light (hereinafter, referred to as "cornea light") is photographed by the camera of the photographing unit 210. The eyeline tracing unit 272 may detect the location of the cornea light and the location of the operator's pupil from the image acquired by the photographing unit 210, and then obtain a trace of the operator's eyeline based on the results of the detection. The results of the eyeline tracing may be provided to the section image generator 273 and the volume rendering unit 274, which will be described below.
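For illustration only, the sketch below maps the offset between the detected pupil centre and the corneal glint to a gaze vector; the per-axis gains stand in for a calibration step and, like the function name gaze_direction, are assumptions rather than the tracing method of the apparatus.

    import numpy as np

    def gaze_direction(pupil_xy, glint_xy, gain=(1.0, 1.0)):
        # Pupil-centre / corneal-reflection estimate: as the eye rotates, the pupil
        # centre moves relative to the (nearly fixed) corneal glint, and that offset
        # is mapped to a unit gaze vector. The gains would normally come from a
        # per-operator calibration step (assumed here).
        dx = (pupil_xy[0] - glint_xy[0]) * gain[0]
        dy = (pupil_xy[1] - glint_xy[1]) * gain[1]
        v = np.array([dx, dy, 1.0])  # the z axis is assumed to point from the display toward the operator
        return v / np.linalg.norm(v)

    print(gaze_direction(pupil_xy=(322.0, 241.0), glint_xy=(318.0, 239.0)))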
The section image generator 273 may generate a section image which corresponds to a predetermined plane from the 3D volume data, based on the results of the eyeline tracing. More specifically, the section image generator 273 may generate a section image which corresponds to a plane which is perpendicular to the operator's eyeline, from the 3D volume data.
FIGS. 7A and 7B are views which illustrate section images that are generated by the section image generator 273 of FIG. 6.
FIG. 7A shows an example in which a section image corresponding to a y-z plane is generated from 3D volume data located in a 3D space that is defined by x, y, and z axes. However, a section image which corresponds instead to an x-y plane or a z-x plane may be generated.
FIG. 7B shows an example in which a section image corresponding to a plane which is not parallel to any of the x-y, y-z, and z-x planes is generated.
Referring to FIGS. 7A and 7B, there may be a plurality of planes which are perpendicular to the operator's eyeline, according to distances from the operator's viewpoint. In this case, a plane to be used for generating a section image from among the planes which are perpendicular to the operator's eyeline may be selected by using any one or more of various methods.
For example, a plane to be used for generating a section image from among planes which are perpendicular to an operator's eyeline may be automatically generated. Referring to the example of FIG. 7A, when all planes which are perpendicular to an operator's eyeline have the same size, a plane which is located closest to the operator's viewpoint from among the planes which are perpendicular to the operator's eyeline may be automatically selected. Alternatively, a plane which is located most distant from the operator's viewpoint from among the planes which are perpendicular to the operator's eyeline may be automatically selected. Referring to the example of FIG. 7B, when planes which are perpendicular to an operator's eyeline have different sizes, a plane having a largest size from among the planes which are perpendicular to the operator's eyeline may be automatically selected.
As another example, a plane which is used for generating a section image from among a plurality of planes which are perpendicular to an operator's eyeline may be manually selected. In this case, a section image which corresponds to an arbitrary plane from among the planes which are perpendicular to the operator's eyeline is first generated, and the section image is displayed via the flexible display 220 (see FIG. 3). Thereafter, if an instruction or command for changing the section image is received from the operator, the section image generator 273 (see FIG. 6) may select a plane which corresponds to the instruction or command, and then generate a section image which corresponds to the selected plane.
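A rough sketch of slicing the 3D volume data on a plane perpendicular to the traced eyeline is given below; it relies on SciPy's map_coordinates for interpolation, and the coordinate conventions, plane centre, and sampling grid are assumptions made only for this illustration.

    import numpy as np
    from scipy.ndimage import map_coordinates

    def section_image(volume, gaze, center, size=128, spacing=1.0):
        # Sample the volume on the plane through the given centre whose normal is
        # the unit gaze vector, i.e. a plane perpendicular to the traced eyeline.
        gaze = np.asarray(gaze, float)
        gaze = gaze / np.linalg.norm(gaze)
        helper = np.array([1.0, 0.0, 0.0])
        if abs(gaze @ helper) > 0.9:              # avoid a degenerate cross product
            helper = np.array([0.0, 1.0, 0.0])
        u = np.cross(gaze, helper); u /= np.linalg.norm(u)
        v = np.cross(gaze, u)                     # second in-plane axis
        offsets = (np.arange(size) - size / 2) * spacing
        uu, vv = np.meshgrid(offsets, offsets, indexing="ij")
        points = np.asarray(center, float) + uu[..., None] * u + vv[..., None] * v
        coords = points.reshape(-1, 3).T          # (3, size*size) voxel coordinates
        samples = map_coordinates(volume, coords, order=1, mode="nearest")
        return samples.reshape(size, size)

    demo_volume = np.random.default_rng(1).random((64, 64, 64))
    print(section_image(demo_volume, gaze=[0.2, 0.1, 1.0], center=[32, 32, 32]).shape)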
Referring again to FIG. 6, the volume rendering unit 274 may perform volume rendering with respect to the 3D volume data based on the results of the eyeline tracing. The volume rendering is performed in order to project 3D volume data to a 2D plane with respect to a predetermined viewpoint. The volume rendering operation may be classified into surface rendering and direct volume rendering.
The surface rendering is performed in order to extract surface information from volume data based on predetermined scalar values and amounts of spatial changes, to convert the surface information into a geometric factor, such as a polygon or a surface patch, and then to apply a conventional rendering technique to the geometric factor. Examples of algorithms which are used to implement the surface rendering are a marching cubes algorithm and a dividing cubes algorithm.
The direct volume rendering is performed in order to directly render volume data without converting the volume data into a geometric factor. The direct volume rendering is useful to represent a translucent structure, because direct volume rendering effectively enables a visualization of the inside of an object as it is. The direct volume rendering operation may be classified into an object-order method and an image-order method, according to the way in which the volume data is traversed.
The object-order method is performed in order to search for volume data in its storage order and to synthesize each voxel with the corresponding pixel. A representative example of an operation which implements the object-order method is splatting.
The image-order method is performed in order to sequentially decide pixel values in the order of scan lines of an image. Examples of methods which implement the image-order method are Ray-Casting and Ray-Tracing.
According to the Ray-Casting, as illustrated in FIG. 8, a virtual ray is irradiated from a specific viewpoint toward a predetermined pixel of a display screen, and then voxels through which the virtual ray has been transmitted from among voxels of volume data are detected. Then, brightness values of the detected voxels are accumulated in order to determine a brightness value of the corresponding pixel of the display screen. Alternatively, an average value of the detected voxels may be determined as a brightness value of the corresponding pixel of the display screen. Further, a weighted average value of the detected voxels may be determined as a brightness value of the corresponding pixel of the display screen.
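The toy ray caster below mirrors this description: one virtual ray is marched through the volume and the visited voxel brightnesses are reduced to a single pixel value by averaging or accumulation. The fixed step size, nearest-neighbour sampling, and absence of any opacity model are simplifying assumptions.

    import numpy as np

    def cast_ray(volume, origin, direction, step=0.5, mode="average"):
        # March one virtual ray through the volume, collecting the brightness of
        # every voxel the ray passes through (nearest-neighbour sampling), and
        # reduce the collected values to the brightness of one display pixel.
        direction = np.asarray(direction, float)
        direction = direction / np.linalg.norm(direction)
        pos = np.asarray(origin, float)
        limit = np.array(volume.shape) - 1
        samples = []
        while np.all(pos >= 0) and np.all(pos < limit):
            samples.append(volume[tuple(np.round(pos).astype(int))])
            pos = pos + step * direction
        if not samples:
            return 0.0
        samples = np.asarray(samples)
        return samples.mean() if mode == "average" else samples.sum()

    demo = np.random.default_rng(2).random((32, 32, 32))
    print(cast_ray(demo, origin=(0.0, 16.0, 16.0), direction=(1.0, 0.0, 0.0)))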
The Ray-Tracing is performed in order to trace a path of a ray which is traveling toward an observer's eyes. Unlike the Ray-Casting, which entails detecting an intersection at which a ray meets volume data, the Ray-Tracing can trace an irradiated ray and thereby determine parameters or qualities which relate to how the ray travels, such as reflection, refraction, etc. of the ray.
The Ray-Tracing can be classified into Forward Ray-Tracing and Backward Ray-Tracing. The Forward Ray-Tracing is performed in order to model a phenomenon in which a ray irradiated from a virtual light source arrives at volume data to be reflected, scattered, or transmitted, thereby finding a ray which finally arrives at an observer's eyes. The Backward Ray-Tracing is performed in order to backwardly trace the path of a ray which travels toward an observer's eyes.
Referring again to FIG. 6, the volume rendering unit 274 may perform volume rendering with respect to the 3D volume data by using any one or more of the above-mentioned volume rendering methods. In this aspect, if volume rendering is performed on the 3D volume data with respect to a viewpoint, a 2D projection image may be obtained. If volume rendering is performed on the 3D volume data with respect to two viewpoints which respectively correspond to a human's left and right eyes, two 2D projection images, that is, left and right images, may be obtained. By synthesizing the left image with the right image, a 3D stereoscopic image can be obtained.
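A sketch of producing left and right projection images from two viewpoints, and of synthesizing them into a single stereoscopic frame, is shown below. The renderer is passed in as a callable (it could, for instance, loop the ray caster sketched above over every pixel); the inter-ocular angle and the side-by-side synthesis are assumptions for illustration only.

    import numpy as np

    def stereo_pair(render, volume, gaze, eye_separation_deg=3.0):
        # Volume-render the same data from two viewpoints rotated about the vertical
        # axis by half the assumed inter-ocular angle, then combine the left and
        # right projection images into one stereoscopic frame.
        def rotate_y(v, deg):
            a = np.radians(deg)
            r = np.array([[np.cos(a), 0.0, np.sin(a)],
                          [0.0, 1.0, 0.0],
                          [-np.sin(a), 0.0, np.cos(a)]])
            return r @ np.asarray(v, float)

        half = eye_separation_deg / 2.0
        left = render(volume, rotate_y(gaze, -half))
        right = render(volume, rotate_y(gaze, +half))
        return np.concatenate([left, right], axis=1)  # simple side-by-side synthesis

    # Usage with a stand-in renderer that ignores the viewing direction:
    dummy_render = lambda vol, direction: vol.mean(axis=0)
    print(stereo_pair(dummy_render, np.ones((8, 8, 8)), gaze=[0.0, 0.0, 1.0]).shape)  # (8, 16)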
Further, a 2D projection image and/or a 3D stereoscopic image which are acquired by the volume rendering unit 274 and a section image generated by the section image generator 273 may be displayed via the flexible display 220. An indicator which relates to a type of images to be displayed via the flexible display 220 may be set in advance by the operator. In addition, even when a predetermined type of images is displayed via the flexible display 220, a different type of images can be displayed if an instruction or command for changing the type of images to be displayed is received.
Referring again to FIG. 3, the storage unit 280 may store data and/or algorithms which are required for operations of the ultrasonic imaging apparatus 200. For example, the storage unit 280 may store any one or more of an algorithm for tracing an operator's eyeline from an image acquired via the photographing unit 210, an algorithm for generating a section image from 3D volume data, and an algorithm for volume rendering 3D volume data.
Further, the storage unit 280 may store values which are set in advance by the operator. For example, the storage unit 280 may store information which relates to a type of ultrasonic images to be displayed via the flexible display 220, information relating to whether or not to automatically generate a section image, and a criterion for selecting a plane for generating a section image from among planes which are perpendicular to an operator's eyeline.
The storage unit 280 may include any one or more of Read Only Memory (ROM), Random Access Memory (RAM), Programmable Read Only Memory (PROM), Erasable Programmable Read Only Memory (EPROM), flash memory, Hard Disk Drive (HDD), Optical Disk Drive (ODD), and/or a combination thereof. However, the storage unit 280 is not limited to these, and may include any other storage device which is well-known in the art.
An exemplary embodiment of the ultrasonic imaging apparatus 200 has been described above with reference to FIGS. 3 to 8. The above-described exemplary embodiment relates to an example in which the image processor 270 includes both the section image generator 273 and the volume rendering unit 274.
According to another exemplary embodiment, as illustrated in FIG. 9, the image processor 270 may include the volume data acquiring unit 271, the eyeline tracing unit 272, and the section image generator 273. Because these components have been described above with reference to FIGS. 6, 7A, and 7B, further descriptions will be omitted. If the image processor 270 has a configuration as illustrated in FIG. 9, a section image may be automatically generated from 3D volume data even when no instruction or command from an operator is received.
According to another exemplary embodiment, as illustrated in FIG. 10, the image processor 270 may include the volume data acquiring unit 271, the eyeline tracing unit 272, and the volume rendering unit 274. Because these components have been described above with reference to FIGS. 6 to 8, further descriptions will be omitted. If the image processor 270 has a configuration as illustrated in FIG. 10, a 2D projection image and/or a 3D stereoscopic image may be generated from 3D volume data. A type of images to be displayed via the flexible display 220 from among 2D projection images and 3D stereoscopic images may be set in advance by an operator. Further, even when a predetermined type of images is displayed via the flexible display 220, a different type of images can be displayed if an instruction or command for changing the type of images to be displayed is received.
FIG. 11 is a flowchart which illustrates a control method which is executable by using the ultrasonic imaging apparatus 200 including the image processor 270 of FIG. 6, according to an exemplary embodiment.
Prior to describing the control method, it is assumed that the flexible probe 230 of the ultrasonic imaging apparatus 200 is fixed on the object 10 in contact with the object 10 (see FIG. 3).
Referring to FIGS. 1, 3, 6, and 11, in operation S710, after the flexible probe 230 is fixed on the object 10, the flexible probe 230 irradiates ultrasonic waves toward the object 10, and receives at least one ultrasonic echo which is reflected from the object 10. The irradiation of ultrasonic waves and the reception of the at least one ultrasonic echo may be performed by at least one ultrasonic element T (see FIG. 4), such as, for example, at least one ultrasonic transducer. The ultrasonic element T converts the received ultrasonic echo into an electrical signal in order to output a received signal. The received signal output from the ultrasonic element T is amplified and filtered, and then converted into a digital signal. The received signal converted into the digital signal may be received and focused by the receive beamformer 260.
Thereafter, in operation S720, 3D volume data may be acquired based on the received signal focused by the receive beamformer 260. The 3D volume data may be acquired by the volume data acquiring unit 271 of the image processor 270.
In operation S730, if the photographing unit 210 photographs the operator in order to acquire an image of the operator, the eyeline of the operator may be traced from the acquired image. Operation S730 of tracing the operator's eyeline may include sub-operations of: emitting infrared light onto the operator; photographing cornea light which is reflected from the operator's cornea; detecting the location of the cornea light and the location of the operator's pupil from an image which is obtained by photographing the cornea light; and tracing the operator's eyeline based on the results of the detection.
Thereafter, in operation S740, a determination may be made regarding whether or not volume rendering should be performed. Whether or not volume rendering should be performed may be determined according to a type of images to be displayed. For example, if the type of images to be displayed has been set to section images, a determination may be made that volume rendering need not be performed. Conversely, if the type of images to be displayed has been set to 2D projection images or 3D stereoscopic images, a determination may be made that volume rendering should be performed.
If a determination is made that volume rendering should be performed (i.e., "Yes" in operation S740), volume rendering may be performed with respect to the 3D volume data according to the operator's eyeline in operation S750. For example, operation S750 of performing volume rendering with respect to the 3D volume data may include performing volume rendering on the 3D volume data with respect to a predetermined viewpoint in order to generate a 2D projection image. As another example, operation S750 of performing volume rendering with respect to the 3D volume data may include performing volume rendering on the 3D volume data with respect to two viewpoints in order to generate left and right images, and synthesizing the left image with the right image in order to generate a 3D stereoscopic image.
After volume rendering is completed, in operation S760, the results of the volume rendering may be displayed via the flexible display 220. For example, if a 2D projection image has been acquired in operation S750 of performing volume rendering, the 2D projection image may be displayed via the flexible display 220. Further, if a 3D stereoscopic image has been acquired in operation S750 of performing volume rendering, the 3D stereoscopic image may be displayed via the flexible display 220.
If a determination is made that volume rendering need not be performed (i.e., "No" in operation S740), then in operation S770, a section image which corresponds to a plane which is perpendicular to the operator's eyeline may be generated from the 3D volume data. Operation S770 of generating the section image may include selecting a predetermined plane from among a plurality of planes which are perpendicular to the operator's eyeline, and generating a section image which corresponds to the selected plane. Selecting the predetermined plane from among the plurality of planes which are perpendicular to the operator's eyeline may be automatically performed, or may be manually performed based on an instruction or command from the operator.
In operation S780, the generated section image may be displayed via the flexible display 220.
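The branch taken in operation S740 can be summarised by the small dispatcher below; the setting strings and the returned helper names are assumptions introduced only to tie the earlier sketches together.

    def choose_pipeline(display_setting):
        # Mirror of the decision in operation S740: section images skip volume
        # rendering, while 2D projection and 3D stereoscopic images require it.
        if display_setting == "section":
            return "generate_section_image"       # operations S770 and S780
        if display_setting in ("2d_projection", "3d_stereoscopic"):
            return "perform_volume_rendering"     # operations S750 and S760
        raise ValueError("unknown display setting: " + display_setting)

    print(choose_pipeline("section"))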
FIG. 12 is a flowchart which illustrates a control method which is executable by using the ultrasonic imaging apparatus 200 including the image processor 270 of FIG. 9, according to another exemplary embodiment.
Referring to FIGS. 1, 3, 9, and 12, first, the operator causes the flexible probe 230 of the body 201 of the ultrasonic imaging apparatus 200 to contact the object 10, and then fixes the flexible probe 230 on the object 10 by using the belt 290.
After the flexible probe 230 is fixed on the object 10, in operation S810, the flexible probe 230 irradiates ultrasonic waves toward the object 10, and receives at least one ultrasonic echo which is reflected from the object 10. The irradiation of ultrasonic waves and the reception of the at least one ultrasonic echo may be performed by at least one ultrasonic element T (see FIG. 4), such as, for example, at least one ultrasonic transducer. The ultrasonic element T converts the received at least one ultrasonic echo into an electrical signal in order to output a received signal. The received signal output from the ultrasonic element T is amplified and filtered, and then converted into a digital signal. The received signal converted into the digital signal may be received and focused by the receive beamformer 260.
Thereafter, in operation S820, 3D volume data may be acquired based on the received signal focused by the receive beamformer 260. The 3D volume data may be acquired by the volume data acquiring unit 271 of the image processor 270.
Further, in operation S830, if the photographing unit 210 photographs the operator in order to acquire an image of the operator, the eyeline of the operator may be traced based on the acquired image. Operation S830 of tracing the operator's eyeline may include sub-operations of: emitting infrared light toward the operator; photographing cornea light which is reflected from the operator's cornea; detecting the location of the cornea light and the location of the operator's pupil from an image which is obtained by photographing the cornea light; and tracing the operator's eyeline based on the results of the detection.
After the operator's eyeline is traced, in operation S870, a section image which corresponds to a plane which is perpendicular to the operator's eyeline may be generated from the acquired 3D volume data. Operation S870 of generating the section image may include selecting a predetermined plane from among a plurality of planes which are perpendicular to the operator's eyeline, and generating a section image which corresponds to the selected plane. Selecting the predetermined plane from among the plurality of planes which are perpendicular to the operator's eyeline may be automatically performed, or may be manually performed based on an instruction or command from the operator.
In operation S880, the generated section image may be displayed via the flexible display 220.
FIG. 13 is a flowchart which illustrates a control method which is executable by using the ultrasonic imaging apparatus 200 including the image processor 270 of FIG. 10, according to still another exemplary embodiment.
Referring to FIGS. 1, 3, 10, and 13, first, the operator causes the flexible probe 230 of the body 201 of the ultrasonic imaging apparatus 200 to contact the object 10, and then fixes the flexible probe 230 on the object 10 by using the belt 290.
After the flexible probe 230 is fixed on the object 10, in operation S910, the flexible probe 230 irradiates ultrasonic waves toward the object 10, and receives at least one ultrasonic echo which is reflected from the object 10. The irradiation of ultrasonic waves and the reception of the at least one ultrasonic echo may be performed by at least one ultrasonic element T (see FIG. 4), such as, for example, at least one ultrasonic transducer. The ultrasonic element T converts the received at least one ultrasonic echo into an electrical signal in order to output a received signal. The received signal output from the ultrasonic element T is amplified and filtered, and then converted into a digital signal. The received signal converted into the digital signal may be received and focused by the receive beamformer 260.
Thereafter, in operation S920, 3D volume data may be acquired based on the received signal focused by the receive beamformer 260. The 3D volume data may be acquired by the volume data acquiring unit 271 of the image processor 270.
Further, if the photographing unit 210 photographs the operator in order to acquire an image of the operator, then in operation S930, the eyeline of the operator may be traced based on the acquired image. Operation S930 of tracing the operator's eyeline may include sub-operations of: emitting infrared light toward the operator; photographing cornea light which is reflected from the operator's cornea; detecting the location of the cornea light and the location of the operator's pupil from an image which is obtained by photographing the cornea light; and tracing the operator's eyeline based on the results of the detection.
After the operator's eyeline is traced, in operation S950, volume rendering may be performed with respect to the acquired 3D volume data. For example, operation S950 of performing volume rendering with respect to the 3D volume data may include performing volume rendering on the 3D volume data with respect to a predetermined viewpoint in order to generate a 2D projection image. As another example, operation S950 of performing volume rendering with respect to the 3D volume data may include performing volume rendering on the 3D volume data with respect to two viewpoints in order to generate left and right images, and synthesizing the left image with the right image in order to generate a 3D stereoscopic image.
After volume rendering is completed, in operation S960, the results of the volume rendering may be displayed via the flexible display 220. For example, if a 2D projection image has been generated in operation S950 of performing volume rendering, the generated 2D projection image may be displayed via the flexible display 220. Conversely, if a 3D stereoscopic image has been generated in operation S950 of performing volume rendering, the 3D stereoscopic image may be displayed via the flexible display 220.
The methods and/or operations described above may be recorded, stored, or fixed in one or more transitory or non-transitory computer-readable storage media that include program instructions to be implemented by a computer in order to cause a processor to execute or perform the program instructions. The media may also include, alone or in combination with the program instructions, data files, data structures, and the like.
Examples of computer-readable storage media include magnetic media, such as hard disks, floppy disks, and magnetic tape; optical media such as compact disk read-only memory (CD ROM) disks and digital versatile disks (DVDs); magneto-optical media, such as optical disks; and hardware devices that are specially configured to store and perform program instructions, such as read-only memory (ROM), random access memory (RAM), flash memory, and the like. Examples of program instructions include machine code, such as that produced by a compiler, and files containing higher level code that may be executed by the computer by using an interpreter. The described hardware devices may be configured to act as one or more software modules in order to perform the operations and methods described above, or vice versa. In addition, a computer-readable storage medium may be distributed among computer systems connected through a network and computer-readable codes or program instructions may be stored and executed in a decentralized manner. Further, functional programs, codes and code segments to implement those exemplary embodiments may be easily inferred by programmers who are skilled in the related art.
Although a few exemplary embodiments have been shown and described, it will be appreciated by those skilled in the art that changes may be made in these exemplary embodiments without departing from the principles and spirit of the present inventive concept, the scope of which is defined in the claims and their equivalents.

Claims (19)

  1. An ultrasonic imaging apparatus comprising:
    a flexible probe which has a planar shape and which is configured to irradiate ultrasonic waves toward an object, and to receive at least one ultrasonic echo which is reflected from the object;
    an image processor which is configured to trace an eyeline of an operator from an image which is acquired by photographing the operator, and to generate an ultrasonic image by processing three-dimensional (3D) volume data which is acquired from the at least one ultrasonic echo based on the traced eyeline; and
    a flexible display which has a planar shape and which is disposed on one side of the flexible probe, and which is configured to display the generated ultrasonic image.
  2. The ultrasonic imaging apparatus according to claim 1, further comprising a belt which is configured to fasten a body which includes the flexible probe and the flexible display to the object.
  3. The ultrasonic imaging apparatus according to claim 1, further comprising a photographing device which is configured to photograph the operator.
  4. The ultrasonic imaging apparatus according to claim 3, wherein the photographing device comprises an infrared emitter which is configured to emit infrared light toward the operator, and a camera which is configured to photograph infrared light which is reflected from a cornea of the operator.
  5. The ultrasonic imaging apparatus according to claim 1, wherein the image processor comprises a section image generator which is configured to generate a section image which corresponds to a plane selected from among a plurality of planes which are perpendicular to the traced eyeline, from the 3D volume data.
  6. The ultrasonic imaging apparatus according to claim 1, wherein the image processor comprises a volume renderer which is configured to perform volume rendering with respect to the 3D volume data based on the traced eyeline.
  7. The ultrasonic imaging apparatus according to claim 1, wherein the image processor comprises:
    a section image generator which is configured to generate a section image which corresponds to a plane selected from among a plurality of planes which are perpendicular to the traced eyeline, from the 3D volume data; and
    a volume renderer which is configured to perform volume rendering with respect to the 3D volume data based on the traced eyeline, and to generate at least one of a two-dimensional (2D) projection image and a three-dimensional (3D) stereoscopic image by using a result of the volume rendering.
  8. The ultrasonic imaging apparatus according to claim 1, wherein the flexible display includes a touch display.
  9. A control method which is executable by using an ultrasonic imaging apparatus, comprising:
    irradiating ultrasonic waves toward an object by using a flexible probe which has a planar shape, and receiving at least one ultrasonic echo which is reflected from the object;
    tracing an eyeline of an operator from an image which is acquired by photographing the operator, and generating an ultrasonic image by processing three-dimensional (3D) volume data which is acquired from the at least one ultrasonic echo based on the traced eyeline; and
    displaying the generated ultrasonic image by using a flexible display which has a planar shape and which is disposed on one side of the flexible probe.
  10. The control method according to claim 9, wherein the generating the ultrasonic image comprises generating a section image which corresponds to a plane which is selected from among a plurality of planes which are perpendicular to the traced eyeline, from the 3D volume data.
  11. The control method according to claim 9, wherein the generating the ultrasonic image comprises performing volume rendering with respect to the 3D volume data based on the traced eyeline, and generating at least one of a two-dimensional (2D) projection image and a three-dimensional (3D) stereoscopic image.
  12. The control method according to claim 9, wherein the generating the ultrasonic image comprises:
    determining a type of ultrasonic image which corresponds to a display setting of the flexible display; and
    performing at least one from among generating a section image from the 3D volume data and performing volume rendering with respect to the 3D volume data based on a result of the determining.
  13. The control method according to claim 9, wherein the photographing the operator comprises emitting infrared light toward the operator and recording infrared light which is reflected from a cornea of the operator.
  14. A non-transitory computer readable medium having recorded thereon a program which is executable by a computer for performing a method for using an ultrasonic imaging apparatus to generate an ultrasonic image, the method comprising:
    storing data relating to at least one ultrasonic echo which is reflected from an object toward which ultrasonic waves are irradiated by the ultrasonic imaging apparatus;
    storing an image which is acquired by using the ultrasonic imaging apparatus to photograph an operator of the ultrasonic imaging apparatus;
    tracing an eyeline of the operator based on the stored image;
    using the stored data relating to the at least one ultrasonic echo and a result of the tracing the eyeline of the operator to acquire three-dimensional (3D) volume data; and
    generating the ultrasonic image by processing the acquired 3D volume data.
  15. The non-transitory computer readable medium according to claim 14, wherein the method further comprises causing the generated ultrasonic image to be displayed on a flexible display of the ultrasonic imaging apparatus.
  16. The non-transitory computer readable medium according to claim 14, wherein the using the ultrasonic imaging apparatus to photograph the operator comprises causing the ultrasonic imaging apparatus to emit infrared light toward the operator and to record infrared light which is reflected from a cornea of the operator.
  17. The non-transitory computer readable medium according to claim 14, wherein the generating the ultrasonic image comprises generating a section image which corresponds to a plane which is selected from among a plurality of planes which are perpendicular to the traced eyeline from the acquired 3D volume data.
  18. The non-transitory computer readable medium according to claim 14, wherein the generating the ultrasonic image comprises performing volume rendering with respect to the acquired 3D volume data based on the traced eyeline and generating at least one of a two-dimensional projection image and a three-dimensional stereoscopic image by using a result of the volume rendering.
  19. The non-transitory computer readable medium according to claim 15, wherein the generating the ultrasonic image comprises determining a type of ultrasonic image to be displayed based on a predetermined display setting of the flexible display, and performing at least one from among generating a section image from the acquired 3D volume data and performing volume rendering with respect to the acquired 3D volume data based on a result of the determining.
PCT/KR2014/005572 2013-06-25 2014-06-24 Ultrasonic imaging apparatus and control method thereof Ceased WO2014208977A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2013-0072953 2013-06-25
KR20130072953A KR20150000627A (en) 2013-06-25 2013-06-25 Ultrasonic imaging apparatus and control method thereof

Publications (1)

Publication Number Publication Date
WO2014208977A1 true WO2014208977A1 (en) 2014-12-31

Family

ID=52142243

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2014/005572 Ceased WO2014208977A1 (en) 2013-06-25 2014-06-24 Ultrasonic imaging apparatus and control method thereof

Country Status (2)

Country Link
KR (1) KR20150000627A (en)
WO (1) WO2014208977A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20250012331A (en) * 2023-07-17 2025-01-24 사회복지법인 삼성생명공익재단 Mixed reality-based ultrasound image output system and method

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001064094A2 (en) * 2000-02-28 2001-09-07 Wilk Patent Development Corporation Ultrasonic imaging system and associated method
KR20060058327A (en) * 2004-11-25 2006-05-30 주식회사 메디슨 Remote control of ultrasonic imaging device
JP2010500894A (en) * 2006-08-17 2010-01-14 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Dynamic physical condition display device
KR20090099953A (en) * 2008-03-19 2009-09-23 주식회사 메디슨 Ultrasonic Systems with Different Displays
US20120116227A1 (en) * 2009-07-16 2012-05-10 Unex Corporation Ultrasonic blood vessel inspecting apparatus

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210212658A1 (en) * 2018-05-31 2021-07-15 Matt Mcgrath Design & Co, Llc Integrated Medical Imaging Apparatus And Associated Method Of Use
US12303324B2 (en) 2018-05-31 2025-05-20 Faction Imaging Inc. Method of medical imaging using multiple arrays
US12329569B2 (en) 2018-05-31 2025-06-17 Faction Imaging Inc. Anatomical attachment device and associated method of use
WO2021004076A1 (en) * 2019-07-05 2021-01-14 山东大学 Ai chip-based conformal wearable biological information monitoring device and system
CN111184532A (en) * 2020-04-09 2020-05-22 上海尽星生物科技有限责任公司 Ultrasonic system and method of contact type flexible conformal ultrasonic probe

Also Published As

Publication number Publication date
KR20150000627A (en) 2015-01-05

Similar Documents

Publication Publication Date Title
US20200337599A1 (en) Human Body Measurement Using Thermographic Images
CN113974689B (en) Spatial Alignment Equipment
CN107105972B (en) Model register system and method
KR102617605B1 (en) Jaw motion tracking
KR101193017B1 (en) Operation supporting device, method, and program
US20190117190A1 (en) Ultrasound imaging probe positioning
CN107320124A (en) The method and medical image system of spacer scanning are set in medical image system
JP6974354B2 (en) Synchronized surface and internal tumor detection
CN103892919A (en) Microsurgery system and navigation method based on optical coherence tomography guidance
US20140171799A1 (en) Systems and methods for providing ultrasound probe location and image information
WO2014208977A1 (en) Ultrasonic imaging apparatus and control method thereof
US20200069291A1 (en) Methods and apparatuses for collection of ultrasound data
CN105342561B (en) The wearable molecular image navigation system of Wireless sound control
JP2005136726A (en) Stereoscopic image display apparatus, stereoscopic image display system, stereoscopic image display method, and program
CN109223303A (en) Full-automatic wound shooting assessment safety goggles and measurement method
US20130208249A1 (en) Medical imaging apparatus
US9589387B2 (en) Image processing apparatus and image processing method
EP3072448B1 (en) Device and method for generating dental three-dimensional surface image
CN104224211B (en) Digital X rays view stereoscopic alignment system and its method
KR101133503B1 (en) Integrated optical and x-ray ct system and method of reconstructing the data thereof
US20220361952A1 (en) Physical medical element placement systems
WO2018139707A1 (en) Ultrasonic diagnosis apparatus displaying shear wave data for object and method for operating same
WO2015002498A1 (en) Ultrasonic imaging apparatus and control method thereof
US20190350546A1 (en) Backscattering x-ray imager with x-ray source at detector center
JP2012143419A (en) Apparatus and method for displaying radiographic image

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 14818547

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry in European phase

Ref document number: 14818547

Country of ref document: EP

Kind code of ref document: A1