WO2025033804A1 - Method for obtaining information on the classification of a Doppler echogram and device for obtaining information on the classification of a Doppler echogram using the same - Google Patents
Method for obtaining information on the classification of a Doppler echogram and device for obtaining information on the classification of a Doppler echogram using the same
- Publication number
- WO2025033804A1 (PCT/KR2024/011062)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- cross-section
- doppler
- classification
- providing information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5215—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Clinical applications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Clinical applications
- A61B8/0883—Clinical applications for diagnosis of the heart
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/48—Diagnostic techniques
- A61B8/488—Diagnostic techniques involving Doppler signals
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/20—ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
Definitions
- the present invention relates to a method for providing information on Doppler ultrasound image classification and a device for providing information on Doppler ultrasound image classification using the same.
- a cardiac ultrasound examination is performed by projecting ultrasound waves on the three-dimensional structure of the heart in multiple planes to obtain images of the heart and measure hemodynamic variables.
- the medical staff positions the ultrasound probe at a location, such as between the ribs, where it is easy to obtain multi-plane Doppler ultrasound images through the anatomical structures around the heart, and records the images by finding the appropriate slice through rotation and tilting of the probe.
- the Doppler mode can measure the velocity of blood flow in blood vessels.
- the blood flow measurement method based on the Doppler mode of ultrasound has the characteristic of being able to measure blood flow velocity in real time noninvasively, and is widely used in modern medical diagnosis.
- Doppler ultrasound images can provide blood flow information for various cross-sectional views.
- Doppler images acquired during cardiac ultrasound only provide velocity-time information for blood vessels.
- the medical staff must classify which cross-section the acquired Doppler ultrasound image corresponds to.
- the inventors of the present invention attempted to develop an information providing system based on an artificial neural network that is trained to recognize cross-sectional views of Doppler cardiac ultrasound images and to distinguish between each cross-sectional view.
- the inventors of the present invention were able to recognize and classify cross-sectional views of Doppler ultrasound images by applying an artificial neural network.
- the inventors of the present invention focused on an artificial neural network trained to extract cross-sectional views using Doppler echocardiographic images as input and classify each cross-sectional view corresponding to the extracted cross-sectional views.
- the inventors of the present invention focused more on a classification model that derives a distance value using a Doppler cross-section distance graph based on the cross-section correlation of extracted Doppler ultrasound images for Doppler cardiac ultrasound images, and outputs a probability vector corresponding to each cross-section using the distance value.
- based on the classification model, the inventors of the present invention represent the cross-sections of a Doppler ultrasound image as a Doppler cross-section distance graph built from the degree of correlation between each cross-section, input the distance values derived from the graph into the ⁇ function, and train the network using the probability vector matching the input distance values. At inference time, the network therefore outputs a probability vector that matches the distance values and is better suited to the Decision Tree shown in Fig. 5, so that the system can classify each cross-section with high accuracy.
- the inventors of the present invention were able to recognize that it is possible to classify cross-sectional views of Doppler ultrasound images with high accuracy based on the classification model and decision tree [Fig. 5] for the acquired Doppler cardiac ultrasound images.
- the inventors of the present invention were able to realize that by providing a new information provision system, unnecessary diagnosis time can be reduced and cross-section classification with high accuracy is possible with an artificial neural network-based system.
- the inventors of the present invention expected that by providing a new information provision system, classification of Doppler ultrasound image cross-sections can be performed with high accuracy regardless of the skill level of medical staff, and thus, provision of highly reliable analysis results for Doppler cardiac ultrasound images can be made possible.
- the problem to be solved by the present invention is to provide a method for providing information on a Doppler ultrasound image and a device using the same, which is configured to classify a cross-sectional view extracted from a received Doppler cardiac ultrasound image with high accuracy using a classification model based on an artificial neural network.
- the above information providing method is an information providing method for Doppler cardiac ultrasound image classification implemented by a processor, comprising the steps of receiving a Doppler cardiac ultrasound image of an object, extracting a cross-sectional view using the Doppler cardiac ultrasound image as input, and classifying the cross-sectional view for the Doppler cardiac ultrasound image using a classification model and a decision tree.
- a step of classifying a cross-sectional view of a Doppler cardiac ultrasound image using a classification model further includes a step of calculating a distance value from a Doppler view distance graph, a step of inputting the distance value calculated using the classification model into a ⁇ function and outputting a matching probability vector, and a step of learning a network based on the output probability vector and classifying the cross-sectional view of the Doppler cardiac ultrasound image using the network probability output and a Decision Tree.
- the step of classifying a cross-sectional view for a Doppler cardiac ultrasound image using a classification model may further include the step of obtaining a probability corresponding to each of a plurality of cross-sectional views based on an output probability vector, and the step of determining a cross-sectional view having the highest value among the corresponding probabilities as a cross-sectional view for the Doppler cardiac ultrasound image.
- a cross-section is a plurality of cross-sections, and labeling for a cross-section classification model is generated by calculating a distance value from a Doppler view distance graph, inputting the calculated distance value into a ⁇ function, and using the matching probability vector as ground-truth during training, so that a probability corresponding to each cross-section reflecting the relative distance between the Doppler ultrasound image cross-sections can be finally output.
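The label-generation step above can be sketched as follows. This is a minimal illustration only: the patent's actual Doppler view distance graph (Fig. 6) and its ⁇ function (Fig. 7) are not reproduced in this text, so an `exp(-d)` mapping is assumed as a stand-in for "closer views get higher probability".

```python
import math

# Hypothetical tree distances from the ground-truth view to every view;
# the real distances come from the Doppler view distance graph (Fig. 6).
distances = {
    "MV inflow PW": 0,
    "AV (LVOT) PW": 2,
    "MV (MS) - CW": 3,
    "pulmonary vein": 5,
}

def soft_labels(dist):
    """Turn tree distances into a probability vector that sums to 1,
    giving the ground-truth view (distance 0) the largest weight."""
    weights = {view: math.exp(-d) for view, d in dist.items()}
    total = sum(weights.values())
    return {view: w / total for view, w in weights.items()}

labels = soft_labels(distances)
```

Training against such a vector rather than a one-hot label penalizes confusions between distant views more than confusions between anatomically related ones, which is the stated purpose of reflecting the relative distance between cross-sections.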
- the plurality of cross-sections can be one of MV inflow PW, AV (LVOT) PW, PV (RVOT) PW, AV (AS) - CW, PV (PS) - CW, MV (MR) - CW, TV (TR) - CW, AV (AR) - CW, PV (PR) - CW, MV (MS) - CW, pulmonary vein, Septal Annulus TDI or Lateral Annulus TDI.
- the Doppler view distance graph is a binary tree, and each cross-section may be arranged in the binary tree according to the degree of interrelationship between each cross-section.
- the classification model further includes a step of calculating a distance value from a Doppler cross-section distance graph when training a cross-section for a Doppler cardiac ultrasound image, wherein the step of calculating the distance value may be generating a Training GT using a binary tree created by considering correlations between a plurality of cross-sections for the cross-sections of the Doppler cardiac ultrasound image.
- the final cross-section classification is obtained using a Doppler Decision Tree using a probability value considering the distance between the Doppler cross-sections.
- the step of generating a label of a cross-sectional view for a Doppler cardiac ultrasound image to learn a classification model may further include a step of inputting the calculated distance value into a gyro ( ⁇ ) function and outputting a matching probability vector.
- a gyro ( ⁇ ) function is a function that maps the calculated distance value to a matching probability vector; its graph is illustrated in FIG. 7.
- the step of classifying a cross-sectional view of a Doppler cardiac ultrasound image using a classification model may further include a step of classifying the cross-sectional view of the Doppler cardiac ultrasound image based on a probability vector learned through the above process, and at this time, the step of classifying the cross-sectional view of the Doppler cardiac ultrasound image based on the calculated probability vector may further include a step of calculating a probability corresponding to each cross-sectional view calculated based on the calculated probability vector and determining it as a cross-sectional view of the Doppler cardiac ultrasound image through a Decision Tree.
- the step of classifying the cross-section may include a step of determining at least one Doppler mode cross-section among MV inflow PW, AV (LVOT) PW, PV (RVOT) PW, AV (AS) - CW, PV (PS) - CW, MV (MR) - CW, TV (TR) - CW, AV (AR) - CW, PV (PR) - CW, MV (MS) - CW, pulmonary vein, Septal Annulus TDI and Lateral Annulus TDI, by using the probability corresponding to each cross-section calculated using the cross-section classification model.
- the step of classifying the cross-section may include a step of determining the cross-section having the highest probability value among MV inflow PW, AV (LVOT) PW, PV (RVOT) PW, AV (AS) - CW, PV (PS) - CW, MV (MR) - CW, TV (TR) - CW, AV (AR) - CW, PV (PR) - CW, MV (MS) - CW, pulmonary vein, Septal Annulus TDI and Lateral Annulus TDI as the cross-section for the Doppler cardiac ultrasound image, by using a Decision Tree and the probability corresponding to each cross-section calculated using the cross-section classification model.
- a device for classification information in a Doppler ultrasound image is provided according to another embodiment of the present invention.
- the above information providing device includes an ultrasound probe that provides a Doppler cardiac ultrasound image of an object and a processor functionally connected to a communication unit.
- the processor is configured to classify a cross-sectional view of a Doppler cardiac ultrasound image based on the received Doppler cardiac ultrasound image by using a cross-sectional view classification model learned to classify a cross-sectional view by taking each Doppler cardiac ultrasound image as an input.
- the classification model may include an output unit that takes a distance value as input and outputs a probability vector.
- the cross-section is a plurality of cross-sections
- for the cross-section classification label, a distance value corresponding to each cross-section can be calculated using a Doppler cross-section distance graph. Furthermore, through the calculated distance value and the gyro function, a probability vector corresponding to each of the distances between the plurality of cross-sections for the Doppler cardiac ultrasound image can be output.
- the processor may be configured to obtain a probability corresponding to each of a plurality of output cross-sectional views, and determine, through a decision tree, a cross-sectional view having the highest value among the corresponding probabilities as a cross-sectional view for a Doppler cardiac ultrasound image.
- the present invention provides an information providing system for Doppler ultrasound image classification based on an artificial neural network configured to classify ultrasound cross-sectional views using Doppler cardiac ultrasound images, thereby providing highly reliable cardiac ultrasound diagnosis results.
- the present invention recognizes a cross-sectional view of a Doppler ultrasound image and classifies it using a classification model, thereby being able to classify with high accuracy which cross-sectional view a Doppler image corresponds to.
- the present invention provides an information providing system for Doppler ultrasound image classification based on an artificial neural network, thereby enabling classification of cross-sectional views of Doppler cardiac ultrasound images with high accuracy regardless of the skill level of medical staff, thereby contributing to more accurate decision-making and establishment of treatment plans at the image analysis stage.
- FIG. 1 illustrates an information providing system for Doppler ultrasound image classification using a device for providing information for Doppler ultrasound image classification according to one embodiment of the present invention.
- FIG. 2a is a block diagram showing the configuration of a medical device according to one embodiment of the present invention.
- FIG. 2b is a block diagram showing the configuration of a server for providing information according to one embodiment of the present invention.
- FIG. 3 illustrates a procedure of a method for providing information on Doppler ultrasound image classification according to one embodiment of the present invention.
- FIG. 4 is an exemplary diagram illustrating a process of deriving a probability corresponding to each cross-section based on the distance-based label generation ( ⁇ ) function according to one embodiment of the present invention.
- FIG. 5 is an exemplary diagram illustrating a final Doppler ultrasound image classification decision tree based on the probability value of a network learned using a Doppler cross-section distance graph for Doppler ultrasound image classification according to one embodiment of the present invention.
- FIG. 6 is an exemplary diagram illustrating a Doppler cross-sectional distance graph created using each cross-sectional view of a Doppler ultrasound image for Doppler ultrasound image classification according to one embodiment of the present invention.
- FIG. 7 is an exemplary diagram illustrating a graph of the gyro ( ⁇ ) function used to output a probability vector reflecting the distance between cross-sectional views according to one embodiment of the present invention.
- "subject" as used in this specification may mean any subject for which information on Doppler ultrasound image classification is provided. Meanwhile, the subject disclosed in this specification may be any mammal other than a human, but is not limited thereto.
- the term "ultrasound image” may be a cardiac ultrasound image that can be obtained by a noninvasive method.
- the ultrasound image may be a still cut image or a video composed of a plurality of cuts.
- the video may be classified into cross-sectional views for each Doppler cardiac ultrasound image for each frame of the video according to the method for providing information on a Doppler ultrasound image according to an embodiment of the present invention.
- the present invention can provide a streaming service by performing cross-sectional view classification simultaneously with the reception of a Doppler cardiac ultrasound image from an image diagnosis device, and can also provide Doppler ultrasound image information in real time.
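The per-frame classification of a video described above can be sketched as mapping a single-frame classifier over the frames of a clip; `classify_frame` here is a hypothetical placeholder, not an interface from the patent.

```python
def classify_clip(frames, classify_frame):
    """Label every frame of a clip independently, so each frame
    receives its own cross-sectional view classification."""
    return [classify_frame(frame) for frame in frames]

# Stand-in classifier that always returns the same view, purely to
# show the shape of the result.
clip_labels = classify_clip(["frame0", "frame1", "frame2"],
                            lambda frame: "MV inflow PW")
```

Because each frame is handled independently, the same routine can run on frames as they arrive, which is what makes the streaming/real-time use described above possible.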
- ultrasound images may be two-dimensional videos, but are not limited thereto and may also be three-dimensional images.
- Doppler cardiac ultrasound image or “Doppler ultrasound image” may be an image that can analyze the velocity of blood flow in a blood vessel in a non-invasive manner among various ultrasound modes. At this time, the Doppler ultrasound image can provide blood flow information for various cross-sectional views.
- cross-sectional view refers to a cross-sectional view for a Doppler ultrasound image, which may include a cross-sectional view for a Pulsed Wave (PW) Doppler mode, a cross-sectional view for a Continuous Wave (CW) Doppler mode, and a cross-sectional view for a Tissue Doppler imaging (TDI) mode.
- PW: Pulsed Wave
- CW: Continuous Wave
- TDI: Tissue Doppler Imaging
- the "cross-sectional view for Doppler ultrasound image” includes cross-sectional views of MV inflow PW, AL (LVOT) PW, PV (RVOT) PW and pulmonary vein for intermittent wave Doppler mode, cross-sectional views of AV (AS) CW, PV (PS) CW, MV (MR)-CW, TV (TR)-CW, AV (AR)-CW, PV (PR)-CW and MV (MS)-CW) for continuous wave Doppler mode, and cross-sectional views of Septal Annulus PW TDI and Lateral Annulus PW TDI for tissue Doppler imaging mode.
- the ultrasound image may be an image in DICOM format including metadata for the ultrasound image.
- Metadata can correspond to image DICOM tag information.
- a Doppler ultrasound image can be acquired by cropping only the relevant area based on the coordinate values of the DICOM header.
- Metadata may further include information such as whether the Doppler ultrasound image is a still image or a video, a color image or a black and white image.
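Cropping the Doppler region based on header coordinates, as described above, might look like the following sketch. The coordinate source is an assumption: the patent says only that coordinates come from the DICOM header (in the DICOM standard such a bounding box could come from the Sequence of Ultrasound Regions attributes), and the parameter names below are not taken from the patent.

```python
def crop_doppler_region(pixels, x0, y0, x1, y1):
    """Crop a 2-D pixel array (list of rows) to the rectangle whose
    corners (x0, y0) and (x1, y1) were read from the DICOM header."""
    return [row[x0:x1] for row in pixels[y0:y1]]

# Synthetic 6x8 "image" whose value encodes (row, column), so the
# crop can be checked by inspection.
image = [[col + 10 * row for col in range(8)] for row in range(6)]
region = crop_doppler_region(image, x0=2, y0=1, x1=5, y1=4)
```

In practice the pixel array and header fields would be read with a DICOM library such as pydicom; the cropping itself is just this array slice.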
- for Doppler cardiac ultrasound images of multiple cross-sections, the cross-section classification model can recognize and classify the ultrasound cross-sections within the image regardless of the Doppler ultrasound mode.
- cross-sectional view classification model may be a model configured to take a Doppler echocardiographic image as input, and output a determination corresponding to each of MV inflow PW, AV (LVOT) PW, PV (RVOT) PW, AV (AS) - CW, PV (PS) - CW, MV (MR) - CW, TV (TR) - CW, AV (AR) - CW, PV (PR) - CW, MV (MS) - CW, pulmonary vein, Septal Annulus TDI and Lateral Annulus TDI for each cross-sectional view of the Doppler ultrasound image.
- the cross-sectional classification model may be a model learned to classify cross-sectional views in various Doppler modes based on a learning Doppler cardiac ultrasound image.
- the learning Doppler cardiac ultrasound image may be an image in which cross-sectional views in PW, CW, and TDI Doppler modes are each labeled.
- the cross-sectional classification model may be a model having 13 output nodes learned to classify 13 cross-sectional views of MV inflow PW, AV (LVOT) PW, PV (RVOT) PW, AV (AS) - CW, PV (PS) - CW, MV (MR) - CW, TV (TR) - CW, AV (AR) - CW, PV (PR) - CW, MV (MS) - CW, pulmonary vein, Septal Annulus TDI and Lateral Annulus TDI for Doppler ultrasound images.
- the present invention is not limited thereto.
- the cross-sectional view classification model may include a step of calculating a distance value from a Doppler view distance graph, inputting the distance value calculated using the classification model into a ⁇ function to output a matching probability vector, and calculating the probability corresponding to each cross-sectional view based on the probability vector.
- the Doppler view distance graph may be a binary tree, and each cross-sectional view may be arranged in the binary tree according to the degree of interrelationship between each cross-sectional view.
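A minimal sketch of such a binary-tree distance graph follows: the distance between two views is taken as the number of edges on the leaf-to-leaf path. The tree below is illustrative only and is not the graph of Fig. 6.

```python
# Toy binary tree: parent -> (left child, right child). Leaves are
# cross-sectional views; related views sit closer together.
TREE = {
    "root": ("PW", "CW"),
    "PW": ("MV inflow PW", "AV (LVOT) PW"),
    "CW": ("MV (MR) - CW", "TV (TR) - CW"),
}

def path_to(node, target, path=()):
    """Return the root-to-target path as a tuple, or None if absent."""
    path = path + (node,)
    if node == target:
        return path
    for child in TREE.get(node, ()):
        found = path_to(child, target, path)
        if found:
            return found
    return None

def tree_distance(a, b):
    """Count edges from a to b via their lowest common ancestor."""
    pa, pb = path_to("root", a), path_to("root", b)
    common = sum(1 for x, y in zip(pa, pb) if x == y)
    return (len(pa) - common) + (len(pb) - common)
```

With this arrangement, sibling views (e.g. the two PW views) are distance 2 apart, while views on opposite sides of the root are distance 4 apart, so the distance encodes the degree of interrelationship.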
- the cross-sectional classification model may further include a step of inputting the calculated distance value into the ⁇ function and outputting a matching probability vector.
- the ⁇ function is the function, illustrated in FIG. 7, that maps the calculated distance value to a matching probability vector.
- the cross-sectional view classification model can classify cross-sectional views with high accuracy regardless of the mode of the input Doppler ultrasound image.
- the cross-sectional classification model may be a model based on DBNet, but is not limited thereto.
- the classification models may be based on at least one algorithm selected from among U-Net, VGGNet, DenseNet, FCN (Fully Convolutional Network) with an encoder-decoder structure, SegNet, DeconvNet, and deep neural networks (DNN) such as DeepLab V3+, SqueezeNet, AlexNet, ResNet18, MobileNet-v2, GoogLeNet, ResNet-v2, ResNet50, RetinaNet, ResNet101, and Inception-v3.
- the cross-sectional classification model may be an ensemble model based on at least two algorithm models among the aforementioned algorithms.
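One simple way to combine such models is to average their per-view probability vectors. Averaging is an assumed combination rule for illustration; the patent only states that at least two algorithm models are ensembled.

```python
def ensemble(prob_vectors):
    """Average several models' probability vectors element-wise,
    producing one combined probability per cross-sectional view."""
    n_models = len(prob_vectors)
    n_views = len(prob_vectors[0])
    return [sum(p[i] for p in prob_vectors) / n_models
            for i in range(n_views)]

# Two toy models over two views; the average keeps the vector summing to 1.
avg = ensemble([[0.6, 0.4], [0.2, 0.8]])
```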
- FIG. 1 illustrates an information providing system for Doppler ultrasound image classification using a device for providing information for Doppler ultrasound image classification according to one embodiment of the present invention.
- FIG. 2a illustrates an exemplary configuration of a medical device that provides information on Doppler ultrasound image classification according to one embodiment of the present invention.
- FIG. 2b illustrates an example configuration of a device for providing information on Doppler ultrasound image classification according to one embodiment of the present invention.
- the information providing system (1000) may be a system configured to provide information related to a Doppler ultrasound image based on a Doppler cardiac ultrasound image of an object.
- the information providing system (1000) may be configured with a medical device (100) that receives information related to Doppler ultrasound image classification, a Doppler cardiac ultrasound image diagnosis device (200) that provides a Doppler cardiac ultrasound image, and an information providing server (300) that generates information related to Doppler ultrasound image classification based on the received Doppler cardiac ultrasound image.
- the medical device (100) is an electronic device that provides a user interface for displaying information related to Doppler ultrasound image classification, and may include at least one of a smart phone, a tablet PC (Personal Computer), a laptop, and/or a PC.
- the medical device (100) can receive prediction results related to classification of Doppler ultrasound images for an object from the information providing server (300) and display the received results through a display unit to be described later.
- the server (300) for providing information may include a general-purpose computer, laptop, and/or data server that performs various operations to determine information related to Doppler ultrasound image classification based on a Doppler cardiac ultrasound image provided from a Doppler cardiac ultrasound imaging diagnostic device (200), such as an ultrasound diagnostic device.
- the server (300) for providing information may be a device for accessing a web server that provides a web page or a mobile web server that provides a mobile website, but is not limited thereto.
- the information providing server (300) can receive a Doppler cardiac ultrasound image from a Doppler cardiac ultrasound imaging diagnostic device (200), classify an ultrasound mode and a cross-sectional view of the received Doppler cardiac ultrasound image, and provide information related to the Doppler ultrasound image. At this time, the information providing server (300) can classify an ultrasound cross-sectional view from a Doppler cardiac ultrasound image using a classification model.
- the information providing server (300) can provide the cross-sectional classification results for the Doppler ultrasound image to the medical device (100).
- Information provided from the information provision server (300) in this way may be provided as a web page through a web browser installed on a medical device (100), or may be provided in the form of an application or program. In various embodiments, such data may be provided in a form included in a platform in a client-server environment.
- the medical device (100) may include a memory interface (110), one or more processors (120), and a peripheral interface (130). Various components within the medical device (100) may be connected by one or more communication buses or signal lines.
- the memory interface (110) is connected to the memory (150) and can transmit various data to the processor (120).
- the memory (150) can include at least one type of storage medium among flash memory type, hard disk type, multimedia card micro type, card type memory (for example, SD or XD memory, etc.), RAM, SRAM, ROM, EEPROM, PROM, network storage, cloud, and blockchain data.
- the memory (150) can store at least one of an operating system (151), a communication module (152), a graphical user interface (GUI) module (153), a sensor processing module (154), a telephone module (155), and an application module (156).
- the operating system (151) can include instructions for processing basic system services and instructions for performing hardware operations.
- the communication module (152) can communicate with at least one of other devices, computers, and servers.
- the graphical user interface module (GUI) (153) can process a graphical user interface.
- the sensor processing module (154) can process sensor-related functions (e.g., processing voice input received using one or more microphones (192)).
- the telephone module (155) can process telephone-related functions.
- the application module (156) can perform various functions of the user application, such as electronic messaging, web browsing, media processing, navigation, imaging, and other processing functions.
- the medical device (100) can store one or more software applications (156-1, 156-2) (e.g., information providing applications) associated with one type of service in the memory (150).
- the memory (150) may store a digital assistant client module (157) (hereinafter, DA client module), and thus store instructions for performing client-side functions of the digital assistant and various user data (158).
- the DA client module (157) can obtain the user's voice input, text input, touch input, and/or gesture input through various user interfaces (e.g., I/O subsystem (140)) provided in the medical device (100).
- the DA client module (157) can output data in audiovisual and tactile forms.
- the DA client module (157) can output data consisting of a combination of at least two or more of voice, sound, notification, text message, menu, graphic, video, animation, and vibration.
- the DA client module (157) can communicate with a digital assistant server (not shown) using a communication subsystem (180).
- the DA client module (157) may collect additional information about the surroundings of the medical device (100) from various sensors, subsystems, and peripheral devices to construct a context associated with the user input.
- the DA client module (157) may provide context information along with the user input to a digital assistant server to infer the user's intent.
- the context information that may accompany the user input may include sensor information, such as lighting, ambient noise, ambient temperature, images of the surroundings, video, etc.
- the context information may include the physical state of the medical device (100) (e.g., device orientation, device position, device temperature, power level, speed, acceleration, motion pattern, cellular signal strength, etc.).
- context information may include information related to the software state of the medical device (100) (e.g., processes running on the medical device (100), installed programs, past and current network activity, background services, error logs, resource usage, etc.).
- the memory (150) may store additional instructions or omit some instructions, and the medical device (100) may further include components other than those illustrated in FIG. 2a or may exclude some components.
- the processor (120) can control the overall operation of the medical device (100) and execute various commands to implement an interface that provides information related to Doppler ultrasound images by driving an application or program stored in the memory (150).
- the processor (120) may correspond to a computational unit such as a CPU (Central Processing Unit) or an AP (Application Processor).
- the processor (120) may be implemented in the form of an integrated chip (IC) such as a SoC (System on Chip) in which various computational units such as an NPU (Neural Processing Unit) are integrated.
- the peripheral interface (130) can be connected to various sensors, subsystems, and peripheral devices to provide data so that the medical device (100) can perform various functions.
- the function performed by the medical device (100) is performed by the processor (120).
- the peripheral interface (130) can receive data from a motion sensor (160), a light sensor (161), and a proximity sensor (162), through which the medical device (100) can perform orientation, light, and proximity detection functions, etc.
- the peripheral interface (130) can receive data from other sensors (163) (positioning system-GPS receiver, temperature sensor, biometric sensor), through which the medical device (100) can perform functions related to the other sensors (163).
- the medical device (100) may include a camera subsystem (170) connected to a peripheral interface (130) and an optical sensor (171) connected thereto, which may enable the medical device (100) to perform various photographic functions, such as taking photographs and recording video clips.
- the medical device (100) may include a communication subsystem (180) coupled with a peripheral interface (130).
- the communication subsystem (180) may be comprised of one or more wired/wireless networks and may include various communication ports, radio frequency transceivers, and optical transceivers.
- the medical device (100) includes an audio subsystem (190) coupled to the peripheral interface (130), the audio subsystem (190) including one or more speakers (191) and one or more microphones (192), such that the medical device (100) can perform voice-activated functions, such as speech recognition, voice replication, digital recording, and telephony functions.
- the medical device (100) may include an I/O subsystem (140) coupled with a peripheral interface (130).
- the I/O subsystem (140) may control a touch screen (143) included in the medical device (100) via a touch screen controller (141).
- the touch screen controller (141) may detect a user's contact and movement or cessation of contact and movement using any one of a plurality of touch sensing technologies, such as capacitive, resistive, infrared, surface acoustic wave technology, proximity sensor array, and the like.
- the I/O subsystem (140) may control other input/control devices (144) included in the medical device (100) via other input controller(s) (142).
- the other input controller(s) (142) may control one or more buttons, rocker switches, thumb-wheels, infrared ports, USB ports, and pointer devices such as a stylus.
- the information providing server (300) may include a communication interface (310), a memory (320), an I/O interface (330), and a processor (340), each component of which may communicate with each other through one or more communication buses or signal lines.
- the communication interface (310) can be connected to a medical device (100) and a Doppler cardiac ultrasound imaging device (200) via a wired/wireless communication network to exchange data.
- the communication interface (310) can receive a Doppler cardiac ultrasound image from the Doppler cardiac ultrasound imaging device (200) and transmit information about the determined cross-section to the medical device (100).
- a communication interface (310) that enables transmission and reception of such data includes a wired communication port (311) and a wireless circuit (312), wherein the wired communication port (311) may include one or more wired interfaces, for example, Ethernet, Universal Serial Bus (USB), FireWire, etc.
- the wireless circuit (312) may transmit and receive data with an external device via an RF signal or an optical signal.
- the wireless communication may use at least one of a plurality of communication standards, protocols, and technologies, for example, GSM, EDGE, CDMA, TDMA, Bluetooth, Wi-Fi, VoIP, Wi-MAX, or any other suitable communication protocol.
- the memory (320) can store various data used in the information providing server (300).
- the memory (320) can store a Doppler cardiac ultrasound image, or a cross-sectional classification model trained to classify ultrasound cross-sections within a Doppler cardiac ultrasound image.
- the memory (320) may include a volatile or nonvolatile storage medium capable of storing various data, commands, and information.
- the memory (320) may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (e.g., SD or XD memory), RAM, SRAM, ROM, EEPROM, PROM, network storage, cloud storage, and blockchain data.
- the memory (320) may store a configuration of at least one of an operating system (321), a communication module (322), a user interface module (323), and one or more applications (324).
- An operating system (e.g., an embedded operating system such as LINUX, UNIX, MAC OS, WINDOWS, VxWorks, etc.) may include various software components and drivers to control and manage general system operations (e.g., memory management, storage device control, power management, etc.) and may support communication between various hardware, firmware, and software components.
- the communication module (322) can support communication with other devices through the communication interface (310).
- the communication module (322) can include various software components for processing data received by the wired communication port (311) or wireless circuit (312) of the communication interface (310).
- the user interface module (323) can receive a user's request or input from a keyboard, touch screen, microphone, etc. through an I/O interface (330) and provide a user interface on the display.
- the application (324) may include a program or module configured to be executed by one or more processors (340).
- the application for providing information associated with Doppler ultrasound images may be implemented on a server farm.
- the I/O interface (330) can connect at least one of an input/output device (not shown) of the information providing server (300), such as a display, a keyboard, a touch screen, and a microphone, to the user interface module (323).
- the I/O interface (330) can receive user input (e.g., voice input, keyboard input, touch input, etc.) together with the user interface module (323) and process a command according to the received input.
- the processor (340) is connected to a communication interface (310), a memory (320), and an I/O interface (330) to control the overall operation of the information providing server (300), and can perform various commands for providing information through an application or program stored in the memory (320).
- the processor (340) may correspond to a computational device such as a CPU (Central Processing Unit) or an AP (Application Processor).
- the processor (340) may be implemented in the form of an integrated chip (Integrated Chip (IC)) such as a SoC (System on Chip) in which various computational devices are integrated.
- the processor (340) may include a module for calculating an artificial neural network model such as an NPU (Neural Processing Unit).
- the processor (340) can be configured to classify and provide cross-sectional views within a Doppler cardiac ultrasound image using classification models.
- FIG. 3 illustrates a procedure of a method for providing information on Doppler ultrasound image classification according to one embodiment of the present invention.
- FIG. 4 illustrates an example process of replacing a conventional one-hot vector with a label generated by a gyro function for Doppler ultrasound image classification training according to one embodiment of the present invention.
- FIG. 5 is an exemplary diagram illustrating a Doppler cross-sectional distance graph created using each cross-sectional view of a Doppler ultrasound image for Doppler ultrasound image classification according to one embodiment of the present invention.
- FIG. 6 is an exemplary diagram illustrating a gyro function graph for outputting a probability vector of a cross-sectional view according to one embodiment of the present invention.
- FIG. 7 is an exemplary diagram illustrating a gyro function graph for outputting a probability vector reflecting the distance of a cross-sectional view according to one embodiment of the present invention.
- an information provision procedure is as follows.
- a Doppler cardiac ultrasound image of an object is received (S310). Then, a cross-sectional view is extracted using the Doppler ultrasound image as input (S320). The cross-sectional view of the Doppler cardiac ultrasound image is classified using a classification model (S330).
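The three steps above (S310-S330) can be sketched as follows; `classify_doppler_view` and `toy_model` are invented names used as stand-ins, since the source does not specify an implementation.

```python
# Hypothetical sketch of the S310–S330 flow; the function names and the
# toy model are stand-ins, not the patent's actual implementation.
def classify_doppler_view(image, model):
    """Return the most probable cross-sectional view and its probability."""
    probs = model(image)              # S330: probability per candidate view
    best = max(probs, key=probs.get)  # pick the view with the highest probability
    return best, probs[best]

def toy_model(image):
    # Stand-in for a trained cross-section classification model.
    return {"MV inflow PW": 0.7, "AV (LVOT) PW": 0.2, "PV (RVOT) PW": 0.1}

view, p = classify_doppler_view(None, toy_model)
```

The real classification model would replace `toy_model`, but the surrounding control flow stays the same: compute a probability per candidate view, then take the maximum.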
- a Doppler cardiac ultrasound image of a target area (i.e., a heart area) can be received.
- a cardiac ultrasound image in DICOM format can be received.
- the cardiac ultrasound image in DICOM format can include metadata such as coordinates for the Doppler cardiac ultrasound image.
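As a hedged illustration of how coordinates carried in DICOM metadata might be used to crop the Doppler region out of the full frame, the sketch below uses a plain dict in place of a parsed DICOM dataset (in practice a library such as pydicom would read the file); the `RegionLocation` key is an invented stand-in, not an actual DICOM tag name.

```python
# A DICOM dataset carries tagged metadata; a plain dict stands in for a
# parsed dataset here. "RegionLocation" is a hypothetical key, not a real tag.
def extract_doppler_region(ds):
    """Crop the Doppler spectrum region using coordinates from metadata."""
    x0, y0, x1, y1 = ds["RegionLocation"]  # assumed (left, top, right, bottom)
    pixels = ds["PixelData"]
    return [row[x0:x1] for row in pixels[y0:y1]]

ds = {
    "RegionLocation": (1, 1, 3, 3),
    "PixelData": [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9, 10, 11], [12, 13, 14, 15]],
}
region = extract_doppler_region(ds)
```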
- a step (S320) of extracting a cross-sectional diagram by inputting a Doppler ultrasound image is performed.
- the cross-sectional view may be a Doppler mode cross-sectional view of any one of MV inflow PW, AV (LVOT) PW, PV (RVOT) PW, AV (AS) - CW, PV (PS) - CW, MV (MR) - CW, TV (TR) - CW, AV (AR) - CW, PV (PR) - CW, MV (MS) - CW, pulmonary vein, Septal Annulus TDI and Lateral Annulus TDI.
- a step (S330) of classifying a cross-sectional view of a Doppler cardiac ultrasound image using a classification model is performed.
- a probability corresponding to each of a plurality of cross-sectional views of an input Doppler image can be calculated and output by the cross-sectional view classification model.
- the classification model calculates a distance value from a Doppler cross-section distance graph, inputs the calculated distance value into a gyro function to output a matching probability vector, obtains from the output probability vector a probability corresponding to each of a plurality of cross-sections, and determines the cross-section with the highest probability as the cross-section for the Doppler cardiac ultrasound image.
- each of the Doppler cardiac ultrasound images (412a) in the cardiac ultrasound image (412) is input into the cross-section classification model (420).
- the cross-section may include a cross-section for pulsed wave (PW) Doppler mode, a cross-section for continuous wave (CW) Doppler mode, and a cross-section for tissue Doppler imaging (TDI) mode.
- the step of classifying the cross-section may include a step of classifying the cross-section, based on the received image and using a cross-section classification model, as a cross-section for the pulsed wave Doppler mode, a cross-section for the continuous wave Doppler mode, or a cross-section for the tissue Doppler imaging mode.
- a Doppler cardiac ultrasound image includes a baseline that serves as a reference for a Doppler flow direction, and the step of classifying a cross-sectional view according to a Doppler mode of the method for providing information may further include a step of sub-classifying the cross-sectional view according to the Doppler mode based on the position of the baseline.
- when the cross-section is a cross-section for the pulsed wave Doppler mode (PW), the sub-classifying step may further include a step of determining the cross-section as an MV inflow PW when the baseline is located at the top of the Doppler cardiac ultrasound image, or determining the cross-section as an AV (LVOT) PW or a PV (RVOT) PW when the baseline is located at the bottom of the Doppler cardiac ultrasound image.
- when the cross-section is a cross-section for the continuous wave Doppler mode (CW), the sub-classifying step may further include a step of determining the cross-section as at least one of AV (AS) - CW, PV (PS) - CW, MV (MR) - CW, and TV (TR) - CW when the baseline is located at the top of the Doppler cardiac ultrasound image, or determining the cross-section as at least one of AV (AR) - CW, PV (PR) - CW, and MV (MS) - CW when the baseline is located at the bottom of the Doppler cardiac ultrasound image.
- when the cross-section is a cross-section for the tissue Doppler imaging mode (TDI), the sub-classifying step may further include a step of determining the cross-section as at least one of Septal annulus TDI or Lateral annulus TDI.
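The mode-and-baseline rules described above can be collected into a lookup table; this sketch simply mirrors the candidate views listed in the text and is not the model's actual decision logic.

```python
# Candidate views implied by the description: Doppler mode plus baseline
# position narrows the set of possible cross-sections (illustrative only).
CANDIDATES = {
    ("PW", "top"): ["MV inflow PW"],
    ("PW", "bottom"): ["AV (LVOT) PW", "PV (RVOT) PW"],
    ("CW", "top"): ["AV (AS) - CW", "PV (PS) - CW", "MV (MR) - CW", "TV (TR) - CW"],
    ("CW", "bottom"): ["AV (AR) - CW", "PV (PR) - CW", "MV (MS) - CW"],
    # TDI sub-classification does not depend on the baseline in the description.
    ("TDI", "top"): ["Septal Annulus TDI", "Lateral Annulus TDI"],
    ("TDI", "bottom"): ["Septal Annulus TDI", "Lateral Annulus TDI"],
}

def candidate_views(mode, baseline):
    """Return the candidate cross-sections for a (mode, baseline) pair."""
    return CANDIDATES.get((mode, baseline), [])
```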
- TDI tissue Doppler imaging mode
- the information providing method may further include, prior to the receiving step, a step of receiving an ultrasound image in a DICOM (Digital Imaging and Communications in Medicine) format that displays metadata including tagging information for a Doppler cardiac ultrasound image, and a step of obtaining a Doppler cardiac ultrasound image based on the metadata.
- the step of classifying the cross-sectional view may include the step of determining, using a cross-sectional view classification model, at least one Doppler mode cross-sectional view among MV inflow PW, AV (LVOT) PW, PV (RVOT) PW, AV (AS) - CW, PV (PS) - CW, MV (MR) - CW, TV (TR) - CW, AV (AR) - CW, PV (PR) - CW, MV (MS) - CW, pulmonary vein, Septal Annulus TDI and Lateral Annulus TDI for a cardiac ultrasound image.
- a cross-section classification model in the step of classifying a cross-section, can calculate a distance value from a Doppler cross-section distance graph.
- the Doppler cross-section distance graph can be a binary tree, and each cross-section can be arranged in the binary tree according to the degree of interrelationship between the cross-sections.
- the binary tree can be generated through the degree of similarity between each view.
- MV (MS) CW and MV (MR) CW, which contain hemodynamic information targeting the same valve as MV inflow PW, can be placed in the binary tree so that they exhibit relatively smaller distances to MV inflow PW than PV (PR) CW, PV (RVOT) PW, and PV (PS) CW, which target other valves.
- d(yi, yj) can represent the distance between two nodes as follows.
- when the cross-section to be classified is denoted i and all remaining cross-sections are denoted collectively, the distances for i = MV inflow PW are computed as follows: the distance from MV inflow PW to itself is 0; the distance to PV (PR) CW traverses the nodes Up, MV, Atrioventricular Valves, Image, Ventriculoarterial Valves, PV, Up, and PV (PR) CW (the common node is Image), so the distance value is 8; and the distance to TV (TR) CW traverses the nodes Up, MV, Atrioventricular Valves, TV, Down, and TV (TR) CW (the common node is Atrioventricular Valves), so the distance value is 6.
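The distance computation can be illustrated with a parent map over the nodes named above, counting edges up to the common ancestor; the branch labels `MV-Up`, `TV-Down`, and `PV-Up` are invented disambiguations of the repeated Up/Down nodes, not names from the source.

```python
# Parent map for a fragment of the Doppler cross-section distance tree.
# "MV-Up", "TV-Down", "PV-Up" are invented names for the Up/Down branch nodes.
PARENT = {
    "Atrioventricular Valves": "Image",
    "Ventriculoarterial Valves": "Image",
    "MV": "Atrioventricular Valves",
    "TV": "Atrioventricular Valves",
    "PV": "Ventriculoarterial Valves",
    "MV-Up": "MV",
    "TV-Down": "TV",
    "PV-Up": "PV",
    "MV inflow PW": "MV-Up",
    "TV (TR) CW": "TV-Down",
    "PV (PR) CW": "PV-Up",
}

def tree_distance(a, b):
    """Edge count between two nodes via their lowest common ancestor."""
    def ancestors(n):
        path = [n]
        while n in PARENT:
            n = PARENT[n]
            path.append(n)
        return path
    pa, pb = ancestors(a), ancestors(b)
    common = next(x for x in pa if x in pb)   # lowest common ancestor
    return pa.index(common) + pb.index(common)
```

With this fragment, `tree_distance` reproduces the worked values from the text: 0 to itself, 8 to PV (PR) CW, and 6 to TV (TR) CW.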
- the cross-section classification model in the step of classifying the cross-section, can input a distance value calculated from the Doppler cross-section distance graph into a gyro function and output a matching probability vector.
- the distance value derived above can be input into the gyro function to output a matching probability vector.
- the step of generating a label of a cross-sectional view for a Doppler cardiac ultrasound image to train the classification model may further include a step of inputting the calculated distance value into a gyro function and outputting a matching probability vector.
- in the gyro function, lambda (λ) is a hyperparameter value that determines how close the probability is to 0 when the distance is far, and can be a number exceeding 1.
- after passing through the gyro function, the final label is generated through the softmax function so that each value lies between 0 and 1, and the optimal lambda value is the one at which the validation loss is lowest or the validation accuracy is highest.
- the step of generating a label of a cross-section can calculate a probability corresponding to each cross-section by using the output probability vector as input; at this time, the probability corresponding to each cross-section can be calculated as a value between 0 and 1 by using a softmax function.
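Since the exact form of the gyro function is not legible in the source, the sketch below assumes an exponential decay `lam ** (-d)` (with lambda exceeding 1, as described, so far views map toward 0) followed by softmax to produce a distance-aware soft label whose entries lie between 0 and 1 and sum to 1.

```python
import math

def soft_labels(distances, lam=2.0):
    """Distance-aware label: decay each tree distance, then apply softmax.

    The gyro function's exact form is an assumption here: lam**(-d) is used
    purely for illustration, with lam > 1 as the hyperparameter described.
    """
    raw = [lam ** (-d) for d in distances]   # distance 0 -> 1.0; far views -> near 0
    exps = [math.exp(v) for v in raw]
    total = sum(exps)
    return [e / total for e in exps]         # softmax: values in (0, 1), sum to 1

# Distances of 0, 6, and 8 from the true view, as in the worked example.
labels = soft_labels([0, 6, 8])
```

Unlike a one-hot vector, the resulting label keeps a small probability mass on views that are close in the distance tree, so confusions between related views are penalized less than confusions between unrelated ones.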
- the step of classifying the cross-section may include a step of determining at least one Doppler mode cross-section among MV inflow PW, AV (LVOT) PW, PV (RVOT) PW, AV (AS) - CW, PV (PS) - CW, MV (MR) - CW, TV (TR) - CW, AV (AR) - CW, PV (PR) - CW, MV (MS) - CW, pulmonary vein or Septal Annulus TDI, Lateral Annulus TDI, by using a probability corresponding to each cross-section calculated using a cross-section classification model.
- the learning data of the cross-sectional view classification model of the present invention is not limited to what was described above, and the structure of the cross-sectional view classification model is not limited to what was described above.
- 130 Peripheral interface; 140 I/O subsystem
- Memory; 151 Operating system
- 170 Camera subsystem; 171 Optical sensor
- 330 I/O interface; 340 Processor
Abstract
The present invention relates to a method, implemented by a processor, for providing information on the classification of a Doppler ultrasound image, and to a device using the method, the method comprising the steps of: receiving a Doppler echocardiogram of an object; extracting cross-sectional views using the Doppler echocardiogram as input; and classifying the cross-sectional views of the Doppler echocardiogram, based on the received Doppler echocardiogram, using a cross-sectional view classification model trained to classify cross-sectional views.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| KR1020230102341A KR20250020951A (ko) | 2023-08-04 | 2023-08-04 | 도플러 초음파 영상 분류에 대한 정보 제공 방법 및 이를 이용한 도플러 초음파 영상 분류에 대한 정보 제공용 디바이스 |
| KR10-2023-0102341 | 2023-08-04 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025033804A1 true WO2025033804A1 (fr) | 2025-02-13 |
Family
ID=94534108
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/KR2024/011062 Pending WO2025033804A1 (fr) | 2023-08-04 | 2024-07-29 | Procédé permettant d'obtenir des informations sur le classement d'un échogramme doppler et dispositif permettant d'obtenir des informations sur le classement d'un échogramme doppler à l'aide de celui-ci |
Country Status (2)
| Country | Link |
|---|---|
| KR (1) | KR20250020951A (fr) |
| WO (1) | WO2025033804A1 (fr) |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20060074315A1 (en) * | 2004-10-04 | 2006-04-06 | Jianming Liang | Medical diagnostic ultrasound characterization of cardiac motion |
| KR20140084213A (ko) * | 2011-10-19 | 2014-07-04 | 베라소닉스, 인코포레이티드 | 평면파 송신들을 사용하는 벡터 도플러 이미징을 위한 추정 및 디스플레이 |
| KR20150111697A (ko) * | 2014-03-26 | 2015-10-06 | 삼성전자주식회사 | 초음파 장치 및 초음파 장치의 영상 인식 방법 |
| WO2018136805A1 (fr) * | 2017-01-19 | 2018-07-26 | New York University | Système, procédé et support accessible par ordinateur pour analyse ultrasonore |
| KR20220053737A (ko) * | 2020-10-22 | 2022-05-02 | 단국대학교 산학협력단 | 수정아인슈타인 방법 및 횡방향 초음파 도플러 유속계를 이용하여 하천 총유사량을 산정하는 방법 |
- 2023-08-04: KR KR1020230102341A patent/KR20250020951A/ko active Pending
- 2024-07-29: WO PCT/KR2024/011062 patent/WO2025033804A1/fr active Pending
Also Published As
| Publication number | Publication date |
|---|---|
| KR20250020951A (ko) | 2025-02-11 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24852160; Country of ref document: EP; Kind code of ref document: A1 |