WO2024225785A1 - Method, program, and device for acquiring user-customized neural network model for identifying biological information - Google Patents
- Publication number
- WO2024225785A1 (PCT/KR2024/005625)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- biometric information
- neural network
- network model
- acquired
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording for evaluating the cardiovascular system, e.g. pulse, heart rate, blood pressure or blood flow
- A61B5/021—Measuring pressure in heart or blood vessels
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/316—Modalities, i.e. specific diagnostic methods
- A61B5/318—Heart-related electrical modalities, e.g. electrocardiography [ECG]
- A61B5/346—Analysis of electrocardiograms
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H10/00—ICT specially adapted for the handling or processing of patient-related medical or healthcare data
- G16H10/60—ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/20—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
Definitions
- the present disclosure relates to deep learning technology in the medical field, and more particularly, to a method, program and device for obtaining a user-customized neural network model for identifying biometric information.
- deep learning technology is being utilized in various technical fields.
- deep learning technology is being utilized in various ways in the medical field, which used to rely on specialized and limited personnel such as doctors and researchers to determine the patient's disease. For example, this includes inputting biometric information acquired from the patient in real time into a pre-trained deep learning model to analyze the patient's condition or disease.
- the present disclosure has been made in response to the aforementioned background technology, and aims to provide a method, program and device for obtaining a user-customized neural network model for identifying biometric information.
- a method for obtaining a user-customized neural network model for identifying biometric information comprises the steps of inputting a biometric signal detected from a user into a pre-learned neural network model to predict biometric information related to a state of the user, and when new data about the user is acquired, updating the pre-learned neural network model to be personalized based on the acquired new data, wherein the new data includes biometric information measured from the user in relation to the state.
- the updating step includes a step of calibrating the learned neural network model to be personalized based on the measured biometric information and the biometric signal detected in response to the measured biometric information.
- the calibrating step includes performing supervised learning based on the biometric information measured from the user and the biometric signal detected in response to the biometric information measured from the user.
- the updating step includes, when new data about the user is acquired, adding the acquired new data to a data set acquired by accumulating the acquired new data at a previous point in time, and updating the neural network model to be personalized based on the data set, wherein the data set includes a plurality of biometric information accumulated and measured from the user at the previous point in time.
- the method comprises the steps of identifying biological information of the user, and transmitting the new data to a computing device of another user having biological information similar to the identified biological information, such that the neural network model stored in the computing device of the other user is updated based on the new data.
- the bio-signal includes an electrocardiogram (ECG) signal, and the bio-information includes left ventricular ejection fraction (EF).
- the method comprises the step of outputting information warning the user of heart failure with reduced left ventricular ejection fraction if the left ventricular ejection fraction is below a preset value.
- the updating step includes a step of monitoring a trend of the predicted biometric information, and if a change in the trend of the predicted biometric information is detected, updating the pre-learned neural network model to be personalized based on the acquired new data.
- the updating step includes a step of updating the pre-trained neural network model to be personalized based on the acquired new data, if it is identified that a preset time has elapsed since the previous update time.
- the updating step includes a step of updating the pre-learned neural network model to be personalized based on the acquired new data if the number of times biometric information related to the user's status has been predicted from a previous update point in time is greater than or equal to a preset value.
- a computer program stored in a computer-readable storage medium, wherein the computer program, when executed on one or more processors, performs an operation of obtaining a user-customized neural network model for identifying biometric information, the operation including an operation of inputting a biometric signal detected from the user into a pre-learned neural network model to predict biometric information related to a state of the user, and an operation of updating the pre-learned neural network model to be personalized based on new data obtained about the user, wherein the new data includes biometric information measured from the user in relation to the state.
- a computing device for obtaining a user-customized neural network model for identifying biometric information comprises a memory including program codes, a network unit, a sensing unit for detecting biometric information of a user, and a processor for inputting a biometric signal detected from the user into a pre-learned neural network model to predict biometric information related to a state of the user, and when new data about the user is acquired, updating the pre-learned neural network model to be personalized based on the acquired new data, wherein the new data includes biometric information measured from the user in relation to the state.
- a neural network model for identifying more accurate biometric information about a user by reflecting the characteristics of the user can be obtained.
- a neural network model stored in a computing device can be personalized for the user simply by obtaining biometric information about the user.
- FIG. 1 is an exemplary diagram of a device for obtaining a user-customized neural network model for identifying biometric information according to one embodiment of the present disclosure.
- FIG. 2 is a block diagram of a computing device according to an embodiment of the present disclosure.
- FIG. 3 is a flowchart schematically illustrating a method for obtaining a user-customized neural network model for identifying biometric information according to one embodiment of the present disclosure.
- FIG. 4 is an exemplary diagram illustrating updating a neural network model based on first and second biometric information according to one embodiment of the present disclosure.
- FIG. 5 is a flowchart illustrating updating a neural network model based on first and second biometric information according to one embodiment of the present disclosure.
- FIG. 6 is an exemplary diagram illustrating updating a neural network model based on first biometric information and a plurality of second biometric information included in a data set according to one embodiment of the present disclosure.
- FIG. 7 is a block diagram of a computing device according to another embodiment of the present disclosure.
- components performing different functional roles in the present disclosure can be distinguished as a first component or a second component.
- components that are substantially the same within the technical idea of the present disclosure but should be distinguished for convenience of explanation may also be distinguished as a first component or a second component.
- acquisition as used in this disclosure may be understood to mean not only receiving data via a wired or wireless communication network with an external device or system, but also generating data in an on-device form.
- module or “unit” used in the present disclosure may be understood as a term referring to an independent functional unit that processes computing resources, such as a computer-related entity, firmware, software or a part thereof, hardware or a part thereof, a combination of software and hardware, etc.
- the "module” or “unit” may be a unit composed of a single element, or may be a unit expressed as a combination or set of multiple elements.
- a “module” or “unit” may refer to a hardware element of a computing device or a set thereof, an application program that performs a specific function of software, a processing process implemented through software execution, or a set of instructions for program execution, etc.
- a “module” or “unit” may refer to a computing device itself that constitutes a system, or an application that is executed on a computing device, etc.
- the above-described concept is only an example, and the concept of “module” or “part” may be variously defined within a category understandable to those skilled in the art based on the contents of the present disclosure.
- model used in the present disclosure may be understood as a system implemented using mathematical concepts and language to solve a specific problem, a set of software units to solve a specific problem, or an abstract model regarding a processing process to solve a specific problem.
- a neural network "model” may refer to the entire system implemented as a neural network that has a problem-solving ability through learning. In this case, the neural network may have a problem-solving ability by optimizing parameters connecting nodes or neurons through learning.
- a neural network "model” may include a single neural network, or may include a neural network set in which multiple neural networks are combined.
- data used in the present disclosure may include “images”, signals, etc.
- image used in the present disclosure may refer to multidimensional data composed of discrete image elements.
- image may be understood as a term referring to a digital representation of an object that can be seen with the human eye.
- image may refer to multidimensional data composed of elements corresponding to pixels in a two-dimensional image.
- Image may refer to multidimensional data composed of elements corresponding to voxels in a three-dimensional image.
- FIG. 1 is an exemplary diagram of a device for obtaining a user-customized neural network model (10) for identifying biometric information according to one embodiment of the present disclosure.
- the computing device (100) may be a hardware device or a part of a hardware device that performs comprehensive processing and calculation of data, or may be a software-based computing environment connected to a communication network.
- the computing device (100) may be a server that performs an intensive data processing function and shares resources, or may be a client that shares resources through interaction with a server.
- the computing device (100) may be a cloud system in which a plurality of servers and clients interact to comprehensively process data. Since the above description is only one example related to the type of the computing device (100), the type of the computing device (100) may be configured in various ways within a category understandable to those skilled in the art based on the contents of the present disclosure.
- a computing device (100) may be implemented as a smart watch.
- the present disclosure is not limited thereto, and the computing device (100) may be implemented as various electronic devices such as a desktop, a laptop, a smart phone, a server device, a smart band, a smart ring, etc.
- the computing device (100) may be implemented as a biosignal measuring device that measures bioinformation of a user (1) (e.g., a patient, etc.).
- the computing device (100) will be described assuming a smart watch.
- a pre-learned neural network model (10) may be stored in the computing device (100).
- the neural network model (10) may be a model that has been learned in advance to identify the user's (1) condition, symptoms, disease, biometric information, etc.
- the neural network model (10) may be learned in advance according to the type, purpose of use, etc. of the computing device (100).
- a neural network model (10) may be pre-trained to identify another biometric information of a user (1) based on biometric information obtained from the user (1) by the computing device (100).
- the bio-information may include various bio-signals (e.g., electrocardiogram (ECG), electroencephalogram (EEG), photoplethysmogram (PPG), arterial blood pressure (ABP), central venous pressure (CVP)) obtained from the user (1).
- bio-information may include diseases (e.g., heart failure, etc.) and disease potentials identified for the user (1), and measurements indicating the status of the user (1) (e.g., body temperature, blood pressure, left ventricular ejection fraction (EF), etc.).
- hereinafter, the biometric information acquired by the computing device (100) and input to the neural network model (10) is referred to as first biometric information, and the data (i.e., output data) output by the neural network model (10) using the acquired biometric signal as input data is referred to as second biometric information.
- the first biometric information and the second biometric information may be different types of biometric information.
- the user (1) can identify the status of the user (1) with the first biometric information and the second biometric information.
- the first biometric information may be an electrocardiogram signal
- the second biometric information may be a left ventricular ejection fraction.
- the user (1) can monitor the left ventricular ejection fraction and examine the disease and health status of the user (1) in real time without visiting a specialized medical institution, simply by using the computing device (100).
- the disease and health status of the user (1) may be related to the heart of the user (1).
- the first biometric information may include biometric information obtained from the user (1) based on a non-invasive measurement method.
- the first biometric information may include an electrocardiogram signal, an electroencephalogram signal, a photoplethysmogram (PPG) signal, etc.
- the second biometric information may include biometric information obtained from the user (1) based on an invasive measurement method or biometric information obtained by direct measurement by medical staff.
- the second biometric information may include body temperature, blood pressure (e.g., arterial blood pressure, central venous blood pressure, etc.), left ventricular ejection fraction, etc.
- the neural network model (10) may be trained to identify the second biometric information, which is obtained based on an invasive method that requires intervention by a medical professional or is otherwise difficult to measure, from the first biometric information, which is obtained based on a non-invasive method that can be easily measured from the user (1).
- the user (1) can monitor the second biometric information without visiting a medical institution by only using the computing device (100) and can examine the condition of the user (1) in real time.
- the neural network model (10) may be implemented as a multi-layer perceptron (MLP), a convolutional neural network (CNN), a recurrent neural network (RNN), a generative adversarial network, etc.
- the neural network model (10) may include a plurality of residual blocks that extract potential features of input first bio-information and a classifier that classifies second bio-information or a specific disease based on the extracted potential features.
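- as an illustrative sketch only (not the disclosed embodiment), the residual-block feature extractor with a classifier/regression head described above could be structured as follows; the framework (PyTorch), channel counts, kernel sizes, and the single-value EF head are assumptions:

```python
# Minimal sketch of a residual-block feature extractor with a prediction head for
# mapping an ECG segment to a left ventricular ejection fraction estimate.
# PyTorch, channel counts, kernel sizes, and the regression head are illustrative
# assumptions, not the patented design.
import torch
import torch.nn as nn


class ResidualBlock1D(nn.Module):
    """1-D convolutional residual block extracting latent features of the biosignal."""

    def __init__(self, channels: int, kernel_size: int = 7):
        super().__init__()
        padding = kernel_size // 2
        self.body = nn.Sequential(
            nn.Conv1d(channels, channels, kernel_size, padding=padding),
            nn.BatchNorm1d(channels),
            nn.ReLU(),
            nn.Conv1d(channels, channels, kernel_size, padding=padding),
            nn.BatchNorm1d(channels),
        )
        self.act = nn.ReLU()

    def forward(self, x):
        return self.act(self.body(x) + x)  # skip connection


class EcgToEfModel(nn.Module):
    """Maps a single-lead ECG segment to an EF estimate via residual blocks and a head."""

    def __init__(self, num_blocks: int = 4, channels: int = 32):
        super().__init__()
        self.stem = nn.Conv1d(1, channels, kernel_size=15, stride=2, padding=7)
        self.blocks = nn.Sequential(*[ResidualBlock1D(channels) for _ in range(num_blocks)])
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool1d(1),
            nn.Flatten(),
            nn.Linear(channels, 1),  # regression head; a classifier head could be used instead
        )

    def forward(self, ecg):  # ecg: (batch, 1, samples)
        return self.head(self.blocks(self.stem(ecg)))


if __name__ == "__main__":
    model = EcgToEfModel()
    ecg = torch.randn(1, 1, 5000)   # e.g., 10 s of ECG at 500 Hz
    print(model(ecg).shape)          # torch.Size([1, 1])
```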
- the computing device (100) may be manufactured in a state in which the neural network model (10) is stored.
- the computing device (100) may obtain the neural network model (10) from an external computing device (e.g., a server device, etc.) that is connected to the computing device (100).
- the computing device (100) may update the neural network model (10) stored in the computing device (100) to suit the user (1).
- the neural network model (10) obtained by the computing device (100) may be a model that has been learned in advance based on a preset learning data set.
- the computing device (100) may re-learn the neural network model (10) based on new data obtained from the user (1) and update it to suit the user (1).
- updating the neural network model (10) may be to correct or rectify the neural network model (10) based on the biometric information acquired by the computing device (100).
- it may be a process of personalizing the neural network model (10) to suit the user.
- the computing device (100) may perform a learning process of fine-tuning the neural network model (10) based on the acquired biometric information.
- the computing device (100) can re-train and update the neural network model (10) based on the first biometric information acquired from the user (1). More specifically, if the first biometric information is an electrocardiogram signal and the second biometric information is a left ventricular ejection fraction, when the computing device (100) acquires an electrocardiogram signal of the user (1), it inputs the signal into the neural network model (10) to identify the left ventricular ejection fraction of the user (1), and also trains the neural network model (10) based on the acquired electrocardiogram signal to update the neural network model (10) to fit the user (1).
- the computing device (100) can obtain a personalized neural network model (10) (i.e., a user-tailored neural network model (10)) that more accurately identifies the second biometric information (e.g., left ventricular ejection fraction) of the user (1) based on the first biometric information (e.g., electrocardiogram signal) of the user (1).
- the computing device (100) may also obtain second biometric information measured from the user.
- the second biometric information may be obtained from an external computing device. That is, the second biometric information may be obtained directly from the user (1) by the external computing device, or may be a value calculated based on a biometric signal obtained from the user (1) by the external computing device.
- the second biometric information obtained from the external computing device is different from the second biometric information obtained by inputting the first biometric information into the neural network model (10).
- the second biometric information obtained from the external computing device may be more accurate than the second biometric information output from the neural network model (10) in that it is information directly measured from the user (1).
- the computing device (100) may also update the neural network model (10) based on the first biometric information obtained by directly measuring the user (1) by the computing device (100) together with the obtained second biometric information.
- the computing device (100) can update the neural network model (10) to more accurately identify the second biometric information for the user (1) using the first and second biometric information. For example, if the computing device (100) acquires the left ventricular ejection fraction value for the user (1) from an external device, the computing device (100) can update the neural network model (10) with the acquired left ventricular ejection fraction value and the user's electrocardiogram signal corresponding to the acquired left ventricular ejection fraction.
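- as an illustrative sketch of how the externally measured value could be paired with the corresponding detected signal for such an update, the following assumes timestamped records and a 60-second matching tolerance, all of which are hypothetical details not fixed by the disclosure:

```python
# Minimal sketch of pairing an externally measured EF value with the ECG segment
# detected closest in time, as the update described above requires. The record
# types, timestamp matching, and the 60-second tolerance are illustrative assumptions.
from dataclasses import dataclass
from typing import List, Optional, Tuple


@dataclass
class EcgRecord:
    timestamp: float          # seconds since epoch
    samples: List[float]      # raw ECG samples


@dataclass
class EfMeasurement:
    timestamp: float
    ef_percent: float         # left ventricular ejection fraction, %


def pair_ef_with_ecg(ef: EfMeasurement,
                     ecg_history: List[EcgRecord],
                     tolerance_s: float = 60.0) -> Optional[Tuple[EcgRecord, EfMeasurement]]:
    """Return the (ECG, EF) pair to use for supervised calibration, or None if no
    ECG segment was detected close enough to the EF measurement time."""
    if not ecg_history:
        return None
    closest = min(ecg_history, key=lambda r: abs(r.timestamp - ef.timestamp))
    if abs(closest.timestamp - ef.timestamp) > tolerance_s:
        return None
    return closest, ef
```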
- FIG. 2 is a block diagram of a computing device (100) according to one embodiment of the present disclosure.
- a computing device (100) may include one or more processors (hereinafter, “processors”) (110), a memory (120), a network unit (130), and a sensing unit (140).
- the configuration illustrated in FIG. 2 is only an example, and thus the computing device (100) may further include other components for implementing a computing environment.
- only some of the disclosed configurations may be included in the computing device (100).
- the processor (110) may be understood as a configuration unit including hardware and/or software for performing computing operations.
- the processor (110) may read a computer program to perform data processing for machine learning.
- the processor (110) may process computational processes such as processing of input data for machine learning, feature extraction for machine learning, and error calculation based on backpropagation.
- the processor (110) for performing such data processing may include a central processing unit (CPU), a general purpose graphics processing unit (GPGPU), a tensor processing unit (TPU), an application specific integrated circuit (ASIC), or a field programmable gate array (FPGA).
- the type of the processor (110) described above is only one example, and thus, the type of the processor (110) may be configured in various ways within a range understandable to those skilled in the art based on the contents of the present disclosure.
- the processor (110) is electrically connected to other components of the computing device (100) (i.e., memory (120), network unit (130), and sensing unit (140)) and controls the overall operation of the computing device (100).
- the memory (120) may be understood as a configuration unit including hardware and/or software for storing and managing data processed in the computing device (100). That is, the memory (120) may store any type of data generated or determined by the processor (110) and any type of data received by the network unit (130).
- the memory (120) may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory, a RAM (random access memory), a SRAM (static random access memory), a ROM (read-only memory), an EEPROM (electrically erasable programmable read-only memory), a PROM (programmable read-only memory), a magnetic memory, a magnetic disk, and an optical disk.
- the memory (120) may also include a database system that controls and manages data in a predetermined system.
- the type of memory (120) described above is only an example, and thus the type of memory (120) can be configured in various ways within a range understandable to those skilled in the art based on the contents of the present disclosure.
- the memory (120) can manage, by structuring and organizing, data required for the processor (110) to perform operations, combinations of data, and program codes executable in the processor (110).
- the memory (120) can store a neural network model (10) learned to identify second biometric information based on first biometric information.
- the memory (120) can store program codes that operate to perform learning on the neural network model (10) based on the acquired first and second biometric information, program codes that operate the neural network model (10) to receive time series data and perform inference according to the intended use of the computing device (100), and processed data generated as the program codes are executed.
- the network unit (130) may be understood as a configuration unit that transmits and receives data via any type of known wired and wireless communication system.
- the network unit (130) may perform data transmission and reception using a wired and wireless communication system such as a local area network (LAN), wideband code division multiple access (WCDMA), long term evolution (LTE), wireless broadband internet (WiBro), fifth generation mobile communication (5G), ultrawide-band, ZigBee, radio frequency (RF) communication, wireless LAN, wireless fidelity, near field communication (NFC), or Bluetooth.
- the network unit (130) can receive data required for the processor (110) to perform calculations through wired or wireless communication with any system or any client, etc. In addition, the network unit (130) can transmit data generated through the calculation of the processor (110) through wired or wireless communication with any system or any client, etc. For example, the network unit (130) can receive medical data through communication with a cloud server that performs tasks such as standardization of medical data, a database in a hospital environment, or a computing device, etc. The network unit (130) can transmit output data of the neural network model (10), and intermediate data, processed data, etc. derived from the calculation process of the processor (110) through communication with the aforementioned database, server, or computing device, etc.
- the processor (110) can obtain a neural network model (10) learned to identify second biometric information based on first biometric information from an external computing device (e.g., an external server device) through the network unit (130). Additionally, the processor (110) can obtain second biometric information of the user (1) from an external computing device (e.g., a biometric signal measuring device) through the network unit (130).
- the sensing unit (140) can obtain first biometric information about the user (1).
- the sensing unit (140) can obtain first biometric information about the user (1) based on a non-invasive method. That is, according to one embodiment of the present disclosure, the first biometric information may be biometric information measured based on a non-invasive method.
- the sensing unit (140) may include a plurality of electrodes. At this time, the processor (110) may obtain an electrocardiogram signal of the user (1) as the first biometric information through at least one electrode.
- the sensing unit (140) may include an image sensor or an optical sensor. At this time, the processor (110) may obtain a photoplethysmographic blood flow signal of the user (1) as the first biometric information through the image sensor (or optical sensor).
- FIG. 3 is a flowchart schematically illustrating a method for obtaining a user-customized neural network model (10) for identifying biometric information according to one embodiment of the present disclosure.
- the processor (110) inputs a bio-signal detected from a user into a pre-learned neural network model (10) to predict bio-information related to the state (S310).
- the processor (110) can obtain a neural network model (10) stored in the memory (120). At this time, the neural network model (10) can be stored in the memory (120) at the time of manufacturing the computing device (100).
- a plurality of neural network models (10) may be stored in the memory (120) of the computing device (100).
- a plurality of neural network models (10) corresponding to a plurality of users (1) may be stored in advance in the memory (120), and the processor (110) may select and acquire a neural network model (10) corresponding to the user (1) among the plurality of neural network models (10) based on the identification information (e.g., name, resident registration number, etc.) (or biological information) of the user (1).
- for example, if the computing device (100) is a biosignal measuring device installed in a specialized medical institution, a plurality of neural network models (10) corresponding to a plurality of patients who have visited the specialized medical institution may be stored in the computing device (100), and each neural network model (10) may be repeatedly updated based on the first and second biometric information acquired for each patient according to an embodiment of the present disclosure described below and stored in the memory (120).
- the processor (110) may also obtain a neural network model (10) from an external computing device (200) (e.g., a server device, etc.) that is connected to the computing device (100) through the network unit (130).
- the external computing device (200) may be a server device that monitors the status (i.e., disease and health status) of the user (1) of the computing device (100) in conjunction with the computing device (100).
- the external computing device (200) may be a biosignal measuring device that measures second bioinformation of the user (1).
- the processor (110) can detect first bio-information about the user through the sensing unit (140). For example, the processor (110) can detect and obtain a bio-signal as the first bio-information about the user through the sensing unit (140), and at this time, the bio-signal can be an electrocardiogram signal. When a command requesting prediction of the left ventricular ejection fraction of the user (1) is input, the processor (110) can detect and obtain an electrocardiogram signal about the user through the sensing unit (140).
- the processor (110) can input the first biometric information into the neural network model (10) to predict the second biometric information.
- the second biometric information may be related to the status of the user (1).
- the status of the user (1) may be a disease or health status of the user (1).
- the disease or health status of the user (1) may be related to the heart of the user (1).
- the neural network model (10) may be a model learned to predict the second biometric information based on the first biometric information, as described in FIG. 1.
- the neural network model (10) may be learned based on a plurality of learning data sets in which the electrocardiogram signal and the left ventricular ejection fraction corresponding to the electrocardiogram signal are matched. And, the processor (110) inputs an electrocardiogram signal detected from the user (1) into the neural network model (10) to predict the left ventricular ejection fraction of the user (1) related to heart failure with reduced left ventricular ejection fraction.
- the processor (110) may update the pre-learned neural network model to be personalized based on the acquired new data (S320).
- the new data may include biometric information measured from the user in relation to the condition of the user (1) (e.g., disease and health condition).
- the new data may be left ventricular ejection fraction measured from the user in relation to heart failure with reduced left ventricular ejection fraction.
- the processor (110) can obtain second biometric information about the user (1), and perform first learning on the neural network model (10) based on the obtained second biometric information and first biometric information corresponding to the obtained second biometric information, thereby updating the neural network model (10).
- the processor (110) may receive second biometric information from an external computing device (200) through the network unit (130). Alternatively, the processor (110) may also receive second biometric information through a user interface. At this time, while the processor (110) detects and acquires first biometric information about the user (1), the processor (110) may acquire second biometric information from the external computing device (200) through the network unit (130) or through the user interface.
- the second biometric information is biometric information directly measured and acquired from the user (1), and is different from and may be more accurate than second biometric information predicted by inputting first biometric information into the neural network model (10).
- the second biometric information may be the left ventricular ejection fraction measured for the user (1) by the external computing device (200). More specifically, the external computing device (200) may observe the internal structure of the heart of the user (1) through an ultrasound examination, evaluate the size and shape of the left ventricle of the user (1), and measure the left ventricular ejection fraction.
- the present invention is not limited thereto, and the external computing device (200) may measure the left ventricular ejection fraction of the user (1) based on various methods such as the Doppler measurement method and radionuclide tracing methods.
- the left ventricular ejection fraction for the user (1) may be measured by medical staff using the external computing device (200).
- the second biometric information may include biometric information obtained from the user (1) based on an invasive measurement method or biometric information obtained by direct measurement by medical staff.
- the processor (110) can perform first learning on the neural network model (10) based on the acquired second biometric information to update the neural network model (10). That is, the processor (110) can correct the previously learned neural network model (10) to fit (or be personalized to) the user (1) based on the second biometric information.
- the processor (110) can identify the first biometric information (hereinafter, first biometric information corresponding to the second biometric information) acquired through the sensing unit (140) at the time when the second biometric information is acquired (more specifically, at the time when the second biometric information is acquired by the external computing device (200)), and perform first learning on the neural network model (10) using the first and second biometric information.
- the processor (110) can identify an electrocardiogram signal detected from the user (1) when the external computing device (200) measures the left ventricular ejection fraction of the user (1), and perform first learning on the neural network model (10) using the left ventricular ejection fraction acquired from the external computing device (200) and the identified electrocardiogram signal.
- the first learning may be a supervised learning method based on the acquired second biometric information and the acquired first biometric information corresponding to the acquired second biometric information.
- the processor (110) may set the first biometric information corresponding to the second biometric information as input data and the second biometric information as a label, thereby allowing the neural network model (10) to learn the relationship between the first and second biometric information.
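- as an illustrative sketch of one such supervised "first learning" step, the following uses the ECG corresponding to the measured EF as input and the measured EF as the label; the optimizer, learning rate, and MSE loss for a regression-style EF output are assumptions:

```python
# Minimal sketch of a single supervised "first learning" step: the first biometric
# information (ECG) is the input and the measured second biometric information (EF)
# is the label. Optimizer, learning rate, and MSE loss are illustrative assumptions;
# in practice the optimizer state would be kept across steps.
import torch
import torch.nn.functional as F


def first_learning_step(model: torch.nn.Module,
                        ecg: torch.Tensor,        # shape (1, 1, samples)
                        measured_ef: float,
                        lr: float = 1e-4) -> float:
    """Run one fine-tuning step on a single (ECG, measured EF) pair."""
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)
    model.train()
    optimizer.zero_grad()
    predicted_ef = model(ecg)                                # (1, 1)
    target = torch.tensor([[measured_ef]], dtype=predicted_ef.dtype)
    loss = F.mse_loss(predicted_ef, target)                  # supervised calibration loss
    loss.backward()
    optimizer.step()
    return loss.item()
```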
- the processor (110) can continuously obtain first biometric information about the user (1). Then, the processor (110) can perform second learning on the neural network model (10) based on the obtained first biometric information, thereby updating the neural network model (10).
- the processor (110) can continuously obtain first biometric information about the user (1) through the sensing unit (140). For example, the processor (110) can set a plurality of sample points (e.g., 125 points or 250 points) for a preset time (e.g., 1 second) and repeatedly obtain first biometric information about the user (1) for each of the plurality of sample points.
- the processor (110) can obtain the electrocardiogram signal by measuring it at a sampling rate of 500 Hz for 10 seconds through the sensing unit (140). At this time, the processor (110) can process the electrocardiogram signal by removing a 1-second region where the electrocardiogram signal starts and a last 1-second region where the electrocardiogram signal ends in order to remove noise or artifacts included in the electrocardiogram signal.
- the processor (110) continuously obtains the first biometric information of the user (1) through the sensing unit (140), and the processor (110) may obtain a plurality of pieces of first biometric information by dividing the continuous first biometric information based on the set sample points.
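- as an illustrative sketch of the preprocessing and segmentation described above, the following trims a 10-second, 500 Hz ECG by 1 second at each end and splits it into one-second windows of a preset number of sample points; the naive decimation used to reach 125 points per second is an assumption:

```python
# Minimal sketch of the described preprocessing: a 10-second ECG recorded at 500 Hz
# is trimmed by 1 second at the start and end to remove artifacts, then split into
# fixed-length windows. The decimation to 125 points per second is an illustrative
# assumption for the "125 or 250 sample points" example.
import numpy as np


def preprocess_ecg(ecg: np.ndarray,
                   fs: int = 500,
                   trim_s: float = 1.0,
                   points_per_window: int = 125,
                   window_s: float = 1.0) -> np.ndarray:
    """Return an array of shape (num_windows, points_per_window)."""
    trimmed = ecg[int(trim_s * fs): len(ecg) - int(trim_s * fs)]   # drop noisy edges
    step = int(round(fs * window_s / points_per_window))           # decimation factor
    decimated = trimmed[::step]                                     # naive downsampling
    num_windows = len(decimated) // points_per_window
    return decimated[:num_windows * points_per_window].reshape(num_windows, points_per_window)


if __name__ == "__main__":
    raw = np.random.randn(10 * 500)     # 10 s of ECG at 500 Hz
    windows = preprocess_ecg(raw)
    print(windows.shape)                 # (8, 125): eight one-second windows remain
```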
- the processor (110) may periodically obtain first biometric information from an external computing device (200). For example, if the computing device (100) is implemented as a server device, the processor (110) may continuously receive first biometric information obtained from an external biometric signal measuring device that measures first biometric information by coming into contact with the body of the user (1). At this time, the processor (110) may receive first biometric information from the external biometric signal measuring device through the network unit (130).
- the processor (110) can perform second learning on the neural network model (10) based on the acquired first biometric information of the user (1), thereby updating the neural network model (10). That is, the processor (110) can correct the previously learned neural network model (10) to fit the user (1) based on the first biometric information.
- the processor (110) may perform second learning on the neural network model (10) with the acquired first biometric information every time the processor (110) acquires the first biometric information. That is, the second learning may be performed on the neural network model (10) in accordance with the cycle in which the first biometric information is acquired.
- the second learning may be a self-supervised learning method based on the acquired first biometric information.
- the processor (110) may extract latent features from the acquired first biometric information, assign a virtual label to the first biometric information based on the extracted latent features, and then train the neural network model (10) based on the first biometric information to which the virtual label has been assigned.
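- as an illustrative sketch of this self-supervised "second learning" idea, the following assigns a virtual label by nearest-centroid matching on the latent features and trains an auxiliary head against it; the pseudo-labeling scheme, the auxiliary head, and all hyperparameters are assumptions, since the disclosure does not fix a particular scheme:

```python
# Minimal sketch of self-supervised "second learning": latent features are extracted
# from the first biometric information, a virtual label is assigned from those
# features, and the model is trained against that virtual label. Nearest-centroid
# pseudo-labeling and the auxiliary classification head are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


def second_learning_step(encoder: nn.Module,        # ECG -> latent features (B, D)
                         aux_head: nn.Module,        # latent features -> logits over K virtual classes
                         centroids: torch.Tensor,    # (K, D) running cluster centroids
                         ecg: torch.Tensor,          # (1, 1, samples)
                         lr: float = 1e-4) -> float:
    params = list(encoder.parameters()) + list(aux_head.parameters())
    optimizer = torch.optim.SGD(params, lr=lr)

    with torch.no_grad():                             # assign the virtual label without gradients
        features = encoder(ecg)                       # (1, D)
        distances = torch.cdist(features, centroids)  # (1, K)
        virtual_label = distances.argmin(dim=1)       # nearest centroid index

    optimizer.zero_grad()
    logits = aux_head(encoder(ecg))                   # re-encode with gradients enabled
    loss = F.cross_entropy(logits, virtual_label)     # train against the virtual label
    loss.backward()
    optimizer.step()
    return loss.item()
```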
- FIG. 4 is an exemplary diagram showing updating a neural network model (10) based on first and second biometric information according to an embodiment of the present disclosure.
- FIG. 5 is a flowchart showing updating a neural network model (10) based on first and second biometric information according to an embodiment of the present disclosure.
- the first learning may be performed whenever the second biometric information is acquired, and the second learning may be performed whenever the first biometric information is acquired.
- the processor (110) may perform only the first learning without performing the second learning for the neural network model (10). This is to increase the accuracy of the neural network model (10) by not using the same first biometric information for each different learning (first and second learning), but only using it for the first learning that enables more accurate correction of the neural network model (10).
- this is not limited thereto.
- S510 illustrated in FIG. 5 may correspond to S310 illustrated in FIG. 3.
- the processor (110) may perform the first learning and second learning processes in parallel (S522 and S524) by obtaining the first and second biometric information, respectively (S521 and S523).
- the processor (110) may perform second learning on the neural network model (10) with the first biometric information of the user (1) obtained periodically, and may perform first learning on the neural network model (10) with the obtained second biometric information, based on Equations 1 to 3 below. Specifically, the processor (110) may update the neural network model (10) based on the gradient descent method according to Equations 1 to 3 below. At this time, the processor (110) may train the neural network model (10) with a batch size of 1 using the data of time step t.
- in Equations 1 to 3, L is the overall loss function of the neural network model, L_sup is the supervised learning loss function, and L_self is the self-supervised learning loss function, with L = L_sup + L_self.
- L_sup is defined as the cross-entropy loss between the second biometric information predicted from the first biometric information acquired at time step t and the second biometric information actually acquired at time step t.
- L_self can be defined as a loss function that minimizes the Euclidean distance between the result of embedding the input data and the input data itself.
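- as an illustrative sketch of one online update step with batch size 1 combining the supervised and self-supervised terms of Equations 1 to 3, the following assumes an encoder/decoder/classifier decomposition and a discretized EF output to make the cross-entropy term concrete; these are assumptions, not the disclosed formulation:

```python
# Minimal sketch of the online update at time step t with batch size 1: the total
# loss combines a supervised cross-entropy term and a self-supervised Euclidean-
# distance (reconstruction) term. The encoder/decoder/classifier split and the
# discretization of EF into classes are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


def online_update_step(encoder: nn.Module,      # x_t -> embedding z_t
                       decoder: nn.Module,      # z_t -> reconstruction of x_t
                       classifier: nn.Module,   # z_t -> logits over EF classes
                       optimizer: torch.optim.Optimizer,
                       x_t: torch.Tensor,                 # first biometric info at t, shape (1, ...)
                       y_t: torch.Tensor = None           # measured second biometric info (class index), if any
                       ) -> float:
    optimizer.zero_grad()
    z_t = encoder(x_t)

    # Self-supervised term: squared Euclidean distance between the reconstruction of
    # the embedding and the input data (Equation 3, under the decoder assumption).
    loss = F.mse_loss(decoder(z_t), x_t)

    # Supervised term: cross-entropy between the prediction from x_t and the second
    # biometric information actually acquired at t (Equation 2), applied only when
    # a measured value is available.
    if y_t is not None:
        loss = loss + F.cross_entropy(classifier(z_t), y_t)

    loss.backward()
    optimizer.step()
    return loss.item()
```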
- FIG. 6 is an exemplary diagram showing updating a neural network model (10) based on first biometric information and a plurality of second biometric information included in a data set according to one embodiment of the present disclosure.
- the processor (110) may add the acquired new data to a data set accumulated at a previous time and perform first learning on the neural network model (10) based on the data set, thereby updating the neural network model to be personalized.
- the data set may include at least one second biometric information accumulated at a previous time and first biometric information corresponding to at least one second biometric information that is matched and acquired.
- the processor (110) may identify first biometric information acquired at the time when the second biometric information is acquired, and add the second biometric information and the first biometric information to a data set (20) accumulated and acquired at a previous time.
- the data set (20) may include a plurality of second biometric information accumulated and acquired before the time when the second biometric information is newly acquired, and a plurality of first biometric information corresponding to the plurality of second biometric information, respectively. That is, when second biometric information is acquired, the processor (110) may update the data set (20) with the acquired second biometric information and the first biometric information corresponding to the acquired second biometric information.
- the processor (110) may perform first learning on the neural network model (10) based on the data set (20) to update the neural network model (10).
- the processor (110) may perform second learning on the neural network model (10) with the first biometric information of the user (1) obtained periodically based on Equations 4 to 7 below, and may perform first learning on the neural network model (10) with the data set (20) including the plurality of second biometric information obtained. Specifically, the processor (110) may update the neural network model (10) based on the gradient descent method according to Equations 4 to 7 below.
- in Equations 4 to 7, L is the overall loss function of the neural network model, L_sup is the supervised learning loss function, and L_self is the self-supervised learning loss function.
- L_sup is defined as the cross-entropy loss between the second biometric information predicted from the first biometric information acquired at time step t and the second biometric information actually acquired at time step t, computed over the data set (20).
- L_self can be defined as a loss function that minimizes the Euclidean distance between the result of embedding the input data and the input data itself.
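- as an illustrative sketch of accumulating new (first, second) biometric information pairs into the data set (20) and running first learning over the whole accumulated set, the following buffer structure, epoch count, optimizer, and MSE loss are assumptions:

```python
# Minimal sketch of accumulating newly acquired (ECG, measured EF) pairs into the
# data set (20) and fine-tuning over the accumulated set. Buffer structure, epochs,
# optimizer, and the MSE loss for a regression-style EF output are illustrative assumptions.
import torch
import torch.nn.functional as F
from typing import List, Tuple


class CalibrationDataset:
    """Data set (20): pairs accumulated from previous points in time."""

    def __init__(self):
        self.pairs: List[Tuple[torch.Tensor, float]] = []   # (ECG tensor, measured EF)

    def add(self, ecg: torch.Tensor, measured_ef: float) -> None:
        self.pairs.append((ecg, measured_ef))


def first_learning_over_dataset(model: torch.nn.Module,
                                dataset: CalibrationDataset,
                                lr: float = 1e-4,
                                epochs: int = 1) -> None:
    """Fine-tune the model over every accumulated (ECG, measured EF) pair."""
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)
    model.train()
    for _ in range(epochs):
        for ecg, measured_ef in dataset.pairs:
            optimizer.zero_grad()
            target = torch.tensor([[measured_ef]], dtype=torch.float32)
            loss = F.mse_loss(model(ecg), target)
            loss.backward()
            optimizer.step()
```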
- the processor (110) may identify biological information of the user (1) and share a data set (20) related to second biometric information with another computing device (100) of another user (1) having similar biological information to the identified biological information.
- the biological information may include the height, weight, age, gender, etc. of the user (1).
- the processor (110) may obtain the biological information of the other user (1) from the smart watch of the other user (1).
- the processor (110) may identify the similarity between the biological information of a plurality of other users (1) and the biological information of the user (1), and may identify another computing device (100) of another user (1) having similar biological information to the biological information of the user (1) based on the identified similarity.
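- as an illustrative sketch of scoring such similarity between users' biological information, the following uses scaled differences of height, weight, age, and gender; the scaling constants and threshold are assumptions:

```python
# Minimal sketch of a similarity score over biological information (height, weight,
# age, gender) for deciding which users may share data sets. Feature scales and the
# similarity threshold are illustrative assumptions.
import math
from dataclasses import dataclass


@dataclass
class BiologicalInfo:
    height_cm: float
    weight_kg: float
    age_years: float
    is_male: bool


def similarity(a: BiologicalInfo, b: BiologicalInfo) -> float:
    """Return a similarity in (0, 1]; 1.0 means identical after scaling."""
    diffs = (
        (a.height_cm - b.height_cm) / 20.0,     # rough scales chosen for illustration
        (a.weight_kg - b.weight_kg) / 15.0,
        (a.age_years - b.age_years) / 10.0,
        0.0 if a.is_male == b.is_male else 1.0,
    )
    distance = math.sqrt(sum(d * d for d in diffs))
    return 1.0 / (1.0 + distance)


def is_similar(a: BiologicalInfo, b: BiologicalInfo, threshold: float = 0.5) -> bool:
    return similarity(a, b) >= threshold
```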
- the processor (110) can share the data set (20) related to the second biometric information by transferring the acquired data set (20) to the identified other computing device (100).
- the processor (110) can also receive the data set (20) related to the second biometric information of the other user (1) from the other computing device (100) of the other user (1) having similar biological information to the biological information of the user (1).
- the neural network model (10) can be updated to more quickly reflect the characteristics of the user (1) by sharing the acquired data set (20) related to the second biometric information.
- a server device that is linked with a computing device (100) may obtain biological information of a user (1) from the computing device (100) and transmit a neural network model (10) corresponding to the obtained biological information to the computing device (100).
- a plurality of neural network models (10) corresponding to a plurality of users (1) may be stored in the server device.
- each neural network model (10) may correspond to a user (1) group including a plurality of users (1) having similar biological information.
- the server device may select biological information similar to the received biological information and transmit a neural network model (10) corresponding to the selected biological information to the computing device (100).
- the server device may also include the user (1) of the computing device (100) in the user (1) group corresponding to the selected biological information.
- the server device can share second biometric information (or a data set (20) established based on the second biometric information) obtained from each computing device (100) with multiple users (1) included in the same user (1) group.
- the processor (110) can identify biological information of a user (1), cluster the user (1) and a plurality of other users (1) based on the identified biological information, and allocate a data set (20) composed of new data to each clustering group. Specifically, the processor (110) can obtain biological information of a plurality of users (1) from a plurality of user (1) terminal devices that are linked to the computing device (100). At this time, the processor (110) can cluster the plurality of users (1) based on the biological information of the plurality of users (1). Then, the processor (110) can set a data set (20) for each clustering group with a second biosignal obtained from a user (1) terminal device included in each clustering group.
- that is, the processor can collect the obtained biosignals, set a data set (20) for a specific clustering group, and allocate the collected biosignals to that clustering group.
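- as an illustrative sketch of clustering users by biological information and keeping one data set (20) per cluster, the following uses k-means; the algorithm choice, number of clusters, and plain feature encoding are assumptions, since the disclosure does not name an algorithm:

```python
# Minimal sketch of clustering users by biological information and allocating each
# user's acquired biosignal to the data set of the user's cluster. KMeans and the
# feature encoding are illustrative assumptions.
import numpy as np
from sklearn.cluster import KMeans
from collections import defaultdict


def cluster_users(bio_features: np.ndarray, n_clusters: int = 3) -> np.ndarray:
    """bio_features: (num_users, num_features), e.g., [height, weight, age, gender]."""
    return KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(bio_features)


def allocate_datasets(cluster_ids: np.ndarray, biosignals: list) -> dict:
    """Group each user's acquired biosignal into the data set of the user's cluster."""
    datasets = defaultdict(list)
    for cluster_id, signal in zip(cluster_ids, biosignals):
        datasets[int(cluster_id)].append(signal)
    return dict(datasets)


if __name__ == "__main__":
    features = np.array([[172, 70, 45, 1], [160, 55, 44, 0],
                         [180, 90, 70, 1], [158, 52, 47, 0]], dtype=float)
    ids = cluster_users(features, n_clusters=2)
    print(allocate_datasets(ids, ["ecg_a", "ecg_b", "ecg_c", "ecg_d"]))
```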
- the computing device (100) may be implemented as a server device or the like that manages the status (e.g., disease and health status) of multiple users (1).
- the processor (110) can input the acquired first biometric information of the user (1) into the neural network model (10) to predict second biometric information about the user (1). This may be performed through the second learning (i.e., self-supervised learning) process described above. Then, the processor (110) monitors the trend of the predicted second biometric information, and if it is detected that the trend of the predicted second biometric information has changed, the processor (110) can acquire the second biometric information from the external computing device (200). Alternatively, the processor (110) can acquire the second biometric information acquired from the external computing device (200) that is accumulated and stored in a memory. Then, the processor (110) can perform first learning on the neural network model (10) based on the acquired second biometric information to update the neural network model (10).
- the processor (110) may determine to perform an update on the neural network model (10) based on the second biometric information. For example, if the processor (110) determines that the similarity between the trend of the second biometric information of the previous period and the trend of the second biometric information of the current period is less than a preset value, it may determine to perform first learning on the neural network model (10) using the second biometric information, identifying that the trend of the second biometric information has changed.
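- as an illustrative sketch of such trend monitoring, the following compares the previous and current windows of predicted EF values and flags an update when their similarity drops below a preset threshold; the window length, mean-based trend summary, and threshold are assumptions:

```python
# Minimal sketch of monitoring the trend of predicted EF values and flagging an
# update when the current window's trend is no longer similar to the previous one.
# Window length, the mean-based summary, and the threshold are illustrative assumptions.
import numpy as np


def trend_changed(predicted_ef: list, window: int = 20, threshold: float = 0.9) -> bool:
    """Return True when the similarity between the previous and current windows of
    predicted EF values falls below the preset threshold."""
    if len(predicted_ef) < 2 * window:
        return False
    prev = np.asarray(predicted_ef[-2 * window:-window], dtype=float)
    curr = np.asarray(predicted_ef[-window:], dtype=float)
    # Similarity defined from the relative difference of window means (illustrative).
    denom = max(abs(prev.mean()), 1e-6)
    sim = 1.0 - abs(curr.mean() - prev.mean()) / denom
    return sim < threshold
```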
- if it is identified that a preset time has elapsed since the previous update time, the processor (110) may update the pre-learned neural network model (10) to be personalized based on the acquired new data.
- the previous update time may be the time at which the update was most recently performed. That is, if the processor (110) identifies that a preset time has passed since the most recently performed update time, the processor (110) may determine to perform an update process for the pre-learned neural network model (10). For example, assuming that the preset time is 7 days, the processor (110) may perform an update process for the neural network model (10) if it is identified that 10 days have passed since the most recently updated neural network model (10).
- if the number of times biometric information related to the user's status has been predicted since the previous update time is greater than or equal to a preset value, the processor (110) may update the pre-learned neural network model (10) to be personalized based on the acquired new data.
- the previous update time may be the time at which the most recent update was performed. For example, assuming that the preset value is 20 times, if the processor (110) identifies that the number of times the second biometric information (e.g., left ventricular ejection fraction) has been predicted through the neural network model (10) from the time at which the neural network model (10) was most recently updated is 25 times, the processor (110) may perform an update process for the neural network model (10).
- when the processor (110) identifies that a preset time has passed since the previous update time, it can further identify whether the number of times biometric information related to the user's status has been predicted since the previous update time is greater than or equal to a preset value, and determine whether to perform an update process for the neural network model (10).
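- as an illustrative sketch of this update-trigger policy, the following fires when a preset time has elapsed since the last update or when the number of predictions since the last update reaches a preset count; the combined policy and the default values (7 days, 20 predictions) follow the examples above but remain assumptions:

```python
# Minimal sketch of the update-trigger policy: update when a preset time has elapsed
# since the last update, or when the number of predictions since the last update
# reaches a preset count. Defaults mirror the examples above but are illustrative.
import time


class UpdateTrigger:
    def __init__(self, interval_s: float = 7 * 24 * 3600, max_predictions: int = 20):
        self.interval_s = interval_s
        self.max_predictions = max_predictions
        self.last_update_time = time.time()
        self.predictions_since_update = 0

    def record_prediction(self) -> None:
        self.predictions_since_update += 1

    def should_update(self) -> bool:
        elapsed = time.time() - self.last_update_time
        return (elapsed >= self.interval_s
                or self.predictions_since_update >= self.max_predictions)

    def mark_updated(self) -> None:
        self.last_update_time = time.time()
        self.predictions_since_update = 0
```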
- when second biometric information is acquired, the processor (110) inputs the first biometric information corresponding to the second biometric information into the neural network model (10), acquires second biometric information as output data, and identifies the error between the output data and the acquired (measured) second biometric information.
- if the identified error is less than a preset value, the processor (110) may determine to stop the update process (e.g., the first learning process) for the neural network model (10). This is to determine that the personalization process of the neural network model (10) for the user (1) is completed, and to save the resources of the computing device (100) required for learning.
- the processor (110) may stop the update process and change the neural network model (10) to a new neural network model to resume the update process.
- the processor (110) inputs the ECG information into the neural network model (10) to identify the left ventricular ejection fraction of the user (1), and predicts heart failure with reduced left ventricular ejection fraction of the user (1) based on the identified left ventricular ejection fraction. For example, the processor (110) calculates a score corresponding to the left ventricular ejection fraction of the user (1) identified through the neural network model (10), and identifies the possibility of heart failure with reduced left ventricular ejection fraction of the user (1) based on the calculated score.
- if the identified left ventricular ejection fraction is below a preset value, the processor (110) identifies that the user (1) has a risk of heart failure with reduced left ventricular ejection fraction, and can output a warning message, etc. through an output interface (e.g., speaker, display, etc.).
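- as an illustrative sketch of this warning step, the following compares the predicted ejection fraction with a cutoff; the 40% cutoff reflects a commonly used clinical definition of reduced ejection fraction, but the disclosure only refers to "a preset value", so it is an assumption:

```python
# Minimal sketch of the warning step: the EF predicted from the user's ECG is
# compared with a preset cutoff and a warning is produced when it falls below it.
# The 40% default is an assumption; the source only refers to "a preset value".
from typing import Optional


def hfref_warning(predicted_ef_percent: float, cutoff_percent: float = 40.0) -> Optional[str]:
    """Return a warning message when reduced-EF heart failure risk is identified."""
    if predicted_ef_percent < cutoff_percent:
        return (f"Warning: estimated left ventricular ejection fraction "
                f"{predicted_ef_percent:.1f}% is below {cutoff_percent:.0f}%. "
                f"Risk of heart failure with reduced ejection fraction.")
    return None
```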
- FIG. 7 is a block diagram of a computing device (400) according to another embodiment of the present disclosure.
- the computing device (400) includes a processor (410), a memory (420), a network unit (430), a sensing unit (440), a display (450), a user interface (460), a camera (470), and a speaker (480).
- the processor (410), the memory (420), the network unit (430), and the sensing unit (440) correspond to the configurations of the processor (110), the memory (120), the network unit (130), and the sensing unit (140) of the computing device (100) illustrated in FIG. 2, and therefore, a detailed description thereof will be omitted.
- the display (450) can display various images.
- the images include both still images and moving images.
- the display (450) can output guide information about activities generated based on the status of the user (1).
- the display (450) can be implemented as various types of displays, such as an LCD (liquid crystal display) panel, an OLED (organic light-emitting diode) display, an LCoS (liquid crystal on silicon) display, a DLP (digital light processing) display, etc.
- the display (450) can also include a driving circuit, a backlight unit, etc., which can be implemented in forms such as an a-Si (amorphous silicon) TFT, an LTPS (low-temperature polysilicon) TFT, an OTFT (organic TFT), etc.
- the display (450) may be implemented as a touch screen by being combined with a touch panel, and in this case, the display (450) may perform the function of an input interface that receives a touch input from a user (1) as well as an output interface that outputs an image through the touch screen.
- the user interface (460) is a component used by the computing device (400) to interact with the user (1), and may include at least one of a touch sensor, a motion sensor, a button, a jog dial, and a switch, but is not limited thereto.
- the processor (410) may receive second biometric information and biological information (occupation, age, gender, etc.) of the user (1) through the user interface (460).
- the camera (470) photographs an object around the user (1) to obtain an image of the object.
- the camera (470) can obtain an image of food consumed by the user (1).
- the processor (410) can determine the nutritional status of the user (1) based on the status of the user (1) and the image of the food consumed by the user (1) and provide recommended diet information as guide information.
- the camera (470) can be implemented as an imaging device such as a CMOS image sensor (CIS) or a CCD (charge-coupled device) image sensor.
- the present invention is not limited thereto, and the camera (470) can be implemented as a camera module having various resolutions capable of photographing a subject.
- the camera (470) can be implemented as a depth camera (for example, an IR depth camera, etc.), a stereo camera, or an RGB camera.
- the speaker (480) is a component that outputs various audio data on which various processing operations such as decoding, amplification, and noise filtering have been performed by an audio processing unit (not shown).
- the speaker (480) can output various notification sounds or voice messages.
- the processor (410) can convert an electrical signal received from an external device into a user's (1) voice and output it through the speaker (480).
- the speaker (480) can output a voice message warning of or suggesting a diagnosis of a disease related to second biometric information identified through a neural network model (10).
Landscapes
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Medical Informatics (AREA)
- Biomedical Technology (AREA)
- Public Health (AREA)
- General Health & Medical Sciences (AREA)
- Pathology (AREA)
- Cardiology (AREA)
- Animal Behavior & Ethology (AREA)
- Molecular Biology (AREA)
- Veterinary Medicine (AREA)
- Primary Health Care (AREA)
- Epidemiology (AREA)
- Physics & Mathematics (AREA)
- Surgery (AREA)
- Biophysics (AREA)
- Heart & Thoracic Surgery (AREA)
- Business, Economics & Management (AREA)
- Data Mining & Analysis (AREA)
- General Business, Economics & Management (AREA)
- Databases & Information Systems (AREA)
- Vascular Medicine (AREA)
- Physiology (AREA)
- Measuring Pulse, Heart Rate, Blood Pressure Or Blood Flow (AREA)
Abstract
Description
The present disclosure relates to deep learning technology in the medical field, and more particularly, to a method, program, and device for obtaining a user-customized neural network model for identifying biometric information.
Recently, with the development of information and communication technology, deep learning technology is being utilized in various technical fields. In particular, deep learning technology is being utilized in various ways in the medical field, which has traditionally relied on specialized and limited personnel such as doctors and researchers to determine a patient's disease. For example, biometric information acquired from a patient in real time may be input into a pre-trained deep learning model to analyze the patient's condition or disease.
Meanwhile, in the case of deep learning models used in the medical field, since they are pre-trained based on preset learning data and then provided to each user, different results may arise for each user or subject (e.g., patient) in actual use. This is because biometric information of the same type or for the same condition can be acquired in different forms or values for each user, owing in part to each user's biological characteristics. Therefore, even for the same deep learning model, a method is required to personalize it for each user by reflecting the user's environment and characteristics.
The present disclosure has been made in response to the aforementioned background, and aims to provide a method, program, and device for obtaining a user-customized neural network model for identifying biometric information.
However, the problems to be solved by the present disclosure are not limited to the problems mentioned above, and other problems not mentioned can be clearly understood based on the description below.
According to an embodiment of the present disclosure for achieving the above-described task, a method for obtaining a user-customized neural network model for identifying biometric information, performed by a computing device including at least one processor, comprises the steps of inputting a biometric signal detected from a user into a pre-learned neural network model to predict biometric information related to a state of the user, and, when new data about the user is acquired, updating the pre-learned neural network model to be personalized based on the acquired new data, wherein the new data includes biometric information measured from the user in relation to the state.
Alternatively, the updating step includes a step of calibrating the pre-learned neural network model to be personalized based on the measured biometric information and the biometric signal detected in correspondence with the measured biometric information.
Alternatively, the calibrating step includes performing supervised learning based on the biometric information measured from the user and the biometric signal detected in correspondence with the biometric information measured from the user.
Alternatively, the updating step includes, when new data about the user is acquired, adding the acquired new data to a data set accumulated up to a previous point in time, and updating the neural network model to be personalized based on the data set, wherein the data set includes a plurality of pieces of biometric information cumulatively measured from the user up to the previous point in time.
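A minimal sketch of the cumulative data-set update described in the preceding paragraph: each newly measured (bio-signal, measured-value) pair is appended to the data accumulated at previous points in time, and the personalization step then trains on the whole set. All class and function names below are illustrative assumptions.

```python
class PersonalDataset:
    """Accumulates (bio-signal, measured biometric value) pairs for one user."""

    def __init__(self):
        self.samples = []  # list of (signal, label) tuples accumulated so far

    def add(self, signal, label):
        self.samples.append((signal, label))

    def __len__(self):
        return len(self.samples)

def update_with_new_data(dataset: PersonalDataset, new_signal, new_label, fine_tune):
    # Add the newly acquired data to the previously accumulated data set ...
    dataset.add(new_signal, new_label)
    # ... and personalize the model on the entire accumulated set.
    fine_tune(dataset.samples)
```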
Alternatively, the method includes the step of identifying biological information of the user, and transmitting the new data to a computing device of another user having biological information similar to the identified biological information, so that the other user's computing device is updated based on the new data.
Alternatively, the bio-signal includes an electrocardiogram (ECG) signal, and the biometric information includes a left ventricular ejection fraction (EF).
Alternatively, the method includes the step of outputting information warning of the user's heart failure with reduced left ventricular ejection fraction if the left ventricular ejection fraction is below a preset value.
Alternatively, the updating step includes a step of monitoring a trend of the predicted biometric information and, if a change in the trend of the predicted biometric information is detected, updating the pre-learned neural network model to be personalized based on the acquired new data.
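One way to read the trend-monitoring condition above is a comparison of recent predictions against earlier ones; the window sizes and the change threshold in the sketch below are illustrative assumptions, not values from the disclosure.

```python
from statistics import mean

def trend_changed(predictions: list[float], window: int = 5,
                  threshold: float = 5.0) -> bool:
    """Return True when the mean of the most recent predictions deviates from the
    mean of the preceding window by more than `threshold` (e.g., EF percentage points)."""
    if len(predictions) < 2 * window:
        return False
    recent = mean(predictions[-window:])
    previous = mean(predictions[-2 * window:-window])
    return abs(recent - previous) > threshold

# Example: a drop from roughly 55% to 45% across windows would trigger an update.
print(trend_changed([55, 56, 54, 55, 55, 46, 45, 44, 45, 44]))  # True
```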
Alternatively, the updating step includes a step of updating the pre-learned neural network model to be personalized based on the acquired new data if it is identified that a preset time has elapsed since the previous update time.
Alternatively, the updating step includes a step of updating the pre-learned neural network model to be personalized based on the acquired new data if the number of times biometric information related to the user's status has been predicted since the previous update time is greater than or equal to a preset value.
According to an embodiment of the present disclosure for achieving the above-described task, there is provided a computer program stored in a computer-readable storage medium, wherein the computer program, when executed on one or more processors, performs operations for obtaining a user-customized neural network model for identifying biometric information, the operations including an operation of inputting a biometric signal detected from a user into a pre-learned neural network model to predict biometric information related to a state of the user, and an operation of, when new data about the user is acquired, updating the pre-learned neural network model to be personalized based on the acquired new data, wherein the new data includes biometric information measured from the user in relation to the state.
According to an embodiment of the present disclosure for achieving the above-described task, a computing device for obtaining a user-customized neural network model for identifying biometric information comprises a memory including program codes, a network unit, a sensing unit for detecting biometric information of a user, and a processor for inputting a biometric signal detected from the user into a pre-learned neural network model to predict biometric information related to a state of the user and, when new data about the user is acquired, updating the pre-learned neural network model to be personalized based on the acquired new data, wherein the new data includes biometric information measured from the user in relation to the state.
According to a method for obtaining a user-customized neural network model for identifying biometric information according to an embodiment of the present disclosure, a neural network model that identifies more accurate biometric information about a user by reflecting the user's characteristics can be obtained. In addition, a neural network model stored in a computing device can be personalized for the user simply by acquiring biometric information about the user.
FIG. 1 is an exemplary diagram of a device for obtaining a user-customized neural network model for identifying biometric information according to one embodiment of the present disclosure.
FIG. 2 is a block diagram of a computing device according to an embodiment of the present disclosure.
FIG. 3 is a flowchart schematically illustrating a method for obtaining a user-customized neural network model for identifying biometric information according to one embodiment of the present disclosure.
FIG. 4 is an exemplary diagram illustrating updating a neural network model based on first and second biometric information according to one embodiment of the present disclosure.
FIG. 5 is a flowchart illustrating updating a neural network model based on first and second biometric information according to one embodiment of the present disclosure.
FIG. 6 is an exemplary diagram illustrating updating a neural network model based on first biometric information and a plurality of second biometric information included in a data set according to one embodiment of the present disclosure.
FIG. 7 is a block diagram of a computing device according to another embodiment of the present disclosure.
Hereinafter, embodiments of the present disclosure will be described in detail with reference to the attached drawings so that those skilled in the art can easily implement the present disclosure. The embodiments presented in the present disclosure are provided so that those skilled in the art can utilize or implement the contents of the present disclosure. Accordingly, various modifications to the embodiments of the present disclosure will be apparent to those skilled in the art. That is, the present disclosure may be implemented in various different forms and is not limited to the embodiments below.
Throughout the specification of the present disclosure, the same or similar reference numerals refer to the same or similar components. In addition, in order to describe the present disclosure clearly, reference numerals of parts that are not related to the description of the present disclosure may be omitted in the drawings.
The term "or" as used herein is intended to mean an inclusive "or" rather than an exclusive "or." That is, unless otherwise specified herein or unless the meaning is clear from the context, "X employs A or B" should be understood to mean one of the natural inclusive permutations. For example, unless otherwise specified herein or unless the meaning is clear from the context, "X employs A or B" can be interpreted to mean that X employs A, X employs B, or X employs both A and B.
The term "and/or" as used herein should be understood to refer to and include all possible combinations of one or more of the associated listed concepts.
The terms "comprises" and/or "comprising" as used herein should be understood to mean that particular features and/or components are present. However, it should be understood that the terms "comprises" and/or "comprising" do not exclude the presence or addition of one or more other features, other components, and/or combinations thereof.
Unless otherwise specified in this disclosure, or unless the context clearly indicates a singular form, the singular should generally be construed to include "one or more."
The term "Nth (N is a natural number)" used in the present disclosure can be understood as an expression used to distinguish components of the present disclosure from one another according to a predetermined criterion such as a functional viewpoint, a structural viewpoint, or convenience of explanation. For example, components performing different functional roles in the present disclosure may be distinguished as a first component or a second component. However, components that are substantially the same within the technical idea of the present disclosure but need to be distinguished for convenience of explanation may also be distinguished as a first component or a second component.
The term "acquisition" as used in this disclosure may be understood to mean not only receiving data via a wired or wireless communication network with an external device or system, but also generating data in an on-device form.
Meanwhile, the term "module" or "unit" used in the present disclosure may be understood as a term referring to an independent functional unit that processes computing resources, such as a computer-related entity, firmware, software or a part thereof, hardware or a part thereof, or a combination of software and hardware. In this case, a "module" or "unit" may be a unit composed of a single element, or may be a unit expressed as a combination or set of multiple elements. For example, in a narrow sense, a "module" or "unit" may refer to a hardware element of a computing device or a set thereof, an application program that performs a specific function of software, a processing procedure implemented through software execution, or a set of instructions for program execution. In addition, in a broad sense, a "module" or "unit" may refer to a computing device itself constituting a system, or an application executed on a computing device. However, since the above-described concept is only an example, the concept of "module" or "unit" may be defined in various ways within a range understandable to those skilled in the art based on the contents of the present disclosure.
The term "model" used in the present disclosure may be understood as a system implemented using mathematical concepts and language to solve a specific problem, a set of software units for solving a specific problem, or an abstract model of a process for solving a specific problem. For example, a neural network "model" may refer to an overall system implemented as a neural network that has problem-solving capability through learning. In this case, the neural network may acquire problem-solving capability by optimizing, through learning, the parameters connecting its nodes or neurons. A neural network "model" may include a single neural network, or may include a set of neural networks in which multiple neural networks are combined.
"Data" as used in the present disclosure may include "images", signals, etc. The term "image" as used in the present disclosure may refer to multidimensional data composed of discrete image elements. In other words, "image" may be understood as a term referring to a digital representation of an object that can be seen by the human eye. For example, "image" may refer to multidimensional data composed of elements corresponding to pixels in a two-dimensional image, or to multidimensional data composed of elements corresponding to voxels in a three-dimensional image.
The explanations of the terms set forth above are intended to aid understanding of the present disclosure. Therefore, unless the terms set forth above are explicitly stated as matters limiting the contents of the present disclosure, it should be noted that they are not used in a sense that limits the technical ideas of the present disclosure.
FIG. 1 is an exemplary diagram of a device for obtaining a user-customized neural network model (10) for identifying biometric information according to one embodiment of the present disclosure.
The computing device (100) according to one embodiment of the present disclosure may be a hardware device or a part of a hardware device that performs comprehensive processing and computation of data, or may be a software-based computing environment connected to a communication network. For example, the computing device (100) may be a server that performs intensive data processing functions and shares resources, or may be a client that shares resources through interaction with a server. In addition, the computing device (100) may be a cloud system in which a plurality of servers and clients interact to comprehensively process data. Since the above description is only one example of the type of the computing device (100), the type of the computing device (100) may be configured in various ways within a range understandable to those skilled in the art based on the contents of the present disclosure.
Referring to FIG. 1, the computing device (100) according to an embodiment of the present disclosure may be implemented as a smart watch. However, the present disclosure is not limited thereto, and the computing device (100) may be implemented as various electronic devices such as a desktop, a laptop, a smartphone, a server device, a smart band, or a smart ring. In addition, the computing device (100) may be implemented as a biosignal measuring device that measures biometric information of a user (1) (e.g., a patient). In the following, for ease of understanding, the computing device (100) is described assuming that it is a smart watch.
Meanwhile, a pre-learned neural network model (10) may be stored in the computing device (100). The neural network model (10) may be a model trained in advance to identify the user's (1) condition, symptoms, disease, biometric information, etc. In particular, the neural network model (10) may be trained in advance according to the type, purpose of use, etc. of the computing device (100).
According to one embodiment of the present disclosure, the neural network model (10) may be trained in advance so that the computing device (100) identifies other biometric information of the user (1) based on biometric information acquired from the user (1).
Here, the biometric information may include various bio-signals obtained from the user (1) (e.g., an electrocardiogram (ECG) signal, an electroencephalogram (EEG) signal, a photoplethysmogram (PPG), arterial blood pressure (ABP), and central venous pressure (CVP)). In addition, the biometric information may include diseases identified for the user (1) (e.g., heart failure) and disease probabilities, as well as measurements indicating the status of the user (1) (e.g., body temperature, blood pressure, left ventricular ejection fraction (EF), etc.).
Hereinafter, for convenience of explanation, biometric information acquired by the computing device (100) and input to the neural network model (10) is referred to as first biometric information, and the data output by the neural network model (10) (i.e., output data) when the acquired bio-signal is used as input data is referred to as second biometric information.
According to one embodiment of the present disclosure, the first biometric information and the second biometric information may be different types of biometric information. The status of the user (1) can be identified from the first biometric information and the second biometric information. For example, the first biometric information may be an electrocardiogram signal, and the second biometric information may be a left ventricular ejection fraction. Through this, the user (1) can monitor the left ventricular ejection fraction and check the user's (1) disease and health status in real time simply by using the computing device (100), without visiting a specialized medical institution. In particular, the disease and health status of the user (1) may relate to the heart of the user (1).
As another example, the first biometric information may include biometric information obtained from the user (1) based on a non-invasive measurement method. For example, the first biometric information may include an electrocardiogram signal, an electroencephalogram signal, a photoplethysmogram, etc. On the other hand, the second biometric information may include biometric information obtained from the user (1) based on an invasive measurement method, or biometric information obtained by direct measurement by medical staff. For example, the second biometric information may include body temperature, blood pressure (e.g., arterial blood pressure, central venous pressure, etc.), left ventricular ejection fraction, etc. That is, the neural network model (10) may be trained to identify, from the first biometric information obtained based on a non-invasive method that can be easily measured from the user (1), the second biometric information that requires the intervention of a medical professional or is generally difficult to measure. Through this, the user (1) can monitor the second biometric information and check the user's (1) condition in real time simply by using the computing device (100), without visiting a specialized medical institution.
Meanwhile, the neural network model (10) may be implemented as a multi-layer perceptron (MLP), a convolutional neural network (CNN), a recurrent neural network (RNN), a generative adversarial network (GAN), etc. In particular, the neural network model (10) may include a plurality of residual blocks that extract latent features of the input first biometric information and a classifier that classifies the second biometric information or a specific disease based on the extracted latent features.
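The preceding paragraph names residual blocks followed by a classifier/regressor head; a minimal PyTorch sketch of such an architecture is shown below. Layer sizes, kernel widths, and the single-output regression head are assumptions for illustration only, not the architecture of the disclosure.

```python
import torch
import torch.nn as nn

class ResidualBlock1d(nn.Module):
    """A small 1-D residual block for extracting latent features from an ECG signal."""

    def __init__(self, channels: int, kernel_size: int = 7):
        super().__init__()
        padding = kernel_size // 2
        self.conv1 = nn.Conv1d(channels, channels, kernel_size, padding=padding)
        self.conv2 = nn.Conv1d(channels, channels, kernel_size, padding=padding)
        self.bn1 = nn.BatchNorm1d(channels)
        self.bn2 = nn.BatchNorm1d(channels)
        self.relu = nn.ReLU()

    def forward(self, x):
        residual = x
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.relu(out + residual)   # skip connection

class EcgToEfModel(nn.Module):
    """ECG (1 channel) -> latent features via residual blocks -> predicted EF."""

    def __init__(self, channels: int = 32, num_blocks: int = 4):
        super().__init__()
        self.stem = nn.Conv1d(1, channels, kernel_size=15, padding=7)
        self.blocks = nn.Sequential(*[ResidualBlock1d(channels) for _ in range(num_blocks)])
        self.head = nn.Sequential(nn.AdaptiveAvgPool1d(1), nn.Flatten(), nn.Linear(channels, 1))

    def forward(self, ecg):            # ecg: (batch, 1, samples)
        return self.head(self.blocks(self.stem(ecg))).squeeze(-1)
```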
Meanwhile, the computing device (100) may be manufactured with the neural network model (10) already stored. Alternatively, the computing device (100) may obtain the neural network model (10) from an external computing device (e.g., a server device) that works in conjunction with the computing device (100). In this case, the computing device (100) may update the neural network model (10) stored in the computing device (100) to suit the user (1). Specifically, the neural network model (10) obtained by the computing device (100) may be a model trained in advance based on a preset learning data set. The computing device (100) may then re-train the neural network model (10) based on new data acquired from the user (1) and update it to suit the user (1).
Here, updating the neural network model (10) may mean correcting or calibrating the neural network model (10) based on the biometric information acquired by the computing device (100). In particular, it may be a process of personalizing the neural network model (10) to suit the user. For example, the computing device (100) may perform a learning process of fine-tuning the neural network model (10) based on the acquired biometric information.
For example, referring to FIG. 1, when the neural network model (10) is a model trained to identify second biometric information based on first biometric information, the computing device (100) may re-train and update the neural network model (10) based on the first biometric information acquired from the user (1). More specifically, when the first biometric information is an electrocardiogram signal and the second biometric information is a left ventricular ejection fraction, the computing device (100), upon acquiring the electrocardiogram signal of the user (1), not only inputs the signal into the neural network model (10) to identify the left ventricular ejection fraction of the user (1), but also trains the neural network model (10) based on the acquired electrocardiogram signal to update the neural network model (10) to fit the user (1). By repeating this process, the computing device (100) can obtain a personalized neural network model (10) (i.e., a user-customized neural network model (10)) that more accurately identifies the second biometric information (e.g., left ventricular ejection fraction) of the user (1) based on the first biometric information (e.g., electrocardiogram signal) of the user (1).
Meanwhile, referring to FIG. 1, the computing device (100) may also obtain second biometric information measured from the user. Here, the second biometric information may be obtained from an external computing device. That is, the second biometric information may be obtained directly from the user (1) by the external computing device, or may be a value calculated based on a bio-signal that the external computing device acquired from the user (1). The second biometric information obtained from the external computing device is distinct from the second biometric information obtained by inputting the first biometric information into the neural network model (10), and may be more accurate than the model's output in that it is measured directly from the user (1). The computing device (100) may update the neural network model (10) based on the acquired second biometric information together with the first biometric information that the computing device (100) acquired by directly measuring the user (1). In particular, since the acquired second biometric information can serve as labeled training data, the computing device (100) can update the neural network model (10) using the first and second biometric information so as to identify the second biometric information more accurately for the user (1). For example, when the computing device (100) obtains a left ventricular ejection fraction value for the user (1) from an external device, it can update the neural network model (10) with the acquired left ventricular ejection fraction value and the user's electrocardiogram signal corresponding to the acquired left ventricular ejection fraction.
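A minimal sketch of the personalization (fine-tuning) step described above, in which the externally measured ejection fraction serves as the label for the ECG recorded from the same user; the optimizer choice, learning rate, and number of steps are illustrative assumptions rather than values from the disclosure.

```python
import torch
import torch.nn as nn

def personalize(model: nn.Module, ecg: torch.Tensor, measured_ef: float,
                steps: int = 10, lr: float = 1e-4) -> None:
    """Fine-tune the pre-trained model on a single (ECG, measured EF) pair."""
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    target = torch.tensor([measured_ef], dtype=torch.float32)
    model.train()
    for _ in range(steps):
        optimizer.zero_grad()
        predicted_ef = model(ecg.unsqueeze(0))      # shape (1,): predicted EF
        loss = loss_fn(predicted_ef, target)
        loss.backward()
        optimizer.step()
```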
Hereinafter, an embodiment of the present disclosure will be described in detail with reference to FIGS. 2 to 6.
FIG. 2 is a block diagram of a computing device (100) according to one embodiment of the present disclosure.
Referring to FIG. 2, the computing device (100) according to an embodiment of the present disclosure may include one or more processors (hereinafter, processor) (110), a memory (120), a network unit (130), and a sensing unit (140). However, since this configuration is only an example, the computing device (100) may further include other components for implementing a computing environment, and only some of the disclosed components may be included in the computing device (100).
The processor (110) according to one embodiment of the present disclosure may be understood as a unit including hardware and/or software for performing computing operations. For example, the processor (110) may read a computer program and perform data processing for machine learning. The processor (110) may handle computational processes such as processing input data for machine learning, extracting features for machine learning, and calculating errors based on backpropagation. The processor (110) for performing such data processing may include a central processing unit (CPU), a general-purpose graphics processing unit (GPGPU), a tensor processing unit (TPU), an application-specific integrated circuit (ASIC), or a field-programmable gate array (FPGA). Since the types of processor (110) described above are only examples, the type of the processor (110) may be configured in various ways within a range understandable to those skilled in the art based on the contents of the present disclosure.
The processor (110) is electrically connected to the other components of the computing device (100) (i.e., the memory (120), the network unit (130), and the sensing unit (140)) and controls the overall operation of the computing device (100).
The memory (120) according to one embodiment of the present disclosure may be understood as a unit including hardware and/or software for storing and managing data processed in the computing device (100). That is, the memory (120) may store any form of data generated or determined by the processor (110) and any form of data received by the network unit (130). For example, the memory (120) may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory, random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, and an optical disk. In addition, the memory (120) may include a database system that controls and manages data in a predetermined scheme. Since the types of memory (120) described above are only examples, the type of the memory (120) may be configured in various ways within a range understandable to those skilled in the art based on the contents of the present disclosure.
The memory (120) can structure, organize, and manage the data required for the processor (110) to perform operations, combinations of data, and program code executable by the processor (110). For example, the memory (120) may store a neural network model (10) trained to identify second biometric information based on first biometric information. In addition, the memory (120) may store program code that operates to perform learning on the neural network model (10) based on the acquired first and second biometric information, program code that operates the neural network model (10) to receive time-series data and perform inference according to the intended use of the computing device (100), and processed data generated as the program code is executed.
The network unit (130) according to one embodiment of the present disclosure may be understood as a unit that transmits and receives data through any form of known wired or wireless communication system. For example, the network unit (130) may perform data transmission and reception using a wired or wireless communication system such as a local area network (LAN), wideband code division multiple access (WCDMA), long term evolution (LTE), wireless broadband internet (WiBro), fifth-generation mobile communication (5G), ultra-wideband wireless communication, ZigBee, radio frequency (RF) communication, wireless LAN, wireless fidelity (Wi-Fi), near field communication (NFC), or Bluetooth. Since the communication systems described above are only examples, the wired or wireless communication system for data transmission and reception of the network unit (130) may be applied in various ways other than the examples described above.
The network unit (130) can receive the data required for the processor (110) to perform computations through wired or wireless communication with any system or client. In addition, the network unit (130) can transmit data generated through the computations of the processor (110) through wired or wireless communication with any system or client. For example, the network unit (130) can receive medical data through communication with a database in a hospital environment, a cloud server that performs tasks such as standardizing medical data, or a computing device. The network unit (130) can transmit output data of the neural network model (10), as well as intermediate data and processed data derived in the course of the processor's (110) computations, through communication with the aforementioned database, server, or computing device. As an example, the processor (110) may obtain, through the network unit (130), a neural network model (10) trained to identify second biometric information based on first biometric information from an external computing device (e.g., an external server device). In addition, the processor (110) may obtain second biometric information of the user (1) from an external computing device (e.g., a biosignal measuring device) through the network unit (130).
The sensing unit (140) can acquire first biometric information about the user (1). In particular, the sensing unit (140) can acquire the first biometric information about the user (1) in a non-invasive manner. That is, according to one embodiment of the present disclosure, the first biometric information may be biometric information measured by a non-invasive method. As an example, the sensing unit (140) may include a plurality of electrodes. In this case, the processor (110) may acquire an electrocardiogram signal of the user (1) as the first biometric information through at least one electrode. In addition, the sensing unit (140) may include an image sensor or an optical sensor. In this case, the processor (110) may acquire a photoplethysmography signal of the user (1) as the first biometric information through the image sensor (or optical sensor).
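A rough sketch of how the sensing unit's two acquisition paths described above (electrodes for ECG, an optical/image sensor for a photoplethysmography signal) could be exposed to the processor. The interface, method names, and sampling rates are purely hypothetical.

```python
from typing import Protocol, Sequence

class SensingUnit(Protocol):
    """Hypothetical interface over the sensing hardware (electrodes / optical sensor)."""

    def read_ecg(self, seconds: float, sample_rate: int = 250) -> Sequence[float]:
        """Acquire an ECG segment from the electrodes."""
        ...

    def read_ppg(self, seconds: float, sample_rate: int = 100) -> Sequence[float]:
        """Acquire a photoplethysmography segment from the optical/image sensor."""
        ...

def acquire_first_biometric(sensing: SensingUnit, kind: str = "ecg") -> Sequence[float]:
    # Select the non-invasive acquisition path requested by the caller.
    if kind == "ecg":
        return sensing.read_ecg(seconds=10.0)
    if kind == "ppg":
        return sensing.read_ppg(seconds=10.0)
    raise ValueError(f"unsupported signal type: {kind}")
```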
FIG. 3 is a flowchart schematically illustrating a method for obtaining a user-customized neural network model (10) for identifying biometric information according to one embodiment of the present disclosure.
Referring to FIG. 3, the processor (110) first inputs a bio-signal detected from the user into the pre-learned neural network model (10) to predict biometric information related to the user's state (S310).
The processor (110) can obtain the neural network model (10) stored in the memory (120). In this case, the neural network model (10) may be stored in the memory (120) at the time the computing device (100) is manufactured.
In this case, a plurality of neural network models (10) may be stored in the memory (120) of the computing device (100). Specifically, a plurality of neural network models (10) corresponding to a plurality of users (1) (e.g., patients) may be stored in the memory (120) in advance, and the processor (110) may select and obtain, from among the plurality of neural network models (10), the neural network model (10) corresponding to the user (1) based on the identification information (e.g., name, resident registration number, etc.) (or biological information) of the user (1). For example, when the computing device (100) is a biosignal measuring device installed in a specialized medical institution, a plurality of neural network models (10) corresponding to a plurality of patients who have visited that institution may be stored in the computing device (100), and each neural network model (10) may be repeatedly updated based on the first and second biometric information acquired for each patient according to the embodiments of the present disclosure described below, and stored in the memory (120).
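The per-user model selection described above can be sketched as a small registry keyed by the user's identification information; the storage structure and key choice are assumptions for illustration.

```python
import copy

class ModelRegistry:
    """Keeps one personalized model per user, starting from a shared pre-trained base."""

    def __init__(self, base_model):
        self.base_model = base_model
        self.per_user = {}          # user_id -> personalized model

    def get(self, user_id: str):
        # Return the user's personalized model, creating a fresh copy of the
        # pre-trained base model the first time this user is seen.
        if user_id not in self.per_user:
            self.per_user[user_id] = copy.deepcopy(self.base_model)
        return self.per_user[user_id]

    def save(self, user_id: str, model) -> None:
        # Store the updated (re-personalized) model back for this user.
        self.per_user[user_id] = model
```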
Meanwhile, the processor (110) may also obtain the neural network model (10) through the network unit (130) from an external computing device (200) (e.g., a server device) that works in conjunction with the computing device (100).
Here, the external computing device (200) may be a server device that works in conjunction with the computing device (100) to monitor the status (i.e., disease and health status) of the user (1) of the computing device (100). Alternatively, the external computing device (200) may be a biosignal measuring device that measures the second biometric information of the user (1).
The processor (110) can detect first biometric information about the user through the sensing unit (140). As an example, the processor (110) can detect and acquire a bio-signal as the first biometric information about the user through the sensing unit (140), and in this case the bio-signal may be an electrocardiogram signal. When a command requesting prediction of the left ventricular ejection fraction of the user (1) is input, the processor (110) can detect and acquire an electrocardiogram signal of the user through the sensing unit (140).
When the first biometric information is acquired, the processor (110) can input the first biometric information into the neural network model (10) to predict the second biometric information. In particular, the second biometric information may be related to the status of the user (1). Here, the status of the user (1) may be a disease or health status of the user (1), and in particular may be related to the heart of the user (1). As described with reference to FIG. 1, the neural network model (10) may be a model trained to predict the second biometric information based on the first biometric information. For example, if the first biometric information is an electrocardiogram signal and the second biometric information is a left ventricular ejection fraction, the neural network model (10) may be trained based on a plurality of training data sets in which electrocardiogram signals are matched with corresponding left ventricular ejection fractions. The processor (110) can then input the electrocardiogram signal detected from the user (1) into the neural network model (10) to predict the left ventricular ejection fraction of the user (1), which is related to heart failure with reduced ejection fraction.
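The matched-pair training data mentioned above can be organized as a standard supervised dataset. The sketch below assumes the hypothetical `EcgToEfModel` from the earlier architecture sketch and uses illustrative batch size and epoch count; it is not the training procedure of the disclosure.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

def pretrain(model, ecg_segments: torch.Tensor, ef_labels: torch.Tensor,
             epochs: int = 5, batch_size: int = 32) -> None:
    """Train on a data set of ECG segments matched with measured ejection fractions."""
    dataset = TensorDataset(ecg_segments, ef_labels)      # (ECG, EF) pairs
    loader = DataLoader(dataset, batch_size=batch_size, shuffle=True)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = torch.nn.MSELoss()
    model.train()
    for _ in range(epochs):
        for ecg_batch, ef_batch in loader:
            optimizer.zero_grad()
            loss = loss_fn(model(ecg_batch), ef_batch)
            loss.backward()
            optimizer.step()
```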
다시, 도 3을 참조하면, 본 개시의 일 실시 예에 따라 프로세서(110)는 사용자에 대한 새로운 데이터가 획득되면, 획득된 새로운 데이터에 기초하여, 기 학습된 신경망 모델이 개인화 되도록 업데이트 할 수 있다(S320). 여기서, 새로운 데이터는, 사용자(1)의 상태(예를 들어, 질병 및 건강 상태)와 관련하여 사용자로부터 측정된 생체 정보를 포함할 수 있다. 예를 들어, 새로운 데이터는 좌심실 박출률 감소 심부전과 관련하여 사용자로부터 측정된 좌심실 박출률일 수 있다. Again, referring to FIG. 3, according to one embodiment of the present disclosure, when new data about a user is acquired, the processor (110) may update the pre-learned neural network model to be personalized based on the acquired new data (S320). Here, the new data may include biometric information measured from the user in relation to the condition of the user (1) (e.g., disease and health condition). For example, the new data may be left ventricular ejection fraction measured from the user in relation to heart failure with reduced left ventricular ejection fraction.
프로세서(110)는 사용자(1)에 대한 제2 생체 정보를 획득하고, 획득된 제2 생체 정보와, 획득된 제2 생체 정보와 대응하는 제1 생체 정보에 기초하여, 신경망 모델(10)에 대한 제1 학습을 수행하여, 신경망 모델(10)을 업데이트 할 수 있다. The processor (110) can obtain second biometric information about the user (1), and perform first learning on the neural network model (10) based on the obtained second biometric information and first biometric information corresponding to the obtained second biometric information, thereby updating the neural network model (10).
Specifically, the processor (110) may receive the second biometric information from the external computing device (200) through the network unit (130). Alternatively, the processor (110) may receive the second biometric information through a user interface. While detecting and acquiring the first biometric information about the user (1), the processor (110) may acquire the second biometric information from the external computing device (200) through the network unit (130) or through the user interface. The second biometric information is biometric information directly measured from the user (1); it differs from, and may be more accurate than, the second biometric information predicted by inputting the first biometric information into the neural network model (10).
For example, the second biometric information may be a left ventricular ejection fraction measured from the user (1) by the external computing device (200). More specifically, the external computing device (200) may observe the internal structure of the heart of the user (1) through an ultrasound examination and evaluate the size and shape of the left ventricle of the user (1) to measure the left ventricular ejection fraction. However, the measurement is not limited thereto, and the external computing device (200) may measure the left ventricular ejection fraction of the user (1) based on various methods such as Doppler measurement or emission-based tracing. The left ventricular ejection fraction of the user (1) may be measured by medical staff using the external computing device (200).
Meanwhile, as another example, the second biometric information may include biometric information acquired from the user (1) based on an invasive measurement method, or biometric information directly measured by medical staff.
When the second biometric information is acquired, the processor (110) may perform first learning of the neural network model (10) based on the acquired second biometric information to update the neural network model (10). That is, based on the second biometric information, the processor (110) may correct the pre-trained neural network model (10) so that it fits (or is personalized to) the user (1).
Specifically, when the second biometric information is acquired, the processor (110) may identify the first biometric information acquired through the sensing unit (140) at the time when the second biometric information was acquired (more specifically, at the time when the second biometric information was acquired by the external computing device (200)); hereinafter, this is referred to as the first biometric information corresponding to the second biometric information. The processor (110) may then perform first learning of the neural network model (10) using the first and second biometric information. For example, if the first biometric information is an electrocardiogram signal and the second biometric information is a left ventricular ejection fraction, the processor (110) may identify the electrocardiogram signal detected from the user (1) at the time when the external computing device (200) measured the left ventricular ejection fraction of the user (1), and perform first learning of the neural network model (10) using the left ventricular ejection fraction acquired from the external computing device (200) and the identified electrocardiogram signal.
According to an embodiment of the present disclosure, the first learning may be a supervised learning scheme based on the acquired second biometric information and the first biometric information acquired in correspondence with it. Specifically, the processor (110) may set the first biometric information corresponding to the second biometric information as input data and the second biometric information as its label, so that the neural network model (10) learns the relationship between the first and second biometric information.
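As a non-authoritative sketch, one way the first learning could be realized is a single supervised gradient step on the (ECG, measured ejection fraction) pair. The optimizer, the learning rate, and the mean-squared-error criterion for the scalar ejection fraction are assumptions of this example; the cross-entropy formulation given with Equations 1 to 3 below would apply when the second biometric information is treated as a class label.

```python
import torch
import torch.nn as nn

def first_learning_step(model: nn.Module, ecg: torch.Tensor, measured_ef: torch.Tensor,
                        lr: float = 1e-4) -> float:
    """One supervised update: input = first biometric info, label = second biometric info."""
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)
    criterion = nn.MSELoss()          # assumed criterion for a scalar EF target
    model.train()
    optimizer.zero_grad()
    predicted_ef = model(ecg)         # forward pass on the corresponding ECG
    loss = criterion(predicted_ef, measured_ef)
    loss.backward()                   # gradient of the supervised loss
    optimizer.step()                  # personalize the pre-trained model
    return float(loss)

# Hypothetical usage with the EcgEfModel sketch above:
# loss = first_learning_step(model, ecg, torch.tensor([55.0]))
```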
Meanwhile, according to an embodiment of the present disclosure, the processor (110) may continuously acquire first biometric information about the user (1). The processor (110) may then perform second learning of the neural network model (10) based on the acquired first biometric information, thereby updating the neural network model (10).
Specifically, the processor (110) may continuously acquire the first biometric information about the user (1) through the sensing unit (140). For example, the processor (110) may set a plurality of sample points (e.g., 125 or 250 points) within a preset time (e.g., 1 second) and repeatedly acquire the first biometric information about the user (1) at each of the sample points. When the first biometric information is an electrocardiogram signal, the processor (110) may measure and acquire the electrocardiogram signal through the sensing unit (140) at a sampling rate of 500 Hz for 10 seconds. In this case, in order to remove noise or artifacts included in the electrocardiogram signal, the processor (110) may process the electrocardiogram signal by removing the first 1-second region where the signal starts and the last 1-second region where the signal ends.
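A minimal sketch of this trimming step, using the 500 Hz / 10-second figures stated above, might look as follows; the array layout and the NumPy-based implementation are assumptions for illustration.

```python
import numpy as np

SAMPLING_RATE_HZ = 500
RECORD_SECONDS = 10
TRIM_SECONDS = 1  # remove the first and last second to suppress edge noise and artifacts

def trim_ecg(ecg: np.ndarray) -> np.ndarray:
    """Drop the first and last 1-second regions of a 10-second, 500 Hz ECG record."""
    assert ecg.shape[-1] == SAMPLING_RATE_HZ * RECORD_SECONDS
    trim = SAMPLING_RATE_HZ * TRIM_SECONDS
    return ecg[..., trim:-trim]       # 5000 samples -> 4000 samples

raw = np.zeros(SAMPLING_RATE_HZ * RECORD_SECONDS)   # placeholder for a sensed signal
processed = trim_ecg(raw)
print(processed.shape)                               # (4000,)
```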
In addition, the processor (110) may continuously acquire the first biometric information of the user (1) through the sensing unit (140), and may divide the continuous first biometric information based on the set sample points to acquire a plurality of pieces of first biometric information.
Meanwhile, the processor (110) may periodically acquire the first biometric information from an external computing device (200). For example, when the computing device (100) is implemented as a server device, the processor (110) may receive first biometric information continuously acquired from an external bio-signal measuring device that measures the first biometric information in contact with the body of the user (1). In this case, the processor (110) may receive the first biometric information from the external bio-signal measuring device through the network unit (130).
When the first biometric information is acquired, the processor (110) may perform second learning of the neural network model (10) based on the acquired first biometric information of the user (1), thereby updating the neural network model (10). That is, based on the first biometric information, the processor (110) may correct the pre-trained neural network model (10) so that it fits the user (1).
For example, when the processor (110) periodically acquires the first biometric information, the processor (110) may perform second learning of the neural network model (10) with the acquired first biometric information each time it is acquired. That is, the second learning may be performed on the neural network model (10) in accordance with the cycle in which the first biometric information is acquired.
According to an embodiment of the present disclosure, the second learning may be a self-supervised learning scheme based on the acquired first biometric information. Specifically, the processor (110) may extract latent features from the acquired first biometric information, assign a virtual label to the first biometric information based on the extracted latent features, and then train the neural network model (10) on the first biometric information to which the virtual label has been assigned.
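One plausible realization of the second learning, consistent with the Euclidean-distance loss defined with Equation 3 below, is an autoencoder-like step in which the sensed ECG itself serves as the virtual label. The encoder/decoder shapes are assumptions of this example, and in practice the encoder would be shared with the prediction model; a separate module is used here only to keep the sketch short.

```python
import torch
import torch.nn as nn

class EcgAutoencoder(nn.Module):
    """Assumed encoder/decoder used only to illustrate the self-supervised (second) learning."""
    def __init__(self, in_samples: int = 4000, latent_dim: int = 64):
        super().__init__()
        self.encoder = nn.Sequential(nn.Flatten(), nn.Linear(in_samples, latent_dim), nn.ReLU())
        self.decoder = nn.Linear(latent_dim, in_samples)

    def forward(self, ecg: torch.Tensor) -> torch.Tensor:
        z = self.encoder(ecg)                 # latent features of the first biometric info
        return self.decoder(z).unsqueeze(1)   # embedding result reshaped to (batch, 1, samples)

def second_learning_step(model: EcgAutoencoder, ecg: torch.Tensor, lr: float = 1e-4) -> float:
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)
    optimizer.zero_grad()
    reconstruction = model(ecg)
    loss = torch.mean((reconstruction - ecg) ** 2)  # Euclidean-distance style loss to the input
    loss.backward()
    optimizer.step()
    return float(loss)

loss = second_learning_step(EcgAutoencoder(), torch.randn(1, 1, 4000))
```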
FIG. 4 is an exemplary diagram illustrating updating of the neural network model (10) based on the first and second biometric information according to an embodiment of the present disclosure. FIG. 5 is a flowchart illustrating updating of the neural network model (10) based on the first and second biometric information according to an embodiment of the present disclosure.
Meanwhile, referring to FIG. 4, as an example, the first learning may be performed whenever the second biometric information is acquired, and the second learning may be performed whenever the first biometric information is acquired. In this case, when the second biometric information is acquired, the processor (110) may perform only the first learning without performing the second learning on the neural network model (10). This is to increase the accuracy of the neural network model (10) by not using the same first biometric information for two different kinds of learning (the first and second learning), but only for the first learning, which enables a more accurate correction of the neural network model (10). However, the present disclosure is not limited thereto.
Meanwhile, referring to FIG. 5, step S510 illustrated in FIG. 5 may correspond to step S310 illustrated in FIG. 3. Referring to FIG. 5, the processor (110) may perform the first learning and the second learning in parallel (S522 and S524) as the first and second biometric information are respectively acquired (S521 and S523).
In this case, the processor (110) may perform second learning of the neural network model (10) with the periodically acquired first biometric information of the user (1) and first learning of the neural network model (10) with the acquired second biometric information, based on Equations 1 to 3 below. Specifically, the processor (110) may update the neural network model (10) based on gradient descent according to Equations 1 to 3 below. In this case, the processor (110) may train the neural network model (10) with a batch size of 1 using the data of time step t.
<Equation 1>
θ_{t+1} = θ_t − η·∇_θ L(θ_t)

<Equation 2>
L(θ_t) = L_sup(θ_t; x_t, y_t) + L_self(θ_t; x_t)

<Equation 3>
L_sup(θ_t; x_t, y_t) = CE(f_{θ_t}(x_t), y_t),   L_self(θ_t; x_t) = ‖x̂_t − x_t‖²

Here, θ_t may be the parameters of the neural network model at time t, x_t may be the first biometric information acquired at t, and y_t may be the second biometric information acquired at t. L is the overall loss function of the neural network model, L_sup is the supervised learning loss function, and L_self may be the self-supervised learning loss function. L_sup is defined as the cross-entropy loss CE(·,·) between the second biometric information predicted from the first biometric information acquired at t and the second biometric information actually acquired at t, and L_self may be defined as a loss function that minimizes the Euclidean distance between x̂_t, the result of embedding x_t, and the input data x_t. η denotes the learning rate of the gradient descent update.
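To make the per-time-step update concrete, the following sketch applies one gradient-descent step with batch size 1, combining the supervised and self-supervised terms as in Equation 2 when a measured label is available and using only the self-supervised term otherwise (the variant in which only the first learning is applied when a label arrives would simply drop the self-supervised term). The shared-backbone layout, the loss weighting, and the optimizer settings are assumptions for illustration.

```python
import torch
import torch.nn as nn
from typing import Optional

class PersonalizableEcgModel(nn.Module):
    """Assumed model with a shared encoder, an EF head (supervised) and a reconstruction head (self-supervised)."""
    def __init__(self, in_samples: int = 4000, latent_dim: int = 64):
        super().__init__()
        self.encoder = nn.Sequential(nn.Flatten(), nn.Linear(in_samples, latent_dim), nn.ReLU())
        self.ef_head = nn.Linear(latent_dim, 1)
        self.recon_head = nn.Linear(latent_dim, in_samples)

    def forward(self, ecg: torch.Tensor):
        z = self.encoder(ecg)
        return self.ef_head(z).squeeze(-1), self.recon_head(z).unsqueeze(1)

def online_update(model: PersonalizableEcgModel, x_t: torch.Tensor,
                  y_t: Optional[torch.Tensor] = None, lr: float = 1e-4) -> float:
    """One batch-size-1 gradient-descent step: L = L_sup (if y_t given) + L_self."""
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)
    optimizer.zero_grad()
    ef_pred, x_hat = model(x_t)
    loss = torch.mean((x_hat - x_t) ** 2)               # L_self: distance to the input
    if y_t is not None:
        loss = loss + torch.mean((ef_pred - y_t) ** 2)  # L_sup (MSE assumed for a scalar EF)
    loss.backward()
    optimizer.step()
    return float(loss)

model = PersonalizableEcgModel()
online_update(model, torch.randn(1, 1, 4000))                        # second learning only
online_update(model, torch.randn(1, 1, 4000), torch.tensor([48.0]))  # first + second learning
```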
FIG. 6 is an exemplary diagram illustrating updating of the neural network model (10) based on the first biometric information and a plurality of pieces of second biometric information included in a data set according to an embodiment of the present disclosure.
Meanwhile, according to an embodiment of the present disclosure, when new data about the user is acquired, the processor (110) may add the acquired new data to a data set accumulated up to the previous point in time and perform first learning of the neural network model (10) based on the data set, thereby updating the neural network model so that it is personalized. In this case, the data set may include at least one piece of second biometric information accumulated up to the previous point in time, matched with the first biometric information corresponding to that second biometric information.
Specifically, referring to FIG. 6, when second biometric information is newly acquired, the processor (110) may identify the first biometric information acquired at the time when the second biometric information was acquired, and add the second biometric information and the first biometric information to the data set (20) accumulated up to the previous point in time. Here, the data set (20) may include a plurality of pieces of second biometric information accumulated before the time when the second biometric information was newly acquired, and a plurality of pieces of first biometric information respectively corresponding to them. That is, when second biometric information is acquired, the processor (110) may update the data set (20) with the acquired second biometric information and the corresponding first biometric information. The processor (110) may then perform first learning of the neural network model (10) based on the data set (20) to update the neural network model (10).
In this case, the processor (110) may perform second learning of the neural network model (10) with the periodically acquired first biometric information of the user (1) and first learning of the neural network model (10) with the data set (20) including the plurality of pieces of acquired second biometric information, based on Equations 4 to 7 below. Specifically, the processor (110) may update the neural network model (10) based on gradient descent according to Equations 4 to 7 below.
<Equation 4>
θ_{t+1} = θ_t − η·∇_θ L(θ_t)

<Equation 5>
L(θ_t) = L_sup(θ_t; D) + L_self(θ_t; x_t)

<Equation 6>
L_sup(θ_t; D) = Σ_{(x_i, y_i) ∈ D} CE(f_{θ_t}(x_i), y_i)

<Equation 7>
L_self(θ_t; x_t) = ‖x̂_t − x_t‖²

Here, θ_t may be the parameters of the neural network model at time t, x_t may be the first biometric information acquired at t, and y_t may be the second biometric information acquired at t. L is the overall loss function of the neural network model, L_sup is the supervised learning loss function, and L_self may be the self-supervised learning loss function. L_sup is defined as the cross-entropy loss CE(·,·) between the second biometric information predicted from the acquired first biometric information and the second biometric information actually acquired, computed over the pairs in the data set, and L_self may be defined as a loss function that minimizes the Euclidean distance between x̂_t, the result of embedding x_t, and the input data x_t. D may be the data set (20), and η denotes the learning rate of the gradient descent update.
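A short sketch of this accumulate-and-retrain variant is given below; the in-memory list used for the data set (20), the single epoch, and the MSE criterion are assumptions for illustration.

```python
import torch
import torch.nn as nn
from typing import List, Tuple

DataSet20 = List[Tuple[torch.Tensor, torch.Tensor]]   # (ECG, measured EF) pairs

def add_and_retrain(model: nn.Module, data_set: DataSet20,
                    new_ecg: torch.Tensor, new_ef: torch.Tensor,
                    lr: float = 1e-4, epochs: int = 1) -> None:
    """Append the newly measured pair, then run first learning over the whole accumulated set."""
    data_set.append((new_ecg, new_ef))
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)
    for _ in range(epochs):
        for ecg, ef in data_set:                       # replay every accumulated pair
            optimizer.zero_grad()
            loss = torch.mean((model(ecg) - ef) ** 2)  # assumed criterion for a scalar EF
            loss.backward()
            optimizer.step()

# Hypothetical usage with the EcgEfModel sketch above:
# buffer: DataSet20 = []
# add_and_retrain(model, buffer, ecg, torch.tensor([52.0]))
```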
Meanwhile, the processor (110) may identify biological information of the user (1) and share the data set (20) related to the second biometric information with another computing device (100) of another user whose biological information is similar to the identified biological information. Here, the biological information may include the height, weight, age, sex, and the like of the user (1). For example, when the computing device (100) is a smart watch, the processor (110) may acquire the biological information of other users from the smart watches of those users. The processor (110) may identify the similarity between the biological information of a plurality of other users and the biological information of the user (1), and, based on the identified similarity, identify another computing device (100) of another user whose biological information is similar to that of the user (1). The processor (110) may then transmit the acquired data set (20) to the identified other computing device (100), thereby sharing the data set (20) related to the second biometric information. Alternatively, the processor (110) may receive a shared data set (20) related to the second biometric information of another user from the other computing device (100) of that user, whose biological information is similar to that of the user (1). In particular, because the second biometric information is acquired intermittently, being measured by a non-invasive method or by medical staff, the number of such data points may be small. Therefore, for a plurality of users with similar biological information, sharing the acquired data sets (20) related to the second biometric information makes it possible to update the neural network model (10) to reflect the characteristics of the user (1) more quickly.
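A minimal sketch of how such similarity-based sharing partners could be selected is shown below; the feature scaling, the Euclidean similarity measure, and the threshold value are assumptions for illustration.

```python
import numpy as np

def similar_users(me: np.ndarray, others: np.ndarray, threshold: float = 1.0) -> np.ndarray:
    """Return indices of users whose [height, weight, age, sex] vector is close to mine."""
    scale = others.std(axis=0) + 1e-8           # crude per-feature normalization
    dist = np.linalg.norm((others - me) / scale, axis=1)
    return np.where(dist < threshold)[0]         # candidate devices to share data set (20) with

me = np.array([172.0, 70.0, 45.0, 1.0])          # placeholder biological information
others = np.array([[170.0, 68.0, 47.0, 1.0],
                   [155.0, 50.0, 23.0, 0.0]])
print(similar_users(me, others))                 # e.g. [0]
```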
As an example, a server device interworking with the computing device (100) may acquire the biological information of the user (1) from the computing device (100) and transmit a neural network model (10) corresponding to the acquired biological information to the computing device (100). Specifically, a plurality of neural network models (10) corresponding to a plurality of users may be stored in the server device. In particular, each neural network model (10) may correspond to a group of users with similar biological information. Accordingly, when the server device receives the biological information of the user (1) from the computing device (100), it may select biological information similar to the received biological information and transmit the neural network model (10) corresponding to the selected biological information to the computing device (100). In this case, the server device may also include the user (1) of the computing device (100) in the user group corresponding to the selected biological information. For the plurality of users included in the same user group, the server device may share the second biometric information acquired from each computing device (100) (or the data set (20) established based on the second biometric information).
According to an embodiment of the present disclosure, the processor (110) may identify the biological information of the user (1), cluster the user (1) and a plurality of other users based on the identified biological information, and allocate a data set (20) composed of new data to each cluster group. Specifically, the processor (110) may acquire the biological information of a plurality of users from a plurality of user terminal devices interworking with the computing device (100). The processor (110) may then cluster the plurality of users based on their biological information. The processor (110) may establish a data set (20) for each cluster group from the second bio-signals acquired by the user terminal devices included in that cluster group. That is, when second bio-signals are acquired from each of the plurality of user terminal devices included in a specific cluster group, the processor may collect the acquired bio-signals, establish a data set (20) for that cluster group, and allocate it to the cluster group. In this case, the computing device (100) may be implemented as, for example, a server device that manages the states (e.g., disease and health states) of the plurality of users.
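For illustration, one way to form such cluster groups is k-means over the biological-information vectors; the use of scikit-learn, the number of clusters, and the per-cluster dictionary of data sets are assumptions of this sketch.

```python
import numpy as np
from sklearn.cluster import KMeans

# Placeholder biological information: [height, weight, age, sex] per user
users = np.array([[172.0, 70.0, 45.0, 1.0],
                  [170.0, 68.0, 47.0, 1.0],
                  [155.0, 50.0, 23.0, 0.0],
                  [158.0, 52.0, 25.0, 0.0]])

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(users)
cluster_of_user = kmeans.labels_              # cluster group index per user

# One accumulated data set (20) per cluster group, filled as second bio-signals arrive
data_sets = {cluster: [] for cluster in set(cluster_of_user)}
print(cluster_of_user, data_sets)
```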
According to an embodiment of the present disclosure, the processor (110) may input the acquired first biometric information of the user (1) into the neural network model (10) to predict second biometric information about the user (1). This may also be performed through the second learning (i.e., self-supervised learning) process described above. The processor (110) may then monitor the trend of the predicted second biometric information and, when it detects that the trend of the predicted second biometric information has changed, acquire second biometric information from the external computing device (200). Alternatively, the processor (110) may retrieve second biometric information previously acquired from the external computing device (200) and accumulated in the memory. The processor (110) may then perform first learning of the neural network model (10) based on the acquired second biometric information to update the neural network model (10). That is, when the processor (110) detects that the trend or tendency of the second biometric information has changed, it may decide to update the neural network model (10) based on the second biometric information. For example, when the processor (110) identifies that the similarity between the trend of the second biometric information in the previous period and the trend in the current period is less than a preset value, it may identify that the trend of the second biometric information has changed and decide to perform first learning of the neural network model (10) using the second biometric information.
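A small sketch of this trend-change check follows; comparing period means (rather than, for example, correlating the two series) and the similarity threshold are assumptions of the example.

```python
import numpy as np

def trend_changed(prev_period_ef: np.ndarray, curr_period_ef: np.ndarray,
                  min_similarity: float = 0.9) -> bool:
    """Flag a trend change when the two periods of predicted EF values are no longer similar."""
    diff = abs(curr_period_ef.mean() - prev_period_ef.mean())
    similarity = 1.0 / (1.0 + diff)           # crude similarity in (0, 1]
    return similarity < min_similarity         # True -> trigger first learning with measured EF

prev = np.array([55.0, 54.0, 56.0])
curr = np.array([48.0, 47.0, 46.0])
print(trend_changed(prev, curr))               # True: predicted EF trend has shifted downward
```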
In addition, according to an embodiment of the present disclosure, when the processor (110) identifies that a preset time has elapsed since the previous update, the processor (110) may update the pre-trained neural network model so that it is personalized, based on the acquired new data. Here, the previous update may refer to the most recently performed update. That is, when it is identified that the preset time has elapsed since the most recently performed update, the processor (110) may decide to perform the update process on the pre-trained neural network model (10). For example, assuming the preset time is 7 days, the processor (110) may perform the update process on the neural network model (10) when it identifies that 10 days have elapsed since the neural network model (10) was most recently updated.
In addition, according to an embodiment of the present disclosure, when the number of times the biometric information related to the user's state has been predicted since the previous update is greater than or equal to a preset value, the processor (110) may update the pre-trained neural network model (10) so that it is personalized, based on the acquired new data. Here, the previous update may refer to the most recently performed update. For example, assuming the preset value is 20, the processor (110) may perform the update process on the neural network model (10) when it identifies that the second biometric information (e.g., the left ventricular ejection fraction) has been predicted through the neural network model (10) 25 times since the neural network model (10) was most recently updated.
Meanwhile, when the processor (110) identifies that the preset time has elapsed since the previous update, it may further identify whether the number of times the biometric information related to the user's state has been predicted since the previous update is greater than or equal to the preset value, and determine whether to perform the update process on the neural network model (10) accordingly.
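The two triggers above can be combined in a simple scheduling check, sketched below; representing time in days and requiring both conditions (rather than either one) are assumptions of the example.

```python
def should_update(days_since_last_update: float, predictions_since_last_update: int,
                  max_days: float = 7.0, max_predictions: int = 20) -> bool:
    """Decide whether to run the personalization update, per the elapsed-time and usage-count triggers."""
    time_exceeded = days_since_last_update > max_days
    count_exceeded = predictions_since_last_update >= max_predictions
    return time_exceeded and count_exceeded    # the either-or variant would use `or` instead

print(should_update(10.0, 25))   # True  (both the 7-day and 20-prediction thresholds exceeded)
print(should_update(10.0, 5))    # False (time exceeded but usage count not yet reached)
```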
According to an embodiment of the present disclosure, when the second biometric information is acquired, the processor (110) may input the first biometric information corresponding to the second biometric information into the neural network model (10), obtain second biometric information as output data, and identify the error relative to the acquired second biometric information. In this case, if the error between the second biometric information obtained as output data and the second biometric information acquired from the other computing device (100) is less than a preset value, the processor (110) may decide to stop the update process (e.g., the first learning process) for the neural network model (10). This is to determine that the personalization of the neural network model (10) for the user (1) is complete and to save the resources of the computing device (100) consumed by training.
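A one-function sketch of this stop criterion follows; the absolute-error measure and the tolerance value are assumptions of the example.

```python
def personalization_complete(predicted_ef: float, measured_ef: float,
                             tolerance: float = 2.0) -> bool:
    """Stop the first-learning updates once prediction and measurement agree closely enough."""
    return abs(predicted_ef - measured_ef) < tolerance

print(personalization_complete(54.2, 55.0))   # True  -> halt updates, saving device resources
print(personalization_complete(42.0, 55.0))   # False -> keep personalizing the model
```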
In addition, according to an embodiment of the present disclosure, when the processor (110) identifies, based on the acquired first biometric information, that the user (1) has changed, the processor (110) may stop the update process and resume it after replacing the neural network model (10) with a new neural network model.
According to an embodiment of the present disclosure, when the first biometric information includes electrocardiogram (ECG) information and the second biometric information includes a left ventricular ejection fraction (EF), the processor (110) may input the electrocardiogram information into the neural network model (10) to identify the left ventricular ejection fraction of the user (1), and predict heart failure with reduced left ventricular ejection fraction of the user (1) based on the identified left ventricular ejection fraction. For example, the processor (110) may calculate a score corresponding to the left ventricular ejection fraction of the user (1) identified through the neural network model (10), and identify, based on the calculated score, the possibility that heart failure with reduced left ventricular ejection fraction is latent in the user (1). In this case, if the left ventricular ejection fraction is less than a preset value or the score is greater than or equal to a preset value, the processor (110) may identify that the possibility of heart failure with reduced left ventricular ejection fraction of the user (1) is at a dangerous level, and may output a warning message or the like through an output interface (e.g., a speaker, a display, etc.).
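The screening decision described above reduces to a simple threshold check, sketched here; the 40% EF cut-off and the score threshold are assumptions of the example, not values given by the disclosure.

```python
def hfref_alert(ef_percent: float, risk_score: float,
                ef_threshold: float = 40.0, score_threshold: float = 0.8) -> bool:
    """Return True when a reduced-EF heart failure warning should be output to the user."""
    return ef_percent < ef_threshold or risk_score >= score_threshold

if hfref_alert(ef_percent=35.0, risk_score=0.6):
    print("Warning: possible heart failure with reduced ejection fraction; please consult a clinician.")
```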
FIG. 7 is a block diagram of a computing device (400) according to another embodiment of the present disclosure.
Referring to FIG. 7, the computing device (400) includes a processor (410), a memory (420), a network unit (430), a sensing unit (440), a display (450), a user interface (460), a camera (470), and a speaker (480). Among the components illustrated in FIG. 7, the processor (410), memory (420), network unit (430), and sensing unit (440) correspond to the processor (110), memory (120), network unit (130), and sensing unit (140) of the computing device (100) illustrated in FIG. 2, and therefore a detailed description thereof is omitted.
The display (450) may display various images, where the images include both still images and moving images. The display (450) may output guide information about activities generated based on the state of the user (1). The display (450) may be implemented as various types of displays such as a liquid crystal display (LCD) panel, organic light emitting diodes (OLED), liquid crystal on silicon (LCoS), or digital light processing (DLP). The display (450) may also include a driving circuit and a backlight unit, which may be implemented in forms such as an a-Si TFT, a low temperature poly silicon (LTPS) TFT, or an organic TFT (OTFT).
Meanwhile, the display (450) may be combined with a touch panel and implemented as a touch screen. In this case, the display (450) may function not only as an output interface that outputs images through the touch screen but also as an input interface that receives touch input from the user (1).
The user interface (460) is a component used by the computing device (100) to interact with the user (1), and may include at least one of a touch sensor, a motion sensor, a button, a jog dial, and a switch, but is not limited thereto. The processor (410) may receive the second biometric information and the biological information of the user (1) (occupation, age, sex, etc.) through the user interface (460).
The camera (470) photographs objects around the user (1) to acquire images of the objects. Specifically, the camera (470) may acquire images of the food consumed by the user (1). In this case, the processor (410) may determine the nutritional state of the user (1) based on the state of the user (1) and the images of the food consumed by the user (1), and provide recommended diet information as guide information. To this end, the camera (470) may be implemented with an imaging element such as a CMOS image sensor (CIS) or a charge-coupled device (CCD). However, it is not limited thereto, and the camera (470) may be implemented as a camera module of various resolutions capable of photographing a subject. The camera (470) may also be implemented as a depth camera (e.g., an IR depth camera), a stereo camera, or an RGB camera.
The speaker (480) is a component that outputs various audio data on which processing operations such as decoding, amplification, and noise filtering have been performed by an audio processing unit (not shown). The speaker (480) may output various notification sounds or voice messages. According to an embodiment of the present disclosure, the processor (410) may convert an electrical signal received from an external device into a voice for the user (1) and output it through the speaker (480). As an example, the speaker (480) may output a voice message warning of, or suggesting a diagnosis for, a disease related to the second biometric information identified through the neural network model (10).
The various embodiments of the present disclosure described above may be combined with additional embodiments and may be modified within the scope understandable to those skilled in the art in light of the foregoing detailed description. The embodiments of the present disclosure should be understood as illustrative in all respects and not restrictive. For example, each component described as a single unit may be implemented in a distributed manner, and likewise, components described as distributed may be implemented in a combined form. Accordingly, all changes or modifications derived from the meaning and scope of the claims of the present disclosure and their equivalents should be construed as falling within the scope of the present disclosure.