
WO2023004572A1 - Model training method, signal recognition method and apparatus, computing processing device, computer program and computer-readable medium - Google Patents

Model training method, signal recognition method and apparatus, computing processing device, computer program and computer-readable medium

Info

Publication number
WO2023004572A1
Authority
WO
WIPO (PCT)
Prior art keywords
target
model
task model
parameters
task
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/CN2021/108605
Other languages
English (en)
Chinese (zh)
Inventor
张春会
张振中
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BOE Technology Group Co Ltd
Original Assignee
BOE Technology Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BOE Technology Group Co Ltd filed Critical BOE Technology Group Co Ltd
Priority to PCT/CN2021/108605 priority Critical patent/WO2023004572A1/fr
Priority to US17/772,405 priority patent/US20240188895A1/en
Priority to CN202180002006.0A priority patent/CN115885279A/zh
Publication of WO2023004572A1 publication Critical patent/WO2023004572A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Links

Images

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235 Details of waveform analysis
    • A61B5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B5/7267 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods

Definitions

  • the present disclosure relates to the field of computer technology, and in particular to a model training method, a signal recognition method, a device, a computing processing device, a computer program, and a computer readable medium.
  • The electrocardiogram (ECG) is one of the effective examination methods for the clinical diagnosis of cardiovascular diseases. In recent years, the classification and recognition of abnormal ECG signals have received extensive research attention.
  • Classification and recognition methods based on deep learning have the advantage of extracting features automatically. However, deep learning models generally have multiple hidden layers, a deep network structure, and a large number of parameters to be trained, so training such a model to the optimum requires a large amount of training data. When training a multi-class model, each type of ECG abnormality requires a large amount of balanced training data in order to achieve a good classification result.
  • the present disclosure provides a model training method, including:
  • the training sample set includes a sample ECG signal and an abnormal label of the sample ECG signal, and the abnormal label includes a target abnormal label and at least one related abnormal label;
  • the sample ECG signal is input into a multi-task model, and the multi-task model is trained based on a multi-task learning mechanism according to the output of the multi-task model and the abnormal label; wherein the multi-task model includes a target task model and at least one related task model, the target output of the target task model is the target abnormal label of the input sample ECG signal, and the target output of the related task model is the related abnormal label of the input sample ECG signal;
  • the trained target task model is determined as a target abnormality identification model, and the target abnormality identification model is used to identify target abnormalities in ECG signals input to the target abnormality identification model.
  • the step of training the multi-task model based on a multi-task learning mechanism includes: adjusting the parameters of each of the related task models, and adjusting the parameters of the target task model according to the parameters of the at least one related task model.
  • the step of adjusting the parameters of the target task model according to the parameters of the at least one related task model includes:
  • a regularization loss term is determined according to the parameters of the target task model and the parameters of the at least one related task model, and the regularization loss term is used to make the parameters of the target task model tend to be similar to the parameters of the at least one related task model;
  • a first loss value is determined according to the regularization loss term, and the parameters of the target task model are adjusted with the goal of minimizing the first loss value.
  • the step of determining the regularization loss term according to the parameters of the target task model and the parameters of the at least one related task model includes:
  • determining the regularization loss term according to the following formula (given here in the standard soft-parameter-sharing form implied by the symbol definitions below): R(θ1, θ2, ..., θM) = λ Σ_{i=2}^{M} ‖θ1 - θi‖²;
  • where R(θ1, θ2, ..., θM) represents the regularization loss term;
  • M represents the total number of the target task model and the related task models in the multi-task model;
  • θ1 represents the parameters of the target task model;
  • θ2, ..., θM represent the parameters of each of the related task models; and
  • λ represents a preset parameter.
  • the step of adjusting the parameters of each of the relevant task models includes:
  • the step of determining the first loss value according to the regularization loss term also includes:
  • the step of determining the first loss value according to the regularization loss term includes:
  • both the second loss function and the empirical loss function are cross-entropy loss functions.
  • the target task model and the related task model share a common feature extraction layer, and the common feature extraction layer is used to extract common features of the target anomaly and the related anomaly,
  • the step of adjusting the parameters of the target task model according to the parameters of the at least one related task model includes:
  • the step of adjusting the parameters of each of the relevant task models includes:
  • the sample ECG signal is input into the second related task model, the output of the second related task model and the second related abnormal label are input into the preset third loss function to obtain the third loss value, and the parameters of the second related task model are adjusted with the goal of minimizing the third loss value; the parameters of the second related task model include the parameters of the common feature extraction layer, wherein the second related task model is any one of the at least one related task model, and the second related abnormal label is any one of the at least one related abnormal label;
  • the step of adjusting the parameters of the target task model after parameter sharing includes:
  • both the third loss function and the fourth loss function are cross-entropy loss functions.
  • the step of training the multi-task model based on a multi-task learning mechanism includes:
  • each round of iterative training includes the steps of adjusting the parameters of each of the related task models and adjusting the parameters of the target task model according to the parameters of the at least one related task model.
  • the step of adjusting the parameters of each of the relevant task models includes:
  • the step of adjusting the parameters of the target task model according to the parameters of the at least one related task model includes:
  • the present disclosure provides a signal identification method, including:
  • inputting the target ECG signal into the target abnormality identification model to obtain a target abnormality identification result, the target abnormality identification result being used to indicate whether the target ECG signal has a target abnormality; wherein the target abnormality identification model is obtained by the model training method described in any embodiment.
  • the present disclosure provides a model training device, including:
  • the sample acquisition module is configured to obtain a training sample set, the training sample set includes a sample ECG signal and an abnormal label of the sample ECG signal, and the abnormal label includes a target abnormal label and at least one related abnormal label;
  • the model training module is configured to input the sample ECG signal into a multi-task model, and train the multi-task model based on a multi-task learning mechanism according to the output of the multi-task model and the abnormal label; wherein the multi-task model includes a target task model and at least one related task model, the target output of the target task model is the target abnormal label of the input sample ECG signal, and the target output of the related task model is the related abnormal label of the input sample ECG signal;
  • the model determination module is configured to determine the trained target task model as a target abnormality recognition model, and the target abnormality recognition model is used to identify target abnormalities in ECG signals input to the target abnormality recognition model.
  • the present disclosure provides a signal identification device, including:
  • the signal acquisition module is configured to acquire the target ECG signal
  • the abnormality identification module is configured to input the target ECG signal into the target abnormality identification model to obtain a target abnormality identification result, and the target abnormality identification result is used to indicate whether the target ECG signal has a target abnormality; wherein the target abnormality recognition model is trained by using the model training method described in any embodiment.
  • the present disclosure provides a computing processing device, including:
  • one or more processors, and a memory storing computer readable code; when the computer readable code is executed by the one or more processors, the computing processing device executes the method described in any embodiment.
  • the present disclosure provides a computer program comprising computer readable codes which, when run on a computing processing device, cause the computing processing device to execute the method according to any one of the embodiments.
  • the present disclosure provides a computer-readable medium in which the computer program for executing the method described in any embodiment is stored.
  • Fig. 1 schematically shows a flow chart of a model training method
  • Fig. 2 schematically shows a flow chart of a signal identification method
  • FIG. 3 schematically shows another flow chart for training and obtaining a target anomaly recognition model
  • Fig. 4 schematically shows a schematic diagram of a soft parameter sharing multi-task model
  • Fig. 5 schematically shows a kind of two-channel neural network model
  • Fig. 6 schematically shows a schematic diagram of a hard parameter sharing multi-task model
  • Fig. 7 schematically shows a block diagram of a model training device
  • Fig. 8 schematically shows a block diagram of a signal identification device
  • Fig. 9 schematically shows a block diagram of a computing processing device for performing a method according to the present disclosure.
  • Fig. 10 schematically shows a storage unit for holding or carrying program codes for realizing the method according to the present disclosure.
  • Fig. 1 schematically shows a flow chart of a model training method. As shown in Fig. 1, the method may include the following steps.
  • Step S11 Obtain a training sample set.
  • the training sample set includes sample ECG signals and abnormal labels of the sample ECG signals.
  • the abnormal labels include target abnormal labels and at least one related abnormal label.
  • the execution subject of this embodiment may be a computer device, the computer device has a model training device, and the model training method provided in this embodiment is executed by the model training device.
  • the computer device may be, for example, a smart phone, a tablet computer, a personal computer, etc., which is not limited in this embodiment.
  • the execution subject of this embodiment can obtain the training sample set in various ways.
  • the execution subject may obtain the sample ECG signals stored therein from another server (such as a database server) for storing data through a wired connection or a wireless connection.
  • the execution subject may obtain sample ECG signals collected by a signal collection device such as an electrocardiograph, and store these sample ECG signals locally, thereby generating a training sample set.
  • Abnormalities in sample ECG signals may include at least one of: atrial premature beats, premature ventricular beats, supraventricular tachycardia, ventricular tachycardia, atrial flutter, atrial fibrillation, ventricular flutter, ventricular fibrillation, left bundle branch block, right bundle branch block, atrial escape beats, ventricular escape beats, tachycardia, bradycardia, atrioventricular block, ST-segment elevation, ST-segment depression, abnormal Brugada wave, giant R-wave ST-segment elevation, and masquerading bundle branch block.
  • In practice, sample data for abnormalities such as atrial flutter, escape beats, ST-segment elevation, ST-segment depression, abnormal Brugada wave, giant R-wave ST-segment elevation, and masquerading bundle branch block are relatively scarce.
  • the characteristics of the ECG signal may include waveform, peak, amplitude, frequency, time, and so on.
  • Some abnormal ECG signals have commonality or similarity in characteristics, which is reflected, for example, in partially identical waveform characteristics or in identical upper or lower heart rate threshold limits.
  • For example: (1) the lower limit of the heart rate threshold for supraventricular tachycardia, paroxysmal tachycardia, atrial fibrillation, atrial flutter, and atrial tachycardia is 100 beats/min, while the upper limit of the heart rate threshold for abnormalities such as sinus bradycardia, atrioventricular block, sinoatrial block, and bundle branch block is 60 beats/min; (2) atrial flutter has ECG characteristics similar to those of supraventricular tachycardia and sinus tachycardia; (3) masquerading bundle branch block combines the left bundle branch block pattern of the limb-lead ECG with the right bundle branch block pattern of the precordial-lead ECG; (4) the relatively rare abnormal Brugada wave, first reported in North America in 1991, shows the ECG characteristics of right bundle branch block with ST-segment elevation in the right precordial leads; (5) the giant R-wave ST-segment elevation abnormality, with its characteristic waveform, was first proposed by Wimalaratna in 1993.
  • There is a correlation between the target anomaly indicated by the target anomaly label and the related anomaly indicated by the related anomaly label.
  • the target abnormality is atrial flutter
  • the related abnormality is atrial fibrillation.
  • each related abnormality is correlated with the target abnormality.
  • the number of sample electrocardiographic signals with any relevant abnormality in the training sample set may be greater than the number of sample electrocardiographic signals with a target abnormality.
  • the training sample set may include a plurality of sample ECG signals. Assume that these sample ECG signals involve M types of abnormalities, where the M types of abnormalities include the first abnormality, the second abnormality, ..., and the Mth abnormality, and M is greater than or equal to 2. Assuming that the first abnormality is the target abnormality, any one of the second abnormality, the third abnormality, ..., and the Mth abnormality is correlated with the first abnormality and is different from the first abnormality. Therefore, any one of the second abnormality, the third abnormality, ..., and the Mth abnormality may be a related abnormality.
  • When M = 2, the number of related abnormalities is one, that is, the second abnormality; when M ≥ 3, there are multiple related abnormalities, namely the second abnormality, the third abnormality, ..., and the Mth abnormality.
  • the abnormal label of each sample ECG signal can be an M-dimensional vector. For example, if the abnormal label of a certain sample ECG signal is [1, 0, 0, 1, 1, ..., 1], a 1 in the abnormal label means that the sample ECG signal has the corresponding abnormality and a 0 means that it does not; the above abnormal label therefore means that the sample ECG signal has the first abnormality, the fourth abnormality, the fifth abnormality, ..., and the Mth abnormality.
  • the sample ECG signals in the training sample set can be classified according to abnormality type, and the sample ECG signals corresponding to the ith abnormality after classification can include two groups: positive samples with the ith abnormality and negative samples without the ith abnormality.
  • i can be greater than or equal to 1 and less than or equal to M.
  • the number of positive samples and negative samples can be equal or relatively close.
  • the ratio of positive samples to negative samples can be adjusted according to actual needs, which is not limited in this embodiment.
  • the sample ECG signal may be preprocessed before step S12, as shown in FIG. 3, to remove noise interference.
  • a band-pass filter can be used to remove 50 Hz power frequency interference in the sample ECG signal;
  • a low-pass filter can be used to remove 10-300 Hz myoelectric interference in the sample ECG signal;
  • a high-pass filter can be used to remove the baseline drift; and so on.
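  • As a purely illustrative sketch (not part of the original disclosure), the preprocessing described above could be implemented with SciPy as follows; the filter orders, cutoff frequencies, and the sampling rate fs are assumptions and must be adapted to the actual acquisition device:

```python
import numpy as np
from scipy.signal import butter, filtfilt, iirnotch

def preprocess_ecg(signal, fs=500.0):
    """Remove baseline drift, 50 Hz power-frequency interference, and
    high-frequency myoelectric (EMG) noise from a 1-D ECG signal.
    All cutoff values are illustrative, not mandated by the disclosure."""
    signal = np.asarray(signal, dtype=float)
    # High-pass filter (0.5 Hz) to remove baseline drift.
    b, a = butter(2, 0.5, btype="highpass", fs=fs)
    signal = filtfilt(b, a, signal)
    # Notch filter at 50 Hz to remove power-frequency interference.
    b, a = iirnotch(50.0, Q=30.0, fs=fs)
    signal = filtfilt(b, a, signal)
    # Low-pass filter (100 Hz) to attenuate myoelectric interference.
    b, a = butter(4, 100.0, btype="lowpass", fs=fs)
    signal = filtfilt(b, a, signal)
    return signal
```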
  • the sample ECG signals in the training sample set may also be divided into a training set and a test set according to a certain ratio such as 4:1, as shown in FIG. 3 , which is not limited in this embodiment.
  • Step S12 Input the sample ECG signal into the multi-task model, and train the multi-task model based on the multi-task learning mechanism according to the output of the multi-task model and abnormal labels; wherein, the multi-task model includes a target task model and at least one related task model , the target output of the target task model is the target abnormal label of the input sample ECG signal, and the target output of the related task model is the relevant abnormal label of the input sample ECG signal.
  • Step S13 The trained target task model is determined as the target abnormality recognition model, and the target abnormality recognition model is used to identify the target abnormality in the ECG signal input to the target abnormality recognition model.
  • the target task model and each related task model can be neural network models with the same network structure, for example, convolutional neural network (CNN) models or recurrent neural network (RNN) models with the same network structure.
  • the target task model and each related task model can use a long short-term memory network (LSTM, Long Short-Term Memory) in a recurrent neural network model.
  • the target task model and each related task model may also be models with different network structures, which is not limited in this embodiment.
  • Multi-task learning is an important machine learning method that aims to use related tasks to improve the generalization ability of the main task.
  • the relationship between tasks is captured by constraining the relationship between the model parameters of each task, so that the knowledge learned from related tasks with more training data is transferred to the main task with less training data.
  • Multi-task learning imposes certain constraints on the main task, that is, the parameters of the main task model are constrained by the parameters of the related task models during the optimization process, so that when all tasks meet the convergence conditions, the main task model has effectively integrated the knowledge learned by all related task models and the generalization ability of the main task is thereby improved.
  • the target abnormality is correlated with each related abnormality, that is, the electrocardiographic signal with the target abnormality and the electrocardiographic signal with any one of the related abnormalities have characteristics in common. Therefore, the target anomaly identification model used to identify target anomalies and the related anomaly identification model used to identify related anomalies can be trained using a multi-task learning mechanism.
  • the task of training the target abnormality recognition model is the main task, such as task1 in Fig. 4 and Fig. 6.
  • the main task is used to train the target task model in the multi-task model.
  • the target task model is the target anomaly recognition model.
  • the target anomaly recognition model is used to identify whether the ECG signal input to the target anomaly recognition model has the target anomaly, such as the first abnormality.
  • the task of training the related anomaly recognition model is the related task
  • the related task is used to train the related task model
  • the related task model after the training is the related anomaly recognition model.
  • the number of related tasks and related task models is at least one; since the number of related abnormalities is M - 1, correspondingly, the number of related tasks and the number of related task models can both be M - 1.
  • the M-1 related tasks are task2,..., taskM, as shown in Figure 4 and Figure 6.
  • each related task model can be determined as a different related anomaly recognition model.
  • Each associated anomaly identification model can be used to identify different associated anomalies.
  • two methods may be used to perform multi-task learning on the target task model and at least one related task model, namely, hard parameter sharing and soft parameter sharing.
  • the hard parameter sharing is to share the hidden layer of the network between multiple task models, that is, between the relevant task model and the target task model.
  • the parameters of the hidden layer are the same across the multiple task models, while the network output layer of each task model is different so as to perform a different task.
  • Soft parameter sharing means that each task has its own model and parameters, but the parameters of the main task model, that is, the target task model, are constrained by the parameters of the related task models so as to encourage parameter similarity between the target task model and the related task models. Subsequent embodiments will introduce in detail the process of training a multi-task model based on a multi-task learning mechanism.
  • In this way, the parameters of the target task model are constrained by the parameters of the related task models, and the target task model is trained on the basis of the parameters of the related task models, so that the knowledge (that is, the parameters) learned by the related task models with more training data is transferred to the target task model with less training data. Because there are many sample ECG signals with related abnormalities, and the target abnormality is correlated with the related abnormalities, the generalization ability and classification recognition effect of the target abnormality recognition model trained by the multi-task learning mechanism are improved.
  • The model training method provided in this embodiment trains the target task model and the related task models in the multi-task model through a multi-task learning mechanism, so that the target task model with less training data incorporates the knowledge (parameters) learned by the related task models with more training data. The trained target task model is the target anomaly recognition model, so the method can improve the generalization ability and classification performance of the target anomaly recognition model and effectively alleviate the poor classification and recognition performance that would otherwise result from insufficient sample data with the target anomaly.
  • the step of training the multi-task model based on the multi-task learning mechanism in step S12 may specifically include: first adjusting the parameters of each related task model, and then adjusting the parameters of the target task model according to the parameters of the at least one related task model.
  • the step of adjusting the parameters of the target task model according to the parameters of the at least one related task model may include: determining the regularization loss term according to the parameters of the target task model and the parameters of the at least one related task model, the regularization loss term being used to make the parameters of the target task model tend to be similar to the parameters of the at least one related task model; and determining the first loss value according to the regularization loss term, and adjusting the parameters of the target task model with the goal of minimizing the first loss value.
  • the regularization loss term can be determined according to the following formula (given here in the standard soft-parameter-sharing form implied by the symbol definitions below): R(θ1, θ2, ..., θM) = λ Σ_{i=2}^{M} ‖θ1 - θi‖²
  • R(θ1, θ2, ..., θM) represents the regularization loss term;
  • M represents the total number of the target task model and the related task models in the multi-task model, that is, the number of tasks in multi-task training;
  • θ1 represents the parameters of the target task model;
  • θ2, ..., θM represent the parameters of each related task model;
  • λ represents a preset parameter; it is a hyperparameter whose value can be set according to the sample distribution and experience.
  • parameter constraints are imposed on the target task model by adding a regularization loss term to the loss function of the target task model.
  • Since the regularization loss term is determined according to the parameters of the related task models, the parameters of the target task model are constrained by the related task models, so that the knowledge learned by the related task models, which have a large number of sample ECG signals, is transferred to the target task model, improving the classification and recognition performance of the target task model. A minimal sketch of this regularization term is given below.
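  • As a minimal, non-authoritative sketch of this soft-parameter-sharing constraint (assuming PyTorch models whose parameters can be matched position by position because the target and related task models share the same network structure; names such as regularization_term are illustrative):

```python
import torch

def regularization_term(target_model, related_models, lam):
    """Soft parameter sharing regularizer: lambda times the summed squared
    L2 distance between the target-task parameters (theta_1) and the
    parameters of each related task model (theta_2, ..., theta_M)."""
    penalty = 0.0
    for related in related_models:
        for p_target, p_related in zip(target_model.parameters(),
                                       related.parameters()):
            # Related-task parameters act as constants while the target
            # task model is being adjusted, hence the detach().
            penalty = penalty + torch.sum((p_target - p_related.detach()) ** 2)
    return lam * penalty
```

  • The first loss value is then the empirical cross-entropy loss plus this regularization term, and it is minimized with respect to the target task model's parameters only.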
  • step of adjusting the parameters of each related task model in step 122 may specifically include:
  • the sample ECG signal is input into the first related task model, the output of the first related task model and the first related abnormal label are input into the preset second loss function to obtain the second loss value, and the parameters of the first related task model are adjusted with the goal of minimizing the second loss value, wherein the first related task model is any one of the at least one related task model, and the first related abnormal label is any one of the at least one related abnormal label.
  • the first relevant abnormal label is any one of at least one relevant abnormal label of the sample ECG signal input to the first relevant task model.
  • the step of determining the first loss value according to the regularization loss term in step S12 may also include: inputting the sample ECG signal into the target task model, and inputting the output of the target task model and the target abnormal label into the preset empirical loss function to obtain the empirical loss term.
  • the empirical loss function may be a cross-entropy loss function.
  • the step of determining the first loss value according to the regularization loss term may include: calculating the sum of the empirical loss term and the regularization loss term to obtain the first loss value. The target task model can then be trained with the goal of minimizing the first loss value.
  • For example, a convolutional neural network can be used to establish a target task model C1 and M-1 related task models C2, ..., CM, all with the same network structure.
  • the empirical loss term E can be calculated using the cross-entropy loss function.
  • the second loss function of any related task model Ci may likewise be a cross-entropy loss function.
  • N represents the number of sample ECG signals in the training set
  • the inner summation is the loss function of a single sample ECG signal
  • the outer summation is the loss function of all sample ECG signals
  • the summation result is divided by N to obtain the average loss function.
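  • Written out in standard form (consistent with the description above, with t_nk and y_nk as defined next), the averaged cross-entropy loss is: $E = -\frac{1}{N}\sum_{n=1}^{N}\sum_{k} t_{nk}\,\log y_{nk}$.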
  • t_nk is an indicator function: if the true category of the nth sample ECG signal is k, its value is 1; otherwise its value is 0.
  • y_nk represents the output of the network model, that is, the probability that the nth sample ECG signal belongs to abnormality type k.
  • For the target task model C1, t_nk represents the target abnormal label of the nth sample ECG signal: if the nth sample ECG signal has the target abnormality, the value of t_nk is 1, otherwise the value is 0; and y_nk represents the output of the target task model C1, that is, the probability that the nth sample ECG signal has the target abnormality.
  • For the first related task model Ci, t_nk represents the first related abnormal label of the nth sample ECG signal: if the nth sample ECG signal has the first related abnormality, the value of t_nk is 1, otherwise the value is 0; and y_nk represents the output of the first related task model Ci, that is, the probability that the nth sample ECG signal has the first related abnormality.
  • the process of training the neural network model mainly includes: forward propagation to calculate the actual output, backpropagation to calculate the error and optimize the loss function, and use the gradient descent algorithm to update and adjust the model parameters layer by layer.
  • the minimum error and the optimal loss function are obtained through multiple iterative training, and the training of the neural network model is completed.
  • the step of training the multi-task model based on the multi-task learning mechanism may be implemented in various manners.
  • multiple rounds of iterative training can be performed on the multi-task model based on the multi-task learning mechanism; wherein, each round of iterative training includes: adjusting the parameters of each related task model, and according to at least one related task model parameters, a step of adjusting the parameters of the target task model.
  • In each iteration round, for each of the at least one related task model, the sample ECG signal can first be input into that related task model, the output of the related task model and the corresponding related abnormal label can be input into the preset second loss function to obtain the second loss value, and the parameters of the related task model can be adjusted with the goal of minimizing the second loss value.
  • Then the sample ECG signal can be input into the target task model, and the output of the target task model and the target abnormal label can be input into the preset empirical loss function to obtain the empirical loss term.
  • The regularization loss term is determined according to the parameters of the target task model and the adjusted parameters of the at least one (for example, all) related task models; the sum of the empirical loss term and the regularization loss term is calculated to obtain the first loss value; and the parameters of the target task model are adjusted with the goal of minimizing the first loss value, which completes one iteration round (a sketch of one such round is given below). Multiple iteration rounds are carried out in sequence according to the above process until the iteration stop condition is met (for example, the number of iterations reaches a set number, or convergence is reached); the training of the target task model and the related task models is then complete, the trained target task model is determined as the target anomaly identification model, and each trained related task model is determined as a different related anomaly identification model.
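  • A compact sketch of one such iteration round (assuming PyTorch, cross-entropy losses, one optimizer per model, and the regularization_term helper sketched earlier; all names are illustrative):

```python
import torch.nn as nn

def train_one_round(target_model, related_models, target_optimizer,
                    related_optimizers, signals, target_labels,
                    related_labels, lam=0.01):
    """One iteration round of the soft-parameter-sharing scheme: first adjust
    every related task model with its own cross-entropy (second) loss, then
    adjust the target task model with the empirical cross-entropy loss plus
    the regularization term (first loss)."""
    ce = nn.CrossEntropyLoss()

    # Step 1: adjust the parameters of each related task model.
    for model, opt, labels in zip(related_models, related_optimizers,
                                  related_labels):
        opt.zero_grad()
        second_loss = ce(model(signals), labels)
        second_loss.backward()
        opt.step()

    # Step 2: adjust the parameters of the target task model under the
    # parameter constraint imposed by the related task models.
    target_optimizer.zero_grad()
    empirical_loss = ce(target_model(signals), target_labels)
    first_loss = empirical_loss + regularization_term(
        target_model, related_models, lam)
    first_loss.backward()
    target_optimizer.step()
    return first_loss.item()
```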
  • multiple rounds of iterative adjustment can be performed on the parameters of each related task model, until each related task model meets the corresponding training stop condition, and each related task model after training is determined to be different The related anomaly identification models; then the parameters of the target task model can be adjusted according to the parameters of at least one related anomaly identification model.
  • For each related task model, the sample ECG signal can first be input into the related task model, and the output of the first related task model and the first related abnormal label can be input into the preset second loss function to obtain the second loss value; with the goal of minimizing the second loss value, multiple rounds of iterative training are performed on the related task model, and the related abnormality recognition model is obtained by this training.
  • The regularization loss term is then determined according to the parameters of the target task model and the parameters of the at least one (for example, all) related abnormality recognition models; the sum of the empirical loss term and the regularization loss term is calculated to obtain the first loss value; the target task model is trained with the goal of minimizing the first loss value, and the trained target task model is determined as the target anomaly recognition model.
  • The following describes the process of training the multi-task model by adopting the hard parameter sharing method in step S12.
  • FIG. 6 shows a schematic diagram of training a multi-task model by adopting the method of hard parameter sharing.
  • the target task model and the related task model share a common feature extraction layer, which is used to extract the common features of the target anomaly and related anomalies.
  • The step of adjusting the parameters of the target task model may include: sharing the parameters of the common feature extraction layer in the at least one related task model as the parameters of the common feature extraction layer in the target task model, and adjusting the parameters of the target task model after parameter sharing.
  • the target task model and each related task model are dual-channel deep learning models as shown in FIG. 5 .
  • Either one of the target task model and at least one related task model includes a private feature extraction layer and a shared feature extraction layer, the private feature extraction layer is used to extract private features, and the shared feature extraction layer is used to extract shared features.
  • By setting the private feature extraction layer, each task model can identify a different abnormality and achieve specificity; by setting the common feature extraction layer, the knowledge learned by the related task models can be transferred to the target task model, thereby improving the classification and recognition performance of the target task model (a sketch of such a dual-channel model is given below).
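  • The following is a rough PyTorch sketch of such a dual-channel task model; the layer types, sizes, and kernel widths are illustrative assumptions, since the disclosure does not prescribe a specific architecture:

```python
import torch
import torch.nn as nn

class TwoChannelTaskModel(nn.Module):
    """Dual-channel task model in the spirit of Fig. 5: a common (shared)
    feature extraction channel and a private feature extraction channel,
    whose features are concatenated and classified by a task-specific head."""

    def __init__(self, in_channels=1, n_classes=2):
        super().__init__()
        # Common feature extraction layer (parameters theta_0 in the text).
        self.shared = nn.Sequential(
            nn.Conv1d(in_channels, 16, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        # Private feature extraction layer (theta_1 for the target task,
        # theta_2, ..., theta_M for the related tasks).
        self.private = nn.Sequential(
            nn.Conv1d(in_channels, 16, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x):
        # x: (batch, channels, samples) ECG segments
        common = self.shared(x).flatten(1)    # (batch, 16)
        private = self.private(x).flatten(1)  # (batch, 16)
        return self.classifier(torch.cat([common, private], dim=1))

# One model per task: the target task plus the M-1 related tasks
# (M = 4 here purely for illustration).
target_model = TwoChannelTaskModel()
related_models = [TwoChannelTaskModel() for _ in range(3)]
```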
  • step S12 may include:
  • for the second related task model: input the sample ECG signal into the second related task model, input the output of the second related task model and the second related abnormal label into the preset third loss function to obtain the third loss value, and adjust the parameters of the second related task model with the goal of minimizing the third loss value; the parameters of the second related task model include the parameters of the common feature extraction layer, wherein the second related task model is any one of the at least one related task model, and the second related abnormal label is any one of the at least one related abnormal label.
  • the second relevant abnormal label is any one of at least one relevant abnormal label of the sample ECG signal input into the second relevant task model.
  • the parameters of the private feature extraction layer in the target task model are θ1;
  • the parameters of the private feature extraction layers in the at least one related task model are θ2, ..., θM;
  • the parameter of the common feature extraction layer shared by the target task model and the related task models is θ0.
  • a convolutional neural network can be used to establish a target task model and M-1 related task models with the same network structure.
  • the third loss function and the fourth loss function may be cross-entropy loss functions. Calculated as follows:
  • N represents the number of sample ECG signals in the training set
  • the inner summation is the loss function of a single sample ECG signal
  • the outer summation is the loss function of all sample ECG signals
  • the summation result is divided by N to obtain the average loss function.
  • t_nk is an indicator function: if the true category of the nth sample ECG signal is k, its value is 1; otherwise its value is 0.
  • y_nk represents the output of the network model, that is, the probability that the nth sample ECG signal belongs to abnormality type k.
  • For the second related task model, t_nk represents the second related abnormal label of the nth sample ECG signal: if the nth sample ECG signal has the second related abnormality, the value of t_nk is 1, otherwise the value is 0; and y_nk represents the output of the second related task model, that is, the probability that the nth sample ECG signal has the second related abnormality.
  • For the target task model, t_nk represents the target abnormal label of the nth sample ECG signal: if the nth sample ECG signal has the target abnormality, the value of t_nk is 1, otherwise the value is 0; and y_nk represents the output of the target task model, that is, the probability that the nth sample ECG signal has the target abnormality.
  • the process of training the neural network model mainly includes: forward propagation to calculate the actual output, backpropagation to calculate the error and optimize the loss function, and use the gradient descent algorithm to update and adjust the model parameters layer by layer.
  • the minimum error and the optimal loss function are obtained through multiple iterative training, and the training of the neural network model is completed.
  • the step of training the multi-task model based on the multi-task learning mechanism may be implemented in various manners.
  • multiple rounds of iterative training can be performed on the multi-task model based on the multi-task learning mechanism; wherein, each round of iterative training includes: adjusting the parameters of each related task model, and according to at least one related task model parameters, a step of adjusting the parameters of the target task model.
  • In each iteration round, for each of the at least one related task model, the sample ECG signal can first be input into that related task model, the output of the related task model and the corresponding related abnormal label can be input into the preset third loss function to obtain the third loss value, and the parameters of the related task model can be adjusted with the goal of minimizing the third loss value.
  • Then the parameters of the common feature extraction layer in the at least one related task model can be shared as the parameters of the common feature extraction layer in the target task model; after that, the sample ECG signal can be input into the parameter-shared target task model, and the output of the target task model and the target abnormal label can be input into the preset fourth loss function to obtain the fourth loss value; finally, the parameters of the target task model are adjusted with the goal of minimizing the fourth loss value, which completes one iteration round.
  • After the iteration stop condition is met, the training of the target task model and each related task model is complete; the trained target task model is determined as the target anomaly recognition model, and each trained related task model is determined as a different related abnormality recognition model.
  • multiple rounds of iterative adjustments can be performed on the parameters of each related task model, until the related task model meets the corresponding training stop condition, and each related task model after training is determined to be different Related anomaly identification models; parameters of the target task model can then be adjusted according to parameters of at least one related anomaly identification model.
  • the sample ECG signal can be first input to the relevant task model, and the output of the relevant task model and the corresponding relevant abnormal label can be input to the preset third loss function to obtain a third loss value, and with the goal of minimizing the third loss value, perform multiple rounds of iterative training on the related task model, and obtain a related abnormality recognition model through training.
  • The parameters of the common feature extraction layer in the at least one related abnormality recognition model can then be shared as the parameters of the common feature extraction layer in the target task model (a sketch of this parameter-sharing step is given below); after that, the sample ECG signal can be input into the parameter-shared target task model, and the output of the target task model and the target abnormal label can be input into the preset fourth loss function to obtain the fourth loss value; the target task model is then trained with the goal of minimizing the fourth loss value, and the trained target task model is determined as the target anomaly recognition model.
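  • A minimal sketch of the parameter-sharing step, assuming models shaped like the TwoChannelTaskModel sketched earlier (i.e. exposing the common feature extraction layer as the attribute shared):

```python
import torch.nn as nn

def share_common_layer(target_model: nn.Module, related_model: nn.Module) -> None:
    """Copy the trained common-feature-extraction-layer parameters (theta_0)
    from a related task model (or trained related abnormality recognition
    model) into the target task model."""
    target_model.shared.load_state_dict(related_model.shared.state_dict())
```

  • After this sharing step, the target task model is trained with the fourth loss (a cross-entropy loss on the target abnormal label), as described above.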
  • Fig. 2 schematically shows a flow chart of a signal identification method, as shown in Fig. 2, the method may include the following steps.
  • Step S21 Acquiring target ECG signals.
  • this step may specifically include the following steps: firstly acquire the original ECG signal; then perform preprocessing on the original ECG signal to obtain the target ECG signal.
  • the execution subject of this embodiment may be a computer device, and the computer device has a signal recognition device, and the signal recognition method provided by this embodiment is executed by the signal recognition device.
  • the computer device may be, for example, a smart phone, a tablet computer, a personal computer, etc., which is not limited in this embodiment.
  • the executive subject of this embodiment can obtain the original ECG signal in various ways.
  • the execution subject may acquire the original ECG signal collected by a signal acquisition device such as an electrocardiograph, and then perform preprocessing on the acquired original ECG signal to obtain a target ECG signal.
  • the format of the target ECG signal can be made the same as that of the input sample ECG signal when training the target abnormality recognition model.
  • the step of preprocessing the original ECG signal may include at least one of the following steps: using a band-pass filter to remove power-frequency interference in the original ECG signal; using a low-pass filter to remove EMG interference in the original ECG signal; and using a high-pass filter to remove baseline drift in the original ECG signal.
  • a band-pass filter can be used to remove 50 Hz power frequency interference; a low-pass filter can be used to remove 10-300 Hz myoelectric interference; and a high-pass filter can be used to remove baseline drift.
  • the noise interference in the original ECG signal can be removed, and the accuracy of classification and recognition can be improved.
  • Step S22 Input the target ECG signal into the target abnormality recognition model to obtain the target abnormality recognition result, which is used to indicate whether the target ECG signal has a target abnormality; wherein, the target abnormality recognition model adopts any embodiment Trained by the model training method.
  • the target electrocardiogram signal can be input into the target abnormality identification model, and the target abnormality identification result can be output. Whether the target ECG signal has target abnormality can be determined according to the output target abnormality recognition result.
  • the target abnormality identification result may include, for example: the probability of the target ECG signal having the target abnormality and the probability of not having the target abnormality, which is not limited in this embodiment.
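  • A minimal inference sketch (assuming a trained PyTorch model with a two-class output where index 1 means the target abnormality is present; the function and variable names are illustrative):

```python
import torch

def identify_target_abnormality(model, ecg_signal):
    """Feed one preprocessed target ECG signal to the target abnormality
    identification model and return the probability that the target
    abnormality is present."""
    model.eval()
    with torch.no_grad():
        x = torch.as_tensor(ecg_signal, dtype=torch.float32).reshape(1, 1, -1)
        probabilities = torch.softmax(model(x), dim=1)
    return probabilities[0, 1].item()
```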
  • the target anomaly recognition model may be pre-trained, or may be trained during a signal recognition process, which is not limited in this embodiment.
  • Since the target anomaly recognition model is a model trained together with the related anomaly recognition models based on a multi-task learning mechanism, the target anomaly recognition model with less training data incorporates the knowledge (that is, the parameters) learned by the related anomaly recognition models with more training data, which improves its generalization ability and classification recognition performance.
  • Fig. 7 schematically shows a block diagram of a model training device. Referring to Fig. 7, the device may include:
  • the sample acquisition module 71 is configured to obtain a training sample set, the training sample set includes a sample ECG signal and an abnormal label of the sample ECG signal, and the abnormal label includes a target abnormal label and at least one related abnormal label;
  • the model training module 72 is configured to input the sample ECG signal into a multi-task model, and train the multi-task model based on a multi-task learning mechanism according to the output of the multi-task model and the abnormal label; wherein,
  • the multi-task model includes a target task model and at least one related task model, the target output of the target task model is the target abnormal label of the input sample ECG signal, and the target output of the related task model is the input sample ECG The associated exception label for the signal;
  • the model determining module 73 is configured to determine the trained target task model as a target abnormality identification model, and the target abnormality identification model is used to identify target abnormalities in ECG signals input to the target abnormality identification model.
  • Fig. 8 schematically shows a block diagram of a signal identification device. Referring to Fig. 8, the device may include:
  • the signal acquisition module 81 is configured to acquire the target ECG signal
  • the abnormality identification module 82 is configured to input the target ECG signal into the target abnormality identification model to obtain a target abnormality identification result, and the target abnormality identification result is used to indicate whether the target ECG signal has a target abnormality; wherein the target anomaly recognition model is trained by using the model training method described in any embodiment.
  • the device embodiments described above are only illustrative, and the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, that is, they may be located in One place, or it can be distributed to multiple network elements. Part or all of the modules can be selected according to actual needs to achieve the purpose of the solution of this embodiment. It can be understood and implemented by those skilled in the art without any creative efforts.
  • the various component embodiments of the present disclosure may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof.
  • a microprocessor or a digital signal processor (DSP) may be used in practice to implement some or all functions of some or all components in the computing processing device according to the embodiments of the present disclosure.
  • the present disclosure can also be implemented as an apparatus or apparatus program (eg, computer program and computer program product) for performing a part or all of the methods described herein.
  • Such a program realizing the present disclosure may be stored on a computer-readable medium, or may have the form of one or more signals.
  • Such a signal may be downloaded from an Internet site, or provided on a carrier signal, or provided in any other form.
  • FIG. 9 illustrates a computing processing device that may implement methods according to the present disclosure.
  • the computing processing device conventionally includes a processor 1010 and a computer program product or computer readable medium in the form of memory 1020 .
  • Memory 1020 may be electronic memory such as flash memory, EEPROM (Electrically Erasable Programmable Read Only Memory), EPROM, hard disk, or ROM.
  • the memory 1020 has a storage space 1030 for program code 1031 for performing any method steps in the methods described above.
  • the storage space 1030 for program codes may include respective program codes 1031 for respectively implementing various steps in the above methods. These program codes can be read from or written into one or more computer program products.
  • These computer program products comprise program code carriers such as hard disks, compact disks (CDs), memory cards or floppy disks.
  • Such a computer program product is typically a portable or fixed storage unit as described with reference to FIG. 10 .
  • the storage unit may have storage segments, storage spaces, etc. arranged similarly to the memory 1020 in the computing processing device of FIG. 9 .
  • the program code can eg be compressed in a suitable form.
  • the storage unit includes computer readable code 1031', i.e. code readable by a processor such as the processor 1010; when executed by a computing processing device, the code causes the computing processing device to perform each step of the methods described above.
  • references herein to "one embodiment,” “an embodiment,” or “one or more embodiments” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Additionally, please note that examples of the word “in one embodiment” herein do not necessarily all refer to the same embodiment.
  • any reference signs placed between parentheses shall not be construed as limiting the claim.
  • the word “comprising” does not exclude the presence of elements or steps not listed in a claim.
  • the word “a” or “an” preceding an element does not exclude the presence of a plurality of such elements.
  • the disclosure can be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In a unit claim enumerating several means, several of these means can be embodied by one and the same item of hardware.
  • the use of the words first, second, and third, etc. does not indicate any order. These words can be interpreted as names.

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Physics & Mathematics (AREA)
  • Biophysics (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Mathematical Physics (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Animal Behavior & Ethology (AREA)
  • Evolutionary Computation (AREA)
  • Surgery (AREA)
  • Pathology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Physiology (AREA)
  • Signal Processing (AREA)
  • Psychiatry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Fuzzy Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Computing Systems (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Computational Linguistics (AREA)
  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)
  • Testing And Monitoring For Control Systems (AREA)

Abstract

Disclosed are a model training method, a signal recognition method and apparatus, a computing processing device, a computer program, and a computer-readable medium. The model training method comprises: obtaining a training sample set, the training sample set comprising sample electrocardiogram signals and abnormality labels of the sample electrocardiogram signals, the abnormality labels comprising a target abnormality label and at least one related abnormality label; inputting the sample electrocardiogram signals into a multi-task model, and training the multi-task model on the basis of a multi-task learning mechanism according to an output of the multi-task model and the abnormality labels, the multi-task model comprising a target task model and at least one related task model, a target output of the target task model being the target abnormality label of the input sample electrocardiogram signals, and a target output of the related task model being the related abnormality label of the input sample electrocardiogram signals; and determining a trained target task model as a target abnormality recognition model, the target abnormality recognition model being used to identify a target abnormality in electrocardiogram signals input into the target abnormality recognition model.
PCT/CN2021/108605 2021-07-27 2021-07-27 Model training method, signal recognition method and apparatus, computing processing device, computer program and computer-readable medium Ceased WO2023004572A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
PCT/CN2021/108605 WO2023004572A1 (fr) 2021-07-27 2021-07-27 Model training method, signal recognition method and apparatus, computing processing device, computer program and computer-readable medium
US17/772,405 US20240188895A1 (en) 2021-07-27 2021-07-27 Model training method, signal recognition method, apparatus, computing and processing device, computer program, and computer-readable medium
CN202180002006.0A CN115885279A (zh) 2021-07-27 2021-07-27 Model training method, signal recognition method, apparatus, computing processing device, computer program and computer-readable medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/108605 WO2023004572A1 (fr) 2021-07-27 2021-07-27 Model training method, signal recognition method and apparatus, computing processing device, computer program and computer-readable medium

Publications (1)

Publication Number Publication Date
WO2023004572A1 true WO2023004572A1 (fr) 2023-02-02

Family

ID=85086124

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/108605 Ceased WO2023004572A1 (fr) 2021-07-27 2021-07-27 Model training method, signal recognition method and apparatus, computing processing device, computer program and computer-readable medium

Country Status (3)

Country Link
US (1) US20240188895A1 (fr)
CN (1) CN115885279A (fr)
WO (1) WO2023004572A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115953822A (zh) * 2023-03-06 2023-04-11 之江实验室 Face video forgery detection method and device based on rPPG physiological signals
CN116226778A (zh) * 2023-05-09 2023-06-06 水利部珠江水利委员会珠江水利综合技术中心 Retaining wall structure abnormality analysis method and system based on a three-dimensional analysis platform
CN116385825A (zh) * 2023-03-22 2023-07-04 小米汽车科技有限公司 Model joint training method and apparatus, and vehicle

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116919414B (zh) * 2023-07-06 2024-02-13 齐鲁工业大学(山东省科学院) ECG signal quality assessment method based on multi-scale convolution and densely connected networks

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7174205B2 (en) * 2004-04-05 2007-02-06 Hewlett-Packard Development Company, L.P. Cardiac diagnostic system and method
US9421008B2 (en) * 2011-09-23 2016-08-23 Arthrex, Inc. Soft suture-based anchors
US9468386B2 (en) * 2014-03-11 2016-10-18 Ecole polytechnique fédérale de Lausanne (EPFL) Method for detecting abnormalities in an electrocardiogram
US10426364B2 (en) * 2015-10-27 2019-10-01 Cardiologs Technologies Sas Automatic method to delineate or categorize an electrocardiogram
US20210169417A1 (en) * 2016-01-06 2021-06-10 David Burton Mobile wearable monitoring systems
WO2018017467A1 (fr) * 2016-07-18 2018-01-25 NantOmics, Inc. Systèmes, appareils et procédés d'apprentissage automatique distribué
US11250314B2 (en) * 2017-10-27 2022-02-15 Cognizant Technology Solutions U.S. Corporation Beyond shared hierarchies: deep multitask learning through soft layer ordering
WO2019222401A2 (fr) * 2018-05-17 2019-11-21 Magic Leap, Inc. Apprentissage adverse à gradient de réseaux neuronaux

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190130247A1 (en) * 2017-10-31 2019-05-02 General Electric Company Multi-task feature selection neural networks
CN111110224A (zh) * 2020-01-17 2020-05-08 武汉中旗生物医疗电子有限公司 An electrocardiogram classification method and device based on multi-angle feature extraction
CN111134662A (zh) * 2020-02-17 2020-05-12 武汉大学 An abnormal ECG signal recognition method and device based on transfer learning and confidence selection
CN111401558A (zh) * 2020-06-05 2020-07-10 腾讯科技(深圳)有限公司 Data processing model training method, data processing method, apparatus, and electronic device
CN112800222A (zh) * 2021-01-26 2021-05-14 天津科技大学 Multi-task-assisted extreme multi-label short text classification method using co-occurrence information

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115953822A (zh) * 2023-03-06 2023-04-11 之江实验室 Face video forgery detection method and device based on rPPG physiological signals
CN116385825A (zh) * 2023-03-22 2023-07-04 小米汽车科技有限公司 Model joint training method and apparatus, and vehicle
CN116385825B (zh) * 2023-03-22 2024-04-30 小米汽车科技有限公司 Model joint training method and apparatus, and vehicle
CN116226778A (zh) * 2023-05-09 2023-06-06 水利部珠江水利委员会珠江水利综合技术中心 Retaining wall structure abnormality analysis method and system based on a three-dimensional analysis platform
CN116226778B (zh) * 2023-05-09 2023-07-07 水利部珠江水利委员会珠江水利综合技术中心 Retaining wall structure abnormality analysis method and system based on a three-dimensional analysis platform

Also Published As

Publication number Publication date
US20240188895A1 (en) 2024-06-13
CN115885279A (zh) 2023-03-31

Similar Documents

Publication Publication Date Title
Anand et al. An enhanced ResNet-50 deep learning model for arrhythmia detection using electrocardiogram biomedical indicators
  • WO2023004572A1 Model training method, signal recognition method and apparatus, computing processing device, computer program and computer-readable medium
Celin et al. ECG signal classification using various machine learning techniques
Clifford et al. Recent advances in heart sound analysis
  • CN104970789B Electrocardiogram classification method and system
Sharma et al. Automated pre-screening of arrhythmia using hybrid combination of Fourier–Bessel expansion and LSTM
Kao et al. Automatic phonocardiograph signal analysis for detecting heart valve disorders
Al-Shammary et al. Efficient ECG classification based on Chi-square distance for arrhythmia detection
  • WO2021017313A1 Atrial fibrillation detection method and apparatus, computer device and storage medium
Zhang et al. [Retracted] An ECG Heartbeat Classification Method Based on Deep Convolutional Neural Network
  • CN110638430B Method for building an arrhythmia classification model for ECG signals based on cascaded neural networks
  • CN111657925A Machine-learning-based ECG signal classification method, system, terminal and storage medium
Golande et al. Optical electrocardiogram based heart disease prediction using hybrid deep learning
Prakash et al. A system for automatic cardiac arrhythmia recognition using electrocardiogram signal
Ilbeigipour et al. Real‐Time Heart Arrhythmia Detection Using Apache Spark Structured Streaming
  • CN109077720B Signal processing method, apparatus, device and storage medium
Nejedly et al. Classification of ECG using ensemble of residual CNNs with or without attention mechanism
Lan et al. Arrhythmias classification using short-time Fourier transform and GAN based data augmentation
  • CN116503673A An electrocardiogram-based arrhythmia recognition and detection method and system
  • CN115470832A A blockchain-based ECG signal data processing method
Chen et al. Transfer learning for electrocardiogram classification under small dataset
Wu et al. Autonomous detection of myocarditis based on the fusion of improved quantum genetic algorithm and adaptive differential evolution optimization back propagation neural network
Moqurrab et al. HRIDM: Hybrid Residual/Inception-Based Deeper Model for Arrhythmia Detection from Large Sets of 12-Lead ECG Recordings
An et al. Research on a Lightweight Arrhythmia Classification Model Based on Knowledge Distillation for Wearable Single-Lead ECG Monitoring Systems
Serhani et al. Enhancing arrhythmia prediction through an adaptive deep reinforcement learning framework for ECG signal analysis

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 17772405

Country of ref document: US

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21951189

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 27.05.2024)

122 Ep: pct application non-entry in european phase

Ref document number: 21951189

Country of ref document: EP

Kind code of ref document: A1