
CN114504468B - Hand full-fingered rehabilitation training and evaluation system based on artificial intelligence technology - Google Patents


Info

Publication number
CN114504468B
CN114504468B (application CN202210114284.1A)
Authority
CN
China
Prior art keywords
rehabilitation
finger
layer
module
eeg
Prior art date
Legal status
Active
Application number
CN202210114284.1A
Other languages
Chinese (zh)
Other versions
CN114504468A (en)
Inventor
高忠科 (Gao Zhongke)
孙新林 (Sun Xinlin)
马超 (Ma Chao)
Current Assignee
Tianjin University
Original Assignee
Tianjin University
Priority date
Filing date
Publication date
Application filed by Tianjin University
Priority to CN202210114284.1A
Publication of CN114504468A
Application granted
Publication of CN114504468B
Legal status: Active

Links

Classifications

    • A61H1/0288 — Stretching or bending apparatus for exercising; upper limbs; hand; fingers
    • A61B5/369 — Electroencephalography [EEG]
    • A61B5/7203 — Signal processing for noise prevention, reduction or removal
    • A61B5/725 — Waveform analysis using specific filters, e.g. Kalman or adaptive filters
    • A61B5/7267 — Classification of physiological signals or data involving training the classification device
    • G06F18/241 — Classification techniques relating to the classification model
    • G06F18/2415 — Classification based on parametric or probabilistic models
    • G06N3/045 — Neural networks; combinations of networks
    • G06N3/047 — Probabilistic or stochastic networks
    • G06N3/048 — Activation functions
    • G06N3/08 — Learning methods
    • G16H20/30 — ICT for therapies or health-improving plans relating to physical therapies or activities
    • A61H2201/0157 — Constructive details, portable
    • A61H2201/1238 — Driving means with hydraulic or pneumatic drive
    • A61H2201/5046 — Touch screens
    • A61H2230/105 — Electroencephalographic signals used as a control parameter for the apparatus
    • G06F2218/12 — Classification; Matching
    • Y02P90/30 — Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • General Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Data Mining & Analysis (AREA)
  • Molecular Biology (AREA)
  • Evolutionary Computation (AREA)
  • Public Health (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Veterinary Medicine (AREA)
  • Mathematical Physics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Medical Informatics (AREA)
  • Surgery (AREA)
  • Computing Systems (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Pathology (AREA)
  • Signal Processing (AREA)
  • Software Systems (AREA)
  • Psychiatry (AREA)
  • Computational Linguistics (AREA)
  • Physiology (AREA)
  • Probability & Statistics with Applications (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Epidemiology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Rehabilitation Therapy (AREA)
  • Pain & Pain Management (AREA)
  • Fuzzy Systems (AREA)
  • Primary Health Care (AREA)
  • Psychology (AREA)

Abstract

A hand full-finger rehabilitation training and evaluation system based on artificial intelligence technology. The system has two working modes: a rehabilitation training mode and a rehabilitation effect evaluation mode. In the rehabilitation training mode, the user is shown a rehabilitation action and performs motor imagery of that action; according to the recognized imagery, an air pump drives the corresponding finger joints to complete the rehabilitation action. In the rehabilitation effect evaluation mode, the user selects a rehabilitation action and performs the corresponding finger movement, and the rehabilitation effect is evaluated from the user's finger movement. By recognizing the user's movement intention, the system drives the intelligent rehabilitation hand device to perform the corresponding action and assists the user in completing full-finger rehabilitation training. The user can also evaluate the hand rehabilitation effect, forming a closed rehabilitation-training loop that promotes rehabilitation more efficiently.

Description

Hand full-fingered rehabilitation training and evaluation system based on artificial intelligence technology
Technical Field
The invention relates to hand rehabilitation equipment, and in particular to a hand full-finger rehabilitation training and evaluation system based on artificial intelligence technology.
Background
Surveys show that stroke has become the leading cause of disability among adults in China. Stroke is characterized by high incidence and high disability rates; Chinese stroke patients are numerous, treatment periods are long, and recovery outcomes are often poor. A stroke damages part of the patient's brain, which in turn causes loss of control over part of the limbs. The hands, as important organs of the human body, play a crucial role in completing daily activities, so restoring hand function in stroke patients is of great importance. In addition, patients who have undergone hand surgery also require rehabilitation training to restore hand function. Existing research shows that, compared with traditional passive rehabilitation, active rehabilitation, which requires the patient to participate actively in the rehabilitation process, can provide better rehabilitation outcomes. The key to active rehabilitation is to restore the natural synchronization between brain intention and hand movement, that is, to combine the intention signals of the human brain with the rehabilitation movements of the hands. The electroencephalogram (EEG) is an overall reflection of the activity of cerebral cortical nerve cells; it contains a great deal of physiological and pathological information and represents the activity state and thinking of the human brain. Therefore, an active rehabilitation approach combined with EEG detection technology can promptly recognize the patient's intention to perform a rehabilitation action and drive rehabilitation equipment to assist the patient in completing the action, thereby accelerating rehabilitation.
In recent years, portable EEG acquisition devices have attracted increasing attention. Compared with traditional EEG acquisition equipment, portable devices are smaller and lighter; portability and usability are greatly improved without reducing the quality of the acquired signals, while cost and power consumption are further reduced. When a patient needs treatment at home, a portable EEG acquisition device makes home EEG monitoring possible.
EEG data are nonlinear, have complex features, and exhibit a low signal-to-noise ratio. Traditional manual inspection requires an experienced expert to examine the EEG, and traditional machine learning algorithms require manually extracted EEG features for analysis; both approaches depend on manual operation, are unsuitable for long-term monitoring, and may miss important EEG features due to subjective factors. As an advanced branch of machine learning, deep learning has strong advantages in processing big data and has been widely applied in EEG research. Deep learning is an end-to-end learning method that can directly extract, learn, and classify deep intrinsic representations from the input signal. To date, many deep learning architectures have been proposed and applied in fields such as EEG analysis and rehabilitation training.
Disclosure of Invention
The technical problem to be solved by the invention is to overcome the deficiencies of the prior art and provide a hand full-finger rehabilitation training and evaluation system based on artificial intelligence technology that can effectively recognize and correctly classify motor imagery EEG signals for individual finger positions and promote the rehabilitation of finger motor function.
The technical solution adopted by the invention is as follows: a hand full-finger rehabilitation training and evaluation system based on artificial intelligence technology, comprising, connected in sequence, a portable EEG acquisition device, a human-computer interaction interface, an EEG intelligent decoding module, and an intelligent rehabilitation hand device; the system has two working modes, a rehabilitation training mode and a rehabilitation effect evaluation mode;
in the rehabilitation training mode, the user selects a rehabilitation action through the human-computer interaction interface and performs motor imagery of the corresponding action according to the on-screen prompts; the portable EEG acquisition device acquires EEG signals from the user's brain; the EEG intelligent decoding module decodes the acquired EEG signals using deep learning, judges whether the user performed motor imagery of the corresponding action, and sends the judgment result to the intelligent rehabilitation hand device; based on this judgment, the intelligent rehabilitation hand device drives the corresponding finger joints through the air pump to complete the rehabilitation action;
in the rehabilitation effect evaluation mode, the user first selects a rehabilitation action through the human-computer interaction interface and then performs the corresponding finger movement; the intelligent rehabilitation hand device wirelessly transmits the finger-movement signals to the human-computer interaction interface, where the rehabilitation effect is evaluated.
The rehabilitation training mode and the rehabilitation effect evaluation mode support 12 finger actions, including: bending the thumb; bending the index finger; bending the middle finger; bending the ring finger; bending the little finger; bending the thumb + index finger; bending the thumb + middle finger; bending the thumb + ring finger; bending the thumb + little finger; bending all fingers; and extending all fingers.
The portable EEG acquisition device comprises: an EEG electrode cap and its connecting wire, for acquiring EEG signals; a physiological electrical signal acquisition and conversion module, for amplifying and converting the EEG signals; an integrated Wi-Fi module, for controlling the acquisition and conversion module; and a power supply circuit connected to both the acquisition and conversion module and the integrated Wi-Fi module. The electrodes of the EEG electrode cap are in direct contact with the user's scalp to acquire EEG signals from the scalp surface, and the cap is connected to the acquisition and conversion module through a Y2 interface on the connecting wire to transmit the EEG signals; the integrated Wi-Fi module reads data from the acquisition and conversion module and sends it to the EEG intelligent decoding module.
The EEG electrode cap and its connecting wire acquire EEG signals from sixteen electrodes on the user's head: FP1, FP2, F3, Fz, F4, FCz, T3, C3, Cz, C4, T4, P3, Pz, P4, Oz, and A1; the electrode placement conforms to the international 10/20 standard lead system.
The power supply circuit is powered by a 4.2 V rechargeable lithium battery and switches between a charging mode and a working mode: when the switch is off, the circuit enters the charging mode, and the user connects the device to a 5 V power interface with a USB cable for charging; in the charging mode, an integrated transformer module electrically isolates the circuit board from the external power supply to prevent overvoltage damage to the circuit; when the switch is on, the circuit enters the working mode and charging is disabled; in the working mode, several low-noise linear voltage regulators with different output voltages meet the power supply requirements of the different devices on the circuit board.
The human-computer interaction interface comprises a touch display screen, a Wi-Fi module, a Bluetooth module, a voice prompt module, and an MCU processor connected to each of them. The MCU processor runs an embedded operating system and handles driving the touch display screen, data transmission and reception for the Wi-Fi and Bluetooth modules, and playing voice prompts through the voice prompt module. The user selects actions and sets parameters through the touch display screen and performs rehabilitation training or rehabilitation effect evaluation according to the on-screen prompts. The MCU processor communicates with the EEG intelligent decoding module through the Wi-Fi module, reads sensor information from the intelligent rehabilitation hand device through the Bluetooth module, and transmits the action selection and parameter settings to the intelligent rehabilitation hand device.
The EEG intelligent decoding module analyzes and processes the EEG signals using deep learning, specifically:
1) Preprocessing the EEG signals, comprising two stages: filtering and noise reduction, and data enhancement;
in the filtering and noise reduction stage, a Notch filter first removes the 50 Hz power-line interference; a Butterworth band-pass filter bank then filters the notch-filtered EEG signals into 4 frequency bands, theta (4-7 Hz), alpha (8-12 Hz), beta (13-30 Hz), and gamma (31-50 Hz), yielding the preprocessed EEG signals X^c, where c denotes the frequency band, L the data length, and g the number of channels;
in the data enhancement stage, the preprocessed EEG signals X^c of the 4 frequency bands are segmented by sliding windows of length l with sliding step b, the windows not overlapping one another. The j-th sliding window forms one sample X_j^c, whose element x_{j,p}^{c,g} denotes the p-th data point in the g-th channel of the c-th frequency band. Each sample is assigned a label indicating whether the user performed motor imagery of the corresponding action during the j-th window;
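The two preprocessing stages can be sketched in Python with scipy. This is a minimal illustration, not the patented implementation: the sampling rate, filter order, and concrete window parameters are assumptions not stated in this passage (the text specifies only the 50 Hz notch, the four Butterworth bands, and non-overlapping windows of length l with step b).

```python
import numpy as np
from scipy.signal import iirnotch, butter, filtfilt

FS = 250  # assumed sampling rate in Hz (not specified in the text)
BANDS = {"theta": (4, 7), "alpha": (8, 12), "beta": (13, 30), "gamma": (31, 50)}

def preprocess(eeg, fs=FS):
    """Notch out 50 Hz mains interference, then split the signal into the
    four bands. eeg has shape (g channels, L samples); returns a dict
    mapping band name -> filtered array of the same shape."""
    b, a = iirnotch(w0=50.0, Q=30.0, fs=fs)
    eeg = filtfilt(b, a, eeg, axis=-1)
    out = {}
    for name, (lo, hi) in BANDS.items():
        bb, ba = butter(N=4, Wn=[lo, hi], btype="bandpass", fs=fs)
        out[name] = filtfilt(bb, ba, eeg, axis=-1)
    return out

def sliding_windows(x, win, step):
    """Segment (g, L) band data into windows of length `win`, advancing by
    `step` samples (step >= win gives the non-overlapping case described).
    Returns an array of shape (n_windows, g, win) -- one sample per window."""
    g, L = x.shape
    starts = range(0, L - win + 1, step)
    return np.stack([x[:, s:s + win] for s in starts])
```

Each window returned by `sliding_windows` would then be paired with a binary label (motor imagery performed during that window or not) before training.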
2) Building a deep convolutional neural network model based on an attention mechanism, inputting all samples and their labels into the model, and training it with full supervision: first, the samples of all users are used to obtain a pre-trained model; then, starting from the pre-trained model, the samples of each individual user are used for fine-tuning to obtain a fine-tuned model matched to that user.
The deep convolutional neural network model based on the attention mechanism comprises 4 branches, each branch comprising, connected in series in sequence:
(2.1) a data input layer, for inputting the preprocessed EEG signal X^c of the user's theta, alpha, beta, or gamma frequency band;
(2.2) a first convolution layer with a convolution kernel size of 1×l, 32 convolution kernels, and a regularization coefficient of 0.01, where l is the length of the sliding window;
(2.3) a second convolution layer with a convolution kernel size of 1×16 and 32 convolution kernels;
(2.4) a first serial layer, splicing the output of the first convolution layer and the output of the second convolution layer along the last dimension;
(2.5) a third convolution layer with a convolution kernel size of g×1, a depth multiplier of 1, and a regularization coefficient of 0.01, where g is the number of channels;
(2.6) a first batch normalization layer, for accelerating model training and reducing overfitting;
(2.7) a first activation function layer, using the Elu activation function;
(2.8) a first average pooling layer with a pooling kernel size of 1×4;
(2.9) a first Dropout layer with a Dropout probability of 0.5;
(2.10) a fourth convolution layer with a convolution kernel size of 1×16, a depth multiplier of 1, and a regularization coefficient of 0.01;
(2.11) a second batch normalization layer, for accelerating model training and reducing overfitting;
(2.12) a second activation function layer, using the Elu activation function;
(2.13) a second average pooling layer with a pooling kernel size of 1×8;
(2.14) a second Dropout layer with a Dropout probability of 0.5;
(2.15) an attention mechanism module, comprising:
(2.15.1) a Reshape layer, converting the output of the second Dropout layer to size 8×64;
(2.15.2) a first fully connected layer with 64 neurons and a tanh activation function;
(2.15.3) a first rearrangement layer, exchanging the 1st and 2nd dimensions of the output;
(2.15.4) a second fully connected layer with 1 neuron and a softmax activation function;
(2.15.5) a second rearrangement layer, exchanging the 1st and 2nd dimensions of the output;
(2.15.6) a multiplication layer, multiplying the output of the second Dropout layer element-wise with the output of the second rearrangement layer;
(2.15.7) a custom addition layer, summing the output of the multiplication layer along the 2nd dimension;
(2.15.8) a flattening layer, flattening the output of the custom addition layer into a one-dimensional sequence;
the outputs of the flattening layers of the 4 branches are spliced by a second serial layer, whose output is connected to a third fully connected layer; the third fully connected layer uses softmax as its activation function, has 2 neurons, and outputs the judgment result for the specified action.
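The attention module described above amounts to scoring the 8 time steps of the 8×64 feature map, softmax-normalizing the scores, and summing the time steps under those weights. The numpy sketch below illustrates only that computation; the weight matrices are random stand-ins for the learned fully connected layers, and the exact layer wiring of the patented model may differ.

```python
import numpy as np

def softmax(z, axis=-1):
    # Numerically stable softmax along the given axis.
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def attention_pool(x, w1, b1, w2):
    """x: (T, F) feature map (here T=8 time steps, F=64 features).
    Score each time step with a tanh hidden layer, softmax over T,
    and return the weighted sum of time steps as an (F,) vector."""
    e = np.tanh(x @ w1 + b1)           # (T, F) hidden representation
    scores = softmax(e @ w2, axis=0)   # (T, 1) attention weight per step
    return (scores * x).sum(axis=0)    # (F,) attended feature vector

rng = np.random.default_rng(42)
x = rng.standard_normal((8, 64))          # stand-in for the second Dropout output
w1 = 0.1 * rng.standard_normal((64, 64))  # stand-in for the first FC layer
b1 = np.zeros(64)
w2 = 0.1 * rng.standard_normal((64, 1))   # stand-in for the second FC layer
v = attention_pool(x, w1, b1, w2)         # flattened vector for one branch
```

In the full model, the four branches' attended vectors are concatenated and passed to the final 2-neuron softmax classification layer.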
The intelligent rehabilitation hand device comprises a pneumatic glove, a control module arranged on the back of the glove, a bending sensor on the back side of each finger of the glove, a pressure sensor at the front end of each finger, and an air pump connected to the control module by wires and to each finger of the glove by air paths. The pneumatic glove uses the air pump to bend or extend each finger independently and is made as a soft elastic glove, making it easy for the user to wear and improving comfort. Each bending sensor outputs a voltage that varies with the current bending angle of its finger, and each pressure sensor outputs a voltage that varies with the force exerted by its finger. The control module is built around an integrated Bluetooth module, which polls the voltage outputs of the bending and pressure sensors through a multiplexing switch and sends them to the MCU processor in the human-computer interaction interface; at the same time, the control module receives the action selection and parameter settings sent by the MCU processor and decodes them into a pumping/inflation control signal, an air-path selection signal, and a speed control signal.
The air pump receives the pumping/inflation control signal, the air-path selection signal, and the speed control signal from the control module and controls the action of each finger of the pneumatic glove through the air paths.
Before use, the user selects a system mode: the rehabilitation training mode or the rehabilitation effect evaluation mode; wherein,
(1) The rehabilitation training mode comprises the following using steps:
(1.1) a user selects actions required to perform rehabilitation training and sets training parameters;
(1.2) the user performs motor imagery of the corresponding action according to the screen prompt;
(1.3) the EEG intelligent decoding module decodes the user's EEG signals, judges whether motor imagery of the corresponding action was performed, and transmits the result via Wi-Fi to the MCU processor (25) in the human-computer interaction interface (2);
(1.4) according to the classification result, the MCU processor decides whether to activate the air pump for the corresponding finger, thereby driving that finger to move;
(2) The rehabilitation effect evaluation mode comprises the following using steps:
(2.1) the user selects the action required to evaluate the rehabilitation effect and sets the evaluation parameters;
(2.2) the user makes corresponding finger actions according to the screen prompt;
(2.3) a control module in the intelligent rehabilitation hand equipment reads real-time values of a bending sensor and a pressure sensor of the corresponding finger and sends the real-time values to an MCU processor in a man-machine interaction interface;
(2.4) the MCU processor gives a rehabilitation effect evaluation grade according to the received information.
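Step (2.4) is not specified further in this passage; the sketch below shows one hypothetical way an MCU could map bending- and pressure-sensor readings to an evaluation grade. The tolerance, grade cutoffs, and normalized sensor values are all invented for illustration.

```python
def evaluate(bend_v, press_v, target_bend, target_press, tol=0.15):
    """Toy rehabilitation-effect grade: the fraction of fingers whose
    (normalized) bend and pressure readings fall within `tol` of the
    target values for the selected action, mapped to a letter grade."""
    ok = sum(
        abs(b - tb) <= tol and abs(p - tp) <= tol
        for b, tb, p, tp in zip(bend_v, target_bend, press_v, target_press)
    )
    ratio = ok / len(bend_v)
    for grade, cut in (("A", 0.9), ("B", 0.7), ("C", 0.5)):
        if ratio >= cut:
            return grade
    return "D"
```

A real implementation would also need per-user calibration of the sensor voltages, which this sketch ignores.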
The hand full-finger rehabilitation training and evaluation system based on artificial intelligence technology can accurately acquire, effectively recognize, and correctly classify EEG signals; by recognizing the user's movement intention, it drives the intelligent rehabilitation hand device to perform the corresponding action and assists the user in completing full-finger rehabilitation training. At the same time, the user can evaluate the hand rehabilitation effect with the system, forming a closed rehabilitation-training loop that promotes rehabilitation more efficiently.
Drawings
FIG. 1 is a block diagram of a hand full-fingered rehabilitation training and assessment system based on artificial intelligence technology of the present invention;
FIG. 2 is a block diagram showing the constitution of the present invention in a rehabilitation effect evaluation mode;
FIG. 3 is a block diagram of a portable electroencephalogram acquisition apparatus according to the present invention;
FIG. 4 is a block diagram of a human-machine interface in accordance with the present invention;
FIG. 5 is a diagram of a selection of actions of a human-machine interface in the present invention;
FIG. 6 is a flow chart of analysis of the intelligent electroencephalogram decoding module in the present invention;
FIG. 7 is a block diagram of a deep convolutional neural network model based on the attention mechanism of the present invention;
FIG. 8 is a block diagram of the attention mechanism module of the present invention;
FIG. 9 is a schematic diagram of the structure of the intelligent rehabilitation hand device of the present invention;
FIG. 10 is a schematic diagram of the connection relationship of the air pump in the present invention;
FIG. 11 is a schematic diagram of the hand full-finger rehabilitation training mode and rehabilitation effect evaluation mode according to the present invention.
Detailed Description
The following describes the hand full-finger rehabilitation training and evaluation system based on artificial intelligence technology in detail with reference to the embodiments and the drawings.
As shown in FIG. 1, the hand full-finger rehabilitation training and evaluation system based on artificial intelligence technology comprises, connected in sequence: a portable electroencephalogram acquisition device 1, a human-computer interaction interface 2, an electroencephalogram intelligent decoding module 3 and an intelligent rehabilitation hand device 4; the system has two working modes, namely a rehabilitation training mode and a rehabilitation effect evaluation mode;
in the rehabilitation training mode, a user selects rehabilitation actions through the human-computer interaction interface 2 and performs motor imagery of the corresponding action according to the screen prompts; the portable electroencephalogram acquisition device 1 collects EEG signals from the user's brain; the EEG intelligent decoding module 3 decodes the acquired EEG signals through deep learning technology, judges whether the user performed motor imagery of the corresponding action, and sends the judgment result to the intelligent rehabilitation hand device 4; the intelligent rehabilitation hand device 4, based on the judgment result of the electroencephalogram intelligent decoding module 3, drives the corresponding finger joints through the air pump to complete the corresponding rehabilitation action;
as shown in fig. 2, in the rehabilitation effect evaluation mode, the user first selects a rehabilitation action through the human-computer interaction interface 2, and then makes the finger action corresponding to the selected rehabilitation action; the intelligent rehabilitation hand device 4 wirelessly sends the signals of the finger action to the MCU processor 25 in the human-computer interaction interface 2, and the MCU processor 25 then evaluates the rehabilitation effect according to the sensor signal data.
As shown in fig. 5, the rehabilitation training mode and the rehabilitation effect evaluation mode support 12 finger actions, which are respectively: bending thumb, bending index finger, bending middle finger, bending ring finger, bending little finger, bending thumb+index finger, bending thumb+middle finger, bending thumb+ring finger, bending thumb+little finger, bending whole finger, stretching whole finger.
As shown in fig. 3, the portable electroencephalogram acquisition device 1 comprises, connected in sequence: an EEG electrode cap and connecting wire 11 for collecting EEG signals, a physiological electrical signal acquisition-conversion module 12 for amplifying and converting the EEG signals, an integrated Wi-Fi module 13 for controlling the physiological electrical signal acquisition-conversion module 12, and a power supply circuit 14 connected respectively to the physiological electrical signal acquisition-conversion module 12 and the integrated Wi-Fi module 13. The electrode cap in the EEG electrode cap and connecting wire 11 is in direct contact with the user's scalp through its electrodes and collects the EEG signals on the scalp surface; it is connected to the physiological electrical signal acquisition-conversion module 12 through the connecting wire and its Y2 interface for transmitting the EEG signals; the integrated Wi-Fi module 13 is responsible for reading the data from the physiological electrical signal acquisition-conversion module 12 and sending them to the electroencephalogram intelligent decoding module 3.
The EEG electrode cap and its connecting wire 11 acquire, through the electrodes, the user's EEG signals at the sixteen electrode positions FP1, FP2, F3, Fz, F4, FCz, T3, C3, Cz, C4, T4, P3, Pz, P4, Oz and A1 of the electrode cap; the electrode distribution of the cap conforms to the 10/20 international standard leads.
The physiological electrical signal acquisition-conversion module 12 is built around a bioelectric signal acquisition chip that integrates multiple analog input channels with a high common-mode rejection ratio for receiving the EEG signals collected by the electrode cap, a low-noise programmable gain amplifier (PGA) for amplifying the EEG signals, and a high-resolution synchronous-sampling analog-to-digital converter (ADC) for converting the analog signals into digital signals.
The integrated Wi-Fi module 13 adopts a module of model ESP-12F, supports IEEE 802.11b/g/n radio-frequency wireless communication, and provides common interfaces including I2C, UART, SPI, ADC and GPIO; it is used to adjust the PGA amplification factor and the ADC sampling rate of the physiological electrical signal acquisition-conversion module 12, and reads the EEG signals acquired by the physiological electrical signal acquisition-conversion module 12 through the SPI interface. The integrated Wi-Fi module 13 works in AP mode, establishes a wireless connection with the electroencephalogram intelligent decoding module 3, and transmits the EEG signals.
The power circuit 14 is powered by a 4.2V rechargeable lithium battery, and controls a charging mode and an operating mode through a switch: when the switch is turned off, the power supply circuit enters a charging mode, and a user uses the USB cable to connect the device to the 5V power supply interface for charging; in a charging mode, the power supply circuit adopts an integrated transformer module to electrically isolate the circuit board from an external power supply, so that the circuit is prevented from being damaged due to overvoltage; when the switch is turned on, the power supply circuit enters into an operating mode, and the charging mode is disabled; in the working mode, the power supply circuit adopts a plurality of low-noise linear voltage regulators with different output voltages, and meets the power supply requirements of different devices on a circuit board.
As shown in fig. 4, the human-computer interaction interface 2 includes: the touch display screen 21, the Wi-Fi module 22, the Bluetooth module 23 and the voice prompt module 24, and an MCU processor 25 respectively connected with the touch display screen 21, the Wi-Fi module 22, the Bluetooth module 23 and the voice prompt module 24, wherein the MCU processor 25 is used for controlling the touch display screen 21, the Wi-Fi module 22, the Bluetooth module 23 and the voice prompt module 24 to work; the MCU processor 25 adopts the working mode of an embedded operating system to complete the driving of the touch display screen 21, the data receiving and transmitting of the Wi-Fi module 22, the data receiving and transmitting of the Bluetooth module 23 and the playing of voice prompts through the voice prompt module 24; the user performs action selection and parameter setting through the touch display screen 21, and performs rehabilitation training or rehabilitation effect evaluation according to the picture prompt of the touch display screen 21; the MCU processor 25 is in data communication with the brain electricity intelligent decoding module 3 through the Wi-Fi module 22, reads sensor information of the intelligent rehabilitation hand equipment 4 through the Bluetooth module 23, and transmits action selection and parameter setting information to the intelligent rehabilitation hand equipment 4.
In the rehabilitation training mode, a user first selects an action to be trained through the touch display screen 21 of the man-machine interaction interface 2, and adds the action to a training queue. The training queue is used for storing action combinations required by the rehabilitation user, and 10 actions are supported to be stored at maximum. Then, the user inputs the number of times the motion needs to be trained through the touch display screen 21, defaults to repeating the training 10 times per motion, and supports repeating the training 30 times per motion at the highest. After training is started, the touch display screen 21 displays a picture of the current training action, the user performs motor imagination of corresponding action according to the picture, the portable electroencephalogram acquisition equipment 1 acquires EEG electroencephalogram signals of the user in real time, the EEG intelligent decoding module 3 analyzes and processes the EEG electroencephalogram signals to judge whether the user performs motor imagination of corresponding action, a classification result is sent to the Wi-Fi module 22 in the human-computer interaction interface 2, and then the MCU processor 25 determines whether to drive an air pump motor of a corresponding finger part according to the classification result.
In the rehabilitation effect evaluation mode, the user first selects an action to be evaluated through the touch display screen 21 of the human-computer interaction interface 2 and adds the action to the evaluation queue. The queue stores the action combinations required by the rehabilitation user and supports storing up to 10 actions; the number of evaluations per action is fixed at 5 and cannot be adjusted by the user. After the evaluation starts, the touch display screen 21 displays a picture of the current evaluation action, and the user performs the actual finger movement according to the picture; the intelligent rehabilitation hand device 4 acquires the signals of the bending sensor and pressure sensor mounted on each finger and wirelessly transmits the sensor signal data via Bluetooth to the MCU processor 25 in the human-computer interaction interface 2, after which the MCU processor 25 obtains the rehabilitation effect evaluation result from the sensor signal data.
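The grading step can be sketched as follows; since Table 1's actual grade boundaries are not reproduced in this text, the thresholds, the equal weighting of bend and pressure, and the grade names below are illustrative assumptions only:

```python
# Illustrative only: Table 1's real grade boundaries are not given in this text,
# so the thresholds, weights and grade names here are hypothetical placeholders.
def evaluation_grade(bend_readings, press_readings, bend_target, press_target):
    """Average the 5 repetitions of an action (the fixed evaluation count),
    normalize against per-finger targets, and map the score to a grade."""
    bend = sum(bend_readings) / len(bend_readings) / bend_target
    press = sum(press_readings) / len(press_readings) / press_target
    score = 0.5 * min(bend, 1.0) + 0.5 * min(press, 1.0)
    if score >= 0.9:
        return "excellent"
    if score >= 0.7:
        return "good"
    if score >= 0.5:
        return "fair"
    return "poor"
```

A design like this keeps the MCU-side logic trivial: the control module only forwards raw sensor voltages, and all normalization and grading stay on the interface side.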
As shown in fig. 6, the electroencephalogram intelligent decoding module 3 analyzes and processes the EEG signals using a deep learning method, which involves the following steps:
1) Preprocessing the EEG signals, which includes two stages: filtering and noise reduction, and data enhancement;
in the filtering and noise-reduction stage, a Notch filter is first used to notch-filter the 50 Hz power-frequency interference; a Butterworth band-pass filter bank then filters the notch-filtered EEG signals into 4 frequency bands, θ (4-7 Hz), α (8-12 Hz), β (13-30 Hz) and γ (31-50 Hz), to obtain the preprocessed EEG signals, where c represents the frequency band, L the data length, and g the number of channels;
in the data enhancement stage, the preprocessed EEG signals of the 4 frequency bands are each segmented by a sliding window of length l; the sliding step of the window is b, and the windows do not overlap one another. The data of the j-th sliding window, whose elements are the p-th data points in the g-th channel of the c-th frequency band, form one sample. A label is set for each sample, indicating whether the user performed motor imagery of the corresponding action during the j-th sliding-window interval.
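The preprocessing pipeline above (50 Hz notch filtering, Butterworth band splitting, sliding-window segmentation) can be sketched in Python with SciPy; the 250 Hz sampling rate and the 4th-order band-pass filters are assumptions, since the text does not specify them:

```python
import numpy as np
from scipy.signal import iirnotch, butter, filtfilt

FS = 250  # assumed sampling rate in Hz (not specified in the text)
BANDS = {"theta": (4, 7), "alpha": (8, 12), "beta": (13, 30), "gamma": (31, 50)}

def preprocess(eeg, fs=FS):
    """Notch-filter 50 Hz mains interference, then split into the 4 bands.
    eeg: array of shape (g, L) -- g channels, L samples per channel."""
    b, a = iirnotch(w0=50.0, Q=30.0, fs=fs)
    eeg = filtfilt(b, a, eeg, axis=-1)
    out = {}
    for name, (lo, hi) in BANDS.items():
        bb, ba = butter(N=4, Wn=[lo, hi], btype="bandpass", fs=fs)
        out[name] = filtfilt(bb, ba, eeg, axis=-1)
    return out  # dict of 4 band-limited signals, each of shape (g, L)

def sliding_windows(x, l, b):
    """Segment one band's signal into windows of length l with step b."""
    g, L = x.shape
    starts = range(0, L - l + 1, b)
    return np.stack([x[:, s:s + l] for s in starts])  # (n_windows, g, l)
```

With b ≥ l the windows do not overlap, matching the text; each `(g, l)` slice is one labeled sample.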
2) Building a deep convolutional neural network model based on an attention mechanism, inputting all samples and the corresponding labels into the model, and performing fully supervised training: first, a pre-trained model is obtained by training on the samples of all users; then, based on the pre-trained model, the samples of each user are used for fine-tuning to obtain a fine-tuned model matched to each user. Pre-training uses a batch size of 256, a learning rate of 0.001 and 500 training iterations; fine-tuning uses a batch size of 16, a learning rate of 0.0001 and 200 training iterations. The pre-trained model and the fine-tuned model have exactly the same structure.
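The two-stage training schedule can be sketched as follows; the logistic model below is a toy stand-in for the patent's CNN, used only to show the pretrain-on-all-users / fine-tune-per-user flow with the stated hyperparameters:

```python
import numpy as np

# Batch sizes, learning rates and iteration counts are those given in the text;
# the logistic-regression "model" is a toy stand-in for the CNN.
PRETRAIN = {"batch_size": 256, "lr": 1e-3, "epochs": 500}
FINETUNE = {"batch_size": 16, "lr": 1e-4, "epochs": 200}

def train(w, X, y, batch_size, lr, epochs, seed=0):
    """Plain mini-batch SGD on a logistic model; returns updated weights."""
    rng = np.random.default_rng(seed)
    for _ in range(epochs):
        idx = rng.permutation(len(X))
        for s in range(0, len(X), batch_size):
            b = idx[s:s + batch_size]
            p = 1.0 / (1.0 + np.exp(-X[b] @ w))        # sigmoid predictions
            w = w - lr * X[b].T @ (p - y[b]) / len(b)  # logistic-loss gradient
    return w

def pretrain_then_finetune(users_X, users_y, dim):
    """Pretrain on the pooled samples of all users, then fine-tune a copy
    of the shared weights on each individual user's samples."""
    w0 = train(np.zeros(dim),
               np.vstack(users_X), np.concatenate(users_y), **PRETRAIN)
    return w0, [train(w0.copy(), X, y, **FINETUNE)
                for X, y in zip(users_X, users_y)]
```

The key point mirrored from the text is that the per-user models start from the shared pre-trained weights and differ only through the low-learning-rate fine-tuning pass.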
As shown in fig. 7, the deep convolutional neural network model based on the attention mechanism includes 4 branches, each branch corresponds to sample input of θ or α or β or γ frequency bands, and each branch includes sequentially concatenated:
(2.1) a data input layer, whose input is the user's preprocessed EEG signal of the corresponding θ, α, β or γ frequency band;
(2.2) a first convolution layer having a convolution kernel size of 1×l, a number of convolution kernels of 32, a regularization coefficient of 0.01, l being the length of the sliding window;
(2.3) a second convolution layer having a convolution kernel size of 1 x 16 and a number of convolution kernels of 32;
(2.4) a first series of layers, the output of the first convolutional layer and the output of the second convolutional layer being spliced in series according to the last dimension;
(2.5) a third convolution layer, the convolution kernel size is gx1, the depth multiplying power is 1, the regularization coefficient is 0.01, and g is the channel number;
(2.6) a first batch normalization layer for accelerating model training and reducing overfitting;
(2.7) a first layer of activation functions, using Elu activation functions;
(2.8) a first average pooling layer with a pooling kernel size of 1 x 4;
(2.9) a first Dropout layer having a Dropout probability of 0.5;
(2.10) a fourth convolution layer having a convolution kernel size of 1 x 16, a depth multiplier of 1, and a regularization coefficient of 0.01;
(2.11) a second batch normalization layer for accelerating model training and reducing overfitting;
(2.12) a second layer of activation functions, using Elu activation functions;
(2.13) a second average pooling layer with a pooling kernel size of 1 x 8;
(2.14) a second Dropout layer having a Dropout probability of 0.5;
(2.15) an attention mechanism module, as shown in fig. 8, the attention mechanism module includes:
(2.15.1) a Reshape layer converting the output size of the second Dropout layer to 8 x 64;
(2.15.2) a first fully connected layer having a neuron count of 64 and an activation function of tanh;
(2.15.3) a first rearrangement layer, exchanging the 1st and 2nd dimensions of the output;
(2.15.4) a second fully connected layer, with 1 neuron and the softmax activation function;
(2.15.5) a second rearrangement layer, exchanging the 1st and 2nd dimensions of the output;
(2.15.6) a multiplication layer, element-wise multiplying the output of the second Dropout layer with the output of the second rearrangement layer;
(2.15.7) a custom addition layer, summing the output of the multiplication layer over the 2nd dimension;
(2.15.8) a flattening layer, unrolling the output of the custom addition layer into a one-dimensional sequence;
The outputs of the flattening layers of the 4 branches are spliced by a second series layer, and the output of the second series layer is connected to a third fully connected layer; the third fully connected layer uses softmax as the activation function, has 2 neurons, and outputs the judgment result of the specified action.
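A minimal NumPy sketch of the attention module's forward pass, under the standard additive-attention reading of the layer list: the two rearrangement layers make the 1-neuron softmax produce one weight per temporal position of the 8×64 feature map, after which the weighted features are summed and flattened. The weights below are random stand-ins for trained parameters, and the exact layer ordering in the trained network may differ:

```python
import numpy as np

def softmax(z, axis=-1):
    e = np.exp(z - z.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_module(x, rng=np.random.default_rng(0)):
    """Forward pass over one branch's feature map x of shape (8, 64), i.e. the
    second Dropout output after the Reshape layer described in the text.
    W1 / w2 are randomly initialized stand-ins for the trained parameters."""
    t, d = x.shape                               # t = 8 positions, d = 64 features
    W1 = rng.standard_normal((d, d)) * 0.1       # first FC layer: 64 units, tanh
    w2 = rng.standard_normal((d, 1)) * 0.1       # second FC layer: 1 unit
    h = np.tanh(x @ W1)                          # (8, 64)
    scores = softmax((h @ w2).ravel())           # one attention weight per position
    context = (scores[:, None] * x).sum(axis=0)  # weighted sum over positions -> (64,)
    return scores, context                       # flattening keeps the 64-dim vector
```

The output of each branch's module is thus a 64-dimensional vector; concatenating the 4 branches gives the 256-dimensional input to the final 2-neuron softmax classifier.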
As shown in fig. 9, the intelligent rehabilitation hand device 4 comprises a pneumatic glove 41, a control module 44 arranged on the back of the glove, a bending sensor 42 arranged on the back side of each finger of the glove and connected to the control module 44 by wires, a pressure sensor 43 arranged at the front end of each finger of the glove, and an air pump 45 connected to each finger of the pneumatic glove 41 through air channels. The pneumatic glove 41 uses the air pump 45 to drive the fingers to bend or stretch independently and is made of a soft elastic glove, which is convenient for the user to wear and improves comfort; the bending sensor 42 outputs different voltage values according to the current bending angle of the finger, reflecting the bending angles of the different fingers in real time; the pressure sensor 43 outputs different voltage values according to the current finger force level, reflecting the force levels of the different fingers in real time; the control module 44 consists of an integrated Bluetooth module of model NRF52832, which polls and reads the voltage values output by each bending sensor 42 and pressure sensor 43 through a multiplexing switch and sends them to the MCU processor 25 in the human-computer interaction interface 2. Meanwhile, the control module 44 receives the action selection and parameter setting information sent by the MCU processor 25 in the human-computer interaction interface 2 and decodes it into a pumping/inflating control signal, an air-path selection signal and a speed control signal. The MCU processor 25 gives a rehabilitation effect evaluation grade according to the sensor information.
As shown in fig. 10, the air pump 45 receives the air pumping/inflating control signal, the air path selection signal and the speed control signal from the control module 44, and controls the motion of each finger of the pneumatic glove 41 through the air path. The air pumping/inflating control signal is used for controlling the air flow direction of the air pump and further controlling the stretching or bending of fingers; the air path selection signal is used for selecting specific finger stretching or bending; the speed control signal is used to control the speed at which the finger stretches or bends.
Table 1 rehabilitation effect evaluation level table
As shown in FIG. 11, for the hand full-finger rehabilitation training and evaluation system based on artificial intelligence technology, the user selects a system mode before use, namely the rehabilitation training mode or the rehabilitation effect evaluation mode; wherein,
(1) The rehabilitation training mode comprises the following using steps:
(1.1) a user selects actions required to perform rehabilitation training and sets training parameters;
(1.2) the user performs motor imagery of the corresponding action according to the screen prompt;
(1.3) the EEG intelligent decoding module (3) decodes the EEG signals of the user, judges whether motor imagery of the corresponding action was performed, and transmits the result to the MCU processor (25) in the human-computer interaction interface (2) through Wi-Fi;
(1.4) the MCU processor (25) determines whether to drive the air pump (45) of the corresponding finger according to the classification result, so as to drive the corresponding finger to move;
(2) The rehabilitation effect evaluation mode comprises the following using steps:
(2.1) the user selects the action required to evaluate the rehabilitation effect and sets the evaluation parameters;
(2.2) the user makes corresponding finger actions according to the screen prompt;
(2.3) a control module (44) in the intelligent rehabilitation hand device (4) reads the real-time values of the bending sensor (42) and pressure sensor (43) of the corresponding finger and sends them to the MCU processor (25) in the human-computer interaction interface (2);
(2.4) the MCU processor (25) gives a rehabilitation effect evaluation grade according to the received information.
Those skilled in the art will readily appreciate that the foregoing description is by way of example only of a preferred embodiment of the invention, and is not intended to limit the invention thereto. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present invention should be included in the scope of the present invention.

Claims (10)

1. A hand full-finger rehabilitation training and evaluation system based on artificial intelligence technology, comprising, connected in sequence: a portable EEG acquisition device (1), a human-computer interaction interface (2), an EEG intelligent decoding module (3) and an intelligent rehabilitation hand device (4), characterized in that the system includes two working modes, a rehabilitation training mode and a rehabilitation effect evaluation mode;

in the rehabilitation training mode, the user selects a rehabilitation action through the human-computer interaction interface (2) and performs motor imagery of the corresponding action according to the screen prompts; the portable EEG acquisition device (1) collects EEG signals from the user's brain; the EEG intelligent decoding module (3) decodes the collected EEG signals through deep learning technology, judges whether the user performed motor imagery of the corresponding action, and sends the judgment result to the intelligent rehabilitation hand device (4); the intelligent rehabilitation hand device (4), based on the judgment result of the EEG intelligent decoding module (3), drives the movement of the corresponding finger joints through an air pump to complete the corresponding rehabilitation action;

the EEG intelligent decoding module (3) analyzes and processes the EEG signals with a deep learning method, which specifically requires the following steps:

1) preprocessing the EEG signals, including two stages: filtering and noise reduction, and data enhancement;

in the filtering and noise-reduction stage, a Notch filter is first used to notch-filter the 50 Hz power-frequency interference, and a Butterworth band-pass filter bank then filters the notch-filtered EEG signals into 4 frequency bands, θ (4-7 Hz), α (8-12 Hz), β (13-30 Hz) and γ (31-50 Hz), to obtain the preprocessed EEG signals, where c represents the frequency band, L the data length and g the number of channels;

in the data enhancement stage, the preprocessed EEG signals of the 4 frequency bands are each segmented by a sliding window of length l; the sliding step of the window is b, and the windows do not overlap one another; the data of the j-th sliding window, whose elements are the p-th data points in the g-th channel of the c-th frequency band, form one sample; a label is set for each sample, indicating whether the user performed motor imagery of the corresponding action during the j-th sliding-window interval;

2) building a deep convolutional neural network model based on an attention mechanism, inputting all samples and the corresponding labels into the model, and performing fully supervised training: first, a pre-trained model is obtained by training on the samples of all users; then, based on the pre-trained model, the samples of each user are used for fine-tuning, obtaining a fine-tuned model matched to each user;

in the rehabilitation effect evaluation mode, the user first selects a rehabilitation action through the human-computer interaction interface (2) and then makes the finger action corresponding to the selection; the intelligent rehabilitation hand device (4) wirelessly sends the signals of the finger action to the human-computer interaction interface (2) for rehabilitation effect evaluation.

2. The hand full-finger rehabilitation training and evaluation system based on artificial intelligence technology according to claim 1, characterized in that both the rehabilitation training mode and the rehabilitation effect evaluation mode support 12 finger actions, respectively: bending the thumb, bending the index finger, bending the middle finger, bending the ring finger, bending the little finger, bending the thumb + index finger, bending the thumb + middle finger, bending the thumb + ring finger, bending the thumb + little finger, bending all fingers, and stretching all fingers.

3. The hand full-finger rehabilitation training and evaluation system based on artificial intelligence technology according to claim 1, characterized in that the portable EEG acquisition device (1) comprises, connected in sequence: an EEG electrode cap and connecting wire (11) for collecting EEG signals, a physiological electrical signal acquisition-conversion module (12) for amplifying and converting the EEG signals, an integrated Wi-Fi module (13) for controlling the physiological electrical signal acquisition-conversion module (12), and a power supply circuit (14) connected respectively to the physiological electrical signal acquisition-conversion module (12) and the integrated Wi-Fi module (13); the electrode cap in the EEG electrode cap and connecting wire (11) is in direct contact with the user's scalp through its electrodes and collects the EEG signals on the scalp surface, and is connected to the physiological electrical signal acquisition-conversion module (12) through the connecting wire and its Y2 interface for transmitting the EEG signals; the integrated Wi-Fi module (13) is responsible for reading the data from the physiological electrical signal acquisition-conversion module (12) and sending them to the EEG intelligent decoding module (3).

4. The hand full-finger rehabilitation training and evaluation system based on artificial intelligence technology according to claim 3, characterized in that the EEG electrode cap and its connecting wire (11) acquire, through the electrodes, the user's EEG signals at the sixteen electrode positions FP1, FP2, F3, Fz, F4, FCz, T3, C3, Cz, C4, T4, P3, Pz, P4, Oz and A1 of the electrode cap; the electrode distribution of the cap conforms to the 10/20 international standard leads.

5. The hand full-finger rehabilitation training and evaluation system based on artificial intelligence technology according to claim 3, characterized in that the power supply circuit (14) is powered by a 4.2 V rechargeable lithium battery and controls the charging mode and the working mode through a switch: when the switch is turned off, the power circuit enters the charging mode, and the user connects the device to a 5 V power supply interface with a USB cable for charging; in the charging mode, the power circuit adopts an integrated transformer module to electrically isolate the circuit board from the external power supply, preventing circuit damage caused by overvoltage; when the switch is turned on, the power circuit enters the working mode and the charging mode is disabled; in the working mode, the power circuit uses several low-noise linear regulators with different output voltages to meet the power supply requirements of the different devices on the circuit board.

6. The hand full-finger rehabilitation training and evaluation system based on artificial intelligence technology according to claim 1, characterized in that the human-computer interaction interface (2) comprises: a touch display screen (21), a Wi-Fi module (22), a Bluetooth module (23) and a voice prompt module (24), and an MCU processor (25) connected respectively to the touch display screen (21), the Wi-Fi module (22), the Bluetooth module (23) and the voice prompt module (24); the MCU processor (25) adopts the working mode of an embedded operating system and completes the driving of the touch display screen (21), the data transmission and reception of the Wi-Fi module (22), the data transmission and reception of the Bluetooth module (23), and the playing of voice prompts through the voice prompt module (24); the user performs action selection and parameter setting through the touch display screen (21) and carries out rehabilitation training or rehabilitation effect evaluation according to the picture prompts on the touch display screen (21); the MCU processor (25) communicates data with the EEG intelligent decoding module (3) through the Wi-Fi module (22), reads the sensor information of the intelligent rehabilitation hand device (4) through the Bluetooth module (23), and transmits the action selection and parameter setting information to the intelligent rehabilitation hand device (4).

7. The hand full-finger rehabilitation training and evaluation system based on artificial intelligence technology according to claim 1, characterized in that the deep convolutional neural network model based on the attention mechanism comprises 4 branches, each branch comprising, sequentially concatenated:

(2.1) a data input layer, whose input is the user's preprocessed EEG signal of the corresponding θ, α, β or γ frequency band;
(2.2) a first convolution layer, with a convolution kernel size of 1×l, 32 convolution kernels and a regularization coefficient of 0.01, l being the length of the sliding window;
(2.3) a second convolution layer, with a convolution kernel size of 1×16 and 32 convolution kernels;
(2.4) a first series layer, concatenating the output of the first convolution layer and the output of the second convolution layer along the last dimension;
(2.5) a third convolution layer, with a convolution kernel size of g×1, a depth multiplier of 1 and a regularization coefficient of 0.01, g being the number of channels;
(2.6) a first batch normalization layer, for accelerating model training and reducing overfitting;
(2.7) a first activation function layer, using the Elu activation function;
(2.8) a first average pooling layer, with a pooling kernel size of 1×4;
(2.9) a first Dropout layer, with a Dropout probability of 0.5;
(2.10) a fourth convolution layer, with a convolution kernel size of 1×16, a depth multiplier of 1 and a regularization coefficient of 0.01;
(2.11) a second batch normalization layer, for accelerating model training and reducing overfitting;
(2.12) a second activation function layer, using the Elu activation function;
(2.13) a second average pooling layer, with a pooling kernel size of 1×8;
(2.14) a second Dropout layer, with a Dropout probability of 0.5;
(2.15) an attention mechanism module, comprising:
(2.15.1) a Reshape layer, converting the output size of the second Dropout layer to 8×64;
(2.15.2) a first fully connected layer, with 64 neurons and the tanh activation function;
(2.15.3) a first rearrangement layer, exchanging the 1st and 2nd dimensions of the output;
(2.15.4) a second fully connected layer, with 1 neuron and the softmax activation function;
(2.15.5) a second rearrangement layer, exchanging the 1st and 2nd dimensions of the output;
(2.15.6) a multiplication layer, element-wise multiplying the output of the second Dropout layer with the output of the second rearrangement layer;
(2.15.7) a custom addition layer, summing the output of the multiplication layer over the 2nd dimension;
(2.15.8) a flattening layer, unrolling the output of the custom addition layer into a one-dimensional sequence;
the outputs of the flattening layers of the 4 branches are spliced by a second series layer, and the output of the second series layer is connected to a third fully connected layer; the third fully connected layer uses softmax as the activation function, has 2 neurons, and outputs the judgment result of the specified action.
activation function, and the number of neurons is 2. The output is the judgment result of the specified action. 8.根据权利要求1所述的一种基于人工智能技术的手部全指康复训练及评估系统,其特征在于,所述的智能康复手设备(4)包括气动手套(41),安装在气动手套手背处的控制模块(44),分别与所述的控制模块(44)通过导线相连的安装在气动手套每个手指背侧的弯曲度传感器(42)、安装在气动手套每个手指前端的压力传感器(43),分别与所述的气动手套(41)的每个手指通过气路相连的气泵(45);所述的气动手套(41)使用气泵(45)驱动手指独立弯曲或伸展,材质为软体弹性手套,便于使用者佩戴,提高舒适度;所述的弯曲度传感器(42)根据当前手指弯曲角度输出不同的电压值,所述的压力传感器(43)根据当前手指用力程度输出不同的电压值;所述的控制模块(44)是由集成式蓝牙模块构成,集成式蓝牙模块通过多路复用开关轮询读取各个弯曲度传感器(42)和压力传感器(43)输出的电压值,并发送至人机交互界面(2)中的MCU处理器(25);同时,控制模块(44)接收来自于人机交互界面(2)中MCU处理器(25)发送的动作选择、参数设置信息,并解码为抽气/打气控制信号、气路选择信号、速度控制信号。8. A kind of artificial intelligence technology-based rehabilitation training and evaluation system for all fingers of the hand according to claim 1, characterized in that, the intelligent rehabilitation hand device (4) comprises a pneumatic glove (41), installed in a pneumatic The control module (44) at the back of the glove, is respectively connected to the control module (44) by wires and is installed on the bending sensor (42) on the back side of each finger of the pneumatic glove, and is installed on the front end of each finger of the pneumatic glove. A pressure sensor (43), an air pump (45) connected to each finger of the pneumatic glove (41) through an air circuit; the pneumatic glove (41) uses an air pump (45) to drive fingers to bend or stretch independently, The material is a soft elastic glove, which is convenient for the user to wear and improves comfort; the bending sensor (42) outputs different voltage values according to the current bending angle of the finger, and the pressure sensor (43) outputs different voltage values according to the current finger force degree. 
the voltage value; the control module (44) is made of an integrated bluetooth module, and the integrated bluetooth module polls and reads the output voltages of each bending sensor (42) and pressure sensor (43) through a multiplexing switch value, and sent to the MCU processor (25) in the human-computer interaction interface (2); at the same time, the control module (44) receives the action selection, Parameter setting information, and decoded into pumping/inflating control signals, gas path selection signals, and speed control signals. 9.根据权利要求8所述的一种基于人工智能技术的手部全指康复训练及评估系统,其特征在于,所述的气泵(45)接收来自控制模块(44)的抽气/打气控制信号、气路选择信号、速度控制信号,通过气路控制气动手套(41)的每个手指的动作。9. A kind of hand full finger rehabilitation training and assessment system based on artificial intelligence technology according to claim 8, is characterized in that, described air pump (45) receives from control module (44) pumping/inflating control Signal, air path selection signal, speed control signal, control the action of each finger of pneumatic glove (41) by air path. 10.根据权利要求1所述的一种基于人工智能技术的手部全指康复训练及评估系统,其特征在于,使用者在使用前进行系统模式选择,即选择康复训练模式或康复效果评估模式;其中,10. 
A kind of artificial intelligence technology-based rehabilitation training and evaluation system for all fingers of the hand according to claim 1, characterized in that the user selects the system mode before use, that is, selects the rehabilitation training mode or the rehabilitation effect evaluation mode ;in, (1)康复训练模式包括如下使用步骤:(1) The rehabilitation training mode includes the following steps: (1.1)使用者对需要进行康复训练的动作进行选择,并设置训练参数;(1.1) The user selects the action that requires rehabilitation training and sets the training parameters; (1.2)使用者根据屏幕提示,进行相应动作的运动想象;(1.2) The user performs motor imagery of corresponding actions according to the screen prompts; (1.3)脑电智能解码模块(3)对使用者的EEG脑电信号进行解码,判断是否进行了相应动作的运动想象,并将结果通过Wi-Fi传送至人机交互界面(2)中的MCU处理器(25);(1.3) The EEG intelligent decoding module (3) decodes the user's EEG EEG signal, judges whether the motor imagery of the corresponding action has been performed, and transmits the result to the human-computer interaction interface (2) via Wi-Fi MCU processor (25); (1.4)MCU处理器(24)根据分类结果,决定是否驱动对应手指的气泵(45),进而带动相应手指运动;(1.4) MCU processor (24) determines whether to drive the air pump (45) corresponding to the finger according to the classification result, and then drives the corresponding finger movement; (2)康复效果评估模式包括如下使用步骤:(2) The rehabilitation effect evaluation model includes the following steps: (2.1)使用者对需要进行康复效果评估的动作进行选择,并设置评估参数;(2.1) The user selects the actions that need to be evaluated for rehabilitation effects, and sets the evaluation parameters; (2.2)使用者根据屏幕提示,做出相应的手指动作;(2.2) The user makes corresponding finger movements according to the screen prompts; (2.3)智能康复手设备(4)中的控制模块(44)读取对应手指的弯曲度传感器(42)及压力传感器(43)的实时数值,并发送至人机交互界面(2)中的MCU处理器(25);(2.3) The control module (44) in the intelligent rehabilitation hand device (4) reads the real-time values of the curvature sensor (42) and the pressure sensor (43) of the corresponding finger, and sends it to the human-computer interaction interface (2) MCU processor (25); 
(2.4)MCU处理器(24)根据接收到的信息给出康复效果评估等级。(2.4) The MCU processor (24) gives the rehabilitation effect evaluation grade according to the received information.
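The attention mechanism module of claim 7 (items 2.15.1 onward) can be illustrated shape-by-shape. The NumPy sketch below is an interpretation for illustration only, not the patent's implementation: the weights are random rather than trained, and the softmax normalization is taken across the 64 scores produced by the 1-neuron layer, which is the reading that yields non-trivial attention weights.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# (2.15.1) Output of the second dropout layer, reshaped to 8 x 64.
x = rng.standard_normal((8, 64))

# (2.15.2) First fully connected layer: 64 units, tanh activation.
W1 = rng.standard_normal((64, 64)) * 0.1
b1 = np.zeros(64)
h = np.tanh(x @ W1 + b1)                   # shape (8, 64)

# (2.15.3)-(2.15.5) Permute, score with a 1-unit layer, normalize:
# one attention weight per feature column.
h_t = h.T                                  # shape (64, 8)
w2 = rng.standard_normal(8) * 0.1
scores = h_t @ w2                          # shape (64,)
alpha = softmax(scores, axis=0)            # attention weights, sum to 1

# (2.15.6)-(2.15.7) Element-wise multiply with the dropout output,
# then sum along the second (feature) dimension.
weighted = x * alpha                       # broadcast (8, 64) * (64,)
branch_out = weighted.sum(axis=1)          # shape (8,)

# (2.15.8) The flatten layer then yields an 8-element vector per branch;
# the four branches are concatenated and fed to the final
# 2-neuron softmax layer.
print(branch_out.shape)
```

With four frequency-band branches, concatenation gives a 32-element vector feeding the third fully connected layer.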
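The two usage modes of claim 10 reduce to a simple per-trial dispatch on the MCU processor. The sketch below is a hypothetical control flow; every name in it (`decode_eeg`, `drive_pump`, `read_sensors`, `grade`, the threshold values) is an illustrative placeholder, not an identifier or parameter from the patent.

```python
# Hypothetical control flow for the two system modes of claim 10.
# All names and thresholds are illustrative placeholders.

def training_trial(action, decode_eeg, drive_pump):
    """One rehabilitation-training trial (steps 1.2-1.4)."""
    imagined = decode_eeg(action)      # 2-class result from the EEG decoder
    if imagined:                       # drive the pump only on a positive
        drive_pump(action)             # classification for that finger
    return imagined

def evaluation_trial(action, read_sensors, grade):
    """One rehabilitation-evaluation trial (steps 2.2-2.4)."""
    bend_v, pressure_v = read_sensors(action)  # curvature / pressure voltages
    return grade(bend_v, pressure_v)           # evaluation grade

# Toy stand-ins for the decoder and hardware:
log = []
result = training_trial(
    "index_flex",
    decode_eeg=lambda a: True,
    drive_pump=log.append,
)
print(result, log)

level = evaluation_trial(
    "index_flex",
    read_sensors=lambda a: (1.8, 0.9),
    grade=lambda b, p: "good" if b > 1.5 and p > 0.5 else "poor",
)
print(level)
```

The point of the split is that the training path closes the loop through the EEG decoder, while the evaluation path closes it through the glove sensors only.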
CN202210114284.1A 2022-01-30 2022-01-30 Hand full-fingered rehabilitation training and evaluation system based on artificial intelligence technology Active CN114504468B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210114284.1A CN114504468B (en) 2022-01-30 2022-01-30 Hand full-fingered rehabilitation training and evaluation system based on artificial intelligence technology


Publications (2)

Publication Number Publication Date
CN114504468A (en) 2022-05-17
CN114504468B (en) 2023-08-08

Family

ID=81551181

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210114284.1A Active CN114504468B (en) 2022-01-30 2022-01-30 Hand full-fingered rehabilitation training and evaluation system based on artificial intelligence technology

Country Status (1)

Country Link
CN (1) CN114504468B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116013515A (en) * 2022-12-08 2023-04-25 电子科技大学 Rehabilitation Evaluation Method Based on Autoencoder and Softmax Classifier
CN116269444A (en) * 2023-02-20 2023-06-23 刘思宇 Hand training evaluation method and device based on multimodal stimulation and neural network
CN116172577A (en) * 2023-03-09 2023-05-30 江门市中心医院 Hand rehabilitation system and method for electroencephalogram control
CN118750009A (en) * 2024-07-02 2024-10-11 安徽大学 Non-invasive rehabilitation treatment and control system based on brain-computer interface

Citations (6)

Publication number Priority date Publication date Assignee Title
JP2003000569A (en) * 2001-06-18 2003-01-07 Fumio Nogata Robot for aiding finger locomotion function recovery
CN1568170A (en) * 2001-09-10 2005-01-19 新纪元创新有限公司 Apparatus, method and computer program product to produce or direct movements in synergic timed correlation with physiological activity
CN102138860A (en) * 2011-01-10 2011-08-03 西安交通大学 Intelligentized rehabilitation training equipment for hand functions of patients suffering from cerebral injury
CN107157705A (en) * 2017-05-09 2017-09-15 京东方科技集团股份有限公司 Rehabilitation Systems and Methods
WO2018188480A1 (en) * 2017-04-14 2018-10-18 The Chinese University Of Hongkong Flexibly driven robotic hands
CN111631907A (en) * 2020-05-31 2020-09-08 天津大学 Hand rehabilitation system for stroke patients based on brain-computer interaction hybrid intelligence


Non-Patent Citations (1)

Title
Research on and implementation of key technologies for motor-imagery-based brain-computer interfaces; Xie Zhirong; master's thesis, Chongqing University of Posts and Telecommunications; full text *

Also Published As

Publication number Publication date
CN114504468A (en) 2022-05-17

Similar Documents

Publication Publication Date Title
CN114504468B (en) Hand full-fingered rehabilitation training and evaluation system based on artificial intelligence technology
CN111631907B (en) Hand rehabilitation system for stroke patients based on brain-computer interaction hybrid intelligence
CN111513991B (en) Active hand full-finger rehabilitation equipment based on artificial intelligence technology
CN111631908B (en) Active hand rehabilitation system for stroke based on brain-computer interaction and deep learning
CN111616682B (en) Epileptic seizure early warning system based on portable electroencephalogram acquisition equipment and application
CN101711709B (en) Electric artificial hand control method using electro-oculogram and electro-encephalic information
CN110059575A (en) A kind of augmentative communication system based on the identification of surface myoelectric lip reading
CN111631848B (en) Mind-controlled prosthetic system based on brain-computer hybrid intelligence
CN114504730A (en) Portable brain-controlled hand electrical stimulation rehabilitation system based on deep learning
CN111584029B (en) Electroencephalogram self-adaptive model based on discriminant confrontation network and application of electroencephalogram self-adaptive model in rehabilitation
CN114647314A (en) A wearable limb movement intelligent sensing system based on electromyography
CN113128353B (en) Emotion perception method and system oriented to natural man-machine interaction
CN110354387A (en) The intelligent electro photoluminescence hand trainer and method of more triggering modes
CN111950460B (en) Muscle strength self-adaptive stroke patient hand rehabilitation training action recognition method
CN113143676B (en) Control method of external limb finger based on brain-muscle-electricity cooperation
CN114626469B (en) Method, device and system for limb rehabilitation after stroke based on multi-layer perceptron model
CN105138133A (en) Biological signal gesture recognition device and method
CN119818334A (en) Intelligent rehabilitation glove system and method based on noninvasive motor nerve interface
CN114504333B (en) Wearable vestibule monitoring system based on myoelectricity and application
CN109498362A (en) A kind of hemiplegic patient's hand movement function device for healing and training and model training method
CN112545536B (en) Brain plasticity-based motion assistance device and its control method and circuit
CN113345546A (en) Hand function active rehabilitation training system and method based on steady-state visual evoked potential
CN118153626A (en) Lightweight network model and construction method for surface electromyography signal gesture recognition
CN114936574A (en) High-flexibility manipulator system based on BCI and implementation method thereof
CN111584028A (en) A Novel Brain-Controlled Intelligent Rehabilitation System Based on Visual Symbol Network and Width Learning

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant