HK1261790A1 - Extraction of features from physiological signals
- Publication number: HK1261790A1
- Application number: HK19121689.4A
- Authority: HK (Hong Kong)
- Prior art keywords: motion, subject, signal, determining, component
Description
Cross Reference to Related Applications
Priority is claimed to U.S. provisional application serial No. 62/403,808, filed on October 4, 2016, and U.S. provisional application serial No. 62/323,928, filed on April 18, 2016, both of which are incorporated herein by reference in their entirety.
Statement regarding federally sponsored research
The invention was made with government support under contract number FA8721-05-C-0002 awarded by the United States Air Force. The government has certain rights in the invention.
Background
The present invention relates to extracting features from physiological signals, and in particular to extracting features from signals representing physiological motion.
Systems that can infer the mood of a subject, and in some cases react to the inferred mood, continue to be of interest. Such a system may be used to design and test games, movies, advertisements, online content, and human-machine interfaces.
In some examples, a system for inferring the mood of a subject operates in two phases: in a first phase, the system extracts emotion-related signals (e.g., audiovisual cues or physiological signals), and in a second phase it feeds the emotion-related signals into a classifier to identify emotions. Existing methods for extracting emotion-related signals fall into two categories: audiovisual techniques and physiological techniques.
Audiovisual techniques generally rely on facial expressions, speech, and gestures captured in an audiovisual recording or stream. Audiovisual methods do not require the user to wear any sensors on his or her body. However, because audiovisual methods rely on outward expression, they often miss subtle emotions and may fail when the subject controls or suppresses the outward expression of emotion. In addition, many vision-based techniques require the user to face the camera to operate properly.
Physiological techniques rely on physiological measurements such as ECG and EEG signals. These measurements are generally more difficult for a subject to control, as they are driven by the involuntary activity of the Autonomic Nervous System (ANS). However, existing sensors that can extract these signals require physical contact with the body, and thus can interfere with the subject's experience and may affect his or her emotional state.
Existing methods for identifying emotions based on emotion-related signals extract emotion-related features from a measurement signal and then process the extracted features using a classifier to identify an emotional state of a subject. Some existing classification methods assign discrete labels (e.g., happy, sad, or angry) to various emotions. Other existing classification methods use multidimensional models that express emotion in a 2D plane spanned by a valence axis (i.e., positive versus negative emotion) and an arousal axis (i.e., calm versus excited). For example, anger and sadness are both negative emotions, but anger involves higher arousal. Similarly, happiness and pleasure are both positive emotions, but the former is associated with excitement, while the latter refers to a contented state.
Disclosure of Invention
In a general aspect, a method for processing motion-based physiological signals representative of motion of a subject uses signal reflections from the subject to perform the processing, the method comprising transmitting, from a transmitting element, a radio frequency transmit signal comprising one or more transmit signal patterns. A radio frequency receive signal comprising a combination of a plurality of reflections of the transmit signal is received at one or more receive elements, at least some of the plurality of reflections of the transmit signal being associated with the subject. The time-successive reflections of the transmit signal pattern are processed to form one or more motion-based physiological signals, including, for at least some of the plurality of reflections, forming a motion-based physiological signal representative of physiological motion of the subject from changes over time in the reflections of the transmit signal in the receive signal. Each motion-based physiological signal of a subset of the one or more motion-based physiological signals is processed to determine a segment of a heartbeat component of the motion-based physiological signal, the processing comprising: determining the heartbeat component, determining a template temporal pattern for a heartbeat in the heartbeat component, and determining a segmentation of the heartbeat component based on the determined template temporal pattern.
Aspects can include one or more of the following features.
The transmission signal may be a Frequency Modulated Continuous Wave (FMCW) signal comprising a repetition of a single signal pattern. The one or more transmit signal patterns may include one or more pseudorandom noise sequences. Determining the heartbeat component may include mitigating an effect of respiration on the motion-based physiological signal, which includes determining a second derivative of the motion-based physiological signal. Determining the heartbeat component may include mitigating an effect of respiration on the motion-based physiological signal, which may include filtering the motion-based physiological signal using a band-pass filter. Determining the template temporal pattern for the heartbeat and determining the segment of the heartbeat component in the heartbeat component may include jointly optimizing the temporal pattern for the heartbeat and the segment of the heartbeat component.
The method may include determining a cognitive state of the subject based at least in part on the determined segment of the heartbeat component of the motion-based physiological signal associated with the subject. The cognitive state of the subject may include one or more of the following states: a confusion state, a distraction state, and an attention-concentration state. The method may include extracting features from heartbeat components of respective ones of the motion-based physiological signals, the features including peaks, valleys, and inflection points, and mapping the extracted features to one or more cardiac functions.
The method may include determining an emotional state of the subject based at least in part on the determined segment of the heartbeat component of the motion-based physiological signal associated with the subject. Determining an emotional state of the subject may also be based on a respiratory component of one or more motion-based physiological signals. The method may include determining a respiratory component of the one or more motion-based physiological signals, which includes applying a low-pass filter to the one or more motion-based physiological signals. Determining an emotional state of the subject may include applying an emotion classifier to one or more features determined from the determined segment of the heartbeat component of the motion-based physiological signal.
Determining an emotional state of the subject may include applying an emotion classifier to one or more features determined from the determined segment of the heartbeat component of the motion-based physiological signal and to one or more features determined from the respiratory component of the one or more motion-based physiological signals. The method may include presenting the emotional state in a two-dimensional grid, the two-dimensional grid including: a first dimension, the arousal dimension, and a second dimension, the valence dimension. The motion-based physiological signal may represent physiological motion of the subject according to a change over time in a phase angle of a transmission signal reflection in the reception signal.
In another general aspect, a method for determining an emotional state of a subject, includes: receiving a motion-based physiological signal associated with a subject, the motion-based physiological signal including a component related to a vital sign of the subject; and determining an emotional state of the subject based at least in part on the component related to the vital sign of the subject.
Aspects can include one or more of the following features.
The components related to vital signs of the subject may comprise periodic components, the method further comprising determining a segmentation of the periodic components. Determining the segmentation of the periodic component may include determining a template temporal pattern for a period of the periodic component and determining the segmentation of the periodic component based on the determined template temporal pattern. Determining the emotional state of the subject may be based at least in part on the segmentation of the periodic component. The periodic component may include at least one of a heartbeat component and a respiration component.
Determining the heartbeat component may include determining a second derivative of the motion-based physiological signal. The method may further comprise determining the heartbeat component, which includes applying a band pass filter to the motion based physiological signal. The method may include determining the respiratory component, including applying a low pass filter to the motion-based physiological signal. Determining an emotional state of the subject may include applying an emotion classifier to one or more features determined from motion-based physiological signals associated with the subject. Determining an emotional state of the subject may include applying an emotion classifier to one or more features determined from the determined segments of the periodic component.
The method may include presenting the emotional state in a two-dimensional grid, the two-dimensional grid including: a first dimension, the arousal dimension, and a second dimension, the valence dimension. The motion-based physiological signal associated with the subject may be associated with an accelerometer measurement. The motion-based physiological signal associated with the subject may be associated with an ultrasound measurement. The motion-based physiological signals associated with the subject may be associated with radio frequency-based measurements. The motion-based physiological signal associated with the subject may be associated with a video-based measurement.
As mentioned above, existing methods for inferring a person's mood typically rely on audiovisual cues such as images and audio clips, or require the person to wear physiological sensors such as ECG monitors. These prior approaches all have associated limitations.
In particular, current audiovisual techniques capture the outward appearance of emotion, but do not measure the internal emotional state. For example, a person may be happy even without smiling, or may smile even without being happy. Furthermore, the problem is complicated by the fact that people vary greatly in how expressively they display internal emotions. Monitoring physiological signals (e.g., heartbeats) using body sensors is an improved method of measuring a subject's internal emotions, as it takes into account the interaction between the autonomic nervous system and the heart rhythm. However, measuring these signals using body sensors (e.g., ECG monitors) is cumbersome and may interfere with user activity and mood, making this approach unsuitable for routine use.
The aspects described in the present disclosure directly measure the physiological signal without the subject wearing a sensor on his or her body, and then use the measured physiological signal to estimate the subject's emotion. In some aspects, the method senses the physiological signal (and the emotion associated with it) using a Radio Frequency (RF) signal. In particular, the RF signal reflects off the human body and is modulated by body motion, including motion associated with respiration and motion associated with the heartbeat.
If each heartbeat in the heartbeat component of the RF reflection signal can be extracted, the subject's emotion can be estimated from slight changes in the length and/or shape of individual heartbeats. However, extracting the heartbeats from the RF reflection signal presents a number of challenges. For example, the RF reflection signal is modulated by both the subject's respiration and the subject's heartbeat, where the influence of respiration is typically several orders of magnitude larger than the influence of the heartbeat, so that respiration-related motion masks the heartbeats. To separate respiration from heart rate, older systems operated in the frequency domain over windows of many seconds, forgoing the ability to measure beat-to-beat variations.
Furthermore, beat-related features in the RF-reflected signal (generally referred to herein as "beats") lack the spikes that characterize the ECG signal, making it more difficult to accurately identify the boundaries of the beat.
Finally, differences in inter-beat intervals (IBI) are only a few tens of milliseconds. Therefore, each beat must be segmented to within a few milliseconds. Achieving this accuracy is particularly difficult without clear features marking the beginning or end of a heartbeat.
Aspects address these challenges to implement a wireless system for emotion recognition using RF reflections off the human body. Aspects utilize algorithms for extracting the beats, and the variation between beats, from the RF reflection signal. In some aspects, the algorithm first mitigates the effect of respiration on the RF reflection signal. In some examples, the mitigation mechanism is based on the recognition that, although the chest displacement due to the inhale-exhale process is several orders of magnitude larger than the minute vibrations caused by the heartbeat, the acceleration of respiration-related motion is significantly smaller than the acceleration of heartbeat-related motion. That is, breathing is generally slow and steady, while the heartbeat involves rapid, localized contractions of the heart muscle. Thus, aspects operate on the acceleration of the RF reflection signal to suppress the respiration signal and emphasize the heartbeat.
Aspects then segment the RF reflection signal into individual heartbeats. In contrast to ECG signals, which have a known expected shape, the shape of a heartbeat in the RF reflection signal is unknown and varies with the subject's body and exact posture relative to the apparatus. Thus, aspects need to learn the beat shape while performing the segmentation. To this end, a joint optimization algorithm iterates between two sub-problems: the first sub-problem learns a template for the heartbeat given a particular segmentation, while the second sub-problem finds the segmentation that maximizes similarity to the learned template. The algorithm iterates between the two sub-problems until it converges on the best beat template and the best segmentation that maximizes similarity to that template.
The segmentation allows beats to be shrunk or stretched, and thus beat lengths may differ. Accordingly, the algorithm finds a beat segmentation that maximizes the similarity of the beat-signal morphology across successive beats, while allowing flexible warping (shrinking or stretching) of the beat signal.
Certain aspects provide the determined segments to an emotion classification subsystem. The emotion classification subsystem computes heartbeat-based features and respiration-based features and uses a Support Vector Machine (SVM) classifier to distinguish between various emotional states.
Aspects may have one or more of the following advantages.
Among other advantages, aspects can accurately extract heartbeats from the RF reflection signal. This matters because emotion recognition accuracy degrades significantly even with an error of 40-50 milliseconds in estimating the heartbeat intervals. In contrast, aspects are capable of achieving an inter-beat interval (IBI) average error of 3.2 milliseconds, which is less than 0.4% of the average beat length.
Aspects recognize the emotion of a subject by relying on wireless signals reflected off the subject's body.
Aspects can recover a person's entire heartbeat waveform from RF reflections, and thus can be used in the context of non-invasive health monitoring and diagnosis.
Aspects capture physiological signals by relying only on wireless signals reflected from the user's body without requiring him/her to wear any sensors.
Drawings
Fig. 1 is a block diagram of an emotion recognition system.
Fig. 2 is a block diagram of a motion signal acquisition module of the system of fig. 1.
Fig. 3 is an example of a signal representing physiological motion of a subject.
Fig. 4 is a block diagram of a motion signal processing module of the system of fig. 1.
Fig. 5 is an example of a heartbeat component of the signal in fig. 3.
Fig. 6 is an example of the respiratory component of the signal in fig. 3.
Fig. 7 is a pseudo-code description of a heartbeat segmentation algorithm.
Fig. 8 is a segmentation of the heartbeat component of fig. 5.
Fig. 9 is a heartbeat template determined from the heartbeat component of fig. 5.
Fig. 10 is a two-dimensional mood grid.
Detailed Description
Referring to fig. 1, an emotion recognition system 100 acquires a signal representing physiological motion of a subject 104, and processes the acquired signal to infer an emotional state 112 of the subject. The system 100 comprises a motion signal acquisition module 102 for acquiring a signal related to physiological motion of a subject 104, a motion signal processing module 106, a heartbeat segmentation module 107, a feature extraction module 108, and an emotion classification module 110 for classifying an emotional state 112 of the subject.
1 Signal acquisition
In the example of fig. 1, the body of the subject 104 moves due to both the subject's respiration and the beating of the subject's heart. The motion signal acquisition module 102 includes one or more transducers (not shown) that sense the motion of the subject's body (or any other physiological motion) and generate a signal φ(t) (e.g., an electrical signal) representative of that motion.
Referring to fig. 2, in some examples, the motion signal acquisition module 102 uses wireless sensing techniques to generate the signal representative of the subject's body motion. Wireless sensing exploits the fact that the characteristics of a wireless signal are affected by motion in the environment, including chest motion due to inhaling and exhaling and body vibrations due to the heartbeat. In particular, the wireless sensing system emits wireless signals that reflect off objects in the environment, including the subject 104 (note that there may be more than one subject in the environment). The reflected signals are then received at the motion signal acquisition module 102. As the subject 104 breathes and as the subject's heart beats, the distance traveled by the reflected wireless signals changes. The wireless sensing system uses the time of flight (TOF) (also referred to as the "round trip time") to monitor the distance between the system's antennas and the subject 104.
In fig. 2, the motion signal acquisition module 102 implements a particular wireless sensing technique known as Frequency Modulated Continuous Wave (FMCW) wireless sensing. The module includes a transmit antenna 114, a receive antenna 116, and a number of signal processing components including a controller 118, an FMCW signal generator 120, a frequency shift module 122, and a phase signal extraction module 124.
In operation, the controller 118 causes the FMCW signal generator 120 to generate repetitions of a signal pattern (e.g., a frequency-swept signal pattern). The repeated signal pattern is provided to the transmit antenna 114 and transmitted into the environment surrounding the module 102. The transmitted signal reflects off one or more subjects 104 and/or other objects 105 (such as walls and furniture in the environment) and is then received by the receive antenna 116. The received reflected signal is provided to the frequency shift module 122 along with the transmitted signal generated by the FMCW signal generator 120. The frequency shift module 122 frequency shifts (e.g., "downconverts" or "downmixes") the received signal based on the transmitted signal (e.g., by multiplying the signals) and transforms the frequency-shifted received signal into a frequency-domain representation (e.g., via a Fast Fourier Transform (FFT)), resulting in a representation S_i(ω) of the frequency-shifted received signal over a discrete set of frequencies ω, one such representation per repetition i of the signal pattern.
The frequency-shifted signal S_i(ω) is provided to the phase signal extraction module 124, which processes S_i(ω) to extract one or more phase signals φ(t). In some examples, the phase signal extraction module 124 processes S_i(ω) to spatially separate the reflected signals from objects and/or subjects in the environment based on the distances traveled by their reflections. In some examples, the phase signal extraction module 124 eliminates reflections from static objects (i.e., objects that do not move over time).
In the example shown in fig. 2, the path 112 between the transmit antenna 114 and the receive antenna 116 is shown reflecting off a representative subject 104. Assuming a constant signal propagation speed c (i.e., the speed of light), the time of flight (TOF) of a signal transmitted from coordinates (x_t, y_t, z_t), reflected by an object at coordinates (x_o, y_o, z_o), and received by the receive antenna at coordinates (x_r, y_r, z_r) may be expressed as:

$$ \mathrm{TOF} = \frac{\sqrt{(x_t-x_o)^2+(y_t-y_o)^2+(z_t-z_o)^2}+\sqrt{(x_o-x_r)^2+(y_o-y_r)^2+(z_o-z_r)^2}}{c} $$
in this case, with a single antenna pair, the TOF associated with the path 112 constrains the position of the object 104 on an ellipsoid defined by the three-dimensional coordinates of the transmit and receive antennas of the path, and the path distance determined from the TOF.
As described above, the distance traveled along such a path varies with the subject's chest motion due to exhaling and inhaling and with body vibrations due to the heartbeat. The varying distance between the antennas 114, 116 and the subject 104 appears in the reflected signal as a time-varying phase:

$$ \phi(t) = 2\pi \frac{d(t)}{\lambda} $$
where φ(t) is the phase of the signal, λ is the wavelength, d(t) is the total distance traveled by the signal, and t is a time variable. The phase φ(t) of the signal is output from the motion signal acquisition module 102 as the signal representing the subject's body motion.
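By way of illustration, the following Python sketch shows how sub-millimeter physiological motion maps to a measurable phase shift through the relation above; the sample rate, carrier wavelength, and motion amplitudes are illustrative assumptions, not values from this disclosure.

```python
import numpy as np

fs = 100.0                        # assumed sample rate of the phase signal (Hz)
t = np.arange(0.0, 10.0, 1.0 / fs)
wavelength = 0.05                 # e.g., a ~6 GHz carrier -> ~5 cm wavelength

# Round-trip path length: a static 3 m distance, plus ~5 mm of chest
# displacement from breathing at 0.25 Hz, plus ~0.1 mm of vibration
# from the heartbeat at 1.1 Hz (amplitudes are rough assumptions).
d = 2.0 * (3.0
           + 5e-3 * np.sin(2 * np.pi * 0.25 * t)
           + 1e-4 * np.sin(2 * np.pi * 1.1 * t))

phi = 2 * np.pi * d / wavelength  # phase of the reflection: phi(t) = 2*pi*d(t)/lambda
```

Even the 0.1 mm heartbeat term changes the phase by roughly 2π·(2·10⁻⁴)/0.05 ≈ 0.025 radians, which is well within the resolution of a coherent receiver.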
More details of the FMCW-based motion sensing techniques described above may be found in PCT application number PCT/US2015/027945, filed on April 28, 2015, published as WO2015168093, and entitled "VITAL SIGNS MONITORING VIA RADIO REFLECTION," which is incorporated herein by reference.
Referring to fig. 3, one example of the signal φ(t) representing the subject's body motion acquired by the signal acquisition module 102 has a relatively large respiratory component (i.e., a roughly sinusoidal component with a frequency of 0.25 Hz) due to the displacement of the subject's chest as the subject inhales and exhales. The heartbeat component of the phase signal appears as small variations modulating the respiratory component, caused by the minute body vibrations associated with the subject's heartbeat and pulsing blood.
2 Motion signal processing
Referring again to fig. 1, the motion signal processing module 106 receives the signal φ(t) representing the motion of the subject from the motion signal acquisition module 102 and processes it to separate a heartbeat component φ''(t) of the signal from a respiration component φ_b(t) of the signal.
Referring to fig. 4, the motion signal processing module includes a differentiator 442 for processing the signal φ(t) representing the motion of the subject to isolate the heartbeat component φ''(t) of the signal, and a low-pass filter 440 for isolating the respiration component φ_b(t) of the signal.
Since the amplitude of the respiratory component is several orders of magnitude greater than that of the heartbeat component, the respiratory component must be separated from the heartbeat component. To isolate the heartbeat component φ''(t), the motion signal processing module 106 exploits the fact that the acceleration of respiratory motion is smaller than the acceleration of heartbeat motion. This is because breathing is generally slow and steady, while the heartbeat involves rapid contractions of the heart muscle. Accordingly, the motion signal processing module 106 includes a differentiator 442 that reduces the effect of the respiratory component relative to the heartbeat component by computing an acceleration signal. In particular, the differentiator 442 computes the second derivative φ''(t) of the signal representing the subject's motion.
In some examples, no analytical expression for φ(t) is available, so numerical methods are used to compute the second derivative φ''(t). In some examples, for robustness to noise, the differentiator 442 implements a second-order differentiator of the following form:

$$ f''_0 = \frac{(f_{-3}+f_3) + 2(f_{-2}+f_2) - (f_{-1}+f_1) - 4f_0}{16h^2} $$

where f''_0 denotes the second derivative at a particular sample, f_i denotes the value i samples away in the time series, and h is the time interval between consecutive samples.
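A minimal Python sketch of this differentiator follows, implemented as a convolution. The seven-point stencil shown above is one standard smooth noise-robust second-derivative filter and is assumed here; the sample rate and the synthetic signal amplitudes are also illustrative assumptions.

```python
import numpy as np

def second_derivative(phi, h):
    # Noise-robust 7-point second-order differentiator:
    # f''_0 = [(f_-3 + f_3) + 2(f_-2 + f_2) - (f_-1 + f_1) - 4*f_0] / (16*h^2)
    kernel = np.array([1.0, 2.0, -1.0, -4.0, -1.0, 2.0, 1.0]) / (16.0 * h * h)
    # The kernel is symmetric, so convolution equals correlation; mode="same"
    # keeps the output aligned with the input (edge samples are approximate).
    return np.convolve(phi, kernel, mode="same")

fs = 100.0                                   # assumed sample rate (Hz)
t = np.arange(0.0, 10.0, 1.0 / fs)
# Large, slow respiration plus a tiny heartbeat vibration with sharper
# (higher-frequency) content -- a crude stand-in for the phase signal.
phi = 5e-3 * np.sin(2 * np.pi * 0.25 * t) + 1e-4 * np.sin(2 * np.pi * 15.0 * t)
acc = second_derivative(phi, 1.0 / fs)       # heartbeat term now dominates
```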
Referring to fig. 5, one example of the acceleration signal φ''(t) output by the differentiator 442 is obtained by applying the above second-order differentiator to the signal φ(t) representing the motion of the subject. In the resulting acceleration signal, the component due to the heartbeat dominates, since the acceleration of heartbeat-related motion is much larger than the acceleration of the motion related to the subject's respiration. In some examples, the motion signal processing module 106 uses a band-pass filter to isolate the heartbeat-related signal components while also reducing noise present in the signal.
Referring again to fig. 4, the low-pass filter 440 is used to isolate the respiration component φ_b(t) of the signal φ(t) representing the motion of the subject. In particular, since the respiration component is predominantly low-frequency relative to the heartbeat component, a low-pass filter can substantially eliminate the heartbeat component from the signal φ(t) while leaving the respiration component φ_b(t) substantially intact.
The heartbeat component φ''(t) of the signal (i.e., the acceleration signal) and the respiration component φ_b(t) of the signal are provided as outputs from the motion signal processing module 106. Referring to fig. 6, in one example of the respiration component φ_b(t) output by the low-pass filter 440, the relatively high-frequency heartbeat component has been substantially removed from the signal φ(t) representing the motion of the subject, while the respiration component φ_b(t) remains substantially intact.
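A sketch of such a respiration filter is shown below, using a zero-phase Butterworth low-pass filter. The 0.5 Hz cutoff, the filter order, and the synthetic signal are assumptions, chosen so that respiration (well under 0.5 Hz) passes while heartbeat content (around 1 Hz and above) is rejected.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 100.0                                    # assumed sample rate (Hz)
t = np.arange(0.0, 30.0, 1.0 / fs)
phi = 5e-3 * np.sin(2 * np.pi * 0.25 * t)     # respiration (large, slow)
phi += 1e-4 * np.sin(2 * np.pi * 1.1 * t)     # heartbeat (small, faster)

# 4th-order Butterworth low-pass at 0.5 Hz, applied forward and backward
# (filtfilt) so the respiration component phi_b(t) is not phase-shifted.
b, a = butter(4, 0.5 / (fs / 2.0), btype="low")
phi_b = filtfilt(b, a, phi)
```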
3 Heartbeat segmentation
Referring again to fig. 1, the heartbeat component φ''(t) of the signal is provided to a heartbeat segmentation module 107, which determines an optimal segmentation of the heartbeat component. As described above, some methods of emotion classification use small changes in the subject's inter-beat intervals to classify the subject's emotional state. Since the morphology (e.g., temporal pattern or shape) of the heartbeats in the heartbeat signal is unknown (due to factors such as the position and posture of the subject relative to the system 100), the heartbeat segmentation module 107 uses an optimization algorithm that jointly determines the morphology of the heartbeats and the segmentation into individual heartbeats. Among other advantages, the resulting segmentation φ''_S(t) can be used to identify the small changes in the inter-beat intervals described above.
The optimization algorithm is based on the assumption that consecutive human heartbeats have the same morphology. That is, although individual beats may be stretched or compressed due to differing beat lengths, they all have a similar overall shape. Given this assumption, the algorithm determines the segmentation that minimizes shape differences between heartbeats, while accounting for the fact that the beat shape is not known a priori and that beats may be compressed or stretched. As described below, the algorithm is formulated as an optimization problem over all possible segmentations of the acceleration signal φ''(t).
Let x = (x_1, x_2, ..., x_n) denote a sequence of length n. A segmentation S = {s_1, s_2, ...} of x is a partitioning of x into non-overlapping contiguous subsequences (i.e., segments), where each segment s_i contains |s_i| points. To identify the heartbeats, the segmentation whose segments are most similar to one another is sought (i.e., the variation between segments is minimized). Since the statistical variance is defined only for scalars or for vectors of equal dimension, the definition is extended to vectors of different lengths, such that the variance of a segmentation S = {s_1, s_2, ...} is

$$ \operatorname{Var}(S) = \frac{1}{n} \sum_{s_i \in S} \big\| s_i - \omega(\mu, |s_i|) \big\|^2 $$

where ω(μ, |s_i|) is a linear warping of μ to length |s_i| (e.g., by cubic spline interpolation).
Note that when all segments have the same length, the above definition coincides with the statistical variance. In this definition, μ represents the central tendency of all segments (i.e., the template for the beat shape or morphology).
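The two ingredients of this definition, the warping ω and the variance of a segmentation, can be sketched in Python as follows. Linear interpolation stands in for the spline warping mentioned above, and segments are represented as NumPy arrays; both are simplifying assumptions.

```python
import numpy as np

def warp(mu, length):
    # omega(mu, |s_i|): resample the template mu to the given length.
    src = np.linspace(0.0, 1.0, len(mu))
    dst = np.linspace(0.0, 1.0, length)
    return np.interp(dst, src, mu)

def segmentation_variance(segments, mu):
    # Var(S): squared distance of each segment from the template warped to
    # that segment's length, normalized by the total number of samples n.
    n = sum(len(s) for s in segments)
    return sum(float(np.sum((s - warp(mu, len(s))) ** 2)) for s in segments) / n
```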
The algorithm determines the optimal segmentation S* that minimizes the variance of the segments, which may be stated formally as:

$$ S^{*} = \arg\min_{S} \operatorname{Var}(S) $$

Based on the above statement of optimal segmentation, the optimization problem can be restated as:

$$ \min_{S,\,\mu} \; \sum_{s_i \in S} \big\| s_i - \omega(\mu, |s_i|) \big\|^2 $$

subject to

$$ b_{\min} \le |s_i| \le b_{\max}, \quad s_i \in S $$

where b_min and b_max are constraints on the length of an individual cardiac cycle.
The optimization problem seeks the segmentation S and template (i.e., morphology) μ that minimize the sum of squared differences between the segments and the template. The problem involves both combinatorial optimization over S and numerical optimization over μ, and searching over all possible segmentations has exponential complexity.
To avoid this exponential complexity, the algorithm alternates between updating the segmentation and updating the template, rather than estimating the segmentation S and template μ simultaneously. During each iteration, the algorithm updates the segmentation given the current template and then updates the template given the new segmentation. For each of these two sub-problems, the algorithm obtains a global optimum with linear time complexity.
Referring to fig. 7, the pseudo-code description of the heartbeat segmentation algorithm receives as input a sequence x of n data samples and an allowable heartbeat length range B. The heartbeat segmentation algorithm produces as output a segmentation S and a template μ of length m.
In line 1 of the pseudo-code description, the template vector μ is initialized to all zeros. In line 2, the iteration counter l is initialized to zero. Lines 3-7 form a loop in which the segmentation S and template μ are iteratively updated until the algorithm converges. In particular, in line 4, the UPDATESEGMENTATION procedure is invoked with the sequence x of data samples and the most recently updated version μ_l of the template to determine S_{l+1}. In line 5, the UPDATETEMPLATE procedure is invoked with the sequence x and the most recently updated segmentation S_{l+1} to determine an updated version μ_{l+1} of the template. In line 6, l is incremented. The UPDATESEGMENTATION and UPDATETEMPLATE procedures are invoked repeatedly until the algorithm converges. Once the algorithm converges, the final segmentation S_l and the final template μ_l are returned in line 8 of the pseudo-code description.
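The outer loop of this pseudo-code might be sketched in Python as follows; `update_segmentation` and `update_template` are sketched after the corresponding descriptions below, and the template length and iteration cap are assumptions.

```python
import numpy as np

def segment_heartbeats(x, b_min, b_max, m=60, max_iter=50):
    mu = np.zeros(m)               # line 1: template initialized to all zeros
    seg = None
    for _ in range(max_iter):      # lines 3-7: alternate until convergence
        new_seg = update_segmentation(x, mu, b_min, b_max)  # line 4
        mu = update_template(x, new_seg, m)                 # line 5
        if new_seg == seg:         # converged: segmentation no longer changes
            break
        seg = new_seg
    return seg, mu                 # line 8: final segmentation and template
```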
The UPDATESEGMENTATION procedure, at lines 9-16 of the pseudo-code description, receives as inputs a sequence x of n data samples and a template μ. The procedure returns the segmentation S*, determined as follows:

$$ S^{*} = \arg\min_{S} \sum_{s_i \in S} \big\| s_i - \omega(\mu, |s_i|) \big\|^2 $$
although the number of possible segmentations grows exponentially with the length of x, the use of dynamic programming effectively solves the above optimization problem. The recursion of the dynamic program is as follows: if D istRepresents a sequence x of pairs1:tMinimum cost to perform segmentation, then:
wherein tau ist,BThe possible choices of τ are specified based on the segment length constraint. The temporal complexity of the dynamic program based on equation 6 is o (n) and guarantees global optimality.
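A Python sketch of this dynamic program follows. It reuses the `warp` helper from the earlier sketch, tracks the best last-segment length at each prefix, and backtracks to recover segments as (start, end) index pairs. With beat lengths bounded by b_min and b_max, the inner loop is a constant factor, so the run time stays linear in n.

```python
import numpy as np

def update_segmentation(x, mu, b_min, b_max):
    n = len(x)
    D = np.full(n + 1, np.inf)        # D[t]: min cost of segmenting x[:t]
    D[0] = 0.0
    best_tau = np.zeros(n + 1, dtype=int)
    for t in range(b_min, n + 1):
        for tau in range(b_min, min(b_max, t) + 1):
            # Cost of ending the last segment at t with length tau.
            cost = D[t - tau] + np.sum((x[t - tau:t] - warp(mu, tau)) ** 2)
            if cost < D[t]:
                D[t], best_tau[t] = cost, tau
    assert np.isfinite(D[n]), "x cannot be partitioned with these length bounds"
    # Backtrack from t = n to recover the segment boundaries.
    seg, t = [], n
    while t > 0:
        seg.append((t - best_tau[t], t))
        t -= best_tau[t]
    return seg[::-1]
```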
Referring to lines 17-19 of the pseudo-code description, the UPDATETEMPLATE procedure receives as inputs a sequence x of n data samples and a segmentation S. The procedure returns an updated template μ of the required length m, determined as follows:

$$ \mu^{*} = \arg\min_{\mu \in \mathbb{R}^{m}} \sum_{s_i \in S} \big\| s_i - \omega(\mu, |s_i|) \big\|^2 $$

Because the warping ω is linear, each warped template can be written as ω(μ, |s_i|) = W_i μ for a warping matrix W_i, and the above optimization problem is a weighted least-squares problem with the following closed-form solution:

$$ \mu^{*} = \Big( \sum_{s_i \in S} W_i^{\mathsf{T}} W_i \Big)^{-1} \sum_{s_i \in S} W_i^{\mathsf{T}} s_i $$
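In Python, with linear interpolation as the warping (an assumption carried over from the earlier sketches), the warping matrix and the closed-form update can be written as:

```python
import numpy as np

def warp_matrix(length, m):
    # W such that W @ mu linearly interpolates a length-m template to `length`.
    W = np.zeros((length, m))
    pos = np.linspace(0.0, m - 1.0, length)
    lo = np.minimum(np.floor(pos).astype(int), m - 2)
    frac = pos - lo
    rows = np.arange(length)
    W[rows, lo] = 1.0 - frac
    W[rows, lo + 1] = frac
    return W

def update_template(x, seg, m):
    # Accumulate the normal equations: (sum W_i^T W_i) mu = sum W_i^T s_i.
    A = np.zeros((m, m))
    b = np.zeros(m)
    for start, end in seg:
        W = warp_matrix(end - start, m)
        A += W.T @ W
        b += W.T @ np.asarray(x[start:end])
    # Least-squares solve guards against a singular A in degenerate cases.
    return np.linalg.lstsq(A, b, rcond=None)[0]
```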
referring to fig. 8, the result of applying the above algorithm to the acceleration signal is a segmented acceleration signal S*. Referring to fig. 9, the morphology of the heartbeat found from the acceleration signal by the algorithm described above is shown.
4 Feature extraction
The segmented acceleration signal and the respiration signal are provided to a feature extraction module 108, which uses the determined morphology and segmentation of the heartbeat signal, together with the respiration signal, to determine features for use by the emotion classification module 110.
In some examples, feature extraction module 108 extracts time-domain features such as the mean, median, SDNN, pNN50, RMSSD, SDNNi, meanRate, sdRate, HRVTi, and TINN. In some examples, feature extraction module 108 extracts frequency-domain features such as the Welch PSD (LF/HF, peakLF, peakHF), the Burg PSD (LF/HF, peakLF, peakHF), and the Lomb-Scargle PSD (LF/HF, peakLF, peakHF). In some examples, the feature extraction module 108 extracts Poincaré features such as SD1, SD2, and SD2/SD1. In some examples, feature extraction module 108 extracts nonlinear features such as SampEn1, SampEn2, DFA_all, DFA1, and DFA2.
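To make the time-domain features concrete, the following sketch computes a few of them from a sequence of inter-beat intervals; the formulas used are the standard HRV definitions, which this disclosure does not spell out, so they are assumptions here.

```python
import numpy as np

def time_domain_features(ibi_ms):
    """A few standard time-domain HRV features from inter-beat intervals (ms)."""
    ibi = np.asarray(ibi_ms, dtype=float)
    diffs = np.diff(ibi)
    return {
        "mean": float(np.mean(ibi)),
        "median": float(np.median(ibi)),
        "SDNN": float(np.std(ibi, ddof=1)),            # std of the intervals
        "RMSSD": float(np.sqrt(np.mean(diffs ** 2))),  # RMS of successive diffs
        "pNN50": float(np.mean(np.abs(diffs) > 50.0) * 100.0),  # % diffs > 50 ms
    }

# Example: intervals around 800 ms with a little beat-to-beat variability.
print(time_domain_features([812, 795, 840, 801, 788, 820]))
```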
In some examples, feature extraction module 108 extracts respiration features, such as the irregularity of breathing. To this end, feature extraction module 108 identifies each respiration cycle by peak detection in the respiration component φ_b(t). Feature extraction module 108 then measures the variability of the breathing using some or all of the features described above.
5 Emotion classification
Referring again to fig. 1, the features extracted by the feature extraction module 108 are provided to an emotion classification module 110, which processes the features according to, for example, an emotion model to generate a classification of the emotional state 112 of the subject.
In some examples, emotion classification module 110 implements an emotion model having a valence axis and an arousal axis. Generally, the emotion model classifies between four basic emotional states: sadness (negative valence and negative arousal), anger (negative valence and positive arousal), pleasure (positive valence and negative arousal), and happiness (positive valence and positive arousal). For example, referring to fig. 10, the 2D emotion grid 830 includes a plurality of exemplary emotion classification results generated by the emotion model. The first emotion classification result 832 has a positive arousal value and a negative valence value, and thus represents a subject having an angry emotional state. The second emotion classification result 834 has a positive arousal value and a positive valence value, and thus represents a subject having a happy emotional state. The third emotion classification result 836 has a negative arousal value and a negative valence value, and thus represents a subject having a sad emotional state. The fourth emotion classification result 838 has a negative arousal value and a positive valence value, and thus represents a subject in a state of pleasure.
In some examples, the emotion model of emotion classification module 110 is trained using a training data set to classify the subject's emotions onto the 2D emotion grid. In some examples, the training data set includes a plurality of feature sets measured from a plurality of subjects, wherein each feature set is associated with a known emotional state on the 2D emotion grid. The emotion classification module 110 uses machine learning techniques to analyze the training data and train an emotion model (e.g., a Support Vector Machine (SVM) classifier model) based on statistical relationships between the feature sets and the emotional states. Once the emotion model is trained, emotion classification module 110 can receive the extracted features of the subject from feature extraction module 108 and predict the subject's emotion by applying the emotion model to the extracted features. More details regarding emotion classification systems and methods can be found, for example, in J. Kim and E. André, "Emotion recognition based on physiological changes in music listening," IEEE Transactions on Pattern Analysis and Machine Intelligence, 30(12):2067-2083, 2008, and P. J. Lang, "The emotion probe: Studies of motivation and attention," American Psychologist, 50(5):372, 1995, the contents of which are incorporated herein by reference.
In some examples, the features extracted by the feature extraction module 108 differ from subject to subject for the same emotional state. Further, these features may differ across days for the same subject. Such changes may be caused by a variety of factors, including caffeine intake, sleep, and the day's baseline mood. To ensure that the model is user-independent and time-independent, emotion classification module 110 incorporates a baseline emotional state: neutral. That is, the emotion classification module 110 operates on changes in the physiological features rather than on their absolute values. Thus, in some examples, emotion classification module 110 calibrates the computed features by subtracting, for each feature, the corresponding value computed for the given person's neutral state on the given day. Such calibration may be incorporated into the emotion model used by emotion classification module 110 and/or may be part of a pre-processing step applied to the extracted features before they are provided to the emotion model.
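A minimal sketch of this calibration step follows; the array shapes and names are illustrative assumptions.

```python
import numpy as np

# One subject, one day: feature vectors for 20 trials, 40 features each,
# plus the feature vector computed from that day's neutral-state recording.
features = np.random.randn(20, 40)   # stand-in for extracted features
neutral = np.random.randn(40)        # stand-in for the neutral baseline

calibrated = features - neutral      # per-feature change from the baseline
```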
In some examples, using all of the features listed above with a limited amount of training data may result in overfitting. To address this, in some examples, emotion classification module 110 selects the set of features most relevant to emotion. This selection not only reduces the amount of training data required, but also improves classification accuracy on test data. In some examples, emotion classification module 110 learns which features contribute most to the accuracy of the emotion model while training the emotion model. In some examples, this learning is done using an l1-SVM, where the l1-SVM selects a subset of relevant features while training the emotion model.
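An l1-penalized linear SVM drives the weights of uninformative features to zero, so feature selection happens as a by-product of training. The scikit-learn sketch below illustrates this on stand-in data; the data shapes, the regularization strength C, and the use of LinearSVC are all assumptions, not details from this disclosure.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

X = np.random.randn(200, 40)             # calibrated feature vectors (stand-in)
y = np.random.randint(0, 4, size=200)    # labels for 4 emotional states

# penalty="l1" requires dual=False; the l1 penalty zeroes out the weights of
# features that do not help separate the classes.
clf = make_pipeline(
    StandardScaler(),
    LinearSVC(penalty="l1", dual=False, C=0.1, max_iter=10000),
)
clf.fit(X, y)

coef = clf.named_steps["linearsvc"].coef_   # shape: (n_classes, n_features)
selected = np.any(coef != 0.0, axis=0)      # features kept by the l1 penalty
print(f"{selected.sum()} of {selected.size} features selected")
```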
6 Alternatives
It should be noted that while the above-described embodiments use non-contact RF sensing to sense motion of the subject's body (e.g., the skin or internal structures, or clothing covering the skin), in other examples, the signal acquisition module 102 uses an accelerometer attached to the subject's body (either directly or via clothing or a wearable accessory) to sense the motion of the subject's body. In still other examples, the signal acquisition module 102 senses motion (e.g., motion of blood in the subject's vasculature) using ultrasound measurement techniques. It should be understood that any number of other suitable methods may be used to sense motion related to the subject's physiology. In general, the motion signal acquisition module 102 conditions the signal representing the subject's body motion by, for example, filtering, amplifying, and sampling it, so that the signal output by the motion signal acquisition module 102 may be used by downstream modules of the system 100.
The above-described system employs an FMCW wireless sensing technique that includes repeated transmission of a single signal pattern (e.g., a swept signal pattern). It should be noted, however, that in some examples, the system makes repeated transmissions where each transmission includes a different signal pattern (known a priori by the system). For example, each transmission may include a pseudorandom noise signal pattern known a priori. Since each signal pattern is known a priori by the system, the system can determine information such as time of flight by comparing the transmitted a priori known signal with the received reflections of the transmitted signal (e.g., by cross-correlating the known signal with the received reflections of the transmitted signal).
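As an illustration of the cross-correlation idea, the following sketch recovers the delay of a pseudorandom pattern from a noisy reflection; the pattern length, attenuation, delay, and noise level are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
tx = rng.choice([-1.0, 1.0], size=1024)     # known pseudorandom signal pattern

delay = 37                                  # true round-trip delay (samples)
rx = np.zeros(2048)
rx[delay:delay + tx.size] = 0.5 * tx        # attenuated, delayed reflection
rx += 0.1 * rng.standard_normal(rx.size)    # receiver noise

# A pseudorandom sequence has a sharp autocorrelation peak, so the lag of
# the correlation maximum estimates the time of flight in samples.
corr = np.correlate(rx, tx, mode="valid")
tof_samples = int(np.argmax(corr))          # recovers `delay`
```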
It should be noted that the signal representative of physiological motion may represent any number of different types of physiological motion. For example, the signal may represent physiological motion on a macro scale, such as motion of the skin of the subject, or the like. The signal may also represent physiological motion of a smaller scale, such as the movement of blood through the vasculature of the subject, etc. For example, a video recording of the subject (i.e., a recording taken using a video camera) may be analyzed to identify small changes in the skin tone of the subject due to blood entering and exiting the vasculature within and near the subject's skin. The observed changes in shade of the subject's skin can then be used to infer the mood of the subject.
In some examples, the system is configured to determine the cognitive state (e.g., degree of confusion, distraction, or concentration) of the subject using a cognitive state classifier (e.g., a support vector machine-based cognitive state classifier). A cognitive state classifier classifies a cognitive state of the subject based at least in part on the determined segment of the heartbeat component of the motion-based physiological signal associated with the subject.
In some examples, features of the subject's heartbeat are extracted from heartbeat components of a motion-based physiological signal associated with the subject and mapped to cardiac function. In some examples, the feature includes one or more of a peak, a valley, and an inflection point in the heartbeat component.
7 Implementation
A system implementing the above techniques may be implemented in software, firmware, digital electronic circuitry, or computer hardware, or combinations thereof. The system may include a computer program product tangibly embodied in a machine-readable storage device for execution by a programmable processor, and method steps may be performed by the programmable processor executing a program of instructions to perform functions by operating on input data and generating output. The system can be implemented in one or more computer programs that are executable on a programmable system including at least one programmable processor coupled to receive data and instructions from, and to transmit data and instructions to, a data storage system, at least one input device, and at least one output device. Each computer program may be implemented in a high-level procedural or object-oriented programming language, or in assembly or machine language if desired; in any case, the language may be a compiled or interpreted language. Suitable processors include, by way of example, both general and special purpose microprocessors. Generally, a processor will receive instructions and data from a read-only memory and/or a random access memory. Generally, a computer will include one or more mass storage devices for storing data files; such devices include magnetic disks such as internal hard disks and removable disks, magneto-optical disks, and optical disks. Storage devices suitable for tangibly embodying computer program instructions and data include all forms of non-volatile memory, including by way of example semiconductor memory devices such as EPROM, EEPROM, and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM disks. Any of the foregoing may be supplemented by, or incorporated in, ASICs (application-specific integrated circuits).
It is to be understood that the foregoing description is intended to illustrate and not to limit the scope of the invention, which is defined by the scope of the appended claims. Other embodiments are within the scope of the following claims.
Claims (31)
1. A method for processing a motion-based physiological signal representative of motion of a subject, the method using signal reflections from the subject to perform the processing, the method comprising:
transmitting a radio frequency transmit signal comprising one or more transmit signal patterns from a transmit element;
receiving, at one or more receiving elements, a radio frequency receive signal comprising a combination of a plurality of reflections of the transmit signal, at least some of the plurality of reflections of the transmit signal being associated with the subject;
processing the time-continuous pattern of reflections of the transmitted signal pattern to form one or more motion-based physiological signals, including for at least some of the plurality of reflections, forming a motion-based physiological signal representative of physiological motion of the subject from changes over time in reflections of the transmitted signal in the received signals; and
processing each motion-based physiological signal of a subset of the one or more motion-based physiological signals to determine a segment of a heartbeat component of the motion-based physiological signal, the processing comprising:
-determining the component of the heartbeat,
determining a template temporal pattern for the heartbeat in said heartbeat component, an
Determining a segmentation of the heartbeat component based on the determined template temporal pattern.
2. The method of claim 1, wherein the transmit signal is a Frequency Modulated Continuous Wave (FMCW) signal comprising a repetition of a single signal pattern.
3. The method of claim 1, wherein the one or more transmit signal patterns comprise one or more pseudorandom noise sequences.
4. The method of claim 1, wherein determining the heartbeat component includes mitigating an effect of respiration on the motion-based physiological signal, the mitigating the effect of respiration on the motion-based physiological signal including determining a second derivative of the motion-based physiological signal.
5. The method of claim 1, wherein determining the heartbeat component includes mitigating an effect of respiration on the motion-based physiological signal, the mitigating including filtering the motion-based physiological signal using a band-pass filter.
6. The method of claim 1, wherein determining a template temporal pattern for a heartbeat and determining a segment of the heartbeat component in the heartbeat component comprise jointly optimizing the temporal pattern for the heartbeat and the segment of the heartbeat component.
7. The method of claim 1, further comprising determining a cognitive state of the subject based at least in part on the determined segment of the heartbeat component of the motion-based physiological signal associated with the subject.
8. The method according to claim 7, wherein the cognitive state of the subject includes one or more of: confusion status, distraction status, and attention-concentration status.
9. The method of claim 1, further comprising determining an emotional state of the subject based at least in part on the determined segment of the heartbeat component of the motion-based physiological signal associated with the subject.
10. The method of claim 9, wherein determining the emotional state of the subject is further based on a respiratory component of one or more motion-based physiological signals.
11. The method of claim 9, further comprising determining a respiratory component of the one or more motion-based physiological signals, the determining the respiratory component of the one or more motion-based physiological signals comprising applying a low pass filter to the one or more motion-based physiological signals.
12. The method according to claim 9, wherein determining an emotional state of the subject comprises applying an emotional classifier to one or more features determined from the determined segment of the heartbeat component of the motion-based physiological signal.
13. The method of claim 10, wherein determining the emotional state of the subject comprises applying an emotion classifier to one or more features determined from the determined segment of the heartbeat component of the motion-based physiological signal and to one or more features determined from the respiratory component of the one or more motion-based physiological signals.
14. The method of claim 9, further comprising presenting the emotional state in a two-dimensional grid comprising: a first dimension, the arousal dimension, and a second dimension, the valence dimension.
15. The method of claim 1, wherein the motion-based physiological signal represents physiological motion of the subject according to a change over time in a phase angle of a reflection of the transmit signal in the received signal.
16. The method of claim 1, further comprising extracting features from heartbeat components of respective ones of the motion-based physiological signals, the features including peaks, valleys, and inflection points, and mapping the extracted features to one or more cardiac functions.
17. A method for determining an emotional state of a subject, the method comprising:
receiving a motion-based physiological signal associated with a subject, the motion-based physiological signal including a component related to a vital sign of the subject; and
determining an emotional state of the subject based at least in part on a component related to a vital sign of the subject.
18. The method according to claim 17, wherein the component related to a vital sign of the subject comprises a periodic component, the method further comprising determining a segmentation of the periodic component.
19. The method of claim 18, wherein determining the segmentation of the periodic component comprises determining a template temporal pattern for a period of the periodic component and determining the segmentation of the periodic component based on the determined template temporal pattern.
20. The method of claim 18, wherein determining the emotional state of the subject is based at least in part on a segmentation of the periodic component.
21. The method of claim 18, wherein the periodic component includes at least one of a heartbeat component and a respiration component.
22. The method of claim 21, further comprising determining the heartbeat component, the determining the heartbeat component comprising determining a second derivative of the motion-based physiological signal.
23. The method of claim 21, further comprising determining the heartbeat component, the determining the heartbeat component comprising applying a band pass filter to the motion-based physiological signal.
24. The method of claim 21, further comprising determining the respiratory component, the determining the respiratory component comprising applying a low pass filter to the motion-based physiological signal.
25. The method of claim 17, wherein determining the emotional state of the subject comprises applying an emotional classifier to one or more features determined from motion-based physiological signals associated with the subject.
26. The method of claim 20, wherein determining an emotional state of the subject comprises applying an emotional classifier to one or more features determined from the determined segments of the periodic component.
27. The method of claim 17, further comprising presenting the emotional state in a two-dimensional grid comprising: a first dimension, the arousal dimension, and a second dimension, the valence dimension.
28. The method of claim 17, wherein the motion-based physiological signal associated with the subject is associated with an accelerometer measurement.
29. The method of claim 17, wherein the motion-based physiological signal associated with the subject is associated with an ultrasound measurement.
30. The method of claim 17, wherein the motion-based physiological signal associated with the subject is associated with a radio frequency-based measurement.
31. The method of claim 17, wherein the motion-based physiological signal associated with the subject is associated with a video-based measurement.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US62/323,928 | 2016-04-18 | ||
| US62/403,808 | 2016-10-04 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| HK1261790A1 true HK1261790A1 (en) | 2020-01-03 |