US20150265211A1 - Device, method and application for establishing a current load level - Google Patents
Device, method and application for establishing a current load level
- Publication number
- US20150265211A1 (application US 14/418,374)
- Authority
- US
- United States
- Prior art keywords
- data
- mobile terminal
- user
- biometric data
- artificial neural
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/4884—Other medical applications inducing physiological or psychological stress, e.g. applications for stress testing
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
- G16H40/67—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
- A61B5/165—Evaluating the state of mind, e.g. depression, anxiety
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/4803—Speech analysis specially adapted for diagnostic purposes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/4806—Sleep evaluation
- A61B5/4809—Sleep detection, i.e. determining whether a subject is asleep or not
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/4806—Sleep evaluation
- A61B5/4812—Detecting sleep stages or cycles
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/4806—Sleep evaluation
- A61B5/4815—Sleep quality
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7264—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
- A61B5/7267—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/0464—Convolutional networks [CNN, ConvNet]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/084—Backpropagation, e.g. using gradient descent
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/088—Non-supervised learning, e.g. competitive learning
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/09—Supervised learning
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L25/00—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00
- G10L25/48—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use
- G10L25/51—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use for comparison or discrimination
- G10L25/63—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use for comparison or discrimination for estimating an emotional state
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L25/00—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00
- G10L25/48—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use
- G10L25/51—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use for comparison or discrimination
- G10L25/66—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use for comparison or discrimination for extracting parameters related to health condition
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16B—BIOINFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR GENETIC OR PROTEIN-RELATED DATA PROCESSING IN COMPUTATIONAL MOLECULAR BIOLOGY
- G16B40/00—ICT specially adapted for biostatistics; ICT specially adapted for bioinformatics-related machine learning or data mining, e.g. knowledge discovery or pattern finding
- G16B40/20—Supervised data analysis
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/20—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
- A61B5/1124—Determining motor skills
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/4806—Sleep evaluation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/486—Biofeedback
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6887—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
- A61B5/6898—Portable consumer electronic devices, e.g. music players, telephones, tablet computers
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7264—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/044—Recurrent networks, e.g. Hopfield networks
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16B—BIOINFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR GENETIC OR PROTEIN-RELATED DATA PROCESSING IN COMPUTATIONAL MOLECULAR BIOLOGY
- G16B40/00—ICT specially adapted for biostatistics; ICT specially adapted for bioinformatics-related machine learning or data mining, e.g. knowledge discovery or pattern finding
Definitions
- the present invention relates to an apparatus for ascertaining a current stress level for a user according to the preamble of claim 1, to a method for ascertaining a current stress level, to a computer program product and to an application for a mobile terminal.
- a mobile terminal that has at least one sensor integrated in the mobile terminal for producing signal data and has a plurality of available applications for use by the user.
- an evaluation unit is provided. The evaluation unit is provided particularly to evaluate signal data.
- the mobile terminal according to the preamble is a smartphone such as an iPhone or another smartphone, which is equipped with an operating system, such as an iOS or Android operating system for a mobile terminal, and has an integrated sensor, for example a GPS sensor for ascertaining the current position.
- the mobile terminal has a plurality of standard applications installed on it, such as a telephony application for setting up and conducting a telephone call via a mobile radio link and/or an application for writing, sending and receiving SMSs and/or a browser application for accessing web pages on the Internet.
- the operating system allows the installation and execution of further applications, which can be downloaded from an online shop for mobile applications on the Internet, for example.
- the further application can request and process data pertaining to the current location of the mobile terminal from the GPS sensor, for example, and can transmit said data to a central server via the mobile radio network, for example via a GPRS or UMTS connection.
- Such tracking data can be stored on the server and evaluated in an evaluation unit arranged in the central server, for example in order to provide the user with location-based services, such as a locating service for tracking the whereabouts of a child for his parents, what is known as a Childtracker.
- Location-based services are currently widely advertised on the market and in great demand, as a result of which smartphones have now become prevalent, particularly in younger demographic groups.
- the Stress Monitor (see http://www.healthreviser.com/content/stress-monitor) thus uses a clip worn on the ear of the user to ascertain the heart rate of the user and uses this single indicator to determine a current stress level.
- the cited apparatuses and methods for ascertaining a current stress level have the disadvantage that the current stress level is determined on the basis of very few biometric data (for example subjective questionnaire data or objective sensor data that measure a vital function), which means that different categories of biometric data are not combined. Therefore, the current stress level can be determined less reliably than when a large number of different categories of biometric data are used.
- the aforementioned sensor-based apparatuses resort to special devices designed precisely for this instance of application, which are equipped with integrated sensors that capture a specific category of biometric data, such as the heart rate or the body temperature or another vital function. These devices need to be worn by the user on his body, for example, in order to determine his current stress level. This firstly means increased outlay in terms of hardware and cost, since measuring a vital function requires a dedicated device with a sensor for measuring precisely this vital function. Furthermore, fitting and wearing a special apparatus with an integrated sensor on or in direct proximity to the body is a nuisance for the wearer, particularly a restriction of comfort and wellbeing.
- the aforementioned questionnaire-based methods require a high level of additional effort from the user.
- the user thus needs to fill in questionnaires regularly, for example using a computer via the Internet, and then autonomously manage, compare and rate the results that vary over time.
- the present invention comprises a further application that can be installed on a mobile terminal and that ascertains a multiplicity of biometric data from a user.
- the further application interacts with other components that are likewise arranged in the mobile terminal.
- the other components are sensors integrated in the mobile terminal and also available applications.
- An available application denotes an application that is available on the mobile terminal, that is to say one that is installed and executable there, such as telephony, SMS, MMS and chat applications and/or browser applications for accessing the Internet, and also other applications that are suitable for extracting tactile, acoustic and/or visual biometric features of the user.
- the mobile terminal is a smartphone or a tablet computer or a PDA or another mobile terminal that the user can use for many diverse purposes, for example for communication.
- the mobile terminal has means for installing and executing a further application.
- the further application can be obtained via the Internet via an online shop integrated in the operating system of the mobile terminal, such as the App store, or another online shop that supplies compatible applications for the mobile terminal, and can be installed on the mobile terminal directly.
- the further application may be available in various versions, for example in an iOS version for installation and execution on an iPhone or an iPad and in an Android version for installation and execution on a mobile terminal that supports said Android operating system, or in a further version that is compatible with a further mobile operating system.
- the further application may be installed, and can be executed, on an interchangeable component of the mobile terminal.
- it may be stored as a SIM application in a memory area on a SIM card that can be operated in the mobile terminal, and can be executed by a separate execution unit integrated on the SIM card.
- in this way, the portion of the apparatus for ascertaining a current stress level that is arranged on the mobile terminal is obtained.
- the further application ascertains the biometric data firstly from signal data produced by sensors integrated in the mobile terminal, and secondly by extracting them from the use data of other applications installed on the mobile terminal.
- Determination of a plurality of biometric data is firstly facilitated by virtue of mobile terminals, such as smartphones, being equipped with an increasing number of sensors as standard. Furthermore, determination of further biometric data is also facilitated by virtue of users increasingly satisfying their interaction and communication needs by using such devices, and hence biometric data pertaining to the categories speech and social interaction, for example, being able to be derived from use data pertaining to communication applications of the mobile terminal directly, and without additional effort and restriction of comfort for the user.
- the further application can use voice analysis to ascertain biometric data pertaining to speech, such as volume, speech rate and/or modulation capability, from the use data from a telephony application installed on the mobile terminal, particularly from the voice data from the user that are ascertained via the microphone of the mobile handset and that are transmitted from the mobile terminal to the communication partner via a radio network, for example.
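- As a non-authoritative sketch, speech features of the kind named above can be derived from raw audio samples. The concrete definitions used here (RMS energy as volume, short-frame energy variability as a modulation proxy) are illustrative assumptions, not taken from the application:

```python
import numpy as np

def voice_features(samples, sample_rate):
    """Illustrative extraction of simple voice biometrics from mono
    audio in [-1, 1]; feature definitions are assumptions."""
    samples = np.asarray(samples, dtype=float)
    # Volume: root-mean-square energy of the whole signal.
    volume = float(np.sqrt(np.mean(samples ** 2)))
    # Modulation proxy: variability of short-frame (~20 ms) energy.
    frame = max(1, sample_rate // 50)
    n = len(samples) // frame
    energies = np.sqrt(np.mean(
        samples[: n * frame].reshape(n, frame) ** 2, axis=1))
    modulation = float(np.std(energies))
    return {"volume": volume, "modulation": modulation}
```

A constant-amplitude signal, for instance, yields its amplitude as volume and near-zero modulation.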
- the voice data from the user can come from using other available applications, for example from applications that are controlled by the user using voice control.
- the further application can determine the number of SMS messages sent and received and the number of different receivers and senders of SMS messages, and hence can determine biometric data pertaining to social interaction.
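- A minimal sketch of this kind of social-interaction extraction, assuming a hypothetical message log whose "direction" and "peer" field names are inventions for illustration:

```python
def interaction_features(messages):
    """Count sent/received messages and distinct communication
    partners from a hypothetical SMS log."""
    sent = sum(1 for m in messages if m["direction"] == "sent")
    received = sum(1 for m in messages if m["direction"] == "received")
    peers = {m["peer"] for m in messages}
    return {"sent": sent, "received": received,
            "distinct_peers": len(peers)}
```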
- a sensor integrated in the mobile terminal may be a gyroscope or gyroscopic instrument.
- a gyroscope is used for position finding in space and is increasingly widespread in smartphones currently advertised on the market, such as the iPhone 4.
- the gyroscope and further sensors that are integrated in the mobile terminal can be used to ascertain biometric data pertaining to the sleep structure of the user.
- the mobile terminal is positioned on the mattress at night and the movements of the user during the night are detected by the sensors.
- the further application collects the signal data produced by gyroscope, acceleration sensor and/or light sensor and ascertains biometric data therefrom pertaining to sleep quality and sleep profile, such as time of sleep onset, sleep duration and/or sleep stages.
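- The sleep-onset step can be sketched as follows; the movement threshold and the required run of quiet minutes are invented illustration values, and real sleep-stage detection would be considerably more involved:

```python
def sleep_onset(movement, threshold=0.1, quiet_minutes=10):
    """Take per-minute movement magnitudes and return the index of
    the first minute that begins a run of `quiet_minutes` consecutive
    minutes below `threshold` (a rough proxy for falling asleep),
    or None if no such run exists. Parameter values are assumptions."""
    run = 0
    for i, magnitude in enumerate(movement):
        run = run + 1 if magnitude < threshold else 0
        if run == quiet_minutes:
            return i - quiet_minutes + 1
    return None
```

Sleep duration and stages could then be derived analogously from longer quiet and restless runs.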
- questionnaire data answered by the user, which are requested using a web form in a browser application or in a form provided by the further application, can also be part of the biometric data ascertained by the further application. Since answering questionnaires generates additional effort for the user, these data showing the subjective stress level should be requested only once or as rarely as possible.
- the biometric data are ascertained constantly by the further application while the mobile terminal is in operation.
- the mobile terminal is configured such that the further application is started automatically, for example, when the mobile terminal is switched on and is operated continually in the background until the mobile terminal is switched off, without the need for further interaction with the user.
- the mobile terminal can be configured such that the user can activate, configure and deactivate the further application autonomously and in this way controls the times at which biometric user data are meant to be tracked by the further application and provided for evaluation.
- the signal data produced by the sensors integrated in the mobile terminal and tracked by the further application are constantly received by the further application, and the biometric data are ascertained therefrom.
- use data from standard applications used by the user and tracked by the further application, such as telephony and/or SMS applications, are constantly evaluated by the further application, and biometric data are determined therefrom.
- the biometric data determined by the further application can be divided into different categories.
- the biometric data ascertained from the further application belong to the categories sleep, speech, motor functions, social interaction, economic data, personal data and questionnaire data.
- the biometric data ascertained by the further application can be evaluated by a first evaluation apparatus, which is provided in the further application, on the mobile terminal.
- the biometric data ascertained by the further application can alternatively be evaluated by a second evaluation apparatus on a central server.
- the second evaluation apparatus can increase the quality of the evaluation further.
- the biometric data ascertained by the further application are transmitted from the mobile terminal to a central server using standard means that the mobile terminal provides.
- the transmission of the biometric data is effected in pseudonymized form.
- the user is managed on the central server under a pseudonym, rather than under his real name or another identifier identifying the user.
- the transmitted biometric data are associated with the user by means of the pseudonym.
- the biometric data are transmitted in encrypted form.
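- The pseudonymization step can be sketched as a keyed hash of the real user identifier, so the server only ever sees the pseudonym; the choice of HMAC-SHA256 here is an assumption for illustration, and encryption of the transmitted payload itself (e.g. via TLS) is a separate concern:

```python
import hashlib
import hmac

def pseudonym(user_id, device_secret):
    """Derive a stable pseudonym from the real identifier using a
    keyed hash; the server can link records to the pseudonym but
    cannot recover the real identity without the device secret."""
    return hmac.new(device_secret, user_id.encode("utf-8"),
                    hashlib.sha256).hexdigest()
```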
- the transmission unit is responsible for setting up, maintaining and clearing down a connection to a network, for example a GPRS or UMTS connection to a mobile radio network to which the data are transmitted. Transmission is effected only when the mobile terminal is in transmission mode.
- the transmission unit can also be designed for transmission via different networks, for example one or more mobile radio network(s) and/or a wireless connection to a local area network, for example based on an IEEE-802 Standard, such as a WLAN or a WPAN connection.
- a transmission network, for example a mobile radio network, and possibly further networks connected to it via gateways, such as the Internet, through which the central server can be reached, are used to transmit the data to the central server.
- provision may be made for the data ascertained by the further application first of all to be stored on a local memory unit arranged on the mobile terminal before the data are transmitted to the central server. This is necessary particularly when the transmission unit is temporarily incapable of transmitting the data via a transmission network.
- the biometric data can also be analyzed and evaluated by the further application on the mobile terminal directly.
- the further application autonomously ascertains a current stress level for the user from the biometric data.
- the central server has a reception unit that can be used to receive data from the network, for example from the Internet.
- the reception unit receives said data and stores them in a central memory unit.
- the stored data are transmitted to an evaluation unit arranged on the central server and are analyzed and evaluated by the evaluation unit.
- the transmitted and stored data comprise a plurality of biometric data pertaining to a user that are used by the evaluation unit in order to ascertain a current stress level for the user.
- the method disclosed in this application is not a diagnostic method. Instead, it is a method that ascertains, collects and analyzes biometric data pertaining to a user and provides the user with the results of the analysis in the form of at least one current stress level.
- the data analysis is used particularly for detecting an alteration in the at least one stress level, i.e. establishing whether the at least one current stress level has increased or decreased in comparison with a previous stress level.
- the user is therefore provided with a tool for obtaining information particularly about changes in his at least one current stress level in comparison with earlier stress levels over time and, following autonomous rating of the detected changes, if need be taking individual measures for stress reduction.
- the evaluation unit can resort to biometric data pertaining to the user that have been ascertained in the past, for example, and can take said biometric data into account when ascertaining said current stress level.
- biometric data pertaining to the user that have been ascertained in the past can be used as reference data in order to perform user-specific calibration.
- the current stress level is ascertained in relation to the available reference data.
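- One plausible reading of such user-specific calibration is a z-score of the current raw value against the user's own historical reference data; the z-scoring itself is an assumption for illustration, not specified by the application:

```python
import statistics

def calibrated_level(current, reference):
    """Express the current raw stress score relative to the user's
    own historical reference data as a z-score, so the same raw
    value maps to different levels for different users."""
    mu = statistics.mean(reference)
    sigma = statistics.pstdev(reference) or 1.0  # guard constant history
    return (current - mu) / sigma
</antml_pad_silent>```

A value of 0 then means "at the user's personal baseline", positive values mean "above it".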
- the evaluation unit can resort to not only the biometric data from the user but also to biometric data from other users when ascertaining a current stress level.
- this allows clusters of user groups to be formed, for example according to age, sex or profession.
- the data from other users in the same user group can be taken into account.
- the evaluation unit ascertains a current stress level for a user using artificial neural networks.
- the artificial neural networks are trained on the basis of the available biometric data from a multiplicity of users. As a result of the training, the artificial neural network learns progressively as well and can thereby further improve the quality of the ascertained current stress level.
- the artificial neural network can be realized on the basis of a multilayer perceptron network.
- This neural network consists of a plurality of layers: a fixed input layer, a fixed output layer and, if need be, further intermediate layers, with no feedback taking place from one layer to the layers preceding it.
- the artificial neural network can consist of precisely three layers, for example: input layer, hidden layer and output layer.
- the seven categories of biometric data, for example sleep, speech, motor functions, social interaction, economic data, personal data and questionnaire data, can form the neurons of the input layer.
- a relatively large number of neurons can also be used in the input layer, by using more finely granular category features as the neurons of the input layer.
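- The three-layer perceptron described above can be sketched as follows; the hidden-layer size, the random weight initialization and the sigmoid activation are illustrative choices not specified in the application:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class StressMLP:
    """Minimal feedforward sketch: seven input neurons (one per
    biometric category), one hidden layer, one output neuron whose
    activation is read as the stress level. Untrained placeholder
    weights; a real system would fit them via backpropagation."""

    def __init__(self, n_in=7, n_hidden=5):
        self.w1 = rng.normal(scale=0.5, size=(n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.w2 = rng.normal(scale=0.5, size=(n_hidden, 1))
        self.b2 = np.zeros(1)

    def forward(self, x):
        # Strictly feedforward: no connections back to earlier layers.
        h = sigmoid(np.asarray(x) @ self.w1 + self.b1)
        return float(sigmoid(h @ self.w2 + self.b2)[0])
```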
- the artificial neural network may have feedback mechanisms. When the artificial neural network has feedback mechanisms, it is traversed multiple times (iteratively).
- the evaluation unit ascertains a current stress level for a user using a network of artificial neural networks, for example using a Deep Belief Network.
- the network of artificial neural networks is made up of a plurality of neural networks that interact with one another.
- a single neural network comprises an input layer and a hidden layer.
- a first level of neural networks is provided.
- the input layer is stipulated by the biometric data from the user.
- the input layer of a first-level neural network can be stipulated by the biometric data from the user in precisely one category. From the input layer of an artificial neural network, it is possible to determine the hidden layer of the same neural network.
- a second and possibly further level of neural networks is provided.
- the input layer can be determined using the hidden layers of a plurality of neural networks on the preceding level.
- the current stress level can be determined from at least one neural network on a topmost level.
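- The level-wise stacking can be sketched as follows; the category feature counts and the random placeholder weights are assumptions for illustration (a trained Deep Belief Network would learn these weights layer by layer):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def layer(x, w):
    """One network level: linear map followed by sigmoid."""
    return sigmoid(np.asarray(x) @ w)

rng = np.random.default_rng(1)
# One first-level network per biometric category; the per-category
# feature counts below are invented for the example.
categories = {"sleep": 4, "speech": 3, "motor": 2}
w_level1 = {c: rng.normal(size=(n, 2)) for c, n in categories.items()}
# Top-level network: its input is the concatenated hidden layers
# of all first-level networks, its single output is the stress level.
w_top = rng.normal(size=(2 * len(categories), 1))

def stress_level(features):
    hidden = [layer(features[c], w_level1[c]) for c in categories]
    return float(layer(np.concatenate(hidden), w_top)[0])
```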
- the multiplicity of biometric data parameters, which in particular includes a combination of different categories of biometric data used for ascertaining the at least one current stress level, means that the apparatus disclosed in this application, as well as the disclosed method and application, allows far more reliable ascertainment of the at least one current stress level than the known apparatuses and methods mentioned at the outset, which ascertain a current stress level only on the basis of a single or very few biometric data parameters in the same category.
- the quality of the analysis of the biometric data is increased further by the neural network method used, since, as time progresses and the database available for training the neural network becomes larger, said method can make ever more precise statements and hence further improves the reliability of the method for determining a current stress level.
- the further application can determine a plurality of biometric data pertaining to the user that belong to different categories solely from the user-specific use data from available applications on the mobile terminal, on the one hand, and from signal data, on the other hand, which are produced by sensors integrated in the mobile terminal.
- the apparatus according to the invention and the method according to the invention and the application according to the invention provide a user with an inexpensive and non-time-consuming solution for determining a current stress level.
- the solution according to the invention dispenses with additional apparatuses, particularly sensors, that the user needs to fix or wear directly on his body.
- the solution does not restrict the user in any way in terms of comfort, wellbeing or look, as is entailed by the application or wearing of specific apparatuses with sensors.
- the at least one current stress level determined by the evaluation unit can either be made accessible to the user via the Internet or can be made accessible to the user on the mobile terminal, for example by sending an SMS to the user.
- the analysis data consisting of the at least one current stress level and possibly further evaluation data, for example statistics pertaining to the change in a stress level over time, can alternatively be transmitted to the mobile terminal using the same transmission paths as when transmitting biometric data from the mobile terminal to the central server, but in the opposite direction.
- a transmission unit for transmitting data from the server to the mobile terminal is provided on the central server.
- a reception unit for receiving data from the central server is provided on the mobile terminal.
- the analysis data can also be transmitted from the central server to the mobile terminal using a push service. The data transmission is in turn effected in encrypted form.
- FIG. 1 shows a schematic illustration of the apparatus for ascertaining a current stress level
- FIG. 2 shows a schematic illustration of the further application for ascertaining a current stress level
- FIG. 3 a shows a flowchart for a first instance of application, sleep
- FIG. 3 b shows a flowchart for a second instance of application, motor functions
- FIG. 3 c shows a flowchart for a third instance of application, speech
- FIG. 4 a shows a graphical user interface for starting the sleep instance of application
- FIG. 4 b shows a further graphical user interface for a first evaluation display
- FIG. 4 c shows a further graphical user interface for a second evaluation display
- FIG. 4 d shows a further graphical user interface for a third evaluation display
- FIG. 5 a shows a schematic illustration of an exemplary embodiment of the evaluation unit
- FIG. 5 b shows a schematic illustration of an alternative exemplary embodiment of the evaluation unit
- FIG. 5 c shows a schematic illustration of a further alternative exemplary embodiment of the evaluation unit
- FIG. 6 shows a schematic illustration of an exemplary embodiment of the evaluation unit with a plurality of artificial neural networks.
- FIG. 1 shows a schematic illustration of an embodiment of the apparatus for ascertaining a current stress level 36 , 36 A, 36 B, 36 C, 36 D.
- the apparatus comprises a mobile terminal 1 and a central server 10 .
- the mobile terminal 1 contains a plurality of sensors 2 , for example a gyroscope 21 , an acceleration sensor 22 , a light sensor 23 and/or a microphone 24 .
- the signal data 31 produced by the sensors 2 can be accessed via an operating system 4 .
- the operating system 4 is executed within an execution unit 3 and manages the access to the hardware components of the mobile terminal 1 , for example the sensors 2 .
- different applications, for example a plurality of available applications 5 and a further application 6 , are executed in the execution unit 3 .
- the further application 6 ascertains a plurality of biometric data 33 pertaining to a user of the mobile terminal 1 .
- the further application 6 is implemented in the programming language Java.
- the further application 6 uses the MVC (model view controller) design pattern as a basic design pattern.
- the use of the MVC design pattern structures the further application 6 in a way that facilitates its comprehensibility and also its extendability and adjustability to new and/or altered hardware components and operating systems 4 .
- the further application 6 obtains the biometric data 33 from signal data 31 that are produced by the sensors 2 and that can be accessed by means of the operating system 4 .
- the access to the signal data 31 is realized by the further application 6 , for example through the use of the observer design pattern.
- the observer design pattern provides the further application 6 with simplified and standardized access to the signal data 31 .
- the further application 6 can extract a plurality of further biometric data 33 from the use data 32 from available applications 5 too.
- the use data 32 produced by the available applications 5 are accessible via the operating system 4 .
- the access to the use data 32 is realized by the further application 6 , for example through the use of the observer design pattern.
- the observer design pattern provides the further application 6 with simplified and standardized access to the use data 32 .
- An observer is informed about status changes on the object that it is observing, for example an available application 5 . If the available application 5 is an SMS application, for example, and the user calls the SMS application in order to write a new SMS, then the observer observing the SMS application is informed about this status change.
- the further application 6 reacts to the writing of a new SMS that is observed by the observer by recording the characters input by the user, for example using a keypad, providing them with a timestamp and storing them in the local memory unit 7 as use data 32 for the SMS application.
- via the sensor keypad 25 , it is also possible for all keypad inputs by the user to be recorded regardless of their use in a specific application.
- an observer or a plurality of observers is implemented for the sensor keypad 25 , for example one observer for each key on the keypad.
- when the user presses a key, the observer observing that key is informed of the key press.
- the further application 6 reacts to the pressing of the key that is observed by this observer by virtue of the further application 6 checking whether the user has pressed a delete key or another key.
- the ‘delete key’ or ‘other key’ information is recorded by the further application 6 , provided with a timestamp, and these data are stored in the local memory unit 7 as signal data 31 .
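The observer-based recording of keypad inputs described above can be sketched as follows. This is a minimal illustration; all class and method names (KeyLogger, KeyObserver, onKeyPressed) are assumptions rather than part of the disclosure:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the observer-based keystroke recording: each key press is
// classified as "delete key" or "other key", provided with a timestamp,
// and appended to a log (standing in for the local memory unit 7).
public class KeyLogger {

    // Observer interface: informed whenever an observed key is pressed.
    public interface KeyObserver {
        void onKeyPressed(String key, long timestampMillis);
    }

    // One recorded entry: the key kind plus a timestamp.
    public static final class KeyEvent {
        public final String kind;      // "delete key" or "other key"
        public final long timestamp;
        KeyEvent(String kind, long timestamp) {
            this.kind = kind;
            this.timestamp = timestamp;
        }
    }

    private final List<KeyEvent> log = new ArrayList<>();

    // Observer implementation that classifies and timestamps each key press.
    public final KeyObserver observer = (key, ts) ->
        log.add(new KeyEvent("DEL".equals(key) ? "delete key" : "other key", ts));

    public List<KeyEvent> recordedEvents() { return log; }
}
```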
- the further application 6 extracts a plurality of biometric data 33 .
- the biometric data 33 are subdivided into categories, for example into the categories sleep, speech, motor functions, social interaction, economic data, personal data and questionnaire data.
- a category-specific ascertainment time interval is defined, for example 30 seconds for the sleep category and 20 milliseconds for the speech category.
- the signal data 31 and/or use data 32 that are relevant to a category are processed in a first pre-processing step using category-specific time intervals to produce conditioned signal data 31 A and/or conditioned use data 32 A.
- the timestamps stored for the signal data 31 and/or use data 32 are evaluated.
- the conditioned signal data 31 A and/or conditioned use data 32 A are in turn provided with a timestamp.
- the biometric data 33 are extracted from a sequence of conditioned signal data 31 A and/or conditioned use data 32 A.
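The pre-processing into category-specific time intervals described above can be illustrated by the following sketch, which groups timestamped samples into fixed-length intervals and computes one characteristic value per interval. The names and the choice of the mean as the characteristic value are assumptions:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.SortedMap;
import java.util.TreeMap;

// Sketch of the first pre-processing step: raw timestamped samples are grouped
// into category-specific intervals (e.g. 30 s for sleep, 20 ms for speech) and
// one conditioned value per interval is computed, keyed by the interval's
// start timestamp (which serves as the timestamp of the conditioned data).
public class Preprocessor {

    public static final class Sample {
        public final long timestampMillis;
        public final double value;
        public Sample(long t, double v) { timestampMillis = t; value = v; }
    }

    public static SortedMap<Long, Double> condition(List<Sample> samples, long intervalMillis) {
        // Bucket samples by the start time of their interval.
        SortedMap<Long, List<Double>> buckets = new TreeMap<>();
        for (Sample s : samples) {
            long start = (s.timestampMillis / intervalMillis) * intervalMillis;
            buckets.computeIfAbsent(start, k -> new ArrayList<>()).add(s.value);
        }
        // One characteristic value per interval: here, the mean.
        SortedMap<Long, Double> conditioned = new TreeMap<>();
        for (Map.Entry<Long, List<Double>> e : buckets.entrySet()) {
            double mean = e.getValue().stream()
                .mapToDouble(Double::doubleValue).average().orElse(0.0);
            conditioned.put(e.getKey(), mean);
        }
        return conditioned;
    }
}
```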
- biometric data 33 in the motor functions category are ascertained from the conditioned use data 32 A pertaining to the SMS written.
- the biometric data 33 pertaining to a category that are ascertained in an instance of application are also referred to as a feature vector for this category.
- the biometric data 33 comprise the feature vectors ascertained for the various categories, with the respective timestamps of said feature vectors.
- the biometric data 33 ascertained by the further application 6 are stored in a local memory unit 7 of the mobile terminal 1 .
- the mobile terminal 1 has a transmission unit 8 A and a reception unit 8 B.
- the transmission unit 8 A transmits data 34 from the mobile terminal 1 to an external node, for example the central server 10 .
- the transmission is effected via the air interface, for example.
- the reception unit 8 B receives data from an external node, for example the central server 10 .
- the transmission unit 8 A is used to transmit data 34 , for example the biometric data 33 from the user, to the central server 10 for the purpose of evaluation.
- the reception unit 8 B is used to receive data 34 coming from the central server 10 , for example evaluations 35 created by the central server. Each evaluation 35 is provided with a timestamp that stipulates the time interval for which the evaluation is valid.
- An evaluation 35 , for example a current stress level 36 , 36 A, 36 B, 36 C, 36 D of a user of the mobile terminal 1 , is transferred to the further application 6 and displayed to the user on the display 9 of the mobile terminal 1 by means of the operating system 4 .
- the central server 10 has a transmission unit 18 A and a reception unit 18 B.
- the reception unit 18 B is used to receive data 34 from another node, for example the mobile terminal 1 .
- the received data 34 are biometric data 33 from the user of the mobile terminal 1 .
- the received data 34 are stored in a central memory unit 17 .
- an evaluation unit 13 is provided on the central server 10 .
- the evaluation unit 13 evaluates the received biometric data 33 .
- the evaluation unit 13 determines the at least one current stress level 36 , 36 A, 36 B, 36 C, 36 D at an instant t by evaluating those feature vectors for the received biometric data 33 whose timestamps are valid at the instant t.
- the current stress level 36 A denotes a first current stress level of the user for a first category of biometric data 33 , for example the sleep category.
- the current stress level 36 C denotes a second current stress level of the user for a second category of biometric data 33 , for example the motor functions category.
- the current stress level 36 B denotes a third current stress level of the user for a third category of biometric data 33 , for example the speech category.
- the current stress level 36 D denotes a fourth current stress level of the user for a fourth category of biometric data 33 , for example the social interaction category, or for a combination of categories of biometric data, for example the social interaction, economic data, personal data and/or questionnaire data categories.
- further current stress levels can be determined for further categories and/or combinations of categories.
- the current stress level 36 denotes a consolidated current stress level of the user that is obtained from a combination of the category-specific stress levels 36 A, 36 B, 36 C, 36 D and, if need be, of available further category-specific stress levels, for example by forming the arithmetic mean of the category-specific stress levels.
- the at least one evaluation 35 determined by the evaluation unit 13 , for example the at least one current stress level 36 , 36 A, 36 B, 36 C, 36 D, comprises, for each evaluation 35 , a timestamp that stipulates the time interval for which the evaluation 35 is valid.
- the at least one evaluation 35 , for example the at least one current stress level 36 , 36 A, 36 B, 36 C, 36 D, is stored in the central memory unit 17 and transmitted to the mobile terminal 1 via the transmission unit 18 A.
- FIG. 2 shows a schematic illustration of an embodiment of the further application 6 for ascertaining at least one current stress level 36 , 36 A, 36 B, 36 C, 36 D.
- the further application 6 comprises a plurality of components, for example a data manager 61 , a data preprocessor 62 and a data analyzer 63 .
- the signal data 31 and/or use data 32 made available via the operating system 4 are loaded into the data manager 61 and managed thereby.
- the data manager 61 transfers the signal data 31 and/or use data 32 to the data preprocessor 62 .
- the data preprocessor 62 conditions the signal data 31 and/or use data 32 and transfers the conditioned signal data 31 A and/or conditioned use data 32 A back to the data manager 61 .
- the data manager 61 stores the conditioned signal data 31 A and/or conditioned use data 32 A in the local memory unit 7 .
- the data manager 61 transfers the conditioned signal data 31 A and/or conditioned use data 32 A to the data analyzer 63 .
- the data analyzer 63 analyzes the conditioned signal data 31 A and/or conditioned use data 32 A and determines the biometric data 33 therefrom.
- the data analyzer 63 creates at least one evaluation 35 , for example in the form of at least one current stress level 36 , 36 A, 36 B, 36 C, 36 D.
- the data analyzer 63 transfers the biometric data 33 and if need be the at least one evaluation 35 to the data manager 61 .
- the data manager 61 visualizes the at least one evaluation 35 for the user of the mobile terminal 1 by displaying it on the display 9 .
- the data manager 61 transfers the biometric data 33 to the transmission unit 8 A for transmission to the central server 10 , insofar as the biometric data 33 are evaluated centrally.
- That evaluation 35 that is provided in the form of the consolidated current stress level 36 can be visualized on the display 9 continuously, for example, as a traffic light icon.
- the traffic light icon can display the colors green, amber or red on the basis of the consolidated current stress level 36 . If the consolidated current stress level 36 is normalized to an integer value in the value range [0,10], for example, then the traffic light color is chosen on the basis of the current value of the consolidated current stress level 36 . A high value corresponds to a high consolidated current stress level 36 . A low value corresponds to a low consolidated current stress level 36 . If the consolidated current stress level 36 is low, for example in the value range [0,3], the color green is displayed.
- if the consolidated current stress level 36 is elevated, for example in the value range [4,6], the color amber is displayed. If the consolidated current stress level 36 of the user is high, for example in the value range [7,10], the color red is displayed.
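The green/amber/red mapping described above can be sketched as follows (the class and method names are illustrative):

```java
// Sketch of the traffic light icon: a consolidated current stress level,
// normalized to an integer in [0,10], is mapped to a color.
public class TrafficLight {
    public static String color(int consolidatedStressLevel) {
        if (consolidatedStressLevel <= 3) return "green"; // low stress, [0,3]
        if (consolidatedStressLevel <= 6) return "amber"; // elevated, [4,6]
        return "red";                                     // high stress, [7,10]
    }
}
```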
- the display of the consolidated current stress level 36 is updated as soon as a consolidated current stress level 36 is available with a timestamp that is more recent than the timestamp of the previously displayed consolidated stress level.
- the consolidated current stress level 36 is visualized as a bar chart having 10 bars. Each bar in the bar chart has an associated integer value from the value range [0,10], to which the consolidated current stress level 36 is normalized.
- the data manager 61 receives at least one evaluation 35 , for example in the form of a third current stress level 36 B for the speech category, pertaining to the biometric data 33 for the speech category that are evaluated on the server.
- When the data manager 61 receives a new evaluation 35 , for example in the form of a third current stress level 36 B for the speech category, it ascertains a new consolidated current stress level 36 from the category-specific current stress levels, known to the data manager 61 , whose timestamps are currently still valid.
- the consolidated current stress level 36 is obtained by means of the arithmetic mean or by means of a weighted mean of the category-specific current stress levels 36 A, 36 B, 36 C, 36 D that are still valid.
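The consolidation by arithmetic or weighted mean can be sketched as follows; the specific weights are illustrative, and equal weights reduce the computation to the arithmetic mean:

```java
// Sketch of consolidating the still-valid category-specific stress levels
// (e.g. 36 A, 36 B, 36 C, 36 D) into a single consolidated stress level 36
// by means of a weighted mean, rounded to an integer in [0,10].
public class Consolidator {
    public static int consolidate(int[] levels, double[] weights) {
        double sum = 0, weightSum = 0;
        for (int i = 0; i < levels.length; i++) {
            sum += weights[i] * levels[i];
            weightSum += weights[i];
        }
        return (int) Math.round(sum / weightSum);
    }
}
```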
- the data manager 61 visualizes the consolidated current stress level 36 on the display 9 , for example by updating the traffic light icon.
- the consolidated current stress level 36 of the user is an individual variable.
- user-specific calibration can be performed. To this end, the user is asked to record biometric data 33 in the personal data category, for example via a form integrated in the further application 6 .
- an individual current stress level of the user is determined, which stipulates a calibration factor, for example.
- the individual current stress level for example in its manifestation as a calibration factor, is taken into account for determining the current stress level 36 , 36 A, 36 B, 36 C, 36 D for the user.
- FIG. 3A shows a flowchart for a first instance of application, sleep.
- the first instance of application, sleep, ascertains biometric data 33 in the sleep category for the purpose of ascertaining a first current stress level 36 A of a user.
- the first instance of application describes a first method for ascertaining said first current stress level 36 A.
- Prior to first use of the sleep instance of application, the user allows the mobile terminal 1 to fall onto his mattress from a height of approximately 30 centimeters.
- the further application 6 computes the spring constant and the damping constant of the mattress, which are stored as calibration data pertaining to the sleep instance of application.
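One possible way of computing the damping constant and the spring constant from the drop test is sketched below, modeling the mattress as a damped harmonic oscillator. The model, the use of the logarithmic decrement and the effective mass are assumptions, since the description does not specify the computation:

```java
// Hedged sketch of the mattress calibration: successive oscillation peaks of
// the acceleration trace give the damping constant via the logarithmic
// decrement, and the oscillation period gives the spring constant for an
// assumed effective mass (damped harmonic oscillator model).
public class MattressCalibration {

    // a0, a1: amplitudes of two successive peaks; periodSeconds: peak spacing.
    // delta = ln(a0 / a1) / T
    public static double dampingConstant(double a0, double a1, double periodSeconds) {
        return Math.log(a0 / a1) / periodSeconds;
    }

    // k = m * omega0^2, with omega0^2 = omega_damped^2 + delta^2
    public static double springConstant(double periodSeconds, double dampingDelta, double massKg) {
        double omegaDamped = 2.0 * Math.PI / periodSeconds;
        double omega0Squared = omegaDamped * omegaDamped + dampingDelta * dampingDelta;
        return massKg * omega0Squared;
    }
}
```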
- the sleep instance of application ascertains motion data during the rest phase of the user and evaluates said data.
- the user of the mobile terminal 1 calls the sleep mode of the further application 6 .
- calling the sleep mode automatically prompts the mobile terminal 1 to be put into flight mode in order to minimize emissions of electromagnetic radiation by the mobile terminal 1 .
- the user positions the mobile terminal 1 on the mattress during his rest phase.
- (A3) The signal data 31 produced by the sensors 2 , for example the gyroscope 21 , the acceleration sensor 22 and the light sensor 23 during the rest phase are collected by the further application 6 and stored in the local memory unit 7 .
- the data manager 61 of the further application 6 loads the sensor data ascertained during the sleep mode in the further application 6 and transfers these signal data 31 to the data preprocessor 62 .
- the data preprocessor 62 divides the ascertained signal data 31 into time intervals, for example into time intervals having a length of 30 seconds. For the signal data 31 in each time interval, conditioned signal data 31 A that are characteristic of the time interval are determined and are provided with a timestamp.
- the data preprocessor 62 transfers the conditioned signal data 31 A with their timestamps to the data manager 61 .
- the data manager 61 stores the conditioned signal data 31 A with their timestamps in the local memory unit 7 .
- the data manager 61 transfers the conditioned signal data 31 A with their timestamps to the data analyzer 63 for the purpose of evaluation.
- the data analyzer 63 analyzes the conditioned signal data 31 A and determines therefrom a feature vector with biometric data 33 in the sleep category.
- the feature vector is determined by means of a statistical regression model for modeling a binary target variable, for example a logit or probit model.
- the sequence of conditioned signal data 31 A that is obtained by arranging the conditioned signal data 31 A according to ascending timestamps is evaluated and each element in the sequence is classified as “awake” or “asleep” for the sleep state.
- the classification takes account of the sleep states of the preceding elements in the sequence, that is to say the sleep states in the preceding time intervals.
- on the basis of this classification, the time interval is classified with the state “asleep” or otherwise with the state “awake”.
- the sequence of sleep states over all time intervals is given as a basis for determining the feature vector.
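The interval-by-interval classification with a logit model, taking the preceding interval's sleep state into account, can be sketched as follows. The coefficients are illustrative placeholders, not trained values:

```java
// Sketch of the awake/asleep classification: a logit model rates each
// conditioned interval from its movement intensity and the sleep state of
// the preceding interval; a probability above 0.5 yields "asleep".
public class SleepClassifier {

    static double sigmoid(double z) { return 1.0 / (1.0 + Math.exp(-z)); }

    // motion: conditioned movement intensity of the interval
    // previousAsleep: sleep state classified for the preceding interval
    public static boolean isAsleep(double motion, boolean previousAsleep) {
        double b0 = 1.0, bMotion = -4.0, bPrev = 2.0;  // placeholder coefficients
        double z = b0 + bMotion * motion + (previousAsleep ? bPrev : 0.0);
        return sigmoid(z) > 0.5;
    }

    // Classify a sequence of conditioned intervals, ordered by ascending
    // timestamps, carrying the preceding state forward.
    public static boolean[] classify(double[] motions) {
        boolean[] states = new boolean[motions.length];
        boolean previous = false;
        for (int i = 0; i < motions.length; i++) {
            states[i] = isAsleep(motions[i], previous);
            previous = states[i];
        }
        return states;
    }
}
```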
- the feature vector of the biometric data 33 pertaining to the sleep category comprises the following features:
- the data analyzer 63 determines an evaluation 35 that comprises particularly the first current stress level 36 A for the sleep category.
- all features of the feature vector are rated with an integer value from the value range [0,10], for example, and the individual values are used to form a mean value, for example an arithmetic mean or a weighted mean.
- the rating is influenced to some extent by the user-specific calibration, for example as a result of a calibration factor that needs to be taken into account.
- the first current stress level 36 A for the sleep category is obtained as an integer value in the value range [0,10].
- the first current stress level 36 A comprises a timestamp that stipulates the period for which the first current stress level 36 A for the sleep category is valid.
- the data manager 61 stores the feature vector of the biometric data 33 pertaining to the sleep category and also the evaluation 35 , particularly the first current stress level 36 A for the sleep category, in the local memory unit 7 .
- the data manager 61 visualizes the evaluation 35 , particularly the first current stress level 36 A for the sleep category, on the display 9 . From the first current stress level 36 A for the sleep category and if need be further available, valid current stress levels for further categories, for example the current stress levels 36 B, 36 C, 36 D, the data manager 61 determines a consolidated current stress level 36 and visualizes the consolidated current stress level 36 , for example by updating the traffic light icon.
- FIG. 4A shows an exemplary graphical user interface for the start of the sleep instance of application of the further application 6 .
- the exemplary graphical user interface contains a tip for successful measurement of the biometric data 33 pertaining to the sleep category. By selecting the OK button, the user can start the instance of application.
- FIG. 4B shows a further exemplary graphical user interface of a first evaluation display for the sleep instance of application.
- the first evaluation display visualizes an evaluation 35 for the sleep category in the form of an overview evaluation.
- the sleep quality parameter is used to display a first current stress level 36 A of the user for the sleep category.
- the sleep quality is indicated by the numerical value 2.0 within a scale from 0 to 10.
- the first evaluation display comprises further elements, for example the last sleep pattern as a function of time.
- FIG. 4C shows a further exemplary graphical user interface of a second evaluation display for the sleep instance of application.
- the second evaluation display visualizes an evaluation 35 for the sleep category in the form of a detail display.
- the detail display comprises the ascertained biometric data 33 pertaining to the sleep category. For each feature of the biometric data 33 in the sleep category, the ascertained value is indicated.
- FIG. 4D shows a further graphical user interface of a third evaluation display for the sleep instance of application.
- the third evaluation display visualizes the consolidated current stress level 36 in a bar chart.
- the consolidated stress level 36 and the current stress levels 36 A, 36 B, 36 C, 36 D for the individual categories are displayed as numerical values. Each numerical value is displayed in a color that is specific to the value. The choice of color visualizes the current stress levels 36 , 36 A, 36 B, 36 C, 36 D in color.
- FIG. 3B shows a flowchart for a second instance of application, motor functions.
- the second instance of application, motor functions, ascertains biometric data 33 in the motor functions category for the purpose of ascertaining a second current stress level 36 C of a user.
- the second instance of application describes a second method for ascertaining the second current stress level 36 C. This instance of application requires only indirect interaction with the user.
- (B3) The user uses the keypad 25 of the mobile terminal 1 to type an SMS, for example.
- the data manager 61 transfers the collected and stored keypad data to the data preprocessor 62 .
- the data preprocessor 62 performs pre-evaluation of the keypad data. To this end, the data preprocessor 62 divides the ascertained keypad data into time intervals, for example into time intervals with a length of 15 seconds. For the keypad data 32 in each time interval, conditioned use data 32 A that are characteristic of the time interval are determined and are provided with a timestamp.
- the data manager 61 stores the conditioned use data 32 A provided with timestamps in the local memory unit 7 .
- the data manager 61 transfers the conditioned use data 32 A provided with timestamps to the data analyzer 63 .
- the data analyzer 63 analyzes the conditioned use data 32 A provided with timestamps and determines therefrom a feature vector with biometric data 33 in the motor functions category.
- the data analyzer 63 determines the error rate from the frequency of keypad input errors, particularly from the number of times the user operates a delete key in the time interval under consideration.
- the error rate determined is a measure of the hand/eye coordination of the user.
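The error-rate feature can be sketched as the share of delete-key presses among all key presses in the time interval under consideration; the names are illustrative:

```java
// Sketch of the error-rate feature for the motor functions category:
// the fraction of key presses in the interval that were the delete key.
public class ErrorRate {

    // deletePresses: per key press in the interval, true if it was the delete key
    public static double errorRate(boolean[] deletePresses) {
        if (deletePresses.length == 0) return 0.0;
        int deletes = 0;
        for (boolean wasDelete : deletePresses) {
            if (wasDelete) deletes++;
        }
        return (double) deletes / deletePresses.length;
    }
}
```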
- the feature vector of the biometric data pertaining to the motor functions category comprises the following features:
- the data analyzer 63 determines an evaluation 35 , particularly the second current stress level 36 C for the motor functions category.
- all features of the feature vector are rated with an integer value from the value range [0,10], for example, and a mean value, for example an arithmetic mean or a weighted mean, is formed from the individual values.
- the rating is influenced to some extent by the user-specific calibration, for example as a result of a calibration factor that needs to be taken into account.
- the second current stress level 36 C for the motor functions category is obtained as an integer value in the value range [0,10], for example.
- the second current stress level 36 C comprises a timestamp that stipulates the period for which the second current stress level 36 C for the motor functions category is valid.
- the data analyzer 63 transfers the feature vector of the biometric data 33 pertaining to the motor functions category and also the evaluation 35 , particularly the second current stress level 36 C for the motor functions category, with its timestamp, to the data manager 61 .
- the data manager 61 stores the feature vector of the biometric data 33 pertaining to the motor functions category and also the evaluation 35 , particularly the second current stress level 36 C for the motor functions category, with its timestamp, in the local memory unit 7 .
- the data manager 61 visualizes the evaluation 35 , particularly the second current stress level 36 C for the motor functions category, on the display 9 . From the second current stress level 36 C for the motor functions category and if need be further available valid current stress levels for further categories, the data manager 61 determines the consolidated current stress level 36 and visualizes it, for example by updating the traffic light icon.
- the biometric data 33 , for example the biometric data 33 pertaining to the sleep and/or motor functions categories, are transmitted to the central server 10 , stored in the central memory unit 17 and evaluated by the evaluation unit 13 arranged on the server.
- FIG. 3C shows a flowchart for a third instance of application, speech.
- the third instance of application, speech, ascertains biometric data 33 in the speech category, for example the speech parameters speech rate and/or modulation capability, in order to ascertain a third current stress level 36 B of a user.
- the third instance of application describes a third method for ascertaining the third current stress level 36 B. This instance of application requires only indirect interaction with the user.
- the speech instance of application comprises voice analysis of voice data from the user, for example voice data from telephone calls conducted by the user using the mobile terminal 1 .
- (C1) The further application 6 has been loaded into the execution unit 3 of the mobile terminal 1 and has been started.
- the further application 6 runs as a background process in the execution unit 3 .
- (C2) The speech instance of application is started by an incoming call to the mobile terminal 1 , for example.
- (C6) The data manager 61 transfers the voice data 31 stored with a timestamp to the data preprocessor 62 .
- the data preprocessor 62 performs pre-evaluation of the voice data 31 . To this end, the data preprocessor 62 divides the captured voice data 31 into time intervals, for example into time intervals with a length of 20 milliseconds. For the voice data 31 in each time interval, conditioned voice data 31 A that are characteristic of the time interval are determined and are provided with a timestamp.
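The division of the voice data into 20-millisecond intervals can be illustrated as follows; the short-time energy used here as the characteristic value per frame is an assumption, as the description does not name the conditioned value:

```java
// Sketch of conditioning voice data: the sampled signal is cut into 20 ms
// frames and one characteristic value per frame (here: mean-square energy)
// is computed, one entry per frame in timestamp order.
public class VoiceFrames {

    // samples: PCM samples; sampleRateHz: e.g. 8000 -> 160 samples per 20 ms frame
    public static double[] frameEnergies(double[] samples, int sampleRateHz) {
        int frameLength = sampleRateHz / 50;   // 20 ms worth of samples
        int frameCount = samples.length / frameLength;
        double[] energies = new double[frameCount];
        for (int f = 0; f < frameCount; f++) {
            double energy = 0;
            for (int i = 0; i < frameLength; i++) {
                double s = samples[f * frameLength + i];
                energy += s * s;
            }
            energies[f] = energy / frameLength;  // mean-square energy of the frame
        }
        return energies;
    }
}
```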
- the data preprocessor 62 transfers the conditioned voice data 31 A with their timestamps to the data manager 61 .
- the data manager 61 stores the conditioned voice data 31 A with their timestamps in the local memory unit 7 .
- the data manager 61 transfers the conditioned voice data 31 A with their timestamps to the data analyzer 63 for the purpose of evaluation.
- the data analyzer 63 analyzes the conditioned voice data 31 A and determines from them a feature vector with biometric data 33 in the speech category.
- the feature vector of the biometric data 33 for the speech category comprises the following features:
- the feature vector is provided with a timestamp and these data are transferred from the data analyzer 63 to the data manager 61 as biometric data 33 in the speech category.
- the data manager 61 stores the feature vector provided with a timestamp in the local memory unit 7 as biometric data 33 in the speech category.
- the data manager 61 transfers the biometric data 33 pertaining to the speech category to the transmission unit 8 A for the purpose of transmission to the central server 10 .
- the reception unit 18 B of the central server 10 receives the transmitted data in the form of the biometric data 33 pertaining to the speech category.
- the central server 10 stores the biometric data 33 in the central memory unit 17 and evaluates the biometric data 33 in the evaluation unit 13 .
- a neural network method is used, for example.
- the evaluation unit 13 determines an evaluation 35 .
- the evaluation 35 particularly comprises the third current stress level 36 B in the speech category.
- the third current stress level 36 B for the speech category is determined as an integer value in the value range [0,10], for example.
- the third current stress level 36 B comprises a timestamp that stipulates the period for which the third current stress level 36 B for the speech category is valid.
- the central server 10 transmits the evaluation 35 , particularly the third current stress level 36 B for the speech category, with its timestamp, to the mobile terminal 1 by means of the transmission unit 18 A.
- the transmitted evaluation 35 is received by the reception unit 8 B of the mobile terminal 1 and transferred to the data manager 61 of the further application 6 .
- the data manager 61 stores the evaluation 35 , particularly the third current stress level 36 B for the speech category, with its timestamp, in the local memory unit 7 .
- the data manager visualizes the evaluation 35 , particularly the third current stress level 36 B for the speech category, on the display 9 . From the third current stress level 36 B for the speech category and if need be further available, valid current stress levels for further categories, the data manager 61 determines the consolidated current stress level 36 and visualizes the consolidated current stress level 36 , for example by updating the traffic light icon.
- the social interaction instance of application evaluates use data 32 from the user from such available applications 5 as are used for social interaction.
- available applications 5 that are used for social interaction are SMS applications, e-mail applications or social network applications, such as an instant messaging application or a Facebook application. From the use data 32 pertaining to the available applications 5 that are used for social interaction, it is possible to ascertain, by way of example, the number of contacts in social networks or the frequency with which contact is made, for example the frequency with which an SMS is sent.
- the feature vector of the biometric data 33 pertaining to the social interaction category comprises the following features:
- biometric data 33 in further categories can be taken into account, for example biometric data 33 in the economic data category, in the personal data category and/or in the questionnaire data category.
- the economic data category relates to general rather than user-specific data, for example data pertaining to the general sickness absence rate or to job security.
- the feature vector of the biometric data 33 pertaining to the economic data category comprises the following features:
- the personal data category comprises data pertaining to age and family status and also pertaining to occupation group and pertaining to education level.
- the feature vector of the personal data category is used particularly for individual calibration of the current stress levels 36 , 36 A, 36 B, 36 C, 36 D.
- the personal data are recorded by the user using a form within the further application 6 , for example.
- the feature vector of the biometric data 33 pertaining to the personal data category comprises the following features:
- the questionnaire data comprise individual self-assessments by the user pertaining to stress-related questions.
- the questionnaire data are recorded by the user using a form within the further application 6 , for example.
- the biometric data 33 pertaining to the cited further categories can additionally be used for evaluation and particularly for ascertaining the consolidated current stress level 36 of the user.
- Whereas the biometric data 33 can be evaluated by the further application 6 directly, as an evaluation unit on the mobile terminal 1, a different approach has been chosen for the exemplary speech instance of application.
- the evaluation of the biometric data 33 pertaining to the speech category is effected in the evaluation unit 13 that is arranged on the central server 10 .
- the evaluation unit 13 contains an evaluation method, for example a method based on artificial neural networks that resorts to biometric data 33 from other users and to earlier biometric data 33 from the user.
- the biometric data 33 from other categories are also evaluated in the evaluation unit 13 arranged on the central server 10 in order to increase the quality of the evaluation further.
- the evaluation method obtained on the central server 10 by training the artificial neural network method is implemented in the further application 6 , for example by means of an update in the further application 6 .
- the evaluation unit is provided for all categories by the further application 6 on the mobile terminal 1 . Evaluation of the biometric data 33 pertaining to all categories is effected on the mobile terminal 1 rather than on the central server 10 .
- FIG. 5A shows a schematic illustration of an exemplary embodiment of the evaluation unit 13 on the central server 10 .
- a current stress level 36 , 36 A, 36 B, 36 C, 36 D is determined for a user by the evaluation unit 13 on the central server 10 .
- Instead of all the biometric data 33 pertaining to the user being evaluated on the central server, it is alternatively possible for a portion of the biometric data 33 pertaining to the user to be analyzed and evaluated on the mobile terminal 1 directly and for at least one current stress level 36 A, 36 B, 36 C, 36 D ascertained on the terminal to be determined.
- a second portion of the biometric data 33 pertaining to the user is analyzed and evaluated on the central server 10 by the evaluation unit 13 and at least one current stress level 36 A, 36 B, 36 C, 36 D on the server is determined.
- the biometric data 33 analyzed and evaluated on the server can comprise biometric data 33 that are also taken into account for the analysis and evaluation on the mobile terminal 1 .
- a consolidated stress level 36 that takes account both of the at least one current stress level 36 A, 36 B, 36 C, 36 D ascertained on the terminal and of the at least one current stress level 36 A, 36 B, 36 C, 36 D ascertained on the server is determined by the data manager 61 of the further application 6 .
- the evaluation unit 13 comprises a server-end data manager 14 and a server-end data analyzer 15 .
- the server-end data analyzer 15 is in the form of an artificial neural network 40 in the form of a multilayer perceptron network.
- the neural network consists of three layers: the input layer 43 , the hidden layer 44 and the output layer 45 . Each layer is constructed from neurons 46 .
- the input layer 43 contains a plurality of input neurons 46 A.
- the hidden layer 44 contains a plurality of hidden neurons 46 B and the output layer 45 contains precisely one output neuron 46 C.
- each input neuron 46 A of the input layer 43 has, as an associated input value, the value of a feature from a feature vector in a category of biometric data 33 that have been transmitted to the central server 10 , for example in the speech category, following suitable normalization, for example to the value range [0,10].
- each input neuron 46 A of the input layer 43 has, as an associated input value, the current stress level for a category of biometric data 33 .
- the input layer 43 consists of seven input neurons 46 A, each input neuron 46 A having the associated current stress level of one of the categories sleep, speech, motor functions, social interaction, economic data, personal data and questionnaire data.
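The mapping of the seven category stress levels onto the input neurons, together with the normalization to a fixed value range mentioned above, might be sketched as follows (illustrative only; the category names and raw-value ranges are assumptions of this sketch):

```python
# Illustrative only: category names and raw-value ranges are assumptions.
CATEGORIES = ["sleep", "speech", "motor functions", "social interaction",
              "economic data", "personal data", "questionnaire data"]

def normalize(value, lo, hi):
    """Map a raw value from the known range [lo, hi] onto [0, 10]."""
    value = min(max(value, lo), hi)  # clamp out-of-range measurements
    return 10.0 * (value - lo) / (hi - lo)

def build_input_layer(category_levels, category_ranges):
    """One input neuron per category; its input value is the normalized
    current stress level of that category."""
    return [normalize(category_levels[c], *category_ranges[c])
            for c in CATEGORIES]
```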
- the features of the category-specific feature vectors of biometric data 33 available to the central server 10 are linked and evaluated in another way in order to determine the input values of the input neurons 46 A.
- the multilayer perceptron network is in the form of a feed forward network, i.e. the connections between the neurons 46 always point from one layer, for example the input layer 43 , to the next layer, for example the hidden layer 44 .
- the input neurons 46 A of the input layer have connections to the hidden neurons 46 B of the hidden layer.
- each input neuron 46 A of the input layer can have one connection to each hidden neuron 46 B of the hidden layer.
- the hidden layer 44 has a greater number of neurons 46 than the input layer 43 .
- the output layer 45 contains precisely one neuron 46 , the output neuron 46 C.
- the neurons 46 B of the hidden layer 44 have connections to the output neuron 46 C of the output layer 45 .
- each hidden neuron 46 B of the hidden layer 44 is connected to the output neuron 46 C.
- the output neuron 46 C represents a current stress level 36 , 36 A, 36 B, 36 C, 36 D of a user.
- the artificial neural network 40 computes a current stress level 36 , 36 A, 36 B, 36 C, 36 D of the user.
- the server-end data manager 14 retrieves the biometric data 33 pertaining to a user, in the form of the feature vectors for the ascertained categories of biometric data 33 that have been transmitted to the central server 10, from the central memory unit 17.
- the feature vectors suitable for computing the current stress level 36 , 36 A, 36 B, 36 C, 36 D are taken into account, for example the feature vectors with the most recent timestamp.
- a feature vector in a category is taken into account only if the instant for which the current stress level 36 , 36 A, 36 B, 36 C, 36 D is computed lies in the validity range defined by the timestamp.
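A minimal sketch of this timestamp-based validity filter, assuming hypothetical `valid_from`/`valid_until` fields derived from each feature vector's timestamp:

```python
def valid_feature_vectors(vectors, instant):
    """Keep only feature vectors whose validity range (here the hypothetical
    fields valid_from/valid_until, derived from the timestamp) covers the
    instant for which the current stress level is computed."""
    return [v for v in vectors
            if v["valid_from"] <= instant <= v["valid_until"]]
```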
- the server-end data manager 14 provides the biometric data 33 for the data analyzer 15 , which is in the form of an artificial neural network 40 .
- the biometric data 33 are read into the input layer 43 of the neural network 40 and forwarded to the next layers of the neural network 40 via the connections.
- Each connection has a connection weight that has either a boosting or inhibiting effect.
- Each neuron 46 B of the hidden layer 44 has an activation function, for example the hyperbolic tangent activation function, which maps an arbitrary input value onto the value range [−1, 1].
- the input value for a neuron 46 B of the hidden layer 44 is obtained as a sum of the values transmitted via the weighted connections.
- a neuron-specific threshold value is stipulated.
- If the input value exceeds the threshold value of the neuron 46 B, this computed value is forwarded from the hidden neuron 46 B to its outgoing connections and hence to the output neuron 46 C in the output layer 45.
- the output neuron 46 C determines its output value using the same method as has been described for a hidden neuron 46 B of the hidden layer 44 .
- the artificial neural network 40 determines the value of the one output neuron 46 C in a deterministic fashion from the biometric data 33 that are associated with the input neurons 46 A.
- the value of the output neuron 46 C provides the current stress level 36 , 36 A, 36 B, 36 C, 36 D.
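The forward computation described above (weighted connections, a neuron-specific threshold, and a hyperbolic tangent activation) can be sketched as follows. The gating interpretation, in which a neuron forwards its activation only if the summed input exceeds its threshold, is one reading of the text, not a definitive implementation:

```python
import math

def layer_forward(inputs, weights, thresholds):
    """One fully connected layer: each neuron sums its weighted inputs and
    applies the tanh activation; the result is forwarded only if the summed
    input exceeds the neuron-specific threshold (gating is this sketch's
    interpretation of the thresholding described in the text)."""
    outputs = []
    for w_row, theta in zip(weights, thresholds):
        s = sum(w * x for w, x in zip(w_row, inputs))
        outputs.append(math.tanh(s) if s > theta else 0.0)
    return outputs

def mlp_stress_level(features, w_hidden, t_hidden, w_out, t_out):
    """Input layer -> hidden layer -> single output neuron; the output
    value provides the current stress level."""
    hidden = layer_forward(features, w_hidden, t_hidden)
    (output,) = layer_forward(hidden, w_out, t_out)
    return output
```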
- the value of the output neuron 46 C is transferred from the server-end data analyzer 15 in the form of an artificial neural network 40 to the server-end data manager 14 .
- the server-end data manager 14 stores the output value as a current stress level 36 , 36 A, 36 B, 36 C, 36 D for the categories relevant to determination thereof, with a timestamp, in the central memory unit 17 .
- At the start of the training, the connection weights of each connection and the threshold values of each neuron 46 are stipulated.
- By way of example, the connection weight for a connection is stipulated by a random value from the range [−0.5, 0.5], the value 0 being omitted.
- the threshold value for a neuron 46 is stipulated by a random value from the range [−0.5, 0.5], for example.
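A sketch of this random initialization, with the weight range [−0.5, 0.5] excluding 0 as stated:

```python
import random

def init_connection_weight(rng=random):
    """Random connection weight from [-0.5, 0.5], the value 0 being omitted."""
    w = 0.0
    while w == 0.0:
        w = rng.uniform(-0.5, 0.5)
    return w

def init_threshold(rng=random):
    """Random neuron-specific threshold value from [-0.5, 0.5]."""
    return rng.uniform(-0.5, 0.5)
```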
- During the training, the connection weights of each connection of the neural network 40 and the threshold values for each neuron 46 are adjusted.
- A monitored (supervised) learning method, preferably a back propagation method, is used for this training.
- the desired output value from the output neuron 46 C is available for the input values for the neural network 40 .
- the desired output value from the output neuron 46 C is obtained from the current stress level for the questionnaire data category, which level has been ascertained exclusively from the questionnaire data answered by the user.
- the connection weights of all connections and the threshold values of all neurons 46 are trained until the output value that the neural network 40 provides for the output neuron 46 C matches the desired output value with sufficient accuracy.
- Repeating the training with a multiplicity of biometric data 33 from a multiplicity of users allows the analysis and evaluation method provided by the artificial neural network 40 for ascertaining the current stress level 36, 36A, 36B, 36C, 36D to be constantly improved and adjusted further.
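The training loop can be sketched as follows; the questionnaire-based stress level serves as the desired output value for each sample. For brevity this sketch approximates the gradients numerically rather than by an explicit back propagation pass, so it is an illustration of the training objective, not of the named method:

```python
def train(predict, params, samples, lr=0.1, tol=1e-2, max_epochs=1000):
    """Adjust the flattened connection weights/thresholds (`params`) until
    the squared error against the desired outputs is small enough.
    `samples` pairs input features with the questionnaire-based target
    stress level. Gradients are approximated numerically for brevity;
    an explicit back propagation pass would replace this inner loop."""
    def loss(p):
        return sum((predict(p, x) - target) ** 2 for x, target in samples)
    eps = 1e-5
    for _ in range(max_epochs):
        if loss(params) < tol:
            break  # output matches the desired value with sufficient accuracy
        grad = []
        for i in range(len(params)):
            bumped = params[:]
            bumped[i] += eps
            grad.append((loss(bumped) - loss(params)) / eps)
        params = [p - lr * g for p, g in zip(params, grad)]
    return params
```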
- FIG. 5B shows a schematic illustration of an alternative exemplary embodiment of the evaluation unit 13 on the central server 10 .
- the artificial neural network 40 is in the form of a feedback neural network. Besides the connections that point from a neuron 46 of an upstream layer to a neuron 46 of a downstream layer, for example from an input neuron 46 A of the input layer 43 to a hidden neuron 46 B of the hidden layer 44 , this embodiment has connections that run in the opposite direction, for example from a hidden neuron 46 B of the hidden layer 44 to an input neuron 46 A of the input layer 43 or from the output neuron 46 C of the output layer 45 to a hidden neuron 46 B of the hidden layer 44 .
- An artificial neural network 40 of this kind has a higher level of complexity than the previously described feed forward network, which forwards data only in a distinguished forward direction.
- A development of the feedback artificial neural network is shown in FIG. 5C. Accordingly, lateral feedback loops are also possible, that is to say connections between neurons 46 that are arranged in the same layer. In a further development of the feedback artificial neural network, there is also provision for direct feedback. Direct feedback is a connection from a neuron 46 to itself; it means that neurons 46 inhibit or boost themselves in order to arrive at their activation limits.
- a feedback artificial neural network is provided particularly in order to take account of the “memory” of biometric data 33 pertaining to a user when determining the current stress level 36 , 36 A, 36 B, 36 C, 36 D.
- the memory of biometric data 33 pertaining to a category pertaining to a user is the sequence, arranged on the basis of their timestamp, of feature vectors for this category and this user; in particular, the sequence comprises older feature vectors from earlier analyses.
- a suitable subsequence is selected and the artificial neural network method is started with the first feature vector in this subsequence, that is to say the feature vector with the oldest timestamp.
- the values of the first feature vector are applied to the artificial neural network as input values and the neural network is transited once.
- the built-in feedback loops mean that the values from the first time step have a further effect on the subsequent time step.
- the values of the second feature vector are applied to the artificial neural network 40 as input values.
- the values generated in feedback connections from the previous time step are taken into account as new input values.
- the method determined in this manner is continued further until the complete subsequence of feature vectors has been transited.
- the value of the output neuron 46 C provides the current stress level 36 , 36 A, 36 B, 36 C, 36 D of the user.
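A minimal sketch of processing the time-ordered subsequence of feature vectors with feedback connections. This is an Elman-style simplification in which only hidden-to-hidden feedback is modeled; the actual feedback topology described above is not limited to this form:

```python
import math

def recurrent_stress_level(sequence, w_in, w_rec, w_out):
    """Process the time-ordered subsequence of feature vectors, oldest
    timestamp first. The feedback connections (w_rec) carry each time
    step's hidden state into the next, realizing the 'memory' of earlier
    biometric data."""
    hidden = [0.0] * len(w_rec)              # hidden state, initially empty
    for features in sequence:                # one time step per feature vector
        hidden = [math.tanh(
                      sum(w * x for w, x in zip(w_in[j], features)) +
                      sum(w * h for w, h in zip(w_rec[j], hidden)))
                  for j in range(len(w_rec))]
    # single output neuron: the current stress level after the last step
    return math.tanh(sum(w * h for w, h in zip(w_out, hidden)))
```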
- FIG. 6 shows a schematic illustration of an embodiment of the invention that has been developed further.
- the evaluation unit 6 , 13 ascertains a current stress level 36 , 36 A, 36 B, 36 C, 36 D of a user using a network of artificial neural networks.
- the network of neural networks may be in the form of a deep belief network or a convolutional deep belief network 50 .
- a single artificial neural network 40 which is part of the network of artificial neural networks, may be embodied according to one of the embodiments cited previously for artificial neural networks 40 , for example.
- the network of artificial neural networks comprises a plurality of artificial neural networks 40 that interact with one another.
- the plurality of artificial neural networks 40 may be embodied as a restricted Boltzmann machine or as a convolutional restricted Boltzmann machine.
- a single neural network 40 comprises an input layer 43 and a hidden layer 44 .
- the input layer comprises a plurality of input neurons 46 A.
- the hidden layer comprises a plurality of hidden neurons 46 B. From the input layer of an artificial neural network, it is possible to determine the hidden layer of the same neural network, as explained in the preceding embodiments, for example.
- the network of artificial neural networks contains a first level of artificial neural networks 40 , which are referred to as first neural networks.
- the input layer 43 of the first neural networks is stipulated by the biometric data 33 from the user.
- a component of a feature vector in a category can be associated with an input neuron 46 A.
- at least one further level of artificial neural networks 40 is provided, which are referred to as further neural networks.
- the input layer 43 can be determined from the hidden layers 44 of a plurality of artificial neural networks 40 on the preceding level.
- an input neuron 46 A of the input layer 43 is stipulated by precisely one hidden neuron 46 B of an artificial neural network 40 from the preceding layer.
- an input neuron 46 A of the input layer 43 is stipulated by a plurality of hidden neurons 46 B of one or more artificial neural networks 40 from the preceding layer.
- the network of artificial neural networks 40 contains a topmost level that comprises at least one artificial neural network 40 .
- the at least one artificial neural network 40 on the topmost level is referred to as the topmost neural network.
- the at least one topmost neural network has an output layer 45 .
- the hidden layer 44 of a topmost neural network is identified by means of the output layer 45 .
- the at least one artificial neural network 40 of the topmost level comprises three layers, the input layer, the hidden layer and the output layer 45 .
- the output layer 45 comprises precisely one output neuron 46 C.
- the current stress level 36 , 36 A, 36 B, 36 C, 36 D can be determined from the output layer 45 of the at least one topmost neural network.
- a classifier is provided that classifies the output layer 45 and determines the current stress level 36 , 36 A, 36 B, 36 C, 36 D therefrom.
- the classifier may be designed as a support vector machine.
- the current stress level 36 , 36 A, 36 B, 36 C, 36 D is stipulated by the output neuron 46 C.
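The level-by-level structure, in which the first level reads the biometric feature vector and each further level reads the hidden layer of the preceding level, might be sketched as follows. The deterministic sigmoid propagation is a simplification of a restricted Boltzmann machine, whose hidden units are normally sampled stochastically:

```python
import math

def level_forward(visible, weights, biases):
    """Hidden-layer activations of one network level from its input layer.
    The deterministic sigmoid used here simplifies a restricted Boltzmann
    machine, whose hidden units are normally sampled stochastically."""
    return [1.0 / (1.0 + math.exp(-(b + sum(w * v for w, v in zip(ws, visible)))))
            for ws, b in zip(weights, biases)]

def network_of_networks(features, levels):
    """First level reads the biometric feature vector; each further level
    reads the hidden layer of the preceding level. The topmost hidden
    layer is returned for the output layer or an external classifier."""
    layer = features
    for weights, biases in levels:
        layer = level_forward(layer, weights, biases)
    return layer
```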
- the evaluation unit 6 , 13 which comprises a network of a plurality of artificial neural networks 40 , is designed such that the computation of the network can be parallelized.
- the evaluation unit 6 , 13 interacts with at least one processor, the processor being designed and provided to compute neurons 46 , 46 B, 46 C for at least one artificial neural network 40 .
- the processor may be arranged on the mobile terminal 1 .
- the processor may also be provided on a central server.
- a plurality of processors are provided.
- the plurality of processors may be provided on the mobile terminal or the central server or on both.
- the evaluation unit 6 , 13 is designed and provided to have the plurality of artificial neural networks 40 computed by the plurality of processors in parallel.
- the parallel computation optimizes the computation time.
- the method for determining a current stress level that is based on a network of artificial neural networks can be executed more quickly by parallelizing the computation of neural networks. Similarly, the parallelization allows power to be saved.
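A sketch of the parallel evaluation of a plurality of networks; the thread pool shown here is only one possible realization of distributing the networks across the available processors:

```python
from concurrent.futures import ThreadPoolExecutor

def evaluate_networks_parallel(networks, features, workers=4):
    """Compute a plurality of artificial neural networks in parallel; each
    task evaluates one network (here modeled as a callable) on the same
    input features. Results are returned in network order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda net: net(features), networks))
```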
- At least one graphics card with at least one graphics card processor can be incorporated for executing the method, the at least one graphics card being arranged on the mobile terminal or on the central server.
- the graphics card processor can support computation of the artificial neural networks, in particular. This approach allows the computation time to be optimized even further.
Description
- The present invention relates to an apparatus for ascertaining a current stress level for a user according to the preamble of claim 1, to a method for ascertaining a current stress level, to a computer program product and to an application for a mobile terminal.
- According to the preamble of
claim 1, a mobile terminal is provided that has at least one sensor integrated in the mobile terminal for producing signal data and has a plurality of available applications for use by the user. In addition, an evaluation unit is provided. The evaluation unit is provided particularly to evaluate signal data. - By way of example, the mobile terminal according to the preamble is a smartphone such as an iPhone or another smartphone, which is equipped with an operating system, such as an iOS or Android operating system for a mobile terminal, and has an integrated sensor, for example a GPS sensor for ascertaining the current position. The mobile terminal has a plurality of standard applications installed on it, such as a telephony application for setting up and conducting a telephone call via a mobile radio link and/or an application for writing, sending and receiving SMSs and/or a browser application for accessing web pages on the Internet. In addition, the operating system allows the installation and execution of further applications, which can be downloaded from an online shop for mobile applications on the Internet, for example. The further application can request and process data pertaining to the current location of the mobile terminal from the GPS sensor, for example, and can transmit said data to a central server via the mobile radio network, for example via a GPRS or UMTS connection. Such tracking data can be stored on the server and evaluated in an evaluation unit arranged in the central server, for example in order to provide the user with location-based services, such as a locating service for tracking the whereabouts of a child for his parents, what is known as a Childtracker. Location-based services are currently widely advertized on the market and in great demand, as a result of which smartphones have now become prevalent, particularly in younger demographic groups.
- In addition, there are differently designed apparatuses and methods that use special devices for sensor-based measurement of specific vital functions and determine a current stress level from the measured values. By way of example, the Stress Monitor (see http://www.healthreviser.com/content/stress-monitor) thus uses a clip worn on the ear of the user to ascertain the heart rate of the user and uses this singular indicator to determine a current stress level.
- Further methods known to date determine a current stress level for a person on the basis of questionnaires, for example. Such a method is provided by My Mood Monitor (see http://www.whatsmym3.com), for example, on the basis of an online questionnaire. When performed repeatedly, such methods based on questionnaire data can help to show the alteration in a stress level over time. In particular, they can provide people affected by depression and/or stress with a tool for monitoring the success of measures taken in order to improve stress level.
- The cited apparatuses and methods for ascertaining a current stress level have the disadvantage that the current stress level is determined on the basis of very few biometric data (for example subjective questionnaire data or objective sensor data that measure a vital function), which means that different categories of biometric data are not combined. Therefore, the current stress level can be determined less reliably than when a large number of different categories of biometric data are used.
- Furthermore, the aforementioned sensor-based apparatuses resort to special devices designed precisely for this instance of application, which are equipped with integrated sensors that capture a specific category of biometric data, such as the heart rate or the body temperature or another vital function. These devices need to be worn by the user on his body, for example, in order to determine his current stress level. This firstly means increased outlay in terms of hardware and cost, since measuring a vital function requires a dedicated device with a sensor for measuring precisely this vital function. Furthermore, fitting and wearing a special apparatus with an integrated sensor on or in direct proximity to the body is a nuisance for the wearer, particularly a restriction of comfort and wellbeing.
- By contrast, the aforementioned questionnaire-based methods require a high level of additional effort from the user. By way of example, the user thus needs to regularly answer questionnaire data, for example using a computer via the Internet, and then autonomously manage, compare and rate the results that vary over time.
- It is therefore an object of the present invention to provide an apparatus and also a method and an application of the type cited at the outset that overcome the aforementioned disadvantages. In particular, it is an object of the invention to provide an apparatus and a method, a computer program product and also an application that determine a current stress level for a user with a high level of reliability, with the additional outlay for the user in terms of cost and time being minimized without restricting the comfort of the user, since the invention dispenses with the use of additional devices and/or sensors that need to be worn on or close to the body.
- The invention achieves this object by means of the features of patent claims 1, 26, 36 and 37. Advantageous refinements of the invention can be found in the subclaims and in the description below.
- The present invention comprises a further application that can be installed on a mobile terminal and that ascertains a multiplicity of biometric data from a user. In order to ascertain the biometric data, the further application interacts with other components that are likewise arranged in the mobile terminal. By way of example, the other components are sensors integrated in the mobile terminal and also available applications. An available application denotes an application that is available on a mobile terminal, that is to say an application that is installed and executable on the mobile terminal, such as telephony, SMS, MMS, chat applications and/or browser applications for accessing the Internet, and also other applications that are suitable for extracting tactile, acoustic and/or visual biometric features of the user.
- By way of example, the mobile terminal is a smartphone or a tablet computer or a PDA or another mobile terminal that the user can use for many diverse purposes, for example for communication.
- The mobile terminal has means for installing and executing a further application. By way of example the further application can be obtained via the Internet via an online shop integrated in the operating system of the mobile terminal, such as the App store, or another online shop that supplies compatible applications for the mobile terminal, and can be installed on the mobile terminal directly. The further application may be available in various versions, for example in an iOS version for installation and execution on an iPhone or an iPad and in an Android version for installation and execution on a mobile terminal that supports said Android operating system, or in a further version that is compatible with a further mobile operating system. Alternatively, the further application may be installed, and can be executed, on an interchangeable component of the mobile terminal. By way of example, it may be stored as a SIM application in a memory area on a SIM card that can be operated in the mobile terminal, and can be executed by a separate execution unit integrated on the SIM card.
- As a result of a further application according to the invention, in a version compatible with the mobile terminal, being installed and executed on a mobile terminal of the type described at the outset, the portion of the apparatus for ascertaining a current stress level that is arranged on the mobile terminal is obtained.
- The further application ascertains the biometric data firstly from signal data that are produced by sensors integrated in the mobile terminal, and secondly biometric data are extracted from the use data of other applications installed on the mobile terminal.
- Determination of a plurality of biometric data is firstly facilitated by virtue of mobile terminals, such as smartphones, being equipped with an increasing number of sensors as standard. Furthermore, determination of further biometric data is also facilitated by virtue of users increasingly satisfying their interaction and communication needs by using such devices, and hence biometric data pertaining to the categories speech and social interaction, for example, being able to be derived from use data pertaining to communication applications of the mobile terminal directly, and without additional effort and restriction of comfort for the user.
- By way of example, the further application can use voice analysis to ascertain biometric data pertaining to speech, such as volume, speech rate and/or modulation capability, from the use data from a telephony application installed on the mobile terminal, particularly from the voice data from the user that are ascertained via the microphone of the mobile handset and that are transmitted from the mobile terminal to the communication partner via a radio network, for example. Alternatively, the voice data from the user can come from using other available applications, for example from applications that are controlled by the user using voice control.
- From the use data from an SMS application installed on the mobile terminal, the further application can determine the number of SMS messages sent and received and the number of different receivers and senders of SMS messages, and hence can determine biometric data pertaining to social interaction.
- By way of example, a sensor integrated in the mobile terminal may be a gyroscope or gyroscopic instrument. A gyroscope is used for position finding in space and is increasingly widespread in smartphones currently advertized on the market, such as in the iPhone 4. The gyroscope and further sensors that are integrated in the mobile terminal, such as an acceleration sensor and/or a light sensor, can be used to ascertain biometric data pertaining to the sleep structure of the user. To this end, the mobile terminal is positioned on the mattress at night and the movements of the user during the night are detected by the sensors. The further application collects the signal data produced by gyroscope, acceleration sensor and/or light sensor and ascertains biometric data therefrom pertaining to sleep quality and sleep profile, such as time of sleep onset, sleep duration and/or sleep stages.
- In addition, it is also possible for questionnaire data answered by the user, which are requested using a web form in a browser application or in a form that the further application provides, to be part of the biometric data ascertained by the further application. Since answering questionnaire data generates additional effort for the user, these data showing the subjective stress level should be requested only once or as rarely as possible.
- In one possible embodiment, the biometric data are ascertained constantly by the further application while the mobile terminal is in operation. The mobile terminal is configured such that the further application is started automatically, for example, when the mobile terminal is switched on and is operated continually in the background until the mobile terminal is switched off, without the need for further interaction with the user. Similarly, the mobile terminal can be configured such that the user can activate, configure and deactivate the further application autonomously and in this way controls the times at which biometric user data are meant to be tracked by the further application and provided for evaluation.
- Once the further application has been activated, the signal data produced by the sensors integrated in the mobile terminal, which signal data are tracked by the further application, are constantly received by the further application and the biometric data are ascertained therefrom. Similarly, use data from standard applications used by the user, which are tracked by the further application, such as telephony and/or SMS applications, are constantly evaluated by the further application and biometric data are determined therefrom.
- The biometric data determined by the further application can be divided into different categories. The biometric data ascertained from the further application belong to the categories sleep, speech, motor functions, social interaction, economic data, personal data and questionnaire data.
- The biometric data ascertained by the further application can be evaluated by a first evaluation apparatus, which is provided in the further application, on the mobile terminal. The biometric data ascertained by the further application can alternatively be evaluated by a second evaluation apparatus on a central server. The second evaluation apparatus can increase the quality of the evaluation further.
- The biometric data ascertained by the further application are transmitted from the mobile terminal to a central server using standard means that the mobile terminal provides. The transmission of the biometric data is effected in pseudonymized form. To this end, the user is managed on the central server under a pseudonym, rather than under his real name or another identifier identifying the user. The transmitted biometric data are associated with the user by means of the pseudonym. In order to increase security further, the biometric data are transmitted in encrypted form.
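The pseudonymization could, for example, be realized by deriving a stable keyed hash from the user's identifier, so that the server manages the data under the derived value only. The HMAC-SHA256 construction below is an assumption of this sketch, not prescribed by the text; the encryption of the transmitted data would be layered on top, for example via TLS:

```python
import hashlib
import hmac

def pseudonym(user_id, app_secret):
    """Derive a stable pseudonym from the user's identifier so that the
    central server manages the biometric data under this value rather than
    the real name. The keyed HMAC-SHA256 construction is an assumption of
    this sketch, not prescribed by the text."""
    return hmac.new(app_secret, user_id.encode("utf-8"),
                    hashlib.sha256).hexdigest()
```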
- For transmission, the data to be transmitted are transferred from the further application to a transmission unit arranged on the mobile terminal. The transmission unit is responsible for setting up, maintaining and clearing down a connection to a network, for example a GPRS or UMTS connection to a mobile radio network to which the data are transmitted. Transmission is effected only when the mobile terminal is in transmission mode. The transmission unit can also be designed for transmission via different networks, for example one or more mobile radio network(s) and/or a wireless connection to a local area network, for example based on an IEEE-802 standard, such as a WLAN or a WPAN connection. The data are transmitted to the central server via a transmission network, for example a mobile radio network, and possibly via further networks, such as the Internet, that are connected to the transmission network through gateways via which the central server can be reached.
- In addition, there is the possibility of the data ascertained by the further application first of all being stored on a local memory unit arranged on the mobile terminal before the data are transmitted to the central server. This is necessary particularly when the transmission unit is temporarily incapable of transmitting the data via a transmission network. This is the case, for example, when there is no connection to a transmission network, for instance when the mobile terminal is in flight mode and the transmission unit is therefore deactivated, or when the transmission unit cannot set up a connection because the mobile terminal is outside the transmission range of a transmission network.
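- The local buffering described above can be sketched as a simple store-and-forward queue. The class and method names are illustrative; in the disclosed apparatus, the local memory unit and the transmission unit take these roles.

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.List;

// Illustrative store-and-forward sketch: records are kept in the local store
// while no transmission network is reachable (e.g. flight mode, no coverage)
// and are flushed to the server once a connection is available again.
public class StoreAndForwardBuffer {
    private final Deque<String> localStore = new ArrayDeque<>();

    // Every record is persisted locally first.
    public void record(String biometricRecord) {
        localStore.add(biometricRecord);
    }

    // Returns the records actually transmitted; nothing leaves the buffer
    // while the terminal is offline.
    public List<String> flush(boolean networkAvailable) {
        List<String> transmitted = new ArrayList<>();
        while (networkAvailable && !localStore.isEmpty()) {
            transmitted.add(localStore.poll()); // oldest record first
        }
        return transmitted;
    }

    public int pending() {
        return localStore.size();
    }
}
```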
- The biometric data can also be analyzed and evaluated by the further application on the mobile terminal directly. In this embodiment, the further application autonomously ascertains a current stress level for the user from the biometric data.
- The central server has a reception unit that can be used to receive data from the network, for example from the Internet. When the biometric data from the user are transmitted from the mobile terminal to the central server, the reception unit receives said data and stores them in a central memory unit.
- The stored data are transmitted to an evaluation unit arranged on the central server and are analyzed and evaluated by the evaluation unit. The transmitted and stored data comprise a plurality of biometric data pertaining to a user that are used by the evaluation unit in order to ascertain a current stress level for the user.
- Since the current stress level is an individual variable and does not involve a comparison with objective standard values, for example, the method disclosed in this application is not a diagnostic method. Instead, it is a method that ascertains, collects and analyzes biometric data pertaining to a user and provides the user with the results of the analysis in the form of at least one current stress level.
- The data analysis is used particularly for detecting an alteration in the at least one stress level, i.e. establishing whether the at least one current stress level has increased or decreased in comparison with a previous stress level. The user is therefore provided with a tool for obtaining information particularly about changes in his at least one current stress level in comparison with earlier stress levels over time and, following autonomous rating of the detected changes, if need be taking individual measures for stress reduction.
- In order to ascertain a current stress level for a user, the evaluation unit can resort to biometric data pertaining to the user that have been ascertained in the past, for example, and can take said biometric data into account when ascertaining said current stress level. Thus, biometric data pertaining to the user that have been ascertained in the past can be used as reference data in order to perform user-specific calibration. The current stress level is ascertained in relation to the available reference data.
- Similarly, the evaluation unit can resort to not only the biometric data from the user but also to biometric data from other users when ascertaining a current stress level. By way of example, this allows clusters of user groups to be formed, for example according to age, sex or profession. To ascertain the current stress level for a user, the data from other users in the same user group can be taken into account.
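- By way of example, forming user groups and deriving group reference values can be sketched as follows; the UserSample type, its fields and the grouping by profession are assumptions made for this illustration.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Illustrative sketch of forming user groups (here: by profession) and using
// the mean stress level of each group as reference data. A user's current
// level could then be interpreted relative to the mean of his own group.
public class GroupReference {

    public static class UserSample {
        final String profession;
        final double stressLevel;

        public UserSample(String profession, double stressLevel) {
            this.profession = profession;
            this.stressLevel = stressLevel;
        }
    }

    // Mean stress level per user group.
    public static Map<String, Double> groupMeans(List<UserSample> samples) {
        Map<String, double[]> acc = new HashMap<>(); // [sum, count] per group
        for (UserSample s : samples) {
            double[] a = acc.computeIfAbsent(s.profession, p -> new double[2]);
            a[0] += s.stressLevel;
            a[1] += 1;
        }
        Map<String, Double> means = new HashMap<>();
        acc.forEach((group, a) -> means.put(group, a[0] / a[1]));
        return means;
    }
}
```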
- In one preferred embodiment, the evaluation unit ascertains a current stress level for a user using artificial neural networks. The artificial neural networks are trained on the basis of the available biometric data from a multiplicity of users. As a result of this training, the artificial neural network learns progressively and can thereby further improve the quality of the ascertained current stress level.
- By way of example, the artificial neural network can be realized on the basis of a multilayer perceptron network. This neural network consists of a plurality of layers: a fixed input layer, a fixed output layer and, if need be, further intermediate layers, with no feedback taking place from one layer to the layers situated before it.
- The artificial neural network can consist of precisely three layers, for example: input layer, hidden layer and output layer. In this case, the seven categories of biometric data, for example sleep, speech, motor functions, social interaction, economic data, personal data and questionnaire data, can form the neurons of the input layer. In another embodiment, a relatively large number of neurons is used in the input layer by virtue of more finely granular category features being used as neurons of the input layer. In another embodiment, the artificial neural network has feedback mechanisms. When the artificial neural network has feedback mechanisms, it is traversed multiple times (iteratively).
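- A minimal sketch of such a three-layer perceptron, with seven input neurons for the seven categories and a single output neuron scaled to a stress level in the value range [0,10], might look as follows. The weights are placeholders; in the disclosed system they would be learned from the biometric data of a multiplicity of users.

```java
// Minimal sketch of a three-layer perceptron: seven input neurons for the
// categories sleep, speech, motor functions, social interaction, economic
// data, personal data and questionnaire data, one hidden layer, and a single
// output neuron scaled to the stress range [0,10]. All weights here are
// placeholder values, not trained ones.
public class StressMlp {

    static double sigmoid(double x) {
        return 1.0 / (1.0 + Math.exp(-x));
    }

    // Forward pass: input layer -> hidden layer -> output neuron.
    public static double forward(double[] in, double[][] wHidden, double[] wOut) {
        double[] hidden = new double[wHidden.length];
        for (int h = 0; h < wHidden.length; h++) {
            double sum = 0.0;
            for (int i = 0; i < in.length; i++) {
                sum += wHidden[h][i] * in[i];
            }
            hidden[h] = sigmoid(sum);
        }
        double out = 0.0;
        for (int h = 0; h < hidden.length; h++) {
            out += wOut[h] * hidden[h];
        }
        return 10.0 * sigmoid(out); // normalize to the value range [0,10]
    }
}
```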
- In a further-developed embodiment, the evaluation unit ascertains a current stress level for a user using a network of artificial neural networks, for example using a Deep Belief Network. The network of artificial neural networks is made up of a plurality of neural networks that interact with one another. In a network of neural networks, a single neural network comprises an input layer and a hidden layer. A first level of neural networks is provided. In the neural networks on the first level, the input layer is stipulated by the biometric data from the user. By way of example, the input layer of a first-level neural network can be stipulated by the biometric data from the user in precisely one category. From the input layer of an artificial neural network, it is possible to determine the hidden layer of the same neural network. Furthermore, a second and possibly further level of neural networks is provided. For a neural network on the second and every further level, the input layer can be determined using the hidden layers of a plurality of neural networks on the preceding level. The current stress level can be determined from at least one neural network on a topmost level.
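- The layered arrangement can be sketched as follows: each first-level network maps the feature vector of one category to a hidden representation, and the hidden layers of several first-level networks are concatenated to form the input layer of a network on the next level. Again, the weights are placeholders rather than trained values.

```java
import java.util.List;

// Sketch of the layered arrangement of networks: a first-level network maps
// the feature vector of exactly one category to a hidden layer; the hidden
// layers of several first-level networks then form the input layer of a
// second-level network. Weights are illustrative, not trained.
public class StackedNetworks {

    static double sigmoid(double x) {
        return 1.0 / (1.0 + Math.exp(-x));
    }

    // One network level: hidden = sigmoid(W * input).
    public static double[] level(double[] input, double[][] weights) {
        double[] hidden = new double[weights.length];
        for (int h = 0; h < weights.length; h++) {
            double s = 0.0;
            for (int i = 0; i < input.length; i++) {
                s += weights[h][i] * input[i];
            }
            hidden[h] = sigmoid(s);
        }
        return hidden;
    }

    // Concatenate the hidden layers of several networks on one level into
    // the input layer of a network on the next level.
    public static double[] concat(List<double[]> hiddenLayers) {
        int n = 0;
        for (double[] h : hiddenLayers) n += h.length;
        double[] out = new double[n];
        int k = 0;
        for (double[] h : hiddenLayers) {
            for (double v : h) out[k++] = v;
        }
        return out;
    }
}
```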
- The multiplicity of biometric data parameters used for ascertaining the at least one current stress level, which particularly includes a combination of different categories of biometric data, means that the apparatus disclosed in this application, and likewise the disclosed method and the disclosed application, allow very much more reliable ascertainment of the at least one current stress level than the known apparatuses and methods mentioned at the outset, which ascertain a current stress level only on the basis of a single biometric data parameter, or very few such parameters, in the same category.
- Furthermore, the quality of the analysis of the biometric data is increased further by the neural network method used, since, as time progresses and the database available for training the neural network becomes larger, said method can make ever more precise statements and hence further improves the reliability of the method for determining a current stress level.
- As the cited instances of application show, the further application can determine a plurality of biometric data pertaining to the user that belong to different categories solely from the user-specific use data from available applications on the mobile terminal, on the one hand, and, on the other hand, from signal data produced by sensors integrated in the mobile terminal. Hence, the apparatus according to the invention, the method according to the invention and the application according to the invention provide a user with an inexpensive and non-time-consuming solution for determining a current stress level. Furthermore, the solution according to the invention dispenses with additional apparatuses, particularly sensors, that the user would need to fix or wear directly on his body. The solution does not restrict the user in any way in terms of comfort, wellbeing or look, as is entailed by the application or wearing of specific apparatuses with sensors.
- The at least one current stress level determined by the evaluation unit can either be made accessible to the user via the Internet or can be made accessible to the user on the mobile terminal, for example by sending an SMS to the user. The analysis data consisting of the at least one current stress level and possibly further evaluation data, for example statistics pertaining to the change in a stress level over time, can alternatively be transmitted to the mobile terminal using the same transmission paths as when transmitting biometric data from the mobile terminal to the central server, but in the opposite direction. To this end, a transmission unit for transmitting data from the server to the mobile terminal is provided on the central server. Similarly, a reception unit for receiving data from the central server is provided on the mobile terminal. The analysis data can also be transmitted from the central server to the mobile terminal using a push service. The data transmission is in turn effected in encrypted form.
- Further details and advantages of the invention will become clear from the description below of exemplary embodiments with reference to the figures, in which:
- FIG. 1 shows a schematic illustration of the apparatus for ascertaining a current stress level;
- FIG. 2 shows a schematic illustration of the further application for ascertaining a current stress level;
- FIG. 3 a shows a flowchart for a first instance of application, sleep;
- FIG. 3 b shows a flowchart for a second instance of application, motor functions;
- FIG. 3 c shows a flowchart for a third instance of application, speech;
- FIG. 4 a shows a graphical user interface for starting the sleep instance of application;
- FIG. 4 b shows a further graphical user interface for a first evaluation display;
- FIG. 4 c shows a further graphical user interface for a second evaluation display;
- FIG. 4 d shows a further graphical user interface for a third evaluation display;
- FIG. 5 a shows a schematic illustration of an exemplary embodiment of the evaluation unit;
- FIG. 5 b shows a schematic illustration of an alternative exemplary embodiment of the evaluation unit;
- FIG. 5 c shows a schematic illustration of a further alternative exemplary embodiment of the evaluation unit;
- FIG. 6 shows a schematic illustration of an exemplary embodiment of the evaluation unit with a plurality of artificial neural networks.
FIG. 1 shows a schematic illustration of an embodiment of the apparatus for ascertaining a current stress level 36, 36A, 36B, 36C, 36D. The apparatus comprises a mobile terminal 1 and a central server 10. - The
mobile terminal 1 contains a plurality of sensors 2, for example a gyroscope 21, an acceleration sensor 22, a light sensor 23 and/or a microphone 24. The signal data 31 produced by the sensors 2 can be accessed via an operating system 4. The operating system 4 is executed within an execution unit 3 and manages the access to the hardware components of the mobile terminal 1, for example the sensors 2. In addition, different applications, for example a plurality of available applications 5 and a further application 6, are executed in the execution unit 3. - The
further application 6 ascertains a plurality of biometric data 33 pertaining to a user of the mobile terminal 1. By way of example, the further application 6 is implemented in the programming language Java. The further application 6 uses the MVC (model view controller) design pattern as a basic design pattern. The use of the MVC design pattern structures the further application 6 in a way that facilitates its comprehensibility and also its extendability and adjustability to new and/or altered hardware components and operating systems 4. - The
further application 6 obtains the biometric data 33 from signal data 31 that are produced by the sensors 2 and that can be accessed by means of the operating system 4. The access to the signal data 31 is realized by the further application 6, for example through the use of the observer design pattern. The observer design pattern provides the further application 6 with simplified and standardized access to the signal data 31. - The
further application 6 can extract a plurality of further biometric data 33 from the use data 32 from available applications 5 too. The use data 32 produced by the available applications 5 are accessible via the operating system 4. The access to the use data 32 is realized by the further application 6, for example through the use of the observer design pattern. The observer design pattern provides the further application 6 with simplified and standardized access to the use data 32. An observer is informed about status changes on the object that it is observing, for example an available application 5. If the available application 5 is an SMS application, for example, and the user calls the SMS application in order to write a new SMS, then the observer observing the SMS application is informed about this status change. The further application 6 reacts to the writing of a new SMS that is observed by the observer by recording the characters input by the user, for example using a keypad, providing them with a timestamp and storing them in the local memory unit 7 as use data 32 for the SMS application. - By way of example, it is also possible for all keypad inputs by the user to be recorded regardless of their use in a specific application. To this end, an observer or a plurality of observers is implemented for the
sensor keypad 25, for example one observer for each key on the keypad. As soon as a key on the keypad is pressed by the user, the observer observing the key is informed of said pressing of a key. The further application 6 reacts to the pressing of the key that is observed by this observer by virtue of the further application 6 checking whether the user has pressed a delete key or another key. The ‘delete key’ or ‘other key’ information is recorded by the further application 6, provided with a timestamp, and these data are stored in the local memory unit 7 as signal data 31. - From the stored
signal data 31 and/or use data 32, the further application 6 extracts a plurality of biometric data 33. The biometric data 33 are subdivided into categories, for example into the categories sleep, speech, motor functions, social interaction, economic data, personal data and questionnaire data. For each category, a category-specific ascertainment time interval is defined, for example 30 seconds for the sleep category and 20 milliseconds for the speech category. The signal data 31 and/or use data 32 that are relevant to a category are processed in a first pre-processing step using category-specific time intervals to produce conditioned signal data 31A and/or conditioned use data 32A. In order to determine the data that are relevant to a capture time interval, the timestamps stored for the signal data 31 and/or use data 32 are evaluated. The conditioned signal data 31A and/or conditioned use data 32A are in turn provided with a timestamp. In a second processing step, the biometric data 33 are extracted from a sequence of conditioned signal data 31A and/or conditioned use data 32A. By way of example, for an instance of application, for example the writing of an SMS, biometric data 33 in the motor functions category are ascertained from the conditioned use data 32A pertaining to the SMS written. The biometric data 33 pertaining to a category that are ascertained in an instance of application are also referred to as a feature vector for this category. For each feature vector, a timestamp is determined that stipulates the time interval for which the feature vector is valid. The biometric data 33 comprise the feature vectors ascertained for the various categories, with the respective timestamps of said feature vectors. The biometric data 33 ascertained by the further application 6 are stored in a local memory unit 7 of the mobile terminal 1. - Furthermore, the
mobile terminal 1 has a transmission unit 8A and a reception unit 8B. The transmission unit 8A transmits data 34 from the mobile terminal 1 to an external node, for example the central server 10. The transmission is effected via the air interface, for example. The reception unit 8B receives data from an external node, for example the central server 10. The transmission unit 8A is used to transmit data 34, for example the biometric data 33 from the user, to the central server 10 for the purpose of evaluation. The reception unit 8B is used to receive data 34 coming from the central server 10, for example evaluations 35 created by the central server. Each evaluation 35 is provided with a timestamp that stipulates the time interval for which the evaluation is valid. An evaluation 35, for example a current stress level 36, 36A, 36B, 36C, 36D of a user of the mobile terminal 1, is transferred to the further application 6 for display and is displayed to the user on the display 9 of the mobile terminal 1 by means of the operating system 4. - The
central server 10 has a transmission unit 18A and a reception unit 18B. The reception unit 18B is used to receive data 34 from another node, for example the mobile terminal 1. By way of example, the received data 34 are biometric data 33 from the user of the mobile terminal 1. The received data 34 are stored in a central memory unit 17. Furthermore, an evaluation unit 13 is provided on the central server 10. The evaluation unit 13 evaluates the received biometric data 33. By way of example, the evaluation unit 13 determines the at least one current stress level 36, 36A, 36B, 36C, 36D at an instant t by evaluating those feature vectors for the received biometric data 33 whose timestamps are valid at the instant t. - The
current stress level 36A determines a first current stress level of the user for a first category of biometric data 33, for example the sleep category. The current stress level 36C determines a second current stress level of the user for a second category of biometric data 33, for example the motor functions category. The current stress level 36B determines a third current stress level of the user for a third category of biometric data 33, for example the speech category. The current stress level 36D determines a fourth current stress level of the user for a fourth category of biometric data 33, for example the social interaction category, or for a combination of categories of biometric data, for example the social interaction, economic data, personal data and/or questionnaire data categories. In addition, further current stress levels can be determined for further categories and/or combinations of categories. The current stress level 36 determines a consolidated current stress level of the user that is obtained from a combination of the category-specific stress levels 36A, 36B, 36C, 36D and, if need be, of available further category-specific stress levels, for example by forming the arithmetic mean of the category-specific stress levels. - The at least one
evaluation 35 determined by the evaluation unit 13, for example the at least one current stress level 36, 36A, 36B, 36C, 36D, comprises, for each evaluation 35, a timestamp that stipulates the time interval for which the evaluation 35 is valid. The at least one evaluation 35, for example the at least one current stress level 36, 36A, 36B, 36C, 36D, is stored in the central memory unit 17 and transmitted to the mobile terminal 1 via the transmission unit 18A. -
FIG. 2 shows a schematic illustration of an embodiment of the further application 6 for ascertaining at least one current stress level 36, 36A, 36B, 36C, 36D. The further application 6 comprises a plurality of components, for example a data manager 61, a data preprocessor 62 and a data analyzer 63. - The
signal data 31 and/or use data 32 made available via the operating system 4 are loaded into the data manager 61 and managed thereby. The data manager 61 transfers the signal data 31 and/or use data 32 to the data preprocessor 62. The data preprocessor 62 conditions the signal data 31 and/or use data 32 and transfers the conditioned signal data 31A and/or conditioned use data 32A back to the data manager 61. The data manager 61 stores the conditioned signal data 31A and/or conditioned use data 32A in the local memory unit 7. The data manager 61 transfers the conditioned signal data 31A and/or conditioned use data 32A to the data analyzer 63. The data analyzer 63 analyzes the conditioned signal data 31A and/or conditioned use data 32A and determines the biometric data 33 therefrom. - For those
biometric data 33 that are evaluated locally, the data analyzer 63 creates at least one evaluation 35, for example in the form of at least one current stress level 36, 36A, 36B, 36C, 36D. The data analyzer 63 transfers the biometric data 33 and if need be the at least one evaluation 35 to the data manager 61. Insofar as at least one evaluation 35 has been created by the data analyzer 63, the data manager 61 visualizes the at least one evaluation 35 for the user of the mobile terminal 1 by displaying it on the display 9. The data manager 61 transfers the biometric data 33 to the transmission unit 8A for transmission to the central server 10, insofar as the biometric data 33 are evaluated centrally. - That
evaluation 35 that is provided in the form of the consolidated current stress level 36 can be visualized on the display 9 continuously, for example, as a traffic light icon. The traffic light icon can display the colors green, amber or red on the basis of the consolidated current stress level 36. If the consolidated current stress level 36 is normalized to an integer value in the value range [0,10], for example, then the traffic light color is chosen on the basis of the current value of the consolidated current stress level 36. A high value corresponds to a high consolidated current stress level 36. A low value corresponds to a low consolidated current stress level 36. If the consolidated current stress level 36 is low, for example in the value range [0,3], the color green is displayed. If the consolidated current stress level 36 is increased, for example in the value range [4,6], the color amber is displayed. If the consolidated current stress level 36 of the user is high, for example in the value range [7,10], the color red is displayed. The display of the consolidated current stress level 36 is updated as soon as a consolidated current stress level 36 is available with a timestamp that is more recent than the timestamp of the previously displayed consolidated stress level. - In a further embodiment, the consolidated
current stress level 36 is visualized as a bar chart having 10 bars. Each bar in the bar chart has an associated integer value from the value range [0,10], to which the consolidated current stress level 36 is normalized. - If
biometric data 33 have been transferred to the central server 10 for the purpose of evaluation, the data manager 61 receives at least one evaluation 35, for example in the form of a third current stress level 36B for the speech category, pertaining to the biometric data 33 for the speech category that are evaluated on the server. - When the
data manager 61 receives a new evaluation 35, for example in the form of a third current stress level 36B for the speech category, it ascertains a new consolidated current stress level 36 from the category-specific current stress levels, known to the data manager 61, whose timestamps are currently still valid. By way of example, the consolidated current stress level 36 is obtained by means of the arithmetic mean or by means of a weighted mean of the category-specific current stress levels 36A, 36B, 36C, 36D that are still valid. The data manager 61 visualizes the consolidated current stress level 36 on the display 9, for example by updating the traffic light icon. - The consolidated
current stress level 36 of the user is an individual variable. By way of example, when the further application 6 is first used by a user, user-specific calibration can be performed. To this end, the user is asked to record biometric data 33 in the personal data category, for example via a form integrated in the further application 6. On the basis of the personal data, an individual current stress level of the user is determined, which stipulates a calibration factor, for example. The individual current stress level, for example in its manifestation as a calibration factor, is taken into account for determining the current stress level 36, 36A, 36B, 36C, 36D for the user. -
FIG. 3A shows a flowchart for a first instance of application, sleep. The first instance of application, sleep, ascertains biometric data 33 in the sleep category for the purpose of ascertaining a first current stress level 36A of a user. The first instance of application describes a first method for ascertaining said first current stress level 36A. Prior to first use of the sleep instance of application, the user allows the mobile terminal 1 to fall onto his mattress from a height of approximately 30 centimeters. By evaluating the data from the sensors 2, the further application 6 computes the spring stiffness and the damping constant of the mattress, which are stored as calibration data pertaining to the sleep instance of application. The sleep instance of application ascertains motion data during the rest phase of the user and evaluates said data. - (A1): The sleep instance of application requires direct user interaction.
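- Assuming a simple mass-spring-damper model of the mattress, the calibration step described above can be sketched as follows: the decaying oscillation recorded after the drop yields an oscillation period and successive peak amplitudes, from which spring stiffness and damping can be estimated via the logarithmic decrement. The formulas are standard mechanics; the class and method names are illustrative, not taken from the application.

```java
// Hedged sketch of the mattress calibration: after the terminal is dropped
// onto the mattress, the acceleration signal shows a decaying oscillation.
// From the period T and two successive peak amplitudes the spring stiffness
// and the damping ratio can be estimated, assuming a simple
// mass-spring-damper model. massKg is the terminal's mass in kilograms.
public class MattressCalibration {

    // Spring constant k = m * (2*pi/T)^2 (undamped approximation).
    public static double springConstant(double massKg, double periodSeconds) {
        double omega = 2.0 * Math.PI / periodSeconds;
        return massKg * omega * omega;
    }

    // Damping ratio zeta from the logarithmic decrement d = ln(peak1/peak2).
    public static double dampingRatio(double peak1, double peak2) {
        double d = Math.log(peak1 / peak2);
        return d / Math.sqrt(4.0 * Math.PI * Math.PI + d * d);
    }
}
```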
- (A2): In this regard, the user of the
mobile terminal 1 calls the sleep mode of the further application 6. In one possible embodiment, calling the sleep mode automatically prompts the mobile terminal 1 to be put into flight mode in order to minimize emissions of electromagnetic radiation by the mobile terminal 1. The user positions the mobile terminal 1 on the mattress during his rest phase. - (A3): The
signal data 31 produced by the sensors 2, for example the gyroscope 21, the acceleration sensor 22 and the light sensor 23, during the rest phase are collected by the further application 6 and stored in the local memory unit 7. - (A4): Following termination of the rest phase, the user deactivates the sleep mode in the
further application 6. If need be, this also deactivates the flight mode and hence activates the transmission unit 8A and the reception unit 8B. - (A5): The
data manager 61 of the further application 6 loads the sensor data ascertained during the sleep mode in the further application 6 and transfers these signal data 31 to the data preprocessor 62. - (A6): The
data preprocessor 62 divides the ascertained signal data 31 into time intervals, for example into time intervals having a length of 30 seconds. For the signal data 31 in each time interval, conditioned signal data 31A that are characteristic of the time interval are determined and are provided with a timestamp. - (A7): The
data preprocessor 62 transfers the conditioned signal data 31A with their timestamps to the data manager 61. - (A8): The
data manager 61 stores the conditioned signal data 31A with their timestamps in the local memory unit 7. - (A9): The
data manager 61 transfers the conditioned signal data 31A with their timestamps to the data analyzer 63 for the purpose of evaluation. - (A10): The data analyzer 63 analyzes the
conditioned signal data 31A and determines therefrom a feature vector with biometric data 33 in the sleep category. By way of example, the feature vector is determined by means of a statistical regression model for modeling a binary target variable, for example a logit or probit model. To this end, the sequence of conditioned signal data 31A that is obtained by arranging the conditioned signal data 31A according to ascending timestamps is evaluated, and each element in the sequence is classified as "awake" or "asleep" for the sleep state. The classification takes account of the sleep states of the preceding elements in the sequence, that is to say the sleep states in the preceding time intervals. If the probability of the user being in a sleep state in a time interval is greater than 50%, the time interval is classified with the state "asleep", otherwise with the state "awake". The sequence of sleep states over all time intervals is taken as a basis for determining the feature vector. - By way of example, the feature vector of the
biometric data 33 pertaining to the sleep category comprises the following features: -
- a. Sleep onset latency
- b. Sleep efficiency
- c. Sleep onset instant
- d. Sleep end
- e. Time in bed
- f. Sleep duration
- g. Wakeful time
- h. Length of time to the first REM phase
- i. Stage components of the individual sleep phases
- j. Number of awakenings
- k. Number of sleep stage changes
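- By way of example, features such as those listed above, once each has been rated with an integer value in the value range [0,10], can be combined into a first current stress level by a weighted mean and a user-specific calibration factor. The rating functions themselves are application-specific and omitted from this illustrative sketch.

```java
// Sketch of turning a rated sleep feature vector into the first current
// stress level: each feature has already been rated with an integer in
// [0,10]; the ratings are combined by a weighted mean and optionally
// adjusted by a user-specific calibration factor. Names are illustrative.
public class SleepStressLevel {

    public static double weightedMean(int[] ratings, double[] weights) {
        double sum = 0.0, weightSum = 0.0;
        for (int i = 0; i < ratings.length; i++) {
            sum += ratings[i] * weights[i];
            weightSum += weights[i];
        }
        return sum / weightSum;
    }

    // Clamp the calibrated result back to the integer value range [0,10].
    public static int stressLevel(int[] ratings, double[] weights, double calibrationFactor) {
        double level = weightedMean(ratings, weights) * calibrationFactor;
        return (int) Math.max(0, Math.min(10, Math.round(level)));
    }
}
```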
- From the feature vector of the
biometric data 33 pertaining to the sleep category, the data analyzer 63 determines an evaluation 35 that comprises particularly the first current stress level 36A for the sleep category. In order to ascertain the first current stress level 36A, all features of the feature vector are rated with an integer value from the value range [0,10], for example, and the individual values are used to form a mean value, for example an arithmetic mean or a weighted mean. In addition, the rating is influenced to some extent by the user-specific calibration, for example as a result of a calibration factor that needs to be taken into account. By way of example, the first current stress level 36A for the sleep category is obtained as an integer value in the value range [0,10]. The first current stress level 36A comprises a timestamp that stipulates the period for which the first current stress level 36A for the sleep category is valid. - (A11): The feature vector of the
biometric data 33 pertaining to the sleep category and also the evaluation 35, which particularly comprises the first current stress level 36A for the sleep category, are transferred to the data manager 61. - (A12): The
data manager 61 stores the feature vector of the biometric data 33 pertaining to the sleep category and also the evaluation 35, particularly the first current stress level 36A for the sleep category, in the local memory unit 7. The data manager 61 visualizes the evaluation 35, particularly the first current stress level 36A for the sleep category, on the display 9. From the first current stress level 36A for the sleep category and, if need be, further available valid current stress levels for further categories, for example the current stress levels 36B, 36C, 36D, the data manager 61 determines a consolidated current stress level 36 and visualizes the consolidated current stress level 36, for example by updating the traffic light icon. -
FIG. 4A shows an exemplary graphical user interface for the start of the sleep instance of application of the further application 6. The exemplary graphical user interface contains a tip for successful measurement of the biometric data 33 pertaining to the sleep category. By selecting the OK button, the user can start the instance of application. -
FIG. 4B shows a further exemplary graphical user interface of a first evaluation display for the sleep instance of application. The first evaluation display visualizes anevaluation 35 for the sleep category in the form of an overview evaluation. The sleep quality parameter is used to display a firstcurrent stress level 36A of the user for the sleep category. The sleep quality is indicated by the numerical value 2.0 within a scale from 0 to 10. In addition, the first evaluation display comprises further elements, for example the last sleep pattern as a function of time. -
FIG. 4C shows a further exemplary graphical user interface of a second evaluation display for the sleep instance of application. The second evaluation display visualizes anevaluation 35 for the sleep category in the form of a detail display. The detail display comprises the ascertainedbiometric data 33 pertaining to the sleep category. For each feature of thebiometric data 33 in the sleep category, the ascertained value is indicated. -
FIG. 4D shows a further graphical user interface of a third evaluation display for the sleep instance of application. The third evaluation display visualizes the consolidatedcurrent stress level 36 in a bar chart. In addition, theconsolidated stress level 36 and the 36A, 36B, 36C, 36D for the individual categories are displayed as numerical values. Each numerical value is displayed in a color that is specific to the value. The choice of color visualizes thecurrent stress levels 36, 36A, 36B, 36C, 36D in color.current stress levels -
FIG. 3B shows a flowchart for a second instance of application, motor functions. The second instance of application, motor functions, ascertains biometric data 33 in a motor functions category for the purpose of ascertaining a second current stress level 36C of a user. The second instance of application describes a second method for ascertaining the second current stress level 36C. This instance of application requires only indirect interaction with the user.
- (B1): The further application 6 has been loaded into the execution unit 3 of the mobile terminal 1 and has been started. The further application 6 runs in the execution unit 3 as a background process.
- (B2): The user calls an available application 5 that is associated with a text input via the keypad, for example an SMS application for writing a new SMS.
- (B3): The user uses the keypad 25 of the mobile terminal 1 to type an SMS, for example.
- (B4): The sequence of keypad inputs made by the user is collected by the data manager 61 of the further application 6 and stored on the local memory unit 7. For each keypad input, a timestamp is stored.
- (B5): The user terminates typing, for example by finishing and sending the SMS.
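Steps (B4) and (B7) can be sketched as follows; the concrete data structures are illustrative assumptions, since the patent does not prescribe any particular representation:

```python
import time
from collections import defaultdict

INTERVAL = 15.0  # length of a time interval in seconds, as in step (B7)

def record_keystroke(log, key):
    """Step (B4): store each keypad input together with a timestamp."""
    log.append((time.time(), key))

def bucket_by_interval(log):
    """Step (B7): divide the timestamped keypad data into 15-second
    intervals; each bucket later yields one conditioned-use-data record 32A."""
    buckets = defaultdict(list)
    for ts, key in log:
        buckets[int(ts // INTERVAL)].append((ts, key))
    return buckets
```

Each bucket is then reduced to the characteristic conditioned use data 32A for that interval and provided with a timestamp.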
- (B6): The
data manager 61 transfers the collected and stored keypad data to the data preprocessor 62.
- (B7): The data preprocessor 62 performs a pre-evaluation of the keypad data. To this end, the data preprocessor 62 divides the ascertained keypad data into time intervals, for example into time intervals with a length of 15 seconds. For the keypad data 32 in each time interval, conditioned use data 32A that are characteristic of the time interval are determined and provided with a timestamp.
- (B8): The conditioned use data 32A provided with timestamps are transferred from the data preprocessor 62 to the data manager 61.
- (B9): The data manager 61 stores the conditioned use data 32A provided with timestamps in the local memory unit 7.
- (B10): The data manager 61 transfers the conditioned use data 32A provided with timestamps to the data analyzer 63.
- (B11): The data analyzer 63 analyzes the conditioned
use data 32A provided with timestamps and determines therefrom a feature vector with biometric data 33 in the motor functions category.
- By way of example, the data analyzer 63 determines the error rate from the frequency of keypad input errors, particularly from the number of times the user operates a delete key in the time interval under consideration. The error rate determined is a measure of the hand/eye coordination of the user.
- By way of example, the feature vector of the biometric data pertaining to the motor functions category comprises the following features:
-
- a. Speed (keystrokes per unit time)
- b. Error rate
- c. Variance in the error rate
- d. Variance in the speed
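A minimal sketch of how the four features a-d might be derived from the conditioned use data; the tuple layout (keystroke count, delete-key presses, interval length in seconds) is an assumption made for illustration:

```python
from statistics import pvariance

def motor_features(intervals):
    """Derive the motor functions feature vector from conditioned use data:
    one (keystroke_count, delete_count, seconds) tuple per time interval."""
    speeds = [k / s for k, _, s in intervals]                # a. keystrokes per unit time
    errors = [d / k if k else 0.0 for k, d, _ in intervals]  # b. delete-key share per interval
    return {
        "speed": sum(speeds) / len(speeds),                  # a. mean speed
        "error_rate": sum(errors) / len(errors),             # b. mean error rate
        "error_rate_variance": pvariance(errors),            # c. variance in the error rate
        "speed_variance": pvariance(speeds),                 # d. variance in the speed
    }
```

The resulting dictionary plays the role of the feature vector that the data analyzer 63 rates and averages into the second current stress level 36C.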
- From the feature vector of the
biometric data 33 pertaining to the motor functions category, the data analyzer 63 determines an evaluation 35, particularly the second current stress level 36C for the motor functions category. To this end, all features of the feature vector are rated with an integer value from the value range [0,10], for example, and a mean value, for example an arithmetic mean or a weighted mean, is formed from the individual values. In addition, the rating is influenced to some extent by the user-specific calibration, for example as a result of a calibration factor that needs to be taken into account. The second current stress level 36C for the motor functions category is obtained as an integer value in the value range [0,10], for example. The second current stress level 36C comprises a timestamp that stipulates the period for which the second current stress level 36C for the motor functions category is valid.
- (B12): The data analyzer 63 transfers the feature vector of the biometric data 33 pertaining to the motor functions category and also the evaluation 35, particularly the second current stress level 36C for the motor functions category, with its timestamp, to the data manager 61.
- (B13): The data manager 61 stores the feature vector of the biometric data 33 pertaining to the motor functions category and also the evaluation 35, particularly the second current stress level 36C for the motor functions category, with its timestamp, in the local memory unit 7. The data manager 61 visualizes the evaluation 35, particularly the second current stress level 36C for the motor functions category, on the display 9. From the second current stress level 36C for the motor functions category and, if need be, further available valid current stress levels for further categories, the data manager 61 determines the consolidated current stress level 36 and visualizes it, for example by updating the traffic light icon.
- In an alternative embodiment, the
biometric data 33, for example the biometric data 33 pertaining to the sleep and/or motor functions categories, are transmitted to the central server 10, stored in the central memory unit 17 and evaluated by the evaluation unit 13 arranged on the server. -
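How the data manager 61 might consolidate the category levels while honouring each level's validity period can be sketched as follows; the dictionary layout and the traffic-light thresholds are illustrative assumptions, not values from the patent:

```python
from statistics import mean

def consolidate(levels, now):
    """Form the consolidated current stress level 36 from the category
    levels 36A-36D, using only levels whose timestamp says they are
    still valid at the instant `now`."""
    valid = [lv["value"] for lv in levels
             if lv["valid_from"] <= now <= lv["valid_to"]]
    return round(mean(valid)) if valid else None

def traffic_light(level):
    """Map the consolidated level in [0, 10] onto the traffic light icon
    (thresholds assumed for illustration)."""
    return "green" if level <= 3 else "yellow" if level <= 6 else "red"
```

Expired category levels simply drop out of the mean, matching the rule that a stress level is taken into account only within its validity period.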
FIG. 3C shows a flowchart for a third instance of application, speech. The third instance of application, speech, ascertains biometric data 33 in the speech category, for example the speech parameters speech rate and/or modulation capability, in order to ascertain a third current stress level 36B of a user. The third instance of application describes a third method for ascertaining the third current stress level 36B. This instance of application requires only indirect interaction with the user.
- The speech instance of application comprises voice analysis of voice data from the user, for example voice data from telephone calls conducted by the user using the
mobile terminal 1. - (C1): The
further application 6 has been loaded into the execution unit 3 of the mobile terminal 1 and has been started. The further application 6 runs as a background process in the execution unit 3.
- (C2): The speech instance of application is started by an incoming call to the
mobile terminal 1, for example. - (C3): The user takes the call.
- (C4): During the call, the
data manager 61 of the further application 6 continuously collects the voice data 31 pertaining to the user that are captured via the microphone 24, provides them with a timestamp and stores the voice data 31 with the timestamp in the local memory unit 7.
- (C5): The user terminates the call.
- (C6): The
data manager 61 transfers the voice data 31 stored with a timestamp to the data preprocessor 62.
- (C7): The data preprocessor 62 performs a pre-evaluation of the voice data 31. To this end, the data preprocessor 62 divides the captured voice data 31 into time intervals, for example into time intervals with a length of 20 milliseconds. For the voice data 31 in each time interval, conditioned voice data 31A that are characteristic of the time interval are determined and provided with a timestamp.
- (C8): The data preprocessor 62 transfers the conditioned voice data 31A with their timestamps to the data manager 61.
- (C9): The data manager 61 stores the conditioned voice data 31A with their timestamps in the local memory unit 7.
- (C10): The data manager 61 transfers the conditioned voice data 31A with their timestamps to the data analyzer 63 for the purpose of evaluation.
- (C11): The data analyzer 63 analyzes the conditioned voice data 31A and determines from them a feature vector with biometric data 33 in the speech category.
- By way of example, the feature vector of the biometric data 33 for the speech category comprises the following features:
- a. Accent shape
- b. Average pitch
- c. Contour slope
- d. Final Lowering
- e. Pitch range
- f. Speech rate
- g. Stress frequency
- h. Breathiness
- i. Brilliance
- j. Loudness
- k. Pause Discontinuity
- l. Pitch Discontinuity
- m. Time in different emotional states (
state 1, . . . , state n)
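The 20-millisecond pre-evaluation of step (C7), together with one simple per-frame measure, can be sketched as follows. Using short-time energy as a loudness proxy is an illustrative assumption, since the patent does not fix the concrete signal processing behind the listed features:

```python
def frame_signal(samples, rate, frame_ms=20):
    """Step (C7): split the captured voice samples into 20 ms frames;
    each frame becomes one conditioned-voice-data record 31A."""
    size = int(rate * frame_ms / 1000)
    return [samples[i:i + size] for i in range(0, len(samples) - size + 1, size)]

def frame_energy(frame):
    """Short-time energy of one frame, usable as a rough loudness
    measure (feature j)."""
    return sum(x * x for x in frame) / len(frame)
```

Features such as average pitch, speech rate or pause discontinuity would be computed over these same frames with correspondingly more elaborate estimators.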
- The feature vector is provided with a timestamp and these data are transferred from the
data analyzer 63 to the data manager 61 as biometric data 33 in the speech category.
- (C12): The data manager 61 stores the feature vector provided with a timestamp in the local memory unit 7 as biometric data 33 in the speech category. The data manager 61 transfers the biometric data 33 pertaining to the speech category to the transmission unit 8A for the purpose of transmission to the central server 10.
- (C13): The reception unit 18B of the central server 10 receives the transmitted data in the form of the biometric data 33 pertaining to the speech category. The central server 10 stores the biometric data 33 in the central memory unit 17 and evaluates the biometric data 33 in the evaluation unit 13. To this end, a neural network method, explained further on, is used, for example. The evaluation unit 13 determines an evaluation 35. The evaluation 35 particularly comprises the third current stress level 36B in the speech category. The third current stress level 36B for the speech category is determined as an integer value in the value range [0,10], for example. The third current stress level 36B comprises a timestamp that stipulates the period for which the third current stress level 36B for the speech category is valid.
- (C14): The central server 10 transmits the evaluation 35, particularly the third current stress level 36B for the speech category, with its timestamp, to the mobile terminal 1 by means of the transmission unit 18A. The transmitted evaluation 35 is received by the reception unit 8B of the mobile terminal 1 and transferred to the data manager 61 of the further application 6.
- (C15): The data manager 61 stores the evaluation 35, particularly the third current stress level 36B for the speech category, with its timestamp, in the local memory unit 7. The data manager visualizes the evaluation 35, particularly the third current stress level 36B for the speech category, on the display 9. From the third current stress level 36B for the speech category and, if need be, further available valid current stress levels for further categories, the data manager 61 determines the consolidated current stress level 36 and visualizes the consolidated current stress level 36, for example by updating the traffic light icon.
- Besides the cited sleep, speech and motor functions instances of application, there are further instances of application that ascertain further
biometric data 33 for further categories and determine further current stress levels of the user therefrom. - Thus, the social interaction instance of application evaluates
use data 32 from the user from such available applications 5 as are used for social interaction. Examples of available applications 5 that are used for social interaction are SMS applications, e-mail applications or social network applications, such as an instant messaging application or a Facebook application. From the use data 32 pertaining to the available applications 5 that are used for social interaction, it is possible to ascertain, by way of example, the number of contacts in social networks or the frequency with which contact is made, for example the frequency with which an SMS is sent.
- By way of example, the feature vector of the
biometric data 33 pertaining to the social interaction category comprises the following features: -
- a. Number of telephone contacts
- b. Number of contacts in social networks
- c. Frequency with which contact is made (SMS, telephoning, messages in the social network)
- d. Length of time for which contact is made
- e. Time at which contact is made
- f. Frequency at which contact is made
- g. Absolute and relative number of contacts with regular contact being made
- In addition,
biometric data 33 in further categories can be taken into account, for example biometric data 33 in the economic data category, in the personal data category and/or in the questionnaire data category.
- The economic data category relates to comprehensive rather than user-specific data, for example data pertaining to the general sickness absence rate or to job security.
- By way of example, the feature vector of the
biometric data 33 pertaining to the economic data category comprises the following features: -
- a. Sickness absence rate
- b. Job risk
- The personal data category comprises data pertaining to age, family status, occupation group and education level. The feature vector of the personal data category is used particularly for individual calibration of the current stress levels 36, 36A, 36B, 36C, 36D. The personal data are recorded by the user using a form within the further application 6, for example.
- By way of example, the feature vector of the
biometric data 33 pertaining to the personal data category comprises the following features: -
- a. Occupation group
- b. Educational level
- c. Geoposition
- d. Age
- e. Medication
- f. Pre-existing illnesses
- g. Family illnesses
- h. Family status
- The questionnaire data comprise individual self-assessments by the user pertaining to stress-related questions. The questionnaire data are recorded by the user using a form within the
further application 6, for example. - The
biometric data 33 pertaining to the cited further categories can additionally be used for evaluation and particularly for ascertaining the consolidated current stress level 36 of the user.
- Whereas, in the exemplary sleep and motor functions instances of application, the biometric data 33 are evaluated by the further application 6 directly as an evaluation unit on the mobile terminal 1, a different approach has been chosen for the exemplary speech instance of application. In order to increase the quality of the ascertained third current stress level 36B and also of the consolidated current stress level 36, the evaluation of the biometric data 33 pertaining to the speech category is effected in the evaluation unit 13 that is arranged on the central server 10. The evaluation unit 13 contains an evaluation method, for example a method based on artificial neural networks that resorts to biometric data 33 from other users and to earlier biometric data 33 from the user.
- In an alternative embodiment of the invention, the biometric data 33 from other categories are also evaluated in the evaluation unit 13 arranged on the central server 10 in order to increase the quality of the evaluation further.
- In another alternative embodiment of the invention, the evaluation method obtained on the central server 10 by training the artificial neural network method is implemented in the further application 6, for example by means of an update to the further application 6. In this case, the evaluation unit is provided for all categories by the further application 6 on the mobile terminal 1. Evaluation of the biometric data 33 pertaining to all categories is effected on the mobile terminal 1 rather than on the central server 10. -
FIG. 5A shows a schematic illustration of an exemplary embodiment of the evaluation unit 13 on the central server 10. In this embodiment, a current stress level 36, 36A, 36B, 36C, 36D is determined for a user by the evaluation unit 13 on the central server 10.
- In a variation of this embodiment, it is alternatively possible for a portion of the biometric data 33 pertaining to the user to be analyzed and evaluated on the mobile terminal 1 directly and for at least one current stress level 36A, 36B, 36C, 36D ascertained on the terminal to be determined. A second portion of the biometric data 33 pertaining to the user is analyzed and evaluated on the central server 10 by the evaluation unit 13, and at least one current stress level 36A, 36B, 36C, 36D on the server is determined. The biometric data 33 analyzed and evaluated on the server can comprise biometric data 33 that are also taken into account for the analysis and evaluation on the mobile terminal 1. A consolidated stress level 36 that takes account both of the at least one current stress level 36A, 36B, 36C, 36D ascertained on the terminal and of the at least one current stress level 36A, 36B, 36C, 36D ascertained on the server is determined by the data manager 61 of the further application 6.
- The
evaluation unit 13 comprises a server-end data manager 14 and a server-end data analyzer 15. The server-end data analyzer 15 is in the form of an artificial neural network 40, specifically a multilayer perceptron network. The neural network consists of three layers: the input layer 43, the hidden layer 44 and the output layer 45. Each layer is constructed from neurons 46. The input layer 43 contains a plurality of input neurons 46A. The hidden layer 44 contains a plurality of hidden neurons 46B, and the output layer 45 contains precisely one output neuron 46C.
- In one possible embodiment, each input neuron 46A of the input layer 43 has, as an associated input value, the value of a feature from a feature vector in a category of biometric data 33 that have been transmitted to the central server 10, for example in the speech category, following suitable normalization, for example to the value range [0,10].
- In one alternative embodiment, each input neuron 46A of the input layer 43 has, as an associated input value, the current stress level for a category of biometric data 33. By way of example, the input layer 43 consists of seven input neurons 46A, each input neuron 46A having the associated current stress level of one of the categories sleep, speech, motor functions, social interaction, economic data, personal data and questionnaire data.
- In a further alternative embodiment, the features of the category-specific feature vectors, of biometric data 33 available to the central server 10, are linked and evaluated in another way in order to determine the input values of the input neurons 46A.
- The multilayer perceptron network is in the form of a feed forward network, i.e. the connections between the
neurons 46 always point from one layer, for example the input layer 43, to the next layer, for example the hidden layer 44. When the neural network 40 is transited, no feedback or cyclic connections occur; instead, information is forwarded only in one distinguished direction. The input neurons 46A of the input layer have connections to the hidden neurons 46B of the hidden layer. By way of example, each input neuron 46A of the input layer can have one connection to each hidden neuron 46B of the hidden layer. In an initial state, the hidden layer 44 has a greater number of neurons 46 than the input layer 43. By contrast, the output layer 45 contains precisely one neuron 46, the output neuron 46C. The neurons 46B of the hidden layer 44 have connections to the output neuron 46C of the output layer 45. By way of example, each hidden neuron 46B of the hidden layer 44 is connected to the output neuron 46C. The output neuron 46C represents a current stress level 36, 36A, 36B, 36C, 36D of a user.
- From the biometric data 33 from a user, the artificial neural network 40 computes a current stress level 36, 36A, 36B, 36C, 36D of the user. For this purpose, the server-end data manager 14 retrieves the biometric data 33 pertaining to a user, in the form of the feature vectors for the ascertained categories of biometric data 33 transmitted to the central server 10, from the central memory unit 17. The feature vectors suitable for computing the current stress level 36, 36A, 36B, 36C, 36D are taken into account, for example the feature vectors with the most recent timestamp. A feature vector in a category is taken into account only if the instant for which the current stress level 36, 36A, 36B, 36C, 36D is computed lies in the validity range defined by the timestamp. The server-end data manager 14 provides the biometric data 33 for the data analyzer 15, which is in the form of an artificial neural network 40.
- Following possible conditioning, such as normalization and/or suitable linking, the biometric data 33 are read into the input layer 43 of the neural network 40 and forwarded to the next layers of the neural network 40 via the connections. Each connection has a connection weight that has either a boosting or an inhibiting effect. Each neuron 46B of the hidden layer 44 has an activation function, for example the hyperbolic tangent activation function, which maps an arbitrary input value onto the value range [−1, 1]. The input value for a neuron 46B of the hidden layer 44 is obtained as a sum of the values transmitted via the weighted connections. For each neuron 46, a neuron-specific threshold value is stipulated. If, following application of the activation function, the input value exceeds the threshold value of the neuron 46B, this computed value is forwarded from the hidden neuron 46B to its outgoing connections and hence to the output neuron 46C in the output layer 45. The output neuron 46C determines its output value using the same method as has been described for a hidden neuron 46B of the hidden layer 44. For given connection weights, activation functions and threshold values, the artificial neural network 40 determines the value of the one output neuron 46C in a deterministic fashion from the biometric data 33 that are associated with the input neurons 46A. The value of the output neuron 46C provides the current stress level 36, 36A, 36B, 36C, 36D.
- The value of the output neuron 46C is transferred from the server-end data analyzer 15 in the form of an artificial neural network 40 to the server-end data manager 14. The server-end data manager 14 stores the output value as a current stress level 36, 36A, 36B, 36C, 36D for the categories relevant to determination thereof, with a timestamp, in the central memory unit 17.
- In an initial phase, the connection weights of each connection and the threshold values of each
neuron 46 are stipulated. By way of example, the connection weight for a connection is stipulated by a random value from the range [−0.5, 0.5], the value 0 being omitted. The threshold value for a neuron 46 is stipulated by a random value from the range [−0.5, 0.5], for example.
- In a training phase, the connection weights of each connection of the neural network 40 and the threshold values for each neuron 46 are adjusted. For the purpose of training the neural network 40, a supervised learning method, preferably a back propagation method, is used. In the case of a supervised learning method, the desired output value from the output neuron 46C is available for the input values for the neural network 40. By way of example, the desired output value from the output neuron 46C is obtained from the current stress level for the questionnaire data category, which level has been ascertained exclusively from the questionnaire data answered by the user. In an iterative back propagation method, the connection weights of all connections and the threshold values of all neurons 46 are trained until the output value that the neural network 40 provides for the output neuron 46C matches the desired output value with sufficient accuracy. Repeating the training with a multiplicity of biometric data 33 from a multiplicity of users allows the analysis and evaluation method provided by the artificial neural network 40 for ascertaining the current stress level 36, 36A, 36B, 36C, 36D to be constantly improved and adjusted further. -
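The initial phase and one deterministic pass through the three-layer perceptron can be sketched as follows. Treating the threshold as a bias subtracted inside the tanh activation, and mapping the output in [-1, 1] onto the stress scale [0, 10], are illustrative modelling assumptions:

```python
import math
import random

def init_weight(rng):
    """Initial phase: a random connection weight in [-0.5, 0.5],
    the value 0 being omitted."""
    w = 0.0
    while w == 0.0:
        w = rng.uniform(-0.5, 0.5)
    return w

def forward(inputs, w_hidden, b_hidden, w_out, b_out):
    """One deterministic pass: weighted sums and tanh activation in the
    hidden layer, then the single output neuron computed the same way."""
    hidden = [math.tanh(sum(w * x for w, x in zip(ws, inputs)) - b)
              for ws, b in zip(w_hidden, b_hidden)]
    return math.tanh(sum(w * h for w, h in zip(w_out, hidden)) - b_out)

def stress_level(out):
    """Map the output neuron's value in [-1, 1] onto the integer scale [0, 10]."""
    return round(5 * (out + 1))
```

Training by back propagation would then adjust the weights and biases against the desired output, for example the questionnaire-derived stress level.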
FIG. 5B shows a schematic illustration of an alternative exemplary embodiment of the evaluation unit 13 on the central server 10. In this alternative embodiment, the artificial neural network 40 is in the form of a feedback neural network. Besides the connections that point from a neuron 46 of an upstream layer to a neuron 46 of a downstream layer, for example from an input neuron 46A of the input layer 43 to a hidden neuron 46B of the hidden layer 44, this embodiment has connections that run in the opposite direction, for example from a hidden neuron 46B of the hidden layer 44 to an input neuron 46A of the input layer 43 or from the output neuron 46C of the output layer 45 to a hidden neuron 46B of the hidden layer 44. An artificial neural network 40 of this kind has a higher level of complexity than the previously described feed forward network, which forwards data only in a distinguished forward direction.
- A development of the feedback artificial neural network is shown in FIG. 5C. Accordingly, lateral feedback loops are also possible, that is to say connections between neurons 46 that are arranged in the same layer. In a further development of the feedback artificial neural network, there is also provision for direct feedback. Direct feedback is a connection from a neuron 46 to itself. Direct feedback means that neurons 46 inhibit or boost themselves in order to arrive at their activation limits.
- A feedback artificial neural network is provided particularly in order to take account of the "memory" of biometric data 33 pertaining to a user when determining the current stress level 36, 36A, 36B, 36C, 36D. The memory of biometric data 33 pertaining to a category and to a user is the sequence, ordered by timestamp, of feature vectors for this category and this user; in particular, the sequence comprises older feature vectors from earlier analyses. A suitable subsequence is selected and the artificial neural network method is started with the first feature vector in this subsequence, that is to say the feature vector with the oldest timestamp. In a first time step, the values of the first feature vector are applied to the artificial neural network as input values and the neural network is transited once. The built-in feedback loops mean that the values from the first time step have a further effect on the subsequent time step. In the subsequent time step, the values of the second feature vector are applied to the artificial neural network 40 as input values. When the artificial neural network 40 is transited again, the values generated in feedback connections in the previous time step are taken into account as new input values in addition to the input values for the second feature vector. The method is continued in this manner until the complete subsequence of feature vectors has been transited. The value of the output neuron 46C provides the current stress level 36, 36A, 36B, 36C, 36D of the user. -
FIG. 6 shows a schematic illustration of an embodiment of the invention that has been developed further. In this embodiment, the evaluation unit 6, 13 ascertains a current stress level 36, 36A, 36B, 36C, 36D of a user using a network of artificial neural networks. By way of example, the network of neural networks may be in the form of a deep belief network or a convolutional deep belief network 50.
- A single artificial
neural network 40, which is part of the network of artificial neural networks, may be embodied according to one of the embodiments cited previously for artificial neural networks 40, for example.
- In a preferred embodiment, the network of artificial neural networks comprises a plurality of artificial neural networks 40 that interact with one another. By way of example, the plurality of artificial neural networks 40 may be embodied as a restricted Boltzmann machine or as a convolutional restricted Boltzmann machine.
- In the network of artificial neural networks 40, a single neural network 40 comprises an input layer 43 and a hidden layer 44. The input layer comprises a plurality of input neurons 46A. The hidden layer comprises a plurality of hidden neurons 46B. From the input layer of an artificial neural network, it is possible to determine the hidden layer of the same neural network, as explained in the preceding embodiments, for example.
- The network of artificial neural networks contains a first level of artificial neural networks 40, which are referred to as first neural networks. The input layer 43 of the first neural networks is stipulated by the biometric data 33 from the user. By way of example, a component of a feature vector in a category can be associated with an input neuron 46A. Furthermore, at least one further level of artificial neural networks 40 is provided, which are referred to as further neural networks. For a further neural network, the input layer 43 can be determined from the hidden layers 44 of a plurality of artificial neural networks 40 on the preceding level. By way of example, an input neuron 46A of the input layer 43 is stipulated by precisely one hidden neuron 46B of an artificial neural network 40 from the preceding level. Alternatively, an input neuron 46A of the input layer 43 is stipulated by a plurality of hidden neurons 46B of one or more artificial neural networks 40 from the preceding level.
- The network of artificial neural networks 40 contains a topmost level that comprises at least one artificial neural network 40. The at least one artificial neural network 40 on the topmost level is referred to as the topmost neural network.
- The at least one topmost neural network has an output layer 45. In a first embodiment, the hidden layer 44 of a topmost neural network is identified by means of the output layer 45. In a second embodiment, the at least one artificial neural network 40 of the topmost level comprises three layers, the input layer, the hidden layer and the output layer 45. In the second embodiment, the output layer 45 comprises precisely one output neuron 46C.
- The
current stress level 36, 36A, 36B, 36C, 36D can be determined from the output layer 45 of the at least one topmost neural network. In a first embodiment, a classifier is provided that classifies the output layer 45 and determines the current stress level 36, 36A, 36B, 36C, 36D therefrom. By way of example, the classifier may be designed as a support vector machine. In a second embodiment, the current stress level 36, 36A, 36B, 36C, 36D is stipulated by the output neuron 46C.
- The evaluation unit 6, 13, which comprises a network of a plurality of artificial neural networks 40, is designed such that the computation of the network can be parallelized. The evaluation unit 6, 13 interacts with at least one processor, the processor being designed and provided to compute neurons 46, 46B, 46C for at least one artificial neural network 40. By way of example, the processor may be arranged on the mobile terminal 1. The processor may also be provided on a central server. In a preferred embodiment, a plurality of processors are provided. The plurality of processors may be provided on the mobile terminal or the central server or on both. The evaluation unit 6, 13 is designed and provided to have the plurality of artificial neural networks 40 computed by the plurality of processors in parallel. -
- In a further development, at least one graphics card with at least one graphics card processor can be incorporated for executing the method, the at least one graphics card being arranged on the mobile terminal or on the central server. The graphics card processor can support computation of the artificial neural networks, in particular. This approach allows the computation time to be optimized even further.
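A hedged sketch of the graphics-card further development (PyTorch, the batch size, layer sizes and activations are all assumptions for illustration, not part of the patent): the dense layers behind the neurons reduce to matrix products, which a graphics card processor computes efficiently, and the code falls back to the CPU when no graphics card is available.

```python
import torch

# Place the tensors on the graphics card if one is present, else on the CPU.
device = "cuda" if torch.cuda.is_available() else "cpu"

torch.manual_seed(0)
x = torch.rand(16, 4, device=device)       # a batch of 16 feature vectors
w_hidden = torch.rand(5, 4, device=device)  # hidden-layer weights
w_out = torch.rand(1, 5, device=device)     # single-output-neuron weights

hidden = torch.tanh(x @ w_hidden.T)        # hidden layer for the whole batch
out = torch.sigmoid(hidden @ w_out.T)      # one output value per sample

print(tuple(out.shape))  # (16, 1)
```

Because the same code runs on both devices, the choice between mobile-terminal CPU and server graphics card becomes a deployment decision rather than a code change.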
Claims (21)
Applications Claiming Priority (5)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| DE102012213618.5 | 2012-08-01 | ||
| DE102012213618 | 2012-08-01 | ||
| DE102012214697.0 | 2012-08-17 | ||
| DE201210214697 DE102012214697A1 (en) | 2012-08-01 | 2012-08-17 | Device, method and application for determining a current load level |
| PCT/EP2013/066241 WO2014020134A1 (en) | 2012-08-01 | 2013-08-01 | Device, method and application for establishing a current load level |
Related Parent Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/EP2013/066241 A-371-Of-International WO2014020134A1 (en) | 2012-08-01 | 2013-08-01 | Device, method and application for establishing a current load level |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/850,984 Continuation US11468984B2 (en) | 2012-08-01 | 2020-04-16 | Device, method and application for establishing a current load level |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20150265211A1 true US20150265211A1 (en) | 2015-09-24 |
Family
ID=49944062
Family Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US14/418,374 Abandoned US20150265211A1 (en) | 2012-08-01 | 2013-08-01 | Device, method and application for establishing a current load level |
| US16/850,984 Active US11468984B2 (en) | 2012-08-01 | 2020-04-16 | Device, method and application for establishing a current load level |
Family Applications After (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/850,984 Active US11468984B2 (en) | 2012-08-01 | 2020-04-16 | Device, method and application for establishing a current load level |
Country Status (4)
| Country | Link |
|---|---|
| US (2) | US20150265211A1 (en) |
| EP (1) | EP2879582B1 (en) |
| DE (1) | DE102012214697A1 (en) |
| WO (1) | WO2014020134A1 (en) |
Families Citing this family (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| DE102014223366A1 (en) * | 2014-11-17 | 2016-05-19 | BSH Hausgeräte GmbH | Domestic appliance with a touch-sensitive operating device and method for its operation |
| DE102016216950A1 (en) * | 2016-09-07 | 2018-03-08 | Robert Bosch Gmbh | Model calculation unit and control unit for calculating a multilayer perceptron model with feedforward and feedback |
| US12327175B2 (en) * | 2020-08-06 | 2025-06-10 | Micron Technology, Inc. | Collaborative sensor data processing by deep learning accelerators with integrated random access memory |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20120289789A1 (en) * | 2011-05-13 | 2012-11-15 | Fujitsu Limited | Continuous Monitoring of Stress Using Environmental Data |
| US20130262096A1 (en) * | 2011-09-23 | 2013-10-03 | Lessac Technologies, Inc. | Methods for aligning expressive speech utterances with text and systems therefor |
Family Cites Families (35)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP3310498B2 (en) * | 1994-09-02 | 2002-08-05 | 独立行政法人産業技術総合研究所 | Biological information analyzer and biological information analysis method |
| US5568126A (en) | 1995-07-10 | 1996-10-22 | Andersen; Stig L. | Providing an alarm in response to a determination that a person may have suddenly experienced fear |
| US8364136B2 (en) * | 1999-02-01 | 2013-01-29 | Steven M Hoffberg | Mobile system, a method of operating mobile system and a non-transitory computer readable medium for a programmable control of a mobile system |
| AU2001270092A1 (en) * | 2000-06-23 | 2002-01-08 | Bodymedia, Inc. | System for monitoring health, wellness and fitness |
| US7409373B2 (en) | 2001-12-28 | 2008-08-05 | Concepta Ab | Pattern analysis system and method |
| US7152051B1 (en) * | 2002-09-30 | 2006-12-19 | Michael Lamport Commons | Intelligent control with hierarchical stacked neural networks |
| SE0202948D0 (en) | 2002-10-04 | 2002-10-04 | Bergfalk & Knagenhjelm Ab | Ways of detecting activity patterns that indicate mental illness, and corresponding arrangements |
| US20040176991A1 (en) * | 2003-03-05 | 2004-09-09 | Mckennan Carol | System, method and apparatus using biometrics to communicate dissatisfaction via stress level |
| US8002553B2 (en) | 2003-08-18 | 2011-08-23 | Cardiac Pacemakers, Inc. | Sleep quality data collection and evaluation |
| US10478115B2 (en) | 2004-10-04 | 2019-11-19 | Spirofriend Technology Aps | Handheld home monitoring sensors network device |
| US8758019B2 (en) | 2006-08-03 | 2014-06-24 | James W. Suzansky | Multimedia game based system and process for medical, safety and health improvements |
| EP2131731B1 (en) * | 2007-02-16 | 2014-04-09 | Galvanic Limited | Biosensor system |
| US7720696B1 (en) | 2007-02-26 | 2010-05-18 | Mk3Sd, Ltd | Computerized system for tracking health conditions of users |
| DE102007032610A1 (en) * | 2007-07-11 | 2009-01-15 | Deutsche Telekom Ag | A method of remotely monitoring the medical condition of a user, system and apparatus therefor |
| JP5727231B2 (en) * | 2008-02-22 | 2015-06-03 | コーニンクレッカ フィリップス エヌ ヴェ | System and kit for stress and relaxation management |
| US20090254369A1 (en) * | 2008-04-08 | 2009-10-08 | Mohaideen A Hassan | System and method for providing health care services using smart health cards |
| US9443141B2 (en) * | 2008-06-02 | 2016-09-13 | New York University | Method, system, and computer-accessible medium for classification of at least one ICTAL state |
| US20090326981A1 (en) * | 2008-06-27 | 2009-12-31 | Microsoft Corporation | Universal health data collector and advisor for people |
| TWI405559B (en) | 2008-08-14 | 2013-08-21 | Univ Nat Taiwan | Handheld sleep assistant device and method |
| US8004391B2 (en) * | 2008-11-19 | 2011-08-23 | Immersion Corporation | Method and apparatus for generating mood-based haptic feedback |
| TWI355260B (en) | 2008-11-21 | 2012-01-01 | Univ Yuan Ze | Remote sleeping quality detecting system and metho |
| TWM363295U (en) | 2009-03-27 | 2009-08-21 | Platinum Team Co Ltd | A device for analyzing quality of sleep |
| WO2010136786A2 (en) * | 2009-05-27 | 2010-12-02 | University Of Abertay Dundee | A biometric security method, system and computer program |
| US8905928B2 (en) | 2009-07-17 | 2014-12-09 | Oregon Health & Science University | Method and apparatus for assessment of sleep disorders |
| GB2471902A (en) * | 2009-07-17 | 2011-01-19 | Sharp Kk | Sleep management system which correlates sleep and performance data |
| US20120116186A1 (en) * | 2009-07-20 | 2012-05-10 | University Of Florida Research Foundation, Inc. | Method and apparatus for evaluation of a subject's emotional, physiological and/or physical state with the subject's physiological and/or acoustic data |
| US8527213B2 (en) * | 2009-07-21 | 2013-09-03 | Ntt Docomo, Inc. | Monitoring wellness using a wireless handheld device |
| JP4519193B1 (en) | 2009-07-27 | 2010-08-04 | エンパイア テクノロジー ディベロップメント エルエルシー | Information processing system and information processing method |
| DE102009043775A1 (en) | 2009-09-30 | 2011-04-07 | Siemens Medical Instruments Pte. Ltd. | Hearing device i.e. combined hearing and tinnitus masker device, adjusting method, involves analyzing speech signal for recognizing emotional state of user and adjusting parameter of hearing device as function of recognized emotional state |
| US8666672B2 (en) * | 2009-11-21 | 2014-03-04 | Radial Comm Research L.L.C. | System and method for interpreting a user's psychological state from sensed biometric information and communicating that state to a social networking site |
| WO2011094448A1 (en) | 2010-01-29 | 2011-08-04 | Dreamwell, Ltd. | Systems and methods for bedding with sleep diagnostics |
| US9634855B2 (en) * | 2010-05-13 | 2017-04-25 | Alexander Poltorak | Electronic personal interactive device that determines topics of interest using a conversational agent |
| WO2012008961A1 (en) * | 2010-07-15 | 2012-01-19 | Sony Ericsson Mobile Communications Ab | A user analytics engine for detecting and mitigating stress and/or concentration lapses for an associated user of an electronic device |
| EP2656262A2 (en) * | 2010-12-23 | 2013-10-30 | Orange | Medical record retrieval system based on sensor information and a method of operation thereof |
| US20120197622A1 (en) * | 2011-01-31 | 2012-08-02 | Fujitsu Limited | Monitoring Insulin Resistance |
2012

- 2012-08-17 DE DE201210214697 patent/DE102012214697A1/en not_active Withdrawn

2013

- 2013-08-01 US US14/418,374 patent/US20150265211A1/en not_active Abandoned
- 2013-08-01 WO PCT/EP2013/066241 patent/WO2014020134A1/en not_active Ceased
- 2013-08-01 EP EP13755979.5A patent/EP2879582B1/en active Active

2020

- 2020-04-16 US US16/850,984 patent/US11468984B2/en active Active
Patent Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20120289789A1 (en) * | 2011-05-13 | 2012-11-15 | Fujitsu Limited | Continuous Monitoring of Stress Using Environmental Data |
| US20130262096A1 (en) * | 2011-09-23 | 2013-10-03 | Lessac Technologies, Inc. | Methods for aligning expressive speech utterances with text and systems therefor |
Non-Patent Citations (1)
| Title |
|---|
| GEORGIEV et al., Low-resource Multi-task Audio Sensing for Mobile and Embedded Devices via Shared Deep Neural Network Representations, March 2010, Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, Vol. 9, No. 4, Article 39:1-19 * |
Cited By (11)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20140200468A1 (en) * | 2007-09-11 | 2014-07-17 | Korea Advanced Institute Of Science And Technology (Kaist) | Method for analyzing stress based on multi-measured bio-signals |
| US10130292B2 (en) * | 2007-09-11 | 2018-11-20 | Samsung Electronics Co., Ltd. | Method for analyzing stress based on multi-measured bio-signals |
| US11301671B1 (en) * | 2015-04-20 | 2022-04-12 | Snap Inc. | Determining a mood for a group |
| US11710323B2 (en) | 2015-04-20 | 2023-07-25 | Snap Inc. | Determining a mood for a group |
| US12243318B2 (en) | 2015-04-20 | 2025-03-04 | Snap Inc. | Determining a mood for a group |
| EP3501385A1 (en) * | 2017-12-21 | 2019-06-26 | IMEC vzw | System and method for determining a subject's stress condition |
| US11160500B2 (en) | 2017-12-21 | 2021-11-02 | Imec Vzw | System and method for determining a subject's stress condition |
| US11717217B2 (en) * | 2019-03-25 | 2023-08-08 | Steffen Wirth | Stress monitor and stress-monitoring method |
| WO2022038776A1 (en) * | 2020-08-21 | 2022-02-24 | 日本電気株式会社 | Stress inference device, inference method, program, and storage medium |
| JPWO2022038776A1 (en) * | 2020-08-21 | 2022-02-24 | ||
| JP7517433B2 (en) | 2020-08-21 | 2024-07-17 | 日本電気株式会社 | Stress estimation device, estimation method, program, and storage medium |
Also Published As
| Publication number | Publication date |
|---|---|
| EP2879582B1 (en) | 2020-03-25 |
| US20200237302A1 (en) | 2020-07-30 |
| DE102012214697A1 (en) | 2014-02-06 |
| EP2879582A1 (en) | 2015-06-10 |
| WO2014020134A1 (en) | 2014-02-06 |
| US11468984B2 (en) | 2022-10-11 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11468984B2 (en) | Device, method and application for establishing a current load level | |
| US11363953B2 (en) | Methods and systems for managing medical anomalies | |
| US12230366B2 (en) | System and method for fleet driver biometric tracking | |
| US10872354B2 (en) | System and method for personalized preference optimization | |
| US10620593B2 (en) | Electronic device and control method thereof | |
| LiKamWa et al. | Moodscope: Building a mood sensor from smartphone usage patterns | |
| EP3638108B1 (en) | Sleep monitoring from implicitly collected computer interactions | |
| CN109460752B (en) | Emotion analysis method and device, electronic equipment and storage medium | |
| US20190156191A1 (en) | Detecting personal danger using a deep learning system | |
| US20180107943A1 (en) | Periodic stress tracking | |
| EP3507729A1 (en) | Providing insights based on health-related information | |
| JP2018506773A (en) | Method and system for monitoring and influencing gesture-based behavior | |
| KR20140111959A (en) | Automatic haptic effect adjustment system | |
| JP2015514512A (en) | Biometric attribute anomaly detection system with notification coordination | |
| US20240028967A1 (en) | Systems and methods for automatic decision-making with user-configured criteria using multi-channel data inputs | |
| US12367751B2 (en) | Alert system | |
| US20180075763A1 (en) | System and method of generating recommendations to alleviate loneliness | |
| KR101706474B1 (en) | Smartphone usage patterns gathering and processing system | |
| US20220108775A1 (en) | Techniques for updating a health-related record of a user of an input/output device | |
| US20180366024A1 (en) | Providing suggested behavior modifications for a correlation | |
| JP2024164089A (en) | Method and apparatus for interactive and privacy-preserving communication between a server and a user device - Patents.com | |
| US20210224828A1 (en) | Real-time dynamic monitoring of sentiment trends and mitigation of the same in a live setting | |
| US20200064986A1 (en) | Voice-enabled mood improvement system for seniors | |
| KR20240175303A (en) | Method and apparatus for managing symptom in user customized manner | |
| US20230088373A1 (en) | Progressive individual assessments using collected inputs |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | AS | Assignment | Owner name: SOMA ANALYTICS UG, GERMANY. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SCHNEIDER, PETER;HUBER, JOHANN;LORENZ, CHRISTOPHER;AND OTHERS;REEL/FRAME:035776/0121. Effective date: 20150211 |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
| | AS | Assignment | Owner name: PRENETICS EMEA LTD, ENGLAND. Free format text: NUNC PRO TUNC ASSIGNMENT;ASSIGNOR:SOMA ANALYTICS UG (HAFUNGSBESCHRAENKT);REEL/FRAME:062448/0496. Effective date: 20230123 |
| | AS | Assignment | Owner name: TPDM1 LTD, ENGLAND. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PRENETICS EMEA LTD;REEL/FRAME:062539/0950. Effective date: 20230130 |
| | AS | Assignment | Owner name: V1AM LIMITED, ENGLAND. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TPDM1 LIMITED;REEL/FRAME:062560/0112. Effective date: 20230201 |