
WO2018215575A1 - System or device for emotion recognition with induction of an actuator response, useful in training and psychotherapy - Google Patents


Info

Publication number
WO2018215575A1
WO2018215575A1 (PCT application PCT/EP2018/063593)
Authority
WO
WIPO (PCT)
Prior art keywords
subject
emotional
detection unit
anyone
stress
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/EP2018/063593
Other languages
English (en)
Inventor
Bernard Martin MAARSINGH
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jamzone BV
Original Assignee
Jamzone BV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jamzone BV filed Critical Jamzone BV
Publication of WO2018215575A1


Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 5/00 Electrically-operated educational appliances
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/02 Detecting, measuring or recording for evaluating the cardiovascular system, e.g. pulse, heart rate, blood pressure or blood flow
    • A61B 5/0205 Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B 5/165 Evaluating the state of mind, e.g. depression, anxiety

Definitions

  • the invention relates to emotion recognition devices, systems and methods useful in the field of therapy and health care, the fields of psychological therapy and aspects of psychiatric therapy, herein commonly defined as psychotherapy.
  • in psychotherapy research, therapeutic empathy requires that a psychotherapist can recognize both the quality and intensity of a patient's emotional experience; psychotherapists must be able to identify objectively the emotions expressed in psychotherapy samples to accurately determine the role that emotional expression plays in psychotherapeutic improvement and change in many of the psychological conditions that their clients or patients face.
  • ASD autism-spectrum disorders
  • ADHD attention deficit hyperactivity disorder
  • Parkinson's disease, depression and dementia
  • Apathy has been consistently reported to be associated with executive dysfunction.
  • Borderline personality disorder (BPD) is a serious mental illness marked by unstable moods, behavior, and relationships. Because some people with severe BPD have brief psychotic episodes, experts originally thought of this illness as an atypical, or borderline, version of other mental disorders. While mental health experts now generally agree that the name "borderline personality disorder" is misleading, a more accurate term does not exist yet. However, most people who have BPD suffer from problems with expressing and regulating emotions and thoughts, which often translate into impulsive and reckless behavior and unstable relationships with other people. People with this disorder also have high rates of co-occurring disorders such as depression, anxiety disorders, substance abuse, and eating disorders, along with self-harm, suicidal behaviors, and completed suicides. No medications have been approved by the U.S. Food and Drug Administration specifically for the treatment of BPD.
  • the overriding aim is that emotion detection occurs before an emotion has become overwhelming, while some sort of emotional steering or regulation is still possible; often that is a daunting task for a psychotherapist.
  • a method that is directly relevant to affective computing as applied to autism is the Mind Reading DVD. This comprises educational software that was designed to be an interactive, systematic guide to emotions (Baron-Cohen S., Golan O., Wheelwright S., Hill J. J. Mind Reading: the interactive guide to emotions. London, UK: Jessica Kingsley Limited; 2004). It was developed to help people with ASD learn to recognize both basic and complex emotions and mental states from video clips of facial expressions and audio recordings of vocal expressions.
  • Mental states include thoughts and emotions, thoughts being traditionally fractionated into beliefs, desires, intentions, goals and perceptions. Emotions are traditionally fractionated into seven 'basic' emotions (joy, sadness, anger, fear, contempt, disgust and surprise) and numerous 'complex' emotions. Complex emotions involve attributing a cognitive state as well as an emotion and are more context and culture dependent. The basic emotions are held to be so because they may be universally recognized and expressed in the same way. This distinction, however, is not without its critics; since it may be that more emotions are universally recognized and expressed than these seven but have been overlooked, as research into complex emotions (usually towards developing taxonomies) has been mostly language and culture specific.
  • Emotional intelligence, the "accurate appraisal and expression of emotions in oneself and others and the regulation of emotion in a way that enhances living", encompasses a set of interrelated skills and processes. Because the face is the primary canvas used to express distinct emotions non-verbally (Ekman, Perspectives on Psychological Science 2016, Vol. 11(1), 31-34), the ability to read facial expressions is particularly vital, and thus a crucial component of emotional intelligence. Facial expressions are privileged relative to other non-verbal "channels" of communication, such as vocal inflections and body movements. Facial expressions also appear to be the most subject to conscious control. Individuals focus more attention on projecting their own facial expressions and perceiving others' facial expressions than they do on other non-verbal channels, and often more than they focus on verbal communication.
  • Training psychotherapists to recognize and respond to patient emotions has focused mainly on the accuracy of emotional recognition and of empathic responding, which may be increased by teaching therapists and counsellors to attend to non-verbalized information. Although such specific and focused training has proven to increase the accuracy with which therapists can respond to patients' emotional states, its relevance to conventional training of psychotherapists is uncertain. Moreover, much of the research on this topic has been confined to analogue patients and therapy sessions, calling into question the justification of generalizations to clinical material. Even when research on emotional recognition does include professionals who are conventionally trained and experienced, it fails to compare their accuracy to that of inexperienced and untrained individuals. But even if the effects of training were adequately addressed, the question of generalization would not be solved.
  • cues used to convey emotional states in such training are provided by actors who present pre-set verbal and non-verbal messages and are not real-client based. Paradoxically, this methodology has a built-in bias against recognizing authentic emotional expressions; a fatal one if it is indeed true that deception is conveyed by subtle non-verbal cues. Such actor-based practices may yield results that do not represent the authentic display of conflicted emotions in naturalistic settings and psychotherapy practice.
  • Several studies e.g., Rosenthal, Hall, DiMatteo, Rogers, & Archer, 1979 suggest that clinicians are more sensitive to non-verbal communication cues than teachers and business executives but, surprisingly, are somewhat less accurate than graduate students and actors. Indeed, a comparison of M.A.
  • Automated emotion recognition is the process of identifying human emotion, most typically from facial expressions, by computer-assisted means. For this, many computational methodologies have been developed (Neural Networks 18 (2005) 389-405). Putting together an automatic emotion recognition system or device based on knowledge of emotions, such as that stemming from the modern neurosciences, is now well within reach; a minimal sketch is given below.
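  • as a minimal sketch of such a computer-assisted pipeline (assuming a webcam and the OpenCV library; the placeholder classify_emotion, like all names here, is an illustrative assumption, since no specific library is prescribed):

```python
# Minimal sketch of automated emotion recognition from facial expressions.
# OpenCV handles capture and face detection; classify_emotion stands in
# for any pretrained model (CNN, SVM, ...) and is a hypothetical placeholder.
import cv2

EMOTIONS = ["joy", "sadness", "anger", "fear", "contempt", "disgust", "surprise"]

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def classify_emotion(face_pixels):
    # placeholder: a real system would run a trained classifier here
    return EMOTIONS[0]

cap = cv2.VideoCapture(0)  # camera acting as the detection unit
ok, frame = cap.read()
if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.3, 5):
        print("detected emotional cue:", classify_emotion(gray[y:y + h, x:x + w]))
cap.release()
```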
  • SE smart environments
  • the invention provides methods and means, a computer-based hardware and software system or device, in particular for use in a health environment, a so-called smart health environment.
  • a health environment or health facility
  • the term usually includes hospitals, clinics, outpatient care centres, and specialized care centres, such as birthing and psychological or psychiatric care centres.
  • the private home of a person suffering from some kind of disease should also be considered a health environment.
  • the invention provides a computer-based hardware and software system or device comprising a detection unit capable of detecting at least one emotional cue variable of a subject, an integration unit capable of collecting input data from said detection unit and processing said input relating to said cue variable into output data, and an effecting unit capable of collecting output data from said integration unit and processing said output into a system or device response controlling an actuator unit, herein also called the actuator response.
  • the system sends the selected commands to the actuators that control the actuator response.
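  • a minimal sketch of this detect-integrate-effect loop is given below (the class names, the heart-rate cue and the 90 bpm threshold are illustrative assumptions, not taken from this disclosure):

```python
# Sketch of the three-unit architecture: detection -> integration -> effecting.
# All names and thresholds are illustrative assumptions.
import random

class DetectionUnit:
    """Stands in for electrodes/cameras/microphones sampling the subject."""
    def read_cue(self):
        return {"variable": "heart_rate", "value": random.uniform(55, 110)}

class IntegrationUnit:
    """Turns raw cue input into output data, here a crude arousal estimate."""
    def process(self, cue):
        return {"cue": cue["variable"],
                "arousal": "high" if cue["value"] > 90 else "low"}

class EffectingUnit:
    """Turns output data into a command for the actuator unit."""
    def respond(self, output):
        if output["arousal"] == "high":
            return "move_chair_back"  # e.g. increase distance between subjects
        return "hold_position"

detection, integration, effecting = DetectionUnit(), IntegrationUnit(), EffectingUnit()
print("actuator command:", effecting.respond(integration.process(detection.read_cue())))
```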
  • a change in a perception of motion is then performed by an actuator for driving the subject to that perception.
  • Actuators are a type of tool used to put something into automatic action. They are used in a wide variety of settings, from humans putting something into action to computers automatically starting up a process.
  • the actuator unit provides a movement or change in movement capable of being detected (perceived or notified) in reality, augmented reality or virtual reality by said subject.
  • the invention provides emotion recognition devices, systems and methods useful in the field of therapy and health care, the fields of psychological therapy and aspects of psychiatric therapy, herein commonly defined as psychotherapy.
  • the invention provides a novel tool that helps therapists accurately and rapidly identify another person's emotional state in diagnosing and treating mental disorders, to better study, understand, respond to and empathize with a patient or client, and that helps clients and patients (herein generally called subjects) to be trained in, understand and reflect on their behavior and communicative skills, for example to improve these skills.
  • Lie detection is an obvious example of such situations.
  • Another example is clinical studies and therapy of schizophrenia and particularly the diagnosis of flattened affect that so far relies on the psychiatrists' subjective judgment of subjects' emotionality based on various physiological clues.
  • An automatic emotion-sensitive-special-effect-response-actuator system or device as provided herein helps augment these judgments, so minimizing the dependence of the diagnostic procedure on individual psychiatrists' perception of emotionality. More generally along those lines, automatic emotion detection, classification and responding with an effect can be used in a wide range of psychological and neurophysiological studies of human emotional expression that so far rely on subjects' self-report of their emotional state, which often proves problematic.
  • subjects diagnosed with ASD, ADHD, Parkinson's disease, dementias, borderline personality disorders, bipolar disorders and the like may benefit from the invention; but also couples engaged in relationship therapy, subjects that need to handle or be trained in handling difficult or conflictual discussions, and, more in general, subjects that would benefit from training their socio-communicative skills may all benefit from the invention.
  • the invention provides a computer-based hardware and software system or device (see also figure 2) comprising a detection unit capable of detecting at least one emotional cue variable of a subject, an integration unit capable of collecting input data from said detection unit and processing said input relating to said cue variable into output data, an effecting unit capable of collecting output data from said integration unit and processing said output into a system or device response controlling an actuator unit.
  • the actuator unit provides a movement or change in movement capable of being detected by said subject.
  • the actuator responds by converting energy into mechanical motion or movement or change in movement capable of being detected by said subject.
  • the actuator responds by generating a field of view projected in a virtual or augmented reality device projecting a movement or change in movement capable of being detected by said subject.
  • detection by said subject is greatly facilitated by several of the special effects that are attributed to emotional cue detection with a system or device of the invention.
  • These special effects may be generated on the basis of distinct algorithms in the software that reflect known psychotherapy strategies such as those provided by Gottman or another psychotherapist known in the field.
  • the system or device may be equipped with self-learning algorithms whereby apparently successful responses are integrated into the software's memory; a toy sketch follows.
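  • a toy sketch of such self-learning: an epsilon-greedy update in which responses rated as successful are chosen more often (the candidate responses, the rating step and the algorithm itself are illustrative assumptions; no particular learning method is prescribed):

```python
# Toy self-learning loop: responses rated successful are reinforced and
# selected more often (epsilon-greedy; purely illustrative).
import random

responses = {"dim_lights": 0.0, "move_chair": 0.0, "play_sound": 0.0}
counts = {r: 0 for r in responses}

def choose(epsilon=0.1):
    if random.random() < epsilon:             # occasionally explore
        return random.choice(list(responses))
    return max(responses, key=responses.get)  # otherwise exploit best-so-far

def record_outcome(response, success):
    counts[response] += 1
    reward = 1.0 if success else 0.0
    # incremental mean of the response's success estimate ("software memory")
    responses[response] += (reward - responses[response]) / counts[response]

r = choose()
record_outcome(r, success=True)  # e.g. the therapist marks the effect helpful
```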
  • by automatically notifying the subject (client or patient) with one or more special, moving effects based on or related to the occurrence or manifestation of an emotional cue of said subject detected by the system or device provided here, the subject will learn that such cues occur and may put the occurrence of emotional cues in the rather harmless perspective of an artificial detect-effect relationship.
  • the therapist can now rely on an automated system or device that helps him or her in recognizing emotional cues and therewith can stay focused on other aspects of the therapeutic or training process.
  • the system or device according to the invention is provided with an effecting unit capable of collecting output data from said integration unit and processing said output into a system or device response that controls an actuator unit such as a movable subject support structure (carrier) propelling or effecting a change in motion of the subject supported or carried by said structure.
  • said support structure comprises a movable base adapted to ride over a substructure, for example a support surface, rails or track.
  • the invention provides a system or device provided with an electric actuator for moving the support structure, such as an electric cylinder, an electric motor or a stepper motor, or any other electric drive suitable for moving the support structure.
  • the system or device according to the invention is provided with an effecting unit capable of collecting output data from said integration unit and processing said output into a system or device response that controls an actuator unit such as an augmented or virtual reality device capable of propelling or effecting a change in detection of motion of said subject.
  • the system or device here provided generates emotion-to-(perceived)motion effects that provide profound learning experiences to the subject(s) participating with the system or device.
  • said effecting unit provides (under guidance of the emotional cue or cues detected) a system or device response by controlling an actuator that moves (or stops moving) the furniture at which or wherein said subject is seated.
  • said movement may be directed at moving the subject(s), preferably by moving the furniture wherein or whereon the subject is seated, away from one or more other subjects, preferably that are also participating with the system or device.
  • said movement may be directed at moving the subject(s) towards one or more other subjects participating with the system or device.
  • an actuator is used with which the intensity or speed of moving may be changed, with which the moving is upwardly or downwardly directed, or with which the furniture provides a vibrating or shaking sensation of which the frequency is changed by an actuator response under guidance of the emotional cue or cues detected by a system or device according to the invention.
  • An actuator is the mechanism by which a control system or device acts upon an environment.
  • the control system or device can be simple (a fixed mechanical or electronic system or device), software-based (e.g. a printer driver, robot control system or device), a human, or any other input.
  • the support structure preferably comprises a movable base adapted to ride over a substructure, for example a support surface, rails or track.
  • the invention provides a system or device provided with an electric actuator for moving the support structure, such as an electric cylinder, an electric motor or a stepper motor, or any other electric drive suitable for moving the support structure or subject carrier.
  • the moving system or device preferably comprises an electric actuator such as an electric motor for propelling or moving the subject carrier, and a power supply to power the electric actuator.
  • the power supply may also comprise an electrical storage element to store electrical energy.
  • the effecting unit is arranged to control the power supply, optionally such as to operate the power supply to charge the electrical storage element from a power source; and primarily to operate the power supply to power the actuator from the electrical energy stored in the electrical storage element, to thereby propel the subject carrier.
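  • a schematic sketch of this charge-then-drive control (the class, capacity and energy figures are illustrative assumptions):

```python
# Sketch of the effecting unit's power-supply control: first charge the
# electrical storage element from a source, then power the actuator from
# the stored energy to propel the subject carrier. Figures are illustrative.
class PowerSupply:
    def __init__(self, capacity_j=500.0):
        self.capacity_j = capacity_j
        self.stored_j = 0.0

    def charge(self, joules):
        self.stored_j = min(self.capacity_j, self.stored_j + joules)

    def drive_actuator(self, joules):
        if self.stored_j < joules:
            return False              # not enough stored energy to propel
        self.stored_j -= joules
        return True

supply = PowerSupply()
supply.charge(200.0)                  # charge storage element from power source
if supply.drive_actuator(150.0):      # propel the subject carrier
    print("carrier propelled; remaining energy:", supply.stored_j, "J")
```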
  • a linear motor may be provided to accelerate the subject carrier to a certain speed, the carrier thereby e.g. being unable to travel a remainder of the trajectory of the device on its own.
  • the actuator may for example be comprised in the carrier and be provided with electrical power via sliding contacts.
  • the invention also provides a computer-based hardware and software system or device according to the invention having a subject carrier and a propelling system or device for propelling the subject carrier, the propelling system or device comprising an electric actuator to propel the subject carrier and a power supply to power the electric actuator, the power supply comprising an electrical storage element to store electrical energy, and a control unit which is arranged to control operation of the power supply, the control unit being arranged to operate the power supply to charge the electrical storage element from a power source, and to operate the power supply to power the electric actuator from the electrical energy stored in the electrical storage element, to thereby propel the subject carrier.
  • the subject carrier may optionally be provided with one or more seating parts, the term seating part used herein is understood to mean that part of the carrier which can accommodate one person or several persons in a sitting, standing or recumbent position.
  • the invention provides a computer-based hardware and software system or device, the system or device optionally comprising a database that may be cloud-based.
  • the invention provides a computer-based hardware and software system or device comprising a detection unit capable of detecting at least one emotional cue variable of a subject, in particular capable of detecting emotional valence variables of said cue, and further comprising an integration unit capable of collecting input data from said detection unit and processing said input relating to said cue variable into output data, and an effecting unit capable of collecting output data from said integration unit and processing said output into an actuator response capable of being detected by said subject, preferably wherein said response depends on the emotional valence of the cue detected.
  • the invention provides a computer-based hardware and software system or device comprising a detection unit capable of detecting at least one emotional cue variable of a subject, in particular capable of detecting emotional arousal variables of said cue, and further comprising an integration unit capable of collecting input data from said detection unit and processing said input relating to said cue variable into output data, and an effecting unit capable of collecting output data from said integration unit and processing said output into an actuator response capable of being detected by said subject, preferably wherein said response depends on the emotional arousal of the cue detected.
  • the invention provides a computer-based hardware and software system or device comprising a detection unit capable of detecting at least one emotional cue variable of a subject, in particular capable of detecting emotional valence variables and emotional arousal variables of said cue, and further comprising an integration unit capable of collecting input data from said detection unit and processing said input relating to said cue variable into output data, and an effecting unit capable of collecting output data from said integration unit and processing said output into an actuator response capable of being detected by said subject, preferably wherein said response depends on the emotional valence and arousal of the cue detected.
  • At least four different outputs may be generated that respectively relate to high valence/high arousal, high valence/low arousal, low valence/high arousal and low valence/low arousal (see also figure 1); a minimal mapping sketch follows.
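  • a minimal sketch of this four-quadrant mapping (the 0-1 scale and the 0.5 threshold are illustrative assumptions):

```python
# Map a (valence, arousal) estimate to one of the four quadrant outputs
# of figure 1. Scale and threshold are illustrative assumptions.
def quadrant(valence, arousal, threshold=0.5):
    v = "high_valence" if valence >= threshold else "low_valence"
    a = "high_arousal" if arousal >= threshold else "low_arousal"
    return f"{v}/{a}"

print(quadrant(0.8, 0.2))  # -> high_valence/low_arousal, e.g. calm contentment
```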
  • the invention provides a set of computerized devices (a system or device) helping the therapist assess emotional states of humans and improving his or her ability to modulate emotional states of the subject in therapy.
  • the invention provides a computer-based hardware and software system or device comprising a detection unit capable of detecting at least one emotional cue variable of a subject, an integration unit capable of collecting input data from said detection unit and processing said input relating to said cue variable into output data, and an effecting unit capable of collecting output data from said integration unit and processing said output into an actuator response capable of being detected by said subject wherein said response is provided in near-real-time.
  • the invention provides a computer-based hardware and software system or device comprising a detection unit capable of detecting at least one emotional cue variable of a subject, an integration unit capable of collecting input data from said detection unit and processing said input relating to said cue variable into output data, and an effecting unit capable of collecting output data from said integration unit and processing said output into an actuator response capable of being detected by said subject wherein said response is provided in real-time.
  • the invention provides real-time or near-real-time devices, systems and methods to detect an emotion and provide a response to modulate behavior.
  • RTC real-time computing
  • reactive computing describes hardware and software systems or devices subject to a "real-time constraint" or "near-real-time constraint", for example from detection of an event to system or device response.
  • Near-real-time and real-time programs must both guarantee response within specified time constraints, often referred to as “deadlines”. The distinction between “near real-time” and “real-time” varies, and the delay is dependent on the type and speed of the transmission.
  • the delay in near real-time as provided by the invention herein is typically of the order of several seconds to several minutes.
  • Real-time responses are often understood to be in the order of milliseconds, and sometimes microseconds; near-real-time responses often incorporate a deliberate lag phase, preferably of up to 20 seconds, more preferably up to 10 seconds, more preferably up to 5 seconds, most preferably up to 3 seconds.
  • This lag phase is selected in line with research on humans and animals showing that somewhere between 2-5 seconds is the longest window of what we can perceive as an independent event - a moment. Anything longer, or separated by a longer window, is perceived by us as a separate occurrence - close in time, but distinct.
  • our autonomic nervous system prepares us about 3 seconds ahead of time, which makes sense because that is about how long our vagal system takes to alter our heart rate and breathing.
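  • a minimal sketch of such a deliberate lag phase, releasing the actuator response about 3 seconds after detection (the scheduling approach is an illustrative assumption):

```python
# Near-real-time response with a deliberate lag phase: a detected cue is
# scheduled for delivery ~3 s later (cf. the 2-5 s "moment" window above).
import threading
import time

def respond_with_lag(cue, actuator, lag_s=3.0):
    """Schedule the actuator response lag_s seconds after detection."""
    threading.Timer(lag_s, actuator, args=(cue,)).start()

def actuator(cue):
    print(f"[{time.strftime('%X')}] actuator response to cue: {cue}")

print(f"[{time.strftime('%X')}] cue detected")
respond_with_lag("raised_heart_rate", actuator)
time.sleep(3.5)  # keep the script alive until the timer fires
```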
  • a near-real-time system or device as herein described is one which "controls an environment by receiving data, processing them, and returning the results sufficiently quickly to affect the environment at that time".
  • near-real-time herein is also used to mean “without significant delay”.
  • near real-time or “nearly real-time” refers to the time delay introduced, by automated data processing or network transmission, between the occurrence of an event and the use of the processed data, such as for display or feedback and control purposes. For example, a near-real-time display depicts an event or situation, as it existed at the current time minus the processing time, as nearly the time of the live event. Both terms “near real time” and “real time” imply that there are no significant delays. In many cases, processing described as “real-time” would be more accurately described as "near real-time”.
  • Near real-time also refers to delayed real-time transmission of voice and video. It allows playing or projecting video images, in approximately real-time, without having to wait for an entire large video file to download.
  • Incompatible databases can export/import to common flat files that the other database can import/export on a scheduled basis so that they can sync/share common data in "near real-time" with each other.
  • the devices, systems and methods of the invention as provided herein are useful during psychotherapy sessions, relationship therapy sessions and during socio-communicative interactions and discussions wherein an emotional cue given out by a subject is automatically responded to with a special effect detectable by said subject, preferably executed in near-real-time, more preferably in real time.
  • the invention provides a computer-based hardware and software system or device comprising a detection unit capable of detecting at least one emotional cue variable of a subject, an integration unit capable of collecting input data from said detection unit and processing said input relating to said cue variable into output data, an effecting unit capable of collecting output data from said integration unit and processing said output into an actuator response capable of being detected by said subject wherein said response is provided in near-real-time or provided in real-time, and wherein said detection unit is capable of detecting at least one, preferably at least two, more preferably at least three emotional cue variables selected from the group of heart rate, heart rate variability, galvanic skin response, galvanic skin response variability, breathing rate, breathing rate variability, core temperature, skin temperature, skin temperature variability, electro-myography, electro-myography variability, electro-encephalography, electro-encephalography variability, electro-cardiography, electro-cardiography variability, photoplethysmography, photoplethysmography variability, goose bumps, goose bumps variability, posture, posture variability, body movement, body movement variability, eye movement, eye movement variability, pupil size, pupil size variability, hand movement, hand movement variability, facial expression, facial expression variability, speech, speech variability, sound and sound variability.
  • the invention provides a computer-based hardware and software system or device comprising a detection unit capable of detecting at least one emotional cue variable of a subject, an integration unit capable of collecting input data from said detection unit and processing said input relating to said cue variable into output data, an effecting unit capable of collecting output data from said integration unit and processing said output into an actuator response capable of being detected by said subject wherein said response is provided in near-real-time or provided in real-time, and wherein said detection unit is capable of detecting at least one, preferably at least two, more preferably at least three emotional cue variables selected from the group of heart rate, galvanic skin response, electro-cardiography, photoplethysmography, posture, eye movement, pupil size, facial expression, and sound.
  • said emotional cue or cues are preferably detected by a detection unit comprising an electrode, in another embodiment of the invention, said detection unit comprises an optical sensor, in yet another embodiment, said detection unit comprises a camera, in yet another said detection unit comprises a microphone. In a further preferred embodiment of the invention, the detection unit comprises an electrode and a camera and/or an optical sensor and/or a microphone. In a particularly preferred embodiment, the detection unit comprises an electrode and a camera and a microphone.
  • the invention provides a computer-based hardware and software system or device with a detection unit, an integration unit and an effecting unit, wherein said integration unit is provided with software capable of providing a measure of the emotional experience of said subject. It is preferred that said emotional experience is classifiable as any one selected from the group of joy, anger, surprise, fear, sadness, disgust or contempt.
  • the invention provides a system or device wherein said integration unit is provided with software capable of providing a measure of the facial expression of said subject.
  • said facial expression is classifiable as any one selected from the group of attention, brow furrow, brow raise, inner brow raise, eye closure, nose crinkle, upper lip raise, lip suck, lip pucker, lip press, mouth open, lip corner depressor, chin raise, smirk or smile.
  • the invention provides a computer-based hardware and software system or device comprising a detection unit capable of detecting at least one emotional cue variable of a subject, an integration unit capable of collecting input data from said detection unit and processing said input relating to said cue variable into output data, an effecting unit capable of collecting output data from said integration unit and processing said output into an actuator response capable of being detected by said subject wherein said subject is also provided with a system or device for augmented or virtual reality detection.
  • in an alternative embodiment, stress detection by a virtual reality system or device worn by a subject moves or adapts the augmented or virtual reality perceived by said subject.
  • Virtual reality (VR) systems or devices can present fields of view to a user via a display to provide the user with the perception of being in an environment other than reality.
  • a field of view presents to the user scenes from a virtual reality world.
  • Virtual reality systems or devices can use an opaque background for the displayed fields of view or use a transparent background so that the field of view is overlaid on the user's view of the real world.
  • Virtual reality systems or devices can also acquire a video stream of the real world and superimpose objects and people on the video stream representing the real world. These latter two schemes can be called augmented reality.
  • Examples of augmented virtual reality systems or devices providing perception of motion include car racing simulators, flight simulators, video games and video conferencing systems.
  • Virtual reality systems or devices can permit a user to simulate driving vehicles, flying airplanes, exploring alien worlds or being at a simulated meeting with participants from different parts of the world without any of the participants leaving home, for example.
  • the fields of view that comprise the virtual reality world can be arranged to provide the user with the perception of being in a virtual world.
  • the fields of view can change according to the simulated physical dynamics of the world being simulated. For example, in a driving or flying simulation, the fields of view will change according to the simulated motion of the vehicle or airplane. Fields of view can also be changed by the user interacting with a controller, for example. Many video games are controlled by a handheld controller that includes buttons and switches that can change the point of view of the user in the virtual world and hence the fields of view displayed.
  • the displays of some virtual reality systems or devices include a virtual reality headset, for example.
  • Accelerometers can be used in a virtual reality headset to detect the location and attitude of the headset and thereby control the field of view to track the user's head motions and arrange the field of view accordingly.
  • Virtual reality systems or devices can include other types of displays, such as a stationary screen in front of the user not worn on a headset, multiple stationary screens surrounding the user, screens placed on lenses worn on the user's eyes, or hologram images projected around the user. None of these ways of controlling field-of-view selection can display fields of view that reflect the stress of the user. In real life, a person affected by stress is often more alert to cues in the immediate real-world environment that can indicate whether the stress was justified.
  • the invention provides a computer-based hardware and software system or device comprising a detection unit capable of detecting at least one emotional cue variable of a subject, an integration unit capable of collecting input data from said detection unit and processing said input relating to said cue variable into output data, an effecting unit capable of collecting output data from said integration unit and processing said output into an actuator response capable of being detected by said subject
  • said subject is also provided with a system or device for providing fields of view allowing augmented or virtual reality detection by a subject (a virtual reality system or device) comprising a detection unit capable of detecting at least one stress variable of a subject (a stress sensor) and an integration unit capable of collecting input data from said detection unit and processing said input relating to said cue variable into output data, with an effecting unit capable of collecting and processing output data from said integration unit and providing output data affecting a change in field of view (a controller).
  • the invention provides a virtual reality computing device or system that tracks emotion or stress parameters such as heart rate variations or respiratory rate variations.
  • stress detection is in one embodiment provided by a stress sensor that is located in a breast band or glove or other fixative element relative to a portion of a user's skin, in another embodiment such a stress sensor may be incorporated into the VR headset relative to a portion of a user's skin, or both.
  • the system or device includes a controller configured to identify differences between various stress levels, such as heart rate variations or respiratory rate variations, and to determine output related to changes in fields of view configured to reflect changes in the stress level of the user and feed these back to this user, or to (an)other user of the game. In this way, the augmented or virtual reality may be adjusted to the stress level(s) of its user(s).
  • the invention provides a virtual reality computing device or system comprising a heart rate variation (HRV) sensor with a breast band coupled to a virtual reality console, configured to capture a plurality of stress levels, and a controller configured to identify differences between some of the stress levels, the differences corresponding to differences in the overall stress state of a user of the device or system, and to determine a fields-of-view response based in part on the identified differences in stress level; a sketch follows.
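  • a sketch of this HRV-driven fields-of-view response (the RMSSD statistic, the thresholds and the mapping are illustrative assumptions; no particular stress metric is prescribed):

```python
# Estimate stress from heart-rate variability (RMSSD over R-R intervals
# from a chest-band sensor) and derive a field-of-view change.
# Thresholds, units and the mapping are illustrative assumptions.
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive R-R interval differences (ms)."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def fov_response(previous_rmssd, current_rmssd):
    # lower RMSSD is commonly associated with higher stress/arousal
    if current_rmssd < 0.8 * previous_rmssd:
        return "narrow_fov"  # e.g. calm the scene, dim the periphery
    if current_rmssd > 1.2 * previous_rmssd:
        return "widen_fov"
    return "no_change"

baseline = rmssd([810, 795, 820, 805, 790])  # relaxed sample (ms)
stressed = rmssd([640, 642, 638, 641, 639])  # low-variability sample
print(fov_response(baseline, stressed))      # -> narrow_fov
```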
  • HRV heart rate variation
  • the invention provides a computer-based hardware and software system or device comprising a detection unit capable of detecting at least one emotional cue variable of a subject, an integration unit capable of collecting input data from said detection unit and processing said input relating to said cue variable into output data, an effecting unit capable of collecting output data from said integration unit and processing said output into an actuator response capable of being detected by said subject wherein said response is provided in near-real-time or provided in real-time, wherein said effecting unit provides an actuator response including a physical notification of said subject.
  • the invention provides a computer-based hardware and software system or device comprising a detection unit capable of detecting at least one emotional cue variable of a subject, an integration unit capable of collecting input data from said detection unit and processing said input relating to said cue variable into output data, an effecting unit capable of collecting output data from said integration unit and processing said output into an actuator response capable of being detected by said subject wherein said response is provided in near-real-time or provided in real-time, wherein said effecting unit provides an actuator response that moves the furniture at which or wherein said subject is seated.
  • the invention provides a computer-based hardware and software system or device comprising a detection unit capable of detecting at least one emotional cue variable of a subject, an integration unit capable of collecting input data from said detection unit and processing said input relating to said cue variable into output data, an effecting unit capable of collecting output data from said integration unit and processing said output into an actuator response capable of being detected by said subject wherein said response is provided in near-real-time or provided in real-time, wherein said effecting unit provides an actuator response that includes a change of lighting.
  • the invention provides a computer-based hardware and software system or device comprising a detection unit capable of detecting at least one emotional cue variable of a subject, an integration unit capable of collecting input data from said detection unit and processing said input relating to said cue variable into output data, an effecting unit capable of collecting output data from said integration unit and processing said output into an actuator response capable of being detected by said subject wherein said response is provided in near-real-time or provided in real-time, wherein said effecting unit provides a system or device response that includes a sound.
  • the invention provides a computer-based hardware and software system or device comprising a detection unit capable of detecting at least one emotional cue variable of a subject, an integration unit capable of collecting input data from said detection unit and processing said input relating to said cue variable into output data, an effecting unit capable of collecting output data from said integration unit and processing said output into an actuator response capable of being detected by said subject wherein said response is provided in near-real-time or provided in real-time, wherein said effecting unit provides a system or device response that includes a change of temperature.
  • the invention provides a computer-based hardware and software system or device comprising a detection unit capable of detecting at least one emotional cue variable of a subject, an integration unit capable of collecting input data from said detection unit and processing said input relating to said cue variable into output data, an effecting unit capable of collecting output data from said integration unit and processing said output into an actuator response capable of being detected by said subject wherein said response is provided in near-real-time or provided in real-time, wherein said effecting unit provides an actuator response that includes a gust of air.
  • the invention provides a computer-based hardware and software system or device comprising a detection unit capable of detecting at least one emotional cue variable of a subject, an integration unit capable of collecting input data from said detection unit and processing said input relating to said cue variable into output data, an effecting unit capable of collecting output data from said integration unit and processing said output into an actuator response capable of being detected by said subject wherein said response is provided in near-real-time or provided in real-time, wherein said effecting unit provides a system or device response that includes a smell.
  • the invention provides a computer-based hardware and software system or device comprising a detection unit capable of detecting at least one emotional cue variable of a subject, an integration unit capable of collecting input data from said detection unit and processing said input relating to said cue variable into output data, an effecting unit capable of collecting output data from said integration unit and processing said output into an actuator response capable of being detected by said subject wherein said response is provided in near-real-time or provided in real-time, wherein said effecting unit provides a system or device response that includes projection of an image.
  • the invention provides a computer-based hardware and software system or device comprising a detection unit capable of detecting at least one emotional cue variable of a subject, an integration unit capable of collecting input data from said detection unit and processing said input relating to said cue variable into output data, an effecting unit capable of collecting output data from said integration unit and processing said output into a system or device response capable of being detected by said subject wherein said response is provided in near-real-time or provided in real-time wherein said detection unit is capable of detecting at least one emotional cue variable of at least two subjects.
  • the invention provides a computer-based hardware and software system or device comprising a detection unit capable of detecting at least one emotional cue variable of a subject, an integration unit capable of collecting input data from said detection unit and processing said input relating to said cue variable into output data, an effecting unit capable of collecting output data from said integration unit and processing said output into a system or device response capable of being detected by said subject wherein said response is provided in near-real-time or provided in real-time wherein said detection unit is capable of detecting at least one emotional cue variable of at least three subjects.
  • the invention also provides a machine-readable medium storing the software capable of performing the detecting, collecting and processing functions of a computer-based hardware and software system or device comprising a detection unit capable of detecting at least one emotional cue variable of a subject, an integration unit capable of collecting input data from said detection unit and processing said input relating to said cue variable into output data, and an effecting unit capable of collecting output data from said integration unit and processing said output into an actuator response capable of being detected by said subject, wherein said response is provided in near-real-time or provided in real-time.
  • the invention also provides a computer (or server) provided with the software of a computer-based hardware and software system or device comprising a detection unit capable of detecting at least one emotional cue variable of a subject, an integration unit capable of collecting input data from said detection unit and processing said input relating to said cue variable into output data, and an effecting unit capable of collecting output data from said integration unit and processing said output into an actuator response capable of being detected by said subject, wherein said response is provided in near-real-time or provided in real-time.
  • the invention also provides use of a computer-based hardware and software system or device comprising a detection unit capable of detecting at least one emotional cue variable of a subject, an integration unit capable of collecting input data from said detection unit and processing said input relating to said cue variable into output data, and an effecting unit capable of collecting output data from said integration unit and processing said output into an actuator response capable of being detected by said subject, wherein said response is provided in near-real-time or provided in real-time, in psychotherapy.
  • the invention also provides use of a machine-readable medium storing the software capable of performing the detecting, collecting and processing functions of a computer-based hardware and software system or device comprising a detection unit capable of detecting at least one emotional cue variable of a subject, an integration unit capable of collecting input data from said detection unit and processing said input relating to said cue variable into output data, and an effecting unit capable of collecting output data from said integration unit and processing said output into an actuator response capable of being detected by said subject, wherein said response is provided in near-real-time or provided in real-time, in psychotherapy.
  • the invention also provides use of a computer (or server) provided with the software of a computer-based hardware and software system or device comprising a detection unit capable of detecting at least one emotional cue variable of a subject, an integration unit capable of collecting input data from said detection unit and processing said input relating to said cue variable into output data, an effecting unit capable of collecting output data from said integration unit and processing said output into an actuator response capable of being detected by said subject wherein said response is provided in near-real-time or provided in real-time in psychotherapy.
  • the invention also provides a method for providing a subject with psychotherapy comprising detecting at least one emotional cue variable of said subject with a detection unit, collecting input data from said detection unit and processing said input relating to said cue variable into output data, collecting output data from said integration unit and processing said output in an effecting unit into an actuator response capable of being detected by said subject. It is preferred that said response is provided in near-real-time or in real-time.
  • the invention also provides a method for providing a subject with psychotherapy comprising detecting at least one emotional cue variable of said subject with a detection unit, collecting input data from said detection unit and processing said input relating to said cue variable into output data, collecting output data from said integration unit and processing said output in an effecting unit, providing output data to an actuator unit providing fields of view allowing augmented or virtual reality detection by a subject, affecting a detection of a change in motion by said subject by said actuator unit, preferably wherein said change in motion is provided in near-real-time or real-time.
  • said detection unit is capable of detecting at least one emotional cue variable selected from the group of heart rate, heart rate variability, galvanic skin response, galvanic skin response variability, breathing rate, breathing rate variability, core temperature, skin temperature, skin temperature variability, electro-myography, electromyography variability, electro-encephalography, electro-encephalography variability, electro-cardiography, electro-cardiography variability, photoplethysmography, photoplethysmography variability, goose bumps, goose bumps variability, posture, posture variability, body movement, body movement variability, eye movement, eye movement variability, pupil size, pupil size variability, hand movement, hand movement variability, facial expression, facial expression variability, speech, speech variability, sound and sound variability.
  • the invention also provides a method for providing a subject with psychotherapy comprising detecting at least one emotional cue variable of a subject, an integration unit capable of collecting input data from said detection unit and processing said input relating to said cue variable into output data, an effecting unit capable of collecting output data from said integration unit and processing said output into an actuator response capable of being detected by said subject wherein said response is provided in near-real-time or provided in real-time, and wherein said detection unit is capable of detecting at least one, preferably at least two, more preferably at least three emotional cue variables selected from the group of heart rate, galvanic skin response, electro-cardiography, photoplethysmography, posture, eye movement, pupil size, facial expression, and sound.
  • the invention also provides a method of doing business comprising the steps of teaching a group of at least two, preferably at least three subjects socio-communicative skills and charging a fee for said teaching, said method further comprising detecting at least one emotional cue variable of at least one subject with a detection unit, collecting input data from said detection unit and processing said input relating to said cue variable into output data, collecting output data from said integration unit and processing said output in an effecting unit into an actuator response capable of being detected by said subject. It is preferred that said response is provided in near-real-time or in real-time.
  • said detection unit is capable of detecting at least one emotional cue variable selected from the group of heart rate, heart rate variability, galvanic skin response, galvanic skin response variability, breathing rate, breathing rate variability, core temperature, skin temperature, skin temperature variability, electro-myography, electro-myography variability, electro-encephalography, electro-encephalography variability, electro-cardiography, electro-cardiography variability, photoplethysmography, photoplethysmography variability, goose bumps, goose bumps variability, posture, posture variability, body movement, body movement variability, eye movement, eye movement variability, pupil size, pupil size variability, hand movement, hand movement variability, facial expression, facial expression variability, speech, speech variability, sound and sound variability.
  • the invention also provides a method of doing business comprising the steps of teaching a group of at least two, preferably at least three subjects socio-communicative skills and charging a fee for said teaching, said method further comprising detecting at least one emotional cue variable of a subject, an integration unit capable of collecting input data from said detection unit and processing said input relating to said cue variable into output data, and an effecting unit capable of collecting output data from said integration unit and processing said output into an actuator response capable of being detected by said subject, wherein said response is provided in near-real-time or provided in real-time, and wherein said detection unit is capable of detecting at least one, preferably at least two, more preferably at least three emotional cue variables selected from the group of heart rate, galvanic skin response, electro-cardiography, photoplethysmography, posture, eye movement, pupil size, facial expression, and sound.
  • Figure 1 One of the most common frameworks in the emotions field proposes that two main dimensions best characterize emotional cues: arousal and valence.
  • the dimension of valence ranges from highly positive to highly negative, whereas the dimension of arousal ranges from calming or soothing (low) to exciting or agitating (high).
  • Figure 2 A diagram of the architecture of a system or device according to the invention equipped with one or more cameras, one or more sensors and one or more microphones, a detection unit
  • an integration unit for processing data from subject(s)
  • effector unit for providing effects and control
  • an actuator unit for moving subject(s).
  • Figure 3 A sketch of a table useful in a type 3 multiple-person system or device allowing moving subjects into or out of groups depending on emotional cue detection.
  • Facial expression recognition has been a highly researched topic in the field of biometrics for decades. It has been used with the intent of both identifying specific individuals and understanding human relations and communication. A potential case study with this type of software is in medicine and geriatric care. Patients in these situations may not always be able to communicate their state of being to a care provider. Humans have been studying themselves for a long time, and the description of facial features is no exception. The measurement, collection, and systematic analysis of facial expression has been a focus of study since the initial publication by Paul Ekman and Wallace V. Friesen in 1976, almost half a century ago. The specific method and deliberate analysis of such features are commonly known as the Facial Action Coding System (FACS), originally created by P. Ekman.
  • FACS Facial Action Coding System
  • Facial expressions are a gateway into the human mind, emotion, and identity. They are a way for us to relate with each other and to share understanding and compassion. They are also a way for us to express pain, grief, remorse, and lack of understanding. These characteristics can be crucial to understand when working with patients, especially patients who are unable to communicate in other ways, such as post-stroke patients and those suffering from dementia or Alzheimer's disease.
  • iMotions A useful biometric research platform software system or device called "iMotions" is used herein, supplemented with customized software (iMotions, Copenhagen, Denmark).
  • the software can combine detection of emotional cues such as "eye tracking, facial expression analysis, EEG, GSR, EMG, ECG and Surveys"
  • the platform is generally used for various types of academic and business-related research.
  • GSR Module of iMotions is a plug & play integration with GSR devices that delivers real time sensor output (emotional reactions) in the user interface and in synchronized raw data export.
  • iMotions also provides an open application programming interface (API) to allow integration of other sensors or detection means to forward data into the software, thereby allowing multi-modal-cue and/or multi- modal-subject detection.
  • API application programming interface
  • Multi-modal integration with other sensors like EEG, EMG, ECG.
  • The GSR solution allows analysing the different emotions of various people and responding with various effects; a minimal sketch of forwarding such sensor data follows below.
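As an illustration of such multi-modal integration, the following hedged sketch shows how an external sensor might forward timestamped samples to a research platform over a local network connection. The host, port and JSON line format are assumptions made for the sketch and are not the documented iMotions API protocol.

```python
# Hedged sketch of multi-modal sensor forwarding: an external GSR/ECG
# sensor pushes timestamped samples to a research platform's open API.
# The endpoint and line format below are illustrative assumptions and
# NOT the documented iMotions protocol.

import json
import socket
import time

def forward_sample(sock: socket.socket, modality: str, value: float) -> None:
    """Send one timestamped sample as a JSON line (assumed format)."""
    message = {"modality": modality, "t": time.time(), "value": value}
    sock.sendall((json.dumps(message) + "\n").encode("utf-8"))

def main() -> None:
    # Hypothetical endpoint of the receiving research platform.
    with socket.create_connection(("127.0.0.1", 9000)) as sock:
        forward_sample(sock, "GSR", 2.34)  # e.g. skin conductance in microsiemens
        forward_sample(sock, "ECG", 72.0)  # e.g. heart rate in beats per minute

if __name__ == "__main__":
    main()
```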
  • Another useful biometric research platform software system or device is provided by Noldus Information Technology bv, Wageningen, The Netherlands. FaceReader automatically analyses 6 basic facial expressions, as well as neutral and contempt. It also calculates gaze direction, head orientation, and person characteristics. The Project Analysis Module is ideal for advanced analysis and reporting: you quickly gain insight into the effects of different stimuli. Analysis of Action Units is available. Yet another useful biometric research platform software system or device, called "CrowdSight", is used herein, supplemented with customized software. The CrowdSight Software Development Kit (SDK) is a flexible and easy-to-integrate crowd face analysis software which allows gathering real-time, anonymous information about people while they behave spontaneously in different life environments. It detects emotional reactions and engagement. CrowdSight works offline as well as online on the most popular desktop and mobile platforms (Windows, Mac, Linux, iOS, Android).
  • Galvanic Skin Response is measured by another biophysical sensor, which determines human skin resistance under different psychological conditions.
  • the GSR is an older term for electro dermal activity (EDA), which is simply the electrical conductance of the skin.
  • EDA electro dermal activity
  • These sensors also detect an increase in physical attributes marking a state of being, including heart rate and sweat measurements. Sweat glands are controlled by the sympathetic nervous system. A change in the electrical resistance of the skin is a physiochemical response to emotional stimulation and reflects increased sympathetic nervous system activity.
  • GSR is a method of measuring the electrical resistance of the skin, which varies with its moisture level. Together with other sensors, these devices can help determine wellness and emotional responses to external stimuli. Typical emotional cues detectable by Affectiva Facial Expression Emotion Analysis with the iMotions Core License are listed below; the list can be extended.
  • Valence: a measure of the positive or negative nature of the recorded person's emotional experience. The range of values for the metric is arbitrarily set between -100 and 100. Arousal: a measure of the excitation nature of the recorded person's emotional experience. The range of values for the metric is arbitrarily set between -100 and 100.
  • For the emotion metrics, the range of values is between 0 and 100.
  • Emotion metric scores indicate when users express a specific emotion, along with the degree of confidence.
  • The metrics can be thought of as detectors: as the emotion occurs, the score rises from 0 (no expression) to 100 (expression fully present).
  • Facial Expressions Attention, Brow Furrow, Brow Raise, Inner Brow Raise, Eye Closure, Nose Wrinkle, Upper Lip Raise, Lip Suck, Lip Pucker, Lip Press, Mouth Open, Lip Corner Depressor, Chin Raise, Smirk, Smile.
  • Expression metrics, also known as Action Units (AUs) in the FACS methodology: scores indicate when users make a specific expression (e.g., a smile), along with the degree of confidence.
  • The metrics can be thought of as detectors: as the facial expression occurs and becomes more apparent, the score rises from 0 (no expression) to 100 (expression fully present). A minimal detector sketch follows below.
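A hedged sketch of this detector behaviour: a per-frame 0..100 score is smoothed over a short window, and the expression is reported as present once the smoothed score crosses a threshold. The window size and threshold are illustrative assumptions, not values prescribed by the platform.

```python
# Minimal sketch of treating the 0..100 expression metrics as
# "detectors": a score is smoothed over recent frames and an expression
# is reported as present once the smoothed score crosses a threshold.
# The window size and threshold are illustrative assumptions.

from collections import deque

class ExpressionDetector:
    def __init__(self, threshold: float = 50.0, window: int = 10):
        self.threshold = threshold
        self.scores = deque(maxlen=window)  # most recent frame scores

    def update(self, score: float) -> bool:
        """Feed one frame's 0..100 score; return True when the
        smoothed score indicates the expression is present."""
        self.scores.append(score)
        smoothed = sum(self.scores) / len(self.scores)
        return smoothed >= self.threshold

smile = ExpressionDetector()
for frame_score in [5, 20, 60, 80, 90]:
    print(smile.update(frame_score))  # False until the average crosses 50
```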
  • Interocular Distance: the distance between the two outer eye corners.
  • Head Orientation: an estimation of the head position in 3-D space in Euler angles (pitch, yaw, roll).
  • GSR Galvanic Skin Response
  • ECG electrocardiography
  • PPG photoplethysmography
  • EEG electroencephalography
  • EMG electromyography
  • Philips HUE light effect geared to respond to a negative low state, optionally accommodating four different light effects, each relating to one of the four different valence and arousal states of figure 1.
  • The type 1 computer-based hardware and software emotion-effect system or device provided herein is specifically developed for one-to-one psychotherapy sessions of a therapist with a single subject (client or patient).
  • Such subjects are typically selected from patients diagnosed with ASD, ADHD, Parkinson's disease, dementias, borderline personality disorders, bipolar disorders, schizophrenias and the like.
  • Emotional cues are detectable by facial emotion recognition software using a camera directed at each subject's face, optionally provided with galvanic skin resistance detection of each of the subjects in therapy.
  • Philips HUE light effect geared to respond to a negative low state, optionally accommodating four different light effects, each relating to one of the four different valence and arousal states of figure 1.
  • The type 2 computer-based hardware and software emotion-effect system or device provided herein is specifically developed for couple-related psychotherapy sessions of a therapist with two subjects (clients or patients) that may benefit from learning about the emotional cues they each project. Such subjects are typically selected from clients wishing to engage in relationship therapy or couple therapy with the help of a therapist.
  • A smartphone device or webcam may be used.
  • Processing: determining emotional valence and arousal levels of both subjects and applying Gottman Algorithm Software, wherein a negative low emotional valence/arousal state is learned to be met by responding with at least 1, preferably at least 3, more preferably at least 5 positive high emotional valence/arousal state responses, based on learning by effects generated by the system or device in reaction to emotional cues displayed by each of the subjects (see the sketch below).
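The sketch below illustrates the 1-to-5 response policy just described: each detected negative low valence/arousal state is queued to be met by a configurable number of positive high valence/arousal actuator responses. The class and effect names are illustrative assumptions, not the Gottman Algorithm Software itself.

```python
# Hedged sketch of the 1-to-5 response policy: each detected negative
# low valence/arousal state queues a configurable number of positive
# actuator responses. Effect names and the default ratio are
# illustrative, not the patented algorithm.

class GottmanResponsePolicy:
    def __init__(self, positives_per_negative: int = 5):
        self.ratio = positives_per_negative
        self.pending_positives = 0

    def observe(self, valence: float, arousal: float) -> list[str]:
        """Consume one detected state; return actuator effects to emit."""
        if valence < 0 and arousal < 0:          # negative low state
            self.pending_positives += self.ratio
        effects = []
        if self.pending_positives > 0:
            effects.append("positive_light_effect")  # e.g. a warm HUE scene
            self.pending_positives -= 1
        return effects

policy = GottmanResponsePolicy()
print(policy.observe(-30, -60))  # triggers the first of 5 positive effects
```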
  • Such subjects are typically selected from clients wishing to practice relationship therapy or couple therapy in a private setting such as their home.
  • Philips HUE light effects geared to respond to a negative low state, optionally accommodating four different light effects, each relating to one of the four different valence and arousal states of figure 1.
  • Alternative outputs may be generated in the type 2Privateweb version via smartphone connection or computer/webcam connection.
  • The type 2Privateweb computer-based hardware and software emotion-effect system or device provided herein is specifically developed for couple-related sessions without a therapist, with persons that may benefit from learning from each other about the emotional cues they each project, wherein the subjects and the system or device interact via internet communication and webcam detection.
  • Database facilities may be provided in the Cloud.
  • Multiple-person system or device (2 or more subjects - optionally at least one therapist).
  • The type 3 computer-based hardware and software emotion-effect system or device provided herein is specifically developed to be used in group sessions, with or without a therapist, wherein at least two subjects (clients or patients) may benefit from learning about the emotional cues they each display, to improve their socio-communicative skills.
  • Philips HUE light effect geared to respond to a negative low state, optionally accommodating four different light effects, each relating to one of the four different valence and arousal states of figure 1.
  • Lights respond to impulses from the effecting unit by going dark or changing color (group effect or personal effect).
  • Vibrating wristbands or other on-body devices respond to impulses from the effecting unit.
  • Wristbands or other devices can neutralize or change sounds in the room, turning the volume up or down.
  • The table projection shows a preferred breathing rate, for example 5 seconds in and 5 seconds out; a minimal pacer sketch follows below.
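A minimal sketch of such a breathing pacer, alternating "breathe in" and "breathe out" cues every 5 seconds (six breaths per minute); the console output here stands in for the table projection.

```python
# Minimal sketch of a paced-breathing cue: a timer alternates
# "breathe in" / "breathe out" every 5 seconds. Printing to the
# console stands in for the table projection described above.

import itertools
import time

def breathing_pacer(seconds_in: float = 5.0, seconds_out: float = 5.0,
                    cycles: int = 3) -> None:
    phases = itertools.cycle([("breathe in", seconds_in),
                              ("breathe out", seconds_out)])
    for _ in range(cycles * 2):   # two phases per breath cycle
        cue, duration = next(phases)
        print(cue)
        time.sleep(duration)

breathing_pacer()
```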
  • Virtual reality (VR) systems or devices can present fields of view to a user via a display to provide the user with the perception of being in an environment other than reality.
  • a field of view presents to the user scenes from a virtual reality world.
  • Virtual reality systems or devices can use an opaque background for the displayed fields of view, or use a transparent background so that the field of view is overlaid on the user's view of the real world.
  • Virtual reality systems or devices can also acquire a video stream of the real world and superimpose objects and people on the video stream representing the real world. These latter two schemes can be called augmented reality.
  • Examples of virtual reality systems or devices include car racing simulators, flight simulators, video games and video conferencing systems.
  • Virtual reality systems or devices can permit a user to simulate driving vehicles, flying airplanes, exploring alien worlds or attending a simulated meeting with participants from different parts of the world, without any of the participants leaving home, for example.
  • the fields of view that comprise the virtual reality world can be arranged to provide the user with the perception of being in a virtual world.
  • The fields of view can change according to the simulated physical dynamics of the world being simulated. For example, in a driving or flying simulation, the fields of view will change according to the simulated motion of the vehicle or airplane. Fields of view can also be changed by the user interacting with a controller, for example. Many video games are controlled by a handheld controller that includes buttons and switches that can change the point of view of the user in the virtual world and hence the fields of view displayed.
  • The display of some virtual reality systems or devices includes a virtual reality headset, for example.
  • Accelerometers can be used in a virtual reality headset to detect the location and attitude of the headset and thereby control the field of view to track the user's head motions and arrange the field of view accordingly.
  • Virtual reality systems or devices can include other types of displays, such as a stationary screen in front of the user not worn on a headset, multiple stationary screens surrounding the user, screens placed on lenses worn on the user's eyes, or hologram images projected around the user. None of these ways of controlling the field of view can display fields of view that reflect the stress of the user. In real life, a person affected by stress is often more alert to cues in the immediate real-world environment that can indicate whether the stress was justified or not.
  • The invention therefore provides a computer-based hardware and software system or device for providing fields of view allowing augmented or virtual reality detection by a subject, comprising a detection unit capable of detecting at least one stress variable of a subject (a stress sensor), an integration unit capable of collecting input data from said detection unit and processing said input relating to said stress variable into output data, and an effecting unit capable of collecting and processing output data from said integration unit and providing output data effecting a change in field of view (a controller).
  • The invention provides a virtual reality computing device or system wherein stress detection by the virtual reality system or device worn by a subject moves or adapts the augmented or virtual reality fields of view perceived by said subject.
  • The invention provides a virtual reality computing device or system that tracks stress parameters such as heart rate variations or respiratory rate variations.
  • Stress detection is in one embodiment provided by a stress sensor that is located in a breast band, glove or other fixative element held against a portion of a user's skin; in another embodiment such a stress sensor may be
  • The system or device includes a controller configured to identify differences between various stress levels, such as heart rate variations or respiratory rate variations, to determine output related to changes in fields of view configured to reflect changes in the stress level of the user, and to feed these back to this user or to another user of the game. In this way, the augmented or virtual reality may be adjusted to the stress level(s) of its user(s).
  • The invention provides a virtual reality computing device or system comprising a heart rate variation (HRV) sensor with a breast band coupled to a virtual reality console, configured to capture a plurality of stress levels, and a controller configured to identify differences between some of the stress levels, the differences corresponding to differences in the overall stress state of a user of the device or system, and to determine a fields-of-view response based in part on the identified differences in stress level; a minimal controller sketch follows below.
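A hedged sketch of such a controller: successive HRV readings from the breast-band sensor are compared, and the difference is mapped to a field-of-view adjustment. The specific mapping (narrowing the view as HRV drops) is an illustrative assumption, not the response prescribed by the invention.

```python
# Hedged sketch of an HRV-driven field-of-view controller: successive
# HRV readings are compared and the delta adjusts the rendered FOV.
# Narrowing the view as HRV drops (stress rising) is an illustrative
# assumption, not the patent's prescribed response.

class StressFovController:
    def __init__(self, base_fov_deg: float = 110.0):
        self.base_fov = base_fov_deg
        self.previous_hrv = None  # last HRV reading, e.g. RMSSD in ms

    def update(self, hrv_ms: float) -> float:
        """Take an HRV reading; lower HRV is read as higher stress.
        Returns the field of view (in degrees) to render."""
        if self.previous_hrv is not None:
            delta = hrv_ms - self.previous_hrv
            # Narrow the view slightly when HRV drops, widen when it rises,
            # clamped to a plausible headset range.
            self.base_fov = max(70.0, min(110.0, self.base_fov + 0.2 * delta))
        self.previous_hrv = hrv_ms
        return self.base_fov

controller = StressFovController()
for reading in [60.0, 55.0, 40.0, 45.0]:
    print(round(controller.update(reading), 1))  # 110.0, 109.0, 106.0, 107.0
```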
  • HRV heart rate variation
  • A computing device in one example can be connected to a stress-detecting device equipped to detect stress levels or parameters of the subject using the virtual reality system or device; additionally, it can include an internal configuration of hardware including a processor such as a central processing unit (CPU) and digital data storage exemplified by memory.
  • The CPU can be a controller for controlling the operations of the computing device, and may be a microprocessor, digital signal processor, field programmable gate array or the like.
  • The CPU can be connected to memory by a memory bus, wires, cables, a wireless connection, or any other connection, for example.
  • Memory may be or include read-only memory (ROM), random access memory (RAM), optical storage, magnetic storage such as disk or tape, non-volatile memory cards, cloud storage, or any other suitable digital data storage device or combination of devices.
  • ROM read-only memory
  • RAM random access memory
  • Memory can store data and program instructions that are used by CPU. Program instructions may be altered when stress levels alter.
  • Other suitable implementations of computing device are possible. For example, the processing of computing device can be distributed among multiple devices communicating over multiple networks.
  • a virtual reality computing and stress detecting device can include a virtual reality (VR) headset, which can be worn by a user to facilitate experiencing the virtual reality system or device.
  • Virtual reality computing device can also include a computer, a mobile device, a server, or any combination thereof.
  • a VR headset can constitute a display of the virtual reality system or device, wherein the display outputs data indicative of a field of view according to the user's stress.
  • A VR headset can use video display technology to create displays or fields of view that effectively cover the user's visual field. When wearing a VR headset, a user's entire visual perceptual field can be supplied as successive fields of view by the virtual reality system or device, thereby producing the effect of viewing scenes from a virtual world.
  • a VR headset can also be equipped with accelerometers, for example, that can measure the location and attitude of the VR headset and thereby the location and attitude of the user's head.
  • all or a portion of implementations of the present invention can take the form of a computer program product accessible from, for example, a computer-usable or computer-readable medium.
  • a computer-usable or computer-readable medium can be any device that can, for example, tangibly contain, store, communicate, or transport the program for use by or in connection with any processor.
  • the medium can be, for example, an electronic, magnetic, optical, electromagnetic, or a semiconductor device. Other suitable mediums are also available.
  • Stressjam is a native virtual reality serious game, provided with a stress sensor to detect, for example, heart rate variation or respiratory rate variation as measures of stress, and written for the HTC VIVE VR headset.
  • the VIVE set-up is provided with a headset.
  • HTC Vive is a virtual reality headset developed by HTC and Valve Corporation, released on 5 April 2016. The headset is designed to utilise "room scale" technology to turn a room into 3D space via sensors, with the virtual world allowing the user to navigate naturally, with the ability to walk around and use motion tracked handheld controllers to vividly manipulate objects, interact with precision, communicate and experience immersive environments.
  • the Vive has a refresh rate of 90 Hz.
  • the device uses two screens, one per eye, each having a display resolution of 1080x1200.
  • The device uses more than 70 sensors, including a MEMS (microelectromechanical systems) gyroscope, accelerometer and laser position sensors, and is said to operate in a 15-by-15-foot (4.6 by 4.6 m) tracking space if used with both "Lighthouse" base stations that track the user's movement with sub-millimetre precision.
  • The Lighthouse system uses simple photosensors on any object that needs to be captured; to avoid occlusion problems, this is combined with two lighthouse base stations that sweep structured-light lasers within a space.
  • The front-facing camera allows the software to identify any moving or static objects in a room; this functionality can be used as part of a "Chaperone" safety system, which will automatically display a feed from the camera to the user to safely guide users away from obstacles.
  • Binaural or 3D audio can be used by app and game developers to tap into VR headsets' head-tracking technology to take advantage of this and give the wearer the sense that sound is coming from behind, to the side of them or in the distance.
  • The Zephyr™ BioPatch™ HP monitoring device measures and transmits live physiological data on heart rate variation through protocols like ECHO, Bluetooth or USB to the HTC VIVE VR headset.
  • In Stressjam, Bluetooth is used to acquire the live raw data from the Zephyr™ HRV sensor and run it through smart algorithms on the computing device within the HTC VIVE VR headset, creating an additional variable that can be fed to the game.
  • The game is designed to adjust itself based on this real-time personal feedback regarding detected stress levels. In this way, the VR game is adjusted to the stress levels of the user. In some cases you'll need to calm yourself to be able to play a certain part of the game, but in other cases you'll need to trigger your stress response to overcome a part of the gameplay; a minimal sketch of this data path follows below.
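A hedged sketch of this data path: raw RR intervals from the chest sensor are reduced to RMSSD (a common HRV measure), and the resulting stress estimate selects whether the gameplay currently rewards calming down or triggering a stress response. The threshold values and mode names are illustrative assumptions, not Stressjam's actual algorithms.

```python
# Hedged sketch of the HRV-to-gameplay data path: RR intervals are
# reduced to RMSSD, and the stress estimate selects the current game
# mode. Thresholds and mode names are illustrative assumptions.

import math

def rmssd(rr_intervals_ms: list[float]) -> float:
    """Root mean square of successive differences of RR intervals."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def gameplay_mode(rr_intervals_ms: list[float]) -> str:
    value = rmssd(rr_intervals_ms)
    if value < 20.0:    # low HRV: player is stressed -> reward calming down
        return "calm-down challenge"
    if value > 60.0:    # high HRV: player is relaxed -> ask for activation
        return "stress-activation challenge"
    return "free play"

print(gameplay_mode([810, 795, 820, 805, 790]))  # -> "calm-down challenge"
```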
  • The game is built around training levels, which will be expanded in future development. Training levels need to be completed before entering the next level, and you'll be able to collect energy points along the way.
  • The training levels are based on ground-breaking research by Harvard University and Stanford University on mindset and stress.
  • Practising under stress is a key training tool for athletes, emergency responders, professional musicians, artists, astronauts and others who have to deliver under stress, because of the rewiring and stress-inoculation effect this produces.
  • The Olympic skaters of the famous Dutch speed-skating coach Jillert Anema train themselves with a specific stress-training tool to reach the top. Stress is no longer seen as the enemy, but as an important friend on the road to success.
  • Stressjam is an award-winning, innovative health tech solution that uniquely combines virtual reality, biofeedback technology and applied games to provide a fully personalized digital coach that trains players to regulate their own stress system and to develop a new stress mindset in which stress can also be healthy.
  • The player undergoes a lifelike, interactive virtual reality experience on a tropical island. The experience is fully personalized by using a biosensor on the chest, and the player has only one superpower: his or her own stress system.
  • A truly engaging game that leads to a high level of personal involvement is provided.
  • A training tool that scores an A on usability and learnability is provided.
  • A training tool with 'duration of game-play' as a distinctive feature is provided.
  • Another relevant direction is to study whether it is possible, by playing Stressjam for a longer period of time, to stimulate the vagal afferent system so as to have positive effects on disorders of negative affectivity and on physiological health.
  • The Stress Mindset Measure contains a scale that ranges from 0 (negative attitude towards stress) to 4 (positive attitude towards stress); a minimal scoring sketch follows below.
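A minimal scoring sketch for such a 0..4 scale, assuming the published 8-item form of the instrument in which the negatively framed ("stress is debilitating") items are reverse-coded; the item indices and example responses are assumptions for illustration, not taken from this document.

```python
# Hedged sketch of scoring the Stress Mindset Measure: responses on a
# 0..4 scale are averaged after reverse-coding the negatively framed
# items, yielding 0 (negative attitude towards stress) to 4 (positive
# attitude towards stress). The 8-item layout and which items are
# reverse-coded are ASSUMPTIONS about the published instrument.

def smm_score(responses: list[int], reverse_items=(0, 2, 4, 6)) -> float:
    """Mean mindset score over all items, reverse-coding as assumed."""
    if any(r < 0 or r > 4 for r in responses):
        raise ValueError("responses must be on the 0..4 scale")
    adjusted = [4 - r if i in reverse_items else r
                for i, r in enumerate(responses)]
    return sum(adjusted) / len(adjusted)

print(smm_score([1, 3, 0, 4, 1, 3, 2, 3]))  # -> 3.125
```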

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Veterinary Medicine (AREA)
  • Surgery (AREA)
  • Psychiatry (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Educational Technology (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Physiology (AREA)
  • Developmental Disabilities (AREA)
  • Social Psychology (AREA)
  • Psychology (AREA)
  • Hospice & Palliative Care (AREA)
  • Cardiology (AREA)
  • Child & Adolescent Psychology (AREA)
  • Pulmonology (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The invention relates to emotion recognition and augmented or virtual reality devices, systems and methods useful in the field of therapy, the fields of psychological therapy and certain aspects of psychiatric therapy, commonly defined herein as psychotherapy. The invention thus provides a novel tool that helps therapists to identify precisely and rapidly the emotional state of another person in the context of diagnosing and treating mental disorders, to better study, understand, respond to and empathize with a patient or client, and helps one or more clients and patients (herein generally called subjects) to learn to understand and reflect on their behaviour and communication skills, for example in order to improve said skills.
PCT/EP2018/063593 2017-05-26 2018-05-24 Système ou dispositif permettant la reconnaissance d'émotions avec induction de réponse d'actionneurs utile dans la formation et la psychothérapie Ceased WO2018215575A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP17173003.9 2017-05-26
EP17173003 2017-05-26

Publications (1)

Publication Number Publication Date
WO2018215575A1 true WO2018215575A1 (fr) 2018-11-29

Family

ID=58800693

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2018/063593 Ceased WO2018215575A1 (fr) 2017-05-26 2018-05-24 Système ou dispositif permettant la reconnaissance d'émotions avec induction de réponse d'actionneurs utile dans la formation et la psychothérapie

Country Status (1)

Country Link
WO (1) WO2018215575A1 (fr)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102020598B1 (ko) * 2018-12-11 2019-09-10 전자부품연구원 생체신호 센서 기반 정신질환 진단 및 치유를 위한 바이오피드백 시스템
JP2021058231A (ja) * 2019-10-02 2021-04-15 株式会社エクサウィザーズ 認知機能推定方法、コンピュータプログラム及び認知機能推定装置
CN113712572A (zh) * 2020-11-25 2021-11-30 北京未名脑脑科技有限公司 认知功能的评估系统和方法
CN113873935A (zh) * 2019-03-22 2021-12-31 科格诺亚公司 个性化数字化治疗方法和装置
CN114120985A (zh) * 2021-11-18 2022-03-01 上海深聪半导体有限责任公司 智能语音终端的安抚交互方法、系统、设备及存储介质
CN115708680A (zh) * 2022-10-11 2023-02-24 中国科学院西安光学精密机械研究所 基于驾驶任务的多模态融合情绪障碍检测装置及方法
EP4260804A1 (fr) * 2022-04-11 2023-10-18 Università di Pisa Système de création et de modulation d'un environnement de réalité virtuelle pour un individu
EP4268718A1 (fr) * 2022-04-29 2023-11-01 BIC Violex Single Member S.A. Système de réalité virtuelle
FR3141849A1 (fr) * 2022-11-15 2024-05-17 Orange Procédé et dispositif de surveillance du niveau de stress d’un utilisateur
WO2024146998A1 (fr) * 2023-01-06 2024-07-11 Emobot Procédé pour déterminer un état émotionnel d'un sujet et dispositif associé

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130337421A1 (en) * 2012-06-19 2013-12-19 International Business Machines Corporation Recognition and Feedback of Facial and Vocal Emotions
US20160104486A1 (en) * 2011-04-22 2016-04-14 Angel A. Penilla Methods and Systems for Communicating Content to Connected Vehicle Users Based Detected Tone/Mood in Voice Input
EP3062198A1 (fr) * 2015-02-27 2016-08-31 Immersion Corporation Génération d'actions basées sur l'humeur d'un utilisateur
WO2017021321A1 (fr) * 2015-07-31 2017-02-09 Universitat De Barcelona Réponse physiologique

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160104486A1 (en) * 2011-04-22 2016-04-14 Angel A. Penilla Methods and Systems for Communicating Content to Connected Vehicle Users Based Detected Tone/Mood in Voice Input
US20130337421A1 (en) * 2012-06-19 2013-12-19 International Business Machines Corporation Recognition and Feedback of Facial and Vocal Emotions
EP3062198A1 (fr) * 2015-02-27 2016-08-31 Immersion Corporation Génération d'actions basées sur l'humeur d'un utilisateur
WO2017021321A1 (fr) * 2015-07-31 2017-02-09 Universitat De Barcelona Réponse physiologique

Non-Patent Citations (13)

* Cited by examiner, † Cited by third party
Title
AKINOLA, M.; FRIDMAN, I.; MOR, S.; MORRIS, M. W.; CRUM, A. J.: "Adaptive Appraisals of Anxiety Moderate the Association between Cortisol Reactivity and Performance in Salary Negotiations", PLOS ONE, vol. 11, no. 12, 2016
BARON-COHEN S.; GOLAN O.; WHEELWRIGHT S.; HILL J. J.: "Mind Reading: the interactive guide to emotions", 2004, JESSICA KINGSLEY LIMITED.
BAUMEISTER, R. F.; VOHS, K. D.; AAKER, J. L.; GARBINSKY, E. N.: "Some key differences between a happy life and a meaningful life", THE JOURNAL OF POSITIVE PSYCHOLOGY, vol. 8, no. 6, 2013, pages 505 - 516
CRUM, A. J.; SALOVEY, P.; ACHOR, S.: "Rethinking stress: The role of mindsets in determining the stress response", JOURNAL OF PERSONALITY AND SOCIAL PSYCHOLOGY, vol. 104, no. 4, 2013, pages 716 - 733
CRUM, A.; AKINOLA, M.; MARTIN, A.; FATH, S.: "The Role of Stress Mindset in Shaping Cognitive, Emotional, & Physiological Responses to Challenging & Threatening Stress", ANXIETY, STRESS, & COPING, vol. 30, no. 4, 2017, pages 379 - 395
CRUM, A.; LYDDY, C.: "The Wiley Blackwell handbook of mindfulness", 2014, JOHN WILEY & SONS, article "Destressing stress: The power of mindsets and the art of stressing mindfully", pages: 948 - 963
EKMAN, PERSPECTIVES ON PSYCHOLOGICAL SCIENCE, vol. 11, no. 1, 2016, pages 31 - 34
JOURNAL OF CLINICAL PSYCHOLOGY, vol. 55, no. 1, 1999, pages 39 - 57
KASHANI F; KASHANI P; MOGHIMIAN M; SHAKOUR M: "Effect of Stress Inoculation Training on the Levels of Stress, Anxiety, and Depression in Cancer Patients", IRANIAN JOURNAL OF NURSING AND MIDWIFERY RESEARCH, vol. 20, no. 3, 2015, pages 359 - 64
MERZENICH, M.M.; VAN VLEET, T.M.; NAHUM, M: "Brain Plasticity-based Therapeutics", FRONTIERS OF HUMAN NEUROSCIENCE, vol. 8, 2014, pages 358
MORRIS, S. B.; DESHON, R. P.: "Combining effect size estimates in meta-analysis with repeated measures and independent-groups designs", PSYCHOLOGICAL METHODS, vol. 7, no. 1, 2002, pages 105 - 125, Retrieved from the Internet <URL:https://doi.org/10.1037//1082-989X.7.1.105>
NEURAL NETWORKS, vol. 18, 2005, pages 389 - 405
PARK, D.; YU, A.; METZ, S. E.; TSUKAYAMA, E.; CRUM, A. J.; DUCKWORTH, A. L.: "Beliefs About Stress Attenuate the Relation Among Adverse Life Events, Perceived Distress, and Self-Control", CHILD DEVELOPMENT, 2017

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102020598B1 (ko) * 2018-12-11 2019-09-10 전자부품연구원 생체신호 센서 기반 정신질환 진단 및 치유를 위한 바이오피드백 시스템
CN113873935A (zh) * 2019-03-22 2021-12-31 科格诺亚公司 个性化数字化治疗方法和装置
JP2021058231A (ja) * 2019-10-02 2021-04-15 株式会社エクサウィザーズ 認知機能推定方法、コンピュータプログラム及び認知機能推定装置
JP7014761B2 (ja) 2019-10-02 2022-02-01 株式会社エクサウィザーズ 認知機能推定方法、コンピュータプログラム及び認知機能推定装置
CN113712572A (zh) * 2020-11-25 2021-11-30 北京未名脑脑科技有限公司 认知功能的评估系统和方法
CN114120985A (zh) * 2021-11-18 2022-03-01 上海深聪半导体有限责任公司 智能语音终端的安抚交互方法、系统、设备及存储介质
WO2023198417A1 (fr) * 2022-04-11 2023-10-19 Università Di Pisa Système de création et de modulation d'un environnement de réalité virtuelle pour un individu
EP4260804A1 (fr) * 2022-04-11 2023-10-18 Università di Pisa Système de création et de modulation d'un environnement de réalité virtuelle pour un individu
EP4268718A1 (fr) * 2022-04-29 2023-11-01 BIC Violex Single Member S.A. Système de réalité virtuelle
CN115708680A (zh) * 2022-10-11 2023-02-24 中国科学院西安光学精密机械研究所 基于驾驶任务的多模态融合情绪障碍检测装置及方法
CN115708680B (zh) * 2022-10-11 2025-01-07 中国科学院西安光学精密机械研究所 基于驾驶任务的多模态融合情绪障碍检测装置及方法
FR3141849A1 (fr) * 2022-11-15 2024-05-17 Orange Procédé et dispositif de surveillance du niveau de stress d’un utilisateur
WO2024104835A1 (fr) * 2022-11-15 2024-05-23 Orange Procédé et dispositif de surveillance du niveau de stress d'un utilisateur
WO2024146998A1 (fr) * 2023-01-06 2024-07-11 Emobot Procédé pour déterminer un état émotionnel d'un sujet et dispositif associé
FR3144751A1 (fr) * 2023-01-06 2024-07-12 Emobot Procede pour determiner un etat emotionnel d’un sujet et dispositif associe

Similar Documents

Publication Publication Date Title
US12253882B2 (en) System and method for enhanced training using a virtual reality environment and bio-signal data
WO2018215575A1 (fr) Système ou dispositif permettant la reconnaissance d'émotions avec induction de réponse d'actionneurs utile dans la formation et la psychothérapie
Tieri et al. Virtual reality in cognitive and motor rehabilitation: facts, fiction and fallacies
Kritikos et al. Personalized virtual reality human-computer interaction for psychiatric and neurological illnesses: a dynamically adaptive virtual reality environment that changes according to real-time feedback from electrophysiological signal responses
US10396905B2 (en) Method and system for direct communication
JP6470338B2 (ja) 注意転導および/または妨害の存在下での認知の増強
Bohil et al. Virtual reality in neuroscience research and therapy
Leeb et al. Thinking penguin: multimodal brain–computer interface control of a vr game
AU2015218578B2 (en) Systems, environment and methods for evaluation and management of autism spectrum disorder using a wearable data collection device
CN115279257A (zh) 治疗自闭症谱系障碍相关受试者的扩展现实系统
US20210125702A1 (en) Stress management in clinical settings
Ghisio et al. Designing a platform for child rehabilitation exergames based on interactive sonification of motor behavior
Aardema et al. Effects of virtual reality on presence and dissociative experience
Gandomi Analysis and Prediction of Emotions using Human-Robot and Driver-Vehicle Interactions
Ahire Respiratory Biofeedback based Virtual Environment to Increase Subjective Vitality and Reduce Stress in International Students: Usability, Feasibility and Effectiveness pilot study.
Zorzi Virtual Reality For Rehabilitation: Enhancing The Transition to Wheelchair Use
Nezgovorova et al. Digital Biomarkers in Diagnostics and Monitoring
Pardini A USER-CENTRED APPROACH TO RELAXATION AND ANXIETY MANAGEMENT IN CLINICAL AND NON-CLINICAL SETTINGS: THE USE OF CUSTOMIZED VIRTUAL REALITY SCENARIOS EXPERIENCED INDEPENDENTLY OR IN COMBINATION WITH WEB-BASED RELAXATION TRAINING Un approccio incentrato sull'utente per promuovere il rilassamento e la gestione dell'ansia in contesti clinici e non-clinici: indagine del ruolo di scenari virtuali personalizzati erogati in modalità autonoma o in combinazione con training di rilassamento in modalità web-based
Miri Using Technology to Regulate Affect: A Multidisciplinary Perspective
Tabbaa Emotional Spaces in Virtual Reality: Applications for Healthcare & Wellbeing
Lahiri Virtual-reality based gaze-sensitive adaptive response technology for children with autism spectrum disorder
Wang The control of mimicry by social signals
Kwon Anxiety activating virtual environments for investigating social phobias
Lee Externalizing and interpreting autonomic arousal in people diagnosed with Autism
CA3059903A1 (fr) Gestion du stress dans un environnement clinique

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18728556

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18728556

Country of ref document: EP

Kind code of ref document: A1