US20200060597A1 - State estimation device - Google Patents
State estimation device
- Publication number
- US20200060597A1 (application US 16/344,091)
- Authority
- US
- United States
- Prior art keywords
- discomfort
- information
- reaction
- pattern
- user
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
- A61B5/165—Evaluating the state of mind, e.g. depression, anxiety
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
- A61B5/162—Testing reaction times
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7246—Details of waveform analysis using correlation, e.g. template matching or determination of similarity
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7264—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
- A61B5/7267—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7271—Specific aspects of physiological measurement analysis
- A61B5/7278—Artificial waveform generation or derivation, e.g. synthesizing signals from measured signals
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01K—MEASURING TEMPERATURE; MEASURING QUANTITY OF HEAT; THERMALLY-SENSITIVE ELEMENTS NOT OTHERWISE PROVIDED FOR
- G01K13/00—Thermometers specially adapted for specific purposes
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/217—Validation; Performance evaluation; Active pattern learning techniques
-
- G06K9/00335—
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L25/00—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00
- G10L25/48—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use
- G10L25/51—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use for comparison or discrimination
- G10L25/63—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use for comparison or discrimination for estimating an emotional state
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2560/00—Constructional details of operational features of apparatus; Accessories for medical measuring apparatus
- A61B2560/02—Operational features
- A61B2560/0242—Operational features adapted to measure environmental factors, e.g. temperature, pollution
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2562/00—Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
- A61B2562/02—Details of sensors specially adapted for in-vivo measurements
- A61B2562/0204—Acoustic sensors
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2218/00—Aspects of pattern recognition specially adapted for signal processing
- G06F2218/12—Classification; Matching
Definitions
- the present invention relates to a technique for estimating an emotional state of a user.
- the estimated emotion of the user is referred to as information for providing a recommended service depending on a state of the user, for example.
- Patent Literature 1 discloses an emotional information estimating device that performs machine learning to generate an estimator that learns the relationship between biological information and emotional information on the basis of a history accumulation database that stores a user's biological information acquired beforehand and the user's emotional information and physical states corresponding to the biological information, and estimates emotional information from the biological information for each physical state.
- the emotional information estimating device estimates emotional information of the user from the user's biological information detected with the estimator corresponding to the physical state of the user.
- in the technique of Patent Literature 1, however, no estimator can be used until a sufficiently large amount of information has been accumulated in the history accumulation database.
- the present invention has been made to solve the above problems, and aims to estimate an emotional state of a user, without the user inputting his/her emotional state, even in a case where information indicating emotional states of the user and information indicating physical states are not accumulated.
- a state estimation device includes: an action detecting unit that checks at least one piece of behavioral information including motion information about a user, sound information about the user, and operation information about the user against action patterns stored in advance, and detects a matching action pattern; a reaction detecting unit that checks the behavioral information and biological information about the user against reaction patterns stored in advance, and detects a matching reaction pattern; a discomfort determining unit that determines that the user is in an uncomfortable state, when the action detecting unit has detected a matching action pattern, or when the reaction detecting unit has detected a matching reaction pattern and the detected reaction pattern matches a discomfort reaction pattern indicating an uncomfortable state of the user, the discomfort reaction pattern being stored in advance; a discomfort zone estimating unit that acquires an estimation condition for estimating a discomfort zone on the basis of the action pattern detected by the action detecting unit, and estimates a discomfort zone that is a zone matching the acquired estimation condition in history information stored in advance; and a learning unit that acquires and stores the discomfort reaction pattern on the basis of the discomfort zone estimated by the discomfort zone estimating unit
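- read as a processing pipeline, the configuration above amounts to a short decision flow. The following is a minimal runnable sketch of that flow; the function name, the dictionary layouts, and the learning_unit hook are illustrative assumptions, not code from the patent.

```python
# Minimal sketch of the claimed decision flow. All names and data
# layouts are illustrative assumptions, not taken from the patent.

def determine_state(behavioral_info, biological_info,
                    action_patterns, reaction_patterns,
                    discomfort_reaction_ids, learning_unit=None):
    """Return True when the user is determined to be uncomfortable.

    action_patterns / reaction_patterns: {pattern_id: pattern_string}
    discomfort_reaction_ids: reaction-pattern IDs learned so far.
    """
    observations = set(behavioral_info) | set(biological_info)

    # Action detecting unit: a matching action pattern alone establishes
    # discomfort, and also triggers learning of co-occurring reactions.
    for action_id, pattern in action_patterns.items():
        if pattern in observations:
            if learning_unit is not None:
                learning_unit.learn(action_id)
            return True

    # Reaction detecting unit + discomfort determining unit: a detected
    # reaction indicates discomfort only when it matches a discomfort
    # reaction pattern learned beforehand.
    for reaction_id, pattern in reaction_patterns.items():
        if pattern in observations and reaction_id in discomfort_reaction_ids:
            return True
    return False

print(determine_state(
    behavioral_info={"furrowing brows"}, biological_info=set(),
    action_patterns={"a-1": 'uttering the word "hot"'},
    reaction_patterns={"b-1": "furrowing brows"},
    discomfort_reaction_ids={"b-1"}))  # -> True
```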
- FIG. 1 is a block diagram showing the configuration of a state estimation device according to a first embodiment.
- FIG. 2 is a table showing an example of storage in an action information database of the state estimation device according to the first embodiment.
- FIG. 3 is a table showing an example of the storage in a reaction information database of the state estimation device according to the first embodiment.
- FIG. 4 is a table showing an example of the storage in a discomfort reaction pattern database of the state estimation device according to the first embodiment.
- FIG. 5 is a table showing an example of the storage in a learning database of the state estimation device according to the first embodiment.
- FIGS. 6A and 6B are diagrams each showing an example hardware configuration of the state estimation device according to the first embodiment.
- FIG. 7 is a flowchart showing an operation of the state estimation device according to the first embodiment.
- FIG. 8 is a flowchart showing an operation of an environmental information acquiring unit of the state estimation device according to the first embodiment.
- FIG. 9 is a flowchart showing an operation of a behavioral information acquiring unit of the state estimation device according to the first embodiment.
- FIG. 10 is a flowchart showing an operation of a biological information acquiring unit of the state estimation device according to the first embodiment.
- FIG. 11 is a flowchart showing an operation of an action detecting unit of the state estimation device according to the first embodiment.
- FIG. 12 is a flowchart showing an operation of a reaction detecting unit of the state estimation device according to the first embodiment.
- FIG. 13 is a flowchart showing operations of a discomfort determining unit, a discomfort reaction pattern learning unit, and a discomfort zone estimating unit of the state estimation device according to the first embodiment.
- FIG. 14 is a flowchart showing an operation of the discomfort reaction pattern learning unit of the state estimation device according to the first embodiment.
- FIG. 15 is a flowchart showing an operation of the discomfort zone estimating unit of the state estimation device according to the first embodiment.
- FIG. 16 is a flowchart showing an operation of the discomfort reaction pattern learning unit of the state estimation device according to the first embodiment.
- FIG. 17 is a flowchart showing an operation of the discomfort reaction pattern learning unit of the state estimation device according to the first embodiment.
- FIG. 18 is a diagram showing an example of learning of discomfort reaction patterns in the state estimation device according to the first embodiment.
- FIG. 19 is a flowchart showing an operation of the discomfort determining unit of the state estimation device according to the first embodiment.
- FIG. 20 is a diagram showing an example of uncomfortable state estimation by the state estimation device according to the first embodiment.
- FIG. 21 is a block diagram showing the configuration of a state estimation device according to a second embodiment.
- FIG. 22 is a flowchart showing an operation of an estimator generating unit of the state estimation device according to the second embodiment.
- FIG. 23 is a flowchart showing an operation of a discomfort determining unit of the state estimation device according to the second embodiment.
- FIG. 24 is a block diagram showing the configuration of a state estimation device according to a third embodiment.
- FIG. 25 is a table showing an example of storage in a discomfort reaction pattern database of the state estimation device according to the third embodiment.
- FIG. 26 is a flowchart showing an operation of a discomfort determining unit of the state estimation device according to the third embodiment.
- FIG. 27 is a flowchart showing an operation of the discomfort determining unit of the state estimation device according to the third embodiment.
- FIG. 1 is a block diagram showing the configuration of a state estimation device 100 according to a first embodiment.
- the state estimation device 100 includes an environmental information acquiring unit 101 , a behavioral information acquiring unit 102 , a biological information acquiring unit 103 , an action detecting unit 104 , an action information database 105 , a reaction detecting unit 106 , a reaction information database 107 , a discomfort determining unit 108 , a learning unit 109 , a discomfort zone estimating unit 110 , a discomfort reaction pattern database 111 , and a learning database 112 .
- the environmental information acquiring unit 101 acquires information about the temperature around a user and noise information indicating the magnitude of noise as environmental information.
- the environmental information acquiring unit 101 acquires information detected by a temperature sensor, for example, as the temperature information.
- the environmental information acquiring unit 101 acquires information indicating the magnitude of sound collected by a microphone, for example, as the noise information.
- the environmental information acquiring unit 101 outputs the acquired environmental information to the discomfort determining unit 108 and the learning database 112 .
- the behavioral information acquiring unit 102 acquires behavioral information, namely motion information indicating movement of the user's face and body, sound information indicating the user's utterances and the sounds the user emits, and operation information indicating the user's operation of a device.
- the behavioral information acquiring unit 102 acquires, as the motion information, information indicating the user's facial expression, movement of parts of the face, and motion of body parts such as the head, a hand, an arm, a leg, or the chest. This information is obtained through analysis of an image captured by a camera, for example.
- the behavioral information acquiring unit 102 acquires, as the sound information, a voice recognition result indicating the content of a user's utterance obtained through analysis of sound signals collected by a microphone, for example, and a sound recognition result indicating the sound uttered by the user (such as the sound of clicking of the user's tongue).
- the behavioral information acquiring unit 102 acquires, as the operation information, information about a user operating a device detected by a touch panel or a physical switch (such as information indicating that a button for raising the sound volume has been pressed).
- the behavioral information acquiring unit 102 outputs the acquired behavioral information to the action detecting unit 104 and the reaction detecting unit 106 .
- the biological information acquiring unit 103 acquires information indicating fluctuations in the heart rate of a user as biological information.
- the biological information acquiring unit 103 acquires, as the biological information, information indicating fluctuations in the heart rate of a user measured by a heart rate meter or the like the user is wearing, for example.
- the biological information acquiring unit 103 outputs the acquired biological information to the reaction detecting unit 106 .
- the action detecting unit 104 checks the behavioral information input from the behavioral information acquiring unit 102 against the action patterns in the action information stored in the action information database 105 . In a case where an action pattern matching the behavioral information is stored in the action information database 105 , the action detecting unit 104 acquires the identification information about the action pattern. The action detecting unit 104 outputs the acquired identification information about the action pattern to the discomfort determining unit 108 and the learning database 112 .
- the action information database 105 is a database that defines and stores action patterns of users for respective discomfort factors.
- FIG. 2 is a table showing an example of the storage in the action information database 105 of the state estimation device 100 according to the first embodiment.
- the action information database 105 shown in FIG. 2 contains the following items: IDs 105 a , discomfort factors 105 b , action patterns 105 c , and estimation conditions 105 d.
- an action pattern 105 c is defined for each discomfort factor 105 b .
- An estimation condition 105 d that is a condition for estimating a discomfort zone is set for each action pattern 105 c .
- An ID 105 a as identification information is also attached to each action pattern 105 c.
- Action patterns of users associated directly with the discomfort factors 105 b are set as the action patterns 105 c .
- “uttering the word “hot”” and “pressing the button for lowering the set temperature” are set as the action patterns of users associated directly with a discomfort factor 105 b that is “air conditioning (hot)”.
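- to make the lookup concrete, here is a minimal sketch of the FIG. 2 table and the check performed by the action detecting unit 104 ; the row content follows the example above, while the Python layout and names are assumptions.

```python
# Sketch of the action information database of FIG. 2. The row content
# follows the example in the text; the layout is an assumption.
ACTION_INFO_DB = [
    {
        "id": "a-1",
        "discomfort_factor": "air conditioning (hot)",
        "action_patterns": [
            'uttering the word "hot"',
            "pressing the button for lowering the set temperature",
        ],
        # condition later used to bound the discomfort zone (step ST 150)
        "estimation_condition": "temperature",
    },
]

def detect_action(behavioral_info):
    """Return the ID of the first stored action pattern that matches."""
    for row in ACTION_INFO_DB:
        if behavioral_info in row["action_patterns"]:
            return row["id"]
    return None

print(detect_action('uttering the word "hot"'))  # -> a-1
```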
- the reaction detecting unit 106 checks the behavioral information input from the behavioral information acquiring unit 102 and the biological information input from the biological information acquiring unit 103 against the reaction information stored in the reaction information database 107 . In a case where a reaction pattern matching the behavioral information or the biological information is stored in the reaction information database 107 , the reaction detecting unit 106 acquires the identification information associated with the reaction pattern. The reaction detecting unit 106 outputs the acquired identification information about the reaction pattern to the discomfort determining unit 108 , the learning unit 109 , and the learning database 112 .
- the reaction information database 107 is a database that stores reaction patterns of users.
- FIG. 3 is a table showing an example of the storage in the reaction information database 107 of the state estimation device 100 according to the first embodiment.
- the reaction information database 107 shown in FIG. 3 contains the following items: IDs 107 a and reaction patterns 107 b .
- An ID 107 a as identification information is attached to each one reaction pattern 107 b.
- Reaction patterns of users not associated directly with discomfort factors are set as the reaction patterns 107 b .
- “furrowing brows” and “clearing throat” are set as reaction patterns observed when a user is in an uncomfortable state.
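- the reaction side can be sketched the same way; unlike the action check, a reaction pattern may be matched by either the behavioral information or the biological information. The IDs follow FIG. 3 ; the layout is an assumption.

```python
# Sketch of the reaction information database of FIG. 3.
REACTION_INFO_DB = {
    "b-1": "furrowing brows",
    "b-2": "clearing throat",
    # ... remaining rows of FIG. 3
}

def detect_reactions(behavioral_info, biological_info):
    """Return the IDs of every reaction pattern matched by either input."""
    observations = set(behavioral_info) | set(biological_info)
    return [rid for rid, pattern in REACTION_INFO_DB.items()
            if pattern in observations]

print(detect_reactions({"furrowing brows"}, set()))  # -> ['b-1']
```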
- when the identification information about the detected action pattern is input from the action detecting unit 104 , the discomfort determining unit 108 outputs, to the outside, a signal indicating that the uncomfortable state of the user has been detected. The discomfort determining unit 108 also outputs the input identification information about the action pattern to the learning unit 109 , and instructs the learning unit 109 to learn reaction patterns.
- the discomfort determining unit 108 checks the input identification information against the discomfort reaction patterns that are stored in the discomfort reaction pattern database 111 and indicate uncomfortable states of users. In a case where a reaction pattern matching the input identification information is stored in the discomfort reaction pattern database 111 , the discomfort determining unit 108 estimates that the user is in an uncomfortable state. The discomfort determining unit 108 outputs, to the outside, a signal indicating that the user's uncomfortable state has been detected.
- the discomfort reaction pattern database 111 will be described later in detail.
- the learning unit 109 includes the discomfort zone estimating unit 110 .
- the discomfort zone estimating unit 110 acquires an estimation condition for estimating a discomfort zone from the action information database 105 , using the action pattern identification information that has been input at the same time as the instruction.
- the discomfort zone estimating unit 110 acquires the estimation condition 105 d corresponding to the ID 105 a that is the identification information about the action pattern shown in FIG. 2 , for example.
- the discomfort zone estimating unit 110 estimates a discomfort zone from the information matching the acquired estimation condition.
- the learning unit 109 extracts the identification information about one or more reaction patterns in the discomfort zone estimated by the discomfort zone estimating unit 110 .
- the learning unit 109 further refers to the learning database 112 , to extract the reaction patterns generated in the past at frequencies equal to or higher than a threshold as discomfort reaction pattern candidates.
- the learning unit 109 further extracts the reaction patterns generated at frequencies equal to or higher than the threshold in the zones other than the discomfort zone estimated by the discomfort zone estimating unit 110 as patterns that are not discomfort reaction patterns (these patterns will be hereinafter referred to as non-discomfort reaction patterns).
- the learning unit 109 excludes the extracted non-discomfort reaction patterns from the discomfort reaction pattern candidates.
- the learning unit 109 stores a combination of identification information about the eventually remaining discomfort reaction pattern candidates as a discomfort reaction pattern into the discomfort reaction pattern database 111 for each discomfort factor.
- the discomfort reaction pattern database 111 is a database that stores discomfort reaction patterns that are the results of learning by the learning unit 109 .
- FIG. 4 is a table showing an example of the storage in the discomfort reaction pattern database 111 of the state estimation device 100 according to the first embodiment.
- the discomfort reaction pattern database 111 shown in FIG. 4 contains the following items: discomfort factors 111 a and discomfort reaction patterns 111 b .
- the same items as the items of the discomfort factors 105 b in the action information database 105 are written as the discomfort factors 111 a.
- the IDs 107 a corresponding to the reaction patterns 107 b in the reaction information database 107 are written as the discomfort reaction patterns 111 b.
- in the example in FIG. 4 , when the discomfort factor is “air conditioning (hot)”, the user shows the reactions “furrowing brows” of ID “b- 1 ” and “staring at the object” of ID “b- 3 ”.
- the learning database 112 is a database that stores results of learning of action patterns and reaction patterns when the environmental information acquiring unit 101 acquires environmental information.
- FIG. 5 is a table showing an example of the storage in the learning database 112 of the state estimation device 100 according to the first embodiment.
- the learning database 112 shown in FIG. 5 contains the following items: time stamps 112 a , environmental information 112 b , action pattern IDs 112 c , and reaction pattern IDs 112 d.
- the time stamps 112 a are information indicating the times at which the environmental information 112 b has been acquired.
- the environmental information 112 b is temperature information, noise information, and the like at the times indicated by the time stamps 112 a .
- the action pattern IDs 112 c are the identification information acquired by the action detecting unit 104 at the times indicated by the time stamps 112 a .
- the reaction pattern IDs 112 d are the identification information acquired by the reaction detecting unit 106 at the times indicated by the time stamps 112 a.
- in the example in FIG. 5 , at the time stamp 112 a “2016/8/1/11:02:00”, the environmental information 112 b is “temperature 28° C., noise 35 dB”; the action detecting unit 104 detected no action pattern indicating the user's discomfort, while the reaction detecting unit 106 detected the reaction pattern “furrowing brows” of ID “b- 1 ”.
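- the row just described could be represented as follows; this is a sketch, and the field names and layout are assumptions.

```python
# Sketch of one learning database record (FIG. 5), using the row
# described in the text above.
from datetime import datetime

learning_db_row = {
    "time_stamp": datetime(2016, 8, 1, 11, 2, 0),
    "environment": {"temperature_c": 28.0, "noise_db": 35.0},
    "action_pattern_ids": [],         # no discomfort action detected
    "reaction_pattern_ids": ["b-1"],  # "furrowing brows"
}
```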
- FIGS. 6A and 6B are diagrams each showing an example hardware configuration of the state estimation device 100 .
- the environmental information acquiring unit 101 , the behavioral information acquiring unit 102 , the biological information acquiring unit 103 , the action detecting unit 104 , the reaction detecting unit 106 , the discomfort determining unit 108 , the learning unit 109 , and the discomfort zone estimating unit 110 in the state estimation device 100 may be a processing circuit 100 a that is dedicated hardware as shown in FIG. 6A , or may be a processor 100 b that executes a program stored in a memory 100 c as shown in FIG. 6B .
- the processing circuit 100 a may be a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or a combination of the above, for example.
- Each of the functions of the respective components of the environmental information acquiring unit 101 , the behavioral information acquiring unit 102 , the biological information acquiring unit 103 , the action detecting unit 104 , the reaction detecting unit 106 , the discomfort determining unit 108 , the learning unit 109 , and the discomfort zone estimating unit 110 may be formed with a processing circuit, or the functions of the respective components may be collectively formed with one processing circuit.
- the functions of the respective components are formed with software, firmware, or a combination of software and firmware.
- Software or firmware is written as programs, and is stored in the memory 100 c .
- the processor 100 b By reading and executing the programs stored in the memory 100 c , the processor 100 b achieves the respective functions of the environmental information acquiring unit 101 , the behavioral information acquiring unit 102 , the biological information acquiring unit 103 , the action detecting unit 104 , the reaction detecting unit 106 , the discomfort determining unit 108 , the learning unit 109 , and the discomfort zone estimating unit 110 .
- the environmental information acquiring unit 101 , the behavioral information acquiring unit 102 , the biological information acquiring unit 103 , the action detecting unit 104 , the reaction detecting unit 106 , the discomfort determining unit 108 , the learning unit 109 , and the discomfort zone estimating unit 110 thus have the memory 100 c for storing programs that, when executed by the processor 100 b , carry out the respective steps shown in FIGS. 7 through 17 and FIG. 19 , which will be described later.
- these programs are for causing a computer to implement procedures or a method involving the environmental information acquiring unit 101 , the behavioral information acquiring unit 102 , the biological information acquiring unit 103 , the action detecting unit 104 , the reaction detecting unit 106 , the discomfort determining unit 108 , the learning unit 109 , and the discomfort zone estimating unit 110 .
- the processor 100 b is a central processing unit (CPU), a processing device, an arithmetic device, a processor, a microprocessor, a microcomputer, a digital signal processor (DSP), or the like, for example.
- the memory 100 c may be a nonvolatile or volatile semiconductor memory such as a random access memory (RAM), a read only memory (ROM), a flash memory, an erasable programmable ROM (EPROM), or an electrically EPROM (EEPROM), may be a magnetic disk such as a hard disk or a flexible disk, or may be an optical disc such as a mini disc, a compact disc (CD), or a digital versatile disc (DVD), for example.
- the processing circuit 100 a in the state estimation device 100 can achieve the above described functions with hardware, software, firmware, or a combination thereof.
- FIG. 7 is a flowchart showing an operation of the state estimation device 100 according to the first embodiment.
- the environmental information acquiring unit 101 acquires environmental information (step ST 101 ).
- FIG. 8 is a flowchart showing an operation of the environmental information acquiring unit 101 of the state estimation device 100 according to the first embodiment, and is a flowchart showing the process in step ST 101 in detail.
- the environmental information acquiring unit 101 acquires information detected by a temperature sensor, for example, as temperature information (step ST 110 ).
- the environmental information acquiring unit 101 acquires information indicating the magnitude of sound collected by a microphone, for example, as noise information (step ST 111 ).
- the environmental information acquiring unit 101 outputs the temperature information acquired in step ST 110 and the noise information acquired in step ST 111 as environmental information to the discomfort determining unit 108 and the learning database 112 (step ST 112 ).
- through the processes in steps ST 110 through ST 112 , information is stored as the items of a time stamp 112 a and environmental information 112 b in the learning database 112 shown in FIG. 5 , for example. After that, the flowchart proceeds to the process in step ST 102 in FIG. 7 .
- the behavioral information acquiring unit 102 then acquires behavioral information about the user (step ST 102 ).
- FIG. 9 is a flowchart showing an operation of the behavioral information acquiring unit 102 of the state estimation device 100 according to the first embodiment, and is a flowchart showing the process in step ST 102 in detail.
- the behavioral information acquiring unit 102 acquires motion information obtained by analyzing a captured image, for example (step ST 113 ).
- the behavioral information acquiring unit 102 acquires sound information obtained by analyzing a sound signal, for example (step ST 114 ).
- the behavioral information acquiring unit 102 acquires information about operation of a device, for example, as operation information (step ST 115 ).
- the behavioral information acquiring unit 102 outputs the motion information acquired in step ST 113 , the sound information acquired in step ST 114 , and the operation information acquired in step ST 115 as behavioral information to the action detecting unit 104 and the reaction detecting unit 106 (step ST 116 ).
- the flowchart proceeds to the process in step ST 103 in FIG. 7 .
- the biological information acquiring unit 103 then acquires biological information about the user (step ST 103 ).
- FIG. 10 is a flowchart showing an operation of the biological information acquiring unit 103 of the state estimation device 100 according to the first embodiment, and is a flowchart showing the process in step ST 103 in detail.
- the biological information acquiring unit 103 acquires information indicating fluctuations in the heart rate of the user, for example, as biological information (step ST 117 ).
- the biological information acquiring unit 103 outputs the biological information acquired in step ST 117 to the reaction detecting unit 106 (step ST 118 ). After that, the flowchart proceeds to the process in step ST 104 in FIG. 7 .
- the action detecting unit 104 then detects action information about the user from the behavioral information input from the behavioral information acquiring unit 102 in step ST 102 (step ST 104 ).
- FIG. 11 is a flowchart showing an operation of the action detecting unit 104 of the state estimation device 100 according to the first embodiment, and is a flowchart showing the process in step ST 104 in detail.
- the action detecting unit 104 determines whether behavioral information has been input from the behavioral information acquiring unit 102 (step ST 120 ). If no behavioral information has been input (step ST 120 ; NO), the process comes to an end, and the operation proceeds to the process in step ST 105 in FIG. 7 . If behavioral information has been input (step ST 120 ; YES), on the other hand, the action detecting unit 104 determines whether the input behavioral information matches an action pattern in the action information stored in the action information database 105 (step ST 121 ).
- if the input behavioral information matches an action pattern in the action information stored in the action information database 105 (step ST 121 ; YES), the action detecting unit 104 acquires the identification information attached to the matching action pattern, and outputs the identification information to the discomfort determining unit 108 and the learning database 112 (step ST 122 ). If the input behavioral information does not match any action pattern in the action information stored in the action information database 105 (step ST 121 ; NO), on the other hand, the action detecting unit 104 determines whether checking against all the action information has been completed (step ST 123 ). If checking against all the action information has not been completed yet (step ST 123 ; NO), the operation returns to the process in step ST 121 , and the above described processes are repeated. If the process in step ST 122 has been performed, or if checking against all the action information has been completed (step ST 123 ; YES), on the other hand, the flowchart proceeds to the process in step ST 105 in FIG. 7 .
- the reaction detecting unit 106 then detects reaction information about the user (step ST 105 ). Specifically, the reaction detecting unit 106 detects reaction information about the user, using the behavioral information input from the behavioral information acquiring unit 102 in step ST 102 and the biological information input from the biological information acquiring unit 103 in step ST 103 .
- FIG. 12 is a flowchart showing an operation of the reaction detecting unit 106 of the state estimation device 100 according to the first embodiment, and is a flowchart showing the process in step ST 105 in detail.
- the reaction detecting unit 106 determines whether behavioral information has been input from the behavioral information acquiring unit 102 (step ST 124 ). If no behavioral information has been input (step ST 124 ; NO), the reaction detecting unit 106 determines whether biological information has been input from the biological information acquiring unit 103 (step ST 125 ). If no biological information has been input (step ST 125 ; NO), the process comes to an end, and the operation proceeds to the process in step ST 106 in the flowchart shown in FIG. 7 .
- the reaction detecting unit 106 determines whether the input behavioral information or biological information matches a reaction pattern in the reaction information stored in the reaction information database 107 (step ST 126 ). If the input behavioral information or biological information matches a reaction pattern in the reaction information stored in the reaction information database 107 (step ST 126 ; YES), the reaction detecting unit 106 acquires the identification information attached to the matching reaction pattern, and outputs the identification information to the discomfort determining unit 108 , the learning unit 109 , and the learning database 112 (step ST 127 ).
- if the input information does not match any reaction pattern (step ST 126 ; NO), the reaction detecting unit 106 determines whether checking against all the reaction information has been completed (step ST 128 ). If checking against all the reaction information has not been completed yet (step ST 128 ; NO), the operation returns to the process in step ST 126 , and the above described processes are repeated. If the process in step ST 127 has been performed, or if checking against all the reaction information has been completed (step ST 128 ; YES), on the other hand, the flowchart proceeds to the process in step ST 106 in FIG. 7 .
- the discomfort determining unit 108 determines whether the user is in an uncomfortable state (step ST 106 ).
- FIG. 13 is a flowchart showing operations of the discomfort determining unit 108 , the learning unit 109 , and the discomfort zone estimating unit 110 of the state estimation device 100 according to the first embodiment, and is a flowchart showing the process in step ST 106 in detail.
- the discomfort determining unit 108 determines whether identification information about an action pattern has been input from the action detecting unit 104 (step ST 130 ). If identification information about an action pattern has been input (step ST 130 ; YES), the discomfort determining unit 108 outputs, to the outside, a signal indicating that an uncomfortable state of the user has been detected (step ST 131 ). The discomfort determining unit 108 also outputs the input identification information about the action pattern to the learning unit 109 , and instructs the learning unit 109 to learn discomfort reaction patterns (step ST 132 ). The learning unit 109 learns a discomfort reaction pattern on the basis of the action pattern identification information and the learning instruction input in step ST 132 (step ST 133 ). The process of learning discomfort reaction patterns in step ST 133 will be described later in detail.
- the discomfort determining unit 108 determines whether identification information about a reaction pattern has been input from the reaction detecting unit 106 (step ST 134 ). If identification information about a reaction pattern has been input (step ST 134 ; YES), the discomfort determining unit 108 checks the reaction pattern indicated by the identification information against the discomfort reaction patterns stored in the discomfort reaction pattern database 111 , and estimates an uncomfortable state of the user (step ST 135 ). The process of estimating an uncomfortable state in step ST 135 will be described later in detail.
- the discomfort determining unit 108 refers to the result of the estimation in step ST 135 , and determines whether the user is in an uncomfortable state (step ST 136 ). If the user is determined to be in an uncomfortable state (step ST 136 ; YES), the discomfort determining unit 108 outputs a signal indicating that the user's uncomfortable state has been detected, to the outside (step ST 137 ). In the process in step ST 137 , the discomfort determining unit 108 may add information indicating a discomfort factor to the signal to be output to the outside.
- if the process in step ST 133 has been performed, if the process in step ST 137 has been performed, if no identification information about a reaction pattern has been input (step ST 134 ; NO), or if the user is determined not to be in an uncomfortable state (step ST 136 ; NO), the flowchart returns to the process in step ST 101 in FIG. 7 .
- next, the process in step ST 133 in the flowchart in FIG. 13 is described in detail.
- the following description will be made with reference to the storage examples shown in FIGS. 2 through 5 , flowcharts shown in FIGS. 14 through 17 , and an example of discomfort reaction pattern learning shown in FIG. 18 .
- FIG. 14 is a flowchart showing an operation of the learning unit 109 of the state estimation device 100 according to the first embodiment.
- FIG. 18 is a diagram showing an example of learning of discomfort reaction patterns in the state estimation device 100 according to the first embodiment.
- the discomfort zone estimating unit 110 of the learning unit 109 estimates a discomfort zone from the action pattern identification information input from the discomfort determining unit 108 (step ST 140 ).
- FIG. 15 is a flowchart showing an operation of the discomfort zone estimating unit 110 of the state estimation device 100 according to the first embodiment, and is a flowchart showing the process in step ST 140 in detail.
- using the action pattern identification information input from the discomfort determining unit 108 , the discomfort zone estimating unit 110 searches the action information database 105 , and acquires the estimation condition and the discomfort factor associated with the action pattern (step ST 150 ).
- for example, the discomfort zone estimating unit 110 searches the action information database 105 shown in FIG. 2 , and acquires the estimation condition “temperature ° C.” and the discomfort factor “air conditioning (hot)” of ID “a- 1 ”.
- the discomfort zone estimating unit 110 then refers to the most recent environmental information that is stored in the learning database 112 and matches the identification information about the estimation condition acquired in step ST 150 , and acquires the environmental information of the time at which the action information is detected (step ST 151 ).
- the discomfort zone estimating unit 110 also acquires the time stamp corresponding to the environmental information acquired in step ST 151 , as the discomfort zone (step ST 152 ).
- the discomfort zone estimating unit 110 acquires “temperature 28° C.” as the environmental information of the time at which the action pattern is detected, from “temperature 28° C., noise 35 dB”, which is the environmental information 112 b in the most recent history information, on the basis of the estimation condition acquired in step ST 150 .
- the discomfort zone estimating unit 110 also acquires the time stamp “2016/8/1/11:04:30” of the acquired environmental information as the discomfort zone.
- the discomfort zone estimating unit 110 refers to environmental information in the history information stored in the learning database 112 (step ST 153 ), and determines whether the environmental information in the history information matches the environmental information of the time at which the action pattern acquired in step ST 151 is detected (step ST 154 ). If the environmental information in the history information matches the environmental information of the time at which the action pattern is detected (step ST 154 ; YES), the discomfort zone estimating unit 110 adds the time indicated by the time stamp of the matching history information to the discomfort zone (step ST 155 ). The discomfort zone estimating unit 110 determines whether all the environmental information in the history information stored in the learning database 112 has been referred to (step ST 156 ).
- if not all the environmental information in the history information has been referred to yet (step ST 156 ; NO), the operation returns to the process in step ST 153 , and the above described processes are repeated. If all the environmental information in the history information has been referred to (step ST 156 ; YES), on the other hand, the discomfort zone estimating unit 110 outputs the discomfort zone added in step ST 155 as the estimated discomfort zone to the learning unit 109 (step ST 157 ). The discomfort zone estimating unit 110 also outputs the discomfort factor acquired in step ST 150 to the learning unit 109 .
- the time from “2016/8/1/11:01:00” to “2016/8/1/11:04:30” indicated by the time stamp of the history information matching “temperature 28° C.” acquired as the discomfort zone estimation condition is output as the discomfort zone to the learning unit 109 .
- the operation proceeds to the process in step ST 141 in the flowchart in FIG. 14 .
- the discomfort zone estimating unit 110 determines whether environmental information in the history information matches the environmental information of the time at which the action pattern is detected. However, a check may be made to determine whether the environmental information falls within a threshold range that is set on the basis of the environmental information of the time at which the action pattern is detected. For example, in a case where the environmental information of the time at which the action pattern is detected is “28° C.”, the discomfort zone estimating unit 110 sets “lower limit: 27.5° C., upper limit: none” as the threshold range. The discomfort zone estimating unit 110 adds the time indicated by the time stamp of the history information within the range to the discomfort zone.
- the continuous zone from “2016/8/1/11:01:00” to “2016/8/1/11:04:30”, which indicates a temperature equal to or higher than the lower limit of the threshold range, is estimated as the discomfort zone.
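- under the threshold-range variant just described, zone estimation reduces to walking back through the time-stamped history until the condition fails. A minimal sketch, assuming a simple (timestamp, temperature) record layout; the names are illustrative.

```python
# Sketch of steps ST 150-ST 157 with the threshold-range variant:
# keep the most recent contiguous run of history entries whose
# temperature stays at or above the lower limit. Names are assumptions.
from datetime import datetime

def estimate_discomfort_zone(history, lower_limit_c):
    """history: list of (timestamp, temperature_c), oldest first.
    Returns (start, end) of the estimated discomfort zone."""
    end = history[-1][0]          # the action pattern was detected here
    start = end
    for ts, temp in reversed(history):
        if temp < lower_limit_c:  # environment no longer matches
            break
        start = ts
    return start, end

history = [
    (datetime(2016, 8, 1, 11, 0, 30), 27.0),
    (datetime(2016, 8, 1, 11, 1, 0), 28.0),
    (datetime(2016, 8, 1, 11, 2, 0), 28.0),
    (datetime(2016, 8, 1, 11, 4, 30), 28.0),
]
# With the lower limit 27.5° C., the zone runs from 11:01:00 to
# 11:04:30, matching the example in the text.
print(estimate_discomfort_zone(history, 27.5))
```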
- the learning unit 109 refers to the learning database 112 , and extracts the reaction patterns stored in the discomfort zone estimated in step ST 140 as discomfort reaction pattern candidates A (step ST 141 ).
- the learning unit 109 extracts the reaction pattern IDs “b- 1 ”, “b- 2 ”, “b- 3 ”, and “b- 4 ” in the zone from “2016/8/1/11:01:00” to “2016/8/1/11:04:30”, which is the estimated discomfort zone, as the discomfort reaction pattern candidates A.
- the learning unit 109 then refers to the learning database 112 , and learns the discomfort reaction pattern candidate in a zone having environmental information similar to the discomfort zone estimated in step ST 140 (step ST 142 ).
- FIG. 16 is a flowchart showing an operation of the learning unit 109 of the state estimation device 100 according to the first embodiment, and is a flowchart showing the process in step ST 142 in detail.
- the learning unit 109 refers to the learning database 112 , and searches for a zone in which environmental information is similar to the discomfort zone estimated in step ST 140 (step ST 160 ).
- the learning unit 109 acquires a zone that matches the temperature condition in the past, such as a zone (from time t 1 to time t 2 ) in which the temperature information stayed at 28° C.
- the learning unit 109 may acquire a zone in which the temperature condition is within a preset range (a range of 27.5° C. and higher) in the past.
- the learning unit 109 refers to the learning database 112 , and determines whether reaction pattern IDs are stored in the zone searched for in step ST 160 (step ST 161 ). If no reaction pattern ID is stored (step ST 161 ; NO), the operation proceeds to the process in step ST 163 . If reaction pattern IDs are stored (step ST 161 ; YES), on the other hand, the learning unit 109 extracts the reaction pattern IDs as discomfort reaction pattern candidates B (step ST 162 ).
- the reaction pattern IDs “b- 1 ”, “b- 2 ”, and “b- 3 ” stored in the searched zone from time t 1 to time t 2 are extracted as the discomfort reaction pattern candidates B.
- the learning unit 109 determines whether all the history information in the learning database 112 has been referred to (step ST 163 ). If not all the history information has been referred to (step ST 163 ; NO), the operation returns to the process in step ST 160 . If all the history information has been referred to (step ST 163 ; YES), on the other hand, the learning unit 109 excludes reaction patterns with low appearance frequencies from the discomfort reaction pattern candidates A extracted in step ST 141 and the discomfort reaction pattern candidates B extracted in step ST 162 (step ST 164 ). The reaction patterns remaining after this exclusion become the eventual discomfort reaction pattern candidates. After that, the operation proceeds to the process in step ST 143 in the flowchart in FIG. 14 .
- the learning unit 109 compares the reaction pattern IDs “b- 1 ”, “b- 2 ”, “b- 3 ”, and “b- 4 ” extracted as the discomfort reaction pattern candidates A with the reaction pattern IDs “b- 1 ”, “b- 2 ”, and “b- 3 ” extracted as the discomfort reaction pattern candidates B, and excludes the reaction pattern ID “b- 4 ” included only among the discomfort reaction pattern candidates A as the pattern ID with a low appearance frequency.
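- one plausible reading of this frequency filter is that a candidate survives only when it appears both in the estimated discomfort zone and in past zones with similar environments. A minimal sketch under that assumption, reproducing the FIG. 18 example; the threshold and names are not from the patent.

```python
# Sketch of the candidate filtering in steps ST 141-ST 164. The
# frequency threshold and its interpretation are assumptions.
from collections import Counter

def filter_candidates(candidates_a, candidates_b, min_count=2):
    """candidates_a: reaction IDs seen in the estimated discomfort zone.
    candidates_b: reaction IDs seen in past similar zones."""
    counts = Counter(candidates_a) + Counter(candidates_b)
    return {rid for rid, n in counts.items() if n >= min_count}

# Worked example from FIG. 18: "b-4" appears only among candidates A,
# so it is excluded as a pattern with a low appearance frequency.
print(filter_candidates(["b-1", "b-2", "b-3", "b-4"],
                        ["b-1", "b-2", "b-3"]))  # -> {'b-1', 'b-2', 'b-3'}
```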
- the learning unit 109 refers to the learning database 112 , and learns a reaction pattern at a time when the user is not in an uncomfortable state during a zone having an environmental condition not similar to the discomfort zone estimated in step ST 140 (step ST 143 ).
- FIG. 17 is a flowchart showing an operation of the learning unit 109 of the state estimation device 100 according to the first embodiment, and is a flowchart showing the process in step ST 143 in detail.
- the learning unit 109 refers to the learning database 112 , and searches for a past zone having environmental information not similar to the discomfort zone estimated in step ST 140 (step ST 170 ). Specifically, the learning unit 109 searches for a zone in which environmental information does not match or a zone in which environmental information is outside the preset range.
- the learning unit 109 searches for the zone (from time t 3 to time t 4 ) in which the temperature information stayed “lower than 28° C.” in the past as a zone with environmental information not similar to the discomfort zone.
- the learning unit 109 refers to the learning database 112 , and determines whether a reaction pattern ID is stored in the zone searched for in step ST 170 (step ST 171 ). If no reaction pattern ID is stored (step ST 171 ; NO), the operation proceeds to the process in step ST 173 . If a reaction pattern ID is stored (step ST 171 ; YES), on the other hand, the learning unit 109 extracts the stored reaction pattern ID as a non-discomfort reaction pattern candidate (step ST 172 ).
- the pattern ID “b- 2 ” stored in the zone (from time t 3 to time t 4 ) in which the temperature information stayed “lower than 28° C.” in the past is extracted as a non-discomfort reaction pattern candidate.
- the learning unit 109 determines whether all the history information in the learning database 112 has been referred to (step ST 173 ). If not all the history information has been referred to (step ST 173 ; NO), the operation returns to the process in step ST 170 . If all the history information has been referred to (step ST 173 ; YES), on the other hand, the learning unit 109 excludes reaction patterns with low appearance frequencies from among the non-discomfort reaction pattern candidates extracted in step ST 172 (step ST 174 ). The reaction patterns remaining after this exclusion become the eventual non-discomfort reaction patterns. After that, the operation proceeds to the process in step ST 144 in FIG. 14 .
- a reaction pattern with a low appearance frequency would be excluded from the non-discomfort reaction pattern candidates in this step; in the example shown in FIG. 18G , however, the reaction pattern ID “b- 2 ” is not excluded, and remains as a non-discomfort reaction pattern.
- the learning unit 109 excludes the non-discomfort reaction pattern learned in step ST 143 from the discomfort reaction pattern candidates learned in step ST 142 , and acquires a discomfort reaction pattern (step ST 144 ).
- the learning unit 109 excludes the reaction pattern ID “b- 2 ”, which is a non-discomfort reaction pattern, from the reaction pattern IDs “b- 1 ”, “b- 2 ”, and “b- 3 ”, which are the discomfort reaction pattern candidates, and acquires the remaining reaction pattern IDs “b- 1 ” and “b- 3 ” as a discomfort reaction pattern.
- the learning unit 109 stores the discomfort reaction pattern acquired in step ST 144 , together with the discomfort factor input from the discomfort zone estimating unit 110 , into the discomfort reaction pattern database 111 (step ST 145 ).
- the learning unit 109 stores the reaction pattern IDs “b- 1 ” and “b- 3 ” extracted as discomfort reaction patterns, together with a discomfort factor “air conditioning (hot)”. After that, the flowchart returns to the process in step ST 101 in FIG. 7 .
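- the final learning step is then a set difference followed by a store, as this sketch (names assumed, not from the patent) illustrates with the FIG. 18 values.

```python
# Sketch of steps ST 144-ST 145: exclude the non-discomfort reaction
# patterns and store the remainder per discomfort factor.
def learn_discomfort_pattern(candidates, non_discomfort, factor, db):
    pattern = sorted(set(candidates) - set(non_discomfort))
    db.setdefault(factor, []).append(pattern)
    return pattern

discomfort_db = {}
# "b-2" also occurred in zones below 28° C., so only "b-1" and "b-3"
# are stored for the discomfort factor "air conditioning (hot)".
print(learn_discomfort_pattern({"b-1", "b-2", "b-3"}, {"b-2"},
                               "air conditioning (hot)", discomfort_db))
# -> ['b-1', 'b-3']
```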
- next, the process in step ST 135 in the flowchart in FIG. 13 is described in detail.
- FIG. 19 is a flowchart showing an operation of the discomfort determining unit 108 of the state estimation device 100 according to the first embodiment.
- FIG. 20 is a diagram showing an example of uncomfortable state estimation by the state estimation device 100 according to the first embodiment.
- the discomfort determining unit 108 refers to the discomfort reaction pattern database 111 , and determines whether any discomfort reaction pattern is stored (step ST 180 ). If no discomfort reaction pattern is stored (step ST 180 ; NO), the operation proceeds to the process in step ST 190 .
- if a discomfort reaction pattern is stored (step ST 180 ; YES), on the other hand, the discomfort determining unit 108 compares the stored discomfort reaction pattern with the identification information about the reaction pattern input from the reaction detecting unit 106 in step ST 127 of FIG. 12 (step ST 181 ). A check is then made to determine whether the discomfort reaction pattern includes the identification information about the reaction pattern detected by the reaction detecting unit 106 (step ST 182 ). If the identification information about the reaction pattern is not included (step ST 182 ; NO), the discomfort determining unit 108 proceeds to the process in step ST 189 .
- the discomfort determining unit 108 refers to the discomfort reaction pattern database 111 , and acquires the discomfort factor associated with the identification information about the reaction pattern (step ST 183 ).
- the discomfort determining unit 108 acquires, from the environmental information acquiring unit 101 , the environmental information of the time at which the discomfort factor is acquired in step ST 183 (step ST 184 ).
- the discomfort determining unit 108 estimates a discomfort zone from the acquired environmental information (step ST 185 ).
- the discomfort determining unit 108 acquires environmental information (temperature information: 27° C.) of the time at which the ID “b- 3 ” is acquired.
- the discomfort determining unit 108 refers to the learning database 112 , and estimates, as the discomfort zone, the past zone (from time t 5 to time t 6 ) extending back until the temperature information becomes lower than 27° C.
- the discomfort determining unit 108 refers to the learning database 112 , and extracts the identification information about the reaction patterns detected in the discomfort zone estimated in step ST 185 (step ST 186 ). The discomfort determining unit 108 determines whether the identification information about the reaction patterns extracted in step ST 186 matches the discomfort reaction patterns stored in the discomfort reaction pattern database 111 (step ST 187 ). If a matching discomfort reaction pattern is stored (step ST 187 ; YES), the discomfort determining unit 108 estimates that the user is in an uncomfortable state (step ST 188 ).
- the discomfort determining unit 108 extracts the reaction pattern IDs “b- 1 ”, “b- 2 ”, and “b- 3 ” detected in the estimated discomfort zone.
- the discomfort determining unit 108 determines whether the reaction pattern IDs “b- 1 ”, “b- 2 ”, and “b- 3 ” in FIG. 20B match the discomfort reaction patterns stored in the discomfort reaction pattern database 111 in FIG. 20C .
- the discomfort determining unit 108 determines that a matching discomfort reaction pattern is stored in the discomfort reaction pattern database 111 , and estimates that the user is in an uncomfortable state.
- if a matching discomfort reaction pattern is not stored (step ST 187 ; NO), the discomfort determining unit 108 determines whether checking against all the discomfort reaction patterns has been completed (step ST 189 ). If checking against all the discomfort reaction patterns has not been completed yet (step ST 189 ; NO), the operation returns to the process in step ST 181 . If checking against all the discomfort reaction patterns has been completed (step ST 189 ; YES), on the other hand, the discomfort determining unit 108 estimates that the user is not in an uncomfortable state (step ST 190 ). If the process in step ST 188 or step ST 190 has been performed, the flowchart proceeds to the process in step ST 136 in FIG. 13 .
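- taken together, steps ST 180 through ST 190 amount to checking whether some stored discomfort reaction pattern is fully covered by the reaction IDs detected in the estimated zone. A minimal sketch under that reading, using the FIG. 20 values; the subset interpretation and names are assumptions.

```python
# Sketch of the determination in steps ST 180-ST 190. The subset
# interpretation of "match" is inferred from the FIG. 20 example.
def is_uncomfortable(detected_ids, discomfort_db):
    """detected_ids: reaction IDs extracted from the estimated zone.
    discomfort_db: {factor: [reaction-ID patterns]}."""
    for factor, patterns in discomfort_db.items():
        for pattern in patterns:
            if set(pattern) <= set(detected_ids):
                return True, factor
    return False, None

# Detected {"b-1", "b-2", "b-3"} covers the stored pattern
# ["b-1", "b-3"] for "air conditioning (hot)".
print(is_uncomfortable({"b-1", "b-2", "b-3"},
                       {"air conditioning (hot)": [["b-1", "b-3"]]}))
# -> (True, 'air conditioning (hot)')
```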
- the state estimation device includes: the action detecting unit 104 that checks at least one piece of behavioral information including motion information about a user, sound information about the user, and operation information about the user against action patterns stored in advance, and detects a matching action pattern; the reaction detecting unit 106 that checks the behavioral information and biological information about the user against reaction patterns stored in advance, and detects a matching reaction pattern; the discomfort determining unit 108 that determines that the user is in an uncomfortable state in a case where a matching action pattern has been detected, or where a matching reaction pattern has been detected and the reaction pattern matches a discomfort reaction pattern indicating an uncomfortable state of the user, the discomfort reaction pattern being stored in advance; the discomfort zone estimating unit 110 that acquires an estimation condition for estimating a discomfort zone on the basis of a detected action pattern, and estimates a discomfort zone that is the zone matching the acquired estimation condition in history information stored in advance; and the learning unit 109 that refers to the history information, and acquires and stores a discomfort reaction pattern on the basis of the estimated discomfort zone
- the learning unit 109 extracts discomfort reaction pattern candidates on the basis of the occurrence frequencies of the reaction patterns in the history information in a discomfort zone, extracts non-discomfort reaction patterns on the basis of the occurrence frequencies of the reaction patterns in the history information in the zones other than the discomfort zone, and acquires discomfort reaction patterns, which are the reaction patterns obtained by excluding the non-discomfort reaction patterns from the discomfort reaction pattern candidates (see the sketch below).
- an uncomfortable state can be determined from only the reaction patterns the user is highly likely to show depending on a discomfort factor, and the reaction patterns the user is highly likely to show regardless of discomfort factors can be excluded from the reaction patterns to be used in determining an uncomfortable state.
- the accuracy of uncomfortable state estimation can be increased.
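- as a rough sketch of this learning step (the thresholds and function names below are assumptions, not values from the patent), reaction patterns frequent inside discomfort zones become candidates, and patterns that are also frequent outside those zones are dropped as non-discomfort patterns:

```python
from collections import Counter

def learn_discomfort_patterns(in_zone_ids, out_zone_ids,
                              candidate_min=2, non_discomfort_min=2):
    """Frequency-based extraction: in_zone_ids / out_zone_ids are the
    reaction pattern IDs observed inside / outside the discomfort zones."""
    in_freq = Counter(in_zone_ids)
    out_freq = Counter(out_zone_ids)
    candidates = {rid for rid, n in in_freq.items() if n >= candidate_min}
    non_discomfort = {rid for rid, n in out_freq.items()
                      if n >= non_discomfort_min}
    return candidates - non_discomfort  # keep only discomfort-specific patterns

# "b-2" occurs often both inside and outside the zones, so it is excluded.
print(learn_discomfort_patterns(["b-1", "b-1", "b-2", "b-2", "b-3", "b-3"],
                                ["b-2", "b-2", "a-9"]))
# {'b-1', 'b-3'} (set order may vary)
```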
- the discomfort determining unit 108 determines that the user is in an uncomfortable state, in a case where a matching reaction pattern has been detected by the reaction detecting unit 106 , and the detected reaction pattern matches a discomfort reaction pattern that is stored in advance and indicates an uncomfortable state of the user.
- the environmental information acquiring unit 101 acquires temperature information detected by a temperature sensor, and noise information indicating the magnitude of noise collected by a microphone.
- humidity information detected by a humidity sensor and information about brightness detected by an illuminance sensor may be acquired.
- the environmental information acquiring unit 101 may acquire humidity information and brightness information, in addition to the temperature information and the noise information.
- the state estimation device 100 can estimate that the user is in an uncomfortable state due to dryness, high humidity, or surroundings that are too bright or too dark (a minimal data-record sketch follows).
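- a minimal sketch of such an environmental-information record is shown below; the field names and units are illustrative assumptions, with the humidity and illuminance readings left optional:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class EnvironmentalInfo:
    temperature_c: float                    # from the temperature sensor
    noise_db: float                         # noise magnitude from the microphone
    humidity_pct: Optional[float] = None    # from a humidity sensor, if present
    illuminance_lx: Optional[float] = None  # from an illuminance sensor, if present

env = EnvironmentalInfo(temperature_c=27.0, noise_db=52.3, humidity_pct=30.0)
print(env.humidity_pct is not None)  # True: dryness/high humidity can be assessed
```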
- the biological information acquiring unit 103 acquires information indicating fluctuations in the user's heart rate measured by a heart rate meter or the like as biological information.
- information indicating fluctuations in the user's brain waves measured by an electroencephalograph attached to the user may be acquired.
- the biological information acquiring unit 103 may acquire both information indicating fluctuations in the heart rate and information indicating fluctuations in the brain waves as the biological information.
- the state estimation device 100 can increase the accuracy in estimating the user's uncomfortable state in a case where a change appears in the fluctuations in the brain waves as a reaction pattern at a time when the user feels discomfort.
- the reaction patterns in the zone may not be extracted as discomfort reaction pattern candidates. In this manner, the reaction patterns corresponding to different discomfort factors can be prevented from being erroneously stored as discomfort reaction patterns into the discomfort reaction pattern database 111 . Thus, the accuracy of uncomfortable state estimation can be increased.
- the discomfort zone estimated by the discomfort zone estimating unit 110 is estimated on the basis of an estimation condition 105 d in the action information database 105 .
- the state estimation device may store information about all the device operations of the user into the learning database 112, and exclude the zone in a certain period after a device operation is performed from the discomfort zone candidates. By doing so, the reactions that occur during the certain period after a user performs a device operation can be excluded as reactions to the device operation itself, rather than signs of discomfort. Thus, the accuracy in estimating an uncomfortable state of a user can be increased.
- reaction patterns obtained by excluding the reaction patterns with low appearance frequencies are set as the discomfort reaction pattern candidates. Accordingly, only the reaction patterns a user is highly likely to show depending on the discomfort factor can be used in estimating an uncomfortable state. Thus, the accuracy in estimating an uncomfortable state of a user can be increased.
- reaction patterns obtained by excluding the reaction patterns with high appearance frequencies are set as the discomfort reaction pattern candidates. Accordingly, the non-discomfort reaction patterns highly likely to be shown by a user regardless of the discomfort factor can be excluded from those to be used in estimating an uncomfortable state. Thus, the accuracy in estimating an uncomfortable state of a user can be increased.
- the discomfort zone estimating unit 110 may exclude the zone in a certain period after the acquisition of the operation information, from the discomfort zone.
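- a hedged sketch of this exclusion rule follows; the 60-second window and the interval representation are assumptions. Each candidate discomfort zone is clipped so that it avoids the blocked period right after every device operation:

```python
def exclude_post_operation(zones, operation_times, window_s=60.0):
    """zones: list of (start, end) candidate discomfort zones;
    operation_times: timestamps of device operations.
    Returns the zones clipped to avoid (op, op + window_s)."""
    result = []
    for start, end in zones:
        segments = [(start, end)]
        for op in operation_times:
            block_start, block_end = op, op + window_s
            clipped = []
            for s, e in segments:
                if e <= block_start or s >= block_end:
                    clipped.append((s, e))                # no overlap
                else:
                    if s < block_start:
                        clipped.append((s, block_start))  # part before the block
                    if e > block_end:
                        clipped.append((block_end, e))    # part after the block
            segments = clipped
        result.extend(segments)
    return result

print(exclude_post_operation([(0.0, 300.0)], [100.0]))
# [(0.0, 100.0), (160.0, 300.0)]
```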
- a second embodiment concerns a configuration for changing the method of estimating a user's uncomfortable state depending on the amount of the history information accumulated in the learning database 112.
- FIG. 21 is a block diagram showing the configuration of a state estimation device 100 A according to the second embodiment.
- the state estimation device 100 A includes a discomfort determining unit 201 in place of the discomfort determining unit 108 of the state estimation device 100 according to the first embodiment shown in FIG. 1 , and further includes an estimator generating unit 202 .
- when an estimator has been generated by the estimator generating unit 202, the discomfort determining unit 201 estimates an uncomfortable state of a user using that estimator. In a case where no estimator has been generated by the estimator generating unit 202, the discomfort determining unit 201 estimates an uncomfortable state of the user using the discomfort reaction pattern database 111.
- the estimator generating unit 202 performs machine learning using the history information stored in the learning database 112 .
- the prescribed value is a value that is set on the basis of the number of action patterns necessary for the estimator generating unit 202 to generate an estimator.
- the estimator generating unit 202 performs machine learning.
- input signals are the reaction patterns and environmental information extracted for the respective discomfort zones estimated from the identification information about action patterns.
- output signals are information indicating a comfortable state or an uncomfortable state of a user with respect to each of the discomfort factors corresponding to the identification information about the action patterns.
- the estimator generating unit 202 generates an estimator for estimating a user's uncomfortable state from a reaction pattern and environmental information.
- the machine learning to be performed by the estimator generating unit 202 is performed by applying the deep learning method described in Non-Patent Literature 1 shown below, for example.
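- as a rough stand-in for that learning step (scikit-learn is used here instead of the deep learning method of Non-Patent Literature 1, and the feature layout, reaction-pattern vocabulary, and training data are all illustrative assumptions), an estimator can be trained on a one-hot reaction-pattern vector concatenated with environmental readings:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

REACTION_IDS = ["b-1", "b-2", "b-3"]  # assumed reaction-pattern vocabulary

def featurize(reaction_ids, temperature_c, noise_db):
    onehot = [1.0 if rid in reaction_ids else 0.0 for rid in REACTION_IDS]
    return onehot + [temperature_c, noise_db]

# Toy history rows: reaction patterns in a zone plus environmental readings,
# labeled 1 (uncomfortable) or 0 (comfortable).
X = np.array([featurize({"b-1", "b-3"}, 28.0, 50.0),
              featurize({"b-2"}, 22.0, 45.0),
              featurize({"b-1", "b-2", "b-3"}, 29.0, 55.0),
              featurize(set(), 23.0, 40.0)])
y = np.array([1, 0, 1, 0])

estimator = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
estimator.fit(X, y)
print(estimator.predict([featurize({"b-1", "b-3"}, 28.5, 52.0)]))  # likely [1]
```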
- the discomfort determining unit 201 and the estimator generating unit 202 in the state estimation device 100 A are the processing circuit 100 a shown in FIG. 6A , or are the processor 100 b that executes programs stored in the memory 100 c shown in FIG. 6B .
- FIG. 22 is a flowchart showing an operation of the estimator generating unit 202 of the state estimation device 100 A according to the second embodiment.
- the estimator generating unit 202 refers to the learning database 112 and the action information database 105 , and counts the action pattern IDs stored in the learning database 112 for each discomfort factor (step ST 200 ). The estimator generating unit 202 determines whether the total number of the action pattern IDs counted in step ST 200 is equal to or larger than a prescribed value (step ST 201 ). If the total number of the action pattern IDs is smaller than the prescribed value (step ST 201 ; NO), the operation returns to the process in step ST 200 , and the above described process is repeated.
- if the total number of the action pattern IDs is equal to or larger than the prescribed value (step ST 201; YES), the estimator generating unit 202 performs machine learning, and generates an estimator for estimating a user's uncomfortable state from a reaction pattern and environmental information (step ST 202). After the estimator generating unit 202 generates the estimator in step ST 202, the process comes to an end.
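- a minimal sketch of this trigger (the prescribed value of 50 and the row format are assumptions, not values from the patent):

```python
from collections import Counter

PRESCRIBED_VALUE = 50  # assumed minimum number of accumulated action patterns

def maybe_generate_estimator(learning_db, train_fn):
    """learning_db: iterable of (action_pattern_id, discomfort_factor) rows.
    Returns a trained estimator once enough history has accumulated,
    otherwise None (cf. steps ST 200 to ST 202)."""
    counts = Counter(factor for _, factor in learning_db)  # per-factor tally
    if sum(counts.values()) < PRESCRIBED_VALUE:
        return None               # step ST 201; NO: keep accumulating history
    return train_fn(learning_db)  # step ST 202: perform machine learning
```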
- FIG. 23 is a flowchart showing an operation of the discomfort determining unit 201 of the state estimation device 100 A according to the second embodiment.
- in FIG. 23, the same steps as those in the flowchart of the first embodiment shown in FIG. 19 are denoted by the same reference numerals as those used in FIG. 19, and explanation of them is omitted here.
- the discomfort determining unit 201 refers to the state of the estimator generating unit 202, and determines whether an estimator has been generated (step ST 211). If an estimator has been generated (step ST 211; YES), the discomfort determining unit 201 inputs a reaction pattern and environmental information as input signals to the estimator, and acquires a result of estimation of the user's uncomfortable state as an output signal (step ST 212). The discomfort determining unit 201 refers to the output signal acquired in step ST 212, and determines whether the estimator has estimated an uncomfortable state of the user (step ST 213). When the estimator has estimated an uncomfortable state of the user (step ST 213; YES), the discomfort determining unit 201 estimates that the user is in an uncomfortable state (step ST 214).
- if no estimator has been generated (step ST 211; NO), on the other hand, the discomfort determining unit 201 refers to the discomfort reaction pattern database 111, and determines whether any discomfort reaction pattern is stored (step ST 180). After that, the processes from step ST 181 to step ST 190 are performed. If the process in step ST 188, step ST 190, or step ST 214 has been performed, the flowchart proceeds to the process in step ST 136 in FIG. 13.
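- this dispatch between the two estimation methods can be sketched as follows (hypothetical names; the estimator is assumed to expose a scikit-learn-style predict):

```python
def determine_discomfort(estimator, features, detected_ids, pattern_db):
    """Use the learned estimator when available (step ST 211; YES),
    otherwise fall back to the discomfort reaction pattern database."""
    if estimator is not None:
        return bool(estimator.predict([features])[0])  # steps ST 212 to ST 214
    return bool(set(detected_ids) & pattern_db)        # steps ST 180 to ST 190
```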
- the state estimation device includes the estimator generating unit 202 that generates an estimator for estimating whether a user is in an uncomfortable state, on the basis of a reaction pattern detected by the reaction detecting unit 106 and environmental information in a case where the number of the action patterns accumulated as history information is equal to or larger than a prescribed value.
- the discomfort determining unit 201 determines whether the user is in an uncomfortable state, by referring to the result of the estimation by the estimator.
- an uncomfortable state of the user and a discomfort factor can be estimated with an estimator generated through machine learning.
- the accuracy in estimating an uncomfortable state of a user can be increased.
- the estimator generating unit 202 performs machine learning, using input signals that are the reaction patterns stored in the learning database 112 .
- information not registered in the action information database 105 and the reaction information database 107 may be stored into the learning database 112 , and the stored information may be used as input signals in the machine learning. This makes it possible to learn users' habits that are not registered in the action information database 105 and the reaction information database 107 , and the accuracy in estimating an uncomfortable state of a user can be increased.
- a third embodiment concerns a configuration for estimating a discomfort factor as well as an uncomfortable state, from a detected reaction pattern.
- FIG. 24 is a block diagram showing the configuration of a state estimation device 100 B according to the third embodiment.
- the state estimation device 100 B includes a discomfort determining unit 301 and a discomfort reaction pattern database 302 , in place of the discomfort determining unit 108 and the discomfort reaction pattern database 111 of the state estimation device 100 of the first embodiment shown in FIG. 1 .
- the discomfort determining unit 301 checks the input identification information against the discomfort reaction patterns that are stored in the discomfort reaction pattern database 302 and indicate uncomfortable states of users. In a case where a reaction pattern matching the input identification information is stored in the discomfort reaction pattern database 302 , the discomfort determining unit 301 estimates that the user is in an uncomfortable state. The discomfort determining unit 301 further refers to the discomfort reaction pattern database 302 , and, in a case where the discomfort factor can be identified from the input identification information, identifies the discomfort factor. The discomfort determining unit 301 outputs a signal indicating that an uncomfortable state of the user has been detected, and, in a case where the discomfort factor has been successfully identified, outputs a signal indicating information about the discomfort factor to the outside.
- the discomfort reaction pattern database 302 is a database that stores discomfort reaction patterns that are the results of learning by the learning unit 109 .
- FIG. 25 is a table showing an example of storage in the discomfort reaction pattern database 302 of the state estimation device 100 B according to the third embodiment.
- the discomfort reaction pattern database 302 shown in FIG. 25 contains the following items: discomfort factors 302 a , first discomfort reaction patterns 302 b , and second discomfort reaction patterns 302 c .
- the same items as the items of the discomfort factors 105 b in the action information database 105 (see FIG. 2 ) are written as the discomfort factors 302 a .
- the ID of a discomfort reaction pattern corresponding to more than one discomfort factor 302 a is written as the first discomfort reaction patterns 302 b .
- the IDs of discomfort reaction patterns each corresponding to a particular discomfort factor are written as the second discomfort reaction patterns 302 c .
- the IDs of the discomfort reaction patterns written as the first and second discomfort reaction patterns 302 b and 302 c correspond to the IDs 107 a shown in FIG. 3 .
- the discomfort determining unit 301 acquires the discomfort factor 302 a associated with the matching identification information. Thus, the discomfort factor is identified.
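- because each second discomfort reaction pattern is specific to one factor, a match on one of them directly yields the associated discomfort factor 302 a. A minimal lookup sketch follows, with illustrative IDs and factors that are not taken from the actual database:

```python
SECOND_PATTERNS = {"b-2": "temperature (hot)", "b-4": "noise"}  # ID -> factor

def lookup_factor(detected_ids):
    """Return the discomfort factor tied to a matching second pattern, if any."""
    for rid in detected_ids:
        if rid in SECOND_PATTERNS:
            return SECOND_PATTERNS[rid]
    return None  # only first (factor-shared) patterns matched: factor unknown

print(lookup_factor({"b-1", "b-2"}))  # 'temperature (hot)'
```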
- the discomfort determining unit 301 and the discomfort reaction pattern database 302 in the state estimation device 100 B are the processing circuit 100 a shown in FIG. 6A , or are the processor 100 b that executes programs stored in the memory 100 c shown in FIG. 6B .
- FIG. 26 is a flowchart showing an operation of the discomfort determining unit 301 of the state estimation device 100 B according to the third embodiment.
- in FIG. 26, the same steps as those in the flowchart of the first embodiment shown in FIG. 13 are denoted by the same reference numerals as those used in FIG. 13, and explanation of them is omitted here.
- if the discomfort determining unit 301 determines in step ST 134 that identification information about a reaction pattern has been input (step ST 134; YES), the discomfort determining unit 301 checks the input identification information about the reaction pattern against the first discomfort reaction patterns 302 b and the second discomfort reaction patterns 302 c stored in the discomfort reaction pattern database 302, and estimates an uncomfortable state of the user (step ST 301).
- the discomfort determining unit 301 refers to the result of the estimation in step ST 301 , and determines whether the user is in an uncomfortable state (step ST 302 ).
- the discomfort determining unit 301 refers to the result of the checking, and determines whether the discomfort factor has been identified (step ST 303). If the discomfort factor has been identified (step ST 303; YES), the discomfort determining unit 301 outputs, to the outside, a signal indicating that an uncomfortable state of the user has been detected, together with the discomfort factor (step ST 304). If no discomfort factor has been identified (step ST 303; NO), on the other hand, the discomfort determining unit 301 outputs, to the outside, a signal indicating that an uncomfortable state of the user has been detected but the discomfort factor is unknown (step ST 305).
- if the process in step ST 133, step ST 304, or step ST 305 has been performed, if no identification information about a reaction pattern has been input (step ST 134; NO), or if the user is determined not to be in an uncomfortable state (step ST 302; NO), the flowchart returns to the process in step ST 101 in FIG. 7.
- next, the process in step ST 301 in the flowchart in FIG. 26 is described in detail.
- FIG. 27 is a flowchart showing an operation of the discomfort determining unit 301 of the state estimation device 100 B according to the third embodiment.
- in FIG. 27, the same steps as those in the flowchart of the first embodiment shown in FIG. 19 are denoted by the same reference numerals as those used in FIG. 19, and explanation of them is omitted here.
- the discomfort determining unit 301 determines whether the extracted identification information about the reaction patterns matches a combination of the first and second discomfort reaction patterns (step ST 310). If the identification information matches a combination of the first and second discomfort reaction patterns (step ST 310; YES), the discomfort determining unit 301 estimates that the user is in an uncomfortable state, and estimates the discomfort factor (step ST 311). If the identification information does not match any combination of the first and second discomfort reaction patterns (step ST 310; NO), on the other hand, the discomfort determining unit 301 determines whether checking against all the combinations of the first and second discomfort reaction patterns has been completed (step ST 312).
- if checking against all the combinations of the first and second discomfort reaction patterns has not been completed yet (step ST 312; NO), the discomfort determining unit 301 returns to the process in step ST 181. If checking against all the combinations of the first and second discomfort reaction patterns has been completed (step ST 312; YES), on the other hand, the discomfort determining unit 301 determines whether the identification information about the reaction pattern matches a first discomfort reaction pattern (step ST 313). If the identification information matches a first discomfort reaction pattern (step ST 313; YES), the discomfort determining unit 301 estimates that the user is in an uncomfortable state (step ST 314). In the process in step ST 314, only an uncomfortable state is estimated, and the discomfort factor is not estimated.
- if the identification information does not match any first discomfort reaction pattern (step ST 313; NO), on the other hand, the discomfort determining unit 301 estimates that the user is not in an uncomfortable state (step ST 315). If the discomfort determining unit 301 determines in step ST 180 that no discomfort reaction pattern is stored (step ST 180; NO), the operation also proceeds to the process in step ST 315.
- if the process in step ST 311, step ST 314, or step ST 315 has been performed, the flowchart proceeds to the process in step ST 302 in FIG. 26.
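- the cascade in steps ST 310 to ST 315 can be sketched as below (illustrative database contents, not the actual patterns): a first+second combination identifies the factor, a first pattern alone signals discomfort with an unknown factor, and otherwise no discomfort is estimated:

```python
DB = {  # factor -> shared "first" patterns and factor-specific "second" patterns
    "temperature (hot)": {"first": {"b-1"}, "second": {"b-2"}},
    "noise":             {"first": {"b-1"}, "second": {"b-4"}},
}

def estimate_state(detected_ids, db):
    for factor, pats in db.items():
        if pats["first"] <= detected_ids and pats["second"] & detected_ids:
            return "uncomfortable", factor       # ST 310; YES -> ST 311
    all_first = set().union(*(p["first"] for p in db.values()))
    if all_first & detected_ids:
        return "uncomfortable", None             # ST 313; YES -> ST 314
    return "not uncomfortable", None             # ST 315

print(estimate_state({"b-1", "b-2"}, DB))  # ('uncomfortable', 'temperature (hot)')
print(estimate_state({"b-1"}, DB))         # ('uncomfortable', None)
```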
- the discomfort determining unit 301 identifies the discomfort factor from the reaction pattern corresponding to the particular discomfort factor. Accordingly, in a case where a discomfort factor can be identified, the identified discomfort factor can be promptly removed. Further, in a case where the discomfort factor is unknown, a signal to that effect is output, to inquire of the user about the discomfort factor, for example. In this manner, the discomfort factor can be quickly identified and removed. Thus, the user's comfort can be increased.
- in a case where matching with a first discomfort reaction pattern corresponding to more than one discomfort factor is detected, the discomfort determining unit 301 promptly estimates that the user is in an uncomfortable state, though the discomfort factor is unknown.
- alternatively, the state estimation device may include a timer that operates only in a case where matching with a first discomfort reaction pattern corresponding to more than one discomfort factor is detected. Only when the time measured by the timer exceeds a threshold does the discomfort determining unit 301 estimate that the user is in an uncomfortable state with an unknown discomfort factor. This can prevent frequent inquiries to the user about discomfort factors. Thus, the user's comfort can be increased.
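- a sketch of that timer variant follows (the 30-second threshold and the class shape are assumptions): the unknown-factor estimate is raised only once a first-pattern-only match has persisted past the threshold.

```python
class UnknownFactorTimer:
    """Debounces the unknown-factor discomfort estimate."""

    def __init__(self, threshold_s=30.0):
        self.threshold_s = threshold_s
        self.started_at = None  # time at which the first-pattern match began

    def update(self, now_s, first_pattern_matched):
        if not first_pattern_matched:
            self.started_at = None   # match ended: reset the timer
            return False
        if self.started_at is None:
            self.started_at = now_s  # match began: start the timer
        return now_s - self.started_at >= self.threshold_s

timer = UnknownFactorTimer()
print(timer.update(0.0, True))   # False: the match has only just begun
print(timer.update(35.0, True))  # True: persisted past the threshold
```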
- a state estimation device can estimate a state of a user, without the user inputting information indicating his/her emotional state. Accordingly, the state estimation device is suitable for estimating a user state while reducing the burden on the user in an environmental control system or the like.
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2016/087204 WO2018109863A1 (ja) | 2016-12-14 | 2016-12-14 | State estimation device |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20200060597A1 (en) | 2020-02-27 |
Family
ID=62558128
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/344,091 (US20200060597A1, abandoned) | State estimation device | 2016-12-14 | 2016-12-14 |
Country Status (5)
| Country | Link |
|---|---|
| US (1) | US20200060597A1 (ja) |
| JP (1) | JP6509459B2 (ja) |
| CN (1) | CN110049724B (ja) |
| DE (1) | DE112016007435T5 (ja) |
| WO (1) | WO2018109863A1 (ja) |
Families Citing this family (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP7297300B2 (ja) * | 2019-08-06 | 2023-06-26 | 株式会社Agama-X | Information processing device and program |
| US12097031B2 (en) * | 2021-03-15 | 2024-09-24 | Mitsubishi Electric Corporation | Emotion estimation apparatus and emotion estimation method |
| JP2023174323A (ja) | 2022-05-27 | 2023-12-07 | オムロン株式会社 | Environment control system, environment control method, and environment control program |
Family Cites Families (17)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP3993069B2 (ja) * | 2002-10-30 | 2007-10-17 | 三菱電機株式会社 | Control device using brain wave signals |
| JP2004348432A (ja) | 2003-05-22 | 2004-12-09 | Home Well:Kk | Health management support system |
| JP2008532587A (ja) | 2005-02-22 | 2008-08-21 | ヘルス−スマート リミテッド | Method and system for physiological and psychological/physiological monitoring and uses thereof |
| JP2007167105A (ja) | 2005-12-19 | 2007-07-05 | Olympus Corp | Mind-body correlation data evaluation device and mind-body correlation data evaluation method |
| JP5292671B2 (ja) * | 2006-03-06 | 2013-09-18 | トヨタ自動車株式会社 | Wakefulness level estimation device, system, and method |
| JP2008099884A (ja) | 2006-10-19 | 2008-05-01 | Toyota Motor Corp | State estimation device |
| CN102485165A (zh) * | 2010-12-02 | 2012-06-06 | 财团法人资讯工业策进会 | Physiological signal detection system and device capable of displaying emotion, and emotion display method |
| WO2012117335A2 (en) * | 2011-03-01 | 2012-09-07 | Koninklijke Philips Electronics N.V. | System and method for operating and/or controlling a functional unit and/or an application based on head movement |
| JP5194157B2 (ja) | 2011-09-27 | 2013-05-08 | 三菱電機株式会社 | Printed circuit board holding structure |
| CN103111006A (zh) * | 2013-01-31 | 2013-05-22 | 江苏中京智能科技有限公司 | Intelligent mood adjustment instrument |
| EP3060101B1 (en) * | 2013-10-22 | 2018-05-23 | Koninklijke Philips N.V. | Sensor apparatus and method for monitoring a vital sign of a subject |
| CN105615902A (zh) * | 2014-11-06 | 2016-06-01 | 北京三星通信技术研究有限公司 | Emotion monitoring method and device |
| CN104434066A (zh) * | 2014-12-05 | 2015-03-25 | 上海电机学院 | Driver physiological signal monitoring system and method |
| JP6588035B2 (ja) * | 2014-12-12 | 2019-10-09 | 株式会社デルタツーリング | Biological state analysis device and computer program |
| JP6321571B2 (ja) * | 2015-03-10 | 2018-05-09 | 日本電信電話株式会社 | Estimation device, estimation method, and estimation program using sensor data |
| CN105721936B (zh) * | 2016-01-20 | 2018-01-16 | 中山大学 | Context-aware smart TV program recommendation system |
| CN106200905B (zh) * | 2016-06-27 | 2019-03-29 | 联想(北京)有限公司 | Information processing method and electronic device |
- 2016-12-14 US US16/344,091 patent/US20200060597A1/en not_active Abandoned
- 2016-12-14 CN CN201680091415.1A patent/CN110049724B/zh active Active
- 2016-12-14 JP JP2018556087A patent/JP6509459B2/ja not_active Expired - Fee Related
- 2016-12-14 DE DE112016007435.2T patent/DE112016007435T5/de active Pending
- 2016-12-14 WO PCT/JP2016/087204 patent/WO2018109863A1/ja not_active Ceased
Patent Citations (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20150099946A1 (en) * | 2013-10-09 | 2015-04-09 | Nedim T. SAHIN | Systems, environment and methods for evaluation and management of autism spectrum disorder using a wearable data collection device |
Cited By (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20190147867A1 (en) * | 2017-11-10 | 2019-05-16 | Hyundai Motor Company | Dialogue system and method for controlling thereof |
| US10937420B2 (en) * | 2017-11-10 | 2021-03-02 | Hyundai Motor Company | Dialogue system and method to identify service from state and input information |
| US20220274608A1 (en) * | 2019-07-19 | 2022-09-01 | Nec Corporation | Comfort driving data collection system, driving control device, method, and program |
| US12103543B2 (en) * | 2019-07-19 | 2024-10-01 | Nec Corporation | Comfort driving data collection system, driving control device, method, and program |
Also Published As
| Publication number | Publication date |
|---|---|
| JP6509459B2 (ja) | 2019-05-08 |
| WO2018109863A1 (ja) | 2018-06-21 |
| JPWO2018109863A1 (ja) | 2019-06-24 |
| CN110049724B (zh) | 2021-07-13 |
| DE112016007435T5 (de) | 2019-07-25 |
| CN110049724A (zh) | 2019-07-23 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20200060597A1 (en) | | State estimation device |
| JP7389421B2 (ja) | | Device for estimating psychiatric/nervous system diseases |
| CN107106044B (zh) | | Wearable device, and wearing quality detection method and apparatus |
| US11315557B2 (en) | | Method and system for providing voice recognition trigger and non-transitory computer-readable recording medium |
| CN117828282B (zh) | | Efficient data processing method based on adaptive filtering |
| Saidani et al. | | An efficient human activity recognition using hybrid features and transformer model |
| JP6468823B2 (ja) | | Biometric identification system and electronic apparatus |
| CN109448711A (zh) | | Speech recognition method, apparatus, and computer storage medium |
| KR20180046649A (ko) | | System for detecting a user's interaction intention using multimodal perception, and method for detecting the user's interaction intention using the same |
| JP2019154575A (ja) | | Personal identification device and feature collection device |
| WO2017219450A1 (zh) | | Information processing method and apparatus, and mobile terminal |
| Castellana et al. | | Cepstral Peak Prominence Smoothed distribution as discriminator of vocal health in sustained vowel |
| JP7307507B2 (ja) | | Pathological condition analysis system, pathological condition analysis device, pathological condition analysis method, and pathological condition analysis program |
| US20210264939A1 (en) | | Attribute identifying device, attribute identifying method, and program storage medium |
| WO2016139844A1 (ja) | | State detection method, state detection device, and state detection program |
| Hussein et al. | | Robust recognition of human activities using smartphone sensor data |
| US20170371418A1 (en) | | Method for recognizing multiple user actions on basis of sound information |
| US11270109B2 (en) | | Interactive method and interactive system for smart watch |
| CN115969322B (zh) | | Exercise type recognition system, method, and computer device |
| Whitfield | | Exploration of metrics for quantifying formant space: Implications for clinical assessment of Parkinson disease |
| Sadiq et al. | | Attention-Based Deep Learning Model for Early Detection of Parkinson's Disease. |
| WO2022111203A1 (zh) | | Heart rate detection method and device |
| CN108962389A (zh) | | Method and system for risk warning |
| WO2019171586A1 (ja) | | Discomfort state determination device |
| LU507134B1 (en) | | Intelligent voice recognition method and system for AR helmets |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | AS | Assignment | Owner name: MITSUBISHI ELECTRIC CORPORATION, JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:OGAWA, ISAMU;OTSUKA, TAKAHIRO;SIGNING DATES FROM 20190312 TO 20190318;REEL/FRAME:048972/0160 |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: ADVISORY ACTION MAILED |
| | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |