WO2016145607A1 - Interaction between user and interactive device - Google Patents
- Publication number
- WO2016145607A1, PCT/CN2015/074357
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- interactive device
- communication information
- interaction
- eeg
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/015—Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/10—Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/29—Instruments characterised by the way in which information is handled, e.g. showing information on plural displays or prioritising information according to driving conditions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/60—Instruments characterised by their location or relative disposition in or on vehicles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/80—Arrangements for controlling instruments
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/26—Power supply means, e.g. regulation thereof
- G06F1/32—Means for saving power
- G06F1/3203—Power management, i.e. event-based initiation of a power-saving mode
- G06F1/3206—Monitoring of events, devices or parameters that trigger a change in power modality
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/02—Input arrangements using manually operated switches, e.g. using keyboards or dials
- G06F3/023—Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/146—Instrument input by gesture
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/40—Hardware adaptations for dashboards or instruments
- B60K2360/48—Sensors
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/77—Instrument locations other than the dashboard
- B60K2360/782—Instrument locations other than the dashboard on the steering wheel
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/038—Indexing scheme relating to G06F3/038
- G06F2203/0381—Multimodal input, i.e. interface arrangements enabling the user to issue commands by simultaneous use of input devices of different nature, e.g. voice plus gesture on digitizer
Definitions
- the present disclosure relates in general to the field of interaction between a user and an interactive device, and more particularly, to an apparatus, system and method for interaction between a user and an interactive device and an associated interactive device.
- HMI (Human Machine Interaction)
- users can interact with an interactive device via speech, gesture, or countenance, etc.
- An interactive device can receive communication information from a user, such as speech, gesture, or countenance; process the received information by analyzing, recognizing, etc.; and create a response accordingly.
- the interactions are initiated by users.
- users need to wake up the interactive device by pressing a button or providing wake-up information, such as speaking a wake-up word or performing a wake-up gesture. The device will then be woken up and enter a “wake-up” state wherein the user’s speech, gesture, or countenance will be processed to enable the interaction with the user.
- An aspect of the present disclosure is to provide an interactive device for interaction with a user, comprising: a first receiver, receiving an Electroencephalograph (EEG) signal of a user; a second receiver, receiving communication information from the user; a determination device, determining, based on the received EEG signal, whether the user is intending to interact with the interactive device; and a processing device, processing the received communication information to enable the interaction between the interactive device and the user, in response to the determination device determining that the user is intending to interact with the interactive device.
- EEG (Electroencephalograph)
- the processing device can ignore the received communication information in response to the determination device determining that the user is not intending to interact with the interactive device.
- the determination device can perform the determining based on a classifier which is trained by reference EEG data.
- the processing performed by the processing device can comprise at least one of: recording the communication information; analyzing the communication information; recognizing the communication information; generating a response for the user based on the communication information.
- the communication information can include a speech of the user.
- the communication information can include a gesture of the user.
- A further aspect of the present disclosure is to provide a system for interaction with a user, comprising: the interactive device of any one of claims 1 to 6, and an EEG sensor, sensing an EEG signal of a user and providing the sensed EEG signal to the interactive device.
- the interactive device and the EEG sensor are embedded in a vehicle.
- the EEG sensor is embedded in at least one of a headrest, a seat, and a steering wheel of the vehicle.
- the EEG sensor is embedded in a wearable device for the user.
- A further aspect of the present disclosure is to provide an apparatus for interaction between an interactive device and a user, comprising: a receiver, receiving an Electroencephalograph (EEG) signal of a user; a determination device, determining, based on the EEG signal, whether the user is intending to interact with the interactive device; and a state switcher, switching the interactive device into a wake-up state in response to a determination that the user is intending to interact with the interactive device, wherein the wake-up state enables the interaction between the interactive device and the user.
- the state switcher can switch the interactive device into a waiting state in response to a determination that the user is not intending to interact with the interactive device, wherein the waiting state disables the interaction between the interactive device and the user.
- the determination device can perform the determining based on a classifier which is trained by reference EEG data.
- A further aspect of the present disclosure is to provide a method for interaction between an interactive device and a user, comprising: receiving an Electroencephalograph (EEG) signal of a user; receiving communication information from the user; determining, based on the received EEG signal, whether the user is intending to interact with the interactive device; and processing the received communication information to enable the interaction between the interactive device and the user, in response to determining that the user is intending to interact with the interactive device.
- the method can further comprise ignoring the received communication information in response to determining that the user is not intending to interact with the interactive device.
- the determining can be performed based on a classifier which is trained by reference EEG data.
- the processing can comprise at least one of recording the communication information; analyzing the communication information; recognizing the communication information; generating a response for the user based on the communication information.
- the communication information can include a speech of the user.
- the communication information can include a gesture of the user.
- A further aspect of the present disclosure is to provide a method for interaction between an interactive device and a user, comprising: receiving an Electroencephalograph (EEG) signal of a user; determining, based on the EEG signal, whether the user is intending to interact with the interactive device; and switching the interactive device into a wake-up state in response to determining that the user is intending to interact with the interactive device, wherein the wake-up state enables the interaction between the interactive device and the user.
- the method can further comprise switching the interactive device into a waiting state in response to determining that the user is not intending to interact with the interactive device, wherein the waiting state disables the interaction between the interactive device and the user.
- the determining can be performed based on a classifier which is trained by reference EEG data.
- Fig. 1 shows a block diagram of an apparatus for interaction between an interactive device and a user according to one embodiment of this disclosure.
- Fig. 2 shows a block diagram of an interactive device for interaction with a user according to one embodiment of this disclosure.
- Fig. 3 shows a block diagram of a system for interaction with a user according to one embodiment of this disclosure.
- Fig. 4 shows a flow chart of a method for interaction between an interactive device and a user according to an embodiment of this disclosure.
- Fig. 5 shows a flow chart of a method for interaction between an interactive device and a user according to an embodiment of this disclosure.
- a general idea of the present invention is to actuate and end an interaction between an interactive device and a user automatically according to the intention of the user, without the need to press a button or provide wake-up information via, for example, speech, gesture, or countenance.
- Fig. 1 shows a block diagram of an apparatus 100 for interaction between an interactive device and a user according to one embodiment of this disclosure.
- the apparatus can comprise a receiver 101 for receiving an Electroencephalograph (EEG) signal of a user; a determination device 102 for determining, based on the EEG signal, whether the user is intending to interact with the interactive device; and a state switcher 103 for switching the interactive device into a wake-up state in response to a determination that the user is intending to interact with the interactive device, wherein the wake-up state enables the interaction between the interactive device and the user.
- the EEG is a recording of electrical activity along the human body, for example, along the surface of the human body such as the scalp or skin.
- the type of neural oscillations of a user can be observed in EEG signals, so that the intention of a user can be determined based on his/her EEG signals.
- the EEG signal can be used to determine the intention of the user, so as to enable (actuate) or disable (end) the interaction between the user and the interactive device automatically when the user has the corresponding intention.
- the user thus does not have to press a button or provide a wake-up speech, gesture or countenance for indicating his/her intention to the interactive device.
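As background for how intention-relevant information might be read from an EEG signal, a common first step (an illustration only; the disclosure does not prescribe any particular signal processing) is to summarize the signal as power in the classic frequency bands, since the type of neural oscillation shows up as band power:

```python
import numpy as np

def band_power(eeg, fs, band):
    """Mean spectral power of a 1-D EEG sample array within `band` (lo, hi) in Hz."""
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(eeg)) ** 2 / len(eeg)
    lo, hi = band
    return psd[(freqs >= lo) & (freqs < hi)].mean()

def eeg_features(eeg, fs=256):
    """Feature vector of theta (4-8 Hz), alpha (8-13 Hz) and beta (13-30 Hz) power."""
    return np.array([band_power(eeg, fs, b) for b in ((4, 8), (8, 13), (13, 30))])
```

For example, a pure 10 Hz (alpha-band) oscillation yields a feature vector dominated by the second component.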
- the state switcher 103 can switch the interactive device into a waiting state in response to a determination that the user is not intending to interact with the interactive device, wherein the waiting state disables the interaction between the interactive device and the user.
- the interactive device will process (e.g. record, or analyze, etc.) all the communication information (e.g. speech, gesture, or countenance, etc.) from the user after being actuated; therefore, if the user turns to communicate (e.g. talk) with others instead of the interactive device, the device usually falls into a chaotic state.
- This example of the embodiment can prevent such a problem by disabling the interaction between the interactive device and the user when the user is not intending such interaction.
- the determination device 102 can perform the determining based on a classifier which is trained by reference EEG data.
- EEG data of a human intending and not intending to interact with the interactive device can be collected as the reference EEG data and then used to train a classifier.
- the trained classifier can then be used for determining the intention of the user based on newly arriving EEG signals from the user.
- the classifier can be built and trained through any known technology in the art.
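The disclosure leaves the choice of classifier open; as a hedged sketch (scikit-learn and the synthetic "reference EEG data" below are assumptions for illustration, not part of the disclosure), a linear classifier could be trained on labeled band-power features:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression  # assumed available

rng = np.random.default_rng(0)
# Hypothetical reference EEG data: band-power feature vectors labeled 1
# ("intending to interact") or 0 ("not intending"), e.g. from a calibration session.
X_intend = rng.normal(loc=(2.0, 5.0, 3.0), scale=0.5, size=(100, 3))
X_idle = rng.normal(loc=(4.0, 2.0, 1.0), scale=0.5, size=(100, 3))
X = np.vstack([X_intend, X_idle])
y = np.array([1] * 100 + [0] * 100)

clf = LogisticRegression().fit(X, y)

def user_intends(features):
    """Classify a newly arriving feature vector as intending / not intending."""
    return bool(clf.predict([features])[0])
```

Any other standard classifier (SVM, LDA, etc.) could be substituted; linear models are common in EEG work because reference data sets are usually small.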
- the state switcher 103 can switch the state of the interactive device by generating and sending a “wake-up” command or a “waiting” command to the interactive device according to the determination of the user’s intention. The interactive device can then enter the “wake-up” state to enable the interaction by processing the communication information from the user, or enter the “waiting” state to disable the interaction by ignoring the communication information.
- the state switcher 103 can also trigger an indicator embedded in or separate from the interactive device, so as to indicate to the user the state of the interactive device. The user may begin or stop his/her interaction with the interactive device according to the indication.
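The state-switching behavior described above can be sketched as follows (the class and method names are illustrative, not taken from the disclosure):

```python
class InteractiveDevice:
    """Minimal stand-in for the interactive device: it only tracks its state."""
    def __init__(self):
        self.state = "waiting"

    def set_state(self, state):
        self.state = state

class StateSwitcher:
    """Sends a "wake-up" or "waiting" command to the device according to the
    determined intention, and optionally triggers an indicator (e.g. a light)."""
    def __init__(self, device, indicator=None):
        self.device = device
        self.indicator = indicator

    def on_determination(self, user_intends):
        command = "wake-up" if user_intends else "waiting"
        self.device.set_state(command)
        if self.indicator is not None:
            self.indicator(command)  # show the new state to the user
```

Passing the indicator as a callable keeps the switcher independent of whether the indicator is embedded in or separate from the device.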
- the interactive device in this and other embodiment of this disclosure can employ any type of technologies to interact with humans via speech, gesture, or countenance, etc.
- the interactive device can be embedded in any entity that needs such interaction, such as a vehicle, a computer, or a household electrical appliance (such as a TV set or a fridge), etc.
- the apparatus 100 in Fig. 1 can be separate from the interactive device, integrated with the interactive device, or integrated with an EEG sensor that senses the EEG signal of the user.
- Fig. 2 shows a block diagram of an interactive device 200 for interaction with a user according to one embodiment of this disclosure.
- the device 200 can comprise a first receiver 201 for receiving an Electroencephalograph (EEG) signal of a user, and a second receiver 202 for receiving communication information from the user, wherein the communication information can be in any form that can be used for communication between the user and other humans or machines, including but not limited to speech, gesture, countenance, etc.
- the interactive device 200 can further comprise a determination device 203 which determines, based on the received EEG signal, whether the user is intending to interact with the interactive device 200.
- the determination device 203 can work similarly to the determination device 102 in embodiment 1.
- the interactive device 200 can further comprise a processing device 204 which processes the received communication information to enable the interaction between the interactive device 200 and the user, in response to the determination device 203 determining that the user is intending to interact with the interactive device 200.
- This embodiment uses the EEG signal to determine the intention of the user, and actuates the processing device 204 of the interactive device 200 to process the received information from the user to enable the interaction automatically when the user has such intention.
- the user thus does not have to press a button or provide a wake-up speech, gesture or countenance for indicating his/her intention to the interactive device.
- the processing performed by the processing device 204 can comprise any processing that can enable the interaction between the user and the interactive device 200, including but not being limited to at least one of recording the communication information; analyzing the communication information; recognizing the communication information; and generating a response for the user based on the communication information, etc.
- the processing device 204 can ignore the received communication information in response to the determination device 203 determining that the user is not intending to interact with the interactive device. Thereby, the interactive device 200 will not process the communication information received when the user turns to communicate with others instead of the interactive device. A possible chaotic state can be prevented.
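The processing device's process-or-ignore behavior can be illustrated with a small sketch (the recognition step is a trivial stand-in for a real speech or gesture recognizer; the function name is illustrative):

```python
def handle_communication(user_intends, info, record_log):
    """Record, recognize and respond to `info` only when the EEG-based
    determination says the user intends to interact; otherwise ignore it."""
    if not user_intends:
        return None                        # waiting: information is simply ignored
    record_log.append(info)                # record the communication information
    recognized = info.strip().lower()      # stand-in for real recognition
    return "response to: " + recognized    # generate a response for the user
```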
- the interactive device 200 can further comprise an indicator indicating to the user the state of the interactive device 200 by light, sound, and the like, according to the determination result of the determination device 203.
- the user may be indicated by the indicator to begin or stop the interaction with the interactive device 200.
- Fig. 3 shows a block diagram of a system 300 for interaction with a user according to one embodiment of this disclosure.
- the system can comprise an interactive device 301 similar to the interactive device 200 of the embodiment in Fig. 2, and an EEG sensor 302 for sensing an EEG signal of a user and providing the sensed EEG signal to the interactive device 301.
- the interactive device 301 and the EEG sensor 302 can be separated or integrated together, and they can be embedded in any entity that needs to interact with users, such as a vehicle, a computer, or a household electrical appliance, etc.
- the EEG sensor 302 can be embedded in parts (such as a headrest, a seat, or a steering wheel, etc) of the vehicle.
- the EEG sensor can be embedded in a wearable device for the user.
- Fig. 4 shows a flow chart of a method for interaction between an interactive device and a user according to an embodiment of this disclosure.
- the method can comprise: 401, receiving an Electroencephalograph (EEG) signal of a user; 402, receiving communication information from the user; 403, determining, based on the received EEG signal, whether the user is intending to interact with the interactive device; and 404, processing the received communication information to enable the interaction between the interactive device and the user, in response to determining that the user is intending to interact with the interactive device.
- the method can further comprise ignoring the received communication information in response to determining that the user is not intending to interact with the interactive device.
- the determining in step 403 can be performed based on, for example, a classifier which is trained by reference EEG data.
- the processing in step 404 can comprise at least one of: recording the communication information; analyzing the communication information; recognizing the communication information; generating a response for the user based on the communication information.
- the method can further comprise indicating to the user the state of the interactive device according to the determination result in step 403.
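Steps 401-404 can be put together in a single pass as follows, with the determination and processing supplied as callables (the function signature is a hypothetical interface, not defined by the disclosure):

```python
def interact_once(eeg_signal, communication, determine, process):
    """One pass of the method of Fig. 4: receive an EEG signal and communication
    information, determine intention, and process or ignore accordingly."""
    if determine(eeg_signal):          # step 403: EEG-based intention determination
        return process(communication)  # step 404: process to enable the interaction
    return None                        # user not intending: information ignored
```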
- Fig. 5 shows a flow chart of a method for interaction between an interactive device and a user according to an embodiment of this disclosure.
- the method can comprise: 501, receiving an Electroencephalograph (EEG) signal of a user; 502, determining, based on the EEG signal, whether the user is intending to interact with the interactive device; and 503, switching the interactive device into a wake-up state in response to a determination that the user is intending to interact with the interactive device, wherein the wake-up state enables the interaction between the interactive device and the user.
- the method can further comprise switching the interactive device into a waiting state in response to a determination that the user is not intending to interact with the interactive device, wherein the waiting state disables the interaction between the interactive device and the user.
- the determining in step 502 can be performed based on, for example, a classifier which is trained by reference EEG data.
- the switching of the state of the interactive device can be performed by generating and sending a “wake-up” command or a “waiting” command to the interactive device according to the determination result in step 502. The interactive device can then enter the “wake-up” state to enable the interaction by processing the communication information from the user, or enter the “waiting” state to disable the interaction by ignoring the communication information.
- the method can further comprise indicating to the user the state of the interactive device according to the determination result in step 502.
- an application example of voice interaction between a user and a vehicle is provided hereafter. It can be understood that the application scenario of the embodiments of this disclosure is not limited to the vehicle, but can also be any other scenario providing interaction between users and machines; the details in the application example are only for the purpose of facilitating description and understanding, and in no way limit the disclosure thereto.
- an EEG sensor can be embedded in a head wearable device worn around the head of the driver.
- the interactive device can be, for example, the HMI system.
- his intention can be determined by a determination device in the interactive device based on his EEG signal sensed by the EEG sensor.
- the interactive device will then enter a wake-up state wherein the speech from the driver will be recorded and recognized.
- when the driver intends to stop talking, or turns to talk with the passengers, his intention can also be sensed and determined, and the interactive device will enter a waiting state wherein the speech from the driver will be ignored.
- the present invention may be implemented by software with necessary hardware, or by hardware, firmware and the like. Based on such understanding, the embodiments of the present invention may be embodied in part in a software form.
- the computer software may be stored in a readable storage medium such as a floppy disk, a hard disk, an optical disk or a flash memory of the computer.
- the computer software comprises a series of instructions to make the computer (e.g., a personal computer, a service station or a network terminal) execute the method or a part thereof according to respective embodiments of the present invention.
Abstract
The present disclosure relates in general to interaction between a user and an interactive device. An interactive device (200) for interaction with a user is provided, which comprises: a first receiver (201), receiving an Electroencephalograph (EEG) signal of a user; a second receiver (202), receiving communication information from the user; a determination device (203), determining, based on the received EEG signal, whether the user is intending to interact with the interactive device; and a processing device (204), processing the received communication information to enable the interaction between the interactive device and the user, in response to the determination device determining that the user is intending to interact with the interactive device.
Description
The present disclosure relates in general to the field of interaction between a user and an interactive device, and in more particular, to an apparatus, system and method for interaction between a user and an interactive device and an associated interactive device.
With HMI (Human Machine Interaction) technology, users can interact with an interactive device via speech, gesture, or countenance, etc. An interactive device can receive communication information from a user, such as speech, gesture, or countenance; process the received information by analyzing and recognizing, etc; and create a response accordingly. Under most cases, the interactions are initialed by users. Usually, users need to wake up the interactive device by pressing a button or providing wake-up information, such as speaking a wake-up word or performing a wake-up gesture. Then the device will be waked-up and enter a “wake-up” state wherein the user’s speech, gesture, or countenance will be processed to enable the interaction with the user.
SUMMARY OF THE INVENTION
An aspect of the present disclosure is to provide an interactive device for interaction with a user, comprising: a first receiver, receiving an Electroencephalograph (EEG) signal of a user; a second receiver, receiving a communication information from the user; a determination device, determining, based on the received EEG signal, whether the user is intending to interact with the interactive device; a processing device, processing the received communication information to enable the interaction between the interactive device and the user, in response that the determination device determining that the user is intending to interact with the interactive device.
In an example, the processing device can ignore the received communication
information in response that the determination device determining that the user is not intending to interact with the interactive device.
In an example, the determination device can perform the determining based on a classifier which is trained by reference EEG data.
In an example, the processing performed by the processing device can comprise at least one of: recording the communication information; analyzing the communication information; recognizing the communication information; generating a response for the user based on the communication information.
In an example, the communication information can include a speech of the user.
In an example, the communication information can include a gesture of the user.
An further aspect of the present disclosure is to provide a system of for interaction with a user, comprising: the interactive device of any one of claims 1 to 6, and an EEG sensor, sensing an EEG signal of a user and providing the sensed EEG signal to the interactive device.
In an example, the interactive device and the EEG sensor are embedded in a vehicle.
In an example, the EEG sensor is embedded in at least one of a headrest, a seat, and a steering wheel of the vehicle.
In an example, the EEG sensor is embedded in a wearable device for the user.
A further aspect of the present disclosure is to provide an apparatus for interaction between an interactive device and a user, comprising: a receiver, receiving an Electroencephalograph (EEG) signal of a user; a determination device, determining, based on the EEG signal, whether the user is intending to interact with the interactive device; and a state switcher, switching the interactive device into a wake-up state in response to a determination that the user is intending to interact with the interactive device, wherein the wake-up state enables the interaction between the interactive device and the user.
In an example, the state switcher can switch the interactive device into a waiting state in response to a determination that the user is not intending to interact with the interactive device, wherein the waiting state disables the interaction between the interactive device and the user.
In an example, the determination device can perform the determining based on a classifier which is trained by reference EEG data.
A further aspect of the present disclosure is to provide a method for interaction between an interactive device and a user, comprising: receiving an Electroencephalograph (EEG) signal of a user; receiving communication information from the user; determining, based on the received EEG signal, whether the user is intending to interact with the interactive device; and processing the received communication information to enable the interaction between the interactive device and the user, in response to determining that the user is intending to interact with the interactive device.
In an example, the method can further comprise ignoring the received communication information in response to determining that the user is not intending to interact with the interactive device.
In an example, the determining can be performed based on a classifier which is trained by reference EEG data.
In an example, the processing can comprise at least one of: recording the communication information; analyzing the communication information; recognizing the communication information; generating a response for the user based on the communication information.
In an example, the communication information can include speech of the user.
In an example, the communication information can include a gesture of the user.
A further aspect of the present disclosure is to provide a method for interaction between an interactive device and a user, comprising: receiving an Electroencephalograph (EEG) signal of a user; determining, based on the EEG signal, whether the user is intending to interact with the interactive device; and switching the interactive device into a wake-up state in response to determining that the user is intending to interact with the interactive device, wherein the wake-up state enables the interaction between the interactive device and the user.
In an example, the method can further comprise switching the interactive device into a waiting state in response to determining that the user is not intending to interact with the interactive device, wherein the waiting state disables the interaction between the interactive device and the user.
In an example, the determining can be performed based on a classifier which is trained by reference EEG data.
Further scope of applicability of the present invention will become apparent from the detailed description given hereinafter. However, it should be understood that the detailed description and specific examples, while indicating preferred embodiments of the invention, are given by way of illustration only, since various changes and modifications within the spirit and scope of the invention will become apparent to those skilled in the art from the following detailed description.
The above and other aspects and advantages of the present invention will become apparent from the following detailed description of exemplary embodiments taken in conjunction with the accompanying drawings which illustrate, by way of example, the principles of the invention.
Fig. 1 shows a block diagram of an apparatus for interaction between an interactive device and a user according to one embodiment of this disclosure.
Fig. 2 shows a block diagram of an interactive device for interaction with a user according to one embodiment of this disclosure.
Fig. 3 shows a block diagram of a system for interaction with a user according to one embodiment of this disclosure.
Fig. 4 shows a flow chart of a method for interaction between an interactive device and a user according to an embodiment of this disclosure.
Fig. 5 shows a flow chart of a method for interaction between an interactive device and a user according to an embodiment of this disclosure.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
In the following detailed description, numerous specific details are set forth to provide a thorough understanding of the described exemplary embodiments. It will be apparent, however, to one skilled in the art that the described embodiments can be practiced without some or all of these specific details. In other exemplary embodiments, well known structures or process steps have not been described in detail in order to avoid unnecessarily obscuring the concept of the present invention.
A general idea of the present invention is to actuate and end an interaction between an interactive device and a user automatically, according to the intention of the user, without the need to press a button or provide wake-up information via, for example, speech, gesture, or countenance.
Embodiment 1
Fig. 1 shows a block diagram of an apparatus 100 for interaction between an interactive device and a user according to one embodiment of this disclosure. The apparatus can comprise a receiver 101 for receiving an Electroencephalograph (EEG) signal of a user; a determination device 102 for determining, based on the EEG signal, whether the user is intending to interact with the interactive device; and a state switcher 103 for switching the interactive device into a wake-up state in response to a determination that the user is intending to interact with the interactive device, wherein the wake-up state enables the interaction between the interactive device and the user.
As known to those skilled in the art, an EEG, or brain signal, is a recording of electrical activity along the human body, for example along its surface, such as the scalp or skin. The type of neural oscillations of a user can be observed in EEG signals, so that the intention of a user can be determined based on his/her EEG signals.
In this and other embodiments of the disclosure, the EEG signal can be used to determine the intention of the user, so as to enable (actuate) or disable (end) the interaction between the user and the interactive device automatically when the user has the corresponding intention. The user thus does not have to press a button or provide a wake-up speech, gesture, or countenance to indicate his/her intention to the interactive device.
In an example of the embodiment, the state switcher 103 can switch the interactive device into a waiting state in response to a determination that the user is not intending to interact with the interactive device, wherein the waiting state disables the interaction between the interactive device and the user.
Conventionally, the interactive device will process (e.g. record or analyze) all the communication information (e.g. speech, gesture, or countenance) from the user after being actuated; therefore, if the user turns to communicate (e.g. talk) with other people instead of the interactive device, the device usually falls into a chaotic state. This example of the embodiment can prevent this problem by disabling the interaction between the interactive device and the user when the user is not intending to interact.
In an example of the embodiment, the determination device 102 can perform the determining based on a classifier which is trained by reference EEG data. For example, EEG data recorded from people intending and not intending to interact with the interactive device can be collected as the reference EEG data and then used to train a classifier. The trained classifier can then determine the intention of the user based on newly arriving EEG signals from the user. The classifier can be built and trained using any known technology in the art.
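As a concrete, non-normative illustration of the train-then-classify flow described above, the sketch below trains a minimal nearest-centroid classifier on labelled reference EEG feature vectors and classifies a new signal. All names are hypothetical; a practical system would first extract features (e.g. band powers) from the raw EEG and could use any established classifier.

```python
def train_centroids(reference_data):
    """Train on reference EEG data, given as {label: [feature_vector, ...]}
    collected while users were / were not intending to interact.
    Returns one mean feature vector (centroid) per label."""
    centroids = {}
    for label, vectors in reference_data.items():
        n = len(vectors)
        centroids[label] = [sum(v[i] for v in vectors) / n
                            for i in range(len(vectors[0]))]
    return centroids


def classify(centroids, features):
    """Assign an incoming EEG feature vector to the nearest centroid."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: sq_dist(centroids[label], features))
```

In use, the determination device would call `classify` on each new window of EEG features and report whether the winning label corresponds to an intention to interact.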
In an example of the embodiment, the state switcher 103 can switch the state of the interactive device by generating and sending a “wake-up” command or a “waiting” command to the interactive device according to the determination of the user’s intention. The interactive device can then enter the “wake-up” state to enable the interaction by processing the communication information from the user, or enter the “waiting” state to disable the interaction by ignoring the communication information.
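The command-based switching can be sketched as follows; the class and method names are illustrative assumptions, not taken from the disclosure:

```python
class InteractiveDevice:
    """Minimal device model: processes input only in the wake-up state."""

    def __init__(self):
        self.state = "waiting"

    def receive_command(self, command):
        # Enter the "wake-up" or "waiting" state on command.
        self.state = command

    def handle(self, communication_information):
        # Process speech/gesture in the wake-up state; otherwise ignore it.
        if self.state == "wake-up":
            return "processed: " + communication_information
        return None


class StateSwitcher:
    """Maps the intent determination to a command for the device."""

    def __init__(self, device):
        self.device = device

    def update(self, user_intends_to_interact):
        command = "wake-up" if user_intends_to_interact else "waiting"
        self.device.receive_command(command)
        return command
```

The switcher issues only the two commands named in the text; everything else (how intent is determined, what "processing" means) stays behind the `handle` stub.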
In an example, the state switcher 103 can also trigger an indicator, embedded in or separate from the interactive device, to indicate the state of the interactive device to the user. The user may begin or stop his/her interaction with the interactive device according to the indication.
The interactive device in this and other embodiments of this disclosure can employ any type of technology to interact with humans via speech, gesture, countenance, etc. The interactive device can be embedded in any entity that needs such interaction, such as a vehicle, a computer, or a household electrical appliance (such as a TV set or a fridge).
In examples of the embodiment, the apparatus 100 in Fig. 1 can be separate from the interactive device, integrated with the interactive device, or integrated with an EEG sensor that senses the EEG signal of the user.
Embodiment 2
Fig. 2 shows a block diagram of an interactive device 200 for interaction with a user according to one embodiment of this disclosure. The device 200 can comprise a first receiver 201 for receiving an Electroencephalograph (EEG) signal of a user, and a second receiver 202 for receiving communication information from the user, wherein the communication information can be in any form usable for communication between the user and other humans or machines, including but not limited to speech, gesture, countenance, etc. The interactive device 200 can further comprise a determination device 203, which determines, based on the received EEG signal, whether the user is intending to interact with the interactive device 200. The determination device 203 can work similarly to the determination device 102 in Embodiment 1. The interactive device 200 can further comprise a processing device 204, which processes the received communication information to enable the interaction between the interactive device 200 and the user, in response to the determination device 203 determining that the user is intending to interact with the interactive device 200.
This embodiment uses the EEG signal to determine the intention of the user, and actuates the processing device 204 of the interactive device 200 to process the received information from the user, enabling the interaction automatically when the user has such an intention. The user thus does not have to press a button or provide a wake-up speech, gesture, or countenance to indicate his/her intention to the interactive device.
In an example of the embodiment, the processing performed by the processing device 204 can comprise any processing that can enable the interaction between the user and the interactive device 200, including but not limited to at least one of: recording the communication information; analyzing the communication information; recognizing the communication information; and generating a response for the user based on the communication information.
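A minimal sketch of such a processing device, with hypothetical names, might chain the listed steps and process the input only when the determination device has reported an intention to interact:

```python
def process(communication_information, user_intends_to_interact):
    """Illustrative processing device (Embodiment 2): record, recognize,
    and generate a response when the user intends to interact;
    return None (ignore the input) otherwise."""
    if not user_intends_to_interact:
        return None  # input addressed to passengers or other people
    recorded = communication_information       # recording
    recognized = recorded.strip().lower()      # stand-in for recognition
    response = "ack: " + recognized            # generating a response
    return {"recorded": recorded,
            "recognized": recognized,
            "response": response}
```

The string operations merely stand in for real recording/recognition components; only the intent-gated control flow is the point of the sketch.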
In an example of the embodiment, the processing device 204 can ignore the received communication information in response to the determination device 203 determining that the user is not intending to interact with the interactive device. Thereby, the interactive device 200 will not process the communication information received when the user turns to communicate with others instead of the interactive device, and a possible chaotic state can be prevented.
In an example, the interactive device 200 can further comprise an indicator indicating the state of the interactive device 200 to the user by light, sound, and the like, according to the determination result of the determination device 203. The indicator may thus prompt the user to begin or stop the interaction with the interactive device 200.
Embodiment 3
Fig. 3 shows a block diagram of a system 300 for interaction with a user according to one embodiment of this disclosure. The system can comprise an interactive device 301 similar to the interactive device 200 of the embodiment in Fig. 2, and an EEG sensor 302 for sensing an EEG signal of a user and providing the sensed EEG signal to the interactive device 301.
The interactive device 301 and the EEG sensor 302 can be separate or integrated, and they can be embedded in any entity that needs to interact with users, such as a vehicle, a computer, or a household electrical appliance. In a vehicle application, the EEG sensor 302 can be embedded in parts of the vehicle (such as a headrest, a seat, or a steering wheel). In an example, the EEG sensor can be embedded in a wearable device for the user.
Embodiment 4
Fig. 4 shows a flow chart of a method for interaction between an interactive device and a user according to an embodiment of this disclosure. The method can comprise: 401, receiving an Electroencephalograph (EEG) signal of a user; 402, receiving communication information from the user; 403, determining, based on the received EEG signal, whether the user is intending to interact with the interactive device; and 404, processing the received communication information to enable the interaction between the interactive device and the user, in response to determining that the user is intending to interact with the interactive device.
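One pass of steps 401–404 can be sketched as a single function taking the two received inputs plus caller-supplied determination and processing routines; all names here are illustrative assumptions:

```python
def interaction_step(eeg_signal, communication_information,
                     determine_intent, process):
    """One iteration of the Fig. 4 method.
    401/402: eeg_signal and communication_information arrive as inputs.
    403: determine_intent decides, from the EEG signal, whether the
         user intends to interact (e.g. a trained classifier).
    404: process handles the communication information only if so."""
    if determine_intent(eeg_signal):       # step 403
        return process(communication_information)  # step 404
    return None                            # otherwise the input is ignored
```

`determine_intent` and `process` are placeholders for the classifier-based determination and whatever processing (recording, recognition, response generation) the device implements.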
In an example, the method can further comprise ignoring the received communication information in response to determining that the user is not intending to interact with the interactive device.
As in other embodiments, the determining in step 403 can be performed based on, for example, a classifier which is trained by reference EEG data.
In an example, the processing in step 404 can comprise at least one of: recording the communication information; analyzing the communication information; recognizing the communication information; generating a response for the user based on the communication information.
In an example, the method can further comprise indicating the state of the interactive device to the user according to the determination result in step 403.
Embodiment 5
Fig. 5 shows a flow chart of a method for interaction between an interactive device and a user according to an embodiment of this disclosure. The method can comprise: 501, receiving an Electroencephalograph (EEG) signal of a user; 502, determining, based on the EEG signal, whether the user is intending to interact with the interactive device; and 503, switching the interactive device into a wake-up state in response to a determination that the user is intending to interact with the interactive device, wherein the wake-up state enables the interaction between the interactive device and the user.
In an example, the method can further comprise switching the interactive device into a waiting state in response to a determination that the user is not intending to interact with the interactive device, wherein the waiting state disables the interaction between the interactive device and the user.
As in other embodiments, the determining in step 502 can be performed based on, for example, a classifier which is trained by reference EEG data.
In an example of the embodiment, switching the state of the interactive device can be performed by generating and sending a “wake-up” command or a “waiting” command to the interactive device according to the determination result in step 502. The interactive device can then enter the “wake-up” state to enable the interaction by processing the communication information from the user, or enter the “waiting” state to disable the interaction by ignoring the communication information.
In an example, the method can further comprise indicating the state of the interactive device to the user according to the determination result in step 502.
Application Example
To facilitate understanding of this disclosure, an application example of voice interaction between a user and a vehicle is provided hereafter. It should be understood that the application scenario of the embodiments of this disclosure is not limited to vehicles, but can be any other scenario providing interaction between users and machines; the details in the application example serve only for ease of description and understanding, and in no way limit the disclosure thereto.
In an exemplary application scenario, an EEG sensor according to the embodiments of the disclosure can be embedded in a wearable device worn around the head of the driver. When the driver intends to talk to the interactive device (for example, the HMI system) of the vehicle, his intention can be determined by a determination device in the interactive device based on his EEG signal sensed by the EEG sensor. The interactive device will then enter a wake-up state, in which the speech from the driver will be recorded and recognized. When the driver intends to stop the conversation, or turns to talk with the passengers, his intention can also be sensed and determined, and the interactive device will enter a waiting state, in which the speech from the driver will be ignored. An indicator can indicate the state of the interactive device to the driver and guide the driver to begin or stop talking.
Since the interaction between the user and the vehicle can be actuated and ended automatically according to the user’s intention, the user does not have to press a button or provide a wake-up word to indicate his/her intention to the vehicle, which can improve driving safety. Moreover, the chaotic state described above can be prevented.
Those skilled in the art will clearly understand from the above embodiments that the present invention may be implemented by software with the necessary hardware, or by hardware, firmware, and the like. Based on such an understanding, the embodiments of the present invention may be embodied in part in software form. The computer software may be stored in a readable storage medium such as a floppy disk, a hard disk, an optical disk, or a flash memory of the computer. The computer software comprises a series of instructions that make a computer (e.g., a personal computer, a service station, or a network terminal) execute the method, or a part thereof, according to the respective embodiments of the present invention.
The invention being thus described, it will be obvious that the same may be varied in many ways. Such variations are not to be regarded as a departure from the spirit and scope of the invention, and all such modifications as would be obvious to those skilled in the art are intended to be included within the scope of the following claims.
Claims (22)
- An interactive device for interaction with a user, comprising:
a first receiver, receiving an Electroencephalograph (EEG) signal of a user;
a second receiver, receiving communication information from the user;
a determination device, determining, based on the received EEG signal, whether the user is intending to interact with the interactive device; and
a processing device, processing the received communication information to enable the interaction between the interactive device and the user, in response to the determination device determining that the user is intending to interact with the interactive device.
- The device of claim 1, wherein the processing device ignores the received communication information in response to the determination device determining that the user is not intending to interact with the interactive device.
- The device of claim 1, wherein the determination device performs the determining based on a classifier which is trained by reference EEG data.
- The device of claim 1, wherein the processing performed by the processing device comprises at least one of:
recording the communication information;
analyzing the communication information;
recognizing the communication information; and
generating a response for the user based on the communication information.
- The device of any one of claims 1 to 4, wherein the communication information includes speech of the user.
- The device of any one of claims 1 to 4, wherein the communication information includes a gesture of the user.
- A system for interaction with a user, comprising:
the interactive device of any one of claims 1 to 6; and
an EEG sensor, sensing an EEG signal of a user and providing the sensed EEG signal to the interactive device.
- The system of claim 7, wherein the interactive device and the EEG sensor are embedded in a vehicle.
- The system of claim 7, wherein the EEG sensor is embedded in at least one of a headrest, a seat, and a steering wheel of the vehicle.
- The system of claim 7, wherein the EEG sensor is embedded in a wearable device for the user.
- An apparatus for interaction between an interactive device and a user, comprising:
a receiver, receiving an Electroencephalograph (EEG) signal of a user;
a determination device, determining, based on the EEG signal, whether the user is intending to interact with the interactive device; and
a state switcher, switching the interactive device into a wake-up state in response to a determination that the user is intending to interact with the interactive device,
wherein the wake-up state enables the interaction between the interactive device and the user.
- The apparatus of claim 11, wherein the state switcher switches the interactive device into a waiting state in response to a determination that the user is not intending to interact with the interactive device, wherein the waiting state disables the interaction between the interactive device and the user.
- The apparatus of claim 11, wherein the determination device performs the determining based on a classifier which is trained by reference EEG data.
- A method for interaction between an interactive device and a user, comprising:
receiving an Electroencephalograph (EEG) signal of a user;
receiving communication information from the user;
determining, based on the received EEG signal, whether the user is intending to interact with the interactive device; and
processing the received communication information to enable the interaction between the interactive device and the user, in response to determining that the user is intending to interact with the interactive device.
- The method of claim 14, further comprising:
ignoring the received communication information in response to determining that the user is not intending to interact with the interactive device.
- The method of claim 14, wherein the determining is performed based on a classifier which is trained by reference EEG data.
- The method of claim 14, wherein the processing comprises at least one of:
recording the communication information;
analyzing the communication information;
recognizing the communication information; and
generating a response for the user based on the communication information.
- The method of any one of claims 14 to 17, wherein the communication information includes speech of the user.
- The method of any one of claims 14 to 17, wherein the communication information includes a gesture of the user.
- A method for interaction between an interactive device and a user, comprising:
receiving an Electroencephalograph (EEG) signal of a user;
determining, based on the EEG signal, whether the user is intending to interact with the interactive device; and
switching the interactive device into a wake-up state in response to determining that the user is intending to interact with the interactive device,
wherein the wake-up state enables the interaction between the interactive device and the user.
- The method of claim 20, further comprising:
switching the interactive device into a waiting state in response to determining that the user is not intending to interact with the interactive device, wherein the waiting state disables the interaction between the interactive device and the user.
- The method of claim 20, wherein the determining is performed based on a classifier which is trained by reference EEG data.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/CN2015/074357 WO2016145607A1 (en) | 2015-03-17 | 2015-03-17 | Interaction between user and interactive device |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2016145607A1 true WO2016145607A1 (en) | 2016-09-22 |
Family
ID=56919871
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2015/074357 Ceased WO2016145607A1 (en) | 2015-03-17 | 2015-03-17 | Interaction between user and interactive device |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2016145607A1 (en) |
Citations (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6001065A (en) * | 1995-08-02 | 1999-12-14 | Ibva Technologies, Inc. | Method and apparatus for measuring and analyzing physiological signals for active or passive control of physical and virtual spaces and the contents therein |
| US6587713B1 (en) * | 2001-12-14 | 2003-07-01 | Bertha Freeman | Brainwave responsive wheelchair |
| CN101923392A (en) * | 2010-09-02 | 2010-12-22 | 上海交通大学 | Asynchronous brain-computer interaction control method for EEG signals |
| CN102156541A (en) * | 2010-05-13 | 2011-08-17 | 天津大学 | Prefrontal electroencephalogram information and blood oxygen information fused human-computer interaction method |
| CN102371901A (en) * | 2010-08-20 | 2012-03-14 | 李兴邦 | Controller for safe driving |
| CN202533867U (en) * | 2012-04-17 | 2012-11-14 | 北京七鑫易维信息技术有限公司 | Head mounted eye-control display terminal |
| US20130096453A1 (en) * | 2011-10-12 | 2013-04-18 | Seoul National University R&Db Foundation | Brain-computer interface devices and methods for precise control |
| CN103970260A (en) * | 2013-01-31 | 2014-08-06 | 华为技术有限公司 | Non-contact gesture control method and electronic terminal equipment |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15884990; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 15884990; Country of ref document: EP; Kind code of ref document: A1 |