McColl et al., 2016 - Google Patents
A survey of autonomous human affect detection methods for social robots engaged in natural HRI (McColl et al., 2016)
- Document ID
- 1335043163999914600
- Author
- McColl D
- Hong A
- Hatakeyama N
- Nejat G
- Benhabib B
- Publication year
- 2016
- Publication venue
- Journal of Intelligent & Robotic Systems
Snippet
In Human-Robot Interactions (HRI), robots should be socially intelligent. They should be able to respond appropriately to human affective and social cues in order to effectively engage in bi-directional communications. Social intelligence would allow a robot …
Concepts (machine-extracted)
- Homo sapiens: title, abstract, description (count: 86)
Classifications
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06F—ELECTRICAL DIGITAL DATA PROCESSING
      - G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
        - G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
          - G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
            - G06F3/015—Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
          - G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
          - G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
          - G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
    - G06N—COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
      - G06N3/00—Computer systems based on biological models
        - G06N3/004—Artificial life, i.e. computers simulating life
          - G06N3/008—Artificial life, i.e. computers simulating life based on physical entities controlled by simulated intelligence so as to replicate intelligent life forms, e.g. robots replicating pets or humans in their appearance or behavior
Similar Documents
| Publication | Title |
|---|---|
| McColl et al. | A survey of autonomous human affect detection methods for social robots engaged in natural HRI |
| Su et al. | Recent advancements in multimodal human–robot interaction |
| US11762467B2 (en) | Nonverbal multi-input and feedback devices for user intended computer control and communication of text, graphics and audio |
| McDuff et al. | Designing emotionally sentient agents |
| Rossi et al. | User profiling and behavioral adaptation for HRI: A survey |
| CN106956271B (en) | Predict the method and robot of affective state |
| KR20160072621A (en) | Artificial intelligence robot service system |
| Costantini et al. | Multi-agent system engineering for emphatic human-robot interaction |
| Hirth et al. | Towards social robots: Designing an emotion-based architecture |
| Kopp et al. | The fabric of socially interactive agents: Multimodal interaction architectures |
| Schnitzer et al. | Prototyping a zoomorphic interactive robot companion with emotion recognition and affective voice interaction for elderly people |
| Naeem et al. | Voice controlled humanoid robot |
| Yang et al. | A survey on media interaction in social robotics |
| US20220009082A1 (en) | Method for controlling a plurality of robot effectors |
| Volkova et al. | Crowdsourcing-Based Approbation of Communicative Behaviour Elements on the F-2 Robot: Perception Peculiarities According to Respondents |
| Huang et al. | Development of a Sign Language Dialogue System for a Healing Dialogue Robot |
| Samsonovich et al. | On the possibility of regulation of human emotions via multimodal social interaction with an embodied agent controlled by ebica-based emotional interaction model |
| Naeem et al. | An AI based voice controlled humanoid robot |
| Kalharoodi et al. | Social robots: an open-source architecture for personal assistant robots |
| Rehm | Nonsymbolic gestural interaction for ambient intelligence |
| Volkova et al. | New communicative strategies for the affective robot: F-2 going tactile and complimenting |
| Kubota | Enabling Longitudinal Personalized Behavior Adaptation for Cognitively Assistive Robots |
| Rao et al. | MyChum: An Emotionally Intelligent Robot for Elder Care |
| dos Reis Alves et al. | Intelligent control architecture for assistive mobile robots |
| Yonezawa et al. | Haptic interaction design for physical contact between a wearable robot and the user |