2020 15th IEEE International Conference on Automatic Face and Gesture Recognition (FG 2020)

Abstract

We present a real-time algorithm for emotion-aware navigation of a socially-assistive robot among pedestrians. Our approach estimates time-varying emotional dynamics and behaviors of people from multiple modalities (i.e., their faces and trajectories) using a combination of deep learning and affective features from the PAD (Pleasure-Arousal-Dominance) model from psychology. These PAD characteristics are used to predict pedestrian movement with proxemic constraints. We use a multi-channel model to classify pedestrian characteristics into four emotion categories (happy, sad, angry, neutral). In our validation results, we observe an emotion detection accuracy of 85.33%. We formulate emotion-based proxemic constraints to perform socially-aware robot navigation in low- to medium-density environments. We demonstrate the benefits of our algorithm in simulated environments with tens of pedestrians as well as in a real-world setting with Pepper, a social humanoid robot.
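To make the pipeline concrete, the following is a minimal Python sketch of how continuous PAD values might be mapped to the four emotion categories and then to emotion-scaled proxemic distances. The centroids, comfort radii, and function names here are illustrative assumptions for exposition, not the paper's learned parameters or actual implementation.

```python
import math

# Illustrative PAD-space centroids for the four emotion categories.
# ASSUMPTION: these coordinates are placeholders, not the paper's values.
EMOTION_CENTROIDS = {
    "happy":   (0.8, 0.5, 0.4),
    "sad":     (-0.6, -0.4, -0.3),
    "angry":   (-0.5, 0.7, 0.3),
    "neutral": (0.0, 0.0, 0.0),
}

# Hypothetical per-emotion comfort radii in meters, encoding the idea that
# proxemic constraints tighten or relax with the pedestrian's emotional state.
COMFORT_RADIUS = {
    "happy": 0.9, "neutral": 1.2, "sad": 1.5, "angry": 2.0,
}

def classify_pad(p: float, a: float, d: float) -> str:
    """Assign the nearest emotion centroid in PAD space (Euclidean distance)."""
    return min(
        EMOTION_CENTROIDS,
        key=lambda emotion: math.dist((p, a, d), EMOTION_CENTROIDS[emotion]),
    )

def comfort_distance(p: float, a: float, d: float) -> float:
    """Proxemic distance the robot should keep from this pedestrian."""
    return COMFORT_RADIUS[classify_pad(p, a, d)]

if __name__ == "__main__":
    # High pleasure with moderate arousal classifies as "happy", so the
    # planner may approach closer than it would for an angry pedestrian.
    print(classify_pad(0.7, 0.4, 0.3), comfort_distance(0.7, 0.4, 0.3))
```

In a planner, the resulting comfort distance would act as a per-pedestrian clearance constraint during local trajectory optimization; the nearest-centroid classifier stands in for the paper's multi-channel deep-learning model.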
