Abstract
Emotion is a primary semantic component of human communication. This study focuses on automatic emotion detection in descriptive sentences and on how the detected emotion can be used to tune facial expression parameters for virtual character generation. To this end, we present a classification-based sentiment analysis approach that maps a sentiment-bearing sentence to an emotional state. Each sentence is represented as a feature vector and classified using support vector machines. To handle the high dimensionality of textual data and to capture the semantic relations between words, we introduce a distributed representation model. The results indicate that our sentiment analysis method can efficiently assist automatic facial expression generation in human-robot interactive communication.