Abstract
This paper proposes a method for generating affective facial expressions from EEG to improve the quality of communication between patients with severe motor dysfunction, such as amyotrophic lateral sclerosis (ALS), and the people around them. Unlike existing studies that estimate simple emotional states from EEG, the proposed method decodes not only the variety of emotions said to appear in facial expressions, enabling more accurate communication of emotional states, but also the user's motor intention to produce a facial expression, preventing unintended emotional communication. Experiments using a consumer-level EEG device showed that 15 emotional states and facial expression generation intentions, expressed by eight action units, could be decoded from EEG signals with 67.4% and 64.6% accuracy, respectively, both significantly above chance level. A validation experiment with an ALS patient further showed that emotional states and facial expression generation intentions may be decodable from EEG at a significant level.