Abstract
In the era of big data, with the number of e-commerce and social media users growing worldwide, the demand for automated sentiment analysis systems is rising rapidly. Because of its industrial relevance, aspect-based sentiment analysis (ABSA), which targets sentiment at the aspect level, has become a popular research topic. ABSA comprises two subtasks: aspect-term sentiment analysis (ATSA) and aspect-category sentiment analysis (ACSA). This paper proposes a multi-task learning framework that uses a pre-trained BERT model as a shared representation layer to jointly learn the ATSA and ACSA tasks. To fully exploit the contextual information surrounding the aspects, we add a multi-head self-attention layer with a skip connection on top of the shared BERT model. Experimental results on SemEval datasets show that our multi-task model improves performance on the ATSA task and outperforms a baseline multi-task network as well as single-task models.
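The key architectural addition described above is a multi-head self-attention layer with a skip (residual) connection placed over the shared encoder's token representations. The following is a minimal NumPy sketch of that component only, not the authors' implementation: the weight matrices, head count, and dimensions are illustrative assumptions, and the shared BERT encoder is stood in for by an arbitrary input matrix `X` of token vectors.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(X, Wq, Wk, Wv, Wo, n_heads):
    """Multi-head self-attention with a skip connection.

    X: (seq_len, d_model) token representations (e.g. from a shared
       BERT encoder -- here just an arbitrary matrix for illustration).
    Wq, Wk, Wv, Wo: (d_model, d_model) projection weights (assumed).
    Returns: (seq_len, d_model), i.e. X + MultiHead(X).
    """
    seq_len, d_model = X.shape
    d_head = d_model // n_heads

    # Project inputs to queries, keys, and values.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv

    # Split the model dimension into heads: (n_heads, seq_len, d_head).
    def split_heads(M):
        return M.reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)

    Qh, Kh, Vh = split_heads(Q), split_heads(K), split_heads(V)

    # Scaled dot-product attention per head.
    scores = Qh @ Kh.transpose(0, 2, 1) / np.sqrt(d_head)
    heads = softmax(scores, axis=-1) @ Vh  # (n_heads, seq_len, d_head)

    # Concatenate heads back to (seq_len, d_model) and project.
    concat = heads.transpose(1, 0, 2).reshape(seq_len, d_model)

    # Skip connection: add the input back to the attention output.
    return X + concat @ Wo

# Toy usage with assumed sizes: 4 tokens, d_model=8, 2 heads.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 8))
Wq, Wk, Wv, Wo = (rng.standard_normal((8, 8)) * 0.1 for _ in range(4))
H = multi_head_self_attention(X, Wq, Wk, Wv, Wo, n_heads=2)
```

In the full multi-task setup, `H` would feed two separate classification heads (one for ATSA, one for ACSA) trained jointly over the shared representation; the skip connection ensures the original contextual embeddings are preserved alongside the attention-refined ones.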