Towards a Unified End-to-End Approach for Fully Unsupervised Cross-Lingual Sentiment Analysis

Yanlin Feng, Xiaojun Wan


Abstract
Sentiment analysis in low-resource languages suffers from the lack of training data. Cross-lingual sentiment analysis (CLSA) aims to improve the performance on these languages by leveraging annotated data from other languages. Recent studies have shown that CLSA can be performed in a fully unsupervised manner, without exploiting either target-language supervision or cross-lingual supervision. However, these methods rely heavily on unsupervised cross-lingual word embeddings (CLWE), which have been shown to have serious drawbacks on distant language pairs (e.g. English-Japanese). In this paper, we propose an end-to-end CLSA model that leverages unlabeled data in multiple languages and multiple domains and eliminates the need for unsupervised CLWE. Our model applies to two CLSA settings: the traditional cross-lingual in-domain setting and the more challenging cross-lingual cross-domain setting. We empirically evaluate our approach on the multilingual multi-domain Amazon review dataset. Experimental results show that our model outperforms the baselines by a large margin despite its minimal resource requirement.
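The CLWE-based baselines the abstract refers to typically align two monolingual embedding spaces with an orthogonal mapping, then transfer a classifier across languages. A minimal sketch of the orthogonal (Procrustes) mapping step is shown below, using toy synthetic embeddings rather than real word vectors; this illustrates only the alignment component that unsupervised CLWE methods refine iteratively, not the end-to-end model proposed in this paper.

```python
import numpy as np

def procrustes_map(src, tgt):
    """Learn an orthogonal matrix W minimizing ||src @ W - tgt||_F.

    src, tgt: (n, d) arrays of embeddings for a seed dictionary of
    n translation pairs. Closed-form solution via SVD.
    """
    u, _, vt = np.linalg.svd(src.T @ tgt)
    return u @ vt

# Toy data: "target-language" embeddings are a rotation of the
# "source-language" ones, standing in for two aligned vocabularies.
rng = np.random.default_rng(0)
src = rng.normal(size=(50, 8))
q, _ = np.linalg.qr(rng.normal(size=(8, 8)))  # random orthogonal rotation
tgt = src @ q

w = procrustes_map(src, tgt)
alignment_error = np.linalg.norm(src @ w - tgt)
print(alignment_error)  # near zero: the rotation is recovered
```

In practice the seed dictionary itself must be induced without supervision (e.g. adversarially), and it is exactly this induction step that degrades on distant language pairs, motivating the CLWE-free approach of the paper.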
Anthology ID:
K19-1097
Volume:
Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL)
Month:
November
Year:
2019
Address:
Hong Kong, China
Venue:
CoNLL
SIG:
SIGNLL
Publisher:
Association for Computational Linguistics
Pages:
1035–1044
URL:
https://aclanthology.org/K19-1097
DOI:
10.18653/v1/K19-1097
Cite (ACL):
Yanlin Feng and Xiaojun Wan. 2019. Towards a Unified End-to-End Approach for Fully Unsupervised Cross-Lingual Sentiment Analysis. In Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL), pages 1035–1044, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal):
Towards a Unified End-to-End Approach for Fully Unsupervised Cross-Lingual Sentiment Analysis (Feng & Wan, CoNLL 2019)
PDF:
https://aclanthology.org/K19-1097.pdf