Bert Adapter and Contrastive Learning for Continual Classification of Aspect Sentiment Task Sequences

Pham Thi Quynh Trang, Ngo Ngoc Huyen, Phan Dinh Dan Truong, Dang Thanh Hai
Authors

  • Pham Thi Quynh Trang Vietnam National University, University of Engineering and Technology, 144 Xuan Thuy Street, Cau Giay district, Ha Noi, Viet Nam https://orcid.org/0009-0007-4151-6467
  • Ngo Ngoc Huyen
  • Phan Dinh Dan Truong
  • Dang Thanh Hai Vietnam National University, University of Engineering and Technology, 144 Xuan Thuy Street, Cau Giay district, Ha Noi, Viet Nam

DOI:

https://doi.org/10.15625/2525-2518/17395

Keywords:

continual learning, catastrophic forgetting, knowledge transfer, contrastive learning, aspect sentiment classification

Abstract

Task incremental learning, a setting of continual learning, exploits the knowledge gained from previous tasks when learning new ones. It aims to address two major challenges of continual learning: catastrophic forgetting and knowledge transfer (or sharing) between previous tasks and the current task. This paper improves task incremental learning by (1) transferring the knowledge (not the training data) learned from previous tasks to a newly arriving task (in contrast to multi-task learning); (2) maintaining, or even improving, the performance of models learned for previous tasks, thereby avoiding forgetting; and (3) developing a continual learning model based on (1) and (2) for aspect sentiment classification. Specifically, we combine two loss functions based on two contrastive learning modules: the Contrastive Knowledge Sharing (CKS) module, which encourages knowledge sharing between old and current tasks, and the Contrastive Supervised learning (CSC) module, which improves the performance of the current task. Experimental results show that our method helps previously learned tasks avoid catastrophic forgetting and outperforms previous studies on aspect sentiment classification.
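The abstract describes combining two contrastive losses (CKS and CSC) with the task loss. As an illustrative sketch only (the paper's exact formulation is not reproduced here), the snippet below implements the standard supervised contrastive loss (Khosla et al., 2020), which such modules typically build on, plus a hypothetical weighted combination of the loss terms; the function names, the weights `lam1`/`lam2`, and the toy features are all assumptions, not the authors' code.

```python
import numpy as np

def supervised_contrastive_loss(features, labels, temperature=0.1):
    """Supervised contrastive loss over a batch of feature vectors.

    Anchors are pulled toward same-label samples (positives) and pushed
    away from different-label samples, following Khosla et al. (2020).
    features: (N, D) array; labels: (N,) integer array.
    """
    # L2-normalize so similarities are cosine similarities
    feats = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = feats @ feats.T / temperature

    n = labels.shape[0]
    self_mask = np.eye(n, dtype=bool)

    # Softmax denominator over all samples except the anchor itself
    sim_no_self = np.where(self_mask, -np.inf, sim)
    log_denom = np.log(np.exp(sim_no_self).sum(axis=1, keepdims=True))
    log_prob = sim - log_denom

    # Positives: same label, excluding the anchor itself
    pos_mask = (labels[:, None] == labels[None, :]) & ~self_mask
    pos_counts = pos_mask.sum(axis=1)
    valid = pos_counts > 0  # anchors with at least one positive

    per_anchor = -np.where(pos_mask, log_prob, 0.0).sum(axis=1)
    return (per_anchor[valid] / pos_counts[valid]).mean()

def combined_loss(l_ce, l_cks, l_csc, lam1=1.0, lam2=1.0):
    """Hypothetical weighted sum of cross-entropy, CKS and CSC terms."""
    return l_ce + lam1 * l_cks + lam2 * l_csc
```

As a sanity check, a batch whose same-label features coincide yields a near-zero loss, while mislabeling the same batch yields a much larger one, which is the behavior a knowledge-sharing or supervised contrastive term relies on.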




Published

05-04-2023

How to Cite

[1] P. T. Quynh Trang, N. Ngoc Huyen, P. D. Dan Truong, and D. Thanh Hai, “Bert Adapter and Contrastive Learning for Continual Classification of Aspect Sentiment Task Sequences”, Vietnam J. Sci. Technol., vol. 61, no. 4, Apr. 2023.

Section

Electronics - Telecommunication