Neural temporality adaptation for document classification: Diachronic word embeddings and domain adaptation models
Abstract
Language usage can change over time, but document classification models are usually trained and tested on corpora spanning multiple years without taking these temporal variations into account. This paper describes two complementary ways to adapt classifiers to shifts across time. First, we show that diachronic word embeddings, which were originally developed to study language change, can also improve document classification, and we describe a simple method for constructing this type of embedding. Second, we propose a time-driven neural classification model inspired by methods for domain adaptation. Experiments on six corpora show how these methods can make classifiers more robust over time.
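To make the first idea concrete, below is a minimal sketch of one widely used way to construct diachronic word embeddings: train a separate skip-gram model for each time period and rotate each period's vectors into a shared space with orthogonal Procrustes alignment. This is an illustrative construction under assumed names and settings (train_period_embeddings, align_to_reference, the vector dimension, and the period keys are all hypothetical), and it is not necessarily the simple construction method described in the paper.

```python
"""Sketch: diachronic word embeddings via per-period training plus
orthogonal Procrustes alignment. Assumes gensim >= 4 and numpy."""
import numpy as np
from gensim.models import Word2Vec


def train_period_embeddings(docs_by_period, dim=100):
    """Train one skip-gram model per time period.

    docs_by_period maps a period label (e.g. "2009-2011") to a list of
    tokenized documents (each a list of strings). Returns KeyedVectors
    per period.
    """
    return {
        period: Word2Vec(sentences=docs, vector_size=dim, window=5,
                         min_count=5, sg=1, epochs=5).wv
        for period, docs in docs_by_period.items()
    }


def align_to_reference(ref_wv, other_wv):
    """Rotate other_wv's vectors into ref_wv's space.

    Solves min_R ||A R - B||_F over orthogonal R, where rows of A are
    the source vectors and rows of B the reference vectors for the
    shared vocabulary; the solution is R = U V^T from the SVD of A^T B.
    """
    shared = [w for w in other_wv.index_to_key if w in ref_wv.key_to_index]
    A = np.stack([other_wv[w] for w in shared])   # source (to be rotated)
    B = np.stack([ref_wv[w] for w in shared])     # reference space
    u, _, vt = np.linalg.svd(A.T @ B)
    R = u @ vt
    # Return every source word's vector expressed in the reference space.
    return {w: other_wv[w] @ R for w in other_wv.index_to_key}
```

Once aligned, the period-specific vectors live in a single coordinate system, so a downstream classifier can look up the embedding that matches each document's time period (or an average of them) without its input space drifting across periods.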
Publication Title
ACL 2019 - 57th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference
Recommended Citation
Huang, X., & Paul, M. (2020). Neural temporality adaptation for document classification: Diachronic word embeddings and domain adaptation models. ACL 2019 - 57th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference, 4113-4123. Retrieved from https://digitalcommons.memphis.edu/facpubs/3010