Neural temporality adaptation for document classification: Diachronic word embeddings and domain adaptation models


Language usage can change across periods of time, but document classification models are usually trained and tested on corpora spanning multiple years without considering temporal variations. This paper describes two complementary ways to adapt classifiers to shifts across time. First, we show that diachronic word embeddings, which were originally developed to study language change, can also improve document classification, and we present a simple method for constructing this type of embedding. Second, we propose a time-driven neural classification model inspired by methods for domain adaptation. Experiments on six corpora show how these methods can make classifiers more robust over time.
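The abstract does not spell out how the diachronic embeddings are built. A common construction, shown here only as an illustrative sketch and not necessarily the paper's method, is to train separate embeddings per time slice and then map each slice into a shared space with an orthogonal Procrustes alignment over the shared vocabulary:

```python
import numpy as np

def align_embeddings(source, target):
    """Orthogonal Procrustes alignment of two embedding matrices whose
    rows correspond to the same shared vocabulary: find the orthogonal
    matrix W minimizing ||source @ W - target||_F via an SVD."""
    u, _, vt = np.linalg.svd(source.T @ target)
    return u @ vt  # orthogonal rotation from source space to target space

# Toy demo: pretend the later time slice is a rotated copy of the earlier one.
rng = np.random.default_rng(0)
early = rng.normal(size=(100, 50))              # 100 shared words, 50-dim vectors
q, _ = np.linalg.qr(rng.normal(size=(50, 50)))  # hidden orthogonal rotation
late = early @ q
w = align_embeddings(early, late)
print(np.allclose(early @ w, late))             # alignment recovers the rotation
```

Because `w` is constrained to be orthogonal, cosine similarities within the earlier slice are preserved, so the alignment only reconciles the coordinate systems of the two periods without distorting either one.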

Publication Title

ACL 2019 - 57th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference
