Paraphrasing Academic Text: A Study of Back-Translating Anatomy and Physiology with Transformers
Abstract
This paper explores a general approach to paraphrase generation using a pre-trained seq2seq model fine-tuned on a back-translated anatomy and physiology textbook. Human ratings indicate that the paraphrase model generally preserved meaning and grammaticality/fluency: 70% of meaning ratings were above 75, and 40% of paraphrases were judged more grammatical/fluent than the originals. An error analysis suggests potential avenues for future work.
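As a rough illustration of the back-translation step described above, the sketch below paraphrases an English sentence by translating it to a pivot language and back using off-the-shelf MarianMT checkpoints from the Hugging Face transformers library. The pivot language (French), the model names, and the generation settings are assumptions chosen for illustration, not the configuration reported in the paper.

    # A minimal back-translation sketch, assuming French as the pivot
    # language and Helsinki-NLP MarianMT checkpoints (not necessarily
    # the setup used in the paper).
    from transformers import MarianMTModel, MarianTokenizer

    def translate(texts, model, tokenizer):
        # Tokenize a batch, generate translations, and decode them.
        batch = tokenizer(texts, return_tensors="pt",
                          padding=True, truncation=True)
        generated = model.generate(**batch)
        return [tokenizer.decode(t, skip_special_tokens=True)
                for t in generated]

    # Forward model: English -> French
    en_fr_tok = MarianTokenizer.from_pretrained("Helsinki-NLP/opus-mt-en-fr")
    en_fr = MarianMTModel.from_pretrained("Helsinki-NLP/opus-mt-en-fr")

    # Backward model: French -> English
    fr_en_tok = MarianTokenizer.from_pretrained("Helsinki-NLP/opus-mt-fr-en")
    fr_en = MarianMTModel.from_pretrained("Helsinki-NLP/opus-mt-fr-en")

    sentence = "The heart pumps blood through the circulatory system."
    pivot = translate([sentence], en_fr, en_fr_tok)        # en -> fr
    paraphrase = translate(pivot, fr_en, fr_en_tok)        # fr -> en
    print(paraphrase[0])  # a candidate paraphrase of the input

Pairs of original and back-translated sentences produced this way can then serve as training data for fine-tuning a seq2seq paraphrase model, which is the general strategy the abstract outlines.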
Publication Title
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Recommended Citation
Olney, A. (2021). Paraphrasing Academic Text: A Study of Back-Translating Anatomy and Physiology with Transformers. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 12749 LNAI, 279-284. https://doi.org/10.1007/978-3-030-78270-2_50