Paraphrasing Academic Text: A Study of Back-Translating Anatomy and Physiology with Transformers

Abstract

This paper explores a general approach to paraphrase generation using a pre-trained seq2seq model fine-tuned on a back-translated anatomy and physiology textbook. Human ratings indicate that the paraphrase model generally preserved meaning and grammaticality/fluency: 70% of meaning ratings exceeded 75, and 40% of paraphrases were rated more grammatical/fluent than the originals. An error analysis suggests potential avenues for future work.
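
The core data-generation step the abstract describes is back-translation: pivoting a source sentence through another language and back to produce a paraphrase. Below is a minimal sketch of that step, assuming the Hugging Face transformers library, MarianMT models, and German as the pivot language; the abstract specifies none of these, and the example sentence is purely illustrative.

    # Minimal back-translation sketch (assumed setup: Hugging Face
    # transformers + MarianMT, German pivot; not confirmed by the paper).
    from transformers import MarianMTModel, MarianTokenizer

    def load(name):
        # Load a pre-trained translation model and its tokenizer.
        tokenizer = MarianTokenizer.from_pretrained(name)
        model = MarianMTModel.from_pretrained(name)
        return tokenizer, model

    def translate(sentences, tokenizer, model):
        # Tokenize a batch, generate translations, and decode back to text.
        batch = tokenizer(sentences, return_tensors="pt",
                          padding=True, truncation=True)
        outputs = model.generate(**batch)
        return [tokenizer.decode(t, skip_special_tokens=True) for t in outputs]

    # English -> German, then German -> English yields a paraphrase pair.
    en_de_tok, en_de = load("Helsinki-NLP/opus-mt-en-de")
    de_en_tok, de_en = load("Helsinki-NLP/opus-mt-de-en")

    source = ["The heart pumps oxygenated blood through the arteries."]
    pivot = translate(source, en_de_tok, en_de)
    paraphrase = translate(pivot, de_en_tok, de_en)
    print(paraphrase)

Pairs of (source, paraphrase) sentences produced this way could then serve as training data for fine-tuning a seq2seq paraphrase model, as the abstract outlines.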

Publication Title

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
