Attention-Based Transformer for Student Answers Assessment

Abstract

Inspired by the Transformer of Vaswani et al., we propose in this paper an attention-based transformer neural network with a multi-head attention mechanism for the task of student answer assessment. Results show the competitiveness of our proposed model. The highest accuracy, 71.5%, was achieved using ELMo embeddings, 10 attention heads, and 2 layers. This rivals the highest accuracy (72.5%) achieved by a previously proposed BI-GRU-CapsNet deep network on the same dataset. The main advantages of using transformers over BI-GRU-CapsNet are reduced training time and greater scope for parallelization.
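
The abstract describes the architecture only at a high level. As a rough illustration, a minimal PyTorch sketch of the reported configuration (1024-dimensional ELMo embeddings fed to a 2-layer transformer encoder with 10 attention heads, followed by a classification head) might look as follows. The projection to a head-divisible width, the mean pooling, and the 3-way label set are illustrative assumptions, not details taken from the paper.

```python
# A minimal sketch (PyTorch) of the configuration named in the abstract:
# pre-computed ELMo embeddings, 2 encoder layers, 10 attention heads.
# Dimensions, pooling, and the 3-way label set are assumptions.
import torch
import torch.nn as nn

class AnswerAssessmentTransformer(nn.Module):
    def __init__(self, elmo_dim=1024, d_model=1000, n_heads=10,
                 n_layers=2, n_classes=3):
        super().__init__()
        # Project ELMo vectors to a width divisible by the head count,
        # since multi-head attention splits d_model evenly across heads.
        self.proj = nn.Linear(elmo_dim, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.classifier = nn.Linear(d_model, n_classes)

    def forward(self, elmo_embeddings, padding_mask=None):
        # elmo_embeddings: (batch, seq_len, 1024); padding_mask: (batch, seq_len)
        x = self.proj(elmo_embeddings)
        x = self.encoder(x, src_key_padding_mask=padding_mask)
        # Mean-pool over tokens, then classify the student answer.
        return self.classifier(x.mean(dim=1))

model = AnswerAssessmentTransformer()
dummy = torch.randn(4, 20, 1024)   # 4 answers, 20 tokens each
logits = model(dummy)              # (4, 3) class scores
```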

Publication Title

Proceedings of the 33rd International Florida Artificial Intelligence Research Society Conference, FLAIRS 2020
