Electronic Theses and Dissertations

Identifier

6665

Date

2021

Document Type

Thesis

Degree Name

Master of Science

Major

Psychology

Concentration

Experimental Psychology

Committee Chair

Xiangen Hu

Committee Member

Arthur Graesser

Committee Member

Leah Windsor

Committee Member

Brent Morgan

Abstract

The goal of this thesis is to evaluate the answers that students give to questions asked by an intelligent tutoring system (ITS) for electronics called ElectronixTutor. One learning resource of ElectronixTutor is AutoTutor, an instructional module that helps students learn by holding a conversation in natural language. The semantic relatedness between a student's verbal input and an ideal answer is a salient feature for assessing the student's performance in AutoTutor. Inaccurate assessment of these verbal contributions creates problems in AutoTutor's adaptation to the student. This thesis therefore evaluated the quality of semantic matches between student input and the expected responses in AutoTutor, which scores student verbal input with a combination of Latent Semantic Analysis (LSA) and Regular Expressions (RegEx). Analyzing response-expectation pairings and comparing computer scoring with judge ratings allowed us to examine the agreement between humans and computers both overall and on an item-by-item basis. Aggregate analyses of these data revealed the overall relative agreement between subject-matter experts and the AutoTutor system, while item analyses revealed variation between items and interactions between human and computer assessment conditions at various threshold levels (i.e., stringent, intermediate, lenient). As expected, RegEx and LSA scores showed a positive relationship, ρ(5202) = .471. Additionally, F1 agreement (the harmonic mean of precision and recall) between the computer and the human judges was similar to agreement between the human judges themselves; in some cases, the difference between computer-human and human-human F1 agreement was as small as .006.
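For readers unfamiliar with the F1 measure named in the abstract, the following is a minimal Python sketch of how F1 agreement between two raters' binary judgments can be computed and how a computer's continuous semantic-match scores might be binarized at a threshold before comparison. The function name, example data, and threshold value are illustrative assumptions, not taken from the thesis.

    def f1_agreement(reference, predicted):
        """F1 = harmonic mean of precision and recall, treating one
        rater's binary judgments as the reference labels."""
        tp = sum(1 for r, p in zip(reference, predicted) if r and p)
        fp = sum(1 for r, p in zip(reference, predicted) if not r and p)
        fn = sum(1 for r, p in zip(reference, predicted) if r and not p)
        precision = tp / (tp + fp) if (tp + fp) else 0.0
        recall = tp / (tp + fn) if (tp + fn) else 0.0
        if precision + recall == 0.0:
            return 0.0
        return 2 * precision * recall / (precision + recall)

    # Hypothetical example: human binary judgments versus computer
    # match scores binarized at an illustrative "intermediate" threshold.
    human = [1, 1, 0, 1, 0, 0, 1]
    computer_scores = [0.82, 0.45, 0.10, 0.71, 0.55, 0.20, 0.90]
    threshold = 0.6  # assumed for illustration, not from the thesis
    computer = [int(s >= threshold) for s in computer_scores]
    print(f1_agreement(human, computer))

Varying the threshold in such a sketch corresponds to the stringent, intermediate, and lenient levels described in the abstract: a higher cutoff accepts fewer student responses as matches, trading recall for precision.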

Comments

Data is provided by the student.

Library Comment

Dissertation or thesis originally submitted to the local University of Memphis Electronic Theses & Dissertations (ETD) Repository.
