Cognitively inspired NLP-based knowledge representations: Further explorations of latent semantic analysis

Abstract

Natural-language-based knowledge representations borrow their expressiveness from the semantics of language. One such knowledge representation technique is latent semantic analysis (LSA), a statistical, corpus-based method for representing knowledge. It has been used successfully in a variety of applications, including intelligent tutoring systems, essay grading, and coherence metrics. The advantage of LSA is that it represents world knowledge efficiently, without the need for manual coding of relations, and that it has in fact been considered to simulate aspects of human knowledge representation. An overview of LSA applications is given, followed by further explorations of the use of LSA. These explorations focus on the idea that the power of LSA can be amplified by considering semantic fields of text units rather than pairs of text units. Examples are given for semantic networks, category membership, typicality, spatiality, and temporality, providing new evidence for LSA as a mechanism for knowledge representation. The results of these tests show that while the mechanism behind LSA is unique, it is flexible enough to replicate results across different corpora and languages. © World Scientific Publishing Company.
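
To make the abstract's description of LSA concrete, the following is a minimal sketch (not the authors' implementation) of the standard LSA pipeline: build a term-document matrix from a toy corpus, reduce it with a truncated singular value decomposition, and compare text units by cosine similarity in the resulting latent space. The corpus, dimensionality k, and weighting scheme here are illustrative assumptions; published LSA work uses large corpora and log-entropy weighting.

```python
import numpy as np

# Toy corpus; real LSA is trained on large text collections.
documents = [
    "the cat sat on the mat",
    "the dog sat on the log",
    "cats and dogs are animals",
]

# Term-document matrix of raw counts (LSA typically applies log/entropy weighting first).
vocab = sorted({w for d in documents for w in d.split()})
X = np.array([[d.split().count(w) for d in documents] for w in vocab], dtype=float)

# Truncated SVD: keep only the k largest singular values/vectors.
k = 2
U, s, Vt = np.linalg.svd(X, full_matrices=False)
doc_vectors = (np.diag(s[:k]) @ Vt[:k]).T  # each row: one document in the k-dimensional latent space

def cosine(a, b):
    """Cosine similarity, the usual LSA measure of semantic relatedness."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

# Pairwise similarities between text units. The "semantic field" idea described in the
# abstract would compare a unit against a whole set of related units rather than one pair.
print(cosine(doc_vectors[0], doc_vectors[1]))
print(cosine(doc_vectors[0], doc_vectors[2]))
```

In this sketch, a semantic field could be approximated by averaging the similarities of a target unit to every member of a set of units, rather than relying on a single pairwise comparison.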

Publication Title

International Journal on Artificial Intelligence Tools
