Latent semantic grammar induction: Context, projectivity, and prior distributions

Abstract

This paper presents latent semantic grammars for the unsupervised induction of English grammar. Latent semantic grammars were induced by applying singular value decomposition to n-gram-by-context-feature matrices. Parsing was used to evaluate performance. Experiments with context, projectivity, and prior distributions show the relative performance effects of these kinds of prior knowledge. Results show that prior distributions, projectivity, and part-of-speech information are not necessary to beat the right-branching baseline.
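The core operation the abstract describes, applying singular value decomposition to an n-gram-by-context-feature count matrix to obtain latent representations, can be sketched as follows. This is a minimal illustration with a made-up toy matrix, not the paper's actual data or pipeline; the matrix values, dimensionality `k`, and variable names are all assumptions for demonstration.

```python
import numpy as np

# Toy n-gram-by-context-feature count matrix (rows: n-grams, columns:
# context features). Values are illustrative only, not from the paper.
counts = np.array([
    [3.0, 1.0, 0.0, 2.0],
    [2.0, 0.0, 1.0, 3.0],
    [0.0, 4.0, 2.0, 0.0],
    [1.0, 3.0, 2.0, 1.0],
])

# Singular value decomposition of the count matrix.
U, s, Vt = np.linalg.svd(counts, full_matrices=False)

# Keep the top-k singular dimensions (a hypothetical choice here) to get
# a low-rank latent semantic representation: one k-dimensional vector
# per n-gram row.
k = 2
latent = U[:, :k] * s[:k]

print(latent.shape)  # (4, 2)
```

Rows of `latent` that are close in this reduced space correspond to n-grams with similar context-feature distributions, which is the sense in which the induced grammar is "latent semantic."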

Publication Title

HLT-NAACL 2007 - TextGraphs 2007: Graph-Based Algorithms for Natural Language Processing, Proceedings of the Workshop

