Latent semantic grammar induction: Context, projectivity, and prior distributions
Abstract
This paper presents latent semantic grammars for the unsupervised induction of English grammar. Latent semantic grammars were induced by applying singular value decomposition to n-gram-by-context-feature matrices. Performance was evaluated on a parsing task. Experiments with context, projectivity, and prior distributions show the relative performance effects of these kinds of prior knowledge. Results show that prior distributions, projectivity, and part-of-speech information are not necessary to beat the right-branching baseline.
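The core step the abstract describes, applying singular value decomposition to an n-gram-by-context-feature matrix to obtain latent representations, can be sketched as follows. This is a minimal illustration with made-up toy counts, not the paper's actual data, features, or dimensionality; the n-grams, context-feature names, and the choice of k = 2 are all hypothetical.

```python
import numpy as np

# Toy n-gram-by-context-feature count matrix (illustrative values only):
# rows are n-grams, columns are context features such as neighboring words.
ngrams = ["the dog", "the cat", "a dog", "runs fast"]
contexts = ["left:start", "left:verb", "right:verb", "right:end"]
M = np.array([
    [3.0, 0.0, 2.0, 1.0],
    [2.0, 0.0, 1.0, 1.0],
    [1.0, 0.0, 2.0, 0.0],
    [0.0, 2.0, 0.0, 3.0],
])

# Truncated SVD: keep the top-k singular dimensions so each n-gram gets a
# compact latent-semantic representation of its distributional context.
k = 2
U, s, Vt = np.linalg.svd(M, full_matrices=False)
latent = U[:, :k] * s[:k]  # latent coordinates, one row per n-gram

def cos(a, b):
    """Cosine similarity between two latent vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# N-grams with similar context distributions ("the dog" / "the cat")
# should lie closer in the latent space than distributionally different
# ones ("the dog" / "runs fast").
sim_dog_cat = cos(latent[0], latent[1])
sim_dog_fast = cos(latent[0], latent[3])
```

In the paper itself such latent similarities feed into grammar induction and are evaluated by parsing; this sketch only shows the matrix-factorization step.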
Publication Title
HLT-NAACL 2007 - TextGraphs 2007: Graph-Based Algorithms for Natural Language Processing, Proceedings of the Workshop
Recommended Citation
Olney, A. (2007). Latent semantic grammar induction: Context, projectivity, and prior distributions. HLT-NAACL 2007 - TextGraphs 2007: Graph-Based Algorithms for Natural Language Processing, Proceedings of the Workshop, 45-52. Retrieved from https://digitalcommons.memphis.edu/facpubs/8133