Rob Malouf and Miles
Section: Language and Computation
Unification-based attribute-value grammar formalisms such as Lexical-Functional Grammar and Head-Driven Phrase Structure Grammar have proven to be highly successful for practical large-scale grammar development. However, realistic applications of attribute-value grammars for natural language parsing or generation require the use of sophisticated statistical techniques for resolving ambiguities. This one-week course will provide an introduction to the maximum entropy principle and the construction of maximum entropy models for natural language processing. Through a combination of lectures and, as local computing facilities permit, hands-on lab exercises, students will investigate the implementation of maximum entropy models for attribute-value grammars, including such topics as ambiguity identification, feature selection, and model training and evaluation.
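To give a flavor of the kind of model the course covers, the following is a minimal sketch of a conditional maximum entropy model for parse disambiguation, trained by gradient ascent on the log-likelihood. It is purely illustrative and not part of the course materials: the toy data, feature names, and function names are all invented here. Each candidate parse is represented as a set of binary features, and the model defines p(parse | sentence) proportional to the exponentiated sum of the weights of the features the parse activates.

```python
import math

# Illustrative toy data (invented): for each ambiguous sentence, a list of
# candidate parses, each a set of binary features, plus the index of the
# correct parse. The conjunction features (e.g. "v-attach:with") make the
# disambiguation problem learnable.
DATA = [
    ([{"attach:verb", "v-attach:saw"},  {"attach:noun", "n-attach:saw"}],  0),
    ([{"attach:verb", "v-attach:ate"},  {"attach:noun", "n-attach:ate"}],  0),
    ([{"attach:verb", "v-attach:with"}, {"attach:noun", "n-attach:with"}], 1),
]

FEATURES = sorted({f for cands, _ in DATA for cand in cands for f in cand})

def probs(weights, candidates):
    # p(parse | sentence) = exp(sum of active feature weights) / Z
    scores = [math.exp(sum(weights[f] for f in cand)) for cand in candidates]
    z = sum(scores)
    return [s / z for s in scores]

def train(data, iters=200, rate=0.5):
    # Maximize the conditional log-likelihood by batch gradient ascent.
    # The gradient for each feature is its observed count on the correct
    # parses minus its expected count under the current model.
    weights = {f: 0.0 for f in FEATURES}
    for _ in range(iters):
        grad = {f: 0.0 for f in FEATURES}
        for candidates, gold in data:
            p = probs(weights, candidates)
            for j, cand in enumerate(candidates):
                for f in cand:
                    grad[f] += (1.0 if j == gold else 0.0) - p[j]
        for f in FEATURES:
            weights[f] += rate * grad[f]
    return weights

if __name__ == "__main__":
    w = train(DATA)
    for candidates, gold in DATA:
        p = probs(w, candidates)
        print(gold, [round(x, 3) for x in p])
```

After training, the correct parse in each example receives the highest conditional probability. A realistic system would add regularization (e.g. a Gaussian prior on the weights) and extract candidate parses and features from an actual attribute-value grammar rather than a hand-built list.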
This course will assume a basic knowledge of probability theory; some experience in grammar development or programming in a high-level language would also be helpful.