Expectation Maximization and Posterior Constraints
Abstract
The expectation maximization (EM) algorithm is a widely used maximum likelihood estimation procedure for statistical models when the values of some of the variables in the model are not observed. Very often, however, our aim is primarily to find a model that assigns values to the latent variables that have intended meaning for our data, and maximizing expected likelihood only sometimes accomplishes this. Unfortunately, it is typically difficult to add even simple a priori information about latent variables in graphical models without making the models overly complex or intractable. In this paper, we present an efficient, principled way to inject rich constraints on the posteriors of latent variables into the EM algorithm. Our method can be used to learn tractable graphical models that satisfy additional, otherwise intractable constraints. Focusing on clustering and the alignment problem for statistical machine translation, we show that simple, intuitive posterior constraints can greatly improve the performance over standard baselines and be competitive with more complex, intractable models.
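As an illustration only (this sketch is not the paper's code), the constrained E-step described above can be viewed as a KL projection: instead of using the model posterior p(z|x) directly, one finds the closest distribution q that satisfies an expectation constraint E_q[f(z)] <= b, which has the dual form q(z) proportional to p(z) exp(-lambda f(z)). The Python sketch below solves this for a single constraint; the function name project_posterior and the toy numbers are hypothetical, chosen purely for demonstration.

import numpy as np

def project_posterior(p, f, b, lr=0.1, iters=500):
    """KL-project a posterior p over K latent values onto {q : E_q[f] <= b}.

    By Lagrangian duality the minimizer of KL(q || p) subject to the
    constraint is q(z) proportional to p(z) * exp(-lam * f(z)), with
    lam >= 0 chosen so the constraint holds (lam = 0 if p already
    satisfies it). Here lam is found by projected gradient ascent on
    the concave dual.
    """
    lam = 0.0
    for _ in range(iters):
        q = p * np.exp(-lam * f)
        q /= q.sum()
        # Dual gradient is E_q[f] - b; ascend and keep lam nonnegative.
        lam = max(0.0, lam + lr * (q @ f - b))
    q = p * np.exp(-lam * f)
    return q / q.sum()

# Toy example: posterior over 3 clusters; constrain the expected
# indicator of cluster 0 to be at most 0.5.
p = np.array([0.7, 0.2, 0.1])
f = np.array([1.0, 0.0, 0.0])   # feature: indicator of cluster 0
q = project_posterior(p, f, b=0.5)
print(q)                        # q[0] is pulled down to 0.5

In a full EM loop, q would replace the unconstrained posterior when computing expected sufficient statistics in the M-step, which is what lets the learned model remain tractable while the constraint set encodes the otherwise intractable side information.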
Comments
Expectation Maximization and Posterior Constraints (http://www.cis.upenn.edu/~taskar/pubs/empc_nips07.pdf), J. Graca (http://www.cis.upenn.edu/~graca/), K. Ganchev (http://www.seas.upenn.edu/~kuzman/), and B. Taskar. Neural Information Processing Systems Conference (NIPS) (http://www.nips.cc/Conferences), Vancouver, BC, December 2007.

