Global Variational Learning for Graphical Models with Latent Variables
Abdelatty, Ahmed M.
Probabilistic Graphical Models have been used extensively in Machine Learning applications, including Computer Vision, Natural Language Processing, Collaborative Filtering, and Bioinformatics. Graphical Models with latent variables are particularly powerful tools for modeling uncertainty, since latent variables can represent unobserved factors and can also model correlations between the observed variables. However, global learning of Latent Variable Models (LVMs) is NP-hard in general, and state-of-the-art algorithms for learning them, such as the Expectation Maximization algorithm, can get stuck in local optima. In this thesis, we address the problem of global variational learning for LVMs. More precisely, we propose a convex variational approximation for Maximum Likelihood Learning and apply the Frank-Wolfe algorithm to solve it. We also investigate the use of the Global Optimization Algorithm (GOP) for Bayesian Learning, and we demonstrate that it converges to the global optimum.
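To illustrate the kind of solver the abstract refers to, here is a minimal sketch of the Frank-Wolfe (conditional gradient) method on a toy quadratic objective over the probability simplex. The function names, the toy objective, and the simplex feasible set are illustrative assumptions, not the thesis's actual variational objective; Frank-Wolfe only requires a gradient oracle and a linear minimization oracle (LMO) over the feasible set.

```python
import numpy as np

def frank_wolfe(grad, x0, lmo, n_iters=500):
    """Frank-Wolfe: each step solves a linear minimization over the
    feasible set and moves toward that solution (no projection needed)."""
    x = x0.copy()
    for k in range(n_iters):
        g = grad(x)
        s = lmo(g)                 # vertex minimizing <g, s> over the set
        gamma = 2.0 / (k + 2.0)    # standard diminishing step size
        x = (1 - gamma) * x + gamma * s
    return x

# Toy problem (illustrative): minimize ||x - b||^2 over the simplex.
b = np.array([0.2, 0.5, 0.3])
grad = lambda x: 2.0 * (x - b)
# Simplex LMO: the best vertex is the standard basis vector at the
# coordinate where the gradient is smallest.
lmo = lambda g: np.eye(len(g))[np.argmin(g)]
x_star = frank_wolfe(grad, np.array([1.0, 0.0, 0.0]), lmo)
```

Because every iterate is a convex combination of simplex vertices, `x_star` stays feasible throughout, which is the appeal of Frank-Wolfe for structured convex sets such as marginal polytopes arising in variational inference.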