Global Variational Learning for Graphical Models with Latent Variables

Date

2018-05

Abstract

Probabilistic Graphical Models have been used extensively to develop Machine Learning applications in Computer Vision, Natural Language Processing, Collaborative Filtering, and Bioinformatics. Graphical Models with latent variables are particularly powerful tools for modeling uncertainty, since latent variables can represent unobserved factors and can also model correlations between the observed variables. However, global learning of Latent Variable Models (LVMs) is NP-hard in general, and state-of-the-art algorithms for learning them, such as the Expectation Maximization (EM) algorithm, can get stuck in local optima. In this thesis, we address the problem of global variational learning for LVMs. More precisely, we propose a convex variational approximation for Maximum Likelihood Learning and apply the Frank-Wolfe algorithm to solve it. We also investigate the use of the Global Optimization (GOP) algorithm for Bayesian Learning and demonstrate that it converges to the global optimum.
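The abstract's main algorithmic ingredient, the Frank-Wolfe (conditional gradient) method, can be sketched on a toy convex problem. Everything below is an illustrative stand-in: the objective (a quadratic over the probability simplex), the function names, and the step-size schedule are assumptions for the sketch, not the thesis's actual variational objective or implementation.

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, n_iters=2000):
    """Minimize a convex function over the probability simplex via Frank-Wolfe.

    grad: callable returning the gradient of the objective at x.
    Over the simplex, the linear minimization oracle is trivial:
    the minimizing vertex is the coordinate with the smallest
    gradient entry. Iterates stay feasible by construction, since
    each update is a convex combination of feasible points.
    """
    x = x0.copy()
    for t in range(n_iters):
        g = grad(x)
        s = np.zeros_like(x)
        s[np.argmin(g)] = 1.0          # vertex returned by the linear oracle
        gamma = 2.0 / (t + 2.0)        # standard O(1/t) step-size schedule
        x = (1.0 - gamma) * x + gamma * s
    return x

# Toy example: minimize ||x - b||^2 over the simplex, where b already lies
# in the simplex, so the minimizer is b itself.
b = np.array([0.2, 0.5, 0.3])
x_star = frank_wolfe_simplex(lambda x: 2.0 * (x - b),
                             x0=np.array([1.0, 0.0, 0.0]))
```

The appeal of Frank-Wolfe in this setting is that it is projection-free: each iteration only needs a linear minimization over the feasible set, which is cheap for structured sets such as the simplex or a marginal polytope.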

Keywords

Graphical modeling (Statistics), Machine learning, Latent variables, Constrained optimization, Nonconvex programming

Rights

©2018 The Author. Digital access to this material is made possible by the Eugene McDermott Library. Further transmission, reproduction or presentation (such as public display or performance) of protected items is prohibited except with permission of the author.
