Parameter Tying and Dissociation in Graphical Models

Date

2019-12

Abstract

Understanding the implications of today’s deluge and high velocity of data is a challenging problem facing researchers across multiple disciplines and domains. Data are typically high-dimensional, unstructured, and noisy; thus, the models produced or learned by modern machine learning techniques are often very complex, i.e., they have a large number of parameters. To address this, we present new techniques that impose constraints, in the form of parameter tying, on both learning and inference for probabilistic graphical models (PGMs). Specifically, in this dissertation, we consider two important problems for PGMs. The first problem is parameter learning given a PGM structure, with the objective of improving the generalization of the learned model. We present and utilize the concept of parameter tying as a novel alternative regularization (variance-reduction) framework for parameter learning in PGMs. In addition to improved generalization, parameter-tied PGMs form a new class of models for which inference algorithms can be constructed to achieve efficient inference by exploiting the symmetric parameter structure. The second problem focuses on exploiting parameter tying to develop a bounded inference scheme, which we refer to as dissociation-based bounds. We consider the task of weighted model counting, which includes important PGM tasks, such as computing the partition function and the probability of evidence, as special cases. Namely, we propose a partition-based bounding algorithm that exploits logical structure and gives rise to a novel set of inequalities from which lower and upper bounds can be derived efficiently. The bounds come with correctness guarantees and are oblivious in that they can be computed by minimally manipulating the parameters of the model.
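As a small illustration of the two central notions in the abstract (not code from the dissertation), the sketch below builds a hypothetical tiny pairwise Markov network whose node and edge potentials are tied to a single bias and a single edge weight, and computes its partition function by brute-force enumeration, i.e., the weighted model count that the dissertation's dissociation-based bounds aim to approximate on models too large for exact computation. All model choices and parameter values here are assumptions made purely for illustration.

    import itertools
    import math

    # Hypothetical model: a chain X1 - X2 - X3 - X4 over binary variables.
    # Parameter tying: one shared edge weight w and one shared node bias b
    # instead of separate parameters per edge and per node.
    w, b = 1.2, -0.3              # tied parameters (illustrative values)
    edges = [(0, 1), (1, 2), (2, 3)]

    def log_score(x):
        """Unnormalized log-probability of a joint assignment x in {0,1}^4."""
        s = sum(b * xi for xi in x)                     # tied node potential
        s += sum(w * (x[i] == x[j]) for i, j in edges)  # tied edge potential
        return s

    # Partition function Z: a weighted model count over all 2^4 assignments.
    # Exact enumeration is feasible only because the model is tiny.
    Z = sum(math.exp(log_score(x))
            for x in itertools.product([0, 1], repeat=4))
    print(f"Z = {Z:.4f}")
    print(f"P(X = 1111) = {math.exp(log_score((1, 1, 1, 1))) / Z:.4f}")

In this toy example, tying leaves only two free parameters (w and b) rather than seven, which is the variance-reduction effect the abstract refers to; the exhaustive sum over assignments is the quantity that bounded inference schemes replace with efficiently computable lower and upper bounds.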

Keywords

Machine learning, Graphical modeling (Statistics)

Rights

©2019 Li Kang Chou. All Rights Reserved.
