Parameter Tying and Dissociation in Graphical Models

dc.contributor.advisor: Gogate, Vibhav
dc.creator: Chou, Li Kang
dc.date.accessioned: 2020-03-03T22:30:53Z
dc.date.available: 2020-03-03T22:30:53Z
dc.date.created: 2019-12
dc.date.issued: 2019-12
dc.date.submitted: December 2019
dc.date.updated: 2020-03-03T22:30:53Z
dc.description.abstract: Understanding the implications of today’s deluge and high velocity of data is a challenging problem facing researchers across multiple disciplines and domains. Data are typically high-dimensional, unstructured, and noisy; thus, the models produced or learned by modern machine learning techniques are often very complex, i.e., they have a large number of parameters. To address this, we present new techniques that impose constraints, in the form of parameter tying, on both the learning and inference tasks for probabilistic graphical models (PGMs). Specifically, in this dissertation, we consider two important problems for PGMs. The first is parameter learning given a PGM structure, with the objective of improving the generalization of the learned model. We present and utilize the concept of parameter tying as a novel alternative regularization (variance-reduction) framework for parameter learning in PGMs. In addition to improved generalization, parameter-tied PGMs form a new class of models for which inference algorithms can be constructed to achieve efficient inference by exploiting the symmetric parameter structure. The second problem focuses on exploiting parameter tying to develop a bounded inference scheme, which we refer to as dissociation-based bounds. We consider the task of weighted model counting, which includes important PGM tasks such as computing the partition function and the probability of evidence as special cases. Namely, we propose a partition-based bounding algorithm that exploits logical structure and gives rise to a novel set of inequalities from which lower and upper bounds can be derived efficiently. The bounds come with correctness guarantees and are oblivious in that they can be computed by minimally manipulating the parameters of the model.
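Two ideas from the abstract can be made concrete with a toy example: parameter tying (several potentials sharing one parameter table, shrinking the parameter count) and the partition function as a special case of weighted model counting. The following is a minimal sketch with made-up potentials and a brute-force sum; it is illustrative only and is not the dissertation's algorithm, which bounds such sums without enumeration.

```python
import itertools
import math

# Toy pairwise Markov network over three binary variables in a chain
# X0 - X1 - X2. Parameter tying: both edges share ONE log-potential
# table instead of each edge having its own (halving the free parameters).
# All numbers here are illustrative, not taken from the dissertation.
tied_log_potential = {
    (0, 0): 0.5, (0, 1): -0.2,
    (1, 0): -0.2, (1, 1): 0.5,
}
edges = [(0, 1), (1, 2)]

def log_score(assignment):
    """Sum the shared (tied) edge potential over all edges."""
    return sum(tied_log_potential[(assignment[i], assignment[j])]
               for i, j in edges)

# Partition function Z: a weighted model count over all 2^3 assignments,
# computed here by brute-force enumeration for clarity.
Z = sum(math.exp(log_score(x)) for x in itertools.product((0, 1), repeat=3))

# Probability of a single assignment under the tied model.
p_111 = math.exp(log_score((1, 1, 1))) / Z
print(Z, p_111)
```

Because the two edges are tied, the symmetric assignments (e.g., 001 and 100) contribute identical weights to Z, which is exactly the kind of symmetry that tying-aware inference algorithms can exploit.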
dc.format.mimetype: application/pdf
dc.identifier.uri: https://hdl.handle.net/10735.1/7329
dc.language.iso: en
dc.rights: ©2019 Li Kang Chou. All Rights Reserved.
dc.subject: Machine learning
dc.subject: Graphical modeling (Statistics)
dc.title: Parameter Tying and Dissociation in Graphical Models
dc.type: Dissertation
dc.type.material: text
thesis.degree.department: Computer Science
thesis.degree.grantor: The University of Texas at Dallas
thesis.degree.level: Doctoral
thesis.degree.name: PHD

Files

Original bundle (1 of 1)

Name: ETD-5608-011-LKC-260974.75.pdf
Size: 1.24 MB
Format: Adobe Portable Document Format
Description: Dissertation

License bundle (2 of 2)

Name: LICENSE.txt
Size: 1.84 KB
Format: Plain Text

Name: PROQUEST_LICENSE.txt
Size: 5.84 KB
Format: Plain Text