Global Variational Learning for Graphical Models with Latent Variables

Title: Global Variational Learning for Graphical Models with Latent Variables
Author(s): Abdelatty, Ahmed M.
Advisor: Ruozzi, Nicholas
Date Created: 2018-05
Format: Thesis
Abstract: Probabilistic Graphical Models have been used extensively to develop Machine Learning applications, including Computer Vision, Natural Language Processing, Collaborative Filtering, and Bioinformatics. Graphical Models with latent variables are particularly powerful tools for modeling uncertainty, since latent variables can represent unobserved factors and can also model correlations among the observed variables. However, global learning of Latent Variable Models (LVMs) is NP-hard in general, and state-of-the-art algorithms for learning them, such as the Expectation Maximization (EM) algorithm, can get stuck in local optima. In this thesis, we address the problem of global variational learning for LVMs. More precisely, we propose a convex variational approximation for Maximum Likelihood learning and apply the Frank-Wolfe algorithm to solve it. We also investigate the use of the Global Optimization Algorithm (GOP) for Bayesian learning, and we demonstrate that it converges to the global optimum.
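
Note: The abstract's mention of applying the Frank-Wolfe algorithm to a convex variational approximation can be illustrated with the generic sketch below. The quadratic objective, the probability-simplex feasible set, the diminishing step-size rule, and all names (frank_wolfe, grad, x0) are illustrative assumptions for this sketch, not the formulation used in the thesis.

    # A minimal Frank-Wolfe sketch on the probability simplex.
    # The quadratic objective and simplex feasible set are illustrative
    # assumptions; the thesis's actual variational objective is not shown here.
    import numpy as np

    def frank_wolfe(grad, x0, num_iters=100):
        """Minimize a smooth convex function over the probability simplex.

        grad: callable returning the gradient at a point x.
        x0:   feasible starting point (nonnegative, sums to 1).
        """
        x = x0.copy()
        for t in range(num_iters):
            g = grad(x)
            # Linear minimization oracle: over the simplex, the minimizer of
            # <g, s> is the vertex (standard basis vector) with the smallest
            # gradient coordinate.
            s = np.zeros_like(x)
            s[np.argmin(g)] = 1.0
            # Standard diminishing step size for Frank-Wolfe.
            gamma = 2.0 / (t + 2.0)
            x = (1.0 - gamma) * x + gamma * s
        return x

    if __name__ == "__main__":
        # Example: minimize ||x - b||^2 over the simplex for a point b
        # that already lies on the simplex.
        b = np.array([0.1, 0.7, 0.2])
        grad = lambda x: 2.0 * (x - b)
        x_star = frank_wolfe(grad, x0=np.ones(3) / 3)
        print(x_star)  # approaches b
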
Degree Name: MSCS
Degree Level: Masters
Persistent Link: http://hdl.handle.net/10735.1/5933
Terms of Use: ©2018 The Author. Digital access to this material is made possible by the Eugene McDermott Library. Further transmission, reproduction or presentation (such as public display or performance) of protected items is prohibited except with permission of the author.
Type: text
Degree Program: Computer Science

Files in this item:
ETD-5608-011-ABDELATTY-8103.85.pdf (1.120 MB, PDF)
