Electronic Theses and Dissertations

Date

2020

Document Type

Dissertation

Degree Name

Doctor of Philosophy

Department

Computer Science

Committee Chair

Deepak Venugopal

Committee Member

Vasile Rus

Committee Member

Bernie J. Daigle

Committee Member

Xiaofei Zhang

Abstract

Markov Logic Networks (MLNs) combine first-order logic with probabilistic graphical models and are therefore capable of encoding complex domain knowledge. However, learning and inference in MLNs are extremely challenging, and current methods suffer from poor accuracy, poor scalability, or both. The goal of this dissertation is to significantly improve the performance of MLNs on complex tasks by developing novel algorithms that i) systematically exploit symmetries in learning, ii) utilize advances in parallel computing to improve scalability, and iii) combine MLNs with Deep Neural Networks (DNNs) to yield more powerful models. In particular, we develop mixture models whose components are learned based on symmetries in the MLN. To exploit parallelism, we develop a Spark-based system that recognizes symmetries in a distributed manner. Further, we combine MLNs with DNNs by learning a sub-symbolic representation for MLN symbols, called Obj2Vec, that captures symmetries in the MLN structure. Using Obj2Vec, we develop two neuro-symbolic learning methods that encode symmetries in the MLN into the DNN learner. Specifically, we develop a Convolutional Neural Network based approach to learn complex parameterizations for MLNs, and we develop an approach for learning relations over multiple possible worlds using Neural Tensor Networks. We show that our models generalize better when DNNs are regularized with knowledge from MLNs. This dissertation is thus a step toward the long-standing goal in AI of combining symbolic and neural network based models.
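
For context, the following is the standard MLN semantics from the literature (Richardson and Domingos), not an excerpt from the dissertation: an MLN defines a log-linear distribution over possible worlds x via weighted first-order formulas,

P(X = x) = \frac{1}{Z} \exp\Big( \sum_i w_i \, n_i(x) \Big),

where w_i is the weight attached to formula i, n_i(x) is the number of true groundings of formula i in world x, and Z is the partition function normalizing the distribution. Because many groundings are interchangeable, such models contain large symmetry groups, which is what the learning and inference algorithms summarized in the abstract exploit.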

Comments

Data is provided by the student.

Library Comment

Dissertation or thesis originally submitted to ProQuest
