A Neural Network Approach for Argument Reasoning Comprehension

dc.contributor.advisor: Ng, Vincent
dc.creator: Ye, Mingqing
dc.date.accessioned: 2019-10-08T16:22:37Z
dc.date.available: 2019-10-08T16:22:37Z
dc.date.created: 2019-05
dc.date.issued: 2019-05
dc.date.submitted: May 2019
dc.date.updated: 2019-10-08T16:22:38Z
dc.description.abstract:
The dream of enabling machines to think like humans has existed since the invention of the computer. Over the years, people have tried to build systems that can parse and understand natural language as humans do. A crucial goal of natural language processing is to understand argumentation, of which natural-language reasoning is regarded as an important and difficult part. Recently, neural network models have made many NLP tasks easier to handle: while n-gram language models struggle to capture the dependency between two distant words in a text, recurrent neural networks can do so efficiently. In addition, a growing number of researchers have begun to work on representation learning for natural language, since vector representations can encode linguistic information more accurately. Nevertheless, neural systems bring new challenges: they still have difficulty with long-range dependencies, they require large amounts of training data, and it is hard to control which information is emphasized during encoding.

In this thesis, we work on the argument reasoning comprehension task, in which an argument consists of a premise, which serves as a reason; a claim, which serves as a conclusion; and two warrant candidates, one of which should explain why the premise supports the claim. Our goal is to choose the better warrant of the two candidates. This task presents several difficulties. First, as mentioned above, the neural networks used for text modeling cannot determine which information matters most; we therefore try to identify the related information between warrant-premise pairs and warrant-claim pairs that makes the encoding more informative.

Second, to design the objective function used to train the neural networks, we need a way to model the similarity between vector triplets so that the probabilities of choosing warrant0 or warrant1 can be calculated. Finally, instances whose warrant candidates are lexically similar but semantically different, or even opposite, are difficult to handle. In this thesis, we address these challenges with a neural network approach, specifically by (a) using and modifying a multi-hop mechanism to find related information across the premise, warrants, and claim of an argument and to generate high-quality encodings; (b) developing measures of the similarity between two vector triplets in order to build a reasonable objective function; and (c) proposing ways to emphasize and encode the differences between lexically similar warrant statements. Building on these ideas, we design an automatic neural argument reasoning system for the dataset proposed by Habernal et al. (2018). Our system achieves 70.1% accuracy on the test set, the second-best result reported on this dataset. Importantly, parts of our pipeline can be reused in future systems.
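The objective described in the abstract — scoring each (premise, claim, warrant) vector triplet and normalizing over the two warrant candidates to obtain choice probabilities — can be sketched as follows. This is a minimal illustration, not the thesis's actual model: the cosine-based scoring, the averaging of warrant-premise and warrant-claim similarity, and the random toy encodings are all assumptions made for the example.

```python
import numpy as np

def cosine(u, v):
    # Cosine similarity between two vectors.
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def triplet_score(premise, claim, warrant):
    # Illustrative triplet similarity: average the warrant's similarity
    # to the premise and to the claim (an assumption, not the thesis's
    # actual measurement).
    return 0.5 * (cosine(warrant, premise) + cosine(warrant, claim))

def warrant_probabilities(premise, claim, warrant0, warrant1):
    # Softmax over the two candidate scores yields
    # P(warrant0) and P(warrant1).
    scores = np.array([triplet_score(premise, claim, warrant0),
                       triplet_score(premise, claim, warrant1)])
    e = np.exp(scores - scores.max())
    return e / e.sum()

# Toy sentence encodings; in a real system these would come from a
# recurrent neural network encoder.
rng = np.random.default_rng(0)
premise, claim = rng.normal(size=8), rng.normal(size=8)
warrant0, warrant1 = rng.normal(size=8), rng.normal(size=8)
p = warrant_probabilities(premise, claim, warrant0, warrant1)
print(p)
```

The softmax over exactly two scores makes the two probabilities sum to one, so the warrant with the higher triplet score is the system's prediction.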
dc.format.mimetype: application/pdf
dc.identifier.uri: https://hdl.handle.net/10735.1/6974
dc.language.iso: en
dc.rights: ©2019 Mingqing Ye
dc.subject: Reasoning
dc.subject: Neural networks (Computer science)
dc.subject: Machine learning
dc.subject: Natural language processing (Computer science)
dc.title: A Neural Network Approach for Argument Reasoning Comprehension
dc.type: Thesis
dc.type.material: text
thesis.degree.department: Computer Science
thesis.degree.grantor: The University of Texas at Dallas
thesis.degree.level: Masters
thesis.degree.name: MSCS

Files

Original bundle
- Name: ETD-5608-011-YE-260238.88.pdf; Size: 1.74 MB; Format: Adobe Portable Document Format; Description: Thesis

License bundle
- Name: LICENSE.txt; Size: 1.84 KB; Format: Plain Text
- Name: PROQUEST_LICENSE.txt; Size: 5.84 KB; Format: Plain Text