Knowledge-Rich Event Coreference Resolution

dc.contributor.advisor: Ng, Vincent
dc.creator: Lu, Jing
dc.date.accessioned: 2021-12-07T21:12:07Z
dc.date.available: 2021-12-07T21:12:07Z
dc.date.created: 2021-05
dc.date.issued: 2021-05-04
dc.date.submitted: May 2021
dc.date.updated: 2021-12-07T21:12:08Z
dc.description.abstract: Information extraction, a key area of research in Natural Language Processing (NLP), concerns the extraction of structured information from natural language documents. Recent years have seen a gradual shift of focus from entity-based tasks to event-based tasks in information extraction research. Being a core event-based task, event coreference resolution, the task of determining which event mentions in a document refer to the same real-world event, is generally considered one of the most challenging tasks in NLP. More specifically, for two event mentions to be coreferent, both their triggers (i.e., the words realizing the occurrence of events) and their corresponding arguments (e.g., the times, places, and people involved in them) have to be compatible. However, identifying potential arguments (which is typically performed by an entity extraction system), linking arguments to their event mentions (which is typically performed by an event extraction system), and determining the compatibility between two event arguments (which is provided by an entity coreference resolver) are all non-trivial tasks. In other words, end-to-end event coreference resolution is complicated in part by the fact that an event coreference resolver has to rely on the noisy outputs produced by its upstream components in the standard information extraction pipeline. Many existing event coreference resolvers avoid the hassle of dealing with noisy information and simply adopt a knowledge-lean approach consisting of a pipeline of two components: a trigger detection component that identifies triggers and their corresponding subtypes, followed by an event coreference component.

We hypothesize that knowledge-lean approaches are not the right way to go if the ultimate goal is to take event coreference resolvers to the next level of performance. With this in mind, we investigate knowledge-rich approaches in which we derive potentially useful knowledge for event coreference resolution from a variety of sources, including models trained on tasks that we believe are closely related to event coreference, statistical and linguistic features that are directly relevant to the prediction of event coreference links, and constraints that encode commonsense knowledge of when two event mentions should or should not be coreferent.

We start by designing a multi-pass sieve approach that first resolves easy coreference links and then exploits these easy-to-identify links as a source of knowledge for identifying difficult coreference links (this sieve pattern is sketched after the record metadata below). We then investigate two types of joint models for event coreference resolution, a joint inference model and a joint learning model, in which we encode commonsense knowledge of the inter-dependencies between the various components via hard or soft constraints. In addition, we incorporate non-local information extracted from the broader context preceding an event mention by learning a supervised topic model and modeling discourse salience. Further, we present an unsupervised method for deriving argument compatibility information from a large, unannotated corpus, and develop a transfer-learning framework that transfers the resulting argument (in)compatibility knowledge to an event coreference resolver. Finally, we investigate a multi-task neural model that simultaneously learns six tasks related to event coreference, and guide the model learning process using cross-task consistency constraints.
dc.format.mimetype: application/pdf
dc.identifier.uri: https://hdl.handle.net/10735.1/9275
dc.language.iso: en
dc.subject: Natural language processing (Computer science)
dc.subject: Information retrieval
dc.subject: Transfer learning (Machine learning)
dc.title: Knowledge-Rich Event Coreference Resolution
dc.type: Thesis
dc.type.material: text
thesis.degree.department: Computer Science
thesis.degree.grantor: The University of Texas at Dallas
thesis.degree.level: Doctoral
thesis.degree.name: PHD
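
To make the multi-pass sieve idea from the abstract concrete, the following is a minimal sketch in Python. Every name in it (EventMention, same_trigger_sieve, resolve, and the example mentions) is a hypothetical illustration rather than the dissertation's actual code or data structures; the sketch only shows the general pattern of applying sieves in decreasing order of precision, where later sieves can consult the clusters that earlier, high-precision sieves have already built.

    # Minimal sketch of a multi-pass sieve for event coreference.
    # All names here are hypothetical illustrations, not the dissertation's code.

    from typing import Callable, Dict, List, Optional, Set


    class EventMention:
        def __init__(self, mention_id: str, trigger: str, arguments: Dict[str, str]):
            self.mention_id = mention_id
            self.trigger = trigger        # word realizing the occurrence of the event
            self.arguments = arguments    # e.g. {"time": "...", "place": "..."}


    Cluster = Set[str]  # a cluster is a set of mention ids

    # A sieve maps a mention (plus all mentions and the current clusters) to an
    # antecedent mention id, or None if it abstains.
    Sieve = Callable[[EventMention, List[EventMention], List[Cluster]], Optional[str]]


    def same_trigger_sieve(mention: EventMention,
                           mentions: List[EventMention],
                           clusters: List[Cluster]) -> Optional[str]:
        """High-precision example sieve: link mentions whose triggers match exactly."""
        for other in mentions:
            if other.mention_id == mention.mention_id:
                break  # only consider preceding mentions as antecedents
            if other.trigger.lower() == mention.trigger.lower():
                return other.mention_id
        return None


    def resolve(mentions: List[EventMention], sieves: List[Sieve]) -> List[Cluster]:
        """Apply sieves in decreasing order of precision, merging clusters as we go."""
        clusters: List[Cluster] = [{m.mention_id} for m in mentions]  # singletons first
        for sieve in sieves:
            for mention in mentions:
                antecedent = sieve(mention, mentions, clusters)
                if antecedent is None:
                    continue
                c1 = next(c for c in clusters if mention.mention_id in c)
                c2 = next(c for c in clusters if antecedent in c)
                if c1 is not c2:          # merge the two clusters
                    c1 |= c2
                    clusters.remove(c2)
        return clusters


    if __name__ == "__main__":
        # Made-up mentions purely for illustration.
        mentions = [
            EventMention("ev1", "bombing", {"place": "the capital"}),
            EventMention("ev2", "attack", {"place": "the capital"}),
            EventMention("ev3", "bombing", {"place": "the capital"}),
        ]
        print(resolve(mentions, [same_trigger_sieve]))  # ev1 and ev3 end up clustered

In a full system, several such sieves would be ordered from most to least precise (for example, exact trigger match before looser semantic or argument-based matching), and the lower-precision sieves would read the clusters argument to reuse the easy links already established, which is the knowledge-sharing effect the abstract describes.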

Files

Original bundle (1 item):
  LU-DISSERTATION-2021.pdf (2 MB, Adobe Portable Document Format)

License bundle (2 items):
  PROQUEST_LICENSE.txt (5.84 KB, Plain Text)
  LICENSE.txt (1.83 KB, Plain Text)