Efficient Continual Learning Framework for Stream Mining

dc.contributor.advisor: Khan, Latifur
dc.contributor.advisor: Hamlen, Kevin
dc.contributor.advisor: Ma, Dongsheng Brian
dc.contributor.committeeMember: Chen, Feng
dc.contributor.committeeMember: Ruozzi, Nicholas
dc.creator: Wang, Zhuoyi
dc.date.accessioned: 2022-11-29T22:34:39Z
dc.date.available: 2022-11-29T22:34:39Z
dc.date.created: 2022-05
dc.date.issued: 2022-05-01T05:00:00.000Z
dc.date.submitted: May 2022
dc.date.updated: 2022-11-29T22:34:40Z
dc.description.abstract: In recent years, deep learning-based neural models have achieved excellent performance on several real-world tasks (e.g., object recognition, speech recognition, and machine translation). However, these achievements typically assume a closed, static environment. Unlike the human brain, which can learn and act in a changing, evolving, dynamic setting with new tasks, it is hard for current intelligent agents to discover novel knowledge effectively and to learn such new skills quickly and efficiently. The ability to learn and accumulate knowledge over a lifetime is an essential aspect of human intelligence. In this scenario, enabling an agent to continually discover and learn sequentially from a non-stationary or online stream of data is significant for real-world research and applications. We consider a situation in which an infinite stream of data is sampled from a non-stationary distribution as a sequence of newly emerging tasks. The key to the continual learning process is to automatically discover novel/unseen patterns in incoming tasks (relative to previous data) while reducing the forgetting of previously seen concepts, a problem from which current deep learning and machine learning models are well known to suffer. The contributions described in this dissertation mitigate these challenges: discovering novel knowledge, learning new skills incrementally and efficiently, and reducing the forgetting phenomenon in deep learning algorithms. To approach these challenges in the continual learning scenario, we first describe a class-incremental learning setting in which each incoming task introduces new classes to the agent, while previous tasks can be accessed only partially or not at all.
We introduce background on existing techniques for the different issues in this learning process, and then describe the frameworks we developed, each aimed at high performance on one of these challenges. The frameworks maintain specialist models for different goals, including the discovery and subsequent incremental learning of novel knowledge using a shared model with a limited, fixed capacity. In addition, to account for privacy issues and memory constraints, we propose to update model parameters using only statistics of previous data rather than the original inputs. As a result, forgetting of old concepts is reduced, and storing the original inputs is avoided.
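The idea of updating a model from stored statistics rather than raw data can be sketched as follows. This is an illustrative example only, not the dissertation's exact method: it keeps a per-class feature mean and covariance for each seen class, uses the means as a nearest-class-mean classifier, and draws pseudo-features from the stored Gaussians in place of a raw-data replay buffer. All names (`StatReplayClassifier`, `observe_task`, `sample_replay`) are hypothetical.

```python
# Hypothetical sketch: class-incremental learning where only per-class feature
# statistics (mean, covariance) survive from old tasks; raw inputs are discarded.
import numpy as np

rng = np.random.default_rng(0)

class StatReplayClassifier:
    def __init__(self):
        self.stats = {}  # class label -> (mean, covariance) of its features
        self.means = {}

    def observe_task(self, features_by_class):
        """Record statistics for the new task's classes, then rebuild a
        nearest-class-mean classifier over all classes seen so far."""
        for label, feats in features_by_class.items():
            self.stats[label] = (feats.mean(axis=0), np.cov(feats, rowvar=False))
        # No raw inputs from earlier tasks are needed here, only their stats.
        self.means = {c: m for c, (m, _) in self.stats.items()}

    def predict(self, x):
        # Classify by the nearest stored class mean in feature space.
        return min(self.means, key=lambda c: np.linalg.norm(x - self.means[c]))

    def sample_replay(self, n_per_class=100):
        """Draw pseudo-features from the stored Gaussians, standing in for a
        raw-data replay buffer when the model is updated on a new task."""
        xs, ys = [], []
        for label, (mean, cov) in self.stats.items():
            xs.append(rng.multivariate_normal(mean, cov, size=n_per_class))
            ys.append(np.full(n_per_class, label))
        return np.vstack(xs), np.concatenate(ys)
```

For example, after observing a first task with classes 0 and 1 and a second task with class 2, `sample_replay` yields pseudo-features for all three classes even though no original inputs from the first task were retained.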
dc.format.mimetype: application/pdf
dc.identifier.uri: https://hdl.handle.net/10735.1/9541
dc.language.iso: en
dc.subject: Computer Science
dc.title: Efficient Continual Learning Framework for Stream Mining
dc.type: Thesis
dc.type.material: text
thesis.degree.college: School of Engineering and Computer Science
thesis.degree.department: Computer Science
thesis.degree.grantor: The University of Texas at Dallas
thesis.degree.name: PHD

Files

Original bundle
Name: WANG-PRIMARY-2022-1.pdf
Size: 9.5 MB
Format: Adobe Portable Document Format

License bundle
Name: LICENSE.txt
Size: 1.84 KB
Format: Plain Text

Name: PROQUEST_LICENSE.txt
Size: 5.84 KB
Format: Plain Text