Inconsistency-Induced Learning for Perpetual Learners

Du Zhang, Meiliu Lu
DOI: 10.4018/jssci.2011100103

Abstract

One of the long-term research goals in machine learning is how to build never-ending learners. The state of the practice in the field thus far is still dominated by the one-time learner paradigm: a learning algorithm is applied to data sets to produce a model or target function, after which the learner is put away and the model or function is put to work. Such a learn-once-apply-next (LOAN) approach may not be adequate for many real-world problems and stands in sharp contrast with humans' lifelong learning process. On the other hand, learning can often be brought about by overcoming inconsistent circumstances. This paper proposes a framework for perpetual learning agents that are capable of continuously refining or augmenting their knowledge by overcoming inconsistencies encountered during their problem-solving episodes. The never-ending nature of a perpetual learning agent is embodied in the framework as the agent's continuous inconsistency-induced belief revision process. The framework hinges on the agent recognizing inconsistency in data, information, knowledge, or meta-knowledge, identifying the cause of the inconsistency, and revising or augmenting beliefs to explain, resolve, or accommodate it. The authors believe that inconsistency can serve as an important learning stimulus toward building perpetual learning agents that incrementally improve their performance over time.
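The abstract outlines a three-step cycle: detect an inconsistency, identify its cause, and revise or augment beliefs to resolve it. The Python sketch below illustrates one plausible shape such a loop could take; the class and function names and the toy string-based negation are illustrative assumptions, not the authors' formal framework.

```python
# Illustrative sketch only: a toy perpetual learner whose learning is
# triggered by inconsistencies between new observations and held beliefs.
# All names and the string-based negation are assumptions for illustration,
# not the formal framework defined in the paper.

from dataclasses import dataclass, field


def negate(p: str) -> str:
    """Toy propositional negation encoded as a 'not ' prefix."""
    return p[4:] if p.startswith("not ") else "not " + p


@dataclass
class PerpetualAgent:
    beliefs: set = field(default_factory=set)  # propositions currently held true

    def detect_inconsistency(self, observation: str) -> bool:
        """Step 1: an observation conflicts if its negation is already believed."""
        return negate(observation) in self.beliefs

    def identify_cause(self, observation: str) -> str:
        """Step 2: the conflicting belief is the cause of the inconsistency."""
        return negate(observation)

    def revise_beliefs(self, observation: str, cause: str) -> None:
        """Step 3: retract the conflicting belief and adopt the observation."""
        self.beliefs.discard(cause)
        self.beliefs.add(observation)

    def run_episode(self, observations) -> None:
        """One problem-solving episode; revision continues across episodes, never ending."""
        for obs in observations:
            if self.detect_inconsistency(obs):
                self.revise_beliefs(obs, self.identify_cause(obs))
            else:
                self.beliefs.add(obs)  # no conflict: simply augment knowledge


# The agent starts out believing the sky is not clear, then revises that
# belief when a contradicting observation arrives.
agent = PerpetualAgent(beliefs={"not sky_is_clear"})
agent.run_episode(["sky_is_clear", "ground_is_wet"])
print(agent.beliefs)  # {'sky_is_clear', 'ground_is_wet'}
```

In this toy run, the agent retracts the conflicting belief and adopts the new observation, mirroring in miniature the inconsistency-induced belief revision process described above.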
Article Preview

How to build lifelong learners for intelligent agent systems has been an important agenda item in the field (Mitchell, 2006; Thrun & Mitchell, 1995; Thrun, 1995, 1998). Reports of some initial research results toward this long-term objective have emerged recently.
