Supervised learning with minimal effort

Abstract

Traditional supervised learning learns from whatever training examples are given to it. This is dramatically different from human learning: humans learn simple examples before conquering hard ones, so as to minimize their effort. Effort can be equated with energy consumption, and it would be important for machine learning modules to use minimal energy in real-world deployments. In this paper, we propose a novel, simple, and effective machine learning paradigm that explicitly exploits this important simple-to-complex (S2C) human learning strategy, and implement it efficiently on top of C4.5. Experimental results show that S2C has several distinctive advantages over the original C4.5. First, S2C takes much less effort to learn the training examples than C4.5, which selects examples randomly. Second, with minimal effort, the learning process is much more stable. Finally, even though S2C only locally updates the model with minimal effort, it is as accurate as the global learner C4.5. Applications of this simple-to-complex learning strategy to real-world learning tasks, especially cognitive learning tasks, should prove fruitful. © 2010 Springer-Verlag Berlin Heidelberg.
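
To make the simple-to-complex idea concrete, below is a minimal, hypothetical sketch in Python. It is not the authors' C4.5-based implementation: scikit-learn's DecisionTreeClassifier stands in for C4.5, and both the "simplicity" score (prediction confidence of a seed model) and the staged batch schedule are assumptions introduced here for illustration.

```python
# Hypothetical sketch of simple-to-complex (S2C) example ordering.
# NOT the paper's C4.5 implementation: the difficulty heuristic and
# the refit-per-stage schedule are assumptions for illustration only.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=600, n_features=10, random_state=0)

# Train a small seed model on a random subset to score example difficulty.
rng = np.random.default_rng(0)
seed_idx = rng.choice(len(X), size=100, replace=False)
seed_model = DecisionTreeClassifier(random_state=0).fit(X[seed_idx], y[seed_idx])

# Treat high predicted-class probability as "simple", low as "complex".
confidence = seed_model.predict_proba(X).max(axis=1)
order = np.argsort(-confidence)  # simplest examples first

# Learn in stages: grow the training set from simple to complex.
# (The paper's local, incremental C4.5 updates are replaced by full
# refits here, since scikit-learn trees are not incrementally grown.)
for frac in (0.25, 0.5, 0.75, 1.0):
    subset = order[: int(frac * len(order))]
    model = DecisionTreeClassifier(random_state=0).fit(X[subset], y[subset])
    print(f"trained on {len(subset):3d} examples, "
          f"full-set accuracy = {model.score(X, y):.3f}")
```

The key design choice this sketch captures is the curriculum itself: the learner commits to easy, high-confidence examples first and only later absorbs the hard ones, which is what the paper credits for the reduced effort and more stable learning curve.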

Citation (APA)

Ni, E. A., & Ling, C. X. (2010). Supervised learning with minimal effort. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 6119 LNAI, pp. 476–487). https://doi.org/10.1007/978-3-642-13672-6_45
