Partial learning is a criterion under which the learner outputs one correct conjecture infinitely often while issuing every other hypothesis only finitely often. This paper addresses two variants of partial learning in the setting of inductive inference of recursive functions: first, confident partial learning requires that the learner, even on those functions which it does not learn, single out exactly one hypothesis which is output infinitely often; second, essentially class consistent partial learning is partial learning with the additional constraint that, on the functions to be learnt, almost all hypotheses issued are consistent with all the data seen so far. The results of the present work are that confident partial learning is more general than explanatory learning, incomparable with behaviourally correct learning, and closed under union; essentially class consistent partial learning is more general than behaviourally correct learning and incomparable with confident partial learning. Furthermore, it is investigated which oracles permit learning all recursive functions under these criteria: for confident partial learning, some non-high oracles are omniscient; for essentially class consistent partial learning, all PA-complete oracles and all oracles of hyperimmune Turing degree are omniscient. © 2012 Springer-Verlag.
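The partial-learning criterion can be made concrete with a toy simulation. The sketch below is not from the paper: it assumes the hypothetical class of constant functions (hypothesis e denotes the function x ↦ e) and a deliberately simple learner that outputs the correct hypothesis on odd stages and a fresh wrong "probe" on even stages, so that exactly one hypothesis occurs infinitely often while every other one occurs only finitely often (here, once).

```python
from collections import Counter

def partial_learner(prefix):
    """Conjecture after seeing prefix f(0), ..., f(n-1).

    Toy learner for constant functions: on odd stages, output the
    (correct) hypothesis f(0); on even stages, output a fresh wrong
    probe. A partial learner may emit wrong hypotheses, as long as
    each wrong one appears only finitely often overall.
    """
    n = len(prefix)
    if n % 2 == 1:
        return prefix[0]       # correct hypothesis, emitted infinitely often
    return prefix[0] + n       # fresh wrong probe, emitted at most once

def run(f, stages):
    # Feed the learner growing data prefixes of f and collect conjectures.
    data = [f(x) for x in range(stages)]
    return [partial_learner(data[: n + 1]) for n in range(stages)]

conjectures = run(lambda x: 7, 100)
counts = Counter(conjectures)
# Exactly one hypothesis (7) recurs infinitely often in the limit;
# every other hypothesis appears only once.
```

In an actual run over 100 stages, the correct hypothesis 7 accounts for half the conjectures and each wrong probe appears exactly once, matching the finitely-often requirement of the criterion.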
CITATION STYLE
Gao, Z., & Stephan, F. (2012). Confident and consistent partial learning of recursive functions. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 7568 LNAI, pp. 51–65). https://doi.org/10.1007/978-3-642-34106-9_8