As data arrive one by one from an infinite stream, automatic learners maintain a string as long-term memory and update it at every new datum (example) they process. Transduced learners are a generalization of automatic learners. Both kinds of learners are evaluated with respect to the space they consume for learning. For automatic learners, it is unknown whether, at any point, the size of the long-term memory can be bounded by the length of the longest datum received so far. Here it is shown that, even when learning is restricted to automatic families, there is a hierarchy of classes learnable with memory $$O(n^k)$$, and that all automatic families which are learnable in principle can be learnt by a transduced learner using exponentially sized memory.
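To illustrate the learning model described above, the following is a minimal sketch, not taken from the paper: a streaming learner that keeps a long-term memory string and updates it on each datum. The toy target family $$L_n = \{w : |w| \le n\}$$ and the function names `update` and `learn` are illustrative assumptions; in this toy case the memory is simply the longest datum seen so far, so its size stays bounded by the length of the longest datum received.

```python
def update(memory: str, datum: str):
    """One learning step: return the new memory and the current hypothesis.

    Toy setting (an assumption, not the paper's construction): the target
    family is L_n = {w : |w| <= n}, so it suffices to remember the longest
    datum seen so far and conjecture L_{|memory|}.
    """
    if len(datum) > len(memory):
        memory = datum
    return memory, len(memory)


def learn(stream):
    """Process a finite prefix of the data stream, returning the hypothesis."""
    memory = ""
    hypothesis = 0
    for datum in stream:
        memory, hypothesis = update(memory, datum)
    return hypothesis


# Example: positive data drawn from L_3 = {w : |w| <= 3} over {0,1}
print(learn(["0", "01", "101", "1"]))  # conjectures the index 3
```

Note that the memory never exceeds the length of the longest datum here; whether automatic learners can always achieve such a bound is exactly the open question the abstract refers to.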
Jain, S., Kuek, S. N., Martin, E., & Stephan, F. (2018). Learners Based on Transducers. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 10792 LNCS, pp. 169–181). Springer Verlag. https://doi.org/10.1007/978-3-319-77313-1_13