The task of information filtering is to classify documents from a stream as either relevant or non-relevant with respect to a particular user interest, with the objective of reducing information load. When an information filter operates in an environment that changes over time, methods for adapting the filter are needed to retain classification performance. We favor a methodology that detects changes and adapts the information filter only when necessary, so that the amount of user feedback required for providing new training data is minimized. However, detecting changes may itself require expensive hand-labeling of documents. This paper explores two methods for assessing performance indicators without user feedback: the first is based on performance estimation, and the second counts uncertain classification decisions. Empirical results for a simulated change scenario with real-world text data show that our adaptive information filter can perform well in changing domains.
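The second indicator mentioned above can be illustrated with a minimal sketch. The code below is a hypothetical illustration, not the paper's exact formulation: it treats a binary relevance score near the 0.5 decision boundary as an "uncertain" decision, tracks the fraction of uncertain decisions in a sliding window over the stream, and flags a possible change when that fraction rises well above an assumed baseline rate. All names, thresholds, and window sizes here are illustrative assumptions.

```python
# Hypothetical sketch of change detection via counting uncertain
# classification decisions (thresholds and window size are assumptions,
# not values from the paper).
from collections import deque


def is_uncertain(p_relevant, band=0.15):
    """A decision counts as uncertain if its relevance score lies
    within `band` of the 0.5 decision boundary."""
    return abs(p_relevant - 0.5) < band


def detect_change(scores, window=50, baseline=0.1, factor=2.0):
    """Return a list of booleans, one per score: True where the
    windowed rate of uncertain decisions exceeds `factor` times the
    assumed baseline rate (and the window is full)."""
    recent = deque(maxlen=window)
    flags = []
    for p in scores:
        recent.append(is_uncertain(p))
        rate = sum(recent) / len(recent)
        flags.append(len(recent) == window and rate > factor * baseline)
    return flags


# Stable stream: confident scores, no change should be flagged.
stable = [0.9, 0.05] * 50
# Drifting stream: scores collapse toward 0.5, so a change is flagged.
drift = [0.9, 0.05] * 25 + [0.55, 0.48] * 25
```

Only when a change is flagged would the filter request user feedback for retraining, which is how this indicator avoids continuous hand-labeling.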
Lanquillon, C. (1999). Evaluating performance indicators for adaptive information filtering. In Lecture Notes in Computer Science, Vol. 1749 (pp. 11–21). Springer. https://doi.org/10.1007/978-3-540-46652-9_2