PyCP: An open-source Conformal Predictions toolkit

Abstract

The Conformal Predictions framework is a new game-theoretic approach to reliable machine learning, which provides a methodology to obtain error calibration under classification and regression settings. The framework combines principles of transductive inference, algorithmic randomness and hypothesis testing to provide guaranteed error calibration in online settings (and calibration in offline settings supported by empirical studies). As the framework is being increasingly used in a variety of machine learning settings such as active learning, anomaly detection, feature selection, and change detection, there is a need to develop algorithmic implementations of the framework that can be used and further improved by researchers and practitioners. In this paper, we introduce PyCP, an open-source implementation of the Conformal Predictions framework that currently provides support for classification problems within transductive and Mondrian settings. PyCP is modular, extensible and intended for community sharing and development. © IFIP International Federation for Information Processing 2013.
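To make the transductive setting concrete, the following is a minimal sketch of transductive conformal prediction for classification; it is not PyCP's actual API (which the abstract does not detail) but an illustration of the underlying procedure. It uses a hypothetical 1-nearest-neighbor nonconformity measure: for each candidate label, the new example is provisionally added to the training set with that label, nonconformity scores are recomputed for all examples, and the fraction of scores at least as large as the new example's score is its p-value. Labels whose p-value exceeds a significance level form the prediction region.

```python
import numpy as np

def nonconformity(train_X, train_y, x, y):
    """1-NN nonconformity score (an illustrative choice): distance to the
    nearest example with the same label divided by distance to the nearest
    example with a different label. Larger values mean more 'strange'."""
    same = [np.linalg.norm(x - xi) for xi, yi in zip(train_X, train_y) if yi == y]
    diff = [np.linalg.norm(x - xi) for xi, yi in zip(train_X, train_y) if yi != y]
    return (min(same) if same else np.inf) / (min(diff) if diff else np.inf)

def transductive_p_values(train_X, train_y, x_new, labels):
    """For each candidate label, extend the bag with (x_new, label),
    recompute every example's nonconformity score against the rest of
    the bag, and return the p-value of the new example."""
    p = {}
    for label in labels:
        X = list(train_X) + [x_new]
        y = list(train_y) + [label]
        scores = []
        for i in range(len(X)):
            rest_X = X[:i] + X[i + 1:]
            rest_y = y[:i] + y[i + 1:]
            scores.append(nonconformity(rest_X, rest_y, X[i], y[i]))
        a_new = scores[-1]
        p[label] = sum(1 for a in scores if a >= a_new) / len(scores)
    return p

def prediction_region(p_values, epsilon):
    """Prediction set at significance level epsilon: all labels whose
    p-value exceeds epsilon."""
    return {label for label, pv in p_values.items() if pv > epsilon}
```

For example, with two well-separated 1-D clusters labeled 0 and 1, a query point near cluster 0 receives a much larger p-value for label 0 than for label 1, so at a moderate significance level the prediction region contains only label 0.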

Citation (APA)

Balasubramanian, V. N., Baker, A., Yanez, M., Chakraborty, S., & Panchanathan, S. (2013). PyCP: An open-source Conformal Predictions toolkit. In IFIP Advances in Information and Communication Technology (Vol. 412, pp. 361–370). Springer New York LLC. https://doi.org/10.1007/978-3-642-41142-7_37
