The Conformal Predictions framework is a new game-theoretic approach to reliable machine learning that provides a methodology for obtaining calibrated error rates in both classification and regression settings. The framework combines principles of transductive inference, algorithmic randomness, and hypothesis testing to provide guaranteed error calibration in online settings (and, supported by empirical studies, calibration in offline settings). As the framework is increasingly used in machine learning settings such as active learning, anomaly detection, feature selection, and change detection, there is a need for algorithmic implementations that researchers and practitioners can use and extend. In this paper, we introduce PyCP, an open-source implementation of the Conformal Predictions framework that currently supports classification problems in transductive and Mondrian settings. PyCP is modular, extensible, and intended for community sharing and development.
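The abstract does not show PyCP's API, so the following is a minimal, library-agnostic sketch of the transductive conformal classification procedure it refers to, using a common 1-nearest-neighbor nonconformity measure. All names (`nn_nonconformity`, `predict_set`) and the choice of nonconformity measure are illustrative assumptions, not PyCP's actual interface.

```python
# Sketch of transductive conformal classification with a 1-NN
# nonconformity measure. Illustrative only; not PyCP's API.
import numpy as np

def nn_nonconformity(X, y, i):
    """Nonconformity of example i: distance to its nearest same-class
    neighbor divided by distance to its nearest other-class neighbor
    (larger = less conforming). Assumes >= 2 classes are present."""
    d = np.linalg.norm(X - X[i], axis=1)
    d[i] = np.inf                      # exclude the point itself
    same = d[y == y[i]].min()          # nearest same-class neighbor
    diff = d[y != y[i]].min()          # nearest other-class neighbor
    return same / diff

def predict_set(X_train, y_train, x_test, labels, epsilon):
    """Conformal prediction set for x_test at significance level
    epsilon: every candidate label whose p-value exceeds epsilon."""
    region = []
    for label in labels:
        # Transductive step: tentatively augment the training set
        # with (x_test, label) and score every example.
        X = np.vstack([X_train, x_test])
        y = np.append(y_train, label)
        scores = np.array([nn_nonconformity(X, y, i) for i in range(len(y))])
        # p-value: fraction of examples at least as nonconforming
        # as the test example (the test example counts itself).
        p = np.mean(scores >= scores[-1])
        if p > epsilon:
            region.append(label)
    return region

# Demo: two well-separated Gaussian classes.
rng = np.random.default_rng(0)
X_train = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(4, 1, (20, 2))])
y_train = np.array([0] * 20 + [1] * 20)
print(predict_set(X_train, y_train, np.array([0.1, 0.2]),
                  labels=[0, 1], epsilon=0.1))  # typically [0]
```

At significance level ε, the framework's online guarantee is that the true label falls outside the prediction set at most an ε fraction of the time; the Mondrian variant the paper also supports enforces this validity per category (e.g., per class) rather than only on average.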
CITATION STYLE
Balasubramanian, V. N., Baker, A., Yanez, M., Chakraborty, S., & Panchanathan, S. (2013). PyCP: An open-source Conformal Predictions toolkit. In IFIP Advances in Information and Communication Technology (Vol. 412, pp. 361–370). Springer New York LLC. https://doi.org/10.1007/978-3-642-41142-7_37