Approximating Full Conformal Prediction at Scale via Influence Functions


Abstract

Conformal prediction (CP) is a wrapper around traditional machine learning models, giving coverage guarantees under the sole assumption of exchangeability; in classification problems, CP guarantees that the error rate is at most a chosen significance level ε, irrespective of whether the underlying model is misspecified. However, the prohibitive computational cost of full CP led researchers to design scalable alternatives, which alas do not attain the same guarantees or statistical power as full CP. In this paper, we use influence functions to efficiently approximate full CP. We prove that our method is a consistent approximation of full CP, and empirically show that the approximation error becomes smaller as the training set grows; e.g., for 1,000 training points the two methods output p-values that are < 0.001 apart: a negligible error for any practical application. Our method enables scaling full CP to large real-world datasets. We compare our full CP approximation (ACP) to mainstream CP alternatives, and observe that our method is computationally competitive whilst enjoying the statistical predictive power of full CP.
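To make the computational burden described above concrete, the following is a minimal sketch of full (transductive) CP for classification, using a toy distance-to-class-mean nonconformity score chosen here for illustration; it is not the paper's influence-function approximation. Note how, for every candidate label, the test point is added to the training set and all nonconformity scores are recomputed, which is exactly what makes full CP expensive at scale.

```python
import numpy as np

def full_cp_pvalues(X_train, y_train, x_test, labels):
    """Full-CP p-values for each candidate label of x_test.

    Nonconformity score (toy choice for illustration): Euclidean distance
    of an example to the mean of its own class, computed on the augmented
    training set.
    """
    pvals = {}
    for y in labels:
        # Augment the training set with the candidate example (x_test, y).
        X_aug = np.vstack([X_train, x_test])
        y_aug = np.append(y_train, y)
        # Recompute every nonconformity score on the augmented set --
        # this per-label retraining is the cost that full CP incurs.
        scores = np.empty(len(y_aug))
        for i in range(len(y_aug)):
            same_class = y_aug == y_aug[i]
            scores[i] = np.linalg.norm(X_aug[i] - X_aug[same_class].mean(axis=0))
        # p-value: fraction of augmented examples at least as nonconforming
        # as the candidate test example (the last row).
        pvals[y] = np.mean(scores >= scores[-1])
    return pvals
```

Given the p-values, the prediction set at significance level ε is {y : p(y) > ε}; by the exchangeability argument above, the true label is excluded with probability at most ε.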

Citation (APA)

Martinez, J. A., Bhatt, U., Weller, A., & Cherubin, G. (2023). Approximating Full Conformal Prediction at Scale via Influence Functions. In Proceedings of the 37th AAAI Conference on Artificial Intelligence, AAAI 2023 (Vol. 37, pp. 6631–6639). AAAI Press. https://doi.org/10.1609/aaai.v37i6.25814
