Illustrative Discussion of MC-Dropout in General Dataset: Uncertainty Estimation in Bitcoin

Abstract

The past few years have witnessed a resurgence of uncertainty estimation in neural networks. Providing uncertainty quantification alongside the predictive probability is desirable to reflect the degree of belief in the model’s decision about a given input. Recently, the Monte-Carlo dropout (MC-dropout) method has been introduced as a probabilistic approach based on Bayesian approximation that is more computationally efficient than Bayesian neural networks. MC-dropout has shown promising results on image datasets regarding uncertainty quantification. However, this method has been subject to criticism regarding its behaviour and what type of uncertainty it actually captures. For this purpose, we aim to discuss the behaviour of MC-dropout on classification tasks using synthetic and real data. We empirically explain different cases of MC-dropout that reflect the relative merits of this method. Our main finding is that, on synthetic data, MC-dropout captures data points lying on the decision boundary between opposing classes. In addition, we apply the MC-dropout method to the Elliptic dataset, derived from the Bitcoin blockchain, to highlight the improved performance of a model with MC-dropout over a standard model. A conclusion and possible future directions are proposed.
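The core MC-dropout procedure the abstract refers to (keeping dropout active at test time and averaging several stochastic forward passes) can be sketched as follows. This is a minimal NumPy illustration only: the two-layer network, its weights, the dropout rate, and the number of passes are all hypothetical toy values, not the authors' actual model for the Elliptic data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy weights for a 2-input, 2-class classifier (illustration only).
W1 = rng.normal(size=(2, 16))
W2 = rng.normal(size=(16, 2))

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def stochastic_forward(x, p=0.5):
    """One forward pass with dropout left ON -- the essence of MC-dropout."""
    h = np.maximum(x @ W1, 0.0)              # ReLU hidden layer
    mask = rng.random(h.shape) < (1.0 - p)   # random dropout mask
    h = h * mask / (1.0 - p)                 # inverted-dropout scaling
    return softmax(h @ W2)

def mc_dropout_predict(x, T=100):
    """Average T stochastic passes: mean = prediction, std = uncertainty proxy."""
    probs = np.stack([stochastic_forward(x) for _ in range(T)])
    return probs.mean(axis=0), probs.std(axis=0)

x = np.array([[0.5, -1.0]])
mean, std = mc_dropout_predict(x)
```

A point near the decision boundary would yield a large `std` across the T passes, which is how uncertain inputs are flagged in this scheme.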

Citation (APA)

Alarab, I., Prakoonwit, S., & Nacer, M. I. (2021). Illustrative Discussion of MC-Dropout in General Dataset: Uncertainty Estimation in Bitcoin. Neural Processing Letters, 53(2), 1001–1011. https://doi.org/10.1007/s11063-021-10424-x
