Estimating and Factoring the Dropout Induced Distribution with Gaussian Mixture Model

Abstract

An analytical method is proposed for capturing the dropout-induced distribution of the forward-pass output of a neural network as a Gaussian mixture model (GMM). In a dropout Bayesian DNN, if the network is dropout-trained and a test datum is dropout-forwarded at inference time, its output, usually approximated as a single-mode Gaussian, becomes a posterior whose variance quantifies the uncertainty of the inference [1]. The proposed method captures this arbitrary distribution analytically and with high accuracy, without a Monte Carlo (MC) estimate, for any network built from dropout and fully connected (FC) layers. It is therefore applicable to the general non-Gaussian posterior case, yielding a better uncertainty estimate. The method also has the advantage of providing a multimodal analysis of the distribution by factoring, which can be tuned with a user-defined expressibility parameter, whereas an MC estimate provides only a "flat" picture. This helps to reveal how the FC layer encodes dropout-injected, highly multimodal data into a single-mode Gaussian, while unknown data yields a complicated distribution.
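For orientation, the sketch below contrasts the two approximations the abstract refers to: an MC estimate obtained by repeated dropout forwarding (the MC dropout of [1]), and the exact first two moments of a single dropout + FC layer, i.e. the single-mode Gaussian approximation that the proposed GMM factoring generalizes. It is a minimal NumPy illustration under stated assumptions, not the paper's algorithm; the function names and keep_prob value are made up for the example.

import numpy as np

rng = np.random.default_rng(0)

def fc(x, W, b):
    """One fully connected (FC) layer: y = W x + b."""
    return W @ x + b

def mc_dropout_moments(x, W, b, keep_prob=0.9, n_samples=2000):
    """MC estimate of the dropout-induced output distribution:
    sample a Bernoulli mask, forward through the FC layer, and
    summarize the samples by their empirical mean and variance."""
    outs = []
    for _ in range(n_samples):
        mask = rng.binomial(1, keep_prob, size=x.shape) / keep_prob  # inverted dropout
        outs.append(fc(mask * x, W, b))
    outs = np.stack(outs)
    return outs.mean(axis=0), outs.var(axis=0)

def analytic_gaussian_moments(x, W, b, keep_prob=0.9):
    """Exact first two moments of the FC output under inverted dropout.
    With s_j = m_j / p, m_j ~ Bernoulli(p) independent per input unit:
        E[y_i]   = sum_j W_ij x_j + b_i
        Var[y_i] = (1 - p) / p * sum_j W_ij^2 x_j^2
    Matching a single-mode Gaussian to these moments is the usual
    approximation; it cannot represent a multimodal posterior."""
    mean = W @ x + b
    var = (1 - keep_prob) / keep_prob * (W ** 2) @ (x ** 2)
    return mean, var

d_in, d_out = 8, 3
W = rng.normal(size=(d_out, d_in))
b = rng.normal(size=d_out)
x = rng.normal(size=d_in)
print(mc_dropout_moments(x, W, b))
print(analytic_gaussian_moments(x, W, b))

With enough samples the two agree closely on the mean and variance, but both discard any multimodal structure; recovering that structure analytically, as a GMM whose resolution is controlled by the expressibility parameter, is the point of the proposed method.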

Citation (APA)

Adachi, J. (2019). Estimating and factoring the dropout induced distribution with Gaussian mixture model. In Lecture Notes in Computer Science (Vol. 11727 LNCS, pp. 775–792). Springer. https://doi.org/10.1007/978-3-030-30487-4_60
