Flipover, an enhanced dropout technique, is introduced to improve the robustness of artificial neural networks. In contrast to dropout, which randomly removes certain neurons and their connections, flipover randomly selects neurons and flips the sign of their outputs using a negative multiplier during training. This approach offers stronger regularization than conventional dropout, refining model performance by (1) mitigating overfitting, matching or even exceeding the efficacy of dropout; (2) amplifying robustness to noise; and (3) enhancing resilience against adversarial attacks. Extensive experiments across various neural networks affirm the effectiveness of flipover in deep learning.
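The abstract's core contrast (zeroing activations vs. negating them) can be sketched in a few lines. This is a minimal NumPy illustration of the idea, not the authors' implementation: the selection probability `p`, the multiplier of -1, and the function name are assumptions for illustration.

```python
import numpy as np

def flipover(x, p=0.1, multiplier=-1.0, training=True, rng=None):
    """Randomly flip the sign of a fraction p of activations.

    Unlike dropout, which zeroes the selected activations, flipover
    multiplies them by a negative constant (here -1 by default).
    This is a sketch of the idea described in the abstract; the
    paper's exact formulation may differ.
    """
    if not training or p == 0.0:
        return x
    rng = np.random.default_rng() if rng is None else rng
    mask = rng.random(x.shape) < p           # neurons chosen for flipping
    scale = np.where(mask, multiplier, 1.0)  # -1 where flipped, 1 elsewhere
    return x * scale
```

Compared with dropout's `np.where(mask, 0.0, 1.0)` scaling, the flipped units still propagate a (negated) signal, which is the source of the stronger regularization the abstract claims.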
CITATION STYLE
Liang, Y., Niu, C., Yan, P., & Wang, G. (2024). Flipover outperforms dropout in deep learning. Visual Computing for Industry, Biomedicine, and Art, 7(1). https://doi.org/10.1186/s42492-024-00153-y