Side information dependence as a regularizer for analyzing human brain conditions across cognitive experiments


Abstract

The increasing availability of public neuroimaging datasets opens the door to analyzing homogeneous human brain conditions across datasets via transfer learning (TL). However, neuroimaging data are high-dimensional, noisy, and limited in sample size, which makes it challenging to learn a robust model across different cognitive experiments and subjects. A recent TL approach learns common cross-domain features by minimizing domain dependence, measured with the Hilbert-Schmidt Independence Criterion (HSIC). Inspired by this approach and by multi-source TL theory, we propose a Side Information Dependence Regularization (SIDeR) learning framework for TL in brain condition decoding. Specifically, SIDeR simultaneously minimizes the empirical risk and the statistical dependence on the domain side information, thereby reducing the theoretical generalization error bound. We construct 17 brain decoding TL tasks from public neuroimaging data for evaluation. Comprehensive experiments validate the superiority of SIDeR over ten competing methods, with an average improvement of 15.6% on the TL tasks involving multi-source experiments.
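The dependence term can be illustrated with the standard biased empirical HSIC estimator, applied as a penalty between learned features and domain side information. The Python sketch below is a minimal illustration only: the linear kernels, the function names (hsic, dependence_penalty), the one-hot encoding of the side information, and the trade-off weight lam are assumptions made for this example, not the paper's implementation.

```python
import numpy as np

def hsic(K, L):
    """Biased empirical HSIC estimate: tr(K H L H) / (n - 1)^2,
    where H = I - (1/n) 1 1^T is the centering matrix."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

def dependence_penalty(features, side_info):
    """HSIC between learned features (n x d) and domain side
    information (n x m, e.g. one-hot experiment/subject codes).
    Linear kernels are used here purely for illustration."""
    K = features @ features.T
    L = side_info @ side_info.T
    return hsic(K, L)

# Illustrative objective: empirical risk plus the dependence penalty,
# with a hypothetical trade-off weight lam (not from the paper):
# total_loss = empirical_risk(model, X, y) + lam * dependence_penalty(model.features(X), D)
```

In this reading, driving the penalty toward zero encourages features that carry little information about which experiment or subject a sample came from, which is the sense in which domain dependence is regularized.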

Cite (APA)

Zhou, S., Li, W., Cox, C. R., & Lu, H. (2020). Side information dependence as a regularizer for analyzing human brain conditions across cognitive experiments. In AAAI 2020 - 34th AAAI Conference on Artificial Intelligence (pp. 6957–6964). AAAI Press. https://doi.org/10.1609/aaai.v34i04.6179
