Abstract
In previous work [Tsai et al., 1999], we introduced an information-theoretic approach to the analysis of fMRI time-series data. Subsequently [Kim et al., 2000], we established a relationship between our information-theoretic approach and a simple nonparametric hypothesis test. In this work, we describe an adaptive approach for incorporating the temporal structure that relates the fMRI time series to both the current and past values of the experimental protocol. This is achieved via an extension of our previous approach using the information-theoretic concept of entropy rate. It can be shown that, despite a differing implementation, our prior method is a special case of the new approach. The entropy rate of a random process quantifies future uncertainty conditioned on the past and on side information (e.g. the experimental protocol, confounding signals, etc.) without making strong assumptions about the nature of that uncertainty (e.g. Gaussianity). Furthermore, we allow the form of the dependency to vary from voxel to voxel in an adaptive fashion. The combination of information-theoretic principles and adaptive estimation of the temporal dependency yields a more powerful and flexible approach to fMRI analysis. Empirical results are presented on three fMRI datasets measuring motor, auditory, and visual cortex activation, comparing the new approach to the previous one as well as to a variation on the general linear model. Particular attention is paid to the differences in the type of phenomenology detected by the respective approaches.
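The entropy-rate quantity described above can be illustrated with a simple histogram plug-in estimator of H(X_t | X_{t-1}, U_t), where X is a voxel time series and U is the experimental protocol. This is only a minimal sketch under assumptions not stated in the abstract — the function names, the block-design protocol, the amplitude discretization into 8 bins, and the plug-in (maximum-likelihood) estimator are all illustrative choices, not the estimator used in the paper:

```python
from collections import Counter
import numpy as np

def plugin_entropy(counts):
    """Plug-in (maximum-likelihood) entropy estimate, in nats, from a count array."""
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p))

def entropy_rate_estimate(x, u, n_bins=8):
    """Estimate H(X_t | X_{t-1}, U_t) for a scalar time series x with side
    information u, via the identity H(A | B) = H(A, B) - H(B) applied to
    histogram counts.  Illustrative sketch only, not the paper's method."""
    # Discretize the signal into n_bins amplitude levels using quantile edges.
    edges = np.quantile(x, np.linspace(0.0, 1.0, n_bins + 1)[1:-1])
    xq = np.digitize(x, edges)
    # Joint counts over (current sample, previous sample, protocol state)
    # and over the conditioning variables (previous sample, protocol state).
    triples = Counter(zip(xq[1:], xq[:-1], u[1:]))
    pairs = Counter(zip(xq[:-1], u[1:]))
    h_joint = plugin_entropy(np.array(list(triples.values())))
    h_cond = plugin_entropy(np.array(list(pairs.values())))
    return h_joint - h_cond

# Hypothetical usage: a block-design protocol and a protocol-driven voxel.
rng = np.random.default_rng(0)
u = np.tile(np.repeat([0, 1], 10), 10)          # on/off block protocol
x = 0.8 * u + rng.normal(0.0, 0.5, u.size)      # synthetic "active" voxel
rate = entropy_rate_estimate(x, u)
```

Because the estimate is a difference of plug-in entropies over nested histograms, it is nonnegative and bounded above by log(n_bins); lower values indicate that the past sample and the protocol leave little residual uncertainty about the current sample, which is the intuition behind using entropy rate as a detection statistic.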
Citation
Fisher, J. W., Cosman, E. R., Wible, C., & Wells, W. M. (2001). Adaptive entropy rates for fMRI time-series analysis. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 2208, pp. 905–912). Springer Verlag. https://doi.org/10.1007/3-540-45468-3_108