Information content of EIT measurements

Abstract

Electrical Impedance Tomography (EIT) calculates the internal conductivity of a medium from surface measurements; image reconstruction is most commonly formulated as an inverse problem solved using regularization techniques. Regularization adds "prior information" to address the ill-conditioning of the solution. This paper presents a novel approach to understanding and quantifying this information. We ask: how many bits of information (in the Shannon sense) do we get from an EIT data frame? We define the information in measurements (IM) as the decrease in uncertainty about the contents of a medium due to a set of measurements. Before the measurement, we know the prior information (the inter-class model, q). The measured data tell us about the medium (which, corrupted by noise, gives the intra-class model, p). The measurement information is given by the relative entropy (or Kullback-Leibler divergence) between these models. Based on this expression, and given a noise covariance Σn and a prior model of the element covariances Σx, IM = ½ log₂ |J Σx Jᵀ Σn⁻¹ + I|, where J is the Jacobian matrix. Under the simplification that measurement and noise covariances are uncorrelated, IM may be approximated as a function of the signal-to-noise ratio and the Jacobian and prior matrices. For an example 16-electrode EIT system, IM was calculated to be 245.1 bits. Finally, several applications of an information measure for EIT are given. © Springer-Verlag 2007.
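
As a rough illustration of how the IM expression above could be evaluated numerically, the following Python sketch computes ½ log₂ |J Σx Jᵀ Σn⁻¹ + I| for stand-in matrices. The function name, matrix sizes, and covariance choices are assumptions made for illustration only; this is not the paper's code nor its 16-electrode model.

```python
# Minimal sketch (assumed setup, not from the paper): evaluate
# IM = 1/2 * log2 |J Σx Jᵀ Σn⁻¹ + I| in bits.
import numpy as np

def information_in_measurements(J, Sigma_x, Sigma_n):
    """Bits of information for a Jacobian J, prior (element) covariance
    Sigma_x, and measurement noise covariance Sigma_n."""
    n_meas = J.shape[0]
    M = J @ Sigma_x @ J.T @ np.linalg.inv(Sigma_n) + np.eye(n_meas)
    # slogdet avoids determinant overflow for larger measurement sets
    _sign, logdet = np.linalg.slogdet(M)
    return 0.5 * logdet / np.log(2.0)  # convert natural log to log base 2

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_meas, n_elem = 208, 576                 # assumed sizes: one EIT frame, coarse mesh
    J = rng.standard_normal((n_meas, n_elem)) * 1e-3   # stand-in sensitivity matrix
    Sigma_x = np.eye(n_elem)                  # white prior on conductivity elements
    Sigma_n = (1e-4 ** 2) * np.eye(n_meas)    # white measurement noise
    print(f"IM = {information_in_measurements(J, Sigma_x, Sigma_n):.1f} bits")
```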

Citation (APA)

Adler, A., & Lionheart, W. R. B. (2007). Information content of EIT measurements. In IFMBE Proceedings (Vol. 17 IFMBE, pp. 360–363). Springer Verlag. https://doi.org/10.1007/978-3-540-73841-1_94
