Multisensor data fusion

Abstract

Multisensor data fusion is a key enabling technology in which information from a number of sources is integrated to form a unified picture [1]. This concept has been applied to numerous fields, and new applications are being explored constantly. Even though most multisensor data fusion applications have been developed relatively recently, the notion of data fusion has always been around. In fact, all of us employ multisensor data fusion principles in our daily lives. The human brain is an excellent example of an operational fusion system that performs extremely well. It integrates sensory information, namely sight, sound, smell, taste, and touch data, and makes inferences regarding the problem at hand. It has been a natural desire of researchers in different disciplines of science and engineering to emulate this information fusion ability of the human brain. The idea is that fusion of complementary information available from different sensors will yield more accurate results for information processing problems. Significant advances in this important field have been made, but perfect emulation of the human brain remains an elusive goal.
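
The claim that fusing complementary sensor information yields more accurate results can be illustrated with a minimal sketch (not from the article itself): inverse-variance weighted fusion of two independent noisy readings of the same quantity, which always produces an estimate with lower variance than either sensor alone. All names, numbers, and noise levels below are illustrative assumptions.

import numpy as np

def fuse(measurements, variances):
    """Inverse-variance weighted fusion of independent sensor estimates."""
    measurements = np.asarray(measurements, dtype=float)
    variances = np.asarray(variances, dtype=float)
    weights = 1.0 / variances
    fused_value = np.sum(weights * measurements) / np.sum(weights)
    fused_variance = 1.0 / np.sum(weights)  # never larger than the smallest input variance
    return fused_value, fused_variance

rng = np.random.default_rng(0)
true_value = 10.0                           # assumed true quantity being observed
z1 = true_value + rng.normal(0.0, 1.0)      # sensor 1 reading, noise variance 1.0
z2 = true_value + rng.normal(0.0, 0.5)      # sensor 2 reading, noise variance 0.25
value, var = fuse([z1, z2], [1.0, 0.25])
print(f"fused estimate = {value:.3f}, fused variance = {var:.3f}")

With the assumed variances of 1.0 and 0.25, the fused variance is 0.2, smaller than either sensor's alone, which is the basic motivation for combining complementary sources.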

Citation (APA)

Varshney, P. K. (2000). Multisensor data fusion. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 1821, pp. 1–3). Springer Verlag. https://doi.org/10.1007/3-540-45049-1_1
