Multi-Frequency Data Fusion for Attitude Estimation Based on Multi-Layer Perception and Cubature Kalman Filter

Abstract

This paper proposes a multi-frequency inertial and visual data fusion strategy for attitude estimation, based on locally weighted linear regression (LWLR), a multi-layer perceptron (MLP), and the cubature Kalman filter (CKF). First, we analyze the frequency-discrepancy and attitude-divergence problems. Second, we construct the filter equations for the visual and inertial data and the attitude differential equation for the inertial-only data, which are used to estimate the attitude over a time series. Third, we employ LWLR to compute the vision discrepancies between the actual and fitted vision data. These discrepancies serve as the training input of the MLP; within the MLP, they act as weights on the sums passed through the hidden layer's activation function. To address the divergence problem, which is inherent in multi-frequency fusion, the MLP is used to compensate the inertial-only data. Finally, experimental results from pseudo-physical simulations in different environments show the superior performance of the proposed method in terms of attitude-estimation accuracy and divergence suppression.
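The LWLR step described above can be illustrated with a minimal sketch: a generic locally weighted linear regression with a Gaussian kernel, which fits a local line around each query point so that the difference between the actual vision measurement and the fitted value gives the discrepancy. This is a standard LWLR formulation, not the paper's exact implementation; the kernel bandwidth `tau` and the 1-D input are illustrative assumptions.

```python
import numpy as np

def lwlr_fit(x_query, x_train, y_train, tau=1.0):
    """Locally weighted linear regression evaluated at one query point.

    A Gaussian kernel weights nearby training samples more heavily;
    tau controls the neighbourhood width. Generic LWLR sketch, not
    the paper's exact formulation.
    """
    # Design matrix with an intercept column.
    X = np.column_stack([np.ones_like(x_train), x_train])
    xq = np.array([1.0, x_query])
    # Gaussian kernel weights centred at the query point.
    w = np.exp(-(x_train - x_query) ** 2 / (2.0 * tau ** 2))
    W = np.diag(w)
    # Weighted normal equations: theta = (X^T W X)^{-1} X^T W y.
    theta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y_train)
    return xq @ theta

def vision_discrepancy(x_query, y_actual, x_train, y_train, tau=1.0):
    """Discrepancy = actual vision value minus LWLR-fitted value."""
    return y_actual - lwlr_fit(x_query, x_train, y_train, tau)
```

On exactly linear data the local fit reproduces the line, so the discrepancy of an on-line sample is zero; real vision data would yield nonzero residuals that feed the MLP.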
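For the CKF used in the fusion, the core idea is the third-degree spherical-radial cubature rule, which propagates a Gaussian through a nonlinearity via 2n equally weighted sigma points at mean ± √n times the columns of the covariance's Cholesky factor. The sketch below shows only this standard point-generation step, not the paper's full filter:

```python
import numpy as np

def cubature_points(mean, cov):
    """2n cubature points for an n-dimensional Gaussian (standard CKF rule).

    Points lie at mean +/- sqrt(n) * (columns of chol(cov)), each with
    weight 1/(2n). Generic sketch of the spherical-radial rule.
    """
    n = mean.size
    S = np.linalg.cholesky(cov)          # cov = S @ S.T
    offsets = np.sqrt(n) * S             # each column is one offset vector
    pts = np.hstack([mean[:, None] + offsets,
                     mean[:, None] - offsets])
    weights = np.full(2 * n, 1.0 / (2 * n))
    return pts, weights
```

By construction, the weighted mean and weighted covariance of the points recover the input Gaussian exactly, which is what makes the rule usable inside the predict/update steps of a CKF.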

Citation (APA)

Chen, X., Xuelong, Z., Wang, Z., Li, M., Ou, Y., & Yufan, S. (2020). Multi-Frequency Data Fusion for Attitude Estimation Based on Multi-Layer Perception and Cubature Kalman Filter. IEEE Access, 8, 144373–144381. https://doi.org/10.1109/ACCESS.2020.3012984
