Model order reduction for delay systems by iterative interpolation


Abstract

Adaptive algorithms for computing the reduced-order model of time-delay systems (TDSs) are proposed in this work. The algorithms are based on interpolating the transfer function at multiple expansion points and on greedy iterations for selecting those points. The (Formula presented.)-error of the reduced transfer function is used as the criterion for choosing the next expansion point. One heuristic greedy algorithm and one algorithm based on the error system and adaptive sub-interval selection are developed. Results on four TDSs with tens of delays from electromagnetic applications are presented and demonstrate the efficiency of the proposed algorithms.
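The greedy iteration sketched in the abstract can be illustrated in code. The following is a minimal sketch under assumptions not taken from the paper: a single-delay SISO system, a pointwise transfer-function mismatch over a candidate frequency grid as the surrogate error criterion (in place of the paper's (Formula presented.)-error), and hypothetical names (`greedy_reduce`, `transfer`).

```python
import numpy as np

def transfer(s, E, A, Ad, tau, b, c):
    """Evaluate H(s) = c^T (sE - A - Ad*e^{-s*tau})^{-1} b for one delay tau."""
    return c @ np.linalg.solve(s * E - A - Ad * np.exp(-s * tau), b)

def greedy_reduce(E, A, Ad, tau, b, c, candidates, tol=1e-8, max_pts=6):
    """Greedy interpolatory reduction sketch: repeatedly add the candidate
    expansion point where the current reduced model is worst."""
    pts = [candidates[0]]          # start from an arbitrary candidate point
    while True:
        # Projection basis: real/imaginary parts of the solves at all current
        # points, so projected matrices stay real and interpolation is kept.
        cols = []
        for s in pts:
            x = np.linalg.solve(s * E - A - Ad * np.exp(-s * tau), b)
            cols += [x.real, x.imag]
        V, _ = np.linalg.qr(np.column_stack(cols))
        # One-sided (Galerkin) projection; the delay term e^{-s*tau} is
        # retained unchanged in the reduced model.
        Er, Ar, Adr = V.T @ E @ V, V.T @ A @ V, V.T @ Ad @ V
        br, cr = V.T @ b, V.T @ c
        # Surrogate error criterion: pointwise mismatch on the candidate grid.
        errs = [abs(transfer(s, E, A, Ad, tau, b, c)
                    - transfer(s, Er, Ar, Adr, tau, br, cr))
                for s in candidates]
        k = int(np.argmax(errs))
        if errs[k] < tol or len(pts) >= max_pts:
            return Er, Ar, Adr, br, cr, pts
        pts.append(candidates[k])  # greedy step: worst candidate is added
```

Because the basis spans the solves at every expansion point, the one-sided projection interpolates the full transfer function exactly at each selected point, which is the structural property the adaptive point-selection loop builds on.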

Citation (APA)

Alfke, D., Feng, L., Lombardi, L., Antonini, G., & Benner, P. (2021). Model order reduction for delay systems by iterative interpolation. International Journal for Numerical Methods in Engineering, 122(3), 684–706. https://doi.org/10.1002/nme.6554
