The nested distance builds on the Wasserstein distance to quantify the difference between stochastic processes, taking into account the evolution of information modelled by filtrations. The Sinkhorn divergence is a relaxation of the Wasserstein distance that can be computed considerably faster. For this reason we employ the Sinkhorn divergence and take advantage of the associated fixed-point iteration algorithm. Furthermore, we investigate how the entropy evolves across the stages of the stochastic process and provide an entropy-regularized formulation of the nested distance, including a characterization of its dual. Numerical experiments confirm the computational advantage and the superiority of the approach.
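The fixed-point iteration referred to above is the standard Sinkhorn scaling scheme for entropy-regularized optimal transport. A minimal sketch follows; the function name, parameter values, and the toy distributions are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def sinkhorn(mu, nu, C, eps=0.1, n_iter=500):
    """Entropy-regularized optimal transport via Sinkhorn fixed-point iteration.

    mu, nu : source/target probability vectors
    C      : cost matrix of shape (len(mu), len(nu))
    eps    : regularization strength (assumption: smaller eps approaches the
             unregularized Wasserstein cost but converges more slowly)
    Returns the transport plan P and the regularized cost <P, C>.
    """
    K = np.exp(-C / eps)               # Gibbs kernel
    u = np.ones_like(mu)
    v = np.ones_like(nu)
    for _ in range(n_iter):            # alternate projections onto the marginals
        u = mu / (K @ v)
        v = nu / (K.T @ u)
    P = u[:, None] * K * v[None, :]    # plan = diag(u) K diag(v)
    return P, np.sum(P * C)

# Toy example: two three-point distributions on the real line.
x = np.array([0.0, 1.0, 2.0])
y = np.array([0.0, 1.0, 3.0])
C = (x[:, None] - y[None, :]) ** 2     # squared-distance cost
mu = np.array([0.5, 0.3, 0.2])
nu = np.array([0.3, 0.4, 0.3])
P, cost = sinkhorn(mu, nu, C)
```

After convergence, the row and column sums of `P` match the prescribed marginals `mu` and `nu`; the nested (process) version applies this scaling stage by stage along the filtration.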
Pichler, A., & Weinhardt, M. (2022). The nested Sinkhorn divergence to learn the nested distance. Computational Management Science, 19(2), 269–293. https://doi.org/10.1007/s10287-021-00415-7