Perceptual embedding consistency for seamless reconstruction of tilewise style transfer

Abstract

Style transfer is a field with growing interest and use cases in deep learning. Recent work has shown that Generative Adversarial Networks (GANs) can be used to create realistic images of virtually stained slide images in digital pathology with clinically validated interpretability. Digital pathology images are typically of extremely high resolution, making tilewise analysis necessary for deep learning applications. It has been shown that image generators with instance normalization can cause a tiling artifact when a large image is reconstructed from tilewise analysis. We introduce a novel perceptual embedding consistency loss that significantly reduces the tiling artifact in the reconstructed whole slide image (WSI). We validate our results by comparing virtually stained slide images with images of consecutive, really stained tissue slides. We also demonstrate that our model is more robust to contrast, color, and brightness perturbations by running comparative sensitivity analyses.
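The abstract does not spell out the exact form of the loss, so the sketch below is only a plausible illustration of a perceptual embedding consistency term, not the authors' formulation: an L1 distance between the encoder embeddings of an input tile and its virtually stained counterpart, added to the usual GAN objectives. The encoder interface, the `weight` hyperparameter, and the choice of L1 distance are assumptions made for this example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class PerceptualEmbeddingConsistencyLoss(nn.Module):
    """Illustrative sketch (not the paper's exact formulation):
    penalize the distance between the latent embeddings of an input
    tile and its translated counterpart, encouraging the generator to
    preserve tissue content and to depend less on per-tile intensity
    statistics, which is what causes the instance-normalization
    tiling artifact."""

    def __init__(self, weight: float = 1.0):
        super().__init__()
        self.weight = weight  # hypothetical loss weight

    def forward(self, emb_source: torch.Tensor, emb_translated: torch.Tensor) -> torch.Tensor:
        # L1 distance between the two embeddings; an L2 or cosine
        # distance would follow the same pattern.
        return self.weight * F.l1_loss(emb_source, emb_translated)


# Hypothetical usage in a CycleGAN-style training step, assuming
# enc_A and enc_B are the encoder halves of the two generators,
# x_A is an input tile and x_AB its virtually stained translation:
#
#   pec = PerceptualEmbeddingConsistencyLoss(weight=1.0)
#   loss_pec = pec(enc_A(x_A), enc_B(x_AB))
#   total_loss = loss_gan + loss_cycle + loss_pec
```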

Citation (APA)

Lahiani, A., Navab, N., Albarqouni, S., & Klaiman, E. (2019). Perceptual embedding consistency for seamless reconstruction of tilewise style transfer. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11764 LNCS, pp. 568–576). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-030-32239-7_63
