Towards Automatic Embryo Staging in 3D+t Microscopy Images Using Convolutional Neural Networks and PointNets

Abstract

Automatic analyses and comparisons of different stages of embryonic development largely depend on a highly accurate spatiotemporal alignment of the investigated data sets. In this contribution, we assess multiple approaches for automatic staging of developing embryos that were imaged with time-resolved 3D light-sheet microscopy. The methods comprise image-based convolutional neural networks as well as an approach based on the PointNet architecture that directly operates on 3D point clouds of detected cell nuclei centroids. The experiments with four wild-type zebrafish embryos show that both approaches are suitable for automatic staging, with average deviations of 21–34 min. Moreover, a proof-of-concept evaluation based on simulated 3D+t point cloud data sets shows that average deviations of less than 7 min are possible.
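To illustrate the point-cloud branch described in the abstract, the following is a minimal PointNet-style stage regressor sketched in PyTorch. The layer sizes, the input resolution of 4096 nuclei centroids, and the hours-post-fertilization regression target are illustrative assumptions and do not reproduce the authors' exact network.

```python
# Minimal PointNet-style stage regressor (sketch, not the authors' architecture).
import torch
import torch.nn as nn

class PointNetStager(nn.Module):
    def __init__(self):
        super().__init__()
        # Shared per-point MLP, implemented as 1D convolutions over the point axis.
        self.features = nn.Sequential(
            nn.Conv1d(3, 64, 1), nn.BatchNorm1d(64), nn.ReLU(),
            nn.Conv1d(64, 128, 1), nn.BatchNorm1d(128), nn.ReLU(),
            nn.Conv1d(128, 1024, 1), nn.BatchNorm1d(1024), nn.ReLU(),
        )
        # Regression head mapping the global feature to a single stage value.
        self.head = nn.Sequential(
            nn.Linear(1024, 256), nn.ReLU(),
            nn.Linear(256, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, points):
        # points: (batch, 3, num_points) centroids of detected cell nuclei.
        x = self.features(points)
        x = torch.max(x, dim=2).values   # symmetric max pooling over points
        return self.head(x).squeeze(1)   # predicted stage, e.g. hours post fertilization

# Usage sketch: a batch of 2 point clouds with 4096 centroids each.
model = PointNetStager()
clouds = torch.randn(2, 3, 4096)
stage = model(clouds)  # shape (2,)
```

The max pooling makes the prediction invariant to the ordering of the input centroids, which is the property that lets such a network consume raw point clouds instead of rendered images.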

Citation (APA)

Traub, M., & Stegmaier, J. (2020). Towards automatic embryo staging in 3D+t microscopy images using convolutional neural networks and PointNets. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 12417 LNCS, pp. 153–163). Springer. https://doi.org/10.1007/978-3-030-59520-3_16
