Deep voting: A robust approach toward nucleus localization in microscopy images


Abstract

Robust and accurate nucleus localization in microscopy images can provide crucial clues for accurate computer-aided diagnosis. In this paper, we propose a convolutional neural network (CNN) based Hough voting method to localize nucleus centroids in microscopy images with heavy clutter and large morphological variations. Our method, which we name deep voting, consists of two main steps. (1) Given an input image, the model assigns each local patch several pairs of voting offset vectors, which indicate the positions the patch votes to, together with corresponding voting confidences used to weight each vote; the model can therefore be viewed as an implicit Hough-voting codebook. (2) We collect the weighted votes from all testing patches and compute the final voting density map in a manner similar to Parzen-window estimation. The final nucleus positions are identified by searching for local maxima of the density map. Our method requires only light annotation effort (a single click near each nucleus center). Experimental results on Neuroendocrine Tumor (NET) microscopy images demonstrate that the proposed method achieves state-of-the-art performance.
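The vote-accumulation step described in the abstract can be illustrated with a short sketch. The code below is not the authors' implementation; it assumes a trained CNN has already produced, for each patch center, K offset vectors and K confidence weights (taken here as given arrays). The Parzen-window-style density estimation is approximated by Gaussian smoothing of a vote accumulator, and nucleus candidates are read off as local maxima of the smoothed map. All function and parameter names are illustrative.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter

def vote_density_map(patch_centers, offsets, confidences, image_shape, sigma=3.0):
    """Accumulate weighted votes and smooth them into a density map.

    patch_centers: (N, 2) array of (row, col) patch-center coordinates.
    offsets:       (N, K, 2) voting offset vectors predicted for each patch.
    confidences:   (N, K) voting weights for the corresponding offsets.
    """
    acc = np.zeros(image_shape, dtype=np.float64)
    # Each patch casts K votes at (center + offset), weighted by its confidence.
    votes = patch_centers[:, None, :] + offsets            # (N, K, 2) voted positions
    for (r, c), w in zip(votes.reshape(-1, 2), confidences.reshape(-1)):
        r, c = int(round(r)), int(round(c))
        if 0 <= r < image_shape[0] and 0 <= c < image_shape[1]:
            acc[r, c] += w                                   # weighted vote
    # Gaussian smoothing plays the role of the Parzen-window kernel.
    return gaussian_filter(acc, sigma=sigma)

def find_nuclei(density, min_distance=5, threshold=0.1):
    """Return (row, col) coordinates of local maxima above a relative threshold."""
    peaks = density == maximum_filter(density, size=2 * min_distance + 1)
    peaks &= density > threshold * density.max()
    return np.argwhere(peaks)
```

As a usage sketch, `vote_density_map` would be called once per test image with the CNN's per-patch outputs, followed by `find_nuclei` on the returned map to obtain the predicted nucleus centroids.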

Cite

Citation style: APA

Xie, Y., Kong, X., Xing, F., Liu, F., Su, H., & Yang, L. (2015). Deep voting: A robust approach toward nucleus localization in microscopy images. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 9351, pp. 374–382). Springer Verlag. https://doi.org/10.1007/978-3-319-24574-4_45
