Here, a streamlined, scalable laboratory approach is discussed that enables the analysis of medium‐to‐large datasets. The presented approach combines data management, artificial intelligence, containerization, cluster orchestration, and quality control in a unified analytic pipeline. The unique combination of these individual building blocks creates a new and powerful analysis approach that researchers can readily apply to medium‐to‐large datasets to accelerate the pace of research. The proposed framework is applied to a project that counts the number of plasmonic nanoparticles bound to peripheral blood mononuclear cells in dark‐field microscopy images. Using the techniques presented in this article, the images are processed automatically overnight, without user interaction, streamlining the path from experiment to conclusions.
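The abstract itself contains no code. As a rough illustration of the kind of unattended, containerized batch analysis it describes, the sketch below scans a directory of dark‐field images, applies a pretrained deep‐learning model, and writes per‐image particle counts to a results file. Every specific here is an assumption made for illustration rather than the authors' implementation: the TorchScript model path, the count_particles() helper, counting particles as connected components of a thresholded mask, and the mounted data and results directories.

```python
# Minimal sketch of an unattended batch-counting script (all paths and the
# model are illustrative assumptions, not the published pipeline).
import csv
import sys
from pathlib import Path

import torch
from scipy.ndimage import label
from skimage import io

# Hypothetical locations: a TorchScript model baked into the container image,
# with data and results directories mounted by the cluster scheduler at job time.
MODEL_PATH = Path("/models/nanoparticle_counter.pt")
IMAGE_DIR = Path(sys.argv[1]) if len(sys.argv) > 1 else Path("/data/images")
OUTPUT_CSV = Path("/results/particle_counts.csv")


def count_particles(model: torch.jit.ScriptModule, image) -> int:
    """Apply the (assumed) segmentation network and count connected components."""
    tensor = torch.from_numpy(image).float().unsqueeze(0).unsqueeze(0)  # 1x1xHxW
    with torch.no_grad():
        mask = model(tensor).squeeze().numpy() > 0.5  # binary particle mask
    _, n_particles = label(mask)  # each connected blob counted as one particle
    return int(n_particles)


def main() -> None:
    model = torch.jit.load(str(MODEL_PATH)).eval()
    OUTPUT_CSV.parent.mkdir(parents=True, exist_ok=True)
    with OUTPUT_CSV.open("w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["image", "particle_count"])
        for image_path in sorted(IMAGE_DIR.glob("*.tif")):
            image = io.imread(str(image_path), as_gray=True)
            writer.writerow([image_path.name, count_particles(model, image)])


if __name__ == "__main__":
    main()
```

Packaged into a container image and submitted as a job array by a cluster scheduler, a script of this shape could work through a night's worth of images without user interaction, which is the unattended workflow the abstract summarizes.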