YOLO2U-Net: Detection-guided 3D instance segmentation for microscopy

Research output: Contribution to journal › Article › peer-review

3 Scopus citations

Abstract

Microscopy imaging techniques are instrumental for the characterization and analysis of biological structures. Because these techniques typically render 3D visualizations of cells by stacking 2D projections, issues such as out-of-plane excitation and low z-axis resolution can make it difficult (even for human experts) to detect individual cells in 3D volumes, since non-overlapping cells may appear to overlap. A comprehensive method for accurate 3D instance segmentation of cells in brain tissue is introduced here. The proposed method combines the 2D YOLO detector with a multi-view fusion algorithm to construct 3D localizations of the cells. Next, the 3D bounding boxes, along with the data volume, are input to a 3D U-Net designed to segment the primary cell within each 3D bounding box and, in turn, to carry out instance segmentation of cells across the entire volume. The promising performance of the proposed method is shown in comparison with current deep learning-based 3D instance segmentation methods.
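The multi-view fusion step described in the abstract can be illustrated with a minimal sketch. The idea: a 2D detector run on the top-down (xy) projection yields a box constraining x and y, while a detection on a side (xz) projection constrains x and z; intersecting their shared x extent and combining the per-view extents yields an axis-aligned 3D box. Note this is a hypothetical illustration of the general idea, not the paper's actual fusion algorithm, and the `fuse_views` function and its box conventions are assumptions introduced here.

```python
# Hedged sketch of detection-guided 3D localization: combine 2D boxes
# from two orthogonal projections into one axis-aligned 3D box.
# (Assumed convention: boxes are (min, min, max, max) tuples in voxels;
# the real YOLO2U-Net fusion procedure may differ.)

def fuse_views(box_xy, box_xz):
    """Fuse a 2D box from the xy view with one from the xz view.

    box_xy: (x0, y0, x1, y1) from the top-down projection
    box_xz: (x0, z0, x1, z1) from the side projection
    Returns (x0, y0, z0, x1, y1, z1), where the x extent is the
    intersection of the two views' x ranges, or None if the two
    detections do not overlap in x (i.e. likely not the same cell).
    """
    x0 = max(box_xy[0], box_xz[0])
    x1 = min(box_xy[2], box_xz[2])
    if x1 <= x0:
        return None  # no shared x support: reject the pairing
    return (x0, box_xy[1], box_xz[1], x1, box_xy[3], box_xz[3])


print(fuse_views((2, 5, 10, 9), (4, 0, 12, 3)))  # (4, 5, 0, 10, 9, 3)
```

Each resulting 3D box, cropped from the volume, would then be passed to the 3D U-Net to segment the primary cell it contains.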

Original language: English
Pages (from-to): 37-42
Number of pages: 6
Journal: Pattern Recognition Letters
Volume: 181
DOIs
State: Published - May 2024

Funding

Research sponsored by the U.S. Department of Energy, under contract DE-AC05-00OR22725 with UT-Battelle, LLC. The US government retains and the publisher, by accepting the article for publication, acknowledges that the US government retains a nonexclusive, paid-up, irrevocable, worldwide license to publish or reproduce the published form of this manuscript, or allow others to do so, for US government purposes. DOE will provide public access to these results of federally sponsored research in accordance with the DOE Public Access Plan (http://energy.gov/downloads/doe-public-access-plan). A.D. and S.V. were supported by the AI-initiative LDRD Program at ORNL. This collaboration was funded by St. Jude Children's Research Hospital through funding from the American Lebanese Syrian Associated Charities (ALSAC). The Solecki Laboratory is funded by grants 1R01NS066936 and R01NS104029-02 from the National Institute of Neurological Disorders and Stroke (NINDS).

Keywords

  • 3D instance segmentation
  • Cell microscopy
  • Deep learning
