YOLO2U-Net: Detection-guided 3D instance segmentation for microscopy

Amirkoushyar Ziabari, Derek C. Rose, Abbas Shirinifard, David Solecki

Research output: Contribution to journal › Article › peer-review

Abstract

Microscopy imaging techniques are instrumental for the characterization and analysis of biological structures. Because these techniques typically build a 3D visualization of cells by stacking 2D projections, issues such as out-of-plane excitation and low resolution along the z-axis can make it challenging (even for human experts) to detect individual cells in 3D volumes, since non-overlapping cells may appear to overlap. A comprehensive method for accurate 3D instance segmentation of cells in brain tissue is introduced here. The proposed method combines the 2D YOLO detector with a multi-view fusion algorithm to construct 3D localizations of the cells. The resulting 3D bounding boxes, together with the data volume, are then input to a 3D U-Net that segments the primary cell in each box and, in turn, carries out instance segmentation of cells in the entire volume. The promising performance of the proposed method is demonstrated in comparison with current deep learning-based 3D instance segmentation methods.
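
The abstract only outlines the detection-guided pipeline, so the following is a minimal illustrative sketch of the multi-view fusion step, assuming 2D boxes have already been obtained from a YOLO-style detector on XY and XZ slices; the box formats, the intersection-based fusion rule, and the crop padding are assumptions for illustration, not the paper's exact algorithm.

    # Hypothetical sketch: fuse 2D detections from two orthogonal views into a
    # 3D bounding box, then crop the sub-volume that would be handed to a 3D
    # U-Net for per-box (primary-cell) segmentation. All details are assumed.
    import numpy as np

    def fuse_views(box_xy, box_xz):
        """Fuse a 2D box from the XY view with one from the XZ view.

        box_xy: (x_min, y_min, x_max, y_max) from slices along the z-axis.
        box_xz: (x_min, z_min, x_max, z_max) from slices along the y-axis.
        Returns (x_min, y_min, z_min, x_max, y_max, z_max); the shared
        x-extent is taken as the intersection of the two views.
        """
        x0 = max(box_xy[0], box_xz[0])
        x1 = min(box_xy[2], box_xz[2])
        return (x0, box_xy[1], box_xz[1], x1, box_xy[3], box_xz[3])

    def crop_volume(volume, box3d, pad=2):
        """Crop the (z, y, x) sub-volume inside a 3D box with a small margin."""
        x0, y0, z0, x1, y1, z1 = box3d
        zmax, ymax, xmax = volume.shape
        return volume[max(z0 - pad, 0):min(z1 + pad, zmax),
                      max(y0 - pad, 0):min(y1 + pad, ymax),
                      max(x0 - pad, 0):min(x1 + pad, xmax)]

    if __name__ == "__main__":
        volume = np.random.rand(64, 128, 128)   # synthetic (z, y, x) stack
        box_xy = (30, 40, 60, 75)               # e.g. from 2D YOLO on XY slices
        box_xz = (32, 10, 58, 25)               # e.g. from 2D YOLO on XZ slices
        box3d = fuse_views(box_xy, box_xz)
        crop = crop_volume(volume, box3d)
        print("3D box:", box3d, "crop shape:", crop.shape)
        # Each crop would then be segmented by a 3D U-Net, and the per-box
        # masks assembled into an instance segmentation of the whole volume.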

Original language: English
Pages (from-to): 37-42
Number of pages: 6
Journal: Pattern Recognition Letters
Volume: 181
DOIs
State: Published - May 2024

Keywords

  • 3D instance segmentation
  • Cell microscopy
  • Deep learning
