SAMCell: Generalized Label-Free Biological Cell Segmentation with Segment Anything


Abstract

Background

When analyzing cells in culture, assessing cell morphology (shape), confluency (density), and growth patterns is necessary for understanding cell health. These parameters are generally obtained by a skilled biologist inspecting light microscope images, but this can become very laborious for high-throughput applications. One way to speed up this process is to automate cell segmentation, the task of drawing a separate boundary around each individual cell in a microscope image. This task is made difficult by vague cell boundaries and the transparent nature of cells. Many techniques for automatic cell segmentation exist, but they often require annotated datasets, model retraining, and associated technical expertise.
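To illustrate how a segmentation result feeds these measurements, the sketch below computes confluency as the fraction of image pixels covered by any cell, and a cell count, from an instance-label mask in which 0 is background and each cell carries a unique positive integer ID. The function names and the toy array are hypothetical and not taken from the paper.

```python
import numpy as np

def confluency_from_labels(label_mask: np.ndarray) -> float:
    """Fraction of the image covered by cells.

    label_mask: 2-D integer array; 0 = background, each cell a unique positive ID.
    """
    return float(np.count_nonzero(label_mask)) / label_mask.size

def cell_count(label_mask: np.ndarray) -> int:
    """Number of distinct cells in an instance-label mask."""
    return int(np.unique(label_mask[label_mask > 0]).size)

# Hypothetical example: a 4x4 field with two cells covering 5 of 16 pixels.
labels = np.array([
    [0, 1, 1, 0],
    [0, 1, 0, 0],
    [0, 0, 2, 2],
    [0, 0, 0, 0],
])
print(confluency_from_labels(labels))  # 0.3125
print(cell_count(labels))              # 2
```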

Results

We present SAMCell, a modified version of Meta's Segment Anything Model (SAM) trained on an existing large-scale dataset of microscopy images spanning varying cell types and confluencies. We find that our approach works on a wide range of microscopy images, including cell types not seen in training and images taken by a different microscope. We also present a user-friendly UI that reduces the technical expertise needed to use this automated microscopy technique.
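The abstract does not give implementation details, but as a rough sketch of how the underlying Segment Anything machinery can be driven for label-free mask generation, the snippet below uses Meta's open-source segment-anything package with its automatic mask generator. The checkpoint path, model size, and input filename are assumptions; SAMCell's actual fine-tuned weights, prompting strategy, and post-processing may differ.

```python
import cv2
from segment_anything import sam_model_registry, SamAutomaticMaskGenerator

# Assumed checkpoint path; a fine-tuned checkpoint (e.g., SAMCell's weights) would be substituted here.
sam = sam_model_registry["vit_b"](checkpoint="sam_vit_b.pth")
sam.to("cpu")  # or "cuda" if a GPU is available

mask_generator = SamAutomaticMaskGenerator(sam)

# Load a microscopy image and convert BGR -> RGB, as SAM expects an RGB uint8 array.
image = cv2.cvtColor(cv2.imread("microscopy_image.png"), cv2.COLOR_BGR2RGB)

masks = mask_generator.generate(image)  # one dict per predicted object
print(f"{len(masks)} candidate cell masks")
for m in masks[:3]:
    print(m["area"], m["bbox"])  # pixel area and XYWH bounding box of each mask
```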

Conclusions

Using SAMCell, biologists can quickly and automatically obtain cell segmentation results of higher quality than previous methods. Further, these results can be obtained through our custom GUI without expertise in machine learning, decreasing the human labor required in cell culturing.
