Attention Augmented Feature Fusion based Nuclei Instance Segmentation and Type Classification in Histology Images
Nuclei instance segmentation and type classification are tasks in medical image analysis that aim to identify and separate individual nuclei in an image and classify them based on their type.
In nuclei instance segmentation, the goal is to identify the location and boundaries of each individual nucleus in an image. This is usually done using techniques such as image segmentation, object detection, or semantic segmentation. The output is typically a mask or a binary image where each nucleus is assigned a unique label or color.
The precise identification and examination of nuclei play a critical role in the detection and examination of cancer. Despite their significance, the task of accurately dividing and categorizing nuclei instances is a challenging one, often complicated by the presence of overlapping and congested nuclei with indistinct edges. Existing methods concentrate on area proposal techniques and feature encoding frameworks, yet often fall short in accurately pinpointing instances. In this research, we introduce a straightforward yet effective model that can recognize instance borders with precision and tackle the issue of class imbalance. By incorporating nuclei pixel positional information and a unique loss function, our model delivers accurate class information for each nucleus. Our network comprises a compact, attention-aware feature fusion architecture with dedicated instance probability, shape radial estimator, and classification heads. A combined classification loss function is employed to minimize loss by assigning weighted loss to each class based on their frequency of occurrence, resolving the class imbalance issues frequently encountered in publicly accessible nuclei datasets.
- Dual segmentation and classification: a new method for fusing the features of detected objects on a per-instance basis, using an attention mechanism for feature fusion to handle the diverse shape and texture variations of nuclei.
- Radial-distance-based shape estimation: the shape of each nucleus is estimated using radial distances that measure the distance of each pixel inside the nucleus from the contour, giving greater importance to pixels closer to the boundary.
- State-of-the-art performance: experimental results show that the proposed model outperforms other leading methods on eight publicly available datasets.
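As a rough illustration of the radial-distance idea above (a sketch only, not the paper's exact formulation), a boundary-aware weight map for a binary nucleus mask can be built by measuring each foreground pixel's distance to the nearest background pixel, which approximates its distance to the contour:

```python
import numpy as np

def boundary_weight_map(mask, eps=1e-6):
    """Weight pixels inside a binary nucleus mask by proximity to its contour.

    `mask` is a 2D boolean array. Pixels near the boundary receive weights
    close to 1 and pixels deep inside the nucleus weights close to 0.
    Illustrative sketch only -- not the paper's exact implementation.
    """
    ys, xs = np.nonzero(mask)      # foreground (nucleus) pixels
    bys, bxs = np.nonzero(~mask)   # background pixels
    weights = np.zeros(mask.shape, dtype=np.float32)
    if len(ys) == 0 or len(bys) == 0:
        return weights
    # Distance from each foreground pixel to its nearest background pixel
    # approximates the distance to the contour (brute force, fine for a demo).
    d = np.sqrt((ys[:, None] - bys[None, :]) ** 2 +
                (xs[:, None] - bxs[None, :]) ** 2).min(axis=1)
    # Normalise so contour pixels get weight ~1 and the centre ~0.
    weights[ys, xs] = 1.0 - d / (d.max() + eps)
    return weights

# Example: a 5x5 square nucleus inside a 7x7 image
mask = np.zeros((7, 7), dtype=bool)
mask[1:6, 1:6] = True
w = boundary_weight_map(mask)
```

For the square example, corner pixels of the nucleus (e.g. `w[1, 1]`) receive the largest weights and the centre (`w[3, 3]`) the smallest, matching the intuition of emphasising indistinct boundaries.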
Package versions used in the original paper:
- python 3.6.10
- scikit-learn 0.23.1
- scikit-image 0.16.2
- opencv-python 4.1.2.32
- TensorFlow 1.12
This package is compatible with Python 3.6 - 3.10.
To use AF-Net, please follow these steps:
- Please first install TensorFlow (either TensorFlow 1 or 2) by following the official instructions. For GPU support, it is very important to install the specific versions of CUDA and cuDNN that are compatible with the respective version of TensorFlow. (If you need help and can use conda, take a look at this.)
- Depending on your Python installation, you may need to use `pip3` instead of `pip`.
- You can find out which version of TensorFlow is installed via `pip show tensorflow`.
- If you want to use OpenCL-based computations on the GPU to speed up training, you need to install gputools.
We provide example workflows for each dataset in Jupyter notebooks that illustrate how this model can be used.
Currently we provide pretrained models for the following 8 datasets that can be utilized for further analysis or training:
Dataset | Modality (Staining) | Image format | Description
---|---|---|---
CPM-15 | H&E | RGB | CPM-15 dataset using AF-Net model trained for nuclei instance segmentation
CPM-17 | H&E | RGB | CPM-17 dataset using AF-Net model trained for nuclei instance segmentation
CoNSeP | H&E | RGB | CoNSeP dataset using AF-Net model trained for nuclei instance segmentation and classification
Kumar | H&E | RGB | Kumar dataset using AF-Net model trained for nuclei instance segmentation
TNBC | H&E | RGB | TNBC dataset using AF-Net model trained for nuclei instance segmentation
CryoNuSeg | H&E | RGB | CryoNuSeg dataset using AF-Net model trained for nuclei instance segmentation
Lizard | H&E | RGB | Lizard dataset using AF-Net model trained for nuclei instance segmentation
PanNuke | H&E | RGB | PanNuke dataset using AF-Net model trained for nuclei instance segmentation and classification
- Define the paths for the training and test images.
- For training from scratch, use the following configuration:
  `model = nurisc2D(config=conf, name='define folder for saving model weights', basedir='define path for model weights')`
- For loading a pre-trained model, use this configuration:
  `model = nurisc2D(config=None, name='pretrained model folder', basedir='pretrained model path')`
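The two configurations above can be wrapped in a small helper. This is a hedged sketch: `nurisc2D` and the config object come from this repository, but the folder names below and the helper itself are placeholders, not a documented API; the model class is passed in as a parameter so the sketch stays self-contained.

```python
import os

def build_model(nurisc2D, conf=None, pretrained=False):
    """Create an AF-Net model either from scratch or from saved weights.

    `nurisc2D` is the model class from this repository, passed in
    explicitly here to keep the sketch self-contained. The `name`
    and `basedir` values are placeholders -- substitute your own.
    """
    basedir = os.path.join("models")  # directory holding model weights
    if pretrained:
        # Loading a pre-trained model: config must be None, so the
        # weights are read from basedir/name instead of training fresh.
        return nurisc2D(config=None, name="pretrained_model_folder",
                        basedir=basedir)
    # Training from scratch: pass a configuration object.
    return nurisc2D(config=conf, name="afnet_from_scratch", basedir=basedir)
```

Passing `config=None` is what distinguishes the pre-trained path from training from scratch, mirroring the two configurations listed above.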
The AF-Net model has been trained for instance segmentation on the following eight datasets: CPM-15, CPM-17, Kumar, PanNuke, TNBC, CoNSeP, CryoNuSeg, and Lizard.
The AF-Net model has been trained for classification on the PanNuke and CoNSeP datasets, i.e. each detected nucleus instance is additionally classified into its specific tumor class (e.g. cell type).
We have evaluated the model's performance using the following evaluation metrics:

- `tp`, `fp`, `fn`
- `precision`, `recall`, `accuracy`, `f1`
- `panoptic_quality`
- `mean_true_score`, `mean_matched_score`

These are computed by matching ground-truth/prediction objects if their IoU exceeds a threshold (by default 50%).
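As a hedged sketch of how such matching-based metrics work (the repository's own evaluation code may differ in detail), predictions can be greedily matched to ground-truth instances by IoU, with a subset of the metrics above derived from the match counts:

```python
import numpy as np

def match_instances(gt_masks, pred_masks, iou_threshold=0.5):
    """Greedily match ground-truth and predicted instance masks by IoU.

    Each mask is a 2D boolean array. A prediction counts as a true
    positive when it matches a distinct, previously unmatched
    ground-truth mask with IoU above `iou_threshold` (50% by default).
    Illustrative sketch only -- not the repository's evaluation code.
    """
    matched_gt, matched_ious = set(), []
    for pred in pred_masks:
        best_iou, best_j = 0.0, None
        for j, gt in enumerate(gt_masks):
            if j in matched_gt:
                continue
            inter = np.logical_and(gt, pred).sum()
            union = np.logical_or(gt, pred).sum()
            iou = inter / union if union else 0.0
            if iou > best_iou:
                best_iou, best_j = iou, j
        if best_j is not None and best_iou > iou_threshold:
            matched_gt.add(best_j)
            matched_ious.append(best_iou)
    tp = len(matched_ious)
    fp = len(pred_masks) - tp   # unmatched predictions
    fn = len(gt_masks) - tp     # missed ground-truth instances
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    # Panoptic quality = (sum of matched IoUs) / (tp + fp/2 + fn/2).
    denom = tp + 0.5 * fp + 0.5 * fn
    pq = sum(matched_ious) / denom if denom else 0.0
    return {"tp": tp, "fp": fp, "fn": fn, "precision": precision,
            "recall": recall, "f1": f1, "panoptic_quality": pq}
```

A perfect prediction yields `precision == recall == f1 == panoptic_quality == 1.0`, while an entirely missed nucleus shows up as an `fn` and drags the panoptic quality down.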
If you find our work useful in your research, please consider citing:
@inproceedings{nurisc,
title={NuRiSC: Nuclei Radial Instance Segmentation and Type Classification},
author={Nasir, Esha and Fraz, Muhammad},
booktitle={Medical Imaging and Computer-Aided Diagnosis, MICAD2022},
year={2022},
organization={Springer}
}
@article{article,
author = {Nasir, Esha and Parvaiz, Arshi and Fraz, Muhammad},
year = {2022},
month = {12},
pages = {1-56},
title = {Nuclei and glands instance segmentation in histology images: a narrative review},
journal = {Artificial Intelligence Review},
doi = {10.1007/s10462-022-10372-5}
}