Repository for Semantic Segmentation using Deep Learning for MoNuSeg Dataset
In this repository, we work with already created ground truth segmentation masks.
The goal of semantic image segmentation is to label each pixel of an image with the class it represents. Because we predict a class for every pixel, this task is commonly referred to as dense prediction. Major applications of semantic segmentation include biomedical diagnosis, geo-sensing, and autonomous vehicles.
Refer to the respective notebook
The MoNuSeg dataset contains 30 images for training and 14 images for testing, each of size 1000x1000.
To facilitate training, 256x256 patches of every image, along with their corresponding masks, can be generated using
`view_as_windows`
. More details can be found in patch_generator.ipynb
. Each image yields 36 patches, for 1584 patches overall.
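The patch extraction described above can be sketched with `skimage.util.view_as_windows`. The stride of 148 is an assumption, chosen so that a 1000x1000 image tiles into a 6x6 grid of 256x256 windows, i.e. 36 patches per image as stated:

```python
import numpy as np
from skimage.util import view_as_windows

# Placeholder for a 1000x1000 grayscale image (or mask); in practice
# this would be loaded from the MoNuSeg dataset.
image = np.zeros((1000, 1000), dtype=np.uint8)

# step=148 is an assumption: floor((1000 - 256) / 148) + 1 = 6 windows
# per axis, so each image yields a 6x6 grid of 36 patches.
windows = view_as_windows(image, (256, 256), step=148)
patches = windows.reshape(-1, 256, 256)
print(patches.shape)  # (36, 256, 256)
```

Note that with this stride the last 4 pixels along each axis are dropped; an overlapping stride or padding would cover the full image.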
Main packages required are:
- Keras 2.2.3
- TensorFlow 1.15.0
- Numpy
- scikit-image
- Other details available in respective notebooks
More details can be found in the respective notebooks; each covers:
- Preprocessing
- Dataset visualizations
- Network summaries
- Training
- Inference
- Predicted-result visualization
This repository contains three different semantic segmentation models:
- UNet (training + inference)
- SegNet (training + inference; supports indices pooling)
- DeepLabv3 (training + inference; supports MobileNetV2 and Xception backbones)
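SegNet's decoder upsamples using the pooling indices recorded by the encoder's max-pooling layers, so maxima are restored to their original spatial positions. A minimal NumPy sketch of that idea (not the repository's actual Keras implementation):

```python
import numpy as np

def max_pool_with_indices(x, k=2):
    # 2x2 max pooling over a (H, W) array; also records the flat index
    # of each maximum, as SegNet-style encoders do.
    H, W = x.shape
    pooled = np.zeros((H // k, W // k))
    indices = np.zeros((H // k, W // k), dtype=int)
    for i in range(H // k):
        for j in range(W // k):
            window = x[i*k:(i+1)*k, j*k:(j+1)*k]
            di, dj = divmod(np.argmax(window), k)
            pooled[i, j] = window[di, dj]
            indices[i, j] = (i*k + di) * W + (j*k + dj)
    return pooled, indices

def max_unpool(pooled, indices, shape):
    # Scatter pooled values back to their recorded positions; all other
    # positions stay zero (SegNet-style unpooling).
    out = np.zeros(shape).ravel()
    out[indices.ravel()] = pooled.ravel()
    return out.reshape(shape)

x = np.array([[1., 3., 2., 0.],
              [4., 2., 1., 5.],
              [0., 1., 2., 2.],
              [3., 0., 1., 0.]])
p, idx = max_pool_with_indices(x)
print(p)  # [[4. 5.] [3. 2.]]
y = max_unpool(p, idx, x.shape)  # maxima back at their original positions
```

This is why SegNet can upsample without learned deconvolution weights: the indices alone carry the spatial information.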
Trained models can be found in the respective folders.
All the available notebooks are standalone and can be run directly through Google Colab. The models can be trained by loading the data from Google Drive. For inference, load the saved weights from the respective directories and predict on multiple images from the notebook. For inference only, comment out the
`model.fit`
cell in that particular notebook and run the remaining cells to get the results.
Loss function used: `binary_crossentropy` for the first three models below; the last model was trained with Dice loss.
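The Dice coefficient reported in the tables below measures overlap between the predicted and ground-truth masks. A plain-NumPy sketch (the `smooth` term is a common convention to avoid division by zero, not necessarily the value used in this repository):

```python
import numpy as np

def dice_coefficient(y_true, y_pred, smooth=1.0):
    # Dice = 2 * |A ∩ B| / (|A| + |B|), computed on flattened binary masks.
    intersection = np.sum(y_true * y_pred)
    return (2.0 * intersection + smooth) / (np.sum(y_true) + np.sum(y_pred) + smooth)

# Toy 4-pixel masks: one true-positive pixel, one missed pixel.
y_true = np.array([1, 1, 0, 0], dtype=float)
y_pred = np.array([1, 0, 0, 0], dtype=float)
print(dice_coefficient(y_true, y_pred))  # (2*1 + 1) / (2 + 1 + 1) = 0.75
```

Dice loss, used by the last model, is simply `1 - dice_coefficient`, which directly optimizes mask overlap rather than per-pixel cross-entropy.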
| Results | Values |
|---|---|
| Test loss | 0.4298 |
| Test accuracy | 0.8666 |
| Dice coefficient | 0.8035 |
| F1-score | 0.7157 |
| Results | Values |
|---|---|
| Test loss | 0.1589 |
| Test accuracy | 0.9142 |
| Dice coefficient | 0.7618 |
| F1-score | 0.7453 |
| Results | Values |
|---|---|
| Test loss | 0.3589 |
| Test accuracy | 0.8288 |
| Dice coefficient | 0.8645 |
| F1-score | 0.6527 |
| Results | Values |
|---|---|
| Dice loss | 0.0832 |
| Dice coefficient | 0.9167 |
| Accuracy | 0.7988 |