Kidney-disease-classification

An application where the user can upload a kidney CT-scan image and the underlying DNN model (VGG-16) predicts whether the kidney has a tumor.

Workflows

  1. Update config.yaml
  2. Update secrets.yaml [Optional]
  3. Update params.yaml
  4. Update the entity
  5. Update the configuration manager in src config
  6. Update the components
  7. Update the pipeline
  8. Update main.py
  9. Update dvc.yaml
  10. Update app.py
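The params.yaml in step 3 holds the training hyperparameters. A minimal sketch for a VGG-16 transfer-learning setup might look like the following (all keys and values here are hypothetical placeholders, not the repository's actual configuration):

```yaml
# Hypothetical training parameters -- placeholders, not the repo's actual values
AUGMENTATION: true
IMAGE_SIZE: [224, 224, 3]   # VGG-16 expects 224x224 RGB input
BATCH_SIZE: 16
INCLUDE_TOP: false          # drop VGG-16's ImageNet classifier head
EPOCHS: 10
CLASSES: 2                  # tumor vs. normal
WEIGHTS: imagenet
LEARNING_RATE: 0.01
```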

How to run?

STEPS:

Clone the repository

git clone https://github.com/atifabedeen/Kidney-disease-classification

STEP 01- Create a conda environment after opening the repository

conda create -n venv python=3.8 -y
conda activate venv

STEP 02- install the requirements

pip install -r requirements.txt
# Finally run the following command
python app.py

Now,

open up your local host and port in a browser

MLflow

cmd
  • mlflow ui

DagsHub

I use DagsHub to track my MLflow experiments. To do the same, read the MLflow documentation. You will need the following environment variables:

MLFLOW_TRACKING_URI
MLFLOW_TRACKING_USERNAME
MLFLOW_TRACKING_PASSWORD
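For example, these variables can be exported in the shell before training (the values below are hypothetical placeholders; use the tracking URI, username, and token shown on your own DagsHub repository page):

```shell
# Hypothetical values -- substitute your own DagsHub repo URI, username, and token
export MLFLOW_TRACKING_URI="https://dagshub.com/<username>/<repo-name>.mlflow"
export MLFLOW_TRACKING_USERNAME="<username>"
export MLFLOW_TRACKING_PASSWORD="<your-dagshub-token>"
```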

DVC cmd

  1. dvc init
  2. dvc repro
  3. dvc dag
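The pipeline that dvc repro executes is defined in dvc.yaml. A minimal sketch of one stage might look like this (the stage name, command, and paths are illustrative assumptions, not necessarily the repository's actual layout):

```yaml
# Illustrative sketch only -- stage name, command, and paths are assumptions
stages:
  data_ingestion:
    cmd: python main.py
    deps:
      - main.py
      - config/config.yaml
    outs:
      - artifacts/data_ingestion
```

dvc repro re-runs a stage only when one of its deps changes, and dvc dag prints the dependency graph between stages.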

About MLflow & DVC

MLflow

  • It is production-grade
  • Traces all of your experiments
  • Logs and tags your models

DVC

  • It is very lightweight, intended for POCs only
  • Lightweight experiment tracker
  • It can perform orchestration (creating pipelines)

AWS-CICD-Deployment-with-Github-Actions

I've used GitHub Actions for AWS CI/CD deployment: the workflow builds my files and dependencies into a Docker image, pushes it to AWS ECR, and runs it on an AWS EC2 instance. I have disabled it for now (because I don't want to pay). To do the same, follow the steps below.

1. Login to AWS console.

2. Create IAM user for deployment

#with specific access

1. EC2 access: EC2 is a virtual machine

2. ECR: Elastic Container Registry, to store your Docker image in AWS


#Description: About the deployment

1. Build the Docker image of the source code

2. Push your Docker image to ECR

3. Launch your EC2 instance

4. Pull your image from ECR onto EC2

5. Launch your Docker image on EC2
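The deployment steps above can be sketched as a GitHub Actions workflow. This is an illustrative sketch based on the secrets listed below, not the repository's actual workflow file (check .github/workflows/ for that); the image tag and port are assumptions:

```yaml
# Illustrative sketch only -- not the repository's actual workflow file
name: CI/CD

on:
  push:
    branches: [ main ]

jobs:
  build-and-push:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3

      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v2
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: ${{ secrets.AWS_REGION }}

      - name: Build, tag, and push image to ECR
        run: |
          aws ecr get-login-password --region ${{ secrets.AWS_REGION }} \
            | docker login --username AWS --password-stdin ${{ secrets.AWS_ECR_LOGIN_URI }}
          docker build -t ${{ secrets.AWS_ECR_LOGIN_URI }}/${{ secrets.ECR_REPOSITORY_NAME }}:latest .
          docker push ${{ secrets.AWS_ECR_LOGIN_URI }}/${{ secrets.ECR_REPOSITORY_NAME }}:latest

  deploy:
    needs: build-and-push
    runs-on: self-hosted   # the EC2 self-hosted runner configured below
    steps:
      - name: Pull and run the image on EC2
        run: |
          docker pull ${{ secrets.AWS_ECR_LOGIN_URI }}/${{ secrets.ECR_REPOSITORY_NAME }}:latest
          docker run -d -p 8080:8080 ${{ secrets.AWS_ECR_LOGIN_URI }}/${{ secrets.ECR_REPOSITORY_NAME }}:latest
```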

#Policy:

1. AmazonEC2ContainerRegistryFullAccess

2. AmazonEC2FullAccess

3. Create ECR repo to store/save docker image

- Save the URI

4. Create EC2 machine (Ubuntu)

5. Open EC2 and Install docker in EC2 Machine:

#optional

sudo apt-get update -y

sudo apt-get upgrade

#required

curl -fsSL https://get.docker.com -o get-docker.sh

sudo sh get-docker.sh

sudo usermod -aG docker ubuntu

newgrp docker

6. Configure EC2 as self-hosted runner:

Settings > Actions > Runners > New self-hosted runner > choose the OS > then run the displayed commands one by one

7. Set up GitHub secrets:

AWS_ACCESS_KEY_ID

AWS_SECRET_ACCESS_KEY

AWS_REGION

AWS_ECR_LOGIN_URI 

ECR_REPOSITORY_NAME

And you're good to go!!

Project adapted and reproduced from Krish Naik's GitHub repo

About

A Web Application hosted on AWS where I use Machine Learning (VGG-16) to locate tumors in a CT-Scan image of a Kidney
