Incremental Model Using Kafka, Flask, Tensorflow and Docker

mstale007/Incremental_Model_Kafka


Incremental Model

Description

A microservice-based architecture with five different components:

  1. Frontend (simple HTML/CSS + JS website to test the architecture)
  2. Backend (Flask app with a Kafka producer)
  3. Kafka Consumer (Python script that consumes new data, then trains and saves the model)
  4. TensorFlow Serving (runs in a Docker container)
  5. Saved Models (folder holding the saved versions of the model, used by TensorFlow Serving to provide an API)
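Components 2 and 3 exchange annotated data points over Kafka. A minimal, stdlib-only sketch of what that message schema might look like — the field names and the topic name mentioned in the comment are illustrative, not taken from the repo:

```python
import json

def encode_record(features, label):
    """Backend (producer) side: serialize one annotated data point to bytes."""
    return json.dumps({"features": features, "label": label}).encode("utf-8")

def decode_record(raw):
    """Consumer side: parse the bytes back into (features, label)."""
    msg = json.loads(raw.decode("utf-8"))
    return msg["features"], msg["label"]

# With a client such as kafka-python, the backend would send these bytes, e.g.
#   KafkaProducer(bootstrap_servers="localhost:9092").send("new_annotations", raw)
# and the consumer script would read them back from a KafkaConsumer
# subscribed to the same (hypothetical) "new_annotations" topic.
```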

Steps

  1. Host frontend/index.html, or open it directly in a browser:
https://<localhost/path-to-file>/index.html
  2. Start the Flask backend (backend/predict_annotate.py) in a terminal:
python predict_annotate.py
  3. In a new terminal, start the incremental model (Kafka consumer, model/incremental_model.py):
python incremental_model.py
  4. Start TensorFlow Serving on Docker:
  • Pull the TensorFlow Serving image from Docker Hub:
docker pull tensorflow/serving
  • Run the image:
docker run -it -v <path-to-model-folder>\model:/inc_model_kafka -p 8605:8605 --entrypoint /bin/bash tensorflow/serving

Here inc_model_kafka is just a sample name for the folder that will be created inside the Docker container; you can use any other name.

  • After this you should be inside the container's shell. The last step is to start the TensorFlow model server:
tensorflow_model_server --port=8500 --rest_api_port=8605 --model_config_file=/inc_model_kafka/model.config
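The model.config passed above is a TensorFlow Serving model-config file (protobuf text format). A minimal example is sketched below — the model name and base path are illustrative and should match your own mount point and saved-model folder:

```
model_config_list {
  config {
    name: "inc_model"
    base_path: "/inc_model_kafka/saved_model"
    model_platform: "tensorflow"
  }
}
```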

Once everything is up and running you can start interacting with the frontend; as you add new data, each newly trained model will be stored in the model/saved_model directory.
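TensorFlow Serving treats each numeric subdirectory of the model base path as a model version and serves the highest one, so the consumer can save every retrained model under the next integer. A stdlib-only sketch of that versioning step (the directory layout and the Keras calls in the comment are assumptions, not code from the repo):

```python
import os

def next_version(model_dir):
    """Return the next integer version for a TF-Serving-style layout,
    where each saved model lives in model_dir/<version>/."""
    versions = [int(d) for d in os.listdir(model_dir) if d.isdigit()]
    return max(versions, default=0) + 1

# After training on a new batch, the consumer would do something like:
#   model.fit(new_x, new_y, epochs=1)
#   model.save(os.path.join(model_dir, str(next_version(model_dir))))
# TensorFlow Serving then picks up the new highest-numbered version
# automatically, without restarting the container.
```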


Architecture

All the components interact as shown below:

Flowchart

Check the flowchart in high resolution here

ML in practice

The online machine learning paradigm differs from the traditional way of training machine learning models:

  • In traditional (batch) approaches, the dataset is fixed and the model iterates over it a fixed number of times.
  • In online learning, the model incrementally updates its parameters as soon as new data points become available, and this process is expected to continue indefinitely.
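The difference can be illustrated with a single-weight linear model trained by streaming SGD: each data point is seen once and the weight is updated immediately, rather than iterating over a fixed dataset. The learning rate and the simulated stream are illustrative:

```python
def sgd_step(w, x, y, lr=0.05):
    """One online update minimizing squared error (w*x - y)**2."""
    grad = 2 * x * (w * x - y)
    return w - lr * grad

def train_online(stream, w=0.0):
    """Consume the stream one point at a time, updating after each point."""
    for x, y in stream:
        w = sgd_step(w, x, y)
    return w

# Simulated stream drawn from y = 2x: the estimate converges toward 2.0
# without ever holding the whole dataset in memory.
stream = ((x, 2.0 * x) for _ in range(70) for x in (1.0, 2.0, 3.0))
w = train_online(stream)
```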
