LSTM-ml-api

This repository contains the code for sentiment analysis on the WiseSight dataset using an LSTM model, as well as for deploying the model on Docker or AWS Elastic Beanstalk via a Flask API.

First, download the WiseSight dataset files train.txt, train_label.txt, test.txt and test_label.txt from https://github.com/PyThaiNLP/wisesight-sentiment/tree/master/kaggle-competition into 01-train_model/data/.
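
For reference, a minimal sketch of how the downloaded files could be read, assuming one sample per line in the text files and one matching label per line in the label files; the load_split helper is hypothetical and not part of the repo.

# Hypothetical helper (not part of the repo): load the WiseSight files,
# assuming one text per line and one aligned label per line.
from pathlib import Path

DATA_DIR = Path("01-train_model/data")

def load_split(text_file: str, label_file: str):
    texts = (DATA_DIR / text_file).read_text(encoding="utf-8").splitlines()
    labels = (DATA_DIR / label_file).read_text(encoding="utf-8").splitlines()
    assert len(texts) == len(labels), "texts and labels must align line by line"
    return texts, labels

train_texts, train_labels = load_split("train.txt", "train_label.txt")
test_texts, test_labels = load_split("test.txt", "test_label.txt")
print(len(train_texts), "training examples,", len(test_texts), "test examples")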

01-train_model

This folder contains the files for training the model and saving the best model, which is then used by the API.

  • 1-sentiment-analysis-LSTM.ipynb contains the code for training, validating and testing the LSTM model. The vocabulary and the weights of the best model are saved in 01-train_model/save/.

  • 2-inference.ipynb imports the model class from model_and_utils.py and loads the vocabulary and the best weights for prediction (see the inference sketch after this list).

  • config.yml contains the parameters used for model training and inference.
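
The inference step looks roughly like the following sketch, assuming a PyTorch implementation; the class name SentimentLSTM, the constructor arguments, the saved file names and the vocab format are all assumptions for illustration, not the repo's actual identifiers (model_and_utils.py and config.yml hold the real ones).

# Minimal inference sketch (assumptions: PyTorch, dict-like vocab, file names below).
import pickle
import torch
from model_and_utils import SentimentLSTM   # hypothetical class name

with open("save/vocab.pkl", "rb") as f:      # assumed path/format of the saved vocabulary
    vocab = pickle.load(f)

model = SentimentLSTM(vocab_size=len(vocab), embed_dim=128,
                      hidden_dim=256, num_classes=4)       # assumed hyperparameters
model.load_state_dict(torch.load("save/best_model.pt", map_location="cpu"))
model.eval()

tokens = "อาหารอร่อยมาก".split()             # real tokenization follows the notebook
ids = torch.tensor([[vocab.get(t, vocab.get("<unk>", 0)) for t in tokens]])
with torch.no_grad():
    pred = model(ids).argmax(dim=-1).item()
print("predicted class:", pred)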

02-local_api_test

This folder contains the files required to run the Flask application on a local machine and in Docker.

inference_app.py defines the /inference route, which accepts a JSON payload as input and returns the prediction as JSON; a minimal sketch is shown below.
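
The sketch assumes the request body is a JSON object with a "text" field; the field name and the predict() stub are assumptions, and the real model call lives in the repo's own code.

# Minimal sketch of the Flask app (assumed payload shape: {"text": "..."}).
from flask import Flask, jsonify, request

app = Flask(__name__)

def predict(text):
    # placeholder: the real app tokenizes the text and runs the saved LSTM model
    return "neutral"

@app.route("/inference", methods=["POST"])
def inference():
    payload = request.get_json(force=True)   # parse the incoming JSON body
    text = payload.get("text", "")
    return jsonify({"text": text, "sentiment": predict(text)})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)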

To run the application locally

Start the local app

python .\inference_app.py

Test the local app

python .\test_api.py
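
A client test could look like the sketch below, assuming the same {"text": ...} payload shape; the real test_api.py may instead read input.json from disk.

# Minimal client sketch (assumed request format; requires the app to be running).
import requests

payload = {"text": "อาหารอร่อยมาก"}
resp = requests.post("http://localhost:5000/inference", json=payload, timeout=10)
resp.raise_for_status()
print(resp.json())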

To run the application in Docker

docker image build -t flask_docker .
docker run -p 5000:5000  -d flask_docker

Send JSON to the application running in Docker to get the sentiment prediction

curl.exe -H 'Content-Type: application/json' -d "@../input.json"  http://localhost:5000/inference

03-to-eb

This folder contains the files required for deploying the model on AWS Elastic Beanstalk. (Note: I have stopped the application and deleted the environment on AWS to avoid any fees.)

Steps

  • Go to Elastic Beanstalk > "Create new application" > set "Application Name"


  • Select "Platform" as "Python" > "Application Code" > "Upload Your Code"

  • Set "Scource Code" as "Local" > "Choose file" > ZIP all files in 03-to-eb/ into to-elasticbean.zip (must contain application.py, requirements.txt, .ebextensions/python.config and other files needed for prediction)

  • Go to "Configure more options" > "Modify instances" > set "Root colume type" to "General Purpose (SSD)" > size to 10 GB

  • "EC2 instance types" > "t2.small"

  • Create the application and wait until the environment is ready
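
Elastic Beanstalk's Python platform looks for a WSGI callable named application in application.py by default, so the entry point is essentially the local Flask app under that name. A minimal sketch, with predict() again standing in for the repo's actual model loading and inference code:

# application.py sketch for Elastic Beanstalk: the platform expects a module
# named application exposing a WSGI callable called "application".
from flask import Flask, jsonify, request

application = Flask(__name__)

def predict(text):
    # placeholder: load the vocab/weights once at startup and run the LSTM here
    return "neutral"

@application.route("/inference", methods=["POST"])
def inference():
    payload = request.get_json(force=True)
    text = payload.get("text", "")
    return jsonify({"text": text, "sentiment": predict(text)})

if __name__ == "__main__":
    application.run(debug=False)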

Once the environment is up, send JSON to the Elastic Beanstalk endpoint to get the sentiment prediction

curl.exe -H 'Content-Type: application/json' -d "@./input.json"  http://chanapasentimentanalysisapp-env-1.eba-ddggkdwc.us-west-2.elasticbeanstalk.com/inference
