
ShrishailSGajbhar/fastapi-onnx-inference


Reproducing FastAPI-ONNX Inference for Sentiment Analysis

How to run

Prerequisites:

  • Docker
  • The model binary for the task in ONNX format. Download it from this link and put it in the webapp folder

Steps:

  1. Create the Docker containers for the frontend and backend services using the command docker-compose up -d
  2. Go to http://localhost:8502 for the Streamlit UI (frontend)
  3. Go to http://localhost:8001/docs for the Swagger UI (backend)
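For orientation, the docker-compose.yml wiring implied by the steps above might look roughly like the sketch below. The service names, build contexts, and container-side ports are assumptions; check the actual compose file in the repository.

```yaml
services:
  backend:
    build: ./webapp          # FastAPI + ONNX model (assumed build context)
    ports:
      - "8001:8001"          # Swagger UI at http://localhost:8001/docs
  frontend:
    build: ./frontend        # Streamlit UI (assumed build context)
    ports:
      - "8502:8502"          # UI at http://localhost:8502
    depends_on:
      - backend              # start the API before the UI
```

Running docker-compose up -d builds both images (if needed) and starts the two containers in the background.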

Example for positive sentiment

[screenshot: sample-1]

Example for negative sentiment

[screenshot: sample-2]

Swagger UI for backend APIs

[screenshot: sample-3]
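Besides the Swagger UI, the backend can be called programmatically. The sketch below assumes a hypothetical /predict endpoint taking a JSON body with a "text" field; the actual route and schema are listed at http://localhost:8001/docs.

```python
import json
from urllib.request import Request, urlopen

# Hypothetical endpoint -- verify the real route and request schema
# in the Swagger UI at http://localhost:8001/docs.
API_URL = "http://localhost:8001/predict"


def build_request(text: str) -> Request:
    """Build a POST request carrying the review text as a JSON body."""
    body = json.dumps({"text": text}).encode("utf-8")
    return Request(API_URL, data=body,
                   headers={"Content-Type": "application/json"})


def predict(text: str) -> dict:
    """Send the request to the running backend and decode the JSON reply."""
    with urlopen(build_request(text)) as resp:
        return json.loads(resp.read())


if __name__ == "__main__":
    # With the containers up, a real call would be:
    #   print(predict("This movie was wonderful!"))
    print(build_request("This movie was wonderful!").full_url)
```

The urllib-based client keeps the example dependency-free; with the requests library installed, the same call is a one-liner.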