
A microservice API for serving an ML model using FastAPI, Celery and a custom neural network library.


Neural Network API

Badges: Build and tests · Documentation Status · codecov · License: MIT · Code style: black

Description

The NN API is a simple API that serves the MNIST model for handwritten digit recognition. The MNIST model is built and trained using an internal neural network library.

Installation

Prerequisites

Make sure you have installed all the following prerequisites on your development machine:

  • Python 3.7+ (with setuptools, wheel and virtualenv packages)
  • Docker

Set up project

  • Clone the repository:
git clone https://github.com/antkrit/nn-api.git
  • Move into the project root folder:
cd nn-api
  • Create and activate a virtual environment:

Linux:

virtualenv venv
source venv/bin/activate

Windows:

virtualenv venv
venv\Scripts\activate
  • Install dependencies:

Production requirements (sufficient if you only want to run the application)

python -m pip install -e .

Development requirements (includes production requirements)

python -m pip install -e .[dev]

Run application

Model

The MNIST model training process can be found in the MNIST example notebook. After training the model and exporting it to a ".pkl" file, the path to it must be specified in the .env file.
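
For reference, a minimal sketch of how a trained model object might be exported to a ".pkl" file, assuming the object produced by the internal library is picklable (the function name and the `model` argument are illustrative, not the actual notebook code):

```python
import pickle


def export_model(model, path="api/model/trained_model-0.1.0-MNIST-DNN.pkl"):
    """Serialize a trained model object to a .pkl file.

    `model` stands for the trained MNIST network produced by the
    internal library in the example notebook (illustrative only).
    """
    with open(path, "wb") as f:
        pickle.dump(model, f)
```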

Configuration

At least the following environment variables must be set (values may vary):

MODEL_PATH=api/model/trained_model-0.1.0-MNIST-DNN.pkl

CELERY_BROKER_URI=amqp://rabbitmq
CELERY_BACKEND_URI=redis://redis:6379/0
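
The application reads these values at startup. A minimal, standard-library-only sketch of how they might be consumed (the real implementation lives in config.py and may differ, including the defaults):

```python
import os

# Fall back to the example values above if the variables are unset
# (illustrative defaults, not necessarily those used by config.py).
MODEL_PATH = os.getenv("MODEL_PATH", "api/model/trained_model-0.1.0-MNIST-DNN.pkl")
CELERY_BROKER_URI = os.getenv("CELERY_BROKER_URI", "amqp://rabbitmq")
CELERY_BACKEND_URI = os.getenv("CELERY_BACKEND_URI", "redis://redis:6379/0")
```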

More specific project configuration, such as logging, can be found in the config.py file. Requests are logged in JSON format by default. This behavior can be changed by specifying the appropriate logger for the API logging middleware. A list of pre-configured loggers can be found in the logging configuration.
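
As an illustration of JSON-style request logging, here is a minimal, standard-library-only sketch of a JSON formatter wired up with `logging.config.dictConfig`. The logger name and configuration are assumptions for the example and may differ from what config.py actually defines:

```python
import json
import logging
import logging.config


class JSONFormatter(logging.Formatter):
    """Render each log record as a single JSON object."""

    def format(self, record):
        return json.dumps({
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
            "time": self.formatTime(record),
        })


logging.config.dictConfig({
    "version": 1,
    "formatters": {"json": {"()": JSONFormatter}},
    "handlers": {
        "console": {"class": "logging.StreamHandler", "formatter": "json"},
    },
    "root": {"handlers": ["console"], "level": "INFO"},
})

# Hypothetical logger name, used only to demonstrate the output format.
logging.getLogger("api.request").info("request received")
```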

Run development server

In the project root directory run:

docker compose build
docker compose up

After a successful build and launch, the API and its supporting services (Celery worker, RabbitMQ broker, Redis backend) will be up and running.
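
Once the stack is up, the API can be exercised with any HTTP client. The sketch below uses the `requests` package against a hypothetical prediction endpoint; the host, port, route, and payload format here are assumptions, so consult the running service's OpenAPI docs (FastAPI serves them at /docs by default) for the actual schema:

```python
import requests

# Hypothetical endpoint and payload: replace with the route and
# request schema exposed by the running API.
with open("digit.png", "rb") as f:
    response = requests.post(
        "http://localhost:8000/predict",
        files={"file": f},
    )

print(response.status_code, response.json())
```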
