This repo contains the helper code required to run your Ollama service on a RunPod GPU as a serverless endpoint.
Prerequisites:

- Ollama running locally (a quick connectivity check is sketched below)
- Python >= 3.8
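
To confirm the first prerequisite, you can ping Ollama's local HTTP API. This is a minimal sketch assuming the default Ollama port (11434); the `ollama_is_up` helper is illustrative and not part of this repo.

```python
# Quick sanity check that a local Ollama server is reachable.
# Assumes the default Ollama port (11434); adjust if you changed it.
import json
import urllib.request

def ollama_is_up(base_url: str = "http://localhost:11434") -> bool:
    """Return True if the local Ollama server responds to /api/tags."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as resp:
            models = json.load(resp).get("models", [])
            print(f"Ollama is up; {len(models)} model(s) pulled.")
            return True
    except OSError:
        print("Ollama does not appear to be running.")
        return False

if __name__ == "__main__":
    ollama_is_up()
```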
Note: update the prompt in test_input.json if you need to test the model with different input.
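
RunPod's local test harness reads test_input.json and passes the object under its "input" key to the handler. A minimal sketch, assuming the handler expects a "prompt" field (check runpod_serverless.py for the exact field names):

```json
{
    "input": {
        "prompt": "Why is the sky blue?"
    }
}
```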
To test the handler locally, run:

```bash
python runpod_serverless.py
```
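
For orientation, a RunPod serverless handler that forwards the prompt to a local Ollama server can be as small as the sketch below. The field names, default model, and `OLLAMA_URL` are assumptions; the repo's actual runpod_serverless.py may differ.

```python
# Sketch of a RunPod serverless handler that forwards a prompt to a
# local Ollama server. Field names ("prompt", "model") and the default
# model are assumptions, not necessarily what this repo uses.
import json
import urllib.request

import runpod

OLLAMA_URL = "http://localhost:11434/api/generate"  # default Ollama endpoint

def handler(job):
    """Run the job's prompt through Ollama and return the generated text."""
    job_input = job["input"]
    payload = json.dumps({
        "model": job_input.get("model", "llama3"),  # assumed default model
        "prompt": job_input["prompt"],
        "stream": False,  # single JSON response instead of a token stream
    }).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]

runpod.serverless.start({"handler": handler})
```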
A Makefile target is also provided:

```bash
make all
```
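
Once the worker is deployed as a RunPod serverless endpoint, it can be invoked with the runpod SDK. A hedged sketch; the API key and endpoint ID below are placeholders, not real values:

```python
# Invoke the deployed serverless endpoint; replace the placeholders
# with your own RunPod API key and endpoint ID.
import runpod

runpod.api_key = "YOUR_RUNPOD_API_KEY"
endpoint = runpod.Endpoint("YOUR_ENDPOINT_ID")

# run_sync blocks until the worker returns (or the timeout expires).
result = endpoint.run_sync({"input": {"prompt": "Why is the sky blue?"}}, timeout=120)
print(result)
```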