Added k8s mnist example using minikube #2323
Conversation
Codecov Report
```
@@           Coverage Diff           @@
##           master    #2323   +/-   ##
=======================================
  Coverage   72.66%   72.66%
=======================================
  Files          78       78
  Lines        3669     3669
  Branches       58       58
=======================================
  Hits         2666     2666
  Misses        999      999
  Partials        4        4
```
```yaml
          path: /host/model_store
      containers:
        - name: torchserve
          image: pytorch/torchserve:latest-cpu
```
We also ship a kserve image, would that be more appropriate to use? Look at Dockerfile here https://github.com/pytorch/serve/tree/master/kubernetes
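For context, the `path: /host/model_store` and `containers` lines under review sit inside a Kubernetes Deployment manifest. A minimal self-contained sketch is shown below; the metadata names, label, replica count, and the container-side `mountPath` are illustrative assumptions rather than the PR's actual file, while ports 8080/8081 are TorchServe's default inference and management ports:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: torchserve            # name assumed for illustration
spec:
  replicas: 1
  selector:
    matchLabels:
      app: torchserve
  template:
    metadata:
      labels:
        app: torchserve
    spec:
      volumes:
        - name: model-store
          hostPath:
            path: /host/model_store     # host directory mounted into the minikube node
      containers:
        - name: torchserve
          image: pytorch/torchserve:latest-cpu
          ports:
            - containerPort: 8080       # inference API (TorchServe default)
            - containerPort: 8081       # management API (TorchServe default)
          volumeMounts:
            - name: model-store
              mountPath: /home/model-server/model-store   # mount path assumed
```

Using a `hostPath` volume like this is convenient for a single-node minikube demo; a multi-node or production cluster would normally use a PersistentVolumeClaim instead.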
@agunapal Please resolve the conflicts for merging the PR

@chauhang Done.

LGTM, can we get another stamp?

@chauhang Please find the logs included in the PR
```yaml
          path: /host/model_store
      containers:
        - name: torchserve
          image: pytorch/torchserve:latest-cpu
```
It would be good to also include a version for GPU. Can we add it as a follow-on PR?
@agunapal Thanks for this example. It would be good to also do a follow-up PR for a GPU configuration, which requires additional settings for the CUDA setup.
Description
Added an example of TorchServe inference using minikube.
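A rough sketch of the workflow such an example typically follows is below. The mount string, manifest file name, model name, and sample image are assumptions for illustration, not taken from this PR; the `/host/model_store` path and default TorchServe port 8080 match the deployment under review.

```shell
# Start minikube and mount a host directory containing the .mar model archive
# (host directory name assumed; the pod sees it at /host/model_store)
minikube start --mount --mount-string="$HOME/model_store:/host/model_store"

# Deploy TorchServe (manifest file name assumed)
kubectl apply -f torchserve-deployment.yaml
kubectl wait --for=condition=available deployment/torchserve --timeout=120s

# Forward TorchServe's default inference port and send a sample MNIST image
# (model name "mnist" and image file "0.png" are assumed)
kubectl port-forward deployment/torchserve 8080:8080 &
curl http://localhost:8080/predictions/mnist -T 0.png
```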
Fixes #(issue)
Type of change
Please delete options that are not relevant.
Feature/Issue validation/testing
minikube.txt
Logs included in the README
Checklist: