README for serving models using TorchServe Docker Container #2118
Conversation
Codecov Report
@@           Coverage Diff           @@
##           master    #2118   +/-   ##
=======================================
  Coverage   53.36%   53.36%
=======================================
  Files          71       71
  Lines        3225     3225
  Branches       56       56
=======================================
  Hits         1721     1721
  Misses       1504     1504
I'd prefer if we also delete the code from https://github.com/pytorch/serve/blob/master/docker/README.md#create-torch-model-archiver-from-container and have a single point of truth that's linked from the main README, not just the examples page.
@msaroufim I'll add a link to the example there. There is probably more than one way to serve a model using Docker, though?
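One such path, for reference, is building the model archive with torch-model-archiver from inside the official image, as the docker README linked above describes. The sketch below is illustrative only; the image tag, file names, and paths are assumptions, not taken from this PR:

```bash
# Assumption: a serialized model (mnist_cnn.pt), a model definition (mnist.py),
# and a handler (mnist_handler.py) exist in the current directory; all names
# here are placeholders.
mkdir -p model-store

docker run --rm \
  -v "$(pwd)":/home/model-server/examples \
  -v "$(pwd)/model-store":/home/model-server/model-store \
  pytorch/torchserve:latest \
  torch-model-archiver \
    --model-name mnist \
    --version 1.0 \
    --model-file /home/model-server/examples/mnist.py \
    --serialized-file /home/model-server/examples/mnist_cnn.pt \
    --handler /home/model-server/examples/mnist_handler.py \
    --export-path /home/model-server/model-store
```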
Description
This PR adds a README with an example of serving models using a TorchServe Docker container.
Fixes #2113
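Roughly, the flow such a README documents looks like the sketch below. The image tag, archive name, and input file are placeholders, not the exact commands from this PR:

```bash
# Start TorchServe from the official image with a local model store mounted at
# the path the image expects, exposing the inference (8080) and management
# (8081) APIs.
docker run --rm -d --name torchserve \
  -p 8080:8080 -p 8081:8081 \
  -v "$(pwd)/model-store":/home/model-server/model-store \
  pytorch/torchserve:latest

# Register the archive through the management API, then send an inference
# request to the prediction endpoint (sample_input.png is a placeholder).
curl -X POST "http://localhost:8081/models?url=mnist.mar&initial_workers=1"
curl http://localhost:8080/predictions/mnist -T sample_input.png
```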
Type of change
New feature: adds example documentation (non-breaking change).
Feature/Issue validation/testing
The outputs included in the README were captured by executing the documented steps.
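As a rough illustration, verifying those steps typically amounts to the standard TorchServe health and management checks below (ports assume the default 8080/8081 mapping used above):

```bash
curl http://localhost:8080/ping      # expected: {"status": "Healthy"}
curl http://localhost:8081/models    # lists the registered models
```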
Checklist: