A guide for loading models in TorchServe #2592
Conversation
On whether the handler has an initialize method, I think it's more nuanced; someone could also be calling super().initialize() from a custom handler (a sketch of that pattern follows below).
The model type node makes a lot of sense.
In the self-contained package node I didn't really understand what you were trying to say, because the serialized-file node intersects with TorchScript and ONNX. The right-hand bottom node should be: is your model large, and do you care about model initialization time?
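To illustrate the first point (an editorial sketch, not code from this PR): a custom handler can define its own initialize and still delegate model loading to BaseHandler by calling super(). The handler name below is hypothetical; only BaseHandler and its initialize(context) method come from TorchServe.

```python
# Minimal sketch of a custom handler that calls super().initialize().
# MyHandler is a hypothetical name; BaseHandler is TorchServe's standard base class.
from ts.torch_handler.base_handler import BaseHandler


class MyHandler(BaseHandler):
    def initialize(self, context):
        # Let BaseHandler locate model_dir, pick the device, and load the
        # serialized model, then add any handler-specific setup afterwards.
        super().initialize(context)
        self.initialized = True
```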
Codecov Report
@@            Coverage Diff            @@
##           master    #2592   +/-   ##
=========================================
  Coverage    70.87%   70.87%
=========================================
  Files           83       83
  Lines         3839     3839
  Branches        58       58
=========================================
  Hits          2721     2721
  Misses        1114     1114
  Partials         4        4
@msaroufim Thanks. Agree with both. Changed both.
@msaroufim Changed the bottom part a bit. If the model is large, we want users not to package the model at all. So I think that should be the starting point.
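To make that concrete (a hedged sketch, not part of this PR): for a very large model the .mar archive can omit the weights entirely, and the handler can pull them from a location known at start-up. The environment variable and default path below are assumptions for illustration; only BaseHandler and the initialize(context) hook are TorchServe APIs.

```python
# Hypothetical sketch: keep large weights outside the .mar and load them at init.
import os

import torch
from ts.torch_handler.base_handler import BaseHandler


class LargeModelHandler(BaseHandler):
    def initialize(self, context):
        self.device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
        # WEIGHTS_PATH is an assumed environment variable pointing at a
        # pre-downloaded checkpoint; it is not part of the TorchServe API.
        weights_path = os.environ.get("WEIGHTS_PATH", "/mnt/models/large_model.pt")
        # Assumes the checkpoint was saved as a full nn.Module via torch.save(model, ...).
        self.model = torch.load(weights_path, map_location=self.device)
        self.model.eval()
        self.initialized = True
```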
Description
This is a guide for loading models in TorchServe.
The main motivation was the note by Hamel Husain: https://hamel.dev/notes/serving/torchserve/
Fixes #(issue)
Type of change
Please delete options that are not relevant.
Feature/Issue validation/testing
N/A
Checklist: