Update instructions to build with nvidia cuda runtime image for ONNX #2435
Conversation
Codecov Report
@@ Coverage Diff @@
## master #2435 +/- ##
=======================================
Coverage 72.66% 72.66%
=======================================
Files 78 78
Lines 3669 3669
Branches 58 58
=======================================
Hits 2666 2666
Misses 999 999
Partials 4 4
I'd really like us to run the regression test inside a freshly built Docker container to make sure this works, instead of relying on logs.
@msaroufim I agree. We have to wait until #2403 is resolved and merged.
@agunapal LGTM, please just fix lint before merging.
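A minimal sketch of what running the regression suite inside a freshly built container could look like; the image tag, mount path, and test entry point are illustrative assumptions, not taken from this PR:

```bash
# Build a fresh GPU image, then run the regression suite inside the container
# instead of relying on host-side logs.
# Image tag and test entry point are illustrative assumptions.
./docker/build_image.sh -g -t torchserve:dev-gpu
docker run --rm --gpus all \
    -v "$(pwd)":/serve -w /serve \
    torchserve:dev-gpu \
    python test/regression_tests.py
```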
Thanks @agunapal for the PR. A few items:
- Can we switch to using CUDA 11.8 as the default?
- Please attach some tests for verification of the two cases: an image with the NVIDIA runtime and an image with the dev build needed for DeepSpeed (see the sketch below)
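A hedged sketch of checks for the two cases, reading "dev build" as the CUDA devel base image (an assumption) and assuming images tagged torchserve:gpu-runtime and torchserve:gpu-devel were built on the runtime and devel bases respectively, with onnxruntime-gpu installed in the former; the tags and package layout are illustrative:

```bash
# Case 1: runtime base image -- ONNX Runtime should see the GPU.
docker run --rm --gpus all torchserve:gpu-runtime \
    python -c "import onnxruntime; print(onnxruntime.get_device())"   # expect: GPU

# Case 2: devel base image -- nvcc must be present, since DeepSpeed
# JIT-compiles its kernels and needs the full CUDA toolkit.
docker run --rm --gpus all torchserve:gpu-devel nvcc --version
```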
Description
TorchServe's GPU Docker image uses the NVIDIA CUDA base image.
Third-party libraries such as ONNX require the NVIDIA CUDA runtime base image to work.
This PR adds a -bi option to the docker build script to specify the base image.

Fixes #(issue)
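For illustration, a build with the new option might look like the following; the -bi/-g/-cv flag combination and the nvidia/cuda image tag are assumptions based on this PR's description, not a verified invocation:

```bash
# Build a GPU image on top of the NVIDIA CUDA *runtime* base image so that
# ONNX Runtime's GPU execution provider finds the CUDA libraries it expects.
./docker/build_image.sh -g -cv cu118 \
    -bi nvidia/cuda:11.8.0-cudnn8-runtime-ubuntu20.04 \
    -t torchserve:gpu-runtime
```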
Type of change
Please delete options that are not relevant.
Feature/Issue validation/testing
Please describe the unit or integration tests that you ran to verify your changes, along with a summary of the relevant results. Provide instructions so they can be reproduced.
Please also list any relevant details for your test configuration.
1_docker-regression (ubuntu-20.04).txt
2_docker-regression (self-hosted, regression-test-gpu).txt
Error message on -bi and -g
Nvidia-runtime
Checklist: