Fix inference command sample in README.md #6868
base: main
Conversation
The inference sample did not work for me. It looks like a line continuation was missing from the inference command.
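The failure mode can be sketched as follows. This is a hypothetical example using `printf`; the actual README command is a docker invocation and is not reproduced in this thread:

```shell
# Hypothetical sketch of the missing-line-continuation problem;
# the real README uses a docker command, not printf.

# With a trailing backslash, the shell joins both lines into a single command:
printf '%s\n' \
    "one command"

# Without the backslash, the shell runs the first line on its own and then
# tries to execute the second line as a separate command, which fails:
#   printf '%s\n'
#   "one command"        # error: command not found
```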
Update README.md
CLA emailed to triton-cla@nvidia.com
Tested and confirmed fix:
Hey Jason, thanks for the contribution! Similar to step 1, I believe those are intended to be two separate commands, not a single one. If you copy and paste them individually, the sample should work. The difference is whether or not you'll stay in the container interactively after the command finishes. Are you seeing otherwise?
Got it. It wasn't immediately clear to me that the intent was to launch the container and then paste the second command inside it. I copied and pasted both lines, and the second line didn't run automatically for some reason. IMO, unless you have a reason to keep the user in the container, it's a little cleaner to make it one line, but I'll defer to you. Oh, I suppose if it's decided to make it one line you could remove the
CLA is approved. @jasoncwik can you rebase as well?
The inference sample in README.md did not work for me and I noticed there was a missing line continuation in the command line.