
api_server.py: error: unrecognized arguments: --tool-use-prompt-template --enable-api-tools --enable-auto-tool-choice #5730

Closed
lk1983823 opened this issue Jun 21, 2024 · 4 comments
Labels: stale, usage (How to use vllm)

Comments


lk1983823 commented Jun 21, 2024

My vLLM version is 0.5.0.post1. I want to enable function calling for the Qwen2-7B-Instruct model, so following issue #5649 I ran the command

python -m vllm.entrypoints.openai.api_server --model /home/asus/autodl-tmp/qwen/Qwen2-7B-Instruct --tool-use-prompt-template /home/asus/autodl-tmp/examples/chatml.jinja --enable-api-tools --enable-auto-tool-choice

But it fails with: api_server.py: error: unrecognized arguments: --tool-use-prompt-template --enable-api-tools --enable-auto-tool-choice
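For context, this error message comes from Python's argparse: the server's parser in the installed release simply does not define those flags, so argparse rejects them. A minimal sketch of that behavior (the parser and flag handling here are hypothetical, mirroring only the flag names from the error; they are not vLLM's actual argument definitions):

```python
import argparse

# A toy parser standing in for an api_server.py build that predates
# the tool-calling flags: it only knows --model.
parser = argparse.ArgumentParser(prog="api_server.py")
parser.add_argument("--model")

argv = [
    "--model", "Qwen2-7B-Instruct",
    "--tool-use-prompt-template", "chatml.jinja",
    "--enable-api-tools",
    "--enable-auto-tool-choice",
]

# parse_args(argv) would exit with
#   api_server.py: error: unrecognized arguments: ...
# parse_known_args() instead collects the unknown flags so we can inspect them.
args, unknown = parser.parse_known_args(argv)
print(unknown)  # the three tool-calling flags (plus the template value)
```

Running `python -m vllm.entrypoints.openai.api_server --help` against the installed version is the quickest way to confirm which flags that release actually accepts.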


@lk1983823 lk1983823 added the usage How to use vllm label Jun 21, 2024
K-Mistele (Contributor) commented:

#5649 is a draft pull request. It is a work in progress and has not been merged into vLLM's codebase. Any features present in that branch, whether in progress or complete, will not be available in a vLLM release unless/until the pull request is approved and merged from the Constellate AI fork into the vLLM project's codebase.

K-Mistele (Contributor) commented:

@lk1983823 please feel free to close this issue :)


This issue has been automatically marked as stale because it has not had any activity within 90 days. It will be automatically closed if no further activity occurs within 30 days. Leave a comment if you feel this issue should remain open. Thank you!

@github-actions github-actions bot added the stale label Oct 25, 2024

This issue has been automatically closed due to inactivity. Please feel free to reopen if you feel it is still relevant. Thank you!

@github-actions github-actions bot closed this as not planned Won't fix, can't repro, duplicate, stale Nov 25, 2024