[Bug]: vllm0.5.5 Ignores VLLM_USE_MODELSCOPE=True and Accesses huggingface.co #7986
Closed
Labels: bug
Your current environment
The output of `python collect_env.py`
🐛 Describe the bug
After running
docker run --runtime nvidia --gpus all -v cache/modelscope:/root/.cache/modelscope --env "VLLM_USE_MODELSCOPE=True" -p 8000:8000 --ipc host -d --name vllm vllm/vllm-openai:v0.5.5 --model LLM-Research/Meta-Llama-3.1-8B-Instruct --trust-remote-code -tp 4
the container exits shortly after startup. The logs show that vLLM still tries to download the model from huggingface.co instead of ModelScope, even though VLLM_USE_MODELSCOPE=True is set.
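As a stopgap (not part of the original report), pre-fetching the weights through the ModelScope SDK on the host and pointing `--model` at the resulting local directory avoids the huggingface.co lookup entirely. A minimal sketch, assuming the `modelscope` package is installed:

```python
# Hedged workaround sketch: download the model via ModelScope on the host,
# then mount the cache into the container and pass the local path to --model
# so vLLM never needs to contact huggingface.co.
from modelscope import snapshot_download

# Downloads into ~/.cache/modelscope by default and returns the local
# directory; subsequent calls reuse the cached copy.
local_dir = snapshot_download("LLM-Research/Meta-Llama-3.1-8B-Instruct")
print(local_dir)  # mount this path and use it as the --model argument
```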
The likely cause is that in version 0.5.5 the config-loading code at
https://github.com/vllm-project/vllm/blob/main/vllm/transformers_utils/config.py#L113
always goes through transformers (and therefore huggingface.co), without checking VLLM_USE_MODELSCOPE.
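A minimal sketch of the kind of check the config loader would need, assuming the model id is first resolved to a local ModelScope cache directory before being handed to transformers (the function name and structure below are hypothetical, not vLLM's actual code):

```python
import os

from transformers import AutoConfig, PretrainedConfig


def load_config_modelscope_aware(model: str,
                                 trust_remote_code: bool = False) -> PretrainedConfig:
    """Hypothetical helper: honor VLLM_USE_MODELSCOPE when loading a config."""
    if os.environ.get("VLLM_USE_MODELSCOPE", "False").lower() == "true":
        # Resolve the repo id to a local ModelScope cache directory so the
        # subsequent transformers call is purely local.
        from modelscope import snapshot_download
        model = snapshot_download(model)
    # For a local directory, AutoConfig does not contact huggingface.co.
    return AutoConfig.from_pretrained(model, trust_remote_code=trust_remote_code)
```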