
[Bug] Error when using VLLM with knowledge base.:NotImplementedError #1045

Closed
elepherai opened this issue Jan 8, 2024 · 0 comments · Fixed by #1050

Comments

@elepherai

Through debugging I found that the token-counting part of the vLLM adapter is not implemented. I changed the call to `_try_to_count_token(prompt, self.tokenizer, self.model)`, which returns token=-1, but since `self.tokenizer` and `self.model` are both `None`, it raises an error there as well. How can this be resolved?
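A minimal sketch of the defensive fallback being described: count tokens when a tokenizer is available, otherwise return -1 instead of raising `NotImplementedError`. The function name and signature mirror the `_try_to_count_token` call mentioned above, but this is illustrative, not the project's actual implementation:

```python
# Hypothetical sketch, not DB-GPT's real code: a token counter that
# degrades gracefully when the vLLM worker exposes no tokenizer.
def try_to_count_token(prompt, tokenizer=None, model=None):
    """Return the token count for `prompt`, or -1 if counting is unsupported."""
    if tokenizer is not None:
        try:
            # Assumes a Hugging Face-style tokenizer: calling it returns a
            # dict-like object with an "input_ids" list.
            return len(tokenizer(prompt).get("input_ids", []))
        except Exception:
            return -1
    # No tokenizer (the situation reported above, where self.tokenizer and
    # self.model are both None): fall back to -1 instead of raising.
    return -1
```

Callers then have to treat -1 as "unknown length" rather than a real count, e.g. by skipping prompt-truncation logic when the count is negative.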

@Aries-ckt changed the title from "vllm与知识库配合使用时出错:NotImplementedError" (Chinese for "Error when using vllm together with the knowledge base: NotImplementedError") to "[Bug] Error when using VLLM with knowledge base.:NotImplementedError" on Feb 13, 2024