Support internvl2 chat template #1911
Conversation
Please fix the UT (unit test) error.

@zhulinJulia24 may update TC (the test cases).
Hi, I feel that it may not be necessary to implement these three chat templates separately. For InternVL2-2B/8B/26B, we can simply use the chat template of internlm2-chat; the main difference is the system prompt. I used the following code to run the InternVL2 model, and it seems to work well based on my tests.

```python
from lmdeploy import pipeline, TurbomindEngineConfig, ChatTemplateConfig
from lmdeploy.vl import load_image

model = 'OpenGVLab/InternVL2-2B'
# Translation: "I am InternVL (书生·万象), a multimodal foundation model jointly
# developed by Shanghai AI Laboratory and multiple partner institutions.
# The AI Laboratory is committed to original technological innovation and to
# open source and open sharing, promoting scientific and industrial progress."
system_prompt = '我是书生·万象,英文名是InternVL,是由上海人工智能实验室及多家合作单位联合开发的多模态基础模型。人工智能实验室致力于原始技术创新,开源开放,共享共创,推动科技进步和产业发展。'

image = load_image('https://raw.githubusercontent.com/open-mmlab/mmdeploy/main/tests/data/tiger.jpeg')

# Reuse the internlm2-chat template, overriding only the system prompt.
chat_template_config = ChatTemplateConfig('internlm2-chat')
chat_template_config.meta_instruction = system_prompt

pipe = pipeline(model, chat_template_config=chat_template_config,
                backend_config=TurbomindEngineConfig(session_len=8192))
response = pipe(('describe this image', image))
print(response.text)
```
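For context, the internlm2-chat template that the snippet above reuses wraps each turn in `<|im_start|>`/`<|im_end|>` tags. The sketch below is illustrative only (the authoritative template lives inside lmdeploy's chat-template registry; `render_internlm2_chat` is a hypothetical helper, not an lmdeploy API), but it shows why only the system prompt needs to change for InternVL2-2B/8B/26B:

```python
def render_internlm2_chat(system: str, user: str) -> str:
    """Sketch of the internlm2-chat prompt layout: a system turn and a user
    turn, each delimited by <|im_start|>/<|im_end|>, then an open assistant
    turn for the model to complete."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = render_internlm2_chat("You are InternVL.", "describe this image")
```

Swapping in the InternVL2 system prompt via `meta_instruction` changes only the first turn; the surrounding control tokens stay identical to internlm2-chat.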
For InternVL2-4B, can the Phi3 chat template be used directly? I see there is an implementation here.
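For comparison, Phi-3 uses a different set of control tokens from internlm2-chat, which is why InternVL2-4B (built on Phi-3) needs a different template. A simplified sketch, with token names taken from the public Phi-3 model card (`render_phi3_chat` is a hypothetical helper for illustration, not an lmdeploy API):

```python
def render_phi3_chat(system: str, user: str) -> str:
    """Sketch of the Phi-3 prompt layout: role tags <|system|>, <|user|>,
    <|assistant|>, with each message terminated by <|end|>."""
    return (
        f"<|system|>\n{system}<|end|>\n"
        f"<|user|>\n{user}<|end|>\n"
        f"<|assistant|>\n"
    )

prompt = render_phi3_chat("You are InternVL.", "describe this image")
```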
Hi @czczup, thanks for the suggestion.