fix: system prompt err when using o1 models
binary-husky committed Sep 14, 2024
1 parent 18290fd commit 597c320
Showing 2 changed files with 2 additions and 2 deletions.
2 changes: 1 addition & 1 deletion core_functional.py
@@ -17,7 +17,7 @@ def get_core_functions():
 text_show_english=
 r"Below is a paragraph from an academic paper. Polish the writing to meet the academic style, "
 r"improve the spelling, grammar, clarity, concision and overall readability. When necessary, rewrite the whole sentence. "
-r"Firstly, you should provide the polished paragraph. "
+r"Firstly, you should provide the polished paragraph (in English). "
 r"Secondly, you should list all your modification and explain the reasons to do so in markdown table.",
 text_show_chinese=
 r"作为一名中文学术论文写作改进助理,你的任务是改进所提供文本的拼写、语法、清晰、简洁和整体可读性,"
2 changes: 1 addition & 1 deletion request_llms/bridge_chatgpt.py
@@ -447,7 +447,7 @@ def generate_payload(inputs:str, llm_kwargs:dict, history:list, system_prompt:st
 openai_disable_system_prompt = model_info[llm_kwargs['llm_model']].get('openai_disable_system_prompt', False)

 if openai_disable_system_prompt:
-    messages = []
+    messages = [{"role": "user", "content": system_prompt}]
 else:
     messages = [{"role": "system", "content": system_prompt}]
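The change above can be sketched in isolation: when a model is flagged as not accepting the "system" role (as with OpenAI's o1-series models), the system prompt is re-sent as a user message instead of being dropped. This is a minimal standalone sketch, not the repository's actual code; `MODEL_INFO` and `build_messages` are hypothetical stand-ins for the project's `model_info` table and `generate_payload` function.

```python
# Hypothetical stand-in for the repo's model_info registry.
MODEL_INFO = {
    "o1-preview": {"openai_disable_system_prompt": True},
    "gpt-4": {"openai_disable_system_prompt": False},
}

def build_messages(model: str, system_prompt: str, user_input: str) -> list:
    """Build a chat message list, working around models that reject the system role."""
    disable_system = MODEL_INFO.get(model, {}).get("openai_disable_system_prompt", False)
    if disable_system:
        # Before this fix the branch produced an empty list, silently
        # discarding the system prompt; now it is delivered as a user turn.
        messages = [{"role": "user", "content": system_prompt}]
    else:
        messages = [{"role": "system", "content": system_prompt}]
    messages.append({"role": "user", "content": user_input})
    return messages
```

With this change the system prompt still reaches the model on o1-series requests, just under a role the API accepts.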

