Chain builder API not working with Claude? #137
@prabirshrestha is there an idiomatic way to include a system prompt with Claude using just the llm API? I am struggling with the chain builder API. I'm getting:
Answered by Abraxas-365 on Apr 30, 2024
Hi @BWStearns, I think you are passing the whole "bunch of instructions: {{input}}" string as the input variable, when you should only pass the input value itself and keep the instructions in the prompt template. Example:

```rust
// The instructions live in the system message and the template;
// the template only needs the {{input}} placeholder.
let spelling_and_grammar_prompt = "Is Lima the capital of: {{input}}";

let full_prompt = message_formatter![
    fmt_message!(Message::new_system_message(
        "Reply with a boolean: 'true' or 'false'."
    )),
    fmt_template!(HumanMessagePromptTemplate::new(template_jinja2!(
        spelling_and_grammar_prompt,
        "input"
    )))
];

// `claude_options` is assumed to be defined elsewhere in your code.
let claude = LLMChainBuilder::new()
    .prompt(full_prompt)
    .llm(
        claude::Claude::default()
            .with_options(claude_options)
            .with_model("claude-3-haiku-20240307"),
    )
    .build()
    .unwrap();

// Bind only the raw value to the "input" template variable.
let input_variables = prompt_args! {
    "input" => "Peru"
};

let spelling_grammar_resp = claude.invoke(input_variables).await;
```
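To make the fix concrete, here is a minimal sketch of the contrast as I read it (the binding names `whole_instruction_string` and `just_the_value` are only illustrative, and it reuses the same `prompt_args!` macro as above):

```rust
// What the question appears to be doing: binding an instruction-laden
// string to the template variable.
let whole_instruction_string = prompt_args! {
    "input" => "bunch of instructions: Peru"
};

// What the answer suggests: only the bare value, since the instructions
// already live in the system message and the template.
let just_the_value = prompt_args! {
    "input" => "Peru"
};
```

Because the system message is attached inside `message_formatter!`, it should be included on every chain invocation without being concatenated into the user input.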