
Extra backslash character at the end of cURL command #7608

Closed
2 tasks done
lfavreli opened this issue Jul 2, 2024 · 1 comment · Fixed by #7676
lfavreli commented Jul 2, 2024

Where is the problem?

https://docs.konghq.com/hub/kong-inc/ai-proxy/how-to/llm-provider-integration-guides/llama2/

What happened?

There is an extra backslash character ("\") at the end of the provided cURL command. This backslash makes the shell treat the command as incomplete, so it waits for additional input instead of executing the request.

curl -X POST http://localhost:8001/routes/mistral-chat/plugins \
  --data "name=ai-proxy" \
  --data "config.route_type=llm/v1/chat" \
  --data "config.auth.header_name=Authorization" \
  --data "config.auth.header_value=Bearer <MISTRAL_AI_KEY>" \
  --data "config.model.provider=mistral" \
  --data "config.model.name=mistral-tiny" \
  --data "config.model.options.mistral_format=openai" \
  --data "config.model.options.upstream_url=https://api.mistral.ai/v1/chat/completions" \ 

The error appears on both URLs:

  • https://docs.konghq.com/hub/kong-inc/ai-proxy/how-to/llm-provider-integration-guides/llama2/
  • https://docs.konghq.com/hub/kong-inc/ai-proxy/how-to/llm-provider-integration-guides/mistral/

What did you expect to happen?

The command should execute successfully and add the plugin configuration. The fix is to delete the trailing backslash from the end of the command.
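For reference, the corrected command would look as follows (the same command as above with the final backslash removed); <MISTRAL_AI_KEY> is a placeholder that still needs to be replaced with a real key:

curl -X POST http://localhost:8001/routes/mistral-chat/plugins \
  --data "name=ai-proxy" \
  --data "config.route_type=llm/v1/chat" \
  --data "config.auth.header_name=Authorization" \
  --data "config.auth.header_value=Bearer <MISTRAL_AI_KEY>" \
  --data "config.model.provider=mistral" \
  --data "config.model.name=mistral-tiny" \
  --data "config.model.options.mistral_format=openai" \
  --data "config.model.options.upstream_url=https://api.mistral.ai/v1/chat/completions"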

Code of Conduct and Community Expectations

  • I agree to follow this project's Code of Conduct
  • I agree to abide by the Community Expectations
lfavreli (Author) commented

Hello @lena-larionova,

I can't reopen the ticket, but the issue isn't fully resolved.
The fix was not applied to the second link:

https://docs.konghq.com/hub/kong-inc/ai-proxy/how-to/llm-provider-integration-guides/mistral/

The GH-7676 pull request doesn't include the fix for that page.


PS: A few words, or even a little emoji, acknowledging the contribution would have been appreciated... ;-)
