LocalAI version: 1.9.1 (helm)
Environment, CPU architecture, OS, and Version: RKE2, Linux local-ai-********-**** 5.4.0-121-generic #137-Ubuntu SMP 2022 x86_64 GNU/Linux
Describe the bug
In streaming mode, the returned JSON is not compatible with OpenAI's, which breaks libraries that depend on those fields (like langchain.js).
To Reproduce
Make a request to /chat/completions or /completions with streaming enabled.
Expected behavior
JSON responses should be the same as OpenAI's.
With the current langchain.js (0.0.96):
For completions, I get errors because "index" is missing from "choices".
For chat completions, "delta" is missing from "choices".
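For comparison, streamed chunks from the actual OpenAI API carry both of those fields. Roughly (shapes as OpenAI returned them at the time; the id/created values below are just placeholders):

A /completions stream chunk:

{
  "id": "cmpl-xxxxxxxx",
  "object": "text_completion",
  "created": 1687000000,
  "model": "text-ada-001",
  "choices": [
    { "text": "Say", "index": 0, "logprobs": null, "finish_reason": null }
  ]
}

A /chat/completions stream chunk:

{
  "id": "chatcmpl-xxxxxxxx",
  "object": "chat.completion.chunk",
  "created": 1687000000,
  "model": "gpt-3.5-turbo",
  "choices": [
    { "index": 0, "delta": { "content": "I" }, "finish_reason": null }
  ]
}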
Code for reproducing
I switch between LocalAI and OpenAI by toggling the commented lines.
Completions:
import openai

# openai.api_base = "http://localhost:80/v1"
openai.api_key = "your-key"

prompt = "Say the sentence: Test it's testing, hello!"
model = "text-ada-001"
# model = "ggml-gpt4all-j-v1.3-groovy.bin"

# Make the API request
response = openai.Completion.create(
    model=model,
    prompt=prompt,
    max_tokens=50,
    temperature=0.28,
    top_p=0.95,
    n=1,
    echo=True,
    stream=True
)

for chunk in response:
    print(chunk)
print(response)
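To surface the mismatch without langchain.js, a quick sanity check like the sketch below can be appended to the script above; it only assumes the openai 0.x SDK's dict-like chunk objects and flags any streamed choice that lacks the "index" field langchain.js reads:

# Sanity check: flag streamed completion choices missing "index".
# Reuses model/prompt from the snippet above.
response = openai.Completion.create(
    model=model,
    prompt=prompt,
    max_tokens=50,
    stream=True
)
for chunk in response:
    for choice in chunk["choices"]:
        if "index" not in choice:
            print("missing 'index' in choice:", choice)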
Chat completions:
import openai

# openai.api_base = "http://localhost:80/v1"
openai.api_key = "your-key"

model = "gpt-3.5-turbo"
# model = "ggml-gpt4all-j-v1.3-groovy.bin"

response = openai.ChatCompletion.create(
    model=model,
    messages=[
        {'role': 'user', 'content': "Write: I love peanutseries"}
    ],
    temperature=0,
    stream=True
)

for chunk in response:
    print(chunk)
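The same kind of check for chat completions (again just a sketch reusing the variables above, assuming dict-like chunk objects) flags any streamed choice that lacks the "delta" field langchain.js reads:

# Sanity check: flag streamed chat completion choices missing "delta".
response = openai.ChatCompletion.create(
    model=model,
    messages=[{'role': 'user', 'content': "Write: I love peanutseries"}],
    temperature=0,
    stream=True
)
for chunk in response:
    for choice in chunk["choices"]:
        if "delta" not in choice:
            print("missing 'delta' in choice:", choice)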
P.S.
Thanks for the great work, guys! You are doing an absolutely wonderful thing, keep it up!