v1.0.0 Beta #631
32 comments · 70 replies
-
What would be the recommended way to query OAI models on Azure?
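A minimal sketch of Azure usage with the 1.x AzureOpenAI client; the endpoint, API version, key, and deployment name below are placeholders, so treat this as an assumption-laden example rather than official guidance:

from openai import AzureOpenAI

# Placeholder Azure resource details; substitute your own endpoint, API version, and key.
client = AzureOpenAI(
    azure_endpoint="https://my-resource.openai.azure.com",
    api_version="2023-07-01-preview",
    api_key="...",  # or set the AZURE_OPENAI_API_KEY environment variable
)

completion = client.chat.completions.create(
    model="my-gpt-35-deployment",  # the Azure deployment name, not the underlying model name
    messages=[{"role": "user", "content": "Hello!"}],
)
print(completion.choices[0].message.content)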
-
Are you going to add the ability to make concurrent calls to the API through the client?
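For what it's worth, a minimal sketch of concurrent calls through the async client using asyncio.gather; the model and prompts are just examples:

import asyncio
from openai import AsyncOpenAI

client = AsyncOpenAI()  # reads OPENAI_API_KEY from the environment

async def ask(prompt: str) -> str:
    completion = await client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
    )
    return completion.choices[0].message.content

async def main() -> None:
    # Several requests in flight at once over the same client and connection pool.
    answers = await asyncio.gather(*(ask(p) for p in ["Hi", "Tell me a joke", "Ping"]))
    print(answers)

asyncio.run(main())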
-
@hallacy This is great, great news! Quick question: would issue #387, relating to the discrepancy in request timeout settings between async/sync calls, be fixed now?
-
Would we be able to get a changelog or similar so that we may test certain features/changes? I'm particularly concerned about changes relating to:
-
@hwchase17 @hinthornw @rlancemartin Apologies if this is noise, but I think this is a core change that fixes many current issues with the openai client, and it will need an effort to upgrade/test within langchain.
-
"You can now instantiate a client" this is really helpful, love it and wanna say BIG THANK YOU. And wondering if there is approach or plan for the Azure support? I think more developers will need to customize based on the new library version. This is great for the diversity of the open-source community, but at the same time, there's a lot of redundant work, and the quality can't be guaranteed. I'm wondering if it would be better for there to be official unified support? |
Beta Was this translation helpful? Give feedback.
-
If you are removing support for Azure, will there be another library to work with Azure?
-
Great work on the new Python SDK! We have an observability product (Gentrace) that ties into the SDK's resources. Would it be possible to export the asynchronous versions of these resources? Alternatively, is there a way to get a direct handle to the async class definitions?
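One pattern that may help for instrumentation, sketched under the assumption that the internal openai.resources layout stays importable (it is not a guaranteed public path and may move between releases):

from openai import AsyncOpenAI
from openai.resources.chat import AsyncCompletions  # internal layout; may change between versions

# Wrap the async create method for tracing/observability purposes.
_original_create = AsyncCompletions.create

async def traced_create(self, *args, **kwargs):
    print("chat.completions.create called with model:", kwargs.get("model"))
    return await _original_create(self, *args, **kwargs)

AsyncCompletions.create = traced_create

client = AsyncOpenAI()
# client.chat.completions is an AsyncCompletions instance, so its calls now go through the wrapper.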
-
Is it possible to get the rate limit information from the headers in this client? https://platform.openai.com/docs/guides/rate-limits/overview In the old client, it was possible to get headers from non-streaming requests using the below (hacky, but possible).
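In the 1.x client the headers are exposed through the with_raw_response wrapper rather than a hack; a minimal sketch (the header names come from the rate-limit docs linked above):

from openai import OpenAI

client = OpenAI()

# .with_raw_response returns the HTTP response, so headers are accessible;
# .parse() then yields the usual typed completion object.
response = client.chat.completions.with_raw_response.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.headers.get("x-ratelimit-remaining-requests"))
print(response.headers.get("x-ratelimit-remaining-tokens"))

completion = response.parse()
print(completion.choices[0].message.content)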
-
Is there an ETA for the release?
-
@hallacy, Morgan from Weights & Biases here. We're happy to jump in and contribute on the W&B migration if you need us.
-
Just tried out the new SDK, happy to see the move over to a Client() for configuration! Is the SDK's design stable enough to be shown in videos? I was going to record a video using the OpenAI SDK, and I figure I might as well use the new one if it's ready enough.
-
Started a conversion of the chatlab library here: rgbkrk/chatlab#96. Let me know if you see any pitfalls related to our usage there. I am really glad to toss the hand-crafted types I put in for the previous version of this API.
-
Hi, sorry to bother you. Is there an estimated "official" launch date for this? I have some old code, Kudasai, running off the old format, and it needs to keep working on a major project before November. Would this release break my old code? I'm not entirely familiar with how these things work; apologies, and thanks in advance for any answer.
-
Any major downsides to sticking with "import openai" instead of "from openai import OpenAI"? Is it a much heavier import, or is the difference negligible?
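Both forms import the same package, so the import cost should be essentially identical; the practical difference is module-level global state versus an explicit client instance. A small sketch of the two styles (keys are placeholders), assuming the module-level client that 1.x keeps for convenience:

# Style 1: module-level client with global configuration.
import openai

openai.api_key = "sk-..."  # placeholder key
resp = openai.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello!"}],
)

# Style 2: explicit client instance, easier to configure per use (timeouts, base_url, etc.).
from openai import OpenAI

client = OpenAI(api_key="sk-...")  # placeholder key
resp = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello!"}],
)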
-
Question on Audio, regarding openai-python/src/openai/_types.py, lines 37 to 47 in 34e0d94: I can pass … Should …?
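In case it helps, a hedged sketch of two file shapes the FileTypes union appears to accept for Audio calls; the file names are placeholders and this is my reading of _types.py rather than official guidance:

from pathlib import Path
from openai import OpenAI

client = OpenAI()

# A plain binary file handle works:
with open("speech.mp3", "rb") as f:
    transcript = client.audio.transcriptions.create(model="whisper-1", file=f)

# A (filename, contents) tuple also matches the FileTypes union,
# which is useful when the audio only exists in memory:
audio_bytes = Path("speech.mp3").read_bytes()
transcript = client.audio.transcriptions.create(
    model="whisper-1",
    file=("speech.mp3", audio_bytes),
)
print(transcript.text)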
-
Does it support multimodal models? If not, is there a plan to add that?
-
Hey, nice job with the release. Unfortunately, I still can't maintain an open connection between requests; am I missing something? I'm attempting to instantiate the client on FastAPI app startup and use dependency injection to reuse it in all my routes. I've tried defining it as a separate global variable too. Apologies in advance if I'm doing something stupid.
Tom
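One pattern that should keep a single connection pool alive across requests is creating the client in FastAPI's lifespan hook and injecting it into routes; a sketch with illustrative names, not a definitive recipe:

from contextlib import asynccontextmanager

from fastapi import Depends, FastAPI, Request
from openai import AsyncOpenAI

@asynccontextmanager
async def lifespan(app: FastAPI):
    # One client, and therefore one httpx connection pool, for the app's lifetime.
    app.state.openai = AsyncOpenAI()
    yield
    await app.state.openai.close()

app = FastAPI(lifespan=lifespan)

def get_openai(request: Request) -> AsyncOpenAI:
    return request.app.state.openai

@app.get("/ask")
async def ask(q: str, client: AsyncOpenAI = Depends(get_openai)):
    completion = await client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": q}],
    )
    return {"answer": completion.choices[0].message.content}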
-
Heads up,
Error is:
It occurs at line 192 of
-
Added an issue/question regarding dropping
-
Good Work
-
I am unable to use the response_format: { "type": "json_object" } argument. Is it being passed in to …? It tells me …
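A minimal sketch of how the argument is passed in 1.x; note that JSON mode is only accepted by the newer models (e.g. gpt-4-1106-preview, gpt-3.5-turbo-1106) and the API requires the word "JSON" to appear in the messages, so errors can also come from the model choice or an older SDK version:

from openai import OpenAI

client = OpenAI()

completion = client.chat.completions.create(
    model="gpt-4-1106-preview",  # JSON mode requires one of the newer models
    response_format={"type": "json_object"},
    messages=[
        # The API requires "JSON" to be mentioned somewhere in the messages.
        {"role": "system", "content": "You are a helpful assistant. Reply in JSON."},
        {"role": "user", "content": "List three primary colors."},
    ],
)
print(completion.choices[0].message.content)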
-
JSON response works fine now, but I'm curious: is there no way to provide a template for the response? In other words, do I still have to provide the JSON structure in the prompt and hope for the best?
-
Context: https://platform.openai.com/docs/guides/safety-best-practices/end-user-ids To my understanding this used to be "name" but is now "user". However, when doing chat completion with models like gpt-3.5-turbo-16k and gpt-4 you still need to use "name", and trying to use "user" with new models like gpt-4-1106-preview results in an error. Also, between gpt-4-1106-preview and gpt-4-vision-preview, only the former can access the information within "name": when I put some text in the "name" field and tell gpt-4-vision-preview to recite it, it can't. All other models I've tested can do this.
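For reference, the Chat Completions API treats these as two separate fields: user is a top-level request parameter intended for abuse monitoring, while name is an optional per-message field the model can see. A minimal sketch showing both (values are placeholders):

from openai import OpenAI

client = OpenAI()

completion = client.chat.completions.create(
    model="gpt-4-1106-preview",
    # Top-level "user": an end-user identifier for abuse monitoring.
    user="end-user-1234",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        # Per-message "name": an optional participant name attached to the message.
        {"role": "user", "name": "alice", "content": "What name is attached to this message?"},
    ],
)
print(completion.choices[0].message.content)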
-
What is the new name for the old ServiceUnavailableError?
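A sketch of handling 5xx errors in 1.x, assuming the documented exception hierarchy in which HTTP 5xx responses raise openai.InternalServerError (a subclass of openai.APIStatusError), which is the closest replacement for the old ServiceUnavailableError:

import openai
from openai import OpenAI

client = OpenAI()

try:
    client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "Hello!"}],
    )
except openai.InternalServerError as e:
    # Raised for 5xx responses, which covers the old 503 "service unavailable" case.
    print("Server-side error:", e.status_code)
except openai.APIStatusError as e:
    # Any other non-success HTTP status.
    print("API returned status", e.status_code)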
-
I am facing persistent SSL errors as of 1.0.0. On 0.28.1 this works successfully (with the OPENAI_API_KEY environment variable set):
On 1.0.0 the equivalent code gives me an error:
I am running this through a corporate network and have faced SSL errors before, which were previously resolved by various methods:
While these methods worked on previous versions, they no longer do as of 1.0.0. Is there a new recommended workaround when working on company-owned machines/networks?
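One workaround that should still be available in 1.x is supplying a custom httpx client, since the SDK is built on httpx; a hedged sketch using a corporate CA bundle (the path is a placeholder, and disabling verification is shown only as a last resort):

import httpx
from openai import OpenAI

# Point httpx at the corporate CA bundle (placeholder path) instead of the default trust store.
http_client = httpx.Client(verify="/etc/ssl/certs/corp-ca-bundle.pem")
client = OpenAI(http_client=http_client)

# Last resort, on a trusted internal network only: skip certificate verification entirely.
# insecure_client = OpenAI(http_client=httpx.Client(verify=False))

completion = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(completion.choices[0].message.content)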
-
Hey hey :) Reproducible by simply:
(missing some imports there for firebase functions and emulators) Downgrading to v0.28.1 fixes it.
-
I think this kind of SDK design is very beautiful. Can you share how to design an SDK like openai's, or point to any relevant write-ups? I need to develop a client soon as well, and I'd like to refer to the openai SDK's design.
-
from openai import OpenAI
client = OpenAI()
OpenAI.api_key = 'my_api'
completion = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"}
    ]
)
print(completion.choices[0].message)
Error: I'm getting a bunch of errors, and it also says the API key is incorrect, but it's not. I am a newbie +_+ please help with my silly mistake by providing the correct code and solution, peace +__+
-
Try passing api_key as an argument to OpenAI() on your second line, and delete the third.
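In other words, a minimal sketch of the corrected snippet (the key string is a placeholder; setting the OPENAI_API_KEY environment variable and calling OpenAI() with no arguments also works):

from openai import OpenAI

client = OpenAI(api_key="my_api")  # placeholder key string

completion = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
)
print(completion.choices[0].message)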
-
Important: v1.0.0 is live! Get started by following the migration guide here: #742