Jupyter_ai for Azure OpenAI throws 'InternalServerError' for all chat responses #1208
Comments
@eazuman "Internal server error" means that OpenAI's servers cannot handle the request right now due to an error on their side. In other words, this error is raised by OpenAI's backend servers, not by Jupyter AI. If you retry later today, you should hopefully be able to consistently receive good replies. I would also recommend trying again with the latest version of Jupyter AI (v2.29.0).
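For context, the `openai` client already retries transient 5xx responses before surfacing `InternalServerError` (visible as the repeated `_retry_request` frames in the traceback in this issue). Here is a minimal sketch of that retry-with-backoff idea in plain Python, using a stand-in exception rather than the real `openai.InternalServerError`:

```python
import time

class TransientServerError(Exception):
    """Stand-in for a retryable 5xx error such as openai.InternalServerError."""

def call_with_retries(fn, max_retries=2, base_delay=0.01):
    """Call fn(), retrying up to max_retries times with exponential backoff."""
    for attempt in range(max_retries + 1):
        try:
            return fn()
        except TransientServerError:
            if attempt == max_retries:
                raise  # retries exhausted: surface the error to the caller
            time.sleep(base_delay * (2 ** attempt))

# A fake call that fails twice, then succeeds on the third attempt.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise TransientServerError()
    return "ok"

print(call_with_retries(flaky))  # prints "ok"
```

If the error persists across many retries and days, it points at a configuration or version problem rather than a transient outage.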
@dlqqq thanks for your response! As I said above, I got it working with Jupyter AI version 2.11, but not with the latest version.
@eazuman This is very strange. It is possible that this is a bug with the …
@eazuman Could you try using the …
Thank you @dlqqq for the suggestion!
Additionally, I tested other package versions and found:

- 2.14.0: …
- 2.17.0 / 2.18.0: Error, even though I have passed `api_version` in the settings; I also tried setting it as an environment variable
- 2.19.0 - 2.29.0: …

Do you have any thoughts on other factors that could be causing these issues, or any specific package versions? Thanks!!
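For what it's worth, when `api_version` is supplied via environment variables, the Azure OpenAI integration in `langchain-openai` conventionally reads the names below. The endpoint and version strings here are placeholders, not values verified against this deployment:

```python
import os

# Placeholder values: substitute your resource endpoint, key, and an
# api_version string that your Azure deployment actually supports.
os.environ["AZURE_OPENAI_ENDPOINT"] = "https://my-resource.openai.azure.com"  # hypothetical
os.environ["AZURE_OPENAI_API_KEY"] = "<your-key>"
os.environ["OPENAI_API_VERSION"] = "2024-02-01"  # example version string
```

A mismatch between the `api_version` the client sends and what the deployment supports is one common cause of opaque server-side errors.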
I’m still trying a few other things and wanted to check with you whether there are any package dependencies we should consider pinning. As I mentioned, we’re using … When we moved to … For this specific version of …

I just want to make sure that, in trying a few options to troubleshoot the issue, I’m not missing something. Thanks for the help!
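When bisecting across versions like this, it helps to capture the exact versions of the neighboring packages as well. A small helper for that (an illustration, not part of Jupyter AI; the package list is only a guess at the relevant ones):

```python
from importlib.metadata import PackageNotFoundError, version

def report_versions(packages):
    """Map each package name to its installed version, or None if absent."""
    out = {}
    for pkg in packages:
        try:
            out[pkg] = version(pkg)
        except PackageNotFoundError:
            out[pkg] = None
    return out

# Packages that commonly matter when debugging Jupyter AI provider errors.
print(report_versions(["jupyter-ai", "langchain", "langchain-openai", "openai"]))
```

Pasting that output alongside each tested `jupyter_ai` version would make the working/failing matrix much easier to interpret.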
@eazuman Thanks so much for investigating this further. I looked into all the code changes made when we updated from … Best to do this in a clean new environment:
Depending on your environment, you may see some errors, but they may not matter. Then run Jupyter Lab with …
The Settings are in a different location, as shown here: … And you can open a new chat from the top left, as shown in the screenshot above. Hope this works or gives some insight into where the issue lies (and also gives a look at v3!).
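The clean-environment suggestion above can be scripted. This is a sketch under the assumption that the v3 prerelease is published on PyPI under the `jupyter-ai` name and installable with pip's `--pre` flag; adjust it to whatever install command actually applies:

```python
import subprocess
import sys

def pip_install(*args):
    """Run `python -m pip install ...` against the current interpreter."""
    return subprocess.run([sys.executable, "-m", "pip", "install", *args]).returncode

# In a fresh environment one might run (not executed here):
#   pip_install("--pre", "jupyter-ai")  # hypothetical prerelease install
# and then launch JupyterLab from a terminal with: jupyter lab
```

Driving pip through `sys.executable -m pip` ensures the packages land in the same environment as the interpreter you are using, which avoids a common source of "it still uses the old version" confusion.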
Description
Jupyter AI (`jupyter_ai`) throws `InternalServerError` for chat responses with the Azure OpenAI provider.

The `/generate` command works, but the chat responds with the error below for all questions. This is with the latest version of `jupyter_ai` and its dependencies.

Any help or insight into this issue would be greatly appreciated.
Traceback (most recent call last):
  File "/opt/conda/lib/python3.11/site-packages/jupyter_ai/chat_handlers/base.py", line 226, in on_message
    await self.process_message(message)
  File "/opt/conda/lib/python3.11/site-packages/jupyter_ai/chat_handlers/default.py", line 72, in process_message
    await self.stream_reply(inputs, message)
  File "/opt/conda/lib/python3.11/site-packages/jupyter_ai/chat_handlers/base.py", line 564, in stream_reply
    async for chunk in chunk_generator:
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 5535, in astream
    async for item in self.bound.astream(
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 5535, in astream
    async for item in self.bound.astream(
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 3430, in astream
    async for chunk in self.atransform(input_aiter(), config, **kwargs):
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 3413, in atransform
    async for chunk in self._atransform_stream_with_config(
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 2301, in _atransform_stream_with_config
    chunk: Output = await asyncio.create_task(  # type: ignore[call-arg]
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 3383, in _atransform
    async for output in final_pipeline:
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 5571, in atransform
    async for item in self.bound.atransform(
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 4941, in atransform
    async for output in self._atransform_stream_with_config(
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 2301, in _atransform_stream_with_config
    chunk: Output = await asyncio.create_task(  # type: ignore[call-arg]
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 4922, in _atransform
    async for chunk in output.astream(
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 5535, in astream
    async for item in self.bound.astream(
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 3430, in astream
    async for chunk in self.atransform(input_aiter(), config, **kwargs):
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 3413, in atransform
    async for chunk in self._atransform_stream_with_config(
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 2301, in _atransform_stream_with_config
    chunk: Output = await asyncio.create_task(  # type: ignore[call-arg]
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 3383, in _atransform
    async for output in final_pipeline:
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/output_parsers/transform.py", line 84, in atransform
    async for chunk in self._atransform_stream_with_config(
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 2259, in _atransform_stream_with_config
    final_input: Optional[Input] = await py_anext(input_for_tracing, None)
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/utils/aiter.py", line 76, in anext_impl
    return await __anext__(iterator)
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/utils/aiter.py", line 125, in tee_peer
    item = await iterator.__anext__()
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 1471, in atransform
    async for output in self.astream(final, config, **kwargs):
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 494, in astream
    raise e
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 472, in astream
    async for chunk in self._astream(
  File "/opt/conda/lib/python3.11/site-packages/langchain_openai/chat_models/base.py", line 881, in _astream
    response = await self.async_client.create(**payload)
  File "/opt/conda/lib/python3.11/site-packages/openai/resources/chat/completions.py", line 1720, in create
    return await self._post(
  File "/opt/conda/lib/python3.11/site-packages/openai/_base_client.py", line 1849, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
  File "/opt/conda/lib/python3.11/site-packages/openai/_base_client.py", line 1543, in request
    return await self._request(
  File "/opt/conda/lib/python3.11/site-packages/openai/_base_client.py", line 1629, in _request
    return await self._retry_request(
  File "/opt/conda/lib/python3.11/site-packages/openai/_base_client.py", line 1676, in _retry_request
    return await self._request(
  File "/opt/conda/lib/python3.11/site-packages/openai/_base_client.py", line 1629, in _request
    return await self._retry_request(
  File "/opt/conda/lib/python3.11/site-packages/openai/_base_client.py", line 1676, in _retry_request
    return await self._request(
  File "/opt/conda/lib/python3.11/site-packages/openai/_base_client.py", line 1644, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.InternalServerError: Internal Server Error
Reproduce
1. Start with the base Docker image for JupyterLab 4.1.8.
2. Install the `jupyter_ai` package in the Dockerfile.
3. Build the Docker image and run it in a container.
4. Verify the Jupyter AI chat on localhost.
Expected behavior
The chat should work and provide the correct answer.
Context
Hello,
We are upgrading from JupyterLab 3.6.7, along with other packages, including Jupyter AI. Jupyter AI works fine with the current setup (JupyterLab 3.6.7).
However, after upgrading the packages and Jupyter AI, I am encountering an Internal Server Error for all chat-based queries. Interestingly, some commands, such as /generate a notebook about how to add 5 numbers in Python, work fine and successfully generate the notebook.
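One possible lead (an assumption on my part, not something confirmed in this issue): the failing chat path streams its reply (`stream_reply` and `astream` appear in the traceback), while `/generate` may issue a non-streaming request, so a deployment or `api_version` that mishandles `stream=True` would fail in exactly this pattern. The difference in request shape can be sketched as:

```python
def chat_payload(deployment, prompt, stream):
    """Illustrative chat-completions payload; `deployment` stands in for the
    Azure deployment name (a placeholder, not a real deployment)."""
    return {
        "model": deployment,
        "messages": [{"role": "user", "content": prompt}],
        "stream": stream,  # the chat UI streams; /generate-style calls may not
    }

streaming = chat_payload("my-deployment", "hello", stream=True)
one_shot = chat_payload("my-deployment", "hello", stream=False)
print(streaming["stream"], one_shot["stream"])  # prints "True False"
```

If a direct non-streaming call to the same deployment succeeds while the streaming variant returns a 500, that would localize the problem to streaming support for the configured `api_version`.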