
Jupyter_ai for Azure OpenAI throws 'InternalServerError' for all chat responses #1208

Open
eazuman opened this issue Jan 17, 2025 · 9 comments
Labels
bug Something isn't working

Comments

@eazuman

eazuman commented Jan 17, 2025

Description

Jupyter_ai throws an InternalServerError for chat responses with the Azure OpenAI provider.
The /generate command works, but the chat replies with the error below for all questions.
This is with the latest version of jupyter_ai and its dependencies.

Any help or insights on this issue would be greatly appreciated

Traceback (most recent call last):
  File "/opt/conda/lib/python3.11/site-packages/jupyter_ai/chat_handlers/base.py", line 226, in on_message
    await self.process_message(message)
  File "/opt/conda/lib/python3.11/site-packages/jupyter_ai/chat_handlers/default.py", line 72, in process_message
    await self.stream_reply(inputs, message)
  File "/opt/conda/lib/python3.11/site-packages/jupyter_ai/chat_handlers/base.py", line 564, in stream_reply
    async for chunk in chunk_generator:
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 5535, in astream
    async for item in self.bound.astream(
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 5535, in astream
    async for item in self.bound.astream(
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 3430, in astream
    async for chunk in self.atransform(input_aiter(), config, **kwargs):
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 3413, in atransform
    async for chunk in self._atransform_stream_with_config(
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 2301, in _atransform_stream_with_config
    chunk: Output = await asyncio.create_task(  # type: ignore[call-arg]
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 3383, in _atransform
    async for output in final_pipeline:
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 5571, in atransform
    async for item in self.bound.atransform(
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 4941, in atransform
    async for output in self._atransform_stream_with_config(
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 2301, in _atransform_stream_with_config
    chunk: Output = await asyncio.create_task(  # type: ignore[call-arg]
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 4922, in _atransform
    async for chunk in output.astream(
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 5535, in astream
    async for item in self.bound.astream(
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 3430, in astream
    async for chunk in self.atransform(input_aiter(), config, **kwargs):
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 3413, in atransform
    async for chunk in self._atransform_stream_with_config(
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 2301, in _atransform_stream_with_config
    chunk: Output = await asyncio.create_task(  # type: ignore[call-arg]
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 3383, in _atransform
    async for output in final_pipeline:
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/output_parsers/transform.py", line 84, in atransform
    async for chunk in self._atransform_stream_with_config(
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 2259, in _atransform_stream_with_config
    final_input: Optional[Input] = await py_anext(input_for_tracing, None)
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/utils/aiter.py", line 76, in anext_impl
    return await __anext__(iterator)
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/utils/aiter.py", line 125, in tee_peer
    item = await iterator.__anext__()
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 1471, in atransform
    async for output in self.astream(final, config, **kwargs):
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 494, in astream
    raise e
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 472, in astream
    async for chunk in self._astream(
  File "/opt/conda/lib/python3.11/site-packages/langchain_openai/chat_models/base.py", line 881, in _astream
    response = await self.async_client.create(**payload)
  File "/opt/conda/lib/python3.11/site-packages/openai/resources/chat/completions.py", line 1720, in create
    return await self._post(
  File "/opt/conda/lib/python3.11/site-packages/openai/_base_client.py", line 1849, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
  File "/opt/conda/lib/python3.11/site-packages/openai/_base_client.py", line 1543, in request
    return await self._request(
  File "/opt/conda/lib/python3.11/site-packages/openai/_base_client.py", line 1629, in _request
    return await self._retry_request(
  File "/opt/conda/lib/python3.11/site-packages/openai/_base_client.py", line 1676, in _retry_request
    return await self._request(
  File "/opt/conda/lib/python3.11/site-packages/openai/_base_client.py", line 1629, in _request
    return await self._retry_request(
  File "/opt/conda/lib/python3.11/site-packages/openai/_base_client.py", line 1676, in _retry_request
    return await self._request(
  File "/opt/conda/lib/python3.11/site-packages/openai/_base_client.py", line 1644, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.InternalServerError: Internal Server Error

Reproduce

Start with the base Docker image for JupyterLab 4.1.8.
Install the jupyter_ai package in the Dockerfile.

# Use the JupyterLab 4.1.8 minimal notebook image as the base image
FROM quay.io/jupyter/minimal-notebook:x86_64-lab-4.1.8 AS base

# Install Jupyter AI with all its dependencies
RUN pip install -U "jupyter-ai[all]"

Build the Docker image and run it in a container.
Verify the Jupyter AI chat on the local host.

[Screenshot: the error shown in the Jupyter AI chat panel]

Expected behavior

The chat should work and provide the correct answer.

Context

Hello,

We are upgrading from JupyterLab 3.6.7, along with other packages, including Jupyter AI. Jupyter AI works fine with the current setup (JupyterLab 3.6.7).

However, after upgrading the packages and Jupyter AI, I am encountering an Internal Server Error for all chat-based queries. Interestingly, some commands, such as /generate a notebook about how to add 5 numbers in Python, work fine and successfully generate the notebook.

  • Browser and version: Chrome
  • JupyterLab version: 4.1.8

Other package versions:

jupyter_ai                    2.29.0
jupyter_ai_magics             2.29.0
jupyter_client                8.6.1
jupyter_core                  5.7.2
jupyter-events                0.10.0
jupyter-lsp                   2.2.5
jupyter_packaging             0.12.3
jupyter_server                2.14.0
jupyter_server_terminals      0.5.3
jupyter-telemetry             0.1.0
jupyterhub                    4.1.5
jupyterlab                    4.1.8
jupyterlab_pygments           0.3.0
jupyterlab_server             2.27.1
jupyterlab_widgets            3.0.13
langchain                     0.3.14
langchain-anthropic           0.3.3
langchain-aws                 0.2.11
langchain-cohere              0.3.4
langchain-community           0.3.14
langchain-core                0.3.30
langchain-experimental        0.3.4
langchain-google-genai        2.0.8
langchain-mistralai           0.2.4
langchain-nvidia-ai-endpoints 0.3.7
langchain-ollama              0.2.2
langchain-openai              0.3.0
langchain-text-splitters      0.3.5
langsmith                     0.2.11
libmambapy                    1.5.8
Troubleshoot Output
jupyter troubleshoot

pip list:
Package Version Editable project location
----------------------------- --------------- -----------------------------------
ai21 3.0.1
ai21-tokenizer 0.12.0
aiohappyeyeballs 2.4.4
aiohttp 3.11.11
aiolimiter 1.2.1
aiosignal 1.3.2
aiosqlite 0.20.0
alembic 1.13.1
annotated-types 0.7.0
anthropic 0.43.1
anyio 4.8.0
archspec 0.2.3
argon2-cffi 23.1.0
argon2-cffi-bindings 21.2.0
arrow 1.3.0
arxiv 2.1.3
asttokens 2.4.1
async-generator 1.10
async-lru 2.0.4
attrs 23.2.0
Babel 2.14.0
bce-python-sdk 0.9.25
beautifulsoup4 4.12.3
bleach 6.1.0
blinker 1.8.1
boltons 24.0.0
boto3 1.36.2
botocore 1.36.2
Brotli 1.1.0
cached-property 1.5.2
cachetools 5.5.0
certifi 2024.2.2
certipy 0.1.3
cffi 1.16.0
charset-normalizer 3.3.2
click 8.1.8
cloudpickle 3.1.1
cohere 5.13.8
colorama 0.4.6
comm 0.2.2
conda 24.4.0
conda-libmamba-solver 24.1.0
conda-package-handling 2.2.0
conda_package_streaming 0.9.0
cryptography 42.0.6
dask 2025.1.0
dataclasses-json 0.6.7
debugpy 1.8.1
decorator 5.1.1
deepmerge 2.0
defusedxml 0.7.1
deprecation 2.1.0
dill 0.3.9
diskcache 5.6.3
distributed 2025.1.0
distro 1.9.0
entrypoints 0.4
eval_type_backport 0.2.2
exceptiongroup 1.2.0
executing 2.0.1
faiss-cpu 1.9.0.post1
fastavro 1.10.0
fastjsonschema 2.19.1
feedparser 6.0.11
filelock 3.16.1
filetype 1.2.0
fqdn 1.5.1
frozenlist 1.5.0
fsspec 2024.12.0
future 1.0.0
google-ai-generativelanguage 0.6.10
google-api-core 2.24.0
google-api-python-client 2.159.0
google-auth 2.37.0
google-auth-httplib2 0.2.0
google-generativeai 0.8.3
googleapis-common-protos 1.66.0
gpt4all 2.8.2
greenlet 3.0.3
grpcio 1.69.0
grpcio-status 1.69.0
h11 0.14.0
h2 4.1.0
hpack 4.0.0
httpcore 1.0.5
httplib2 0.22.0
httpx 0.27.0
httpx-sse 0.4.0
huggingface-hub 0.27.1
hyperframe 6.0.1
idna 3.7
importlib_metadata 7.1.0
importlib_resources 6.4.0
ipykernel 6.29.3
ipython 8.22.2
ipython-genutils 0.2.0
ipywidgets 8.1.5
isoduration 20.11.0
jedi 0.19.1
Jinja2 3.1.3
jiter 0.8.2
jmespath 1.0.1
json5 0.9.25
jsonpatch 1.33
jsonpath-ng 1.7.0
jsonpointer 2.4
jsonschema 4.22.0
jsonschema-specifications 2023.12.1
jupyter_ai 2.29.0
jupyter_ai_magics 2.29.0
jupyter_client 8.6.1
jupyter_core 5.7.2
jupyter-events 0.10.0
jupyter-lsp 2.2.5
jupyter_packaging 0.12.3
jupyter_server 2.14.0
jupyter_server_terminals 0.5.3
jupyter-telemetry 0.1.0
jupyterhub 4.1.5
jupyterlab 4.1.8
jupyterlab_pygments 0.3.0
jupyterlab_server 2.27.1
jupyterlab_widgets 3.0.13
langchain 0.3.14
langchain-anthropic 0.3.3
langchain-aws 0.2.11
langchain-cohere 0.3.4
langchain-community 0.3.14
langchain-core 0.3.30
langchain-experimental 0.3.4
langchain-google-genai 2.0.8
langchain-mistralai 0.2.4
langchain-nvidia-ai-endpoints 0.3.7
langchain-ollama 0.2.2
langchain-openai 0.3.0
langchain-text-splitters 0.3.5
langsmith 0.2.11
libmambapy 1.5.8
locket 1.0.0
Mako 1.3.3
mamba 1.5.8
markdown-it-py 3.0.0
MarkupSafe 2.1.5
marshmallow 3.25.1
matplotlib-inline 0.1.7
mdurl 0.1.2
menuinst 2.0.2
mistune 3.0.2
msgpack 1.1.0
multidict 6.1.0
multiprocess 0.70.17
mypy-extensions 1.0.0
nbclassic 1.0.0
nbclient 0.10.0
nbconvert 7.16.4
nbformat 5.10.4
nest_asyncio 1.6.0
notebook 7.1.3
notebook_shim 0.2.4
numpy 1.26.4
oauthlib 3.2.2
ollama 0.4.6
openai 1.59.8
orjson 3.10.14
overrides 7.7.0
packaging 24.0
pamela 1.1.0
pandas 2.2.3
pandocfilters 1.5.0
parameterized 0.9.0
parso 0.8.4
partd 1.4.2
pexpect 4.9.0
pickleshare 0.7.5
pillow 10.4.0
pip 24.0
pkgutil_resolve_name 1.3.10
platformdirs 4.2.1
pluggy 1.5.0
ply 3.11
prometheus_client 0.20.0
prompt-toolkit 3.0.42
propcache 0.2.1
proto-plus 1.25.0
protobuf 5.29.3
psutil 5.9.8
ptyprocess 0.7.0
pure-eval 0.2.2
pyarrow 19.0.0
pyasn1 0.6.1
pyasn1_modules 0.4.1
pycosat 0.6.6
pycparser 2.22
pycryptodome 3.21.0
pycurl 7.45.3
pydantic 2.10.5
pydantic_core 2.27.2
pydantic-settings 2.7.1
Pygments 2.18.0
PyJWT 2.8.0
pyOpenSSL 24.0.0
pyparsing 3.2.1
pypdf 5.1.0
PySocks 1.7.1
python-dateutil 2.9.0
python-dotenv 1.0.1
python-json-logger 2.0.7
pytz 2024.1
PyYAML 6.0.1
pyzmq 26.0.2
qianfan 0.4.12.2
referencing 0.35.1
regex 2024.11.6
requests 2.32.3
requests-toolbelt 1.0.0
rfc3339-validator 0.1.4
rfc3986-validator 0.1.1
rich 13.9.4
rpds-py 0.18.0
rsa 4.9
ruamel.yaml 0.18.6
ruamel.yaml.clib 0.2.8
s3transfer 0.11.1
Send2Trash 1.8.3
sentencepiece 0.2.0
setuptools 69.5.1
sgmllib3k 1.0.0
shellingham 1.5.4
six 1.16.0
sniffio 1.3.1
sortedcontainers 2.4.0
soupsieve 2.5
SQLAlchemy 2.0.30
stack-data 0.6.2
tabulate 0.9.0
tblib 3.0.0
tenacity 8.5.0
terminado 0.18.1
tiktoken 0.8.0
tinycss2 1.3.0
together 1.3.11
tokenizers 0.21.0
tomli 2.0.1
tomlkit 0.13.2
toolz 1.0.0
tornado 6.4
tqdm 4.66.4
traitlets 5.14.3
truststore 0.8.0
typer 0.15.1
types-python-dateutil 2.9.0.20240316
types-requests 2.32.0.20241016
typing_extensions 4.12.2
typing-inspect 0.9.0
typing-utils 0.1.0
tzdata 2024.2
uri-template 1.3.0
uritemplate 4.1.1
urllib3 2.2.1
wcwidth 0.2.13
webcolors 1.13
webencodings 0.5.1
websocket-client 1.8.0
wheel 0.43.0
widgetsnbextension 4.0.13
yarl 1.18.3
zict 3.0.0
zipp 3.17.0
zstandard 0.19.0

conda env:
name: base
channels:
- conda-forge
dependencies:
- _libgcc_mutex=0.1=conda_forge
- _openmp_mutex=4.5=2_gnu
- alembic=1.13.1=pyhd8ed1ab_1
- archspec=0.2.3=pyhd8ed1ab_0
- argon2-cffi=23.1.0=pyhd8ed1ab_0
- argon2-cffi-bindings=21.2.0=py311h459d7ec_4
- arrow=1.3.0=pyhd8ed1ab_0
- asttokens=2.4.1=pyhd8ed1ab_0
- async-lru=2.0.4=pyhd8ed1ab_0
- async_generator=1.10=py_0
- attrs=23.2.0=pyh71513ae_0
- babel=2.14.0=pyhd8ed1ab_0
- beautifulsoup4=4.12.3=pyha770c72_0
- bleach=6.1.0=pyhd8ed1ab_0
- blinker=1.8.1=pyhd8ed1ab_0
- boltons=24.0.0=pyhd8ed1ab_0
- brotli-python=1.1.0=py311hb755f60_1
- bzip2=1.0.8=hd590300_5
- c-ares=1.28.1=hd590300_0
- ca-certificates=2024.2.2=hbcca054_0
- cached-property=1.5.2=hd8ed1ab_1
- cached_property=1.5.2=pyha770c72_1
- certifi=2024.2.2=pyhd8ed1ab_0
- certipy=0.1.3=py_0
- cffi=1.16.0=py311hb3a22ac_0
- charset-normalizer=3.3.2=pyhd8ed1ab_0
- colorama=0.4.6=pyhd8ed1ab_0
- comm=0.2.2=pyhd8ed1ab_0
- conda=24.4.0=py311h38be061_0
- conda-libmamba-solver=24.1.0=pyhd8ed1ab_0
- conda-package-handling=2.2.0=pyh38be061_0
- conda-package-streaming=0.9.0=pyhd8ed1ab_0
- configurable-http-proxy=4.6.1=h92b4e83_0
- cryptography=42.0.6=py311h4a61cc7_0
- debugpy=1.8.1=py311hb755f60_0
- decorator=5.1.1=pyhd8ed1ab_0
- defusedxml=0.7.1=pyhd8ed1ab_0
- distro=1.9.0=pyhd8ed1ab_0
- entrypoints=0.4=pyhd8ed1ab_0
- exceptiongroup=1.2.0=pyhd8ed1ab_2
- executing=2.0.1=pyhd8ed1ab_0
- fmt=10.2.1=h00ab1b0_0
- fqdn=1.5.1=pyhd8ed1ab_0
- greenlet=3.0.3=py311hb755f60_0
- h11=0.14.0=pyhd8ed1ab_0
- h2=4.1.0=pyhd8ed1ab_0
- hpack=4.0.0=pyh9f0ad1d_0
- httpcore=1.0.5=pyhd8ed1ab_0
- httpx=0.27.0=pyhd8ed1ab_0
- hyperframe=6.0.1=pyhd8ed1ab_0
- icu=73.2=h59595ed_0
- idna=3.7=pyhd8ed1ab_0
- importlib-metadata=7.1.0=pyha770c72_0
- importlib_metadata=7.1.0=hd8ed1ab_0
- importlib_resources=6.4.0=pyhd8ed1ab_0
- ipykernel=6.29.3=pyhd33586a_0
- ipython=8.22.2=pyh707e725_0
- ipython_genutils=0.2.0=py_1
- isoduration=20.11.0=pyhd8ed1ab_0
- jedi=0.19.1=pyhd8ed1ab_0
- jinja2=3.1.3=pyhd8ed1ab_0
- json5=0.9.25=pyhd8ed1ab_0
- jsonpatch=1.33=pyhd8ed1ab_0
- jsonpointer=2.4=py311h38be061_3
- jsonschema=4.22.0=pyhd8ed1ab_0
- jsonschema-specifications=2023.12.1=pyhd8ed1ab_0
- jsonschema-with-format-nongpl=4.22.0=pyhd8ed1ab_0
- jupyter-lsp=2.2.5=pyhd8ed1ab_0
- jupyter_client=8.6.1=pyhd8ed1ab_0
- jupyter_core=5.7.2=py311h38be061_0
- jupyter_events=0.10.0=pyhd8ed1ab_0
- jupyter_server=2.14.0=pyhd8ed1ab_0
- jupyter_server_terminals=0.5.3=pyhd8ed1ab_0
- jupyter_telemetry=0.1.0=pyhd8ed1ab_1
- jupyterhub=4.1.5=pyh31011fe_0
- jupyterhub-base=4.1.5=pyh31011fe_0
- jupyterlab=4.1.8=pyhd8ed1ab_0
- jupyterlab_pygments=0.3.0=pyhd8ed1ab_1
- jupyterlab_server=2.27.1=pyhd8ed1ab_0
- keyutils=1.6.1=h166bdaf_0
- krb5=1.21.2=h659d440_0
- ld_impl_linux-64=2.40=h55db66e_0
- libarchive=3.7.2=h2aa1ff5_1
- libcurl=8.7.1=hca28451_0
- libedit=3.1.20191231=he28a2e2_2
- libev=4.33=hd590300_2
- libexpat=2.6.2=h59595ed_0
- libffi=3.4.2=h7f98852_5
- libgcc-ng=13.2.0=h77fa898_6
- libgomp=13.2.0=h77fa898_6
- libiconv=1.17=hd590300_2
- libmamba=1.5.8=had39da4_0
- libmambapy=1.5.8=py311hf2555c7_0
- libnghttp2=1.58.0=h47da74e_1
- libnsl=2.0.1=hd590300_0
- libsodium=1.0.18=h36c2ea0_1
- libsolv=0.7.29=ha6fb4c9_0
- libsqlite=3.45.3=h2797004_0
- libssh2=1.11.0=h0841786_0
- libstdcxx-ng=13.2.0=hc0a3c3a_6
- libuuid=2.38.1=h0b41bf4_0
- libuv=1.48.0=hd590300_0
- libxcrypt=4.4.36=hd590300_1
- libxml2=2.12.6=h232c23b_2
- libzlib=1.2.13=hd590300_5
- lz4-c=1.9.4=hcb278e6_0
- lzo=2.10=hd590300_1001
- mako=1.3.3=pyhd8ed1ab_0
- mamba=1.5.8=py311h3072747_0
- markupsafe=2.1.5=py311h459d7ec_0
- matplotlib-inline=0.1.7=pyhd8ed1ab_0
- menuinst=2.0.2=py311h38be061_0
- mistune=3.0.2=pyhd8ed1ab_0
- nbclassic=1.0.0=pyhb4ecaf3_1
- nbclient=0.10.0=pyhd8ed1ab_0
- nbconvert=7.16.4=hd8ed1ab_0
- nbconvert-core=7.16.4=pyhd8ed1ab_0
- nbconvert-pandoc=7.16.4=hd8ed1ab_0
- nbformat=5.10.4=pyhd8ed1ab_0
- ncurses=6.4.20240210=h59595ed_0
- nest-asyncio=1.6.0=pyhd8ed1ab_0
- nodejs=20.12.2=hb753e55_0
- notebook=7.1.3=pyhd8ed1ab_0
- notebook-shim=0.2.4=pyhd8ed1ab_0
- oauthlib=3.2.2=pyhd8ed1ab_0
- openssl=3.3.0=hd590300_0
- overrides=7.7.0=pyhd8ed1ab_0
- packaging=24.0=pyhd8ed1ab_0
- pamela=1.1.0=pyh1a96a4e_0
- pandoc=3.1.13=ha770c72_0
- pandocfilters=1.5.0=pyhd8ed1ab_0
- parso=0.8.4=pyhd8ed1ab_0
- pexpect=4.9.0=pyhd8ed1ab_0
- pickleshare=0.7.5=py_1003
- pip=24.0=pyhd8ed1ab_0
- pkgutil-resolve-name=1.3.10=pyhd8ed1ab_1
- platformdirs=4.2.1=pyhd8ed1ab_0
- pluggy=1.5.0=pyhd8ed1ab_0
- prometheus_client=0.20.0=pyhd8ed1ab_0
- prompt-toolkit=3.0.42=pyha770c72_0
- psutil=5.9.8=py311h459d7ec_0
- ptyprocess=0.7.0=pyhd3deb0d_0
- pure_eval=0.2.2=pyhd8ed1ab_0
- pybind11-abi=4=hd8ed1ab_3
- pycosat=0.6.6=py311h459d7ec_0
- pycparser=2.22=pyhd8ed1ab_0
- pycurl=7.45.3=py311h3393d6f_1
- pygments=2.18.0=pyhd8ed1ab_0
- pyjwt=2.8.0=pyhd8ed1ab_1
- pyopenssl=24.0.0=pyhd8ed1ab_0
- pysocks=1.7.1=pyha2e5f31_6
- python=3.11.9=hb806964_0_cpython
- python-dateutil=2.9.0=pyhd8ed1ab_0
- python-fastjsonschema=2.19.1=pyhd8ed1ab_0
- python-json-logger=2.0.7=pyhd8ed1ab_0
- python_abi=3.11=4_cp311
- pytz=2024.1=pyhd8ed1ab_0
- pyyaml=6.0.1=py311h459d7ec_1
- pyzmq=26.0.2=py311h08a0b41_0
- readline=8.2=h8228510_1
- referencing=0.35.1=pyhd8ed1ab_0
- reproc=14.2.4.post0=hd590300_1
- reproc-cpp=14.2.4.post0=h59595ed_1
- rfc3339-validator=0.1.4=pyhd8ed1ab_0
- rfc3986-validator=0.1.1=pyh9f0ad1d_0
- rpds-py=0.18.0=py311h46250e7_0
- ruamel.yaml=0.18.6=py311h459d7ec_0
- ruamel.yaml.clib=0.2.8=py311h459d7ec_0
- send2trash=1.8.3=pyh0d859eb_0
- setuptools=69.5.1=pyhd8ed1ab_0
- six=1.16.0=pyh6c4a22f_0
- sniffio=1.3.1=pyhd8ed1ab_0
- soupsieve=2.5=pyhd8ed1ab_1
- sqlalchemy=2.0.30=py311h331c9d8_0
- stack_data=0.6.2=pyhd8ed1ab_0
- terminado=0.18.1=pyh0d859eb_0
- tinycss2=1.3.0=pyhd8ed1ab_0
- tk=8.6.13=noxft_h4845f30_101
- tomli=2.0.1=pyhd8ed1ab_0
- tornado=6.4=py311h459d7ec_0
- tqdm=4.66.4=pyhd8ed1ab_0
- traitlets=5.14.3=pyhd8ed1ab_0
- truststore=0.8.0=pyhd8ed1ab_0
- types-python-dateutil=2.9.0.20240316=pyhd8ed1ab_0
- typing_utils=0.1.0=pyhd8ed1ab_0
- uri-template=1.3.0=pyhd8ed1ab_0
- urllib3=2.2.1=pyhd8ed1ab_0
- wcwidth=0.2.13=pyhd8ed1ab_0
- webcolors=1.13=pyhd8ed1ab_0
- webencodings=0.5.1=pyhd8ed1ab_2
- websocket-client=1.8.0=pyhd8ed1ab_0
- wheel=0.43.0=pyhd8ed1ab_1
- xz=5.2.6=h166bdaf_0
- yaml=0.2.5=h7f98852_2
- yaml-cpp=0.8.0=h59595ed_0
- zeromq=4.3.5=h75354e8_3
- zipp=3.17.0=pyhd8ed1ab_0
- zlib=1.2.13=hd590300_5
- zstandard=0.19.0=py311hd4cff14_0
- zstd=1.5.6=ha6fb4c9_0
- pip:
- ai21==3.0.1
- ai21-tokenizer==0.12.0
- aiohappyeyeballs==2.4.4
- aiohttp==3.11.11
- aiolimiter==1.2.1
- aiosignal==1.3.2
- aiosqlite==0.20.0
- al-server-extension==0.1.0
- annotated-types==0.7.0
- anthropic==0.43.1
- anyio==4.8.0
- arxiv==2.1.3
- bce-python-sdk==0.9.25
- boto3==1.36.2
- botocore==1.36.2
- cachetools==5.5.0
- click==8.1.8
- cloudpickle==3.1.1
- cohere==5.13.8
- dask==2025.1.0
- dataclasses-json==0.6.7
- deepmerge==2.0
- deprecation==2.1.0
- dill==0.3.9
- diskcache==5.6.3
- distributed==2025.1.0
- eval-type-backport==0.2.2
- faiss-cpu==1.9.0.post1
- fastavro==1.10.0
- feedparser==6.0.11
- filelock==3.16.1
- filetype==1.2.0
- frozenlist==1.5.0
- fsspec==2024.12.0
- future==1.0.0
- google-ai-generativelanguage==0.6.10
- google-api-core==2.24.0
- google-api-python-client==2.159.0
- google-auth==2.37.0
- google-auth-httplib2==0.2.0
- google-generativeai==0.8.3
- googleapis-common-protos==1.66.0
- gpt4all==2.8.2
- grpcio==1.69.0
- grpcio-status==1.69.0
- httplib2==0.22.0
- httpx-sse==0.4.0
- huggingface-hub==0.27.1
- ipywidgets==8.1.5
- jiter==0.8.2
- jmespath==1.0.1
- jsonpath-ng==1.7.0
- jupyter-ai==2.29.0
- jupyter-ai-magics==2.29.0
- jupyter-packaging==0.12.3
- jupyterlab-widgets==3.0.13
- langchain==0.3.14
- langchain-anthropic==0.3.3
- langchain-aws==0.2.11
- langchain-cohere==0.3.4
- langchain-community==0.3.14
- langchain-core==0.3.30
- langchain-experimental==0.3.4
- langchain-google-genai==2.0.8
- langchain-mistralai==0.2.4
- langchain-nvidia-ai-endpoints==0.3.7
- langchain-ollama==0.2.2
- langchain-openai==0.3.0
- langchain-text-splitters==0.3.5
- langsmith==0.2.11
- locket==1.0.0
- markdown-it-py==3.0.0
- marshmallow==3.25.1
- mdurl==0.1.2
- msgpack==1.1.0
- multidict==6.1.0
- multiprocess==0.70.17
- mypy-extensions==1.0.0
- numpy==1.26.4
- ollama==0.4.6
- openai==1.59.8
- orjson==3.10.14
- pandas==2.2.3
- parameterized==0.9.0
- partd==1.4.2
- pillow==10.4.0
- ply==3.11
- propcache==0.2.1
- proto-plus==1.25.0
- protobuf==5.29.3
- pyarrow==19.0.0
- pyasn1==0.6.1
- pyasn1-modules==0.4.1
- pycryptodome==3.21.0
- pydantic==2.10.5
- pydantic-core==2.27.2
- pydantic-settings==2.7.1
- pyparsing==3.2.1
- pypdf==5.1.0
- python-dotenv==1.0.1
- qianfan==0.4.12.2
- regex==2024.11.6
- requests==2.32.3
- requests-toolbelt==1.0.0
- rich==13.9.4
- rsa==4.9
- s3transfer==0.11.1
- sentencepiece==0.2.0
- sgmllib3k==1.0.0
- shellingham==1.5.4
- sortedcontainers==2.4.0
- tabulate==0.9.0
- tblib==3.0.0
- tenacity==8.5.0
- tiktoken==0.8.0
- together==1.3.11
- tokenizers==0.21.0
- tomlkit==0.13.2
- toolz==1.0.0
- typer==0.15.1
- types-requests==2.32.0.20241016
- typing-extensions==4.12.2
- typing-inspect==0.9.0
- tzdata==2024.2
- uritemplate==4.1.1
- widgetsnbextension==4.0.13
- yarl==1.18.3
- zict==3.0.0
prefix: /opt/conda

Command Line Output
[I 2025-01-17 21:29:33.641 ServerApp] al_server_extension | extension was successfully linked.
[I 2025-01-17 21:29:33.648 ServerApp] jupyter_ai | extension was successfully linked.
[I 2025-01-17 21:29:33.648 ServerApp] jupyter_lsp | extension was successfully linked.
[I 2025-01-17 21:29:33.652 ServerApp] jupyter_server_terminals | extension was successfully linked.
[I 2025-01-17 21:29:33.657 ServerApp] jupyterlab | extension was successfully linked.
[I 2025-01-17 21:29:33.662 ServerApp] nbclassic | extension was successfully linked.
[I 2025-01-17 21:29:33.667 ServerApp] notebook | extension was successfully linked.
[I 2025-01-17 21:29:33.675 ServerApp] notebook_shim | extension was successfully linked.
/opt/conda/lib/python3.11/site-packages/traitlets/traitlets.py:1241: UserWarning: Overriding existing pre_save_hook (custom_pre_save_hook) with a new one (custom_pre_save_hook).
  return self.func(*args, **kwargs)
/opt/conda/lib/python3.11/site-packages/traitlets/traitlets.py:1241: UserWarning: Overriding existing post_save_hook (custom_post_save_hook) with a new one (custom_post_save_hook).
  return self.func(*args, **kwargs)
[I 2025-01-17 21:29:33.698 ServerApp] notebook_shim | extension was successfully loaded.
[I 2025-01-17 21:29:33.699 ServerApp] Registered common endpoints server extension
[I 2025-01-17 21:29:33.699 ServerApp] al_server_extension | extension was successfully loaded.
[I 2025-01-17 21:29:33.699 AiExtension] Configured provider allowlist: ['azure-chat-openai']
[I 2025-01-17 21:29:33.699 AiExtension] Configured provider blocklist: None
[I 2025-01-17 21:29:33.699 AiExtension] Configured model allowlist: None
[I 2025-01-17 21:29:33.699 AiExtension] Configured model blocklist: None
[I 2025-01-17 21:29:33.700 AiExtension] Configured model parameters: {'azure-chat-openai:XXXXXXXXXX': {'azure_endpoint': 'https://XXXXXXXXXXXXXXXX/openai-proxy', 'openai_api_version': '2023-07-01-preview'}}
[I 2025-01-17 21:29:33.707 AiExtension] Skipping blocked provider `ai21`.
[I 2025-01-17 21:29:33.829 AiExtension] Skipping blocked provider `bedrock`.
[I 2025-01-17 21:29:33.829 AiExtension] Skipping blocked provider `bedrock-chat`.
[I 2025-01-17 21:29:33.829 AiExtension] Skipping blocked provider `bedrock-custom`.
[I 2025-01-17 21:29:33.938 AiExtension] Skipping blocked provider `anthropic-chat`.
[I 2025-01-17 21:29:34.119 AiExtension] Registered model provider `azure-chat-openai`.
[I 2025-01-17 21:29:34.919 AiExtension] Skipping blocked provider `cohere`.
[I 2025-01-17 21:29:35.149 AiExtension] Skipping blocked provider `gemini`.
[I 2025-01-17 21:29:35.149 AiExtension] Skipping blocked provider `gpt4all`.
[I 2025-01-17 21:29:35.149 AiExtension] Skipping blocked provider `huggingface_hub`.
[I 2025-01-17 21:29:35.160 AiExtension] Skipping blocked provider `mistralai`.
[I 2025-01-17 21:29:35.178 AiExtension] Skipping blocked provider `nvidia-chat`.
[I 2025-01-17 21:29:35.244 AiExtension] Skipping blocked provider `ollama`.
[I 2025-01-17 21:29:35.244 AiExtension] Skipping blocked provider `openai`.
[I 2025-01-17 21:29:35.244 AiExtension] Skipping blocked provider `openai-chat`.
[I 2025-01-17 21:29:35.257 AiExtension] Skipping blocked provider `openrouter`.
[I 2025-01-17 21:29:35.257 AiExtension] Skipping blocked provider `qianfan`.
[I 2025-01-17 21:29:35.257 AiExtension] Skipping blocked provider `sagemaker-endpoint`.
[I 2025-01-17 21:29:35.257 AiExtension] Skipping blocked provider `togetherai`.
[I 2025-01-17 21:29:35.265 AiExtension] Skipping blocked provider `azure`.
[I 2025-01-17 21:29:35.265 AiExtension] Skipping blocked provider `bedrock`.
[I 2025-01-17 21:29:35.265 AiExtension] Skipping blocked provider `cohere`.
[I 2025-01-17 21:29:35.265 AiExtension] Skipping blocked provider `gpt4all`.
[I 2025-01-17 21:29:35.265 AiExtension] Skipping blocked provider `huggingface_hub`.
[I 2025-01-17 21:29:35.265 AiExtension] Skipping blocked provider `mistralai`.
[I 2025-01-17 21:29:35.265 AiExtension] Skipping blocked provider `ollama`.
[I 2025-01-17 21:29:35.265 AiExtension] Skipping blocked provider `openai`.
[I 2025-01-17 21:29:35.265 AiExtension] Skipping blocked provider `qianfan`.
[I 2025-01-17 21:29:35.271 AiExtension] Registered providers.
[I 2025-01-17 21:29:35.271 AiExtension] Registered jupyter_ai server extension
[I 2025-01-17 21:29:35.286 AiExtension] Registered context provider `file`.
[I 2025-01-17 21:29:35.287 AiExtension] Initialized Jupyter AI server extension in 1588 ms.
[I 2025-01-17 21:29:35.288 ServerApp] jupyter_ai | extension was successfully loaded.
[I 2025-01-17 21:29:35.289 ServerApp] jupyter_lsp | extension was successfully loaded.
[I 2025-01-17 21:29:35.290 ServerApp] jupyter_server_terminals | extension was successfully loaded.
[I 2025-01-17 21:29:35.291 LabApp] JupyterLab extension loaded from /opt/conda/lib/python3.11/site-packages/jupyterlab
[I 2025-01-17 21:29:35.291 LabApp] JupyterLab application directory is /opt/conda/share/jupyter/lab
[I 2025-01-17 21:29:35.291 LabApp] Extension Manager is 'pypi'.
[I 2025-01-17 21:29:35.298 ServerApp] jupyterlab | extension was successfully loaded.

[JupyterLab ASCII art banner]

Read the migration plan to Notebook 7 to learn about the new features and the actions to take if you are using extensions.

https://jupyter-notebook.readthedocs.io/en/latest/migrate_to_notebook7.html

Please note that updating to Notebook 7 might break some of your extensions.

[I 2025-01-17 21:29:35.300 ServerApp] nbclassic | extension was successfully loaded.
[I 2025-01-17 21:29:35.301 ServerApp] notebook | extension was successfully loaded.
[C 2025-01-17 21:29:35.302 ServerApp] Running as root is not recommended. Use --allow-root to bypass.
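
For context, the provider allowlist and model parameters logged above are the kind of settings that can be supplied through a Jupyter server config file. A minimal sketch, assuming the AiExtension.allowed_providers and AiExtension.model_parameters traitlets; the deployment name, endpoint, and API version below are placeholders, not the reporter's actual values:

# jupyter_server_config.py -- a minimal sketch, not the actual config from this report
c.AiExtension.allowed_providers = ["azure-chat-openai"]
c.AiExtension.model_parameters = {
    # Key format is "<provider>:<deployment-name>"; the deployment name is a placeholder
    "azure-chat-openai:<deployment-name>": {
        "azure_endpoint": "https://<your-host>/openai-proxy",  # placeholder
        "openai_api_version": "2023-07-01-preview",            # placeholder
    }
}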

@eazuman eazuman added the bug Something isn't working label Jan 17, 2025
@eazuman eazuman changed the title Jupyter_ai throwing InternalServerError for the chat response. Jupyter_ai throwing openai.InternalServerError for the chat response but the extension loads successfully and /generate command works Jan 17, 2025
@eazuman eazuman changed the title Jupyter_ai throwing openai.InternalServerError for the chat response but the extension loads successfully and /generate command works Jupyter_ai throws 'openai.InternalServerError' for the chat response but the extension loads successfully and /generate command works Jan 17, 2025
@eazuman eazuman changed the title Jupyter_ai throws 'openai.InternalServerError' for the chat response but the extension loads successfully and /generate command works Jupyter_ai for Azure OpenAi_throws 'openai.InternalServerError' for the chat response but the extension loads successfully and /generate command works Jan 19, 2025
@eazuman
Author

eazuman commented Jan 19, 2025

Also, I was able to get this working with an older version of jupyter_ai. Here are the package details. Do you have any idea what's causing the issue with the latest versions?

jupyter_ai                2.11.0
jupyter_ai_magics         2.11.0
jupyter_client            8.6.1
jupyter_core              5.7.2
jupyter-events            0.10.0
jupyter-lsp               2.2.5
jupyter_packaging         0.12.3
jupyter_server            2.14.0
jupyter_server_terminals  0.5.3
jupyter-telemetry         0.1.0
jupyterhub                4.1.5
jupyterlab                4.1.8
jupyterlab_pygments       0.3.0
jupyterlab_server         2.27.1
langchain                 0.1.20
langchain-community       0.0.38
langchain-core            0.1.53
langchain-text-splitters  0.0.2
langsmith                 0.1.147
libmambapy                1.5.8
locket                    1.0.0
Mako                      1.3.3
mamba                     1.5.8
MarkupSafe                2.1.5
marshmallow               3.25.1
matplotlib-inline         0.1.7
menuinst                  2.0.2
mistune                   3.0.2
msgpack                   1.1.0
multidict                 6.1.0
mypy-extensions           1.0.0
nbclassic                 1.0.0
nbclient                  0.10.0
nbconvert                 7.16.4
nbformat                  5.10.4
nest_asyncio              1.6.0
notebook                  7.1.3
notebook_shim             0.2.4
numpy                     1.26.4
oauthlib                  3.2.2
openai                    1.59.8
[Screenshot: the chat working with jupyter_ai 2.11.0]

@eazuman eazuman changed the title Jupyter_ai for Azure OpenAi_throws 'openai.InternalServerError' for the chat response but the extension loads successfully and /generate command works Jupyter_ai for Azure OpenAI throws 'InternalServerError' for all chat responses Jan 21, 2025
@dlqqq
Member

dlqqq commented Jan 21, 2025

@eazuman "Internal server error" means that OpenAI's servers cannot handle this request right now due to an internal error. In other words, this error is being caused by OpenAI's backend servers, not Jupyter AI.

If you retry again today, you should hopefully be able to consistently receive good replies. I would recommend trying again with the latest version of Jupyter AI (v2.29.0).

@eazuman
Author

eazuman commented Jan 21, 2025

@dlqqq thanks for your response!
I understand, but it seems like there's an issue with one of the package versions (possibly langchain or openai) installed alongside the latest version of Jupyter AI. I haven't been able to pinpoint the exact one.

As I said above, I got it working with Jupyter AI version 2.11, but not with the latest version.

@dlqqq
Member

dlqqq commented Jan 21, 2025

@eazuman This is very strange. It is possible that this is a bug with the langchain-openai package. See this similar issue on the LangChain repo: langchain-ai/langchain#24504

@dlqqq
Member

dlqqq commented Jan 21, 2025

@eazuman Could you try using the langchain-openai package directly to interact with Azure OpenAI? If that also produces an error, then that means that this issue is being caused by a bug in langchain-openai, and that the issue should be opened upstream on the langchain-ai/langchain repo.
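
For reference, a minimal sketch of such a direct test; the endpoint, deployment name, API version, and key below are placeholders, not values from this issue:

# Direct langchain-openai test against Azure OpenAI (sketch with placeholder values)
import asyncio

from langchain_openai import AzureChatOpenAI

llm = AzureChatOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",  # placeholder
    azure_deployment="<your-deployment-name>",                  # placeholder
    openai_api_version="2023-07-01-preview",                    # placeholder
    api_key="<your-api-key>",  # or set the AZURE_OPENAI_API_KEY environment variable
)

# Non-streaming call
print(llm.invoke("Say hello in one short sentence.").content)

# Streaming call, closer to what the Jupyter AI chat handler does
async def stream_demo():
    async for chunk in llm.astream("Count from 1 to 5."):
        print(chunk.content, end="", flush=True)

asyncio.run(stream_demo())  # in a notebook, use `await stream_demo()` instead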

@eazuman
Author

eazuman commented Jan 22, 2025

Thank you @dlqqq for the suggestion!
That approach works for the different versions of langchain-openai that I tried in a notebook in JupyterLab, which uses the same package versions as Jupyter_ai. However, the chat is still not working (please see the screenshot).

[Screenshot: the chat still failing in JupyterLab]

@eazuman
Author

eazuman commented Jan 22, 2025

Additionally, I tested other package versions and found that Jupyter_ai 2.12.0 is the highest version that works for me. Every version after 2.12.0 up to 2.29.0 fails with one of the errors listed below, even though langchain_openai works fine when used directly.

Do you have any thoughts on other factors that could be causing these issues, or any specific package versions?

Thanks!!

2.14.0

File "/opt/conda/lib/python3.11/site-packages/openai/_utils/_utils.py", line 279, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
TypeError: AsyncCompletions.create() got an unexpected keyword argument 'azure_openai_api_key'

jupyter_ai                    2.14.0
jupyter_ai_magics             2.14.0
jupyter_client                8.6.1
jupyter_core                  5.7.2
jupyter-events                0.11.0
jupyter-lsp                   2.2.5
jupyter_packaging             0.12.3
jupyter_server                2.15.0
jupyter_server_terminals      0.5.3
jupyter-telemetry             0.1.0
jupyterhub                    4.1.5
jupyterlab                    4.1.8
jupyterlab_pygments           0.3.0
jupyterlab_server             2.27.1
jupyterlab_widgets            3.0.13
keyring                       24.3.1
langchain                     0.1.20
langchain-anthropic           0.1.13
langchain-community           0.0.38
langchain-core                0.1.53
langchain-google-genai        1.0.4
langchain-nvidia-ai-endpoints 0.1.7
langchain-openai              0.1.7

2.17.0 / 2.18.0

This error occurs even though I have passed api_version in the settings; I also tried setting it as an environment variable (see the sketch after the traceback below).

Traceback (most recent call last):
File "/opt/conda/lib/python3.11/site-packages/jupyter_ai/chat_handlers/base.py", line 133, in on_message
await self.process_message(message)
File "/opt/conda/lib/python3.11/site-packages/jupyter_ai/chat_handlers/default.py", line 47, in process_message
self.get_llm_chain()
File "/opt/conda/lib/python3.11/site-packages/jupyter_ai/chat_handlers/base.py", line 274, in get_llm_chain
self.create_llm_chain(lm_provider, lm_provider_params)
File "/opt/conda/lib/python3.11/site-packages/jupyter_ai/chat_handlers/default.py", line 30, in create_llm_chain
llm = provider(**unified_parameters)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/jupyter_ai_magics/providers.py", line 347, in init
super().init(*args, **kwargs, **model_kwargs)
File "/opt/conda/lib/python3.11/site-packages/pydantic/v1/main.py", line 341, in init
raise validation_error
pydantic.v1.error_wrappers.ValidationError: 1 validation error for AzureChatOpenAIProvider
root
Must provide either the api_version argument or the OPENAI_API_VERSION environment variable (type=value_error)
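
For reference, a minimal sketch of the two ways this validation error says the API version can be supplied; the version string, endpoint, and deployment name are placeholders, not values from this issue:

# Option 1: set the environment variable in the environment that runs the Jupyter server,
# e.g. `ENV OPENAI_API_VERSION=2023-07-01-preview` in the Dockerfile, or:
import os
os.environ["OPENAI_API_VERSION"] = "2023-07-01-preview"  # placeholder version

# Option 2: pass the version explicitly when constructing the client
from langchain_openai import AzureChatOpenAI

llm = AzureChatOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",  # placeholder
    azure_deployment="<your-deployment-name>",                  # placeholder
    openai_api_version="2023-07-01-preview",                    # placeholder
)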

jupyter_ai                    2.18.0
jupyter_ai_magics             2.18.0
jupyter_client                8.6.1
jupyter_core                  5.7.2
jupyter-events                0.11.0
jupyter-lsp                   2.2.5
jupyter_packaging             0.12.3
jupyter_server                2.15.0
jupyter_server_terminals      0.5.3
jupyter-telemetry             0.1.0
jupyterhub                    4.1.5
jupyterlab                    4.1.8
jupyterlab_pygments           0.3.0
jupyterlab_server             2.27.1
jupyterlab_widgets            3.0.13
keyring                       24.3.1
langchain                     0.1.20
langchain-anthropic           0.1.13
langchain-community           0.0.38
langchain-core                0.1.53
langchain-google-genai        1.0.4
langchain-mistralai           0.1.7
langchain-nvidia-ai-endpoints 0.1.7
langchain-openai              0.1.7
langchain-text-splitters      0.0.2
langsmith                     0.1.147
libmambapy                    1.5.8

2.19.0 - 2.29.0
The internal server error mentioned in this ticket:

File "/opt/conda/lib/python3.11/site-packages/openai/_base_client.py", line 1644, in _request
raise self._make_status_error_from_response(err.response) from None
openai.InternalServerError: Internal Server Error

jupyter_ai                    2.24.0
jupyter_ai_magics             2.24.0
jupyter_client                8.6.1
jupyter_core                  5.7.2
jupyter-events                0.10.0
jupyter-lsp                   2.2.5
jupyter_server                2.14.0
jupyter_server_terminals      0.5.3
jupyter-telemetry             0.1.0
jupyterhub                    4.1.5
jupyterlab                    4.1.8
jupyterlab_pygments           0.3.0
jupyterlab_server             2.27.1
jupyterlab_widgets            3.0.13
langchain                     0.2.17
langchain-anthropic           0.1.23
langchain-aws                 0.1.18
langchain-cohere              0.2.4
langchain-community           0.2.19
langchain-core                0.2.43
langchain-experimental        0.0.65
langchain-google-genai        1.0.10
langchain-mistralai           0.1.13
langchain-nvidia-ai-endpoints 0.2.2
langchain-ollama              0.1.3
langchain-openai              0.1.25
langchain-text-splitters      0.2.4
libmambapy                    1.5.8

jupyter_ai                    2.29.0
jupyter_ai_magics             2.29.0
jupyter_client                8.6.1
jupyter_core                  5.7.2
jupyter-events                0.10.0
jupyter-lsp                   2.2.5
jupyter_server                2.14.0
jupyter_server_terminals      0.5.3
jupyter-telemetry             0.1.0
jupyterhub                    4.1.5
jupyterlab                    4.1.8
jupyterlab_pygments           0.3.0
jupyterlab_server             2.27.1
jupyterlab_widgets            3.0.13
langchain                     0.3.15
langchain-anthropic           0.3.3
langchain-aws                 0.2.11
langchain-cohere              0.3.5
langchain-community           0.3.15
langchain-core                0.3.31
langchain-experimental        0.3.4
langchain-google-genai        2.0.9
langchain-mistralai           0.2.4
langchain-nvidia-ai-endpoints 0.3.7
langchain-ollama              0.2.2
langchain-openai              0.3.1

@eazuman
Author

eazuman commented Jan 31, 2025

Hello @dlqqq @srdas

I'm still trying a few other things and wanted to check with you whether there are any package dependencies we should consider pinning.

As I mentioned, we’re using jupyterlab==3.6.7 with jupyter_ai==1.11, which does not include the langchain_openai package and works now.

When we moved to jupyterlab==4.1.8 and used a higher version of jupyter_ai (like I mentioned before), langchain_openai became a dependency for the Azure provider.

For this specific version of jupyterlab, do you think I should use a particular version of `jupyter_ai`, or anything like that? Also, I've been installing the packages in different ways, for example:

pip install jupyter_ai[all]==2.28.5

pip install jupyter_ai==2.28.5 langchain_openai

I just want to make sure I’m trying a few options to troubleshoot the issue and not missing something.

Thanks for the help!

@srdas
Collaborator

srdas commented Jan 31, 2025

@eazuman Thanks so much for investigating this further. I looked into all the code changes made when we updated from jupyter-ai v2.11 to v2.12, and nothing modified there seems to affect the Azure OpenAI API. One of the difficulties is that I do not have access to Azure OpenAI APIs, and getting an instance has proven difficult. Since you do have access to Azure OpenAI, can you try the suggestion below, which installs all dependencies from source? If that works, we will know it is a dependency issue and not an Azure issue. The idea is to do a developer install of the v3 that is in the works, following the instructions here: https://jupyter-ai.readthedocs.io/en/latest/contributors/index.html#development-install

Best to do this in a clean new environment:

conda create -n jupyter-ai -c conda-forge python=3.12 nodejs=20
conda activate jupyter-ai
git clone https://github.com/jupyterlab/jupyter-ai.git
# Move to the root of the repo package
cd <jupyter-ai-top>

# Installs all the dependencies and sets up the dev environment
./scripts/install.sh
jlpm
jlpm build

Depending on your environment, you may see some errors, but they may not matter. Then run JupyterLab with:

jlpm dev

The Settings are in a different location, as shown here:

[Screenshot: the Settings location in the Jupyter AI v3 interface]

You can open a new chat from the top left, as shown in the screenshot above. I hope this works or gives some insight into where the issue lies (and also gives you a look at v3!).
