
import langchain with python<=3.9 fails #5113

Closed
1 of 14 tasks
tand826 opened this issue May 23, 2023 · 31 comments
Comments

@tand826

tand826 commented May 23, 2023

System Info

  • platform
$ cat /etc/os-release 
NAME="Ubuntu"
VERSION="20.04.6 LTS (Focal Fossa)"
ID=ubuntu
ID_LIKE=debian
PRETTY_NAME="Ubuntu 20.04.6 LTS"
VERSION_ID="20.04"
HOME_URL="https://www.ubuntu.com/"
SUPPORT_URL="https://help.ubuntu.com/"
BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
VERSION_CODENAME=focal
UBUNTU_CODENAME=focal
  • python
$ python -V
Python 3.9.7

# installed with
asdf install python 3.9.7

# typing_extensions was updated today
typing_extensions==4.6.0
  • dependencies
langchain==0.0.177
openapi-schema-pydantic==1.2.4
pydantic==1.10.7
All the dependencies:
$ pip install langchain
Collecting langchain
  Using cached langchain-0.0.177-py3-none-any.whl (877 kB)
Collecting PyYAML>=5.4.1
  Using cached PyYAML-6.0-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (661 kB)
Collecting openapi-schema-pydantic<2.0,>=1.2
  Using cached openapi_schema_pydantic-1.2.4-py3-none-any.whl (90 kB)
Collecting requests<3,>=2
  Using cached requests-2.31.0-py3-none-any.whl (62 kB)
Collecting SQLAlchemy<3,>=1.4
  Using cached SQLAlchemy-2.0.15-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (2.7 MB)
Collecting aiohttp<4.0.0,>=3.8.3
  Using cached aiohttp-3.8.4-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.0 MB)
Collecting dataclasses-json<0.6.0,>=0.5.7
  Using cached dataclasses_json-0.5.7-py3-none-any.whl (25 kB)
Collecting async-timeout<5.0.0,>=4.0.0
  Using cached async_timeout-4.0.2-py3-none-any.whl (5.8 kB)
Collecting numexpr<3.0.0,>=2.8.4
  Using cached numexpr-2.8.4-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (380 kB)
Collecting pydantic<2,>=1
  Using cached pydantic-1.10.7-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.2 MB)
Collecting numpy<2,>=1
  Using cached numpy-1.24.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (17.3 MB)
Collecting tenacity<9.0.0,>=8.1.0
  Using cached tenacity-8.2.2-py3-none-any.whl (24 kB)
Collecting multidict<7.0,>=4.5
  Using cached multidict-6.0.4-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (114 kB)
Collecting attrs>=17.3.0
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting charset-normalizer<4.0,>=2.0
  Using cached charset_normalizer-3.1.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (199 kB)
Collecting yarl<2.0,>=1.0
  Using cached yarl-1.9.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (269 kB)
Collecting frozenlist>=1.1.1
  Using cached frozenlist-1.3.3-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (158 kB)
Collecting aiosignal>=1.1.2
  Using cached aiosignal-1.3.1-py3-none-any.whl (7.6 kB)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.8.0-py3-none-any.whl (8.7 kB)
Collecting marshmallow<4.0.0,>=3.3.0
  Using cached marshmallow-3.19.0-py3-none-any.whl (49 kB)
Collecting marshmallow-enum<2.0.0,>=1.5.1
  Using cached marshmallow_enum-1.5.1-py2.py3-none-any.whl (4.2 kB)
Collecting packaging>=17.0
  Using cached packaging-23.1-py3-none-any.whl (48 kB)
Collecting typing-extensions>=4.2.0
  Using cached typing_extensions-4.6.0-py3-none-any.whl (30 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting urllib3<3,>=1.21.1
  Using cached urllib3-2.0.2-py3-none-any.whl (123 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (610 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-1.0.0-py3-none-any.whl (4.7 kB)
Installing collected packages: packaging, typing-extensions, mypy-extensions, multidict, marshmallow, idna, frozenlist, yarl, urllib3, typing-inspect, pydantic, numpy, marshmallow-enum, greenlet, charset-normalizer, certifi, attrs, async-timeout, aiosignal, tenacity, SQLAlchemy, requests, PyYAML, openapi-schema-pydantic, numexpr, dataclasses-json, aiohttp, langchain
Successfully installed PyYAML-6.0 SQLAlchemy-2.0.15 aiohttp-3.8.4 aiosignal-1.3.1 async-timeout-4.0.2 attrs-23.1.0 certifi-2023.5.7 charset-normalizer-3.1.0 dataclasses-json-0.5.7 frozenlist-1.3.3 greenlet-2.0.2 idna-3.4 langchain-0.0.177 marshmallow-3.19.0 marshmallow-enum-1.5.1 multidict-6.0.4 mypy-extensions-1.0.0 numexpr-2.8.4 numpy-1.24.3 openapi-schema-pydantic-1.2.4 packaging-23.1 pydantic-1.10.7 requests-2.31.0 tenacity-8.2.2 typing-extensions-4.6.0 typing-inspect-0.8.0 urllib3-2.0.2 yarl-1.9.2

Who can help?

@hwchase17 @agola11

Information

  • The official example notebooks/scripts
  • My own modified scripts

Related Components

  • LLMs/Chat Models
  • Embedding Models
  • Prompts / Prompt Templates / Prompt Selectors
  • Output Parsers
  • Document Loaders
  • Vector Stores / Retrievers
  • Memory
  • Agents / Agent Executors
  • Tools / Toolkits
  • Chains
  • Callbacks/Tracing
  • Async

Reproduction

  1. install python==3.9.7, 3.9.8, or 3.9.9 (with asdf or docker; I haven't checked the other versions)
  2. install langchain with pip install langchain
  3. see the error
Python 3.9.7 (default, May 23 2023, 11:05:54) 
[GCC 9.4.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import langchain
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/takumi/.asdf/installs/python/3.9.7/lib/python3.9/site-packages/langchain/__init__.py", line 6, in <module>
    from langchain.agents import MRKLChain, ReActChain, SelfAskWithSearchChain
  File "/home/takumi/.asdf/installs/python/3.9.7/lib/python3.9/site-packages/langchain/agents/__init__.py", line 2, in <module>
    from langchain.agents.agent import (
  File "/home/takumi/.asdf/installs/python/3.9.7/lib/python3.9/site-packages/langchain/agents/agent.py", line 16, in <module>
    from langchain.agents.tools import InvalidTool
  File "/home/takumi/.asdf/installs/python/3.9.7/lib/python3.9/site-packages/langchain/agents/tools.py", line 8, in <module>
    from langchain.tools.base import BaseTool, Tool, tool
  File "/home/takumi/.asdf/installs/python/3.9.7/lib/python3.9/site-packages/langchain/tools/__init__.py", line 42, in <module>
    from langchain.tools.vectorstore.tool import (
  File "/home/takumi/.asdf/installs/python/3.9.7/lib/python3.9/site-packages/langchain/tools/vectorstore/tool.py", line 13, in <module>
    from langchain.chains import RetrievalQA, RetrievalQAWithSourcesChain
  File "/home/takumi/.asdf/installs/python/3.9.7/lib/python3.9/site-packages/langchain/chains/__init__.py", line 2, in <module>
    from langchain.chains.api.base import APIChain
  File "/home/takumi/.asdf/installs/python/3.9.7/lib/python3.9/site-packages/langchain/chains/api/base.py", line 13, in <module>
    from langchain.chains.api.prompt import API_RESPONSE_PROMPT, API_URL_PROMPT
  File "/home/takumi/.asdf/installs/python/3.9.7/lib/python3.9/site-packages/langchain/chains/api/prompt.py", line 2, in <module>
    from langchain.prompts.prompt import PromptTemplate
  File "/home/takumi/.asdf/installs/python/3.9.7/lib/python3.9/site-packages/langchain/prompts/__init__.py", line 3, in <module>
    from langchain.prompts.chat import (
  File "/home/takumi/.asdf/installs/python/3.9.7/lib/python3.9/site-packages/langchain/prompts/chat.py", line 10, in <module>
    from langchain.memory.buffer import get_buffer_string
  File "/home/takumi/.asdf/installs/python/3.9.7/lib/python3.9/site-packages/langchain/memory/__init__.py", line 28, in <module>
    from langchain.memory.vectorstore import VectorStoreRetrieverMemory
  File "/home/takumi/.asdf/installs/python/3.9.7/lib/python3.9/site-packages/langchain/memory/vectorstore.py", line 10, in <module>
    from langchain.vectorstores.base import VectorStoreRetriever
  File "/home/takumi/.asdf/installs/python/3.9.7/lib/python3.9/site-packages/langchain/vectorstores/__init__.py", line 2, in <module>
    from langchain.vectorstores.analyticdb import AnalyticDB
  File "/home/takumi/.asdf/installs/python/3.9.7/lib/python3.9/site-packages/langchain/vectorstores/analyticdb.py", line 16, in <module>
    from langchain.embeddings.base import Embeddings
  File "/home/takumi/.asdf/installs/python/3.9.7/lib/python3.9/site-packages/langchain/embeddings/__init__.py", line 19, in <module>
    from langchain.embeddings.openai import OpenAIEmbeddings
  File "/home/takumi/.asdf/installs/python/3.9.7/lib/python3.9/site-packages/langchain/embeddings/openai.py", line 67, in <module>
    class OpenAIEmbeddings(BaseModel, Embeddings):
  File "pydantic/main.py", line 197, in pydantic.main.ModelMetaclass.__new__
  File "pydantic/fields.py", line 506, in pydantic.fields.ModelField.infer
  File "pydantic/fields.py", line 436, in pydantic.fields.ModelField.__init__
  File "pydantic/fields.py", line 552, in pydantic.fields.ModelField.prepare
  File "pydantic/fields.py", line 663, in pydantic.fields.ModelField._type_analysis
  File "pydantic/fields.py", line 808, in pydantic.fields.ModelField._create_sub_type
  File "pydantic/fields.py", line 436, in pydantic.fields.ModelField.__init__
  File "pydantic/fields.py", line 552, in pydantic.fields.ModelField.prepare
  File "pydantic/fields.py", line 668, in pydantic.fields.ModelField._type_analysis
  File "/home/takumi/.asdf/installs/python/3.9.7/lib/python3.9/typing.py", line 847, in __subclasscheck__
    return issubclass(cls, self.__origin__)
TypeError: issubclass() arg 1 must be a class
>>>
With docker:
$ docker run -it python:3.9.7-bullseye bash
$ pip install langchain
$ python -c "import langchain"
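The TypeError at the bottom of the traceback can be reproduced in isolation: issubclass() refuses a parameterized generic alias as its first argument, which matches what pydantic ends up passing here. A minimal sketch of just that failure mode, independent of langchain and pydantic (the helper name is illustrative):

```python
from typing import List


def fails_issubclass(candidate) -> bool:
    """Return True if issubclass() rejects `candidate` as its first argument."""
    try:
        issubclass(candidate, list)
        return False
    except TypeError:
        return True


# A parameterized generic alias is not a class, so this raises the same
# "issubclass() arg 1 must be a class" TypeError seen in the traceback above.
print(fails_issubclass(List[int]))  # True
print(fails_issubclass(list))       # False: a real class is accepted
```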

Expected behavior

Python 3.10.1 (main, Dec 21 2021, 09:01:08) [GCC 10.2.1 20210110] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import langchain
>>>
  • what to do?

    • change the python dependency to 3.10 or later
    • pin typing_extensions to 4.5.0, or change the relevant code
  • Thank you for checking out this issue. If there is anything more to check, I would be glad to help.
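The pinning option above can be captured in a small constraints file (a sketch; pip's -c flag applies version constraints without adding packages to the install set):

```
# constraints.txt -- apply with: pip install -c constraints.txt langchain
typing_extensions==4.5.0
```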

@dojowahi

dojowahi commented May 23, 2023

Even with 3.10, where the import works, once you try using the module you will encounter this error: Could not import transformers python package. This is needed in order to calculate get_token_ids. Please install it with pip install transformers.

@iterix

iterix commented May 23, 2023

Running into this for python v3.8 and v3.9.

I suspect some package that has not been pinned was updated on PyPI in the past few hours.

@Liweixin22

Liweixin22 commented May 23, 2023

I get the same error on py3.9 too; on Python 3.10.6 it is OK.

@tand826
Author

tand826 commented May 23, 2023

Does anyone know where the supported Python versions are documented?

@nai-kon

nai-kon commented May 23, 2023

PyPI says "Python >=3.8.1, <4.0":
https://pypi.org/project/langchain/#description

@KhaledLela

KhaledLela commented May 23, 2023

Pinning these dependencies:

typing-inspect==0.8.0
typing_extensions==4.5.0

solved the issue for me.

@tand826
Author

tand826 commented May 23, 2023

Pinning these dependencies:

typing-inspect==0.8.0
typing_extensions==4.5.0

solved the issue for me.

It worked! Thank you!

typing_extensions was updated only a few hours ago, and that update appears to be the trigger.
https://pypi.org/project/typing-extensions/#history

@nshgraph

Could the poetry.lock please be updated to actually lock breaking dependencies?

@sts-kanada

sts-kanada commented May 23, 2023

I encountered the same error while using langchain on Docker.

Here's my Dockerfile:

FROM python:3.8-slim
# I also tried python:3.8, python:3.8-buster, and python:3.9-slim; none of them worked.
# python:3.10-slim works for me.

I didn't face this issue with python-3.8 on my local Ubuntu, shown below.

NAME="Ubuntu"
VERSION="20.04.6 LTS (Focal Fossa)"
ID=ubuntu
ID_LIKE=debian
PRETTY_NAME="Ubuntu 20.04.6 LTS"
VERSION_ID="20.04"
HOME_URL="https://www.ubuntu.com/"
SUPPORT_URL="https://help.ubuntu.com/"
BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
VERSION_CODENAME=focal
UBUNTU_CODENAME=focal

I modified requirements.txt by changing langchain==0.0.173 to langchain==0.0.125, and it worked.

$ sudo docker-compose run web bash
root@5bf0c69be9ec:/app# python --version
Python 3.8.16
root@5bf0c69be9ec:/app# python
Python 3.8.16 (default, May  3 2023, 10:09:08) 
[GCC 10.2.1 20210110] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import langchain
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python3.8/site-packages/langchain/__init__.py", line 6, in <module>
    from langchain.agents import MRKLChain, ReActChain, SelfAskWithSearchChain
  File "/usr/local/lib/python3.8/site-packages/langchain/agents/__init__.py", line 2, in <module>
    from langchain.agents.agent import (
  File "/usr/local/lib/python3.8/site-packages/langchain/agents/agent.py", line 16, in <module>
    from langchain.agents.tools import InvalidTool
  File "/usr/local/lib/python3.8/site-packages/langchain/agents/tools.py", line 8, in <module>
    from langchain.tools.base import BaseTool, Tool, tool
  File "/usr/local/lib/python3.8/site-packages/langchain/tools/__init__.py", line 42, in <module>
    from langchain.tools.vectorstore.tool import (
  File "/usr/local/lib/python3.8/site-packages/langchain/tools/vectorstore/tool.py", line 13, in <module>
    from langchain.chains import RetrievalQA, RetrievalQAWithSourcesChain
  File "/usr/local/lib/python3.8/site-packages/langchain/chains/__init__.py", line 2, in <module>
    from langchain.chains.api.base import APIChain
  File "/usr/local/lib/python3.8/site-packages/langchain/chains/api/base.py", line 13, in <module>
    from langchain.chains.api.prompt import API_RESPONSE_PROMPT, API_URL_PROMPT
  File "/usr/local/lib/python3.8/site-packages/langchain/chains/api/prompt.py", line 2, in <module>
    from langchain.prompts.prompt import PromptTemplate
  File "/usr/local/lib/python3.8/site-packages/langchain/prompts/__init__.py", line 3, in <module>
    from langchain.prompts.chat import (
  File "/usr/local/lib/python3.8/site-packages/langchain/prompts/chat.py", line 10, in <module>
    from langchain.memory.buffer import get_buffer_string
  File "/usr/local/lib/python3.8/site-packages/langchain/memory/__init__.py", line 28, in <module>
    from langchain.memory.vectorstore import VectorStoreRetrieverMemory
  File "/usr/local/lib/python3.8/site-packages/langchain/memory/vectorstore.py", line 10, in <module>
    from langchain.vectorstores.base import VectorStoreRetriever
  File "/usr/local/lib/python3.8/site-packages/langchain/vectorstores/__init__.py", line 2, in <module>
    from langchain.vectorstores.analyticdb import AnalyticDB
  File "/usr/local/lib/python3.8/site-packages/langchain/vectorstores/analyticdb.py", line 15, in <module>
    from langchain.embeddings.base import Embeddings
  File "/usr/local/lib/python3.8/site-packages/langchain/embeddings/__init__.py", line 19, in <module>
    from langchain.embeddings.openai import OpenAIEmbeddings
  File "/usr/local/lib/python3.8/site-packages/langchain/embeddings/openai.py", line 67, in <module>
    class OpenAIEmbeddings(BaseModel, Embeddings):
  File "pydantic/main.py", line 197, in pydantic.main.ModelMetaclass.__new__
  File "pydantic/fields.py", line 506, in pydantic.fields.ModelField.infer
  File "pydantic/fields.py", line 436, in pydantic.fields.ModelField.__init__
  File "pydantic/fields.py", line 552, in pydantic.fields.ModelField.prepare
  File "pydantic/fields.py", line 663, in pydantic.fields.ModelField._type_analysis
  File "pydantic/fields.py", line 808, in pydantic.fields.ModelField._create_sub_type
  File "pydantic/fields.py", line 436, in pydantic.fields.ModelField.__init__
  File "pydantic/fields.py", line 552, in pydantic.fields.ModelField.prepare
  File "pydantic/fields.py", line 668, in pydantic.fields.ModelField._type_analysis
  File "/usr/local/lib/python3.8/typing.py", line 774, in __subclasscheck__
    return issubclass(cls, self.__origin__)
TypeError: issubclass() arg 1 must be a class
>>> 

@vsvn-SuNT

In Dockerfile, FROM python:3.10-slim is ok for me.

@maail

maail commented May 23, 2023

Spent hours debugging this. Glad I found this thread. python:3.10-slim worked for me. Thanks @tiensunguyen

@benwilde

Pinning these dependencies:

typing-inspect==0.8.0
typing_extensions==4.5.0

solved the issue for me.

It worked! Thank you!

typing_extensions was updated only a few hours ago, and that update appears to be the trigger. https://pypi.org/project/typing-extensions/#history

TY - I thought I was losing my mind. Was working on my laptop but not my GCP instance. This fixed it.

@tand826
Author

tand826 commented May 23, 2023

After all, the root cause of this problem is the typing_extensions==4.6.0 update, but pydantic is what directly triggers it. pydantic is tackling this (issue, pr). We need to wait for a while.

@adumont
Contributor

adumont commented May 23, 2023

BTW it also fails with typing_extensions==4.6.0.

Python 3.9.16 (main, Mar  8 2023, 14:00:05)
[GCC 11.2.0] :: Anaconda, Inc. on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from langchain.embeddings import OpenAIEmbeddings
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/adumont/ai/streamlit-confluence/venv/lib/python3.9/site-packages/langchain/__init__.py", line 6, in <module>
    from langchain.agents import MRKLChain, ReActChain, SelfAskWithSearchChain
  File "/home/adumont/ai/streamlit-confluence/venv/lib/python3.9/site-packages/langchain/agents/__init__.py", line 2, in <module>
    from langchain.agents.agent import (
  File "/home/adumont/ai/streamlit-confluence/venv/lib/python3.9/site-packages/langchain/agents/agent.py", line 16, in <module>
    from langchain.agents.tools import InvalidTool
  File "/home/adumont/ai/streamlit-confluence/venv/lib/python3.9/site-packages/langchain/agents/tools.py", line 8, in <module>
    from langchain.tools.base import BaseTool, Tool, tool
  File "/home/adumont/ai/streamlit-confluence/venv/lib/python3.9/site-packages/langchain/tools/__init__.py", line 42, in <module>
    from langchain.tools.vectorstore.tool import (
  File "/home/adumont/ai/streamlit-confluence/venv/lib/python3.9/site-packages/langchain/tools/vectorstore/tool.py", line 13, in <module>
    from langchain.chains import RetrievalQA, RetrievalQAWithSourcesChain
  File "/home/adumont/ai/streamlit-confluence/venv/lib/python3.9/site-packages/langchain/chains/__init__.py", line 2, in <module>
    from langchain.chains.api.base import APIChain
  File "/home/adumont/ai/streamlit-confluence/venv/lib/python3.9/site-packages/langchain/chains/api/base.py", line 13, in <module>
    from langchain.chains.api.prompt import API_RESPONSE_PROMPT, API_URL_PROMPT
  File "/home/adumont/ai/streamlit-confluence/venv/lib/python3.9/site-packages/langchain/chains/api/prompt.py", line 2, in <module>
    from langchain.prompts.prompt import PromptTemplate
  File "/home/adumont/ai/streamlit-confluence/venv/lib/python3.9/site-packages/langchain/prompts/__init__.py", line 3, in <module>
    from langchain.prompts.chat import (
  File "/home/adumont/ai/streamlit-confluence/venv/lib/python3.9/site-packages/langchain/prompts/chat.py", line 10, in <module>
    from langchain.memory.buffer import get_buffer_string
  File "/home/adumont/ai/streamlit-confluence/venv/lib/python3.9/site-packages/langchain/memory/__init__.py", line 28, in <module>
    from langchain.memory.vectorstore import VectorStoreRetrieverMemory
  File "/home/adumont/ai/streamlit-confluence/venv/lib/python3.9/site-packages/langchain/memory/vectorstore.py", line 10, in <module>
    from langchain.vectorstores.base import VectorStoreRetriever
  File "/home/adumont/ai/streamlit-confluence/venv/lib/python3.9/site-packages/langchain/vectorstores/__init__.py", line 2, in <module>
    from langchain.vectorstores.analyticdb import AnalyticDB
  File "/home/adumont/ai/streamlit-confluence/venv/lib/python3.9/site-packages/langchain/vectorstores/analyticdb.py", line 16, in <module>
    from langchain.embeddings.base import Embeddings
  File "/home/adumont/ai/streamlit-confluence/venv/lib/python3.9/site-packages/langchain/embeddings/__init__.py", line 19, in <module>
    from langchain.embeddings.openai import OpenAIEmbeddings
  File "/home/adumont/ai/streamlit-confluence/venv/lib/python3.9/site-packages/langchain/embeddings/openai.py", line 67, in <module>
    class OpenAIEmbeddings(BaseModel, Embeddings):
  File "pydantic/main.py", line 197, in pydantic.main.ModelMetaclass.__new__
  File "pydantic/fields.py", line 506, in pydantic.fields.ModelField.infer
  File "pydantic/fields.py", line 436, in pydantic.fields.ModelField.__init__
  File "pydantic/fields.py", line 552, in pydantic.fields.ModelField.prepare
  File "pydantic/fields.py", line 663, in pydantic.fields.ModelField._type_analysis
  File "pydantic/fields.py", line 808, in pydantic.fields.ModelField._create_sub_type
  File "pydantic/fields.py", line 436, in pydantic.fields.ModelField.__init__
  File "pydantic/fields.py", line 552, in pydantic.fields.ModelField.prepare
  File "pydantic/fields.py", line 668, in pydantic.fields.ModelField._type_analysis
  File "/home/adumont/miniconda3/envs/ai39/lib/python3.9/typing.py", line 852, in __subclasscheck__
    return issubclass(cls, self.__origin__)
TypeError: issubclass() arg 1 must be a class

@jpzhangvincent
Contributor

jpzhangvincent commented May 23, 2023

Pinning these dependencies:

typing-inspect==0.8.0
typing_extensions==4.5.0

solved the issue for me.

Hmm, it's not working for me. I tried uninstalling both packages and langchain and then reinstalling them as well. Any other pointers?

@dojowahi

My older docker images, which were created before this build, are working, but nothing built since then does. Is there a solution that is working?

@tand826
Author

tand826 commented May 24, 2023

Latest pydantic==1.10.8 fixed this issue (here). Langchain itself needs no update because its pydantic constraint is pydantic<2,>=1 (here).

Just in case, this is how to check the update.

docker run -it --rm python:3.9-bullseye bash
pip install langchain
python -c "import langchain"
pip freeze | grep pydantic
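For scripts that want to verify the fixed pydantic is installed before importing langchain, a stdlib-only sketch (the helpers are illustrative, not langchain API; real projects would use packaging.version for full PEP 440 handling):

```python
from importlib.metadata import PackageNotFoundError, version


def parse_version(v: str) -> tuple:
    """Turn a simple version string like '1.10.8' into (1, 10, 8)
    so it can be compared as a tuple. Not full PEP 440 parsing."""
    return tuple(int(part) for part in v.split("."))


def pydantic_has_fix(minimum: str = "1.10.8") -> bool:
    """True if the installed pydantic is at least `minimum`."""
    try:
        return parse_version(version("pydantic")) >= parse_version(minimum)
    except (PackageNotFoundError, ValueError):
        return False


print(parse_version("1.10.8") > parse_version("1.10.7"))  # True
```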

@tand826 tand826 closed this as completed May 24, 2023
@piseabhijeet

Pinning these dependencies:

typing-inspect==0.8.0
typing_extensions==4.5.0

solved the issue for me.

Hmm, it's not working for me. I tried uninstalling both packages and langchain and then reinstalling them as well. Any other pointers?

Hi,

This fix doesn't work for me

python 3.10.0
Mac M1 chipset

@piseabhijeet

piseabhijeet commented May 26, 2023

pydantic==1.10.8

This also doesn't fix the issue for me unfortunately.

TypeError                                 Traceback (most recent call last)
Input In [12], in <cell line: 4>()
      2 get_ipython().system('pip install typing_extensions==4.5.0')
      3 get_ipython().system('pip install pydantic==1.10.8')
----> 4 from langchain.chat_models import ChatOpenAI
      5 from langchain.schema import HumanMessage, SystemMessage, AIMessage

File ~/miniconda3/envs/abhijeet/lib/python3.10/site-packages/langchain/__init__.py:6, in <module>
      3 from importlib import metadata
      4 from typing import Optional
----> 6 from langchain.agents import MRKLChain, ReActChain, SelfAskWithSearchChain
      7 from langchain.cache import BaseCache
      8 from langchain.chains import (
      9     ConversationChain,
     10     LLMBashChain,
   (...)
     18     VectorDBQAWithSourcesChain,
     19 )

File ~/miniconda3/envs/abhijeet/lib/python3.10/site-packages/langchain/agents/__init__.py:10, in <module>
      1 """Interface for agents."""
      2 from langchain.agents.agent import (
      3     Agent,
      4     AgentExecutor,
   (...)
      8     LLMSingleActionAgent,
      9 )
---> 10 from langchain.agents.agent_toolkits import (
     11     create_csv_agent,
     12     create_json_agent,
     13     create_openapi_agent,
     14     create_pandas_dataframe_agent,
     15     create_pbi_agent,
     16     create_pbi_chat_agent,
     17     create_spark_dataframe_agent,
     18     create_sql_agent,
     19     create_vectorstore_agent,
     20     create_vectorstore_router_agent,
     21 )
     22 from langchain.agents.agent_types import AgentType
     23 from langchain.agents.conversational.base import ConversationalAgent

File ~/miniconda3/envs/abhijeet/lib/python3.10/site-packages/langchain/agents/agent_toolkits/__init__.py:3, in <module>
      1 """Agent toolkits."""
----> 3 from langchain.agents.agent_toolkits.csv.base import create_csv_agent
      4 from langchain.agents.agent_toolkits.file_management.toolkit import (
      5     FileManagementToolkit,
      6 )
      7 from langchain.agents.agent_toolkits.gmail.toolkit import GmailToolkit

File ~/miniconda3/envs/abhijeet/lib/python3.10/site-packages/langchain/agents/agent_toolkits/csv/base.py:5, in <module>
      2 from typing import Any, Optional
      4 from langchain.agents.agent import AgentExecutor
----> 5 from langchain.agents.agent_toolkits.pandas.base import create_pandas_dataframe_agent
      6 from langchain.base_language import BaseLanguageModel
      9 def create_csv_agent(
     10     llm: BaseLanguageModel,
     11     path: str,
     12     pandas_kwargs: Optional[dict] = None,
     13     **kwargs: Any
     14 ) -> AgentExecutor:

File ~/miniconda3/envs/abhijeet/lib/python3.10/site-packages/langchain/agents/agent_toolkits/pandas/base.py:10, in <module>
      4 from langchain.agents.agent import AgentExecutor
      5 from langchain.agents.agent_toolkits.pandas.prompt import (
      6     PREFIX,
      7     SUFFIX_NO_DF,
      8     SUFFIX_WITH_DF,
      9 )
---> 10 from langchain.agents.mrkl.base import ZeroShotAgent
     11 from langchain.base_language import BaseLanguageModel
     12 from langchain.callbacks.base import BaseCallbackManager

File ~/miniconda3/envs/abhijeet/lib/python3.10/site-packages/langchain/agents/mrkl/base.py:16, in <module>
     14 from langchain.base_language import BaseLanguageModel
     15 from langchain.callbacks.base import BaseCallbackManager
---> 16 from langchain.chains import LLMChain
     17 from langchain.prompts import PromptTemplate
     18 from langchain.tools.base import BaseTool

File ~/miniconda3/envs/abhijeet/lib/python3.10/site-packages/langchain/chains/__init__.py:11, in <module>
      6 from langchain.chains.conversation.base import ConversationChain
      7 from langchain.chains.conversational_retrieval.base import (
      8     ChatVectorDBChain,
      9     ConversationalRetrievalChain,
     10 )
---> 11 from langchain.chains.flare.base import FlareChain
     12 from langchain.chains.graph_qa.base import GraphQAChain
     13 from langchain.chains.hyde.base import HypotheticalDocumentEmbedder

File ~/miniconda3/envs/abhijeet/lib/python3.10/site-packages/langchain/chains/flare/base.py:21, in <module>
     15 from langchain.chains.flare.prompts import (
     16     PROMPT,
     17     QUESTION_GENERATOR_PROMPT,
     18     FinishedOutputParser,
     19 )
     20 from langchain.chains.llm import LLMChain
---> 21 from langchain.llms import OpenAI
     22 from langchain.prompts import BasePromptTemplate
     23 from langchain.schema import BaseRetriever, Generation

File ~/miniconda3/envs/abhijeet/lib/python3.10/site-packages/langchain/llms/__init__.py:26, in <module>
     24 from langchain.llms.modal import Modal
     25 from langchain.llms.nlpcloud import NLPCloud
---> 26 from langchain.llms.openai import AzureOpenAI, OpenAI, OpenAIChat
     27 from langchain.llms.petals import Petals
     28 from langchain.llms.pipelineai import PipelineAI

File ~/miniconda3/envs/abhijeet/lib/python3.10/site-packages/langchain/llms/openai.py:123, in <module>
    118         return await llm.client.acreate(**kwargs)
    120     return await _completion_with_retry(**kwargs)
--> 123 class BaseOpenAI(BaseLLM):
    124     """Wrapper around OpenAI large language models."""
    126     client: Any  #: :meta private:

File ~/miniconda3/envs/abhijeet/lib/python3.10/site-packages/pydantic/main.py:198, in pydantic.main.ModelMetaclass.__new__()

File ~/miniconda3/envs/abhijeet/lib/python3.10/site-packages/pydantic/fields.py:506, in pydantic.fields.ModelField.infer()

File ~/miniconda3/envs/abhijeet/lib/python3.10/site-packages/pydantic/fields.py:436, in pydantic.fields.ModelField.__init__()

File ~/miniconda3/envs/abhijeet/lib/python3.10/site-packages/pydantic/fields.py:552, in pydantic.fields.ModelField.prepare()

File ~/miniconda3/envs/abhijeet/lib/python3.10/site-packages/pydantic/fields.py:663, in pydantic.fields.ModelField._type_analysis()

File ~/miniconda3/envs/abhijeet/lib/python3.10/site-packages/pydantic/fields.py:808, in pydantic.fields.ModelField._create_sub_type()

File ~/miniconda3/envs/abhijeet/lib/python3.10/site-packages/pydantic/fields.py:436, in pydantic.fields.ModelField.__init__()

File ~/miniconda3/envs/abhijeet/lib/python3.10/site-packages/pydantic/fields.py:552, in pydantic.fields.ModelField.prepare()

File ~/miniconda3/envs/abhijeet/lib/python3.10/site-packages/pydantic/fields.py:668, in pydantic.fields.ModelField._type_analysis()

File ~/miniconda3/envs/abhijeet/lib/python3.10/typing.py:1134, in _SpecialGenericAlias.__subclasscheck__(self, cls)
   1132     return issubclass(cls.__origin__, self.__origin__)
   1133 if not isinstance(cls, _GenericAlias):
-> 1134     return issubclass(cls, self.__origin__)
   1135 return super().__subclasscheck__(cls)

TypeError: issubclass() arg 1 must be a class

@piseabhijeet

Pinning these dependencies:

typing-inspect==0.8.0
typing_extensions==4.5.0

solved the issue for me.

Hmm, it's not working for me. I tried uninstalling both packages and langchain and then reinstalling them as well. Any other pointers?

Hi,

This fix doesn't work for me.

python 3.10.0, Mac M1 chipset

switching to python 3.8 worked for me :)

@misirov

misirov commented May 27, 2023

Can confirm that on my side, running this command fixed the problem:
pip install typing-inspect==0.8.0 typing_extensions==4.5.0

@adumont
Contributor

adumont commented May 27, 2023 via email

@meccaparker

Can confirm that on my side, running this command fixed the problem: pip install typing-inspect==0.8.0 typing_extensions==4.5.0

Seconding this. I was still seeing errors despite the previously-posted solution. This command fixed it.

@fxb392

fxb392 commented May 28, 2023

python 3.10.6
langchain 0.0.181
I also encountered this problem.

@Dalaoyel

Pinning these dependencies:

typing-inspect==0.8.0
typing_extensions==4.5.0

solved the issue for me.

That works for me, thanks

@motin

motin commented May 29, 2023

As mentioned above, this has been fixed in Pydantic 1.10.8. No need to pin typing-inspect and typing_extensions if you are using Pydantic 1.10.8. For poetry, use pydantic = "^1.10.8" in your pyproject.toml file.
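For reference, the poetry constraint from the comment above sits in the standard dependencies table of pyproject.toml; the caret allows any 1.10.x at or above 1.10.8 but excludes 2.x (the python range here is taken from the PyPI metadata quoted earlier in the thread):

```toml
[tool.poetry.dependencies]
python = ">=3.8.1,<4.0"
pydantic = "^1.10.8"
```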

@maxoute

maxoute commented Jun 15, 2023

Thanks !

pip install pydantic==1.10.8 works for me

@cibernicola

fixed the issue with pydantic==1.10.8

@blakefan

blakefan commented Jul 2, 2023

pip install pydantic==1.10.8 works for me

Solved the issue for me, thanks!

@FarnooshAzour-Byond

fixed the issue with pydantic==1.10.8

Thankss!

@anatesan-stream

I am still getting this issue with Python 3.9.2 and pydantic==1.10.8.

Should I upgrade my Python version? (I am on Debian release 11.)

Oleh929216 added a commit to Oleh929216/Chainlit that referenced this issue Jan 30, 2025