
TypeError: issubclass() arg 1 must be a class when using langchain in azure #7548

Closed
1 of 14 tasks
levalencia opened this issue Jul 11, 2023 · 23 comments
Labels
🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature

Comments

@levalencia
Contributor

System Info

langchain 0.0.225 also tested with 0.0.229
I can only reproduce it in Azure; I can't reproduce it locally.

Who can help?

I have a simple Python app with Streamlit and LangChain. I am deploying it to Azure via CI/CD with the following YAML definition:

stages:
- stage: Build
  displayName: Build stage
  jobs:
  - job: BuildJob
    pool:
      vmImage: $(vmImageName)
    steps:
    - task: UsePythonVersion@0
      inputs:
        versionSpec: '$(pythonVersion)'
      displayName: 'Use Python $(pythonVersion)'

    - script: |
        python -m venv antenv
        source antenv/bin/activate
        python -m pip install --upgrade pip
        pip install setup streamlit
        pip install --target="./.python_packages/lib/site-packages" -r ./requirements.txt
      workingDirectory: $(projectRoot)
      displayName: "Install requirements"

    - task: ArchiveFiles@2
      displayName: 'Archive files'
      inputs:
        rootFolderOrFile: '$(projectRoot)'
        includeRootFolder: false
        archiveType: zip
        archiveFile: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
        replaceExistingArchive: true

    - upload: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
      displayName: 'Upload package'
      artifact: drop

- stage: Deploy
  displayName: 'Deploy Web App'
  dependsOn: Build
  condition: succeeded()
  jobs:
  - deployment: DeploymentJob
    pool:
      vmImage: $(vmImageName)
    environment: $(environmentName)
    strategy:
      runOnce:
        deploy:
          steps:

          - task: UsePythonVersion@0
            inputs:
              versionSpec: '$(pythonVersion)'
            displayName: 'Use Python version'
        
          - task: AzureAppServiceSettings@1
            displayName: 'Set App Settings'
            inputs:
              azureSubscription: 'AzureAIPocPrincipal'
              appName: 'test'
              resourceGroupName: 'AzureAIPoc'
              appSettings: |
                [
                  {
                    "name": "ENABLE_ORYX_BUILD",
                    "value": 1
                  },
                  {
                    "name": "SCM_DO_BUILD_DURING_DEPLOYMENT",
                    "value": 1
                  },
                  {
                    "name": "POST_BUILD_COMMAND",
                    "value": "pip install -r ./requirements.txt"
                  }
                ]

          - task: AzureWebApp@1
            displayName: 'Deploy Azure Web App : {{ webAppName }}'
            inputs:
              azureSubscription: 'AzureAIPocPrincipal'
              appType: 'webAppLinux'
              deployToSlotOrASE: true
              resourceGroupName: 'AzureAIPoc'
              slotName: 'production'
              appName: 'test'
              package: '$(Pipeline.Workspace)/drop/$(Build.BuildId).zip'
              startUpCommand: 'python -m streamlit run app/home.py --server.port 8000 --server.address 0.0.0.0'

My requirements file is:

langchain==0.0.225
streamlit
openai
python-dotenv
pinecone-client
streamlit-chat
chromadb
tiktoken
pymssql
typing-inspect==0.8.0
typing_extensions==4.5.0

However I am getting the following error:

TypeError: issubclass() arg 1 must be a class
Traceback:
File "/tmp/8db82251b0e58bc/antenv/lib/python3.11/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 552, in _run_script
    exec(code, module.__dict__)
File "/tmp/8db82251b0e58bc/app/pages/xxv0.2.py", line 6, in <module>
    import langchain
File "/tmp/8db82251b0e58bc/antenv/lib/python3.11/site-packages/langchain/__init__.py", line 6, in <module>
    from langchain.agents import MRKLChain, ReActChain, SelfAskWithSearchChain
File "/tmp/8db82251b0e58bc/antenv/lib/python3.11/site-packages/langchain/agents/__init__.py", line 2, in <module>
    from langchain.agents.agent import (
File "/tmp/8db82251b0e58bc/antenv/lib/python3.11/site-packages/langchain/agents/agent.py", line 26, in <module>
    from langchain.chains.base import Chain
File "/tmp/8db82251b0e58bc/antenv/lib/python3.11/site-packages/langchain/chains/__init__.py", line 2, in <module>
    from langchain.chains.api.base import APIChain
File "/tmp/8db82251b0e58bc/antenv/lib/python3.11/site-packages/langchain/chains/api/base.py", line 13, in <module>
    from langchain.chains.api.prompt import API_RESPONSE_PROMPT, API_URL_PROMPT
File "/tmp/8db82251b0e58bc/antenv/lib/python3.11/site-packages/langchain/chains/api/prompt.py", line 2, in <module>
    from langchain.prompts.prompt import PromptTemplate
File "/tmp/8db82251b0e58bc/antenv/lib/python3.11/site-packages/langchain/prompts/__init__.py", line 12, in <module>
    from langchain.prompts.example_selector import (
File "/tmp/8db82251b0e58bc/antenv/lib/python3.11/site-packages/langchain/prompts/example_selector/__init__.py", line 4, in <module>
    from langchain.prompts.example_selector.semantic_similarity import (
File "/tmp/8db82251b0e58bc/antenv/lib/python3.11/site-packages/langchain/prompts/example_selector/semantic_similarity.py", line 8, in <module>
    from langchain.embeddings.base import Embeddings
File "/tmp/8db82251b0e58bc/antenv/lib/python3.11/site-packages/langchain/embeddings/__init__.py", line 29, in <module>
    from langchain.embeddings.sagemaker_endpoint import SagemakerEndpointEmbeddings
File "/tmp/8db82251b0e58bc/antenv/lib/python3.11/site-packages/langchain/embeddings/sagemaker_endpoint.py", line 7, in <module>
    from langchain.llms.sagemaker_endpoint import ContentHandlerBase
File "/tmp/8db82251b0e58bc/antenv/lib/python3.11/site-packages/langchain/llms/__init__.py", line 52, in <module>
    from langchain.llms.vertexai import VertexAI
File "/tmp/8db82251b0e58bc/antenv/lib/python3.11/site-packages/langchain/llms/vertexai.py", line 14, in <module>
    from langchain.utilities.vertexai import (
File "/tmp/8db82251b0e58bc/antenv/lib/python3.11/site-packages/langchain/utilities/__init__.py", line 3, in <module>
    from langchain.utilities.apify import ApifyWrapper
File "/tmp/8db82251b0e58bc/antenv/lib/python3.11/site-packages/langchain/utilities/apify.py", line 5, in <module>
    from langchain.document_loaders import ApifyDatasetLoader
File "/tmp/8db82251b0e58bc/antenv/lib/python3.11/site-packages/langchain/document_loaders/__init__.py", line 43, in <module>
    from langchain.document_loaders.embaas import EmbaasBlobLoader, EmbaasLoader
File "/tmp/8db82251b0e58bc/antenv/lib/python3.11/site-packages/langchain/document_loaders/embaas.py", line 54, in <module>
    class BaseEmbaasLoader(BaseModel):
File "/tmp/8db82251b0e58bc/antenv/lib/python3.11/site-packages/pydantic/main.py", line 204, in __new__
    fields[ann_name] = ModelField.infer(
                       ^^^^^^^^^^^^^^^^^
File "/tmp/8db82251b0e58bc/antenv/lib/python3.11/site-packages/pydantic/fields.py", line 488, in infer
    return cls(
           ^^^^
File "/tmp/8db82251b0e58bc/antenv/lib/python3.11/site-packages/pydantic/fields.py", line 419, in __init__
    self.prepare()
File "/tmp/8db82251b0e58bc/antenv/lib/python3.11/site-packages/pydantic/fields.py", line 539, in prepare
    self.populate_validators()
File "/tmp/8db82251b0e58bc/antenv/lib/python3.11/site-packages/pydantic/fields.py", line 801, in populate_validators
    *(get_validators() if get_validators else list(find_validators(self.type_, self.model_config))),
                                              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/tmp/8db82251b0e58bc/antenv/lib/python3.11/site-packages/pydantic/validators.py", line 696, in find_validators
    yield make_typeddict_validator(type_, config)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/tmp/8db82251b0e58bc/antenv/lib/python3.11/site-packages/pydantic/validators.py", line 585, in make_typeddict_validator
    TypedDictModel = create_model_from_typeddict(
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/tmp/8db82251b0e58bc/antenv/lib/python3.11/site-packages/pydantic/annotated_types.py", line 35, in create_model_from_typeddict
    return create_model(typeddict_cls.__name__, **kwargs, **field_definitions)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/tmp/8db82251b0e58bc/antenv/lib/python3.11/site-packages/pydantic/main.py", line 972, in create_model
    return type(__model_name, __base__, namespace)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/tmp/8db82251b0e58bc/antenv/lib/python3.11/site-packages/pydantic/main.py", line 204, in __new__
    fields[ann_name] = ModelField.infer(
                       ^^^^^^^^^^^^^^^^^
File "/tmp/8db82251b0e58bc/antenv/lib/python3.11/site-packages/pydantic/fields.py", line 488, in infer
    return cls(
           ^^^^
File "/tmp/8db82251b0e58bc/antenv/lib/python3.11/site-packages/pydantic/fields.py", line 419, in __init__
    self.prepare()
File "/tmp/8db82251b0e58bc/antenv/lib/python3.11/site-packages/pydantic/fields.py", line 534, in prepare
    self._type_analysis()
File "/tmp/8db82251b0e58bc/antenv/lib/python3.11/site-packages/pydantic/fields.py", line 638, in _type_analysis
    elif issubclass(origin, Tuple):  # type: ignore
         ^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/python/3.11.3/lib/python3.11/typing.py", line 1570, in __subclasscheck__
    return issubclass(cls, self.__origin__)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

I am not copying the app script here, since the code works locally. I think it's something related to the Azure App Service plan environment or the venv setup in the YAML file.
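For readers hitting this for the first time: the bottom of the traceback is pydantic v1 calling issubclass() with a typing construct that is not a real class. The following is a minimal, hypothetical sketch of that failure class (it is not pydantic's exact call chain):

```python
from typing import Optional

# Sketch of the failure class: issubclass() requires its first argument
# to be an actual class. Typing constructs such as Optional[int] are not
# classes, so passing one as the first argument raises TypeError -- the
# same error pydantic's field analysis trips over in the traceback above.
def origin_is_tuple(origin: object) -> str:
    try:
        return "tuple" if issubclass(origin, tuple) else "not tuple"
    except TypeError as exc:
        return f"TypeError: {exc}"

print(origin_is_tuple(tuple))          # tuple
print(origin_is_tuple(Optional[int]))  # TypeError: ...
```

This is why version combinations matter here: which typing object pydantic ends up inspecting depends on the installed pydantic/typing_extensions/chromadb versions.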

Information

  • The official example notebooks/scripts
  • My own modified scripts

Related Components

  • LLMs/Chat Models
  • Embedding Models
  • Prompts / Prompt Templates / Prompt Selectors
  • Output Parsers
  • Document Loaders
  • Vector Stores / Retrievers
  • Memory
  • Agents / Agent Executors
  • Tools / Toolkits
  • Chains
  • Callbacks/Tracing
  • Async

Reproduction

n/a

Expected behavior

code should work :)

@dosubot dosubot bot added the 🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature label Jul 11, 2023
@dosubot

dosubot bot commented Jul 11, 2023

Answer generated by a 🤖

Answer

I found related issues that might be helpful. I did my best to summarize the solutions, but I recommend looking at the related issues yourself.

Closed Issues

import langchain with python<=3.9 fails

The suggested solution is:

Upgrade Pydantic to version 1.10.8 by running pip install pydantic==1.10.8. This solution has been confirmed by multiple users in the discussion.

Open Issues

TypeError: issubclass() arg 1 must be a class (importing AsyncCallbackHandler)

The suggested solution is: The error seems to be related to a circular import issue in the LangChain library. As a user, you can't directly fix this issue because it's related to the internal structure of the LangChain library. The library's maintainers need to resolve this issue. In the meantime, you can try to import the AsyncCallbackHandler directly from its module instead of from langchain.callbacks.base. This might bypass the circular import issue:

from langchain.callbacks.async_handler import AsyncCallbackHandler

If this doesn't work, you'll need to wait for a fix from the LangChain team. You should report this issue to them if you haven't already.

Error occurs when import langchain.agents

The suggested solution is: Upgrade Pydantic to version 1.10.8 by running the following command: pip install pydantic==1.10.8. This solution has been confirmed to solve the issue by multiple users.



@VIVelev

VIVelev commented Jul 11, 2023

I'm experiencing the same issue with streamlit, but upgrading pydantic to 1.10.8 solved it!

@baskaryan
Collaborator

Yeah, I believe there's a chromadb <> pydantic incompatibility in the latest Chroma release. Try pip install chromadb==0.3.26 pydantic==1.10.8

see #7522 as well

@azaiman1

I am experiencing the same issue as well, but pip install chromadb==0.3.26 pydantic==1.10.8 did not work.

@sitloboi2012

sitloboi2012 commented Jul 11, 2023

Here is my requirements.txt, and it works like a charm:

typing-inspect==0.8.0
typing_extensions==4.5.0
openai==0.27.8
langchain==0.0.230
chroma==0.2.0
chromadb==0.3.27
pydantic==1.10.8 (pydantic==1.10.9 is also fine)

I used the latest versions of langchain, chroma, and openai.

@Marouan-chak

Using pydantic==1.10.8 in my requirements.txt fixed the problem for me

@chancharikmitra

chancharikmitra commented Jul 11, 2023

I am experiencing the same issue as well but pip install chromadb==0.3.26 pydantic==1.10.8 did not work

A word of caution for anyone else reading this thread. Some of these fixes don't work in certain combinations. Start with just the pydantic==1.10.8 change on its own. I would recommend only trying other combinations of dependency changes after you try this one (based on my experience of doing the opposite).

Edit: As mentioned below by @Pierian-Data, what worked for me was to set the version for pydantic last.

@Pierian-Data

Also experiencing this, but specifying versions fixed it. Just be careful what order you install things in: pydantic needs to be last, since chromadb will overwrite it with the latest pydantic, causing the issue again.

@v4rm3t

v4rm3t commented Jul 11, 2023

@Pierian-Data you are right. So, what's the fix? Downgrade ChromaDB, if any versions use pydantic < 1.10.8?

@Pierian-Data

It's the versions @sitloboi2012 said:

typing-inspect==0.8.0
typing_extensions==4.5.0
openai==0.27.8
langchain==0.0.230
chroma==0.2.0
chromadb==0.3.27
pydantic==1.10.8

The caveat is that they need to be installed in that order. Some people visiting this thread say those versions don't work, but it's likely because they installed the correct versions, just not in the right order. If you install pydantic first and chromadb last, chromadb will overwrite pydantic with a newer version. Best of luck.
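An order-independent alternative (a sketch, assuming the pipeline uses plain pip, which supports constraints files via the -c flag) is to put the pins in a constraints file so every install in the job resolves against the same versions:

```
# constraints.txt -- hypothetical pins, using the combination reported
# to work in this thread (chromadb 0.3.26 + pydantic 1.10.8)
pydantic==1.10.8
chromadb==0.3.26
```

Then run pip install -r requirements.txt -c constraints.txt. Note this only helps if the pinned set is actually co-installable in one resolution; the sequential-install trick above works around package metadata that a single pip resolution would reject.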

@tnunamak

Any suggestions for how to deal with this for poetry-managed dependencies?

@starpause

Any suggestions for how to deal with this for poetry-managed dependencies?

I don't think it's possible with poetry or pipenv because those tools check for conflicting dependencies. OTOH pip lets the conflict slip through and... it happens to work 😅
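For what it's worth, the pins would be expressed in poetry roughly like this (a hypothetical sketch; whether poetry accepts it depends entirely on chromadb 0.3.26's published pydantic constraint, and if that metadata still pins an older pydantic, poetry will refuse the combination, which is exactly the point above):

```
# pyproject.toml (hypothetical sketch)
[tool.poetry.dependencies]
python = ">=3.9,<3.12"
langchain = "0.0.230"
chromadb = "0.3.26"
pydantic = "1.10.8"
```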

@levalencia
Contributor Author

When I try your versions:
typing-inspect==0.8.0
typing_extensions==4.5.0
openai==0.27.8
langchain==0.0.230
streamlit
python-dotenv
pinecone-client
streamlit-chat
chroma==0.2.0
chromadb==0.3.27
tiktoken
pymssql
pydantic==1.10.8

I get an error:

ERROR: Cannot install -r requirements.txt (line 10), -r requirements.txt (line 4) and pydantic==1.10.8 because these package versions have conflicting dependencies.

The conflict is caused by:
    The user requested pydantic==1.10.8
    langchain 0.0.230 depends on pydantic<2 and >=1
    chromadb 0.3.27 depends on pydantic==1.9

@levalencia
Contributor Author

The comment from @baskaryan worked:

pip install chromadb==0.3.26 pydantic==1.10.8
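Since the breakage comes from which versions actually end up installed (pip or Oryx's build may re-resolve and overwrite a pin), a small hedged sketch like the following, run at app startup, makes the effective versions visible in the deployment logs:

```python
from importlib import metadata

# Hypothetical startup guard: print the versions actually installed in
# the deployed environment, so an overwritten pydantic shows up in the
# logs instead of surfacing later as an import-time TypeError.
for pkg in ("pydantic", "chromadb", "langchain"):
    try:
        print(pkg, metadata.version(pkg))
    except metadata.PackageNotFoundError:
        print(pkg, "not installed")
```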

@saxenarajat

With chromadb 0.3.26, I am getting the following error. I am using Chroma in client-server mode.

AttributeError: 'Chroma' object has no attribute '_client_settings'

@jbanerje

pydantic==1.10.8 worked. Thank you!

@jeffchuber
Contributor

Hey everyone, Jeff from Chroma here. The fastapi bump together with the pydantic bump caused a bunch of type errors in Chroma. We made an issue to fix this up: chroma-core/chroma#785

@braduck

braduck commented Jul 12, 2023

pydantic==1.10.8 didn't work for me, and my problem is local. If I put just the line below in a Jupyter Notebook (Visual Studio Code extension), I get the error:

from langchain.embeddings.openai import OpenAIEmbeddings

@kashyap-aditya

kashyap-aditya commented Jul 12, 2023

I have found that

!pip install typing-inspect==0.8.0
!pip install typing_extensions==4.5.0
!pip install openai==0.27.8
!pip install langchain==0.0.230
!pip install chroma==0.2.0
!pip install chromadb==0.3.26
!pip install pydantic==1.10.8

will work for me sometimes.

If it doesn't work, I add a line at the end:

!pip install --upgrade langchain

and then it works again, even if I comment out this line on the next run.

@jeffchuber
Contributor

Chroma is updating our deps in chroma-core/chroma#799. It will land as soon as tests pass, and then we'll cut a new release.

@braduck

braduck commented Jul 12, 2023

(quoting @kashyap-aditya's install sequence above)

This worked for me. Thanks!

@jeffchuber
Contributor

(jeff from chroma)

FYI: this should all be cleaned up now with the 0.3.29 release of Chroma. Sorry about the conflict, everyone!

@Ritaprava95

This occurred when I was using Python 3.8; upgrading to Python 3.9 solved the issue for me.
