
LLM Chain with Hugging Face #8721

Closed
the-trading-ai opened this issue Feb 22, 2024 · 22 comments

@the-trading-ai

Bug Description

When I connect a Chat with the LLM Chain node and use Hugging Face, I get the error message: message.toJSON is not a function

To Reproduce

[screenshot]

Expected behavior

The message from the model

Operating System

hosted

n8n Version

Latest 1.29

Node.js Version

hosted

Database

SQLite (default)

Execution mode

main (default)

@Joffcom
Member

Joffcom commented Feb 22, 2024

Hey @the-trading-ai

If you open the LLM Chain node, is there a larger error in there? Can you also share the workflow JSON so we can use it to reproduce the issue?

@the-trading-ai
Author

Here's the process (someone told me this problem appeared yesterday in the LangChain community with the Hugging Face library):
[screenshot]

[screenshot]

TypeError: message.toJSON is not a function
at /usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/dist/utils/logWrapper.js:168:73
at Array.map (<anonymous>)
at Proxy.connectionType (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/dist/utils/logWrapper.js:168:48)
at processTicksAndRejections (node:internal/process/task_queues:95:5)
at Proxy._generateUncached (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/node_modules/langchain/node_modules/@langchain/core/dist/language_models/llms.cjs:138:22)
at LLMChain._call (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/node_modules/langchain/dist/chains/llm_chain.cjs:157:37)
at LLMChain.call (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/node_modules/langchain/dist/chains/base.cjs:120:28)
at createSimpleLLMChain (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/dist/nodes/chains/ChainLLM/ChainLlm.node.js:84:23)
at getChain (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/dist/nodes/chains/ChainLLM/ChainLlm.node.js:93:16)
at Object.execute (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/dist/nodes/chains/ChainLLM/ChainLlm.node.js:360:31)
at Workflow.runNode (/usr/local/lib/node_modules/n8n/node_modules/n8n-workflow/dist/Workflow.js:730:19)
at /usr/local/lib/node_modules/n8n/node_modules/n8n-core/dist/WorkflowExecute.js:662:53
at /usr/local/lib/node_modules/n8n/node_modules/n8n-core/dist/WorkflowExecute.js:1064:20

@the-trading-ai
Author

[screenshot]

OK, it looks like this is a general issue, probably from the latest version. I updated my version to the latest one yesterday.

@groundbreakersonline

Same problem here when using any type of Summarization Chain (1.29.1):

TypeError: message.toJSON is not a function
at /usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/dist/utils/logWrapper.js:168:73
at Array.map (<anonymous>)
at Proxy.connectionType (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/dist/utils/logWrapper.js:168:48)
at processTicksAndRejections (node:internal/process/task_queues:95:5)
at Proxy._generateUncached (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/node_modules/langchain/node_modules/@langchain/core/dist/language_models/llms.cjs:138:22)
at LLMChain._call (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/node_modules/langchain/dist/chains/llm_chain.cjs:157:37)
at LLMChain.call (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/node_modules/langchain/dist/chains/base.cjs:120:28)
at StuffDocumentsChain._call (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/node_modules/langchain/dist/chains/combine_docs_chain.cjs:62:24)
at StuffDocumentsChain.call (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/node_modules/langchain/dist/chains/base.cjs:120:28)
at MapReduceDocumentsChain._call (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/node_modules/langchain/dist/chains/combine_docs_chain.cjs:210:24)
at MapReduceDocumentsChain.call (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/node_modules/langchain/dist/chains/base.cjs:120:28)
at Object.execute (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/dist/nodes/chains/ChainSummarization/V2/ChainSummarizationV2.node.js:337:34)
at Workflow.runNode (/usr/local/lib/node_modules/n8n/node_modules/n8n-workflow/dist/Workflow.js:730:19)
at /usr/local/lib/node_modules/n8n/node_modules/n8n-core/dist/WorkflowExecute.js:662:53
at /usr/local/lib/node_modules/n8n/node_modules/n8n-core/dist/WorkflowExecute.js:1064:20

@dkindlund

I'm also seeing the same error on n8n@1.29.1:

TypeError: message.toJSON is not a function
    at /usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/dist/utils/logWrapper.js:168:73
    at Array.map (<anonymous>)
    at Proxy.connectionType (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/dist/utils/logWrapper.js:168:48)
    at processTicksAndRejections (node:internal/process/task_queues:95:5)
    at Proxy._generateUncached (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/node_modules/langchain/node_modules/@langchain/core/dist/language_models/llms.cjs:138:22)
    at LLMChain._call (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/node_modules/langchain/dist/chains/llm_chain.cjs:157:37)
    at LLMChain.call (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/node_modules/langchain/dist/chains/base.cjs:120:28)
    at createSimpleLLMChain (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/dist/nodes/chains/ChainLLM/ChainLlm.node.js:84:23)
    at getChain (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/dist/nodes/chains/ChainLLM/ChainLlm.node.js:93:16)
    at Object.execute (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/dist/nodes/chains/ChainLLM/ChainLlm.node.js:360:31)

@dkindlund

^ This issue affects all LLM functions -- including OpenAI (not just Hugging Face).

@dkindlund

dkindlund commented Feb 22, 2024

Pretty sure this is the line that the error is referring to:

: messages.map((message) => message.toJSON()),
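
Here's a standalone sketch of what I think is going on (hypothetical types and names, not the actual n8n source). My guess is that the wrapper assumes every entry in `messages` is a chat message object with a `toJSON()` method, but a non-chat model like the Hugging Face node receives plain prompt strings, which have no such method:

```ts
// Standalone illustration of the suspected failure mode (hypothetical, not n8n source).
type ChatMessageLike = { toJSON: () => unknown };

const messages: Array<ChatMessageLike | string> = [
  { toJSON: () => ({ role: 'user', content: 'Hi' }) }, // chat-model case: has toJSON()
  'a plain prompt string from a non-chat LLM',         // LLM case: no toJSON()
];

// The mapping from the stack trace would throw on the string entry:
// messages.map((message) => (message as ChatMessageLike).toJSON());

// A defensive variant that tolerates both shapes:
const serialized = messages.map((message) =>
  typeof (message as ChatMessageLike).toJSON === 'function'
    ? (message as ChatMessageLike).toJSON()
    : message,
);
console.log(serialized);
```

If that's the cause, the fix is presumably on the logging side rather than in any of the individual model nodes.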

@dkindlund

This code was introduced by this commit:
7501ad8

As part of this PR:
#8526

I'm going to upgrade to n8n@1.30.0 and see if that fixes the issue.

@Joffcom
Member

Joffcom commented Feb 22, 2024

We will look into this in the morning and potentially put out a new release. For now, you can go back to a previous release and you should be good to go.

@dkindlund

Getting a different error on n8n@1.30.0 - FYI:

Error: Could not get parameter
    at getNodeParameter (/usr/local/lib/node_modules/n8n/node_modules/n8n-core/dist/NodeExecuteFunctions.js:1513:15)
    at Object.getNodeParameter (/usr/local/lib/node_modules/n8n/node_modules/n8n-core/dist/NodeExecuteFunctions.js:2189:24)
    at getPromptInputByType (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/dist/utils/helpers.js:29:24)
    at Object.execute (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/dist/nodes/chains/ChainLLM/ChainLlm.node.js:396:61)
    at processTicksAndRejections (node:internal/process/task_queues:95:5)
    at Workflow.runNode (/usr/local/lib/node_modules/n8n/node_modules/n8n-workflow/dist/Workflow.js:730:19)
    at /usr/local/lib/node_modules/n8n/node_modules/n8n-core/dist/WorkflowExecute.js:662:53
    at /usr/local/lib/node_modules/n8n/node_modules/n8n-core/dist/WorkflowExecute.js:1064:20

@dkindlund

Ironically, the Conversational Agent node works in n8n@1.30.0 -- it's just the Basic LLM Chain node that appears to be broken.

@dkindlund

Confirmed that this issue is not present in n8n@1.27.3 -- it is present in all later versions.

@janober
Member

janober commented Feb 23, 2024

Fix got released with n8n@1.30.1

@Joffcom
Member

Joffcom commented Feb 23, 2024

Good news: this should now be resolved. I am going to mark this as closed; if you are still seeing this issue, let me know.

@Joffcom Joffcom closed this as completed Feb 23, 2024
@dkindlund

@Joffcom, no dice. I'm still seeing these errors using the Basic LLM Chain node in n8n@1.30.1. CC: @janober

Error: Could not get parameter
    at getNodeParameter (/usr/local/lib/node_modules/n8n/node_modules/n8n-core/dist/NodeExecuteFunctions.js:1514:15)
    at Object.getNodeParameter (/usr/local/lib/node_modules/n8n/node_modules/n8n-core/dist/NodeExecuteFunctions.js:2190:24)
    at getPromptInputByType (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/dist/utils/helpers.js:29:24)
    at Object.execute (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/dist/nodes/chains/ChainLLM/ChainLlm.node.js:396:61)
    at processTicksAndRejections (node:internal/process/task_queues:95:5)
    at Workflow.runNode (/usr/local/lib/node_modules/n8n/node_modules/n8n-workflow/dist/Workflow.js:730:19)
    at /usr/local/lib/node_modules/n8n/node_modules/n8n-core/dist/WorkflowExecute.js:660:53
    at /usr/local/lib/node_modules/n8n/node_modules/n8n-core/dist/WorkflowExecute.js:1062:20

@dkindlund

[screenshot]

[screenshot]

@dkindlund

[screenshot]

@dkindlund

^ This workflow works fine on n8n@1.27.3

@dkindlund

Hey @Joffcom and @janober , I think I sort of understand the issue.

This error appears only for legacy workflows and legacy nodes. If I create a brand-new workflow with identical nodes, the error doesn't occur.

This tells me that there's some backwards compatibility issue that's breaking older workflows/nodes.

Steps to reproduce:

  1. Create a simple Basic LLM Chain workflow in n8n@1.27.3
  2. Upgrade to n8n@1.30.1
  3. Try to run your simple workflow -- see error

As it stands right now, if there's no way to "auto-upgrade" legacy workflows/nodes to support new 1.30.1+ features, then all your existing users will hit this same pain when they upgrade. I'm not sure how to solve this, but FYI. A sketch of what I think is happening is below.
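
To illustrate what I suspect is happening (a guess with hypothetical names, not n8n's actual source): the parameter lookup throws when the requested parameter doesn't exist on the node and no fallback value is supplied, and a workflow saved with an older node version simply doesn't have whatever new parameter the updated Basic LLM Chain code now asks for:

```ts
// Minimal standalone simulation of the suspected backwards-compatibility break.
// All names here are hypothetical; this is not n8n source code.
type NodeParameters = Record<string, unknown>;

function getNodeParameter(
  parameters: NodeParameters,
  name: string,
  fallbackValue?: unknown,
): unknown {
  if (!(name in parameters)) {
    if (fallbackValue !== undefined) return fallbackValue;
    throw new Error('Could not get parameter'); // matches the error in the trace
  }
  return parameters[name];
}

// A workflow saved under an older node version has no `promptType` parameter:
const legacyNodeParameters: NodeParameters = { prompt: 'Summarize this text' };

console.log(getNodeParameter(legacyNodeParameters, 'promptType', 'auto')); // -> 'auto'

try {
  getNodeParameter(legacyNodeParameters, 'promptType'); // no fallback supplied
} catch (error) {
  console.error(error); // Error: Could not get parameter
}
```

If that's roughly right, either a fallback/default in the node code or a migration for workflows saved under the older node version would avoid the breakage.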

Recreating the simple Basic LLM Chain workflow has no issues:
[screenshot]

@dkindlund

Oh, interesting... the newer Basic LLM Chain node in n8n@1.30.1 no longer has the optional Output Parser connector -- I don't know if that's intended or a possible secondary regression, @Joffcom. Compare the past couple of screenshots and you'll see what I mean.

@dkindlund

Yup, confirmed. In n8n@1.30.1+, I had to manually regenerate the Basic LLM Chain node to get it to work. Here's the side-by-side comparison:

[screenshot]

@Joffcom
Member

Joffcom commented Feb 23, 2024

@dkindlund the good news is that this sounds like a different issue, so I believe the original issue here is solved and there is another one to look into.

You also don't need to worry about tagging us; if we are commenting, we will get notifications on new posts. We are also mainly around during Berlin office hours 🙂

We will look into this new issue on Monday morning. It may be worth opening a new issue so it doesn't get confused with the original issue here.
