token in handleLLMNewToken callback is empty when OpenAI responds with a function call #2762
Comments
🤖 Hello, Thank you for reaching out with your issue. I understand that you're having trouble with the `handleLLMNewToken` callback receiving empty tokens when the model returns a function call. I found a couple of similar issues in the langchainjs repository that might be relevant.
You might want to check those issues and see if the solutions provided can help resolve your problem. As for the image you've shared, I'm sorry but I'm unable to view images. If there's any additional information in the image that you think might be relevant, please share it in text form. If you're still having trouble, could you please provide more information about the OpenAI function call that's supposed to return the token? This might help in identifying the issue. Best, Sources
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
@dosu-bot, here are the logs from llm:ChatOpenAI:
web:dev: [llm/start] [1:llm:ChatOpenAI] Entering LLM run with input: {
web:dev: "messages": [
web:dev: [
web:dev: {
web:dev: "lc": 1,
web:dev: "type": "constructor",
web:dev: "id": [
web:dev: "langchain",
web:dev: "schema",
web:dev: "HumanMessage"
web:dev: ],
web:dev: "kwargs": {
web:dev: "content": "\nUse the following pieces of context to answer the question at the end.\nKeep your answer in first person not third person. Don't say \"I don't know\", just say content of the Predict Message.\n\n.\nModern \nscience \nsays: \nThe sun is the past, the earth \nis the present, \nthe\nmoon \nis the future. \nFrom \nan incandescent \nmass we have originated, \nand into\na frozen\n mass \nwe shall turn. Merciless \nis the law of nature, \nand rapidly \nand\nirresistibly \nwe are drawn \nto our doom. \nLord \nKelvin, \nin his profound\nmeditations, \nallows \nus only a short \nspan of life, something \nlike six million\nyears, \nafter which \ntime the sun’s bright \nlight will have ceased \nto shine, \nand\nits life-giving \nheat will have ebbed \naway, and our own earth \nwill be a lump\nof ice, hurrying \non through \nthe eternal \nnight. \nBut do not let us despair\n.\nThere \nwill still be left upon \nit a glimmering \nspark \nof life, and there \nwill be a\nchance \nto kindle\n a new fire on some distant \nstar. This wonderful \npossibility\nseems, \nindeed, \nto exist, \njudging \nfrom \nProfessor \nDewar\n’s beautiful\nexperiments \nwith liquid \nair, which show \nthat germs \nof organic life are not\ndestroyed \nby cold,\n\nSouth \nAfrica\n) no lightning \nstrokes \noccurred \nafter \nthe\npointed \nrods were \ninstalled, \naltho \nthe storms \nwere \nas frequent \nas before.\nExperience \nhas shown \nthat just the opposite \nis true. A modern \ncity like New\nYork, presenting \ninnumerable \nsharp points \nand projections \nin good contact\n\nChapter Eight\n\nChapter Eight // i.e the chunks of text retrieved deemed to be moset semantically\n // relevant to our question\nQuestion: What is the current weather in Ho Chi Minh city? // i.e our actualy question\nHelpful Answer:\n",
web:dev: "additional_kwargs": {}
web:dev: }
web:dev: }
web:dev: ]
web:dev: ]
web:dev: }
web:dev: handleLLMNewToken
web:dev: handleLLMNewToken
web:dev: handleLLMNewToken
web:dev: handleLLMNewToken
web:dev: handleLLMNewToken
web:dev: handleLLMNewToken
web:dev: handleLLMNewToken
web:dev: handleLLMNewToken
web:dev: handleLLMNewToken
web:dev: handleLLMNewToken
web:dev: handleLLMNewToken
web:dev: handleLLMNewToken
web:dev: handleLLMNewToken
web:dev: handleLLMNewToken
web:dev: [llm/end] [1:llm:ChatOpenAI] [2.32s] Exiting LLM run with output: {
web:dev: "generations": [
web:dev: [
web:dev: {
web:dev: "text": "",
web:dev: "generationInfo": {
web:dev: "prompt": 0,
web:dev: "completion": 0
web:dev: },
web:dev: "message": {
web:dev: "lc": 1,
web:dev: "type": "constructor",
web:dev: "id": [
web:dev: "langchain",
web:dev: "schema",
web:dev: "AIMessageChunk"
web:dev: ],
web:dev: "kwargs": {
web:dev: "content": "",
web:dev: "additional_kwargs": {
web:dev: "function_call": {
web:dev: "name": "getWeather",
web:dev: "arguments": "{\n \"location\": \"Ho Chi Minh city\"\n}"
web:dev: }
web:dev: }
web:dev: }
web:dev: }
web:dev: }
web:dev: ]
web:dev: ]
web:dev: }
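The log above is the crux of the report: fourteen `handleLLMNewToken` calls, each with an empty token, while the function call surfaces only in the final `additional_kwargs`. The simulation below (hand-written deltas, not the real OpenAI SDK) is a sketch of why this happens: while the model is producing a function call, each streamed delta carries its payload in `function_call.arguments` and `content` stays null, so the `token` string handed to the callback is always empty.

```typescript
// Hypothetical shape of the streamed deltas during a function call —
// an illustration, not the actual OpenAI SDK types.
type Delta = {
  content: string | null;
  function_call?: { name?: string; arguments?: string };
};

const deltas: Delta[] = [
  { content: null, function_call: { name: "getWeather", arguments: "" } },
  { content: null, function_call: { arguments: '{\n  "location": ' } },
  { content: null, function_call: { arguments: '"Ho Chi Minh city"\n}' } },
];

const tokens: string[] = [];
let args = "";
for (const d of deltas) {
  // What the callback would see as `token` — empty for every chunk here:
  tokens.push(d.content ?? "");
  // The real payload accumulates in function_call.arguments instead:
  if (d.function_call?.arguments) args += d.function_call.arguments;
}

console.log(JSON.stringify(tokens)); // ["","",""]
console.log(args);
```

Accumulating `function_call.arguments` across chunks like this is how the complete arguments JSON (the one visible in the `[llm/end]` output) is eventually assembled.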
🤖 Hello, Based on the information you've provided and the context from similar issues, the problem you're experiencing appears to be related to how streaming is handled in the LangChainJS framework: when the model answers with a function call, the streamed message content is empty, so `handleLLMNewToken` receives empty tokens while the call itself arrives in `additional_kwargs.function_call`, as your [llm/end] output shows. If you want to avoid writing newline characters to the standard output, you can add a condition to check whether the token is a newline character before writing it. Here's how you can do it: {
handleLLMNewToken(token, idx, runId, parentRunId, tags, fields) {
if (token !== '\n') {
console.log('handleLLMNewToken', token);
}
},
handleAgentAction(action, runId, parentRunId, tags) {
console.log('handleAgentAction', action);
},
}
With this modification, newline characters will not be written to the standard output. I hope this helps! Let me know if you have any other questions. Sources
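The snippet above filters newline tokens but doesn't recover the function call itself. More recent langchainjs releases pass `handleLLMNewToken` a trailing `fields` argument whose `chunk` property carries the raw generation chunk; the sketch below illustrates reading the function-call arguments from it. The `ChunkFields` type and the simulated invocation are assumptions made for the sketch — check the callback types shipped with your installed version.

```typescript
// Hand-rolled stand-in for the shape of the callback's `fields` argument
// (an assumption for illustration, not the library's exported type):
type ChunkFields = {
  chunk?: {
    message?: {
      additional_kwargs?: {
        function_call?: { name?: string; arguments?: string };
      };
    };
  };
};

let functionArgs = "";
let plainText = "";

const handlers = {
  handleLLMNewToken(
    token: string,
    _idx: unknown,
    _runId: string,
    _parentRunId?: string,
    _tags?: string[],
    fields?: ChunkFields
  ) {
    const fc = fields?.chunk?.message?.additional_kwargs?.function_call;
    if (fc?.arguments) {
      // Function-call chunk: `token` is "", the payload is in fc.arguments
      functionArgs += fc.arguments;
    } else if (token) {
      // Ordinary content chunk: the token carries visible text
      plainText += token;
    }
  },
};

// Simulated invocations, for illustration only:
handlers.handleLLMNewToken("", 0, "run-1", undefined, [], {
  chunk: {
    message: {
      additional_kwargs: {
        function_call: { arguments: '{"location":"Ho Chi Minh city"}' },
      },
    },
  },
});
handlers.handleLLMNewToken("Hi", 1, "run-1");

console.log(functionArgs, "|", plainText);
```

Branching on the chunk rather than on the token string lets a single handler stream visible text and capture function-call arguments at the same time.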
Hi, @vangetpersona, I'm helping the langchainjs team manage their backlog and am marking this issue as stale. From what I understand, the issue you raised concerns empty tokens in the `handleLLMNewToken` callback when OpenAI responds with a function call. Could you please confirm if this issue is still relevant to the latest version of the langchainjs repository? If it is, please let the langchainjs team know by commenting on the issue. Otherwise, feel free to close the issue yourself, or it will be automatically closed in 7 days. Thank you for your understanding and contribution to langchainjs!
When I try to trigger a function call, OpenAI responds with a function call in
additional_kwargs
but the stream response is empty. I don't know why or what was wrong in my code. Thank you for reading my issue. If you have any guesses, please let me know.
Thank you.
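For reference, once the run finishes, the function call from the `[llm/end]` output can be pulled out of `additional_kwargs` and its arguments parsed as JSON. A sketch — the `output` object below just mirrors the logged `kwargs`, it is not a langchainjs API:

```typescript
// Mirrors the AIMessageChunk kwargs from the [llm/end] log: empty content,
// with the actual call tucked into additional_kwargs.
const output = {
  content: "",
  additional_kwargs: {
    function_call: {
      name: "getWeather",
      arguments: '{\n  "location": "Ho Chi Minh city"\n}',
    },
  },
};

const fc = output.additional_kwargs.function_call;
// The arguments arrive as a JSON *string* and must be parsed before use:
const parsedArgs = JSON.parse(fc.arguments);
console.log(fc.name, parsedArgs.location); // getWeather Ho Chi Minh city
```

So an empty `content` is not itself an error: when the model chooses to call a function, the useful output lives entirely in `additional_kwargs.function_call`.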