
Missing model information in llm span of Langfuse #9029

Closed
5 tasks done
corelchen opened this issue Oct 6, 2024 · 2 comments
Labels
🐞 bug Something isn't working

Comments

@corelchen
Contributor

Self Checks

  • This is only for bug reports; if you would like to ask a question, please head to Discussions.
  • I have searched for existing issues, including closed ones.
  • I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
  • [FOR CHINESE USERS] Please be sure to submit the issue in English, otherwise it will be closed. Thank you! :)
  • Please do not modify this template :) and fill in all the required fields.

Dify version

0.9.1

Cloud or Self Hosted

Self Hosted (Docker)

Steps to reproduce

  1. Configure Langfuse in Dify.
  2. Run the chat workflow app.
  3. View the monitoring data in Langfuse; the model information is missing from the LLM span.

✔️ Expected Behavior

No response

❌ Actual Behavior

No response

dosubot bot added the 🐞 bug (Something isn't working) label Oct 6, 2024

dosubot bot commented Oct 6, 2024

@corelchen Hello there! I'm here to help you with any bugs, questions, or contributions. Let me know how I can assist you while we wait for a human maintainer.

I found a similar unsolved discussion titled "add self deploy llm model, add error" on the Dify GitHub discussions page. You might find it helpful to check it out for any related insights or updates [1].

Regarding the missing model information in the Langfuse LLM span: the available documentation does not mention any configuration setting that must be enabled for model information to be included in the span data sent to Langfuse. However, Dify does provide logging and debugging information that can help trace how model information flows to Langfuse, via classes such as LangfuseTrace, LangfuseSpan, and LangfuseGeneration, which carry fields for debugging and tracing purposes [2][3][4].
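
For orientation, a generation entity that explicitly carries model information could look like the minimal sketch below (Python/Pydantic). The class and field names here are illustrative assumptions, not Dify's actual LangfuseGeneration definition.

```python
# Minimal sketch of a generation entity that carries model information.
# Class and field names are illustrative assumptions, not Dify's actual
# LangfuseGeneration definition.
from typing import Any, Optional

from pydantic import BaseModel


class GenerationSpanData(BaseModel):
    """Data emitted to Langfuse for a single LLM call."""

    name: str
    trace_id: str
    model: Optional[str] = None    # e.g. "gpt-4o" -- the value reported as missing
    metadata: dict[str, Any] = {}  # could carry model_provider, token usage, etc.
    input: Optional[Any] = None
    output: Optional[Any] = None
```

If the `model` field (or its equivalent) is never populated when the span is built, Langfuse has nothing to display, which would match the behavior described in this report.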

If you haven't already, you might want to check the logging and debugging information to see if there are any clues about why the model information is missing.

To continue talking to Dosu, mention @dosu.

@corelchen
Contributor Author

I can fix this issue. I want to use the fields model_provider and model_name; is this OK?
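
A minimal sketch of what such a fix could look like, assuming the model name and provider are available in the LLM node's process data and that the low-level Langfuse Python SDK's generation() call is used to emit the span; apart from model_name and model_provider, all names below are placeholders rather than Dify's actual code.

```python
# Hypothetical sketch of the proposed fix: read model_name / model_provider
# from the LLM node's process data and attach them to the Langfuse generation.
# Everything except the model_name / model_provider keys is a placeholder.
from typing import Any

from langfuse import Langfuse  # Langfuse low-level Python SDK


def add_llm_generation(
    client: Langfuse,
    trace_id: str,
    node_title: str,
    process_data: dict[str, Any],
    outputs: Any,
) -> None:
    """Emit one LLM generation to Langfuse, including the model info
    that this report says is currently missing from the span."""
    client.generation(
        name=node_title,
        trace_id=trace_id,
        model=process_data.get("model_name"),  # e.g. "gpt-4o-mini"
        metadata={"model_provider": process_data.get("model_provider")},  # e.g. "openai"
        input=process_data.get("prompts"),
        output=outputs,
    )
```

Passing the model via the generation's `model` argument lets Langfuse show it directly on the LLM span, while the provider can ride along in metadata.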

ZhouhaoJiang pushed a commit that referenced this issue Oct 7, 2024
cuiks pushed a commit to cuiks/dify that referenced this issue Oct 8, 2024
lau-td pushed a commit to heydevs-io/dify that referenced this issue Oct 23, 2024
idonotknow pushed a commit to AceDataCloud/Dify that referenced this issue Nov 16, 2024