Replies: 1 comment
- Thanks for sharing; this should be captured. Could you open an issue with code to reproduce it and a public link to a trace of this?
Describe the feature or potential improvement
Description
For debugging agentic systems, it would be really helpful to capture the `tools` field of the LLM call, at least for OpenAI. The LLM response already includes the chosen tool (if any).
Example
My outgoing OpenAI request looks like this (through LlamaIndex):
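The original request snippet was attached as a screenshot and is not preserved here. As an illustration of what such a payload contains, an OpenAI chat completions request with tool definitions generally has the shape sketched below; the tool name, schema, and model are hypothetical, not taken from the original report:

```python
# Sketch of an OpenAI chat completions request payload that includes
# tool definitions. All specifics (model, tool name, schema) are
# illustrative assumptions, not the reporter's actual request.
request_payload = {
    "model": "gpt-4o",
    "messages": [
        {"role": "user", "content": "What is the weather in Berlin?"},
    ],
    # This is the "tools" field the feature request asks Langfuse to
    # capture alongside "model" and "messages".
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",  # hypothetical tool name
                "description": "Look up the current weather for a city.",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
}

print(sorted(request_payload))  # top-level keys of the outgoing request
```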
Langfuse only captures the following (Tracing -> Generations):
Langfuse correctly captures the assistant's response including the choice of tool:
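The response screenshot is likewise not preserved. For context, the assistant message in an OpenAI chat completions response records the chosen tool roughly as sketched below (the tool name, id, and arguments are hypothetical); this is the part the reporter says Langfuse already captures:

```python
# Sketch of the assistant message in an OpenAI chat completions
# response when the model chooses a tool. The id, tool name, and
# arguments here are illustrative assumptions.
response_message = {
    "role": "assistant",
    "content": None,
    "tool_calls": [
        {
            "id": "call_123",  # illustrative call id
            "type": "function",
            "function": {
                "name": "get_weather",
                "arguments": '{"city": "Berlin"}',
            },
        }
    ],
}

# The chosen tool is recoverable from the response alone...
chosen_tool = response_message["tool_calls"][0]["function"]["name"]
print(chosen_tool)
# ...but without the request's "tools" field in the trace, the full
# set of tools the model chose from is not visible for debugging.
```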
Might there already be a config parameter that enables this? If not, is there any chance this can be included and captured through `LlamaIndexCallbackHandler`?
Additional information
No response