
DOC: The guide on how to "Log custom LLM traces" currently doesn't explain how to log tool calling and tool results. #840

Open
thiagotps opened this issue Jul 2, 2024 · 1 comment

Comments

@thiagotps

Issue with current documentation:

The current documentation on Log custom LLM traces doesn't describe the expected format when the model makes a tool call or when the client side returns a tool result.

Idea or request for content:

An example of the format LangSmith expects to receive when the model calls a tool, and when the client side passes a tool result back to the model.

@hinthornw
Collaborator

Noted - it should be the same format as what OpenAI returns, but we should improve the docs there.
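Based on that comment, here is a minimal sketch of what the messages might look like, assuming LangSmith accepts the same chat-message shape the OpenAI Chat Completions API returns. The tool name `get_weather`, the call id, and the arguments are all made up for illustration, and this has not been verified against the LangSmith docs:

```python
import json

# Hypothetical example (not from the LangSmith docs): OpenAI-style chat
# messages covering both directions of tool use.

# 1. The model responds with a tool call instead of plain text content.
assistant_tool_call = {
    "role": "assistant",
    "content": None,
    "tool_calls": [
        {
            "id": "call_abc123",           # made-up call id
            "type": "function",
            "function": {
                "name": "get_weather",     # hypothetical tool
                "arguments": json.dumps({"city": "Lisbon"}),
            },
        }
    ],
}

# 2. The client runs the tool and passes the result back to the model,
#    echoing tool_call_id so the result is matched to the originating call.
tool_result = {
    "role": "tool",
    "tool_call_id": "call_abc123",
    "content": json.dumps({"temperature_c": 21}),
}

messages = [
    {"role": "user", "content": "What's the weather in Lisbon?"},
    assistant_tool_call,
    tool_result,
]
print(json.dumps(messages, indent=2))
```

If the maintainer's comment holds, a list like this could presumably be passed as the inputs/outputs of a run traced with `@traceable(run_type="llm")` from the `langsmith` SDK, but the docs should confirm that.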
