Observability Feature Request: Metrics, Data Monitoring, and Graph Visualization #5767
Comments
Have you tried graphsignal?
Hey @OlajideOgun, thanks for the suggestion! Graphsignal integrates with LangChain from the outside, whereas I wanted a solution built into LangChain itself, in the form of callbacks, to make it more usable (sketched below). So, we finally managed to create one.
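For anyone curious what "integration in the form of callbacks" could look like, here is a minimal, hypothetical sketch built on LangChain's `BaseCallbackHandler`. It is not the actual implementation from the PR, and a real handler would key timings by `run_id` rather than a single instance attribute; the class name and printed metric format are purely illustrative.

```python
import time
from typing import Any, Dict, List, Optional

from langchain.callbacks.base import BaseCallbackHandler
from langchain.schema import LLMResult


class ObservabilityHandler(BaseCallbackHandler):
    """Illustrative sketch: record latency and token usage for each LLM call."""

    def __init__(self) -> None:
        self._started_at: Optional[float] = None

    def on_llm_start(
        self, serialized: Dict[str, Any], prompts: List[str], **kwargs: Any
    ) -> None:
        # Remember when the call began so latency can be computed on completion.
        self._started_at = time.time()

    def on_llm_end(self, response: LLMResult, **kwargs: Any) -> None:
        latency = time.time() - (self._started_at or time.time())
        # OpenAI-style models report token counts in response.llm_output.
        usage = (response.llm_output or {}).get("token_usage", {})
        print(
            f"LLM call done: latency={latency:.2f}s, "
            f"total_tokens={usage.get('total_tokens', 'n/a')}"
        )
```

Such a handler could then be passed wherever LangChain accepts callbacks, e.g. `ChatOpenAI(callbacks=[ObservabilityHandler()])` or `chain.run(query, callbacks=[ObservabilityHandler()])`.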
Just checked it out, this is very cool!
Thanks! If you plan to use it, please do share the feedback/improvements :)
Closing the issue, as #6218 is merged. |
Feature request
I would like to propose the addition of an observability feature to LangChain, enabling developers to monitor and analyze their applications more effectively. This feature aims to provide metrics, data tracking, and graph visualization capabilities to enhance observability.
Key aspects of the proposed feature include:

- Metrics: capture latency, token usage, and cost for each LLM request.
- Data monitoring: log prompts, model responses, and chain/agent inputs and outputs for later analysis.
- Graph visualization: visualize the execution flow of chains and agents to make debugging easier.
This feature request aims to improve the development and debugging experience by providing developers with better insights into the performance, behavior, and cost implications of their LangChain applications.
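To make the graph-visualization aspect concrete, here is a rough, hypothetical sketch of a callback handler that records parent/child relationships between runs, which a visualizer could later render (for example via Graphviz). It assumes the callback manager passes `run_id` and `parent_run_id` keyword arguments, as recent LangChain versions do; nothing here is a committed design.

```python
from typing import Any, Dict, List, Tuple

from langchain.callbacks.base import BaseCallbackHandler


class RunGraphHandler(BaseCallbackHandler):
    """Hypothetical sketch: collect the run tree of a chain so it can be
    drawn as a graph after execution."""

    def __init__(self) -> None:
        self.edges: List[Tuple[Any, Any]] = []  # (parent_run_id, child_run_id)
        self.labels: Dict[Any, str] = {}        # run_id -> component name

    def _record(self, serialized: Dict[str, Any], **kwargs: Any) -> None:
        run_id = kwargs.get("run_id")
        parent_id = kwargs.get("parent_run_id")
        # Fall back gracefully if the serialized payload has no usable name.
        name = (serialized or {}).get("name") or str((serialized or {}).get("id", "run"))
        self.labels[run_id] = name
        if parent_id is not None:
            self.edges.append((parent_id, run_id))

    def on_chain_start(
        self, serialized: Dict[str, Any], inputs: Dict[str, Any], **kwargs: Any
    ) -> None:
        self._record(serialized, **kwargs)

    def on_llm_start(
        self, serialized: Dict[str, Any], prompts: List[str], **kwargs: Any
    ) -> None:
        self._record(serialized, **kwargs)

    def on_tool_start(
        self, serialized: Dict[str, Any], input_str: str, **kwargs: Any
    ) -> None:
        self._record(serialized, **kwargs)
```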
Motivation
Last week, while building a chat-based application on OpenAI's GPT-3.5 APIs to answer SEO-related queries, I ran into a significant challenge. At launch, I realized I had no way to monitor the responses suggested by the language model (GPT, in my case). I also had no insight into its performance, such as speed or token usage. To check the token count, I had to repeatedly log in to the OpenAI dashboard, which was time-consuming and cumbersome.
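For reference, LangChain's existing `get_openai_callback` context manager already covers part of this (per-call token counts and cost for OpenAI models), though it says nothing about latency, responses, or user interactions. A minimal usage sketch, with a made-up example prompt:

```python
from langchain.callbacks import get_openai_callback
from langchain.chat_models import ChatOpenAI

llm = ChatOpenAI(model_name="gpt-3.5-turbo")

with get_openai_callback() as cb:
    llm.predict("Suggest a meta description for a page about trail running shoes.")

# The callback aggregates usage across every OpenAI call made inside the block.
print(cb.total_tokens, cb.prompt_tokens, cb.completion_tokens, cb.total_cost)
```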
It was frustrating not having a clear picture of user interactions and the effectiveness of the system's responses. I realized that I needed a way to understand how the system performed on different types of queries and to identify areas that required improvement.
I strongly believe that incorporating an observability feature in LangChain would be immensely valuable. It would empower developers like me to track user interactions, analyze the quality of responses, and measure the performance of the underlying LLM requests. Having these capabilities would not only provide insights into user behavior but also enable us to continuously improve the system's accuracy, response time, and overall user experience.
Your contribution
Yes, I am planning to raise a PR along with a couple of my friends to add an observability feature to LangChain.
I would be more than happy to take suggestions from the community on what we could add to make it more usable!