feat: Added support for account level governance of AI Monitoring #2326
Conversation
15e67a7 to baa3798
I tested with OpenAI completions, but it doesn't work with embeddings. I also had a few comments about langchain and some tests.
```diff
  * @param {string} params.type type of llm event (i.e. LlmChatCompletionMessage, LlmTool, etc)
  * @param {object} params.msg the llm event getting enqueued
  * @param {string} params.pkgVersion version of langchain library instrumented
  */
-common.recordEvent = function recordEvent({ agent, type, msg, pkgVersion }) {
+common.recordEvent = function recordEvent({ agent, shim, type, msg, pkgVersion }) {
   if (common.shouldSkipInstrumentation(agent.config) === true) {
```
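A minimal sketch of how a guard like the one in the diff might short-circuit when AI Monitoring is disabled. The `ai_monitoring.enabled` config key comes from the PR; the event store and other names here are illustrative, not the agent's real internals:

```javascript
// Hypothetical, simplified version of the guard shown in the diff above.
// `agent.config.ai_monitoring.enabled` mirrors the real config key;
// the `events` array and return values are assumptions for illustration.
const common = {}

common.shouldSkipInstrumentation = function shouldSkipInstrumentation(config) {
  return config?.ai_monitoring?.enabled !== true
}

common.recordEvent = function recordEvent({ agent, type, msg }) {
  if (common.shouldSkipInstrumentation(agent.config) === true) {
    return false // skip: AI Monitoring is governed off for this account
  }
  agent.events.push({ type, msg })
  return true
}

// Usage: with the flag disabled, no event is enqueued
const agent = { config: { ai_monitoring: { enabled: false } }, events: [] }
common.recordEvent({ agent, type: 'LlmChatCompletionMessage', msg: {} })
```

With this shape, callers don't need to know about the flag at all; the guard lives in one place inside `recordEvent`.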
This behavior differs from the other libraries: when `ai_monitoring.enabled` is false, it still creates the events in memory but doesn't do anything with them.
There are cases where it is still creating LLM events like LlmErrorMessage, and it's also doing a lot of parsing. I would change the langchain instrumentation to check for this flag before it attempts to create any LLM events.
Comments on the location of the config check: still create spans and end them, but refrain from creating LLM events.
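The suggested ordering above could look roughly like this sketch: the span lifecycle is unaffected by the flag, and the LLM event is only built when AI Monitoring is enabled. Apart from the `ai_monitoring.enabled` key, the function and property names are assumed for illustration, not the agent's actual API:

```javascript
// Hypothetical sketch: span is always created and ended; the LLM event
// is only constructed when ai_monitoring.enabled is true.
function shouldSkipInstrumentation(config) {
  return config?.ai_monitoring?.enabled !== true
}

function instrumentCall({ agent, run }) {
  const span = { name: 'Llm/chain/invoke', ended: false }
  const result = run()
  span.ended = true // span lifecycle is unaffected by the flag

  if (shouldSkipInstrumentation(agent.config)) {
    return { span, events: [] } // no LLM events created at all
  }
  return { span, events: [{ type: 'LlmChatCompletionSummary', result }] }
}

// Usage: flag off, so we get a finished span but no events
const agent = { config: { ai_monitoring: { enabled: false } } }
const out = instrumentCall({ agent, run: () => 'ok' })
```

Checking the flag before event construction also avoids the parsing work noted earlier, since nothing event-related runs when the feature is off.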
I'm going to wait to merge this until the team is ready to ship the feature.
This PR resolves #2325.