[RFE] support LiteLLM Azure OpenAI Entra ID authentication #208
Comments
I suspect Holmes already supports this, as litellm accepts the relevant credentials. I will double check (likely tomorrow) that it is working as expected and will report back here.
There doesn't appear to be a way to pass through the tenant ID or application ID, given the way that LiteLLM seems to want that data presented to it. I can give it a test locally.
It was suggested that I try:
This results in:
Hi @thoraxe, there are 2 main changes:
There is an additional change that you can try if the above is still not enough:
This will capture the Azure env vars and explicitly pass them to litellm.
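For illustration, a rough sketch of what capturing and forwarding the Azure env vars could look like, assuming the usual litellm variable names (`AZURE_API_KEY`, `AZURE_API_BASE`, `AZURE_API_VERSION`); the exact set of variables Holmes forwards and the deployment name below are assumptions, not taken from the actual change:

```python
# Hedged sketch: collect Azure-related env vars and pass them explicitly to
# litellm rather than relying on implicit environment lookup.
import os

import litellm


def build_azure_kwargs() -> dict:
    """Collect Azure settings from the environment, skipping unset values."""
    mapping = {
        "api_key": "AZURE_API_KEY",
        "api_base": "AZURE_API_BASE",
        "api_version": "AZURE_API_VERSION",
    }
    return {
        param: os.environ[env_var]
        for param, env_var in mapping.items()
        if os.environ.get(env_var)
    }


response = litellm.completion(
    model="azure/my-deployment",  # placeholder deployment name
    messages=[{"role": "user", "content": "ping"}],
    **build_azure_kwargs(),
)
```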
Indeed, LiteLLM does not support omitting the ad_token. This should be fixed in an upcoming change: BerriAI/litellm#6790. In the meantime, you could host litellm as a proxy, since that path works as expected, and then make Holmes use that proxy.
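For anyone trying that workaround, a minimal sketch of talking to a LiteLLM proxy through an OpenAI-compatible client, assuming a proxy is already running locally with the Entra ID credentials (tenant_id / client_id / client_secret) in its own config; the base URL, key, and model alias below are placeholders:

```python
# Hedged sketch of the proxy workaround: the LiteLLM proxy is started
# separately (configured with the Azure Entra ID credentials per the LiteLLM
# proxy docs); clients then speak the OpenAI API to it.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:4000",  # placeholder: wherever the proxy listens
    api_key="sk-anything",             # placeholder: the proxy's virtual/master key
)

response = client.chat.completions.create(
    model="my-azure-gpt4o",            # placeholder: model alias from the proxy config
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)
```

Pointing Holmes at the proxy then amounts to configuring it with the proxy's base URL and model alias instead of the Azure endpoint.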
A lot of organizations that use Azure OpenAI are going to want to use Entra ID for authentication. LiteLLM already supports this:
https://litellm.vercel.app/docs/providers/azure#entrata-id---use-tenant_id-client_id-client_secret
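For context, a minimal sketch of what Entra ID (client-credential) authentication can look like when feeding LiteLLM directly: obtain a token with azure-identity and pass it as `azure_ad_token`. The deployment name, endpoint, and API version below are placeholders, not values from this project:

```python
# Hedged sketch: acquire an Entra ID token via the client-credential flow and
# hand it to LiteLLM as `azure_ad_token`.
import os

from azure.identity import ClientSecretCredential
import litellm

credential = ClientSecretCredential(
    tenant_id=os.environ["AZURE_TENANT_ID"],
    client_id=os.environ["AZURE_CLIENT_ID"],
    client_secret=os.environ["AZURE_CLIENT_SECRET"],
)

# Azure OpenAI accepts tokens scoped to the Cognitive Services resource.
token = credential.get_token("https://cognitiveservices.azure.com/.default").token

response = litellm.completion(
    model="azure/my-gpt4o-deployment",                 # placeholder deployment name
    api_base="https://my-resource.openai.azure.com",   # placeholder endpoint
    api_version="2024-02-15-preview",                  # placeholder API version
    azure_ad_token=token,
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)
```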
It doesn't appear that HolmesGPT knows how to support this, looking at:
https://github.com/robusta-dev/holmesgpt/blob/master/holmes/core/llm.py