azure-ai-inference sdk support for Managed Online Endpoint LLM deployments. AML, AI Foundry #39025
Labels: AI Model Inference, AI, customer-reported, needs-team-attention, question, Service Attention
Describe the bug
Customer has deployed multiple Managed Online Endpoints to AzureML and AI Foundry. The endpoints expose a /score inference route but are not compatible with the azure-ai-inference SDK. When the scoring endpoint is specified together with a key credential, the SDK returns a "Failed Dependency" error.
To Reproduce
Steps to reproduce the behavior:
Result:
Error:
HttpResponseError: Operation returned an invalid status 'Failed Dependency'
Content: {"detail":"Not Found"}
Expected behavior
Completion response returned