Azure OpenAI Proxy

A proxy for the Azure OpenAI API that converts an OpenAI request into an Azure OpenAI request.

Deployment

  1. Set your DNS record's CAA to letsencrypt.org.
  2. Expose ports 80 and 443 to public access.
  3. Clone this repo and configure the environment: set ENDPOINT to your Azure OpenAI endpoint and HOST to your public URL, both without the http:// or https:// scheme (see the sketch below).
  4. Run sudo docker compose up -d to start the service.
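
A minimal sketch of steps 3 and 4, assuming docker compose picks the variables up from the shell environment (the exact wiring may differ; check docker-compose.yml in this repo). All values are placeholders:

    # Placeholder values -- replace with your own; note: no http:// or https:// scheme
    export ENDPOINT=bbbb.openai.azure.com    # your Azure OpenAI endpoint
    export HOST=proxy.example.com            # the public hostname of this proxy
    sudo docker compose up -d                # start the proxy in the background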

Usage

In any client that supports the original OpenAI API, configure your secret (API key) in one of three forms:

  • your password
  • your password@customized endpoint
  • your password@customized endpoint@your model deployment

For example, "aaaa@bbbb.openai.azure.com@mygpt" is a valid secret which uses token aaaa to communicate with bbbb.openai.azure.com using deployment mygpt.

  • The endpoint must be written as "bbbb.openai.azure.com", with NO "http://" or "https://".

By default, this proxy uses gpt-35-turbo and gpt-35-turbo-0301 as deployment names. If your secret includes a deployment name, the proxy uses only that deployment.
Finally, set the API host (proxy URL) field in your client to this proxy's URL.

If the secret contains only a password, the proxy forwards all requests to the ENDPOINT configured in the environment.
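
As an illustration only, a Go client built on the community github.com/sashabaranov/go-openai library (not part of this repo) could be pointed at the proxy as sketched below; the host name proxy.example.com and the /v1 base path are assumptions, so adjust them to match your deployment and client.

    package main

    import (
        "context"
        "fmt"

        openai "github.com/sashabaranov/go-openai"
    )

    func main() {
        // The secret doubles as the API key: password@endpoint@deployment.
        cfg := openai.DefaultConfig("aaaa@bbbb.openai.azure.com@mygpt")
        // Point the client at this proxy instead of api.openai.com.
        // "proxy.example.com" is a placeholder for your HOST value.
        cfg.BaseURL = "https://proxy.example.com/v1"
        client := openai.NewClientWithConfig(cfg)

        resp, err := client.CreateChatCompletion(context.Background(),
            openai.ChatCompletionRequest{
                Model: openai.GPT3Dot5Turbo, // mapped to your Azure deployment by the proxy
                Messages: []openai.ChatCompletionMessage{
                    {Role: openai.ChatMessageRoleUser, Content: "Hello!"},
                },
            })
        if err != nil {
            fmt.Println("error:", err)
            return
        }
        fmt.Println(resp.Choices[0].Message.Content)
    }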

Status

OpenCat ☑️

  • Platform: iOS, iPadOS, macOS
  • Link: https://opencat.app/
  • Known issue: the host URL must include port 443 in OpenCat on iOS

AMA ☑️

chatbox ☑️

Credit

Many thanks to these projects:
