Use the API to call Suno.ai's music generation AI and easily integrate it into agents like GPTs.
👉 We update quickly; please star the repo.
English | 简体中文 | Demo | Docs | Deploy with Vercel
🔥 Check out my new project: ReadPo - 10x Speed Up Your Reading and Writing
Suno.ai v3 is an amazing AI music service. Although the official API is not yet available, we couldn't wait to integrate its capabilities into our own applications.
We found that other users have the same need, so we decided to open-source this project; we hope you like it.
We have deployed an example bound to a free Suno account, so it has daily usage limits, but you can see how it runs: suno.gcui.ai
- Perfectly implements the creation API from app.suno.ai
- Automatically keeps the account active
- Compatible with the format of OpenAI's `/v1/chat/completions` API (see the sketch after this list)
- Supports Custom Mode
- One-click deployment to Vercel
- In addition to the standard API, it also adapts to the API schema of agent platforms like GPTs and Coze, so you can use it as a tool/plugin/action for LLMs and integrate it into any AI agent
- Permissive open-source license, allowing you to freely integrate and modify
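Because the service mirrors the OpenAI chat-completions format, you can point an OpenAI-style client at it. A minimal sketch, assuming a deployment at `http://localhost:3000`; the `model` value and the `api_key` placeholder are illustrative assumptions, not documented parameters:

```python
from openai import OpenAI

# Point an OpenAI-compatible client at a suno-api deployment.
# The base_url, model, and api_key values are assumptions for illustration.
client = OpenAI(base_url="http://localhost:3000/v1", api_key="placeholder")

completion = client.chat.completions.create(
    model="suno-v3",  # hypothetical model name; the service may ignore it
    messages=[{"role": "user", "content": "Write a cheerful folk song about spring"}],
)
print(completion.choices[0].message.content)
```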
To get started, obtain the cookie of your app.suno.ai account:

1. Head over to app.suno.ai in your browser.
2. Open the browser console: hit `F12` or open the Developer Tools.
3. Navigate to the `Network` tab.
4. Give the page a quick refresh.
5. Identify the request that includes the keyword `client?_clerk_js_version`.
6. Click on it and switch over to the `Headers` tab.
7. Locate the `Cookie` section, hover your mouse over it, and copy the value of the Cookie.
You can choose your preferred deployment method:
```bash
git clone https://github.com/gcui-art/suno-api.git
cd suno-api
npm install
```

Alternatively, you can use Docker Compose:

```bash
docker compose build && docker compose up
```
Next, configure the `SUNO_COOKIE` environment variable:

- If deploying to Vercel, add an environment variable `SUNO_COOKIE` in the Vercel dashboard, with the value of the cookie obtained in the first step.
- If running locally, add the following to your `.env` file:

```
SUNO_COOKIE=<your-cookie>
```
Then run and verify the deployment:

- If you've deployed to Vercel:
  - Click Deploy in the Vercel dashboard and wait for the deployment to succeed.
  - Visit `https://<vercel-assigned-domain>/api/get_limit` to test the API.
- If running locally:
  - Run `npm run dev`.
  - Visit `http://localhost:3000/api/get_limit` to test the API.
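You can also hit the quota endpoint programmatically. A minimal check, assuming a local deployment on the default port:

```python
import requests

# Quick health check against a local suno-api deployment.
# Swap base_url for your Vercel domain if deployed there.
base_url = "http://localhost:3000"
resp = requests.get(f"{base_url}/api/get_limit")
resp.raise_for_status()
print(resp.json())
```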
If the following result is returned:

```json
{
  "credits_left": 50,
  "period": "day",
  "monthly_limit": 50,
  "monthly_usage": 50
}
```

it means the program is running normally.
You can check out the detailed API documentation at: suno.gcui.ai/docs
Suno API currently implements the following endpoints:

- `/api/generate`: Generate music
- `/v1/chat/completions`: Generate music by calling the generate API in a format compatible with OpenAI's API
- `/api/custom_generate`: Generate music (Custom Mode; supports setting lyrics, music style, title, etc.)
- `/api/generate_lyrics`: Generate lyrics based on a prompt
- `/api/get`: Get music information by ID; use "," to separate multiple IDs. If no IDs are provided, all music will be returned
- `/api/get_limit`: Get quota info
- `/api/extend_audio`: Extend audio length
- `/api/generate_stems`: Make stem tracks (separate vocal and instrumental tracks)
- `/api/get_aligned_lyrics`: Get a list of timestamps for each word in the lyrics
- `/api/clip`: Get clip information based on the ID passed as query parameter `id`
- `/api/concat`: Generate the whole song from extensions
For more detailed documentation, please check out the demo site: suno.gcui.ai/docs
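Before the full examples below, here is a minimal sketch of the request shape for the lyrics endpoint; the `{"prompt": ...}` payload is an assumption based on the endpoint list above, so check the docs for the authoritative schema:

```python
import requests

# Minimal sketch: generate lyrics from a text prompt.
# The payload shape is an assumption; see suno.gcui.ai/docs for details.
base_url = "http://localhost:3000"
resp = requests.post(
    f"{base_url}/api/generate_lyrics",
    json={"prompt": "A song about a rainy night in the city"},
    headers={"Content-Type": "application/json"},
)
print(resp.json())
```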
A fuller end-to-end example in Python:

```python
import time
import requests

# Replace with your Vercel domain if deployed there
base_url = 'http://localhost:3000'


def custom_generate_audio(payload):
    url = f"{base_url}/api/custom_generate"
    response = requests.post(url, json=payload, headers={'Content-Type': 'application/json'})
    return response.json()


def extend_audio(payload):
    url = f"{base_url}/api/extend_audio"
    response = requests.post(url, json=payload, headers={'Content-Type': 'application/json'})
    return response.json()


def generate_audio_by_prompt(payload):
    url = f"{base_url}/api/generate"
    response = requests.post(url, json=payload, headers={'Content-Type': 'application/json'})
    return response.json()


def get_audio_information(audio_ids):
    url = f"{base_url}/api/get?ids={audio_ids}"
    response = requests.get(url)
    return response.json()


def get_quota_information():
    url = f"{base_url}/api/get_limit"
    response = requests.get(url)
    return response.json()


def get_clip(clip_id):
    url = f"{base_url}/api/clip?id={clip_id}"
    response = requests.get(url)
    return response.json()


def generate_whole_song(clip_id):
    payload = {"clip_id": clip_id}
    url = f"{base_url}/api/concat"
    response = requests.post(url, json=payload)
    return response.json()


if __name__ == '__main__':
    data = generate_audio_by_prompt({
        "prompt": "A popular heavy metal song about war, sung by a deep-voiced male singer, slowly and melodiously. The lyrics depict the sorrow of people after the war.",
        "make_instrumental": False,
        "wait_audio": False
    })

    ids = f"{data[0]['id']},{data[1]['id']}"
    print(f"ids: {ids}")

    # Poll for up to 5 minutes (60 attempts x 5 s) until the clips start streaming
    for _ in range(60):
        data = get_audio_information(ids)
        if data[0]["status"] == 'streaming':
            print(f"{data[0]['id']} ==> {data[0]['audio_url']}")
            print(f"{data[1]['id']} ==> {data[1]['audio_url']}")
            break
        # sleep 5s
        time.sleep(5)
```
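The `extend_audio` and `generate_whole_song` helpers above are defined but never called. Below is a hedged sketch of how they might be chained once a clip has finished; the `audio_id`, `prompt`, and `continue_at` payload fields and the response shapes are assumptions, so verify the exact schema in the API docs:

```python
# Hypothetical continuation of the example above. The payload field names
# (audio_id, prompt, continue_at) and the response shapes are assumptions;
# check suno.gcui.ai/docs for the authoritative schema.
extended = extend_audio({
    "audio_id": data[0]["id"],             # clip to extend
    "prompt": "Add a quiet acoustic outro",
    "continue_at": 60,                     # assumed: extend from the 60 s mark
})

# /api/concat stitches a clip and its extensions into one whole song.
print(generate_whole_song(extended[0]["id"]))  # assumed response shape
```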
The same flow in JavaScript:

```js
const axios = require("axios");

// Replace with your Vercel domain if deployed there
const baseUrl = "http://localhost:3000";

async function customGenerateAudio(payload) {
  const url = `${baseUrl}/api/custom_generate`;
  const response = await axios.post(url, payload, {
    headers: { "Content-Type": "application/json" },
  });
  return response.data;
}

async function generateAudioByPrompt(payload) {
  const url = `${baseUrl}/api/generate`;
  const response = await axios.post(url, payload, {
    headers: { "Content-Type": "application/json" },
  });
  return response.data;
}

async function extendAudio(payload) {
  const url = `${baseUrl}/api/extend_audio`;
  const response = await axios.post(url, payload, {
    headers: { "Content-Type": "application/json" },
  });
  return response.data;
}

async function getAudioInformation(audioIds) {
  const url = `${baseUrl}/api/get?ids=${audioIds}`;
  const response = await axios.get(url);
  return response.data;
}

async function getQuotaInformation() {
  const url = `${baseUrl}/api/get_limit`;
  const response = await axios.get(url);
  return response.data;
}

async function getClipInformation(clipId) {
  const url = `${baseUrl}/api/clip?id=${clipId}`;
  const response = await axios.get(url);
  return response.data;
}

async function main() {
  const data = await generateAudioByPrompt({
    prompt:
      "A popular heavy metal song about war, sung by a deep-voiced male singer, slowly and melodiously. The lyrics depict the sorrow of people after the war.",
    make_instrumental: false,
    wait_audio: false,
  });

  const ids = `${data[0].id},${data[1].id}`;
  console.log(`ids: ${ids}`);

  // Poll for up to 5 minutes (60 attempts x 5 s) until the clips start streaming
  for (let i = 0; i < 60; i++) {
    const data = await getAudioInformation(ids);
    if (data[0].status === "streaming") {
      console.log(`${data[0].id} ==> ${data[0].audio_url}`);
      console.log(`${data[1].id} ==> ${data[1].audio_url}`);
      break;
    }
    // sleep 5s
    await new Promise((resolve) => setTimeout(resolve, 5000));
  }
}

main();
```
You can integrate Suno AI as a tool/plugin/action into your AI agent.
[coming soon...]
There are four ways you can support this project:
- Fork and Submit Pull Requests: We welcome any PRs that help improve the project.
- Open Issues: We appreciate reasonable suggestions and bug reports.
- Donate: If this project has helped you, consider buying us a coffee using the Sponsor button at the top of the project. Cheers! ☕
- Spread the Word: Recommend this project to others, star the repo, or add a backlink after using the project.
We use GitHub Issues to manage feedback. Feel free to open an issue, and we'll address it promptly.
LGPL-3.0 or later
- Project repository: github.com/gcui-art/suno-api
- Suno.ai official website: suno.ai
- Demo: suno.gcui.ai
- ReadPo: ReadPo is an AI-powered reading and writing assistant. Collect, curate, and create content at lightning speed.
- Album AI: Auto-generate image metadata and chat with the album. RAG + Album.
suno-api is an unofficial open-source project, intended for learning and research purposes only.