
OpenAI Prompt Caching: Add cached_token parameter in usage_metadata of AI response #27334

Closed
ShubhamMaddhashiya-bidgely started this conversation in Ideas
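
For context, the ask is for LangChain's ChatOpenAI to surface the cached-token count that OpenAI's API already returns, inside the AIMessage's usage_metadata. A minimal sketch of reading that count directly from the raw OpenAI SDK, assuming the gpt-4o-mini model name and a prompt long enough (1024+ tokens at launch) to trigger prompt caching:

```python
# Sketch: reading OpenAI's cached-token count from the Chat Completions
# usage block. Prompt caching applies automatically once the prompt
# exceeds the minimum length, so a long repeated prefix is needed to
# observe a nonzero cached_tokens value.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

resp = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model; any caching-capable model works
    messages=[{"role": "user", "content": "long repeated prompt ..."}],
)

usage = resp.usage
# prompt_tokens_details may be absent on older SDKs or unsupported models,
# so guard the attribute access rather than indexing directly.
details = getattr(usage, "prompt_tokens_details", None)
cached = getattr(details, "cached_tokens", 0) if details else 0
print(f"prompt tokens: {usage.prompt_tokens}, cached: {cached}")
```

The feature request amounts to plumbing this `cached_tokens` value through to the response's usage_metadata so callers don't have to drop down to the raw SDK to see cache hits.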
Replies: 1 comment · 5 replies

Replies from @ShubhamMaddhashiya-bidgely, @sagaruprety, @RafaelMCarvalho, and @lolbus.
Category: Ideas
Labels: none yet
Participants: 4