
get_openai_callback total_cost BROKEN #6193

Closed · 2 of 14 tasks
matias-biatoz opened this issue Jun 15, 2023 · 5 comments

@matias-biatoz (Contributor)
System Info

Basically, when using llm.generate in combination with get_openai_callback, total_cost just outputs 0.

Code Snippet

from langchain.chat_models import ChatOpenAI
from langchain.schema import (
    AIMessage,
    HumanMessage,
    SystemMessage
)
from langchain.callbacks import get_openai_callback

chat = [{"role": "user", "content": "What's the weather like in Boston?"}]
messages = []

for message in chat:
    if message["role"] == "assistant":
        messages.append(AIMessage(content=message["content"]))
    elif message["role"] == "user":
        messages.append(HumanMessage(content=message["content"]))

with get_openai_callback() as cb:
    res = llm.generate([messages])

print(cb)  # Tokens Used is okay
print(cb)  # Total Cost is always 0
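For context, here is a simplified, self-contained model of how a callback like this aggregates usage. This is illustrative only — the class, method, and rate table below are hypothetical stand-ins, not LangChain's actual implementation — but it shows why the token counters can be correct while the cost stays 0:

```python
# Illustrative only: a hypothetical, simplified stand-in for LangChain's
# OpenAI callback, showing why token counts can be right while cost is 0.
MODEL_COST_PER_1K_TOKENS = {
    "gpt-3.5-turbo": 0.002,
    # "gpt-3.5-turbo-0613" is absent, as in the LangChain version discussed here
}

class SimpleOpenAICallback:
    def __init__(self):
        self.prompt_tokens = 0
        self.completion_tokens = 0
        self.total_tokens = 0
        self.successful_requests = 0
        self.total_cost = 0.0

    def on_llm_end(self, model_name, prompt_tokens, completion_tokens):
        # Token counters come straight from the API's usage data,
        # regardless of whether the model name is recognized...
        self.prompt_tokens += prompt_tokens
        self.completion_tokens += completion_tokens
        self.total_tokens += prompt_tokens + completion_tokens
        self.successful_requests += 1
        # ...but the cost lookup silently falls back to 0 for unknown names.
        rate = MODEL_COST_PER_1K_TOKENS.get(model_name, 0.0)
        self.total_cost += rate * (prompt_tokens + completion_tokens) / 1000

cb = SimpleOpenAICallback()
cb.on_llm_end("gpt-3.5-turbo-0613", prompt_tokens=68, completion_tokens=46)
print(cb.total_tokens)  # 114
print(cb.total_cost)    # 0.0
```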

Who can help?

@agola11 It's a callback issue (that's why I'm tagging you).

Information

  • The official example notebooks/scripts
  • My own modified scripts

Related Components

  • LLMs/Chat Models
  • Embedding Models
  • Prompts / Prompt Templates / Prompt Selectors
  • Output Parsers
  • Document Loaders
  • Vector Stores / Retrievers
  • Memory
  • Agents / Agent Executors
  • Tools / Toolkits
  • Chains
  • Callbacks/Tracing
  • Async

Reproduction

Same code as the snippet above.


Expected behavior

It should work the same way it does with chains or agents.
@vowelparrot (Contributor) commented Jun 15, 2023

What LLM are you using? The code snippet doesn't say, but it works if I set llm = ChatOpenAI():

from langchain.chat_models import ChatOpenAI
from langchain.schema import (
    AIMessage,
    HumanMessage,
    SystemMessage
)
from langchain.callbacks import get_openai_callback

llm = ChatOpenAI()
chat = [{"role": "user", "content": "What's the weather like in Boston?"}]
messages = []

for message in chat:
    if message["role"] == "assistant":
        messages.append(AIMessage(content=message["content"]))
    elif message["role"] == "user":
        messages.append(HumanMessage(content=message["content"]))

with get_openai_callback() as cb:
    res = llm.generate([messages])

print(cb)  # Tokens Used is okay
print(cb)  # Total Cost is always 0

Output:

Tokens Used: 114
	Prompt Tokens: 68
	Completion Tokens: 46
Successful Requests: 1
Total Cost (USD): $0.000228
Tokens Used: 114
	Prompt Tokens: 68
	Completion Tokens: 46
Successful Requests: 1
Total Cost (USD): $0.000228

@matias-biatoz (Contributor, author)

@vowelparrot
Here you go:

llm = ChatOpenAI(model_name="gpt-3.5-turbo-0613", temperature=0)

I tested it: if you leave llm = ChatOpenAI(), it works, but if you add parameters it does not.
Sorry for not specifying earlier.

@vowelparrot

Looks like we don't have the cost mapping for that model name yet.

@vowelparrot

MODEL_COST_PER_1K_TOKENS = {
    "gpt-4": 0.03,
    "gpt-4-0314": 0.03,
    "gpt-4-completion": 0.06,
    "gpt-4-0314-completion": 0.06,
    "gpt-4-32k": 0.06,
    "gpt-4-32k-0314": 0.06,
    "gpt-4-32k-completion": 0.12,
    "gpt-4-32k-0314-completion": 0.12,
    "gpt-3.5-turbo": 0.002,
    "gpt-3.5-turbo-0301": 0.002,
    "text-ada-001": 0.0004,
    "ada": 0.0004,
    "text-babbage-001": 0.0005,
    "babbage": 0.0005,
    "text-curie-001": 0.002,
    "curie": 0.002,
    "text-davinci-003": 0.02,
    "text-davinci-002": 0.02,
    "code-davinci-002": 0.02,
    "ada-finetuned": 0.0016,
    "babbage-finetuned": 0.0024,
    "curie-finetuned": 0.012,
    "davinci-finetuned": 0.12,
}
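Given that table, the reported numbers check out: a lookup keyed on this mapping returns a nonzero rate for "gpt-3.5-turbo" but falls through to 0 for the "-0613" suffix. A minimal sketch (the helper name is hypothetical, not LangChain's actual function):

```python
# Subset of the mapping above; "gpt-3.5-turbo-0613" is deliberately absent.
MODEL_COST_PER_1K_TOKENS = {
    "gpt-3.5-turbo": 0.002,
    "gpt-3.5-turbo-0301": 0.002,
}

def get_openai_token_cost(model_name: str, num_tokens: int) -> float:
    """Hypothetical helper: cost falls back to 0 for unmapped model names."""
    return MODEL_COST_PER_1K_TOKENS.get(model_name, 0.0) * num_tokens / 1000

# The 114 tokens from the working run above:
print(get_openai_token_cost("gpt-3.5-turbo", 114))       # ~0.000228, matching the printed Total Cost
print(get_openai_token_cost("gpt-3.5-turbo-0613", 114))  # 0.0
```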

dev2049 pushed a commit that referenced this issue Jun 19, 2023
@agola11 

Issue #6193

I added the new pricing for the new models.

Also, gpt-3.5-turbo pricing is now split into separate "input" and "output" rates; the callback currently does not support that.
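With split pricing, the per-call cost needs separate prompt and completion rates rather than one flat per-1K number. A hedged sketch of that computation (the rates shown are the June 2023 gpt-3.5-turbo prices of $0.0015/1K input and $0.002/1K output; the dict and function names are illustrative, not LangChain's actual code):

```python
# Illustrative sketch of split input/output pricing; not LangChain's code.
MODEL_COST_PER_1K_INPUT = {"gpt-3.5-turbo-0613": 0.0015}
MODEL_COST_PER_1K_OUTPUT = {"gpt-3.5-turbo-0613": 0.002}

def split_token_cost(model_name: str, prompt_tokens: int, completion_tokens: int) -> float:
    """Prompt and completion tokens are billed at different per-1K rates."""
    in_rate = MODEL_COST_PER_1K_INPUT.get(model_name, 0.0)
    out_rate = MODEL_COST_PER_1K_OUTPUT.get(model_name, 0.0)
    return (prompt_tokens * in_rate + completion_tokens * out_rate) / 1000

# Same 68-prompt / 46-completion split as the run above:
print(split_token_cost("gpt-3.5-turbo-0613", 68, 46))  # ~0.000194
```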
dosubot bot commented Sep 14, 2023

Hi, @matias-biatoz! I'm Dosu, and I'm here to help the LangChain team manage their backlog. I wanted to let you know that we are marking this issue as stale.

From what I understand, you reported an issue where the "total_cost" is always 0 when using the "llm.generate" function with the "get_openai_callback" in the provided code snippet. Vowelparrot suggested using the "llm = ChatOpenAI()" instead of specifying parameters to make it work. They also mentioned that there is no cost mapping for the model name "gpt-3.5-turbo-0613" yet and provided a list of model names and their corresponding costs.

Before we close this issue, we wanted to check with you if it is still relevant to the latest version of the LangChain repository. If it is, please let us know by commenting on this issue. Otherwise, feel free to close the issue yourself, or it will be automatically closed in 7 days.

Thank you for your contribution, and we appreciate your understanding as we work to manage our backlog effectively. Let us know if you have any further questions or concerns!

Best regards,
Dosu

@dosubot added the "stale" label Sep 14, 2023
@dosubot closed this as not planned (won't fix, can't repro, duplicate, stale) Sep 21, 2023
@dosubot removed the "stale" label Sep 21, 2023