[Feature] Return token usage #188
What about LLPhant/src/Chat/OpenAIChat.php, line 74 in fea142b?

Is there a way to retrieve the total number of tokens used with QuestionAnswering?

$qa = new QuestionAnswering(/* … */);
$answer = $qa->answerQuestion('What is the secret of Alice?');

If not, is it possible to use OpenAIChat with vector-stored embeddings?
Maybe you can try this:

$chat = new OpenAIChat(new OpenAIConfig());
$qa = new QuestionAnswering(
$filesVectorStore,
$embeddingGenerator,
$chat
);
$response = $qa->answerQuestion('What is the secret of Alice?');
echo('Total tokens: ' . $chat->getTotalTokens());
Do you want the total number of tokens for the question embedding plus all generations?
I started a small PR here. |
In the official OpenAI API, token usage is returned in the response, e.g. "total_tokens": 1340.
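For reference, the OpenAI chat completions response carries a usage object alongside the generated message; the field names below are from the official API, while the numeric values are purely illustrative:

```json
{
  "usage": {
    "prompt_tokens": 1280,
    "completion_tokens": 60,
    "total_tokens": 1340
  }
}
```

Any wrapper library that keeps a reference to the raw response can expose these fields, which is what `getTotalTokens()` on OpenAIChat appears to do.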
There seems to be no way to get this same info in LLPhant. This is basic functionality and it should be accessible, given that OpenAI does return it, but we can't currently figure out a way to do that in LLPhant.
Edit: to be clear, this is for the QuestionAnswering class; I know we can get it for the general chat class.