
Fix bug when using conversation with async call (refs simonw/llm#674) #25

Merged (3 commits into taketwo:master on Dec 20, 2024)

Conversation

@sukhbinder (Contributor)

Fixes the async conversation bug reported in simonw/llm#674.

The reported code now works.

```python
import asyncio
import llm

model = llm.get_async_model("llama3.2")

conversation = model.conversation()


async def run():
    # Two prompts against the same conversation: the second
    # must see the history of the first.
    response = await conversation.prompt("joke")
    text = await response.text()
    response2 = await conversation.prompt("again")
    text2 = await response2.text()
    print(text, text2)


asyncio.run(run())
```

@taketwo (Owner) left a comment:

The two builder functions are 99% identical; can we find a way to reuse the code?
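
(One standard shape for such a refactor is to extract the shared loop and parameterize the single step that differs. A minimal sketch with hypothetical names, not the plugin's actual code:)

```python
# Hypothetical sketch: collapse two near-identical builders into one by
# passing in the only step that differs (how text is read from a response).
def _build_messages(conversation, get_text):
    """Turn a conversation's history into Ollama-style chat messages."""
    messages = []
    for response in conversation.responses:
        messages.append({"role": "user", "content": response.prompt.prompt})
        messages.append({"role": "assistant", "content": get_text(response)})
    return messages


# The sync and async code paths would then share the helper and supply
# only the accessor that differs, e.g. for the sync path:
# messages = _build_messages(conversation, lambda r: r.text())
```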

@sukhbinder (Contributor Author)

Yes, that's true. Let me check if I can refactor this.

@sukhbinder (Contributor Author)

Thanks for the nudge, found a simpler way to mitigate the bug. :)
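
(The PR doesn't spell the fix out here, but llm exposes `response.text_or_raise()` for plugins that need conversation history without awaiting; assuming the fix leans on that, a single builder could serve both sync and async models. A sketch under that assumption, not the merged diff:)

```python
# Sketch: one message builder for both model flavors, assuming llm's
# response.text_or_raise(), which returns already-completed text without
# awaiting and raises if an async response has not finished yet.
def build_messages(prompt, conversation):
    messages = []
    if conversation is not None:
        for response in conversation.responses:
            messages.append(
                {"role": "user", "content": response.prompt.prompt}
            )
            messages.append(
                {"role": "assistant", "content": response.text_or_raise()}
            )
    messages.append({"role": "user", "content": prompt.prompt})
    return messages
```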

@sukhbinder (Contributor Author)

Demo of it working.

[Screen recording: llm-ollama-conversation]

@taketwo merged commit 39f73a3 into taketwo:master on Dec 20, 2024. 5 checks passed.