Looking at the builtin assistants, there are currently three ways of streaming:

Out of those, I think 2.ii. is by far the worst, as it is the hardest to parse. When we implemented streaming for Ragna, I went with 1., since that is what I was familiar with and had seen other providers do as well. At that point, I was not aware of streaming JSONL (2.i.) as an alternative.

2.i. has the upside that one does not need an extra library to handle it, while one is "required" for the other two cases (of course, we could write the logic ourselves as well).

Due to its simplicity, I would actually prefer JSONL streaming. We should make a decision here before we release streaming with 0.2.0.
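To make the "no extra library" point concrete, here is a minimal sketch of consuming a JSONL stream with nothing but the standard library. The payload shape and the simulated chunks are hypothetical, not Ragna's actual wire format:

```python
import json

# Hypothetical raw JSONL bytes as they might arrive from an assistant
# endpoint -- each newline-terminated line is one complete JSON document.
raw_stream = [
    b'{"content": "Hello"}\n',
    b'{"content": ", "}\n',
    b'{"content": "world"}\n',
]

def iter_jsonl(chunks):
    """Parse a stream of JSONL byte chunks into dicts, buffering partial lines."""
    buffer = b""
    for chunk in chunks:
        buffer += chunk
        # A chunk boundary need not align with a line boundary, so only
        # parse once we have seen a full line.
        while b"\n" in buffer:
            line, buffer = buffer.split(b"\n", 1)
            if line.strip():
                yield json.loads(line)
    if buffer.strip():
        yield json.loads(buffer)

messages = [m["content"] for m in iter_jsonl(raw_stream)]
print("".join(messages))  # -> Hello, world
```

Compared with SSE, there is no event framing (`event:`/`data:` fields, comment lines) to strip before the JSON parser sees the payload, which is why no dedicated parsing library is needed.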
I've added a PR for this in #357. It eliminates the dependency on sse-starlette, makes httpx_sse an optional dependency for the Anthropic and OpenAI assistants, and simplifies the streaming code on the user side.
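For illustration, the simplified user side roughly looks like the following. This is a self-contained sketch: `fake_aiter_lines` stands in for what an async HTTP client such as httpx would yield via `response.aiter_lines()`, and the `delta` field is a hypothetical payload shape, not Ragna's actual API:

```python
import asyncio
import json

# Hypothetical lines as an async HTTP client might yield them from a
# JSONL-streaming endpoint; the payload shape is illustrative only.
async def fake_aiter_lines():
    for line in ['{"delta": "streaming "}', '{"delta": "works"}']:
        yield line

async def consume():
    # With JSONL, each line is a complete JSON document: no SSE framing
    # to parse and therefore no httpx_sse (or similar) dependency needed.
    parts = []
    async for line in fake_aiter_lines():
        if line:
            parts.append(json.loads(line)["delta"])
    return "".join(parts)

print(asyncio.run(consume()))  # -> streaming works
```

With a real client, only the transport changes: the loop body iterating lines and calling `json.loads` stays the same.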