Replies: 3 comments 1 reply
-
I've made some progress. Now I can print the AI answer on my webpage, but it's not streamed.
1 reply
-
@sor4chi Can you answer this?
0 replies
-
Hi, @charnould
Note that you have to get the reader of the readable stream, and that there is a bug that causes Ollama's JSON response to be split across chunks.

```js
import { Hono } from "hono";

const app = new Hono();

app.get("/ai", async (c) => {
  // Forward a streaming generate request to the local Ollama server.
  const { body } = await fetch("http://localhost:11434/api/generate", {
    body: JSON.stringify({
      model: "llama2",
      prompt: "hello",
      stream: true,
    }),
    headers: { "Content-Type": "application/json" },
    method: "POST",
  });
  if (body == null) {
    return c.text("No body", 500);
  }
  const reader = body.getReader();
  return c.streamText(async (stream) => {
    // Ollama streams JSON objects, but one object can be split across
    // chunks. Accumulate raw text until it parses as valid JSON.
    let received = "";
    while (true) {
      const { done, value: message } = await reader.read();
      if (done) break;
      const decoded = new TextDecoder().decode(message);
      received += decoded;
      try {
        const json = JSON.parse(received);
        received = "";
        await stream.write(json.response);
      } catch (e) {
        // Incomplete JSON so far; keep accumulating.
      }
    }
  });
});

app.onError((err, c) => {
  console.error(err);
  return c.text("Internal Server Error", 500);
});

export default app;
```

2023-12-07.18.44.42.mov
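One caveat with the accumulate-and-retry approach above: a single network chunk can also carry more than one JSON object, in which case `JSON.parse` on the whole buffer fails even though complete objects are present. Since Ollama streams newline-delimited JSON, a buffer that splits on newlines handles both cases. Here is a minimal sketch of that idea; `makeNDJSONParser` is a hypothetical helper name, not part of Hono or Ollama:

```javascript
// Incremental NDJSON parser (hypothetical helper): feed it raw text
// chunks, get back an array of complete parsed JSON objects. Handles
// both an object split across chunks and several objects per chunk.
function makeNDJSONParser() {
  let buffer = "";
  return (chunk) => {
    buffer += chunk;
    const lines = buffer.split("\n");
    // The last element is an incomplete trailing line (or ""); keep it.
    buffer = lines.pop();
    return lines
      .filter((line) => line.trim() !== "")
      .map((line) => JSON.parse(line));
  };
}
```

In the reader loop this replaces the try/catch: `for (const json of feed(decoded)) await stream.write(json.response);`.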
0 replies
-
Hi,
I'm currently trying to reproduce the c.stream() example (from 3.7.0), using Ollama instead of OpenAI. My objective: a simple chat where a user asks a question and the AI responds.
Locally, when I cURL this:
The answer is:
So far, so good.
Then, I wrote a simple Hono app with this route, heavily inspired by Hono itself. However, the code gives an error I can't understand or debug (I'm a newbie).
Do you have any clue to make this streamed response work?
Thanks a lot.
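For context on what the cURL call returns: with `stream: true`, Ollama's `/api/generate` endpoint replies with one JSON object per line, each carrying a partial `response` field, and the full answer is their concatenation. A small sketch of assembling the final text (the `assembleAnswer` helper name is hypothetical, not from the thread):

```javascript
// Assemble the full answer from Ollama's newline-delimited streaming
// output: parse each line and concatenate the partial "response" fields.
function assembleAnswer(ndjsonText) {
  return ndjsonText
    .split("\n")
    .filter((line) => line.trim() !== "")
    .map((line) => JSON.parse(line))
    .map((obj) => obj.response ?? "")
    .join("");
}
```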