
Commit 339d237
Merge pull request #64 from EricLBuehler/develop
Fix typo & update README
guoqingbao authored Jul 24, 2024
2 parents 18dbe34 + 117586b commit 339d237
Showing 2 changed files with 4 additions and 4 deletions.
README.md: 4 changes (2 additions & 2 deletions)
@@ -23,8 +23,8 @@ Currently, candle-vllm supports chat serving for the following models.
  | #2 | **Mistral** ||70 tks/s (7B)|
  | #3 | **Phi (v1, v1.5, v2)** ||97 tks/s (2.7B, F32+BF16)|
  | #4 | **Phi-3 (3.8B, 7B)** ||107 tks/s (3.8B)|
- | #5 | **Yi** ||TBD|
- | #6 | **StableLM** ||TBD|
+ | #5 | **Yi** ||75 tks/s (6B)|
+ | #6 | **StableLM** ||99 tks/s (3B)|
  | #7 | BigCode/StarCode |TBD|TBD|
  | #8 | ChatGLM |TBD|TBD|
  | #9 | **QWen2 (1.8B, 7B)** ||148 tks/s (1.8B)|
src/openai/streaming.rs: 4 changes (2 additions & 2 deletions)
@@ -11,7 +11,7 @@ use std::{
 pub enum StreamingStatus {
     Uninitilized,
     Started,
-    Interupted,
+    Interrupted,
     Stopped,
 }
 pub enum ChatResponse {
@@ -55,7 +55,7 @@ impl Stream for Streamer {
         if self.status == StreamingStatus::Started && e == flume::TryRecvError::Disconnected
         {
             //no TryRecvError::Disconnected returned even if the client closed the stream or disconnected
-            self.status = StreamingStatus::Interupted;
+            self.status = StreamingStatus::Interrupted;
             Poll::Ready(None)
         } else {
             Poll::Pending
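
For context on the renamed variant: the `Interrupted` state is entered when `try_recv()` on the response channel reports `Disconnected` while streaming is still marked as `Started`. Below is a minimal standalone sketch, not part of this commit, that only assumes the `flume` crate the diff already references; it shows the channel behavior the check relies on.

```rust
// Sketch: once every Sender of a flume channel is dropped, try_recv()
// first drains any buffered messages and then yields
// TryRecvError::Disconnected -- the condition the Streamer uses to move
// from Started to Interrupted.
fn main() {
    let (tx, rx) = flume::unbounded::<u32>();
    tx.send(1).unwrap();
    drop(tx); // simulate the producing task going away mid-stream

    assert_eq!(rx.try_recv(), Ok(1)); // buffered message is still delivered
    assert_eq!(rx.try_recv(), Err(flume::TryRecvError::Disconnected));
}
```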
