Llama CPP doesn't require logits_all=True
kddubey committed Nov 23, 2023
1 parent 4d976cf commit 684e100
Showing 20 changed files with 1,886 additions and 1,904 deletions.
README.md: 4 changes (2 additions, 2 deletions)
@@ -28,8 +28,8 @@ This model must be able to be loaded using
 from llama_cpp import Llama
 from cappr.llama_cpp.classify import predict

-# Load model. Always set logits_all=True for CAPPr
-model = Llama("./TinyLLama-v0.Q8_0.gguf", logits_all=True, verbose=False)
+# Load model
+model = Llama("./TinyLLama-v0.Q8_0.gguf", verbose=False)

 prompt = """Gary told Spongebob a story:
 There once was a man from Peru; who dreamed he was eating his shoe. He
demos/huggingface/tweet_emotion_multilabel.ipynb: 2,982 changes (1,489 additions, 1,493 deletions)

Large diffs are not rendered by default.
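
For readers tracking the README change above, here is a minimal end-to-end sketch of how the updated example can be used after this commit. The prompt and completion labels are illustrative placeholders, not taken from the diff, and the snippet assumes cappr's predict(prompt, completions, model) call order and the same TinyLLama GGUF file referenced in the README.

from llama_cpp import Llama
from cappr.llama_cpp.classify import predict

# Load the GGUF model; after this commit, logits_all=True no longer needs to be set
model = Llama("./TinyLLama-v0.Q8_0.gguf", verbose=False)

# Illustrative prompt and candidate completions (placeholders, not from the README)
prompt = "Gary told Spongebob a story. Spongebob found the story"
completions = ("funny", "sad")

# predict returns the completion that CAPPr estimates is most probable given the prompt
pred = predict(prompt, completions, model)
print(pred)

The only user-facing difference from the previous README is the simpler Llama(...) constructor call: logits_all=True is no longer required for CAPPr.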

