# Add support for Anthropic models #105
Yes, I agree that Anthropic should be the first candidate for the next native backend. FYI: you can already use Claude with the GPT backend and a proxy server.
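As a rough illustration of the proxy approach: assuming an OpenAI-compatible proxy (e.g., a LiteLLM instance) running locally and forwarding requests to the Anthropic API, the setup could look like the sketch below. The proxy address, model name, and key string are placeholders, not part of scikit-llm:

```python
from skllm.config import SKLLMConfig
from skllm.models.gpt.classification.zero_shot import ZeroShotGPTClassifier

# Route scikit-llm's GPT backend to a local OpenAI-compatible proxy
# (e.g., LiteLLM) that forwards requests to the Anthropic API.
SKLLMConfig.set_gpt_url("http://localhost:4000/v1/")  # assumed proxy address

# "custom_url::<model>" sends requests to the configured URL; the model
# name must match whatever the proxy maps to a Claude model (assumption).
clf = ZeroShotGPTClassifier(model="custom_url::claude-3-haiku", key="any-string")
```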
For people using Ollama it would look like this (see also here):

```python
from skllm.config import SKLLMConfig
from skllm.models.gpt.classification.zero_shot import ZeroShotGPTClassifier
from skllm.datasets import get_classification_dataset

# Point the GPT backend at Ollama's local OpenAI-compatible endpoint
SKLLMConfig.set_gpt_url("http://localhost:11434/v1/")

X, y = get_classification_dataset()

clf = ZeroShotGPTClassifier(model="custom_url::llama3", key="ollama")
clf.fit(X, y)
labels = clf.predict(X)
```

Even though we don't technically need a key, any string is still required (presumably because the underlying OpenAI client expects an api_key to be set).
Edit: FYI, I have now fully integrated Ollama into my branch. The main reason for the integration was that I cannot pass optional parameters (such as the context size) via the OpenAI-compatible endpoint, and I absolutely need that control for my work. I know you guys want to use llama.cpp instead; I just figured others might be interested in the branch.
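For context, here is a minimal sketch of the kind of control at stake, using the standalone `ollama` Python client rather than the branch itself; the model name and option values are placeholders:

```python
import ollama

# The native client accepts an "options" dict for runtime parameters
# such as the context window size, which Ollama's OpenAI-compatible
# endpoint does not expose.
response = ollama.chat(
    model="llama3",
    messages=[{"role": "user", "content": "Classify: 'great movie!'"}],
    options={"num_ctx": 8192},  # placeholder context size
)
print(response["message"]["content"])
```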
Would be nice to use the latest and greatest Claude models with this project :)