
Allow specifying different remote/local models for use with node-llama-cpp #14

Open
hawkeyexl opened this issue Dec 13, 2024 · 0 comments
Labels: enhancement (New feature or request)

Comments

@hawkeyexl (Owner)

Give the user a choice of which model they want to use. They should be able to specify a model from any publicly available URL or point to a local file they already have on their device.
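As a rough sketch of how this could work: resolve whatever the user passes (URL or path) to a local file, then hand that path to node-llama-cpp. The `resolveModel` helper below is a hypothetical name, and the `getLlama()` / `loadModel({modelPath})` calls reflect my understanding of node-llama-cpp v3's API, so they should be checked against the version we actually depend on. Everything else uses built-in Node APIs.

```ts
import {createWriteStream} from "node:fs";
import {access, mkdir} from "node:fs/promises";
import {Readable} from "node:stream";
import {pipeline} from "node:stream/promises";
import path from "node:path";
import {getLlama} from "node-llama-cpp";

// Resolve the user-supplied model source to a local file path.
// http(s) URLs are downloaded into modelsDir (reusing a cached copy if present);
// anything else is treated as a path to a file already on the user's device.
async function resolveModel(source: string, modelsDir = "./models"): Promise<string> {
    if (/^https?:\/\//i.test(source)) {
        await mkdir(modelsDir, {recursive: true});
        const fileName = path.basename(new URL(source).pathname) || "model.gguf";
        const destination = path.join(modelsDir, fileName);

        try {
            // Skip the download if we already have the file
            await access(destination);
            return destination;
        } catch {
            // Not cached yet; fall through to download
        }

        const response = await fetch(source);
        if (!response.ok || response.body == null)
            throw new Error(`Model download failed: ${response.status} ${response.statusText}`);

        // Stream the response body straight to disk so large GGUF files aren't buffered in memory
        await pipeline(Readable.fromWeb(response.body as any), createWriteStream(destination));
        return destination;
    }

    // Local file: just verify it exists before handing it to the loader
    await access(source);
    return source;
}

// Usage (hypothetical CLI argument): the user passes either a URL or a local path
const modelPath = await resolveModel(process.argv[2] ?? "./models/default.gguf");
const llama = await getLlama();
const model = await llama.loadModel({modelPath});
```

Streaming the download keeps memory flat for multi-GB model files, and caching by file name avoids re-downloading the same model on every run; a real implementation would probably also want a checksum or size check before trusting a cached copy.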

@hawkeyexl added the enhancement label on Dec 13, 2024
@hawkeyexl changed the title from "Allow sepecifying different remote/local models for use with node-llama-cpp" to "Allow specifying different remote/local models for use with node-llama-cpp" on Dec 13, 2024