This repository has been archived by the owner on Mar 5, 2024. It is now read-only.

Add LLaMA.cpp to the backend #31

Open
n-galrion opened this issue Sep 8, 2023 · 0 comments
Labels
enhancement New feature or request

Comments

@n-galrion
Member

Is your feature request related to a problem? Please describe.
Let people use the app like Kobold or Ooba: allow model downloads inside the app, and run them using node-llama-cpp:
https://www.npmjs.com/package/node-llama-cpp
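For reference, a minimal sketch of what a backend helper around that package might look like. The `modelFilePath` helper, the models-directory layout, and the `.gguf` naming are assumptions for illustration; the `LlamaModel`/`LlamaContext`/`LlamaChatSession` names follow the package's v2-era README, so check the current node-llama-cpp docs before relying on them.

```javascript
// Hypothetical sketch of a TalOS backend helper around node-llama-cpp.
// Assumption: downloaded models live in one directory as .gguf files.
import path from "node:path";

// Pure helper: resolve a downloaded model's file path (hypothetical layout).
function modelFilePath(modelsDir, name) {
  return path.join(modelsDir, `${name}.gguf`);
}

async function runPrompt(modelsDir, name, prompt) {
  // Import lazily so the app only needs the llama.cpp bindings
  // when a model is actually run, not at startup.
  const { LlamaModel, LlamaContext, LlamaChatSession } =
    await import("node-llama-cpp");
  const model = new LlamaModel({ modelPath: modelFilePath(modelsDir, name) });
  const context = new LlamaContext({ model });
  const session = new LlamaChatSession({ context });
  return session.prompt(prompt);
}
```

This would let the download manager and the inference path agree on where models live, similar to how Kobold and Ooba resolve their model folders.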

@n-galrion n-galrion added the enhancement New feature or request label Sep 8, 2023
@n-galrion n-galrion added this to TalOS Sep 8, 2023
@n-galrion n-galrion moved this to 🆕 New in TalOS Sep 8, 2023
@n-galrion n-galrion self-assigned this Sep 8, 2023
@n-galrion n-galrion moved this from 🆕 New to 📋 Backlog in TalOS Sep 8, 2023
@n-galrion n-galrion moved this from 📋 Backlog to 🆕 New in TalOS Sep 14, 2023
Projects
Status: 🆕 New
Development

No branches or pull requests

1 participant