Extract structured data from local or remote LLM models
Updated Jun 21, 2024 - Python
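Structured extraction from an LLM usually means prompting the model to answer in JSON and then parsing that answer defensively, since local models often wrap the payload in extra prose. A minimal sketch of that parsing step (the function name and the sample model output are hypothetical, not taken from any repo listed here):

```python
import json
import re

def extract_json(llm_output: str) -> dict:
    """Pull the first JSON object out of raw LLM text,
    tolerating any prose the model wraps around it."""
    match = re.search(r"\{.*\}", llm_output, re.DOTALL)
    if match is None:
        raise ValueError("no JSON object found in model output")
    return json.loads(match.group(0))

# Hypothetical raw completion from a local model:
raw = 'Sure! Here is the data: {"name": "Ada", "year": 1843}'
print(extract_json(raw))  # → {'name': 'Ada', 'year': 1843}
```

In practice the regex is a fallback; models served with a grammar or JSON mode can be forced to emit valid JSON directly, which makes the parse step a plain `json.loads`.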
A Chrome extension for querying a local LLM using llama-cpp-python; includes a pip package for running the server ('pip install local-llama' to install)
An entirely open-source, locally running version of Recall (originally revealed by Microsoft for Copilot+ PCs)
Bell inequalities and local models via Frank-Wolfe algorithms
Main code chunks used for models in the publication "Exploring the Potential of Adaptive, Local Machine Learning (ML) in Comparison to the Prediction Performance of Global Models: A Case Study from Bayer's Caco-2 Permeability Database"
A vision-based avatar that reads Google News and extracts stories on its own using only local models
Extracting complete webpage articles from a screen recording using local models
A streamlined interface for interacting with local Large Language Models (LLMs) using Streamlit. Features interactive chat, configurable model parameters, and more.