Easily load local LLMs in a jupyter notebook for testing with langchain or other agents.


Aristoddle/Local-LLM-Langchain

 
 


Notebook for local LLMs

The goal of this project is to let people easily load their local LLMs in a notebook for testing with LangChain or other agents. This notebook is a companion to oobabooga/text-generation-webui and uses the same code for loading models. If you are using llama-cpp models only, you do not need the text-generation-webui code.

Example: an agent instructed to give its answer as a pirate and to search Google.

[screenshot: agent run in the notebook]

Model: llama-30b-sft-oa-alpaca-epoch-2
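The pirate example above follows the usual tool-using agent pattern: the prompt tells the model its persona and available tools, the model emits an action, the tool result is fed back in, and the loop repeats until a final answer. Here is a dependency-free sketch of that loop with a canned stub standing in for the local LLM; all names and the prompt format below are illustrative, not the notebook's actual code.

```python
# Illustrative agent loop (ReAct-style). The "LLM" is a stub that always
# searches once and then answers in persona; swap in a real local model.

def stub_llm(prompt: str) -> str:
    """Stand-in for a local LLM call."""
    if "Observation:" in prompt:
        return "Final Answer: Arr, the treasure be buried at the docks!"
    return "Action: search\nAction Input: buried treasure"

def search_tool(query: str) -> str:
    """Stand-in for a Google/Wikipedia search tool."""
    return f"Top result for {query!r}: ..."

def run_agent(question: str, llm=stub_llm) -> str:
    prompt = (
        "Answer as a pirate. You may use the tool 'search'.\n"
        f"Question: {question}\n"
    )
    for _ in range(3):  # cap the think/act loop
        reply = llm(prompt)
        if reply.startswith("Final Answer:"):
            return reply[len("Final Answer:"):].strip()
        # parse the tool call and feed the observation back into the prompt
        query = reply.split("Action Input:", 1)[1].strip()
        prompt += f"{reply}\nObservation: {search_tool(query)}\n"
    return "Agent gave no final answer."

print(run_agent("Where be the treasure?"))
```

With a real model, the parsing step is where things break: if the model does not emit the expected `Action:` / `Final Answer:` markers, the loop fails, which is exactly why the prompt tweaking mentioned in the Disclaimer matters.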

Getting Started

These instructions assume you have successfully set up the one-click installer for text-generation-webui on Windows with CUDA, or installed llama-cpp and its dependencies.

If you are using llama-cpp models only, you do not need to follow the instructions for text-generation-webui.
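For the llama-cpp-only path, a model can be loaded directly with the llama-cpp-python bindings, no webui code required. This is a minimal sketch: the model path is a placeholder, and the helper returns None rather than crashing when the library or weights are missing.

```python
import os

# Placeholder path -- point this at your own downloaded weights.
MODEL_PATH = "./models/your-model.bin"

def load_local_llm(model_path: str):
    """Return a llama-cpp model, or None if the library or weights are absent."""
    try:
        from llama_cpp import Llama  # pip install llama-cpp-python
    except ImportError:
        return None
    if not os.path.exists(model_path):
        return None
    return Llama(model_path=model_path, n_ctx=2048)

llm = load_local_llm(MODEL_PATH)
print("loaded" if llm else "llama-cpp or model weights not available")
```

Once loaded, the object is callable on a prompt string; it can also be wrapped for use with LangChain agents.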

Jupyter Notebook Usage:

  1. Activate your Python or Conda environment.
  2. Install Jupyter Notebook by running pip install jupyter in your preferred command prompt or terminal.
  3. Restart your command prompt or terminal to ensure that the installation is properly configured.
  4. Activate your Python or Conda environment again and run jupyter notebook in the command prompt or terminal to launch the Jupyter interface.
  5. Navigate to the directory where Alpaca-wikipedia-search.ipynb is located (text-generation-webui users should put it in ./text-generation-webui/) and open the notebook in the Jupyter interface.
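The steps above boil down to a short terminal session. This is a setup fragment, not a script to run blindly: the environment name and notebook location are assumptions to adapt to your setup.

```shell
# Activate your environment (name is an example).
conda activate textgen            # or: source venv/bin/activate

# Install and launch Jupyter (restart the terminal after installing).
pip install jupyter
jupyter notebook

# In the browser UI, open Alpaca-wikipedia-search.ipynb
# (text-generation-webui users: from ./text-generation-webui/).
```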

Disclaimer

This may not work the same for every model and search query. Prompts may need to be tweaked to get the agent to follow the instructions correctly. If you know of any instruct prompts that work well with certain models, please let me know.

Contributions

Feel free to open issues, submit pull requests, etc. if you want to join in on this research.
