How to use and train LLMs with transformers and llama.cpp

caretech-owl/notebooks

Notebooks

Notes

Put downloaded models into

<project_dir>/models

The cache should be located in

<project_dir>/cache
<project_dir>/cache/hub    # Hugging Face hub cache
<project_dir>/cache/lora   # LoRA training output
<project_dir>/cache/models # model merge results
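A minimal sketch of wiring this layout up in code, assuming Hugging Face tooling is used: setting `HF_HOME` to `<project_dir>/cache` makes the hub cache default to `<project_dir>/cache/hub`, matching the structure above. The helper name and example path are illustrative, not from the repo.

```python
import os

def hf_cache_env(project_dir):
    """Build environment variables that point Hugging Face tooling
    at the project-local cache layout described above."""
    cache = os.path.join(project_dir, "cache")
    return {
        "HF_HOME": cache,                            # root cache dir
        "HF_HUB_CACHE": os.path.join(cache, "hub"),  # hub downloads
    }

# Apply before importing transformers / huggingface_hub:
os.environ.update(hf_cache_env("/path/to/project"))
```

Set these variables early; once `transformers` or `huggingface_hub` is imported, the cache location is already resolved.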

Issues with ctransformers

Compiling ctransformers on the cluster with a custom GCC:

LDFLAGS="-static-libstdc++" CC=/home/neum_al/env/bin/gcc pip install ctransformers --no-binary ctransformers --no-cache-dir
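After the install succeeds, a model from `<project_dir>/models` can be loaded with ctransformers. The sketch below is an assumption about usage, not from the repo; the model filename is a hypothetical example.

```python
import os

def local_model_path(project_dir, filename):
    """Resolve a model file inside the project-local models folder."""
    return os.path.join(project_dir, "models", filename)

model_file = local_model_path("/path/to/project", "llama-2-7b.Q4_K_M.gguf")

# Requires the `pip install ctransformers` step above to have succeeded:
# from ctransformers import AutoModelForCausalLM
# llm = AutoModelForCausalLM.from_pretrained(model_file, model_type="llama")
# print(llm("Hello"))
```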
