An open-source, mini imitation of GitHub Copilot for Emacs, using Replit Code 3B (a recent update) or EleutherAI GPT-Neo-2.7B, both via the Huggingface Model Hub.
- Set `device` to `"cpu"` or `"cuda"` in `serve/server.py` (a sketch of the model/device setup follows this list).
- The “priming” is currently done in Python. If you want, change it to another language or turn it off (from subjective experience, priming seems to help). See the priming sketch after this list.
- Launch `serve/server.py`. This starts a Flask app that allows us to sample the model via a REST API (a minimal server sketch follows this list).
- In `emacs/secondmate.el`, customize the URL in `secondmate-url` to the address the API is running on (a quick way to test the address is sketched below).
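
The device switch in the first step just decides where the model weights live. A minimal sketch, assuming the server loads the model with Hugging Face Transformers and PyTorch (variable and model names here are illustrative, not the repository's exact code):

```python
# Minimal sketch of the model/device setup in serve/server.py.
# Names are illustrative, not the repository's exact code.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hard-code "cpu" or "cuda", or pick automatically as below.
device = "cuda" if torch.cuda.is_available() else "cpu"

# Either model mentioned above works; Replit Code 3B may additionally
# require trust_remote_code=True when loading.
MODEL_NAME = "EleutherAI/gpt-neo-2.7B"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME).to(device)
model.eval()
```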
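“Priming” here means prepending a short, well-formed snippet in the target language so the model is nudged to continue in that language and style. A hypothetical illustration of the idea; the actual priming text in `serve/server.py` may differ:

```python
# Hypothetical priming prefix: a small, idiomatic Python snippet placed
# before the editor context. The real priming string may differ.
PRIMING = '''import math

def mean(xs):
    return sum(xs) / len(xs)

'''

def build_prompt(buffer_context: str) -> str:
    # The primed prompt is what gets sent to the model;
    # returning buffer_context unchanged would turn priming off.
    return PRIMING + buffer_context
```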
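For the second step, the server is essentially a small Flask app with one sampling route. A minimal sketch, reusing `model`, `tokenizer`, and `device` from the setup sketch above; the route name, JSON fields, and generation settings are assumptions, not necessarily what `serve/server.py` uses:

```python
# Minimal sketch of a Flask endpoint that samples the model.
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/complete", methods=["POST"])
def complete():
    prompt = request.get_json().get("prompt", "")
    inputs = tokenizer(prompt, return_tensors="pt").to(device)
    outputs = model.generate(
        **inputs,
        max_new_tokens=64,
        do_sample=True,
        temperature=0.6,
        pad_token_id=tokenizer.eos_token_id,
    )
    text = tokenizer.decode(outputs[0], skip_special_tokens=True)
    # Return only the newly generated suffix.
    return jsonify({"completion": text[len(prompt):]})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```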
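Before pointing `secondmate-url` at the server, it can be worth confirming the address actually responds. A small Python check; the endpoint path and JSON fields carry over from the server sketch above and are assumptions, not necessarily the real `serve/server.py` interface:

```python
# Quick sanity check for the address you plan to put in secondmate-url.
import requests

SECONDMATE_URL = "http://localhost:5000/complete"  # mirror this in secondmate-url

resp = requests.post(
    SECONDMATE_URL,
    json={"prompt": "def fibonacci(n):\n    "},
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["completion"])
```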