Second Mate

An open-source, mini imitation of GitHub Copilot for Emacs, using Replit Code 3B (updated from EleutherAI GPT-Neo-2.7B) via the Hugging Face Model Hub.

[[./assets/demo1.gif]]

Setup

Inference End / Backend

  1. Set device to “cpu” or “cuda” in serve/server.py
  2. The “priming” prefix is currently Python code. If you want, change it to another language or turn it off (from subjective experience, priming seems to help).
  3. Run serve/server.py. This starts a Flask app that exposes the model for sampling via a REST API (a hypothetical sketch follows below).
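
To make the three steps concrete, here is a minimal, hypothetical sketch of such a backend. The model choice, route name (/generate), request field (prompt), port, and priming text are assumptions for illustration, not the actual contents of serve/server.py.

#+begin_src python
from flask import Flask, request, jsonify
from transformers import AutoModelForCausalLM, AutoTokenizer

device = "cpu"  # step 1: set to "cuda" if a GPU is available

model_name = "EleutherAI/gpt-neo-2.7B"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name).to(device)

# Step 2: a short "priming" prefix prepended to every prompt.
# Swap in another language, or set it to "" to disable priming.
PRIMING = "# Python 3\n"

app = Flask(__name__)

@app.route("/generate", methods=["POST"])
def generate():
    prompt = PRIMING + request.json["prompt"]
    inputs = tokenizer(prompt, return_tensors="pt").to(device)
    output = model.generate(**inputs, max_new_tokens=64, do_sample=True, temperature=0.8)
    # Return only the newly generated tokens, not the echoed prompt.
    completion = tokenizer.decode(output[0][inputs["input_ids"].shape[1]:],
                                  skip_special_tokens=True)
    return jsonify({"completion": completion})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)  # step 3: serve the REST API
#+end_src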

Emacs

  1. In emacs/secondmate.el, customize secondmate-url to the address the backend API is running on, as in the snippet below.
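
For example, a hypothetical init.el snippet: the load path is a placeholder, and the URL assumes the backend is listening on Flask's default port 5000, so use whatever address serve/server.py actually reports.

#+begin_src emacs-lisp
;; Make the package visible to Emacs (placeholder path).
(add-to-list 'load-path "/path/to/second-mate/emacs/")
(load "secondmate")

;; Point the client at the running backend (assumed address).
(setq secondmate-url "http://localhost:5000/")
#+end_src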