👉 Support this work via GitHub Sponsors
A multi-LLM Emacs comint shell, by me.
- ob-chatgpt-shell: Evaluate chatgpt-shell blocks as Emacs org babel blocks.
- ob-dall-e-shell: Evaluate DALL-E shell blocks as Emacs org babel blocks.
- dall-e-shell: An Emacs shell for OpenAI’s DALL-E.
- shell-maker: Create Emacs shells backed by either local or cloud services.
chatgpt-shell goes multi-model 🎉
It’s been quite a bit of work, and plenty more remains. Please sponsor the project to make development + support sustainable.
Provider | Model | Supported | Setup |
---|---|---|---|
Anthropic | Claude | New 💫 | Set chatgpt-shell-anthropic-key |
Google | Gemini | New 💫 | Set chatgpt-shell-google-key |
Kagi | Summarizer | New 💫 | Set chatgpt-shell-kagi-key |
Ollama | Llama | New 💫 | Install Ollama |
OpenAI | ChatGPT | Yes | Set chatgpt-shell-openai-key |
Perplexity | Llama Sonar | New 💫 | Set chatgpt-shell-perplexity-key |
Note: With the exception of Ollama, you typically have to pay the cloud services for API access. Please check with each respective LLM service.
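As an example of setting one of the new keys, here is a minimal sketch for Anthropic, assuming the key lives in ~/.authinfo under a hypothetical api.anthropic.com machine entry (any of the mechanisms shown later for chatgpt-shell-openai-key work here too):

(setq chatgpt-shell-anthropic-key
      (lambda ()
        ;; Assumes a ~/.authinfo line like:
        ;;   machine api.anthropic.com password YOUR-ANTHROPIC-KEY
        (auth-source-pick-first-password :host "api.anthropic.com")))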
My favourite model is missing.
File a feature request or sponsor the work.
chatgpt-shell is a comint shell. Bring your favourite Emacs shell flows along.
One shell to query all. Swap LLM provider (via M-x chatgpt-shell-swap-model) and continue with your familiar flow.
chatgpt-shell includes a compose buffer experience. This is my favourite and most frequently used mechanism to interact with LLMs.
For example, select a region and invoke M-x chatgpt-shell-prompt-compose (C-c C-e is my preferred binding), and an editable buffer automatically copies the region and enables crafting a more thorough query. When ready, submit with the familiar C-c C-c binding. The buffer automatically becomes read-only and enables single-character bindings.
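To invoke it from any buffer, you could bind the command globally yourself. A minimal sketch (the key choice is just an example):

(global-set-key (kbd "C-c C-e") #'chatgpt-shell-prompt-compose)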
Navigate through source blocks (including previous submissions in history). Source blocks are automatically selected.
Reply with follow-up requests using the r binding.
Want to ask for more of the same data? Press m to request more of it. This is handy to follow up on any kind of list (suggestions, candidates, results, etc.).
I’m a big fan of quickly disposing of Emacs buffers with the q binding. chatgpt-shell compose buffers are no exception.
LLM being lazy and returning partial code? Press e to request the entire snippet.
Request inline modifications, with explicit confirmation before accepting.
Execute snippets (a la org babel)
Both the shell and the compose buffers enable users to execute source blocks via C-c C-c, leveraging org babel.
I’ve been experimenting with image queries (currently ChatGPT only, please sponsor to help bring support for others).
Below is a handy integration to extract Japanese vocabulary. There’s also a generic image descriptor available via M-x chatgpt-shell-describe-image that works on any Emacs image (via dired, image buffer, point on image, or selecting a desktop region).
If you’re finding chatgpt-shell useful, help make the project sustainable and consider ✨sponsoring✨.
chatgpt-shell is in development. Please report issues or send pull requests for improvements.
Finding it useful? Like the package? I’d love to hear from you. Get in touch (Mastodon / Twitter / Bluesky / Reddit / Email).
Via use-package, you can install with :ensure t.
(use-package chatgpt-shell
  :ensure t
  :custom
  ((chatgpt-shell-openai-key
    (lambda ()
      (auth-source-pass-get 'secret "openai-key")))))
To default to a specific model (for example, a local Ollama model), set chatgpt-shell-model-version:

(setq chatgpt-shell-model-version "llama3.2")
You’ll first need to get a key from OpenAI.
;; if you are using the "pass" password manager
(setq chatgpt-shell-openai-key
      (lambda ()
        ;; (auth-source-pass-get 'secret "openai-key") ; alternative using pass support in auth-sources
        (nth 0 (process-lines "pass" "show" "openai-key"))))
;; or if using auth-sources, e.g., so the file ~/.authinfo has this line:
;;   machine api.openai.com password OPENAI_KEY
(setq chatgpt-shell-openai-key
      (auth-source-pick-first-password :host "api.openai.com"))
;; or same as previous but lazy loaded (prevents unexpected passphrase prompt)
(setq chatgpt-shell-openai-key
      (lambda ()
        (auth-source-pick-first-password :host "api.openai.com")))
Alternatively, set the key interactively via M-x set-variable chatgpt-shell-openai-key, set it directly as a string:

(setq chatgpt-shell-openai-key "my key")

or read it from an environment variable:

(setq chatgpt-shell-openai-key (getenv "OPENAI_API_KEY"))
If you use ChatGPT through a proxy service such as “https://api.chatgpt.domain.com”, set options like the following:
(use-package chatgpt-shell
  :ensure t
  :custom
  ((chatgpt-shell-api-url-base "https://api.chatgpt.domain.com")
   (chatgpt-shell-openai-key
    (lambda ()
      ;; Here the openai-key should be the proxy service key.
      (auth-source-pass-get 'secret "openai-key")))))
If your proxy service does not use OpenAI’s default API path (/v1/chat/completions), you can customize the option chatgpt-shell-api-url-path.
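For example, a sketch with a made-up path (use whatever route your proxy actually exposes):

;; Hypothetical proxy route; adjust to match your service.
(setq chatgpt-shell-api-url-path "/api/v1/chat/completions")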
Behind the scenes, chatgpt-shell uses curl to send requests to the OpenAI server.
If you use ChatGPT through an HTTP proxy (for example, you are in a corporate network and an HTTP proxy shields it from the internet), you need to tell curl to use the proxy via the curl option -x http://your_proxy. For this, use chatgpt-shell-proxy. For example, if you want the equivalent of curl -x http://your_proxy, set chatgpt-shell-proxy to ”http://your_proxy”.
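For example:

(setq chatgpt-shell-proxy "http://your_proxy")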
If you use an Azure OpenAI endpoint of the form https://{your-resource-name}.openai.azure.com/openai/deployments/{deployment-id}/chat/completions?api-version={api-version}, configure the following variables:
(setq chatgpt-shell-api-url-base "https://{your-resource-name}.openai.azure.com")
(setq chatgpt-shell-api-url-path "/openai/deployments/{deployment-id}/chat/completions?api-version={api-version}")
(setq chatgpt-shell-auth-header (lambda () (format "api-key: %s" (chatgpt-shell-openai-key))))
Launch with M-x chatgpt-shell.

Note: M-x chatgpt-shell keeps a single shell around, refocusing if needed. To launch multiple shells, use C-u M-x chatgpt-shell.
To clear the buffer, type clear as a prompt:

ChatGPT> clear

Alternatively, use either M-x chatgpt-shell-clear-buffer or M-x comint-clear-buffer.
Save with M-x chatgpt-shell-save-session-transcript and restore with M-x chatgpt-shell-restore-session-from-transcript.

Related variables live in shell-maker, such as shell-maker-transcript-default-path and shell-maker-forget-file-after-clear.
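For example, a sketch assuming you want transcripts to default to a hypothetical ~/chats/ directory:

;; Assumption: shell-maker-transcript-default-path takes a directory path.
(setq shell-maker-transcript-default-path "~/chats/")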
chatgpt-shell can either wait until the entire response is received before displaying it, or progressively display chunks as they arrive (streaming).

Streaming is enabled by default. Use (setq chatgpt-shell-streaming nil) to disable it.
Custom variable | Description |
---|---|
chatgpt-shell-google-api-url-base | Google API’s base URL. |
chatgpt-shell-perplexity-key | Perplexity API key as a string or a function that loads and returns it. |
chatgpt-shell-prompt-header-write-git-commit | Prompt header of ‘git-commit‘. |
chatgpt-shell-highlight-blocks | Whether or not to highlight source blocks. |
chatgpt-shell-display-function | Function to display the shell. Set to ‘display-buffer’ or custom function. |
chatgpt-shell-prompt-header-generate-unit-test | Prompt header of ‘generate-unit-test‘. |
chatgpt-shell-prompt-header-refactor-code | Prompt header of ‘refactor-code‘. |
chatgpt-shell-prompt-header-proofread-region | Prompt header used by ‘chatgpt-shell-proofread-region‘. |
chatgpt-shell-welcome-function | Function returning welcome message or nil for no message. |
chatgpt-shell-perplexity-api-url-base | Perplexity API’s base URL. |
chatgpt-shell-prompt-query-response-style | Determines the prompt style when invoking from other buffers. |
chatgpt-shell-model-version | The active model version, as either a string or an index into ‘chatgpt-shell-models’. |
chatgpt-shell-logging | Whether to enable logging (disabled by default, as it slows things down). |
chatgpt-shell-api-url-base | OpenAI API’s base URL. |
chatgpt-shell-google-key | Google API key as a string or a function that loads and returns it. |
chatgpt-shell-ollama-api-url-base | Ollama API’s base URL. |
chatgpt-shell-babel-headers | Additional headers to make babel blocks work. |
chatgpt-shell--pretty-smerge-mode-hook | Hook run after entering or leaving ‘chatgpt-shell--pretty-smerge-mode’. |
chatgpt-shell-source-block-actions | Block actions for known languages. |
chatgpt-shell-default-prompts | List of default prompts to choose from. |
chatgpt-shell-anthropic-key | Anthropic API key as a string or a function that loads and returns it. |
chatgpt-shell-prompt-header-eshell-summarize-last-command-output | Prompt header of ‘eshell-summarize-last-command-output‘. |
chatgpt-shell-system-prompt | Index into ‘chatgpt-shell-system-prompts’ for the active system prompt. |
chatgpt-shell-transmitted-context-length | Controls the amount of context provided to ChatGPT. |
chatgpt-shell-root-path | Root path location to store internal shell files. |
chatgpt-shell-prompt-header-whats-wrong-with-last-command | Prompt header of ‘whats-wrong-with-last-command‘. |
chatgpt-shell-read-string-function | Function to read strings from user. |
chatgpt-shell-after-command-functions | Abnormal hook (i.e. with parameters) invoked after each command. |
chatgpt-shell-system-prompts | List of system prompts to choose from. |
chatgpt-shell-openai-key | OpenAI key as a string or a function that loads and returns it. |
chatgpt-shell-prompt-header-describe-code | Prompt header of ‘describe-code‘. |
chatgpt-shell-insert-dividers | Whether or not to display a divider between requests and responses. |
chatgpt-shell-models | The list of supported models to swap from. |
chatgpt-shell-language-mapping | Maps external language names to Emacs names. |
chatgpt-shell-prompt-compose-view-mode-hook | Hook run after entering or leaving ‘chatgpt-shell-prompt-compose-view-mode’. |
chatgpt-shell-streaming | Whether or not to stream ChatGPT responses (show chunks as they arrive). |
chatgpt-shell-anthropic-api-url-base | Anthropic API’s base URL. |
chatgpt-shell-model-temperature | What sampling temperature to use, between 0 and 2, or nil. |
chatgpt-shell-request-timeout | How long to wait for a request to time out in seconds. |
There are more. Browse them via M-x set-variable.
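For example, a few of them set via use-package’s :custom (the values here are illustrative only):

(use-package chatgpt-shell
  :ensure t
  :custom
  ((chatgpt-shell-streaming t)           ;; show chunks as they arrive
   (chatgpt-shell-insert-dividers t)     ;; divider between requests and responses
   (chatgpt-shell-request-timeout 600))) ;; seconds to wait before timing out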
If you’d prefer your own custom display function:

(setq chatgpt-shell-display-function #'my/chatgpt-shell-frame)

(defun my/chatgpt-shell-frame (bname)
  (let ((cur-f (selected-frame))
        (f (my/find-or-make-frame "chatgpt")))
    (select-frame-by-name "chatgpt")
    (pop-to-buffer-same-window bname)
    (set-frame-position f (/ (display-pixel-width) 2) 0)
    (set-frame-height f (frame-height cur-f))
    (set-frame-width f (frame-width cur-f) 1)))

(defun my/find-or-make-frame (fname)
  (condition-case nil
      (select-frame-by-name fname)
    (error (make-frame `((name . ,fname))))))
Thanks to tuhdo for the custom display function.
| Binding | Command | Description |
|---|---|---|
| | chatgpt-shell-japanese-lookup | Look Japanese term up. |
| | chatgpt-shell-next-source-block | Move point to the next source block’s body. |
| | chatgpt-shell-prompt-compose-request-entire-snippet | If the response code is incomplete, request the entire snippet. |
| | chatgpt-shell-prompt-compose-request-more | Request more data. This is useful if you already requested examples. |
| | chatgpt-shell-execute-babel-block-action-at-point | Execute block as org babel. |
| C-c C-s | chatgpt-shell-swap-system-prompt | Swap system prompt from `chatgpt-shell-system-prompts’. |
| | chatgpt-shell-system-prompts-menu | ChatGPT |
| | chatgpt-shell-prompt-compose-swap-model-version | Swap the compose buffer’s model version. |
| | chatgpt-shell-describe-code | Describe code from region using ChatGPT. |
| C-<up> or M-p | chatgpt-shell-previous-input | Cycle backwards through input history, saving input. |
| C-c C-v | chatgpt-shell-swap-model | Swap model version from `chatgpt-shell-models’. |
| C-x C-s | chatgpt-shell-save-session-transcript | Save shell transcript to file. |
| | chatgpt-shell-proofread-region | Proofread text from region using ChatGPT. |
| | chatgpt-shell-prompt-compose-quit-and-close-frame | Quit compose and close frame if it’s the last window. |
| | chatgpt-shell-prompt-compose-other-buffer | Jump to the shell buffer (compose’s other buffer). |
| | chatgpt-shell-prompt-compose-next-block | Jump to and select next code block. |
| | chatgpt-shell | Start a ChatGPT shell interactive command. |
| RET | chatgpt-shell-submit | Submit current input. |
| | chatgpt-shell-prompt-compose-swap-system-prompt | Swap the compose buffer’s system prompt. |
| | chatgpt-shell-describe-image | Request OpenAI to describe image. |
| | chatgpt-shell-prompt-compose-search-history | Search prompt history, select, and insert to current compose buffer. |
| | chatgpt-shell-prompt-compose-previous-history | Insert previous prompt from history into compose buffer. |
| | chatgpt-shell-delete-interaction-at-point | Delete interaction (request and response) at point. |
| | chatgpt-shell-refresh-rendering | Refresh markdown rendering by re-applying to entire buffer. |
| | chatgpt-shell-prompt-compose-insert-block-at-point | Insert block at point at last known location. |
| | chatgpt-shell-explain-code | Describe code from region using ChatGPT. |
| | chatgpt-shell-execute-block-action-at-point | Execute block at point. |
| | chatgpt-shell-load-awesome-prompts | Load `chatgpt-shell-system-prompts’ from awesome-chatgpt-prompts. |
| | chatgpt-shell-write-git-commit | Write commit from region using ChatGPT. |
| | chatgpt-shell-prompt-compose-previous-block | Jump to and select previous code block. |
| | chatgpt-shell-restore-session-from-transcript | Restore session from file transcript (or HISTORY). |
| | chatgpt-shell-prompt-compose-next-interaction | Show next interaction (request / response). |
| C-c C-p | chatgpt-shell-previous-item | Go to previous item. |
| | chatgpt-shell-fix-error-at-point | Fixes flymake error at point. |
| | chatgpt-shell-prompt-appending-kill-ring | Make a ChatGPT request from the minibuffer appending kill ring. |
| | chatgpt-shell-ollama-load-models | Query ollama for the locally installed models and add them to |
| C-<down> or M-n | chatgpt-shell-next-input | Cycle forwards through input history. |
| | chatgpt-shell-prompt-compose-view-mode | Like `view-mode`, but extended for ChatGPT Compose. |
| | chatgpt-shell-clear-buffer | Clear the current shell buffer. |
| C-c C-n | chatgpt-shell-next-item | Go to next item. |
| | chatgpt-shell-prompt-compose-send-buffer | Send compose buffer content to shell for processing. |
| C-c C-e | chatgpt-shell-prompt-compose | Compose and send prompt from a dedicated buffer. |
| | chatgpt-shell-rename-buffer | Rename current shell buffer. |
| | chatgpt-shell-remove-block-overlays | Remove block overlays. Handy for renaming blocks. |
| | chatgpt-shell-send-region | Send region to ChatGPT. |
| | chatgpt-shell-send-and-review-region | Send region to ChatGPT, review before submitting. |
| C-M-h | chatgpt-shell-mark-at-point-dwim | Mark source block if at point. Mark all output otherwise. |
| | chatgpt-shell--pretty-smerge-mode | Minor mode to display overlays for conflict markers. |
| | chatgpt-shell-mark-block | Mark current block in compose buffer. |
| | chatgpt-shell-prompt-compose-reply | Reply as a follow-up and compose another query. |
| | chatgpt-shell-set-as-primary-shell | Set as primary shell when there are multiple sessions. |
| | chatgpt-shell-rename-block-at-point | Rename block at point (perhaps a different language). |
| | chatgpt-shell-quick-insert | Request from minibuffer and insert response into current buffer. |
| | chatgpt-shell-reload-default-models | Reload all available models. |
| S-<return> | chatgpt-shell-newline | Insert a newline, and move to left margin of the new line. |
| | chatgpt-shell-generate-unit-test | Generate unit-test for the code from region using ChatGPT. |
| | chatgpt-shell-prompt-compose-next-history | Insert next prompt from history into compose buffer. |
| C-c C-c | chatgpt-shell-ctrl-c-ctrl-c | If point in source block, execute it. Otherwise interrupt. |
| | chatgpt-shell-eshell-summarize-last-command-output | Ask ChatGPT to summarize the last command output. |
| M-r | chatgpt-shell-search-history | Search previous input history. |
| | chatgpt-shell-mode | Major mode for ChatGPT shell. |
| | chatgpt-shell-prompt-compose-mode | Major mode for composing ChatGPT prompts from a dedicated buffer. |
| | chatgpt-shell-previous-source-block | Move point to the previous source block’s body. |
| | chatgpt-shell-prompt | Make a ChatGPT request from the minibuffer. |
| | chatgpt-shell-japanese-ocr-lookup | Select a region of the screen to OCR and look up in Japanese. |
| | chatgpt-shell-refactor-code | Refactor code from region using ChatGPT. |
| | chatgpt-shell-japanese-audio-lookup | Transcribe audio at current file (buffer or `dired’) and look up in Japanese. |
| | chatgpt-shell-eshell-whats-wrong-with-last-command | Ask ChatGPT what’s wrong with the last eshell command. |
| | chatgpt-shell-prompt-compose-cancel | Cancel and close compose buffer. |
| | chatgpt-shell-prompt-compose-retry | Retry sending request to shell. |
| | chatgpt-shell-version | Show `chatgpt-shell’ mode version. |
| | chatgpt-shell-prompt-compose-previous-interaction | Show previous interaction (request / response). |
| | chatgpt-shell-interrupt | Interrupt `chatgpt-shell’ from any buffer. |
| | chatgpt-shell-view-at-point | View prompt and output at point in a separate buffer. |
Browse all available commands via M-x.
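If you invoke some of these commands often from other buffers, here is a sketch of global bindings (the keys and command choices are just examples):

(global-set-key (kbd "C-c s") #'chatgpt-shell)
(global-set-key (kbd "C-c i") #'chatgpt-shell-quick-insert)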
- Please go through this README to see if the feature is already supported.
- Need custom behaviour? Check out existing issues/feature requests. You may find solutions in discussions.
Pull requests are super welcome. Please reach out before getting started to make sure we’re not duplicating effort. Also search existing discussions.
Please share the entire snippet you’ve used to set chatgpt-shell up (but redact your key). Share any errors you encountered. Read on for sharing additional details.
Please enable M-x toggle-debug-on-error, reproduce the error, and share the stack trace.
Please enable logging with (setq chatgpt-shell-logging t) and share the content of the *chatgpt-log* buffer in the bug report.
Please also share the entire org snippet.
👉 Find my work useful? Support this work via GitHub Sponsors or buy my iOS apps.
- Blog (xenodium.com)
- Blog (lmno.lol/alvaro)
- Plain Org (iOS)
- Flat Habits (iOS)
- Scratch (iOS)
- macosrec (macOS)
- Fresh Eyes (macOS)
- dwim-shell-command (Emacs)
- company-org-block (Emacs)
- org-block-capf (Emacs)
- ob-swiftui (Emacs)
- chatgpt-shell (Emacs)
- ready-player (Emacs)
- sqlite-mode-extras
- ob-chatgpt-shell (Emacs)
- dall-e-shell (Emacs)
- ob-dall-e-shell (Emacs)
- shell-maker (Emacs)