
ai-org-chat.el: Threaded chat with AI agent

Overview

This package supports a threaded AI chat inside any org-mode buffer. Here’s what it looks like:

./img/fruits.png

The design is inspired by the ChatGPT web interface. There are many AI chat packages for Emacs, but I am not aware of any that naturally support multiple conversational pathways. (Update: gptel now has a similar feature, which I haven’t yet tried; it depends upon a pre-release version of Org.) I’m also using this package as an experiment in ways to tighten the editing loop with AI in Emacs, via context, diffs, etc.

The package comes with one main function, ai-org-chat-respond, that operates in any org-mode buffer. It extracts a conversation history from the parent entries, treating an entry as belonging to the AI if its heading is “AI” and otherwise to the user. This conversation history is passed along to the underlying LLM library (gptel or llm) to generate a response.
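
For example, in a hypothetical thread like the following (message bodies shortened), invoking ai-org-chat-respond with point in the “Cherries” entry sends the top-level entry, the “AI” reply, and the “Cherries” entry itself as the conversation history; the sibling “Bananas” branch is ignored, since only the ancestors of the entry at point are consulted:

* Apples
Tell me about apples.
** AI
Apples are fruits of the genus Malus...
*** Bananas
What about bananas?
*** Cherries
What about cherries?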

Narrowing provides a simple way to truncate the conversation history if it becomes too long – only the restricted part of the buffer is considered.

The PROPERTIES drawers are used to provide

  • context, which augments the system message with the contents of certain buffers or files, and
  • tools, which allow the AI to operate directly on the environment.

There is support for quickly launching Ediff sessions that compare modifications suggested by the AI to their sources, making it easier to review and apply changes (see Compare Feature).

Configuration

Download this repository, install using M-x package-install-file (or package-vc-install, straight, elpaca, …), and add (use-package ai-org-chat) to your init file. For a more comprehensive setup, use something like the following:

(use-package ai-org-chat
  :bind
  (:map global-map
        ("C-c /" . ai-org-chat-new))
  (:map ai-org-chat-minor-mode-map
        ("C-c <return>" . ai-org-chat-respond)
        ("C-c n" . ai-org-chat-branch)
        ("C-c e" . ai-org-chat-compare))
  :custom
  (ai-org-chat-user-name "Paul")
  (ai-org-chat-dir "~/gpt")
  (ai-org-chat-context-style nil)
  :config
  ;; See below
  ;; (ai-org-chat-select-model "sonnet 3.5")
  )

To use the package, you’ll need to set up either the gptel or llm library. By default, ai-org-chat works with gptel, so if you have that configured already, you’re all set. However, features related to “tools” or “function calls” are currently only supported by the llm library, making it the recommended option for full functionality.

If you choose to use llm, you’ll need to configure the ai-org-chat-provider variable with a valid provider as defined by the llm library. Here’s how to do that:

  1. Customize the ai-org-chat-models user option to include the models you want to use and the environment variables containing your API keys.
  2. Use the ai-org-chat-select-model command to choose your preferred model. You can do this by uncommenting and adjusting the line at the bottom of the above use-package declaration.
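
Alternatively, you can set ai-org-chat-provider directly in your init file. Here is a minimal sketch assuming the llm-openai backend, with the API key read from the OPENAI_API_KEY environment variable; the model name is only an illustration:

(require 'llm-openai)
(setq ai-org-chat-provider
      (make-llm-openai :key (getenv "OPENAI_API_KEY")
                       :chat-model "gpt-4o"))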

As a final tip, the following makes environment variables available in Emacs on macOS:

(use-package exec-path-from-shell
  :ensure
  :init
  (exec-path-from-shell-initialize))

Usage

When you want to ask the AI something, do M-x ai-org-chat-new (or C-c /, if you followed the above configuration). This visits a new file in the specified directory (“~/gpt” by default). If the region was active, then it will be quoted in the new buffer. With a prefix argument (C-u), it will immediately add visible buffers as context to the new chat. Example:

./img/animated.gif

The org-mode buffer has ai-org-chat-minor-mode activated, whose only purpose is to support user-defined keybindings such as those in the above use-package declaration. If you want to work in some other org file, you can either activate this minor mode manually or do M-x ai-org-chat-setup-buffer.

We provide the following commands:

ai-org-chat-respond (C-c <return>)
This is the main function, which tells the AI to generate a new response to the conversation node at point. It works in any org-mode buffer, not just ones created via ai-org-chat-new.
ai-org-chat-branch (C-c n)
This is a convenience function that creates a new conversation branch at point.
ai-org-chat-compare (C-c e)
Launches an Ediff session to compare the org-mode block at point with the contents of another visible buffer. This helps you review and apply AI-suggested changes to your codebase. See Compare Feature for more details.
ai-org-chat-convert-markdown-blocks-to-org
LLMs often return code in markdown format (even when you instruct them otherwise). This function converts all markdown code blocks between (point) and (point-max) to org-mode code blocks.
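
The sample configuration above binds the first three of these commands. If you use ai-org-chat-convert-markdown-blocks-to-org often, you might give it a binding of its own in the minor mode map; the key chosen here is just a suggestion:

(define-key ai-org-chat-minor-mode-map (kbd "C-c m")
            #'ai-org-chat-convert-markdown-blocks-to-org)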

Context and Tools

ai-org-chat uses PROPERTIES drawers to manage context and tools for the AI conversation. These can be set at the top level of the file or in individual nodes.

Context

Context is managed through the CONTEXT property. This property can contain a list of items that provide additional information to the AI. These items can be:

  1. Buffer names
  2. File names as absolute paths, paths relative to the current directory, or paths relative to any subdirectory of the current Emacs project, searched in this order
  3. Elisp function names (functions that return strings to be included in the context)

Example:

:PROPERTIES:
:CONTEXT: buffer-name.txt project-file.el my-context-function
:END:
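
An Elisp context item such as my-context-function above is simply a function that returns a string; whatever it returns is included in the context supplied to the AI. A hypothetical sketch:

(defun my-context-function ()
  "Return extra context for the AI as a string."
  (format "Current directory: %s\nToday's date: %s"
          default-directory
          (format-time-string "%Y-%m-%d")))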

The ai-org-chat-context-style variable determines how visible buffer contents are included in the context. It can be set to nil, visible-contents, or visible-buffers.
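
For example, assuming these values are symbols (as in the :custom block of the sample configuration), you could enable the visible-buffers style globally with:

(setopt ai-org-chat-context-style 'visible-buffers)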

Tools

Tools (or “function calls”) are specified using the TOOLS property. This property should contain a list of llm function symbols that the AI can use. This feature works only if the variable ai-org-chat-provider is set to a provider for the llm package that supports tools.

Example:

:PROPERTIES:
:TOOLS: my-tool-function another-tool-function
:END:

Helper Commands

While you can directly edit PROPERTIES drawers using Org mode’s built-in commands (e.g., C-c C-x p for org-set-property), ai-org-chat provides some helper commands for managing context:

  • ai-org-chat-set-context-style: Set the ai-org-chat-context-style as a file-local variable.
  • ai-org-chat-add-buffer-context: Add selected buffers as context.
  • ai-org-chat-add-visible-buffers-context: Add all visible buffers as context.
  • ai-org-chat-add-file-context: Add selected files as context.
  • ai-org-chat-add-project-files-context: Add all files from a selected project as context.

These commands are designed to simplify context/tool management, but are not required for using the package.

Compare Feature

The “compare” feature streamlines the process of reviewing and applying code changes suggested by the AI, as follows.

  1. Narrow the buffer containing your original code to the function or section of interest.
  2. In the AI chat buffer, place your cursor on the AI-suggested code block.
  3. Execute the command ai-org-chat-compare (bound to C-c e by default).
  4. If you have multiple visible windows, you’ll be prompted to select the window containing the original code using ace-window.
  5. An Ediff session will launch in a new tab, comparing the AI-suggested code with your original code.

The Ediff tab is cleaned up automatically when you’re done, keeping your workspace tidy.

Workflow for revising part of a buffer

There are many ways to do this. Here’s one typical workflow:

  1. Create a buffer containing the region that you want to modify, either immediately via C-x n n (narrow-to-region) or after first doing M-x clone-indirect-buffer (which I bind to C-x c).
  2. C-x t 2 (tab-bar-new-tab) to create a new tab containing just the buffer containing the region of interest, then C-u C-c / (or C-u M-x ai-org-chat-new) to launch a chat session with the buffer of interest as context.
  3. Ask the LLM to revise it, requesting the response in a source block (a good system message, or the function ai-org-chat-convert-markdown-blocks-to-org, may come in handy here).
  4. When you receive a response in a source block, use C-c e (ai-org-chat-compare) to inspect what was changed, and standard Ediff commands to apply parts of that change.
  5. Iterate until you’re happy with the changes.

The point is that this workflow is very flexible and leverages built-in Emacs features such as narrowing and Ediff.

Shortcut for individual functions

As a shortcut, the first step (narrowing the buffer) is not necessary if the code block consists of a single function – in that case, narrowing should be taken care of automatically provided that the relevant buffer is either visible or appears in the context.
