le-gpt.el

le-gpt.el is a comprehensive Emacs package for interacting with large language models such as GPT-4 and Claude 3.5 Sonnet. It is a feature-rich fork of gpt.el that adds project awareness, completion at point, region transformation, and more.

The goal is to keep Emacs current with modern GPT support, essentially providing a CursorAI-like experience for Emacs.

Features

  • Chat Interface: Create and manage multiple chat sessions with GPT. Use M-x le-gpt-chat to start a session. Key bindings in chat buffers include:

    • C-c C-c: Send follow-up command
    • C-c C-p: Toggle prefix visibility
    • C-c C-b: Copy code block at point
    • C-c C-t: Generate a descriptive buffer name from the chat's content
  • Buffer List: Display a list of all GPT chat buffers with M-x le-gpt-list-buffers, allowing you to manage and navigate your chat buffers efficiently.

  • Completion at Point: Let GPT complete what you're currently writing. Use M-x le-gpt-complete-at-point to get suggestions based on your current cursor position.

  • Region Transformation: Select a region you want GPT to transform. Use M-x le-gpt-transform-region to transform the selected region using GPT.

  • Project Context: Select files from your project for GPT to use as context. Select global context files via M-x le-gpt-select-project-files, or provide local, per-command context by running any of the above commands with a prefix argument (C-u). Context is used by chat, completion, and region transforms. To deselect individual global context files, use M-x le-gpt-deselect-project-files; to clear the entire selection, use M-x le-gpt-clear-selected-context-files.

Mandatory GIFs

  • Chat Interface: le-gpt-chat demo
  • Completion at Point: le-gpt-complete-at-point demo
  • Project Context: le-gpt-with-context demo
  • Region Transformation: le-gpt-transform-region demo

Installation

Prerequisites

You'll need Python packages for the API clients:

pip install openai anthropic jsonlines

You don't need all of them; at minimum, install openai or anthropic.

You'll also need API keys from OpenAI and/or Anthropic.

You'll also need markdown-mode for displaying the chat conversations nicely.
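If you don't already have markdown-mode, it is available from MELPA; with straight.el, for example, it can be pulled in like this:

```elisp
;; Install markdown-mode (used to render chat conversations)
(use-package markdown-mode
  :straight t)
```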

Using straight with use-package

(use-package le-gpt
  :straight (le-gpt :type git
                    :host github
                    :files (:defaults "le-gpt.py")
                    :repo "AnselmC/le-gpt.el")
  :bind (("M-C-g" . le-gpt-chat)
         ("M-C-n" . le-gpt-complete-at-point)
         ("M-C-t" . le-gpt-transform-region)
         ("M-C-s" . le-gpt-select-project-files)
         ("M-C-d" . le-gpt-deselect-project-files))
  :config
  ;; you need to set at least one of the following
  (setq le-gpt-openai-key "your-openai-key-here")
  (setq le-gpt-anthropic-key "your-anthropic-key-here"))
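To avoid hardcoding keys in your init file, you can load them from Emacs's built-in auth-source library instead. This sketch assumes you have an entry for api.openai.com in ~/.authinfo.gpg (or another auth-source backend):

```elisp
;; Load the OpenAI key from auth-source instead of hardcoding it.
;; Assumes a line like the following in ~/.authinfo.gpg:
;;   machine api.openai.com login apikey password sk-...
(setq le-gpt-openai-key
      (auth-source-pick-first-password :host "api.openai.com"))
```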

Configuration

See all available customizations via M-x customize-group RET le-gpt.

Basic configuration:

;; API Keys
(setq le-gpt-openai-key "sk-...")
(setq le-gpt-anthropic-key "sk-ant-...")

;; Model Parameters (optional)
(setq le-gpt-model "gpt-4o")
(setq le-gpt-max-tokens 2000)
(setq le-gpt-temperature 0)

;; API Selection (default is 'openai)
(setq le-gpt-api-type 'anthropic)
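For example, to switch the backend to Claude, you would combine these variables along the following lines (the model name here is an assumption; check M-x customize-group RET le-gpt for supported values):

```elisp
;; Use the Anthropic backend; the model name is illustrative only.
(setq le-gpt-api-type 'anthropic
      le-gpt-anthropic-key "sk-ant-..."
      le-gpt-model "claude-3-5-sonnet-latest")
```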

Usage

Chat Interface

Start a chat session:

M-x le-gpt-chat

If you provide a prefix argument, you can select context files for a single query.

Completion at Point

Get completions based on your current cursor position:

M-x le-gpt-complete-at-point

Project Context

Set project files as context:

M-x le-gpt-select-project-files

The context will be used by chat, completion, and region transforms.

Note that the selected files persist across calls.

To deselect files:

M-x le-gpt-deselect-project-files

Or, to clear the entire selection:

M-x le-gpt-clear-selected-context-files

Region Transformation

Transform the selected region via:

M-x le-gpt-transform-region

Buffer List

Display a list of all GPT buffers:

M-x le-gpt-list-buffers

Contributing

Contributions are welcome! Please feel free to submit issues and pull requests on GitHub.

Feature roadmap

  • Add package to Melpa
  • Add testing with buttercup
  • Add testing with pytest
  • More models, e.g., groq
  • Create GitHub actions
  • Ability to generate images
  • Add all files of the current project as context (?)
  • Ability to let GPT decide which context files it needs

License

le-gpt.el is licensed under the MIT License. See LICENSE for details.
