le-gpt.el is a comprehensive Emacs package for interacting with large language models like GPT-4 and Claude 3.5 Sonnet. It's a feature-rich fork of gpt.el that adds project awareness, completion, region transform, and more to come.
The aim is to keep Emacs up to date with modern GPT support, essentially working toward a CursorAI for Emacs.
- Chat Interface: Create and manage multiple chat sessions with GPT. Use M-x le-gpt-chat to start a session. Key bindings in chat buffers include:
  - C-c C-c: Send follow-up command
  - C-c C-p: Toggle prefix visibility
  - C-c C-b: Copy code block at point
  - C-c C-t: Generate descriptive buffer name from its content
- Buffer List: Display a list of all GPT chat buffers with M-x le-gpt-list-buffers. This feature allows you to manage and navigate through your GPT-related buffers efficiently.
- Completion at Point: Let GPT complete what you're currently writing. Use M-x le-gpt-complete-at-point to get suggestions based on your current cursor position.
- Region Transformation: Select a region you want GPT to transform. Use M-x le-gpt-transform-region to transform the selected region using GPT.
- Project Context: Select files from your project that GPT should use as context. Globally select project files to be used as context via M-x le-gpt-select-project-files, or select local, per-command context by running the above commands with a prefix argument (C-u). Context is used by chat, completion, and region transforms. To deselect global context files, use M-x le-gpt-deselect-project-files, or use M-x le-gpt-clear-selected-context-files to clear the entire selection.
[Demos in the repository: Chat Interface, Completion at point, Project Context, Region Transformation]
You'll need Python packages for the API clients:
pip install openai anthropic jsonlines
You don't need to install all of them; at a minimum you need either openai or anthropic.
You'll also need API keys from OpenAI and/or Anthropic.
You'll also need markdown-mode for displaying the chat conversations nicely.
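If markdown-mode isn't already part of your setup, a minimal sketch for installing it (assuming you use use-package with package.el and MELPA configured) looks like this:
(use-package markdown-mode
  :ensure t)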
(use-package le-gpt
:straight (le-gpt :type git
:host github
:files (:defaults "le-gpt.py")
:repo "AnselmC/le-gpt.el")
:bind (("M-C-g" . le-gpt-chat)
("M-C-n" . le-gpt-complete-at-point)
("M-C-t" . le-gpt-transform-region)
("M-C-s" . le-gpt-select-project-files)
("M-C-d" . le-gpt-deselect-project-files))
:config
;; you need to set at least one of the following
(setq le-gpt-openai-key "your-openai-key-here")
(setq le-gpt-anthropic-key "your-anthropic-key-here"))
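If you don't use straight.el, one possible alternative is Emacs 29's built-in package-vc. This is only a sketch; it assumes the default branch of the repository installs cleanly (the package isn't on MELPA yet, per the roadmap below):
;; Fetch and build the package directly from GitHub (Emacs 29+).
(unless (package-installed-p 'le-gpt)
  (package-vc-install "https://github.com/AnselmC/le-gpt.el"))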
See all available customizations via M-x customize-group RET le-gpt.
Basic configuration:
;; API Keys
(setq le-gpt-openai-key "sk-...")
(setq le-gpt-anthropic-key "sk-ant-...")
;; Model Parameters (optional)
(setq le-gpt-model "gpt-4o")
(setq le-gpt-max-tokens 2000)
(setq le-gpt-temperature 0)
;; API Selection (default is 'openai)
(setq le-gpt-api-type 'anthropic)
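Hardcoding keys in your init file works but is easy to leak. A sketch for pulling them from your auth-source backend (e.g. ~/.authinfo.gpg) instead; it assumes le-gpt-openai-key and le-gpt-anthropic-key accept plain strings, and the host names below are only illustrative entries:
(require 'auth-source)
;; Look up the API keys from auth-source instead of storing them in
;; plain text. Use whatever host entries you actually keep.
(setq le-gpt-openai-key
      (auth-source-pick-first-password :host "api.openai.com"))
(setq le-gpt-anthropic-key
      (auth-source-pick-first-password :host "api.anthropic.com"))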
Start a chat session:
M-x le-gpt-chat
If you provide a prefix argument, you can select context files for a single query.
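Interactively, that is just C-u M-x le-gpt-chat. From Lisp, a rough equivalent (assuming the command inspects current-prefix-arg, as interactive prefix handling normally does) would be:
;; Simulate C-u before calling the command so it prompts for
;; per-query context files.
(let ((current-prefix-arg '(4)))
  (call-interactively #'le-gpt-chat))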
Get completions based on your current cursor position:
M-x le-gpt-complete-at-point
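If you'd rather have the completion command available only in programming buffers instead of globally, a small sketch using the built-in prog-mode-map (the key choice is arbitrary) could be:
;; Bind GPT completion to C-c TAB in all prog-mode derived buffers.
(define-key prog-mode-map (kbd "C-c TAB") #'le-gpt-complete-at-point)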
Set project files as context:
M-x le-gpt-select-project-files
The context will be used by chat, completion, and region transforms.
Note that the selected files persist across multiple calls.
To deselect files:
M-x le-gpt-deselect-project-files
Or, to clear the entire selection:
M-x le-gpt-clear-selected-context-files
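Because the selection persists, a small convenience command can help when switching tasks. This is a hypothetical helper built only from the commands above; it assumes le-gpt-clear-selected-context-files can be called from Lisp:
(defun my/le-gpt-reset-context ()
  "Clear the selected context files, then pick a fresh set."
  (interactive)
  (le-gpt-clear-selected-context-files)
  (call-interactively #'le-gpt-select-project-files))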
Transform the selected region via:
M-x le-gpt-transform-region
Display a list of all GPT buffers:
M-x le-gpt-list-buffers
Contributions are welcome! Please feel free to submit issues and pull requests on GitHub.
- Add package to MELPA
- Add testing with buttercup
- Add testing with pytest
- More models, e.g., Groq
- Create GitHub Actions
- Ability to generate images
- Add all files of the current project as context (?)
- Ability to let GPT decide which context files it needs
le-gpt.el is licensed under the MIT License. See LICENSE for details.