Request - Your help in improving the default system prompts #80
Closed · olimorris announced in Announcements · 1 comment · 2 replies
-
Hmmmm... do you think it's worth trying to find the best output across the board? From what I've used, at least, it almost always seems beneficial to tweak system prompts based on the specific AI model. I guess we need to start somewhere, but maybe it's worth considering splitting them per model. I thought I'd throw my hat in the ring anyway; I'll spend this weekend refining my Ollama prompts and post them up so people can take a look.
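For illustration, per-model prompts might look something like this (a hypothetical shape only, not taken from the plugin's actual config schema):

```lua
-- Hypothetical sketch: one system prompt per adapter/model family, so each
-- can be worded around how that model best follows instructions. The keys
-- and structure here are illustrative, not copied from config.lua.
local system_prompts = {
  openai = "You are a concise programming assistant. Prefer short, direct answers with code.",
  anthropic = "You are a programming assistant. Answer briefly and show runnable code where useful.",
  ollama = "You are a local coding assistant. Keep answers short; emit code blocks only when asked.",
}
```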
-
Hello All 👋🏼
In my quest to make this plugin yield the best outputs from OpenAI, Anthropic AND Ollama, I'm seeking your feedback on how we can tweak the default `system_prompts` (and the `system` prompts in the `default_prompts` table) in the `config.lua` file.

Ultimately, I'm trying to get to the level of response you may observe in GitHub's Copilot Chat in VS Code: helpful, yet terse and low in token consumption.
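If you want to experiment, here is a minimal sketch of what an override might look like, assuming the usual `setup()` entry point; the key names mirror those mentioned above, but verify the exact schema against `config.lua` before copying:

```lua
-- Minimal sketch, assuming the plugin exposes setup() and that these keys
-- match the `system_prompts` / `default_prompts` tables in config.lua.
require("codecompanion").setup({
  system_prompts = {
    -- Aim for terse, Copilot-Chat-style answers
    chat = "You are an AI programming assistant. Keep answers concise and favour code over prose.",
  },
  default_prompts = {
    -- "Fix Code" is an illustrative prompt name, not necessarily a real default
    ["Fix Code"] = {
      system = "Identify the bug, state the fix in one sentence, then show the corrected code.",
    },
  },
})
```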
As an aside: the changes I've made to the plugin in recent weeks should allow for Gemini integration. It's low on my to-do list, but I'm conscious that they're silently making really interesting improvements to their models.