Add custom prompt to default one #96

Open
germa89 opened this issue Sep 20, 2023 · 7 comments
Labels
enhancement New features or code improvements

Comments


germa89 commented Sep 20, 2023

📝 Description of the feature

Each project is different, and sometimes the repository maintainers can provide helpful advice to the bot. The only way I know to do this is to modify the prompt here:

def generate_suggestions_with_source(

I'm no expert whatsoever in bots, but I presume we could have the bot read a GitHub repository variable (or secret) whose text could be appended to that prompt.

💡 Steps for implementing the feature

Assuming that the bot runs like a normal GitHub Actions job, it should have access to `${{ secrets.SuperSecret }}` or `${{ env.DAY_OF_WEEK }}`.
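As a sketch of that wiring (the step name, input name, and secret name below are all hypothetical, not existing inputs of this action):

```yaml
# Hypothetical workflow step: expose a repository secret to the review bot
# as an action input, so its text can be appended to the default prompt.
- name: Run review bot
  uses: ./  # placeholder for this review-bot action
  with:
    extra-prompt: ${{ secrets.REVIEW_BOT_EXTRA_PROMPT }}  # hypothetical input and secret
```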

🔗 Useful links and references

No response

@germa89 germa89 added the enhancement New features or code improvements label Sep 20, 2023
@RobPasMue (Member)

Interesting. @AlejandroFernandezLuces what's your take?

I think they should append it to whatever we decide as the common prompt. The system prompt should not be modified.

RobPasMue commented Sep 20, 2023

Also, we might have to do two implementations: one here to accept inputs to the prompt, and another one in our private service bot to pass in the secret.

germa89 commented Sep 20, 2023

> I think they should append it to whatever we decide as the common prompt. The system prompt should not be modified.

Definitely append.

> Also, we might have to do 2 implementations. One here to accept inputs to the prompt, and another one in our private service bot to pass in the secret.

Not sure what you mean... but I trust you 🙃

@AlejandroFernandezLuces (Contributor)

The system prompt is curated to give precise answers that can be processed by the GitHub bot. We shouldn't mess too much with it, or it won't give the answers in the proper format.

What we can do is add another step in the conversation to inject any specific context you want into the prompt:

```python
[
    {
        "role": "system",
        "content": """You are a GitHub review bot. ...""",
    },
    {"role": "user", "content": file_src},
    {"role": "assistant", "content": "Can you provide any additional context?"},
    {"role": "user", "content": github_context},  # maintainer-provided context
    {"role": "assistant", "content": "Ready for the patch."},
    {
        "role": "user",
        "content": f"{patch}\n\nReview the above code ...",
    },
]
```
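A minimal Python sketch of how this optional extra step might be assembled; the `EXTRA_PROMPT_CONTEXT` environment variable name and the `build_messages` helper are hypothetical, assuming the repository secret or variable is surfaced to the job as an environment variable:

```python
import os


def build_messages(file_src: str, patch: str) -> list[dict]:
    """Build the chat messages, optionally appending maintainer-supplied context."""
    messages = [
        {"role": "system", "content": "You are a GitHub review bot. ..."},
        {"role": "user", "content": file_src},
    ]
    # Hypothetical: maintainers expose extra guidance via a repo secret/variable,
    # which the workflow surfaces to this job as an environment variable.
    extra = os.environ.get("EXTRA_PROMPT_CONTEXT", "").strip()
    if extra:
        messages.append(
            {"role": "assistant", "content": "Can you provide any additional context?"}
        )
        messages.append({"role": "user", "content": extra})
    messages.append({"role": "assistant", "content": "Ready for the patch."})
    messages.append(
        {"role": "user", "content": f"{patch}\n\nReview the above code ..."}
    )
    return messages
```

If the variable is unset or empty, the extra conversation step is simply skipped and the default prompt is sent unchanged.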

I'm all for giving the users control, but be wary that changes to the prompt can be very sensitive. They might completely break the responses, and maybe even the bot, if it's not prepared to receive certain formats.


germa89 commented Sep 20, 2023

Do you solve everything in one call to OpenAI? I would expect one call per "type": one for changes, one for comments, etc. That way the bot does not get too confused?

I saw the `COMMENT` ... at the beginning of some comments.

I'm all for giving the users control, but be wary that changes in the prompt can be very sensitive.

I totally agree. That is why this appending is optional. If you are not sure what you are doing, you probably won't change it.

@AlejandroFernandezLuces (Contributor)

> Do you solve everything in one call to OpenAI? I would expect one call per "type": one for changes, one for comments, etc. That way the bot does not get too confused?
>
> I saw the `COMMENT` ... at the beginning of some comments.
>
> > I'm all for giving the users control, but be wary that changes to the prompt can be very sensitive.
>
> I totally agree. That is why this appending is optional. If you are not sure what you are doing, you probably won't change it.

Yes, we only make one call to OpenAI. I think it's better for the model to do it this way, to avoid parsing issues with the response in later parts of the code.


germa89 commented Sep 26, 2023

I also think this could be useful to tailor the prompt to reviewing examples, for instance, since we already have a template for that:

https://github.com/ansys/pymapdl/blob/main/.github/ISSUE_TEMPLATE/examples.yml
