
Add support for gemini #525

Open · wants to merge 3 commits into main

Conversation

papayalabs

This is where I got stuck, @krschacht. I need to change app/jobs/get_next_ai_message_job.rb for Gemini to work, but that wasn't the idea. Sorry, I don't know how to go forward from this point. System instructions do not work with the gemini-ai gem (I think it's a bug).

@krschacht
Contributor

Hi @papayalabs, it’s totally fine to make changes to get_next_ai_message_job.rb if you needed to in order to make this work. What’s the specific issue you ran into? Are you saying that the issue you can’t resolve is getting the system message to work?

@krschacht
Contributor

Regarding system instructions, I looked at the gem docs and it appears they think it supports it:
https://github.com/gbaptista/gemini-ai?tab=readme-ov-file#system-instructions

If you try this example in the console, do system instructions work there? I’m curious whether the issue with system instructions is within the gem itself or an issue getting it to work within this HostedGPT app.

@papayalabs
Author

> Regarding system instructions, I looked at the gem docs and it appears they think it supports it: https://github.com/gbaptista/gemini-ai?tab=readme-ov-file#system-instructions
>
> If you try this example in the console, do system instructions work there? I’m curious whether the issue with system instructions is within the gem itself or an issue getting it to work within this HostedGPT app.

I have tried it in the console and always got this error: `on_complete': the server responded with status 400 (Faraday::BadRequestError). I think it is a problem within the gem, but I did not dig more deeply into it.
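For reference, a minimal sketch of the request shape the gem's README describes for system instructions (the model name, the `gemini_payload` helper, and the example strings here are assumptions for illustration, not code from this PR):

```ruby
# Builds the hash that would be passed to the gemini-ai gem's
# generate_content, following the shape shown in the gem's README.
# Field names mirror the Gemini REST API (system_instruction / contents).
def gemini_payload(system_text, user_text)
  {
    system_instruction: { role: 'user', parts: { text: system_text } },
    contents: [{ role: 'user', parts: [{ text: user_text }] }]
  }
end

payload = gemini_payload('You are a cat named Neko.', 'Hello!')

# With the gem this would be sent roughly as:
#   client = Gemini.new(
#     credentials: { service: 'generative-language-api',
#                    api_key: ENV['GOOGLE_API_KEY'] },
#     options: { model: 'gemini-1.5-flash', server_sent_events: false }
#   )
#   client.generate_content(payload)  # this call is what raised the 400 here
```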

@krschacht
Contributor

@papayalabs I spent a little while on this system instructions issue today and I couldn’t figure it out. I validated that the docs also confirm system instructions are supported, although they say it’s “in beta” and available “on some models”. To confirm whether it’s an issue with the gem, I got the API working through curl: I can get normal replies back, but I get the same 400 error when I include a system message through curl. I believe the root cause is that we’re using a model that doesn’t support it, or we’re using an API authentication method that doesn’t support it. I was accessing the API through a key which I got through AI Studio. I thought maybe the key must be obtained through Vertex AI instead, but after about 20 minutes of digging I couldn’t figure out how to get a key through there. So anyway, I didn’t solve it, but I wanted to capture what I learned since there may be clues.
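A Ruby equivalent of that curl test, for anyone retracing it. One thing worth checking (an assumption, not something confirmed in this thread): Google's public REST docs list `systemInstruction` under the `v1beta` surface, so a request against `v1` is one plausible way to get a 400. The endpoint, model name, and key variable below are assumptions; the actual request line is commented out so this only builds the payload:

```ruby
require 'json'
require 'net/http'
require 'uri'

# Hypothetical raw REST reproduction; key from AI Studio assumed in
# GOOGLE_API_KEY. Note the v1beta path segment.
uri = URI('https://generativelanguage.googleapis.com/v1beta/models/' \
          "gemini-1.5-flash:generateContent?key=#{ENV['GOOGLE_API_KEY']}")

body = {
  systemInstruction: { parts: [{ text: 'You are a cat named Neko.' }] },
  contents: [{ role: 'user', parts: [{ text: 'Hello!' }] }]
}

request = Net::HTTP::Post.new(uri, 'Content-Type' => 'application/json')
request.body = JSON.generate(body)

# To actually send it (this is where the 400 surfaced):
# response = Net::HTTP.start(uri.hostname, uri.port, use_ssl: true) do |http|
#   http.request(request)
# end
```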

@papayalabs
Author

@krschacht hey! When I first tested through an API key I have, I also checked all the models in the console (including the ones that are supposed to support it) and got the same result (error 400). When I have more time I'll dig a little deeper, but I haven't checked it recently.

@papayalabs
Author

@krschacht I also fixed the bug. Now the code as-is works fine without system instructions. I need to add some tests and check for any functionality that may be left.
