Integrating AutoGen with Ollama (on a college cluster) to create AI agents #4163
Unanswered
Rushikesh67 asked this question in Q&A
I plan to create AI agents with AutoGen using the Ollama platform, specifically the llama3.1:70b model. However, Ollama is hosted on my college's computer cluster, not on my local machine. I can access the Llama models through a URL endpoint (something like https://xyz.com/ollama/api/chat) and an API key provided by the college. Although Ollama exposes an OpenAI-compatible API, most AutoGen integration examples assume Ollama is running locally, which I can't do. Is there any way to integrate AutoGen with Ollama using my college's URL endpoint and API key?

Replies: 1 comment

Maybe you can take a look at the current ongoing PR: #4141
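In the meantime, because Ollama exposes an OpenAI-compatible API, a remote endpoint can usually be wired into AutoGen by pointing a standard `config_list` entry at it (this uses the 0.2-style `pyautogen` agents, not the client from the PR above). A minimal sketch, assuming the cluster exposes Ollama's usual `/v1` OpenAI-compatible path; the base URL and API key below are placeholders taken from the question:

```python
import autogen

# Assumption: the college gateway exposes Ollama's OpenAI-compatible API
# under /v1 (the question's https://xyz.com/ollama/api/chat is Ollama's
# native chat route); ask the cluster admins for the exact base URL.
config_list = [
    {
        "model": "llama3.1:70b",                  # model tag as served by the cluster
        "base_url": "https://xyz.com/ollama/v1",  # placeholder OpenAI-compatible endpoint
        "api_key": "YOUR_COLLEGE_API_KEY",        # key issued by the college
    }
]

assistant = autogen.AssistantAgent(
    name="assistant",
    llm_config={"config_list": config_list},
)

user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    human_input_mode="NEVER",
    code_execution_config=False,
)

# Quick smoke test against the remote model.
user_proxy.initiate_chat(assistant, message="Reply with one short sentence.")
```

Because the model name is not in AutoGen's OpenAI cost map, you may see a harmless cost-tracking warning; the requests themselves are plain OpenAI-style chat completions sent to the cluster's endpoint.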