Replies: 2 comments 16 replies
-
It only needs an OpenAI API compatible endpoint to work. Ollama already provides one, so you can use it directly (see here). For everything else you can use the litellm proxy, which supports all LLMs and provides an OpenAI API compatible endpoint for Anthropic. So you would just set that up and point optillm to it. I will add a note in the readme about it.
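For example, the full chain could look like this (a minimal sketch only; the ports, model name, and the exact way optillm is pointed at an upstream base URL are assumptions, so check the litellm and optillm READMEs for your setup):

```python
# Sketch of the chain: OpenAI client -> optillm -> litellm -> Anthropic.
# Assumed setup (flags and ports may differ, check the READMEs):
#   1. litellm --model claude-3-5-sonnet-20240620   # OpenAI-compatible proxy, assumed port 4000
#   2. start optillm with its upstream base URL set to http://localhost:4000/v1
from openai import OpenAI

client = OpenAI(
    api_key="anything",                    # real auth happens in the proxies
    base_url="http://localhost:8000/v1",   # optillm's endpoint (assumed default port)
)

response = client.chat.completions.create(
    model="claude-3-5-sonnet-20240620",    # whatever model litellm exposes
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```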
-
I started the proxy successfully with:
having:
and having the client:
but I still get 400 and 500 errors; it calls the OpenAI API instead of OpenRouter's:
Any help is appreciated :)
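In case it is useful to later readers: errors like this usually mean one of the two hops is still pointing at the default OpenAI URL. A rough sketch of the client side (the port and model id are assumptions; the proxy itself must also be started with OpenRouter's https://openrouter.ai/api/v1 as its upstream):

```python
# Sketch: the client must target the local proxy, not api.openai.com.
from openai import OpenAI

client = OpenAI(
    api_key="sk-or-...",                   # OpenRouter key, passed through by the proxy
    base_url="http://localhost:8000/v1",   # the local optillm proxy, NOT the default OpenAI URL
)

resp = client.chat.completions.create(
    model="anthropic/claude-3.5-sonnet",   # an OpenRouter model id
    messages=[{"role": "user", "content": "ping"}],
)
print(resp.choices[0].message.content)
```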
-
Does it work "only" with OpenAI LLMs, or also with others like Sonnet 3.5 or any from Ollama?
If so, how? Please add the info to the docs / readme.
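(For anyone landing here: per the reply above, it works with any OpenAI API compatible endpoint. A minimal sketch of the Ollama case, assuming a local Ollama install; the model name is whatever you have pulled:)

```python
# Ollama exposes an OpenAI-compatible endpoint at http://localhost:11434/v1.
from openai import OpenAI

client = OpenAI(
    api_key="ollama",                       # Ollama ignores the key but requires a non-empty one
    base_url="http://localhost:11434/v1",
)

resp = client.chat.completions.create(
    model="llama3",                         # any model you have pulled with `ollama pull`
    messages=[{"role": "user", "content": "Hello from an OpenAI-compatible client"}],
)
print(resp.choices[0].message.content)
```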