Trying to get Graphrag running with locally hosted models from LM-Studio/Ollama but getting "invalid_api_key" error #1354
Unanswered · Branchenprimus asked this question in Q&A
Replies: 1 comment
-
I am stuck here too. It seems GraphRAG does not replace the base_url with the one from settings.yaml.
-
Hi, I have been trying this for a couple of days now and I don't seem to be making progress. I tried different models and different local hosting platforms, such as Ollama and LM-Studio, to serve the LLM and embedding models. I understand that an API key is needed when running OpenAI models over the web, but for local hosting it should not matter, so I tried different dummy values such as GRAPHRAG_API_KEY=EMPTY or GRAPHRAG_API_KEY=1234. I still get the error:
"Error code: 401 - {'error': {'message': 'Incorrect API key provided: EMPTY. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}"
Attached is my yaml file, uploaded as .txt.
settings.yaml.txt
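For anyone hitting the same 401: the error message points at platform.openai.com, which suggests the request is still going to the OpenAI API rather than the local server, i.e. api_base is missing or not being picked up from settings.yaml. Below is a minimal sketch of the relevant sections, assuming an Ollama server exposing its OpenAI-compatible endpoint on the default port 11434; the model names are placeholders, and key names may differ between GraphRAG versions, so check against your installed version's config reference:

```yaml
llm:
  api_key: ${GRAPHRAG_API_KEY}          # any dummy value works for a local server
  type: openai_chat
  model: llama3                          # hypothetical; use the model you actually pulled
  api_base: http://localhost:11434/v1    # Ollama's OpenAI-compatible endpoint

embeddings:
  llm:
    api_key: ${GRAPHRAG_API_KEY}
    type: openai_embedding
    model: nomic-embed-text              # hypothetical embedding model
    api_base: http://localhost:11434/v1
```

If the 401 still references OpenAI's key page after this change, the request never reached the local server, which points to api_base being ignored (the issue the reply above describes) rather than the key value being wrong.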