Using MindIE to deploy a qwen2.5-0.5b model: can't connect to LobeChat, reports a "model not found" error #5482
Unanswered
lingistic123 asked this question in General Question
Replies: 0 comments
I need help from an experienced engineer.

My setup:
1. Run a Docker container with the MindIE service, using host networking, listening on 127.0.0.1 and exposing port 1220.
2. Run a Docker container with the LobeChat service, also using host networking.
3. Enter the OpenAI-format URL in the LobeChat web UI: http://127.0.0.1:1220/v1

But I get a "model not found" error:
{ "error": "model not found", "endpoint": "http://127.0.***.1:****/v1", "provider": "openai" }
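Since both containers use host networking, 127.0.0.1 is shared between them, so basic connectivity is usually not the problem; a "model not found" response from an OpenAI-compatible endpoint most often means the model name configured in the client does not exactly match the ID the backend advertises. A minimal sketch of how to check this, assuming MindIE exposes the standard OpenAI-compatible `/v1/models` route at the URL from the question (the model ID `Qwen2.5-0.5B-Instruct` below is an assumption, not the value your server necessarily reports):

```python
import json
from urllib.request import urlopen

def served_model_ids(models_json: str) -> list[str]:
    """Extract model IDs from an OpenAI-format /v1/models response body."""
    return [m["id"] for m in json.loads(models_json).get("data", [])]

def model_is_served(base_url: str, wanted: str) -> bool:
    """Fetch <base_url>/models and check whether `wanted` is among the served IDs."""
    with urlopen(f"{base_url}/models") as resp:  # e.g. http://127.0.0.1:1220/v1
        return wanted in served_model_ids(resp.read().decode())

# Offline demonstration with a sample response; the real ID reported by
# MindIE may differ (the name used here is a hypothetical example):
sample = json.dumps({"data": [{"id": "Qwen2.5-0.5B-Instruct", "object": "model"}]})
print(served_model_ids(sample))  # -> ['Qwen2.5-0.5B-Instruct']
```

If the ID returned by `GET http://127.0.0.1:1220/v1/models` differs from the model name typed into LobeChat (matching is typically exact and case-sensitive), entering the served ID verbatim usually resolves this error.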