The window.ai extension (v0.0.21 on Chrome) can't talk to local.ai by default.
It first sends a POST with "/model" appended to the configured URL (in this case: http://127.0.0.1:8000/v1). local.ai returns a 404 because http://127.0.0.1:8000/v1/model doesn't exist, even though http://127.0.0.1:8000/v1/models does. If I rewrite the URL to "/models" with a proxy, local.ai responds with the model ("local.ai"), and window.ai is then able to hit the "/completions" URL.
To keep local.ai working with window.ai out-of-the-box, could an endpoint be added at "/model" that responds the same as "/models"?
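As a stopgap, the proxy workaround described above boils down to a one-line path rewrite. Here's a minimal sketch of that rule (the function name is mine; it assumes window.ai only ever appends "/model" to the base URL):

```python
def rewrite_path(path: str) -> str:
    """Map window.ai's "/model" request onto local.ai's "/models" endpoint.

    Any other path (e.g. "/v1/completions") passes through unchanged.
    """
    if path.endswith("/model"):
        return path + "s"
    return path


# Example: the request window.ai actually sends gets redirected,
# while the endpoints local.ai already serves are untouched.
print(rewrite_path("/v1/model"))        # -> /v1/models
print(rewrite_path("/v1/completions"))  # -> /v1/completions
```

An alias route inside local.ai that serves "/model" identically to "/models" would make this rewrite unnecessary.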