
[bug]: context_chat_backend/models/__init__.py has instructor model reference #92

Open
ga-it opened this issue Oct 27, 2024 · 0 comments
Labels
bug Something isn't working

Comments


ga-it commented Oct 27, 2024

Describe the bug
`context_chat_backend/models/__init__.py` still contains "instructor" in its list of embedding models. This causes a 500 error when the backend attempts to load the models.

Line 7: `_embedding_models = ["llama", "hugging_face", "instructor"]`
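For context, `load_model` (line 22 in the traceback below) imports each model module by name, so any entry in `_embedding_models` without a matching `models/<name>.py` fails at import time. This is a hypothetical sketch of that pattern; only the `import_module` call and the error message are taken from the traceback, the rest is illustrative:

```python
from importlib import import_module

# Line 7 of __init__.py: "instructor" has no backing module anymore.
_embedding_models = ["llama", "hugging_face", "instructor"]


def load_model(model_name: str, package: str = "context_chat_backend.models"):
    """Import the model's module by name, as load_model does in __init__.py."""
    try:
        return import_module(f".{model_name}", package)
    except ModuleNotFoundError as e:
        # Matches the AssertionError raised at line 24 of __init__.py
        raise AssertionError(
            f"Error: could not load {model_name} model from {package.replace('.', '/')}"
        ) from e
```

With the stale "instructor" entry still in the list, the first query that triggers model loading raises this `AssertionError` and surfaces as the 500 below.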

To Reproduce
Steps to reproduce the behavior:

  1. Start the context_chat_backend container
  2. From the assistant, launch a context chat query
  3. Receive a 500 error when the model fails to load

Expected behavior
The "instructor" model reference needs to be removed so that the models load correctly.

Context Chat Backend logs (if applicable, from the docker container)

```
--- Logging error ---
Traceback (most recent call last):
File "/app/context_chat_backend/models/__init__.py", line 22, in load_model
module = import_module(f".{model_name}", "context_chat_backend.models")
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.11/importlib/__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "<frozen importlib._bootstrap>", line 1204, in _gcd_import
File "<frozen importlib._bootstrap>", line 1176, in _find_and_load
File "<frozen importlib._bootstrap>", line 1140, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'context_chat_backend.models.instructor'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/app/context_chat_backend/models/__init__.py", line 49, in init_model
model = load_model(model_type, model_info)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/context_chat_backend/models/__init__.py", line 24, in load_model
raise AssertionError(f"Error: could not load {model_name} model from context_chat_backend/models") from e
AssertionError: Error: could not load instructor model from context_chat_backend/models

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/app/context_chat_backend/dyn_loader.py", line 77, in load
model = init_model('embedding', self.config['embedding'])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/context_chat_backend/models/__init__.py", line 51, in init_model
raise AssertionError(f"Error: {model_name} failed to load") from e
AssertionError: Error: instructor failed to load

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/usr/local/lib/python3.11/dist-packages/starlette/_exception_handler.py", line 42, in wrapped_app
await app(scope, receive, sender)
File "/usr/local/lib/python3.11/dist-packages/starlette/routing.py", line 73, in app
response = await f(request)
^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/fastapi/routing.py", line 301, in app
raw_response = await run_endpoint_function(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/fastapi/routing.py", line 214, in run_endpoint_function
return await run_in_threadpool(dependant.call, **values)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/starlette/concurrency.py", line 39, in run_in_threadpool
return await anyio.to_thread.run_sync(func, *args)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/anyio/to_thread.py", line 56, in run_sync
return await get_async_backend().run_sync_in_worker_thread(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/anyio/_backends/_asyncio.py", line 2441, in run_sync_in_worker_thread
return await future
^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/anyio/_backends/_asyncio.py", line 943, in run
result = context.run(func, *args)
^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/context_chat_backend/controller.py", line 123, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/app/context_chat_backend/controller.py", line 361, in _
return execute_query(query)
^^^^^^^^^^^^^^^^^^^^
File "/app/context_chat_backend/controller.py", line 329, in execute_query
db: BaseVectorDB = vectordb_loader.load()
^^^^^^^^^^^^^^^^^^^^^^
File "/app/context_chat_backend/dyn_loader.py", line 57, in load
embedding_model = EmbeddingModelLoader(self.app, self.config).load()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/context_chat_backend/dyn_loader.py", line 79, in load
raise LoaderException() from e
context_chat_backend.dyn_loader.LoaderException

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/usr/lib/python3.11/logging/__init__.py", line 1110, in emit
msg = self.format(record)
^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.11/logging/__init__.py", line 953, in format
return fmt.format(record)
^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.11/logging/__init__.py", line 687, in format
record.message = record.getMessage()
^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.11/logging/__init__.py", line 377, in getMessage
msg = msg % self.args
~~~~^~~~~~~~~~~
TypeError: not all arguments converted during string formatting
Call stack:
File "/app/main.py", line 16, in <module>
uvicorn.run(
File "/usr/local/lib/python3.11/dist-packages/uvicorn/main.py", line 579, in run
server.run()
File "/usr/local/lib/python3.11/dist-packages/uvicorn/server.py", line 65, in run
return asyncio.run(self.serve(sockets=sockets))
File "/usr/lib/python3.11/asyncio/runners.py", line 190, in run
return runner.run(main)
File "/usr/lib/python3.11/asyncio/runners.py", line 118, in run
return self._loop.run_until_complete(task)
File "/usr/local/lib/python3.11/dist-packages/uvicorn/protocols/http/h11_impl.py", line 406, in run_asgi
result = await app( # type: ignore[func-returns-value]
File "/usr/local/lib/python3.11/dist-packages/uvicorn/middleware/proxy_headers.py", line 60, in __call__
return await self.app(scope, receive, send)
File "/usr/local/lib/python3.11/dist-packages/uvicorn/middleware/message_logger.py", line 80, in __call__
await self.app(scope, inner_receive, inner_send)
File "/usr/local/lib/python3.11/dist-packages/fastapi/applications.py", line 1054, in __call__
await super().__call__(scope, receive, send)
File "/usr/local/lib/python3.11/dist-packages/starlette/applications.py", line 113, in __call__
await self.middleware_stack(scope, receive, send)
File "/usr/local/lib/python3.11/dist-packages/starlette/middleware/errors.py", line 165, in __call__
await self.app(scope, receive, _send)
File "/app/context_chat_backend/ocs_utils.py", line 75, in __call__
await self.app(scope, receive, send)
File "/usr/local/lib/python3.11/dist-packages/starlette/middleware/exceptions.py", line 62, in __call__
await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
File "/usr/local/lib/python3.11/dist-packages/starlette/_exception_handler.py", line 42, in wrapped_app
await app(scope, receive, sender)
File "/usr/local/lib/python3.11/dist-packages/starlette/routing.py", line 715, in __call__
await self.middleware_stack(scope, receive, send)
File "/usr/local/lib/python3.11/dist-packages/starlette/routing.py", line 735, in app
await route.handle(scope, receive, send)
File "/usr/local/lib/python3.11/dist-packages/starlette/routing.py", line 288, in handle
await self.app(scope, receive, send)
File "/usr/local/lib/python3.11/dist-packages/starlette/routing.py", line 76, in app
await wrap_app_handling_exceptions(app, request)(scope, receive, send)
File "/usr/local/lib/python3.11/dist-packages/starlette/_exception_handler.py", line 59, in wrapped_app
response = await handler(conn, exc)
File "/app/context_chat_backend/controller.py", line 86, in _
log_error(f'Loader Error: {request.url.path}:', exc)
Message: 'Loader Error: /query:'
Arguments: (LoaderException(),)
TRACE: x:43818 - ASGI [4] Send {'type': 'http.response.start', 'status': 500, 'headers': '<...>'}
INFO: x:43818 - "POST /query HTTP/1.1" 500 Internal Server Error
TRACE: x:43818 - ASGI [4] Send {'type': 'http.response.body', 'body': '<90 bytes>'}
TRACE: x:43818 - ASGI [4] Completed
TRACE: x:43818 - HTTP connection lost
```


 - Nextcloud Version: 30.0.1 - 30.0.1.2
 - AppAPI Version: 4.0.0
 - Context Chat PHP Version: 3.1.0
 - Context Chat Backend Version: 3.1.0
 - Nextcloud deployment method: Docker
 - Context Chat Backend deployment method: manual, remote
@ga-it ga-it added the bug Something isn't working label Oct 27, 2024