
'NoneType' object is not iterable (indexing.py:79) #686

Open
shijianzhong opened this issue Sep 29, 2024 · 1 comment
Labels
bug Something isn't working

Comments

@shijianzhong

Describe the bug
2024-09-29 09:19:13,162 - wren-ai-service - ERROR - ask pipeline - Failed to prepare semantics: 'NoneType' object is not iterable (indexing.py:79)
Traceback (most recent call last):
File "/src/web/v1/services/indexing.py", line 68, in prepare_semantics
await self._pipelines["indexing"].run(
File "/src/utils.py", line 119, in wrapper_timer
return await process(func, *args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/src/utils.py", line 103, in process
return await func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", line 188, in async_wrapper
self._handle_exception(observation, e)
File "/app/.venv/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", line 428, in _handle_exception
raise e
File "/app/.venv/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", line 186, in async_wrapper
result = await func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/src/pipelines/indexing/indexing.py", line 682, in run
return await self._pipe.execute(
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/hamilton/async_driver.py", line 368, in execute
raise e
File "/app/.venv/lib/python3.12/site-packages/hamilton/async_driver.py", line 359, in execute
outputs = await self.raw_execute(final_vars, overrides, display_graph, inputs=inputs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/hamilton/async_driver.py", line 320, in raw_execute
raise e
File "/app/.venv/lib/python3.12/site-packages/hamilton/async_driver.py", line 315, in raw_execute
results = await await_dict_of_tasks(task_dict)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/hamilton/async_driver.py", line 23, in await_dict_of_tasks
coroutines_gathered = await asyncio.gather(*coroutines)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/hamilton/async_driver.py", line 36, in process_value
return await val
^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/hamilton/async_driver.py", line 91, in new_fn
fn_kwargs = await await_dict_of_tasks(task_dict)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/hamilton/async_driver.py", line 23, in await_dict_of_tasks
coroutines_gathered = await asyncio.gather(*coroutines)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/hamilton/async_driver.py", line 36, in process_value
return await val
^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/hamilton/async_driver.py", line 122, in new_fn
await fn(**fn_kwargs) if asyncio.iscoroutinefunction(fn) else fn(**fn_kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/src/utils.py", line 119, in wrapper_timer
return await process(func, *args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/src/utils.py", line 103, in process
return await func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", line 188, in async_wrapper
self._handle_exception(observation, e)
File "/app/.venv/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", line 428, in _handle_exception
raise e
File "/app/.venv/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", line 186, in async_wrapper
result = await func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/src/pipelines/indexing/indexing.py", line 526, in embed_table_descriptions
return await document_embedder.run(covert_to_table_descriptions["documents"])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/backoff/_async.py", line 151, in retry
ret = await target(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/src/providers/embedder/openai.py", line 168, in run
embeddings, meta = await self._embed_batch(
^^^^^^^^^^^^^^^^^^^^^^^^
File "/src/providers/embedder/openai.py", line 142, in _embed_batch
meta["usage"] = dict(response.usage)
^^^^^^^^^^^^^^^^^^^^
TypeError: 'NoneType' object is not iterable
ERROR: Exception in ASGI application
Traceback (most recent call last):
File "/app/.venv/lib/python3.12/site-packages/uvicorn/protocols/http/httptools_impl.py", line 399, in run_asgi
result = await app( # type: ignore[func-returns-value]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/uvicorn/middleware/proxy_headers.py", line 70, in __call__
return await self.app(scope, receive, send)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/fastapi/applications.py", line 1054, in __call__
await super().__call__(scope, receive, send)
File "/app/.venv/lib/python3.12/site-packages/starlette/applications.py", line 123, in __call__
await self.middleware_stack(scope, receive, send)
File "/app/.venv/lib/python3.12/site-packages/starlette/middleware/errors.py", line 186, in __call__
raise exc
File "/app/.venv/lib/python3.12/site-packages/starlette/middleware/errors.py", line 164, in __call__
await self.app(scope, receive, _send)
File "/app/.venv/lib/python3.12/site-packages/starlette/middleware/cors.py", line 83, in __call__
await self.app(scope, receive, send)
File "/app/.venv/lib/python3.12/site-packages/starlette/middleware/exceptions.py", line 62, in __call__
await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
File "/app/.venv/lib/python3.12/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
raise exc
File "/app/.venv/lib/python3.12/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
await app(scope, receive, sender)
File "/app/.venv/lib/python3.12/site-packages/starlette/routing.py", line 758, in __call__
await self.middleware_stack(scope, receive, send)
File "/app/.venv/lib/python3.12/site-packages/starlette/routing.py", line 778, in app
await route.handle(scope, receive, send)
File "/app/.venv/lib/python3.12/site-packages/starlette/routing.py", line 299, in handle
await self.app(scope, receive, send)
File "/app/.venv/lib/python3.12/site-packages/starlette/routing.py", line 79, in app
await wrap_app_handling_exceptions(app, request)(scope, receive, send)
File "/app/.venv/lib/python3.12/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
raise exc
File "/app/.venv/lib/python3.12/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
await app(scope, receive, sender)
File "/app/.venv/lib/python3.12/site-packages/starlette/routing.py", line 77, in app
await response(scope, receive, send)
File "/app/.venv/lib/python3.12/site-packages/starlette/responses.py", line 161, in __call__
await self.background()
File "/app/.venv/lib/python3.12/site-packages/starlette/background.py", line 45, in __call__
await task()
File "/app/.venv/lib/python3.12/site-packages/starlette/background.py", line 28, in __call__
await self.func(*self.args, **self.kwargs)
File "/src/utils.py", line 119, in wrapper_timer
return await process(func, *args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/src/utils.py", line 103, in process
return await func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", line 188, in async_wrapper
self._handle_exception(observation, e)
File "/app/.venv/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", line 428, in _handle_exception
raise e
File "/app/.venv/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", line 186, in async_wrapper
result = await func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/src/utils.py", line 179, in wrapper
results = await func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/src/web/v1/services/indexing.py", line 83, in prepare_semantics
] = SemanticsPreparationStatusResponse(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/pydantic/main.py", line 193, in __init__
self.__pydantic_validator__.validate_python(data, self_instance=self)
pydantic_core._pydantic_core.ValidationError: 1 validation error for SemanticsPreparationStatusResponse
error
Input should be a valid dictionary or instance of SemanticsPreparationError [type=model_type, input_value="Failed to prepare semant... object is not iterable", input_type=str]
For further information visit https://errors.pydantic.dev/2.8/v/model_type
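
As an aside, the ValidationError just above is a secondary failure: while reporting the embedding error, the service passed the raw message string where the status response expects a nested error model. A minimal sketch of the mismatch, assuming Pydantic v2 (the model and field names here mirror the traceback but are assumptions, not the actual wren-ai-service schema):

```python
from typing import Optional
from pydantic import BaseModel, ValidationError

class SemanticsPreparationError(BaseModel):
    code: str
    message: str

class SemanticsPreparationStatusResponse(BaseModel):
    status: str
    error: Optional[SemanticsPreparationError] = None

# Passing a bare string fails validation, as in the log above:
try:
    SemanticsPreparationStatusResponse(
        status="failed",
        error="Failed to prepare semantics: 'NoneType' object is not iterable",
    )
    raised = False
except ValidationError:
    raised = True

# Wrapping the message in the error model validates cleanly:
resp = SemanticsPreparationStatusResponse(
    status="failed",
    error=SemanticsPreparationError(
        code="OTHERS",
        message="Failed to prepare semantics: 'NoneType' object is not iterable",
    ),
)
```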

Calculating embeddings: 0%| | 0/1 [00:01<?, ?it/s]


embed_dbschema [src.pipelines.indexing.indexing.embed_dbschema()] encountered an error
Node inputs:
{'convert_to_ddl': "<Task finished name='Task-10' coro=<AsyncGraphAdap...",
'document_embedder': '<src.providers.embedder.openai.AsyncDocumentEmbedd...'}


Traceback (most recent call last):
File "/app/.venv/lib/python3.12/site-packages/hamilton/async_driver.py", line 122, in new_fn
await fn(**fn_kwargs) if asyncio.iscoroutinefunction(fn) else fn(**fn_kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/src/utils.py", line 119, in wrapper_timer
return await process(func, *args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/src/utils.py", line 103, in process
return await func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", line 188, in async_wrapper
self._handle_exception(observation, e)
File "/app/.venv/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", line 428, in _handle_exception
raise e
File "/app/.venv/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", line 186, in async_wrapper
result = await func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/src/pipelines/indexing/indexing.py", line 568, in embed_dbschema
return await document_embedder.run(documents=convert_to_ddl["documents"])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/backoff/_async.py", line 151, in retry
ret = await target(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/src/providers/embedder/openai.py", line 168, in run
embeddings, meta = await self._embed_batch(
^^^^^^^^^^^^^^^^^^^^^^^^
File "/src/providers/embedder/openai.py", line 142, in _embed_batch
meta["usage"] = dict(response.usage)
^^^^^^^^^^^^^^^^^^^^
TypeError: 'NoneType' object is not iterable
INFO: 172.17.0.1:55488 - "GET /health HTTP/1.1" 200 OK
INFO: 172.17.0.6:33208 - "GET /v1/semantics-preparations/1e6cfb3398f3c3b5ee57653090e1177ec1997c53/status HTTP/1.1" 200 OK
INFO: 172.17.0.6:33214 - "GET /v1/semantics-preparations/1e6cfb3398f3c3b5ee57653090e1177ec1997c53/status HTTP/1.1" 200 OK
INFO: 172.17.0.6:40280 - "GET /v1/semantics-preparations/1e6cfb3398f3c3b5ee57653090e1177ec1997c53/status HTTP/1.1" 200 OK
INFO: 172.17.0.6:40284 - "GET /v1/semantics-preparations/1e6cfb3398f3c3b5ee57653090e1177ec1997c53/status HTTP/1.1" 200 OK
INFO: 172.17.0.6:44658 - "GET /v1/semantics-preparations/1e6cfb3398f3c3b5ee57653090e1177ec1997c53/status HTTP/1.1" 200 OK
Forcing deployment: {'data': {'deploy': {'status': 'FAILED', 'error': 'Wren AI: Deploy wren AI failed or timeout, hash: 1e6cfb3398f3c3b5ee57653090e1177ec1997c53'}}}
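
The root cause in both tracebacks is `meta["usage"] = dict(response.usage)` in `_embed_batch` (openai.py, line 142): the embeddings response can come back with `usage` set to `None` (for example from some proxies or OpenAI-compatible endpoints), and `dict(None)` raises exactly this TypeError. A minimal sketch of a defensive guard, assuming only that `response` exposes a `usage` attribute as in the traceback (`extract_usage` is a hypothetical helper, not wren-ai-service code):

```python
from types import SimpleNamespace

def extract_usage(response):
    """Return the response's usage metadata as a dict, or {} when missing."""
    usage = getattr(response, "usage", None)
    if usage is None:
        return {}
    return dict(usage)

# A response whose usage is None no longer raises TypeError:
print(extract_usage(SimpleNamespace(usage=None)))  # → {}
```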


@shijianzhong shijianzhong added the bug Something isn't working label Sep 29, 2024
@cyyeh
Member

cyyeh commented Sep 30, 2024

@shijianzhong thanks for reaching out

We've already fixed this issue, but the fix isn't in the latest Wren AI release yet. You can try the patched version of wren-ai-service as follows:

  1. shut down Wren AI
  2. go to ~/.wrenai in terminal
  3. change the value of WREN_AI_SERVICE_VERSION to 0.8.17 in ~/.wrenai/.env
  4. make sure LLM_OPENAI_API_KEY and EMBEDDER_OPENAI_API_KEY are filled in ~/.wrenai/.env
  5. restart Wren AI with this command: docker-compose --env-file .env up -d
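
Steps 3–5 can be sketched as shell commands; the example below runs against a throwaway copy of `.env` so it is self-contained, but in practice you would edit `~/.wrenai/.env` directly (the placeholder values are assumptions):

```shell
# Work on a temp copy of .env for illustration; use ~/.wrenai/.env for real.
envdir=$(mktemp -d)
printf 'WREN_AI_SERVICE_VERSION=0.8.0\nLLM_OPENAI_API_KEY=sk-placeholder\nEMBEDDER_OPENAI_API_KEY=sk-placeholder\n' > "$envdir/.env"

# Step 3: pin the fixed wren-ai-service version.
sed -i.bak 's/^WREN_AI_SERVICE_VERSION=.*/WREN_AI_SERVICE_VERSION=0.8.17/' "$envdir/.env"
grep '^WREN_AI_SERVICE_VERSION=' "$envdir/.env"

# Step 5 (run from ~/.wrenai, after shutting Wren AI down):
#   docker-compose --env-file .env up -d
```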
