[Bug]: Elasticsearch: Timeout context manager should be used inside a task

Bug Description

I'm developing a chatbot, and the bug appears when the second request is sent.

  • I have llama-index installed in a conda environment with Python 3.12.3.
  • I have Streamlit 1.36.0 for the UI.
  • I have elastic-transport 8.13.1, elasticsearch 8.14.0 and llama-index-vector-stores-elasticsearch 0.2.0 (a quick version check is sketched below).
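
For reference, the installed versions can be confirmed with a quick check like the following (a minimal sketch using the standard-library importlib.metadata; the distribution names are the ones listed above, adjust them if your environment differs):

import importlib.metadata as md

# Distribution names as installed with pip/conda.
packages = [
    "llama-index",
    "streamlit",
    "elastic-transport",
    "elasticsearch",
    "llama-index-vector-stores-elasticsearch",
]

for name in packages:
    try:
        print(f"{name}=={md.version(name)}")
    except md.PackageNotFoundError:
        print(f"{name} is not installed")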

I opened an issue on the llama_index GitHub repository, but apparently the bug actually comes from the Elasticsearch library.

I have also opened an issue on the elasticsearch-py GitHub repository.


Version

elasticsearch : 8.14.0
elastic-transport : 8.13.1

Steps to Reproduce

With llama_index, send a second query to a RetrieverQueryEngine built from a VectorIndexRetriever, a VectorStoreIndex and an ElasticsearchStore; the first query succeeds, the second one fails with the error below. A sketch of this setup follows.
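
A minimal sketch of the failing shape of such an app, matching the Streamlit call chain in the traceback below. The cluster URL, index name, and the use of st.cache_resource are placeholder assumptions, not details from the original report, and an embedding model is assumed to be configured via llama_index.core.Settings:

# streamlit_app.py -- placeholder names throughout; not the original app code.
import streamlit as st

from llama_index.core import VectorStoreIndex
from llama_index.core.query_engine import RetrieverQueryEngine
from llama_index.core.retrievers import VectorIndexRetriever
from llama_index.vector_stores.elasticsearch import ElasticsearchStore


@st.cache_resource
def build_query_engine() -> RetrieverQueryEngine:
    # Assumes a reachable Elasticsearch cluster and an existing index.
    vector_store = ElasticsearchStore(
        es_url="http://localhost:9200",
        index_name="my-index",
    )
    index = VectorStoreIndex.from_vector_store(vector_store=vector_store)
    retriever = VectorIndexRetriever(index=index)
    return RetrieverQueryEngine(retriever=retriever)


engine = build_query_engine()
prompt = st.chat_input("Ask something")
if prompt:
    # The first prompt is answered normally; the second one raises
    # "RuntimeError: Timeout context manager should be used inside a task".
    st.write(engine.query(prompt).response)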

Relevant Logs/Tracebacks

Traceback (most recent call last):
  File "C:\Users\...\AppData\Local\anaconda3\envs\llama-index\Lib\site-packages\streamlit\runtime\scriptrunner\script_runner.py", line 589, in _run_script
    exec(code, module.__dict__)
  File "C:\data\git\.......\streamlit_app.py", line 52, in <module>
    response = response_generator.chat(user_query=prompt)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\data\git\.......\src\components\response_synthesis.py", line 84, in chat
    content = self.build_context_prompt(self.retriever(user_query=user_query))
                                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\data\git\.......\src\components\response_synthesis.py", line 61, in retriever
    retrieved_nodes = retriever.retrieve(user_query)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\...\AppData\Local\anaconda3\envs\llama-index\Lib\site-packages\llama_index\core\instrumentation\dispatcher.py", line 230, in wrapper
    result = func(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\...\AppData\Local\anaconda3\envs\llama-index\Lib\site-packages\llama_index\core\base\base_retriever.py", line 243, in retrieve
    nodes = self._retrieve(query_bundle)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\...\AppData\Local\anaconda3\envs\llama-index\Lib\site-packages\llama_index\core\instrumentation\dispatcher.py", line 230, in wrapper
    result = func(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\...\AppData\Local\anaconda3\envs\llama-index\Lib\site-packages\llama_index\core\indices\vector_store\retrievers\retriever.py", line 101, in _retrieve
    return self._get_nodes_with_embeddings(query_bundle)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\...\AppData\Local\anaconda3\envs\llama-index\Lib\site-packages\llama_index\core\indices\vector_store\retrievers\retriever.py", line 177, in _get_nodes_with_embeddings
    query_result = self._vector_store.query(query, **self._kwargs)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\...\AppData\Local\anaconda3\envs\llama-index\Lib\site-packages\llama_index\vector_stores\elasticsearch\base.py", line 412, in query
    return asyncio.get_event_loop().run_until_complete(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\...\AppData\Local\anaconda3\envs\llama-index\Lib\site-packages\nest_asyncio.py", line 98, in run_until_complete
    return f.result()
           ^^^^^^^^^^
  File "C:\Users\...\AppData\Local\anaconda3\envs\llama-index\Lib\asyncio\futures.py", line 203, in result     
    raise self._exception.with_traceback(self._exception_tb)
  File "C:\Users\...\AppData\Local\anaconda3\envs\llama-index\Lib\asyncio\tasks.py", line 314, in __step_run_and_handle_result
    result = coro.send(None)
             ^^^^^^^^^^^^^^^
  File "C:\Users\...\AppData\Local\anaconda3\envs\llama-index\Lib\site-packages\llama_index\vector_stores\elasticsearch\base.py", line 452, in aquery
    hits = await self._store.search(
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\...\AppData\Local\anaconda3\envs\llama-index\Lib\site-packages\elasticsearch\helpers\vectorstore\_async\vectorstore.py", line 277, in search
    response = await self.client.search(
               ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\...\AppData\Local\anaconda3\envs\llama-index\Lib\site-packages\elasticsearch\_async\client\__init__.py", line 4121, in search
    return await self.perform_request(  # type: ignore[return-value]
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\...\AppData\Local\anaconda3\envs\llama-index\Lib\site-packages\elasticsearch\_async\client\_base.py", line 271, in perform_request
    response = await self._perform_request(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\...\AppData\Local\anaconda3\envs\llama-index\Lib\site-packages\elasticsearch\_async\client\_base.py", line 316, in _perform_request
    meta, resp_body = await self.transport.perform_request(
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\...\AppData\Local\anaconda3\envs\llama-index\Lib\site-packages\elastic_transport\_async_transport.py", line 264, in perform_request
    resp = await node.perform_request(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\...\AppData\Local\anaconda3\envs\llama-index\Lib\site-packages\elastic_transport\_node\_http_aiohttp.py", line 179, in perform_request
    async with self.session.request(
  File "C:\Users\...\AppData\Local\anaconda3\envs\llama-index\Lib\site-packages\aiohttp\client.py", line 1197, in __aenter__
    self._resp = await self._coro
                 ^^^^^^^^^^^^^^^^
  File "C:\Users\...\AppData\Local\anaconda3\envs\llama-index\Lib\site-packages\aiohttp\client.py", line 507, in _request
    with timer:
  File "C:\Users\...\AppData\Local\anaconda3\envs\llama-index\Lib\site-packages\aiohttp\helpers.py", line 715, in __enter__
    raise RuntimeError(
RuntimeError: Timeout context manager should be used inside a task
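
For context, aiohttp raises this RuntimeError from its request timer when there is no current task on the event loop the timer (and the client's session) belongs to, which is typically what happens when the second request ends up running on a different loop than the one the async Elasticsearch client was created on. A simplified, Elasticsearch-free sketch of that condition (the check mirrors what aiohttp's TimerContext.__enter__ does, but this is an illustration, not the library code):

import asyncio

def enter_timer(session_loop: asyncio.AbstractEventLoop) -> None:
    # Simplified version of the check aiohttp performs before arming a timeout.
    if asyncio.current_task(loop=session_loop) is None:
        raise RuntimeError("Timeout context manager should be used inside a task")

async def request(session_loop: asyncio.AbstractEventLoop) -> None:
    # Runs as a task on the *currently running* loop, not on session_loop,
    # so there is no current task on session_loop and the check above fails.
    enter_timer(session_loop)

first_loop = asyncio.new_event_loop()  # loop the client/session was created on
try:
    asyncio.run(request(first_loop))   # the "second request" runs on a new loop
except RuntimeError as exc:
    print(exc)  # Timeout context manager should be used inside a task
finally:
    first_loop.close()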

Hi @FlorentGrenier,

Welcome to the community! Thank you for sharing the issue. I see you have raised a GitHub issue, which is probably the best place to continue the discussion.

Thanks again for raising!
