runtime error

llama-cpp-python
  Building wheel for llama-cpp-python (pyproject.toml): started
  Building wheel for llama-cpp-python (pyproject.toml): finished with status 'done'
  Created wheel for llama-cpp-python: filename=llama_cpp_python-0.2.32-cp39-cp39-manylinux_2_31_x86_64.whl size=2312073 sha256=4bf1f5aa407a6e311c6d422b9f606c3a018da81d5016502f13bf93fa5a888aae
  Stored in directory: /home/user/.cache/pip/wheels/93/9c/75/9ad6acc8f92dd622deb6e625d30d6e18245cb66e3b226e281e
Successfully built llama-cpp-python
Installing collected packages: diskcache, llama-cpp-python
Successfully installed diskcache-5.6.3 llama-cpp-python-0.2.32

[notice] A new release of pip available: 22.3.1 -> 23.3.2
[notice] To update, run: python -m pip install --upgrade pip

Traceback (most recent call last):
  File "/home/user/app/app.py", line 4, in <module>
    import gradio as gr
  File "/home/user/.local/lib/python3.9/site-packages/gradio/__init__.py", line 3, in <module>
    import gradio.components as components
  File "/home/user/.local/lib/python3.9/site-packages/gradio/components.py", line 32, in <module>
    from gradio import media_data, processing_utils, utils
  File "/home/user/.local/lib/python3.9/site-packages/gradio/processing_utils.py", line 20, in <module>
    from gradio import encryptor, utils
  File "/home/user/.local/lib/python3.9/site-packages/gradio/utils.py", line 404, in <module>
    class AsyncRequest:
  File "/home/user/.local/lib/python3.9/site-packages/gradio/utils.py", line 424, in AsyncRequest
    client = httpx.AsyncClient()
  File "/home/user/.local/lib/python3.9/site-packages/httpx/_client.py", line 1397, in __init__
    self._transport = self._init_transport(
  File "/home/user/.local/lib/python3.9/site-packages/httpx/_client.py", line 1445, in _init_transport
    return AsyncHTTPTransport(
  File "/home/user/.local/lib/python3.9/site-packages/httpx/_transports/default.py", line 275, in __init__
    self._pool = httpcore.AsyncConnectionPool(
TypeError: __init__() got an unexpected keyword argument 'socket_options'
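The `TypeError` is raised inside httpx, not in the app itself: httpx is passing a `socket_options` keyword to `httpcore.AsyncConnectionPool`, which the installed httpcore version does not accept. That pattern usually means the environment resolved to an httpx release newer than the httpcore release it was paired with (newer httpx expects httpcore 1.x). One possible fix, assuming the Space installs dependencies from a `requirements.txt`, is to pin a known-compatible httpx/httpcore pair (the exact versions below are an illustrative, mutually compatible pair, not something mandated by this log):

```
# requirements.txt — pin a compatible httpx/httpcore pair
# (httpx 0.24.1 declares httpcore>=0.15.0,<0.18.0, so these match)
httpx==0.24.1
httpcore==0.17.3
```

Alternatively, since the traceback goes through `gradio/encryptor.py`-era internals, this is an old Gradio 3.x install; upgrading `gradio` itself would also pull in a consistent httpx/httpcore pair.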
