
Default language model setting gives openai.APIConnectionError: Connection error #788

Open
adamksiezyk opened this issue May 20, 2024 · 1 comment
Labels: bug (Something isn't working)

Comments


adamksiezyk commented May 20, 2024

Description

The chat crashes with openai.APIConnectionError: Connection error when the default language model is configured through the CLI, but works when the same model is set manually via the UI.

Steps to reproduce

  1. Start JupyterLab with the default language model, model parameters, and API key set via the AiExtension CLI flags (an equivalent config-file sketch follows these steps):

jupyter lab \
--AiExtension.default_language_model=openai-chat:gpt-3.5-turbo \
--AiExtension.model_parameters openai-chat:gpt-3.5-turbo='{"openai_api_base": "http://localhost:8000"}' \
--AiExtension.default_api_keys='{"OPENAI_API_KEY": "111-111-111-111"}'

  2. Open the Jupyter AI chat panel.

  3. Send a message.
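
For reference, the same overrides can presumably also live in a Jupyter config file instead of on the command line. A minimal sketch, assuming AiExtension picks these traits up through the standard traitlets config discovery (hypothetical, untested here):

```python
# ~/.jupyter/jupyter_lab_config.py -- hypothetical equivalent of the CLI flags above.
# Assumes AiExtension reads these traits through normal traitlets configuration.
c.AiExtension.default_language_model = "openai-chat:gpt-3.5-turbo"
c.AiExtension.model_parameters = {
    "openai-chat:gpt-3.5-turbo": {"openai_api_base": "http://localhost:8000"}
}
c.AiExtension.default_api_keys = {"OPENAI_API_KEY": "111-111-111-111"}
```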

Expected behavior

A response from the local LLM at http://localhost:8000.

Actual behavior

Traceback (most recent call last):
  File "/opt/conda/lib/python3.11/site-packages/httpx/_transports/default.py", line 69, in map_httpcore_exceptions
    yield
  File "/opt/conda/lib/python3.11/site-packages/httpx/_transports/default.py", line 373, in handle_async_request
    resp = await self._pool.handle_async_request(req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/httpcore/_async/connection_pool.py", line 216, in handle_async_request
    raise exc from None
  File "/opt/conda/lib/python3.11/site-packages/httpcore/_async/connection_pool.py", line 196, in handle_async_request
    response = await connection.handle_async_request(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/httpcore/_async/connection.py", line 99, in handle_async_request
    raise exc
  File "/opt/conda/lib/python3.11/site-packages/httpcore/_async/connection.py", line 76, in handle_async_request
    stream = await self._connect(request)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/httpcore/_async/connection.py", line 122, in _connect
    stream = await self._network_backend.connect_tcp(**kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/httpcore/_backends/auto.py", line 30, in connect_tcp
    return await self._backend.connect_tcp(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/httpcore/_backends/anyio.py", line 112, in connect_tcp
    with map_exceptions(exc_map):
  File "/opt/conda/lib/python3.11/contextlib.py", line 158, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/opt/conda/lib/python3.11/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
    raise to_exc(exc) from exc
httpcore.ConnectError: All connection attempts failed

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/opt/conda/lib/python3.11/site-packages/openai/_base_client.py", line 1522, in _request
    response = await self._client.send(
               ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/httpx/_client.py", line 1661, in send
    response = await self._send_handling_auth(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/httpx/_client.py", line 1689, in _send_handling_auth
    response = await self._send_handling_redirects(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/httpx/_client.py", line 1726, in _send_handling_redirects
    response = await self._send_single_request(request)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/httpx/_client.py", line 1763, in _send_single_request
    response = await transport.handle_async_request(request)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/httpx/_transports/default.py", line 372, in handle_async_request
    with map_httpcore_exceptions():
  File "/opt/conda/lib/python3.11/contextlib.py", line 158, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/opt/conda/lib/python3.11/site-packages/httpx/_transports/default.py", line 86, in map_httpcore_exceptions
    raise mapped_exc(message) from exc
httpx.ConnectError: All connection attempts failed

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/opt/conda/lib/python3.11/site-packages/jupyter_ai/chat_handlers/base.py", line 125, in on_message
    await self.process_message(message)
  File "/opt/conda/lib/python3.11/site-packages/jupyter_ai/chat_handlers/default.py", line 61, in process_message
    response = await self.llm_chain.apredict(input=message.body, stop=["\nHuman:"])
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/langchain/chains/llm.py", line 333, in apredict
    return (await self.acall(kwargs, callbacks=callbacks))[self.output_key]
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/_api/deprecation.py", line 157, in awarning_emitting_wrapper
    return await wrapped(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/langchain/chains/base.py", line 428, in acall
    return await self.ainvoke(
           ^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/langchain/chains/base.py", line 212, in ainvoke
    raise e
  File "/opt/conda/lib/python3.11/site-packages/langchain/chains/base.py", line 203, in ainvoke
    await self._acall(inputs, run_manager=run_manager)
  File "/opt/conda/lib/python3.11/site-packages/langchain/chains/llm.py", line 298, in _acall
    response = await self.agenerate([inputs], run_manager=run_manager)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/langchain/chains/llm.py", line 165, in agenerate
    return await self.llm.agenerate_prompt(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 570, in agenerate_prompt
    return await self.agenerate(
           ^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 530, in agenerate
    raise exceptions[0]
  File "/opt/conda/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 715, in _agenerate_with_cache
    result = await self._agenerate(
             ^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/langchain_openai/chat_models/base.py", line 559, in _agenerate
    response = await self.async_client.create(messages=message_dicts, **params)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/openai/resources/chat/completions.py", line 1181, in create
    return await self._post(
           ^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/openai/_base_client.py", line 1790, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/openai/_base_client.py", line 1493, in request
    return await self._request(
           ^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/openai/_base_client.py", line 1546, in _request
    return await self._retry_request(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/openai/_base_client.py", line 1615, in _retry_request
    return await self._request(
           ^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/openai/_base_client.py", line 1546, in _request
    return await self._retry_request(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/openai/_base_client.py", line 1615, in _retry_request
    return await self._request(
           ^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/openai/_base_client.py", line 1556, in _request
    raise APIConnectionError(request=request) from err
openai.APIConnectionError: Connection error.

Solution

Configuring the endpoint manually via the UI fixes the issue.

Context

The config generated from the CLI and from the UI is the same:

$ cat .local/share/jupyter/jupyter_ai/config.json 
{
    "model_provider_id": "openai-chat:gpt-3.5-turbo",
    "embeddings_provider_id": null,
    "send_with_shift_enter": false,
    "fields": {
        "openai-chat:gpt-3.5-turbo": {
            "openai_api_base": "http://localhost:8000"
        }
    },
    "api_keys": {
        "OPENAI_API_KEY": "111-111-111-111"
    },
    "completions_model_provider_id": null,
    "completions_fields": {}
}
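
To rule out the local server itself, the endpoint can be probed directly with the openai client, bypassing Jupyter AI entirely. A minimal sketch, assuming an OpenAI-compatible server is listening on http://localhost:8000 and accepts the same dummy key (the exact base URL, e.g. a trailing /v1, may differ per server):

```python
# Minimal connectivity check against the local OpenAI-compatible endpoint,
# independent of Jupyter AI. Assumes http://localhost:8000 serves the
# chat completions API and accepts the dummy key from the config above.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000", api_key="111-111-111-111")
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)
```

If this call also fails with APIConnectionError, the problem lies with the local endpoint rather than with how the CLI settings are applied.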

Versions:

$ jupyter --version
Selected Jupyter core packages...
IPython          : 8.22.2
ipykernel        : 6.29.3
ipywidgets       : 8.1.2
jupyter_client   : 8.6.1
jupyter_core     : 5.7.2
jupyter_server   : 2.13.0
jupyterlab       : 4.1.5
nbclient         : 0.10.0
nbconvert        : 7.16.3
nbformat         : 5.10.3
notebook         : 7.1.2
qtconsole        : not installed
traitlets        : 5.14.2
$ cat /opt/conda/share/jupyter/labextensions/\@jupyter-ai/core/package.json 
{
  "name": "@jupyter-ai/core",
  "version": "2.15.0",
  ...
}
adamksiezyk added the bug (Something isn't working) label on May 20, 2024

welcome bot commented May 20, 2024

Thank you for opening your first issue in this project! Engagement like this is essential for open source projects! 🤗

If you haven't done so already, check out Jupyter's Code of Conduct. Also, please try to follow the issue template as it helps other community members to contribute more effectively.
You can meet the other Jovyans by joining our Discourse forum. There is also an intro thread there where you can stop by and say Hi! 👋

Welcome to the Jupyter community! 🎉
