AzureChatOpenAI ignores client given, resulting in connection errors (behind proxy). #21660

Waffleboy opened this issue May 14, 2024 · 0 comments
Labels
🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature 🔌: openai Primarily related to OpenAI integrations

Comments


Waffleboy commented May 14, 2024

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

from langchain_openai import AzureChatOpenAI
import httpx

PROXY = "PROXY_IP:PORT"  # redacted
deployment_name = "GPT4_MODEL"  # redacted
base_url = "https://<azure_url>/openai/deployments/<deployment_name>/"  # redacted
OPENAI_API_VERSION = "2024-02-15-preview"
OPENAI_API_KEY = "api_key"  # redacted

client = httpx.Client(proxy=PROXY, verify=False, follow_redirects=True)
model = AzureChatOpenAI(
    base_url=base_url,
    openai_api_version=OPENAI_API_VERSION,
    openai_api_key=OPENAI_API_KEY,
    temperature=0,
    client=client,
)

model.invoke("test")

Error Message and Stack Trace (if applicable)

DEBUG [2024-05-14 09:18:24] openai._base_client - Encountered Exception

Traceback (most recent call last):
  File "/opt/conda/lib/python3.10/site-packages/openai/_base_client.py", line 926, in _request
    response = self._client.send(
  File "/opt/conda/lib/python3.10/site-packages/httpx/_client.py", line 914, in send
    response = self._send_handling_auth(
  File "/opt/conda/lib/python3.10/site-packages/httpx/_client.py", line 942, in _send_handling_auth
    response = self._send_handling_redirects(
  File "/opt/conda/lib/python3.10/site-packages/httpx/_client.py", line 979, in _send_handling_redirects
    response = self._send_single_request(request)
  File "/opt/conda/lib/python3.10/site-packages/httpx/_client.py", line 1015, in _send_single_request
    response = transport.handle_request(request)
  File "/opt/conda/lib/python3.10/site-packages/httpx/_transports/default.py", line 232, in handle_request
    with map_httpcore_exceptions():
  File "/opt/conda/lib/python3.10/contextlib.py", line 153, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/opt/conda/lib/python3.10/site-packages/httpx/_transports/default.py", line 86, in map_httpcore_exceptions
    raise mapped_exc(message) from exc
httpx.ConnectError: [Errno -2] Name or service not known
DEBUG [2024-05-14 09:18:24] openai._base_client - 0 retries left
INFO [2024-05-14 09:18:24] openai._base_client - Retrying request to /chat/completions in 1.724038 seconds
DEBUG [2024-05-14 09:18:26] openai._base_client - Request options: {'method': 'post', 'url': '/chat/completions', 'headers': {'api-key': 'REDACTED'}, 'files': None, 'json_data': {'messages': [{'role': 'user', 'content': 'test'}], 'model': 'gpt-3.5-turbo', 'n': 1, 'stream': False, 'temperature': 0.0}}
DEBUG [2024-05-14 09:18:26] httpcore.connection - connect_tcp.started host='BASE_URL' port=443 local_address=None timeout=None socket_options=None
DEBUG [2024-05-14 09:18:26] httpcore.connection - connect_tcp.failed exception=ConnectError(gaierror(-2, 'Name or service not known'))
DEBUG [2024-05-14 09:18:26] openai._base_client - Encountered Exception
Traceback (most recent call last):
  File "/opt/conda/lib/python3.10/site-packages/httpx/_transports/default.py", line 69, in map_httpcore_exceptions
    yield
  File "/opt/conda/lib/python3.10/site-packages/httpx/_transports/default.py", line 233, in handle_request
    resp = self._pool.handle_request(req)
  File "/opt/conda/lib/python3.10/site-packages/httpcore/_sync/connection_pool.py", line 216, in handle_request
    raise exc from None
  File "/opt/conda/lib/python3.10/site-packages/httpcore/_sync/connection_pool.py", line 196, in handle_request
    response = connection.handle_request(
  File "/opt/conda/lib/python3.10/site-packages/httpcore/_sync/connection.py", line 99, in handle_request
    raise exc
  File "/opt/conda/lib/python3.10/site-packages/httpcore/_sync/connection.py", line 76, in handle_request
    stream = self._connect(request)
  File "/opt/conda/lib/python3.10/site-packages/httpcore/_sync/connection.py", line 122, in _connect
    stream = self._network_backend.connect_tcp(**kwargs)
  File "/opt/conda/lib/python3.10/site-packages/httpcore/_backends/sync.py", line 205, in connect_tcp
    with map_exceptions(exc_map):
  File "/opt/conda/lib/python3.10/contextlib.py", line 153, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/opt/conda/lib/python3.10/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
    raise to_exc(exc) from exc
httpcore.ConnectError: [Errno -2] Name or service not known

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/opt/conda/lib/python3.10/site-packages/openai/_base_client.py", line 926, in _request
    response = self._client.send(
  File "/opt/conda/lib/python3.10/site-packages/httpx/_client.py", line 914, in send
    response = self._send_handling_auth(
  File "/opt/conda/lib/python3.10/site-packages/httpx/_client.py", line 942, in _send_handling_auth
    response = self._send_handling_redirects(
  File "/opt/conda/lib/python3.10/site-packages/httpx/_client.py", line 979, in _send_handling_redirects
    response = self._send_single_request(request)
  File "/opt/conda/lib/python3.10/site-packages/httpx/_client.py", line 1015, in _send_single_request
    response = transport.handle_request(request)
  File "/opt/conda/lib/python3.10/site-packages/httpx/_transports/default.py", line 232, in handle_request
    with map_httpcore_exceptions():
  File "/opt/conda/lib/python3.10/contextlib.py", line 153, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/opt/conda/lib/python3.10/site-packages/httpx/_transports/default.py", line 86, in map_httpcore_exceptions
    raise mapped_exc(message) from exc
httpx.ConnectError: [Errno -2] Name or service not known
DEBUG [2024-05-14 09:18:26] openai._base_client - Raising connection error
Traceback (most recent call last):
  File "/opt/conda/lib/python3.10/site-packages/httpx/_transports/default.py", line 69, in map_httpcore_exceptions
    yield
  File "/opt/conda/lib/python3.10/site-packages/httpx/_transports/default.py", line 233, in handle_request
    resp = self._pool.handle_request(req)
  File "/opt/conda/lib/python3.10/site-packages/httpcore/_sync/connection_pool.py", line 216, in handle_request
    raise exc from None
  File "/opt/conda/lib/python3.10/site-packages/httpcore/_sync/connection_pool.py", line 196, in handle_request
    response = connection.handle_request(
  File "/opt/conda/lib/python3.10/site-packages/httpcore/_sync/connection.py", line 99, in handle_request
    raise exc
  File "/opt/conda/lib/python3.10/site-packages/httpcore/_sync/connection.py", line 76, in handle_request
    stream = self._connect(request)
  File "/opt/conda/lib/python3.10/site-packages/httpcore/_sync/connection.py", line 122, in _connect
    stream = self._network_backend.connect_tcp(**kwargs)
  File "/opt/conda/lib/python3.10/site-packages/httpcore/_backends/sync.py", line 205, in connect_tcp
    with map_exceptions(exc_map):
  File "/opt/conda/lib/python3.10/contextlib.py", line 153, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/opt/conda/lib/python3.10/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
    raise to_exc(exc) from exc
httpcore.ConnectError: [Errno -2] Name or service not known

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/opt/conda/lib/python3.10/site-packages/openai/_base_client.py", line 926, in _request
    response = self._client.send(
  File "/opt/conda/lib/python3.10/site-packages/httpx/_client.py", line 914, in send
    response = self._send_handling_auth(
  File "/opt/conda/lib/python3.10/site-packages/httpx/_client.py", line 942, in _send_handling_auth
    response = self._send_handling_redirects(
  File "/opt/conda/lib/python3.10/site-packages/httpx/_client.py", line 979, in _send_handling_redirects
    response = self._send_single_request(request)
  File "/opt/conda/lib/python3.10/site-packages/httpx/_client.py", line 1015, in _send_single_request
    response = transport.handle_request(request)
  File "/opt/conda/lib/python3.10/site-packages/httpx/_transports/default.py", line 232, in handle_request
    with map_httpcore_exceptions():
  File "/opt/conda/lib/python3.10/contextlib.py", line 153, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/opt/conda/lib/python3.10/site-packages/httpx/_transports/default.py", line 86, in map_httpcore_exceptions
    raise mapped_exc(message) from exc
httpx.ConnectError: [Errno -2] Name or service not known

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/opt/conda/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 173, in invoke
    self.generate_prompt(
  File "/opt/conda/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 571, in generate_prompt
    return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
  File "/opt/conda/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 434, in generate
    raise e
  File "/opt/conda/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 424, in generate
    self._generate_with_cache(
  File "/opt/conda/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 608, in _generate_with_cache
    result = self._generate(
  File "/opt/conda/lib/python3.10/site-packages/langchain_openai/chat_models/base.py", line 462, in _generate
    response = self.client.create(messages=message_dicts, **params)
  File "/opt/conda/lib/python3.10/site-packages/openai/_utils/_utils.py", line 275, in wrapper
    return func(*args, **kwargs)
  File "/opt/conda/lib/python3.10/site-packages/openai/resources/chat/completions.py", line 667, in create
    return self._post(
  File "/opt/conda/lib/python3.10/site-packages/openai/_base_client.py", line 1208, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "/opt/conda/lib/python3.10/site-packages/openai/_base_client.py", line 897, in request
    return self._request(
  File "/opt/conda/lib/python3.10/site-packages/openai/_base_client.py", line 950, in _request
    return self._retry_request(
  File "/opt/conda/lib/python3.10/site-packages/openai/_base_client.py", line 1021, in _retry_request
    return self._request(
  File "/opt/conda/lib/python3.10/site-packages/openai/_base_client.py", line 950, in _request
    return self._retry_request(
  File "/opt/conda/lib/python3.10/site-packages/openai/_base_client.py", line 1021, in _retry_request
    return self._request(
  File "/opt/conda/lib/python3.10/site-packages/openai/_base_client.py", line 960, in _request
    raise APIConnectionError(request=request) from err
openai.APIConnectionError: Connection error.

Description

The openai Python library accepts a custom httpx client (the http_client parameter), which lets you configure proxy settings and disable SSL verification.

The LangChain abstraction ignores the client it is given and builds a default one instead, so requests never go through the proxy and the connection fails.

For example, this is the equivalent call through the openai library directly, which works:

import httpx
from openai import AzureOpenAI

PROXY = "PROXY_IP:PORT"  # redacted
AZURE_BASE = "insert base url here"  # redacted
deployment_name = "gpt4model"  # redacted
OPENAI_API_VERSION = "2024-02-15-preview"
OPENAI_API_KEY = "key"  # redacted

http_client = httpx.Client(proxy=PROXY, verify=False, follow_redirects=True)
base_url = f"https://{AZURE_BASE}/openai/deployments/{deployment_name}"

client = AzureOpenAI(
    api_key=OPENAI_API_KEY,
    api_version=OPENAI_API_VERSION,
    base_url=base_url,
    http_client=http_client,
)
client.chat.completions.create(
    model=deployment_name,
    messages=[{"role": "user", "content": "test"}],
)

Why?

After turning on debug logging for httpx, I discovered that the final client used by the LangChain abstraction is a new one, presumably created during initialization. The client parameter passed in is lost somewhere along the way.
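The debug logging used above can be enabled with the standard logging module; the logger names below are the ones the httpx, httpcore, and openai packages emit under (the format string is an approximation of the output shown in these logs):

```python
import logging

# Emit DEBUG-level records, including httpx/httpcore connection attempts,
# which reveal the host each request actually connects to.
logging.basicConfig(
    level=logging.DEBUG,
    format="%(levelname)s [%(asctime)s] %(name)s - %(message)s",
    datefmt="%Y-%m-%d %H:%M:%S",
)
for name in ("httpx", "httpcore", "openai"):
    logging.getLogger(name).setLevel(logging.DEBUG)
```

With this in place, the `connect_tcp.started host=...` lines show whether traffic goes to the proxy or straight to the base URL.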

Result from langchain client

The model parameter is wrong (it should be deployment_name), and the host it connects to is the base URL instead of my proxy URL.

INFO [2024-05-14 09:18:23] openai._base_client - Retrying request to /chat/completions in 0.873755 seconds
DEBUG [2024-05-14 09:18:24] openai._base_client - Request options: {'method': 'post', 'url': '/chat/completions', 'headers': {'api-key': API_KEY}, 'files': None, 'json_data': {'messages': [{'role': 'user', 'content': 'test'}], 'model': 'gpt-3.5-turbo', 'n': 1, 'stream': False, 'temperature': 0.0}}
DEBUG [2024-05-14 09:18:24] httpcore.connection - connect_tcp.started host='BASE_URL' port=443 local_address=None timeout=None socket_options=None
DEBUG [2024-05-14 09:18:24] httpcore.connection - connect_tcp.failed exception=ConnectError(gaierror(-2, 'Name or service not known'))

Result from openai client (correct)

Observe the differences in the model and host parameters: they are correctly set to deployment_name and the proxy URL.

>>> client.chat.completions.create(model=deployment_name,messages=[{"role":"user","content":"test"}])
DEBUG [2024-05-14 09:47:53] openai._base_client - Request options: {'method': 'post', 'url': '/chat/completions', 'headers': {'api-key': API_KEY}, 'files': None, 'json_data': {'messages': [{'role': 'user', 'content': 'test'}], 'model': 'gpt4model'}}
DEBUG [2024-05-14 09:47:53] httpcore.connection - connect_tcp.started host='PROXY_IP' port=PROXY_PORT local_address=None timeout=5.0 socket_options=None
DEBUG [2024-05-14 09:47:53] httpcore.connection - connect_tcp.complete return_value=<httpcore._backends.sync.SyncStream object at 0x7f6da6f3e110>

How to fix?

Honestly, I have no idea. There are too many layers of abstraction going on here; the client parameter is being dropped somewhere down the line.

I poked into AzureChatOpenAI and saw validate_environment, but I don't see where it is called.

Digging into BaseChatOpenAI and BaseChatModel didn't do much good either.

How I fixed this on my end is a major hack: replacing the final client used with my httpx client after initialization.

from langchain_openai import AzureChatOpenAI
import httpx

base_url = "url"  # redacted
OPENAI_API_VERSION = "2024-02-15-preview"
OPENAI_API_KEY = "api_key"  # redacted
client = httpx.Client(proxy="proxy", verify=False, follow_redirects=True)

model = AzureChatOpenAI(
    base_url=base_url,
    openai_api_version=OPENAI_API_VERSION,
    openai_api_key=OPENAI_API_KEY,
    temperature=0,
    client=client,
)

# Replace the SyncHttpxClientWrapper client with my own httpx instance.
model.client._client._client = client

model.invoke("this works")
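For what it's worth, the failure mode described here boils down to a pattern like the following. This is a plain-Python sketch with made-up class names, not LangChain internals: a wrapper accepts a client kwarg, but a later validation step unconditionally rebuilds the client and discards the caller's transport.

```python
class FakeHttpClient:
    """Stands in for the caller's proxy-aware httpx.Client."""
    def __init__(self, proxy=None):
        self.proxy = proxy


class DefaultSDKClient:
    """Stands in for the default client the wrapper builds itself."""
    def __init__(self):
        self.proxy = None  # no proxy configured -> direct connection


class ChatWrapper:
    """Hypothetical wrapper illustrating the reported behavior."""
    def __init__(self, client=None):
        self.client = client
        self._validate_environment()

    def _validate_environment(self):
        # The wrapper rebuilds its client unconditionally, so the
        # caller-supplied one (with proxy settings) is silently dropped.
        self.client = DefaultSDKClient()


proxied = FakeHttpClient(proxy="http://proxy:8080")
wrapper = ChatWrapper(client=proxied)
print(wrapper.client is proxied)  # False: the proxy-aware client was replaced
print(wrapper.client.proxy)       # None
```

The post-init reassignment hack above works because it reaches past the rebuilt client and swaps the transport back in after validation has already run.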

System Info

langchain==0.1.12
langchain-community==0.0.28
langchain-core==0.1.52
langchain-experimental==0.0.40
langchain-openai==0.1.6
langchain-text-splitters==0.0.1
langchainhub==0.1.15

Windows (via WSL)
Python 3.10
