
🐛 Bug Report: LLM response tracing not working for the latest version of LangChain-OpenAI #954

Open
tkanhe opened this issue May 3, 2024 · 3 comments · Fixed by #985

Comments

@tkanhe

tkanhe commented May 3, 2024

Which component is this bug for?

Langchain Instrumentation

📜 Description

Getting the following error while tracing OpenAI and AzureOpenAI LLM streaming responses:

Exception while exporting Span.
Traceback (most recent call last):
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\opentelemetry\trace\__init__.py", line 570, in use_span
    yield span
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\opentelemetry\sdk\trace\__init__.py", line 1071, in start_as_current_span
    yield span
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\opentelemetry\instrumentation\langchain\custom_chat_wrapper.py", line 38, in achat_wrapper
    return_value = await wrapped(*args, **kwargs)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain_core\language_models\chat_models.py", line 526, in agenerate
    raise exceptions[0]
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain_core\language_models\chat_models.py", line 707, in _agenerate_with_cache
    result = await self._agenerate(
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain_openai\chat_models\base.py", line 645, in _agenerate
    return await agenerate_from_stream(stream_iter)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain_core\language_models\chat_models.py", line 85, in agenerate_from_stream
    async for chunk in stream:
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain_openai\chat_models\base.py", line 606, in _astream
    async with response:
AttributeError: __aenter__

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\connectionpool.py", line 793, in urlopen
    response = self._make_request(
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\connectionpool.py", line 537, in _make_request
    response = conn.getresponse()
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\connection.py", line 466, in getresponse
    httplib_response = super().getresponse()
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\http\client.py", line 1375, in getresponse
    response.begin()
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\http\client.py", line 318, in begin
    version, status, reason = self._read_status()
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\http\client.py", line 287, in _read_status
    raise RemoteDisconnected("Remote end closed connection without"
http.client.RemoteDisconnected: Remote end closed connection without response

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\requests\adapters.py", line 486, in send
    resp = conn.urlopen(
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\opentelemetry\instrumentation\urllib3\__init__.py", line 224, in instrumented_urlopen
    return wrapped(*args, **kwargs)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\connectionpool.py", line 847, in urlopen
    retries = retries.increment(
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\util\retry.py", line 470, in increment
    raise reraise(type(error), error, _stacktrace)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\util\util.py", line 38, in reraise
    raise value.with_traceback(tb)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\connectionpool.py", line 793, in urlopen
    response = self._make_request(
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\connectionpool.py", line 537, in _make_request
    response = conn.getresponse()
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\connection.py", line 466, in getresponse
    httplib_response = super().getresponse()
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\http\client.py", line 1375, in getresponse
    response.begin()
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\http\client.py", line 318, in begin
    version, status, reason = self._read_status()
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\http\client.py", line 287, in _read_status
    raise RemoteDisconnected("Remote end closed connection without"
urllib3.exceptions.ProtocolError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\opentelemetry\sdk\trace\export\__init__.py", line 113, in on_end
    self.span_exporter.export((span,))
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\opentelemetry\exporter\otlp\proto\http\trace_exporter\__init__.py", line 145, in export
    resp = self._export(serialized_data)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\opentelemetry\exporter\otlp\proto\http\trace_exporter\__init__.py", line 114, in _export
    return self._session.post(
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\requests\sessions.py", line 637, in post
    return self.request("POST", url, data=data, json=json, **kwargs)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\requests\sessions.py", line 589, in request
    resp = self.send(prep, **send_kwargs)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\opentelemetry\instrumentation\requests\__init__.py", line 150, in instrumented_send
    return wrapped_send(self, request, **kwargs)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\requests\sessions.py", line 703, in send
    r = adapter.send(request, **kwargs)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\requests\adapters.py", line 501, in send
    raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
Exception while exporting Span.
Traceback (most recent call last):
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\opentelemetry\trace\__init__.py", line 570, in use_span
    yield span
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\opentelemetry\sdk\trace\__init__.py", line 1071, in start_as_current_span
    yield span
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\opentelemetry\instrumentation\langchain\task_wrapper.py", line 59, in atask_wrapper
    return_value = await wrapped(*args, **kwargs)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain_core\_api\deprecation.py", line 154, in awarning_emitting_wrapper
    return await wrapped(*args, **kwargs)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain\chains\base.py", line 428, in acall
    return await self.ainvoke(
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain\chains\base.py", line 212, in ainvoke
    raise e
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain\chains\base.py", line 203, in ainvoke
    await self._acall(inputs, run_manager=run_manager)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain\chains\llm.py", line 275, in _acall
    response = await self.agenerate([inputs], run_manager=run_manager)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain\chains\llm.py", line 142, in agenerate
    return await self.llm.agenerate_prompt(
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain_core\language_models\chat_models.py", line 566, in agenerate_prompt
    return await self.agenerate(
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\opentelemetry\instrumentation\langchain\custom_chat_wrapper.py", line 38, in achat_wrapper
    return_value = await wrapped(*args, **kwargs)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain_core\language_models\chat_models.py", line 526, in agenerate
    raise exceptions[0]
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain_core\language_models\chat_models.py", line 707, in _agenerate_with_cache
    result = await self._agenerate(
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain_openai\chat_models\base.py", line 645, in _agenerate
    return await agenerate_from_stream(stream_iter)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain_core\language_models\chat_models.py", line 85, in agenerate_from_stream
    async for chunk in stream:
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain_openai\chat_models\base.py", line 606, in _astream
    async with response:
AttributeError: __aenter__

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\connectionpool.py", line 793, in urlopen
    response = self._make_request(
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\connectionpool.py", line 537, in _make_request
    response = conn.getresponse()
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\connection.py", line 466, in getresponse
    httplib_response = super().getresponse()
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\http\client.py", line 1375, in getresponse
    response.begin()
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\http\client.py", line 318, in begin
    version, status, reason = self._read_status()
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\http\client.py", line 287, in _read_status
    raise RemoteDisconnected("Remote end closed connection without"
http.client.RemoteDisconnected: Remote end closed connection without response

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\requests\adapters.py", line 486, in send
    resp = conn.urlopen(
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\opentelemetry\instrumentation\urllib3\__init__.py", line 224, in instrumented_urlopen
    return wrapped(*args, **kwargs)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\connectionpool.py", line 847, in urlopen
    retries = retries.increment(
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\util\retry.py", line 470, in increment
    raise reraise(type(error), error, _stacktrace)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\util\util.py", line 38, in reraise
    raise value.with_traceback(tb)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\connectionpool.py", line 793, in urlopen
    response = self._make_request(
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\connectionpool.py", line 537, in _make_request
    response = conn.getresponse()
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\connection.py", line 466, in getresponse
    httplib_response = super().getresponse()
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\http\client.py", line 1375, in getresponse
    response.begin()
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\http\client.py", line 318, in begin
    version, status, reason = self._read_status()
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\http\client.py", line 287, in _read_status
    raise RemoteDisconnected("Remote end closed connection without"
urllib3.exceptions.ProtocolError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\opentelemetry\sdk\trace\export\__init__.py", line 113, in on_end
    self.span_exporter.export((span,))
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\opentelemetry\exporter\otlp\proto\http\trace_exporter\__init__.py", line 145, in export
    resp = self._export(serialized_data)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\opentelemetry\exporter\otlp\proto\http\trace_exporter\__init__.py", line 114, in _export
    return self._session.post(
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\requests\sessions.py", line 637, in post
    return self.request("POST", url, data=data, json=json, **kwargs)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\requests\sessions.py", line 589, in request
    resp = self.send(prep, **send_kwargs)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\opentelemetry\instrumentation\requests\__init__.py", line 150, in instrumented_send
    return wrapped_send(self, request, **kwargs)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\requests\sessions.py", line 703, in send
    r = adapter.send(request, **kwargs)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\requests\adapters.py", line 501, in send
    raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\uvicorn\protocols\http\httptools_impl.py", line 411, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\uvicorn\middleware\proxy_headers.py", line 69, in __call__
    return await self.app(scope, receive, send)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\fastapi\applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\starlette\applications.py", line 123, in __call__
    await self.middleware_stack(scope, receive, send)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\starlette\middleware\errors.py", line 186, in __call__
    raise exc
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\starlette\middleware\errors.py", line 164, in __call__
    await self.app(scope, receive, _send)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\starlette\middleware\cors.py", line 93, in __call__
    await self.simple_response(scope, receive, send, request_headers=headers)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\starlette\middleware\cors.py", line 148, in simple_response
    await self.app(scope, receive, send)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\starlette\middleware\exceptions.py", line 65, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\starlette\_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\starlette\_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\starlette\routing.py", line 756, in __call__
    await self.middleware_stack(scope, receive, send)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\starlette\routing.py", line 776, in app
    await route.handle(scope, receive, send)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\starlette\routing.py", line 297, in handle
    await self.app(scope, receive, send)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\starlette\routing.py", line 77, in app
    await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\starlette\_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\starlette\_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\starlette\routing.py", line 75, in app
    await response(scope, receive, send)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\starlette\responses.py", line 258, in __call__
    async with anyio.create_task_group() as task_group:
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\anyio\_backends\_asyncio.py", line 597, in __aexit__
    raise exceptions[0]
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\starlette\responses.py", line 261, in wrap
    await func()
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\starlette\responses.py", line 250, in stream_response
    async for chunk in self.body_iterator:
  File "C:\Users\Karini AI\Desktop\LangChain-FastAPI-Streaming - Copy - Copy\tr.py", line 58, in run_llm
    await task
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain\chains\base.py", line 212, in ainvoke
    raise e
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain\chains\base.py", line 203, in ainvoke
    await self._acall(inputs, run_manager=run_manager)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain\chains\combine_documents\base.py", line 153, in _acall
    output, extra_return_dict = await self.acombine_docs(
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain\chains\combine_documents\stuff.py", line 262, in acombine_docs
    return await self.llm_chain.apredict(callbacks=callbacks, **inputs), {}
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain\chains\llm.py", line 310, in apredict
    return (await self.acall(kwargs, callbacks=callbacks))[self.output_key]
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\opentelemetry\instrumentation\langchain\task_wrapper.py", line 59, in atask_wrapper
    return_value = await wrapped(*args, **kwargs)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain_core\_api\deprecation.py", line 154, in awarning_emitting_wrapper
    return await wrapped(*args, **kwargs)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain\chains\base.py", line 428, in acall
    return await self.ainvoke(
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain\chains\base.py", line 212, in ainvoke
    raise e
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain\chains\base.py", line 203, in ainvoke
    await self._acall(inputs, run_manager=run_manager)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain\chains\llm.py", line 275, in _acall
    response = await self.agenerate([inputs], run_manager=run_manager)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain\chains\llm.py", line 142, in agenerate
    return await self.llm.agenerate_prompt(
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain_core\language_models\chat_models.py", line 566, in agenerate_prompt
    return await self.agenerate(
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\opentelemetry\instrumentation\langchain\custom_chat_wrapper.py", line 38, in achat_wrapper
    return_value = await wrapped(*args, **kwargs)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain_core\language_models\chat_models.py", line 526, in agenerate
    raise exceptions[0]
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain_core\language_models\chat_models.py", line 707, in _agenerate_with_cache
    result = await self._agenerate(
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain_openai\chat_models\base.py", line 645, in _agenerate
    return await agenerate_from_stream(stream_iter)
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain_core\language_models\chat_models.py", line 85, in agenerate_from_stream
    async for chunk in stream:
  File "C:\Users\Karini AI\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain_openai\chat_models\base.py", line 606, in _astream
    async with response:
AttributeError: __aenter__
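The failing line in the traces is `async with response:` in langchain-openai's `_astream`, which requires the response object to implement the asynchronous context-manager protocol. The sketch below is not the instrumentation code itself, just a minimal, hypothetical reproduction of the failure mode: if a wrapper hands back a plain async generator instead of the original response object, `async with` fails exactly as reported.

```python
import asyncio

async def agen():
    # stands in for the object a wrapper might return in place of the
    # original OpenAI streaming response (hypothetical stand-in)
    yield "chunk"

async def demo():
    g = agen()
    try:
        # langchain-openai 0.1.x does `async with response:` in _astream;
        # an async generator has no __aenter__, so this fails.
        # Python 3.10 raises AttributeError: __aenter__ (as in the report);
        # Python 3.11+ raises TypeError with a similar message.
        async with g:
            pass
    except (AttributeError, TypeError) as e:
        return str(e)
    finally:
        await g.aclose()
    return None

if __name__ == "__main__":
    print(asyncio.run(demo()))
```

Running this on Python 3.10 prints an error mentioning `__aenter__`, matching the bottom frame of every trace above.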

👣 Reproduction steps

Running the following code produces the error above:

import asyncio
import json

from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware
from fastapi.responses import StreamingResponse
from langchain.callbacks import AsyncIteratorCallbackHandler
from langchain.chains.question_answering import load_qa_chain
from langchain.docstore.document import Document
from langchain_openai import ChatOpenAI
from langchain_openai import AzureChatOpenAI
from prompt import prompt_template, variables
from pydantic import BaseModel
from traceloop.sdk import Traceloop

Traceloop.init(app_name="test", disable_batch=True, api_endpoint="http://localhost:4318")
app = FastAPI()

class Message(BaseModel):
    content: str

class LLM:
    def __init__(self):
        self.callback = AsyncIteratorCallbackHandler()
        # self.model = ChatOpenAI(
        #     model_name="gpt-3.5-turbo",
        #     temperature=0.1,
        #     max_tokens=100,
        #     streaming=True,
        #     callbacks=[self.callback],
        # )

        # api_key, api_base, and deployment_name are defined elsewhere (omitted in the report)
        self.model = AzureChatOpenAI(
            openai_api_key=api_key,
            azure_endpoint=api_base,
            deployment_name=deployment_name,
            temperature=0.1,
            max_tokens=50,
            openai_api_version="2024-02-15-preview",
            streaming=True,
            callbacks=[self.callback],
        )
    async def run_llm(self, question: str):
        yield "Answer:\n"
        chain = load_qa_chain(self.model, chain_type="stuff", prompt=prompt_template, verbose=False)
        task = asyncio.create_task(chain.ainvoke({"question": question, "input_documents": [Document(page_content=variables["context"])]}))
        try:
            final_answer = ""
            async for token in self.callback.aiter():
                final_answer += token
                yield token
            response = {"batch_status": "success", "response": {"llm_output": final_answer, "question": question}}
            yield f"####{json.dumps(response).strip()}"
        except Exception as e:
            print(f"Caught exception: {e}")
        finally:
            self.callback.done.set()
        await task

@app.post("/stream_chat")
def stream_chat(message: Message):
    llm = LLM()
    llm_generator = llm.run_llm(message.content)
    return StreamingResponse(llm_generator, media_type="text/event-stream")

👍 Expected behavior

The above code should work with both the latest and earlier versions of langchain-openai.

👎 Actual Behavior with Screenshots

Downgrading the langchain-openai package from 0.1.6 (latest at the time) to 0.0.8 resolves the issue.
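Until a fixed release is available, pinning the older version reported to work is a reasonable workaround (version number taken from this report):

```shell
# pin the last langchain-openai version reported to stream correctly
# under the Traceloop/OpenTelemetry instrumentation
pip install "langchain-openai==0.0.8"
```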

馃 Python Version

python 3.10

langchain==0.1.17
langchainhub==0.1.15
traceloop-sdk==0.17.1

@nirga (Member)

nirga commented May 6, 2024

Thanks @tkanhe! I've managed to reproduce the issue and am fixing it now.

@mrctito

mrctito commented Jun 25, 2024

Hello everyone, I'm using the latest version of langchain-openai and this error still occurs. Which release was it resolved in?

This error occurs when I execute this command:

result = await agent_executor.ainvoke(context, return_only_outputs=True)

Thank you

@nirga (Member)

nirga commented Jun 26, 2024

Hey @mrctito, thanks for reporting; we'll look into this ASAP. Can you provide a sample reproduction of this issue? We have tests for ainvoke, and it should be working properly since #1227 was fixed.

@nirga nirga reopened this Jun 26, 2024
3 participants