
[Bug]: Azure OpenAI: 'NoneType' object has no attribute 'tool_calls' #16570

Closed
deloz opened this issue Oct 16, 2024 · 1 comment · Fixed by #16636
Labels
bug Something isn't working triage Issue needs to be triaged/prioritized

Comments

deloz (Contributor) commented Oct 16, 2024

Bug Description

I'm using Azure OpenAI with content filtering and Asynchronous Filtering enabled. When streaming is disabled, everything works fine. However, when I enable streaming, I encounter the following error:

AttributeError: 'NoneType' object has no attribute 'tool_calls'

I've searched online and found a couple of similar issues.

Version

0.11.18

Steps to Reproduce

import logging
import sys

from llama_index.core import Settings, SimpleDirectoryReader, VectorStoreIndex
from llama_index.embeddings.azure_openai import AzureOpenAIEmbedding
from llama_index.llms.azure_openai import AzureOpenAI

logging.basicConfig(
    stream=sys.stdout, level=logging.DEBUG
)  # logging.DEBUG for more verbose output

logging.getLogger().addHandler(logging.StreamHandler(stream=sys.stdout))

api_key = "zzzzzzzzzz"
azure_endpoint = "https://yyyyyyyyyyyeast.openai.azure.com"
api_version = "2024-08-01-preview"

llm = AzureOpenAI(
    model="gpt-4o",
    deployment_name="gpt-4o",
    api_key=api_key,
    azure_endpoint=azure_endpoint,
    api_version=api_version,
)

# You need to deploy your own embedding model as well as your own chat completion model
embed_model = AzureOpenAIEmbedding(
    model="text-embedding-3-small",
    deployment_name="text-embedding-3-small",
    api_key=api_key,
    azure_endpoint=azure_endpoint,
    api_version=api_version,
)

Settings.llm = llm
Settings.embed_model = embed_model

documents = SimpleDirectoryReader("./data/paul_graham").load_data()

index = VectorStoreIndex.from_documents(documents)

chat_engine = index.as_chat_engine(chat_mode="condense_question", streaming=True)
response_stream = chat_engine.stream_chat("What did Paul Graham do after YC?")

Relevant Logs/Tracebacks

DEBUG:openai._base_client:request_id: 8453499e-e0ca-4c7e-b5cb-911d28283a2a
DEBUG:httpcore.http11:receive_response_body.started request=<Request [b'POST']>
Exception in thread Thread-1 (run):
Traceback (most recent call last):
  File "/home/deloz/.pyenv/versions/3.12.7/lib/python3.12/threading.py", line 1075, in _bootstrap_inner
    self.run()
  File "/home/deloz/.pyenv/versions/3.12.7/lib/python3.12/threading.py", line 1012, in run
    self._target(*self._args, **self._kwargs)
  File "/www/wwwroot/test-ai/.venv/lib/python3.12/site-packages/llama_index/core/instrumentation/dispatcher.py", line 307, in wrapper
    result = func(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^
  File "/www/wwwroot/test-ai/.venv/lib/python3.12/site-packages/llama_index/core/chat_engine/types.py", line 174, in write_response_to_history
    for chat in self.chat_stream:
                ^^^^^^^^^^^^^^^^
  File "/www/wwwroot/test-ai/.venv/lib/python3.12/site-packages/llama_index/core/chat_engine/utils.py", line 50, in response_gen_from_query_engine
    for token in response_gen:
                 ^^^^^^^^^^^^
  File "/www/wwwroot/test-ai/.venv/lib/python3.12/site-packages/llama_index/core/llms/llm.py", line 126, in gen
    for response in chat_response_gen:
                    ^^^^^^^^^^^^^^^^^
  File "/www/wwwroot/test-ai/.venv/lib/python3.12/site-packages/llama_index/core/llms/callbacks.py", line 186, in wrapped_gen
    for x in f_return_val:
             ^^^^^^^^^^^^
  File "/www/wwwroot/test-ai/.venv/lib/python3.12/site-packages/llama_index/llms/openai/base.py", line 482, in gen
    if delta.tool_calls:
       ^^^^^^^^^^^^^^^^
AttributeError: 'NoneType' object has no attribute 'tool_calls'
DEBUG:httpcore.http11:receive_response_body.failed exception=GeneratorExit()
DEBUG:httpcore.http11:response_closed.started
DEBUG:httpcore.http11:response_closed.complete
@deloz deloz added bug Something isn't working triage Issue needs to be triaged/prioritized labels Oct 16, 2024

dosubot bot commented Oct 16, 2024

Hello @deloz! I'm here to help you out with any bugs, questions, or contributions you might have. Let's tackle this together!

I found a similar closed issue related to the 'NoneType' object has no attribute 'tool_calls' error when using Azure OpenAI with streaming enabled. The issue was discussed in this closed issue, and it was resolved by a pull request that ensured the delta object is properly initialized and checked before accessing its attributes [1].

To resolve this error, make sure that the delta object is not None before accessing the tool_calls attribute. Here's a relevant code snippet that includes the necessary checks:

# Ensure delta is not None before accessing its attributes
if delta and delta.tool_calls:
    is_function = True

This check should help prevent the 'NoneType' object has no attribute 'tool_calls' error when streaming is enabled [2].
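The guard can be seen end to end in a minimal sketch of the streaming loop. This again uses `SimpleNamespace` stand-ins rather than the real SDK objects, and `collect_text` is a hypothetical helper, not llama-index code:

```python
from types import SimpleNamespace

# Hypothetical chunk deltas standing in for the openai SDK's streamed output.
# The middle entry models an Azure async-content-filter chunk with no delta.
chunks = [
    SimpleNamespace(delta=SimpleNamespace(content="Hel", tool_calls=None)),
    SimpleNamespace(delta=None),
    SimpleNamespace(delta=SimpleNamespace(content="lo", tool_calls=None)),
]

def collect_text(stream):
    """Accumulate streamed text, skipping chunks whose delta is missing."""
    parts = []
    for chunk in stream:
        delta = chunk.delta
        if delta is None:        # the guard the fix adds
            continue
        if delta.tool_calls:     # safe now: delta is known to be non-None
            pass                 # function-calling branch elided
        if delta.content:
            parts.append(delta.content)
    return "".join(parts)

print(collect_text(chunks))  # Hello
```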

