
Fix: expose streaming parameter from langchain PortkeyLLM and ChatPortkey #64

Closed
VisargD opened this issue Jan 8, 2024 · 1 comment · Fixed by #65


VisargD commented Jan 8, 2024

Problem:
The latest changes to LangChainLLM (from LlamaIndex) expect a streaming parameter on the wrapped LLM to determine whether the LLM supports stream mode. PortkeyLLM and ChatPortkey do not expose a streaming parameter, which causes an "LLM must support streaming" error.

Solution:
Expose the streaming parameter from PortkeyLLM and ChatPortkey, matching the LangChain LLM interface.

Reference for the LlamaIndex LangChainLLM:
https://github.com/run-llama/llama_index/blob/fc290bb7425e4d28a4cb05bfe09879367ea6ea0a/llama_index/llms/langchain.py#L111
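The guard described above can be illustrated with a hedged, self-contained sketch. The class and function names below are stand-ins, not the actual SDK or LlamaIndex code: the point is that the wrapper looks for a streaming attribute on the wrapped LLM and raises if it is missing or false, so exposing that attribute on the Portkey classes resolves the error.

```python
# Hedged sketch (illustrative names only, not real SDK/LlamaIndex code).

class ChatPortkeySketch:
    """Stand-in for ChatPortkey; the real class wraps the Portkey client."""

    def __init__(self, streaming: bool = False):
        # Exposing `streaming` lets wrappers detect stream support.
        self.streaming = streaming


def stream_complete(llm):
    """Mimics the wrapper's guard: raise if the LLM lacks streaming support."""
    if not getattr(llm, "streaming", False):
        raise ValueError("LLM must support streaming.")
    return "streaming enabled"


llm = ChatPortkeySketch(streaming=True)
print(stream_complete(llm))  # streaming enabled
```

An LLM that never sets the attribute (or sets it to False) hits the same "LLM must support streaming" error the issue reports.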


dosubot bot commented Jan 8, 2024

🤖

Hello @VisargD!

I'm Dosu, an AI bot here to assist you with bugs, answer your questions, and guide you through contributing to the Portkey Python SDK. While we're waiting for a human maintainer, I'm here to help make your experience smoother. Looking forward to working with you!

Sure, exposing the streaming parameter from the LangChain-based PortkeyLLM and ChatPortkey classes in the Portkey Python SDK would indeed be beneficial. This would allow users to choose whether they want to stream the results or not.

In the current implementation, the _call method has stream=False and the _stream method has stream=True. Exposing this parameter would allow users to choose the method of result retrieval based on their use case.

Here's how you can modify the _call and _stream methods to expose the streaming parameter:

from typing import Any, Iterator, List, Optional, cast

from langchain.callbacks.manager import CallbackManagerForLLMRun
from langchain.schema.messages import AIMessageChunk, BaseMessage
from langchain.schema.output import ChatGenerationChunk

# `Message` is the Portkey SDK's message type, already imported in chat.py.

def _call(
    self,
    messages: List[BaseMessage],
    stop: Optional[List[str]] = None,
    run_manager: Optional[CallbackManagerForLLMRun] = None,
    stream: Optional[bool] = False,
    **kwargs: Any,
) -> str:
    # Non-streaming path: return the full completion as a single string.
    _messages = cast(Message, self._create_message_dicts(messages))
    response = self._client.chat.completions.create(
        messages=_messages, stream=stream, stop=stop, **kwargs
    )
    message = response.choices[0].message
    return message.get("content", "") if message else ""

def _stream(
    self,
    messages: List[BaseMessage],
    stop: Optional[List[str]] = None,
    run_manager: Optional[CallbackManagerForLLMRun] = None,
    stream: Optional[bool] = True,
    **kwargs: Any,
) -> Iterator[ChatGenerationChunk]:
    # Streaming path: yield one ChatGenerationChunk per token delta.
    _messages = cast(Message, self._create_message_dicts(messages))
    response = self._client.chat.completions.create(
        messages=_messages, stream=stream, stop=stop, **kwargs
    )
    for token in response:
        _content = token.choices[0].delta.get("content") or ""
        chunk = ChatGenerationChunk(message=AIMessageChunk(content=_content))
        yield chunk
        if run_manager:
            run_manager.on_llm_new_token(chunk.text, chunk=chunk)

This way, users can pass stream=True or stream=False when calling these methods. You can find the relevant code in the portkey_ai/llms/langchain/chat.py file.
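The caller-side choice between the two paths can be sketched in a few lines. This is a hedged, self-contained illustration with stand-in callables rather than the real methods: a streaming flag selects between a single full response and an iterator of chunks that is joined back together.

```python
# Hedged sketch: `call_fn` and `stream_fn` stand in for _call and _stream.

def generate(call_fn, stream_fn, streaming: bool = False):
    if streaming:
        # Streaming path: consume the chunk iterator and join the pieces.
        return "".join(stream_fn())
    # Non-streaming path: return the single full response.
    return call_fn()

full = generate(lambda: "Hello, world", None, streaming=False)
chunked = generate(None, lambda: iter(["Hello", ", ", "world"]), streaming=True)
print(full == chunked)  # True
```

Either path yields the same final text; the flag only changes whether it arrives incrementally.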

As for the PortkeyLLM class, it's not defined in the provided context, so I can't provide information about its structure. If you could provide more details about it, I'd be able to give a more accurate answer.

Regarding potential dependencies or side effects, without more information about the overall structure and dependencies of the codebase, it's hard to say definitively. However, generally speaking, changing the streaming parameter could potentially affect any code that relies on the _call or _stream methods of the ChatPortkey class. It's always a good idea to thoroughly test all changes to ensure there are no unexpected side effects.

I hope this helps! If you have any more questions or need further clarification, feel free to ask.

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

