
Issues with stream: true and Invalid Parameters in Bee Agent 0.1 (WatsonX) #313

Open
kyjeon21 opened this issue Feb 14, 2025 · 1 comment
Labels: bug (Something isn't working), typescript (Typescript related functionality)

Comments

@kyjeon21

📝 Describe the Bug

After upgrading to Bee Agent Framework 0.1, I encountered two issues when working with WatsonX (ibm/granite-3-8b-instruct):

  1. Certain parameters (topK, seed, stopSequences, decoding_method) cause validation errors, despite being included in the official guide.
  2. stream: true does not return streaming responses as expected; instead, the full response appears at once, and observe() does not emit update events.

⚙️ To Reproduce

Steps to reproduce the issue:

  1. Initialize a ChatModel using watsonx:ibm/granite-3-8b-instruct.
  2. Configure the model with the following parameters:
    model.config({
      parameters: {
        maxTokens: 300,
        temperature: 0.15,
        topP: 1,
        frequencyPenalty: 1.1,
        topK: 5,  // Causes error
        seed: 42,  // Causes error
        stopSequences: ["\n\n"],  // Causes error
        decoding_method: "greedy"  // Causes error
      }
    });
  3. Run the code and observe the error:
    Error: Parameter validation errors:
      Found invalid parameters: topK, seed, stopSequences, decoding_method
    
  4. Attempt to enable streaming:
    const response = await model.create({
      messages: [new UserMessage("Hello world!")],
      stream: true,
    }).observe((emitter) => {
      emitter.on("update", ({ value }) => {
        console.log("token", value.getTextContent());
      });
    });
  5. Notice that observe() does not emit any update events, and the response only appears once at the end.

🎯 Expected Behavior

  • topK, seed, stopSequences, and decoding_method should be valid parameters as per the documentation.
  • stream: true should return real-time token updates, instead of waiting until the full response is generated.

📷 Screenshots / Code Snippets

🛠 Debug Output:
console.log("Is response async iterable?", typeof response[Symbol.asyncIterator] === "function");
// Output: Is response async iterable? false

console.log("🔍 Raw Response Object:", JSON.stringify(response, null, 2));
  • response[Symbol.asyncIterator] is not a function, which indicates that for await ... of cannot be used on the response.
  • response.messages contains structured assistant responses, but they all arrive at once instead of streaming progressively.
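Since the response object is not async iterable, a generic, framework-agnostic pattern for bridging callback-style token events to `for await ... of` is an async generator fed from a queue. The sketch below is illustrative only: `subscribe` and the synthetic token source are hypothetical stand-ins, not Bee Agent APIs.

```typescript
// Sketch: turn callback-style token events into an async iterable.
// `subscribe` is a hypothetical hook standing in for whatever emits tokens.
async function* tokenStream(
  subscribe: (onToken: (t: string) => void, onDone: () => void) => void,
): AsyncGenerator<string> {
  const queue: string[] = [];
  let done = false;
  let wake: (() => void) | null = null;

  subscribe(
    (t) => { queue.push(t); wake?.(); },   // buffer each token, wake the consumer
    () => { done = true; wake?.(); },      // signal end of stream
  );

  while (!done || queue.length > 0) {
    if (queue.length === 0) {
      // Wait until a token arrives or the stream ends.
      await new Promise<void>((resolve) => { wake = resolve; });
      wake = null;
      continue;
    }
    yield queue.shift()!;
  }
}

// Usage with a synthetic token source:
const tokens: string[] = [];
for await (const t of tokenStream((onToken, onDone) => {
  ["Hel", "lo"].forEach(onToken);
  onDone();
})) {
  tokens.push(t);
}
console.log(tokens.join(""));
```

In a real integration, `subscribe` would wrap whatever event hook the framework exposes for incoming tokens.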

🖥 Set-up:

  • Bee Agent Version: 0.1
  • Model Provider: watsonx
  • Node.js Version: 20.11.1
  • Model Used: ibm/granite-3-8b-instruct

📌 Additional Context

  • Is stream: true supposed to return an async iterable response?
  • Are topK, seed, stopSequences, and decoding_method officially supported for WatsonX in Bee Agent 0.1?
  • If observe() does not support token-by-token streaming, what is the recommended method for real-time output?

Would appreciate any guidance on these issues. Thanks in advance! 🙌🚀

@kyjeon21 kyjeon21 added the bug Something isn't working label Feb 14, 2025
@Tomas2D Tomas2D self-assigned this Feb 14, 2025
@Tomas2D
Contributor

Tomas2D commented Feb 14, 2025

Hello, thank you for opening the issue.

1. Streaming

  • Streaming is working; you are just listening to the wrong event (newToken, not update).
  • Returning an asyncIterator would complicate things (you couldn't easily receive the final result); you can achieve the same functionality with observe (note that async callbacks are supported):
import { UserMessage } from "bee-agent-framework/backend/message";
import { WatsonxChatModel } from "bee-agent-framework/adapters/watsonx/backend/chat";

const model = new WatsonxChatModel("ibm/granite-3-8b-instruct");

const response = await model
  .create({
    messages: [new UserMessage("Hello world!")],
    stream: true,
  })
  .observe((emitter) => {
    emitter.on("newToken", ({ value }) => {
      console.info(value.getTextContent());
    });
  });
console.info(response.getTextContent());

2. Parameters

You are right that topK, seed, stopSequences, and decoding_method are not working correctly.
The reason is that those parameters are not supported by https://github.com/IBM/watsonx-ai-node-sdk, which we use under the hood. I just opened an issue there: IBM/watsonx-ai-node-sdk#4.
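Until the upstream SDK adds these parameters, one possible workaround is to filter them out before configuring the model. This is a sketch only: the rejected-parameter list is taken from the validation error above, and `stripUnsupported` is a hypothetical helper, not part of the framework.

```typescript
// Hypothetical helper: drop the parameters that the watsonx backend
// currently rejects (list taken from the validation error in this issue).
const rejectedByWatsonx = ["topK", "seed", "stopSequences", "decoding_method"];

function stripUnsupported(
  parameters: Record<string, unknown>,
): Record<string, unknown> {
  return Object.fromEntries(
    Object.entries(parameters).filter(([key]) => !rejectedByWatsonx.includes(key)),
  );
}

const safeParameters = stripUnsupported({
  maxTokens: 300,
  temperature: 0.15,
  topP: 1,
  frequencyPenalty: 1.1,
  topK: 5,      // filtered out
  seed: 42,     // filtered out
});

// In the framework this would then be passed as:
// model.config({ parameters: safeParameters });
console.log(Object.keys(safeParameters).join(","));
```

This keeps the call site unchanged once upstream support lands: remove entries from `rejectedByWatsonx` as the SDK starts accepting them.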

Tomas2D added a commit that referenced this issue Feb 14, 2025
Ref: #313
Signed-off-by: Tomas Dvorak <toomas2d@gmail.com>
@Tomas2D Tomas2D added the typescript Typescript related functionality label Feb 19, 2025