
Continue feature is always shown with OpenAI #737

Open
gururise opened this issue Jan 25, 2024 · 7 comments
Labels: bug (Something isn't working), models (This issue is related to model performance/reliability)

Comments

@gururise (Contributor)

As shown in the screenshot below, when using the OpenAI endpoint (gpt-3.5-turbo-1106), the Continue button is shown after every inference.

When I press the Continue button in the example below, the model responds "I'm here to help". I pressed the button multiple times, so you can see the responses and that the Continue button never goes away.

[image: screenshot of the Continue button persisting after each response]

This issue does not occur when using Together.ai with their OpenAI-compatible endpoint.

@nsarrazin (Collaborator)

Thanks for reporting this! I'll take a look.

@nsarrazin added the bug and models labels on Jan 29, 2024
@flexchar (Contributor) commented Feb 6, 2024

@gururise could you show how it looks when using Together.ai?

@gururise (Contributor, Author) commented Feb 6, 2024

> @gururise could you show how it looks when using Together.ai?

Using Mixtral on Together.ai, if asked a question, the model will respond, and no Continue button will be present after the model is done responding.

[image: screenshot showing no Continue button after a Together.ai response]

@brettpappas
Having the same issue (Azure OpenAI). Is there a way to disable the Continue button until this gets fixed? I thought about hiding it with CSS, but the button doesn't have an id or special class to target.

@nsarrazin (Collaborator)

Can you test the changes in this branch here? Hopefully that should fix it. Sorry for the delay with this.
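For context on what a fix like this likely needs to handle: OpenAI-compatible endpoints report why a completion ended via the `finish_reason` field, and the Continue button should only appear when the response was actually truncated. A minimal sketch of that check (hypothetical names, not the actual chat-ui code from the branch):

```typescript
// Hypothetical sketch of Continue-button logic keyed off finish_reason.
// "stop" means the model ended naturally; "length" means it hit the token
// limit and genuinely has more to say. The names below are illustrative.
type FinishReason = "stop" | "length" | "content_filter" | null;

function shouldShowContinue(finishReason: FinishReason): boolean {
  // Only offer Continue when the model was cut off by the token limit,
  // not when it stopped on its own or the stream ended cleanly.
  return finishReason === "length";
}

console.assert(shouldShowContinue("stop") === false, "natural stop: hide button");
console.assert(shouldShowContinue("length") === true, "truncated: show button");
console.assert(shouldShowContinue(null) === false, "unknown/clean end: hide button");
```

A bug matching this report would be treating every completed stream as continuable (e.g. ignoring `finish_reason` entirely), which is why the button never went away.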

@brettpappas
This fixes the issue for me on my local copy. I haven't tested whether the button does show when the connection is interrupted, though. I'd be happy to confirm that as well if there's a way to trigger an interrupted response.

@flexchar (Contributor) commented Apr 7, 2024

This seems to be fixed for me.
