
Fix infinite loop caused by incorrect timestamp tokens prediction #914

Merged
merged 2 commits into openai:main
Feb 1, 2023

Conversation

andrewchernyh
Contributor

jongwook merged commit 7858aa9 into openai:main on Feb 1, 2023
Collaborator

jongwook commented Feb 2, 2023

Thank you!

Jeronymous left a comment


The infinite loop can still happen, so I suggest going further with this bugfix:

timestamps = sampled_tokens[sampled_tokens.ge(self.tokenizer.timestamp_begin)]
if timestamps.numel() > 0:
    # timestamps shouldn't decrease; forbid timestamp tokens smaller than the last
    logits[k, self.tokenizer.timestamp_begin : timestamps[-1]] = -np.inf
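
For orientation, this check lives inside Whisper's ApplyTimestampRules logit filter, which runs once per decoding step and masks the next-token logits of every sequence in the batch. A minimal sketch of the surrounding structure, assuming the names used in openai/whisper's decoding.py (the exact surrounding code may differ):

    import numpy as np

    def apply(self, logits, tokens):
        # logits: (n_sequences, n_vocab) scores for the next token
        # tokens: (n_sequences, n_sampled) tokens decoded so far
        for k in range(tokens.shape[0]):
            sampled_tokens = tokens[k, self.sample_begin :]
            # timestamp tokens occupy the ids from tokenizer.timestamp_begin upward
            timestamps = sampled_tokens[sampled_tokens.ge(self.tokenizer.timestamp_begin)]
            if timestamps.numel() > 0:
                # timestamps shouldn't decrease; forbid timestamp tokens smaller than the last
                logits[k, self.tokenizer.timestamp_begin : timestamps[-1]] = -np.inf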


This is not enough to prevent the infinite loop (see discussion #924), because it does not stop the model from outputting <|0.00|> over and over.

Suggestion:

                timestamp_last = max(timestamps[-1], self.tokenizer.timestamp_begin + 1)  # avoid emitting <|0.00|> again
                logits[k, self.tokenizer.timestamp_begin : timestamp_last] = -np.inf
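
To see why the max matters: when the only timestamp emitted so far is <|0.00|> itself, timestamps[-1] equals timestamp_begin, so the slice in the original rule is empty, nothing gets masked, and <|0.00|> stays available to be sampled indefinitely. A small illustration of that arithmetic (the token id is illustrative, not taken from the tokenizer):

    timestamp_begin = 50364           # id of <|0.00|> (illustrative value)
    last_timestamp = timestamp_begin  # the model has emitted only <|0.00|> so far

    # original rule: the slice [timestamp_begin, last_timestamp) is empty, so nothing is masked
    assert len(range(timestamp_begin, last_timestamp)) == 0

    # suggested rule: mask at least <|0.00|> itself, forcing the next timestamp to be later
    timestamp_last = max(last_timestamp, timestamp_begin + 1)
    assert len(range(timestamp_begin, timestamp_last)) == 1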


I think a better suggestion is:

                if last_was_timestamp and not penultimate_was_timestamp:
                    timestamp_last = timestamps[-1]
                else:
                    timestamp_last = timestamps[-1] + 1
                logits[k, self.tokenizer.timestamp_begin : timestamp_last] = -np.inf

to force timestamps to be strictly increasing within a speech segment (so every segment has nonzero length), and non-decreasing between the end of one segment and the start of the next.
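
Spelled out with the helper flags, the whole rule would look roughly like the sketch below. This is only a sketch: the way last_was_timestamp and penultimate_was_timestamp are computed here is an assumption about how decoding.py derives them from the sampled tokens, and the branch bodies simply mirror the suggestion above:

    sampled_tokens = tokens[k, self.sample_begin :]
    seq = sampled_tokens.tolist()

    # was the most recently sampled token a timestamp? and the one before it?
    last_was_timestamp = len(seq) >= 1 and seq[-1] >= self.tokenizer.timestamp_begin
    penultimate_was_timestamp = len(seq) < 2 or seq[-2] >= self.tokenizer.timestamp_begin

    timestamps = sampled_tokens[sampled_tokens.ge(self.tokenizer.timestamp_begin)]
    if timestamps.numel() > 0:
        if last_was_timestamp and not penultimate_was_timestamp:
            # the last timestamp closed a segment; the next segment may start at the
            # same time, so only strictly smaller timestamps are forbidden
            timestamp_last = timestamps[-1]
        else:
            # otherwise the next timestamp must be strictly later, so every segment has
            # nonzero length and decoding cannot loop forever on the same timestamp
            timestamp_last = timestamps[-1] + 1
        logits[k, self.tokenizer.timestamp_begin : timestamp_last] = -np.inf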

Is anyone looking at this?

Contributor


Great solution @Jeronymous! I checked it and it works.

With your permission, I'm going to create a new PR to speed up this change. I'll mention your suggestion.

FernanOrtega added a commit to FernanOrtega/whisperX that referenced this pull request Mar 24, 2023
FernanOrtega added a commit to FernanOrtega/whisper that referenced this pull request Mar 27, 2023
Following the suggestions of @Jeronymous in openai#914 and openai#924, this solves the endless loop problem.
FernanOrtega mentioned this pull request Mar 27, 2023
FernanOrtega added a commit to FernanOrtega/whisperX that referenced this pull request Mar 27, 2023
New fix for endless loop problem. I also created a PR for official Whisper: openai/whisper#1155

It is explained in openai/whisper#914 and openai/whisper#924
jongwook added a commit that referenced this pull request Apr 11, 2023
* Update decoding.py

Following the suggestions of @Jeronymous in #914 and #924, this solves the endless loop problem.

* Removed blank line and whitespaces in empty lines.

* Suggested changes according to the linter

---------

Co-authored-by: Jong Wook Kim <jongwook@openai.com>
zackees pushed a commit to zackees/whisper that referenced this pull request May 5, 2023
…enai#914)

* Fix infinite loop caused by incorrect timestamp tokens prediction

openai#810

* Update decoding.py

---------

Co-authored-by: Jong Wook Kim <jongwook@openai.com>
zackees pushed a commit to zackees/whisper that referenced this pull request May 5, 2023
ilanit1997 pushed a commit to ilanit1997/whisper that referenced this pull request May 16, 2023
ilanit1997 pushed a commit to ilanit1997/whisper that referenced this pull request May 16, 2023
abyesilyurt pushed a commit to abyesilyurt/whisper that referenced this pull request Nov 13, 2023
abyesilyurt pushed a commit to abyesilyurt/whisper that referenced this pull request Nov 13, 2023
stainless-app bot pushed a commit to 2lambda123/openai-whisper that referenced this pull request Feb 3, 2024
smartluck1125 added a commit to smartluck1125/whisper that referenced this pull request May 9, 2024