
fix: initialize tiled_prompt_lengths_buf_ to zero in gptneox #716

Open
yandai wants to merge 1 commit into base: main
Conversation


@yandai commented on Jul 13, 2023

I think tiled_prompt_lengths_buf_ should be initialized to zero.

When invokeMaskPaddingTokens reads an uninitialized tiled_prompt_lengths_buf_, the result can be incorrect.
https://github.com/Nvidia/FasterTransformer/blob/main/src/fastertransformer/models/gptneox/GptNeoX.cc#L760
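A minimal sketch of the kind of one-line change this PR proposes, not the verbatim patch: it assumes the buffer is allocated in GptNeoX<T>::allocateBuffer with FasterTransformer's allocator, and that batch_size and beam_width stand in for the actual sizing variables at that call site.

```cpp
// Sketch, assuming the allocation site in GptNeoX<T>::allocateBuffer.
// Passing true as the zero-init flag of IAllocator::reMalloc memsets the
// (re)allocated device memory to zero, so invokeMaskPaddingTokens sees a
// prompt length of 0 ("no prefix prompt") for every sequence instead of
// whatever garbage the allocator happens to return.
tiled_prompt_lengths_buf_ = (int*)allocator_->reMalloc(
    tiled_prompt_lengths_buf_, sizeof(int) * batch_size * beam_width, true);
```

An equivalent alternative would be an explicit cudaMemset(tiled_prompt_lengths_buf_, 0, sizeof(int) * batch_size * beam_width) right after the existing allocation.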

@yandai changed the title from "fix: initialize tiled_prompt_lengths_buf_ to zero in gptneo" to "fix: initialize tiled_prompt_lengths_buf_ to zero in gptneox" on Jul 13, 2023
@RobotGF mentioned this pull request on Sep 8, 2023