
[TOPI] Support non-batch cases for topi.nll_loss #14060

Merged

Conversation

@Ubospica Ubospica commented Feb 21, 2023

This PR supports the case where the input does not contain a batch dimension for `topi.nll_loss`.

When there is no batch dimension, the prediction parameter has shape `(C,)`, the target parameter has shape `()`, the weights parameter has shape `(C,)`, and the output shape is always `()`, regardless of the reduction method.
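
For illustration, here is a minimal sketch of these shapes using TE placeholders. The names, the value `C = 3`, and the `reduction`/`ignore_index` arguments are assumptions made for this example; the kernel is called via `topi.nn.nll_loss`, where it lives in the TOPI `nn` namespace:

```python
from tvm import te, topi

C = 3  # number of classes; illustrative value

# Non-batch inputs: predictions (C,), targets (), weights (C,)
predictions = te.placeholder((C,), name="predictions", dtype="float32")
targets = te.placeholder((), name="targets", dtype="int32")
weights = te.placeholder((C,), name="weights", dtype="float32")

# With this PR, the result is a rank-0 tensor for "mean", "sum",
# and "none" alike, since there is only a single element to reduce.
loss = topi.nn.nll_loss(predictions, targets, weights,
                        reduction="mean", ignore_index=-1)
print(loss.shape)  # expected: [] (rank-0, i.e. shape ())
```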

cc @junrushao @Hzfengsy


tvm-bot commented Feb 21, 2023

Thanks for contributing to TVM! Please refer to the contributing guidelines https://tvm.apache.org/docs/contribute/ for useful information and tips. Please request code reviews from Reviewers by @-ing them in a comment.

  • No users to tag found in teams: `topi`. See #10317 for details.

Generated by tvm-bot

@Ubospica Ubospica changed the title Support non-batch cases for topi.nll_loss [TOPI] Support non-batch cases for topi.nll_loss Feb 21, 2023
@junrushao junrushao merged commit 5ec33bb into apache:main Feb 21, 2023
yongwww pushed a commit to yongwww/tvm that referenced this pull request Feb 27, 2023