
Expose conv2d weight/bias preparation as ops #14049

Merged (2 commits) on Nov 1, 2024

Conversation

@LPanosTT (Contributor) commented Oct 21, 2024

Tickets

#13219, #12793

Problem description

Modelling convolutions in MLIR is difficult when conv2d must be executed once just to obtain the properly prepared inputs for the next iteration.

What's changed

Weight and bias preparation are moved into dedicated ops. The logic that prepares the weights and bias on host already exists and is functional; the issue is that it currently runs inside conv2d. Most of the code changes are copy/pasted from that existing implementation.
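
For context, here is a minimal sketch of the call pattern this enables. It is illustrative only: the op names ttnn.prepare_conv_weights / ttnn.prepare_conv_bias, their keyword arguments, and the trimmed conv2d argument list are assumptions made for the example, not the exact API; the authoritative definitions are in prepare_conv2d_weights.cpp and the ttnn conv2d bindings.

```python
import torch
import ttnn

# Illustrative sketch only. The prepare_* op names and keyword arguments below
# are assumptions about the interface added here; see prepare_conv2d_weights.cpp
# and the ttnn conv2d bindings for the real definitions.

device = ttnn.open_device(device_id=0)

torch_weight = torch.randn(64, 32, 3, 3)   # (out_channels, in_channels, kH, kW)
torch_bias = torch.randn(1, 1, 1, 64)

weight = ttnn.from_torch(torch_weight, dtype=ttnn.bfloat16)
bias = ttnn.from_torch(torch_bias, dtype=ttnn.bfloat16)

# Prepare the weights/bias once, as standalone ops, instead of having conv2d
# perform this host-side transformation implicitly on its first invocation.
prepared_weight = ttnn.prepare_conv_weights(weight_tensor=weight, device=device)  # assumed op name/signature
prepared_bias = ttnn.prepare_conv_bias(bias_tensor=bias, device=device)           # assumed op name/signature

# Subsequent conv2d calls consume the already-prepared tensors directly.
torch_input = torch.randn(1, 32, 32, 32)   # NHWC activation
activation = ttnn.from_torch(torch_input, dtype=ttnn.bfloat16, device=device)
output = ttnn.conv2d(
    input_tensor=activation,
    weight_tensor=prepared_weight,
    bias_tensor=prepared_bias,
    device=device,
    # plus the usual geometry/config arguments (channels, kernel_size, stride, padding, ...)
)

ttnn.close_device(device)
```

Splitting preparation out this way lets the MLIR flow from #13219/#12793 model weight/bias preparation and the convolution as separate ops, instead of having to execute conv2d once just to materialize the prepared inputs for subsequent iterations.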

Checklist

  • Post commit CI passes
  • Blackhole Post commit (if applicable)
  • Model regression CI testing passes (if applicable)
  • Device performance regression CI testing passes (if applicable)
  • New/Existing tests provide coverage for changes

@LPanosTT (Contributor, Author)

@pavlejosipovic As for your question in the previous PR: if by c/p code you mean whether the logic in that function is copy/pasted, then yes. All of the code is copy/pasted, since I'm just encapsulating existing logic in dedicated ops.

@pavlejosipovic (Contributor)

> @pavlejosipovic As for your question in the previous PR: if by c/p code you mean whether the logic in that function is copy/pasted, then yes. All of the code is copy/pasted, since I'm just encapsulating existing logic in dedicated ops.

I meant that this function is not used internally, and that we have the same logic duplicated inside other functions instead of calling this one?

@LPanosTT (Contributor, Author)

> I meant that this function is not used internally, and that we have the same logic duplicated inside other functions instead of calling this one?

This function does get used in prepare_conv2d_weights.cpp. I think the logic here is also duplicated in the top-level conv2d, though. Let me clean this up some more.

LPanosTT added a commit that referenced this pull request Nov 8, 2024
Added new weight and bias preparation ops, and new conv op that can only take pre-prepared weights.
Working conv test with pre-prepared weights

added return weight/output dims kwargs

Only auto-shard if shard_layout not specified

Pass input memory config to prepare functions

Organize utility functions into their own files
ct-clmsn pushed a commit to ct-clmsn/tt-metal that referenced this pull request Nov 12, 2024
ct-clmsn pushed a commit to ct-clmsn/tt-metal that referenced this pull request Nov 12, 2024

Revert "Expose conv2d weight/bias preparation as ops (tenstorrent#14049)"

This reverts commit 09fd48c.

Force-merging to fix perf pipelines.
LPanosTT added a commit that referenced this pull request Nov 12, 2024
kmabeeTT pushed a commit that referenced this pull request Nov 12, 2024
Revert "Expose conv2d weight/bias preparation as ops (#14049)"

This reverts commit 09fd48c.

Force-merging to fix perf pipelines.

(cherry picked from commit bffe65a)
LPanosTT added a commit that referenced this pull request Nov 18, 2024
shwetankTT pushed a commit that referenced this pull request Nov 19, 2024
shwetankTT pushed a commit that referenced this pull request Nov 27, 2024
shwetankTT pushed a commit that referenced this pull request Nov 27, 2024
shwetankTT pushed a commit that referenced this pull request Nov 28, 2024
shwetankTT pushed a commit that referenced this pull request Dec 4, 2024
shwetankTT pushed a commit that referenced this pull request Dec 4, 2024
shwetankTT pushed a commit that referenced this pull request Dec 4, 2024
shwetankTT pushed a commit that referenced this pull request Dec 4, 2024
yieldthought pushed a commit that referenced this pull request Dec 13, 2024