This repository has been archived by the owner on Jun 19, 2024. It is now read-only.

Add decomposition for aten::native_layer_norm #13

Merged

tanyokwok merged 1 commit into main from tanyo/dev_bert_layernorm on Jul 20, 2022

Conversation

tanyokwok

No description provided.


@Yancey1989 left a comment

LGTM with a tiny comment.


// x - mean(x)
Value inputMeanExpanded = rewriter.create<AtenExpandAsOp>(loc, inputTy, inputMean, op.input());
// The review excerpt ends mid-call; the operands below are assumed from the usual sub(self, other, alpha) pattern, with `one` being a constant-1 alpha defined earlier in the decomposition.
Value inputZeroMean = rewriter.create<AtenSubTensorOp>(loc, inputTy, op.input(), inputMeanExpanded, one);

inputZeroMean => inputOneMean ?
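
For context, the decomposition this patch adds lowers aten::native_layer_norm into primitive elementwise ops: compute the mean over the normalized dimensions, subtract it (the AtenSubTensorOp step quoted above), compute the variance, and rescale by rsqrt(var + eps) before applying weight and bias. The plain C++ sketch below reproduces that arithmetic on a single row of values to show what the lowered ops compute; it is an illustration of the standard layer-norm formula under assumed names (input, weight, bias, eps), not code from this patch.

#include <cmath>
#include <cstddef>
#include <cstdio>
#include <vector>

int main() {
  // One row of 4 features, normalized over the last dimension.
  std::vector<double> input = {1.0, 2.0, 3.0, 4.0};
  std::vector<double> weight = {1.0, 1.0, 1.0, 1.0}; // gamma
  std::vector<double> bias = {0.0, 0.0, 0.0, 0.0};   // beta
  const double eps = 1e-5;
  const double n = static_cast<double>(input.size());

  // mean(x) over the normalized dimension.
  double mean = 0.0;
  for (double v : input) mean += v;
  mean /= n;

  // x - mean(x): the step shown in the review excerpt above.
  std::vector<double> zeroMean(input.size());
  for (std::size_t i = 0; i < input.size(); ++i) zeroMean[i] = input[i] - mean;

  // Biased variance, then rstd = 1 / sqrt(var + eps).
  double var = 0.0;
  for (double v : zeroMean) var += v * v;
  var /= n;
  const double rstd = 1.0 / std::sqrt(var + eps);

  // output = (x - mean) * rstd * weight + bias
  for (std::size_t i = 0; i < input.size(); ++i)
    std::printf("%f\n", zeroMean[i] * rstd * weight[i] + bias[i]);
  return 0;
}

Note that aten::native_layer_norm also returns the mean and rstd tensors alongside the normalized output, which is why the decomposition keeps them as separate values rather than folding them into a single expression.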

@tanyokwok merged commit e50b85e into main on Jul 20, 2022
@tanyokwok deleted the tanyo/dev_bert_layernorm branch on July 20, 2022 at 02:50
tanyokwok pushed a commit to tanyokwok/torch-mlir that referenced this pull request on Aug 11, 2022