Update optimum/habana/transformers/models/mt5/modeling_mt5.py
Co-authored-by: Yaser Afshar <yaser.afshar@intel.com>
Gaurav7888 and yafshar authored Aug 5, 2024
1 parent 49a234e commit 40438c8
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion optimum/habana/transformers/models/mt5/modeling_mt5.py
@@ -50,7 +50,7 @@ def gaudi_mt5_layernorm_forward(self, hidden_states):
     # Square Layer Normalization https://arxiv.org/abs/1910.07467 thus variance is calculated
     # w/o mean and there is no bias. Additionally we want to make sure that the accumulation for
     # half-precision inputs is done in fp32
-    if hidden_states.device.type == "hpu" and FusedRMSNorm:
+    if hidden_states.device.type == "hpu" and has_fused_rms_norm:
         orig_dtype = hidden_states.dtype
         hidden_states = FusedRMSNorm.apply(hidden_states.float(), self.weight.float(), self.variance_epsilon)
         return hidden_states.to(orig_dtype)
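The patched guard tests a boolean availability flag (`has_fused_rms_norm`) instead of relying on the truthiness of the `FusedRMSNorm` class object itself, which makes the intent of the check explicit. For context, the computation that `FusedRMSNorm.apply` performs on HPU is RMS (T5-style) layer normalization: variance without mean subtraction, no bias, with accumulation in fp32 for half-precision inputs. A minimal NumPy sketch of that math follows; the function name and shapes are illustrative and not part of the patched file:

```python
import numpy as np

def rms_norm(hidden_states, weight, eps=1e-6):
    # Reference RMSNorm: no mean subtraction, no bias.
    # Accumulate in fp32 even when the input is half precision,
    # then cast back to the original dtype, mirroring the HPU path above.
    orig_dtype = hidden_states.dtype
    x = hidden_states.astype(np.float32)
    variance = np.mean(x * x, axis=-1, keepdims=True)
    x = x / np.sqrt(variance + eps)
    return (weight.astype(np.float32) * x).astype(orig_dtype)
```

The fused kernel exists precisely to avoid materializing these intermediate fp32 tensors on device; the fallback branch (not shown in this hunk) computes the same result with elementwise ops.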
