config adjustments for llama and gated activations
jahatef committed Oct 25, 2024
1 parent 575c4b6 commit dc4b81f
Showing 6 changed files with 7 additions and 3 deletions.
1 change: 1 addition & 0 deletions configs/llama/13B.yml
@@ -17,6 +17,7 @@
"output_layer_parallelism": "column",
"norm": "rmsnorm",
"rms_norm_epsilon": 1.0e-6,
"use_bias_in_mlp": False,

"scaled_upper_triang_masked_softmax_fusion": true,
"bias_gelu_fusion": false,
1 change: 1 addition & 0 deletions configs/llama/30B.yml
@@ -17,6 +17,7 @@
"output_layer_parallelism": "column",
"norm": "rmsnorm",
"rms_norm_epsilon": 1.0e-6,
"use_bias_in_mlp": False,

"scaled_upper_triang_masked_softmax_fusion": true,
"bias_gelu_fusion": false,
1 change: 1 addition & 0 deletions configs/llama/65B.yml
@@ -17,6 +17,7 @@
"output_layer_parallelism": "column",
"norm": "rmsnorm",
"rms_norm_epsilon": 1.0e-6,
"use_bias_in_mlp": False,

"scaled_upper_triang_masked_softmax_fusion": true,
"bias_gelu_fusion": false,
1 change: 1 addition & 0 deletions configs/llama/7B.yml
@@ -17,6 +17,7 @@
"output_layer_parallelism": "column",
"norm": "rmsnorm",
"rms_norm_epsilon": 1.0e-6,
"use_bias_in_mlp": False,

"scaled_upper_triang_masked_softmax_fusion": true,
"bias_gelu_fusion": false,
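The same one-line addition lands in all four LLaMA configs (7B/13B/30B/65B): with "use_bias_in_mlp" set to false, the feed-forward projections are built without bias terms, matching the bias-free linear layers of the original LLaMA architecture. As a rough illustration only (this is not gpt-neox's own MLP module; the class and attribute names below are assumptions), a bias-free gated-SiLU MLP looks like this:

import torch
import torch.nn as nn
import torch.nn.functional as F

class GatedMLP(nn.Module):
    # Hypothetical stand-in for a bias-free gated MLP; names are illustrative.
    def __init__(self, hidden_size: int, ffn_dim: int, use_bias_in_mlp: bool = False):
        super().__init__()
        # LLaMA-style blocks drop the bias on all three projections.
        self.w_gate = nn.Linear(hidden_size, ffn_dim, bias=use_bias_in_mlp)
        self.w_up = nn.Linear(hidden_size, ffn_dim, bias=use_bias_in_mlp)
        self.w_down = nn.Linear(ffn_dim, hidden_size, bias=use_bias_in_mlp)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # SwiGLU: silu(gate(x)) * up(x), projected back to hidden_size.
        return self.w_down(F.silu(self.w_gate(x)) * self.w_up(x))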
2 changes: 1 addition & 1 deletion configs/llama/train_config.yml
@@ -70,5 +70,5 @@
"steps_per_print": 10,
"keep_last_n_checkpoints": 4,
"wall_clock_breakdown": true,
"mlp_multiple_of": 256,

}
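With a gated activation the feed-forward width is usually not the plain 4*hidden: the gate adds a second input projection, so the intermediate size is commonly taken as 2/3 of 4*hidden and then padded up to a multiple of "mlp_multiple_of" (256 here) to keep the matmul shapes friendly to tensor cores and tensor parallelism. A small worked example of that sizing rule (the usual LLaMA convention, stated here as an assumption rather than a quote of gpt-neox's exact formula; gated_ffn_dim is an invented helper):

def gated_ffn_dim(hidden_size: int, mlp_multiple_of: int = 256) -> int:
    # 2/3 of the usual 4*hidden, because the gated MLP has two input projections.
    ffn_dim = int(2 * 4 * hidden_size / 3)
    # Round up to the nearest multiple of mlp_multiple_of.
    return mlp_multiple_of * ((ffn_dim + mlp_multiple_of - 1) // mlp_multiple_of)

print(gated_ffn_dim(4096))  # 11008, the FFN width used by LLaMA-7B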
4 changes: 2 additions & 2 deletions megatron/model/transformer.py
@@ -1269,8 +1269,8 @@ def forward(self, x, attention_mask, layer_past=None):

with torch.enable_grad() if not self.eval else nullcontext():
    if (
-       self.activation == "swiglu"
-       or self.num_experts > 1
+       mlp_bias == None,
+       self.num_experts > 1
        and self.moe_type == "deepspeed"
    ):
        # No dropout either
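The rewritten guard switches the un-fused residual path from an activation check ("is this SwiGLU?") to a bias check ("did the MLP return no bias?"), which also covers the new bias-free configs above. A hedged sketch of the logic that branch appears to implement, with an invented helper name and signature (the real gpt-neox code is structured differently):

import torch.nn.functional as F

def mlp_residual(mlp_output, mlp_bias, residual, dropout_p, training,
                 num_experts=1, moe_type="standard"):
    # Intended logic of the guard above: with a bias-free MLP, or with
    # DeepSpeed MoE, skip the fused bias-dropout-add (no bias, and no
    # dropout either) and fall back to a plain residual add.
    if mlp_bias is None or (num_experts > 1 and moe_type == "deepspeed"):
        return mlp_output + residual
    # Otherwise: add the bias, apply dropout, then the residual connection.
    return residual + F.dropout(mlp_output + mlp_bias, p=dropout_p, training=training)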
