What does this PR do?
Fixes #3483
This PR adds the GeGLU (Gated Linear Unit with GELU) activation function to the activation module of the Flax library. GeGLU is an increasingly popular activation layer that combines a linear transformation with a GELU-activated gate, balancing linearity and non-linearity.
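In its common formulation, GeGLU applies two linear projections and uses the GELU of one to gate the other:

$$\mathrm{GeGLU}(x) = (xW + b)\,\odot\,\mathrm{GELU}(xV + c)$$

where $W$, $V$, $b$, $c$ are learned parameters and $\odot$ denotes element-wise multiplication.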
GeGLU is parameterised, so it is built on Flax's Dense layer rather than living in JAX itself. The implementation allows an optional output_dim attribute to specify the number of output features; this can grow or shrink the last dimension of the result, and the code accounts for that. The tests validate three scenarios: standard usage, output-dimension expansion, and output-dimension contraction, ensuring the layer behaves correctly across these use cases.
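For illustration, here is a minimal Linen sketch of such a layer. The class name `GeGLU`, the `output_dim` attribute, and the single-projection-then-split structure follow the description above; the actual module added by this PR may differ in details.

```python
import jax
import jax.numpy as jnp
import flax.linen as nn


class GeGLU(nn.Module):
    """Minimal GeGLU sketch: one Dense projection produces both value and gate."""
    output_dim: int = -1  # -1 keeps the input's last dimension

    @nn.compact
    def __call__(self, x):
        out_dim = x.shape[-1] if self.output_dim == -1 else self.output_dim
        # Project to twice the target width, then split into value and gate halves.
        projected = nn.Dense(out_dim * 2)(x)
        value, gate = jnp.split(projected, 2, axis=-1)
        return value * nn.gelu(gate)


# Usage covering the three tested scenarios: same, expanded, and contracted width.
x = jnp.ones((4, 16))
for dim in (-1, 32, 8):
    layer = GeGLU(output_dim=dim)
    params = layer.init(jax.random.PRNGKey(0), x)
    y = layer.apply(params, x)
    print(y.shape)  # (4, 16), (4, 32), (4, 8)
```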
Checklist

- [ ] This change is discussed in a GitHub issue/discussion.
- [ ] The documentation and docstrings adhere to the documentation guidelines.
- [ ] This change includes necessary high-coverage tests. (No quality testing = no merge!)