Resolve Dilated #87 #88

Merged
merged 8 commits into master on Apr 20, 2023
Conversation

soran-ghaderi
Member

This PR initiates a new package for attention masking and implements dilated and global attention masks.

It is fully working, yet needs further modifications:

* This is intended to remove the masked_softmax function and enable TransformerX to reproduce different sparse masking techniques instead of just the default one
* Raises a ValueError if the input shape is not 2D, 3D, or 4D

Changes:
1. New package for attention masks
2. global_attention_mask.py

This will be reverted again after reformatting all core layers.
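For context, here is a minimal sketch of how a dilated attention mask can be built and applied in TensorFlow. The function name dilated_attention_mask and its dilation_rate argument are illustrative assumptions for this sketch, not necessarily the actual API of the new masking package.

```python
import tensorflow as tf


def dilated_attention_mask(seq_len: int, dilation_rate: int = 2) -> tf.Tensor:
    # Illustrative sketch, not the package's actual API.
    # Boolean mask of shape (seq_len, seq_len): query position i may attend to
    # key position j only when their offset (i - j) is a multiple of the
    # dilation rate.
    positions = tf.range(seq_len)
    offsets = positions[:, None] - positions[None, :]
    return tf.equal(tf.math.floormod(offsets, dilation_rate), 0)


# Apply the mask to raw attention scores: disallowed positions get a large
# negative value before the regular softmax.
scores = tf.random.normal((8, 8))
mask = dilated_attention_mask(8, dilation_rate=2)
masked_scores = tf.where(mask, scores, tf.fill(tf.shape(scores), -1e9))
weights = tf.nn.softmax(masked_scores, axis=-1)
```

Masking scores this way before a regular softmax is the usual alternative to a dedicated masked_softmax, and swapping the mask-building function is what allows different sparse masking patterns (dilated, global, etc.) to be reproduced.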
@soran-ghaderi added the documentation, enhancement, Subtask, and tensorflow labels on Apr 20, 2023
@soran-ghaderi merged commit 3f5b061 into master on Apr 20, 2023
@soran-ghaderi deleted the test_mask branch on Apr 20, 2023 at 13:18