
please support ragged tensor as memory for AttentionMechanism #2371

Closed
breadbread1984 opened this issue Jan 30, 2021 · 3 comments

Comments

@breadbread1984

Describe the feature and the current behavior/state.

AttentionMechanism doesn't support a ragged tensor as memory. The output hidden sequences of tf.keras.layers.RNN already support ragged tensors, so it would be very convenient if AttentionMechanism accepted a ragged tensor as memory as well.
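
A minimal sketch of the current situation, assuming tfa.seq2seq.LuongAttention as the attention mechanism and a small GRU encoder (both chosen here only for illustration): the RNN output stays ragged, but today the memory has to be densified and the valid lengths passed separately before it can be handed to the attention mechanism.

```python
import tensorflow as tf
import tensorflow_addons as tfa

# Variable-length token sequences as a ragged tensor (illustrative data).
inputs = tf.ragged.constant([[3, 1, 4, 1, 5], [9, 2], [6, 5, 3]])
embedded = tf.keras.layers.Embedding(input_dim=10, output_dim=8)(inputs)

# Keras RNN layers already propagate raggedness through the output sequence.
encoder = tf.keras.layers.GRU(16, return_sequences=True)
memory = encoder(embedded)  # RaggedTensor of shape [batch, None, 16]

# Desired: pass the ragged memory directly (not supported today).
# attention = tfa.seq2seq.LuongAttention(units=16, memory=memory)

# Current workaround: zero-pad the memory and pass the lengths explicitly.
attention = tfa.seq2seq.LuongAttention(
    units=16,
    memory=memory.to_tensor(),                    # dense, zero-padded memory
    memory_sequence_length=memory.row_lengths(),  # per-example valid lengths
)
```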

Relevant information

  • Are you willing to contribute it (yes/no): no
  • Are you willing to maintain it going forward? (yes/no): no
  • Is there a relevant academic paper? (if so, where):
  • Is there already an implementation in another framework? (if so, where):
  • Was it part of tf.contrib? (if so, where):

Which API type would this fall under (layer, metric, optimizer, etc.)

The improvement doesn't require any new APIs.

Who will benefit with this feature?

NLP developers using ragged tensors as input.

Any other info.

@guillaumekln
Contributor

Thanks for the feature request. It makes sense to support ragged tensors, but they should ideally be supported in all modules that accept variable-length inputs, not just the attention memory.

Are you willing to contribute?

@breadbread1984
Author

I would like to, but I am not competent to accomplish this task.

@seanpmorgan
Member

TensorFlow Addons is transitioning to a minimal maintenance and release mode. New features will not be added to this repository. For more information, please see our public messaging on this decision:
TensorFlow Addons Wind Down

Please consider sending feature requests / contributions to other repositories in the TF community with similar charters to TFA:
Keras
Keras-CV
Keras-NLP
