Sparse Tensors #18420
The general plan is to let backend ops receive and return backend-native sparse tensor types. Then, when passed sparse data inputs (e.g. via tf.data or via scipy sparse arrays), we would not densify them and would just pass them through to the backend ops. Lastly, we would avoid densifying gradients in the optimizer (which we currently do). The bulk of the work will be enabling sparse tensor support in all backend ops. We should do this for the TensorFlow backend first, as it has the most mature support for sparse tensors so far.
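To illustrate the input side of this plan (a sketch using scipy, which the comment mentions as a possible data source; the eventual Keras API is not settled here): a sparse array stores only its non-zero entries, so densifying it on the way into a backend op can inflate memory by orders of magnitude. The shapes and density below are arbitrary, for illustration only.

```python
import numpy as np
from scipy import sparse

# A hypothetical sparse feature matrix: 1,000 x 1,000 with only
# 100 stored entries (~0.01% density).
rng = np.random.default_rng(0)
rows = rng.integers(0, 1_000, size=100)
cols = rng.integers(0, 1_000, size=100)
vals = rng.standard_normal(100)
x = sparse.coo_matrix((vals, (rows, cols)), shape=(1_000, 1_000))

# Sparse storage holds ~100 values plus indices; densifying
# materializes all 10^6 cells as float64.
print(x.nnz)         # stored entries: 100
dense = x.toarray()  # what "densifying" costs
print(dense.nbytes)  # 1_000 * 1_000 * 8 bytes = 8_000_000
```

Passing `x` through to a backend that understands sparse formats would skip the `toarray()` step entirely, which is the point of the plan above.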
I'd be willing to do a lot of the legwork to make this happen (e.g. write backend wrappers, keras
@hertschuh is currently working on this -- but there may be items you guys can take on!
Hi! Are there any updates on sparse tensor support for backends other than TF, @fchollet?
I need the sparse gradients (not dense), as dense gradients would otherwise defeat the purpose (memory would go up, speed would go down). With this constraint on gradients, is TF the only backend possible? Yet neither sparse graph convolutions nor sparse convolutions are available as TF layers. "Graph" ones seem implementable in a layer using something like
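A minimal sketch of the graph-convolution propagation step this comment alludes to, written with numpy/scipy rather than as a Keras layer (in TF the sparse product would map to tf.sparse.sparse_dense_matmul; the graph, shapes, and variable names here are illustrative, not an existing API):

```python
import numpy as np
from scipy import sparse

# Propagation step H' = A_hat @ (H @ W), where A_hat is a sparse
# adjacency matrix (here: a 4-node chain graph with self-loops).
adj = sparse.csr_matrix(np.array([
    [1, 1, 0, 0],
    [1, 1, 1, 0],
    [0, 1, 1, 1],
    [0, 0, 1, 1],
], dtype=np.float64))

rng = np.random.default_rng(0)
h = rng.standard_normal((4, 3))  # node features (n_nodes, in_dim)
w = rng.standard_normal((3, 2))  # trainable weights (in_dim, out_dim)

# Sparse-dense matmul: only the stored adjacency entries are touched,
# so cost scales with the number of edges, not n_nodes**2.
h_next = adj @ (h @ w)
print(h_next.shape)  # (4, 2)
```

A real layer would add normalization of `adj`, a bias, and a nonlinearity; the sparse matmul is the part that determines whether gradients stay sparse.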
JAX handles sparsity in gradients differently, so there's nothing we can do in Keras right now. That makes TF the only possible backend. Sparse convolutions are indeed not available in TensorFlow. Sparse matmul does produce sparse gradients, so you could try that.
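To see why sparse matmul admits cheap gradients (a hand-rolled scipy sketch standing in for tf.sparse.sparse_dense_matmul, not Keras API): for y = A @ x with A sparse, the gradient with respect to x is A.T @ g, which is itself a sparse-dense product, so no dense version of A is ever materialized.

```python
import numpy as np
from scipy import sparse

# A sparse matrix with ~1% density; x is dense, g is the upstream
# gradient dL/dy of the same shape as y = A @ x.
rng = np.random.default_rng(0)
a = sparse.random(500, 500, density=0.01, random_state=rng, format="csr")
x = rng.standard_normal((500, 8))
g = rng.standard_normal((500, 8))

# Backward pass for x: another sparse-dense product, so the gradient
# computation has the same cost profile as the forward pass.
grad_x = a.T @ g

# Matches the dense computation without ever densifying A:
assert np.allclose(grad_x, a.toarray().T @ g)
```

The gradient with respect to A's own values is similarly confined to A's non-zero positions, which is what makes densifying gradients in the optimizer avoidable in principle.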
Hey @fchollet,
do you have any roadmap for implementing sparse tensor support across the different backends?