
Sparse Tensors #18420

Open
adi-kmt opened this issue Aug 7, 2023 · 7 comments
adi-kmt (Contributor) commented Aug 7, 2023

Hey @fchollet,

Do you have a roadmap for implementing sparse tensors across the different backends?

  • tf seems to have only one sparse tensor implementation, COO
  • torch supports multiple sparse formats, with particularly good support for COO
  • numpy would mostly go through scipy, I think
  • JAX has sparse support, but it still seems experimental for now
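As an aside, the COO layout mentioned above can be illustrated with scipy (a minimal sketch of the format itself, not of any Keras API):

```python
import numpy as np
from scipy.sparse import coo_matrix

# COO stores a sparse matrix as three parallel arrays:
# row indices, column indices, and the nonzero values.
rows = np.array([0, 1, 2])
cols = np.array([2, 0, 1])
vals = np.array([3.0, 4.0, 5.0])
a = coo_matrix((vals, (rows, cols)), shape=(3, 3))

# Densify to check the layout round-trips.
dense = a.toarray()
print(dense)
```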
fchollet (Member) commented Aug 7, 2023

The general plan is to let backend ops receive and return backend-native sparse tensor types. E.g. backend.tensorflow.numpy.matmul should be able to receive sparse inputs, in which case it would return sparse outputs.

Then, when passed sparse data inputs (e.g. via tf.data or via scipy sparse arrays) we would not densify them and just pass them to the backend ops.

Lastly, we would avoid densifying gradients in the optimizer (which we currently do).

Most of the work will be in enabling sparse tensor support across all backend ops.

We should do this for the TensorFlow backend first, as it has the most mature support for sparse tensors so far.
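The TensorFlow behavior described above can be sketched with plain tf.sparse (this is today's TF API, not the eventual Keras backend-op signature):

```python
import tensorflow as tf

# A 3x3 sparse matrix in TF's COO-style SparseTensor format.
sp = tf.sparse.SparseTensor(
    indices=[[0, 0], [1, 2], [2, 1]],
    values=[1.0, 2.0, 3.0],
    dense_shape=[3, 3],
)
dense = tf.ones([3, 2])

# Sparse x dense matmul: only the stored values participate.
out = tf.sparse.sparse_dense_matmul(sp, dense)
print(out.numpy())
```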

@SuryanarayanaY SuryanarayanaY added the type:feature The user is asking for a new feature. label Aug 24, 2023
jackd (Contributor) commented Aug 29, 2023

I'd be willing to do a lot of the legwork to make this happen (e.g. writing backend wrappers and Keras Operations) if some framework for composite tensors (or even just a SparseTensor class) could be established.

@hertschuh hertschuh self-assigned this Aug 31, 2023
fchollet (Member) commented

@hertschuh is currently working on this -- but there may be items you guys can take on!

ghsanti (Contributor) commented Nov 19, 2024

Hi!

Are there any updates for other-than-tf support for sparse tensors @fchollet ?

hertschuh (Collaborator) commented

@ghsanti,

Sparse support has been added for JAX using jax.experimental.sparse.BCOO specifically. Feature-wise, it's at parity with TensorFlow for most things (gradients is the most notable gap).
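A minimal sketch of that BCOO path (jax.experimental.sparse is still experimental API, so details may shift between JAX versions):

```python
import jax.numpy as jnp
from jax.experimental import sparse

# Convert a mostly-zero dense matrix to JAX's batched-COO (BCOO) format.
dense = jnp.zeros((3, 3)).at[0, 1].set(2.0).at[2, 0].set(4.0)
m = sparse.BCOO.fromdense(dense)

# Sparse @ dense matvec; the result here is a dense array.
v = jnp.ones((3,))
out = m @ v
print(out)
```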

ghsanti (Contributor) commented Nov 19, 2024

I need the gradients to stay sparse (not dense), as densifying would otherwise defeat the purpose: memory would go up and speed would go down.

With this constraint on gradients, is tf the only possible backend?

Yet neither sparse graph convolutions nor sparse convolutions are available as tf layers.

The "graph" ones seem implementable in a custom layer using something like $\sigma (AW)$, since Keras supports sparse matmul through tf, right? And that would give the sparse gradient, if I understand you correctly?
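For reference, a graph-conv-style op along those lines can be sketched in plain TF; graph_conv here is a hypothetical helper, not an existing Keras or TF layer:

```python
import tensorflow as tf

def graph_conv(a_sparse, x, w):
    """Hypothetical graph-convolution-style op: relu(A x W).
    a_sparse: sparse adjacency (N, N); x: dense features (N, F); w: dense (F, U)."""
    agg = tf.sparse.sparse_dense_matmul(a_sparse, x)  # neighborhood aggregation
    return tf.nn.relu(tf.matmul(agg, w))

# Tiny 3-node path graph: edges 0-1 and 1-2 (symmetric).
a = tf.sparse.SparseTensor(
    indices=[[0, 1], [1, 0], [1, 2], [2, 1]],
    values=[1.0, 1.0, 1.0, 1.0],
    dense_shape=[3, 3],
)
x = tf.ones([3, 4])
w = tf.Variable(tf.ones([4, 2]))

with tf.GradientTape() as tape:
    y = graph_conv(a, x, w)
    loss = tf.reduce_sum(y)

# The gradient w.r.t. w flows through the sparse matmul.
grad = tape.gradient(loss, w)
print(grad.shape)
```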

hertschuh (Collaborator) commented

@ghsanti

JAX handles sparseness in gradients differently, so there's nothing we can do in Keras right now. That makes TF the only possible backend. Sparse convolutions are indeed not available in TensorFlow. Sparse matmul does produce sparse gradients, so you could try that.
