Hi everyone,
Motivation
Here's a brief description of what I'd like to do (related to the OP); the questions are at the bottom.
Here is a simple and very well-cited paper (~40K citations) on convolutions on graphs (I'd like to use it for chemistry.)
Torch Geometric
It would be easy with Torch Geometric (PyG):
However, I want to use Keras for multi-backend support.
The formula in the paper seems trivial to implement in a layer, but the authors report complexity that grows linearly with the number of edges; surely this relies on sparse matrices.
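To make the point concrete, here is a NumPy/SciPy sketch of the paper's propagation rule H' = σ(D^-1/2 (A + I) D^-1/2 H W) on a made-up toy graph; the sparse matmul only touches the nonzeros of the normalized adjacency, which is where the linear-in-edges cost comes from:

```python
import numpy as np
import scipy.sparse as sp

# Toy undirected graph: 4 nodes, edges 0-1, 1-2, 2-3 (all values made up).
edges = [(0, 1), (1, 2), (2, 3)]
n = 4
rows, cols = zip(*(edges + [(j, i) for (i, j) in edges]))
a = sp.coo_matrix((np.ones(len(rows)), (rows, cols)), shape=(n, n))

# Renormalization trick from the paper: A_hat = D^-1/2 (A + I) D^-1/2.
a_tilde = a + sp.eye(n)
deg = np.asarray(a_tilde.sum(axis=1)).ravel()
d_inv_sqrt = sp.diags(deg ** -0.5)
a_hat = d_inv_sqrt @ a_tilde @ d_inv_sqrt

h = np.random.randn(n, 3)  # node features
w = np.random.randn(3, 8)  # layer weights

# One layer H' = ReLU(A_hat H W); the sparse product A_hat @ (...) visits
# only nnz(A_hat) entries, so the cost scales with the number of edges.
h_next = np.maximum(a_hat @ (h @ w), 0.0)
```

With a dense adjacency the same product would cost O(N²) per layer, which is exactly what the sparse representation avoids.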
Question 0: It seems sparse tensor support in Keras is still in the queue?
Question 1: We need to re-implement the graph-like layers to get multi-backend support, right? Otherwise one would just use PyG.
Question 2: Can one use such sparse representations in pure Keras and then change the backend? I can't find a tutorial or reference, apart from TF itself here.