MessagePassing.propagate with flow="target_to_source" and size=(...) bug introduced in PyG 2.0.4 #4591
Comments
Thanks for reporting. It doesn't make much sense for

```python
mp = MessagePassing(flow="target_to_source", node_dim=0)
mp.propagate(torch.tensor([[0], [0]]), size=(100, 10), x=torch.randn(100, 3))
```

to succeed, since you should pass in a tuple of node features whenever your two node sets have different sizes:

```python
mp = MessagePassing(flow="target_to_source", node_dim=0)
out = mp.propagate(torch.tensor([[0], [0]]), size=(100, 10),
                   x=(torch.randn(100, 3), torch.randn(10, 3)))
print(out.shape)
```
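For anyone running these snippets locally, a self-contained version of the call above might look as follows (a minimal sketch: it instantiates the base `MessagePassing` class directly, as the thread does, relying on its default `message`, which forwards `x_j`, and its default sum aggregation; the shape that comes out is precisely what this issue is about):

```python
import torch
from torch_geometric.nn import MessagePassing

# Bipartite setting: 100 nodes with 3-dim features on one side,
# 10 nodes with 3-dim features on the other, connected by a single edge.
mp = MessagePassing(flow="target_to_source", node_dim=0)

edge_index = torch.tensor([[0], [0]])
out = mp.propagate(edge_index, size=(100, 10),
                   x=(torch.randn(100, 3), torch.randn(10, 3)))
print(out.shape)  # the disputed shape
```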
Thanks for coming back to me. I'm still a bit confused, and suspect a bug really is present in PyG 2.0.4. Imagine the following setting: I want to go from 100 nodes, which have features, to 10 nodes, which have none. As you suggest, we can add a zero feature on the 10 nodes and supply both:

```python
mp = MessagePassing(flow="target_to_source", node_dim=0)
out = mp.propagate(torch.tensor([[1], [2]]), size=(100, 10),
                   x=(torch.arange(100), torch.zeros(10)))
print(out)  # unexpectedly gives 10 zeros
```

If I flip the sizes around, I get:

```python
mp = MessagePassing(flow="target_to_source", node_dim=0)
out = mp.propagate(torch.tensor([[1], [2]]), size=(10, 100),
                   x=(torch.zeros(10), torch.arange(100)))
print(out)  # [0, 2, 0, 0, ...] length 100
```

With `source_to_target` (and the edge index reversed), I get:

```python
mp = MessagePassing(flow="source_to_target", node_dim=0)
out = mp.propagate(torch.tensor([[2], [1]]), size=(100, 10),
                   x=(torch.arange(100), torch.zeros(10)))
print(out)  # [0, 2, 0, ...] length 10
```

In PyG 2.0.3, I get the expected result:

```python
mp = MessagePassing(flow="target_to_source", node_dim=0)
out = mp.propagate(torch.tensor([[1], [2]]), size=(100, 10),
                   x=(torch.arange(100), torch.zeros(10)))
print(out)  # [0, 2, 0, ...] length 10
```

In PyG 2.0.3, I also correctly get the answer without providing the zero tensor, which seems preferable to me:

```python
mp = MessagePassing(flow="target_to_source", node_dim=0)
out = mp.propagate(torch.tensor([[1], [2]]), size=(100, 10),
                   x=torch.arange(100))
print(out)  # [0, 2, 0, ...] length 10
```

In PyG 2.0.4, this last call instead raises the aforementioned error. Thanks!
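For reference, the result the reporter expects from the first (100 → 10) example can be reproduced in plain PyTorch, independent of the PyG version (a minimal sketch using the same edge and feature values as above):

```python
import torch

edge_index = torch.tensor([[1], [2]])
x_src = torch.arange(100, dtype=torch.float)  # features of the 100 nodes

# Gather the feature of node 2 (value 2.) and add it into slot 1 of a
# length-10 output, matching the expected result quoted above.
expected = torch.zeros(10)
expected.index_add_(0, edge_index[0], x_src[edge_index[1]])
print(expected)  # [0, 2, 0, ...] length 10
```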
Can you upgrade to PyG master? I believe this has already been fixed in #4418.
That is, on master:

```python
mp = MessagePassing(flow="target_to_source", node_dim=0)
out = mp.propagate(torch.tensor([[1], [2]]), size=(100, 10),
                   x=(torch.arange(100), torch.zeros(10)))
print(out)  # gives 100 zeros
```
Thanks! It's indeed fixed on master. The correct code is (I don't need the zero tensor):

```python
mp = MessagePassing(flow="target_to_source", node_dim=0)
out = mp.propagate(torch.tensor([[1], [2]]), size=(10, 100),
                   x=torch.arange(100))
print(out)  # [0, 2, 0, 0, ...] length 10
```
🐛 Describe the bug
With `source_to_target`, we get the expected result. In PyG 2.0.4 with `target_to_source`, we instead get an output of a different shape, while calling `propagate` with a single (non-tuple) feature tensor raises an error. I'm not sure which of the two shapes is expected, but either the second or the third example should return the same as the first example.

In PyG 2.0.3, this was working fine and we get the expected result. I suspect the issue is related to the change made in #3907.
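Based on the call quoted verbatim in the first reply above, the example that raises in PyG 2.0.4 presumably looked like this (a reconstruction rather than the original snippet):

```python
import torch
from torch_geometric.nn import MessagePassing

mp = MessagePassing(flow="target_to_source", node_dim=0)

# A single (non-tuple) feature tensor together with an explicit bipartite
# size: this raises in PyG 2.0.4 but returned a tensor in PyG 2.0.3.
mp.propagate(torch.tensor([[0], [0]]), size=(100, 10), x=torch.randn(100, 3))
```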
Environment