Fix bug for input module in GINEConv (#5154)
* fix bug in gin_conv

* add changelog

* add changelog

* add error in case no type can be inferred

* fix inference

* fix test

* remove modulelist

* remove redundant if statement

Co-authored-by: Matthias Fey <matthias.fey@tu-dortmund.de>
Padarn and rusty1s authored Aug 8, 2022
1 parent afd12c2 commit 4433455
Showing 3 changed files with 17 additions and 5 deletions.
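
As context for the fix below, here is a minimal usage sketch of what this commit enables. It assumes an installation of PyG that already contains this change; the toy tensors mirror the shapes used in the updated test and are not part of the commit itself.

```python
import torch
from torch.nn import Linear, ReLU, Sequential

from torch_geometric.nn import GINEConv

x = torch.randn(4, 16)                                    # 4 nodes with 16 features
edge_index = torch.tensor([[0, 1, 2, 3], [1, 0, 3, 2]])   # 4 edges
edge_attr = torch.randn(4, 8)                             # 8-dimensional edge features

# A `Sequential` module already worked, because the `edge_dim` handling looked
# up the input size via `self.nn[0]`:
conv = GINEConv(Sequential(Linear(16, 32), ReLU()), edge_dim=8)
assert conv(x, edge_index, edge_attr).size() == (4, 32)

# A plain, non-`Sequential` module used to break that `self.nn[0]` lookup;
# after this fix, `in_features`/`in_channels` are read from the module itself:
conv = GINEConv(Linear(16, 32), edge_dim=8)
assert conv(x, edge_index, edge_attr).size() == (4, 32)
```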
1 change: 1 addition & 0 deletions CHANGELOG.md
@@ -65,6 +65,7 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 - Added support for graph-level outputs in `to_hetero` ([#4582](https://github.com/pyg-team/pytorch_geometric/pull/4582))
 - Added `CHANGELOG.md` ([#4581](https://github.com/pyg-team/pytorch_geometric/pull/4581))
 ### Changed
+- Fixed `GINEConv` bug with non-sequential input ([#5154](https://github.com/pyg-team/pytorch_geometric/pull/5154))
 - Improved error message ([#5095](https://github.com/pyg-team/pytorch_geometric/pull/5095))
 - Fixed `HGTLoader` bug which produced outputs with missing edge types ([#5067](https://github.com/pyg-team/pytorch_geometric/pull/5067))
 - Fixed dynamic inheritance issue in data batching ([#5051](https://github.com/pyg-team/pytorch_geometric/pull/5051))
5 changes: 5 additions & 0 deletions test/nn/conv/test_gin_conv.py
@@ -126,6 +126,11 @@ def test_gine_conv_edge_dim():
     out = conv(x, edge_index, edge_attr)
     assert out.size() == (4, 32)
 
+    nn = Lin(16, 32)
+    conv = GINEConv(nn, train_eps=True, edge_dim=8)
+    out = conv(x, edge_index, edge_attr)
+    assert out.size() == (4, 32)
+
 
 def test_static_gin_conv():
     x = torch.randn(3, 4, 16)
16 changes: 11 additions & 5 deletions torch_geometric/nn/conv/gin_conv.py
@@ -130,8 +130,9 @@ class GINEConv(MessagePassing):
         - **output:** node features :math:`(|\mathcal{V}|, F_{out})` or
           :math:`(|\mathcal{V}_t|, F_{out})` if bipartite
     """
-    def __init__(self, nn: Callable, eps: float = 0., train_eps: bool = False,
-                 edge_dim: Optional[int] = None, **kwargs):
+    def __init__(self, nn: torch.nn.Module, eps: float = 0.,
+                 train_eps: bool = False, edge_dim: Optional[int] = None,
+                 **kwargs):
         kwargs.setdefault('aggr', 'add')
         super().__init__(**kwargs)
         self.nn = nn
@@ -141,11 +142,16 @@ def __init__(self, nn: Callable, eps: float = 0., train_eps: bool = False,
         else:
             self.register_buffer('eps', torch.Tensor([eps]))
         if edge_dim is not None:
-            if hasattr(self.nn[0], 'in_features'):
-                in_channels = self.nn[0].in_features
+            if isinstance(self.nn, torch.nn.Sequential):
+                nn = self.nn[0]
+            if hasattr(nn, 'in_features'):
+                in_channels = nn.in_features
+            elif hasattr(nn, 'in_channels'):
+                in_channels = nn.in_channels
             else:
-                in_channels = self.nn[0].in_channels
+                raise ValueError("Could not infer input channels from `nn`.")
             self.lin = Linear(edge_dim, in_channels)
+
         else:
             self.lin = None
         self.reset_parameters()
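
Read in isolation, the channel-inference logic added above amounts to the following standalone sketch (the helper name `infer_in_channels` is hypothetical and not part of torch_geometric):

```python
import torch

def infer_in_channels(nn: torch.nn.Module) -> int:
    # Mirrors the pattern introduced in `GINEConv.__init__`: for a
    # `Sequential` container, only the first layer determines the input size.
    if isinstance(nn, torch.nn.Sequential):
        nn = nn[0]
    if hasattr(nn, 'in_features'):    # e.g. torch.nn.Linear
        return nn.in_features
    elif hasattr(nn, 'in_channels'):  # e.g. torch.nn.Conv1d
        return nn.in_channels
    raise ValueError("Could not infer input channels from `nn`.")

assert infer_in_channels(torch.nn.Linear(16, 32)) == 16
assert infer_in_channels(torch.nn.Sequential(torch.nn.Linear(16, 32), torch.nn.ReLU())) == 16
assert infer_in_channels(torch.nn.Conv1d(3, 8, kernel_size=1)) == 3
```

Raising a `ValueError` when neither attribute is present (see the "add error in case no type can be inferred" commit above) avoids silently constructing an edge-feature projection with the wrong dimensionality.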
