
The SpecialSpmmFunction Class question #48

Open
Colorfu1 opened this issue Oct 15, 2019 · 0 comments
Colorfu1 commented Oct 15, 2019

I noticed that `SpecialSpmmFunction` is a subclass of `torch.autograd.Function`, and that only one `SpecialSpmm` object is created in the `SpGraphAttentionLayer` class:

```python
class SpGraphAttentionLayer(nn.Module):
    def __init__(self, in_features, out_features, dropout, alpha, concat=True):
        super(SpGraphAttentionLayer, self).__init__()
        self.in_features = in_features
        self.out_features = out_features
        self.alpha = alpha
        self.concat = concat

        self.W = nn.Parameter(torch.zeros(size=(in_features, out_features)))
        nn.init.xavier_normal_(self.W.data, gain=1.414)

        self.a = nn.Parameter(torch.zeros(size=(1, 2*out_features)))
        nn.init.xavier_normal_(self.a.data, gain=1.414)

        self.dropout = nn.Dropout(dropout)
        self.leakyrelu = nn.LeakyReLU(self.alpha)
        self.special_spmm = SpecialSpmm()
```

But the official documentation says that "Each function object is meant to be used only once (in the forward pass)." However, I found that `self.special_spmm` is called twice in `forward`:

```python
    e_rowsum = self.special_spmm(edge, edge_e, torch.Size([N, N]), torch.ones(size=(N, 1), device=dv))
    # e_rowsum: N x 1

    edge_e = self.dropout(edge_e)
    # edge_e: E

    # Each function object is meant to be used only once (in the forward pass).
    h_prime = self.special_spmm(edge, edge_e, torch.Size([N, N]), h)
```
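My understanding is that `SpecialSpmm` is an `nn.Module` whose `forward` simply delegates to `SpecialSpmmFunction.apply`; if that is the case, `apply` would construct a fresh function/context object on every call, so the "use only once" rule would apply per call, not per module. A minimal stdlib-only sketch of that pattern (the class names mirror the repo, but the bodies here are purely illustrative stand-ins, not the actual torch or pyGAT code):

```python
# Illustrative stand-in for torch.autograd.Function: each call to
# ``apply`` builds a brand-new object, so per-call state never leaks
# between invocations.
class FakeFunction:
    @classmethod
    def apply(cls, *args):
        ctx = cls()          # fresh "function object" for this one call
        return ctx.forward(*args)

class SpecialSpmmFunction(FakeFunction):
    def forward(self, a, b):
        self.saved = (a, b)  # per-call state, like ctx.save_for_backward
        return a + b

class SpecialSpmm:
    """Module-style wrapper: reusable because it only delegates to ``apply``."""
    def __call__(self, a, b):
        return SpecialSpmmFunction.apply(a, b)

spmm = SpecialSpmm()
print(spmm(1, 2), spmm(3, 4))  # → 3 7: one wrapper, two safe calls
```

If that reading is right, the single `self.special_spmm` attribute is just a reusable entry point, and a new `Function` instance is created each of the two times it is invoked.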

Have I misunderstood something?
