
Added return_semantic_attention_weights parameter to HANConv #5787

Merged
merged 13 commits into from
Oct 20, 2022

Conversation

manuel-dileo
Contributor

Added return_semantic_attention_weights parameter to forward method of HANConv, in the same way GATConv returns its attention_weights.

return_semantic_attention_weights (bool, optional): If set to :obj:`True`, will additionally return the tensor :obj:`semantic_attention_weights`, holding the computed attention weights for each edge type at semantic-level attention. (default: :obj:`None`)

Further work could be dedicated to also returning node-level attention weights as a dictionary of (edge_type, attention_weights) items.
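For context, the semantic-level attention that these returned weights come from (as described in the HAN paper) scores each edge type, normalizes the scores with a softmax, and fuses the per-type node embeddings with the resulting weights. The following is a minimal pure-Python sketch of that idea; it is not the HANConv implementation (which uses learned projections and PyTorch tensors), and the function name and simplified scoring (a fixed attention vector `q` applied to `tanh` of the embeddings, without the paper's learned linear projection) are illustrative assumptions:

```python
import math

def semantic_attention(z_per_type, q):
    """Sketch of HAN-style semantic-level attention.

    z_per_type: dict mapping edge_type -> list of node embeddings
                (each embedding a list of floats, all the same length as q).
    q: attention vector (stand-in for the paper's learned q, W, b).

    Returns (fused_embeddings, beta) where beta maps each edge type to
    its softmax-normalized semantic attention weight.
    """
    # Score each edge type: mean over nodes of q . tanh(z)
    scores = {}
    for etype, embs in z_per_type.items():
        s = sum(
            sum(qi * math.tanh(zi) for qi, zi in zip(q, z))
            for z in embs
        ) / len(embs)
        scores[etype] = s

    # Softmax over edge types (shifted by the max for numerical stability)
    m = max(scores.values())
    exps = {t: math.exp(s - m) for t, s in scores.items()}
    total = sum(exps.values())
    beta = {t: e / total for t, e in exps.items()}

    # Fuse: weighted sum of per-type embeddings for each node
    types = list(z_per_type)
    num_nodes = len(next(iter(z_per_type.values())))
    dim = len(q)
    fused = [
        [sum(beta[t] * z_per_type[t][i][k] for t in types) for k in range(dim)]
        for i in range(num_nodes)
    ]
    return fused, beta
```

With `return_semantic_attention_weights=True`, HANConv exposes the analogue of `beta` alongside the fused output, so one can inspect how much each edge type contributed.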

@codecov

codecov bot commented Oct 20, 2022

Codecov Report

Merging #5787 (27e4680) into master (dc916f0) will decrease coverage by 0.01%.
The diff coverage is 66.66%.

❗ Current head 27e4680 differs from pull request most recent head 357be3b. Consider uploading reports for the commit 357be3b to get more accurate results

@@            Coverage Diff             @@
##           master    #5787      +/-   ##
==========================================
- Coverage   83.97%   83.96%   -0.02%     
==========================================
  Files         349      349              
  Lines       19422    19428       +6     
==========================================
+ Hits        16309    16312       +3     
- Misses       3113     3116       +3     
Impacted Files | Coverage Δ
torch_geometric/nn/conv/han_conv.py | 96.66% <66.66%> (-3.34%) ⬇️


@rusty1s rusty1s changed the title Added return_semantic_attention_weights parameter to HANConv Added return_semantic_attention_weights parameter to HANConv Oct 20, 2022
torch_geometric/nn/conv/han_conv.py Outdated Show resolved Hide resolved
manuel-dileo and others added 9 commits October 20, 2022 20:39
Co-authored-by: Matthias Fey <matthias.fey@tu-dortmund.de>
@rusty1s rusty1s enabled auto-merge (squash) October 20, 2022 18:55
@rusty1s rusty1s merged commit 2ffd0b7 into pyg-team:master Oct 20, 2022
JakubPietrakIntel pushed a commit to JakubPietrakIntel/pytorch_geometric that referenced this pull request Nov 25, 2022
@jasmineChouLujie

Does the return value of conv.HANConv in torch_geometric contain both parts described in the paper, i.e. node-level attention and semantic-level attention?

3 participants