Support return embedding for MLP layer #4625

Merged (5 commits into master from mlp_emb, May 11, 2022)
Conversation

JiaxuanYou (Contributor)

The added return_emb option is useful when a user would like to extract embeddings along with predictions.
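
For context, a rough usage sketch of the proposed feature; the constructor arguments and the (out, emb) return convention here are illustrative assumptions, not code taken from this PR's diff:

import torch
from torch_geometric.nn import MLP

# Assumption for illustration: forward() accepts a return_emb flag and then
# returns both the final prediction and the last hidden representation.
model = MLP([16, 32, 8])              # channel list: in=16, hidden=32, out=8
x = torch.randn(100, 16)

out, emb = model(x, return_emb=True)  # predictions together with embeddings
print(out.shape, emb.shape)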

codecov bot commented May 11, 2022

Codecov Report

Merging #4625 (cb9c230) into master (e3ba9d3) will increase coverage by 0.00%.
The diff coverage is 100.00%.

@@           Coverage Diff           @@
##           master    #4625   +/-   ##
=======================================
  Coverage   82.88%   82.88%           
=======================================
  Files         316      316           
  Lines       16675    16677    +2     
=======================================
+ Hits        13821    13823    +2     
  Misses       2854     2854           
Impacted Files                     Coverage Δ
torch_geometric/nn/models/mlp.py   98.46% <100.00%> (+0.04%) ⬆️

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data

rusty1s (Member) commented May 11, 2022

Can we do this as part of forward?

def forward(self, x, return_emb=True)
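
A minimal sketch of this forward-based approach, assuming the embedding is the activation just before the final Linear layer; names and structure are illustrative, not the code merged in this PR:

import torch
from torch import Tensor
from torch.nn import Linear, ReLU

class SimpleMLP(torch.nn.Module):  # illustrative stand-in for the MLP model
    def __init__(self, in_channels: int, hidden_channels: int, out_channels: int):
        super().__init__()
        self.lin1 = Linear(in_channels, hidden_channels)
        self.lin2 = Linear(hidden_channels, out_channels)
        self.act = ReLU()

    def forward(self, x: Tensor, return_emb: bool = False):
        emb = self.act(self.lin1(x))  # hidden representation ("embedding")
        out = self.lin2(emb)          # final prediction
        # Return the embedding alongside the prediction only when requested:
        return (out, emb) if return_emb else out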

JiaxuanYou (Contributor, Author)

Can we do this as part of forward?

def forward(self, x, return_emb=True)

Sounds good!

JiaxuanYou (Contributor, Author)

Updated

@JiaxuanYou JiaxuanYou merged commit 363a4eb into master May 11, 2022
@JiaxuanYou JiaxuanYou deleted the mlp_emb branch May 11, 2022 23:50