scaling c_{k}_{lambd} #2

Open

spozdn opened this issue Jul 28, 2020 · 1 comment

spozdn (Contributor) commented Jul 28, 2020

When importances are normalized, i.e. they do not change under uniform scaling of the separate lambda channels, c_{k}_{lambd} has no effect at all.
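A minimal sketch of why normalization removes the effect of c_{k}_{lambd}, assuming variance-based importances normalized within each lambda channel (normalized_importances and the data layout here are hypothetical illustrations, not the library's actual API):

```python
import numpy as np

# Hypothetical variance-based importances, normalized within each
# lambda channel; the normalization is what removes any dependence
# on a per-channel scaling coefficient c_{k}_{lambd}.
def normalized_importances(features_by_lambda):
    importances = {}
    for lambd, feats in features_by_lambda.items():
        raw = np.var(feats, axis=0)           # one raw importance per feature
        importances[lambd] = raw / raw.sum()  # normalization cancels the scale
    return importances

rng = np.random.default_rng(0)
features = {0: rng.standard_normal((100, 5)),
            1: rng.standard_normal((100, 7))}

before = normalized_importances(features)

# Uniformly scale one lambda channel, as c_{k}_{lambd} would.
scaled = dict(features)
scaled[1] = 3.7 * features[1]
after = normalized_importances(scaled)

# The normalized importances are unchanged.
assert all(np.allclose(before[l], after[l]) for l in features)
```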

Including c_{k}_{lambd} in the iteration would increase computational cost.

c_{k}_{lambd}, where k is l2, is not symmetric with respect to changing the order of the covariants being contracted. Thus it would cause the single contraction function to lose its general form, in which the arguments are arbitrary covariants. A toy sketch of this asymmetry is below.
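A toy sketch of the asymmetry, using a simple stand-in contraction rather than the actual Clebsch-Gordan iteration (contract, the dictionary layout, and the coefficients are all illustrative assumptions):

```python
import numpy as np

def contract(first, second, c):
    """Toy contraction of two sets of covariants, standing in for the
    real Clebsch-Gordan iteration. The coefficient c is indexed by the
    lambda of the *second* argument (k = l2), so swapping the arguments
    applies different coefficients and generally changes the result."""
    out = {}
    for l1, a in first.items():
        for l2, b in second.items():
            key = l1 + l2
            out[key] = out.get(key, 0.0) + c[l2] * np.outer(a, b).sum()
    return out

A = {0: np.array([1.0, 2.0]), 1: np.array([0.5, -1.0])}
B = {0: np.array([3.0, 1.0]), 1: np.array([2.0, 0.0])}
c = {0: 1.0, 1: 10.0}  # non-uniform per-lambda coefficients

print(contract(A, B, c))  # {0: 12.0, 1: 58.0, 2: -10.0}
print(contract(B, A, c))  # {0: 12.0, 1: -14.0, 2: -10.0}
```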

ceriottm (Contributor) commented Sep 7, 2020

This is an interesting issue, but one that needs testing. Scaling different lambda channels might affect model accuracy in real-life scenarios; I have anecdotal evidence that it does.
