The same value of Shapley #37

Open
zhren711 opened this issue Nov 10, 2024 · 1 comment

Comments

@zhren711

Great work! One question, though: when calculating the loss, it seems that the Shapley value is the same for each agent. Is this reasonable?
[screenshots of the computed Shapley values attached]

@hsvgbkhgbv
Member

hsvgbkhgbv commented Nov 10, 2024

Hi,

Thanks for your interest. This can happen when you evaluate the optimal Shapley values with the optimal actions.

Please refer to Section E.4.1 in https://arxiv.org/abs/2105.15013 for more details.
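For intuition, here is a minimal standalone sketch (generic exact Shapley value computation on a toy coalitional game, not the SHAQ code): by the symmetry property, agents that contribute equally to the team value receive identical Shapley values, so equal per-agent values are not by themselves a sign of a bug. The `team_value` function below is a made-up symmetric example.

```python
from itertools import permutations

def shapley_values(agents, value_fn):
    """Exact Shapley values: average each agent's marginal contribution
    over all orderings of the agents."""
    phi = {a: 0.0 for a in agents}
    orderings = list(permutations(agents))
    for order in orderings:
        coalition = set()
        for a in order:
            before = value_fn(frozenset(coalition))
            coalition.add(a)
            after = value_fn(frozenset(coalition))
            phi[a] += after - before
    return {a: phi[a] / len(orderings) for a in agents}

# Hypothetical symmetric team value: depends only on coalition size,
# mimicking agents that contribute equally under the optimal joint action.
team_value = lambda coalition: float(len(coalition)) ** 2

print(shapley_values([0, 1, 2], team_value))
# -> {0: 3.0, 1: 3.0, 2: 3.0}  (every agent gets the same Shapley value)
```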

Hope this answer helps!
