Great work! One question, though: when calculating the loss, the Shapley value seems to be the same for each agent. Is this reasonable?
Hi,
Thanks for your interest. This can happen when you evaluate the optimal Shapley values under optimal actions.
Please refer to Section E.4.1 in https://arxiv.org/abs/2105.15013 for more details.
Hope this answer helps!
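This is consistent with the symmetry axiom of the Shapley value: agents whose marginal contributions are interchangeable receive identical values. A minimal sketch (not the repository's code) with a hypothetical symmetric characteristic function, where the coalition value depends only on coalition size, as might happen when all agents play optimal actions:

```python
from itertools import permutations

def shapley_values(agents, value_fn):
    """Exact Shapley values: average each agent's marginal
    contribution over all orderings of the agents."""
    totals = {a: 0.0 for a in agents}
    orderings = list(permutations(agents))
    for order in orderings:
        coalition = frozenset()
        for a in order:
            with_a = coalition | {a}
            totals[a] += value_fn(with_a) - value_fn(coalition)
            coalition = with_a
    return {a: totals[a] / len(orderings) for a in agents}

# Hypothetical symmetric game: value depends only on coalition size.
v = lambda coalition: len(coalition) ** 2
print(shapley_values(["a", "b", "c"], v))  # every agent gets 3.0
```

Because `v` treats all agents identically, each agent receives the same value (3.0), and the values sum to `v({a, b, c}) = 9` as efficiency requires.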