Replies: 1 comment
-
Thanks a lot for your feedback. I'm trying to fix this bug.
-
Hi!
I was trying out the package on my model and noticed that pruning with LAMP importance does not work for me because of indexing issues (indices out of bounds). Unfortunately I cannot go into more detail, but I worked around it by copying some of the relevant code from the MagnitudeImportance class. I think the issue might be related to ch_groups, which LAMPImportance normally does not take into consideration: in its __call__() function it is collected in **kwargs, but that is it, it is never used. A rough sketch of what I mean follows below.
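To make the question more concrete, here is a minimal, self-contained sketch in plain PyTorch of what I believe the two ingredients are: the LAMP rescaling of per-channel magnitude scores, plus the kind of per-group averaging that ch_groups triggers in MagnitudeImportance. The function name and the exact group reduction are my own assumptions for illustration; this is not the library's actual code.

```python
import torch

def lamp_scores_sketch(weight: torch.Tensor, ch_groups: int = 1) -> torch.Tensor:
    # Per-output-channel magnitude importance: sum of squared weights per channel.
    imp = weight.flatten(1).pow(2).sum(dim=1)

    # LAMP rescaling: each channel's squared magnitude divided by the sum of
    # itself and all channels with larger magnitude (the largest channel scores 1.0).
    order = torch.argsort(imp, descending=True)
    cumsum = torch.cumsum(imp[order], dim=0)
    scores = torch.empty_like(imp)
    scores[order] = imp[order] / cumsum

    # Hypothetical ch_groups handling, modelled on what MagnitudeImportance
    # appears to do: average the scores across channel groups so every channel
    # in a group receives the same value and groups are pruned consistently.
    if ch_groups > 1:
        scores = scores.view(ch_groups, -1).mean(dim=0).repeat(ch_groups)

    return scores

# Example: a Conv2d-like weight with 8 output channels, treated as 2 groups.
w = torch.randn(8, 4, 3, 3)
print(lamp_scores_sketch(w, ch_groups=2))
```

My workaround was essentially the last step (the grouped reduction), copied over from MagnitudeImportance, which made the score tensor's length match what the pruner expected and removed the out-of-bounds indices.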
Question:
Why doesn't LAMPImportance take ch_groups into consideration?
Although copying the relevant code eventually fixed the indexing issues, it may have resulted in an incorrect implementation. What do you think?
*Sorry for not providing reproducible code to show the issue.