Hi SVD team, I cannot reproduce the LLaMA-7B result on PTB. The model compression ratio is 0.2:
python SVDLLM.py --step 1 --model /data/llama-7b --ratio 0.2 --whitening_nsamples 256 --model_seq_len 2048 --dataset wikitext2 --seed 3 --save_path SVDLLM
The ppl result is:
PPL after pruning: {'wikitext2': 7.891565003099237}
PPL after pruning: {'ptb': 71.51362022345916}
PPL after pruning: {'c4': 16.115304662697902}
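For context, the PPL values above are standard perplexities, i.e. the exponential of the average per-token negative log-likelihood over the test set. A minimal sketch of that relationship (this is an illustration only, not SVD-LLM's actual evaluation code, which is not shown here):

```python
import math

def perplexity(token_logprobs):
    """Perplexity = exp(mean negative log-likelihood over all tokens)."""
    nll = -sum(token_logprobs) / len(token_logprobs)
    return math.exp(nll)

# Example: if every token gets probability 1/8, perplexity is exactly 8.
lp = [math.log(1 / 8)] * 4
print(perplexity(lp))  # → 8.0
```

A PPL of 71.5 on PTB thus means the compressed model assigns PTB tokens roughly the probability of a uniform choice among ~71 candidates, far worse than its wikitext2 score of ~7.9.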
Is there anything special to be aware of regarding the PTB dataset? Thanks.