
Reproduce Problem on PTB dataset #21

Open
jzzzf opened this issue Dec 2, 2024 · 0 comments

jzzzf commented Dec 2, 2024

Hi SVD team, I cannot reproduce the llama-7b result on PTB. The model compression ratio is 0.2:

python SVDLLM.py --step 1 --model /data/llama-7b --ratio 0.2 --whitening_nsamples 256 --model_seq_len 2048 --dataset wikitext2 --seed 3 --save_path SVDLLM 

The PPL results are:

PPL after pruning: {'wikitext2': 7.891565003099237}
PPL after pruning: {'ptb': 71.51362022345916}
PPL after pruning: {'c4': 16.115304662697902}
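
For reference, the perplexity reported above is typically computed by sliding over the tokenized test split in fixed-length chunks and averaging the next-token loss. Below is a minimal sketch assuming the Hugging Face transformers API; eval_ppl, the non-overlapping chunking, and the device are illustrative assumptions on my side, not the repo's actual evaluation code:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

def eval_ppl(model, tokenizer, text, seq_len=2048, device="cuda"):
    # Tokenize the whole test split as one long sequence
    ids = tokenizer(text, return_tensors="pt").input_ids.to(device)
    nlls = []
    # Slide over the sequence in non-overlapping seq_len chunks
    for i in range(0, ids.shape[1] - seq_len, seq_len):
        chunk = ids[:, i:i + seq_len]
        with torch.no_grad():
            # labels=chunk makes the model return the mean next-token cross-entropy
            loss = model(chunk, labels=chunk).loss
        nlls.append(loss.float() * seq_len)
    # Perplexity = exp(average negative log-likelihood per token)
    return torch.exp(torch.stack(nlls).sum() / (len(nlls) * seq_len)).item()

In a sketch like this, a higher average loss on PTB's test text translates directly into the higher PPL value shown above.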

Is there anything special to be aware of regarding the PTB dataset? Thanks.
