When I used this mode, I got an error like this:
File "/data-store/shangguanzixuan/project/ICD-main/src/benchmark_evaluation/factscore_eval.py", line 225, in
model_completion, c_dist = llm.generate(prompt, prompt_evil, **generate_kwargs)
File "/data-store/shangguanzixuan/project/ICD-main/src/decoding_algorithm/contrastive_decoding.py", line 176, in generate
outputs = self.model.generate(input_ids, evil_input_ids=evil_input_ids, max_length=max_len, num_return_sequences=1,
File "/data-store/shangguanzixuan/anaconda3/envs/icd/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
File "/data-store/shangguanzixuan/project/ICD-main/transformers/src/transformers/generation/utils.py", line 1756, in generate
return self.prompt_contrastive_decoding_sample(
File "/data-store/shangguanzixuan/project/ICD-main/transformers/src/transformers/generation/utils.py", line 3780, in prompt_contrastive_decoding_sample
evil_outputs = self(
File "/data-store/shangguanzixuan/anaconda3/envs/icd/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
return forward_call(*args, **kwargs)
File "/data-store/shangguanzixuan/project/ICD-main/transformers/src/transformers/models/llama/modeling_llama.py", line 821, in forward
outputs = self.model(
File "/data-store/shangguanzixuan/anaconda3/envs/icd/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
return forward_call(*args, **kwargs)
File "/data-store/shangguanzixuan/project/ICD-main/transformers/src/transformers/models/llama/modeling_llama.py", line 659, in forward
position_ids = position_ids.view(-1, seq_length).long()
RuntimeError: shape '[-1, 177]' is invalid for input of size 105
My guess is that the state inside the LlamaForCausalLM class (such as position_ids) isn't reset when two prompts are passed in, which causes this error: the mismatch between shape '[-1, 177]' and an input of size 105 suggests that position_ids built for one prompt is being reshaped with the other prompt's sequence length.
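To illustrate the suspected cause, here is a minimal sketch (not the repo's actual code; make_position_ids is a hypothetical helper): reusing a position_ids tensor built for a 105-token prompt while seq_length is 177 reproduces the exact RuntimeError, and rebuilding position_ids from each input avoids it.

```python
import torch

def make_position_ids(input_ids):
    # Rebuild position_ids from the current input instead of reusing a
    # cached tensor from a previous, differently sized prompt.
    batch_size, seq_length = input_ids.shape
    return (torch.arange(seq_length, dtype=torch.long)
            .unsqueeze(0)
            .expand(batch_size, -1))

# Reproduce the failing view: position_ids with 105 elements cannot be
# reshaped to a seq_length of 177.
stale_position_ids = torch.arange(105)
try:
    stale_position_ids.view(-1, 177)
except RuntimeError as e:
    print(e)  # shape '[-1, 177]' is invalid for input of size 105

# Rebuilding per prompt keeps the shapes consistent.
prompt = torch.zeros(1, 177, dtype=torch.long)
evil_prompt = torch.zeros(1, 105, dtype=torch.long)
assert make_position_ids(prompt).shape == (1, 177)
assert make_position_ids(evil_prompt).shape == (1, 105)
```

If this diagnosis is right, the fix in prompt_contrastive_decoding_sample would be to recompute (or clear) position_ids before the second forward pass on evil_input_ids rather than carrying over the tensor prepared for the first prompt.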