Setting ds_accelerator to cuda (auto detect)
Please install apex to use fused_layer_norm, fall back to torch.nn.LayerNorm
Please install apex to use FusedScaleMaskSoftmax, otherwise the inference efficiency will be greatly reduced
WARNING: No training data specified
using world size: 1 and model-parallel size: 1
> initializing model parallel with size 1
building CachedAutoregressiveModel model ...
> number of parameters on model parallel rank 0: 354610176
global rank 0 is loading checkpoint /data/dell/MathGLM-Large/1/mp_rank_00_model_states.pt
Will continue but found unexpected_keys! Check whether you are loading correct checkpoints: ['mixins.block_position_embedding.block_position_embeddings.weight'].
successfully loaded /data/dell/hyc/all_ckp/MathGLM-Large/1/mp_rank_00_model_states.pt
Working on No. 0 on model group 0...
Traceback (most recent call last):
File "/data/MathGLM/MathGLM_Arithmetic/inference_mathglm.py", line 107, in <module>
main(args)
File "/data/dell/MathGLM/MathGLM_Arithmetic/inference_mathglm.py", line 95, in main
generate_continually(process, args.input_source)
File "/data/dell/anaconda3/envs/mathglm/lib/python3.9/site-packages/SwissArmyTransformer/generation/utils.py", line 83, in generate_continually
func(raw_text)
File "/data/dell/MathGLM/MathGLM_Arithmetic/inference_mathglm.py", line 79, in process
decoded_txts = icetk.decode(seq)
File "/data/dell/anaconda3/envs/mathglm/lib/python3.9/site-packages/icetk/ice_tokenizer.py", line 95, in decode
return self.text_tokenizer.decode(ids).replace('<n>', '\n')
File "/data/dell/anaconda3/envs/mathglm/lib/python3.9/site-packages/icetk/text_tokenizer.py", line 62, in decode
return self.sp.DecodeIds(ids)
File "/data/dell/anaconda3/envs/mathglm/lib/python3.9/site-packages/sentencepiece/__init__.py", line 837, in DecodeIds
return self.Decode(input=input, out_type=out_type, **kwargs)
File "/data/dell/anaconda3/envs/mathglm/lib/python3.9/site-packages/sentencepiece/__init__.py", line 780, in Decode
return self._DecodeIds(input)
File "/data/dell/anaconda3/envs/mathglm/lib/python3.9/site-packages/sentencepiece/__init__.py", line 337, in _DecodeIds
return _sentencepiece.SentencePieceProcessor__DecodeIds(self, ids)
IndexError: Out of range: piece id is out of range.
I set up the environment following env.yml, downloaded the MathGLM-2B model from the Cloud, and hit the error above when running inference.sh.
What is causing this error?
Also, how capable is this model at arithmetic? I tried writing my own inference script against the downloaded checkpoint, and its arithmetic ability did not seem very good. Does the prompt need to follow a special format?
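The `IndexError: piece id is out of range` from sentencepiece suggests the generated sequence contains token ids outside the text tokenizer's vocabulary (icetk reserves a range of ids for non-text tokens, so raw generation output can include ids the sentencepiece decoder cannot handle). As a diagnostic sketch (not the repository's fix — the exact vocabulary bounds for icetk are an assumption here), one can filter out-of-range ids before calling `decode`:

```python
def filter_decodable_ids(ids, vocab_size):
    """Keep only token ids inside [0, vocab_size).

    ids:        iterable of candidate token ids from generation
    vocab_size: vocabulary size of the text tokenizer
                (exclusive upper bound for valid piece ids)
    """
    return [i for i in ids if 0 <= i < vocab_size]


# Example: negative ids and ids beyond the vocabulary are dropped.
cleaned = filter_decodable_ids([5, -1, 99999, 42], vocab_size=1000)
print(cleaned)  # [5, 42]

# In inference_mathglm.py one could then (hypothetically) call
#   decoded_txts = icetk.decode(filter_decodable_ids(seq, icetk_text_vocab_size))
# where icetk_text_vocab_size is whatever bound the installed icetk
# version exposes -- check its documentation rather than hard-coding.
```

If filtering makes the error disappear but the output is garbled, the more likely root cause is a checkpoint/tokenizer version mismatch (note the `unexpected_keys` warning during checkpoint loading above), and verifying that the checkpoint matches the expected model config would be the next step.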