Replies: 3 comments
-
Currently I only have a torch one for tinyshakespeare, torch_text_train_script.py, based on GitHub karpathy/nanoGPT. The basic components for training a text model are not that hard, and we have a bunch of examples in the TF tutorials. I may try a script if I have some spare time.
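To unpack "the basic components are not so hard": a minimal sketch of a next-token training loop in the spirit of karpathy/nanoGPT, assuming a toy random dataset and a stand-in embedding-plus-linear model (a real script would use a Transformer and encoded text):

```python
# Hypothetical minimal text-model training loop; the model, data, and
# hyperparameters here are stand-in assumptions for illustration only.
import torch
import torch.nn as nn

torch.manual_seed(0)
vocab_size, block_size, batch_size = 65, 16, 4

# Toy "dataset": random token ids standing in for encoded text.
data = torch.randint(0, vocab_size, (1000,))

def get_batch():
    # Sample random contiguous chunks; targets are inputs shifted by one.
    ix = torch.randint(0, len(data) - block_size - 1, (batch_size,))
    x = torch.stack([data[i:i + block_size] for i in ix])
    y = torch.stack([data[i + 1:i + 1 + block_size] for i in ix])
    return x, y

# Stand-in model: embedding -> linear head (a real one would use attention).
model = nn.Sequential(nn.Embedding(vocab_size, 32), nn.Linear(32, vocab_size))
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)

for step in range(50):
    x, y = get_batch()
    logits = model(x)  # (batch, block, vocab)
    loss = nn.functional.cross_entropy(logits.view(-1, vocab_size), y.view(-1))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

The same pieces (batch sampler, model, cross-entropy on shifted targets, optimizer step) are what the TF tutorial examples implement as well.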
-
For LLaMA v2 training you may also check karpathy/llama2.c/train.py. Regarding CLIP, which
-
It's this one: torch_clip_train_script.py. The actual loss value doesn't matter, as it's only trained on my test dataset coco_dog_cat.tar.gz.
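One reason the absolute loss value on a tiny test dataset carries little meaning: the CLIP-style symmetric contrastive loss has a baseline set by the batch size, since random (untrained) features give a loss near ln(batch_size). A sketch, assuming random stand-in features in place of real image/text encoder outputs:

```python
# Hypothetical sketch of a CLIP-style symmetric contrastive loss; the
# feature tensors, dimensions, and logit scale are assumptions.
import torch
import torch.nn.functional as F

torch.manual_seed(0)
batch_size, dim = 8, 64

# Stand-ins for encoder outputs on paired images and captions.
image_feats = F.normalize(torch.randn(batch_size, dim), dim=-1)
text_feats = F.normalize(torch.randn(batch_size, dim), dim=-1)

# Cosine-similarity logits; real CLIP scales these by a learned temperature.
logit_scale = 1.0
logits = logit_scale * image_feats @ text_feats.t()

# Matched pairs lie on the diagonal; cross-entropy both ways, averaged.
labels = torch.arange(batch_size)
loss = (F.cross_entropy(logits, labels) +
        F.cross_entropy(logits.t(), labels)) / 2
```

With random features the loss sits near ln(8) ≈ 2.08 here, so comparing raw loss values across datasets or batch sizes says little by itself.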
-
I think it is really great to see this repo has some SOTA NLP models like GPT-2 and LLaMA v2. It would be nice to have a training script. Any plans to publish the training code?