ORPO seems broken with micro_batch_size or eval_batch_size > 1 #1489
Labels: bug (Something isn't working)
Please check that this issue hasn't been reported before.
Expected Behavior
It should run without an error, as it does when you have micro_batch_size and eval_batch_size set to 1.

Current behaviour

Returns two errors.
Steps to reproduce
Run the YAML provided, which has a micro_batch_size and eval_batch_size of 2.

I tested:
Config yaml
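The original config was not captured here. Below is a minimal sketch of the batch-size settings the report describes, assuming a standard axolotl ORPO setup; apart from the two batch-size values stated above, every key and value is illustrative, not taken from the reporter's actual config.

```yaml
# Hypothetical fragment -- not the reporter's original config.
# Only the batch-size values are taken from the report text.
rl: orpo             # assumed ORPO toggle; the exact key may differ by axolotl version
micro_batch_size: 2  # per-device train batch size; the error reportedly appears when > 1
eval_batch_size: 2   # per-device eval batch size; the error reportedly appears when > 1
```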
Possible solution
No response
Which Operating Systems are you using?
Python Version
3.10.12
axolotl branch-commit
main/bda48f0
Acknowledgements