
Draft: Fix grad_accumulation in transformers #827

Closed
wants to merge 7 commits

Commits on Jan 23, 2024

  1. added random sampling

     franz101 committed Jan 23, 2024 · 92166f6

  2. added gradient_accumulation

     franz101 committed Jan 23, 2024 · f7aec08

  3. added debug statements

     franz101 committed Jan 23, 2024 · 6e0d401

  4. debugging

     franz101 committed Jan 23, 2024 · 6e567f9

  5. fix if else

     franz101 committed Jan 23, 2024 · 826f4d0

  6. removed printing

     franz101 committed Jan 23, 2024 · 739493d

  7. hotfix grad acc

     franz101 committed Jan 23, 2024 · 636d439
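The PR title and the "added gradient_accumulation" commit reference gradient accumulation. As background only, a minimal pure-Python sketch of the technique, not the code from this PR; the function name, toy model, and loss are illustrative assumptions:

```python
def train_with_grad_accumulation(data, accumulation_steps=4, lr=0.1):
    """Toy illustration: accumulate gradients over several micro-batches,
    then apply one optimizer step (all names here are illustrative)."""
    w = 0.0             # single scalar parameter of the toy model pred = w * x
    grad_sum = 0.0      # running gradient accumulator
    for step, (x, y) in enumerate(data, start=1):
        pred = w * x
        # dL/dw for the squared-error loss L = (pred - y)^2
        grad = 2 * (pred - y) * x
        # scale by 1/accumulation_steps so the accumulated gradient
        # equals the gradient of the mean loss over the micro-batches
        grad_sum += grad / accumulation_steps
        if step % accumulation_steps == 0:
            w -= lr * grad_sum  # one optimizer step per N micro-batches
            grad_sum = 0.0      # reset the accumulator after stepping
    return w
```

A common source of bugs (and plausibly what this fix targets) is getting the `if step % accumulation_steps == 0` branch or the `1/accumulation_steps` scaling wrong, which changes the effective learning rate.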