
feat: Support bfloat16 and ensure valid precision and activation functions consistent everywhere #3601

Merged
11 commits into deepmodeling:devel on Mar 26, 2024

Conversation

@njzjz (Member) commented Mar 25, 2024

Fix #3553.

To support bfloat16, the set of valid precision options is extended and kept consistent across every backend, together with the valid activation functions.
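For context, NumPy has no native bfloat16 dtype, so exposing a "bfloat16" precision option in a backend-agnostic way typically means pulling the dtype in from a helper package. The sketch below shows one way this can look, using the ml_dtypes package; the names PRECISION_DICT and get_precision are illustrative assumptions, not necessarily the identifiers this PR touches.

```python
import numpy as np
import ml_dtypes  # provides a NumPy-compatible bfloat16 scalar type

# A single shared mapping from user-facing precision names to dtypes,
# so every backend validates against the same set of options.
PRECISION_DICT = {
    "float16": np.float16,
    "bfloat16": ml_dtypes.bfloat16,  # NumPy itself has no bfloat16
    "float32": np.float32,
    "float64": np.float64,
}

def get_precision(name: str):
    """Resolve a precision name to a dtype, failing loudly on unknown names."""
    try:
        return PRECISION_DICT[name]
    except KeyError:
        raise ValueError(
            f"unknown precision {name!r}; valid options: {sorted(PRECISION_DICT)}"
        ) from None
```

Backend-specific tables (e.g. torch.bfloat16 in PyTorch, tf.bfloat16 in TensorFlow) can then be keyed off the same "bfloat16" string, so the user-facing option names never diverge between backends.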
@njzjz njzjz linked an issue Mar 25, 2024 that may be closed by this pull request

codecov bot commented Mar 25, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 77.71%. Comparing base (3c3e2ce) to head (626e35e).
Report is 3 commits behind head on devel.

Additional details and impacted files
@@           Coverage Diff           @@
##            devel    #3601   +/-   ##
=======================================
  Coverage   77.70%   77.71%           
=======================================
  Files         433      433           
  Lines       37491    37471   -20     
  Branches     1620     1620           
=======================================
- Hits        29134    29120   -14     
+ Misses       7496     7490    -6     
  Partials      861      861           


@njzjz marked this pull request as draft March 25, 2024 23:06
njzjz added 6 commits March 25, 2024 21:19
This reverts commit f33503b.

@njzjz changed the title from "Ensure valid precision and activation functions consistent everywhere" to "feat: Support bfloat16 and ensure valid precision and activation functions consistent everywhere" Mar 26, 2024
@njzjz marked this pull request as ready for review March 26, 2024 01:57
njzjz added 4 commits March 26, 2024 01:31
This reverts commit a50e447.
@njzjz requested a review from wanghan-iapcm March 26, 2024 07:13
@wanghan-iapcm added this pull request to the merge queue Mar 26, 2024
Merged via the queue into deepmodeling:devel with commit 08b3814 Mar 26, 2024
48 checks passed
@njzjz mentioned this pull request Apr 2, 2024
Development

Successfully merging this pull request may close these issues.

import from a module outside tf/pt
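The issue title above hints at the underlying design change: definitions needed by every backend, such as the valid precision and activation-function names, should live in a module outside the backend-specific tf/pt packages and be imported from there. A minimal sketch of that pattern follows, with a hypothetical module path and names chosen for illustration only:

```python
# deepmd/common.py (hypothetical backend-agnostic module)
VALID_PRECISION = {"float16", "bfloat16", "float32", "float64", "default"}
VALID_ACTIVATION = {"relu", "gelu", "tanh", "sigmoid", "none"}

def check_activation(name: str) -> str:
    """Validate an activation-function name once, for all backends."""
    lowered = name.lower()
    if lowered not in VALID_ACTIVATION:
        raise ValueError(
            f"unknown activation {name!r}; valid options: {sorted(VALID_ACTIVATION)}"
        )
    return lowered

# Backend packages then import the shared definitions instead of
# keeping separate, drifting copies:
#   from deepmd.common import VALID_PRECISION, check_activation
```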