Setup custom ft build for Llama support #2732
Conversation
Codecov Report
Patch coverage:
@@ Coverage Diff @@
##             master    #2732      +/-   ##
============================================
+ Coverage     72.08%   72.15%    +0.06%
- Complexity     5126     7030     +1904
============================================
  Files           473      698      +225
  Lines         21970    31282     +9312
  Branches       2351     3228      +877
============================================
+ Hits          15838    22571     +6733
- Misses         4925     7170     +2245
- Partials       1207     1541      +334
☔ View full report in Codecov by Sentry.
Description
Adds an `is_llama_build` flag to isolate the Llama build from the existing setup, so that switching between the two is easy. This also pushes binaries to a new S3 path (`fastertransformer/llama`), which probably needs to be created.
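A minimal sketch of how such a flag could route artifacts to the separate S3 prefix; `artifact_s3_prefix` is a hypothetical helper for illustration, not code from this PR:

```python
# Hypothetical sketch (helper name assumed, not from the PR): an
# is_llama_build flag routes build artifacts to a separate S3 prefix,
# keeping the Llama build isolated from the existing setup.

def artifact_s3_prefix(is_llama_build: bool) -> str:
    """Return the S3 key prefix under which build binaries are uploaded."""
    # Llama builds publish under fastertransformer/llama, so existing
    # artifacts under fastertransformer/ are untouched and switching
    # between builds is just a path change.
    if is_llama_build:
        return "fastertransformer/llama"
    return "fastertransformer"
```

Keeping the divergence down to a single prefix means the rest of the upload pipeline can stay unchanged for both build variants.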