self hosted mixtral gha #136

Merged
54 commits
ee55ade  initial commit for self hosted mixtral gha (anandhu-eng, Sep 26, 2024)
bcfa4d9  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (anandhu-eng, Sep 26, 2024)
51346cf  Updated device tag and cron job schedule (anandhu-eng, Sep 30, 2024)
3013246  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 4, 2024)
ba4e323  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 4, 2024)
5b073c7  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 4, 2024)
aa081e3  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 4, 2024)
24d9acc  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 4, 2024)
64aeb24  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 4, 2024)
064d197  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 4, 2024)
fbfd2fe  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 4, 2024)
4b083d3  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 4, 2024)
7842d61  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 4, 2024)
e994a0e  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 4, 2024)
fe9b8cb  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 4, 2024)
f983570  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 4, 2024)
d8986bd  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 4, 2024)
b4eb2f4  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 4, 2024)
122cea2  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 4, 2024)
62b609c  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 4, 2024)
c8edae3  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 4, 2024)
5b4da28  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 4, 2024)
043de5f  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 4, 2024)
ad0b930  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 5, 2024)
d9b2082  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 5, 2024)
be4efd0  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 5, 2024)
4e95c29  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 5, 2024)
54a9792  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 5, 2024)
c1eff80  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 5, 2024)
c12c300  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 5, 2024)
c4003f0  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 5, 2024)
8978e9f  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 5, 2024)
48bd2a6  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 5, 2024)
ecb7092  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 5, 2024)
f5243c2  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 5, 2024)
ddaf7fa  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 5, 2024)
9fe9ef1  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 5, 2024)
f178814  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 5, 2024)
eddb93d  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 5, 2024)
1b2d0b2  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 5, 2024)
6f66aa2  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 6, 2024)
5ca77b3  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 6, 2024)
bc6fa11  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 6, 2024)
b562039  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 6, 2024)
79ef3e4  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 6, 2024)
ee30203  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 6, 2024)
e1fbdf6  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 7, 2024)
4eecdf8  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (anandhu-eng, Oct 7, 2024)
bcec9ec  Updated run commands (anandhu-eng, Oct 7, 2024)
72cf058  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 8, 2024)
7b8cacf  Update test-mlperf-inference-mixtral.yml (arjunsuresh, Oct 8, 2024)
1711455  removed cuda device (anandhu-eng, Oct 8, 2024)
ed57759  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (anandhu-eng, Oct 10, 2024)
4990504  Merge branch 'mlperf-inference' into mixtral+gha+selfhosted (arjunsuresh, Oct 10, 2024)
30 changes: 30 additions & 0 deletions .github/workflows/test-mlperf-inference-mixtral.yml
@@ -0,0 +1,30 @@
# This workflow runs a short test of the MLPerf inference MIXTRAL-8x7B reference implementation on a self-hosted runner
# For more information see: https://help.github.com/actions/language-and-framework-guides/using-python-with-github-actions

name: MLPerf inference MIXTRAL-8x7B

on:
  schedule:
    - cron: "30 20 * * *"  # 30th minute and 20th hour => 20:30 UTC => 2 AM IST

jobs:
  build_reference:
    if: github.repository_owner == 'gateoverflow'
    runs-on: [ self-hosted, GO-i9, linux, x64 ]
    strategy:
      fail-fast: false
      matrix:
        python-version: [ "3.12" ]
        backend: [ "pytorch" ]
        device: [ "cpu" ]

    steps:
      - name: Test MLPerf Inference MIXTRAL-8X7B reference implementation
        run: |
          source gh_action/bin/deactivate || python3 -m venv gh_action
          source gh_action/bin/activate
          export CM_REPOS=$HOME/GH_CM
          python3 -m pip install cm4mlops
          cm pull repo
          cm run script --tags=run-mlperf,inference,_submission,_short --submitter="MLCommons" --model=mixtral-8x7b --implementation=reference --batch_size=1 --backend=${{ matrix.backend }} --category=datacenter --scenario=Offline --execution_mode=test --device=${{ matrix.device }} --docker_it=no --docker_cm_repo=gateoverflow@cm4mlops --adr.compiler.tags=gcc --hw_name=gh_action --docker_dt=yes --results_dir=$HOME/gh_action_results --submission_dir=$HOME/gh_action_submissions --docker --quiet --test_query_count=1 --target_qps=1 --clean --env.CM_MLPERF_MODEL_MIXTRAL_8X7B_DOWNLOAD_TO_HOST=yes --env.CM_MLPERF_DATASET_MIXTRAL_8X7B_DOWNLOAD_TO_HOST=yes
          cm run script --tags=push,github,mlperf,inference,submission --repo_url=https://github.com/gateoverflow/mlperf_inference_test_submissions_v5.0 --repo_branch=main --commit_message="Results from self hosted Github actions - GO-i9" --quiet --submission_dir=$HOME/gh_action_submissions
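The run step's first line, `source gh_action/bin/deactivate || python3 -m venv gh_action`, relies on the fallback after `||` to (re)create the virtual environment when sourcing fails, which keeps the step idempotent across scheduled runs on the same self-hosted machine. A minimal local sketch of the same bootstrap, assuming only that `python3` with the `venv` module is on `PATH` (the `gh_action` directory name is taken from the workflow; the `cm` commands themselves are not run here):

```shell
#!/bin/sh
# Create the virtual environment only if it does not already exist,
# then activate it, as the workflow's run step effectively does.
VENV_DIR="gh_action"
if [ ! -d "$VENV_DIR" ]; then
    python3 -m venv "$VENV_DIR"
fi
# POSIX sh spells "source" as "."
. "$VENV_DIR/bin/activate"
# At this point the workflow would run:
#   python3 -m pip install cm4mlops && cm pull repo && cm run script ...
# Show that the active interpreter is the venv's copy:
python -c 'import sys; print(sys.prefix)'
```

Running the script twice reuses the existing `gh_action` directory, which is what lets the workflow's pip-installed `cm4mlops` survive between nightly runs.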