Add possibility to run inference benchmarks on XPU device #7705
Conversation
Force-pushed from daedd37 to f7adbdd
Force-pushed from e7eea95 to b14c947
Force-pushed from b14c947 to 02f827a
Co-authored-by: kgajdamo <kinga.gajdamowicz@intel.com>
Looks good to me.
Codecov Report
@@ Coverage Diff @@
## master #7705 +/- ##
==========================================
- Coverage 91.93% 91.56% -0.38%
==========================================
Files 452 452
Lines 25537 25547 +10
==========================================
- Hits 23478 23392 -86
- Misses 2059 2155 +96
... and 20 files with indirect coverage changes
LGTM!
Force-pushed from ef2e4a7 to b3f1712
Force-pushed from 22f41ed to 4f372c7
for more information, see https://pre-commit.ci
This PR adds the possibility to run inference benchmarks on the XPU device.
Example command:
python inference_benchmark.py --device xpu --datasets Reddit --models sage --eval-batch-sizes 1024 --num-layers 2 --num-hidden-channels 64 --profile
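For context, below is a minimal sketch of how a benchmark script can select the XPU device, assuming intel_extension_for_pytorch (IPEX) is installed. The argument parsing and the helper comments are illustrative only and are not the exact code added by this PR.

```python
# Minimal sketch of XPU device selection for an inference benchmark.
# Assumption: intel_extension_for_pytorch (IPEX) is installed; importing it
# registers the 'xpu' backend with PyTorch.
import argparse

import torch

parser = argparse.ArgumentParser()
parser.add_argument('--device', type=str, default='cpu',
                    choices=['cpu', 'cuda', 'xpu'])
args = parser.parse_args()

if args.device == 'xpu':
    import intel_extension_for_pytorch as ipex  # noqa: F401

device = torch.device(args.device)

# The benchmarked model and each evaluation batch are then moved to the
# selected device in the usual way, e.g.:
#   model = model.to(device)
#   batch = batch.to(device)
```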