Hello everyone, I am a newcomer to MLPerf.
I would like to know whether the text-to-image inference benchmark supports multi-GPU testing. Currently, I don't see any parameter for using multiple GPUs in the output of `python main.py --help`.
I also saw in the MLPerf Inference results that there are entries for 2x L40S and 8x H100. How were those tests run?
Thanks a lot.
Hi @surbanqq! The reference code often supports only a single accelerator. For their submissions, however, vendors apply their own optimizations, including scaling to multiple accelerators. In the case of NVIDIA, please take a look at their v4.1 submission.
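
If you just want to keep every GPU on a node busy with the unmodified reference code, one common workaround is to launch one independent copy of the benchmark per GPU and pin each copy to a device via `CUDA_VISIBLE_DEVICES`. This is only an illustrative sketch, not an official multi-accelerator path: `NUM_GPUS` and `EXTRA_ARGS` below are placeholders you would fill in with your own GPU count and the dataset/model flags your local `main.py` run needs.

```python
import os
import subprocess

NUM_GPUS = 8      # assumption: number of local accelerators
EXTRA_ARGS = []   # placeholder: dataset/model flags required by your main.py invocation

procs = []
for gpu_id in range(NUM_GPUS):
    env = os.environ.copy()
    # Pin this benchmark instance to a single GPU.
    env["CUDA_VISIBLE_DEVICES"] = str(gpu_id)
    procs.append(subprocess.Popen(["python", "main.py", *EXTRA_ARGS], env=env))

# Wait for all per-GPU instances to finish.
for p in procs:
    p.wait()
```

Note that summing the results of independent per-GPU LoadGen runs is not how the published multi-accelerator results are produced; vendor harnesses such as NVIDIA's present all GPUs behind a single system under test, which is why the reference `--help` output has no multi-GPU flag.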