Troubleshooting with the Tesla M40 24GB #9

Myrkie commented May 22, 2023

Just a note for the future: when using a GPU with a larger-than-normal VRAM size, such as the NVIDIA Tesla M40 24GB (which has 24 GB of VRAM), Windows will report a Code 12 error on the passed-through device under QEMU. This can be fixed by following the libvirt documentation for passing extra QEMU command-line arguments to the VM (see the XML below). Also make sure "Above 4G Decoding" / "Resizable BAR" is enabled in the host machine's bare-metal BIOS.

I've also had issues with the script for dumping the GPU ROM, because the card's rom file isn't accessible through sysfs. I had better luck using a ROM already dumped by someone on TechPowerUp; you can also dump it yourself using NVFlash, also provided by TechPowerUp. The GPU has no video output since it is a datacenter card, but the ROM is still required or you will get a Code 10 error.
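For reference, a dumped ROM can be attached to the passed-through device in the libvirt domain XML. This is only a sketch; the PCI address and the /var/lib/libvirt/vbios/tesla-m40.rom path are placeholders you would replace with your own values:

<hostdev mode='subsystem' type='pci' managed='yes'>
  <source>
    <!-- host PCI address of the Tesla M40; replace with your own (check lspci) -->
    <address domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
  </source>
  <!-- point libvirt at the dumped VBIOS so the guest driver can initialize the card -->
  <rom file='/var/lib/libvirt/vbios/tesla-m40.rom'/>
</hostdev>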

The command-line override that worked for me, from Laszlo Ersek:

<domain
 type='kvm'
 xmlns:qemu='http://libvirt.org/schemas/domain/qemu/1.0'>
  <qemu:commandline>
    <qemu:arg value='-fw_cfg'/>
    <qemu:arg value='opt/ovmf/X-PciMmio64Mb,string=65536'/>
  </qemu:commandline>
</domain>
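A couple of notes on applying this: the block goes into the domain definition via virsh edit, and the xmlns:qemu attribute on the <domain> element is required or libvirt will silently drop the <qemu:commandline> section. The value 65536 asks OVMF for a 64 GiB 64-bit MMIO aperture, which is large enough for the M40's 24 GB BAR. If you launch QEMU directly instead of through libvirt, the equivalent (a sketch, assuming OVMF firmware and your own disk/device arguments in place of the trailing dots) would be:

# pass the OVMF fw_cfg knob straight on the QEMU command line
qemu-system-x86_64 \
  -machine q35,accel=kvm \
  -fw_cfg name=opt/ovmf/X-PciMmio64Mb,string=65536 \
  ...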