Unable to distribute embedded python with torch-directml #656

Open

bcw222 opened this issue Oct 17, 2024 · 0 comments

bcw222 commented Oct 17, 2024

Description:

I have packed an embedded Python environment with torch-directml pre-installed, to distribute for running Stable Diffusion on ComfyUI. It works on my development host with an NVIDIA GPU, but on other machines I encountered the following issues:

  • On Windows 1809:
    I receive the following error when trying to import torch_directml_native:

    ImportError: DLL load failed while importing torch_directml_native: Unable to find the module
    

    I attempted to reinstall the torch-directml package using pip, but the issue persists. (A minimal import check to isolate this failure is sketched after this list.)

  • On newer versions of Windows (with Intel iGPU):
    The package runs, but when I attempt to run Stable Diffusion on ComfyUI, I encounter errors. The behavior is inconsistent across different machines, with some working as expected while others fail.
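
A minimal check like the following helps isolate the failure at import time, before ComfyUI is involved (my own sketch; it assumes torch and torch-directml are installed in the embedded environment):

    # smoke_test_dml.py - minimal check that the embedded Python can load torch-directml.
    # Sketch only; assumes torch and torch-directml are installed in this environment.
    import torch
    import torch_directml

    print("torch:", torch.__version__)
    print("DirectML devices:", torch_directml.device_count())

    dml = torch_directml.device()          # default DirectML device
    print("device 0:", torch_directml.device_name(0))

    x = torch.ones(4, device=dml)          # tiny tensor op to confirm the backend works
    print((x * 2).cpu())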

Steps to Reproduce:

  1. Pack an embedded Python with torch-directml and the ComfyUI requirements pre-installed (a sketch of this packing step is shown after this list).
  2. Deploy the package on a machine expected to support torch-directml, with an Intel or AMD GPU.
  3. Attempt to run Stable Diffusion on ComfyUI with torch-directml as usual.
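
For reference, step 1 is roughly the following (a sketch of my packing script; the file names, Python version, and paths below are illustrative, not the exact ones I used):

    # pack_env.py - rough sketch of how the embedded environment is built.
    # Assumes the official "embeddable package" zip and get-pip.py were downloaded beforehand.
    import glob
    import subprocess
    import zipfile

    TARGET = "python_embeded"

    # 1. Unpack the embeddable Python distribution.
    with zipfile.ZipFile("python-3.10.11-embed-amd64.zip") as zf:
        zf.extractall(TARGET)

    # 2. Enable site-packages by uncommenting "import site" in the ._pth file.
    pth = glob.glob(f"{TARGET}/python*._pth")[0]
    with open(pth, "r", encoding="utf-8") as f:
        text = f.read().replace("#import site", "import site")
    with open(pth, "w", encoding="utf-8") as f:
        f.write(text)

    # 3. Bootstrap pip, then install torch-directml and the ComfyUI requirements.
    py = f"{TARGET}/python.exe"
    subprocess.run([py, "get-pip.py"], check=True)
    subprocess.run([py, "-m", "pip", "install", "torch-directml"], check=True)
    subprocess.run([py, "-m", "pip", "install", "-r", "ComfyUI/requirements.txt"], check=True)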

Expected Behavior:

Stable Diffusion should run without errors.

Actual Behavior:

  • On Windows 1809: DLL load failure when importing torch_directml_native.
  • On newer versions of Windows: Inconsistent behavior when running Stable Diffusion, with errors occurring on some machines with Intel iGPUs.

Additional Information:

  • The package works correctly on some machines with Intel iGPUs on newer Windows versions.
  • I have tested with both the latest version of torch-directml and the most recent Intel graphics drivers.

Environment:

  • OS: Windows 1809 (and newer versions)
  • Python: Embedded version with torch-directml pre-installed
  • GPU: Various machines with Intel or AMD GPU
  • ComfyUI: Running Stable Diffusion models

Request:

Is there a known issue with torch-directml on certain versions of Windows? Are there any known workarounds, particularly around missing or incompatible DLLs or other dependencies, that would let me pack a standalone environment for offline usage? (A DLL-probing sketch I can run on affected machines is included below.)
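
In case it helps triage, this is the kind of diagnostic I can run on an affected machine (a sketch; the candidate DLL list below is my guess at the usual suspects, the VC++ runtime and DirectML, not a confirmed dependency list for torch-directml):

    # check_dlls.py - probe for runtime DLLs that torch_directml_native may need.
    # Sketch only: the candidate list is a guess, not an authoritative dependency list.
    import ctypes
    import glob
    import importlib.util
    import os

    candidates = ["vcruntime140.dll", "vcruntime140_1.dll", "msvcp140.dll", "DirectML.dll"]
    for name in candidates:
        try:
            ctypes.WinDLL(name)
            print(f"OK       {name}")
        except OSError as exc:
            print(f"MISSING  {name}: {exc}")

    # torch-directml appears to bundle its own DirectML.dll, so also list what the
    # installed package directory actually contains on the target machine.
    spec = importlib.util.find_spec("torch_directml")
    if spec and spec.submodule_search_locations:
        pkg_dir = list(spec.submodule_search_locations)[0]
        print("torch_directml DLLs:", glob.glob(os.path.join(pkg_dir, "*.dll")))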
