Compiling mobilenetV2 from documentation: "failed to legalize operation 'stablehlo.convolution' that was explicitly marked illegal" #19852
I can reproduce this in Colab: https://colab.research.google.com/gist/ScottTodd/39b0ac7f054650011b2a4012d34b6afa/iree-issue19852.ipynb

MobileNetV2 should work and StableHLO should be stable. Neither is currently the case, but that can be fixed. This is especially worth fixing since, as you point out, our documentation for Vulkan and some other backends uses that as the first example. I have a task filed on #18174 to "replace TensorFlow MobileNet example with something more recent / supported" too...

For this specific issue, a few things stand out to me:
Trying to see how we'd get to the VHLO dialect of StableHLO from TF, since ideally we wouldn't need a downstream import tool at all; we'd just consume StableHLO that the framework exports. The StableHLO website has tutorials for JAX and PyTorch to StableHLO and StableHLO --> TF, but not TF --> StableHLO?
Maybe we could bundle
Tried to use the stablehlo nightly releases from close to the installed TensorFlow version, but I think I'm holding the APIs wrong? Or maybe stablehlo can't handle the additional dialects?

```shell
# for python 3.11
pip install -f https://github.com/openxla/stablehlo/releases/expanded_assets/dev-wheels stablehlo==1.8.0.1730182293+acc379ab
```

```python
from mlir.dialects import stablehlo

with open("iree_input_text.mlir", "r") as f:
    data = f.read()
print(data[-1000:])

serialized = stablehlo.serialize_portable_artifact_str(
    data, stablehlo.get_current_version()
)
```
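One hedged guess about the serialization failure above: the importer output may still contain ops from dialects the portable-artifact serializer doesn't accept. A quick stdlib-only sketch that scans the MLIR text for dialect prefixes before attempting serialization (the `ALLOWED` set, the `foreign_dialects` helper, and the regex are my illustrative assumptions, not a stablehlo API):

```python
import re

# Dialects assumed to be serializable as a portable artifact.
# This set is an illustration; check the StableHLO docs for the real rules.
ALLOWED = {"stablehlo", "vhlo", "chlo", "func", "builtin"}

def foreign_dialects(mlir_text):
    """Return dialect prefixes of ops in the module text not in ALLOWED."""
    # Matches op names like `= "tf.Const"` or `= stablehlo.add` and
    # captures the dialect prefix before the dot.
    prefixes = re.findall(r'(?m)=\s*"?([a-z_]+)\.[a-zA-Z_]+', mlir_text)
    return sorted(set(prefixes) - ALLOWED)

sample = '''
  %0 = "tf.Const"() {value = dense<1.0> : tensor<f32>} : () -> tensor<f32>
  %1 = stablehlo.add %0, %0 : tensor<f32>
'''
print(foreign_dialects(sample))  # -> ['tf']
```

If this reports leftover `tf` ops, the problem would be in the TF-to-StableHLO lowering step rather than in the stablehlo wheel itself.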
Bisected IREE releases:
So ... that's not good that this is broken in multiple ways, but it does give a date range for the "failed to legalize operation" error: between 20240226.813 and 20240410.859. The earliest change that looks relevant is #16561.
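The manual release bisection described above can be scripted. Here's a minimal sketch, assuming a chronologically sorted list of nightly release tags and an `is_broken` predicate (which in practice would pip-install the release and run the failing compile; both names and the intermediate releases are hypothetical, not IREE APIs):

```python
def bisect_releases(releases, is_broken):
    """Return (last_good, first_bad) from a chronologically sorted list.

    Assumes is_broken is monotonic: once a release is broken,
    every later release is broken too.
    """
    lo, hi = 0, len(releases) - 1
    if is_broken(releases[lo]) or not is_broken(releases[hi]):
        raise ValueError("endpoints must straddle the regression")
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if is_broken(releases[mid]):
            hi = mid  # regression is at mid or earlier
        else:
            lo = mid  # regression is after mid
    return releases[lo], releases[hi]

# Candidate range from the comment above; the middle tags are made up.
releases = ["20240226.813", "20240305.826", "20240320.840", "20240410.859"]
broken_from = {"20240320.840", "20240410.859"}  # pretend compile results
good, bad = bisect_releases(releases, lambda r: r in broken_from)
print(good, bad)  # -> 20240305.826 20240320.840
```

Each probe halves the candidate range, so even a year of nightlies takes under ten installs to pin down.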
Tried https://www.kaggle.com/models/google/mobilenet-v3/tensorFlow2/small-075-224-classification instead of mobilenet-v2 using
I think we may drop TensorFlow support, or at least heavily de-emphasize it. I've filed a few issues to help with planning there and sent a PR to switch those docs to a working ONNX example instead. Hope that helps!
Progress on #18174, updating some stale documentation.

> [!NOTE]
> Demo here: https://scotttodd.github.io/iree/guides/deployment-configurations/cpu/

Changes included:

* Switch examples to use ONNX instead of TensorFlow given that users are trying to use TensorFlow and failing: #19852
* Add more documentation for CPU targets and features for #18561
* Standardize some formatting across CPU/CUDA/ROCm/Vulkan pages
* Adjust some parts of the ONNX guide now that support is more mature
What happened?
Hello,

I successfully converted MobileNet V2 to `mlir`, but then `iree-compile` failed to create the `vmfb` file. I followed the documentation page: https://iree.dev/guides/deployment-configurations/gpu-vulkan/#compile-a-program
Steps to reproduce your issue
I tried with savedmodel_v2:
What component(s) does this issue relate to?
Compiler, Python
Version information
IREE compiler version 3.1.0rc20250107 @ d224220
LLVM version 20.0.0git
Optimized build
Additional context
Running on Fedora 41 - GPU is RTX 3060 - using "Vulkan"