WinML backend for wasi-nn #7807
Conversation
Force-pushed from 54de278 to a51d2b0.
Like the rest of the `windows-*` crates published by Kenny Kerr, this change also adds the `windows` crate itself to the trusted list. This is necessary for use in bytecodealliance#7807.
crates/wasi-nn/tests/all.rs (outdated)
```rust
#[cfg(feature = "openvino")]
let mut backend = backend::openvino::OpenvinoBackend::default();
#[cfg(feature = "winml")]
let mut backend = backend::winml::WinMLBackend::default();
```
This won't work when both features are enabled. Features are additive, so both backends can be present at the same time.
We run the openvino and winml backends in two different test cases with different inputs, so only one backend is specified here for testing purposes. `backend::list()` is able to list all available backends:
wasmtime/crates/wasi-nn/src/backend/mod.rs, lines 21 to 31 in 9732234:
```rust
pub fn list() -> Vec<crate::Backend> {
    let mut backends = vec![];
    #[cfg(feature = "openvino")]
    {
        backends.push(Backend::from(OpenvinoBackend::default()));
    }
    #[cfg(feature = "winml")]
    {
        backends.push(Backend::from(WinMLBackend::default()));
    }
    backends
}
```
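For illustration only (not part of the PR), a small sketch of a test that checks `list()` against the enabled features; the test name and location are hypothetical:

```rust
// Hypothetical test: `list()` pushes one backend per enabled feature, so the
// count should equal the number of enabled backend features.
#[test]
fn list_returns_enabled_backends() {
    let backends = crate::backend::list();
    let expected = cfg!(feature = "openvino") as usize + cfg!(feature = "winml") as usize;
    assert_eq!(backends.len(), expected);
}
```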
What I mean is: if someone runs `cargo test --all-features`, both the OpenVINO and WinML statements that assign to `let mut backend` will be present in the code. Some background: this `all.rs` file is compiled once by `cargo test` into a single binary that runs multiple tests, so that compilation will be incorrect because the second `let mut backend` will shadow the first.
Thanks for the background. It's fixed.
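One way to resolve the shadowing (a sketch under assumptions, not necessarily the fix that landed) is to construct each backend inside its own cfg-gated test instead of reassigning a shared `backend` binding; the test names and bodies below are illustrative only:

```rust
// Illustrative only: with separate cfg-gated tests, enabling both features no
// longer produces two competing `let mut backend` bindings in one scope.
#[cfg(feature = "openvino")]
#[test]
fn image_classification_openvino_sketch() {
    let _backend = backend::openvino::OpenvinoBackend::default();
    // ... drive `_backend` with the OpenVINO test fixture ...
}

#[cfg(feature = "winml")]
#[test]
fn image_classification_winml_sketch() {
    let _backend = backend::winml::WinMLBackend::default();
    // ... drive `_backend` with the WinML test fixture ...
}
```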
```rust
#[cfg_attr(not(all(target_arch = "x86_64", target_os = "windows")), ignore)]
#[test]
fn nn_image_classification_winml() {
    run(NN_IMAGE_CLASSIFICATION_WINML, true).unwrap()
}
```
You will want to double-check that this test is actually run. Right now in Wasmtime's CI, only a subset of jobs run to preserve runner time. The Windows "test" job does not run by default, but if you add the string `prtest:full` in any commit, then all jobs will run. Then we can check the logs for the Windows job to ensure this test actually ran.
With `prtest:full`, this test failed on Windows Server. I guess the GitHub Windows Server runner doesn't have Desktop Experience enabled.
I'm afraid we need to disable this test at this time. The GitHub Actions Windows Server image doesn't have Desktop Experience available, so it cannot be installed. The list of installed and available features:
https://github.com/jianjunz/wasmtime/actions/runs/7752997447/job/21143496106#step:9:1
Yeah, that's unfortunate. Why don't we add a function like `check_openvino_is_installed` but for WinML: `check_winml_is_available`. There's probably a way to check with the Windows API if ML system calls are possible, and that function could return false if they are not. That extra runtime check would be in addition to the `#[cfg_attr(not(feature = "winml"), ignore)]`; that can stay there for the case where the `winml` feature isn't even enabled.
`check_winml_is_available` is added. It returns `Ok` on my Windows 11 machine, but I'll need to find an environment to check whether it returns an error when WinML is not available.
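For reference, a minimal sketch of what such a runtime probe could look like, assuming the `windows` crate's `AI::MachineLearning` bindings; the constructor name (`LearningModelDevice::Create`) and the error handling are assumptions, not necessarily what the PR implements:

```rust
#[cfg(all(feature = "winml", target_os = "windows"))]
fn check_winml_is_available() -> anyhow::Result<()> {
    use windows::AI::MachineLearning::{LearningModelDevice, LearningModelDeviceKind};
    // Creating a default device exercises the WinML runtime; on images without
    // WinML support (e.g., a server SKU lacking Desktop Experience) this is
    // expected to return an error rather than succeed.
    match LearningModelDevice::Create(LearningModelDeviceKind::Default) {
        Ok(_) => Ok(()),
        Err(e) => anyhow::bail!("WinML is not available: {e}"),
    }
}
```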
Force-pushed from 1262255 to 9732234.
In bytecodealliance#7846 I added `cargo-vet` entries to trust the `windows` crate, just like we already trust several related `windows-*` crates. I did not, however, update the lockfile, which means that bytecodealliance#7807 continued to fail the `cargo vet --locked` CI check. This change is the result of simply running `cargo vet`.
Thanks for auditing windows-* crates, but
@jianjunz, this looks good to me with a couple of final points:
- let's resolve the shadowing for the `cargo test --all-features` case
- let's add the `check_winml_is_available` dynamic check
- can you rebase on `main`? If it's not too much trouble, I might suggest squashing some of the commits since the entire commit history will get dropped into the merge commit (but this is a personal preference thing: feel free to leave as-is).
`ExecutionTarget::Gpu` is mapped to `LearningModelDeviceKind::DirectX`.
The app knows the input and target data ranges, so it's better to let the app handle scaling.
To avoid involving too many external dependencies, the input image is converted to tensor data offline.
Only FP32 is supported right now; reject other tensor types.
This aligns with the openvino backend. prtest:full
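A hedged sketch of the device-kind mapping mentioned in the commits above; only the `Gpu` → `DirectX` arm is stated by the PR, and the other arms (as well as the type imports, which depend on the crate's generated bindings) are assumptions for illustration:

```rust
// Only the Gpu => DirectX arm is confirmed above; the rest is illustrative.
fn device_kind(target: ExecutionTarget) -> LearningModelDeviceKind {
    match target {
        ExecutionTarget::Gpu => LearningModelDeviceKind::DirectX,
        ExecutionTarget::Cpu => LearningModelDeviceKind::Cpu,
        // WinML has no TPU device kind; fall back to letting the system choose.
        _ => LearningModelDeviceKind::Default,
    }
}
```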
The code is rebased. No conflict. However,
Thanks for your review and merging this PR.
This change adds a Windows Machine Learning (WinML) backend for wasi-nn. Since WinML is a built-in feature of Windows, this change makes it possible to run wasi-nn on Windows without a third-party machine learning framework.
The WinML backend is only available on Windows.
WinML: https://learn.microsoft.com/en-us/windows/ai/windows-ml/
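To make the description concrete, here is a hedged sketch of the kind of WinML calls such a backend builds on, using the `windows` crate's `AI::MachineLearning` bindings; the method names follow the WinRT API surface but are assumptions about the generated Rust bindings, not an excerpt from this PR:

```rust
use windows::core::HSTRING;
use windows::AI::MachineLearning::{
    LearningModel, LearningModelDevice, LearningModelDeviceKind, LearningModelSession,
};

// Load an ONNX model from disk and create an inference session on the default
// device; tensor binding and evaluation are omitted from this sketch.
fn load_session(model_path: &str) -> windows::core::Result<LearningModelSession> {
    let model = LearningModel::LoadFromFilePath(&HSTRING::from(model_path))?;
    let device = LearningModelDevice::Create(LearningModelDeviceKind::Default)?;
    LearningModelSession::CreateFromModelOnDevice(&model, &device)
}
```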