WIP: OpenXR integration #2166
Conversation
Hmm, due to the size of this change, this might be something worth first creating an RFC for.
And secondly, very large changes such as this one are difficult to review all at once.
In addition to a possible bevy RFC, this will most probably require coordination between multiple ecosystem crates too. See:
Agreed; this is a great candidate for an RFC explaining what's going on and what the contentious choices are. I'll give this a scan and try and highlight any changes that I think can be broken apart.
From a (very) quick review, it seems there weren't a lot of big changes to Bevy itself, and those can mostly be feature gated 🎉 I think once all the dependencies are updated and the Bevy render update is done, some of those changes make sense to include in Bevy planning for XR (for example a second camera) without an RFC. Even though I don't have a headset, I played with one and Godot at work and it was very fun to be able to move in my 3D scenes. I hope Bevy can do that soon 👍 thank you for the groundwork!
crates/bevy_app/src/plugin_group.rs
Outdated
@@ -96,6 +96,21 @@ impl PluginGroupBuilder {
        self
    }

    pub fn remove<T: Plugin>(&mut self) -> &mut Self {
I like this change; can you spin it out into a separate tiny PR?
Here it is: #2171
Based on the discussion in that pull request, this will be migrated after #2039 has been merged.
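For context, a hypothetical usage sketch of the `remove` method from the diff above (the plugin removed here is an arbitrary example; `add_plugins_with` is the existing Bevy API for customizing a plugin group):

```rust
// Hypothetical usage of the new PluginGroupBuilder::remove, assuming the
// Bevy 0.5-era AppBuilder API; the plugin being removed is an arbitrary example.
use bevy::prelude::*;
use bevy::winit::WinitPlugin;

fn main() {
    App::build()
        .add_plugins_with(DefaultPlugins, |group| group.remove::<WinitPlugin>())
        .run();
}
```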
crates/bevy_gltf/src/loader.rs
Outdated
@@ -365,6 +368,7 @@ fn load_node(
                projection_matrix: perspective_projection.get_projection_matrix(),
                ..Default::default()
            });
            // FIXME how to differentiate between CAMERA_XR and CAMERA_3D?
I'm not entirely sure what is best here, but I believe that the UI camera differentiation just operates off of names right now.
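If the name-based approach were extended, a minimal sketch of loader-side differentiation might look like the following (the `is_xr` flag and the `CAMERA_XR` constant are hypothetical; the `Camera` component and `CAMERA_3D` constant exist in Bevy 0.5):

```rust
// A sketch of name-based camera differentiation in the glTF loader. The
// `is_xr` flag and CAMERA_XR are assumptions; Camera and CAMERA_3D are
// existing Bevy 0.5 items.
use bevy::math::Mat4;
use bevy::render::camera::Camera;
use bevy::render::render_graph::base::camera::CAMERA_3D;

const CAMERA_XR: &str = "camera_xr"; // hypothetical name for an XR camera

fn camera_for_node(projection_matrix: Mat4, is_xr: bool) -> Camera {
    Camera {
        name: Some(if is_xr { CAMERA_XR } else { CAMERA_3D }.to_string()),
        projection_matrix,
        ..Default::default()
    }
}
```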
Huh cool, this actually isn't too bad at all. There's one change that I think belongs as its own PR (plugin removal), but otherwise this seems quite minimal and clear. I think with some more documentation (and testing from folks who have the devices) I'd be quite happy to see this get merged. For documentation:
Not to derail this PR, but I'm curious if there's been any thought on trying to get iOS / ARKit enabled. I've done AR in swift-land but would like to stick with Rust-land if I can. Unfortunately, Apple doesn't support OpenXR, so I'm curious how much work would be required to get ARKit/Metal mocked behind the OpenXR API. The OpenXR API surface seems huge, but I'm sure there might be a limited MVP that could enable AR with Bevy? As a side note, I've got a PoC of iOS Bevy AR working by sending the ARKit camera projection matrix through the FFI layer into the Bevy App, which is pretty neato. I imagine OpenXR would open up a lot more functionality. I think ARKit/iOS is highly important because it's currently the most accessible AR device to most developers, so supporting it would open up the community of contributors (i.e., me!).
I don't recall any existing efforts. But it would definitely be welcome.
Definitely agreed. This is very neato 😄
Agreed. The only thing missing is someone to drive the effort. If you could open a new issue and share what you've learned so far, that would be very helpful. It would also provide a centralized place for interested people to coordinate.
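Regarding the projection-matrix-over-FFI PoC mentioned above, here is a rough, hypothetical sketch of the general shape such a hand-off could take (the exported function name, the static slot, and the system are illustrative assumptions, not the actual PoC code):

```rust
// A hypothetical sketch of pushing an ARKit-provided projection matrix across
// the FFI boundary into Bevy. Names and structure here are assumptions.
use bevy::prelude::*;
use bevy::render::camera::Camera;
use std::sync::Mutex;

// Latest matrix written by the FFI entry point, read by a Bevy system.
static LATEST_PROJECTION: Mutex<Option<Mat4>> = Mutex::new(None);

/// Called from the Swift/ObjC side with a column-major 4x4 matrix (16 floats).
#[no_mangle]
pub extern "C" fn bevy_ar_set_projection(matrix: *const f32) {
    // Safety: the caller must pass a pointer to 16 valid f32 values.
    let cols: [f32; 16] = unsafe { std::slice::from_raw_parts(matrix, 16) }
        .try_into()
        .expect("exactly 16 floats");
    *LATEST_PROJECTION.lock().unwrap() = Some(Mat4::from_cols_array(&cols));
}

/// System that applies the most recent ARKit projection to every camera.
pub fn apply_ar_projection(mut cameras: Query<&mut Camera>) {
    if let Some(proj) = LATEST_PROJECTION.lock().unwrap().take() {
        for mut camera in cameras.iter_mut() {
            camera.projection_matrix = proj;
        }
    }
}
```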
This PR is more or less a demonstration that it can be done, so definitely not derailing anything here :) The implementation approach has already changed quite a bit into a more abstracted form compared to the code that's present here currently. I guess it would also eventually be good to check the Vuforia, ARCore, etc. APIs. Does ARKit require access to the graphics device, or is it more or less at the input layer? OpenXR also supports handheld device form factors, so its API could be a good abstraction point. Here's a rough draft of what the abstraction(s) could look like:
Zarik5 did excellent work with the ideas he initially presented on blaind/xrbevy#1, having gfx-rs/gfx#3762 merged into gfx. Basically that will expose the raw Vulkan handles from the gfx stack. Eventually some upper abstraction can take those raw handles and do the required initializations for xr-related work. For openxr, gfx-rs/gfx#3219 (comment) contains a sequence diagram of what happens with Vulkan. Before your post @jkelleyrtp, I thought this initialization could be in
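As a sketch of that hand-off (the `raw_*` parameters below stand in for whatever accessors the patched gfx/wgpu stack ends up exposing; the openxr crate calls are the real API, everything else is an assumption):

```rust
// A sketch of handing raw Vulkan handles to the openxr crate to create a
// session. The raw_* parameters are placeholders, not a real accessor API.
use openxr as xr;
use std::ffi::c_void;

unsafe fn create_xr_session(
    xr_instance: &xr::Instance,
    system: xr::SystemId,
    raw_vk_instance: *const c_void,
    raw_vk_physical_device: *const c_void,
    raw_vk_device: *const c_void,
    queue_family_index: u32,
    queue_index: u32,
) -> xr::Result<(xr::Session<xr::Vulkan>, xr::FrameWaiter, xr::FrameStream<xr::Vulkan>)> {
    // Safety: the handles must stay valid for the lifetime of the session and
    // belong to a device created with the extensions OpenXR requires.
    xr_instance.create_session::<xr::Vulkan>(
        system,
        &xr::vulkan::SessionCreateInfo {
            instance: raw_vk_instance,
            physical_device: raw_vk_physical_device,
            device: raw_vk_device,
            queue_family_index,
            queue_index,
        },
    )
}
```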
ARKit can be used without access to the graphics APIs - the 3D rendering is handled by SceneKit, RealityKit, and SpriteKit. So ARKit just provides the camera projection matrix, the point clouds, feature tracking, and information about lighting. iOS doesn't provide any hand tracking APIs, so currently the only interaction is touch (something I don't believe works in Bevy-iOS atm). However, to pass through the "AR Frame", as ARKit calls it, you need a way to pull the camera frame from Metal into a texture. ARKit's AR basically just renders a 3D scene on top of a video feed and adjusts the camera view to match the device's orientation.
OpenXR provides a good entry point for an API, so anything "high level" like ARCore/ARKit would probably be best served by providing a shim at the openxr-sys level. I personally haven't dived too deep into OpenXR.rs, but the API definitely has that bindgen vibe to it. Perhaps it makes sense to build a crate on top of OpenXR that integrates ARCore/ARKit/Vuforia. I'm definitely bullish on WGPU being the de facto renderer for the Rust ecosystem, so finding a way to reuse it and the engines that leverage it (Bevy) would be amazing for cross-platform AR development.
Sounds great! I think you gave ideas for a lot of abstractions for this pull request too. Bevy could also be abstracted so that it can handle this completely through plugin(s), if:
Point clouds and other data could be handled through events? Input system integration (actions in openxr) requires more thought.
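A minimal sketch of the event idea, using a hypothetical event type (only the Bevy event machinery itself is existing API):

```rust
// A minimal sketch of surfacing point-cloud data through Bevy events; the
// PointCloudUpdated type is hypothetical, only the event machinery is Bevy API.
use bevy::prelude::*;

// Hypothetical event an XR/AR backend plugin could emit each frame.
pub struct PointCloudUpdated {
    pub points: Vec<Vec3>,
}

fn consume_point_clouds(mut events: EventReader<PointCloudUpdated>) {
    for update in events.iter() {
        println!("AR backend sent {} points", update.points.len());
    }
}

fn main() {
    App::build()
        .add_plugins(DefaultPlugins)
        .add_event::<PointCloudUpdated>()
        .add_system(consume_point_clouds.system())
        .run();
}
```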
Sounds doable. I tried to look into handheld device use cases of OpenXR, and also checked the Khronos API description for examples (e.g. about point clouds, etc.), but did not find any yet.
I hope that WGPU will handle all cases, but if it turns out not to, it would probably be easy(ish) to provide both WGPU and GFX abstractions to select from.
Status update: this pull request currently contains today's understanding of the minimal mandatory changes required in bevy for getting XR support. I've extracted the openxr-related code into a separate crate (see https://github.com/blaind/bevy_openxr/). There is a functional PoC that works on Windows and on Monado/Linux (local emulation) at https://github.com/blaind/xrbevy/

Currently needed changes to bevy (still iterating):
- Multiview support
- Camera projection and position matrices calculation (a sketch follows after this list)
- Possibility to edit the render graph from plugins
- Wgpu: this part is still very much in flux, e.g. the solution might differ a lot, especially since the underlying crates (gfx, wgpu, wgpu-rs) are heavily patched for now. For the current iteration, state sharing is needed at least for:
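Regarding the "Camera projection and position matrices calculation" item above, here is a sketch (not the PR's actual code) of building a per-eye projection matrix from OpenXR's asymmetric field-of-view angles, assuming a right-handed clip space with 0..1 depth as used by wgpu:

```rust
// A sketch of an asymmetric projection matrix from per-eye OpenXR FOV angles
// (radians; left/down are typically negative), targeting a right-handed,
// 0..1-depth clip space. Not code from this PR.
use bevy::math::{Mat4, Vec4};

pub fn projection_from_fov(
    angle_left: f32,
    angle_right: f32,
    angle_up: f32,
    angle_down: f32,
    near: f32,
    far: f32,
) -> Mat4 {
    let tan_l = angle_left.tan();
    let tan_r = angle_right.tan();
    let tan_u = angle_up.tan();
    let tan_d = angle_down.tan();

    let a = 2.0 / (tan_r - tan_l);
    let b = 2.0 / (tan_u - tan_d);
    let c = (tan_r + tan_l) / (tan_r - tan_l);
    let d = (tan_u + tan_d) / (tan_u - tan_d);

    // Columns follow glam's Mat4::perspective_rh layout, with the extra c/d
    // terms shifting the frustum off-center for each eye.
    Mat4::from_cols(
        Vec4::new(a, 0.0, 0.0, 0.0),
        Vec4::new(0.0, b, 0.0, 0.0),
        Vec4::new(c, d, far / (near - far), -1.0),
        Vec4::new(0.0, 0.0, (far * near) / (near - far), 0.0),
    )
}
```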
Work continues in #2319, I'll close this |
Here's a proof-of-concept of OpenXR rendering, using wgpu & the underlying stack. See https://github.com/blaind/xrbevy
Relates to #115
There's quite a bit to discuss here, as the integration would touch quite a few core parts of bevy.
Note that the current pull request is based on a February version of bevy.
A few initial notes:
- `Rgba8Unorm` is patched temporarily into many places, mostly because it was one format that worked with the Quest
- `mat4 ViewProj2;` is required in all shaders

For the architecture of the pull request, see https://github.com/blaind/xrbevy/blob/main/docs/architecture.md (evolving document)