XR support with OpenXR backend #2319
Conversation
How does this relate to #2166?
@bjorn3 As I wrote in the RFC, @blaind's work is not easy to integrate into bevy. It touches too many crates and is not very organized IMHO. @blaind is aware of this: they started working in the same direction as mine, exposing raw gfx-hal objects in wgpu. While their PoC focuses on OpenXR, my work considers OpenXR only as a backend and exposes an XR backend-agnostic API.
I just left some first-pass comments for stuff that immediately stood out to me.
If you'd like me to do a more in-depth review just lmk (or if you want me to wait for the PR to be in a more stable state, that's cool too!)
When the new labels get updated, we need to remember to tag this with the …
@NathanSWard Thanks for the feedback! For now I would prefer some feedback on the general shape of the API. I tried to make a compromise between ergonomics and abstraction layer "thickness", but there are always opportunities for improvement.
I have a few questions about my proposed API, mainly about conventions.
I would say yes. That's how we currently handle input events: we push them through an …
Based on the behavior of other …
@NathanSWard There has been a discussion in #xr on Discord about changing the current input system to better match the OpenXR API, in particular to bring remappable actions. This would probably lead to switching from an event-based to a poll-based input system.
Well, a …
Well, it depends on what …
I agree, there is the opportunity to make it more integrated.
There is the opportunity of splitting …
@NathanSWard Is it ok for the user to manually add a bevy plugin? For what I want to do, a simple API like:

```rust
App::build()
    .add_plugin(XrActionsPlugin::new()
        .register_binary_input(MyButton, "my_button")
        .register_vec2_input(MyJoystick, "my_joystick")
    )
```

which registers a …
I personally haven't seen uses of a builder API for a Plugin, but I like it a lot!

```rust
App::build()
    .add_plugin(XrActionsPluginBuilder::default()
        .register_binary_input(MyButton, "my_button")
        .register_vec2_input(MyJoystick, "my_joystick")
        .build()
    )
```

However, yes, I do like this kind of configuration. Take a look at the derive_builder crate. This could be helpful for what you're doing :)
I completely rewrote the interaction system. Now the API more closely matches WebXR instead of OpenXR. Input can now be read with …. I added a ….
This is a summary of the API so far. It is missing everything related to rendering (but most of the rendering stuff should be handled by the engine). I will copy this to the RFC once things settle. This should be the lifecycle when the lifecycle API is implemented (requires #2432):
|
| | Startup/WaitingForDevice/Exit | SessionCreated/Idle/SessionEnd | Resume/Running/Pause |
|---|---|---|---|
| `XrSystem` | ✔️ | ✔️ | ✔️ |
| `XrInteractionMode` | | ✔️ | ✔️ |
| `XrEnvironmentBlendMode` | | ✔️ | ✔️ |
| `XrVisibilityState` | | | ✔️ |
| `XrProfiles` | | | ✔️ |
| `XrButtons` | | | ✔️ |
| `XrAxes` | | | ✔️ |
| `XrTrackingSource` | | | ✔️ |
- `XrSystem`: used to enumerate and select the session mode.
- `XrInteractionMode`: `ScreenSpace` | `WorldSpace`, preferred mode for drawing HUDs (depends on whether the XR device is head-mounted or handheld).
- `XrEnvironmentBlendMode`: `Opaque` (VR) | `Additive` (AR) | `AlphaBlend` (MR)
- `XrVisibilityState`: `Hidden` | `VisibleUnfocused` | `VisibleFocused`, makes the game aware of the interaction state (see the sketch after this list).
- `XrProfiles`: backend-specific strings used to select the controllers' 3D model.
- `XrButtonState`: `Default` | `Touched` | `Pressed`
- `XrButtons`: used to read state and events for controller buttons.
- `XrAxes`: used to read the state of controller axes.
- `XrTrackingSource`: used to get/set the reference space type and poll controller/target ray/hand skeleton poses for the next vsync.
Other objects:
- `XrSessionMode`: `ImmersiveVR` | `ImmersiveAR` | `InlineVR` | `InlineAR`
- `XrRigidTransform`: position + orientation (see the layout sketch after this list)
- `XrPose`: rigid transform + linear/angular velocity
- `XrJointPose`: pose + joint radius
- `XrReferenceSpaceType`: `Viewer` (head-locked) | `Local` (origin at the starting head location) | `Stage` (origin at the floor level)
- `XrButtonType`: `Menu` | `Trigger` | `Squeeze` | `Touchpad` | `Thumbstick` | `FaceButton1` | `FaceButton2` | `Thumbrest` (WebXR ordering)
- `XrAxisType`: `TouchpadX` | `TouchpadY` | `ThumbstickX` | `ThumbstickY` (WebXR ordering)
- `VibrationEventType`: `Apply{...}` | `Stop`
- `VibrationEvent`: vibration event type + hand type
bevy_openxr

The `bevy_openxr` crate exposes backend-specific resources. It is all optional.
Resource availability:
| | Startup/WaitingForDevice/Exit | SessionCreated/Idle/SessionEnd | Resume/Running/Pause |
|---|---|---|---|
| `openxr::Instance` | ✔️ | ✔️ | ✔️ |
| `OpenXrSession` | | ✔️ | ✔️ |
| `Arc<OpenXrTrackingContext>` | | | ✔️ |
- `openxr::Instance`: useful when working with the session.
- `OpenXrSession`: wrapper of `openxr::Session` but made drop-safe (using a `wgpu::Device` handle). It is clonable and can be used concurrently (see the sketch below).
- `Arc<OpenXrTrackingContext>`: contains resources used for tracking. It can be used for polling poses at arbitrary times.
Other objects:
- `OpenXrFormFactor`: `HeadMountedDisplay` | `Handheld`, used to select the form factor when creating the plugin.
- `OpenXrContext`: used to create the plugin. Creates everything needed to initialize graphics.
- `OpenXrError`: `Loader(...)` | `InstanceCreation(...)` | `UnsupportedFormFactor` | `UnavailableFormFactor` | `GraphicsCreation(...)`
- `ButtonPaths`, `AxesBindings`, `VibrationBindings`, `PosesBindings`, `OpenXrProfileBindings`, `OpenXrBindings`: used to define the controller mappings.
- `OpenXrTrackingReference`: contained in `OpenXrTrackingContext`, editable behind a `RwLock`.
Utility functions:
- `openxr_pose_to_rigid_transform(pose) -> XrRigidTransform` (a possible implementation is sketched below)
- `openxr_pose_to_corrected_rigid_transform(pose, reference, prediction_time) -> XrRigidTransform`: used to account for pending recenterings.
- `predict_pose(space, reference, prediction_time) -> Option<XrPose>`: polls the pose at arbitrary times. If the time is rejected by the runtime, returns `None`.
- `predict_skeleton_pose(hand_tracker, reference, prediction_time) -> Option<Vec<XrJointPose>>`: similar to `predict_pose` but for skeletal hand tracking.
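As a usage example, a conversion along the lines of `openxr_pose_to_rigid_transform` could look like the following, assuming the `XrRigidTransform` layout sketched earlier; the actual function body in the PR may differ:

```rust
use bevy::math::{Quat, Vec3};

// Converts an OpenXR pose into the engine-facing rigid transform.
// `openxr::Posef` stores its quaternion as xyzw, matching `Quat::from_xyzw`.
pub fn openxr_pose_to_rigid_transform(pose: openxr::Posef) -> XrRigidTransform {
    XrRigidTransform {
        position: Vec3::new(pose.position.x, pose.position.y, pose.position.z),
        orientation: Quat::from_xyzw(
            pose.orientation.x,
            pose.orientation.y,
            pose.orientation.z,
            pose.orientation.w,
        ),
    }
}
```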
would this spec be compatible with webxr wrt hand tracking? esp the XRHandJoint required layout. Also is there a plan to support webxr?
@lizelive The joint layout of this API corresponds to WebXR. The only difference with OpenXR is that OpenXR has a palm joint (which I remove). Also yes, the plan is to support WebXR. I would like it if someone could write a nice WebXR wrapper on top of web-sys, since it is not very ergonomic working with all …
There have been some more discussions on the bevy Discord server about the interaction API. I want to settle this once and for all, so I'll summarize the options again. The objective is to choose a common interaction API that abstracts both the OpenXR and WebXR backends. The problem is that the two APIs are very different and conflict with each other. One of the two APIs should be chosen, and the backend corresponding to the other API will require some amount of glue code. This PR currently chooses WebXR as the common API for interaction.

Problem

XR game controllers are overall similar to each other in function, but the button layout and the number and type of buttons differ considerably from vendor to vendor. Without the help of a higher-level API, the user would need to limit the game's support to a specific type of controller, or they would need a lot of boilerplate to program compatibility with multiple controllers.

OpenXR 🟣 solution

The OpenXR interaction API is based on actions and bindings. Actions are an abstract concept: the user gives them a meaning in the context of the game. Buttons of different types of controllers can be bound to the same action; the user has to provide these bindings. The state of the actions can then be polled without worrying about the underlying source of the event.

WebXR 🕸 solution

WebXR reuses the Gamepad API. The Gamepad API maps different types of controllers to the same button layout, creating a sort of virtual controller. In the context of WebXR, the buttons of each type of controller are mapped to each other deterministically, so it is possible to assign an index and name to each virtual controller button and expect it to be mapped to real controller buttons that have a similar purpose/location. The Gamepad API bindings are already provided by controller vendors and are immutable.

Arguments

🟣 Actions are more expressive and higher level.

As a last note, the best API is also a matter of preference. It is important to keep in mind that the API needs to be ergonomic and should fit nicely with the rest of the bevy engine. One last option is no convergence at all between the two APIs: the user would need to write two separate bevy apps to support both OpenXR and WebXR.

I want to solve the dispute with a poll. If you think we should use a WebXR-like API, please react with 🚀. If you think we should use an OpenXR-like API, please react with 🎉. If you have other ideas, want to add arguments in favor of either API, or want to make clarifications, please comment below.

EDIT: added argument from @Ralith
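To make the contrast concrete, here is a toy sketch of the lookup patterns the two models imply. All names here are hypothetical; neither snippet is the real API of either backend:

```rust
use std::collections::HashMap;

// OpenXR-style 🟣: the game polls semantic, game-defined actions; which
// physical button feeds an action is decided by bindings, not by the game.
pub struct ActionState {
    binary_actions: HashMap<String, bool>,
}

impl ActionState {
    pub fn pressed(&self, action: &str) -> bool {
        self.binary_actions.get(action).copied().unwrap_or(false)
    }
}

// WebXR/Gamepad-style 🕸: the game polls fixed virtual-controller slots;
// which physical button maps to a slot is fixed by the vendor profile.
#[derive(Clone, Copy, PartialEq, Eq, Hash)]
pub enum VirtualButton {
    Trigger,
    Squeeze,
    Thumbstick,
}

pub struct GamepadState {
    buttons: HashMap<VirtualButton, bool>,
}

impl GamepadState {
    pub fn pressed(&self, button: VirtualButton) -> bool {
        self.buttons.get(&button).copied().unwrap_or(false)
    }
}
```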
As I previously explained on Discord, this is false. The OpenXR runtime is explicitly responsible for adapting whatever bindings an application suggests to the available hardware. For example, see the SteamVR 1.19.3 changelog, which introduces support for automatically deriving G2 and Cosmos controller bindings from application-specified Oculus Touch bindings. This provides the best of all worlds, as application developers can specify precise bindings for devices of particular interest, the runtime can make approximate mappings for other devices, and the end user can customize to taste. Further, the WebXR API cannot be implemented on top of the OpenXR input API without flooding user-facing binding UIs provided by OpenXR runtimes with the fake hardcoded actions used to emulate the WebXR controller. This degrades the user experience, and will make Bevy VR apps feel like second-class citizens compared to native VR apps.
This is also false. Querying the current interaction profile is not the correct way to determine the physical controller in use. As explained in the spec:
The correct way to determine an appropriate controller model, if needed, is the …
crates/bevy_app/Cargo.toml

```diff
@@ -27,7 +27,7 @@ ron = { version = "0.6.2", optional = true }

 [target.'cfg(target_arch = "wasm32")'.dependencies]
 wasm-bindgen = { version = "0.2" }
-web-sys = { version = "0.3", features = [ "Window" ] }
+web-sys = { version = "=0.3.51", features = [ "Window" ] }
```
Why are you forcing a specific version? This will cause an opaque unresolvable dependency resolution error if any dependency depends on a different patch version.
`web-sys = "=0.3.51"` is forced by wgpu. This bug is caused by `resolver = "2"`, which is needed by wgpu. Maybe this will be resolved after switching to the 2021 edition.
Hey, sorry for jumping in here a bit late!
@zarik5 Would you be willing to offer a little more context about the problematic ergonomics? My feeling is that I would much rather deal with compromised developer ergonomics if the alternative is to be limited in the type of user experience I'm able to provide. But I'm not 100% sure if I'm reading all of this right, so feel free to correct me if my assumptions are incorrect. Unreal's OpenXR plugin (which was contributed by Valve) provides first-class support for Steam's input system, which allows users to customize action mappings via the SteamVR overlay and enables developers to provide default mappings for various input devices. It's an excellent user experience because it offers flexibility and familiarity, but most importantly because it doesn't depend on the developer to bake in support for a particular input device. It's been a real headache trying to play a lot of content that isn't using a proper OpenXR integration, because the generic controller mappings rarely translate well across devices. Sometimes this just means controls are clunkier than they ought to be, but often it's outright game breaking. One glaring example is that Fallout 4 VR was virtually unplayable on an Oculus Rift at launch, due to the joysticks haphazardly emulating Vive touchpads in a way that made the map impossible to use. At the end of the day, I'm not going to cry about it if Bevy doesn't provide my ideal version of OpenXR support out of the box, because the plugin system is versatile enough that I can always work around it if need be, but I just wanted to throw that out there for your consideration. And either way I really appreciate all the work you've invested in this!
In my latest commit I changed the interaction API to be OpenXR-style, action-oriented. I also kept the old Button API, which gets decomposed into …
It looks like the Rendering Rewrite was released with Bevy 0.6; we're 278 commits behind. What all is needed to catch this feature back up to upstream Bevy and get this back on the move? If we can gather up a list of rough tasks and refine them, it'll help less-acquainted contributors like myself to jump in and help.
In my spare time I've been trying to continue zarik5's work on Oculus Quest 2 at https://github.com/kcking/bevy/tree/xr. I migrated zarik5's fork to wgpu 0.12. I now run into an error which I think can be resolved by incorporating blaind's android lifecycle work.
Your branch compiles once a fixed version of … is used. Sadly, it seems the current iteration crashes on startup if XR isn't available, instead of offering a way to selectively activate it. Additionally, it looks like it's still based on Bevy 0.5 instead of 0.6, which might take some work to port across?
Ah I did this just before the actual version bump to 0.6; I just (force) pushed a rebase of the official bevy 0.6 tag into the branch.
I think this is just because the …
It's still probably worth finding a way to make the feature have an opt-in capacity beyond "this executable will crash if your VR runtime isn't active right now", but if we make it an opt-in plugin, it at least won't break the pancake cargo-examples. The change to …:

```rust
let linear_velocity = velocity
    .linear_velocity
    .map(|x| to_vec3(x));
let angular_velocity = velocity
    .angular_velocity
    .map(|x| to_vec3(x));
```
From @kcking:
Just reposting from Discord so that #3412 is linked to this issue. Also there is a lot of discussion going on right now to get this working in @kcking's fork. https://discord.com/channels/691052431525675048/931772673195905124
@zarik5 have you commented in #2373? It would be nice to be able to build off this worry-free.
@alice-i-cecile Done.
What's the status on this PR? I would love to see it merged soon, and I'd like to know how complete the API is and what still needs to be done before merge.
@thedocruby this needs to be revived and championed, likely by a different author or coordinated subteam. Adding the …
XR in Bevy needs to happen for Bevy to be selected for an upcoming project of mine. Will this happen anytime soon?
XR support would be extremely useful for us! We've been looking into bevy for quite a while now and would like to see this actually implemented; we're working on a game that's supposed to have XR support eventually.
Is XR in Bevy going to happen?
it's happening at https://github.com/awtterpip/bevy_openxr at the moment
@TheButlah Why the thumbs down? Does that mean that you don't like the question, or does it mean that you don't want XR in Bevy?
Because it's just unnecessary noise on the PR: obviously XR is being worked on, so there is no need to ask if it's going to happen. The answer is yes, eventually. See awtterpip's PR above, and you can also take a look at https://github.com/NexusSocial/skilltree/ for some concrete examples. Support is very nascent right now, but there are several people actively working on this, and there is a lot of opportunity to flesh out the third-party crates as well. Feel free to mess around with them and see if there is something that you'd like to contribute to.
Ongoing development work is occurring in https://github.com/awtterpip/bevy_openxr. Eventually, I'd like to push to get that upstreamed. For now though, I'm going to close out this PR.
This is an in-progress implementation for XR integration into bevy. It is incomplete because it does not address the interoperability with `bevy_render` and `bevy_wgpu`, which is postponed until after the rendering rewrite.

Related RFC.