Proposed RFC Feature : XRInteraction #118
Comments
Although I have focused mostly on the rendering side of things for XR, I can still provide my thoughts on this RFC. The overall design looks good, and it's nice to see a working gif below. Good job. I have a few points below.
Summary:
XRInteraction is a high-level, component-based interaction system for creating XR experiences. It provides a framework that makes it easy to interact with 3D objects and UI.
What is the relevance of this feature?
XRInteraction makes it easy for developers to interact with 3D objects or UIs in XR applications.
XRInteraction contains a set of components that support the following interaction features:
Feature design description:
XROrigin
The XR origin represents the user in the XR scene. It contains the camera floor offset object, hand pose, and camera.
Interactor-interactable
All interaction models in XRInteraction are defined by an Interactor-Interactable pair of components. For example, the ray interaction is defined by the RayInteractor-Interactable component pair.
An Interactor is the component that acts on (hovers or selects) associated Interactable components. It also defines how to choose the best Interactable candidate to hover over, when a selection should occur, what should happen during a selection, and so on.
An Interactable is the component that is acted upon (can be hovered or selected) by associated Interactor components. An Interactable will specify the maximum number of Interactors that can hover or select it at a time.
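As an illustrative sketch (not the actual component API), the hover-capacity rule an Interactable enforces might look like this, with `Interactable`, `TryHover`, and interactor ids all being hypothetical names:

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Hypothetical sketch: an Interactable caps how many Interactors
// may hover it at a time, as described above.
class Interactable
{
public:
    explicit Interactable(std::size_t maxHoverCount)
        : m_maxHoverCount(maxHoverCount) {}

    // Returns true if the interactor was accepted as a hoverer.
    bool TryHover(int interactorId)
    {
        if (m_hoverers.size() >= m_maxHoverCount)
        {
            return false; // hover capacity reached
        }
        m_hoverers.push_back(interactorId);
        return true;
    }

    void EndHover(int interactorId)
    {
        m_hoverers.erase(
            std::remove(m_hoverers.begin(), m_hoverers.end(), interactorId),
            m_hoverers.end());
    }

    std::size_t HoverCount() const { return m_hoverers.size(); }

private:
    std::size_t m_maxHoverCount;
    std::vector<int> m_hoverers; // ids of interactors currently hovering
};
```

The same shape would apply to the select state, with its own capacity.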
UML Class Diagram
XRInteraction::XRInteractionSystemComponent connects the Interactors and Interactables, managing the interaction state of its group of registered Interactors and Interactables.
XRInteraction::XRControllerComponent reads input actions (buttons, touchpad, etc.) from a tracked device's controls each frame; the Interactors then query it to determine whether they should select or activate. This component also reads the position and rotation of the tracked device and applies them to the Transform component.
XRInteraction::XRRayInteractorComponent handles the actions of hovering over and selecting Interactable objects in the world. It builds a list of Interactables that it could potentially hover or select each frame; by default, the closest Interactable has the highest priority.
XRInteraction::XRInteractableComponent is an object in a scene that an Interactor can hover, select, and/or activate, and defines the behavior of those interaction states.
XRInteraction::XRTeleportComponent is an interactable for teleportation, defining the area to which the user can be teleported.
XRInteraction::XRActionManagerComponent switches between the basic interaction mode and the teleport mode depending on the input.
Interaction workflow
Workflow of interactive handover
On each hand, object-manipulation operations such as hover, select, and grab can coexist with locomotion operations such as teleportation. In this case, two interactors are created: one for processing basic operations and the other for processing locomotion. Different interactions are triggered by different input operations. The switchover process between the two interactors is as follows:
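A minimal sketch of that switchover, assuming teleport mode is driven by a dedicated input (all names here are illustrative, not the gem's actual API):

```cpp
// Hypothetical sketch of the per-hand interactor switchover: exactly one of
// the two interactors is active at a time, chosen by the teleport input.
struct InputState
{
    bool teleportSelectPressed = false; // e.g. the Teleport Select action
};

struct Interactor
{
    bool enabled = false;
};

struct Hand
{
    Interactor rayInteractor;      // basic operations: hover, select, grab
    Interactor teleportInteractor; // locomotion: teleportation
};

// Called each frame by an action-manager-like component.
void UpdateHand(Hand& hand, const InputState& input)
{
    const bool teleport = input.teleportSelectPressed;
    hand.teleportInteractor.enabled = teleport;
    hand.rayInteractor.enabled = !teleport; // only one interactor active
}
```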
Component Panel
XR Controller
Position Id
Orientation Id
Select Id
Teleport Select Id
Move In Id
Move Out Id
Rotate Left Id
Rotate Right Id
XR Ray Interactor
If Line Type is set to StraightLine, the following figure shows the configuration panel.
If Line Type is set to Projectile, the following figure shows the configuration panel.
Enable Interactor
Interactable Type: XRInteractable and XRTeleport are supported.
Line Type: StraightLine and Projectile are supported.
Max Raycast Distance: applies when Line Type is StraightLine.
Velocity: applies when Line Type is Projectile.
Acceleration: applies when Line Type is Projectile.
SampleFrequency: applies when Line Type is Projectile.
ReferenceEntity: applies when Line Type is Projectile.
Technical design description:
To interact with objects, ray casting combined with collision detection is the primary method. Currently, two ray types are supported: straight line and projectile. The interactor generates sampling points according to the line type, then traverses all sampling points; each pair of consecutive sampling points is used as the start and end of a segment to determine whether a collision occurs.
Ray sampling point
The projectile ray type initializes its parameters from the configuration items m_velocity and m_acceleration. The calculation method is as follows:
The calculation method for each sampling point is as follows:
We generate collision sampling points based on the line type, and the calculation method is as follows:
Raycast Hit
We use O3DE's AzPhysics for raycast collision detection. We traverse all sampling points and determine whether a collision occurs between each pair of consecutive sampling points. The algorithm is as follows:
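A minimal sketch of that traversal, with `castSegment` standing in for the AzPhysics scene query (the callback signature and `Vec3` type here are assumptions, not the actual AzPhysics API):

```cpp
#include <functional>
#include <optional>
#include <vector>

struct Vec3
{
    float x = 0, y = 0, z = 0;
};

// Walks consecutive sampling-point pairs and casts a ray along each segment.
// `castSegment` is a stand-in for an AzPhysics scene query; it returns true
// and fills hitPoint when the segment hits something.
std::optional<Vec3> FindFirstHit(
    const std::vector<Vec3>& samples,
    const std::function<bool(Vec3 start, Vec3 end, Vec3& hitPoint)>& castSegment)
{
    for (std::size_t i = 0; i + 1 < samples.size(); ++i)
    {
        Vec3 hit;
        if (castSegment(samples[i], samples[i + 1], hit))
        {
            return hit; // first hit along the curve is the closest one
        }
    }
    return std::nullopt; // no segment intersected anything
}
```

Because segments are visited in curve order, the first hit returned is automatically the closest along the ray, which matches the priority rule described for the ray interactor.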
Code tree
```
// XRInteraction Gem
XRInteraction
|--Assets
|  |--LeftController.fbx
|  |--RightController.fbx
|  |--XROrigin.prefab
|--Code
|  |--Include
|  |  |--XRInteraction
|  |  |  |--XRInteractionBus.h
|  |--Source
|  |  |--Controllers
|  |  |  |--XRControllerComponent.cpp
|  |  |  |--XRControllerComponent.h
|  |  |--Interactables
|  |  |  |--XRBaseInteractable.cpp
|  |  |  |--XRBaseInteractable.h
|  |  |  |--XRInteractableComponent.cpp
|  |  |  |--XRInteractableComponent.h
|  |  |  |--XRInteractableInterface.h
|  |  |  |--XRTeleportComponent.cpp
|  |  |  |--XRTeleportComponent.h
|  |  |--Interactors
|  |  |  |--IXRHoverInteractor.h
|  |  |  |--IXRInteractor.h
|  |  |  |--IXRSelectInteractor.h
|  |  |  |--XRRayInteractorComponent.cpp
|  |  |  |--XRRayInteractorComponent.h
|  |  |  |--XRTeleportComponent.cpp
|  |  |  |--XRTeleportComponent.h
|  |  |--XRActionManagerComponent.cpp
|  |  |--XRActionManagerComponent.h
|  |  |--XRInteractionModule.cpp
|  |  |--XRInteractionModuleInterface.h
|  |  |--XRInteractionSystemComponent.cpp
|  |  |--XRInteractionSystemComponent.h
|  |  |--XRInteractionType.h
```
What are the advantages of the feature?
With XRInteraction, you can easily build the interaction part of XR applications.
What are the disadvantages of the feature?
How will this be implemented or integrated into the O3DE environment?
Currently, our features are integrated into the XRInteraction gem. Together with the XR gem, the XRInteraction gem can be hosted in the O3DE Extras repo. Developers can add this gem to their projects and use the above-mentioned XR interaction capabilities.
Are there any alternatives to this feature?
No alternatives at the moment.
How will users learn this feature?
Are there any open questions?