
Proposed RFC Feature : XRInteraction #118

Open
laoyigrace opened this issue Mar 3, 2023 · 2 comments
Labels
rfc-feature Request for Comments for a Feature

Comments

@laoyigrace

laoyigrace commented Mar 3, 2023

Summary:

XRInteraction is a high-level, component-based interaction system for creating XR experiences. It provides a framework that makes it easy to interact with 3D objects and UI.
image

What is the relevance of this feature?

XRInteraction makes it easy for developers to interact with 3D objects and UIs in XR applications.
It contains a set of components that support the following interaction features:

| Feature | Description |
| --- | --- |
| Controller Input | Controller input that supports OpenXR. |
| Basic Object Operation | Basic hover, select, and grab, plus operations on objects such as rotation and movement. |
| Locomotion | The character can teleport, move smoothly, and rotate smoothly. |
| Haptic Feedback | Haptic feedback through XR controllers. |
| Visual Feedback | Visual feedback (bounding box / line rendering) to indicate possible and active interactions. |
| XR Device Simulator | Simulates user input from ordinary devices (keyboard, mouse, or controller) to drive the XR controller or hand in the scene. |
| XR Keyboard | A virtual keyboard for text input on an XR device. |
| Hand Interaction | Support for bare-hand interaction. |

Feature design description:

XROrigin

image

The XR origin represents the user in the XR scene. It contains the camera floor offset object, hand pose, and camera.

Interactor-interactable

All interaction models in XRInteraction are defined by an Interactor-Interactable pair of components. For example, the ray interaction is defined by the RayInteractor-Interactable component pair.
An Interactor is the component that acts on (hovers or selects) associated Interactable components. It also defines how to choose the best Interactable candidate to hover over, when a selection should occur, what should happen during a selection, and so on.
An Interactable is the component that is acted upon (can be hovered or selected) by associated Interactor components. An Interactable will specify the maximum number of Interactors that can hover or select it at a time.

UML Class Diagram

image

XRInteraction::XRInteractionSystemComponent connects the Interactors and Interactables, managing the interaction state of its group of registered Interactors and Interactables.

XRInteraction::XRControllerComponent reads input actions (buttons, touchpad, etc.) from a tracked device's controls each frame; Interactors then query it to determine whether they should select or activate. This component also reads the position and rotation of the tracked device and applies them to the Transform component.

XRInteraction::XRRayInteractorComponent handles hovering over and selecting Interactable objects in the world. Each frame it builds a list of Interactables that it could potentially hover or select; by default the closest Interactable has the highest priority.

XRInteraction::XRInteractableComponent is an object in a scene that an Interactor can hover, select, and/or activate, and defines the behavior of those interaction states.

XRInteraction::XRTeleportComponent is an interactable for teleportation, defining the area to which the user can be teleported.

XRInteraction::XRActionManagerComponent switches between the basic interact mode or teleport mode depending on the input.

Interaction workflow

image

Workflow of interactive handover

On each hand, object-manipulation operations (such as hover, select, and grab) and locomotion operations (such as teleportation) can coexist. In this case, two Interactors are created: one for basic operations and one for locomotion. Different interactions are triggered by different inputs. The switchover between the two Interactors works as follows:

image

Component Panel

XR Controller

image

| Parameter | Description |
| --- | --- |
| Position Id | The input ID of the controller position. |
| Orientation Id | The input ID of the controller orientation. |
| Select Id | The input ID of Select. |
| Teleport Select Id | The input ID of Teleport Select. |
| Move In Id | The input ID of Move In. |
| Move Out Id | The input ID of Move Out. |
| Rotate Left Id | The input ID of Rotate Left. |
| Rotate Right Id | The input ID of Rotate Right. |
XR Ray Interactor

If Line Type is set to StraightLine, the following figure shows the configuration panel.
image

If Line Type is set to Projectile, the following figure shows the configuration panel.
image

| Parameter | Description |
| --- | --- |
| Enable Interactor | Enables the interactor. |
| Interactable Type | Interactable type of the interaction; XRInteractable and XRTeleport are supported. |
| Line Type | Line type of the raycast; StraightLine and Projectile are supported. |
| Max Raycast Distance | Maximum raycast distance. Configured when Line Type is StraightLine. |
| Velocity | Initial velocity of the projectile. Configured when Line Type is Projectile. |
| Acceleration | Gravity of the projectile in the reference frame. Configured when Line Type is Projectile. |
| SampleFrequency | The number of sample points used to approximate curved paths. Configured when Line Type is Projectile. |
| ReferenceEntity | The reference frame of the curve, defining the ground plane and up direction. Configured when Line Type is Projectile. |

Technical design description:

To interact with objects, ray casting with collision detection is the core mechanism. Two ray types are currently supported: straight line and projectile. The interactor generates sampling points according to the line type, then traverses them, using each pair of consecutive points as the start and end of a segment to test for collision.

Ray sampling point

The projectile ray type initializes its parameters from the configuration items m_velocity and m_acceleration. The calculation is as follows:

```cpp
void CalculateProjectileParameters(AZ::Vector3& initialPosition, AZ::Vector3& initialVelocity, AZ::Vector3& constantAcceleration, float& flightTime)
{
    initialPosition = GetEntity()->GetTransform()->GetWorldTranslation();
    initialVelocity = GetEntity()->GetTransform()->GetWorldTM().GetBasisY() * m_projectileConfig.m_velocity;
    AZ::Vector3 up = AZ::Vector3::CreateAxisZ();
    AZ::Vector3 referencePosition = AZ::Vector3::CreateZero();
    if (m_projectileConfig.m_referenceEntity.IsValid())
    {
        AZ::Transform worldTransform;
        AZ::TransformBus::EventResult(worldTransform, m_projectileConfig.m_referenceEntity, &AZ::TransformBus::Events::GetWorldTM);
        up = worldTransform.GetBasisZ();
        referencePosition = worldTransform.GetTranslation();
    }

    constantAcceleration = up * -m_projectileConfig.m_acceleration;

    // Vertical velocity component Vy = v₀sinθ
    // When initial height = 0:
    //   time of flight = 2(initial velocity)(sine of launch angle) / (acceleration) = 2v₀sinθ/g
    // When initial height > 0:
    //   time of flight = [Vy + √(Vy² + 2gh)] / g
    // An additional flight time property is added on top of this.

    AZ::Vector3 castForward = GetEntity()->GetTransform()->GetWorldTM().GetBasisY();
    AZ::Vector3 projectedForward = castForward.GetProjectedOnNormal(up);
    // AZ::Sin expects radians, so use AngleSafe rather than AngleSafeDeg.
    float angle = castForward.AngleSafe(projectedForward);

    float vy = m_projectileConfig.m_velocity * AZ::Sin(angle);

    float height = (referencePosition - initialPosition).GetProjected(up).GetLength() + m_projectileConfig.m_additionalGroundHeight;
    if (height < 0.0f)
    {
        flightTime = m_projectileConfig.m_additionalFlightTime;
    }
    else if (height == 0.0f)
    {
        flightTime = 2.0f * vy / m_projectileConfig.m_acceleration + m_projectileConfig.m_additionalFlightTime;
    }
    else
    {
        flightTime = (vy + AZ::Sqrt(vy * vy + 2.0f * m_projectileConfig.m_acceleration * height)) / m_projectileConfig.m_acceleration + m_projectileConfig.m_additionalFlightTime;
    }

    flightTime = std::max(flightTime, 0.0f);
}
```

The calculation method for each sampling point is as follows:

```cpp
AZ::Vector3 SampleProjectilePoint(AZ::Vector3 initialPosition, AZ::Vector3 initialVelocity, AZ::Vector3 constantAcceleration, float time)
{
    // Position of an object under constant acceleration:
    // x(t) = x₀ + v₀t + 0.5at²
    // where x₀ is the position at time 0,
    // v₀ is the velocity vector at time 0,
    // and a is the constant acceleration vector.
    return initialPosition + initialVelocity * time + constantAcceleration * (0.5f * time * time);
}
```

We generate collision sampling points based on the line type, and the calculation method is as follows:

```cpp
void UpdateSamplePoints()
{
    m_samplePoints.clear();

    if (m_xrController)
    {
        AZ::Vector3 controllerPosition = m_xrController->GetControllerPosition();
        AZ::Vector3 controllerForward = m_xrController->GetControllerForward();
        m_samplePoints.push_back(controllerPosition);
        float interval = 1.0f;

        switch (m_lineType)
        {
        case XRInteraction::StraightLine:
            m_samplePoints.push_back(controllerPosition + controllerForward * m_maxRaycastDistance);
            break;
        case XRInteraction::Projectile:
        {
            // Braces scope these locals to the case label, so control flow
            // cannot jump past their initialization.
            AZ::Vector3 initialPosition;
            AZ::Vector3 initialVelocity;
            AZ::Vector3 constantAcceleration;
            float flightTime;
            CalculateProjectileParameters(initialPosition, initialVelocity, constantAcceleration, flightTime);

            interval = flightTime / (m_projectileConfig.m_sampleFrequency - 1);
            for (int i = 1; i < m_projectileConfig.m_sampleFrequency; ++i)
            {
                float time = i * interval;
                m_samplePoints.push_back(SampleProjectilePoint(initialPosition, initialVelocity, constantAcceleration, time));
            }
            break;
        }
        default:
            AZ_Warning("XRRayInteractorComponent", false, "The line type is not supported.");
            break;
        }
    }
}
```

Raycast Hit

We use O3DE's AzPhysics for raycast collision detection. We traverse all sampling points and test whether a collision occurs between each pair of consecutive points. The algorithm is as follows:

```cpp
void XRRayInteractorComponent::UpdateRaycastHits()
{
    m_samplePointsHitIndex = 0;

    if (m_samplePoints.size() < 2)
    {
        AZ_Error("XRRayInteractorComponent", false, "sample point count < 2");
        return;
    }

    auto* sceneInterface = AZ::Interface<AzPhysics::SceneInterface>::Get();
    if (sceneInterface == nullptr)
    {
        AZ_Error("XRRayInteractorComponent", false, "AzPhysics::SceneInterface is not supported.");
        return;
    }
    AzPhysics::SceneHandle sceneHandle = sceneInterface->GetSceneHandle(AzPhysics::DefaultPhysicsSceneName);

    for (int i = 1; i < static_cast<int>(m_samplePoints.size()); ++i)
    {
        AzPhysics::RayCastRequest request;
        AZ::Vector3 from = m_samplePoints[i - 1];
        AZ::Vector3 to = m_samplePoints[i];
        request.m_start = from;
        request.m_direction = to - from;
        request.m_direction.Normalize();
        request.m_maxResults = MaxRaycastHits;
        request.m_reportMultipleHits = true;
        request.m_distance = (to - from).GetLength();
        m_raycastHits = sceneInterface->QueryScene(sceneHandle, &request);
        if (m_raycastHits.m_hits.size() > 0)
        {
            m_samplePointsHitIndex = i;
            break;
        }
    }
}
```

Code tree

```
// XRInteraction Gem
XRInteraction
    |--Assets
        |--LeftController.fbx
        |--RightController.fbx
        |--XROrigin.prefab
    |--Code
        |--Include
            |--XRInteraction
                |--XRInteractionBus.h
        |--Source
            |--Controllers
                |--XRControllerComponent.cpp
                |--XRControllerComponent.h
            |--Interactables
                |--XRBaseInteractable.cpp
                |--XRBaseInteractable.h
                |--XRInteractableComponent.cpp
                |--XRInteractableComponent.h
                |--XRInteractableInterface.h
                |--XRTeleportComponent.cpp
                |--XRTeleportComponent.h
            |--Interactors
                |--IXRHoverInteractor.h
                |--IXRInteractor.h
                |--IXRSelectInteractor.h
                |--XRRayInteractorComponent.cpp
                |--XRRayInteractorComponent.h
                |--XRTeleportComponent.cpp
                |--XRTeleportComponent.h
            |--XRActionManagerComponent.cpp
            |--XRActionManagerComponent.h
            |--XRInteractionModule.cpp
            |--XRInteractionModuleInterface.h
            |--XRInteractionSystemComponent.cpp
            |--XRInteractionSystemComponent.h
            |--XRInteractionType.h
```

What are the advantages of the feature?

With XRInteraction, you can easily build the interaction part of XR applications.

(animated GIF demonstrating XR interaction)

What are the disadvantages of the feature?

How will this be implemented or integrated into the O3DE environment?

Currently, these features are integrated into the XRInteraction gem. Together with the XR gem, the XRInteraction gem can be hosted in the O3DE Extras repo. Developers can add this gem to their projects to use the XR interaction capabilities described above.

Are there any alternatives to this feature?

No alternatives at the moment.

How will users learn this feature?

  1. First, an XROrigin prefab is provided in the Assets directory of the XRInteraction gem.
  2. Developers instantiate this prefab in their own level: add the XROrigin prefab, then delete the default camera that comes with the level.
  3. Configure the position of the camera offset node and adjust the tracking camera position to match the scene.
  4. Among the child nodes of the XROrigin node, enable the required Interactors, as shown in the following figure:

image

  5. Add the PhysX Collider and XR interaction components to the interactive object, as shown in the following figure:

image

Are there any open questions?

@laoyigrace laoyigrace added the rfc-feature Request for Comments for a Feature label Mar 3, 2023
@moudgils

moudgils commented Mar 10, 2023

Although I have focused mostly on the rendering side of things for XR, I can still provide my thoughts on this RFC. The overall design looks good, and it's nice to see a working gif below. Good job. I have a few points below.

  • Why create a new gem? The contents described as part of the XRInteraction gem make sense as part of the XR gem. The XR gem is very light at the moment, and the idea was that any XR-related common features would live there, while OpenXR-specific APIs would be embedded in the openxrvk/openxrdx12 gems. Unless I am missing something, consider adding all the code changes to the XR gem, as these features feel central to the XR tech stack. We don't want people to have to enable and register both XR and XRInteraction if we don't have to.

  • The UML diagram is not very clear. Can you establish connections between the classes? It was hard for me to figure out the relationships between all the classes, though I was able to infer a lot of the connections after squinting at it for some time.

  • Any thoughts on how something like XRTeleportComponent will be activated, and how it will procure settings related to its behaviour?

  • Any thoughts on rendering-related properties for the ray? For example, changing the color of the ray when it collides with something?

@laoyigrace
Author


Thanks for your reply.

  • XRInteraction is a relatively independent feature, so I separated it from the XR gem in my design. It could be merged into the XR gem in future development.

  • Sorry about that; I have attached a high-resolution version of the UML diagram.

  • In the XRActionManagerComponent, you can configure the input event that activates teleport.
    image

  • Visual feedback is an important feature. During the interaction, when hovering over an object, the ray is white and the bounding box is gray; when the object is selected, both the ray and the bounding box turn blue. In the early stage we can use DebugDisplayRequests::DrawLine, although it is not visually polished; an artist can help improve it later.
