
Fix routing of InputEventScreenDrag events to Control nodes #68632

Conversation

m4gr3d
Contributor

@m4gr3d m4gr3d commented Nov 14, 2022

Multitouch support enables multiple concurrent InputEventScreenTouch and InputEventScreenDrag events, each differentiated by its index property.

An InputEventScreenDrag event typically follows another InputEventScreenDrag or InputEventScreenTouch event, and thus should be routed to the target that handled the previous touch event with the same index.
Prior to this fix, in a multitouch scenario only the target of the first touch event was tracked, causing all subsequent multitouch drag events to be routed to that stale target. This PR tracks touch targets using the touch event's index property, and uses that data to route each drag event to the target matching its index property.
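The per-index tracking described above can be sketched in plain C++. This is an illustrative simplification with stand-in types, not the actual engine code: `TouchRouter`, `Control`, and the method names are hypothetical.

```cpp
#include <cassert>
#include <unordered_map>

// Stand-in for godot's Control node (illustrative only).
struct Control {
	int id;
};

struct TouchRouter {
	// One target per touch index, mirroring the per-index tracking the PR adds.
	// Before the fix, this was effectively a single shared target for all indices.
	std::unordered_map<int, Control *> touch_targets;

	void on_touch(int index, bool pressed, Control *hit) {
		if (pressed) {
			touch_targets[index] = hit; // remember who received the press
		} else {
			touch_targets.erase(index); // release ends the gesture
		}
	}

	// A drag event is routed to the target that took the press with the same index.
	Control *drag_target(int index) const {
		auto it = touch_targets.find(index);
		return it == touch_targets.end() ? nullptr : it->second;
	}
};
```

With this structure, two fingers pressing different Controls keep their drag events separate: releasing one finger erases only that index, and the other finger's drags continue to reach the correct Control.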

3.x version

Bugsquad edit:

@akien-mga
Member

CC @Sauermann

@Sauermann
Contributor

I had a look at this PR and here are a few thoughts:

  • The implementation seems to work for Controls in the root Viewport in my test-cases (I moved Controls around with event.relative) and the events are sent to the expected Control
  • Same is valid for Controls in different SubViewports
  • When a Control A is in the root Viewport and a different Control B is in an embedded Window, and both are dragged, then A doesn't receive InputEventScreenDrag events.
  • Dragging or resizing an embedded Window has the effect that no other Control or Window receives InputEventScreenDrag events, even if it was receiving these events previously.
  • I didn't test TouchScreenButton

In order to make this work with Windows (not sure how much of that should be supported), a necessary step would be for each Viewport to keep track of which embedded Window each event should be sent to in Viewport::_sub_windows_forward_input.

bool Viewport::_sub_windows_forward_input(const Ref<InputEvent> &p_event) {
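The bookkeeping @Sauermann suggests could look roughly like the following. This is a hypothetical sketch in plain C++ with stand-in types (`Window`, `SubWindowRouter` are not real engine names): each Viewport would remember which embedded Window a touch index was forwarded to, so later drag events with the same index reach the same Window.

```cpp
#include <cassert>
#include <unordered_map>

// Stand-in for an embedded godot Window (illustrative only).
struct Window {
	int id;
};

// Hypothetical per-Viewport state for forwarding touch events to embedded Windows.
struct SubWindowRouter {
	std::unordered_map<int, Window *> touch_focus; // touch index -> embedded Window

	// Called when a press event lands on an embedded window.
	void forward_press(int index, Window *w) {
		touch_focus[index] = w;
	}

	// A drag with a known index goes to the window that took the press;
	// nullptr means the event falls through to the Viewport's own gui handling.
	Window *forward_drag(int index) const {
		auto it = touch_focus.find(index);
		return it == touch_focus.end() ? nullptr : it->second;
	}

	void forward_release(int index) {
		touch_focus.erase(index);
	}
};
```

This would address the reported cases where dragging an embedded Window starves other Controls of InputEventScreenDrag events, since each index would retain its own forwarding target.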

@m4gr3d
Contributor Author

m4gr3d commented Nov 16, 2022

I had a look at this PR and here are a few thoughts:

  • The implementation seems to work for Controls in the root Viewport in my test-cases (I moved Controls around with event.relative) and the events are sent to the expected Control
  • Same is valid for Controls in different SubViewports
  • When a Control A is in the root Viewport and a different Control B is in an embedded Window, and both are dragged, then A doesn't receive InputEventScreenDrag events.
  • Dragging or resizing an embedded Window has the effect that no other Control or Window receives InputEventScreenDrag events, even if it was receiving these events previously.
  • I didn't test TouchScreenButton

In order to make this work with Windows (not sure how much of that should be supported), a necessary step would be for each Viewport to keep track of which embedded Window each event should be sent to in Viewport::_sub_windows_forward_input.

bool Viewport::_sub_windows_forward_input(const Ref<InputEvent> &p_event) {

@Sauermann From a quick look at Viewport::_sub_windows_forward_input, it doesn't seem to support raw touch events (no reference to InputEventScreenTouch or InputEventScreenDrag), so this fix doesn't apply to it.

If we intend to support touch events (and multi-touch) in a viewport's subwindows, then it should be another PR since we wouldn't be fixing an existing behavior, but adding a new capability.

@Sauermann
Contributor

Alternative to #49215
fixes #29525
Possibly related to #62597

@RandomShaper
Member

I see a possibility of dangling pointers to Controls in case they are deleted while they are still referenced. Maybe a simple fix is to store their ObjectID instead, similarly to what picking does.

@m4gr3d
Contributor Author

m4gr3d commented Nov 30, 2022

I see a possibility of dangling pointers to Controls in case they are deleted while they are still referenced. Maybe a simple fix is to store their ObjectID instead, similarly to what picking does.

@RandomShaper Do you have reference for the picking logic I can look at?

The current approach follows the pattern I've seen in the GUI struct where Control nodes are held by their pointers (example).

@RandomShaper
Member

I mean usages like this:

HashMap<ObjectID, uint64_t>::Iterator F = physics_2d_mouseover.find(res[i].collider_id);

In that specific case, the relevant Controls remove themselves from the list upon destruction. That said, it'd be interesting to know for sure whether the other uses of raw pointers have safety guarantees because of some careful handling.

In this case of tracking which touch index is relevant to a specific control, I'd say that basing it on ObjectID is better than having to deal with some sort of callback mechanism to prune the map.
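The ObjectID-based approach can be sketched as follows. This is a simplified stand-in, not the real ObjectDB API: the point is that storing IDs instead of raw pointers makes a freed Control resolve to null rather than leaving a dangling pointer in the map.

```cpp
#include <cassert>
#include <cstdint>
#include <unordered_map>

using ObjectID = uint64_t;

// Minimal stand-in for godot's ObjectDB: a registry of live objects,
// so an ID lookup after deletion safely yields nullptr.
struct ObjectDB {
	static std::unordered_map<ObjectID, void *> objects;

	static void *get_instance(ObjectID id) {
		auto it = objects.find(id);
		return it == objects.end() ? nullptr : it->second;
	}
};
std::unordered_map<ObjectID, void *> ObjectDB::objects;

// Touch-target map keyed by touch index, storing ObjectIDs rather than
// Control pointers (hypothetical names, mirroring the approach in the PR).
struct TouchTargets {
	std::unordered_map<int, ObjectID> by_index; // touch index -> Control's ObjectID

	void *target_for(int index) const {
		auto it = by_index.find(index);
		if (it == by_index.end()) {
			return nullptr;
		}
		return ObjectDB::get_instance(it->second); // nullptr if freed meanwhile
	}
};
```

The trade-off versus raw pointers is one extra lookup per event, in exchange for not needing a callback mechanism to prune the map when a Control is deleted mid-gesture.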

@m4gr3d m4gr3d force-pushed the fix_input_event_screen_drag_routing_main branch from 099d6f7 to 4e9c2ec Compare November 30, 2022 21:32
@m4gr3d m4gr3d force-pushed the fix_input_event_screen_drag_routing_main branch from 4e9c2ec to 3ff7dd2 Compare November 30, 2022 22:07
@m4gr3d
Contributor Author

m4gr3d commented Nov 30, 2022

@RandomShaper I've updated the logic to use the ObjectID instead.

@akien-mga akien-mga merged commit 92b456b into godotengine:master Dec 1, 2022
@akien-mga
Member

Thanks!

Successfully merging this pull request may close these issues.

Control _gui_input inconsistent behavior with InputEventScreenTouch
4 participants