Define a low latency event that doesn't occur in the document lifecycle #214
Comments
@patrickkettner @smaug---- @scottgonzalez Any thoughts on this?
Is there a reason that the much higher frequency is beneficial, as opposed to just dispatching at a predefined rate closer to the desired rAF frequency, such as 60Hz? If not, then we could just continue to use the existing events.
I think we should try to use the same events, but possibly let the web page hint that it wants high-precision events.
It is all about latency, as you might need to wait ~16ms for a rAF call to come around. This is a very limited use case, but it has the potential to deliver a good user experience for apps that use it correctly. That was my initial approach, where I specified an event listener option. @RByers said the problem with that approach is that it doesn't allow apps to reason about performance, because it can be modified in a number of ways. For example, if an iframe sets the option, then all event listeners, even third-party code on the main page (for usage metrics), start getting executed more frequently when they only expected to run at rAF time.
But what are the events being used for? Why does drawing to an offscreen canvas faster than rAF improve the user experience? I assume the rendered canvas is then being injected into a visible location; but that should be done inside rAF, right?
We are working on having an on-screen canvas actually get a hardware layer (like video). Chrome normally has a multi-frame swap chain, which a direct hardware layer avoids. So you can get an onscreen canvas swapping in a single frame.
Here's another use case that (while less compelling) is a lot simpler and easier to reason about IMHO: a music synthesizer / DJ application that creates or modifies sound in response to touch position. In that case high frequency and low latency can be very important, while aligning with any form of visual output is unimportant (though there's probably still some visual feedback somewhere, the primary output is audio).

The problem I have with a global "use high frequency" hint is that whether or not you want (and are designed to handle) high-frequency events is really a per-handler question. We know from our touch scrolling work that the vast majority of touch event listeners installed on the web today are for non-primary use cases (e.g. activity monitoring, analytics, etc.). If we change the timing of an event for one handler, then invariants expected by other handlers of that event (around both timing and frequency) would be broken. The cross-iframe case here is the worst, since there's not necessarily any chance for the impacted code to coordinate, but the problem exists within a single frame as well.

If we don't want to introduce a new "pointerrawmove" event type, then other options I can imagine that avoid the above problem are:
WDYT?
I'd add a third suggestion to @RByers' list there:
```typescript
interface InputObserverInit {
  mouse: boolean;
  touch: boolean;
}

interface InputRecord {
  type: "mouse" | "touch";
  pageX: number;
  pageY: number;
  which: number;
  // etc - exact contents dependent on type, and similar to MouseEvent / TouchEvent
}

class InputObserver {
  constructor(callback: (record?: InputRecord, instance?: InputObserver) => void);
  observe(init: InputObserverInit);
  disconnect();
}
```

One potential benefit of this would be the ability to extend it to all user input events, including keyboard, device orientation, etc. Games are the obvious beneficiary of real-time input. Even though the display is clamped at rAF speeds, any form of twitch gaming wants to be able to react to things when they happen. As above, the display is just an indication of the state, but the state itself doesn't conform to 60/30/15fps etc. Especially on slower devices, or heavier games that end up pushing the framerate down, locking the input rate to the display rate will limit what is possible in an inconsistent way.
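To make the intended usage concrete, here is a toy, self-contained stand-in for the InputObserver sketched above. In reality the browser would implement it and feed it raw input; the `_deliver` method here is only a test hook standing in for that pipeline and is not part of the proposal.

```typescript
// Toy in-memory implementation of the InputObserver sketch, for
// illustration only. `_deliver` simulates the browser's raw input feed.
type InputRecord = {
  type: "mouse" | "touch";
  pageX: number;
  pageY: number;
  which: number;
};

type InputObserverInit = { mouse: boolean; touch: boolean };

class InputObserver {
  private init: InputObserverInit = { mouse: false, touch: false };

  constructor(private callback: (record: InputRecord, instance: InputObserver) => void) {}

  observe(init: InputObserverInit): void { this.init = init; }

  disconnect(): void { this.init = { mouse: false, touch: false }; }

  // Test hook: forwards a record to the callback if its type is observed.
  _deliver(record: InputRecord): void {
    if (this.init[record.type]) this.callback(record, this);
  }
}

// Usage: collect raw touch positions without waiting for rAF.
const seen: InputRecord[] = [];
const observer = new InputObserver((record) => { seen.push(record); });
observer.observe({ mouse: false, touch: true });
```

The observe/disconnect/callback shape mirrors the observer pattern already used elsewhere in the platform, which is part of the appeal of the suggestion.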
@Rycochet event.timeStamp is relative to performance.now()'s time origin in Chrome. This is an interesting idea. It definitely gets rid of the problems with hit testing that would dominate a high-frequency mechanism.
But it ends up copying a lot of the event handling mechanism. And hit-testing is a rather useful thing to have, since it also gives you proper event propagation.
It's always possible to get the element under the pointer via document.elementFromPoint(); the only things that are really needed from a raw input record are the coordinates themselves.
Curious, why pageX/Y? I would assume clientX/Y. Event propagation lets one add event listeners in different places using the normal event handling mechanisms. Inventing something totally new is possibly a bit heavy (and complicates the already complicated platform even more). But I don't have a strong opinion yet; I'd prefer to try to reuse the existing mechanisms if possible.
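For the pageX/Y vs clientX/Y question above: per the standard CSSOM definitions, the two differ only by the document's scroll offset, so a raw record could carry either and convert with a simple addition. A sketch (scroll offsets are passed as parameters here; in a page they would come from window.scrollX/scrollY):

```typescript
// pageX/Y = clientX/Y (viewport-relative) + document scroll offset.
function pageFromClient(
  clientX: number, clientY: number,
  scrollX: number, scrollY: number,
): { pageX: number; pageY: number } {
  return { pageX: clientX + scrollX, pageY: clientY + scrollY };
}
```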
Observers aren't new - the same pattern is already used by MutationObserver and similar APIs.
Limiting the raw data to capturing mode only feels like a hack.
The use case of listening to both pointermove and pointerrawmove is quite valid; they serve different purposes.

The argument against having a browsing-context-wide API to enable/disable rAF alignment is the use case of third-party libraries, or just a big webpage with a lot of components. By disabling rAF alignment for the whole browsing context, you force all the other handlers, their bubbling handlers, and everything else to run at a higher frequency even though they might not care about that. Introducing a new event easily lets us isolate it from other handlers, and we can also set its cancelable and bubbles attributes to false to make sure it doesn't cause any additional processing. The bubbling part is debatable, since if the parent has a pointerrawmove handler it has already expressed its interest in processing those events.

Regarding limiting it to the capturing mode, I don't have a strong argument for it. I'm very open to the possibility of having it fired all the time if there are handlers on the page. If the page desires to circumvent browser hit-testing (for performance reasons or whatever else), it can always use the capture API.
@smaug---- does this address your concerns? To be more explicit: not limiting pointerrawmove to the capturing phase, and also setting its bubbles attribute to true.
@spanicker I think I mentioned this spec issue to you at some point, but if not, here it is.
Chrome has experimented with shipping events relative to rAF: pointermove is only sent once for every rAF callback, and occurs just before the rAF callback. Specifically, this was done for touch in Chrome 59 and mouse in Chrome 60. See some discussion here: #9.
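The rAF-aligned behavior described above amounts to coalescing: of all raw samples that arrive before a frame boundary, only the latest is dispatched as pointermove. A toy sketch of that policy (times are arbitrary milliseconds, not a real scheduler):

```typescript
// Keep only the latest raw sample before each frame boundary, the way
// rAF-aligned pointermove coalesces. A raw event would see every sample.
function coalescePerFrame(sampleTimes: number[], frameTimes: number[]): number[] {
  const dispatched: number[] = [];
  let remaining = [...sampleTimes].sort((a, b) => a - b);
  for (const frame of [...frameTimes].sort((a, b) => a - b)) {
    const before = remaining.filter(t => t <= frame);
    remaining = remaining.filter(t => t > frame);
    if (before.length > 0) dispatched.push(before[before.length - 1]);
  }
  return dispatched;
}
```

Against a 1000Hz input device and 60Hz frames, this dispatches roughly one event per 16 ms instead of sixteen, which is exactly the gap a raw event type would fill.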
With pointer events dispatched at rAF time, it is useful for some apps that don't produce anything in the normal document lifecycle (see issue whatwg/html#2659) to have input right away.
I was wondering if we should add a pointermove-raw event to enable the cases where low-latency pointer moves are needed. We want to ensure that this isn't the typical usage, as the raw events can occur at a much higher frequency (e.g. 1000Hz), so I don't think we want to use the same event type, as that would conflate the problem.
Talking with @RByers, we discussed that this event shouldn't bubble and should be non-cancelable.