
WIP - initial concept of "framerate" #771

Closed
wants to merge 7 commits

Conversation

vshapenko

Concept of a limited framerate on view updating. The minimum interval between view updates is 15 ms. Model updates are processed as usual.

@TimLariviere , FYI
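The minimum-interval idea above could be sketched roughly like this (an illustrative sketch only, not the PR's actual code; `minInterval`, `lastRender`, and `tryUpdateView` are invented names):

```fsharp
// Sketch of a minimum-interval throttle on view updates.
// Model updates run as usual; only the render is rate-limited.
open System

let minInterval = TimeSpan.FromMilliseconds 15.
let mutable lastRender = DateTime.MinValue

let tryUpdateView (updateView: 'model -> unit) (model: 'model) =
    let now = DateTime.UtcNow
    // Skip the render if less than minInterval has elapsed
    // since the previous one; the model itself is never dropped.
    if now - lastRender >= minInterval then
        lastRender <- now
        updateView model
```

In this shape, a burst of model updates within the same 15 ms window produces at most one render, which is the behavior the PR describes.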

@TimLariviere
Member

Interesting idea. We can indeed limit the number of view updates to better handle cases where we receive a lot of updates, since at any time only the latest view is really relevant.
So this could be a good way to improve performance.

Though I think it would be better not to depend on a specific time rate, because that doesn't really make sense in the context of a mobile application.

I would see it a bit differently:

  • Keep processMsg like today, except that instead of directly calling updateView, we would post a message to a mailbox with the last generated model (from `let (updatedModel, newCommands) = program.update msg lastModel`, not from `let mutable lastModel = initialModel`, because that one needs to stay identical to the displayed UI).
  • The view mailbox processor would eagerly consume messages, keeping only the last one at any given moment.
  • With that last message, it would call updateView, which can take up to a few hundred milliseconds to execute.
  • Once done, it would start eagerly consuming the accumulated messages again to get the latest model.
  • (Maybe add a feature flag to revert to updating the UI for each model like today; some controls are stateful and will need all the updates.)

This way, if the application has a quick update rate (like a timer), quicker than Fabulous can handle, we would discard all "old" updates and only apply the most recent one once Fabulous has finished the previous view update.

This is good because it doesn't change anything for most use cases where updates are "slow" (user interactions, for example).
And it is slightly better when the updates are numerous in a very short time, because we won't be updating the UI unnecessarily.

This would also let us put the init/update calls on a thread other than the UI thread.
And it would also let us change ViewElement.UpdateIncremental to Async and support #158

@vshapenko
Author

@TimLariviere , the current implementation does not set a specific time rate. It just sets the minimum possible view update interval.
I will try to play with the eager update.

@vshapenko
Author

@TimLariviere , and just one last note on eager: I think we should limit the time we spend consuming updates, otherwise there is a scenario where the redraw never happens (imagine constant updates at high frequency).

@TimLariviere
Member

and just one last note on eager: I think we should limit the time we spend consuming updates, otherwise there is a scenario where the redraw never happens (imagine constant updates at high frequency)

To avoid that, we could check the length of the message queue before eagerly consuming messages and only consume those messages (discarding all except the last).
That way, we aren't stuck in an infinite loop if messages are pushed quicker than we can consume them.

@vshapenko
Author

and just one last note on eager: I think we should limit the time we spend consuming updates, otherwise there is a scenario where the redraw never happens (imagine constant updates at high frequency)

To avoid that, we could check the length of the message queue before eagerly consuming messages and only consume those messages (discarding all except the last).
That way, we aren't stuck in an infinite loop if messages are pushed quicker than we can consume them.

So, we would keep the update count as mailbox state. Hmmm. And when the count is 0, we would send a view render command.

@vshapenko
Author

Ok, made an "eager" model, but I have doubts. This would help if we have "heavy" updates, but I am not sure how this is better than limiting the minimum view render interval. Imagine we have very frequent and fast updates but slow redraws: the current model will not give us much advantage in terms of performance.


The review comment below was left on this line of the diff:

```fsharp
    loop ()
)
let inbox = MailboxProcessor.Start(fun inbox ->
```
Member


I would not create a mailbox processor for that, because we don't need to wait before computing a new model.

We can keep the existing processMsg function to immediately compute a new model.
Only, instead of calling updateView updatedModel directly from processMsg, we would do viewInbox.Post(updatedModel).

```fsharp
let rec processMsg msg =
    try
        let (updatedModel, newCommands) = program.update msg lastModel
        viewInbox.Post(updatedModel)

        for sub in newCommands do
            try
                sub dispatch
            with ex ->
                program.onError ("Error executing commands:", ex)
    with ex ->
        program.onError ("Unable to process a message:", ex)
```

Then in the viewInbox, it would do something like

```fsharp
let rec loop () = async {
    let queueSize = inbox.CurrentQueueLength

    if queueSize > 1 then
        // Discard old messages except the last
        for i = 1 to queueSize - 1 do
            let! _ = inbox.Receive 0 // Immediately reads the message and discards it
            ()

    let! updatedModel = inbox.Receive()
    lastModel <- updatedModel // Only store it here since we need the last model of what's on the screen
    try
        updateView updatedModel
    with
        (...)

    return! loop ()
}
```

NB: This example is more like pseudo-code than actual working code

Member

TimLariviere commented Jun 30, 2020


Also, I think it's important to make sure processMsg is not run on the UI thread, as that thread will be needed by updateView.

Author


@TimLariviere , unfortunately, there is no way to drop messages from the mailbox queue.

Member


inbox.Receive/inbox.TryReceive have a timeout parameter you can set to 0 to immediately pull a message from the queue.
So we can drop messages like that.
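As a rough illustration of that dropping technique (not the actual Fabulous code; `drainToLatest` is an invented helper): `TryReceive 0` returns `None` immediately when the queue is empty, so draining stale models cannot block the loop.

```fsharp
// Drain all queued models, keeping only the newest one.
// With a timeout of 0, TryReceive never waits: it either pulls a
// message that is already queued or returns None right away.
let drainToLatest (inbox: MailboxProcessor<'model>) (current: 'model) = async {
    let mutable latest = current
    let mutable draining = true
    while draining do
        let! msg = inbox.TryReceive 0
        match msg with
        | Some m -> latest <- m      // a newer model supersedes the previous one
        | None -> draining <- false  // queue empty: stop discarding
    return latest
}
```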

@TimLariviere
Member

TimLariviere commented Jun 30, 2020

This would help if we have "heavy" updates, but I am not sure how this is better than limiting the minimum view render interval. Imagine we have very frequent and fast updates but slow redraws: the current model will not give us much advantage in terms of performance.

The idea of the eager model is to let Fabulous update the UI as soon as it can.
But if there are more updates than Fabulous can handle, it will discard all "old" updates and only update the UI with the latest model available at the moment Fabulous is free to diff the UI.


Note that during v1, updates post their updated model to the mailbox.
But when the mailbox is free (it finally finished executing ViewElement.UpdateIncremental), it will check the queue length and only take the last updated model.

So it should achieve good performance even under high-frequency updates.
Redraws will most likely be slower than updates every time, so we're bound to the redraw time in any case.

@vshapenko
Author

@TimLariviere , do I understand correctly that you propose a kind of "blocking" model on view updates?

@TimLariviere
Member

TimLariviere commented Jun 30, 2020

do I understand correctly that you propose a kind of "blocking" model on view updates?

Not sure what you mean by blocking model.

The thing with view updates is that they need to be run sequentially on the same UI thread.
So while we're diffing the view for Model 1, we can't process other updated models.

Once we're done diffing the view, today Fabulous will take Model 2 and diff the view, and so forth, until there are no updated models left.

My proposition is: when Fabulous is done diffing the view for Model 1, and Models 2, 3 & 4 have been sent while we were working on the UI, Fabulous will ignore Models 2 and 3 and directly diff the view for Model 4 (because the previous models are no longer relevant).
New updated models will continue to accumulate, waiting for Fabulous to be available to diff the view for a new model.

@vshapenko
Author

@TimLariviere , take a look at the latest commit, I think this is what you need.
We "accumulate" changes while rendering happens; afterwards we check whether the currently rendered state is still the latest. If we got more messages while rendering, we just launch a render on the actual state.

@TimLariviere
Member

@vshapenko Tested it on the CounterApp, it's working! 👍
I've put a timer of 15 ms and added Async.Sleep 250 before updateView to simulate slow rendering.

I get the following results

Console output:
09:39:27.392 Initial model: { Count = 0 Step = 1 TimerOn = false }
09:39:27.394 View, model = { Count = 0 Step = 1 TimerOn = false }
09:39:27.691 View result: CustomContentPage(...)@1610813341
09:39:31.476 Message: TimerToggled true
09:39:31.478 Updated model: { Count = 0 Step = 1 TimerOn = true }
09:39:31.504 Message: TimedTick
09:39:31.504 Updated model: { Count = 1 Step = 1 TimerOn = true }
09:39:31.520 Message: TimedTick
09:39:31.521 Updated model: { Count = 2 Step = 1 TimerOn = true }
09:39:31.538 Message: TimedTick
09:39:31.539 Updated model: { Count = 3 Step = 1 TimerOn = true }
09:39:31.556 Message: TimedTick
09:39:31.556 Updated model: { Count = 4 Step = 1 TimerOn = true }
09:39:31.572 Message: TimedTick
09:39:31.573 Updated model: { Count = 5 Step = 1 TimerOn = true }
09:39:31.589 Message: TimedTick
09:39:31.590 Updated model: { Count = 6 Step = 1 TimerOn = true }
09:39:31.605 Message: TimedTick
09:39:31.606 Updated model: { Count = 7 Step = 1 TimerOn = true }
09:39:31.622 Message: TimedTick
09:39:31.623 Updated model: { Count = 8 Step = 1 TimerOn = true }
09:39:31.639 Message: TimedTick
09:39:31.639 Updated model: { Count = 9 Step = 1 TimerOn = true }
09:39:31.655 Message: TimedTick
09:39:31.656 Updated model: { Count = 10 Step = 1 TimerOn = true }
09:39:31.672 Message: TimedTick
09:39:31.673 Updated model: { Count = 11 Step = 1 TimerOn = true }
09:39:31.689 Message: TimedTick
09:39:31.689 Updated model: { Count = 12 Step = 1 TimerOn = true }
09:39:31.705 Message: TimedTick
09:39:31.706 Updated model: { Count = 13 Step = 1 TimerOn = true }
09:39:31.722 Message: TimedTick
09:39:31.723 Updated model: { Count = 14 Step = 1 TimerOn = true }
09:39:31.734 Dropped 13 messages
09:39:31.735 View, model = { Count = 0 Step = 1 TimerOn = true }
09:39:31.736 View result: CustomContentPage(...)@-1968210617
09:39:31.755 Message: TimedTick
09:39:31.755 Updated model: { Count = 15 Step = 1 TimerOn = true }
09:39:31.772 Message: TimedTick
09:39:31.772 Updated model: { Count = 16 Step = 1 TimerOn = true }
09:39:31.788 Message: TimedTick
09:39:31.789 Updated model: { Count = 17 Step = 1 TimerOn = true }
09:39:31.805 Message: TimedTick
09:39:31.806 Updated model: { Count = 18 Step = 1 TimerOn = true }
09:39:31.822 Message: TimedTick
09:39:31.823 Updated model: { Count = 19 Step = 1 TimerOn = true }
09:39:31.839 Message: TimedTick
09:39:31.840 Updated model: { Count = 20 Step = 1 TimerOn = true }
09:39:31.856 Message: TimedTick
09:39:31.857 Updated model: { Count = 21 Step = 1 TimerOn = true }
09:39:31.874 Message: TimedTick
09:39:31.875 Updated model: { Count = 22 Step = 1 TimerOn = true }
09:39:31.891 Message: TimedTick
09:39:31.892 Updated model: { Count = 23 Step = 1 TimerOn = true }
09:39:31.908 Message: TimedTick
09:39:31.909 Updated model: { Count = 24 Step = 1 TimerOn = true }
09:39:31.926 Message: TimedTick
09:39:31.926 Updated model: { Count = 25 Step = 1 TimerOn = true }
09:39:31.942 Message: TimedTick
09:39:31.943 Updated model: { Count = 26 Step = 1 TimerOn = true }
09:39:31.959 Message: TimedTick
09:39:31.960 Updated model: { Count = 27 Step = 1 TimerOn = true }
09:39:31.976 Message: TimedTick
09:39:31.977 Updated model: { Count = 28 Step = 1 TimerOn = true }
09:39:31.989 Dropped 13 messages
09:39:31.990 View, model = { Count = 14 Step = 1 TimerOn = true }
09:39:31.990 View result: CustomContentPage(...)@110720383
09:39:31.993 Message: TimedTick
09:39:31.994 Updated model: { Count = 29 Step = 1 TimerOn = true }
09:39:32.010 Message: TimedTick
09:39:32.011 Updated model: { Count = 30 Step = 1 TimerOn = true }
09:39:32.028 Message: TimedTick
09:39:32.029 Updated model: { Count = 31 Step = 1 TimerOn = true }
09:39:32.046 Message: TimedTick
09:39:32.047 Updated model: { Count = 32 Step = 1 TimerOn = true }
09:39:32.063 Message: TimedTick
09:39:32.064 Updated model: { Count = 33 Step = 1 TimerOn = true }
09:39:32.080 Message: TimedTick
09:39:32.080 Updated model: { Count = 34 Step = 1 TimerOn = true }
09:39:32.097 Message: TimedTick
09:39:32.098 Updated model: { Count = 35 Step = 1 TimerOn = true }
09:39:32.113 Message: TimedTick
09:39:32.114 Updated model: { Count = 36 Step = 1 TimerOn = true }
09:39:32.130 Message: TimedTick
09:39:32.131 Updated model: { Count = 37 Step = 1 TimerOn = true }
09:39:32.147 Message: TimedTick
09:39:32.148 Updated model: { Count = 38 Step = 1 TimerOn = true }
09:39:32.164 Message: TimedTick
09:39:32.165 Updated model: { Count = 39 Step = 1 TimerOn = true }
09:39:32.181 Message: TimedTick
09:39:32.181 Updated model: { Count = 40 Step = 1 TimerOn = true }
09:39:32.198 Message: TimedTick
09:39:32.198 Updated model: { Count = 41 Step = 1 TimerOn = true }
09:39:32.214 Message: TimedTick
09:39:32.215 Updated model: { Count = 42 Step = 1 TimerOn = true }
09:39:32.231 Message: TimedTick
09:39:32.232 Updated model: { Count = 43 Step = 1 TimerOn = true }
09:39:32.241 Dropped 14 messages
09:39:32.241 View, model = { Count = 28 Step = 1 TimerOn = true }
09:39:32.241 View result: CustomContentPage(...)@-430969868

Note that the view seems to be out of sync (Count = 28 when the model is 43), but that's because the view only finished rendering 28 when the model was already at 43. The next view update will most likely show 43.

Also, I tried to simplify the mailbox by not using blocking flags, instead pulling "old" messages with a timeout of 0 (no wait).

```fsharp
let viewInbox = MailboxProcessor.Start (fun inbox ->
    let rec loop () = async {
        let queueLength = inbox.CurrentQueueLength

        if queueLength > 1 then
            Console.WriteLine(sprintf "Dropped %i messages" (queueLength - 1))
            for i = 1 to queueLength - 1 do
                let! _ = inbox.Receive 0
                ()

        let! updatedModel = inbox.Receive()
        program.syncAction (fun () -> updateView updatedModel) ()

        return! loop ()
    }

    loop ()
)
```

I got the same result.

@vshapenko
Author

@TimLariviere , I've never seen such a "drop" technique in production, but maybe it is valid. Anyway, I prefer not to trust the queue size and to write stricter code. I have pushed a version with a mutable flag; it should be more consistent.

@TimLariviere
Member

TimLariviere commented Jul 1, 2020

Anyway, I prefer not to trust the queue size and to write stricter code. I have pushed a version with a mutable flag; it should be more consistent.

CurrentQueueLength is an approximation of the number of messages (based on the comments inside the source code), but it should be good enough for our case, because there's only one reader: if the approximation is less than or equal to the real number of newly updated models, we're good.
Maybe we can be 1-2 messages behind under heavy load, I think, but that's not a problem in my opinion (such a case is very improbable; mobile apps rarely have such an update rate).

The issue with a more strict version is that there are two asynchronous processes that can send messages (of different types) at the same time.
So it becomes harder to understand and a lot harder to debug. We can have race conditions, invalid ordering (could be an oversight; I will add a review comment on the line), etc.
It is also forced to start an async task just to call program.syncAction, which marshals back onto the UI thread. So if we could avoid starting a new thread for that, especially for really quick updates, it would be better for performance.

The less strict version, on the other hand, is still very sequential and so a lot easier to understand and debug.
The loop starts, checks the queue length a single time, discards old messages if there is more than one in the queue, and then synchronously waits for the UI to render.
Once done, it starts a new loop that does the exact same thing.
If there's no message, it will wait and immediately render the UI when a new message arrives.

@vshapenko
Author

vshapenko commented Jul 1, 2020

The problem with your approach to message dropping is the following:

  1. We cannot guarantee that we have read the last state (because more messages may arrive while we are dropping).
  2. Async.Start does not create a new thread; it takes one from the thread pool. In our app we use asyncs everywhere, and it does not affect performance.
  3. Another thing I care about is the mailbox lock. From my experience, a non-blocking mailbox is much better (and more predictable) than a blocking one. The purpose of the mailbox here is to provide a correct order of messages (first) and to run an asynchronous render (if needed). So we just use the mailbox to organize the queue, but we separate the processes of handling updates and view rendering. View rendering, when it happens, just tells the mailbox to keep new states until we are ready to render a new state.
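A minimal sketch of such a flag-based, non-blocking mailbox might look like this (purely illustrative; the `ViewMsg` type and all names are invented for this example, not the code actually pushed in the PR):

```fsharp
// Invented message type: the mailbox serializes model updates and
// render-completion notifications, so no reader ever blocks on the queue.
type ViewMsg<'model> =
    | NewModel of 'model
    | RenderDone

let startViewLoop (render: 'model -> Async<unit>) =
    MailboxProcessor.Start(fun inbox ->
        // Run the render off the mailbox loop and report back when done.
        let startRender m =
            async {
                do! render m
                inbox.Post RenderDone
            } |> Async.Start

        // 'rendering' is the flag; 'pending' holds the newest model
        // that arrived while a render was in flight.
        let rec loop rendering pending = async {
            let! msg = inbox.Receive()
            match msg, rendering, pending with
            | NewModel m, true, _ ->
                // Render in flight: only remember the newest state.
                return! loop true (Some m)
            | NewModel m, false, _ ->
                startRender m
                return! loop true None
            | RenderDone, _, Some m ->
                // A newer state arrived while rendering: render it now.
                startRender m
                return! loop true None
            | RenderDone, _, None ->
                return! loop false None
        }
        loop false None)
```

Here the mailbox never waits on anything except its own queue, matching the point above: it only orders messages, while update handling and view rendering stay separate.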

@TimLariviere
Member

Like we discussed on another channel, I think we should refactor the Runner class a bit to allow for an easier subclassing/extension to change how the update-view loop behaves.

Because in the vast majority of cases, people won't need a high throughput of updates in their apps, so the current implementation is good enough and reliable due to being fully sequential.

Then, you'll be able to write your own Runner and start it inside a function in the Program module just like Program.run.
https://github.com/fsprojects/Fabulous/blob/fb4f251ce5e7cd3f9755099dcd6f12991bcce794/src/Fabulous/Program.fs#L253-L255

@vshapenko
Author

vshapenko commented Jul 7, 2020 via email
