WIP - initial concept of "framerate" #771
Conversation
Interesting idea. We can indeed limit the number of view updates to better handle cases where we receive a lot of updates, since at any time only the last view is really relevant. Though I think it would be better not to depend on a specific time rate. I would see it a bit differently:
This way, if the application has a quick update rate (like a timer) that is quicker than Fabulous can handle, we would discard all "old" updates and only apply the most recent one when Fabulous finishes the previous view update. This is good because it doesn't change anything for most use cases where updates are "slow" (user interactions, for example). It would also let us put the init/update calls on a thread other than the UI thread.
@TimLariviere, the current implementation does not set a specific time rate. It just sets the minimum possible view update interval.
@TimLariviere, one last note on the eager model: I think we should limit the time we spend consuming updates, otherwise there is a scenario where a redraw never happens (imagine constant updates at high frequency).
To avoid that, we could check the length of the message queue before eagerly consuming messages and only consume those messages (discarding all except the last).
So we would keep an update count as mailbox state, and when the count reaches 0 we would send a view render command.
OK, I made an "eager" model, but I have doubts. It would help if we have "heavy" updates, but I am not sure how it is better than limiting the minimum view render interval. Imagine very frequent and fast updates but slow redraws: the current model will not give us much of a performance advantage.
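The "minimum view render interval" idea discussed above can be sketched as a simple timestamp-based throttle. This is a minimal, self-contained sketch, not the PR's actual implementation; `RenderThrottle` and `TryRender` are hypothetical names used only for illustration:

```fsharp
open System.Diagnostics

/// Lets a render through only if at least `renderIntervalMs` has elapsed
/// since the last accepted render; callers keep the newest model otherwise.
type RenderThrottle(renderIntervalMs: int64) =
    let watch = Stopwatch.StartNew()
    let mutable lastRender = -renderIntervalMs // allow the very first render

    member _.TryRender(render: unit -> unit) =
        let now = watch.ElapsedMilliseconds
        if now - lastRender >= renderIntervalMs then
            lastRender <- now
            render ()
            true
        else
            false // too soon; skip this render

// Usage: with a 15 ms interval, two back-to-back calls yield one render.
let throttle = RenderThrottle(15L)
let mutable renders = 0
let first = throttle.TryRender(fun () -> renders <- renders + 1)
let second = throttle.TryRender(fun () -> renders <- renders + 1)
```

The caller must remember the latest model whenever `TryRender` returns `false`, so that a later tick can still render it; otherwise the last update before a quiet period would never appear on screen (the scenario raised above).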
src/Fabulous/Program.fs (outdated)

```fsharp
        loop ()
    )
let inbox = MailboxProcessor.Start(fun inbox ->
```
I would not create a mailbox processor for that, because we don't need to wait before computing a new model. We can keep the existing `processMsg` function to immediately compute a new model. Only, instead of calling `updateView updatedModel` directly from `processMsg`, we would do `viewInbox.Post(updatedModel)`.
```fsharp
let rec processMsg msg =
    try
        let (updatedModel, newCommands) = program.update msg lastModel
        viewInbox.Post(updatedModel)
        for sub in newCommands do
            try
                sub dispatch
            with ex ->
                program.onError ("Error executing commands:", ex)
    with ex ->
        program.onError ("Unable to process a message:", ex)
```
Then in the `viewInbox`, it would do something like:
```fsharp
let rec loop () = async {
    let queueSize = inbox.CurrentQueueLength
    if queueSize > 1 then
        // Discard old messages, keeping only the last
        for _ in 1 .. queueSize - 1 do
            let! _ = inbox.Receive 0 // immediately reads the message and discards it
            ()
    let! updatedModel = inbox.Receive()
    // Only store it here, since we need the last model of what's on the screen
    lastModel <- updatedModel
    try
        updateView updatedModel
    with
        (...)
    return! loop ()
}
```
NB: This example is more like pseudo-code than actual working code
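Filled in as a self-contained, runnable version of the same idea (the `updateView` stand-in and the `int` message type are placeholders for illustration; `TryReceive 0` is used instead of `Receive 0` so an unexpectedly empty queue cannot raise a `TimeoutException`):

```fsharp
// Self-contained sketch of the "keep only the latest model" mailbox.
// `updateView` here just records which models were actually rendered.
let rendered = System.Collections.Generic.List<int>()
let updateView model = rendered.Add model

let viewInbox =
    MailboxProcessor.Start(fun (inbox: MailboxProcessor<int>) ->
        let rec loop () = async {
            // Drop everything except the most recent queued model.
            let stale = inbox.CurrentQueueLength - 1
            for _ in 1 .. stale do
                let! _ = inbox.TryReceive 0 // non-blocking: pull and discard
                ()
            let! latestModel = inbox.Receive()
            updateView latestModel
            return! loop ()
        }
        loop ())

// Post several models faster than the view loop consumes them:
for model in 1 .. 5 do viewInbox.Post model
System.Threading.Thread.Sleep 200
// `rendered` may skip intermediate models, but always ends with 5.
```

How many intermediate models get rendered depends on timing, but the last rendered model is always the last one posted, which is exactly the coalescing behavior described above.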
Also, I think it's important to make sure `processMsg` is not run on the UI thread, as that thread will be needed by `updateView`.
@TimLariviere, unfortunately, there is no way to drop messages from the mailbox queue.
`inbox.Receive`/`inbox.TryReceive` have a timeout parameter you can set to 0 to immediately pull the message from the queue. So we can drop messages like that.
The idea of the eager model is to let Fabulous update the UI as soon as it can. Note that during v1, updates post their updated model to the mailbox. So it should achieve good performance even with high-frequency updates.
@TimLariviere, do I understand correctly that you propose a kind of "blocking" model on view updates?
Not sure what you mean by a blocking model. The thing with view updates is that they need to run sequentially on the same UI thread. Today, once Fabulous is done diffing the view for Model 1, it takes Model 2 and diffs the view, and so on until there are no updated models left. My proposition is: when Fabulous is done diffing the view for Model 1, and Models 2, 3 and 4 have been sent while we were working on the UI, Fabulous will ignore Models 2 and 3 and directly diff the view for Model 4 (because the previous models are no longer relevant).
@TimLariviere, take a look at the latest commit; I think this is what you need.
@vshapenko Tested it on the CounterApp, it's working! 👍 I get the following results:

```
09:39:27.392 Initial model: { Count = 0 Step = 1 TimerOn = false }
09:39:27.394 View, model = { Count = 0 Step = 1 TimerOn = false }
09:39:27.691 View result: CustomContentPage(...)@1610813341
09:39:31.476 Message: TimerToggled true
09:39:31.478 Updated model: { Count = 0 Step = 1 TimerOn = true }
09:39:31.504 Message: TimedTick
09:39:31.504 Updated model: { Count = 1 Step = 1 TimerOn = true }
...          (TimedTick / Updated model pairs repeat up to Count = 14)
09:39:31.734 Dropped 13 messages
09:39:31.735 View, model = { Count = 0 Step = 1 TimerOn = true }
09:39:31.736 View result: CustomContentPage(...)@-1968210617
...          (TimedTick / Updated model pairs repeat up to Count = 28)
09:39:31.989 Dropped 13 messages
09:39:31.990 View, model = { Count = 14 Step = 1 TimerOn = true }
09:39:31.990 View result: CustomContentPage(...)@110720383
...          (TimedTick / Updated model pairs repeat up to Count = 43)
09:39:32.241 Dropped 14 messages
09:39:32.241 View, model = { Count = 28 Step = 1 TimerOn = true }
09:39:32.241 View result: CustomContentPage(...)@-430969868
```

Note that the View seems to be out of sync (Count = 28 when the model is 43), but that's because the view only finished rendering 28 when the model was already at 43. The next view update will most likely be 43.

Also tried to simplify the mailbox by not using blocking flags, instead pulling "old" messages with a timeout of 0 (no wait):

```fsharp
let viewInbox = MailboxProcessor.Start (fun inbox ->
    let rec loop () = async {
        let queueLength = inbox.CurrentQueueLength
        if queueLength > 1 then
            Console.WriteLine(sprintf "Dropped %i messages" (queueLength - 1))
            for i = 1 to queueLength - 1 do
                let! _ = inbox.Receive 0
                ()
        let! updatedModel = inbox.Receive()
        program.syncAction (fun () -> updateView updatedModel) ()
        return! loop ()
    }
    loop ()
)
```

I got the same result.
@TimLariviere, I've never seen such a "drop" technique in production, but maybe it is valid. Anyway, I prefer not to trust the queue size and to write stricter code. I have pushed a version with a mutable flag; it should be more consistent.
The issue with a stricter version is that there are two asynchronous processes that can send messages (of different types) at the same time. The less strict version is still very sequential, and so a lot easier to understand and debug.
The problem with your approach to message dropping is the following:
Like we discussed on another channel, I think we should refactor the Runner class a bit to allow for easier subclassing/extension to change how the update-view loop behaves. Because in the vast majority of cases people won't need a high throughput of updates in their apps, the current implementation is good enough and is reliable due to being fully sequential. Then, you'll be able to write your own Runner and start it inside a function in the `Program` module, just like `Program.run`.
https://github.com/fsprojects/Fabulous/blob/fb4f251ce5e7cd3f9755099dcd6f12991bcce794/src/Fabulous/Program.fs#L253-L255
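The pluggable update-view loop being discussed could look roughly like this. This is an entirely hypothetical sketch, not the real Fabulous Runner API: `IViewScheduler`, `SequentialScheduler`, and `CoalescingScheduler` are invented names, used only to illustrate the extension point:

```fsharp
// Hypothetical extension point: the Runner delegates "when and how to
// render a model" to a scheduler that users can swap out.
type IViewScheduler<'model> =
    abstract ScheduleView: 'model -> unit

/// Default, fully sequential behavior: render every model, in order.
type SequentialScheduler<'model>(render: 'model -> unit) =
    interface IViewScheduler<'model> with
        member _.ScheduleView model = render model

/// Coalescing behavior: only the latest model posted between renders matters.
type CoalescingScheduler<'model>(render: 'model -> unit) =
    let agent =
        MailboxProcessor.Start(fun (inbox: MailboxProcessor<'model>) ->
            let rec loop () = async {
                for _ in 1 .. inbox.CurrentQueueLength - 1 do
                    let! _ = inbox.TryReceive 0 // drop stale models
                    ()
                let! latest = inbox.Receive()
                render latest
                return! loop ()
            }
            loop ())
    interface IViewScheduler<'model> with
        member _.ScheduleView model = agent.Post model

// Usage: the sequential default renders every model it is given.
let collected = ResizeArray<int>()
let scheduler = SequentialScheduler<int>(collected.Add) :> IViewScheduler<int>
scheduler.ScheduleView 1
scheduler.ScheduleView 2
```

With this shape, the existing fully sequential behavior stays the default, and a high-throughput app can opt into coalescing without Fabulous itself changing.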
OK, I will try to find time.
Concept of a limited framerate on view updating. Minimum interval between view updates: 15 ms. Model updates are processed as usual.
@TimLariviere , FYI