Improve mobile browser performance #653
For reference: 5-6 seconds on a OnePlus One on first load, about 2 seconds on subsequent loads. Is the data processed through protobuf.js on each page load, even when serving from cache?
Yes.
There is also server latency of the calendar call; you will have to ignore the rare uncached hits there.
One thing that comes to my mind is that the used definitions' code is generated on the fly, once. This might or might not result in an initial performance hit (recurring on every page load) that could be lighter with static code.
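For context, a minimal sketch of the two approaches being weighed, assuming a pbjs JSON descriptor (`definitions.json`) and a hypothetical type name:

```js
const protobuf = require("protobufjs");

// Reflection: definitions are turned into (de)coders at runtime, on first use.
const root = protobuf.Root.fromJSON(require("./definitions.json"));
const Booking = root.lookupType("some.pkg.Booking"); // hypothetical type name

// Static: code is generated ahead of time, e.g.
//   pbjs -t static-module -w commonjs -o compiled.js definitions.proto
// and then simply required, trading runtime setup for asset size:
// const compiled = require("./compiled.js");
```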
Our static definitions are 3.2 MB uncompressed; that would add a huge performance hit on asset size.
On Jan 19, 2017 6:46 PM, "Daniel Wirtz" wrote:
> A bit of profiling (red marks calls most likely protobuf.js related):
> <https://camo.githubusercontent.com/b16670c59c3ee4551f2774b0c7db5645c085e407/687474703a2f2f692e696d6775722e636f6d2f37556f636b5a642e6a7067>
Yup, the timeline view reveals that. The red frame shows raw network download time (for my connection) and where bundle.js is parsed; the blue frame marks where I'd expect protobuf.js to actually execute (see the question below that might falsify that): does bundle.js include and load all the definitions synchronously, like through …?
You know what would be great? Computing the transitive closure of the protos we actually use during the build process, based on import statements, thus pruning away excess proto fat. Then a static bundle might be useful. Protobufs were a server-side thing, so no one really cared about pruning code size; this is a web-perf-motivated optimization. Maybe too much effort for not enough benefit.
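A rough Node sketch of that build-time pruning idea (entry points and paths are illustrative, not part of any existing tooling): starting from the entry .proto files, follow import statements to compute the transitive closure, then feed only those files to pbjs.

```js
const fs = require("fs");
const path = require("path");

function collectImports(entryFiles, protoRoot) {
  const seen = new Set();
  const queue = entryFiles.slice();
  while (queue.length) {
    const file = queue.pop();
    if (seen.has(file)) continue;
    seen.add(file);
    const source = fs.readFileSync(path.join(protoRoot, file), "utf8");
    // proto imports are written relative to the proto root
    const re = /^\s*import\s+(?:public\s+|weak\s+)?"([^"]+)"\s*;/gm;
    let m;
    while ((m = re.exec(source)) !== null) {
      queue.push(m[1]);
    }
  }
  return Array.from(seen);
}

// e.g. collectImports(["api/calendar.proto"], "protos") -> file list passed to pbjs
```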
Yes, it loads and does a resolveAll.
Maybe some mechanism to determine which reflection objects have actually been used? Not exactly sure if that should be in core, but it could certainly be patched into the prototypes of reflection objects, or maybe into generated (en/de)code. If not that, then there's still … If not that, then calling …
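A hedged sketch of the "patch the prototypes" idea, assuming protobuf.js 6.x: wrap Type#decode so that each message type records itself the first time it is exercised.

```js
const protobuf = require("protobufjs");

const usedTypes = new Set();
const originalDecode = protobuf.Type.prototype.decode;

protobuf.Type.prototype.decode = function (reader, length) {
  usedTypes.add(this.fullName);                     // reflection object in use
  return originalDecode.call(this, reader, length); // defer to the original behavior
};

// After exercising the app, dump the set and diff it against all defined types:
// console.log(Array.from(usedTypes).sort());
```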
Now we have a starting point, at least. Just because types returned by … Based on that, however, pretty much everything should be interceptable, be it enums actually accessed, counters for fields, or what not.
Will check. I was doing some basic measurements; using performance.now() confirms what you saw. On Nexus 6 / Chrome 55, for one of the sites: … On Nexus 6 / Chrome 56 beta: … Secondly, decode of the calendar data takes ~250-300ms on either.
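A minimal sketch of that kind of performance.now() measurement, assuming a Root loaded from the JSON module and a hypothetical calendar message type:

```js
const t0 = performance.now();
root.resolveAll();                                   // one-time setup cost
const t1 = performance.now();

const CalendarResponse = root.lookupType("tock.CalendarResponse"); // hypothetical name
const calendar = CalendarResponse.decode(payloadBytes);            // raw bytes from the calendar API
const t2 = performance.now();

console.log("resolveAll:", (t1 - t0).toFixed(1), "ms, decode:", (t2 - t1).toFixed(1), "ms");
```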
One thing that comes to my mind is that the required call to … Do you know how much of the durations above are from …?
200ms and 166ms (Chrome 55 and Chrome 56, respectively)
Hmm, ok, that would maybe improve it a bit, but not by much. Todo: benchmark the reflection stuff. Just noticed that the durations have been in there already, whoops.
… to populate static code-compatible properties, see #653
With this now in place, it should also be possible to use …
Update: our dependency graph is so huge that pruning would not yield more than 5% of data shaving.

A few more interesting things I discovered. I ran the protobuf benchmark https://jsperf.com/protobufjs-robin (note this only performs the encode/decode tests):

- Safari 10.0.2: encoding ~427K/s, decoding 681K/s
- Safari 10.2.0 (Technology Preview 21): encoding ~173K/s, decoding 251K/s
- Chrome 55: encoding ~620K/s, decoding 782K/s

Even though Chrome was better at the raw encoding/decoding tests, Safari (especially the new one) improved page load by 2x. I have a few hypotheses, let me know if any of these sound plausible: …
JS Performance is hard.
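For reference, a rough shape of what the referenced jsperf benchmark exercises (the real schema differs); only encode and decode of a prepared message are timed:

```js
const protobuf = require("protobufjs");

const root = protobuf.parse('syntax = "proto3"; message Test { string name = 1; }').root;
const Test = root.lookupType("Test");

const message = Test.create({ name: "hello" });
const buffer = Test.encode(message).finish();   // encode throughput test
const decoded = Test.decode(buffer);            // decode throughput test
```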
…pass, see #653; Other: Added TS definitions to alternative builds' index files; Other: Removed unnecessary prototype aliases, improves gzip ratio
This performance pass includes quite a lot of micro-optimizations, like inlining Array#forEach and similar, using cached vars instead of getters where possible, and so on. It slightly reduces the number of total ticks in a simple test case.
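An illustration of the kind of micro-optimization described (not the actual library code): replacing Array#forEach with a plain for loop and a cached length.

```js
function applyDefaultsForEach(target, fields) {
  fields.forEach(function (field) {            // callback invocation per element
    target[field.name] = field.defaultValue;
  });
}

function applyDefaultsInlined(target, fields) {
  for (var i = 0, k = fields.length; i < k; ++i) { // no callback, length cached in a local
    var field = fields[i];
    target[field.name] = field.defaultValue;
  }
}
```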
That gave a small boost for Chrome (~5%) and a big boost for Safari Technology Preview 21: encoding ~554K/s and decoding 683K/s.
Interesting that this affected encoding/decoding speed at all, as there haven't been any changes to generated code; it was meant for parsing / setup speed. One more thing that could be done is disabling any assertions when loading pre-asserted modules through … Do you have updated timings on page load speed (especially regarding "Evaluate Script (bundle.js:1)" above)?
I don't have it yet. Locally, whatever measurements I make will not be like production with production data, and I need a protobufjs release to take it all the way to production :|
Hmm, there are still some possible tweaks to investigate. I could really use a rather large real-world JSON definition for local profiling.
@dcodeIO I can send you our large definition via email if you are willing to sign an NDA and agree never to push it to any public repo?
Actually, I realized that's kind of a stupid thing to ask; our bundles are in JS anyway. Let me send over the protos.
So, are you using static code in your bundles / not a JSON module?
We are using a JSON file, importing that, then calling resolveAll() before re-exporting. I have emailed you the files.
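A sketch of the setup just described (file names are illustrative): load the pbjs JSON output, resolve all type references once, and re-export the root.

```js
// definitions.js
const protobuf = require("protobufjs/light");     // reflection-only build, no parser
const descriptor = require("./definitions.json"); // pbjs -t json output

const root = protobuf.Root.fromJSON(descriptor);
root.resolveAll();

module.exports = root;
```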
I see, then I am still optimizing at the right spots. Currently using tests/test.proto for fromJSON profiling; that's already a lot better than bench.proto.
@dcodeIO maybe we can somehow find a middle ground between fully expanded static classes (which, at 5MB for these files, is too big) and a file containing a bunch of exports for each type that is resolved on import. Webpack is smart enough to resolve them based on demand. Just throwing it out there.
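A hypothetical shape for that middle ground: one lazily-resolved export per type (type names are made up), so a bundler only keeps what the app actually imports.

```js
const root = require("./definitions");   // the JSON-backed Root from the sketch above

let Reservation;
exports.getReservation = function () {
  return Reservation || (Reservation = root.lookupType("tock.Reservation"));
};

let Calendar;
exports.getCalendar = function () {
  return Calendar || (Calendar = root.lookupType("tock.Calendar"));
};
```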
…s over multiple files more easily, see #653
Hmm, ideally there'd be a way to split messages, services, enums etc. over multiple files to make this possible. As a starting point, the last commit introduces that the root name specified through pbjs's …
With this now in place, it still looks like a lot of work to implement proper splitting. A good approach could be to build some sort of abstract bundler for the CLI that takes operations to perform, like defining sub-namespaces, setting options, adding (sub-)types and importing types where required, and then reuse that, cutting through the operations, to generate actual JSON and static modules.
@dcodeIO can you cut a release? I will try upgrading and report back improvements.
On Nexus 6 / Chrome 56 beta: …
Just finished (the last) minor improvements, on npm now as 6.6.0.
Latest optimized build with the latest release of protobufjs, on Nexus 6 / Chrome 56 beta: … Btw, on an iPhone 7 (tested today on a friend's phone), the load time is ridiculously fast, almost blink speed; I guess it's all due to hardware. I will try to get more scientific measurements tomorrow.
These will be live tomorrow; I will update the thread when the release happens. The code is in QA right now.
New release is live. The same links will now be serving the latest protobufjs library.
Timeline alone looks pretty much identical to before, hmm.
Maybe of interest: pbjs now has an experimental … This is used to generate JSON for some new google/api datatypes (internals).
Update: …
Closing this issue for now as it hasn't received any replies recently (and seems to be solved). Feel free to reopen it if necessary!
protobuf.js version: 6.5.3
Currently on mobile browsers, the performance lags significantly. Opening an issue to aid investigation.
Here are live sites using this library; you can use these sites to profile. Heritage loads in 1.1s on a MacBook Pro, while it takes several seconds to load on my Nexus 6:
https://theheritage.tocktix.com
https://theaviary.tocktix.com
https://next.tocktix.com
https://alinea.tocktix.com