
feat(backend): store files in binary as compressed #263

Merged — 8 commits merged into main from gzip-files on Jun 19, 2022
Conversation

ravenclaw900 (Owner)
The frontend assets no longer need to be compressed at runtime, giving a smaller and faster binary.

| Size (MB)   | Before | After |
| ----------- | ------ | ----- |
| Without UPX | 3.22   | 2.83  |
| With UPX    | 1.21   | 1.24  |

Compressed with UPX, the binary is now actually slightly larger: UPX tries to recompress the already-compressed data, whereas before the uncompressed text (inside the binary) could be LZMA-compressed directly.
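A build-time compression step along these lines could be sketched as a Cargo build script. This is an illustrative fragment only, not the PR's actual code: it assumes the `flate2` crate as a dependency, and the `frontend/dist` path and asset names are hypothetical.

```rust
// build.rs — compress frontend assets once, at build time,
// so the binary can embed already-gzipped bytes.
use flate2::{write::GzEncoder, Compression};
use std::{env, fs, io::Write, path::Path};

fn main() {
    let out_dir = env::var("OUT_DIR").unwrap();
    for name in ["index.html", "main.js"] { // illustrative asset list
        let data = fs::read(Path::new("frontend/dist").join(name)).unwrap();
        let mut enc = GzEncoder::new(Vec::new(), Compression::best());
        enc.write_all(&data).unwrap();
        let gz = enc.finish().unwrap();
        fs::write(Path::new(&out_dir).join(format!("{name}.gz")), gz).unwrap();
    }
    println!("cargo:rerun-if-changed=frontend/dist");
}
```

The compiled binary could then embed the output with something like `include_bytes!(concat!(env!("OUT_DIR"), "/index.html.gz"))`, so the stored bytes are gzip data rather than plain text.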

@ravenclaw900 ravenclaw900 added the enhancement New feature or request label Jun 18, 2022
@ravenclaw900 ravenclaw900 requested a review from MichaIng June 18, 2022 22:12
@ravenclaw900 ravenclaw900 mentioned this pull request Jun 18, 2022
@MichaIng (Collaborator)
Just for clarification:

  • Before, the frontend assets were not compressed in runtime memory, but compressed (and hence served compressed) on client requests?
  • Now they are compressed in runtime memory, hence don't need to be compressed on client requests anymore?
  • With UPX, the binary is LZMA-compressed, which doesn't work well with gzip-compressed assets.

Was there a cache for gzip-compressed assets before, so that this change does not actually enhance request performance but runtime memory usage instead?

@ravenclaw900 (Owner, Author)
Yes, yes, and yes. The files are compressed before being included in the binary. From a quick glance, it doesn't look like there's a cache (https://github.com/seanmonstar/warp/blob/11169f2858f82cbac388771e91331c4d5e5ec070/src/filters/compression.rs#L70-L83), and to be sure, I ran a few benchmarks using the large xterm.js file:

|                             | Average Time (ns) |
| --------------------------- | ----------------- |
| Before                      | 129,000           |
| After (without compression) | 7,147             |
| After (with compression)    | 4,683             |

Obviously, the difference is basically negligible; still, the current approach does seem better than the previous one.
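Serving pre-compressed bytes then amounts to setting the `Content-Encoding` header and writing the stored bytes verbatim. A minimal std-only sketch of such a response, outside of warp and with a hypothetical function name, just to show why the per-request compression step disappears:

```rust
// Build an HTTP/1.1 response around bytes that were already
// gzip-compressed at build time: no per-request compression,
// only a header announcing the stored encoding.
fn gzip_response(body: &[u8]) -> Vec<u8> {
    // Content-Length is the compressed size, since the bytes
    // are sent exactly as stored in the binary.
    let mut resp = format!(
        "HTTP/1.1 200 OK\r\nContent-Encoding: gzip\r\nContent-Length: {}\r\n\r\n",
        body.len()
    )
    .into_bytes();
    resp.extend_from_slice(body); // compressed bytes copied verbatim
    resp
}
```

In the actual PR, warp serves the embedded bytes; the point is that the handler's work reduces to a header plus a memcpy.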

@MichaIng (Collaborator)

Anyway, this only affects a client's first access, thanks to the browser cache. But despite the slightly larger binary, I agree it is better now, also reducing CPU utilisation a tiny bit 🙂.

@ravenclaw900 ravenclaw900 merged commit cd80f5a into main Jun 19, 2022
@ravenclaw900 ravenclaw900 deleted the gzip-files branch June 19, 2022 16:05