
Limit report delivery per page #142

Open · wants to merge 2 commits into base: main
Conversation

@paulmeyer90 (Contributor) commented Dec 10, 2018

There is currently no limit on how many reports can be generated, which could end up using a lot of a user's data to deliver them. This patch aims to limit the number of reports that can be delivered. The limit is enforced per report type, so that one spammy report type can't prevent reports of other types from being delivered.



@paulmeyer90 (Contributor, Author)

Slightly expanded description available in this doc: https://docs.google.com/document/d/1QaZAvKPndctevtWKVEgXrRo8usfNvXY4A3m85RYwwMQ/edit?usp=sharing

@dcreager (Member)

Can there be a way for a report type to opt out of this behavior? NEL, for instance, already has a limit to how many reports can be created, since there can be at most one for each outgoing request. NEL also isn't observable in JavaScript, so maybe this limit could only apply to the buffer of reports handed off to ReportingObservers? For the report delivery side, we're batching reports, and #62 discusses whether we can compress report uploads, which would be a better solution for conserving upload bandwidth.

@paulmeyer90 (Contributor, Author)

@dcreager This was intended to be a general limit for all report types. Is the issue that you feel a limit of 100 is too small for NEL per page load? How many do you expect could reasonably be generated on one page?

@RByers RByers requested review from dcreager and removed request for RByers December 12, 2018 02:59
@dcreager (Member)

If a page has more than 100 resources (all from the same origin), we'd run into this limit. Can it be the limit of currently buffered reports? I.e., as we deliver them, we decrement the counter?

Now that I look at the Chromium source, we're already doing that! It's even a limit of 100, as you've defined here. But it's global, not per-origin!

@paulmeyer90 (Contributor, Author)

Okay, I see that there is a buffer limit (and also for ReportingObserver), but this proposal is more about not blasting a user's bandwidth with unlimited reports. NEL may not be as likely to go over the limit, but I think it would be nice to have a general rule about this. Do you think it always makes sense to send as many NEL reports as are generated? Could there be a poorly written site that ends up loading resources forever and sending reports continuously?

@igrigorik (Member)

> Do you think it always makes sense to send as many NEL reports as are generated? Could there be a poorly written site that ends up loading resources forever and sending reports continuously?

These definitely exist, both due to poor implementation and because the client is actively polling for updates (e.g. real-time comments and such). I agree that it would be nice to have some reasonable limits, and perhaps even consider situations where the user has saveData enabled.

@paulmeyer90 (Contributor, Author)

I have updated this change to only limit report delivery when saveData is true. WDYT?

with <a spec=infra>key</a> |type| and <a spec=infra>value</a> 1.

5. If |settings|'s <a>report count</a>[|type|] is greater than 100 and the
current <a spec="netinfo">saveData</a> preference is true, return.
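A minimal Python sketch of the check the proposed spec text describes: count queued reports per type, and drop the report once the per-type count exceeds 100 while saveData is on. The class name, method names, and the callback shape here are all illustrative, not from the spec or any implementation.

```python
SAVE_DATA_LIMIT = 100  # per-type cap proposed in this patch


class ReportingSettings:
    """Illustrative stand-in for a settings object with a per-type report count."""

    def __init__(self, save_data_enabled=False):
        self.save_data_enabled = save_data_enabled
        self.report_count = {}  # report type -> number of reports generated

    def queue_report(self, report_type, deliver):
        # Bump the counter for this report type (the "value 1" / increment step).
        count = self.report_count.get(report_type, 0) + 1
        self.report_count[report_type] = count
        # Step 5: if over the cap and saveData is on, return without delivering.
        if count > SAVE_DATA_LIMIT and self.save_data_enabled:
            return False
        deliver(report_type)
        return True
```

Note that in this sketch the counter keeps incrementing even for dropped reports, but delivery stops at the cap; without saveData, no cap applies.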
Member

@dcreager curious, do you have a sense for the rough expected size of each report upload today? I'd love to understand the overall cost of this feature to the user.

@paulmeyer90 I think what you're proposing is reasonable, but I believe it diverges from our current implementation in Chrome — correct? As in, we have a hard cap of 100 reports (per type) regardless of data saver preference.

@dcreager (Member)

For the current Chrome implementation, there's at most 1 upload per minute for each URL in the reporting policy. The upload would contain a batch of reports; the size of the upload would depend on the number of reports generated during that minute.

For NEL, I don't have access to the sizing data for Google's real-world reports anymore, but @sburnett might. The example reports in the spec clock in at 300-400 bytes each. The biggest variable is the URL; everything else is a pretty consistent size. (Note that's just for NEL; other types of report might be much larger and/or more variable in size.)
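The once-per-minute, per-endpoint batching described above could be sketched roughly as follows. This is a hypothetical illustration, not Chromium's actual code; the class and parameter names are invented, and the clock is injectable only so the behavior is easy to exercise.

```python
import time

UPLOAD_INTERVAL = 60.0  # seconds between uploads to the same endpoint URL


class BatchingUploader:
    """Buffers reports per endpoint and flushes at most once per interval."""

    def __init__(self, clock=time.monotonic):
        self.clock = clock
        self.pending = {}      # endpoint URL -> list of reports
        self.last_upload = {}  # endpoint URL -> timestamp of last flush

    def add(self, endpoint, report):
        self.pending.setdefault(endpoint, []).append(report)

    def maybe_flush(self, endpoint, send):
        """Upload the pending batch if the interval has elapsed; return what was sent."""
        now = self.clock()
        last = self.last_upload.get(endpoint)
        if last is not None and now - last < UPLOAD_INTERVAL:
            return []  # too soon: keep buffering
        batch = self.pending.pop(endpoint, [])
        if batch:
            self.last_upload[endpoint] = now
            send(batch)  # one upload carries the whole minute's batch
        return batch
```

The upload size then depends on how many reports accumulated during the interval, which is exactly why a per-type cap on the buffer bounds bandwidth.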

@paulmeyer90 (Contributor, Author)

Currently in Chrome, there is a 100 report limit in the report buffer (we won't buffer more than that, so if many more come at once, still only 100 will be delivered), but no overall limit for the number of reports delivered. It also currently does not behave differently when data saver is on.

@igrigorik (Member) commented Feb 14, 2019

> For the current Chrome implementation, there's at most 1 upload per minute for each URL in the reporting policy.

@dcreager that's a great point. We don't currently spec this. Should we? This seems like a good pattern that we should encourage; I've heard many past complaints about CSP violations triggering dozens to hundreds of requests.

> Currently in Chrome, there is a 100 report limit in the report buffer (we won't buffer more than that, so if many more come at once, still only 100 will be delivered), but no overall limit for the number of reports delivered.

@paulmeyer90 to confirm, the combined interaction here is: there is a single upload once a minute, and the buffer for an individual report type is allowed to accumulate up to 100 reports within this window. Is that correct? (Update: never mind, @dcreager confirmed the above in his followup comment. :)

@dcreager (Member)

> This proposal is more about not blasting a user's bandwidth with unlimited reports

Yep, I buy that. My concern is that as written, this 100-report limit would be for the lifetime of the page. For NEL, we want to support something like a single-page app that's open in a tab for a very long time. It makes 1 or 2 requests every minute or so to check for updates from the server, and the user leaves that page open for days or weeks. We'd definitely exhaust the 100-report limit at some point, and then stop sending any more NEL reports until the user refreshed.

If we reset the limit every time a batch of reports is uploaded, then we'd still get a useful bandwidth cap — the page would upload no more than 100 reports per minute (or whatever the user agent has set the upload interval to). And the user agent would never have more than 100 reports (of a particular type, for a particular page) in memory at any point in time.

For the non-NEL report types, would that be enough of a limit? Possibly not — I could see how you might want to limit deprecation warnings, for instance, to have a maximum number of reports for the lifetime of the page. But if so, that sounds like we need to allow different report types to have different limitations — either by having each downstream spec take care of it entirely by itself (i.e., NEL would define and keep track of some limits, and wouldn't call "queue a report for delivery" once the NEL-specific limit was exceeded), or by having the limits be another part of the definition of a report type.
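The reset-on-upload idea above, where the per-type cap applies only to currently buffered reports and is freed each time a batch is delivered, could be sketched like this. All names here are hypothetical illustrations, not spec or Chromium identifiers.

```python
BUFFER_LIMIT = 100  # max buffered reports per type between uploads


class PerTypeBuffer:
    """Buffers at most BUFFER_LIMIT reports of each type. Uploading a batch
    empties the buffer, so the effective bandwidth cap is BUFFER_LIMIT
    reports per type per upload interval, rather than per page lifetime."""

    def __init__(self):
        self.buffered = {}  # report type -> list of pending reports

    def queue(self, report_type, report):
        reports = self.buffered.setdefault(report_type, [])
        if len(reports) >= BUFFER_LIMIT:
            return False  # over the cap: drop until the next upload frees space
        reports.append(report)
        return True

    def upload(self, send):
        # Delivering the batch empties the buffer, resetting the per-type cap.
        for report_type, reports in self.buffered.items():
            if reports:
                send(report_type, reports)
        self.buffered = {}
```

Under this scheme the long-lived single-page-app case keeps working: each upload clears the buffer, so reporting never permanently stops, while no interval can carry more than 100 reports of a given type.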

@paulmeyer90 (Contributor, Author)

Is it possible that limiting the number of reports delivered per page lifetime is still reasonable to do only when data saver is on? For the use case Douglas mentions, leaving a page open indefinitely and expecting ongoing reporting, the user probably wouldn't (or shouldn't) have data saver on, and so this limit would not affect it.

@igrigorik (Member)

> If we reset the limit every time a batch of reports is uploaded, then we'd still get a useful bandwidth cap — the page would upload no more than 100 reports per minute (or whatever the user agent has set the upload interval to). And the user agent would never have more than 100 reports (of a particular type, for a particular page) in memory at any point in time.

Yeah, I agree, I think that's the right direction: we rate-limit uploads, and each upload can contain a limited number of reports of each type for a given period of time. We should spec both of these.

Thinking about Save-Data some more: we shouldn't blanket opt out of report delivery when it is enabled. Connecting to the above, one "simple" toggle here would be to reduce the buffer depth for each report type. However, I'm also OK if we park this for now.

Base automatically changed from master to main February 3, 2021 13:26
3 participants