
Improve BatchElements documentation #32082

Merged: 7 commits merged on Aug 30, 2024

Conversation

jrmccluskey (Contributor)

The stateful BatchElements behavior (particularly its impact on throughput) was not explained. This change updates the docstring to explain what the stateful option does and how the value of max_batch_duration_secs impacts throughput, making this behavior clearer.
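For context, the behavior being documented is roughly a size-or-timeout flush: elements are buffered across bundles and emitted when either the batch is full or a buffer timer expires. Below is a toy, pure-Python sketch of those semantics; the class and method names, and the explicit simulated timer, are illustrative assumptions, not Beam's actual stateful DoFn implementation:

```python
class ToyStatefulBatcher:
    """Toy model (not the Beam implementation) of size-or-timeout batching.

    Elements are buffered across "bundles"; a batch is emitted when either
    max_batch_size elements have accumulated, or max_batch_duration_secs
    has elapsed since the first buffered element.
    """

    def __init__(self, max_batch_size, max_batch_duration_secs):
        self.max_batch_size = max_batch_size
        self.max_batch_duration_secs = max_batch_duration_secs
        self._buffer = []
        self._first_ts = None

    def add(self, element, now):
        """Buffer one element arriving at simulated time `now`.

        Returns the flushed batch if the size condition fired, else None.
        """
        if self._first_ts is None:
            self._first_ts = now
        self._buffer.append(element)
        if len(self._buffer) >= self.max_batch_size:
            return self._flush()
        return None

    def on_timer(self, now):
        """Simulated expiry timer: flush whatever has been buffered."""
        if self._buffer and now - self._first_ts >= self.max_batch_duration_secs:
            return self._flush()
        return None

    def _flush(self):
        batch, self._buffer, self._first_ts = self._buffer, [], None
        return batch


# Nine single-element "bundles" arriving over roughly five seconds:
b = ToyStatefulBatcher(max_batch_size=10, max_batch_duration_secs=5)
partial = [b.add(i, now=0.5 * i) for i in range(9)]
assert all(x is None for x in partial)          # nothing emitted yet
assert b.on_timer(now=5.0) == list(range(9))    # timer fires: batch of 9
```

In real Beam code this corresponds to passing max_batch_duration_secs to beam.BatchElements; the toy model just makes the two flush conditions explicit.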


Thank you for your contribution! Follow this checklist to help us incorporate your contribution quickly and easily:

  • Mention the appropriate issue in your description (for example: addresses #123), if applicable. This will automatically add a link to the pull request in the issue. If you would like the issue to automatically close on merging the pull request, comment fixes #<ISSUE NUMBER> instead.
  • Update CHANGES.md with noteworthy changes.
  • If this contribution is large, please file an Apache Individual Contributor License Agreement.

See the Contributor Guide for more tips on how to make review process smoother.

To check the build health, please visit https://github.com/apache/beam/blob/master/.test-infra/BUILD_STATUS.md

GitHub Actions Tests Status (on master branch): Build python source distribution and wheels, Python tests, Java tests, Go tests

See CI.md for more information about GitHub Actions CI or the workflows README to see a list of phrases to trigger workflows.

@github-actions github-actions bot added the python label Aug 5, 2024
@jrmccluskey jrmccluskey changed the title Imporve BatchElements documentation Improve BatchElements documentation Aug 5, 2024
github-actions bot (Contributor) commented Aug 5, 2024

Assigning reviewers. If you would like to opt out of this review, comment assign to next reviewer:

R: @shunping for label python.

Available commands:

  • stop reviewer notifications - opt out of the automated review tooling
  • remind me after tests pass - tag the comment author after tests pass
  • waiting on author - shift the attention set back to the author (any comment or push by the author will return the attention set to the reviewers)

The PR bot will only process comments in the main thread (not review comments).

github-actions bot (Contributor)

Reminder, please take a look at this pr: @shunping

github-actions bot (Contributor)

Assigning new set of reviewers because Pr has gone too long without review. If you would like to opt out of this review, comment assign to next reviewer:

R: @tvalentyn for label python.

Available commands:

  • stop reviewer notifications - opt out of the automated review tooling
  • remind me after tests pass - tag the comment author after tests pass
  • waiting on author - shift the attention set back to the author (any comment or push by the author will return the attention set to the reviewers)

When the max_batch_duration_secs arg is provided, a stateful implementation
of BatchElements is used to batch elements across bundles. This is most
impactful in streaming applications where many bundles only contain one
element. Larger max_batch_duration_secs values will reduce the throughput of
Contributor

Larger max_batch_duration_secs values will reduce the throughput

is xput the right term here? I feel like a longer duration should increase xput, because we reduce the per-element overhead. At least if we measure xput over a sufficiently long duration, say elements per hour.

However, the added latency might result in an increased data freshness reading for downstream stages: https://cloud.google.com/dataflow/docs/guides/using-monitoring-intf#data_freshness_streaming.

WDYT about the following:

Larger max_batch_duration_secs values might increase the overall throughput of the transform, but might negatively impact the data freshness of downstream transforms due to added latency. Smaller values will have less impact on data freshness, but might make batches smaller than the target batch size.
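To make that tradeoff concrete, here is a small, purely illustrative simulation (not Beam code; the flush rule and the assumption that the expiry timer fires exactly max_batch_duration_secs after the first buffered element are simplifications) showing how the duration setting trades batch fullness against latency:

```python
def simulate(arrival_times, target_batch_size, max_batch_duration_secs):
    """Return (emit_time, batch) pairs under a size-or-timeout flush rule.

    Illustrative arithmetic only. Assumes the expiry timer fires exactly
    max_batch_duration_secs after the first buffered element arrived.
    """
    batches, buf, first = [], [], None
    for t in arrival_times:
        # Timer would have fired before this arrival: flush the old buffer.
        if buf and t - first >= max_batch_duration_secs:
            batches.append((first + max_batch_duration_secs, buf))
            buf, first = [], None
        if first is None:
            first = t
        buf.append(t)
        if len(buf) >= target_batch_size:      # size condition fired
            batches.append((t, buf))
            buf, first = [], None
    if buf:                                    # trailing timer flush
        batches.append((first + max_batch_duration_secs, buf))
    return batches


arrivals = list(range(12))  # one element per second for 12 seconds

# Larger duration: the first batch is full (10 elements) but is not
# emitted until t = 9, so the first element waited 9 seconds.
big = simulate(arrivals, target_batch_size=10, max_batch_duration_secs=30)

# Smaller duration: batches flush early (at t = 3, 6, 9, 12) and stay
# well under the target size (3 elements each).
small = simulate(arrivals, target_batch_size=10, max_batch_duration_secs=3)
```

The simulation is only meant to show the shape of the tradeoff described above: the large-duration run produces fuller batches at higher per-element latency, the small-duration run the reverse.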

Contributor (Author)

Throughput is the right term; the batching can create a bottleneck.

The documentation at https://beam.apache.org/documentation/patterns/batch-elements/ should outline the tuning side more clearly. I think routing users there, along with the new docstring content, will help a lot.

Contributor

so batching can cause the pipeline to emit fewer elements per sufficiently large unit of time?

jrmccluskey (Contributor, Author) commented Aug 21, 2024

Potentially, yes. The slowdown is most pronounced in the incomplete-bundle case. Without stateful batching, elements are emitted downstream as they arrive at BatchElements, since they're single-element bundles. If nine elements arrive within a span of five seconds, you've emitted those nine elements within that span as well (abstracting away any overhead from emitting the bundles of one). Meanwhile, if we're statefully batching, the target batch size is greater than nine, and our maximum buffer time is greater than five seconds, we'd emit those nine elements at a later time, but together. We're artificially increasing the denominator in the elements-per-unit-time fraction. The hope is that this tradeoff benefits the downstream operation enough to be worth the potential bottleneck, but the documentation around that tradeoff was previously lacking.
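The arithmetic in that example can be sketched numerically (illustrative numbers only; the 0.5s arrival spacing and the 10s buffer time are assumptions made for the sake of the example):

```python
# Nine elements arriving as single-element bundles over five seconds.
arrivals = [0.5 * i for i in range(9)]           # last arrival at t = 4.0s

# Without stateful batching, each one-element bundle is emitted on arrival,
# so the ninth element is downstream by t = 4.0s.
nonstateful_window = arrivals[-1] - arrivals[0]  # 4.0 seconds

# With stateful batching (target batch size > 9, buffer time 10s > 5s),
# nothing is emitted until the buffer timer fires at t = 10.0s.
max_batch_duration_secs = 10.0
stateful_emit_time = arrivals[0] + max_batch_duration_secs  # 10.0 seconds

# Same nine elements either way; only the time window (the denominator in
# elements-per-second) stretches:
nonstateful_rate = 9 / nonstateful_window   # 2.25 elements/s
stateful_rate = 9 / stateful_emit_time      # 0.9 elements/s
```

The measured rate drops even though no element was dropped, which is the "artificially increasing the denominator" effect described above.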

jrmccluskey and others added 2 commits August 21, 2024 11:23
Co-authored-by: Jonathan Sabbagh <108473809+jbsabbagh@users.noreply.github.com>
sdks/python/apache_beam/transforms/util.py Outdated Show resolved Hide resolved
sdks/python/apache_beam/transforms/util.py Outdated Show resolved Hide resolved
jrmccluskey and others added 2 commits August 22, 2024 10:02
Co-authored-by: tvalentyn <tvalentyn@users.noreply.github.com>
When the max_batch_duration_secs arg is provided, a stateful implementation
of BatchElements is used to batch elements across bundles. This is most
impactful in streaming applications where many bundles only contain one
element. Larger max_batch_duration_secs values `might` reduce the throughput
tvalentyn (Contributor) commented Aug 22, 2024

Apologies, if I added the backticks here, that was unintentional.

Suggested change
element. Larger max_batch_duration_secs values `might` reduce the throughput
element. Larger max_batch_duration_secs values might reduce the throughput

tvalentyn (Contributor) left a comment

thanks!

github-actions bot (Contributor)

Reminder, please take a look at this pr: @tvalentyn

@tvalentyn tvalentyn merged commit cfe8fee into apache:master Aug 30, 2024
13 checks passed