
Add limit to list workflows #698

Merged
merged 5 commits into from
Dec 6, 2024
Conversation

Sushisource (Member)

What was changed

Title

Why?

Convenience functionality, since there's no `take` function or similar for async iterators in Python.
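For illustration, such a `take` helper could be written generically. This is a hypothetical sketch, not code from this PR:

```python
import asyncio
from typing import AsyncIterator, List, TypeVar

T = TypeVar("T")

async def take(it: AsyncIterator[T], n: int) -> List[T]:
    """Collect at most n items from an async iterator (hypothetical helper)."""
    items: List[T] = []
    if n <= 0:
        return items
    async for item in it:
        items.append(item)
        if len(items) >= n:
            break
    return items

async def numbers() -> AsyncIterator[int]:
    # Toy async iterator standing in for a workflow listing.
    for i in range(100):
        yield i

print(asyncio.run(take(numbers(), 3)))  # [0, 1, 2]
```

A `limit` parameter on the listing call folds this pattern into the SDK so callers don't each have to write it.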

Checklist

  1. Closes [Feature Request] Support a limit parameter in replay_workflows and workflow_replay_iterator #610

  2. How was this tested:
    Added test

  3. Any docs updates needed?

@Sushisource Sushisource requested a review from a team as a code owner December 5, 2024 22:57
@@ -796,6 +796,7 @@ def list_workflows(
next_page_token: Optional[bytes] = None,
rpc_metadata: Mapping[str, str] = {},
rpc_timeout: Optional[timedelta] = None,
limit: Optional[int] = None,
Member

Pedantic, but I'd move this up above `page_size`. The other params aren't really ever expected to be set.

Sushisource (Member Author)

Ah, right, I forgot the `*` meant they can't be passed positionally.
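For context: a bare `*` in a Python signature makes every following parameter keyword-only, so their relative order never affects call sites. A minimal sketch (names echo the diff, but this is not the actual SDK signature):

```python
from datetime import timedelta
from typing import Optional

def list_workflows(
    query: str,
    *,  # everything after this star is keyword-only
    limit: Optional[int] = None,
    page_size: int = 1000,
    rpc_timeout: Optional[timedelta] = None,
) -> str:
    # Toy body for demonstration; the real method returns an async iterator.
    return f"query={query!r} limit={limit} page_size={page_size}"

# Keyword-only args can be given in any order at the call site:
print(list_workflows("WorkflowType='X'", page_size=10, limit=5))

# list_workflows("WorkflowType='X'", 5)  # TypeError: positional use rejected
```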

@@ -2508,10 +2514,14 @@ async def fetch_next_page(self, *, page_size: Optional[int] = None) -> None:
page_size: Override the page size this iterator was originally
created with.
"""
page_size = page_size or self._input.page_size
if self._limit is not None and self._limit - self._yielded < page_size:
cretz (Member), Dec 5, 2024

The reason we made fetch_next_page public is that, technically, people don't have to use async iteration; they can paginate with it themselves. For users going that route, limit will not work. I'm unsure whether we want to make it work, or whether we want to say it only applies when using this as an async iterable.

One option is to instead offer a general Python helper for limiting async iterables (though that doesn't help with the page-size handling). But we can also just document that this field only works when used as an async iterable.
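The page-size handling under discussion can be sketched with a toy paginator. This is a hypothetical stand-in, not the SDK's iterator class, and the exact trimming behavior is an assumption:

```python
from typing import List, Optional

class LimitedPages:
    """Toy paginator: shrinks the requested page size so a limit is never
    overshot, mirroring the `self._limit - self._yielded < page_size` check."""

    def __init__(self, total: int, page_size: int, limit: Optional[int] = None):
        self._total = total
        self._page_size = page_size
        self._limit = limit
        self._yielded = 0
        self._pos = 0

    def fetch_next_page(self, page_size: Optional[int] = None) -> List[int]:
        page_size = page_size or self._page_size
        if self._limit is not None and self._limit - self._yielded < page_size:
            # Assumed behavior: trim the final page down to the remainder.
            page_size = self._limit - self._yielded
        page = list(range(self._pos, min(self._pos + page_size, self._total)))
        self._pos += len(page)
        self._yielded += len(page)
        return page

pager = LimitedPages(total=10, page_size=4, limit=6)
print(pager.fetch_next_page())  # [0, 1, 2, 3]
print(pager.fetch_next_page())  # [4, 5]  (trimmed to hit the limit)
```

Note that a caller driving `fetch_next_page` manually with an explicit `page_size` sidesteps the trim, which is exactly why the limit only cleanly applies to the iteration path.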

Sushisource (Member Author)

Hmm, honestly I'd prefer to just un-pub it, but I suppose that's not really an option. I'll update the docstring to say it only applies to the iterable, since if you're paginating yourself you ostensibly don't care anyway.

cretz (Member), Dec 6, 2024

Works for me. The reason behind making it public is that I was afraid people would want to use it outside of an async for but would be reluctant to invoke __anext__ themselves and handle iteration stop and such. Async iterables in Python are not as easy to use outside of loop constructs as, say, Java streams or .NET async enumerables.
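The awkwardness described here is that driving an async iterator manually means calling `__anext__` yourself and catching `StopAsyncIteration`. A minimal, self-contained sketch (toy iterator, not the SDK's):

```python
import asyncio
from typing import AsyncIterator, List

async def executions() -> AsyncIterator[str]:
    # Toy async iterator standing in for a workflow listing.
    for name in ("wf-1", "wf-2"):
        yield name

async def drain_manually() -> List[str]:
    it = executions().__aiter__()
    seen: List[str] = []
    while True:
        try:
            seen.append(await it.__anext__())
        except StopAsyncIteration:
            # Manual iteration must handle the stop signal explicitly,
            # which `async for` does for you.
            break
    return seen

print(asyncio.run(drain_manually()))  # ['wf-1', 'wf-2']
```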

@Sushisource Sushisource enabled auto-merge (squash) December 6, 2024 17:48
@Sushisource Sushisource merged commit 173826f into main Dec 6, 2024
12 checks passed
@Sushisource Sushisource deleted the list-workflows-limit branch December 6, 2024 20:02