
Retry decorator should support generator functions #485

Closed
daniel-sanche opened this issue Feb 10, 2023 · 1 comment · Fixed by #495
Assignee: daniel-sanche
Labels: priority: p2 (Moderately-important priority. Fix may not be included in next release.) · type: feature request (‘Nice-to-have’ improvement, new feature or different behavior or design.)

Comments


daniel-sanche commented Feb 10, 2023

Is your feature request related to a problem? Please describe.

Server streaming libraries return an iterable that can asynchronously yield data from the backend over time. Some of our libraries need to provide a wrapper around the raw stream to do some local processing before passing the data to the user.

It would be useful to wrap this whole pipeline in a retry decorator, so that if the stream breaks mid-way through, we can recover and continue yielding data through our generator as if nothing happened.

Unfortunately, the current implementation returns the result of the target function directly, so generators will not yield values and exceptions through the retry block.
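To make the failure mode concrete, here is a minimal sketch (not the actual google.api_core implementation; naive_retry and stream are illustrative names) showing why a call-and-return wrapper cannot catch errors raised while a generator is being consumed:

```python
import functools

def naive_retry(target):
    # Simplified stand-in for a retry decorator that calls the target
    # and returns its result directly.
    @functools.wraps(target)
    def wrapper(*args, **kwargs):
        for _ in range(3):
            try:
                # For a generator function this merely creates the generator
                # object; none of its body runs, so nothing can fail here.
                return target(*args, **kwargs)
            except ValueError:
                continue
    return wrapper

@naive_retry
def stream():
    yield 1
    # Raised during iteration, i.e. after wrapper() has already returned.
    raise ValueError("stream broke mid-way")

try:
    for item in stream():
        print(item)  # prints 1, then the error escapes un-retried
except ValueError as exc:
    print("escaped the retry wrapper:", exc)
```

Calling stream() "succeeds" because the wrapped call only creates the generator object; the ValueError is raised later, during iteration, after the retry logic has already exited.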

Describe the solution you'd like

Add some code to detect when a target function is a generator. If so, replace:

return target()

with:

yield from target()

Some extra work will also be needed to detect generator .close() calls and to support async generators.
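A minimal sketch of that approach, assuming a simplified decorator (retry_sketch, attempts, and on_error are illustrative names, not the library's actual signature):

```python
import functools
import inspect

def retry_sketch(target, attempts=3, on_error=(ValueError,)):
    # Switch to a generator wrapper when the target is a generator function,
    # so failures raised while the stream is consumed land in the retry block.
    if inspect.isgeneratorfunction(target):
        @functools.wraps(target)
        def generator_wrapper(*args, **kwargs):
            for attempt in range(attempts):
                try:
                    # Unlike `return target(...)`, this keeps the wrapper's
                    # frame live while the caller iterates.
                    yield from target(*args, **kwargs)
                    return
                except on_error:
                    if attempt == attempts - 1:
                        raise
                    # NOTE: a real implementation would resume the stream
                    # rather than restart it, and would also handle .close()
                    # and async generators, as noted above.
        return generator_wrapper

    @functools.wraps(target)
    def wrapper(*args, **kwargs):
        for attempt in range(attempts):
            try:
                return target(*args, **kwargs)
            except on_error:
                if attempt == attempts - 1:
                    raise
    return wrapper
```

Because yield from keeps the wrapper's frame on the stack while the caller iterates, mid-stream failures surface inside the except clause; a production version would also need to resume rather than restart the stream so the caller sees no duplicate data.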

I plan on putting together a PR for this shortly.


daniel-sanche commented Feb 14, 2023

I have a solution staged in a branch, but I plan on dogfooding it internally for a while before opening a PR.

parthea added the type: bug and priority: p2 labels on Feb 15, 2023
daniel-sanche self-assigned this on Aug 15, 2023
daniel-sanche added the type: feature request label and removed the type: bug label on Aug 15, 2023
vchudnov-g added a commit that referenced this issue Dec 12, 2023
From #485:

> Server streaming libraries return an iterable that can asynchronously yield data from the backend over time. Some of our libraries need to provide a wrapper around the raw stream to do some local processing before passing the data to the user.
>
> It would be useful to wrap this whole pipeline in a retry decorator, so that if the stream breaks mid-way through, we can recover and continue yielding data through our generator as if nothing happened.
>
> Unfortunately, the current implementation returns the result of the target function directly, so generators will not yield values and exceptions through the retry block.

This PR addresses the issue by adding retry_target_generator functions to both the async and sync retry modules, which yield through the target rather than calling it directly. Generator mode can be enabled using the is_generator argument on the decorator.

Fixes #485
---------

Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>
Co-authored-by: Anthonios Partheniou <partheniou@google.com>
Co-authored-by: Victor Chudnovsky <vchudnov@google.com>
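Based on the names in the commit message above (is_generator on the decorator), usage would look roughly like the following hypothetical sketch; the exact released signature may differ:

```python
from google.api_core import exceptions, retry

# is_generator is the argument named in the commit message; treat this as a
# sketch rather than the final API.
@retry.Retry(
    predicate=retry.if_exception_type(exceptions.ServiceUnavailable),
    is_generator=True,  # yield through the target instead of calling it
)
def stream_rows():
    # Placeholder generator standing in for a real server-streaming call.
    yield from range(5)

for row in stream_rows():
    print(row)
```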