
Prediction request timeout including time spent in queue #1322

Open
nateagr opened this issue Nov 15, 2021 · 2 comments
Labels
enhancement New feature or request

Comments


nateagr commented Nov 15, 2021

Is your feature request related to a problem? Please describe.

The feature request is not related to a bug, but at my company we have the following problem: we would like to be able to dismiss prediction requests after a certain amount of time, because they are no longer relevant after that point. As of today, it is possible to define a timeout, responseTimeout (https://pytorch.org/serve/configuration.html), but it only applies to model inference (the handler); it does not include the time spent in the queue.
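To illustrate the limitation, here is a minimal sketch (not TorchServe code; all names are hypothetical) of a queue-draining loop where the queue wait is never checked, so a stale request still reaches inference:

```python
import time
import queue

def infer(request):
    """Stand-in for the model handler; fast, so it never hits responseTimeout."""
    return f"result for {request}"

def process_queue(q):
    """Drain the queue. Note that the time a request spent waiting is
    measured but never enforced: responseTimeout would only bound the
    infer() call itself (enforcement elided here for brevity)."""
    results = []
    while not q.empty():
        enqueued_at, request = q.get()
        waited = time.monotonic() - enqueued_at
        # Even if 'waited' is large, the request still runs.
        results.append((request, waited, infer(request)))
    return results

q = queue.Queue()
q.put((time.monotonic() - 5.0, "req-1"))  # simulate 5 s already spent in queue
results = process_queue(q)
print(results)
```

The stale "req-1" is processed despite having waited 5 seconds, which is exactly the behavior the feature request wants to avoid.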

Describe the solution

To avoid any regression, I would suggest adding a new model parameter: predictionRequestTimeout. This parameter defines the timeout after which a prediction request is dismissed; it includes both the time spent in the queue and the inference time. From an implementation point of view, I would suggest to modify:
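The proposed behavior could be sketched as follows (hypothetical names; a real implementation would live in TorchServe's frontend): at dequeue time, compare the request's total age against predictionRequestTimeout and dismiss it if it has aged out:

```python
import time
import queue

PREDICTION_REQUEST_TIMEOUT = 2.0  # hypothetical new parameter, in seconds

def dequeue_fresh(q):
    """Pop requests, dismissing any whose total age (queue wait so far,
    to which inference time would be added) exceeds the timeout."""
    while True:
        try:
            enqueued_at, request = q.get_nowait()
        except queue.Empty:
            return None
        age = time.monotonic() - enqueued_at
        if age <= PREDICTION_REQUEST_TIMEOUT:
            return request  # still fresh: hand it to the handler
        # Stale: dismiss it (TorchServe could map this to an error response)

q = queue.Queue()
q.put((time.monotonic() - 10.0, "stale"))  # already past the timeout
q.put((time.monotonic(), "fresh"))
result = dequeue_fresh(q)
print(result)
```

The stale request is skipped and only the fresh one is returned, so no handler time is wasted on a prediction the caller no longer wants.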

If you agree with this feature request, I'm willing to push a PR.

@HamidShojanazeri HamidShojanazeri added the enhancement New feature or request label Nov 15, 2021
Author

nateagr commented Nov 18, 2021

Hello there! Should I suggest a PR? Or do you need time to think about it?

Collaborator

HamidShojanazeri commented Nov 18, 2021

@nateagr please go ahead with a PR; we would appreciate it and would be happy to review and help merge it.
