GRPC compatibility: Locust load test throws greenlet.GreenletExit exception on reaching test time limit #1676
GRPCIO uses asyncio, which is incompatible with gevent / greenlets, on which locust is built. I don't think this can be fixed (with anything less than a major rewrite of locust).
Actually, it may be possible using something like https://github.com/2mf/aiogevent, but you are on your own... (and that repo doesn't look like it is actively maintained).
Actually, grpc has been patched for compatibility with gevent (grpc/grpc#4629). Peeking into the source of locust, the problem might be due to the async calls used for stopping greenlets. I have used locust with grpc before.
Ok, aha... yeah, that was added pretty recently to locust.
Reopening this as it may be possible to fix after all. Don't expect a huge amount of attention on it though :) If you can, please have a look at fixing it yourself.
The asynchronous stopping of locusts is only really necessary when --stop-timeout is being used, so if we reverted that part to use synchronous calls then it might work for "regular" runs. See runners.py:250.
Also seeing this error on distributed runners when running a locust cluster. Please let me know if there's something I could look at to help unblock this.
Making the asynchronous stopping of locusts a regular "direct" kill when stop-timeout is zero at runners.py:250 should provide a workaround.
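A rough sketch of that idea (hypothetical names, not locust's actual runners.py code), assuming users run as gevent greenlets:

```python
import gevent

def graceful_stop(greenlet, timeout):
    # Give the greenlet up to `timeout` seconds to finish, then kill it.
    greenlet.join(timeout=timeout)
    greenlet.kill(block=True)

def stop_users(user_greenlets, stop_timeout=None):
    # Hypothetical sketch of the workaround discussed above: only take the
    # asynchronous stop path when a stop timeout is actually configured.
    if stop_timeout:
        stoppers = [gevent.spawn(graceful_stop, g, stop_timeout) for g in user_greenlets]
        gevent.joinall(stoppers)
    else:
        # Synchronous "direct" kill, avoiding the asynchronous GreenletExit
        # propagation that interacts badly with grpc's gevent patch.
        for g in user_greenlets:
            g.kill(block=True)
```

With no stop timeout set, users are killed synchronously, which is the "direct" kill suggested above.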
Should be fixed now! (Not yet in a release, but it should become part of 1.5.2 when I make it.)
Still happening.
@beandrad Any idea?
You should close the grpc channel before killing the user (greenlet); the example in the documentation shows how to do this.
This part? locust/examples/grpc/locustfile.py, line 67 (at commit 94e16dd)
I ask because I modified my runner based on yours (the example that is part of the documentation).
Yep, that's it. If the runner now uses the instance …
Thanks @beandrad. This is my class; it should be comparable to yours. But I am running locust as a library, could that be the reason?

```python
# Imports inferred from the snippet; service_pb/service_grpc, certs,
# GRPC_CLIENT_CHANNEL_OPTS and stopwatch are project-specific.
import time

import grpc
from locust import User, task

class GRPCLoadTest(User):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        if "https" in self.host:
            target = self.host.replace("https://", "")
            self.channel = grpc.secure_channel(
                target,
                credentials=grpc.ssl_channel_credentials(root_certificates=certs),
                options=GRPC_CLIENT_CHANNEL_OPTS,
            )
        else:
            target = self.host.replace("http://", "")
            self.channel = grpc.insecure_channel(
                target, options=GRPC_CLIENT_CHANNEL_OPTS
            )
        self._channel_closed = False
        self.stub = service_grpc.PredictStub(self.channel)

    def on_stop(self, force=False):
        self._channel_closed = True
        time.sleep(1)
        self.channel.close()
        super().stop(force=True)

    @stopwatch
    def _get_prediction(self, *args, **kwargs):
        requests = [service_pb.Request(), service_pb.Request(), service_pb.Request()]
        prediction_request = service_pb.PredictionRequest(requests=requests)
        self.stub.Evaluate(prediction_request)

    @task
    def loadtest_runner(self, *args, **kwargs):
        if not self._channel_closed:
            self._get_prediction()
        time.sleep(1)
```
Ouch 🤦 thanks @beandrad 🙌 I'll test it again and let you know!
Thanks @beandrad, it does work 👍
Describe the bug
Locust load test throws a greenlet.GreenletExit exception on reaching the test time limit while testing GRPC-based services. To monkey patch GRPC with gevent I am using grpc's gevent patch:
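(The original snippet was not preserved here; the standard gevent compatibility patch shipped with grpcio, shown below, is presumably what was used.)

```python
# grpcio's built-in gevent compatibility patch; must be applied before
# any channels are created.
import grpc.experimental.gevent as grpc_gevent

grpc_gevent.init_gevent()
```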
but no luck.
Expected behavior
Should gracefully stop all greenlets and generate the necessary outputs.
Actual behavior
The test keeps generating load and does not stop; it has to be stopped via a CTRL+C keyboard interrupt.
Thrown exception:
Steps to reproduce
I am using locust to test a bunch of GRPC services, and the tasks involve a bunch of GRPC calls.
Environment