Add flag to run with uniform weights #1838

Closed
shekar-stripe opened this issue Aug 4, 2021 · 11 comments

Comments

@shekar-stripe
Contributor

shekar-stripe commented Aug 4, 2021

Hi! I work at Stripe. Our reliability infrastructure team uses locust to load test. Oftentimes there are tasks that occur with extremely low probability, so it is infeasible to simply rely on chance for a task to be run when testing a locustfile locally. Instead, engineers switch all task weights to 1, then switch them back once they're confident their locustfile is correct. This is proving untenable for large locustfiles.

Is your feature request related to a problem? Please describe.

Switching weights often takes place across multiple files (under many layers of indirection). As a result, engineers have to remember past weights, sift through multiple files, etc. simply to test their implementation.

Describe the solution you'd like

This process would be made much easier if there were a flag that simply overrode task weights to an even distribution, or even guaranteed that each task was run sequentially, somewhat like a "test" mode. I'm happy to open a PR for this.

Describe alternatives you've considered

We've considered two alternatives:

  • write a syntax-tree parser that rewrites the weights in the Python files to 1
  • modify locust internals in our Docker container

Of these options, the team thinks that adding a flag would be the cleanest solution.

@mboutet
Contributor

mboutet commented Aug 4, 2021

Have you considered using an environment variable to set all weights to 1?

e.g.

import os

from locust import task
from locust.contrib.fasthttp import FastHttpUser

class MyUser(FastHttpUser):
    @task(1 if os.getenv("SET_ALL_TASKS_WEIGHT_TO_1", "false") == "true" else 3)
    def my_task(self):
        ...
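
With that in place, engineers could run e.g. SET_ALL_TASKS_WEIGHT_TO_1=true locust -f locustfile.py (filename assumed) for local testing, and leave the variable unset for real load tests.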

@shekar-stripe
Contributor Author

While this could work, my main concern would be that we assign task weights in a few different ways depending on the load test being performed. Some are like your example, some are assigned in a dictionary, some are assigned in other files through indirection and downstream computation. As a result, I think that using an env var in a few different ways across hundreds of tasks would not be ideal. What do you think?

@mboutet
Contributor

mboutet commented Aug 4, 2021

I see.

Would using tags be an option? When only a few tasks have to be tested to validate that they work as expected, simply add tags to them and then run locust with the --tags flag to select them.
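
As a rough sketch (the debug tag name and the task names below are just placeholders):

from locust import HttpUser, tag, task


class MyUser(HttpUser):
    @tag("debug")
    @task(3)
    def rare_task(self):
        # selected when running e.g. locust --tags debug
        ...

    @task(100)
    def common_task(self):
        # skipped under --tags debug, since it has no matching tag
        ...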

Would that work?

Otherwise, the logic you want probably needs to be implemented in the locust.user.task.get_tasks_from_base_classes function. I'm not a maintainer, so you'll have to wait for @cyberw's opinion.

@cyberw
Collaborator

cyberw commented Aug 4, 2021

Hi @shekar-stripe! Interesting question. When I want to run a specific User I use locust-plugins' run_single_user(): https://github.com/SvenskaSpel/locust-plugins/blob/master/examples/debug_ex.py (I should probably add this to locust core at some point).
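
Roughly like this (treat the import as an approximation and check the linked debug_ex.py for the exact form):

from locust import HttpUser, task
from locust_plugins import run_single_user  # assumed import path, see debug_ex.py


class MyUser(HttpUser):
    host = "http://localhost"  # placeholder

    @task
    def my_task(self):
        ...


if __name__ == "__main__":
    # runs one instance of MyUser in-process, so every task can be stepped
    # through in a debugger without the weighted scheduler getting in the way
    run_single_user(MyUser)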

Before I start guessing too much about your configuration, am I correct in assuming that you have large User classes with multiple tasks (some of which are rarely executed because of low task weights)? Would it be an option for you to have smaller Users?

I don't think having an option to weight all tasks to 1 makes much sense (because it is kind of an internal thing), but an option to weight all Users might.

@shekar-stripe
Contributor Author

Thanks @mboutet for the tags suggestion! Unfortunately, for a similar reason to the env var, I don't think that tags would be an ideal solution because of the many ways we end up assigning task weights.

Hi @cyberw! Yep, we do have very large Users with lots of tasks, some of which are TaskSets in their own right, each containing O(dozens) of tasks, some of which are themselves TaskSets...

For that reason, it isn't feasible for us to create smaller users (due to the necessary complexity of attempting to load test across the entire API). In addition, I don't completely understand why tasks are considered an "internal thing" since they are explicitly declared in our code. Do you mind elaborating? Thanks so much!

@cyberw
Collaborator

cyberw commented Aug 4, 2021

What I mean is that they are internal to the Users, and even more so if they are nested in TaskSets. Most command line parameters work on a "global" level (tags being the exception), without messing with the internals of a User/TaskSet.

I'm not a fan of TaskSets myself and prefer having more Users (with more complex tasks as needed, using "regular python" for abstraction), but maybe that is just me :)

But I definitely see why you would need this feature, and welcome a PR.

@mboutet's suggestion of modifying the get_tasks_from_base_classes method is one way to do it; another is to iterate through all the Users and TaskSets, overwriting their weights (and task weights). If you add a listener to the init event, you'll get access to all the User classes (via environment.user_classes): https://docs.locust.io/en/stable/extending-locust.html#run-a-background-greenlet
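
Very roughly, the second approach could look something like this (UNIFORM_TASK_WEIGHTS and the helper names are made up, and it assumes weighted tasks end up as repeated entries in the tasks list, which is what get_tasks_from_base_classes produces):

import os

from locust import TaskSet, events


def dedupe(tasks):
    # keep one entry per task/TaskSet, preserving declaration order
    seen = []
    for t in tasks:
        if t not in seen:
            seen.append(t)
    return seen


def flatten_weights(holder):
    holder.tasks = dedupe(holder.tasks)
    for t in holder.tasks:
        # recurse into nested TaskSets so their inner task weights are flattened too
        if isinstance(t, type) and issubclass(t, TaskSet):
            flatten_weights(t)


@events.init.add_listener
def on_locust_init(environment, **kwargs):
    # only flatten when explicitly asked for, e.g. in local "test" runs
    if os.getenv("UNIFORM_TASK_WEIGHTS") == "1":
        for user_class in environment.user_classes:
            user_class.weight = 1
            flatten_weights(user_class)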

@shekar-stripe
Contributor Author

That sounds good! Thanks for the advice.

@shekar-stripe
Contributor Author

Hi @cyberw, are there any settings that the maintainers need to change for me to be able to push a branch to the repo? I'm getting a bunch of git issues and I'm not sure if they're because of some repo setting. I had to go through a somewhat complicated process to create a mirror git account to my work account, so it's definitely possible something is just messed up on my end. Thanks!

@cyberw
Collaborator

cyberw commented Aug 5, 2021

No worries :) You should fork the locust repo, push to a branch (in your fork), and open a PR from that branch to the upstream (locustio) master branch.

@cyberw
Collaborator

cyberw commented Aug 5, 2021

You're the second person to ask that this week, so I've tried to improve the documentation: https://docs.locust.io/en/latest/developing-locust.html#install-locust-for-development

@shekar-stripe
Contributor Author

Cool, thank you!
