Excessive memory usage #42

Open · MikeLund opened this issue Jul 7, 2024 · 7 comments

MikeLund commented Jul 7, 2024

I tried the latest FreePackages 1.5.0.1 on the latest ASF 6.0.4.4 on Linux. With around 100 accounts on default settings (the only change being `EnableFreePackages: true` in every bot config), memory usage grows to over 8 GB of RAM (normally ASF uses less than 400 MB for all of these accounts), until the oom_reaper hopefully kills ASF.

I'm not exactly sure how to debug this, as I don't know much about ASF's plugin architecture and such... and of course, running this many accounts is unsupported. But it feels to me like it's leaking memory in some way. Do you have any ideas/suggestions on how I could help you debug this? Thank you.

Citrinate (Owner) commented Jul 7, 2024

Obviously this might not be something I can reproduce, so I'm not sure I'll be able to fix it, though I am willing to look into it.

How long is ASF running before memory usage gets to over 8 GB? Are you using the userscript to import packages from SteamDB? Are you using any ASF plugins in addition to this one? I'm also wondering if you could test my BoosterManager plugin to see if it has a similar issue (editing configs or using commands isn't necessary; I'd just like to know how simply having it installed affects memory usage), as the architecture of both of these plugins is very similar.

The plugin creates a `botname_FreePackages.db` file in `asf/config` for every bot to preserve each bot's state. I don't expect the combined state of 100 bots to use anywhere near 8 GB of memory, but I'm still curious what the combined size of all of these files is for you. After that, I'd like to know if using the `clearfreepackagesqueue asf` command (which will delete much of the data in these files) noticeably improves memory usage.
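
If adding those up by hand is tedious, a throwaway sketch like this would do it (it assumes the default `config` directory layout and is run from the ASF folder; adjust the path if yours differs):

```csharp
// Throwaway sketch: sum the sizes of all per-bot FreePackages state files.
// Assumes it's run from the ASF directory, with the files living in "config".
using System;
using System.IO;
using System.Linq;

class DbSize {
    static void Main() {
        var files = Directory.EnumerateFiles("config", "*_FreePackages.db").ToList();
        long totalBytes = files.Sum(f => new FileInfo(f).Length);
        Console.WriteLine($"{files.Count} files, {totalBytes / 1024.0:F1} KiB combined");
    }
}
```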

Edit: Also, were you using this plugin before without issue, or is this your first time trying it? Do you by any chance have `Debug` set to `true` in `ASF.json`?

MikeLund (Author) commented Jul 7, 2024

Hey!

> How long is ASF running before memory usage gets to over 8 GB?

It goes up gradually, like something slowly leaking would. I'll get back to you on this.

> Are you using the userscript to import packages from SteamDB?

Nope, just the automatic background method. I don't have IPC enabled either.

> Are you using any ASF plugins in addition to this one?

Nothing non-default. In the plugins folder I have ArchiSteamFarm.OfficialPlugins.ItemsMatcher, ArchiSteamFarm.OfficialPlugins.MobileAuthenticator, and ArchiSteamFarm.OfficialPlugins.SteamTokenDumper, but I haven't enabled them for anything.

> I'm also wondering if you could test my BoosterManager plugin to see if it has a similar issue (editing configs or using commands isn't necessary; I'd just like to know how simply having it installed affects memory usage), as the architecture of both of these plugins is very similar.

I tried it quickly and memory doesn't seem to increase much at all, so it's apparently something more specific to this plugin.

> what the combined size of all of these files is for you

Every file is ~48 KB, so roughly 4.8 MB combined across 100 bots; nothing dramatic. I notice the memory used before the accounts have even connected is slightly higher with the plugin enabled, so it's probably already loading all of that before it starts going overboard.

> if using the `clearfreepackagesqueue asf` command (which will delete much of the data in these files) noticeably improves memory usage

It didn't have any real impact, but I'll try again when it manages to balloon up.

> Edit: Also, were you using this plugin before without issue, or is this your first time trying it?

I last used it maybe half a year ago, on Windows. Thinking back, it probably had the same problem, but I only really noticed it on Linux because I had swap turned off.

> Do you by any chance have `Debug` set to `true` in `ASF.json`?

Nope.

One thing I thought of: I have `WebLimiterDelay` set to 1800; could it be that lots of queued requests are taking up the memory? I wouldn't think so, though...

MikeLund (Author) commented Jul 7, 2024

After 1 hour it was at 2.2 GB and continuing at the same rate (from a normal baseline of ~300 MB).

Interesting thought: `FarmingPreferences` didn't include `FarmingPausedByDefault`, so every single time a package was added, it would check badge pages etc. Disabling the farming check seemed to help a bit.

After disabling farming, it has grown to 1.8 GB after 2 hours. `clearfreepackagesqueue asf` does absolutely nothing: it doesn't seem to free a single byte. Memory still seems to be growing, but not as badly as with frequent badge checking. I'll see more tomorrow :)

Citrinate (Owner) commented Jul 7, 2024

It might be helpful for you to post your `ASF.json` file; feel free to remove any personal data.

> Interesting thought: `FarmingPreferences` didn't include `FarmingPausedByDefault`, so every single time a package was added, it would check badge pages etc. Disabling the farming check seemed to help a bit.

If the problem can be triggered simply by adding games to your accounts, then maybe you can trigger it using ASF's `addlicense` command. Try not having farming paused, not using the FreePackages plugin, and then activating a bunch of free packages using the `addlicense` command. SteamDB's freepackages tool can generate such a command under "Add using ArchiSteamFarm".

MikeLund (Author) commented Jul 8, 2024

After ~13 hours, 6.2 GB, so it's still happening. One weird thing: for the last hour, the last entry in all my logs was "Init() Success!"; it seemingly wasn't doing anything else. That may have been caused by the partial Steam downtime around 6 AM CEST or something... which gives me the same guess as before regarding the web limiter: that there's a ton of commands queued up in some way? Just blind guesses, though.

> and then activating a bunch of free packages using the `addlicense` command

One big difference is that the plugin adds them one by one, so unless I send them the same way (`addlicense 123; addlicense 456; addlicense 789;` etc.) it wouldn't check the badge page every time. And it's normal for badge checking to take a ton of memory, but it usually frees it fine otherwise. But maybe.

`ASF.json`:

```json
{
  "AutoRestart": false,
  "ConnectionTimeout": 180,
  "GiftsLimiterDelay": 10,
  "InventoryLimiterDelay": 8,
  "IPC": false,
  "SteamMessagePrefix": "",
  "SteamOwnerID": xxxxxxxxx,
  "UpdatePeriod": 0,
  "WebLimiterDelay": 1800
}
```

`botconfig.json`:

```json
{
  "AcceptGifts": true,
  "BotBehaviour": 15,
  "Enabled": true,
  "FarmingPreferences": 129,
  "LootableTypes": [
    0,
    1,
    2,
    3,
    4,
    5,
    6
  ],
  "RedeemingPreferences": 3,
  "RemoteCommunication": 0,
  "SendTradePeriod": 96,
  "SteamLogin": "username",
  "SteamPassword": "password",
  "SteamTradeToken": "zzzzzzzz",
  "SteamUserPermissions": {
    "xxxxxxxxxxx": 3
  },
  "TradingPreferences": 1,
  "EnableFreePackages": true,
  "SendOnFarmingFinished": true,
  "HandleOfflineMessages": false,
  "SteamParentalPIN": "0"
}
```

I also read through the code quickly; I haven't used C# in a really long time and have no experience with your codebase, but perhaps you have some semaphores you're not releasing? Maybe PICSHandler / PICSChangesSemaphore? Especially considering you said BoosterManager has a very similar architecture, yet it gives me no problems (so some of the bigger differences are perhaps within PICS), plus the memory accumulates steadily, kind of like how PICS steadily receives updates? No idea, though.
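
To illustrate what I mean, here's a minimal sketch of the pattern I'd look for (the names are borrowed from my guess above; the body is invented, not your actual code):

```csharp
// Illustrative only: the release-in-finally pattern that keeps a SemaphoreSlim
// from being held forever after an exception or early return.
using System.Threading;
using System.Threading.Tasks;

static class PICSExample {
    static readonly SemaphoreSlim PICSChangesSemaphore = new SemaphoreSlim(1, 1);

    static async Task OnPICSChangesAsync() {
        await PICSChangesSemaphore.WaitAsync().ConfigureAwait(false);
        try {
            // ... process the PICS changes ...
        } finally {
            // If Release() is ever skipped, every later WaitAsync() caller queues
            // up forever, and those queued continuations (plus whatever they hold
            // onto) are memory that never gets freed.
            PICSChangesSemaphore.Release();
        }
    }
}
```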

I'm not sure what my next step would be now; maybe something with the MonitoringPlugin could help? It'd be cool to figure this out, but I appreciate your help regardless of whether we manage to resolve it, so thanks :)

woctezuma commented Jul 8, 2024

> One thing I thought of: I have `WebLimiterDelay` set to 1800; could it be that lots of queued requests are taking up the memory? I wouldn't think so, though...

There is probably a bottleneck somewhere due to this, and the queue of requests grows faster than it can ever be cleared.
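
As a back-of-the-envelope sketch of why that would balloon (this assumes `WebLimiterDelay` throttles all web requests across every bot, and the per-bot demand figure below is invented purely for illustration):

```csharp
// Rough arithmetic: if requests are generated faster than the limiter serves
// them, the backlog grows without bound.
using System;

class QueueGrowth {
    static void Main() {
        double delayMs = 1800;                         // WebLimiterDelay
        double servedPerHour = 3_600_000.0 / delayMs;  // limiter allows ~2000 requests/hour
        double demandPerHour = 100 * 30;               // 100 bots x 30 requests/hour (invented)
        Console.WriteLine($"served: {servedPerHour:F0}/h, demanded: {demandPerHour:F0}/h");
        Console.WriteLine($"backlog growth: {demandPerHour - servedPerHour:F0} requests/hour");
    }
}
```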

A few suggestions:

1. Maybe lower the value of the `WebLimiterDelay` parameter:
   - the default is 300, which means at most 1 request every 300 ms,
   - you are using 1800, which means at most 1 request every 1.8 seconds,
   - there may be a better trade-off between 300 and 1800 ms.
2. If possible, lower the number of bots:
   - the issue should not happen with 1 bot,
   - you notice the issue with 100 bots,
   - there may be a better trade-off between 1 and 100 bots.
3. Add filters so that you don't add every free package, e.g. skip demos. This may tremendously decrease the queue size:

   ```json
   "EnableFreePackages": true,
   "FreePackagesFilter": {
     "IgnoreFreeWeekends": true,
     "IgnoredTypes": [
       "Demo"
     ]
   },
   ```

I wonder if it would be possible to replicate this issue with fewer bots by setting the rate limit to a very low value. For instance:

```json
"FreePackagesPerHour": 1,
```

maybe along with a non-default value in `ASF.json`:

```json
"WebLimiterDelay": 600,
```

Citrinate (Owner) commented Jul 8, 2024

> One weird thing: for the last hour, the last entry in all my logs was "Init() Success!"; it seemingly wasn't doing anything else. That may have been caused by the partial Steam downtime around 6 AM CEST or something... which gives me the same guess as before regarding the web limiter: that there's a ton of commands queued up in some way? Just blind guesses, though.

Could you post the results of the `qsa` command while memory usage is high? If there's a bottleneck somewhere, I'd expect to see it in the output of this command. It would also be helpful to see how the output changes over time (say, every hour). If the status of each bot is identical, then it's not necessary to post the status for all of them.

> I'm not sure what my next step would be now; maybe something with the MonitoringPlugin could help?

I've never used that plugin, but if you're able to get it set up, then sure, it might be helpful to see runtime stats and maybe HTTP client stats (as shown in this screenshot).
