
bug: multi-scrobbler creates countless empty log files + detects memory leak #214

Closed
2 tasks done
AverageHoarder opened this issue Oct 27, 2024 · 25 comments
Labels
bug Something isn't working

Comments

@AverageHoarder

Please check existing knowledge before opening an issue

Describe the Bug

multi-scrobbler exhausted my filesystem's inodes by creating countless empty log files (ls literally did not respond for 10+ seconds) in the persistent /config/logs/ folder, following this naming scheme:
scrobble-2024-10-27.1000.log
I thought maybe my old config clashed with a new version, so I deleted the container, renamed the config directory, created a new empty config directory, and re-pulled and redeployed, but the problem persists. This is my compose file (I removed the IPs and the API key).

services:
  multi-scrobbler:
    image: foxxmd/multi-scrobbler:latest
    container_name: multi-scrobbler
    environment:
      - TZ=Europe/Berlin
      - BASE_URL=
      - MALOJA_URL=
      - MALOJA_API_KEY=
      - PUID=1000
      - PGID=1000
    volumes:
      - /opt/multi-scrobbler/config:/config
    #networks:
    #  - (optional to add container to the same bridge network that maloja is inside to be able to use docker internal networking & dns to resolve and connect to maloja URL via http://maloja:port)
    ports:
      - 9078:9078 # first port is the HOST port multi-scrobbler will serve UI on
    restart: unless-stopped

I have no idea what's going on; until a couple of hours before this, it was successfully syncing last.fm plays to Maloja.

Platform

Docker

Versions

multi-scrobbler:latest docker tag, maloja 3.2.2

Logs

[migrations] started
[migrations] no migrations found
───────────────────────────────────────
  _____ __ __ _____ _____ _____ _____ 
 |     |  |  |   __|_   _|     |     |
 |   --|  |  |__   | | | |  |  | | | |
 |_____|_____|_____| |_| |_____|_|_|_|
       _____ __ __ _ __    ____  
      | __  |  |  | |  |  |    \ 
      | __ -|  |  | |  |__|  |  |
      |_____|_____|_|_____|____/ 
  Based on images from linuxserver.io
───────────────────────────────────────
To support LSIO projects visit:
https://www.linuxserver.io/donate/
───────────────────────────────────────
GID/UID
───────────────────────────────────────
User UID:    1000
User GID:    1000
───────────────────────────────────────
-------------------------------------
Setting up multi-scrobbler config directory based on CONFIG_DIR env: /config
Directory is not empty, not creating examples.
chown'ing directory to ensure correct permissions.
Done!
-------------------------------------
[custom-init] No custom files found, skipping...
multi-scrobbler is starting!
[ls.io-init] done.
[2024-10-27 02:43:43.029 +0100] VERBOSE: [Init] Config Dir ENV: /config -> Resolved: /config
[2024-10-27 02:43:43.050 +0100] INFO   : [Init] Version: 0.8.6
[2024-10-27 02:43:43.051 +0100] INFO   : [Init] Generating schema definitions...
[2024-10-27 02:43:45.094 +0100] INFO   : [Init] Schema definitions generated
Warning: connect.session() MemoryStore is not
designed for a production environment, as it will leak
memory, and will not scale past a single process.
2:43:45 AM [vite-express] Running in production mode
2:43:45 AM [vite-express] Inline config detected, ignoring Vite config file
2:43:45 AM [vite-express] Serving static files from /app/dist
[2024-10-27 02:43:45.129 +0100] INFO   : [App] [API] Server started:
---> Local          http://localhost:9078
---> Network        http://172.24.0.2:9078

--- HINT ---
MS is likely being run in a container with BRIDGE networking which means the above addresses are not accessible from outside this container.
To ensure the container is accessible make sure you have mapped the *container* port 9078 to a *host* port. https://foxxmd.github.io/multi-scrobbler/docs/installation#networking
The container will then be accessible at http://HOST_MACHINE_IP:HOST_PORT (or http://redacted:9078/ since you defined this!)

--- HINT ---
[2024-10-27 02:43:45.131 +0100] INFO   : [App] [API] User-defined base URL for UI and redirect URLs (spotify, deezer, lastfm): http://redacted:9078/
[2024-10-27 02:43:45.168 +0100] DEBUG  : [App] [Scrobblers] Constructing maloja (unnamed-mlj) client...
[2024-10-27 02:43:45.170 +0100] INFO   : [App] [Scrobblers] [Maloja - unnamed-mlj] Client Added
[2024-10-27 02:43:45.171 +0100] INFO   : [App] Starting scrobble clients...
[2024-10-27 02:43:45.172 +0100] DEBUG  : [App] [Scrobblers] [Maloja - unnamed-mlj] Attempting to initialize...
[2024-10-27 02:43:45.173 +0100] VERBOSE: [App] [Scrobblers] [Maloja - unnamed-mlj] Building required data init succeeded
[2024-10-27 02:43:45.189 +0100] INFO   : [App] [Scrobblers] [Maloja - unnamed-mlj] Communication test succeeded.
[2024-10-27 02:43:45.190 +0100] INFO   : [App] [Scrobblers] [Maloja - unnamed-mlj] Maloja Server Version: 3.2.2
[2024-10-27 02:43:45.196 +0100] VERBOSE: [App] [Scrobblers] [Maloja - unnamed-mlj] Connection check succeeded
[2024-10-27 02:43:45.200 +0100] INFO   : [App] [Scrobblers] [Maloja - unnamed-mlj] Auth test passed!
[2024-10-27 02:43:45.201 +0100] INFO   : [App] [Scrobblers] [Maloja - unnamed-mlj] Fully Initialized!
[2024-10-27 02:43:45.202 +0100] VERBOSE: [App] [Scrobblers] [Maloja - unnamed-mlj] Fetching up to 100 initial scrobbles...
[2024-10-27 02:43:45.202 +0100] DEBUG  : [App] [Scrobblers] [Maloja - unnamed-mlj] Refreshing recent scrobbles
[2024-10-27 02:43:45.212 +0100] DEBUG  : [App] [Scrobblers] [Maloja - unnamed-mlj] Found 100 recent scrobbles
[2024-10-27 02:43:45.215 +0100] INFO   : [App] [Scrobblers] [Maloja - unnamed-mlj] Scrobble processing started
[2024-10-27 02:43:45.237 +0100] INFO   : [App] [Sources] Finished initializing sources
[2024-10-27 02:43:45.238 +0100] INFO   : [App] Scheduler started.
(node:184) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
(Use `node --trace-warnings ...` to show where the warning was created)
(node:184) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
(node:184) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
(node:184) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
(node:184) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
(node:184) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
(node:184) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
(node:184) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
(node:184) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit

Additional Context

No response

@AverageHoarder AverageHoarder added the bug Something isn't working label Oct 27, 2024
@FluffyClaws

Same issue here, caught it yesterday on a pr-203 image, and after updating to 0.8.6 it remained.

@janaxhell

Just woke up to see multi-scrobbler borking all my server resources; I can confirm the leak.

@FluffyClaws

FluffyClaws commented Oct 27, 2024

Not sure if it's related, but in my case with the pr-203 image, after authenticating YT and starting it, in addition to the memory leak the polling rate constantly increases for Subsonic and Spotify, and CPU utilization ramps up over time.
Can't test with 0.8.6 since it doesn't support Google's new cookie format.

[2024-10-27 11:03:38.220 +0200] INFO   : [App] [Sources] [Ytmusic - MyYTMusic] Polling started
[2024-10-27 11:03:38.221 +0200] DEBUG  : [App] [Sources] [Ytmusic - MyYTMusic] Refreshing recently played
(node:158) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
[2024-10-27 11:03:38.779 +0200] DEBUG  : [App] [Sources] [Ytmusic - MyYTMusic] No new tracks discovered
[2024-10-27 11:03:38.780 +0200] DEBUG  : [App] [Sources] [Ytmusic - MyYTMusic] Last activity was at 2024-10-27T11:03:38+02:00 | Next check interval: 10.00s
(node:158) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
[2024-10-27 11:03:40.014 +0200] DEBUG  : [App] [Sources] [Subsonic - unnamed] Refreshing recently played
[2024-10-27 11:03:40.026 +0200] DEBUG  : [App] [Sources] [Subsonic - unnamed] No new tracks discovered
[2024-10-27 11:03:40.027 +0200] DEBUG  : [App] [Sources] [Subsonic - unnamed] Last activity 2024-10-27T10:56:35+02:00 is 2 minutes outside of polling period (last activity + 300s) | Next check interval: (10 + 20)s
[2024-10-27 11:03:42.815 +0200] DEBUG  : [App] [Sources] [Spotify - unnamed] Refreshing recently played
[2024-10-27 11:03:42.941 +0200] DEBUG  : [App] [Sources] [Spotify - unnamed] No new tracks discovered
[2024-10-27 11:03:42.944 +0200] DEBUG  : [App] [Sources] [Spotify - unnamed] Last activity 2024-10-27T10:56:33+02:00 is 2 minutes outside of polling period (last activity + 300s) | Next check interval: (10 + 20)s
(node:158) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
[2024-10-27 11:03:48.283 +0200] DEBUG  : [App] [Sources] [Ytmusic - MyYTMusic] Refreshing recently played
[2024-10-27 11:03:48.961 +0200] DEBUG  : [App] [Sources] [Ytmusic - MyYTMusic] No new tracks discovered
[2024-10-27 11:03:48.962 +0200] DEBUG  : [App] [Sources] [Ytmusic - MyYTMusic] Last activity was at 2024-10-27T11:03:38+02:00 | Next check interval: 10.00s
(node:158) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
(node:158) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
(node:158) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
(node:158) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
(node:158) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
(node:158) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
(node:158) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
(node:158) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
(node:158) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
(node:158) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
(node:158) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
(node:158) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
(node:158) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
(node:158) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
[2024-10-27 11:03:58.467 +0200] DEBUG  : [App] [Sources] [Ytmusic - MyYTMusic] Refreshing recently played
[2024-10-27 11:03:59.207 +0200] DEBUG  : [App] [Sources] [Ytmusic - MyYTMusic] No new tracks discovered
[2024-10-27 11:03:59.208 +0200] DEBUG  : [App] [Sources] [Ytmusic - MyYTMusic] Last activity was at 2024-10-27T11:03:38+02:00 | Next check interval: 10.00s
(node:158) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
(node:158) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
(node:158) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
[2024-10-27 11:04:08.711 +0200] DEBUG  : [App] [Sources] [Ytmusic - MyYTMusic] Refreshing recently played
[2024-10-27 11:04:09.419 +0200] DEBUG  : [App] [Sources] [Ytmusic - MyYTMusic] No new tracks discovered
[2024-10-27 11:04:09.420 +0200] DEBUG  : [App] [Sources] [Ytmusic - MyYTMusic] Last activity was at 2024-10-27T11:03:38+02:00 | Next check interval: 10.00s
[2024-10-27 11:04:10.038 +0200] DEBUG  : [App] [Sources] [Subsonic - unnamed] Refreshing recently played
[2024-10-27 11:04:10.045 +0200] DEBUG  : [App] [Sources] [Subsonic - unnamed] No new tracks discovered
[2024-10-27 11:04:10.045 +0200] DEBUG  : [App] [Sources] [Subsonic - unnamed] Last activity 2024-10-27T10:56:35+02:00 is 3 minutes outside of polling period (last activity + 300s) | Next check interval: (10 + 20)s
(node:158) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
(node:158) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
(node:158) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
[2024-10-27 11:04:12.980 +0200] DEBUG  : [App] [Sources] [Spotify - unnamed] Refreshing recently played
[2024-10-27 11:04:13.108 +0200] DEBUG  : [App] [Sources] [Spotify - unnamed] No new tracks discovered
[2024-10-27 11:04:13.109 +0200] DEBUG  : [App] [Sources] [Spotify - unnamed] Last activity 2024-10-27T10:56:33+02:00 is 3 minutes outside of polling period (last activity + 300s) | Next check interval: (10 + 20)s
(node:158) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
(node:158) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
(node:158) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
(node:158) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
[2024-10-27 11:04:18.926 +0200] DEBUG  : [App] [Sources] [Ytmusic - MyYTMusic] Refreshing recently played
[2024-10-27 11:04:19.544 +0200] DEBUG  : [App] [Sources] [Ytmusic - MyYTMusic] No new tracks discovered
[2024-10-27 11:04:19.544 +0200] DEBUG  : [App] [Sources] [Ytmusic - MyYTMusic] Last activity was at 2024-10-27T11:03:38+02:00 | Next check interval: 10.00s
(node:158) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
(node:158) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
(node:158) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
(node:158) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
[2024-10-27 11:04:29.050 +0200] DEBUG  : [App] [Sources] [Ytmusic - MyYTMusic] Refreshing recently played
[2024-10-27 11:04:29.737 +0200] DEBUG  : [App] [Sources] [Ytmusic - MyYTMusic] No new tracks discovered
[2024-10-27 11:04:29.737 +0200] DEBUG  : [App] [Sources] [Ytmusic - MyYTMusic] Last activity was at 2024-10-27T11:03:38+02:00 | Next check interval: 10.00s
(node:158) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
(node:158) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit
(node:158) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 ready listeners added to [SonicBoom]. Use emitter.setMaxListeners() to increase limit

@iluvatyr

iluvatyr commented Oct 27, 2024

Confirming, it happened to me too yesterday. But I didn't even update, so why is it happening to all of us at the same time? Is it maybe because of the time change?

"build_version": "Linuxserver.io version:- 4df2f176-ls37 Build-date:- 2024-07-12T07:56:51+00:00", "com.docker.compose.config-hash": "26c6ee3206d196f7ec1ffbc5bedd8a888221f1088f7e0d6d2f4ba4c1ec0320e6", "com.docker.compose.container-number": "1", "com.docker.compose.depends_on": "", "com.docker.compose.image": "sha256:8e1a704da8689e87bb38c07b907f48934128a3ae28f60bfaf2cfa35830c39d0a", "com.docker.compose.oneoff": "False", "com.docker.compose.project": "mediaserver", "com.docker.compose.service": "multi-scrobbler", "com.docker.compose.version": "2.29.1", "maintainer": "TheLamer", "org.opencontainers.image.authors": "linuxserver.io", "org.opencontainers.image.created": "2024-07-18T18:19:50.386Z", "org.opencontainers.image.description": "Scrobble plays from multiple sources to multiple clients", "org.opencontainers.image.documentation": "https://docs.linuxserver.io/images/docker-baseimage-debian", "org.opencontainers.image.licenses": "MIT", "org.opencontainers.image.ref.name": "8c8d587ef08111a2b011f1a425aab7b7e8710cc0", "org.opencontainers.image.revision": "729250a2ccb2d3c8030baf7a5b07ddfcbbd53977", "org.opencontainers.image.source": "https://github.com/FoxxMD/multi-scrobbler", "org.opencontainers.image.title": "multi-scrobbler", "org.opencontainers.image.url": "https://github.com/FoxxMD/multi-scrobbler", "org.opencontainers.image.vendor": "linuxserver.io", "org.opencontainers.image.version": "latest"

@janaxhell

That's a very interesting guess; I haven't updated either, and suddenly this happened.

@b9AcE

b9AcE commented Oct 27, 2024

It seems I have been afflicted by this bug too, resulting in one CPU core running at 100% for "app/src/backend/index.ts" from https://github.com/FoxxMD/logging as it created those empty log files; for me they reached over 112,000 in the persistent "config/logs/" directory, so many that rm declared there were too many to deal with ("rm *" gives "argument list too long", though deleting subsets of them at a time of course worked).

This should be fixed urgently, and the resulting files should be cleaned up as part of that fix, because it seems very plausible that some systems running multi-scrobbler may become unusable, or even unbootable, if this is left running long enough to create too many files.

@FluffyClaws

so many that rm just declared they were too many to deal with ("rm *" gives argument list too long, but deleting subsets of them at a time of course worked).

Running rm -r logs did it for me, in case anyone cannot delete them from inside the /logs directory.

@AverageHoarder
Author

AverageHoarder commented Oct 27, 2024

"rm *" gives argument list too long, but deleting subsets of them at a time of course worked)

If you run into this error (like I did), don't waste your time deleting subsets.
Run this in the /config/logs/ folder (and nowhere else!).
find . -name "*.log" -delete

@iluvatyr

"rm *" gives argument list too long, but deleting subsets of them at a time of course worked)

If you run into this error (like I did), don't waste your time deleting subsets. Run this in the /config/logs/ folder (and nowhere else!). find . -name "*.log" -delete

you can also just delete empty log files by using find . -type f -empty -delete within the logs folder.

@GioF71

GioF71 commented Oct 27, 2024

"rm *" gives argument list too long, but deleting subsets of them at a time of course worked)

If you run into this error (like I did), don't waste your time deleting subsets. Run this in the /config/logs/ folder (and nowhere else!). find . -name "*.log" -delete

you can also just delete empty log files by using find . -type f -empty -delete within the logs folder.

Hello, thank you, I am also affected by this issue. I run the app with Docker, FWIW. My investigation started from a failing apt update on the host machine, with errors like "no space left on device", which was not exactly the case: I had run out of inodes. Your command is running and reducing inode usage.

@rantanlan

Deleting the logs won't fix it... I was affected too, deleted the log files, and ran into the inode issue again an hour later.

@GioF71

GioF71 commented Oct 27, 2024

Deleting the logs won't fix it... I was affected too, deleted the log files, and ran into the inode issue again an hour later.

Sure, but it solves a practical problem until there is a fix/workaround. I couldn't do anything on the host system; at least now I can.

@GioF71

GioF71 commented Oct 27, 2024

I just put a line in my user crontab so the cleanup command runs every 5 minutes, like this:

*/5 * * * * find /home/your-user-name/multi-scrobbler/config/logs -type f -empty -delete

@megatwig

Confirming. happened to me too yesterday. But I didnt even update, so why is it happening to all of us at the same time? Is it maybe because of the time change?

bingo.

Offending lines in the pino-roll dependency:

https://github.com/mcollina/pino-roll/blob/ddc07b51976d2ea409762b6389622ce1c9d5aa81/lib/utils.js#L28-L29

const start = today.setHours(0, 0, 0, 0)
return { frequency, start, next: getNextDay(start) }

[...]

function getNextDay (start) {
  return new Date(start + 24 * 60 * 60 * 1000).setHours(0, 0, 0, 0)
}

It calculates tomorrow's date by adding 24 hours to the previous midnight, but today is the one day of the year (in Europe) that's 25 hours long.

So good news, it'll magically fix itself at midnight. Bad news, the same bug will trigger for users in the US next weekend.
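
To make the failure mode concrete, here is a minimal Node sketch (an illustration only, not pino-roll's code or tests; it assumes the process runs with TZ=Europe/Berlin, and the file name is made up) showing that on the 25-hour fall-back day the computed "next" rotation time collapses back onto "start", so the rotation target is never in the future:

// run with: TZ=Europe/Berlin node dst-check.js
const start = new Date(2024, 9, 27).setHours(0, 0, 0, 0)                 // local midnight, 2024-10-27 (CEST, UTC+2)
const next = new Date(start + 24 * 60 * 60 * 1000).setHours(0, 0, 0, 0)  // same arithmetic as getNextDay()

console.log(new Date(start).toString()) // Sun Oct 27 2024 00:00:00 GMT+0200
console.log(new Date(next).toString())  // Sun Oct 27 2024 00:00:00 GMT+0200 -- the very same midnight
console.log(next === start)             // true: the rotation target never moves into the future,
                                        // which lines up with the flood of new log files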

@rogales

rogales commented Oct 27, 2024

Have the same problem as you guys.
Just stopped the container, logged in via terminal, and ran find . -name "*.log" -delete :)

@FluffyClaws

it'll magically fix itself at midnight.

Can confirm, it's fixed now. Until next year, I guess :)

@rogales

rogales commented Oct 28, 2024

it'll magically fix itself at midnight.

Can confirm, it's fixed now. Until next year, I guess :)

remember that there is a time shift in spring too. :)

@SirMartin

Same here, it crashed all my systems. Afterwards I found that I had Duplicati running a backup at 2:30 AM, so it backed up all the empty logs twice until everything crashed 😄

@SirMartin

it'll magically fix itself at midnight.

Can confirm, it's fixed now. Until next year, I guess :)

remember that there is a time shift in spring too. :)

Yes, but in that case it's a 23-hour day, so at least it would not be the same issue; maybe an issue, but not the same one 🤣

@FoxxMD
Owner

FoxxMD commented Oct 28, 2024

Thanks for the reports, the detective work done so far, and the workarounds!

@megatwig may be right but I also use my own file formatting based on date so this might be the culprit too. I'm investigating and attempting to reproduce.

In the meantime and as a PSA: file logging can be turned off for multi-scrobbler with logging settings in config.json:

{
  "logging": {
    // default log level to DEBUG everywhere
    "level": "debug",

    // but do not log to file
    "file": false
  },
  // ...rest of the file
}

@megatwig

@megatwig may be right but I also use my own file formatting based on date so this might be the culprit too. I'm investigating and attempting to reproduce.

Your code works fine (at least for the default config). I tried patching pino-roll locally last night to do .setDate(+1) instead of .setHours(+24) and the bug went away.
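
For reference, the local patch described above would look roughly like this (a hypothetical sketch of the idea only, not the exact change in the upstream PR):

// hypothetical sketch: advance by one calendar day instead of a fixed 24 hours
function getNextDay (start) {
  const d = new Date(start)
  d.setDate(d.getDate() + 1)     // DST-safe: next calendar day, even when a day is 23 or 25 hours long
  return d.setHours(0, 0, 0, 0)  // timestamp of the next local midnight, always strictly after start
}

Because setDate() works in calendar days rather than fixed milliseconds, the result lands on the following local midnight regardless of how long the current day is.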

@FoxxMD
Owner

FoxxMD commented Oct 28, 2024

Thanks for the confirmation 👍 I'm still on my first cup of coffee and waking up. Working on creating a reproducible example for pino-roll.

@FoxxMD
Owner

FoxxMD commented Oct 28, 2024

I've logged the issue on pino-roll and have opened a PR that fixes the DST -> Standard Time issue and adds a test for the Standard -> DST transition that will occur in March: mcollina/pino-roll#96

The pino-roll owner is pretty responsive, so I imagine it'll be merged shortly, but if it's not in by tomorrow/Wednesday I'll hotfix with a patch in multi-scrobbler so USA users can update before this coming weekend.

@FoxxMD
Owner

FoxxMD commented Oct 28, 2024

This issue should be fixed in the latest foxxmd/multi-scrobbler:edge Docker image and on the master branch. I'll cut a release for it tomorrow or Wednesday.

@FoxxMD
Owner

FoxxMD commented Oct 29, 2024

This is fixed in 0.8.7
