
Collabora Online - Built-in CODE Server - log size #187

Open
amg-web opened this issue Jun 18, 2022 · 16 comments

Comments

amg-web commented Jun 18, 2022

We are using Collabora Online - Built-in CODE Server.
After 2 months without a restart, the log file grew very large (~2.8 GB), taking up a big part of the /tmp partition.
Some log files also remain behind after restarts.
Log rotation and removal need to be configured to keep the log size down.
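In the meantime, a logrotate rule is one way to cap the size. A minimal sketch, assuming the logs end up under the randomized /tmp/coolwsd.* directories that the built-in server creates (the glob and every directive value here are assumptions; check where your install actually writes its logs first):

```
# /etc/logrotate.d/coolwsd -- hypothetical; adjust the glob to your install
/tmp/coolwsd.*/*.log {
    daily
    size 100M
    rotate 7
    compress
    missingok
    notifempty
    copytruncate
}
```

copytruncate is used because coolwsd keeps the log file open and is not signalled to reopen it.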

mokkin (Contributor) commented Jun 3, 2024

I didn't realize that before, but we now also have this issue running on NC 29.0.1, with the /tmp/coolwsd.*/jails directory at around 54G.
The uptime is 34 days and we are using Collabora / Nextcloud Office only occasionally.

red3333 commented Jun 13, 2024

Same problem here.
I just realized that /tmp/systemd-private-*-apache2.service-*/tmp/coolwsd.*/jails also reached 54GB. Uptime around 109 days (but apache2 was restarted several times).
I just stopped apache2, deleted its /tmp/systemd-private-*-apache2.service-*/ directory and restarted it. Let's see if it comes back...

Configuration:
Nextcloud 29.0.2
Collabora Online Development Edition 24.04.2.1 80a6f97
Apache/2.4.59 (Debian)

Cris70 commented Jun 17, 2024

Same here, right after updating to Nextcloud v29.0.2 from Nextcloud 28.
It filled up my /tmp disk.
I removed part of the offending files from /tmp, but the directory is still growing at an alarming rate.
How to fix?

@djonik1562

I have the same problem.
It's growing really fast.

@Githopp192

Same problem here ==>

#176

/tmp/coolwsd.*/jails is growing very fast - 100GB in 24h !

Without monitoring the system, this would lead to a severe server crash.

epidemiaf1 commented Aug 8, 2024

Yep, I can confirm this is an issue with Nextcloud 29.0.4 and Collabora Online 24.4.502. The server crashed this morning and I had to hard-reboot it. After 5 hours it is already at 17GB (output from du: 17G ./coolwsd.*/jails).

@bastien30

Hello,

Same here, after updating to Nextcloud 29 from 28 on a docker-compose stack.

collabora/code:latest and nextcloud:stable from docker hub.

Githopp192 commented Aug 9, 2024

Same here, right after updating to Nextcloud v29.0.2 from Nextcloud 28. It filled up my /tmp disk. I removed part of the offending files from /tmp, but the directory is still growing at an alarming rate. How to fix?

@Cris70 - what I did as a workaround: I developed a script which will:

  • check /tmp/php-fpm every 15 minutes

  • define a critical threshold value and a very critical threshold value for /tmp

  • if the critical threshold is reached, check whether this is within business hours; if not, then:

    • set cloud maintenance: on, wait 5 minutes
    • restart apache, php-fpm and redis - this will clear the *coolwsd* stuff
    • set cloud maintenance: off

  • if the very critical threshold is reached, force the same steps immediately:

    • set cloud maintenance: on, wait 5 minutes
    • restart apache, php-fpm and redis - this will clear the *coolwsd* stuff
    • set cloud maintenance: off
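The steps above can be sketched roughly as the following shell script. This is a sketch only, not the author's actual script: the thresholds, business hours, service names, and the occ path are all assumptions for a typical Debian-style install.

```shell
#!/bin/sh
# Hypothetical sketch of the threshold-based /tmp guard described above.
# CRIT_GB, VERY_CRIT_GB, business hours (8-18), service names and the
# occ path are all assumptions -- adjust for your environment.

CRIT_GB=20        # "critical" threshold for /tmp, in GB (assumed value)
VERY_CRIT_GB=50   # "very critical" threshold, in GB (assumed value)

# decide_action USAGE_GB HOUR -> prints "restart" or "none"
decide_action() {
    usage=$1
    hour=$2
    if [ "$usage" -ge "$VERY_CRIT_GB" ]; then
        echo restart                      # very critical: act immediately
    elif [ "$usage" -ge "$CRIT_GB" ] \
         && { [ "$hour" -lt 8 ] || [ "$hour" -ge 18 ]; }; then
        echo restart                      # critical: act only outside business hours
    else
        echo none
    fi
}

do_restart() {
    # Mirrors the comment above: maintenance mode on, wait 5 minutes,
    # restart services (clears the coolwsd /tmp data), maintenance mode off.
    sudo -u www-data php /var/www/nextcloud/occ maintenance:mode --on
    sleep 300
    systemctl restart apache2 php8.2-fpm redis-server
    sudo -u www-data php /var/www/nextcloud/occ maintenance:mode --off
}

# Invoke with "run" (e.g. every 15 minutes from cron) to check and act.
if [ "$1" = run ]; then
    usage_gb=$(du -s --block-size=1G /tmp 2>/dev/null | cut -f1)
    hour=$(date +%H)
    if [ "$(decide_action "$usage_gb" "$hour")" = restart ]; then
        do_restart
    fi
fi
```

Scheduled e.g. via cron with a line like `*/15 * * * * /usr/local/sbin/tmp-guard.sh run` (hypothetical path).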

@epidemiaf1

We do use Nextcloud Office, but it isn't a critical part of the setup. I temporarily disabled it and the Collabora Online server within Nextcloud. I had to restart the apache and php8.2-fpm services as the jail files kept growing regardless.

@ulfkosack

Same here after updating NC from 28.0.4 to 29.0.5.
I've disabled the CODE Server and Nextcloud Office.

@cableTh0rn

Same issue here.
The /tmp/ folder filled up with coolwsd/jail files, over 700GB.
Restarting nginx, php-fpm and redis did not remove the tmp files.
I had to delete the temp stuff manually, then restarted the services.
Kind of disappointed.
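For that manual cleanup step, something along these lines can help. The `coolwsd.*` name pattern and the one-day age cutoff are assumptions, and deleting jail trees is only safe while coolwsd and the web server are stopped:

```shell
#!/bin/sh
# Hypothetical helper: delete leftover coolwsd.* jail trees directly under
# a base directory that have not been modified for more than a day.
# Run ONLY while coolwsd / apache / nginx are stopped.
prune_jails() {
    base=$1
    find "$base" -maxdepth 1 -name 'coolwsd.*' -mtime +0 -exec rm -rf {} +
}

# Example (with the services stopped):
# prune_jails /tmp
```

`-maxdepth 1` keeps find from descending into the jail trees themselves; the batch `-exec ... +` removes each matched tree whole.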

Githopp192 commented Aug 20, 2024

Same issue here. The /tmp/ folder filled up with coolwsd/jail files, over 700GB. Restarting nginx, php-fpm and redis did not remove the tmp files. I had to delete the temp stuff manually, then restarted the services. Kind of disappointed.

==> Since I disabled the maintenance-specific run ('maintenance_window_start' => 100, which effectively disables it), the issue has not re-occurred!?

(the issue = the long-running cron job and the growing /tmp/php data)

nextcloud/server#47132
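For anyone who wants to try the same change, it can be made via occ. A sketch, where the occ path and web user are assumptions for a typical install; per the comment above, a value of 100 lies outside the valid 0-23 hour range, which is what effectively disables the maintenance window:

```shell
# Hypothetical paths -- adjust to your Nextcloud install and web user.
sudo -u www-data php /var/www/nextcloud/occ \
    config:system:set maintenance_window_start --type=integer --value=100
```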

@cableTh0rn

Same issue here. The /tmp/ folder filled up with coolwsd/jail files, over 700GB. Restarting nginx, php-fpm and redis did not remove the tmp files. I had to delete the temp stuff manually, then restarted the services. Kind of disappointed.

==> Since I disabled the maintenance-specific run ('maintenance_window_start' => 100, which effectively disables it), the issue has not re-occurred!?

(the issue = the long-running cron job and the growing /tmp/php data)

nextcloud/server#47132

I run a check script that logs when the tmp files get bigger. It is exactly the maintenance_window time: 100GB in just one hour.

```
Wed Aug 21 04:38:42 +03 2024 --> 1.9G /tmp/
Wed Aug 21 04:38:52 +03 2024 --> 1.9G /tmp/
Wed Aug 21 04:39:02 +03 2024 --> 1.9G /tmp/
Wed Aug 21 04:39:12 +03 2024 --> 1.9G /tmp/
Wed Aug 21 04:39:22 +03 2024 --> 1.9G /tmp/
Wed Aug 21 04:39:32 +03 2024 --> 1.9G /tmp/
Wed Aug 21 04:39:42 +03 2024 --> 1.9G /tmp/
Wed Aug 21 04:39:52 +03 2024 --> 1.9G /tmp/
Wed Aug 21 04:40:02 +03 2024 --> 1.9G /tmp/
Wed Aug 21 04:40:12 +03 2024 --> 2.3G /tmp/
Wed Aug 21 04:40:22 +03 2024 --> 2.9G /tmp/
Wed Aug 21 04:40:33 +03 2024 --> 3.5G /tmp/
Wed Aug 21 04:40:43 +03 2024 --> 3.7G /tmp/
Wed Aug 21 04:40:53 +03 2024 --> 3.7G /tmp/
Wed Aug 21 04:41:03 +03 2024 --> 3.7G /tmp/
Wed Aug 21 04:41:13 +03 2024 --> 3.7G /tmp/
Wed Aug 21 04:41:23 +03 2024 --> 3.8G /tmp/
Wed Aug 21 04:41:33 +03 2024 --> 3.7G /tmp/
Wed Aug 21 04:41:43 +03 2024 --> 3.7G /tmp/
Wed Aug 21 04:41:53 +03 2024 --> 3.7G /tmp/
Wed Aug 21 04:42:03 +03 2024 --> 3.7G /tmp/
Wed Aug 21 04:42:13 +03 2024 --> 3.8G /tmp/
Wed Aug 21 04:42:23 +03 2024 --> 3.8G /tmp/
Wed Aug 21 04:42:33 +03 2024 --> 4.1G /tmp/
Wed Aug 21 04:42:43 +03 2024 --> 4.5G /tmp/
Wed Aug 21 04:42:54 +03 2024 --> 4.8G /tmp/
Wed Aug 21 04:43:04 +03 2024 --> 5.8G /tmp/
Wed Aug 21 04:43:14 +03 2024 --> 7.2G /tmp/
Wed Aug 21 04:43:24 +03 2024 --> 8.0G /tmp/
Wed Aug 21 04:43:34 +03 2024 --> 9.2G /tmp/
...
Wed Aug 21 04:43:44 +03 2024 --> 11G /tmp/
Wed Aug 21 04:43:54 +03 2024 --> 12G /tmp/
Wed Aug 21 04:44:04 +03 2024 --> 13G /tmp/
Wed Aug 21 04:44:15 +03 2024 --> 14G /tmp/
Wed Aug 21 04:44:25 +03 2024 --> 15G /tmp/
Wed Aug 21 04:44:35 +03 2024 --> 16G /tmp/
Wed Aug 21 04:44:45 +03 2024 --> 17G /tmp/
Wed Aug 21 04:44:55 +03 2024 --> 18G /tmp/
Wed Aug 21 04:45:05 +03 2024 --> 19G /tmp/
Wed Aug 21 04:45:15 +03 2024 --> 20G /tmp/
Wed Aug 21 04:45:25 +03 2024 --> 21G /tmp/
Wed Aug 21 04:45:35 +03 2024 --> 21G /tmp/
Wed Aug 21 04:45:45 +03 2024 --> 22G /tmp/
Wed Aug 21 04:45:55 +03 2024 --> 23G /tmp/
Wed Aug 21 04:46:06 +03 2024 --> 25G /tmp/
Wed Aug 21 04:46:16 +03 2024 --> 26G /tmp/
Wed Aug 21 04:46:26 +03 2024 --> 27G /tmp/
Wed Aug 21 04:46:36 +03 2024 --> 28G /tmp/
Wed Aug 21 04:46:46 +03 2024 --> 28G /tmp/
Wed Aug 21 04:46:56 +03 2024 --> 30G /tmp/
Wed Aug 21 04:47:06 +03 2024 --> 31G /tmp/
Wed Aug 21 04:47:16 +03 2024 --> 32G /tmp/
Wed Aug 21 04:47:26 +03 2024 --> 33G /tmp/
Wed Aug 21 04:47:36 +03 2024 --> 34G /tmp/
Wed Aug 21 04:47:47 +03 2024 --> 35G /tmp/
Wed Aug 21 04:47:57 +03 2024 --> 36G /tmp/
Wed Aug 21 04:48:07 +03 2024 --> 36G /tmp/
Wed Aug 21 04:48:17 +03 2024 --> 37G /tmp/
...
Wed Aug 21 04:58:24 +03 2024 --> 97G /tmp/
Wed Aug 21 04:58:34 +03 2024 --> 98G /tmp/
Wed Aug 21 04:58:45 +03 2024 --> 99G /tmp/
Wed Aug 21 04:58:55 +03 2024 --> 99G /tmp/
Wed Aug 21 04:59:05 +03 2024 --> 99G /tmp/
Wed Aug 21 04:59:15 +03 2024 --> 99G /tmp/
Wed Aug 21 04:59:25 +03 2024 --> 99G /tmp/
Wed Aug 21 04:59:35 +03 2024 --> 99G /tmp/
Wed Aug 21 04:59:45 +03 2024 --> 99G /tmp/
Wed Aug 21 04:59:55 +03 2024 --> 99G /tmp/
Wed Aug 21 05:00:06 +03 2024 --> 99G /tmp/
Wed Aug 21 05:00:16 +03 2024 --> 99G /tmp/
```
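A minimal sketch of such a check script, with the path, output format, and 10-second interval inferred from the log above (the log file location in the usage example is an assumption):

```shell
#!/bin/sh
# Hypothetical check helper: print a timestamped size of a directory,
# in the same "date --> size dir" format as the log above.
log_dir_size() {
    dir=$1
    echo "$(date) --> $(du -sh "$dir" 2>/dev/null | cut -f1) $dir"
}

# Run in a loop or from cron, e.g.:
# while true; do log_dir_size /tmp/ >> /var/log/tmp-size.log; sleep 10; done
```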

joshtrichards (Contributor) commented Aug 21, 2024

Same here, after updating to Nextcloud 29 from 28 on docker-compose stack.
collabora/code:latest and nextcloud:stable from docker hub.

@bastien30

If you're experiencing this with collabora/code:latest, then this isn't a richdocumentscode matter (which might actually be a very useful clue, at least if your situation has the same underlying cause as the other reporters here).

@joshtrichards (Contributor)

The logging in Built-in / richdocumentscode is, from the looks of it, just the same as the default for CODE in general: warning level.

https://sdk.collaboraonline.com/docs/installation/Configuration.html?highlight=logging#logging

Since Built-in is mostly for testing and personal use, log rotation was likely not an original consideration.

Some of these reports here suggest very rapid log file growth. It would be helpful if one of you reporting this can inventory what precisely is showing up in the logs that is generating so much usage. Is it an unusual error/warning situation? Is it specific to certain environments? Do you have many users? etc.
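For reference, on a standalone CODE install that level comes from the logging section of coolwsd.xml (element names per the linked configuration docs; the /etc/coolwsd/coolwsd.xml path and the exact attributes vary by package, so treat this excerpt as a sketch):

```xml
<!-- /etc/coolwsd/coolwsd.xml (excerpt) -->
<config>
  <logging>
    <!-- one of: none, fatal, critical, error, warning, notice,
         information, debug, trace -->
    <level>warning</level>
  </logging>
</config>
```

Dropping the level to error (or none) would at least slow the growth while the underlying cause is investigated.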

@bastien30

Same here, after updating to Nextcloud 29 from 28 on docker-compose stack.
collabora/code:latest and nextcloud:stable from docker hub.

@bastien30

If you're experiencing this with collabora/code:latest, then this isn't a richdocumentscode matter (which might actually be a very useful clue, at least if your situation has the same underlying cause as the other reporters here).

Thanks. It seems the problem is gone for me after uninstalling and reinstalling the CODE application from within the Nextcloud web interface.
