I have a directory that is periodically populated with logs like this:
Code:
2026-03-18_22-56-00.log
2026-03-19_02-30-00.log
2026-03-22_02-30-00.log
2026-03-25_02-30-00.log
2026-03-25_17-08-16.log
I want to compress the files that are older than 7 days, and I'm a little confused about which tool is appropriate for this task: newsyslog(8) or a custom script run by cron(8)?
If I understand correctly, newsyslog(8) is useful when you essentially have one 'main' log file (named something like mail.log) that always contains the latest entries; when this file grows or ages, it is rotated to something like mail.log.0.xz and mail.log is truncated.
That doesn't quite look like my case, because my program already writes each run to a separate log file with a unique name. So I guess it would be more appropriate to schedule cron(8) to run a custom shell script that uses find(1) to locate files that are old enough and compress them.
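For reference, here is roughly what I have in mind. This is only a sketch: /var/log/myapp is a made-up path standing in for my log directory, and I'm assuming xz(1) as the compressor (it replaces each file with a .xz copy and removes the original on success):

```shell
#!/bin/sh
# Hypothetical log directory -- adjust to wherever the program writes.
LOGDIR=${LOGDIR:-/var/log/myapp}

# Compress plain-text logs last modified more than 7 days ago.
if [ -d "$LOGDIR" ]; then
    find "$LOGDIR" -name '*.log' -mtime +7 -exec xz -- {} \;
fi
```

A crontab(5) entry such as `0 3 * * * /usr/local/sbin/compress-old-logs.sh` (path made up) would then run it nightly.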
Am I right?