Why is it bad practice to automatically update the system?

Right now I have set up my home "play box" machine (vsftpd listening on the Internet, Samba listening on the LAN, a BitTorrent client) to apply updates to the base system and ports unattended, i.e. automatically. Then I grep the log, and if a patch was applied, I reboot the system (I was told there is no other way to be sure that the old, vulnerable binaries are removed from RAM).
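A minimal sketch of that kind of crontab setup, assuming freebsd-update(8) for the base system and pkg(8) for binary packages; the times and the log path are purely illustrative:

    # /etc/crontab -- unattended update schedule (sketch)
    # fetch base-system patches at a random delay (meant for cron use)
    0  2  *  *  *  root  freebsd-update cron
    # install whatever was fetched, then upgrade binary packages
    30 2  *  *  *  root  freebsd-update install >> /var/log/autoupdate.log 2>&1
    0  3  *  *  *  root  pkg upgrade -y >> /var/log/autoupdate.log 2>&1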

I've been told (and read in forums) that this is bad practice, but I'm not sure why that is.

I can imagine that:

a. I need to be on the system, inspect it, and make sure that everything is OK with the patches, or I might end up with something not working, or maybe the whole system not working.

b. Unix (and Unix-like systems) promote a hands-on approach to administration: everything should be monitored and inspected by the admin, who should always be aware of what's happening on the system. This approach to administration is part of the stability Unix is known for.

But I'm not sure, so I'm here to ask and make sure: why is it bad practice to update automatically?

P.S.: I understand that there might be two different answers, one for the base system and one for the ports.
 
I can see ports being broken by an update; the user would then have to work around that port, removing features or programs if possible. One thing I would suggest is running pkg audit -F and portsnap from crontab, and using a script to update only the vulnerable ports. Still, this does not immediately solve the problem if a port stays vulnerable for an extended period of time.
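A rough sketch of that idea; the script path and times are made up, and pkg audit -q simply prints the names of the currently vulnerable packages:

    # /etc/crontab -- keep the ports tree and vulnerability database fresh
    0  4  *  *  *  root  portsnap cron && portsnap update
    30 4  *  *  *  root  /usr/local/sbin/update-vulnerable.sh

and the helper script itself:

    #!/bin/sh
    # update-vulnerable.sh (hypothetical helper): upgrade only those
    # packages that pkg audit currently flags as vulnerable.
    pkg audit -F >/dev/null 2>&1     # fetch the latest vulnerability database
    for p in $(pkg audit -q); do     # one vulnerable package name per line
        pkg upgrade -y "${p%-*}"     # strip the version suffix, then upgrade
    done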

Building world is also complicated, since user interaction is needed at several steps.
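For instance, the usual source-upgrade sequence (roughly as /usr/src/UPDATING describes it; mergemaster is the older alternative to etcupdate) has steps that stop and prompt the administrator:

    make buildworld
    make buildkernel
    make installkernel
    shutdown -r now     # reboot, ideally into single-user mode
    etcupdate -p        # pre-installworld config merge; may prompt on conflicts
    make installworld
    etcupdate           # merge the remaining /etc changes; may prompt again
    shutdown -r now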
 
Because an update might change something that requires human intervention. For example, many people want services to automatically restart after a port or package upgrade. But what if that new version requires new configuration settings to keep it from being, say, vulnerable to attack?
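One way to hedge against that, sketched here with Apache (the service name assumes the apache24 port), is to test the new configuration and only restart if it parses cleanly:

    apachectl configtest && service apache24 restart

But a configuration that parses is not necessarily a configuration that is still secure, which is exactly why a human should look at it.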
 
what if that new version requires new configuration settings to keep it from being, say, vulnerable to attack?

That's the first time I've heard about this; I had never thought about it. Can you give me an example case to see it in action?
 
I don't have a specific example for that, but I can point to the differences between Apache 2.2 and Apache 2.4, which changed enough that version 2.4 either complained about or would not even start with a 2.2 httpd.conf.
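The best-known of those changes is access control, where the 2.2 Order/Allow directives were replaced by Require in 2.4 (2.4 rejects the old syntax unless mod_access_compat is loaded):

    # Apache 2.2 httpd.conf
    <Directory "/usr/local/www">
        Order allow,deny
        Allow from all
    </Directory>

    # Apache 2.4 equivalent
    <Directory "/usr/local/www">
        Require all granted
    </Directory>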
 