Docker: the best way to handle apt-get package security updates inside docker containers

On my current server, I use automatic updates to apply security updates unattended. But I wonder what people would suggest for doing this inside Docker containers. I run several Docker containers, one for each service of my application. Should I set up automatic updates in each of them? Or should I update the images locally and then deploy the updated images? Any other ideas?

Does anyone have any experience with this in production?

+7
security docker ubuntu upgrade apt-get
2 answers

I run updates in place, as you do now. I currently only have staging containers and nothing in production. But there is no harm in updating each container: you may see some redundant network traffic if several containers are based on the same image, but it is otherwise harmless.

Rebuilding the container strikes me as unnecessary extra time and a more complex process.

WRT time: the build turnaround is added on top of the time the update itself takes, so in that sense it is "extra" time. And if you have processes running in your container, they need to be restarted.

WRT complexity: on the one hand, you just run updates with apt. On the other hand, you are essentially acting as an integration server: the more steps, the more that can go wrong.
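To make the in-place approach concrete, here is a minimal sketch of running apt updates inside already-running containers. The container names and the assumption of a Debian/Ubuntu base image are hypothetical; adapt them to your setup.

```shell
#!/bin/sh
# Sketch: apply security updates in place inside running containers.
# Container names ("web", "worker") are placeholders for your own services.
for c in web worker; do
    docker exec "$c" sh -c \
        'apt-get update && apt-get -y upgrade && rm -rf /var/lib/apt/lists/*'
done
```

Cleaning up `/var/lib/apt/lists` afterwards avoids leaving stale package indexes in the container's writable layer.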

In addition, updating in place does not break the "golden image" idea, because the image remains easily reproducible.

And finally, since the kernel is never updated inside a container (containers share the host's kernel), you will never need to reboot the container.

+2

I would rebuild the container. Containers are usually oriented towards running a single application, and it makes little sense to update the supporting filesystem and all the included but unused/unexposed applications in place.

With the data in a separate volume, you can create a script that rebuilds the image and restarts the container. The advantage is that any new container started from that image, or the image pulled onto another server, already has all the fixes.
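A minimal sketch of such a rebuild-and-restart script, assuming a Dockerfile in the current directory; the image name (`myapp`), container name, and volume mount are placeholders:

```shell
#!/bin/sh
# Sketch: rebuild the image with fresh packages, then recreate the container.
# "myapp" and the /data volume path are hypothetical names for illustration.
set -e
docker build --pull -t myapp:latest .       # --pull refreshes the base image first
docker stop myapp && docker rm myapp        # data survives in the named volume
docker run -d --name myapp -v myapp-data:/data myapp:latest
```

Using `--pull` (and optionally `--no-cache`) matters here: without it, a cached `apt-get upgrade` layer can silently skip the new security fixes.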

+2
