Is there a better way to monitor log files? (Linux / Python)

I am trying to monitor the log files that some processes write on Linux (to build a shared log file where entries from all of them are grouped by the time they occur). My current plan is to open each file being logged, poll with inotify (or a wrapper around it), and then check whether more data can be read from the file.

Is there a better way to do this? Perhaps some kind of library that watches for reads / changes in the files being monitored?

+5
3 answers

Why isn't `tail -f` enough? You can use popen and a pipe to handle its output from Python.
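A minimal sketch of that idea, delegating the actual following to `tail` via `subprocess.Popen` (the `follow` name is illustrative, not from the original answer):

```python
import subprocess

def follow(path):
    """Yield lines from *path* by delegating to `tail`.

    `-n +1` makes tail start from the beginning of the file; drop it to
    get only the last few existing lines plus anything appended later.
    `-F` keeps following across log rotation.
    """
    proc = subprocess.Popen(
        ["tail", "-n", "+1", "-F", path],
        stdout=subprocess.PIPE,
        text=True,
    )
    try:
        for line in proc.stdout:
            yield line.rstrip("\n")
    finally:
        # terminate tail when the caller stops iterating
        proc.terminate()
        proc.wait()
```

Since `tail` blocks until new data arrives, iterating over the generator blocks too, which is usually what you want in a dedicated follower thread or process.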

+3

Generator Tricks for Systems Programmers shows how to use Python generators to solve exactly this type of problem, in particular monitoring (large) log files. I recommend reading it.
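A sketch of the generator approach from that presentation, adapted here (the function name and polling interval are my own choices):

```python
import time

def follow(fobj, poll_interval=0.1):
    """Yield lines appended to an open file, tail -f style.

    Reads from the file object's current position; call
    fobj.seek(0, 2) first if you only want lines appended
    after you start following.
    """
    while True:
        line = fobj.readline()
        if not line:
            # no complete new line yet; wait briefly and retry
            time.sleep(poll_interval)
            continue
        yield line
```

Because it is a generator, it composes naturally with filters and consumers, e.g. `(line for line in follow(f) if "ERROR" in line)`.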

+1

If you do it yourself, you can do something like this: when you detect a file modification, get the file size. If it is larger than last time, seek to the previous "last" position (i.e., the previous size) and read from there.
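A sketch of that size-tracking scheme, assuming the caller remembers the last size between calls (names here are illustrative):

```python
import os

def read_new_data(path, last_size):
    """Return (new_data, new_size) for data appended since last_size.

    Compares the current size with the previously recorded one and,
    if the file grew, seeks to the old size and reads from there.
    """
    size = os.path.getsize(path)
    if size < last_size:
        # file was truncated or rotated; start from the beginning again
        last_size = 0
    if size == last_size:
        return "", last_size        # nothing new
    with open(path, "r") as f:
        f.seek(last_size)           # jump to the previous "last" position
        return f.read(size - last_size), size
```

The truncation check matters in practice: on log rotation the size shrinks, and seeking to the old position in the fresh file would skip or garble data.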

0
