Thanks for the quick answer
This approach can only be performed offline. Is it possible to use the same mechanism, but keep the concatenated file up to date alongside all the others at all times?
Regards,
Vincent de RIBOU région Ouest - France belzo2005-dolphin_at_yahoo.fr
On Friday, June 3, 2016 at 3:22 PM, Laurent Bercot <ska-supervision_at_skarnet.org> wrote:
On 03/06/2016 14:53, Vincent de RIBOU via supervision wrote:
> what I am looking for is the way to get all
> processes outputs done by s6-log (or other loggers) to a unique
> file.
I'm not sure what you want exactly, could you please elaborate?
If what you need is to combine the contents of several log directories
into one single file, it's easy:
- make sure your log files are all timestamped with the same format
(i.e. all TAI64N or all ISO 8601).
- concatenate them all and sort the result.
The resulting file will be a log file sorted chronologically.
for logdir in $(cat logdir_list) ; do (cd "$logdir" && cat *.s *.u current) ; done | sort > logfile
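(A quick illustration, not part of Laurent's message, of why a plain `sort` is enough for TAI64N: the stamps are fixed-width `@` plus hex digits, so byte-wise lexicographic order coincides with chronological order. The sample stamps below are made up for the demonstration.)

```shell
# Two log lines with TAI64N stamps, deliberately given out of order.
# Because TAI64N labels are fixed-width, sorting the lines as plain
# bytes (LC_ALL=C for a locale-independent comparison) also sorts
# them chronologically.
printf '@4000000057517b2a1c3b5f44 second event\n@4000000057517b291c3b5f44 first event\n' \
  | LC_ALL=C sort
```

The "first event" line comes out before the "second event" line, since its stamp compares lower byte by byte.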
It's generally much easier to gather than to scatter, which is why the
logging chain model is superior to the syslogd model.
--
Laurent
Received on Fri Jun 03 2016 - 13:32:10 UTC