Logging in a web server context
Hi folks!
Is it possible, when using s6-svscan/s6-supervise, to arrange for a
daemon's stdout to be sent to one logdir and its stderr to another?
Or must I use s6-rc to achieve this?
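To make the question concrete, the only arrangement I have come up
with so far under plain s6 is a second supervised s6-log service that
reads the daemon's stderr from a fifo, roughly like this (completely
untested; all paths, service names and s6-log parameters below are
placeholders I made up):

  /run/service/nginx/run:
    #!/bin/sh
    # stdout is piped to ./log by s6-svscan; send stderr into a fifo
    exec 2>/run/nginx.stderr
    exec nginx -g 'daemon off;'

  /run/service/nginx/log/run:
    #!/bin/sh
    # access log, fed from nginx's stdout
    exec s6-log n20 s1000000 T /var/log/nginx/access

  /run/service/nginx-errors/run:
    #!/bin/sh
    # error log, fed from the fifo carrying nginx's stderr
    exec s6-log n20 s1000000 T /var/log/nginx/error </run/nginx.stderr

The fifo would have to be created beforehand with mkfifo, and I'm not
sure how well the blocking fifo opens interact with supervision and
restarts, which is part of why I'm asking.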
My premises are the following:
I want to run nginx under s6, on a mainstream Linux distro such as
Debian/Ubuntu/CentOS. My current plan is to use s6 only for nginx,
not for any other daemons.
I want to collect the web server access logs in order to generate
visitor statistics[1].
One tricky aspect of logging that is specific to web servers is that
they emit two different categories of messages:
a) Errors and warnings
b) Info about page requests[2]
For many other kinds of daemons, one only needs to arrange for logging
stderr and then call it a day. With web servers it’s not quite so
simple.
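In nginx's case I believe the two streams can at least be separated
at the source by pointing the two logging directives at the
corresponding file descriptors, along these lines (untested, and
based only on my reading of the nginx docs):

  # nginx.conf (fragment)
  daemon off;                        # stay in the foreground for the supervisor
  error_log stderr warn;             # errors and warnings -> fd 2

  http {
      access_log /dev/stdout combined;   # request lines like [2] -> fd 1
      # ... rest of the http block ...
  }

That still leaves the supervision-side question of where each
descriptor ends up, which is what I'm stuck on.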
I would be very interested to hear what others on the list think
about this scenario.
The reason I'm wary of s6-rc is that I suspect it might not play so
well with the distros I mentioned earlier. If that concern is
unfounded, please let me know.
[1] An alternative to collecting access logs would be to use Google
Analytics instead, but I want to avoid that since I don't feel
comfortable handing my visitors' personal data over to Google.
[2] I.e. lines like this '10.131.214.101 - - [20/Nov/2017:18:52:17
+0000] "GET / HTTP/1.1" 401 188 "-" "Mozilla/5.0 (X11; Ubuntu; Linux
x86_64; rv:47.0) Gecko/20100101 Firefox/47.0"'