Django Logstash filter

Hi all,
Is there an "official" or de-facto standard filter for Logstash that can be used with Django?

From what I know it should extend SocketHandler (TCP) or DatagramHandler (UDP), but I'd rather not build something from scratch just to make it work.
I've found https://github.com/vklochan/python-logstash, but it doesn't seem to have been updated in a couple of years.

Is there an "official" or de-facto standard filter for Logstash that can be used with Django?

No. I suggest you use a formatter (or layout or whatever it's called) that formats the log record object as a JSON string. That's trivial to parse on the Logstash side and takes care of multiline messages.
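
A minimal sketch of such a formatter (the field names here are illustrative, not any Logstash convention):

```python
import json
import logging


class JSONFormatter(logging.Formatter):
    """Render each log record as a single-line JSON object."""

    def format(self, record):
        payload = {
            "timestamp": self.formatTime(record),
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
        }
        if record.exc_info:
            # Fold the traceback into the same event, so a multiline
            # stack trace arrives as one JSON object, not many lines.
            payload["exception"] = self.formatException(record.exc_info)
        return json.dumps(payload)
```

One record per line is what makes the Logstash side trivial: a JSON codec or filter can decode each line, and tracebacks never get split across events.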

From what I know it should extend SocketHandler (TCP) or DatagramHandler (UDP), but I'd rather not build something from scratch just to make it work.

"It" being the Logstash filter? I'm confused.

I don't think you should send log messages directly over the network. What happens if the Logstash server is unavailable?

OK for the JSON formatting.

Regarding sending logs: yes, if the server is down, everything is lost. What would be a robust approach?
("It" referred to the system that logs the entries.)

I suggest streaming the logs to disk and using Filebeat to ship them to Logstash. Sure, you can run out of disk space, but that's typically easier to handle than network and Logstash host outages.
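
On the Django side that could look something like this (a sketch; `myproject.logging.JSONFormatter` and the log path are hypothetical placeholders):

```python
# settings.py (sketch): stream JSON lines to a file for Filebeat to tail.
LOGGING = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        # "()" tells dictConfig to use a custom formatter factory;
        # the dotted path is a placeholder for wherever the class lives.
        "json": {"()": "myproject.logging.JSONFormatter"},
    },
    "handlers": {
        "file": {
            # WatchedFileHandler reopens the file if it is rotated
            # underneath the process, e.g. by logrotate.
            "class": "logging.handlers.WatchedFileHandler",
            "filename": "/var/log/myapp/app.json.log",
            "formatter": "json",
        },
    },
    "root": {"handlers": ["file"], "level": "INFO"},
}
```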

The main problem is that our services are Docker containers running in Kubernetes, so this would require installing Filebeat and keeping a log file in every container. Wouldn't that run out of space easily?

Is this the standard approach even for containers?

Let the process stream its logs to stdout so that Docker picks them up and writes to disk. You can e.g. use Filebeat to ship the container logs.
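
With that approach the Django config shrinks to a plain StreamHandler; roughly (again assuming the hypothetical JSONFormatter from above):

```python
# settings.py (sketch): write JSON lines to stdout; Docker captures them.
import sys

LOGGING = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "json": {"()": "myproject.logging.JSONFormatter"},
    },
    "handlers": {
        "stdout": {
            "class": "logging.StreamHandler",
            "stream": sys.stdout,  # StreamHandler defaults to stderr
            "formatter": "json",
        },
    },
    "root": {"handlers": ["stdout"], "level": "INFO"},
}
```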

This one seems to take the whole Docker log output and stream it to ELK. Is that the idea?

Now, the Docker log contains the JSON logs that I produce plus other error/debug strings. Is there a way to separate these logs and decide what goes to ELK?
Is there a way to tell Filebeat to read a .log file that is inside the container, or is that not possible with Kubernetes/Docker?

Thanks

This one seems to take the whole Docker log output and stream it to ELK.

Yes.

Now, the Docker log contains the JSON logs that I produce plus other error/debug strings.

The Docker log contains the stdout and stderr of the process. I believe Docker wraps those lines into JSON objects.
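
With the default `json-file` logging driver, each captured line ends up wrapped roughly like this (a sketch with a made-up sample line):

```python
import json

# A line as the default Docker json-file driver would write it (sample);
# the application's own JSON log ends up as a string inside "log".
raw = r'{"log":"{\"level\": \"INFO\", \"message\": \"hello\"}\n","stream":"stdout","time":"2024-01-01T12:00:00.000000000Z"}'

outer = json.loads(raw)            # Docker's wrapper object
inner = json.loads(outer["log"])   # the application's own JSON line
print(inner["message"])            # -> hello
```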

Is there a way to separate these logs and decide what goes to ELK?

Filebeat has limited support for filtering out messages; see "Filter and enhance data with processors" in the Filebeat Reference [8.11].
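
For example, a `drop_event` processor with a condition could discard the non-JSON lines before they reach Logstash. A sketch based on that docs page (untested):

```yaml
# filebeat.yml (sketch): drop any event whose message doesn't look
# like a JSON object, keeping only the application's JSON log lines.
processors:
  - drop_event:
      when:
        not:
          regexp:
            message: '^\{'
```

Anything heavier than that (reshaping fields, routing) is usually better done in the Logstash pipeline itself.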
