Logstash Multiline Codec with Filebeat is a must


(Sam Flint) #1

I have to pump many logs to Logstash across our stack. Some examples: container logs coming in via GELF, binary logs coming from *.log files, and application logs that no longer have the log4j input for Logstash, so they are written to log files.

Current inputs are:

input {
  stdin { }

  gelf {
    host => "0.0.0.0"
    port => 12201
  }

  udp {
    codec => json
    port => 5001
  }

  tcp {
    port => 5000
    codec => json
  }

  beats {
    port => 5044
  }

  http {
    port => 8000
    type => "elb-healthcheck"
  }
}

This allows me to send logs to Logstash from anywhere. The issue I am currently facing is the Java multiline problem with stack traces. I was going to add the multiline filter, but according to "Logstash 5.1.1 - Couldn't find any filter plugin named 'multiline'" it has been deprecated in favor of the codec.

https://www.elastic.co/guide/en/logstash/5.4/multiline.html

But according to the documentation: "If you are using a Logstash input plugin that supports multiple hosts, such as the beats input plugin, you should not use the multiline codec to handle multiline events. Doing so may result in the mixing of streams and corrupted event data. In this situation, you need to handle multiline events before sending the event data to Logstash."

So I CAN'T use it with the Filebeat input.

Here lies my issue. I have no choice but to use Filebeat to send the other log files to Logstash to get them into Elasticsearch, since the log4j input has been removed from Logstash and Filebeat is recommended instead.

I do see there is a multiline option for Filebeat. Is this what I should use?

Please advise on the best method to collect all logs in this manner and collapse multiline events.

Thanks for your time.


(Christian Dahlqvist) #2

Always perform multiline processing as close to the source, in this case in Filebeat.
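For Filebeat 5.x this is done with the multiline settings on the prospector. A minimal sketch, assuming Java stack traces where continuation lines start with whitespace (the path and pattern here are illustrative assumptions, not from the thread):

```yaml
filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/app/*.log          # hypothetical path; adjust to your logs
  # Lines beginning with whitespace (e.g. "    at com.example...")
  # are appended to the previous line, so a stack trace becomes one event.
  multiline.pattern: '^[[:space:]]'
  multiline.negate: false
  multiline.match: after

output.logstash:
  hosts: ["logstash:5044"]        # hypothetical host; matches the beats input on 5044
```

With this in place the events arriving on the beats input are already collapsed, so no multiline codec is needed on the Logstash side.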


(Sam Flint) #3

Sounds good. So I can use a combination of the Filebeat multiline option (for logs shipped to the Logstash beats input) along with the multiline codec in Logstash for the GELF logs?

So Filebeat logs would be collapsed before reaching Logstash, and anything coming in on GELF would be collapsed with the multiline codec?


(Christian Dahlqvist) #4

The multiline filter plugin has been deprecated as it requires a single processing thread, which leads to poor performance. If you can not fix this at the source, I would recommend defining multiple pipelines so you do not need to impose this limitation on all flows.


(Sam Flint) #5

So this https://www.elastic.co/guide/en/logstash/current/plugins-codecs-multiline.html has been deprecated within 4 months?

So you are suggesting 1 pipeline for filebeat and one for gelf inputs?

I have posted another thread about the multiline codec not working.

I am not familiar with multiple pipelines, but I see from here (https://www.elastic.co/guide/en/logstash/current/multiple-pipelines.html) that I can just define them in pipelines.yml and point to 2 different configurations.

I can not collapse multiline events at the source, as they are just stderr and stdout streams sent to Logstash using the GELF protocol.

If you are suggesting 1 pipeline for Filebeat and 1 for GELF, that makes a little sense. I am still confused about the multiline codec being deprecated?
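For reference, a two-pipeline setup like the one discussed here can be sketched in pipelines.yml as follows (pipeline ids and config paths are illustrative assumptions):

```yaml
# pipelines.yml - each pipeline runs independently, so the
# single-threaded multiline codec only constrains the gelf flow.
- pipeline.id: beats
  path.config: "/etc/logstash/conf.d/beats.conf"   # hypothetical path
- pipeline.id: gelf
  path.config: "/etc/logstash/conf.d/gelf.conf"    # hypothetical path
```

Each entry points at its own input/filter/output configuration, so the beats and GELF flows no longer share one pipeline.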


(Christian Dahlqvist) #6

The codec has not been deprecated, and is the recommended option. There used to also be a multiline filter plugin which was deprecated for the reasons I mentioned. Sorry for the confusion.


(Sam Flint) #7

Thanks so much for the help. I have implemented this approach with 2 inputs: one gelf input with the multiline codec, and one beats input with no codec. I have created another topic because it seems the codec isn't working as expected.

I believe I am on the right path and will have a working model when the other questions about the codec are answered in the other topic/post.
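The gelf side of that setup can be sketched like this, assuming Java stack traces whose continuation lines start with whitespace (the pattern is an assumption, and the documentation's caveat about mixing streams from multiple hosts still applies to any multi-source input):

```ruby
input {
  gelf {
    host => "0.0.0.0"
    port => 12201
    # Collapse stack traces: any line starting with whitespace is
    # joined onto the previous event.
    codec => multiline {
      pattern => "^[[:space:]]"
      what => "previous"
    }
  }

  beats {
    port => 5044   # Filebeat has already collapsed multiline events
  }
}
```

Because Filebeat handles multiline before shipping, only the gelf input needs the codec.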


(system) #8

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.