Multiline handling for log4net

Hello there,

I am looking at ways of centralizing my logs, and the ELK stack is very impressive. While evaluating it I've gotten stuck on the handling of multiline logs. I've googled and read posts going back a few years but haven't been able to piece the puzzle together: some recommend the multiline settings in Filebeat, others say Logstash's multiline codec is the way to go. My current pipeline is Filebeat -> Logstash -> Elasticsearch -> Kibana, all components run on Windows, and the configuration can be found below.

With this setup I've been able to push all three example events below to Elasticsearch, but the multiline event is missing a lot of metadata. The single-line events are parsed nicely: the timestamp is replaced, _type=log4net is set, the log level is separated out, the Beats metadata is there, etc. The multiline event is missing all of that when I look at it in Kibana. It only has _type=logs, tags=multiline, no replaced timestamp, and the full message left intact.
The questions I cannot find the answers to are:

  • What is causing the loss of metadata in my filter?
  • The single-line events are processed very quickly in Logstash, but the multiline event is processed 30-60 seconds after Logstash receives it. What could cause this?
  • Would I need multiline handling in both Filebeat and Logstash with this pipeline, or should one of them suffice? (I've sketched a Filebeat-side variant after my filebeat.yml below.)

Thanks!

My log4net pattern:

  <layout type="log4net.Layout.PatternLayout">
    <conversionPattern value="%date [%thread] %level  %message %exception [%logger]%newline"/>
  </layout>

This produces logs containing both single-line and multiline events. Here is an example of the output:

2017-03-07 15:40:59,539 [1] ERROR Error message [C:\Log4NetConsole\Program.cs]
2017-03-07 15:40:59,551 [1] INFO Info message [C:\Log4NetConsole\Program.cs]
2017-03-07 15:40:59,552 [1] ERROR System.InvalidOperationException: Logfile cannot be read-only
at Log4NetConsole.Program.Main(String args) in C:\Log4NetConsole\Program.cs:line 23 [C:\Log4NetConsole\Program.cs]

Filebeat.yml:

filebeat.prospectors:
- input_type: log
  paths:
    - c:\logs\*
  document_type: log4net

output.logstash:
  hosts: ["localhost:5044"]
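In case moving the joining of lines into Filebeat is the recommended route, this is the variant I'm considering instead of the codec on the beats input. It's only a sketch based on the Filebeat 5.x multiline options (multiline.pattern, multiline.negate, multiline.match), and since Filebeat doesn't understand grok patterns I've used a plain regex for the timestamp; I haven't verified it yet:

filebeat.prospectors:
- input_type: log
  paths:
    - c:\logs\*
  document_type: log4net
  # Join any line that does NOT start with a timestamp onto the previous line,
  # so a stack trace is shipped to Logstash as a single event.
  multiline.pattern: '^[0-9]{4}-[0-9]{2}-[0-9]{2}'
  multiline.negate: true
  multiline.match: after

output.logstash:
  hosts: ["localhost:5044"]

If I went this route I would drop the multiline codec from the beats input below entirely.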

Logstash log4net.conf:

input {
  beats {
    port => 5044
    codec => multiline {
      pattern => "^%{TIMESTAMP_ISO8601}"
      negate => true
      what => previous
    }
  }
}

filter {
  if [type] == "log4net" {
    grok {
      match => [ "message", "%{TIMESTAMP_ISO8601:timestamp} \[%{NUMBER:threadId}\] %{LOGLEVEL:level} %{GREEDYDATA:tempMessage} \[%{DATA:logger}\]" ]
      overwrite => ["message","timestamp"]
    }
    date {
      match => ["timestamp","yyyy-MM-dd HH:mm:ss,SSS"]
      remove_field => ["timestamp"]
    }
    mutate {
      replace => [ "message", "%{tempMessage}" ]
      remove_field => [ "tempMessage" ]
    }
  }
}

output {
  elasticsearch {
    hosts => [ "localhost:9200" ]
    index => "logstash1-%{+YYYY.MM.dd}"
    template_overwrite => true
  }
  stdout { codec => rubydebug }
}
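One thing I'm also wondering about for the first question: even when the type condition matches, %{GREEDYDATA} won't cross newlines by default, so the grok might fail on a joined stack-trace event anyway. This is a sketch of the grok variant I'm planning to try, with an (?m) prefix so the pattern can match across newlines; I haven't confirmed it addresses the metadata loss:

filter {
  if [type] == "log4net" {
    grok {
      # (?m) lets . (and therefore GREEDYDATA) match across the newlines
      # inside an event that the multiline handling has joined together.
      match => [ "message", "(?m)%{TIMESTAMP_ISO8601:timestamp} \[%{NUMBER:threadId}\] %{LOGLEVEL:level} %{GREEDYDATA:tempMessage} \[%{DATA:logger}\]" ]
      overwrite => ["message","timestamp"]
    }
  }
}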
