Moving from Logstash to Filebeat

Hi.
We're moving from Logstash to Filebeat as part of our transition to containers.

Our old Logstash configuration was:

input {
  file {
    path => "/servicename/_logs/servicename.log"
    codec => multiline {
      pattern => "(^[a-zA-Z.]+(?:Error|Exception).+)|(^\s+at .+)|(^\s+... \d+ more)|(^\t+)|(^\s*Caused by:.+)"
      what => "previous"
    }
  }
}

filter {
  if "multiline" not in [tags] {
    json {
      source => "message"
      remove_field => ["[request][body]", "[response][body][response][items]"]
    }
  }
  else {
    grok {
      pattern_definitions => { "APPJSON" => "{.*}" }
      match => { "message" => "%{APPJSON:appjson} %{GREEDYDATA:stack_trace}" }
      remove_field => ["message"]
    }
    json {
      source => "appjson"
      remove_field => ["appjson"]
    }
  }
}

output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]
    index => "logstash-servicename-%{+YYYY.MM.dd}"
    document_type => "logs"
  }
}

To overcome the JSON parsing issue on the Filebeat side we used the decode_json_fields processor:

  - decode_json_fields:
      fields: ["log"]
      target: ""

which worked fine.
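
For context, here is a simplified sketch of where that processor sits in our filebeat.yml (the input path is a placeholder for our container setup, not the exact config):

filebeat.inputs:
  - type: log
    paths:
      - /var/lib/docker/containers/*/*.log   # placeholder; real paths differ
    # each line written by the docker json-file driver is JSON with the app output in a "log" field

processors:
  - decode_json_fields:
      fields: ["log"]
      target: ""

output.elasticsearch:
  hosts: ["elasticsearch:9200"]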

We now have a problem with Java exceptions that are not digested well by Elasticsearch, causing many events to fail to index, and we get these errors in Filebeat:

2019-11-12T07:57:29.815Z ERROR pipeline/output.go:92 Failed to publish events: temporary bulk send failure
2019-11-12T07:57:29.826Z INFO elasticsearch/client.go:690 Connected to Elasticsearch version 6.3.2
2019-11-12T07:57:29.863Z INFO template/load.go:73 Template already exists and will not be overwritten.
2019-11-12T07:57:30.552Z INFO [monitoring] log/log.go:124 Non-zero metrics in the last 30s

and we also see some weird delays.

We need a way to handle these exceptions like we did before. We've tried multiline processing:

multiline:
  pattern: '(^[a-zA-Z.]+(?:Error|Exception).+)|(^\s+at .+)|(^\s+... \d+ more)|(^\t+)|(^\s*Caused by:.+)'
  negate: false
  match: after

but without success. (Is there a need for an ingest node here? If so, what would its configuration look like?)

Any help will be much appreciated!

Hi @yavidor,

You cannot do multiline after using decode_json_fields. Perhaps you can try using the json parameters in the log input? See https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-input-log.html#filebeat-input-log-config-json for more details.
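
Something along these lines (an untested sketch; the path and the "log" key are assumptions about your container logging setup):

filebeat.inputs:
  - type: log
    paths:
      - /var/lib/docker/containers/*/*.log    # placeholder path
    json.keys_under_root: true
    json.add_error_key: true
    json.message_key: log                     # assumption: the docker json-file driver puts the app line in "log"
    multiline.pattern: '(^[a-zA-Z.]+(?:Error|Exception).+)|(^\s+at .+)|(^\s+... \d+ more)|(^\t+)|(^\s*Caused by:.+)'
    multiline.negate: false
    multiline.match: after

As far as I understand, with json.message_key set Filebeat decodes the JSON first and then applies the multiline pattern to the value of that key, so the stack trace lines get merged into one event before it is sent.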

Best regards
