Moving from ELK to EFK

Hi! We're moving our infra to containers, and we would like to use Filebeat to send logs directly to Elasticsearch instead of going through Logstash.
In the process we may lose our Logstash filtering and parsing capabilities (grok is not supported in Filebeat).
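For context, the direct-to-Elasticsearch setup we have in mind is roughly this (a sketch, not our final config; the hosts and index name are placeholders, and I believe overriding the index also needs matching setup.template settings):

filebeat.prospectors:
- type: log
  paths:
    - /servicename/_logs/servicename.log

# Placeholder names; a custom index seems to require a matching template name/pattern:
setup.template.name: "servicename"
setup.template.pattern: "servicename-*"

output.elasticsearch:
  hosts: ["elasticsearch:9200"]
  index: "servicename-%{+yyyy.MM.dd}"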

This is our current logstash configuration:

input {
  file {
    path => "/servicename/_logs/servicename.log"
    codec => multiline {
      pattern => "(^[a-zA-Z.]+(?:Error|Exception).+)|(^\s+at .+)|(^\s+... \d+ more)|(^\t+)|(^\s*Caused by:.+)"
      what => "previous"
    }
  }
}

filter {
  if "multiline" not in [tags] {
    json {
      source => "message"
      remove_field => ["[request][body]","[response][body][response][items]"]
    }
  }
  else {
    grok {
      pattern_definitions => { "APPJSON" => "{.*}" }
      match => { "message" => "%{APPJSON:appjson} %{GREEDYDATA:stack_trace}" }
      remove_field => ["message"]
    }
    json {
      source => "appjson"
      remove_field => ["appjson"]
    }
  }
}

output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]
    index => "logstash-servicename-%{+YYYY.MM.dd}"
    document_type => "logs"
  }
}

We don't use any fancy filtering or parsing capabilities.
Can we apply these filters somehow in Filebeat?

Thanks!

You can use ingest node pipelines to parse the data in Elasticsearch.
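For example, something along these lines should cover the grok + JSON steps from your Logstash filter (an untested sketch; the pipeline id servicename-logs is just a placeholder):

PUT _ingest/pipeline/servicename-logs
{
  "description": "Sketch: parse service JSON logs and split off the stack trace",
  "processors": [
    {
      "grok": {
        "field": "message",
        "pattern_definitions": { "APPJSON": "{.*}" },
        "patterns": ["%{APPJSON:appjson} %{GREEDYDATA:stack_trace}"]
      }
    },
    {
      "json": {
        "field": "appjson",
        "add_to_root": true
      }
    },
    {
      "remove": {
        "field": ["appjson", "message"],
        "ignore_missing": true
      }
    }
  ]
}

Then point Filebeat at it with output.elasticsearch.pipeline: "servicename-logs". You may want to set ignore_failure on the grok processor for events that don't carry a stack trace.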

We decided to go with JSON parsing in Filebeat and it worked great. Now I've tried adding the multiline handling for Java stack traces. It worked fine while we were using Logstash, but now I've tried everything and the result won't change: I get a new document for every line in the stack trace.

filebeat.config:
  prospectors:
    # Mounted `filebeat-prospectors` configmap:
    path: ${path.config}/prospectors.d/*.yml
    # Reload prospector configs as they change:
    reload.enabled: false
    json.add_error_key: true
    json.message_key: log
    json.keys_under_root: true
    multiline:
      pattern: (^[a-zA-Z.]+(?:Error|Exception).+)|(^\s+at .+)|(^\s+... \d+ more)|(^\t+)|(^\s*Caused by:.+)
      negate: false
      match: after
  modules:
    path: ${path.config}/modules.d/*.yml
    # Reload module configs as they change:
    reload.enabled: false

The lines are in the "log" key:

  "_source": {
    "@timestamp": "2019-10-29T09:04:43.704Z",
    "offset": 2018941,
    "log": "\tat rx.exceptions.Exceptions.propagate(Exceptions.java:57)",
    "prospector": {
      "type": "log"
    },

Do you have any idea what is happening? I made sure that the \t is also in the pattern. Should I tell multiline to look at the "log" field like I told the JSON parser? Is there a way to do so?
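In case it helps, this is the prospector definition I'm going to try next inside the mounted prospectors.d/ files, since json and multiline look like per-prospector options rather than filebeat.config settings (untested sketch; the path is a placeholder). From the docs I understand that with json.message_key set, JSON decoding runs first and multiline is then applied to that key's contents:

- type: log
  paths:
    # placeholder; ours comes from the container log mount
    - /var/lib/docker/containers/*/*.log
  json.keys_under_root: true
  json.add_error_key: true
  json.message_key: log
  multiline:
    pattern: '(^[a-zA-Z.]+(?:Error|Exception).+)|(^\s+at .+)|(^\s+... \d+ more)|(^\t+)|(^\s*Caused by:.+)'
    negate: false
    match: after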
