Logstash equivalents with ingest pipeline


(Jim) #1

We are researching our ability to remove Logstash and instead have Filebeats send directly to Elasticsearch's new ingest pipelines.

I am trying to take the following Logstash filter and create an ingest pipeline.

grok {
  match => ["message", "^:%{DATESTAMP:timestamp} %{DATA:thread} %{LOGLEVEL:log_level} +%{DATA:log_name} (%{DATA:agent_id} )?session_id:(%{DATA:sid})? time_id:(%{DATA:time_id})? referrer:(%{DATA:referrer})? %{GREEDYDATA:msg}$"]
  overwrite => ["timestamp"]
  add_tag => ["application_log", "%{type}"]
}
if [msg] =~ /.*Publishing following data to SNS topic.*/ {
  grok { match => ["msg", "(?<sns_queue>Publishing[^{]+) %{GREEDYDATA:snsMsg}"] }
  json {
    source => "snsMsg"
    target => "parsedMsg"
  }
}
if [msg] =~ /^.+(Exception|Error).+$/ {
  mutate {
    add_tag => ["error"]
  }
}
mutate {
  remove_field => "snsMsg"
}
date {
  match => ["timestamp", "MM-dd-yyyy HH:mm:ss.SSS"]
}

So far all I have is:

{
    "description": "Application Logs",
    "processors": [
        {
            "grok": {
                "field": "message",
                "patterns": ["^:%{DATESTAMP:timestamp} %{DATA:thread} %{LOGLEVEL:log_level} +%{DATA:log_name} (%{DATA:agent_id} )?session_id:(%{DATA:sid})? time_id:(%{DATA:time_id})? referrer:(%{DATA:referrer})? %{GREEDYDATA:msg}$"]
            }
        }
    ]
}

I have no clue how to handle the overwrite, add_tag, and add_field items. Can the append processor be used for the add_tag?

A point in the right direction would be very much appreciated.
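For the json, date, and remove parts, my best guess (untested) is something like the pipeline below. The "if" conditions assume a version of Elasticsearch whose processors support per-processor conditionals written in Painless; on older versions the conditional steps might need a script processor or a separate pipeline instead. I'm also assuming the ingest grok processor overwrites existing fields by default, which would make the Logstash overwrite option unnecessary:

```json
{
    "description": "Application Logs",
    "processors": [
        {
            "grok": {
                "field": "message",
                "patterns": ["^:%{DATESTAMP:timestamp} %{DATA:thread} %{LOGLEVEL:log_level} +%{DATA:log_name} (%{DATA:agent_id} )?session_id:(%{DATA:sid})? time_id:(%{DATA:time_id})? referrer:(%{DATA:referrer})? %{GREEDYDATA:msg}$"]
            }
        },
        {
            "grok": {
                "field": "msg",
                "patterns": ["(?<sns_queue>Publishing[^{]+) %{GREEDYDATA:snsMsg}"],
                "if": "ctx.msg != null && ctx.msg.contains('Publishing following data to SNS topic')"
            }
        },
        {
            "json": {
                "field": "snsMsg",
                "target_field": "parsedMsg",
                "if": "ctx.snsMsg != null"
            }
        },
        {
            "date": {
                "field": "timestamp",
                "formats": ["MM-dd-yyyy HH:mm:ss.SSS"]
            }
        },
        {
            "remove": {
                "field": "snsMsg",
                "ignore_failure": true
            }
        }
    ]
}
```

That still leaves the add_tag handling, which is the part I'm least sure about.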


(Tal Levy) #2

Do you mind providing a sample input document to your pipeline?

Also, a note about tagging: ingest does not have a notion of tags. If you wish, tags can be added to a field called "tags", and the set processor can help you achieve this.
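For example (untested, and assuming a version that also has the append processor, which is a better fit than set since "tags" is an array), your add_tag lines might translate to something like:

```json
{
    "append": {
        "field": "tags",
        "value": ["application_log", "{{type}}"]
    }
}
```

I believe template snippets like {{type}} are resolved against the document's fields here, but double-check that against the docs for your version. The conditional "error" tag would be another append processor, gated either by a per-processor condition (on versions that support it) or by a script processor.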


(system) #3

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.