Conditions and filters not applying

Greetings All,

I am new to ELK and spent all day yesterday stumbling over how to filter out logs that meet a certain condition. Any insight into how I should approach this is much appreciated.

**Failed attempt #1**

input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match => { "message" => "%{SYSLOGBASE2}" }
  }
# Match type and level.
# Drop warning events
  if [logsource] == "WARN" {
    drop { }
  }
# Drop information events
  if [logsource] == "INFO" {
    drop { }
  }
}

output {
  elasticsearch {
    hosts => "elasticsearch1:9200"
    user => "elastic"
    password => "blah"
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}

**Failed attempt #2**

input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match => { "message" => "%{SYSLOGBASE2}" }
  }
}

output {
  stdout {
    codec => rubydebug
  }
  if [logsource] != "WARN" or "INFO" {
    elasticsearch {
      hosts => "elasticsearch1:9200"
      user => "elastic"
      password => "changeme"
      index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
      document_type => "%{[@metadata][type]}"
    }
  }
}

**ES record**

{
  "_index": "filebeat-2018.08.01",
  "_type": "log",
  "_id": "AWT2khPHLvjxV3lis6j4",
  "_version": 1,
  "_score": null,
  "_source": {
    "offset": 3403880,
    "input_type": "log",
    "timestamp8601": "2018-08-01 17:38:45.178",
    "source": "/local/mnt/logs/filebeat/apps/solr/solr-cpip-filebox.log",
    "message": "2018-08-01 17:38:45.178 INFO  (qtp401424608-19) [   x:cpip-filebox] o.a.s.u.p.LogUpdateProcessorFactory [cpip-filebox]  webapp=/solr-cpip-filebox path=/update params={waitSearcher=true&commit=true&softCommit=false&wt=javabin&version=2}{commit=} 0 1",
    "type": "log",
    "logsource": "INFO",
    "tags": [
      "beats_input_codec_plain_applied"
    ],
    "environment": "tst",
    "logtype": "application_log",
    "@timestamp": "2018-08-01T17:38:45.290Z",
    "@version": "1",
    "beat": {
      "name": "vdpsidxtst05",
      "hostname": "vdpsidxtst05",
      "version": "5.5.0"
    },
    "host": "vdpsidxtst05"
  },
  "fields": {
    "@timestamp": [
      1533145125290
    ]
  },
  "sort": [
    1533145125290
  ]
}

This attempt failed because != binds more tightly than or, effectively splitting the statement up like:

  if ([logsource] != "WARN") or ("INFO") {

And since the string "INFO" is non-empty, on its own it is "truthy", meaning every event makes it through the conditional expression.


What you were looking for was probably something like:

  if ([logsource] != "INFO") and ([logsource] != "WARN") { 

There is also the in operator (and its negation, not in), which works with arrays of two or more elements:

  if [logsource] not in ["INFO","WARN"] {
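
For reference, a minimal sketch of a complete filter that applies the same idea as a drop (reusing the SYSLOGBASE2 grok from the attempts above) could look like:

filter {
  grok {
    match => { "message" => "%{SYSLOGBASE2}" }
  }
  # Drop INFO and WARN events; everything else passes through
  # to the outputs unconditionally.
  if [logsource] in ["INFO", "WARN"] {
    drop { }
  }
}

This keeps the output section free of conditionals, which is often easier to maintain.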

Thank you for the guidance.

I updated the pipeline and the stdout looks fine on the console, but when I look at records in Kibana they are missing the SYSLOGBASE2 fields and are tagged with "_grokparsefailure".

I also noticed that the message payload is truncated for some reason.

{
  "_index": "filebeat-2018.08.01",
  "_type": "log",
  "_id": "AWT2z3d3ZgqwPpVJkw1i",
  "_version": 1,
  "_score": null,
  "_source": {
    "environment": "tst",
    "logtype": "application_log",
    "@timestamp": "2018-08-01T18:45:45.868Z",
    "offset": 911928,
    "@version": "1",
    "input_type": "log",
    "beat": {
      "name": "vdpsidxtst05",
      "hostname": "vdpsidxtst05",
      "version": "5.5.0"
    },
    "host": "vdpsidxtst05",
    "source": "/local/mnt/logs/filebeat/apps/solr/solr-qdisk-new.log",
    "message": "\tcommit{dir=NRTCachingDirectory(MMapDirectory@/prj/solr_master01/vdpsidxtst05/solr-qdisk-new/data/qdisk/data/index lockFactory=org.apache.lucene.store.NativeFSLockFactory@7d74fad9; maxCacheMB=48.0 maxMergeSizeMB=4.0),segFN=segments_4s3q,generation=223046}",
    "type": "log",
    "tags": [
      "beats_input_codec_plain_applied",
      "_grokparsefailure"
    ]
  },
  "fields": {
    "@timestamp": [
      1533149145868
    ]
  },
  "sort": [
    1533149145868
  ]
}

The _grokparsefailure tag indicates that the provided grok pattern failed to parse the input.

Since the message you pasted begins with a tab character (\t) and doesn't look at all like the given pattern, my guess would be that you are dealing with a multi-line message; you may need to configure Beats to recognise message continuations with the multiline configuration directives, which will allow it to send all the relevant lines of each message to Logstash as a single unit.
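
For Filebeat 5.x, those directives sit under the prospector and look roughly like this (a sketch only; the pattern has to be tuned so it identifies the first line of each event in your log format):

multiline:
  # Example: assume each new event begins with a date
  pattern: '^\d{4}-\d{2}-\d{2}'
  # Lines that do NOT match the pattern...
  negate: true
  # ...are appended to the previous (matching) line
  match: after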

Thank you for the direction, yaauie. I configured Filebeat for multiline messages, so I am getting closer. For some reason I am still not quite there: messages that should be picked up by the pattern I defined are not being processed as a single event by Logstash.

Here is my filebeat config.

filebeat:
  # List of prospectors to fetch data.
  prospectors:
    -
      paths:
        - /local/mnt/logs/filebeat/apps/*/*.log

      fields:
        environment: "tst"
        logtype: "application_log"
      encoding: plain
      symlinks: true  # make sure logfiles only appear once!
      input_type: log
      fields_under_root: true
      document_type: "log"  # This becomes the "type" field in kibana/logstash
      multiline:
        pattern: '^\n\d{4}-\d{2}-\d{2}'
        negate: false
        match: after

Here is a part of the log file that correctly matches my pattern according to the Go playground:

2018-08-03 19:00:22.060 INFO  (qtp110456297-16) [   x:qdisk] o.a.s.u.DirectUpdateHandler2 start commit{,optimize=false,openSearcher=true,waitSearcher=true,
expungeDeletes=false,softCommit=false,prepareCommit=false}
2018-08-03 19:00:22.229 INFO  (qtp110456297-16) [   x:qdisk] o.a.s.c.SolrDeletionPolicy SolrDeletionPolicy.onCommit: commits: num=2
        commit{dir=NRTCachingDirectory(MMapDirectory@/prj/solr_master01/vdpsidxtst05/solr-qdisk-new/data/qdisk/data/index lockFactory=org.apache.lucene.sto
re.NativeFSLockFactory@7d74fad9; maxCacheMB=48.0 maxMergeSizeMB=4.0),segFN=segments_4t8p,generation=224521}
        commit{dir=NRTCachingDirectory(MMapDirectory@/prj/solr_master01/vdpsidxtst05/solr-qdisk-new/data/qdisk/data/index lockFactory=org.apache.lucene.sto
re.NativeFSLockFactory@7d74fad9; maxCacheMB=48.0 maxMergeSizeMB=4.0),segFN=segments_4t8q,generation=224522}
2018-08-03 19:00:22.230 INFO  (qtp110456297-16) [   x:qdisk] o.a.s.c.SolrDeletionPolicy newest commit generation = 224522
2018-08-03 19:00:22.241 INFO  (qtp110456297-16) [   x:qdisk] o.a.s.s.SolrIndexSearcher Opening [Searcher@3b972758[qdisk] main]

Here is the output from the Go playground.

matches	line
false	
true	2018-08-03 19:00:22.060 INFO  (qtp110456297-16) [   x:qdisk] o.a.s.u.DirectUpdateHandler2 start commit{,optimize=false,openSearcher=true,waitSearcher=true,
false	expungeDeletes=false,softCommit=false,prepareCommit=false}
true	2018-08-03 19:00:22.229 INFO  (qtp110456297-16) [   x:qdisk] o.a.s.c.SolrDeletionPolicy SolrDeletionPolicy.onCommit: commits: num=2
false	        commit{dir=NRTCachingDirectory(MMapDirectory@/prj/solr_master01/vdpsidxtst05/solr-qdisk-new/data/qdisk/data/index lockFactory=org.apache.lucene.sto
false	re.NativeFSLockFactory@7d74fad9; maxCacheMB=48.0 maxMergeSizeMB=4.0),segFN=segments_4t8p,generation=224521}
false	        commit{dir=NRTCachingDirectory(MMapDirectory@/prj/solr_master01/vdpsidxtst05/solr-qdisk-new/data/qdisk/data/index lockFactory=org.apache.lucene.sto
false	re.NativeFSLockFactory@7d74fad9; maxCacheMB=48.0 maxMergeSizeMB=4.0),segFN=segments_4t8q,generation=224522}
true	2018-08-03 19:00:22.230 INFO  (qtp110456297-16) [   x:qdisk] o.a.s.c.SolrDeletionPolicy newest commit generation = 224522
true	2018-08-03 19:00:22.241 INFO  (qtp110456297-16) [   x:qdisk] o.a.s.s.SolrIndexSearcher Opening [Searcher@3b972758[qdisk] main]
false	

Here is the Logstash stdout when processing said logs.

logstash_1        |         "logtype" => "application_log",
logstash_1        |     "environment" => "tst",
logstash_1        |      "@timestamp" => 2018-08-03T19:00:38.509Z,
logstash_1        |          "offset" => 3960837,
logstash_1        |        "@version" => "1",
logstash_1        |            "beat" => {
logstash_1        |             "name" => "vdpsidxtst05",
logstash_1        |         "hostname" => "vdpsidxtst05",
logstash_1        |          "version" => "5.5.0"
logstash_1        |     },
logstash_1        |      "input_type" => "log",
logstash_1        |            "host" => "vdpsidxtst05",
logstash_1        |          "source" => "/local/mnt/logs/filebeat/apps/solr/solr-qdisk-new.log",
logstash_1        |         "message" => "\tcommit{dir=NRTCachingDirectory(MMapDirectory@/prj/solr_master01/vdpsidxtst05/solr-qdisk-new/data/qdisk/data/index lockFactory=org.apache.lucene.store.NativeFSLockFactory@7d74fad9; maxCacheMB=48.0 maxMergeSizeMB=4.0),segFN=segments_4t8q,generation=224522}",
logstash_1        |            "type" => "log",
logstash_1        |            "tags" => [
logstash_1        |         [0] "beats_input_codec_plain_applied",
logstash_1        |         [1] "_grokparsefailure"
logstash_1        |     ]

Any clues?

Here is an example of the filter working as expected.

logstash_1        |       "environment" => "tst",
logstash_1        |           "logtype" => "application_log",
logstash_1        |        "@timestamp" => 2018-08-03T19:30:23.902Z,
logstash_1        |          "@version" => "1",
logstash_1        |              "beat" => {
logstash_1        |             "name" => "vdpsidxtst05",
logstash_1        |         "hostname" => "vdpsidxtst05",
logstash_1        |          "version" => "5.5.0"
logstash_1        |     },
logstash_1        |              "host" => "vdpsidxtst05"
logstash_1        | }
logstash_1        | {
logstash_1        |            "offset" => 4035306,
logstash_1        |        "input_type" => "log",
logstash_1        |     "timestamp8601" => "2018-08-03 19:30:23.810",
logstash_1        |            "source" => "/local/mnt/logs/filebeat/apps/solr/solr-qdisk-new.log",
logstash_1        |           "message" => "2018-08-03 19:30:23.810 INFO  (searcherExecutor-7-thread-1-processing-x:qdisk) [   x:qdisk] o.a.s.c.SolrCore [qdisk] Registered new searcher Searcher@7eb6d958[qdisk] main{ExitableDirectoryReader(UninvertingDirectoryReader(Uninverting(_4z81(6.1.0):C99989/55859:delGen=70) Uninverting(_4z8b(6.1.0):C23100/13419:delGen=56) Uninverting(_56pl(6.1.0):C35115/1682:delGen=13) Uninverting(_56p1(6.1.0):C25502/12933:delGen=19) Uninverting(_56q6(6.1.0):c3846/951:delGen=7) Uninverting(_56pa(6.1.0):C1110) Uninverting(_56pf(6.1.0):C1110) Uninverting(_56pm(6.1.0):C2988/338:delGen=2) Uninverting(_56tb(6.1.0):c6921/4350:delGen=4) Uninverting(_56q4(6.1.0):C1340) Uninverting(_56q7(6.1.0):C3200) Uninverting(_56q9(6.1.0):C1052) Uninverting(_56qb(6.1.0):C2941/146:delGen=1) Uninverting(_5739(6.1.0):C109/108) Uninverting(_573a(6.1.0):C3301) Uninverting(_573b(6.1.0):C50) Uninverting(_573c(6.1.0):C872) Uninverting(_573d(6.1.0):C126)))}",
logstash_1        |              "type" => "log",
logstash_1        |         "logsource" => "INFO",
logstash_1        |              "tags" => [
logstash_1        |         [0] "beats_input_codec_plain_applied"
logstash_1        |     ],

I think your multiline config needs a little tuning:

      multiline:
        # find messages that do NOT start with a date...
        pattern: '^\d{4}-\d{2}-\d{2}T'
        negate: true
        # ... append them after the previous line that DID match
        match: after

Here is the new output.

logstash_1        |            "offset" => 1744515,
logstash_1        |        "input_type" => "log",
logstash_1        |     "timestamp8601" => "2018-08-03 22:18:50.194",
logstash_1        |            "source" => "/local/mnt/logs/filebeat/apps/solr/solr-cpip-filebox.log",
logstash_1        |           "message" => "2018-08-03 22:18:50.194 INFO  (qtp401424608-15) [   x:cpip-filebox] o.a.s.u.DirectUpdateHandler2 start commit{,optimize=false,openSearcher=true,waitSearcher=true,expungeDeletes=false,softCommit=false,prepareCommit=false}\n2018-08-03 22:18:50.194 INFO  (qtp401424608-15) [   x:cpip-filebox] o.a.s.u.DirectUpdateHandler2 No uncommitted changes. Skipping IW.commit.\n2018-08-03 22:18:50.194 INFO  (qtp401424608-15) [   x:cpip-filebox] o.a.s.u.DirectUpdateHandler2 end_commit_flush\n2018-08-03 22:18:50.194 INFO  (qtp401424608-15) [   x:cpip-filebox] o.a.s.u.p.LogUpdateProcessorFactory [cpip-filebox]  webapp=/solr-cpip-filebox path=/update params={waitSearcher=true&commit=true&softCommit=false&wt=javabin&version=2}{commit=} 0 0\n2018-08-03 22:18:55.216 INFO  (qtp401424608-15) [   x:cpip-filebox] o.a.s.u.DirectUpdateHandler2 start commit{,optimize=false,openSearcher=true,waitSearcher=true,expungeDeletes=false,softCommit=false,prepareCommit=false}\n2018-08-03 22:18:55.216 INFO  (qtp401424608-15) [   x:cpip-filebox] o.a.s.u.DirectUpdateHandler2 No uncommitted changes. Skipping IW.commit.\n2018-08-03 22:18:55.218 INFO  (qtp401424608-15) [   x:cpip-filebox] o.a.s.u.DirectUpdateHandler2 end_commit_flush\n2018-08-03 22:18:55.218 INFO  (qtp401424608-15) [   x:cpip-filebox] o.a.s.u.p.LogUpdateProcessorFactory [cpip-filebox]  webapp=/solr-cpip-filebox path=/update params={waitSearcher=true&commit=true&softCommit=false&wt=javabin&version=2}{commit=} 0 1",
logstash_1        |              "type" => "log",
logstash_1        |         "logsource" => "INFO",
logstash_1        |              "tags" => [
logstash_1        |         [0] "beats_input_codec_plain_applied"
logstash_1        |     ],
logstash_1        |           "logtype" => "application_log",
logstash_1        |       "environment" => "tst",
logstash_1        |        "@timestamp" => 2018-08-03T22:18:52.786Z,
logstash_1        |          "@version" => "1",
logstash_1        |              "beat" => {
logstash_1        |             "name" => "vdpsidxtst05",
logstash_1        |         "hostname" => "vdpsidxtst05",
logstash_1        |          "version" => "5.5.0"
logstash_1        |     },
logstash_1        |              "host" => "vdpsidxtst05"
logstash_1        | }

It looks like you may have missed the negate: true bit in the Filebeat multiline config.

Nope, it is in there...

filebeat:
  # List of prospectors to fetch data.
  prospectors:
    -
      paths:
        - /local/mnt/logs/filebeat/apps/*/*.log

      fields:
        environment: "tst"
        logtype: "application_log"
      encoding: plain
      symlinks: true  # make sure logfiles only appear once!
      input_type: log
      fields_under_root: true
      document_type: "log"  # This becomes the "type" field in kibana/logstash
      multiline:
        pattern: '^\n\d{4}-\d{2}-\d{2}T'
        negate: true
        match: after

Thanks for all of the pointers! I am good now; my final conf is below (the fix was dropping the \n and the T from the multiline pattern). Now I just need to deal with parsing the JSON files as well.

multiline:
  pattern: '^\d{4}-\d{2}-\d{2}'
  negate: true
  match: after
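
As for the JSON files, a minimal sketch of one common approach, using Logstash's json filter, is below. The condition on the file extension is a hypothetical placeholder; adjust it to however those files are actually identified:

filter {
  # Hypothetical routing: parse only events that came from .json files
  if [source] =~ /\.json$/ {
    json {
      source => "message"
    }
  }
}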
