Trouble with filter conditional logic

I have had the following filter conditional logic in place for a while now, and it's working well:

filter {

  mutate {
  [...]
  }

  if "STRING1" in [message] {
  [...]
  }

  else if [type] == "syslog" {
  [...]
  }

}

I've now tried to add another "else if" as so --

filter {

  mutate {
  [...]
  }

  if "STRING1" in [message] {
  [...]
  }

  else if "STRING2" in [message] {
  [...]
  }

  else if [type] == "syslog" {
  [...]
  }

}

The odd thing is that the second "else if" (the one with "STRING2") now causes all the log lines that match it to not show up in the results in Kibana... When I comment that block out, they show up again. What's wrong with my syntax here?

It is impossible to know without more context, like a sample of your message and what you are doing when you have a match.

What are you doing when the conditional matches? You need to share more of your pipeline.

The syntax is correct, but without seeing what you are doing when it matches in both cases, it is not possible to know what could cause your messages to not show up in Kibana.

@leandrojmp Thanks for the reply. Here is the first "if", which is working; it parses lines such as this example: <14>Dec 23 11:35:06 svr01 APC_PDU_LEGAMPS: [INFO] PDU=pdu-apc-rack1b.mycompany.com Leg=1 Amps=3.2

if  "APC_PDU_LEGAMPS"  in [message] {

    mutate {
      gsub => [
        "message", "[=]", " \0 ",
        "message", "\s+", " "
      ]
    }

    grok {
      match => { "message" => "(?<logTime>%{MONTH:month} %{MONTHDAY:day} %{HOUR:hour}:%{MINUTE:minute}:%{SECOND:second}) envctl APC_PDU_LEGAMPS: \[INFO\] PDU = %{NOTSPACE:PDUName} Leg = %{NUMBER:LegNum:int} Amps = %{NUMBER:Amps:float}" }
    }

    date {
      match => [ "logTime", "MMM dd HH:mm:ss" ]
      target => "@timestamp"
    }

    mutate {
      remove_field => ['year']
      remove_field => ['month']
      remove_field => ['day']
      remove_field => ['hour']
      remove_field => ['minute']
      remove_field => ['second']
      remove_field => ['logTime']
    }

  }

The next "else if" has basically a copy of the above, but is intended to parse a line of this type: <14>Dec 23 11:45:02 gpu17 GET_GPU_UTIL: [INFO] Hostname=gpu17 GPU_Inst=3 GPU_Util=14

  else if  "GET_GPU_UTIL"  in [message] {

    mutate {
      gsub => [
        "message", "[=]", " \0 ",
        "message", "\s+", " "
      ]
    }

    grok {
      match => { "message" => "(?<logTime>%{MONTH:month} %{MONTHDAY:day} %{HOUR:hour}:%{MINUTE:minute}:%{SECOND:second}) %{NOTSPACE:hostname} GET_GPU_UTIL: \[INFO\] Hostname=%{NOTSPACE:GpuServer} GPU_Inst=%{NUMBER:GpuInstNum:int} GPU_Util=%{NUMBER:GpuPctUtil:int}" }
    }

    date {
      match => [ "logTime", "MMM dd HH:mm:ss" ]
      target => "@timestamp"
    }

    mutate {
      remove_field => ['year']
      remove_field => ['month']
      remove_field => ['day']
      remove_field => ['hour']
      remove_field => ['minute']
      remove_field => ['second']
      remove_field => ['logTime']
    }

  }

If your date string does not have a year then the date filter will make an educated guess about what the year should be. Sometimes it gets that wrong. See here for a longer discussion.

Assuming you are looking in Kibana, expand the date range a year backwards and forwards to see if the events are there.
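
If you want to take the guess out of the equation, one option is to prepend a year to logTime before the date filter runs. This is only a rough, untested sketch, and it assumes the events are always from the current year:

  # After grok has extracted logTime, prepend the year of the event's
  # current @timestamp (the ingest time) so the date filter does not guess.
  mutate {
    replace => { "logTime" => "%{+YYYY} %{logTime}" }
  }

  date {
    match => [ "logTime", "YYYY MMM dd HH:mm:ss" ]
    target => "@timestamp"
  }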

Do you have anything in Logstash logs?

I ran your pipeline with your sample messages, but saw nothing that could indicate an issue.

Since you do not have a year in your date string, maybe Logstash is guessing it wrong, as Badger mentioned.

One tip: if your messages all have this same format, you can use other filters like dissect and kv to parse them without needing grok or conditionals; see the sketch below.
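
For example, something along these lines could handle the GET_GPU_UTIL messages (an untested sketch; the field names pri, month, day, time, host, program, level and kvpairs are placeholders I made up, and it assumes a single space between the month and the day):

  dissect {
    # Split the syslog-style prefix from the key=value payload.
    mapping => { "message" => "<%{pri}>%{month} %{day} %{time} %{host} %{program}: [%{level}] %{kvpairs}" }
  }

  kv {
    # Parse "Hostname=gpu17 GPU_Inst=3 GPU_Util=14" into individual fields.
    source => "kvpairs"
    field_split => " "
    value_split => "="
  }

  mutate {
    # kv extracts everything as strings, so convert the numeric fields.
    convert => {
      "GPU_Inst" => "integer"
      "GPU_Util" => "integer"
    }
  }

With this approach the gsub that adds spaces around "=" would not be needed at all.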

Ah, figured it out -- the filter does a gsub that puts spaces around the "=" symbol... (Another co-worker wrote the original filter; I'm not sure why he did this, but...) So changing the grok expression from, for instance, Hostname=%{NOTSPACE:GpuServer} to Hostname = %{NOTSPACE:GpuServer} made it work and show up in Kibana. I'm still unsure why it wasn't displaying the data before, but all is well now.
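
For reference, the working grok line for the GET_GPU_UTIL block now looks like this (the only change is the space on each side of every "=", to match what the gsub produces):

  grok {
    match => { "message" => "(?<logTime>%{MONTH:month} %{MONTHDAY:day} %{HOUR:hour}:%{MINUTE:minute}:%{SECOND:second}) %{NOTSPACE:hostname} GET_GPU_UTIL: \[INFO\] Hostname = %{NOTSPACE:GpuServer} GPU_Inst = %{NUMBER:GpuInstNum:int} GPU_Util = %{NUMBER:GpuPctUtil:int}" }
  }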

Now I just have to figure out why the new fields are not available when I go to build a Visualization; I think it's because the field type shows as "?" in the fields list, but I'm not sure why, since I read that all extractions would be of type "string" unless coerced, and I'm specifying the two number fields I'm extracting to be of type int...
