Nothing in the filter block is being parsed. Why is only the filter not working in Logstash?

I tried adding mutate and add_field along with the grok filter, but nothing is working. The logs are only parsed the default way.

# listening for Beats on this port
input {
  beats {
    port => 5044
  }
}

filter {
  if [fields][tags] == "ngta-web" {
    mutate {
      add_field => { "host" => "%{[event_data][IpAddress]}" }
    }
    grok {
      break_on_match => false
      match => {
        "message" => [
          "%{DATESTAMP:timestamp}%{SPACE}%{GREEDYDATA}%{SPACE}%{LOGLEVEL:loglevel}%{SPACE}%{JAVACLASS:javaClass}%{GREEDYDATA:logmessage}"
        ]
      }
    }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    sniffing => true
    manage_template => false
    ilm_enabled => false
    index => "%{[fields][tags]}"
  }
  stdout {
    codec => rubydebug
  }
}

Include a sample from stdout. It would seem [fields][tags] is not "ngta-web".

@rugenl
An index under this name is created:

yellow %{[fields][tags]}

And this is the sample of stdout that keeps repeating in the Logstash terminal. The host field isn't added, and neither timestamp nor loglevel shows up in Kibana.

 },
      "@version" => "1",
       "message" => "08/10/2019 12:26:01 137   (null)                  INFO   23   Leftside Filter Expression : SubCategory=\"Audit/DES Keys\" AND SourceProblemName=\"Audit/DES Keys\" for User NBK7G0J Item Count : 4",
          "host" => {
        "name" => "mehak-VirtualBox"
    },
        "fields" => {
        "log_type" => "ngta-web"
    }
}
{
    "@timestamp" => 2020-03-03T02:29:35.855Z,
          "tags" => [
        [0] "beats_input_codec_plain_applied"
    ],
           "log" => {
        "offset" => 26032684,
          "file" => {
            "path" => "/home/mehak/Documents/filebeat-7.4.0-linux-x86_64/logs/log2.log"
        }
    },
           "ecs" => {
        "version" => "1.1.0"
    },
         "agent" => {
            "hostname" => "mehak-VirtualBox",
                  "id" => "bad135c8-d359-4936-b515-79eb4bb24630",
             "version" => "7.4.0",
                "type" => "filebeat",
        "ephemeral_id" => "f32e4213-4378-4811-b85e-50d327ab6846"
    },
      "@version" => "1",
       "message" => "08/10/2019 12:26:01 137   (null)                 DEBUG   23   Filter :  SubCategory=\"Card Reader\" ",
          "host" => {
        "name" => "mehak-VirtualBox"
    },
        "fields" => {
        "log_type" => "ngta-web"
    }
}

The index now has the correct name ngta-web, and the field host is also added. But the grok filter is still not working properly.
In my grok filter, I want to match the filename from the log file path and then apply a different grok pattern per file. For instance, my log file is named ngta_web.log, and I want that file to go through the first grok pattern block, so I match on the file name in the path. This is my updated filter:

filter {
  if [fields][tags] == "ngta-web" {
    mutate {
      add_field => { "host" => "%{[event_data][IpAddress]}" }
    }
    grok {
      break_on_match => false
      match => { "[log][file][path]" => "%{GREEDYDATA}/%{GREEDYDATA:filename}\.log" }
      match => {
        "message" => [
          "%{DATESTAMP:timestamp}%{SPACE}%{GREEDYDATA}%{SPACE}%{LOGLEVEL:loglevel}%{SPACE}%{JAVACLASS:javaClass}%{GREEDYDATA:logmessage}"
        ]
      }
    }
  }
  else if [fields][tags] == "ngta-app" {
    grok {
      break_on_match => false
      match => {
        "message" => [
          "%{DATESTAMP:timestamp}%{SPACE}%{GREEDYDATA}%{SPACE}%{LOGLEVEL:loglevel}%{SPACE}%{JAVACLASS:javaClass}%{GREEDYDATA:logmessage}"
        ]
      }
    }
  }
  else if [fields][tags] == "monitoring-app" {
    grok {
      break_on_match => false
      match => {
        "message" => [
          "%{DATESTAMP:timestamp}%{SPACE}%{GREEDYDATA}%{SPACE}%{LOGLEVEL:loglevel}%{SPACE}%{JAVACLASS:javaClass}%{GREEDYDATA:logmessage}"
        ]
      }
    }
  }
}
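For reference, a single grok filter's match option accepts several field keys in one hash, so the two match declarations in the first branch can be combined. A sketch of that idiom (note that the nested Beats field is addressed with Logstash's bracketed field-reference syntax, [log][file][path], rather than the dotted name):

grok {
  break_on_match => false
  match => {
    "[log][file][path]" => "%{GREEDYDATA}/%{GREEDYDATA:filename}\.log"
    "message" => "%{DATESTAMP:timestamp}%{SPACE}%{GREEDYDATA}%{SPACE}%{LOGLEVEL:loglevel}%{SPACE}%{JAVACLASS:javaClass}%{GREEDYDATA:logmessage}"
  }
}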

You have a field called [fields][log_type], not [fields][tags].
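A minimal sketch of the conditional using that field, with everything inside the block unchanged:

if [fields][log_type] == "ngta-web" {
  ...
}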

Yes, I fixed this, @Badger. But my issue now is the filename I am trying to match on so I can apply the right grok filter. I match on log.file.path and then apply the grok filter accordingly. Is the way I have the match and grok written correct?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.