Two grok filters not working! :|

I'm using two grok filters in one config file, but it doesn't work!

input {
  beats {
    port => 5443
    type => syslog
    ssl => true
    ssl_certificate => "/etc/logstash/logstash.crt"
    ssl_key => "/etc/logstash/logstash.key"
  }
}

filter {
  if [type] == "log" {
    grok {
      patterns_dir => "/etc/logstash/patterns"
      match => { "%{IP:client} %{HTTPDUSER:ident} %{USER:auth} \[%{HTTPDATE:timestamp_server_genaration}\] \"%{WORD:method} %{URIPATHPARAM:request} HTTP/%{NUMBER:httpversion}\" %{NUMBER:status_code} %{NUMBER:bytes} %{QS:refferer} %{QS:user_agent} length %{NUMBER:length} rtime %{NUMBER:request_time} uri %{URIPATHPARAM:uri} realip %{IP:realip}" }
      remove_tag => ["nginx_access", "_grokparsefailure"] }
      add_field => {
        "type" => "nginx_access"
      }
      remove_field => ["log"]
    }
  }
  
  date {
      match => ["time_local", "dd/MMM/YYYY:HH:mm:ss Z"]
      target => "@timestamp"
      remove_field => "time_local"
    }

  if [type] == "log" {
    grok {
      patterns_dir => "/etc/logstash/patterns"
      match => { (?<timestamp>%{YEAR}[./-]%{MONTHNUM}[./-]%{MONTHDAY}[- ]%{TIME}) \[%{LOGLEVEL:severity}\] %{POSINT:pid}#%{NUMBER}:%{GREEDYDATA:errormessage} }
      remove_tag => ["nginx_error", "_grokparsefailure"] }
      add_field => {
        "type" => "nginx_error"
      }
      remove_field => ["log"]
    }

  date {
      match => ["time_local", "YYYY/MM/dd HH:mm:ss"]
      target => "@timestamp"
      remove_field => "time_local"
    }

output {
  elasticsearch { hosts => ["localhost:9200"]
    hosts => "localhost:9200"
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }

}

Which one? What does your log file look like? What do you mean when you say it doesn't work?

When this filter runs alone in the file, it works without problems ↓↓

filter {
  if [type] == "log" {
    grok {
      match => [
        "message" => "%{IP:client} %{HTTPDUSER:ident} %{USER:auth} \[%{HTTPDATE:timestamp_server_genaration}\] \"%{WORD:method} %{URIPATHPARAM:request} HTTP/%{NUMBER:httpversion}\" %{NUMBER:status_code} %{NUMBER:bytes} %{QS:refferer} %{QS:user_agent} length %{NUMBER:length} rtime %{NUMBER:request_time} uri %{URIPATHPARAM:uri} realip %{IP:realip}"
      ]
      remove_tag => ["_grokparsefailure"]
      add_tag => ["nginx_access"]
    }
    geoip {
      source => "client"
    }
    mutate {
      convert => [ "[geoip][location]", "float" ]
    }
  }
}

But when I put it all together like this, Kibana shows the message "No results found" ↓↓

filter {
  if [type] == "log" {
    grok {
      match => [
        "message" => "%{IP:client} %{HTTPDUSER:ident} %{USER:auth} \[%{HTTPDATE:timestamp_server_genaration}\] \"%{WORD:method} %{URIPATHPARAM:request} HTTP/%{NUMBER:httpversion}\" %{NUMBER:status_code} %{NUMBER:bytes} %{QS:refferer} %{QS:user_agent} length %{NUMBER:length} rtime %{NUMBER:request_time} uri %{URIPATHPARAM:uri} realip %{IP:realip}"
      ]
      remove_tag => ["_grokparsefailure"]
      add_tag => ["nginx_access"]
    }
    geoip {
      source => "client"
    }
    mutate {
      convert => [ "[geoip][location]", "float" ]
    }
  }
}



filter {
  if [type] == "log" {
    grok {
      match => [
        (?<timestamp>%{YEAR}[./-]%{MONTHNUM}[./-]%{MONTHDAY}[- ]%{TIME}) \[%{LOGLEVEL:severity}\] %{POSINT:pid}#%{NUMBER}:%{GREEDYDATA:errormessage}
      ]
      remove_tag => ["_grokparsefailure"]
      add_tag => ["nginx_error"]
    }
    geoip {
      source => "client"
    }
    mutate {
      convert => [ "[geoip][location]", "float" ]
    }
  }
}

And even when I put it like the first one...
The file name is syslog-filter.conf and the path is /etc/logstash/conf.d.

The Kibana result after this config and grok: NO RESULT!

Since you want to add the nginx_access or nginx_error tag to the event whose pattern matched, you should use the _grokparsefailure tag from the first grok to drive the second grok.

filter {
  if [type] == "log" {
    grok {
        match => [ 
        "message" => "%{IP:client} %{HTTPDUSER:ident} %{USER:auth} \[%{HTTPDATE:timestamp_server_genaration}\] \"%{WORD:method} %{URIPATHPARAM:request} HTTP/%{NUMBER:httpversion}\" %{NUMBER:status_code} %{NUMBER:bytes} %{QS:refferer} %{QS:user_agent} length %{NUMBER:length} rtime %{NUMBER:request_time} uri %{URIPATHPARAM:uri} realip %{IP:realip}" 
        ] 
        add_tag => ["nginx_access"]
    }
    if "_grokparsefailure" in [tags] {
      grok {
        match => [
        (?<timestamp>%{YEAR}[./-]%{MONTHNUM}[./-]%{MONTHDAY}[- ]%{TIME}) \[%{LOGLEVEL:severity}\] %{POSINT:pid}#%{NUMBER}:%{GREEDYDATA:errormessage}
        ] 
        add_tag => ["nginx_error"]
      }
    }
    if "_grokparsefailure" not in [tags] {
      geoip {
          source => "client"
      }
      mutate {
          convert => [ "[geoip][location]", "float" ]
      }
    }
  }
}

Also, see https://www.elastic.co/guide/en/logstash/6.0/lookup-enrichment.html
[geoip][location] is not what you think it is: it is an array of longitude and latitude, and the values are floats already, so you don't need the mutate convert step.
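So that whole section can shrink to just this (a minimal sketch, assuming your grok captured the address into the client field):

filter {
  geoip {
    # Look up the IP captured by grok; this creates the [geoip] subtree,
    # and [geoip][location] already holds float longitude/latitude values,
    # so no mutate/convert is needed afterwards.
    source => "client"
  }
}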

I know you were following the tutorial I linked to in another post, but you should use the same field name for the timestamp in both groks; then you can use one date filter with two patterns to set the event's @timestamp from either log format.
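Something like this would replace both of your date blocks (a sketch, assuming both groks capture into a common field named timestamp, and reusing the two format strings from your config):

filter {
  date {
    # Try the access-log format first, then the error-log format;
    # whichever matches sets @timestamp.
    match => [ "timestamp",
               "dd/MMM/YYYY:HH:mm:ss Z",
               "YYYY/MM/dd HH:mm:ss" ]
    target => "@timestamp"
  }
}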

Hi

I copied and pasted your filter into the syslog-filter.conf file, and got this error:

[2017-11-26T00:00:00,519][ERROR][logstash.agent ] Cannot create pipeline {:reason=>"Expected one of #, {, ,, ] at line 15, column 13 (byte 240) after filter {\n if [type] == "log" {\n\tgrok {\n\t\tmatch => [ \n\t\t"message" "}

[2017-11-26T00:00:10,785][ERROR][logstash.agent ] Cannot create pipeline {:reason=>"Expected one of #, {, ,, ] at line 15, column 13 (byte 240) after filter {\n if [type] == "log" {\n\tgrok {\n\t\tmatch => [ \n\t\t"message" "}

And after that, I changed the grok to something simpler, for a test:

filter {
    if [type] == "log" {
        grok {
            match => ["message" => "%{IPORHOST:clientip} - - \[%{HTTPDATE:timestamp}\]"]
            add_tag => ["nginx_access"]
        }
        if "_grokparsefailure" in [tags] {
            grok {
                match => [
                    (?<timestamp>%{YEAR}[./-]%{MONTHNUM}[./-]%{MONTHDAY}[- ]%{TIME}) \[%{LOGLEVEL:severity}\] %{POSINT:pid}#%{NUMBER}:%{GREEDYDATA:errormessage}
                ]
            add_tag => ["nginx_error"]
            }
        }
        if "_grokparsefailure" not in [tags] {
            geoip {
                source => "client"
            }
            mutate {
                convert => [ "[geoip][location]", "float" ]
            }
        }
    }
}

And again, the same error:

[2017-11-27T00:00:07,395][ERROR][logstash.agent           ] Cannot create pipeline {:reason=>"Expected one of #, {, ,, ] at line 26, column 19 (byte 473) after filter {\n  if [type] == \"log\" {\n    grok {\n        match => [ \n        \"message\" "}
[2017-11-27T00:00:17,591][ERROR][logstash.agent           ] Cannot create pipeline {:reason=>"Expected one of #, {, ,, ] at line 26, column 19 (byte 473) after filter {\n  if [type] == \"log\" {\n    grok {\n        match => [ \n        \"message\" "}

And when I ran it with this command:
./logstash -t -f /etc/logstash/conf.d/syslog-filter.conf

it resulted in:

ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path //usr/share/logstash/config/log4j2.properties. Using default config which logs to console
13:20:07.335 [LogStash::Runner] FATAL logstash.runner - The given configuration is invalid. Reason: Expected one of #, {, ,, ] at line 15, column 33 (byte 256) after filter {
    if [type] == "log" {
        grok {
            match => ["message" 

What should I do?

Your original config was (I just copy-pasted):

            grok {
                match => [
                    (?<timestamp>%{YEAR}[./-]%{MONTHNUM}[./-]%{MONTHDAY}[- ]%{TIME}) \[%{LOGLEVEL:severity}\] %{POSINT:pid}#%{NUMBER}:%{GREEDYDATA:errormessage}
                ]
            add_tag => ["nginx_error"]
            }

It should be:

            grok {
                match => { "message" => "(?<timestamp>%{YEAR}[./-]%{MONTHNUM}[./-]%{MONTHDAY}[- ]%{TIME}) \[%{LOGLEVEL:severity}\] %{POSINT:pid}#%{NUMBER}:%{GREEDYDATA:errormessage}" }
                add_tag => ["nginx_error"]
            }
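For reference, grok's match option takes either a hash or an array; in the array form the field name and the pattern are plain comma-separated strings, never joined with =>. Both of these are equivalent (a minimal illustration reusing a pattern fragment from above):

grok {
  match => { "message" => "%{IP:client}" }   # hash form: field => pattern
}
grok {
  match => [ "message", "%{IP:client}" ]     # array form: field, pattern
}

That stray => inside [ ... ] is exactly what the parser was complaining about in the "Expected one of #, {, ,, ]" errors above.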
