If grok parsing fails, instead of dropping the event I want to process the whole message as greedy data.

```
filter {
  if [type] == "chef-client" {
    grok {
      match => { "message" => "\[%{TIMESTAMP_ISO8601:logtimestamp}\] %{SPACE}%{LOGLEVEL:loglevel}:%{SPACE}%{GREEDYDATA:message}" }
      overwrite => [ "message" ]
      remove_field => [ "@timestamp", "@version" ]
    }
    mutate {
      split => [ "host", "." ]
      add_field => { "hostname" => "%{[host][0]}" }
      add_field => { "podName" => "%{[host][1]}" }
      add_field => { "ignore" => "%{[host][2]}" }
      remove_field => [ "ignore", "host" ]
    }
  }
  if "_grokparsefailure" in [tags] {
    match => { "message" => "%{GREEDYDATA:message}" }   # this does not work; drop { } works
  }
}
```

```
[ERROR] 2020-03-29 20:21:56.075 [Converge PipelineAction::Create<main>] agent - Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, { at line 26, column 16 (byte 847) after filter {\n        if [type] == \"chef-client\" {\n                grok {\n                           match => {\"message\" => \"\\[%{TIMESTAMP_ISO8601:logtimestamp}\\] %{SPACE}%{LOGLEVEL:loglevel}:%{SPACE}%{GREEDYDATA:message}\"}\n                           overwrite => [ \"message\"]\n                           remove_field => [\"@timestamp\", \"@version\" ]\n                         }\nmutate {\n        split => [\"host\", \".\"]\n        add_field => { \"hostname\" => \"%{[host][0]}\" }\n        add_field => { \"podName\" => \"%{[host][1]}\" }\n        add_field => { \"ignore\" => \"%{[host][2]}\" }\n        remove_field => [\"ignore\", \"host\"]\n    }\n\t}\nif \"_grokparsefailure\" in [tags] {\n         match ", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:41:in `compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:49:in `compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in `block in compile_sources'", "org/jruby/RubyArray.java:2577:in `map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:10:in `compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:151:in `initialize'", "org/logstash/execution/JavaBasePipelineExt.java:47:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:23:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:36:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:325:in `block in converge_state'"]}
[DEBUG] 2020-03-29 20:21:56.145 [LogStash::Runner] os - Stopping
```

The ConfigurationError is because `match` is only valid as an option inside a filter plugin such as grok; it cannot appear bare inside a conditional. And even wrapped in a grok, that pattern says to set the value of the [message] field equal to the value of the [message] field. What do you hope to achieve by doing that?

If you just want to keep the value of the [message] field when the grok does not match, then do nothing; it will be there by default.

Also, why set [ignore] and then remove it in the same filter?
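
For instance, a minimal sketch reusing the chef-client pattern from above (`tag_on_failure => []` is only needed if you also want to suppress the failure tag):

```
filter {
  if [type] == "chef-client" {
    grok {
      match => { "message" => "\[%{TIMESTAMP_ISO8601:logtimestamp}\] %{SPACE}%{LOGLEVEL:loglevel}:%{SPACE}%{GREEDYDATA:message}" }
      overwrite => [ "message" ]
      # On a non-matching line grok leaves [message] untouched and only
      # adds a failure tag; an empty list suppresses even that.
      tag_on_failure => []
    }
  }
}
```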

Thanks for getting back to me.

I removed the entry, but it still does not parse any fields.

```
input {
  file {
    path => "/var/log/messages"
    type => "messages"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
  file {
    path => "/var/log/chef-client.log"
    type => "chef-client"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  if [type] == "messages" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:logtimeStamp} %{USERNAME:systemname} %{GREEDYDATA:message}" }
      overwrite => [ "message" ]
      remove_field => [ "@timestamp", "@version", "systemname" ]
    }
    # split the FQDN in [host] into its parts
    mutate {
      split => [ "host", "." ]
      add_field => { "hostname" => "%{[host][0]}" }
      add_field => { "podName" => "%{[host][1]}" }
      add_field => { "ignore" => "%{[host][2]}" }
      remove_field => [ "ignore", "host" ]
    }
    # derive the log file name from [path]
    mutate {
      split => [ "path", "/" ]
      add_field => { "logfileName" => "%{[path][3]}" }
      add_field => { "logPath" => "%{[path][2]}" }
      remove_field => [ "path", "logPath" ]
    }
  }
  if [type] == "chef-client" {
    grok {
      match => { "message" => "\[%{TIMESTAMP_ISO8601:logtimestamp}\] %{SPACE}%{LOGLEVEL:loglevel}:%{SPACE}%{GREEDYDATA:message}" }
      overwrite => [ "message" ]
      remove_field => [ "@timestamp", "@version" ]
    }
    mutate {
      split => [ "host", "." ]
      add_field => { "hostname" => "%{[host][0]}" }
      add_field => { "myName" => "%{[host][1]}" }
      add_field => { "ignore" => "%{[host][2]}" }
      remove_field => [ "ignore", "host" ]
    }
  }
  if "_grokparsefailure" in [tags] {
    drop { }
  }
}

output {
  amazon_es {
    hosts => ["${ES_ENDPOINT}"]
    region => "us-west-2"
    index => "test-%{+YYYY.MM.dd}"
  }
}
```

I have system logs to be parsed using grok. I have many pipeline files, for example secure.conf, messages.conf, and audit.conf, each containing its own input, filter, and output sections. All of these files work when tested separately:

```
/usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/message.conf --debug
/usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/audit.conf --debug
```

What is the best practice for organising them, and how do I get all of the pipeline files to run in one shot?
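
For reference: pointing `-f` at the whole conf.d directory concatenates every file into a single pipeline, so all events pass through all filters and outputs unless each section is guarded by a conditional on [type]. One common way to keep the files isolated is to declare them as separate pipelines in `pipelines.yml` (the pipeline ids and paths below are illustrative):

```
# /etc/logstash/pipelines.yml
- pipeline.id: messages
  path.config: "/etc/logstash/conf.d/messages.conf"
- pipeline.id: audit
  path.config: "/etc/logstash/conf.d/audit.conf"
- pipeline.id: secure
  path.config: "/etc/logstash/conf.d/secure.conf"
```

Started without a `-f` flag, Logstash then runs each file as its own pipeline, with its own queue and workers.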

Made changes as suggested.

```
mutate {
  remove_tag => [ "_grokparsefailure" ]
}
```
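
Presumably this mutate replaced the earlier `drop { }` inside the `_grokparsefailure` conditional, so lines that fail the grok are now kept with [message] intact instead of being discarded.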

Tested by passing random content ("hello how are you") into the log file. I see the parsed fields. Thank you.
