LeonZC
October 30, 2019, 11:08am
#1
I have a field called "zone" and I am trying to use mutate to replace the field's value with "BlockA" when the value of "zone" is "Z0".
Here is the code that I am using:
filter {
if [source] =~ "file.log" {
    grok {
      match => { "message" => "%{DAY:day} %{MONTH:month} %{MONTHDAY:date} %{TIME:time} %{YEAR:year} \[%{GREEDYDATA:zone}\]\[%{GREEDYDATA:module}\]\[%{GREEDYDATA:severity}\]: %{GREEDYDATA:message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
         }
filter {
if "[zone]" == Z0 {
    mutate {
      replace => ["[zone]", "BlockA"]
           }
                   }
       }
                          }
}
 
But every time I try to start the Logstash service I get this error, and I don't know where the problem lies.
Here is the error that I am getting:
[2019-10-30T11:55:21,672][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, => at line 11, column 4 (byte 874) after filter {\nif [source] =~ \"file.log\" {\n    grok {\n      match => { \"message\" => \"%{DAY:day} %{MONTH:month} %{MONTHDAY:date} %{TIME:time} %{YEAR:year} \\[%{GREEDYDATA:zone}\\]\\[%{GREEDYDATA:module}\\]\\[%{GREEDYDATA:severity}\\]: %{GREEDYDATA:message}\" }\n      add_field => [ \"received_at\", \"%{@timestamp}\" ]\n      add_field => [ \"received_from\", \"%{host}\" ]\n         }\nfilter {\nif ", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:41:in compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:49:in compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in block in compile_sources'", "org/jruby/RubyArray.java:2584:in map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:10:in compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:153:in initialize'", "org/logstash/execution/JavaBasePipelineExt.java:47:in initialize'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:26:in initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:36:in execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:326:in block in converge_state'"]}

st3inbeiss (Pius Dittli)
October 30, 2019, 12:53pm
#2
Ok, that's just syntax...
Your conditional
    if "[zone]" == Z0 {
should be
    if [zone] == "Z0" {
and your replace
    replace => ["[zone]", "BlockA"]
should be
    replace => { "zone" => "BlockA" }
See https://www.elastic.co/guide/en/logstash/current/event-dependent-configuration.html  
and https://www.elastic.co/guide/en/logstash/current/plugins-filters-mutate.html#plugins-filters-mutate-replace 
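Putting both fixes together, a minimal sketch of that part of the config (using the field name and values from your post) would look something like:

    filter {
      if [zone] == "Z0" {
        mutate {
          # replace takes a hash of field name => new value
          replace => { "zone" => "BlockA" }
        }
      }
    }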

LeonZC
October 30, 2019, 2:03pm
#3
I fixed the syntax, but I am still getting the same error.
Code:
filter {
if [source] =~ "file.log" {
    grok {
      match => { "message" => "%{DAY:day} %{MONTH:month} %{MONTHDAY:date} %{TIME:time} %{YEAR:year} \[%{GREEDYDATA:zone}\]\[%{GREEDYDATA:module}\]\[%{GREEDYDATA:severity}\]: %{GREEDYDATA:message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
         }
filter {
if [zone] == "Z0" {
    mutate {
      replace => {"zone" => "BlockA"}
           }
                  }
       }
                          }
}
 
Error:
[2019-10-30T14:55:56,900][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, => at line 11, column 4 (byte 878) after filter {\nif [source] =~ \"file.log\" {\n    grok {\n      match => { \"message\" => \"%{DAY:day} %{MONTH:month} %{MONTHDAY:date} %{TIME:time} %{YEAR:year} \\[%{GREEDYDATA:zone}\\]\\[%{GREEDYDATA:module}\\]\\[%{GREEDYDATA:severity}\\]: %{GREEDYDATA:message}\" }\n      add_field => [ \"received_at\", \"%{@timestamp}\" ]\n      add_field => [ \"received_from\", \"%{host}\" ]\n         }\nfilter {\n    if ", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:41:in compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:49:in compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in block in compile_sources'", "org/jruby/RubyArray.java:2584:in map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:10:in compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:153:in initialize'", "org/logstash/execution/JavaBasePipelineExt.java:47:in initialize'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:26:in initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:36:in execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:326:in block in converge_state'"]}

Badger
October 30, 2019, 2:22pm
#4
              You cannot nest a filter {} section inside a filter {} section. Remove the second filter {
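As a rough sketch, here is your posted config with the inner filter { (and its matching closing }) removed, so the conditional sits directly inside the existing filter section:

    filter {
      if [source] =~ "file.log" {
        grok {
          match => { "message" => "%{DAY:day} %{MONTH:month} %{MONTHDAY:date} %{TIME:time} %{YEAR:year} \[%{GREEDYDATA:zone}\]\[%{GREEDYDATA:module}\]\[%{GREEDYDATA:severity}\]: %{GREEDYDATA:message}" }
          add_field => [ "received_at", "%{@timestamp}" ]
          add_field => [ "received_from", "%{host}" ]
        }
        # the conditional goes directly here, not inside a second filter {}
        if [zone] == "Z0" {
          mutate {
            replace => { "zone" => "BlockA" }
          }
        }
      }
    }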

LeonZC
October 31, 2019, 11:57am
#5
I finally got it to work. The main problem was that the if [source] =~ "file.log" statement was not matching the file, so "Z0" was never found in the "zone" field and that is why the mutate wasn't working. I solved it by removing that if statement, so the filter is now applied to the log files that Filebeat is grabbing, which are not of type syslog. I also fixed my syntax thanks to you guys, since the syntax I had wasn't correct.
Here is my final code:
filter {
  grok {
    match => { "message" => "%{DAY:day} %{MONTH:month} %{MONTHDAY:date} %{TIME:time} %{YEAR:year} \[%{GREEDYDATA:zone}\]\[%{GREEDYDATA:module}\]\[%{GREEDYDATA:severity}\]: %{GREEDYDATA:message_content}" }
    add_field => [ "received_at", "%{@timestamp}" ]
    add_field => [ "received_from", "%{host}" ]
  }
}
filter {
  if [zone] == "Z0" {
    mutate {
      replace => { "zone" => "Zone0 - BlockA" }
    }
  }
}

system (system) Closed
November 28, 2019, 11:58am
#6
              This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.