Is there any way to use a conditional grok filter?

Hi!
I'm studying the ELK Stack these days and trying to analyze logs on my Mac.
I used Logstash to read a text file on my laptop and am trying to send the data to Elasticsearch. Along the way, I ran into a problem with the grok filter. There are three different line formats in the log:

- date / user name / process / message
- date / user name / process1 / process2 / message
- date / ----- last message repeated 1 time -----

I wrote a suitable grok pattern for each case, but I don't know how to apply them properly in Logstash. When I didn't use any filter (just reading the file in the input section and sending the results to Elasticsearch), the data went through, but as soon as I added the grok filter I could not get Logstash to run. My config and the error output are at the bottom. Please help me finish my study. Thanks for reading and have a nice day!
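From what I've read, a single grok filter can also be given a list of patterns and tries them in order until one matches, so maybe conditionals aren't even required. A rough, untested sketch of what I mean; the patterns and field names (process1, process2, msg, repeats) are placeholders I made up, not verified against my actual log:

filter {
  grok {
    # grok tries these patterns in order and stops at the first one that matches,
    # so the most specific pattern goes first
    match => {
      "message" => [
        "%{MONTH:month} %{NUMBER:day} %{TIME:time} %{USERNAME:username} %{DATA:process1}\[%{NUMBER:code}\]: %{DATA:process2}: %{GREEDYDATA:msg}",
        "%{MONTH:month} %{NUMBER:day} %{TIME:time} %{USERNAME:username} %{DATA:process}\[%{NUMBER:code}\]: %{GREEDYDATA:msg}",
        "%{MONTH:month} %{NUMBER:day} %{TIME:time} --- last message repeated %{NUMBER:repeats} time ---"
      ]
    }
  }
}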

input {
  file {
    path => "var/log/system.log"
    sincedb_path => "NULL"
    start_position => "beginning"
  }
}

filter {
  grok {
    match => { "message" => "%{MONTH:month} %{NUMBER:day} %{TIME:time} %{USERNAME:username} %{DATA:process}[%{NUMBER:code}]: %{GREEDYDATA:message}" }
  }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "common8"
  }
}
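
If separate grok blocks per format are cleaner, I guess a conditional on the _grokparsefailure tag (which grok adds to an event when it fails to match) could chain them. Another untested sketch, using the same made-up placeholder pattern for the "last message repeated" case:

filter {
  grok {
    match => { "message" => "%{MONTH:month} %{NUMBER:day} %{TIME:time} %{USERNAME:username} %{DATA:process}\[%{NUMBER:code}\]: %{GREEDYDATA:msg}" }
  }
  # only try the fallback pattern if the first grok did not match
  if "_grokparsefailure" in [tags] {
    mutate { remove_tag => ["_grokparsefailure"] }
    grok {
      match => { "message" => "%{MONTH:month} %{NUMBER:day} %{TIME:time} --- last message repeated %{NUMBER:repeats} time ---" }
    }
  }
}

Is that the kind of structure people normally use, or is there a better way?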

Thread.exclusive is deprecated, use Thread::Mutex
Sending Logstash logs to /Users/mf839-031/Downloads/logstash-7.2.0/logs which is now configured via log4j2.properties
[2019-07-18T16:45:43,432][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-07-18T16:45:43,500][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.2.0"}
[2019-07-18T16:45:51,558][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2019-07-18T16:45:51,846][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2019-07-18T16:45:51,919][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
[2019-07-18T16:45:51,924][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2019-07-18T16:45:51,968][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2019-07-18T16:45:52,080][INFO ][logstash.outputs.elasticsearch] Using default mapping template
[2019-07-18T16:45:52,171][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2019-07-18T16:45:52,355][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been create for key: cluster_uuids. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
[2019-07-18T16:45:52,359][INFO ][logstash.javapipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, :thread=>"#"}
[2019-07-18T16:45:57,986][ERROR][logstash.javapipeline ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#, :backtrace=>["/Users/mf839-031/Downloads/logstash-7.2.0/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.1.10/lib/logstash/inputs/file.rb:269:in `block in register'", "org/jruby/RubyArray.java:1792:in `each'", "/Users/mf839-031/Downloads/logstash-7.2.0/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.1.10/lib/logstash/inputs/file.rb:267:in `register'", "/Users/mf839-031/Downloads/logstash-7.2.0/logstash-core/lib/logstash/java_pipeline.rb:192:in `block in register_plugins'", "org/jruby/RubyArray.java:1792:in `each'", "/Users/mf839-031/Downloads/logstash-7.2.0/logstash-core/lib/logstash/java_pipeline.rb:191:in `register_plugins'", "/Users/mf839-031/Downloads/logstash-7.2.0/logstash-core/lib/logstash/java_pipeline.rb:292:in `start_inputs'", "/Users/mf839-031/Downloads/logstash-7.2.0/logstash-core/lib/logstash/java_pipeline.rb:248:in `start_workers'", "/Users/mf839-031/Downloads/logstash-7.2.0/logstash-core/lib/logstash/java_pipeline.rb:146:in `run'", "/Users/mf839-031/Downloads/logstash-7.2.0/logstash-core/lib/logstash/java_pipeline.rb:105:in `block in start'"], :thread=>"#"}
[2019-07-18T16:45:58,003][ERROR][logstash.agent ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create, action_result: false", :backtrace=>nil}
[2019-07-18T16:45:58,349][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2019-07-18T16:46:03,293][INFO ][logstash.runner ] Logstash shut down.
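
Reading the backtrace again, the failure seems to come from the file input's register step rather than from grok, so maybe the relative path (or the sincedb_path of "NULL") is the actual problem? The exception text got cut off when I pasted, so this is only a guess at a corrected input block, assuming the log really is at /var/log/system.log on my Mac:

input {
  file {
    path => "/var/log/system.log"   # the file input wants an absolute path
    sincedb_path => "/dev/null"     # throwaway sincedb so the file is re-read from the beginning
    start_position => "beginning"
  }
}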
