How to make a custom grok filter

Hello
I have to create a custom grok filter for my logs. Could you please tell me the procedure: where should the file with my custom pattern be saved, what extension should it have, and how do I use patterns_dir?

Regards
Gaurav

Have you read the grok filter documentation's fairly long section about custom patterns? If so, I'd expect you to be able to ask a slightly more specific question.

Just as an example for my question, let this be my custom pattern:

CUST_DATE %{MONTH} %{MONTHDAY} %{TIME}

which I saved in the bin folder of Logstash with the name patterns.txt.

Now this is my config file, which I have also saved in the Logstash bin folder, as filename.conf:

input {
  stdin {
  }
}
filter {
  grok {
    patterns_dir => ["./patterns"]
    match => { "message" => "%{CUST_DATE:date}" }
  }
}
output {
  stdout {
  }
}

But when I try to execute this, I get the following error:

[2018-02-21T11:05:12,119][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature. If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch hosts=>[http://localhost:9200], bulk_path=>"/_xpack/monitoring/_bulk?system_id=logstash&system_api_version=2&interval=1s", manage_template=>false, document_type=>"%{[@metadata][document_type]}", sniffing=>false, id=>"ad524e5a1a68d2ca7086e1144ec98005bcfc1ad3103a990fb9bbf21aa44aa140", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_f8baa5fb-f149-4087-a45d-a88d57246b18", enable_metric=>true, charset=>"UTF-8">, workers=>1, template_name=>"logstash", template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, action=>"index", ssl_certificate_verification=>true, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}

[2018-02-21T11:05:12,475][WARN ][logstash.licensechecker.licensereader] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>6}
[2018-02-21T11:05:12,503][INFO ][logstash.pipeline ] Pipeline started {"pipeline.id"=>".monitoring-logstash"}
[2018-02-21T11:05:12,901][ERROR][logstash.pipeline ] Error registering plugin {:pipeline_id=>"main", :plugin=>"#<LogStash::FilterDelegator:0x7c9c5dc0 @metric_events_out=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: out value:0, @metric_events_in=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: in value:0, @logger=#<LogStash::Logging::Logger:0x12b2a0c1 @logger=#<Java::OrgApacheLoggingLog4jCore::Logger:0x54166931>>, @metric_events_time=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: duration_in_millis value:0, @id="b2db37c04c07ac307db2757ab270f769693c0efaba09a9a7751a09970f65ca9b", @klass=LogStash::Filters::Grok, @metric_events=#<LogStash::Instrument::NamespacedMetric:0xbd850ff @metric=#<LogStash::Instrument::Metric:0xceb4ded @collector=#<LogStash::Instrument::Collector:0x6bd3e2a0 @agent=nil, @metric_store=#<LogStash::Instrument::MetricStore:0x5dbfe853 @store=#<Concurrent::Map:0x00000000000fb8 entries=3 default_proc=nil>, @structured_lookup_mutex=#<Mutex:0x142904b3>, @fast_lookup=#<Concurrent::Map:0x00000000000fbc entries=59 default_proc=nil>>>>, @namespace_name=[:stats, :pipelines, :main, :plugins, :filters, :b2db37c04c07ac307db2757ab270f769693c0efaba09a9a7751a09970f65ca9b, :events]>, @filter=<LogStash::Filters::Grok patterns_dir=>["./patterns"], match=>{"message"=>"%{CUST_DATE:date}"}, id=>"b2db37c04c07ac307db2757ab270f769693c0efaba09a9a7751a09970f65ca9b", enable_metric=>true, periodic_flush=>false, patterns_files_glob=>"*", break_on_match=>true, named_captures_only=>true, keep_empty_captures=>false, tag_on_failure=>["_grokparsefailure"], timeout_millis=>30000, tag_on_timeout=>"_groktimeout">>", :error=>"pattern %{CUST_DATE:date} not defined", :thread=>"#<Thread:0x36c99bf3@C:/Users/gagarwal3/Downloads/logstash/logstash-core/lib/logstash/pipeline.rb:245 run>"}
[2018-02-21T11:05:12,904][ERROR][logstash.pipeline ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<Grok::PatternError: pattern %{CUST_DATE:date} not defined>, :backtrace=>["C:/Users/gagarwal3/Downloads/logstash/vendor/bundle/jruby/2.3.0/gems/jls-grok-0.11.4/lib/grok-pure.rb:123:in `block in compile'", "org/jruby/RubyKernel.java:1292:in `loop'", "C:/Users/gagarwal3/Downloads/logstash/vendor/bundle/jruby/2.3.0/gems/jls-grok-0.11.4/lib/grok-pure.rb:93:in `compile'", "C:/Users/gagarwal3/Downloads/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.1/lib/logstash/filters/grok.rb:286:in `block in register'", "org/jruby/RubyArray.java:1734:in `each'", "C:/Users/gagarwal3/Downloads/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.1/lib/logstash/filters/grok.rb:280:in `block in register'", "org/jruby/RubyHash.java:1343:in `each'", "C:/Users/gagarwal3/Downloads/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.1/lib/logstash/filters/grok.rb:275:in `register'", "C:/Users/gagarwal3/Downloads/logstash/logstash-core/lib/logstash/pipeline.rb:343:in `register_plugin'", "C:/Users/gagarwal3/Downloads/logstash/logstash-core/lib/logstash/pipeline.rb:354:in `block in register_plugins'", "org/jruby/RubyArray.java:1734:in `each'", "C:/Users/gagarwal3/Downloads/logstash/logstash-core/lib/logstash/pipeline.rb:354:in `register_plugins'", "C:/Users/gagarwal3/Downloads/logstash/logstash-core/lib/logstash/pipeline.rb:744:in `maybe_setup_out_plugins'", "C:/Users/gagarwal3/Downloads/logstash/logstash-core/lib/logstash/pipeline.rb:364:in `start_workers'", "C:/Users/gagarwal3/Downloads/logstash/logstash-core/lib/logstash/pipeline.rb:288:in `run'", "C:/Users/gagarwal3/Downloads/logstash/logstash-core/lib/logstash/pipeline.rb:248:in `block in start'"], :thread=>"#<Thread:0x36c99bf3@C:/Users/gagarwal3/Downloads/logstash/logstash-core/lib/logstash/pipeline.rb:245 run>"}
[2018-02-21T11:05:12,910][ERROR][logstash.agent ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: LogStash::PipelineAction::Create/pipeline_id:main, action_result: false", :backtrace=>nil}
[2018-02-21T11:05:12,917][INFO ][logstash.inputs.metrics ] Monitoring License OK
[2018-02-21T11:05:13,534][INFO ][logstash.pipeline ] Pipeline terminated {"pipeline.id"=>".monitoring-logstash"}

patterns_dir => ["./patterns"]

But this isn't consistent with you placing patterns.txt in the Logstash bin directory.

That is where I am stuck. Can you please help me fix this? What should I do to make them consistent?

The patterns_dir option should point to whatever directory you've chosen to store patterns.txt in.

As a side note, I recommend storing patterns.txt along with your Logstash configuration files (but not in the conf.d directory!) so there's no risk of losing it when you upgrade Logstash.
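To illustrate that suggestion, a layout like the following could work (the directory names here are only an example, not something the thread prescribes):

C:\Users\gagarwal3\Downloads\logstash\
    config\
        filename.conf
    patterns\
        patterns.txt

With that layout, the grok filter would reference the patterns directory, not the file inside it:

filter {
  grok {
    patterns_dir => ["C:/Users/gagarwal3/Downloads/logstash/patterns"]
    match => { "message" => "%{CUST_DATE:date}" }
  }
}

Forward slashes are used in the path because Logstash accepts them on Windows and they avoid backslash-escaping issues.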

input {
  stdin {
  }
}
filter {
  grok {
    patterns_dir => [".C:\Users\gagarwal3\Downloads\logstash\bin\patterns.txt"]
    match => { "message" => "%{CUST_DATE:date}" }
  }
}
output {
  stdout {
  }
}

Is this correct?

No.

  • Your patterns_dir value starts with a period.
  • The patterns_dir option should point to whatever directory you've chosen to store patterns.txt in.

And again, the bin directory is a bad choice. Quoting the documentation:

Note that Grok will read all files in the directory matching the patterns_files_glob and assume it’s a pattern file (including any tilde backup files).

So, create a directory where you only store pattern files and point patterns_dir to that directory.
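If a dedicated directory is not practical, the patterns_files_glob option (visible in the error log above with its default value of "*") can restrict which files grok reads. A sketch, with an illustrative path:

filter {
  grok {
    patterns_dir => ["C:/Users/gagarwal3/Downloads/logstash/patterns"]
    patterns_files_glob => "*.txt"
    match => { "message" => "%{CUST_DATE:date}" }
  }
}

This way only files ending in .txt in that directory are treated as pattern files.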

I have created a new directory named "pat" in which I have saved the patterns file. Can you please let me know if this config file is correct? If not, can you show me the syntax for patterns_dir?
input {
  stdin {
  }
}
filter {
  grok {
    patterns_dir => ["C:\Users\gagarwal3\Downloads\logstash\pat\patterns.conf"]
    match => { "message" => "%{CUST_DATE:date}" }
  }
}
output {
  stdout {
  }
}

Is C:\Users\gagarwal3\Downloads\logstash\pat\patterns.conf a directory?

That is the exact path of the file containing my custom pattern; under the pat folder I have my file patterns.conf.

Yes, and for the third time: the patterns_dir option should point to whatever directory you've chosen to store patterns.conf in. Over and out.
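To make the fix concrete, here is a sketch of a consistent setup based on the paths mentioned in the thread (treat them as illustrative). The pattern file C:\Users\gagarwal3\Downloads\logstash\pat\patterns.conf contains only the pattern definition:

CUST_DATE %{MONTH} %{MONTHDAY} %{TIME}

and the grok filter points at the directory that contains it, not at the file itself:

filter {
  grok {
    patterns_dir => ["C:/Users/gagarwal3/Downloads/logstash/pat"]
    match => { "message" => "%{CUST_DATE:date}" }
  }
}

With this configuration, an stdin line such as "Feb 21 11:05:12 something happened" should produce a date field containing "Feb 21 11:05:12", since grok matches are unanchored and CUST_DATE matches the leading timestamp.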

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.