Grok isn't loading patterns


(Nivo33) #1

Hi,

I have tried to put some custom grok patterns into a separate file and load it into my Logstash config using patterns_dir. Each time I run Logstash with the following config, it aborts with a "pattern not defined" error. I'm using Logstash 5.5.

here is my pattern file "/etc/elk/logstash/patterns/temp_patterns":

OPENSTACK_PROG (?:[ a-zA-Z0-9_\-]+\.)+[ A-Za-z0-9_\-$]+

REQ_LIST (\[(?:(req-%{UUID:request_id_list}|%{UUID:request_id_list}|%{BASE16NUM}|None|-|%{SPACE}))+\])?

(there's a newline between the patterns in the saved file; not sure if that's relevant)
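To rule out a syntax problem in the pattern itself, the raw regex can be checked outside Logstash. A quick sketch in Python (the sample program name is a made-up keystone-style value, not taken from an actual log):

```python
import re

# Raw regex body of the OPENSTACK_PROG pattern, without grok's %{...} wrapper
OPENSTACK_PROG = r"(?:[ a-zA-Z0-9_\-]+\.)+[ A-Za-z0-9_\-$]+"

# Hypothetical sample of a dotted OpenStack program name
sample = "keystone.common.wsgi"

# fullmatch requires the regex to cover the whole string
match = re.fullmatch(OPENSTACK_PROG, sample)
print(match is not None)
```

If this matches, the regex itself is fine and the problem is with how (or whether) the pattern file gets loaded.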

and my conf:

input {
    beats {
        port => "5043"
        ssl => false
    }
}


filter {
  if "keystone" in [fields][tag] or [source] == "/var/log/keystone/keystone.log" {
    grok {
        patterns_dir => ["/etc/elk/logstash/patterns"]
        match => {"message" => "%{TIMESTAMP_ISO8601:timestamp} %{POSINT:openstack_pid} %{LOGLEVEL:level} %{OPENSTACK_PROG:openstack_program} %{REQ_LIST} %{WORD:verb} %{URI:URI}"}
        add_tag => ["found"]
    }
  }
}


output {
    elasticsearch {
        hosts => ["elk-elasticsearch:9201"]
    }
}

and the error that pops up:

[2017-09-13T09:48:49,027][ERROR][logstash.agent           ] Pipeline aborted due to error {:exception=>#<Grok::PatternError: pattern %{OPENSTACK_PROG:openstack_program} not defined>, :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/1.9/gems/jls-grok-0.11.4/lib/grok-pure.rb:123:in `compile'", "org/jruby/RubyKernel.java:1479:in `loop'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/jls-grok-0.11.4/lib/grok-pure.rb:93:in `compile'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-grok-3.3.1/lib/logstash/filters/grok.rb:274:in `register'", "org/jruby/RubyArray.java:1613:in `each'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-grok-3.3.1/lib/logstash/filters/grok.rb:269:in `register'", "org/jruby/RubyHash.java:1342:in `each'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-grok-3.3.1/lib/logstash/filters/grok.rb:264:in `register'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:235:in `start_workers'", "org/jruby/RubyArray.java:1613:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:235:in `start_workers'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:188:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:302:in `start_pipeline'"]}
[2017-09-13T09:48:49,083][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2017-09-13T09:48:52,044][WARN ][logstash.agent           ] stopping pipeline {:id=>"main"}

Any idea what is causing the issue?

Thanks


(Imma) #2

Newlines should not be relevant.
As far as I can see, it looks good.
Does your Logstash process run as a user that has permission to read the patterns file?


(Nachiket) #3

Hi,

The patterns_dir option should be a directory path, not a file path.

Regards,
N


(Nachiket) #4

My bad, it is a directory in your config.

Are you seeing the data in elasticsearch? Are the tags proper?


(Nivo33) #5

Problem solved. I was running Logstash through Docker, and the directories were mapped differently inside the container.
Thanks!
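For anyone hitting the same thing: patterns_dir refers to a path inside the container, so the host directory holding the pattern file has to be mounted there. A hypothetical docker-compose fragment (the image tag and the reuse of the host path inside the container are assumptions; adjust to your setup):

```yaml
# Mount the host patterns directory at the same path that
# patterns_dir in the pipeline config points to inside the container.
services:
  logstash:
    image: docker.elastic.co/logstash/logstash:5.5.2
    volumes:
      - /etc/elk/logstash/patterns:/etc/elk/logstash/patterns:ro
```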


(system) #6

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.