Logstash fails to compile when a custom pattern is used inside a filter

The problem is this: I have a custom pattern file in the ./patterns directory.

It looks like this:

XCNODELISTENUM (([A-Za-z0-9]{0,20})(\-)?([A-Za-z0-9]{0,20})(\.[A-Za-z0-9]{0,20})?(\,)*([A-Za-z0-9]{0,20}(\-?[A-Za-z0-9]{0,20})*)(\.[A-Za-z0-9]{0,20})?)+
XCAT_1 ([a-z]{5,5})\s\-([A-Za-z])\s([a-z]{4,4})\s\-([A-Za-z])\s(?:%{XCNODELISTENUM})
XCAT_2 (\-([A-Za-z]\s(?:%{XCNODELISTENUM})\s[a-z]{5,5})\s\-([A-Za-z])\s([a-z]{4,4}))
XCAT (%{XCAT_1}|%{XCAT_2})

XCATCOMMEXEC ([a-z]{5,5})\s\-([A-Za-z])\s([a-z]{4,4})
OPTION (\-([A-Za-z]))
NODESINVOLVED (([A-Za-z0-9]{0,20})(\-)?([A-Za-z0-9]{0,20})(\.[A-Za-z0-9]{0,20})?(\,)*([A-Za-z0-9]{0,20}(\-?[A-Za-z0-9]{0,20})*)(\.[A-Za-z0-9]{0,20})?)+
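
For reference, a grok patterns file is expected to contain one definition per line: the pattern name first, separated from its regular expression by whitespace, and the name has to match the %{NAME} references exactly. A minimal sketch of that layout, using made-up names:

EXAMPLEWORD [A-Za-z0-9]{1,20}
EXAMPLELIST %{EXAMPLEWORD}(,%{EXAMPLEWORD})*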

The filter in which those patterns are used looks like this:

filter {
    if [type] == "syslog" and !("parsed_by_added_cron_filter" in [tags]) {
        grok {
            patterns_dir => ["./patterns"]
            remove_tag => ["_grokparsefailure"]
            match => {
                "message" => ["%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: xCAT: Allowing %{XCATCOMMEXEC:xCAT_comm_exec} %{OPTION:option} ?%{NODESINVOLVED:nodes_involved} for %{USERNAME:xcat_user} from %{SYSLOGHOST:xcat_user_hostname}"]
            }
            add_field => [ "received_at", "%{@timestamp}" ]
            add_field => [ "received_from", "%{host}" ]
        }
    }
    syslog_pri { }
}
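
In case it is useful, the configuration can be syntax-checked without starting the whole pipeline using Logstash's config test flag. The paths and config file name below are assumptions about a package install; adjust them to your layout:

/usr/share/logstash/bin/logstash --path.settings /etc/logstash --config.test_and_exit -f /etc/logstash/conf.d/xcat_filter.conf

Note that this validates the configuration structure and plugin options; the custom pattern lookup itself happens when the grok filter registers at pipeline startup, so a passing config test does not necessarily mean the patterns resolve.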

This is the log message showing where Logstash fails to register the grok filter:

[2017-05-03T12:42:29,507][ERROR][logstash.pipeline        ] Error registering plugin {:plugin=>"#<LogStash::FilterDelegator:0x30da3bcb @id=\"d2fe4d8a1b6009020b724f61f22506bdecdfdb3f-6\", @klass=LogStash::Filters::Grok, @metric_events=#<LogStash::Instrument::NamespacedMetric:0x2026f0d4 @metric=#<LogStash::Instrument::Metric:0x719b7df8 @collector=#<LogStash::Instrument::Collector:0x397c0497 @agent=nil, @metric_store=#<LogStash::Instrument::MetricStore:0x58197410 @store=#<Concurrent::Map:0x4fae9f97 @default_proc=nil>, @structured_lookup_mutex=#<Mutex:0x65704f27>, @fast_lookup=#<Concurrent::Map:0x3c71a7a2 @default_proc=nil>>>>, @namespace_name=[:stats, :pipelines, :main, :plugins, :filters, :\"d2fe4d8a1b6009020b724f61f22506bdecdfdb3f-6\", :events]>, @logger=#<LogStash::Logging::Logger:0x14329d83 @logger=#<Java::OrgApacheLoggingLog4jCore::Logger:0x3777882e>>, @filter=<LogStash::Filters::Grok patterns_dir=>[\"./patterns\"], remove_tag=>[\"_grokparsefailure\"], match=>{\"message\"=>[\"%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\\\\[%{POSINT:syslog_pid}\\\\])?: xCAT: Allowing %{XCATCOMMEXEC:xCAT_comm_exec} %{OPTION:option} ?%{NODESINVOLVED:nodes_involved} for %{USERNAME:xcat_user} from %{SYSLOGHOST:xcat_user_hostname}\"]}, add_field=>{\"received_at\"=>\"%{@timestamp}\", \"received_from\"=>\"%{host}\"}, id=>\"d2fe4d8a1b6009020b724f61f22506bdecdfdb3f-6\", enable_metric=>true, periodic_flush=>false, patterns_files_glob=>\"*\", break_on_match=>true, named_captures_only=>true, keep_empty_captures=>false, tag_on_failure=>[\"_grokparsefailure\"], timeout_millis=>30000, tag_on_timeout=>\"_groktimeout\">>", :error=>"pattern %{XCATCOMMEXEC:xCAT_comm_exec} not defined"}

Have you tried using an absolute path to the patterns directory?

Yes, I tried an absolute path to the patterns directory (/install/gitlab/elk/patterns), and I also tried an absolute path to the file where the patterns are defined (/install/gitlab/elk/patterns/xcat_pattern).

The log message is the same, and Logstash still fails to start the pipeline.
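
As a further experiment, the custom patterns could be supplied directly to the filter through the grok filter's pattern_definitions option (if the installed version of the filter supports it), which bypasses patterns_dir and file loading entirely. A sketch with two of the patterns inlined; the match, add_field, and remaining patterns would stay as in the filter above:

grok {
    # inline the custom patterns instead of loading them from a file
    pattern_definitions => {
        "XCATCOMMEXEC" => "([a-z]{5,5})\s\-([A-Za-z])\s([a-z]{4,4})"
        "OPTION" => "(\-([A-Za-z]))"
    }
    # match => { "message" => [ ... same pattern as above ... ] }
}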
