Using @metadata in logstash custom pattern for grok?

Hi.

I am trying to use @metadata in a Logstash custom pattern for grok, but it looks like @ is not allowed in the field name. Is there any way to escape @ in the custom pattern?

[2016-11-25T19:51:21,685][ERROR][logstash.agent ] Pipeline aborted due to error {:exception=>#<RegexpError: invalid char in group name <[@metadata][datetime]>: /(?:(?<[@metadata][datetime]>(?:(?>\d\d){1,2})-(?:(?:0?[1-9]|1[0-2]))-(?:(?:(?:0[1-9])|(?:[12][0-9])|(?:3[01])|[1-9]))[]+(?:(?!<[0-9])(?:(?:2[0123]|[01]?[0-9])):(?:(?:[0-5][0-9]))(?::(?:(?:(?:[0-5]?[0-9]|60)(?:[:.,][0-9]+)?)))(?![0-9]))) (?<NUMBER:thread_id>(?:(?:(?<![0-9.+-])(?>[+-]?(?:(?:[0-9]+(?:.[0-9]+)?)|(?:.[0-9]+)))))) [(?<WORD:error_type>\b\w+\b)](?:.*))/m>, :backtrace=>["org/jruby/RubyRegexp.java:1434:in `initialize'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/jls-grok-0.11.4/lib/grok-pure.rb:127:in `compile'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-grok-3.3.0/lib/logstash/filters/grok.rb:272:in `register'", "org/jruby/RubyArray.java:1613:in `each'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-grok-3.3.0/lib/logstash/filters/grok.rb:267:in `register'", "org/jruby/RubyHash.java:1342:in `each'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-grok-3.3.0/lib/logstash/filters/grok.rb:262:in `register'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:197:in `start_workers'", "org/jruby/RubyArray.java:1613:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:197:in `start_workers'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:153:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:250:in `start_pipeline'"]}

My grok definition is below.

(?<[@metadata][datetime]>%{YEAR}-%{MONTHNUM}-%{MONTHDAY}[]+%{TIME}) %{NUMBER:thread_id} \[%{WORD:error_type}\]%{GREEDYDATA}

No, I don't think it's possible to capture into fields under @metadata.
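A common workaround (a sketch, not confirmed in this thread) is to let grok capture into an ordinary temporary field and then move it under @metadata with the mutate filter's rename option, which does accept [@metadata] targets. The field name `grok_datetime` below is hypothetical:

```
filter {
  grok {
    # Capture into a regular field; grok's named captures cannot contain "@".
    match => { "message" => "(?<grok_datetime>%{YEAR}-%{MONTHNUM}-%{MONTHDAY}[ ]+%{TIME}) %{NUMBER:thread_id} \[%{WORD:error_type}\]%{GREEDYDATA}" }
  }
  mutate {
    # Move the temporary field under @metadata so it is not included in the output.
    rename => { "grok_datetime" => "[@metadata][datetime]" }
  }
}
```

After this, [@metadata][datetime] can be referenced elsewhere in the pipeline (e.g. by the date filter) without appearing in the indexed event.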

@magnusbaeck Ok, thanks.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.