Replacing @timestamp in logstash using grok fails [Solved]


(Björn Thoor) #1

Hi.

The setup is: Filebeat 5.4.0 (Windows Server 2012 R2) -> Logstash 5.4.0 (RHEL 7) -> Elasticsearch 5.4.0 (RHEL 7), with Kibana 5.4.0 (RHEL 7) as the presentation layer.
This is a sample line from the logs:

2017-04-19 15:25:40,378 [80] DEBUG [something] [(null)] - Message was handled without exception 974cc6f7dd91407dbe435439058cda27be4

The resulting @timestamp field is this:
@timestamp:May 30th 2017, 10:35:11.786

And here are the relevant bits of the logstash config:

filter {
  mutate {
    remove_field => [ "[beat][version]", "[offset]", "[type]", "[@version]",
                      "[input_type]", "[beat][name]", "[beat][hostname]", "[tags]" ]
    rename => { "[host]" => "[hostname]" }
  }
  grok {
    match => [ "message", "%{TIMESTAMP_ISO8601}" ]
    overwrite => [ "@timestamp" ]
    add_field => { "TimestampTest" => "grok" }
  }
}

The grok filter does trigger, since "TimestampTest" shows up in Elasticsearch, but @timestamp is not replaced. I've checked the logs and there's nothing interesting in them.

Any idea as to what might be wrong?


(Magnus Bäck) #2

%{TIMESTAMP_ISO8601} doesn't capture the timestamp into a field. You need %{TIMESTAMP_ISO8601:@timestamp}, but I'm not sure such a direct assignment will work. One typically extracts the timestamp to a temporary field and uses the date filter to process it.


(Björn Thoor) #3

This is the config I tried:

    grok {
        match => [ "message", "%{TIMESTAMP_ISO8601:@timestamp}" ]
        overwrite => [ "@timestamp" ]
        add_field => {"TimestampTest" => "grok"}
      }

And it resulted in this:

[2017-05-30T12:19:35,234][WARN ][logstash.filters.grok ] Grok regexp threw exception {:exception=>"wrong argument type String (expected LogStash::Timestamp)", :backtrace=>["org/logstash/ext/JrubyEventExtLibrary.java:124:in `set'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-grok-3.3.1/lib/logstash/filters/grok.rb:351:in `handle'", ...], :class=>"TypeError"}

I then replaced the grok block with the following:

date {
  match => [ "message", "%{TIMESTAMP_ISO8601:@timestamp}" ]
  target => [ "@timestamp" ]
  add_field => { "DateTest" => "true" }
}

which resulted in this in the logs:

[ERROR][logstash.agent ] Cannot create pipeline {:reason=>"translation missing: en.logstash.agent.configuration.invalid_plugin_register"}

I've installed the Elastic stack from the RPM packages in the official repo.


(Magnus Bäck) #4

Yes, as I suspected, you can't capture straight into @timestamp. Keep your grok filter but capture the timestamp to a different field than @timestamp. Use that field in your date filter, and use a date pattern that matches your timestamp, like "ISO8601".


(Björn Thoor) #5

Right, so I tried this:

filter {
  mutate {
    # Remove redundant information
    remove_field => [ "[beat][version]", "[offset]", "[type]", "[@version]",
                      "[input_type]", "[beat][name]", "[beat][hostname]", "[tags]" ]

    rename => { "[host]" => "[hostname]" }

    add_field => { "TimestampTemp" => "x" }
  }
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:@timestamp}" }
    overwrite => [ "TimestampTemp" ]
  }
  date {
    match => [ "TimestampTemp", "%{TIMESTAMP_ISO8601:@timestamp}" ]
    target => [ "@timestamp" ]
    remove_field => [ "TimestampTemp" ]
  }
}

I tried it both with and without @timestamp in the match clause of the date block, only to be greeted by this:

[ERROR][logstash.agent ] Cannot create pipeline {:reason=>"Illegal pattern component: T"}

Since the error doesn't say which line contains the offending clause, I can only guess that the match in the date block is the problem, so I tried a blunt regex like so:

match => [ "TimestampTemp", "\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2},\d{3}" ]

That did remove the vague error message from the logs, but it did nothing in terms of replacing @timestamp.
As it turns out, the problem is that TimestampTemp is never overwritten: the documentation for grok's overwrite option says it only takes effect when the match clause actually captures into that field, unlike the grok block above.

At least it seems I've found the issue but unfortunately that doesn't bring me any closer to a solution.

Any ideas?


(Magnus Bäck) #6

You're not following any of the recommendations in my previous post. This is what I meant:

grok {
  match => { "message" => "%{TIMESTAMP_ISO8601:TimestampTemp}" }
}
date {
  match => [ "TimestampTemp", "ISO8601" ]
  remove_field => [ "TimestampTemp" ]
}

(Björn Thoor) #7

I did more or less what you recommended, like so:

filter {
  grok {
    match => { "message" => "(?m)%{TIMESTAMP_ISO8601:timestamp}" }
  }
  date {
    match => [ "timestamp", "ISO8601", "yyyy-MM-dd HH:mm:ss,SSS" ]
    remove_field => [ "timestamp" ]
  }
}

And it works, thanks for your help!
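For anyone reading along, the working config does two things: grok extracts the leading timestamp from the message into a temporary field, and the date filter parses that field with the Joda pattern "yyyy-MM-dd HH:mm:ss,SSS" and writes the result into @timestamp. A rough Python sketch of the same two steps, using a simplified stand-in regex for grok's TIMESTAMP_ISO8601 pattern (not the real grok definition):

```python
import re
from datetime import datetime

line = ("2017-04-19 15:25:40,378 [80] DEBUG [something] [(null)] "
        "- Message was handled without exception 974cc6f7dd91407dbe435439058cda27be4")

# Step 1: extract the timestamp (plays the role of the "timestamp" grok capture)
m = re.match(r"(\d{4}-\d{2}-\d{2}[ T]\d{2}:\d{2}:\d{2},\d{3})", line)
timestamp = m.group(1)

# Step 2: parse it; "%Y-%m-%d %H:%M:%S,%f" is roughly the strptime
# equivalent of the Joda pattern "yyyy-MM-dd HH:mm:ss,SSS"
parsed = datetime.strptime(timestamp, "%Y-%m-%d %H:%M:%S,%f")
print(parsed.isoformat())  # 2017-04-19T15:25:40.378000
```

Note that the date filter also applies a timezone when converting to UTC for @timestamp, which this sketch does not model.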


(system) #8

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.