Problem with reading aggregation map file

I'm using the aggregate plugin and currently testing whether the aggregate_maps_path option works for writing the aggregation maps to a file when I stop Logstash and reading them back in when I restart it.
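For context, this is roughly the shape of the configuration I'm testing (the task_id field and the map code here are illustrative, not my exact pipeline; note that the aggregate filter requires pipeline.workers set to 1):

```
filter {
  aggregate {
    task_id => "%{transaction_id}"   # illustrative correlation key
    code => "map['count'] ||= 0; map['count'] += 1"
    push_map_as_event_on_timeout => true
    timeout => 120
    # persist in-flight maps on shutdown, reload them on restart
    aggregate_maps_path => "/tmp/aggregation-maps.logstash"
  }
}
```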

I'm getting these errors (I've seen and ignored the first warning before; I don't know whether it's related):

```
[2019-12-13T22:04:26,395][INFO ][logstash.filters.aggregate][main] Aggregate maps loaded from : /tmp/aggregation-maps.logstash
[2019-12-13T22:04:26,499][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][main] A gauge metric of an unknown type (org.jruby.RubyArray) has been create for key: cluster_uuids. This may result in invalid serialization.  It is recommended to log an issue to the responsible developer/development team.
[2019-12-13T22:04:26,502][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>125, :thread=>"#<Thread:0x6936b532 run>"}
[2019-12-13T22:04:26,932][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2019-12-13T22:04:30,025][ERROR][org.logstash.execution.WorkerLoop][main] Exception in pipelineworker, the pipeline stopped processing new events, please check your filter configuration and restart Logstash.
org.jruby.exceptions.Exception: (GeneratorError) (was java.lang.NullPointerException) (through reference chain: org.logstash.ConvertedMap["_@timestamp"])
```

I've been trying various things, and I've seen the ConvertedMap error refer to @timestamp as well as to two date fields.
Anyone ever see anything like this before?
Any clues as to what this error means or how to debug it?
I've tried everything I can think of. I'm totally stuck at this point.

I got around the problem once I realized I could save timestamps to the maps as plain values and convert them to dates at the very end, after aggregation. I also don't save the original timestamp to the map; an @timestamp is added when the aggregated event is pushed.
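A minimal sketch of that workaround (the transaction_id key and start_ts field names are illustrative): store an epoch number in the map rather than a Timestamp object, then use the date filter's UNIX pattern to turn it back into a date on the pushed event.

```
filter {
  aggregate {
    task_id => "%{transaction_id}"
    # store a plain epoch float, not a Logstash Timestamp object
    code => "map['start_ts'] ||= event.get('@timestamp').to_f"
    push_map_as_event_on_timeout => true
    timeout => 120
    aggregate_maps_path => "/tmp/aggregation-maps.logstash"
  }

  # on the aggregated event, convert the stored number back to a date
  if [start_ts] {
    date {
      match => ["start_ts", "UNIX"]
      target => "start_time"
    }
  }
}
```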

Still, there seems to be a bug in the aggregate filter.
