Pipeline stopped with "Index 83 out of bounds for length 16"

Hi,

We have a Logstash configuration with several pipelines: a main pipeline receives the information, identifies the type, and sends it to different pipelines for processing before the information is sent to Elasticsearch.

In one of the pipelines we use the geoip filter:

    geoip {
      source   => "[checkpoint][origin]"
      target   => "[checkpoint][origin-GEOIP]"
      database => "/etc/logstash/config/GeoLite2-City.mmdb"
    }

We used to download the database from MaxMind and copy it to that destination, and this worked fine with Logstash 7.5.

After upgrading to 7.12.2, we have noticed a couple of times that the pipeline stopped and Logstash began logging "max open files" errors. The last time it happened I got more information from the log; before the "max open files" messages, the first error is:

[2022-02-09T08:00:08,710][ERROR][logstash.javapipeline    ][checkpoint] Pipeline worker error, the pipeline will be stopped {:pipeline_id=>"checkpoint", :error=>"Index 83 out of bounds for length 16", :exception=>Java::JavaLang::ArrayIndexOutOfBoundsException, :backtrace=>[
  "com.maxmind.db.Decoder$Type.get(com/maxmind/db/Decoder.java:52)",
  "com.maxmind.db.Decoder.decode(com/maxmind/db/Decoder.java:128)",
  "com.maxmind.db.Decoder.decode(com/maxmind/db/Decoder.java:87)",
  "com.maxmind.db.Reader.resolveDataPointer(com/maxmind/db/Reader.java:252)",
  "com.maxmind.db.Reader.get(com/maxmind/db/Reader.java:150)",
  "com.maxmind.geoip2.DatabaseReader.get(com/maxmind/geoip2/DatabaseReader.java:151)",
  "com.maxmind.geoip2.DatabaseReader.city(com/maxmind/geoip2/DatabaseReader.java:202)",
  "org.logstash.filters.geoip.GeoIPFilter.retrieveCityGeoData(org/logstash/filters/geoip/GeoIPFilter.java:234)",
  "org.logstash.filters.geoip.GeoIPFilter.handleEvent(org/logstash/filters/geoip/GeoIPFilter.java:177)",
  "jdk.internal.reflect.GeneratedMethodAccessor95.invoke(jdk/internal/reflect/GeneratedMethodAccessor95)",
  "jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(jdk/internal/reflect/DelegatingMethodAccessorImpl.java:43)",
  "java.lang.reflect.Method.invoke(java/lang/reflect/Method.java:566)",
  "org.jruby.javasupport.JavaMethod.invokeDirectWithExceptionHandling(org/jruby/javasupport/JavaMethod.java:456)",
  "org.jruby.javasupport.JavaMethod.invokeDirect(org/jruby/javasupport/JavaMethod.java:317)",
  "usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_filter_minus_geoip_minus_7_dot_2_dot_8_minus_java.lib.logstash.filters.geoip.filter(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-geoip-7.2.8-java/lib/logstash/filters/geoip.rb:117)",
  "usr.share.logstash.logstash_minus_core.lib.logstash.filters.base.do_filter(/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:159)",
  "usr.share.logstash.logstash_minus_core.lib.logstash.filters.base.multi_filter(/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:178)",
  "org.jruby.RubyArray.each(org/jruby/RubyArray.java:1821)",
  "usr.share.logstash.logstash_minus_core.lib.logstash.filters.base.multi_filter(/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:175)",
  "org.logstash.config.ir.compiler.FilterDelegatorExt.doMultiFilter(org/logstash/config/ir/compiler/FilterDelegatorExt.java:127)",
  "org.logstash.config.ir.compiler.AbstractFilterDelegatorExt.multi_filter(org/logstash/config/ir/compiler/AbstractFilterDelegatorExt.java:134)",
  "org.logstash.generated.CompiledDataset3.compute(org/logstash/generated/CompiledDataset3)",
  "org.logstash.generated.CompiledDataset6.compute(org/logstash/generated/CompiledDataset6)",
  "org.logstash.generated.CompiledDataset3.compute(org/logstash/generated/CompiledDataset3)",
  "org.logstash.generated.CompiledDataset14.compute(org/logstash/generated/CompiledDataset14)",
  "org.logstash.config.ir.CompiledPipeline$CompiledUnorderedExecution.compute(org/logstash/config/ir/CompiledPipeline.java:329)",
  "org.logstash.config.ir.CompiledPipeline$CompiledUnorderedExecution.compute(org/logstash/config/ir/CompiledPipeline.java:323)",
  "org.logstash.execution.WorkerLoop.run(org/logstash/execution/WorkerLoop.java:87)",
  "jdk.internal.reflect.GeneratedMethodAccessor83.invoke(jdk/internal/reflect/GeneratedMethodAccessor83)",
  "jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(jdk/internal/reflect/DelegatingMethodAccessorImpl.java:43)",
  "java.lang.reflect.Method.invoke(java/lang/reflect/Method.java:566)",
  "org.jruby.javasupport.JavaMethod.invokeDirectWithExceptionHandling(org/jruby/javasupport/JavaMethod.java:441)",
  "org.jruby.javasupport.JavaMethod.invokeDirect(org/jruby/javasupport/JavaMethod.java:305)",
  "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.start_workers(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:299)",
  "org.jruby.RubyProc.call(org/jruby/RubyProc.java:318)",
  "java.lang.Thread.run(java/lang/Thread.java:829)"
], :thread=>"#<Thread:0x302ceb6c sleep>"}

[root@hostname logstash]# systemctl start logstash

If I check the database file, I see it was last modified (updated) at about the same date and time.

Has anyone experienced a similar situation? Should I stop Logstash while updating the database in the new version? On 7.5 we never had this problem.

thanks

It looks like the database is corrupt. Looking at a slightly different commit of the library, each piece of data in the file is preceded by a byte that tells you its type. There are only 16 types, so type 83 is invalid; the exception in your backtrace is thrown when Decoder$Type.get indexes that 16-entry type table with 83.

On the MaxMind download page, as well as downloading the GeoLite2-City database itself, you can download the SHA256 hash of the .tar.gz, which lets you verify whether the download is intact. The database and the hash change every week, so you would need to do a fresh download.
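For example, something along these lines (a minimal sketch: the permalink URL format, license key, and file names are assumptions, so check your MaxMind account for the exact download links):

    # Hypothetical license key and permalink format -- adjust to your account.
    KEY="YOUR_LICENSE_KEY"
    URL="https://download.maxmind.com/app/geoip_download?edition_id=GeoLite2-City&license_key=${KEY}"

    curl -sfo GeoLite2-City.tar.gz        "${URL}&suffix=tar.gz"
    curl -sfo GeoLite2-City.tar.gz.sha256 "${URL}&suffix=tar.gz.sha256"

    # The .sha256 file references the dated archive name, so compare the
    # digests directly instead of relying on "sha256sum -c":
    want=$(cut -d' ' -f1 GeoLite2-City.tar.gz.sha256)
    have=$(sha256sum GeoLite2-City.tar.gz | cut -d' ' -f1)
    [ "$want" = "$have" ] && echo "checksum OK" || echo "checksum MISMATCH"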

Thanks Badger, but if the database were corrupt, after restarting Logstash I should have got the same error; however, after the restart it kept working, processing events and applying the geoip filter.
Maybe the database got corrupted while the copy (to replace the old database) was still running?
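If that is what happened, I suppose the window could be closed by never copying in place: extract the new database to a temporary file on the same filesystem and rename it over the old one, since the rename is atomic and readers never see a half-written file. A rough sketch, assuming GNU tar and the paths from our config above:

    # Extract the .mmdb from the verified archive to a temp file, then swap.
    # mv within one filesystem is a rename(), which is atomic.
    tar -xzf GeoLite2-City.tar.gz -O --wildcards '*/GeoLite2-City.mmdb' \
        > /etc/logstash/config/GeoLite2-City.mmdb.tmp
    mv /etc/logstash/config/GeoLite2-City.mmdb.tmp /etc/logstash/config/GeoLite2-City.mmdb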

Does anyone know whether I have to restart Logstash after manually replacing the geoip database?

I believe so. In the past MaxMind have declined requests to add auto-reload.

It is possible their position has changed, but in any case that is a question about the MaxMind API, not Logstash.
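One practical consequence: finish any manual update with a restart. The running process keeps the old database file open until it is restarted, so swapping the file on disk does not take effect on its own:

    # Restart so the geoip filter reopens the database file it was
    # configured with; until then the old (replaced) file stays in use.
    systemctl restart logstash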
