We use Elasticsearch, Logstash, and Filebeat to process our application's log files. Filebeat reads log messages from the log files and sends them to Logstash; Logstash parses them and stores them in Elasticsearch. All three components run inside Docker containers, and Logstash and Filebeat communicate over SSL.
After every 4-5 hours, the error below starts appearing in the Logstash log file and Logstash stops processing logs.
[2017-09-12T11:53:18,877][ERROR][logstash.filters.ruby ] Ruby exception occurred: Detected invalid array contents due to unsynchronized modifications with concurrent users
The issue occurs when more than one Filebeat instance is sending data to Logstash.
The relevant parts of the Logstash configuration are shown below; only the input plugin and the ruby filter plugin are included.
input {
  beats {
    port => "5044"
    ssl => true
    ssl_certificate => "/home/logstash/server.crt"
    ssl_key => "/home/logstash/server.key"
  }
}
filter {
  ruby {
    code => "
      $LOAD_PATH.unshift(File.expand_path('/usr/share/logstash/vendor/jruby/lib/ruby/gems/shared/gems/jwt-1.5.6/lib', __FILE__))
      require 'jwt'
      is_token_valid = false
      begin
        token = event.get('token')
        r_id = event.get('r_id')
        decoded_token = JWT.decode token, 'secret-key', 'HS256'
        decoded_r_id = decoded_token[0]['pr']
        is_token_valid = (decoded_r_id == r_id)
      rescue JWT::ExpiredSignature, JWT::DecodeError
        is_token_valid = false
      rescue
        is_token_valid = false
      end
      event.cancel unless is_token_valid
    "
  }
}
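For reference, the validation that the `code` string performs can be exercised outside Logstash. Below is a minimal, self-contained sketch of HS256 token signing and verification using only Ruby's standard library; the helper names (`hs256_encode`, `hs256_decode`, `b64url`) are hypothetical illustrations, not part of the jwt gem, and the real gem additionally validates registered claims such as `exp`.

```ruby
require 'openssl'
require 'base64'
require 'json'

# URL-safe base64 without padding, as used in JWTs.
def b64url(data)
  Base64.urlsafe_encode64(data).delete('=')
end

# HMAC-SHA256 over "header.payload", base64url-encoded.
def hs256_sign(header_b64, payload_b64, secret)
  b64url(OpenSSL::HMAC.digest('SHA256', secret, "#{header_b64}.#{payload_b64}"))
end

def hs256_encode(payload, secret)
  header = b64url(JSON.generate({ 'alg' => 'HS256', 'typ' => 'JWT' }))
  body   = b64url(JSON.generate(payload))
  "#{header}.#{body}.#{hs256_sign(header, body, secret)}"
end

def hs256_decode(token, secret)
  header, body, sig = token.split('.')
  # NOTE: plain == is not timing-safe; real libraries use a constant-time compare.
  raise 'invalid signature' unless sig == hs256_sign(header, body, secret)
  # Re-add base64 padding before strict decoding.
  JSON.parse(Base64.urlsafe_decode64(body + '=' * (-body.length % 4)))
end

token   = hs256_encode({ 'pr' => 'r-42' }, 'secret-key')
payload = hs256_decode(token, 'secret-key')
# The filter compares payload['pr'] against the event's r_id field.
```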
In the ruby filter, we read the JWT token sent from Filebeat, decode it to retrieve an id, and compare that id against the event's `r_id` to verify that an authentic Filebeat is sending data to Logstash. We suspect that the Ruby code above is causing the error.
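One thing worth noting: the `code` string runs once per event on every pipeline worker, so with two workers the `$LOAD_PATH.unshift(...)` and `require 'jwt'` are executed concurrently, and `$LOAD_PATH` is a process-global array. A possible mitigation (a sketch we have not yet verified, not a confirmed fix) is to move the one-time setup into the ruby filter's `init` option, which Logstash executes only once at startup:

```
filter {
  ruby {
    # init runs once at pipeline startup instead of once per event,
    # so the global $LOAD_PATH is no longer mutated concurrently.
    init => "
      $LOAD_PATH.unshift('/usr/share/logstash/vendor/jruby/lib/ruby/gems/shared/gems/jwt-1.5.6/lib')
      require 'jwt'
    "
    code => "
      # per-event JWT validation only; no global state modified here
    "
  }
}
```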
Below are more details about the Logstash and Filebeat configuration:
Operating system of Logstash: Ubuntu 14.04
Operating system of Filebeat: Ubuntu 14.04
Logstash version: 5.4
Filebeat version: 5.4
Elasticsearch version: 5.4
Pipeline workers in Logstash: 2
Bulk max size in Filebeat: 4096
I would really appreciate it if anybody could help with this.