Logstash hangs after a few days and stops processing logfiles

Hi,

We have Logstash running on a server and after a few days (3 - 5) it stops processing logfiles. The process is still running on the system, but it looks like it is hanging. In Kibana we see a gap in all data from this host. When we restart Logstash on this host, the logfiles are processed again, but we still have a gap (from the first moment it hangs until the restart).

I've seen a few occasions where it somehow recovers and processes logfiles again, but we still see gaps in our monitoring. Most of the time we have to restart Logstash to fix it.

When restarting, it looks like the process can't be stopped cleanly:

service logstash restart
Killing logstash (pid 8859) with SIGTERM
Waiting logstash (pid 8859) to die...
Waiting logstash (pid 8859) to die...
Waiting logstash (pid 8859) to die...
Waiting logstash (pid 8859) to die...
Waiting logstash (pid 8859) to die...
logstash stop failed; still running.

So we need to do a kill. (Well, I've added that to the init script.)
There is no logging in the logfile. I tried to run Logstash in debug mode to catch this, but we had to clean the log every 20 - 30 minutes because it filled up the free disk space.

We are running Logstash 1.5.4, but it happened on 1.5.2, 1.5.1 and 1.4.4 as well. (I don't know about earlier versions...)

ps -ef | grep logstash
root     28383     1 99 13:27 pts/0    00:02:04 /usr/bin/java -Djava.io.tmpdir=/var/lib/logstash -XX:MaxPermSize=256m -Djava.io.tmpdir=/var/lib/logstash -Xmx1536m -Xss2048k -Djffi.boot.library.path=/opt/logstash/vendor/jruby/lib/jni -Djava.io.tmpdir=/var/lib/logstash -XX:MaxPermSize=256m -Djava.io.tmpdir=/var/lib/logstash -Xbootclasspath/a:/opt/logstash/vendor/jruby/lib/jruby.jar -classpath : -Djruby.home=/opt/logstash/vendor/jruby -Djruby.lib=/opt/logstash/vendor/jruby/lib -Djruby.script=jruby -Djruby.shell=/bin/sh org.jruby.Main --1.9 /opt/logstash/lib/bootstrap/environment.rb logstash/runner.rb agent -f /etc/logstash/conf.d -l /var/log/logstash/logstash.log

Java version:

java -version
java version "1.7.0_55"
OpenJDK Runtime Environment (rhel-2.4.7.1.el6_5-x86_64 u55-b13)
OpenJDK 64-Bit Server VM (build 24.51-b03, mixed mode)

I've created some thread dumps, but I'm not familiar enough with the Logstash code to examine them myself. I can't upload them here; they are txt files and only png/gif/jpeg is allowed.
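
In case it's useful, a thread dump of the running JVM can be taken with the JDK's jstack tool (this assumes the full JDK, not just the JRE, is installed; the pid is a placeholder):

jstack -l <logstash_pid> > /tmp/logstash-threaddump.txt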

Does anyone have an idea what might be the problem?

What input are you using?

Hi,

We are only using the file input:

input {
   file {
     type => "amauthaccess"
     path => "/opt/openam/am/am/log/amAuthentication.access"
     sincedb_path => "/var/run/logstash"
   }

   file {
     type => "amautherror"
     path => "/opt/openam/am/am/log/amAuthentication.error"
     sincedb_path => "/var/run/logstash"
   }

   file {
     type => "amsaml"
     path => "/opt/openam/am/am/log/SAML2.access"
     sincedb_path => "/var/run/logstash"
   }

   file {
     path => [ "/var/log/iptables.log" ]
     type => "iptables"
     sincedb_path => "/var/run/logstash"
   }
   file {
     path => [ "/var/log/exim/main.log" ]
     type => "exim"
     sincedb_path => "/var/run/logstash"
   }
}

Nothing more.
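
One thing I notice looking at the config again: all five file inputs share the same sincedb_path. I don't know whether that is related to the hangs, but a variant with a separate sincedb file per input (the file names below are just examples) would look like this:

input {
   file {
     type => "amauthaccess"
     path => "/opt/openam/am/am/log/amAuthentication.access"
     sincedb_path => "/var/run/logstash/sincedb-amauthaccess"
   }

   file {
     type => "amautherror"
     path => "/opt/openam/am/am/log/amAuthentication.error"
     sincedb_path => "/var/run/logstash/sincedb-amautherror"
   }

   # ...and the same pattern for the SAML2, iptables and exim inputs
}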

Hello,

I have the same problem here (we are running Logstash 2.2).

I have a bunch of Apache logs which I'm trying to get into Elasticsearch.

My workflow is:

scp domain/access_log srvlogstash:/var/log/xxx

On the srvlogstash server, nxlog reads the file and sends the log lines to TCP port 5514; Logstash then sends them to the Elasticsearch cluster.

I watch the document count on Elasticsearch, and once the logs from access_log are indexed in the cluster, I copy over the next access_log file.

Logstash got stuck about 2 days after I started the process; I had to restart it (and then restart nxlog too).

This machine is not part of the Elasticsearch cluster as a client node; I'm not sure if it would work better if I added it as an Elasticsearch client.
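
For completeness, the Logstash side of this is essentially just a TCP input on 5514 and an Elasticsearch output; a simplified sketch (the host name is a placeholder, not my actual config) looks like:

input {
   tcp {
     port => 5514
     type => "apache"
   }
}

output {
   elasticsearch {
     hosts => ["es-node1:9200"]    # placeholder for the real cluster nodes
   }
}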

Regards,

This thread is 6 months old and refers to a really old version of LS; it'd be better if you created a new thread :slight_smile:

Ok, I will open a new thread when it happens again.

I will try to upgrade to Logstash 2.3 before I open that thread.

Thank you