We are filtering sFlow data with Logstash using the following filter:
filter {
  json {
    source => "message"
    type => "json"
  }
}
This works fine. However, it seems the type option on the filter is deprecated.
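If the deprecation warning is about the per-filter type option, the usual replacement is a conditional around the filter. A minimal sketch, assuming the input assigns type => "json" to these events (the type name here is just an example):

filter {
  # Apply the json filter only to matching events via a conditional,
  # instead of the deprecated type option on the filter itself.
  if [type] == "json" {
    json {
      source => "message"
    }
  }
}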
--verbose does not show anything except "adding pattern" messages listing the patterns in use.
With --debug we can see all the flows in the log file, but no sflow event is recorded. All the received logs are registered in the log file, but nothing seems to be output to elasticsearch.
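One way to check whether events make it through the pipeline at all is to add a stdout output alongside the existing elasticsearch output. A minimal sketch (the rest of the output section is assumed to stay as it is in the real config):

output {
  # Print every event that reaches the output stage to the console,
  # to tell whether events are stuck before or at the elasticsearch output.
  stdout { codec => rubydebug }
}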
Huuu nope... It still crashes, but without any errors, which is the weird thing. The logs fill up with received and parsed logs and sflow events, but nothing is actually output to elasticsearch.
Once this occurs, most of the time only a kill -9 will stop Logstash.
I wish I could see something of interest in the logs, but there is nothing useful there.
What is the volume of logs you are trying to process?
We ran into similar issues with Logstash, and it ended up being a memory consumption problem. Take a look at the memory consumption of the machine and of the Logstash process; when it dies from running out of memory, there are no errors in the log files and it just hangs.
Thanks, I'll look into that, but this server has 176 GB of RAM, with 28 GB allocated to Logstash itself and 32 GB to Elasticsearch.
I don't think that's the issue.