Can't ship big logs from Filebeat or Logstash to Elasticsearch

  1. I have a 3-node cluster with 1 master and 2 data nodes, each set up with 1 TB of storage.
  2. I have increased the JVM heap to half my RAM with -Xms24g and -Xmx24g (48 GB total); see the snippet below.
  3. I then successfully uploaded a 140 MB file to Elasticsearch through the Kibana GUI, after raising the file upload limit from 100 MB to 1 GB.
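
For reference, here is roughly what my heap settings look like, assuming a default install where the heap is set in config/jvm.options (the exact path depends on how Elasticsearch was installed):

# config/jvm.options (excerpt)
# initial and max heap pinned to the same value, half of physical RAM
-Xms24g
-Xmx24g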

When I tried to ingest the same file with Logstash alone, the process got stuck and brought Elasticsearch down.
My pipeline is fairly simple:

input {
  file {
    path => "/tmp/*_log"
  }
}
output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}

Small files work great; I'm just not able to push big files. The log contains 1 million rows.
I also set all the fields in /etc/security/limits.conf to unlimited (excerpt below).
Any ideas what I'm missing?
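
To show what I mean, these are the kind of entries I added, assuming Elasticsearch and Logstash run as the elasticsearch and logstash users (the user names here are just illustrative):

# /etc/security/limits.conf (excerpt, illustrative user names)
# format: <domain> <type> <item> <value>; "-" sets both soft and hard limits
elasticsearch  -  nofile   unlimited
elasticsearch  -  memlock  unlimited
logstash       -  nofile   unlimited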

Welcome to our community! :smiley:

Stuck how? What do the logs show? What is and isn't happening?
We need a bit more detail to be able to help :slight_smile:

Just to add to what someone else pointed out on Stack Overflow for this question: the ability to upload files from Kibana is not the same as having a file input tail a file. You might want to add start_position and sincedb_path (since the file input will already have recorded the lengths of the files).
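
Something along these lines, as a minimal untested sketch; note that sincedb_path => "/dev/null" simply discards the recorded offsets so the whole file is re-read, which is handy for testing but probably not what you want in production:

input {
  file {
    path => "/tmp/*_log"
    start_position => "beginning"   # read newly discovered files from the start instead of tailing
    sincedb_path => "/dev/null"     # do not persist read offsets, so files are re-read on restart
  }
}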


Will try.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.