- I have a 3-node cluster with 1 master and 2 data nodes, each provisioned with 1TB of storage.
- I raised the heap to `-Xms24g -Xmx24g`, half of my 48GB of RAM.
- I then successfully uploaded a 140MB file to Elasticsearch from the Kibana GUI, after raising the upload limit from 100MB to 1GB.
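For reference, the settings I changed (a sketch; file locations vary by install, and I'm assuming the Kibana upload limit is the `fileUpload:maxFileSize` advanced setting):

```
# config/jvm.options — heap pinned to half of the 48GB of RAM
-Xms24g
-Xmx24g

# Kibana → Stack Management → Advanced Settings (assumed setting name):
# fileUpload:maxFileSize = 1GB   (was 100MB)
```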
When I tried to upload the same file with only Logstash, the process got stuck and broke Elasticsearch.
My pipeline is fairly simple:
```
input {
  file {
    path => "/tmp/*_log"
  }
}

output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}
```
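For completeness, here is the same pipeline with the file-input options written out explicitly. `start_position` and `sincedb_path` are standard file-input options I have *not* set myself; this is a sketch of what I could try, not what I ran:

```
input {
  file {
    path => "/tmp/*_log"
    start_position => "beginning"   # default is "end", i.e. tail mode
    sincedb_path => "/dev/null"     # forget read positions between runs (testing only)
  }
}

output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }     # note: printing 1M events to stdout is itself slow
}
```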
Small files work great; I'm not able to push big files. The log contains 1 million rows.
I set all the fields in /etc/security/limits.conf to unlimited.
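To double-check the limits point, this is how I verified what a shell actually gets (note: if Elasticsearch runs under systemd, the unit's `LimitNOFILE`/`LimitMEMLOCK` directives can override /etc/security/limits.conf, so this check may not reflect the service itself):

```shell
# limits in effect for the current shell
ulimit -n    # max open file descriptors
ulimit -l    # max locked memory
```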
Any ideas what I'm missing?