I am using Elasticsearch and Logstash 6.3.1 and Java 8 on an Ubuntu virtual machine with 8 GB RAM and 50 GB of disk.
I am trying to index 8,000 text files: Logstash filters them and sends them to Elasticsearch. I did not install Elasticsearch and Logstash as services; I just run them normally. Does that cause any problems?
Logstash manages to send only about 4,000 of them; after that it shows the following:
```
Reached open files limit: 4095, set by the 'max_open_files' option or default, files yet to open: 4298
```
after long research , i changed as following:
After a lot of research, I changed the following:

- the heap size of both Elasticsearch and Logstash, in the `jvm.options` file of each
- the kernel setting:
  ```
  sudo sysctl -w vm.max_map_count=262144
  ```
- the open-file limits:
  ```
  ulimit -Sn 63536
  ulimit -Hn 63536
  ```

but it still does not work. Maybe I should increase `LS_OPEN_FILES` in the Logstash init script in `/etc/init.d/logstash`, but I do not know how.

Why does Logstash always run into this **reached open files limit** problem? Is it caused by Logstash itself, or by my config file? My config file contains just an input, a filter with 4 grok patterns inside it to match 4 different things from the text files, and the output; that is it.

Can anyone explain why this problem keeps happening with Logstash, and how to get rid of it? I have tried everything. Thanks.
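Also, from the docs it looks like the `file` input plugin itself has `max_open_files` and `close_older` settings that control how many file handles Logstash keeps open at the same time, which would explain the exact number 4095 in the error. Is a config along these lines the right approach? (The paths, the grok pattern, and the index name are placeholders, not my real config.)

```
input {
  file {
    path           => "/data/textfiles/*.txt"   # placeholder path
    start_position => "beginning"
    max_open_files => 4095                      # cap on simultaneously open files
    close_older    => 60                        # close files idle for 60 seconds
    sincedb_path   => "/var/lib/logstash/sincedb_textfiles"
  }
}

filter {
  grok {
    # placeholder pattern; my real config has 4 match entries like this
    match => { "message" => "%{TIMESTAMP_ISO8601:ts} %{LOGLEVEL:level} %{GREEDYDATA:msg}" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "textfiles"                        # placeholder index name
  }
}
```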
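One thing I learned while researching: `ulimit` only changes the current shell session and the processes started from it, so setting it in one terminal does not affect a Logstash that was started somewhere else. Is this the right way to do it? (The Logstash path here is an assumption, not necessarily where it is installed.)

```shell
# ulimit is per shell session: it affects only this shell and
# processes launched from it, not an already-running Logstash.
echo "soft limit before: $(ulimit -Sn)"
ulimit -n "$(ulimit -Hn)"     # raise the soft limit up to the hard limit
echo "soft limit now:    $(ulimit -Sn)"
# Then start Logstash from this same shell so it inherits the new limit, e.g.:
# /usr/share/logstash/bin/logstash -f my_pipeline.conf   # path is an assumption
```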