We are trying to do load testing. When Logstash reaches a certain limit it starts throwing a "failed to open ... permission denied" error, even though the file has been given full permissions.
In a single index "xyz" the total hits are 3341, split across two document_types: "type1": 2135 and "type2": 1186.
We face this issue after a certain number of log files have been pushed into ES from LS. Kindly help us understand and resolve it.
Our load testing involves loading more jobs into ES via LS so that we can ensure it can handle a huge number of entries in a real-time scenario.
We query the ES data and push it into our UI. After a certain limit Logstash stops pushing data into ES, so we are missing the data for that particular job.
ERROR: failed to open file "<log file /.log>" permission denied <log file/.log>
Config files: we have one config file for the "type1" document_type and four config files for the "type2" document_type (a simplified sketch is shown below).
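For reference, a stripped-down version of one of these config files looks roughly like the following. The path, hosts, and index values here are placeholders rather than our real settings:

```
input {
  file {
    # Placeholder path; each config file tails a different set of log files
    path => "/var/log/app/*.log"
    start_position => "beginning"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "xyz"
    document_type => "type1"
  }
}
```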
From each log file we extract certain fields using grok and ruby filters and push them into ES, along the lines of the sketch below.
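The filter section follows this general shape; the grok pattern and ruby code shown here are simplified placeholders (assuming the Logstash 5.x event API), not our actual logic:

```
filter {
  grok {
    # Placeholder pattern; our real patterns pull job-specific fields from the line
    match => { "message" => "%{TIMESTAMP_ISO8601:ts} %{LOGLEVEL:level} %{GREEDYDATA:msg}" }
  }
  ruby {
    # Placeholder transformation; our real ruby code derives additional fields
    code => "event.set('msg_length', event.get('msg').to_s.length)"
  }
}
```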