I am using Filebeat to ship logs from my server directly to Elastic Cloud. Sometimes there are inconsistent log entries spanning hundreds of lines that I don't want to harvest. Is there any way I can achieve this? Also, is it possible that those logs are responsible for the high memory pressure on my cluster?
You can filter out unwanted lines at the Filebeat level, either with the `exclude_lines` input option or with a `drop_event` processor. As for memory pressure, it could be caused by many things, but dropping these unwanted events may help, since it reduces the amount of data Elasticsearch has to index.
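As a minimal sketch, here is a `filebeat.yml` fragment showing both approaches; the path and the regex patterns are placeholders you would adapt to your own log format:

```yaml
filebeat.inputs:
  - type: filestream
    id: my-server-logs          # hypothetical input id
    paths:
      - /var/log/myapp/*.log    # hypothetical path, adjust to your setup
    # Drop any line matching these regexes before it is shipped.
    exclude_lines: ['^DEBUG', '^TRACE']

processors:
  # Alternative: drop whole events based on a condition on a field.
  - drop_event:
      when:
        regexp:
          message: '^(DEBUG|TRACE)'
```

`exclude_lines` is cheaper since it filters during harvesting, while `drop_event` runs later in the processor pipeline and can match on any event field, not just the raw line.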