I have Filebeat installed on Kubernetes, and Elasticsearch is also installed on Kubernetes.
I am getting this error and I do not see any logs in Elasticsearch. Could you please tell me how I can solve the problem:
[elasticsearch] elasticsearch/client.go:223 failed to perform any bulk index operations: 413 Request Entity Too Large: <html>
<head><title>413 Request Entity Too Large</title></head>
<body>
<center><h1>413 Request Entity Too Large</h1></center>
<hr><center>nginx/1.17.8</center>
</body>
</html>
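The HTML body of the 413 response shows it is nginx, not Elasticsearch, that rejects the bulk request, so the request body is exceeding the proxy's size limit. Assuming Elasticsearch is exposed through the NGINX Ingress Controller (I don't know your exact setup, so the names and host below are hypothetical), one common remedy is to raise the allowed body size with the `proxy-body-size` annotation:

```yaml
# Sketch of an Ingress for Elasticsearch, assuming the NGINX Ingress Controller.
# Name, host, and service details are hypothetical placeholders.
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: elasticsearch            # hypothetical name
  annotations:
    # nginx returns 413 when the request body exceeds its limit
    # (the controller's default proxy-body-size is 1m); raise it
    # so large Filebeat bulk requests can pass through.
    nginx.ingress.kubernetes.io/proxy-body-size: "10m"
spec:
  rules:
    - host: elasticsearch.example.com   # hypothetical host
      http:
        paths:
          - path: /
            pathType: Prefix
            backend:
              service:
                name: elasticsearch     # hypothetical service name
                port:
                  number: 9200
```

If nginx is deployed outside the ingress controller, the equivalent setting is `client_max_body_size` in the nginx configuration.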
I have logging level: info, which logs everything, according to:
info - Logs informational messages, including the number of events that are published. Also logs any warnings, errors, or critical errors.
Could you tell me which logging level you suggest using?
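For reference, the logging level is set in `filebeat.yml`. A minimal sketch, assuming you want to debug only the Elasticsearch output (the `[elasticsearch]` prefix in your error line suggests that selector, but treat it as an assumption):

```yaml
# filebeat.yml — logging section sketch
logging.level: debug                   # one of: error, warning, info, debug
logging.selectors: ["elasticsearch"]   # limit debug output to the ES client
```

Restricting the selectors keeps debug mode from flooding the log with unrelated components.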
After enabling debug mode I had to restart my Filebeat.
I saw that the problem is gone now and I see messages in Elasticsearch, but I had the same situation yesterday, so the problem will come back.
Could you please tell me whether the bulk_max_size option is related to my problem?
The default value is 50, so maybe I should increase the value?
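One note on the direction of the change: `bulk_max_size` is the maximum number of events per bulk request, so *increasing* it makes each request body larger and more likely to hit nginx's 413 limit; lowering it (or raising the proxy's body-size limit) is what shrinks the requests. A sketch, assuming the Elasticsearch output and a hypothetical host:

```yaml
# filebeat.yml — output sketch; host is a hypothetical placeholder
output.elasticsearch:
  hosts: ["https://elasticsearch.example.com:443"]
  # Fewer events per bulk request => smaller request body,
  # less likely to exceed the proxy's size limit.
  bulk_max_size: 25
```

This trades some indexing throughput for smaller requests, so fixing the proxy limit is usually the better long-term option.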