Well, some component switched to debug mode, and since then we receive this error: `The content length (732630494) is bigger than the maximum allowed string (536870888)`. Does that mean there is a single log entry bigger than ~732 MB?
Are you ingesting a whole log as a single document?
Elasticsearch is not designed or optimized to handle extremely large documents, so the default limits that are in place should not be modified (which I assume you have done). The standard way to deal with log data is to ingest files with each line as a single document (multiline log entries can be merged into a single document).
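For the multiline merging mentioned above, here is a hedged sketch of the relevant Filebeat input options (Filebeat 7+ `filebeat.inputs` syntax; the path and the date pattern are assumptions — adjust them to your actual log format):

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/myapp/*.log              # hypothetical path
    # Treat any line that does NOT start with a timestamp as a
    # continuation of the previous event (assumed "YYYY-MM-DD" prefix).
    multiline.pattern: '^\d{4}-\d{2}-\d{2}'
    multiline.negate: true
    multiline.match: after
```

With `negate: true` and `match: after`, lines that fail to match the pattern are appended to the preceding line, so a stack trace ends up in the same document as the log line that produced it.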
Thank you,
I am not sure what I am experiencing. I checked, and the longest log line did not exceed 10 MB, but the file that Filebeat read from was 700 MB.
So I am not sure where this error is coming from.
These are my Filebeat settings:
Also, I am confused about general terms.
When I get the error `The content length (732630494) is bigger than the maximum allowed string (536870888)` in a search, what does it mean?
That a log entry in my search results is bigger than 732630494 bytes?
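For a sense of scale, converting the two numbers in the error message from bytes to MiB (a minimal sketch; the variable names are mine, and the interpretation — content length as the size of the whole request/response body rather than a single log line — is an assumption based on the wording of the error):

```python
# Sizes taken verbatim from the error message, in bytes.
content_length = 732_630_494   # size of the body Elasticsearch tried to handle
max_allowed    = 536_870_888   # the limit it was checked against (~512 MiB)

# Convert to MiB for readability.
print(f"content length: {content_length / 1024**2:.1f} MiB")
print(f"max allowed:    {max_allowed / 1024**2:.1f} MiB")
```

Note that the content length (~699 MiB) is close to the 700 MB file size you mention, which suggests the limit is being hit by the total payload, not by any one log line.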