Let's say that I have a cluster where I'm sending log lines from Filebeat to Logstash.
My problem is that Filebeat sends the file line by line, and since my Logstash input is configured with the json codec, Logstash throws a _jsonparsefailure: each JSON entry spans two lines, so every individual line arrives with an invalid format.
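For illustration, the relevant part of the Logstash pipeline looks something like this (the port number here is arbitrary, not the point of the question):

```
input {
  beats {
    port  => 5044
    codec => json
  }
}
```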
Any idea if it is possible to send the whole log entry at once rather than line by line? The current behavior of Filebeat seems really odd to me; I was expecting that if I append several lines to my log at once, Filebeat would send them as a single event, not line by line.
Maybe I'm doing something wrong here.
Filebeat treats each line in the log file as a separate event by default, but if you are using a recent version you can configure it to assemble records spanning multiple lines using regular expressions (the multiline option).
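A minimal sketch, assuming each JSON entry starts with `{` at the beginning of a line and you are on a Filebeat version that supports multiline (the path is a placeholder):

```yaml
filebeat:
  prospectors:
    - paths:
        - /var/log/myapp/*.log   # placeholder path, adjust to your log file
      multiline:
        # Any line that does NOT start with '{' is treated as a continuation
        # of the previous line, so one pretty-printed JSON object becomes one event.
        pattern: '^\{'
        negate: true
        match: after
```

With `negate: true` and `match: after`, every line that does not match the pattern is appended to the preceding line that did, so the complete JSON object reaches Logstash as a single event and the json codec can parse it. Depending on your Filebeat version the exact config layout differs slightly (newer versions use `filebeat.inputs` with dotted `multiline.*` keys), so check the reference for the release you are running.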