My goal is to set up Filebeat on at least 5 systems and get the data into Elasticsearch via Logstash.
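For context, the Filebeat side of this setup looks roughly like the following (a sketch: the log path and Logstash host are placeholders, not my actual values):

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/myapp/*.log      # hypothetical path to the log files being shipped

output.logstash:
  hosts: ["logstash-host:5044"]   # assumed Logstash host; 5044 is the conventional Beats port
```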
I imported data via the file input plugin earlier in one of my trials, and the data was imported successfully.
This time I tried the same thing with the beats input plugin, and I am not receiving the same number of records from the files. It also added over 100k records with the following tags: beats_input_codec_plain_applied, _grokparsefailure, _dateparsefailure.
These tags indicate that some processing failed. Could you share your Logstash config and also make sure your grok patterns are correctly configured?
So when I use this input method without the codec line, all the data I receive is full of escape characters. When I do add the plain codec, it returns my data with the 3 tags mentioned above.
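To be concrete, this is roughly what I mean by "adding the codec line" (a sketch of the input block, not my full config; the charset is an assumption):

```
input {
  beats {
    port  => 5044
    codec => plain { charset => "UTF-8" }   # assumed charset; without this codec line the data arrives escaped
  }
}
```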
I simply want to load the log files that I have, like I do with the file input method ->
I am not sure what exactly caused the issue that resulted in the _grokparsefailure and _dateparsefailure tags.
I did a clean install of ELK on another server, used the same configuration files and the same data, and it worked. The Logstash input config is beats { port => 5044 }; it automatically applied logstash-codec-plain and parsed all the data with 0 errors.
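For anyone hitting the same issue, the working pipeline was roughly the following (a minimal sketch; the Elasticsearch host and index name are assumptions, and the filter section stands in for whatever grok/date parsing your data needs):

```
input {
  beats {
    port => 5044              # no explicit codec; logstash-codec-plain is applied automatically
  }
}

filter {
  # grok / date filters go here if your events need parsing;
  # a _grokparsefailure or _dateparsefailure tag means a pattern here did not match
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]   # assumed Elasticsearch host
    index => "filebeat-%{+YYYY.MM.dd}"
  }
}
```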