Logstash Filebeat input

My goal is to set up Filebeat on at least 5 systems and ship the data to Elasticsearch via Logstash.

I imported data via the file input plugin earlier for one of my trials, and the data was imported successfully.

I tried the same thing this time with the beats input plugin, and I am not receiving the same number of records from the files. It also added over 100k records with the following tags: beats_input_codec_plain_applied, _grokparsefailure, _dateparsefailure

How can I fix this?

_grokparsefailure, _dateparsefailure

These tags indicate that some processing failed. Could you share your Logstash config snippet, and also check that your grok patterns are correctly configured?
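For context, these two tags usually come from a filter block along these lines (a sketch only; the pattern and field names here are assumptions, not your actual config):

```
filter {
  grok {
    # _grokparsefailure is added when this pattern does not match the event
    match => { "message" => "%{TIMESTAMP_ISO8601:log_timestamp} %{LOGLEVEL:level} %{GREEDYDATA:msg}" }
  }
  date {
    # _dateparsefailure is added when log_timestamp cannot be parsed as ISO8601
    match => [ "log_timestamp", "ISO8601" ]
  }
}
```

If the grok pattern fails, the date filter usually fails too, since the field it depends on was never extracted — so the two tags tend to show up together.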

So I am facing a different set of issues every time I try to implement Filebeat.

input {
	beats {
		port => "5044"
#		codec => "plain"
	}
}

When I use this input method without the codec line, all the data I receive is full of escape characters. When I do add codec => plain, it returns my data with the three tags mentioned above.
I simply want to load the log files I have, as I do with the file input method:

   file {
        path => "/home/latech/kcpfinal/*/*.log"
        start_position => "beginning"
        sincedb_path => "/dev/null"
   }
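For the Filebeat side, the equivalent of that file input would be something like the following filebeat.yml sketch (the paths mirror the file-input example above; the Logstash hostname is hypothetical, and depending on your Filebeat version the input type may be `filestream` instead of `log`):

```yaml
# filebeat.yml — minimal sketch
filebeat.inputs:
  - type: log
    paths:
      - /home/latech/kcpfinal/*/*.log

output.logstash:
  hosts: ["logstash-host:5044"]   # hypothetical hostname; must match the beats input port
```

Note that with Filebeat in the pipeline there is no equivalent of `sincedb_path => "/dev/null"`: Filebeat keeps its own registry of file offsets, so re-reading the same files from the beginning requires clearing that registry.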

What can I do to fix this?

I am not sure what exactly caused the issue that resulted in the _grokparsefailure and _dateparsefailure.

I did a clean install of the ELK stack on another server, used the same configuration files and the same data, and it worked. The Logstash input is beats { port => 5044 }; it automatically applied the logstash-codec-plain codec and parsed all the data with zero errors.
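If the failures reappear, one way to isolate them is to route tagged events to a separate output for inspection instead of letting them pollute the main index (a sketch; the file path is hypothetical):

```
output {
  if "_grokparsefailure" in [tags] or "_dateparsefailure" in [tags] {
    # dump failed events somewhere you can inspect the raw message field
    file { path => "/tmp/failed-events.log" }   # hypothetical path
  } else {
    elasticsearch { hosts => ["localhost:9200"] }   # assumed Elasticsearch address
  }
}
```

Comparing the raw `message` field of a failed event against your grok pattern usually reveals the mismatch quickly, e.g. via the Grok Debugger in Kibana.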

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.