My architecture is:

*Filebeat A* (remote) > Logstash A (2 pipelines) > Elasticsearch A > Kibana A
*Filebeat B* (remote) > Logstash A (2 pipelines) > Elasticsearch A > Kibana A
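For reference, a setup like this would typically declare the two pipelines on Logstash A in `pipelines.yml` along these lines (the pipeline ids and config paths below are my guesses for illustration, not the actual configuration):

```
# /etc/logstash/pipelines.yml — hypothetical layout for the two pipelines
- pipeline.id: abc_logs                               # assumed pipeline name
  path.config: "/etc/logstash/conf.d/abc_logs.conf"
- pipeline.id: xyz_logs                               # assumed second pipeline
  path.config: "/etc/logstash/conf.d/xyz_logs.conf"
```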
It's for log analysis.
Say my log format is:
The Filebeats are pushing logs to Logstash (I can see the incoming events in the Logstash logs), but Logstash is NOT creating an index for some of the log files.
For example, `abc_logs-2019.11.02.log` is present in my log location and Filebeat pushed it to Logstash, but I can't see any index created in Elasticsearch.
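To check whether the events for that file actually reach the Logstash output stage, a temporary `stdout` output can be added next to the Elasticsearch output. This is a debugging sketch, not the real pipeline — the Beats port, ES host, and index pattern are assumptions:

```
# Hypothetical pipeline sketch for debugging the missing index.
input {
  beats { port => 5044 }              # assumed Beats input port
}
output {
  # Temporary: print every event to confirm it arrives at the output stage.
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["localhost:9200"]       # assumed ES host
    index => "abc_logs-%{+YYYY.MM.dd}"  # assumed index name pattern
  }
}
```

If the events print to stdout but no index appears, the problem is usually in the `elasticsearch` output (e.g. the index name interpolates a field that is missing on those events); if nothing prints, the events never left the input/filter stages.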
There is also a second problem: even when the indices are created, NOT all valid logs are getting parsed.
For example, if a log file has 100 lines in the correct log format (i.e. matching my grok filter pattern), only 60%-70% of the data shows up in Elasticsearch; around 40% of the data is getting dropped, and I don't know the exact reason.
If I check the unparsed logs in the Grok Debugger with the same grok pattern, they parse perfectly.
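One way to see why those events disappear is to keep the grok failures visible instead of letting them get lost: grok tags failing events with `_grokparsefailure` by default, and those events can be routed to their own index for inspection. Lines that pass in the Grok Debugger but fail in production are often multiline events or lines with unexpected whitespace. A sketch (the pattern and index names are placeholders, not my real config):

```
filter {
  grok {
    # Placeholder pattern — substitute the real one from the pipeline.
    match => { "message" => "%{TIMESTAMP_ISO8601:ts} %{LOGLEVEL:level} %{GREEDYDATA:msg}" }
    # Default failure tag, made explicit here.
    tag_on_failure => ["_grokparsefailure"]
  }
}
output {
  if "_grokparsefailure" in [tags] {
    # Route unparsed events to a separate index instead of losing them.
    elasticsearch {
      hosts => ["localhost:9200"]                     # assumed ES host
      index => "abc_logs-failures-%{+YYYY.MM.dd}"     # assumed failure index
    }
  } else {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "abc_logs-%{+YYYY.MM.dd}"
    }
  }
}
```

Comparing the raw `message` field of events in the failure index against the lines pasted into the Grok Debugger should show exactly how the production input differs.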
Any solution for this problem?