My architecture is:

*Filebeat A* (remote) > Logstash A (2 pipelines) > Elasticsearch A > Kibana A
*Filebeat B* (remote) > Logstash A (2 pipelines) > Elasticsearch A > Kibana A

It's for log analysis.
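For reference, the two Logstash pipelines are wired up through pipelines.yml, roughly like this (the pipeline IDs and config paths are simplified placeholders, not my exact setup):

```yaml
# pipelines.yml (simplified) -- two pipelines on Logstash A
- pipeline.id: abc_logs_pipeline_1
  path.config: "/etc/logstash/conf.d/abc_logs_1.conf"
- pipeline.id: abc_logs_pipeline_2
  path.config: "/etc/logstash/conf.d/abc_logs_2.conf"
```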
Say my log file name format is abc_logs-yyyy.mm.dd.log.
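On the Filebeat side it is a plain log input, roughly like the following (paths and hosts are placeholders, not the real ones):

```yaml
# filebeat.yml (simplified) -- path and host are placeholders
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - /path/to/logs/abc_logs-*.log

output.logstash:
  hosts: ["logstash-a:5044"]
```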
My Filebeats are pushing logs to Logstash (I can see it in the data/registry file), but Logstash is NOT creating an index for some of the log files. Say abc_logs-2019.11.02.log is there in my log location and Filebeat pushed it to Logstash, but I can't see any index created in Elasticsearch.
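The Logstash pipelines themselves are a standard beats-in / elasticsearch-out setup, roughly like this (hosts and the index name are simplified placeholders):

```conf
# abc_logs_1.conf (simplified) -- hosts and index pattern are placeholders
input {
  beats {
    port => 5044
  }
}

output {
  elasticsearch {
    hosts => ["http://elasticsearch-a:9200"]
    index => "abc_logs-%{+YYYY.MM.dd}"
  }
}
```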
Also, one extra problem: even when the indexes are created, NOT all valid logs are getting parsed. For example, if a log file has 100 lines in the correct log format (as per the grok filter pattern in my logstash.yml file), only 60%-70% of the data shows up in Elasticsearch; around 40% of the data is getting dropped, and I don't know the exact reason.
If I check the unparsed logs in the Grok Debugger with the specified grok pattern, they parse perfectly.
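For reference, the grok filter looks roughly like this (the pattern shown here is a simplified placeholder, not my actual one):

```conf
# filter section (simplified) -- the grok pattern below is only an example
filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:log_time} %{LOGLEVEL:level} %{GREEDYDATA:msg}" }
  }
}
```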
Any solution for this problem?