How to create a single index for logs coming through Fluentd, Filebeat and Logstash together?

Hi,

I have logs coming from Kubernetes, Nginx and Jenkins machines.

I have configured Fluentd on Kubernetes to collect logs, Filebeat on Nginx to collect access logs, and Logstash on the Jenkins machine. I want all of these logs together in a single index called "nprod-logs".

All the Kubernetes logs are coming into Elasticsearch under the index "nprod-logs". When I enabled Filebeat on Nginx with the same index name it complains, and I get the below error in the Elasticsearch cluster log file.

java.lang.IllegalArgumentException: Rejecting mapping update to [nprod-log-2019.01.09] as the final mapping would have more than 1 type: [doc, fluentd].
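
For reference, the Filebeat output on the Nginx machine is set up roughly like this (the host is a placeholder and the rest of the file is trimmed):

```yaml
# filebeat.yml (relevant part only, host is a placeholder)
output.elasticsearch:
  hosts: ["elasticsearch-host:9200"]
  # same index name as the Fluentd output, which is where the type conflict comes from
  index: "nprod-logs-%{+yyyy.MM.dd}"

# overriding the index also requires overriding the template name/pattern
setup.template.name: "nprod-logs"
setup.template.pattern: "nprod-logs-*"
```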

Is there a way I can create a single index for all the logs? They are all correlated for my load testing. Please suggest, thanks.

You would be better off using an index pattern like nprod-logs-SOURCE-YYYY.MM.DD, where SOURCE is nginx or jenkins or whatever app, and YYYY.MM.DD is the date.

Putting everything into one big index is not a good idea: you end up with mapping conflicts and multiple types (which are being rejected, as you can see), and it makes data management a hell of a lot harder.
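
For example, each shipper can write to its own index under that naming scheme. These are only sketches of the relevant output sections (hostnames are placeholders, all other settings omitted):

```yaml
# filebeat.yml on the Nginx box (host is a placeholder)
output.elasticsearch:
  hosts: ["elasticsearch-host:9200"]
  index: "nprod-logs-nginx-%{+yyyy.MM.dd}"
setup.template.name: "nprod-logs-nginx"
setup.template.pattern: "nprod-logs-nginx-*"
```

```
# Logstash pipeline output on the Jenkins box (host is a placeholder)
output {
  elasticsearch {
    hosts => ["http://elasticsearch-host:9200"]
    index => "nprod-logs-jenkins-%{+YYYY.MM.dd}"
  }
}
```

```
# Fluentd match section using fluent-plugin-elasticsearch (host is a placeholder)
<match **>
  @type elasticsearch
  host elasticsearch-host
  port 9200
  logstash_format true                  # appends -YYYY.MM.DD to the prefix
  logstash_prefix nprod-logs-kubernetes
</match>
```

With that in place each source gets its own index and mapping, so the single-type limit is no longer a problem.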

It means creating multiple indexes.

But as per my requirement, when I hit one service running on Kubernetes the call passes through the Nginx proxy, the ingress controller and the Kubernetes service. That means a single call produces logs in multiple services, and I want to see all of those logs in a single index.

If I create multiple indices then I need to run the same query in Kibana three times, changing the index name in the Discover tab. Is there a way I can create a single index for the logs coming through Fluentd and Filebeat?

You can set up an index pattern of nprod-logs-* in Kibana, which will search them all. I forgot to mention that in my other post, sorry.
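
Just to illustrate, the same wildcard works for queries sent directly to Elasticsearch too, e.g. to check that all the per-source indices are being picked up (host is a placeholder):

```
# counts documents across every index matching nprod-logs-*
curl -s "http://elasticsearch-host:9200/nprod-logs-*/_count?pretty"
```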

Thanks for the suggestion, I thought the same thing after I replied to you.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.