Send Filebeat logs through Logstash to different indexes

Hi Folks,

Besides the standard modules (system, auditd, etc.), I have to send some custom logs from one server to Elasticsearch.

I am thinking of sending the logs to Logstash first so that I can do some grok processing on these custom logs.

How can I differentiate the logs (by adding tags, etc.) so that the default module logs go to one index while the other logs go to different indexes?

Looking for some guidance.

Thanks in advance

Immac

Hi @Immac, welcome to the Elastic community forums!

If you simply want to distinguish logs collected by Filebeat modules vs. logs collected by a custom input, you won't need to add any special tags. This is because all log events generated by Filebeat modules will contain an event.module field, whereas the log events generated by your custom input won't (unless you do some explicit extra configuration to add such a field, of course).

Given this, you can simply test for the existence of this event.module field in your Logstash pipeline like so:

  if [event][module] {
    # process log events generated by modules
  } else {
    # process log events generated by custom input 
  }
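
For completeness, here is a minimal sketch of how that same conditional can route events to different indexes in the output section. The port, hosts, and index names below are placeholders for illustration, so adjust them to your environment:

  input {
    beats {
      port => 5044
    }
  }

  output {
    if [event][module] {
      # Events from Filebeat modules go to the standard Filebeat index
      elasticsearch {
        hosts => ["localhost:9200"]
        index => "filebeat-%{+YYYY.MM.dd}"
      }
    } else {
      # Events from the custom input go to their own index
      elasticsearch {
        hosts => ["localhost:9200"]
        index => "custom-logs-%{+YYYY.MM.dd}"
      }
    }
  }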

Hope that helps,

Shaunak

Hi Shaunak,

Thanks for your reply.

Let me try that and see how it goes

Cheers !!

Immac

You can also add fields in the Filebeat configuration and use similar logic in your Logstash conditionals.
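
For example, a custom input in filebeat.yml could tag its events at the source (the path, tag, and field values here are just illustrative):

  filebeat.inputs:
    - type: log
      paths:
        - /var/log/myapp/*.log
      tags: ["custom"]
      fields:
        log_type: myapp

In Logstash you can then branch on the tag with a condition like: if "custom" in [tags] { ... }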

Thanks guys.

I was able to use tags. Both the event.module check and adding fields should work as well.

To test, I was using a static log file with Filebeat, and that threw me off: Filebeat keeps a registry of how far it has read each file, so an unchanged file is not re-sent and nothing showed up in Logstash. For subsequent tests I just renamed the file and it worked.
