Filter multiple logs from the same file to different indexes

Hello,

I have a syslog file that aggregates logs from multiple sources (and I can't separate the logs into different files), and I have Filebeat monitoring that single file and sending it to Logstash.

That file has the following format:

2020-05-22T12:09:18+00:00 10.100.2.137 MESSAGE_LOG
2020-05-22T12:09:18+00:00 10.100.1.138 MESSAGE_LOG
2020-05-22T12:09:18+00:00 10.100.3.136 MESSAGE_LOG

In my Logstash config file I have the following filter (I want to parse more fields, but I use GREEDYDATA for this example):

    filter {
        grok {
            match => { "message" => "%{TIMESTAMP_ISO8601:timestamp_syslog} %{SYSLOGHOST:syslog_hostname} %{GREEDYDATA:message}" }
            # Without overwrite, grok appends the capture to the existing message
            # field, turning it into an array of [original line, parsed remainder]
            overwrite => [ "message" ]
        }
    }
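
For illustration, a fuller pattern might look something like the following (the program and pid fields here are assumptions, since the actual MESSAGE_LOG format isn't shown):

    filter {
        grok {
            # Hypothetical expansion: assumes the message body looks like
            # "program[pid]: text" -- adjust to the real MESSAGE_LOG format
            match => { "message" => "%{TIMESTAMP_ISO8601:timestamp_syslog} %{SYSLOGHOST:syslog_hostname} %{DATA:program}(?:\[%{POSINT:pid}\])?: %{GREEDYDATA:message}" }
            overwrite => [ "message" ]
        }
    }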

Now I want to send the logs to different indexes based on the syslog_hostname field.

In the output section, can I do something like:
if [syslog_hostname] == 10.100.2.137{elasticsearch{...}}

Thank you

You can, but it would be

if [syslog_hostname] == "10.100.2.137" {elasticsearch{...}}
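
Spelled out as a complete output section (the hosts and index names here are placeholders), that approach looks like:

    output {
        if [syslog_hostname] == "10.100.2.137" {
            elasticsearch {
                hosts => ["http://localhost:9200"]   # placeholder
                index => "syslog-10.100.2.137"
            }
        } else if [syslog_hostname] == "10.100.1.138" {
            elasticsearch {
                hosts => ["http://localhost:9200"]
                index => "syslog-10.100.1.138"
            }
        } else {
            elasticsearch {
                hosts => ["http://localhost:9200"]
                index => "syslog-other"
            }
        }
    }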

However, if you do it that way you will have multiple elasticsearch outputs, each maintaining its own connection pool. It might be better to do something like

    output {
        elasticsearch {
            index => "%{syslog_hostname}"
            ...
        }
    }
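
One thing to watch: the field value becomes part of the index name, and Elasticsearch index names must be lowercase (the IP addresses here already are). You may also want a prefix and a date for easier management, for example (the syslog- prefix and hosts are just placeholders):

    output {
        elasticsearch {
            hosts => ["http://localhost:9200"]   # placeholder
            # sprintf references: the field value plus the event date
            # are interpolated into the index name
            index => "syslog-%{syslog_hostname}-%{+YYYY.MM.dd}"
        }
    }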

That said, do you really need a separate index for each host? If you have a large set of hosts, that will add a lot of overhead, since each index has its own shards that consume cluster resources.
