Create multiple indices with multiple inputs in Logstash

Hi Team,

Can anyone help me configure multiple indices from multiple inputs with Logstash?
I am unable to create multiple indices in Elasticsearch (with multiple if conditions). One index is getting created but not both,
and it works well if I use just one condition with if/else.
But what if I want to create more than two indices from multiple inputs?

My logstash.conf file:
input {
    udp {
        host => "127.0.0.1"
        port => 10514
        codec => "json"
        type => "rsyslog"
    }
    file {
        type => "fortigate"
        path => "/home/user/Téléchargements/fortiWebFilter.log"
        sincedb_path => "/dev/null"
        start_position => "beginning"
    }
}

filter {
    if [type] == "fortigate" {

        grok {
            match => ["message", "%{SYSLOG5424PRI:syslog_index}%{GREEDYDATA:message}"]
            #overwrite => [ "message" ]
            tag_on_failure => [ "failure_grok_fortigate" ]
        }

        kv {
            value_split => "="
        }

        mutate {
            # I want to use the timestamp inside the logs instead of Logstash's
            # timestamp, so first create a new field holding the syslog date and
            # time fields; the date filter below converts it to @timestamp.
            add_field => { "temp_time" => "%{date} %{time}" }
            #add_field => { "Desti_Country" => "%{dstip}" }
            # The syslog contains a "type" field which messes with the Logstash
            # type field, so we have to rename it. Note: this also renames the
            # [type] that the input set to "fortigate", so the output conditional
            # if [type] == "fortigate" will no longer match - likely why only one
            # index gets created.
            rename => { "type" => "ftg_type" }
            #rename => { "ip" => "Desti_IP" }
            rename => { "subtype" => "ftg_subtype" }
            #add_field => { "type" => "forti_log" }
            convert => { "rcvdbyte" => "integer" }
            convert => { "sentbyte" => "integer" }
        }

        date {
            match => [ "temp_time", "yyyy-MM-dd HH:mm:ss" ]
            timezone => "UTC"
            target => "@timestamp"
        }

        geoip {
            source => "dstip"
            add_field => [ "[geoip][desti_ip]", "%{[geoip][ip]}" ]
        }

        mutate {
            # Add/remove fields as you see fit.
            remove_field => ["syslog_index","sessionid","dstcountry","dstip","transip","country_code3","region_code","country_code2","syslog5424_pri","transport","appcat","srccountry","dstintf","devid","@version","itime","path","logver","logid","vd","host","srcintf","trandisp","location","date","time","service","temp_time","tags","sentpkt","rcvdpkt","log_id","message","poluuid"]

            remove_field => "[geoip][longitude]"
            remove_field => "[geoip][region_code]"
            remove_field => "[geoip][country_code3]"
            remove_field => "[geoip][continent_code]"
            remove_field => "[geoip][country_code2]"
            remove_field => "[geoip][latitude]"
            remove_field => "[geoip][location]"
            remove_field => "[geoip][region_name]"
            remove_field => "[geoip][ip]"
        }
    }
}

output {
    #stdout { codec => rubydebug }
    if [type] == "rsyslog" {
        elasticsearch {
            hosts => ["localhost:9200"]
            index => "rsyslog-index"
        }
    }
    if [type] == "fortigate" {
        elasticsearch {
            hosts => ["localhost:9200"]
            #http_compression => "true"
            index => "forti-index"
        }
    }
}

If you strip down your configuration to the bare minimum - does it work?
input {
    udp {
        host => "127.0.0.1"
        port => 10514
        codec => "json"
        type => "rsyslog"
    }
    file {
        type => "fortigate"
        path => "/home/user/Téléchargements/fortiWebFilter.log"
        sincedb_path => "/dev/null"
        start_position => "beginning"
    }
}

filter {}

output {
    #stdout { codec => rubydebug }
    if [type] == "rsyslog" {
        elasticsearch {
            hosts => ["localhost:9200"]
            index => "rsyslog-index"
        }
    }
    if [type] == "fortigate"{
        elasticsearch {
            hosts => ["localhost:9200"]
            #http_compression => "true"
            index => "forti-index"
        }
    }
}

@Justin_Doles I tested that, it's not working.

Are you getting events from each input? What version of Logstash are you running?

I personally do something similar with Logstash. I have multiple inputs that get sent to different indices. But I do not use type (I think I had an issue with type). I use tags on my inputs to sort them. Like this:

output {
    if "syslog" in [tags] { elasticsearch {...} }
    if "beats" in [tags] { elasticsearch {....} }
}

@Justin_Doles Logstash version 7.3, and yes, I'm getting events from the UDP input. OK, I will now test with tags.

I just tested the simple two-input config and it works,
but why doesn't it work if I add the filter with if [type] == "fortigate" ...?

I honestly don't know. I just vaguely remember having issues with [type]. Tags work just as well. Same search capabilities.

@Justin_Doles can you please give me an example with tags, showing how I can use them in the input, filter, and output?

Here's a simple example. The only thing you need to remember is the syntax: if "tag_name" in [tags] instead of if [type] == "type_name".

input {
    beats {
        port => 5044
        tags => [ "beats" ]
    }

    udp {
        port => 5514
        source_ip_fieldname => "host.ip"
        tags => [ "syslog", "cisco" ]
    }
    
    udp {
        port => 5515
        source_ip_fieldname => "host.ip"
        tags => [ "syslog", "sonicwall" ]
    }
}

filter {
    if "syslog" in [tags] and "sonicwall" in [tags] {
        grok {...}
    }
}

output {
    if "_grokparsefailure" in [tags] {
        file {
            path => "/var/log/logstash/failed_syslog_events-%{+YYYY-MM-dd}.log" 
        }
    }
    if "syslog" in [tags] {
        elasticsearch {
            hosts => ["http://localhost:9200"]
            index => "syslog-1.0"
        }
    }
    if "beats" in [tags] {
        elasticsearch {
            hosts => ["http://localhost:9200"]
            index => "%{[@metadata][beat]}-%{[@metadata][version]}"
        }
    }
}

Thank you, @Justin_Doles. Do you have any idea how to configure a Fortigate firewall to send its logs to Logstash?

Assuming Fortigate supports syslog - I'd set up another UDP input in Logstash to handle it. Just set the port to a different number and tag it appropriately. You can see how I do that in the example. If it's plain syslog, you may even be able to use Logstash's syslog input. I haven't had much luck with that myself.
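For example, a minimal sketch of what that extra input could look like (the port number 5516 and the tag names here are illustrative - I haven't tested this against a real Fortigate):

    input {
        udp {
            # dedicated port for the Fortigate syslog stream
            port => 5516
            tags => [ "syslog", "fortigate" ]
        }
    }

Then you can route it in the output the same way as the others, with if "fortigate" in [tags].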


@Justin_Doles I installed the ELK stack (Elasticsearch, Logstash and Kibana) on my local machine. When the log data grows, my machine freezes - how do I manage that? Or where should the Elasticsearch data be deployed?

@Asmaa_Sarih I'm not sure what you're asking. The installation of Elasticsearch is dependent on the OS you are using. https://www.elastic.co/guide/en/elasticsearch/reference/current/install-elasticsearch.html
You can control where the data lives by editing the elasticsearch.yml file. https://www.elastic.co/guide/en/elasticsearch/reference/current/path-settings.html
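For example, a rough sketch of the relevant elasticsearch.yml settings (the paths are just placeholders - point them at whatever disk has enough room):

    # where the index data is stored
    path.data: /mnt/bigdisk/elasticsearch
    # where Elasticsearch's own logs go
    path.logs: /var/log/elasticsearch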

@Justin_Doles How can I get logs from a remote machine?

I'm not familiar with Fortigate, but you would set a syslog destination as the IP and port that you have configured in Logstash. https://help.fortinet.com/fadc/4-5-1/olh/Content/FortiADC/handbook/log_remote.htm

In the example I gave, I'm using UDP port 5515. I personally like to test just using the stdout output plugin. It writes the log entries to the console - assuming you're launching Logstash from the console. https://www.elastic.co/guide/en/logstash/current/plugins-outputs-stdout.html
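For instance, while testing you can temporarily swap the elasticsearch outputs for something like this (the same rubydebug codec as the commented-out line in your config):

    output {
        # print each parsed event to the console for inspection
        stdout { codec => rubydebug }
    }

Once the events look right in the console, switch back to the elasticsearch outputs.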
