Elasticsearch 6.0 multiple types problem

Hi everyone,
I'd like to ask for help with sending logs to Elasticsearch.
I use the kv filter to parse logs from Fortigate devices.
One of the logged fields is named type, which is automatically parsed into the _type meta field.
Since Elasticsearch removed support for multiple types in new indexes, I can't index logs from Fortigate.
I get an error:
Rejecting mapping update to [fg-2017.11.23] as the final mapping would have more than 1 type: [utm, traffic]
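
For context, Fortigate sends its logs as key=value pairs, so a raw line looks roughly like the hypothetical example below (not copied from my real logs). As far as I understand, kv {} turns the type=... pair into a type field on the event, and the elasticsearch output then uses that field as the document type when document_type is not set, which is where the utm and traffic types in the error come from.

  date=2017-11-23 time=10:15:01 devname=fw01 type=utm subtype=webfilter action=blocked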

I tried to modify the _type field in Logstash before sending to Elasticsearch, like below, but no luck.

input {
  udp {
    port => 3341
    add_field => { "log_source" => "fortigate" }
  }
}
filter {
  if [log_source] == "fortigate" {
    kv {}
    mutate { copy => { "index_type" => "logtype" } }
    mutate { remove_field => [ "index_type" ] }
  }
}
output {
  if [log_source] == "fortigate" {
    elasticsearch {
      hosts => [ "localhost:9200" ]
      index => 'fg-%{+YYYY.MM.dd}'
    }
  }
}

How can I fix this problem?

Check out this upgrading advice or this sticky in the Logstash subforum.

What you need to do is explicitly set document_type in the elasticsearch output, like this:

elasticsearch {
  hosts => [ "localhost:9200" ]
  index => 'fg-%{+YYYY.MM.dd}'
  document_type => "doc"
}

This may not work for existing indexes, as they will already have a type defined in their mappings, but it will work for all new indexes.
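
If you're not sure which type name an existing index or template already uses, you can check it first and pick a matching document_type. A quick sketch with curl (the index and template names below are just placeholders):

  # type(s) already present in an existing daily index
  curl -s 'localhost:9200/fg-2017.11.23/_mapping?pretty'

  # type defined in an index template, if you have one
  curl -s 'localhost:9200/_template/fortigate?pretty'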

Hi Abdon,
When I added document_type to the elasticsearch output, it worked!
I already had a mapping template for that index, so I had to set document_type => fortigate, where fortigate is the template's type name.
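
For anyone else in the same situation, here is a minimal sketch of what pinning the type name in an Elasticsearch 6.x template looks like (the index pattern and field below are just examples, not my actual template):

  curl -s -XPUT 'localhost:9200/_template/fortigate' -H 'Content-Type: application/json' -d '
  {
    "index_patterns": ["fg-*"],
    "mappings": {
      "fortigate": {
        "properties": {
          "srcip": { "type": "ip" }
        }
      }
    }
  }'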

Thanks for the help,
Regards!

Thanks Abdon!

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.