Why does my ES mapping, created automatically through Logstash, have so many unreadable field types?

ENV:

logstash 6.2.4
elasticsearch 6.2.4

My logstash config:

filter {
  if [type] == "pro" {
    ............
    prune {
      whitelist_names => [ "^@timestamp$","^index_month$","^type$","^model$","^host$","^agent$","^version$","^path$","^uripath$","^message$","^@version$","^userId$","^level$","^os$","^from$","^ip$","^geoip$" ]
    }
  }
}

I have used the prune plugin and whitelisted the fields that I want to output to ES, but there are still some unreadable field types in the ES mapping.

Have you confirmed that [type] is equal to "pro"?
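
A quick way to verify is to temporarily add a stdout output alongside the elasticsearch output, for example (a sketch, to be removed once checked):

output {
  stdout { codec => rubydebug }
}

If the printed events do not contain "type" => "pro", the filter condition never matches and the prune filter is skipped entirely.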

Thanks for your reply!

Most of the data is output into ES correctly, but there are still unreadable field types in the mapping when the index is created.

Can you show us the Logstash config section where you set type to pro? The mappings indicate that you are indexing with the document type set to doc, but that could of course be set in your elasticsearch output plugin.
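
For reference, a minimal sketch of how the document type could be set explicitly in the elasticsearch output (document_type is deprecated in 6.x and defaults to doc when not set):

elasticsearch {
  hosts => "http://es.app.com:9200"
  index => "logstash-app-%{index_month}"
  document_type => "doc"    # in Logstash 6.x this defaults to "doc" if omitted
}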

Thanks for your reply!

I set the type in the input section of my Logstash config:

input {
  kafka {
        bootstrap_servers => "kafka.app.com:9092"
        topics => [ "app" ]
        codec => "json"
        type => "pro"
        group_id => "pro"
        consumer_threads => 2
  }
}

And the output section:

output {

  if [type] == "pro" {

      if [level] == "DEBUG" {
          elasticsearch {
              codec => plain{ charset => "UTF-8" }
              hosts => "http://es.app.com:9200"
              index => "logstash-app-%{index_month}"
          }
      }

  }

}
