Set up index for fields created with Logstash

Hello,
I have a Windows server running Filebeat, which reads a log generated by FileZilla Server.
I send the events to a Logstash server in order to add some fields (message, action, user, ...), then I send them on to Elasticsearch.

I see my data in Kibana, but I can't search by the fields I have created.
I see two messages:

  1. No cached mapping for this field. Refresh field list from the Management => Index Patterns page.

     That one is clear enough.

  2. Unindexed fields can not be searched.

How can I fix that?
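In the meantime, one way to check whether the fields are actually indexed is to pull the index mapping straight from Elasticsearch. A minimal sketch, assuming Python with the requests library and an Elasticsearch host of localhost:9200 (replace with your own host and index pattern):

import requests

# Sketch: fetch the mapping of the Filebeat indices and check whether one of
# the custom grok fields made it in. Host and index pattern are assumptions.
resp = requests.get("http://localhost:9200/filebeat-*/_mapping")
resp.raise_for_status()

for index, body in resp.json().items():
    mappings = body.get("mappings", {})
    # On Elasticsearch 6.x the fields sit under a document type
    # (e.g. mappings["doc"]["properties"]); on 7.x+ they are directly
    # under mappings["properties"]. Handle both layouts.
    props = mappings.get("properties")
    if props is None:
        first_type = next(iter(mappings.values()), {})
        props = first_type.get("properties", {})
    print(index, "has action_filezilla:", "action_filezilla" in props)

If the field shows up in the mapping, it is indexed, and the Kibana warning is only about a stale cached field list.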

Here is my Logstash configuration:

input {
  beats {
    port => 5044
  }
}

filter {
  # Try both FileZilla log formats; grok stops at the first pattern that
  # matches. Using a single grok with a pattern array avoids a spurious
  # _grokparsefailure tag when a line matches only one of the two formats.
  grok {
    match => { "message" => [
      "\(%{GREEDYDATA:id_filezilla}\) %{DATESTAMP:date_filezilla} -    %{GREEDYDATA:compte_filezilla} \(%{IPV4:adresseip_filezilla}\)> %{GREEDYDATA:action_filezilla}",
      "\(%{GREEDYDATA:id_filezilla}\) %{DATESTAMP:date_filezilla} - \(%{GREEDYDATA:compte_filezilla}\) \(%{IPV4:adresseip_filezilla}\)> %{GREEDYDATA:action_filezilla}"
    ] }
  }
}

output {
  elasticsearch {
    hosts => ["XXXX:9200"]
    # With manage_template => false the new fields are still indexed;
    # they just get Elasticsearch's default dynamic mapping.
    manage_template => false
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}
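To sanity-check the grok patterns outside Logstash, here is a rough Python equivalent of the second pattern; the sample log line is invented for illustration, so the real FileZilla Server format may differ in spacing and date layout:

import re

# Rough Python equivalent of the second grok pattern above, for testing.
# The sample line is hypothetical; adjust it to match your real log.
pattern = re.compile(
    r"\((?P<id_filezilla>.*)\) "
    r"(?P<date_filezilla>\d{1,2}/\d{1,2}/\d{4} \d{1,2}:\d{2}:\d{2}) - "
    r"\((?P<compte_filezilla>.*)\) "
    r"\((?P<adresseip_filezilla>\d{1,3}(?:\.\d{1,3}){3})\)> "
    r"(?P<action_filezilla>.*)"
)

line = "(000042) 14/02/2019 10:23:45 - (myuser) (192.168.1.50)> STOR upload.txt"
match = pattern.match(line)
if match:
    print(match.groupdict())

If a line prints None here, grok will tag it _grokparsefailure too, and the custom fields will never be created for that event.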

Is something missing?

Thanks for your help.

Problem solved.
To fix this, go to the Index Patterns page (Management => Index Patterns), then refresh the field list for each Filebeat index pattern.

By default, I was on the winlogbeat index pattern.
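To double-check on the Elasticsearch side that the new field is really searchable, one option is an exists query; a minimal sketch, again assuming Python with requests and a localhost:9200 host:

import requests

# Sketch: count documents where the custom field exists, confirming it is
# indexed and searchable. Host and index pattern are assumptions; replace
# them with your own.
resp = requests.get(
    "http://localhost:9200/filebeat-*/_search",
    json={"size": 0, "query": {"exists": {"field": "action_filezilla"}}},
)
resp.raise_for_status()
print("matching docs:", resp.json()["hits"]["total"])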
