Field definition in Logstash and Elasticsearch

I have created three fields:
one, index-number, extracted from the path, and two static ones, as defined in the config file below.

grok {
	match => { "path" => "indexer-(?<index-number>[0-9]*)" }
}
mutate {
	convert => { "index-number" => "integer" }
	gsub => ["time_stamp", ",", ".", "time_stamp", " ", "T"]
	replace => { "time_stamp" => "%{time_stamp}Z" }
}
if [type] == "indexer" {
	mutate {
		add_field => {
			"application_s" => "bcf"
			"sub_application" => "indexer"
		}
	}
}
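For reference, the gsub and replace steps rewrite a log4j-style timestamp (comma before the milliseconds, space between date and time) into the ISO 8601 form that the date mapping expects. A minimal sketch of that transformation in Python, assuming a sample value like "2021-03-04 10:15:30,123" (the sample is hypothetical):

```python
# Sketch of the mutate filter's time_stamp rewrite (sample value is hypothetical).
def to_iso8601(time_stamp: str) -> str:
    time_stamp = time_stamp.replace(",", ".")  # gsub: comma -> dot before milliseconds
    time_stamp = time_stamp.replace(" ", "T")  # gsub: space -> T between date and time
    return time_stamp + "Z"                    # replace: append the UTC designator

print(to_iso8601("2021-03-04 10:15:30,123"))  # -> 2021-03-04T10:15:30.123Z
```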

and the mapping in Elasticsearch is as follows:

PUT /localtest3
{
  "settings": {
    "number_of_shards": 1
  },
  "mappings": {
    "properties": {
      "time_stamp": {
        "type": "date"
      },
      "log_level": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword"
          }
        }
      },
      "thread": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword"
          }
        }
      },
      "class": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword"
          }
        }
      },
      "msg": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword"
          }
        }
      },
      "index-number": {
        "type": "integer"
      },
      "application_s": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword"
          }
        }
      },
      "sub_application": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword"
          }
        }
      }
    }
  }
}
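To double-check that the index really carries these explicit types (and that none of the fields fell back to dynamic mapping), you can read the mapping back in the same console:

GET /localtest3/_mapping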

Still, in the Kibana search it shows a "?" before the fields and I can't apply filters to them.

You need to refresh the index pattern in Kibana.

Go to the Kibana Settings, then Index Patterns, and select the index pattern of your index; this will trigger a refresh of the fields.