Sending date field to Elasticsearch through Logstash

I have a log format like
2021-10-04 14:20:27,552 ---
and I want the field in Elasticsearch to be of date type,
but I am getting string type.
My Logstash config file is:


input {

  file{
	path => "D:/Project/ogs/**/*.log"
	#path => "C:/ELK_Stack/logstash-7.4.0/var/log/**/*.log"
	start_position => "end"
	codec => multiline{
			#pattern => "^\s"
			#what => "previous"
			pattern => "^[0-9]{4}-[0-9]{2}-[0-9]{2}"
			negate => true
			what => "previous"
	}
  }
}
filter{
	grok {
		match => {"message" => "%{TIMESTAMP_ISO8601:time_stamp}\s%{WORD:log_level}\s%{JAVACLASS:class}\s(\[%{DATA:thread}\])\s+(?<msg>(.|\r|\n)*)"}
	}
	
	mutate{
		#add_field => {"path" => "%{[log][file][path]}"}
		copy => {"msg" => "message"}
	}
	grok{
		match => {"path" =>"(?<index_name>[^\\/]+?(?=\.\w+$))"}
	}
	mutate{
		gsub => ["time_stamp", " ","T"]
	}
	date{
		match => ["time_stamp" , "yyyy-MM-dd HH:mm:ss,SSS"]
	}
}

output {
	stdout{
		codec => rubydebug
	}
	
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "localtest"
    #user => "elastic"
    #password => "changeme"
  }

}

You need to set the mapping as a date. If you look at it right now, it is probably text/keyword, because that is the default dynamic mapping when you don't set the mapping prior to ingest.

Run GET localtest/_mapping to see what you currently have.
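If time_stamp was dynamically mapped from a string, the response will show something like this (illustrative; this is the default dynamic mapping Elasticsearch applies to string values):

"time_stamp": {
  "type": "text",
  "fields": {
    "keyword": {
      "type": "keyword",
      "ignore_above": 256
    }
  }
}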

You can't change the mapping of an existing field, so you need to either delete your index, set the mapping, and then reingest the data, or reindex to a temp index and then reindex back to the original.

PUT localtest
{
  "mappings": {
    "properties": {
      "time_stamp": {
        "type": "date"
      }
    }
  }
}
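If you go the reindex route, the Dev Console steps look roughly like this (a sketch; the temp index name localtest_tmp is my own placeholder):

POST _reindex
{
  "source": { "index": "localtest" },
  "dest": { "index": "localtest_tmp" }
}

DELETE localtest

Then create localtest again with the date mapping shown above, and reindex back:

POST _reindex
{
  "source": { "index": "localtest_tmp" },
  "dest": { "index": "localtest" }
}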

Do I need to make any changes to the Logstash config file?
And where do I write this PUT? In the Kibana console, right?

Reading your topic again: are you looking to have a field called time_stamp in your index, or to set that as @timestamp?

What you are doing here is copying the time_stamp field into @timestamp.

date{
		match => ["time_stamp" , "yyyy-MM-dd HH:mm:ss,SSS"]
	}

You run these in the Kibana Dev Console.

I don't know how the date filter works. I just want the date and time from the log file as a date-type field, so that I can use it to create a visualization (a line chart with date and time on the x-axis).
I should get the field time_stamp as date type, not string.

	date{
		match => ["time_stamp" , "yyyy-MM-dd HH:mm:ss,SSS"]
	}

What this does is set a field in your index called @timestamp using the values of time_stamp. After that, you no longer need the field time_stamp unless you need it for another reason.
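If you do want to keep time_stamp itself as a date-typed field, one option (a sketch using the date filter's target setting, which writes the parsed date into the named field instead of @timestamp) is:

	date {
		match  => ["time_stamp", "yyyy-MM-dd HH:mm:ss,SSS"]
		target => "time_stamp"
	}

Combined with the date mapping from earlier, the field will then be indexed as a date.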

If you have not already done so, create an index pattern, and when you get to the option to select a field for time series, select @timestamp.

Now when you create visualizations you can use that field for your purpose.