Logstash csv date deactivate

Hi,

I am new to the ELK stack.

I am trying to import a CSV that contains person data into Elasticsearch.
Everything is working fine; the only problem is that I would like to change BIRTH_DATE to "text" so I can use it for data matching, but it is automatically set to "date" and I can't find a way to change it.

Hope somebody can help me, it's driving me crazy :sweat_smile:

There are several possibilities. What is the format of the date field, and what does your input/filter/output configuration file look like?

The date format is YYYY-MM-DD.

input{
	file {
		path => "/home/demo/csv/demo.csv"
		start_position => "beginning"
		sincedb_path => "/dev/null"
	}

}

filter{
	csv {
		separator => ";"
		columns => ["PERSON_ID","FIRST_NAME","LAST_NAME","BIRTH_DATE"]
	
	}

}

output {
	elasticsearch {
		hosts => "http://localhost:9200"
		index => "demo"
		# the option is document_id, not _id
		document_id => "%{PERSON_ID}"
	}
	stdout {}
}

I also tried adding this to the filter, but it did not work:

mutate {
	# mutate's convert target is "string"; "text" is not a valid type here
	convert => { "BIRTH_DATE" => "string" }
}

Logstash will treat that as a string by default. However, Elasticsearch will convert it to a date if dynamic date detection is enabled (and it is on by default).
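One way to turn that off is to create the index with dynamic date detection disabled before any documents are indexed. A rough sketch, assuming a local Elasticsearch on port 9200 and the index name `demo` from your config (on older Elasticsearch versions you may need a mapping type name nested under `mappings`):

```shell
# create the index with dynamic date detection disabled,
# so date-looking strings like 1990-05-17 stay as text/keyword
curl -X PUT "localhost:9200/demo" -H 'Content-Type: application/json' -d'
{
  "mappings": {
    "date_detection": false
  }
}'
```

This has to happen before the first document arrives; it will not change an index that already has a `date` mapping for the field.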

Thanks for your answer.

So the thing I could not figure out yet is:

If I try to deactivate dynamic date detection, it is not possible because the mapping has already been created and data has been added to it.

Do I need to map the fields first and then add the data?

That's where I am stuck... if I use the import script I posted, the fields are mapped dynamically, the data is added, and I am not able to change the data type anymore.

In Elasticsearch, once the type of a field is set, it cannot be changed. You would either need to disable dynamic date detection, or use a template to set the mapping. In either case you will need to create a new index.
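The template route might look something like this — a sketch, where the template name `demo_template` is made up and the exact API depends on your Elasticsearch version (recent versions use `_index_template` instead of the legacy `_template` endpoint shown here):

```shell
# template applied to any new index matching demo*:
# disables date detection and maps BIRTH_DATE explicitly as keyword
curl -X PUT "localhost:9200/_template/demo_template" -H 'Content-Type: application/json' -d'
{
  "index_patterns": ["demo*"],
  "mappings": {
    "date_detection": false,
    "properties": {
      "BIRTH_DATE": { "type": "keyword" }
    }
  }
}'
```

After creating the template, delete and recreate the `demo` index (or reindex into a fresh one) so the new mapping takes effect.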

Thanks for your help!
I just created a CSV with sample data and used strings and ints where I needed them.
That solved the problem.

Thanks again