Mutate filter unable to convert date to string

I have searched a lot regarding this issue and am now posting the problem.
Here is my configuration file:

input {
  file {
    path => "/home/Kunden2.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => [
      "Kunden_Id",
      "Kunden_Gueltig_Ab",
      "Kunden_geultig_Bis",
      "Kunden_Vorname",
      "Kunden_Nachname",
      "Kundenname",
      "Kunden_Othername",
      "Kundenanschrift, Strasse und Hausnummer",
      "Kundenanschrift Postleitzahl",
      "Kunden_Telephon",
      "Kunden_Fax",
      "Kundenanschrift Ort",
      "Kunden_Arzt_Type",
      "Kunden_Gpid",
      "IK-NR.: = Instituationskennzeichen",
      "Instituationskennzeichen_Activeab",
      "Instituationskennzeichen_Activebis"
    ]
  }
  mutate {
    convert => {
      "Kunden_Gueltig_Ab" => "string"
      "Kunden_geultig_Bis" => "string"
      "Instituationskennzeichen_Activeab" => "string"
      "Instituationskennzeichen_Activebis" => "string"
    }
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "Kunden_lexica2"
  }
  stdout {}
}

I am trying to convert a date into a string using the Logstash configuration file. The data loads successfully, but the data type does not change.
I know the other way (mapping the date type manually), like:
PUT temp_index2
{
  "mappings": {
    "doc": {
      "properties": {
        "Kunden_Id": { "type": "text" },
        "Kunden_Gueltig_Ab": { "type": "text" },
        "Kunden_geultig_Bis": { "type": "text" },
        "Kunden_Vorname": { "type": "text" },
        "Kunden_Nachname": { "type": "text" },
        "Kundenname": { "type": "text" },
        "Kunden_Othername": { "type": "text" },
        "Kundenanschrift, Strasse und Hausnummer": { "type": "text" },
        "Kundenanschrift Postleitzahl": { "type": "text" },
        "Kunden_Telephon": { "type": "text" },
        "Kunden_Fax": { "type": "text" },
        "Kundenanschrift Ort": { "type": "text" },
        "Kunden_Arzt_Type": { "type": "text" },
        "Kunden_Gpid": { "type": "text" },
        "IK-NR.: = Instituationskennzeichen": { "type": "text" },
        "Instituationskennzeichen_Activeab": { "type": "text" },
        "Instituationskennzeichen_Activebis": { "type": "text" }
      }
    }
  }
}
Kindly help me out; I want to do it via the first method. Thank you in advance.

Have you reindexed the data? Changing the data type in Logstash won't affect the mappings of an existing ES index.
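For example, after creating a new index with the correct mappings (the names here are placeholders; I'm assuming the data currently lives in an index called kunden_lexica2 and that you've created a correctly mapped kunden_lexica3), a reindex would look something like:

POST _reindex
{
  "source": { "index": "kunden_lexica2" },
  "dest":   { "index": "kunden_lexica3" }
}

Note that ES index names must be lowercase, so an index configured as "Kunden_lexica2" in Logstash would be rejected anyway.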

Thank you for the answer. Creating a manual index and then re-indexing works. But what about doing it through Logstash, as I described in my problem statement? I want to change the data type before loading the data.

You can certainly change the data type of a field in a Logstash event (and doing so will affect ES's auto-detection of field types), but nothing you do in Logstash can change the mappings of an existing ES index.

If ES incorrectly detects a string as a date there's nothing Logstash can do to avoid that.

@magnusbaeck Can you please elaborate on your answer with the help of an example, i.e. for my case mentioned above in the question? That would be great for understanding.

I don't understand what you're asking. You already know how to use the mutate filter to convert the data type of a field in Logstash and you know how to use the put mapping API to explicitly set the mappings of a field. What is still unclear?

If Logstash sends a field that does not exist in the index with a string value such as "2018-08-06T17:18:50.558Z" to Elasticsearch, then unless there is a template that tells it not to, Elasticsearch will map that field as a date. You cannot change that behaviour in Logstash.

So you can use a template, or you can force the field to exist as a string by ingesting a dummy record that does not look like a date.
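A minimal index template sketch for the first option (the template name and index pattern are placeholders; the field names are taken from your config) would map the date-like fields as text before any data arrives:

PUT _template/kunden_template
{
  "index_patterns": ["kunden_lexica*"],
  "mappings": {
    "doc": {
      "properties": {
        "Kunden_Gueltig_Ab": { "type": "text" },
        "Kunden_geultig_Bis": { "type": "text" },
        "Instituationskennzeichen_Activeab": { "type": "text" },
        "Instituationskennzeichen_Activebis": { "type": "text" }
      }
    }
  }
}

With this in place, any new index whose name matches the pattern gets these mappings, and date detection no longer applies to those fields.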

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.