How to change the datatype of a field in Elasticsearch

How do I change the datatype of a field in Elasticsearch, for example from string to date?

This depends on your config.

If you can provide more information on what you are trying to do we can probably help :slight_smile:

Hi,
I have fields that come through with a string datatype after the Logstash output. I want them to be integer and date instead.
Please let us know how we can achieve this.
Let me know if you need any info.
~ ELK user

If you use grok you can capture fields as integers with %{PATTERNNAME:fieldname:int}, but you can also use the mutate filter's convert option for an explicit type conversion. Strings that look like dates will be mapped as dates by Elasticsearch's dynamic mapping.
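For example, a minimal grok sketch (the pattern and field names here are made up for illustration):

    filter {
      grok {
        # :int tells grok to store the captured value as an integer
        # in the event instead of a string
        match => ["message", "%{NUMBER:response_time:int} %{GREEDYDATA:rest}"]
      }
    }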

Hi,
Thanks for the reply.
I used %{PATTERNNAME:fieldname:int}, but the field still isn't getting converted.
I looked at the documentation for the mutate filter, but I couldn't find anything about data type conversion.
Could you paste some sample code that has worked for you?
Regards
ELKUSER

I used %{PATTERNNAME:fieldname:int}, but the field still isn't getting converted.

To be clear, existing indexes won't have their data types converted after you make this change, but if the data type of a field in the JSON document that Logstash produces is an integer, Elasticsearch will map that field as an integer. The new mapping only takes effect when the next index is created, i.e. typically the next day.
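In other words, with Logstash's default daily indices (a sketch of a typical elasticsearch output; the index pattern shown is the Logstash default), each day's events go to a fresh index whose mapping is fixed when that index is created:

    output {
      elasticsearch {
        # one index per day; a new index (and thus a new mapping)
        # is created when the date, in UTC, rolls over
        index => "logstash-%{+YYYY.MM.dd}"
      }
    }

So the new field types only show up in the next day's index; the old indices keep their old mappings.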

I looked at the documentation for the mutate filter, but I couldn't find anything about data type conversion.

As I said, use the convert parameter.
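For the record, here's a minimal sketch of it (assuming a string field named response_time that should become an integer):

    filter {
      mutate {
        # explicitly convert the field's value to an integer in the event
        convert => ["response_time", "integer"]
      }
    }

Note that convert changes the type of the value inside the Logstash event; the Elasticsearch mapping still only changes when a new index is created, as described above.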


Hi,
Thanks for the clarification. I'll change these and check tomorrow to see if the datatypes have changed.

Hi,

I am having the exact same issue with numeric fields from IIS logs (e.g. timetaken) being mapped into Elasticsearch as strings (and consequently being unusable for visualisation). I found this thread yesterday, implemented the :int suffix on my NUMBER fields as suggested, and waited for the overnight index rollover. No change; my numeric fields are still being mapped as strings :-(.

Here's my filter:

if [type] == "iis_fnc" {
grok {
match => ["message", "%{TIMESTAMP_ISO8601:timestamp} %{IPORHOST:hostip} %{WORD:method} %{URIPATH:page} %{NOTSPACE:query} %{NUMBER:port:int} %{NOTSPACE:username} %{IPORHOST:clientip} %{NOTSPACE:useragent} %{NUMBER:response:int} %{NUMBER:subresponse:int} %{NUMBER:scstatus:int} %{NUMBER:timetaken:int}"]
}
}

And here's the mapping (for timetaken) from Elasticsearch today:

{"type":"string","index":"not_analyzed","ignore_above":256}}},"timetaken":{"type":"string","norms":{"enabled":false},"fields":{"raw":

I'm guessing that I've missed a step (probably a very obvious one). Can you assist?

I have resolved this issue... or more accurately, time has resolved this issue... I thought the logs rolled over at midnight. Not in New Zealand they don't: midnight UTC is midday here :slight_smile: After the indices rolled over, the new ones were created with the correct types.
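For reference, once the new index was created, the mapping for a field like timetaken should look something along these lines (Elasticsearch's dynamic mapping maps JSON whole numbers to long):

    "timetaken": { "type": "long" }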

Then the challenge was dealing with the mapping mismatch between the old and new indices...