Updating @timestamp is not working

I am trying to update the timestamp using the filter below:

date {
  match => [ "smsdate", "dd/MMM/yyyy:HH:mm:ss Z" ]
  target => ["@timestamp"]
}

but Kibana is still showing the current date. How do I make this work? My objective is to use smsdate as @timestamp.

@timestamp is the default target field, so you should not need to specify this. Do you have a smsdate field at the time you call the date filter? If so, what is the format of this field?
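For example, a filter like this (keeping your original match pattern, which we still need to verify against the actual smsdate format) already writes to @timestamp; this is just a sketch:

date {
  # @timestamp is the default target, so no target option is required
  match => [ "smsdate", "dd/MMM/yyyy:HH:mm:ss Z" ]
}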

The smsdate format is text. How do I change it to a date?

Can you show us what the contents of the field look like?

{
  "_index": "filebeat-2018.06.17",
  "_type": "doc",
  "_id": "eocxDGQBruHXTGIb-XoF",
  "_version": 1,
  "_score": null,
  "_source": {
    "destaddr": "628285333724631",
    "@timestamp": "2018-06-17T05:21:32.834Z",
    "smstype": "Incoming",
    "input": {
      "type": "log"
    },
    "sourcenpi": "1",
    "operator_name": "Telkomsel",
    "sourceaddr": "6282340713802",
    "logdate": "2018-06-11 23:56:45,972",
    "sourceton": "1",
    "operator": "62823",
    "host": {
      "name": "smsclog"
    },
    "status": "success_esme",
    "offset": 1887268,
    "@version": "1",
    "prospector": {
      "type": "log"
    },
    "source": [
      "/home/pradana/Data/cdr.log.2018-06-11",
      "[org.mobicents.smsc.library.CdrGenerator]"
    ],
    "fields": {
      "sourcelog": "cdrlog"
    },
    "smsdate": "2018-06-11 23:56:45.916",
    "tags": [
      "beats_input_codec_plain_applied",
      "_dateparsefailure"
    ],
    "addrnpi": "1",
    "debugtype": "DEBUG",
    "message": "2018-06-11 23:56:45,972 DEBUG [org.mobicents.smsc.library.CdrGenerator] 2018-06-11 23:56:45.916,6282340713802,1,1,628285333724631,1,1,success_esme,SS7_HR,message,null,103791,0,null,null,null,null,6281107908,null,0,15,null,0,0,,,,5,"Sepi bae ndk arak ce","",,,",
    "addrton": "1",
    "beat": {
      "version": "6.3.0",
      "hostname": "localhost.localdomain",
      "name": "smsclog"
    }
  },
  "fields": {
    "@timestamp": [
      "2018-06-17T05:21:32.834Z"
    ]
  },
  "sort": [
    1529212892834
  ]
}

Please see above:

smsdate : "2018-06-11 23:56:45.916"

By the way, I don't have the smsdate field before I call the date filter.

You have to have the smsdate field parsed before you can use it. The format also does not match what you have specified in the date filter, which means it will fail even if you have it.
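Given the value shown above ("2018-06-11 23:56:45.916"), a sketch of a date filter whose pattern matches that format (assuming smsdate has already been extracted, e.g. by grok, before this filter runs) would be:

date {
  # pattern matching "2018-06-11 23:56:45.916"
  match => [ "smsdate", "yyyy-MM-dd HH:mm:ss.SSS" ]
  target => "@timestamp"
}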

I have changed it to logdate, which has this content: "logdate": "2018-06-11 00:00:29,900"

In grok I put the following:

grok {
  match => { "message" => "%{TIMESTAMP_ISO8601:logdate:date} %{LOGLEVEL:debugtype} %{DATA:source} %{TIMESTAMP_ISO8601:smsdate},%{WORD:sourceaddr},%{NUMBER:addrton},%{NUMBER:addrnpi},%{WORD:destaddr},%{NUMBER:sourceton},%{NUMBER:sourcenpi},%{WORD:status}" }
}

Here I am expecting logdate to be of type date, but the mapping still shows it as type text:

"logdate": {
"type": "text",

and @timestamp also still refers to the current date.

Please show what the resulting event looks like. Did you use the date filter on that field?

Below is the resulting JSON:

{
  "_index": "filebeat-2018.06.17",
  "_type": "doc",
  "_id": "AYcKDWQBruHXTGIbKI0t",
  "_version": 1,
  "_score": 1,
  "_source": {
    "logdate": "2018-06-11 02:52:48,889",
    "destaddr": "62828987319211",
    "smstype": "Incoming",
    "debugtype": "DEBUG",
    "message": "2018-06-11 02:52:48,889 DEBUG [org.mobicents.smsc.library.CdrGenerator] 2018-06-11 02:52:48.873,6281390136127,1,1,62828987319211,1,1,success_esme,SS7_HR,message,null,100752,0,null,null,null,null,6281107908,null,0,15,null,0,0,,,,3,"Ibu baru sampe karaw","",,,",
    "source": [
      "/home/pradana/Data/cdr.log.2018-06-11",
      "[org.mobicents.smsc.library.CdrGenerator]"
    ],
    "addrnpi": "1",
    "operator_name": "Telkomsel",
    "host": {
      "name": "smsclog"
    },
    "sourceaddr": "6281390136127",
    "smsdate": "2018-06-11 02:52:48.873",
    "status": "success_esme",
    "beat": {
      "version": "6.3.0",
      "name": "smsclog",
      "hostname": "localhost.localdomain"
    },
    "@version": "1",
    "fields": {
      "sourcelog": "cdrlog"
    },
    "@timestamp": "2018-06-17T09:17:01.722Z",
    "sourcenpi": "1",
    "operator": "62813",
    "input": {
      "type": "log"
    },
    "sourceton": "1",
    "addrton": "1",
    "offset": 164237,
    "tags": [
      "beats_input_codec_plain_applied",
      "_dateparsefailure"
    ],
    "prospector": {
      "type": "log"
    }
  },
  "fields": {
    "@timestamp": [
      "2018-06-17T09:17:01.722Z"
    ]
  }
}

logdate is mapped as a string in Elasticsearch because it is not in a format that dynamic mapping can identify as a date field. If you use the date filter with a correct pattern, it will write the field in the correct format for you.
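For example, a date filter that writes the parsed value back into logdate would emit it as an ISO8601 timestamp, which dynamic mapping can then detect as a date. This is only a sketch based on the logdate value shown above:

date {
  # matches "2018-06-11 02:52:48,889" and rewrites logdate as ISO8601
  match => [ "logdate", "yyyy-MM-dd HH:mm:ss,SSS" ]
  target => "logdate"
}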

As I mentioned, this is the grok filter I use during processing:

grok {
  match => { "message" => "%{TIMESTAMP_ISO8601:logdate:date} %{LOGLEVEL:debugtype} %{DATA:source} %{TIMESTAMP_ISO8601:smsdate},%{WORD:sourceaddr},%{NUMBER:addrton},%{NUMBER:addrnpi},%{WORD:destaddr},%{NUMBER:sourceton},%{NUMBER:sourcenpi},%{WORD:status}" }
}

but somehow logdate still has the type "text", not date. How do I change this?

You need to use the date filter to format it correctly. After that you will need to start with a fresh index, as existing mappings cannot be updated.
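One way to get a fresh index is to write to a new index name (a sketch only, assuming events are written by Logstash's elasticsearch output; the host and index name here are placeholders):

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # a new index name so the corrected mapping is applied from scratch
    index => "filebeat-reparsed-%{+YYYY.MM.dd}"
  }
}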

Can you elaborate on how to use the date filter correctly? I have refreshed the data many times but it still does not give the desired result.

After changing the date filter to this, it's working:

date {
  match => [ "logdate", "ISO8601" ]
  target => ["@timestamp"]
}

The timestamp now points to the correct date, but logdate is still of string type.

The example in your initial post is close, but the format "dd/MMM/yyyy:HH:mm:ss Z" does not match "2018-06-11 02:52:48,889". Try using the format yyyy-MM-dd HH:mm:ss,SSS instead.

Thanks Christian. I am using ISO8601 and it's working, but logdate is still of type string, not date.
How do I fix this?

Have a look at this example:

input {
  generator {
    lines => ['2018-06-11 23:56:45,972']
    count => 1
  } 
} 

filter {
    date {
        match => ["message", "yyyy-MM-dd HH:mm:ss,SSS"]
    }
}

output {
  stdout { codec => rubydebug }
}

Change the target field as required.

After making the change below in my Logstash configuration:

date {
  match => [ "smsdate", "ISO8601" ]
  target => ["@timestamp"]
}

date {
  match => [ "smsdate", "ISO8601" ]
  target => ["smsdate"]
}

everything is working as expected: @timestamp follows the event date, and the smsdate field type is date.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.