Replace @timestamp with the log's time, somebody help me please

Hi everyone
I am using Logstash and Elasticsearch 6.5 and I want to replace @timestamp with the log's time, but it failed and I don't understand where I'm wrong.

My log format:

2019-06-27 08:55:22 INFO 43665 SMS-MT [cid:nnx_smppgw_local_deli01] [queue-msgid:46fbddf5-2739-45cf-a0d5-ee7d39a69345] [smpp-msgid:42834c77-c957-4399-bb73-f701cc3c6962] [status:ESME_ROK] [prio:0] [dlr:NO_SMSC_DELIVERY_RECEIPT_REQUESTED] [validity:none] [from:1595] [to:84949191816] [content:'Thoi tiet Binh Thuan 27/6: Nang gian doan, tu chieu toi mua dong vai noi, kha nang mua 60%, 25-33 do, do am 77%. Vung bien: Co mua rao va dong rai rac. Tam nhin xa giam xuong 4-10km trong mua. Gio tay nam cap 4-5, co luc cap 6, giat cap 7. Trong con dong de phong loc xoay. Chi tiet goi 18001195(0d)']

Here is my Logstash config:

input {
  beats {
    port => 5044
    # Set to false if you do not use SSL
    ssl => false
    # Delete the lines below if you do not use SSL
    #ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    #ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
    client_inactivity_timeout => 600
  }
}

filter {
  if "smpp_nnx_2" in [tags] {
    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:date_time} %{LOGLEVEL:log} %{GREEDYDATA:logdata}" }
      overwrite => [ "message" ]
    }
    date {
      match => [ "date_time", "yyyy-MM-dd HH:mm:ss,SSS" ]
      remove_field => [ "date_time" ]
    }
  }
}

output {
  if "smpp_nnx_2" in [tags] {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "smpp_nnx_2"
      document_type => "%{[@metadata][type]}"
    }
    stdout { codec => rubydebug }
  }
}

Here is my index pattern when I create it in Kibana:

It doesn't have any of the fields I created!

Somebody help me please.

Hi,

Can you post the JSON of one of the entries in Kibana? If the date filter fails, it adds a tag called "_dateparsefailure".
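If you want to spot those failed events quickly, a conditional like this in the output section (just a sketch) would print only the documents carrying that tag:

if "_dateparsefailure" in [tags] {
  stdout { codec => rubydebug }
}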

My guess is that the date_time format in the date filter fails because you do not have millisecond precision. Can you try:

date {
  match => [ "date_time", "yyyy-MM-dd HH:mm:ss" ]
  remove_field => [ "date_time" ]
}

Hi Wolfram_Haussig, thanks for your help, but how can I post the JSON of one of the entries in Kibana? Can you show me?

I tried replacing the date filter in my Logstash config with yours:

date {
  match => [ "date_time", "yyyy-MM-dd HH:mm:ss" ]
  remove_field => [ "date_time" ]
}

but it doesn't work either.



In the Discover page click on the arrow on the left side of an entry. You will see the table view of all fields. Then click on the JSON tab on the right of the table tab. Now you can copy the content of the JSON document.

Hi Wolfram_Haussig, here is my JSON from Kibana:


{
  "_index": "smpp_nnx_2",
  "_type": "doc",
  "_id": "IMTGl2sBLXmOKnoJsZ9A",
  "_version": 1,
  "_score": null,
  "_source": {
    "log": {
      "file": {
        "path": "/var/log/jasmin/messages.log"
      }
    },
    "logdata": " 66196 SMS-MT [cid:nnx_smppgw_local_deli01] [queue-msgid:83bba1b6-0ae4-4f98-9b9e-70a7fa0b475a] [smpp-msgid:0aefca6e-285e-4d86-a4ed-9b88d59af9e1] [status:ESME_ROK] [prio:0] [dlr:NO_SMSC_DELIVERY_RECEIPT_REQUESTED] [validity:none] [from:1595] [to:84948870442] [content:'GA bi SOC NHIET khi nang nong keo dai. QK goi ngay 18001195 (0d) de duoc chuyen gia tu van cach ha nhiet do cho ga.']",
    "@timestamp": "2019-06-27T07:12:09.000Z",
    "beat": {
      "version": "6.8.0",
      "name": "SMPP-NNX-02",
      "hostname": "SMPP-NNX-02"
    },
    "prospector": {
      "type": "log"
    },
    "tags": [
      "smpp_nnx_2",
      "beats_input_codec_plain_applied"
    ],
    "@version": "1",
    "input": {
      "type": "log"
    },
    "message": "2019-06-27 14:12:09 INFO 66196 SMS-MT [cid:nnx_smppgw_local_deli01] [queue-msgid:83bba1b6-0ae4-4f98-9b9e-70a7fa0b475a] [smpp-msgid:0aefca6e-285e-4d86-a4ed-9b88d59af9e1] [status:ESME_ROK] [prio:0] [dlr:NO_SMSC_DELIVERY_RECEIPT_REQUESTED] [validity:none] [from:1595] [to:84948870442] [content:'GA bi SOC NHIET khi nang nong keo dai. QK goi ngay 18001195 (0d) de duoc chuyen gia tu van cach ha nhiet do cho ga.']",
    "source": "/var/log/jasmin/messages.log",
    "offset": 265390037,
    "host": {
      "os": {
        "version": "7 (Core)",
        "family": "redhat",
        "codename": "Core",
        "name": "CentOS Linux",
        "platform": "centos"
      },
      "containerized": true,
      "architecture": "x86_64",
      "name": "SMPP-NNX-02",
      "id": "41cb6e4b7919403591a2b90d176a6308"
    }
  },
  "fields": {
    "@timestamp": [
      "2019-06-27T07:12:09.000Z"
    ]
  },
  "sort": [
    1561619529000
  ]
}

It looks like your time is parsed correctly now?
You have 2019-06-27T07:12:09 in your @timestamp field, and 2019-06-27 14:12:09 in your log message.

The offset of 7 hours might be fixed by specifying the correct timezone in the date filter.
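For example, a sketch with an explicit timezone (the 7-hour offset suggests UTC+7; Asia/Ho_Chi_Minh is my guess from the message content, so adjust as needed):

date {
  match => [ "date_time", "yyyy-MM-dd HH:mm:ss" ]
  timezone => "Asia/Ho_Chi_Minh"
  remove_field => [ "date_time" ]
}

With the timezone stated explicitly, 2019-06-27 14:12:09 local time is always stored as 2019-06-27T07:12:09.000Z, regardless of the timezone of the machine running Logstash.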

Hi BennyIncBenny
Thanks for your help, but I thought ELK could replace @timestamp with the date_time field I created?

Do you want to replace the value in the @timestamp field with the value from the date_time field you grok'ed? This is already done, I think.

Or do you want the field to be called date_time? If so, you would need to specify the target in the date filter (default is @timestamp).
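If that renaming is the goal, a sketch (the target name here is just an illustration):

date {
  match => [ "date_time", "yyyy-MM-dd HH:mm:ss" ]
  target => "parsed_time"
}

This writes the parsed date into parsed_time and leaves @timestamp untouched.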

Hi BennyIncBenny
I don't think it's done, because when I add the timestamp field it doesn't show the log's time.

You can see in this pic the Time Field and the @timestamp field; I want them to show the log's time. Can you help me please?

Well, in your discover view it shows the @timestamp field's value as "June 27th 2019, 16:06:15:271", which is the same as the timestamp in the log message (2019-06-27 16:06:15) - just in a different format.
So do you want the date to be formatted differently? This can be done in the Kibana settings.
Or did you want the message field to not contain the timestamp anymore? If so you could try displaying your logdata field instead of the message field.

Hi BennyIncBenny

I have lots of old log files that need to be uploaded to ELK, but when I upload an old log file, the time field and @timestamp do not show the log's time correctly. That is why I want to replace @timestamp with the log's time.

Your filters already grok the timestamp into the date_time field, which you then map to the @timestamp field using the date filter. That in itself is correct.
Also, in the screen captures you provided, @timestamp always matched the time given in the message field. What exactly do you think is incorrect? Can you find a doc where the @timestamp field was processed incorrectly?


Hi BennyIncBenny

This is an example of the wrong time when I load an old log into ELK:


Can you fix that for me?

What is the value of the tags field on those messages?

Hi Badger

Other values don't matter; I just need to change @timestamp to the log's time.

But the tags would indicate whether a date parse failure occurred.
It would be good to determine that. For testing, it would also help if you commented out the removal of the date_time field, so that you can check what it contained if the date parse fails.

Your date filter is conditional depending on the value of [tags], and if the date filter finds a value in [date_time] but fails to parse it then it will add a value to [tags], so the contents of [tags] are most certainly useful in diagnosing the problem.
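For that test, the date filter from the config could temporarily look like this (just a debugging sketch, with the removal commented out so the raw value stays visible in the indexed document):

date {
  match => [ "date_time", "yyyy-MM-dd HH:mm:ss,SSS" ]
  # keep date_time in the document while debugging
  # remove_field => [ "date_time" ]
}

Once documents stop showing the _dateparsefailure tag, you can re-enable remove_field.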

Hi BennyIncBenny

Thanks for your help, I have found a solution to my problem; now I can replace @timestamp with date_time.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.