Can I replace the Logstash timestamp with the timestamp from my log file?

I want my log event's timestamp to replace Logstash's @timestamp. What should I do?

{
"message" => "DEBUG",
"@version" => "1",
"@timestamp" => "2015-10-21T07:00:59.979Z", ###this timestamp is of logstash
"host" => "HFX2WS1",
"path" => "C:\Users\egupanm\csv\logs.log",
"timestamp" => "2015-10-21 12:30:59",# this is my timestamp
"log_level" => "TRACE",
"line" => 16,
"time" => 45059,
"difference" => 26
}

How can I replace @timestamp with my timestamp?
Please don't just link to other threads; I have read them and did exactly what they describe, but I still can't achieve my use case. So please help.


Use the date filter (as I assume the other threads have suggested). If you couldn't get it to work, show us what you've tried so far and why it wasn't satisfactory.

input {
  file {
    path => "C:\Users\egupanm\csv\logs.log"
    start_position => "beginning"
  }
}
filter {
  grok {
    # Note: no spaces are allowed inside the named capture, i.e. (?<timestamp>...), not (?< timestamp >...)
    match => [ "message", "(?<timestamp>%{YEAR:year}-%{MONTHNUM:month}-%{MONTHDAY:day} %{HOUR:hour}:%{MINUTE:minute}:%{SECOND:second}) %{LOGLEVEL:log_level} %{NUMBER:line:int}" ]
  }
  ruby {
    # var1 and var2 persist across events; they are initialized in an external ruby.rb (explained below)
    code => "
      hr  = event['hour'].to_i
      min = event['minute'].to_i
      sec = event['second'].to_i
      event['time'] = hr * 3600 + min * 60 + sec   # seconds since midnight
      event['message'] = var2
      event['difference'] = event['time'].to_i - var1
      var1 = event['time'].to_i
      var2 = event['log_level']
      event.cancel if event['difference'] <= 20
    "
  }
  mutate {
    remove_field => ['year', 'month', 'day', 'hour', 'minute', 'second']
  }
}
output {
  stdout {
    codec => rubydebug
  }
}

With this config I am trying to keep only the events where the time gap is more than 20 seconds, together with the corresponding log level.
Now I want to replace Logstash's @timestamp with my timestamp field so that I can see it in Kibana.
What should I add? Because I had to break my timestamp apart in order to compute the difference, the date filter pattern is not matching, so please tell me the right pattern.
My timestamp looks like this: 2015-10-21 12:31:56

Please help

var1 and var2 are variables that I added in the ruby filter file ruby.rb. I did this in order to fulfil my use case, and it works; now I only want to replace the timestamp.

Again, use the date filter. Finding an example of how to use it to parse ISO8601 dates like the one in your timestamp field should be easy. You may be able to use the "ISO8601" date format pattern instead of a "YYYY-..." style pattern.

I am getting a _dateparsefailure when I add:

date {
  match => ["timestamp", "ISO8601"]
}

True, this exact format isn't recognized as ISO8601 (which probably is a bug). Try "YYYY-MM-dd HH:mm:ss".
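For reference, a minimal sketch of that suggestion, assuming the timestamp field holds values like 2015-10-21 12:30:59 as shown earlier in the thread:

```
date {
  # Joda-Time pattern matching "2015-10-21 12:30:59"; on success the
  # parsed value is written to @timestamp by default.
  match => ["timestamp", "YYYY-MM-dd HH:mm:ss"]
}
```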

I tried, but @timestamp was not replaced.

So what did happen? Did the message get a _dateparsefailure tag? Please show the complete message.

No _dateparsefailure, but @timestamp didn't get replaced with my timestamp.

More information is required for debugging. Please show the complete message.

Thank you, it got resolved; there was an error in my pattern. But now I have run into a new issue:
the timestamps shown by Logstash and by Kibana differ.

for example:

{
"message" => "ERROR",
"@version" => "1",
"@timestamp" => "2015-10-21T16:09:54.077Z",#logstash timestamp
"host" => "HFX2WS1",
"path" => "C:\Users\egupanm\csv\logs.log",
"timestamp" => "2015-10-21T16:09:54.077Z", # my timestamp
"timezone" => "Z",
"log_level" => "TRACE",
"line" => 16,
"time" => 58194,
"difference" => 32
}

They are the same here, but Kibana is showing something different:

October 21st 2015, 21:41:00.864 message:ERROR @version:1 @timestamp:October 21st 2015, 21:41:00.864 host:HFX2WS1 path:C:\Users\egupanm\csv\logs.log timestamp:October 21st 2015, 21:41:00.864 timezone:Z log_level:TRACE line:16 time:58,260 difference:66 _id:AVCJ_NwwTpSlmoB2KRh3 _type:logs _index:example
October 21st 2015, 21:39:54.077 message:ERROR @version:1 @timestamp:October 21st 2015, 21:39:54.077 host:HFX2WS1 path:C:\Users\egupanm\csv\logs.log timestamp:October 21st 2015, 21:39:54.077 timezone:Z log_level:TRACE line:16 time:58,194 difference:32 _id:AVCJ-9odTpSlmoB2KRh2 _type:logs _index:example

How is this possible? Logstash gives the timestamp "@timestamp" => "2015-10-21T16:09:54.077Z",
and Kibana gives @timestamp:October 21st 2015, 21:41:00.864.
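One observation on the values above: 16:09:54 UTC and 21:39:54 differ by exactly 5 h 30 min, which matches a UTC+05:30 browser timezone (an assumption; your offset may differ). The rubydebug codec always prints @timestamp in UTC, while Kibana renders it in the browser's local time. A quick Ruby check of that arithmetic:

```ruby
require 'time'

# @timestamp as printed by the rubydebug codec (always UTC)
utc = Time.parse("2015-10-21T16:09:54.077Z")

# What a browser in a UTC+05:30 timezone (assumed) would display
local = utc.getlocal("+05:30")
puts local.strftime("%B %-d %Y, %H:%M:%S.%L")  # prints "October 21 2015, 21:39:54.077"
```

The same instant, two renderings; nothing in the stored document differs.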

How do you know it's the same message? If you suspect Kibana might be doing something weird keep in mind that you can always fetch a document directly from ES.

I indexed the same data into Elasticsearch:

input {
  file {
    path => "C:\Users\egupanm\csv\logs.log"
    start_position => "beginning"
  }
}
filter {
  grok {
    match => [ "message", "(?<timestamp>%{YEAR:year}-%{MONTHNUM:month}-%{MONTHDAY:day}T%{HOUR:hour}:%{MINUTE:minute}:%{SECOND:second}%{ISO8601_TIMEZONE:timezone}) %{LOGLEVEL:log_level} %{NUMBER:line:int}" ]
  }
  date {
    match => ["timestamp", "yyyy-MM-dd'T'HH:mm:ss.SSSZ"]
  }
  ruby {
    # var1 and var2 persist across events; they are initialized in the external ruby.rb
    code => "
      hr  = event['hour'].to_i
      min = event['minute'].to_i
      sec = event['second'].to_i
      event['time'] = hr * 3600 + min * 60 + sec
      event['message'] = var2
      event['difference'] = event['time'].to_i - var1
      event.cancel if var1 == 0
      var1 = event['time'].to_i
      var2 = event['log_level']
      event.cancel if event['difference'] <= 20
    "
  }
  mutate {
    remove_field => ['year', 'month', 'day', 'hour', 'minute', 'second']
  }

}

output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    # the rubydebug codec does not belong on the elasticsearch output and has been dropped here
    cluster => "elastic"
    action  => "index"
    host    => "localhost"
    index   => "example"
  }
}

and Kibana is reading the data from Elasticsearch; the index name is example.

So what should I do? The whole point of replacing the timestamp was to plot events accordingly, but the timestamp is wrong again. What should I do?

Be systematic and simplify your pipeline. Ignore ES for now. Just use the stdout output. Remove the ruby filter. Process a single message from the file. Do you get what you expect? Yes? Continue adding one thing at a time until you get something unexpected. Over and out.
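That stripped-down pipeline might look like this (a sketch reusing the path from above; TIMESTAMP_ISO8601 is a stock grok pattern substituted for the hand-built one):

```
input {
  file {
    path => "C:\Users\egupanm\csv\logs.log"
    start_position => "beginning"
  }
}
filter {
  grok {
    match => [ "message", "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:log_level} %{NUMBER:line:int}" ]
  }
  date {
    match => ["timestamp", "yyyy-MM-dd'T'HH:mm:ss.SSSZ"]
  }
}
output {
  stdout {
    codec => rubydebug
  }
}
```

Once @timestamp comes out right here, reintroduce the ruby filter, the mutate, and finally the elasticsearch output, one at a time.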

I used the date filter to solve it, like:

date {
  match => ["timestamp", "dd/MMM/yyyy:HH:mm:ss Z"]
  target => "@timestamp"
}
