How do I replace the @timestamp field with the timestamp from my log file?

Hi, I am very new to the ELK stack. I have a log file that shows the date and time when a file was processed, but when I send it off to Elasticsearch the timestamp field that shows up is the current date and time. How do I configure the Logstash config file so that instead of the real-time timestamp I get the timestamp from the log file?

The way I am processing this log is: Filebeat > Logstash > Elasticsearch > Kibana.

This is an example of the data in the log file:
10.01.17 18:24:02.85 SchLd: Added schedule record: Count[0051] J[0050][CD 0111]EffDt[17 Oct 31]Freq[.2.....]DscDt[17 Oct 31]ImplDt[17 Jul 25]

I want the timestamp to be replaced with the first part: "10.01.17 18:24:02.85".

Here is my config file:

input {
  beats {
    port => "5043"
  }
}

output {
  elasticsearch {
    hosts => [ "localhost:9200" ]
  }
}

I really appreciate your help, guys!

https://www.elastic.co/guide/en/logstash/5.6/plugins-filters-date.html is 100% what you are after :)

I tried looking through it but I still can't seem to figure it out.
I changed my config file to the following, but I still get the timestamp of when Logstash read the line.

input {
  beats {
    port => "5043"
  }
}
filter {
  date {
    match => ["message", "MM.dd.yy HH:mm:ss.SS"]
    target => ["@timestamp"]
  }
}
output {
  elasticsearch {
    hosts => [ "localhost:9200" ]
  }
}

message is the entire line. You need to parse the date and time out into a single field using a dissect or grok filter and then pass that to the date filter. Think of the date pattern as being anchored at both ends -- ^MM.dd.yy HH:mm:ss.SS$
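For example, something along these lines (a sketch, untested; the field name log_time is just illustrative):

filter {
  grok {
    # capture "10.01.17 18:24:02.85" into its own field, keeping the rest of the line as message
    match => { "message" => "(?<log_time>%{MONTHNUM}\.%{MONTHDAY}\.%{YEAR} %{TIME}) %{GREEDYDATA:message}" }
    overwrite => ["message"]
  }
  date {
    # the pattern now matches the extracted field end to end
    match => ["log_time", "MM.dd.yy HH:mm:ss.SS"]
    target => "@timestamp"
  }
}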

OK, I was able to extract the date in my log file and replace the timestamp with it... but for lines in my log file that do not contain a date, it puts in the current time of when it read those lines. For example, there are lines that just contain "****", "Job Ended", etc. Is there a way to replace the date of those lines with the date my log was created?

Thank you for your help.

Trying to keep future Logstash users who may face a similar problem in mind: in grok, "(?<Original_Time>...)" is a custom pattern I created for the time, since there is no built-in pattern for the date format of my log file. The "(?<field>pattern)" syntax is a regular-expression named capture group. And %{GREEDYDATA:message} is the rest of the message after the date.

input {
  beats {
    port => "5043"
  }
}
filter {
  grok {
    match => ["message", "(?<Original_Time>%{MONTHNUM}.%{MONTHDAY}.%{YEAR} %{TIME}) %{GREEDYDATA:message}"]
  }
  date {
    match => [ "OriginalLogTime", "MM.dd.yy HH:mm:ss.SS" ]
    target => ["@timestamp"]
    remove_field => ["OrginalLogTime"]
  }
}
output {
  elasticsearch {
    hosts => [ "localhost:9200" ]
  }
}

You use grok to fill in Original_Time, then try to parse the date out of OriginalLogTime (and remove_field misspells it a third way, OrginalLogTime). Make the names match and it will work. (Or not, depending on what you want the timezone to be.)
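For instance, keeping the grok capture as Original_Time, the date filter would become (sketch):

date {
  match => [ "Original_Time", "MM.dd.yy HH:mm:ss.SS" ]
  target => "@timestamp"
  # consider leaving remove_field out until you have verified the parse
  remove_field => ["Original_Time"]
}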

By the way, when debugging simple filters like this, I always use

input { stdin {} }
output { stdout { codec => rubydebug } }

and then drop one line at a time into stdin. I would also advise you to keep temporary fields until you have verified they are being used correctly.
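For example (the file name is arbitrary): save those two lines plus your filter block as test.conf, run

bin/logstash -f test.conf

and paste the sample line from the top of the thread. The rubydebug codec prints every field of the resulting event, so you can see exactly what grok and date produced.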

Oh OK, I just got the remove part from somebody else's example I saw online, but I'll look out for that next time. Thank you, Badger!
