Hi again forum!
I never thoug it could be that hard... but I'm unable to properly handle log file acquired data... because a date problem.
Google is full of entries around, but none solved my problem, and I'm affraid I'm complacting things too much, making things worse.
My problem is that I hace a log file, quite simple in its date format, and surprisingly too short I think:
.... 2015-12-31 00:00:00 ....
so, I fugure a pattern YYYY-MM-dd HH:mm:ss
My grok filter apparently matches all log fields quite good... including those date fields, I do like this:
.... %{YEAR:sc_year}-%{MONTHNUM:sc_month}-%{MONTHDAY:sc_day} %{TIME:sc_time} ...
And I got my nice sc_year, sc_month, sc_day and sc_time correctly...
Now, the problem, obviously comes as I try to get the @timestamp field with my log read values, instead of the default @tiemstamp values that include the capture time data.
I have struggled with tons of combinations, leading to no success...
So, I post here my last setup (just the last, not probably the best, I'm getting lost...), hoping that someone could point me in the right direction.
grok match line is cut, since it was too long, I pasted just the relevant portion
....
grok {
match => [ "message", " ... %{YEAR:sc_year}-%{MONTHNUM:sc_month}-%{MONTHDAY:sc_day} %{TIME:sc_time} .... " ]
}
mutate {
add_field => { "mytimestamp" => "%{sc_year}-%{sc_month}-%{sc_day} %{sc_time}.000+02:00" }
}
date {
locale => "en"
match => [ "mytimestamp", "YYYY-MM-dd HH:mm:ss,SSSz" ]
timezone => "Europe/Madrid"
target => "@timestamp"
add_tag => [ "tsmatch" ]
}
....
Could you give me some clue?
Thank you in advance, best regards!