I have logs coming from various time zones into my live ELK Stack. I have a table of time differences between those time zones and the local time zone where my ELK Stack runs. Using this table, whenever Logstash parses a timestamp from a log, I want to adjust that timestamp to my local time zone (depending on the source) before indexing into Elasticsearch, so that when I view the documents in Kibana I can easily compare them across time zones. Is there any way to do this in Logstash?
It sounds easier to stamp the events with the correct timezone at the source but sure, what you describe is possible. The translate filter should be useful for mapping "sources" (whatever that means) to timezone offsets. The resulting offset can be fed to the date filter that parses the timestamp in the log message itself.
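A sketch of how those two filters could be chained (the field name `host`, the dictionary entries, and the `tz` destination field are placeholder assumptions; adjust to however your events identify their source):

```
filter {
  # Map the event's source to a canonical timezone name.
  # "host" and the dictionary contents are made-up examples.
  translate {
    field       => "host"
    destination => "tz"
    dictionary  => {
      "server-ny"  => "America/New_York"
      "server-blr" => "Asia/Kolkata"
    }
    fallback => "UTC"
  }
  # Parse the log's timestamp in that timezone. @timestamp is
  # stored in UTC, so Kibana can compare events directly.
  date {
    match    => [ "timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
    timezone => "%{tz}"
    target   => "@timestamp"
  }
}
```

Note that using a `%{field}` reference in the date filter's timezone option only works in reasonably recent Logstash versions; on older versions you may need conditionals with one date filter per timezone.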
I went through the translate filter and it seems to do string translation through regexes, whereas I have a date field. For example, if I have a timestamp "Apr 12 19:24:26" parsed like

match  => [ "timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
target => "@timestamp"

and due to the timezone issue I want it parsed as "Apr 12 17:24:26", I need to do some numeric calculation, right? How can that be done through translate?
The date filter will do that for you. Look at its timezone option. What you need to do is somehow figure out which timezone should be used.
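For example, if you knew the logs from a given source were written in India time, a single date filter would already do the conversion (the timezone value here is just an illustration):

```
filter {
  date {
    match    => [ "timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
    # Timezone the source wrote the timestamp in (example value).
    # The resulting @timestamp is stored in UTC, so no manual
    # offset arithmetic is needed.
    timezone => "Asia/Kolkata"
    target   => "@timestamp"
  }
}
```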
If you can get the timezone information into the log messages that would be even easier.
Thanks a lot, magnusbaeck, that solved my issue.