I have logs that are read after the fact, and I need to enter them on my timeline as the events occurred, not as the logs are read by my shippers. Log entries look like this:
2018-05-18 01:07:03 - Nagios XI [32] system:localhost - cmdsubsys: User [abcdef01] applied a new configuration to Application subsys
In my Logstash conf file I believe I've isolated the datestamp, and now I just need to convert it to epoch:
filter {
  grok {
    match => [ "message", "20%{DATESTAMP:LogTime}" ]
  }
  grok {
    match => { "message" => "%{GREEDYDATA:event}" }
  }
}
Ideally I need the time as a floating-point epoch value (yes, I know my log doesn't include milliseconds, so I know it'll be rounded).
Any help you could provide would be appreciated.
-krw
match => [ "message", "20%{DATESTAMP:LogTime}" ]
DATESTAMP won't work properly. Here's its definition: `DATESTAMP %{DATE}[- ]%{TIME}`, where `DATE` is `%{DATE_US}|%{DATE_EU}`. Your date format doesn't match DATE_US or DATE_EU. Use TIMESTAMP_ISO8601 instead.
match => { "message" => "%{GREEDYDATA:event}" }
This isn't useful. It just copies the whole contents of the message field into the event field.
Thanks, Magnus. Now I've got most of what I need with this configuration:
filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} ?%{GREEDYDATA:message}" }
    overwrite => ["message"]
  }
  date {
    locale => "en"
    match => ["timestamp", "yyyy-MM-dd HH:mm:ss"]
    target => "@timestamp"
  }
}
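As a sanity check, the sample line's timestamp parses cleanly with that `yyyy-MM-dd HH:mm:ss` pattern. Here's the equivalent parse in plain Ruby (assuming the log times are UTC; in the real pipeline the date filter applies whatever timezone Logstash is configured with):

```ruby
require 'time'

# Parse the sample log timestamp the same way the date filter does.
# The '+0000' is appended here because we *assume* the logs are UTC.
t = Time.strptime('2018-05-18 01:07:03 +0000', '%Y-%m-%d %H:%M:%S %z')
puts t.utc.iso8601  # the value @timestamp will carry
```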
How do I add a field with the time in epoch format, i.e. the same time as in the timestamp field?
-krw
You need to use a ruby filter. I think the timestamp object in the @timestamp field has a to_f method that you can use, so
event.set('epoch', event.get('@timestamp').to_f)
should work.
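Putting that together, a minimal version of the filter might look like this (the field name `epoch` is just an example; pick whatever suits your mapping):

```
filter {
  ruby {
    code => "event.set('epoch', event.get('@timestamp').to_f)"
  }
}
```

`to_f` returns seconds since the epoch as a Float, so with second-resolution logs the fractional part will simply be `.0`.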