I have the following filter section in my Logstash conf file and a customized pattern in my patterns file:
Pattern: LOGTIMESTAMP %{YEAR}%{MONTHNUM}%{MONTHDAY}
Filter in Logstash:
filter {
  grok {
    patterns_dir => ["/home/chamith/work/ELK/logstash/logstash-2.3.4/bin/patterns"]
    match => { "message" => "%{GREEDYDATA} %{LOGTIMESTAMP:logtimestamp}" }
  }
  mutate {
    add_field => {
      "timestamp" => "%{LOGTIMESTAMP}"
    }
    remove_field => ["logtimestamp"]
  }
  date {
    match => ["logtimestamp", "yyyyMMdd HH:mm:ss"]
    target => "logtimestamp"
  }
}
And the output in Kibana:
One line from the log file which I'm indexing:
20160805 00:00:01.296,GetProvisioning,3,W1oOOW8oj58GhglVjVNg0Ssl4CXA1P,50219--1958335734-1470326399706,SUCCESS,GetProvisioningTransactionId-01223,null,W1oOOW8oj58GhglVjVNg0Ssl4CXA1P,en,CELCOM_MY_DCB
What I need is to get the timestamp from the log event and assign it to a new date field. Where am I going wrong?
Any help would be appreciated.
Screenshots are annoying, please don't post them. Please copy/paste from Kibana's JSON tab or Logstash's stdout { codec => rubydebug } output.
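For reference, a debugging output along these lines is enough to see exactly what each event looks like (a minimal sketch, independent of whatever output section you already have):
output {
  stdout { codec => rubydebug }   # print every event as a readable hash
}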
There are multiple problems with your configuration.
First of all, your grok filter isn't working (note the presence of the _grokparsefailure tag), so you're not getting the logtimestamp field you want. It's not working because you're requiring a space before your timestamp. Your grok expression should begin with ^%{LOGTIMESTAMP:logtimestamp}.
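As a sketch, assuming the same patterns_dir and the LOGTIMESTAMP pattern you posted, the grok filter would look something like this:
filter {
  grok {
    patterns_dir => ["/home/chamith/work/ELK/logstash/logstash-2.3.4/bin/patterns"]
    # Anchor at the start of the line instead of requiring a space before the timestamp
    match => { "message" => "^%{LOGTIMESTAMP:logtimestamp}" }
  }
}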
Why are you even using the grok filter? This is CSV data, so it's much easier to use the csv filter.
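Something like the following would do it; the column names here are placeholders I made up, not names from your application:
filter {
  csv {
    separator => ","
    # Hypothetical column names; rename them to whatever the fields actually mean
    columns => ["logtimestamp", "operation", "code", "token", "session_id",
                "status", "transaction_id", "extra", "token2", "language", "channel"]
  }
}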
mutate {
  add_field => {
    "timestamp" => "%{LOGTIMESTAMP}"
  }
Here you're trying to add a new field timestamp containing the contents of the LOGTIMESTAMP field. But there is no such field, because the field you're capturing the timestamp into is called logtimestamp. LOGTIMESTAMP is the name of the grok pattern.
remove_field => ["logtimestamp"]
Here you're removing the field that actually contains the timestamp.
Bottom line: Delete your mutate filter. It serves no purpose.
Thank you so much for the reply. Sorry about the screenshot; since I'm a newbie I didn't know that. I'll take it off.
Even though it's CSV content, could I use the csv filter for a random log file? Because I'm giving the input as a .log file. If I delete the mutate part, how am I going to add a new field and assign the log event timestamp to it?
And how could I create a date type field just like the @timestamp field rather than a string? So that I could assign the log event timestamp to that date type field.
Even though it's CSV content, could I use the csv filter for a random log file? Because I'm giving the input as a .log file.
The file extension doesn't matter.
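For example, a file input like this reads a .log file just as happily as any other extension (the path below is only an example, not your actual one):
input {
  file {
    path => "/path/to/your/logs/*.log"   # example path only
    start_position => "beginning"        # read existing content, not just newly appended lines
  }
}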
If I delete the mutate part, how am I going to add a new field and assign the log event timestamp to it?
The grok filter does that for you. I suggest you play around a little bit with the filter to see what it actually does.
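For instance, a stripped-down sketch in the spirit of the advice above (no mutate at all, still assuming your custom LOGTIMESTAMP pattern) would be:
filter {
  grok {
    patterns_dir => ["/home/chamith/work/ELK/logstash/logstash-2.3.4/bin/patterns"]
    # grok itself creates the logtimestamp field; no mutate/add_field needed
    match => { "message" => "^%{LOGTIMESTAMP:logtimestamp}" }
  }
  date {
    # Note: LOGTIMESTAMP as defined above only captures the date part of the line
    match  => ["logtimestamp", "yyyyMMdd"]
    target => "logtimestamp"
  }
}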
And how could I create a date type field just like the @timestamp field rather than a string? So that I could assign the log event timestamp to that date type field.
If you use the date filter you should be okay.
Spot on, thanks.
I had this as my date filter:
date {
  add_field => { "timestamp" => "%{logtimestamp}" }
  match => [ "timestamp", "yyyyMMdd HH:mm:ss.SSS" ]
  target => "timestamp"
}
But then what I'm getting from Kibana is the same old string type field (timestamp) and not the date type.
Drop the add_field option. Right now you're overwriting the current timestamp value with the original logtimestamp contents, undoing what the date filter just did. This should be clearly visible if you look at the Logstash output.
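In other words, a sketch of your date filter with the add_field option removed (assuming the timestamp field is already being populated further up in the pipeline):
date {
  match  => [ "timestamp", "yyyyMMdd HH:mm:ss.SSS" ]
  target => "timestamp"   # parses the string in place, giving a date-typed field
}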
The timestamp field has now been mapped as a string, and mappings can't be changed without reindexing. Unless you have valuable data in your test index, the easiest way out is to just delete it and start over.
Oh wow, it works. My idiocy. Dropping the add_field and recreating the index worked.