Hello, I want to ship my own log file to Logstash, then use grok to map the date from the log file to a field. But in Kibana that field isn't read as a timestamp, so I can't use the real timestamp in Timelion.
Here is my config file:
input {
  file {
    path => "E:\LogFolder\xxx.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  grok {
    patterns_dir => ["E:\ELK\patterns"]
    match => {
      "message" => "%{DATE_YMD:my_timestamp}"
    }
  }
  date {
    match => ["my_timestamp", "yyyy MM dd HH:mm:ss"]
    locale => "sv"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "file_poller_bw"
  }
  stdout {
    codec => rubydebug
  }
}
Without knowing what your log file looks like, we can't help.
For example, a line looks like this:

2015 Mar 23 19:33:31:432 GMT +7 BW.FilePollerBW
Your date pattern is wrong; you need MMM instead of MM. The part of the pattern used for the time probably also needs adjustments, but it depends on what your my_timestamp field looks like.
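For a line like the one above, the month is an abbreviated name (MMM in Joda-Time patterns), and the milliseconds are separated from the seconds by a colon rather than a dot. A minimal sketch of the corrected date filter, assuming my_timestamp captures everything up to the milliseconds and the "GMT +7" suffix is not part of the capture (the timezone value below is an assumption for a UTC+7 location):

filter {
  date {
    # Matches e.g. "2015 Mar 23 19:33:31:432"
    match => ["my_timestamp", "yyyy MMM dd HH:mm:ss:SSS"]
    # Assumption: the "GMT +7" offset is not captured into my_timestamp,
    # so the zone is supplied explicitly instead.
    timezone => "Asia/Bangkok"
  }
}

If the field does include the zone suffix, the pattern would need to account for it too, since any trailing text the pattern doesn't cover makes the parse fail.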
I've changed it to this:
filter {
  grok {
    patterns_dir => ["E:\ELK\patterns"]
    match => {
      "message" => "%{DATE_YMD:my_timestamp}"
    }
  }
  date {
    match => ["my_timestamp", "yyyy MMM dd HH:mm:ss"]
    target => "my_timestamp"
  }
}
but I got "_dateparsefailure". Actually, all I want is for "my_timestamp" to be mapped in Elasticsearch as a date type, because I want to use it in Timelion.
As I said, the time part of your pattern probably needs adjustments too. I can't be specific since I don't know exactly what the my_timestamp field contains (because I don't know how you've defined the DATE_YMD pattern). When the date filter fails to parse a timestamp, it will tell you in the log which part of the string it had a problem with.
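To illustrate, here is one way the custom pattern and the date filter could be made to agree for the sample line. This is only a sketch under the assumption that DATE_YMD is meant to capture through the milliseconds; your actual pattern file may differ:

# Hypothetical contents of a file in E:\ELK\patterns
# (assumption: captures "2015 Mar 23 19:33:31:432" but not the zone)
DATE_YMD %{YEAR} %{MONTH} %{MONTHDAY} %{HOUR}:%{MINUTE}:%{SECOND}:%{INT}

# Matching filter section:
filter {
  grok {
    patterns_dir => ["E:\ELK\patterns"]
    match => { "message" => "%{DATE_YMD:my_timestamp}" }
  }
  date {
    # SSS with a literal colon before it, to match the colon-separated
    # milliseconds in the captured string
    match => ["my_timestamp", "yyyy MMM dd HH:mm:ss:SSS"]
    target => "my_timestamp"
  }
}

With a successful parse and target => "my_timestamp", the field is overwritten with a proper date value, which Elasticsearch can then map as a date type for use in Timelion. Note that if the index already exists with my_timestamp mapped as text, it would need to be reindexed (or a new index created) for the date mapping to take effect.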