I have tried to get this working for a couple of days now. It seems so simple that I must be missing something. I have read all the topics and tried to find examples to help me get it working, with no luck. I just want the timestamp for events in an index to be the timestamp from the log file, not the time Logstash ingests it. Here is my config:
input {
  file {
    path => "/app/buckets/bladelogic/**/*.log"
    stat_interval => 30
    start_position => "beginning"
    ignore_older => 0
  }
}

filter {
  grok {
    match => { "message" => "%{SYSLOG5424SD}:time_stamp" }
  }
  date {
    match => [ "time_stamp", "ISO8601" ]
    remove_field => [ "time_stamp" ]
  }
}

output {
  elasticsearch {
    hosts => ["1.2.3.4.5:9200","1.2.3.4.6:9200"]
    index => "bladelogic"
    user => "pete"
    password => "xxxxx"
  }
}
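To take Elasticsearch out of the picture while debugging, I have also been running a stripped-down pipeline that reads from stdin and prints events with the rubydebug codec, with the same filter block as above, so I can paste a log line and see exactly which fields and tags (e.g. _grokparsefailure) come out:

```
input { stdin { } }
filter {
  grok {
    match => { "message" => "%{SYSLOG5424SD}:time_stamp" }
  }
  date {
    match => [ "time_stamp", "ISO8601" ]
  }
}
output { stdout { codec => rubydebug } }
```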
Some sample events:
[21 Feb 2017 15:07:20,673] [Scheduled-System-Tasks-Thread-3] [INFO] [System:System:] [Memory Monitor] Total JVM (B): 1341128704,Free JVM (B): 882924056,Used JVM (B): 458204648,VSize (B): 7945830400,RSS (B): 1835081728,Used File Descriptors: 311,Used Work Item Threads: 1/50,Used Client Connections: 1/200,DB Client-Connection-Pool: 0/0/0/100/75/25,DB Job-Connection-Pool: 0/0/0/100/75/25,DB General-Connection-Pool: 2/2/0/100/75/25
[21 Feb 2017 15:08:20,674] [Scheduled-System-Tasks-Thread-19] [INFO] [System:System:] [Memory Monitor] Total JVM (B): 1341128704,Free JVM (B): 870601656,Used JVM (B): 470527048,VSize (B): 7945830400,RSS (B): 1835081728,Used File Descriptors: 303,Used Work Item Threads: 1/50,Used Client Connections: 1/200,DB Client-Connection-Pool: 0/0/0/100/75/25,DB Job-Connection-Pool: 0/0/0/100/75/25,DB General-Connection-Pool: 2/2/0/100/75/25
[21 Feb 2017 15:09:20,673] [Scheduled-System-Tasks-Thread-9] [INFO] [System:System:] [Memory Monitor] Total JVM (B): 1341128704,Free JVM (B): 864725088,Used JVM (B): 476403616,VSize (B): 7945830400,RSS (B): 1835081728,Used File Descriptors: 312,Used Work Item Threads: 1/50,Used Client Connections: 1/200,DB Client-Connection-Pool: 0/0/0/100/75/25,DB Job-Connection-Pool: 0/0/0/100/75/25,DB General-Connection-Pool: 1/1/0/100/75/25
[21 Feb 2017 15:10:20,675] [Scheduled-System-Tasks-Thread-13] [INFO] [System:System:] [Memory Monitor] Total JVM (B): 1341128704,Free JVM (B): 858982272,Used JVM (B): 482146432,VSize (B): 7945830400,RSS (B): 1835081728,Used File Descriptors: 303,Used Work Item Threads: 1/50,Used Client Connections: 1/200,DB Client-Connection-Pool: 0/0/0/100/75/25,DB Job-Connection-Pool: 0/0/0/100/75/25,DB General-Connection-Pool: 1/1/0/100/75/25
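To sanity-check the bracketed-timestamp match outside Logstash, here is a rough Python approximation of what a SYSLOG5424SD capture would grab from the start of a sample line. The regex is my own stand-in for the grok definition (which is roughly \[%{DATA}\]), not the exact pattern, and here it captures the contents without the surrounding brackets:

```python
import re

# Approximation of grok's SYSLOG5424SD: a bracketed section like
# [21 Feb 2017 15:07:20,673]. Here the named group captures only
# the text inside the brackets.
SYSLOG5424SD = r"\[(?P<time_stamp>[^\]]*)\]"

sample = ("[21 Feb 2017 15:07:20,673] [Scheduled-System-Tasks-Thread-3] "
          "[INFO] [System:System:] [Memory Monitor] Total JVM (B): 1341128704")

m = re.match(SYSLOG5424SD, sample)
print(m.group("time_stamp"))  # 21 Feb 2017 15:07:20,673
```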
I get a grok parse failure when browsing events in Kibana, and there is no time_stamp field when I go to create the index pattern or after the data has been indexed. Any ideas what I am missing?
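One other thing I wondered about: the timestamps in these lines do not look like ISO8601 to me. As a quick check, the sample timestamp parses in Python with the strptime directives below, which I believe correspond to a Joda-style "dd MMM yyyy HH:mm:ss,SSS" pattern on the Logstash side (that mapping is my assumption):

```python
from datetime import datetime

# Timestamp taken from the sample log lines above.
ts = "21 Feb 2017 15:07:20,673"

# %d %b %Y %H:%M:%S,%f — day, abbreviated month name, year, time,
# comma-separated fractional seconds.
parsed = datetime.strptime(ts, "%d %b %Y %H:%M:%S,%f")
print(parsed.isoformat())  # 2017-02-21T15:07:20.673000
```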
Thanks,
Pete