Hi there, I am trying to get custom log information indexed into Elasticsearch using a grok filter.
Logstash configuration file:
input {
  file {
    path => "/etc/logstash/customlog/testfile.log"
    codec => line
    start_position => "beginning"
    ignore_older => 0
    sincedb_path => "/dev/null"
  }
}

filter {
  grok {
    patterns_dir => "/etc/logstash/patterns"
    match => { "message" => "%{DATETIME_PATTERN:timestamp} %{LOGLEVEL:log-level} %{GREEDYDATA:message}" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    user => "elastic"
    password => "omitted"
    index => "custom-log-test"
  }
  #stdout { codec => rubydebug }
}
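In case it's relevant, I'm running Logstash in the foreground against this config, roughly like so (the conf path below is just illustrative, not my exact layout):

# Check the pipeline syntax first; Logstash exits after the check
# (path to the conf file is a placeholder)
/usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/customlog.conf --config.test_and_exit

# Then run the pipeline with debug logging enabled
/usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/customlog.conf --log.level=debug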
Log file example:
2017-10-05 13:10:00 INFO Running Check Time issue
2017-10-05 13:10:00 INFO No need to alert - there are no future times
2017-10-05 13:20:00 INFO Running Check Time issue
2017-10-05 13:20:00 ERROR Issue found - sending alert
I can confirm that Logstash is receiving the lines from the file; this can be seen in the Logstash debug logs like so:
[DEBUG][logstash.inputs.file ] Received line {:path=>"/etc/logstash/customlog/testfile.log", :text=>"2017-10-05 13:40:00 INFO No need to alert - there are no future times"}
etc...
I have also tested my grok pattern against these log lines using http://grokconstructor.appspot.com/do/match and can confirm that the pattern successfully parses each line into the fields I specified.
For reference, here is how my custom pattern is defined (in a pattern file under /etc/logstash/patterns, matching the patterns_dir above):
DATETIME_PATTERN %{YEAR}-%{MONTHNUM}-%{MONTHDAY}[ ]%{HOUR}:%{MINUTE}:%{SECOND}
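In case it helps anyone reproduce, the grok part can also be tested in isolation with a throwaway pipeline that reads from stdin and prints the parsed event; a minimal sketch, reusing the same patterns_dir and match as above:

input {
  stdin { }
}
filter {
  grok {
    patterns_dir => "/etc/logstash/patterns"
    match => { "message" => "%{DATETIME_PATTERN:timestamp} %{LOGLEVEL:log-level} %{GREEDYDATA:message}" }
  }
}
output {
  # rubydebug prints each event with all of its fields;
  # a _grokparsefailure tag appears if the match fails
  stdout { codec => rubydebug }
}

Pasting one of the log lines above into that pipeline should show the timestamp and log-level fields (or a _grokparsefailure tag if the pattern does not match).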
Any idea where this might be going wrong? Even though Logstash is clearly reading lines from the file, when I curl Elasticsearch to list its indices, the "custom-log-test" index does not appear at all, which suggests the data never made it to Elasticsearch.
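For completeness, these are roughly the checks I'm running (same elastic user as in the output block; curl prompts for the password):

# List all indices - this is where custom-log-test never appears
curl -u elastic 'http://localhost:9200/_cat/indices?v'

# Ask for a document count on the index directly
curl -u elastic 'http://localhost:9200/custom-log-test/_count'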
Any troubleshooting tips for diagnosing the issue would be greatly appreciated.