I have Logstash, Kibana, and Elasticsearch (all version 5.6.2) configured and running on Windows Server 2012 R2, which I access over a remote desktop connection. I am working on my Logstash config file to read data from a .txt file, send it to Elasticsearch, and then view it in Kibana. I can't figure out for the life of me why I am getting errors on my config file, or why Kibana says there are no matching indices for "logstash-*".
I am very new at this, so bear with me please!
logstash.conf:
input {
  file {
    path => "//ntsvc/logs/*.txt"
  }
}

filter {
  grok {
    match => { "%{TIMESTAMP_ISO8601}%{SPACE}Local4.%{LOGLEVEL}%{SPACE}%{IP}%{SPACE}%{CISCOTIMESTAMP}%%{SYSLOGPROG} %{GREEDYDATA:message}" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
The only settings I changed in logstash.yml are:
config.test_and_exit: true
config.reload.automatic: true
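As far as I understand from the settings documentation, config.test_and_exit: true only makes Logstash check the configuration and then exit, and the same check can apparently be run from the command line with the -t flag (shorthand for --config.test_and_exit), so maybe leaving it enabled in logstash.yml has something to do with the missing index? For example:

PS C:\ELK\logstash> bin/logstash -f logstash.conf -t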
When I run Logstash with this config file in PowerShell, I get the following error:
PS C:\ELK\logstash> bin/logstash -f logstash.conf
Sending Logstash's logs to C:/ELK/logstash/logs which is now configured via log4j2.properties
[2017-10-11T11:50:07,363][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"C:/ELK/logstash/modules/fb_apache/configuration"}
[2017-10-11T11:50:07,603][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"C:/ELK/logstash/modules/netflow/configuration"}
[2017-10-11T11:50:07,690][FATAL][logstash.runner ] The given configuration is invalid. Reason: Expected one of #, => at line 10, column 144 (byte 231) after
filter {
grok {
match => { "%{TIMESTAMP_ISO8601}%{SPACE}Local4.%{LOGLEVEL}%{SPACE}%{IP}%{SPACE}%{CISCOTIMESTAMP}%%{SYSLOGPROG}%{GREEDYDATA:message}"
Line 10, column 144 is right after the closing quotation mark (").
Some sample data from the .txt file (edited to remove sensitive information):
2017-09-18 00:00:01 Local4.Debug IP Sep 18 2017 00:00:01: %ASA-0-0000: UDP request discarded from IP to COVERT:IP
I ran the grok pattern through grokconstructor.appspot.com and it matches the sample line correctly.
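The one thing I did notice is that the examples in the grok filter documentation put a field name in front of the pattern, i.e. match is a hash of field => pattern, roughly like the sketch below (I used the default message field and renamed the last capture to log_message just so it would not collide with the source field name), but I'm not sure if that is what the parser is complaining about:

filter {
  grok {
    # form shown in the grok docs: match => { "field" => "pattern" }
    # last capture renamed to log_message so it does not clash with the source "message" field
    match => { "message" => "%{TIMESTAMP_ISO8601}%{SPACE}Local4.%{LOGLEVEL}%{SPACE}%{IP}%{SPACE}%{CISCOTIMESTAMP}%%{SYSLOGPROG} %{GREEDYDATA:log_message}" }
  }
}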
Thanks in advance!