I'm trying to collect IIS logs with Filebeat. Here is my filebeat.yml file.
How do I configure logstash.yml, and do I need to create a new index in Kibana? I'm new to the ELK stack, so any help is appreciated.
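Something along these lines, assuming Filebeat 6.3+; the IIS log path and the Logstash host are placeholders rather than my exact values:

filebeat.inputs:
  - type: log
    paths:
      - C:\inetpub\logs\LogFiles\W3SVC*\*.log    # placeholder for the IIS W3C log location

output.logstash:
  hosts: ["localhost:5044"]    # placeholder; must match the port of the beats input in logstash.conf below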
Magnus, I've created the logstash.conf file below and I'm getting this error when trying to run it:
PS D:\Elastic\Logstash\bin> .\logstash.bat -f logstash.conf
Sending Logstash's logs to D:/Elastic/Logstash/logs which is now configured via log4j2.properties
[2018-06-18T14:51:01,691][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"D:/Elastic/Logstash/modules/fb_apache/configuration"}
[2018-06-18T14:51:01,722][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"D:/Elastic/Logstash/modules/netflow/configuration"}
[2018-06-18T14:51:01,800][FATAL][logstash.runner ] An unexpected error occurred! {:error=>#<ArgumentError: Setting "" hasn't been registered>, :backtrace=>["D:/Elastic/Logstash/logstash-core/lib/logstash/settings.rb:37:in `get_setting'", "D:/Elastic/Logstash/logstash-core/lib/logstash/settings.rb:70:in `set_value'", "D:/Elastic/Logstash/logstash-core/lib/logstash/settings.rb:89:in `block in merge'", "org/jruby/RubyHash.java:1343:in `each'", "D:/Elastic/Logstash/logstash-core/lib/logstash/settings.rb:89:in `merge'", "D:/Elastic/Logstash/logstash-core/lib/logstash/settings.rb:138:in `validate_all'", "D:/Elastic/Logstash/logstash-core/lib/logstash/runner.rb:264:in `execute'", "D:/Elastic/Logstash/vendor/bundle/jruby/2.3.0/gems/clamp-0.6.5/lib/clamp/command.rb:67:in `run'", "D:/Elastic/Logstash/logstash-core/lib/logstash/runner.rb:219:in `run'", "D:/Elastic/Logstash/vendor/bundle/jruby/2.3.0/gems/clamp-0.6.5/lib/clamp/command.rb:132:in `run'", "D:\Elastic\Logstash\lib\bootstrap\environment.rb:67:in `<main>'"]}
[2018-06-18T14:51:01,816][ERROR][org.logstash.Logstash ] java.lang.IllegalStateException:org.jruby.exceptions.RaiseException: (SystemExit) exit
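From what I can tell, the backtrace goes through settings.rb and validate_all, which handle the settings file (logstash.yml) rather than the pipeline file, so the Setting "" hasn't been registered error may be coming from a malformed logstash.yml (pipeline configuration should not be pasted into it). A minimal logstash.yml sketch, with paths assumed from the install location shown in the log output above:

path.data: D:/Elastic/Logstash/data
path.logs: D:/Elastic/Logstash/logs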
LOGSTASH.CONF
# We have IIS configured to use a single log file for all sites,
# because Logstash can't handle parsing files in different
# directories if they have the same name.
input {
beats {
port => 5044
type => 'iis'
}
}
filter {
# Ignore the comments that IIS will add to the start of the W3C logs
if [message] =~ "^#" {
drop {}
}
grok {
## Very helpful site for building these statements:
# http://grokdebug.herokuapp.com/
#
# This is configured to parse out every field of IIS's W3C format when
# every field is included in the logs
#
match => ["message", "%{TIMESTAMP_ISO8601:log_timestamp} %{WORD:serviceName} %{WORD:serverName} %{IP:serverIP} %{WORD:method} %{URIPATH:uriStem} %{NOTSPACE:uriQuery} %{NUMBER:port} %{NOTSPACE:username} %{IPORHOST:clientIP} %{NOTSPACE:protocolVersion} %{NOTSPACE:userAgent} %{NOTSPACE:cookie} %{NOTSPACE:referer} %{NOTSPACE:requestHost} %{NUMBER:response} %{NUMBER:subresponse} %{NUMBER:win32response} %{NUMBER:bytesSent} %{NUMBER:bytesReceived} %{NUMBER:timetaken}"]
}
date {
# the grok pattern above captures the time into log_timestamp, not timestamp
match => [ "log_timestamp", "yyyy-MM-dd HH:mm:ss" ]
locale => "en"
}
}
# Second filter
filter {
if "_grokparsefailure" in [tags] {
} else {
# on success remove the message field (and the raw timestamp) to save space
mutate {
remove_field => ["message", "log_timestamp"]
}
}
}
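The pipeline above has no output section yet; a sketch of an elasticsearch output that writes to a daily IIS index (the host and index name are just assumptions) could look like this, after which Kibana only needs a matching index pattern such as iis-*:

output {
  elasticsearch {
    hosts => ["localhost:9200"]      # assumed Elasticsearch host
    index => "iis-%{+YYYY.MM.dd}"    # assumed index name; Elasticsearch creates the index on first write
  }
}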
Also, I noticed that the error
[ERROR][org.logstash.Logstash ] java.lang.IllegalStateException: org.jruby.exceptions.RaiseException: (SystemExit) exit
is supposedly caused by missing write permissions on the data folder, so I gave Everyone write permission, but it still produces the same error. Could it be an issue with Java? The server is running Java 8 Update 171.
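One thing I might try to narrow this down is the config test flag, which parses the settings and the pipeline file and then exits without starting the pipeline:

PS D:\Elastic\Logstash\bin> .\logstash.bat -f logstash.conf --config.test_and_exit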
I've tried changing logstash.conf as follows, but the error persists.