I'm having a similar problem. I'm sending Windows event logs and IIS logs to logstash using nxlog. I have successfully added tags to the event logs to help identify the environments and host groups using this technique:
In the nxlog.conf output section I've added a couple of Exec directives to define the environment and host group:
Module om_tcp
Host uscomp2947
Port 3515
Exec $EventReceivedTime = integer($EventReceivedTime) / 1000000; \
to_json();
Exec $Environment = 'dev'; \
to_json();
Exec $Hostgroup = 'DVS'; \
to_json();
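The net effect of those Exec lines is that the event gets re-serialized with the extra fields before it goes out over TCP. A rough Python sketch of the JSON record nxlog's to_json() ends up shipping (field names are from the config above; the sample values are hypothetical):

```python
import json

# Hypothetical event state after the first Exec has converted the timestamp
event = {
    "EventReceivedTime": 1388534400,  # already divided down to UNIX seconds
    "Hostname": "USCOMP2947",
    "Message": "Service started",
}

# The later Execs add the custom tags before the final to_json()
event["Environment"] = "dev"
event["Hostgroup"] = "DVS"

# to_json() serializes the whole event; this string is what om_tcp sends
raw_event = json.dumps(event)
print(raw_event)
```

Logstash's JSON codec then sees Environment and Hostgroup as ordinary top-level fields, which is why they show up in Kibana with no extra filtering.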
And the logstash.conf I'm using is pretty standard for event logs. I haven't added any extra filters to cope with the Environment and Hostgroup fields; it just works, and the tags are visible in Kibana:
if [type] == "eventlog" {
# Incoming Windows Event logs from nxlog
mutate {
# Lowercase some values that are always in uppercase
lowercase => [ "EventType", "FileName", "Hostname", "Severity" ]
}
mutate {
# Set source to what the message says
rename => [ "Hostname", "@source_host" ]
}
date {
# Convert timestamp from integer in UTC
match => [ "EventReceivedTime", "UNIX" ]
}
mutate {
# Rename some fields into something more useful
rename => [ "Message", "@message" ]
rename => [ "Severity", "eventlog_severity" ]
rename => [ "SeverityValue", "eventlog_severity_code" ]
rename => [ "Channel", "eventlog_channel" ]
rename => [ "SourceName", "eventlog_program" ]
rename => [ "SourceModuleName", "nxlog_input" ]
rename => [ "Category", "eventlog_category" ]
rename => [ "EventID", "eventlog_id" ]
rename => [ "RecordNumber", "eventlog_record_number" ]
rename => [ "ProcessID", "eventlog_pid" ]
}
mutate {
# Remove redundant fields
remove => [ "SourceModuleType", "EventTimeWritten", "EventTime", "EventReceivedTime", "EventType" ]
}
}
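The date filter works here because the nxlog side already divided the microsecond timestamp down to UNIX seconds. A quick Python check of that conversion (the raw microsecond value below is made up):

```python
from datetime import datetime, timezone

# nxlog's $EventReceivedTime is in microseconds; the Exec divides by 1,000,000
received_usec = 1388534400000000  # hypothetical raw value from nxlog
unix_seconds = received_usec // 1_000_000

# Logstash's date filter with match => ["EventReceivedTime", "UNIX"]
# interprets this value as seconds since the epoch, in UTC
ts = datetime.fromtimestamp(unix_seconds, tz=timezone.utc)
print(ts.isoformat())  # 2014-01-01T00:00:00+00:00
```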
I've tried doing the same with the IIS logs, but it doesn't work: whenever I add the additional tags, the grok filter in logstash fails.
Here's the nxlog.conf...
Module im_file
File "D:\inetpub\logs\LogFiles\W3SVC1\u_ex*.log"
SavePos TRUE
Exec if $raw_event =~ /^#/ drop(); \
else \
{ \
w3c->parse_csv(); \
$EventTime = parsedate($date + " " + $time); \
$SourceName = "iis_inst1"; \
$Message = to_json(); \
}
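Note what that input section produces: the Exec stores the whole parsed event, serialized as JSON, into $Message. A small Python sketch of the effect (the W3C field names and values here are hypothetical):

```python
import json

# After w3c->parse_csv(), the event holds the parsed W3C fields
event = {
    "date": "2014-01-01",
    "time": "00:00:01",
    "cs-method": "GET",
    "cs-uri-stem": "/index.html",
}
event["SourceName"] = "iis_inst1"

# $Message = to_json(); stores a JSON string *inside* the event itself
event["Message"] = json.dumps(dict(event))

print(event["Message"])  # the message field is now JSON, not the raw log line
```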
And here's the logstash.conf...
filter {
if [type] == "iis" {
grok {
match => ["message", "%{TIMESTAMP_ISO8601:timestamp} %{IPORHOST:hostip} %{WORD:method} %{URIPATH:page} %{NOTSPACE:query} %{NUMBER:port} %{NOTSPACE:username} %{IPORHOST:clientip} %{NOTSPACE:useragent} %{NOTSPACE:referrer} %{NUMBER:response} %{NUMBER:subresponse} %{NUMBER:scstatus} %{NUMBER:timetaken}"]
match => ["Environment", "%{WORD:environment}"]
match => ["Hostgroup", "%{WORD:hostgroup}"]
}
}
}
...followed by the matching output section in nxlog.conf for the IIS logs, with the same Exec tags:

Module om_tcp
Host uscomp2947
Port 3516
Exec $Environment = 'dev'; \
to_json();
Exec $Hostgroup = 'DVS'; \
to_json();
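As a sanity check on the field order that grok pattern assumes, here is a rough split-based Python equivalent: it expects a plain space-separated W3C line in the message field, with the ISO8601 timestamp taking the first two tokens. (The sample line is invented; real grok matching uses regular expressions, not a plain split.)

```python
# Hypothetical W3C extended log line in the field order the grok pattern expects
sample = ("2014-01-01 00:00:01 192.168.1.1 GET /index.html - 80 - "
          "10.0.0.1 Mozilla/5.0 - 200 0 0 15")

tokens = sample.split(" ")
# TIMESTAMP_ISO8601 covers "date time", i.e. the first two tokens
parsed = {"timestamp": tokens[0] + " " + tokens[1]}
names = ["hostip", "method", "page", "query", "port", "username",
         "clientip", "useragent", "referrer", "response", "subresponse",
         "scstatus", "timetaken"]
parsed.update(zip(names, tokens[2:]))
print(parsed["method"], parsed["response"])
```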
I'm obviously doing something wrong here. Any advice would be much appreciated.