Hello everybody, since I'm really struggling to get SexiLog to receive a Windows log file, I decided to ask here; hopefully you can show me the correct way.
So, what I have so far: a working SexiLog (ELK) instance which is already collecting ESXi syslogs and Windows event logs. Now I want to monitor our Exchange server in more depth and decided to do this via the message tracking logs. Right now I am using NXLog to ship the log data from our Exchange server to SexiLog.
NXLog configuration:
define ROOT C:\Program Files (x86)\nxlog
Moduledir %ROOT%\modules
CacheDir %ROOT%\data
Pidfile %ROOT%\data\nxlog.pid
SpoolDir %ROOT%\data
LogFile %ROOT%\data\nxlog.log
LogLevel DEBUG
<Extension syslog>
    Module xm_syslog
</Extension>
<Extension json>
    Module xm_json
</Extension>
<Input in_eventlog>
    Module im_msvistalog
    Query <QueryList>\
        <Query Id="0">\
            <Select Path="Application">*</Select>\
            <Select Path="System">*</Select>\
            <Select Path="Security">*</Select>\
        </Query>\
    </QueryList>
</Input>
<Input in_exchange>
    Module im_file
    File "C:\\Program Files\\Microsoft\\Exchange Server\\V15\\TransportRoles\\Logs\\MessageTracking\\MSGTRK*.LOG"
    SavePos True
    Exec if $raw_event =~ /HealthMailbox/ drop();
    Exec if $raw_event =~ /^#/ drop();
</Input>
<Output out_eventlog>
    Module om_udp
    Host SRV-XXX-001.hc.lan
    Port 1515
    Exec to_json();
</Output>
<Output out_exchange>
    Module om_udp
    Host SRV-XXX-001.hc.lan
    Port 5141
    Exec $SyslogFacilityValue = 2;
    Exec $SourceName = 'exchange_msgtrk_log';
    Exec to_syslog_bsd();
</Output>
<Route exchange>
    Path in_exchange => out_exchange
</Route>
<Route eventlog>
    Path in_eventlog => out_eventlog
</Route>
So right now only the event log is showing up in Kibana. The message tracking log file doesn't seem to arrive, and I have no idea what's going wrong here. The NXLog log file doesn't contain any errors or warnings, debug mode shows that the log files match the defined wildcard, and a port check shows that port 5141 is listening, too.
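One more way to narrow things down: inject a hand-crafted BSD-syslog test line straight into the Logstash UDP input and see whether it shows up in Kibana with type "Exchange". If it does, the Logstash side is fine and the problem is on the NXLog side. A minimal Python sketch; the 127.0.0.1 address and the sample message are placeholders for illustration, not taken from my setup:

```python
import socket
from datetime import datetime

def bsd_syslog(facility, severity, tag, message):
    """Build a minimal RFC 3164 (BSD) syslog payload, roughly what
    NXLog's to_syslog_bsd() emits."""
    pri = facility * 8 + severity  # facility 2 (mail), severity 6 (info) -> <22>
    timestamp = datetime.now().strftime("%b %d %H:%M:%S")
    return f"<{pri}>{timestamp} testhost {tag}: {message}".encode()

payload = bsd_syslog(2, 6, "exchange_msgtrk_log",
                     "2015-06-01T12:00:00.000Z,10.0.0.1,test-line")

# Send one datagram to the Logstash UDP input (replace host with your SexiLog server).
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(payload, ("127.0.0.1", 5141))
sock.close()
```

If that test line gets indexed but the real NXLog traffic doesn't, I'd look at what NXLog actually puts on the wire (e.g. with a packet capture on port 5141).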
Here is the Logstash config:
input {
  udp {
    type => "Exchange"
    port => 5141
  }
}
filter {
  if [type] == "Exchange" {
    csv {
      add_tag => [ 'exh_msg_trk' ]
      columns => ['logdate', 'client_ip', 'client_hostname', 'server_ip', 'server_hostname', 'source_context', 'connector_id', 'source', 'event_id', 'internal_message_id', 'message_id', 'network_message_id', 'recipient_address', 'recipient_status', 'total_bytes', 'recipient_count', 'related_recipient_address', 'reference', 'message_subject', 'sender_address', 'return_path', 'message_info', 'directionality', 'tenant_id', 'original_client_ip', 'original_server_ip', 'custom_data']
      remove_field => [ "logdate" ]
    }
    grok {
      match => [ "message", "%{TIMESTAMP_ISO8601:timestamp}" ]
    }
    mutate {
      convert => [ "total_bytes", "integer" ]
      convert => [ "recipient_count", "integer" ]
      split => [ "recipient_address", ";" ]
      split => [ "source_context", ";" ]
      split => [ "custom_data", ";" ]
    }
    date {
      match => [ "timestamp", "ISO8601" ]
      timezone => "Europe/London"
      remove_field => [ "timestamp" ]
    }
    if "_grokparsefailure" in [tags] {
      drop { }
    }
  }
}
output {
  elasticsearch { host => localhost }
  stdout { codec => rubydebug }
}
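For what it's worth, the csv filter relies on the message tracking lines being comma-separated with quoted fields (subjects and SMTP status strings can contain commas). A quick way to sanity-check the column mapping offline is to run a sample line through Python's csv module; the sample line below is made up for illustration, only the column list comes from the filter above:

```python
import csv
import io

# The 27 column names from the Logstash csv filter.
columns = ['logdate', 'client_ip', 'client_hostname', 'server_ip',
           'server_hostname', 'source_context', 'connector_id', 'source',
           'event_id', 'internal_message_id', 'message_id',
           'network_message_id', 'recipient_address', 'recipient_status',
           'total_bytes', 'recipient_count', 'related_recipient_address',
           'reference', 'message_subject', 'sender_address', 'return_path',
           'message_info', 'directionality', 'tenant_id',
           'original_client_ip', 'original_server_ip', 'custom_data']

# A made-up sample line: quoted fields may contain commas, which a real CSV
# parser handles correctly but a naive split(',') would not.
sample = ('2015-06-01T12:00:00.123Z,10.0.0.1,CLIENT01,10.0.0.2,EXCH01,'
          '"08D2;250 2.0.0 OK",EXCH01\\Default,SMTP,RECEIVE,12345,'
          '<id@example.com>,abcd-ef,user1@example.com;user2@example.com,'
          '"250 2.1.5, Recipient OK",2048,2,,,"Subject, with comma",'
          'sender@example.com,sender@example.com,,Incoming,,10.0.0.1,'
          '10.0.0.2,S:Key=Value')

fields = next(csv.reader(io.StringIO(sample)))
print(len(fields))                  # should equal len(columns), i.e. 27
record = dict(zip(columns, fields))
print(record['recipient_address'])  # user1@example.com;user2@example.com
```

If a real MSGTRK line yields a different field count, the columns list and the data are out of sync and the csv filter will tag the event with _csvparsefailure.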
So... if you could have a look at my configuration, that'd be great. I just have no idea why the event logs show up in Kibana but the message tracking log doesn't. I'm really happy about every suggestion; if you want me to provide any log files, just let me know.
Best regards