Hi,
I have to route alerts from remote sites back to their syslog servers.
For that, I parse the alert event and, based on its value, add a metadata field:
```
if [_source][site] == "new york" {
  mutate { add_field => { "[@metadata][remotesite]" => "ny.remote.site" } }
}
```
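The real filter is just a chain of conditionals like the one above, one per site, roughly like this (the second site and its hostname here are made-up placeholders, not my real config):

```
filter {
  if [_source][site] == "new york" {
    mutate { add_field => { "[@metadata][remotesite]" => "ny.remote.site" } }
  } else if [_source][site] == "london" {
    # hypothetical second site, shown only to illustrate the pattern
    mutate { add_field => { "[@metadata][remotesite]" => "lon.remote.site" } }
  }
}
```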
I have verified with rubydebug that this will generate the metadata field correctly:
```
{
    "@timestamp" => 2017-09-04T07:17:09.152Z,
        "_index" => "company-sonicwall-2017.09.04",
     "@metadata" => {
        "remotesite" => "ny.remote.site",
           "appname" => "SonicwallFailedLogin"
    },
         "_type" => "sonicwall",
       "_source" => { ..... REMOVED ...... },
           "_id" => "sdfjksdhlkrh32we",
        "_score" => 1.0,
          "type" => "alarm"
}
```
I have configured the syslog output like this:
```
syslog {
  appname    => "%{[@metadata][appname]}"
  host       => "%{[@metadata][remotesite]}"
  message    => "Login alert etc."
  port       => 514
  priority   => "<129>"
  protocol   => "tcp"
  rfc        => "rfc5424"
  severity   => "alarm"
  sourcehost => "%{[_source][host]}"
}
```
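If the `host` option really isn't resolved dynamically, I suppose I could fall back to one hardcoded syslog output per site wrapped in conditionals, something like this (untested sketch, and exactly the duplication I was hoping to avoid):

```
output {
  if [@metadata][remotesite] == "ny.remote.site" {
    syslog {
      host     => "ny.remote.site"   # hardcoded instead of a %{...} field reference
      port     => 514
      protocol => "tcp"
      message  => "Login alert etc."
    }
  }
  # ...and so on, one block per remote site
}
```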
I get this error when Logstash tries to send the event to the syslog server:
```
[2017-09-04T10:39:38,017][WARN ][logstash.outputs.syslog ] syslog tcp output exception: closing, reconnecting and resending event {:host=>"%{[@metadata][remotesite]}", :port=>514, :exception=>#<SocketError: initialize: name or service not known>, :backtrace=>[
  "org/jruby/ext/socket/RubyTCPSocket.java:129:in `initialize'",
  "org/jruby/RubyIO.java:871:in `new'",
  "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-syslog-3.0.2/lib/logstash/outputs/syslog.rb:210:in `connect'",
  "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-syslog-3.0.2/lib/logstash/outputs/syslog.rb:178:in `publish'",
  "org/jruby/RubyProc.java:281:in `call'",
  "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-plain-3.0.3/lib/logstash/codecs/plain.rb:41:in `encode'",
  "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-syslog-3.0.2/lib/logstash/outputs/syslog.rb:147:in `receive'",
  "/usr/share/logstash/logstash-core/lib/logstash/outputs/base.rb:92:in `multi_receive'",
  "org/jruby/RubyArray.java:1613:in `each'",
  "/usr/share/logstash/logstash-core/lib/logstash/outputs/base.rb:92:in `multi_receive'",
  "/usr/share/logstash/logstash-core/lib/logstash/output_delegator_strategies/legacy.rb:22:in `multi_receive'",
  "/usr/share/logstash/logstash-core/lib/logstash/output_delegator.rb:47:in `multi_receive'",
  "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:420:in `output_batch'",
  "org/jruby/RubyHash.java:1342:in `each'",
  "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:419:in `output_batch'",
  "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:365:in `worker_loop'",
  "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:330:in `start_workers'"
], :event=>2017-09-04T07:39:27.906Z %{host} %{message}}
```

Note that the literal `%{[@metadata][remotesite]}` string in `:host` (and `%{host} %{message}` in the event) suggests the field references were never substituted.
So it seems that the `host` option doesn't support dynamic field-value (sprintf) replacement?
I have verified that the sending host can reach the remote syslog server and that the hostname resolves correctly, so this is not a network or DNS issue.
Any ideas?