I'm on Linux, using the Logstash syslog input and the rubydebug output, but I'm getting this error:
Received an event that has a different character encoding than you configured. {:text=>"\\u0000\\u0005 ...", :expected_charset=>"UTF-8", :level=>:warn}
I added:
codec => plain {charset => "ISO-8859-1"}
... and am getting no error, but the "message" is still not legible (\u0016\u0003, etc.). I've tried a couple of other character sets with no better results.
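For context, here's roughly how that codec sits in my config (the port number is just a placeholder):

    input {
      syslog {
        port  => 5514                                  # placeholder port
        codec => plain { charset => "ISO-8859-1" }     # also tried other charsets here
      }
    }
    output {
      stdout { codec => rubydebug }
    }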
Is there any way I can determine on the receiving end what charset is being used? Perhaps in a ruby filter? The log sender says he's using syslog out of the box and isn't aware of any particular charset being used.
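Something like this is what I had in mind (just a sketch, assuming the older event["field"] accessor syntax; the charset list is a guess, and ISO-8859-1 will accept any byte sequence as valid, so this is more of a heuristic than real detection):

    filter {
      ruby {
        code => '
          raw = event["message"]
          unless raw.nil?
            # Try a few likely charsets and record the first one that yields valid text.
            ["UTF-8", "ISO-8859-1", "Windows-1252"].each do |cs|
              candidate = raw.dup.force_encoding(cs)
              if candidate.valid_encoding?
                event["detected_charset"] = cs
                break
              end
            end
          end
        '
      }
    }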
UPDATE:
The sender of the syslog records is now telling me that his text is encrypted (which would explain why no charset made the \u0016\u0003 bytes legible). The Logstash syslog input plugin doesn't seem to support that, so now I'm trying a combination of the tcp input plugin and a grok filter, but I'm not getting much farther. I'm setting ssl_enable => true and pointing ssl_cacert at the same .crt file he's using for "DefaultNetstreamDriverCAFile" in his syslog config. The error I'm now getting is:
Could not initialize SSL context {:exception=>#<TypeError: can't convert nil into String>, :backtrace=>["org/jruby/RubyIO.java:3785:in 'read'", "org/jruby/RubyIO.java:3968:in 'read'", <more but too long to type in>
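For reference, this is roughly what I'm trying now (paths and port are placeholders, and I'm not sure whether the commented-out ssl_cert/ssl_key lines are even needed on my side, which is part of my question):

    input {
      tcp {
        port       => 5514                           # placeholder port
        ssl_enable => true
        ssl_cacert => "/etc/logstash/sender-ca.crt"  # same CA file as the sender's DefaultNetstreamDriverCAFile
        # ssl_cert => "/etc/logstash/logstash.crt"   # do I need these too?
        # ssl_key  => "/etc/logstash/logstash.key"
      }
    }
    filter {
      grok {
        match => { "message" => "%{SYSLOGLINE}" }    # standard syslog grok pattern
      }
    }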
Found a nice article at http://blog.eagerelk.com/how-to-configure-the-tcp-logstash-input/ but I'd still like more information on the SSL config options. For instance, if I set the ssl_cert option to the sender's public .cer file, do I need to set the ssl_key and ssl_key_passphrase options as well? And is the ssl_cacert option supposed to be the truststore file containing the public certificates of everyone I trust?