Hi
We have fluentd sending logs from a client server to our main ELK server.
In Kibana, when I read the logs, there is a lot of garbage data that comes along with the message.
On the Logstash server, we are using the fluent plugin for the input.
\x92\xACsys.messages\xDB\u0000\u0000\u0003\u001A\x92\xCEU\xC1\x9FY\x84\xA4host\xAFip\xA5ident\xA9freshclam\xA3pid\xA48721\xA7message\xDA\u00009ClamAV update process started at Wed Aug 5 11:00:01 2015\x92\xCEU\xC1\x9FY\x84\xA4host\xAFip-10-20-12-209\xA5ident\xA9freshclam\xA3pid\xA48721\xA7message\xDA\u0000Nmain.cvd is up to date (version: 55, sigs: 2424225, f-level: 60, builder: neo)\x92\xCEU\xC1\x9FZ\x84\xA4host\xAFip-
The message comes through, but it looks encoded and is buried in a lot of garbage data. Any help will be appreciated.
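
For context, the client-side fluentd setup is roughly like the sketch below. The tag sys.messages and the host/ident/pid/message fields are taken from the dump above; the source path, match pattern, and server host are illustrative placeholders rather than our exact config.

# Client-side fluentd (illustrative sketch; path, pos_file and server host are placeholders)
<source>
  type tail
  path /var/log/messages
  pos_file /var/log/td-agent/messages.pos
  format syslog                 # produces the host/ident/pid/message fields seen in the dump
  tag sys.messages              # tag visible at the start of the garbled record above
</source>

<match sys.**>
  type forward                  # fluentd's forward output sends msgpack over TCP
  <server>
    host elk.example.com        # placeholder for the ELK/Logstash server address
    port 5141                   # the port the Logstash input below listens on
  </server>
</match>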
Our Logstash configuration:
input {
  syslog {
    host => "0.0.0.0"
    port => 5141
  }
}

output {
  stdout { }
  elasticsearch {
  }
}