Let me explain the situation: I'm using two different servers, one for my rsyslog server and another for the ELK stack. I collect logs from Windows servers/computers on my rsyslog server using Event to Syslog, and then I forward the logs to the ELK server as JSON. My problem is that with some Windows logs Logstash doesn't seem to interpret the log correctly, but with other Windows logs it works, and it works with Linux machines too.
In the source of the log I get a "message" before the actual log, which is strange.
It looks like there's a json codec (or filter) missing somewhere, or the JSON text is broken and can't be parsed. Are all messages entering Logstash via the same UDP port 10514 listener? Um, wait. Is this how rsyslog produces the JSON messages?
I don't know rsyslog, but it looks like this won't work for values containing double quotes. The result will be exactly what you're seeing (broken JSON that can't be parsed, resulting in Logstash falling back to a plain codec).
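To make the failure mode concrete, here is a small Python sketch (the log text is made up, not from the thread): building JSON by pasting a raw value between hand-written quotes breaks as soon as the value contains a double quote, while a JSON encoder escapes it correctly.

```python
import json

# Hypothetical Windows event text containing double quotes and a backslash
msg = 'Logon by user "DOMAIN\\admin" succeeded'

# Naive template-style concatenation, like a simple syslog template does:
naive = '{"message": "' + msg + '"}'
try:
    json.loads(naive)
    print("parsed")
except json.JSONDecodeError:
    # The inner quotes terminate the JSON string early, so parsing fails,
    # which is what Logstash reports as _jsonparsefailure.
    print("broken JSON, cannot be parsed")

# A JSON encoder escapes quotes and backslashes, so it round-trips:
proper = json.dumps({"message": msg})
print(json.loads(proper)["message"] == msg)
```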
Sorry to bother you again, but after a lot of searching I can't find the solution to my problem, and I was wondering if you had it.
My other problem is that I get _jsonparsefailure and I don't know why. I'm using this conf file: https://gist.github.com/untergeek/0373ee85a41d03ae1b78. I'm also using this:
mutate {
  gsub => ["message", "[\#]", "/"]
}
to replace \ with /, but I still get _jsonparsefailure.
Again, sorry to bother you, but I can't find the answer to my problem.
I don't understand what your gsub is supposed to do. There's nothing wrong with the backslashes in the message payload. I have no idea how to get rsyslog to send properly escaped JSON. You might want to reconsider your strategy here.
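For what it's worth, if you do stay with rsyslog, its template engine can JSON-escape property values itself rather than pasting them verbatim between hand-written quotes. A minimal sketch of a list template (the template name, field names, and property choices here are illustrative assumptions, not taken from the thread):

```
# Hedged sketch of an rsyslog list template. format="json" asks rsyslog
# to escape quotes and backslashes in the property value, so the output
# stays valid JSON even when the message contains double quotes.
template(name="json-out" type="list") {
    constant(value="{\"host\":\"")
    property(name="hostname" format="json")
    constant(value="\",\"message\":\"")
    property(name="msg" format="json")
    constant(value="\"}\n")
}
```

With escaping handled at the source like this, no gsub workaround should be needed on the Logstash side.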