Issue with double quotes being escaped

I'm having an issue with my Filebeat 1.2.3 install.
I am trying to read Nginx logs and output them to Redis.
I had this system working (at least in a testing capacity) for a week, and recently I noticed that I am getting Logstash _grokparsefailures because my logs now have \" surrounding any fields that have " in them.

So for an example log line: - - [16/Jun/2016:18:16:40 +0000] "GET /url/a-b HTTP/1.1" 200 0 "-" "Funnelback RPT-HTTPClient/0.3-3E"

is being turned into this:
"{\"@timestamp\":\"2016-06-16T18:16:43.758Z\",\"beat\":{\"hostname\":\"\",\"name\":\"\"},\"count\":1,\"fields\":null,\"input_type\":\"log\",\"message\":\" - - [16/Jun/2016:18:16:40 +0000] \\\"GET /url/a-b HTTP/1.1\\\" 200 0 \\\"-\\\" \\\"Funnelback RPT-HTTPClient/0.3-3E\\\"\",\"offset\":843990,\"source\":\"/var/log/nginx/access.log\",\"type\":\"log\"}"
by the time it gets to Redis. (That is the output when I LRANGE the key it is being pushed to.)

And thus Logstash is parsing it into:
{"@timestamp":"2016-06-13T23:38:35.448Z","beat":{"hostname":"","name":""},"count":1,"fields":null,"input_type":"log","message":" - - [13/Jun/2016:23:38:35 +0000] \"GET /url/a-b HTTP/1.1\" 301 178 \"-\" \"Mozilla/5.0\"","offset":322672,"source":"/var/log/nginx/access.log","type":"log","@version":"1","tags":["_grokparsefailure"]}

Why is it adding the additional \" around the URL and user agent? It wasn't doing this a week ago when I set it up; my grok rules were working then.

Here are my grok rules:

%{IPORHOST:c_ip} %{NGUSER:cs_username} %{NGUSER:cs_auth} \[%{HTTPDATE:timestamp}\] "%{WORD:cs_method} %{URIPATHPARAM:request} HTTP/%{NUMBER:cs_version}" %{NUMBER:sc_status} (?:%{NUMBER:sc_bytes}|-) (?:"(?:%{URI:cs_referer}|-)"|%{QS:cs_referrer}) %{QS:cs_user_agent}
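For context, those rules would sit in a Logstash grok filter along these lines (a sketch only; it assumes NGUSER is a custom pattern defined in your patterns directory, and uses a single-quoted string purely to avoid escaping the double quotes inside the pattern):

```
filter {
  grok {
    match => { "message" => '%{IPORHOST:c_ip} %{NGUSER:cs_username} %{NGUSER:cs_auth} \[%{HTTPDATE:timestamp}\] "%{WORD:cs_method} %{URIPATHPARAM:request} HTTP/%{NUMBER:cs_version}" %{NUMBER:sc_status} (?:%{NUMBER:sc_bytes}|-) (?:"(?:%{URI:cs_referer}|-)"|%{QS:cs_referrer}) %{QS:cs_user_agent}' }
  }
}
```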

Any help would be appreciated in getting this solved.

Is your setup fb -> redis -> ls -> redis -> ls, or fb -> ls -> redis -> ls? If it is the first one, I strongly recommend updating to 5.0.0-alpha3, as the redis output was completely rewritten.

Filebeat sends the event JSON-encoded to Redis. That is, it has to escape some characters in the input string. All keys being escaped in Logstash is an indicator that the message is not being decoded in Logstash. Have you configured the json codec in Logstash?
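For reference, a minimal redis input with an explicit json codec would look roughly like this (a sketch; the host, key, and data_type values are placeholders for whatever your setup actually uses):

```
input {
  redis {
    host      => "localhost"
    data_type => "list"
    key       => "filebeat"   # placeholder: the key filebeat publishes to
    codec     => json         # decodes the JSON event filebeat pushed
  }
}
```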

My setup is fb -> redis -> ls -> es currently. I wasn't sure about using the alpha version while testing since I didn't want to be fighting bugs while also just getting my ELK stack off the ground.

I'll test out the 5.0.0 alpha.

According to the docs, the default codec is json, so it should be using that since I didn't specify another.

Also, doesn't the \\\"GET /url/a-b HTTP/1.1\\\" part of my Redis output indicate that Filebeat is sending the original "GET /url/a-b HTTP/1.1" part of the log with escaped quotes (\"GET ... 1.1\") and then escaping that again?
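That's exactly what double encoding produces. A minimal sketch (in Python, unrelated to any of the tools here) of how encoding a message once versus twice yields that \\\" pattern:

```python
import json

# The raw log line as filebeat reads it: quotes are plain ".
line = '- - [16/Jun/2016:18:16:40 +0000] "GET /url/a-b HTTP/1.1" 200 0 "-" "UA"'

# One JSON encoding of the event: each " inside the message becomes \".
once = json.dumps({"message": line})

# If something JSON-encodes the already-encoded event a second time,
# each \" becomes \\\" -- the pattern seen in the redis output above.
twice = json.dumps(once)

print(once)   # ...\"GET /url/a-b HTTP/1.1\"...
print(twice)  # ...\\\"GET /url/a-b HTTP/1.1\\\"...
```

Decoding has to happen as many times as the data was encoded, which is why a single json codec pass still leaves escaped quotes behind.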

The '\' is an indicator of data being escaped twice. I guess the print in Logstash is escaping the data again (no idea). Have you checked that the Redis content is proper JSON?
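One quick way to check (a sketch; `decode_depth` is a hypothetical helper, and you'd paste in a value pulled with redis-cli LRANGE) is to count how many json.loads passes the stored string survives:

```python
import json

def decode_depth(raw, limit=5):
    """Count how many times a string can be JSON-decoded before it
    stops being a string (1 = properly encoded once, 2 = double-encoded)."""
    depth = 0
    value = raw
    while isinstance(value, str) and depth < limit:
        try:
            value = json.loads(value)
        except ValueError:
            break
        depth += 1
    return depth

# A properly encoded event decodes exactly once:
assert decode_depth('{"message": "hello"}') == 1
# A double-encoded event decodes twice:
assert decode_depth(json.dumps('{"message": "hello"}')) == 2
```

A depth of 1 would point at Logstash double-printing, while a depth of 2 would mean the event was already double-encoded before it reached Redis.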

This topic was automatically closed after 21 days. New replies are no longer allowed.