I've got an essentially similar set of logs, and my Logstash configuration looks like yours, and the JSON objects in the log line end up in the "target" field. So I've got target => "json" and I get json.this, json.that, json.the.other, etc. in Elasticsearch (and hence Kibana). In other words, it just works, by magic.
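Roughly, the relevant bit of my filter looks like this (a sketch; "message" is just where the raw JSON happens to live in my setup, yours may differ):

```
filter {
  json {
    source => "message"   # field holding the raw JSON string
    target => "json"      # parsed fields become json.this, json.that, ...
  }
}
```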
But there is a big gotcha. If you have different log lines, coded by different people, which include JSON fields with the same names but different data types, it ain't gonna work. You'll get mapping errors in the logs and missing documents in Elasticsearch.
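For example (hypothetical field), if one line logs {"status": 200} and another logs {"status": "OK"}, whichever document is indexed first fixes the field's mapping, and the other gets rejected with a mapper_parsing_exception.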
Nonetheless, the docs say that if you need the JSON to be accessible at the root of the ES document you don't have to add much more than what I did, but so far nothing shows up at the root on my side... It's really hard to understand what's happening and why, haha
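For reference, here is what I tried based on the docs: if you leave out target, the parsed fields are supposed to land at the root of the event. A minimal sketch, assuming the JSON sits in the message field:

```
filter {
  json {
    source => "message"
    # no target set: parsed fields should appear at the root of the event
  }
}
```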
The problem here is that that pattern strips the closing `"}` from the JSON, so it is no longer valid JSON. You should be getting a `_jsonparsefailure` tag if you are actually running that configuration. Try something like this:
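(A sketch, since I can't see your exact pattern; json_raw is just a placeholder field name and the timestamp prefix is a guess, so adjust both to your log format.)

```
filter {
  grok {
    # Let GREEDYDATA run to the end of the line so the closing brace
    # of the JSON object stays in the capture.
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{GREEDYDATA:json_raw}" }
  }
  json {
    source => "json_raw"
    target => "json"
  }
}
```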
Ah, I'm so sorry about that, the log comes wrapped in quotes.
(I've edited my first post)
It's super annoying, but I have to escape the last quote. And indeed you're right, I get this message a lot.
Here is what the parsed event looks like in Kibana (JSON view):
Just to make sure. If you view the input line in a text editor (not in Kibana) are the double quotes really escaped using a backslash? The fact that the backslashes are escaped in your last post suggests they are. If they are then you would need to unescape them before trying to parse the JSON.
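If so, a mutate/gsub before the json filter would do it. A minimal sketch, using json_raw as a placeholder for whatever field holds the escaped JSON (and assuming Logstash's default string handling, where escape sequences in config strings are not interpreted):

```
filter {
  mutate {
    # '\\"' is a regex matching a literal backslash followed by a quote;
    # replace the pair with a plain quote so the JSON becomes valid.
    gsub => ["json_raw", '\\"', '"']
  }
}
```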