Nested JSON logs - how to filter correctly

Hi,

I'm trying to input a complex JSON log with nested fields. Here is my .conf file:

input { file { path => "/home/logstash-user/json-try.json" start_position => "beginning" codec => "json" } } output { elasticsearch { codec => "json" } stdout { codec => "rubydebug" } }

I tried to input two files: a simple one with one nested object, and a more complex one with about three nested objects.

The first file is filtered correctly, but the second one isn't filtered at all, and the following error appears:
"Attempted to send a bulk request to Elasticsearch .... but an error occurred and it failed!" and so on...

What could be the problem?

The elasticsearch output should not have a JSON codec, as this is handled internally. Remove it and try again.
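Something along these lines should do it; the path and the remaining settings are just taken from your config above:

input {
  file {
    path => "/home/logstash-user/json-try.json"
    start_position => "beginning"
    codec => "json"
  }
}
output {
  elasticsearch { }
  stdout { codec => "rubydebug" }
}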

Hi, thank you for your reply. I removed the JSON codec from the elasticsearch output, but nothing changed.

Here is the input that worked:
{"one":{"id":1, "name":"john", "job":"programmer"}, "two":{"id":2, "name":"david", "job":"programmer"}}

Here is the complex input that didn't work:
{"project":{"metadata":{"project_name":"calc", "date":"01.01.16"}, "details":{"feature 1":{"name":"gui", "description":"design"}, "feature 2":"modulo", "description":"modulo"}}}}

What might be the problem?

The last entry seems to have one curly brace too many at the end and is therefore not valid JSON.
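With that extra trailing brace dropped, it should presumably look like this:

{"project":{"metadata":{"project_name":"calc", "date":"01.01.16"}, "details":{"feature 1":{"name":"gui", "description":"design"}, "feature 2":"modulo", "description":"modulo"}}}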

Thanks, I fixed it. But now it doesn't parse the JSON; it just puts the whole input into the "message" field instead of breaking it up into the JSON objects.

Do you still have the JSON codec on the file input? Can you share your current config?

input { file { path => "/home/logstash-user/json-try.json" start_position => "beginning" codec => "json" } } output { elasticsearch { } stdout { codec => "rubydebug" } }

What does the ruby debug output look like for the event?

Logstash startup completed
{
"message" => "{"project":{"metadata":{"project_name":"calc", "date":"01.01.16"}, "details":{"feature 1":{"name":"gui", "description":"design"}, "feature 2":"modulo", "description":"modulo"}}}",
"tags" => [
[0] "_jsonparsefailure"
],
"@version" => "1",
"timestamp" => "2016-03-24T08:32:50.667Z",
"host" => "elk"
"path" => "/home/logstash-user/json-try.json"
}

That cannot be parsed by the codec, as it is not valid JSON (indicated by the _jsonparsefailure tag). If you cannot correct the generation at the source, you may need to read it with a plain codec and then transform it into valid JSON before applying a json filter instead.
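A rough sketch of that approach might look like the following; the gsub pattern here is just a hypothetical cleanup for the extra trailing brace in your earlier sample and would need to match whatever is actually wrong in the real data:

input {
  file {
    path => "/home/logstash-user/json-try.json"
    start_position => "beginning"
    codec => "plain"
  }
}
filter {
  # hypothetical cleanup: collapse a doubled closing brace at the end of the line
  mutate {
    gsub => ["message", "}}$", "}"]
  }
  # parse the cleaned-up string into structured fields
  json {
    source => "message"
  }
}
output {
  elasticsearch { }
  stdout { codec => "rubydebug" }
}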

I put the JSON into a JSON validator and made a minor fix to it.
Now I get a huge error in red and yellow, the same one I wrote about in the beginning. It looks something like this:
"Attempted to send bulk request to Elasticsearch configured at (configuration details...) but an error occurred and it failed!" and so on, it's pretty long...
The same error is printed about 4 times.

Hi Christian,
In my searches around the web I found someone saying they had a problem filtering input with dots (".") in it, and one of my inputs did indeed contain a dot. I removed it and everything worked fine. Do you know the cause of this problem, or how to overcome it?
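One workaround I'm considering is rewriting that value with a mutate filter before it reaches Elasticsearch; a rough sketch, assuming the problematic dot is the one in the [project][metadata][date] value:

filter {
  # replace the dots in the nested date value, e.g. "01.01.16" -> "01-01-16"
  mutate {
    gsub => ["[project][metadata][date]", "[.]", "-"]
  }
}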

Thanks