Logstash 1.5.0 json filter output to elasticsearch fails on nested json

Hi all,

I am attempting to use Logstash 1.5.0 to take in a JSON log file, parse it with the json filter, and output to Elasticsearch via the elasticsearch output with a protocol of 'http'.
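
For reference, the relevant parts of my config look roughly like this (the path, index name, and host below are placeholders, not my real values):

input {
  file {
    # placeholder path to the JSON log file
    path => "/var/log/myapp/events.json"
  }
}

filter {
  json {
    # each log line is a JSON document in the "message" field
    source => "message"
  }
}

output {
  elasticsearch {
    protocol => "http"
    host => "localhost"
    index => "myapp-%{+YYYY.MM.dd}"
  }
}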

One of our two logs works okay, but the other has nested JSON: the log line includes a list of packets that were received as part of the event. Those lines come back with a 400 error in the Logstash log.
The logstash.log shows the text of the event, but the offending sub-objects show up as:

"whateverobj"=>#Java::JavaUtil::ArrayList:0x1c38ba8f,

rather than actual readable text.

If I use the ruby filter to flatten the nested, json-filter-parsed objects into event-level flattened properties, like nestedobj1.prop1, nestedobj1.prop2, nestedobj2.prop1, etc., and then use the mutate filter's remove_field to remove the original objects that the json filter added, the error in the log goes away and the event makes it to the ES index.
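
Roughly what that workaround looks like (nestedobj1 and the property names are placeholders for my real fields):

filter {
  # flatten one level of the parsed object into top-level properties;
  # lists of objects would need index-based keys instead
  ruby {
    code => "
      obj = event['nestedobj1']
      if obj.is_a?(Hash)
        obj.each { |k, v| event['nestedobj1.' + k] = v }
      end
    "
  }
  # drop the original nested object the json filter created
  mutate {
    remove_field => [ "nestedobj1" ]
  }
}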

Is there a known problem with nested JSON in the json filter? It seems to have trouble mostly with objects that contain lists of objects. Is there a best-practice workaround?

Thanks very much!
Justin
