Null value causing errors?

I'm receiving this error:
"reason"=>"failed to parse [entry.AppId.raw]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a END_OBJECT at 1:2285"}

I see this value (screenshot not reproduced; the document shows an object where AppId should be a string):

Here is the mapping for the field in question. It's parsed by the xml filter and should be a string, whether null or not.

                        "AppId": {
                            "norms": {
                                "enabled": false
                            },
                            "type": "string",
                            "fields": {
                                "raw": {
                                    "ignore_above": 256,
                                    "index": "not_analyzed",
                                    "type": "string",
                                    "null_value": "NULL"
                                }
                            }
                        }

Due to this, no logs are making it into the index from Logstash.

I think it's the multi-field that isn't playing well with the object value in your document. You could put a k/v pair in there, and it would still fail. Take it out of the array, still fails. Delete the index and try these scenarios again without the explicitly-mapped multi-field, and they index fine. Repeat with the multi-field mapped but without the null_value declared, and they still fail.

Solution: don't send documents with object values where you've declared string properties.
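To make that concrete (hypothetical field value; your actual document may differ), a body like this will trip the multi-field, because AppId is an object rather than a string:

```json
{ "entry": { "AppId": {} } }
```

whereas a plain string value indexes fine against the mapping above:

```json
{ "entry": { "AppId": "some-app-id" } }
```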

OK, you just went a bit above my experience level here. I do know what you mean, as I have been playing around with this: if it goes into a blank index it works fine, but when I define the mapping it does not play nice.

I defined the mapping because I was seeing an issue where the index got created with a null value, and then the rest (the majority) were not going in because they had a value and weren't null. So I defined the mapping manually, and now I get these errors. Either way I am stuck.

This mapping/template is definitely a thorn in my side and my weakness. Do you think you can be more specific on how I should define an object in the mapping template?

I didn't mean to imply that you should change your mapping.

My suggestion was more that you filter your docs so that empty object values never get sent from Logstash to Elasticsearch in the first place.
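One way to do that filtering, as a rough sketch (untested, and the `[entry][AppId]` field path is an assumption based on your error message): use a ruby filter to replace an empty hash produced by the xml filter with a plain string before the event reaches the elasticsearch output.

```
filter {
  # Hypothetical sketch: if the xml filter left an empty hash in
  # [entry][AppId], replace it with the string "NULL" so it matches
  # the declared string mapping instead of arriving as an object.
  ruby {
    code => "
      v = event['[entry][AppId]']
      event['[entry][AppId]'] = 'NULL' if v.is_a?(Hash) && v.empty?
    "
  }
}
```

The `event['field']` access style here assumes an older (pre-5.x) Logstash; newer versions use `event.get`/`event.set` instead.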

Interesting. I infer that you mean that the index was created by a request to index a document to a non-existing index, so the index was created and the mapping created dynamically on the basis of that first document, and there was a null field value. But I don't think that is actually the case.

When you index a document that has an unmapped field that has the value null, the field will not be dynamically mapped. The next document still has a "clean slate" to trigger a dynamic mapping.

I suspect that what you mean is that you indexed a document like your example above, which has an object in the field. That's not null. That's an object, and it will update the mapping accordingly.
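As a sketch of that difference (hypothetical index and type names), a first document with a null leaves the field unmapped, so dynamic mapping is still open:

```json
PUT /logs-test/entry/1
{ "AppId": null }
```

But a first document with an object value maps AppId as an object, and any later string value will then conflict:

```json
PUT /logs-test/entry/2
{ "AppId": {} }
```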