Elastic Stack Mapping Error

Hello, this is my second time posting here, hoping to find a solution this time. I'm working with the Elastic Stack, sending logs from Filebeat through Logstash to Elasticsearch. My logs get structured in Logstash, yet I keep getting a warning that's been bugging me. I have created several pipelines and I am using the distributor pattern: a main pipeline dispatches the logs according to their types into the respective pipelines where they get structured (roughly as in the sketch below). I have an index and a mapping for each log type, and those mappings explicitly define the type of each field, so there is no way a field is defined as two different types. In logstash-plain.log I get the WARN logs shown after the sketch:
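A simplified sketch of the main (distributor) pipeline; the exact conditionals are assumptions, and the pipeline names are taken from the tags visible in the logs below:

    # main pipeline -- dispatches events to per-type pipelines
    # (simplified sketch; the real config has one branch per log type)
    output {
        if "cppdevices" in [tags] {
            pipeline { send_to => ["cppdevices"] }
        } else if "cooxdaemon" in [tags] {
            pipeline { send_to => ["cooxdaemon"] }
        }
    }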

[2021-09-22T12:24:14,229][WARN ][logstash.outputs.elasticsearch][cppdevices][ddfca0e0830cc67ad422a07327a2b66558e87a53962e37ee60acb52071b7c8d0] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>"1750442601", :_index=>"cppdevices", :routing=>nil}, {"event"=>{}, "device"=>{"name"=>"i06-m-cx/dt/xpad_s540", "domain"=>"i06-m-cx", "family"=>"dt", "member"=>"xpad_s540"}, "agent"=>{"name"=>"localhost.localdomain", "ephemeral_id"=>"79dc4716-1561-4593-97a7-a925fa474785", "version"=>"7.14.0", "type"=>"filebeat", "hostname"=>"localhost.localdomain", "id"=>"5f61175d-e407-49d3-b9a3-3efa77d25554"}, "@version"=>"1", "input"=>{"type"=>"log"}, "thread"=>"2864708464", "host"=>{"name"=>"localhost.localdomain"}, "tags"=>["cppdevices", "beats_input_codec_plain_applied"], "level"=>"INFO", "ndc"=>{}, "@timestamp"=>2021-03-26T08:16:23.620Z, "log"=>{"file"=>{"path"=>"/vagrant/assembly/ds_RecordingManager/cppdevices/cpplogs-20210326-14h46/i06-m-cx_dt_xpad_s540.log"}, "offset"=>16731, "flags"=>["multiline"]}, "ecs"=>{"version"=>"1.10.0"}, "message"=>"-------------------------------------------------------"}], :response=>{"index"=>{"_index"=>"cppdevices", "_type"=>"_doc", "_id"=>"1750442601", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [ndc] of type [keyword] in document with id '1750442601'. Preview of field's value: '{}'", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:498"}}}}}
[2021-09-22T12:24:14,231][WARN ][logstash.outputs.elasticsearch][cppdevices][ddfca0e0830cc67ad422a07327a2b66558e87a53962e37ee60acb52071b7c8d0] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>"2908326686", :_index=>"cppdevices", :routing=>nil}, {"event"=>{}, "device"=>{"name"=>"i06-m-cx/dt/xpad_s540", "domain"=>"i06-m-cx", "family"=>"dt", "member"=>"xpad_s540"}, "agent"=>{"name"=>"localhost.localdomain", "ephemeral_id"=>"79dc4716-1561-4593-97a7-a925fa474785", "version"=>"7.14.0", "type"=>"filebeat", "hostname"=>"localhost.localdomain", "id"=>"5f61175d-e407-49d3-b9a3-3efa77d25554"}, "@version"=>"1", "input"=>{"type"=>"log"}, "thread"=>"2902440816", "host"=>{"name"=>"localhost.localdomain"}, "tags"=>["cppdevices", "beats_input_codec_plain_applied"], "level"=>"INFO", "ndc"=>{}, "@timestamp"=>2021-03-26T08:16:23.620Z, "log"=>{"file"=>{"path"=>"/vagrant/assembly/ds_RecordingManager/cppdevices/cpplogs-20210326-14h46/i06-m-cx_dt_xpad_s540.log"}, "offset"=>16991, "flags"=>["multiline"]}, "ecs"=>{"version"=>"1.10.0"}, "message"=>"-> yat::DEVICE_SNAP_MSG"}], :response=>{"index"=>{"_index"=>"cppdevices", "_type"=>"_doc", "_id"=>"2908326686", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [ndc] of type [keyword] in document with id '2908326686'. Preview of field's value: '{}'", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:498"}}}}}
[2021-09-22T12:24:14,233][WARN ][logstash.outputs.elasticsearch][cppdevices][ddfca0e0830cc67ad422a07327a2b66558e87a53962e37ee60acb52071b7c8d0] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>"4080914791", :_index=>"cppdevices", :routing=>nil}, {"event"=>{}, "device"=>{"name"=>"flyscan/core/server.1", "domain"=>"flyscan", "family"=>"core", "member"=>"server.1"}, "agent"=>{"name"=>"localhost.localdomain", "ephemeral_id"=>"79dc4716-1561-4593-97a7-a925fa474785", "version"=>"7.14.0", "type"=>"filebeat", "hostname"=>"localhost.localdomain", "id"=>"5f61175d-e407-49d3-b9a3-3efa77d25554"}, "@version"=>"1", "input"=>{"type"=>"log"}, "thread"=>"2956016496", "host"=>{"name"=>"localhost.localdomain"}, "tags"=>["cppdevices", "beats_input_codec_plain_applied"], "level"=>"INFO", "ndc"=>{}, "@timestamp"=>2021-01-26T14:24:10.376Z, "log"=>{"file"=>{"path"=>"/vagrant/assembly/ds_RecordingManager/cppdevices/cpplogs-20210327-06h46/flyscan_core_server.1.log.1"}, "offset"=>8495, "flags"=>["multiline"]}, "ecs"=>{"version"=>"1.10.0"}, "message"=>"Check monitor tango_recorder_monitor(DeviceStateMonitor)"}], :response=>{"index"=>{"_index"=>"cppdevices", "_type"=>"_doc", "_id"=>"4080914791", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [ndc] of type [keyword] in document with id '4080914791'. Preview of field's value: '{}'", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:498"}}}}}
[2021-09-22T12:24:14,235][WARN ][logstash.outputs.elasticsearch][cppdevices][ddfca0e0830cc67ad422a07327a2b66558e87a53962e37ee60acb52071b7c8d0] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>"4247124957", :_index=>"cppdevices", :routing=>nil}, {"event"=>{}, "device"=>{"name"=>"i06-m-c00/ex/scan.1", "domain"=>"i06-m-c00", "family"=>"ex", "member"=>"scan.1"}, "agent"=>{"name"=>"localhost.localdomain", "ephemeral_id"=>"79dc4716-1561-4593-97a7-a925fa474785", "version"=>"7.14.0", "type"=>"filebeat", "hostname"=>"localhost.localdomain", "id"=>"5f61175d-e407-49d3-b9a3-3efa77d25554"}, "@version"=>"1", "input"=>{"type"=>"log"}, "thread"=>"2959076208", "host"=>{"name"=>"localhost.localdomain"}, "tags"=>["cppdevices", "beats_input_codec_plain_applied"], "level"=>"INFO", "ndc"=>{}, "@timestamp"=>2021-03-24T22:49:23.298Z, "log"=>{"file"=>{"path"=>"/vagrant/assembly/ds_RecordingManager/cppdevices/cpplogs-20210326-14h46/i06-m-c00_ex_scan.1.log.1"}, "offset"=>3666, "flags"=>["multiline"]}, "ecs"=>{"version"=>"1.10.0"}, "message"=>"Waiting for timebases to complete integration..."}], :response=>{"index"=>{"_index"=>"cppdevices", "_type"=>"_doc", "_id"=>"4247124957", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [ndc] of type [keyword] in document with id '4247124957'. Preview of field's value: '{}'", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:494"}}}}}
[2021-09-22T12:24:14,237][WARN ][logstash.outputs.elasticsearch][cppdevices][ddfca0e0830cc67ad422a07327a2b66558e87a53962e37ee60acb52071b7c8d0] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>"1869467806", :_index=>"cppdevices", :routing=>nil}, {"event"=>{}, "device"=>{"name"=>"i06-m-c00/ex/scan.1", "domain"=>"i06-m-c00", "family"=>"ex", "member"=>"scan.1"}, "agent"=>{"name"=>"localhost.localdomain", "ephemeral_id"=>"79dc4716-1561-4593-97a7-a925fa474785", "version"=>"7.14.0", "type"=>"filebeat", "hostname"=>"localhost.localdomain", "id"=>"5f61175d-e407-49d3-b9a3-3efa77d25554"}, "@version"=>"1", "input"=>{"type"=>"log"}, "thread"=>"2959076208", "host"=>{"name"=>"localhost.localdomain"}, "tags"=>["cppdevices", "beats_input_codec_plain_applied"], "level"=>"INFO", "ndc"=>{}, "@timestamp"=>2021-03-24T22:49:24.549Z, "log"=>{"file"=>{"path"=>"/vagrant/assembly/ds_RecordingManager/cppdevices/cpplogs-20210326-14h46/i06-m-c00_ex_scan.1.log.1"}, "offset"=>3917, "flags"=>["multiline"]}, "ecs"=>{"version"=>"1.10.0"}, "message"=>"Executing sensors after-integration hooks..."}], :response=>{"index"=>{"_index"=>"cppdevices", "_type"=>"_doc", "_id"=>"1869467806", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [ndc] of type [keyword] in document with id '1869467806'. Preview of field's value: '{}'", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:494"}}}}}
[2021-09-22T12:24:14,239][WARN ][logstash.outputs.elasticsearch][cppdevices][ddfca0e0830cc67ad422a07327a2b66558e87a53962e37ee60acb52071b7c8d0] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>"235356131", :_index=>"cppdevices", :routing=>nil}, {"event"=>{}, "device"=>{"name"=>"flyscan/core/recorder-proxy.1", "domain"=>"flyscan", "family"=>"core", "member"=>"recorder-proxy.1"}, "agent"=>{"name"=>"localhost.localdomain", "ephemeral_id"=>"79dc4716-1561-4593-97a7-a925fa474785", "version"=>"7.14.0", "type"=>"filebeat", "hostname"=>"localhost.localdomain", "id"=>"5f61175d-e407-49d3-b9a3-3efa77d25554"}, "@version"=>"1", "input"=>{"type"=>"log"}, "thread"=>"2952784752", "host"=>{"name"=>"localhost.localdomain"}, "tags"=>["cppdevices", "beats_input_codec_plain_applied"], "level"=>"ERROR", "ndc"=>{}, "@timestamp"=>2021-02-02T14:58:52.367Z, "log"=>{"file"=>{"path"=>"/vagrant/assembly/ds_RecordingManager/cppdevices/cpplogs-20210327-14h46/flyscan_core_recorder-proxy.1.log"}, "offset"=>1784, "flags"=>["multiline"]}, "ecs"=>{"version"=>"1.10.0"}, "message"=>"[Ex:1-Err:1] Rsn: API_AttributeFailed Dsc: Failed to read_attribute on device flyscan/core/tango-recorder.1, attribute state Org: DeviceProxy::read_attribute()"}], :response=>{"index"=>{"_index"=>"cppdevices", "_type"=>"_doc", "_id"=>"235356131", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [ndc] of type [keyword] in document with id '235356131'. Preview of field's value: '{}'", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:515"}}}}}
[2021-09-22T12:24:14,241][WARN ][logstash.outputs.elasticsearch][cppdevices][ddfca0e0830cc67ad422a07327a2b66558e87a53962e37ee60acb52071b7c8d0] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>"964794382", :_index=>"cppdevices", :routing=>nil}, {"event"=>{}, "device"=>{"name"=>"flyscan/core/recorder-proxy.1", "domain"=>"flyscan", "family"=>"core", "member"=>"recorder-proxy.1"}, "agent"=>{"name"=>"localhost.localdomain", "ephemeral_id"=>"79dc4716-1561-4593-97a7-a925fa474785", "version"=>"7.14.0", "type"=>"filebeat", "hostname"=>"localhost.localdomain", "id"=>"5f61175d-e407-49d3-b9a3-3efa77d25554"}, "@version"=>"1", "input"=>{"type"=>"log"}, "thread"=>"2952784752", "host"=>{"name"=>"localhost.localdomain"}, "tags"=>["cppdevices", "beats_input_codec_plain_applied"], "level"=>"ERROR", "ndc"=>{}, "@timestamp"=>2021-02-02T14:58:52.868Z, "log"=>{"file"=>{"path"=>"/vagrant/assembly/ds_RecordingManager/cppdevices/cpplogs-20210327-14h46/flyscan_core_recorder-proxy.1.log"}, "offset"=>2157, "flags"=>["multiline"]}, "ecs"=>{"version"=>"1.10.0"}, "message"=>"[Ex:2-Err:0] Rsn: API_CantConnectToDevice Dsc: Failed to connect to device flyscan/core/tango-recorder.1\nThe connection request was delayed.\nThe last connection request was done less than 1000 ms ago Org: Connection::reconnect"}], :response=>{"index"=>{"_index"=>"cppdevices", "_type"=>"_doc", "_id"=>"964794382", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [ndc] of type [keyword] in document with id '964794382'. Preview of field's value: '{}'", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:515"}}}}}

I have looked everywhere for a solution, but no one seems to give a clear and helpful workaround for this issue. People pointed me to articles that did not really explain how to get rid of it. I would appreciate a solution if you have one, rather than a link to an article, because I have already read plenty without success.

Thanks in advance,
Jamal

You are sending an empty object {} as the value of the ndc field to Elasticsearch, and you are getting a warning because you can't do that with a field mapped as a keyword.

What I would do first is go back to the original source of your logs and see what may be causing this. It appears the ndc field never has a value, and somewhere along the way that is being translated into an empty object {}.
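If you cannot fix it at the source, one workaround is to strip the field in Logstash before the output whenever it does not hold a real string value. A minimal sketch (it assumes the empty object is produced upstream, e.g. by the xml parsing):

    filter {
        ruby {
            # assumption: [ndc] is only useful when it is a non-empty string;
            # remove it when it arrives as nil, an empty string, or an empty object
            code => '
                ndc = event.get("ndc")
                event.remove("ndc") unless ndc.is_a?(String) && !ndc.empty?
            '
        }
    }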

The format of this specific log type is XML, so Logstash structures it with the xml filter plugin; I just rename the fields afterwards, one of which is ndc. Does text accept an empty value? I mean, what if I change the ndc field type from keyword to text?

Logstash configuration for cppdevices:

input {
    pipeline {
        address => cppdevices
    }
}

filter {
    fingerprint {
        source => "message"
        target => "[@metadata][fingerprint]"
        method => "MURMUR3"
    }
    mutate {
        gsub => ["message", "log4j:", ""]
    }
    xml {
        source => "message"
        target => "event"
        force_array => false
    }
    if "_xmlparsefailure" in [tags] {
        drop { }
    }
    date {
        match => [ "[event][timestamp]", "UNIX_MS" ]
        remove_field => [ "[event][timestamp]" ]
        timezone => "Europe/Paris"
    }
    mutate {
        rename => {
            "[event][logger]" => "[device][name]"
            "[event][thread]" => "[thread]"
            "[event][level]" => "[level]"
            "[event][NDC]" => "[ndc]"
            "[event][message]" => "[message]"
        }
    }
    grok {
        match => { "[device][name]" => "(?<[device][domain]>[^_/]+)/(?<[device][family]>[^_/]+)/(?<[device][member]>%{GREEDYDATA})" }
    }
}

output {
    elasticsearch {
        hosts => ["localhost:9200"]
        index => "cppdevices"
        document_id => "%{[@metadata][fingerprint]}"
        manage_template => true
        template => "/vagrant/mapping/cppdevicesmapping.json"
        template_name => "cppdevicesmapping"
    }
}

Mapping:

{
    "index_patterns": ["cppdevices"],
    "mappings": {
        "properties": {
            "@timestamp": {
                "type": "date",
                "format": "strict_date_optional_time"
            },
            "host": {
                "properties": {
                    "name": { "type": "keyword" }
                }
            },
            "event": {
                "properties": {
                    "timestamp": {
                        "type": "date",
                        "format": "epoch_millis"
                    },
                    "logger": { "type": "text" },
                    "thread": { "type": "long" },
                    "level": { "type": "keyword" },
                    "NDC": { "type": "keyword" },
                    "message": { "type": "text" }
                }
            },
            "ndc": { "type": "keyword" },
            "thread": { "type": "long" },
            "level": { "type": "keyword" },
            "message": { "type": "text" },
            "device": {
                "properties": {
                    "domain": { "type": "keyword" },
                    "family": { "type": "keyword" },
                    "member": { "type": "keyword" },
                    "name": { "type": "keyword" }
                }
            }
        }
    }
}
<log4j:event logger="flyscan/clock/pandabox-timebase.1" timestamp="1612437992370" level="INFO" thread="2977954672">
<log4j:message><![CDATA[PandATimebaseManager::starting sequence...]]></log4j:message>
<log4j:NDC><![CDATA[]]></log4j:NDC>
</log4j:event>

The above is a log example; the NDC element only ever contains an empty CDATA section, ![CDATA[]].
By the way, using text instead of keyword does not do the job either.

If ndc should always be equal to ![CDATA[]], then I would just add a conditional checking that field: if it doesn't contain that value, replace it with some dummy text or leave it empty. Or you can even set it to ![CDATA[]] if that is needed.

This doesn't fix the root cause but I am not sure what that is.

Test Conf

input { generator { lines => ['[{"ndc":"![CDATA[]]"},{"ndc":"{}"}]'] count => 1 codec => "json" } }
filter {
  if [ndc] != "![CDATA[]]" {
    mutate {
      update => { "[ndc]" => "![CDATA[]]" }
    }
  }
}
output { stdout { codec => json } }
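Running this, both generated events should come out with ndc set to ![CDATA[]]: the first event already matches, and the second one ({}) is rewritten by the mutate update.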

I removed the NDC field for good, but now I have an issue with another log type:

[2021-09-22T13:54:43,240][WARN ][logstash.outputs.elasticsearch][cooxdaemon][559dd926646f4cc02102471140839f56941007cd8411f039bc4d09f81a2d6395] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"cooxdaemon", :routing=>nil}, {"ecs"=>{"version"=>"1.10.0"}, "cooxdaemon"=>{"timestamp"=>"8/25/17 16:06:24.715", "level"=>"INFO", "message"=>"Server - connection opened : /172.19.1.4:39013 [globalscreen.system.connector.stub.StubConnector@b465b6]"}, "input"=>{"type"=>"log"}, "host"=>{"name"=>"localhost.localdomain"}, "@version"=>"1", "@timestamp"=>2021-09-22T13:54:31.735Z, "agent"=>{"name"=>"localhost.localdomain", "hostname"=>"localhost.localdomain", "type"=>"filebeat", "ephemeral_id"=>"141e2498-12df-44ef-9495-d3af69592340", "version"=>"7.14.0", "id"=>"5f61175d-e407-49d3-b9a3-3efa77d25554"}, "tags"=>["cooxdaemon", "beats_input_codec_plain_applied"], "log"=>{"offset"=>1959, "file"=>{"path"=>"/vagrant/assembly/ds_RecordingManager/coox/mars/CooxDaemon/tcp_io0.log"}}}], :response=>{"index"=>{"_index"=>"cooxdaemon", "_type"=>"_doc", "_id"=>"-FXJDXwBcgaEhuRBg7sa", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [cooxdaemon.timestamp] of type [date] in document with id '-FXJDXwBcgaEhuRBg7sa'. Preview of field's value: '8/25/17 16:06:24.715'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [8/25/17 16:06:24.715] with format [dd/MM/yy HH:mm:ss.SSS||strict_hour_minute_second_fraction]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"Failed to parse with all enclosed parsers"}}}}}}

Thanks for your help.


Your month and day are swapped.

dd/MM/yy HH:mm:ss.SSS
8/25/17 16:06:24.715

Should be

MM/dd/yy HH:mm:ss.SSS
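In the cooxdaemon template that would look something like this (a sketch, assuming the mapping layout implied by the error message):

    "cooxdaemon": {
        "properties": {
            "timestamp": {
                "type": "date",
                "format": "MM/dd/yy HH:mm:ss.SSS||strict_hour_minute_second_fraction"
            }
        }
    }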


You rock man thank you so much. Have a great day. :smiley:

Okayyyy, I came back to end this XD. So the problem was that the NDC field contained ![CDATA[]]. The xml filter plugin recognizes this notation and strips it, keeping only what is inside the brackets when it structures the event. However, in the mapping I had set the NDC field to type text, and that is a problem because a "text" field cannot be {} (nothing/empty). So the easiest solution was to take out the NDC field, especially since it was empty most of the time. Unfortunately, I then had another indexing error saying that message is of type "text" and cannot be {} (nothing/empty), which means some logs have no message. This time it is not that easy: I cannot drop the message field, it is the most important field in a log and I must keep it.
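To illustrate, here is roughly what the xml filter makes of the NDC element (the empty case matches the '{}' preview in the warnings above; the non-empty case is an assumption about the usual XmlSimple behaviour):

    <log4j:NDC><![CDATA[]]></log4j:NDC>       ->  "NDC" => {}       (empty element becomes an empty object)
    <log4j:NDC><![CDATA[text]]></log4j:NDC>   ->  "NDC" => "text"   (text-only element becomes a string)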

Three Solutions:

  • Take out NDC using the mutate filter's "remove_field" option, and use an "if" statement to drop any log that has no real message; in my case the empty messages are runs of spaces or dashes, or bare whitespace:

    mutate {
        remove_field => [ "ndc" ]
    }
    if [message] =~ /^ +$/ or [message] =~ /^-+$/ or [message] =~ /^\s*$/ {
        drop { }
    }

  • Use a mutate filter to add some chosen text after ![CDATA[ so that the CDATA section always contains something, for example Character Data:. This way we always have ![CDATA[Character Data:]], and when the xml filter plugin recognizes and strips the notation we end up with just Character Data: instead of an empty object. This performs the mutation on all the logs (note that gsub's pattern is a regex, so the brackets must be escaped):

    filter {
        mutate {
            gsub => ["message", "!\[CDATA\[", "![CDATA[Character Data:"]
        }
    }

  • We can do the same thing only when the CDATA section is empty, by guarding the gsub with a conditional. For example:

    if "![CDATA[]]" in [message] {
        mutate {
            gsub => ["message", "!\[CDATA\[\]\]", "![CDATA[Character Data:]]"]
        }
    }

Note: The reason I don't remove the ![CDATA[]] notation completely, and always keep it present, is that the xml filter plugin recognizes this notation, and without it you will get a parsing error. It should always be there.

I hope this helps!
