Parsing Jenkins XML using Logstash

Hey there,

I'm trying to parse jenkins build.xml files via filebeat/logstash.
The documents get transferred from Filebeat fine, and I can see them in Kibana with the full content, but even a simple XPath extraction:

    filter {
        xml {
            source => "message"
            store_xml => true
            target => "log"
            xpath => [
                "/log/flow-build/queueId/text()", "queueId"
            ]
        }
    }

is failing.
The error I'm getting from logstash is:

    [2018-05-30T11:19:08,124][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"filebeat-2018.05.30", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x71178c99>], :response=>{"index"=>{"_index"=>"filebeat-2018.05.30", "_type"=>"doc", "_id"=>"TVXFsGMBFEFgAXJp37H0", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"Limit of mapping depth [20] in index [filebeat-2018.05.30] has been exceeded due to object field []"}}}}

Am I reading it correctly that the XML is too complex to be parsed? Is there anything I can do to make it parsable by Logstash? Or should I switch to regexp/grok?


It's not the parsing that fails, it's Elasticsearch that doesn't like the resulting nesting depth. Do you really need to store the full JSON representation of the XML?
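
For context, the "Limit of mapping depth [20]" in the error corresponds to Elasticsearch's `index.mapping.depth.limit` setting, whose default is 20. It can be raised per index, though in this case that would just treat the symptom rather than avoid storing the deeply nested object. A sketch (the limit value 30 here is arbitrary):

    PUT filebeat-2018.05.30/_settings
    {
      "index.mapping.depth.limit": 30
    }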

No I don't. I only want to store a set of interesting fields. How do I disable storing the full JSON representation?

By setting store_xml to false, isn't it?

Yes, disable store_xml (then you can also remove the target setting).
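
For reference, a minimal sketch of the resulting filter, with store_xml disabled and only the interesting fields extracted. The field name queueId and the /log/flow-build/... path are taken from the snippet above; adjust them to your actual build.xml structure:

    filter {
        xml {
            source    => "message"
            # don't store the full JSON representation of the XML
            store_xml => false
            # with store_xml => false, no target is needed
            xpath     => [
                "/log/flow-build/queueId/text()", "queueId"
            ]
        }
    }

Note that fields extracted via xpath are stored as arrays of strings (XPath can match multiple nodes), so you may want a mutate filter afterwards if you need a scalar value.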


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.