Hey there,
I'm trying to parse Jenkins build.xml files via Filebeat/Logstash.
The documents get transferred from Filebeat fine, and I can see them in Kibana with the full content. But even a simple xpath filter:
filter {
  xml {
    source => "message"
    store_xml => true
    target => "log"
    xpath => [
      "/log/flow-build/queueId/text()", "queueId"
    ]
  }
}
is failing.
The error I'm getting from logstash is:
[2018-05-30T11:19:08,124][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"filebeat-2018.05.30", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x71178c99>], :response=>{"index"=>{"_index"=>"filebeat-2018.05.30", "_type"=>"doc", "_id"=>"TVXFsGMBFEFgAXJp37H0", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"Limit of mapping depth [20] in index [filebeat-2018.05.30] has been exceeded due to object field [log.actions.org.jenkinsci.plugins.pipeline.modeldefinition.actions.ExecutionModelAction.stagesList.org.jenkinsci.plugins.pipeline.modeldefinition.ast.ModelASTStages.stages.org.jenkinsci.plugins.pipeline.modeldefinition.ast.ModelASTStage.branches.org.jenkinsci.plugins.pipeline.modeldefinition.ast.ModelASTBranch.steps.org.jenkinsci.plugins.pipeline.modeldefinition.ast.ModelASTTreeStep.children.org.jenkinsci.plugins.pipeline.modeldefinition.ast.ModelASTScriptBlock.args.arguments.entry.org]"}}}}
Do I read it correctly that the XML is too complex to be parsed? Is there anything I can do to make it parsable by Logstash? Or should I switch to regexp/grok?
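
One thing I was considering trying (I'm not sure it's the right fix) is setting store_xml => false, so that only the xpath-extracted fields get indexed instead of the whole nested document under "log":

filter {
  xml {
    source => "message"
    store_xml => false
    xpath => [
      "/log/flow-build/queueId/text()", "queueId"
    ]
  }
}

My understanding is that the depth error comes from Elasticsearch trying to map the entire parsed XML tree (which is deeper than the default mapping depth limit of 20), not from the xpath expression itself, but I could be wrong about that.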
Regards,
Roman