Regarding the XML file issue

Hi,

I have added the XML file configuration in Logstash. When I send the data to Logstash, I can see in the console that the data is sent, and the index is created in Kibana, but the data is not loaded in Kibana. There are many XML start and end tags in one file; I have attached a screenshot.

Note: basically I need all of the data to be displayed in Kibana, which is why I am not writing anything in the filter section.

Below is my logstash.conf:
input {
  file {
    path => "/apps/VIL_CDR/3rdJune/WBMTAS00119060301000003668.txt"
    sincedb_path => "/dev/null"
    start_position => "beginning"
    codec => multiline {
      pattern => "<Report |"
      auto_flush_interval => 1
      negate => "true"
      what => "previous"
      max_lines => 1000000000
    }
    tags => "cdrlogs"
    type => "cdrlogs"
  }
}

filter {
  ## interpret the message as XML
  if [type] == "cdrlogs" {
    xml {
      source => "message"
      store_xml => "false"
      force_array => "false"
    }
  }
}

output {
  elasticsearch {
    hosts => "10.10.218.187:9200"
    index => "cdrlogs-%{+YYYY.MM.dd}"
    cacert => "/etc/logstash/root-ca.pem"
    user => "logstash"
    password => "logstash"
    ssl => true
    ssl_certificate_verification => false
  }
  stdout { codec => rubydebug }
}

I have verified that there are no errors in either the Logstash or the Elasticsearch logs. I have attached my XML file.


Please help me, I am stuck. I kindly request your assistance.

If store_xml is false, and you do not use xpath, then this does not modify the event -- it is just a very expensive no-op. Try specifying target and changing store_xml to true.
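For illustration, a sketch of the two options. The field names (xml_content, report_id) and the xpath expression here are made up for the example, not taken from your XML:

## Option 1: store the whole parsed document under a target field
filter {
  xml {
    source    => "message"
    store_xml => true
    target    => "xml_content"   ## parsed XML lands under this field
  }
}

## Option 2: keep store_xml false and pull out only specific values with xpath
filter {
  xml {
    source    => "message"
    store_xml => false
    xpath     => [ "/Report/@id", "report_id" ]   ## hypothetical path and field name
  }
}

With option 2 you only index the fields you extract, which can matter when each record is very large.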

Thanks for the reply.

I have changed it accordingly, but if I want to specify xpath, there are more than 70 thousand lines in a record, so how can I do that?

I have updated the filter:

filter {
  ## interpret the message as XML
  if [type] == "cdrlogs" {
    xml {
      source => "message"
      store_xml => "true"
      force_array => "false"
      ## target => "parsed"
      target => "xml_content"
    }
  }
}