I have a simple XML file and I am trying to parse it with Logstash, but Logstash cannot handle the split field:
My XML file is:
    <xmldata>
     <head1>
      <key1>Value1</key1>
      <key2>Value2</key2>
      <id>0001</id>
      <date>01-01-2016 09:00:00</date>
     </head1>
     <head1>
      <key3>Value3</key3>
     </head1>
    </xmldata> 
My config file is:
input {
  file {
    path => "/home/safaa/Documents/nessus/validate.xml"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    codec => multiline {
      pattern => "^<\?xmldata .*\>"
      negate => true
      what => "previous"
      auto_flush_interval => 1
    }
  }
}
filter {
  xml {
    store_xml => false
    source => "message"
    target => "xml_content"
  }
  split {
    field => "xml_content[head1][key1]"
  }
  mutate {
    rename => {
      "xml_content[head1][key1]" => "var1"
    }
  }
}
output {
 stdout { codec => rubydebug }
 elasticsearch {
  index => "logstash-xml"
  hosts => ["127.0.0.1:9200"]
  document_id => "%{[id]}"
  document_type => "xmlfiles"
 }
}
My Logstash logs:
[2018-10-21T15:17:40,666][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-10-21T15:17:41,465][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://127.0.0.1:9200/]}}
[2018-10-21T15:17:41,482][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://127.0.0.1:9200/, :path=>"/"}
[2018-10-21T15:17:41,905][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://127.0.0.1:9200/"}
[2018-10-21T15:17:42,009][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-10-21T15:17:42,014][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-10-21T15:17:42,074][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//127.0.0.1:9200"]}
[2018-10-21T15:17:42,109][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-10-21T15:17:42,151][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-10-21T15:17:43,863][INFO ][logstash.pipeline        ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x2d7d2c2a run>"}
[2018-10-21T15:17:43,997][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2018-10-21T15:17:44,021][INFO ][filewatch.observingtail  ] START, creating Discoverer, Watch with file and sincedb collections
[2018-10-21T15:17:45,052][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2018-10-21T15:17:47,216][WARN ][org.logstash.FieldReference] Detected ambiguous Field Reference `xml_content[head1][key1]`, which we expanded to the path `[xml_content, head1, key1]`; in a future release of Logstash, ambiguous Field References will not be expanded.
[2018-10-21T15:17:47,233][WARN ][logstash.filters.split   ] Only String and Array types are splittable. field:xml_content[head1][key1] is of type = NilClass
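Reading the first warning, I guess the field references are supposed to be fully bracketed (the log itself expands my reference to the path `[xml_content, head1, key1]`). A sketch of what I believe the filter section would then look like, though I am not sure this alone fixes the NilClass error, since with `store_xml => false` the `xml_content` target may never be populated:

```
filter {
  xml {
    store_xml => false
    source => "message"
    target => "xml_content"
  }
  split {
    # fully bracketed field reference, as hinted by the FieldReference warning
    field => "[xml_content][head1][key1]"
  }
  mutate {
    rename => { "[xml_content][head1][key1]" => "var1" }
  }
}
```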