Hello everyone,
I'm sending XML logs to Logstash using Filebeat. It worked perfectly the first time I set up my configuration, but since I restarted the Logstash service with "systemctl restart logstash.service" (I'm on CentOS 7), the logs are still sent in full, but it seems that Logstash no longer parses them completely.
Example of a parsing error:
Here you can see the parsed message as it appears in Kibana:
And here is the XML message, which begins with "Alert message id" and not "Detect Time" like the one above:
My Logstash config:
input {
  beats {
    port => 5044
  }
}
filter {
  xml {
    source => "message"
    store_xml => true
    target => "parsed_data"
    xpath => [
      "/Alert/Analyzer/Node/location/text()", "Localisation",
      "/Alert/Analyzer/Node/name/text()", "Nom",
      "/Alert/Analyzer/Node/Address/address/text()", "AdresseIP",
      "/Alert/Analyzer/Process/name/text()", "Nom_manager",
      "/Alert/Analyzer/Process/pid/text()", "pid",
      "/Alert/Analyzer/Process/path/text()", "Chemin",
      "/Alert/Analyzer/Analyzer/name/text()", "Nom_de_la_sonde",
      "/Alert/Analyzer/Analyzer/Node/address/text()", "Adresse_sonde",
      "/Alert/Assessment/Impact/text()", "Alerte",
      "/Alert/CreateTime/text()", "Date"
    ]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    manage_template => false
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}
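For context, here is a simplified sketch of the structure my XML alerts follow (I've reconstructed it from the xpath expressions above; the values are replaced by "..." placeholders and the real alerts contain more elements):

<!-- simplified alert structure, values replaced with placeholders -->
<Alert messageid="...">
  <Analyzer>
    <Node>
      <location>...</location>
      <name>...</name>
      <Address>
        <address>...</address>
      </Address>
    </Node>
    <Process>
      <name>...</name>
      <pid>...</pid>
      <path>...</path>
    </Process>
    <Analyzer>
      <name>...</name>
      <Node>
        <address>...</address>
      </Node>
    </Analyzer>
  </Analyzer>
  <Assessment>
    <Impact>...</Impact>
  </Assessment>
  <CreateTime>...</CreateTime>
</Alert>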
I think that, because of this incomplete parsing, Kibana doesn't create the appropriate fields (Localisation, Nom, AdresseIP, ...):
Yet these fields were there before I restarted the Logstash service:
I hope you can help me resolve my issue. For my part, I did a lot of searching on the Internet but did not find a solution.
Regards,
Noe