How to configure Filebeat and Logstash to add XML files to Elasticsearch?

Hi everybody, I'm a beginner here. My problem is configuring Filebeat and Logstash to add XML files to Elasticsearch on CentOS 7.
I have already installed the latest versions of Filebeat, Logstash, Elasticsearch, and Kibana, with the standalone "elasticsearch-head" plug-in to look inside Elasticsearch. To test my installation, I successfully added a simple log file from the CentOS system (/var/log/messages) and can see it in the elasticsearch-head plug-in (6 indices and 26 shards).

Now the next step is to add logs from an XML file. After reading the documentation, I configured Filebeat and Logstash. All services are running, and I ran the command "touch /mes/AddOf.xml" to try to trigger a Filebeat event and forward the log to Logstash (AddOf.xml is my log file).
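
Since "touch" only updates the file's timestamp without adding any content, I also append a test event to the file by hand, for example (a shortened sketch of one of my real events):

# Append one complete log4j event so Filebeat has new lines to pick up
cat >> /mes/AddOf.xml << 'EOF'
<log4j:event logger="ServiceLogger" timestamp="1494973209812" level="INFO" thread="QueueWorker_1">
	<log4j:message>Test event</log4j:message>
</log4j:event>
EOF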

My XML data structure looks like this for one log event:

<log4j:event logger="ServiceLogger" timestamp="1494973209812" level="INFO" thread="QueueWorker_1_38a0fec5-7c7f-46f5-a87a-9134fff1b493">
	<log4j:message>Traitement du fichier \\ifs-app-01\Interfaces_MES\AddOf\ITF_MES_01_01_d2bef200-3a85-11e7-1ab5-9a50967946c3.xml</log4j:message>
	<log4j:properties>
		<log4j:data name="log4net:HostName" value="MES-01" />
		<log4j:data name="log4jmachinename" value="MES-01" />
		<log4j:data name="log4net:Identity" value="" />
		<log4j:data name="log4net:UserName" value="SOFRADIR\svc_mes_sf" />
		<log4j:data name="LogName" value="UpdateOperationOf" />
		<log4j:data name="log4japp" value="MES_SynchroService.exe" />
	</log4j:properties>
	<log4j:locationInfo class="MES_SynchroService.Core.FileManager" method="TraiteFichier" file="C:\src\MES_PROD\MES_SynchroService\Core\FileManager.cs" line="47" />
</log4j:event>

My Filebeat configuration is like this (/etc/filebeat/filebeat.yml):

filebeat.prospectors:

# Each - is a prospector. Most options can be set at the prospector level, so
# you can use different prospectors for various configurations.
# Below are the prospector specific configurations.
- input_type: log

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /mes/*.xml

  document_type: message

  ### Multiline options

  # Multiline can be used for log messages spanning multiple lines. This is common
  # for Java stack traces or C-line continuation

  # The regexp pattern that has to be matched. This pattern matches all lines starting with <log4j:event
  multiline.pattern: ^<log4j:event

  # Defines if the pattern set under pattern should be negated or not. Default is false.
  multiline.negate: true

  # Match can be set to "after" or "before". It is used to define if lines should be appended to a pattern
  # that was (not) matched before or after, or as long as a pattern is not matched based on negate.
  # Note: "after" is the equivalent to "previous" and "before" is the equivalent to "next" in Logstash
  multiline.match: after

#================================ Outputs =====================================

# Configure what outputs to use when sending the data collected by the beat.
# Multiple outputs may be used.

#----------------------------- Logstash output --------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["localhost:5044"]

  # Optional SSL. By default is off.
  # List of root certificates for HTTPS server verifications
  ssl.certificate_authorities: ["/etc/pki/tls/certs/logstash-forwarder.crt"]

  # Certificate for SSL client authentication
  #ssl.certificate: "/etc/pki/client/cert.pem"

  # Client Certificate Key
  #ssl.key: "/etc/pki/client/cert.key"
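
To check that this file at least parses, I run Filebeat's configuration test (I believe the flag is -configtest on the 5.x series; newer versions use "filebeat test config" instead):

# Validate filebeat.yml without actually shipping anything (5.x syntax)
filebeat -configtest -e -c /etc/filebeat/filebeat.yml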

My Logstash input configuration (/etc/logstash/conf.d/01-beats-input.conf):

input {
  beats {
    port => 5044
    ssl => true
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}
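
Once Logstash is running, I check that the Beats input is actually listening (ss is available on CentOS 7):

# Confirm Logstash opened port 5044 for incoming Beats connections
ss -tlnp | grep 5044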

My Logstash filter configuration (/etc/logstash/conf.d/01-beats-filter.conf):

filter {
  xml {
    source => "message"
    xpath => [
      "/log4j:event/log4j:message/text()", "messageMES"
    ]
    store_xml => true
    target => "doc"
  }
}
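
Since my events use the log4j: prefix, I wonder if the xpath lookup needs the namespace declared explicitly. The xml filter has a namespaces option, so a variant I am considering looks like this (the URI below is the standard log4j one and is an assumption on my side):

filter {
  xml {
    source => "message"
    store_xml => true
    target => "doc"
    # Declare the prefix used by the xpath expression below;
    # the URI is an assumption about how my events are produced
    namespaces => {
      "log4j" => "http://jakarta.apache.org/log4j/"
    }
    xpath => [
      "/log4j:event/log4j:message/text()", "messageMES"
    ]
  }
}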

My Logstash output configuration (/etc/logstash/conf.d/01-beats-output.conf):

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    sniffing => true
    manage_template => false
    index => "mes_log"
    document_type => "%{[@metadata][type]}"
  }
}
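
Before restarting Logstash, I validate the three pipeline files together (the path below is the default RPM install location):

# Parse-check everything in conf.d without starting the pipeline
/usr/share/logstash/bin/logstash --path.settings /etc/logstash --config.test_and_exit -f /etc/logstash/conf.d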

But when I run "touch /mes/AddOf.xml", or manually add an event to AddOf.xml as shown above, I don't see a new index with the log events from the XML file in Elasticsearch.
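
To check on the Elasticsearch side, I list the indices directly; the mes_log index from my output configuration should show up here once events arrive:

# List all indices with their document counts
curl 'localhost:9200/_cat/indices?v'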

I have seen the documentation for the XML plug-in for Logstash (here), but I don't know if I need to install something. Or maybe I'm not doing the right thing for Filebeat to send the logs to Logstash?
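
As far as I know the xml filter ships with the default Logstash distribution, so I list the installed plugins to check whether it is present:

# logstash-filter-xml should appear in this list if it is installed
/usr/share/logstash/bin/logstash-plugin list | grep filter-xml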

I'm very motivated to learn about the ELK stack. Thank you in advance for your expertise and help, I would be grateful! :slight_smile:
