Unwanted truncation of the start of multiline XML messages from Filebeat

I'm having an issue with an XML log file being ingested by Filebeat: some of the messages are passed on to Logstash with the opening tag truncated. There's no pattern to the failures that I can spot, and it works about 95% of the time. I'm running the latest Filebeat and Logstash.

I'm fairly certain the issue is with Logstash and not Filebeat, as I set up Filebeat's file output and searched it for the pattern, so it looks like the truncation happens when the event is passed to Logstash.
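For anyone reproducing that check, it amounts to temporarily swapping the Logstash output for Filebeat's file output and searching the spooled events for the opening tag. A minimal sketch (Filebeat only runs one output at a time, and the path/filename below are placeholders, not the real ones):

```
# Temporary diagnostic output (sketch): writes the assembled multiline events
# to disk so the start of each message can be checked before it reaches Logstash.
# output.logstash must be disabled while this is enabled.
output.file:
  path: "C:/filebeat-debug"
  filename: "filebeat-events"
```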

The opening tag is `<event`, but it is being truncated to `nt` by the time it hits Elasticsearch.

Does anyone have any idea why the opening tag gets truncated? I changed the multiline pattern to `<ev` and the results were the same.

Original Log entry:

```
  <event seq="1373559" time="2019-10-06 05:33:24.821336 +0100" app="BvSshServer 8.32" name="I_CONNECT_ACCEPTED" desc="Connection accepted.">
    <session id="564645" service="SSH" remoteAddress="168.6.1.1:59730"/>
    <parameters addressRule="AnyIP" listenAddress="10.5.1.2:22"/>
    <sessions ssh="4" sshAuth="2" ftp="0" ftpAuth="0"/>
  </event>
```
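For context, each `<event>` block sits inside a wrapper document, which is why the exclude_lines patterns in the Filebeat config below filter out the `<?xml`, `<log>`, `<start time` and `</log>` lines. Roughly like this (reconstructed from those patterns; the exact attributes of the wrapper lines are assumptions):

```
<?xml version="1.0" encoding="UTF-8"?>
<log>
  <start time="2019-10-06 05:00:00.000000 +0100" app="BvSshServer 8.32"/>
  <event seq="1373559" ...>
    ...
  </event>
</log>
```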

Filebeat config:

```
- type: log
  enabled: true
  paths:
    - L:\Bitvise\Logs\*
  tags: ["bitvise"]
  exclude_lines: ['<?xml','<log>','<start time','</log>']
  ### Multiline options
  multiline.pattern: '^  <event '
  multiline.negate: true
  multiline.match: after
  multiline.flush_pattern: '</event>'
```
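The output section of filebeat.yml isn't shown above; the hand-off to Logstash is assumed to look roughly like this (host and port are placeholders for illustration only):

```
# Assumed output section (not included in the post above) - ships the
# assembled multiline events to the Logstash beats input.
output.logstash:
  hosts: ["logstash-host:5044"]
```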

Logstash filter:

```
filter {
  if "bitvise" in [tags] {
    xml {
      force_array => false
      source => "message"
      store_xml => true
      target => "sftp"
    }
    mutate {
      convert => {
        "bytes_written" => "integer"
        "bytes_read"    => "integer"
        "time_ms"       => "integer"
      }
    }
  }
}
```
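To see exactly what Logstash receives before the xml filter runs, a throwaway pipeline with no filters can be pointed at the same beats input. A minimal sketch, assuming the usual port 5044:

```
# Throwaway debug pipeline (sketch): no filters, just dump each event as it
# arrives from Filebeat so the start of the "message" field can be inspected.
# Port 5044 is an assumption; use whatever port the real beats input listens on.
input {
  beats {
    port => 5044
  }
}
output {
  stdout { codec => rubydebug }
}
```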

Elasticsearch example entry:

"tags": [
	"bitvise",
	"beats_input_codec_plain_applied",
	"_xmlparsefailure"
],

"message": "nt seq=\"621306\" time=\"2019-11-06 13:46:21.481477 +0000\" app=\"BvSshServer 8.36\" name=\"I_CONNECT_ACCEPTED\" desc=\"Connection accepted.\">\n <session id=\"226641\" service=\"SSH\" remoteAddress=\"10.0.0.1:36236\"/>\n <parameters addressRule=\"AnyIP\" listenAddress=\"10.5.1.28:22\"/>\n <sessions ssh=\"4\" sshAuth=\"2\" ftp=\"0\" ftpAuth=\"0\"/>\n </event>"

logstash-plain.log:

```
[2019-11-08T17:21:34,349][WARN ][logstash.filters.xml     ] Error parsing xml with XmlSimple {:source=>"message", :value=>"nt seq=\"1879455\" time=\"2019-11-08 17:21:28.259809 +0000\" app=\"BvSshServer 8.36\" name=\"I_CONNECT_ACCEPTED\" desc=\"Connection accepted.\">\n    <session id=\"397581\" service=\"SSH\" remoteAddress=\"16.6.12.1:59398\"/>\n    <parameters addressRule=\"AnyIP\" listenAddress=\"10.5.1.2:22\"/>\n    <sessions ssh=\"2\" sshAuth=\"0\" ftp=\"0\" ftpAuth=\"0\"/>\n  </event>", :exception=>#<REXML::ParseException: #<RuntimeError: attempted adding second root element to document>
uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/rexml/document.rb:96:in `add'
uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/rexml/element.rb:904:in `add'
uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/rexml/child.rb:22:in `initialize'
uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/rexml/parent.rb:14:in `initialize'
uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/rexml/element.rb:60:in `initialize'
uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/rexml/element.rb:902:in `add'
uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/rexml/element.rb:298:in `add_element'
uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/rexml/document.rb:103:in `add_element'
uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/rexml/parsers/treeparser.rb:34:in `parse'
uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/rexml/document.rb:288:in `build'
uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/rexml/document.rb:45:in `initialize'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/xml-simple-1.1.5/lib/xmlsimple.rb:971:in `parse'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/xml-simple-1.1.5/lib/xmlsimple.rb:164:in `xml_in'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/xml-simple-1.1.5/lib/xmlsimple.rb:203:in `xml_in'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-xml-4.0.7/lib/logstash/filters/xml.rb:185:in `filter'
/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:143:in `do_filter'
/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:162:in `block in multi_filter'
org/jruby/RubyArray.java:1800:in `each'
/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:159:in `multi_filter'
org/logstash/config/ir/compiler/AbstractFilterDelegatorExt.java:115:in `multi_filter'
/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:243:in `block in start_workers'
...
attempted adding second root element to document
Line: 3
Position: 282
Last 80 unconsumed characters:
>, :backtrace=>["uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/rexml/parsers/treeparser.rb:96:in `parse'", "uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/rexml/document.rb:288:in `build'", "uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/rexml/document.rb:45:in `initialize'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/xml-simple-1.1.5/lib/xmlsimple.rb:971:in `parse'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/xml-simple-1.1.5/lib/xmlsimple.rb:164:in `xml_in'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/xml-simple-1.1.5/lib/xmlsimple.rb:203:in `xml_in'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-xml-4.0.7/lib/logstash/filters/xml.rb:185:in `filter'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:143:in `do_filter'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:162:in `block in multi_filter'", "org/jruby/RubyArray.java:1800:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:159:in `multi_filter'", "org/logstash/config/ir/compiler/AbstractFilterDelegatorExt.java:115:in `multi_filter'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:243:in `block in start_workers'"]}
```

Is anyone able to assist, please? I'm at a complete dead end and haven't found anything online that seems similar to this issue.
