Filebeat XML Parsing in Logstash

I was running the following pipeline to parse XML from a local file into an Elasticsearch index, and it appears to work fine:

input {
  file {
    path => "/tmp/test2.xml"
    sincedb_path => "/dev/null"
    start_position => "beginning"
    codec => multiline {
      pattern => "<?xml"
      auto_flush_interval => 1
      negate => "true"
      what => "previous"
      max_lines => 1000000000
      max_bytes => "500 MiB"
    }
  }
}

filter {
  xml {
    source => "message"
    target => "parsed"
    force_array => "false"
  }
}
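For illustration, with `negate => "true"` and `what => "previous"` the multiline codec appends every line that does not match the pattern to the preceding lines, so each new XML declaration starts a fresh event. A hypothetical input file such as:

    <?xml version="1.0" encoding="UTF-8"?>
    <note>
      <to>Tove</to>
      <from>Jani</from>
      <body>Reminder</body>
    </note>

would be flushed as a single event whose message field holds the whole document, which the xml filter then parses into the parsed field.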

But now I need it to read from Filebeat instead, so I have tried to integrate the same codec in various ways, without much success, e.g.:

input {
  beats {
    port => 5044
    sincedb_path => "/dev/null"
    start_position => "beginning"
    codec => multiline {
      pattern => "<?xml"
      auto_flush_interval => 1
      negate => "true"
      what => "previous"
      max_lines => 1000000000
      max_bytes => "500 MiB"
    }
  }
}
filter {
  xml {
    source => "message"
    target => "parsed"
    force_array => "false"
  }
}

I get an error:

[ERROR] 2020-10-06 12:52:53.149 [Agent thread] agent - An exception happened when converging configuration {:exception=>LogStash::Error, :message=>"Don't know how to handle Java::JavaLang::IllegalStateExceptionforPipelineAction::Create"}
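One thing that stands out: sincedb_path and start_position are options of the file input, not the beats input, and unsupported options can make pipeline creation fail. A minimal sketch of the beats input with only the settings carried over from above (untested against this setup):

    input {
      beats {
        port => 5044
        codec => multiline {
          pattern => "<?xml"
          auto_flush_interval => 1
          negate => "true"
          what => "previous"
          max_lines => 1000000000
          max_bytes => "500 MiB"
        }
      }
    }

Note that the multiline codec documentation advises against using the codec with the beats input at all, and recommends assembling multiline events in Filebeat itself.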

Thank you in advance for any hint on how to resolve this issue.
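In case it is useful context, a hedged sketch of doing the multiline assembly on the Filebeat side instead (filebeat.yml fragment; the path and host are assumptions, and multiline.match: after corresponds to what => "previous" in the codec):

    filebeat.inputs:
      - type: log
        paths:
          - /tmp/test2.xml
        multiline.pattern: '^<\?xml'
        multiline.negate: true
        multiline.match: after

    output.logstash:
      hosts: ["localhost:5044"]

With this in place, the Logstash beats input would need no codec at all, and the xml filter shown above could stay unchanged.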

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.