Update XML file content to prevent an XML parse error

I have a large number (~10k) of XML files that are missing the closing root tag. When I load these files into Logstash with the xml filter, I get a parse error.

Is there a way to make Logstash add the closing tag in a filter before parsing the content?

Given below is my XML:

 <root>
   <elm>....</elm>
   <elm>....</elm>
   <elm>....</elm>
   <elm>....</elm>

I looked at the available filters and couldn't figure out how to append the closing root tag to the content after it is read. Please help.

Is there a way I can access the file contents from the input { } block in the filter { } section? Could I write some Ruby code there to modify the file contents after they are read?

Thanks.

You can modify the XML payload with e.g. a mutate filter (the gsub option might be useful) before you feed it to the xml filter.

Magnus, thanks a lot for replying. gsub does a substitution, right? How can I make it append the closing tag to the source? I'm sorry, I fail to see what you are suggesting.

I tried using merge to combine two fields, but I end up with an array containing both values.

input {
    file {
        .....
        add_field => { 'closing_tag' => '</root>' }
        .....
    }
}

filter {
    mutate {
        merge => { 'message' => 'closing_tag' }
    }
    .......
}

The above creates an array with two values: one with the message content and the other with the closing_tag value.
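In plain Ruby terms, the behaviour I'm seeing looks something like this (simplified sketch with made-up values; `message` and `closing_tag` stand for the fields in my config above):

```ruby
# What mutate's merge appears to do with my two fields:
message     = "<root><elm>a</elm>"
closing_tag = "</root>"

# merge combines the two fields into an array...
merged = [message, closing_tag]

# ...but what I actually want is plain string concatenation:
wanted = message + closing_tag

puts merged.inspect  # ["<root><elm>a</elm>", "</root>"]
puts wanted          # <root><elm>a</elm></root>
```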

What am I missing here?

How can I make it append the closing tag to the source?

Replace $ (the end of the string) with </root> using gsub, or do this:

mutate {
  replace => {
    "message" => "%{message}</root>"
  }
}
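In plain Ruby, both approaches boil down to the same string operation (a sketch, not your exact pipeline; `message` is the field name from your config):

```ruby
message = "<root>\n  <elm>a</elm>\n  <elm>b</elm>"

# gsub against \z (the absolute end of the string; plain $ would also
# match before every newline in a multiline message) appends the tag:
via_gsub = message.gsub(/\z/, "</root>")

# replace => "%{message}</root>" is effectively string interpolation:
via_replace = "#{message}</root>"

via_gsub == via_replace  # true
```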

Yes!! Thanks a lot for your help. It worked. I was stuck on this for days and you solved it in a minute. You are awesome. :+1:

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.