Can someone tell me how to parse these types of files with Logstash and then index them in Elasticsearch? The files can contain a variable number of internal tags.
Here is my config file, but unfortunately it does not work correctly.
input {
    file {
        path => "/folder/etlexpmx.xml"
        start_position => "beginning"
        sincedb_path => "/dev/null"
        exclude => "*.gz"
        type => "xml"
        codec => multiline {
            pattern => "^<\? PMSetup .*\>"
            negate => true
            what => "previous"
        }
    }
}
filter {
    xml {
        source => "message"
        target => "PMSetup"
        force_array => false
    }
}
output {
    elasticsearch {
        codec => json
        hosts => "localhost"
        index => "TESTetlexpmx"
    }
    stdout {
        codec => rubydebug
    }
}
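One likely cause of the failure: Elasticsearch index names must be all lowercase, so `index => "TESTetlexpmx"` makes every bulk request fail and no index is ever created. A sketch of a corrected output section (same host and index name as above, just lowercased; the `codec => json` setting is dropped because the elasticsearch output already serializes events as JSON):

output {
    elasticsearch {
        hosts => "localhost"
        index => "testetlexpmx"   # index names must be lowercase
    }
    stdout {
        codec => rubydebug
    }
}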
I want to get a structure like this:
startTime = "2019-05-30T15:00:00.000+02:00:00"
BMF = 400495
BTF = 610
measurementType = "PABTS"
c123000 = 125483
But as a result, the index is not created in Elasticsearch.
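It can also help to check, outside Logstash, what structure the `xml` filter will actually see. Below is a minimal Python sketch that parses a hypothetical PMSetup document shaped to match the fields listed above; the element and attribute names (`PMMOResult`, `MO`, `PMTarget`, etc.) are guesses for illustration, not taken from the real file:

```python
import xml.etree.ElementTree as ET

# Hypothetical sample, since the original XML snippet is not shown.
# Assumed shape: a <PMSetup> root with nested measurement elements.
sample = """<PMSetup startTime="2019-05-30T15:00:00.000+02:00:00">
  <PMMOResult>
    <MO BMF="400495" BTF="610"/>
    <PMTarget measurementType="PABTS">
      <c123000>125483</c123000>
    </PMTarget>
  </PMMOResult>
</PMSetup>"""

root = ET.fromstring(sample)

# These lookups mirror the flat fields the question wants to end up with.
print(root.get("startTime"))
mo = root.find(".//MO")
print(mo.get("BMF"), mo.get("BTF"))
print(root.find(".//PMTarget").get("measurementType"))
print(root.find(".//c123000").text)
```

If a standalone parser like this can pull the fields out, the remaining work is on the Logstash side (the multiline pattern and the `xml` filter's `target`/`xpath` settings), not in the data itself.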
