Index the content of a KV file with Logstash as one document dynamically

I have many properties files with the structure key=value, for example:

rejets.cache.temporaire.time=60
data.cache.referentiel.time=10
data.cache.temporaire.time=60
data.cache.getinfos.time=200

I want to index each file as a single Elasticsearch document, with the field names taken dynamically from the keys, like this:

    {
      "rejets.cache.temporaire.time": 60,
      "data.cache.referentiel.time": 10,
      "data.cache.temporaire.time": 60,
      "data.cache.getinfos.time": 200
    }

I used the following configuration, but each field ends up indexed as a separate document, so in my case I get 4 documents:

    mutate {
        gsub => [ "message", "[\\\\]r", "esp" ]
        gsub => [ "message", "[\\\\]n", "ace" ]
    }

    ruby {
        code => "begin; event['message'] = event['message'].split(/espace/); rescue Exception; end"
    }

    kv {
        source => "message"
        value_split => "="
        field_split => "\n"
    }

    grok {
        match => { "message" => "(?<param>[^=]*)=%{GREEDYDATA:value}" }
    }

    if "_grokparsefailure" in [tags] {
        drop {}
    }

    mutate {
        remove_field => ["message"]
    }

I want one indexed document containing all the fields from the properties file.

How can I do this, please? Thanks for the help!

Use a multiline codec on the input to combine all of the lines into a single event. Do this by using a pattern that never matches and a timeout:

    codec => multiline {
        pattern => "^Spalanzani"
        negate => true
        what => previous
        auto_flush_interval => 1
        multiline_tag => ""
    }
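In context, the input section might look like the sketch below. The file path and the `sincedb_path` setting are placeholders for illustration; `^Spalanzani` is just a string that never occurs in the data, so every line is appended to the previous one until `auto_flush_interval` flushes the whole file as one event:

```
input {
  file {
    path => "/path/to/*.properties"   # placeholder path to the properties files
    start_position => "beginning"
    sincedb_path => "/dev/null"       # re-read on every run; adjust for production
    codec => multiline {
      pattern => "^Spalanzani"        # never matches, so all lines are merged
      negate => true
      what => previous
      auto_flush_interval => 1        # flush the accumulated event after 1s of silence
      multiline_tag => ""
    }
  }
}
```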

Then use a kv filter with a literal newline in the field_split option:

    kv {
        field_split => "
"
        value_split => "="
    }
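To see why this yields one document, here is a plain-Ruby sketch (not Logstash itself, just an illustration) of what the combined multiline codec and kv filter do to the properties file from the question: all lines are merged into a single message, which is then split into the key/value fields of one hash:

```ruby
# Simulate: the multiline codec has merged the whole file into one message;
# the kv filter then splits on newline (field_split) and "=" (value_split).
message = <<~PROPS
  rejets.cache.temporaire.time=60
  data.cache.referentiel.time=10
  data.cache.temporaire.time=60
  data.cache.getinfos.time=200
PROPS

# One document: a single hash holding every key from the file.
document = message.split("\n").each_with_object({}) do |line, doc|
  key, value = line.split("=", 2)
  doc[key] = value unless key.nil? || key.empty?
end

puts document.size                          # prints 4 (fields, not documents)
puts document["data.cache.getinfos.time"]   # prints 200
```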

Thank you for your help!