Testing for values and adding a default when a field is empty or null

Hello All,

I have an upstream system that produces event data, which will hopefully be parsed by Logstash and then sent to Elasticsearch.

I've worked out grok patterns and basic filters using some test events. This part works until any one of the fields lacks a value; when that happens, the pipeline errors out and stops. The following is an example.

Expected event data: fromCountry=192.168.1.1, toCountry=192.169.2.2,
Unexpected event data: fromCountry=, toCountry=,

Part of the overall Grok pattern - fromCountry=%{IPV4:fromCountry}, toCountry=%{IPV4:toCountry}, ...

So, has anyone parsed a data stream and added placeholder values (-, 00, or NA) when a field lacks a value?

Ideally, in the above I'd like to test each field and substitute a default value of 0.0.0.0 when the country IP is unknown.

The problem, as I see it, is that this may be doable for one or two fields but could get out of hand for 20-plus fields.

Thoughts?
Regards
TimW

You can make patterns optional using ()?

    grok { match => { "message" => "fromCountry=(%{IPV4:fromCountry})?, toCountry=(%{IPV4:toCountry})?," } }
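To see why the `()?` group helps, here is a plain-Ruby sketch (outside Logstash, using a simplified stand-in for grok's IPV4 pattern, which is an assumption here) showing that both lines match and the named captures simply come back nil when the value is absent:

```ruby
# Simplified stand-in for grok's IPV4 pattern (illustration only).
IPV4 = /\d{1,3}(?:\.\d{1,3}){3}/
# Ruby-regex equivalent of the grok pattern above, with optional groups.
PATTERN = /fromCountry=(?<fromCountry>#{IPV4})?, toCountry=(?<toCountry>#{IPV4})?,/

expected   = "fromCountry=192.168.1.1, toCountry=192.169.2.2,"
unexpected = "fromCountry=, toCountry=,"

m1 = PATTERN.match(expected)
m2 = PATTERN.match(unexpected)

# Both lines match; captures are nil where the value is missing.
puts m1[:fromCountry]          # => 192.168.1.1
puts m2[:fromCountry].inspect  # => nil
```

With the non-optional pattern, the second line would not match at all and grok would tag the event with _grokparsefailure.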

You can add missing fields using ruby

    ruby {
        code => '
            [ "fromCountry", "toCountry", "otherRequiredField"].each { |k|
                unless event.get(k)
                    event.set(k, "NA")
                end
            }
        '
    }

Obviously you could split that into two [ ].each loops if you really want NA for some fields and 0.0.0.0 for others.
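For example, the two-loop variant could look like this. It is sketched here against a simple Hash-backed stand-in for the Logstash event (event.get/event.set are only available inside the pipeline, so FakeEvent is an assumption for testing outside it):

```ruby
# Minimal stand-in for the Logstash event API, for testing outside the pipeline.
class FakeEvent
  def initialize(data)
    @data = data
  end

  def get(k)
    @data[k]
  end

  def set(k, v)
    @data[k] = v
  end
end

event = FakeEvent.new("fromCountry" => nil, "toCountry" => nil,
                      "otherRequiredField" => nil)

# Same logic as the filter above, split into two loops with different defaults.
["otherRequiredField"].each { |k| event.set(k, "NA") unless event.get(k) }
["fromCountry", "toCountry"].each { |k| event.set(k, "0.0.0.0") unless event.get(k) }
```

Inside the actual ruby filter, only the two `.each` loops would go in the code string; fields that already have a value are left untouched by the unless guard.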

Hello Badger,

Thanks for the hint and solution; it's exactly what was needed.

Cheers
TimW
