Hello All,
I have an upstream system that produces event data, which I'd like to parse with Logstash and then send to ES.
I've worked out grok patterns and basic filters using some test events. This part seems to work until any one of the fields lacks a value. When that happens, the pipeline errors out and stops. The following is an example.
Expected event data: fromCountry=192.168.1.1, toCountry=192.169.2.2,
Unexpected event data: fromCountry=, toCountry=,
Part of the overall Grok pattern - fromCountry=%{IPV4:fromCountry}, toCountry=%{IPV4:toCountry}, ...
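For context, the filter I'm testing with looks something along these lines (simplified, and the real pattern covers many more key=value pairs):

```
filter {
  grok {
    # partial pattern; the two country fields are the ones causing trouble
    match => {
      "message" => "fromCountry=%{IPV4:fromCountry}, toCountry=%{IPV4:toCountry},"
    }
  }
}
```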
So, has anyone parsed a data stream like this and substituted placeholder values (-, 00, or NA) when a field lacks a value?
Ideally, in the example above, I'd like to test each field and substitute a default value of 0.0.0.0 when the country IP is unknown.
The problem, as I see it, is that this is doable for one or two fields but could get out of hand for 20-plus fields. A rough sketch of what I have in mind is below.
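This is untested and only uses the two field names from my example, so treat it as a sketch: make the IP captures optional so grok still matches when the value is missing, then back-fill a default for any field that came through empty.

```
filter {
  grok {
    # wrap the IPV4 captures in an optional group so an empty value still matches
    match => {
      "message" => "fromCountry=(%{IPV4:fromCountry})?, toCountry=(%{IPV4:toCountry})?,"
    }
  }

  # back-fill 0.0.0.0 for any listed field that is missing or empty
  ruby {
    code => '
      ["fromCountry", "toCountry"].each do |f|
        v = event.get(f)
        event.set(f, "0.0.0.0") if v.nil? || v == ""
      end
    '
  }
}
```

Listing all 20-plus field names in that ruby array is what worries me; if there's a cleaner way to default a whole set of fields I'd love to hear it.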
Thoughts?
Regards
TimW