Auto Field Detection in Logs

Is there a way to write a grok pattern such that it will automatically detect fields and parse them into Elasticsearch? I am hoping to eliminate the need to constantly create new grok patterns to recognize new log formats.

For example, all the fields follow the form field_name = field_value:
src_ip = 10.1.1.223 src_port = 8080

Look at the kv filter.
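
For the sample line above, a minimal kv configuration could look like the sketch below. It assumes a reasonably recent version of the filter, where the default lenient whitespace handling absorbs the spaces around the "=" delimiter; the splitter settings shown are the defaults, spelled out for clarity.

```
filter {
  kv {
    # Parse pairs like "src_ip = 10.1.1.223 src_port = 8080" from the
    # message field. field_split and value_split are the defaults; they
    # are written out here to show which character plays which role.
    source      => "message"
    field_split => " "
    value_split => "="
  }
}
```

Any new field_name = field_value pair that appears later is picked up automatically, with no pattern changes needed.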

I can see that Logstash already ships with predefined grok patterns for firewalls such as Juniper.

So in what situations should we use a customized grok pattern, and in what situations should we use the kv filter?

If it's key/value data, I see no reason to use grok unless the keys are very static (in which case the maintenance burden is low).
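
For contrast, a hypothetical grok equivalent for the sample line above would hard-code every key; the field names src_ip and src_port come straight from the example, and any new field means editing the pattern by hand:

```
filter {
  grok {
    # Each key/value pair must be spelled out explicitly, so this only
    # keeps working while the log format stays exactly like this.
    match => {
      "message" => "src_ip = %{IP:src_ip} src_port = %{NUMBER:src_port}"
    }
  }
}
```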

So why would Logstash contain a grok pattern for Juniper when the same can be achieved with the kv filter? I am trying to understand the rationale for why it is there.

Is there a way to configure the kv filter so that it can process key/value pairs in the form "mykey:myvalue" instead of the default form "mykey=myvalue"?

So why would Logstash contain a grok pattern for Juniper when the same can be achieved with the kv filter?

I don't know.

Is there a way to configure the kv filter so that it can process key/value pairs in the form "mykey:myvalue" instead of the default form "mykey=myvalue"?

Yes. Please explore the various configuration options listed in the kv filter documentation.
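
For example, the value_split option controls the character that separates a key from its value. A minimal sketch for "mykey:myvalue" data:

```
filter {
  kv {
    # Use ":" instead of the default "=" as the key/value separator.
    value_split => ":"
  }
}
```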


Thank you for your help! I am now able to do what I need.