How to create a new Logstash grok field

I have an ELK stack: Filebeat ---> Logstash ---> Elasticsearch <--- Kibana.

I am shipping the logs correctly, but I need to add some custom fields to be available in Kibana for search purposes.

The log file contains a source host and a destination host, in this format:

Source : \\abc123\xxx\xxx
Dest : \\def456\xxx\xxx

I now need to create two custom fields to be available in Kibana:

  • source_cluster: abc123
  • destination_cluster: def456

I tried a custom pattern in grok, creating a patterns directory like this:

filter {
  grok {
    patterns_dir => ["/usr/share/logstash/pipeline/patterns"]
  }
}

but it is not taken into account when I reference it like this:

  match => { "message" => "%{SOURCE_CLUSTER:source_cluster}" }


In the patterns directory I created an extra file with this content:

SOURCE_CLUSTER (Source...)(\\w+)

I tested the regex and it is correct; it is able to catch "Source : \abc123".
I also tried the kv filter, but no luck.

Many Thanks

If you want to add a new field, try the mutate filter. As you already have source_cluster and destination_cluster coming from your grok filter, try something like the below:

mutate {
  add_field => {
    "source_cluster"      => "%{SOURCE_CLUSTER}"
    "destination_cluster" => "%{DESTINATION_CLUSTER}"
  }
}
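
If the extraction itself is the sticking point, here is a minimal sketch that avoids the patterns file entirely by using inline named captures. This assumes the log lines look exactly like the example above, with two literal backslashes before the host name:

  filter {
    grok {
      # Sketch only: with the default config.support_escapes => false,
      # the pattern string reaches the regex engine verbatim, so \\\\
      # matches the two literal backslashes in "\\abc123".
      match => {
        "message" => [
          "Source\s*:\s*\\\\(?<source_cluster>\w+)",
          "Dest\s*:\s*\\\\(?<destination_cluster>\w+)"
        ]
      }
    }
  }

grok stops at the first pattern that matches (break_on_match defaults to true), so events containing a Source line get source_cluster and events containing a Dest line get destination_cluster.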

Hope this helps.
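
As a side note on the original patterns_dir approach: a custom pattern captures the whole match (including the "Source : " prefix) unless the host part is captured by a nested pattern. A patterns file sketch along these lines might work (the file name and the CLUSTER_NAME helper pattern are hypothetical; adjust to your layout):

  # /usr/share/logstash/pipeline/patterns/clusters (hypothetical file name)
  CLUSTER_NAME \w+
  SOURCE_CLUSTER Source\s*:\s*\\\\%{CLUSTER_NAME:source_cluster}
  DEST_CLUSTER Dest\s*:\s*\\\\%{CLUSTER_NAME:destination_cluster}

It would then be referenced as match => { "message" => "%{SOURCE_CLUSTER}" } — since the field name is baked into the pattern definition, no :source_cluster suffix is needed at the call site.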
