Sending @metadata from Logstash to Elasticsearch

I've got the following Logstash config, and I'm trying to send the RabbitMQ headers (which are stored in the @metadata field) to Elasticsearch:

input {
    rabbitmq {
        auto_delete => false
        durable => false
        host => "my_host"
        port => 5672
        queue => "my_queue"
        key => "#"      
        threads => 1
        codec => "plain"
        user => "user"
        password => "pass"
        metadata_enabled => true
    }
}

filter {
    ???
}

output {
    stdout { codec => rubydebug {metadata => true} }
    elasticsearch { hosts => ["localhost"] }
}

I can see the headers in the std output

{
    "@timestamp" => 2017-07-11T15:53:28.629Z,
     "@metadata" => {
           "rabbitmq_headers" => { "My_Header" => "My_value"
        },
        "rabbitmq_properties" => {
            "content-encoding" => "utf-8",
              "correlation-id" => "785901df-e954-4735-a9cf-868088fdac87",
                "content-type" => "application/json",
                    "exchange" => "My_Exchange",
                 "routing-key" => "123-456",
                "consumer-tag" => "amq.ctag-ZtX3L_9Zsz96aakkSGYzGA"
        }
    },
      "@version" => "1",
       "message" => "{...}"

Is there some filter (grok, mutate, kv, etc.) which can copy these values into fields on the message sent to Elasticsearch?

You can use a mutate filter to rename the fields you want to keep. Note the syntax for how to reference nested fields.

https://www.elastic.co/guide/en/logstash/current/plugins-filters-mutate.html#plugins-filters-mutate-rename
https://www.elastic.co/guide/en/logstash/current/event-dependent-configuration.html#logstash-config-field-references
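
For example, to pull one of the properties shown in the rubydebug output above into a regular indexed field (the target name routing_key is just an illustration):

filter {
    mutate {
        rename => { "[@metadata][rabbitmq_properties][routing-key]" => "routing_key" }
    }
}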

I'd like to keep all the metadata fields...so I've tried:

filter {
	mutate {
		rename => {"@metadata" => "meta"}
	}
}

But it's not working.


The contents of the @metadata field only exist in Logstash and are not part of any events sent from Logstash.

We can, however, use mutate to create and read fields under @metadata within the Logstash pipeline, for example:

mutate { add_field => { "[@metadata][test]" => "Hello" } }

and use it as below:

output {
    if [@metadata][test] == "Hello" {
        stdout { codec => rubydebug }
    }
}

You can also create new fields from the existing metadata, but you have to reference each value by its full path, e.g. [@metadata][test], specifying the metadata attribute name as well.
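
As a sketch, assuming the rabbitmq_properties names shown earlier in the thread (the field name correlation_id is an illustration), an add_field with the %{...} sprintf syntax copies a metadata value into a real field that survives output:

filter {
    mutate {
        add_field => { "correlation_id" => "%{[@metadata][rabbitmq_properties][correlation-id]}" }
    }
}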

Regards,
Joseph

Ended up doing this:

ruby {
	code => 'event.get("[@metadata][rabbitmq_headers]").each {|k,v| event.set(k, v)}'
}
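
In plain Ruby, that one-liner amounts to walking the headers hash and promoting each key/value pair to a top-level field. A minimal sketch, with an ordinary Hash standing in for the Logstash event API:

```ruby
# Stand-in data: the headers as they appear under [@metadata][rabbitmq_headers]
headers = { "My_Header" => "My_value" }

# A plain Hash standing in for the Logstash event
# (event.set(k, v) in the real filter)
event = { "message" => "{...}" }

# Promote every header key/value pair to a top-level field on the event
headers.each { |k, v| event[k] = v }

event  # => {"message"=>"{...}", "My_Header"=>"My_value"}
```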

Let me know if there's a more 'declarative' way.

