Logstash input-beat not receiving @metadata field

Hello,

I searched all over the web and on this forum but haven't found an answer.

I configured Filebeat to parse server logs and send them to Logstash. When I try to refer to @metadata fields in filters and in Elasticsearch output parameters (as shown here), the field doesn't seem to be transmitted.

When I output Filebeat to stdout I can see the @metadata field structure, but as soon as the event enters Logstash, it's not there. I checked with the rubydebug output...

Any ideas?

LS 2.1.1
filebeat 1.0.1

Thanks
Bruno

Can you share your config files and some more details about your setup? Logstash removes the '@metadata' field itself in its output plugins.

OK, that makes things clearer...

My test setup is very simple:
input-beats:

```
beats { port => 5102 codec => line }
```

no filters...

output-stdout:

```
output { stdout { codec => rubydebug } }
```

When I run Filebeat with stdout, I can see the field:

```
{ "@metadata": { "beat": "filebeat", "type": "my-defined-type" }, ... }
```

But it's not there in the Logstash stdout output with the rubydebug codec...

If I recall what you said correctly, the stdout output removes @metadata? But why do output plugins delete these fields?

The point is that we use front Logstash nodes (proxies) that just open ports and forward everything to a Redis broker.
A Logstash indexing cluster then reads from that broker with a redis input and applies the complex stuff (filters).

So as soon as an output is used we lose @metadata, which makes it impossible to use a buffering broker and then output to Elasticsearch with a config like this:

```
index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
document_type => "%{[@metadata][type]}"
```
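For context, the two-stage setup described above looks roughly like this (the hostname and Redis key are made up for illustration):

```
# Front / proxy node: accept beats events and forward to the Redis broker.
# NOTE: the redis output, like any output, drops [@metadata] when serializing.
input {
  beats { port => 5102 }
}
output {
  redis {
    host      => "redis-broker.example"   # assumed hostname
    data_type => "list"
    key       => "filebeat"               # assumed list key
  }
}
```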

Did I understand that correctly?

Thanks
Bruno

I'm not sure why Logstash removes '@metadata'. I guess the '@metadata' field is supposed to be local to a Logstash instance.
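One thing worth checking: if your version of the rubydebug codec supports the metadata option, you can at least confirm the field is present inside the pipeline before an output strips it:

```
output {
  stdout {
    # Print the event including [@metadata], if your codec
    # version supports the metadata option.
    codec => rubydebug { metadata => true }
  }
}
```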

See this response for using filters to transfer metadata.
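As a sketch of that approach: copy the metadata into a regular field on the shipping node before the output, then restore it into @metadata on the indexing side (the "[meta]" carrier field name here is just an example):

```
# Shipper side: preserve the metadata in a normal field so it
# survives the redis output.
filter {
  mutate {
    add_field => {
      "[meta][beat]" => "%{[@metadata][beat]}"
      "[meta][type]" => "%{[@metadata][type]}"
    }
  }
}

# Indexer side: move it back into @metadata, then drop the carrier field.
filter {
  mutate {
    add_field => {
      "[@metadata][beat]" => "%{[meta][beat]}"
      "[@metadata][type]" => "%{[meta][type]}"
    }
  }
  mutate {
    remove_field => [ "meta" ]
  }
}

output {
  elasticsearch {
    index         => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}
```

Since @metadata is never serialized into the final document, the restored fields drive the index and document_type without polluting what is stored in Elasticsearch.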