@metadata gets lost in Logstash redis output/input

Hey there, I'm trying to give the Filebeat modules a try, but I'm stuck processing the @metadata field. My setup looks like this:

filebeat > logstash forwarder > redis > logstash processor > elasticsearch

Logstash forwarder config:

input {
  beats {
    port => 5044
  }
}
output {
  redis {
    host => [ "redis1", "redis2", "redis3" ]
    data_type => "list"
    key => "logstash"
    codec => json
  }
}

Logstash processor config:

input {
  redis {
    host => "redis1"
    key => "logstash"
    data_type => "list"
    codec => json
  }
  redis {
    host => "redis2"
    key => "logstash"
    data_type => "list"
    codec => json
  }
  redis {
    host => "redis3"
    key => "logstash"
    data_type => "list"
    codec => json
  }
}
filter {
  if [fileset][module] == "system" {
    if [fileset][name] == "auth" {
      grok {
        match => { "message" => ["%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} sshd(?:\[%{POSINT:[system][auth][pid]}\])?: %{DATA:[system][auth][ssh][event]} %{DATA:[system][auth][ssh][method]} for (invalid user )?%{DATA:[system][auth][user]} from %{IPORHOST:[system][auth][ssh][ip]} port %{NUMBER:[system][auth][ssh][port]} ssh2(: %{GREEDYDATA:[system][auth][ssh][signature]})?",
                  "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} sshd(?:\[%{POSINT:[system][auth][pid]}\])?: %{DATA:[system][auth][ssh][event]} user %{DATA:[system][auth][user]} from %{IPORHOST:[system][auth][ssh][ip]}",
                  "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} sshd(?:\[%{POSINT:[system][auth][pid]}\])?: Did not receive identification string from %{IPORHOST:[system][auth][ssh][dropped_ip]}",
                  "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} sudo(?:\[%{POSINT:[system][auth][pid]}\])?: \s*%{DATA:[system][auth][user]} :( %{DATA:[system][auth][sudo][error]} ;)? TTY=%{DATA:[system][auth][sudo][tty]} ; PWD=%{DATA:[system][auth][sudo][pwd]} ; USER=%{DATA:[system][auth][sudo][user]} ; COMMAND=%{GREEDYDATA:[system][auth][sudo][command]}",
                  "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} groupadd(?:\[%{POSINT:[system][auth][pid]}\])?: new group: name=%{DATA:[system][auth][groupadd][name]}, GID=%{NUMBER:[system][auth][groupadd][gid]}",
                  "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} useradd(?:\[%{POSINT:[system][auth][pid]}\])?: new user: name=%{DATA:[system][auth][user][add][name]}, UID=%{NUMBER:[system][auth][user][add][uid]}, GID=%{NUMBER:[system][auth][user][add][gid]}, home=%{DATA:[system][auth][user][add][home]}, shell=%{DATA:[system][auth][user][add][shell]}$",
                  "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} %{DATA:[system][auth][program]}(?:\[%{POSINT:[system][auth][pid]}\])?: %{GREEDYMULTILINE:[system][auth][message]}"] }
        pattern_definitions => {
          "GREEDYMULTILINE" => "(.|\n)*"
        }
        remove_field => "message"
      }
      date {
        match => [ "[system][auth][timestamp]", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
      }
      geoip {
        source => "[system][auth][ssh][ip]"
        target => "[system][auth][ssh][geoip]"
      }
    }
    else if [fileset][name] == "syslog" {
      grok {
        match => { "message" => ["%{SYSLOGTIMESTAMP:[system][syslog][timestamp]} %{SYSLOGHOST:[system][syslog][hostname]} %{DATA:[system][syslog][program]}(?:\[%{POSINT:[system][syslog][pid]}\])?: %{GREEDYMULTILINE:[system][syslog][message]}"] }
        pattern_definitions => { "GREEDYMULTILINE" => "(.|\n)*" }
        remove_field => "message"
      }
      date {
        match => [ "[system][syslog][timestamp]", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
      }
    }
  }
}
output {
  elasticsearch {
    hosts => "localhost"
    manage_template => false
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}

The index created is %{[@metadata][beat]}-%{[@metadata][version]}-2012-12-11.

If I change the output in the Logstash forwarder from redis to stdout { codec => rubydebug { metadata => true } }, the @metadata fields exist. But when I route the events through redis and change the output in the Logstash processor to stdout { codec => rubydebug { metadata => true } }, there are no @metadata fields anymore.

Is there a way to use redis output/input and keep @metadata fields?

Is there a way to use redis output/input and keep @metadata fields?

No, you'll have to rename the fields. @metadata is, by design, ignored by outputs (at least one exception exists: the rubydebug codec has a metadata option that causes @metadata to be included).
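A minimal sketch of that workaround (the `[meta]` helper field name is made up): copy what the elasticsearch output needs out of @metadata before the redis output, then move it back into @metadata on the processor side.

```
# Logstash forwarder: preserve the metadata in a regular field
filter {
  mutate {
    add_field => {
      "[meta][beat]"    => "%{[@metadata][beat]}"
      "[meta][version]" => "%{[@metadata][version]}"
    }
  }
}

# Logstash processor: restore @metadata, then drop the helper field
filter {
  mutate {
    add_field => {
      "[@metadata][beat]"    => "%{[meta][beat]}"
      "[@metadata][version]" => "%{[meta][version]}"
    }
  }
  mutate {
    remove_field => [ "meta" ]
  }
}
```

With that in place, the `%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}` index sprintf in the elasticsearch output should resolve again.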

Thanks for your fast reply. Is there any reason why it is designed like that?

What I did now is to put an additional field in Filebeat and create my indexes from that. For anyone who is interested:

filebeat config
...
fields_under_root: true
fields:
    beattype: filebeat
...
logstash output
output {
  elasticsearch {
    hosts => "localhost"
    manage_template => false
    index => "%{[beattype]}-%{[beat][version]}-%{+YYYY.MM.dd}"
  }
}

Is there any reason why it is designed like that?

Metadata fields are supposed to be used as helper fields (like local variables) within a Logstash instance, with no risk of leaking the fields to outputs.
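That makes @metadata useful for things like computing a value once and referencing it in the output, without the scratch value ever being indexed. A small illustration (the index prefix is made up):

```
filter {
  mutate {
    # scratch value: outputs drop @metadata, so this never reaches Elasticsearch
    add_field => { "[@metadata][target_index]" => "myapp-%{+YYYY.MM.dd}" }
  }
}
output {
  elasticsearch {
    hosts => "localhost"
    index => "%{[@metadata][target_index]}"
  }
}
```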

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.