How to use custom fields from filebeat in logstash?

Hi everyone,
I'm new to Logstash (about 100 hours in). I have two servers running Filebeat; one is called x and the other is y.

In the configuration of x there is this (filebeat.yml, only the relevant lines are shown):
#================================ General =====================================

# The name of the shipper that publishes the network data. It can be used to group
# all the transactions sent by a single shipper in the web interface.
name: filebeat-x1

# The tags of the shipper are included in their own field with each
# transaction published.
#tags: ["service-X", "web-tier"]

# Optional fields that you can specify to add additional information to the
# output.
fields:
  env: production
  ip: 10.xxx.xx.xxx
  project: Globalscape

I am trying to use the fields.project value when building the index name, as follows (excerpt of the Logstash config):

input {
    beats {
        port => 5044
    }
}
filter {
  mutate {
    gsub => ["message", "\"", "`"]
  }
  csv {
    columns => ["ts", "debug_mode", "server_name", "method_name", "msg", "app_id", "statuscode", "statusmessage", "status"]
    separator => "|"
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    manage_template => true
    index => "%[fields.project]-%{[@metadata][beat]}-%{+YYYY}"
  }
}

and for some reason it's not working.
Can anyone help?

The easiest way to see what the data really looks like is to output it to stdout with a rubydebug codec. I do, however, suspect the issue is that you are not specifying the field reference correctly: Logstash field references need the `%{...}` wrapper, with each path segment in its own square brackets. Try this instead:

index => "%{[fields][project]}-%{[@metadata][beat]}-%{+YYYY}"
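As mentioned above, a rubydebug stdout output is the quickest way to confirm how the event is structured; a minimal sketch (used alongside, or instead of, the elasticsearch output while debugging):

```
output {
  stdout {
    codec => rubydebug
  }
}
```

In that output the custom Filebeat fields should appear nested under a top-level `fields` key (which is why the reference uses `[fields][project]`), unless `fields_under_root: true` is set in filebeat.yml. One caveat to check as well: Elasticsearch index names must be lowercase, so a value like `Globalscape` would need to be lowercased first (for example with a `mutate { lowercase => [...] }` filter) before it can be used in the index name.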

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.