How to add dynamic fields and index them with a Logstash filter?

I'm new to ELK. I have some Docker logs that I am successfully sending to Kibana.
However, next to the field names created by my filter, Kibana shows a '?', which means 'field not indexed'.

This is my filter:

    filter {
      if [type] == "docker_log" {
        json {
          source => "message"
          add_field => [ "received_at", "%{@timestamp}" ]
          add_field => [ "received_from", "%{host}" ]
        }
        mutate {
          rename => { "log" => "message" }
        }
        date {
          match => [ "time", "ISO8601" ]
        }
      }
    }

How can I:

  1. Make it so that the fields are indexed?
  2. Dynamically add every field from the JSON documents, even though I don't know their values ahead of time? I thought that was the whole point of having logs in JSON in the first place...
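To illustrate what I mean by point 2, here is a rough Python sketch of what I understand the `json` filter to do: every top-level key of the parsed document becomes an event field, without me declaring the field names up front. The sample log line and the `apply_json_filter` helper are my own invention, not Logstash code.

```python
import json

# Hypothetical Docker log line; real field names depend on the logging driver.
raw = '{"log": "connection accepted", "stream": "stdout", "time": "2016-03-01T12:00:00.000Z"}'

def apply_json_filter(message):
    """Mimic the json filter's behavior: merge every top-level key of the
    parsed JSON into the event, so fields appear dynamically."""
    event = {"message": message}
    event.update(json.loads(message))
    return event

event = apply_json_filter(raw)
print(sorted(event.keys()))  # every key from the JSON shows up as a field
```

So I would expect `log`, `stream`, and `time` to appear as fields on the event automatically, whatever keys the next document happens to contain.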

What does the mapping look like for the index?
