How to define the fields for each index when pushing the same document to multiple indexes?

Hello,

I am collecting logs and outputting them to multiple indexes.
Can I choose which fields are written to each index?

My Logstash output configuration is as follows.

Input event fields: ip, name, url, msg

output {
  if "agg" in [tags] {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "agg_%{+YYYY.MM.dd}"
    }
  }
  if "err" in [tags] {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "err_%{+YYYY.MM.dd}"
    }
  }
}

current:
agg_20190125: ip, name, url, msg
err_20190125: ip, name, url, msg

I want:
agg_20190125: ip, name, url
err_20190125: ip, msg

I see three possibilities:

  1. Use an Elasticsearch mapping parameter to exclude the fields from the documents.
    For this, the fields must be defined in the mapping of the index with

    enabled: false

    and then also be excluded from the _source field with

    "_source": {
      "excludes": [
        <list of unneeded fields>
      ]
    }

    see https://www.elastic.co/guide/en/elasticsearch/reference/current/enabled.html
    and https://www.elastic.co/guide/en/elasticsearch/reference/current/mapping-source-field.html#include-exclude
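
    For example, a minimal sketch of such a mapping for the agg index (the concrete index name and the field types are just guesses; in practice you would put this into an index template so every daily agg_* index picks it up). Note that enabled: false can only be set on the top-level mapping and on object fields, so msg is declared as a disabled object here, and this uses the typeless 7.x-style mapping syntax:

    # Hypothetical sketch: the agg index neither indexes "msg" nor keeps it
    # in _source ("#" comments work when pasted into the Kibana console).
    PUT agg_2019.01.25
    {
      "mappings": {
        "_source": {
          "excludes": [ "msg" ]
        },
        "properties": {
          "ip":   { "type": "ip" },
          "name": { "type": "keyword" },
          "url":  { "type": "keyword" },
          "msg":  { "type": "object", "enabled": false }
        }
      }
    }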

  2. Use Logstash to clone the event (e.g. within a ruby filter) and then edit the clones with Logstash filters until they look the way you want.
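
    A sketch of how that could look, using the standard clone filter instead of a ruby filter (field names and tags are taken from the question; this assumes the events do not already carry a type field, since the clone filter sets type on each copy to the clone name and leaves the original event unaltered):

    filter {
      # Create one copy of the event per entry in "clones";
      # the original event keeps flowing through as well.
      clone {
        clones => ["agg", "err"]
      }

      if [type] == "agg" {
        # The agg copy keeps ip, name, url.
        mutate {
          remove_field => ["msg"]
          add_tag => ["agg"]
        }
      } else if [type] == "err" {
        # The err copy keeps ip, msg.
        mutate {
          remove_field => ["name", "url"]
          add_tag => ["err"]
        }
      } else {
        # Drop the untouched original so only the two trimmed copies are indexed.
        drop { }
      }
    }

    The add_tag calls make the copies match the tag-based routing already used in the output block above.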

  3. Use an ingest pipeline on the index to remove the fields from the documents,
    see https://www.elastic.co/guide/en/elasticsearch/reference/current/pipeline.html
    You must then set the pipeline in the Logstash output definition with
    pipeline => "%{INGEST_PIPELINE}"
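
    For example, a minimal sketch with one pipeline per index (the pipeline names strip_for_agg and strip_for_err are hypothetical; the remove processor is a standard ingest processor):

    # Hypothetical pipelines: drop the fields each index should not keep.
    PUT _ingest/pipeline/strip_for_agg
    {
      "description": "drop msg before indexing into agg_*",
      "processors": [
        { "remove": { "field": "msg" } }
      ]
    }

    PUT _ingest/pipeline/strip_for_err
    {
      "description": "drop name and url before indexing into err_*",
      "processors": [
        { "remove": { "field": ["name", "url"] } }
      ]
    }

    Since the pipeline name is fixed per output branch here, it can be referenced statically (and pipeline => "strip_for_err" in the err branch); the %{INGEST_PIPELINE} sprintf form is only needed when the name comes from a field on the event:

    if "agg" in [tags] {
      elasticsearch {
        hosts => ["localhost:9200"]
        index => "agg_%{+YYYY.MM.dd}"
        pipeline => "strip_for_agg"
      }
    }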

Thank you, Shaoranlaos!
I really appreciate your answers.
