jmkim
January 25, 2019, 7:09am
1
Hello,
I am collecting logs and trying to send them to multiple indexes.
Can I control which fields are written to each index?
My Logstash output configuration is as follows.
input: ip, name, url, msg
output {
  if "agg" in [tags] {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "agg_%{+YYYY.MM.dd}"
    }
  }
  if "err" in [tags] {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "err_%{+YYYY.MM.dd}"
    }
  }
}
Currently I get:
agg_20190125: ip, name, url, msg
err_20190125: ip, name, url, msg
What I want:
agg_20190125: ip, name, url
err_20190125: ip, msg
Shaoranlaos
(Christian Stockhaus)
January 25, 2019, 9:32am
2
I see three possibilities:
1. Use the Elasticsearch mapping parameters to exclude the fields from the documents. For this, the fields must be defined in the mapping of the index with
enabled: false
and then also be excluded from the _source field with
"_source": {
  "excludes": [
    <list of unneeded fields>
  ]
}
See https://www.elastic.co/guide/en/elasticsearch/reference/current/enabled.html
and https://www.elastic.co/guide/en/elasticsearch/reference/current/mapping-source-field.html#include-exclude
(see the first sketch after this list)
2. Use Logstash to clone the event (within a ruby filter) and then edit the cloned events with Logstash filters so that each copy carries only the fields you want (see the second sketch after this list).
3. Use an ingest pipeline on the index to remove the fields from the documents;
see https://www.elastic.co/guide/en/elasticsearch/reference/current/pipeline.html
You must then reference the pipeline in the Logstash elasticsearch output with
pipeline => "%{INGEST_PIPELINE}"
(see the third sketch after this list)
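For the first option, here is a minimal sketch of an index template for the agg_* indices, assuming msg is the field to drop there; the template name agg_fields is only an example, and depending on your Elasticsearch version the mapping may need an extra document-type level such as _doc. msg is disabled so it is not parsed or indexed, and it is also stripped from _source:

PUT _template/agg_fields
{
  "index_patterns": ["agg_*"],
  "mappings": {
    "_source": {
      "excludes": ["msg"]
    },
    "properties": {
      "msg": {
        "type": "object",
        "enabled": false
      }
    }
  }
}

A corresponding template for err_* would disable and exclude name and url instead.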
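For the second option, here is a minimal sketch that uses the standard clone filter plugin instead of a ruby filter, assuming every event arrives with all four fields. The clone filter emits the original event unchanged plus one copy whose type field is set to the name given in clones; you would still have to adapt your tag handling so your existing output conditionals route the two variants to the right index:

filter {
  # emit the original event plus one copy with [type] == "err"
  clone {
    clones => ["err"]
  }

  if [type] == "err" {
    # the copy for err_*: keep only ip and msg
    mutate { remove_field => ["name", "url"] }
  } else {
    # the original for agg_*: keep ip, name, url
    mutate { remove_field => ["msg"] }
  }
}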
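For the third option, a minimal sketch of an ingest pipeline with a remove processor; the pipeline name strip_for_agg is only an example:

PUT _ingest/pipeline/strip_for_agg
{
  "description": "remove msg before indexing into agg_*",
  "processors": [
    { "remove": { "field": "msg" } }
  ]
}

It is then referenced in the Logstash elasticsearch output, either with a fixed name as below or dynamically via a field reference like the %{INGEST_PIPELINE} example above:

elasticsearch {
  hosts    => ["localhost:9200"]
  index    => "agg_%{+YYYY.MM.dd}"
  pipeline => "strip_for_agg"
}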
jmkim
January 28, 2019, 2:41am
3
Thank you, Shaoranlaos!
I really appreciate your answers.
system
(system)
Closed
February 25, 2019, 2:41am
4
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.