When using elasticsearch as an input and adding fields in a Logstash filter, the fields are also being added to the documents in the source Elasticsearch index.
This is not the expected behaviour. Consider this configuration:
input {
  elasticsearch {
    hosts => ["elasticsearch:9200"]
    index => "*"
    query => '{ "query": { "query_string": { "query": "*" } } }'
    scroll => "5m"
    docinfo => true
  }
}

filter {
  mutate {
    add_field => {
      "is_migration" => true
      "_agent" => "http"
    }
  }
}

output {
  http {
    url => "https://ingest.example.com"
    http_method => "post"
    content_type => "application/json"
  }
}
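To be clear about what I expect: the only place the added fields should ever appear is in the JSON body that the http output POSTs for each event, roughly like this (the message field and all values here are placeholders, not taken from my actual data):

  {
    "@timestamp": "2024-01-01T00:00:00.000Z",
    "message": "original document body",
    "is_migration": "true",
    "_agent": "http"
  }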
Now, the source documents in the input Elasticsearch index are all being modified: every document has an is_migration field and an _agent field.
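For illustration, re-fetching any of the source documents after the pipeline has run (e.g. GET my-index/_doc/1, where the index name and id are hypothetical) returns a _source that now contains the injected fields:

  {
    "_index": "my-index",
    "_id": "1",
    "found": true,
    "_source": {
      "message": "original document body",
      "is_migration": "true",
      "_agent": "http"
    }
  }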
How could a filter process modify the input store? It should modify the object in the event queue, but there is no way it should be able to modify the document in the input source.
Yes, I'm aware of the @metadata field, but how was this not spotted in production environments?
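For reference, the @metadata route I'm aware of would look something like the sketch below (the docinfo_target value and field names are only illustrative). Fields under [@metadata] travel with the in-flight event but are not serialized by outputs, which is why I would not expect them to leak anywhere:

  input {
    elasticsearch {
      hosts => ["elasticsearch:9200"]
      index => "*"
      docinfo => true
      # keep _index/_id off the document body itself
      docinfo_target => "[@metadata][doc]"
    }
  }

  filter {
    mutate {
      # @metadata fields are visible to filters/outputs but not emitted in the payload
      add_field => { "[@metadata][is_migration]" => "true" }
    }
  }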