This is my Logstash config file:
input {
  http {
    # The es_bulk codec parses Elasticsearch bulk-format bodies and stores the
    # per-document action metadata (e.g. _index, _type) under [@metadata].
    id => "bulkHttpInput"
    port => 8088
    additional_codecs => { "application/json" => "es_bulk" }
    codec => es_bulk
  }
}
filter {
  mutate {
    # Drop the HTTP request headers added by the http input.
    remove_field => [ "headers" ]
  }
}
output {
  elasticsearch {
    id => "elasticOutputOfHttp"
    index => "%{[@metadata][_index]}"
    document_type => "%{[@metadata][_type]}"
    document_id => "%{[doc][docID]}"
    doc_as_upsert => "true"
  }
}
and the output looks like this:
{
    "host" => "127.0.0.1",
    "@timestamp" => 2019-05-07T14:22:58.364Z,
    "@version" => "1",
    "doc" => {
        "field1" => "my_field1",
        "field2" => "my_field2",
        "field3" => "my_field3"
    }
}
I want to move all of the fields inside doc up to the root level. This can be done by defining an add_field (or rename) entry for each nested field and then removing doc, as sketched below, but the field names are dynamic: for example, a "field4" may be added and other fields may be removed. How can I do this dynamically?
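For reference, the static (non-dynamic) version of that approach would look roughly like the sketch below. The field names are just the ones from the sample event above, and rename is used here as a shorthand for the add_field-plus-remove_field pattern:

filter {
  mutate {
    # Move each known nested field from doc up to the root level
    # (these names are only illustrative).
    rename => {
      "[doc][field1]" => "field1"
      "[doc][field2]" => "field2"
      "[doc][field3]" => "field3"
    }
  }
  mutate {
    # Drop the now-empty doc wrapper.
    remove_field => [ "doc" ]
  }
}

This works only for fields that are known in advance, which is exactly the limitation described above.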