I am facing the following issue with Elasticsearch and Logstash.
Every day Logstash creates an index named with the current date, e.g. mydata_2022.05.12,
and I have an index pattern named mydata_*.
I fetch documents from an API and index them with Logstash, so a document indexed on 2022.05.11
and then updated on 2022.05.12 ends up duplicated across the two daily indices.
Is there any way to update or delete the first copy automatically?
I'm pretty new to Logstash, so any help is much appreciated :).
Here is my Logstash config:
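One idea I had, in case it helps frame the question: since I already set document_id, I believe writing to a single index instead of a daily one would make each update overwrite the previous version instead of duplicating it. A sketch of what I mean (untested; the action and doc_as_upsert settings are my assumption about how the elasticsearch output handles updates):

```
output {
  elasticsearch {
    hosts         => ["http://localhost:9200"]
    # one fixed index, so the same document_id overwrites the old version
    index         => "mydata"
    document_id   => "%{[data][uuid]}"
    action        => "update"
    doc_as_upsert => true
  }
}
```

But I'm not sure this is the right approach if daily indices are needed, hence my question.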
input {
  file {
    path => "/etc/logstash/conf.d/documents/*.json"
    mode => "read"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    codec => multiline {
      negate => true
      what => "previous"
      pattern => '^\{'
      max_lines => 10000000
    }
    type => "json"
    file_completed_action => "log_and_delete"
    file_completed_log_path => "/etc/logstash/conf.d/documents/files.log"
  }
}
filter {
  json {
    source => "message"
  }
}
output {
  stdout {
    codec => rubydebug {
      metadata => false
    }
  }
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "mydata_%{+yyyy.MM.dd}"
    document_id => "%{[data][uuid]}"
  }
}
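The other direction I looked at, for cleaning up the copies that already exist in older daily indices, is Elasticsearch's _delete_by_query API: query mydata_* for the document's uuid while excluding today's index, and delete the matches. A minimal sketch of building that request body in Python (the field path data.uuid matches the document_id sprintf above; the function name is mine):

```python
import json

def build_delete_stale_body(uuid: str, current_index: str) -> dict:
    """Build a _delete_by_query body that deletes every copy of `uuid`
    except the one living in `current_index` (the _index metadata field
    is queryable with a term query)."""
    return {
        "query": {
            "bool": {
                "must": [{"term": {"data.uuid": {"value": uuid}}}],
                "must_not": [{"term": {"_index": current_index}}],
            }
        }
    }

# The body would then be POSTed to
# http://localhost:9200/mydata_*/_delete_by_query
body = build_delete_stale_body("abc-123", "mydata_2022.05.12")
print(json.dumps(body, indent=2))
```

I haven't tried wiring this into the pipeline itself, so I'd also be interested in whether something like this can run automatically after each indexing run.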