My Logstash pipeline is not updating the existing document for the same ID. I have a couple of fields that get updated frequently, for example Last Modified Date.
I have the Logstash config below:
output {
  elasticsearch {
    id => "api-main"
    hosts => ["https://localhost:9200"]
    #ssl => true
    #cacert => "/usr/share/logstash/certs/http_ca.crt"
    cacert => '/etc/logstash/config/certs/ca.crt'
    user => "elastic"
    password => "xxxxxxx"
    index => "change-dev-%{+YYYY-MM-dd}"
    #document_id => "%{[response][body][entries][values][Request_ID]}"
    document_id => "%{[response][body][entries][values][Change_ID]}"
    #document_id => "%{[@metadata][_id]}"
    action => "update"
    doc_as_upsert => true
    #data_stream_sync_fields => true
  }
  stdout {
    codec => rubydebug {
      metadata => true
    }
  }
}
If I use action => "create", the index is created matching the index template, which uses a data stream. But if a field later changes (for example, the 'Last Modified Date' field), Logstash doesn't update that field in Elasticsearch. I tried using action => "update" with doc_as_upsert => true, but it's not working.
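For reference, the same output with action => "create" does index documents successfully (a minimal excerpt; all other settings are unchanged from the config above):

elasticsearch {
  # ... same hosts/auth/index/document_id settings as above ...
  action => "create"   # documents get created, but later field changes are never applied
}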
I am also getting the exception below in the logs:
[WARN ] 2023-03-31 18:37:16.716 [[main]>worker0] elasticsearch - Could not index event to Elasticsearch. status: 400, action: ["update", {:_id=>"CRQ000000000081", :_index=>"change-dev-2023-03-31", :routing=>nil, :retry_on_conflict=>1},
"status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"only write ops with an op_type of create are allowed in data streams"}}}
Please suggest.