Hi Team,
My use case is to update an index whenever a new JSON document is received.
Logstash will receive the event and update the ES index.
Example:
At 01:00 AM:
index name: currencies
5 documents inserted:
source - destination - offer%
USD - GBP - 2%
USD - EUR - 2%
USD - YEN - 3%
GBP - USD - 0.2%
EUR - USD - 0.2%
Note: here we have only 3 documents with source=USD.
Next, at 01:10 AM:
index name: currencies
only 5 rows received:
source - destination - offer%
USD - GBP - 10%
USD - EUR - 10%
USD - YEN - 10%
USD - JPY - 10%
USD - CNY - 10%
Expected: I want to delete only the old USD documents, insert the 5 newly received USD documents, and retain the 2 existing non-USD documents.
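To get that behaviour, I am assuming I could first remove the old USD documents with a delete_by_query before indexing the new batch. A rough sketch only (index and field names are taken from my example above, and it assumes the source field is mapped as keyword):

# remove every existing document whose source is USD
POST currencies/_delete_by_query
{
  "query": {
    "term": { "source": "USD" }
  }
}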
I do not have fixed-interval re-indexing, i.e. there is no time-specific indexing job. As soon as data arrives in Elasticsearch (at irregular intervals), I need to index it and make it available for consumer search. In that case I would like to replace only the existing documents that are impacted. Do I need to handle this using a primary-key concept?
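For the primary-key idea, I was thinking of something like the following in the Logstash elasticsearch output. This is only a sketch: the hosts value and the composite id format are my assumptions, and the source/destination field names come from the example above.

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]   # assumed ES endpoint
    index => "currencies"
    # treat source+destination as a composite "primary key" so a re-sent
    # pair overwrites the old document instead of creating a duplicate
    document_id => "%{source}-%{destination}"
    action => "index"
  }
}

My doubt is that this only overwrites pairs that are re-sent; it would not remove a USD pair that stops appearing in a later batch, which is why I am asking whether I also need a delete step like the one above.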