Hello,
We have the information stored in spreadsheets, and we use Logstash 2.4.1 to index the data into Elasticsearch 2.4.4 with a configuration script that fetches the data from a local MySQL server. Now, if I have about a million entries about people indexed in Elasticsearch and about a thousand updates to those people's permanent addresses or phone numbers, what would be the best way to get the newer information into Elasticsearch in order to save time?
You will need a unique id for each person, and whenever a person's address changes, use Logstash to re-index the entire document with the same unique id as before — Elasticsearch will then overwrite the old document instead of creating a duplicate. Logstash also has an input plugin for MySQL (the JDBC input plugin), so you can load the data directly from the source instead of going through spreadsheets.
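A minimal sketch of what that pipeline could look like, assuming a `people` table with an `id` primary key and an `updated_at` timestamp column (both assumptions — adjust to your schema), and placeholder credentials and paths:

```
input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://localhost:3306/people_db"  # assumed database name
    jdbc_user => "user"
    jdbc_password => "password"
    jdbc_driver_library => "/path/to/mysql-connector-java.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    # Only fetch rows changed since the last run; :sql_last_value is the
    # time of the previous execution, tracked by the plugin
    statement => "SELECT * FROM people WHERE updated_at > :sql_last_value"
    schedule => "*/5 * * * *"  # poll every 5 minutes
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "people"
    document_type => "person"
    # Reuse the MySQL primary key as the Elasticsearch _id, so a changed
    # row overwrites the existing document instead of creating a new one
    document_id => "%{id}"
  }
}
```

The key line is `document_id => "%{id}"`: with a stable id, re-indexing the thousand changed rows updates them in place, and the incremental `WHERE` clause means you never re-read the full million rows.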