I am trying to add new data to an existing index in Elasticsearch by running the same config file in Logstash. I already have a Kibana dashboard created for that index.
My question is: does Logstash copy the old data again along with the new data? Does it affect the dashboards I have created?
If you process the same event through an elasticsearch output a second time, then unless you set the document_id option you will get a second, duplicate document in the index. It depends on what your input data looks like, but you may be able to generate a unique document_id using a fingerprint filter.
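For example, here is a minimal sketch; the source fields are assumptions on my part, so use whatever combination uniquely identifies an event in your data:

    filter {
      fingerprint {
        # assumed fields -- pick whatever uniquely identifies an event
        source => ["host", "message", "@timestamp"]
        concatenate_sources => true
        method => "SHA256"
        target => "[@metadata][fingerprint]"
      }
    }
    output {
      elasticsearch {
        # the fingerprint becomes the document id, so re-processing the
        # same event overwrites the existing document instead of duplicating it
        document_id => "%{[@metadata][fingerprint]}"
      }
    }

Fields under [@metadata] are not written to the index, so the fingerprint is used as the id without cluttering the stored document.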
Yes, you can do it. You have to make sure you use the same document_id; everything very much depends on a unique document_id.
When I first started working on this I had to go through the same thing, and it is not explained properly anywhere.
Anyway, whatever you do, make sure you create your own document_id from your data so you can update documents later.
In your output section you have to use action => "update" and it will update the document if it exists.
For example, say you already have three fields, job, machine_name, and center, in your index "job_data":
    input {
      # read data here -- one of the fields is "job", which is unique;
      # it also reads machine_name and center
    }
    output {
      elasticsearch {
        index       => "job_data"
        document_id => "%{job}"
      }
    }
Now, you said you are going to read more fields and add them to the records. You can read job, machine_name, center, and a new field, "status". Use action => "update" in the output section and it will update the record and add the new field for you, as sketched below.
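Putting that together, the output section might look like this; doc_as_upsert is my own addition (an assumption), telling Logstash to insert the document if it does not exist yet rather than failing the update:

    output {
      elasticsearch {
        index         => "job_data"
        document_id   => "%{job}"
        action        => "update"
        # assumption: insert the document if it does not exist yet
        doc_as_upsert => true
      }
    }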
It is always good to post an example; that way, whoever is trying to answer knows what you are trying to achieve.