Elasticsearch reindex using logstash

Hello experts,

I am re-indexing data using Logstash 2.3.3 from ES 1.4.4 to ES 2.4.0. My old index holds 70,000 documents, and 14,000 of them had been re-indexed into ES 2.4.0 when Logstash crashed with an error.
My questions are:

  1. How can I resume from where Logstash stopped?
  2. Is there a way to re-index only the documents that have not been re-indexed yet?

My Logstash configuration

input {
  elasticsearch {
    hosts => "ip:9200" # source 1.4.4
    index => "myidex"
    query => '{ "query": { "match_all": {} } }'
    scroll => "20m"
    docinfo => true
  }
}
filter {
  mutate {
    remove_field => ["@version", "@timestamp"]
  }
}
output {
  elasticsearch {
    hosts => "ip:9200" # target 2.4.0
    index => "%{[@metadata][_index]}"
    document_type => "%{[@metadata][_type]}"
    document_id => "%{[@metadata][_id]}"
  }
}

Well, since you're using a document ID, it will overwrite the existing document. I do this with my heartbeat data.

But if you are just re-indexing, why not use the internal reindex feature of Elasticsearch?
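For reference, a sketch of the internal reindex API (available from Elasticsearch 2.3 onward). Note that in this version it only works within a single cluster, so it cannot pull from a separate 1.4.4 cluster; the target index name here is a hypothetical example:

```shell
# Reindex within one cluster (ES 2.3+). "myidex" is the source index from the
# config above; "myidex_new" is a hypothetical target on the same cluster.
curl -XPOST 'http://ip:9200/_reindex' -d '{
  "source": { "index": "myidex" },
  "dest":   { "index": "myidex_new" }
}'
```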

Or you could even just delete the new index and start again.
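Deleting the partially filled target index is a single call; a sketch, assuming the target index carries the same name as the source:

```shell
# Remove the partially re-indexed target index so the job can start clean.
curl -XDELETE 'http://ip:9200/myidex'
```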

Thanks for your response, eperry ED. I am migrating data from a 1.4.4 ES cluster to a 2.4.0 ES cluster, so I think _reindex will not help in this case. I could also start fresh, but the problem is that I have already re-indexed 3 million documents, and if any error occurs after a fresh start, I would have to begin all over again.

What error?
Have you looked at the LS logs?

"Attempted to send a bulk request to Elasticsearch configured at '[IP:9200]', but Elasticsearch appears to be unreachable or down!", :error_message=>"Connection reset", :
Even so, the target was up and serving search requests fine.
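One thing that sometimes helps with "Connection reset" errors during bulk indexing is shrinking the bulk batches. A sketch, assuming the `flush_size` option of the Logstash 2.x elasticsearch output plugin (default 500):

```
output {
  elasticsearch {
    hosts => "ip:9200" # target 2.4.0
    index => "%{[@metadata][_index]}"
    document_type => "%{[@metadata][_type]}"
    document_id => "%{[@metadata][_id]}"
    flush_size => 100 # smaller bulk requests; assumed to ease pressure on the target
  }
}
```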

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.