Send the same document to multiple Elasticsearch indices

I am processing incoming documents with Logstash, and I would like to send each document to multiple different indices. I have found this approach to the problem:

output {
    elasticsearch {
        host => "localhost"
        protocol => "http"
        index => "first_indexer"
    }
    elasticsearch {
        host => "localhost"
        protocol => "http"
        index => "second_indexer"
    }
}
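
(Side note: host and protocol are options from the 1.x series of the elasticsearch output plugin; on plugin version 2.0 and later they were, as far as I know, replaced by a single hosts option:)

elasticsearch {
    hosts => ["http://localhost:9200"]   # replaces host + protocol on 2.x+
    index => "first_indexer"
}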

I have a doubt about this approach:
If a document fails with an error (let's say a mapping parser exception) while being indexed into the first index, will it also fail to be indexed into the second index? And what happens to the documents that come after it?

What you're describing is by design.
If one output fails, execution of the entire pipeline stalls. Logstash does this deliberately: it applies backpressure rather than queueing events indefinitely or silently dropping them.
You may want to look into cross-cluster replication.
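
If you need the two outputs to fail independently of each other, recent Logstash versions (6.0+) support pipeline-to-pipeline communication, which the docs call the output isolator pattern. Here is a minimal sketch; the pipeline ids, the beats input, and the hosts value are illustrative, not from this thread. Each downstream pipeline gets its own persistent queue, so a blocked Elasticsearch output buffers to disk instead of stalling its sibling:

# config/pipelines.yml -- one reader pipeline fanning out to two isolated outputs
- pipeline.id: intake
  config.string: |
    input { beats { port => 5044 } }
    output { pipeline { send_to => [first_es, second_es] } }

- pipeline.id: first_es
  queue.type: persisted
  config.string: |
    input { pipeline { address => first_es } }
    output { elasticsearch { hosts => ["localhost:9200"] index => "first_indexer" } }

- pipeline.id: second_es
  queue.type: persisted
  config.string: |
    input { pipeline { address => second_es } }
    output { elasticsearch { hosts => ["localhost:9200"] index => "second_indexer" } }

With this layout, an outage or persistent failure of first_indexer only backs up the first_es pipeline's queue; second_indexer keeps receiving events.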

Why do you have so many clusters in the first place?

Thanks for the response.
We have only a single cluster, but different indices, and we need to send the same documents to those different indices.

Why would you want to do that?
Sorry for all the questions, I'm just trying to understand your use case.

Yeah, let me explain why I am doing that.
I was reindexing my data, and I have completed the reindexing from my "old_index" to "new_index". But before shifting entirely to "new_index", I would like to verify for a few days that the mapping I defined in "new_index" works fine, so I want the data to still be indexed into "old_index" as well. I hope this makes sense.
