We are considering the question: how does FSCrawler manage failures?
What happens if FSCrawler fails to upload a certain file because of a failure on the Elasticsearch side? Does FSCrawler implement an automatic failure-handling mechanism?
Consider the scenario where FSCrawler fails to upload a file because Elasticsearch is down.
- Can we configure FSCrawler to retry uploading the file? Can we configure something like a number_retries setting?
- Can we have FSCrawler log the following:
a. File on which failure happened
b. Exception information
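For reference, the retry-and-log behaviour asked about above could be sketched as follows. This is purely a hypothetical illustration of the desired behaviour, not FSCrawler's actual code or configuration; the names `upload_with_retries`, `upload`, and the `number_retries` parameter are invented here for the sketch.

```python
import time


def upload_with_retries(upload, path, number_retries=3, delay_seconds=1.0):
    """Try to upload one file, retrying on failure (hypothetical sketch).

    `upload` is any callable that raises on failure (for example a wrapper
    around an Elasticsearch index request); `number_retries` mirrors the
    hypothetical setting mentioned above.
    """
    last_error = None
    for attempt in range(1, number_retries + 1):
        try:
            return upload(path)
        except Exception as exc:
            last_error = exc
            # Record exactly the two pieces of information requested:
            # (a) the file on which the failure happened, (b) the exception.
            print(f"attempt {attempt}/{number_retries} failed for {path}: {exc}")
            time.sleep(delay_seconds)
    # All retries exhausted: surface the failure to the caller.
    raise RuntimeError(f"giving up on {path}") from last_error
```

With such a scheme, a transient Elasticsearch outage would simply cost a few retries, while a persistent one would still fail loudly with the file name and the last exception attached.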
Finally, what is the structure (i.e., the fields) of the log file "documents.log"? I looked at that file but did not find any reference to the name of the file on which the failure happened.