Hi dadoonet,
I realised that in the settings file I had the REST service's IP address set incorrectly (a typo). After correcting it, the crawler gets further, but it now says that E:\ doesn't exist.
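For reference, this is roughly the shape of my _settings.json after the fix (values anonymized, and I'm writing it from memory of the 2.4 docs, so some field names may be slightly off):

```json
{
  "name" : "bamdocs",
  "fs" : {
    "url" : "E:\\",
    "update_rate" : "15m"
  },
  "elasticsearch" : {
    "nodes" : [ {
      "host" : "127.0.0.1",
      "port" : 9200,
      "scheme" : "HTTP"
    } ],
    "index" : "bamindex"
  },
  "rest" : {
    "scheme" : "HTTP",
    "host" : "127.0.0.1",
    "port" : 8080
  }
}
```

The host under `rest` is where I had the typo.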
Here's the log:
```
11:58:05,843 DEBUG [f.p.e.c.f.u.FsCrawlerUtil] Mapping [2/_settings.json] already exists
11:58:05,858 DEBUG [f.p.e.c.f.u.FsCrawlerUtil] Mapping [2/_settings_folder.json] already exists
11:58:05,858 DEBUG [f.p.e.c.f.u.FsCrawlerUtil] Mapping [5/_settings.json] already exists
11:58:05,858 DEBUG [f.p.e.c.f.u.FsCrawlerUtil] Mapping [5/_settings_folder.json] already exists
11:58:05,858 DEBUG [f.p.e.c.f.u.FsCrawlerUtil] Mapping [6/_settings.json] already exists
11:58:05,858 DEBUG [f.p.e.c.f.u.FsCrawlerUtil] Mapping [6/_settings_folder.json] already exists
11:58:05,858 DEBUG [f.p.e.c.f.FsCrawler] Starting job [bamdocs]...
11:58:08,166 DEBUG [f.p.e.c.f.c.ElasticsearchClient] Using elasticsearch >= 5, so we can use ingest node feature
11:58:08,651 WARN  [f.p.e.c.f.FsCrawler] We found old configuration index settings in [C:\Program Files\Elastic\FsCrawler\Jobs] or [C:\Program Files\Elastic\FsCrawler\Jobs\bamdocs\_mappings]. You should look at the documentation about upgrades: https://github.com/dadoonet/fscrawler#upgrade-to-23
11:58:08,651 INFO  [f.p.e.c.f.FsCrawlerImpl] Starting FS crawler
11:58:08,651 DEBUG [f.p.e.c.f.c.ElasticsearchClientManager] FS crawler connected to an elasticsearch [5.6.3] node.
11:58:08,651 DEBUG [f.p.e.c.f.c.ElasticsearchClient] create index [bamindex]
11:58:08,682 DEBUG [f.p.e.c.f.c.ElasticsearchClient] create index [bamdocs_folder]
11:58:08,697 DEBUG [f.p.e.c.f.FsCrawlerImpl] creating fs crawler thread [bamdocs] for [E:\] every [15m]
11:58:08,697 INFO  [f.p.e.c.f.FsCrawlerImpl] FS crawler started for [bamdocs] for [E:\] every [15m]
11:58:08,697 DEBUG [f.p.e.c.f.FsCrawlerImpl] Fs crawler thread [bamdocs] is now running. Run #1...
11:58:08,713 WARN  [f.p.e.c.f.FsCrawlerImpl] Error while crawling E:\: E:\ doesn't exists.
11:58:08,713 WARN  [f.p.e.c.f.FsCrawlerImpl] Full stacktrace
java.lang.RuntimeException: E:\ doesn't exists.
	at fr.pilato.elasticsearch.crawler.fs.FsCrawlerImpl$FSParser.run(FsCrawlerImpl.java:325) [fscrawler-2.4.jar:?]
	at java.lang.Thread.run(Unknown Source) [?:1.8.0_152]
11:58:08,713 INFO  [f.p.e.c.f.FsCrawlerImpl] FS crawler is stopping after 1 run
11:58:09,492 DEBUG [f.p.e.c.f.FsCrawlerImpl] Closing FS crawler [bamdocs]
11:58:09,492 DEBUG [f.p.e.c.f.FsCrawlerImpl] FS crawler thread is now stopped
11:58:09,492 DEBUG [f.p.e.c.f.FsCrawlerImpl] FS crawler Rest service stopped
11:58:09,492 DEBUG [f.p.e.c.f.c.ElasticsearchClientManager] Closing Elasticsearch client manager
11:58:09,492 DEBUG [f.p.e.c.f.c.ElasticsearchClient] Closing REST client
11:58:09,492 DEBUG [f.p.e.c.f.c.ElasticsearchClient] REST client closed
11:58:09,492 DEBUG [f.p.e.c.f.FsCrawlerImpl] ES Client Manager stopped
11:58:09,508 INFO  [f.p.e.c.f.FsCrawlerImpl] FS crawler [bamdocs] stopped
```
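To rule out the path itself, I'm going to check whether the JVM running FSCrawler can actually see E:\ with something like this (just a quick sketch; I'm wondering if e.g. a mapped network drive wouldn't be visible to the account the crawler runs under):

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class PathCheck {
    public static void main(String[] args) {
        // The root my FSCrawler job is configured to crawl.
        Path root = Paths.get("E:\\");

        // If either of these prints false, the "E:\ doesn't exists." error
        // reflects the JVM's view of the filesystem rather than anything
        // specific to FSCrawler.
        System.out.println(root + " exists:       " + Files.exists(root));
        System.out.println(root + " is directory: " + Files.isDirectory(root));
    }
}
```

I'll run it under the same user account the crawler uses and report back.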