FSCrawler not running

Hi group,

I just tried to run FSCrawler for the first time, and it did not work. I have been asked to include any logs, but I don't see any. I created a test job; here is its configuration, followed by the error message I received:

```
{
  "name" : "test",
  "fs" : {
    "url" : "/Users/pkasson/Downloads",
    "update_rate" : "15m",
    "excludes" : [ "~*" ],
    "json_support" : false,
    "filename_as_id" : false,
    "add_filesize" : true,
    "remove_deleted" : true,
    "add_as_inner_object" : false,
    "store_source" : false,
    "index_content" : true,
    "attributes_support" : false,
    "raw_metadata" : true,
    "xml_support" : false,
    "index_folders" : true,
    "lang_detect" : false,
    "continue_on_error" : false,
    "pdf_ocr" : true,
    "ocr" : {
      "language" : "eng"
    }
  },
  "elasticsearch" : {
    "nodes" : [ {
      "host" : "127.0.0.1",
      "port" : 9200,
      "scheme" : "HTTP"
    } ],
    "bulk_size" : 100,
    "flush_interval" : "5s"
  },
  "rest" : {
    "scheme" : "HTTP",
    "host" : "127.0.0.1",
    "port" : 8085,
    "endpoint" : "fscrawler"
  }
}
```
```
$ bin/fscrawler ./test test
07:02:17,566 WARN  [f.p.e.c.f.c.ElasticsearchClientManager] failed to create elasticsearch client, disabling crawler...
07:02:17,570 FATAL [f.p.e.c.f.FsCrawler] Fatal error received while running the crawler: [Connection refused]
07:02:17,579 INFO  [f.p.e.c.f.FsCrawlerImpl] FS crawler [test] stopped
07:02:17,580 INFO  [f.p.e.c.f.FsCrawlerImpl] FS crawler [test] stopped
```
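For context, "Connection refused" here just means nothing is listening on 127.0.0.1:9200, the node FSCrawler is configured to talk to. A minimal pre-flight check, assuming `curl` is installed, with the host and port taken from the `elasticsearch.nodes` setting above:

```shell
# Probe the Elasticsearch HTTP port FSCrawler expects (127.0.0.1:9200 per the
# job config). curl exits non-zero when the connection is refused or times out.
if curl -s --max-time 2 http://127.0.0.1:9200/ >/dev/null; then
  echo "elasticsearch is reachable"
else
  echo "connection refused: start elasticsearch before running FSCrawler"
fi
```

If the probe fails, FSCrawler will fail the same way, so this is worth running first.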

Please format your code, logs, or configuration files using the </> icon as explained in this guide, and not the citation button. It will make your post more readable.

Or use markdown style like:

```
CODE
```

There's a live preview panel for exactly this reason.

I edited your post.

Can you share the elasticsearch logs please?

Where do I find the logs?


Where do I find the logs?

Depends on how you installed elasticsearch.

I just unzipped it and ran it from the command line.


So the logs are in the logs dir and are also printed to the console.

I posted what was sent to the console. There is no logs folder.

Just curious: is FSCrawler by itself sufficient to test out the crawls, or is there another component I have not installed yet? I am wondering if there is an engine component this thing is trying to connect to. I wouldn't think so, but I'm new to this.

It needs elasticsearch to be started.
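For an unzipped archive install like the one described above, a minimal sketch of starting it looks like this (the directory name `elasticsearch-6.5.0` is just an example; use whatever your download actually unpacked to):

```shell
# Path to the unzipped Elasticsearch distribution (example name, adjust to yours).
ES_HOME=./elasticsearch-6.5.0

# Start a single node in the foreground (Ctrl-C stops it). By default it
# listens on http://127.0.0.1:9200, which matches the FSCrawler job config.
# Once it is running, log files also accumulate under "$ES_HOME/logs".
if [ -x "$ES_HOME/bin/elasticsearch" ]; then
  "$ES_HOME/bin/elasticsearch"
else
  echo "no elasticsearch install found at $ES_HOME"
fi
```

Leave that terminal running and start FSCrawler from a second one.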

Is that an open source / community service, or is it a commercial product?

See Elasticsearch: The Official Distributed Search & Analytics Engine | Elastic

Open source and free to use.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.