Starting FSCrawler with SSL error

I'm missing part of the logs, but what do you have in this dir?

/home/admin/Downloads

url should be the URL where you have the documents you want to index, not the directory where you have installed FSCrawler.

David, the URL /home/admin/Downloads really is the folder I want crawled.

In this folder there is an empty job_name folder...
Where are the indexed files located?

23:11:12,439 DEBUG [f.p.e.c.f.f.FsCrawlerUtil] computeVirtualPathName(/home/admin/Downloads, /home/admin/Downloads/fs2es-indexer-develop/README.md) = /fs2es-indexer-develop/README.md
23:11:12,439 DEBUG [f.p.e.c.f.f.FsCrawlerUtil] directory = [false], filename = [/fs2es-indexer-develop/README.md], includes = [null], excludes = [[*/~*, /appdata/*, /domains/*, /isos/*]]
23:11:12,439 DEBUG [f.p.e.c.f.f.FsCrawlerUtil] filename = [/fs2es-indexer-develop/README.md], excludes = [[*/~*, /appdata/*, /domains/*, /isos/*]]
23:11:12,439 DEBUG [f.p.e.c.f.f.FsCrawlerUtil] checking exclusion for filename = [/fs2es-indexer-develop/README.md], matches = [[*/~*, /appdata/*, /domains/*, /isos/*]]
23:11:12,439 TRACE [f.p.e.c.f.f.FsCrawlerUtil] ❌ [/fs2es-indexer-develop/readme.md] does not match exclusion regex [.*/~.*] (was [*/~*])
23:11:12,439 TRACE [f.p.e.c.f.f.FsCrawlerUtil] ❌ [/fs2es-indexer-develop/readme.md] does not match exclusion regex [/appdata/.*] (was [/appdata/*])
23:11:12,439 TRACE [f.p.e.c.f.f.FsCrawlerUtil] ❌ [/fs2es-indexer-develop/readme.md] does not match exclusion regex [/domains/.*] (was [/domains/*])
23:11:12,439 TRACE [f.p.e.c.f.f.FsCrawlerUtil] ❌ [/fs2es-indexer-develop/readme.md] does not match exclusion regex [/isos/.*] (was [/isos/*])
23:11:12,440 TRACE [f.p.e.c.f.f.FsCrawlerUtil] does not match any pattern for exclusion
23:11:12,440 DEBUG [f.p.e.c.f.f.FsCrawlerUtil] filename = [/fs2es-indexer-develop/README.md], includes = [null]
23:11:12,440 TRACE [f.p.e.c.f.f.FsCrawlerUtil] no rules = include all
23:11:12,440 DEBUG [f.p.e.c.f.FsParserAbstract] [/fs2es-indexer-develop/README.md] can be indexed: [true]
23:11:12,440 DEBUG [f.p.e.c.f.FsParserAbstract]   - file: /fs2es-indexer-develop/README.md
23:11:12,440 DEBUG [f.p.e.c.f.FsParserAbstract] fetching content from [/home/admin/Downloads/fs2es-indexer-develop],[README.md]
23:11:12,440 DEBUG [f.p.e.c.f.f.FsCrawlerUtil] computeVirtualPathName(/home/admin/Downloads, /home/admin/Downloads/fs2es-indexer-develop/README.md) = /fs2es-indexer-develop/README.md
23:11:12,440 TRACE [f.p.e.c.f.t.TikaDocParser] Generating document [/home/admin/Downloads/fs2es-indexer-develop/README.md]
23:11:12,440 TRACE [f.p.e.c.f.t.TikaDocParser] Beginning Tika extraction
23:11:12,445 TRACE [f.p.e.c.f.t.TikaDocParser] End of Tika extraction
23:11:12,445 TRACE [f.p.e.c.f.t.TikaDocParser] End document generation
23:11:12,445 TRACE [f.p.e.c.f.f.FsCrawlerUtil] No pattern always matches.
23:11:12,445 DEBUG [f.p.e.c.f.f.FsCrawlerUtil] computeVirtualPathName(/home/admin/Downloads, /home/admin/Downloads/fs2es-indexer-develop/README.md) = /fs2es-indexer-develop/README.md
23:11:12,445 DEBUG [f.p.e.c.f.s.FsCrawlerDocumentServiceElasticsearchImpl] Indexing job_name/e9b290e429eeedfe844e65418284b87a?pipeline=null
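The include/exclude TRACE lines above can be read as: each exclude glob is translated to a regex ('*' becomes '.*') and matched against the file's virtual path; the file is indexed only if no exclude matches. A rough Python sketch of that decision follows (glob_to_regex is a hypothetical helper illustrating the translation visible in the logs, not actual FSCrawler code; matching appears case-insensitive in the logs, hence the lowercased path):

```python
import re

def glob_to_regex(glob: str) -> str:
    # Translation seen in the TRACE lines: '*' becomes '.*'
    # e.g. '*/~*' -> '.*/~.*'
    return glob.replace("*", ".*")

excludes = ["*/~*", "/appdata/*", "/domains/*", "/isos/*"]
path = "/fs2es-indexer-develop/readme.md"  # virtual path, lowercased as in the logs

# The file is excluded if its full virtual path matches any exclude regex
excluded = any(re.fullmatch(glob_to_regex(g), path) for g in excludes)
print(excluded)  # False: no pattern matches, so the file can be indexed
```

This matches the log outcome above: none of the four exclusion regexes match /fs2es-indexer-develop/README.md, so FSCrawler reports "can be indexed: [true]".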

Is that the only logs?

David, the complete log is too long...
This is the end of the log:

23:11:16,841 TRACE [f.p.e.c.f.c.ElasticsearchClient] POST https://192.168.16.200:9200/_bulk gives {"errors":false,"took":0,"items":[{"index":{"_index":"job_name_folder","_id":"6c7a71321214514fc1ae8de537407b1d","_version":8,"result":"updated","_shards":{"total":2,"successful":1,"failed":0},"_seq_no":33,"_primary_term":1,"status":200}},{"index":{"_index":"job_name_folder","_id":"8bdd21291e57a4263971d7ed4b9c82c","_version":8,"result":"updated","_shards":{"total":2,"successful":1,"failed":0},"_seq_no":34,"_primary_term":1,"status":200}},{"index":{"_index":"job_name_folder","_id":"ccf82e68fd98f84f575c73364fae6e65","_version":8,"result":"updated","_shards":{"total":2,"successful":1,"failed":0},"_seq_no":35,"_primary_term":1,"status":200}},{"index":{"_index":"job_name_folder","_id":"9a248cd78b2282a2da61e25353f94","_version":7,"result":"updated","_shards":{"total":2,"successful":1,"failed":0},"_seq_no":36,"_primary_term":1,"status":200}},{"index":{"_index":"job_name_folder","_id":"ac5fc19569849bfd4dd3d74b962eeb81","_version":7,"result":"updated","_shards":{"total":2,"successful":1,"failed":0},"_seq_no":37,"_primary_term":1,"status":200}}]}
23:11:16,845 DEBUG [f.p.e.c.f.c.ElasticsearchEngine] Sending a bulk request of [19] documents to the Elasticsearch service
23:11:16,845 DEBUG [f.p.e.c.f.c.ElasticsearchClient] bulk a ndjson of 155531 characters
23:11:16,845 TRACE [f.p.e.c.f.c.ElasticsearchClient] Calling POST https://192.168.16.200:9200/_bulk with params []
23:11:16,857 DEBUG [f.p.e.c.f.f.b.FsCrawlerSimpleBulkProcessorListener] Executed bulk composed of 5 actions
23:11:16,882 TRACE [f.p.e.c.f.c.ElasticsearchClient] POST https://192.168.16.200:9200/_bulk gives {"errors":false,"took":0,"items":[{"index":{"_index":"job_name","_id":"f7b81d187dc9d078c79e7929eb5bc82","_version":8,"result":"updated","_shards":{"total":2,"successful":1,"failed":0},"_seq_no":123,"_primary_term":1,"status":200}},{"index":{"_index":"job_name","_id":"ab22e014de25d12656b325b446b6715","_version":8,"result":"updated","_shards":{"total":2,"successful":1,"failed":0},"_seq_no":124,"_primary_term":1,"status":200}},{"index":{"_index":"job_name","_id":"45cfe765e1f52eb2e2466a5bd55f0b","_version":8,"result":"updated","_shards":{"total":2,"successful":1,"failed":0},"_seq_no":125,"_primary_term":1,"status":200}},{"index":{"_index":"job_name","_id":"27a34a7631abdb11355ee7b5b5a8","_version":8,"result":"updated","_shards":{"total":2,"successful":1,"failed":0},"_seq_no":126,"_primary_term":1,"status":200}},{"index":{"_index":"job_name","_id":"73ec694287366e3fa9651e451e745f7","_version":8,"result":"updated","_shards":{"total":2,"successful":1,"failed":0},"_seq_no":127,"_primary_term":1,"status":200}},{"index":{"_index":"job_name","_id":"87a3739eb8d7355cedb8f68983a46995","_version":8,"result":"updated","_shards":{"total":2,"successful":1,"failed":0},"_seq_no":128,"_primary_term":1,"status":200}},{"index":{"_index":"job_name","_id":"cf309cfe502ebbc6675971ef6c8b9775","_version":8,"result":"updated","_shards":{"total":2,"successful":1,"failed":0},"_seq_no":129,"_primary_term":1,"status":200}},{"index":{"_index":"job_name","_id":"deb463f0cc85ffe73f34c4d5c9323c","_version":8,"result":"updated","_shards":{"total":2,"successful":1,"failed":0},"_seq_no":130,"_primary_term":1,"status":200}},{"index":{"_index":"job_name","_id":"e3cfbc268f8dcb58e45eb61a3402e","_version":8,"result":"updated","_shards":{"total":2,"successful":1,"failed":0},"_seq_no":131,"_primary_term":1,"status":200}},{"index":{"_index":"job_name","_id":"e9b290e429eeedfe844e65418284b87a","_version":7,"result":"updated","_shards":{"total":2,"successful":1,"failed":0},"_seq_no":132,"_primary_term":1,"status":200}},{"index":{"_index":"job_name","_id":"56e49f4c0a2c8aa5d4f946e1faddca","_version":7,"result":"updated","_shards":{"total":2,"successful":1,"failed":0},"_seq_no":133,"_primary_term":1,"status":200}},{"index":{"_index":"job_name","_id":"7bebb0cb3fc5b238ca33d550e9ac708f","_version":7,"result":"updated","_shards":{"total":2,"successful":1,"failed":0},"_seq_no":134,"_primary_term":1,"status":200}},{"index":{"_index":"job_name","_id":"c210d7b2874a97d39fec7e19c996873e","_version":7,"result":"updated","_shards":{"total":2,"successful":1,"failed":0},"_seq_no":135,"_primary_term":1,"status":200}},{"index":{"_index":"job_name","_id":"3b130b591ac4584cd74472a3af6bce2","_version":7,"result":"updated","_shards":{"total":2,"successful":1,"failed":0},"_seq_no":136,"_primary_term":1,"status":200}},{"index":{"_index":"job_name","_id":"86078f6426ac199cd9ae1e1fee34895","_version":7,"result":"updated","_shards":{"total":2,"successful":1,"failed":0},"_seq_no":137,"_primary_term":1,"status":200}},{"index":{"_index":"job_name","_id":"4eeda285a241d6ec17745e024e8c8f8","_version":7,"result":"updated","_shards":{"total":2,"successful":1,"failed":0},"_seq_no":138,"_primary_term":1,"status":200}},{"index":{"_index":"job_name","_id":"f0ca8ba0f988201e10db9e53d73e8452","_version":7,"result":"updated","_shards":{"total":2,"successful":1,"failed":0},"_seq_no":139,"_primary_term":1,"status":200}},{"index":{"_index":"job_name","_id":"f9db834b664776c72a6ecb8a4512ab9b","_version":7,"result":"updated","_shards":{"total":2,"successful":1,"failed":0},"_seq_no":140,"_primary_term":1,"status":200}},{"index":{"_index":"job_name","_id":"d73718fa2948314138bc9ceaa0566da","_version":7,"result":"updated","_shards":{"total":2,"successful":1,"failed":0},"_seq_no":141,"_primary_term":1,"status":200}}]}
23:11:16,894 DEBUG [f.p.e.c.f.f.b.FsCrawlerSimpleBulkProcessorListener] Executed bulk composed of 19 actions
^C23:15:32,637 DEBUG [f.p.e.c.f.FsCrawlerImpl] Closing FS crawler [job_name]
23:15:32,646 TRACE [f.p.e.c.f.FsParserAbstract] Closing the parser FsParserLocal
23:15:32,647 DEBUG [f.p.e.c.f.FsCrawlerImpl] FS crawler thread is still running
23:15:32,647 DEBUG [f.p.e.c.f.FsParserAbstract] Fs crawler is now waking up again...
23:15:32,648 DEBUG [f.p.e.c.f.FsParserAbstract] FS crawler thread [job_name] is now marked as closed...
java.lang.Exception: Stack trace
	at java.base/java.lang.Thread.dumpStack(Thread.java:1389)
	at fr.pilato.elasticsearch.crawler.fs.FsCrawlerImpl.close(FsCrawlerImpl.java:151)
	at fr.pilato.elasticsearch.crawler.fs.cli.FSCrawlerShutdownHook.run(FSCrawlerShutdownHook.java:42)

So everything looks good from what I can see.
What's the problem you are seeing?

David, thanks.
OK, I have a Fedora server and macOS clients.
I want Spotlight-like indexing of the SMB volumes on the Fedora server.
I put ES in Samba's smb.conf.

So that's another story.

If you can see documents indexed into Elasticsearch, using FSCrawler, then I think that the discussion you opened here is "solved".

The Samba part is something we don't cover here, but you probably need to contact the Samba developers...

David, it's OK, but just one more question:
Is it a problem that I start FSC in debug mode? (Otherwise I get an error, see above.)

I don't see any error. Are you talking about the stack trace? It's not an error.

17:58:09,556 WARN  [f.p.e.c.f.FsParserAbstract] Error while crawling /home/admin/Downloads: /home/admin/.fscrawler/job_name/_status.json
17:58:09,556 INFO  [f.p.e.c.f.FsParserAbstract] Closing FS crawler file abstractor [FileAbstractorFile].

Is this error still happening?
It's not in the recent logs you shared.

If you still have this problem, please open an issue in the FSCrawler project with all the logs you have. Thanks!

OK David

In the same forum?

Here: GitHub · Where software is built