Cannot start Elasticsearch client using FSCrawler

I am using Elasticsearch 8.15.1 and FSCrawler 2.10 to index some files. I am running both directly with ./bin/elasticsearch and ./bin/fscrawler, not with Docker. I have disabled all the authentication features because I just want to test the Elasticsearch and FSCrawler combo. The only time I have used docker pull was for:

```
docker pull dadoonet/fscrawler
```

My Elasticsearch instance is up and running on localhost, but I get the following error when running FSCrawler:

```
15:38:47,360 INFO  [f.p.e.c.f.FsCrawlerImpl] Starting FS crawler
15:38:47,360 INFO  [f.p.e.c.f.FsCrawlerImpl] FS crawler started in watch mode. It will run unless you stop it with CTRL+C.
15:38:47,445 WARN  [f.p.e.c.f.c.ElasticsearchClient] We are not doing SSL verification. It's not recommended for production.
15:38:47,777 INFO  [f.p.e.c.f.c.ElasticsearchClient] Elasticsearch Client connected to a node running version 8.15.1
15:38:47,781 WARN  [f.p.e.c.f.c.ElasticsearchClient] We are not doing SSL verification. It's not recommended for production.
15:38:47,807 INFO  [f.p.e.c.f.c.ElasticsearchClient] Elasticsearch Client connected to a node running version 8.15.1
15:38:47,869 FATAL [f.p.e.c.f.c.FsCrawlerCli] We can not start Elasticsearch Client. Exiting.
fr.pilato.elasticsearch.crawler.fs.client.ElasticsearchClientException: Error while creating index template fscrawler_docs_DiaryEntries
	at fr.pilato.elasticsearch.crawler.fs.client.ElasticsearchClient.pushIndexTemplate(ElasticsearchClient.java:316) ~[fscrawler-elasticsearch-client-2.10-SNAPSHOT.jar:?]
	at fr.pilato.elasticsearch.crawler.fs.client.ElasticsearchClient.loadAndPushIndexTemplate(ElasticsearchClient.java:532) ~[fscrawler-elasticsearch-client-2.10-SNAPSHOT.jar:?]
	at fr.pilato.elasticsearch.crawler.fs.client.ElasticsearchClient.createIndexAndComponentTemplates(ElasticsearchClient.java:509) ~[fscrawler-elasticsearch-client-2.10-SNAPSHOT.jar:?]
	at fr.pilato.elasticsearch.crawler.fs.service.FsCrawlerDocumentServiceElasticsearchImpl.createSchema(FsCrawlerDocumentServiceElasticsearchImpl.java:71) ~[fscrawler-core-2.10-SNAPSHOT.jar:?]
	at fr.pilato.elasticsearch.crawler.fs.FsCrawlerImpl.start(FsCrawlerImpl.java:129) ~[fscrawler-core-2.10-SNAPSHOT.jar:?]
	at fr.pilato.elasticsearch.crawler.fs.cli.FsCrawlerCli.startEsClient(FsCrawlerCli.java:421) [fscrawler-cli-2.10-SNAPSHOT.jar:?]
	at fr.pilato.elasticsearch.crawler.fs.cli.FsCrawlerCli.runner(FsCrawlerCli.java:397) [fscrawler-cli-2.10-SNAPSHOT.jar:?]
	at fr.pilato.elasticsearch.crawler.fs.cli.FsCrawlerCli.main(FsCrawlerCli.java:134) [fscrawler-cli-2.10-SNAPSHOT.jar:?]
Caused by: jakarta.ws.rs.BadRequestException: HTTP 400 Bad Request
	at org.glassfish.jersey.client.JerseyInvocation.convertToException(JerseyInvocation.java:956) ~[jersey-client-3.1.8.jar:?]
	at org.glassfish.jersey.client.JerseyInvocation.translate(JerseyInvocation.java:770) ~[jersey-client-3.1.8.jar:?]
	at org.glassfish.jersey.client.JerseyInvocation.lambda$invoke$1(JerseyInvocation.java:687) ~[jersey-client-3.1.8.jar:?]
	at org.glassfish.jersey.client.JerseyInvocation.call(JerseyInvocation.java:709) ~[jersey-client-3.1.8.jar:?]
	at org.glassfish.jersey.client.JerseyInvocation.lambda$runInScope$3(JerseyInvocation.java:703) ~[jersey-client-3.1.8.jar:?]
	at org.glassfish.jersey.internal.Errors.process(Errors.java:292) ~[jersey-common-3.1.8.jar:?]
	at org.glassfish.jersey.internal.Errors.process(Errors.java:274) ~[jersey-common-3.1.8.jar:?]
	at org.glassfish.jersey.internal.Errors.process(Errors.java:205) ~[jersey-common-3.1.8.jar:?]
	at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:391) ~[jersey-common-3.1.8.jar:?]
	at org.glassfish.jersey.client.JerseyInvocation.runInScope(JerseyInvocation.java:703) ~[jersey-client-3.1.8.jar:?]
	at org.glassfish.jersey.client.JerseyInvocation.invoke(JerseyInvocation.java:686) ~[jersey-client-3.1.8.jar:?]
	at org.glassfish.jersey.client.JerseyInvocation$Builder.method(JerseyInvocation.java:450) ~[jersey-client-3.1.8.jar:?]
	at fr.pilato.elasticsearch.crawler.fs.client.ElasticsearchClient.httpCall(ElasticsearchClient.java:902) ~[fscrawler-elasticsearch-client-2.10-SNAPSHOT.jar:?]
	at fr.pilato.elasticsearch.crawler.fs.client.ElasticsearchClient.httpPut(ElasticsearchClient.java:880) ~[fscrawler-elasticsearch-client-2.10-SNAPSHOT.jar:?]
	at fr.pilato.elasticsearch.crawler.fs.client.ElasticsearchClient.pushIndexTemplate(ElasticsearchClient.java:314) ~[fscrawler-elasticsearch-client-2.10-SNAPSHOT.jar:?]
	... 7 more
15:38:47,878 INFO  [f.p.e.c.f.FsCrawlerImpl] FS crawler [DiaryEntries] stopped
15:38:47,878 INFO  [f.p.e.c.f.FsCrawlerImpl] FS crawler [DiaryEntries] stopped

```

Please help me out.
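The 400 comes back while FSCrawler PUTs an index template derived from the job name (`fscrawler_docs_DiaryEntries` in the trace above). One thing worth ruling out before digging further: Elasticsearch index names are documented to be lowercase-only and to forbid certain characters, and the names here are derived from the mixed-case job name `DiaryEntries`. Whether the same restriction is what triggers this particular 400 is something only the full response body can confirm, but a quick local check against the documented index-name rules is cheap (a sketch, not a diagnosis):

```python
# Sketch: validate a prospective Elasticsearch index name against the
# documented naming rules (lowercase only, no forbidden characters,
# may not start with -, _ or +, not "." or "..", at most 255 bytes).
FORBIDDEN = set('\\/*?"<>| ,#:')

def is_valid_index_name(name: str) -> bool:
    if not name or name in {".", ".."}:
        return False
    if name != name.lower():          # must be all lowercase
        return False
    if name[0] in "-_+":              # may not start with -, _ or +
        return False
    if any(c in FORBIDDEN for c in name):
        return False
    return len(name.encode("utf-8")) <= 255

print(is_valid_index_name("fscrawler_docs_DiaryEntries"))  # False (uppercase)
print(is_valid_index_name("diaryentries"))                 # True
```

If the check fails only on case, renaming the job (and hence its derived index and template names) to all lowercase is an easy experiment.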

Hey

Did you define a specific template in the job dir? And what does the settings file look like, please?

Here is the settings file for the job. It lives in .fscrawler/job_name/_settings.yaml:

```yaml
name: "test"
fs:
  url: "/home/.../ELK/test"
  update_rate: "1m"
  excludes:
  - "*/~*"
  json_support: false
  filename_as_id: false
  add_filesize: true
  remove_deleted: true
  add_as_inner_object: false
  store_source: false
  index_content: true
  attributes_support: false
  raw_metadata: false
  xml_support: false
  index_folders: true
  lang_detect: true
  continue_on_error: true
  ocr:
    language: "eng"
    enabled: true
    pdf_strategy: "ocr_and_text"
  follow_symlinks: false
elasticsearch:
  nodes:
  - url: "http://localhost:9200"
  bulk_size: 100
  flush_interval: "5s"
  byte_size: "10mb"
  ssl_verification: false
  push_templates: true
```
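As an aside: if the mixed-case job name does turn out to be the problem, FSCrawler's `elasticsearch.index` and `elasticsearch.index_folder` settings let you pin the target index names explicitly, decoupled from the job name. A sketch of that fragment (the lowercase names below are made up for illustration):

```yaml
elasticsearch:
  nodes:
  - url: "http://localhost:9200"
  # Hypothetical lowercase index names, independent of the job name
  index: "diaryentries"
  index_folder: "diaryentries_folder"
  ssl_verification: false
  push_templates: true
```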

Thank you.