How to start with fscrawler?

Actually, I have installed the FSCrawler 2.5 zip file, and after unzipping it, when I open the .bat file it does not run, and from there I am unable to understand what's going wrong...

Run fscrawler from the command line. It will probably tell you something.
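For example, from the directory where the zip was extracted, something like this (the install path and the job name `catalogs` below are placeholders, not your actual values):

```shell
# Windows: run the launcher from the bin directory of the extracted zip.
# Replace the path and job name with your own.
cd C:\fscrawler-2.5
bin\fscrawler.bat catalogs
```

On Linux/macOS the equivalent would be `bin/fscrawler catalogs`.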

Please give me the commands to run.
I have typed `fscrawler` on the command line but it's showing errors.

Which errors?

This is the output I am getting...

What do you not understand in the message?
Please read:

Sir, now I am getting this error... FSCrawler stopped.

Please don't post images of text as they are hardly readable and not searchable.

Instead, paste the text and format it with the </> icon. Check the preview window.

Try to run with --debug option.
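For example (assuming the same placeholder job name as before):

```shell
# --debug makes FSCrawler log what it is doing and why it fails.
bin\fscrawler.bat catalogs --debug
```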

Most likely elasticsearch is not started here.
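A quick way to check (assuming Elasticsearch is on the default localhost:9200) is to hit its root endpoint; a running node answers with a small JSON banner, while a stopped one gives a connection error:

```shell
# If this fails to connect, Elasticsearch is not running (or not on this port).
curl http://localhost:9200/
```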

`fscrawler --config_dir ./jp catalogs/`
Sir, this command is not working...

And Elasticsearch is already running.

Share the logs that you are getting in debug mode please.
Share your FSCrawler JSON configuration file for the catalogs job as well.
Share your elasticsearch logs as well.

And please format your code, logs, or configuration files using the </> icon as explained in this guide, not the citation button. It will make your post more readable.

Or use markdown style like:


This is the icon to use if you are not using markdown format:

There's a live preview panel for exactly this reason.

Lots of people read these forums, and many of them will simply skip over a post that is difficult to read, because it's just too large an investment of their time to try and follow a wall of badly formatted text.
If your goal is to get an answer to your questions, it's in your interest to make it as easy to read and understand as possible.

JSON file:

"name" : "catalogs",
"fs" : {
"url" : "C:\tmp\jp",
"update_rate" : "15m",
"excludes" : [ "/~" ],
"json_support" : false,
"filename_as_id" : false,
"add_filesize" : true,
"remove_deleted" : true,
"add_as_inner_object" : false,
"store_source" : false,
"index_content" : true,
"attributes_support" : false,
"raw_metadata" : true,
"xml_support" : false,
"index_folders" : true,
"lang_detect" : false,
"continue_on_error" : false,
"pdf_ocr" : true,
"ocr" : {
"language" : "eng"
"elasticsearch" : {
"nodes" : [ {
"host" : "",
"port" : 9200,
"scheme" : "HTTP"
} ],
"bulk_size" : 100,
"flush_interval" : "5s",
"byte_size" : "10mb"
"rest" : {
"scheme" : "HTTP",
"host" : "",
"port" : 8080,
"endpoint" : "fscrawler"

Elasticsearch logs, please?

And please format your code as I just described.

And FSCrawler logs?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.