Content type detection for rest requests is deprecated with latest ES and FSCrawler

Getting:

[299 Elasticsearch-5.4.1-2cfe0df Content type detection for rest requests is deprecated. Specify the content type using the [Content-Type] header

This occurs at startup of FSCrawler.

I thought this was resolved with the latest versions:
Elasticsearch: 5.4.1
FSCrawler: 2.2

Any thoughts?

This is fixed in FSCrawler 2.3. Try the latest SNAPSHOT. Link is available in the README.

great, thx.

I will probably wait for a stable zip package for 2.3

TBH the latest 2.3-SNAP is more stable than the 2.2 version... :slight_smile:

I hope I can release it sometime soon though.

Ok, thx.

Going with
fscrawler-2.3-20170602.114150-43.zip Fri Jun 02 11:41:54 UTC 2017

Same problem with ""Content type detection for rest requests is deprecated" .... only at a faster rate.

exec /usr/lib/jvm/jdk1.8.0_91//bin/java -Djava.awt.headless=true -Dfile.encoding=UTF-8 -Djava.util.logging.manager=org.apache.logging.log4j.jul.LogManager -cp /opt/fc/lib/* -jar /opt/fc/lib/fscrawler-2.3-SNAPSHOT.jar --config_dir /opt/fc/config cix-fscrawler ...

08:12:27,930 INFO [f.p.e.c.f.FsCrawlerImpl] Starting FS crawler
08:12:27,935 INFO [f.p.e.c.f.FsCrawlerImpl] FS crawler started in watch mode. It will run unless you stop it with CTRL+C.
08:12:28,894 WARN [o.e.c.RestClient] request [PUT http://127.0.0.1:9200/cix-fscrawler] returned 1 warnings: [299 Elasticsearch-5.4.1-2cfe0df "Content type detection for rest requests is deprecated. Specify the content type using the [Content-Type] header." "Wed, 07 Jun 2017 12:12:28 GMT"]
08:12:28,952 INFO [f.p.e.c.f.FsCrawlerImpl] FS crawler started for [cix-fscrawler] for [/opt/cix/filestore] every [1m]
08:12:29,912 WARN [o.e.c.RestClient] request [POST http://127.0.0.1:9200/_bulk] returned 1 warnings: [299 Elasticsearch-5.4.1-2cfe0df "Content type detection for rest requests is deprecated. Specify the content type using the [Content-Type] header." "Wed, 07 Jun 2017 12:12:29 GMT"]
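For reference, the warning fires whenever a request body reaches Elasticsearch 5.x without an explicit `Content-Type` header, forcing the server to sniff the body format. A minimal Python sketch of what the client-side fix looks like (the index name and body here are hypothetical; FSCrawler 2.3 itself sets the header through the Java REST client, not this code):

```python
from urllib.request import Request

# Build a PUT request for a hypothetical index, setting Content-Type
# explicitly so Elasticsearch 5.x does not have to sniff the body.
body = b'{"settings": {"number_of_shards": 1}}'
req = Request(
    "http://127.0.0.1:9200/test-index",
    data=body,
    method="PUT",
    headers={"Content-Type": "application/json"},
)

# urllib normalizes header names to "Xxxx-yyy" capitalization.
print(req.get_header("Content-type"))  # application/json
```

The curl equivalent is simply adding `-H 'Content-Type: application/json'` to the request.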

Using ...

_settings.json
{
  "name" : "cix-fscrawler",
  "fs" : {
    "url" : "/opt/cix/filestore",
    "update_rate" : "1m",
    "excludes" : [ "~*" ],
    "json_support" : false,
    "filename_as_id" : false,
    "add_filesize" : true,
    "remove_deleted" : true,
    "add_as_inner_object" : false,
    "store_source" : true,
    "index_content" : true,
    "attributes_support" : false,
    "raw_metadata" : true,
    "xml_support" : false,
    "pdf_ocr" : true,
    "index_folders" : true,
    "lang_detect" : false
  },
  "elasticsearch" : {
    "nodes" : [ {
      "host" : "127.0.0.1",
      "port" : 9200,
      "scheme" : "HTTP"
    } ],
    "type" : "doc",
    "bulk_size" : 100,
    "flush_interval" : "5s"
  },
  "rest" : {
    "scheme" : "HTTP",
    "host" : "127.0.0.1",
    "port" : 8080,
    "endpoint" : "fscrawler"
  }
}

Very interesting. Another user reported something like that. Can you open an issue in the FSCrawler project? I'll check and fix it.

Thanks !

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.