Max content length issue

Hi, I am getting an HTTP 413 error when trying to push large documents to ES 5.5.

I tried setting http.max_content_length: 500mb in the elasticsearch.yml file, but I am still getting the same error saying '... entity content too long for configured buffer limit (104857600)'.

What am I missing, and how do I get ES to accept larger files?
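For reference, this is roughly what I have (a sketch; on a deb/rpm install the file is /etc/elasticsearch/elasticsearch.yml, or config/elasticsearch.yml for the archive distribution):

```yaml
# elasticsearch.yml: raise the HTTP request body limit (the default is 100mb)
http.max_content_length: 500mb
```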

I'd say first: don't do it. Don't store blobs in Elasticsearch/Lucene.

If you want to do it, look at http.max_content_length in
https://www.elastic.co/guide/en/elasticsearch/reference/6.2/modules-http.html
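Once it's set and the node restarted, you can check the live value, e.g. (assuming a node listening on localhost:9200):

```
curl -s 'localhost:9200/_nodes/settings?filter_path=nodes.*.settings.http.max_content_length&pretty'
```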

Well, as I am doing full-text search using fs2.3, I really have no option. Such documents are few and far between, but I can't avoid them altogether without hampering usability for my clients.

I did go over this link previously, but I am not sure where I am supposed to set this. As I wrote above, I set http.max_content_length by editing the yml file, but that apparently didn't work.

fs2.3

You mean FSCrawler 2.3?

I really have no option.

Normally if you don't define the store_source settings, the size of a document should not be that large. Did you activate that?
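If you did, it would be in the job settings file, which for FSCrawler 2.x should be ~/.fscrawler/{job_name}/_settings.json, something like this (the job name and url below are just examples):

```json
{
  "name": "my_docs_job",
  "fs": {
    "url": "/tmp/es",
    "store_source": false
  }
}
```

Leaving store_source at false (the default) keeps the base64-encoded original file out of the indexed document, so documents stay small.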

I did set http.max_content_length by editing the yml file but that apparently didn't work.

Did you restart the node?
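Settings in elasticsearch.yml are only read at startup, so depending on how you installed it, something like:

```
# deb/rpm install with systemd
sudo systemctl restart elasticsearch

# or for an archive install: stop the running process, then
./bin/elasticsearch
```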

Did you restart the node?

Stupid me, that must be it.

I will move the rest of the FSCrawler-related discussion to a more relevant thread on GitHub, viz. Issues with remove deleted · Issue #531 · dadoonet/fscrawler · GitHub.
Thanks.

Did you restart the node?

Okay, I did that, but I am still getting the max content length error: "entity content is too long [2.........] for the configured buffer limit [104857600]".

I have set http.max_content_length: 999mb.
