Indexing compressed (gzip) content into Elastic through Java APIs

My use case is with ES 2.4 and 5.x; the goal is to feed GZIP content directly into an ES index from my Java application.

I'm aware that compressed (GZIP) content can be loaded via the REST endpoints, but I'd like to do this through the Java API when publishing to ES.

I have enabled compression on the cluster (http.compression) and set the "Content-Encoding" and "Accept-Encoding" headers on the request to "gzip", but that does not appear to help.
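For reference, this is roughly what the relevant settings look like in `elasticsearch.yml` — a sketch; check the defaults for your version (in 2.x `http.compression` defaulted to `false`, in 5.x to `true`):

```yaml
# Enable HTTP-level compression on the cluster
http.compression: true
# Compression level (assumed default is 3)
http.compression_level: 3
```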

Has anyone had experience, or any thoughts on how to actually do this?

Thanks in advance

What would the zip contain?

The zip would contain JSON documents.

Maybe Logstash could help?

But really, why not send a bulk of all your JSON docs?
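To illustrate, the `_bulk` endpoint takes an NDJSON body: one action/metadata line followed by one source line per document, with a trailing newline. A minimal sketch of building that body in plain Java — the index name, type name, and sample documents are hypothetical placeholders:

```java
import java.util.List;

public class BulkBody {

    // Builds an Elasticsearch _bulk request body: for each document, an
    // action/metadata line ({"index":{...}}) followed by the document's JSON
    // source on its own line. The whole body must end with a newline.
    static String buildBulkBody(String index, String type, List<String> jsonDocs) {
        StringBuilder sb = new StringBuilder();
        for (String doc : jsonDocs) {
            sb.append("{\"index\":{\"_index\":\"").append(index)
              .append("\",\"_type\":\"").append(type).append("\"}}\n");
            sb.append(doc).append('\n');
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        String body = buildBulkBody("my-index", "doc",
                List.of("{\"msg\":\"hello\"}", "{\"msg\":\"world\"}"));
        System.out.print(body);
    }
}
```

The same body could be POSTed to `http://<host>:9200/_bulk`, or fed to the Java client's bulk request builder instead of being assembled by hand.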

Thank you.

The logstash idea is great; I will take a look.

Regarding bulk loading of JSON: is it possible to bulk load a GZIP (of JSON documents)?

Quick follow-up: I just got a GZIP of JSON documents to load through the bulk API. Thank you for your help.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.