Indexing compressed (gzip) content into Elastic through Java APIs

(Jasons Tlr) #1

My use cases are on ES 2.4 and 5.x; the goal is to feed GZIP-compressed content directly into an ES index from my Java application.

I'm aware that compressed (GZIP) content can be loaded via the REST endpoints, but I would like to do this through the Java API when publishing to ES.

I have enabled compression on the cluster (http.compression) and set the "Content-Encoding" and "Accept-Encoding" headers on the request to "gzip", but that does not appear to help.
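For reference, a minimal sketch of the node-level setting in elasticsearch.yml (the setting names are the real ones; note that on 2.x http.compression defaults to false, while from 5.0 onward it is enabled by default):

```yaml
# elasticsearch.yml — allow the HTTP layer to accept/send gzip bodies
http.compression: true
# optional: gzip level 1-9 (ES default is 3)
http.compression_level: 3
```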

Has anyone had experience, or any thoughts on how to actually do this?

Thanks in advance

(David Pilato) #2

What would the zip contain?

(Jasons Tlr) #3

The zip would contain JSON documents.

(David Pilato) #4

Maybe Logstash could help?

But really, why not send a bulk request with all your JSON docs?
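The suggestion above can be sketched as follows. This builds the newline-delimited `_bulk` body by hand using only the JDK; the index name, type, and documents are illustrative, and in practice you would send the result via `BulkRequest`/`BulkProcessor` or POST it to `/_bulk`:

```java
import java.util.Arrays;
import java.util.List;

public class BulkBodyDemo {
    // Build an NDJSON _bulk body: one action line, then the source doc,
    // each terminated by '\n' (the trailing newline is required by _bulk).
    static String bulkBody(String index, String type, List<String> docs) {
        StringBuilder sb = new StringBuilder();
        int id = 1;
        for (String doc : docs) {
            sb.append("{\"index\":{\"_index\":\"").append(index)
              .append("\",\"_type\":\"").append(type)
              .append("\",\"_id\":\"").append(id++).append("\"}}\n");
            sb.append(doc).append('\n');
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        String body = bulkBody("logs", "doc",
                Arrays.asList("{\"msg\":\"hello\"}", "{\"msg\":\"world\"}"));
        System.out.print(body);
    }
}
```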

(Jasons Tlr) #5

Thank you.

The logstash idea is great; I will take a look.

With regard to bulk loading JSON: is it possible to bulk load a GZIP (of JSON documents)?

(Jasons Tlr) #6

Quick follow-up: I just got a GZIP of JSON documents to load through the bulk API. Thank you for your help.
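For anyone landing here later, a minimal sketch of the compression step using only the JDK (`java.util.zip`). The HTTP send itself is only described in a comment because it depends on your client; one common option (an assumption about the setup, not confirmed by this thread) is to wrap the request body in Apache HttpClient's `GzipCompressingEntity` when using the low-level REST client, which compresses the entity and adds the `Content-Encoding: gzip` header for you:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class GzipBulkDemo {
    // Compress a _bulk payload so it can be POSTed to /_bulk with a
    // "Content-Encoding: gzip" header (requires http.compression to be
    // enabled on the receiving node).
    static byte[] gzip(String body) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(buf)) {
            gz.write(body.getBytes(StandardCharsets.UTF_8));
        }
        return buf.toByteArray();
    }

    // Round-trip helper, used here only to verify the payload survives.
    static String gunzip(byte[] compressed) throws IOException {
        try (GZIPInputStream gz =
                 new GZIPInputStream(new ByteArrayInputStream(compressed))) {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            byte[] chunk = new byte[4096];
            int n;
            while ((n = gz.read(chunk)) != -1) out.write(chunk, 0, n);
            return new String(out.toByteArray(), StandardCharsets.UTF_8);
        }
    }

    public static void main(String[] args) throws IOException {
        String bulk =
            "{\"index\":{\"_index\":\"logs\",\"_type\":\"doc\",\"_id\":\"1\"}}\n"
            + "{\"msg\":\"hello\"}\n";
        byte[] compressed = gzip(bulk);
        // With the low-level RestClient (needs the ES client + HttpClient jars):
        //   new GzipCompressingEntity(
        //       new NStringEntity(bulk, ContentType.APPLICATION_JSON))
        System.out.println(gunzip(compressed).equals(bulk)); // round-trip check
    }
}
```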

(system) #7

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.