We are using Enterprise Search to build our e-commerce use cases. One of our challenges is API response time and payload size. I am wondering if there is a way we can enable GZip compression for the Enterprise Search App Engine API. Today our uncompressed API payload is 4 MB, and we are trying to reduce the size by enabling GZip compression.
Looks like I got the answer for this question in this documentation: API Reference | App Search documentation [8.11] | Elastic
Please let me know if there is any other way.
The Accept-Encoding header has no impact on the response payload from App Engine. I have tried both the gzip and application/gzip values for the Accept-Encoding header.
@Carlos_D @Sean_Story - Do you guys have any insight into this and any suggestions?
Hi @Pradeep_Renukaiah ,
App Search does not support zipped payloads.
Can you clarify - are you trying to ingest a 4MB chunk of data? Or your search results are 4MB?
Either way, that seems like an unreasonably large payload. What is your use case? Have you considered if there are any fields you're indexing that don't need to be indexed? How many documents is that 4MB payload split between? Could you decrease your batch/page size?
For context, War and Peace, the novel by Tolstoy, is about 3MB of plain text. Shipping a novel's worth of JSON data is not something we really want to optimize for.
@Sean_Story - I am using the multi_search API to pull the documents, and the cumulative payload size is 4 MB for 360 documents plus aggregations. However, I am trying to understand how to enable gzip compression for the App Search API responses. I tried using the Accept-Encoding header as per the documentation (API Reference | App Search documentation [8.11] | Elastic) but was not able to enable compression.
Can you leverage the result_fields for the individual queries in your multi_search to reduce your response payload? Or change the page size for each of your queries in the multi_search? I expect that you are not wanting to display the full contents of those 360 results to a user on a single page. Try to structure your request so that you only fetch the results you need at a given moment.
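For illustration, each query inside a multi_search body can carry its own page size and result_fields, the same per-query options the App Search search API accepts (the field names below are hypothetical):

```json
{
  "queries": [
    {
      "query": "test",
      "page": { "size": 10 },
      "result_fields": {
        "title": { "raw": {} },
        "body": { "snippet": { "size": 100 } }
      }
    }
  ]
}
```

Requesting only a title plus a 100-character snippet, rather than every indexed field, can shrink the response dramatically before compression even enters the picture.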
I tried using Accept-Encoding header as per the documentation
Thanks, I honestly didn't realize this was a feature we'd added. TIL
Testing locally, I am not having issues with Accept-Encoding working. Are you getting an error, or is it just returning uncompressed JSON?
Can you share a minimal curl command that reproduces the issue you're having?
There is no error; it's just not compressing the JSON. I tried the same payload with gzip compression enabled on a standalone server (not App Engine, just a Spring Boot app) returning the same 4 MB payload. After compression the size dropped to 40 KB and the response time was subsecond. Without compression, the download time for the content is a few seconds.
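For reference, that order-of-magnitude reduction is easy to reproduce locally with nothing but gzip. A minimal sketch, using made-up repetitive JSON as a stand-in for a large multi_search response:

```shell
# Build a few MB of repetitive JSON (hypothetical data, for illustration only).
python3 - <<'EOF' > payload.json
import json
docs = [{"id": i, "type": "article", "body": "lorem ipsum dolor sit amet " * 20}
        for i in range(8000)]
print(json.dumps(docs))
EOF

# Compress it and compare sizes; repetitive JSON like search results
# typically shrinks by one to two orders of magnitude.
gzip -c payload.json > payload.json.gz
wc -c payload.json payload.json.gz
```

The exact ratio depends on how repetitive the documents are, but a 4 MB to 40 KB reduction is entirely plausible for JSON of this shape.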
@Pradeep_Renukaiah can you please share a minimal curl command that uses Accept-Encoding but does not result in compressed results? I am not having any issue getting compressed results with a request like:
curl -X POST '<enterprise search host>/api/as/v1/engines/<engine_name>/multi_search' \
-H 'Content-Type: application/json' \
-H 'Accept-Encoding: gzip' \
-H 'Authorization: Bearer <search key>' \
-d '{
"queries": [
{"query": ""}
]
}' | less
@Sean_Story - I am testing with Postman and here is the curl command; however, Accept-Encoding is not taking effect. Postman can show the payload size with and without encoding:
curl --location --request GET '<enterprise search host>/api/as/v1/engines/test-engine/multi_search' \
--header 'Accept-Encoding: gzip' \
--header 'Content-Type: application/json' \
--header 'Authorization: Bearer <search key>' \
--data '{
"queries": [
{
"query": "test",
"filters": {
"all": [
{
"languageId": [
-1
],
"type": [
"article"
]
}
]
}
},
{
"query": "test",
"filters": {
"all": [
{
"languageId": [
-1
],
"type": [
"posts"
]
}
]
}
},
{
"query": "test",
"filters": {
"all": [
{
"languageId": [
-1
],
"type": [
"book"
]
}
]
}
},
{
"query": "test",
"filters": {
"all": [
{
"languageId": [
-1
],
"type": [
"tweet"
]
}
]
}
}
]
}'
I'm still struggling to reproduce. Do you get JSON or gzipped content if you run my curl example?
What version of Enterprise Search are you running?
Please try executing as curl, without postman. That will help me to isolate where your issue is coming from.
This is interesting: when I tested using cURL with Accept-Encoding, the output file size was 332 KB. However, when I try the same with Postman or the Spring Boot framework, the response is not compressed.
I am using Enterprise Search version 8.10.2.
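One way to confirm that a saved response body (such as the 332 KB file from the cURL run) really is gzip-compressed, rather than just small, is to check its first two bytes: every gzip stream starts with the magic bytes 1f 8b. A local sketch, using a stand-in file in place of a real response:

```shell
# Stand-in for a response body saved by curl (a real check would point
# at the file curl wrote, e.g. via --output response.bin).
printf '{"results": ["example"]}' | gzip -c > response.bin

# gzip data always begins with the magic bytes 1f 8b.
head -c 2 response.bin | od -An -tx1

# Decompress to recover the original JSON.
gunzip -c response.bin
```

If the magic bytes are missing and the file opens as plain JSON, the client (here, Postman or the Spring Boot HTTP client) either never sent Accept-Encoding or already decompressed the body transparently.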
OK, so it sounds like the issue is with Postman. I don't think I can really help you there.
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.