I have data loaded into Elasticsearch via Logstash, and I am using elasticsearch.js in my app to query and fetch it. I am looking for an optimal setup in which the search is fast and the response payload is small.
In the present setup: took=647ms, hits.total=45806, payload size 14.1MB, browser download time 6.85s. Originally it was: took=3153ms, hits.total=45806, payload size 31.9MB, browser download time 14.77s.
I have tried to optimize the JSON search request as shown below. I would welcome suggestions if there is a better approach than Ver1.3. I suspect the problem in my app is on the client side, in the file-download step, where the payload size is huge.
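Before touching the query itself, one client-side lever worth trying is HTTP compression. A minimal sketch, assuming the legacy elasticsearch.js client (the host value is hypothetical); `suggestCompression` asks Elasticsearch to gzip response bodies, which can cut a multi-megabyte transfer substantially:

```javascript
// Sketch: client configuration with response compression enabled.
// The host is a placeholder; adjust to your cluster.
var config = {
  host: 'localhost:9200',
  suggestCompression: true  // ask Elasticsearch to gzip responses
};
// In the app:
// var elasticsearch = require('elasticsearch');
// var client = new elasticsearch.Client(config);
console.log(JSON.stringify(config));
```

The documents themselves are unchanged; only the bytes on the wire shrink, so the browser download time should drop while `took` stays the same.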
Ver1.0
GET k00125_car/_search
{
  "query": {
    "filtered": {
      "query": { "query_string": { "analyze_wildcard": true, "query": "" } },
      "filter": {
        "bool": {
          "must": [ { "range": { "@timestamp": { "gte": 1286952143643 } } } ],
          "must_not": []
        }
      }
    }
  },
  "highlight": {
    "pre_tags": ["@kibana-highlighted-field@"],
    "post_tags": ["@/kibana-highlighted-field@"],
    "fields": { "": {} },
    "fragment_size": 2147483647
  },
  "size": 1000000,
  "sort": [ { "focus_tier": { "order": "desc", "unmapped_type": "boolean" } } ],
  "aggs": {
    "2": {
      "date_histogram": {
        "field": "@timestamp",
        "interval": "1M",
        "pre_zone": "+05:30",
        "pre_zone_adjust_large_interval": true,
        "min_doc_count": 0,
        "extended_bounds": { "min": 1286952143643, "max": 1444718543643 }
      }
    }
  },
  "fields": ["*", "source"],
  "script_fields": {},
  "fielddata_fields": ["@timestamp"]
}
Ver1.1
GET k00125_car/_search
{
"query": { "match_all": {} },
"size":1000000,
"_source": ["bunit","companycode","customer_number","focus_tier","name","contact_phone","service_address","sum_svchrg"]
}
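For reference, the Ver1.1 request expressed through the elasticsearch.js client looks roughly like the sketch below. This is a minimal sketch, assuming the legacy client's `client.search(params)` API; note that source filtering uses the `_source` key (a misspelled key is silently ignored and the full document comes back, keeping the payload large):

```javascript
// Sketch: Ver1.1 as an elasticsearch.js search request with _source filtering.
var params = {
  index: 'k00125_car',
  body: {
    query: { match_all: {} },
    size: 1000000,
    _source: ['bunit', 'companycode', 'customer_number', 'focus_tier',
              'name', 'contact_phone', 'service_address', 'sum_svchrg']
  }
};
// In the app:
// client.search(params).then(function (resp) { /* render resp.hits.hits */ });
console.log(JSON.stringify(params.body._source));
```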
Ver1.2
GET k00125_car/_search
{
"size":1000000
}
Ver1.3
GET k00125_car/_search
{
"fields": ["bunit","company_code","customer_number","focus_tier","name","contact_phone","service_address","sum_svchrg"],
"size":1000000
}
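One further idea, better suited to 45k+ hits than `size: 1000000`: page through the results with the scroll API instead of forcing Elasticsearch to build one huge response. A minimal sketch, assuming the legacy elasticsearch.js client; the field list is taken from Ver1.3, while the page size and scroll timeout are assumptions to tune:

```javascript
// Sketch: first page of a scrolled search instead of one 14MB response.
var firstPage = {
  index: 'k00125_car',
  scroll: '30s',   // keep the search context alive between pages (assumed timeout)
  size: 1000,      // hits per page (assumed; tune for your app)
  _source: ['bunit', 'company_code', 'customer_number', 'focus_tier',
            'name', 'contact_phone', 'service_address', 'sum_svchrg']
};
// In the app, fetch pages until hits run out:
// client.search(firstPage).then(function next(resp) {
//   // process resp.hits.hits, then:
//   // if (resp.hits.hits.length) {
//   //   return client.scroll({ scrollId: resp._scroll_id, scroll: '30s' }).then(next);
//   // }
// });
console.log(JSON.stringify(firstPage));
```

This keeps each response small and lets the UI start rendering before all 45806 hits have arrived, instead of waiting for a single multi-megabyte download.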