I have a set of documents coming from SQL Server with a lot of fields that were never used and are therefore null. I cannot change the schema; however, when I query these docs in Elasticsearch, the sheer number of empty fields slows me down a lot.
Is there a way to write a query that doesn't return null-valued fields? I don't want to have to specify each field I do want, since those could still be null. I just don't want the results to contain null values.
I don't know how you take data out of SQL Server and index it with ES, but I'm assuming you index your data in JSON format. If so, take a look at the JSON output: if you see fields with no value, an empty string, or a "null" string, I suggest modifying the code that generates the JSON so that it skips fields whose values are null. If you use the Gson library, for example, just leave unused fields as null; when Gson converts your objects to JSON, it skips null fields by default, so they won't show up in the output.
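To illustrate the Gson behavior I mean, here's a minimal sketch (the class and field names are made up):

```java
import com.google.gson.Gson;
import com.google.gson.GsonBuilder;

// Hypothetical row class; fields never populated from SQL Server stay null.
class DocRow {
    String name = "widget";
    String unusedField = null;
}

public class GsonNullDemo {
    public static void main(String[] args) {
        // Gson's default serializer omits null fields entirely.
        System.out.println(new Gson().toJson(new DocRow()));
        // -> {"name":"widget"}

        // For contrast, serializeNulls() would keep them in the output.
        System.out.println(new GsonBuilder().serializeNulls().create().toJson(new DocRow()));
        // -> {"name":"widget","unusedField":null}
    }
}
```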
Unfortunately I cannot do that; I need to be able to query ES for when fields are null in Kibana, for dashboards.
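To be concrete, this is the kind of null/missing-field check those dashboards rely on. A minimal sketch using the Elasticsearch low-level Java REST client (the index and field names here are hypothetical); since a JSON null is not indexed, a `must_not` + `exists` query matches documents where the field is null or absent, assuming no `null_value` replacement is set in the mapping:

```java
import org.apache.http.HttpHost;
import org.apache.http.util.EntityUtils;
import org.elasticsearch.client.Request;
import org.elasticsearch.client.Response;
import org.elasticsearch.client.RestClient;

public class FindNullField {
    public static void main(String[] args) throws Exception {
        try (RestClient client = RestClient.builder(
                new HttpHost("localhost", 9200, "http")).build()) {
            // Hypothetical index "my-index" and field "some_field".
            Request req = new Request("GET", "/my-index/_search");
            // Matches docs where some_field is null or missing entirely.
            req.setJsonEntity(
                "{\"query\":{\"bool\":{\"must_not\":{\"exists\":{\"field\":\"some_field\"}}}}}");
            Response resp = client.performRequest(req);
            System.out.println(EntityUtils.toString(resp.getEntity()));
        }
    }
}
```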
What I'm looking for is a way to tell the request that I don't want to receive null-valued fields, so that the response coming back is lighter. Right now I'm seeing >8 MB responses for 125 results; with the nulls removed, that could be under 100 KB.
From what I've read, I think the only ways to handle it right now are:
File a feature request with ES for adding a (no nulls) parameter filter.
Or build my own REST API that sits beside Elasticsearch and filters out the null values (a rough sketch of that approach follows below).
Neither way is going to get this fixed for me any time soon.
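For the second option, here is a minimal sketch of what that sidecar filtering could look like, again using Gson (2.8.6+ for `JsonParser.parseString`); the sample response body is made up:

```java
import java.util.Map;

import com.google.gson.JsonArray;
import com.google.gson.JsonElement;
import com.google.gson.JsonObject;
import com.google.gson.JsonParser;

// Pass-through filter: parse the ES response, drop null-valued fields
// recursively, and forward the smaller body to the caller.
public class NullStripper {

    static JsonElement stripNulls(JsonElement el) {
        if (el.isJsonObject()) {
            JsonObject out = new JsonObject();
            for (Map.Entry<String, JsonElement> e : el.getAsJsonObject().entrySet()) {
                if (!e.getValue().isJsonNull()) {
                    out.add(e.getKey(), stripNulls(e.getValue()));
                }
            }
            return out;
        }
        if (el.isJsonArray()) {
            JsonArray out = new JsonArray();
            for (JsonElement item : el.getAsJsonArray()) {
                if (!item.isJsonNull()) {
                    out.add(stripNulls(item));
                }
            }
            return out;
        }
        return el; // primitives pass through unchanged
    }

    public static void main(String[] args) {
        String esResponse =
            "{\"hits\":{\"hits\":[{\"_source\":{\"name\":\"widget\",\"unused\":null}}]}}";
        System.out.println(stripNulls(JsonParser.parseString(esResponse)));
        // -> {"hits":{"hits":[{"_source":{"name":"widget"}}]}}
    }
}
```

The downside is an extra parse/serialize hop on every request, which is exactly why I'd rather have this handled server-side.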