Using Doc Values


#1

Hello,

I am trying to set up doc values on my active cluster, but I am having difficulty understanding how to do this. I've looked over the following articles but am still unsure how to accomplish it:

The second article says this:

Updating an Active Cluster
Naturally, up to this point, you may be wondering how to remove fielddata from your cluster. The answer depends on your data.

Time Based Indices
If you are using time based indices (e.g., logstash-2015.07.18), such as with logging use cases, then you should update your template(s) to use doc values so that future indices (e.g., tomorrow's) get created with doc values. From there, the problem should take care of itself as indices that use fielddata will age themselves out.

How do I update the default template to apply doc values for not_analyzed string fields? Any help would be appreciated.


(Mike Simos) #2

Hi,

In your logstash install there is a file called:

vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-1.0.5-java/lib/logstash/outputs/elasticsearch/elasticsearch-template.json

This is the template which Logstash applies if it doesn't find one in Elasticsearch. For any field with "type": "string" and "index": "not_analyzed", you can add "doc_values": true. You can also apply doc values to any other field type, such as double, long, date, etc., so long as the field is not an analyzed string. For example:

               "@timestamp": {
                  "type": "date",
                  "doc_values": true,
                  "format": "dateOptionalTime"
               }
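Similarly, a not_analyzed string field (using a hypothetical host field as an example; any not_analyzed string follows the same pattern) would look like:

```json
"host": {
    "type": "string",
    "index": "not_analyzed",
    "doc_values": true
}
```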

It's best to define your fields in your mapping and set doc values for each field. If you're only using dynamic mapping to create your fields, then you need to add this to the template. I created one which will automatically apply doc values to any dynamically created field:
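The template file itself isn't reproduced here, but as a rough sketch only (not the exact attachment), a dynamic_templates section that maps every dynamically created string field as not_analyzed with doc values, using the Elasticsearch 1.x mapping syntax, could look something like this:

```json
{
    "template": "logstash-*",
    "mappings": {
        "_default_": {
            "dynamic_templates": [
                {
                    "string_fields": {
                        "match": "*",
                        "match_mapping_type": "string",
                        "mapping": {
                            "type": "string",
                            "index": "not_analyzed",
                            "doc_values": true
                        }
                    }
                }
            ]
        }
    }
}
```

The "_default_" mapping makes the rule apply to every document type, and "match": "*" catches any field name, so each new string field Logstash discovers is created not_analyzed with doc values.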

It's not something I have really tested, so you should not put it into production without testing it yourself. Use it as a way to autogenerate your fields when importing some logs so you can build your own template. To add it to Elasticsearch, do:

curl -XDELETE http://localhost:9200/_template/logstash
curl -XPUT http://localhost:9200/_template/logstash -d @template.json

If you encounter any problems, you can revert to the original template by rerunning the above commands with the template file that ships with Logstash.


#3

Mike, thanks for the detailed response; it's helpful. I think I got it set up correctly. Is there any way to verify it's working besides looking at the results of curl -XGET "http://localhost:9200/_template/?pretty=true"?

Thanks!


(Mike Simos) #4

Hi,

Once you have added the template, import some log files. Then look at GET logstash-yyyy.mm.dd/_mapping, and in the mapping you should see that doc values were added for your dynamically created fields.
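For example, with curl against a local cluster (substituting one of your actual index names for the date shown here):

```shell
curl -XGET "http://localhost:9200/logstash-2015.07.18/_mapping?pretty=true"
```

In the output, each field created by the dynamic template should carry "doc_values": true in its mapping.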


(system) #5