Accessing the _version metadata field from Elasticsearch using Logstash


#1

Hi,
I'm using the ELK stack to import CSV files. Each time the CSV files are imported, the "_version" field of a document increases, which is expected. However, because _version is a metadata field, it is not indexed by Elasticsearch, so the field is not searchable and cannot be used in a dashboard.

I've created a second Logstash configuration where both the input and the output are Elasticsearch.

Filter configuration:

filter {
  mutate {
    add_field => {"Version" => "{[@metadata][_version]}"}
  }
}

Input configuration:

input {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "test_csv"
    query => '{"query":{"match_all" : {}}}'
    size => 1000
    scroll => "1s"
    docinfo => true
    docinfo_fields => ["_index", "_type", "_id", "_version"]
    schedule => "/1 * * * *"
  }
}

I cannot get the value of the _version field. The output in Kibana looks like:

Version         {[@metadata][_version]}

If I replace the _version field in the filter with _id or _index, I get information back.

Any ideas on how to get the value out of the _version field? Any thoughts on the matter are highly appreciated.


(Christian Dahlqvist) #2

I have not tested it, but I believe this is supposed to be:

mutate {
  add_field => {"Version" => "%{[@metadata][_version]}"}
}
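
One additional, untested thought: the %{...} reference can only resolve if the input actually stored a value under [@metadata][_version], and Elasticsearch only returns _version with search hits when the search body requests it. A rough sketch of requesting it in the query (whether the elasticsearch input then passes the value through docinfo_fields may depend on the plugin version):

input {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "test_csv"
    # "version": true asks Elasticsearch to include _version with every hit
    query => '{ "version": true, "query": { "match_all": {} } }'
    docinfo => true
    docinfo_fields => ["_index", "_type", "_id", "_version"]
  }
}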


#3

Hi Christian,

If I change the Logstash configuration as you mentioned, the output in Kibana looks like:

Version_Scrapy	       	%{[@metadata][_version]}

Actually, I am not sure whether it is possible to access the data from this field at all, because whether I can see the "_version" field in the JSON depends on the view.


Regards,


(Christian Dahlqvist) #4

If you need a version, I would recommend adding a field to represent it rather than relying on the internal one, which you cannot control.
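
For example, the pipeline that imports the CSV files could stamp an explicit, searchable version field onto every document. A minimal, untested sketch; the field name import_version and the IMPORT_VERSION environment variable are only illustrative:

filter {
  mutate {
    # add an explicit version field at import time; the value can come from
    # an environment variable, a build number, or a date, and defaults to "1" here
    add_field => { "import_version" => "${IMPORT_VERSION:1}" }
  }
}

Because this field is part of the document source, it is indexed and can be searched and used in Kibana dashboards, unlike the internal _version.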

