Using Logstash to get data out of Elasticsearch into a CSV file

Hello, I am new to Logstash and am struggling to use it to export data from Elasticsearch as a CSV.

To create some sample data, we can first load a basic CSV into Elasticsearch. The head of the sample CSV is shown below:

$ head uu.csv
"hh","hh1","hh3","id"
-0.979646332669359,1.65186132910743,"L",1
-0.283939374784435,-0.44785377794233,"X",2
0.922659898930901,-1.11689020559612,"F",3
0.348918777124474,1.95766948269957,"U",4
0.52667811182958,0.0168862169880919,"Y",5
-0.804765331279075,-0.186456470768865,"I",6
0.11411203100637,-0.149340801708981,"Q",7
-0.952836952412902,-1.68807271639322,"Q",8
-0.373528919496876,0.750994450392907,"F",9

I then load that into Logstash using the following configuration:

$ cat uu.conf 
input {
  stdin {}
}

filter {
  csv {
      columns => [
        "hh","hh1","hh3","id"
      ]
  }

  if [hh1] == "hh1" {
      drop { }
  } else {
      mutate {
          remove_field => [ "message", "host", "@timestamp", "@version" ]
      }

      mutate {
          convert => { "hh" => "float" }
          convert => { "hh1" => "float" }
          convert => { "hh3" => "string" }
          convert => { "id" => "integer" }
      }
  }
}

output {
  stdout { codec => dots }
  elasticsearch {
      index => "temp_index"
      document_type => "temp_doc"
      document_id => "%{id}"
  }
}

This is fed into Logstash with the following command:

$ cat uu.csv | logstash-2.1.3/bin/logstash -f uu.conf 
Settings: Default filter workers: 16
Logstash startup completed
....................................................................................................Logstash shutdown completed

So far so good, but now I would like to get some of the data back out, in particular the hh and hh3 fields from temp_index.

I wrote the following to extract the data from Elasticsearch into a CSV:

$ cat yy.conf
input {
  elasticsearch {
    hosts => "localhost:9200"
    index => "temp_index"
    query => "*"
  }
}

filter {
	elasticsearch{
		add_field => {"hh" => "%{hh}"}
		add_field => {"hh3" => "%{hh3}"}
	}
}


output {
  stdout { codec => dots }
  csv {
      fields => ['hh','hh3']
      path => '/home/username/yy.csv'
  }
}

But I get the following error when trying to run Logstash:

$ logstash-2.1.3/bin/logstash -f yy.conf
The error reported is: 
  Couldn't find any filter plugin named 'elasticsearch'. Are you sure this is correct? Trying to load the elasticsearch filter plugin resulted in this error: no such file to load -- logstash/filters/elasticsearch

What do I need to change in yy.conf so that a Logstash command will extract the data out of Elasticsearch and write it into a new CSV called yy.csv?

Any help would be much appreciated...

Regards,

HLM

From https://www.elastic.co/guide/en/logstash/current/plugins-filters-elasticsearch.html

This is a community-maintained plugin! It does not ship with Logstash by default, but it is easy to install by running `bin/plugin install logstash-filter-elasticsearch`.

filter {
  elasticsearch {
    add_field => {"hh" => "%{hh}"}
    add_field => {"hh3" => "%{hh3}"}
  }
}

What do you expect this filter to do? The elasticsearch filter is for enriching events with the results of a separate Elasticsearch query; the add_field settings here would only copy fields that the elasticsearch input has already placed on the event, so as far as I can tell the whole filter block does nothing useful and can simply be removed.
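As a sketch of what yy.conf could look like without the filter: the elasticsearch input already emits one event per document, with hh and hh3 as top-level fields, so the output section alone is enough to write the CSV. One caveat (an assumption worth checking against your installed plugin version): the input's query option generally expects a full JSON query DSL string rather than a bare "*".

```
input {
  elasticsearch {
    hosts => "localhost:9200"
    index => "temp_index"
    # Full query DSL; a bare "*" may not be accepted by the input plugin
    query => '{ "query": { "match_all": {} } }'
  }
}

# No filter needed: hh and hh3 are already fields on each event

output {
  stdout { codec => dots }
  csv {
      fields => ['hh','hh3']
      path => '/home/username/yy.csv'
  }
}
```

With this in place, the original error also goes away, since the logstash-filter-elasticsearch plugin is no longer referenced and does not need to be installed at all.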