Logstash fields from aggregation to csv

Hi guys,

I'm trying to export data from Elasticsearch to CSV. My Logstash config looks something like this:

input {
  elasticsearch {
    hosts => "127.0.0.1:9200"
    index => "log"
    ...
    query => '
    {
      "size": 0,
      "query": {
        "bool": {
          "filter": [
            { "term": { "country.keyword": "NZ" } }
          ]
        }
      },
      "aggs": {
        "byhostname": {
          "terms": { "field": "hostname.keyword" }
        }
      }
    }'
  }
}

output {
  csv {
    fields => ["key"]    # <== ????????
    path => "/blabla/some.csv"
  }
}

The result of the query looks something like this:

.....
"aggregations" : {
    "byhostname" : {
      "doc_count_error_upper_bound" : 111,
      "sum_other_doc_count" : 301002,
      "buckets" : [
        {
          "key" : "site1",
          "doc_count" : 1335
        },
        {
          "key" : "site2",
          "doc_count" : 852
        }
       ................

I want to output the aggregated values from buckets -> "key" (site1, site2) to CSV, but it doesn't seem to work. I can access fields from the documents themselves just fine. How should I specify the field in the csv output section of the Logstash config?

Thank you

The Elasticsearch input plugin only iterates over the hits returned by the query; the aggs part of the request is effectively ignored, so the aggregation buckets never become events in the pipeline.

This is a known open issue with the plugin.

One possible option is to use a Transform in Elasticsearch to store the aggregation results in another index, and then run the Elasticsearch input plugin against that index to retrieve them. A sketch is below.
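
For example, a transform along these lines (a minimal sketch: the transform id hostname_counts and the destination index log_byhostname are placeholder names I made up, and the _transform endpoint assumes Elasticsearch 7.5 or later):

PUT _transform/hostname_counts
{
  "source": {
    "index": "log",
    "query": {
      "bool": {
        "filter": [
          { "term": { "country.keyword": "NZ" } }
        ]
      }
    }
  },
  "pivot": {
    "group_by": {
      "hostname": { "terms": { "field": "hostname.keyword" } }
    },
    "aggregations": {
      "doc_count": { "value_count": { "field": "hostname.keyword" } }
    }
  },
  "dest": { "index": "log_byhostname" }
}

POST _transform/hostname_counts/_start

Once the transform has run, the destination index contains one document per hostname, so Logstash can read it like any other index and the csv output can reference the pivoted fields directly:

input {
  elasticsearch {
    hosts => "127.0.0.1:9200"
    index => "log_byhostname"
    query => '{ "query": { "match_all": {} } }'
  }
}

output {
  csv {
    fields => ["hostname", "doc_count"]
    path => "/blabla/some.csv"
  }
}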

Thank you for your response. I will try to set up a transform, or maybe export the data with a Python script.
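
For reference, here is a minimal sketch of such a script (assuming the official elasticsearch Python client with the 7.x-style search call; host, index, query, and output path are taken from my config above):

import csv
from elasticsearch import Elasticsearch

es = Elasticsearch("http://127.0.0.1:9200")

# Run the same filtered terms aggregation as in the Logstash config;
# size 0 because only the aggregation buckets are needed, not the hits.
resp = es.search(
    index="log",
    body={
        "size": 0,
        "query": {"bool": {"filter": [{"term": {"country.keyword": "NZ"}}]}},
        "aggs": {"byhostname": {"terms": {"field": "hostname.keyword"}}},
    },
)

# Write one CSV row per bucket: the key (hostname) and its doc count.
with open("/blabla/some.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["key", "doc_count"])
    for bucket in resp["aggregations"]["byhostname"]["buckets"]:
        writer.writerow([bucket["key"], bucket["doc_count"]])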
