Counting rows in a CSV file and sending the counts to Elasticsearch using Logstash

Hi All,

I have a CSV file with the following data:
Order,Location,Date
123,Paris,26-07-2019
203,Amsterdam,03-07-2019
456,Paris,08-09-2018

I want to group the data by location, count the orders per location, and send the result to an Elasticsearch index. So my index should look like this:
Order_count,Location
2,Paris
1,Amsterdam

The count should increase or decrease based on the data in the CSV file: if a new order arrives for an existing location, the count should increase, and if a new location appears, its count should start at 1. How can I implement this scenario using Logstash?

Take a look at example 5 for the aggregate filter.

Hi,

I don't see any counting of data in that example.
Can you help me with another example?

Counting of events is exactly what example 5 does.
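For reference, here is a minimal sketch of a pipeline along the lines of example 5 of the aggregate filter (the "no end event, tasks come one after the other" case). The file path, timeout, and index name are placeholders you would adapt; it also assumes you run Logstash with a single pipeline worker, which the aggregate filter requires for correct results:

```
# Sketch only – paths, timeout, and index name are assumptions, not tested values.
# The aggregate filter needs "pipeline.workers: 1" to aggregate correctly.
input {
  file {
    path => "/path/to/orders.csv"        # placeholder path
    start_position => "beginning"
    sincedb_path => "/dev/null"          # re-read the file on each run (testing only)
  }
}

filter {
  csv {
    separator => ","
    columns => ["Order", "Location", "Date"]
    skip_header => "true"
  }

  # One aggregation task per location; bump a counter for every order seen.
  aggregate {
    task_id => "%{Location}"
    code => "
      map['Order_count'] ||= 0
      map['Order_count'] += 1
    "
    push_map_as_event_on_timeout => true
    timeout_task_id_field => "Location"
    timeout => 10                        # seconds of inactivity before the count is flushed
    timeout_tags => ["aggregated"]
  }

  # Keep only the flushed count events; drop the raw per-order lines.
  if "aggregated" not in [tags] {
    drop {}
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]          # placeholder host
    index => "order_counts"              # placeholder index name
  }
}
```

With the sample data above, this would emit one event per location carrying `Location` and `Order_count` once the timeout expires. If you need counts to keep updating across runs rather than creating new documents, you could additionally set `document_id => "%{Location}"` on the elasticsearch output so each location's document is overwritten in place.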

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.