Aggregation by sum of field

Hi,

I want to aggregate by the sum of a specific field (for example, build).
Can you give me an example that uploads all the fields from the JSON file and also lets me aggregate by the sum of build?
This is my conf file:
input {
  file {
    path => ["/tmp/TEST.json"]
    type => "json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  grok {
    match => [ 'message', '(?<json_part>"TestName":.*"Agent":"[^"]+")' ]
  }
  mutate {
    convert => { "build" => "float" }
  }
  json {
    source => "message"
  }
}

output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    host => "XX.xx.xx.xxx"
    protocol => "http"
    index => "index_j"
  }
}

BR,
Chen

Can you give me an example to upload all the fields from the json file

Logstash will send all fields in the event to ES. You should be fine with the json filter that you have.

What looks suspicious is the mutate filter that attempts to change the type of the build field before that field exists. Keep in mind that filters are applied in order, and at that point the json filter hasn't parsed the message yet. Perhaps the mutate filter should go after the json filter?
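For example, the filter section could be reordered like this (a sketch based on your config, assuming the build field comes from the parsed JSON; the grok capture name json_part is a placeholder):

```
filter {
  grok {
    match => [ 'message', '(?<json_part>"TestName":.*"Agent":"[^"]+")' ]
  }
  # Parse the JSON first so that the build field exists...
  json {
    source => "message"
  }
  # ...and only then convert its type.
  mutate {
    convert => { "build" => "float" }
  }
}
```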

and also the ability to aggregate by sum of build ?

In Kibana?
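A sum metric aggregation in Kibana boils down to an Elasticsearch query like the following (a sketch against the index_j index from your config; the aggregation name total_build is arbitrary):

```
POST /index_j/_search
{
  "size": 0,
  "aggs": {
    "total_build": {
      "sum": { "field": "build" }
    }
  }
}
```

This only works if build is mapped as a numeric type, which is why the type conversion matters.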

Hi,

I changed my JSON file. Originally all the fields looked like "key":"value"; for keys whose values are not strings, I changed them to "key":value (without the " characters) so they are indexed as numbers.
I also removed the mutate filter after changing the JSON file.
Now it's OK, and I can also aggregate by those fields in Kibana.

Thanks :)

BR,
Chen