Logstash in Docker - Combine 2 events into 1 event

I am running the Elastic Stack in Docker via the official images; however, I'm currently receiving the following error message when I attempt to use the Logstash aggregate filter plugin to combine events that share the same RequestId:

Cannot create pipeline {:reason=>"Couldn't find any filter plugin named 'aggregate'. Are you sure this is correct? Trying to load the aggregate filter plugin resulted in this error: Problems loading the requested plugin named aggregate of type filter. Error: NameError NameError"}
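If the plugin simply isn't present in the image, one thing I considered (an untested sketch; I'm assuming `logstash-plugin` is on the PATH in the official 5.6.3 image and that installing the plugin is all that's missing) is extending the image with a custom Dockerfile:

```dockerfile
# Sketch: extend the official Logstash image and install the aggregate
# filter plugin at build time. Assumes logstash-plugin is on the PATH.
FROM docker.elastic.co/logstash/logstash:5.6.3
RUN logstash-plugin install logstash-filter-aggregate
```

Building this image and pointing the `logstash` service in docker-compose.yml at it (via `build:` instead of `image:`) would then presumably make the filter available.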

That said, I'm also not entirely sure how to use the aggregate plugin to combine the following sample events into a single event:

{
  "@t": "2017-10-16T20:21:35.0531946Z",
  "@m": "HTTP GET Request: \"https://myapi.com/?method=rawr&format=json&apikey=key&trackid=385728443&protocol=dash\"",
  "@i": "29b30dc6",
  "Url": "https://myapi.com/?method=rawr&format=json&apikey=key&trackid=385728443&protocol=dash",
  "SourceContext": "OpenAPIClient.Client",
  "ActionId": "fd683cc6-9e59-427f-a9f4-7855663f3568",
  "ActionName": "Web.Controllers.API.TrackController.TrackRadioLocationGetAsync (Web)",
  "RequestId": "0HL8KO13F8US6:0000000E",
  "RequestPath": "/api/track/radiourl/385728443"
}
{
  "@t": "2017-10-16T20:21:35.0882617Z",
  "@m": "HTTP GET Response: LocationAPIResponse { Location: \"http://sample.com/file/385728443/\", Error: null, Success: True }",
  "@i": "84f6b72b",
  "Response": {
    "Location": "http://sample.com/file/385728443/",
    "Error": null,
    "Success": true,
    "$type": "LocationAPIResponse"
  },
  "SourceContext": "OpenAPIClient.Client",
  "ActionId": "fd683cc6-9e59-427f-a9f4-7855663f3568",
  "ActionName": "Web.Controllers.API.TrackController.TrackRadioLocationGetAsync (Web)",
  "RequestId": "0HL8KO13F8US6:0000000E",
  "RequestPath": "/api/track/radiourl/385728443"
}
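To make the goal concrete, here is a minimal Python sketch (not Logstash code, just an illustration using abbreviated versions of the events above) of the merge I'm after: all fields from both events combined into one record, keyed by RequestId, with later fields overriding earlier ones where they overlap:

```python
# Abbreviated versions of the two sample events above.
request_event = {
    "@t": "2017-10-16T20:21:35.0531946Z",
    "Url": "https://myapi.com/?method=rawr&format=json&apikey=key&trackid=385728443&protocol=dash",
    "RequestId": "0HL8KO13F8US6:0000000E",
}
response_event = {
    "@t": "2017-10-16T20:21:35.0882617Z",
    "Response": {
        "Location": "http://sample.com/file/385728443/",
        "Error": None,
        "Success": True,
    },
    "RequestId": "0HL8KO13F8US6:0000000E",
}

def merge_by_request_id(events):
    """Group events by RequestId; merge each group's fields into one dict."""
    merged = {}
    for event in events:
        # Later events override earlier fields with the same name (e.g. @t).
        merged.setdefault(event["RequestId"], {}).update(event)
    return merged

combined = merge_by_request_id([request_event, response_event])
print(combined["0HL8KO13F8US6:0000000E"])
```

The resulting dict contains both the request's Url field and the response's Response object under the shared RequestId, which is exactly the shape I'd like the indexed event to have.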

Could someone please guide me on how to combine these events correctly? And if aggregate is the right plugin, why doesn't it appear to be included in the Logstash Docker image?

docker-compose.yml contents:

version: '3'
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:5.6.3
    container_name: elasticsearch
    environment:
      - discovery.type=single-node
      - xpack.security.enabled=false
    ports:
      - 9200:9200
    restart: always
  logstash:
    image: docker.elastic.co/logstash/logstash:5.6.3
    container_name: logstash
    environment:
      - xpack.monitoring.elasticsearch.url=http://elasticsearch:9200
    depends_on:
      - elasticsearch
    ports:
      - 10000:10000
    restart: always
    volumes:
      - ./logstash/pipeline/:/usr/share/logstash/pipeline/
  kibana:
    image: docker.elastic.co/kibana/kibana:5.6.3
    container_name: kibana
    environment:
      - xpack.monitoring.elasticsearch.url=http://elasticsearch:9200
    depends_on:
      - elasticsearch
    ports:
      - 5601:5601
    restart: always

logstash/pipeline/empstore.conf contents:

input {
  http {
    id => "empstore_http"
    port => 10000
    codec => "json"
  }
}

output {
  elasticsearch {
    hosts => [ "elasticsearch:9200" ]
    id => "empstore_elasticsearch"
    index => "empstore-openapi"
  }
}

filter {
  mutate {
    rename => { "RequestId" => "RequestID" }
  }

  aggregate {
    task_id => "%{RequestID}"
    code => ""
  }
}
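For reference, this is my rough idea of what the aggregate filter would need to look like to actually merge the fields (a sketch based on my reading of the plugin docs; the map handling and the timeout value are guesses on my part):

```
filter {
  aggregate {
    task_id => "%{RequestID}"
    # Copy each event's fields into the shared map for this RequestID,
    # keeping the first value seen for any duplicated field, then drop
    # the original event so only the merged one is indexed.
    code => "
      event.to_hash.each { |k, v| map[k] ||= v }
      event.cancel
    "
    # Emit the accumulated map as a single event once no further events
    # arrive for this RequestID within the timeout.
    push_map_as_event_on_timeout => true
    timeout => 5
  }
}
```

I'm unsure whether this is the idiomatic pattern for a two-event request/response pair, or whether `push_previous_map_as_event` would be more appropriate here.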
