I'm searching Logstash-indexed firewall logs in Discover, with fields for source.ip, destination.ip, rule.name, hostname, and action, and I need to go back one month. The number of hits is huge, but many of them are identical. Is there a way for me to filter for unique results so I can export a reasonably sized CSV? Any help would be greatly appreciated.
As far as I know, there is no built-in option to de-duplicate results in Discover.
So you need to consider some alternatives:
1. Create a new de-duplicated index using Logstash:
   Little Logstash Lessons: Handling Duplicates | Elastic Blog
2. Use a terms aggregation with the REST API and arrange the output yourself:
   Terms aggregation | Elasticsearch Guide [7.16] | Elastic
3. Use a terms aggregation with Visualize in Kibana:
   Aggregation-based | Kibana Guide [7.16] | Elastic
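For option 2, here is a rough sketch of what "arrange the output yourself" could look like. The index pattern, bucket sizes, and the two-level nesting (rule.name, then action) are assumptions for illustration; the field names come from your question, and they must be mapped as keyword/ip for a terms aggregation to work on them.

```python
import csv
import io

# Terms aggregation request body (a sketch; index pattern and sizes are
# assumptions). Send it to your cluster with any HTTP client, e.g.:
#   requests.post("http://localhost:9200/firewall-*/_search", json=body)
body = {
    "size": 0,  # we only want the aggregation buckets, not the hits
    "aggs": {
        "by_rule": {
            "terms": {"field": "rule.name", "size": 1000},
            "aggs": {
                "by_action": {"terms": {"field": "action", "size": 10}}
            },
        }
    },
}

def buckets_to_csv(response):
    """Flatten the nested terms buckets into CSV rows: rule, action, count."""
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(["rule.name", "action", "count"])
    for rule in response["aggregations"]["by_rule"]["buckets"]:
        for action in rule["by_action"]["buckets"]:
            writer.writerow([rule["key"], action["key"], action["doc_count"]])
    return out.getvalue()
```

Each unique rule/action combination becomes one CSV row with its document count, so the export stays small no matter how many duplicate hits there are. You could nest further levels (source.ip, destination.ip, hostname) the same way, but watch the bucket count, since it multiplies at each level.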
Thanks Tomo! I'll try those and will update on which one worked best but they all look like good options.
Another suggestion is to use a latest transform, which runs and creates a new index containing only the latest version of each document, based on a unique key field (or combination of fields).
If you just go to Kibana > Stack Management > Transforms and walk through the wizard for the latest transform, you should be able to get that working.
Then when you look at that index in Discover, you will only see one copy of each document, and it will be the latest one based on that unique key.
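If you prefer the API over the wizard, the same thing can be sketched as a request body for PUT _transform/dedup-firewall. The transform id, destination index name, and @timestamp sort field are assumptions; the unique_key fields are the ones from your question.

```json
{
  "source": { "index": "firewall-*" },
  "dest": { "index": "firewall-latest" },
  "latest": {
    "unique_key": ["source.ip", "destination.ip", "rule.name", "hostname", "action"],
    "sort": "@timestamp"
  }
}
```

After creating it you still need to start it (POST _transform/dedup-firewall/_start); by default it runs as a one-off batch, so for continuously arriving logs you would also configure it as a continuous transform.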
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.