I receive logs daily in JSON format, and most events contain identifiers such as IP addresses and ports. Many of them repeat from day to day, so I want to filter out the duplicates, but their timestamp changes and Elasticsearch generates a new _id for every single one of them. I installed and configured the fingerprint filter plugin in Logstash to create a unique fingerprint for every event based on that event's IP address and port. For some reason Logstash produces the same hash for every single event instead of a hash based on each event's IP address and port.
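For context, each line is a JSON object that (as far as I can tell) carries top-level ip and port fields, roughly like this (trimmed for brevity; the values and the extra field names here are illustrative):

{"timestamp": "2023-06-01T10:15:00Z", "ip": "192.168.0.1", "port": 80}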
I expect results such as:
192.168.0.1:80 = n3j1tjo31ifj0inv023n10f
192.168.0.2:443 = 31tboginv4go1ng4igno4
But I receive:
192.168.0.1:80 = n3j1tjo31ifj0inv023n10f
192.168.0.2:443 = n3j1tjo31ifj0inv023n10f
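As I understand the fingerprint docs, with concatenate_sources => true the filter hashes a single string built from the source field names and values, something like:

|ip|192.168.0.1|port|80|  -> one hash
|ip|192.168.0.2|port|443| -> a different hash

so two events with different ip/port values shouldn't be able to produce the same fingerprint unless the concatenated string is somehow identical for every event.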
I will include the logstash.conf below. Thank you in advance for any assistance.
input {
  file {
    codec => "json"
    path => "/usr/share/shadow/one.json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    #sincedb_path => "/home/bitnami/sincedb/sincedb-access"
  }
}
filter {
  fingerprint {
    source => ["ip", "port"]
    concatenate_sources => true
    target => "jedinstveni_id"
    method => "MD5"
    key => "randomkey"
  }
}
output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    data_stream => false
    # planned for dedup once the fingerprint works; this has to
    # reference the fingerprint target field set above
    #document_id => "%{jedinstveni_id}"
    index => "shadow-server"
  }
  stdout {
    codec => rubydebug
  }
  file {
    path => "/usr/share/shadow/log/output.log"
  }
}
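One thing I haven't been able to rule out is whether ip and port actually exist as top-level fields once the json codec has parsed each line; if both were missing on every event, the concatenated source string (and therefore the hash) would be identical everywhere. As a next step I plan to add a temporary check after the fingerprint filter (the tag name is just one I made up):

filter {
  # temporary sanity check: tag any event that is missing either
  # source field so it stands out in the rubydebug output
  if ![ip] or ![port] {
    mutate { add_tag => ["missing_fingerprint_source"] }
  }
}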