Logstash fingerprint with concatenate_all_fields validation

Hi, I need some help validating a fingerprint that was produced by the Logstash fingerprint filter. At the moment the pipeline config looks like this:

input {
  jdbc {
    jdbc_driver_library => "/usr/share/logstash/logstash-core/lib/jars/postgresql-42.2.5.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql://pg.server:5432/sz"
    jdbc_user => "archiver"
    jdbc_password => "some_password"
    statement => "select
                    id as id,
                    log_record as log_record,
                    creation_time as creation_time,
                    to_char(creation_time, 'YYYY.MM.DD') as log_date
                  from some_table
                  where id > :sql_last_value"
    use_column_value => true
    tracking_column => "id"
    tracking_column_type => "numeric"
    jdbc_paging_enabled => true
    jdbc_page_size => "1000000"
    schedule => "*/15 * * * *"
    last_run_metadata_path => "/srv/logstash-data/.some_table_jdbc_last_run"
  }
}

filter {
  fingerprint {
    concatenate_all_fields => true
    method => "SHA256"
  }
}
output {
  elasticsearch {
    hosts => ["https://elk1.test:9200", "https://elk2:9200"]
    index => "some_table-%{[log_date]}"
    ssl => true
    user => "user"
    password => "password"
    ssl_certificate_verification => true
    keystore => "/etc/logstash/some_cert.p12"
    keystore_password => "cert_pass"
    truststore => "/etc/logstash/some_cert.p12"
    truststore_password => "some_pass"
  }
}

This setup produces a fingerprint field with the hash in it, and the resulting document looks like this in JSON format:

{
  "_index": "some_table-2019.08.13",
  "_type": "_doc",
  "_id": "MeiejGwBtwnGL-3jnLNj",
  "_version": 1,
  "_score": null,
  "_source": {
    "fingerprint": "30aa8ca17fa746535dcc7b96dc73a2ebb9d3e8de92eb129abf56d1548ebc30dd",
    "creation_time": "2019-08-13T20:14:25.978Z",
    "log_record": "{\"category\":\"ADMINISTRATION\",\"role\":\"ADMIN\",\"params\":{\"message\":\"User authentication successful\"},\"auditTimeStamp\":\"2019-08-13T20:14:25.966Z\"} ECDSA:MGUCME9+Xj4KXop5BuemX5oEPDWYETMR7M1SPSqzRwGyNgxOxV0lmSJIco9xQawSP4K0awIxAK0Pal5KPg+phlN/qPnTjymFFS1ESQNC6MJTAgs9ZZizdudQlF/RYZ8IuhECN1dwAg==",
    "log_date": "2019.08.13",
    "id": 14335915,
    "@timestamp": "2019-08-13T20:15:00.514Z",
    "@version": "1"
  },
  "fields": {
    "creation_time": [
      "2019-08-13T20:14:25.978Z"
    ],
    "@timestamp": [
      "2019-08-13T20:15:00.514Z"
    ]
  },
  "sort": [
    1565727300514
  ]
}

Now I have to somehow validate that the data in the document and the fingerprint match.

So my question is: what is the exact format of the data that Logstash hashes when concatenate_all_fields is set to true? Which fields are concatenated, in what order, and in what format?
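
For reference, this is roughly the validation script I have been trying to write. It is only a sketch of my current guess: I assume all event fields (the fingerprint does not exist yet when the filter runs) are sorted by field name, concatenated as |key|value pairs with one trailing |, and then hashed with plain SHA256, since no HMAC key is configured in the filter. How the values are rendered (the timestamp, the numeric id) is exactly the part I am unsure about, so this may well be wrong:

import hashlib

# Values copied from _source of the sample document above, minus the
# fingerprint field itself. The long log_record value is shortened here;
# the full string from the document has to be pasted in to reproduce the
# real hash.
event = {
    "@timestamp": "2019-08-13T20:15:00.514Z",
    "@version": "1",
    "creation_time": "2019-08-13T20:14:25.978Z",
    "id": 14335915,
    "log_date": "2019.08.13",
    "log_record": '{"category":"ADMINISTRATION", ...} ECDSA:...',  # shortened
}

expected = "30aa8ca17fa746535dcc7b96dc73a2ebb9d3e8de92eb129abf56d1548ebc30dd"

# My guess at the concatenation: fields sorted by name, each appended as
# "|key|value", plus one trailing "|".
concatenated = "".join(f"|{key}|{value}" for key, value in sorted(event.items())) + "|"

# Plain SHA256 over the UTF-8 bytes of that string (no HMAC, no key).
computed = hashlib.sha256(concatenated.encode("utf-8")).hexdigest()
print(concatenated)
print(computed)
print("match" if computed == expected else "no match")

If the concatenation or the value formatting differs from this guess, I would really like to know the exact rules.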

Thanks a lot in advance.

