Good day!
I am sending Auditbeat logs to Elasticsearch via Logstash. In the Logstash output I added the document_id parameter to control the document _id in Elasticsearch:
if "xml" in [tags] {
elasticsearch {
hosts => [ "localhost:9200" ]
index => "GT"
user => user
password => password
document_id => "%{[agent][id]}"
doc_as_upsert => true
action => "update"
}
}
}
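To see what %{[agent][id]} actually resolves to for each event, I can temporarily add a stdout output with the rubydebug codec (just a debugging sketch; the metadata option only makes the @metadata fields visible in the printed event):

output {
  stdout {
    # print the full event, including @metadata, to the Logstash console/log
    codec => rubydebug { metadata => true }
  }
}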
In Elasticsearch I end up with only one document: its _id never changes, but its content keeps being overwritten. I expected as many documents as there are log events, each with its own unique _id:
"hits" : {
"total" : {
**"value" : 1,**
"relation" : "eq"
},
"max_score" : 7.95197E-5,
"hits" : [
{
"_index" : "GT",
"_type" : "_doc",
**"_id" : "r8584d681-dd18-4523-a929-c77c04bc62c5",**
"_score" : 7.95197E-5,
"_source" : {
"tags" : [
"xml",
...
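For completeness, the total number of documents in the index can also be double-checked with the standard _count API, which matches the total of 1 shown above:

GET GT/_count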
My Auditbeat config:
auditbeat.modules:
  - module: file_integrity
    paths:
      - C:/xml_logs/GT/xml/
    scan_at_start: true
    recursive: true
setup.template.settings:
  index.number_of_shards: 1
tags: ["xml"]
output.logstash:
  hosts: ["10.1.1.4:5044"]
processors:
  - drop_fields:
      fields: ["log_type", "input_type", "offset", "beat", "source"]
  - add_id: ~
logging:
  to_files: true
  files:
    path: C:/ProgramData/auditbeat/Logs
  level: debug
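I added the add_id processor because, as far as I understand, it generates a unique ID per event (by default under [@metadata][_id]), and my assumption was that the Logstash output could reference that instead of [agent][id], roughly like this (only a sketch of what I think should work, not something I have verified):

output {
  if "xml" in [tags] {
    elasticsearch {
      hosts    => [ "localhost:9200" ]
      index    => "GT"
      user     => "user"
      password => "password"
      # assumption: add_id stores its generated per-event ID in [@metadata][_id]
      document_id => "%{[@metadata][_id]}"
    }
  }
}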
How can I get every Auditbeat event into Elasticsearch as a separate document with its own unique _id?