Hello ELK community!
Let me walk you through a big problem I'm running into. Look at this:
First, I clear out all my data (delete every index):
curl -XDELETE 'http://localhost:9200/_all'
Answer:
{"acknowledged":true}
I display my indices:
curl -XGET 'localhost:9200/_cat/indices?v&pretty'
Answer:
health status index uuid pri rep docs.count docs.deleted store.size pri.store.size
So far so good: there is no more data in my Elasticsearch.
Now, here is the configuration of the pipeline that parses my MalwareBytes logs:
input {
  file {
    path => "/home/XXX/malwarebytes/*.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ";"
    columns => ["Name","Status","Category","Type","EndPoint","Group","Policy","Scanned At","Reported At","Affected Application"]
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "malwarebytes-report"
  }
  stdout {}
}
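For context, the rows in these CSV reports are semicolon-separated and match the columns listed above. A line looks roughly like this (the values here are made up, just to show the shape of the data):
Trojan.Agent;Quarantined;Malware;File;LAPTOP-01;Workstations;Default Policy;2018-01-12 09:41:00;2018-01-12 09:45:00;C:\Users\xxx\Downloads\setup.exe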
Then I send my reports month by month to Elasticsearch (report-jan for January, report-feb, and so on).
For example, I send my January report, which contains 28 records.
I launch my pipeline:
bin/logstash -f /etc/logstash/conf.d/pipeline_malwarebytes.conf --config.reload.automatic
This pipeline is currently watching /home/XXX/malwarebytes/.
I upload my report with WinSCP and then display my indices:
[root@lrtstfpe1 malwarebytes]# curl -XGET 'localhost:9200/_cat/indices?v&pretty'
health status index uuid pri rep docs.count docs.deleted store.size pri.store.size
yellow open malwarebytes-report dSjLd3tdR5qEwtREcHUW0w 5 1 28 0 460b 460b
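(For what it's worth, I believe the document count can also be checked directly on the index with something like: curl -XGET 'localhost:9200/malwarebytes-report/_count?pretty')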
Everything is OK: my index was created and my 28 records are in it. Perfect. But now, the PROBLEM:
I send my 2nd report, the one for March, which contains 150 records.
With my pipeline still running, I upload this 2nd report with WinSCP, display my indices, and...:
[root@lrtstfpe1 malwarebytes]# curl -XGET 'localhost:9200/_cat/indices?v&pretty'
health status index uuid pri rep docs.count docs.deleted store.size pri.store.size
yellow open malwarebytes-report dSjLd3tdR5qEwtREcHUW0w 5 1 178 0 253.7kb 253.7kb
yellow open logstash-2018.03.29 FLz2cD_mQjW5-XnGjF0Twg 5 1 300 0 142.5kb 142.5kb
What is that?! OK, my 150 new records were added to my index (150 + 28 = 178), but another index was created as well!? And why does it hold double my data (150 x 2 = 300)?
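If it helps with the diagnosis, I assume I can inspect what ended up in that unexpected index with something like:
curl -XGET 'localhost:9200/logstash-2018.03.29/_search?size=1&pretty'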
Can anyone help me understand what happened, and how to avoid getting this "logstash-..." index whenever I import my data?
Thank you so much!