Hi all,
I'm trying to manage Elasticsearch records that are shipped by Filebeat. A bash script writes one line to a CSV file on each execution, and Filebeat collects the log from that CSV file.
Below is an example of the header (not actually included in the file):
server_ip,task_number,task_name,result
And the log file looks like this (one line written by the bash script per execution):
10.110.2.5, 1, check file A, False
10.110.2.5, 2, check file B, True
10.110.2.5, 1, check file A, True
I use Filebeat's dissect processor to split the fields and ship them directly to Elasticsearch, without Logstash for now. The problem I'm facing is that records 1 and 3 have the same server_ip, task_number, and task_name and differ only in result; in that case I would like to update the existing document, and only insert a new document when the combination of server_ip, task_number, and task_name has not been seen before.
I know Filebeat itself can't do this, because it just ships each message written to the CSV file as a new event.
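For reference, the dissect processor in my filebeat.yml is roughly like the snippet below (simplified; the tokenizer just mirrors the CSV columns):

processors:
  - dissect:
      # split the CSV line in the "message" field into the four columns
      tokenizer: "%{server_ip}, %{task_number}, %{task_name}, %{result}"
      field: "message"
      # write the extracted fields at the root of the event instead of under "dissect."
      target_prefix: ""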
Currently in Elasticsearch:
record1 ---- 10.110.2.5, 1, check file A, False
record2 ---- 10.110.2.5, 2, check file B, True
record3 ---- 10.110.2.5, 1, check file A, True
The expected result:
record1 ---- 10.110.2.5, 1, check file A, True
record2 ---- 10.110.2.5, 2, check file B, True
The SQL equivalent would be something like this:
Update:
UPDATE dbName SET result = "True" WHERE server_ip = "10.110.2.5" AND task_number = "1" AND task_name = "check file A"
Otherwise, insert:
INSERT INTO dbName (server_ip, task_number, task_name, result) VALUES ("10.110.2.5", "1", "check file B", "True")
Is Logstash able to use an upsert to update a record based on the values of multiple fields like this? Or is there any other way I can handle this?
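What I had in mind (untested, so please correct me) is to route the events through Logstash, build a deterministic document_id from the three key fields with the fingerprint filter, and let the elasticsearch output do a doc_as_upsert. The host, port, and index name below are just placeholders for my setup:

input {
  beats {
    port => 5044
  }
}

filter {
  # build one hash from server_ip + task_number + task_name so the same task
  # on the same server always maps to the same document id
  fingerprint {
    source => ["server_ip", "task_number", "task_name"]
    concatenate_sources => true
    method => "SHA256"
    target => "[@metadata][doc_id]"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "task-results"
    # same key fields -> same id -> the existing document is updated instead of duplicated
    document_id => "%{[@metadata][doc_id]}"
    action => "update"
    doc_as_upsert => true
  }
}

Would something like this work, or is there a better way to handle it?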