Hi Team,
I am new to the ELK stack and am looking for a solution to the problem described below.
ELK Components:
Elasticsearch cluster (1 master and 2 data nodes)
Kibana is installed on the same server as the master
Logstash runs on a separate server
Filebeat is used as the input agent and is installed on the remote Windows servers
Data flow: Filebeat pushes the data to Logstash, Logstash parses it into a structured format, Elasticsearch indexes it, and the data is displayed on the Kibana dashboard. This is the basic function of our ELK stack.
Problem Statement:
Filebeat sends logs from the C: drive and the E: drive of the Windows servers. The log files in these two directories have two different formats, for which we have written two filter patterns. We need to understand how to write the if condition inside the Logstash filter block so that the right pattern is applied to each.
Filebeat Configuration:
filebeat.inputs:
- type: log
  paths:
    - C:\inetpub\logs\LogFiles\test1*
    - E:\logs\test*
output.logstash:
  hosts: ["IP:5044"]
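A side thought: if branching on the full Windows path in Logstash turns out to be awkward, Filebeat also lets each input carry its own tags, which Logstash can branch on instead. A sketch (the tag names "iis_log" and "json_log" are placeholders, not from our actual setup):

```yaml
filebeat.inputs:
- type: log
  paths:
    - C:\inetpub\logs\LogFiles\test1*
  tags: ["iis_log"]    # placeholder tag for the C-drive logs
- type: log
  paths:
    - E:\logs\test*
  tags: ["json_log"]   # placeholder tag for the E-drive logs
output.logstash:
  hosts: ["IP:5044"]
```

The Logstash condition then becomes `if "iis_log" in [tags] { ... }`, with no path matching needed.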
We are getting input from these two log directories, each of which holds around 10-15 log files.
The issue is that we have two different patterns: one for the log files on the C drive and one for those on the E drive.
C Drive Pattern (grok):
%{DATE:Date} %{TIME:time} %{IP:serverIP} %{WORD:httpMethod} %{PATH:appAPIName} %{NOTSPACE:user} %{NUMBER:port} %{NOTSPACE:user} %{IP:clientIP} %{NOTSPACE:sourceClient} %{NOTSPACE:user2} %{NUMBER:statusCode} %{NUMBER:int} %{NUMBER:int1} %{GREEDYDATA:greed}
E Drive Pattern: the logs are in JSON format, parsed with the json filter:
json {
  source => "message"
  target => "response"
}
Now, we want to execute the JSON block when the log path is "E:\logs\test", and the grok pattern when the log path is "C:\inetpub\logs\LogFiles\test1".
How do we write the conditional inside the filter plugin for this?
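In other words, something along these lines is what we are after (a sketch, assuming Filebeat 7.x, where the source file path arrives in the [log][file][path] field; in Filebeat 6.x and earlier the field was [source], so the field name may need adjusting):

```
filter {
  if [log][file][path] =~ /inetpub/ {
    # C-drive IIS-style logs: apply the grok pattern from above
    grok {
      match => { "message" => "%{DATE:Date} %{TIME:time} %{IP:serverIP} %{WORD:httpMethod} %{PATH:appAPIName} %{NOTSPACE:user} %{NUMBER:port} %{NOTSPACE:user} %{IP:clientIP} %{NOTSPACE:sourceClient} %{NOTSPACE:user2} %{NUMBER:statusCode} %{NUMBER:int} %{NUMBER:int1} %{GREEDYDATA:greed}" }
    }
  } else if [log][file][path] =~ /E:\\logs/ {
    # E-drive logs: parse the JSON body
    json {
      source => "message"
      target => "response"
    }
  }
}
```

Matching on a distinctive substring such as "inetpub" avoids having to escape the full Windows path (all the backslashes) in the regex. Is this roughly the right shape?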
Let me know if you need any other details.