PROBLEM:
The grok filter is skipped when I run two .conf files as separate pipelines in Logstash. The data goes into each index as expected, but it bypasses the grok filter: grok works for the first record Logstash receives, is then skipped for all new log entries, and fails for every log entry in the second index created.
RESULT:
If I run each xxx.conf one at a time, each index (site_1_truck, site_1_car) is created and the data is parsed correctly through grok and saved into Elasticsearch.
GOAL:
We have 12 servers (Ubuntu 14.04), each hosting 8 customer websites.
We need to track 5 custom log files for each customer's website.
12 servers
-- 8 customers/server
--- 5 logs_files/customer
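For scale, the layout above multiplies out to 480 log files shipping into Logstash:

```shell
# 12 servers x 8 customers per server x 5 log files per customer
echo $((12 * 8 * 5))   # prints 480
```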
WHAT I RULED OUT:
Server_A can connect to our Logstash server (100.2.3.4):
telnet 100.2.3.4 5044
The Logstash server is listening on ports 5044 and 5045:
netstat -nat
MY TEST:
filebeat 6.2.3
logstash 6.2.4
elasticsearch 6.2.4
I installed one instance of Filebeat on a test server.
------- filebeat.yml on Server_A -------
filebeat.prospectors:
- type: log
  enabled: true
  paths:
    - /usr/local/glassfish/domains/customer_site_1/log/car.log
  fields:
    customer_name: customer_site_1_car
    file_type: car_log
- type: log
  enabled: true
  paths:
    - /usr/local/glassfish/domains/customer_site_1/log/truck.log
  fields:
    customer_name: customer_site_1_truck
    file_type: truck_log

output.logstash:
  hosts: ["100.2.3.4:5044"]
------- Logstash [100.2.3.4] on Ubuntu 16.04 -------
----------------------- Pipelines (/etc/logstash/pipelines.yml) -----------------------
- pipeline.id: process_logs_id
  path.config: "/etc/logstash/conf.d/send_logs.conf"
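For reference, a two-pipeline setup would look roughly like this (the second pipeline id and conf filename are hypothetical, since only send_logs.conf is shown here):

```yaml
# /etc/logstash/pipelines.yml -- sketch only; the second id/filename are assumed
- pipeline.id: process_logs_id
  path.config: "/etc/logstash/conf.d/send_logs.conf"
- pipeline.id: process_logs_id_2
  path.config: "/etc/logstash/conf.d/send_logs_2.conf"
```

With separate pipelines, each conf file needs its own beats input bound to a distinct port (which would line up with the 5044/5045 ports mentioned above), since two pipelines cannot bind the same port.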
NOTE: how I tested the config files (as I understand it, passing -f makes Logstash ignore pipelines.yml, so this validates the single file only, not the pipeline setup):
/usr/share/logstash/bin# ./logstash --config.test_and_exit -f /etc/logstash/conf.d/send_logs.conf
Config Validation Result: OK.
------- Conf Files (/etc/logstash/conf.d/) -------
send_logs.conf
input {
  beats {
    port => 5044
    type => "log"
  }
}
filter {
  if [fields][file_type] == 'car_log' {
    grok {
      match => { "message" => "\[%{TIMESTAMP_ISO8601:timestamp_utc_jvm}\] \[(?:%{DATA:app_server}|)\] \[%{WORD:severity}\] \[(?:%{DATA:glassfish_code}|)\] \[(?:%{JAVACLASS:java_pkg}|)\] \[(?<Thread_Name>[^\]]+)\] \[(?<timeMillis>[^\]]+)\] \[(?<levelValue>[^\]]+)\] (?<log_msg>(?m:.*))" }
    }
  }
  if [fields][file_type] in ['truck_log', 'next_log_type'] {
    grok {
      match => { "message" => "\[%{TIMESTAMP_ISO8601:timestamp_utc_jvm}\] %{WORD:severity} %{JAVACLASS:java_pkg} \[(?<Thread_Name>[^\]]+)\] (?<log_msg>(?m:.*))" }
    }
  }
}
output {
  elasticsearch {
    hosts => "your_elastic_search_server.com:9200"
    manage_template => false
    index => "%{[fields][customer_name]}_%{[fields][file_type]}"
  }
  stdout {}
}
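One way to narrow down whether grok is being skipped (the conditional never matches) or failing (the pattern does not match) is to inspect the events themselves. A debugging sketch, not the production output, relying on the standard rubydebug codec and the default _grokparsefailure tag:

```
output {
  # A failed grok match adds the default "_grokparsefailure" tag;
  # a skipped grok (conditional never true) adds nothing.
  if "_grokparsefailure" in [tags] {
    stdout { codec => rubydebug }
  }
  # Printing every event with rubydebug instead would also show the
  # [fields] values the conditionals test:
  # stdout { codec => rubydebug }
}
```

If events show up without the tag and without the parsed fields, the [fields][file_type] values coming from Filebeat likely never matched the conditionals; if the tag is present, the grok pattern itself is failing on those lines.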