Two Logstash conf files, only one works

Hi there,

I have two configuration files that process two log files shipped from remote servers with Filebeat. Each one works by itself, but only one of them works when both are in /etc/logstash/conf.d.

Let's call them A and B; A always works.

I get errors from B like the ones below:
[2017-09-20T12:05:48,732][DEBUG][logstash.filters.grok ] Event now: {:event=>2017-09-20T20:00:09.809Z scansrv 2017-09-20 16:00:09,809 [53440178] INFO - ca.toronto.csd.scanagt.listener.ScanAgentMessageCountListener - message added javax.mail.event.MessageCountEvent[source=INBOX] destination /data7/user3 }
[2017-09-20T12:05:48,733][DEBUG][logstash.util.decorators ] filters/LogStash::Filters::Mutate: adding value to field {"field"=>"read_timestamp", "value"=>["%{@timestamp}"]}
[2017-09-20T12:05:48,733][DEBUG][logstash.filters.geoip ] IP was not found in the database {:event=>2017-09-20T20:00:09.809Z scansrv 2017-09-20 16:00:09,809 [53440178] INFO - ca.toronto.csd.scanagt.listener.ScanAgentMessageCountListener - message added javax.mail.event.MessageCountEvent[source=INBOX] destination /data7/user3 }

Configuration Files:

A:

input {
  beats {
    port => "6044"
    #client_inactivity_timeout => 6000
  }
}

filter {
  grok {
    match => { "message" => [
      "%{IPORHOST:[apache2][access][remote_ip]} - %{DATA:[apache2][access][user_name]} \[%{HTTPDATE:[apache2][access][time]}\] \"%{WORD:[apache2][access][method]} %{DATA:[apache2][access][url]} HTTP/%{NUMBER:[apache2][access][http_version]}\" %{NUMBER:[apache2][access][response_code]} %{NUMBER:[apache2][access][body_sent][bytes]}( \"%{DATA:[apache2][access][referrer]}\")?( \"%{DATA:[apache2][access][agent]}\")?",
      "%{IPORHOST:[apache2][access][remote_ip]} - %{DATA:[apache2][access][user_name]} \[%{HTTPDATE:[apache2][access][time]}\] \"-\" %{NUMBER:[apache2][access][response_code]} -"
    ] }
    remove_field => "message"
  }
  mutate {
    add_field => { "read_timestamp" => "%{@timestamp}" }
  }
  date {
    match => [ "[apache2][access][time]", "dd/MMM/YYYY:H:m:s Z" ]
    remove_field => "[apache2][access][time]"
  }
  useragent {
    source => "[apache2][access][agent]"
    target => "[apache2][access][user_agent]"
    remove_field => "[apache2][access][agent]"
  }
  geoip {
    source => "[apache2][access][remote_ip]"
    target => "[apache2][access][geoip]"
  }
  if "_grokparsefailure" in [tags] or [apache2][access][user_name] == "-" {
    drop { }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    manage_template => false
    index => "svn-apache-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
  stdout { codec => rubydebug }
}

B (Scan Server MailBot log):

input {
  beats {
    port => "5045"
  }
}

filter {
  if [fields][document_type] == "scanserver-mailbot-log" {
    grok {
      patterns_dir => "./patterns"
      break_on_match => false
      match => { "message" => [
        "%{TIMESTAMP_ISO8601:timestamp} \[%{NUMBER:unknowncode}\] %{LOGLEVEL:loglevel} - %{JAVACLASS:classname} - (.*) destination %{UNIXPATH:userhome}/%{USERNAME:username}",
        "%{TIMESTAMP_ISO8601:timestamp} \[%{NUMBER:unknowncode}\] %{LOGLEVEL:loglevel} - %{JAVACLASS:classname} - Catch exception %{JAVACLASS:exception}[:] %{GREEDYDATA:excepmsg}"
      ] }
      remove_field => "message"
    }

    date {
      match => [ "timestamp", "YYYY-MM-dd HH:mm:ss,SSS" ]
      remove_field => "timestamp"
    }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    manage_template => false
    index => "%{[fields][document_type]}-%{+YYYY.MM.dd}"
  }
  stdout { codec => rubydebug }
}

Thanks,
Fei

Until 6.0 is released with the multiple pipelines feature, Logstash will merge all of your configuration files in /etc/logstash/conf.d into a single, virtual configuration file. That means any filter and output blocks not bounded by conditionals will process events from all inputs. I see only one of your filter blocks bounded by conditionals, and neither of your output blocks is bounded.
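
For example, here is a minimal sketch of one way to keep the two pipelines separate, using per-input tags. The tag names apache and mailbot are made up for illustration; you could equally key the conditionals on the [fields][document_type] field that B's filter already checks.

# A: tag every event that arrives on this input (tag name is illustrative)
input {
  beats {
    port => "6044"
    tags => ["apache"]
  }
}

# ...and wrap A's filter and output blocks in the matching conditional
filter {
  if "apache" in [tags] {
    # A's grok / mutate / date / useragent / geoip filters go here
  }
}

output {
  if "apache" in [tags] {
    elasticsearch {
      hosts => ["localhost:9200"]
      manage_template => false
      index => "svn-apache-%{+YYYY.MM.dd}"
    }
  }
}

# Repeat in B with its own tag (e.g. "mailbot") on the port 5045 input so
# B's elasticsearch output only ever sees B's events.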


Thanks for your prompt response, theuntergeek.

Added the conditions, and it works.
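
In case it helps others, the condition for B's output looks roughly like this (a sketch, keyed on the same [fields][document_type] value the filter already checks; A's blocks get an equivalent guard):

output {
  # only index events that came from the mailbot input
  if [fields][document_type] == "scanserver-mailbot-log" {
    elasticsearch {
      hosts => ["localhost:9200"]
      manage_template => false
      index => "%{[fields][document_type]}-%{+YYYY.MM.dd}"
    }
  }
}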

Thank you again.

Fei

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.