Hi,
I have to ship differently formatted logs that are stored under the same path. Using Filebeat as a log shipper, I was able to send the individual log files to Logstash. By specifying multiple prospectors with a document_type in filebeat.yml, I can see that Filebeat is sending events to Logstash. But how do I tell Logstash, in its configuration, to filter the events for a specific type?
Use conditionals: https://www.elastic.co/guide/en/logstash/current/event-dependent-configuration.html#conditionals
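For example, something along these lines (a minimal sketch; the "server" type and the grok pattern are just placeholders for whatever your events actually contain):

filter {
  # Only apply this grok to events whose type field is "server".
  if [type] == "server" {
    grok {
      match => { "message" => "%{COMMONAPACHELOG}" }
    }
  }
}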
@magnusbaeck Thank you for the response. I am using a codec in the input section in most of my Logstash config files. How can I tell Logstash to process the events from Filebeat only when the type matches?
I'm adding my filebeat.yml file and logstash.conf file here.
> ############################# Filebeat ######################################
> filebeat:
>   # List of prospectors to fetch data.
>   prospectors:
>     -
>       paths:
>         # - /var/log/*.log
>         - /tmp/logs/standalone/log/server.log
>       document_type: server
>       input_type: log
>     -
>       paths:
>         - /tmp/logs/standalone/log/systemevent.log
>       document_type: system
>       input_type: log
>
> ############################# Output ##########################################
> # Configure what outputs to use when sending the data collected by the beat.
> # Multiple outputs may be used.
> output:
>   ### Logstash as output
>   logstash:
>     # The Logstash hosts
>     hosts: ["mylogstashhostsip:andport"]
Logstash config file:
input {
  beats {
    port => 5044
    # type => "server"
    codec => custom_plugin {
      pattern => "^%{DATESTAMP_OTHER:time}"
      negate => true
      what => "previous"
      auto_flush_interval => 10
    }
  }
}
filter {
  if [type] == "server" {
    grok {
      match => { "message" => "%{COMMONAPACHELOG}" }
    }
  }
}
output {
  elasticsearch {
    hosts => ["eshostname"]
    index => "serverlogs"
  }
}
Can we add a type in the input section to tell Logstash to accept events from Filebeat only when the type matches?
No, but you can use the drop filter to selectively drop events you're not interested in. Wrap it in a conditional similar to how you've wrapped the grok filter.
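For example, to keep only events of a given type in this pipeline (a minimal sketch; swap in your own type value):

filter {
  # Discard any event that isn't a "server" event.
  if [type] != "server" {
    drop {}
  }
}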
Ok, thank you again. I have two different Logstash config files to parse the two different log formats. So, by adding the type condition to both of those config files, I can route the events to the right filter based on the type of data format (server and system).
Logstash config file for server type:
filter {
  if [type] == "server" {
    grok {
      match => { "message" => "%{COMMONAPACHELOG}" }
    }
    if "_grokparsefailure" in [tags] {
      drop {}
    }
  }
}
Logstash config file for system type:
filter {
if [type] == "system" {
grok {
match => ["message" => "%{SYSLOGBASE}" ]
}
if "_grokparsefailure" in [tags] {
drop {}
}
}
}
Will something like this work, or do I need to add another conditional to drop events of the other type?
For example, in the server.conf file, drop the events whose type doesn't match:
if [type] == "system" {
  drop {}
}
I'm not sure exactly what you mean, but I sense some confusion over how Logstash deals with multiple configuration files. When they're loaded they are concatenated and are not treated separately in any way. Whether you list your inputs, outputs, and filters in one or two files doesn't matter (except the internal ordering of filters).
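Concretely, if your two files contain the filters you posted above, Logstash runs them as one pipeline, roughly equivalent to this (a sketch; the comments mark what each block contributes):

filter {
  # From the "server" config file: only fires for server events.
  if [type] == "server" {
    grok { match => { "message" => "%{COMMONAPACHELOG}" } }
  }
}
filter {
  # From the "system" config file: only fires for system events.
  if [type] == "system" {
    grok { match => { "message" => "%{SYSLOGBASE}" } }
  }
}

Every event passes through both blocks; the conditionals decide which grok actually runs.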
I mean to say: do we need an additional conditional for the drop filter, like below?
if [type] == "system" {
  drop {}
}
Or can we simply use "_grokparsefailure" inside the filter?
filter {
  if [type] == "server" {
    grok {
      match => { "message" => "%{COMMONAPACHELOG}" }
    }
    if "_grokparsefailure" in [tags] {
      drop {}
    }
  }
}
Use whichever conditional fits best. Logstash certainly doesn't care whether you use "_grokparsefailure" in [tags] or [type] == "system", as long as they're logically equivalent.
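For instance, you could even combine the two into a single condition (a sketch; Logstash conditionals support and/or):

filter {
  # Drop server events whose message didn't match the grok pattern.
  if [type] == "server" and "_grokparsefailure" in [tags] {
    drop {}
  }
}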