Filter multiple different Filebeat logs in Logstash

Hi,

I am currently sending Apache access logs from a remote server to my Logstash server by running Filebeat on the remote server. This is working like a charm.

I would like to also send other logs with different content, using Filebeat, from the same remote server to the same Logstash server, and parse those log files separately. For example, I want to create a custom field and a separate index for the other logs that I send from that server.

Is this possible? If so what is the syntax that I would need to put in my Logstash config file? Can I just use one config file or would I need multiple config files?


Is this possible? If so what is the syntax that I would need to put in my Logstash config file?

What one typically does is assign different types to different kinds of logs (you can do that from Filebeat). Then you can use conditionals in your Logstash configuration to do different things with each type of log.
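As a sketch of that approach (the type names and paths here are made up for illustration, not taken from your setup), the Filebeat side assigns a type per prospector:

```yaml
# filebeat.yml on the remote server (Filebeat 5.x prospector syntax; paths are hypothetical)
filebeat.prospectors:
- input_type: log
  document_type: apache-access      # becomes the event's "type" field in Logstash
  paths:
    - /etc/httpd/logs/access_log
- input_type: log
  document_type: tools-message
  paths:
    - /var/log/tools/*.log
```

and the Logstash side branches on it:

```
# Logstash pipeline: branch on the type Filebeat assigned
filter {
  if [type] == "apache-access" {
    grok { match => { "message" => "%{COMBINEDAPACHELOG}" } }
  } else if [type] == "tools-message" {
    # whatever parsing the other logs need goes here
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "%{type}_index"    # one index per type
  }
}
```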

Can I just use one config file or would I need multiple config files?

Doesn't matter.


What one typically does is assign different types to different kinds of logs (you can do that from Filebeat). Then you can use conditionals in your Logstash configuration to do different things with each type of log.

Accessing event data and fields | Logstash Reference [8.11] | Elastic

Right, so I'm having a difficult time understanding the syntax that needs to be configured in the filebeat.yml file and my logstash config file "test3.conf".

Here is what I have in my filebeat.yml:

filebeat.prospectors:
- input_type: log   # <-- This needs to be "log"; if I change it to "tools-message" then Filebeat doesn't start
  paths:
    - /var/log/*.log
  fields: {log_type: tools-message}

Here is what I have in my "test3.conf" logstash config file:

input {
  beats {
    port => "5043"
  }
}

filter {
  if [ log_type = "tools-message" ] {
    grok {
      add_tag => [ "bazzinga_testtag" ]
    }
    mutate {
      replace => { "%{type}" => "tools-message" }
    }
  }
}

output {
  elasticsearch {
    hosts => ["10.111.119.211:9200"]
    index => "%{type}_index"
  }
  #stdout { codec => rubydebug }
}

This Logstash config file does not create an index named "tools-message_index". Instead it creates an index called "log_index". I want to be able to have multiple different indices based on the type of log that Logstash is parsing from Filebeat.

I believe I figured it out. Not sure if it's the right way, or an unusual way of doing it, but this is what my configs look like, and it yields me two distinct indices per log file type:

filebeat.yml:

filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/*.log
  fields: {log_type: toolsmessage}

- input_type: log
  paths:
    - /etc/httpd/logs/ssl_access_*
  fields: {log_type: toolsaccess}

test3.conf (logstash config file):

input {
  beats {
    port => "5043"
  }
}

filter {
  if ([fields][log_type] == "toolsmessage") {
    mutate {
      replace => {
        "[type]" => "toolsmessage"
      }
    }
  }
  else if ([fields][log_type] == "toolsaccess") {
    mutate {
      replace => {
        "[type]" => "toolsaccess"
      }
    }
  }
}

output {
  elasticsearch {
    hosts => ["10.111.119.211:9200"]
    index => "%{type}_index"
  }
 #stdout { codec => rubydebug }
}
input_type: log <-- This needs to be "log"; if I change it to "tools-message" then Filebeat doesn't start

Correct. It's the document_type option you're looking for.
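For example (a sketch using the Filebeat 5.x prospector syntax and the path from your config):

```yaml
filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/*.log
  document_type: tools-message   # sets the event's "type" field; input_type itself stays "log"
```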

if [ log_type = "tools-message" ] {

Change to:

if [log_type] == "tools-message" {

See examples in the documentation I linked to.

grok {
add_tag => [ "bazzinga_testtag" ]
}

This won't work. To unconditionally add a tag, use a mutate filter and its add_tag option.
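For example:

```
filter {
  mutate {
    add_tag => [ "bazzinga_testtag" ]
  }
}
```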

replace => { "%{type}" => "tools-message" }

type, not %{type}.
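That is:

```
mutate {
  replace => { "type" => "tools-message" }
}
```

With `%{type}` in the key position, Logstash would interpolate the field's current value and use that as the name of the field to replace, which is why the index ended up named after the original type.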

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.