How to differentiate log files sent from Filebeat?

Hi,

I'm trying to configure one Logstash pipeline for my environments (test, dev, prd, acp). From each server I want to ship three log files (alfresco.log, share.log, access.log) using Filebeat. My understanding is that I create one pipeline, test-pipeline.conf, and in its output section define "if [type]" conditionals that check the given log name; for example, the alfresco.log file from the test environment would have the type "test_alfresco_log", and when that matches I create an index for it. I would then repeat this for every log from each environment.

My question is: where can I define the type property (test_alfresco_log)? I tried to define these types in filebeat.yml (see the full file below, and the sketch of a working alternative right after it), but that can't be the right way, because Filebeat won't even start with this modification.

Can anybody help?

See my config files:

filebeat.yml

filebeat.inputs:

# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.
# Below are the input specific configurations.

- type: test_alfresco_log

  # Change to true to enable this input configuration.
  enabled: true

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    #- /var/log/*.log
    - c:\Alfresco\alfresco5\alfresco.log

- type: test_share_log

  # Change to true to enable this input configuration.
  enabled: true

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    #- /var/log/*.log
    - c:\Alfresco\alfresco5\share.log
     
- type: test_access_log

  # Change to true to enable this input configuration.
  enabled: true

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    #- /var/log/*.log
    - c:\Alfresco\alfresco5\tomcat\logs\access*.log
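
A note on why this fails: in Filebeat 6.x the type setting of an input selects the input implementation (log, stdin, and so on), so made-up values like test_alfresco_log are rejected during config validation, and Filebeat refuses to start. A minimal sketch of the first input rewritten the usual way, with the identifier moved into a custom field (the field name and value here are just examples):

- type: log
  enabled: true
  paths:
    - c:\Alfresco\alfresco5\alfresco.log
  fields:
    # arbitrary marker for Logstash to branch on instead of [type]
    log_type: test_alfresco_log

Logstash conditionals would then test [fields][log_type] == "test_alfresco_log" rather than [type].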

test-pipeline.conf

# Sample Logstash configuration for creating a simple
# Beats -> Logstash -> Elasticsearch pipeline.

input {
  beats {
    port => 5044
    host => "HYPALFMONPRD1"
  }
}

filter {
  grok {
    break_on_match => false
    match => [ "message", "TID: \[%{TIMESTAMP_ISO8601:timestamp}\] %{LOGLEVEL:level} \[%{JAVACLASS:java_class}\] \(%{GREEDYDATA:thread}\) - (?<log_message>(.|\r|\n)*)" ]
  }
  date {
    # "timestamp" is the field the grok pattern above captures into
    match => [ "timestamp", "ISO8601" ]
  }
}

output {
  if [type] == "test_alfresco_log" {
    elasticsearch {
      hosts => ["HYPALFMONPRD1:9200"]
      sniffing => true
      manage_template => false
      index => "test_alfresco_log-%{+YYYY.MM.dd}"
      document_type => "%{[@metadata][type]}"
    }
  }
  if [type] == "test_share_log" {
    elasticsearch {
      hosts => ["localhost:9200"]
      sniffing => true
      manage_template => false
      index => "test_share_log-%{+YYYY.MM.dd}"
      document_type => "%{[@metadata][type]}"
    }
  }
  if [type] == "test_access_log" {
    elasticsearch {
      hosts => ["localhost:9200"]
      sniffing => true
      manage_template => false
      index => "test_access_log-%{+YYYY.MM.dd}"
      document_type => "%{[@metadata][type]}"
    }
  }
  ### The other environments' if statements go here...
}
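
For reference, the grok pattern above only matches lines shaped like the following (a made-up example constructed from the pattern itself, not taken from a real Alfresco log):

TID: [2019-04-01T12:34:56,789] INFO [org.alfresco.repo.node.NodeServiceImpl] (http-apr-8080-exec-1) - Node created

If the real log lines look different, grok tags the events with _grokparsefailure and none of the captured fields (timestamp, level, and so on) will exist.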

pipelines.yml

# List of pipelines to be loaded by Logstash
#
# This document must be a list of dictionaries/hashes, where the keys/values are pipeline settings.
# Default values for omitted settings are read from the `logstash.yml` file.
# When declaring multiple pipelines, each MUST have its own `pipeline.id`.
#
# Example of two pipelines:
#
# - pipeline.id: test
#   pipeline.workers: 1
#   pipeline.batch.size: 1
#   config.string: "input { generator {} } filter { sleep { time => 1 } } output { stdout { codec => dots } }"
# - pipeline.id: another_test
#   queue.type: persisted
#   path.config: "/tmp/logstash/*.config"
#
# PIPELINES:

- pipeline.id: test-pipeline
  path.config: "/etc/path/to/p1.config"
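
Note that /etc/path/to/p1.config is the placeholder from the shipped example file; path.config must point at the actual pipeline file. A sketch with a hypothetical location:

- pipeline.id: test-pipeline
  path.config: "/etc/logstash/conf.d/test-pipeline.conf"   # hypothetical path; point it at wherever test-pipeline.conf actually lives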

Filebeat version: 6.6.2
Logstash version: 6.6.1
Kibana version: 6.6.1
Elasticsearch version: 6.6.1
APM Server version: 6.6.2
APM Agent language and version: Java, 1.4.0

You could try this: Creating index depends on source

This is my solution, as I don't like if statements in the output section.

I tried it with your configuration, but it only registers and creates an index for the first log (alfresco.log) defined in filebeat.yml, not for the other two. Can you please check my configuration below?

filebeat.yml

filebeat.inputs:

# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.
# Below are the input specific configurations.

- type: log
  enabled: true

  paths:
    - c:\Alfresco\alfresco5\alfresco.log 
  encoding: plain
  fields:
    log_prefix: test
    log_idx: alfresco_log
  fields_under_root: false
  
- type: log
  enabled: true
  
  paths:
    - c:\Alfresco\alfresco5\share.log
  encoding: plain
  fields:
    log_prefix: test
    log_idx: share_log
  fields_under_root: false

- type: log
  enabled: true

  paths:
    - c:\Alfresco\alfresco5\tomcat\logs\access*.log 
  encoding: plain
  fields:
    log_prefix: test
    log_idx: access_log
  fields_under_root: false
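
With fields_under_root: false, Filebeat nests these custom fields under a top-level fields object, so an event arriving at Logstash looks roughly like this (trimmed, illustrative values):

{
  "@timestamp": "2019-04-01T12:34:56.789Z",
  "message": "...",
  "fields": {
    "log_prefix": "test",
    "log_idx": "alfresco_log"
  }
}

That nesting is why the mutate filter below has to copy from [fields][log_prefix] and [fields][log_idx].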

test-pipeline.conf

# Sample Logstash configuration for creating a simple
# Beats -> Logstash -> Elasticsearch pipeline.

input {
  beats {
    port => 5044
	host => "MONPRD1"
  }
}

filter{
    grok {
		break_on_match => false
		match =>  [ "message", "TID: \[%{TIMESTAMP_ISO8601:timestamp}\] %{LOGLEVEL:level} \[%{JAVACLASS:java_class}\] \(%{GREEDYDATA:thread}\) - (?<log_message>(.|\r|\n)*)"]
    }
	date{
		match => [ "timestamp", "ISO8601"]
	}
	mutate {
		copy => {
		"[fields][log_prefix]" => "[@metadata][log_prefix]"
		"[fields][log_idx]" => "[@metadata][index]"
		}
	}
}

output {
	elasticsearch {
		hosts => ["MONPRD1:9200"]
		sniffing => true
		manage_template => false
		index => "%{[@metadata][log_prefix]}-%{[@metadata][index]}-%{+YYYY.MM.dd}"
		document_type => "%{[@metadata][type]}"
	}
}
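
When both metadata fields are populated, the index option resolves per event, so the alfresco input above should write to daily indices named like (example date):

test-alfresco_log-2019.04.01

If a field is missing on an event, Logstash leaves the sprintf reference unresolved and you get a literal index name such as test-%{[@metadata][index]}-2019.04.01 in Elasticsearch, which is a quick thing to check when only one of the expected indices shows up.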

Looks mostly ok...

In the filter section of the Logstash config, the indentation is a bit off, and I would put a space between date and the {.

filter{
    grok {
		break_on_match => false
		match =>  [ "message", "TID: \[%{TIMESTAMP_ISO8601:timestamp}\] %{LOGLEVEL:level} \[%{JAVACLASS:java_class}\] \(%{GREEDYDATA:thread}\) - (?<log_message>(.|\r|\n)*)"]
    }
    date {
		match => [ "timestamp", "ISO8601"]
    }
    mutate {
		copy => {
		"[fields][log_prefix]" => "[@metadata][log_prefix]"
		"[fields][log_idx]" => "[@metadata][index]"
		}
    }
}
