Hi there
I'm using ELK 7.10.
My test setup is as follows:
filebeat (running on the kibana host) -> logstash01 -> solace pubsub+ -> logstash02 -> Elasticsearch -> kibana
What I'm trying to accomplish:
Use custom fields created in Filebeat in the index name of logstash02's Elasticsearch output.
This is my filebeat module yml:
- module: system
  syslog:
    enabled: true

    # Set custom paths for the log files. If left empty,
    # Filebeat will choose the paths depending on your OS.
    #var.paths:

    input:
      processors:
        - add_fields:
            fields:
              env: dev
              app: kibana-sys
              type: filebeat
        - add_tags:
            tags: ["app", "syslog"]
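If I understand the add_fields processor correctly, it nests custom fields under a `fields` key by default (since I set no `target`), so each event Filebeat ships should look roughly like this (a sketch; only the field names come from my config above):

```json
{
  "message": "<original syslog line>",
  "fields": {
    "env": "dev",
    "app": "kibana-sys",
    "type": "filebeat"
  },
  "tags": ["app", "syslog"]
}
```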
My logstash01 pipeline configuration (a .conf file under conf.d, not logstash.yml) is:
input {
  beats {
    port => "5044"
    ssl => true
    ssl_key => '/etc/logstash/certs/logstash.pkcs8.key'
    ssl_certificate => '/etc/logstash/certs/logstash.crt'
  }
}

output {
  jms {
    delivery_mode => "persistent"
    destination => "elk-queue"
    jndi_name => "/jms/cf/my-jms"
    jndi_context => {
      'java.naming.factory.initial' => 'com.solacesystems.jndi.SolJNDIInitialContextFactory'
      'java.naming.security.principal' => 'default@elk-msg-vpn'
      'java.naming.provider.url' => 'tcp://10.10.10.191:55555'
      'java.naming.security.credentials' => 'Admincod3'
    }
    require_jars => ['/var/lib/jms/commons-lang-2.6.jar',
                     '/var/lib/jms/sol-jms-10.9.1.jar',
                     '/var/lib/jms/geronimo-jms_1.1_spec-1.1.1.jar']
  }
}
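In case it matters: I assume the jms output has to serialize the whole event as JSON for logstash02 to see my custom fields. I haven't confirmed what the plugin's default codec is, but pinning it explicitly would look like this (a sketch):

```
output {
  jms {
    codec => json   # serialize the full event, including [fields][app]
    # ...same destination / jndi_context / require_jars settings as above...
  }
}
```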
And my logstash02 pipeline configuration is:
input {
  jms {
    include_header => false
    include_properties => false
    include_body => true
    destination => "elk-queue"
    jndi_name => "/jms/cf/my-jms"
    jndi_context => {
      'java.naming.factory.initial' => 'com.solacesystems.jndi.SolJNDIInitialContextFactory'
      'java.naming.security.principal' => 'default@elk-msg-vpn'
      'java.naming.provider.url' => 'tcp://10.10.10.191:55555'
      'java.naming.security.credentials' => 'Admincod3'
    }
    require_jars => ['/usr/share/jms/commons-lang-2.6.jar',
                     '/usr/share/jms/sol-jms-10.9.1.jar',
                     '/usr/share/jms/geronimo-jms_1.1_spec-1.1.1.jar']
  }
}
filter {
  json {
    source => "message"
    target => "parsed_message"
    tag_on_failure => [ "_jsonparsefailure" ]
  }
}
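To check whether [parsed_message][fields][app] actually exists on the event at output time (rather than only appearing after ES ingest), I could add a temporary debug output like this (sketch):

```
output {
  stdout { codec => rubydebug }   # dump the full event structure to the Logstash log
}
```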
output {
  elasticsearch {
    hosts => ["https://10.10.10.10:9200", "https://10.10.10.11:9200", "https://10.10.10.12:9200"]
    index => "%{[parsed_message][fields][app]}-%{+YYYY.MM.dd}"
    cacert => "/etc/logstash/certs/ca.crt"
    user => "logstash_writer"
    password => "Admincod3"
  }
}
I have attached a screencap below that shows parsed_message.fields.app is actually captured in ES, but I can't use it in the index name: the index comes out as %{[parsed_message][fields][app]}-2020.12.25 instead of kibana-sys-2020.12.25.
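As I understand it, Logstash leaves a %{[...]} sprintf reference verbatim in the index name when the field path is missing from the event, which matches what I'm seeing. A rough Python sketch of that resolution (resolve_index is my own illustration, not Logstash code; the %{+YYYY.MM.dd} date part is shown already resolved):

```python
import re

def resolve_index(pattern: str, event: dict) -> str:
    """Substitute %{[a][b]}-style field references from a nested event dict.
    Missing paths are left as the literal placeholder, mimicking Logstash."""
    def sub(match):
        path = re.findall(r"\[([^\]]+)\]", match.group(1))
        value = event
        for key in path:
            if not isinstance(value, dict) or key not in value:
                return match.group(0)  # unresolved: keep the literal text
            value = value[key]
        return str(value)
    return re.sub(r"%\{((?:\[[^\]]+\])+)\}", sub, pattern)

event = {"parsed_message": {"fields": {"app": "kibana-sys"}}}
print(resolve_index("%{[parsed_message][fields][app]}-2020.12.25", event))
# path present -> "kibana-sys-2020.12.25"
print(resolve_index("%{[parsed_message][fields][app]}-2020.12.25", {}))
# path missing -> the literal placeholder stays (my current index name)
```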
I have tried restarting the Logstash and Filebeat services too, but no luck there. How can I do this?
Thanks and Merry X'mas from Singapore!
ck