I am using ELK 6.4.0 and Filebeat 6.4.0. Currently I am sending only my application logs to Elasticsearch using Filebeat, parsing them via Logstash. Now we also want to ship the Nginx logs (error and access) to Elasticsearch, but I am not sure how to add the grok pattern for Nginx to my current filter.
In your Filebeat configuration, set fields that indicate the type of log you have. Then add conditionals to your Logstash configuration to choose between different filters.
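A minimal sketch of the Filebeat side, assuming a custom field named `log_type` and the paths shown (the Nginx paths here are illustrative, not taken from your setup):

```yaml
filebeat.inputs:
  # Existing application log input, tagged so Logstash can recognize it
  - type: log
    enabled: true
    paths:
      - /var/apps/mobilock/shared/log/production.log
    fields:
      log_type: application

  # New input for Nginx access logs, tagged with a different value
  - type: log
    enabled: true
    paths:
      - /var/log/nginx/access.log
    fields:
      log_type: nginx_access
```

By default these custom fields land under `[fields]` in the event, which is what the Logstash conditionals then test against.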
Currently I have set up a `type: log` input in my Filebeat config. Please refer to the config part below:
#=========================== Filebeat inputs =============================
filebeat.inputs:
# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.
# Below are the input specific configurations.
- type: log
  # Change to true to enable this input configuration.
  enabled: true
  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    # - /var/log/*.log
    - /var/apps/mobilock/shared/log/production.log
Can you please give me a small example? I am running out of ideas here, because I have never used conditional-based filters.
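A small sketch of a conditional Logstash filter, assuming the inputs were tagged with a custom `log_type` field as described above (the field name is an assumption, and `%{COMBINEDAPACHELOG}` is used because Nginx's default access log format matches the Apache combined format):

```
filter {
  if [fields][log_type] == "nginx_access" {
    # Parse Nginx access lines with the combined-log grok pattern
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
  } else if [fields][log_type] == "application" {
    # Your existing application grok/filter config goes here, unchanged
  }
}
```

Events that match neither condition pass through the filter section untouched, so the existing pipeline keeps working while you add new log types one conditional at a time.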
I have enabled the Filebeat Nginx module and started getting the Nginx logs in Kibana.
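For reference, enabling the module and loading the bundled Kibana dashboards are two separate steps; a sketch of both (the dashboards do not appear in Kibana until the setup step has been run against it):

```shell
# Enable the nginx module (creates/activates modules.d/nginx.yml)
filebeat modules enable nginx

# Load the index template and the sample Kibana dashboards
filebeat setup --dashboards
```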
But I am facing another issue: the Nginx logs are showing up in Kibana, yet the Filebeat Nginx dashboard is not showing any data. Please refer to the screenshots below.
I am also using a Filebeat prospector for our own application. Does that have any impact?