Hi Everyone,
I just found out about ELK + Filebeat and set up my own stack. First I played around with the default system log files, which worked fine. Then I started pushing my own application log file into it to find out how I can leverage ELK for my own application. However, I ran into a big issue where you might be able to help me out. But first of all, please find my configuration files below:
/etc/filebeat/filebeat.yml
filebeat:
  prospectors:
    -
      paths:
        - /home/fabiansc/Dokumente/*
      document_type: app-access
    -
      paths:
        - /var/log/nginx/access.log
/etc/logstash/conf.d/11-accesslog-filter.conf
filter {
  if [type] == "app-access" {
    grok {
      match => { "message" => "%{PKNUMBER:user_pk} %{DATETIME:time} %{NUMBER:duration} %{NUMBER:bytes} %{IPADDRESS:source_ip} %{IPADDRESS:destination_ip} %{PORT:port} %{HOSTNAME:source_hostname} %{HOSTNAME:destination_hostname} %{WORD:method} %{URIPATHPARAM:request} %{NUMBER:status_code}" }
    }
    syslog_pri {}
    date {
      match => [ "time", "YYYY-MM-dd HH:mm:ss", "MM dd HH:mm:ss" ]
    }
  }
}
/opt/logstash/patterns/accesslog
PKNUMBER [0-9]{8}[a-zA-Z]
DATETIME [0-9]{4}-[0-9]{2}-[0-9]{2}[ \t]+[0-9]{2}:[0-9]{2}:[0-9]{2}
PORT :[0-9]{3,}
IPADDRESS (25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)(\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)){3}
HOSTNAME [a-zA-Z0-9-.]{1,200}.me.de
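From what I have read, grok only consults a custom pattern file like this one when the filter is pointed at its directory, typically via the patterns_dir option (a sketch of how I understand that option is set; I am not sure whether my setup already covers this some other way):

```
grok {
  # tell grok where to find the custom patterns (PKNUMBER, DATETIME, PORT, ...)
  patterns_dir => ["/opt/logstash/patterns"]
  match => { "message" => "%{PKNUMBER:user_pk} %{DATETIME:time} %{NUMBER:duration} %{NUMBER:bytes} %{IPADDRESS:source_ip} %{IPADDRESS:destination_ip} %{PORT:port} %{HOSTNAME:source_hostname} %{HOSTNAME:destination_hostname} %{WORD:method} %{URIPATHPARAM:request} %{NUMBER:status_code}" }
}
```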
/home/fabiansc/Dokumente/access.log (logfile provided to filebeat)
00001234A 2017-03-31 09:49:10 0.52 2650 10.20.30.40 90.80.70.60:8010 source.me.de target.me.de POST /myServlet 200
00001234B 2017-03-31 09:50:10 0.20 2650 10.20.30.40 90.80.70.60:8010 source.me.de target.me.de PUT /myServlet 404
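To sanity-check the custom regexes outside of Logstash, I reproduced them in a small standalone Python script and ran them against the first sample line (this only uses Python's re module, not the actual grok engine, so it is just an approximation):

```python
import re

# custom patterns copied from /opt/logstash/patterns/accesslog
PATTERNS = {
    "PKNUMBER": r"[0-9]{8}[a-zA-Z]",
    "DATETIME": r"[0-9]{4}-[0-9]{2}-[0-9]{2}[ \t]+[0-9]{2}:[0-9]{2}:[0-9]{2}",
    "PORT": r":[0-9]{3,}",
    "IPADDRESS": r"(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)"
                 r"(\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)){3}",
}

# first sample line from access.log
SAMPLE = ("00001234A 2017-03-31 09:49:10 0.52 2650 10.20.30.40 "
          "90.80.70.60:8010 source.me.de target.me.de POST /myServlet 200")

# check each pattern individually against the sample line
for name, rx in PATTERNS.items():
    m = re.search(rx, SAMPLE)
    print(f"{name}: {m.group(0) if m else 'NO MATCH'}")
```

Each pattern does find its token in isolation here, so the individual regexes at least look plausible to me.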
The whole stack is running on a single virtual machine. After a full restart of all services (and of the virtual machine itself), these settings should have been picked up. I can see that the log data from access.log is successfully transferred to Elasticsearch and Kibana:
April 5th 2017, 10:19:21.474
message: 00001234A 2017-03-31 09:49:10 0.0 0 10.20.30.40 90.80.70.60:8010 source.me.de target.me.de POST /myServlet 200
@version: 1
@timestamp: April 5th 2017, 10:19:21.474
source: /home/fabiansc/Dokumente/access.log
input_type: log
offset: 325
type: app-access
count: 1
fields: -
beat.hostname: ubuntu
beat.name: ubuntu
host: ubuntu
tags: beats_input_codec_plain_applied, _grokparsefailure
syslog_severity_code: 5
syslog_facility_code: 1
syslog_facility: user-level
syslog_severity: notice
_id: AVs9MmZfLcvbMFKXJthm
_type: app-access
_index: filebeat-2017.04.05
_score:
So the data is searchable within the message field via Kibana. I also noticed that the messages carry a _grokparsefailure tag. As I am a bit new to the topic, I am currently struggling to make my filtered fields searchable in Kibana, i.e. all fields from /etc/logstash/conf.d/11-accesslog-filter.conf should be searchable:
- user_pk
- time
- duration
- bytes
- source_ip
- destination_ip
- port
- source_hostname
- destination_hostname
- method
- request
- status_code
I am not sure if I need to add some indexes or missed something, so I would be happy about any clue to get my test up and running.
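In case it matters, this is how I plan to debug the filter in isolation: a throwaway pipeline that reads one line from stdin and prints the parsed event (a sketch, assuming the Logstash 2.x binary at /opt/logstash/bin/logstash; test-pipeline.conf is just a name I made up):

```
# test-pipeline.conf
input { stdin { type => "app-access" } }
filter {
  grok {
    patterns_dir => ["/opt/logstash/patterns"]
    match => { "message" => "%{PKNUMBER:user_pk} %{DATETIME:time} %{NUMBER:duration} %{NUMBER:bytes} %{IPADDRESS:source_ip} %{IPADDRESS:destination_ip} %{PORT:port} %{HOSTNAME:source_hostname} %{HOSTNAME:destination_hostname} %{WORD:method} %{URIPATHPARAM:request} %{NUMBER:status_code}" }
  }
}
output { stdout { codec => rubydebug } }
```

Then something like `head -1 /home/fabiansc/Dokumente/access.log | /opt/logstash/bin/logstash -f test-pipeline.conf` should show whether the event ends up with the parsed fields or with a _grokparsefailure tag.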
Big thanks in advance!