Kibana: Creating Selected Fields

Hi Everyone,

I just found out about ELK + Filebeat and set up my own stack. First I played around with the default syslog files, which worked fine. Afterwards I started pushing my own application log file into it to find out how I can leverage ELK for my own application. However, I ran into a big issue where you might be able to help me out. But first of all, please find my configuration files below:

/etc/filebeat/filebeat.yml

filebeat:
  prospectors:
    -
      paths:
        - /home/fabiansc/Dokumente/*
        - /var/log/nginx/access.log
      document_type: app-access
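
The output section of filebeat.yml is not shown above; on a single-machine setup like this it would typically just point Filebeat at the local Logstash beats input, along the lines of (host and port are placeholders, not copied from my actual config):

output:
  logstash:
    # local Logstash beats input; host/port here are assumptions
    hosts: ["localhost:5044"]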

/etc/logstash/conf.d/11-accesslog-filter.conf

filter {
    if [type] == "app-access" {
        grok {
            match => { "message" => "%{PKNUMBER:user_pk} %{DATETIME:time} %{NUMBER:duration} %{NUMBER:bytes} %{IPADDRESS:source_ip} %{IPADDRESS:destination_ip} %{PORT:port} %{HOSTNAME:source_hostname} %{HOSTNAME:destination_hostname} %{WORD:method} %{URIPATHPARAM:request} %{NUMBER:status_code}" }
        }
        syslog_pri {}
        date {
            match => [ "time", "YYYY-MM-dd      HH:mm:ss", "MM dd HH:mm:ss" ]
        }
    }
}

/opt/logstash/patterns/accesslog

PKNUMBER [0-9]{8}[a-zA-Z]
DATETIME [0-9]{4}-[0-9]{2}-[0-9]{2}[ \t]+[0-9]{2}:[0-9]{2}:[0-9]{2}
PORT :[0-9]{3,}
IPADDRESS (25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)(\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)){3}
HOSTNAME [a-zA-Z0-9-.]{1,200}.me.de
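
Side note: depending on the Logstash version, custom patterns placed in /opt/logstash/patterns may be picked up automatically; if not, the grok filter would need an explicit patterns_dir pointing there, roughly like this:

grok {
    # tell grok where to find the custom PKNUMBER/DATETIME/... definitions
    patterns_dir => ["/opt/logstash/patterns"]
    # plus the same match => { "message" => "..." } as in 11-accesslog-filter.conf above
}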

/home/fabiansc/Dokumente/access.log (logfile provided to filebeat)

00001234A       2017-03-31      09:49:10        0.52    2650    10.20.30.40  90.80.70.60:8010       source.me.de   target.me.de     POST    /myServlet       200
00001234B       2017-03-31      09:50:10        0.20    2650    10.20.30.40  90.80.70.60:8010       source.me.de   target.me.de     PUT    /myServlet       404

The whole stack is running on a single virtual machine. After a full restart of all services (or of the virtual machine) those settings should have been saved and picked up. I can see that the log data from access.log is successfully transferred to Elasticsearch and Kibana:

April 5th 2017, 10:19:21.474	

message:
    00001234A 2017-03-31 09:49:10 0.0 0 10.20.30.40 90.80.70.60:8010 source.me.de target.me.de POST /myServlet 200
@version:
    1
@timestamp:
    April 5th 2017, 10:19:21.474
source:
    /home/fabiansc/Dokumente/access.log
input_type:
    log
offset:
    325
type:
    app-access
count:
    1
fields:
    - 
beat.hostname:
    ubuntu
beat.name:
    ubuntu
host:
    ubuntu
tags:
    beats_input_codec_plain_applied, _grokparsefailure
syslog_severity_code:
    5
syslog_facility_code:
    1
syslog_facility:
    user-level
syslog_severity:
    notice
_id:
    AVs9MmZfLcvbMFKXJthm
_type:
    app-access
_index:
    filebeat-2017.04.05
_score:

Therefore the data is searchable within the message field via Kibana. Furthermore, I noticed that the messages have a _grokparsefailure tag. As I am a bit new to the topic, I am currently struggling to get the fields from my filter searchable in Kibana, i.e. all fields from /etc/logstash/conf.d/11-accesslog-filter.conf should be searchable:

  • user_pk
  • time
  • duration
  • bytes
  • source_ip
  • destination_ip
  • port
  • source_hostname
  • destination_hostname
  • method
  • request
  • status_code

I am not sure if I need to add some indexes or whether I missed something else; I would be happy for any clue to get my test up and running.
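
To make the goal concrete: once grok extracts these fields, a query along these lines should work directly in Kibana's Discover search bar (field names from the filter above, values from the sample log):

user_pk:00001234A AND status_code:200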

Big thanks in advance!

Hi everybody,

I was able to resolve the regexp issue. It was caused by missing escaping in /opt/logstash/patterns/accesslog:

PKNUMBER [0-9]{8}[a-zA-Z]
DATETIME [0-9]{4}[\-][0-9]{2}[\-][0-9]{2}[ \t]+[0-9]{2}[\:][0-9]{2}[\:][0-9]{2}
PORT :[0-9]{3,}
IPADDRESS (25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)(\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)){3}
HOSTNAME [a-zA-Z0-9\-\.]{1,200}.me.de

However, the issue that I do not get searchable fields within Kibana is still pending. I would be glad for any input. BTW: I documented my whole setup in my blog, as I think this topic is quite interesting: https://cloud.fas-consulting.de/drupal/monitoring-repository

I would be very glad for any help offered!

Nice blog post! We'd definitely suggest running the 5.X releases :slight_smile:

Can you elaborate more on the searchable aspect? Have you added the index pattern to Kibana? Can you see the data in Discover? If you can and things aren't showing up, what do you mean by it not being searchable?

Hi Mark,

thanks for your reply; I am glad if people can get something out of it in the long run :slight_smile:

I have loaded the index pattern (see my blog >> Filebeat Index Templates for Elasticsearch) from github.com.
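
(For context: loading such a template boils down to a PUT against the Elasticsearch _template endpoint; the file name below is an assumption based on the template I used:)

curl -XPUT 'http://localhost:9200/_template/filebeat?pretty' -d@filebeat-index-template.json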

The data is loaded and visible in Discover in Kibana, within the "message" field (see post above):

message:
    00001234A 2017-03-31 09:49:10 0.0 0 10.20.30.40 90.80.70.60:8010 source.me.de target.me.de POST /myServlet 200
@version:
    1

What I am missing is the possibility to search for the single fields, e.g. "user: 00001234A", via Kibana's Discover. I defined those fields within the Logstash filter... Not sure what's missing. Maybe a Filebeat index template like the one I referenced above?

Ah ok.

Well, your grok pattern does not match, so it cannot create those fields, hence the _grokparsefailure. Use something like http://grokdebug.herokuapp.com/ to validate your patterns.

I corrected /opt/logstash/patterns/accesslog, so the failure is gone. However, the values still do not show up as fields in Kibana:

# sudo vim /opt/logstash/patterns/myLogPattern
PKNUMBER [0-9]{8}[a-zA-Z]
DATETIME [0-9]{4}[\-][0-9]{2}[\-][0-9]{2}[ \t]+[0-9]{2}[\:][0-9]{2}[\:][0-9]{2}
IPADDRESS (25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)(\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)){3}
HOSTNAME [a-zA-Z0-9\-\.]{1,200}.domain.de

There is no more failure in any log file (yay!). I am just waiting for the indexes to be created... and they currently won't come, and I do not know why... :slight_smile:

Go back to a basic config: the input, your grok, and an output with stdout. Then see what is happening.
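
A minimal sketch of such a config (the port is a placeholder, and the pattern placeholder stands for your existing match):

input {
    beats {
        port => 5044
    }
}

filter {
    if [type] == "app-access" {
        grok {
            patterns_dir => ["/opt/logstash/patterns"]
            match => { "message" => "<pattern from 11-accesslog-filter.conf>" }
        }
    }
}

output {
    # print every parsed event to the console so you can see which fields were extracted
    stdout { codec => rubydebug }
}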

Hi warkolm,

found the issue. Within the JSON message displayed in Kibana, I could identify that grok issues still persisted, even without any /var/log/syslog or other log output.

The following adjustment of /etc/logstash/conf.d/11-accesslog-filter.conf solved the issue:

filter {
    if [type] == "app-access" {
        grok {
            match => { "message" => "%{WORD:user}%{SPACE}%{DATETIME:Datetime}%{SPACE}%{BASE16FLOAT:Duration}%{SPACE}%{INT:Bytes}%{SPACE}%{IPV4:ip_source}%{SPACE}%{IPORHOST:ip_destination}:%{POSINT:ip_port}%{SPACE}%{HOSTNAME:host_source}%{SPACE}%{HOSTNAME:host_destination}%{SPACE}%{WORD:method}%{SPACE}%{URIPATH:request}%{SPACE}%{NUMBER:statuscode}" }
        }
    }
}
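
With that pattern, the first sample line from access.log should roughly produce fields like these (illustrative, reconstructed from the pattern rather than copied from Kibana):

user: 00001234A
Datetime: 2017-03-31 09:49:10
Duration: 0.52
Bytes: 2650
ip_source: 10.20.30.40
ip_destination: 90.80.70.60
ip_port: 8010
host_source: source.me.de
host_destination: target.me.de
method: POST
request: /myServlet
statuscode: 200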

Please note that I removed some of my referenced patterns from /opt/logstash/patterns/accesslog. The issue could have been spotted from within Kibana's JSON messages (see first post):

tags:
    beats_input_codec_plain_applied, _grokparsefailure

The _grokparsefailure tag told me everything I needed to know. Patterns can be checked and built via a nice Kibana tool.

Big Thanks for support!

