ELK setup for Apache Application Logs

Hello, another newbie here. I am a bit confused about setting up ELK for Apache2 application log aggregation.

My setup is as follows; all components are version 6.2.3:
Apache Logs (3x) --> Filebeat --> Logstash --> Elasticsearch + Kibana

I am able to see data in Kibana, and by adding
"multiline.pattern: '^[[:space:]]|^Caused by'" in filebeat.yml, the lines in a stack trace are all kept together.
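For context, that pattern only takes effect together with the other multiline settings on the same prospector. A minimal sketch of the relevant prospector block in filebeat.yml (the path is the Tomcat log from this setup; the rest is the stock Filebeat 6.x multiline layout):

```yaml
filebeat.prospectors:
- type: log
  enabled: true
  paths:
    - /opt/apache-tomcat-8.0.46/logs/catalina.out
  # A line starting with whitespace or "Caused by" matches the pattern,
  # so with negate: false and match: after it is appended to the
  # previous line instead of starting a new event.
  multiline.pattern: '^[[:space:]]|^Caused by'
  multiline.negate: false
  multiline.match: after
```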

My problem is with the rest of the lines in an event. They seem to split into separate events and they appear in random order in Kibana.

Now I am trying to enable the apache2 module, which is supposed to keep all lines in an event together. I am not sure if I am doing this right.

I added the following lines to filebeat.yml:
#========================== Modules configuration ============================
#------------------------------- Apache2 Module ------------------------------

- module: apache2
  # Access logs
  enabled: true

  # Set custom paths for the log files. If left empty,
  # Filebeat will choose the paths depending on your OS.
  var.paths: ["/opt/apache-tomcat-8.0.46/logs/catalina.out"]
  enabled: true

Is the 'filebeat.prospectors:' section still required? Which section is "var.paths" picked up from?
Is "multiline.pattern: '^[[:space:]]|^Caused by'" still required?

Is there an example configuration available out there for Apache log collection?
I am looking for the exact files to be modified and any help with what should be modified. Filebeat was installed from RPM.
My filebeat installation is here->>> /usr/share/filebeat/bin/filebeat
So far I have been tinkering with this file only->>> /etc/filebeat/filebeat.yml
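With an RPM install, the apache2 module is normally configured in its own file under /etc/filebeat/modules.d/ rather than inline in filebeat.yml. A sketch of what /etc/filebeat/modules.d/apache2.yml could look like (the access/error split is the module's standard layout; pointing var.paths at the Tomcat log is carried over from the config above and is an assumption, not a recommendation):

```yaml
- module: apache2
  # Access logs
  access:
    enabled: true
    var.paths: ["/opt/apache-tomcat-8.0.46/logs/catalina.out"]
  # Error logs
  error:
    enabled: false
```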

Thanks for your help.


No, a prospector is not required for this. Below is the format for the apache2 module, which is working fine at my end. Also, check the indentation in your filebeat.yml carefully.

Please refer to this and let me know if you are still getting errors.

Please paste your full config file here, along with the Filebeat logs, if you are getting any errors.



Hi Harsh, thanks for the pointers. I tried this, and the data path is established; I can see the apache logs in Kibana. BUT each line in a stack trace appears as a separate event.

My filebeat.yml file is below.
A few things I tried here are:
  1. Alternating between 'access logs' and 'error logs'.
  2. Commenting and uncommenting multiline.pattern, multiline.negate, and multiline.match.

Nothing seems to help. My goal: we have events that start with one of the following headers:
[GD-WEB], [GD-BACKGROUND], [GD-IMP], [GD-SYNC], [GD-SLR-INDX]. I would like all lines after any one of these headers to stay together as one event, until the next header is reached.
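A common approach for header-delimited events like these (a sketch only, not verified against these logs) is to anchor the multiline pattern on the headers themselves and negate it, so every line that does not start with a header is glued onto the preceding event:

```yaml
multiline.pattern: '^\[GD-(WEB|BACKGROUND|IMP|SYNC|SLR-INDX)\]'
multiline.negate: true    # lines NOT starting with one of the headers...
multiline.match: after    # ...are appended to the preceding header line
```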

I hope the apache2 module in Filebeat can keep the lines in a stack trace, and any other lines that make up an event, together.

Thanks for your help again.


#========================== Modules configuration ============================
#------------------------------- Apache2 Module ------------------------------

- module: apache2
  # Access logs
  enabled: false
  var.pipeline: with_plugins
  var.paths: ["/opt/apache-tomcat-8.0.46_dev/logs/catalina.out"]
  enabled: true
  var.pipeline: with_plugins
  var.paths: ["/opt/apache-tomcat-8.0.46_dev/logs/catalina.out"]

#=========================== Filebeat prospectors =============================

- type: log
  # Change to true to enable this prospector configuration.
  enabled: false

  # Paths that should be crawled and fetched. Glob based paths.
  #- /var/log/*.log
  #- /opt/apache-tomcat-8.0.46_dev/logs/catalina.out

  # Multiline options
  multiline.pattern: '^[[:space:]]|^Caused by'
  multiline.negate: false
  multiline.match: after

#============================= Filebeat modules ===============================

# Glob pattern for configuration loading
path: ${path.config}/modules.d/*.yml

# Set to true to enable config reloading
reload.enabled: false

#==================== Elasticsearch template setting ==========================
index.number_of_shards: 3

#================================ General =====================================
tags: ["GD"]

#============================== Dashboards =====================================
#============================== Kibana =====================================
#============================= Elastic Cloud ==================================
#================================ Outputs =====================================
#-------------------------- Elasticsearch output ------------------------------
#----------------------------- Logstash output --------------------------------

# The Logstash hosts
hosts: [""]

#================================ Logging =====================================
#============================== Xpack Monitoring ===============================


Please refer to the links below for multiline pattern matching, with examples.
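The scheme such references usually illustrate (a sketch; the exact timestamp format is an assumption about the log layout) treats a leading timestamp as the start of a new event, and everything else as a continuation:

```yaml
multiline.pattern: '^[0-9]{4}-[0-9]{2}-[0-9]{2}'
multiline.negate: true
multiline.match: after
```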



Hi Harsh, thanks again. But I am still having problems. I am adding a few more folks to this email, just so they can see the things we have tried. Hope you don't mind.

I tried the following scheme from the first link in your message.

In my case I used "[GD-WEB], [GD-BACKGROUND], [GD-IMP], [GD-SYNC], [GD-SLR-INDX]" as starting points instead of a timestamp; please see the yml file for details. It did not help: when an event has a Java stack trace, the lines from the stack trace appear as separate events.

I have attached 3 files to this message:

  1. See file ELKDebug_CatalinaOut.txt for an excerpt of an event that I would like to capture as one event.
  2. See file ELKDebug_KibanaOut.txt (copied from the Kibana screen and pasted into this file). Notice how each line of a stack trace appears as a separate line.
  3. The yml file in use.

Please let me know if you see any problems.

Matt, Tristan and Kerry, this is the summary of what I am trying to do with ELK. Any help to reach the goal soon will be helpful.

Thanks for your time

Richard Durai

Hi @rdurai,

I'm not able to see your yml and attached configuration files. Please provide the YML and other details, such as what output you are getting, and also explain with an example what exactly you need.

It will help me understand more about your use case.


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.