How can I parse Filebeat data with Logstash and send it to Kibana?

Hi,

Three days on the ELK stack and I'm kind of stuck.

What do I want to do?
I have an Apache server with the classic access.log and errors.log, and I would like to pull those logs in with some parsing.

What have I understood so far?
I may be wrong, but here is my understanding:

  • Filebeat ships the logs over TCP to my Logstash server (I got this working with the Kibana log procedure)
  • Logstash processes those logs with pipelines and stores them in Elasticsearch
  • Kibana lets me display the logs stored in Elasticsearch

What is my issue?

Spoiler alert: (myself)

The default Kibana/Filebeat Apache2 pipeline does not parse the logs the way I would like, so I want to customise it.

What have I done?
I can't find the default pipeline (created through Kibana? Filebeat?) to modify it, so I decided to create my own. I edited pipelines.yml, removed everything, and added my pipeline:

  • pipeline.id: apache
    path.config: "/usr/share/logstash/pipeline/apache.conf"

and created a basic pipeline, apache.conf, inside my pipeline folder:

input {
  beats {
    port => 5000
    host => "my logstash server ip"
  }
}
output {
  elasticsearch {
    # hosts must be a quoted string or a list; a bare word won't parse
    hosts => ["localhost:9200"]
    manage_template => false
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}

And I tried to run this command (without really understanding it):

bin/logstash -f apache.conf
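(For reference, a sketch of what that flag actually does, assuming a standard Logstash package layout: `-f`/`--path.config` makes Logstash ignore pipelines.yml entirely and load only the file you pass, so the pipeline declared in pipelines.yml never runs with this invocation.)

```shell
# As run above: -f (--path.config) bypasses pipelines.yml and loads
# only this one config file.
bin/logstash -f /usr/share/logstash/pipeline/apache.conf

# To use the pipeline(s) declared in pipelines.yml instead, start
# Logstash with no -f flag (or via the service manager).
bin/logstash

# Optional sanity check: validate a config file and exit.
bin/logstash -f /usr/share/logstash/pipeline/apache.conf --config.test_and_exit
```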

Well, unfortunately it does not work as expected. I'm a bit frustrated because I can't find a clear explanation/procedure in the Logstash documentation; I feel like the docs are not designed for newbies like me.

If you could help me a bit, I would appreciate it.

Thanks,
Nihi

OK, I did not fully understand everything you've done, but I can help you go through the log processing for your server.

First of all, you need to understand that Filebeat is an agent that collects logs from a specific source (something Logstash can also do).

Logstash is a log processor that will index your logs into an Elasticsearch database, for example; you can then use the Kibana interface to interact with those logs.

What you need to do:

Ship the logs from Apache to Logstash using syslog or Filebeat. For example, install the Filebeat agent on your server and declare inputs like:

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - "/var/log/apache2.log"
  encoding: utf-8
  tags: ["apache", "web"]

output.logstash:
  hosts: ["X.X.X.X:XXXX"]
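Before digging into the Logstash side, it's worth confirming that Filebeat itself is healthy; Filebeat ships built-in checks for this (paths assume a standard package install):

```shell
# Validate filebeat.yml syntax.
filebeat test config -c /etc/filebeat/filebeat.yml

# Check that Filebeat can actually reach the configured Logstash output.
filebeat test output -c /etc/filebeat/filebeat.yml
```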

Then, on your Logstash server, you'll have to declare an input for the Apache logs and process them through Logstash filters to parse them. You'll also need an output pointing at an Elasticsearch database, with Kibana set up on top.
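For example, a minimal pipeline sketch (the port and index pattern are taken from your earlier config; `%{COMBINEDAPACHELOG}` is the stock grok pattern for Apache combined access logs, and the elasticsearch host/port is an assumption for a local install):

```
input {
  beats {
    port => 5000
  }
}

filter {
  # Parse the combined access-log format into fields
  # (clientip, verb, request, response, bytes, ...).
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  # Use the timestamp from the log line instead of the ingest time.
  date {
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    manage_template => false
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}
```

Once documents land in Elasticsearch under that index pattern, you create a matching index pattern in Kibana (Management → Index Patterns) to see them in Discover.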

Follow the setup tutorial https://www.elastic.co/guide/en/elastic-stack/current/installing-elastic-stack.html

Good Luck !

Thanks for your answer,

I modified filebeat.yml with the setup you gave me.

I kept the pipeline configuration from my first message.

It still does not work; nothing appears in Kibana. Do I need to run a command to activate everything?

Where in Kibana should I look for the data produced by my pipeline?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.