Multiple Filebeats to an ELK server in Docker

I'm new to ELK and I want to connect my ELK stack with Filebeat in Docker. I have already created the stack in Docker and it is producing records, because I can see them in Kibana.
Now I want to connect multiple servers running Apache, which will be the clients, to this Filebeat that, as I said before, is in Docker and works correctly.
I guess I have to install Filebeat on each client and connect it to the server, but how do I configure the filebeat.yml file?
Can I connect them even though one of the Filebeats is in Docker and the others are not?
About the server: do I only have to modify the filebeat.yml file? The docker-compose file?

I have looked at several guides, but they haven't made it clear to me.
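What I imagine is that each client's filebeat.yml would need something like this, pointing at the server (just my guess; the log path is made up):

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/apache2/*.log   # hypothetical path, wherever apache writes its logs

output.logstash:
  hosts: ["<server ip>:5044"]   # the host running the ELK stack in docker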
Thanks

Configuration files (Docker, Filebeat):

filebeat.docker.yml is

filebeat.config:
  modules:
    path: ${path.config}/modules.d/*.yml
    reload.enabled: true

filebeat.modules:
  - module: nginx
  - module: apache2

filebeat.autodiscover:
  providers:
    - type: docker
      hints.enabled: true

filebeat.inputs:
- type: log
  paths:
    - '/var/lib/docker/containers/*/*.log'
  json.message_key: log
  json.keys_under_root: true

processors:
- add_docker_metadata: ~

output.elasticsearch:
  hosts: '${ELASTICSEARCH_HOSTS:<my ip>}'

docker-compose.yml is

version: '2.2'

services:

  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.8.0
    container_name: elasticsearch
    environment:
      - node.name=elasticsearch
      - discovery.seed_hosts=elasticsearch
      - cluster.initial_master_nodes=elasticsearch
      - cluster.name=docker-cluster
      - bootstrap.memory_lock=true
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
    ulimits:
      memlock:
        soft: -1
        hard: -1
    volumes:
      - esdata1:/usr/share/elasticsearch/data
    ports:
      - 9200:9200

  logstash:
    image: docker.elastic.co/logstash/logstash:7.8.0
    volumes:
      - ./:/config-dir
    command: logstash -f /config-dir/logstash.conf
    ports:
      - 5044:5044
    depends_on:
      - elasticsearch

  kibana:
    image: docker.elastic.co/kibana/kibana:7.8.0
    container_name: kibana
    environment:
      ELASTICSEARCH_HOSTS: "http://elasticsearch:9200"
    ports:
      - 5601:5601
    depends_on:
      - elasticsearch

volumes:
  esdata1:
    driver: local

Hi!

Filebeat needs access to the Apache log files in order to collect them, so you somehow need to make Apache's logs available to Filebeat.

How is Apache running? As a container or natively on the host?
If it runs natively on the host, then you can just mount its logging directory inside the Filebeat container and point Filebeat at that directory.
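For example, if Apache logs to /var/log/apache2 on the host (adjust to your actual path), the Filebeat container could mount it read-only, roughly like this:

  filebeat:
    image: docker.elastic.co/beats/filebeat:7.8.0
    volumes:
      - ./filebeat.docker.yml:/usr/share/filebeat/filebeat.yml:ro
      - /var/log/apache2:/var/log/apache2:ro    # host's apache logs, visible inside the container

and then an input in filebeat.yml would collect from that directory:

filebeat.inputs:
- type: log
  paths:
    - /var/log/apache2/*.log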

C.

Hi, thanks for your response

I was wrong about Filebeat; I only have it on my client.

Yes, as I explained, I have ELK running in a container on a server. I want to connect it with multiple clients so that they collect some Apache logs.
On the client I have installed Filebeat, and I have a connection with the server and with Logstash, because it answers telnet on port 5044.
But Kibana does not show the logs that I point to in the client's filebeat.yml file.
I don't know if I have to change something in the server's logstash.conf file, but I'll paste both files as I have them, in case it helps.
To answer your question: Apache runs on the client.
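By the way, this is how I checked the connection from the client to Logstash:

telnet 192.168.14.66 5044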

If you need more config files, I can show you whatever you need.
Thank you very much, regards

filebeat.yml

filebeat.inputs:

# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.
# Below are the input specific configurations.

- type: log

  # Change to true to enable this input configuration.
  enabled: true

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /server/www/html/miweb/logs/*.log
    #- /path/to/log-1.log
  #input_type: log
  #document_type: syslog

# filestream is an experimental input. It is going to replace log input in the future.
- type: filestream

  # Change to true to enable this input configuration.
  # (left disabled here: collecting the same paths with both the log and
  # filestream inputs would produce duplicate events)
  enabled: false

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /server/www/html/miweb/logs/*.log
    #- c:\logs\*

  # Exclude lines. A list of regular expressions to match. It drops the lines that are
  # matching any regular expression from the list.
  #exclude_lines: ['^DBG']

  # Include lines. A list of regular expressions to match. It exports the lines that are
  # matching any regular expression from the list.
  #include_lines: ['^ERR', '^WARN']

# ============================== Filebeat modules ==============================

filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml

  # Set to true to enable config reloading
  reload.enabled: false

  # Period on which files under path should be checked for changes
  #reload.period: 10s

# ------------------------------ Logstash Output -------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["192.168.14.66:5044"]
  #bulk_max_size: 1024
  #index: filebeat
  #tls:
   # certificate_authorities: ["/etc/pki/tls/certs/logstash-beats.crt"]
  # Optional SSL. By default is off.
  # List of root certificates for HTTPS server verifications
  #ssl.certificate_authorities: ["/etc/pki/root/ca.pem"]

  # Certificate for SSL client authentication
  #ssl.certificate: "/etc/pki/client/cert.pem"

  # Client Certificate Key
  #ssl.key: "/etc/pki/client/cert.key"

# ================================= Processors =================================
processors:
  - add_host_metadata:
      when.not.contains.tags: forwarded
  - add_cloud_metadata: ~
  - add_docker_metadata: ~
  - add_kubernetes_metadata: ~


logstash.conf

input {
        beats {
                port => 5044
        }

        tcp {
                port => 5000
        }
}

## Add your filters / logstash plugins configuration here

output {
  # match events coming from the client at 192.168.14.15
  # (add_host_metadata puts the client's addresses in the host.ip array)
  if "192.168.14.15" in [host][ip] {
        elasticsearch {
                hosts => "elasticsearch:9200"
                user => "elastic"
                password => "changeme"
                ecs_compatibility => disabled
                manage_template => false
                index => "logsclient-%{+YYYY.MM.dd}"
                document_type => "%{[@metadata][type]}"
        }
  }
}

Ok, let's isolate the different pieces:

  1. Can you run Filebeat in debug mode and check whether logs are collected from Apache's log directory? By the way, did you enable the Apache module? From the config you sent (filebeat.yml) it's not clear whether you actually enable any modules. (See the commands below.)
  2. If Filebeat is properly configured and able to collect logs from Apache, then we need to check whether it can send the events to Logstash. For this you will need to look at Filebeat's and Logstash's logs for any errors (mostly network errors).

Let's start with these steps and keep iterating on this.
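For reference, enabling the module and running in debug mode would look roughly like this, run from the directory where Filebeat lives:

./filebeat modules enable apache
./filebeat modules list      # the module should now appear under "Enabled"
./filebeat -e -d "*"         # -e logs to stderr, -d "*" turns on all debug selectors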

I think the fault is in the first piece, because I can't run Filebeat in debug mode; I get:

filebeat -e

filebeat: command not found

or

filebeat -e -c filebeat.yml

filebeat: command not found

Is this the configuration for modules?

# ============================== Filebeat modules ==============================

filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml

  # Set to true to enable config reloading
  reload.enabled: false

filebeat.modules:
  - module: apache
  - module: system

However, it seems that Filebeat is not a service; in fact, this is what it shows me:

sudo service filebeat restart

Failed to restart filebeat.service: Unit filebeat.service not found.

sudo service filebeat status

 filebeat.service
   Loaded: not-found (Reason: No such file or directory)
   Active: inactive (dead)

Finally, it may help to know how my installations were done.
Filebeat:

curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-7.10.2-linux-x86_64.tar.gz

ELK in docker:

git clone https://github.com/deviantony/docker-elk.git

Cheers

Hi again!

I'm a little bit confused about how you run Filebeat. Is it a Linux service, a single binary, or running in a container? Could you please provide full information about your setup/environment?

Hi !

I thought that by installing this

curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-7.10.2-linux-x86_64.tar.gz

it would install the service, and that I could then connect the filebeat.yml file it comes with to the ELK server, right?

In the container, which is on the server, there is only the ELK stack.
On the client there are the logs and the Filebeat that I installed.

Ok, so you followed the Linux (tar.gz) version of the installation guide. I'm afraid that does not install it as a Linux service. The RPM and DEB packages do install it as a service, since each package is actually an installer script that creates the service on your system. With your approach you just unpacked Filebeat's artifact folder.

So, how do you start Filebeat? Does ./filebeat -e work if you execute it inside the unpacked directory?
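For reference, the tarball flow is roughly:

tar xzvf filebeat-7.10.2-linux-x86_64.tar.gz
cd filebeat-7.10.2-linux-x86_64
./filebeat -e -c filebeat.yml

whereas the DEB package on Ubuntu would register the systemd service, something like:

curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-7.10.2-amd64.deb
sudo dpkg -i filebeat-7.10.2-amd64.deb
sudo systemctl enable --now filebeat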

Ok, so I did the installation wrong, right?
I installed it in my home directory, unpacked it, entered the folder, and edited
the filebeat.yml file that was inside.
So I had already unpacked it before.

If I run ./filebeat inside the folder I get this:

-bash: ./filebeat: cannot execute binary file: Exec format error

Hmm, on what machine are you trying to execute it? Is it linux-x86_64?
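Two quick checks (standard Linux commands) would tell us:

uname -m          # machine architecture, e.g. x86_64
file ./filebeat   # prints the architecture the binary was built for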

On Ubuntu Server 16.04.4 LTS. As for whether it is x86_64: when I run lscpu in the terminal, it seems it can run both 32-bit and 64-bit. I hope this helps.

Hi again!
I tried with another Ubuntu server and now this works:

./filebeat -e

What should I do for the next step? Thanks

Ok, let's take it from Multiple Filebeats to an ELK server in Docker - #4 by ChrsMark. Can you enable the Apache module and run Filebeat in debug mode?
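Once enabled, the module's settings live in modules.d/apache.yml. With the log location you posted it would look roughly like this (the access.log/error.log file names are an assumption, adjust to your setup):

- module: apache
  access:
    enabled: true
    var.paths: ["/server/www/html/miweb/logs/access.log*"]   # hypothetical file name
  error:
    enabled: true
    var.paths: ["/server/www/html/miweb/logs/error.log*"]    # hypothetical file name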

Hi again.
I must apologize for being so tiresome.

My Filebeat can collect files from the Apache servers, the logs work without errors, and I can also telnet {ip:server} {port:5044} properly...

I think I'm about to finish; the problem shows up in Logstash's logs.

I ran docker logs -f "my logstash docker container", and this is the error:
docker logs -f docker-elk_logstash_1

[2021-02-12T10:02:00,199][WARN ][logstash.outputs.elasticsearch][main][4adfb87a563351eeacd0d5f84a3d4889120060933a3dd82a5ba02ab713b550c3] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash", :routing=>nil, :_type=>"_doc"}, #<LogStash::Event:0x3463c3bf>], :response=>{"index"=>{"_index"=>"logstash", "_type"=>"_doc", "_id"=>nil, "status"=>400, "error"=>{"type"=>"index_closed_exception", "reason"=>"closed", "index_uuid"=>"jjnQtLBjS9qYgWjlBL9lhw", "index"=>"logstash-2021.02.10-000001"}}}}
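If I understand the error correctly, the index logstash-2021.02.10-000001 got closed, so I guess reopening it with the open index API should fix it, something like:

curl -u elastic:changeme -X POST "http://localhost:9200/logstash-2021.02.10-000001/_open"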

Thanks for your help again
